Replies: 1 comment 1 reply
The final model is the two adapter_* files (adapter_config.json and adapter_model.bin). Use generate.py to run your LoRA.
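In the stock alpaca-lora layout that usually means something like `python generate.py --base_model '<your base model>' --lora_weights './lora-alpaca'`, but argument names vary between forks, so check the top of your generate.py. If you would rather load the adapter yourself, here is a minimal sketch using the transformers and peft APIs; the base model name, prompt template, and generation settings are assumptions for illustration, not taken from this thread:

```python
# Hedged sketch: load the base model, attach the trained LoRA adapter,
# and generate a response. Replace BASE_MODEL with whatever checkpoint
# you passed to finetune.py; LORA_DIR is the output folder that holds
# adapter_config.json and adapter_model.bin.
import torch
from peft import PeftModel
from transformers import LlamaForCausalLM, LlamaTokenizer

BASE_MODEL = "decapoda-research/llama-7b-hf"  # assumption: your base model
LORA_DIR = "./lora-alpaca"                    # folder with the adapter_* files

tokenizer = LlamaTokenizer.from_pretrained(BASE_MODEL)
model = LlamaForCausalLM.from_pretrained(
    BASE_MODEL,
    torch_dtype=torch.float16,
    device_map="auto",
)
# Attach your trained adapter weights on top of the frozen base model.
model = PeftModel.from_pretrained(model, LORA_DIR)
model.eval()

# Alpaca-style prompt; if your data.json used a different template,
# mirror that template here instead.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nWhat is a LoRA adapter?\n\n### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```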
1 reply
Long story short: my friend and I have fine-tuned with a custom data.json up to checkpoint-200, but now we don't know how to run it. We don't really understand how to use the model we trained to input prompts and get answers. All we know is that after running finetune.py a folder named "lora-alpaca" was created; inside it there is a "checkpoint-200" folder, a "run" folder, an "adapter_config.json", and an "adapter_model.bin". We would be really thankful if someone could help us.