
Stops generating after about 200 characters #212

Closed
somerandompiggo opened this issue Mar 16, 2023 · 2 comments

Comments

@somerandompiggo

I have tried modifying -n (the number of tokens to predict), but generation always stops after roughly the same amount of output. Is there any way to prevent this? It seems to be intended behavior, since the output in the README screenshots cuts off at a similar point.
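For reference, a minimal sketch of an invocation that raises the prediction limit with -n (the model path and prompt below are placeholders, and the output is still bounded by the model's context window):

```sh
# Ask for up to 512 predicted tokens instead of the default.
# Model path and prompt are illustrative; adjust to your setup.
./main -m ./models/7B/ggml-model-q4_0.bin \
       -p "Building a website can be done in 10 simple steps:" \
       -n 512
```

If the output still cuts off at roughly the same point regardless of -n, the limit is likely the context size rather than the prediction count.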

@lorenzopicoli

Probably duplicate of #71

@somerandompiggo
Author

Thanks, I hadn't seen that. I'll close this issue, but memory is not the problem in my case, as suggested in that issue: I have 32 GB of swap and 16 GB of physical memory.
