
Inconsistent API Behavior #251

Open
negaralizadeh opened this issue Aug 12, 2024 · 1 comment

@negaralizadeh

I checked the documentation several times and I'm using the generate function in this way:

import ollama

# model_name, prompt, and system_message are defined elsewhere in my script
options = {
    "temperature": 0.2,
    "top_p": 0.8,
    "seed": 42,
    "num_predict": 300,
}
response = ollama.generate(model=model_name, prompt=prompt, stream=False, options=options, system=system_message, format="json")

Although I set the stream flag to False, I still don't get the whole response in one package: the first package that comes back has done=False, so the output arrives in several pieces.
The other problem is that sometimes, even when the final package does arrive (done=True), not all of the expected additional information is there; for example, prompt_eval_count is missing.
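For reference, this is roughly how I'm checking the returned fields (a sketch assuming a non-streamed call returns a single dict, which is what I understood from the docs):

# Sketch: inspect the final response; the field names are the ones documented
# for generate, and their availability is exactly what this issue is about
print(response.get("done"))               # I'd expect True when stream=False
print(response.get("prompt_eval_count"))  # sometimes missing even when done is True
print(response.get("response"))           # the generated JSON text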

Am I doing something wrong here?

@ParthSareen
Contributor

Hey! Are you still running into this? If so, could you send an example of a run and the output it gives you?
