Empty response on ollama #151

Open
hasaranga opened this issue Nov 8, 2024 · 1 comment
Comments

@hasaranga

Most of the time, responses are empty when using the model continuously with ollama. (fp16)

@vikhyat (Owner) commented Nov 18, 2024

Would not recommend using the ollama version right now; it only supports a very old version of the model (from April). I need to reach out to them to figure out a path forward, because I heard that llama.cpp might be dropping support for vision language models.

We have a Python client we're developing -- https://github.com/vikhyat/moondream/tree/main/clients/python -- it only supports CPU for now, but we're working on adding MPS and CUDA support as soon as possible.
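For reference, a minimal sketch of what calling that Python client might look like, based on the usage described in the linked repo's README; the package name, the `md.vl` / `encode_image` / `query` calls, and the model file path and prompt are all assumptions from that README rather than anything confirmed in this thread:

```python
import moondream as md  # assumed package name from the client's README
from PIL import Image

# Load the model from a local weights file (placeholder path).
model = md.vl(model="path/to/moondream-weights.mf")

# Encode the image once, then reuse the encoding across queries.
image = Image.open("photo.jpg")
encoded = model.encode_image(image)

# Ask a question about the image; the client returns a dict with an "answer" key.
print(model.query(encoded, "Describe this image.")["answer"])
```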
