This repository has been archived by the owner on Oct 14, 2024. It is now read-only.
Firstly thanks for a great app. I use it all the time!
While reading the guide "Integrate Ollama with Jan" I presumed it meant you can access your Ollama models being served by the Ollama server.
The guide mentions downloading the model from the Hub. Surely if you download the model from the Hub you wouldn't need to integrate with Ollama at all.
Is there a way to "use" the models that have already been downloaded into Ollama?
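For what it's worth, Ollama serves an OpenAI-compatible API at `http://localhost:11434/v1`, so one workaround is to point Jan's OpenAI-compatible engine at the local Ollama server instead of at OpenAI. A rough sketch, assuming Jan's engine settings live in `~/jan/engines/openai.json` (the exact path and file name may differ between Jan versions, so treat this as illustrative):

```json
{
  "full_url": "http://localhost:11434/v1/chat/completions",
  "api_key": "ollama"
}
```

You would then still need a model entry in Jan whose `id` matches the model name as Ollama knows it (e.g. whatever `ollama list` shows), since Ollama routes requests by that name. The `api_key` value is a placeholder; Ollama ignores it but some OpenAI clients require the field to be non-empty.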
I second that request. Create a new server entry in Jan (instead of the hackish instruction to point the OpenAI engine elsewhere) that is specifically tailored to Ollama, and let us run our Ollama models in the beauty and comfort of Jan.
Thank you.
PS: A TON of people have wanted this functionality for some time now...