Add Ollama (#646)
* Add Ollama

* mistral:text for embeddings

* Add gemma

* Update the list of models

Co-authored-by: Bc <[email protected]>

* Mention Ollama in the docs

* Apply suggestions from code review

Co-authored-by: Piyush Jain <[email protected]>
Co-authored-by: david qiu <[email protected]>

---------

Co-authored-by: Bc <[email protected]>
Co-authored-by: Piyush Jain <[email protected]>
Co-authored-by: david qiu <[email protected]>
4 people authored Jul 10, 2024
1 parent 6b21c03 commit 9f70ea7
Showing 6 changed files with 36 additions and 1 deletion.
2 changes: 1 addition & 1 deletion README.md
@@ -11,7 +11,7 @@ in JupyterLab and the Jupyter Notebook. More specifically, Jupyter AI offers:
* A native chat UI in JupyterLab that enables you to work with generative AI as a conversational assistant.
* Support for a wide range of generative model providers, including AI21, Anthropic, AWS, Cohere,
Gemini, Hugging Face, MistralAI, NVIDIA, and OpenAI.
- * Local model support through GPT4All, enabling use of generative AI models on consumer grade machines
+ * Local model support through GPT4All and Ollama, enabling use of generative AI models on consumer grade machines
with ease and privacy.

Documentation is available on [ReadTheDocs](https://jupyter-ai.readthedocs.io/en/latest/).
4 changes: 4 additions & 0 deletions docs/source/users/index.md
@@ -357,6 +357,10 @@ GPT4All support is still an early-stage feature, so some bugs may be encountered
during usage. Our team is still actively improving support for locally-hosted
models.

### Ollama usage

To get started, follow the instructions on the [Ollama website](https://ollama.com/) to set up `ollama` and download the models locally. To select a model, enter the model name in the settings panel, for example `deepseek-coder-v2`.
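Once `ollama` is running, you can check which models have already been pulled locally before entering a name in the settings panel. A minimal sketch using only the standard library, assuming Ollama's default listen address (`http://localhost:11434`) and its documented `/api/tags` REST endpoint:

```python
import json
import urllib.request

OLLAMA_BASE_URL = "http://localhost:11434"  # Ollama's default listen address


def parse_model_names(payload: dict) -> list[str]:
    """Extract model names from the JSON object returned by /api/tags."""
    return [m["name"] for m in payload.get("models", [])]


def list_local_models(base_url: str = OLLAMA_BASE_URL) -> list[str]:
    """Return the names of models already pulled into the local Ollama server."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        return parse_model_names(json.load(resp))
```

Any name returned here (e.g. `deepseek-coder-v2:latest`) can be entered as the model name in the settings panel.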

### Asking about something in your notebook

Jupyter AI's chat interface can include a portion of your notebook in your prompt.
2 changes: 2 additions & 0 deletions packages/jupyter-ai-magics/jupyter_ai_magics/__init__.py
@@ -6,6 +6,7 @@
BedrockEmbeddingsProvider,
GPT4AllEmbeddingsProvider,
HfHubEmbeddingsProvider,
OllamaEmbeddingsProvider,
QianfanEmbeddingsEndpointProvider,
)
from .exception import store_exception
@@ -23,6 +24,7 @@
BedrockProvider,
GPT4AllProvider,
HfHubProvider,
OllamaProvider,
QianfanProvider,
SmEndpointProvider,
TogetherAIProvider,
@@ -12,6 +12,7 @@
BedrockEmbeddings,
GPT4AllEmbeddings,
HuggingFaceHubEmbeddings,
OllamaEmbeddings,
QianfanEmbeddingsEndpoint,
)

@@ -66,6 +67,19 @@ def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs, **model_kwargs)


class OllamaEmbeddingsProvider(BaseEmbeddingsProvider, OllamaEmbeddings):
id = "ollama"
name = "Ollama"
# source: https://ollama.com/library
models = [
"nomic-embed-text",
"mxbai-embed-large",
"all-minilm",
"snowflake-arctic-embed",
]
model_id_key = "model"


class HfHubEmbeddingsProvider(BaseEmbeddingsProvider, HuggingFaceHubEmbeddings):
id = "huggingface_hub"
name = "Hugging Face Hub"
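The `model_id_key = "model"` attribute on the new provider tells jupyter-ai which keyword argument the user-selected model id should be passed under when the underlying LangChain class is constructed. An illustrative sketch of that mapping — not jupyter-ai's actual code, and the helper name is hypothetical:

```python
def build_model_kwargs(model_id_key: str, model_id: str, extra: dict | None = None) -> dict:
    """Map a user-selected model id onto the constructor kwarg named by
    model_id_key, merging in any provider-specific extras."""
    kwargs = dict(extra or {})
    kwargs[model_id_key] = model_id
    return kwargs


# For the Ollama embeddings provider, model_id_key is "model", so selecting
# "nomic-embed-text" translates to OllamaEmbeddings(model="nomic-embed-text").
```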
13 changes: 13 additions & 0 deletions packages/jupyter-ai-magics/jupyter_ai_magics/providers.py
@@ -37,6 +37,7 @@
Bedrock,
GPT4All,
HuggingFaceEndpoint,
Ollama,
SagemakerEndpoint,
Together,
)
@@ -723,6 +724,18 @@ async def _acall(self, *args, **kwargs) -> Coroutine[Any, Any, str]:
return await self._call_in_executor(*args, **kwargs)


class OllamaProvider(BaseProvider, Ollama):
id = "ollama"
name = "Ollama"
model_id_key = "model"
help = (
"See [https://www.ollama.com/library](https://www.ollama.com/library) for a list of models. "
"Pass a model's name; for example, `deepseek-coder-v2`."
)
models = ["*"]
registry = True


class JsonContentHandler(LLMContentHandler):
content_type = "application/json"
accepts = "application/json"
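`models = ["*"]` together with `registry = True` marks `OllamaProvider` as a registry-style provider: rather than choosing from a fixed list, the user may enter any model name and the provider accepts it. A sketch of that acceptance check — illustrative only, with a hypothetical function name:

```python
def accepts_model(models: list[str], model_id: str) -> bool:
    """A provider whose model list is ["*"] accepts any model id the user
    supplies; otherwise the id must appear in the provider's fixed list."""
    return "*" in models or model_id in models
```

This is why the docs above say to simply "enter the model name in the settings panel": with the wildcard, any model pulled into the local Ollama library is usable without a code change.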
2 changes: 2 additions & 0 deletions packages/jupyter-ai-magics/pyproject.toml
@@ -58,6 +58,7 @@ anthropic-chat = "jupyter_ai_magics.partner_providers.anthropic:ChatAnthropicPro
cohere = "jupyter_ai_magics.partner_providers.cohere:CohereProvider"
gpt4all = "jupyter_ai_magics:GPT4AllProvider"
huggingface_hub = "jupyter_ai_magics:HfHubProvider"
ollama = "jupyter_ai_magics:OllamaProvider"
openai = "jupyter_ai_magics.partner_providers.openai:OpenAIProvider"
openai-chat = "jupyter_ai_magics.partner_providers.openai:ChatOpenAIProvider"
azure-chat-openai = "jupyter_ai_magics.partner_providers.openai:AzureChatOpenAIProvider"
@@ -76,6 +77,7 @@ cohere = "jupyter_ai_magics.partner_providers.cohere:CohereEmbeddingsProvider"
mistralai = "jupyter_ai_magics.partner_providers.mistralai:MistralAIEmbeddingsProvider"
gpt4all = "jupyter_ai_magics:GPT4AllEmbeddingsProvider"
huggingface_hub = "jupyter_ai_magics:HfHubEmbeddingsProvider"
ollama = "jupyter_ai_magics:OllamaEmbeddingsProvider"
openai = "jupyter_ai_magics.partner_providers.openai:OpenAIEmbeddingsProvider"
qianfan = "jupyter_ai_magics:QianfanEmbeddingsEndpointProvider"

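Each entry-point value registered above is a `module:attr` string that is resolved to a provider class at runtime. A minimal sketch of how such a string can be resolved with only the standard library — illustrative, not jupyter-ai's actual loader:

```python
import importlib


def resolve_entry_point(spec: str):
    """Resolve an entry-point value like 'jupyter_ai_magics:OllamaProvider'
    to the object it names: import the module, then walk the attribute path."""
    module_name, _, attr_path = spec.partition(":")
    obj = importlib.import_module(module_name)
    for part in attr_path.split("."):
        obj = getattr(obj, part)
    return obj
```

In practice, plugin hosts discover these strings through `importlib.metadata.entry_points` rather than hard-coding them; registering `ollama = "jupyter_ai_magics:OllamaProvider"` is what makes the new provider visible to jupyter-ai without any import changes elsewhere.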
