
Commit

Backport PR #533: Document entry point and API for custom embedding models (#537)

Co-authored-by: Michał Krassowski <[email protected]>
meeseeksmachine and krassowski authored Dec 20, 2023
1 parent f594965 commit f6bd114
Showing 2 changed files with 31 additions and 1 deletion.
docs/source/developers/index.md (30 additions, 1 deletion)
@@ -87,6 +87,35 @@ your new provider's `id`:
[LLM]: https://api.python.langchain.com/en/v0.0.339/llms/langchain.llms.base.LLM.html#langchain.llms.base.LLM
[BaseChatModel]: https://api.python.langchain.com/en/v0.0.339/chat_models/langchain.chat_models.base.BaseChatModel.html

### Custom embeddings providers

To provide a custom embeddings model, define an embeddings provider that implements both the API of `jupyter-ai`'s `BaseEmbeddingsProvider` and `langchain`'s [`Embeddings`][Embeddings] abstract class.

```python
from jupyter_ai_magics import BaseEmbeddingsProvider
from langchain.embeddings import FakeEmbeddings

class MyEmbeddingsProvider(BaseEmbeddingsProvider, FakeEmbeddings):
    id = "my_embeddings_provider"  # unique identifier for this provider
    name = "My Embeddings Provider"  # human-readable provider name
    model_id_key = "model"  # constructor kwarg that receives the selected model ID
    models = ["my_model"]  # model IDs this provider supports

    def __init__(self, **kwargs):
        # FakeEmbeddings requires a vector size; 300 is arbitrary for this example
        super().__init__(size=300, **kwargs)
```
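
Because the class mixes `BaseEmbeddingsProvider` with a LangChain `Embeddings` implementation, the resulting provider exposes the standard LangChain embeddings API. A minimal usage sketch follows; the `model_id` keyword argument is an assumption about `BaseEmbeddingsProvider`'s constructor and is not shown in the example above:

```python
# Hypothetical usage sketch, not taken from the Jupyter AI docs.
provider = MyEmbeddingsProvider(model_id="my_model")  # model_id kwarg assumed

# embed_query/embed_documents come from LangChain's Embeddings interface.
vector = provider.embed_query("Hello, Jupyter AI!")
print(len(vector))  # 300, matching the size passed to FakeEmbeddings
```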

Jupyter AI uses entry points to discover embedding providers.
In the `pyproject.toml` file, add your custom embedding provider to the
`[project.entry-points."jupyter_ai.embeddings_model_providers"]` section:

```toml
[project.entry-points."jupyter_ai.embeddings_model_providers"]
my-provider = "my_provider:MyEmbeddingsProvider"
```
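
As a rough sketch of how entry-point discovery works (illustrative only, not Jupyter AI's actual loading code), the registered providers can be enumerated with `importlib.metadata` once your package is installed:

```python
from importlib.metadata import entry_points

# Requires Python 3.10+ for the group= keyword; shown for illustration only.
for ep in entry_points(group="jupyter_ai.embeddings_model_providers"):
    provider_cls = ep.load()  # e.g. MyEmbeddingsProvider
    print(ep.name, provider_cls.id, provider_cls.models)
```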

[Embeddings]: https://api.python.langchain.com/en/stable/embeddings/langchain_core.embeddings.Embeddings.html

## Prompt templates

Each provider can define **prompt templates** for each supported format. A prompt
@@ -155,7 +184,7 @@ Jupyter AI uses entry points to support custom slash commands.
Jupyter AI uses entry points to support custom slash commands.
In the `pyproject.toml` file, add your custom handler to the
`[project.entry-points."jupyter_ai.chat_handlers"]` section:

-```
+```toml
[project.entry-points."jupyter_ai.chat_handlers"]
custom = "custom_package:CustomChatHandler"
```
packages/jupyter-ai-magics/jupyter_ai_magics/__init__.py (1 addition, 0 deletions)
@@ -2,6 +2,7 @@

# expose embedding model providers on the package root
from .embedding_providers import (
BaseEmbeddingsProvider,
BedrockEmbeddingsProvider,
CohereEmbeddingsProvider,
GPT4AllEmbeddingsProvider,
