feat(mistral): Mistral 1.3.1 migration #7218

Merged 34 commits on Nov 22, 2024

The diff shown below reflects changes from 29 of the 34 commits.
Commits
e133e5a
Finished rough draft migration
CarterMorris Oct 29, 2024
c330cfd
Added support for ContentChunk message type input to chat model
CarterMorris Oct 29, 2024
a695d90
Added delta msg null type options and fixed shouldStream nullish coal…
CarterMorris Oct 30, 2024
f24a26b
simplified logic for getContent()
TripleADC Oct 31, 2024
299fc37
Added support for new HTTPClient feature in v1.1.0
TripleADC Nov 3, 2024
716898f
Unified mistral import naming scheme
CarterMorris Nov 5, 2024
dac6e9c
Experimental addition of Mistral retry config for chat_models.ts
CarterMorris Nov 6, 2024
5faebfa
Fixed zod not being resolved
BaNg-W Nov 7, 2024
b72adc8
Removed global zod dependency from previous commit (0d778c9)
CarterMorris Nov 7, 2024
68f95fe
streamlined adding and removing hooks for httpclient
TripleADC Nov 9, 2024
32c9a6e
Fixed formatting of hook fields and deprecated endpoint
BaNg-W Nov 9, 2024
205bfb5
Integration Test debugging and fix implementations
BaNg-W Nov 10, 2024
3632d21
Reverted bindTools to bind in withStructuredOutput
CarterMorris Nov 10, 2024
701abed
Added support for new HTTPClient feature in v1.1.0
TripleADC Nov 3, 2024
679c761
Removed deprecated checks from mistral chat models integration test
CarterMorris Nov 10, 2024
b630f6d
import error resolution
BaNg-W Nov 11, 2024
cbef92e
Finished removing deprecated StructuredOutputMethodParams implementation
CarterMorris Nov 11, 2024
7f0ab21
Improved Langchain to Mistral message conversion flexibility
CarterMorris Nov 11, 2024
5d53364
Removed Mistral retry config support in favour of async caller retry …
CarterMorris Nov 13, 2024
48a8c3c
Upgraded to Mistral version 1.3.1
CarterMorris Nov 15, 2024
af2b4f2
Fixed assistant role only supporting content or tool calls but not both
BaNg-W Nov 16, 2024
ac2ebe8
Added tests for image url, contentChunk,and hooks + linting
BaNg-W Nov 16, 2024
c910f4f
Updated Mistral documentation to cover hook support
CarterMorris Nov 16, 2024
b702fbe
Added hook tests for mistral llms and embeddings
CarterMorris Nov 16, 2024
9213bfd
Fixed git cherry-pick mistakes in chat_models
BaNg-W Nov 16, 2024
9471c55
Merge branch 'main' of github.com:langchain-ai/langchainjs into 7218
jacoblee93 Nov 19, 2024
cc12d68
Remove hard dep, small fixes
jacoblee93 Nov 21, 2024
d10d9f9
Revert tsconfig, skip failing int test, format
jacoblee93 Nov 21, 2024
c103989
Format docs
jacoblee93 Nov 21, 2024
75292b9
Revert docs changes
jacoblee93 Nov 21, 2024
d8703f9
Remove log
jacoblee93 Nov 21, 2024
b780d85
Revert doc
jacoblee93 Nov 21, 2024
1e3ae5b
Remove log
jacoblee93 Nov 21, 2024
6621676
Update lock
jacoblee93 Nov 21, 2024
121 changes: 116 additions & 5 deletions docs/core_docs/docs/integrations/chat/mistral.ipynb
@@ -23,7 +23,7 @@
"\n",
"[Mistral AI](https://mistral.ai/) is a platform that offers hosting for their powerful [open source models](https://docs.mistral.ai/getting-started/models/).\n",
"\n",
"This will help you getting started with ChatMistralAI [chat models](/docs/concepts/chat_models). For detailed documentation of all ChatMistralAI features and configurations head to the [API reference](https://api.js.langchain.com/classes/langchain_mistralai.ChatMistralAI.html).\n",
"This will help you getting started with ChatMistralAI [chat models](/docs/concepts/#chat-models). For detailed documentation of all ChatMistralAI features and configurations head to the [API reference](https://api.js.langchain.com/classes/langchain_mistralai.ChatMistralAI.html).\n",
"\n",
"## Overview\n",
"### Integration details\n",
@@ -38,7 +38,7 @@
"\n",
"| [Tool calling](/docs/how_to/tool_calling) | [Structured output](/docs/how_to/structured_output/) | JSON mode | [Image input](/docs/how_to/multimodal_inputs/) | Audio input | Video input | [Token-level streaming](/docs/how_to/chat_streaming/) | [Token usage](/docs/how_to/chat_token_usage_tracking/) | [Logprobs](/docs/how_to/logprobs/) |\n",
"| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |\n",
"| ✅ | ✅ | ✅ | | ❌ | ❌ | ✅ | ✅ | ❌ | \n",
"| ✅ | ✅ | ✅ | | ❌ | ❌ | ✅ | ✅ | ❌ | \n",
"\n",
"## Setup\n",
"\n",
@@ -88,7 +88,7 @@
},
{
"cell_type": "code",
"execution_count": 1,
"execution_count": null,
"id": "cb09c344-1836-4e0c-acf8-11d13ac1dbae",
"metadata": {},
"outputs": [],
@@ -252,7 +252,7 @@
"source": [
"## Tool calling\n",
"\n",
"Mistral's API supports [tool calling](/docs/concepts/tool_calling) for a subset of their models. You can see which models support tool calling [on this page](https://docs.mistral.ai/capabilities/function_calling/).\n",
"Mistral's API supports [tool calling](/docs/concepts#functiontool-calling) for a subset of their models. You can see which models support tool calling [on this page](https://docs.mistral.ai/capabilities/function_calling/).\n",
"\n",
"The examples below demonstrates how to use it:"
]
@@ -323,6 +323,117 @@
"console.log(calcToolRes.tool_calls);"
]
},
{
"cell_type": "markdown",
"id": "85dcbecc",
"metadata": {},
"source": [
"## Hooks\n",
"\n",
"Mistral AI supports custom hooks for three events: beforeRequest, requestError, and reponse. Examples of the function signature for each hook type can be seen below:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "74b8b855",
"metadata": {},
"outputs": [],
"source": [
"const beforeRequestHook = (req: Request): Request | void | Promise<Request | void> => {\n",
" // Code to run before a request is processed by Mistral\n",
"};\n",
"\n",
"const requestErrorHook = (err: unknown, req: Request): void | Promise<void> => {\n",
" // Code to run when an error occurs as Mistral is processing a request\n",
"};\n",
"\n",
"const responseHook = (res: Response, req: Request): void | Promise<void> => {\n",
" // Code to run before Mistral sends a successful response\n",
"};"
]
},
{
"cell_type": "markdown",
"id": "930df6c4",
"metadata": {},
"source": [
"To add these hooks to the chat model, either pass them as arguments and they are automatically added:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "8b8084f6",
"metadata": {},
"outputs": [],
"source": [
"import { ChatMistralAI } from \"@langchain/mistralai\" \n",
"\n",
"const modelWithHooks = new ChatMistralAI({\n",
" model: \"mistral-large-latest\",\n",
" temperature: 0,\n",
" maxRetries: 2,\n",
" beforeRequestHooks: [ beforeRequestHook ],\n",
" requestErrorHooks: [ requestErrorHook ],\n",
" responseHooks: [ responseHook ],\n",
" // other params...\n",
"});"
]
},
{
"cell_type": "markdown",
"id": "cc9478f3",
"metadata": {},
"source": [
"Or assign and add them manually after instantiation:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "daa70dc3",
"metadata": {},
"outputs": [],
"source": [
"import { ChatMistralAI } from \"@langchain/mistralai\" \n",
"\n",
"const model = new ChatMistralAI({\n",
" model: \"mistral-large-latest\",\n",
" temperature: 0,\n",
" maxRetries: 2,\n",
" // other params...\n",
"});\n",
"\n",
"model.beforeRequestHooks = [ ...model.beforeRequestHooks, beforeRequestHook ];\n",
"model.requestErrorHooks = [ ...model.requestErrorHooks, requestErrorHook ];\n",
"model.responseHooks = [ ...model.responseHooks, responseHook ];\n",
"\n",
"model.addAllHooksToHttpClient();"
]
},
{
"cell_type": "markdown",
"id": "389f5159",
"metadata": {},
"source": [
"The method addAllHooksToHttpClient clears all currently added hooks before assigning the entire updated hook lists to avoid hook duplication.\n",
"\n",
"Hooks can be removed one at a time, or all hooks can be cleared from the model at once."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "a56b64bb",
"metadata": {},
"outputs": [],
"source": [
"model.removeHookFromHttpClient(beforeRequestHook);\n",
"\n",
"model.removeAllHooksFromHttpClient();"
]
},
{
"cell_type": "markdown",
"id": "3a5bb5ca-c3ae-4a58-be67-2cd18574b9a3",
@@ -354,4 +465,4 @@
},
"nbformat": 4,
"nbformat_minor": 5
}
}
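For orientation, here is a rough, standalone sketch of how the hook API documented in the chat notebook above could be used end to end. The hook names, log messages, and prompt are illustrative assumptions rather than part of this PR; the constructor fields (beforeRequestHooks, responseHooks) and removeAllHooksFromHttpClient() come directly from the documented API.

import { ChatMistralAI } from "@langchain/mistralai";
import { HumanMessage } from "@langchain/core/messages";

// Illustrative hook: log the outgoing request URL before the Mistral SDK sends it.
const logRequestHook = (req: Request): Request | void | Promise<Request | void> => {
  console.log(`Calling Mistral endpoint: ${req.url}`);
};

// Illustrative hook: record the HTTP status of each successful response.
const logResponseHook = (res: Response, req: Request): void | Promise<void> => {
  console.log(`Mistral responded to ${req.url} with status ${res.status}`);
};

// Hooks passed as constructor arguments are attached automatically.
const chatModel = new ChatMistralAI({
  model: "mistral-large-latest",
  temperature: 0,
  beforeRequestHooks: [logRequestHook],
  responseHooks: [logResponseHook],
});

// Both hooks fire around the HTTP call made by invoke().
const reply = await chatModel.invoke([new HumanMessage("Say hello in one word.")]);
console.log(reply.content);

// Detach everything once the hooks are no longer needed.
chatModel.removeAllHooksFromHttpClient();

Because the hooks are supplied at construction time, no explicit addAllHooksToHttpClient() call is needed on this path.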
107 changes: 105 additions & 2 deletions docs/core_docs/docs/integrations/llms/mistral.ipynb
@@ -24,7 +24,7 @@
":::\n",
"\n",
":::caution\n",
"You are currently on a page documenting the use of Mistral models as [text completion models](/docs/concepts/text_llms). Many popular models available on Mistral are [chat completion models](/docs/concepts/chat_models).\n",
"You are currently on a page documenting the use of Mistral models as [text completion models](/docs/concepts/#llms). Many popular models available on Mistral are [chat completion models](/docs/concepts/#chat-models).\n",
"\n",
"You may be looking for [this page instead](/docs/integrations/chat/mistral/).\n",
":::\n",
@@ -271,6 +271,109 @@
"console.log(customOutputParser(resWithParser));"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Hooks\n",
"\n",
"Mistral AI supports custom hooks for three events: beforeRequest, requestError, and reponse. Examples of the function signature for each hook type can be seen below:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"const beforeRequestHook = (req: Request): Request | void | Promise<Request | void> => {\n",
" // Code to run before a request is processed by Mistral\n",
"};\n",
"\n",
"const requestErrorHook = (err: unknown, req: Request): void | Promise<void> => {\n",
" // Code to run when an error occurs as Mistral is processing a request\n",
"};\n",
"\n",
"const responseHook = (res: Response, req: Request): void | Promise<void> => {\n",
" // Code to run before Mistral sends a successful response\n",
"};"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"To add these hooks to the chat model, either pass them as arguments and they are automatically added:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import { ChatMistralAI } from \"@langchain/mistralai\" \n",
"\n",
"const modelWithHooks = new ChatMistralAI({\n",
" model: \"mistral-large-latest\",\n",
" temperature: 0,\n",
" maxRetries: 2,\n",
" beforeRequestHooks: [ beforeRequestHook ],\n",
" requestErrorHooks: [ requestErrorHook ],\n",
" responseHooks: [ responseHook ],\n",
" // other params...\n",
"});"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Or assign and add them manually after instantiation:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import { ChatMistralAI } from \"@langchain/mistralai\" \n",
"\n",
"const model = new ChatMistralAI({\n",
" model: \"mistral-large-latest\",\n",
" temperature: 0,\n",
" maxRetries: 2,\n",
" // other params...\n",
"});\n",
"\n",
"model.beforeRequestHooks = [ ...model.beforeRequestHooks, beforeRequestHook ];\n",
"model.requestErrorHooks = [ ...model.requestErrorHooks, requestErrorHook ];\n",
"model.responseHooks = [ ...model.responseHooks, responseHook ];\n",
"\n",
"model.addAllHooksToHttpClient();"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The method addAllHooksToHttpClient clears all currently added hooks before assigning the entire updated hook lists to avoid hook duplication.\n",
"\n",
"Hooks can be removed one at a time, or all hooks can be cleared from the model at once."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"model.removeHookFromHttpClient(beforeRequestHook);\n",
"\n",
"model.removeAllHooksFromHttpClient();"
]
},
{
"cell_type": "markdown",
"id": "e9bdfcef",
@@ -307,4 +410,4 @@
},
"nbformat": 4,
"nbformat_minor": 5
}
}
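Similarly, a hedged sketch of the manual attachment path described in the LLM notebook: hooks assigned to the model's hook fields only take effect once addAllHooksToHttpClient() is called, and can later be detached one at a time or cleared all at once. The error-reporting hook body and the codestral-latest model name are assumptions for illustration; the fields and methods mirror those shown in the diff above.

import { MistralAI } from "@langchain/mistralai";

// Illustrative hook: surface transport-level failures together with the failing URL.
const reportErrorHook = (err: unknown, req: Request): void | Promise<void> => {
  console.error(`Request to ${req.url} failed:`, err);
};

const llm = new MistralAI({
  model: "codestral-latest", // assumed completion model; substitute your own
  temperature: 0,
  maxRetries: 2,
});

// Manual path: extend the hook list, then sync it onto the underlying HTTP client.
llm.requestErrorHooks = [...llm.requestErrorHooks, reportErrorHook];
llm.addAllHooksToHttpClient();

const completion = await llm.invoke("console.log(");
console.log(completion);

// Remove a single hook, or clear them all with removeAllHooksFromHttpClient().
llm.removeHookFromHttpClient(reportErrorHook);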