Merge branch 'main' of https://github.com/hwchase17/langchainjs into jacob/executor_stream_events
jacoblee93 committed May 31, 2024
2 parents 63a7fa3 + 10f6878 commit 3b3edca
Showing 16 changed files with 850 additions and 84 deletions.
Binary file not shown.
17 changes: 6 additions & 11 deletions docs/core_docs/docs/integrations/chat/mistral.mdx
@@ -7,21 +7,16 @@ import CodeBlock from "@theme/CodeBlock";
 # ChatMistralAI
 
 [Mistral AI](https://mistral.ai/) is a research organization and hosting platform for LLMs.
-They're most known for their family of 7B models ([`mistral7b` // `mistral-tiny`](https://mistral.ai/news/announcing-mistral-7b/), [`mixtral8x7b` // `mistral-small`](https://mistral.ai/news/mixtral-of-experts/)).
 
 The LangChain implementation of Mistral's models uses their hosted generation API, making it easier to access their models without needing to run them locally.
 
-## Models
-
-Mistral's API offers access to two of their open source, and proprietary models:
+:::tip
+Want to run Mistral's models locally? Check out our [Ollama integration](/docs/integrations/chat/ollama).
+:::
 
-- `open-mistral-7b` (aka `mistral-tiny-2312`)
-- `open-mixtral-8x7b` (aka `mistral-small-2312`)
-- `mistral-small-latest` (aka `mistral-small-2402`) (default)
-- `mistral-medium-latest` (aka `mistral-medium-2312`)
-- `mistral-large-latest` (aka `mistral-large-2402`)
+## Models
 
-See [this page](https://docs.mistral.ai/guides/model-selection/) for an up to date list.
+Mistral's API offers access to both their open source and proprietary models.
+See [this page](https://docs.mistral.ai/getting-started/models/) for an up to date list.
 
 ## Setup
 
160 changes: 160 additions & 0 deletions docs/core_docs/docs/integrations/llms/mistral.ipynb
@@ -0,0 +1,160 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# MistralAI\n",
"\n",
"```{=mdx}\n",
":::tip\n",
"Want to run Mistral's models locally? Check out our [Ollama integration](/docs/integrations/chat/ollama).\n",
":::\n",
"```\n",
"\n",
"Here's how you can initialize a `MistralAI` LLM instance:\n",
"\n",
"```{=mdx}\n",
"import IntegrationInstallTooltip from \"@mdx_components/integration_install_tooltip.mdx\";\n",
"import Npm2Yarn from \"@theme/Npm2Yarn\";\n",
"\n",
"<IntegrationInstallTooltip></IntegrationInstallTooltip>\n",
"\n",
"<Npm2Yarn>\n",
" @langchain/mistralai\n",
"</Npm2Yarn>\n",
"```\n"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"console.log('hello world');\n",
"```\n",
"This will output 'hello world' to the console.\n"
]
}
],
"source": [
"import { MistralAI } from \"@langchain/mistralai\";\n",
"\n",
"const model = new MistralAI({\n",
" model: \"codestral-latest\", // Defaults to \"codestral-latest\" if no model provided.\n",
" temperature: 0,\n",
" apiKey: \"YOUR-API-KEY\", // In Node.js defaults to process.env.MISTRAL_API_KEY\n",
"});\n",
"const res = await model.invoke(\n",
" \"You can print 'hello world' to the console in javascript like this:\\n```javascript\"\n",
");\n",
"console.log(res);"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Since the Mistral LLM is a completions model, the API also lets you provide a `suffix` for the prompt. Suffixes can be passed via the call options when invoking a model like so:"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"console.log('hello world');\n",
"```\n"
]
}
],
"source": [
"const res = await model.invoke(\n",
" \"You can print 'hello world' to the console in javascript like this:\\n```javascript\", {\n",
" suffix: \"```\"\n",
" }\n",
");\n",
"console.log(res);"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"As seen in the first example, the model generated the requested `console.log('hello world')` code snippet, but also included extra unwanted text. By adding a suffix, we can constrain the model to only complete the prompt up to the suffix (in this case, three backticks). This allows us to easily parse the completion and extract only the desired response without the suffix using a custom output parser."
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"console.log('hello world');\n",
"\n"
]
}
],
"source": [
"import { MistralAI } from \"@langchain/mistralai\";\n",
"\n",
"const model = new MistralAI({\n",
" model: \"codestral-latest\",\n",
" temperature: 0,\n",
" apiKey: \"YOUR-API-KEY\",\n",
"});\n",
"\n",
"const suffix = \"```\";\n",
"\n",
"const customOutputParser = (input: string) => {\n",
" if (input.includes(suffix)) {\n",
" return input.split(suffix)[0];\n",
" }\n",
" throw new Error(\"Input does not contain suffix.\")\n",
"};\n",
"\n",
"const res = await model.invoke(\n",
" \"You can print 'hello world' to the console in javascript like this:\\n```javascript\", {\n",
" suffix,\n",
" }\n",
");\n",
"\n",
"console.log(customOutputParser(res));"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "TypeScript",
"language": "typescript",
"name": "tslab"
},
"language_info": {
"codemirror_mode": {
"mode": "typescript",
"name": "javascript",
"typescript": true
},
"file_extension": ".ts",
"mimetype": "text/typescript",
"name": "typescript",
"version": "3.7.2"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
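The suffix-stripping parser from the notebook above can also be exercised as plain TypeScript, with no API calls. This is a minimal sketch in which the sample `completion` string is hypothetical, standing in for a real model response that ends at the suffix delimiter:

```typescript
// Standalone sketch of the notebook's suffix-based output parsing.
// The suffix is three backticks, built with repeat() so it cannot be
// confused with this document's own code fences.
const suffix = "`".repeat(3);

const customOutputParser = (input: string): string => {
  if (input.includes(suffix)) {
    // Keep only the text the model generated before the suffix delimiter.
    return input.split(suffix)[0];
  }
  throw new Error("Input does not contain suffix.");
};

// Hypothetical completion, as if returned by the model with suffix appended.
const completion = "\nconsole.log('hello world');\n" + suffix;
console.log(customOutputParser(completion));
```

Everything after the first occurrence of the suffix is discarded, and input without the suffix fails loudly rather than passing through unparsed.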
4 changes: 4 additions & 0 deletions docs/core_docs/vercel.json
@@ -33,6 +33,10 @@
       "source": "/v0.2/docs(/?)",
       "destination": "/v0.2/docs/introduction/"
     },
+    {
+      "source": "/docs/integrations/:path(.*/?)*",
+      "destination": "/v0.2/docs/integrations/:path*"
+    },
     {
       "source": "/docs/:path(.*/?)*",
       "destination": "/v0.1/docs/:path*"
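The ordering of these rewrites matters: the `/docs/integrations` rule has to precede the `/docs` catch-all, or integration pages would be sent to `/v0.1`. A rough TypeScript sketch of how ordered, first-match rewriting resolves a path (the regexes here only approximate Vercel's `:path(.*/?)*` matcher, not its exact semantics):

```typescript
// Hypothetical sketch of first-match redirect resolution, mirroring the
// rule order in vercel.json above. Not Vercel's actual implementation.
const rules: Array<[RegExp, string]> = [
  [/^\/docs\/integrations\/(.*)$/, "/v0.2/docs/integrations/$1"],
  [/^\/docs\/(.*)$/, "/v0.1/docs/$1"],
];

function redirect(path: string): string {
  for (const [pattern, destination] of rules) {
    if (pattern.test(path)) {
      // First matching rule wins; later rules are never consulted.
      return path.replace(pattern, destination);
    }
  }
  return path; // No rule matched; serve the path as-is.
}

console.log(redirect("/docs/integrations/chat/mistral"));
```

Swapping the two rules would make the catch-all shadow the integrations rule, which is why the commit inserts the new entry before it.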
2 changes: 1 addition & 1 deletion examples/src/models/chat/chat_mistralai_tools.ts
@@ -33,7 +33,7 @@ class CalculatorTool extends StructuredTool {

 const model = new ChatMistralAI({
   apiKey: process.env.MISTRAL_API_KEY,
-  model: "mistral-large",
+  model: "mistral-large-latest",
 });

// Bind the tool to the model
2 changes: 1 addition & 1 deletion examples/src/models/chat/chat_mistralai_wsa.ts
@@ -14,7 +14,7 @@ const calculatorSchema = z

 const model = new ChatMistralAI({
   apiKey: process.env.MISTRAL_API_KEY,
-  model: "mistral-large",
+  model: "mistral-large-latest",
 });

// Pass the schema and tool name to the withStructuredOutput method
2 changes: 1 addition & 1 deletion examples/src/models/chat/chat_mistralai_wsa_json.ts
@@ -21,7 +21,7 @@ const calculatorJsonSchema = {

 const model = new ChatMistralAI({
   apiKey: process.env.MISTRAL_API_KEY,
-  model: "mistral-large",
+  model: "mistral-large-latest",
 });

// Pass the schema and tool name to the withStructuredOutput method
2 changes: 1 addition & 1 deletion langchain-core/package.json
@@ -1,6 +1,6 @@
 {
   "name": "@langchain/core",
-  "version": "0.2.3",
+  "version": "0.2.4",
   "description": "Core LangChain.js abstractions and schemas",
   "type": "module",
   "engines": {
4 changes: 2 additions & 2 deletions libs/langchain-mistralai/package.json
@@ -1,6 +1,6 @@
 {
   "name": "@langchain/mistralai",
-  "version": "0.0.22",
+  "version": "0.0.23",
   "description": "MistralAI integration for LangChain.js",
   "type": "module",
   "engines": {
@@ -41,7 +41,7 @@
   "license": "MIT",
   "dependencies": {
     "@langchain/core": ">0.1.56 <0.3.0",
-    "@mistralai/mistralai": "^0.1.3",
+    "@mistralai/mistralai": "^0.4.0",
     "uuid": "^9.0.0",
     "zod": "^3.22.4",
     "zod-to-json-schema": "^3.22.4"
