diff --git a/README.md b/README.md index 62053a9da..5b9352378 100644 --- a/README.md +++ b/README.md @@ -243,7 +243,7 @@ Is there anything else you'd like to know about the weather in New York or any o - [Tutorials](https://langchain-ai.github.io/langgraphjs/tutorials/): Learn to build with LangGraph through guided examples. -- [How-to Guides](https://langchain-ai.github.io/langgraphjs/how-tos/): Accomplish specific things within LangGraph, from streaming, to adding memory & persistence, to common design patterns (branching, subgraphs, etc.). These are the place to go if you want to copy and run a specific code snippet. +- [How-to Guides](https://langchain-ai.github.io/langgraphjs/how-tos/): Accomplish specific things within LangGraph, from streaming, to adding memory & persistence, to common design patterns (branching, subgraphs, etc.). These are the places to go if you want to copy and run a specific code snippet. - [Conceptual Guides](https://langchain-ai.github.io/langgraphjs/concepts/): In-depth explanations of the key concepts and principles behind LangGraph, such as nodes, edges, state and more. -- [API Reference](https://langchain-ai.github.io/langgraphjs/reference/graphs/): Review important classes and methods, simple examples of how to use the graph and checkpointing APIs, higher-level prebuilt components and more. +- [API Reference](https://langchain-ai.github.io/langgraphjs/reference/): Review important classes and methods, simple examples of how to use the graph and checkpointing APIs, higher-level prebuilt components and more. 
## Running Example Jupyter Notebooks diff --git a/examples/how-tos/branching.ipynb b/examples/how-tos/branching.ipynb index bdfc2504c..5b5020aea 100644 --- a/examples/how-tos/branching.ipynb +++ b/examples/how-tos/branching.ipynb @@ -7,7 +7,7 @@ "# How to create branches for parallel node execution\n", "\n", "LangGraph natively supports fan-out and fan-in using either regular edges or\n", - "[conditionalEdges](https://langchain-ai.github.io/langgraphjs/reference/classes/index.StateGraph.html#addConditionalEdges).\n", + "[conditionalEdges](https://langchain-ai.github.io/langgraphjs/reference/classes/langgraph.StateGraph.html#addConditionalEdges).\n", "\n", "This lets you run nodes in parallel to speed up your total graph execution.\n", "\n", diff --git a/examples/how-tos/dynamically-returning-directly.ipynb b/examples/how-tos/dynamically-returning-directly.ipynb index defa341f1..122a0e447 100644 --- a/examples/how-tos/dynamically-returning-directly.ipynb +++ b/examples/how-tos/dynamically-returning-directly.ipynb @@ -185,7 +185,7 @@ "## Define the agent state\n", "\n", "The main type of graph in `langgraph` is the\n", - "[StateGraph](https://langchain-ai.github.io/langgraphjs/reference/classes/index.StateGraph.html).\n", + "[StateGraph](https://langchain-ai.github.io/langgraphjs/reference/classes/langgraph.StateGraph.html).\n", "\n", "This graph is parameterized by a state object that it passes around to each\n", "node. Each node then returns operations to update that state. 
These operations\n", diff --git a/examples/how-tos/human-in-the-loop.ipynb b/examples/how-tos/human-in-the-loop.ipynb index 1e7cba870..8b382abc6 100644 --- a/examples/how-tos/human-in-the-loop.ipynb +++ b/examples/how-tos/human-in-the-loop.ipynb @@ -138,7 +138,7 @@ "metadata": {}, "source": [ "We can now wrap these tools in a simple\n", - "[ToolNode](https://langchain-ai.github.io/langgraphjs/reference/classes/prebuilt.ToolNode.html).\n", + "[ToolNode](https://langchain-ai.github.io/langgraphjs/reference/classes/langgraph_prebuilt.ToolNode.html).\n", "\n", "This is a simple class that takes in a list of messages containing an\n", "[AIMessage with tool_calls](https://v02.api.js.langchain.com/classes/langchain_core_messages.AIMessage.html),\n", diff --git a/examples/how-tos/managing-agent-steps.ipynb b/examples/how-tos/managing-agent-steps.ipynb index 81b6109f8..53131e3c1 100644 --- a/examples/how-tos/managing-agent-steps.ipynb +++ b/examples/how-tos/managing-agent-steps.ipynb @@ -1,469 +1,469 @@ { - "cells": [ - { - "cell_type": "markdown", - "id": "38a67792", - "metadata": {}, - "source": [ - "# How to manage agent steps\n", - "\n", - "In this example we will build a ReAct Agent that explicitly manages intermediate\n", - "steps.\n", - "\n", - "The previous examples just put all messages into the model, but that extra\n", - "context can distract the agent and add latency to the API calls. In this example\n", - "we will only include the `N` most recent messages in the chat history. Note that\n", - "this is meant to be illustrative of general state management.\n", - "\n", - "## Setup\n", - "\n", - "First we need to install the packages required\n", - "\n", - "```bash\n", - "yarn add @langchain/langgraph @langchain/openai\n", - "```\n", - "\n", - "Next, we need to set API keys for Anthropic (the LLM we will use)." 
- ] - }, - { - "cell_type": "code", - "execution_count": 1, - "id": "36033b66", - "metadata": {}, - "outputs": [], - "source": [ - "// process.env.OPENAI_API_KEY = \"sk_...\";" - ] - }, - { - "cell_type": "markdown", - "id": "e98d96da", - "metadata": {}, - "source": [ - "Optionally, we can set API key for\n", - "[LangSmith tracing](https://smith.langchain.com/), which will give us\n", - "best-in-class observability." - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "id": "38934fde", - "metadata": {}, - "outputs": [ + "cells": [ { - "name": "stdout", - "output_type": "stream", - "text": [ - "Managing Agent Steps: LangGraphJS\n" - ] - } - ], - "source": [ - "// Optional, add tracing in LangSmith\n", - "// process.env.LANGCHAIN_API_KEY = \"ls__...\";\n", - "process.env.LANGCHAIN_CALLBACKS_BACKGROUND = \"true\";\n", - "process.env.LANGCHAIN_TRACING_V2 = \"true\";\n", - "process.env.LANGCHAIN_PROJECT = \"Managing Agent Steps: LangGraphJS\";" - ] - }, - { - "cell_type": "markdown", - "id": "8aeecba6", - "metadata": {}, - "source": [ - "## Set up the State\n", - "\n", - "The main type of graph in `langgraph` is the\n", - "[StateGraph](https://langchain-ai.github.io/langgraphjs/reference/classes/index.StateGraph.html).\n", - "This graph is parameterized by a state object that it passes around to each\n", - "node. Each node then returns operations to update that state. These operations\n", - "can either SET specific attributes on the state (e.g. overwrite the existing\n", - "values) or ADD to the existing attribute. Whether to set or add is denoted in\n", - "the state object you construct the graph with.\n", - "\n", - "For this example, the state we will track will just be a list of messages. We\n", - "want each node to just add messages to that list. 
Therefore, we will define the\n", - "state as follows:" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "id": "e95ef6be", - "metadata": {}, - "outputs": [], - "source": [ - "import { Annotation } from \"@langchain/langgraph\";\n", - "import { BaseMessage } from \"@langchain/core/messages\";\n", - "\n", - "const AgentState = Annotation.Root({\n", - " messages: Annotation<BaseMessage[]>({\n", - " reducer: (x, y) => x.concat(y),\n", - " }),\n", - "});" - ] - }, - { - "cell_type": "markdown", - "id": "d6954509", - "metadata": {}, - "source": [ - "## Set up the tools\n", - "\n", - "We will first define the tools we want to use. For this simple example, we will\n", - "create a placeholder search engine. It is really easy to create your own tools -\n", - "see documentation\n", - "[here](https://js.langchain.com/docs/modules/agents/tools/dynamic) on how to do\n", - "that." - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "id": "ec9f73a5", - "metadata": {}, - "outputs": [], - "source": [ - "import { DynamicStructuredTool } from \"@langchain/core/tools\";\n", - "import { z } from \"zod\";\n", - "\n", - "const searchTool = new DynamicStructuredTool({\n", - " name: \"search\",\n", - " description: \"Call to surf the web.\",\n", - " schema: z.object({\n", - " query: z.string().describe(\"The query to use in your search.\"),\n", - " }),\n", - " func: async ({}: { query: string }) => {\n", - " // This is a placeholder, but don't tell the LLM that...\n", - " return \"Try again in a few seconds! Checking with the weathermen... 
Call be again next.\";\n", - " },\n", - "});\n", - "\n", - "const tools = [searchTool];" - ] - }, - { - "cell_type": "markdown", - "id": "e8669db6", - "metadata": {}, - "source": [ - "We can now wrap these tools in a simple\n", - "[ToolNode](https://langchain-ai.github.io/langgraphjs/reference/classes/prebuilt.ToolNode.html).\\\n", - "This is a simple class that takes in a list of messages containing an\n", - "[AIMessages with tool_calls](https://v02.api.js.langchain.com/classes/langchain_core_messages_ai.AIMessage.html),\n", - "runs the tools, and returns the output as\n", - "[ToolMessage](https://v02.api.js.langchain.com/classes/langchain_core_messages_tool.ToolMessage.html)s." - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "id": "7f4829c3", - "metadata": {}, - "outputs": [], - "source": [ - "import { ToolNode } from \"@langchain/langgraph/prebuilt\";\n", - "\n", - "const toolNode = new ToolNode(tools);" - ] - }, - { - "cell_type": "markdown", - "id": "81a0a750", - "metadata": {}, - "source": [ - "## Set up the model\n", - "\n", - "Now we need to load the chat model we want to use. This should satisfy two\n", - "criteria:\n", - "\n", - "1. It should work with messages, since our state is primarily a list of messages\n", - " (chat history).\n", - "2. It should work with tool calling, since we are using a prebuilt\n", - " [ToolNode](https://langchain-ai.github.io/langgraphjs/reference/classes/prebuilt.ToolNode.html)\n", - "\n", - "**Note:** these model requirements are not requirements for using LangGraph -\n", - "they are just requirements for this particular example." 
- ] - }, - { - "cell_type": "code", - "execution_count": 6, - "id": "cf1fcc3f", - "metadata": {}, - "outputs": [], - "source": [ - "import { ChatOpenAI } from \"@langchain/openai\";\n", - "\n", - "const model = new ChatOpenAI({\n", - " model: \"gpt-4o\",\n", - " temperature: 0,\n", - "});" - ] - }, - { - "cell_type": "code", - "execution_count": 7, - "id": "a0903bb8", - "metadata": {}, - "outputs": [], - "source": [ - "// After we've done this, we should make sure the model knows that it has these tools available to call.\n", - "// We can do this by binding the tools to the model class.\n", - "const boundModel = model.bindTools(tools);" - ] - }, - { - "cell_type": "markdown", - "id": "c96f67f3", - "metadata": {}, - "source": [ - "## Define the nodes\n", - "\n", - "We now need to define a few different nodes in our graph. In `langgraph`, a node\n", - "can be either a function or a\n", - "[runnable](https://js.langchain.com/docs/expression_language/). There are two\n", - "main nodes we need for this:\n", - "\n", - "1. The agent: responsible for deciding what (if any) actions to take.\n", - "2. A function to invoke tools: if the agent decides to take an action, this node\n", - " will then execute that action.\n", - "\n", - "We will also need to define some edges. Some of these edges may be conditional.\n", - "The reason they are conditional is that based on the output of a node, one of\n", - "several paths may be taken. The path that is taken is not known until that node\n", - "is run (the LLM decides).\n", - "\n", - "1. Conditional Edge: after the agent is called, we should either: a. If the\n", - " agent said to take an action, then the function to invoke tools should be\n", - " called\\\n", - " b. If the agent said that it was finished, then it should finish\n", - "2. 
Normal Edge: after the tools are invoked, it should always go back to the\n", - " agent to decide what to do next\n", - "\n", - "Let's define the nodes, as well as a function to decide how what conditional\n", - "edge to take." - ] - }, - { - "cell_type": "code", - "execution_count": 8, - "id": "1249b1b3", - "metadata": {}, - "outputs": [], - "source": [ - "import { END } from \"@langchain/langgraph\";\n", - "import { AIMessage, ToolMessage } from \"@langchain/core/messages\";\n", - "import { RunnableConfig } from \"@langchain/core/runnables\";\n", - "\n", - "// Define the function that determines whether to continue or not\n", - "const shouldContinue = (state: typeof AgentState.State) => {\n", - " const { messages } = state;\n", - " const lastMessage = messages[messages.length - 1] as AIMessage;\n", - " // If there is no function call, then we finish\n", - " if (!lastMessage.tool_calls || lastMessage.tool_calls.length === 0) {\n", - " return END;\n", - " }\n", - " // Otherwise if there is, we continue\n", - " return \"tools\";\n", - "};\n", - "\n", - "// **MODIFICATION**\n", - "//\n", - "// Here we don't pass all messages to the model but rather only pass the `N` most recent. Note that this is a terribly simplistic way to handle messages meant as an illustration, and there may be other methods you may want to look into depending on your use case. 
We also have to make sure we don't truncate the chat history to include the tool message first, as this would cause an API error.\n", - "const callModel = async (\n", - " state: typeof AgentState.State,\n", - " config?: RunnableConfig,\n", - ") => {\n", - " let modelMessages = [];\n", - " for (let i = state.messages.length - 1; i >= 0; i--) {\n", - " modelMessages.push(state.messages[i]);\n", - " if (modelMessages.length >= 5) {\n", - " if (!ToolMessage.isInstance(modelMessages[modelMessages.length - 1])) {\n", - " break;\n", - " }\n", - " }\n", - " }\n", - " modelMessages.reverse();\n", - "\n", - " const response = await boundModel.invoke(modelMessages, config);\n", - " // We return an object, because this will get added to the existing list\n", - " return { messages: [response] };\n", - "};" - ] - }, - { - "cell_type": "markdown", - "id": "227a5040", - "metadata": {}, - "source": [ - "## Define the graph\n", - "\n", - "We can now put it all together and define the graph!" - ] - }, - { - "cell_type": "code", - "execution_count": 9, - "id": "ff5f7b65", - "metadata": {}, - "outputs": [], - "source": [ - "import { START, StateGraph } from \"@langchain/langgraph\";\n", - "\n", - "// Define a new graph\n", - "const workflow = new StateGraph(AgentState)\n", - " .addNode(\"agent\", callModel)\n", - " .addNode(\"tools\", toolNode)\n", - " .addEdge(START, \"agent\")\n", - " .addConditionalEdges(\n", - " \"agent\",\n", - " shouldContinue,\n", - " )\n", - " .addEdge(\"tools\", \"agent\");\n", - "\n", - "// Finally, we compile it!\n", - "// This compiles it into a LangChain Runnable,\n", - "// meaning you can use it as you would any other runnable\n", - "const app = workflow.compile();" - ] - }, - { - "cell_type": "markdown", - "id": "6049db62", - "metadata": {}, - "source": [ - "## Use it!\n", - "\n", - "We can now use it! This now exposes the\n", - "[same interface](https://js.langchain.com/docs/expression_language/) as all\n", - "other LangChain runnables." 
- ] - }, - { - "cell_type": "code", - "execution_count": 10, - "id": "7bd7315e", - "metadata": {}, - "outputs": [ + "cell_type": "markdown", + "id": "38a67792", + "metadata": {}, + "source": [ + "# How to manage agent steps\n", + "\n", + "In this example we will build a ReAct Agent that explicitly manages intermediate\n", + "steps.\n", + "\n", + "The previous examples just put all messages into the model, but that extra\n", + "context can distract the agent and add latency to the API calls. In this example\n", + "we will only include the `N` most recent messages in the chat history. Note that\n", + "this is meant to be illustrative of general state management.\n", + "\n", + "## Setup\n", + "\n", + "First we need to install the packages required\n", + "\n", + "```bash\n", + "yarn add @langchain/langgraph @langchain/openai\n", + "```\n", + "\n", + "Next, we need to set API keys for Anthropic (the LLM we will use)." + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "[human]: what is the weather in sf? Don't give up! Keep using your tools.\n", - "-----\n", - "\n", - "[ai]: \n", - "Tools: \n", - "- search({\"query\":\"current weather in San Francisco\"})\n", - "-----\n", - "\n", - "[tool]: Try again in a few seconds! Checking with the weathermen... Call be again next.\n", - "-----\n", - "\n", - "[ai]: \n", - "Tools: \n", - "- search({\"query\":\"current weather in San Francisco\"})\n", - "-----\n", - "\n", - "[tool]: Try again in a few seconds! Checking with the weathermen... Call be again next.\n", - "-----\n", - "\n", - "[ai]: \n", - "Tools: \n", - "- search({\"query\":\"current weather in San Francisco\"})\n", - "-----\n", - "\n", - "[tool]: Try again in a few seconds! Checking with the weathermen... Call be again next.\n", - "-----\n", - "\n", - "[ai]: \n", - "Tools: \n", - "- search({\"query\":\"current weather in San Francisco\"})\n", - "-----\n", - "\n", - "[tool]: Try again in a few seconds! Checking with the weathermen... 
Call be again next.\n", - "-----\n", - "\n", - "[ai]: \n", - "Tools: \n", - "- search({\"query\":\"current weather in San Francisco\"})\n", - "-----\n", - "\n", - "As expected, maximum steps reached. Exiting.\n" - ] + "cell_type": "code", + "execution_count": 1, + "id": "36033b66", + "metadata": {}, + "outputs": [], + "source": [ + "// process.env.OPENAI_API_KEY = \"sk_...\";" + ] + }, + { + "cell_type": "markdown", + "id": "e98d96da", + "metadata": {}, + "source": [ + "Optionally, we can set API key for\n", + "[LangSmith tracing](https://smith.langchain.com/), which will give us\n", + "best-in-class observability." + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "id": "38934fde", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Managing Agent Steps: LangGraphJS\n" + ] + } + ], + "source": [ + "// Optional, add tracing in LangSmith\n", + "// process.env.LANGCHAIN_API_KEY = \"ls__...\";\n", + "process.env.LANGCHAIN_CALLBACKS_BACKGROUND = \"true\";\n", + "process.env.LANGCHAIN_TRACING_V2 = \"true\";\n", + "process.env.LANGCHAIN_PROJECT = \"Managing Agent Steps: LangGraphJS\";" + ] + }, + { + "cell_type": "markdown", + "id": "8aeecba6", + "metadata": {}, + "source": [ + "## Set up the State\n", + "\n", + "The main type of graph in `langgraph` is the\n", + "[StateGraph](https://langchain-ai.github.io/langgraphjs/reference/classes/langgraph.StateGraph.html).\n", + "This graph is parameterized by a state object that it passes around to each\n", + "node. Each node then returns operations to update that state. These operations\n", + "can either SET specific attributes on the state (e.g. overwrite the existing\n", + "values) or ADD to the existing attribute. Whether to set or add is denoted in\n", + "the state object you construct the graph with.\n", + "\n", + "For this example, the state we will track will just be a list of messages. We\n", + "want each node to just add messages to that list. 
Therefore, we will define the\n", + "state as follows:" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "id": "e95ef6be", + "metadata": {}, + "outputs": [], + "source": [ + "import { Annotation } from \"@langchain/langgraph\";\n", + "import { BaseMessage } from \"@langchain/core/messages\";\n", + "\n", + "const AgentState = Annotation.Root({\n", + " messages: Annotation<BaseMessage[]>({\n", + " reducer: (x, y) => x.concat(y),\n", + " }),\n", + "});" + ] + }, + { + "cell_type": "markdown", + "id": "d6954509", + "metadata": {}, + "source": [ + "## Set up the tools\n", + "\n", + "We will first define the tools we want to use. For this simple example, we will\n", + "create a placeholder search engine. It is really easy to create your own tools -\n", + "see documentation\n", + "[here](https://js.langchain.com/docs/modules/agents/tools/dynamic) on how to do\n", + "that." + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "id": "ec9f73a5", + "metadata": {}, + "outputs": [], + "source": [ + "import { DynamicStructuredTool } from \"@langchain/core/tools\";\n", + "import { z } from \"zod\";\n", + "\n", + "const searchTool = new DynamicStructuredTool({\n", + " name: \"search\",\n", + " description: \"Call to surf the web.\",\n", + " schema: z.object({\n", + " query: z.string().describe(\"The query to use in your search.\"),\n", + " }),\n", + " func: async ({}: { query: string }) => {\n", + " // This is a placeholder, but don't tell the LLM that...\n", + " return \"Try again in a few seconds! Checking with the weathermen... 
Call be again next.\";\n", + " },\n", + "});\n", + "\n", + "const tools = [searchTool];" + ] + }, + { + "cell_type": "markdown", + "id": "e8669db6", + "metadata": {}, + "source": [ + "We can now wrap these tools in a simple\n", + "[ToolNode](https://langchain-ai.github.io/langgraphjs/reference/classes/langgraph_prebuilt.ToolNode.html).\\\n", + "This is a simple class that takes in a list of messages containing an\n", + "[AIMessage with tool_calls](https://v02.api.js.langchain.com/classes/langchain_core_messages_ai.AIMessage.html),\n", + "runs the tools, and returns the output as\n", + "[ToolMessage](https://v02.api.js.langchain.com/classes/langchain_core_messages_tool.ToolMessage.html)s." + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "id": "7f4829c3", + "metadata": {}, + "outputs": [], + "source": [ + "import { ToolNode } from \"@langchain/langgraph/prebuilt\";\n", + "\n", + "const toolNode = new ToolNode(tools);" + ] + }, + { + "cell_type": "markdown", + "id": "81a0a750", + "metadata": {}, + "source": [ + "## Set up the model\n", + "\n", + "Now we need to load the chat model we want to use. This should satisfy two\n", + "criteria:\n", + "\n", + "1. It should work with messages, since our state is primarily a list of messages\n", + " (chat history).\n", + "2. It should work with tool calling, since we are using a prebuilt\n", + " [ToolNode](https://langchain-ai.github.io/langgraphjs/reference/classes/langgraph_prebuilt.ToolNode.html)\n", + "\n", + "**Note:** these model requirements are not requirements for using LangGraph -\n", + "they are just requirements for this particular example." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 6, + "id": "cf1fcc3f", + "metadata": {}, + "outputs": [], + "source": [ + "import { ChatOpenAI } from \"@langchain/openai\";\n", + "\n", + "const model = new ChatOpenAI({\n", + " model: \"gpt-4o\",\n", + " temperature: 0,\n", + "});" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "id": "a0903bb8", + "metadata": {}, + "outputs": [], + "source": [ + "// After we've done this, we should make sure the model knows that it has these tools available to call.\n", + "// We can do this by binding the tools to the model class.\n", + "const boundModel = model.bindTools(tools);" + ] + }, + { + "cell_type": "markdown", + "id": "c96f67f3", + "metadata": {}, + "source": [ + "## Define the nodes\n", + "\n", + "We now need to define a few different nodes in our graph. In `langgraph`, a node\n", + "can be either a function or a\n", + "[runnable](https://js.langchain.com/docs/expression_language/). There are two\n", + "main nodes we need for this:\n", + "\n", + "1. The agent: responsible for deciding what (if any) actions to take.\n", + "2. A function to invoke tools: if the agent decides to take an action, this node\n", + " will then execute that action.\n", + "\n", + "We will also need to define some edges. Some of these edges may be conditional.\n", + "The reason they are conditional is that based on the output of a node, one of\n", + "several paths may be taken. The path that is taken is not known until that node\n", + "is run (the LLM decides).\n", + "\n", + "1. Conditional Edge: after the agent is called, we should either: a. If the\n", + " agent said to take an action, then the function to invoke tools should be\n", + " called\\\n", + " b. If the agent said that it was finished, then it should finish\n", + "2. 
Normal Edge: after the tools are invoked, it should always go back to the\n", + " agent to decide what to do next\n", + "\n", + "Let's define the nodes, as well as a function to decide which conditional\n", + " edge to take." + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "id": "1249b1b3", + "metadata": {}, + "outputs": [], + "source": [ + "import { END } from \"@langchain/langgraph\";\n", + "import { AIMessage, ToolMessage } from \"@langchain/core/messages\";\n", + "import { RunnableConfig } from \"@langchain/core/runnables\";\n", + "\n", + "// Define the function that determines whether to continue or not\n", + "const shouldContinue = (state: typeof AgentState.State) => {\n", + " const { messages } = state;\n", + " const lastMessage = messages[messages.length - 1] as AIMessage;\n", + " // If there is no function call, then we finish\n", + " if (!lastMessage.tool_calls || lastMessage.tool_calls.length === 0) {\n", + " return END;\n", + " }\n", + " // Otherwise if there is, we continue\n", + " return \"tools\";\n", + "};\n", + "\n", + "// **MODIFICATION**\n", + "//\n", + "// Here we don't pass all messages to the model but rather only pass the `N` most recent. Note that this is a deliberately simplistic way to handle messages, meant as an illustration, and there may be other methods you may want to look into depending on your use case. 
We also have to make sure we don't truncate the chat history to include the tool message first, as this would cause an API error.\n", + "const callModel = async (\n", + " state: typeof AgentState.State,\n", + " config?: RunnableConfig,\n", + ") => {\n", + " let modelMessages = [];\n", + " for (let i = state.messages.length - 1; i >= 0; i--) {\n", + " modelMessages.push(state.messages[i]);\n", + " if (modelMessages.length >= 5) {\n", + " if (!ToolMessage.isInstance(modelMessages[modelMessages.length - 1])) {\n", + " break;\n", + " }\n", + " }\n", + " }\n", + " modelMessages.reverse();\n", + "\n", + " const response = await boundModel.invoke(modelMessages, config);\n", + " // We return an object, because this will get added to the existing list\n", + " return { messages: [response] };\n", + "};" + ] + }, + { + "cell_type": "markdown", + "id": "227a5040", + "metadata": {}, + "source": [ + "## Define the graph\n", + "\n", + "We can now put it all together and define the graph!" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "id": "ff5f7b65", + "metadata": {}, + "outputs": [], + "source": [ + "import { START, StateGraph } from \"@langchain/langgraph\";\n", + "\n", + "// Define a new graph\n", + "const workflow = new StateGraph(AgentState)\n", + " .addNode(\"agent\", callModel)\n", + " .addNode(\"tools\", toolNode)\n", + " .addEdge(START, \"agent\")\n", + " .addConditionalEdges(\n", + " \"agent\",\n", + " shouldContinue,\n", + " )\n", + " .addEdge(\"tools\", \"agent\");\n", + "\n", + "// Finally, we compile it!\n", + "// This compiles it into a LangChain Runnable,\n", + "// meaning you can use it as you would any other runnable\n", + "const app = workflow.compile();" + ] + }, + { + "cell_type": "markdown", + "id": "6049db62", + "metadata": {}, + "source": [ + "## Use it!\n", + "\n", + "We can now use it! This now exposes the\n", + "[same interface](https://js.langchain.com/docs/expression_language/) as all\n", + "other LangChain runnables." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 10, + "id": "7bd7315e", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[human]: what is the weather in sf? Don't give up! Keep using your tools.\n", + "-----\n", + "\n", + "[ai]: \n", + "Tools: \n", + "- search({\"query\":\"current weather in San Francisco\"})\n", + "-----\n", + "\n", + "[tool]: Try again in a few seconds! Checking with the weathermen... Call be again next.\n", + "-----\n", + "\n", + "[ai]: \n", + "Tools: \n", + "- search({\"query\":\"current weather in San Francisco\"})\n", + "-----\n", + "\n", + "[tool]: Try again in a few seconds! Checking with the weathermen... Call be again next.\n", + "-----\n", + "\n", + "[ai]: \n", + "Tools: \n", + "- search({\"query\":\"current weather in San Francisco\"})\n", + "-----\n", + "\n", + "[tool]: Try again in a few seconds! Checking with the weathermen... Call be again next.\n", + "-----\n", + "\n", + "[ai]: \n", + "Tools: \n", + "- search({\"query\":\"current weather in San Francisco\"})\n", + "-----\n", + "\n", + "[tool]: Try again in a few seconds! Checking with the weathermen... Call be again next.\n", + "-----\n", + "\n", + "[ai]: \n", + "Tools: \n", + "- search({\"query\":\"current weather in San Francisco\"})\n", + "-----\n", + "\n", + "As expected, maximum steps reached. 
Exiting.\n" + ] + } + ], + "source": [ + "import { HumanMessage, isAIMessage } from \"@langchain/core/messages\";\n", + "import { GraphRecursionError } from \"@langchain/langgraph\";\n", + "\n", + "const prettyPrint = (message: BaseMessage) => {\n", + " let txt = `[${message._getType()}]: ${message.content}`;\n", + " if (\n", + " isAIMessage(message) &&\n", + " (message.tool_calls?.length ?? 0) > 0\n", + " ) {\n", + " const tool_calls = (message as AIMessage)?.tool_calls\n", + " ?.map((tc) => `- ${tc.name}(${JSON.stringify(tc.args)})`)\n", + " .join(\"\\n\");\n", + " txt += ` \\nTools: \\n${tool_calls}`;\n", + " }\n", + " console.log(txt);\n", + "};\n", + "\n", + "const inputs = {\n", + " messages: [\n", + " new HumanMessage(\n", + " \"what is the weather in sf? Don't give up! Keep using your tools.\",\n", + " ),\n", + " ],\n", + "};\n", + "// Setting the recursionLimit will set a max number of steps. We expect this to endlessly loop :)\n", + "try {\n", + " for await (\n", + " const output of await app.stream(inputs, {\n", + " streamMode: \"values\",\n", + " recursionLimit: 10,\n", + " })\n", + " ) {\n", + " const lastMessage = output.messages[output.messages.length - 1];\n", + " prettyPrint(lastMessage);\n", + " console.log(\"-----\\n\");\n", + " }\n", + "} catch (e) {\n", + " // Since we are truncating the chat history, the agent never gets the chance\n", + " // to see enough information to know to stop, so it will keep looping until we hit the\n", + " // maximum recursion limit.\n", + " if ((e as GraphRecursionError).name === \"GraphRecursionError\") {\n", + " console.log(\"As expected, maximum steps reached. 
Exiting.\");\n", + " } else {\n", + " console.error(e);\n", + " }\n", + "}" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "TypeScript", + "language": "typescript", + "name": "tslab" + }, + "language_info": { + "codemirror_mode": { + "mode": "typescript", + "name": "javascript", + "typescript": true + }, + "file_extension": ".ts", + "mimetype": "text/typescript", + "name": "typescript", + "version": "3.7.2" } - ], - "source": [ - "import { HumanMessage, isAIMessage } from \"@langchain/core/messages\";\n", - "import { GraphRecursionError } from \"@langchain/langgraph\";\n", - "\n", - "const prettyPrint = (message: BaseMessage) => {\n", - " let txt = `[${message._getType()}]: ${message.content}`;\n", - " if (\n", - " (isAIMessage(message) && (message as AIMessage)?.tool_calls?.length) ||\n", - " 0 > 0\n", - " ) {\n", - " const tool_calls = (message as AIMessage)?.tool_calls\n", - " ?.map((tc) => `- ${tc.name}(${JSON.stringify(tc.args)})`)\n", - " .join(\"\\n\");\n", - " txt += ` \\nTools: \\n${tool_calls}`;\n", - " }\n", - " console.log(txt);\n", - "};\n", - "\n", - "const inputs = {\n", - " messages: [\n", - " new HumanMessage(\n", - " \"what is the weather in sf? Don't give up! Keep using your tools.\",\n", - " ),\n", - " ],\n", - "};\n", - "// Setting the recursionLimit will set a max number of steps. 
We expect this to endlessly loop :)\n", - "try {\n", - " for await (\n", - " const output of await app.stream(inputs, {\n", - " streamMode: \"values\",\n", - " recursionLimit: 10,\n", - " })\n", - " ) {\n", - " const lastMessage = output.messages[output.messages.length - 1];\n", - " prettyPrint(lastMessage);\n", - " console.log(\"-----\\n\");\n", - " }\n", - "} catch (e) {\n", - " // Since we are truncating the chat history, the agent never gets the chance\n", - " // to see enough information to know to stop, so it will keep looping until we hit the\n", - " // maximum recursion limit.\n", - " if ((e as GraphRecursionError).name === \"GraphRecursionError\") {\n", - " console.log(\"As expected, maximum steps reached. Exiting.\");\n", - " } else {\n", - " console.error(e);\n", - " }\n", - "}" - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "TypeScript", - "language": "typescript", - "name": "tslab" }, - "language_info": { - "codemirror_mode": { - "mode": "typescript", - "name": "javascript", - "typescript": true - }, - "file_extension": ".ts", - "mimetype": "text/typescript", - "name": "typescript", - "version": "3.7.2" - } - }, - "nbformat": 4, - "nbformat_minor": 5 -} + "nbformat": 4, + "nbformat_minor": 5 +} \ No newline at end of file diff --git a/examples/how-tos/persistence-postgres.ipynb b/examples/how-tos/persistence-postgres.ipynb index c6b7dc269..4766fd48c 100644 --- a/examples/how-tos/persistence-postgres.ipynb +++ b/examples/how-tos/persistence-postgres.ipynb @@ -396,7 +396,7 @@ "metadata": {}, "source": [ "We can now wrap these tools in a simple\n", - "[ToolNode](https://langchain-ai.github.io/langgraphjs/reference/classes/prebuilt.ToolNode.html).\n", + "[ToolNode](https://langchain-ai.github.io/langgraphjs/reference/classes/langgraph_prebuilt.ToolNode.html).\n", "This object will actually run the tools (functions) whenever they are invoked by\n", "our LLM.\n" ] diff --git a/examples/how-tos/persistence.ipynb 
b/examples/how-tos/persistence.ipynb index ed58047c1..13602fda2 100644 --- a/examples/how-tos/persistence.ipynb +++ b/examples/how-tos/persistence.ipynb @@ -9,17 +9,17 @@ "\n", "Many AI applications need memory to share context across multiple interactions.\n", "In LangGraph, memory is provided for any\n", - "[StateGraph](https://langchain-ai.github.io/langgraphjs/reference/classes/index.StateGraph.html)\n", + "[StateGraph](https://langchain-ai.github.io/langgraphjs/reference/classes/langgraph.StateGraph.html)\n", "through\n", - "[Checkpointers](https://langchain-ai.github.io/langgraphjs/reference/interfaces/index.Checkpoint.html).\n", + "[Checkpointers](https://langchain-ai.github.io/langgraphjs/reference/modules/checkpoint.html).\n", "\n", "When creating any LangGraph workflow, you can set them up to persist their state\n", "by doing using the following:\n", "\n", "1. A\n", - " [Checkpointer](https://langchain-ai.github.io/langgraphjs/reference/classes/index.BaseCheckpointSaver.html),\n", + " [Checkpointer](https://langchain-ai.github.io/langgraphjs/reference/classes/checkpoint.BaseCheckpointSaver.html),\n", " such as the\n", - " [MemorySaver](https://langchain-ai.github.io/langgraphjs/reference/classes/index.MemorySaver.html)\n", + " [MemorySaver](https://langchain-ai.github.io/langgraphjs/reference/classes/checkpoint.MemorySaver.html)\n", "2. 
Call `compile(checkpointer=myCheckpointer)` when compiling the graph.\n", "\n", "Example:\n", @@ -38,16 +38,16 @@ "```\n", "\n", "This works for\n", - "[StateGraph](https://langchain-ai.github.io/langgraphjs/reference/classes/index.StateGraph.html)\n", + "[StateGraph](https://langchain-ai.github.io/langgraphjs/reference/classes/langgraph.StateGraph.html)\n", "and all its subclasses, such as\n", - "[MessageGraph](https://langchain-ai.github.io/langgraphjs/reference/classes/index.MessageGraph.html).\n", + "[MessageGraph](https://langchain-ai.github.io/langgraphjs/reference/classes/langgraph.MessageGraph.html).\n", "\n", "Below is an example.\n", "\n", "
<div class=\"admonition tip\">\n",
    "    <p class=\"admonition-title\">Note</p>\n",
    "    <p>\n",
-    "        In this how-to, we will create our agent from scratch to be transparent (but verbose). You can accomplish similar functionality using the createReactAgent(model, tools=tool, checkpointer=checkpointer) (API doc) constructor. This may be more appropriate if you are used to LangChain's AgentExecutor class.\n",
+    "        In this how-to, we will create our agent from scratch to be transparent (but verbose). You can accomplish similar functionality using the createReactAgent(model, tools=tool, checkpointer=checkpointer) (API doc) constructor. This may be more appropriate if you are used to LangChain's AgentExecutor class.\n",
    "    </p>\n",
    "</div>
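The thread-scoped persistence this hunk describes can be sketched without any LangGraph dependencies. The following toy class uses hypothetical names (`ToyMemorySaver`, `ToyCheckpoint`) and is not the real `BaseCheckpointSaver` interface; it only illustrates how checkpointed state is keyed by a thread id so it survives across invocations:

```typescript
// Toy sketch of the checkpointer idea: state is keyed by thread id, so each
// conversation thread is saved and restored independently across invocations.
// Names here are illustrative, not the real BaseCheckpointSaver API.
type ToyCheckpoint = { messages: string[] };

class ToyMemorySaver {
  private store = new Map<string, ToyCheckpoint>();

  get(threadId: string): ToyCheckpoint {
    return this.store.get(threadId) ?? { messages: [] };
  }

  put(threadId: string, checkpoint: ToyCheckpoint): void {
    this.store.set(threadId, checkpoint);
  }
}

const saver = new ToyMemorySaver();

// First invocation on thread "1" records a message.
const prior = saver.get("1");
saver.put("1", { messages: [...prior.messages, "hi! I'm bob"] });

// A later invocation on the same thread sees the earlier message,
// while a different thread starts from an empty state.
console.log(saver.get("1").messages.length);
console.log(saver.get("2").messages.length);
```

The real `MemorySaver` plays the same role in-process; swapping in a database-backed saver changes only where `put`/`get` store the checkpoint, not the graph code.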
\n", "\n", diff --git a/examples/how-tos/respond-in-format.ipynb b/examples/how-tos/respond-in-format.ipynb index b74deccb4..aa0851fd8 100644 --- a/examples/how-tos/respond-in-format.ipynb +++ b/examples/how-tos/respond-in-format.ipynb @@ -1,570 +1,570 @@ { - "cells": [ - { - "cell_type": "markdown", - "id": "2c3d2c48", - "metadata": {}, - "source": [ - "# How to have agent respond in structured format\n", - "\n", - "The typical ReAct agent prompts the LLM to respond in 1 of two formats: a\n", - "function call (~ JSON) to use a tool, or conversational text to respond to the\n", - "user.\n", - "\n", - "If your agent is connected to a structured (or even generative) UI, or if it is\n", - "communicating with another agent or software process, you may want it to resopnd\n", - "in a specific structured format.\n", - "\n", - "In this example we will build a conversational ReAct agent that responds in a\n", - "specific format. We will do this by using\n", - "[tool calling](https://js.langchain.com/docs/modules/model_io/models/chat/function-calling/).\n", - "This is useful when you want to enforce that an agent's response is in a\n", - "specific format. 
In this example, we will ask it respond as if it were a\n", - "weatherman, returning the temperature and additional info in separate,\n", - "machine-readable fields.\n" - ] - }, - { - "cell_type": "markdown", - "id": "5860c111", - "metadata": {}, - "source": [ - "## Setup\n", - "\n", - "First we need to install the packages required\n", - "\n", - "```bash\n", - "yarn add langchain @langchain/anthropic @langchain/langgraph\n", - "```\n" - ] - }, - { - "cell_type": "markdown", - "id": "23523fc0", - "metadata": {}, - "source": [ - "Next, we need to set API keys for OpenAI (the LLM we will use).\n" - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "id": "fb3ada8f", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "// process.env.OPENAI_API_KEY = \"sk_...\";" - ] - }, - { - "cell_type": "markdown", - "id": "e7e7be8c", - "metadata": {}, - "source": [ - "Optionally, we can set API key for\n", - "[LangSmith tracing](https://smith.langchain.com/), which will give us\n", - "best-in-class observability.\n" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "id": "bf127e2b", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [ + "cells": [ { - "name": "stdout", - "output_type": "stream", - "text": [ - "Respond in Format: LangGraphJS\n" - ] - } - ], - "source": [ - "// process.env.LANGCHAIN_API_KEY = \"ls...\";\n", - "process.env.LANGCHAIN_CALLBACKS_BACKGROUND = \"true\";\n", - "process.env.LANGCHAIN_TRACING_V2 = \"true\";\n", - "process.env.LANGCHAIN_PROJECT = \"Respond in Format: LangGraphJS\";" - ] - }, - { - "cell_type": "markdown", - "id": "b214dd10", - "metadata": {}, - "source": [ - "## Set up the State\n" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "id": "4ad79663", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "import { Annotation, messagesStateReducer } from \"@langchain/langgraph\";\n", - "import { BaseMessage } from \"@langchain/core/messages\";\n", - 
"\n", - "const GraphState = Annotation.Root({\n", - " messages: Annotation({\n", - " reducer: messagesStateReducer,\n", - " }),\n", - "});" - ] - }, - { - "cell_type": "markdown", - "id": "eeca531d", - "metadata": {}, - "source": [ - "## Set up the tools\n" - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "id": "d0fe8477", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "import { tool } from \"@langchain/core/tools\";\n", - "import { z } from \"zod\";\n", - "\n", - "const searchTool = tool((_) => {\n", - " // This is a placeholder, but don't tell the LLM that...\n", - " return \"67 degrees. Cloudy with a chance of rain.\";\n", - "}, {\n", - " name: \"search\",\n", - " description: \"Call to surf the web.\",\n", - " schema: z.object({\n", - " query: z.string().describe(\"The query to use in your search.\"),\n", - " }),\n", - "});\n", - "\n", - "const tools = [searchTool];" - ] - }, - { - "cell_type": "markdown", - "id": "f1a6aa07", - "metadata": {}, - "source": [ - "We can now wrap these tools in a\n", - "[ToolNode](https://langchain-ai.github.io/langgraphjs/reference/classes/prebuilt.ToolNode.html).\n" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "id": "df80654e", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "import { ToolNode } from \"@langchain/langgraph/prebuilt\";\n", - "\n", - "const toolNode = new ToolNode(tools);" - ] - }, - { - "cell_type": "markdown", - "id": "19f17e92", - "metadata": {}, - "source": [ - "## Set up the model\n" - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "id": "9c644fb9", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "import { ChatOpenAI } from \"@langchain/openai\";\n", - "\n", - "const model = new ChatOpenAI({\n", - " temperature: 0,\n", - " model: \"gpt-4o\",\n", - "});" - ] - }, - { - "cell_type": "markdown", - "id": "bb86967d", - "metadata": {}, - "source": [ - "After we've done this, we 
should make sure the model knows that it has these\n", - "tools available to call. We can do this by binding the LangChain tools to the model class.\n", - "\n", - "We also want to define a response schema for the language model and bind it to\n", - "the model as a tool. The idea is that when the model is ready to respond, it'll call this final\n", - "tool and populate arguments for it according to the schema we want. Rather than calling\n", - "a tool, we'll instead return from the graph.\n", - "\n", - "Because we only intend to use this final tool to guide the schema of the model's final response,\n", - "we'll declare it with a mocked out function:" - ] - }, - { - "cell_type": "code", - "execution_count": 7, - "id": "e148a48b", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "import { tool } from \"@langchain/core/tools\";\n", - "\n", - "const Response = z.object({\n", - " temperature: z.number().describe(\"the temperature\"),\n", - " other_notes: z.string().describe(\"any other notes about the weather\"),\n", - "});\n", - "\n", - "const finalResponseTool = tool(async () => \"mocked value\", {\n", - " name: \"Response\",\n", - " description: \"Always respond to the user using this tool.\",\n", - " schema: Response\n", - "})\n", - "\n", - "const boundModel = model.bindTools([\n", - " ...tools,\n", - " finalResponseTool\n", - "]);" - ] - }, - { - "cell_type": "markdown", - "id": "6e082c02", - "metadata": {}, - "source": [ - "## Define the nodes\n" - ] - }, - { - "cell_type": "code", - "execution_count": 8, - "id": "960ef633", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "import { AIMessage } from \"@langchain/core/messages\";\n", - "import { RunnableConfig } from \"@langchain/core/runnables\";\n", - "\n", - "// Define the function that determines whether to continue or not\n", - "const route = (state: typeof GraphState.State) => {\n", - " const { messages } = state;\n", - " const lastMessage = 
messages[messages.length - 1] as AIMessage;\n", - " // If there is no function call, then we finish\n", - " if (!lastMessage.tool_calls || lastMessage.tool_calls.length === 0) {\n", - " return \"__end__\";\n", - " }\n", - " // Otherwise if there is, we need to check what type of function call it is\n", - " if (lastMessage.tool_calls[0].name === \"Response\") {\n", - " return \"__end__\";\n", - " }\n", - " // Otherwise we continue\n", - " return \"tools\";\n", - "};\n", - "\n", - "// Define the function that calls the model\n", - "const callModel = async (\n", - " state: typeof GraphState.State,\n", - " config?: RunnableConfig,\n", - ") => {\n", - " const { messages } = state;\n", - " const response = await boundModel.invoke(messages, config);\n", - " // We return an object, because this will get added to the existing list\n", - " return { messages: [response] };\n", - "};" - ] - }, - { - "cell_type": "markdown", - "id": "3d09c2d0", - "metadata": {}, - "source": [ - "## Define the graph" - ] - }, - { - "cell_type": "code", - "execution_count": 9, - "id": "51179012", - "metadata": {}, - "outputs": [], - "source": [ - "import { StateGraph } from \"@langchain/langgraph\";\n", - "\n", - "// Define a new graph\n", - "const workflow = new StateGraph(GraphState)\n", - " .addNode(\"agent\", callModel)\n", - " .addNode(\"tools\", toolNode)\n", - " .addEdge(\"__start__\", \"agent\")\n", - " .addConditionalEdges(\n", - " // First, we define the start node. 
We use `agent`.\n", - " // This means these are the edges taken after the `agent` node is called.\n", - " \"agent\",\n", - " // Next, we pass in the function that will determine which node is called next.\n", - " route,\n", - " // We supply a map of possible response values to the conditional edge\n", - " // to make it possible to draw a visualization of the graph.\n", - " {\n", - " __end__: \"__end__\",\n", - " tools: \"tools\",\n", - " }\n", - " )\n", - " // We now add a normal edge from `tools` to `agent`.\n", - " // This means that after `tools` is called, `agent` node is called next.\n", - " .addEdge(\"tools\", \"agent\");\n", - "\n", - "// Finally, we compile it!\n", - "// This compiles it into a LangChain Runnable,\n", - "// meaning you can use it as you would any other runnable\n", - "const app = workflow.compile();" - ] - }, - { - "cell_type": "code", - "execution_count": 10, - "id": "b24e6c09", - "metadata": {}, - "outputs": [ + "cell_type": "markdown", + "id": "2c3d2c48", + "metadata": {}, + "source": [ + "# How to have agent respond in structured format\n", + "\n", + "The typical ReAct agent prompts the LLM to respond in 1 of two formats: a\n", + "function call (~ JSON) to use a tool, or conversational text to respond to the\n", + "user.\n", + "\n", + "If your agent is connected to a structured (or even generative) UI, or if it is\n", + "communicating with another agent or software process, you may want it to resopnd\n", + "in a specific structured format.\n", + "\n", + "In this example we will build a conversational ReAct agent that responds in a\n", + "specific format. We will do this by using\n", + "[tool calling](https://js.langchain.com/docs/modules/model_io/models/chat/function-calling/).\n", + "This is useful when you want to enforce that an agent's response is in a\n", + "specific format. 
In this example, we will ask it respond as if it were a\n", + "weatherman, returning the temperature and additional info in separate,\n", + "machine-readable fields.\n" + ] + }, { - "data": { - "image/png": "/9j/4AAQSkZJRgABAQAAAQABAAD/4gHYSUNDX1BST0ZJTEUAAQEAAAHIAAAAAAQwAABtbnRyUkdCIFhZWiAH4AABAAEAAAAAAABhY3NwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAQAA9tYAAQAAAADTLQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAlkZXNjAAAA8AAAACRyWFlaAAABFAAAABRnWFlaAAABKAAAABRiWFlaAAABPAAAABR3dHB0AAABUAAAABRyVFJDAAABZAAAAChnVFJDAAABZAAAAChiVFJDAAABZAAAAChjcHJ0AAABjAAAADxtbHVjAAAAAAAAAAEAAAAMZW5VUwAAAAgAAAAcAHMAUgBHAEJYWVogAAAAAAAAb6IAADj1AAADkFhZWiAAAAAAAABimQAAt4UAABjaWFlaIAAAAAAAACSgAAAPhAAAts9YWVogAAAAAAAA9tYAAQAAAADTLXBhcmEAAAAAAAQAAAACZmYAAPKnAAANWQAAE9AAAApbAAAAAAAAAABtbHVjAAAAAAAAAAEAAAAMZW5VUwAAACAAAAAcAEcAbwBvAGcAbABlACAASQBuAGMALgAgADIAMAAxADb/2wBDAAMCAgMCAgMDAwMEAwMEBQgFBQQEBQoHBwYIDAoMDAsKCwsNDhIQDQ4RDgsLEBYQERMUFRUVDA8XGBYUGBIUFRT/2wBDAQMEBAUEBQkFBQkUDQsNFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBT/wAARCADaAMcDASIAAhEBAxEB/8QAHQABAAIDAAMBAAAAAAAAAAAAAAYHBAUIAQMJAv/EAFAQAAEEAQICBAkGCAsGBwAAAAEAAgMEBQYRBxITITFVCBQWIkFRYZTRFRcyNpPhI1Jxc3SBsrMJGCQzQlZidpWh0iU1U3KRsSZDRVSCkqL/xAAbAQEAAgMBAQAAAAAAAAAAAAAAAgMBBAUGB//EADoRAAIBAgIGBQkHBQAAAAAAAAABAgMRBBMSITFRUpEVQWGhsQUUMmJxgcHR8DM0Y3Ky4fEiQlOCwv/aAAwDAQACEQMRAD8A+qaIiAIiIAiIgCIiALV+VOFH/q9D3lnxW0VFaEweNm0VgZJMfVfI6jCXOdC0knkHWTsqq1enhqeZNN60tXbf5G5h8Pntq9rFw+VWF74oe8s+KeVWF74oe8s+KrvyexfdtP7BnwTyexfdtP7BnwXO6Vw/BLmjd6O9buLE8qsL3xQ95Z8U8qsL3xQ95Z8VXfk9i+7af2DPgnk9i+7af2DPgnSuH4Jc0OjvW7ixPKrC98UPeWfFPKrC98UPeWfFV35PYvu2n9gz4J5PYvu2n9gz4J0rh+CXNDo71u4sTyqwvfFD3lnxTyqwvfFD3lnxVd+T2L7tp/YM+CeT2L7tp/YM+CdK4fglzQ6O9buLE8qsL3xQ95Z8Vl0slUyTHPqWobTWnZzoJA8A+3YqsPJ7F920/sGfBbXhdVhp5vVUcEMcEYlrnkjaGj+a9QW5hsXSxblGEWmlfXbel8TXr4PJhp6Vyw0RFtHOCIiAIiIAiIgCIiAIiIAiIgCpjh/9RtP/AKBB+wFc6pjh/wDUbT/6BB+wFy/Kf3X/AGXhI6/k70pG/REXkzuENj4v6Sm1nJpSLKmfORSGF8ENWZ8bZBH0hjMoYYw8MBdyc3Nt6FHOGHhB4PiJhs/kJYLeJjw89wzO
sUrLIxWgkc0SmR8TW8xa3mMY3c3cgjcFRR3yrp/jkxui8PqelBlMuXakrX6B+Rp4uhIddhnPU2XdsY2a7z9utvVucLT1/WOkNB8S9MYTT+Wr6whv5fJ4y7JQLqdhk1gyROimP4N8hbJuGE78zSCFvZULatrt18zUzJX19vVyLSwHHHROp8fmrmPzRfFhqpu3o56c8E0MAa5xk6KRjXubs12xaCDtsFF9ZeE/pbB6NOoML43nq/jlKq18WPtthcLEnLztk6EtfytDzs3fzmhnU5zQao8nMjc1Hqm9jcPry9TyHD/JYtt7U0Fh8893drxGI37uj3BPKA1rHO5gwFWTr/SuVm8GXT1DHYizZyGLgwtp+Lgi2nLa0teSWNrDsecNjd5vbuNu1ZyqUZK/W11mMypKL7C4sFm6uo8RWyVLp/FbDeePxmtJXk23286ORrXt7OxwCz1qtMahj1Tha+TipZDHxzc21fKVH1bDdnEedG8Bzd9txv6CFtVotWdjbWtBZPDb6waq/O1v3Sxlk8NvrBqr87W/dLu+R/tKn5f+omhjvsfeT9EReiPOBERAEREAREQBERAEREAREQBUxw/+o2n/ANAg/YCudQSpwfxdCrDWr5TMw14WBkcbbnU1oGwA6lr4nDrFUcvSs7p9z+Zv4SvGg25dZWbuAHDNxJOgNNknrJOLh/0rzJwC4aSvc9+gtOPe4kuc7GQkk+s+arQ+aqj3xm/ffuT5qqPfGb99+5c3oyp/m8Td88ocPcjS0KFbFUa1KnBHVp1o2wwwQtDWRsaAGtaB1AAAAD2LIWy+aqj3xm/ffuT5qqPfGb99+5V9Efirkyfn9LczWoq08Gyrd4n4HWVvOZvKSTYvVeRxFYwWOjArwuaIwerrOxO59Kt35qqPfGb99+5Oh/xVyZnpCluZA9S8KdGayyPyhntK4fM3uQR+M3qUc0nKN9m8zgTsNz1e1an+L9wy3+oGm/8AC4f9KtL5qqPfGb99+5Pmqo98Zv337lYvJc1qVbxIPG0Hrce5EV0vo7BaJoSUtP4ejhKckpmfBQrthY55ABcQ0AE7NaN/YFIeG31g1V+drfulk/NVR74zfvv3Lc6X0fT0objqs1qxJbe18sluXpHEtGw69vUt3CYPzWU5ynpNq2x70/ga+IxVOrT0Io3qIi3TlBERAEREAREQBERAEREAREQBERAEREAREQHO/gU/VTiT/f8AzP7xi6IXO/gU/VTiT/f/ADP7xi6IQBERAEREAREQBERAEREAREQBERAEREAREQBERAEREAREQHO/gU/VTiT/AH/zP7xi6IXO/gU/VTiT/f8AzP7xi6IQBERAEREAREQBERAEREAREQBERAEREAREQBERAEXgnYbnsUIv8TBPIY8DjzlmAgG7LL0NU+1j9nGQe1rS0+h3qnGEp7CcISqO0VcnC5N/hHuCcvE/gozUePY6TL6QdLeEY/p1HhosgDs3AYyTf1RuA7VdB1pq1x3FTCs/smSZ2369h/2Xrsar1TbgkgnpYKaGVpY+OTpXNc0jYgg9oIVmUuJczY80rbj5UeBRwLPHTjhi6d2sZtOYjbJZUubux0bCOSI+g9I/laR28vOR2L7Srl3wfuEE3g5Y7P1NOQYyY5i863LNadIXsjG4igBAG7WAu2J6yXOPp2FseWerv/bYT/7TJlLiXMeaVtxZSKu4OIGoasgNzB07sG/WaFwtlA9jJGhp/W8KX6e1NQ1NWfLSkdzRnlmgmYY5YXep7HdY9h7COsEjrUZU5RWltXY7/wAe8qnRnT9JG1REVRSEREAREQBERAEREAREQBERAEREAREQFe67yrs1lnafjd/III2y5DY/zxd9CA/2SAXPHpBa07tc4HDAAAAGwHoWDTe6bO6mlk/nXZSRrvXs1jGt/wDy1qwNd52vpjRmbytrKR4SGpTllORlh6ZtYhp2f0f9PY7eb/S7PSrK+qSgti8ev63WPR4eCp0k/eb1Fy9h+NOvNLz6
xrX48vnTV0nNqLGHP4qvRsOfG/kI6Ou7rjPO07PDXjlI9q9MfGjUejr+Xyb9aR8RMRR0XLnnsq1q8UMNx0kbYo3uibuGOHOWgnmAD9+bqI17Es+J1Oi590Rqbiw3PYuXJ1M1ewtyvM/IzZWjja0NM9C58b65r2Hvc3nDW8sgcdnb824WFoLX+uYcJwf1NmtUDNVtYTRUL2MOPggjidJVllZLG5jQ8PBh87clp5jytYNgMElVW5/X8nRrZGvLg1wcWnlcAd9j6j/1WLbNrHzsyuNBOSrNJbGHcrbDO0xP9YPoJ+idiPTvS/gx4HKUJdd2bepr2UrN1RlaxpT167I3SiwN7BcyNrud2x3aDyDc7NHUr0U4TcJaSMq1WH9S2lh4rJ181jKmQqP6StaibNE71tcAR/kVlqHcJnudoqJp+hFdvRR/8jbczWj9QAH6lMVfVioVJRXU2eZktGTQREVREIiIAiIgCIiAIiIAiIgCIiAIiICs9S0HYHWE8rgRSzPLJG8nzW2WMDXM9hcxjXD18sh9HXpdYaTx2utL5PT+XidNjchA6CZrHFrtj6WkdhB2IPrAUq1rxA0nX1fhOHuY8Zs5nUMb5a1SvVleGRx7uMzpWDaLlc1uz9wWuLSNttxgX9MahwT+WCEahpAgNkjeyK00f22uLWPPtaW7/i+u6Uc6zT1+J18NiYaGXUKjn4FRYyTJZyDUGps9qV+FtYlk13KMifNFIAWxh7Yg2Ite0ObIxoIJJdzdihPB/hXqzH5Kzhsth7uN0Dcx09bJ4nOWsdY8ZkeGtZ0PiULC0BvOHF53II6gRuugDeyLep2ms013pArNdt+sOIT5Qv8A9XM17p96j5vV3eBt3o3TUu8h2iuDlfRL+SHVOpspRjqOpVsfk77Za9aI7dTWhgLi0NAaXlxA6gesr90eDOEoab0LhI7V81NH2YrVB7pGc8ro4ZImiU8mxHLK4nlDesD8h2ml+IVPWte9Pg8dlMnDRuS4+y+CruIrEZAkjPX2t3G63Xyhf/q5mvdPvTzeruJ6dFdaI3pjhdR0hq3M5vG5XKxQZaeS3Yw7p2Oo+MScvPM1pZzhzuXc+ftuT1KU5O6+jULoYjYtPPR164OxmlP0WD8p9PoG57AvMMefvyCOnpq4wk7dNfljrxN/L5zn/wDRhUPg4/8ACjh3xHyGnNYa2pQa1oFkb2WKssVWsJI2v5YpC0s35XtDnF/MTuNmjzRlUtB6VTlfW+Wz3lNTE06UbQd2XlpXBjTencfjek6Z9eINkl/4kh63v/W4k/rW1UV0pxW0XrtzG6c1bhM5I/fljx+Qimf1Akjla4kHYE7behSpQlJybk9rOC3fWERFEwEREAREQBERAEREAREQBEVcT8VWap1brPQelI7UOrMJjhKclkMdL8mQ2ZGAwxvf1cx2ex5a3taSQTsQAJnm9U4fTUmPZlspTxsmQssp02WpmxusTuOzY4wT5zj6h1qAWTqji5BxA0plsNl+H+FjkbRxeo8fkmC5dbuTJNEGgmJuwYBvvuHO7CCBstN8LW38RpO3xEGL1trXAiSSLOPxzIhHK9wJdEzrDCA1g3G2/IHbNPULBQGo0tpmppLT+KxFWSxZhxtVlOGxdlM07mNAA55D1uJ5RufTstuiIAiIgK64J5b5XxOpX+QPzfdDn7sHivQdD8o8rm/y7boo+bpe3m2dvt9JysVQvhdi9bYrH5tmucxSzNyXMWZsdJSYGthoOI6CJ20ce72jfc7O7fpFTRAF88/4UXgKJIsbxWxNY8zOTHZvkHo7IJnfuyfbEF9DFqdV6VxWuNN5LAZ2lHkcRkYHVrVWQkCRjhsRuCC0+kOBBBAIIIBQHyT8CfA6b0TxO0zr/ibWyWG0w+WSPAZW3jwcVYvt3aHSzPBDejO5Y4DYSM5udvREH6/QTx2YY5oZGyxSND2SMcHNc0jcEEdoK02V0Np3N6Rdpa9hKFjTZrtqDFOrt8WbC0AMY1gGzQ0Acu23
LsNttgucptCcQvBHmfe0Ay5xC4VB5ks6OsSGTI4lhO5dSkPXIwf8M9f5SXPAHVSKF8KeMGlONWmGZ3SeUjyFXcMnhPmT1ZPTHLGetjh7eo9oJHWpogCIiAIiIAiIgCIiALW5nPU8N0EU1qrHetl0dKpYsMidalDd+jZzHrP5N9u1bJVTxftaJr6+4Ws1RTu2c1LmJG4CSqSI4bXRHmdLs4bt5fWD1+hAYkeh8zx70HgJeI+NyGh71XKDJHC4PNO8+NjiYY7MkYG/a1xDT1OY0gt62i4AAN9htv2ryiAIiIAiIgCIiAqLwdodAY2prfG6Ez9nNOj1Lds5iK75stS9I4dLGGmNhEYLSGnYg7HZztjtbqgvEvSGpMjpyz83uYoaS1JLehuy3J6DZorvJyh0c4A5iHNa1pcPO2aADss3T3FHTWo9a57R1PKxTanwLYnZCiY3Rua17WuD2h30m+cAS0nYkAnrG4EtREQBERAUPxV8GU5TU79fcM8v5A8R2AmS3AzellR2mO5CBs8E/wBMAuHaQ4hu2fwS4/X9balv6C1tpuxpHiTiqnjlqi0GSnbr8wZ4zWlG4MZc4DYncE7bu2dtdK504Df+NvCS4461d+ErUbdXSdF/4ni0fNZbv7ZXNKA6LREQBERAEX5e9sbS5xDWtG5JOwAWt8qsL3xQ95Z8VJRlLYgbRFq/KrC98UPeWfFPKrC98UPeWfFSy58LM2ZmZGWzDj7MlKBlq4yJzoYJZeiZI8A8rS/Z3KCdhvsdt99j2L536m/hUcrRz0NSXhPVpT460+O7XvZcyzNc3drmMcIG9E8OB3JDvVsvoJ5VYXvih7yz4r5s+HB4NY1H4Q+nsro+WpJT1tYbBckge10VO23YSTSbHZrHM/CE+kslKZc+FizO3vBg472/CL4aP1fZ0u/SsL70tWtA+540LEbGs3ma/o2dXOXs227Yz1+q3FDNA0tJcN9F4XS+GydCHGYqqyrCDaj5nBo63O6+tzju4n0kkrf+VWF74oe8s+KZc+FizNoi1flVhe+KHvLPinlVhe+KHvLPimXPhYszaIsalkqmSY51S1Baa07OdBIHgH27FZKg007MwERFgBRbXmlchnNOZ1umcjDpnVV6mK1fPNqMmki5SSwODh5zQXP2B7OdxHWpSiApDWfhO6V4CYuDEcS87zasqYOPI2PEsfIyPJSb9G5tXfzXPMg+jzAAHmcWtDi3Y+DD4ReN8Jfh0/UlPHOwt2tbkp3cY+x05geNnNIfyt5muY5p35R18w6+Xc8o+HpwK478aNa1L9PTtDLaQw4mjxVTD2mPnja8t55ZhIGPdJJyM3awFrA0NG55nvg38Hnm9UcFvCFm0JqbA5bCs1VVfH4pepyQubPAx8schDgDy8gmG4G3ng77BAfUVFh3MvQx7uW1dr1neqaVrD/mVj+VWF74oe8s+KmoSetIzY86nz9bSmmstm7h5aeNqTXJjvtsyNhe7/JpVM+BDgLOL8HjB5XID/a2pbFnUFx+3032ZXOa79cfRrE8M/W0TfB51HicFer3MxnnQYWvFBM15/DytZJzbE7N6PpNyrd0rY07pXTWIwVHK0fFMbUhpQNFhn0I2Bjerf1NCzlz4WLMk6L8RSsmjD43tkY7sc07g/rX7VZgIiIDV6q+rGY/Q5v2CqswGAxj8FjnOx1RzjWjJJgbufNHsVp6q+rGY/Q5v2Cq709/uDG/o0X7AWtjJyjQjou2v4HnfLUnGFOz638B5PYvu2n9gz4J5PYvu2n9gz4LYIuLm1OJ8zymnLea/wAnsX3bT+wZ8E8nsX3bT+wZ8FFM/wActD6Yz8uGyeejrXYHsjsO6CV8FZztuVs0zWGOIncHZ7gdiD6V69RceNDaUy+SxmTzZgu4x0bb0bKdiUVQ+NsjHyuZGWsYWvaedxDe0b7ggT062995ao1nsT7yX+T2L7tp/YM+CeT2L7tp/YM+Cj2seL2kdA2KNfNZhkFi9GZq8FeCWzI+
IdsnLE1xDB+OQG+1Y/BHXV3iXwq07qfIxVobuSgMsjKjXNiBD3NHKHOcdtgO0lY06ttLSdvaLVVDTd7fXyJT5PYvu2n9gz4J5PYvu2n9gz4LYIo5tTifMq05bz28LqsNPN6pjrwxwRiWueSNoaP5r1BWGoDw2+sGqvzlb90p8vTSbai3wx/Sj6HhNeHp+xeAREUDaCIiA/E00deF8sr2xRRtLnvedmtA6yST2BVnldQXtYOLop7GMwm/4KKImOe038eR30mNPaGN2dtsXEEljdzxStGShjMQCOjydvo5wd/OhYx0j29X4xa1pHpDj+RadXXyoqS2vuXzvyOrg6EZLMkauDSuGrD8HiqYPpcYGlx9PWSNz+te3yfxfdtP7BvwUEwHHDG5vi3qLQrqd2GxjDBHDZFKy6Od7o3vk5n9FyRNbyANLnbP380nsWwwnHDQ+otSswOOz0djIyySQw7QSthsSR787IpiwRyObsdwxxPUfUqXVqPW5PmdRSh1Mlfk/i+7af2Dfgh09iiCDjaZB9HQM+CiruOGh2ar8nDno/lTxoUCBBKYBZP/AJBn5eiEno5Ofm36tt1qtP8AGWtDjNd5TVVipisZp7UU2IimijeS+MMhLN2guL5HOlI2aOvq2CxmT4mNOG8ndfAQYyfxjEOfhLW4PSUdmNdt1bPj25Hj/mB9m2wKnmkNWPzRloX42QZeuwPkbECIp2E7CWPck7b9RaSSw9RJBa51faU1ditbYdmUw88lim57o+aWvJA8OadnAska1zSD6CAvflbRw9rGZiM8slK1Hzn1wyPEcrfb5rubY9W7W9m24vpzlWapzd77Pb1e41cRQjUg5R2lvoiKk4Bq9VfVjMfoc37BVd6e/wBwY39Gi/YCsTVX1YzH6HN+wVXODiZPpzHxyND431I2ua4bggsG4K1Mb9hH2/A835b9Cn7X8DZIoAPB/wCGYII0BpsEekYuH/SvH8X7hl/UDTf+Fw/6VxLR3/XM8xanvfL9ykKPD+DFZrWWmtY6c4gZb5ZzluzDNgb135LvVLMnMDKI5mxRlocWva8Dqb1c26mM+jshWPhC1ocRddWyGOgr40GB7vHA3Eti5YiR+EIcOXq3PN1dqvqCCOtBHDCxsUUbQxjGDYNaBsAB6l7FY6rZsPFSbv8AW1P4HN+iX5fhXreDMZjSufzFXN6Vw9OvZxlB9mSjNXjeJq0rB50XM57Xbu2buDudx1WD4M2Jv4PgTpCjk6NnGX4arhLUuROiliPSPOzmu6wdiFZ6iupeFOjNZZL5Qz2lcPmb3II/Gb1KOaTlG+zeZwJ2G56vasOekrMjKsqitJW2d2pEqRV//F94Zf1A03/hcP8ApUm0tovAaIpy1NPYajhKssnSyQ0K7YWPfsBzENA3OwA39ig7dRQ1C2pvl+5JOG31g1V+crfulPlAeG31g1V+crfulPl6h+jD8sf0o+g4T7vT9i8AiIom2EREBBOJ9cstabv7Exw3HwPIG/KJInBpPs5g0f8AyC1isDOYatqHE2sdca51ewzlcWHZzT2hzT6HAgEH0EAqtJJLOEvNxmY5YrhPLBYA5Yro9Do+vqdt9KPtad+1vK510k6kFbbHw2373c7GCqq2W9pT/Q5HAcbNeV58TlzU1dRoRY/L0Kb568L44pYn9NI3qiLS5rvO23Cg2nsXnsto7hRw9Zo/MYnL6Vy9Czk8hZpmOhFHUJMkkVj6Mpm7AGbn8IebbYrqRFqG+6V+v62nKMun9Qs4Rz8ImaUzLtRyZxzhm/Ez8nmI5HxoXTZ+juI9vN35+YbcqkU2n462P4rYTVGltTXKlnUzM3TtYKq58jmSCDopqz2ncyRPiLnNG5AHYd9l0YiXMZK3ldcCcnqnKaNtP1Uy6ZIshPDj7OUqircs0m7dFLPEAOSQ+cD1N3DQSBupfqaub+PioMBMl61BVaAN/pSN5j+QN5nH2ArYXL1fHwGazMyCIEDnkdsNz2D8p9S3WjNOT3ch
FncjXfWbE0jH1JmlsjOYbOmkafoucOpre1rS7m63FrNmgnGSqvYvHd9dRXWqKjTs3rJ0iIqzzp6LtSO/TnqygmKeN0bwDsdiNj/3UNh4SY6vCyKPLZpkbGhrWi71ADqA7FOUU1OUVZEJQjP0kn7SE/NVR74zfvv3J81VHvjN++/cpsizmPs5Ihk0uBckQn5qqPfGb99+5Pmqo98Zv337lNkTMfZyQyaXAuSIT81VHvjN++/cnzVUe+M3779ymyJmPs5IZNLgXJEJ+aqj3xm/ffuT5qqPfGb99+5TZEzH2ckMmlwLkjRaX0fT0objqs1qxJbc18sluXpHEtGw69vUt6iKMpOTuy1JJWQREUTIREQBY2RxlTL1H1b1aK3Wf9KKZgc0+rqKyUWU2ndAhkvCjDb/AMms5SiwdkcGQlLB+QOLgP1L1/NRQ73zXvv3Kbors+pvLVVqL+5kI+aih3vmvffuXkcKMfv15bNOHq8dI/7BTZEz6m8znVOJkdw2gMHhLbbcNR1i6z6Nq5M+xI3q280vJ5er8Xb0+tSJEVcpym7ydyptyd2ERFAwf//Z" - }, - "metadata": {}, - "output_type": "display_data" - } - ], - "source": [ - "import * as tslab from \"tslab\";\n", - "\n", - "const graph = app.getGraph();\n", - "const image = await graph.drawMermaidPng();\n", - "const arrayBuffer = await image.arrayBuffer();\n", - "\n", - "await tslab.display.png(new Uint8Array(arrayBuffer));" - ] - }, - { - "cell_type": "markdown", - "id": "ae844f61", - "metadata": {}, - "source": [ - "## Use it!\n", - "\n", - "We can now use it! This now exposes the\n", - "[same interface](https://v02.api.js.langchain.com/classes/langchain_core_runnables.Runnable.html)\n", - "as all other LangChain runnables.\n" - ] - }, - { - "cell_type": "code", - "execution_count": 11, - "id": "3ee8225f", - "metadata": {}, - "outputs": [ + "cell_type": "markdown", + "id": "5860c111", + "metadata": {}, + "source": [ + "## Setup\n", + "\n", + "First we need to install the packages required\n", + "\n", + "```bash\n", + "yarn add langchain @langchain/anthropic @langchain/langgraph\n", + "```\n" + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "[human]: what is the weather in sf\n", - "\n", - "---\n", - "\n", - "[ai]: \n", - "Tools: \n", - "- search({\"query\":\"current weather in San Francisco\"})\n", - "\n", - "---\n", - "\n", - "[tool]: 67 degrees. 
Cloudy with a chance of rain.\n", - "\n", - "---\n", - "\n", - "[ai]: \n", - "Tools: \n", - "- Response({\"temperature\":67,\"other_notes\":\"Cloudy with a chance of rain.\"})\n", - "\n", - "---\n", - "\n" - ] - } - ], - "source": [ - "import { HumanMessage, isAIMessage } from \"@langchain/core/messages\";\n", - "\n", - "const prettyPrint = (message: BaseMessage) => {\n", - " let txt = `[${message._getType()}]: ${message.content}`;\n", - " if (\n", - " isAIMessage(message) && message?.tool_calls?.length\n", - " ) {\n", - " const tool_calls = message?.tool_calls\n", - " ?.map((tc) => `- ${tc.name}(${JSON.stringify(tc.args)})`)\n", - " .join(\"\\n\");\n", - " txt += ` \\nTools: \\n${tool_calls}`;\n", - " }\n", - " console.log(txt);\n", - "};\n", - "\n", - "const inputs = {\n", - " messages: [new HumanMessage(\"what is the weather in sf\")],\n", - "};\n", - "\n", - "const stream = await app.stream(inputs, { streamMode: \"values\" });\n", - "\n", - "for await (const output of stream) {\n", - " const { messages } = output;\n", - " prettyPrint(messages[messages.length - 1]);\n", - " console.log(\"\\n---\\n\");\n", - "}" - ] - }, - { - "cell_type": "markdown", - "id": "332a421f", - "metadata": {}, - "source": [ - "## Partially streaming JSON\n", - "\n", - "If we want to stream the structured output as soon as it's available, we can use the [`.streamEvents()`](https://js.langchain.com/v0.2/docs/how_to/streaming#using-stream-events) method. We'll aggregate emitted `on_chat_model_events` and inspect the name field. 
As soon as we detect that the model is calling the final output tool, we can start logging the relevant chunks.\n", - "\n", - "Here's an example:" - ] - }, - { - "cell_type": "code", - "execution_count": 12, - "id": "c96cb38d", - "metadata": {}, - "outputs": [ + "cell_type": "markdown", + "id": "23523fc0", + "metadata": {}, + "source": [ + "Next, we need to set API keys for OpenAI (the LLM we will use).\n" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "id": "fb3ada8f", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "// process.env.OPENAI_API_KEY = \"sk_...\";" + ] + }, + { + "cell_type": "markdown", + "id": "e7e7be8c", + "metadata": {}, + "source": [ + "Optionally, we can set API key for\n", + "[LangSmith tracing](https://smith.langchain.com/), which will give us\n", + "best-in-class observability.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "id": "bf127e2b", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Respond in Format: LangGraphJS\n" + ] + } + ], + "source": [ + "// process.env.LANGCHAIN_API_KEY = \"ls...\";\n", + "process.env.LANGCHAIN_CALLBACKS_BACKGROUND = \"true\";\n", + "process.env.LANGCHAIN_TRACING_V2 = \"true\";\n", + "process.env.LANGCHAIN_PROJECT = \"Respond in Format: LangGraphJS\";" + ] + }, + { + "cell_type": "markdown", + "id": "b214dd10", + "metadata": {}, + "source": [ + "## Set up the State\n" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "id": "4ad79663", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "import { Annotation, messagesStateReducer } from \"@langchain/langgraph\";\n", + "import { BaseMessage } from \"@langchain/core/messages\";\n", + "\n", + "const GraphState = Annotation.Root({\n", + " messages: Annotation({\n", + " reducer: messagesStateReducer,\n", + " }),\n", + "});" + ] + }, + { + "cell_type": "markdown", + "id": "eeca531d", + 
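The aggregation described above amounts to concatenating the streamed argument fragments in arrival order until the JSON is complete. Here is a dependency-free sketch; the fragment values are illustrative, mirroring the notebook's example output:

```typescript
// Sketch of the aggregation step: streamed tool-call arguments arrive as
// ordered string fragments of one JSON document. Concatenating them rebuilds
// the payload, which parses once the final fragment has arrived.
const fragments = [
  '{"',
  "temperature",
  '":',
  "67",
  ',"',
  "other_notes",
  '":"',
  "Cloudy with a chance of rain.",
  '"}',
];

let aggregated = "";
for (const fragment of fragments) {
  aggregated += fragment;
  // In a streamEvents handler, this is where each partial state would be logged.
}

const parsed = JSON.parse(aggregated);
console.log(parsed.temperature, parsed.other_notes);
```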
"metadata": {}, + "source": [ + "## Set up the tools\n" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "id": "d0fe8477", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "import { tool } from \"@langchain/core/tools\";\n", + "import { z } from \"zod\";\n", + "\n", + "const searchTool = tool((_) => {\n", + " // This is a placeholder, but don't tell the LLM that...\n", + " return \"67 degrees. Cloudy with a chance of rain.\";\n", + "}, {\n", + " name: \"search\",\n", + " description: \"Call to surf the web.\",\n", + " schema: z.object({\n", + " query: z.string().describe(\"The query to use in your search.\"),\n", + " }),\n", + "});\n", + "\n", + "const tools = [searchTool];" + ] + }, + { + "cell_type": "markdown", + "id": "f1a6aa07", + "metadata": {}, + "source": [ + "We can now wrap these tools in a\n", + "[ToolNode](https://langchain-ai.github.io/langgraphjs/reference/classes/langgraph_prebuilt.ToolNode.html).\n" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "id": "df80654e", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "import { ToolNode } from \"@langchain/langgraph/prebuilt\";\n", + "\n", + "const toolNode = new ToolNode(tools);" + ] + }, + { + "cell_type": "markdown", + "id": "19f17e92", + "metadata": {}, + "source": [ + "## Set up the model\n" + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "\n", - "---\n", - "{\"\n", - "---\n", - "{\"temperature\n", - "---\n", - "{\"temperature\":\n", - "---\n", - "{\"temperature\":67\n", - "---\n", - "{\"temperature\":67,\"\n", - "---\n", - "{\"temperature\":67,\"other\n", - "---\n", - "{\"temperature\":67,\"other_notes\n", - "---\n", - "{\"temperature\":67,\"other_notes\":\"\n", - "---\n", - "{\"temperature\":67,\"other_notes\":\"Cloud\n", - "---\n", - "{\"temperature\":67,\"other_notes\":\"Cloudy\n", - "---\n", - "{\"temperature\":67,\"other_notes\":\"Cloudy with\n", - "---\n", - 
"{\"temperature\":67,\"other_notes\":\"Cloudy with a\n", - "---\n", - "{\"temperature\":67,\"other_notes\":\"Cloudy with a chance\n", - "---\n", - "{\"temperature\":67,\"other_notes\":\"Cloudy with a chance of\n", - "---\n", - "{\"temperature\":67,\"other_notes\":\"Cloudy with a chance of rain\n", - "---\n", - "{\"temperature\":67,\"other_notes\":\"Cloudy with a chance of rain.\"\n", - "---\n", - "{\"temperature\":67,\"other_notes\":\"Cloudy with a chance of rain.\"}\n", - "---\n", - "{\"temperature\":67,\"other_notes\":\"Cloudy with a chance of rain.\"}\n", - "---\n", - "[\n", - " {\n", - " name: 'Response',\n", - " args: { temperature: 67, other_notes: 'Cloudy with a chance of rain.' },\n", - " id: 'call_oOhNx2SdeelXn6tbenokDtkO',\n", - " type: 'tool_call'\n", - " }\n", - "]\n" - ] + "cell_type": "code", + "execution_count": 6, + "id": "9c644fb9", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "import { ChatOpenAI } from \"@langchain/openai\";\n", + "\n", + "const model = new ChatOpenAI({\n", + " temperature: 0,\n", + " model: \"gpt-4o\",\n", + "});" + ] + }, + { + "cell_type": "markdown", + "id": "bb86967d", + "metadata": {}, + "source": [ + "After we've done this, we should make sure the model knows that it has these\n", + "tools available to call. We can do this by binding the LangChain tools to the model class.\n", + "\n", + "We also want to define a response schema for the language model and bind it to\n", + "the model as a tool. The idea is that when the model is ready to respond, it'll call this final\n", + "tool and populate arguments for it according to the schema we want. 
Rather than calling\n", + "a tool, we'll instead return from the graph.\n", + "\n", + "Because we only intend to use this final tool to guide the schema of the model's final response,\n", + "we'll declare it with a mocked out function:" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "id": "e148a48b", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "import { tool } from \"@langchain/core/tools\";\n", + "\n", + "const Response = z.object({\n", + " temperature: z.number().describe(\"the temperature\"),\n", + " other_notes: z.string().describe(\"any other notes about the weather\"),\n", + "});\n", + "\n", + "const finalResponseTool = tool(async () => \"mocked value\", {\n", + " name: \"Response\",\n", + " description: \"Always respond to the user using this tool.\",\n", + " schema: Response\n", + "})\n", + "\n", + "const boundModel = model.bindTools([\n", + " ...tools,\n", + " finalResponseTool\n", + "]);" + ] + }, + { + "cell_type": "markdown", + "id": "6e082c02", + "metadata": {}, + "source": [ + "## Define the nodes\n" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "id": "960ef633", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "import { AIMessage } from \"@langchain/core/messages\";\n", + "import { RunnableConfig } from \"@langchain/core/runnables\";\n", + "\n", + "// Define the function that determines whether to continue or not\n", + "const route = (state: typeof GraphState.State) => {\n", + " const { messages } = state;\n", + " const lastMessage = messages[messages.length - 1] as AIMessage;\n", + " // If there is no function call, then we finish\n", + " if (!lastMessage.tool_calls || lastMessage.tool_calls.length === 0) {\n", + " return \"__end__\";\n", + " }\n", + " // Otherwise if there is, we need to check what type of function call it is\n", + " if (lastMessage.tool_calls[0].name === \"Response\") {\n", + " return \"__end__\";\n", + " }\n", + " // Otherwise we 
continue\n", + " return \"tools\";\n", + "};\n", + "\n", + "// Define the function that calls the model\n", + "const callModel = async (\n", + " state: typeof GraphState.State,\n", + " config?: RunnableConfig,\n", + ") => {\n", + " const { messages } = state;\n", + " const response = await boundModel.invoke(messages, config);\n", + " // We return an object, because this will get added to the existing list\n", + " return { messages: [response] };\n", + "};" + ] + }, + { + "cell_type": "markdown", + "id": "3d09c2d0", + "metadata": {}, + "source": [ + "## Define the graph" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "id": "51179012", + "metadata": {}, + "outputs": [], + "source": [ + "import { StateGraph } from \"@langchain/langgraph\";\n", + "\n", + "// Define a new graph\n", + "const workflow = new StateGraph(GraphState)\n", + " .addNode(\"agent\", callModel)\n", + " .addNode(\"tools\", toolNode)\n", + " .addEdge(\"__start__\", \"agent\")\n", + " .addConditionalEdges(\n", + " // First, we define the start node. 
We use `agent`.\n", + " // This means these are the edges taken after the `agent` node is called.\n", + " \"agent\",\n", + " // Next, we pass in the function that will determine which node is called next.\n", + " route,\n", + " // We supply a map of possible response values to the conditional edge\n", + " // to make it possible to draw a visualization of the graph.\n", + " {\n", + " __end__: \"__end__\",\n", + " tools: \"tools\",\n", + " }\n", + " )\n", + " // We now add a normal edge from `tools` to `agent`.\n", + " // This means that after `tools` is called, `agent` node is called next.\n", + " .addEdge(\"tools\", \"agent\");\n", + "\n", + "// Finally, we compile it!\n", + "// This compiles it into a LangChain Runnable,\n", + "// meaning you can use it as you would any other runnable\n", + "const app = workflow.compile();" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "id": "b24e6c09", + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "/9j/4AAQSkZJRgABAQAAAQABAAD/4gHYSUNDX1BST0ZJTEUAAQEAAAHIAAAAAAQwAABtbnRyUkdCIFhZWiAH4AABAAEAAAAAAABhY3NwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAQAA9tYAAQAAAADTLQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAlkZXNjAAAA8AAAACRyWFlaAAABFAAAABRnWFlaAAABKAAAABRiWFlaAAABPAAAABR3dHB0AAABUAAAABRyVFJDAAABZAAAAChnVFJDAAABZAAAAChiVFJDAAABZAAAAChjcHJ0AAABjAAAADxtbHVjAAAAAAAAAAEAAAAMZW5VUwAAAAgAAAAcAHMAUgBHAEJYWVogAAAAAAAAb6IAADj1AAADkFhZWiAAAAAAAABimQAAt4UAABjaWFlaIAAAAAAAACSgAAAPhAAAts9YWVogAAAAAAAA9tYAAQAAAADTLXBhcmEAAAAAAAQAAAACZmYAAPKnAAANWQAAE9AAAApbAAAAAAAAAABtbHVjAAAAAAAAAAEAAAAMZW5VUwAAACAAAAAcAEcAbwBvAGcAbABlACAASQBuAGMALgAgADIAMAAxADb/2wBDAAMCAgMCAgMDAwMEAwMEBQgFBQQEBQoHBwYIDAoMDAsKCwsNDhIQDQ4RDgsLEBYQERMUFRUVDA8XGBYUGBIUFRT/2wBDAQMEBAUEBQkFBQkUDQsNFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBT/wAARCADaAMcDASIAAhEBAxEB/8QAHQABAAIDAAMBAAAAAAAAAAAAAAYHBAUIAQMJAv/EAFAQAAEEAQICBAkGCAsGBwAAAAEAAgMEBQYRBxITITFVCBQWIkFRYZTRFRcyNpPhI1Jxc3SBsrMJGCQzQlZidpWh0iU1U3KRsSZDRVSCk
qL/xAAbAQEAAgMBAQAAAAAAAAAAAAAAAgMBBAUGB//EADoRAAIBAgIGBQkHBQAAAAAAAAABAgMRBBMSITFRUpEVQWGhsQUUMmJxgcHR8DM0Y3Ky4fEiQlOCwv/aAAwDAQACEQMRAD8A+qaIiAIiIAiIgCIiALV+VOFH/q9D3lnxW0VFaEweNm0VgZJMfVfI6jCXOdC0knkHWTsqq1enhqeZNN60tXbf5G5h8Pntq9rFw+VWF74oe8s+KeVWF74oe8s+KrvyexfdtP7BnwTyexfdtP7BnwXO6Vw/BLmjd6O9buLE8qsL3xQ95Z8U8qsL3xQ95Z8VXfk9i+7af2DPgnk9i+7af2DPgnSuH4Jc0OjvW7ixPKrC98UPeWfFPKrC98UPeWfFV35PYvu2n9gz4J5PYvu2n9gz4J0rh+CXNDo71u4sTyqwvfFD3lnxTyqwvfFD3lnxVd+T2L7tp/YM+CeT2L7tp/YM+CdK4fglzQ6O9buLE8qsL3xQ95Z8Vl0slUyTHPqWobTWnZzoJA8A+3YqsPJ7F920/sGfBbXhdVhp5vVUcEMcEYlrnkjaGj+a9QW5hsXSxblGEWmlfXbel8TXr4PJhp6Vyw0RFtHOCIiAIiIAiIgCIiAIiIAiIgCpjh/9RtP/AKBB+wFc6pjh/wDUbT/6BB+wFy/Kf3X/AGXhI6/k70pG/REXkzuENj4v6Sm1nJpSLKmfORSGF8ENWZ8bZBH0hjMoYYw8MBdyc3Nt6FHOGHhB4PiJhs/kJYLeJjw89wzOsUrLIxWgkc0SmR8TW8xa3mMY3c3cgjcFRR3yrp/jkxui8PqelBlMuXakrX6B+Rp4uhIddhnPU2XdsY2a7z9utvVucLT1/WOkNB8S9MYTT+Wr6whv5fJ4y7JQLqdhk1gyROimP4N8hbJuGE78zSCFvZULatrt18zUzJX19vVyLSwHHHROp8fmrmPzRfFhqpu3o56c8E0MAa5xk6KRjXubs12xaCDtsFF9ZeE/pbB6NOoML43nq/jlKq18WPtthcLEnLztk6EtfytDzs3fzmhnU5zQao8nMjc1Hqm9jcPry9TyHD/JYtt7U0Fh8893drxGI37uj3BPKA1rHO5gwFWTr/SuVm8GXT1DHYizZyGLgwtp+Lgi2nLa0teSWNrDsecNjd5vbuNu1ZyqUZK/W11mMypKL7C4sFm6uo8RWyVLp/FbDeePxmtJXk23286ORrXt7OxwCz1qtMahj1Tha+TipZDHxzc21fKVH1bDdnEedG8Bzd9txv6CFtVotWdjbWtBZPDb6waq/O1v3Sxlk8NvrBqr87W/dLu+R/tKn5f+omhjvsfeT9EReiPOBERAEREAREQBERAEREAREQBUxw/+o2n/ANAg/YCudQSpwfxdCrDWr5TMw14WBkcbbnU1oGwA6lr4nDrFUcvSs7p9z+Zv4SvGg25dZWbuAHDNxJOgNNknrJOLh/0rzJwC4aSvc9+gtOPe4kuc7GQkk+s+arQ+aqj3xm/ffuT5qqPfGb99+5c3oyp/m8Td88ocPcjS0KFbFUa1KnBHVp1o2wwwQtDWRsaAGtaB1AAAAD2LIWy+aqj3xm/ffuT5qqPfGb99+5V9Efirkyfn9LczWoq08Gyrd4n4HWVvOZvKSTYvVeRxFYwWOjArwuaIwerrOxO59Kt35qqPfGb99+5Oh/xVyZnpCluZA9S8KdGayyPyhntK4fM3uQR+M3qUc0nKN9m8zgTsNz1e1an+L9wy3+oGm/8AC4f9KtL5qqPfGb99+5Pmqo98Zv337lYvJc1qVbxIPG0Hrce5EV0vo7BaJoSUtP4ejhKckpmfBQrthY55ABcQ0AE7NaN/YFIeG31g1V+drfulk/NVR74zfvv3Lc6X0fT0objqs1qxJbe18sluXpHEtGw69vUt3CYPzWU5ynpNq2x70/ga+IxVOrT0Io3qIi3TlBERAEREAREQBERAEREAREQBERAEREAREQHO/gU/VTiT/f8AzP7xi
6IXO/gU/VTiT/f/ADP7xi6IQBERAEREAREQBERAEREAREQBERAEREAREQBERAEREAREQHO/gU/VTiT/AH/zP7xi6IXO/gU/VTiT/f8AzP7xi6IQBERAEREAREQBERAEREAREQBERAEREAREQBERAEXgnYbnsUIv8TBPIY8DjzlmAgG7LL0NU+1j9nGQe1rS0+h3qnGEp7CcISqO0VcnC5N/hHuCcvE/gozUePY6TL6QdLeEY/p1HhosgDs3AYyTf1RuA7VdB1pq1x3FTCs/smSZ2369h/2Xrsar1TbgkgnpYKaGVpY+OTpXNc0jYgg9oIVmUuJczY80rbj5UeBRwLPHTjhi6d2sZtOYjbJZUubux0bCOSI+g9I/laR28vOR2L7Srl3wfuEE3g5Y7P1NOQYyY5i863LNadIXsjG4igBAG7WAu2J6yXOPp2FseWerv/bYT/7TJlLiXMeaVtxZSKu4OIGoasgNzB07sG/WaFwtlA9jJGhp/W8KX6e1NQ1NWfLSkdzRnlmgmYY5YXep7HdY9h7COsEjrUZU5RWltXY7/wAe8qnRnT9JG1REVRSEREAREQBERAEREAREQBERAEREAREQFe67yrs1lnafjd/III2y5DY/zxd9CA/2SAXPHpBa07tc4HDAAAAGwHoWDTe6bO6mlk/nXZSRrvXs1jGt/wDy1qwNd52vpjRmbytrKR4SGpTllORlh6ZtYhp2f0f9PY7eb/S7PSrK+qSgti8ev63WPR4eCp0k/eb1Fy9h+NOvNLz6xrX48vnTV0nNqLGHP4qvRsOfG/kI6Ou7rjPO07PDXjlI9q9MfGjUejr+Xyb9aR8RMRR0XLnnsq1q8UMNx0kbYo3uibuGOHOWgnmAD9+bqI17Es+J1Oi590Rqbiw3PYuXJ1M1ewtyvM/IzZWjja0NM9C58b65r2Hvc3nDW8sgcdnb824WFoLX+uYcJwf1NmtUDNVtYTRUL2MOPggjidJVllZLG5jQ8PBh87clp5jytYNgMElVW5/X8nRrZGvLg1wcWnlcAd9j6j/1WLbNrHzsyuNBOSrNJbGHcrbDO0xP9YPoJ+idiPTvS/gx4HKUJdd2bepr2UrN1RlaxpT167I3SiwN7BcyNrud2x3aDyDc7NHUr0U4TcJaSMq1WH9S2lh4rJ181jKmQqP6StaibNE71tcAR/kVlqHcJnudoqJp+hFdvRR/8jbczWj9QAH6lMVfVioVJRXU2eZktGTQREVREIiIAiIgCIiAIiIAiIgCIiAIiICs9S0HYHWE8rgRSzPLJG8nzW2WMDXM9hcxjXD18sh9HXpdYaTx2utL5PT+XidNjchA6CZrHFrtj6WkdhB2IPrAUq1rxA0nX1fhOHuY8Zs5nUMb5a1SvVleGRx7uMzpWDaLlc1uz9wWuLSNttxgX9MahwT+WCEahpAgNkjeyK00f22uLWPPtaW7/i+u6Uc6zT1+J18NiYaGXUKjn4FRYyTJZyDUGps9qV+FtYlk13KMifNFIAWxh7Yg2Ite0ObIxoIJJdzdihPB/hXqzH5Kzhsth7uN0Dcx09bJ4nOWsdY8ZkeGtZ0PiULC0BvOHF53II6gRuugDeyLep2ms013pArNdt+sOIT5Qv8A9XM17p96j5vV3eBt3o3TUu8h2iuDlfRL+SHVOpspRjqOpVsfk77Za9aI7dTWhgLi0NAaXlxA6gesr90eDOEoab0LhI7V81NH2YrVB7pGc8ro4ZImiU8mxHLK4nlDesD8h2ml+IVPWte9Pg8dlMnDRuS4+y+CruIrEZAkjPX2t3G63Xyhf/q5mvdPvTzeruJ6dFdaI3pjhdR0hq3M5vG5XKxQZaeS3Yw7p2Oo+MScvPM1pZzhzuXc+ftuT1KU5O6+jULoYjYtPPR164OxmlP0WD8p9PoG57AvMMefvyCOnpq4wk7dNfljrxN/L5zn/wDRhUPg4/8ACjh3xHyGnNYa2pQa1oFkb2WKssVWsJI2v5YpC0s35XtDnF/MT
uNmjzRlUtB6VTlfW+Wz3lNTE06UbQd2XlpXBjTencfjek6Z9eINkl/4kh63v/W4k/rW1UV0pxW0XrtzG6c1bhM5I/fljx+Qimf1Akjla4kHYE7behSpQlJybk9rOC3fWERFEwEREAREQBERAEREAREQBEVcT8VWap1brPQelI7UOrMJjhKclkMdL8mQ2ZGAwxvf1cx2ex5a3taSQTsQAJnm9U4fTUmPZlspTxsmQssp02WpmxusTuOzY4wT5zj6h1qAWTqji5BxA0plsNl+H+FjkbRxeo8fkmC5dbuTJNEGgmJuwYBvvuHO7CCBstN8LW38RpO3xEGL1trXAiSSLOPxzIhHK9wJdEzrDCA1g3G2/IHbNPULBQGo0tpmppLT+KxFWSxZhxtVlOGxdlM07mNAA55D1uJ5RufTstuiIAiIgK64J5b5XxOpX+QPzfdDn7sHivQdD8o8rm/y7boo+bpe3m2dvt9JysVQvhdi9bYrH5tmucxSzNyXMWZsdJSYGthoOI6CJ20ce72jfc7O7fpFTRAF88/4UXgKJIsbxWxNY8zOTHZvkHo7IJnfuyfbEF9DFqdV6VxWuNN5LAZ2lHkcRkYHVrVWQkCRjhsRuCC0+kOBBBAIIIBQHyT8CfA6b0TxO0zr/ibWyWG0w+WSPAZW3jwcVYvt3aHSzPBDejO5Y4DYSM5udvREH6/QTx2YY5oZGyxSND2SMcHNc0jcEEdoK02V0Np3N6Rdpa9hKFjTZrtqDFOrt8WbC0AMY1gGzQ0Acu23LsNttgucptCcQvBHmfe0Ay5xC4VB5ks6OsSGTI4lhO5dSkPXIwf8M9f5SXPAHVSKF8KeMGlONWmGZ3SeUjyFXcMnhPmT1ZPTHLGetjh7eo9oJHWpogCIiAIiIAiIgCIiALW5nPU8N0EU1qrHetl0dKpYsMidalDd+jZzHrP5N9u1bJVTxftaJr6+4Ws1RTu2c1LmJG4CSqSI4bXRHmdLs4bt5fWD1+hAYkeh8zx70HgJeI+NyGh71XKDJHC4PNO8+NjiYY7MkYG/a1xDT1OY0gt62i4AAN9htv2ryiAIiIAiIgCIiAqLwdodAY2prfG6Ez9nNOj1Lds5iK75stS9I4dLGGmNhEYLSGnYg7HZztjtbqgvEvSGpMjpyz83uYoaS1JLehuy3J6DZorvJyh0c4A5iHNa1pcPO2aADss3T3FHTWo9a57R1PKxTanwLYnZCiY3Rua17WuD2h30m+cAS0nYkAnrG4EtREQBERAUPxV8GU5TU79fcM8v5A8R2AmS3AzellR2mO5CBs8E/wBMAuHaQ4hu2fwS4/X9balv6C1tpuxpHiTiqnjlqi0GSnbr8wZ4zWlG4MZc4DYncE7bu2dtdK504Df+NvCS4461d+ErUbdXSdF/4ni0fNZbv7ZXNKA6LREQBERAEX5e9sbS5xDWtG5JOwAWt8qsL3xQ95Z8VJRlLYgbRFq/KrC98UPeWfFPKrC98UPeWfFSy58LM2ZmZGWzDj7MlKBlq4yJzoYJZeiZI8A8rS/Z3KCdhvsdt99j2L536m/hUcrRz0NSXhPVpT460+O7XvZcyzNc3drmMcIG9E8OB3JDvVsvoJ5VYXvih7yz4r5s+HB4NY1H4Q+nsro+WpJT1tYbBckge10VO23YSTSbHZrHM/CE+kslKZc+FizO3vBg472/CL4aP1fZ0u/SsL70tWtA+540LEbGs3ma/o2dXOXs227Yz1+q3FDNA0tJcN9F4XS+GydCHGYqqyrCDaj5nBo63O6+tzju4n0kkrf+VWF74oe8s+KZc+FizNoi1flVhe+KHvLPinlVhe+KHvLPimXPhYszaIsalkqmSY51S1Baa07OdBIHgH27FZKg007MwERFgBRbXmlchnNOZ1umcjDpnVV6mK1fPNqMmki5SSwODh5zQXP2B7OdxHWpSiApDWfhO6V4CYuDEcS87zasqYOPI2PEsfIyPJSb9G5tXfzXPMg+jzAAHmcWtDi3Y
+DD4ReN8Jfh0/UlPHOwt2tbkp3cY+x05geNnNIfyt5muY5p35R18w6+Xc8o+HpwK478aNa1L9PTtDLaQw4mjxVTD2mPnja8t55ZhIGPdJJyM3awFrA0NG55nvg38Hnm9UcFvCFm0JqbA5bCs1VVfH4pepyQubPAx8schDgDy8gmG4G3ng77BAfUVFh3MvQx7uW1dr1neqaVrD/mVj+VWF74oe8s+KmoSetIzY86nz9bSmmstm7h5aeNqTXJjvtsyNhe7/JpVM+BDgLOL8HjB5XID/a2pbFnUFx+3032ZXOa79cfRrE8M/W0TfB51HicFer3MxnnQYWvFBM15/DytZJzbE7N6PpNyrd0rY07pXTWIwVHK0fFMbUhpQNFhn0I2Bjerf1NCzlz4WLMk6L8RSsmjD43tkY7sc07g/rX7VZgIiIDV6q+rGY/Q5v2CqswGAxj8FjnOx1RzjWjJJgbufNHsVp6q+rGY/Q5v2Cq709/uDG/o0X7AWtjJyjQjou2v4HnfLUnGFOz638B5PYvu2n9gz4J5PYvu2n9gz4LYIuLm1OJ8zymnLea/wAnsX3bT+wZ8E8nsX3bT+wZ8FFM/wActD6Yz8uGyeejrXYHsjsO6CV8FZztuVs0zWGOIncHZ7gdiD6V69RceNDaUy+SxmTzZgu4x0bb0bKdiUVQ+NsjHyuZGWsYWvaedxDe0b7ggT062995ao1nsT7yX+T2L7tp/YM+CeT2L7tp/YM+Cj2seL2kdA2KNfNZhkFi9GZq8FeCWzI+IdsnLE1xDB+OQG+1Y/BHXV3iXwq07qfIxVobuSgMsjKjXNiBD3NHKHOcdtgO0lY06ttLSdvaLVVDTd7fXyJT5PYvu2n9gz4J5PYvu2n9gz4LYIo5tTifMq05bz28LqsNPN6pjrwxwRiWueSNoaP5r1BWGoDw2+sGqvzlb90p8vTSbai3wx/Sj6HhNeHp+xeAREUDaCIiA/E00deF8sr2xRRtLnvedmtA6yST2BVnldQXtYOLop7GMwm/4KKImOe038eR30mNPaGN2dtsXEEljdzxStGShjMQCOjydvo5wd/OhYx0j29X4xa1pHpDj+RadXXyoqS2vuXzvyOrg6EZLMkauDSuGrD8HiqYPpcYGlx9PWSNz+te3yfxfdtP7BvwUEwHHDG5vi3qLQrqd2GxjDBHDZFKy6Od7o3vk5n9FyRNbyANLnbP380nsWwwnHDQ+otSswOOz0djIyySQw7QSthsSR787IpiwRyObsdwxxPUfUqXVqPW5PmdRSh1Mlfk/i+7af2Dfgh09iiCDjaZB9HQM+CiruOGh2ar8nDno/lTxoUCBBKYBZP/AJBn5eiEno5Ofm36tt1qtP8AGWtDjNd5TVVipisZp7UU2IimijeS+MMhLN2guL5HOlI2aOvq2CxmT4mNOG8ndfAQYyfxjEOfhLW4PSUdmNdt1bPj25Hj/mB9m2wKnmkNWPzRloX42QZeuwPkbECIp2E7CWPck7b9RaSSw9RJBa51faU1ditbYdmUw88lim57o+aWvJA8OadnAska1zSD6CAvflbRw9rGZiM8slK1Hzn1wyPEcrfb5rubY9W7W9m24vpzlWapzd77Pb1e41cRQjUg5R2lvoiKk4Bq9VfVjMfoc37BVd6e/wBwY39Gi/YCsTVX1YzH6HN+wVXODiZPpzHxyND431I2ua4bggsG4K1Mb9hH2/A835b9Cn7X8DZIoAPB/wCGYII0BpsEekYuH/SvH8X7hl/UDTf+Fw/6VxLR3/XM8xanvfL9ykKPD+DFZrWWmtY6c4gZb5ZzluzDNgb135LvVLMnMDKI5mxRlocWva8Dqb1c26mM+jshWPhC1ocRddWyGOgr40GB7vHA3Eti5YiR+EIcOXq3PN1dqvqCCOtBHDCxsUUbQxjGDYNaBsAB6l7FY6rZsPFSbv8AW1P4HN+iX5fhXreDMZjSufzFXN6Vw9OvZxlB9mSjNXjeJq0rB50XM57Xbu2bu
Dudx1WD4M2Jv4PgTpCjk6NnGX4arhLUuROiliPSPOzmu6wdiFZ6iupeFOjNZZL5Qz2lcPmb3II/Gb1KOaTlG+zeZwJ2G56vasOekrMjKsqitJW2d2pEqRV//F94Zf1A03/hcP8ApUm0tovAaIpy1NPYajhKssnSyQ0K7YWPfsBzENA3OwA39ig7dRQ1C2pvl+5JOG31g1V+crfulPlAeG31g1V+crfulPl6h+jD8sf0o+g4T7vT9i8AiIom2EREBBOJ9cstabv7Exw3HwPIG/KJInBpPs5g0f8AyC1isDOYatqHE2sdca51ewzlcWHZzT2hzT6HAgEH0EAqtJJLOEvNxmY5YrhPLBYA5Yro9Do+vqdt9KPtad+1vK510k6kFbbHw2373c7GCqq2W9pT/Q5HAcbNeV58TlzU1dRoRY/L0Kb568L44pYn9NI3qiLS5rvO23Cg2nsXnsto7hRw9Zo/MYnL6Vy9Czk8hZpmOhFHUJMkkVj6Mpm7AGbn8IebbYrqRFqG+6V+v62nKMun9Qs4Rz8ImaUzLtRyZxzhm/Ez8nmI5HxoXTZ+juI9vN35+YbcqkU2n462P4rYTVGltTXKlnUzM3TtYKq58jmSCDopqz2ncyRPiLnNG5AHYd9l0YiXMZK3ldcCcnqnKaNtP1Uy6ZIshPDj7OUqircs0m7dFLPEAOSQ+cD1N3DQSBupfqaub+PioMBMl61BVaAN/pSN5j+QN5nH2ArYXL1fHwGazMyCIEDnkdsNz2D8p9S3WjNOT3chFncjXfWbE0jH1JmlsjOYbOmkafoucOpre1rS7m63FrNmgnGSqvYvHd9dRXWqKjTs3rJ0iIqzzp6LtSO/TnqygmKeN0bwDsdiNj/3UNh4SY6vCyKPLZpkbGhrWi71ADqA7FOUU1OUVZEJQjP0kn7SE/NVR74zfvv3J81VHvjN++/cpsizmPs5Ihk0uBckQn5qqPfGb99+5Pmqo98Zv337lNkTMfZyQyaXAuSIT81VHvjN++/cnzVUe+M3779ymyJmPs5IZNLgXJEJ+aqj3xm/ffuT5qqPfGb99+5TZEzH2ckMmlwLkjRaX0fT0objqs1qxJbc18sluXpHEtGw69vUt6iKMpOTuy1JJWQREUTIREQBY2RxlTL1H1b1aK3Wf9KKZgc0+rqKyUWU2ndAhkvCjDb/AMms5SiwdkcGQlLB+QOLgP1L1/NRQ73zXvv3Kbors+pvLVVqL+5kI+aih3vmvffuXkcKMfv15bNOHq8dI/7BTZEz6m8znVOJkdw2gMHhLbbcNR1i6z6Nq5M+xI3q280vJ5er8Xb0+tSJEVcpym7ydyptyd2ERFAwf//Z" + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "import * as tslab from \"tslab\";\n", + "\n", + "const graph = app.getGraph();\n", + "const image = await graph.drawMermaidPng();\n", + "const arrayBuffer = await image.arrayBuffer();\n", + "\n", + "await tslab.display.png(new Uint8Array(arrayBuffer));" + ] + }, + { + "cell_type": "markdown", + "id": "ae844f61", + "metadata": {}, + "source": [ + "## Use it!\n", + "\n", + "We can now use it! 
This now exposes the\n", + "[same interface](https://v02.api.js.langchain.com/classes/langchain_core_runnables.Runnable.html)\n", + "as all other LangChain runnables.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "id": "3ee8225f", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[human]: what is the weather in sf\n", + "\n", + "---\n", + "\n", + "[ai]: \n", + "Tools: \n", + "- search({\"query\":\"current weather in San Francisco\"})\n", + "\n", + "---\n", + "\n", + "[tool]: 67 degrees. Cloudy with a chance of rain.\n", + "\n", + "---\n", + "\n", + "[ai]: \n", + "Tools: \n", + "- Response({\"temperature\":67,\"other_notes\":\"Cloudy with a chance of rain.\"})\n", + "\n", + "---\n", + "\n" + ] + } + ], + "source": [ + "import { BaseMessage, HumanMessage, isAIMessage } from \"@langchain/core/messages\";\n", + "\n", + "const prettyPrint = (message: BaseMessage) => {\n", + " let txt = `[${message._getType()}]: ${message.content}`;\n", + " if (\n", + " isAIMessage(message) && message?.tool_calls?.length\n", + " ) {\n", + " const tool_calls = message?.tool_calls\n", + " ?.map((tc) => `- ${tc.name}(${JSON.stringify(tc.args)})`)\n", + " .join(\"\\n\");\n", + " txt += ` \\nTools: \\n${tool_calls}`;\n", + " }\n", + " console.log(txt);\n", + "};\n", + "\n", + "const inputs = {\n", + " messages: [new HumanMessage(\"what is the weather in sf\")],\n", + "};\n", + "\n", + "const stream = await app.stream(inputs, { streamMode: \"values\" });\n", + "\n", + "for await (const output of stream) {\n", + " const { messages } = output;\n", + " prettyPrint(messages[messages.length - 1]);\n", + " console.log(\"\\n---\\n\");\n", + "}" + ] + }, + { + "cell_type": "markdown", + "id": "332a421f", + "metadata": {}, + "source": [ + "## Partially streaming JSON\n", + "\n", + "If we want to stream the structured output as soon as it's available, we can use the 
[`.streamEvents()`](https://js.langchain.com/v0.2/docs/how_to/streaming#using-stream-events) method. We'll watch for `on_chat_model_stream` events, aggregate the emitted chunks, and inspect the `name` field. As soon as we detect that the model is calling the final output tool, we can start logging the relevant chunks.\n", + "\n", + "Here's an example:" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "id": "c96cb38d", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "---\n", + "{\"\n", + "---\n", + "{\"temperature\n", + "---\n", + "{\"temperature\":\n", + "---\n", + "{\"temperature\":67\n", + "---\n", + "{\"temperature\":67,\"\n", + "---\n", + "{\"temperature\":67,\"other\n", + "---\n", + "{\"temperature\":67,\"other_notes\n", + "---\n", + "{\"temperature\":67,\"other_notes\":\"\n", + "---\n", + "{\"temperature\":67,\"other_notes\":\"Cloud\n", + "---\n", + "{\"temperature\":67,\"other_notes\":\"Cloudy\n", + "---\n", + "{\"temperature\":67,\"other_notes\":\"Cloudy with\n", + "---\n", + "{\"temperature\":67,\"other_notes\":\"Cloudy with a\n", + "---\n", + "{\"temperature\":67,\"other_notes\":\"Cloudy with a chance\n", + "---\n", + "{\"temperature\":67,\"other_notes\":\"Cloudy with a chance of\n", + "---\n", + "{\"temperature\":67,\"other_notes\":\"Cloudy with a chance of rain\n", + "---\n", + "{\"temperature\":67,\"other_notes\":\"Cloudy with a chance of rain.\"\n", + "---\n", + "{\"temperature\":67,\"other_notes\":\"Cloudy with a chance of rain.\"}\n", + "---\n", + "{\"temperature\":67,\"other_notes\":\"Cloudy with a chance of rain.\"}\n", + "---\n", + "[\n", + " {\n", + " name: 'Response',\n", + " args: { temperature: 67, other_notes: 'Cloudy with a chance of rain.' 
},\n", + " id: 'call_oOhNx2SdeelXn6tbenokDtkO',\n", + " type: 'tool_call'\n", + " }\n", + "]\n" + ] + } + ], + "source": [ + "import { concat } from \"@langchain/core/utils/stream\";\n", + "\n", + "const eventStream = await app.streamEvents(inputs, { version: \"v2\" });\n", + "\n", + "let aggregatedChunk;\n", + "for await (const { event, data } of eventStream) {\n", + " if (event === \"on_chat_model_stream\") {\n", + " const { chunk } = data;\n", + " if (aggregatedChunk === undefined) {\n", + " aggregatedChunk = chunk;\n", + " } else {\n", + " aggregatedChunk = concat(aggregatedChunk, chunk);\n", + " }\n", + " const currentToolCalls = aggregatedChunk.tool_calls;\n", + " if (\n", + " currentToolCalls.length === 0 ||\n", + " currentToolCalls[0].name === \"\" ||\n", + " !finalResponseTool.name.startsWith(currentToolCalls[0].name)\n", + " ) {\n", + " // No tool calls or a different tool call in the message,\n", + " // so drop what's currently aggregated and start over\n", + " aggregatedChunk = undefined;\n", + " } else if (currentToolCalls[0].name === finalResponseTool.name) {\n", + " // Now we're sure that this event is part of the final output!\n", + " // Log the partially aggregated args.\n", + " console.log(aggregatedChunk.tool_call_chunks[0].args);\n", + "\n", + " // You can also log the raw args instead:\n", + " // console.log(chunk.tool_call_chunks);\n", + "\n", + " console.log(\"---\");\n", + " }\n", + " }\n", + "}\n", + "// Final aggregated tool call\n", + "console.log(aggregatedChunk.tool_calls);" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "TypeScript", + "language": "typescript", + "name": "tslab" + }, + "language_info": { + "codemirror_mode": { + "mode": "typescript", + "name": "javascript", + "typescript": true + }, + "file_extension": ".ts", + "mimetype": "text/typescript", + "name": "typescript", + "version": "3.7.2" } - ], - "source": [ - "import { concat } from \"@langchain/core/utils/stream\";\n", - "\n", - "const eventStream = 
await app.streamEvents(inputs, { version: \"v2\" });\n", - "\n", - "let aggregatedChunk;\n", - "for await (const { event, data } of eventStream) {\n", - " if (event === \"on_chat_model_stream\") {\n", - " const { chunk } = data;\n", - " if (aggregatedChunk === undefined) {\n", - " aggregatedChunk = chunk;\n", - " } else {\n", - " aggregatedChunk = concat(aggregatedChunk, chunk);\n", - " }\n", - " const currentToolCalls = aggregatedChunk.tool_calls;\n", - " if (\n", - " currentToolCalls.length === 0 ||\n", - " currentToolCalls[0].name === \"\" ||\n", - " !finalResponseTool.name.startsWith(currentToolCalls[0].name)\n", - " ) {\n", - " // No tool calls or a different tool call in the message,\n", - " // so drop what's currently aggregated and start over\n", - " aggregatedChunk = undefined;\n", - " } else if (currentToolCalls[0].name === finalResponseTool.name) {\n", - " // Now we're sure that this event is part of the final output!\n", - " // Log the partially aggregated args.\n", - " console.log(aggregatedChunk.tool_call_chunks[0].args);\n", - "\n", - " // You can also log the raw args instead:\n", - " // console.log(chunk.tool_call_chunks);\n", - "\n", - " console.log(\"---\");\n", - " }\n", - " }\n", - "}\n", - "// Final aggregated tool call\n", - "console.log(aggregatedChunk.tool_calls);" - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "TypeScript", - "language": "typescript", - "name": "tslab" }, - "language_info": { - "codemirror_mode": { - "mode": "typescript", - "name": "javascript", - "typescript": true - }, - "file_extension": ".ts", - "mimetype": "text/typescript", - "name": "typescript", - "version": "3.7.2" - } - }, - "nbformat": 4, - "nbformat_minor": 5 -} + "nbformat": 4, + "nbformat_minor": 5 +} \ No newline at end of file diff --git a/examples/how-tos/stream-tokens.ipynb b/examples/how-tos/stream-tokens.ipynb index c2c0c6f05..eaed4d36d 100644 --- a/examples/how-tos/stream-tokens.ipynb +++ b/examples/how-tos/stream-tokens.ipynb @@ 
-1,637 +1,637 @@ { - "cells": [ - { - "cell_type": "markdown", - "id": "562ddb82", - "metadata": {}, - "source": [ - "# How to stream LLM tokens from your graph\n", - "\n", - "In this example, we will stream tokens from the language model powering an\n", - "agent. We will use a ReAct agent as an example. The tl;dr is to use\n", - "[streamEvents](https://js.langchain.com/v0.2/docs/how_to/chat_streaming/#stream-events)\n", - "([API Ref](https://api.js.langchain.com/classes/langchain_core_runnables.Runnable.html#streamEvents)).\n", - "\n", - "
<div class=\"admonition tip\">\n", - "    <p class=\"admonition-title\">Note</p>\n", - "    <p>\n", - " If you are using a version of `@langchain/core` < 0.2.3, when calling chat models or LLMs you need to call `await model.stream()` within your nodes to get token-by-token streaming events, and aggregate final outputs if needed to update the graph state. In later versions of `@langchain/core`, this occurs automatically, and you can call `await model.invoke()`.\n", - "\n", - " For more on how to upgrade `@langchain/core`, check out [the instructions here](https://js.langchain.com/v0.2/docs/how_to/installation/#installing-integration-packages).\n", - "    </p>\n", - "</div>\n", - "\n", - "This how-to guide closely follows the others in this directory, showing how to\n", - "incorporate the functionality into a prototypical agent in LangGraph.\n", - "\n", - "This works for\n", - "[StateGraph](https://langchain-ai.github.io/langgraphjs/reference/classes/index.StateGraph.html)\n", - "and all its subclasses, such as\n", - "[MessageGraph](https://langchain-ai.github.io/langgraphjs/reference/classes/index.MessageGraph.html).\n", - "\n", - "<div class=\"admonition tip\">\n", - "    <p class=\"admonition-title\">Streaming Support</p>\n", - "    <p>\n", - " Token streaming is supported by many, but not all chat models. Check to see if your LLM integration supports token streaming here (doc). Note that some integrations may support _general_ token streaming but lack support for streaming tool calls.\n", - "    </p>\n", - "</div>\n", - "\n", - "<div class=\"admonition tip\">\n", - "    <p class=\"admonition-title\">Note</p>\n", - "    <p>\n", - " In this how-to, we will create our agent from scratch to be transparent (but verbose). You can accomplish similar functionality using the createReactAgent({ llm, tools }) (API doc) constructor. This may be more appropriate if you are used to LangChain's AgentExecutor class.\n", - "    </p>\n", - "</div>
\n", - "\n", - "\n", - "## Setup\n", - "\n", - "This guide will use OpenAI's GPT-4o model. We will optionally set our API key\n", - "for [LangSmith tracing](https://smith.langchain.com/), which will give us\n", - "best-in-class observability.\n", - "\n", - "---" - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "id": "8e76833b", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "// process.env.OPENAI_API_KEY = \"sk_...\";\n", - "\n", - "// Optional, add tracing in LangSmith\n", - "// process.env.LANGCHAIN_API_KEY = \"ls__...\";\n", - "// process.env.LANGCHAIN_CALLBACKS_BACKGROUND = \"true\";\n", - "// process.env.LANGCHAIN_TRACING = \"true\";\n", - "// process.env.LANGCHAIN_PROJECT = \"Stream Tokens: LangGraphJS\";" - ] - }, - { - "cell_type": "markdown", - "id": "ab95dc97", - "metadata": {}, - "source": [ - "## Define the state\n", - "\n", - "The state is the interface for all of the nodes in our graph.\n" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "id": "1648124b", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "import { Annotation } from \"@langchain/langgraph\";\n", - "import { BaseMessage } from \"@langchain/core/messages\";\n", - "\n", - "const GraphState = Annotation.Root({\n", - " messages: Annotation<BaseMessage[]>({\n", - " reducer: (x, y) => x.concat(y),\n", - " }),\n", - "});" - ] - }, - { - "cell_type": "markdown", - "id": "da50fbd8", - "metadata": {}, - "source": [ - "## Set up the tools\n", - "\n", - "First define the tools you want to use. For this simple example, we'll create a placeholder search engine, but see the documentation [here](https://js.langchain.com/v0.2/docs/how_to/custom_tools) on how to create your own custom tools."
- ] - }, - { - "cell_type": "code", - "execution_count": 3, - "id": "a8f1ae1c", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "import { tool } from \"@langchain/core/tools\";\n", - "import { z } from \"zod\";\n", - "\n", - "const searchTool = tool((_) => {\n", - " // This is a placeholder for the actual implementation\n", - " return \"Cold, with a low of 3℃\";\n", - "}, {\n", - " name: \"search\",\n", - " description:\n", - " \"Use to surf the web, fetch current information, check the weather, and retrieve other information.\",\n", - " schema: z.object({\n", - " query: z.string().describe(\"The query to use in your search.\"),\n", - " }),\n", - "});\n", - "\n", - "await searchTool.invoke({ query: \"What's the weather like?\" });\n", - "\n", - "const tools = [searchTool];" - ] - }, - { - "cell_type": "markdown", - "id": "19b27cb3", - "metadata": {}, - "source": [ - "We can now wrap these tools in a prebuilt\n", - "[ToolNode](https://langchain-ai.github.io/langgraphjs/reference/classes/prebuilt.ToolNode.html).\n", - "This object will actually run the tools (functions) whenever they are invoked by\n", - "our LLM." - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "id": "f02278b1", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "import { ToolNode } from \"@langchain/langgraph/prebuilt\";\n", - "\n", - "const toolNode = new ToolNode(tools);" - ] - }, - { - "cell_type": "markdown", - "id": "dd55ee5a", - "metadata": {}, - "source": [ - "## Set up the model\n", - "\n", - "Now load the [chat model](https://js.langchain.com/v0.2/docs/concepts/#chat-models).\n", - "\n", - "1. It should work with messages. We will represent all agent state in the form\n", - " of messages, so it needs to be able to work well with them.\n", - "2. 
It should work with\n", - " [tool calling](https://js.langchain.com/v0.2/docs/how_to/tool_calling/#passing-tools-to-llms),\n", - " meaning it can return function arguments in its response.\n", - "\n", - "
<div class=\"admonition tip\">\n", - "    <p class=\"admonition-title\">Note</p>\n", - "    <p>\n", - " These model requirements are not general requirements for using LangGraph - they are just requirements for this one example.\n", - "    </p>\n", - "</div>
" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "id": "9c7210e7", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "import { ChatOpenAI } from \"@langchain/openai\";\n", - "\n", - "const model = new ChatOpenAI({ model: \"gpt-4o\", temperature: 0 });" - ] - }, - { - "cell_type": "markdown", - "id": "73e59248", - "metadata": {}, - "source": [ - "After you've done this, we should make sure the model knows that it has these\n", - "tools available to call. We can do this by calling\n", - "[bindTools](https://v01.api.js.langchain.com/classes/langchain_core_language_models_chat_models.BaseChatModel.html#bindTools)." - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "id": "b4ff23ee", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "const boundModel = model.bindTools(tools);" - ] - }, - { - "cell_type": "markdown", - "id": "dbe67356", - "metadata": {}, - "source": [ - "## Define the graph\n", - "\n", - "We can now put it all together." 
- ] - }, - { - "cell_type": "code", - "execution_count": 7, - "id": "0ba603bb", - "metadata": {}, - "outputs": [], - "source": [ - "import { StateGraph, END } from \"@langchain/langgraph\";\n", - "import { AIMessage } from \"@langchain/core/messages\";\n", - "\n", - "const routeMessage = (state: typeof GraphState.State) => {\n", - " const { messages } = state;\n", - " const lastMessage = messages[messages.length - 1] as AIMessage;\n", - " // If no tools are called, we can finish (respond to the user)\n", - " if (!lastMessage?.tool_calls?.length) {\n", - " return END;\n", - " }\n", - " // Otherwise if there is, we continue and call the tools\n", - " return \"tools\";\n", - "};\n", - "\n", - "const callModel = async (\n", - " state: typeof GraphState.State,\n", - "): Promise<Partial<typeof GraphState.State>> => {\n", - " // For versions of @langchain/core < 0.2.3, you must call `.stream()`\n", - " // and aggregate the message from chunks instead of calling `.invoke()`.\n", - " const { messages } = state;\n", - " const responseMessage = await boundModel.invoke(messages);\n", - " return { messages: [responseMessage] };\n", - "};\n", - "\n", - "const workflow = new StateGraph(GraphState)\n", - " .addNode(\"agent\", callModel)\n", - " .addNode(\"tools\", toolNode)\n", - " .addEdge(\"__start__\", \"agent\")\n", - " .addConditionalEdges(\"agent\", routeMessage)\n", - " .addEdge(\"tools\", \"agent\");\n", - "\n", - "const agent = workflow.compile();" - ] - }, - { - "cell_type": "code", - "execution_count": 8, - "id": "a88cf20a", - "metadata": {}, - "outputs": [ + { - "data": { - "image/png": 
"/9j/4AAQSkZJRgABAQAAAQABAAD/4gHYSUNDX1BST0ZJTEUAAQEAAAHIAAAAAAQwAABtbnRyUkdCIFhZWiAH4AABAAEAAAAAAABhY3NwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAQAA9tYAAQAAAADTLQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAlkZXNjAAAA8AAAACRyWFlaAAABFAAAABRnWFlaAAABKAAAABRiWFlaAAABPAAAABR3dHB0AAABUAAAABRyVFJDAAABZAAAAChnVFJDAAABZAAAAChiVFJDAAABZAAAAChjcHJ0AAABjAAAADxtbHVjAAAAAAAAAAEAAAAMZW5VUwAAAAgAAAAcAHMAUgBHAEJYWVogAAAAAAAAb6IAADj1AAADkFhZWiAAAAAAAABimQAAt4UAABjaWFlaIAAAAAAAACSgAAAPhAAAts9YWVogAAAAAAAA9tYAAQAAAADTLXBhcmEAAAAAAAQAAAACZmYAAPKnAAANWQAAE9AAAApbAAAAAAAAAABtbHVjAAAAAAAAAAEAAAAMZW5VUwAAACAAAAAcAEcAbwBvAGcAbABlACAASQBuAGMALgAgADIAMAAxADb/2wBDAAMCAgMCAgMDAwMEAwMEBQgFBQQEBQoHBwYIDAoMDAsKCwsNDhIQDQ4RDgsLEBYQERMUFRUVDA8XGBYUGBIUFRT/2wBDAQMEBAUEBQkFBQkUDQsNFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBT/wAARCADaAMcDASIAAhEBAxEB/8QAHQABAAMBAAMBAQAAAAAAAAAAAAUGBwgCAwQJAf/EAE8QAAEDBAADAwYIBw0HBQAAAAECAwQABQYRBxIhEzFVCBYiQZTRFBUXMlFhk+EJN0JxdbO0IyQ0NkNSYnN2gaHB0hhUVpGSlbElM0Vyov/EABsBAQACAwEBAAAAAAAAAAAAAAACAwEEBQYH/8QANREAAgECAQgIBQUBAQAAAAAAAAECAxExBBITIUFRUpEFFBVhcaGxwSIyM2LRQnKB4fA0Y//aAAwDAQACEQMRAD8A/VOlKUApSlAK+SbdoNtKBMmx4pX1SH3Uo5vzbNfXWZ5/Cjzs/tSJMdqQkWyQQl1AUAe1a+mjlGEZTlgk2XUaelmoXxLx51WXxiB7Sj3086rL4xA9pR76zvzetfhsP7BHup5vWvw2H9gj3Vye1cn4Jc0dPs77vI0TzqsvjED2lHvp51WXxiB7Sj31nfm9a/DYf2CPdTzetfhsP7BHup2rk/BLmh2d93kaJ51WXxiB7Sj3086rL4xA9pR76zvzetfhsP7BHup5vWvw2H9gj3U7VyfglzQ7O+7yNE86rL4xA9pR76edVl8Yge0o99Z35vWvw2H9gj3U83rX4bD+wR7qdq5PwS5odnfd5GiedVl8Yge0o99eTWS2h91Dbd1hOOLISlCZCCVE9wA3Wc+b1r8Nh/YI91Rl/s1visW91mDGZdTdbfpbbKUqH78Z9YFX0OkKFetCiotZzSxW12IyyDNi5Z2BtdKUrfOQKUpQClKUApSlAKUpQClKUApSlAKznNfxg2v9FyP1rVaNWc5r+MG1/ouR+taqqt9Cp+1m5kn1onjSlK8IenILMs4snD6zi6X+cIENTqI6FBtbq3HVHSUIQgFS1HrpKQT0P0VQMr8pDHsemYQY7U242zJH5DZlsW+WtbCGW3CSGkslal9ogJKNBQHMrWgTUxxztlrueHxhdLbkE4MT2ZEaRjDCnp0B9IUUSEJTs+j1B9FXztFJBNZeZmcO2LhZmGT2O73WRYr5NMtuLbv/AFBcNxiQwxIcit9UrIU2VoSOm+4dQNulThKN5d+3u1GtUnJOy7vU1jJuOeEYbdmbder0q3yXG23SXYb/AGbSXOiC64G+Rrf9Mpr
6cl4w4liWRjH7lcnU3tUduWmBGhSJLqmVqUhKwlptWxtCt6+boE6BG8H41NZRnxzu3ybTm0iPPs7Qxe22pl2PDV2kbbhmKSUjtEulQU08e5ICUqJrQ+HlonO8ZxfH7VOjRXcGtcdMmXFW1yu9u+txklQGnACgqQeo6bFSdKEYKT3b/DuIqpNyzUTnDjjjbeIWX5Tj7cObElWe4uQ2lLhSQ282httSlqcU0lCFcy1AIKuYgBQ2FA1plY9wzfnYjxTz+xXCx3dKb3e1XaFdWoS1wFsqiMpIU+PRQoKZUnlVo7I1vdbDVFVRUvhwsi6m21rFRGTfwOB+lLf+2M1L1EZN/A4H6Ut/7YzW10d/20f3R9UKv05eDNfpSlewPIilKUApSlAKUpQClKUApSlAKUpQCs5zX8YNr/Rcj9a1WjVXMlwaDk8+NNfkzYsmO0plK4b/AGe0qIJB6HfVIrEoqpCUG7XTRfQqKlUU2ZzlfD3GM6VGOR4/bL6YvMGDcIqHuy5tc3LzA63yp3r6BUB/s/cMt78wMb/7Wz/prUvkqg+MXv237qfJVB8Yvftv3VxV0XNKyrep1nltB63EpWLcOMVwd997HcctdjdkJCHV2+IhkuJB2AopA2BVjqS+SqD4xe/bfup8lUHxi9+2/dUX0S5O7qrkySy+ktSTI2lZpxkizcJ4ncI7HbL3dEQMlu78O4B2RzKU2hnnTynXonfrrXfkqg+MXv237qx2P/6rkzPaFLcyvXyxW7JrVItl2gx7nbpAAdiy2g404AQRzJPQ9QD/AHVUEcAeGjZ2nAccSdEbFsZHQjRHzforUPkqg+MXv237qfJVB8Yvftv3VNdFSjqVZcmReXUXjEzi18E+H9juMa4W/CrDBnRlh1mTHtzSHG1juUlQTsEfTU9k38DgfpS3/tjNWn5KoPjF79t+6v6nhRbO3juO3G7SUsPtyEtPS+ZBW2sLTsa6jmSD/dWxk/R7pV6dadW+a08HsdyEstpOLjFWuXWlKV0ziClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQHO/lI/jx8nn+0Mv9mNdEVzv5SP48fJ5/tDL/AGY10RQClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQHO/lI/jx8nn+0Mv8AZjXRFc7+Uj+PHyef7Qy/2Y10RQClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQClKUApSoTJMvgYyG0P9rJmvAlmDFTzvOgd5A2AlPcOZRCRsbPUVKMXJ2iZScnZE3Xy3S2Rb1bZdvnx25cGWyuO+w6NodbUkpUlQ9YIJB/PVDczzJZJKmLPboTfXlEmWt1z6thKAAfqCj+evX555d/u1k/6nqt0W+S5m11Ws/wBJ+OnlF8G5fArjBfsRfClxWHu2t76v5eKv0mlb9Z16Kv6SVD1V+r3kU8F3+B/AOz2qehbV6ujirxcWXO9p51CAG9eopbQ2kj+clX01UOLnBg8Zs/wrLL5EtIn4w/2qW2u05ZiAoLQ07sbKErHMB/SWPyumueeeXf7tZP8AqepolxLmOqVtxpVKzdGa5Ykgrh2Zwb+al11HT8/Kf/FS1n4jsvyGot5hKskh1QQ26p0OxnFE6CQ7oaJOgAtKdkgDZ6VjRN/K0/B+2JCWT1YK7iXKlKVSa4pSlAKUpQClKUApSlAKUpQClKUApSlAKUpQEVlF+RjNhl3FbfbKaSEtsg6LrqlBLaAfUVLUlP8AfWew4zqFOyZbpk3CQQuQ+T3n1JT9CE7ISn1D6ySZ7istXwbHGv5J27oDn0aSy8tP/wC0IqKq2fwU4pbdfsl5M7OQwWa57RSudOPPFHJ8ZvGSSMQv1zlOY3ARMnWqJZYz0GOeQualSHVJX6aBvlaPMkddHYr08TuL2QfHOTohZfFwNiy41HvUCPIjMPLuzrqXVFO3QSUJLaG9NgK5l9/cK1bG660VdHSNK5me4ncQslvbGO2hrIY
r9msdtk3J61wLdIlOy5LJWQ8JTjSUpHLrTaNlXP1SAAZm0ZRxLyrLsRxq63M4RcZeOS7hc2o0OM+6HmZbbSFo5u0QgrSsKI2sAKI79KCxlVU8Ezf1uJb5eZQTzHlGzrZ+ivF9huSy4y82l1pxJQttYBSpJGiCD3g1yreLtknEqzcHJUzI37ZeGcunWp+XAisacdYRMaTICHELAVytH0fm/uqunROuqIbLkeIw06+uU6hCUrfcSlKnCBoqISAAT39AB16Cs4ayUJ599RO4BfHkSpNgmOrfcjtiREfeXzLcYKtFKiepLatDZ6lKkbJPMau1ZXblqa4gY2pHznBKaXrv7Mtcx/u5kI/wrVK2qmtRnvXu17XODlUFCq0hSlKpNQUpSgFKUoBSlKAUpSgFKUoBSlKAUpSgK7ntjfv2NvtRE80+OtEuMknl5nG1BQRv1BYBQT9CzVNhTG7hFbkNE8ix3KGlJI6FJHqIIII9RBFapWP53fbHbOKNpxa1TixmV7juTfiv4K4uM80gHbzziEkMElJSHDvZ0ClXokWq045jdrYfg38lyhUm4ywZRcu4BWPMLtf5b92vkCLkDSGrtbbfMDUaaUt9mlaxyFYPIEpPKpIUEgKB67zzirwnyQ5JZn8etuR3ddstLEKJdI92tiOR1vm0txqSwSgn0SpbOubp6I5RXQjhvcMlEvGLilQ36cUtPtq+sFK+b/mkH6q9fxhP/wCHL17J99Y6vV2LzR03KjJapGbp4KyciYsl9vWQ3SxZ2m1swbtdMZkIYTNKRtSVpW2pJAUVEKCUkb6aGgLZb+G9vt+V2jIRMuEi4WyzrsjZkvh0OMqW2srcURzKc20n0ubrs7BJ3Xvu2dQ7DcLZAucSXbp1zcLMCLLDbTstY1tLSVLBWRsdE7PUVK/GE/8A4cvXsn306vV3E1Oitq5lEk8BLC/iUWxNXG7QzDvD18h3KM+hEuNKcdccUUK5OXl/dnE8qkn0T12etaBaoKrZbIkNUp+cqOyhoyZSgp14pAHOsgAFR1skAdT3V4ImXJ0hKMbvKlE60phKP8VLAqTtuH3u/KHxmn4it5+eyy8Fy3B/NK07S2PUSkqV1Oik6VTQSXz2S8fbEi61GmrpnswW3qu2TSLwRuFBaXCjq3tLjqlDtlD/AOnIlG/pLg6aO9Eqk8OOJWLZrJv9jx3to7+MSvi2bAehORTHUNhHKFJAKFBJKSn1a7t1dqTkpNJYLD/eZwatR1ZubFKUqsqFKUoBSlKAUpSgFKUoBSlKAUpSgFfwkDvOvz1HT8hgQLgzbFTIxvEllx+LblPoQ/ISjXMUJJ2QNjZ7hsbrNIWEz+OuKY3cOJmPycVuFtupujFit95WpBCFEx/hJb5QpSfRXoHopAOwCpFAfdcciu3FRzPcMtEfI8GdtyG4jOXLiIShx5Q5l/BkrO1gJ5RzgD550UkJJvWM4+nGrDbLaZsu6uwYrcX4wuKw5KfCQBzOLAHMo62TrqetStKAUpUbklkTkuO3W0LlyoCLhFdiGXCWEPshaCnnbUQQFp3sEggEDoaA/ILyy/KMmcT/ACiF3ewXBTVrxR8RLJIjr/LaXzKkJPdtTg2FfzUo+iv1L8n/AIvQ+OXCWwZfE5W3pjPJMjp/kJKPRdR9OuYEjfekpPrriHi3+D/4e4FxI4V4/b7zkz0PKro/CmuSZUdTjaEM84LRSwADvv5goa9VdreT/wCT/j3k4YbMxrGplznQJU9dxW5dXW3HQ4pttsgFttA5dNJ9W9k9e7QGmUpSgKvxFwCJxIw+6Y/Jn3CzonpRzT7PIMaU0pCgpCkuD1gpHfsEdKhY96ynFM0xXEW8bnZBi7lt7OTmD89tTrMltJ/99s+krnCUnnH5S+6tCpQEXjmUWfMLYLjY7pDu8ArU18JhPJdb50nSk7SSNg9CKlKzDKOFdxx3DrhE4Pu2TAb3LuKbk8tdtS5GlL6BaFpTrk5wlIKkgkAHQBOxLw+LFrVxTc4dyY9
xayBu2puSJSoDiIcpvYDhac6j0CUbBOgVgAkg6AvFKUoBSlKAUpSgFKUoBSlKAVnWY57OvreX4rw6uFqd4i2VqMXI14Q6iPFD/pIcUQn0/wBz5lDl2NgA67q0Ws0uU5GN8d7NGhYIqQvJbe/8YZfGbJ+D/BgC2w8Qg6Srm9EqWOvQA+oCdtPDazoyO35fdrXbZudtW1u3v3xmNyKIAJX2YJVyJKlL9ZOiEkkCrdSlAKUpQClKw7jX5Q0jGsgZ4fcPLajLuKE9vmbgJV+9rW2dfviYsfMSNghOwVbHdzJ2BX/KPuURzyhvJ6tiJLS7im9S5KoiVguhr4OR2hT3hOwRvu6H6DXSNY7wL8nmPwxkzcoyO5Ly/iVeBzXTJJY2ob/kI6f5JlOgAABvQ3oBKU7FQClKUApSlAK9E2G3cIj8Z3nDbzamlFpam1hKho8qkkFJ+sEEeqvfSgMgj4vkPAPBccsOAWeZncBq59lKbvV51KjRHFHRaWtPKUtcydJ6aQg95JUNStV7t19aedts+LcGmXVMOLivJdShxPRSFFJOlD1g9RX21lPk6ysJl41kqsFhzYUBORz0TkTiSpc4LHbqTtSvQJ1ru/MKA1alKUApSlAKUpQClKUAr8+/KJ/CQXPF81h47jGK3exSrHdWlXtu9LjIclIbWsPQwlAeSlCwGyH0Ob79JI0T+gD8hqK2XHnUNIHepxQSP+Zri7y8vJnsfGSyO5ticy3jOLaz++IzUhG7pHSPmaB6upA9E96h6J36OpKMpYIFz8hzym808pa35fOyq1We3RbU7FZhOWlh1sOrWHS6F9o6vfKEta1r5x7/AFdRVyl+Dyx6Dw48nC3qucli23O9TZFzfjS3EtuoBIab2lWiAUNJWPqXv110z51WXxiB7Sj31LRz4WZsyUpXzQ7nDuG/gstiTrqexcC//BrmfJuIuVeVFkU/C+F8yRj2BQnVRb/nqElLj6h0XFt++9XqLvq3sdOXng01qZgluJfHTIc/zGXww4MdjLyBj0L5lrqe0gWBB2CAe52R0OkDYBHXelcui8FOBePcD7A/EtfbXG8T1/CLrfp6u0mXF87JcdWeutk6TvQ2e8kkzfDLhfjfCDEYmN4rbW7bbI/UhPVx5Z+c44vvWs66k/UBoAAWusAUpSgFKUoBSvkm3aDbSgTJseKV9Uh91KOb82zXzedVl8Yge0o99TUJNXSM2ZKUqL86rL4xA9pR76edVl8Yge0o99Z0c+FizMY8rPyn5vkv2XH7s3hispt9zkOxXnxcfgiYriUpU2k/uTnMVjtCO7XZnv3XO/Cf8Jffs3yy24rC4UQpd2vFx7GMIV3VHQhK1DRcBYXspGypewNAnQ1XV/HnD8Y438KMhw+Zd7ahc6OTEkLko/e8lPpNOdDvQUBvXekqHrrjn8G5wOZxbJ8izzLixbbhbXXLPbI0x1CFJc7pDwBPqGmwobB5nB6qaOfCxZn6NUqL86rL4xA9pR76edVl8Yge0o99NHPhYsyUpUX51WXxiB7Sj30TlFmUoAXeCSegAko6/wCNNHPhYsyUpSlVmBVQy7Ln4ksWm0hBuBSFvyXBzNxEHu6flOK/JT3AAqV05Urtch9EWO684dNtpK1H6gNmshxpbku1N3F/Rl3I/DX1DfVSwCB19SU8qR9SRVsbRi6j2YeJu5LRVWfxYI/i8agy3u3uLZvEsjRk3HTyz130BHKkfUkAfVXu837WP/jYf2CPdVO4wcXYnCOJj78qHImC63Vi3nsGHnS0hSvTc02hZUoDuR0Kj3b0RX0ZFxsw3FI1sdul0djKuUb4ZHjCBJXJ7HptxbKWy42kb6laU6OwdEGq3WqSxkzuJwjq1KxafN+1+Gw/sE+6nm/a/DYf2CfdVdv/ABgw/G7PaLnMvbS4l4Tz24wmnJTktPLzFTbbSVLUACCSBobG9VDSuLzFyyjhszjkiFdLBlLs5DkwBRUAxHW4OTqOVXOjlUFAkaI0DUdJPiZlyii
7PYrZ3lBZtsZDqSFJdabDbiSO4hSdEf3GpbGr+7hfZw5jhkWNbhAkKSO1irWvZU4ofPbKlElZ9JJJUoqBKkU2wcXMTyjJ5WP2q6mbc4y3W3EojPBrmbOnEpeKOzUUnoQlRIq3PMokMradQlxtaSlSFDYUD0INWRrSwm7r/YbiqpShWjY1ClVPhjcHJmKNx33C6/b3nYKlkklSW1ENkk9SS3yEk+vff31bKTjmScdx5yUXFuLFKUqBEUpSgMzz+FHnZ/akSY7UhItkghLqAoA9q19NfH5vWvw2H9gj3VJZr+MG1/ouR+tarxrn5fUnGcUm1qXqzxfS0pLKWk9iI/zetfhsP7BHup5vWvw2H9gj3VIVGZLk1rw+ySrvepzVutsYAuyHjoDZAAHrJJIAA2SSAASa5ulqP9T5nHU5t2TZ5+b1r8Nh/YI91PN61+Gw/sEe6qjD474LNsV2vCb8lmFaezM/4VFeYdjJcUEoUtpxCXAlRPRXLroevQ1IYrxYxXNJc+Larr2kmCymS+1JjuxlBlW+V1IdSnnbOj6adp+us59ZbX5ljVZJtp6vEnvN61+Gw/sEe6nm9a/DYf2CPdWVRvKOsmUcTsLxvFJ0e6wruuaJj64j6PQZYUtCmHFBKFpK06Kk8419HfWy0lOrHGT5mJqrTtn3VyP83rX4bD+wR7qhc0sluYxa4uNQIrbiW9pWhlIIOx3HVWqoLOf4pXP+q/zFbOSVajyims5/MtveWZPOWmhr2r1NlpSldg+inzXKILhbpUUnQfaU3v6Ngj/OslxVxS8btoWlSHW2EsuIUNFK0DlWD+ZSSK2Os6yqwu45cZN1iMKetUtZdmNtDa4zpABdCfW2rXpa6pV6WiFKKLorPg6axxX4/wBusdDI6qpzaltMm8oK23GTjmOXK322Xd/iTI7fdZMSA2XZC2GnP3QtoHVagFb5R1OjVWVkcvFeK9yzl7E8mulnyGxxY0X4HaXHZcR1h17mYdY1ztBfaJUCoBOwdkVusaSzMYQ/HdQ+y4OZDjagpKh9II6GvZWq9WpnYcLvOTOWuHWJZDwZl4JkV9xy53GKmx3C3yIVmjGa9anX53wttPZo2op5D2RUgHRQN6FeeM4pkmP3vC8vmY1c24UjMLxc3bZHY7STb485lbbKnW0np6Wlr1vl5zvuNdRUrFyCopWs8P6/BgOAfGti4xfFuK2fJrbh0uRPfvUG+wC3CivbKkPwnj1IdcJJbSpSdKJ0kjVb9SvlhxnswkLt9scIjBXJMuKN8jKd6UhtQ6F0jYAHzPnK/JSuyEHUfdte4k3GjFuT1Fk4URyMdlzNEJn3CRIRsaJQFdmk/mIbBH1EVdK9EKGxbobESM0liMw2lpppA0lCEjQA+oACvfVtSWfNyR5ucs+TlvFKUqsgKUpQGc5r+MG1/ouR+tarxryzX8YNr/Rcj9a1Vcyvh7jGdKinI8ftt9MXmDBuEVD3Zc2ubl5gdb5U719ArmdIW0kb7l7niulbda17kWGsj8pfErrlWGWR61xJ1y+Jr7EusuBa5CmJcmO3zhxLK0qSQ4OcLTpQJKBo71U5/s+8Mt/xAxv/ALWz/pqdxXhviuDPvvY7jlrsbshIQ6u3xEMlxIOwFFIGwK5yai7o5kJRpyU4t3Xd/Zz3mWE23JeFmd3PHMZzpd9ehxbeheTKnPyZLQkodLbLT61r0ggknlA6nW+tWjjXgN/zTiDkEazxZCPjDh7cLa1N5FJYMhUlooZU5rlClDm6E70VHu3W/wBKlpWixZTJNNbL46934OdLJe5uYcRODoj4RkePR7C1OanfGFqcYjxCYRbSgOa5VJ5hpKh6J6ddnVdF181xt0W8W+TBnR2pkKS2pl+O+gLQ6hQ0pKknoQQSCDVJHk/8MwQRgGOAjuItjP8ApqLkpY6iE5wqWvqt/O1vf3l/qCzn+KVz/qv8xUBG4DcN4chp9jBMdZfaUFtuItjIUlQOwQeXoQan85/ilc/6r/MVsZJ
brNO3EvUzQUdNDNe1evibLSlK7h9GFKUoCr3PhvYbnJckiM7BkuHa3bfIcjlZ3slQQQFHfrIJr4PkogeL3r237qu9KvVeov1FiqzjqUmUj5KIHi969t+6nyUQPF717b91XelZ09Tf6EtNU4mU5nhVYwoGUu4XJIIPZy5zimzr6UAhJ/MQRVriRGIEZuPGZbjx2khKGmkBKEAdwAHQCvdSq5VJz1SZXKUpfM7ilKVWRFKUoBSlKArmS4NByefGmvyZsWTHaUylcN/s9pUQSD0O+qRUZ8lUHxi9+2/dV2pVmklZL2RXKnCTvKKf8FJ+SqD4xe/bfup8lUHxi9+2/dV2pTSPu5IjoaXAuSKT8lUHxi9+2/dT5KoPjF79t+6rtSmkfdyQ0NLgXJFJ+SqD4xe/bfup8lUHxi9+2/dV2pTSPu5IaGlwLkik/JVB8Yvftv3V65HCG1y2lNSLneH2VfObXM2lQ+g9KvVKyqsk7r0RlUaSd1FckKUpVRaf/9k=" - }, - "metadata": {}, - "output_type": "display_data" - } - ], - "source": [ - "import * as tslab from \"tslab\";\n", - "\n", - "const runnableGraph = agent.getGraph();\n", - "const image = await runnableGraph.drawMermaidPng();\n", - "const arrayBuffer = await image.arrayBuffer();\n", - "\n", - "await tslab.display.png(new Uint8Array(arrayBuffer));" - ] - }, - { - "cell_type": "markdown", - "id": "055aacad", - "metadata": {}, - "source": [ - "## How to stream tool calls\n", - "\n", - "You can now run your agent. Let's first look at an example of streaming back intermediate tool calls. This is not supported by all providers, but some support token-level streaming of tool invocations.\n", - "\n", - "To get the partially populated tool calls, you can access the message chunks' `tool_call_chunks` property:" - ] - }, - { - "cell_type": "code", - "execution_count": 9, - "id": "c704d23c", - "metadata": {}, - "outputs": [ + "cell_type": "markdown", + "id": "562ddb82", + "metadata": {}, + "source": [ + "# How to stream LLM tokens from your graph\n", + "\n", + "In this example, we will stream tokens from the language model powering an\n", + "agent. We will use a ReAct agent as an example. The tl;dr is to use\n", + "[streamEvents](https://js.langchain.com/v0.2/docs/how_to/chat_streaming/#stream-events)\n", + "([API Ref](https://api.js.langchain.com/classes/langchain_core_runnables.Runnable.html#streamEvents)).\n", + "\n", + "
\n",
+    "<div class=\"admonition note\">\n",
+    "    <p class=\"admonition-title\">Note</p>\n",
+    "    <p>\n",
+    "    If you are using a version of `@langchain/core` < 0.2.3, when calling chat models or LLMs you need to call `await model.stream()` within your nodes to get token-by-token streaming events, and aggregate final outputs if needed to update the graph state. In later versions of `@langchain/core`, this occurs automatically, and you can call `await model.invoke()`.\n",
+    "\n",
+    "    For more on how to upgrade `@langchain/core`, check out [the instructions here](https://js.langchain.com/v0.2/docs/how_to/installation/#installing-integration-packages).\n",
+    "    </p>\n",
+    "</div>\n",
+    "
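On those older versions, the manual aggregation the note describes amounts to concatenating chunk contents in stream order. A minimal, dependency-free sketch of that pattern (the `MessageChunk` type here is a stand-in for `AIMessageChunk` from `@langchain/core/messages`, which carries more fields such as `tool_call_chunks`):

```typescript
// Stand-in for AIMessageChunk; illustration only, not the real implementation.
type MessageChunk = { content: string };

// Mirrors what chunk.concat(next) does for string content: join fragments in order.
function aggregateChunks(chunks: MessageChunk[]): MessageChunk {
  return chunks.reduce(
    (acc, next) => ({ content: acc.content + next.content }),
    { content: "" },
  );
}

// Fragments as a model might stream them, token by token:
const streamed = [
  { content: "Cold, " },
  { content: "with a low " },
  { content: "of 3℃." },
];
console.log(aggregateChunks(streamed).content); // "Cold, with a low of 3℃."
```

The aggregated message is what you would then return from your node to update the graph state.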
\n", + "\n", + "This how-to guide closely follows the others in this directory, showing how to\n", + "incorporate the functionality into a prototypical agent in LangGraph.\n", + "\n", + "This works for\n", + "[StateGraph](https://langchain-ai.github.io/langgraphjs/reference/classes/langgraph.StateGraph.html)\n", + "and all its subclasses, such as\n", + "[MessageGraph](https://langchain-ai.github.io/langgraphjs/reference/classes/langgraph.MessageGraph.html).\n", + "\n", + "
\n",
+    "<div class=\"admonition tip\">\n",
+    "    <p class=\"admonition-title\">Streaming Support</p>\n",
+    "    <p>\n",
+    "    Token streaming is supported by many, but not all chat models. Check to see if your LLM integration supports token streaming here (doc). Note that some integrations may support _general_ token streaming but lack support for streaming tool calls.\n",
+    "    </p>\n",
+    "</div>\n",
+    "
\n", + "\n", + "
\n",
+    "<div class=\"admonition note\">\n",
+    "    <p class=\"admonition-title\">Note</p>\n",
+    "    <p>\n",
+    "    In this how-to, we will create our agent from scratch to be transparent (but verbose). You can accomplish similar functionality using the `createReactAgent({ llm, tools })` (API doc) constructor. This may be more appropriate if you are used to LangChain's `AgentExecutor` class.\n",
+    "    </p>\n",
+    "</div>\n",
+    "
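The `streamEvents` filtering pattern the rest of this guide relies on can be shown without a real model: iterate the event stream and keep only `on_chat_model_stream` events. A dependency-free sketch (the generator below is a stand-in for a real `agent.streamEvents(..., { version: "v2" })` stream; event names follow the v2 convention):

```typescript
type StreamEvent = { event: string; data: { chunk?: { content: string } } };

// Stand-in for an actual model-backed event stream.
async function* fakeEventStream(): AsyncGenerator<StreamEvent> {
  yield { event: "on_chain_start", data: {} };
  yield { event: "on_chat_model_stream", data: { chunk: { content: "Cold, with " } } };
  yield { event: "on_chat_model_stream", data: { chunk: { content: "a low of 3℃." } } };
  yield { event: "on_chain_end", data: {} };
}

// Collect only the token-level chunks emitted by the chat model.
async function collectTokens(stream: AsyncIterable<StreamEvent>): Promise<string> {
  let output = "";
  for await (const { event, data } of stream) {
    if (event === "on_chat_model_stream" && data.chunk) {
      output += data.chunk.content;
    }
  }
  return output;
}

collectTokens(fakeEventStream()).then((text) => console.log(text)); // "Cold, with a low of 3℃."
```

With a real agent, you would typically print each chunk as it arrives rather than collecting them, which is exactly what the examples later in this guide do.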
\n", + "\n", + "## Setup\n", + "\n", + "This guide will use OpenAI's GPT-4o model. We will optionally set our API key\n", + "for [LangSmith tracing](https://smith.langchain.com/), which will give us\n", + "best-in-class observability.\n", + "\n", + "---" + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "[\n", - " {\n", - " name: 'search',\n", - " args: '',\n", - " id: 'call_ziGo5u8fYyqQ78SdLZTEC9Vg',\n", - " index: 0,\n", - " type: 'tool_call_chunk'\n", - " }\n", - "]\n", - "[\n", - " {\n", - " name: undefined,\n", - " args: '{\"',\n", - " id: undefined,\n", - " index: 0,\n", - " type: 'tool_call_chunk'\n", - " }\n", - "]\n", - "[\n", - " {\n", - " name: undefined,\n", - " args: 'query',\n", - " id: undefined,\n", - " index: 0,\n", - " type: 'tool_call_chunk'\n", - " }\n", - "]\n", - "[\n", - " {\n", - " name: undefined,\n", - " args: '\":\"',\n", - " id: undefined,\n", - " index: 0,\n", - " type: 'tool_call_chunk'\n", - " }\n", - "]\n", - "[\n", - " {\n", - " name: undefined,\n", - " args: 'current',\n", - " id: undefined,\n", - " index: 0,\n", - " type: 'tool_call_chunk'\n", - " }\n", - "]\n", - "[\n", - " {\n", - " name: undefined,\n", - " args: ' weather',\n", - " id: undefined,\n", - " index: 0,\n", - " type: 'tool_call_chunk'\n", - " }\n", - "]\n", - "[\n", - " {\n", - " name: undefined,\n", - " args: '\"}',\n", - " id: undefined,\n", - " index: 0,\n", - " type: 'tool_call_chunk'\n", - " }\n", - "]\n" - ] - } - ], - "source": [ - "import type { AIMessageChunk } from \"@langchain/core/messages\";\n", - "\n", - "const eventStream = await agent.streamEvents(\n", - " { messages: [[\"user\", \"What's the weather like today?\"]] },\n", - " {\n", - " version: \"v2\",\n", - " },\n", - ");\n", - "\n", - "for await (const { event, data } of eventStream) {\n", - " if (event === \"on_chat_model_stream\") {\n", - " const msg = data.chunk as AIMessageChunk;\n", - " if (msg.tool_call_chunks !== undefined && msg.tool_call_chunks.length > 0) {\n", - " 
console.log(msg.tool_call_chunks);\n", - " }\n", - " }\n", - "}" - ] - }, - { - "cell_type": "markdown", - "id": "1d9b168b", - "metadata": {}, - "source": [ - "Because this is a ReAct-style agent, this will only log intermediate steps and not the final response because the model generates a final response with no tool calls when it no longer needs to gather more information from calling tools.\n", - "\n", - "## Streaming final responses\n", - "\n", - "### ReAct agents\n", - "\n", - "For ReAct-style agents, you know that as soon as you start message chunks with no `tool_call_chunks`, the model is responding directly to the user. So we can flip the conditional like this to only log tokens from the final response:" - ] - }, - { - "cell_type": "code", - "execution_count": 10, - "id": "86f843bb", - "metadata": {}, - "outputs": [ + "cell_type": "code", + "execution_count": 1, + "id": "8e76833b", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "// process.env.OPENAI_API_KEY = \"sk_...\";\n", + "\n", + "// Optional, add tracing in LangSmith\n", + "// process.env.LANGCHAIN_API_KEY = \"ls__...\";\n", + "// process.env.LANGCHAIN_CALLBACKS_BACKGROUND = \"true\";\n", + "// process.env.LANGCHAIN_TRACING = \"true\";\n", + "// process.env.LANGCHAIN_PROJECT = \"Stream Tokens: LangGraphJS\";" + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "\n", - "\n", - "The\n", - " weather\n", - " today\n", - " is\n", - " cold\n", - ",\n", - " with\n", - " a\n", - " low\n", - " of\n", - " \n", - "3\n", - "℃\n", - ".\n", - "\n" - ] - } - ], - "source": [ - "const eventStreamFinalRes = await agent.streamEvents(\n", - " { messages: [[\"user\", \"What's the weather like today?\"]] },\n", - " { version: \"v2\" });\n", - "\n", - "for await (const { event, data } of eventStreamFinalRes) {\n", - " if (event === \"on_chat_model_stream\") {\n", - " const msg = data.chunk as AIMessageChunk;\n", - " if (!msg.tool_call_chunks?.length) {\n", - " 
console.log(msg.content);\n", - " }\n", - " }\n", - "}" - ] - }, - { - "cell_type": "markdown", - "id": "f13b4790", - "metadata": {}, - "source": [ - "### Other graphs\n", - "\n", - "If your graph has multiple model calls in multiple nodes and there's one that will always be called last, you can distinguish that model by assigning it a run name or a tag. To illustrate this, declare a new graph like this:" - ] - }, - { - "cell_type": "code", - "execution_count": 11, - "id": "0fea2f20", - "metadata": {}, - "outputs": [], - "source": [ - "import { HumanMessage } from \"@langchain/core/messages\";\n", - "\n", - "const OtherGraphState = Annotation.Root({\n", - " messages: Annotation({\n", - " reducer: (x, y) => x.concat(y),\n", - " }),\n", - "});\n", - "\n", - "const respond = async (state: typeof OtherGraphState.State): Promise> => {\n", - " const { messages } = state;\n", - " const model = new ChatOpenAI({ model: \"gpt-4o\", temperature: 0 });\n", - " const responseMessage = await model.invoke(messages);\n", - " return {\n", - " messages: [responseMessage],\n", - " }\n", - "};\n", - "\n", - "const summarize = async (state: typeof OtherGraphState.State): Promise> => {\n", - " const { messages } = state;\n", - " // Assign the final model call a run name\n", - " const model = new ChatOpenAI({\n", - " model: \"gpt-4o\",\n", - " temperature: 0\n", - " }).withConfig({ runName: \"Summarizer\" });\n", - " const userMessage = new HumanMessage(\"Now, summarize the above messages\")\n", - " const responseMessage = await model.invoke([\n", - " ...messages,\n", - " userMessage,\n", - " ]);\n", - " return { \n", - " messages: [userMessage, responseMessage]\n", - " };\n", - "}\n", - "\n", - "const otherWorkflow = new StateGraph(OtherGraphState)\n", - " .addNode(\"respond\", respond)\n", - " .addNode(\"summarize\", summarize)\n", - " .addEdge(\"__start__\", \"respond\")\n", - " .addEdge(\"respond\", \"summarize\")\n", - " .addEdge(\"summarize\", \"__end__\");\n", - "\n", - "const 
otherGraph = otherWorkflow.compile();" - ] - }, - { - "cell_type": "code", - "execution_count": 12, - "id": "2149f527", - "metadata": {}, - "outputs": [ + "cell_type": "markdown", + "id": "ab95dc97", + "metadata": {}, + "source": [ + "## Define the state\n", + "\n", + "The state is the interface for all of the nodes in our graph.\n" + ] + }, { - "data": { - "image/png": "/9j/4AAQSkZJRgABAQAAAQABAAD/4gHYSUNDX1BST0ZJTEUAAQEAAAHIAAAAAAQwAABtbnRyUkdCIFhZWiAH4AABAAEAAAAAAABhY3NwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAQAA9tYAAQAAAADTLQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAlkZXNjAAAA8AAAACRyWFlaAAABFAAAABRnWFlaAAABKAAAABRiWFlaAAABPAAAABR3dHB0AAABUAAAABRyVFJDAAABZAAAAChnVFJDAAABZAAAAChiVFJDAAABZAAAAChjcHJ0AAABjAAAADxtbHVjAAAAAAAAAAEAAAAMZW5VUwAAAAgAAAAcAHMAUgBHAEJYWVogAAAAAAAAb6IAADj1AAADkFhZWiAAAAAAAABimQAAt4UAABjaWFlaIAAAAAAAACSgAAAPhAAAts9YWVogAAAAAAAA9tYAAQAAAADTLXBhcmEAAAAAAAQAAAACZmYAAPKnAAANWQAAE9AAAApbAAAAAAAAAABtbHVjAAAAAAAAAAEAAAAMZW5VUwAAACAAAAAcAEcAbwBvAGcAbABlACAASQBuAGMALgAgADIAMAAxADb/2wBDAAMCAgMCAgMDAwMEAwMEBQgFBQQEBQoHBwYIDAoMDAsKCwsNDhIQDQ4RDgsLEBYQERMUFRUVDA8XGBYUGBIUFRT/2wBDAQMEBAUEBQkFBQkUDQsNFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBT/wAARCAEuAHUDASIAAhEBAxEB/8QAHQABAAIDAAMBAAAAAAAAAAAAAAYHBAUIAQMJAv/EAFIQAAEDAwEDBQoJCQUFCQAAAAECAwQABQYRBxIhExYxVZQIFBUXIkFRk9HhMjdUVmFxdpKzIzU2QlJ0kaGyJHWBtNQzRXKWsUNGYoOVosHS8P/EABsBAQACAwEBAAAAAAAAAAAAAAACAwEEBQYH/8QANhEAAgECAgcECQUAAwAAAAAAAAECAxEEURITFSExUpEFFEFhMnGBobHB0eHwIjNCYnI0Y8L/2gAMAwEAAhEDEQA/APqnSlKA1jmTWdpxSF3WChaSUqSqSgEHzgjWvHOqy9cQO0o9tVNjVlt8m3OuvQIzripkveWtlJJ/tDnSSK2vN619Ww/UI9laFftChQqzpOLei2uK8HY68cBpRUtLiWJzqsvXEDtKPbTnVZeuIHaUe2q75vWvq2H6hHspzetfVsP1CPZVG1cPyS6ols7+3uLE51WXriB2lHtpzqsvXEDtKPbVd83rX1bD9Qj2U5vWvq2H6hHsptXD8kuqGzv7e4sTnVZeuIHaUe2nOqy9cQO0o9tV3zetfVsP1CPZTm9a+rYfqEeym1cPyS6obO/t7ixOdVl64gdpR7ac6rL1xA7Sj21XfN619Ww/UI9lOb1r6th+oR7KbVw/JLqhs7+3uLNg3WFc9/vOZHl7mm9yDqV7uvRroeHQay6rXZ1Cjwc6yBEZhqOg22CopaQEgnlZXHQVZVdZSjJKUeDSfVXOZVp6qbhkKU
pQpFKUoCnsT/NK/wB8l/5hytzWmxP80r/fJf8AmHK3NeO7Q/5lb/Uviz11L0I+pCoddNruJWfMWsVlXUi+uKZQYzUZ50NqdOjSXHEIKGyrzBahrUxrn/aGLrYdr6Lhg9nydnJJ0uAxdCIBcsl0ijdC1uvHUNLabKgFApVqkJ0UDWtSgptpipJxSaJhgO3q0Ztk2XWZUWZAcsU56Ml1yFJDTrTTTaluKcU0EIVvLUAgq3iEhQ1Cga3WFbZ8O2hXCRAsV476msMd8qjvRXo61M66cogOoSVo1IG8nUcRx4iq5tMzJsKyDa5Z7bj9zcvt4mybxYbj3kpdudUYDaW0re+AhQcZKSlRGpI8x1qLbPLXdZG1nCb0u251KJs0+HdbnkzL4QmWtDTm4ltXBlGrSxqhKWySgAqNbDpQd2t27PyKVUmml5/Mn+Ud1NiELZxesqx12RkiIMLvtoNQJTbDpJCQkvFkpSQVDeB4p47wFWdimUwcysrN0twlCM4SkCXDeiuajgfybyErA+nTQ+aqMs+z69zu4eRijFpfi39zHFtC2vtFl7luKigpVoQonXp04mrowPLU5lj7c9Nqu1mKSGlRbzCXEeCglJPkLAJTx03hwJB0qupCCi9DwbXEnTlJtaXikSOlKVqmwezBP09v/wDdkH8WVViVXeCfp7f/AO7IP4sqrEr3dH9qn/mPwR5fF/vyFKUq01BSlKAp7E/zSv8AfJf+YcrQT9huzu6TpE2Zg+PypclxTzz71taUtxajqpSiU6kkkkk+mrJGya2NrdLNyu8dDji3eTal6JSVKKlaDTgNSa8+KqD1xe+2+6uXiOznVr1KsKttJt8H4u53I42loqMlwKtHc/bMh/3Axv8A9LZ/+tTS02iDYbbHt9tiMwIEZAbZjRmwhttI6AlI4AVvvFVB64vfbfdTxVQeuL3233VQ+ypS41k/YySx1FcImtpWy8VUHri99t91VFsOizc8zTavbbte7ouLjmRKtsANSNwpZDaVaKOnlHUnjUNj/wDaujJbQpZMsuozlGzLEc3mtS8hxm03uU03yTb0+G28tCNSd0FQJA1JOn01OPFVB64vfbfdTxVQeuL3233VldkuLuqq6Mw8fSe5plXnYDs0KAg4DjhQCSE+DGdATpqfg/QP4VIMVwPG8GbkN47YrdY25BSp5NvjIZDhGuhVuga6an+NTDxVQeuL3233U8VUHri99t91SfZc2rOt8SKxtFb1EwcE/T2//wB2QfxZVWJUexnCIWLTJkuPImSpEpttpxyY9yh3UFZSBw4cXFfxqQ12lFQjGCd7JLojk15qpUc14ilKVkoFKUoBSlKAUpSgFc79y38Ze3/7Yq/BRXRFc79y38Ze3/7Yq/BRQHRFKUoBSlKAUpSgFKUoBSlKAUpSgFKUoBXO/ct/GXt/+2KvwUV0RXO/ct/GXt/+2KvwUUB0RSlKAUpSgFKUoBSlKAUpSgFKVCrrtLb5ZxixW9d6cbVuqkKc5CKlXnHKEEr+tCVAcQSCNKnGEp8Pz2k4QlN2irk1pVbKzTLVcRDsrf8A4S68v+eg/wCleOeeXfJrJ956rNUuZdTY7pWyK97u3YV47Nh09cCOHcjx7eudv0TqtwJT+WZHn8tA1AHSpCK+YHcw7Fn9vW2axYsEOeDCvvq6Oo4FqI2QXDr5irggH9paa+wPPPLvk1k+89VR7FtjB2F5PmV8x+Jae+ckl8uptwuBENrUqDDQA4I3lKP1BI/VBpqlzLqO6VsjqalVrzzy75NZPvPU555d8msn3nqapcy6julbIsqlV01nmSxiFSLPbprfDeEWWttz6dApBB+oqH11Ksby+BkyXEMcrGmMgF6DKTuPNa9BI1IIOh0UklJ0OhOhqLpSSut68ncqnRqU98kbulKVUUilKUApSlAQTPru5OuLePR3FNtFkSJ7ja91XJlRDbQI4jfKV6n0I0/WrVNtoZbS22kIQkBKUpGgAHQAK9Dqy7muXKX8NE1lpP
0IERhQH8VqP+JrSbTLvLx/Zvld0t7ve8+DaZcmO7uhW44hlakq0IIOhAOhBFWV9zUFwSXvSZ6HDQVOknnvJJSudL1nmeYhs1w+e5fJOQ5Jmj0GKy1Gt8RKLeVsLedUwhRbDiylIADrm7vDUADyT4Xkm2qHi2TtMWy7yFsiI7BuVzg28XHcLukpKGI7qmXVJbAWjUJ1JIIPDXWsXa1ZM6MpXPCc+vt/Ts2tmPbQpEwXq7z4NwujlojsymwzFdd5FxhbejbqFIAPkjjoSCOBxrltXzXHZ95wRV2YuGQIyO2WWHkj0NCS2xMYU9yjjKdEKcbS24kaBKVEoJHTqsNasvzidIUrmrPdpec7M2syxxeSIvN1gxbRdLbepMBlK0tyLgmM6y82gJQr4J0KQk6LPEEAjZZXtUybYtdM2iXe7HMGYGMIv8JyTFajrbeL6mC0eSSkFsq3FakbwGvE9NLDXRXFfm/6F8SLvBiXCJAfmx2Z0wLVGiuOpS6+EAFZQknVW6FDXTo1GvTX6mx3lKbkw3e9rjHJVHfHmPnSr0oVoApPnHoIBFBN4/ldk28bK15VlYyWTIg3dXJogNRm4znJMb6WygAqQdQBv6nyddeOg6GqUZODUkST1iakidYxfW8msUS5NtlnlkkOMk6lpxJKXGyfOUrSpJ+qtpUG2UrV3pkLX/ZNXZwN+jRTTS1f+9a/51Oa2KsVGbS4fXeeaqR0JuOQpSlVFYpSlAVxmMBVmy0XDQiFdW0NLXr5LchGoTr/AMaCAPpbA6SK02TWCPlWN3WyS1utxblEdhvLZIC0ocQUKKSQQDoo6ag/VVr3G3RbvBehzGESYzyd1bTg1BH/AO46+Y1X1xxO/wBhUrvJsZDBHwEcolqWgeglZCHP+IqQejUE6mrnHXWadms91/kdbDYmKjq6hCMi2TWPKMFtmLTVSxFtYjqgzWHuSlxnWAA0824kDdcGnSBpxPDQ6Vix9k62can2pea5bIkS3m3zdnLggS2igghLZS2EJSdOKdzRWp111qXmbckcF41eUq/ZEdKv5pUR/OvHhCf83L12T31Hu9XL4G/p0Xvuik8i7np6LeMGj2G4XpLDN7n3a85Ama0J4deiLbDxKk7pKlBtG6lBG70p01NS+P3PuMt4pcbM/Jus2VPnouj99ky9biZjZTyT6XQAEqRupCQlISACNNCdZ74Qn/Ny9dk99aWwbQoeUTrxDtNuuk6VZ5Pec9pqLqY72mu4rj06EGnd6uRFOivFEWPc8WGTZb7DuF2vd2n3t2I5NvM6S2uYtMZ1LrLaSGwhKApPwUoGu8rz6ESHItlFhyvIrldrqh6WbjZFWGTDWsBhcYuFwnQDeC9VHiFcOGg141IvCE/5uXrsnvp4Qn/Ny9dk99O71ciWnRzRAcV2DW/GMksV6cyfJr5IsjL8aC1d5rbzbTbqUpUnQNpJ0CE6EnXhxJ4VY06a3b4jkh3XdQPgpGqlEnQJSPOokgAeckCvW0b3MUERMZuJUdPLlFphsfSSpe9/BJP0VKcawd6PKauN7eZlzm+LMZgEx4x/aSVDVa/NvkDQdCU6nUqOjvqPd69/29pTPEUqUf0u7M/A7E/YMbYZlgCe+tcqUAreCXXFFRQD5wnUIB9CRUhpSsTk5ycn4nBbcndilKVAwKUpQClKUApSlAK537lv4y9v/wBsVfgoroiud+5b+Mvb/wDbFX4KKA6IpSlAKUpQClKUApSlAKUpQClKUApSlAK537lv4y9v/wBsVfgoroiud+5b+Mvb/wDbFX4KKA6IpSlAKUpQClKUApSlAKUpQClet+Q1FbLjzqGkDpU4oJH8TWv502Uf73gdpR7akoylwQNpStXzqsvXEDtKPbTnVZeuIHaUe2paufKzNmVX3U3dA3PubsJg5TGxDnZbVyhFmEXExFRSofk1H8i5vJJBSTw0JSOO9w4e2K939MxTPMyXbtmy77cM4v6Z0eC3eeTU04sJbSyD3urfJOnHRPT0V9
Edo9txLafgl8xS8XO3uW67RVxnf7Q2SgkeStOp+ElQSoegpFcA9wh3OAxzbvkd9zJyLHZwyQuHCU66lLUuYdQHWyT5aEt+UDp0uNkdFNXPlYsz6bUrV86rL1xA7Sj2051WXriB2lHtpq58rFmbSlavnVZeuIHaUe2vZHyC1ynAhi5Q3lnoS2+hRP8AgDTVzXgxZmwpSlVmBSlKAVEMuy5+JLFptIQbgUhb8lwbzcRB6OH6ziv1U9AAKlcN1K5XIfRFjuvOHRttJWo/QBqaqHGluS7U3cX9DLuR79fUNeKlgEDj5kp3Uj6EirY2jF1H4cPWbuFoqrP9XBHheNQZb3L3Fs3iWRoZNx0eWeOvAEbqR9CQB9Fe7m/ax/u2H6hHsrVZttFx7Z3FiP3+4GGJjvIRmmmHJDzywkqIQ22lS1aAEnQcB01+7NtBsGQXWHbYE/vibLtqLww3yLid+IpW4lzUpAGquG6TvfRVbrVJcZM7q0I/pRsub9r6th+oT7Kc37X1bD9Qn2VEJW3fB4kK1yl3lam7oqW3BQ1CkOOyVxnQ0+lDaWytSkrOmgGpAJGoBI9Nl7oXZ/kMy3RrfkCX13B8RY7hiPoaL510YU4psIQ6dP8AZrIV0cOIqOsqczGnDhdE15v2vq2H6hPspzftfVsP1CfZUJu3dDbPrFPlxJ+QpjLhyzBlOqiP8hGfB3dx10N7jZJ6CpQB6RqK2Nn2yYhfYF7mRbsUtWVoP3BMmI9HdYaKSoOFtxCVlBCVEKAIOh0JprKnMxpQ4XRJeb9r6th+oT7Kc37X1bD9Qn2VFsf214Zk9qvdwt94K49kYMq4IeiPsPR2glSt8tOISspISoghJ10OmtRzJu6Gx04iu6Y7d2lKXJiR2J1wtM9UFRec0A5RtrjqlLiQoHdSvdCiNdC1lTmZhzgle5ZnN+19Ww/UJ9lfh3GbO+jdctUFxP7K4yCP+lQa/d0ZgeMXZ+23O43KNLZld5KHgKetBe3twIStLBSvVXAbpIPm1rY37bjg+MX52z3O/NxprC0NyFcg6tiMpWm6l55KC20TqDotQOhB89NZUX8n1GlDNEvtom4opLlleX3sj4Vqec3o7g9CCdS0r0bp3fSk+aybFe42Q2xqbF3wheoU24NFtLHBSFDzKB4H+Wo0NQHpr34TL8G5tKgp0Szc4hmbvH/bMqQ2pXo1Uhxsf+WKvjJ1k1L0lvvnnf2bzn4uhHR1kVvLHpSlUnGMa5RBcLdKik6B9pTevo1BH/zVS4q4peN20LSpDrbCWXEKGhStA3Vg/UpJFXHVdZVYXccuMm6xGFPWqWsuzG2hquM6QAXQnztq08rTilXlaEKUUXRWnB01x4r6fmVjoYOqqc2peJUW3qBBkQbHMdhZR4WhSXHbbd8ThGVJt7pbKSVIAO82sEpKSkpPQdOkQrGrrmFizTE8zzPGrpImXDEfBk1NmgKkKZmJkh1KXG29eT30HXU+SlQIJGldCxpLMxhD8d1D7Lg3kONqCkqHpBHA17K1Xdbmdhwu9JM5i2O4vf4112QyZ+P3O296rypyYiVFUkxC/LCmg4dNE744pOuihxGor2O4hfPFPcIostw79O0gz0MCIvlO9/DAc5YJ013OT8rf6N3jrpXTNKwRVFJWv+WS+RzNkeIXyRsg2vQm7LcHJc7NDLix0xFlyQz3zDVyjadNVo0Qo7w1GiT6DW82qwcyt+0nNrziFvlqnqwqGxEltRt9KnkzpBcS2VDcW8lpRUEE66lPDQ1ftKGdUs/zf9Tk1jHJ8m/55LtlmziVAueAyrexNyZmQ5IlS0lai2Euaqb1Do3UbqAo7+4D57C2iYzcpHcuWazQrVKduDLFiQbexHUp5HJyopWOTA1G6EqJ4cAk69FXhSlzCpJJq/E55y3PXLntrC75ieYSMZxU62xEHHJclmdPUCFyipKN0paSSlvp1KlLB+DUYg4Axa7zmON5hjmf3bwzfJcll6wzZvgydEkubwLobdS02p
IUUrSsDgnhva11bSlw6V3ds9caOiJGaYaBDbSAhIJJIAGg4npr34tHVM2hsOJCuTgW14uHTgFPONhHH6mXf4VgyrilmS1DYQZdyeGrEJojlHOOmunmSPOo8B5zU6w7GTjsB5UhSHblMWH5breu7v7oSEI147qQAB0a8ToCo1tU06adR+Ksvbufstf2mtjKqjDQXFm/pSlVHCFKUoCL3PZvYbnJckiM7BkuHVbtvkORys66kqCCAo6+cgmsDxUQOt71233VN6Veq9RfyLFVnHcpMhHiogdb3rtvup4qIHW967b7qm9Kzr6mfwJa6pzMhHiogdb3rtvuqoth0OZnmabV7bdr3dFxccyJVtgBqRuFLIbSrRR08o6k8a6UrnfuW/jL2/8A2xV+CimvqZ/Aa6pzMtHxUQOt71233U8VEDre9dt91TelNfUz+A11TmZCPFRA63vXbfdX6Tsoteo5W43l5I/VNwWjX/FG6f51NaVjX1MxrqnMzWWLGbXjTC2rZCbihZBcUnUrcPmKlnVSjx6STWzpSqpScneTuypu+9ilKVEwKUpQClKUApSlAK537lv4y9v/ANsVfgoroiud+5b+Mvb/APbFX4KKA6IpSlAKUpQClKUApSlAKUpQClKUApSlAK537lv4y9v/ANsVfgor193bsK8dmw6euBHDuR49vXO36J1W4Ep/LMjz+WgagDpUhFfMDuYdiz+3rbNYsWCHPBhX31dHUcC1EbILh18xVwQD+0tNAfculKUApSlAKUpQClKUApSlARHOcquVgm2iHbGIjr04u7ypZUEpCEg8N3061ped+X/J7J956sjaP+lGK/VL/oRWPWvisTLD6EYJb1fevN/Q832jjq+GrKFN7rZDnfl/yeyfeepzvy/5PZPvPUpWltCrkuhy9rYrNdEOd+X/ACeyfeeqpNi2xVWwvJ8yvmPxLT3zkkvl1NucoEQ2tSoMNADgjeUo/UEj9UGrbpTaFXJdBtbFZrohzvy/5PZPvPU535f8nsn3nqxLhd4NpVFTOmx4apb6Y0cSHUtl50glLaNT5SiEqISOPA+isum0KuS6DauLzXRDnfl/yeyfeerHuOeZbbrfKlqi2VaWGlOlIU9qQkE6fyrIrWZR+jV2/dHv6DVtLHVJVIxaVm14E4dq4qUkm10RaVtlGdboslSQlTzSHCkdA1AOn86ya1+P/mG2/uzX9ArYV0pK0mke1FKUqIFKUoCv9o/6UYr9Uv8AoRWPWRtH/SjFfql/0IqOZTNyCDFZVj1ogXeQpejjVwuK4aUJ06QpLLup104aD665vaG+VP8Az/6keM7XV8Sl5L5m7qA7dNoEzZlszud9tsZMu5JcYixG1gFPKvPIaSogqSCAV66FSQdNNRrrX55wbTfmTjf/ADQ9/oa/Muy3zaXabnjedYlZ4uOzo5Q6qDfHZTilbySnRJjNbuhG8FBWoKRoPRzUrNN8DlQhoyUp2aXHevqVcvMtrOM2LL5c9m9uW2NjVwnN3S/QbYw7DmtNFbPJpivOJcQfK1StPApTxUCa2sDaBlmE37DZWQZAcjtmSWKbcpEMQGWBDdjx23/yBQN4pKVKTo4pR10OtTiBsWYYx3ILNcMtym/xbzbnLW4q6z0OqYZWhSCWwGwnf0UfLUFE6DUmts9sutEi5YjNdckuKxmK/DitLUgtvNuspaXyo3fKO6gdG6NSeB6Ksc45Gw6tJ7ml48F5bvVvKFnu5lk9v2LZlkWTNy4t+ya3zmrDGgtIYhJdjvONBDoHKKKUHRW8SCTw0049VVTsLua7RYjZzAv2RyINhnC52ixSrigwo7qUrCGhq0pYb8sjiVFI6PODIxkG0zUa4TjYH0ZQ9/oaxNqXo/QjWcattC1lfJeJP61mUfo1dv3R7+g1EucG035k43/zS9/oKleSlRxe6lQCVd5u6gHUA7h89Zoq1WHrXxKIRcZxvnmizcf/ADDbf3Zr+gVsK1+P/mG2/uzX9ArYV6OfpM+kil
KVAClKUBAdpcab4Xx6ZGt0q4NRzIS6Ije+pG8lISSNfoNaPwnO+bl77J76tqlYqQpVdHWRu0rcfNv5nOxGAo4menUvf1lS+E53zcvfZPfTwnO+bl77J76tqlVd2w3I+prbIwvn1+xUvhOd83L32T308Jzvm5e+ye+rapTu2G5H1GyML59fsVL4TnfNy99k99PCc75uXvsnvq2qU7thuR9RsjC+fX7FS+E53zcvfZPfWFe5Fyn2WfGaxy9F16O42gGLoNSkgef6auelSjQw8JKShw8zMeycNFpq/X7GFZWVx7PAacSUOIYbSpJ6QQkAis2lKtbu7nZFKUrAP//Z" - }, - "metadata": {}, - "output_type": "display_data" - } - ], - "source": [ - "const otherRunnableGraph = otherGraph.getGraph();\n", - "const otherImage = await otherRunnableGraph.drawMermaidPng();\n", - "const otherArrayBuffer = await otherImage.arrayBuffer();\n", - "\n", - "await tslab.display.png(new Uint8Array(otherArrayBuffer));" - ] - }, - { - "cell_type": "markdown", - "id": "5ff9d991", - "metadata": {}, - "source": [ - "Now when we call `streamEvents`, we can see that we can now filter on run name to only see the final summary generation of the current chat history:" - ] - }, - { - "cell_type": "code", - "execution_count": 13, - "id": "51381303", - "metadata": {}, - "outputs": [ + "cell_type": "code", + "execution_count": 2, + "id": "1648124b", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "import { Annotation } from \"@langchain/langgraph\";\n", + "import { BaseMessage } from \"@langchain/core/messages\";\n", + "\n", + "const GraphState = Annotation.Root({\n", + " messages: Annotation({\n", + " reducer: (x, y) => x.concat(y),\n", + " }),\n", + "});" + ] + }, + { + "cell_type": "markdown", + "id": "da50fbd8", + "metadata": {}, + "source": [ + "## Set up the tools\n", + "\n", + "First define the tools you want to use. For this simple example, we'll create a placeholder search engine, but see the documentation [here](https://js.langchain.com/v0.2/docs/how_to/custom_tools) on how to create your own custom tools." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 3, + "id": "a8f1ae1c", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "import { tool } from \"@langchain/core/tools\";\n", + "import { z } from \"zod\";\n", + "\n", + "const searchTool = tool((_) => {\n", + " // This is a placeholder for the actual implementation\n", + " return \"Cold, with a low of 3℃\";\n", + "}, {\n", + " name: \"search\",\n", + " description:\n", + " \"Use to surf the web, fetch current information, check the weather, and retrieve other information.\",\n", + " schema: z.object({\n", + " query: z.string().describe(\"The query to use in your search.\"),\n", + " }),\n", + "});\n", + "\n", + "await searchTool.invoke({ query: \"What's the weather like?\" });\n", + "\n", + "const tools = [searchTool];" + ] + }, + { + "cell_type": "markdown", + "id": "19b27cb3", + "metadata": {}, + "source": [ + "We can now wrap these tools in a prebuilt\n", + "[ToolNode](https://langchain-ai.github.io/langgraphjs/reference/classes/langgraph_prebuilt.ToolNode.html).\n", + "This object will actually run the tools (functions) whenever they are invoked by\n", + "our LLM." + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "id": "f02278b1", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "import { ToolNode } from \"@langchain/langgraph/prebuilt\";\n", + "\n", + "const toolNode = new ToolNode(tools);" + ] + }, + { + "cell_type": "markdown", + "id": "dd55ee5a", + "metadata": {}, + "source": [ + "## Set up the model\n", + "\n", + "Now load the [chat model](https://js.langchain.com/v0.2/docs/concepts/#chat-models).\n", + "\n", + "1. It should work with messages. We will represent all agent state in the form\n", + " of messages, so it needs to be able to work well with them.\n", + "2. 
It should work with\n", + " [tool calling](https://js.langchain.com/v0.2/docs/how_to/tool_calling/#passing-tools-to-llms),\n", + " meaning it can return function arguments in its response.\n", + "\n", + "
\n",
+    "<div class=\"admonition note\">\n",
+    "    <p class=\"admonition-title\">Note</p>\n",
+    "    <p>\n",
+    "    These model requirements are not general requirements for using LangGraph - they are just requirements for this one example.\n",
+    "    </p>\n",
+    "</div>
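"Tool calling" here means the model can return structured function arguments instead of plain text. As a rough, dependency-free sketch (types simplified from the real ones in `@langchain/core/messages`), the distinction that downstream routing relies on looks like:

```typescript
// Simplified shapes for illustration; the real AIMessage type has more fields.
type ToolCall = { name: string; args: Record<string, unknown>; id?: string };
type AIMessageLike = { content: string; tool_calls?: ToolCall[] };

// A response requesting a tool invocation: empty content, populated tool_calls.
const toolCallResponse: AIMessageLike = {
  content: "",
  tool_calls: [{ name: "search", args: { query: "current weather" }, id: "call_123" }],
};

// A final answer: text content, no tool calls.
const finalResponse: AIMessageLike = { content: "Cold, with a low of 3℃." };

// Graph edges can branch on this distinction, as the routeMessage function
// later in this guide does.
const hasToolCalls = (msg: AIMessageLike) => (msg.tool_calls?.length ?? 0) > 0;
console.log(hasToolCalls(toolCallResponse), hasToolCalls(finalResponse)); // true false
```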
" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "id": "9c7210e7", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "import { ChatOpenAI } from \"@langchain/openai\";\n", + "\n", + "const model = new ChatOpenAI({ model: \"gpt-4o\", temperature: 0 });" + ] + }, + { + "cell_type": "markdown", + "id": "73e59248", + "metadata": {}, + "source": [ + "After you've done this, we should make sure the model knows that it has these\n", + "tools available to call. We can do this by calling\n", + "[bindTools](https://v01.api.js.langchain.com/classes/langchain_core_language_models_chat_models.BaseChatModel.html#bindTools)." + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "id": "b4ff23ee", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "const boundModel = model.bindTools(tools);" + ] + }, + { + "cell_type": "markdown", + "id": "dbe67356", + "metadata": {}, + "source": [ + "## Define the graph\n", + "\n", + "We can now put it all together." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 7, + "id": "0ba603bb", + "metadata": {}, + "outputs": [], + "source": [ + "import { StateGraph, END } from \"@langchain/langgraph\";\n", + "import { AIMessage } from \"@langchain/core/messages\";\n", + "\n", + "const routeMessage = (state: typeof GraphState.State) => {\n", + " const { messages } = state;\n", + " const lastMessage = messages[messages.length - 1] as AIMessage;\n", + " // If no tools are called, we can finish (respond to the user)\n", + " if (!lastMessage?.tool_calls?.length) {\n", + " return END;\n", + " }\n", + " // Otherwise, we continue and call the tools\n", + " return \"tools\";\n", + "};\n", + "\n", + "const callModel = async (\n", + " state: typeof GraphState.State,\n", + "): Promise<Partial<typeof GraphState.State>> => {\n", + " // For versions of @langchain/core < 0.2.3, you must call `.stream()`\n", + " // and aggregate the message from chunks instead of calling `.invoke()`.\n", + " const { messages } = state;\n", + " const responseMessage = await boundModel.invoke(messages);\n", + " return { messages: [responseMessage] };\n", + "};\n", + "\n", + "const workflow = new StateGraph(GraphState)\n", + " .addNode(\"agent\", callModel)\n", + " .addNode(\"tools\", toolNode)\n", + " .addEdge(\"__start__\", \"agent\")\n", + " .addConditionalEdges(\"agent\", routeMessage)\n", + " .addEdge(\"tools\", \"agent\");\n", + "\n", + "const agent = workflow.compile();" + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "\n", - "You\n", - " asked\n", - " about\n", - " the\n", - " capital\n", - " of\n", - " Nepal\n", - ",\n", - " and\n", - " I\n", - " responded\n", - " that\n", - " it\n", - " is\n", - " Kathmandu\n", - ".\n", - "\n" - ] + "cell_type": "code", + "execution_count": 8, + "id": "a88cf20a", + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": 
"/9j/4AAQSkZJRgABAQAAAQABAAD/4gHYSUNDX1BST0ZJTEUAAQEAAAHIAAAAAAQwAABtbnRyUkdCIFhZWiAH4AABAAEAAAAAAABhY3NwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAQAA9tYAAQAAAADTLQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAlkZXNjAAAA8AAAACRyWFlaAAABFAAAABRnWFlaAAABKAAAABRiWFlaAAABPAAAABR3dHB0AAABUAAAABRyVFJDAAABZAAAAChnVFJDAAABZAAAAChiVFJDAAABZAAAAChjcHJ0AAABjAAAADxtbHVjAAAAAAAAAAEAAAAMZW5VUwAAAAgAAAAcAHMAUgBHAEJYWVogAAAAAAAAb6IAADj1AAADkFhZWiAAAAAAAABimQAAt4UAABjaWFlaIAAAAAAAACSgAAAPhAAAts9YWVogAAAAAAAA9tYAAQAAAADTLXBhcmEAAAAAAAQAAAACZmYAAPKnAAANWQAAE9AAAApbAAAAAAAAAABtbHVjAAAAAAAAAAEAAAAMZW5VUwAAACAAAAAcAEcAbwBvAGcAbABlACAASQBuAGMALgAgADIAMAAxADb/2wBDAAMCAgMCAgMDAwMEAwMEBQgFBQQEBQoHBwYIDAoMDAsKCwsNDhIQDQ4RDgsLEBYQERMUFRUVDA8XGBYUGBIUFRT/2wBDAQMEBAUEBQkFBQkUDQsNFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBT/wAARCADaAMcDASIAAhEBAxEB/8QAHQABAAMBAAMBAQAAAAAAAAAAAAUGBwgCAwQJAf/EAE8QAAEDBAADAwYIBw0HBQAAAAECAwQABQYRBxIhEzFVCBYiQZTRFBUXMlFhk+EJN0JxdbO0IyQ0NkNSYnN2gaHB0hhUVpGSlbElM0Vyov/EABsBAQACAwEBAAAAAAAAAAAAAAACAwEEBQYH/8QANREAAgECAQgIBQUBAQAAAAAAAAECAxExBBITIUFRUpEFFBVhcaGxwSIyM2LRQnKB4fA0Y//aAAwDAQACEQMRAD8A/VOlKUApSlAK+SbdoNtKBMmx4pX1SH3Uo5vzbNfXWZ5/Cjzs/tSJMdqQkWyQQl1AUAe1a+mjlGEZTlgk2XUaelmoXxLx51WXxiB7Sj3086rL4xA9pR76zvzetfhsP7BHup5vWvw2H9gj3Vye1cn4Jc0dPs77vI0TzqsvjED2lHvp51WXxiB7Sj31nfm9a/DYf2CPdTzetfhsP7BHup2rk/BLmh2d93kaJ51WXxiB7Sj3086rL4xA9pR76zvzetfhsP7BHup5vWvw2H9gj3U7VyfglzQ7O+7yNE86rL4xA9pR76edVl8Yge0o99Z35vWvw2H9gj3U83rX4bD+wR7qdq5PwS5odnfd5GiedVl8Yge0o99eTWS2h91Dbd1hOOLISlCZCCVE9wA3Wc+b1r8Nh/YI91Rl/s1visW91mDGZdTdbfpbbKUqH78Z9YFX0OkKFetCiotZzSxW12IyyDNi5Z2BtdKUrfOQKUpQClKUApSlAKUpQClKUApSlAKznNfxg2v9FyP1rVaNWc5r+MG1/ouR+taqqt9Cp+1m5kn1onjSlK8IenILMs4snD6zi6X+cIENTqI6FBtbq3HVHSUIQgFS1HrpKQT0P0VQMr8pDHsemYQY7U242zJH5DZlsW+WtbCGW3CSGkslal9ogJKNBQHMrWgTUxxztlrueHxhdLbkE4MT2ZEaRjDCnp0B9IUUSEJTs+j1B9FXztFJBNZeZmcO2LhZmGT2O73WRYr5NMtuLbv/AFBcNxiQwxIcit9UrIU2VoSOm+4dQNulThKN5d+3u1GtUnJOy7vU1jJuOeEYbdmbder0q3yXG23SXYb/AGbSXOiC64G+Rrf9Mpr
6cl4w4liWRjH7lcnU3tUduWmBGhSJLqmVqUhKwlptWxtCt6+boE6BG8H41NZRnxzu3ybTm0iPPs7Qxe22pl2PDV2kbbhmKSUjtEulQU08e5ICUqJrQ+HlonO8ZxfH7VOjRXcGtcdMmXFW1yu9u+txklQGnACgqQeo6bFSdKEYKT3b/DuIqpNyzUTnDjjjbeIWX5Tj7cObElWe4uQ2lLhSQ282httSlqcU0lCFcy1AIKuYgBQ2FA1plY9wzfnYjxTz+xXCx3dKb3e1XaFdWoS1wFsqiMpIU+PRQoKZUnlVo7I1vdbDVFVRUvhwsi6m21rFRGTfwOB+lLf+2M1L1EZN/A4H6Ut/7YzW10d/20f3R9UKv05eDNfpSlewPIilKUApSlAKUpQClKUApSlAKUpQCs5zX8YNr/Rcj9a1WjVXMlwaDk8+NNfkzYsmO0plK4b/AGe0qIJB6HfVIrEoqpCUG7XTRfQqKlUU2ZzlfD3GM6VGOR4/bL6YvMGDcIqHuy5tc3LzA63yp3r6BUB/s/cMt78wMb/7Wz/prUvkqg+MXv237qfJVB8Yvftv3VxV0XNKyrep1nltB63EpWLcOMVwd997HcctdjdkJCHV2+IhkuJB2AopA2BVjqS+SqD4xe/bfup8lUHxi9+2/dUX0S5O7qrkySy+ktSTI2lZpxkizcJ4ncI7HbL3dEQMlu78O4B2RzKU2hnnTynXonfrrXfkqg+MXv237qx2P/6rkzPaFLcyvXyxW7JrVItl2gx7nbpAAdiy2g404AQRzJPQ9QD/AHVUEcAeGjZ2nAccSdEbFsZHQjRHzforUPkqg+MXv237qfJVB8Yvftv3VNdFSjqVZcmReXUXjEzi18E+H9juMa4W/CrDBnRlh1mTHtzSHG1juUlQTsEfTU9k38DgfpS3/tjNWn5KoPjF79t+6v6nhRbO3juO3G7SUsPtyEtPS+ZBW2sLTsa6jmSD/dWxk/R7pV6dadW+a08HsdyEstpOLjFWuXWlKV0ziClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQHO/lI/jx8nn+0Mv9mNdEVzv5SP48fJ5/tDL/AGY10RQClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQHO/lI/jx8nn+0Mv8AZjXRFc7+Uj+PHyef7Qy/2Y10RQClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQClKUApSoTJMvgYyG0P9rJmvAlmDFTzvOgd5A2AlPcOZRCRsbPUVKMXJ2iZScnZE3Xy3S2Rb1bZdvnx25cGWyuO+w6NodbUkpUlQ9YIJB/PVDczzJZJKmLPboTfXlEmWt1z6thKAAfqCj+evX555d/u1k/6nqt0W+S5m11Ws/wBJ+OnlF8G5fArjBfsRfClxWHu2t76v5eKv0mlb9Z16Kv6SVD1V+r3kU8F3+B/AOz2qehbV6ujirxcWXO9p51CAG9eopbQ2kj+clX01UOLnBg8Zs/wrLL5EtIn4w/2qW2u05ZiAoLQ07sbKErHMB/SWPyumueeeXf7tZP8AqepolxLmOqVtxpVKzdGa5Ykgrh2Zwb+al11HT8/Kf/FS1n4jsvyGot5hKskh1QQ26p0OxnFE6CQ7oaJOgAtKdkgDZ6VjRN/K0/B+2JCWT1YK7iXKlKVSa4pSlAKUpQClKUApSlAKUpQClKUApSlAKUpQEVlF+RjNhl3FbfbKaSEtsg6LrqlBLaAfUVLUlP8AfWew4zqFOyZbpk3CQQuQ+T3n1JT9CE7ISn1D6ySZ7istXwbHGv5J27oDn0aSy8tP/wC0IqKq2fwU4pbdfsl5M7OQwWa57RSudOPPFHJ8ZvGSSMQv1zlOY3ARMnWqJZYz0GOeQualSHVJX6aBvlaPMkddHYr08TuL2QfHOTohZfFwNiy41HvUCPIjMPLuzrqXVFO3QSUJLaG9NgK5l9/cK1bG660VdHSNK5me4ncQslvbGO2hrIY
r9msdtk3J61wLdIlOy5LJWQ8JTjSUpHLrTaNlXP1SAAZm0ZRxLyrLsRxq63M4RcZeOS7hc2o0OM+6HmZbbSFo5u0QgrSsKI2sAKI79KCxlVU8Ezf1uJb5eZQTzHlGzrZ+ivF9huSy4y82l1pxJQttYBSpJGiCD3g1yreLtknEqzcHJUzI37ZeGcunWp+XAisacdYRMaTICHELAVytH0fm/uqunROuqIbLkeIw06+uU6hCUrfcSlKnCBoqISAAT39AB16Cs4ayUJ599RO4BfHkSpNgmOrfcjtiREfeXzLcYKtFKiepLatDZ6lKkbJPMau1ZXblqa4gY2pHznBKaXrv7Mtcx/u5kI/wrVK2qmtRnvXu17XODlUFCq0hSlKpNQUpSgFKUoBSlKAUpSgFKUoBSlKAUpSgK7ntjfv2NvtRE80+OtEuMknl5nG1BQRv1BYBQT9CzVNhTG7hFbkNE8ix3KGlJI6FJHqIIII9RBFapWP53fbHbOKNpxa1TixmV7juTfiv4K4uM80gHbzziEkMElJSHDvZ0ClXokWq045jdrYfg38lyhUm4ywZRcu4BWPMLtf5b92vkCLkDSGrtbbfMDUaaUt9mlaxyFYPIEpPKpIUEgKB67zzirwnyQ5JZn8etuR3ddstLEKJdI92tiOR1vm0txqSwSgn0SpbOubp6I5RXQjhvcMlEvGLilQ36cUtPtq+sFK+b/mkH6q9fxhP/wCHL17J99Y6vV2LzR03KjJapGbp4KyciYsl9vWQ3SxZ2m1swbtdMZkIYTNKRtSVpW2pJAUVEKCUkb6aGgLZb+G9vt+V2jIRMuEi4WyzrsjZkvh0OMqW2srcURzKc20n0ubrs7BJ3Xvu2dQ7DcLZAucSXbp1zcLMCLLDbTstY1tLSVLBWRsdE7PUVK/GE/8A4cvXsn306vV3E1Oitq5lEk8BLC/iUWxNXG7QzDvD18h3KM+hEuNKcdccUUK5OXl/dnE8qkn0T12etaBaoKrZbIkNUp+cqOyhoyZSgp14pAHOsgAFR1skAdT3V4ImXJ0hKMbvKlE60phKP8VLAqTtuH3u/KHxmn4it5+eyy8Fy3B/NK07S2PUSkqV1Oik6VTQSXz2S8fbEi61GmrpnswW3qu2TSLwRuFBaXCjq3tLjqlDtlD/AOnIlG/pLg6aO9Eqk8OOJWLZrJv9jx3to7+MSvi2bAehORTHUNhHKFJAKFBJKSn1a7t1dqTkpNJYLD/eZwatR1ZubFKUqsqFKUoBSlKAUpSgFKUoBSlKAUpSgFfwkDvOvz1HT8hgQLgzbFTIxvEllx+LblPoQ/ISjXMUJJ2QNjZ7hsbrNIWEz+OuKY3cOJmPycVuFtupujFit95WpBCFEx/hJb5QpSfRXoHopAOwCpFAfdcciu3FRzPcMtEfI8GdtyG4jOXLiIShx5Q5l/BkrO1gJ5RzgD550UkJJvWM4+nGrDbLaZsu6uwYrcX4wuKw5KfCQBzOLAHMo62TrqetStKAUpUbklkTkuO3W0LlyoCLhFdiGXCWEPshaCnnbUQQFp3sEggEDoaA/ILyy/KMmcT/ACiF3ewXBTVrxR8RLJIjr/LaXzKkJPdtTg2FfzUo+iv1L8n/AIvQ+OXCWwZfE5W3pjPJMjp/kJKPRdR9OuYEjfekpPrriHi3+D/4e4FxI4V4/b7zkz0PKro/CmuSZUdTjaEM84LRSwADvv5goa9VdreT/wCT/j3k4YbMxrGplznQJU9dxW5dXW3HQ4pttsgFttA5dNJ9W9k9e7QGmUpSgKvxFwCJxIw+6Y/Jn3CzonpRzT7PIMaU0pCgpCkuD1gpHfsEdKhY96ynFM0xXEW8bnZBi7lt7OTmD89tTrMltJ/99s+krnCUnnH5S+6tCpQEXjmUWfMLYLjY7pDu8ArU18JhPJdb50nSk7SSNg9CKlKzDKOFdxx3DrhE4Pu2TAb3LuKbk8tdtS5GlL6BaFpTrk5wlIKkgkAHQBOxLw+LFrVxTc4dyY9
xayBu2puSJSoDiIcpvYDhac6j0CUbBOgVgAkg6AvFKUoBSlKAUpSgFKUoBSlKAVnWY57OvreX4rw6uFqd4i2VqMXI14Q6iPFD/pIcUQn0/wBz5lDl2NgA67q0Ws0uU5GN8d7NGhYIqQvJbe/8YZfGbJ+D/BgC2w8Qg6Srm9EqWOvQA+oCdtPDazoyO35fdrXbZudtW1u3v3xmNyKIAJX2YJVyJKlL9ZOiEkkCrdSlAKUpQClKw7jX5Q0jGsgZ4fcPLajLuKE9vmbgJV+9rW2dfviYsfMSNghOwVbHdzJ2BX/KPuURzyhvJ6tiJLS7im9S5KoiVguhr4OR2hT3hOwRvu6H6DXSNY7wL8nmPwxkzcoyO5Ly/iVeBzXTJJY2ob/kI6f5JlOgAABvQ3oBKU7FQClKUApSlAK9E2G3cIj8Z3nDbzamlFpam1hKho8qkkFJ+sEEeqvfSgMgj4vkPAPBccsOAWeZncBq59lKbvV51KjRHFHRaWtPKUtcydJ6aQg95JUNStV7t19aedts+LcGmXVMOLivJdShxPRSFFJOlD1g9RX21lPk6ysJl41kqsFhzYUBORz0TkTiSpc4LHbqTtSvQJ1ru/MKA1alKUApSlAKUpQClKUAr8+/KJ/CQXPF81h47jGK3exSrHdWlXtu9LjIclIbWsPQwlAeSlCwGyH0Ob79JI0T+gD8hqK2XHnUNIHepxQSP+Zri7y8vJnsfGSyO5ticy3jOLaz++IzUhG7pHSPmaB6upA9E96h6J36OpKMpYIFz8hzym808pa35fOyq1We3RbU7FZhOWlh1sOrWHS6F9o6vfKEta1r5x7/AFdRVyl+Dyx6Dw48nC3qucli23O9TZFzfjS3EtuoBIab2lWiAUNJWPqXv110z51WXxiB7Sj31LRz4WZsyUpXzQ7nDuG/gstiTrqexcC//BrmfJuIuVeVFkU/C+F8yRj2BQnVRb/nqElLj6h0XFt++9XqLvq3sdOXng01qZgluJfHTIc/zGXww4MdjLyBj0L5lrqe0gWBB2CAe52R0OkDYBHXelcui8FOBePcD7A/EtfbXG8T1/CLrfp6u0mXF87JcdWeutk6TvQ2e8kkzfDLhfjfCDEYmN4rbW7bbI/UhPVx5Z+c44vvWs66k/UBoAAWusAUpSgFKUoBSvkm3aDbSgTJseKV9Uh91KOb82zXzedVl8Yge0o99TUJNXSM2ZKUqL86rL4xA9pR76edVl8Yge0o99Z0c+FizMY8rPyn5vkv2XH7s3hispt9zkOxXnxcfgiYriUpU2k/uTnMVjtCO7XZnv3XO/Cf8Jffs3yy24rC4UQpd2vFx7GMIV3VHQhK1DRcBYXspGypewNAnQ1XV/HnD8Y438KMhw+Zd7ahc6OTEkLko/e8lPpNOdDvQUBvXekqHrrjn8G5wOZxbJ8izzLixbbhbXXLPbI0x1CFJc7pDwBPqGmwobB5nB6qaOfCxZn6NUqL86rL4xA9pR76edVl8Yge0o99NHPhYsyUpUX51WXxiB7Sj30TlFmUoAXeCSegAko6/wCNNHPhYsyUpSlVmBVQy7Ln4ksWm0hBuBSFvyXBzNxEHu6flOK/JT3AAqV05Urtch9EWO684dNtpK1H6gNmshxpbku1N3F/Rl3I/DX1DfVSwCB19SU8qR9SRVsbRi6j2YeJu5LRVWfxYI/i8agy3u3uLZvEsjRk3HTyz130BHKkfUkAfVXu837WP/jYf2CPdVO4wcXYnCOJj78qHImC63Vi3nsGHnS0hSvTc02hZUoDuR0Kj3b0RX0ZFxsw3FI1sdul0djKuUb4ZHjCBJXJ7HptxbKWy42kb6laU6OwdEGq3WqSxkzuJwjq1KxafN+1+Gw/sE+6nm/a/DYf2CfdVdv/ABgw/G7PaLnMvbS4l4Tz24wmnJTktPLzFTbbSVLUACCSBobG9VDSuLzFyyjhszjkiFdLBlLs5DkwBRUAxHW4OTqOVXOjlUFAkaI0DUdJPiZlyii
7PYrZ3lBZtsZDqSFJdabDbiSO4hSdEf3GpbGr+7hfZw5jhkWNbhAkKSO1irWvZU4ofPbKlElZ9JJJUoqBKkU2wcXMTyjJ5WP2q6mbc4y3W3EojPBrmbOnEpeKOzUUnoQlRIq3PMokMradQlxtaSlSFDYUD0INWRrSwm7r/YbiqpShWjY1ClVPhjcHJmKNx33C6/b3nYKlkklSW1ENkk9SS3yEk+vff31bKTjmScdx5yUXFuLFKUqBEUpSgMzz+FHnZ/akSY7UhItkghLqAoA9q19NfH5vWvw2H9gj3VJZr+MG1/ouR+tarxrn5fUnGcUm1qXqzxfS0pLKWk9iI/zetfhsP7BHup5vWvw2H9gj3VIVGZLk1rw+ySrvepzVutsYAuyHjoDZAAHrJJIAA2SSAASa5ulqP9T5nHU5t2TZ5+b1r8Nh/YI91PN61+Gw/sEe6qjD474LNsV2vCb8lmFaezM/4VFeYdjJcUEoUtpxCXAlRPRXLroevQ1IYrxYxXNJc+Larr2kmCymS+1JjuxlBlW+V1IdSnnbOj6adp+us59ZbX5ljVZJtp6vEnvN61+Gw/sEe6nm9a/DYf2CPdWVRvKOsmUcTsLxvFJ0e6wruuaJj64j6PQZYUtCmHFBKFpK06Kk8419HfWy0lOrHGT5mJqrTtn3VyP83rX4bD+wR7qhc0sluYxa4uNQIrbiW9pWhlIIOx3HVWqoLOf4pXP+q/zFbOSVajyims5/MtveWZPOWmhr2r1NlpSldg+inzXKILhbpUUnQfaU3v6Ngj/OslxVxS8btoWlSHW2EsuIUNFK0DlWD+ZSSK2Os6yqwu45cZN1iMKetUtZdmNtDa4zpABdCfW2rXpa6pV6WiFKKLorPg6axxX4/wBusdDI6qpzaltMm8oK23GTjmOXK322Xd/iTI7fdZMSA2XZC2GnP3QtoHVagFb5R1OjVWVkcvFeK9yzl7E8mulnyGxxY0X4HaXHZcR1h17mYdY1ztBfaJUCoBOwdkVusaSzMYQ/HdQ+y4OZDjagpKh9II6GvZWq9WpnYcLvOTOWuHWJZDwZl4JkV9xy53GKmx3C3yIVmjGa9anX53wttPZo2op5D2RUgHRQN6FeeM4pkmP3vC8vmY1c24UjMLxc3bZHY7STb485lbbKnW0np6Wlr1vl5zvuNdRUrFyCopWs8P6/BgOAfGti4xfFuK2fJrbh0uRPfvUG+wC3CivbKkPwnj1IdcJJbSpSdKJ0kjVb9SvlhxnswkLt9scIjBXJMuKN8jKd6UhtQ6F0jYAHzPnK/JSuyEHUfdte4k3GjFuT1Fk4URyMdlzNEJn3CRIRsaJQFdmk/mIbBH1EVdK9EKGxbobESM0liMw2lpppA0lCEjQA+oACvfVtSWfNyR5ucs+TlvFKUqsgKUpQGc5r+MG1/ouR+tarxryzX8YNr/Rcj9a1Vcyvh7jGdKinI8ftt9MXmDBuEVD3Zc2ubl5gdb5U719ArmdIW0kb7l7niulbda17kWGsj8pfErrlWGWR61xJ1y+Jr7EusuBa5CmJcmO3zhxLK0qSQ4OcLTpQJKBo71U5/s+8Mt/xAxv/ALWz/pqdxXhviuDPvvY7jlrsbshIQ6u3xEMlxIOwFFIGwK5yai7o5kJRpyU4t3Xd/Zz3mWE23JeFmd3PHMZzpd9ehxbeheTKnPyZLQkodLbLT61r0ggknlA6nW+tWjjXgN/zTiDkEazxZCPjDh7cLa1N5FJYMhUlooZU5rlClDm6E70VHu3W/wBKlpWixZTJNNbL46934OdLJe5uYcRODoj4RkePR7C1OanfGFqcYjxCYRbSgOa5VJ5hpKh6J6ddnVdF181xt0W8W+TBnR2pkKS2pl+O+gLQ6hQ0pKknoQQSCDVJHk/8MwQRgGOAjuItjP8ApqLkpY6iE5wqWvqt/O1vf3l/qCzn+KVz/qv8xUBG4DcN4chp9jBMdZfaUFtuItjIUlQOwQeXoQan85/ilc/6r/MVsZJ
brNO3EvUzQUdNDNe1evibLSlK7h9GFKUoCr3PhvYbnJckiM7BkuHa3bfIcjlZ3slQQQFHfrIJr4PkogeL3r237qu9KvVeov1FiqzjqUmUj5KIHi969t+6nyUQPF717b91XelZ09Tf6EtNU4mU5nhVYwoGUu4XJIIPZy5zimzr6UAhJ/MQRVriRGIEZuPGZbjx2khKGmkBKEAdwAHQCvdSq5VJz1SZXKUpfM7ilKVWRFKUoBSlKArmS4NByefGmvyZsWTHaUylcN/s9pUQSD0O+qRUZ8lUHxi9+2/dV2pVmklZL2RXKnCTvKKf8FJ+SqD4xe/bfup8lUHxi9+2/dV2pTSPu5IjoaXAuSKT8lUHxi9+2/dT5KoPjF79t+6rtSmkfdyQ0NLgXJFJ+SqD4xe/bfup8lUHxi9+2/dV2pTSPu5IaGlwLkik/JVB8Yvftv3V65HCG1y2lNSLneH2VfObXM2lQ+g9KvVKyqsk7r0RlUaSd1FckKUpVRaf/9k=" + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "import * as tslab from \"tslab\";\n", + "\n", + "const runnableGraph = agent.getGraph();\n", + "const image = await runnableGraph.drawMermaidPng();\n", + "const arrayBuffer = await image.arrayBuffer();\n", + "\n", + "await tslab.display.png(new Uint8Array(arrayBuffer));" + ] + }, + { + "cell_type": "markdown", + "id": "055aacad", + "metadata": {}, + "source": [ + "## How to stream tool calls\n", + "\n", + "You can now run your agent. Let's first look at an example of streaming back intermediate tool calls. 
This is not supported by all providers, but some support token-level streaming of tool invocations.\n", + "\n", + "To get the partially populated tool calls, you can access the message chunks' `tool_call_chunks` property:" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "id": "c704d23c", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[\n", + " {\n", + " name: 'search',\n", + " args: '',\n", + " id: 'call_ziGo5u8fYyqQ78SdLZTEC9Vg',\n", + " index: 0,\n", + " type: 'tool_call_chunk'\n", + " }\n", + "]\n", + "[\n", + " {\n", + " name: undefined,\n", + " args: '{\"',\n", + " id: undefined,\n", + " index: 0,\n", + " type: 'tool_call_chunk'\n", + " }\n", + "]\n", + "[\n", + " {\n", + " name: undefined,\n", + " args: 'query',\n", + " id: undefined,\n", + " index: 0,\n", + " type: 'tool_call_chunk'\n", + " }\n", + "]\n", + "[\n", + " {\n", + " name: undefined,\n", + " args: '\":\"',\n", + " id: undefined,\n", + " index: 0,\n", + " type: 'tool_call_chunk'\n", + " }\n", + "]\n", + "[\n", + " {\n", + " name: undefined,\n", + " args: 'current',\n", + " id: undefined,\n", + " index: 0,\n", + " type: 'tool_call_chunk'\n", + " }\n", + "]\n", + "[\n", + " {\n", + " name: undefined,\n", + " args: ' weather',\n", + " id: undefined,\n", + " index: 0,\n", + " type: 'tool_call_chunk'\n", + " }\n", + "]\n", + "[\n", + " {\n", + " name: undefined,\n", + " args: '\"}',\n", + " id: undefined,\n", + " index: 0,\n", + " type: 'tool_call_chunk'\n", + " }\n", + "]\n" + ] + } + ], + "source": [ + "import type { AIMessageChunk } from \"@langchain/core/messages\";\n", + "\n", + "const eventStream = await agent.streamEvents(\n", + " { messages: [[\"user\", \"What's the weather like today?\"]] },\n", + " {\n", + " version: \"v2\",\n", + " },\n", + ");\n", + "\n", + "for await (const { event, data } of eventStream) {\n", + " if (event === \"on_chat_model_stream\") {\n", + " const msg = data.chunk as AIMessageChunk;\n", + " if 
(msg.tool_call_chunks !== undefined && msg.tool_call_chunks.length > 0) {\n", + " console.log(msg.tool_call_chunks);\n", + " }\n", + " }\n", + "}" + ] + }, + { + "cell_type": "markdown", + "id": "1d9b168b", + "metadata": {}, + "source": [ + "Because this is a ReAct-style agent, this only logs intermediate steps and not the final response: once the model no longer needs to gather more information from its tools, it generates a final response with no tool calls.\n", + "\n", + "## Streaming final responses\n", + "\n", + "### ReAct agents\n", + "\n", + "For ReAct-style agents, as soon as you start receiving message chunks with no `tool_call_chunks`, you know the model is responding directly to the user. So we can flip the conditional to log only tokens from the final response:" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "id": "86f843bb", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "\n", + "The\n", + " weather\n", + " today\n", + " is\n", + " cold\n", + ",\n", + " with\n", + " a\n", + " low\n", + " of\n", + " \n", + "3\n", + "℃\n", + ".\n", + "\n" + ] + } + ], + "source": [ + "const eventStreamFinalRes = await agent.streamEvents(\n", + " { messages: [[\"user\", \"What's the weather like today?\"]] },\n", + " { version: \"v2\" });\n", + "\n", + "for await (const { event, data } of eventStreamFinalRes) {\n", + " if (event === \"on_chat_model_stream\") {\n", + " const msg = data.chunk as AIMessageChunk;\n", + " if (!msg.tool_call_chunks?.length) {\n", + " console.log(msg.content);\n", + " }\n", + " }\n", + "}" + ] + }, + { + "cell_type": "markdown", + "id": "f13b4790", + "metadata": {}, + "source": [ + "### Other graphs\n", + "\n", + "If your graph makes multiple model calls across multiple nodes and one of them is always called last, you can distinguish that model by assigning it a run name or a tag. 
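As an aside, the partial `tool_call_chunks` logged earlier can be stitched back together by grouping on `index` and concatenating the `args` string fragments. LangChain does this for you when you concatenate `AIMessageChunk`s, but a standalone sketch (hypothetical helper name, simplified chunk shape) makes the mechanics clear:

```typescript
// A minimal sketch of merging partial tool call chunks into complete calls.
// `ToolCallChunk` mirrors the shape logged above; the merge logic here is
// illustrative, not LangChain's internal implementation.
type ToolCallChunk = {
  name?: string;
  args: string;
  id?: string;
  index: number;
};

function mergeToolCallChunks(chunks: ToolCallChunk[]) {
  const merged: Record<number, { name?: string; id?: string; args: string }> = {};
  for (const chunk of chunks) {
    const existing = merged[chunk.index] ?? { args: "" };
    merged[chunk.index] = {
      // `name` and `id` only appear on the first chunk of each call.
      name: existing.name ?? chunk.name,
      id: existing.id ?? chunk.id,
      // Argument fragments are pieces of a JSON string; concatenate in order.
      args: existing.args + chunk.args,
    };
  }
  return Object.values(merged).map((call) => ({
    name: call.name,
    id: call.id,
    args: JSON.parse(call.args),
  }));
}

const streamedChunks: ToolCallChunk[] = [
  { name: "search", args: "", id: "call_abc123", index: 0 },
  { args: '{"', index: 0 },
  { args: "query", index: 0 },
  { args: '":"', index: 0 },
  { args: "current", index: 0 },
  { args: " weather", index: 0 },
  { args: '"}', index: 0 },
];

console.log(mergeToolCallChunks(streamedChunks));
// [ { name: 'search', id: 'call_abc123', args: { query: 'current weather' } } ]
```

Note that `JSON.parse` only succeeds once the full argument string has arrived; while the stream is still in flight, the concatenated `args` is a JSON prefix rather than a complete document.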
To illustrate this, declare a new graph like this:" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "id": "0fea2f20", + "metadata": {}, + "outputs": [], + "source": [ + "import { BaseMessage, HumanMessage } from \"@langchain/core/messages\";\n", + "\n", + "const OtherGraphState = Annotation.Root({\n", + " messages: Annotation<BaseMessage[]>({\n", + " reducer: (x, y) => x.concat(y),\n", + " }),\n", + "});\n", + "\n", + "const respond = async (state: typeof OtherGraphState.State): Promise<Partial<typeof OtherGraphState.State>> => {\n", + " const { messages } = state;\n", + " const model = new ChatOpenAI({ model: \"gpt-4o\", temperature: 0 });\n", + " const responseMessage = await model.invoke(messages);\n", + " return {\n", + " messages: [responseMessage],\n", + " };\n", + "};\n", + "\n", + "const summarize = async (state: typeof OtherGraphState.State): Promise<Partial<typeof OtherGraphState.State>> => {\n", + " const { messages } = state;\n", + " // Assign the final model call a run name\n", + " const model = new ChatOpenAI({\n", + " model: \"gpt-4o\",\n", + " temperature: 0\n", + " }).withConfig({ runName: \"Summarizer\" });\n", + " const userMessage = new HumanMessage(\"Now, summarize the above messages\");\n", + " const responseMessage = await model.invoke([\n", + " ...messages,\n", + " userMessage,\n", + " ]);\n", + " return {\n", + " messages: [userMessage, responseMessage]\n", + " };\n", + "};\n", + "\n", + "const otherWorkflow = new StateGraph(OtherGraphState)\n", + " .addNode(\"respond\", respond)\n", + " .addNode(\"summarize\", summarize)\n", + " .addEdge(\"__start__\", \"respond\")\n", + " .addEdge(\"respond\", \"summarize\")\n", + " .addEdge(\"summarize\", \"__end__\");\n", + "\n", + "const otherGraph = otherWorkflow.compile();" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "id": "2149f527", + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": 
"/9j/4AAQSkZJRgABAQAAAQABAAD/4gHYSUNDX1BST0ZJTEUAAQEAAAHIAAAAAAQwAABtbnRyUkdCIFhZWiAH4AABAAEAAAAAAABhY3NwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAQAA9tYAAQAAAADTLQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAlkZXNjAAAA8AAAACRyWFlaAAABFAAAABRnWFlaAAABKAAAABRiWFlaAAABPAAAABR3dHB0AAABUAAAABRyVFJDAAABZAAAAChnVFJDAAABZAAAAChiVFJDAAABZAAAAChjcHJ0AAABjAAAADxtbHVjAAAAAAAAAAEAAAAMZW5VUwAAAAgAAAAcAHMAUgBHAEJYWVogAAAAAAAAb6IAADj1AAADkFhZWiAAAAAAAABimQAAt4UAABjaWFlaIAAAAAAAACSgAAAPhAAAts9YWVogAAAAAAAA9tYAAQAAAADTLXBhcmEAAAAAAAQAAAACZmYAAPKnAAANWQAAE9AAAApbAAAAAAAAAABtbHVjAAAAAAAAAAEAAAAMZW5VUwAAACAAAAAcAEcAbwBvAGcAbABlACAASQBuAGMALgAgADIAMAAxADb/2wBDAAMCAgMCAgMDAwMEAwMEBQgFBQQEBQoHBwYIDAoMDAsKCwsNDhIQDQ4RDgsLEBYQERMUFRUVDA8XGBYUGBIUFRT/2wBDAQMEBAUEBQkFBQkUDQsNFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBT/wAARCAEuAHUDASIAAhEBAxEB/8QAHQABAAIDAAMBAAAAAAAAAAAAAAYHBAUIAQMJAv/EAFIQAAEDAwEDBQoJCQUFCQAAAAECAwQABQYRBxIhExYxVZQIFBUXIkFRk9HhMjdUVmFxdpKzIzU2QlJ0kaGyJHWBtNQzRXKWsUNGYoOVosHS8P/EABsBAQACAwEBAAAAAAAAAAAAAAACAwEEBQYH/8QANhEAAgECAgcECQUAAwAAAAAAAAECAxEEURITFSExUpEFFEFhMnGBobHB0eHwIjNCYnI0Y8L/2gAMAwEAAhEDEQA/APqnSlKA1jmTWdpxSF3WChaSUqSqSgEHzgjWvHOqy9cQO0o9tVNjVlt8m3OuvQIzripkveWtlJJ/tDnSSK2vN619Ww/UI9laFftChQqzpOLei2uK8HY68cBpRUtLiWJzqsvXEDtKPbTnVZeuIHaUe2q75vWvq2H6hHspzetfVsP1CPZVG1cPyS6ols7+3uLE51WXriB2lHtpzqsvXEDtKPbVd83rX1bD9Qj2U5vWvq2H6hHsptXD8kuqGzv7e4sTnVZeuIHaUe2nOqy9cQO0o9tV3zetfVsP1CPZTm9a+rYfqEeym1cPyS6obO/t7ixOdVl64gdpR7ac6rL1xA7Sj21XfN619Ww/UI9lOb1r6th+oR7KbVw/JLqhs7+3uLNg3WFc9/vOZHl7mm9yDqV7uvRroeHQay6rXZ1Cjwc6yBEZhqOg22CopaQEgnlZXHQVZVdZSjJKUeDSfVXOZVp6qbhkKUpQpFKUoCnsT/NK/wB8l/5hytzWmxP80r/fJf8AmHK3NeO7Q/5lb/Uviz11L0I+pCoddNruJWfMWsVlXUi+uKZQYzUZ50NqdOjSXHEIKGyrzBahrUxrn/aGLrYdr6Lhg9nydnJJ0uAxdCIBcsl0ijdC1uvHUNLabKgFApVqkJ0UDWtSgptpipJxSaJhgO3q0Ztk2XWZUWZAcsU56Ml1yFJDTrTTTaluKcU0EIVvLUAgq3iEhQ1Cga3WFbZ8O2hXCRAsV476msMd8qjvRXo61M66cogOoSVo1IG8nUcRx4iq5tMzJsKyDa5Z7bj9zcvt4mybxYbj3kpdudUYDaW0re+AhQcZKSlRGpI8x1qLbPLXdZG1nCb0u25
1KJs0+HdbnkzL4QmWtDTm4ltXBlGrSxqhKWySgAqNbDpQd2t27PyKVUmml5/Mn+Ud1NiELZxesqx12RkiIMLvtoNQJTbDpJCQkvFkpSQVDeB4p47wFWdimUwcysrN0twlCM4SkCXDeiuajgfybyErA+nTQ+aqMs+z69zu4eRijFpfi39zHFtC2vtFl7luKigpVoQonXp04mrowPLU5lj7c9Nqu1mKSGlRbzCXEeCglJPkLAJTx03hwJB0qupCCi9DwbXEnTlJtaXikSOlKVqmwezBP09v/wDdkH8WVViVXeCfp7f/AO7IP4sqrEr3dH9qn/mPwR5fF/vyFKUq01BSlKAp7E/zSv8AfJf+YcrQT9huzu6TpE2Zg+PypclxTzz71taUtxajqpSiU6kkkkk+mrJGya2NrdLNyu8dDji3eTal6JSVKKlaDTgNSa8+KqD1xe+2+6uXiOznVr1KsKttJt8H4u53I42loqMlwKtHc/bMh/3Axv8A9LZ/+tTS02iDYbbHt9tiMwIEZAbZjRmwhttI6AlI4AVvvFVB64vfbfdTxVQeuL3233VQ+ypS41k/YySx1FcImtpWy8VUHri99t91VFsOizc8zTavbbte7ouLjmRKtsANSNwpZDaVaKOnlHUnjUNj/wDaujJbQpZMsuozlGzLEc3mtS8hxm03uU03yTb0+G28tCNSd0FQJA1JOn01OPFVB64vfbfdTxVQeuL3233VldkuLuqq6Mw8fSe5plXnYDs0KAg4DjhQCSE+DGdATpqfg/QP4VIMVwPG8GbkN47YrdY25BSp5NvjIZDhGuhVuga6an+NTDxVQeuL3233U8VUHri99t91SfZc2rOt8SKxtFb1EwcE/T2//wB2QfxZVWJUexnCIWLTJkuPImSpEpttpxyY9yh3UFZSBw4cXFfxqQ12lFQjGCd7JLojk15qpUc14ilKVkoFKUoBSlKAUpSgFc79y38Ze3/7Yq/BRXRFc79y38Ze3/7Yq/BRQHRFKUoBSlKAUpSgFKUoBSlKAUpSgFKUoBXO/ct/GXt/+2KvwUV0RXO/ct/GXt/+2KvwUUB0RSlKAUpSgFKUoBSlKAUpSgFKVCrrtLb5ZxixW9d6cbVuqkKc5CKlXnHKEEr+tCVAcQSCNKnGEp8Pz2k4QlN2irk1pVbKzTLVcRDsrf8A4S68v+eg/wCleOeeXfJrJ956rNUuZdTY7pWyK97u3YV47Nh09cCOHcjx7eudv0TqtwJT+WZHn8tA1AHSpCK+YHcw7Fn9vW2axYsEOeDCvvq6Oo4FqI2QXDr5irggH9paa+wPPPLvk1k+89VR7FtjB2F5PmV8x+Jae+ckl8uptwuBENrUqDDQA4I3lKP1BI/VBpqlzLqO6VsjqalVrzzy75NZPvPU555d8msn3nqapcy6julbIsqlV01nmSxiFSLPbprfDeEWWttz6dApBB+oqH11Ksby+BkyXEMcrGmMgF6DKTuPNa9BI1IIOh0UklJ0OhOhqLpSSut68ncqnRqU98kbulKVUUilKUApSlAQTPru5OuLePR3FNtFkSJ7ja91XJlRDbQI4jfKV6n0I0/WrVNtoZbS22kIQkBKUpGgAHQAK9Dqy7muXKX8NE1lpP0IERhQH8VqP+JrSbTLvLx/Zvld0t7ve8+DaZcmO7uhW44hlakq0IIOhAOhBFWV9zUFwSXvSZ6HDQVOknnvJJSudL1nmeYhs1w+e5fJOQ5Jmj0GKy1Gt8RKLeVsLedUwhRbDiylIADrm7vDUADyT4Xkm2qHi2TtMWy7yFsiI7BuVzg28XHcLukpKGI7qmXVJbAWjUJ1JIIPDXWsXa1ZM6MpXPCc+vt/Ts2tmPbQpEwXq7z4NwujlojsymwzFdd5FxhbejbqFIAPkjjoSCOBxrltXzXHZ95wRV2YuGQIyO2WWHkj0NCS2xMYU9yjjKdEKcbS24kaBKVEoJHTqsNasvzidIUrmrPdpec7M2syxxeSIvN1gxbRdLb
epMBlK0tyLgmM6y82gJQr4J0KQk6LPEEAjZZXtUybYtdM2iXe7HMGYGMIv8JyTFajrbeL6mC0eSSkFsq3FakbwGvE9NLDXRXFfm/6F8SLvBiXCJAfmx2Z0wLVGiuOpS6+EAFZQknVW6FDXTo1GvTX6mx3lKbkw3e9rjHJVHfHmPnSr0oVoApPnHoIBFBN4/ldk28bK15VlYyWTIg3dXJogNRm4znJMb6WygAqQdQBv6nyddeOg6GqUZODUkST1iakidYxfW8msUS5NtlnlkkOMk6lpxJKXGyfOUrSpJ+qtpUG2UrV3pkLX/ZNXZwN+jRTTS1f+9a/51Oa2KsVGbS4fXeeaqR0JuOQpSlVFYpSlAVxmMBVmy0XDQiFdW0NLXr5LchGoTr/AMaCAPpbA6SK02TWCPlWN3WyS1utxblEdhvLZIC0ocQUKKSQQDoo6ag/VVr3G3RbvBehzGESYzyd1bTg1BH/AO46+Y1X1xxO/wBhUrvJsZDBHwEcolqWgeglZCHP+IqQejUE6mrnHXWadms91/kdbDYmKjq6hCMi2TWPKMFtmLTVSxFtYjqgzWHuSlxnWAA0824kDdcGnSBpxPDQ6Vix9k62can2pea5bIkS3m3zdnLggS2igghLZS2EJSdOKdzRWp111qXmbckcF41eUq/ZEdKv5pUR/OvHhCf83L12T31Hu9XL4G/p0Xvuik8i7np6LeMGj2G4XpLDN7n3a85Ama0J4deiLbDxKk7pKlBtG6lBG70p01NS+P3PuMt4pcbM/Jus2VPnouj99ky9biZjZTyT6XQAEqRupCQlISACNNCdZ74Qn/Ny9dk99aWwbQoeUTrxDtNuuk6VZ5Pec9pqLqY72mu4rj06EGnd6uRFOivFEWPc8WGTZb7DuF2vd2n3t2I5NvM6S2uYtMZ1LrLaSGwhKApPwUoGu8rz6ESHItlFhyvIrldrqh6WbjZFWGTDWsBhcYuFwnQDeC9VHiFcOGg141IvCE/5uXrsnvp4Qn/Ny9dk99O71ciWnRzRAcV2DW/GMksV6cyfJr5IsjL8aC1d5rbzbTbqUpUnQNpJ0CE6EnXhxJ4VY06a3b4jkh3XdQPgpGqlEnQJSPOokgAeckCvW0b3MUERMZuJUdPLlFphsfSSpe9/BJP0VKcawd6PKauN7eZlzm+LMZgEx4x/aSVDVa/NvkDQdCU6nUqOjvqPd69/29pTPEUqUf0u7M/A7E/YMbYZlgCe+tcqUAreCXXFFRQD5wnUIB9CRUhpSsTk5ycn4nBbcndilKVAwKUpQClKUApSlAK537lv4y9v/wBsVfgoroiud+5b+Mvb/wDbFX4KKA6IpSlAKUpQClKUApSlAKUpQClKUApSlAK537lv4y9v/wBsVfgoroiud+5b+Mvb/wDbFX4KKA6IpSlAKUpQClKUApSlAKUpQClet+Q1FbLjzqGkDpU4oJH8TWv502Uf73gdpR7akoylwQNpStXzqsvXEDtKPbTnVZeuIHaUe2paufKzNmVX3U3dA3PubsJg5TGxDnZbVyhFmEXExFRSofk1H8i5vJJBSTw0JSOO9w4e2K939MxTPMyXbtmy77cM4v6Z0eC3eeTU04sJbSyD3urfJOnHRPT0V9Edo9txLafgl8xS8XO3uW67RVxnf7Q2SgkeStOp+ElQSoegpFcA9wh3OAxzbvkd9zJyLHZwyQuHCU66lLUuYdQHWyT5aEt+UDp0uNkdFNXPlYsz6bUrV86rL1xA7Sj2051WXriB2lHtpq58rFmbSlavnVZeuIHaUe2vZHyC1ynAhi5Q3lnoS2+hRP8AgDTVzXgxZmwpSlVmBSlKAVEMuy5+JLFptIQbgUhb8lwbzcRB6OH6ziv1U9AAKlcN1K5XIfRFjuvOHRttJWo/QBqaqHGluS7U3cX9DLuR79fUNeKlgEDj5kp3Uj6EirY2jF1H4cPWbuFoqrP9XBHheNQZb3L3Fs3iWRoZNx0eWeOvAEbqR9CQB9Fe7m/
ax/u2H6hHsrVZttFx7Z3FiP3+4GGJjvIRmmmHJDzywkqIQ22lS1aAEnQcB01+7NtBsGQXWHbYE/vibLtqLww3yLid+IpW4lzUpAGquG6TvfRVbrVJcZM7q0I/pRsub9r6th+oT7Kc37X1bD9Qn2VEJW3fB4kK1yl3lam7oqW3BQ1CkOOyVxnQ0+lDaWytSkrOmgGpAJGoBI9Nl7oXZ/kMy3RrfkCX13B8RY7hiPoaL510YU4psIQ6dP8AZrIV0cOIqOsqczGnDhdE15v2vq2H6hPspzftfVsP1CfZUJu3dDbPrFPlxJ+QpjLhyzBlOqiP8hGfB3dx10N7jZJ6CpQB6RqK2Nn2yYhfYF7mRbsUtWVoP3BMmI9HdYaKSoOFtxCVlBCVEKAIOh0JprKnMxpQ4XRJeb9r6th+oT7Kc37X1bD9Qn2VFsf214Zk9qvdwt94K49kYMq4IeiPsPR2glSt8tOISspISoghJ10OmtRzJu6Gx04iu6Y7d2lKXJiR2J1wtM9UFRec0A5RtrjqlLiQoHdSvdCiNdC1lTmZhzgle5ZnN+19Ww/UJ9lfh3GbO+jdctUFxP7K4yCP+lQa/d0ZgeMXZ+23O43KNLZld5KHgKetBe3twIStLBSvVXAbpIPm1rY37bjg+MX52z3O/NxprC0NyFcg6tiMpWm6l55KC20TqDotQOhB89NZUX8n1GlDNEvtom4opLlleX3sj4Vqec3o7g9CCdS0r0bp3fSk+aybFe42Q2xqbF3wheoU24NFtLHBSFDzKB4H+Wo0NQHpr34TL8G5tKgp0Szc4hmbvH/bMqQ2pXo1Uhxsf+WKvjJ1k1L0lvvnnf2bzn4uhHR1kVvLHpSlUnGMa5RBcLdKik6B9pTevo1BH/zVS4q4peN20LSpDrbCWXEKGhStA3Vg/UpJFXHVdZVYXccuMm6xGFPWqWsuzG2hquM6QAXQnztq08rTilXlaEKUUXRWnB01x4r6fmVjoYOqqc2peJUW3qBBkQbHMdhZR4WhSXHbbd8ThGVJt7pbKSVIAO82sEpKSkpPQdOkQrGrrmFizTE8zzPGrpImXDEfBk1NmgKkKZmJkh1KXG29eT30HXU+SlQIJGldCxpLMxhD8d1D7Lg3kONqCkqHpBHA17K1Xdbmdhwu9JM5i2O4vf4112QyZ+P3O296rypyYiVFUkxC/LCmg4dNE744pOuihxGor2O4hfPFPcIostw79O0gz0MCIvlO9/DAc5YJ013OT8rf6N3jrpXTNKwRVFJWv+WS+RzNkeIXyRsg2vQm7LcHJc7NDLix0xFlyQz3zDVyjadNVo0Qo7w1GiT6DW82qwcyt+0nNrziFvlqnqwqGxEltRt9KnkzpBcS2VDcW8lpRUEE66lPDQ1ftKGdUs/zf9Tk1jHJ8m/55LtlmziVAueAyrexNyZmQ5IlS0lai2Euaqb1Do3UbqAo7+4D57C2iYzcpHcuWazQrVKduDLFiQbexHUp5HJyopWOTA1G6EqJ4cAk69FXhSlzCpJJq/E55y3PXLntrC75ieYSMZxU62xEHHJclmdPUCFyipKN0paSSlvp1KlLB+DUYg4Axa7zmON5hjmf3bwzfJcll6wzZvgydEkubwLobdS02pIUUrSsDgnhva11bSlw6V3ds9caOiJGaYaBDbSAhIJJIAGg4npr34tHVM2hsOJCuTgW14uHTgFPONhHH6mXf4VgyrilmS1DYQZdyeGrEJojlHOOmunmSPOo8B5zU6w7GTjsB5UhSHblMWH5breu7v7oSEI147qQAB0a8ToCo1tU06adR+Ksvbufstf2mtjKqjDQXFm/pSlVHCFKUoCL3PZvYbnJckiM7BkuHVbtvkORys66kqCCAo6+cgmsDxUQOt71233VN6Veq9RfyLFVnHcpMhHiogdb3rtvup4qIHW967b7qm9Kzr6mfwJa6pzMhHiogdb3rtvuqoth0OZnmabV7bdr3dFxccyJVtgBqRuFLIbSrRR08o6
k8a6UrnfuW/jL2/8A2xV+CimvqZ/Aa6pzMtHxUQOt71233U8VEDre9dt91TelNfUz+A11TmZCPFRA63vXbfdX6Tsoteo5W43l5I/VNwWjX/FG6f51NaVjX1MxrqnMzWWLGbXjTC2rZCbihZBcUnUrcPmKlnVSjx6STWzpSqpScneTuypu+9ilKVEwKUpQClKUApSlAK537lv4y9v/ANsVfgoroiud+5b+Mvb/APbFX4KKA6IpSlAKUpQClKUApSlAKUpQClKUApSlAK537lv4y9v/ANsVfgor193bsK8dmw6euBHDuR49vXO36J1W4Ep/LMjz+WgagDpUhFfMDuYdiz+3rbNYsWCHPBhX31dHUcC1EbILh18xVwQD+0tNAfculKUApSlAKUpQClKUApSlARHOcquVgm2iHbGIjr04u7ypZUEpCEg8N3061ped+X/J7J956sjaP+lGK/VL/oRWPWvisTLD6EYJb1fevN/Q832jjq+GrKFN7rZDnfl/yeyfeepzvy/5PZPvPUpWltCrkuhy9rYrNdEOd+X/ACeyfeeqpNi2xVWwvJ8yvmPxLT3zkkvl1NucoEQ2tSoMNADgjeUo/UEj9UGrbpTaFXJdBtbFZrohzvy/5PZPvPU535f8nsn3nqxLhd4NpVFTOmx4apb6Y0cSHUtl50glLaNT5SiEqISOPA+isum0KuS6DauLzXRDnfl/yeyfeerHuOeZbbrfKlqi2VaWGlOlIU9qQkE6fyrIrWZR+jV2/dHv6DVtLHVJVIxaVm14E4dq4qUkm10RaVtlGdboslSQlTzSHCkdA1AOn86ya1+P/mG2/uzX9ArYV0pK0mke1FKUqIFKUoCv9o/6UYr9Uv8AoRWPWRtH/SjFfql/0IqOZTNyCDFZVj1ogXeQpejjVwuK4aUJ06QpLLup104aD665vaG+VP8Az/6keM7XV8Sl5L5m7qA7dNoEzZlszud9tsZMu5JcYixG1gFPKvPIaSogqSCAV66FSQdNNRrrX55wbTfmTjf/ADQ9/oa/Muy3zaXabnjedYlZ4uOzo5Q6qDfHZTilbySnRJjNbuhG8FBWoKRoPRzUrNN8DlQhoyUp2aXHevqVcvMtrOM2LL5c9m9uW2NjVwnN3S/QbYw7DmtNFbPJpivOJcQfK1StPApTxUCa2sDaBlmE37DZWQZAcjtmSWKbcpEMQGWBDdjx23/yBQN4pKVKTo4pR10OtTiBsWYYx3ILNcMtym/xbzbnLW4q6z0OqYZWhSCWwGwnf0UfLUFE6DUmts9sutEi5YjNdckuKxmK/DitLUgtvNuspaXyo3fKO6gdG6NSeB6Ksc45Gw6tJ7ml48F5bvVvKFnu5lk9v2LZlkWTNy4t+ya3zmrDGgtIYhJdjvONBDoHKKKUHRW8SCTw0049VVTsLua7RYjZzAv2RyINhnC52ixSrigwo7qUrCGhq0pYb8sjiVFI6PODIxkG0zUa4TjYH0ZQ9/oaxNqXo/QjWcattC1lfJeJP61mUfo1dv3R7+g1EucG035k43/zS9/oKleSlRxe6lQCVd5u6gHUA7h89Zoq1WHrXxKIRcZxvnmizcf/ADDbf3Zr+gVsK1+P/mG2/uzX9ArYV6OfpM+kilKVAClKUBAdpcab4Xx6ZGt0q4NRzIS6Ije+pG8lISSNfoNaPwnO+bl77J76tqlYqQpVdHWRu0rcfNv5nOxGAo4menUvf1lS+E53zcvfZPfTwnO+bl77J76tqlVd2w3I+prbIwvn1+xUvhOd83L32T308Jzvm5e+ye+rapTu2G5H1GyML59fsVL4TnfNy99k99PCc75uXvsnvq2qU7thuR9RsjC+fX7FS+E53zcvfZPfWFe5Fyn2WfGaxy9F16O42gGLoNSkgef6auelSjQw8JKShw8zMeycNFpq/X7GFZWVx7PAacSUOIYbSpJ6QQkAis2lKtbu7nZFKUrAP//Z" + }, + "metadata": {}, + 
"output_type": "display_data" + } + ], + "source": [ + "const otherRunnableGraph = otherGraph.getGraph();\n", + "const otherImage = await otherRunnableGraph.drawMermaidPng();\n", + "const otherArrayBuffer = await otherImage.arrayBuffer();\n", + "\n", + "await tslab.display.png(new Uint8Array(otherArrayBuffer));" + ] + }, + { + "cell_type": "markdown", + "id": "5ff9d991", + "metadata": {}, + "source": [ + "Now when we call `streamEvents`, we can filter on run name to see only the final summary generation of the current chat history:" + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "id": "51381303", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "You\n", + " asked\n", + " about\n", + " the\n", + " capital\n", + " of\n", + " Nepal\n", + ",\n", + " and\n", + " I\n", + " responded\n", + " that\n", + " it\n", + " is\n", + " Kathmandu\n", + ".\n", + "\n" + ] + } + ], + "source": [ + "const otherEventStream = await otherGraph.streamEvents(\n", + " { messages: [[\"user\", \"What's the capital of Nepal?\"]] },\n", + " { version: \"v2\" },\n", + " { includeNames: [\"Summarizer\"] }\n", + ");\n", + "\n", + "for await (const { event, data } of otherEventStream) {\n", + " if (event === \"on_chat_model_stream\") {\n", + " console.log(data.chunk.content);\n", + " }\n", + "}" + ] + }, + { + "cell_type": "markdown", + "id": "46998967", + "metadata": {}, + "source": [ + "And you can see that the resulting chunks are only from the final summary model call.\n", + "\n", + "## Next steps\n", + "\n", + "You've now seen some ways to stream LLM tokens from within your graph. Next, check out some of the other how-tos around streaming by going [to this page](https://langchain-ai.github.io/langgraphjs/how-tos/#streaming)."
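Besides `includeNames`, `streamEvents` also accepts `includeTags` (and matching `exclude*` options), so tagging the final model with `.withConfig({ tags: [...] })` works just as well as a run name. The filtering itself is just a predicate over each event's name and tags; the standalone sketch below simulates it with plain objects (illustrative shapes only, not real LangChain event objects):

```typescript
// Plain-object stand-ins for streamEvents events, for illustration only.
type StreamedEvent = {
  event: string;
  name: string;
  tags: string[];
  data: { chunk?: { content: string } };
};

const events: StreamedEvent[] = [
  { event: "on_chat_model_stream", name: "ChatOpenAI", tags: [], data: { chunk: { content: "Kath" } } },
  { event: "on_chat_model_stream", name: "ChatOpenAI", tags: [], data: { chunk: { content: "mandu" } } },
  { event: "on_chat_model_stream", name: "Summarizer", tags: ["final"], data: { chunk: { content: "You asked" } } },
];

// Rough equivalent of passing `{ includeNames: ["Summarizer"] }`:
// keep only events whose run name matches.
const includeNames = (evts: StreamedEvent[], names: string[]) =>
  evts.filter((e) => names.includes(e.name));

// Rough equivalent of passing `{ includeTags: ["final"] }`.
const includeTags = (evts: StreamedEvent[], tags: string[]) =>
  evts.filter((e) => e.tags.some((t) => tags.includes(t)));

const byName = includeNames(events, ["Summarizer"]).map((e) => e.data.chunk?.content);
const byTag = includeTags(events, ["final"]).map((e) => e.data.chunk?.content);

console.log(byName); // [ 'You asked' ]
console.log(byTag); // [ 'You asked' ]
```

Tags can be more convenient than run names when several differently named runnables should all be treated as "final", since one tag can be applied to each of them.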
+ ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "TypeScript", + "language": "typescript", + "name": "tslab" + }, + "language_info": { + "codemirror_mode": { + "mode": "typescript", + "name": "javascript", + "typescript": true + }, + "file_extension": ".ts", + "mimetype": "text/typescript", + "name": "typescript", + "version": "3.7.2" } - ], - "source": [ - "const otherEventStream = await otherGraph.streamEvents(\n", - " { messages: [[\"user\", \"What's the capital of Nepal?\"]] },\n", - " { version: \"v2\" },\n", - " { includeNames: [\"Summarizer\"] }\n", - ");\n", - "\n", - "for await (const { event, data } of otherEventStream) {\n", - " if (event === \"on_chat_model_stream\") {\n", - " console.log(data.chunk.content);\n", - " }\n", - "}" - ] - }, - { - "cell_type": "markdown", - "id": "46998967", - "metadata": {}, - "source": [ - "And you can see the resulting chunks are only ones from the final summary model call.\n", - "\n", - "## Next steps\n", - "\n", - "You've now seen some ways to stream LLM tokens from within your graph. Next, check out some of the other how-tos around streaming by going [to this page](https://langchain-ai.github.io/langgraphjs/how-tos/#streaming)." 
- ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "TypeScript", - "language": "typescript", - "name": "tslab" }, - "language_info": { - "codemirror_mode": { - "mode": "typescript", - "name": "javascript", - "typescript": true - }, - "file_extension": ".ts", - "mimetype": "text/typescript", - "name": "typescript", - "version": "3.7.2" - } - }, - "nbformat": 4, - "nbformat_minor": 5 -} + "nbformat": 4, + "nbformat_minor": 5 +} \ No newline at end of file diff --git a/examples/how-tos/stream-updates.ipynb b/examples/how-tos/stream-updates.ipynb index fbeb2ba25..71ae564df 100644 --- a/examples/how-tos/stream-updates.ipynb +++ b/examples/how-tos/stream-updates.ipynb @@ -1,393 +1,393 @@ { - "cells": [ - { - "cell_type": "markdown", - "id": "562ddb82", - "metadata": {}, - "source": [ - "# How to stream state updates of your graph\n", - "\n", - "LangGraph supports multiple streaming modes. The main ones are:\n", - "\n", - "- `values`: This streaming mode streams back values of the graph. This is the\n", - " **full state of the graph** after each node is called.\n", - "- `updates`: This streaming mode streams back updates to the graph. This is the\n", - " **update to the state of the graph** after each node is called.\n", - "\n", - "This guide covers `streamMode=\"updates\"`." 
- ] - }, - { - "cell_type": "code", - "execution_count": 1, - "id": "8e76833b", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "// process.env.OPENAI_API_KEY = \"sk-...\";" - ] - }, - { - "cell_type": "markdown", - "id": "ab95dc97", - "metadata": {}, - "source": [ - "## Define the state\n", - "\n", - "The state is the interface for all of the nodes in our graph.\n" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "id": "1648124b", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "import { Annotation } from \"@langchain/langgraph\";\n", - "import { BaseMessage } from \"@langchain/core/messages\";\n", - "\n", - "const GraphState = Annotation.Root({\n", - " messages: Annotation({\n", - " reducer: (x, y) => x.concat(y),\n", - " }),\n", - "});" - ] - }, - { - "cell_type": "markdown", - "id": "da50fbd8", - "metadata": {}, - "source": [ - "## Set up the tools\n", - "\n", - "We will first define the tools we want to use. For this simple example, we will\n", - "use create a placeholder search engine. 
However, it is really easy to create\n", - "your own tools - see documentation\n", - "[here](https://js.langchain.com/v0.2/docs/how_to/custom_tools) on how to do\n", - "that.\n" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "id": "a8f1ae1c", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "import { DynamicStructuredTool } from \"@langchain/core/tools\";\n", - "import { z } from \"zod\";\n", - "\n", - "const searchTool = new DynamicStructuredTool({\n", - " name: \"search\",\n", - " description:\n", - " \"Use to surf the web, fetch current information, check the weather, and retrieve other information.\",\n", - " schema: z.object({\n", - " query: z.string().describe(\"The query to use in your search.\"),\n", - " }),\n", - " func: async ({ query: _query }: { query: string }) => {\n", - " // This is a placeholder for the actual implementation\n", - " return \"Cold, with a low of 3℃\";\n", - " },\n", - "});\n", - "\n", - "await searchTool.invoke({ query: \"What's the weather like?\" });\n", - "\n", - "const tools = [searchTool];" - ] - }, - { - "cell_type": "markdown", - "id": "19b27cb3", - "metadata": {}, - "source": [ - "We can now wrap these tools in a simple\n", - "[ToolNode](https://langchain-ai.github.io/langgraphjs/reference/classes/prebuilt.ToolNode.html).\n", - "This object will actually run the tools (functions) whenever they are invoked by\n", - "our LLM.\n" - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "id": "f02278b1", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "import { ToolNode } from \"@langchain/langgraph/prebuilt\";\n", - "\n", - "const toolNode = new ToolNode(tools);" - ] - }, - { - "cell_type": "markdown", - "id": "dd55ee5a", - "metadata": {}, - "source": [ - "## Set up the model\n", - "\n", - "Now we will load the\n", - "[chat model](https://js.langchain.com/v0.2/docs/concepts/#chat-models).\n", - "\n", - "1. It should work with messages. 
We will represent all agent state in the form\n", - " of messages, so it needs to be able to work well with them.\n", - "2. It should work with\n", - " [tool calling](https://js.langchain.com/v0.2/docs/how_to/tool_calling/#passing-tools-to-llms),\n", - " meaning it can return function arguments in its response.\n", - "\n", - "
\n", - "

Note

\n", - "

\n", - " These model requirements are not general requirements for using LangGraph - they are just requirements for this one example.\n", - "

\n", - "
" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "id": "9c7210e7", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "import { ChatOpenAI } from \"@langchain/openai\";\n", - "\n", - "const model = new ChatOpenAI({ model: \"gpt-4o\" });" - ] - }, - { - "cell_type": "markdown", - "id": "73e59248", - "metadata": {}, - "source": [ - "After we've done this, we should make sure the model knows that it has these\n", - "tools available to call. We can do this by calling\n", - "[bindTools](https://v01.api.js.langchain.com/classes/langchain_core_language_models_chat_models.BaseChatModel.html#bindTools)." - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "id": "b4ff23ee", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "const boundModel = model.bindTools(tools);" - ] - }, - { - "cell_type": "markdown", - "id": "dbe67356", - "metadata": {}, - "source": [ - "## Define the graph\n", - "\n", - "We can now put it all together." 
- ] - }, - { - "cell_type": "code", - "execution_count": 7, - "id": "0ba603bb", - "metadata": {}, - "outputs": [], - "source": [ - "import { END, START, StateGraph } from \"@langchain/langgraph\";\n", - "import { AIMessage } from \"@langchain/core/messages\";\n", - "import { RunnableConfig } from \"@langchain/core/runnables\";\n", - "\n", - "const routeMessage = (state: typeof GraphState.State) => {\n", - " const { messages } = state;\n", - " const lastMessage = messages[messages.length - 1] as AIMessage;\n", - " // If no tools are called, we can finish (respond to the user)\n", - " if (!lastMessage?.tool_calls?.length) {\n", - " return END;\n", - " }\n", - " // Otherwise if there is, we continue and call the tools\n", - " return \"tools\";\n", - "};\n", - "\n", - "const callModel = async (\n", - " state: typeof GraphState.State,\n", - " config?: RunnableConfig,\n", - ") => {\n", - " // For versions of @langchain/core < 0.2.3, you must call `.stream()`\n", - " // and aggregate the message from chunks instead of calling `.invoke()`.\n", - " const { messages } = state;\n", - " const responseMessage = await boundModel.invoke(messages, config);\n", - " return { messages: [responseMessage] };\n", - "};\n", - "\n", - "const workflow = new StateGraph(GraphState)\n", - " .addNode(\"agent\", callModel)\n", - " .addNode(\"tools\", toolNode)\n", - " .addEdge(START, \"agent\")\n", - " .addConditionalEdges(\"agent\", routeMessage)\n", - " .addEdge(\"tools\", \"agent\");\n", - "\n", - "const graph = workflow.compile();" - ] - }, - { - "cell_type": "markdown", - "id": "a1ab3ad3", - "metadata": {}, - "source": [ - "## Stream updates\n", - "\n", - "We can now interact with the agent." 
- ] - }, - { - "cell_type": "code", - "execution_count": 8, - "id": "cbcf7c39", - "metadata": {}, - "outputs": [ + "cells": [ + { + "cell_type": "markdown", + "id": "562ddb82", + "metadata": {}, + "source": [ + "# How to stream state updates of your graph\n", + "\n", + "LangGraph supports multiple streaming modes. The main ones are:\n", + "\n", + "- `values`: This streaming mode streams back values of the graph. This is the\n", + " **full state of the graph** after each node is called.\n", + "- `updates`: This streaming mode streams back updates to the graph. This is the\n", + " **update to the state of the graph** after each node is called.\n", + "\n", + "This guide covers `streamMode=\"updates\"`." + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "Receiving update from node: agent\n", - "{\n", - " messages: [\n", - " AIMessage {\n", - " \"id\": \"chatcmpl-9y654VypbD3kE1xM8v4xaAHzZEOXa\",\n", - " \"content\": \"\",\n", - " \"additional_kwargs\": {\n", - " \"tool_calls\": [\n", - " {\n", - " \"id\": \"call_OxlOhnROermwae2LPs9SanmD\",\n", - " \"type\": \"function\",\n", - " \"function\": \"[Object]\"\n", - " }\n", - " ]\n", - " },\n", - " \"response_metadata\": {\n", - " \"tokenUsage\": {\n", - " \"completionTokens\": 17,\n", - " \"promptTokens\": 70,\n", - " \"totalTokens\": 87\n", - " },\n", - " \"finish_reason\": \"tool_calls\",\n", - " \"system_fingerprint\": \"fp_3aa7262c27\"\n", - " },\n", - " \"tool_calls\": [\n", - " {\n", - " \"name\": \"search\",\n", - " \"args\": {\n", - " \"query\": \"current weather in San Francisco\"\n", - " },\n", - " \"type\": \"tool_call\",\n", - " \"id\": \"call_OxlOhnROermwae2LPs9SanmD\"\n", - " }\n", - " ],\n", - " \"invalid_tool_calls\": [],\n", - " \"usage_metadata\": {\n", - " \"input_tokens\": 70,\n", - " \"output_tokens\": 17,\n", - " \"total_tokens\": 87\n", - " }\n", - " }\n", - " ]\n", - "}\n", - "\n", - "====\n", - "\n", - "Receiving update from node: tools\n", - "{\n", - " messages: [\n", - " 
ToolMessage {\n", - " \"content\": \"Cold, with a low of 3℃\",\n", - " \"name\": \"search\",\n", - " \"additional_kwargs\": {},\n", - " \"response_metadata\": {},\n", - " \"tool_call_id\": \"call_OxlOhnROermwae2LPs9SanmD\"\n", - " }\n", - " ]\n", - "}\n", - "\n", - "====\n", - "\n", - "Receiving update from node: agent\n", - "{\n", - " messages: [\n", - " AIMessage {\n", - " \"id\": \"chatcmpl-9y654dZ0zzZhPYm6lb36FkG1Enr3p\",\n", - " \"content\": \"It looks like it's currently quite cold in San Francisco, with a low temperature of around 3°C. Make sure to dress warmly!\",\n", - " \"additional_kwargs\": {},\n", - " \"response_metadata\": {\n", - " \"tokenUsage\": {\n", - " \"completionTokens\": 28,\n", - " \"promptTokens\": 103,\n", - " \"totalTokens\": 131\n", - " },\n", - " \"finish_reason\": \"stop\",\n", - " \"system_fingerprint\": \"fp_3aa7262c27\"\n", - " },\n", - " \"tool_calls\": [],\n", - " \"invalid_tool_calls\": [],\n", - " \"usage_metadata\": {\n", - " \"input_tokens\": 103,\n", - " \"output_tokens\": 28,\n", - " \"total_tokens\": 131\n", - " }\n", - " }\n", - " ]\n", - "}\n", - "\n", - "====\n", - "\n" - ] + "cell_type": "code", + "execution_count": 1, + "id": "8e76833b", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "// process.env.OPENAI_API_KEY = \"sk-...\";" + ] + }, + { + "cell_type": "markdown", + "id": "ab95dc97", + "metadata": {}, + "source": [ + "## Define the state\n", + "\n", + "The state is the interface for all of the nodes in our graph.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "id": "1648124b", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "import { Annotation } from \"@langchain/langgraph\";\n", + "import { BaseMessage } from \"@langchain/core/messages\";\n", + "\n", + "const GraphState = Annotation.Root({\n", + " messages: Annotation({\n", + " reducer: (x, y) => x.concat(y),\n", + " }),\n", + "});" + ] + }, + { + "cell_type": "markdown", + "id": 
"da50fbd8", + "metadata": {}, + "source": [ + "## Set up the tools\n", + "\n", + "We will first define the tools we want to use. For this simple example, we will\n", + "use create a placeholder search engine. However, it is really easy to create\n", + "your own tools - see documentation\n", + "[here](https://js.langchain.com/v0.2/docs/how_to/custom_tools) on how to do\n", + "that.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "id": "a8f1ae1c", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "import { DynamicStructuredTool } from \"@langchain/core/tools\";\n", + "import { z } from \"zod\";\n", + "\n", + "const searchTool = new DynamicStructuredTool({\n", + " name: \"search\",\n", + " description:\n", + " \"Use to surf the web, fetch current information, check the weather, and retrieve other information.\",\n", + " schema: z.object({\n", + " query: z.string().describe(\"The query to use in your search.\"),\n", + " }),\n", + " func: async ({ query: _query }: { query: string }) => {\n", + " // This is a placeholder for the actual implementation\n", + " return \"Cold, with a low of 3℃\";\n", + " },\n", + "});\n", + "\n", + "await searchTool.invoke({ query: \"What's the weather like?\" });\n", + "\n", + "const tools = [searchTool];" + ] + }, + { + "cell_type": "markdown", + "id": "19b27cb3", + "metadata": {}, + "source": [ + "We can now wrap these tools in a simple\n", + "[ToolNode](https://langchain-ai.github.io/langgraphjs/reference/classes/langgraph_prebuilt.ToolNode.html).\n", + "This object will actually run the tools (functions) whenever they are invoked by\n", + "our LLM.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "id": "f02278b1", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "import { ToolNode } from \"@langchain/langgraph/prebuilt\";\n", + "\n", + "const toolNode = new ToolNode(tools);" + ] + }, + { + "cell_type": "markdown", + "id": "dd55ee5a", + 
"metadata": {}, + "source": [ + "## Set up the model\n", + "\n", + "Now we will load the\n", + "[chat model](https://js.langchain.com/v0.2/docs/concepts/#chat-models).\n", + "\n", + "1. It should work with messages. We will represent all agent state in the form\n", + " of messages, so it needs to be able to work well with them.\n", + "2. It should work with\n", + " [tool calling](https://js.langchain.com/v0.2/docs/how_to/tool_calling/#passing-tools-to-llms),\n", + " meaning it can return function arguments in its response.\n", + "\n", + "
\n", + "

Note

\n", + "

\n", + " These model requirements are not general requirements for using LangGraph - they are just requirements for this one example.\n", + "

\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "id": "9c7210e7", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "import { ChatOpenAI } from \"@langchain/openai\";\n", + "\n", + "const model = new ChatOpenAI({ model: \"gpt-4o\" });" + ] + }, + { + "cell_type": "markdown", + "id": "73e59248", + "metadata": {}, + "source": [ + "After we've done this, we should make sure the model knows that it has these\n", + "tools available to call. We can do this by calling\n", + "[bindTools](https://v01.api.js.langchain.com/classes/langchain_core_language_models_chat_models.BaseChatModel.html#bindTools)." + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "id": "b4ff23ee", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "const boundModel = model.bindTools(tools);" + ] + }, + { + "cell_type": "markdown", + "id": "dbe67356", + "metadata": {}, + "source": [ + "## Define the graph\n", + "\n", + "We can now put it all together." 
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 7,
+ "id": "0ba603bb",
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import { END, START, StateGraph } from \"@langchain/langgraph\";\n",
+ "import { AIMessage } from \"@langchain/core/messages\";\n",
+ "import { RunnableConfig } from \"@langchain/core/runnables\";\n",
+ "\n",
+ "const routeMessage = (state: typeof GraphState.State) => {\n",
+ " const { messages } = state;\n",
+ " const lastMessage = messages[messages.length - 1] as AIMessage;\n",
+ " // If no tools are called, we can finish (respond to the user)\n",
+ " if (!lastMessage?.tool_calls?.length) {\n",
+ " return END;\n",
+ " }\n",
+ " // Otherwise, we continue and call the tools\n",
+ " return \"tools\";\n",
+ "};\n",
+ "\n",
+ "const callModel = async (\n",
+ " state: typeof GraphState.State,\n",
+ " config?: RunnableConfig,\n",
+ ") => {\n",
+ " // For versions of @langchain/core < 0.2.3, you must call `.stream()`\n",
+ " // and aggregate the message from chunks instead of calling `.invoke()`.\n",
+ " const { messages } = state;\n",
+ " const responseMessage = await boundModel.invoke(messages, config);\n",
+ " return { messages: [responseMessage] };\n",
+ "};\n",
+ "\n",
+ "const workflow = new StateGraph(GraphState)\n",
+ " .addNode(\"agent\", callModel)\n",
+ " .addNode(\"tools\", toolNode)\n",
+ " .addEdge(START, \"agent\")\n",
+ " .addConditionalEdges(\"agent\", routeMessage)\n",
+ " .addEdge(\"tools\", \"agent\");\n",
+ "\n",
+ "const graph = workflow.compile();"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "a1ab3ad3",
+ "metadata": {},
+ "source": [
+ "## Stream updates\n",
+ "\n",
+ "We can now interact with the agent."
+ ] + }, + { + "cell_type": "code", + "execution_count": 8, + "id": "cbcf7c39", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Receiving update from node: agent\n", + "{\n", + " messages: [\n", + " AIMessage {\n", + " \"id\": \"chatcmpl-9y654VypbD3kE1xM8v4xaAHzZEOXa\",\n", + " \"content\": \"\",\n", + " \"additional_kwargs\": {\n", + " \"tool_calls\": [\n", + " {\n", + " \"id\": \"call_OxlOhnROermwae2LPs9SanmD\",\n", + " \"type\": \"function\",\n", + " \"function\": \"[Object]\"\n", + " }\n", + " ]\n", + " },\n", + " \"response_metadata\": {\n", + " \"tokenUsage\": {\n", + " \"completionTokens\": 17,\n", + " \"promptTokens\": 70,\n", + " \"totalTokens\": 87\n", + " },\n", + " \"finish_reason\": \"tool_calls\",\n", + " \"system_fingerprint\": \"fp_3aa7262c27\"\n", + " },\n", + " \"tool_calls\": [\n", + " {\n", + " \"name\": \"search\",\n", + " \"args\": {\n", + " \"query\": \"current weather in San Francisco\"\n", + " },\n", + " \"type\": \"tool_call\",\n", + " \"id\": \"call_OxlOhnROermwae2LPs9SanmD\"\n", + " }\n", + " ],\n", + " \"invalid_tool_calls\": [],\n", + " \"usage_metadata\": {\n", + " \"input_tokens\": 70,\n", + " \"output_tokens\": 17,\n", + " \"total_tokens\": 87\n", + " }\n", + " }\n", + " ]\n", + "}\n", + "\n", + "====\n", + "\n", + "Receiving update from node: tools\n", + "{\n", + " messages: [\n", + " ToolMessage {\n", + " \"content\": \"Cold, with a low of 3℃\",\n", + " \"name\": \"search\",\n", + " \"additional_kwargs\": {},\n", + " \"response_metadata\": {},\n", + " \"tool_call_id\": \"call_OxlOhnROermwae2LPs9SanmD\"\n", + " }\n", + " ]\n", + "}\n", + "\n", + "====\n", + "\n", + "Receiving update from node: agent\n", + "{\n", + " messages: [\n", + " AIMessage {\n", + " \"id\": \"chatcmpl-9y654dZ0zzZhPYm6lb36FkG1Enr3p\",\n", + " \"content\": \"It looks like it's currently quite cold in San Francisco, with a low temperature of around 3°C. 
Make sure to dress warmly!\",\n", + " \"additional_kwargs\": {},\n", + " \"response_metadata\": {\n", + " \"tokenUsage\": {\n", + " \"completionTokens\": 28,\n", + " \"promptTokens\": 103,\n", + " \"totalTokens\": 131\n", + " },\n", + " \"finish_reason\": \"stop\",\n", + " \"system_fingerprint\": \"fp_3aa7262c27\"\n", + " },\n", + " \"tool_calls\": [],\n", + " \"invalid_tool_calls\": [],\n", + " \"usage_metadata\": {\n", + " \"input_tokens\": 103,\n", + " \"output_tokens\": 28,\n", + " \"total_tokens\": 131\n", + " }\n", + " }\n", + " ]\n", + "}\n", + "\n", + "====\n", + "\n" + ] + } + ], + "source": [ + "let inputs = { messages: [[\"user\", \"what's the weather in sf\"]] };\n", + "\n", + "for await (\n", + " const chunk of await graph.stream(inputs, {\n", + " streamMode: \"updates\",\n", + " })\n", + ") {\n", + " for (const [node, values] of Object.entries(chunk)) {\n", + " console.log(`Receiving update from node: ${node}`);\n", + " console.log(values);\n", + " console.log(\"\\n====\\n\");\n", + " }\n", + "}" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "TypeScript", + "language": "typescript", + "name": "tslab" + }, + "language_info": { + "codemirror_mode": { + "mode": "typescript", + "name": "javascript", + "typescript": true + }, + "file_extension": ".ts", + "mimetype": "text/typescript", + "name": "typescript", + "version": "3.7.2" } - ], - "source": [ - "let inputs = { messages: [[\"user\", \"what's the weather in sf\"]] };\n", - "\n", - "for await (\n", - " const chunk of await graph.stream(inputs, {\n", - " streamMode: \"updates\",\n", - " })\n", - ") {\n", - " for (const [node, values] of Object.entries(chunk)) {\n", - " console.log(`Receiving update from node: ${node}`);\n", - " console.log(values);\n", - " console.log(\"\\n====\\n\");\n", - " }\n", - "}" - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "TypeScript", - "language": "typescript", - "name": "tslab" }, - "language_info": { - "codemirror_mode": { - "mode": 
"typescript", - "name": "javascript", - "typescript": true - }, - "file_extension": ".ts", - "mimetype": "text/typescript", - "name": "typescript", - "version": "3.7.2" - } - }, - "nbformat": 4, - "nbformat_minor": 5 -} + "nbformat": 4, + "nbformat_minor": 5 +} \ No newline at end of file diff --git a/examples/how-tos/stream-values.ipynb b/examples/how-tos/stream-values.ipynb index c92642f6b..c1ec2c080 100644 --- a/examples/how-tos/stream-values.ipynb +++ b/examples/how-tos/stream-values.ipynb @@ -1,472 +1,472 @@ { - "cells": [ - { - "cell_type": "markdown", - "id": "562ddb82", - "metadata": {}, - "source": [ - "# How to stream full state of your graph\n", - "\n", - "LangGraph supports multiple streaming modes. The main ones are:\n", - "\n", - "- `values`: This streaming mode streams back values of the graph. This is the\n", - " **full state of the graph** after each node is called.\n", - "- `updates`: This streaming mode streams back updates to the graph. This is the\n", - " **update to the state of the graph** after each node is called.\n", - "\n", - "This guide covers `streamMode=\"values\"`." 
- ] - }, - { - "cell_type": "code", - "execution_count": 1, - "id": "8e76833b", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "// process.env.OPENAI_API_KEY = \"sk-...\";" - ] - }, - { - "cell_type": "markdown", - "id": "ab95dc97", - "metadata": {}, - "source": [ - "## Define the state\n", - "\n", - "The state is the interface for all of the nodes in our graph.\n" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "id": "1648124b", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "import { Annotation } from \"@langchain/langgraph\";\n", - "import { BaseMessage } from \"@langchain/core/messages\";\n", - "\n", - "const GraphState = Annotation.Root({\n", - " messages: Annotation({\n", - " reducer: (x, y) => x.concat(y),\n", - " }),\n", - "});" - ] - }, - { - "cell_type": "markdown", - "id": "da50fbd8", - "metadata": {}, - "source": [ - "## Set up the tools\n", - "\n", - "We will first define the tools we want to use. For this simple example, we will\n", - "use create a placeholder search engine. 
However, it is really easy to create\n", - "your own tools - see documentation\n", - "[here](https://js.langchain.com/v0.2/docs/how_to/custom_tools) on how to do\n", - "that.\n" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "id": "a8f1ae1c", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "import { DynamicStructuredTool } from \"@langchain/core/tools\";\n", - "import { z } from \"zod\";\n", - "\n", - "const searchTool = new DynamicStructuredTool({\n", - " name: \"search\",\n", - " description:\n", - " \"Use to surf the web, fetch current information, check the weather, and retrieve other information.\",\n", - " schema: z.object({\n", - " query: z.string().describe(\"The query to use in your search.\"),\n", - " }),\n", - " func: async ({ query: _query }: { query: string }) => {\n", - " // This is a placeholder for the actual implementation\n", - " return \"Cold, with a low of 3℃\";\n", - " },\n", - "});\n", - "\n", - "await searchTool.invoke({ query: \"What's the weather like?\" });\n", - "\n", - "const tools = [searchTool];" - ] - }, - { - "cell_type": "markdown", - "id": "19b27cb3", - "metadata": {}, - "source": [ - "We can now wrap these tools in a simple\n", - "[ToolNode](https://langchain-ai.github.io/langgraphjs/reference/classes/prebuilt.ToolNode.html).\n", - "This object will actually run the tools (functions) whenever they are invoked by\n", - "our LLM.\n" - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "id": "f02278b1", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "import { ToolNode } from \"@langchain/langgraph/prebuilt\";\n", - "\n", - "const toolNode = new ToolNode(tools);" - ] - }, - { - "cell_type": "markdown", - "id": "dd55ee5a", - "metadata": {}, - "source": [ - "## Set up the model\n", - "\n", - "Now we will load the\n", - "[chat model](https://js.langchain.com/v0.2/docs/concepts/#chat-models).\n", - "\n", - "1. It should work with messages. 
We will represent all agent state in the form\n", - " of messages, so it needs to be able to work well with them.\n", - "2. It should work with\n", - " [tool calling](https://js.langchain.com/v0.2/docs/how_to/tool_calling/#passing-tools-to-llms),\n", - " meaning it can return function arguments in its response.\n", - "\n", - "
\n", - "

Note

\n", - "

\n", - " These model requirements are not general requirements for using LangGraph - they are just requirements for this one example.\n", - "

\n", - "
" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "id": "9c7210e7", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "import { ChatOpenAI } from \"@langchain/openai\";\n", - "\n", - "const model = new ChatOpenAI({ model: \"gpt-4o\" });" - ] - }, - { - "cell_type": "markdown", - "id": "73e59248", - "metadata": {}, - "source": [ - "After we've done this, we should make sure the model knows that it has these\n", - "tools available to call. We can do this by calling\n", - "[bindTools](https://v01.api.js.langchain.com/classes/langchain_core_language_models_chat_models.BaseChatModel.html#bindTools)." - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "id": "b4ff23ee", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "const boundModel = model.bindTools(tools);" - ] - }, - { - "cell_type": "markdown", - "id": "dbe67356", - "metadata": {}, - "source": [ - "## Define the graph\n", - "\n", - "We can now put it all together." 
- ] - }, - { - "cell_type": "code", - "execution_count": 7, - "id": "0ba603bb", - "metadata": {}, - "outputs": [], - "source": [ - "import { END, START, StateGraph } from \"@langchain/langgraph\";\n", - "import { AIMessage } from \"@langchain/core/messages\";\n", - "import { RunnableConfig } from \"@langchain/core/runnables\";\n", - "\n", - "const routeMessage = (state: typeof GraphState.State) => {\n", - " const { messages } = state;\n", - " const lastMessage = messages[messages.length - 1] as AIMessage;\n", - " // If no tools are called, we can finish (respond to the user)\n", - " if (!lastMessage?.tool_calls?.length) {\n", - " return END;\n", - " }\n", - " // Otherwise if there is, we continue and call the tools\n", - " return \"tools\";\n", - "};\n", - "\n", - "const callModel = async (\n", - " state: typeof GraphState.State,\n", - " config?: RunnableConfig,\n", - ") => {\n", - " // For versions of @langchain/core < 0.2.3, you must call `.stream()`\n", - " // and aggregate the message from chunks instead of calling `.invoke()`.\n", - " const { messages } = state;\n", - " const responseMessage = await boundModel.invoke(messages, config);\n", - " return { messages: [responseMessage] };\n", - "};\n", - "\n", - "const workflow = new StateGraph(GraphState)\n", - " .addNode(\"agent\", callModel)\n", - " .addNode(\"tools\", toolNode)\n", - " .addEdge(START, \"agent\")\n", - " .addConditionalEdges(\"agent\", routeMessage)\n", - " .addEdge(\"tools\", \"agent\");\n", - "\n", - "const graph = workflow.compile();" - ] - }, - { - "cell_type": "markdown", - "id": "a1ab3ad3", - "metadata": {}, - "source": [ - "## Stream values\n", - "\n", - "We can now interact with the agent. 
Between interactions you can get and update\n", - "state.\n" - ] - }, - { - "cell_type": "code", - "execution_count": 8, - "id": "cbcf7c39", - "metadata": {}, - "outputs": [ + "cells": [ + { + "cell_type": "markdown", + "id": "562ddb82", + "metadata": {}, + "source": [ + "# How to stream full state of your graph\n", + "\n", + "LangGraph supports multiple streaming modes. The main ones are:\n", + "\n", + "- `values`: This streaming mode streams back values of the graph. This is the\n", + " **full state of the graph** after each node is called.\n", + "- `updates`: This streaming mode streams back updates to the graph. This is the\n", + " **update to the state of the graph** after each node is called.\n", + "\n", + "This guide covers `streamMode=\"values\"`." + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "[ [ 'user', \"what's the weather in sf\" ] ]\n", - "\n", - "====\n", - "\n", - "[\n", - " [ 'user', \"what's the weather in sf\" ],\n", - " AIMessage {\n", - " \"id\": \"chatcmpl-9y660d49eLzT7DZeBk2ZmX8C5f0LU\",\n", - " \"content\": \"\",\n", - " \"additional_kwargs\": {\n", - " \"tool_calls\": [\n", - " {\n", - " \"id\": \"call_iD5Wk4vPsTckffDKJpEQaMkg\",\n", - " \"type\": \"function\",\n", - " \"function\": \"[Object]\"\n", - " }\n", - " ]\n", - " },\n", - " \"response_metadata\": {\n", - " \"tokenUsage\": {\n", - " \"completionTokens\": 17,\n", - " \"promptTokens\": 70,\n", - " \"totalTokens\": 87\n", - " },\n", - " \"finish_reason\": \"tool_calls\",\n", - " \"system_fingerprint\": \"fp_3aa7262c27\"\n", - " },\n", - " \"tool_calls\": [\n", - " {\n", - " \"name\": \"search\",\n", - " \"args\": {\n", - " \"query\": \"current weather in San Francisco\"\n", - " },\n", - " \"type\": \"tool_call\",\n", - " \"id\": \"call_iD5Wk4vPsTckffDKJpEQaMkg\"\n", - " }\n", - " ],\n", - " \"invalid_tool_calls\": [],\n", - " \"usage_metadata\": {\n", - " \"input_tokens\": 70,\n", - " \"output_tokens\": 17,\n", - " \"total_tokens\": 87\n", - " }\n", - " }\n", 
- "]\n", - "\n", - "====\n", - "\n", - "[\n", - " [ 'user', \"what's the weather in sf\" ],\n", - " AIMessage {\n", - " \"id\": \"chatcmpl-9y660d49eLzT7DZeBk2ZmX8C5f0LU\",\n", - " \"content\": \"\",\n", - " \"additional_kwargs\": {\n", - " \"tool_calls\": [\n", - " {\n", - " \"id\": \"call_iD5Wk4vPsTckffDKJpEQaMkg\",\n", - " \"type\": \"function\",\n", - " \"function\": \"[Object]\"\n", - " }\n", - " ]\n", - " },\n", - " \"response_metadata\": {\n", - " \"tokenUsage\": {\n", - " \"completionTokens\": 17,\n", - " \"promptTokens\": 70,\n", - " \"totalTokens\": 87\n", - " },\n", - " \"finish_reason\": \"tool_calls\",\n", - " \"system_fingerprint\": \"fp_3aa7262c27\"\n", - " },\n", - " \"tool_calls\": [\n", - " {\n", - " \"name\": \"search\",\n", - " \"args\": {\n", - " \"query\": \"current weather in San Francisco\"\n", - " },\n", - " \"type\": \"tool_call\",\n", - " \"id\": \"call_iD5Wk4vPsTckffDKJpEQaMkg\"\n", - " }\n", - " ],\n", - " \"invalid_tool_calls\": [],\n", - " \"usage_metadata\": {\n", - " \"input_tokens\": 70,\n", - " \"output_tokens\": 17,\n", - " \"total_tokens\": 87\n", - " }\n", - " },\n", - " ToolMessage {\n", - " \"content\": \"Cold, with a low of 3℃\",\n", - " \"name\": \"search\",\n", - " \"additional_kwargs\": {},\n", - " \"response_metadata\": {},\n", - " \"tool_call_id\": \"call_iD5Wk4vPsTckffDKJpEQaMkg\"\n", - " }\n", - "]\n", - "\n", - "====\n", - "\n", - "[\n", - " [ 'user', \"what's the weather in sf\" ],\n", - " AIMessage {\n", - " \"id\": \"chatcmpl-9y660d49eLzT7DZeBk2ZmX8C5f0LU\",\n", - " \"content\": \"\",\n", - " \"additional_kwargs\": {\n", - " \"tool_calls\": [\n", - " {\n", - " \"id\": \"call_iD5Wk4vPsTckffDKJpEQaMkg\",\n", - " \"type\": \"function\",\n", - " \"function\": \"[Object]\"\n", - " }\n", - " ]\n", - " },\n", - " \"response_metadata\": {\n", - " \"tokenUsage\": {\n", - " \"completionTokens\": 17,\n", - " \"promptTokens\": 70,\n", - " \"totalTokens\": 87\n", - " },\n", - " \"finish_reason\": \"tool_calls\",\n", - " 
\"system_fingerprint\": \"fp_3aa7262c27\"\n", - " },\n", - " \"tool_calls\": [\n", - " {\n", - " \"name\": \"search\",\n", - " \"args\": {\n", - " \"query\": \"current weather in San Francisco\"\n", - " },\n", - " \"type\": \"tool_call\",\n", - " \"id\": \"call_iD5Wk4vPsTckffDKJpEQaMkg\"\n", - " }\n", - " ],\n", - " \"invalid_tool_calls\": [],\n", - " \"usage_metadata\": {\n", - " \"input_tokens\": 70,\n", - " \"output_tokens\": 17,\n", - " \"total_tokens\": 87\n", - " }\n", - " },\n", - " ToolMessage {\n", - " \"content\": \"Cold, with a low of 3℃\",\n", - " \"name\": \"search\",\n", - " \"additional_kwargs\": {},\n", - " \"response_metadata\": {},\n", - " \"tool_call_id\": \"call_iD5Wk4vPsTckffDKJpEQaMkg\"\n", - " },\n", - " AIMessage {\n", - " \"id\": \"chatcmpl-9y660ZKNXvziVJze0X5aTlZ5IoN35\",\n", - " \"content\": \"Currently, in San Francisco, it's cold with a temperature of around 3℃ (37.4°F).\",\n", - " \"additional_kwargs\": {},\n", - " \"response_metadata\": {\n", - " \"tokenUsage\": {\n", - " \"completionTokens\": 23,\n", - " \"promptTokens\": 103,\n", - " \"totalTokens\": 126\n", - " },\n", - " \"finish_reason\": \"stop\",\n", - " \"system_fingerprint\": \"fp_3aa7262c27\"\n", - " },\n", - " \"tool_calls\": [],\n", - " \"invalid_tool_calls\": [],\n", - " \"usage_metadata\": {\n", - " \"input_tokens\": 103,\n", - " \"output_tokens\": 23,\n", - " \"total_tokens\": 126\n", - " }\n", - " }\n", - "]\n", - "\n", - "====\n", - "\n" - ] + "cell_type": "code", + "execution_count": 1, + "id": "8e76833b", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "// process.env.OPENAI_API_KEY = \"sk-...\";" + ] + }, + { + "cell_type": "markdown", + "id": "ab95dc97", + "metadata": {}, + "source": [ + "## Define the state\n", + "\n", + "The state is the interface for all of the nodes in our graph.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "id": "1648124b", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + 
"source": [ + "import { Annotation } from \"@langchain/langgraph\";\n", + "import { BaseMessage } from \"@langchain/core/messages\";\n", + "\n", + "const GraphState = Annotation.Root({\n", + " messages: Annotation({\n", + " reducer: (x, y) => x.concat(y),\n", + " }),\n", + "});" + ] + }, + { + "cell_type": "markdown", + "id": "da50fbd8", + "metadata": {}, + "source": [ + "## Set up the tools\n", + "\n", + "We will first define the tools we want to use. For this simple example, we will\n", + "use create a placeholder search engine. However, it is really easy to create\n", + "your own tools - see documentation\n", + "[here](https://js.langchain.com/v0.2/docs/how_to/custom_tools) on how to do\n", + "that.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "id": "a8f1ae1c", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "import { DynamicStructuredTool } from \"@langchain/core/tools\";\n", + "import { z } from \"zod\";\n", + "\n", + "const searchTool = new DynamicStructuredTool({\n", + " name: \"search\",\n", + " description:\n", + " \"Use to surf the web, fetch current information, check the weather, and retrieve other information.\",\n", + " schema: z.object({\n", + " query: z.string().describe(\"The query to use in your search.\"),\n", + " }),\n", + " func: async ({ query: _query }: { query: string }) => {\n", + " // This is a placeholder for the actual implementation\n", + " return \"Cold, with a low of 3℃\";\n", + " },\n", + "});\n", + "\n", + "await searchTool.invoke({ query: \"What's the weather like?\" });\n", + "\n", + "const tools = [searchTool];" + ] + }, + { + "cell_type": "markdown", + "id": "19b27cb3", + "metadata": {}, + "source": [ + "We can now wrap these tools in a simple\n", + "[ToolNode](https://langchain-ai.github.io/langgraphjs/reference/classes/langgraph_prebuilt.ToolNode.html).\n", + "This object will actually run the tools (functions) whenever they are invoked by\n", + "our LLM.\n" + ] + }, + { + 
"cell_type": "code", + "execution_count": 4, + "id": "f02278b1", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "import { ToolNode } from \"@langchain/langgraph/prebuilt\";\n", + "\n", + "const toolNode = new ToolNode(tools);" + ] + }, + { + "cell_type": "markdown", + "id": "dd55ee5a", + "metadata": {}, + "source": [ + "## Set up the model\n", + "\n", + "Now we will load the\n", + "[chat model](https://js.langchain.com/v0.2/docs/concepts/#chat-models).\n", + "\n", + "1. It should work with messages. We will represent all agent state in the form\n", + " of messages, so it needs to be able to work well with them.\n", + "2. It should work with\n", + " [tool calling](https://js.langchain.com/v0.2/docs/how_to/tool_calling/#passing-tools-to-llms),\n", + " meaning it can return function arguments in its response.\n", + "\n", + "
\n", + "

Note

\n", + "

\n", + " These model requirements are not general requirements for using LangGraph - they are just requirements for this one example.\n", + "

\n", + "
" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "id": "9c7210e7", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "import { ChatOpenAI } from \"@langchain/openai\";\n", + "\n", + "const model = new ChatOpenAI({ model: \"gpt-4o\" });" + ] + }, + { + "cell_type": "markdown", + "id": "73e59248", + "metadata": {}, + "source": [ + "After we've done this, we should make sure the model knows that it has these\n", + "tools available to call. We can do this by calling\n", + "[bindTools](https://v01.api.js.langchain.com/classes/langchain_core_language_models_chat_models.BaseChatModel.html#bindTools)." + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "id": "b4ff23ee", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "const boundModel = model.bindTools(tools);" + ] + }, + { + "cell_type": "markdown", + "id": "dbe67356", + "metadata": {}, + "source": [ + "## Define the graph\n", + "\n", + "We can now put it all together." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 7, + "id": "0ba603bb", + "metadata": {}, + "outputs": [], + "source": [ + "import { END, START, StateGraph } from \"@langchain/langgraph\";\n", + "import { AIMessage } from \"@langchain/core/messages\";\n", + "import { RunnableConfig } from \"@langchain/core/runnables\";\n", + "\n", + "const routeMessage = (state: typeof GraphState.State) => {\n", + " const { messages } = state;\n", + " const lastMessage = messages[messages.length - 1] as AIMessage;\n", + " // If no tools are called, we can finish (respond to the user)\n", + " if (!lastMessage?.tool_calls?.length) {\n", + " return END;\n", + " }\n", + " // Otherwise if there is, we continue and call the tools\n", + " return \"tools\";\n", + "};\n", + "\n", + "const callModel = async (\n", + " state: typeof GraphState.State,\n", + " config?: RunnableConfig,\n", + ") => {\n", + " // For versions of @langchain/core < 0.2.3, you must call `.stream()`\n", + " // and aggregate the message from chunks instead of calling `.invoke()`.\n", + " const { messages } = state;\n", + " const responseMessage = await boundModel.invoke(messages, config);\n", + " return { messages: [responseMessage] };\n", + "};\n", + "\n", + "const workflow = new StateGraph(GraphState)\n", + " .addNode(\"agent\", callModel)\n", + " .addNode(\"tools\", toolNode)\n", + " .addEdge(START, \"agent\")\n", + " .addConditionalEdges(\"agent\", routeMessage)\n", + " .addEdge(\"tools\", \"agent\");\n", + "\n", + "const graph = workflow.compile();" + ] + }, + { + "cell_type": "markdown", + "id": "a1ab3ad3", + "metadata": {}, + "source": [ + "## Stream values\n", + "\n", + "We can now interact with the agent. 
Between interactions you can get and update\n", + "state.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "id": "cbcf7c39", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[ [ 'user', \"what's the weather in sf\" ] ]\n", + "\n", + "====\n", + "\n", + "[\n", + " [ 'user', \"what's the weather in sf\" ],\n", + " AIMessage {\n", + " \"id\": \"chatcmpl-9y660d49eLzT7DZeBk2ZmX8C5f0LU\",\n", + " \"content\": \"\",\n", + " \"additional_kwargs\": {\n", + " \"tool_calls\": [\n", + " {\n", + " \"id\": \"call_iD5Wk4vPsTckffDKJpEQaMkg\",\n", + " \"type\": \"function\",\n", + " \"function\": \"[Object]\"\n", + " }\n", + " ]\n", + " },\n", + " \"response_metadata\": {\n", + " \"tokenUsage\": {\n", + " \"completionTokens\": 17,\n", + " \"promptTokens\": 70,\n", + " \"totalTokens\": 87\n", + " },\n", + " \"finish_reason\": \"tool_calls\",\n", + " \"system_fingerprint\": \"fp_3aa7262c27\"\n", + " },\n", + " \"tool_calls\": [\n", + " {\n", + " \"name\": \"search\",\n", + " \"args\": {\n", + " \"query\": \"current weather in San Francisco\"\n", + " },\n", + " \"type\": \"tool_call\",\n", + " \"id\": \"call_iD5Wk4vPsTckffDKJpEQaMkg\"\n", + " }\n", + " ],\n", + " \"invalid_tool_calls\": [],\n", + " \"usage_metadata\": {\n", + " \"input_tokens\": 70,\n", + " \"output_tokens\": 17,\n", + " \"total_tokens\": 87\n", + " }\n", + " }\n", + "]\n", + "\n", + "====\n", + "\n", + "[\n", + " [ 'user', \"what's the weather in sf\" ],\n", + " AIMessage {\n", + " \"id\": \"chatcmpl-9y660d49eLzT7DZeBk2ZmX8C5f0LU\",\n", + " \"content\": \"\",\n", + " \"additional_kwargs\": {\n", + " \"tool_calls\": [\n", + " {\n", + " \"id\": \"call_iD5Wk4vPsTckffDKJpEQaMkg\",\n", + " \"type\": \"function\",\n", + " \"function\": \"[Object]\"\n", + " }\n", + " ]\n", + " },\n", + " \"response_metadata\": {\n", + " \"tokenUsage\": {\n", + " \"completionTokens\": 17,\n", + " \"promptTokens\": 70,\n", + " \"totalTokens\": 87\n", + " },\n", + " 
\"finish_reason\": \"tool_calls\",\n", + " \"system_fingerprint\": \"fp_3aa7262c27\"\n", + " },\n", + " \"tool_calls\": [\n", + " {\n", + " \"name\": \"search\",\n", + " \"args\": {\n", + " \"query\": \"current weather in San Francisco\"\n", + " },\n", + " \"type\": \"tool_call\",\n", + " \"id\": \"call_iD5Wk4vPsTckffDKJpEQaMkg\"\n", + " }\n", + " ],\n", + " \"invalid_tool_calls\": [],\n", + " \"usage_metadata\": {\n", + " \"input_tokens\": 70,\n", + " \"output_tokens\": 17,\n", + " \"total_tokens\": 87\n", + " }\n", + " },\n", + " ToolMessage {\n", + " \"content\": \"Cold, with a low of 3℃\",\n", + " \"name\": \"search\",\n", + " \"additional_kwargs\": {},\n", + " \"response_metadata\": {},\n", + " \"tool_call_id\": \"call_iD5Wk4vPsTckffDKJpEQaMkg\"\n", + " }\n", + "]\n", + "\n", + "====\n", + "\n", + "[\n", + " [ 'user', \"what's the weather in sf\" ],\n", + " AIMessage {\n", + " \"id\": \"chatcmpl-9y660d49eLzT7DZeBk2ZmX8C5f0LU\",\n", + " \"content\": \"\",\n", + " \"additional_kwargs\": {\n", + " \"tool_calls\": [\n", + " {\n", + " \"id\": \"call_iD5Wk4vPsTckffDKJpEQaMkg\",\n", + " \"type\": \"function\",\n", + " \"function\": \"[Object]\"\n", + " }\n", + " ]\n", + " },\n", + " \"response_metadata\": {\n", + " \"tokenUsage\": {\n", + " \"completionTokens\": 17,\n", + " \"promptTokens\": 70,\n", + " \"totalTokens\": 87\n", + " },\n", + " \"finish_reason\": \"tool_calls\",\n", + " \"system_fingerprint\": \"fp_3aa7262c27\"\n", + " },\n", + " \"tool_calls\": [\n", + " {\n", + " \"name\": \"search\",\n", + " \"args\": {\n", + " \"query\": \"current weather in San Francisco\"\n", + " },\n", + " \"type\": \"tool_call\",\n", + " \"id\": \"call_iD5Wk4vPsTckffDKJpEQaMkg\"\n", + " }\n", + " ],\n", + " \"invalid_tool_calls\": [],\n", + " \"usage_metadata\": {\n", + " \"input_tokens\": 70,\n", + " \"output_tokens\": 17,\n", + " \"total_tokens\": 87\n", + " }\n", + " },\n", + " ToolMessage {\n", + " \"content\": \"Cold, with a low of 3℃\",\n", + " \"name\": \"search\",\n", + 
" \"additional_kwargs\": {},\n", + " \"response_metadata\": {},\n", + " \"tool_call_id\": \"call_iD5Wk4vPsTckffDKJpEQaMkg\"\n", + " },\n", + " AIMessage {\n", + " \"id\": \"chatcmpl-9y660ZKNXvziVJze0X5aTlZ5IoN35\",\n", + " \"content\": \"Currently, in San Francisco, it's cold with a temperature of around 3℃ (37.4°F).\",\n", + " \"additional_kwargs\": {},\n", + " \"response_metadata\": {\n", + " \"tokenUsage\": {\n", + " \"completionTokens\": 23,\n", + " \"promptTokens\": 103,\n", + " \"totalTokens\": 126\n", + " },\n", + " \"finish_reason\": \"stop\",\n", + " \"system_fingerprint\": \"fp_3aa7262c27\"\n", + " },\n", + " \"tool_calls\": [],\n", + " \"invalid_tool_calls\": [],\n", + " \"usage_metadata\": {\n", + " \"input_tokens\": 103,\n", + " \"output_tokens\": 23,\n", + " \"total_tokens\": 126\n", + " }\n", + " }\n", + "]\n", + "\n", + "====\n", + "\n" + ] + } + ], + "source": [ + "let inputs = { messages: [[\"user\", \"what's the weather in sf\"]] };\n", + "\n", + "for await (\n", + " const chunk of await graph.stream(inputs, {\n", + " streamMode: \"values\",\n", + " })\n", + ") {\n", + " console.log(chunk[\"messages\"]);\n", + " console.log(\"\\n====\\n\");\n", + "}" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "TypeScript", + "language": "typescript", + "name": "tslab" + }, + "language_info": { + "codemirror_mode": { + "mode": "typescript", + "name": "javascript", + "typescript": true + }, + "file_extension": ".ts", + "mimetype": "text/typescript", + "name": "typescript", + "version": "3.7.2" } - ], - "source": [ - "let inputs = { messages: [[\"user\", \"what's the weather in sf\"]] };\n", - "\n", - "for await (\n", - " const chunk of await graph.stream(inputs, {\n", - " streamMode: \"values\",\n", - " })\n", - ") {\n", - " console.log(chunk[\"messages\"]);\n", - " console.log(\"\\n====\\n\");\n", - "}" - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "TypeScript", - "language": "typescript", - "name": "tslab" }, - 
"language_info": { - "codemirror_mode": { - "mode": "typescript", - "name": "javascript", - "typescript": true - }, - "file_extension": ".ts", - "mimetype": "text/typescript", - "name": "typescript", - "version": "3.7.2" - } - }, - "nbformat": 4, - "nbformat_minor": 5 -} + "nbformat": 4, + "nbformat_minor": 5 +} \ No newline at end of file diff --git a/examples/how-tos/subgraph.ipynb b/examples/how-tos/subgraph.ipynb index c7213dc2c..dd2aeea33 100644 --- a/examples/how-tos/subgraph.ipynb +++ b/examples/how-tos/subgraph.ipynb @@ -1,352 +1,352 @@ { - "cells": [ - { - "cell_type": "markdown", - "id": "f84bb90b", - "metadata": {}, - "source": [ - "# How to create subgraphs\n", - "\n", - "Graphs such as\n", - "[StateGraph](https://langchain-ai.github.io/langgraphjs/reference/classes/index.StateGraph.html)'s\n", - "naturally can be composed. Creating subgraphs lets you build things like\n", - "[multi-agent teams](./multi_agent/hierarchical_agent_teams.ipynb), where each\n", - "team can track its own separate state.\n", - "\n", - "You can add a `StateGraph` instance as a node by first\n", - "[compiling](https://langchain-ai.github.io/langgraphjs/reference/classes/index.StateGraph.html#compile)\n", - "it to translate it to its lower-level Pregel operations.\n", - "\n", - "The main thing you should note is ensuring the \"handoff\" from the calling graph\n", - "to the called graph behaves as expected.\n", - "\n", - "Below are a couple of examples showing how to do so!\n", - "\n", - "## Setup\n", - "\n", - "First, install LangGraph.\n", - "\n", - "```bash\n", - "yarn add langraph\n", - "```\n", - "\n", - "Optionally, we can set API key for\n", - "[LangSmith tracing](https://smith.langchain.com/), which will give us\n", - "best-in-class observability." 
- ] - }, - { - "cell_type": "code", - "execution_count": 1, - "id": "d8624447", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [ + "cells": [ { - "name": "stdout", - "output_type": "stream", - "text": [ - "Configuration: LangGraphJS\n" - ] - } - ], - "source": [ - "// process.env.OPENAI_API_KEY = \"sk_...\";\n", - "\n", - "// Optional, add tracing in LangSmith\n", - "// process.env.LANGCHAIN_API_KEY = \"ls__...\";\n", - "// process.env.LANGCHAIN_CALLBACKS_BACKGROUND = \"true\";\n", - "process.env.LANGCHAIN_TRACING_V2 = \"true\";\n", - "process.env.LANGCHAIN_PROJECT = \"Configuration: LangGraphJS\";" - ] - }, - { - "cell_type": "markdown", - "id": "f7c8afef", - "metadata": {}, - "source": [ - "## Create Parent + Child Graphs\n", - "\n", - "For this example, we will create two graphs: a parent graph with a few nodes,\n", - "and a child graph that is added as a node in the parent.\n", - "\n", - "For this example we will use the same `State` in both graphs, though we will\n", - "show how using the same keys can be a stumbling block if you're not careful." - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "id": "38d1f06f", - "metadata": {}, - "outputs": [ + "cell_type": "markdown", + "id": "f84bb90b", + "metadata": {}, + "source": [ + "# How to create subgraphs\n", + "\n", + "Graphs such as\n", + "[StateGraph](https://langchain-ai.github.io/langgraphjs/reference/classes/langgraph.StateGraph.html)'s\n", + "naturally can be composed. 
Creating subgraphs lets you build things like\n", + "[multi-agent teams](./multi_agent/hierarchical_agent_teams.ipynb), where each\n", + "team can track its own separate state.\n", + "\n", + "You can add a `StateGraph` instance as a node by first\n", + "[compiling](https://langchain-ai.github.io/langgraphjs/reference/classes/langgraph.StateGraph.html#compile)\n", + "it to translate it to its lower-level Pregel operations.\n", + "\n", + "The main thing you should note is ensuring the \"handoff\" from the calling graph\n", + "to the called graph behaves as expected.\n", + "\n", + "Below are a couple of examples showing how to do so!\n", + "\n", + "## Setup\n", + "\n", + "First, install LangGraph.\n", + "\n", + "```bash\n", + "yarn add langraph\n", + "```\n", + "\n", + "Optionally, we can set API key for\n", + "[LangSmith tracing](https://smith.langchain.com/), which will give us\n", + "best-in-class observability." + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "{\n", - " name: 'test',\n", - " path: [\n", - " 'grandparent',\n", - " 'parent',\n", - " 'grandparent',\n", - " 'parent',\n", - " 'child_start',\n", - " 'child_middle',\n", - " 'child_end',\n", - " 'sibling',\n", - " 'fin'\n", - " ]\n", - "}\n" - ] - } - ], - "source": [ - "import { END, START, StateGraph, Annotation } from \"@langchain/langgraph\";\n", - "\n", - "function reduceList(\n", - " left?: string[] | string,\n", - " right?: string[] | string,\n", - "): string[] {\n", - " if (!left) {\n", - " left = [];\n", - " } else if (typeof left === \"string\") {\n", - " left = [left];\n", - " }\n", - " if (!right) {\n", - " right = [];\n", - " } else if (typeof right === \"string\") {\n", - " right = [right];\n", - " }\n", - " return [...left, ...right];\n", - "}\n", - "\n", - "const GraphState = Annotation.Root({\n", - " name: Annotation({\n", - " // Overwrite name if a new one is provided\n", - " reducer: (x, y) => y ?? 
x,\n", - " default: () => \"default\",\n", - " }),\n", - " path: Annotation({\n", - " reducer: reduceList,\n", - " }),\n", - "});\n", - "\n", - "const childBuilder = new StateGraph(GraphState)\n", - " .addNode(\"child_start\", (_state) => ({ path: [\"child_start\"] }))\n", - " .addEdge(START, \"child_start\")\n", - " .addNode(\"child_middle\", (_state) => ({ path: [\"child_middle\"] }))\n", - " .addNode(\"child_end\", (_state) => ({ path: [\"child_end\"] }))\n", - " .addEdge(\"child_start\", \"child_middle\")\n", - " .addEdge(\"child_middle\", \"child_end\")\n", - " .addEdge(\"child_end\", END);\n", - "\n", - "const builder = new StateGraph(GraphState)\n", - " .addNode(\"grandparent\", (_state) => ({ path: [\"grandparent\"] }))\n", - " .addEdge(START, \"grandparent\")\n", - " .addNode(\"parent\", (_state) => ({ path: [\"parent\"] }))\n", - " .addNode(\"child\", childBuilder.compile())\n", - " .addNode(\"sibling\", (_state) => ({ path: [\"sibling\"] }))\n", - " .addNode(\"fin\", (_state) => ({ path: [\"fin\"] }))\n", - " // Add connections\n", - " .addEdge(\"grandparent\", \"parent\")\n", - " .addEdge(\"parent\", \"child\")\n", - " .addEdge(\"parent\", \"sibling\")\n", - " .addEdge(\"child\", \"fin\")\n", - " .addEdge(\"sibling\", \"fin\")\n", - " .addEdge(\"fin\", END);\n", - "\n", - "const graph = builder.compile();\n", - "\n", - "const result1 = await graph.invoke({ name: \"test\" });\n", - "console.log(result1);" - ] - }, - { - "cell_type": "markdown", - "id": "9f3f4e12", - "metadata": {}, - "source": [ - "Notice here that the `[\"grandparent\", \"parent\"]` sequence is duplicated! This is\n", - "because our child state has received the full parent state and returns the full\n", - "parent state once it terminates. 
In the next section, we will show how you can\n", - "merge or separate state within nested graphs.\n", - "\n", - "## State handoff\n", - "\n", - "To avoid duplication or conflicts in state, you typically would do one or more\n", - "of the following:\n", - "\n", - "1. Handle duplicates in your `reducer` function.\n", - "2. Call the child graph from within a TypeScript function. In that function,\n", - " handle the state as needed.\n", - "3. Update the child graph keys to avoid conflicts. You would still need to\n", - " ensure the output can be interpreted by the parent, however.\n", - "\n", - "Let's re-implement the graph using technique (1) and add unique IDs for every\n", - "value in the list." - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "id": "34a51908", - "metadata": {}, - "outputs": [ + "cell_type": "code", + "execution_count": 1, + "id": "d8624447", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Configuration: LangGraphJS\n" + ] + } + ], + "source": [ + "// process.env.OPENAI_API_KEY = \"sk_...\";\n", + "\n", + "// Optional, add tracing in LangSmith\n", + "// process.env.LANGCHAIN_API_KEY = \"ls__...\";\n", + "// process.env.LANGCHAIN_CALLBACKS_BACKGROUND = \"true\";\n", + "process.env.LANGCHAIN_TRACING_V2 = \"true\";\n", + "process.env.LANGCHAIN_PROJECT = \"Configuration: LangGraphJS\";" + ] + }, + { + "cell_type": "markdown", + "id": "f7c8afef", + "metadata": {}, + "source": [ + "## Create Parent + Child Graphs\n", + "\n", + "For this example, we will create two graphs: a parent graph with a few nodes,\n", + "and a child graph that is added as a node in the parent.\n", + "\n", + "For this example we will use the same `State` in both graphs, though we will\n", + "show how using the same keys can be a stumbling block if you're not careful." 
+ ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "{\n", - " name: 'test',\n", - " path: [\n", - " { val: 'grandparent', id: 'd0cb05ae-85d7-4cf6-adf3-b78259509cb4' },\n", - " { val: 'parent', id: 'b5adf2c8-d70b-4f5a-b87c-c2ae09a5d046' },\n", - " { val: 'child_start', id: '57ac546a-a709-4a7d-bd17-90f1a8d4d338' },\n", - " { val: 'child_middle', id: '237b9419-dd07-4602-8abb-4b959228e2a2' },\n", - " { val: 'child_end', id: 'c02960fa-0aff-4f21-a35b-07eb8870bd90' },\n", - " { val: 'sibling', id: 'b097055f-530c-47c7-b704-245f2b2edfcc' },\n", - " { val: 'fin', id: '4ca6eae6-f265-4780-ae1c-c854bf7939dc' }\n", - " ]\n", - "}\n" - ] + "cell_type": "code", + "execution_count": 2, + "id": "38d1f06f", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{\n", + " name: 'test',\n", + " path: [\n", + " 'grandparent',\n", + " 'parent',\n", + " 'grandparent',\n", + " 'parent',\n", + " 'child_start',\n", + " 'child_middle',\n", + " 'child_end',\n", + " 'sibling',\n", + " 'fin'\n", + " ]\n", + "}\n" + ] + } + ], + "source": [ + "import { END, START, StateGraph, Annotation } from \"@langchain/langgraph\";\n", + "\n", + "function reduceList(\n", + " left?: string[] | string,\n", + " right?: string[] | string,\n", + "): string[] {\n", + " if (!left) {\n", + " left = [];\n", + " } else if (typeof left === \"string\") {\n", + " left = [left];\n", + " }\n", + " if (!right) {\n", + " right = [];\n", + " } else if (typeof right === \"string\") {\n", + " right = [right];\n", + " }\n", + " return [...left, ...right];\n", + "}\n", + "\n", + "const GraphState = Annotation.Root({\n", + " name: Annotation({\n", + " // Overwrite name if a new one is provided\n", + " reducer: (x, y) => y ?? 
x,\n", + " default: () => \"default\",\n", + " }),\n", + " path: Annotation({\n", + " reducer: reduceList,\n", + " }),\n", + "});\n", + "\n", + "const childBuilder = new StateGraph(GraphState)\n", + " .addNode(\"child_start\", (_state) => ({ path: [\"child_start\"] }))\n", + " .addEdge(START, \"child_start\")\n", + " .addNode(\"child_middle\", (_state) => ({ path: [\"child_middle\"] }))\n", + " .addNode(\"child_end\", (_state) => ({ path: [\"child_end\"] }))\n", + " .addEdge(\"child_start\", \"child_middle\")\n", + " .addEdge(\"child_middle\", \"child_end\")\n", + " .addEdge(\"child_end\", END);\n", + "\n", + "const builder = new StateGraph(GraphState)\n", + " .addNode(\"grandparent\", (_state) => ({ path: [\"grandparent\"] }))\n", + " .addEdge(START, \"grandparent\")\n", + " .addNode(\"parent\", (_state) => ({ path: [\"parent\"] }))\n", + " .addNode(\"child\", childBuilder.compile())\n", + " .addNode(\"sibling\", (_state) => ({ path: [\"sibling\"] }))\n", + " .addNode(\"fin\", (_state) => ({ path: [\"fin\"] }))\n", + " // Add connections\n", + " .addEdge(\"grandparent\", \"parent\")\n", + " .addEdge(\"parent\", \"child\")\n", + " .addEdge(\"parent\", \"sibling\")\n", + " .addEdge(\"child\", \"fin\")\n", + " .addEdge(\"sibling\", \"fin\")\n", + " .addEdge(\"fin\", END);\n", + "\n", + "const graph = builder.compile();\n", + "\n", + "const result1 = await graph.invoke({ name: \"test\" });\n", + "console.log(result1);" + ] + }, + { + "cell_type": "markdown", + "id": "9f3f4e12", + "metadata": {}, + "source": [ + "Notice here that the `[\"grandparent\", \"parent\"]` sequence is duplicated! This is\n", + "because our child state has received the full parent state and returns the full\n", + "parent state once it terminates. 
In the next section, we will show how you can\n", + "merge or separate state within nested graphs.\n", + "\n", + "## State handoff\n", + "\n", + "To avoid duplication or conflicts in state, you typically would do one or more\n", + "of the following:\n", + "\n", + "1. Handle duplicates in your `reducer` function.\n", + "2. Call the child graph from within a TypeScript function. In that function,\n", + " handle the state as needed.\n", + "3. Update the child graph keys to avoid conflicts. You would still need to\n", + " ensure the output can be interpreted by the parent, however.\n", + "\n", + "Let's re-implement the graph using technique (1) and add unique IDs for every\n", + "value in the list." + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "id": "34a51908", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{\n", + " name: 'test',\n", + " path: [\n", + " { val: 'grandparent', id: 'd0cb05ae-85d7-4cf6-adf3-b78259509cb4' },\n", + " { val: 'parent', id: 'b5adf2c8-d70b-4f5a-b87c-c2ae09a5d046' },\n", + " { val: 'child_start', id: '57ac546a-a709-4a7d-bd17-90f1a8d4d338' },\n", + " { val: 'child_middle', id: '237b9419-dd07-4602-8abb-4b959228e2a2' },\n", + " { val: 'child_end', id: 'c02960fa-0aff-4f21-a35b-07eb8870bd90' },\n", + " { val: 'sibling', id: 'b097055f-530c-47c7-b704-245f2b2edfcc' },\n", + " { val: 'fin', id: '4ca6eae6-f265-4780-ae1c-c854bf7939dc' }\n", + " ]\n", + "}\n" + ] + } + ], + "source": [ + "import { v4 as uuidv4 } from \"uuid\";\n", + "\n", + "type ValWithId = { id?: string; val: string };\n", + "\n", + "function reduceListWithIds(\n", + " left?: ValWithId[] | ValWithId,\n", + " right?: ValWithId[] | ValWithId,\n", + "): any[] {\n", + " /**\n", + " * Append the right-hand list, replacing any elements with the same id in the left-hand list.\n", + " */\n", + " if (!left) {\n", + " left = [];\n", + " } else if (!Array.isArray(left)) {\n", + " left = [left];\n", + " }\n", + " if (!right) {\n", + 
" right = [];\n", + " } else if (!Array.isArray(right)) {\n", + " right = [right];\n", + " }\n", + " // Ensure there's an id for each element\n", + " const [left_, right_] = [left, right].map((orig) =>\n", + " orig.map((val) => {\n", + " if (!val?.id) {\n", + " val.id = uuidv4();\n", + " }\n", + " return val;\n", + " })\n", + " );\n", + "\n", + " // Merge the two lists\n", + " const leftIdxById = left_.reduce(\n", + " (acc, val, i) => ({ ...acc, [val.id as string]: i }),\n", + " {} as Record,\n", + " );\n", + " const merged = [...left_];\n", + " for (const val of right_) {\n", + " const existingIdx = leftIdxById[val.id as string];\n", + " if (existingIdx !== undefined) {\n", + " merged[existingIdx] = val;\n", + " } else {\n", + " merged.push(val);\n", + " }\n", + " }\n", + " return merged;\n", + "}\n", + "\n", + "const GraphStateWithIds = Annotation.Root({\n", + " name: Annotation({\n", + " // Overwrite name if a new one is provided\n", + " reducer: (x, y) => y ?? x,\n", + " default: () => \"default\",\n", + " }),\n", + " path: Annotation({\n", + " reducer: reduceListWithIds,\n", + " }),\n", + "});\n", + "\n", + "const childBuilderWithIds = new StateGraph(GraphStateWithIds)\n", + " .addNode(\"child_start\", (_state) => ({\n", + " path: [{ val: \"child_start\" }],\n", + " }))\n", + " .addEdge(START, \"child_start\")\n", + " .addNode(\"child_middle\", (_state) => ({\n", + " path: [{ val: \"child_middle\" }],\n", + " }))\n", + " .addNode(\"child_end\", (_state) => ({\n", + " path: [{ val: \"child_end\" }],\n", + " }))\n", + " .addEdge(\"child_start\", \"child_middle\")\n", + " .addEdge(\"child_middle\", \"child_end\")\n", + " .addEdge(\"child_end\", END);\n", + "\n", + "const builderWithIds = new StateGraph(GraphStateWithIds)\n", + " .addNode(\"grandparent\", (_state) => ({\n", + " path: [{ val: \"grandparent\" }],\n", + " }))\n", + " .addEdge(START, \"grandparent\")\n", + " .addNode(\"parent\", (_state) => ({ path: [{ val: \"parent\" }] }))\n", + " 
.addNode(\"child\", childBuilderWithIds.compile())\n", + " .addNode(\"sibling\", (_state) => ({ path: [{ val: \"sibling\" }] }))\n", + " .addNode(\"fin\", (_state) => ({ path: [{ val: \"fin\" }] }))\n", + " // Add connections\n", + " .addEdge(\"grandparent\", \"parent\")\n", + " .addEdge(\"parent\", \"child\")\n", + " .addEdge(\"parent\", \"sibling\")\n", + " .addEdge(\"child\", \"fin\")\n", + " .addEdge(\"sibling\", \"fin\")\n", + " .addEdge(\"fin\", END);\n", + "\n", + "const graphWithIds = builderWithIds.compile();\n", + "\n", + "const result2 = await graphWithIds.invoke({ name: \"test\" });\n", + "console.log(result2);" + ] + }, + { + "cell_type": "markdown", + "id": "5dad1a18", + "metadata": {}, + "source": [ + "Duplicates are gone!" + ] + }, + { + "cell_type": "markdown", + "id": "b1b7f4e1", + "metadata": {}, + "source": [ + "```\n", + "```" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "TypeScript", + "language": "typescript", + "name": "tslab" + }, + "language_info": { + "codemirror_mode": { + "mode": "typescript", + "name": "javascript", + "typescript": true + }, + "file_extension": ".ts", + "mimetype": "text/typescript", + "name": "typescript", + "version": "3.7.2" } - ], - "source": [ - "import { v4 as uuidv4 } from \"uuid\";\n", - "\n", - "type ValWithId = { id?: string; val: string };\n", - "\n", - "function reduceListWithIds(\n", - " left?: ValWithId[] | ValWithId,\n", - " right?: ValWithId[] | ValWithId,\n", - "): any[] {\n", - " /**\n", - " * Append the right-hand list, replacing any elements with the same id in the left-hand list.\n", - " */\n", - " if (!left) {\n", - " left = [];\n", - " } else if (!Array.isArray(left)) {\n", - " left = [left];\n", - " }\n", - " if (!right) {\n", - " right = [];\n", - " } else if (!Array.isArray(right)) {\n", - " right = [right];\n", - " }\n", - " // Ensure there's an id for each element\n", - " const [left_, right_] = [left, right].map((orig) =>\n", - " orig.map((val) => {\n", - " if (!val?.id) 
{\n", - " val.id = uuidv4();\n", - " }\n", - " return val;\n", - " })\n", - " );\n", - "\n", - " // Merge the two lists\n", - " const leftIdxById = left_.reduce(\n", - " (acc, val, i) => ({ ...acc, [val.id as string]: i }),\n", - " {} as Record,\n", - " );\n", - " const merged = [...left_];\n", - " for (const val of right_) {\n", - " const existingIdx = leftIdxById[val.id as string];\n", - " if (existingIdx !== undefined) {\n", - " merged[existingIdx] = val;\n", - " } else {\n", - " merged.push(val);\n", - " }\n", - " }\n", - " return merged;\n", - "}\n", - "\n", - "const GraphStateWithIds = Annotation.Root({\n", - " name: Annotation({\n", - " // Overwrite name if a new one is provided\n", - " reducer: (x, y) => y ?? x,\n", - " default: () => \"default\",\n", - " }),\n", - " path: Annotation({\n", - " reducer: reduceListWithIds,\n", - " }),\n", - "});\n", - "\n", - "const childBuilderWithIds = new StateGraph(GraphStateWithIds)\n", - " .addNode(\"child_start\", (_state) => ({\n", - " path: [{ val: \"child_start\" }],\n", - " }))\n", - " .addEdge(START, \"child_start\")\n", - " .addNode(\"child_middle\", (_state) => ({\n", - " path: [{ val: \"child_middle\" }],\n", - " }))\n", - " .addNode(\"child_end\", (_state) => ({\n", - " path: [{ val: \"child_end\" }],\n", - " }))\n", - " .addEdge(\"child_start\", \"child_middle\")\n", - " .addEdge(\"child_middle\", \"child_end\")\n", - " .addEdge(\"child_end\", END);\n", - "\n", - "const builderWithIds = new StateGraph(GraphStateWithIds)\n", - " .addNode(\"grandparent\", (_state) => ({\n", - " path: [{ val: \"grandparent\" }],\n", - " }))\n", - " .addEdge(START, \"grandparent\")\n", - " .addNode(\"parent\", (_state) => ({ path: [{ val: \"parent\" }] }))\n", - " .addNode(\"child\", childBuilderWithIds.compile())\n", - " .addNode(\"sibling\", (_state) => ({ path: [{ val: \"sibling\" }] }))\n", - " .addNode(\"fin\", (_state) => ({ path: [{ val: \"fin\" }] }))\n", - " // Add connections\n", - " .addEdge(\"grandparent\", 
\"parent\")\n", - " .addEdge(\"parent\", \"child\")\n", - " .addEdge(\"parent\", \"sibling\")\n", - " .addEdge(\"child\", \"fin\")\n", - " .addEdge(\"sibling\", \"fin\")\n", - " .addEdge(\"fin\", END);\n", - "\n", - "const graphWithIds = builderWithIds.compile();\n", - "\n", - "const result2 = await graphWithIds.invoke({ name: \"test\" });\n", - "console.log(result2);" - ] - }, - { - "cell_type": "markdown", - "id": "5dad1a18", - "metadata": {}, - "source": [ - "Duplicates are gone!" - ] - }, - { - "cell_type": "markdown", - "id": "b1b7f4e1", - "metadata": {}, - "source": [ - "```\n", - "```" - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "TypeScript", - "language": "typescript", - "name": "tslab" }, - "language_info": { - "codemirror_mode": { - "mode": "typescript", - "name": "javascript", - "typescript": true - }, - "file_extension": ".ts", - "mimetype": "text/typescript", - "name": "typescript", - "version": "3.7.2" - } - }, - "nbformat": 4, - "nbformat_minor": 5 -} + "nbformat": 4, + "nbformat_minor": 5 +} \ No newline at end of file diff --git a/examples/how-tos/time-travel.ipynb b/examples/how-tos/time-travel.ipynb index 23ba27fa8..6de9a7d44 100644 --- a/examples/how-tos/time-travel.ipynb +++ b/examples/how-tos/time-travel.ipynb @@ -1,1127 +1,1127 @@ { - "cells": [ - { - "cell_type": "markdown", - "id": "90616e9e", - "metadata": {}, - "source": [ - "# How to view and update past graph state\n", - "\n", - "Once you start [checkpointing](./persistence.ipynb) your graphs, you can easily\n", - "**get** or **update** the state of the agent at any point in time. This permits\n", - "a few things:\n", - "\n", - "1. You can surface a state during an interrupt to a user to let them accept an\n", - " action.\n", - "2. You can **rewind** the graph to reproduce or avoid issues.\n", - "3. 
You can **modify** the state to embed your agent into a larger system, or to\n", - " let the user better control its actions.\n", - "\n", - "The key methods used for this functionality are:\n", - "\n", - "- [getState](https://langchain-ai.github.io/langgraphjs/reference/classes/pregel.Pregel.html#getState):\n", - " fetch the values from the target config\n", - "- [updateState](https://langchain-ai.github.io/langgraphjs/reference/classes/pregel.Pregel.html#updateState):\n", - " apply the given values to the target state\n", - "\n", - "**Note:** this requires passing in a checkpointer.\n", - "\n", - "\n", - "\n", - "This works for\n", - "[StateGraph](https://langchain-ai.github.io/langgraphjs/reference/classes/index.StateGraph.html)\n", - "and all its subclasses, such as\n", - "[MessageGraph](https://langchain-ai.github.io/langgraphjs/reference/classes/index.MessageGraph.html).\n", - "\n", - "Below is an example.\n", - "\n", - "
\n", - "

Note

\n", - "

\n", - " In this how-to, we will create our agent from scratch to be transparent (but verbose). You can accomplish similar functionality using the createReactAgent(model, tools=tool, checkpointer=checkpointer) (API doc) constructor. This may be more appropriate if you are used to LangChain's AgentExecutor class.\n", - "

\n", - "
\n", - "\n", - "## Setup\n", - "\n", - "This guide will use OpenAI's GPT-4o model. We will optionally set our API key\n", - "for [LangSmith tracing](https://smith.langchain.com/), which will give us\n", - "best-in-class observability." - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "id": "9a7df1d0", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [ + "cells": [ { - "name": "stdout", - "output_type": "stream", - "text": [ - "Time Travel: LangGraphJS\n" - ] - } - ], - "source": [ - "// process.env.OPENAI_API_KEY = \"sk_...\";\n", - "\n", - "// Optional, add tracing in LangSmith\n", - "// process.env.LANGCHAIN_API_KEY = \"ls__...\";\n", - "process.env.LANGCHAIN_CALLBACKS_BACKGROUND = \"true\";\n", - "process.env.LANGCHAIN_TRACING_V2 = \"true\";\n", - "process.env.LANGCHAIN_PROJECT = \"Time Travel: LangGraphJS\";" - ] - }, - { - "cell_type": "markdown", - "id": "e79ba1c0", - "metadata": {}, - "source": [ - "## Define the state\n", - "\n", - "The state is the interface for all of the nodes in our graph.\n" - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "id": "44968352", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "import { Annotation } from \"@langchain/langgraph\";\n", - "import { BaseMessage } from \"@langchain/core/messages\";\n", - "\n", - "const GraphState = Annotation.Root({\n", - " messages: Annotation({\n", - " reducer: (x, y) => x.concat(y),\n", - " }),\n", - "});" - ] - }, - { - "cell_type": "markdown", - "id": "47c88187", - "metadata": {}, - "source": [ - "## Set up the tools\n", - "\n", - "We will first define the tools we want to use. For this simple example, we will\n", - "use create a placeholder search engine. 
However, it is really easy to create\n", - "your own tools - see documentation\n", - "[here](https://js.langchain.com/v0.2/docs/how_to/custom_tools) on how to do\n", - "that.\n" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "id": "b22edfc4", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "import { DynamicStructuredTool } from \"@langchain/core/tools\";\n", - "import { z } from \"zod\";\n", - "\n", - "const searchTool = new DynamicStructuredTool({\n", - " name: \"search\",\n", - " description:\n", - " \"Use to surf the web, fetch current information, check the weather, and retrieve other information.\",\n", - " schema: z.object({\n", - " query: z.string().describe(\"The query to use in your search.\"),\n", - " }),\n", - " func: async (_) => {\n", - " // This is a placeholder for the actual implementation\n", - " return \"Cold, with a low of 13 ℃\";\n", - " },\n", - "});\n", - "\n", - "await searchTool.invoke({ query: \"What's the weather like?\" });\n", - "\n", - "const tools = [searchTool];" - ] - }, - { - "cell_type": "markdown", - "id": "7c764430", - "metadata": {}, - "source": [ - "We can now wrap these tools in a simple\n", - "[ToolNode](https://langchain-ai.github.io/langgraphjs/reference/classes/prebuilt.ToolNode.html).\n", - "This object will actually run the tools (functions) whenever they are invoked by\n", - "our LLM.\n" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "id": "0cc63f1f", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "import { ToolNode } from \"@langchain/langgraph/prebuilt\";\n", - "\n", - "const toolNode = new ToolNode(tools);" - ] - }, - { - "cell_type": "markdown", - "id": "cc409cd5", - "metadata": {}, - "source": [ - "## Set up the model\n", - "\n", - "Now we will load the\n", - "[chat model](https://js.langchain.com/v0.2/docs/concepts/#chat-models).\n", - "\n", - "1. It should work with messages. 
We will represent all agent state in the form\n", - " of messages, so it needs to be able to work well with them.\n", - "2. It should work with\n", - " [tool calling](https://js.langchain.com/v0.2/docs/how_to/tool_calling/#passing-tools-to-llms),\n", - " meaning it can return function arguments in its response.\n", - "\n", - "
\n", - "

Note

\n", - "

\n", - " These model requirements are not general requirements for using LangGraph - they are just requirements for this one example.\n", - "

\n", - "
" - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "id": "dae9ab9c", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "import { ChatOpenAI } from \"@langchain/openai\";\n", - "\n", - "const model = new ChatOpenAI({ model: \"gpt-4o\" });" - ] - }, - { - "cell_type": "markdown", - "id": "b5cfd558", - "metadata": {}, - "source": [ - "After we've done this, we should make sure the model knows that it has these\n", - "tools available to call. We can do this by calling\n", - "[bindTools](https://v01.api.js.langchain.com/classes/langchain_core_language_models_chat_models.BaseChatModel.html#bindTools).\n" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "id": "ca438e74", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "const boundModel = model.bindTools(tools);" - ] - }, - { - "cell_type": "markdown", - "id": "4a2b8a4f", - "metadata": {}, - "source": [ - "## Define the graph\n", - "\n", - "We can now put it all together. Time travel requires a checkpointer to save the\n", - "state - otherwise you wouldn't have anything go `get` or `update`. We will use\n", - "the\n", - "[MemorySaver](https://langchain-ai.github.io/langgraphjs/reference/classes/index.MemorySaver.html),\n", - "which \"saves\" checkpoints in-memory." 
- ] - }, - { - "cell_type": "code", - "execution_count": 6, - "id": "1a29ec2a", - "metadata": {}, - "outputs": [], - "source": [ - "import { END, START, StateGraph } from \"@langchain/langgraph\";\n", - "import { AIMessage } from \"@langchain/core/messages\";\n", - "import { RunnableConfig } from \"@langchain/core/runnables\";\n", - "import { MemorySaver } from \"@langchain/langgraph\";\n", - "\n", - "const routeMessage = (state: typeof GraphState.State) => {\n", - " const { messages } = state;\n", - " const lastMessage = messages[messages.length - 1] as AIMessage;\n", - " // If no tools are called, we can finish (respond to the user)\n", - " if (!lastMessage?.tool_calls?.length) {\n", - " return END;\n", - " }\n", - " // Otherwise if there is, we continue and call the tools\n", - " return \"tools\";\n", - "};\n", - "\n", - "const callModel = async (\n", - " state: typeof GraphState.State,\n", - " config?: RunnableConfig,\n", - ") => {\n", - " const { messages } = state;\n", - " const response = await boundModel.invoke(messages, config);\n", - " return { messages: [response] };\n", - "};\n", - "\n", - "const workflow = new StateGraph(GraphState)\n", - " .addNode(\"agent\", callModel)\n", - " .addNode(\"tools\", toolNode)\n", - " .addEdge(START, \"agent\")\n", - " .addConditionalEdges(\"agent\", routeMessage)\n", - " .addEdge(\"tools\", \"agent\");\n", - "\n", - "// Here we only save in-memory\n", - "let memory = new MemorySaver();\n", - "const graph = workflow.compile({ checkpointer: memory });" - ] - }, - { - "cell_type": "markdown", - "id": "a6dd42a3", - "metadata": {}, - "source": [ - "## Interacting with the Agent\n", - "\n", - "We can now interact with the agent. Between interactions you can get and update\n", - "state." 
- ] - }, - { - "cell_type": "code", - "execution_count": 7, - "id": "0749329a", - "metadata": {}, - "outputs": [ + "cell_type": "markdown", + "id": "90616e9e", + "metadata": {}, + "source": [ + "# How to view and update past graph state\n", + "\n", + "Once you start [checkpointing](./persistence.ipynb) your graphs, you can easily\n", + "**get** or **update** the state of the agent at any point in time. This permits\n", + "a few things:\n", + "\n", + "1. You can surface a state during an interrupt to a user to let them accept an\n", + " action.\n", + "2. You can **rewind** the graph to reproduce or avoid issues.\n", + "3. You can **modify** the state to embed your agent into a larger system, or to\n", + " let the user better control its actions.\n", + "\n", + "The key methods used for this functionality are:\n", + "\n", + "- [getState](https://langchain-ai.github.io/langgraphjs/reference/classes/langgraph_pregel.Pregel.html#getState):\n", + " fetch the values from the target config\n", + "- [updateState](https://langchain-ai.github.io/langgraphjs/reference/classes/langgraph_pregel.Pregel.html#updateState):\n", + " apply the given values to the target state\n", + "\n", + "**Note:** this requires passing in a checkpointer.\n", + "\n", + "\n", + "\n", + "This works for\n", + "[StateGraph](https://langchain-ai.github.io/langgraphjs/reference/classes/langgraph.StateGraph.html)\n", + "and all its subclasses, such as\n", + "[MessageGraph](https://langchain-ai.github.io/langgraphjs/reference/classes/langgraph.MessageGraph.html).\n", + "\n", + "Below is an example.\n", + "\n", + "
\n", + "

Note

\n", + "

\n", + " In this how-to, we will create our agent from scratch to be transparent (but verbose). You can accomplish similar functionality using the createReactAgent(model, tools=tool, checkpointer=checkpointer) (API doc) constructor. This may be more appropriate if you are used to LangChain's AgentExecutor class.\n", + "

\n", + "
\n", + "\n", + "## Setup\n", + "\n", + "This guide will use OpenAI's GPT-4o model. We will optionally set our API key\n", + "for [LangSmith tracing](https://smith.langchain.com/), which will give us\n", + "best-in-class observability." + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "[ 'user', \"Hi I'm Jo.\" ]\n", - "-----\n", - "\n", - "Hello Jo! How can I assist you today?\n", - "-----\n", - "\n" - ] - } - ], - "source": [ - "let config = { configurable: { thread_id: \"conversation-num-1\" } };\n", - "let inputs = { messages: [[\"user\", \"Hi I'm Jo.\"]] };\n", - "for await (\n", - " const { messages } of await graph.stream(inputs, {\n", - " ...config,\n", - " streamMode: \"values\",\n", - " })\n", - ") {\n", - " let msg = messages[messages?.length - 1];\n", - " if (msg?.content) {\n", - " console.log(msg.content);\n", - " } else if (msg?.tool_calls?.length > 0) {\n", - " console.log(msg.tool_calls);\n", - " } else {\n", - " console.log(msg);\n", - " }\n", - " console.log(\"-----\\n\");\n", - "}" - ] - }, - { - "cell_type": "markdown", - "id": "221f323d", - "metadata": {}, - "source": [ - "See LangSmith example run here\n", - "https://smith.langchain.com/public/b3feb09b-bcd2-4ad5-ad1d-414106148448/r\n", - "\n", - "Here you can see the \"agent\" node ran, and then our edge returned `__end__` so\n", - "the graph stopped execution there.\n", - "\n", - "Let's check the current graph state." 
- ] - }, - { - "cell_type": "code", - "execution_count": 8, - "id": "6ff5468d", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [ + "cell_type": "code", + "execution_count": 1, + "id": "9a7df1d0", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Time Travel: LangGraphJS\n" + ] + } + ], + "source": [ + "// process.env.OPENAI_API_KEY = \"sk_...\";\n", + "\n", + "// Optional, add tracing in LangSmith\n", + "// process.env.LANGCHAIN_API_KEY = \"ls__...\";\n", + "process.env.LANGCHAIN_CALLBACKS_BACKGROUND = \"true\";\n", + "process.env.LANGCHAIN_TRACING_V2 = \"true\";\n", + "process.env.LANGCHAIN_PROJECT = \"Time Travel: LangGraphJS\";" + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "{\n", - " messages: [\n", - " [ 'user', \"Hi I'm Jo.\" ],\n", - " AIMessage {\n", - " \"id\": \"chatcmpl-9y6TlYVbfL3d3VonkF1b3iXwnbdFm\",\n", - " \"content\": \"Hello Jo! How can I assist you today?\",\n", - " \"additional_kwargs\": {},\n", - " \"response_metadata\": {\n", - " \"tokenUsage\": {\n", - " \"completionTokens\": 11,\n", - " \"promptTokens\": 68,\n", - " \"totalTokens\": 79\n", - " },\n", - " \"finish_reason\": \"stop\",\n", - " \"system_fingerprint\": \"fp_3aa7262c27\"\n", - " },\n", - " \"tool_calls\": [],\n", - " \"invalid_tool_calls\": []\n", - " }\n", - " ]\n", - "}\n" - ] - } - ], - "source": [ - "let checkpoint = await graph.getState(config);\n", - "checkpoint.values;" - ] - }, - { - "cell_type": "markdown", - "id": "571077e2", - "metadata": {}, - "source": [ - "The current state is the two messages we've seen above, 1. the HumanMessage we\n", - "sent in, 2. the AIMessage we got back from the model.\n", - "\n", - "The `next` values are empty since the graph has terminated (transitioned to the\n", - "`__end__`)." 
- ] - }, - { - "cell_type": "code", - "execution_count": 9, - "id": "22b25946", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [ + "cell_type": "markdown", + "id": "e79ba1c0", + "metadata": {}, + "source": [ + "## Define the state\n", + "\n", + "The state is the interface for all of the nodes in our graph.\n" + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "[]\n" - ] - } - ], - "source": [ - "checkpoint.next;" - ] - }, - { - "cell_type": "markdown", - "id": "889cd8ce", - "metadata": {}, - "source": [ - "## Let's get it to execute a tool\n", - "\n", - "When we call the graph again, it will create a checkpoint after each internal\n", - "execution step. Let's get it to run a tool, then look at the checkpoint." - ] - }, - { - "cell_type": "code", - "execution_count": 10, - "id": "873b3438", - "metadata": {}, - "outputs": [ + "cell_type": "code", + "execution_count": 1, + "id": "44968352", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "import { Annotation } from \"@langchain/langgraph\";\n", + "import { BaseMessage } from \"@langchain/core/messages\";\n", + "\n", + "const GraphState = Annotation.Root({\n", + " messages: Annotation({\n", + " reducer: (x, y) => x.concat(y),\n", + " }),\n", + "});" + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "[ 'user', \"What's the weather like in SF currently?\" ]\n", - "-----\n", - "\n", - "[\n", - " {\n", - " name: 'search',\n", - " args: { query: 'current weather in San Francisco' },\n", - " type: 'tool_call',\n", - " id: 'call_IBDK50kVnVq2RtDjbpq0UiTA'\n", - " }\n", - "]\n", - "-----\n", - "\n", - "Cold, with a low of 13 ℃\n", - "-----\n", - "\n", - "The current weather in San Francisco is cold, with a low temperature of 13°C (55°F). 
Is there anything else you would like to know?\n", - "-----\n", - "\n" - ] - } - ], - "source": [ - "inputs = { messages: [[\"user\", \"What's the weather like in SF currently?\"]] };\n", - "for await (\n", - " const { messages } of await graph.stream(inputs, {\n", - " ...config,\n", - " streamMode: \"values\",\n", - " })\n", - ") {\n", - " let msg = messages[messages?.length - 1];\n", - " if (msg?.content) {\n", - " console.log(msg.content);\n", - " } else if (msg?.tool_calls?.length > 0) {\n", - " console.log(msg.tool_calls);\n", - " } else {\n", - " console.log(msg);\n", - " }\n", - " console.log(\"-----\\n\");\n", - "}" - ] - }, - { - "cell_type": "markdown", - "id": "6384c1e3", - "metadata": {}, - "source": [ - "See the trace of the above execution here:\n", - "https://smith.langchain.com/public/0ef426fd-0da1-4c02-a50b-64ae1e68338e/r We can\n", - "see it planned the tool execution (ie the \"agent\" node), then \"should_continue\"\n", - "edge returned \"continue\" so we proceeded to \"action\" node, which executed the\n", - "tool, and then \"agent\" node emitted the final response, which made\n", - "\"should_continue\" edge return \"end\". Let's see how we can have more control over\n", - "this." - ] - }, - { - "cell_type": "markdown", - "id": "3a3fe0ce", - "metadata": {}, - "source": [ - "### Pause before tools\n", - "\n", - "If you notice below, we now will add `interruptBefore=[\"action\"]` - this means\n", - "that before any actions are taken we pause. This is a great moment to allow the\n", - "user to correct and update the state! This is very useful when you want to have\n", - "a human-in-the-loop to validate (and potentially change) the action to take." - ] - }, - { - "cell_type": "code", - "execution_count": 11, - "id": "736be42e", - "metadata": {}, - "outputs": [ + "cell_type": "markdown", + "id": "47c88187", + "metadata": {}, + "source": [ + "## Set up the tools\n", + "\n", + "We will first define the tools we want to use. 
For this simple example, we will\n", + "use create a placeholder search engine. However, it is really easy to create\n", + "your own tools - see documentation\n", + "[here](https://js.langchain.com/v0.2/docs/how_to/custom_tools) on how to do\n", + "that.\n" + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "[ 'user', \"What's the weather like in SF currently?\" ]\n", - "-----\n", - "\n", - "[\n", - " {\n", - " name: 'search',\n", - " args: { query: 'current weather in San Francisco, CA' },\n", - " type: 'tool_call',\n", - " id: 'call_upim4LMd1U6JdWlsGGk772Pa'\n", - " }\n", - "]\n", - "-----\n", - "\n" - ] - } - ], - "source": [ - "memory = new MemorySaver();\n", - "const graphWithInterrupt = workflow.compile({\n", - " checkpointer: memory,\n", - " interruptBefore: [\"tools\"],\n", - "});\n", - "\n", - "inputs = { messages: [[\"user\", \"What's the weather like in SF currently?\"]] };\n", - "for await (\n", - " const { messages } of await graphWithInterrupt.stream(inputs, {\n", - " ...config,\n", - " streamMode: \"values\",\n", - " })\n", - ") {\n", - " let msg = messages[messages?.length - 1];\n", - " if (msg?.content) {\n", - " console.log(msg.content);\n", - " } else if (msg?.tool_calls?.length > 0) {\n", - " console.log(msg.tool_calls);\n", - " } else {\n", - " console.log(msg);\n", - " }\n", - " console.log(\"-----\\n\");\n", - "}" - ] - }, - { - "cell_type": "markdown", - "id": "bf27f2b4", - "metadata": {}, - "source": [ - "## Get State\n", - "\n", - "You can fetch the latest graph checkpoint using\n", - "[`getState(config)`](https://langchain-ai.github.io/langgraphjs/reference/classes/pregel.Pregel.html#getState)." 
- ] - }, - { - "cell_type": "code", - "execution_count": 12, - "id": "0f434f69", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [ + "cell_type": "code", + "execution_count": 2, + "id": "b22edfc4", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "import { DynamicStructuredTool } from \"@langchain/core/tools\";\n", + "import { z } from \"zod\";\n", + "\n", + "const searchTool = new DynamicStructuredTool({\n", + " name: \"search\",\n", + " description:\n", + " \"Use to surf the web, fetch current information, check the weather, and retrieve other information.\",\n", + " schema: z.object({\n", + " query: z.string().describe(\"The query to use in your search.\"),\n", + " }),\n", + " func: async (_) => {\n", + " // This is a placeholder for the actual implementation\n", + " return \"Cold, with a low of 13 ℃\";\n", + " },\n", + "});\n", + "\n", + "await searchTool.invoke({ query: \"What's the weather like?\" });\n", + "\n", + "const tools = [searchTool];" + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "[ 'tools' ]\n" - ] - } - ], - "source": [ - "let snapshot = await graphWithInterrupt.getState(config);\n", - "snapshot.next;" - ] - }, - { - "cell_type": "markdown", - "id": "1f78ad8f", - "metadata": {}, - "source": [ - "## Resume\n", - "\n", - "You can resume by running the graph with a `null` input. The checkpoint is\n", - "loaded, and with no new inputs, it will execute as if no interrupt had occurred." 
- ] - }, - { - "cell_type": "code", - "execution_count": 13, - "id": "fd4d7eff", - "metadata": {}, - "outputs": [ + "cell_type": "markdown", + "id": "7c764430", + "metadata": {}, + "source": [ + "We can now wrap these tools in a simple\n", + "[ToolNode](https://langchain-ai.github.io/langgraphjs/reference/classes/prebuilt.ToolNode.html).\n", + "This object will actually run the tools (functions) whenever they are invoked by\n", + "our LLM.\n" + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "Cold, with a low of 13 ℃\n", - "-----\n", - "\n", - "Currently, it's cold in San Francisco, with a temperature around 13°C.\n", - "-----\n", - "\n" - ] - } - ], - "source": [ - "for await (\n", - " const { messages } of await graphWithInterrupt.stream(null, {\n", - " ...snapshot.config,\n", - " streamMode: \"values\",\n", - " })\n", - ") {\n", - " let msg = messages[messages?.length - 1];\n", - " if (msg?.content) {\n", - " console.log(msg.content);\n", - " } else if (msg?.tool_calls?.length > 0) {\n", - " console.log(msg.tool_calls);\n", - " } else {\n", - " console.log(msg);\n", - " }\n", - " console.log(\"-----\\n\");\n", - "}" - ] - }, - { - "cell_type": "markdown", - "id": "2885d91d", - "metadata": {}, - "source": [ - "## Check full history\n", - "\n", - "Let's browse the history of this thread, from newest to oldest.\n" - ] - }, - { - "cell_type": "code", - "execution_count": 14, - "id": "bc7acb70", - "metadata": { - "lines_to_next_cell": 2 - }, - "outputs": [ + "cell_type": "code", + "execution_count": 3, + "id": "0cc63f1f", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "import { ToolNode } from \"@langchain/langgraph/prebuilt\";\n", + "\n", + "const toolNode = new ToolNode(tools);" + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "{\n", - " values: {\n", - " messages: [\n", - " [Array],\n", - " AIMessage {\n", - " \"id\": \"chatcmpl-9y6Tn0RGjUnVqxDHz5CxlGfldPS2E\",\n", - " \"content\": 
\"\",\n", - " \"additional_kwargs\": {\n", - " \"tool_calls\": [\n", - " {\n", - " \"id\": \"call_upim4LMd1U6JdWlsGGk772Pa\",\n", - " \"type\": \"function\",\n", - " \"function\": \"[Object]\"\n", - " }\n", - " ]\n", - " },\n", - " \"response_metadata\": {\n", - " \"tokenUsage\": {\n", - " \"completionTokens\": 19,\n", - " \"promptTokens\": 72,\n", - " \"totalTokens\": 91\n", - " },\n", - " \"finish_reason\": \"tool_calls\",\n", - " \"system_fingerprint\": \"fp_3aa7262c27\"\n", - " },\n", - " \"tool_calls\": [\n", - " {\n", - " \"name\": \"search\",\n", - " \"args\": {\n", - " \"query\": \"current weather in San Francisco, CA\"\n", - " },\n", - " \"type\": \"tool_call\",\n", - " \"id\": \"call_upim4LMd1U6JdWlsGGk772Pa\"\n", - " }\n", - " ],\n", - " \"invalid_tool_calls\": []\n", - " },\n", - " ToolMessage {\n", - " \"content\": \"Cold, with a low of 13 ℃\",\n", - " \"name\": \"search\",\n", - " \"additional_kwargs\": {},\n", - " \"response_metadata\": {},\n", - " \"tool_call_id\": \"call_upim4LMd1U6JdWlsGGk772Pa\"\n", - " },\n", - " AIMessage {\n", - " \"id\": \"chatcmpl-9y6ToC6yczhz1hzn5XMPt6Fha4CLJ\",\n", - " \"content\": \"Currently, it's cold in San Francisco, with a temperature around 13°C.\",\n", - " \"additional_kwargs\": {},\n", - " \"response_metadata\": {\n", - " \"tokenUsage\": {\n", - " \"completionTokens\": 17,\n", - " \"promptTokens\": 107,\n", - " \"totalTokens\": 124\n", - " },\n", - " \"finish_reason\": \"stop\",\n", - " \"system_fingerprint\": \"fp_3aa7262c27\"\n", - " },\n", - " \"tool_calls\": [],\n", - " \"invalid_tool_calls\": []\n", - " }\n", - " ]\n", - " },\n", - " next: [],\n", - " metadata: { source: 'loop', step: 3, writes: { agent: [Object] } },\n", - " config: {\n", - " configurable: {\n", - " thread_id: 'conversation-num-1',\n", - " checkpoint_id: '1ef5e864-0045-68b1-8003-3da747a708d6'\n", - " }\n", - " },\n", - " createdAt: '2024-08-19T23:53:36.443Z',\n", - " parentConfig: undefined\n", - "}\n", - "--\n", - "{\n", - " values: {\n", - 
" messages: [\n", - " [Array],\n", - " AIMessage {\n", - " \"id\": \"chatcmpl-9y6Tn0RGjUnVqxDHz5CxlGfldPS2E\",\n", - " \"content\": \"\",\n", - " \"additional_kwargs\": {\n", - " \"tool_calls\": [\n", - " {\n", - " \"id\": \"call_upim4LMd1U6JdWlsGGk772Pa\",\n", - " \"type\": \"function\",\n", - " \"function\": \"[Object]\"\n", - " }\n", - " ]\n", - " },\n", - " \"response_metadata\": {\n", - " \"tokenUsage\": {\n", - " \"completionTokens\": 19,\n", - " \"promptTokens\": 72,\n", - " \"totalTokens\": 91\n", - " },\n", - " \"finish_reason\": \"tool_calls\",\n", - " \"system_fingerprint\": \"fp_3aa7262c27\"\n", - " },\n", - " \"tool_calls\": [\n", - " {\n", - " \"name\": \"search\",\n", - " \"args\": {\n", - " \"query\": \"current weather in San Francisco, CA\"\n", - " },\n", - " \"type\": \"tool_call\",\n", - " \"id\": \"call_upim4LMd1U6JdWlsGGk772Pa\"\n", - " }\n", - " ],\n", - " \"invalid_tool_calls\": []\n", - " },\n", - " ToolMessage {\n", - " \"content\": \"Cold, with a low of 13 ℃\",\n", - " \"name\": \"search\",\n", - " \"additional_kwargs\": {},\n", - " \"response_metadata\": {},\n", - " \"tool_call_id\": \"call_upim4LMd1U6JdWlsGGk772Pa\"\n", - " }\n", - " ]\n", - " },\n", - " next: [ 'agent' ],\n", - " metadata: { source: 'loop', step: 2, writes: { tools: [Object] } },\n", - " config: {\n", - " configurable: {\n", - " thread_id: 'conversation-num-1',\n", - " checkpoint_id: '1ef5e863-fa1c-6650-8002-bf4528305aac'\n", - " }\n", - " },\n", - " createdAt: '2024-08-19T23:53:35.797Z',\n", - " parentConfig: undefined\n", - "}\n", - "--\n", - "{\n", - " values: {\n", - " messages: [\n", - " [Array],\n", - " AIMessage {\n", - " \"id\": \"chatcmpl-9y6Tn0RGjUnVqxDHz5CxlGfldPS2E\",\n", - " \"content\": \"\",\n", - " \"additional_kwargs\": {\n", - " \"tool_calls\": [\n", - " {\n", - " \"id\": \"call_upim4LMd1U6JdWlsGGk772Pa\",\n", - " \"type\": \"function\",\n", - " \"function\": \"[Object]\"\n", - " }\n", - " ]\n", - " },\n", - " \"response_metadata\": {\n", - " 
\"tokenUsage\": {\n", - " \"completionTokens\": 19,\n", - " \"promptTokens\": 72,\n", - " \"totalTokens\": 91\n", - " },\n", - " \"finish_reason\": \"tool_calls\",\n", - " \"system_fingerprint\": \"fp_3aa7262c27\"\n", - " },\n", - " \"tool_calls\": [\n", - " {\n", - " \"name\": \"search\",\n", - " \"args\": {\n", - " \"query\": \"current weather in San Francisco, CA\"\n", - " },\n", - " \"type\": \"tool_call\",\n", - " \"id\": \"call_upim4LMd1U6JdWlsGGk772Pa\"\n", - " }\n", - " ],\n", - " \"invalid_tool_calls\": []\n", - " }\n", - " ]\n", - " },\n", - " next: [ 'tools' ],\n", - " metadata: { source: 'loop', step: 1, writes: { agent: [Object] } },\n", - " config: {\n", - " configurable: {\n", - " thread_id: 'conversation-num-1',\n", - " checkpoint_id: '1ef5e863-f976-6611-8001-af242a92fef8'\n", - " }\n", - " },\n", - " createdAt: '2024-08-19T23:53:35.729Z',\n", - " parentConfig: undefined\n", - "}\n", - "--\n", - "{\n", - " values: { messages: [ [Array] ] },\n", - " next: [ 'agent' ],\n", - " metadata: { source: 'loop', step: 0, writes: null },\n", - " config: {\n", - " configurable: {\n", - " thread_id: 'conversation-num-1',\n", - " checkpoint_id: '1ef5e863-f365-6a51-8000-6443aafd5477'\n", - " }\n", - " },\n", - " createdAt: '2024-08-19T23:53:35.093Z',\n", - " parentConfig: undefined\n", - "}\n", - "--\n", - "{\n", - " values: {},\n", - " next: [ '__start__' ],\n", - " metadata: { source: 'input', step: -1, writes: { __start__: [Object] } },\n", - " config: {\n", - " configurable: {\n", - " thread_id: 'conversation-num-1',\n", - " checkpoint_id: '1ef5e863-f365-6a50-ffff-0ae60570513f'\n", - " }\n", - " },\n", - " createdAt: '2024-08-19T23:53:35.093Z',\n", - " parentConfig: undefined\n", - "}\n", - "--\n" - ] - } - ], - "source": [ - "let toReplay;\n", - "const states = await graphWithInterrupt.getStateHistory(config);\n", - "for await (const state of states) {\n", - " console.log(state);\n", - " console.log(\"--\");\n", - " if (state.values?.messages?.length === 2) 
{\n", - " toReplay = state;\n", - " }\n", - "}\n", - "if (!toReplay) {\n", - " throw new Error(\"No state to replay\");\n", - "}" - ] - }, - { - "cell_type": "markdown", - "id": "342f0154", - "metadata": {}, - "source": [ - "## Replay a past state\n", - "\n", - "To replay from this place we just need to pass its config back to the agent.\n" - ] - }, - { - "cell_type": "code", - "execution_count": 15, - "id": "c1cefbfa", - "metadata": {}, - "outputs": [ + "cell_type": "markdown", + "id": "cc409cd5", + "metadata": {}, + "source": [ + "## Set up the model\n", + "\n", + "Now we will load the\n", + "[chat model](https://js.langchain.com/v0.2/docs/concepts/#chat-models).\n", + "\n", + "1. It should work with messages. We will represent all agent state in the form\n", + " of messages, so it needs to be able to work well with them.\n", + "2. It should work with\n", + " [tool calling](https://js.langchain.com/v0.2/docs/how_to/tool_calling/#passing-tools-to-llms),\n", + " meaning it can return function arguments in its response.\n", + "\n", + "
\n", + "

Note

\n", + "

\n", + " These model requirements are not general requirements for using LangGraph - they are just requirements for this one example.\n", + "

\n", + "
" + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "Cold, with a low of 13 ℃\n", - "-----\n", - "\n", - "The current weather in San Francisco, CA is cold, with a temperature of 13°C (approximately 55°F).\n", - "-----\n", - "\n" - ] - } - ], - "source": [ - "for await (\n", - " const { messages } of await graphWithInterrupt.stream(null, {\n", - " ...toReplay.config,\n", - " streamMode: \"values\",\n", - " })\n", - ") {\n", - " let msg = messages[messages?.length - 1];\n", - " if (msg?.content) {\n", - " console.log(msg.content);\n", - " } else if (msg?.tool_calls?.length > 0) {\n", - " console.log(msg.tool_calls);\n", - " } else {\n", - " console.log(msg);\n", - " }\n", - " console.log(\"-----\\n\");\n", - "}" - ] - }, - { - "cell_type": "markdown", - "id": "e870c084", - "metadata": {}, - "source": [ - "## Branch off a past state\n", - "\n", - "Using LangGraph's checkpointing, you can do more than just replay past states.\n", - "You can branch off previous locations to let the agent explore alternate\n", - "trajectories or to let a user \"version control\" changes in a workflow.\n", - "\n", - "#### First, update a previous checkpoint\n", - "\n", - "Updating the state will create a **new** snapshot by applying the update to the\n", - "previous checkpoint. Let's **add a tool message** to simulate calling the tool." 
- ] - }, - { - "cell_type": "code", - "execution_count": 16, - "id": "d7656840-3a4a-4a80-af74-214b35cfbadd", - "metadata": {}, - "outputs": [ + "cell_type": "code", + "execution_count": 4, + "id": "dae9ab9c", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "import { ChatOpenAI } from \"@langchain/openai\";\n", + "\n", + "const model = new ChatOpenAI({ model: \"gpt-4o\" });" + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "{\n", - " messages: [\n", - " [ 'user', \"What's the weather like in SF currently?\" ],\n", - " AIMessage {\n", - " \"id\": \"chatcmpl-9y6Tn0RGjUnVqxDHz5CxlGfldPS2E\",\n", - " \"content\": \"\",\n", - " \"additional_kwargs\": {\n", - " \"tool_calls\": [\n", - " {\n", - " \"id\": \"call_upim4LMd1U6JdWlsGGk772Pa\",\n", - " \"type\": \"function\",\n", - " \"function\": \"[Object]\"\n", - " }\n", - " ]\n", - " },\n", - " \"response_metadata\": {\n", - " \"tokenUsage\": {\n", - " \"completionTokens\": 19,\n", - " \"promptTokens\": 72,\n", - " \"totalTokens\": 91\n", - " },\n", - " \"finish_reason\": \"tool_calls\",\n", - " \"system_fingerprint\": \"fp_3aa7262c27\"\n", - " },\n", - " \"tool_calls\": [\n", - " {\n", - " \"name\": \"search\",\n", - " \"args\": {\n", - " \"query\": \"current weather in San Francisco, CA\"\n", - " },\n", - " \"type\": \"tool_call\",\n", - " \"id\": \"call_upim4LMd1U6JdWlsGGk772Pa\"\n", - " }\n", - " ],\n", - " \"invalid_tool_calls\": []\n", - " },\n", - " ToolMessage {\n", - " \"content\": \"It's sunny out, with a high of 38 ℃.\",\n", - " \"additional_kwargs\": {},\n", - " \"response_metadata\": {},\n", - " \"tool_call_id\": \"call_upim4LMd1U6JdWlsGGk772Pa\"\n", - " }\n", - " ]\n", - "}\n", - "[ 'agent' ]\n" - ] - } - ], - "source": [ - "import { ToolMessage } from \"@langchain/core/messages\";\n", - "\n", - "const tool_calls =\n", - " toReplay.values.messages[toReplay.values.messages.length - 1].tool_calls;\n", - "const branchConfig = await 
graphWithInterrupt.updateState(\n", - " toReplay.config,\n", - " {\n", - " messages: [\n", - " new ToolMessage(\"It's sunny out, with a high of 38 ℃.\", tool_calls[0].id),\n", - " ],\n", - " },\n", - " // Updates are applied \"as if\" they were coming from a node. By default,\n", - " // the updates will come from the last node to run. In our case, we want to treat\n", - " // this update as if it came from the tools node, so that the next node to run will be\n", - " // the agent.\n", - " \"tools\",\n", - ");\n", - "\n", - "const branchState = await graphWithInterrupt.getState(branchConfig);\n", - "console.log(branchState.values);\n", - "console.log(branchState.next);" - ] - }, - { - "cell_type": "markdown", - "id": "4689abd9-1008-4d8b-902c-e956a5913e12", - "metadata": {}, - "source": [ - "#### Now you can run from this branch\n", - "\n", - "Just use the updated config (containing the new checkpoint ID). The trajectory\n", - "will follow the new branch." - ] - }, - { - "cell_type": "code", - "execution_count": 17, - "id": "bb95930f-07e5-4e32-8e38-2170d36ab1a0", - "metadata": {}, - "outputs": [ + "cell_type": "markdown", + "id": "b5cfd558", + "metadata": {}, + "source": [ + "After we've done this, we should make sure the model knows that it has these\n", + "tools available to call. We can do this by calling\n", + "[bindTools](https://v01.api.js.langchain.com/classes/langchain_core_language_models_chat_models.BaseChatModel.html#bindTools).\n" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "id": "ca438e74", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [], + "source": [ + "const boundModel = model.bindTools(tools);" + ] + }, + { + "cell_type": "markdown", + "id": "4a2b8a4f", + "metadata": {}, + "source": [ + "## Define the graph\n", + "\n", + "We can now put it all together. Time travel requires a checkpointer to save the\n", + "state - otherwise you wouldn't have anything go `get` or `update`. 
We will use\n", + "the\n", + "[MemorySaver](https://langchain-ai.github.io/langgraphjs/reference/classes/index.MemorySaver.html),\n", + "which \"saves\" checkpoints in-memory." + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "id": "1a29ec2a", + "metadata": {}, + "outputs": [], + "source": [ + "import { END, START, StateGraph } from \"@langchain/langgraph\";\n", + "import { AIMessage } from \"@langchain/core/messages\";\n", + "import { RunnableConfig } from \"@langchain/core/runnables\";\n", + "import { MemorySaver } from \"@langchain/langgraph\";\n", + "\n", + "const routeMessage = (state: typeof GraphState.State) => {\n", + " const { messages } = state;\n", + " const lastMessage = messages[messages.length - 1] as AIMessage;\n", + " // If no tools are called, we can finish (respond to the user)\n", + " if (!lastMessage?.tool_calls?.length) {\n", + " return END;\n", + " }\n", + " // Otherwise if there is, we continue and call the tools\n", + " return \"tools\";\n", + "};\n", + "\n", + "const callModel = async (\n", + " state: typeof GraphState.State,\n", + " config?: RunnableConfig,\n", + ") => {\n", + " const { messages } = state;\n", + " const response = await boundModel.invoke(messages, config);\n", + " return { messages: [response] };\n", + "};\n", + "\n", + "const workflow = new StateGraph(GraphState)\n", + " .addNode(\"agent\", callModel)\n", + " .addNode(\"tools\", toolNode)\n", + " .addEdge(START, \"agent\")\n", + " .addConditionalEdges(\"agent\", routeMessage)\n", + " .addEdge(\"tools\", \"agent\");\n", + "\n", + "// Here we only save in-memory\n", + "let memory = new MemorySaver();\n", + "const graph = workflow.compile({ checkpointer: memory });" + ] + }, + { + "cell_type": "markdown", + "id": "a6dd42a3", + "metadata": {}, + "source": [ + "## Interacting with the Agent\n", + "\n", + "We can now interact with the agent. Between interactions you can get and update\n", + "state." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 7, + "id": "0749329a", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[ 'user', \"Hi I'm Jo.\" ]\n", + "-----\n", + "\n", + "Hello Jo! How can I assist you today?\n", + "-----\n", + "\n" + ] + } + ], + "source": [ + "let config = { configurable: { thread_id: \"conversation-num-1\" } };\n", + "let inputs = { messages: [[\"user\", \"Hi I'm Jo.\"]] };\n", + "for await (\n", + " const { messages } of await graph.stream(inputs, {\n", + " ...config,\n", + " streamMode: \"values\",\n", + " })\n", + ") {\n", + " let msg = messages[messages?.length - 1];\n", + " if (msg?.content) {\n", + " console.log(msg.content);\n", + " } else if (msg?.tool_calls?.length > 0) {\n", + " console.log(msg.tool_calls);\n", + " } else {\n", + " console.log(msg);\n", + " }\n", + " console.log(\"-----\\n\");\n", + "}" + ] + }, + { + "cell_type": "markdown", + "id": "221f323d", + "metadata": {}, + "source": [ + "See LangSmith example run here\n", + "https://smith.langchain.com/public/b3feb09b-bcd2-4ad5-ad1d-414106148448/r\n", + "\n", + "Here you can see the \"agent\" node ran, and then our edge returned `__end__` so\n", + "the graph stopped execution there.\n", + "\n", + "Let's check the current graph state." + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "id": "6ff5468d", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{\n", + " messages: [\n", + " [ 'user', \"Hi I'm Jo.\" ],\n", + " AIMessage {\n", + " \"id\": \"chatcmpl-9y6TlYVbfL3d3VonkF1b3iXwnbdFm\",\n", + " \"content\": \"Hello Jo! 
How can I assist you today?\",\n", + " \"additional_kwargs\": {},\n", + " \"response_metadata\": {\n", + " \"tokenUsage\": {\n", + " \"completionTokens\": 11,\n", + " \"promptTokens\": 68,\n", + " \"totalTokens\": 79\n", + " },\n", + " \"finish_reason\": \"stop\",\n", + " \"system_fingerprint\": \"fp_3aa7262c27\"\n", + " },\n", + " \"tool_calls\": [],\n", + " \"invalid_tool_calls\": []\n", + " }\n", + " ]\n", + "}\n" + ] + } + ], + "source": [ + "let checkpoint = await graph.getState(config);\n", + "checkpoint.values;" + ] + }, + { + "cell_type": "markdown", + "id": "571077e2", + "metadata": {}, + "source": [ + "The current state is the two messages we've seen above, 1. the HumanMessage we\n", + "sent in, 2. the AIMessage we got back from the model.\n", + "\n", + "The `next` values are empty since the graph has terminated (transitioned to the\n", + "`__end__`)." + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "id": "22b25946", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[]\n" + ] + } + ], + "source": [ + "checkpoint.next;" + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "The current weather in San Francisco is sunny with a high of 38°C (100.4°F).\n", - "-----\n", - "\n" - ] + "cell_type": "markdown", + "id": "889cd8ce", + "metadata": {}, + "source": [ + "## Let's get it to execute a tool\n", + "\n", + "When we call the graph again, it will create a checkpoint after each internal\n", + "execution step. Let's get it to run a tool, then look at the checkpoint." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 10, + "id": "873b3438", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[ 'user', \"What's the weather like in SF currently?\" ]\n", + "-----\n", + "\n", + "[\n", + " {\n", + " name: 'search',\n", + " args: { query: 'current weather in San Francisco' },\n", + " type: 'tool_call',\n", + " id: 'call_IBDK50kVnVq2RtDjbpq0UiTA'\n", + " }\n", + "]\n", + "-----\n", + "\n", + "Cold, with a low of 13 ℃\n", + "-----\n", + "\n", + "The current weather in San Francisco is cold, with a low temperature of 13°C (55°F). Is there anything else you would like to know?\n", + "-----\n", + "\n" + ] + } + ], + "source": [ + "inputs = { messages: [[\"user\", \"What's the weather like in SF currently?\"]] };\n", + "for await (\n", + " const { messages } of await graph.stream(inputs, {\n", + " ...config,\n", + " streamMode: \"values\",\n", + " })\n", + ") {\n", + " let msg = messages[messages?.length - 1];\n", + " if (msg?.content) {\n", + " console.log(msg.content);\n", + " } else if (msg?.tool_calls?.length > 0) {\n", + " console.log(msg.tool_calls);\n", + " } else {\n", + " console.log(msg);\n", + " }\n", + " console.log(\"-----\\n\");\n", + "}" + ] + }, + { + "cell_type": "markdown", + "id": "6384c1e3", + "metadata": {}, + "source": [ + "See the trace of the above execution here:\n", + "https://smith.langchain.com/public/0ef426fd-0da1-4c02-a50b-64ae1e68338e/r We can\n", + "see it planned the tool execution (ie the \"agent\" node), then \"should_continue\"\n", + "edge returned \"continue\" so we proceeded to \"action\" node, which executed the\n", + "tool, and then \"agent\" node emitted the final response, which made\n", + "\"should_continue\" edge return \"end\". Let's see how we can have more control over\n", + "this." 
+ ] + }, + { + "cell_type": "markdown", + "id": "3a3fe0ce", + "metadata": {}, + "source": [ + "### Pause before tools\n", + "\n", + "If you notice below, we now will add `interruptBefore=[\"action\"]` - this means\n", + "that before any actions are taken we pause. This is a great moment to allow the\n", + "user to correct and update the state! This is very useful when you want to have\n", + "a human-in-the-loop to validate (and potentially change) the action to take." + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "id": "736be42e", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[ 'user', \"What's the weather like in SF currently?\" ]\n", + "-----\n", + "\n", + "[\n", + " {\n", + " name: 'search',\n", + " args: { query: 'current weather in San Francisco, CA' },\n", + " type: 'tool_call',\n", + " id: 'call_upim4LMd1U6JdWlsGGk772Pa'\n", + " }\n", + "]\n", + "-----\n", + "\n" + ] + } + ], + "source": [ + "memory = new MemorySaver();\n", + "const graphWithInterrupt = workflow.compile({\n", + " checkpointer: memory,\n", + " interruptBefore: [\"tools\"],\n", + "});\n", + "\n", + "inputs = { messages: [[\"user\", \"What's the weather like in SF currently?\"]] };\n", + "for await (\n", + " const { messages } of await graphWithInterrupt.stream(inputs, {\n", + " ...config,\n", + " streamMode: \"values\",\n", + " })\n", + ") {\n", + " let msg = messages[messages?.length - 1];\n", + " if (msg?.content) {\n", + " console.log(msg.content);\n", + " } else if (msg?.tool_calls?.length > 0) {\n", + " console.log(msg.tool_calls);\n", + " } else {\n", + " console.log(msg);\n", + " }\n", + " console.log(\"-----\\n\");\n", + "}" + ] + }, + { + "cell_type": "markdown", + "id": "bf27f2b4", + "metadata": {}, + "source": [ + "## Get State\n", + "\n", + "You can fetch the latest graph checkpoint using\n", + "[`getState(config)`](https://langchain-ai.github.io/langgraphjs/reference/classes/pregel.Pregel.html#getState)." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 12, + "id": "0f434f69", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[ 'tools' ]\n" + ] + } + ], + "source": [ + "let snapshot = await graphWithInterrupt.getState(config);\n", + "snapshot.next;" + ] + }, + { + "cell_type": "markdown", + "id": "1f78ad8f", + "metadata": {}, + "source": [ + "## Resume\n", + "\n", + "You can resume by running the graph with a `null` input. The checkpoint is\n", + "loaded, and with no new inputs, it will execute as if no interrupt had occurred." + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "id": "fd4d7eff", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Cold, with a low of 13 ℃\n", + "-----\n", + "\n", + "Currently, it's cold in San Francisco, with a temperature around 13°C.\n", + "-----\n", + "\n" + ] + } + ], + "source": [ + "for await (\n", + " const { messages } of await graphWithInterrupt.stream(null, {\n", + " ...snapshot.config,\n", + " streamMode: \"values\",\n", + " })\n", + ") {\n", + " let msg = messages[messages?.length - 1];\n", + " if (msg?.content) {\n", + " console.log(msg.content);\n", + " } else if (msg?.tool_calls?.length > 0) {\n", + " console.log(msg.tool_calls);\n", + " } else {\n", + " console.log(msg);\n", + " }\n", + " console.log(\"-----\\n\");\n", + "}" + ] + }, + { + "cell_type": "markdown", + "id": "2885d91d", + "metadata": {}, + "source": [ + "## Check full history\n", + "\n", + "Let's browse the history of this thread, from newest to oldest.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "id": "bc7acb70", + "metadata": { + "lines_to_next_cell": 2 + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{\n", + " values: {\n", + " messages: [\n", + " [Array],\n", + " AIMessage {\n", + " \"id\": \"chatcmpl-9y6Tn0RGjUnVqxDHz5CxlGfldPS2E\",\n", + " 
\"content\": \"\",\n", + " \"additional_kwargs\": {\n", + " \"tool_calls\": [\n", + " {\n", + " \"id\": \"call_upim4LMd1U6JdWlsGGk772Pa\",\n", + " \"type\": \"function\",\n", + " \"function\": \"[Object]\"\n", + " }\n", + " ]\n", + " },\n", + " \"response_metadata\": {\n", + " \"tokenUsage\": {\n", + " \"completionTokens\": 19,\n", + " \"promptTokens\": 72,\n", + " \"totalTokens\": 91\n", + " },\n", + " \"finish_reason\": \"tool_calls\",\n", + " \"system_fingerprint\": \"fp_3aa7262c27\"\n", + " },\n", + " \"tool_calls\": [\n", + " {\n", + " \"name\": \"search\",\n", + " \"args\": {\n", + " \"query\": \"current weather in San Francisco, CA\"\n", + " },\n", + " \"type\": \"tool_call\",\n", + " \"id\": \"call_upim4LMd1U6JdWlsGGk772Pa\"\n", + " }\n", + " ],\n", + " \"invalid_tool_calls\": []\n", + " },\n", + " ToolMessage {\n", + " \"content\": \"Cold, with a low of 13 ℃\",\n", + " \"name\": \"search\",\n", + " \"additional_kwargs\": {},\n", + " \"response_metadata\": {},\n", + " \"tool_call_id\": \"call_upim4LMd1U6JdWlsGGk772Pa\"\n", + " },\n", + " AIMessage {\n", + " \"id\": \"chatcmpl-9y6ToC6yczhz1hzn5XMPt6Fha4CLJ\",\n", + " \"content\": \"Currently, it's cold in San Francisco, with a temperature around 13°C.\",\n", + " \"additional_kwargs\": {},\n", + " \"response_metadata\": {\n", + " \"tokenUsage\": {\n", + " \"completionTokens\": 17,\n", + " \"promptTokens\": 107,\n", + " \"totalTokens\": 124\n", + " },\n", + " \"finish_reason\": \"stop\",\n", + " \"system_fingerprint\": \"fp_3aa7262c27\"\n", + " },\n", + " \"tool_calls\": [],\n", + " \"invalid_tool_calls\": []\n", + " }\n", + " ]\n", + " },\n", + " next: [],\n", + " metadata: { source: 'loop', step: 3, writes: { agent: [Object] } },\n", + " config: {\n", + " configurable: {\n", + " thread_id: 'conversation-num-1',\n", + " checkpoint_id: '1ef5e864-0045-68b1-8003-3da747a708d6'\n", + " }\n", + " },\n", + " createdAt: '2024-08-19T23:53:36.443Z',\n", + " parentConfig: undefined\n", + "}\n", + "--\n", + "{\n", + " 
values: {\n", + " messages: [\n", + " [Array],\n", + " AIMessage {\n", + " \"id\": \"chatcmpl-9y6Tn0RGjUnVqxDHz5CxlGfldPS2E\",\n", + " \"content\": \"\",\n", + " \"additional_kwargs\": {\n", + " \"tool_calls\": [\n", + " {\n", + " \"id\": \"call_upim4LMd1U6JdWlsGGk772Pa\",\n", + " \"type\": \"function\",\n", + " \"function\": \"[Object]\"\n", + " }\n", + " ]\n", + " },\n", + " \"response_metadata\": {\n", + " \"tokenUsage\": {\n", + " \"completionTokens\": 19,\n", + " \"promptTokens\": 72,\n", + " \"totalTokens\": 91\n", + " },\n", + " \"finish_reason\": \"tool_calls\",\n", + " \"system_fingerprint\": \"fp_3aa7262c27\"\n", + " },\n", + " \"tool_calls\": [\n", + " {\n", + " \"name\": \"search\",\n", + " \"args\": {\n", + " \"query\": \"current weather in San Francisco, CA\"\n", + " },\n", + " \"type\": \"tool_call\",\n", + " \"id\": \"call_upim4LMd1U6JdWlsGGk772Pa\"\n", + " }\n", + " ],\n", + " \"invalid_tool_calls\": []\n", + " },\n", + " ToolMessage {\n", + " \"content\": \"Cold, with a low of 13 ℃\",\n", + " \"name\": \"search\",\n", + " \"additional_kwargs\": {},\n", + " \"response_metadata\": {},\n", + " \"tool_call_id\": \"call_upim4LMd1U6JdWlsGGk772Pa\"\n", + " }\n", + " ]\n", + " },\n", + " next: [ 'agent' ],\n", + " metadata: { source: 'loop', step: 2, writes: { tools: [Object] } },\n", + " config: {\n", + " configurable: {\n", + " thread_id: 'conversation-num-1',\n", + " checkpoint_id: '1ef5e863-fa1c-6650-8002-bf4528305aac'\n", + " }\n", + " },\n", + " createdAt: '2024-08-19T23:53:35.797Z',\n", + " parentConfig: undefined\n", + "}\n", + "--\n", + "{\n", + " values: {\n", + " messages: [\n", + " [Array],\n", + " AIMessage {\n", + " \"id\": \"chatcmpl-9y6Tn0RGjUnVqxDHz5CxlGfldPS2E\",\n", + " \"content\": \"\",\n", + " \"additional_kwargs\": {\n", + " \"tool_calls\": [\n", + " {\n", + " \"id\": \"call_upim4LMd1U6JdWlsGGk772Pa\",\n", + " \"type\": \"function\",\n", + " \"function\": \"[Object]\"\n", + " }\n", + " ]\n", + " },\n", + " \"response_metadata\": 
{\n", + " \"tokenUsage\": {\n", + " \"completionTokens\": 19,\n", + " \"promptTokens\": 72,\n", + " \"totalTokens\": 91\n", + " },\n", + " \"finish_reason\": \"tool_calls\",\n", + " \"system_fingerprint\": \"fp_3aa7262c27\"\n", + " },\n", + " \"tool_calls\": [\n", + " {\n", + " \"name\": \"search\",\n", + " \"args\": {\n", + " \"query\": \"current weather in San Francisco, CA\"\n", + " },\n", + " \"type\": \"tool_call\",\n", + " \"id\": \"call_upim4LMd1U6JdWlsGGk772Pa\"\n", + " }\n", + " ],\n", + " \"invalid_tool_calls\": []\n", + " }\n", + " ]\n", + " },\n", + " next: [ 'tools' ],\n", + " metadata: { source: 'loop', step: 1, writes: { agent: [Object] } },\n", + " config: {\n", + " configurable: {\n", + " thread_id: 'conversation-num-1',\n", + " checkpoint_id: '1ef5e863-f976-6611-8001-af242a92fef8'\n", + " }\n", + " },\n", + " createdAt: '2024-08-19T23:53:35.729Z',\n", + " parentConfig: undefined\n", + "}\n", + "--\n", + "{\n", + " values: { messages: [ [Array] ] },\n", + " next: [ 'agent' ],\n", + " metadata: { source: 'loop', step: 0, writes: null },\n", + " config: {\n", + " configurable: {\n", + " thread_id: 'conversation-num-1',\n", + " checkpoint_id: '1ef5e863-f365-6a51-8000-6443aafd5477'\n", + " }\n", + " },\n", + " createdAt: '2024-08-19T23:53:35.093Z',\n", + " parentConfig: undefined\n", + "}\n", + "--\n", + "{\n", + " values: {},\n", + " next: [ '__start__' ],\n", + " metadata: { source: 'input', step: -1, writes: { __start__: [Object] } },\n", + " config: {\n", + " configurable: {\n", + " thread_id: 'conversation-num-1',\n", + " checkpoint_id: '1ef5e863-f365-6a50-ffff-0ae60570513f'\n", + " }\n", + " },\n", + " createdAt: '2024-08-19T23:53:35.093Z',\n", + " parentConfig: undefined\n", + "}\n", + "--\n" + ] + } + ], + "source": [ + "let toReplay;\n", + "const states = await graphWithInterrupt.getStateHistory(config);\n", + "for await (const state of states) {\n", + " console.log(state);\n", + " console.log(\"--\");\n", + " if 
(state.values?.messages?.length === 2) {\n", + " toReplay = state;\n", + " }\n", + "}\n", + "if (!toReplay) {\n", + " throw new Error(\"No state to replay\");\n", + "}" + ] + }, + { + "cell_type": "markdown", + "id": "342f0154", + "metadata": {}, + "source": [ + "## Replay a past state\n", + "\n", + "To replay from this place we just need to pass its config back to the agent.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "id": "c1cefbfa", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Cold, with a low of 13 ℃\n", + "-----\n", + "\n", + "The current weather in San Francisco, CA is cold, with a temperature of 13°C (approximately 55°F).\n", + "-----\n", + "\n" + ] + } + ], + "source": [ + "for await (\n", + " const { messages } of await graphWithInterrupt.stream(null, {\n", + " ...toReplay.config,\n", + " streamMode: \"values\",\n", + " })\n", + ") {\n", + " let msg = messages[messages?.length - 1];\n", + " if (msg?.content) {\n", + " console.log(msg.content);\n", + " } else if (msg?.tool_calls?.length > 0) {\n", + " console.log(msg.tool_calls);\n", + " } else {\n", + " console.log(msg);\n", + " }\n", + " console.log(\"-----\\n\");\n", + "}" + ] + }, + { + "cell_type": "markdown", + "id": "e870c084", + "metadata": {}, + "source": [ + "## Branch off a past state\n", + "\n", + "Using LangGraph's checkpointing, you can do more than just replay past states.\n", + "You can branch off previous locations to let the agent explore alternate\n", + "trajectories or to let a user \"version control\" changes in a workflow.\n", + "\n", + "#### First, update a previous checkpoint\n", + "\n", + "Updating the state will create a **new** snapshot by applying the update to the\n", + "previous checkpoint. Let's **add a tool message** to simulate calling the tool." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 16, + "id": "d7656840-3a4a-4a80-af74-214b35cfbadd", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{\n", + " messages: [\n", + " [ 'user', \"What's the weather like in SF currently?\" ],\n", + " AIMessage {\n", + " \"id\": \"chatcmpl-9y6Tn0RGjUnVqxDHz5CxlGfldPS2E\",\n", + " \"content\": \"\",\n", + " \"additional_kwargs\": {\n", + " \"tool_calls\": [\n", + " {\n", + " \"id\": \"call_upim4LMd1U6JdWlsGGk772Pa\",\n", + " \"type\": \"function\",\n", + " \"function\": \"[Object]\"\n", + " }\n", + " ]\n", + " },\n", + " \"response_metadata\": {\n", + " \"tokenUsage\": {\n", + " \"completionTokens\": 19,\n", + " \"promptTokens\": 72,\n", + " \"totalTokens\": 91\n", + " },\n", + " \"finish_reason\": \"tool_calls\",\n", + " \"system_fingerprint\": \"fp_3aa7262c27\"\n", + " },\n", + " \"tool_calls\": [\n", + " {\n", + " \"name\": \"search\",\n", + " \"args\": {\n", + " \"query\": \"current weather in San Francisco, CA\"\n", + " },\n", + " \"type\": \"tool_call\",\n", + " \"id\": \"call_upim4LMd1U6JdWlsGGk772Pa\"\n", + " }\n", + " ],\n", + " \"invalid_tool_calls\": []\n", + " },\n", + " ToolMessage {\n", + " \"content\": \"It's sunny out, with a high of 38 ℃.\",\n", + " \"additional_kwargs\": {},\n", + " \"response_metadata\": {},\n", + " \"tool_call_id\": \"call_upim4LMd1U6JdWlsGGk772Pa\"\n", + " }\n", + " ]\n", + "}\n", + "[ 'agent' ]\n" + ] + } + ], + "source": [ + "import { ToolMessage } from \"@langchain/core/messages\";\n", + "\n", + "const tool_calls =\n", + " toReplay.values.messages[toReplay.values.messages.length - 1].tool_calls;\n", + "const branchConfig = await graphWithInterrupt.updateState(\n", + " toReplay.config,\n", + " {\n", + " messages: [\n", + " new ToolMessage(\"It's sunny out, with a high of 38 ℃.\", tool_calls[0].id),\n", + " ],\n", + " },\n", + " // Updates are applied \"as if\" they were coming from a node. 
By default,\n", + " // the updates will come from the last node to run. In our case, we want to treat\n", + " // this update as if it came from the tools node, so that the next node to run will be\n", + " // the agent.\n", + " \"tools\",\n", + ");\n", + "\n", + "const branchState = await graphWithInterrupt.getState(branchConfig);\n", + "console.log(branchState.values);\n", + "console.log(branchState.next);" + ] + }, + { + "cell_type": "markdown", + "id": "4689abd9-1008-4d8b-902c-e956a5913e12", + "metadata": {}, + "source": [ + "#### Now you can run from this branch\n", + "\n", + "Just use the updated config (containing the new checkpoint ID). The trajectory\n", + "will follow the new branch." + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "id": "bb95930f-07e5-4e32-8e38-2170d36ab1a0", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "The current weather in San Francisco is sunny with a high of 38°C (100.4°F).\n", + "-----\n", + "\n" + ] + } + ], + "source": [ + "for await (\n", + " const { messages } of await graphWithInterrupt.stream(null, {\n", + " ...branchConfig,\n", + " streamMode: \"values\",\n", + " })\n", + ") {\n", + " let msg = messages[messages?.length - 1];\n", + " if (msg?.content) {\n", + " console.log(msg.content);\n", + " } else if (msg?.tool_calls?.length > 0) {\n", + " console.log(msg.tool_calls);\n", + " } else {\n", + " console.log(msg);\n", + " }\n", + " console.log(\"-----\\n\");\n", + "}" + ] + } + ], + "metadata": { + "jupytext": { + "encoding": "# -*- coding: utf-8 -*-" + }, + "kernelspec": { + "display_name": "TypeScript", + "language": "typescript", + "name": "tslab" + }, + "language_info": { + "codemirror_mode": { + "mode": "typescript", + "name": "javascript", + "typescript": true + }, + "file_extension": ".ts", + "mimetype": "text/typescript", + "name": "typescript", + "version": "3.7.2" } - ], - "source": [ - "for await (\n", - " const { messages } of await 
graphWithInterrupt.stream(null, {\n", - " ...branchConfig,\n", - " streamMode: \"values\",\n", - " })\n", - ") {\n", - " let msg = messages[messages?.length - 1];\n", - " if (msg?.content) {\n", - " console.log(msg.content);\n", - " } else if (msg?.tool_calls?.length > 0) {\n", - " console.log(msg.tool_calls);\n", - " } else {\n", - " console.log(msg);\n", - " }\n", - " console.log(\"-----\\n\");\n", - "}" - ] - } - ], - "metadata": { - "jupytext": { - "encoding": "# -*- coding: utf-8 -*-" - }, - "kernelspec": { - "display_name": "TypeScript", - "language": "typescript", - "name": "tslab" }, - "language_info": { - "codemirror_mode": { - "mode": "typescript", - "name": "javascript", - "typescript": true - }, - "file_extension": ".ts", - "mimetype": "text/typescript", - "name": "typescript", - "version": "3.7.2" - } - }, - "nbformat": 4, - "nbformat_minor": 5 + "nbformat": 4, + "nbformat_minor": 5 } diff --git a/examples/how-tos/tool-calling-errors.ipynb b/examples/how-tos/tool-calling-errors.ipynb index c91766adf..abf9eefd6 100644 --- a/examples/how-tos/tool-calling-errors.ipynb +++ b/examples/how-tos/tool-calling-errors.ipynb @@ -1,805 +1,805 @@ { - "cells": [ - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# How to handle tool calling errors\n", - "\n", - "LLMs aren't perfect at calling tools. The model may try to call a tool that doesn't exist or fail to return arguments that match the requested schema. Strategies like keeping schemas simple, reducing the number of tools you pass at once, and having good names and descriptions can help mitigate this risk, but aren't foolproof.\n", - "\n", - "This guide covers some ways to build error handling into your graphs to mitigate these failure modes.\n", - "\n", - "
\n", - "

Compatibility

\n", - "

\n", - " This guide requires @langchain/langgraph>=0.0.28, @langchain/anthropic>=0.2.6, and @langchain/core>=0.2.17. For help upgrading, see this guide.\n", - "

\n", - "
\n", - "\n", - "## Using the prebuilt `ToolNode`\n", - "\n", - "To start, define a mock weather tool that has some hidden restrictions on input queries. The intent here is to simulate a real-world case where a model fails to call a tool correctly:\n", - "\n", - "```bash\n", - "$ npm install @langchain/anthropic\n", - "```" - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "metadata": {}, - "outputs": [], - "source": [ - "import { z } from \"zod\";\n", - "import { tool } from \"@langchain/core/tools\";\n", - "\n", - "const getWeather = tool(async ({ location }) => {\n", - " if (location === \"SAN FRANCISCO\") {\n", - " return \"It's 60 degrees and foggy\";\n", - " } else if (location.toLowerCase() === \"san francisco\") {\n", - " throw new Error(\"Input queries must be all capitals\");\n", - " } else {\n", - " throw new Error(\"Invalid input.\");\n", - " }\n", - "}, {\n", - " name: \"get_weather\",\n", - " description: \"Call to get the current weather\",\n", - " schema: z.object({\n", - " location: z.string(),\n", - " }),\n", - "});" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Next, set up a graph implementation of the [ReAct agent](https://langchain-ai.github.io/langgraphjs/concepts/). This agent takes some query as input, then repeatedly call tools until it has enough information to resolve the query. 
We'll use the prebuilt [`ToolNode`](https://langchain-ai.github.io/langgraphjs/reference/classes/prebuilt.ToolNode.html) to execute called tools, and a small, fast model powered by Anthropic:" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": {}, - "outputs": [], - "source": [ - "import { StateGraph, Annotation, messagesStateReducer } from \"@langchain/langgraph\";\n", - "import { ToolNode } from \"@langchain/langgraph/prebuilt\";\n", - "import { ChatAnthropic } from \"@langchain/anthropic\";\n", - "import { BaseMessage, isAIMessage } from \"@langchain/core/messages\";\n", - "\n", - "const GraphState = Annotation.Root({\n", - " messages: Annotation<BaseMessage[]>({\n", - " reducer: messagesStateReducer,\n", - " }),\n", - "});\n", - "\n", - "const toolNode = new ToolNode([getWeather]);\n", - "\n", - "const modelWithTools = new ChatAnthropic({\n", - " model: \"claude-3-haiku-20240307\",\n", - " temperature: 0,\n", - "}).bindTools([getWeather]);\n", - "\n", - "const shouldContinue = async (state: typeof GraphState.State) => {\n", - " const { messages } = state;\n", - " const lastMessage = messages[messages.length - 1];\n", - " if (isAIMessage(lastMessage) && lastMessage.tool_calls?.length) {\n", - " return \"tools\";\n", - " }\n", - " return \"__end__\";\n", - "}\n", - "\n", - "const callModel = async (state: typeof GraphState.State) => {\n", - " const { messages } = state;\n", - " const response = await modelWithTools.invoke(messages);\n", - " return { messages: [response] };\n", - "}\n", - "\n", - "const app = new StateGraph(GraphState)\n", - " .addNode(\"agent\", callModel)\n", - " .addNode(\"tools\", toolNode)\n", - " .addEdge(\"__start__\", \"agent\")\n", - " .addEdge(\"tools\", \"agent\")\n", - " .addConditionalEdges(\"agent\", shouldContinue, {\n", - " // Explicitly list possible destinations so that\n", - " // we can automatically draw the graph below.\n", - " tools: \"tools\",\n", - " __end__: \"__end__\",\n", - " })\n", - " .compile();" - ] - }, - {
- "cell_type": "code", - "execution_count": 3, - "metadata": {}, - "outputs": [ + "cells": [ { - "data": { - "image/png": "/9j/4AAQSkZJRgABAQAAAQABAAD/4gHYSUNDX1BST0ZJTEUAAQEAAAHIAAAAAAQwAABtbnRyUkdCIFhZWiAH4AABAAEAAAAAAABhY3NwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAQAA9tYAAQAAAADTLQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAlkZXNjAAAA8AAAACRyWFlaAAABFAAAABRnWFlaAAABKAAAABRiWFlaAAABPAAAABR3dHB0AAABUAAAABRyVFJDAAABZAAAAChnVFJDAAABZAAAAChiVFJDAAABZAAAAChjcHJ0AAABjAAAADxtbHVjAAAAAAAAAAEAAAAMZW5VUwAAAAgAAAAcAHMAUgBHAEJYWVogAAAAAAAAb6IAADj1AAADkFhZWiAAAAAAAABimQAAt4UAABjaWFlaIAAAAAAAACSgAAAPhAAAts9YWVogAAAAAAAA9tYAAQAAAADTLXBhcmEAAAAAAAQAAAACZmYAAPKnAAANWQAAE9AAAApbAAAAAAAAAABtbHVjAAAAAAAAAAEAAAAMZW5VUwAAACAAAAAcAEcAbwBvAGcAbABlACAASQBuAGMALgAgADIAMAAxADb/2wBDAAMCAgMCAgMDAwMEAwMEBQgFBQQEBQoHBwYIDAoMDAsKCwsNDhIQDQ4RDgsLEBYQERMUFRUVDA8XGBYUGBIUFRT/2wBDAQMEBAUEBQkFBQkUDQsNFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBT/wAARCADaAMcDASIAAhEBAxEB/8QAHQABAAMBAAMBAQAAAAAAAAAAAAUGBwgCAwQJAf/EAE8QAAEDBAADAwYIBw0HBQAAAAECAwQABQYRBxIhEzFVCBYiQZTRFBUXMlFhk+EJN0JxdbO0IyQ0NkNSYnN2gaHB0hhUVpGSlbElM0Vyov/EABsBAQACAwEBAAAAAAAAAAAAAAACAwEEBQYH/8QANREAAgECAQgIBQUBAQAAAAAAAAECAxExBBITIUFRUpEFFBVhcaGxwSIyM2LRQnKB4fA0Y//aAAwDAQACEQMRAD8A/VOlKUApSlAK+SbdoNtKBMmx4pX1SH3Uo5vzbNfXWZ5/Cjzs/tSJMdqQkWyQQl1AUAe1a+mjlGEZTlgk2XUaelmoXxLx51WXxiB7Sj3086rL4xA9pR76zvzetfhsP7BHup5vWvw2H9gj3Vye1cn4Jc0dPs77vI0TzqsvjED2lHvp51WXxiB7Sj31nfm9a/DYf2CPdTzetfhsP7BHup2rk/BLmh2d93kaJ51WXxiB7Sj3086rL4xA9pR76zvzetfhsP7BHup5vWvw2H9gj3U7VyfglzQ7O+7yNE86rL4xA9pR76edVl8Yge0o99Z35vWvw2H9gj3U83rX4bD+wR7qdq5PwS5odnfd5GiedVl8Yge0o99eTWS2h91Dbd1hOOLISlCZCCVE9wA3Wc+b1r8Nh/YI91Rl/s1visW91mDGZdTdbfpbbKUqH78Z9YFX0OkKFetCiotZzSxW12IyyDNi5Z2BtdKUrfOQKUpQClKUApSlAKUpQClKUApSlAKznNfxg2v9FyP1rVaNWc5r+MG1/ouR+taqqt9Cp+1m5kn1onjSlK8IenILMs4snD6zi6X+cIENTqI6FBtbq3HVHSUIQgFS1HrpKQT0P0VQMr8pDHsemYQY7U242zJH5DZlsW+WtbCGW3CSGkslal9ogJKNBQHMrWgTUxxztlrueHxhdLbkE4MT2ZEaRjDCnp0B9IUUSEJTs+j1B9FXztFJBNZeZmcO2
LhZmGT2O73WRYr5NMtuLbv/AFBcNxiQwxIcit9UrIU2VoSOm+4dQNulThKN5d+3u1GtUnJOy7vU1jJuOeEYbdmbder0q3yXG23SXYb/AGbSXOiC64G+Rrf9Mpr6cl4w4liWRjH7lcnU3tUduWmBGhSJLqmVqUhKwlptWxtCt6+boE6BG8H41NZRnxzu3ybTm0iPPs7Qxe22pl2PDV2kbbhmKSUjtEulQU08e5ICUqJrQ+HlonO8ZxfH7VOjRXcGtcdMmXFW1yu9u+txklQGnACgqQeo6bFSdKEYKT3b/DuIqpNyzUTnDjjjbeIWX5Tj7cObElWe4uQ2lLhSQ282httSlqcU0lCFcy1AIKuYgBQ2FA1plY9wzfnYjxTz+xXCx3dKb3e1XaFdWoS1wFsqiMpIU+PRQoKZUnlVo7I1vdbDVFVRUvhwsi6m21rFRGTfwOB+lLf+2M1L1EZN/A4H6Ut/7YzW10d/20f3R9UKv05eDNfpSlewPIilKUApSlAKUpQClKUApSlAKUpQCs5zX8YNr/Rcj9a1WjVXMlwaDk8+NNfkzYsmO0plK4b/AGe0qIJB6HfVIrEoqpCUG7XTRfQqKlUU2ZzlfD3GM6VGOR4/bL6YvMGDcIqHuy5tc3LzA63yp3r6BUB/s/cMt78wMb/7Wz/prUvkqg+MXv237qfJVB8Yvftv3VxV0XNKyrep1nltB63EpWLcOMVwd997HcctdjdkJCHV2+IhkuJB2AopA2BVjqS+SqD4xe/bfup8lUHxi9+2/dUX0S5O7qrkySy+ktSTI2lZpxkizcJ4ncI7HbL3dEQMlu78O4B2RzKU2hnnTynXonfrrXfkqg+MXv237qx2P/6rkzPaFLcyvXyxW7JrVItl2gx7nbpAAdiy2g404AQRzJPQ9QD/AHVUEcAeGjZ2nAccSdEbFsZHQjRHzforUPkqg+MXv237qfJVB8Yvftv3VNdFSjqVZcmReXUXjEzi18E+H9juMa4W/CrDBnRlh1mTHtzSHG1juUlQTsEfTU9k38DgfpS3/tjNWn5KoPjF79t+6v6nhRbO3juO3G7SUsPtyEtPS+ZBW2sLTsa6jmSD/dWxk/R7pV6dadW+a08HsdyEstpOLjFWuXWlKV0ziClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQHO/lI/jx8nn+0Mv9mNdEVzv5SP48fJ5/tDL/AGY10RQClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQHO/lI/jx8nn+0Mv8AZjXRFc7+Uj+PHyef7Qy/2Y10RQClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQClKUApSoTJMvgYyG0P9rJmvAlmDFTzvOgd5A2AlPcOZRCRsbPUVKMXJ2iZScnZE3Xy3S2Rb1bZdvnx25cGWyuO+w6NodbUkpUlQ9YIJB/PVDczzJZJKmLPboTfXlEmWt1z6thKAAfqCj+evX555d/u1k/6nqt0W+S5m11Ws/wBJ+OnlF8G5fArjBfsRfClxWHu2t76v5eKv0mlb9Z16Kv6SVD1V+r3kU8F3+B/AOz2qehbV6ujirxcWXO9p51CAG9eopbQ2kj+clX01UOLnBg8Zs/wrLL5EtIn4w/2qW2u05ZiAoLQ07sbKErHMB/SWPyumueeeXf7tZP8AqepolxLmOqVtxpVKzdGa5Ykgrh2Zwb+al11HT8/Kf/FS1n4jsvyGot5hKskh1QQ26p0OxnFE6CQ7oaJOgAtKdkgDZ6VjRN/K0/B+2JCWT1YK7iXKlKVSa4pSlAKUpQClKUApSlAKUpQClKUApSlAKUpQEVlF+RjNhl3FbfbKaSEtsg6LrqlBLaAfUVLUlP8AfWew4zqFOyZbpk3CQQuQ+T3n1JT9CE7ISn1D6ySZ7istXwbHGv5J27oDn0aSy8tP/wC0IqKq2fwU4pbdfsl5M7OQwWa57RSudOPPFHJ8ZvGSSMQv1zlOY3ARM
nWqJZYz0GOeQualSHVJX6aBvlaPMkddHYr08TuL2QfHOTohZfFwNiy41HvUCPIjMPLuzrqXVFO3QSUJLaG9NgK5l9/cK1bG660VdHSNK5me4ncQslvbGO2hrIYr9msdtk3J61wLdIlOy5LJWQ8JTjSUpHLrTaNlXP1SAAZm0ZRxLyrLsRxq63M4RcZeOS7hc2o0OM+6HmZbbSFo5u0QgrSsKI2sAKI79KCxlVU8Ezf1uJb5eZQTzHlGzrZ+ivF9huSy4y82l1pxJQttYBSpJGiCD3g1yreLtknEqzcHJUzI37ZeGcunWp+XAisacdYRMaTICHELAVytH0fm/uqunROuqIbLkeIw06+uU6hCUrfcSlKnCBoqISAAT39AB16Cs4ayUJ599RO4BfHkSpNgmOrfcjtiREfeXzLcYKtFKiepLatDZ6lKkbJPMau1ZXblqa4gY2pHznBKaXrv7Mtcx/u5kI/wrVK2qmtRnvXu17XODlUFCq0hSlKpNQUpSgFKUoBSlKAUpSgFKUoBSlKAUpSgK7ntjfv2NvtRE80+OtEuMknl5nG1BQRv1BYBQT9CzVNhTG7hFbkNE8ix3KGlJI6FJHqIIII9RBFapWP53fbHbOKNpxa1TixmV7juTfiv4K4uM80gHbzziEkMElJSHDvZ0ClXokWq045jdrYfg38lyhUm4ywZRcu4BWPMLtf5b92vkCLkDSGrtbbfMDUaaUt9mlaxyFYPIEpPKpIUEgKB67zzirwnyQ5JZn8etuR3ddstLEKJdI92tiOR1vm0txqSwSgn0SpbOubp6I5RXQjhvcMlEvGLilQ36cUtPtq+sFK+b/mkH6q9fxhP/wCHL17J99Y6vV2LzR03KjJapGbp4KyciYsl9vWQ3SxZ2m1swbtdMZkIYTNKRtSVpW2pJAUVEKCUkb6aGgLZb+G9vt+V2jIRMuEi4WyzrsjZkvh0OMqW2srcURzKc20n0ubrs7BJ3Xvu2dQ7DcLZAucSXbp1zcLMCLLDbTstY1tLSVLBWRsdE7PUVK/GE/8A4cvXsn306vV3E1Oitq5lEk8BLC/iUWxNXG7QzDvD18h3KM+hEuNKcdccUUK5OXl/dnE8qkn0T12etaBaoKrZbIkNUp+cqOyhoyZSgp14pAHOsgAFR1skAdT3V4ImXJ0hKMbvKlE60phKP8VLAqTtuH3u/KHxmn4it5+eyy8Fy3B/NK07S2PUSkqV1Oik6VTQSXz2S8fbEi61GmrpnswW3qu2TSLwRuFBaXCjq3tLjqlDtlD/AOnIlG/pLg6aO9Eqk8OOJWLZrJv9jx3to7+MSvi2bAehORTHUNhHKFJAKFBJKSn1a7t1dqTkpNJYLD/eZwatR1ZubFKUqsqFKUoBSlKAUpSgFKUoBSlKAUpSgFfwkDvOvz1HT8hgQLgzbFTIxvEllx+LblPoQ/ISjXMUJJ2QNjZ7hsbrNIWEz+OuKY3cOJmPycVuFtupujFit95WpBCFEx/hJb5QpSfRXoHopAOwCpFAfdcciu3FRzPcMtEfI8GdtyG4jOXLiIShx5Q5l/BkrO1gJ5RzgD550UkJJvWM4+nGrDbLaZsu6uwYrcX4wuKw5KfCQBzOLAHMo62TrqetStKAUpUbklkTkuO3W0LlyoCLhFdiGXCWEPshaCnnbUQQFp3sEggEDoaA/ILyy/KMmcT/ACiF3ewXBTVrxR8RLJIjr/LaXzKkJPdtTg2FfzUo+iv1L8n/AIvQ+OXCWwZfE5W3pjPJMjp/kJKPRdR9OuYEjfekpPrriHi3+D/4e4FxI4V4/b7zkz0PKro/CmuSZUdTjaEM84LRSwADvv5goa9VdreT/wCT/j3k4YbMxrGplznQJU9dxW5dXW3HQ4pttsgFttA5dNJ9W9k9e7QGmUpSgKvxFwCJxIw+6Y/Jn3CzonpRzT7PIMaU0pCgpCkuD1gpHfsEdKhY96ynFM0xXEW8bnZBi7lt7OTmD89tTrMltJ/99s+krnCUnnH5S
+6tCpQEXjmUWfMLYLjY7pDu8ArU18JhPJdb50nSk7SSNg9CKlKzDKOFdxx3DrhE4Pu2TAb3LuKbk8tdtS5GlL6BaFpTrk5wlIKkgkAHQBOxLw+LFrVxTc4dyY9xayBu2puSJSoDiIcpvYDhac6j0CUbBOgVgAkg6AvFKUoBSlKAUpSgFKUoBSlKAVnWY57OvreX4rw6uFqd4i2VqMXI14Q6iPFD/pIcUQn0/wBz5lDl2NgA67q0Ws0uU5GN8d7NGhYIqQvJbe/8YZfGbJ+D/BgC2w8Qg6Srm9EqWOvQA+oCdtPDazoyO35fdrXbZudtW1u3v3xmNyKIAJX2YJVyJKlL9ZOiEkkCrdSlAKUpQClKw7jX5Q0jGsgZ4fcPLajLuKE9vmbgJV+9rW2dfviYsfMSNghOwVbHdzJ2BX/KPuURzyhvJ6tiJLS7im9S5KoiVguhr4OR2hT3hOwRvu6H6DXSNY7wL8nmPwxkzcoyO5Ly/iVeBzXTJJY2ob/kI6f5JlOgAABvQ3oBKU7FQClKUApSlAK9E2G3cIj8Z3nDbzamlFpam1hKho8qkkFJ+sEEeqvfSgMgj4vkPAPBccsOAWeZncBq59lKbvV51KjRHFHRaWtPKUtcydJ6aQg95JUNStV7t19aedts+LcGmXVMOLivJdShxPRSFFJOlD1g9RX21lPk6ysJl41kqsFhzYUBORz0TkTiSpc4LHbqTtSvQJ1ru/MKA1alKUApSlAKUpQClKUAr8+/KJ/CQXPF81h47jGK3exSrHdWlXtu9LjIclIbWsPQwlAeSlCwGyH0Ob79JI0T+gD8hqK2XHnUNIHepxQSP+Zri7y8vJnsfGSyO5ticy3jOLaz++IzUhG7pHSPmaB6upA9E96h6J36OpKMpYIFz8hzym808pa35fOyq1We3RbU7FZhOWlh1sOrWHS6F9o6vfKEta1r5x7/AFdRVyl+Dyx6Dw48nC3qucli23O9TZFzfjS3EtuoBIab2lWiAUNJWPqXv110z51WXxiB7Sj31LRz4WZsyUpXzQ7nDuG/gstiTrqexcC//BrmfJuIuVeVFkU/C+F8yRj2BQnVRb/nqElLj6h0XFt++9XqLvq3sdOXng01qZgluJfHTIc/zGXww4MdjLyBj0L5lrqe0gWBB2CAe52R0OkDYBHXelcui8FOBePcD7A/EtfbXG8T1/CLrfp6u0mXF87JcdWeutk6TvQ2e8kkzfDLhfjfCDEYmN4rbW7bbI/UhPVx5Z+c44vvWs66k/UBoAAWusAUpSgFKUoBSvkm3aDbSgTJseKV9Uh91KOb82zXzedVl8Yge0o99TUJNXSM2ZKUqL86rL4xA9pR76edVl8Yge0o99Z0c+FizMY8rPyn5vkv2XH7s3hispt9zkOxXnxcfgiYriUpU2k/uTnMVjtCO7XZnv3XO/Cf8Jffs3yy24rC4UQpd2vFx7GMIV3VHQhK1DRcBYXspGypewNAnQ1XV/HnD8Y438KMhw+Zd7ahc6OTEkLko/e8lPpNOdDvQUBvXekqHrrjn8G5wOZxbJ8izzLixbbhbXXLPbI0x1CFJc7pDwBPqGmwobB5nB6qaOfCxZn6NUqL86rL4xA9pR76edVl8Yge0o99NHPhYsyUpUX51WXxiB7Sj30TlFmUoAXeCSegAko6/wCNNHPhYsyUpSlVmBVQy7Ln4ksWm0hBuBSFvyXBzNxEHu6flOK/JT3AAqV05Urtch9EWO684dNtpK1H6gNmshxpbku1N3F/Rl3I/DX1DfVSwCB19SU8qR9SRVsbRi6j2YeJu5LRVWfxYI/i8agy3u3uLZvEsjRk3HTyz130BHKkfUkAfVXu837WP/jYf2CPdVO4wcXYnCOJj78qHImC63Vi3nsGHnS0hSvTc02hZUoDuR0Kj3b0RX0ZFxsw3FI1sdul0djKuUb4ZHjCBJXJ7HptxbKWy42kb6laU6OwdEGq3WqSxkzuJwjq1KxafN+1+Gw/sE+6nm/a/
DYf2CfdVdv/ABgw/G7PaLnMvbS4l4Tz24wmnJTktPLzFTbbSVLUACCSBobG9VDSuLzFyyjhszjkiFdLBlLs5DkwBRUAxHW4OTqOVXOjlUFAkaI0DUdJPiZlyii7PYrZ3lBZtsZDqSFJdabDbiSO4hSdEf3GpbGr+7hfZw5jhkWNbhAkKSO1irWvZU4ofPbKlElZ9JJJUoqBKkU2wcXMTyjJ5WP2q6mbc4y3W3EojPBrmbOnEpeKOzUUnoQlRIq3PMokMradQlxtaSlSFDYUD0INWRrSwm7r/YbiqpShWjY1ClVPhjcHJmKNx33C6/b3nYKlkklSW1ENkk9SS3yEk+vff31bKTjmScdx5yUXFuLFKUqBEUpSgMzz+FHnZ/akSY7UhItkghLqAoA9q19NfH5vWvw2H9gj3VJZr+MG1/ouR+tarxrn5fUnGcUm1qXqzxfS0pLKWk9iI/zetfhsP7BHup5vWvw2H9gj3VIVGZLk1rw+ySrvepzVutsYAuyHjoDZAAHrJJIAA2SSAASa5ulqP9T5nHU5t2TZ5+b1r8Nh/YI91PN61+Gw/sEe6qjD474LNsV2vCb8lmFaezM/4VFeYdjJcUEoUtpxCXAlRPRXLroevQ1IYrxYxXNJc+Larr2kmCymS+1JjuxlBlW+V1IdSnnbOj6adp+us59ZbX5ljVZJtp6vEnvN61+Gw/sEe6nm9a/DYf2CPdWVRvKOsmUcTsLxvFJ0e6wruuaJj64j6PQZYUtCmHFBKFpK06Kk8419HfWy0lOrHGT5mJqrTtn3VyP83rX4bD+wR7qhc0sluYxa4uNQIrbiW9pWhlIIOx3HVWqoLOf4pXP+q/zFbOSVajyims5/MtveWZPOWmhr2r1NlpSldg+inzXKILhbpUUnQfaU3v6Ngj/OslxVxS8btoWlSHW2EsuIUNFK0DlWD+ZSSK2Os6yqwu45cZN1iMKetUtZdmNtDa4zpABdCfW2rXpa6pV6WiFKKLorPg6axxX4/wBusdDI6qpzaltMm8oK23GTjmOXK322Xd/iTI7fdZMSA2XZC2GnP3QtoHVagFb5R1OjVWVkcvFeK9yzl7E8mulnyGxxY0X4HaXHZcR1h17mYdY1ztBfaJUCoBOwdkVusaSzMYQ/HdQ+y4OZDjagpKh9II6GvZWq9WpnYcLvOTOWuHWJZDwZl4JkV9xy53GKmx3C3yIVmjGa9anX53wttPZo2op5D2RUgHRQN6FeeM4pkmP3vC8vmY1c24UjMLxc3bZHY7STb485lbbKnW0np6Wlr1vl5zvuNdRUrFyCopWs8P6/BgOAfGti4xfFuK2fJrbh0uRPfvUG+wC3CivbKkPwnj1IdcJJbSpSdKJ0kjVb9SvlhxnswkLt9scIjBXJMuKN8jKd6UhtQ6F0jYAHzPnK/JSuyEHUfdte4k3GjFuT1Fk4URyMdlzNEJn3CRIRsaJQFdmk/mIbBH1EVdK9EKGxbobESM0liMw2lpppA0lCEjQA+oACvfVtSWfNyR5ucs+TlvFKUqsgKUpQGc5r+MG1/ouR+tarxryzX8YNr/Rcj9a1Vcyvh7jGdKinI8ftt9MXmDBuEVD3Zc2ubl5gdb5U719ArmdIW0kb7l7niulbda17kWGsj8pfErrlWGWR61xJ1y+Jr7EusuBa5CmJcmO3zhxLK0qSQ4OcLTpQJKBo71U5/s+8Mt/xAxv/ALWz/pqdxXhviuDPvvY7jlrsbshIQ6u3xEMlxIOwFFIGwK5yai7o5kJRpyU4t3Xd/Zz3mWE23JeFmd3PHMZzpd9ehxbeheTKnPyZLQkodLbLT61r0ggknlA6nW+tWjjXgN/zTiDkEazxZCPjDh7cLa1N5FJYMhUlooZU5rlClDm6E70VHu3W/wBKlpWixZTJNNbL46934OdLJe5uYcRODoj4RkePR7C1OanfGFqcYjxCYRbSgOa5VJ5hpKh6J6ddnVdF181xt0W8W+TBnR2pkKS2pl+O+gLQ6hQ0p
KknoQQSCDVJHk/8MwQRgGOAjuItjP8ApqLkpY6iE5wqWvqt/O1vf3l/qCzn+KVz/qv8xUBG4DcN4chp9jBMdZfaUFtuItjIUlQOwQeXoQan85/ilc/6r/MVsZJbrNO3EvUzQUdNDNe1evibLSlK7h9GFKUoCr3PhvYbnJckiM7BkuHa3bfIcjlZ3slQQQFHfrIJr4PkogeL3r237qu9KvVeov1FiqzjqUmUj5KIHi969t+6nyUQPF717b91XelZ09Tf6EtNU4mU5nhVYwoGUu4XJIIPZy5zimzr6UAhJ/MQRVriRGIEZuPGZbjx2khKGmkBKEAdwAHQCvdSq5VJz1SZXKUpfM7ilKVWRFKUoBSlKArmS4NByefGmvyZsWTHaUylcN/s9pUQSD0O+qRUZ8lUHxi9+2/dV2pVmklZL2RXKnCTvKKf8FJ+SqD4xe/bfup8lUHxi9+2/dV2pTSPu5IjoaXAuSKT8lUHxi9+2/dT5KoPjF79t+6rtSmkfdyQ0NLgXJFJ+SqD4xe/bfup8lUHxi9+2/dV2pTSPu5IaGlwLkik/JVB8Yvftv3V65HCG1y2lNSLneH2VfObXM2lQ+g9KvVKyqsk7r0RlUaSd1FckKUpVRaf/9k=" - }, - "metadata": {}, - "output_type": "display_data" - } - ], - "source": [ - "import * as tslab from \"tslab\";\n", - "\n", - "const graph = app.getGraph();\n", - "const image = await graph.drawMermaidPng();\n", - "const arrayBuffer = await image.arrayBuffer();\n", - "\n", - "await tslab.display.png(new Uint8Array(arrayBuffer));" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "When you try to call the tool, you can see that the model calls the tool with a bad input, causing the tool to throw an error. The prebuilt `ToolNode` that executes the tool has some built-in error handling that captures the error and passes it back to the model so that it can try again:" - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "metadata": {}, - "outputs": [ + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# How to handle tool calling errors\n", + "\n", + "LLMs aren't perfect at calling tools. The model may try to call a tool that doesn't exist or fail to return arguments that match the requested schema. Strategies like keeping schemas simple, reducing the number of tools you pass at once, and having good names and descriptions can help mitigate this risk, but aren't foolproof.\n", + "\n", + "This guide covers some ways to build error handling into your graphs to mitigate these failure modes.\n", + "\n", + "
\n",
+ "<div class=\"admonition tip\">\n",
+ "    <p class=\"admonition-title\">Compatibility</p>\n",
+ "    <p>\n",
+ "        This guide requires @langchain/langgraph>=0.0.28, @langchain/anthropic>=0.2.6, and @langchain/core>=0.2.17. For help upgrading, see this guide.\n",
+ "    </p>\n",
+ "</div>
\n", + "\n", + "## Using the prebuilt `ToolNode`\n", + "\n", + "To start, define a mock weather tool that has some hidden restrictions on input queries. The intent here is to simulate a real-world case where a model fails to call a tool correctly:\n", + "\n", + "```bash\n", + "$ npm install @langchain/anthropic\n", + "```" + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "HUMAN: \"what is the weather in san francisco?\"\n", - "AI: [\n", - " {\n", - " \"type\": \"text\",\n", - " \"text\": \"Okay, let's check the weather in San Francisco:\"\n", - " },\n", - " {\n", - " \"type\": \"tool_use\",\n", - " \"id\": \"toolu_01TiQFrXDi5x4p7oEEZ99T5p\",\n", - " \"name\": \"get_weather\",\n", - " \"input\": {\n", - " \"location\": \"San Francisco\"\n", - " }\n", - " }\n", - "]\n", - "TOOL: \"Error: Input queries must be all capitals\\n Please fix your mistakes.\"\n", - "AI: [\n", - " {\n", - " \"type\": \"text\",\n", - " \"text\": \"Apologies, let me try that again with the location in all capital letters:\"\n", - " },\n", - " {\n", - " \"type\": \"tool_use\",\n", - " \"id\": \"toolu_01B1LiK5i74E52EwfHL3S3XJ\",\n", - " \"name\": \"get_weather\",\n", - " \"input\": {\n", - " \"location\": \"SAN FRANCISCO\"\n", - " }\n", - " }\n", - "]\n", - "TOOL: \"It's 60 degrees and foggy\"\n", - "AI: \"The weather in San Francisco is 60 degrees and foggy.\"\n" - ] - } - ], - "source": [ - "import { HumanMessage } from \"@langchain/core/messages\";\n", - "\n", - "const response = await app.invoke({\n", - " messages: [\n", - " new HumanMessage(\"what is the weather in san francisco?\"),\n", - " ]\n", - "});\n", - "\n", - "for (const message of response.messages) {\n", - " // Anthropic returns tool calls in content as well as in `AIMessage.tool_calls`\n", - " const content = JSON.stringify(message.content, null, 2);\n", - " console.log(`${message._getType().toUpperCase()}: ${content}`);\n", - "}" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - 
"## Custom strategies\n", - "\n", - "This is a fine default in many cases, but there are cases where custom fallbacks may be better.\n", - "\n", - "For example, the below tool requires as input a list of elements of a specific length - tricky for a small model! We'll also intentionally avoid pluralizing `topic` to trick the model into thinking it should pass a string:" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": {}, - "outputs": [ + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [], + "source": [ + "import { z } from \"zod\";\n", + "import { tool } from \"@langchain/core/tools\";\n", + "\n", + "const getWeather = tool(async ({ location }) => {\n", + " if (location === \"SAN FRANCISCO\") {\n", + " return \"It's 60 degrees and foggy\";\n", + " } else if (location.toLowerCase() === \"san francisco\") {\n", + " throw new Error(\"Input queries must be all capitals\");\n", + " } else {\n", + " throw new Error(\"Invalid input.\");\n", + " }\n", + "}, {\n", + " name: \"get_weather\",\n", + " description: \"Call to get the current weather\",\n", + " schema: z.object({\n", + " location: z.string(),\n", + " }),\n", + "});" + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "HUMAN: \"Write me an incredible haiku about water.\"\n", - "AI: [\n", - " {\n", - " \"type\": \"text\",\n", - " \"text\": \"Okay, let's generate a haiku about water using the master haiku generator tool:\"\n", - " },\n", - " {\n", - " \"type\": \"tool_use\",\n", - " \"id\": \"toolu_011kEBcJ3LnqwbBgTMgsED9C\",\n", - " \"name\": \"master_haiku_generator\",\n", - " \"input\": {\n", - " \"topic\": [\n", - " \"water\"\n", - " ]\n", - " }\n", - " }\n", - "]\n", - "TOOL: \"Error: Received tool input did not match expected schema\\n Please fix your mistakes.\"\n", - "AI: [\n", - " {\n", - " \"type\": \"text\",\n", - " \"text\": \"Oops, looks like I need to provide 3 topics for the haiku generator. 
Let me try again with 3 water-related topics:\"\n", - " },\n", - " {\n", - " \"type\": \"tool_use\",\n", - " \"id\": \"toolu_017uPCEEjQzaBAY9iy8mv1cU\",\n", - " \"name\": \"master_haiku_generator\",\n", - " \"input\": {\n", - " \"topic\": [\n", - " \"ocean\",\n", - " \"waves\",\n", - " \"rain\"\n", - " ]\n", - " }\n", - " }\n", - "]\n", - "TOOL: \"Here is a haiku about the ocean, waves, and rain:\\n\\nWaves crash on the shore,\\nRhythmic dance of water's song,\\nRain falls from the sky.\"\n", - "AI: \"The haiku generator has produced a beautiful and evocative haiku about the different aspects of water - the ocean, waves, and rain. I hope you enjoy this poetic take on the theme of water!\"\n" - ] - } - ], - "source": [ - "import { StringOutputParser } from \"@langchain/core/output_parsers\";\n", - "import { Annotation } from \"@langchain/langgraph\";\n", - "import { BaseMessage } from \"@langchain/core/messages\";\n", - "\n", - "const CustomStrategyState = Annotation.Root({\n", - " messages: Annotation({\n", - " reducer: messagesStateReducer,\n", - " }),\n", - "});\n", - "\n", - "const haikuRequestSchema = z.object({\n", - " topic: z.array(z.string()).length(3),\n", - "});\n", - "\n", - "const masterHaikuGenerator = tool(async ({ topic }) => {\n", - " const model = new ChatAnthropic({\n", - " model: \"claude-3-haiku-20240307\",\n", - " temperature: 0,\n", - " });\n", - " const chain = model.pipe(new StringOutputParser());\n", - " const topics = topic.join(\", \");\n", - " const haiku = await chain.invoke(`Write a haiku about ${topics}`);\n", - " return haiku;\n", - "}, {\n", - " name: \"master_haiku_generator\",\n", - " description: \"Generates a haiku based on the provided topics.\",\n", - " schema: haikuRequestSchema,\n", - "});\n", - "\n", - "const customStrategyToolNode = new ToolNode([masterHaikuGenerator]);\n", - "\n", - "const customStrategyModel = new ChatAnthropic({\n", - " model: \"claude-3-haiku-20240307\",\n", - " temperature: 0,\n", - "});\n", - "const 
customStrategyModelWithTools = customStrategyModel.bindTools([masterHaikuGenerator]);\n", - "\n", - "const customStrategyShouldContinue = async (state: typeof CustomStrategyState.State) => {\n", - " const { messages } = state;\n", - " const lastMessage = messages[messages.length - 1];\n", - " if (isAIMessage(lastMessage) && lastMessage.tool_calls?.length) {\n", - " return \"tools\";\n", - " }\n", - " return \"__end__\";\n", - "}\n", - "\n", - "const customStrategyCallModel = async (state: typeof CustomStrategyState.State) => {\n", - " const { messages } = state;\n", - " const response = await customStrategyModelWithTools.invoke(messages);\n", - " return { messages: [response] };\n", - "}\n", - "\n", - "const customStrategyApp = new StateGraph(CustomStrategyState)\n", - " .addNode(\"tools\", customStrategyToolNode)\n", - " .addNode(\"agent\", customStrategyCallModel)\n", - " .addEdge(\"__start__\", \"agent\")\n", - " .addEdge(\"tools\", \"agent\")\n", - " .addConditionalEdges(\"agent\", customStrategyShouldContinue, {\n", - " // Explicitly list possible destinations so that\n", - " // we can automatically draw the graph below.\n", - " tools: \"tools\",\n", - " __end__: \"__end__\",\n", - " })\n", - " .compile();\n", - "\n", - "const response2 = await customStrategyApp.invoke(\n", - " {\n", - " messages: [new HumanMessage(\"Write me an incredible haiku about water.\")],\n", - " },\n", - " { recursionLimit: 10 }\n", - ");\n", - "\n", - "for (const message of response2.messages) {\n", - " // Anthropic returns tool calls in content as well as in `AIMessage.tool_calls`\n", - " const content = JSON.stringify(message.content, null, 2);\n", - " console.log(`${message._getType().toUpperCase()}: ${content}`);\n", - "}" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "We can see that the model takes two attempts.\n", - "\n", - "A better strategy might be to trim the failed attempt to reduce distraction, then fall back to a more advanced model. 
Here's an example - note the custom-built tool calling node instead of the prebuilt `ToolNode`:" - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "metadata": {}, - "outputs": [], - "source": [ - "import { AIMessage, ToolMessage, RemoveMessage, BaseMessage } from \"@langchain/core/messages\";\n", - "import { Annotation } from \"@langchain/langgraph\";\n", - "\n", - "const CustomStrategyState2 = Annotation.Root({\n", - " messages: Annotation({\n", - " reducer: messagesStateReducer,\n", - " }),\n", - "});\n", - "\n", - "const haikuRequestSchema2 = z.object({\n", - " topic: z.array(z.string()).length(3),\n", - "});\n", - "\n", - "const masterHaikuGenerator2 = tool(async ({ topic }) => {\n", - " const model = new ChatAnthropic({\n", - " model: \"claude-3-haiku-20240307\",\n", - " temperature: 0,\n", - " });\n", - " const chain = model.pipe(new StringOutputParser());\n", - " const topics = topic.join(\", \");\n", - " const haiku = await chain.invoke(`Write a haiku about ${topics}`);\n", - " return haiku;\n", - "}, {\n", - " name: \"master_haiku_generator\",\n", - " description: \"Generates a haiku based on the provided topics.\",\n", - " schema: haikuRequestSchema2,\n", - "});\n", - "\n", - "const callTool2 = async (state: typeof CustomStrategyState2.State) => {\n", - " const { messages } = state;\n", - " const toolsByName = { master_haiku_generator: masterHaikuGenerator };\n", - " const lastMessage = messages[messages.length - 1] as AIMessage;\n", - " const outputMessages: ToolMessage[] = [];\n", - " for (const toolCall of lastMessage.tool_calls) {\n", - " try {\n", - " const toolResult = await toolsByName[toolCall.name].invoke(toolCall);\n", - " outputMessages.push(toolResult);\n", - " } catch (error: any) {\n", - " // Return the error if the tool call fails\n", - " outputMessages.push(\n", - " new ToolMessage({\n", - " content: error.message,\n", - " name: toolCall.name,\n", - " tool_call_id: toolCall.id!,\n", - " additional_kwargs: { error }\n", - " 
})\n", - " );\n", - " }\n", - " }\n", - " return { messages: outputMessages };\n", - "};\n", - "\n", - "const model = new ChatAnthropic({\n", - " model: \"claude-3-haiku-20240307\",\n", - " temperature: 0,\n", - "});\n", - "const modelWithTools2 = model.bindTools([masterHaikuGenerator2]);\n", - "\n", - "const betterModel = new ChatAnthropic({\n", - " model: \"claude-3-5-sonnet-20240620\",\n", - " temperature: 0,\n", - "});\n", - "const betterModelWithTools = betterModel.bindTools([masterHaikuGenerator2]);\n", - "\n", - "const shouldContinue2 = async (state: typeof CustomStrategyState2.State) => {\n", - " const { messages } = state;\n", - " const lastMessage = messages[messages.length - 1];\n", - " if (isAIMessage(lastMessage) && lastMessage.tool_calls?.length) {\n", - " return \"tools\";\n", - " }\n", - " return \"__end__\";\n", - "}\n", - "\n", - "const shouldFallback = async (state: typeof CustomStrategyState2.State) => {\n", - " const { messages } = state;\n", - " const failedToolMessages = messages.find((message) => {\n", - " return message._getType() === \"tool\" && message.additional_kwargs.error !== undefined;\n", - " });\n", - " if (failedToolMessages) {\n", - " return \"remove_failed_tool_call_attempt\";\n", - " }\n", - " return \"agent\";\n", - "}\n", - "\n", - "const callModel2 = async (state: typeof CustomStrategyState2.State) => {\n", - " const { messages } = state;\n", - " const response = await modelWithTools2.invoke(messages);\n", - " return { messages: [response] };\n", - "}\n", - "\n", - "const removeFailedToolCallAttempt = async (state: typeof CustomStrategyState2.State) => {\n", - " const { messages } = state;\n", - " // Remove all messages from the most recent\n", - " // instance of AIMessage onwards.\n", - " const lastAIMessageIndex = messages\n", - " .map((msg, index) => ({ msg, index }))\n", - " .reverse()\n", - " .findIndex(({ msg }) => isAIMessage(msg));\n", - " const messagesToRemove = messages.slice(lastAIMessageIndex);\n", - " return { 
messages: messagesToRemove.map(m => new RemoveMessage({ id: m.id })) };\n", - "}\n", - "\n", - "const callFallbackModel = async (state: typeof CustomStrategyState2.State) => {\n", - " const { messages } = state;\n", - " const response = await betterModelWithTools.invoke(messages);\n", - " return { messages: [response] };\n", - "}\n", - "\n", - "const app2 = new StateGraph(CustomStrategyState2)\n", - " .addNode(\"tools\", callTool2)\n", - " .addNode(\"agent\", callModel2)\n", - " .addNode(\"remove_failed_tool_call_attempt\", removeFailedToolCallAttempt)\n", - " .addNode(\"fallback_agent\", callFallbackModel)\n", - " .addEdge(\"__start__\", \"agent\")\n", - " .addConditionalEdges(\"agent\", shouldContinue2, {\n", - " // Explicitly list possible destinations so that\n", - " // we can automatically draw the graph below.\n", - " tools: \"tools\",\n", - " __end__: \"__end__\",\n", - " })\n", - " .addConditionalEdges(\"tools\", shouldFallback, {\n", - " remove_failed_tool_call_attempt: \"remove_failed_tool_call_attempt\",\n", - " agent: \"agent\",\n", - " })\n", - " .addEdge(\"remove_failed_tool_call_attempt\", \"fallback_agent\")\n", - " .addEdge(\"fallback_agent\", \"tools\")\n", - " .compile();" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The `tools` node will now return `ToolMessage`s with an `error` field in `additional_kwargs` if a tool call fails. If that happens, it will go to another node that removes the failed tool messages, and has a better model retry the tool call generation. We also add a trimming step via returning the special message modifier `RemoveMessage` to remove previous messages from the state.\n", - "\n", - "The diagram below shows this visually:" - ] - }, - { - "cell_type": "code", - "execution_count": 7, - "metadata": {}, - "outputs": [ + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, set up a graph implementation of the [ReAct agent](https://langchain-ai.github.io/langgraphjs/concepts/). 
This agent takes some query as input, then repeatedly call tools until it has enough information to resolve the query. We'll use the prebuilt [`ToolNode`](https://langchain-ai.github.io/langgraphjs/reference/classes/langgraph_prebuilt.ToolNode.html) to execute called tools, and a small, fast model powered by Anthropic:" + ] + }, { - "data": { - "image/png": "/9j/4AAQSkZJRgABAQAAAQABAAD/4gHYSUNDX1BST0ZJTEUAAQEAAAHIAAAAAAQwAABtbnRyUkdCIFhZWiAH4AABAAEAAAAAAABhY3NwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAQAA9tYAAQAAAADTLQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAlkZXNjAAAA8AAAACRyWFlaAAABFAAAABRnWFlaAAABKAAAABRiWFlaAAABPAAAABR3dHB0AAABUAAAABRyVFJDAAABZAAAAChnVFJDAAABZAAAAChiVFJDAAABZAAAAChjcHJ0AAABjAAAADxtbHVjAAAAAAAAAAEAAAAMZW5VUwAAAAgAAAAcAHMAUgBHAEJYWVogAAAAAAAAb6IAADj1AAADkFhZWiAAAAAAAABimQAAt4UAABjaWFlaIAAAAAAAACSgAAAPhAAAts9YWVogAAAAAAAA9tYAAQAAAADTLXBhcmEAAAAAAAQAAAACZmYAAPKnAAANWQAAE9AAAApbAAAAAAAAAABtbHVjAAAAAAAAAAEAAAAMZW5VUwAAACAAAAAcAEcAbwBvAGcAbABlACAASQBuAGMALgAgADIAMAAxADb/2wBDAAMCAgMCAgMDAwMEAwMEBQgFBQQEBQoHBwYIDAoMDAsKCwsNDhIQDQ4RDgsLEBYQERMUFRUVDA8XGBYUGBIUFRT/2wBDAQMEBAUEBQkFBQkUDQsNFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBT/wAARCAGCATsDASIAAhEBAxEB/8QAHQABAAICAwEBAAAAAAAAAAAAAAYHBQgCAwQBCf/EAFwQAAEDAwICBAgICAgKBwkAAAEAAgMEBQYREgchExYxQQgUFSJRVZTRFzJUVmGRk9IjQlJxcnWBtAklMzc4obPhJDQ1NkRidJKxsjlDZHeCorVGU2Nzg4TB1PH/xAAbAQEAAgMBAQAAAAAAAAAAAAAAAQMCBAUGB//EADcRAQABAgELAQUHBQEBAAAAAAABAgMRBBITFCExQVFSkaFhFWJxsfAFIjIzgdHhQlNjwcLxcv/aAAwDAQACEQMRAD8A/VNERAREQEREBERAREQEREBERAREQdFXW09BCZqqeKmiB0Mkzw1uv5yvD1qsnrig9qZ71HOLkMdRYrZHKxskbrpTBzHjUEbj2hR/q/a/VtH9g33LWynKrWSxTnxMzOO7B0MnyXT052OCw+tVk9cUHtTPenWqyeuKD2pnvVedX7X6to/sG+5Or9r9W0f2DfctL2rk/RV3hs+zve8LD61WT1xQe1M96darJ64oPame9V51ftfq2j+wb7k6v2v1bR/YN9ye1cn6Ku8Hs73vCw+tVk9cUHtTPenWqyeuKD2pnvVedX7X6to/sG+5Or9r9W0f2DfcntXJ+irvB7O97wsPrVZPXFB7Uz3p1qsnrig9qZ71XnV+1+raP7BvuTq/a/VtH9g33J7Vyfoq7wezve8LD61WT1xQe1M96yUUrJ4mSRvbJG8BzXtOocD2EHvCqiXH7X0b/wCLaTsP/UN9ymnDH+b
bE/1TSf2LFv5PlFvKqKq7cTGbMRt9cf2aeUZNq8RtxxSZERXtIREQEREBERAREQEREBERAREQEREBERAREQEREBERAREQQrit/ka1frWm/wCYrGLJ8Vv8jWr9a03/ADFRy+5Da8XtslxvNyo7Rb4yA+rrp2QxNJOgBe4gDUkAc1wvteMdFEcp+bv5BstT8WQWNyXIaHEcdud8ucpht1tppKuokDS4tjY0ucQBzJ0B5BRocc+G5104g4ty7f46pvvrz3DizhOSW2ttdnyLGMrulVTyxU9jZeKZxr3lh0h03O5O7DyPInkVwYt1Y7Yl0Jrp4SiGe+EDcbbwbvuXWXEL5Q1dK2A07L1SRMY9kpGk3mzaOaAewO3Aubq3tU2ufE2qteN2+6PwfKp6qslfH5IpqWCWqh2k+dJtmMbWkDUHpOeoHbyVNU3B/NLvwz4k49TWibF7NcaamGP45c7qyt8VnjJfKGSNc4RxPLYw1m4gHU6NHJSXiHZ8w4idULjdMCq6qzUktULrh7rtTAzvcxgp5nuEgiljYRJ5hdr5wO06aLami3jERhvnj6Rhx+ubXiqvfOO7l6/BJqnwisbhx/GLrBQXmuGQV0tspqOnox4zFVRiTfDLG5wLXB0bmd4B0JIbq4Y1vG2/P4y2vF+pN6ittXZRXSMcym8YgkdUMj6SQio29EwEhwbudqeQcFDeHfCDK7BLg8NVjsFrprPmF0ucsVLVxSQQUk9POYjHzDi1rpmx6bQ7VpO0N0KsDMbLk1l402jMbLYDklvlsktlqoIayKnlpnGoZK2X8K4BzdA4ENOv0KJptU1Zsbdk8eyYquTGM7N3BayKEP448OI3ua7iBizXNOhab1TAg/76O45cN2uLXcQMWBB0IN6puX/nWro6+UtjPp5ppL/JP/MVJeGP822J/qmk/sWKLsniqqMTwSMmhkj3skjcHNc0jUEEdoI71KOGP822J/qmk/sWL0n2R+Td+NPyqcr7R3U/qkyIi7DiCIiAiIgIiICIiAiIgIiICIiAiIgIiICIiAiIgIiICIiCFcVv8jWr9a03/MVipI2StLXta9p7nDUKaZNjNJldubR1b54mMlZMx9PJse17TqCCo/8ABTQ+uL37b/ctPK8kjK4owrwwx4S6eS5VRZommqGF8Qpfk0P+4FyZR08bg5sEbXDsIYAQsx8FND64vftv9yfBTQ+uL37b/cub7In+7HaW5r9rlLGosl8FND64vftv9yq3woLXU8KeA2W5XYL3dY7vbYIpKd09T0jAXTRsOrSOfJxT2P8A5Y7Sn2ha5SsBF32jhlSVtpoqiS8XoySwMkdpWaDUtBPcvX8FND64vftv9yex/wDLHaT2ha5Sw3iFMf8AR4v9wL55PpR/o0P2YWa+Cmh9cXv23+5PgpofXF79t/uT2RP92O0o1+1ylipABC4AaANPJSXhj/Ntif6ppP7Fix54UUBBBvF70P8A23+5Sqz2qCxWihttKHClo4GU8QcdSGMaGt1PfyAXUyTJoyS3XRnY50x4x/do5XlFN+IzeD2IiLac4REQEREBERAREQEREBERAREQEREBERAREQEREBERAREQEREBERAVD+HT/RQ4hf7LB+8xK+FQ/h0/0UOIX+ywfvMSC58d/wA37Z/ssX/IFkVjsd/zftn+yxf8gWRQEREBERAREQEREBERAREQEREBERAREQEREBERAREQEREBERAREQEREBERAREQFQ/h0/0UOIX+ywfvMSvhateG7xdwW4eDtxExulzTHqnImsjpjaIbrA+rErKmPfH0Ifv3N2u1bpqNDr2INlMd/wA37Z/ssX/IFkVCOHHE/Dc0t9BQY9ltjvtdFRtfJTWy5Q1EjGsDGvJaxxIDS9gJ7i9uvaFN0BERAREQEREBERAREQEREBERAREQEREBERAREQEREBERAREQEREBERAREQEXhvN7osfoXVlfOIIGkNGjS9z3Hsa1rQXOce5rQSfQoXU8RL1WPJt
[... base64-encoded PNG image data omitted ...]"
- },
- "metadata": {},
- "output_type": "display_data"
- }
- ],
- "source": [
- "import * as tslab from \"tslab\";\n",
- "\n",
- "const graph2 = app2.getGraph();\n",
- "const image2 = await graph2.drawMermaidPng();\n",
- "const arrayBuffer2 = await image2.arrayBuffer();\n",
- "\n",
- "await tslab.display.png(new Uint8Array(arrayBuffer2));"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "Let's try it out. To emphasize the removal steps, let's `stream` the responses from the model so that we can see each executed node:"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 8,
- "metadata": {},
- "outputs": [
+ "cell_type": "code",
+ "execution_count": 2,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import { StateGraph, Annotation, messagesStateReducer } from \"@langchain/langgraph\";\n",
+ "import { ToolNode } from \"@langchain/langgraph/prebuilt\";\n",
+ "import { ChatAnthropic } from \"@langchain/anthropic\";\n",
+ "import { BaseMessage, isAIMessage } from \"@langchain/core/messages\";\n",
+ "\n",
+ "const GraphState = Annotation.Root({\n",
+ "  messages: Annotation<BaseMessage[]>({\n",
+ "    reducer: messagesStateReducer,\n",
+ "  }),\n",
+ "});\n",
+ "\n",
+ "const toolNode = new ToolNode([getWeather]);\n",
+ "\n",
+ "const modelWithTools = new ChatAnthropic({\n",
+ "  model: \"claude-3-haiku-20240307\",\n",
+ "  temperature: 0,\n",
+ "}).bindTools([getWeather]);\n",
+ "\n",
+ "const shouldContinue = async (state: typeof GraphState.State) => {\n",
+ "  const { messages } = state;\n",
+ "  const lastMessage = messages[messages.length - 1];\n",
+ "  if (isAIMessage(lastMessage) && lastMessage.tool_calls?.length) {\n",
+ "    return \"tools\";\n",
+ "  }\n",
+ "  return \"__end__\";\n",
+ "}\n",
+ "\n",
+ "const callModel = async (state: typeof GraphState.State) => {\n",
+ "  const { messages } = state;\n",
+ "  const response = await modelWithTools.invoke(messages);\n",
+ " return { messages: [response] };\n", + "}\n", + "\n", + "const app = new StateGraph(GraphState)\n", + " .addNode(\"agent\", callModel)\n", + " .addNode(\"tools\", toolNode)\n", + " .addEdge(\"__start__\", \"agent\")\n", + " .addEdge(\"tools\", \"agent\")\n", + " .addConditionalEdges(\"agent\", shouldContinue, {\n", + " // Explicitly list possible destinations so that\n", + " // we can automatically draw the graph below.\n", + " tools: \"tools\",\n", + " __end__: \"__end__\",\n", + " })\n", + " .compile();" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "/9j/4AAQSkZJRgABAQAAAQABAAD/4gHYSUNDX1BST0ZJTEUAAQEAAAHIAAAAAAQwAABtbnRyUkdCIFhZWiAH4AABAAEAAAAAAABhY3NwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAQAA9tYAAQAAAADTLQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAlkZXNjAAAA8AAAACRyWFlaAAABFAAAABRnWFlaAAABKAAAABRiWFlaAAABPAAAABR3dHB0AAABUAAAABRyVFJDAAABZAAAAChnVFJDAAABZAAAAChiVFJDAAABZAAAAChjcHJ0AAABjAAAADxtbHVjAAAAAAAAAAEAAAAMZW5VUwAAAAgAAAAcAHMAUgBHAEJYWVogAAAAAAAAb6IAADj1AAADkFhZWiAAAAAAAABimQAAt4UAABjaWFlaIAAAAAAAACSgAAAPhAAAts9YWVogAAAAAAAA9tYAAQAAAADTLXBhcmEAAAAAAAQAAAACZmYAAPKnAAANWQAAE9AAAApbAAAAAAAAAABtbHVjAAAAAAAAAAEAAAAMZW5VUwAAACAAAAAcAEcAbwBvAGcAbABlACAASQBuAGMALgAgADIAMAAxADb/2wBDAAMCAgMCAgMDAwMEAwMEBQgFBQQEBQoHBwYIDAoMDAsKCwsNDhIQDQ4RDgsLEBYQERMUFRUVDA8XGBYUGBIUFRT/2wBDAQMEBAUEBQkFBQkUDQsNFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBT/wAARCADaAMcDASIAAhEBAxEB/8QAHQABAAMBAAMBAQAAAAAAAAAAAAUGBwgCAwQJAf/EAE8QAAEDBAADAwYIBw0HBQAAAAECAwQABQYRBxIhEzFVCBYiQZTRFBUXMlFhk+EJN0JxdbO0IyQ0NkNSYnN2gaHB0hhUVpGSlbElM0Vyov/EABsBAQACAwEBAAAAAAAAAAAAAAACAwEEBQYH/8QANREAAgECAQgIBQUBAQAAAAAAAAECAxExBBITIUFRUpEFFBVhcaGxwSIyM2LRQnKB4fA0Y//aAAwDAQACEQMRAD8A/VOlKUApSlAK+SbdoNtKBMmx4pX1SH3Uo5vzbNfXWZ5/Cjzs/tSJMdqQkWyQQl1AUAe1a+mjlGEZTlgk2XUaelmoXxLx51WXxiB7Sj3086rL4xA9pR76zvzetfhsP7BHup5vWvw2H9gj3Vye1cn4Jc0dPs77vI0TzqsvjED2lHvp51WXxiB7Sj31nfm9a/DYf2CPdTzetfhsP7BHup2
rk/BLmh2d93kaJ51WXxiB7Sj3086rL4xA9pR76zvzetfhsP7BHup5vWvw2H9gj3U7VyfglzQ7O+7yNE86rL4xA9pR76edVl8Yge0o99Z35vWvw2H9gj3U83rX4bD+wR7qdq5PwS5odnfd5GiedVl8Yge0o99eTWS2h91Dbd1hOOLISlCZCCVE9wA3Wc+b1r8Nh/YI91Rl/s1visW91mDGZdTdbfpbbKUqH78Z9YFX0OkKFetCiotZzSxW12IyyDNi5Z2BtdKUrfOQKUpQClKUApSlAKUpQClKUApSlAKznNfxg2v9FyP1rVaNWc5r+MG1/ouR+taqqt9Cp+1m5kn1onjSlK8IenILMs4snD6zi6X+cIENTqI6FBtbq3HVHSUIQgFS1HrpKQT0P0VQMr8pDHsemYQY7U242zJH5DZlsW+WtbCGW3CSGkslal9ogJKNBQHMrWgTUxxztlrueHxhdLbkE4MT2ZEaRjDCnp0B9IUUSEJTs+j1B9FXztFJBNZeZmcO2LhZmGT2O73WRYr5NMtuLbv/AFBcNxiQwxIcit9UrIU2VoSOm+4dQNulThKN5d+3u1GtUnJOy7vU1jJuOeEYbdmbder0q3yXG23SXYb/AGbSXOiC64G+Rrf9Mpr6cl4w4liWRjH7lcnU3tUduWmBGhSJLqmVqUhKwlptWxtCt6+boE6BG8H41NZRnxzu3ybTm0iPPs7Qxe22pl2PDV2kbbhmKSUjtEulQU08e5ICUqJrQ+HlonO8ZxfH7VOjRXcGtcdMmXFW1yu9u+txklQGnACgqQeo6bFSdKEYKT3b/DuIqpNyzUTnDjjjbeIWX5Tj7cObElWe4uQ2lLhSQ282httSlqcU0lCFcy1AIKuYgBQ2FA1plY9wzfnYjxTz+xXCx3dKb3e1XaFdWoS1wFsqiMpIU+PRQoKZUnlVo7I1vdbDVFVRUvhwsi6m21rFRGTfwOB+lLf+2M1L1EZN/A4H6Ut/7YzW10d/20f3R9UKv05eDNfpSlewPIilKUApSlAKUpQClKUApSlAKUpQCs5zX8YNr/Rcj9a1WjVXMlwaDk8+NNfkzYsmO0plK4b/AGe0qIJB6HfVIrEoqpCUG7XTRfQqKlUU2ZzlfD3GM6VGOR4/bL6YvMGDcIqHuy5tc3LzA63yp3r6BUB/s/cMt78wMb/7Wz/prUvkqg+MXv237qfJVB8Yvftv3VxV0XNKyrep1nltB63EpWLcOMVwd997HcctdjdkJCHV2+IhkuJB2AopA2BVjqS+SqD4xe/bfup8lUHxi9+2/dUX0S5O7qrkySy+ktSTI2lZpxkizcJ4ncI7HbL3dEQMlu78O4B2RzKU2hnnTynXonfrrXfkqg+MXv237qx2P/6rkzPaFLcyvXyxW7JrVItl2gx7nbpAAdiy2g404AQRzJPQ9QD/AHVUEcAeGjZ2nAccSdEbFsZHQjRHzforUPkqg+MXv237qfJVB8Yvftv3VNdFSjqVZcmReXUXjEzi18E+H9juMa4W/CrDBnRlh1mTHtzSHG1juUlQTsEfTU9k38DgfpS3/tjNWn5KoPjF79t+6v6nhRbO3juO3G7SUsPtyEtPS+ZBW2sLTsa6jmSD/dWxk/R7pV6dadW+a08HsdyEstpOLjFWuXWlKV0ziClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQHO/lI/jx8nn+0Mv9mNdEVzv5SP48fJ5/tDL/AGY10RQClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQHO/lI/jx8nn+0Mv8AZjXRFc7+Uj+PHyef7Qy/2Y10RQClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQClKUApSoTJMvgYyG0P9rJmvAlmDFTzvOgd5A2AlPcOZRCRsbPUVKMXJ2iZScnZE3Xy3S2Rb1bZdvnx25cGWyuO+w6NodbUkpUlQ9YIJB/PVDczzJZJKmLPboTfXlEmWt1z
6thKAAfqCj+evX555d/u1k/6nqt0W+S5m11Ws/wBJ+OnlF8G5fArjBfsRfClxWHu2t76v5eKv0mlb9Z16Kv6SVD1V+r3kU8F3+B/AOz2qehbV6ujirxcWXO9p51CAG9eopbQ2kj+clX01UOLnBg8Zs/wrLL5EtIn4w/2qW2u05ZiAoLQ07sbKErHMB/SWPyumueeeXf7tZP8AqepolxLmOqVtxpVKzdGa5Ykgrh2Zwb+al11HT8/Kf/FS1n4jsvyGot5hKskh1QQ26p0OxnFE6CQ7oaJOgAtKdkgDZ6VjRN/K0/B+2JCWT1YK7iXKlKVSa4pSlAKUpQClKUApSlAKUpQClKUApSlAKUpQEVlF+RjNhl3FbfbKaSEtsg6LrqlBLaAfUVLUlP8AfWew4zqFOyZbpk3CQQuQ+T3n1JT9CE7ISn1D6ySZ7istXwbHGv5J27oDn0aSy8tP/wC0IqKq2fwU4pbdfsl5M7OQwWa57RSudOPPFHJ8ZvGSSMQv1zlOY3ARMnWqJZYz0GOeQualSHVJX6aBvlaPMkddHYr08TuL2QfHOTohZfFwNiy41HvUCPIjMPLuzrqXVFO3QSUJLaG9NgK5l9/cK1bG660VdHSNK5me4ncQslvbGO2hrIYr9msdtk3J61wLdIlOy5LJWQ8JTjSUpHLrTaNlXP1SAAZm0ZRxLyrLsRxq63M4RcZeOS7hc2o0OM+6HmZbbSFo5u0QgrSsKI2sAKI79KCxlVU8Ezf1uJb5eZQTzHlGzrZ+ivF9huSy4y82l1pxJQttYBSpJGiCD3g1yreLtknEqzcHJUzI37ZeGcunWp+XAisacdYRMaTICHELAVytH0fm/uqunROuqIbLkeIw06+uU6hCUrfcSlKnCBoqISAAT39AB16Cs4ayUJ599RO4BfHkSpNgmOrfcjtiREfeXzLcYKtFKiepLatDZ6lKkbJPMau1ZXblqa4gY2pHznBKaXrv7Mtcx/u5kI/wrVK2qmtRnvXu17XODlUFCq0hSlKpNQUpSgFKUoBSlKAUpSgFKUoBSlKAUpSgK7ntjfv2NvtRE80+OtEuMknl5nG1BQRv1BYBQT9CzVNhTG7hFbkNE8ix3KGlJI6FJHqIIII9RBFapWP53fbHbOKNpxa1TixmV7juTfiv4K4uM80gHbzziEkMElJSHDvZ0ClXokWq045jdrYfg38lyhUm4ywZRcu4BWPMLtf5b92vkCLkDSGrtbbfMDUaaUt9mlaxyFYPIEpPKpIUEgKB67zzirwnyQ5JZn8etuR3ddstLEKJdI92tiOR1vm0txqSwSgn0SpbOubp6I5RXQjhvcMlEvGLilQ36cUtPtq+sFK+b/mkH6q9fxhP/wCHL17J99Y6vV2LzR03KjJapGbp4KyciYsl9vWQ3SxZ2m1swbtdMZkIYTNKRtSVpW2pJAUVEKCUkb6aGgLZb+G9vt+V2jIRMuEi4WyzrsjZkvh0OMqW2srcURzKc20n0ubrs7BJ3Xvu2dQ7DcLZAucSXbp1zcLMCLLDbTstY1tLSVLBWRsdE7PUVK/GE/8A4cvXsn306vV3E1Oitq5lEk8BLC/iUWxNXG7QzDvD18h3KM+hEuNKcdccUUK5OXl/dnE8qkn0T12etaBaoKrZbIkNUp+cqOyhoyZSgp14pAHOsgAFR1skAdT3V4ImXJ0hKMbvKlE60phKP8VLAqTtuH3u/KHxmn4it5+eyy8Fy3B/NK07S2PUSkqV1Oik6VTQSXz2S8fbEi61GmrpnswW3qu2TSLwRuFBaXCjq3tLjqlDtlD/AOnIlG/pLg6aO9Eqk8OOJWLZrJv9jx3to7+MSvi2bAehORTHUNhHKFJAKFBJKSn1a7t1dqTkpNJYLD/eZwatR1ZubFKUqsqFKUoBSlKAUpSgFKUoBSlKAUpSgFfwkDvOvz1HT8hgQLgzbFTIxvEllx+LblPoQ/ISjXMUJJ2QNjZ7hsbrNIWEz+OuKY3cOJmPycVuFtupujF
it95WpBCFEx/hJb5QpSfRXoHopAOwCpFAfdcciu3FRzPcMtEfI8GdtyG4jOXLiIShx5Q5l/BkrO1gJ5RzgD550UkJJvWM4+nGrDbLaZsu6uwYrcX4wuKw5KfCQBzOLAHMo62TrqetStKAUpUbklkTkuO3W0LlyoCLhFdiGXCWEPshaCnnbUQQFp3sEggEDoaA/ILyy/KMmcT/ACiF3ewXBTVrxR8RLJIjr/LaXzKkJPdtTg2FfzUo+iv1L8n/AIvQ+OXCWwZfE5W3pjPJMjp/kJKPRdR9OuYEjfekpPrriHi3+D/4e4FxI4V4/b7zkz0PKro/CmuSZUdTjaEM84LRSwADvv5goa9VdreT/wCT/j3k4YbMxrGplznQJU9dxW5dXW3HQ4pttsgFttA5dNJ9W9k9e7QGmUpSgKvxFwCJxIw+6Y/Jn3CzonpRzT7PIMaU0pCgpCkuD1gpHfsEdKhY96ynFM0xXEW8bnZBi7lt7OTmD89tTrMltJ/99s+krnCUnnH5S+6tCpQEXjmUWfMLYLjY7pDu8ArU18JhPJdb50nSk7SSNg9CKlKzDKOFdxx3DrhE4Pu2TAb3LuKbk8tdtS5GlL6BaFpTrk5wlIKkgkAHQBOxLw+LFrVxTc4dyY9xayBu2puSJSoDiIcpvYDhac6j0CUbBOgVgAkg6AvFKUoBSlKAUpSgFKUoBSlKAVnWY57OvreX4rw6uFqd4i2VqMXI14Q6iPFD/pIcUQn0/wBz5lDl2NgA67q0Ws0uU5GN8d7NGhYIqQvJbe/8YZfGbJ+D/BgC2w8Qg6Srm9EqWOvQA+oCdtPDazoyO35fdrXbZudtW1u3v3xmNyKIAJX2YJVyJKlL9ZOiEkkCrdSlAKUpQClKw7jX5Q0jGsgZ4fcPLajLuKE9vmbgJV+9rW2dfviYsfMSNghOwVbHdzJ2BX/KPuURzyhvJ6tiJLS7im9S5KoiVguhr4OR2hT3hOwRvu6H6DXSNY7wL8nmPwxkzcoyO5Ly/iVeBzXTJJY2ob/kI6f5JlOgAABvQ3oBKU7FQClKUApSlAK9E2G3cIj8Z3nDbzamlFpam1hKho8qkkFJ+sEEeqvfSgMgj4vkPAPBccsOAWeZncBq59lKbvV51KjRHFHRaWtPKUtcydJ6aQg95JUNStV7t19aedts+LcGmXVMOLivJdShxPRSFFJOlD1g9RX21lPk6ysJl41kqsFhzYUBORz0TkTiSpc4LHbqTtSvQJ1ru/MKA1alKUApSlAKUpQClKUAr8+/KJ/CQXPF81h47jGK3exSrHdWlXtu9LjIclIbWsPQwlAeSlCwGyH0Ob79JI0T+gD8hqK2XHnUNIHepxQSP+Zri7y8vJnsfGSyO5ticy3jOLaz++IzUhG7pHSPmaB6upA9E96h6J36OpKMpYIFz8hzym808pa35fOyq1We3RbU7FZhOWlh1sOrWHS6F9o6vfKEta1r5x7/AFdRVyl+Dyx6Dw48nC3qucli23O9TZFzfjS3EtuoBIab2lWiAUNJWPqXv110z51WXxiB7Sj31LRz4WZsyUpXzQ7nDuG/gstiTrqexcC//BrmfJuIuVeVFkU/C+F8yRj2BQnVRb/nqElLj6h0XFt++9XqLvq3sdOXng01qZgluJfHTIc/zGXww4MdjLyBj0L5lrqe0gWBB2CAe52R0OkDYBHXelcui8FOBePcD7A/EtfbXG8T1/CLrfp6u0mXF87JcdWeutk6TvQ2e8kkzfDLhfjfCDEYmN4rbW7bbI/UhPVx5Z+c44vvWs66k/UBoAAWusAUpSgFKUoBSvkm3aDbSgTJseKV9Uh91KOb82zXzedVl8Yge0o99TUJNXSM2ZKUqL86rL4xA9pR76edVl8Yge0o99Z0c+FizMY8rPyn5vkv2XH7s3hispt9zkOxXnxcfgiYriUpU2k/uTnMVjtCO7XZnv3XO/Cf8Jffs3yy24rC4UQpd2vFx7GMIV3VHQhK1DRcBYXspGypewN
AnQ1XV/HnD8Y438KMhw+Zd7ahc6OTEkLko/e8lPpNOdDvQUBvXekqHrrjn8G5wOZxbJ8izzLixbbhbXXLPbI0x1CFJc7pDwBPqGmwobB5nB6qaOfCxZn6NUqL86rL4xA9pR76edVl8Yge0o99NHPhYsyUpUX51WXxiB7Sj30TlFmUoAXeCSegAko6/wCNNHPhYsyUpSlVmBVQy7Ln4ksWm0hBuBSFvyXBzNxEHu6flOK/JT3AAqV05Urtch9EWO684dNtpK1H6gNmshxpbku1N3F/Rl3I/DX1DfVSwCB19SU8qR9SRVsbRi6j2YeJu5LRVWfxYI/i8agy3u3uLZvEsjRk3HTyz130BHKkfUkAfVXu837WP/jYf2CPdVO4wcXYnCOJj78qHImC63Vi3nsGHnS0hSvTc02hZUoDuR0Kj3b0RX0ZFxsw3FI1sdul0djKuUb4ZHjCBJXJ7HptxbKWy42kb6laU6OwdEGq3WqSxkzuJwjq1KxafN+1+Gw/sE+6nm/a/DYf2CfdVdv/ABgw/G7PaLnMvbS4l4Tz24wmnJTktPLzFTbbSVLUACCSBobG9VDSuLzFyyjhszjkiFdLBlLs5DkwBRUAxHW4OTqOVXOjlUFAkaI0DUdJPiZlyii7PYrZ3lBZtsZDqSFJdabDbiSO4hSdEf3GpbGr+7hfZw5jhkWNbhAkKSO1irWvZU4ofPbKlElZ9JJJUoqBKkU2wcXMTyjJ5WP2q6mbc4y3W3EojPBrmbOnEpeKOzUUnoQlRIq3PMokMradQlxtaSlSFDYUD0INWRrSwm7r/YbiqpShWjY1ClVPhjcHJmKNx33C6/b3nYKlkklSW1ENkk9SS3yEk+vff31bKTjmScdx5yUXFuLFKUqBEUpSgMzz+FHnZ/akSY7UhItkghLqAoA9q19NfH5vWvw2H9gj3VJZr+MG1/ouR+tarxrn5fUnGcUm1qXqzxfS0pLKWk9iI/zetfhsP7BHup5vWvw2H9gj3VIVGZLk1rw+ySrvepzVutsYAuyHjoDZAAHrJJIAA2SSAASa5ulqP9T5nHU5t2TZ5+b1r8Nh/YI91PN61+Gw/sEe6qjD474LNsV2vCb8lmFaezM/4VFeYdjJcUEoUtpxCXAlRPRXLroevQ1IYrxYxXNJc+Larr2kmCymS+1JjuxlBlW+V1IdSnnbOj6adp+us59ZbX5ljVZJtp6vEnvN61+Gw/sEe6nm9a/DYf2CPdWVRvKOsmUcTsLxvFJ0e6wruuaJj64j6PQZYUtCmHFBKFpK06Kk8419HfWy0lOrHGT5mJqrTtn3VyP83rX4bD+wR7qhc0sluYxa4uNQIrbiW9pWhlIIOx3HVWqoLOf4pXP+q/zFbOSVajyims5/MtveWZPOWmhr2r1NlpSldg+inzXKILhbpUUnQfaU3v6Ngj/OslxVxS8btoWlSHW2EsuIUNFK0DlWD+ZSSK2Os6yqwu45cZN1iMKetUtZdmNtDa4zpABdCfW2rXpa6pV6WiFKKLorPg6axxX4/wBusdDI6qpzaltMm8oK23GTjmOXK322Xd/iTI7fdZMSA2XZC2GnP3QtoHVagFb5R1OjVWVkcvFeK9yzl7E8mulnyGxxY0X4HaXHZcR1h17mYdY1ztBfaJUCoBOwdkVusaSzMYQ/HdQ+y4OZDjagpKh9II6GvZWq9WpnYcLvOTOWuHWJZDwZl4JkV9xy53GKmx3C3yIVmjGa9anX53wttPZo2op5D2RUgHRQN6FeeM4pkmP3vC8vmY1c24UjMLxc3bZHY7STb485lbbKnW0np6Wlr1vl5zvuNdRUrFyCopWs8P6/BgOAfGti4xfFuK2fJrbh0uRPfvUG+wC3CivbKkPwnj1IdcJJbSpSdKJ0kjVb9SvlhxnswkLt9scIjBXJMuKN8jKd6UhtQ6F0jYAHzPnK/JSuyEHUfdte4k3GjFuT1Fk4URyMdlzNEJn3CRIRsaJQFdmk/mIbBH1EVdK9EKGxbob
ESM0liMw2lpppA0lCEjQA+oACvfVtSWfNyR5ucs+TlvFKUqsgKUpQGc5r+MG1/ouR+tarxryzX8YNr/Rcj9a1Vcyvh7jGdKinI8ftt9MXmDBuEVD3Zc2ubl5gdb5U719ArmdIW0kb7l7niulbda17kWGsj8pfErrlWGWR61xJ1y+Jr7EusuBa5CmJcmO3zhxLK0qSQ4OcLTpQJKBo71U5/s+8Mt/xAxv/ALWz/pqdxXhviuDPvvY7jlrsbshIQ6u3xEMlxIOwFFIGwK5yai7o5kJRpyU4t3Xd/Zz3mWE23JeFmd3PHMZzpd9ehxbeheTKnPyZLQkodLbLT61r0ggknlA6nW+tWjjXgN/zTiDkEazxZCPjDh7cLa1N5FJYMhUlooZU5rlClDm6E70VHu3W/wBKlpWixZTJNNbL46934OdLJe5uYcRODoj4RkePR7C1OanfGFqcYjxCYRbSgOa5VJ5hpKh6J6ddnVdF181xt0W8W+TBnR2pkKS2pl+O+gLQ6hQ0pKknoQQSCDVJHk/8MwQRgGOAjuItjP8ApqLkpY6iE5wqWvqt/O1vf3l/qCzn+KVz/qv8xUBG4DcN4chp9jBMdZfaUFtuItjIUlQOwQeXoQan85/ilc/6r/MVsZJbrNO3EvUzQUdNDNe1evibLSlK7h9GFKUoCr3PhvYbnJckiM7BkuHa3bfIcjlZ3slQQQFHfrIJr4PkogeL3r237qu9KvVeov1FiqzjqUmUj5KIHi969t+6nyUQPF717b91XelZ09Tf6EtNU4mU5nhVYwoGUu4XJIIPZy5zimzr6UAhJ/MQRVriRGIEZuPGZbjx2khKGmkBKEAdwAHQCvdSq5VJz1SZXKUpfM7ilKVWRFKUoBSlKArmS4NByefGmvyZsWTHaUylcN/s9pUQSD0O+qRUZ8lUHxi9+2/dV2pVmklZL2RXKnCTvKKf8FJ+SqD4xe/bfup8lUHxi9+2/dV2pTSPu5IjoaXAuSKT8lUHxi9+2/dT5KoPjF79t+6rtSmkfdyQ0NLgXJFJ+SqD4xe/bfup8lUHxi9+2/dV2pTSPu5IaGlwLkik/JVB8Yvftv3V65HCG1y2lNSLneH2VfObXM2lQ+g9KvVKyqsk7r0RlUaSd1FckKUpVRaf/9k=" + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "import * as tslab from \"tslab\";\n", + "\n", + "const graph = app.getGraph();\n", + "const image = await graph.drawMermaidPng();\n", + "const arrayBuffer = await image.arrayBuffer();\n", + "\n", + "await tslab.display.png(new Uint8Array(arrayBuffer));" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "When you try to call the tool, you can see that the model calls the tool with a bad input, causing the tool to throw an error. 
The prebuilt `ToolNode` that executes the tool has some built-in error handling that captures the error and passes it back to the model so that it can try again:" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "HUMAN: \"what is the weather in san francisco?\"\n", + "AI: [\n", + " {\n", + " \"type\": \"text\",\n", + " \"text\": \"Okay, let's check the weather in San Francisco:\"\n", + " },\n", + " {\n", + " \"type\": \"tool_use\",\n", + " \"id\": \"toolu_01TiQFrXDi5x4p7oEEZ99T5p\",\n", + " \"name\": \"get_weather\",\n", + " \"input\": {\n", + " \"location\": \"San Francisco\"\n", + " }\n", + " }\n", + "]\n", + "TOOL: \"Error: Input queries must be all capitals\\n Please fix your mistakes.\"\n", + "AI: [\n", + " {\n", + " \"type\": \"text\",\n", + " \"text\": \"Apologies, let me try that again with the location in all capital letters:\"\n", + " },\n", + " {\n", + " \"type\": \"tool_use\",\n", + " \"id\": \"toolu_01B1LiK5i74E52EwfHL3S3XJ\",\n", + " \"name\": \"get_weather\",\n", + " \"input\": {\n", + " \"location\": \"SAN FRANCISCO\"\n", + " }\n", + " }\n", + "]\n", + "TOOL: \"It's 60 degrees and foggy\"\n", + "AI: \"The weather in San Francisco is 60 degrees and foggy.\"\n" + ] + } + ], + "source": [ + "import { HumanMessage } from \"@langchain/core/messages\";\n", + "\n", + "const response = await app.invoke({\n", + " messages: [\n", + " new HumanMessage(\"what is the weather in san francisco?\"),\n", + " ]\n", + "});\n", + "\n", + "for (const message of response.messages) {\n", + " // Anthropic returns tool calls in content as well as in `AIMessage.tool_calls`\n", + " const content = JSON.stringify(message.content, null, 2);\n", + " console.log(`${message._getType().toUpperCase()}: ${content}`);\n", + "}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Custom strategies\n", + "\n", + "This is a fine default in many cases, but 
a custom fallback strategy can work better.\n", + "\n", + "For example, the tool below requires a list of exactly three elements as input - tricky for a small model! We'll also intentionally avoid pluralizing `topic` to trick the model into thinking it should pass a string:" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "HUMAN: \"Write me an incredible haiku about water.\"\n", + "AI: [\n", + " {\n", + " \"type\": \"text\",\n", + " \"text\": \"Okay, let's generate a haiku about water using the master haiku generator tool:\"\n", + " },\n", + " {\n", + " \"type\": \"tool_use\",\n", + " \"id\": \"toolu_011kEBcJ3LnqwbBgTMgsED9C\",\n", + " \"name\": \"master_haiku_generator\",\n", + " \"input\": {\n", + " \"topic\": [\n", + " \"water\"\n", + " ]\n", + " }\n", + " }\n", + "]\n", + "TOOL: \"Error: Received tool input did not match expected schema\\n Please fix your mistakes.\"\n", + "AI: [\n", + " {\n", + " \"type\": \"text\",\n", + " \"text\": \"Oops, looks like I need to provide 3 topics for the haiku generator. Let me try again with 3 water-related topics:\"\n", + " },\n", + " {\n", + " \"type\": \"tool_use\",\n", + " \"id\": \"toolu_017uPCEEjQzaBAY9iy8mv1cU\",\n", + " \"name\": \"master_haiku_generator\",\n", + " \"input\": {\n", + " \"topic\": [\n", + " \"ocean\",\n", + " \"waves\",\n", + " \"rain\"\n", + " ]\n", + " }\n", + " }\n", + "]\n", + "TOOL: \"Here is a haiku about the ocean, waves, and rain:\\n\\nWaves crash on the shore,\\nRhythmic dance of water's song,\\nRain falls from the sky.\"\n", + "AI: \"The haiku generator has produced a beautiful and evocative haiku about the different aspects of water - the ocean, waves, and rain. 
I hope you enjoy this poetic take on the theme of water!\"\n" + ] + } + ], + "source": [ + "import { StringOutputParser } from \"@langchain/core/output_parsers\";\n", + "import { Annotation } from \"@langchain/langgraph\";\n", + "import { BaseMessage } from \"@langchain/core/messages\";\n", + "\n", + "const CustomStrategyState = Annotation.Root({\n", + " messages: Annotation({\n", + " reducer: messagesStateReducer,\n", + " }),\n", + "});\n", + "\n", + "const haikuRequestSchema = z.object({\n", + " topic: z.array(z.string()).length(3),\n", + "});\n", + "\n", + "const masterHaikuGenerator = tool(async ({ topic }) => {\n", + " const model = new ChatAnthropic({\n", + " model: \"claude-3-haiku-20240307\",\n", + " temperature: 0,\n", + " });\n", + " const chain = model.pipe(new StringOutputParser());\n", + " const topics = topic.join(\", \");\n", + " const haiku = await chain.invoke(`Write a haiku about ${topics}`);\n", + " return haiku;\n", + "}, {\n", + " name: \"master_haiku_generator\",\n", + " description: \"Generates a haiku based on the provided topics.\",\n", + " schema: haikuRequestSchema,\n", + "});\n", + "\n", + "const customStrategyToolNode = new ToolNode([masterHaikuGenerator]);\n", + "\n", + "const customStrategyModel = new ChatAnthropic({\n", + " model: \"claude-3-haiku-20240307\",\n", + " temperature: 0,\n", + "});\n", + "const customStrategyModelWithTools = customStrategyModel.bindTools([masterHaikuGenerator]);\n", + "\n", + "const customStrategyShouldContinue = async (state: typeof CustomStrategyState.State) => {\n", + " const { messages } = state;\n", + " const lastMessage = messages[messages.length - 1];\n", + " if (isAIMessage(lastMessage) && lastMessage.tool_calls?.length) {\n", + " return \"tools\";\n", + " }\n", + " return \"__end__\";\n", + "}\n", + "\n", + "const customStrategyCallModel = async (state: typeof CustomStrategyState.State) => {\n", + " const { messages } = state;\n", + " const response = await 
customStrategyModelWithTools.invoke(messages);\n", + " return { messages: [response] };\n", + "}\n", + "\n", + "const customStrategyApp = new StateGraph(CustomStrategyState)\n", + " .addNode(\"tools\", customStrategyToolNode)\n", + " .addNode(\"agent\", customStrategyCallModel)\n", + " .addEdge(\"__start__\", \"agent\")\n", + " .addEdge(\"tools\", \"agent\")\n", + " .addConditionalEdges(\"agent\", customStrategyShouldContinue, {\n", + " // Explicitly list possible destinations so that\n", + " // we can automatically draw the graph below.\n", + " tools: \"tools\",\n", + " __end__: \"__end__\",\n", + " })\n", + " .compile();\n", + "\n", + "const response2 = await customStrategyApp.invoke(\n", + " {\n", + " messages: [new HumanMessage(\"Write me an incredible haiku about water.\")],\n", + " },\n", + " { recursionLimit: 10 }\n", + ");\n", + "\n", + "for (const message of response2.messages) {\n", + " // Anthropic returns tool calls in content as well as in `AIMessage.tool_calls`\n", + " const content = JSON.stringify(message.content, null, 2);\n", + " console.log(`${message._getType().toUpperCase()}: ${content}`);\n", + "}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can see that the model takes two attempts.\n", + "\n", + "A better strategy might be to trim the failed attempt to reduce distraction, then fall back to a more advanced model. 
Here's an example - note the custom-built tool calling node instead of the prebuilt `ToolNode`:" + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "{\n", - " agent: {\n", - " messages: [\n", - " AIMessage {\n", - " \"id\": \"msg_01GLo6bFLEpcKH1mZHkb2jhf\",\n", - " \"content\": [\n", - " {\n", - " \"type\": \"text\",\n", - " \"text\": \"Okay, let's generate a haiku about water using the master haiku generator tool:\"\n", - " },\n", - " {\n", - " \"type\": \"tool_use\",\n", - " \"id\": \"toolu_01Lyy87dsq6ja6NBZp2P2bLj\",\n", - " \"name\": \"master_haiku_generator\",\n", - " \"input\": {\n", - " \"topic\": \"[Array]\"\n", - " }\n", - " }\n", - " ],\n", - " \"additional_kwargs\": {\n", - " \"id\": \"msg_01GLo6bFLEpcKH1mZHkb2jhf\",\n", - " \"type\": \"message\",\n", - " \"role\": \"assistant\",\n", - " \"model\": \"claude-3-haiku-20240307\",\n", - " \"stop_reason\": \"tool_use\",\n", - " \"stop_sequence\": null,\n", - " \"usage\": {\n", - " \"input_tokens\": 392,\n", - " \"output_tokens\": 77\n", - " }\n", - " },\n", - " \"response_metadata\": {\n", - " \"id\": \"msg_01GLo6bFLEpcKH1mZHkb2jhf\",\n", - " \"model\": \"claude-3-haiku-20240307\",\n", - " \"stop_reason\": \"tool_use\",\n", - " \"stop_sequence\": null,\n", - " \"usage\": {\n", - " \"input_tokens\": 392,\n", - " \"output_tokens\": 77\n", - " },\n", - " \"type\": \"message\",\n", - " \"role\": \"assistant\"\n", - " },\n", - " \"tool_calls\": [\n", - " {\n", - " \"name\": \"master_haiku_generator\",\n", - " \"args\": {\n", - " \"topic\": \"[Array]\"\n", - " },\n", - " \"id\": \"toolu_01Lyy87dsq6ja6NBZp2P2bLj\",\n", - " \"type\": \"tool_call\"\n", - " }\n", - " ],\n", - " \"invalid_tool_calls\": [],\n", - " \"usage_metadata\": {\n", - " \"input_tokens\": 392,\n", - " \"output_tokens\": 77,\n", - " \"total_tokens\": 469\n", - " }\n", - " }\n", - " ]\n", - " }\n", - "}\n", - "{\n", - " tools: {\n", - " messages: [\n", - " ToolMessage {\n", - " \"id\": 
\"7c551efd-03a1-4a3d-bc13-29f253ed64da\",\n", - " \"content\": \"Received tool input did not match expected schema\",\n", - " \"name\": \"master_haiku_generator\",\n", - " \"additional_kwargs\": {\n", - " \"error\": {\n", - " \"output\": \"{\\\"topic\\\":[\\\"water\\\"]}\"\n", - " }\n", - " },\n", - " \"response_metadata\": {},\n", - " \"tool_call_id\": \"toolu_01Lyy87dsq6ja6NBZp2P2bLj\"\n", - " }\n", - " ]\n", - " }\n", - "}\n", - "{\n", - " remove_failed_tool_call_attempt: {\n", - " messages: [\n", - " BaseMessage {\n", - " \"id\": \"msg_01GLo6bFLEpcKH1mZHkb2jhf\",\n", - " \"content\": \"\",\n", - " \"additional_kwargs\": {},\n", - " \"response_metadata\": {}\n", - " },\n", - " BaseMessage {\n", - " \"id\": \"7c551efd-03a1-4a3d-bc13-29f253ed64da\",\n", - " \"content\": \"\",\n", - " \"additional_kwargs\": {},\n", - " \"response_metadata\": {}\n", - " }\n", - " ]\n", - " }\n", - "}\n", - "{\n", - " fallback_agent: {\n", - " messages: [\n", - " AIMessage {\n", - " \"id\": \"msg_01EAWJF5xxc2wpr2gtqzT4AK\",\n", - " \"content\": [\n", - " {\n", - " \"type\": \"text\",\n", - " \"text\": \"Certainly! I'd be happy to help you create an incredible haiku about water. To do this, we'll use the master_haiku_generator function, which requires three topics as input. Since you've specified water as the main theme, I'll add two related concepts to create a more vivid and interesting haiku. 
Let's use \\\"water,\\\" \\\"flow,\\\" and \\\"reflection\\\" as our three topics.\\n\\nHere's the function call to generate your haiku:\"\n", - " },\n", - " {\n", - " \"type\": \"tool_use\",\n", - " \"id\": \"toolu_015ktvv3AAxZiUtXoMhrfyw7\",\n", - " \"name\": \"master_haiku_generator\",\n", - " \"input\": {\n", - " \"topic\": \"[Array]\"\n", - " }\n", - " }\n", - " ],\n", - " \"additional_kwargs\": {\n", - " \"id\": \"msg_01EAWJF5xxc2wpr2gtqzT4AK\",\n", - " \"type\": \"message\",\n", - " \"role\": \"assistant\",\n", - " \"model\": \"claude-3-5-sonnet-20240620\",\n", - " \"stop_reason\": \"tool_use\",\n", - " \"stop_sequence\": null,\n", - " \"usage\": {\n", - " \"input_tokens\": 422,\n", - " \"output_tokens\": 162\n", - " }\n", - " },\n", - " \"response_metadata\": {\n", - " \"id\": \"msg_01EAWJF5xxc2wpr2gtqzT4AK\",\n", - " \"model\": \"claude-3-5-sonnet-20240620\",\n", - " \"stop_reason\": \"tool_use\",\n", - " \"stop_sequence\": null,\n", - " \"usage\": {\n", - " \"input_tokens\": 422,\n", - " \"output_tokens\": 162\n", - " },\n", - " \"type\": \"message\",\n", - " \"role\": \"assistant\"\n", - " },\n", - " \"tool_calls\": [\n", - " {\n", - " \"name\": \"master_haiku_generator\",\n", - " \"args\": {\n", - " \"topic\": \"[Array]\"\n", - " },\n", - " \"id\": \"toolu_015ktvv3AAxZiUtXoMhrfyw7\",\n", - " \"type\": \"tool_call\"\n", - " }\n", - " ],\n", - " \"invalid_tool_calls\": [],\n", - " \"usage_metadata\": {\n", - " \"input_tokens\": 422,\n", - " \"output_tokens\": 162,\n", - " \"total_tokens\": 584\n", - " }\n", - " }\n", - " ]\n", - " }\n", - "}\n", - "{\n", - " tools: {\n", - " messages: [\n", - " ToolMessage {\n", - " \"id\": \"5fa074e5-2af1-4da5-8260-c94ee8127fa6\",\n", - " \"content\": \"Here is a haiku about water, flow, and reflection:\\n\\nRippling waters flow,\\nMirroring the sky above,\\nTranquil reflection.\",\n", - " \"name\": \"master_haiku_generator\",\n", - " \"additional_kwargs\": {},\n", - " \"response_metadata\": {},\n", - " \"tool_call_id\": 
\"toolu_015ktvv3AAxZiUtXoMhrfyw7\"\n", - " }\n", - " ]\n", - " }\n", - "}\n", - "{\n", - " agent: {\n", - " messages: [\n", - " AIMessage {\n", - " \"id\": \"msg_014EjX9Y5WTW7vP1RyWWURCW\",\n", - " \"content\": \"I hope you enjoy this haiku about the beauty and serenity of water. Please let me know if you would like me to generate another one.\",\n", - " \"additional_kwargs\": {\n", - " \"id\": \"msg_014EjX9Y5WTW7vP1RyWWURCW\",\n", - " \"type\": \"message\",\n", - " \"role\": \"assistant\",\n", - " \"model\": \"claude-3-haiku-20240307\",\n", - " \"stop_reason\": \"end_turn\",\n", - " \"stop_sequence\": null,\n", - " \"usage\": {\n", - " \"input_tokens\": 601,\n", - " \"output_tokens\": 35\n", - " }\n", - " },\n", - " \"response_metadata\": {\n", - " \"id\": \"msg_014EjX9Y5WTW7vP1RyWWURCW\",\n", - " \"model\": \"claude-3-haiku-20240307\",\n", - " \"stop_reason\": \"end_turn\",\n", - " \"stop_sequence\": null,\n", - " \"usage\": {\n", - " \"input_tokens\": 601,\n", - " \"output_tokens\": 35\n", - " },\n", - " \"type\": \"message\",\n", - " \"role\": \"assistant\"\n", - " },\n", - " \"tool_calls\": [],\n", - " \"invalid_tool_calls\": [],\n", - " \"usage_metadata\": {\n", - " \"input_tokens\": 601,\n", - " \"output_tokens\": 35,\n", - " \"total_tokens\": 636\n", - " }\n", - " }\n", - " ]\n", - " }\n", - "}\n" - ] + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [], + "source": [ + "import { AIMessage, ToolMessage, RemoveMessage, BaseMessage } from \"@langchain/core/messages\";\n", + "import { Annotation } from \"@langchain/langgraph\";\n", + "\n", + "const CustomStrategyState2 = Annotation.Root({\n", + " messages: Annotation({\n", + " reducer: messagesStateReducer,\n", + " }),\n", + "});\n", + "\n", + "const haikuRequestSchema2 = z.object({\n", + " topic: z.array(z.string()).length(3),\n", + "});\n", + "\n", + "const masterHaikuGenerator2 = tool(async ({ topic }) => {\n", + " const model = new ChatAnthropic({\n", + " model: 
\"claude-3-haiku-20240307\",\n", + " temperature: 0,\n", + " });\n", + " const chain = model.pipe(new StringOutputParser());\n", + " const topics = topic.join(\", \");\n", + " const haiku = await chain.invoke(`Write a haiku about ${topics}`);\n", + " return haiku;\n", + "}, {\n", + " name: \"master_haiku_generator\",\n", + " description: \"Generates a haiku based on the provided topics.\",\n", + " schema: haikuRequestSchema2,\n", + "});\n", + "\n", + "const callTool2 = async (state: typeof CustomStrategyState2.State) => {\n", + " const { messages } = state;\n", + " const toolsByName = { master_haiku_generator: masterHaikuGenerator };\n", + " const lastMessage = messages[messages.length - 1] as AIMessage;\n", + " const outputMessages: ToolMessage[] = [];\n", + " for (const toolCall of lastMessage.tool_calls) {\n", + " try {\n", + " const toolResult = await toolsByName[toolCall.name].invoke(toolCall);\n", + " outputMessages.push(toolResult);\n", + " } catch (error: any) {\n", + " // Return the error if the tool call fails\n", + " outputMessages.push(\n", + " new ToolMessage({\n", + " content: error.message,\n", + " name: toolCall.name,\n", + " tool_call_id: toolCall.id!,\n", + " additional_kwargs: { error }\n", + " })\n", + " );\n", + " }\n", + " }\n", + " return { messages: outputMessages };\n", + "};\n", + "\n", + "const model = new ChatAnthropic({\n", + " model: \"claude-3-haiku-20240307\",\n", + " temperature: 0,\n", + "});\n", + "const modelWithTools2 = model.bindTools([masterHaikuGenerator2]);\n", + "\n", + "const betterModel = new ChatAnthropic({\n", + " model: \"claude-3-5-sonnet-20240620\",\n", + " temperature: 0,\n", + "});\n", + "const betterModelWithTools = betterModel.bindTools([masterHaikuGenerator2]);\n", + "\n", + "const shouldContinue2 = async (state: typeof CustomStrategyState2.State) => {\n", + " const { messages } = state;\n", + " const lastMessage = messages[messages.length - 1];\n", + " if (isAIMessage(lastMessage) && 
lastMessage.tool_calls?.length) {\n", + "    return \"tools\";\n", + "  }\n", + "  return \"__end__\";\n", + "}\n", + "\n", + "const shouldFallback = async (state: typeof CustomStrategyState2.State) => {\n", + "  const { messages } = state;\n", + "  const failedToolMessage = messages.find((message) => {\n", + "    return message._getType() === \"tool\" && message.additional_kwargs.error !== undefined;\n", + "  });\n", + "  if (failedToolMessage) {\n", + "    return \"remove_failed_tool_call_attempt\";\n", + "  }\n", + "  return \"agent\";\n", + "}\n", + "\n", + "const callModel2 = async (state: typeof CustomStrategyState2.State) => {\n", + "  const { messages } = state;\n", + "  const response = await modelWithTools2.invoke(messages);\n", + "  return { messages: [response] };\n", + "}\n", + "\n", + "const removeFailedToolCallAttempt = async (state: typeof CustomStrategyState2.State) => {\n", + "  const { messages } = state;\n", + "  // Remove all messages from the most recent\n", + "  // instance of AIMessage onwards. Searching the reversed\n", + "  // (message, index) pairs preserves each message's original index.\n", + "  const lastAIMessage = messages\n", + "    .map((msg, index) => ({ msg, index }))\n", + "    .reverse()\n", + "    .find(({ msg }) => isAIMessage(msg));\n", + "  const messagesToRemove = messages.slice(lastAIMessage?.index);\n", + "  return { messages: messagesToRemove.map(m => new RemoveMessage({ id: m.id })) };\n", + "}\n", + "\n", + "const callFallbackModel = async (state: typeof CustomStrategyState2.State) => {\n", + "  const { messages } = state;\n", + "  const response = await betterModelWithTools.invoke(messages);\n", + "  return { messages: [response] };\n", + "}\n", + "\n", + "const app2 = new StateGraph(CustomStrategyState2)\n", + "  .addNode(\"tools\", callTool2)\n", + "  .addNode(\"agent\", callModel2)\n", + "  .addNode(\"remove_failed_tool_call_attempt\", removeFailedToolCallAttempt)\n", + "  .addNode(\"fallback_agent\", callFallbackModel)\n", + "  .addEdge(\"__start__\", \"agent\")\n", + "  .addConditionalEdges(\"agent\", shouldContinue2, {\n", + "    // Explicitly 
list possible destinations so that\n", + "    // we can automatically draw the graph below.\n", + "    tools: \"tools\",\n", + "    __end__: \"__end__\",\n", + "  })\n", + "  .addConditionalEdges(\"tools\", shouldFallback, {\n", + "    remove_failed_tool_call_attempt: \"remove_failed_tool_call_attempt\",\n", + "    agent: \"agent\",\n", + "  })\n", + "  .addEdge(\"remove_failed_tool_call_attempt\", \"fallback_agent\")\n", + "  .addEdge(\"fallback_agent\", \"tools\")\n", + "  .compile();" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "If a tool call fails, the `tools` node will now return a `ToolMessage` with an `error` field in `additional_kwargs`. When that happens, the graph routes to a node that removes the failed tool call attempt and has a more capable model retry the tool call generation. The trimming step works by returning the special message modifier `RemoveMessage`, which deletes the failed messages from the state.\n", + "\n", + "The diagram below shows this visually:" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": 
"/9j/4AAQSkZJRgABAQAAAQABAAD/4gHYSUNDX1BST0ZJTEUAAQEAAAHIAAAAAAQwAABtbnRyUkdCIFhZWiAH4AABAAEAAAAAAABhY3NwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAQAA9tYAAQAAAADTLQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAlkZXNjAAAA8AAAACRyWFlaAAABFAAAABRnWFlaAAABKAAAABRiWFlaAAABPAAAABR3dHB0AAABUAAAABRyVFJDAAABZAAAAChnVFJDAAABZAAAAChiVFJDAAABZAAAAChjcHJ0AAABjAAAADxtbHVjAAAAAAAAAAEAAAAMZW5VUwAAAAgAAAAcAHMAUgBHAEJYWVogAAAAAAAAb6IAADj1AAADkFhZWiAAAAAAAABimQAAt4UAABjaWFlaIAAAAAAAACSgAAAPhAAAts9YWVogAAAAAAAA9tYAAQAAAADTLXBhcmEAAAAAAAQAAAACZmYAAPKnAAANWQAAE9AAAApbAAAAAAAAAABtbHVjAAAAAAAAAAEAAAAMZW5VUwAAACAAAAAcAEcAbwBvAGcAbABlACAASQBuAGMALgAgADIAMAAxADb/2wBDAAMCAgMCAgMDAwMEAwMEBQgFBQQEBQoHBwYIDAoMDAsKCwsNDhIQDQ4RDgsLEBYQERMUFRUVDA8XGBYUGBIUFRT/2wBDAQMEBAUEBQkFBQkUDQsNFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBT/wAARCAGCATsDASIAAhEBAxEB/8QAHQABAAICAwEBAAAAAAAAAAAAAAYHBQgCAwQBCf/EAFwQAAEDAwICBAgICAgKBwkAAAEAAgMEBQYREgchExYxQQgUFSJRVZTRFzJUVmGRk9IjQlJxcnWBtAklMzc4obPhJDQ1NkRidJKxsjlDZHeCorVGU2Nzg4TB1PH/xAAbAQEAAgMBAQAAAAAAAAAAAAAAAQMCBAUGB//EADcRAQABAgELAQUHBQEBAAAAAAABAgMRBBITFCExQVFSkaFhFWJxsfAFIjIzgdHhQlNjwcLxcv/aAAwDAQACEQMRAD8A/VNERAREQEREBERAREQEREBERAREQdFXW09BCZqqeKmiB0Mkzw1uv5yvD1qsnrig9qZ71HOLkMdRYrZHKxskbrpTBzHjUEbj2hR/q/a/VtH9g33LWynKrWSxTnxMzOO7B0MnyXT052OCw+tVk9cUHtTPenWqyeuKD2pnvVedX7X6to/sG+5Or9r9W0f2DfctL2rk/RV3hs+zve8LD61WT1xQe1M96darJ64oPame9V51ftfq2j+wb7k6v2v1bR/YN9ye1cn6Ku8Hs73vCw+tVk9cUHtTPenWqyeuKD2pnvVedX7X6to/sG+5Or9r9W0f2DfcntXJ+irvB7O97wsPrVZPXFB7Uz3p1qsnrig9qZ71XnV+1+raP7BvuTq/a/VtH9g33J7Vyfoq7wezve8LD61WT1xQe1M96yUUrJ4mSRvbJG8BzXtOocD2EHvCqiXH7X0b/wCLaTsP/UN9ymnDH+bbE/1TSf2LFv5PlFvKqKq7cTGbMRt9cf2aeUZNq8RtxxSZERXtIREQEREBERAREQEREBERAREQEREBERAREQEREBERAREQQrit/ka1frWm/wCYrGLJ8Vv8jWr9a03/ADFRy+5Da8XtslxvNyo7Rb4yA+rrp2QxNJOgBe4gDUkAc1wvteMdFEcp+bv5BstT8WQWNyXIaHEcdud8ucpht1tppKuokDS4tjY0ucQBzJ0B5BRocc+G5104g4ty7f46pvvrz3DizhOSW2ttdnyLGMrulVTyxU9jZeKZxr3lh0h03O5O7DyPInkVwYt1Y7Yl0Jrp4SiGe+EDcbbwbvuXWXEL5Q1dK2A07L1SRMY9kpG
k3mzaOaAewO3Aubq3tU2ufE2qteN2+6PwfKp6qslfH5IpqWCWqh2k+dJtmMbWkDUHpOeoHbyVNU3B/NLvwz4k49TWibF7NcaamGP45c7qyt8VnjJfKGSNc4RxPLYw1m4gHU6NHJSXiHZ8w4idULjdMCq6qzUktULrh7rtTAzvcxgp5nuEgiljYRJ5hdr5wO06aLami3jERhvnj6Rhx+ubXiqvfOO7l6/BJqnwisbhx/GLrBQXmuGQV0tspqOnox4zFVRiTfDLG5wLXB0bmd4B0JIbq4Y1vG2/P4y2vF+pN6ittXZRXSMcym8YgkdUMj6SQio29EwEhwbudqeQcFDeHfCDK7BLg8NVjsFrprPmF0ucsVLVxSQQUk9POYjHzDi1rpmx6bQ7VpO0N0KsDMbLk1l402jMbLYDklvlsktlqoIayKnlpnGoZK2X8K4BzdA4ENOv0KJptU1Zsbdk8eyYquTGM7N3BayKEP448OI3ua7iBizXNOhab1TAg/76O45cN2uLXcQMWBB0IN6puX/nWro6+UtjPp5ppL/JP/MVJeGP822J/qmk/sWKLsniqqMTwSMmhkj3skjcHNc0jUEEdoI71KOGP822J/qmk/sWL0n2R+Td+NPyqcr7R3U/qkyIi7DiCIiAiIgIiICIiAiIgIiICIiAiIgIiICIiAiIgIiICIiCFcVv8jWr9a03/MVipI2StLXta9p7nDUKaZNjNJldubR1b54mMlZMx9PJse17TqCCo/8ABTQ+uL37b/ctPK8kjK4owrwwx4S6eS5VRZommqGF8Qpfk0P+4FyZR08bg5sEbXDsIYAQsx8FND64vftv9yfBTQ+uL37b/cub7In+7HaW5r9rlLGosl8FND64vftv9yq3woLXU8KeA2W5XYL3dY7vbYIpKd09T0jAXTRsOrSOfJxT2P8A5Y7Sn2ha5SsBF32jhlSVtpoqiS8XoySwMkdpWaDUtBPcvX8FND64vftv9yex/wDLHaT2ha5Sw3iFMf8AR4v9wL55PpR/o0P2YWa+Cmh9cXv23+5PgpofXF79t/uT2RP92O0o1+1ylipABC4AaANPJSXhj/Ntif6ppP7Fix54UUBBBvF70P8A23+5Sqz2qCxWihttKHClo4GU8QcdSGMaGt1PfyAXUyTJoyS3XRnY50x4x/do5XlFN+IzeD2IiLac4REQEREBERAREQEREBERAREQEREBERAREQEREBERAREQEREBERAVD+HT/RQ4hf7LB+8xK+FQ/h0/0UOIX+ywfvMSC58d/wA37Z/ssX/IFkVjsd/zftn+yxf8gWRQEREBERAREQEREBERAREQEREBERAREQEREBERAREQEREBERAREQEREBERAREQFQ/h0/0UOIX+ywfvMSvhateG7xdwW4eDtxExulzTHqnImsjpjaIbrA+rErKmPfH0Ifv3N2u1bpqNDr2INlMd/wA37Z/ssX/IFkVCOHHE/Dc0t9BQY9ltjvtdFRtfJTWy5Q1EjGsDGvJaxxIDS9gJ7i9uvaFN0BERAREQEREBERAREQEREBERAREQEREBERAREQEREBERAREQEREBERAREQEXhvN7osfoXVlfOIIGkNGjS9z3Hsa1rQXOce5rQSfQoXU8RL1WPJtlihp6f8WW6VWyR3/042u0H53A+kei2m3VVGO6PXYtotV3PwxisJfk/wDwk/Av4O+LkeZ22n2WTK908uwebFXN06YHly3gtk1J5udJ6F+ixzLLdTpT2XT88ygHG/Bavj/gc+J5PTW5lC+aOpjqaJ721FPIw8nxue1zQSC5p1afNe786y0UdUd1+qXuSkv4LTgnLZcevfE64RPimuzXWu2A8g6ma9rppPpDpGNaO8dE70rfZU9iFRfMGxa049aKGy09rtdLHSU0ZdMSGMaGjU95OmpPeSSsuMzy0dtNZXfRumH9aaKOqO6NUvcllIoHRcSaqkeG3uzmnh5A1dvlNSxv0uZta8D6QHek6DXSb0t
VDW00VRTzR1FPKwSRyxODmPaRqHAjkQR3rCq3VTtnd3UV267c4VRg7URFWrEREBERAREQEREBERAREQEREBERAREQEREBERAREQEREBERAXxzg1pJIAHMk9y+qOcSKiWk4eZRPCS2aO11T2OHaCInEH9ist0Z9cUc5wTEYzghLLk/K63y5OSYX6igiLtWxwa+a8D8t40cT2gEN7jr61wp4mQQRxRANjY0NaB2AAclUnHGqnu98xvFbSb9Lfa2OorGU9nvRtMQhj6Nr5J52tc/QOewNa0HUuOo0VVyvPqx4cPg9PERaowjgt5FqdQcQL9ceGvDM5ZkF1ttimu1ztl9vNsne2pc6CSaOkY+eJoeA4x6Oe0DcWjXTcV5KW9Z1FZcLxWmrLvLHk99vM0FddbvNb6+roYNHUrHVBjkfCXsO7RrGlwj5bdxJwwV6eOX1s/dt4vJd7xQ4/a6q5XOrhoLfSxmWepqHhkcbB2lzjyAWt17tnEXFLBZ7Xer/V22luWY22kopKO9Prq2Glla5s8L6l8MZeN3nN3NcRr2naCsRxYo6qkwXj5iUt5vFxtNmt1BcaE11wlmniM0chkjMrnb3x7ogdriRzI7FGCZu4RubZtcHNBB1BGoK5YzdDi9/gpAdtpucpZ0evm09SdSHNHcJDqHAct+06aueTg8Ox+nxiwU1DS1dfWwgbxNcq6WsmO7n/KSuc4j0DXQdy55Y90NmMrP5WKop5Y+X47ZmFun06gLZyfbcijhVs7/ALJvURctzFS5URFW8yIiICIiAiIgIiICIiAiIgIiICIiAiIgIiICIiAiIgIiICIiAuito4rhRz0s7d8E8bopG+lrhoR9RXeimJwnGBUVnZPQwutdaSa+3kU8pcecgHJkv5ntAd+0jtBWHzThnjnEGWglvlvdUz0JeaaeGplp5Yw8APaHxOa4tdoNWk6HQajkpNbcoxnjRdcigxusqYrxjFWbbUXJ9BI2Hphrvh1eGiZrXAhzQQWnQgjUE9NTR5HanllVj8ta0dlTa5o5GO/O17mPB+gBwHpPfbVb0szXRht4bu2P/ru2cpt3Kc25vVBn3AOjdjNttWF2KzwQUlwlr/FK2419EyN0jC1/QS0z90WpPNoBZ2+aCdV6cL4Exy8Ppsdz58eRtdcn3CkhbV1Motg0aGRwVMj+n83RxD9wPnuHIKzjX3AEjq5ej+alH3liMsz2nwXHqy+3+13W12ijaHVFXPS6MjBcGgnQ+lwH7Vjq93kuxs444w8lBwgxK2WigtlPanCkobky7wB9VM9/jbPiyue55c8/pEg6cwsjPw/x+rrchqp7ZHPNkFNHSXQSuc9lTDG17WMLCdoAbI8cgNdeeugXthu1ZUQxyxY7enxyNDmuFKNCCNQfjLsFfcSdBjd6J9HiwH/Fyavd5M9JZ5w8GF4NZuH1m8lWKnmpaHeZBHNVTVBB2hvJ0r3OA0aAADoNOQWWp6B2RZHb7bGC6Gnljr6xwPJjGO3RtP0ve0cu8Mf6ND57obtQWS43e4UvVuzW6mkrKuurh4xLHDGwvkcyCEuLiADy119AdyBj3DjwsuBFRbo6a08QbdA+V++aa8F9JLLKQAXyOmawa8gO5oADW6NAAypp0M51U7eHH9fri08oymimjMtr/RY6xZHacooBXWa6Ud3oidoqaCoZPGToDpuaSNdCD+0LIqlxRERAREQEREBERAREQEREBERAREQEREBERAREQEREBERAReS7XSnsdqrbjWOeyko4X1EzmRukcGMaXOIa0FzjoDyAJPcFVlLlmUcccSw7JeHl1dh9lqLj09wF/tDjVVVFG86CJhdoGy7R53bskBDmkaEJlk3EKltlJkdPZI4snyiy0Xjj8coqqMVb9wcY2kE+bv2nTUfmB1Gscp8FufEyTh/luUy3nEbzZmmrqMZtt1DqR1S4AATlg0lDRvAAI5SEHUagzGyYFjmN328Xq12ShoLveJBLcK6CBrZqlwAA3u7T2a6
dmpJ7SSc8g4xxsiaWsa1gJLtGjTmTqT+0klckRAUD463PyNwnyGs6k/CN0UUZ6r9D03j+srBt2dHJrprv+I74n7RPFEuLFty+78PbzR4FdaSyZdLGwW+vr2h0MLukaXFwMcgOrA4fEPMj86CS25/SW+ld0Hiu6Jp6DTTo+Q83TQdnZ2dy9C6aJszKOBtS9slQI2iV7exztOZHId67kHCaJlRE+KVjZI3tLXMeNQ4HtBHeF+KvhWeDxXcG+PVZi1nt1TV267zNqLDDTxulknjldoyFgGpc9r9YwObjoD+MF+1qjmTcOsYzO84/dr5Y6K6XKwVRrLXVVMQc+llLS3c0/tDtDqNzGO03MYQGvf8AB9cGGcH+E9c+72O52DNbncZKe7MuhAEpgdIIBTgHaYgx7juGpc4yHUtDNNp1EOJHCvHuK1vttJkFPPKLbXRXGjnpamSnmgnjPJzXsII1Bc0/Q46aHQjHRZtkWPZLmMuZW202LA7ZDHVW/IxcRpJGW6SMnY4Asc1wJ1Hm6OaBuOpQWAi6aKtp7jSQVdJPFVUs7BJFPC8PZIwjUOa4ciCOYIXcgIiICIiAiIgIiICIiAiIgIiICIiAiIgIiICIiAoDxIzu/WyglpMAs9uzLKIayCmq7dLdGU4t8crXOE8/a4N0bqG6bjrqNdFPlVPDiqwmbjhxchsNHWwZfE+1dY6ickwzE0zjS9EC4gaR6h2gbz9KCQ0HCu3UfFKvz99wu1ReauhZb20k1e99FSxDaXCKHkAXOY1xJ15jUAEnWaAAAADQDuX1EBERAREQFWvhH2rEb1wSymizu51lmxKSCPyhX29hfPCwSsILQI5D8YNB8w8ifziyl1VNNDW00tPURMnp5mGOSKVocx7SNC0g8iCOWhQeHGK+3XTHLXWWisjuFqnpYpKSrieHNmiLQWPBHIgjQrJqvLlbMjxTO+sLMjo6ThnQ2R8dXj/k7V9PJFq5ssDo+emzUFuh5MaA0k6tkeBZ7YeJ2JW7JcauEdzs1ezfDUR6jsOha4Hm1wIILTzBBBQSBERAXkutpor7bam3XKjgr6CpjMU9LUxiSKVhGha5p1BB9BXrRBXrMOyXFMlw6hw2ex2rh1b6WSjuFikpHCZoDdYpIJGntBAaWu0GjnuJcSNM3w64k4/xWx3y7jNa6vtnjEtL0roXxHpI3lrxteAe0dui9+ZXS42TEb3cLRb3Xa7UtFNPR29nbUzNYTHH2j4zgB+1Y3hbBLFw+sU1Vj1Hitwq6WOsrLPQxCOOlqJQHys0AHnBzjqe869qCVIiICIiAiIgIiICIiAiIgIiICIiAiIgIvPW3CltsQlq6mGljLtofNIGAn0anv5FeLrVZPXFB7Uz3rOKKqtsQnBlUWK61WT1xQe1M96darJ64oPame9To6+mTCVU+F5xazLghwgky/C7XbbrU0VbC2vZdIpJI4qV4c0yBscjDqJDCO3QBziR3jRah/hRuK0teOhxTDJKipe1rhDb6vpJT2AcqkknuHav0tyWoxTL8eudju1wt1XbLjTSUlTA6qZo+N7S1w7eXI9q/ObwV/BMkxrwpLlLljohjOHVBqqG4VBDILnNu/wZ0bjycAPwjtpO1zA13amjr6ZMJfpRh1TeKzEbHUZFT09JkEtDA+409Jr0MVSY2mVrNSTtD9wGpJ0A5lZhYrrVZPXFB7Uz3p1qsnrig9qZ700dfTJhLKosV1qsnrig9qZ7061WT1xQe1M96aOvpkwllUXCKVk8TJI3tkjeA5r2nUOB7CD3hc1WgREQFBr5hmQw5NiVRid+o8dxy3zTeVbELcx8VdFIN2rHAgxvDxqCOXnuJ102unKx2R+Vur108g+J+XPFZfEPKO/xbxjYei6XZ53R79u7bz0105oNX+Nv8IThWAYtb6zFpYchvkl7NvrLNVNkpqmlp4ZP8JkfG5oLSWgMj3lupk3aO6J7FtJZLzR5FZqC626dtTb66njqqednZJG9ocxw+gggr8d
uNngreEAMsvGR5Xi9zyivrp3T1F1tW2tbMT+MGRauY0AaBpY0NAAAAAW8f8G9xRqMw4K1OJXQSx3rDqkUb452kSCmk3Ph3AgaaESsA7hGEG2aLGy5JaIHlkt0oo3Dta+oYD/xXDrVZPXFB7Uz3qzR18pThKLcZqG4X/Gaaw2XNYMHvdxrYBTVji3ppmxvbJJFCC4Euc1pB0B5E6jQqfKl8lyXh5mXHrGbRcop6q/4vb5b/QXVtQ1tuh6V3i7o3neA6XTRwaWnQcwRz1tmmv8Aa6yQR09ypJ5CdA2Odrif2ApNuuN8SYS96IirQIiICIiAiIgIiICIiAiIgIiICIiCAcWaeKqkxGKaJk0Tru7VkjQ5p/wKq7isN1ftfq2j+wb7lneKP+M4h+t3fuVUvEtHL666ZtxTOH3f+qnkPtmqqL9OE/0/7lj+r9r9W0f2DfcnV+1+raP7BvuWQWPv9+oMWslfeLrUso7bQwvqKiok7GMaNSeXM8h2DmVytLc6p7uFFdczhEydX7X6to/sG+5Or9r9W0f2DfcoJa+O9rq69tHccfyLHJ56Sesom3iibD49HEzfIIiHu0eG+dsfsdp3cjpww/j9ZcwuWOUzLNfbVT5HTuntFfcqRkUFZtj6VzG6Pc4ODNSNzQHBpLS4aE5517nPddmXo27U+6v2v1bR/YN9ydX7X6to/sG+5UhnfhGOrRZo8Qob02hnyegtL8j8Qjdbqlhq2xVEcb3EuII3t3hgGoO12uiv9Jru0xtqnuxri7biJqnex/V+1+raP7BvuXGXH7X0b/4tpOw/9Q33LJLhL/JP/MVjF251T3VZ9XNJeGP82uJ/qik/sWKTKM8Mf5tcT/VFJ/YsUmXqco/Or+M/N9OneIiKhAiIgxOSZFBjdAJ5I31E8jxFBTRab5nnuGvYAASSeQAJVb3C3zZK50mQVBuIedfEQS2jjH5Ij7H/AKUm49umgOg99/qzds8uBdo6O0xR0cTefmySNbLI79rXQj/wn0rxX24yWiyV9dDSTV8tNA+ZlLT7eklLWkhjdzmjU6aDUgfSFdVXNnCmjZOycfjt2fo7eS2KaaIuVRtlxZjtqiaGstlGxo7m07AP+C5eQLZ6upPsG+5VLw98IQ1/Bq15jllluFtqKllPHEyCBjxc6iY6MZRxsle52rtAA/ae88gSs5F4QOOQ2fIKy8Ud2xyssYhdV2q50oFWRMdsBjbG57ZOkd5rdrj5w0Oio0lzqnu3YromMU+8gWz1dSfYN9y658Zs9TGWTWmhlYe1r6ZhH1EKrMq46eNYBn4oKC74ll1lx+ou1PR3uljbNsEb+jnYA6Rj2h7dCNToeTgNVmK3jVTWCaw2aSz3vJL/AFdoiutRDZaRkphgOjDK8F7ORfqA1mp5Hkpi5cjdVPcz6Fh2w12KPbJZ55X0jfj2qeQvheP/AIZdqYnejaQ30t7xZNkvNNf7ZDXUpd0UgOrJBtfG4HRzHDucDqCPoUAXpwWrNuzOutzdBDcaXx5rBrylicyOR37WyQj/AMP0q6mqb0TFX4o248+ePzaGV2Kc3SUwsVERUuMIiICIiAiIgIiICIiAiIgIiIIJxR/xnEP1u79yqlHcjyuyYdQsrb9ebfZKOSQQsqLjVMp43PIJDQ55AJ0a46dugPoUi4o/4ziH63d+5VSx8sMc7dsjGyNB10cNQud9o/it/wDz/wBVPH/bOGsU49P+5Qscc+G5B04g4sdOZ/jqm5f+dRriXdcR488PshwbHc3x+svN1o3NpoqW5Qzu3sIeCWMcXFurRu0B0GqtTyfS/JofswuUdJBC7dHDGx3pawArlxMROMONTXTRMVUxOMev8KGwjhuDFXOk4KWPCbvFaqiKO60c1I90lS+Mx7Yej85rHNc/znlpA0BB1JHstfDTIocb4B0c1vLJsYZE28NE8etLpbZIHcw7z/wjg3zN3br2c1eaLKbkyznKKpnd8+Uxxn1au2/h/wARbZgOH8O34eypo8b
yC3z9YYLlTthqaOCsbL0ghc4SB+z4zSO0EgkkBXW/jjw4je5ruIGLNc06FpvVMCD/AL6m68/k+lP+jQ/ZhJrzt8IrvRd/HHb1378UQdxy4btcWu4gYsCDoQb1Tcv/ADqXU9bT3K3RVdJPFVUlREJYZ4Xh7JGOGrXNcORBBBBHbqvvk+lH+jQ/Zhdr2hsLmtAADdAB3clhs4KZzf6YSXhj/Nrif6opP7FikyjPDH+bXE/1RSf2LFJl63KPzq/jPzfTp3iIioQIi+OcGNLnEBoGpJ7kFWXCndQ53kkTwR42+CvYSORaYWw8j+eA/WPSuVTD4zTSxa7ekYW6+jUaLNXeK38QLfT33F7jQXaponywRz01S2SGUBwEsDnsJAILR+i5o15aqP0NzgrzLGwmOogOyemlG2WF35L29oP9RHMag6q27E1xFyN2ERPphs8u/kl2K7cU8Ya7WXh5njOE2IY/PjEdPecBuVFXUZfcYTT3oQmRj2xuaS6LWN+oMjR5xGvYSuWV8Ks14nXm/ZnU2WHHbtTstbLNY6utjlM/idU6qd074i5jd5eWN0LtO06LZFFq4r9DThhi1/v3D3MeLV0zG83SxsxB1Vh9XjNuoaqtiqJJZZ3b3TSOhLmtYC1jQNSebiQOQXiz/Es0ybEca8R4f1FvzWjtbYKS/wBFfoIJbVUh20tkId+FgIa15aN4IcQWg81saiYpm1E47d7poWTx0VO2qkbLUtjaJXsGjXP084gdw11XqxGndV8RBO0Ho6C1yMedOW6eWMt5+nSnf9YXgqbgGVUVDTs8bucw1ho4z57hrpuP5LB3uPIfn0Bk9FNZOFlhNXkl7ttqkrqgGorq6pZTwyTlvKNjpCOQazRo7dGk9uq2rcTbpmueMYR+uyf0wx/Vq5ZdimjM4ymSKFDjdw6d2Z9i5/Neab76lFmvduyK2w3G019Lc7fNr0VVRzNmifoS07XtJB0IIOh7QVU4T2oiICIiAiIgIiICIiAiIgIiIMLlGJ0eW09JFVy1MBpJ/GYZaWXo3tfsezt9G2Rw/asH8FNB64vftp9ymyKyLlUREK6rdFc41UxP6IT8FNB64vftp9yfBTQeuL37afcpsiaSr07Qx0NrojtCE/BTQeuL37afcnwU0Hri9+2n3KbImkq9O0GhtdEdoQn4KaD1xe/bT7k+Cmg9cXv20+5TZE0lXp2g0NrojtCE/BTQeuL37afch4UUBBBu970P/bT7lNkTSVenaDQ2uiO0PHZ7VBYrRQ22lDhS0cDKeIOOpDGNDW6nv5AL2IiwmZqnGVwvhOg1PYoHnPFhmNWG23LHrBcs/FfXi3xsxvo52xOBcHvlfuDWMbseCT2OAadNV9ZhmUV3Eq93K75RDXYJV28UVNiniDA1rnBvSyyTfGcTtcA3s0efQoHVmvGCnsONUV3xmyXDiIKy4+TWRYwYqgRyAuD3Sv3aMYzY4Ocex2gOmuq9bMQyWp4k3O7XDKmVmGT0HicGKm3x7Gvdt6SV83xnk7SA08gHuCzGEYJj/DfHKaw4xaKay2in16OlpWbW6ntcT2uce9ziSe8rPIMHhuEWDh5YYLJjVno7HaodSyloohGzU9rjp2uPe46k95XO/wCH2fJzG640LJpoxpHUMc6KaMehsjCHtH5iFmUWVNVVE40zhKYmY2whTuFFtPJlzvMTfyRXvd/W7U/1rj8E9v8AW979tPuU3RW6e5zW6a51ShHwT2/1ve/bT7lybwntR5TXC8zs72m4yM1/awtP9amqJp7nM01zqljLHjdrxqnfDbKGGja87pHMHnyH0vcebj9JJKhnH/gpaOP/AAxueJXbSJ0uk9FWbdzqSqaCI5QPo3FpHe1zhqNdVYyKqqqapxqnGVUzjvfgdkfCXKcW4nScP621SNyoVzLdHRNI/DSyODYthOgLX7mlruQIcD2L9T/BH418OcdwbHOFBfV4Xltlpm01RYslhdR1E9S4l80ke9zg7pJXSPDA7UB2gaA
AryunCLELzxIs2fVdliky+0QyU9HdGySMfHG9j2OaWtcGvG2R4G8HTcdNF5uKnBLCeNNo8n5hj9Ld2NaRDUObsqIPpjlbo9v5gdD3grFCcotV/g142+Dn+F4fX74VsLh59VsmmDLjTxj8Wnqux2gHIO0A7AwlTvhT4XGEcSrt1er3VWE5rG4RzY1kkfitSJPyYy7RsmvcAdxHPaEF2oiICIiAiIgIiICIiAiIgIiICIiAiIgIiIC+E6DU8gvqgd2xnK73xNqxW11uqeF1Xjj6CossjT4y+vfMd0muz+TMB2ab+38Xnqg451xcp8XstouFkslyzrypcRboGY21lS1jwXCR0j921jGdHICSeRbodO763DcquPEa/Vt5ySlrsBq7d4jTYsLe0EOcG9LLLNrq4nR7Q3TTa/uI1OdwTAce4ZYxSY9i9qgs9npQeipoAdNTzLnEklzie1ziSe8qQIMBg2BY9w0xumsGL2mmstop9THS0zdBqe1xJ1LnHvcSSfSs+iICIiAiIgIiICIiAiIgIiICgnFbgbg/Gu0+IZhj9LdQxpENURsqaf6Y5W6Obz56A6HvBU7RBqt8HXHDwcvwmBXz4W8Kh59WMkmEdzp4x+LBVdj9AOQdyHY1hKnvCfwt8H4nXTq/VyVWGZpG4RzY1kkfitUJPyWbuUn0AHdpzLQrsUB4scCsG422sUWYY/TXNzGlsNYB0dVT/wDy5W6Obz56a6HvBQT5FrT4LMd8wvizxf4ZVmU3XKMexM2h1okvMjZaiBlTTySvYZAAXAeY0A8gGcgNStlkBERAREQEREBERAREQEREBERAREQY7IcjtOJWeou18ulFZbVT7emrrhUMggi3ODW7nvIaNXOaBqeZIHeqHPFbgOfCFbmHwqWXrSMX8k9H5Vg8meJ+N9Lr02mzp+k/E6Xds57NPOVs8XeG1v4v8NMiw65+bS3aldCJNNeikBDopAO8skax2n+qvw6PDDIhxNOAeIu6zC5+SfFR/wC/6To9NdPi6893Zpz7EH7y2LILXlNrhudluVHd7bMXCKsoJ2TwvLXFjg17SQdHNc06HkQR2hZBQ3g9wzoODvDHHcNtrjJS2mlERlI0MshJfLJp3bpHPdp3btFMkBERAREQEREBERAREQEREBERAREQEREGu3Bf+mB4R36GN/uL1sStduC/9MDwjv0Mb/cXrYlAXluVzpbPQy1lbOympogC6R50A1OgH0kkgADmSQBzK9SqquuRy68yXCQ77fRyvit8WurdR5r5yPyidzWnuZ2ab3a2U0xMTVVuj6wbFmzN6rNhlqriNc6xxNosjWwEebPdJzAXc+6JrXOHp87afoXk65Zd8nsv1zLrRNPEbqI+bsxklmI3Ozrll3yey/XMnXLLvk9l+uZdaJp56Y7J1SzydnXLLvk9l+uZOuWXfJ7L9cy60TTz0x2NUs8nZ1yy75PZfrmTrll3yey/XMutE089MdjVLPJ2dcsu+T2X65k65Zd8nsv1zKMZJxDsGJ3e2Wq5Vr2XO5EilpKemlqJXgODS4tja4tYC5oL3aNGvapGmnnpjsjVbM7MHZ1yy75PZfrmTrll3yey/XMojV8UMZoWVr57kY20d0hss5NPKdlZKWCOP4vPUys84atG7mRodJUmnnpjsRk1id0Ozrll3yey/XMnXLLvk9l+uZdaxWKZVa82x6ivllqvHbXWsL4J+jfHvaCRrteA4cwe0BNPPTHZOq2d2DM9csu+T2X65lVR4NwHj2OLvk+2jKhS+L9GJH+Lb9nR9Pt27ul6PzNd2mn4uvNWLcL1QWmaiirayCllrZvFqVksga6eXaXbGA/GO1rjoO5pPcvYmnnpjsarZ5Ozrll3yey/XMnXLLvk9l+uZdaxVblVrt+SWywVFV0d2uUM9RS0/RvPSMh2dIdwG0adIzkSCdeWuhTTz0x2NVsxwZnrll3yey/XMnXLLvk9l+uZdaJp56Y7GqWeTs65Zd8nsv1zJ1yy75P
ZfrmXWopPxUxems1yusl020Fuufkeqm8XlPR1fSsh6PTbqfPkYNwBbz110BKaeemOyJyaxG+Ev65Zd8nsv1zJ1yy75PZfrmXWiaeemOydUs8nZ1yy75PZfrmTrll3yey/XMsfPeqCmutLbJayCO41TJJYKV0gEkrGab3Nb2kN3N1PduHpSmvNBWXOtt0FZBNX0TY3VNNHIC+ESAlm8fi7g0ka9uiaeemOxqtnkyHXLLvk9l+uZOuWXfJ7L9cy60TTz0x2NUs8nZ1yy75PZfrmTrll3yey/XMutcZpo6eJ8sr2xRMaXPe86NaBzJJ7gmnnpjsapZ5MhS8RrnRkG72RroNNXT2uczObz74nNa4jv80uP0KbW250t4oYqyinZU00oO2Rh5HQ6EfQQQQQeYIIPMKsrXdKS922luFBUxVlDVRNngqIXBzJY3DVrmkdoIIIK5UVzOI3llwjOy31crIrhFrozn5rZ9Py2na1x72duuxmmVM03pzYjCeGG6fT9vlxal/I6Ypzra1URFS5AiIg124L/wBMDwjv0Mb/AHF62JWu3Bf+mB4R36GN/uL1sSg8d3mkp7TWywjWZkD3MA/KDSR/WqpxRjI8Xs7Y9CwUcOhA018wc1cJAcCCNQe0FVHbaJ9gnqLDNqH0B2wFx1MtMf5J4+gDzD/rMcrd9mYjhMT84+vi6mQ1RFU0zxVn4QMlzlquG9ttt6uFjbc8oipKqe3TmKSSA0tS58eo7jtHbroQCOYCguV0uVXXirPgFgqrnNabFZoK6OObLKm31VS+aWXdM+oEM0szWbWsDS4NHfu1AGwF8xW15JU2iouNL4xNaawV9E7pHt6KcMewP0aRu82R40dqOfZqAsNm3CbFeIlXR1d+tXjNbRtcyCrp6mWmnYx3xmdJE9ri0/kk6fQtV0q7c1TMwp+js+aVuacO8SzHJrjBUTWa7SV5sdzki8abHUQeLl8rGxkvaxzdXtDCTu7nOBj2K1t/tOHYXlT8vyG4XJ+bNx+aOtuDpKeaiNfJSbHxfEc7Y0O6Qjfu57u5bG27h5j1pr7HWUdtbT1FkopLdbyyR4EFO/ZuYG7tDr0bOZBI05HmV5ouFeLw2WitLLXpb6K6C9QQ+MS+ZWCcziXXdqfwri7aSW89NNOSMdFOOOP1sa+3jIMhdwqyziwcrvFNkNqvVTHTWaOsc23xxQ1vi7aN9MPNeXtHNxG/c8EELN3+/ZFRZde+FEd4ubbnfr3T3C2XLxqTximtEodNVbJNdW9E6nmiboeXTRAdytis4G4PX5UciqLBFJdHVLa156eUQPqG6bZnQB/ROkGg88tLtRrrqpTNjlsnyGmvslHG6701NJRxVZHnshkcxz2fmJjYfo0OnadZItVcZ+uLWiT4SuK2RZ7V2Gtmo5rNeam0W4syiWhiouhDQx8lI2lkbPv1EhMjjuDtBtAWzto8d8k0XlLovKPQM8Z6Akx9LtG/brz27tdPoUPyLgbg+VZFLfblY2y3ObZ08sNTNC2p2fE6Zkb2sl00AG8HkNF33O3cRZLhUut+QYvT0JkcYIqmx1MsrGa8g57axocdO0ho19AUMqKaqMZnagM+JwVPhf8Aj7rhdWPbizKwQR3GZsG4VPRbOjDtvRkAOLNNpdq7TXmoJb8uvxzjDswsdXkHVPIMmdaw+9X0zx1kL+mH4Oh6PbCxro9WODw7Ro3NO7VbF2XEWQ3eDIrs2lqsrFB5Omr6JksMLoekMga2F0jw0btDqST9OnJR2LwfMAhuTK+PHwypiqxXwFtXOG004kEm+FnSbYSXDUiMNDuYOoJCMZt1Tu54qVu0tXHiuU0NTc7hc4LdxVttNTPudZJUvhh6SheGB8hJDQ57iB2DUrJ32XP+KPEjPqSz1NRSw49Vx2+iip8nltXi2sDJBO+FlNIJ97nkgyO26N0DRoSbvr+FGKXSz5Daquzx1Nvv9Z4/coZJZCJqjSMdIDu1YQIo9NhGhaCNDzWMyPgNguWV0NZ
dLH09VHTMo3TR1c8Tp4WjRrJix46YAcvwm5TiibVXCfrar6zUeUcSOItVi2VZPcbJPjVht008ONVrqQV9bO2TpqjeAHOjaYwAzTbq7mO5SjwUxt8HrCgSXEUjhqe0/hXqTZbwexDOLnQ3G8WcT19FCaeGpgqJaeQRa69E50T2l7NfxHajmeXMrxU+DZDh1uoLJgldYLHjdDCIqeiuNtqa2Vh1JP4QVbNRz7CCR6VDKKKqas6dqH+EZjNNkWV8JWVFZcaRr8jdTF1BXzUpAdSTu1Bjc3R+rAA4cwC4A6OOvjqrVXZ7xIz2y1WX37HbZiVDQw24W25PgOstOZHVU7tdZjqNujyW/g3agkkqxKzh7JnWNutXELyZftlU2pp32qnnoOhc0ea5rune9rwS7zmvbyOmnbr57/wGwXKG0IudkdUGjpG29jxW1DHyU7eyKVzZAZmdvKQuHM+kqSaJmZmI3/sp3hZkWQeEHcLZT5DkF5x6GkxSguIhsdY6hkrKid8rX1LnM0JaOhboz4ur+YK7eFuW3HNsv4KXa71Ira99oyKnfWbQ3xkQ1EETZdBy1e2MO5cvO5clc+V8GMMzVtALrZGE0FOaSmdRzy0jmQcvwOsLmEx8h5h1b9C9Nz4X45W0tgbDaKSnnxw77K9jXsbRPDQAAI3NJZybuZro4DQoxi3XG+f52wliovgraq/ihbYs9vOV3+O5SXWq0tFHcHQ0VJHDUvibTOpx5rvNjG4uBcd3aORU3Fs4oajXJcRI79Meqv8A95c38DsIfljskFjEV2fVNrnvhqpo4ZKhpBEzoWvEbn6gHcWk689VC2YmqYnBRtRl1+ObY7mWPVeQjF7pl7LO6a7X3pIKuJ874ZGxUGzbHG1zXbH7g/zASDrquF7IHBziUSeTeJjdfo/jWlV1zeD5gE9ykr34+PGXVYr2baudrIKjpBJ0sLBJtheXgEujDSeeuoJ1ydZwfw64NydlRYoJYsmMbrtEXv6Oqcz4ry3do14PPe0BxIBJJA0nFToq8JxlMVQPHWsuWRZXW2bGajIm3ez2fyhVSUF/NroqRrzJ0T3hrHmaQ9G7zCNmjeZGqsOTH88tpbSWK+43SWana2GkgrrPVVM8cTQA0Pl8cbvdoPjaDVcK/g9ZcykorlmlBRXjIIoDTT1VAJ6OnqIt5c2J8PTO6Rg1+JIXjUuOg10ULa4qrjCIU1a6J3E/iDwPvt4uV0guF1w6oq6h9vuU1IHStFI8lojc0AOMji4Dk4BoOoaNJHi2OQUHhCcXL5HU3eerttPb6uKjZc5xDO59NPqx8W7a9oI0Y1wIb+KArEuHAzCbnj9kss9md4hZN4twiraiOWla7k5rJWyB4aRy27tNABpoABkrhwuxm55fR5TPbnC/UrGRMrIaqaIvawksEjWPDZQCTpvDtNVOLCLcxtnnj4wUFwnpOKmb23D84pbmH+Up4K2vmnyeWWlmpnO/DQNoPFRHEQ3c1u1+5rmjVzueuPqTfqrBm5K3Ncnp7m/PJLK3orm/oY6OS5upzGIjqw6NcS1zgXN0aAQ1oAvyz8DcHx/J23+3WJtHcWTvqWdFUzCCOV4Ie9kG/omuIc4EtaDzKyA4V4uLK20+S/4vbdPLQh8Yl/xzp+n6XXdr/K+dt1292mnJGMWqsMJlQOf3S+Yfa+LmN27J76KW2y47UUFXPcZJayl8bqgyZrJ3Ev2kR9hJHnOHYdFI8jtlVi3EC64XTZDfbpZL1iVdcKimrrnLPPRyxPYxksczndIxr97mloOmreWnYptxg4L0ue4tlNPaoKanveQutrK2pq5pRHNFS1LZGtIG4N0YZQNrRqXDU94kWF8JsU4fVNfU2O1Cnqq9rWVNTUVEtTNI0a6MMkrnO2jX4oOn0InR1Z2HD+ZRnwYcfp7JwPw2aCrr6o11oo6iQVtdLUNjcYGatiD3ERsHcxujR6FYGVMZJjF3bJpsNJLrqNdBsKxeDcM8b4bRVcOOW99tgqnBz4B
UzSxN0LiBGx73Njbq53msAHPs7FmblRPv81PYoNTJXnbMWO0MVOP5V5/Z5o/1ntV1iMbtPxWbLdv73CFoWWaSps9DNNqJpII3v1/KLQT/AFr2r4GhoAAAA5ADuX1RM4zMvMCIiga7cF/6YHhHfoY3+4vWxK124L/0wPCO/Qxv9xetiUBYXJsWpslhhL3upq6nJdTVkXx4idNwP5THaDc08joDyLWkZpFlTVNM4wmJmmcYVbVWzJLQ4sqLMbowDlU2uRmjuffHI4Oby7gXfnXj8oXD5t3r2UfeVvIrM63O+jzLfjLbsRhOCofKFw+bd79lH3k8oXD5t3v2UfeVvImda6PKdeucoVD5QuHzbvfso+8nlC4fNu9+yj7yt5EzrXR5NeucoVD5QuHzbvfso+8nlC4fNu9+yj7yt5EzrXR5NeucoVD5QuHzbvfso+8nlC4fNu9+yj7yt5EzrXR5NeucoVD5QuHzbvfso+8nlC4fNu9+yj7yt5EzrXR5NeucoVD5QuHzbvfso+8nlC4fNu9+yj7yt5EzrXR5NeucoVD5QuHzbvfso+8sN8IdIMyGJ+Tbp1jNB5U8neK/hfFek6Ppe3Tbv8386vda7P8A+kJZ/wB15/8AVUzrXR5NeucoSvyhcPm3e/ZR95PKFw+bd79lH3lbyJnWujya9c5QqHyhcPm3e/ZR95PKFw+bd79lH3lbyJnWujya9c5QqHyhcPm3e/ZR95PKFw+bd79lH3lbyJnWujya9c5QqHyhcPm3e/ZR95PKFw+bd79lH3lbyJnWujya9c5QqHyhcPm3e/ZR95PKFw+bd79lH3lbyJnWujya9c5QqHyhcPm3e/ZR95PKFw+bd79lH3lbyJnWujya9c5QqHyhcPm3e/ZR95ffKFwP/s3evZR95W6iZ1ro8mvXOUKspbZkl3IZTWY2thHOpukjNG8+6ONznO5dxLfzqc4zi1NjUMzmvdVV1SQamsl+PLprtaPyWN1O1o5DUnm5znHNIomvZm0xhDWu5RXd2VTsERFU1xERBrtwX/pgeEd+hjf7i9bErXbgv/TA8I79DG/3F62JQEREBERAREQEREBERAREQEREBERAWuz/APpCWf8Adef/AFVbErWTJMms+I+H7Q1d9utDZaWr4ceJ089xqWQMmnN01ETHPIDnkAnaOencg2bRfAdRqOYX1AREQEREBERAREQEREBERAREQEREBEXXU1MVHTyTzysggiaXvlkcGtY0cySTyAHpQa9cF/6YHhHfoY3+4vWxK1k8HDJbVmnhReERe7DcKe72ec2CKGvo3iSCV0dJIx4a8cnaOaRyWzaAiIgIiICIiAiIgIiICIiAiIgIiICjmd8O8Z4nWKSzZVY6K+21/PoayIO2Hs3Md2sd/rNII9KkaINZzwN4m8CD4xwfyrrFjcfPqNl8xkjY38ikqvjR+gNd5veSVJcA8LXGL/fWYvmVDW8M825A2XJAImTHs1gqOUcrSeQPIu7gVeajWf8ADXFuKVifZ8ssVFfbe7UiKrj3GMn8Zjh5zHf6zSD9KCSotaPgW4pcBvw/CXJ+tuMRc+pGXzF5jZ+RSVfaz0Na/wA0dpLlKOHvhZYrlF8ZjGVUlbw3zcaNdYclb0BlPZrBMdGStJ7CCC7uagu9FXXFXj9hXBW74pb8uubrZJklVJS0cphc6KPY0F0krxyYwOfEzU98rTptD3NsQEEajmEH1ERAREQEREBERARFwnnjpoZJppGxRRtLnyPIDWgcyST2BBzXXPPFSwSTTSMhhjaXvkkcGta0cyST2ALXzL/DEtVXfJsX4U2Ks4r5YzzXttBDbdSns3T1Z8wD6W6g9m4FYqm8GjNeNM8dx46Zg6ttxcJGYPjEj6W2R89Q2aQHfMR+fkexxCDLZV4X9uud7nxjhLYKvitlEZ2SOtbgy2Uh7A6arPmafo6g9m4FY+l8GbLuMVRFcuOmXuu1HuEkeE4299JaYueoErgRJOR6SRoexxCv7FM
QseC2SCz49aaOy2uAfg6ShhbFGPSdAOZPeTzPeswgxeM4tZ8Ms1PaLDa6Sz2ynGkVJQwtijb+ZrQBqe89pWUREBERAREQEREBERAREQEREBERAREQEREBERAUQ4m8PcL4k4+61Ztabddba7XaK8BpjPe6N+ocx3+s0g/SvJlGW1dbWz2qzTGlZAdlXcQ0Oc12n8nCCCC4fjOIIb8UAu3bIs3FrWZTNNRx1tS7TdU1g6eVxHpe/U//ANV2bRR+ZO3lH+/qW/aySu5GdM4Q/IzjVI/JOIdzGPtv9yxS3SPoLFJdZJqmVtEx7jHtc+NjmscXOe1hbqwP2kkgk/oJ/B++E3W5ZihwDNRPS3ixUwdQ3OtaWNqqRpDQx7nADfHuY0EnVzSO9pJvbyBbPV1J9g33J5Atnq6k+wb7lGdZ9fDY1D3lj9arJ64oPame9OtVk9cUHtTPeq48gWz1dSfYN9yeQLZ6upPsG+5M6z6+DUPeWP1qsnrig9qZ7061WT1xQe1M96rjyBbPV1J9g33J5Atnq6k+wb7kzrPr4NQ95Y/WqyeuKD2pnvTrVZPXFB7Uz3quPIFs9XUn2DfcnkC2erqT7BvuTOs+vg1D3lj9arJ64oPame9OtVk9cUHtTPeq48gWz1dSfYN9yeQLZ6upPsG+5M6z6+DUPeRbih4YVpxy+zYtg1hr+IWXteYnQUY6Cgpn9n4eqeNjQD6NR3EtUPp+CmRccZ46/jpxHopLU5wkZguLVopreznqGzyh2+Y/t5HsforZ8gWz1dSfYN9yeQLZ6upPsG+5M6z6+DUPeT3CMSxvCcfgtWKWygtVoi+JBbo2sj17ydvxnHvJ1J7ys+qdbjNBTTdPQReSasDRtTb/AMC8c9ee3k4fQ4Eczy5qZ4lls9TVi0XcsNeWl9PVRt2sqmDtBH4sg729hHnN5bmsZtNUY25x9J3/AM/Wxq3slrtRnb4S9ERVNIREQEREBERAREQEREBERAREQEREBERAREQFhsyvT8dxS7XKIAz01M98Qd2GTTzAfo3aLMqN8R6CW54Le4IGmSYUzpI2NGpc5nngD6SW6K6zETcpirdjDKnDGMULtVA212+Cma4vMbfOeeZe483OJ7ySSSfSVUnHfiPkeO5NhmK47SXaCa/VMgluttpaWokZHHFI90cTZ3hu/wA0OcXNIDNdNztArhpqiOrp4p4nbopWh7XekEahQLPMVul64n8MrvR0vTW+zVldLXTdIxvQtkopYmHQkF2r3NHmg6a6nlzWtVMzVM1b3p64nNwp9HRfeMtJhF4itd6smSChimp6KXJpKBgoDLJsaxznhwOhc9oLms2hx01Gii+VcSMjtrePhprj0fVe1xVFo/ARnxaR1A6Unm3z/PAOj9w7uzkoJxP4NZflFTmokwyPJb3V3VlbaMjqrpE1lNRRvjkZSwxudujfoxzD5rWuLy4vU5yLhrk14v8Axho4rfEy2ZtYWMo7i+pYBT1TKU0/QSxjV3Mu3b26t0BGuuihRNVc4x9bp/hyvHF6+45xExKiNBdsioK/FJbhPbLNRwyTSVIlpwJdXFm0Br3jTcBq4DQnRSuHjdbbph9vyGxWHIMkiq55KZ1DbKEGqpZYyRIydkj2CMtIIIce3s11Cg8Ntz6z5ZhuWR4JJXT0WLzWattcV1pWyRTGeJzS17nBr2kQ69o0Dx3ghR+Xg5l1PS2mtu2PNyahuV7ul7vuJUVxjjibNU7PFw50jmMnbEGHcCdC5+oDtEM6uMcPrctaTj3jIwq15HDHcqo3Srdb6S0Q0hNwlqmuc18HREjR7Cx+7UhoDdddNNcNknFi5vr+H00NFc8TpbhkElBcqS900Ucj4W0c8p56vaG7mtO9jvxSNe0KAYpwhzXDrRjt3osZo23bGcluldDj8VfGIamirGOBEEvINdGJNrQ8M16N3YCNZVxJwW/8c6TCoMhxGezW2lyF01xofKkT5PEvFJW7nuicOT3v2FjHOO09vM6E51c
x67Fp4HnFHxDsIvVtpayC2SzPZSz1cbWeNxtOgnjAcT0buZaXBpI56aEE+nLslOJ2V9wbaLnfHh7WNorRAJp3lx05AuaAB3kkAelV9gOSQcILCcVzu/2u0xWyZ1NZLjdLnTwuuNA0DonbXPDt8YIjdqBqWAjXcvHxUutNxcxqipcHu1vzKnobpT1F7s1lvMLZayi0kBhMgeA0F2121zmhwY4aqFmf931ZqHwgMdditZeJqG70lXSXJlnksU9IBcPHX7THA2MOLXOcHtcCHbdDrryOnTbfCFsdXbcgqqyz3uz1FmrILdLbaymYayaqlaHRwxRRveXOcHN0PxSDqDoHEUteeGt6xHFsvrWYxSYb0uR2q9WCKmutHGykkYxsRaOkc2J0g2ndG4hrum0a92mq4UuJnjBYslghtz7xnlpyG3ZBeaG+y0j6S5tEL4o6Zr6d8kTG9C14AJJa7Qu7dVOCjSXMcOP/AKuyo8Iiw2qz36svNnvthq7LBDV1VruFIxtUaeWTo2zRhr3Mkbu1B2uJGhGmugPa7jfBPHkNJHjd8oL9bLU67QW25U8cclZBqWiSPSXTQOABa5zHDUagKvr7wmnvnCjMqOw8JLdg18roaenpoKeej6eraJmPeHOiOxrRt1ALzr6ByVg5Dgt1vPGd13ZAGWaXEKu0OrC9vmVElTE5rdmu4+a1x1005aa6osibk/Xxe/ghxAuXErhzZb3dbLV2isqaKnmkfO2NsVS58TXGSAMkeRGSTpv2u07QpVkvSw2iatp9BWUH+GU7jqNJI/OA5dx0LT9DiOeqhfASlyWxcPrTjWS466y1Fhoae3R1IrIqiKtEbNhkj2EuaNGNOjwD53ZyU1yec0+O3JzWufIYHsjYwauc9w2taPpJIH7VbZxi7ThzhZG2397ktujqo66kgqYiTFMxsjCfQRqP+K7l47PQ+S7RQ0Wod4vAyHUd+1oH/wCF7FNWGM4bnlxERYgiIgIiICIiAiIgIiICIiAiIgIiICIiAiIgq29Wc4LPJqzbjsji+GoHxaIkkmOT8lmp8x/xQPNO3Rpfza4PaHNIc0jUEHUEKznNDmkEAg8iD3qJVfC2wTSOfTQ1Nrc46kW6qkgZr+g07P6ldOZd21zhPfF1LOW5lObXGKPosv8ABPb/AFve/bT7k+Ce3+t737afco0Vvr8NjXrfKWIRZf4J7f63vftp9yfBPb/W979tPuTRW+vwa9b5SxCLL/BPb/W979tPuT4J7f63vftp9yaK31+DXrfKWElp4pyDJEyQjs3NB0X2Knig16OJkevbtaBqs18E9v8AW979tPuT4J7f63vftp9yaK31+DXrfKUbvFkt2Q2+SgutBS3Ohl+PTVkLZY3/AJ2uBBXVj+MWfE6HxKyWmhs1Fu3eL2+mZBHr6drABqpT8E9v9b3v20+5Pgnt/re9+2n3JorfX4NdtY44SxCLL/BPb/W979tPuQcJ7eD/AJXvR/8AvT7k0Vvr8GvW+UsJUVMVJA+aeVkMLBufJI4Na0ekk9i9mLWGXJbhTXOphdFaKR4mpWStLXVUo+LLtPZG3tbrzc7Rw0a1pfnrbw2sNuqI6h1NLX1EZBZLcKh9RsIOoLQ8kA694AKlCmJot7aNs892H1z8cWrfyzSU5tEYCIipc0REQEREBERAREQEREBERAREQEREBERAREQEREBERAREQEREBERAREQEREBERAREQEREBERB/9k=" + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "import * as tslab from \"tslab\";\n", + "\n", + "const graph2 = app2.getGraph();\n", + "const image2 = await graph2.drawMermaidPng();\n", + "const arrayBuffer2 = await 
image2.arrayBuffer();\n", + "\n", + "await tslab.display.png(new Uint8Array(arrayBuffer2));" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's try it out. To emphasize the removal steps, let's `stream` the responses from the model so that we can see each executed node:" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{\n", + " agent: {\n", + " messages: [\n", + " AIMessage {\n", + " \"id\": \"msg_01GLo6bFLEpcKH1mZHkb2jhf\",\n", + " \"content\": [\n", + " {\n", + " \"type\": \"text\",\n", + " \"text\": \"Okay, let's generate a haiku about water using the master haiku generator tool:\"\n", + " },\n", + " {\n", + " \"type\": \"tool_use\",\n", + " \"id\": \"toolu_01Lyy87dsq6ja6NBZp2P2bLj\",\n", + " \"name\": \"master_haiku_generator\",\n", + " \"input\": {\n", + " \"topic\": \"[Array]\"\n", + " }\n", + " }\n", + " ],\n", + " \"additional_kwargs\": {\n", + " \"id\": \"msg_01GLo6bFLEpcKH1mZHkb2jhf\",\n", + " \"type\": \"message\",\n", + " \"role\": \"assistant\",\n", + " \"model\": \"claude-3-haiku-20240307\",\n", + " \"stop_reason\": \"tool_use\",\n", + " \"stop_sequence\": null,\n", + " \"usage\": {\n", + " \"input_tokens\": 392,\n", + " \"output_tokens\": 77\n", + " }\n", + " },\n", + " \"response_metadata\": {\n", + " \"id\": \"msg_01GLo6bFLEpcKH1mZHkb2jhf\",\n", + " \"model\": \"claude-3-haiku-20240307\",\n", + " \"stop_reason\": \"tool_use\",\n", + " \"stop_sequence\": null,\n", + " \"usage\": {\n", + " \"input_tokens\": 392,\n", + " \"output_tokens\": 77\n", + " },\n", + " \"type\": \"message\",\n", + " \"role\": \"assistant\"\n", + " },\n", + " \"tool_calls\": [\n", + " {\n", + " \"name\": \"master_haiku_generator\",\n", + " \"args\": {\n", + " \"topic\": \"[Array]\"\n", + " },\n", + " \"id\": \"toolu_01Lyy87dsq6ja6NBZp2P2bLj\",\n", + " \"type\": \"tool_call\"\n", + " }\n", + " ],\n", + " \"invalid_tool_calls\": [],\n", + " 
\"usage_metadata\": {\n", + " \"input_tokens\": 392,\n", + " \"output_tokens\": 77,\n", + " \"total_tokens\": 469\n", + " }\n", + " }\n", + " ]\n", + " }\n", + "}\n", + "{\n", + " tools: {\n", + " messages: [\n", + " ToolMessage {\n", + " \"id\": \"7c551efd-03a1-4a3d-bc13-29f253ed64da\",\n", + " \"content\": \"Received tool input did not match expected schema\",\n", + " \"name\": \"master_haiku_generator\",\n", + " \"additional_kwargs\": {\n", + " \"error\": {\n", + " \"output\": \"{\\\"topic\\\":[\\\"water\\\"]}\"\n", + " }\n", + " },\n", + " \"response_metadata\": {},\n", + " \"tool_call_id\": \"toolu_01Lyy87dsq6ja6NBZp2P2bLj\"\n", + " }\n", + " ]\n", + " }\n", + "}\n", + "{\n", + " remove_failed_tool_call_attempt: {\n", + " messages: [\n", + " BaseMessage {\n", + " \"id\": \"msg_01GLo6bFLEpcKH1mZHkb2jhf\",\n", + " \"content\": \"\",\n", + " \"additional_kwargs\": {},\n", + " \"response_metadata\": {}\n", + " },\n", + " BaseMessage {\n", + " \"id\": \"7c551efd-03a1-4a3d-bc13-29f253ed64da\",\n", + " \"content\": \"\",\n", + " \"additional_kwargs\": {},\n", + " \"response_metadata\": {}\n", + " }\n", + " ]\n", + " }\n", + "}\n", + "{\n", + " fallback_agent: {\n", + " messages: [\n", + " AIMessage {\n", + " \"id\": \"msg_01EAWJF5xxc2wpr2gtqzT4AK\",\n", + " \"content\": [\n", + " {\n", + " \"type\": \"text\",\n", + " \"text\": \"Certainly! I'd be happy to help you create an incredible haiku about water. To do this, we'll use the master_haiku_generator function, which requires three topics as input. Since you've specified water as the main theme, I'll add two related concepts to create a more vivid and interesting haiku. 
Let's use \\\"water,\\\" \\\"flow,\\\" and \\\"reflection\\\" as our three topics.\\n\\nHere's the function call to generate your haiku:\"\n", + " },\n", + " {\n", + " \"type\": \"tool_use\",\n", + " \"id\": \"toolu_015ktvv3AAxZiUtXoMhrfyw7\",\n", + " \"name\": \"master_haiku_generator\",\n", + " \"input\": {\n", + " \"topic\": \"[Array]\"\n", + " }\n", + " }\n", + " ],\n", + " \"additional_kwargs\": {\n", + " \"id\": \"msg_01EAWJF5xxc2wpr2gtqzT4AK\",\n", + " \"type\": \"message\",\n", + " \"role\": \"assistant\",\n", + " \"model\": \"claude-3-5-sonnet-20240620\",\n", + " \"stop_reason\": \"tool_use\",\n", + " \"stop_sequence\": null,\n", + " \"usage\": {\n", + " \"input_tokens\": 422,\n", + " \"output_tokens\": 162\n", + " }\n", + " },\n", + " \"response_metadata\": {\n", + " \"id\": \"msg_01EAWJF5xxc2wpr2gtqzT4AK\",\n", + " \"model\": \"claude-3-5-sonnet-20240620\",\n", + " \"stop_reason\": \"tool_use\",\n", + " \"stop_sequence\": null,\n", + " \"usage\": {\n", + " \"input_tokens\": 422,\n", + " \"output_tokens\": 162\n", + " },\n", + " \"type\": \"message\",\n", + " \"role\": \"assistant\"\n", + " },\n", + " \"tool_calls\": [\n", + " {\n", + " \"name\": \"master_haiku_generator\",\n", + " \"args\": {\n", + " \"topic\": \"[Array]\"\n", + " },\n", + " \"id\": \"toolu_015ktvv3AAxZiUtXoMhrfyw7\",\n", + " \"type\": \"tool_call\"\n", + " }\n", + " ],\n", + " \"invalid_tool_calls\": [],\n", + " \"usage_metadata\": {\n", + " \"input_tokens\": 422,\n", + " \"output_tokens\": 162,\n", + " \"total_tokens\": 584\n", + " }\n", + " }\n", + " ]\n", + " }\n", + "}\n", + "{\n", + " tools: {\n", + " messages: [\n", + " ToolMessage {\n", + " \"id\": \"5fa074e5-2af1-4da5-8260-c94ee8127fa6\",\n", + " \"content\": \"Here is a haiku about water, flow, and reflection:\\n\\nRippling waters flow,\\nMirroring the sky above,\\nTranquil reflection.\",\n", + " \"name\": \"master_haiku_generator\",\n", + " \"additional_kwargs\": {},\n", + " \"response_metadata\": {},\n", + " \"tool_call_id\": 
\"toolu_015ktvv3AAxZiUtXoMhrfyw7\"\n", + " }\n", + " ]\n", + " }\n", + "}\n", + "{\n", + " agent: {\n", + " messages: [\n", + " AIMessage {\n", + " \"id\": \"msg_014EjX9Y5WTW7vP1RyWWURCW\",\n", + " \"content\": \"I hope you enjoy this haiku about the beauty and serenity of water. Please let me know if you would like me to generate another one.\",\n", + " \"additional_kwargs\": {\n", + " \"id\": \"msg_014EjX9Y5WTW7vP1RyWWURCW\",\n", + " \"type\": \"message\",\n", + " \"role\": \"assistant\",\n", + " \"model\": \"claude-3-haiku-20240307\",\n", + " \"stop_reason\": \"end_turn\",\n", + " \"stop_sequence\": null,\n", + " \"usage\": {\n", + " \"input_tokens\": 601,\n", + " \"output_tokens\": 35\n", + " }\n", + " },\n", + " \"response_metadata\": {\n", + " \"id\": \"msg_014EjX9Y5WTW7vP1RyWWURCW\",\n", + " \"model\": \"claude-3-haiku-20240307\",\n", + " \"stop_reason\": \"end_turn\",\n", + " \"stop_sequence\": null,\n", + " \"usage\": {\n", + " \"input_tokens\": 601,\n", + " \"output_tokens\": 35\n", + " },\n", + " \"type\": \"message\",\n", + " \"role\": \"assistant\"\n", + " },\n", + " \"tool_calls\": [],\n", + " \"invalid_tool_calls\": [],\n", + " \"usage_metadata\": {\n", + " \"input_tokens\": 601,\n", + " \"output_tokens\": 35,\n", + " \"total_tokens\": 636\n", + " }\n", + " }\n", + " ]\n", + " }\n", + "}\n" + ] + } + ], + "source": [ + "const stream = await app2.stream(\n", + " { messages: [new HumanMessage(\"Write me an incredible haiku about water.\")] },\n", + " { recursionLimit: 10 },\n", + ")\n", + "\n", + "for await (const chunk of stream) {\n", + " console.log(chunk);\n", + "}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can see that you get a cleaner response - the more powerful model gets it right on the first try, and the smaller model's failure gets wiped from the graph state. 
This shorter message history also avoids overpopulating the graph state with attempts.\n", + "\n", + "You can also inspect this [LangSmith trace](https://smith.langchain.com/public/c94f95d0-97fc-4d4d-a59a-b5161c2f4a90/r), which shows the failed initial call to the smaller model.\n", + "\n", + "## Next steps\n", + "\n", + "You've now seen how to implement some strategies to handle tool calling errors.\n", + "\n", + "Next, check out some of the [other LangGraph how-to guides here](https://langchain-ai.github.io/langgraphjs/how-tos/)." + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "TypeScript", + "language": "typescript", + "name": "tslab" + }, + "language_info": { + "codemirror_mode": { + "mode": "typescript", + "name": "javascript", + "typescript": true + }, + "file_extension": ".ts", + "mimetype": "text/typescript", + "name": "typescript", + "version": "3.7.2" } - ], - "source": [ - "const stream = await app2.stream(\n", - " { messages: [new HumanMessage(\"Write me an incredible haiku about water.\")] },\n", - " { recursionLimit: 10 },\n", - ")\n", - "\n", - "for await (const chunk of stream) {\n", - " console.log(chunk);\n", - "}" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "You can see that you get a cleaner response - the more powerful model gets it right on the first try, and the smaller model's failure gets wiped from the graph state. This shorter message history also avoid overpopulating the graph state with attempts.\n", - "\n", - "You can also inspect this [LangSmith trace](https://smith.langchain.com/public/c94f95d0-97fc-4d4d-a59a-b5161c2f4a90/r), which shows the failed initial call to the smaller model.\n", - "\n", - "## Next steps\n", - "\n", - "You've now seen how to implement some strategies to handle tool calling errors.\n", - "\n", - "Next, check out some of the [other LangGraph how-to guides here](https://langchain-ai.github.io/langgraphjs/how-tos/)."
- ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "TypeScript", - "language": "typescript", - "name": "tslab" }, - "language_info": { - "codemirror_mode": { - "mode": "typescript", - "name": "javascript", - "typescript": true - }, - "file_extension": ".ts", - "mimetype": "text/typescript", - "name": "typescript", - "version": "3.7.2" - } - }, - "nbformat": 4, - "nbformat_minor": 2 -} + "nbformat": 4, + "nbformat_minor": 2 +} \ No newline at end of file diff --git a/examples/how-tos/tool-calling.ipynb b/examples/how-tos/tool-calling.ipynb index 9cdb2ecfd..272d95b42 100644 --- a/examples/how-tos/tool-calling.ipynb +++ b/examples/how-tos/tool-calling.ipynb @@ -1,605 +1,605 @@ { - "cells": [ - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# How to call tools using ToolNode\n", - "\n", - "This guide covers how to use LangGraph's prebuilt [`ToolNode`](https://langchain-ai.github.io/langgraphjs/reference/classes/prebuilt.ToolNode.html) for tool calling.\n", - "\n", - "`ToolNode` is a LangChain Runnable that takes graph state (with a list of messages) as input and outputs state update with the result of tool calls. It is designed to work well out-of-box with LangGraph's prebuilt ReAct agent, but can also work with any `StateGraph` as long as its state has a `messages` key with an appropriate reducer (see [`MessagesState`](https://github.com/langchain-ai/langgraphjs/blob/bcefdd0cfa1727104012993326462b5ebca46f79/langgraph/src/graph/message.ts#L79))." 
- ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Setup\n", - "\n", - "```bash\n", - "npm install @langchain/langgraph @langchain/anthropic zod\n", - "```\n", - "\n", - "Set env vars:\n", - "\n", - "```typescript\n", - "process.env.ANTHROPIC_API_KEY = 'your-anthropic-api-key';\n", - "```" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Define tools" - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "metadata": {}, - "outputs": [], - "source": [ - "import { tool } from '@langchain/core/tools';\n", - "import { z } from 'zod';\n", - "\n", - "const getWeather = tool((input) => {\n", - " if (['sf', 'san francisco'].includes(input.location.toLowerCase())) {\n", - " return 'It\\'s 60 degrees and foggy.';\n", - " } else {\n", - " return 'It\\'s 90 degrees and sunny.';\n", - " }\n", - "}, {\n", - " name: 'get_weather',\n", - " description: 'Call to get the current weather.',\n", - " schema: z.object({\n", - " location: z.string().describe(\"Location to get the weather for.\"),\n", - " })\n", - "})\n", - "\n", - "const getCoolestCities = tool(() => {\n", - " return 'nyc, sf';\n", - "}, {\n", - " name: 'get_coolest_cities',\n", - " description: 'Get a list of coolest cities',\n", - " schema: z.object({\n", - " noOp: z.string().optional().describe(\"No-op parameter.\"),\n", - " })\n", - "})" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": {}, - "outputs": [], - "source": [ - "import { ToolNode } from '@langchain/langgraph/prebuilt';\n", - "\n", - "const tools = [getWeather, getCoolestCities]\n", - "const toolNode = new ToolNode(tools)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Manually call `ToolNode`" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "`ToolNode` operates on graph state with a list of messages. It expects the last message in the list to be an `AIMessage` with `tool_calls` parameter. 
\n", - "\n", - "Let's first see how to invoke the tool node manually:" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "metadata": {}, - "outputs": [ + "cells": [ { - "name": "stdout", - "output_type": "stream", - "text": [ - "{\n", - " messages: [\n", - " ToolMessage {\n", - " \"content\": \"It's 60 degrees and foggy.\",\n", - " \"name\": \"get_weather\",\n", - " \"additional_kwargs\": {},\n", - " \"response_metadata\": {},\n", - " \"tool_call_id\": \"tool_call_id\"\n", - " }\n", - " ]\n", - "}\n" - ] - } - ], - "source": [ - "import { AIMessage } from '@langchain/core/messages';\n", - "\n", - "const messageWithSingleToolCall = new AIMessage({\n", - " content: \"\",\n", - " tool_calls: [\n", - " {\n", - " name: \"get_weather\",\n", - " args: { location: \"sf\" },\n", - " id: \"tool_call_id\",\n", - " type: \"tool_call\",\n", - " }\n", - " ]\n", - "})\n", - "\n", - "await toolNode.invoke({ messages: [messageWithSingleToolCall] })" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Note that typically you don't need to create `AIMessage` manually, and it will be automatically generated by any LangChain chat model that supports tool calling.\n", - "\n", - "You can also do parallel tool calling using `ToolNode` if you pass multiple tool calls to `AIMessage`'s `tool_calls` parameter:" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": {}, - "outputs": [ + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# How to call tools using ToolNode\n", + "\n", + "This guide covers how to use LangGraph's prebuilt [`ToolNode`](https://langchain-ai.github.io/langgraphjs/reference/classes/langgraph_prebuilt.ToolNode.html) for tool calling.\n", + "\n", + "`ToolNode` is a LangChain Runnable that takes graph state (with a list of messages) as input and outputs state update with the result of tool calls. 
It is designed to work well out-of-box with LangGraph's prebuilt ReAct agent, but can also work with any `StateGraph` as long as its state has a `messages` key with an appropriate reducer (see [`MessagesState`](https://github.com/langchain-ai/langgraphjs/blob/bcefdd0cfa1727104012993326462b5ebca46f79/langgraph/src/graph/message.ts#L79))." + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "{\n", - " messages: [\n", - " ToolMessage {\n", - " \"content\": \"nyc, sf\",\n", - " \"name\": \"get_coolest_cities\",\n", - " \"additional_kwargs\": {},\n", - " \"response_metadata\": {},\n", - " \"tool_call_id\": \"tool_call_id\"\n", - " },\n", - " ToolMessage {\n", - " \"content\": \"It's 60 degrees and foggy.\",\n", - " \"name\": \"get_weather\",\n", - " \"additional_kwargs\": {},\n", - " \"response_metadata\": {},\n", - " \"tool_call_id\": \"tool_call_id_2\"\n", - " }\n", - " ]\n", - "}\n" - ] - } - ], - "source": [ - "const messageWithMultipleToolCalls = new AIMessage({\n", - " content: \"\",\n", - " tool_calls: [\n", - " {\n", - " name: \"get_coolest_cities\",\n", - " args: {},\n", - " id: \"tool_call_id\",\n", - " type: \"tool_call\",\n", - " },\n", - " {\n", - " name: \"get_weather\",\n", - " args: { location: \"sf\" },\n", - " id: \"tool_call_id_2\",\n", - " type: \"tool_call\",\n", - " }\n", - " ]\n", - "})\n", - "\n", - "await toolNode.invoke({ messages: [messageWithMultipleToolCalls] })" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Using with chat models" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "We'll be using a small chat model from Anthropic in our example. To use chat models with tool calling, we need to first ensure that the model is aware of the available tools. 
We do this by calling `.bindTools` method on `ChatAnthropic` model" - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "metadata": {}, - "outputs": [], - "source": [ - "import { ChatAnthropic } from \"@langchain/anthropic\";\n", - "\n", - "const modelWithTools = new ChatAnthropic({\n", - " model: \"claude-3-haiku-20240307\",\n", - " temperature: 0\n", - "}).bindTools(tools)" - ] - }, - { - "cell_type": "code", - "execution_count": 7, - "metadata": {}, - "outputs": [ + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Setup\n", + "\n", + "```bash\n", + "npm install @langchain/langgraph @langchain/anthropic zod\n", + "```\n", + "\n", + "Set env vars:\n", + "\n", + "```typescript\n", + "process.env.ANTHROPIC_API_KEY = 'your-anthropic-api-key';\n", + "```" + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "[\n", - " {\n", - " name: 'get_weather',\n", - " args: { location: 'sf' },\n", - " id: 'toolu_01DQJwh6WmCYkvCjDAVkX8E6',\n", - " type: 'tool_call'\n", - " }\n", - "]\n" - ] - } - ], - "source": [ - "(await modelWithTools.invoke(\"what's the weather in sf?\")).tool_calls" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "As you can see, the AI message generated by the chat model already has `tool_calls` populated, so we can just pass it directly to `ToolNode`" - ] - }, - { - "cell_type": "code", - "execution_count": 8, - "metadata": {}, - "outputs": [ + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Define tools" + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "{\n", - " messages: [\n", - " ToolMessage {\n", - " \"content\": \"It's 60 degrees and foggy.\",\n", - " \"name\": \"get_weather\",\n", - " \"additional_kwargs\": {},\n", - " \"response_metadata\": {},\n", - " \"tool_call_id\": \"toolu_01LQSRLQCcNdnyfWyjvvBeRb\"\n", - " }\n", - " ]\n", - "}\n" - ] - } - ], - "source": [ - "await toolNode.invoke({ messages: [await modelWithTools.invoke(\"what's the 
weather in sf?\")] })" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## ReAct Agent" - ] - }, - { - "attachments": {}, - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Next, let's see how to use `ToolNode` inside a LangGraph graph. Let's set up a graph implementation of the [ReAct agent](https://langchain-ai.github.io/langgraph/concepts/agentic_concepts/#react-agent). This agent takes some query as input, then repeatedly call tools until it has enough information to resolve the query. We'll be using `ToolNode` and the Anthropic model with tools we just defined" - ] - }, - { - "cell_type": "code", - "execution_count": 17, - "metadata": {}, - "outputs": [], - "source": [ - "import { StateGraph, Annotation, messagesStateReducer, END, START } from \"@langchain/langgraph\";\n", - "import { BaseMessage } from \"@langchain/core/messages\";\n", - "\n", - "const MessagesState = Annotation.Root({\n", - " messages: Annotation({\n", - " reducer: messagesStateReducer,\n", - " })\n", - "})\n", - "\n", - "const toolNodeForGraph = new ToolNode(tools)\n", - "\n", - "const shouldContinue = (state: typeof MessagesState.State): \"tools\" | typeof END => {\n", - " const { messages } = state;\n", - " const lastMessage = messages[messages.length - 1];\n", - " if (\"tool_calls\" in lastMessage && Array.isArray(lastMessage.tool_calls) && lastMessage.tool_calls?.length) {\n", - " return \"tools\";\n", - " }\n", - " return END;\n", - "}\n", - "\n", - "const callModel = async (state: typeof MessagesState.State): Promise> => {\n", - " const { messages } = state;\n", - " const response = await modelWithTools.invoke(messages);\n", - " return { messages: [response] };\n", - "}\n", - "\n", - "\n", - "const workflow = new StateGraph(MessagesState)\n", - " // Define the two nodes we will cycle between\n", - " .addNode(\"agent\", callModel)\n", - " .addNode(\"tools\", toolNodeForGraph)\n", - " .addEdge(START, \"agent\")\n", - " .addConditionalEdges(\n", - " 
\"agent\",\n", - " shouldContinue,\n", - " )\n", - " .addEdge(\"tools\", \"agent\");\n", - "\n", - "const app = workflow.compile()" - ] - }, - { - "cell_type": "code", - "execution_count": 13, - "metadata": {}, - "outputs": [ + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [], + "source": [ + "import { tool } from '@langchain/core/tools';\n", + "import { z } from 'zod';\n", + "\n", + "const getWeather = tool((input) => {\n", + " if (['sf', 'san francisco'].includes(input.location.toLowerCase())) {\n", + " return 'It\\'s 60 degrees and foggy.';\n", + " } else {\n", + " return 'It\\'s 90 degrees and sunny.';\n", + " }\n", + "}, {\n", + " name: 'get_weather',\n", + " description: 'Call to get the current weather.',\n", + " schema: z.object({\n", + " location: z.string().describe(\"Location to get the weather for.\"),\n", + " })\n", + "})\n", + "\n", + "const getCoolestCities = tool(() => {\n", + " return 'nyc, sf';\n", + "}, {\n", + " name: 'get_coolest_cities',\n", + " description: 'Get a list of coolest cities',\n", + " schema: z.object({\n", + " noOp: z.string().optional().describe(\"No-op parameter.\"),\n", + " })\n", + "})" + ] + }, { - "data": { - "image/png": 
"/9j/4AAQSkZJRgABAQAAAQABAAD/4gHYSUNDX1BST0ZJTEUAAQEAAAHIAAAAAAQwAABtbnRyUkdCIFhZWiAH4AABAAEAAAAAAABhY3NwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAQAA9tYAAQAAAADTLQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAlkZXNjAAAA8AAAACRyWFlaAAABFAAAABRnWFlaAAABKAAAABRiWFlaAAABPAAAABR3dHB0AAABUAAAABRyVFJDAAABZAAAAChnVFJDAAABZAAAAChiVFJDAAABZAAAAChjcHJ0AAABjAAAADxtbHVjAAAAAAAAAAEAAAAMZW5VUwAAAAgAAAAcAHMAUgBHAEJYWVogAAAAAAAAb6IAADj1AAADkFhZWiAAAAAAAABimQAAt4UAABjaWFlaIAAAAAAAACSgAAAPhAAAts9YWVogAAAAAAAA9tYAAQAAAADTLXBhcmEAAAAAAAQAAAACZmYAAPKnAAANWQAAE9AAAApbAAAAAAAAAABtbHVjAAAAAAAAAAEAAAAMZW5VUwAAACAAAAAcAEcAbwBvAGcAbABlACAASQBuAGMALgAgADIAMAAxADb/2wBDAAMCAgMCAgMDAwMEAwMEBQgFBQQEBQoHBwYIDAoMDAsKCwsNDhIQDQ4RDgsLEBYQERMUFRUVDA8XGBYUGBIUFRT/2wBDAQMEBAUEBQkFBQkUDQsNFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBT/wAARCADaAMcDASIAAhEBAxEB/8QAHQABAAMBAAMBAQAAAAAAAAAAAAUGBwgCAwQJAf/EAE8QAAEDBAADAwYIBw0HBQAAAAECAwQABQYRBxIhEzFVCBYiQZTRFBUXMlFhk+EJN0JxdbO0IyQ0NkNSYnN2gaHB0hhUVpGSlbElM0Vyov/EABsBAQACAwEBAAAAAAAAAAAAAAACAwEEBQYH/8QANREAAgECAQgIBQUBAQAAAAAAAAECAxExBBITIUFRUpEFFBVhcaGxwSIyM2LRQnKB4fA0Y//aAAwDAQACEQMRAD8A/VOlKUApSlAK+SbdoNtKBMmx4pX1SH3Uo5vzbNfXWZ5/Cjzs/tSJMdqQkWyQQl1AUAe1a+mjlGEZTlgk2XUaelmoXxLx51WXxiB7Sj3086rL4xA9pR76zvzetfhsP7BHup5vWvw2H9gj3Vye1cn4Jc0dPs77vI0TzqsvjED2lHvp51WXxiB7Sj31nfm9a/DYf2CPdTzetfhsP7BHup2rk/BLmh2d93kaJ51WXxiB7Sj3086rL4xA9pR76zvzetfhsP7BHup5vWvw2H9gj3U7VyfglzQ7O+7yNE86rL4xA9pR76edVl8Yge0o99Z35vWvw2H9gj3U83rX4bD+wR7qdq5PwS5odnfd5GiedVl8Yge0o99eTWS2h91Dbd1hOOLISlCZCCVE9wA3Wc+b1r8Nh/YI91Rl/s1visW91mDGZdTdbfpbbKUqH78Z9YFX0OkKFetCiotZzSxW12IyyDNi5Z2BtdKUrfOQKUpQClKUApSlAKUpQClKUApSlAKznNfxg2v9FyP1rVaNWc5r+MG1/ouR+taqqt9Cp+1m5kn1onjSlK8IenILMs4snD6zi6X+cIENTqI6FBtbq3HVHSUIQgFS1HrpKQT0P0VQMr8pDHsemYQY7U242zJH5DZlsW+WtbCGW3CSGkslal9ogJKNBQHMrWgTUxxztlrueHxhdLbkE4MT2ZEaRjDCnp0B9IUUSEJTs+j1B9FXztFJBNZeZmcO2LhZmGT2O73WRYr5NMtuLbv/AFBcNxiQwxIcit9UrIU2VoSOm+4dQNulThKN5d+3u1GtUnJOy7vU1jJuOeEYbdmbder0q3yXG23SXYb/AGbSXOiC64G+Rrf9Mpr
6cl4w4liWRjH7lcnU3tUduWmBGhSJLqmVqUhKwlptWxtCt6+boE6BG8H41NZRnxzu3ybTm0iPPs7Qxe22pl2PDV2kbbhmKSUjtEulQU08e5ICUqJrQ+HlonO8ZxfH7VOjRXcGtcdMmXFW1yu9u+txklQGnACgqQeo6bFSdKEYKT3b/DuIqpNyzUTnDjjjbeIWX5Tj7cObElWe4uQ2lLhSQ282httSlqcU0lCFcy1AIKuYgBQ2FA1plY9wzfnYjxTz+xXCx3dKb3e1XaFdWoS1wFsqiMpIU+PRQoKZUnlVo7I1vdbDVFVRUvhwsi6m21rFRGTfwOB+lLf+2M1L1EZN/A4H6Ut/7YzW10d/20f3R9UKv05eDNfpSlewPIilKUApSlAKUpQClKUApSlAKUpQCs5zX8YNr/Rcj9a1WjVXMlwaDk8+NNfkzYsmO0plK4b/AGe0qIJB6HfVIrEoqpCUG7XTRfQqKlUU2ZzlfD3GM6VGOR4/bL6YvMGDcIqHuy5tc3LzA63yp3r6BUB/s/cMt78wMb/7Wz/prUvkqg+MXv237qfJVB8Yvftv3VxV0XNKyrep1nltB63EpWLcOMVwd997HcctdjdkJCHV2+IhkuJB2AopA2BVjqS+SqD4xe/bfup8lUHxi9+2/dUX0S5O7qrkySy+ktSTI2lZpxkizcJ4ncI7HbL3dEQMlu78O4B2RzKU2hnnTynXonfrrXfkqg+MXv237qx2P/6rkzPaFLcyvXyxW7JrVItl2gx7nbpAAdiy2g404AQRzJPQ9QD/AHVUEcAeGjZ2nAccSdEbFsZHQjRHzforUPkqg+MXv237qfJVB8Yvftv3VNdFSjqVZcmReXUXjEzi18E+H9juMa4W/CrDBnRlh1mTHtzSHG1juUlQTsEfTU9k38DgfpS3/tjNWn5KoPjF79t+6v6nhRbO3juO3G7SUsPtyEtPS+ZBW2sLTsa6jmSD/dWxk/R7pV6dadW+a08HsdyEstpOLjFWuXWlKV0ziClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQHO/lI/jx8nn+0Mv9mNdEVzv5SP48fJ5/tDL/AGY10RQClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQHO/lI/jx8nn+0Mv8AZjXRFc7+Uj+PHyef7Qy/2Y10RQClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQClKUApSoTJMvgYyG0P9rJmvAlmDFTzvOgd5A2AlPcOZRCRsbPUVKMXJ2iZScnZE3Xy3S2Rb1bZdvnx25cGWyuO+w6NodbUkpUlQ9YIJB/PVDczzJZJKmLPboTfXlEmWt1z6thKAAfqCj+evX555d/u1k/6nqt0W+S5m11Ws/wBJ+OnlF8G5fArjBfsRfClxWHu2t76v5eKv0mlb9Z16Kv6SVD1V+r3kU8F3+B/AOz2qehbV6ujirxcWXO9p51CAG9eopbQ2kj+clX01UOLnBg8Zs/wrLL5EtIn4w/2qW2u05ZiAoLQ07sbKErHMB/SWPyumueeeXf7tZP8AqepolxLmOqVtxpVKzdGa5Ykgrh2Zwb+al11HT8/Kf/FS1n4jsvyGot5hKskh1QQ26p0OxnFE6CQ7oaJOgAtKdkgDZ6VjRN/K0/B+2JCWT1YK7iXKlKVSa4pSlAKUpQClKUApSlAKUpQClKUApSlAKUpQEVlF+RjNhl3FbfbKaSEtsg6LrqlBLaAfUVLUlP8AfWew4zqFOyZbpk3CQQuQ+T3n1JT9CE7ISn1D6ySZ7istXwbHGv5J27oDn0aSy8tP/wC0IqKq2fwU4pbdfsl5M7OQwWa57RSudOPPFHJ8ZvGSSMQv1zlOY3ARMnWqJZYz0GOeQualSHVJX6aBvlaPMkddHYr08TuL2QfHOTohZfFwNiy41HvUCPIjMPLuzrqXVFO3QSUJLaG9NgK5l9/cK1bG660VdHSNK5me4ncQslvbGO2hrIY
r9msdtk3J61wLdIlOy5LJWQ8JTjSUpHLrTaNlXP1SAAZm0ZRxLyrLsRxq63M4RcZeOS7hc2o0OM+6HmZbbSFo5u0QgrSsKI2sAKI79KCxlVU8Ezf1uJb5eZQTzHlGzrZ+ivF9huSy4y82l1pxJQttYBSpJGiCD3g1yreLtknEqzcHJUzI37ZeGcunWp+XAisacdYRMaTICHELAVytH0fm/uqunROuqIbLkeIw06+uU6hCUrfcSlKnCBoqISAAT39AB16Cs4ayUJ599RO4BfHkSpNgmOrfcjtiREfeXzLcYKtFKiepLatDZ6lKkbJPMau1ZXblqa4gY2pHznBKaXrv7Mtcx/u5kI/wrVK2qmtRnvXu17XODlUFCq0hSlKpNQUpSgFKUoBSlKAUpSgFKUoBSlKAUpSgK7ntjfv2NvtRE80+OtEuMknl5nG1BQRv1BYBQT9CzVNhTG7hFbkNE8ix3KGlJI6FJHqIIII9RBFapWP53fbHbOKNpxa1TixmV7juTfiv4K4uM80gHbzziEkMElJSHDvZ0ClXokWq045jdrYfg38lyhUm4ywZRcu4BWPMLtf5b92vkCLkDSGrtbbfMDUaaUt9mlaxyFYPIEpPKpIUEgKB67zzirwnyQ5JZn8etuR3ddstLEKJdI92tiOR1vm0txqSwSgn0SpbOubp6I5RXQjhvcMlEvGLilQ36cUtPtq+sFK+b/mkH6q9fxhP/wCHL17J99Y6vV2LzR03KjJapGbp4KyciYsl9vWQ3SxZ2m1swbtdMZkIYTNKRtSVpW2pJAUVEKCUkb6aGgLZb+G9vt+V2jIRMuEi4WyzrsjZkvh0OMqW2srcURzKc20n0ubrs7BJ3Xvu2dQ7DcLZAucSXbp1zcLMCLLDbTstY1tLSVLBWRsdE7PUVK/GE/8A4cvXsn306vV3E1Oitq5lEk8BLC/iUWxNXG7QzDvD18h3KM+hEuNKcdccUUK5OXl/dnE8qkn0T12etaBaoKrZbIkNUp+cqOyhoyZSgp14pAHOsgAFR1skAdT3V4ImXJ0hKMbvKlE60phKP8VLAqTtuH3u/KHxmn4it5+eyy8Fy3B/NK07S2PUSkqV1Oik6VTQSXz2S8fbEi61GmrpnswW3qu2TSLwRuFBaXCjq3tLjqlDtlD/AOnIlG/pLg6aO9Eqk8OOJWLZrJv9jx3to7+MSvi2bAehORTHUNhHKFJAKFBJKSn1a7t1dqTkpNJYLD/eZwatR1ZubFKUqsqFKUoBSlKAUpSgFKUoBSlKAUpSgFfwkDvOvz1HT8hgQLgzbFTIxvEllx+LblPoQ/ISjXMUJJ2QNjZ7hsbrNIWEz+OuKY3cOJmPycVuFtupujFit95WpBCFEx/hJb5QpSfRXoHopAOwCpFAfdcciu3FRzPcMtEfI8GdtyG4jOXLiIShx5Q5l/BkrO1gJ5RzgD550UkJJvWM4+nGrDbLaZsu6uwYrcX4wuKw5KfCQBzOLAHMo62TrqetStKAUpUbklkTkuO3W0LlyoCLhFdiGXCWEPshaCnnbUQQFp3sEggEDoaA/ILyy/KMmcT/ACiF3ewXBTVrxR8RLJIjr/LaXzKkJPdtTg2FfzUo+iv1L8n/AIvQ+OXCWwZfE5W3pjPJMjp/kJKPRdR9OuYEjfekpPrriHi3+D/4e4FxI4V4/b7zkz0PKro/CmuSZUdTjaEM84LRSwADvv5goa9VdreT/wCT/j3k4YbMxrGplznQJU9dxW5dXW3HQ4pttsgFttA5dNJ9W9k9e7QGmUpSgKvxFwCJxIw+6Y/Jn3CzonpRzT7PIMaU0pCgpCkuD1gpHfsEdKhY96ynFM0xXEW8bnZBi7lt7OTmD89tTrMltJ/99s+krnCUnnH5S+6tCpQEXjmUWfMLYLjY7pDu8ArU18JhPJdb50nSk7SSNg9CKlKzDKOFdxx3DrhE4Pu2TAb3LuKbk8tdtS5GlL6BaFpTrk5wlIKkgkAHQBOxLw+LFrVxTc4dyY9
xayBu2puSJSoDiIcpvYDhac6j0CUbBOgVgAkg6AvFKUoBSlKAUpSgFKUoBSlKAVnWY57OvreX4rw6uFqd4i2VqMXI14Q6iPFD/pIcUQn0/wBz5lDl2NgA67q0Ws0uU5GN8d7NGhYIqQvJbe/8YZfGbJ+D/BgC2w8Qg6Srm9EqWOvQA+oCdtPDazoyO35fdrXbZudtW1u3v3xmNyKIAJX2YJVyJKlL9ZOiEkkCrdSlAKUpQClKw7jX5Q0jGsgZ4fcPLajLuKE9vmbgJV+9rW2dfviYsfMSNghOwVbHdzJ2BX/KPuURzyhvJ6tiJLS7im9S5KoiVguhr4OR2hT3hOwRvu6H6DXSNY7wL8nmPwxkzcoyO5Ly/iVeBzXTJJY2ob/kI6f5JlOgAABvQ3oBKU7FQClKUApSlAK9E2G3cIj8Z3nDbzamlFpam1hKho8qkkFJ+sEEeqvfSgMgj4vkPAPBccsOAWeZncBq59lKbvV51KjRHFHRaWtPKUtcydJ6aQg95JUNStV7t19aedts+LcGmXVMOLivJdShxPRSFFJOlD1g9RX21lPk6ysJl41kqsFhzYUBORz0TkTiSpc4LHbqTtSvQJ1ru/MKA1alKUApSlAKUpQClKUAr8+/KJ/CQXPF81h47jGK3exSrHdWlXtu9LjIclIbWsPQwlAeSlCwGyH0Ob79JI0T+gD8hqK2XHnUNIHepxQSP+Zri7y8vJnsfGSyO5ticy3jOLaz++IzUhG7pHSPmaB6upA9E96h6J36OpKMpYIFz8hzym808pa35fOyq1We3RbU7FZhOWlh1sOrWHS6F9o6vfKEta1r5x7/AFdRVyl+Dyx6Dw48nC3qucli23O9TZFzfjS3EtuoBIab2lWiAUNJWPqXv110z51WXxiB7Sj31LRz4WZsyUpXzQ7nDuG/gstiTrqexcC//BrmfJuIuVeVFkU/C+F8yRj2BQnVRb/nqElLj6h0XFt++9XqLvq3sdOXng01qZgluJfHTIc/zGXww4MdjLyBj0L5lrqe0gWBB2CAe52R0OkDYBHXelcui8FOBePcD7A/EtfbXG8T1/CLrfp6u0mXF87JcdWeutk6TvQ2e8kkzfDLhfjfCDEYmN4rbW7bbI/UhPVx5Z+c44vvWs66k/UBoAAWusAUpSgFKUoBSvkm3aDbSgTJseKV9Uh91KOb82zXzedVl8Yge0o99TUJNXSM2ZKUqL86rL4xA9pR76edVl8Yge0o99Z0c+FizMY8rPyn5vkv2XH7s3hispt9zkOxXnxcfgiYriUpU2k/uTnMVjtCO7XZnv3XO/Cf8Jffs3yy24rC4UQpd2vFx7GMIV3VHQhK1DRcBYXspGypewNAnQ1XV/HnD8Y438KMhw+Zd7ahc6OTEkLko/e8lPpNOdDvQUBvXekqHrrjn8G5wOZxbJ8izzLixbbhbXXLPbI0x1CFJc7pDwBPqGmwobB5nB6qaOfCxZn6NUqL86rL4xA9pR76edVl8Yge0o99NHPhYsyUpUX51WXxiB7Sj30TlFmUoAXeCSegAko6/wCNNHPhYsyUpSlVmBVQy7Ln4ksWm0hBuBSFvyXBzNxEHu6flOK/JT3AAqV05Urtch9EWO684dNtpK1H6gNmshxpbku1N3F/Rl3I/DX1DfVSwCB19SU8qR9SRVsbRi6j2YeJu5LRVWfxYI/i8agy3u3uLZvEsjRk3HTyz130BHKkfUkAfVXu837WP/jYf2CPdVO4wcXYnCOJj78qHImC63Vi3nsGHnS0hSvTc02hZUoDuR0Kj3b0RX0ZFxsw3FI1sdul0djKuUb4ZHjCBJXJ7HptxbKWy42kb6laU6OwdEGq3WqSxkzuJwjq1KxafN+1+Gw/sE+6nm/a/DYf2CfdVdv/ABgw/G7PaLnMvbS4l4Tz24wmnJTktPLzFTbbSVLUACCSBobG9VDSuLzFyyjhszjkiFdLBlLs5DkwBRUAxHW4OTqOVXOjlUFAkaI0DUdJPiZlyii
7PYrZ3lBZtsZDqSFJdabDbiSO4hSdEf3GpbGr+7hfZw5jhkWNbhAkKSO1irWvZU4ofPbKlElZ9JJJUoqBKkU2wcXMTyjJ5WP2q6mbc4y3W3EojPBrmbOnEpeKOzUUnoQlRIq3PMokMradQlxtaSlSFDYUD0INWRrSwm7r/YbiqpShWjY1ClVPhjcHJmKNx33C6/b3nYKlkklSW1ENkk9SS3yEk+vff31bKTjmScdx5yUXFuLFKUqBEUpSgMzz+FHnZ/akSY7UhItkghLqAoA9q19NfH5vWvw2H9gj3VJZr+MG1/ouR+tarxrn5fUnGcUm1qXqzxfS0pLKWk9iI/zetfhsP7BHup5vWvw2H9gj3VIVGZLk1rw+ySrvepzVutsYAuyHjoDZAAHrJJIAA2SSAASa5ulqP9T5nHU5t2TZ5+b1r8Nh/YI91PN61+Gw/sEe6qjD474LNsV2vCb8lmFaezM/4VFeYdjJcUEoUtpxCXAlRPRXLroevQ1IYrxYxXNJc+Larr2kmCymS+1JjuxlBlW+V1IdSnnbOj6adp+us59ZbX5ljVZJtp6vEnvN61+Gw/sEe6nm9a/DYf2CPdWVRvKOsmUcTsLxvFJ0e6wruuaJj64j6PQZYUtCmHFBKFpK06Kk8419HfWy0lOrHGT5mJqrTtn3VyP83rX4bD+wR7qhc0sluYxa4uNQIrbiW9pWhlIIOx3HVWqoLOf4pXP+q/zFbOSVajyims5/MtveWZPOWmhr2r1NlpSldg+inzXKILhbpUUnQfaU3v6Ngj/OslxVxS8btoWlSHW2EsuIUNFK0DlWD+ZSSK2Os6yqwu45cZN1iMKetUtZdmNtDa4zpABdCfW2rXpa6pV6WiFKKLorPg6axxX4/wBusdDI6qpzaltMm8oK23GTjmOXK322Xd/iTI7fdZMSA2XZC2GnP3QtoHVagFb5R1OjVWVkcvFeK9yzl7E8mulnyGxxY0X4HaXHZcR1h17mYdY1ztBfaJUCoBOwdkVusaSzMYQ/HdQ+y4OZDjagpKh9II6GvZWq9WpnYcLvOTOWuHWJZDwZl4JkV9xy53GKmx3C3yIVmjGa9anX53wttPZo2op5D2RUgHRQN6FeeM4pkmP3vC8vmY1c24UjMLxc3bZHY7STb485lbbKnW0np6Wlr1vl5zvuNdRUrFyCopWs8P6/BgOAfGti4xfFuK2fJrbh0uRPfvUG+wC3CivbKkPwnj1IdcJJbSpSdKJ0kjVb9SvlhxnswkLt9scIjBXJMuKN8jKd6UhtQ6F0jYAHzPnK/JSuyEHUfdte4k3GjFuT1Fk4URyMdlzNEJn3CRIRsaJQFdmk/mIbBH1EVdK9EKGxbobESM0liMw2lpppA0lCEjQA+oACvfVtSWfNyR5ucs+TlvFKUqsgKUpQGc5r+MG1/ouR+tarxryzX8YNr/Rcj9a1Vcyvh7jGdKinI8ftt9MXmDBuEVD3Zc2ubl5gdb5U719ArmdIW0kb7l7niulbda17kWGsj8pfErrlWGWR61xJ1y+Jr7EusuBa5CmJcmO3zhxLK0qSQ4OcLTpQJKBo71U5/s+8Mt/xAxv/ALWz/pqdxXhviuDPvvY7jlrsbshIQ6u3xEMlxIOwFFIGwK5yai7o5kJRpyU4t3Xd/Zz3mWE23JeFmd3PHMZzpd9ehxbeheTKnPyZLQkodLbLT61r0ggknlA6nW+tWjjXgN/zTiDkEazxZCPjDh7cLa1N5FJYMhUlooZU5rlClDm6E70VHu3W/wBKlpWixZTJNNbL46934OdLJe5uYcRODoj4RkePR7C1OanfGFqcYjxCYRbSgOa5VJ5hpKh6J6ddnVdF181xt0W8W+TBnR2pkKS2pl+O+gLQ6hQ0pKknoQQSCDVJHk/8MwQRgGOAjuItjP8ApqLkpY6iE5wqWvqt/O1vf3l/qCzn+KVz/qv8xUBG4DcN4chp9jBMdZfaUFtuItjIUlQOwQeXoQan85/ilc/6r/MVsZJ
brNO3EvUzQUdNDNe1evibLSlK7h9GFKUoCr3PhvYbnJckiM7BkuHa3bfIcjlZ3slQQQFHfrIJr4PkogeL3r237qu9KvVeov1FiqzjqUmUj5KIHi969t+6nyUQPF717b91XelZ09Tf6EtNU4mU5nhVYwoGUu4XJIIPZy5zimzr6UAhJ/MQRVriRGIEZuPGZbjx2khKGmkBKEAdwAHQCvdSq5VJz1SZXKUpfM7ilKVWRFKUoBSlKArmS4NByefGmvyZsWTHaUylcN/s9pUQSD0O+qRUZ8lUHxi9+2/dV2pVmklZL2RXKnCTvKKf8FJ+SqD4xe/bfup8lUHxi9+2/dV2pTSPu5IjoaXAuSKT8lUHxi9+2/dT5KoPjF79t+6rtSmkfdyQ0NLgXJFJ+SqD4xe/bfup8lUHxi9+2/dV2pTSPu5IaGlwLkik/JVB8Yvftv3V65HCG1y2lNSLneH2VfObXM2lQ+g9KvVKyqsk7r0RlUaSd1FckKUpVRaf/9k=" - }, - "metadata": {}, - "output_type": "display_data" - } - ], - "source": [ - "import * as tslab from \"tslab\";\n", - "\n", - "const drawableGraph = app.getGraph();\n", - "const image = await drawableGraph.drawMermaidPng();\n", - "const arrayBuffer = await image.arrayBuffer();\n", - "\n", - "await tslab.display.png(new Uint8Array(arrayBuffer));" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Let's try it out!" - ] - }, - { - "cell_type": "code", - "execution_count": 18, - "metadata": {}, - "outputs": [ + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [], + "source": [ + "import { ToolNode } from '@langchain/langgraph/prebuilt';\n", + "\n", + "const tools = [getWeather, getCoolestCities]\n", + "const toolNode = new ToolNode(tools)" + ] + }, { - "name": "stdout", - "output_type": "stream", - "text": [ - "{\n", - " type: 'human',\n", - " content: \"what's the weather in sf?\",\n", - " toolCalls: undefined\n", - "}\n", - "{\n", - " type: 'ai',\n", - " content: [\n", - " { type: 'text', text: \"Okay, let's check the weather in SF:\" },\n", - " {\n", - " type: 'tool_use',\n", - " id: 'toolu_01Adr6WYEuUuzShyDzwYZf5a',\n", - " name: 'get_weather',\n", - " input: { location: 'sf' }\n", - " }\n", - " ],\n", - " toolCalls: [\n", - " {\n", - " name: 'get_weather',\n", - " args: { location: 'sf' },\n", - " id: 'toolu_01Adr6WYEuUuzShyDzwYZf5a',\n", - " type: 'tool_call'\n", - " }\n", - " ]\n", - "}\n", - "{\n", - " 
type: 'tool',\n", - "  content: \"It's 60 degrees and foggy.\",\n", - "  toolCalls: undefined\n", - "}\n", - "{\n", - "  type: 'ai',\n", - "  content: 'The current weather in San Francisco is 60 degrees and foggy.',\n", - "  toolCalls: []\n", - "}\n" - ] - } - ], - "source": [ - "import { HumanMessage } from \"@langchain/core/messages\";\n", - "\n", - "// example with a single tool call\n", - "const stream = await app.stream(\n", - "  {\n", - "    messages: [new HumanMessage(\"what's the weather in sf?\")],\n", - "  },\n", - "  {\n", - "    streamMode: \"values\"\n", - "  }\n", - ")\n", - "for await (const chunk of stream) {\n", - "  const lastMessage = chunk.messages[chunk.messages.length - 1];\n", - "  const type = lastMessage._getType();\n", - "  const content = lastMessage.content;\n", - "  const toolCalls = lastMessage.tool_calls;\n", - "  console.dir({\n", - "    type,\n", - "    content,\n", - "    toolCalls\n", - "  }, { depth: null });\n", - "}" - ] - }, - { - "cell_type": "code", - "execution_count": 19, - "metadata": {}, - "outputs": [ + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Manually call `ToolNode`" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`ToolNode` operates on graph state with a list of messages. It expects the last message in the list to be an `AIMessage` with the `tool_calls` parameter set. 
\n", + "\n", + "Let's first see how to invoke the tool node manually:" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{\n", + "  messages: [\n", + "    ToolMessage {\n", + "      \"content\": \"It's 60 degrees and foggy.\",\n", + "      \"name\": \"get_weather\",\n", + "      \"additional_kwargs\": {},\n", + "      \"response_metadata\": {},\n", + "      \"tool_call_id\": \"tool_call_id\"\n", + "    }\n", + "  ]\n", + "}\n" + ] + } + ], + "source": [ + "import { AIMessage } from '@langchain/core/messages';\n", + "\n", + "const messageWithSingleToolCall = new AIMessage({\n", + "  content: \"\",\n", + "  tool_calls: [\n", + "    {\n", + "      name: \"get_weather\",\n", + "      args: { location: \"sf\" },\n", + "      id: \"tool_call_id\",\n", + "      type: \"tool_call\",\n", + "    }\n", + "  ]\n", + "})\n", + "\n", + "await toolNode.invoke({ messages: [messageWithSingleToolCall] })" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Note that you typically don't need to create an `AIMessage` manually; it will be generated automatically by any LangChain chat model that supports tool calling.\n", + "\n", + "You can also do parallel tool calling using `ToolNode` if you pass multiple tool calls to `AIMessage`'s `tool_calls` parameter:" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{\n", + "  messages: [\n", + "    ToolMessage {\n", + "      \"content\": \"nyc, sf\",\n", + "      \"name\": \"get_coolest_cities\",\n", + "      \"additional_kwargs\": {},\n", + "      \"response_metadata\": {},\n", + "      \"tool_call_id\": \"tool_call_id\"\n", + "    },\n", + "    ToolMessage {\n", + "      \"content\": \"It's 60 degrees and foggy.\",\n", + "      \"name\": \"get_weather\",\n", + "      \"additional_kwargs\": {},\n", + "      \"response_metadata\": {},\n", + "      \"tool_call_id\": \"tool_call_id_2\"\n", + "    }\n", + "  ]\n", + "}\n" + ] + } + 
], + "source": [ + "const messageWithMultipleToolCalls = new AIMessage({\n", + "  content: \"\",\n", + "  tool_calls: [\n", + "    {\n", + "      name: \"get_coolest_cities\",\n", + "      args: {},\n", + "      id: \"tool_call_id\",\n", + "      type: \"tool_call\",\n", + "    },\n", + "    {\n", + "      name: \"get_weather\",\n", + "      args: { location: \"sf\" },\n", + "      id: \"tool_call_id_2\",\n", + "      type: \"tool_call\",\n", + "    }\n", + "  ]\n", + "})\n", + "\n", + "await toolNode.invoke({ messages: [messageWithMultipleToolCalls] })" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Using with chat models" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We'll be using a small chat model from Anthropic in our example. To use chat models with tool calling, we first need to ensure that the model is aware of the available tools. We do this by calling the `.bindTools` method on the `ChatAnthropic` model:" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [], + "source": [ + "import { ChatAnthropic } from \"@langchain/anthropic\";\n", + "\n", + "const modelWithTools = new ChatAnthropic({\n", + "  model: \"claude-3-haiku-20240307\",\n", + "  temperature: 0\n", + "}).bindTools(tools)" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[\n", + "  {\n", + "    name: 'get_weather',\n", + "    args: { location: 'sf' },\n", + "    id: 'toolu_01DQJwh6WmCYkvCjDAVkX8E6',\n", + "    type: 'tool_call'\n", + "  }\n", + "]\n" + ] + } + ], + "source": [ + "(await modelWithTools.invoke(\"what's the weather in sf?\")).tool_calls" + ] + }, {
weather in the coolest cities:\"\n", - " },\n", - " {\n", - " type: 'tool_use',\n", - " id: 'toolu_01Qh1jhQHH14ykNEx5oLXApL',\n", - " name: 'get_coolest_cities',\n", - " input: { noOp: 'dummy' }\n", - " }\n", - " ],\n", - " toolCalls: [\n", - " {\n", - " name: 'get_coolest_cities',\n", - " args: { noOp: 'dummy' },\n", - " id: 'toolu_01Qh1jhQHH14ykNEx5oLXApL',\n", - " type: 'tool_call'\n", - " }\n", - " ]\n", - "}\n", - "{ type: 'tool', content: 'nyc, sf', toolCalls: undefined }\n", - "{\n", - " type: 'ai',\n", - " content: [\n", - " {\n", - " type: 'text',\n", - " text: \"Now let's get the weather for those cities:\"\n", - " },\n", - " {\n", - " type: 'tool_use',\n", - " id: 'toolu_01TWgf1ezxk1hAzwYtqrE3cA',\n", - " name: 'get_weather',\n", - " input: { location: 'nyc' }\n", - " }\n", - " ],\n", - " toolCalls: [\n", - " {\n", - " name: 'get_weather',\n", - " args: { location: 'nyc' },\n", - " id: 'toolu_01TWgf1ezxk1hAzwYtqrE3cA',\n", - " type: 'tool_call'\n", - " }\n", - " ]\n", - "}\n", - "{\n", - " type: 'tool',\n", - " content: \"It's 90 degrees and sunny.\",\n", - " toolCalls: undefined\n", - "}\n", - "{\n", - " type: 'ai',\n", - " content: [\n", - " {\n", - " type: 'tool_use',\n", - " id: 'toolu_01NyRcucFHEZmyA6hE6BtTPs',\n", - " name: 'get_weather',\n", - " input: { location: 'sf' }\n", - " }\n", - " ],\n", - " toolCalls: [\n", - " {\n", - " name: 'get_weather',\n", - " args: { location: 'sf' },\n", - " id: 'toolu_01NyRcucFHEZmyA6hE6BtTPs',\n", - " type: 'tool_call'\n", - " }\n", - " ]\n", - "}\n", - "{\n", - " type: 'tool',\n", - " content: \"It's 60 degrees and foggy.\",\n", - " toolCalls: undefined\n", - "}\n", - "{\n", - " type: 'ai',\n", - " content: 'Based on the results, the weather in the coolest cities is:\\n' +\n", - " '- New York City: 90 degrees and sunny\\n' +\n", - " '- San Francisco: 60 degrees and foggy\\n' +\n", - " '\\n' +\n", - " 'So the weather in the coolest cities is a mix of warm and cool temperatures.',\n", - " toolCalls: []\n", - 
"}\n" - ] + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As you can see, the AI message generated by the chat model already has `tool_calls` populated, so we can just pass it directly to `ToolNode`." + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{\n", + "  messages: [\n", + "    ToolMessage {\n", + "      \"content\": \"It's 60 degrees and foggy.\",\n", + "      \"name\": \"get_weather\",\n", + "      \"additional_kwargs\": {},\n", + "      \"response_metadata\": {},\n", + "      \"tool_call_id\": \"toolu_01LQSRLQCcNdnyfWyjvvBeRb\"\n", + "    }\n", + "  ]\n", + "}\n" + ] + } + ], + "source": [ + "await toolNode.invoke({ messages: [await modelWithTools.invoke(\"what's the weather in sf?\")] })" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## ReAct Agent" + ] + }, + { + "attachments": {}, + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, let's see how to use `ToolNode` inside a LangGraph graph. Let's set up a graph implementation of the [ReAct agent](https://langchain-ai.github.io/langgraph/concepts/agentic_concepts/#react-agent). This agent takes a query as input, then repeatedly calls tools until it has enough information to resolve the query. 
We'll be using `ToolNode` and the Anthropic model with the tools we just defined." + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": {}, + "outputs": [], + "source": [ + "import { StateGraph, Annotation, messagesStateReducer, END, START } from \"@langchain/langgraph\";\n", + "import { BaseMessage } from \"@langchain/core/messages\";\n", + "\n", + "const MessagesState = Annotation.Root({\n", + "  messages: Annotation<BaseMessage[]>({\n", + "    reducer: messagesStateReducer,\n", + "  })\n", + "})\n", + "\n", + "const toolNodeForGraph = new ToolNode(tools)\n", + "\n", + "const shouldContinue = (state: typeof MessagesState.State): \"tools\" | typeof END => {\n", + "  const { messages } = state;\n", + "  const lastMessage = messages[messages.length - 1];\n", + "  if (\"tool_calls\" in lastMessage && Array.isArray(lastMessage.tool_calls) && lastMessage.tool_calls?.length) {\n", + "    return \"tools\";\n", + "  }\n", + "  return END;\n", + "}\n", + "\n", + "const callModel = async (state: typeof MessagesState.State): Promise<Partial<typeof MessagesState.State>> => {\n", + "  const { messages } = state;\n", + "  const response = await modelWithTools.invoke(messages);\n", + "  return { messages: [response] };\n", + "}\n", + "\n", + "\n", + "const workflow = new StateGraph(MessagesState)\n", + "  // Define the two nodes we will cycle between\n", + "  .addNode(\"agent\", callModel)\n", + "  .addNode(\"tools\", toolNodeForGraph)\n", + "  .addEdge(START, \"agent\")\n", + "  .addConditionalEdges(\n", + "    \"agent\",\n", + "    shouldContinue,\n", + "  )\n", + "  .addEdge(\"tools\", \"agent\");\n", + "\n", + "const app = workflow.compile()" + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": 
"/9j/4AAQSkZJRgABAQAAAQABAAD/4gHYSUNDX1BST0ZJTEUAAQEAAAHIAAAAAAQwAABtbnRyUkdCIFhZWiAH4AABAAEAAAAAAABhY3NwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAQAA9tYAAQAAAADTLQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAlkZXNjAAAA8AAAACRyWFlaAAABFAAAABRnWFlaAAABKAAAABRiWFlaAAABPAAAABR3dHB0AAABUAAAABRyVFJDAAABZAAAAChnVFJDAAABZAAAAChiVFJDAAABZAAAAChjcHJ0AAABjAAAADxtbHVjAAAAAAAAAAEAAAAMZW5VUwAAAAgAAAAcAHMAUgBHAEJYWVogAAAAAAAAb6IAADj1AAADkFhZWiAAAAAAAABimQAAt4UAABjaWFlaIAAAAAAAACSgAAAPhAAAts9YWVogAAAAAAAA9tYAAQAAAADTLXBhcmEAAAAAAAQAAAACZmYAAPKnAAANWQAAE9AAAApbAAAAAAAAAABtbHVjAAAAAAAAAAEAAAAMZW5VUwAAACAAAAAcAEcAbwBvAGcAbABlACAASQBuAGMALgAgADIAMAAxADb/2wBDAAMCAgMCAgMDAwMEAwMEBQgFBQQEBQoHBwYIDAoMDAsKCwsNDhIQDQ4RDgsLEBYQERMUFRUVDA8XGBYUGBIUFRT/2wBDAQMEBAUEBQkFBQkUDQsNFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBQUFBT/wAARCADaAMcDASIAAhEBAxEB/8QAHQABAAMBAAMBAQAAAAAAAAAAAAUGBwgCAwQJAf/EAE8QAAEDBAADAwYIBw0HBQAAAAECAwQABQYRBxIhEzFVCBYiQZTRFBUXMlFhk+EJN0JxdbO0IyQ0NkNSYnN2gaHB0hhUVpGSlbElM0Vyov/EABsBAQACAwEBAAAAAAAAAAAAAAACAwEEBQYH/8QANREAAgECAQgIBQUBAQAAAAAAAAECAxExBBITIUFRUpEFFBVhcaGxwSIyM2LRQnKB4fA0Y//aAAwDAQACEQMRAD8A/VOlKUApSlAK+SbdoNtKBMmx4pX1SH3Uo5vzbNfXWZ5/Cjzs/tSJMdqQkWyQQl1AUAe1a+mjlGEZTlgk2XUaelmoXxLx51WXxiB7Sj3086rL4xA9pR76zvzetfhsP7BHup5vWvw2H9gj3Vye1cn4Jc0dPs77vI0TzqsvjED2lHvp51WXxiB7Sj31nfm9a/DYf2CPdTzetfhsP7BHup2rk/BLmh2d93kaJ51WXxiB7Sj3086rL4xA9pR76zvzetfhsP7BHup5vWvw2H9gj3U7VyfglzQ7O+7yNE86rL4xA9pR76edVl8Yge0o99Z35vWvw2H9gj3U83rX4bD+wR7qdq5PwS5odnfd5GiedVl8Yge0o99eTWS2h91Dbd1hOOLISlCZCCVE9wA3Wc+b1r8Nh/YI91Rl/s1visW91mDGZdTdbfpbbKUqH78Z9YFX0OkKFetCiotZzSxW12IyyDNi5Z2BtdKUrfOQKUpQClKUApSlAKUpQClKUApSlAKznNfxg2v9FyP1rVaNWc5r+MG1/ouR+taqqt9Cp+1m5kn1onjSlK8IenILMs4snD6zi6X+cIENTqI6FBtbq3HVHSUIQgFS1HrpKQT0P0VQMr8pDHsemYQY7U242zJH5DZlsW+WtbCGW3CSGkslal9ogJKNBQHMrWgTUxxztlrueHxhdLbkE4MT2ZEaRjDCnp0B9IUUSEJTs+j1B9FXztFJBNZeZmcO2LhZmGT2O73WRYr5NMtuLbv/AFBcNxiQwxIcit9UrIU2VoSOm+4dQNulThKN5d+3u1GtUnJOy7vU1jJuOeEYbdmbder0q3yXG23SXYb/AGbSXOiC64G+Rrf9Mpr
6cl4w4liWRjH7lcnU3tUduWmBGhSJLqmVqUhKwlptWxtCt6+boE6BG8H41NZRnxzu3ybTm0iPPs7Qxe22pl2PDV2kbbhmKSUjtEulQU08e5ICUqJrQ+HlonO8ZxfH7VOjRXcGtcdMmXFW1yu9u+txklQGnACgqQeo6bFSdKEYKT3b/DuIqpNyzUTnDjjjbeIWX5Tj7cObElWe4uQ2lLhSQ282httSlqcU0lCFcy1AIKuYgBQ2FA1plY9wzfnYjxTz+xXCx3dKb3e1XaFdWoS1wFsqiMpIU+PRQoKZUnlVo7I1vdbDVFVRUvhwsi6m21rFRGTfwOB+lLf+2M1L1EZN/A4H6Ut/7YzW10d/20f3R9UKv05eDNfpSlewPIilKUApSlAKUpQClKUApSlAKUpQCs5zX8YNr/Rcj9a1WjVXMlwaDk8+NNfkzYsmO0plK4b/AGe0qIJB6HfVIrEoqpCUG7XTRfQqKlUU2ZzlfD3GM6VGOR4/bL6YvMGDcIqHuy5tc3LzA63yp3r6BUB/s/cMt78wMb/7Wz/prUvkqg+MXv237qfJVB8Yvftv3VxV0XNKyrep1nltB63EpWLcOMVwd997HcctdjdkJCHV2+IhkuJB2AopA2BVjqS+SqD4xe/bfup8lUHxi9+2/dUX0S5O7qrkySy+ktSTI2lZpxkizcJ4ncI7HbL3dEQMlu78O4B2RzKU2hnnTynXonfrrXfkqg+MXv237qx2P/6rkzPaFLcyvXyxW7JrVItl2gx7nbpAAdiy2g404AQRzJPQ9QD/AHVUEcAeGjZ2nAccSdEbFsZHQjRHzforUPkqg+MXv237qfJVB8Yvftv3VNdFSjqVZcmReXUXjEzi18E+H9juMa4W/CrDBnRlh1mTHtzSHG1juUlQTsEfTU9k38DgfpS3/tjNWn5KoPjF79t+6v6nhRbO3juO3G7SUsPtyEtPS+ZBW2sLTsa6jmSD/dWxk/R7pV6dadW+a08HsdyEstpOLjFWuXWlKV0ziClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQHO/lI/jx8nn+0Mv9mNdEVzv5SP48fJ5/tDL/AGY10RQClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQHO/lI/jx8nn+0Mv8AZjXRFc7+Uj+PHyef7Qy/2Y10RQClKUApSlAKUpQClKUApSlAKUpQClKUApSlAKUpQClKUApSoTJMvgYyG0P9rJmvAlmDFTzvOgd5A2AlPcOZRCRsbPUVKMXJ2iZScnZE3Xy3S2Rb1bZdvnx25cGWyuO+w6NodbUkpUlQ9YIJB/PVDczzJZJKmLPboTfXlEmWt1z6thKAAfqCj+evX555d/u1k/6nqt0W+S5m11Ws/wBJ+OnlF8G5fArjBfsRfClxWHu2t76v5eKv0mlb9Z16Kv6SVD1V+r3kU8F3+B/AOz2qehbV6ujirxcWXO9p51CAG9eopbQ2kj+clX01UOLnBg8Zs/wrLL5EtIn4w/2qW2u05ZiAoLQ07sbKErHMB/SWPyumueeeXf7tZP8AqepolxLmOqVtxpVKzdGa5Ykgrh2Zwb+al11HT8/Kf/FS1n4jsvyGot5hKskh1QQ26p0OxnFE6CQ7oaJOgAtKdkgDZ6VjRN/K0/B+2JCWT1YK7iXKlKVSa4pSlAKUpQClKUApSlAKUpQClKUApSlAKUpQEVlF+RjNhl3FbfbKaSEtsg6LrqlBLaAfUVLUlP8AfWew4zqFOyZbpk3CQQuQ+T3n1JT9CE7ISn1D6ySZ7istXwbHGv5J27oDn0aSy8tP/wC0IqKq2fwU4pbdfsl5M7OQwWa57RSudOPPFHJ8ZvGSSMQv1zlOY3ARMnWqJZYz0GOeQualSHVJX6aBvlaPMkddHYr08TuL2QfHOTohZfFwNiy41HvUCPIjMPLuzrqXVFO3QSUJLaG9NgK5l9/cK1bG660VdHSNK5me4ncQslvbGO2hrIY
r9msdtk3J61wLdIlOy5LJWQ8JTjSUpHLrTaNlXP1SAAZm0ZRxLyrLsRxq63M4RcZeOS7hc2o0OM+6HmZbbSFo5u0QgrSsKI2sAKI79KCxlVU8Ezf1uJb5eZQTzHlGzrZ+ivF9huSy4y82l1pxJQttYBSpJGiCD3g1yreLtknEqzcHJUzI37ZeGcunWp+XAisacdYRMaTICHELAVytH0fm/uqunROuqIbLkeIw06+uU6hCUrfcSlKnCBoqISAAT39AB16Cs4ayUJ599RO4BfHkSpNgmOrfcjtiREfeXzLcYKtFKiepLatDZ6lKkbJPMau1ZXblqa4gY2pHznBKaXrv7Mtcx/u5kI/wrVK2qmtRnvXu17XODlUFCq0hSlKpNQUpSgFKUoBSlKAUpSgFKUoBSlKAUpSgK7ntjfv2NvtRE80+OtEuMknl5nG1BQRv1BYBQT9CzVNhTG7hFbkNE8ix3KGlJI6FJHqIIII9RBFapWP53fbHbOKNpxa1TixmV7juTfiv4K4uM80gHbzziEkMElJSHDvZ0ClXokWq045jdrYfg38lyhUm4ywZRcu4BWPMLtf5b92vkCLkDSGrtbbfMDUaaUt9mlaxyFYPIEpPKpIUEgKB67zzirwnyQ5JZn8etuR3ddstLEKJdI92tiOR1vm0txqSwSgn0SpbOubp6I5RXQjhvcMlEvGLilQ36cUtPtq+sFK+b/mkH6q9fxhP/wCHL17J99Y6vV2LzR03KjJapGbp4KyciYsl9vWQ3SxZ2m1swbtdMZkIYTNKRtSVpW2pJAUVEKCUkb6aGgLZb+G9vt+V2jIRMuEi4WyzrsjZkvh0OMqW2srcURzKc20n0ubrs7BJ3Xvu2dQ7DcLZAucSXbp1zcLMCLLDbTstY1tLSVLBWRsdE7PUVK/GE/8A4cvXsn306vV3E1Oitq5lEk8BLC/iUWxNXG7QzDvD18h3KM+hEuNKcdccUUK5OXl/dnE8qkn0T12etaBaoKrZbIkNUp+cqOyhoyZSgp14pAHOsgAFR1skAdT3V4ImXJ0hKMbvKlE60phKP8VLAqTtuH3u/KHxmn4it5+eyy8Fy3B/NK07S2PUSkqV1Oik6VTQSXz2S8fbEi61GmrpnswW3qu2TSLwRuFBaXCjq3tLjqlDtlD/AOnIlG/pLg6aO9Eqk8OOJWLZrJv9jx3to7+MSvi2bAehORTHUNhHKFJAKFBJKSn1a7t1dqTkpNJYLD/eZwatR1ZubFKUqsqFKUoBSlKAUpSgFKUoBSlKAUpSgFfwkDvOvz1HT8hgQLgzbFTIxvEllx+LblPoQ/ISjXMUJJ2QNjZ7hsbrNIWEz+OuKY3cOJmPycVuFtupujFit95WpBCFEx/hJb5QpSfRXoHopAOwCpFAfdcciu3FRzPcMtEfI8GdtyG4jOXLiIShx5Q5l/BkrO1gJ5RzgD550UkJJvWM4+nGrDbLaZsu6uwYrcX4wuKw5KfCQBzOLAHMo62TrqetStKAUpUbklkTkuO3W0LlyoCLhFdiGXCWEPshaCnnbUQQFp3sEggEDoaA/ILyy/KMmcT/ACiF3ewXBTVrxR8RLJIjr/LaXzKkJPdtTg2FfzUo+iv1L8n/AIvQ+OXCWwZfE5W3pjPJMjp/kJKPRdR9OuYEjfekpPrriHi3+D/4e4FxI4V4/b7zkz0PKro/CmuSZUdTjaEM84LRSwADvv5goa9VdreT/wCT/j3k4YbMxrGplznQJU9dxW5dXW3HQ4pttsgFttA5dNJ9W9k9e7QGmUpSgKvxFwCJxIw+6Y/Jn3CzonpRzT7PIMaU0pCgpCkuD1gpHfsEdKhY96ynFM0xXEW8bnZBi7lt7OTmD89tTrMltJ/99s+krnCUnnH5S+6tCpQEXjmUWfMLYLjY7pDu8ArU18JhPJdb50nSk7SSNg9CKlKzDKOFdxx3DrhE4Pu2TAb3LuKbk8tdtS5GlL6BaFpTrk5wlIKkgkAHQBOxLw+LFrVxTc4dyY9
xayBu2puSJSoDiIcpvYDhac6j0CUbBOgVgAkg6AvFKUoBSlKAUpSgFKUoBSlKAVnWY57OvreX4rw6uFqd4i2VqMXI14Q6iPFD/pIcUQn0/wBz5lDl2NgA67q0Ws0uU5GN8d7NGhYIqQvJbe/8YZfGbJ+D/BgC2w8Qg6Srm9EqWOvQA+oCdtPDazoyO35fdrXbZudtW1u3v3xmNyKIAJX2YJVyJKlL9ZOiEkkCrdSlAKUpQClKw7jX5Q0jGsgZ4fcPLajLuKE9vmbgJV+9rW2dfviYsfMSNghOwVbHdzJ2BX/KPuURzyhvJ6tiJLS7im9S5KoiVguhr4OR2hT3hOwRvu6H6DXSNY7wL8nmPwxkzcoyO5Ly/iVeBzXTJJY2ob/kI6f5JlOgAABvQ3oBKU7FQClKUApSlAK9E2G3cIj8Z3nDbzamlFpam1hKho8qkkFJ+sEEeqvfSgMgj4vkPAPBccsOAWeZncBq59lKbvV51KjRHFHRaWtPKUtcydJ6aQg95JUNStV7t19aedts+LcGmXVMOLivJdShxPRSFFJOlD1g9RX21lPk6ysJl41kqsFhzYUBORz0TkTiSpc4LHbqTtSvQJ1ru/MKA1alKUApSlAKUpQClKUAr8+/KJ/CQXPF81h47jGK3exSrHdWlXtu9LjIclIbWsPQwlAeSlCwGyH0Ob79JI0T+gD8hqK2XHnUNIHepxQSP+Zri7y8vJnsfGSyO5ticy3jOLaz++IzUhG7pHSPmaB6upA9E96h6J36OpKMpYIFz8hzym808pa35fOyq1We3RbU7FZhOWlh1sOrWHS6F9o6vfKEta1r5x7/AFdRVyl+Dyx6Dw48nC3qucli23O9TZFzfjS3EtuoBIab2lWiAUNJWPqXv110z51WXxiB7Sj31LRz4WZsyUpXzQ7nDuG/gstiTrqexcC//BrmfJuIuVeVFkU/C+F8yRj2BQnVRb/nqElLj6h0XFt++9XqLvq3sdOXng01qZgluJfHTIc/zGXww4MdjLyBj0L5lrqe0gWBB2CAe52R0OkDYBHXelcui8FOBePcD7A/EtfbXG8T1/CLrfp6u0mXF87JcdWeutk6TvQ2e8kkzfDLhfjfCDEYmN4rbW7bbI/UhPVx5Z+c44vvWs66k/UBoAAWusAUpSgFKUoBSvkm3aDbSgTJseKV9Uh91KOb82zXzedVl8Yge0o99TUJNXSM2ZKUqL86rL4xA9pR76edVl8Yge0o99Z0c+FizMY8rPyn5vkv2XH7s3hispt9zkOxXnxcfgiYriUpU2k/uTnMVjtCO7XZnv3XO/Cf8Jffs3yy24rC4UQpd2vFx7GMIV3VHQhK1DRcBYXspGypewNAnQ1XV/HnD8Y438KMhw+Zd7ahc6OTEkLko/e8lPpNOdDvQUBvXekqHrrjn8G5wOZxbJ8izzLixbbhbXXLPbI0x1CFJc7pDwBPqGmwobB5nB6qaOfCxZn6NUqL86rL4xA9pR76edVl8Yge0o99NHPhYsyUpUX51WXxiB7Sj30TlFmUoAXeCSegAko6/wCNNHPhYsyUpSlVmBVQy7Ln4ksWm0hBuBSFvyXBzNxEHu6flOK/JT3AAqV05Urtch9EWO684dNtpK1H6gNmshxpbku1N3F/Rl3I/DX1DfVSwCB19SU8qR9SRVsbRi6j2YeJu5LRVWfxYI/i8agy3u3uLZvEsjRk3HTyz130BHKkfUkAfVXu837WP/jYf2CPdVO4wcXYnCOJj78qHImC63Vi3nsGHnS0hSvTc02hZUoDuR0Kj3b0RX0ZFxsw3FI1sdul0djKuUb4ZHjCBJXJ7HptxbKWy42kb6laU6OwdEGq3WqSxkzuJwjq1KxafN+1+Gw/sE+6nm/a/DYf2CfdVdv/ABgw/G7PaLnMvbS4l4Tz24wmnJTktPLzFTbbSVLUACCSBobG9VDSuLzFyyjhszjkiFdLBlLs5DkwBRUAxHW4OTqOVXOjlUFAkaI0DUdJPiZlyii
7PYrZ3lBZtsZDqSFJdabDbiSO4hSdEf3GpbGr+7hfZw5jhkWNbhAkKSO1irWvZU4ofPbKlElZ9JJJUoqBKkU2wcXMTyjJ5WP2q6mbc4y3W3EojPBrmbOnEpeKOzUUnoQlRIq3PMokMradQlxtaSlSFDYUD0INWRrSwm7r/YbiqpShWjY1ClVPhjcHJmKNx33C6/b3nYKlkklSW1ENkk9SS3yEk+vff31bKTjmScdx5yUXFuLFKUqBEUpSgMzz+FHnZ/akSY7UhItkghLqAoA9q19NfH5vWvw2H9gj3VJZr+MG1/ouR+tarxrn5fUnGcUm1qXqzxfS0pLKWk9iI/zetfhsP7BHup5vWvw2H9gj3VIVGZLk1rw+ySrvepzVutsYAuyHjoDZAAHrJJIAA2SSAASa5ulqP9T5nHU5t2TZ5+b1r8Nh/YI91PN61+Gw/sEe6qjD474LNsV2vCb8lmFaezM/4VFeYdjJcUEoUtpxCXAlRPRXLroevQ1IYrxYxXNJc+Larr2kmCymS+1JjuxlBlW+V1IdSnnbOj6adp+us59ZbX5ljVZJtp6vEnvN61+Gw/sEe6nm9a/DYf2CPdWVRvKOsmUcTsLxvFJ0e6wruuaJj64j6PQZYUtCmHFBKFpK06Kk8419HfWy0lOrHGT5mJqrTtn3VyP83rX4bD+wR7qhc0sluYxa4uNQIrbiW9pWhlIIOx3HVWqoLOf4pXP+q/zFbOSVajyims5/MtveWZPOWmhr2r1NlpSldg+inzXKILhbpUUnQfaU3v6Ngj/OslxVxS8btoWlSHW2EsuIUNFK0DlWD+ZSSK2Os6yqwu45cZN1iMKetUtZdmNtDa4zpABdCfW2rXpa6pV6WiFKKLorPg6axxX4/wBusdDI6qpzaltMm8oK23GTjmOXK322Xd/iTI7fdZMSA2XZC2GnP3QtoHVagFb5R1OjVWVkcvFeK9yzl7E8mulnyGxxY0X4HaXHZcR1h17mYdY1ztBfaJUCoBOwdkVusaSzMYQ/HdQ+y4OZDjagpKh9II6GvZWq9WpnYcLvOTOWuHWJZDwZl4JkV9xy53GKmx3C3yIVmjGa9anX53wttPZo2op5D2RUgHRQN6FeeM4pkmP3vC8vmY1c24UjMLxc3bZHY7STb485lbbKnW0np6Wlr1vl5zvuNdRUrFyCopWs8P6/BgOAfGti4xfFuK2fJrbh0uRPfvUG+wC3CivbKkPwnj1IdcJJbSpSdKJ0kjVb9SvlhxnswkLt9scIjBXJMuKN8jKd6UhtQ6F0jYAHzPnK/JSuyEHUfdte4k3GjFuT1Fk4URyMdlzNEJn3CRIRsaJQFdmk/mIbBH1EVdK9EKGxbobESM0liMw2lpppA0lCEjQA+oACvfVtSWfNyR5ucs+TlvFKUqsgKUpQGc5r+MG1/ouR+tarxryzX8YNr/Rcj9a1Vcyvh7jGdKinI8ftt9MXmDBuEVD3Zc2ubl5gdb5U719ArmdIW0kb7l7niulbda17kWGsj8pfErrlWGWR61xJ1y+Jr7EusuBa5CmJcmO3zhxLK0qSQ4OcLTpQJKBo71U5/s+8Mt/xAxv/ALWz/pqdxXhviuDPvvY7jlrsbshIQ6u3xEMlxIOwFFIGwK5yai7o5kJRpyU4t3Xd/Zz3mWE23JeFmd3PHMZzpd9ehxbeheTKnPyZLQkodLbLT61r0ggknlA6nW+tWjjXgN/zTiDkEazxZCPjDh7cLa1N5FJYMhUlooZU5rlClDm6E70VHu3W/wBKlpWixZTJNNbL46934OdLJe5uYcRODoj4RkePR7C1OanfGFqcYjxCYRbSgOa5VJ5hpKh6J6ddnVdF181xt0W8W+TBnR2pkKS2pl+O+gLQ6hQ0pKknoQQSCDVJHk/8MwQRgGOAjuItjP8ApqLkpY6iE5wqWvqt/O1vf3l/qCzn+KVz/qv8xUBG4DcN4chp9jBMdZfaUFtuItjIUlQOwQeXoQan85/ilc/6r/MVsZJ
brNO3EvUzQUdNDNe1evibLSlK7h9GFKUoCr3PhvYbnJckiM7BkuHa3bfIcjlZ3slQQQFHfrIJr4PkogeL3r237qu9KvVeov1FiqzjqUmUj5KIHi969t+6nyUQPF717b91XelZ09Tf6EtNU4mU5nhVYwoGUu4XJIIPZy5zimzr6UAhJ/MQRVriRGIEZuPGZbjx2khKGmkBKEAdwAHQCvdSq5VJz1SZXKUpfM7ilKVWRFKUoBSlKArmS4NByefGmvyZsWTHaUylcN/s9pUQSD0O+qRUZ8lUHxi9+2/dV2pVmklZL2RXKnCTvKKf8FJ+SqD4xe/bfup8lUHxi9+2/dV2pTSPu5IjoaXAuSKT8lUHxi9+2/dT5KoPjF79t+6rtSmkfdyQ0NLgXJFJ+SqD4xe/bfup8lUHxi9+2/dV2pTSPu5IaGlwLkik/JVB8Yvftv3V65HCG1y2lNSLneH2VfObXM2lQ+g9KvVKyqsk7r0RlUaSd1FckKUpVRaf/9k=" + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "import * as tslab from \"tslab\";\n", + "\n", + "const drawableGraph = app.getGraph();\n", + "const image = await drawableGraph.drawMermaidPng();\n", + "const arrayBuffer = await image.arrayBuffer();\n", + "\n", + "await tslab.display.png(new Uint8Array(arrayBuffer));" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's try it out!" + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{\n", + " type: 'human',\n", + " content: \"what's the weather in sf?\",\n", + " toolCalls: undefined\n", + "}\n", + "{\n", + " type: 'ai',\n", + " content: [\n", + " { type: 'text', text: \"Okay, let's check the weather in SF:\" },\n", + " {\n", + " type: 'tool_use',\n", + " id: 'toolu_01Adr6WYEuUuzShyDzwYZf5a',\n", + " name: 'get_weather',\n", + " input: { location: 'sf' }\n", + " }\n", + " ],\n", + " toolCalls: [\n", + " {\n", + " name: 'get_weather',\n", + " args: { location: 'sf' },\n", + " id: 'toolu_01Adr6WYEuUuzShyDzwYZf5a',\n", + " type: 'tool_call'\n", + " }\n", + " ]\n", + "}\n", + "{\n", + " type: 'tool',\n", + " content: \"It's 60 degrees and foggy.\",\n", + " toolCalls: undefined\n", + "}\n", + "{\n", + " type: 'ai',\n", + " content: 'The current weather in San Francisco is 60 degrees and foggy.',\n", + " toolCalls: []\n", + "}\n" + ] + } + ], + "source": [ 
+ "import { HumanMessage } from \"@langchain/core/messages\";\n", + "\n", + "// example with a single tool call\n", + "const stream = await app.stream(\n", + " {\n", + " messages: [new HumanMessage(\"what's the weather in sf?\")],\n", + " },\n", + " {\n", + " streamMode: \"values\"\n", + " }\n", + ")\n", + "for await (const chunk of stream) {\n", + " const lastMessage = chunk.messages[chunk.messages.length - 1];\n", + " const type = lastMessage._getType();\n", + " const content = lastMessage.content;\n", + " const toolCalls = lastMessage.tool_calls;\n", + " console.dir({\n", + " type,\n", + " content,\n", + " toolCalls\n", + " }, { depth: null });\n", + "}" + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "{\n", + " type: 'human',\n", + " content: \"what's the weather in the coolest cities?\",\n", + " toolCalls: undefined\n", + "}\n", + "{\n", + " type: 'ai',\n", + " content: [\n", + " {\n", + " type: 'text',\n", + " text: \"Okay, let's find out the weather in the coolest cities:\"\n", + " },\n", + " {\n", + " type: 'tool_use',\n", + " id: 'toolu_01Qh1jhQHH14ykNEx5oLXApL',\n", + " name: 'get_coolest_cities',\n", + " input: { noOp: 'dummy' }\n", + " }\n", + " ],\n", + " toolCalls: [\n", + " {\n", + " name: 'get_coolest_cities',\n", + " args: { noOp: 'dummy' },\n", + " id: 'toolu_01Qh1jhQHH14ykNEx5oLXApL',\n", + " type: 'tool_call'\n", + " }\n", + " ]\n", + "}\n", + "{ type: 'tool', content: 'nyc, sf', toolCalls: undefined }\n", + "{\n", + " type: 'ai',\n", + " content: [\n", + " {\n", + " type: 'text',\n", + " text: \"Now let's get the weather for those cities:\"\n", + " },\n", + " {\n", + " type: 'tool_use',\n", + " id: 'toolu_01TWgf1ezxk1hAzwYtqrE3cA',\n", + " name: 'get_weather',\n", + " input: { location: 'nyc' }\n", + " }\n", + " ],\n", + " toolCalls: [\n", + " {\n", + " name: 'get_weather',\n", + " args: { location: 'nyc' },\n", + " id: 
'toolu_01TWgf1ezxk1hAzwYtqrE3cA',\n", + " type: 'tool_call'\n", + " }\n", + " ]\n", + "}\n", + "{\n", + " type: 'tool',\n", + " content: \"It's 90 degrees and sunny.\",\n", + " toolCalls: undefined\n", + "}\n", + "{\n", + " type: 'ai',\n", + " content: [\n", + " {\n", + " type: 'tool_use',\n", + " id: 'toolu_01NyRcucFHEZmyA6hE6BtTPs',\n", + " name: 'get_weather',\n", + " input: { location: 'sf' }\n", + " }\n", + " ],\n", + " toolCalls: [\n", + " {\n", + " name: 'get_weather',\n", + " args: { location: 'sf' },\n", + " id: 'toolu_01NyRcucFHEZmyA6hE6BtTPs',\n", + " type: 'tool_call'\n", + " }\n", + " ]\n", + "}\n", + "{\n", + " type: 'tool',\n", + " content: \"It's 60 degrees and foggy.\",\n", + " toolCalls: undefined\n", + "}\n", + "{\n", + " type: 'ai',\n", + " content: 'Based on the results, the weather in the coolest cities is:\\n' +\n", + " '- New York City: 90 degrees and sunny\\n' +\n", + " '- San Francisco: 60 degrees and foggy\\n' +\n", + " '\\n' +\n", + " 'So the weather in the coolest cities is a mix of warm and cool temperatures.',\n", + " toolCalls: []\n", + "}\n" + ] + } + ], + "source": [ + "// example with multiple tool calls in succession\n", + "const streamWithMultiToolCalls = await app.stream(\n", + " {\n", + " messages: [new HumanMessage(\"what's the weather in the coolest cities?\")],\n", + " },\n", + " {\n", + " streamMode: \"values\"\n", + " }\n", + ")\n", + "for await (const chunk of streamWithMultiToolCalls) {\n", + " const lastMessage = chunk.messages[chunk.messages.length - 1];\n", + " const type = lastMessage._getType();\n", + " const content = lastMessage.content;\n", + " const toolCalls = lastMessage.tool_calls;\n", + " console.dir({\n", + " type,\n", + " content,\n", + " toolCalls\n", + " }, { depth: null });\n", + "}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`ToolNode` can also handle errors during tool execution.
See our guide on handling errors in `ToolNode` [here](https://langchain-ai.github.io/langgraphjs/how-tos/tool-calling-errors/)" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "TypeScript", + "language": "typescript", + "name": "tslab" + }, + "language_info": { + "codemirror_mode": { + "mode": "typescript", + "name": "javascript", + "typescript": true + }, + "file_extension": ".ts", + "mimetype": "text/typescript", + "name": "typescript", + "version": "3.7.2" } - ], - "source": [ - "// example with a multiple tool calls in succession\n", - "const streamWithMultiToolCalls = await app.stream(\n", - " {\n", - " messages: [new HumanMessage(\"what's the weather in the coolest cities?\")],\n", - " },\n", - " {\n", - " streamMode: \"values\"\n", - " }\n", - ")\n", - "for await (const chunk of streamWithMultiToolCalls) {\n", - " const lastMessage = chunk.messages[chunk.messages.length - 1];\n", - " const type = lastMessage._getType();\n", - " const content = lastMessage.content;\n", - " const toolCalls = lastMessage.tool_calls;\n", - " console.dir({\n", - " type,\n", - " content,\n", - " toolCalls\n", - " }, { depth: null });\n", - "}" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "`ToolNode` can also handle errors during tool execution. 
See our guide on handling errors in `ToolNode` [here](https://langchain-ai.github.io/langgraphjs/how-tos/tool-calling-errors/)" - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "TypeScript", - "language": "typescript", - "name": "tslab" }, - "language_info": { - "codemirror_mode": { - "mode": "typescript", - "name": "javascript", - "typescript": true - }, - "file_extension": ".ts", - "mimetype": "text/typescript", - "name": "typescript", - "version": "3.7.2" - } - }, - "nbformat": 4, - "nbformat_minor": 4 -} + "nbformat": 4, + "nbformat_minor": 4 +} \ No newline at end of file diff --git a/examples/quickstart.ipynb b/examples/quickstart.ipynb index 9ef2a834b..c920ede47 100644 --- a/examples/quickstart.ipynb +++ b/examples/quickstart.ipynb @@ -116,7 +116,7 @@ "## How does it work?\n", "\n", "The\n", - "[createReactAgent](https://langchain-ai.github.io/langgraphjs/reference/functions/prebuilt.createReactAgent.html)\n", + "[createReactAgent](https://langchain-ai.github.io/langgraphjs/reference/functions/langgraph_prebuilt.createReactAgent.html)\n", "constructor lets you create a simple tool-using LangGraph agent in a single line\n", "of code. Here's a visual representation of the graph:" ] diff --git a/libs/langgraph/README.md b/libs/langgraph/README.md index 62053a9da..5b9352378 100644 --- a/libs/langgraph/README.md +++ b/libs/langgraph/README.md @@ -243,7 +243,7 @@ Is there anything else you'd like to know about the weather in New York or any o - [Tutorials](https://langchain-ai.github.io/langgraphjs/tutorials/): Learn to build with LangGraph through guided examples. - [How-to Guides](https://langchain-ai.github.io/langgraphjs/how-tos/): Accomplish specific things within LangGraph, from streaming, to adding memory & persistence, to common design patterns (branching, subgraphs, etc.). These are the place to go if you want to copy and run a specific code snippet. 
- [Conceptual Guides](https://langchain-ai.github.io/langgraphjs/concepts/): In-depth explanations of the key concepts and principles behind LangGraph, such as nodes, edges, state and more. -- [API Reference](https://langchain-ai.github.io/langgraphjs/reference/graphs/): Review important classes and methods, simple examples of how to use the graph and checkpointing APIs, higher-level prebuilt components and more. +- [API Reference](https://langchain-ai.github.io/langgraphjs/reference/): Review important classes and methods, simple examples of how to use the graph and checkpointing APIs, higher-level prebuilt components and more. ## Running Example Jupyter Notebooks