From ce69236b55e22ad36c46582faac1e65ff3a0f233 Mon Sep 17 00:00:00 2001
From: Brace Sproul
Date: Thu, 22 Aug 2024 18:02:46 -0700
Subject: [PATCH] docs[minor]: Add doc on defining state (#350)

---
 examples/how-tos/define-state.ipynb | 216 ++++++++++++++++++++++++++++
 examples/quickstart.ipynb           |  38 ++---
 yarn.lock                           |   6 +-
 3 files changed, 233 insertions(+), 27 deletions(-)
 create mode 100644 examples/how-tos/define-state.ipynb

diff --git a/examples/how-tos/define-state.ipynb b/examples/how-tos/define-state.ipynb
new file mode 100644
index 000000000..ef29e4c45
--- /dev/null
+++ b/examples/how-tos/define-state.ipynb
@@ -0,0 +1,216 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "# How to define graph state\n",
+    "\n",
+    "This how-to guide covers how to define the state of your graph. The implementation has changed: the new recommended way to define your graph's state is with the [`Annotation`](https://langchain-ai.github.io/langgraphjs/reference/functions/langgraph.Annotation-1.html) function.\n",
+    "\n",
+    "## Prerequisites\n",
+    "\n",
+    "- [State conceptual guide](https://langchain-ai.github.io/langgraphjs/concepts/low_level/#state) - Conceptual guide on defining the state of your graph.\n",
+    "- [Building graphs](https://langchain-ai.github.io/langgraphjs/tutorials/quickstart/) - This how-to assumes you have a basic understanding of how to build graphs.\n",
+    "\n",
+    "## Setup\n",
+    "\n",
+    "This guide requires installing the `@langchain/langgraph` and `@langchain/core` packages:\n",
+    "\n",
+    "```bash\n",
+    "npm install @langchain/langgraph @langchain/core\n",
+    "```\n",
+    "\n",
+    "## Getting started\n",
+    "\n",
+    "The `Annotation` function is the recommended way to define your graph state for new `StateGraph` graphs. 
The `Annotation.Root` function is used to create the top-level state object, where each field represents a channel in the graph.\n",
+    "\n",
+    "Here's an example of how to define a simple graph state with one channel called `messages`:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 1,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "import { BaseMessage } from \"@langchain/core/messages\";\n",
+    "import { Annotation } from \"@langchain/langgraph\";\n",
+    "\n",
+    "const GraphAnnotation = Annotation.Root({\n",
+    "  // Define a 'messages' channel to store an array of BaseMessage objects\n",
+    "  messages: Annotation<BaseMessage[]>({\n",
+    "    // Reducer function: combines the current state with new messages\n",
+    "    reducer: (currentState, updateValue) => currentState.concat(updateValue),\n",
+    "    // Default function: initializes the channel with an empty array\n",
+    "    default: () => [],\n",
+    "  })\n",
+    "});"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Each channel can optionally have `reducer` and `default` functions:\n",
+    "- The `reducer` function defines how new values are combined with the existing state.\n",
+    "- The `default` function provides an initial value for the channel.\n",
+    "\n",
+    "For more information on reducers, see the [reducers conceptual guide](https://langchain-ai.github.io/langgraphjs/concepts/low_level/#reducers)."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 2,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "const QuestionAnswerAnnotation = Annotation.Root({\n",
+    "  question: Annotation<string>,\n",
+    "  answer: Annotation<string>,\n",
+    "});"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Above, all we're doing is defining the channels, and then passing the un-instantiated `Annotation` function as the value. It is important to note that we always pass the TypeScript type of each channel as the first generic argument to `Annotation`. 
Doing this ensures our graph state is type-safe, and that we get the proper types when defining our nodes. The example below shows how you can extract the typings from the annotation:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 3,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "type QuestionAnswerAnnotationType = typeof QuestionAnswerAnnotation.State;"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "This is equivalent to the following type:\n",
+    "\n",
+    "```typescript\n",
+    "type QuestionAnswerAnnotationType = {\n",
+    "  question: string;\n",
+    "  answer: string;\n",
+    "}\n",
+    "```"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Merging states\n",
+    "\n",
+    "If you have two graph state annotations, you can merge the two into a single annotation by using the `spec` value:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 4,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "const MergedAnnotation = Annotation.Root({\n",
+    "  ...QuestionAnswerAnnotation.spec,\n",
+    "  ...GraphAnnotation.spec,\n",
+    "});"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "The type of the merged annotation is the intersection of the two annotations:\n",
+    "\n",
+    "```typescript\n",
+    "type MergedAnnotation = {\n",
+    "  messages: BaseMessage[];\n",
+    "  question: string;\n",
+    "  answer: string;\n",
+    "}\n",
+    "```\n",
+    "\n",
+    "Finally, instantiating your graph using the annotations is as simple as passing the annotation to the `StateGraph` constructor:\n"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 5,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "import { StateGraph } from \"@langchain/langgraph\";\n",
+    "\n",
+    "const workflow = new StateGraph(MergedAnnotation);"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## State channels\n",
+    "\n",
+    "The `Annotation` function is a convenience wrapper around the low-level implementation of how states 
are defined in LangGraph. Defining state using the `channels` object (which is what `Annotation` is a wrapper around) is still possible, although not recommended for most cases. The example below shows how to implement a graph using this pattern:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 6,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "import { StateGraph } from \"@langchain/langgraph\";\n",
+    "\n",
+    "interface WorkflowChannelsState {\n",
+    "  messages: BaseMessage[];\n",
+    "  question: string;\n",
+    "  answer: string;\n",
+    "}\n",
+    "\n",
+    "const workflowWithChannels = new StateGraph<WorkflowChannelsState>({\n",
+    "  channels: {\n",
+    "    messages: {\n",
+    "      reducer: (currentState, updateValue) => currentState.concat(updateValue),\n",
+    "      default: () => [],\n",
+    "    },\n",
+    "    question: null,\n",
+    "    answer: null,\n",
+    "  }\n",
+    "});"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Above, we set the values of `question` and `answer` to `null`, as those channels do not have default values. To set a default value, the channel should be implemented the way the `messages` key is, with the `default` factory returning the default value. The `reducer` function is optional, and can be added to the channel object if needed."
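+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "For example, to give `question` a default value in this `channels` form, you could write something like the following. This is a minimal sketch: the `workflowWithDefaults` name, the empty-string default, and the last-value reducer are illustrative assumptions, not part of the graph defined above:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 7,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "const workflowWithDefaults = new StateGraph<WorkflowChannelsState>({\n",
+    "  channels: {\n",
+    "    messages: {\n",
+    "      reducer: (currentState, updateValue) => currentState.concat(updateValue),\n",
+    "      default: () => [],\n",
+    "    },\n",
+    "    // Illustrative: the 'default' factory gives 'question' an initial empty string,\n",
+    "    // and the optional reducer simply keeps the most recent update (last value wins)\n",
+    "    question: {\n",
+    "      reducer: (_currentState, updateValue) => updateValue,\n",
+    "      default: () => \"\",\n",
+    "    },\n",
+    "    answer: null,\n",
+    "  }\n",
+    "});"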
+   ]
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "TypeScript",
+   "language": "typescript",
+   "name": "tslab"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "mode": "typescript",
+    "name": "javascript",
+    "typescript": true
+   },
+   "file_extension": ".ts",
+   "mimetype": "text/typescript",
+   "name": "typescript",
+   "version": "3.7.2"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}
diff --git a/examples/quickstart.ipynb b/examples/quickstart.ipynb
index 4ddf7806f..9ef2a834b 100644
--- a/examples/quickstart.ipynb
+++ b/examples/quickstart.ipynb
@@ -156,15 +156,15 @@
 },
 {
  "cell_type": "code",
-  "execution_count": 3,
+  "execution_count": 1,
  "metadata": {},
  "outputs": [
   {
    "name": "stdout",
    "output_type": "stream",
    "text": [
-     "The current weather in San Francisco is sunny with a temperature of 82.0°F (27.8°C). The wind is coming from the northwest at 19.1 km/h. The humidity is at 41%, and there is no precipitation at the moment.\n",
-     "The current weather in New York is sunny with a temperature of 84.0°F (28.9°C). The wind is coming from the south-southeast at 3.6 km/h. The humidity is at 57%, and there is a slight precipitation of 0.01 inches.\n"
+     "The current weather in San Francisco is partly cloudy with a temperature of 72.0°F (22.2°C). The wind speed is 31.0 km/h coming from the west. The humidity is at 64%, and the visibility is 16.0 km.\n",
+     "The current weather in New York is partly cloudy with a temperature of 73.9°F (23.3°C). The wind speed is 3.6 km/h coming from the west-northwest. 
The humidity is at 38%, and the visibility is 16.0 km.\n"
    ]
   }
  ],
@@ -172,32 +172,27 @@
   "import { TavilySearchResults } from \"@langchain/community/tools/tavily_search\";\n",
   "import { ChatOpenAI } from \"@langchain/openai\";\n",
   "import { MemorySaver } from \"@langchain/langgraph\";\n",
-  "import { HumanMessage } from \"@langchain/core/messages\";\n",
+  "import { HumanMessage, BaseMessage } from \"@langchain/core/messages\";\n",
   "import { ToolNode } from \"@langchain/langgraph/prebuilt\";\n",
-  "import { END, START, StateGraph, StateGraphArgs } from \"@langchain/langgraph\";\n",
-  "\n",
-  "// Define the state interface\n",
-  "interface AgentState {\n",
-  "  messages: HumanMessage[];\n",
-  "}\n",
+  "import { END, START, StateGraph, Annotation } from \"@langchain/langgraph\";\n",
   "\n",
   "// Define the graph state\n",
-  "const graphState: StateGraphArgs[\"channels\"] = {\n",
-  "  messages: {\n",
-  "    value: (x: HumanMessage[], y: HumanMessage[]) => x.concat(y),\n",
+  "const GraphAnnotation = Annotation.Root({\n",
+  "  messages: Annotation<BaseMessage[]>({\n",
+  "    reducer: (state, update) => state.concat(update),\n",
   "    default: () => [],\n",
-  "  },\n",
-  "};\n",
+  "  })\n",
+  "})\n",
   "\n",
   "// Define the tools for the agent to use\n",
   "const tools = [new TavilySearchResults({ maxResults: 3 })];\n",
   "\n",
-  "const toolNode = new ToolNode<AgentState>(tools);\n",
+  "const toolNode = new ToolNode<typeof GraphAnnotation.State>(tools);\n",
   "\n",
   "const model = new ChatOpenAI({ temperature: 0 }).bindTools(tools);\n",
   "\n",
   "// Define the function that determines whether to continue or not\n",
-  "function shouldContinue(state: AgentState): \"tools\" | typeof END {\n",
+  "function shouldContinue(state: typeof GraphAnnotation.State): \"tools\" | typeof END {\n",
   "  const messages = state.messages;\n",
   "\n",
   "  const lastMessage = messages[messages.length - 1];\n",
@@ -211,7 +206,7 @@
   "}\n",
   "\n",
   "// Define the function that calls the model\n",
-  "async function callModel(state: AgentState) {\n",
+  "async function callModel(state: typeof 
GraphAnnotation.State) {\n",
   "  const messages = state.messages;\n",
   "\n",
   "  const response = await model.invoke(messages);\n",
@@ -221,7 +216,7 @@
   "}\n",
   "\n",
   "// Define a new graph\n",
-  "const workflow = new StateGraph({ channels: graphState })\n",
+  "const workflow = new StateGraph(GraphAnnotation)\n",
   "  .addNode(\"agent\", callModel)\n",
   "  .addNode(\"tools\", toolNode)\n",
   "  .addEdge(START, \"agent\")\n",
@@ -265,11 +260,6 @@
   "[how-to guides](https://langchain-ai.github.io/langgraphjs/how-tos/) for more\n",
   "examples of how to implement different design patterns."
  ]
- },
- {
-  "cell_type": "markdown",
-  "metadata": {},
-  "source": []
 }
],
"metadata": {
diff --git a/yarn.lock b/yarn.lock
index d8421de73..bc0df9756 100644
--- a/yarn.lock
+++ b/yarn.lock
@@ -2050,8 +2050,8 @@ __metadata:
   linkType: hard

 "@langchain/scripts@npm:>=0.1.0 <0.2.0":
-  version: 0.1.0
-  resolution: "@langchain/scripts@npm:0.1.0"
+  version: 0.1.1
+  resolution: "@langchain/scripts@npm:0.1.1"
   dependencies:
    "@rollup/wasm-node": ^4.19.0
    axios: ^1.6.7
@@ -2066,7 +2066,7 @@
  bin:
   lc_build: bin/build.js
   notebook_validate: bin/validate_notebook.js
-  checksum: 6f9f90195688a08326f49430cea8b844b3db222b6a86d94827e31e637dcb77de5330e1733df0373914b3920807e81b1598a6ac6b1a9ddee2ee0db204362d4645
+  checksum: 4c56c081b456b072ea0e9b6eddbe9f8e8d69996027800084ae871e20fc76453a12fcf64794c6b3e0d5aae5177603a772a906d4a2b591dbe70ca7bb51327bfa8c
  languageName: node
  linkType: hard