From 1b25b2531322e27756beed302f52b2ad2a6048f2 Mon Sep 17 00:00:00 2001
From: "pre-commit-ci[bot]" <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date: Fri, 25 Oct 2024 22:08:18 +0000
Subject: [PATCH] [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci
---
 docs/source/developers/index.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/source/developers/index.md b/docs/source/developers/index.md
index a71ec3f07..b2e870caf 100644
--- a/docs/source/developers/index.md
+++ b/docs/source/developers/index.md
@@ -430,7 +430,7 @@ def create_llm_chain(
 Once your chat handler binds a Runnable to `self.llm_chain` in
 `self.create_llm_chain()`, you can define `process_message()` to invoke
 `self.stream_reply()`, which streams a reply back to the user using
-`self.llm_chain.astream()`. 
+`self.llm_chain.astream()`.
 `self.stream_reply()` has two required arguments:
 
 - `input`: An input to your LangChain Runnable. This is usually a dictionary
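The context lines in the hunk above describe a streaming chat-handler pattern: bind a LangChain Runnable to `self.llm_chain` in `create_llm_chain()`, then have `process_message()` call `stream_reply()`, which consumes `self.llm_chain.astream()`. A minimal, self-contained sketch of that flow follows; `FakeRunnable`, `EchoChatHandler`, and the simplified `stream_reply()` signature are illustrative stand-ins, not the real Jupyter AI base class, which provides more machinery and sends chunks to the chat UI rather than returning them.

```python
# Hypothetical sketch of the pattern described in the patched docs.
# Only the method names (create_llm_chain, process_message, stream_reply,
# llm_chain) come from the docs; everything else is a stand-in.
import asyncio


class FakeRunnable:
    """Stand-in for a LangChain Runnable with an async streaming interface."""

    async def astream(self, input):
        # Yield the reply one token at a time, like a real LLM stream would.
        for token in ("Hello", ", ", "world"):
            yield token


class EchoChatHandler:
    def create_llm_chain(self):
        # In Jupyter AI this would construct a real LangChain Runnable
        # from the configured provider; here we bind the fake one.
        self.llm_chain = FakeRunnable()

    async def stream_reply(self, input):
        # Simplified: the real method also takes additional required
        # arguments and streams each chunk back to the user as it arrives.
        chunks = []
        async for chunk in self.llm_chain.astream(input):
            chunks.append(chunk)
        return "".join(chunks)

    async def process_message(self, message):
        # `input` is usually a dictionary keyed for the Runnable's prompt.
        return await self.stream_reply({"input": message})


handler = EchoChatHandler()
handler.create_llm_chain()
reply = asyncio.run(handler.process_message("hi"))
print(reply)
```

The sketch keeps the same division of responsibility as the docs: chain construction in `create_llm_chain()`, message handling in `process_message()`, and streaming mechanics isolated in `stream_reply()`.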