From add990d7f1b695124f05eb7673bd974867a92b10 Mon Sep 17 00:00:00 2001
From: "pre-commit-ci[bot]" <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Date: Fri, 25 Oct 2024 22:06:49 +0000
Subject: [PATCH] [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci
---
 docs/source/developers/index.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/source/developers/index.md b/docs/source/developers/index.md
index 26360891d..31fae6c7a 100644
--- a/docs/source/developers/index.md
+++ b/docs/source/developers/index.md
@@ -400,7 +400,7 @@ real-time, which offers a visually pleasing user experience.
 Custom slash commands can stream responses in chat by invoking the
 `stream_reply()` method, provided by the `BaseChatHandler` class that custom
 slash commands inherit from. Custom slash commands should always use
 `self.stream_reply()` to stream
-responses, as it provides support for stopping the response stream from the UI. 
+responses, as it provides support for stopping the response stream from the UI.
 The streaming functionality uses LangChain's Expression Language (LCEL). LCEL
 is a declarative way to compose [Runnables](https://python.langchain.com/api_reference/core/runnables/langchain_core.runnables.base.Runnable.html) into chains. Any chain constructed this way will automatically have sync, async, batch, and streaming support. The main composition primitives are RunnableSequence and RunnableParallel.
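Aside (not part of the patch above): the LCEL composition model mentioned in the changed documentation can be illustrated with a small self-contained sketch. The `Runnable`, `Lambda`, `Sequence`, and `Parallel` classes below are hypothetical stand-ins for LangChain's real `Runnable`, `RunnableLambda`, `RunnableSequence`, and `RunnableParallel`; the point is only to show how declarative composition yields `invoke()` and `stream()` from a single chain definition.

```python
# Conceptual sketch of LCEL-style composition (NOT the real LangChain API).
class Runnable:
    def __or__(self, other):
        # The | operator composes two runnables into a sequence,
        # mirroring how LCEL builds a RunnableSequence.
        return Sequence(self, other)

    def invoke(self, x):
        raise NotImplementedError

    def stream(self, x):
        # Default streaming: yield the full result as one chunk.
        # Real LCEL runnables can stream incrementally.
        yield self.invoke(x)


class Lambda(Runnable):
    """Wraps a plain function, like a RunnableLambda."""
    def __init__(self, fn):
        self.fn = fn

    def invoke(self, x):
        return self.fn(x)


class Sequence(Runnable):
    """Pipes one runnable's output into the next, like a RunnableSequence."""
    def __init__(self, first, second):
        self.first, self.second = first, second

    def invoke(self, x):
        return self.second.invoke(self.first.invoke(x))


class Parallel(Runnable):
    """Runs named branches on the same input, like a RunnableParallel."""
    def __init__(self, **branches):
        self.branches = branches

    def invoke(self, x):
        return {name: r.invoke(x) for name, r in self.branches.items()}


chain = Lambda(lambda x: x + 1) | Lambda(lambda x: x * 2)
print(chain.invoke(3))        # 8
print(list(chain.stream(3)))  # [8]

fanout = Parallel(double=Lambda(lambda x: x * 2), square=Lambda(lambda x: x ** 2))
print(fanout.invoke(3))       # {'double': 6, 'square': 9}
```

Because the chain is declared as a composition of runnables rather than as imperative calls, the same definition can expose synchronous, streaming, and (in the real library) async and batch execution paths, which is what allows `stream_reply()` to consume an LCEL chain incrementally.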