From a93505f6b2b93d7b23a06e7751464920ae63c4f9 Mon Sep 17 00:00:00 2001
From: Jacob Lee
Date: Thu, 15 Aug 2024 05:54:13 -0700
Subject: [PATCH] docs[patch]: Adds ChatOpenAI streamUsage docs on disabling
 stream usage (#6534)

* Adds ChatOpenAI streamUsage docs on disabling stream usage

* Fix notebook
---
 .../docs/integrations/chat/openai.ipynb       | 30 +++++++++++++++++++
 1 file changed, 30 insertions(+)

diff --git a/docs/core_docs/docs/integrations/chat/openai.ipynb b/docs/core_docs/docs/integrations/chat/openai.ipynb
index 9b832eba98ad..eb829b093865 100644
--- a/docs/core_docs/docs/integrations/chat/openai.ipynb
+++ b/docs/core_docs/docs/integrations/chat/openai.ipynb
@@ -305,6 +305,36 @@
     "await llmWithCustomHeaders.invoke(\"Hi there!\");"
    ]
   },
+  {
+   "cell_type": "markdown",
+   "id": "7af61d1d",
+   "metadata": {},
+   "source": [
+    "## Disabling streaming usage metadata\n",
+    "\n",
+    "Some proxies or third-party providers present largely the same API interface as OpenAI, but don't support the more recently added `stream_options` parameter to return streaming usage. You can use `ChatOpenAI` to access these providers by disabling streaming usage like this:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "0ff40bd7",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "import { ChatOpenAI } from \"@langchain/openai\";\n",
+    "\n",
+    "const llmWithoutStreamUsage = new ChatOpenAI({\n",
+    "  temperature: 0.9,\n",
+    "  streamUsage: false,\n",
+    "  configuration: {\n",
+    "    baseURL: \"https://proxy.com\",\n",
+    "  },\n",
+    "});\n",
+    "\n",
+    "await llmWithoutStreamUsage.invoke(\"Hi there!\");"
+   ]
+  },
   {
    "cell_type": "markdown",
    "id": "013b6300",