docs[patch]: Adds Cohere chat model docstring, update others (#6499)
* Adds Cohere chat model docstring, update others

* Use absolute links
jacoblee93 authored Aug 12, 2024
1 parent 228c1eb commit 3fbe328
Showing 6 changed files with 491 additions and 129 deletions.
33 changes: 13 additions & 20 deletions libs/langchain-anthropic/src/chat_models.ts
@@ -179,16 +179,16 @@ function extractToken(chunk: AIMessageChunk): string | undefined {
* Anthropic chat model integration.
*
* Setup:
* Install `@langchain/anthropic` and set environment variable `ANTHROPIC_API_KEY`.
* Install `@langchain/anthropic` and set an environment variable named `ANTHROPIC_API_KEY`.
*
* ```bash
* npm install @langchain/anthropic
* export ANTHROPIC_API_KEY="your-api-key"
* ```
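 *
 * A minimal instantiation sketch (the model name below is illustrative; substitute any supported Anthropic model id):
 *
 * ```typescript
 * import { ChatAnthropic } from "@langchain/anthropic";
 *
 * const llm = new ChatAnthropic({
 *   model: "claude-3-5-sonnet-20240620",
 *   temperature: 0,
 * });
 * ```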
*
* ## [Constructor args](/classes/langchain_anthropic.ChatAnthropic.html#constructor)
* ## [Constructor args](https://api.js.langchain.com/classes/langchain_anthropic.ChatAnthropic.html#constructor)
*
* ## [Runtime args](/interfaces/langchain_anthropic.ChatAnthropicCallOptions.html)
* ## [Runtime args](https://api.js.langchain.com/interfaces/langchain_anthropic.ChatAnthropicCallOptions.html)
*
 * Runtime args can be passed as the second argument to any of the base runnable methods `.invoke`, `.stream`, `.batch`, etc.
 * They can also be passed via `.bind`, or as the second arg in `.bindTools`, as shown in the examples below:
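 *
 * For example, a sketch of binding a runtime arg up front (assumes the `llm` instance from the setup above; `stop` is a standard call option):
 *
 * ```typescript
 * const llmWithStop = llm.bind({ stop: ["\n"] });
 * const boundResult = await llmWithStop.invoke("List three colors.");
 * ```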
@@ -235,17 +235,10 @@ function extractToken(chunk: AIMessageChunk): string | undefined {
* <summary><strong>Invoking</strong></summary>
*
* ```typescript
* const messages = [
* {
* type: "system" as const,
* content: "You are a helpful translator. Translate the user sentence to French.",
* },
* {
* type: "human" as const,
* content: "I love programming.",
* },
* ];
* const result = await llm.invoke(messages);
* const input = `Translate "I love programming" into French.`;
*
* // Models also accept a list of chat messages or a formatted prompt
* const result = await llm.invoke(input);
* console.log(result);
* ```
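 *
 * As the comment above notes, a list of chat messages also works; a sketch using the role/content tuple shorthand:
 *
 * ```typescript
 * // Role/content tuples are also accepted as input
 * const msgResult = await llm.invoke([
 *   ["system", "You are a helpful translator. Translate the user sentence to French."],
 *   ["human", "I love programming."],
 * ]);
 * ```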
*
@@ -280,7 +273,7 @@ function extractToken(chunk: AIMessageChunk): string | undefined {
* <summary><strong>Streaming Chunks</strong></summary>
*
* ```typescript
* for await (const chunk of await llm.stream(messages)) {
* for await (const chunk of await llm.stream(input)) {
* console.log(chunk);
* }
* ```
@@ -346,7 +339,7 @@ function extractToken(chunk: AIMessageChunk): string | undefined {
* import { AIMessageChunk } from '@langchain/core/messages';
* import { concat } from '@langchain/core/utils/stream';
*
* const stream = await llm.stream(messages);
* const stream = await llm.stream(input);
* let full: AIMessageChunk | undefined;
* for await (const chunk of stream) {
* full = !full ? chunk : concat(full, chunk);
@@ -450,7 +443,7 @@ function extractToken(chunk: AIMessageChunk): string | undefined {
* rating: z.number().optional().describe("How funny the joke is, from 1 to 10")
* }).describe('Joke to tell user.');
*
* const structuredLlm = llm.withStructuredOutput(Joke);
* const structuredLlm = llm.withStructuredOutput(Joke, { name: "Joke" });
* const jokeResult = await structuredLlm.invoke("Tell me a joke about cats");
* console.log(jokeResult);
* ```
@@ -501,7 +494,7 @@ function extractToken(chunk: AIMessageChunk): string | undefined {
* <summary><strong>Usage Metadata</strong></summary>
*
* ```typescript
* const aiMsgForMetadata = await llm.invoke(messages);
* const aiMsgForMetadata = await llm.invoke(input);
* console.log(aiMsgForMetadata.usage_metadata);
* ```
*
@@ -517,7 +510,7 @@ function extractToken(chunk: AIMessageChunk): string | undefined {
*
* ```typescript
* const streamForMetadata = await llm.stream(
* messages,
* input,
* {
* streamUsage: true
* }
@@ -540,7 +533,7 @@ function extractToken(chunk: AIMessageChunk): string | undefined {
* <summary><strong>Response Metadata</strong></summary>
*
* ```typescript
* const aiMsgForResponseMetadata = await llm.invoke(messages);
* const aiMsgForResponseMetadata = await llm.invoke(input);
* console.log(aiMsgForResponseMetadata.response_metadata);
* ```
*
33 changes: 13 additions & 20 deletions libs/langchain-aws/src/chat_models.ts
@@ -181,9 +181,9 @@ export interface ChatBedrockConverseCallOptions
* export BEDROCK_AWS_ACCESS_KEY_ID="your-aws-access-key-id"
* ```
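 *
 * A minimal instantiation sketch (the model id and region below are illustrative):
 *
 * ```typescript
 * import { ChatBedrockConverse } from "@langchain/aws";
 *
 * const llm = new ChatBedrockConverse({
 *   model: "anthropic.claude-3-5-sonnet-20240620-v1:0",
 *   region: "us-east-1",
 * });
 * ```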
*
* ## [Constructor args](/classes/langchain_aws.ChatBedrockConverse.html#constructor)
* ## [Constructor args](https://api.js.langchain.com/classes/langchain_aws.ChatBedrockConverse.html#constructor)
*
* ## [Runtime args](/interfaces/langchain_aws.ChatBedrockConverseCallOptions.html)
* ## [Runtime args](https://api.js.langchain.com/interfaces/langchain_aws.ChatBedrockConverseCallOptions.html)
*
 * Runtime args can be passed as the second argument to any of the base runnable methods `.invoke`, `.stream`, `.batch`, etc.
 * They can also be passed via `.bind`, or as the second arg in `.bindTools`, as shown in the examples below:
@@ -199,7 +199,7 @@ export interface ChatBedrockConverseCallOptions
* const llmWithTools = llm.bindTools(
* [...],
* {
* tool_choice: "auto",
* stop: ["\n"],
* }
* );
* ```
@@ -234,17 +234,10 @@ export interface ChatBedrockConverseCallOptions
* <summary><strong>Invoking</strong></summary>
*
* ```typescript
* const messages = [
* {
* type: "system" as const,
* content: "You are a helpful translator. Translate the user sentence to French.",
* },
* {
* type: "human" as const,
* content: "I love programming.",
* },
* ];
* const result = await llm.invoke(messages);
* const input = `Translate "I love programming" into French.`;
*
* // Models also accept a list of chat messages or a formatted prompt
* const result = await llm.invoke(input);
* console.log(result);
* ```
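 *
 * The "formatted prompt" mentioned in the comment could look like this sketch (built with prompt templates from `@langchain/core`):
 *
 * ```typescript
 * import { ChatPromptTemplate } from "@langchain/core/prompts";
 *
 * const prompt = ChatPromptTemplate.fromMessages([
 *   ["system", "You are a helpful translator. Translate the user sentence to {language}."],
 *   ["human", "{text}"],
 * ]);
 * const chain = prompt.pipe(llm);
 * const chainResult = await chain.invoke({ language: "French", text: "I love programming." });
 * ```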
*
@@ -284,7 +277,7 @@ export interface ChatBedrockConverseCallOptions
* <summary><strong>Streaming Chunks</strong></summary>
*
* ```typescript
* for await (const chunk of await llm.stream(messages)) {
* for await (const chunk of await llm.stream(input)) {
* console.log(chunk);
* }
* ```
@@ -370,7 +363,7 @@ export interface ChatBedrockConverseCallOptions
* import { AIMessageChunk } from '@langchain/core/messages';
* import { concat } from '@langchain/core/utils/stream';
*
* const stream = await llm.stream(messages);
* const stream = await llm.stream(input);
* let full: AIMessageChunk | undefined;
* for await (const chunk of stream) {
* full = !full ? chunk : concat(full, chunk);
@@ -495,7 +488,7 @@ export interface ChatBedrockConverseCallOptions
* }).describe('Joke to tell user.');
*
 * const structuredLlm = llm.withStructuredOutput(Joke);
 * const structuredLlm = llm.withStructuredOutput(Joke, { name: "Joke" });
 * const jokeResult = await structuredLlm.invoke("Tell me a joke about cats");
* console.log(jokeResult);
* ```
*
@@ -545,7 +538,7 @@ export interface ChatBedrockConverseCallOptions
* <summary><strong>Usage Metadata</strong></summary>
*
* ```typescript
* const aiMsgForMetadata = await llm.invoke(messages);
* const aiMsgForMetadata = await llm.invoke(input);
* console.log(aiMsgForMetadata.usage_metadata);
* ```
*
@@ -560,7 +553,7 @@ export interface ChatBedrockConverseCallOptions
* <summary><strong>Stream Usage Metadata</strong></summary>
*
* ```typescript
* const streamForMetadata = await llm.stream(messages);
* const streamForMetadata = await llm.stream(input);
* let fullForMetadata: AIMessageChunk | undefined;
* for await (const chunk of streamForMetadata) {
* fullForMetadata = !fullForMetadata ? chunk : concat(fullForMetadata, chunk);
@@ -579,7 +572,7 @@ export interface ChatBedrockConverseCallOptions
* <summary><strong>Response Metadata</strong></summary>
*
* ```typescript
* const aiMsgForResponseMetadata = await llm.invoke(messages);
* const aiMsgForResponseMetadata = await llm.invoke(input);
* console.log(aiMsgForResponseMetadata.response_metadata);
* ```
*
