
Commit

refactor: openai function calling docs
bracesproul committed Oct 12, 2023
1 parent e83888e commit 988ae44
Showing 3 changed files with 123 additions and 69 deletions.
@@ -17,10 +17,82 @@ directly to the model and call it, as shown below.

## Usage

There are two main ways to apply functions to your OpenAI calls.

The first and simplest is to pass the function directly as a call option to the `.invoke({})` method:

```typescript
/* Define your function schema */
const extractionFunctionSchema = {...}

/* Instantiate ChatOpenAI class */
const model = new ChatOpenAI({ modelName: "gpt-4" });

/**
 * Call the .invoke method on the model, directly passing
 * the function arguments as call args.
 */
const result = await model.invoke([new HumanMessage("What a beautiful day!")], {
  functions: [extractionFunctionSchema],
  function_call: { name: "extractor" },
});

console.log({ result });
```
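
When the model calls the function, the returned message's `content` is empty and the call appears under `additional_kwargs.function_call`, with the arguments serialized as a JSON string. A minimal sketch of reading them from the `result` above (the field names assume the `extractor` schema shown further down this page):

```typescript
const functionCall = result.additional_kwargs.function_call;

if (functionCall) {
  /* `arguments` is a JSON string shaped by your schema; parse it before use */
  const { tone, word_count, chat_response } = JSON.parse(functionCall.arguments);
  console.log(tone, word_count, chat_response);
}
```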

The second way is to bind the function directly to your model. Binding function arguments to your model is useful when you want to reuse the same function across multiple calls.
Calling the `.bind({})` method attaches any call arguments you pass in to all future calls to the model.

```typescript
/* Define your function schema */
const extractionFunctionSchema = {...}

/* Instantiate ChatOpenAI class and bind function arguments to the model */
const model = new ChatOpenAI({ modelName: "gpt-4" }).bind({
  functions: [extractionFunctionSchema],
  function_call: { name: "extractor" },
});

/* Now we can call the model without having to pass the function arguments in again */
const result = await model.invoke([new HumanMessage("What a beautiful day!")]);

console.log({ result });
```
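
Because the arguments are bound, any later call reuses them without passing them again, for example (a continuation of the sketch above with a second, hypothetical input):

```typescript
/* The bound `functions` and `function_call` apply to this call as well */
const secondResult = await model.invoke([
  new HumanMessage("What a dreary, rainy afternoon."),
]);

console.log({ secondResult });
```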

OpenAI requires parameter schemas in the format below, where `parameters` must be [JSON Schema](https://json-schema.org/).
When adding call arguments to your model, specifying the `function_call` argument will force the model to return a response using the specified function.
This is useful if you pass multiple function schemas and want the model to use a specific one rather than choosing for itself.

Example function schema:

```typescript
const extractionFunctionSchema = {
  name: "extractor",
  description: "Extracts fields from the input.",
  parameters: {
    type: "object",
    properties: {
      tone: {
        type: "string",
        enum: ["positive", "negative"],
        description: "The overall tone of the input",
      },
      word_count: {
        type: "number",
        description: "The number of words in the input",
      },
      chat_response: {
        type: "string",
        description: "A response to the human's input",
      },
    },
    required: ["tone", "word_count", "chat_response"],
  },
};
```
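
Conversely, if you provide several functions and omit `function_call`, the model chooses whether to call a function and which one. A minimal sketch, where the second schema (`summaryFunctionSchema`) is purely hypothetical:

```typescript
/* A second, hypothetical function the model could pick instead */
const summaryFunctionSchema = {
  name: "summarizer",
  description: "Summarizes the input in one sentence.",
  parameters: {
    type: "object",
    properties: {
      summary: {
        type: "string",
        description: "A one-sentence summary of the input",
      },
    },
    required: ["summary"],
  },
};

/* With `function_call` omitted, the model decides which function (if any) to call */
const model = new ChatOpenAI({ modelName: "gpt-4" }).bind({
  functions: [extractionFunctionSchema, summaryFunctionSchema],
});

const result = await model.invoke([new HumanMessage("What a beautiful day!")]);
```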

Now to put it all together:

<CodeBlock language="typescript">{OpenAIFunctionsExample}</CodeBlock>

## Usage with Zod
46 changes: 15 additions & 31 deletions examples/src/models/chat/openai_functions.ts
@@ -25,10 +25,6 @@ const extractionFunctionSchema = {
  },
};

// Bind function arguments to the model.
// All subsequent invoke calls will use the bound parameters.
// "functions.parameters" must be formatted as JSON Schema
// Omit "function_call" if you want the model to choose a function to call.
const model = new ChatOpenAI({
  modelName: "gpt-4",
}).bind({
@@ -39,34 +35,22 @@ const model = new ChatOpenAI({
const result = await model.invoke([new HumanMessage("What a beautiful day!")]);

console.log(result);

/*
AIMessage {
  lc_serializable: true,
  lc_kwargs: { content: '', additional_kwargs: { function_call: [Object] } },
  lc_namespace: [ 'langchain', 'schema' ],
  content: '',
  name: undefined,
  additional_kwargs: {
    function_call: {
      name: 'extractor',
      arguments: '{\n' +
        ' "tone": "positive",\n' +
        ' "word_count": 4,\n' +
        ` "chat_response": "I'm glad you're enjoying the day! What makes it so beautiful for you?"\n` +
        '}'
    }
  }
}
*/
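
// Note: the arguments above come back as a JSON string; parse them before use, e.g.:
// const { tone, word_count, chat_response } = JSON.parse(
//   result.additional_kwargs.function_call?.arguments ?? "{}"
// );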

// Alternatively, you can pass function call arguments as an additional argument as a one-off:
/*
const model = new ChatOpenAI({
  modelName: "gpt-4",
});
const result = await model.call([
  new HumanMessage("What a beautiful day!")
], {
  functions: [extractionFunctionSchema],
  function_call: { name: "extractor" },
});
*/
72 changes: 35 additions & 37 deletions examples/src/models/chat/openai_functions_zod.ts
@@ -3,54 +3,52 @@ import { HumanMessage } from "langchain/schema";
import { z } from "zod";
import { zodToJsonSchema } from "zod-to-json-schema";

const extractionFunctionSchema = {
  name: "extractor",
  description: "Extracts fields from the input.",
  parameters: zodToJsonSchema(
    z.object({
      tone: z
        .enum(["positive", "negative"])
        .describe("The overall tone of the input"),
      entity: z.string().describe("The entity mentioned in the input"),
      word_count: z.number().describe("The number of words in the input"),
      chat_response: z.string().describe("A response to the human's input"),
      final_punctuation: z
        .optional(z.string())
        .describe("The final punctuation mark in the input, if any."),
    })
  ),
};

// Bind function arguments to the model.
// "functions.parameters" must be formatted as JSON Schema.
// We translate the above Zod schema into JSON schema using the "zodToJsonSchema" package.
// Omit "function_call" if you want the model to choose a function to call.
const model = new ChatOpenAI({
  modelName: "gpt-4",
}).bind({
  functions: [extractionFunctionSchema],
  function_call: { name: "extractor" },
});

const result = await model.invoke([new HumanMessage("What a beautiful day!")]);

console.log(result);

/*
AIMessage {
  lc_serializable: true,
  lc_kwargs: { content: '', additional_kwargs: { function_call: [Object] } },
  lc_namespace: [ 'langchain', 'schema' ],
  content: '',
  name: undefined,
  additional_kwargs: {
    function_call: {
      name: 'extractor',
      arguments: '{\n' +
        '"tone": "positive",\n' +
        '"entity": "day",\n' +
        '"word_count": 4,\n' +
        `"chat_response": "I'm glad you're enjoying the day!",\n` +
        '"final_punctuation": "!"\n' +
        '}'
    }
  }
}
*/
