diff --git a/docs/concepts/parallel.md b/docs/concepts/parallel.md
index 506010234..8bb5b52fb 100644
--- a/docs/concepts/parallel.md
+++ b/docs/concepts/parallel.md
@@ -5,19 +5,19 @@ description: Learn about OpenAI's experimental parallel function calling to redu
 
 # Parallel Tools
 
-Parallel Tool Calling is a feature that allows you to call multiple functions in a single request. This makes it faster to get a response from the language model, especially if your tool calls are independent of each other.
+Parallel Function Calling is a feature that allows you to call multiple functions in a single request.
 
 !!! warning "Experimental Feature"
 
-    Parallel tool calling is only supported by Gemini and OpenAI at the moment
+    Parallel function calling is currently only supported by Google and OpenAI. Make sure to use the equivalent parallel tool `mode` for your client.
 
 ## Understanding Parallel Function Calling
 
-By using parallel function callings that allow you to call multiple functions in a single request, you can significantly reduce the latency of your application without having to use tricks with now one builds a schema.
+Parallel Function Calling lets you significantly reduce the latency of your application without having to build a parent schema as a wrapper around your tool calls.
 
 === "OpenAI"
 
-    ```python hl_lines="19 31"
+    ```python hl_lines="20 32"
    from __future__ import annotations
 
    import openai
@@ -38,7 +38,7 @@ By using parallel function callings that allow you to call multiple functions in
    client = instructor.from_openai(
        openai.OpenAI(),
        mode=instructor.Mode.PARALLEL_TOOLS
-    )  # (1)!
+    )
 
    function_calls = client.chat.completions.create(
        model="gpt-4o-mini",
@@ -49,7 +49,7 @@ By using parallel function callings that allow you to call multiple functions in
            "content": "What is the weather in toronto and dallas and who won the super bowl?",
        },
    ],
-        response_model=Iterable[Weather | GoogleSearch],  # (2)!
+        response_model=Iterable[Weather | GoogleSearch],
    )
 
    for fc in function_calls:
@@ -61,7 +61,7 @@ By using parallel function callings that allow you to call multiple functions in
 
 === "Vertex AI"
 
-    ```python
+    ```python hl_lines="20 30"
    import instructor
    import vertexai
    from vertexai.generative_models import GenerativeModel
@@ -82,7 +82,7 @@ By using parallel function callings that allow you to call multiple functions in
    client = instructor.from_vertexai(
        GenerativeModel("gemini-1.5-pro-preview-0409"),
        mode=instructor.Mode.VERTEXAI_PARALLEL_TOOLS
-    )  # (1)!
+    )
 
    function_calls = client.create(
        messages=[
@@ -91,7 +91,7 @@ By using parallel function callings that allow you to call multiple functions in
            "content": "What is the weather in toronto and dallas and who won the super bowl?",
        },
    ],
-        response_model=Iterable[Weather | GoogleSearch],  # (2)!
+        response_model=Iterable[Weather | GoogleSearch],
    )
 
    for fc in function_calls:
@@ -101,7 +101,6 @@ By using parallel function callings that allow you to call multiple functions in
        #> query='who won the super bowl'
    ```
 
-1. Set the mode to `PARALLEL_TOOLS` to enable parallel function calling.
-2. Set the response model to `Iterable[Weather | GoogleSearch]` to indicate that the response will be a list of `Weather` and `GoogleSearch` objects. This is necessary because the response will be a list of objects, and we need to specify the types of the objects in the list.
+We need to set the response model to `Iterable[Weather | GoogleSearch]` to indicate that the response will be a list of `Weather` and `GoogleSearch` objects.
 
-Noticed that the `response_model` Must be in the form `Iterable[Type1 | Type2 | ...]` or `Iterable[Type1]` where `Type1` and `Type2` are the types of the objects that will be returned in the response.
+This is necessary because the response will be a list of objects, and we need to specify their types in the annotation. The call returns an iterable which you can then loop over.
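A note for reviewers: the prose this patch adds ends with the point that the call returns an iterable you can loop over. A minimal sketch of that consumption pattern may help readers; this is illustrative only and not part of the diff above. The `Weather` and `GoogleSearch` field definitions here are assumptions standing in for the models defined earlier in the doc, and the result list stubs what `client.chat.completions.create(..., response_model=Iterable[Weather | GoogleSearch])` would return.

```python
# Illustrative sketch only. Field names on Weather/GoogleSearch are assumed
# stand-ins for the models defined earlier in the documentation page.
from pydantic import BaseModel


class Weather(BaseModel):
    location: str


class GoogleSearch(BaseModel):
    query: str


# Stub for the iterable that a parallel tool call would yield: each element
# is an instance of one of the types in the Iterable[...] union.
function_calls = [
    Weather(location="toronto"),
    Weather(location="dallas"),
    GoogleSearch(query="who won the super bowl"),
]

# Dispatch on the concrete type of each parallel call.
for fc in function_calls:
    if isinstance(fc, Weather):
        print(f"fetch weather for {fc.location}")
    elif isinstance(fc, GoogleSearch):
        print(f"search: {fc.query}")
```

Because the union is expressed in the type annotation, static type checkers can also narrow `fc` inside each `isinstance` branch.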