diff --git a/docs/integrations/anthropic.md b/docs/integrations/anthropic.md
index 5e97308b5..120af60e1 100644
--- a/docs/integrations/anthropic.md
+++ b/docs/integrations/anthropic.md
@@ -7,69 +7,24 @@ description: Learn how to combine Anthropic and Instructor clients to create use

-Now that we have a [Anthropic](https://www.anthropic.com/) client, we can use it with the `instructor` client to make requests.
+Now that we have an [Anthropic](https://www.anthropic.com/) client, we can use it with the `instructor` client to make requests.

-Let's first install the instructor client with anthropic support
+Let's first install the instructor client with anthropic support:

-```
-pip install "instructor[anthropic]"
-```
-
-Once we've done so, getting started is as simple as using our `from_anthropic` method to patch the client up.
-
-```python
-from pydantic import BaseModel
-from typing import List
-import anthropic
-import instructor
-
-# Patching the Anthropics client with the instructor for enhanced capabilities
-client = instructor.from_anthropic(
-    anthropic.Anthropic(),
-)
-
-class Properties(BaseModel):
-    name: str
-    value: str
-
-class User(BaseModel):
-    name: str
-    age: int
-    properties: List[Properties]
-
-
-# client.messages.create will also work due to the instructor client
-user_response = client.chat.completions.create(
-    model="claude-3-haiku-20240307",
-    max_tokens=1024,
-    max_retries=0,
-    messages=[
-        {
-            "role": "user",
-            "content": "Create a user for a model with a name, age, and properties.",
-        }
-    ],
-    response_model=User,
-)  # type: ignore
-
-print(user_response.model_dump_json(indent=2))
-"""
-{
-  "name": "John Doe",
-  "age": 35,
-  "properties": [
-    {
-      "name": "City",
-      "value": "New York"
-    },
-    {
-      "name": "Occupation",
-      "value": "Software Engineer"
-    }
-  ]
-}
-"""
-```
+=== "UV (Recommended)"
+    ```bash
+    # Install UV if you haven't already
+    curl -LsSf https://astral.sh/uv/install.sh | sh
+
+    # Install instructor
+    uv pip install "instructor[anthropic]"
+    ```
+
+=== "pip"
+    ```bash
+    pip install "instructor[anthropic]"
+    ```

 ## Streaming Support

diff --git a/docs/integrations/azure.md b/docs/integrations/azure.md
index c3742f29d..c3a507b6e 100644
--- a/docs/integrations/azure.md
+++ b/docs/integrations/azure.md
@@ -9,23 +9,22 @@ This guide demonstrates how to use Azure OpenAI with instructor for structured o

 ## Installation

-We can use the same installation as we do for OpenAI since the default `openai` client ships with an AzureOpenAI client.
-
-First, install the required dependencies:
-
-```bash
-pip install instructor
-```
-
-Next, make sure that you've enabled Azure OpenAI in your Azure account and have a deployment for the model you'd like to use. [Here is a guide to get started](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/create-resource?pivots=web-portal)
-
-Once you've done so, you'll have an endpoint and a API key to be used to configure the client.
-
-```bash
-instructor.exceptions.InstructorRetryException: Error code: 401 - {'statusCode': 401, 'message': 'Unauthorized. Access token is missing, invalid, audience is incorrect (https://cognitiveservices.azure.com), or have expired.'}
-```
-
-If you see an error like the one above, make sure you've set the correct endpoint and API key in the client.
+=== "UV (Recommended)"
+    ```bash
+    # Install UV if you haven't already
+    curl -LsSf https://astral.sh/uv/install.sh | sh
+
+    # Install instructor
+    uv pip install "instructor[azure]"
+    ```
+
+=== "pip"
+    ```bash
+    pip install "instructor[azure]"
+    ```
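+
+Once installed, here is a minimal sketch of wiring the Azure client into instructor. This is an illustration only: the API key, API version, and endpoint below are placeholder values you would replace with your own deployment's settings, covered in the Authentication section below.
+
+```python
+import instructor
+from openai import AzureOpenAI
+
+# Placeholder credentials; see the Authentication section below
+client = instructor.from_openai(
+    AzureOpenAI(
+        api_key="your-api-key",
+        api_version="2024-02-01",
+        azure_endpoint="https://your-resource.openai.azure.com",
+    )
+)
+```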

 ## Authentication

diff --git a/docs/integrations/cerebras.md b/docs/integrations/cerebras.md
index bfb326fcb..2ed005667 100644
--- a/docs/integrations/cerebras.md
+++ b/docs/integrations/cerebras.md
@@ -12,8 +12,31 @@ Cerebras provides hardware-accelerated AI models optimized for high-performance

 Install Instructor with Cerebras support:

-```bash
-pip install "instructor[cerebras_cloud_sdk]"
-```
+=== "UV (Recommended)"
+    ```bash
+    # Install UV if you haven't already
+    curl -LsSf https://astral.sh/uv/install.sh | sh
+
+    # Install instructor
+    uv pip install "instructor[cerebras]"
+    ```
+
+=== "pip"
+    ```bash
+    pip install "instructor[cerebras]"
+    ```

 ## Simple User Example (Sync)

diff --git a/docs/integrations/cohere.md b/docs/integrations/cohere.md
index d3824651a..f4ae32d10 100644
--- a/docs/integrations/cohere.md
+++ b/docs/integrations/cohere.md
@@ -13,10 +13,15 @@ You'll need a cohere API key which can be obtained by signing up [here](https://

 ## Setup

-```
-pip install "instructor[cohere]"
-
-```
+=== "UV (Recommended)"
+    ```bash
+    uv pip install "instructor[cohere]"
+    ```
+
+=== "pip"
+    ```bash
+    pip install "instructor[cohere]"
+    ```

 Export your key:

diff --git a/docs/integrations/deepseek.md b/docs/integrations/deepseek.md
index 64a3f8649..ec0956a31 100644
--- a/docs/integrations/deepseek.md
+++ b/docs/integrations/deepseek.md
@@ -14,8 +14,28 @@ This guide covers everything you need to know about using DeepSeek with Instruct

-Instructor comes with support for the OpenAI Client out of the box, so you don't need to install anything extra.
+Let's first install the instructor client with deepseek support:

-```bash
-pip install "instructor"
-```
+=== "UV (Recommended)"
+    ```bash
+    # Install UV if you haven't already
+    curl -LsSf https://astral.sh/uv/install.sh | sh
+
+    # Install instructor
+    uv pip install "instructor[deepseek]"
+    ```
+
+=== "pip"
+    ```bash
+    pip install "instructor[deepseek]"
+    ```

 ⚠️ **Important**: You must set your DeepSeek API key before using the client. You can do this in two ways:

diff --git a/docs/integrations/fireworks.md b/docs/integrations/fireworks.md
index a12edab1d..282963680 100644
--- a/docs/integrations/fireworks.md
+++ b/docs/integrations/fireworks.md
@@ -12,8 +12,31 @@ Fireworks provides efficient and cost-effective AI models with enterprise-grade

 Install Instructor with Fireworks support:

-```bash
-pip install "instructor[fireworks-ai]"
-```
+=== "UV (Recommended)"
+    ```bash
+    # Install UV if you haven't already
+    curl -LsSf https://astral.sh/uv/install.sh | sh
+
+    # Install instructor
+    uv pip install "instructor[fireworks]"
+    ```
+
+=== "pip"
+    ```bash
+    pip install "instructor[fireworks]"
+    ```

 ## Simple User Example (Sync)

diff --git a/docs/integrations/google.md b/docs/integrations/google.md
index fcc280f0e..4cf4c6975 100644
--- a/docs/integrations/google.md
+++ b/docs/integrations/google.md
@@ -11,9 +11,24 @@ This guide will show you how to use Instructor with the Google.GenerativeAI libr

 Google's Gemini models provide powerful AI capabilities with multimodal support. This guide shows you how to use Instructor with Google's Gemini models for type-safe, validated responses.

-```bash
-pip install "instructor[google-generativeai]
-```
+Let's first install the instructor client with Google/Gemini support:
+
+=== "UV (Recommended)"
+    ```bash
+    # Install UV if you haven't already
+    curl -LsSf https://astral.sh/uv/install.sh | sh
+
+    # Install instructor
+    uv pip install "instructor[google]"
+    ```
+
+=== "pip"
+    ```bash
+    pip install "instructor[google]"
+    ```
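+
+As a quick sanity check after installing, a sketch like the following should yield a patched client. This assumes the `google-genai` SDK and a `GOOGLE_API_KEY` set in your environment; adjust to however you manage credentials.
+
+```python
+import instructor
+from google import genai
+
+# genai.Client() picks up GOOGLE_API_KEY from the environment
+client = instructor.from_genai(genai.Client())
+```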

 ## Simple User Example (Sync)

diff --git a/docs/integrations/groq.md b/docs/integrations/groq.md
index 81fd8068c..c41e4fe00 100644
--- a/docs/integrations/groq.md
+++ b/docs/integrations/groq.md
@@ -11,9 +11,31 @@ you'll need to sign up for an account and get an API key. You can do that [here]

 ```bash
 export GROQ_API_KEY=
-pip install "instructor[groq]"
 ```

+Let's first install the instructor client with groq support:
+
+=== "UV (Recommended)"
+    ```bash
+    # Install UV if you haven't already
+    curl -LsSf https://astral.sh/uv/install.sh | sh
+
+    # Install instructor
+    uv pip install "instructor[groq]"
+    ```
+
+=== "pip"
+    ```bash
+    pip install "instructor[groq]"
+    ```

 ## Groq AI

 Groq supports structured outputs with their new `llama-3-groq-70b-8192-tool-use-preview` model.

diff --git a/docs/integrations/litellm.md b/docs/integrations/litellm.md
index 671f5fd25..53b0a6bcd 100644
--- a/docs/integrations/litellm.md
+++ b/docs/integrations/litellm.md
@@ -12,8 +12,31 @@ LiteLLM provides a unified interface for multiple LLM providers, making it easy

 Install Instructor with LiteLLM support:

-```bash
-pip install "instructor[litellm]"
-```
+=== "UV (Recommended)"
+    ```bash
+    # Install UV if you haven't already
+    curl -LsSf https://astral.sh/uv/install.sh | sh
+
+    # Install instructor
+    uv pip install "instructor[litellm]"
+    ```
+
+=== "pip"
+    ```bash
+    pip install "instructor[litellm]"
+    ```

 ## Simple User Example (Sync)

diff --git a/docs/integrations/llama-cpp-python.md b/docs/integrations/llama-cpp-python.md
index de267c39b..8abca9ebe 100644
--- a/docs/integrations/llama-cpp-python.md
+++ b/docs/integrations/llama-cpp-python.md
@@ -16,6 +16,22 @@ This guide demonstrates how to use llama-cpp-python with Instructor to generate

-Open-source LLMS are gaining popularity, and llama-cpp-python has made the `llama-cpp` model available to obtain structured outputs using JSON schema via a mixture of [constrained sampling](https://llama-cpp-python.readthedocs.io/en/latest/#json-schema-mode) and [speculative decoding](https://llama-cpp-python.readthedocs.io/en/latest/#speculative-decoding).
+Open-source LLMs are gaining popularity, and llama-cpp-python has made the `llama-cpp` model available to obtain structured outputs using JSON schema via a mixture of [constrained sampling](https://llama-cpp-python.readthedocs.io/en/latest/#json-schema-mode) and [speculative decoding](https://llama-cpp-python.readthedocs.io/en/latest/#speculative-decoding).

+Let's first install the instructor client with llama-cpp-python support:
+
+=== "UV (Recommended)"
+    ```bash
+    # Install UV if you haven't already
+    curl -LsSf https://astral.sh/uv/install.sh | sh
+
+    # Install instructor with llama-cpp-python support
+    uv pip install "instructor[llama]"
+    ```
+
+=== "pip"
+    ```bash
+    pip install "instructor[llama]"
+    ```
+
-They also support a OpenAI compatible client, which can be used to obtain structured output as a in process mechanism to avoid any network dependency.
+They also support an OpenAI-compatible client, which can be used to obtain structured output as an in-process mechanism to avoid any network dependency.
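+
+Here is a rough sketch of that in-process setup. The model path is a placeholder, and the flags shown assume a recent `llama-cpp-python` build with a chat-format model:
+
+```python
+import llama_cpp
+import instructor
+
+llama = llama_cpp.Llama(
+    model_path="./models/your-model.gguf",  # placeholder path
+    chat_format="chatml",
+    n_ctx=2048,
+    logits_all=True,
+    verbose=False,
+)
+
+# Patch the OpenAI-compatible method so calls accept response_model
+create = instructor.patch(
+    create=llama.create_chat_completion_openai_v1,
+    mode=instructor.Mode.JSON_SCHEMA,
+)
+```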
diff --git a/docs/integrations/mistral.md b/docs/integrations/mistral.md
index 37f2b9d04..af737b016 100644
--- a/docs/integrations/mistral.md
+++ b/docs/integrations/mistral.md
@@ -18,6 +18,25 @@ Mistral Large is the flagship model from Mistral AI, supporting 32k context wind

 By the end of this blog post, you will learn how to effectively utilize Instructor with Mistral Large.

+Let's first install the instructor client with Mistral support:
+
+=== "UV (Recommended)"
+    ```bash
+    # Install UV if you haven't already
+    curl -LsSf https://astral.sh/uv/install.sh | sh
+
+    # Install instructor
+    uv pip install "instructor[mistral]"
+    ```
+
+=== "pip"
+    ```bash
+    pip install "instructor[mistral]"
+    ```
+
 ```python
 import os
 from pydantic import BaseModel

diff --git a/docs/integrations/ollama.md b/docs/integrations/ollama.md
index f7254bf2e..4848fa835 100644
--- a/docs/integrations/ollama.md
+++ b/docs/integrations/ollama.md
@@ -15,6 +15,22 @@ authors:

 This guide demonstrates how to use Ollama with Instructor to generate structured outputs. You'll learn how to use JSON schema mode with local LLMs to create type-safe responses.

+Let's first install the instructor client with Ollama support:
+
+=== "UV (Recommended)"
+    ```bash
+    # Install UV if you haven't already
+    curl -LsSf https://astral.sh/uv/install.sh | sh
+
+    # Install instructor with Ollama support
+    uv pip install "instructor[ollama]"
+    ```
+
+=== "pip"
+    ```bash
+    pip install "instructor[ollama]"
+    ```
+
-Open-source LLMS are gaining popularity, and the release of Ollama's OpenAI compatibility later it has made it possible to obtain structured outputs using JSON schema.
+Open-source LLMs are gaining popularity, and the release of Ollama's OpenAI compatibility layer has made it possible to obtain structured outputs using JSON schema.

 By the end of this blog post, you will learn how to effectively utilize instructor with ollama. But before we proceed, let's first explore the concept of patching.

diff --git a/docs/integrations/openai.md b/docs/integrations/openai.md
index d8dd58ad1..d17898fc5 100644
--- a/docs/integrations/openai.md
+++ b/docs/integrations/openai.md
@@ -11,17 +11,19 @@ OpenAI is the primary integration for Instructor, offering robust support for st

 Instructor comes with support for OpenAI out of the box, so you don't need to install anything extra.

-```bash
-pip install "instructor"
-```
+=== "UV (Recommended)"
+    ```bash
+    # Install UV if you haven't already
+    curl -LsSf https://astral.sh/uv/install.sh | sh
+
+    # Install instructor
+    uv pip install "instructor"
+    ```
+
+=== "pip"
+    ```bash
+    pip install "instructor"
+    ```

 ⚠️ **Important**: You must set your OpenAI API key before using the client. You can do this in two ways:

 1. Set the environment variable:

 ```bash
 export OPENAI_API_KEY='your-api-key-here'
 ```

 2. Or provide it directly to the client:

diff --git a/docs/integrations/together.md b/docs/integrations/together.md
index 273c9c00d..48c007d66 100644
--- a/docs/integrations/together.md
+++ b/docs/integrations/together.md
@@ -15,6 +15,22 @@ authors:

 This guide demonstrates how to use Together AI with Instructor to generate structured outputs. You'll learn how to use function calling with Together's models to create type-safe responses.

+Let's first install the instructor client with Together AI support:
+
+=== "UV (Recommended)"
+    ```bash
+    # Install UV if you haven't already
+    curl -LsSf https://astral.sh/uv/install.sh | sh
+
+    # Install instructor with Together AI support
+    uv pip install "instructor[together]"
+    ```
+
+=== "pip"
+    ```bash
+    pip install "instructor[together]"
+    ```
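+
+With that installed, a minimal sketch of patching a Together client might look like this. It assumes the `openai` SDK, a `TOGETHER_API_KEY` environment variable, and one of Together's function-calling models:
+
+```python
+import os
+import instructor
+import openai
+
+# Together exposes an OpenAI-compatible API, so we reuse the openai client
+client = instructor.from_openai(
+    openai.OpenAI(
+        base_url="https://api.together.xyz/v1",
+        api_key=os.environ["TOGETHER_API_KEY"],
+    ),
+    mode=instructor.Mode.TOOLS,
+)
+```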
+
-Open-source LLMS are gaining popularity, and with the release of Together's Function calling models, its been easier than ever to get structured outputs.
+Open-source LLMs are gaining popularity, and with the release of Together's function calling models, it's been easier than ever to get structured outputs.

 By the end of this blog post, you will learn how to effectively utilize instructor with Together AI. But before we proceed, let's first explore the concept of patching.

diff --git a/docs/integrations/vertex.md b/docs/integrations/vertex.md
index 79ead60c2..5c4c1b58c 100644
--- a/docs/integrations/vertex.md
+++ b/docs/integrations/vertex.md
@@ -9,11 +9,24 @@ Google Cloud's Vertex AI provides enterprise-grade AI capabilities with robust s

 ## Quick Start

-Install Instructor with Vertex AI support. You can do so by running the command below.
-
-```bash
-pip install "instructor[vertexai]"
-```
+Let's first install the instructor client with Vertex AI support:
+
+=== "UV (Recommended)"
+    ```bash
+    # Install UV if you haven't already
+    curl -LsSf https://astral.sh/uv/install.sh | sh
+
+    # Install instructor
+    uv pip install "instructor[vertex]"
+    ```
+
+=== "pip"
+    ```bash
+    pip install "instructor[vertex]"
+    ```

 ## Simple User Example (Sync)

diff --git a/docs/integrations/writer.md b/docs/integrations/writer.md
index 4c5c9674f..e6a3a956f 100644
--- a/docs/integrations/writer.md
+++ b/docs/integrations/writer.md
@@ -11,8 +11,31 @@ You'll need to sign up for an account and get an API key. You can do that [here]

 ```bash
 export WRITER_API_KEY=
-pip install "instructor[writer]"
 ```

+Let's first install the instructor client with writer support:
+
+=== "UV (Recommended)"
+    ```bash
+    # Install UV if you haven't already
+    curl -LsSf https://astral.sh/uv/install.sh | sh
+
+    # Install instructor
+    uv pip install "instructor[writer]"
+    ```
+
+=== "pip"
+    ```bash
+    pip install "instructor[writer]"
+    ```

 ## Palmyra-X-004
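+
+As a sketch of what usage might look like here, assuming the Writer SDK's `Writer` client, a `WRITER_API_KEY` in your environment, and instructor's `from_writer` helper:
+
+```python
+import instructor
+from writerai import Writer
+
+# Writer() reads WRITER_API_KEY from the environment by default
+client = instructor.from_writer(Writer())
+```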