
Commit

docs: Update integration docs to recommend UV package manager
devin-ai-integration[bot] committed Nov 24, 2024
1 parent 58eef74 commit 9b16a21
Showing 16 changed files with 292 additions and 102 deletions.
69 changes: 12 additions & 57 deletions docs/integrations/anthropic.md
@@ -7,69 +7,24 @@ description: Learn how to combine Anthropic and Instructor clients to create use

Now that we have an [Anthropic](https://www.anthropic.com/) client, we can use it with the `instructor` client to make requests.

Let's first install the instructor client with anthropic support:

=== "UV (Recommended)"
    ```bash
    # Install UV if you haven't already
    curl -LsSf https://astral.sh/uv/install.sh | sh

    # Install instructor
    uv pip install "instructor[anthropic]"
    ```

=== "pip"
    ```bash
    pip install "instructor[anthropic]"
    ```

Once we've done so, getting started is as simple as using our `from_anthropic` method to patch the client up.

```python
from typing import List

import anthropic
import instructor
from pydantic import BaseModel

# Patching the Anthropic client with instructor for enhanced capabilities
client = instructor.from_anthropic(
    anthropic.Anthropic(),
)


class Properties(BaseModel):
    name: str
    value: str


class User(BaseModel):
    name: str
    age: int
    properties: List[Properties]


# client.messages.create will also work due to the instructor client
user_response = client.chat.completions.create(
    model="claude-3-haiku-20240307",
    max_tokens=1024,
    max_retries=0,
    messages=[
        {
            "role": "user",
            "content": "Create a user for a model with a name, age, and properties.",
        }
    ],
    response_model=User,
)  # type: ignore

print(user_response.model_dump_json(indent=2))
"""
{
  "name": "John Doe",
  "age": 35,
  "properties": [
    {
      "name": "City",
      "value": "New York"
    },
    {
      "name": "Occupation",
      "value": "Software Engineer"
    }
  ]
}
"""
```

## Streaming Support

23 changes: 11 additions & 12 deletions docs/integrations/azure.md
@@ -9,23 +9,22 @@ This guide demonstrates how to use Azure OpenAI with instructor for structured o

## Installation

We can use the same installation as we do for OpenAI since the default `openai` client ships with an AzureOpenAI client.

First, install the required dependencies:

=== "UV (Recommended)"
    ```bash
    # Install UV if you haven't already
    curl -LsSf https://astral.sh/uv/install.sh | sh

    # Install instructor
    uv pip install "instructor[azure]"
    ```

=== "pip"
    ```bash
    pip install "instructor[azure]"
    ```

Next, make sure that you've enabled Azure OpenAI in your Azure account and have a deployment for the model you'd like to use. [Here is a guide to get started](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/create-resource?pivots=web-portal)

Once you've done so, you'll have an endpoint and an API key that you can use to configure the client.

```bash
instructor.exceptions.InstructorRetryException: Error code: 401 - {'statusCode': 401, 'message': 'Unauthorized. Access token is missing, invalid, audience is incorrect (https://cognitiveservices.azure.com), or have expired.'}
```

If you see an error like the one above, make sure you've set the correct endpoint and API key in the client.
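A minimal sketch of that configuration, assuming instructor's `from_openai` patch and the `AzureOpenAI` client from the `openai` package (the endpoint, key, and API version below are placeholders, not values from this guide):

```python
import instructor
from openai import AzureOpenAI

# Placeholder credentials -- substitute your deployment's endpoint and key
client = instructor.from_openai(
    AzureOpenAI(
        api_key="<your-api-key>",
        api_version="2024-02-01",
        azure_endpoint="https://<your-resource>.openai.azure.com",
    )
)
```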

## Authentication

27 changes: 25 additions & 2 deletions docs/integrations/cerebras.md
@@ -12,8 +12,31 @@ Cerebras provides hardware-accelerated AI models optimized for high-performance
Install Instructor with Cerebras support:

=== "UV (Recommended)"
    ```bash
    # Install UV if you haven't already
    curl -LsSf https://astral.sh/uv/install.sh | sh

    # Install instructor
    uv pip install "instructor[cerebras]"
    ```

=== "pip"
    ```bash
    pip install "instructor[cerebras]"
    ```

## Simple User Example (Sync)

13 changes: 9 additions & 4 deletions docs/integrations/cohere.md
@@ -13,10 +13,15 @@ You'll need a cohere API key which can be obtained by signing up [here](https://

## Setup

=== "UV (Recommended)"
    ```bash
    uv pip install "instructor[cohere]"
    ```

=== "pip"
    ```bash
    pip install "instructor[cohere]"
    ```

Export your key:

24 changes: 22 additions & 2 deletions docs/integrations/deepseek.md
@@ -14,8 +14,28 @@ This guide covers everything you need to know about using DeepSeek with Instruct
Instructor comes with support for the OpenAI Client out of the box, so you don't need to install anything extra.

=== "UV (Recommended)"
    ```bash
    # Install UV if you haven't already
    curl -LsSf https://astral.sh/uv/install.sh | sh

    # Install instructor
    uv pip install "instructor[deepseek]"
    ```

=== "pip"
    ```bash
    pip install "instructor[deepseek]"
    ```

⚠️ **Important**: You must set your DeepSeek API key before using the client. You can do this in two ways:
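A sketch of how such a key is typically supplied to DeepSeek's OpenAI-compatible client — the `DEEPSEEK_API_KEY` variable name and `base_url` here are assumptions, not values stated in this excerpt:

```python
import os

from openai import OpenAI

# Option A: read the key from an environment variable
client = OpenAI(
    api_key=os.environ.get("DEEPSEEK_API_KEY"),
    base_url="https://api.deepseek.com",
)

# Option B: pass the key directly (avoid hard-coding real keys in source)
client = OpenAI(api_key="<your-api-key>", base_url="https://api.deepseek.com")
```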

27 changes: 25 additions & 2 deletions docs/integrations/fireworks.md
@@ -12,8 +12,31 @@ Fireworks provides efficient and cost-effective AI models with enterprise-grade
Install Instructor with Fireworks support:

=== "UV (Recommended)"
    ```bash
    # Install UV if you haven't already
    curl -LsSf https://astral.sh/uv/install.sh | sh

    # Install instructor
    uv pip install "instructor[fireworks]"
    ```

=== "pip"
    ```bash
    pip install "instructor[fireworks]"
    ```

## Simple User Example (Sync)

21 changes: 18 additions & 3 deletions docs/integrations/google.md
@@ -11,9 +11,24 @@ This guide will show you how to use Instructor with the Google.GenerativeAI libr

Google's Gemini models provide powerful AI capabilities with multimodal support. This guide shows you how to use Instructor with Google's Gemini models for type-safe, validated responses.

Let's first install the instructor client with Google/Gemini support:

=== "UV (Recommended)"
    ```bash
    # Install UV if you haven't already
    curl -LsSf https://astral.sh/uv/install.sh | sh

    # Install instructor
    uv pip install "instructor[google]"
    ```

=== "pip"
    ```bash
    pip install "instructor[google]"
    ```

## Simple User Example (Sync)

24 changes: 23 additions & 1 deletion docs/integrations/groq.md
@@ -11,9 +11,31 @@ you'll need to sign up for an account and get an API key. You can do that [here]

```bash
export GROQ_API_KEY=<your-api-key-here>
```

Let's first install the instructor client with groq support:

=== "UV (Recommended)"
    ```bash
    # Install UV if you haven't already
    curl -LsSf https://astral.sh/uv/install.sh | sh

    # Install instructor
    uv pip install "instructor[groq]"
    ```

=== "pip"
    ```bash
    pip install "instructor[groq]"
    ```


## Groq AI

Groq supports structured outputs with their new `llama-3-groq-70b-8192-tool-use-preview` model.
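A minimal sketch of what structured output with that model might look like, assuming instructor's `from_groq` patch and a `GROQ_API_KEY` in the environment (the prompt and model fields are illustrative):

```python
import instructor
from groq import Groq
from pydantic import BaseModel


class User(BaseModel):
    name: str
    age: int


# Patch the Groq client so response_model is accepted
client = instructor.from_groq(Groq(), mode=instructor.Mode.TOOLS)

user = client.chat.completions.create(
    model="llama-3-groq-70b-8192-tool-use-preview",
    messages=[{"role": "user", "content": "Extract: Daniel is 30 years old"}],
    response_model=User,
)
```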
27 changes: 25 additions & 2 deletions docs/integrations/litellm.md
@@ -12,8 +12,31 @@ LiteLLM provides a unified interface for multiple LLM providers, making it easy
Install Instructor with LiteLLM support:

=== "UV (Recommended)"
    ```bash
    # Install UV if you haven't already
    curl -LsSf https://astral.sh/uv/install.sh | sh

    # Install instructor
    uv pip install "instructor[litellm]"
    ```

=== "pip"
    ```bash
    pip install "instructor[litellm]"
    ```

## Simple User Example (Sync)

16 changes: 16 additions & 0 deletions docs/integrations/llama-cpp-python.md
@@ -16,6 +16,22 @@ This guide demonstrates how to use llama-cpp-python with Instructor to generate

Open-source LLMs are gaining popularity, and llama-cpp-python makes it possible to obtain structured outputs from `llama-cpp` models using JSON schema via a mixture of [constrained sampling](https://llama-cpp-python.readthedocs.io/en/latest/#json-schema-mode) and [speculative decoding](https://llama-cpp-python.readthedocs.io/en/latest/#speculative-decoding).

Let's first install the instructor client with llama-cpp-python support:

=== "UV (Recommended)"
    ```bash
    # Install UV if you haven't already
    curl -LsSf https://astral.sh/uv/install.sh | sh

    # Install instructor with llama-cpp-python support
    uv pip install "instructor[llama]"
    ```

=== "pip"
    ```bash
    pip install "instructor[llama]"
    ```

They also support an [OpenAI compatible client](https://llama-cpp-python.readthedocs.io/en/latest/#openai-compatible-web-server), which can be used to obtain structured output as an in-process mechanism that avoids any network dependency.
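A sketch of that in-process route, assuming `instructor.patch` over llama-cpp-python's OpenAI-compatible `create_chat_completion_openai_v1` method with JSON-schema mode (the model path is a placeholder, and the parameter choices are illustrative):

```python
import llama_cpp
import instructor
from pydantic import BaseModel

# Load a local GGUF model; no server or network involved
llama = llama_cpp.Llama(
    model_path="<path-to-your-gguf-model>",
    n_gpu_layers=-1,
    chat_format="chatml",
    verbose=False,
)

# Patch the OpenAI-compatible entry point; JSON_SCHEMA mode drives the
# constrained sampling mentioned above
create = instructor.patch(
    create=llama.create_chat_completion_openai_v1,
    mode=instructor.Mode.JSON_SCHEMA,
)


class User(BaseModel):
    name: str
    age: int


user = create(
    messages=[{"role": "user", "content": "Extract: Jason is 25 years old"}],
    response_model=User,
)
```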

<!-- more -->
19 changes: 19 additions & 0 deletions docs/integrations/mistral.md
@@ -18,6 +18,25 @@ Mistral Large is the flagship model from Mistral AI, supporting 32k context wind

By the end of this blog post, you will learn how to effectively utilize Instructor with Mistral Large.

Let's first install the instructor client with Mistral support:

=== "UV (Recommended)"
    ```bash
    # Install UV if you haven't already
    curl -LsSf https://astral.sh/uv/install.sh | sh

    # Install instructor
    uv pip install "instructor[mistral]"
    ```

=== "pip"
    ```bash
    pip install "instructor[mistral]"
    ```

```python
import os
from pydantic import BaseModel
```