Anthropic batch mode not available #225
Example (from the batch console welcome page): stdout and batch output (snippets not preserved).
---
Details / meaningful differences from OpenAI (https://docs.anthropic.com/en/docs/build-with-claude/message-batches):
- Different limits
- List instead of file content
- Different batch statuses
- Different request statuses
- Recommend streaming finished requests instead of downloading all of them
- Different errors
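On the "streaming finished requests" and "different request statuses" points above: per the docs linked above, each streamed result carries a `result.type` of `succeeded`, `errored`, `canceled`, or `expired`. A minimal sketch for grouping streamed results by type (the function name and plain-dict shapes are illustrative, not from the codebase):

```python
from collections import defaultdict


def group_batch_results(results):
    """Group Anthropic batch results by their result type.

    Assumes each item is shaped like the documented result objects:
    {"custom_id": ..., "result": {"type": "succeeded" | "errored" |
    "canceled" | "expired", ...}}. `results` can be any iterable, so a
    streaming iterator from the SDK would work without buffering everything.
    """
    groups = defaultdict(list)
    for item in results:
        groups[item["result"]["type"]].append(item)
    return dict(groups)
```

This lets us handle succeeded requests as they arrive and collect errored ones for retry, rather than downloading one big file as in the OpenAI flow.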
---
Examples: https://docs.anthropic.com/en/api/messages-batch-examples
- Polling example shows an interval of 60 seconds
- When retrieving results
- Cancelling a batch
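A polling loop matching the documented 60-second interval could look like the sketch below. `retrieve` is injected in place of the real SDK call (e.g. `client.messages.batches.retrieve`) so the sketch stays self-contained; the batch is represented as a plain dict with the documented `processing_status` field, which reaches `"ended"` when processing is done.

```python
import time


def wait_for_batch(retrieve, batch_id, poll_interval=60, sleep=time.sleep):
    """Poll until the batch's processing_status reaches 'ended'.

    `retrieve` stands in for the SDK's batch-retrieval call and `sleep`
    is injectable so the loop can be tested without real waiting.
    """
    while True:
        batch = retrieve(batch_id)
        if batch["processing_status"] == "ended":
            return batch
        sleep(poll_interval)
```

Note that `"ended"` only means processing finished; individual requests inside the batch may still have errored and need to be checked per-result.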
---
API Reference, notable differences from OpenAI:
- System prompt is a parameter, not a message
- How is structured output done? Through tool use? @CharlieJCJ will provide the details based on the litellm work

---
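The system-prompt difference above can be handled with a small adapter (hypothetical helper, not from the codebase) that lifts OpenAI-style `role: "system"` messages into Anthropic's top-level `system` parameter:

```python
def to_anthropic_params(messages):
    """Split an OpenAI-style message list into Anthropic's (system, messages).

    Anthropic takes the system prompt as a top-level `system` parameter
    rather than as a message with role "system", so we pull those out and
    pass the remaining turns through unchanged.
    """
    system_parts = [m["content"] for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    return "\n".join(system_parts), rest
```

The returned pair maps onto `client.messages.create(system=..., messages=...)` in the Python SDK.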
We can't store metadata in the batch, so we will need to store a map of request_file to batch_id.

---
They just increased the limits significantly for batch:
https://docs.anthropic.com/en/docs/build-with-claude/message-batches#batch-limitations

---
Instead use a single batch_objects_file (not submitted / downloaded) for each.

---
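One possible layout for that batch_objects_file is a JSONL file with one `{request_file, batch_id}` object per line (the file layout and helper names here are illustrative assumptions, not an existing format):

```python
import json


def record_batch(batch_objects_file, request_file, batch_id):
    """Append one request_file -> batch_id mapping as a JSONL line.

    Appending keeps submission resumable: re-running only adds entries
    for newly submitted request files.
    """
    with open(batch_objects_file, "a") as f:
        f.write(json.dumps({"request_file": request_file, "batch_id": batch_id}) + "\n")


def load_batch_map(batch_objects_file):
    """Load the full request_file -> batch_id map back from the JSONL file."""
    with open(batch_objects_file) as f:
        return {o["request_file"]: o["batch_id"] for o in map(json.loads, f)}
```

This stands in for the batch-level metadata we would have used on the OpenAI side.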
When doing … I get …; do … to get … (code and output snippets not preserved).
---
This is what a failed request looks like in the output
This is what a successful request looks like
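The JSON snippets themselves weren't captured here; per the batch results docs, the two shapes are roughly as follows (field values are illustrative, and the nested error detail varies):

```python
# Illustrative result shapes only; see the batch API reference for the
# authoritative schema. Each line of the results stream is one such object.
failed = {
    "custom_id": "request-1",
    "result": {
        "type": "errored",
        "error": {"type": "invalid_request", "message": "..."},
    },
}

succeeded = {
    "custom_id": "request-2",
    "result": {
        "type": "succeeded",
        # On success, `message` is a full Messages API response object.
        "message": {
            "role": "assistant",
            "content": [{"type": "text", "text": "..."}],
        },
    },
}
```

So the dispatch key when processing output is `result["type"]`, with `custom_id` used to join results back to the original requests.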
---
Now I want to add structured output support for batch mode. While poking around, I see that instructor supports batch mode.
We are currently handling the JSON payload manually for both online and batch for OpenAI, and doing structured output via
In the instructor library, here is how they define the OpenAI vs Anthropic request JSON
and response JSON
So we can use the handle_response_model function, or do it ourselves.
https://github.com/instructor-ai/instructor/blob/main/instructor/process_response.py#L735, which, in effect, just prompts in the system message with the schema. There is also a tool-use mode, Mode.ANTHROPIC_TOOLS, but for some reason the instructor batch CLI uses Mode.ANTHROPIC_JSON: https://github.com/instructor-ai/instructor/blob/main/instructor/process_response.py#L336
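A rough sketch of what that ANTHROPIC_JSON-style prompting amounts to (the wording below is illustrative, not instructor's exact prompt text):

```python
import json


def json_mode_system_prompt(schema: dict) -> str:
    """Build a system prompt asking for JSON matching a schema.

    This mirrors the Mode.ANTHROPIC_JSON approach: no tools, just an
    instruction plus the serialized JSON schema in the system prompt.
    """
    return (
        "You must respond only with a JSON object that matches this JSON schema:\n"
        + json.dumps(schema)
    )
```

The weakness of this mode is that nothing constrains the model output, so responses can still include non-JSON text, which is the failure mode discussed below.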
Anthropic docs on JSON output: the documentation on tool use gives the system prompt it uses.
For structured output, we would want to force tool use. They suggest using tool use when JSON output is needed, regardless of whether it is actually a tool. This is exactly what Mode.ANTHROPIC_TOOLS does.
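Forcing tool use as described maps onto the `tools` / `tool_choice` request parameters: define a single tool whose `input_schema` is the desired output schema, then force the model to call it. A sketch (the helper name and tool description are made up):

```python
def structured_output_params(name: str, schema: dict) -> dict:
    """Build Anthropic request params that force one 'extraction' tool.

    The model must then emit tool-call arguments matching `schema`,
    which is the Mode.ANTHROPIC_TOOLS idea: tool use as a structured
    output channel rather than an actual tool.
    """
    return {
        "tools": [
            {
                "name": name,
                "description": "Return the structured output.",
                "input_schema": schema,
            }
        ],
        "tool_choice": {"type": "tool", "name": name},
    }
```

These params would be merged into each request in the batch; the structured result then comes back as the `input` of the `tool_use` content block.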
---
We might want to prefill the response with the start of the model JSON schema (as suggested in the doc above). In one run, 1/5 responses failed because it had "Claude’s friendly preamble".
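A sketch of the prefill idea: send the opening brace as a prefilled assistant turn, and since the API returns only the continuation, re-attach the prefill before parsing (helper is hypothetical):

```python
import json


def parse_prefilled_json(prefill: str, completion_text: str):
    """Re-attach the assistant prefill before parsing the model output.

    With a prefilled assistant turn (e.g. '{'), the model cannot emit a
    preamble before the JSON, and the returned text starts mid-object,
    so the prefill must be prepended to get valid JSON.
    """
    return json.loads(prefill + completion_text)
```

The prefill would be sent as a trailing `{"role": "assistant", "content": "{"}` message in each batch request.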
---
Model information: https://docs.anthropic.com/en/docs/about-claude/models#model-comparison-table
Batch Doc: https://docs.anthropic.com/en/docs/build-with-claude/message-batches
Batch Examples: https://docs.anthropic.com/en/api/messages-batch-examples
Batch API Reference: https://docs.anthropic.com/en/api/creating-message-batches
Python SDK: https://github.com/anthropics/anthropic-sdk-python
Console batch: https://console.anthropic.com/settings/workspaces/default/batches
Console logs: https://console.anthropic.com/settings/logs
Console usage: https://console.anthropic.com/settings/usage
Discord: https://discord.com/channels/1072196207201501266/1114301961831981107
Thread about batches in discord: https://discord.com/channels/1072196207201501266/1114305823187943506/threads/1315560776374681702
Form to submit batch feedback: https://docs.google.com/forms/d/e/1FAIpQLSfNBdq0FK_WNzvcotvD0HVrFc1c-LrugFV3Rwb6QlTWNceIJQ/viewform