v0.25.0
What's Changed
- Switch AnthropicChatModel to use streaming by @jackmpcollins in #215
  `StreamedStr` now streams correctly, but object streaming is waiting on Anthropic support for streaming array responses.

  ```python
  from magentic import prompt, StreamedStr
  from magentic.chat_model.anthropic_chat_model import AnthropicChatModel


  @prompt(
      "Tell me about {topic}.",
      model=AnthropicChatModel("claude-3-opus-20240229"),
  )
  def tell_me_about(topic: str) -> StreamedStr: ...


  for chunk in tell_me_about("chocolate"):
      print(chunk, end="", flush=True)
  ```
- add optional custom_llm_provider param for litellm by @entropi in #221
- Add tests for LiteLLM async callbacks by @jackmpcollins in #223
- Tidy up: Combine openai streamed_tool_call functions by @jackmpcollins in #225
New Contributors
- @entropi made their first contribution in #221
Full Changelog: v0.24.0...v0.25.0