
v0.25.0

@jackmpcollins jackmpcollins released this 23 May 06:55
· 63 commits to main since this release

What's Changed

  • Switch AnthropicChatModel to use streaming by @jackmpcollins in #215. StreamedStr now streams correctly; object streaming is still pending Anthropic support for streaming array responses.
     from magentic import prompt, StreamedStr
     from magentic.chat_model.anthropic_chat_model import AnthropicChatModel
     
     
     @prompt(
         "Tell me about {topic}.",
         model=AnthropicChatModel("claude-3-opus-20240229"),
     )
     def tell_me_about(topic: str) -> StreamedStr: ...
     
     
     for chunk in tell_me_about("chocolate"):
         print(chunk, end="", flush=True)
  • add optional custom_llm_provider param for litellm by @entropi in #221
  • Add tests for LiteLLM async callbacks by @jackmpcollins in #223
  • Tidy up: Combine openai streamed_tool_call functions by @jackmpcollins in #225
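The new `custom_llm_provider` parameter for `LitellmChatModel` can be used roughly as follows (a minimal sketch: the model name and provider value are illustrative placeholders, and actually calling the function requires the matching API credentials):

```python
from magentic import prompt
from magentic.chat_model.litellm_chat_model import LitellmChatModel


@prompt(
    "Say hello to {name}.",
    model=LitellmChatModel(
        "my-deployed-model",  # hypothetical model name for illustration
        custom_llm_provider="openai",  # tell litellm which provider API to use
    ),
)
def greet(name: str) -> str: ...
```

This is useful when the model name alone is not enough for litellm to infer the provider, e.g. for self-hosted OpenAI-compatible endpoints.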

New Contributors

Full Changelog: v0.24.0...v0.25.0