Releases: jackmpcollins/magentic

v0.26.0

27 May 06:06

What's Changed

Example of non-streamed response with usage immediately available

from magentic import OpenaiChatModel, UserMessage

chat_model = OpenaiChatModel("gpt-3.5-turbo", seed=42)
message = chat_model.complete(messages=[UserMessage("Say hello!")])

print(message.usage)
# > Usage(input_tokens=10, output_tokens=9)

Example of streamed response where usage only becomes available after the stream has been processed

from magentic import OpenaiChatModel, UserMessage
from magentic.streaming import StreamedStr

chat_model = OpenaiChatModel("gpt-3.5-turbo", seed=42)
message = chat_model.complete(messages=[UserMessage("Say hello!")], output_types=[StreamedStr])

print(message.usage)
# > None because the stream has not been processed yet

# Process the stream (convert StreamedStr to str)
str(message.content)

print(message.usage)
# > Usage(input_tokens=10, output_tokens=9)

Full Changelog: v0.25.0...v0.26.0

v0.25.0

23 May 06:55

What's Changed

  • Switch AnthropicChatModel to use streaming by @jackmpcollins in #215. StreamedStr now streams correctly, but object streaming is waiting on Anthropic support for streaming array responses.
     from magentic import prompt, StreamedStr
     from magentic.chat_model.anthropic_chat_model import AnthropicChatModel
     
     
     @prompt(
         "Tell me about {topic}.",
         model=AnthropicChatModel("claude-3-opus-20240229"),
     )
     def tell_me_about(topic: str) -> StreamedStr: ...
     
     
     for chunk in tell_me_about("chocolate"):
         print(chunk, end="", flush=True)
  • Add optional custom_llm_provider param for litellm by @entropi in #221 (see the sketch after this list)
  • Add tests for LiteLLM async callbacks by @jackmpcollins in #223
  • Tidy up: Combine openai streamed_tool_call functions by @jackmpcollins in #225
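
A minimal sketch of the new custom_llm_provider parameter, which is passed through to litellm. The model name, api_base, and provider value below are illustrative placeholders, not values from the PR:

from magentic import prompt
from magentic.chat_model.litellm_chat_model import LitellmChatModel


@prompt(
    "Say hello to {name}.",
    # "openai" tells litellm to treat the endpoint as OpenAI-compatible;
    # the model name and api_base are placeholders for your deployment.
    model=LitellmChatModel(
        "my-hosted-model",
        api_base="http://localhost:8000/v1",
        custom_llm_provider="openai",
    ),
)
def greet(name: str) -> str: ...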

New Contributors

  • @entropi made their first contribution in #221

Full Changelog: v0.24.0...v0.25.0

v0.25.0a0

16 May 08:04
Pre-release

Prerelease for testing PR #214

v0.24.0

16 May 05:10

Warning

The default model for magentic is now gpt-4o instead of gpt-4-turbo. See Configuration for how to change this.
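
To keep the previous default, set the model via magentic's environment variables or override it in code. A minimal sketch using the documented configuration options (the environment variable must be set before magentic is imported):

import os

# Revert the default OpenAI model via environment variable
# (can also be set in your shell or a .env file)
os.environ["MAGENTIC_OPENAI_MODEL"] = "gpt-4-turbo"

from magentic import OpenaiChatModel, prompt


@prompt("Say hello to {name}.")
def greet(name: str) -> str: ...


# Or override the model for a specific block of code
with OpenaiChatModel("gpt-4-turbo"):
    greet("World")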


Full Changelog: v0.23.0...v0.24.0

v0.23.0

07 May 05:20

What's Changed

  • 🦙 Ollama can now return structured outputs / function calls (it takes a little prompting to make it reliable).
from magentic import prompt
from magentic.chat_model.litellm_chat_model import LitellmChatModel


@prompt(
    "Count to {n}. Use the tool to return in the format [1, 2, 3, ...]",
    model=LitellmChatModel("ollama_chat/llama2", api_base="http://localhost:11434")
)
def count_to(n: int) -> list[int]: ...


count_to(5)
# > [1, 2, 3, 4, 5]


Full Changelog: v0.22.0...v0.23.0

v0.22.0

04 May 07:00

What's Changed

  • 🚀 Forced function calling using the new tool_choice: "required" argument from OpenAI. This means no more StructuredOutputError caused by the model returning a string when str was not in the return annotation. This applies to prompt-functions with a union return type; single return types were already forced. See the sketch below.
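
For example, a prompt-function with a union of structured return types can no longer fall back to plain text. A hypothetical sketch (the model classes are illustrative, not from the release):

from magentic import prompt
from pydantic import BaseModel


class Superhero(BaseModel):
    name: str
    power: str


class Villain(BaseModel):
    name: str
    evil_scheme: str


# The model is now forced to call one of the two tools,
# so the result is always a Superhero or a Villain, never a str.
@prompt("Create a character based on {noun}.")
def create_character(noun: str) -> Superhero | Villain: ...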


Full Changelog: v0.21.1...v0.22.0

v0.21.1

29 Apr 08:17

What's Changed

Full Changelog: v0.21.0...v0.21.1


The error message raised when the model returns a string that was not expected now contains the start of the returned string. For example:

StructuredOutputError: String was returned by model but not expected. You may need to update your prompt to encourage the model to return a specific type. Model output: '{ "name": "return_list_of_int", "arguments": { "properties": { "value": { "items": [1, 2, 3], [...]'
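
This makes a failing prompt-function easier to debug by inspecting the raw model output. A sketch of catching the error (the import path for StructuredOutputError is an assumption):

from magentic import prompt

# Assumed import path for the exception class
from magentic.chat_model.base import StructuredOutputError


@prompt("Count to three.")
def return_list_of_int() -> list[int]: ...


try:
    return_list_of_int()
except StructuredOutputError as e:
    # The message now includes the start of the model's raw output
    print(e)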

v0.21.0

22 Apr 06:46

What's Changed

Full Changelog: v0.20.1...v0.21.0


Mistral API is now supported natively 🚀, with full support for StreamedStr, ParallelFunctionCall, etc. Example:

from magentic import prompt
from magentic.chat_model.mistral_chat_model import MistralChatModel
from pydantic import BaseModel


class Superhero(BaseModel):
    name: str
    age: int
    power: str
    enemies: list[str]


@prompt(
    """Create a Superhero named {name}""",
    model=MistralChatModel("mistral-large-latest"),
)
def create_superhero(name: str) -> Superhero: ...


create_superhero("Garden Man")
# Superhero(name='Garden Man', age=35, power='Plant control', enemies=['Smog', 'Deforestator'])

v0.20.1

19 Apr 18:29

What's Changed

  • Add support for passing metadata on LitellmChatModel by @hspak in #185

Full Changelog: v0.20.0...v0.20.1


Example of passing litellm metadata

from magentic import prompt
from magentic.chat_model.litellm_chat_model import LitellmChatModel

# Superhero is the pydantic model from the v0.21.0 example above
@prompt(
    "Create a Superhero named {name}.",
    model=LitellmChatModel("gpt-4", metadata={"foo": "bar"}),
)
def create_superhero(name: str) -> Superhero: ...

v0.20.0

19 Apr 06:47

Warning

The default model for magentic is now gpt-4-turbo instead of gpt-3.5-turbo. See Configuration for how to change this.

What's Changed

Full Changelog: v0.19.0...v0.20.0


Having gpt-4-turbo as the default model enables using vision together with function calling by default:

from pydantic import BaseModel, Field

from magentic import chatprompt, UserMessage
from magentic.vision import UserImageMessage


IMAGE_URL_WOODEN_BOARDWALK = "https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg"


class ImageDetails(BaseModel):
    description: str = Field(description="A brief description of the image.")
    name: str = Field(description="A short name.")


@chatprompt(
    UserMessage("Describe the following image in one sentence."),
    UserImageMessage(IMAGE_URL_WOODEN_BOARDWALK),
)
def describe_image() -> ImageDetails: ...


image_details = describe_image()
print(image_details.name)
# 'Wooden Boardwalk in Green Wetland'
print(image_details.description)
# 'A serene wooden boardwalk meanders through a lush green wetland under a blue sky dotted with clouds.'