fix: make llama.cpp Chat Generator compatible with new ChatMessage
#1254
Conversation
    if formatted_msg["role"] == "tool":
        formatted_msg["name"] = message.tool_call_result.origin.tool_name
        formatted_msg["content"] = message.tool_call_result.result
Starting from 2.9.0, Haystack function messages will be automatically converted to tool messages. This change ensures compatibility.
    if name:
        if hasattr(reply, "_name"):
            reply._name = name  # new ChatMessage
        elif hasattr(reply, "name"):
            reply.name = name  # legacy ChatMessage
once we release 2.9.0, we can get rid of this ugly check
👍 After the 2.9.0 release, we may consider refactoring this code into a _create_reply method.
Related Issues
#1249: new ChatMessage format; read the issue for details

Proposed Changes:
Make the llama.cpp Chat Generator compatible with the new ChatMessage.
How did you test it?
CI; local tests with Haystack main branch
Checklist
The PR title starts with one of: fix:, feat:, build:, chore:, ci:, docs:, style:, refactor:, perf:, test:.