Describe the bug
Hello, thank you for providing Haystack, such a useful LLM framework.
I found that Ollama now supports the function-calling feature (ollama/ollama@d02bbeb).
I then tested it using Haystack's OllamaChatGenerator interface; the test code was along the following lines.
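A minimal sketch of this kind of call, assuming a local Ollama server on its default URL with the mistral model pulled (the prompt and setup are illustrative, not the exact original snippet):

```python
from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.ollama import OllamaChatGenerator

# Assumes a local Ollama server on its default URL with the mistral model pulled.
generator = OllamaChatGenerator(model="mistral")

# Note: OllamaChatGenerator currently exposes no parameter for passing a
# tools/functions specification, which is the gap reported in this issue.
result = generator.run(
    messages=[ChatMessage.from_user("What is the current weather in Beijing?")]
)
print(result["replies"])
```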
When using the OllamaChatGenerator to call the Ollama interface, the returned result is as follows; that is, there is no function call in the result.
{'replies': [ChatMessage(content=" To provide you with the current weather in Beijing, I would typically use a service like OpenWeatherMap API. However, as a text-based AI model, I don't have real-time capabilities to access external data sources directly. You can check the current weather in Beijing by visiting a reliable weather website or using a mobile app that provides this information.\n\nFor your convenience, here is a link to the Beijing weather forecast on Weather.com: https://weather.com/weather/today/lx-CNXX0126:1:US", role=<ChatRole.ASSISTANT: 'assistant'>, name=None, meta={'model': 'mistral', 'created_at': '2024-07-19T09:00:40.619199632Z', 'done_reason': 'stop', 'done': True, 'total_duration': 1311717475, 'load_duration': 14763572, 'prompt_eval_count': 13, 'prompt_eval_duration': 26318000, 'eval_count': 121, 'eval_duration': 1226776000})]}
Then I used the OpenAIChatGenerator interface to call Ollama along these lines.
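A sketch of this route, pointing OpenAIChatGenerator at Ollama's OpenAI-compatible /v1 endpoint; the get_current_weather tool definition and the placeholder API key are illustrative assumptions:

```python
from haystack.dataclasses import ChatMessage
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.utils import Secret

# Illustrative OpenAI-style tool definition; the real test may differ.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather for a given city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "The city name, e.g. Beijing"}
                },
                "required": ["city"],
            },
        },
    }
]

# Ollama exposes an OpenAI-compatible API under /v1; the API key is ignored by
# Ollama but required by the OpenAI client, so any placeholder token works.
generator = OpenAIChatGenerator(
    api_key=Secret.from_token("ollama"),
    model="mistral",
    api_base_url="http://localhost:11434/v1",
    generation_kwargs={"tools": tools},
)

result = generator.run(
    messages=[ChatMessage.from_user("What is the current weather in Beijing?")]
)
print(result["replies"])
```

With this route the tools list is forwarded to Ollama's OpenAI-compatible chat endpoint rather than being dropped, which is why function calling works here.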
You can see that the function-calling feature works through this path.
So the current OllamaChatGenerator interface seems to lack the function-calling feature. When will this be fixed?
To Reproduce
Steps to reproduce the behavior. Feel free to link a Colab we can run to investigate the issue.
Describe your environment (please complete the following information):
OS: Ubuntu 18.04
Haystack version: 2.3.0
Integration version: 2.16.3
Ollama: 0.2.7
This cannot be fixed until the ChatMessage data class is updated to support Tools; it currently uses the deprecated Functions. There is an open PR meant to add multimodal support to ChatMessage, which would conflict with adding Tools support. Either that PR needs to be updated to also support Tools (which would make it too broad), or Tools support should be added after the PR is approved, together with a deprecation warning for Functions.
Also, it seems Tools support isn't fully ironed out in Ollama itself, as seen in ollama/ollama#5796. Once that issue is closed (if the PR gets merged), try running your example again after rebuilding Ollama from source to see whether the problem persists.
There is more work being done on ChatMessage, so once that settles, proper Tools support can be added.