
Issue of ollama supporting the function call feature #913

Closed
fan-chao opened this issue Jul 19, 2024 · 3 comments
Labels
bug Something isn't working integration:ollama

Comments

@fan-chao

Describe the bug
Hello, and thank you for providing Haystack, such a useful LLM framework.

I found that Ollama now supports the function call feature: ollama/ollama@d02bbeb.
I then tested it through Haystack's OllamaChatGenerator interface; the test code is as follows:

from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.ollama import OllamaChatGenerator

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the weather information for a specified city",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "City name, e.g. Beijing",
                    }
                },
                "required": ["location"],
            },
        }
    }
]

chat_generator = OllamaChatGenerator(
    model="mistral",
    url="http://192.168.1.101:11434/api/chat",
    generation_kwargs={
        "num_predict": 4096,
        "temperature": 0.0,
    },
)

messages5 = [
    ChatMessage.from_user("what is the weather in Beijing today?"),
]
response = chat_generator.run(messages=messages5, generation_kwargs={"tools": tools})
print(response)

When using OllamaChatGenerator to call the Ollama API, the return result is as follows; that is, it contains no function call result.

{'replies': [ChatMessage(content=" To provide you with the current weather in Beijing, I would typically use a service like OpenWeatherMap API. However, as a text-based AI model, I don't have real-time capabilities to access external data sources directly. You can check the current weather in Beijing by visiting a reliable weather website or using a mobile app that provides this information.\n\nFor your convenience, here is a link to the Beijing weather forecast on Weather.com: https://weather.com/weather/today/lx-CNXX0126:1:US", role=<ChatRole.ASSISTANT: 'assistant'>, name=None, meta={'model': 'mistral', 'created_at': '2024-07-19T09:00:40.619199632Z', 'done_reason': 'stop', 'done': True, 'total_duration': 1311717475, 'load_duration': 14763572, 'prompt_eval_count': 13, 'prompt_eval_duration': 26318000, 'eval_count': 121, 'eval_duration': 1226776000})]}

Then I used the OpenAIChatGenerator interface to call Ollama through its OpenAI-compatible endpoint:

chat_generator2 = OpenAIChatGenerator(model="mistral", api_base_url='http://192.168.1.101:11434/v1/')

The output result is as follows:

{'replies': [ChatMessage(content='[{"id": "call_lb0y3k1z", "function": {"arguments": "{\\"location\\":\\"Beijing\\"}", "name": "get_current_weather"}, "type": "function"}]', role=<ChatRole.ASSISTANT: 'assistant'>, name=None, meta={'model': 'mistral', 'index': 0, 'finish_reason': 'stop', 'usage': {'completion_tokens': 22, 'prompt_tokens': 85, 'total_tokens': 107}})]}

You can see that the function call feature works through this route.
So the current OllamaChatGenerator interface appears to lack function call support. When will this be fixed?
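As a stopgap, note that the tool call returned by OpenAIChatGenerator above arrives as a JSON string in the reply's content, so it can be decoded manually. A minimal sketch, using the reply string copied from the output above (only the standard library is needed; the double decode of "arguments" matches the format shown):

```python
import json

# Assistant reply content as returned above: a JSON-encoded list of tool calls.
reply_content = (
    '[{"id": "call_lb0y3k1z", "function": {"arguments": '
    '"{\\"location\\":\\"Beijing\\"}", "name": "get_current_weather"}, '
    '"type": "function"}]'
)

tool_calls = json.loads(reply_content)
for call in tool_calls:
    name = call["function"]["name"]
    # The arguments are themselves JSON-encoded, so decode a second time.
    args = json.loads(call["function"]["arguments"])
    print(name, args)  # get_current_weather {'location': 'Beijing'}
```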

To Reproduce
Steps to reproduce the behavior. Feel free to link a Colab we can run to investigate the issue.

Describe your environment (please complete the following information):

  • OS: Ubuntu 18.04
  • Haystack version: 2.3.0
  • Integration version: 2.16.3
  • Ollama: 0.2.7
@fan-chao fan-chao added the bug Something isn't working label Jul 19, 2024
@lbux
Contributor

lbux commented Jul 22, 2024

This cannot be fixed until the ChatMessage data class is updated to support Tools; it currently uses the deprecated Functions API. There is an open PR meant to add multimodal support to ChatMessage, which would conflict with adding Tools support. Either that PR needs to be updated to also cover Tools (which would make it too broad), or Tools support should be added after the PR is approved, along with a deprecation warning for Functions.

@lbux
Contributor

lbux commented Jul 25, 2024

Also, it seems Tools support isn't fully ironed out in Ollama itself, as seen in ollama/ollama#5796. Once that issue is closed (if the PR gets merged), try running your example again after rebuilding Ollama from source to see whether the problem persists.

There is more work being done for ChatMessage, so once that is ironed out, proper Tools support can be added.
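For reference, once native tool support lands, Ollama's /api/chat endpoint is expected to accept the tool schema in a top-level "tools" field alongside the messages. A hedged sketch of the payload shape, mirroring the tools list from the issue above (the field names follow the linked Ollama PR and may change before release):

```python
# Sketch of the /api/chat request body that a tool-aware OllamaChatGenerator
# would send. The schema mirrors the `tools` list defined in the issue above;
# the field names are assumptions based on the linked Ollama PR.
payload = {
    "model": "mistral",
    "messages": [
        {"role": "user", "content": "what is the weather in Beijing today?"}
    ],
    "stream": False,  # tool calls come back in a single non-streamed response
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_current_weather",
                "description": "Get the weather information for a specified city",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {
                            "type": "string",
                            "description": "City name, e.g. Beijing",
                        }
                    },
                    "required": ["location"],
                },
            },
        }
    ],
}
```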

@vblagoje
Member

Thanks for these comments, @lbux and @fan-chao. We'll close this one in favor of deepset-ai/haystack#8190.
