Environment

```python
llm = {
    'model': 'Qwen2.5-72B-Instruct-GPTQ-Int4',
    'model_server': 'http://192.168.252.70:9002/v1',
    'generate_cfg': {'temperature': 0.01, 'max_tokens': 2000},
}
```

Code

```python
def test():
    bot = Assistant(llm=llm)
    messages = [{'role': 'user', 'content': '你是谁'}]
    for rsp in bot.run(messages):
        print(rsp)
```

Output

```
{'temperature': 0.01, 'max_tokens': 2000, 'stop': ['✿RESULT✿', '✿RETURN✿'], 'seed': 1061424198}
[{'role': 'assistant', 'content': '我是阿里云开发的一款超大规模语言模型,'}]
```

Problem

The model's output is incomplete. After I set generate_cfg['stop'] = [] to an empty list in oai.py, the output is normal.
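The workaround described above (editing oai.py so that generate_cfg['stop'] = []) can also be expressed without patching the library, by clearing the stop list in the config before it is sent. This is only a sketch of that idea; `clear_stop` is a hypothetical helper, and it assumes the `stop` entry in `generate_cfg` is forwarded to the OpenAI-compatible server as-is.

```python
def clear_stop(generate_cfg):
    """Return a copy of generate_cfg with an empty stop list, mirroring
    the reported workaround: with no stop strings, the server never
    truncates the completion on the function-calling markers."""
    cfg = dict(generate_cfg)
    cfg['stop'] = []
    return cfg


# Hypothetical usage with the config from the report above.
llm = {
    'model': 'Qwen2.5-72B-Instruct-GPTQ-Int4',
    'model_server': 'http://192.168.252.70:9002/v1',
    'generate_cfg': clear_stop({'temperature': 0.01, 'max_tokens': 2000,
                                'stop': ['✿RESULT✿', '✿RETURN✿']}),
}
```

Whether qwen-agent re-injects its own stopwords after this point is exactly the question raised in this issue, so this may not be sufficient on its own.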
Hello, I can't reproduce this case locally. Is it always reproducible on your side? Also, I see you haven't specified any function here, so the ['✿RESULT✿', '✿RETURN✿'] stopwords shouldn't appear at all.
1. Yes, it is always reproducible. 2. I didn't specify any function; I just wanted to try the RAG capability. I debugged where the stopwords are set, and it appears to be these two places: the stopwords set at line 18 of qwen_agent/llm/function_calling.py, and qwen_agent/llm/fncall_prompts/qwen_fncall_prompt.py.
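The suspected behavior, based on the two paths above, can be sketched in miniature. This is not qwen-agent's actual code; `build_request_cfg` and `apply_stop` are hypothetical stand-ins that illustrate how unconditionally injected function-calling stopwords would truncate an ordinary completion on the server side.

```python
# Hypothetical mirror of the markers reportedly set in
# qwen_agent/llm/function_calling.py and qwen_fncall_prompt.py.
FN_STOP_WORDS = ['✿RESULT✿', '✿RETURN✿']


def build_request_cfg(user_cfg, functions=None):
    """Sketch of the suspected bug: the function-calling stopwords are
    merged into the request even when no functions are registered."""
    cfg = dict(user_cfg)
    cfg['stop'] = list(cfg.get('stop', [])) + FN_STOP_WORDS
    return cfg


def apply_stop(completion, stops):
    """An OpenAI-compatible server cuts the completion at the first
    occurrence of any stop string."""
    cut = min((completion.find(s) for s in stops if s in completion),
              default=len(completion))
    return completion[:cut]
```

If the model happens to emit one of these markers (or the server's stop matching fires early), the reply is cut short even though the user never asked for function calling, which matches the truncated output in the report.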
How can this bug be fixed? I've run into it as well.