
Local Ollama does not work #152

Open
liekkasfc opened this issue Dec 16, 2024 · 0 comments

Comments

@liekkasfc

With siliconflow configured in the project's .env, collection and processing run fine. After switching to local Ollama and re-running run_task.sh, the console shows an empty `llm output` and nothing gets processed.

The .env configuration when using Ollama:

export LLM_API_KEY=" "
export LLM_API_BASE="http://192.168.233.49:11434/v1"
#export LLM_API_BASE="https://api.siliconflow.cn/v1"
export PB_API_AUTH="[email protected]|password" ##your pb superuser account and password
##the items below are optional, set as needed
#export VERBOSE="true" ##for detailed log info; remove this item if not needed
export PRIMARY_MODEL="qwen2.5:14b"
#export SECONDARY_MODEL="THUDM/glm-4-9b-chat"
export PROJECT_DIR="work_dir"
#export PB_API_BASE="" ##only needed if your pb does not run on 127.0.0.1:8090
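One way to narrow this down (a diagnostic sketch, not from the issue itself) is to check whether Ollama's OpenAI-compatible endpoint actually responds at the configured base URL. The host and model name below are taken from the .env above; the request shape is the standard OpenAI-compatible chat completions API that Ollama exposes under /v1:

```shell
# List the models Ollama is serving; qwen2.5:14b must appear here,
# otherwise it has not been pulled on this host
curl http://192.168.233.49:11434/v1/models

# Send a minimal chat completion through the OpenAI-compatible endpoint;
# an error or empty "content" here would explain the empty llm output
# seen when running run_task.sh
curl http://192.168.233.49:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "qwen2.5:14b", "messages": [{"role": "user", "content": "hello"}]}'
```

If both calls succeed from the machine running run_task.sh, the problem is more likely in how the project parses the model's response than in the endpoint itself.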

Collection and processing output when using siliconflow:
[screenshot attached]

Output when running with local Ollama:
[screenshot attached]
