How to run AgentSims with LLama2 #21

Open
myrainbowandsky opened this issue Oct 28, 2023 · 1 comment
Comments

@myrainbowandsky

How do I run AgentSims with Llama 2?

@alvinowyong

I managed to get this running with Ollama serving Llama 2, combined with LiteLLM as a local OpenAI proxy server that accepts OpenAI-style requests and forwards them to Ollama. You still need to follow the setup process in the README, but when you run the ./restart.sh script you can prepend OPENAI_BASE_URL="http://localhost:<LITELLM_PORT>" before the nohup python main.py command so that it points at the running local proxy server instead of the OpenAI API.
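
A rough sketch of the sequence, assuming the LiteLLM proxy listens on port 8000 and the Ollama model tag is llama2 (adjust both, and the LiteLLM flags, for your versions and setup):

```sh
# 1. Serve Llama 2 locally with Ollama (pulls the model on first run).
ollama pull llama2
ollama serve &

# 2. Start LiteLLM as a local OpenAI-compatible proxy in front of Ollama.
#    Port 8000 is just an example; check your LiteLLM version's defaults.
litellm --model ollama/llama2 --port 8000 &

# 3. Run AgentSims as described in the README, but with the OpenAI base URL
#    pointing at the LiteLLM proxy. Inside restart.sh this is the variable
#    prepended to the existing "nohup python main.py" line.
OPENAI_BASE_URL="http://localhost:8000" nohup python main.py &
```

Since AgentSims only talks to the OpenAI API, any OpenAI-compatible proxy in front of your local model should work the same way.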
