finished custom url call (#142)
Jasonqi146 authored Jul 23, 2024
1 parent c18d073 commit 276456d
Showing 2 changed files with 38 additions and 3 deletions.
16 changes: 16 additions & 0 deletions README.md
@@ -153,6 +153,22 @@ For the complete set of parameters, please check the `sotopia_conf` folder.

To run a large batch of environments, you can change the `ENV_IDS` parameter in `sotopia_conf/run_async_server_in_batch.gin` to a list of environment ids. When `gin.ENV_IDS==[]`, all environments on the DB will be used.
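For instance, a batch override might look like the following sketch (the environment ids are placeholders, and the exact binding name should follow the one used in the shipped `sotopia_conf/run_async_server_in_batch.gin`):

```
# sotopia_conf/run_async_server_in_batch.gin (sketch, hypothetical ids)
ENV_IDS = ["env_id_1", "env_id_2"]
# An empty list runs every environment in the DB:
# ENV_IDS = []
```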

## Custom OpenAI API URL
To use a custom API URL for agent models, set `AGENT1_MODEL` to the following format:
```python
"custom/{model_name}@{base_url}"
```

As an example, if you use vLLM to expose an OpenAI-compatible API:
```bash
python -m vllm.entrypoints.openai.api_server --model facebook/opt-125m
```

The `gin.AGENT1_MODEL` should be set to:
```python
"custom/facebook/opt-125m@http://localhost:8000/v1"
```

## Getting access to your simulation
After running experiments, you can go to the `examples/redis_stats.ipynb` notebook to check the existing episodes (Episode Log section), as well as calculate the performance.

25 changes: 22 additions & 3 deletions sotopia/generation_utils/generate.py
@@ -301,7 +301,7 @@ def obtain_chain(
Using langchain to sample profiles for participants
"""
model_name = _return_fixed_model_version(model_name)
if "together_ai" in model_name:
if model_name.startswith("together_ai"):
model_name = "/".join(model_name.split("/")[1:])
human_message_prompt = HumanMessagePromptTemplate(
prompt=PromptTemplate(
@@ -319,7 +319,7 @@
)
chain = chat_prompt_template | chat_openai
return chain
elif "groq" in model_name:
elif model_name.startswith("groq"):
model_name = "/".join(model_name.split("/")[1:])
human_message_prompt = HumanMessagePromptTemplate(
prompt=PromptTemplate(
@@ -337,7 +337,7 @@
)
chain = chat_prompt_template | chat_openai
return chain
elif "azure" in model_name:
elif model_name.startswith("azure"):
# azure/resource_name/deployment_name/version
azure_credentials = model_name.split("/")[1:]
resource_name, deployment_name, azure_version = (
@@ -361,6 +361,25 @@
)
chain = chat_prompt_template | chat_azure_openai
return chain
elif model_name.startswith("custom"):
custom_model_name, model_base_url = (
model_name.split("@")[0],
model_name.split("@")[1],
)
custom_model_name = "/".join(custom_model_name.split("/")[1:])
chat = ChatOpenAI(
model=custom_model_name,
temperature=temperature,
max_retries=max_retries,
api_key="EMPTY",
base_url=model_base_url,
)
human_message_prompt = HumanMessagePromptTemplate(
prompt=PromptTemplate(template=template, input_variables=input_variables)
)
chat_prompt_template = ChatPromptTemplate.from_messages([human_message_prompt])
chain = chat_prompt_template | chat
return chain
else:
chat = ChatOpenAI(
model=model_name,
