Add explicit model sampling parameters to requests
sd109 committed Dec 13, 2023
1 parent ee0c0eb commit 9daf368
Showing 1 changed file with 6 additions and 0 deletions.
web-app-utils/example_app_vanilla.py
@@ -29,6 +29,12 @@ def inference(message, history):
         "prompt": prompt.format(prompt=context),
         "stream": True,
         "max_tokens": 1000,
+        # Parameters requested by HU
+        "sampling_params": {
+            "temperature": 0.7,
+            "top_p": 0.4,
+            "top_k": 40,
+        }
     }
     response = requests.post(
         f"{backend_url}/generate", headers=headers, json=pload, stream=True
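For reference, a minimal self-contained sketch of how the full request might look once this change is applied. The values of backend_url, headers, prompt, and context below are placeholder assumptions for illustration only; they are defined elsewhere in example_app_vanilla.py and do not appear in this diff. The /generate endpoint and payload keys are taken from the diff's context lines.

```python
import requests

# Hypothetical stand-ins for values defined elsewhere in example_app_vanilla.py;
# the diff above shows only the payload, not these definitions.
backend_url = "http://localhost:8000"
headers = {"Content-Type": "application/json"}
prompt = "Answer the following question:\n{prompt}"
context = "What is the capital of France?"

pload = {
    "prompt": prompt.format(prompt=context),
    "stream": True,
    "max_tokens": 1000,
    # Explicit sampling parameters added by this commit
    "sampling_params": {
        "temperature": 0.7,
        "top_p": 0.4,
        "top_k": 40,
    },
}

# Stream the generated text back from the backend's /generate endpoint
response = requests.post(
    f"{backend_url}/generate", headers=headers, json=pload, stream=True
)
```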
