
Commit a9a90a1

Merge pull request #163 from janhq/162-feat-enable-caching-prompt-for-the-server

enable cache prompt
tikikun authored Nov 21, 2023
2 parents 7c0be9f + 2602b4f commit a9a90a1
Showing 1 changed file with 1 addition and 1 deletion.

controllers/llamaCPP.cc (1 addition, 1 deletion):

```diff
@@ -164,7 +164,7 @@ void llamaCPP::chatCompletion(
   data["frequency_penalty"] =
       (*jsonBody).get("frequency_penalty", 0).asFloat();
   data["presence_penalty"] = (*jsonBody).get("presence_penalty", 0).asFloat();
-
+  data["cache_prompt"] = true;
   const Json::Value &messages = (*jsonBody)["messages"];
   for (const auto &message : messages) {
     std::string input_role = message["role"].asString();
```
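
For context, `cache_prompt` is a request option of the llama.cpp server: when set, the server tries to reuse the KV cache from the previous request for the shared prompt prefix (e.g. the system prompt and earlier chat turns) instead of re-evaluating the whole prompt on every completion. Below is a minimal sketch of the payload this code path builds, assuming jsoncpp (`Json::Value`) as used in `llamaCPP.cc`; the standalone `main` and the sample penalty value are illustrative only and not part of the commit:

```cpp
// Sketch: building a llama.cpp-server-style request body with jsoncpp,
// mirroring the fields set in llamaCPP::chatCompletion above.
#include <json/json.h>
#include <iostream>

int main() {
  Json::Value jsonBody;                  // stands in for the parsed client request
  jsonBody["frequency_penalty"] = 0.5;   // sample value for illustration

  Json::Value data;
  data["frequency_penalty"] =
      jsonBody.get("frequency_penalty", 0).asFloat();
  data["presence_penalty"] = jsonBody.get("presence_penalty", 0).asFloat();
  data["cache_prompt"] = true;           // reuse the prompt KV cache across requests

  // Serialize the payload as it would be forwarded to the llama.cpp server.
  Json::StreamWriterBuilder writer;
  std::cout << Json::writeString(writer, data) << "\n";
  return 0;
}
```

Note that the commit hardcodes `data["cache_prompt"] = true;`, so every chat completion forwarded to the underlying llama.cpp server opts into prompt caching; clients cannot disable it per request.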
