Merge pull request #230 from tikikun/main
bug: Fix caching compatibility issues regarding system prompt
tikikun authored Dec 3, 2023
2 parents 9615282 + 61fc781 commit 8c95f6b
Showing 1 changed file with 9 additions and 2 deletions.
11 changes: 9 additions & 2 deletions controllers/llamaCPP.cc
```diff
@@ -186,15 +186,22 @@ void llamaCPP::chatCompletion(
     std::string role;
     if (input_role == "user") {
       role = user_prompt;
+      std::string content = message["content"].asString();
+      formatted_output += role + content;
     } else if (input_role == "assistant") {
       role = ai_prompt;
+      std::string content = message["content"].asString();
+      formatted_output += role + content;
     } else if (input_role == "system") {
       role = system_prompt;
+      std::string content = message["content"].asString();
+      formatted_output = role + content + formatted_output;
+
     } else {
       role = input_role;
+      std::string content = message["content"].asString();
+      formatted_output += role + content;
     }
-    std::string content = message["content"].asString();
-    formatted_output += role + content;
   }
   formatted_output += ai_prompt;
```
