Commit

Merge pull request ChatGPTNextWeb#3230 from Yidadaa/bugfix-1112
Yidadaa authored Nov 11, 2023
2 parents 8bd39f3 + 64647b0 commit 22b6987
Showing 5 changed files with 31 additions and 21 deletions.
4 changes: 2 additions & 2 deletions README.md
@@ -216,9 +216,9 @@ If you want to disable parse settings from url, set this to 1.
### `CUSTOM_MODELS` (optional)

> Default: Empty
-> Example: `+llama,+claude-2,-gpt-3.5-turbo` means add `llama` and `claude-2` to the model list, and remove `gpt-3.5-turbo` from the list.
+> Example: `+llama,+claude-2,-gpt-3.5-turbo,gpt-4-1106-preview:gpt-4-turbo` means add `llama` and `claude-2` to the model list, remove `gpt-3.5-turbo` from the list, and display `gpt-4-1106-preview` as `gpt-4-turbo`.
-To control custom models, use `+` to add a custom model and `-` to hide a model, separated by commas.
+To control custom models, use `+` to add a custom model, `-` to hide a model, and `name:displayName` to customize a model's display name, separated by commas.

## Requirements

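The `CUSTOM_MODELS` grammar documented above (comma-separated entries of the form `[+|-]name[:displayName]`) can be sketched as a small standalone parser. `parseCustomModels` is a hypothetical name for illustration; the repository's actual helper is `collectModelTable` in `app/utils/model.ts`, shown later in this diff:

```typescript
// Sketch of the CUSTOM_MODELS grammar: each comma-separated entry is
//   [+|-]name[:displayName]
// "+" adds a model, "-" hides it, ":displayName" renames it for display.
interface ModelEntry {
  name: string;
  displayName: string;
  available: boolean;
}

function parseCustomModels(customModels: string): ModelEntry[] {
  return customModels
    .split(",")
    .filter((entry) => entry.length > 0)
    .map((entry) => {
      // Only a leading "-" hides a model; everything else is available.
      const available = !entry.startsWith("-");
      const body =
        entry.startsWith("+") || entry.startsWith("-") ? entry.slice(1) : entry;
      const [name, displayName] = body.split(":");
      // Fall back to the raw name when no display name is given.
      return { name, displayName: displayName || name, available };
    });
}
```

For example, `parseCustomModels("+llama,-gpt-3.5-turbo,gpt-4-1106-preview:gpt-4-turbo")` yields three entries: `llama` (available), `gpt-3.5-turbo` (hidden), and `gpt-4-1106-preview` displayed as `gpt-4-turbo`.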
4 changes: 2 additions & 2 deletions README_CN.md
@@ -122,9 +122,9 @@ Azure API version, which you can find here: [Azure Docs](https://learn.micro

### `CUSTOM_MODELS` (optional)

-> Example: `+qwen-7b-chat,+glm-6b,-gpt-3.5-turbo` means add `qwen-7b-chat` and `glm-6b` to the model list, and remove `gpt-3.5-turbo` from the list.
+> Example: `+qwen-7b-chat,+glm-6b,-gpt-3.5-turbo,gpt-4-1106-preview:gpt-4-turbo` means add `qwen-7b-chat` and `glm-6b` to the model list, remove `gpt-3.5-turbo` from the list, and display `gpt-4-1106-preview` as `gpt-4-turbo`.
-Controls the model list: use `+` to add a model and `-` to hide a model, separated by commas.
+Controls the model list: use `+` to add a model, `-` to hide a model, and `name:displayName` to customize a model's display name, separated by commas.

## Development

2 changes: 1 addition & 1 deletion app/api/common.ts
@@ -81,7 +81,7 @@ export async function requestOpenai(req: NextRequest) {
const jsonBody = JSON.parse(clonedBody) as { model?: string };

// not undefined and is false
-    if (modelTable[jsonBody?.model ?? ""] === false) {
+    if (modelTable[jsonBody?.model ?? ""].available === false) {
return NextResponse.json(
{
error: true,
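One thing to watch in the `+` line above: `modelTable[jsonBody?.model ?? ""]` can be `undefined` when the requested model is not in the table, in which case reading `.available` throws a `TypeError`. A defensive variant (a sketch with an illustrative `isModelBlocked` name, not the code this commit ships) uses optional chaining so missing entries are simply treated as not blocked:

```typescript
// Sketch: a defensive availability check, assuming a table shaped like
// Record<string, { available: boolean }> as built in app/utils/model.ts.
type ModelTable = Record<string, { available: boolean }>;

function isModelBlocked(
  modelTable: ModelTable,
  model: string | undefined,
): boolean {
  // Optional chaining avoids a TypeError when the model is absent;
  // undefined !== false, so unknown models are treated as "not blocked".
  return modelTable[model ?? ""]?.available === false;
}
```

With this shape, only a model explicitly marked `available: false` is rejected; unknown or missing model names fall through.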
10 changes: 5 additions & 5 deletions app/components/chat.tsx
@@ -433,17 +433,17 @@ export function ChatActions(props: {
const currentModel = chatStore.currentSession().mask.modelConfig.model;
const allModels = useAllModels();
const models = useMemo(
-    () => allModels.filter((m) => m.available).map((m) => m.name),
+    () => allModels.filter((m) => m.available),
[allModels],
);
const [showModelSelector, setShowModelSelector] = useState(false);

useEffect(() => {
// if current model is not available
// switch to first available model
-    const isUnavaliableModel = !models.includes(currentModel);
+    const isUnavaliableModel = !models.some((m) => m.name === currentModel);
if (isUnavaliableModel && models.length > 0) {
-      const nextModel = models[0] as ModelType;
+      const nextModel = models[0].name as ModelType;
chatStore.updateCurrentSession(
(session) => (session.mask.modelConfig.model = nextModel),
);
@@ -531,8 +531,8 @@ export function ChatActions(props: {
<Selector
defaultSelectedValue={currentModel}
items={models.map((m) => ({
-            title: m,
-            value: m,
+            title: m.displayName,
+            value: m.name,
}))}
onClose={() => setShowModelSelector(false)}
onSelection={(s) => {
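The `useEffect` in this hunk implements a fallback: when the session's current model is no longer in the filtered list, switch to the first available one. That logic can be isolated as a pure function (the names here are illustrative, not from the repo):

```typescript
// Minimal model shape assumed for this sketch.
interface ModelItem {
  name: string;
  available: boolean;
}

// Returns the model the session should use: the current one if it is
// still available, otherwise the first available model, or undefined
// when no model is available at all.
function resolveModel(
  models: ModelItem[],
  currentModel: string,
): string | undefined {
  const available = models.filter((m) => m.available);
  if (available.some((m) => m.name === currentModel)) {
    return currentModel;
  }
  return available[0]?.name;
}
```

Keeping the fallback pure like this makes the "current model became unavailable" case easy to unit-test, independent of the chat store side effect.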
32 changes: 21 additions & 11 deletions app/utils/model.ts
@@ -4,21 +4,34 @@ export function collectModelTable(
models: readonly LLMModel[],
customModels: string,
) {
-  const modelTable: Record<string, boolean> = {};
+  const modelTable: Record<
+    string,
+    { available: boolean; name: string; displayName: string }
+  > = {};

// default models
-  models.forEach((m) => (modelTable[m.name] = m.available));
+  models.forEach(
+    (m) =>
+      (modelTable[m.name] = {
+        ...m,
+        displayName: m.name,
+      }),
+  );

// server custom models
customModels
.split(",")
.filter((v) => !!v && v.length > 0)
.map((m) => {
-      if (m.startsWith("+")) {
-        modelTable[m.slice(1)] = true;
-      } else if (m.startsWith("-")) {
-        modelTable[m.slice(1)] = false;
-      } else modelTable[m] = true;
+      const available = !m.startsWith("-");
+      const nameConfig =
+        m.startsWith("+") || m.startsWith("-") ? m.slice(1) : m;
+      const [name, displayName] = nameConfig.split(":");
+      modelTable[name] = {
+        name,
+        displayName: displayName || name,
+        available,
+      };
});
return modelTable;
}
@@ -31,10 +44,7 @@ export function collectModels(
customModels: string,
) {
const modelTable = collectModelTable(models, customModels);
-  const allModels = Object.keys(modelTable).map((m) => ({
-    name: m,
-    available: modelTable[m],
-  }));
+  const allModels = Object.values(modelTable);

return allModels;
}
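Putting the two helpers together, the merge semantics after this commit are: default models seed the table (with `displayName` defaulting to the raw name), then each `CUSTOM_MODELS` entry overwrites or adds its slot. A self-contained sketch of the new shape, with a minimal `LLMModel` assumed here since its real definition lives elsewhere in the repo:

```typescript
// Minimal stand-in for the repo's LLMModel type.
interface LLMModel {
  name: string;
  available: boolean;
}

interface ModelRecord {
  name: string;
  displayName: string;
  available: boolean;
}

function collectModelTable(
  models: readonly LLMModel[],
  customModels: string,
): Record<string, ModelRecord> {
  const modelTable: Record<string, ModelRecord> = {};
  // Default models seed the table; display name defaults to the raw name.
  for (const m of models) {
    modelTable[m.name] = { ...m, displayName: m.name };
  }
  // Each custom entry overwrites (or adds) its slot in the table.
  for (const entry of customModels.split(",").filter((v) => v.length > 0)) {
    const available = !entry.startsWith("-");
    const nameConfig =
      entry.startsWith("+") || entry.startsWith("-") ? entry.slice(1) : entry;
    const [name, displayName] = nameConfig.split(":");
    modelTable[name] = { name, displayName: displayName || name, available };
  }
  return modelTable;
}

function collectModels(models: readonly LLMModel[], customModels: string) {
  // The records already carry name/displayName/available, so the old
  // Object.keys(...).map(...) reconstruction is no longer needed.
  return Object.values(collectModelTable(models, customModels));
}
```

With defaults `gpt-4` and `gpt-3.5-turbo` and `CUSTOM_MODELS="-gpt-3.5-turbo,gpt-4-1106-preview:gpt-4-turbo"`, this yields three records: `gpt-4` (available), `gpt-3.5-turbo` (hidden), and `gpt-4-1106-preview` displayed as `gpt-4-turbo`.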
