Created DEFAULT_NUM_CTX VAR with a default of 32768 #328

Merged 3 commits on Nov 22, 2024. The diff below shows the changes from 1 commit.
.env.example: 7 additions, 0 deletions

@@ -56,3 +56,10 @@ XAI_API_KEY=

# Include this environment variable if you want more logging for debugging locally
VITE_LOG_LEVEL=debug

+# Example Context Values for qwen2.5-coder:32b
+#
+# DEFAULT_NUM_CTX=32768 # Consumes 36GB of VRAM
Collaborator: I would also just leave an empty uncommented entry here as well, to avoid any issues with process.env missing the variable, so that we can call it out in errors if necessary.
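
A minimal sketch of what that suggestion could look like in model.ts; the warning text and fallback behavior are illustrative assumptions, not code from this PR:

// Hypothetical guard, assuming .env ships a (possibly empty) DEFAULT_NUM_CTX= entry
// so the key always exists on process.env and its absence is worth flagging.
const rawNumCtx = process.env.DEFAULT_NUM_CTX;
if (rawNumCtx === undefined) {
  console.warn('DEFAULT_NUM_CTX is not defined in the environment; defaulting to 32768');
}
export const DEFAULT_NUM_CTX = rawNumCtx ? parseInt(rawNumCtx, 10) : 32768;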

Collaborator: Also, this will need to live anywhere else the env vars are being utilized, for example the Dockerfile and docker-compose files. Other than that, this looks good to me.
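
A hedged sketch of the docker-compose passthrough this comment asks for; the service name and file layout are assumptions, not taken from this repository:

# docker-compose.yaml (hypothetical service name)
services:
  app:
    environment:
      # Forward the host value; fall back to 32768 when it is unset or empty.
      - DEFAULT_NUM_CTX=${DEFAULT_NUM_CTX:-32768}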

+# DEFAULT_NUM_CTX=24576 # Consumes 32GB of VRAM
+# DEFAULT_NUM_CTX=12288 # Consumes 26GB of VRAM
+# DEFAULT_NUM_CTX=6144 # Consumes 24GB of VRAM
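
To apply one of these presets, a user would uncomment exactly one line (or export the variable in their shell), for example:

DEFAULT_NUM_CTX=24576

When the variable is left unset, the model.ts change below falls back to 32768.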
app/lib/.server/llm/model.ts: 5 additions, 1 deletion

@@ -8,6 +8,10 @@ import { ollama } from 'ollama-ai-provider';
import { createOpenRouter } from "@openrouter/ai-sdk-provider";
import { createMistral } from '@ai-sdk/mistral';

+export const DEFAULT_NUM_CTX = process.env.DEFAULT_NUM_CTX ?
+  parseInt(process.env.DEFAULT_NUM_CTX, 10) :
+  32768;
+
export function getAnthropicModel(apiKey: string, model: string) {
  const anthropic = createAnthropic({
    apiKey,
@@ -58,7 +62,7 @@ export function getGroqModel(apiKey: string, model: string) {

export function getOllamaModel(baseURL: string, model: string) {
  let Ollama = ollama(model, {
-    numCtx: 32768,
+    numCtx: DEFAULT_NUM_CTX,
  });

  Ollama.config.baseURL = `${baseURL}/api`;
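
One edge case in the parsing above: the ternary only falls back when DEFAULT_NUM_CTX is unset or an empty string, so a non-numeric value such as DEFAULT_NUM_CTX=lots would make parseInt return NaN and pass that through as the context size. A slightly more defensive variant, offered here as a sketch rather than as part of this PR:

// Hypothetical hardening: accept the parsed value only when it is a
// finite positive number; otherwise keep the 32768 default.
const parsed = parseInt(process.env.DEFAULT_NUM_CTX ?? '', 10);
export const DEFAULT_NUM_CTX = Number.isFinite(parsed) && parsed > 0 ? parsed : 32768;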