vllm provider #3021
Labels: area:configuration, kind:bug, priority:medium
Description
Hi, I have deployed a Llama 3 model through vLLM.
When I choose vllm as the provider in my config file, I get:
Error streaming diff: TypeError: Cannot read properties of undefined (reading 'toLowerCase')
when pressing Apply, editing code, or commenting on code through Continue. When I change the provider to openai, it works fine. Is this expected?
To reproduce
config:
{ "title": "Llama 3.1 8B", "model": "ged4", "apiBase": "some_url", "completionOptions": { "temperature": 0.1, "topK": 1, "topP": 1, "presencePenalty": 0, "frequencyPenalty": 0 }, "provider": "vllm", "apiKey": "some_api_key" },
Log output
Error streaming diff: TypeError: Cannot read properties of undefined (reading 'toLowerCase')