
vllm provider #3021

Open
2 of 3 tasks
DanielBeck93 opened this issue Nov 21, 2024 · 1 comment
Assignees
Labels
area:configuration Relates to configuration options kind:bug Indicates an unexpected problem or unintended behavior priority:medium Indicates medium priority

Comments


DanielBeck93 commented Nov 21, 2024

Before submitting your bug report

Relevant environment info

- OS: Windows 11
- Continue version: 0.8.57
- IDE version: 1.95.1
- Model: Llama 3.1 8B
- config.json:
  {
      "title": "Llama 3.1 8B",
      "model": "ged4",
      "apiBase": "some_url",
      "completionOptions": {
          "temperature": 0.1,
          "topK": 1,
          "topP": 1,
          "presencePenalty": 0,
          "frequencyPenalty": 0
      },
      "provider": "vllm",
      "apiKey": "some_api_key"
  },

Description

Hi, I have deployed a Llama 3 model through vLLM.
When I choose vllm as the provider in my config file, I get:

Error streaming diff: TypeError: Cannot read properties of undefined (reading 'toLowerCase')

when clicking Apply, trying to edit code, or commenting on code through Continue. But when I change the provider to openai, it works fine. Is this expected?
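For reference, since vLLM serves an OpenAI-compatible API, the workaround described above amounts to a config entry like the following. This is a hedged sketch, not a confirmed fix: the `apiBase` placeholder is carried over from the report, and whether this fully avoids the error is exactly what is being asked here.

```json
{
    "title": "Llama 3.1 8B",
    "model": "ged4",
    "provider": "openai",
    "apiBase": "some_url",
    "apiKey": "some_api_key",
    "completionOptions": {
        "temperature": 0.1,
        "topK": 1,
        "topP": 1,
        "presencePenalty": 0,
        "frequencyPenalty": 0
    }
}
```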

To reproduce

config:
{ "title": "Llama 3.1 8B", "model": "ged4", "apiBase": "some_url", "completionOptions": { "temperature": 0.1, "topK": 1, "topP": 1, "presencePenalty": 0, "frequencyPenalty": 0 }, "provider": "vllm", "apiKey": "some_api_key" },

Log output

Error streaming diff: TypeError: Cannot read properties of undefined (reading 'toLowerCase')

@dosubot dosubot bot added area:configuration Relates to configuration options kind:bug Indicates an unexpected problem or unintended behavior labels Nov 21, 2024
@tomasz-stefaniak
Collaborator

This error originates in VerticalDiffManager and is probably not expected.

Is the problem blocking you from using Continue, or does switching the provider to openai fix the problem completely?

@tomasz-stefaniak tomasz-stefaniak added priority:medium Indicates medium priority and removed "needs-triage" labels Nov 26, 2024
Projects
None yet
Development

No branches or pull requests

2 participants