
openai migration error #1876

Open · 1 task done
Shihab-Litu opened this issue Nov 19, 2024 · 4 comments
Labels
bug Something isn't working

Comments

@Shihab-Litu

Confirm this is an issue with the Python library and not an underlying OpenAI API

  • This is an issue with the Python library

Describe the bug

I want to migrate my code to openai 1.54.4, but I got the error below when running the "openai migrate" command in a Linux environment:

Error: Failed to download Grit CLI from https://github.com/getgrit/gritql/releases/latest/download/marzano-x86_64-unknown-linux-gnu.tar.gz

To Reproduce

A screenshot of the error is attached.

Code snippets

No response

OS

Linux 20.04 LTS

Python version

Python 3.9.12

Library version

openai 1.54.4

@Shihab-Litu Shihab-Litu added the bug Something isn't working label Nov 19, 2024
@RobertCraigie
Collaborator

Can you try installing the Grit CLI from npm? https://docs.grit.io/cli/quickstart#installation

Then you can run grit apply openai to get the same migration.
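A sketch of that suggestion as shell commands (the npm package name is taken from the Grit quickstart docs linked above; verify it there before running):

```shell
# Install the Grit CLI via npm, then apply the same codemod
# that `openai migrate` would have run.
npm install --global @getgrit/cli
grit apply openai
```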

@Jesus0510-max
Copy link

I get the following error: You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0 - see the README at https://github.com/openai/openai-python for the API.

You can run openai migrate to automatically upgrade your codebase to use the 1.0.0 interface.

A detailed migration guide is available here: #742

@Shihab-Litu
Author

Shihab-Litu commented Nov 21, 2024

@RobertCraigie
This is helpful. I was able to migrate my code (https://github.com/InfiAgent/InfiAgent/blob/main/pipeline/src/infiagent/llm/client/llama.py), but I got the error "aclient (AsynchOpenAI) is not defined" when I run it, even though I have defined aclient:
def __init__(self, **data):
    super().__init__(**data)
    client = OpenAI(api_key="", api_base="http://localhost:8000/v1")
    aclient = AsyncOpenAI(api_key="", api_base="http://localhost:8000/v1")

Is this the correct way to define client and aclient, instead of defining them globally?
Note: I am running the vLLM server on my local PC, but it doesn't return any answer for the prompt; it fails with an LLM error.

(screenshot attached: server_running)
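The "not defined" error above is most likely a scoping issue rather than a migration issue: client and aclient in that __init__ are plain local variables, so they disappear as soon as __init__ returns. Binding them to self (or defining them at module level) keeps them reachable from other methods. A minimal offline sketch (FakeClient is a hypothetical stand-in for OpenAI/AsyncOpenAI so no server is needed; note also that in openai>=1.0 the constructor parameter is base_url, not api_base):

```python
class FakeClient:
    """Stand-in for openai.OpenAI / openai.AsyncOpenAI so this runs offline."""
    def __init__(self, api_key: str, base_url: str):
        self.api_key = api_key
        self.base_url = base_url

class LLMWrapper:
    def __init__(self):
        # Assign to self.* so every method of the class can see the clients;
        # a bare `aclient = ...` would be local to __init__ and raise
        # NameError when referenced elsewhere.
        self.client = FakeClient(api_key="EMPTY", base_url="http://localhost:8000/v1")
        self.aclient = FakeClient(api_key="EMPTY", base_url="http://localhost:8000/v1")

wrapper = LLMWrapper()
print(wrapper.aclient.base_url)  # the async client is still reachable after __init__
```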

@RobertCraigie
Collaborator

You'll need to replace any reference to AsynchOpenAI with AsyncOpenAI.
