
openai migration error

Open Shihab-Litu opened this issue 1 year ago • 4 comments

Confirm this is an issue with the Python library and not an underlying OpenAI API

  • [X] This is an issue with the Python library

Describe the bug

I want to migrate my code to openai 1.54.4, but I get the error below when I run the `openai migrate` command in a Linux environment:

Error: Failed to download Grit CLI from https://github.com/getgrit/gritql/releases/latest/download/marzano-x86_64-unknown-linux-gnu.tar.gz

To Reproduce

A screenshot of the error is attached:

Code snippets

No response

OS

Linux 20.04 LTS

Python version

Python 3.9.12

Library version

openai 1.54.4

Shihab-Litu avatar Nov 19 '24 06:11 Shihab-Litu

Can you try installing the Grit CLI from npm? https://docs.grit.io/cli/quickstart#installation

Then you can run `grit apply openai` to get the same migration.

RobertCraigie avatar Nov 19 '24 11:11 RobertCraigie

I get the following error: You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0 - see the README at https://github.com/openai/openai-python for the API.

You can run openai migrate to automatically upgrade your codebase to use the 1.0.0 interface.

A detailed migration guide is available here: https://github.com/openai/openai-python/discussions/742
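For reference, the module-level `openai.ChatCompletion.create(...)` call is replaced by a client object in openai>=1.0.0. A minimal sketch of the new interface (the model name and prompt here are placeholders, not from this thread):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# openai<1.0.0:  openai.ChatCompletion.create(model=..., messages=...)
# openai>=1.0.0:
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```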

Jesus0510-max avatar Nov 20 '24 19:11 Jesus0510-max

@RobertCraigie This is helpful. I was able to migrate my code (https://github.com/InfiAgent/InfiAgent/blob/main/pipeline/src/infiagent/llm/client/llama.py), but when I run it I get the error that `aclient` (`AsynchOpenAI`) is not defined, even though I have defined `aclient`:

```python
def __init__(self, **data):
    super().__init__(**data)
    client = OpenAI(api_key="", api_base="http://localhost:8000/v1")
    aclient = AsyncOpenAI(api_key="", api_base="http://localhost:8000/v1")
```

Is it correct to define `client` and `aclient` this way instead of globally? Note: I am running the vLLM server on my local PC, but it doesn't return any answer to the prompt; it reports that the LLM call fails.

(screenshot attached: server_running)

Shihab-Litu avatar Nov 21 '24 03:11 Shihab-Litu

You'll need to replace any reference to `AsynchOpenAI` with `AsyncOpenAI`.
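Beyond the rename, the "not defined" error suggests the clients are created as locals inside `__init__` rather than stored on the instance. A minimal sketch under that assumption (the class name is hypothetical, the `super().__init__(**data)` call from the original snippet is omitted since it depends on the base class, `base_url` is the openai>=1.0.0 name for the old `api_base` setting, and the URL and key are placeholders for a local vLLM server):

```python
from openai import OpenAI, AsyncOpenAI

class LlamaClient:  # hypothetical stand-in for the class in llama.py
    def __init__(self, **data):
        # Store both clients on self so other methods can reach them;
        # a bare local name like `aclient` disappears once __init__ returns.
        self.client = OpenAI(api_key="EMPTY", base_url="http://localhost:8000/v1")
        self.aclient = AsyncOpenAI(api_key="EMPTY", base_url="http://localhost:8000/v1")
```

Methods on the class would then call e.g. `await self.aclient.chat.completions.create(...)`.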

RobertCraigie avatar Nov 21 '24 09:11 RobertCraigie