Preparing metadata (pyproject.toml) did not run successfully.
Confirm this is an issue with the Python library and not an underlying OpenAI API
- [x] This is an issue with the Python library
Describe the bug
```
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... error
error: subprocess-exited-with-error

× Preparing metadata (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [6 lines of output]

    Cargo, the Rust package manager, is not installed or is not on PATH.
    This package requires Rust and Cargo to compile extensions. Install it through
    the system's package manager or via https://rustup.rs/

    Checking for Rust toolchain....

    [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
```
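The output above comes from a dependency's build backend, not from pip or the openai package itself: pip found no prebuilt wheel for one of the dependencies on this interpreter, fell back to building it from source, and that build needs a Rust toolchain that is not on PATH. A minimal check to confirm this from the same environment the install runs in (this snippet is just an illustrative sketch, not part of the library):

```python
import shutil
import sys

# Quick environment check: the build backend aborts when neither cargo nor rustc
# is reachable on PATH. `None` for both means pip is attempting a from-source build
# of a Rust-based dependency because no prebuilt wheel matches this interpreter.
print("python :", sys.version)
print("cargo  :", shutil.which("cargo"))
print("rustc  :", shutil.which("rustc"))
```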
To Reproduce
Install the library with `pip install openai`; the "Preparing metadata (pyproject.toml)" step fails with the output shown above.
Code snippets
```python
from openai import OpenAI

# Initialize the client with Gemini API settings
client = OpenAI(
    api_key="YOUR_GEMINI_API_KEY",  # Replace with your actual Gemini API key
    base_url="https://generativelanguage.googleapis.com/v1beta/openai/",
)

# Create a simple chat completion request
response = client.chat.completions.create(
    model="gemini-2.0-flash",  # Specify the Gemini 2.0 Flash model
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say Hello World!"},
    ],
)

# Print the response
print(response.choices[0].message.content)
```
OS
Windows 11
Python version
latest
Library version
latest
Hey, can I work on this issue?
Sounds like a dependency that requires Rust cannot be built on your system. Please identify which one and ask its maintainers for help directly.
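For anyone hitting the same wall, a rough sketch of how to pin down the failing dependency and the usual workarounds; naming `pydantic-core` below is an assumption based on the library's pydantic v2 requirement, since the truncated log above does not show which package was being built:

```python
import subprocess
import sys

# Re-running the install verbosely shows a "Collecting <name>" line right above the
# failing "Preparing metadata (pyproject.toml) ... error" block; that <name> is the
# dependency that needs Rust (assumed here to be pydantic-core).
subprocess.run([sys.executable, "-m", "pip", "install", "-v", "openai"])

# Workaround 1: upgrade pip so it can pick up newer prebuilt wheels.
subprocess.run([sys.executable, "-m", "pip", "install", "--upgrade", "pip"], check=True)

# Workaround 2: forbid source builds entirely. If this still fails, no wheel exists
# for this interpreter ("latest" Python), and you either need Rust from
# https://rustup.rs/ or a slightly older Python release that has published wheels.
subprocess.run(
    [sys.executable, "-m", "pip", "install", "--only-binary", ":all:", "openai"],
    check=True,
)
```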