openrouter-examples

How do I send a `prompt` using the Python OpenAI client?

NightMachinery opened this issue 1 year ago

The site has this example code for sending ChatML messages:

from openai import OpenAI
from os import getenv

# gets the API key from the environment variable OPENROUTER_API_KEY
client = OpenAI(
  base_url="https://openrouter.ai/api/v1",
  api_key=getenv("OPENROUTER_API_KEY"),
)

completion = client.chat.completions.create(
  model="anthropic/claude-3-opus:beta",
  messages=[
    {
      "role": "user",
      "content": "Say this is a test",
    },
  ],
)
print(completion.choices[0].message.content)

But I can't find out how to send a simple prompt (with no instruct formatting) in Python and have the response streamed.

Related:

  • https://openrouter.ai/docs#transforms

NightMachinery avatar Mar 23 '24 18:03 NightMachinery

I think the API was changed to use messages now; to send a traditional prompt, you can send it with the "system" role:

from openai import OpenAI

client = OpenAI()

completion = client.chat.completions.create(
  model="gpt-3.5-turbo",
  messages=[
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
  ],
)

print(completion.choices[0].message)
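
That snippet talks to the OpenAI API directly. For the streaming part of the original question through OpenRouter, here is a minimal sketch reusing the base_url/api_key setup from the question; it assumes the v1 Python client's streaming interface, where each chunk exposes choices[0].delta.content:

from openai import OpenAI
from os import getenv

client = OpenAI(
  base_url="https://openrouter.ai/api/v1",
  api_key=getenv("OPENROUTER_API_KEY"),
)

# stream=True returns an iterator of chunks instead of one completion object
stream = client.chat.completions.create(
  model="anthropic/claude-3-opus:beta",
  messages=[{"role": "user", "content": "Say this is a test"}],
  stream=True,
)

for chunk in stream:
  # some chunks may carry no text delta, so guard before printing
  if chunk.choices and chunk.choices[0].delta.content:
    print(chunk.choices[0].delta.content, end="", flush=True)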

paradiselabs-ai avatar Jul 31 '24 18:07 paradiselabs-ai

To send a prompt, you have to use the completions endpoint instead of chat.completions.
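
A minimal sketch of that approach (reusing the OpenRouter base URL and OPENROUTER_API_KEY setup from the question, and assuming the chosen model accepts a raw prompt on the legacy completions endpoint):

from openai import OpenAI
from os import getenv

client = OpenAI(
  base_url="https://openrouter.ai/api/v1",
  api_key=getenv("OPENROUTER_API_KEY"),
)

# the legacy text-completions endpoint takes a plain prompt string
completion = client.completions.create(
  model="anthropic/claude-3-opus:beta",
  prompt="Once upon a time",
  stream=True,
)

for chunk in completion:
  # streamed completion chunks expose the text under choices[0].text
  if chunk.choices and chunk.choices[0].text:
    print(chunk.choices[0].text, end="", flush=True)

Whether a given model accepts an untemplated prompt this way depends on the provider; the transforms doc linked above covers how OpenRouter handles prompt formatting.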

yogasanas avatar Mar 03 '25 03:03 yogasanas