
Add pne.chat() support for the OpenAI provider


🚀 Feature Request

Add support in pne.chat() for using the OpenAI provider to proxy a specified model, e.g.:

import promptulate as pne

pne.chat(messages="hello", model="gpt-4-turbo")

If developers want to use the OpenAI provider through pne.chat(), they can call pne.chat(messages="hello", model="openai/custom-model", model_config={"base_url": "xxxx", "api_key": "xxx"}) to chat with a custom model. The model name uses the openai/ prefix.
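Spelled out, the proposed call might look like the sketch below; the custom-model name, base_url, and api_key values are placeholders, and the model_config keys are part of this proposal rather than existing behavior.

import promptulate as pne

# Proposed convention (sketch): everything after the "openai/" prefix is the
# real model name, and model_config is forwarded to the OpenAI client.
response = pne.chat(
    messages="hello",
    model="openai/custom-model",
    model_config={
        "base_url": "https://example-provider.com/v1",  # placeholder endpoint
        "api_key": "sk-xxx",  # placeholder key
    },
)
print(response)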

Why?

There are lots of providers that use the OpenAI SDK to proxy their models, e.g.:

  1. DeepSeek: https://platform.deepseek.com/api-docs/api/create-chat-completion

Original:

from openai import OpenAI

client = OpenAI(api_key="<your API key>", base_url="https://api.deepseek.com")

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Hello"},
    ],
    max_tokens=1024,
    temperature=0.7,
    stream=False
)

print(response.choices[0].message.content)

Expected:

import promptulate as pne

pne.chat(
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Hello"},
    ], 
    model="openai/deepseek-chat", 
    model_config={
        "base_url": "https://api.deepseek.com",
        "api_key": "<your API key>",
        "max_tokens": 1024,
        "temperature": 0.7,
        "stream": False,
    },
)

  2. OpenRouter: https://openrouter.ai/docs#principles (an expected pne.chat() equivalent is sketched after this provider list)


from openai import OpenAI
from os import getenv

# gets API Key from environment variable OPENAI_API_KEY
client = OpenAI(
  base_url="https://openrouter.ai/api/v1",
  api_key=getenv("OPENROUTER_API_KEY"),
)

completion = client.chat.completions.create(
  extra_headers={
    "HTTP-Referer": "<YOUR_SITE_URL>",  # Optional, for including your app on openrouter.ai rankings.
    "X-Title": "<YOUR_APP_NAME>",  # Optional. Shows in rankings on openrouter.ai.
  },
  model="openai/gpt-3.5-turbo",
  messages=[
    {
      "role": "user",
      "content": "Say this is a test",
    },
  ],
)
print(completion.choices[0].message.content)

  3. Zhipu AI
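For OpenRouter (item 2 above), the expected call would presumably mirror the DeepSeek example. This is only a sketch, assuming the leading openai/ provider prefix is stripped before the remaining name (here OpenRouter's own openai/gpt-3.5-turbo id) is forwarded, and it leaves extra_headers aside (see Attention below).

import promptulate as pne
from os import getenv

# Sketch only: assumes pne.chat() strips the leading "openai/" provider prefix
# and forwards the rest ("openai/gpt-3.5-turbo") to OpenRouter as the model name.
response = pne.chat(
    messages=[{"role": "user", "content": "Say this is a test"}],
    model="openai/openai/gpt-3.5-turbo",
    model_config={
        "base_url": "https://openrouter.ai/api/v1",
        "api_key": getenv("OPENROUTER_API_KEY"),
    },
)
print(response)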

What to do?

  1. Optimize the pne.chat() core code (a possible routing sketch follows this list).
  2. Add unit tests.
  3. Add docs and a notebook showing how to use a custom model via the OpenAI provider in pne.chat().
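
For item 1, the routing inside pne.chat() might start with something like the sketch below. The helper name _split_provider is hypothetical and only illustrates the proposed openai/ naming convention:

OPENAI_PREFIX = "openai/"

def _split_provider(model: str) -> tuple:
    """Hypothetical helper: map "openai/deepseek-chat" to ("openai", "deepseek-chat").
    Model names without the prefix keep the current default routing."""
    if model.startswith(OPENAI_PREFIX):
        return "openai", model[len(OPENAI_PREFIX):]
    return "default", model

print(_split_provider("openai/deepseek-chat"))  # ('openai', 'deepseek-chat')
print(_split_provider("gpt-4-turbo"))           # ('default', 'gpt-4-turbo')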

Attention

With OpenRouter, the OpenAI client needs to send extra_headers. How pne.chat() should expose this is a question to consider.
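One option, sketched here purely as an assumption, is to accept an extra_headers entry in model_config and pass it through to chat.completions.create(), which the OpenAI SDK already supports. The helper name and the model_config keys are hypothetical.

from openai import OpenAI

def _chat_with_openai_provider(messages: list, real_model: str, model_config: dict) -> str:
    """Hypothetical internal helper: build the client from base_url/api_key,
    pop extra_headers (needed by OpenRouter), and forward the remaining
    model_config entries (temperature, max_tokens, ...) to the completion call."""
    client = OpenAI(
        api_key=model_config.pop("api_key", None),
        base_url=model_config.pop("base_url", None),
    )
    extra_headers = model_config.pop("extra_headers", None)
    response = client.chat.completions.create(
        model=real_model,
        messages=messages,
        extra_headers=extra_headers,
        **model_config,
    )
    # Sketch assumes stream=False so the full message is available here.
    return response.choices[0].message.content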

Undertone0809 · May 07 '24 09:05