Request Body
Without using the AI SDK, a request to an API route in my project looks like this:
const response = await fetch("/api/gpt3-output", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    promptText: prompt,
    maxTokens: Number(gpt3Length),
    temperature: Number(gpt3Temperature),
    topP: Number(gpt3TopP),
    presencePenalty: Number(gpt3PresencePenalty),
    frequencyPenalty: Number(gpt3FrequencyPenalty),
  }),
})
and the API route file looks like this:
const { promptText, maxTokens, temperature, topP, presencePenalty, frequencyPenalty } =
  await req.json()

const payload: OpenAIStreamPayload = {
  model: "gpt-4",
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: promptText },
  ],
  max_tokens: maxTokens as number,
  temperature: temperature as number,
  top_p: topP as number,
  presence_penalty: presencePenalty as number,
  frequency_penalty: frequencyPenalty as number,
  stream: true,
}
const stream = await OpenAIStream(payload)
return new Response(stream)
How can I pass the request body parameters with the AI SDK?
I use something like this. (See below for an untested example with your code.) I'm not sure it's the best-practice way to do it, but it works well for me.
import { OpenAIStream, StreamingTextResponse } from 'ai'
import { Configuration, OpenAIApi } from 'openai-edge'

export const runtime = 'edge'

const openai = new OpenAIApi(
  new Configuration({
    apiKey: process.env.OPENAI_API_KEY,
  }),
)
export async function POST(req: Request) {
  const { promptText, maxTokens, temperature, topP, presencePenalty, frequencyPenalty } =
    await req.json()

  // Pass the request object inline; openai-edge supplies the request type,
  // so no separate OpenAIStreamPayload type is needed here.
  const response = await openai.createChatCompletion({
    model: "gpt-4",
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: promptText },
    ],
    max_tokens: maxTokens as number,
    temperature: temperature as number,
    top_p: topP as number,
    presence_penalty: presencePenalty as number,
    frequency_penalty: frequencyPenalty as number,
    stream: true,
  })

  const stream = OpenAIStream(response)
  return new StreamingTextResponse(stream)
}
For some endpoints I use a slightly different implementation, where the body contains only specific parameters (user data, not hyperparameters) that get template-literal'd into the prompt; the important bits of the prompt are stored on the backend.
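That backend-templating pattern can be sketched in plain TypeScript. Everything here is a hypothetical illustration (the `buildPrompt` helper, the `topic`/`tone` fields, and the template text are made up, not part of any SDK):

```typescript
// Hypothetical sketch: the prompt template lives on the server.
// The client's request body carries only user data, never the template
// or the hyperparameters.
const promptTemplate = (topic: string, tone: string): string =>
  `Write a short paragraph about ${topic} in a ${tone} tone.`;

interface UserFields {
  topic: string;
  tone: string;
}

// The route handler would call this with the parsed request body.
function buildPrompt(body: UserFields): string {
  return promptTemplate(body.topic, body.tone);
}

console.log(buildPrompt({ topic: "TypeScript", tone: "friendly" }));
// → Write a short paragraph about TypeScript in a friendly tone.
```

The resulting string would then take the place of `promptText` in the `messages` array, so the client never sees or controls the full prompt.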
What would the frontend call look like for this API route when using the useChat hook, as in the example?
The useChat hook accepts a body field for the extra request body (see the docs):
useChat({
  body: {
    maxTokens: Number(gpt3Length),
    temperature: Number(gpt3Temperature),
    topP: Number(gpt3TopP),
    presencePenalty: Number(gpt3PresencePenalty),
    frequencyPenalty: Number(gpt3FrequencyPenalty),
  },
})
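On the server side, those extra fields arrive in the same JSON body as the `messages` array that useChat sends by default, so the route handler destructures both from one `req.json()` call. A minimal sketch of what that body looks like (the `ChatRequestBody` interface is an assumption for illustration; useChat actually sends more fields than shown):

```typescript
// Assumed shape of the merged request body: useChat's default
// { messages } plus the custom fields passed via `body`.
interface ChatRequestBody {
  messages: { role: string; content: string }[];
  maxTokens: number;
  temperature: number;
  topP: number;
  presencePenalty: number;
  frequencyPenalty: number;
}

// In the route handler you would write:
//   const { messages, maxTokens, ... } = (await req.json()) as ChatRequestBody
// Simulated here with a plain object standing in for the parsed JSON:
const parsed: ChatRequestBody = {
  messages: [{ role: "user", content: "Hello" }],
  maxTokens: 256,
  temperature: 0.7,
  topP: 1,
  presencePenalty: 0,
  frequencyPenalty: 0,
};

const { messages, maxTokens, temperature } = parsed;
console.log(messages[0].content, maxTokens, temperature);
```

Note that with useChat the conversation comes in as `messages` rather than a single `promptText`, so the handler above would pass `messages` through (or append a system message) instead of building the array from one prompt string.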