OpenAI - Include content in function call results
It seems the other providers also exclude content - however I'm not using them, so I wasn't able to test.
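To spell out what "content in function call results" means here, this is the shape in question - an assistant turn that carries both text content and a tool call (a minimal sketch in the chat completions message format; the id and values are made up):

// Sketch only - illustrative values, not taken from the plugin.
const assistantTurn = {
  role: "assistant",
  // The text the model produced before deciding to call the tool.
  // This PR is about including this field instead of dropping it.
  content: "Hmm, let me check that for you...",
  tool_calls: [
    {
      id: "call_example", // hypothetical id
      type: "function",
      function: {
        name: "get_weather",
        arguments: JSON.stringify({ location: "New York, NY", unit: "f" }),
      },
    },
  ],
};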
🦋 Changeset detected
Latest commit: af012b7089f460d30362aa71d95ff11f5a395440
The changes in this PR will be included in the next version bump.
This PR includes changesets to release 2 packages
| Name | Type |
|---|---|
| livekit-agents | Patch |
| livekit-plugins-openai | Patch |
ptal @davidzhao @theomonnom
do you know any providers that would do this? it'd be great to understand the behavior on our side.
I'm only familiar with OpenAI: https://platform.openai.com/playground/chat?models=gpt-4o&preset=preset-yQpTdwXuiyOaanbcYPoObp97 - here's an example.
The corresponding request code looks like:
import OpenAI from "openai";

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});

const response = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [
    {
      "role": "system",
      "content": [
        {
          "text": "Create a response that includes an interjection before calling the get_weather function to simulate processing time while the backend queries the API.\n\n# Steps\n\n1. Start with a natural-sounding interjection to acknowledge the user's request or to signal that you are processing the request.\n2. Call the `get_weather` function to retrieve the weather data.\n3. Return the final response with the weather details after the interjection.\n\n# Output Format\n\n- A natural interjection followed by the weather data response.\n- The response should be conversational and seamlessly integrate the interjection.\n\n# Examples\n\n**Input Request**: \"What's the weather like today?\"\n\n**Example Output**: \n- \"Hmm, let me check that for you...The weather today is [sunny with a high of 75°F].\" \n(Note: Replace [sunny with a high of 75°F] with actual weather data retrieved from get_weather.)\n\n# Notes\n\n- Ensure that the interjection is friendly and engaging.\n- The transition from interjection to the actual weather data should be smooth.\n- You can vary the interjection for different interactions if needed to enhance the natural feel.",
          "type": "text"
        }
      ]
    },
    {
      "role": "user",
      "content": [
        {
          "type": "text",
          "text": "What's the weather in NYC right now?"
        }
      ]
    },
    {
      "role": "assistant",
      "content": [
        {
          "type": "text",
          "text": "Oh, just a moment while I check that for you... \n\nLet me find out the current weather in NYC by querying our weather service."
        }
      ],
      "tool_calls": [
        {
          "id": "call_Gwqkvbb3r0oRJnj3zzrIEL1R",
          "type": "function",
          "function": {
            "name": "get_weather",
            "arguments": "{\"location\":\"New York, NY\",\"unit\":\"f\"}"
          }
        }
      ]
    },
    {
      "role": "tool",
      "content": [
        {
          "text": "",
          "type": "text"
        }
      ],
      "tool_call_id": "call_Gwqkvbb3r0oRJnj3zzrIEL1R"
    }
  ],
  temperature: 1,
  max_tokens: 2048,
  top_p: 1,
  frequency_penalty: 0,
  presence_penalty: 0,
  tools: [
    {
      "type": "function",
      "function": {
        "name": "get_weather",
        "strict": true,
        "parameters": {
          "type": "object",
          "required": [
            "location",
            "unit"
          ],
          "properties": {
            "unit": {
              "enum": [
                "c",
                "f"
              ],
              "type": "string"
            },
            "location": {
              "type": "string",
              "description": "The city and state e.g. San Francisco, CA"
            }
          },
          "additionalProperties": false
        },
        "description": "Determine weather in my location"
      }
    }
  ],
  parallel_tool_calls: true,
  response_format: {
    "type": "text"
  },
});
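And to check what comes back from that call, the standard response fields are enough (whether a given run returns text, a tool call, or both isn't guaranteed):

const message = response.choices[0].message;
console.log("content:", message.content);       // text content, if any
console.log("tool_calls:", message.tool_calls); // further function calls, if any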
@davidzhao I noticed a few more places to change for this to work, gimme a few
Alright, this is working. Not sure how you'd want to version it - it won't break anything, but it will require the newer openai plugin for content to actually be populated.
nice! I'll give this a try today