langchain
Support and make use of function calling and other OpenAI updates on 2023-06-13
Feature request
OpenAI released several major updates today (2023-06-13) that likely have major implications for what is possible. At the very least, it will make things more reliable.
Here's a shortlist from the blog post:
- Dramatically improved function calling support + JSON consistency
- `gpt-3.5-turbo` with a 16K context window (🤯)
- Token cost changes for completions and embeddings
- Upcoming deprecation for March-versioned models
Motivation
The release of OpenAI's blog post found here:
https://openai.com/blog/function-calling-and-other-api-updates
I'm adding this issue mostly to flag and track this OpenAI release and kick off a forum for discussion.
Your contribution
Can potentially add PRs but haven't contributed here previously.
Looks like they added a new `functions` property to their API.
Example from post:

```shell
curl https://api.openai.com/v1/chat/completions -u :$OPENAI_API_KEY -H 'Content-Type: application/json' -d '{
  "model": "gpt-3.5-turbo-0613",
  "messages": [
    {"role": "user", "content": "What is the weather like in Boston?"}
  ],
  "functions": [
    {
      "name": "get_current_weather",
      "description": "Get the current weather in a given location",
      "parameters": {
        "type": "object",
        "properties": {
          "location": {
            "type": "string",
            "description": "The city and state, e.g. San Francisco, CA"
          },
          "unit": {
            "type": "string",
            "enum": ["celsius", "fahrenheit"]
          }
        },
        "required": ["location"]
      }
    }
  ]
}'
```
Response:

```json
{
  "id": "chatcmpl-123",
  ...
  "choices": [{
    "index": 0,
    "message": {
      "role": "assistant",
      "content": null,
      "function_call": {
        "name": "get_current_weather",
        "arguments": "{ \"location\": \"Boston, MA\"}"
      }
    },
    "finish_reason": "function_call"
  }]
}
```
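One detail worth noting: the `arguments` field comes back as a JSON-encoded *string*, not an object, so it has to be parsed before dispatching to a local function. A minimal sketch of that parsing step, with the response dict hard-coded from the example above (real code would get it from the API):

```python
import json

# Hard-coded stand-in for the API response shown above (assumption: real
# code would receive this from a chat completions call).
response = {
    "choices": [{
        "message": {
            "role": "assistant",
            "content": None,
            "function_call": {
                "name": "get_current_weather",
                "arguments": "{ \"location\": \"Boston, MA\"}",
            },
        },
        "finish_reason": "function_call",
    }]
}

message = response["choices"][0]["message"]
if message.get("function_call"):
    # The arguments string must be parsed as JSON; the model may
    # occasionally emit invalid JSON, so guard the parse.
    try:
        args = json.loads(message["function_call"]["arguments"])
    except json.JSONDecodeError:
        args = {}
    print(message["function_call"]["name"], args)
    # → get_current_weather {'location': 'Boston, MA'}
```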
This looks like it could be game changing for agents!
I haven't contributed before but am happy to work on this if helpful!
Should this be a new agent type?
From the API docs it seems an agent should repeatedly call the OpenAI API until no more function calls are returned.
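That loop could look roughly like the sketch below, with a stub standing in for the real API call (`fake_chat_completion`, `get_current_weather`, and the message format are assumptions for illustration, not the actual agent implementation):

```python
import json

# Stub standing in for the chat completions API: the first call requests a
# function, the second returns a final answer. Purely illustrative.
def fake_chat_completion(messages):
    if not any(m["role"] == "function" for m in messages):
        return {"role": "assistant", "content": None,
                "function_call": {"name": "get_current_weather",
                                  "arguments": '{"location": "Boston, MA"}'}}
    return {"role": "assistant", "content": "It's 72F and sunny in Boston."}

def get_current_weather(location, unit="fahrenheit"):
    return json.dumps({"location": location, "temperature": "72", "unit": unit})

AVAILABLE_FUNCTIONS = {"get_current_weather": get_current_weather}

def run_agent(user_input, max_steps=5):
    messages = [{"role": "user", "content": user_input}]
    # Keep calling the model until it stops requesting function calls;
    # max_steps guards against an infinite loop.
    for _ in range(max_steps):
        message = fake_chat_completion(messages)
        messages.append(message)
        call = message.get("function_call")
        if call is None:
            return message["content"]
        func = AVAILABLE_FUNCTIONS[call["name"]]
        result = func(**json.loads(call["arguments"]))
        messages.append({"role": "function", "name": call["name"], "content": result})
    raise RuntimeError("agent did not finish")

print(run_agent("What is the weather like in Boston?"))
```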
Is there somewhere I should look for proper contribution etiquette?
Here's the contribution guidelines: https://github.com/hwchase17/langchain/blob/11ab0be11aff9128c12178b5ebf62071985fb823/.github/CONTRIBUTING.md
I'm also interested in contributing to this! :smiley:
I think this could be a new OpenAI agent type, and there would need to be some way to register functions. Being able to connect functions to tools would also help remove the need for the extra prompt engineering at the beginning of the agent.
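Connecting tools to functions could be as simple as deriving a `functions` entry from each tool's name, description, and argument schema. A rough sketch with a hypothetical minimal `Tool` class (LangChain's actual tool interface exposes similar attributes but may differ in detail):

```python
from dataclasses import dataclass, field

# Hypothetical stand-in for a LangChain tool; illustrative only.
@dataclass
class Tool:
    name: str
    description: str
    # Default single-string-input schema, mirroring how most tools take one input.
    parameters: dict = field(default_factory=lambda: {
        "type": "object",
        "properties": {"input": {"type": "string"}},
        "required": ["input"],
    })

def tools_to_functions(tools):
    """Convert tools into the `functions` payload the API expects."""
    return [
        {"name": t.name, "description": t.description, "parameters": t.parameters}
        for t in tools
    ]

search = Tool(name="search", description="Search the web for a query")
functions = tools_to_functions([search])
print(functions[0]["name"])  # → search
```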
Looks like @hwchase17 is already diving right in 😎
#6113
I did not see a function calling example in https://github.com/hwchase17/langchain/pull/6113 like the one on the OpenAI site for custom functions, shown below. How can I integrate it using LangChain?
```python
import openai
import json

# Example dummy function hard coded to return the same weather
# In production, this could be your backend API or an external API
def get_current_weather(location, unit="fahrenheit"):
    """Get the current weather in a given location"""
    weather_info = {
        "location": location,
        "temperature": "72",
        "unit": unit,
        "forecast": ["sunny", "windy"],
    }
    return json.dumps(weather_info)

# Step 1, send model the user query and what functions it has access to
def run_conversation():
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo-0613",
        messages=[{"role": "user", "content": "What's the weather like in Boston?"}],
        functions=[
            {
                "name": "get_current_weather",
                "description": "Get the current weather in a given location",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {
                            "type": "string",
                            "description": "The city and state, e.g. San Francisco, CA",
                        },
                        "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                    },
                    "required": ["location"],
                },
            }
        ],
        function_call="auto",
    )
    message = response["choices"][0]["message"]

    # Step 2, check if the model wants to call a function
    if message.get("function_call"):
        function_name = message["function_call"]["name"]
        # Step 3, call the function
        # Note: the arguments come back as a JSON string and may not be valid JSON
        function_args = json.loads(message["function_call"]["arguments"])
        function_response = get_current_weather(
            location=function_args.get("location"),
            unit=function_args.get("unit", "fahrenheit"),
        )
        # Step 4, send model the info on the function call and function response
        second_response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo-0613",
            messages=[
                {"role": "user", "content": "What is the weather like in boston?"},
                message,
                {
                    "role": "function",
                    "name": function_name,
                    "content": function_response,
                },
            ],
        )
        return second_response

print(run_conversation())
```
Can we also add APIs to use the new function call feature directly without an agent?
See #6178, it explains how to use functions directly on an LLM. However, it looks like OpenAI just finetuned their models to spit out some JSON, and it doesn't work amazingly yet.
https://news.ycombinator.com/item?id=36313348
Hi, @danielgwilson! I'm Dosu, and I'm here to help the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.
From what I understand, this issue is a feature request to support and utilize OpenAI's recent updates, including improved function calling support, a new model with a larger context window, changes in token cost, and upcoming deprecation for older models. There have been discussions among users about the new `functions` property added to the API and potential contribution guidelines. There is also a question about integrating the function calling example from the OpenAI site using LangChain. Additionally, there is a request to add APIs to use the new function call feature directly without an agent.
Before we proceed, we would like to confirm if this issue is still relevant to the latest version of the LangChain repository. If it is, please let us know by commenting on this issue. Otherwise, feel free to close the issue yourself, or the issue will be automatically closed in 7 days.
Thank you for your understanding and cooperation. We appreciate your contribution to the LangChain community!