openai-cookbook
[SUPPORT] function call usage
environment:ubuntu=20.04 + transformers=4.42.4 + openai=1.30.5 + vllm=0.5.2
I use vLLM as the server and the openai package as the client, with code closely following the cookbook example: https://github.com/openai/openai-cookbook/blob/main/examples/How_to_call_functions_with_chat_models.ipynb
However, the response contains no function-call info. How can I fix this?
---------------------------------------------------- client code info ---------------------------------------------------------

```python
from openai import OpenAI
import json

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "format": {
                        "type": "string",
                        "enum": ["celsius", "fahrenheit"],
                        "description": "The temperature unit to use. Infer this from the users location.",
                    },
                },
                "required": ["location", "format"],
            },
        },
    },
    {
        "type": "function",
        "function": {
            "name": "get_n_day_weather_forecast",
            "description": "Get an N-day weather forecast",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "format": {
                        "type": "string",
                        "enum": ["celsius", "fahrenheit"],
                        "description": "The temperature unit to use. Infer this from the users location.",
                    },
                    "num_days": {
                        "type": "integer",
                        "description": "The number of days to forecast",
                    },
                },
                "required": ["location", "format", "num_days"],
            },
        },
    },
]

openai_api_key = "xxx"
openai_api_base = "http://localhost:8000/v1/"

client = OpenAI(
    api_key=openai_api_key,
    base_url=openai_api_base,
)

models = client.models.list()
model = models.data[0].id

messages = []
messages.append({"role": "system", "content": "Don't make assumptions about what values to plug into functions. Ask for clarification if a user request is ambiguous."})
messages.append({"role": "user", "content": "What's the weather like today"})

response = client.chat.completions.create(
    model=model,
    messages=messages,
    tools=tools,
)
assistant_message = response.choices[0].message
messages.append(assistant_message)
print("response: ", assistant_message)
```
---------------------------------------------- server script -------------------------------------------------

```shell
python entrypoints/openai/api_server.py \
    --model="xxxx/Qwen2-1.5B-Instruct" \
    --trust-remote-code \
    --host "localhost" \
    --port 8000 \
    --dtype auto
```
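One likely cause: by default, vLLM's OpenAI-compatible server does not parse the model's raw output into structured `tool_calls`; tool-call parsing has to be enabled at launch. The sketch below assumes a newer vLLM release than 0.5.2 (the `--enable-auto-tool-choice` and `--tool-call-parser` flags are not in 0.5.2, so an upgrade would be needed), and the `hermes` parser name is an assumption that depends on the model's chat template; check `vllm serve --help` for your installed version:

```shell
# Sketch: launch vLLM with tool-call parsing enabled (newer vLLM versions only).
# The two tool-related flags below are assumptions for this setup; verify them
# against the output of `vllm serve --help` before relying on this.
vllm serve xxxx/Qwen2-1.5B-Instruct \
    --trust-remote-code \
    --host localhost \
    --port 8000 \
    --dtype auto \
    --enable-auto-tool-choice \
    --tool-call-parser hermes
```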
--------------------------------------------- print info --------------------------------------------------------

```
response:  ChatCompletionMessage(content='Get out and check.', role='assistant', function_call=None, tool_calls=[])
```
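For reference, when the server does populate `tool_calls`, the cookbook's pattern is to read each call's function name and JSON-encoded arguments and dispatch to a local function. A minimal self-contained sketch of that pattern (the assistant message is mocked here with `SimpleNamespace`, and `get_current_weather` is an illustrative stand-in, not a real weather lookup):

```python
import json
from types import SimpleNamespace

def get_current_weather(location, format):
    # Stand-in implementation for illustration only.
    return f"Sunny in {location} (unit: {format})"

AVAILABLE_TOOLS = {"get_current_weather": get_current_weather}

# Mocked assistant message, shaped like the openai client's ChatCompletionMessage.
assistant_message = SimpleNamespace(
    tool_calls=[
        SimpleNamespace(
            id="call_1",
            function=SimpleNamespace(
                name="get_current_weather",
                arguments='{"location": "San Francisco, CA", "format": "celsius"}',
            ),
        )
    ]
)

results = []
for tool_call in assistant_message.tool_calls or []:
    fn = AVAILABLE_TOOLS[tool_call.function.name]
    args = json.loads(tool_call.function.arguments)  # arguments arrive as a JSON string
    # Tool results go back into the conversation as "tool" role messages.
    results.append({"role": "tool", "tool_call_id": tool_call.id, "content": fn(**args)})

print(results[0]["content"])  # → Sunny in San Francisco, CA (unit: celsius)
```

An empty `tool_calls=[]` like the one printed above means the loop simply does nothing, which is why the script appears to "show no function info to call".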
@FanZhang91 Please format your code with triple backticks.
What GPT model are you using? Your invocation includes --model="xxxx/Qwen2-1.5B-Instruct" which of course isn't an OpenAI model at all.
Please provide a simpler code sample that doesn't use an API server.
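For a minimal repro, the request can be reduced to one tool and one user message. The sketch below only builds and inspects the request payload (no network call, no server); the model name is an assumption standing in for any tools-capable model:

```python
import json

payload = {
    "model": "gpt-4o-mini",  # assumption: any tools-capable model
    "messages": [
        {"role": "user", "content": "What's the weather like in San Francisco, CA?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_current_weather",
                "description": "Get the current weather",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {
                            "type": "string",
                            "description": "The city and state, e.g. San Francisco, CA",
                        }
                    },
                    "required": ["location"],
                },
            },
        }
    ],
}

# With a real client this would be: client.chat.completions.create(**payload)
print(json.dumps(payload, indent=2))
```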
This issue is stale because it has been open 60 days with no activity. Remove stale label or comment or this will be closed in 10 days.
This issue was closed because it has been stalled for 10 days with no activity.