
Support `tools` and `tool_choice` parameter in OpenAI compatible service

Open simon-mo opened this issue 2 years ago • 3 comments

Also aliased as functions and function_call in deprecated parameters.

https://platform.openai.com/docs/api-reference/chat/create#chat-create-tools

After #1756 is merged (thanks @Tostino!), it should be straightforward to add this as a core parameter to the OpenAI compatible service. This will help unlock client libraries that use the same interface. Do note that the underlying model needs to support function calling (e.g. OpenHermes) and prompt engineering might be needed.
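For concreteness, here is a hypothetical request body a client might send once this is supported. The shape follows the OpenAI Chat Completions API; the model and function names are purely illustrative:

```python
# Hypothetical request body following the OpenAI Chat Completions API.
# The model name and the get_weather function are illustrative only.
request = {
    "model": "teknium/OpenHermes-2.5-Mistral-7B",
    "messages": [
        {"role": "user", "content": "What is the weather in Berlin?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    # tool_choice can be "none", "auto", or a particular named function,
    # as in the OpenAI API reference linked above.
    "tool_choice": {"type": "function", "function": {"name": "get_weather"}},
}
```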

Also see @dongxiaolong's example here: https://github.com/vllm-project/vllm/pull/1756#issuecomment-1827064922

simon-mo avatar Nov 30 '23 19:11 simon-mo

I am confused, is this supported by vllm or not?

thiner avatar Dec 20 '23 16:12 thiner

+1 need function calling feature

code959437957 avatar Feb 27 '24 09:02 code959437957

How are the `tools` and `tool_choice` parameters supposed to be used with the vLLM OpenAI-compatible API?

weiminw avatar Mar 27 '24 05:03 weiminw

Can someone help confirm whether tools/tool_choice is supported? This is not clear from the thread. I am using the Mixtral 8x7B Instruct model, which supports function calling.

shubham-bnxt avatar Apr 15 '24 06:04 shubham-bnxt

Can someone help confirm whether tools/tool_choice is supported?

Not yet, tools support is coming with https://github.com/vllm-project/vllm/pull/3237

hmellor avatar Apr 20 '24 00:04 hmellor

I really appreciate all the work being done in all attempts so far (#2488, #3237, #4656)! I've been waiting now already months for this...

I'd like to suggest a three-step approach to adding this feature to vLLM. It's a relatively big change, and I think going step by step would make things much easier.

Step 1 – support tool_choice set to none or a particular function

In this first step, vLLM would support requests where tool_choice is set to none or to a particular function (cf. https://platform.openai.com/docs/api-reference/chat/create#chat-create-tool_choice). This means the LLM is either required to generate a call to that particular tool or required not to generate a tool call at all.

Why would this make things easier?

Supporting tool_choice none is trivial – this is what we already do.

Supporting a tool_choice that names a particular function is relatively easy. This is almost equivalent to the existing guided_json option: we would need to parse the tools parameter and translate it into something Outlines understands.
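As a rough sketch of that translation (the helper name and the exact schema shape are my own, not vLLM's actual implementation), the named-function case could be turned into a guided_json-style schema like this:

```python
def tool_to_guided_json(tool_choice, tools):
    """Sketch: given a named tool_choice, build a JSON schema that
    constrains generation to a call of exactly that function."""
    name = tool_choice["function"]["name"]
    for tool in tools:
        fn = tool["function"]
        if fn["name"] == name:
            # Constrain output to {"name": <name>, "arguments": <params>},
            # reusing the tool's own parameter schema for the arguments.
            return {
                "type": "object",
                "properties": {
                    "name": {"const": name},
                    "arguments": fn["parameters"],
                },
                "required": ["name", "arguments"],
            }
    raise ValueError(f"tool_choice names unknown function: {name}")


# Illustrative usage with a single hypothetical tool.
tools = [{"type": "function", "function": {
    "name": "get_weather",
    "parameters": {"type": "object",
                   "properties": {"city": {"type": "string"}}}}}]
schema = tool_to_guided_json(
    {"type": "function", "function": {"name": "get_weather"}}, tools)
```

The resulting schema could then be handed to the existing guided_json machinery unchanged.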

Step 2 – support tool_choice set to required

Supporting tool_choice required means allowing any one of a set of functions to be called. This is more involved than calling one particular function: we would need to use guided_json and implement some sort of "choice" between the available object schemas.
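One way that "choice" could be expressed is a JSON-schema anyOf over the per-tool schemas. This is only a sketch under that assumption, not vLLM's actual implementation:

```python
def tools_to_required_schema(tools):
    """Sketch for tool_choice="required": allow a call to any one of
    the given tools via a JSON-schema anyOf over per-tool schemas."""
    return {
        "anyOf": [
            {
                "type": "object",
                "properties": {
                    "name": {"const": t["function"]["name"]},
                    "arguments": t["function"]["parameters"],
                },
                "required": ["name", "arguments"],
            }
            for t in tools
        ]
    }


# Illustrative usage with two hypothetical tools.
tools = [
    {"type": "function", "function": {
        "name": "get_weather",
        "parameters": {"type": "object",
                       "properties": {"city": {"type": "string"}}}}},
    {"type": "function", "function": {
        "name": "get_time",
        "parameters": {"type": "object",
                       "properties": {"timezone": {"type": "string"}}}}},
]
schema = tools_to_required_schema(tools)
```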

NOTE

Steps 1 and 2 don't require embedding any tool prompting, output buffering, or parsing into vLLM. This would make the work in #4656 much simpler.

Disadvantages

Not supporting the auto tool_choice means the use case of agents choosing their next action is not covered.

Step 3 – support tool_choice auto

Once we support the other two use cases, adding the auto case should be easier; and if not, at least we would not be blocking all the use cases that can be covered without auto.

@simon-mo I'd really appreciate any feedback on this idea.

br3no avatar May 22 '24 13:05 br3no

@br3no Yes. Thank you for your suggestion and pushing this through.

simon-mo avatar May 25 '24 06:05 simon-mo