
can use function call ?

Open 812781385 opened this issue 1 year ago • 6 comments

can use function call ?

812781385 avatar Jun 19 '24 07:06 812781385

You will have to be a bit clearer, sorry

hopperelec avatar Jun 19 '24 07:06 hopperelec

There are cases where I want the model to call a third-party HTTP interface to get results. For example, if I ask for the latest location of ship 414096000, the model should call a third-party API to get the result instead of answering from Ollama directly.

812781385 avatar Jun 19 '24 09:06 812781385

Ollama doesn't have a specific feature which calls APIs for you, but you can set format: "json" to force the LLM to output in JSON format, and then you could instruct the model to use a format such as

{"request":"location?ship=414096000"}

or (if it doesn't need to make a request)

{"message":"I don't need to use a third-party for that"}

Then, when you receive a response, you can check if it has a "request" key and, if so, make the corresponding request then pass the response back to the model.

For example, here's how a conversation could go.

<system>You are a helpful assistant. You format your responses with JSON. Your response should usually contain a key "message" containing your response to the user. You have access to a third-party API which can provide you information about ships. If the user asks something which requires information about a ship, you can call the API by responding with a key "request" where the value is the endpoint (and parameters) for the request you want to make. After the JSON is closed, you will then be given the response to the request and you can produce another message to pass the information to the user. You should only make such a request if it is necessary to answer the user's question. The API has the following endpoints:
Get the latest location of ship <id>: location?ship=<id>
Get the size of ship <id>: size?ship=<id>
</system>
<user>Hello</user>
<assistant>{"message":"Hello, what can I help you with? If you have any questions about ships, I can answer them for you"}</assistant>
<user>What is the latest location of ship 414096000?</user>
<assistant>{"message":"Hold on, let me find that for you.","request":"location?ship=414096000"}</assistant>
United Kingdom
<assistant>{"message":"The ship was last seen in the UK"}</assistant>
<user>And how big was that ship?</user>
<assistant>{"request":"size?ship=414096000"}</assistant>
150m
<assistant>{"message":"The ship was 150m long"}</assistant>
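The check-for-a-"request"-key step above can be sketched as a small helper. This is a minimal sketch of the manual approach, not part of ollama-js itself; `parseModelReply` is a hypothetical name, and you would feed it the content of a response produced with `format: "json"`.

```typescript
// A reply from the model (under the system prompt above) is either a plain
// message or a request for the third-party API, or both at once.
type ModelReply =
  | { kind: 'message'; text: string }
  | { kind: 'request'; endpoint: string };

// Hypothetical helper: decide whether the model's JSON output asks us to
// make an API request. If a "request" key is present, it takes priority,
// since we need the API result before the conversation can continue.
function parseModelReply(raw: string): ModelReply {
  const parsed = JSON.parse(raw);
  if (typeof parsed.request === 'string') {
    return { kind: 'request', endpoint: parsed.request };
  }
  return { kind: 'message', text: parsed.message ?? '' };
}
```

On a `request` result you would fetch the endpoint, append the API's response as the next message in the conversation, and call the model again so it can phrase the answer for the user.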

hopperelec avatar Jun 19 '24 11:06 hopperelec

@812781385 Ollama now supports function calling through their OpenAI drop-in API (https://gist.github.com/alonsosilvaallende/c4731e0db6bc8292ad2ae7e66ceb1ffd) The update should be coming to custom clients as well.

NeevJewalkar avatar Jul 21 '24 07:07 NeevJewalkar

> @812781385 Ollama now supports function calling through its OpenAI drop-in API (https://gist.github.com/alonsosilvaallende/c4731e0db6bc8292ad2ae7e66ceb1ffd). The update should be coming to custom clients as well.

I installed your example, but I get: function_call=None

812781385 avatar Jul 22 '24 02:07 812781385

This should work now for models that support tools; you may need to update to the most recent version of Ollama.

Here is an example: https://github.com/ollama/ollama-js/blob/main/examples/tools/tools.ts

And here are some models you can use: https://ollama.com/search?c=tools
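To connect this back to the original question, here is a rough sketch of what a tool definition and a local handler for the ship-location lookup might look like. The tool name `get_ship_location` and the canned handler are hypothetical examples (a real handler would call the third-party HTTP API); see the linked tools.ts example for the full flow.

```typescript
// A tool definition in the shape expected by ollama.chat({ model, messages, tools }).
const tools = [
  {
    type: 'function',
    function: {
      name: 'get_ship_location',
      description: 'Get the latest reported location of a ship by its id',
      parameters: {
        type: 'object',
        properties: {
          ship: { type: 'string', description: 'The ship id, e.g. 414096000' },
        },
        required: ['ship'],
      },
    },
  },
];

// Map tool names to local implementations. This one returns canned data;
// a real implementation would fetch from the third-party API instead.
const handlers: Record<string, (args: { ship: string }) => Promise<string>> = {
  get_ship_location: async ({ ship }) => `Ship ${ship}: United Kingdom`,
};

// Dispatch one tool call from the model's response (response.message.tool_calls).
async function runToolCall(call: {
  function: { name: string; arguments: { ship: string } };
}): Promise<string> {
  const handler = handlers[call.function.name];
  if (!handler) throw new Error(`Unknown tool: ${call.function.name}`);
  return handler(call.function.arguments);
}
```

After the first `ollama.chat` call, you would run each entry of `response.message.tool_calls` through a dispatcher like this, push each result back as a `{ role: 'tool', content }` message, and call `ollama.chat` again so the model can phrase the result for the user.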

BruceMacD avatar Jul 22 '24 23:07 BruceMacD