Can I use function calling? #105

Open
812781385 opened this issue Jun 19, 2024 · 6 comments

Comments

@812781385

Can I use function calling?

@hopperelec
Contributor

You will have to be a bit clearer, sorry

@812781385
Author

There are cases where I want the model to call a third-party HTTP API to get results.
For example, if I ask for the latest location of ship 414096000, the model should call a third-party API to get the result instead of answering from Ollama directly.

@hopperelec
Contributor

hopperelec commented Jun 19, 2024

Ollama doesn't have a specific feature that calls APIs for you, but you can set format: "json" to force the LLM to output JSON, and then instruct the model to use a format such as

{"request":"location?ship=414096000"}

or (if it doesn't need to make a request)

{"message":"I don't need to use a third-party for that"}

Then, when you receive a response, check whether it has a "request" key and, if so, make the corresponding request and pass the response back to the model (see the sketch after the example conversation below).

For example, here's how a conversation could go.

<system>You are a helpful assistant. You format your responses with JSON. Your response should usually contain a key "message" containing your response to the user. You have access to a third-party API which can provide you information about ships. If the user asks something which requires information about a ship, you can call the API by responding with a key "request" where the value is the endpoint (and parameters) for the request you want to make. After the JSON is closed, you will then be given the response to the request and you can produce another message to pass the information to the user. You should only make such a request if it is necessary to answer the user's question. The API has the following endpoints:
Get the latest location of ship <id>: location?ship=<id>
Get the size of ship <id>: size?ship=<id>
</system>
<user>Hello</user>
<assistant>{"message":"Hello, what can I help you with? If you have any questions about ships, I can answer them for you"}</assistant>
<user>What is the latest location of ship 414096000?</user>
<assistant>{"message":"Hold on, let me find that for you.","request":"location?ship=414096000"}</assistant>
United Kingdom
<assistant>{"message":"The ship was last seen in the UK"}</assistant>
<user>And how big was that ship?</user>
<assistant>{"request":"size?ship=414096000"}</assistant>
150m
<assistant>{"message":"The ship was 150m long"}</assistant>

@NeevJewalkar

@812781385
Ollama now supports function calling through its OpenAI drop-in API
(https://gist.github.com/alonsosilvaallende/c4731e0db6bc8292ad2ae7e66ceb1ffd)
The update should be coming to custom clients as well.
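
The linked gist is Python, but the same idea through Ollama's OpenAI-compatible endpoint looks roughly like this in TypeScript. The model name and the get_ship_location tool are illustrative; the /v1 base URL is Ollama's OpenAI compatibility layer, and the API key is required by the client but ignored by Ollama:

import OpenAI from 'openai'

const openai = new OpenAI({ baseURL: 'http://localhost:11434/v1', apiKey: 'ollama' })

const response = await openai.chat.completions.create({
  model: 'llama3.1', // assumed: a model that supports tools
  messages: [{ role: 'user', content: 'What is the latest location of ship 414096000?' }],
  tools: [{
    type: 'function',
    function: {
      name: 'get_ship_location', // hypothetical tool
      description: 'Get the latest location of a ship by its ID',
      parameters: {
        type: 'object',
        properties: { ship: { type: 'string', description: 'Ship ID' } },
        required: ['ship'],
      },
    },
  }],
})

// If the model decided to call the tool, the call shows up here
// instead of (or alongside) plain text content.
console.log(response.choices[0].message.tool_calls)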

@812781385
Author

@812781385 Ollama now supports function calling through its OpenAI drop-in API (https://gist.github.com/alonsosilvaallende/c4731e0db6bc8292ad2ae7e66ceb1ffd). The update should be coming to custom clients as well.

I installed and ran your example, but I only get:
function_call=None

@BruceMacD
Collaborator

This should work now for models that support tools; you may need to update to the most recent version of Ollama.

Here is an example:
https://github.com/ollama/ollama-js/blob/main/examples/tools/tools.ts

And here are some models you can use:
https://ollama.com/search?c=tools
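
As a quick sketch of what the linked example does, using ollama-js's native tools support (the model name and the get_ship_location tool are illustrative; only models from the search link above will actually emit tool calls):

import ollama from 'ollama'

const response = await ollama.chat({
  model: 'llama3.1', // must be a tool-capable model
  messages: [{ role: 'user', content: 'What is the latest location of ship 414096000?' }],
  tools: [{
    type: 'function',
    function: {
      name: 'get_ship_location', // hypothetical tool
      description: 'Get the latest location of a ship by its ID',
      parameters: {
        type: 'object',
        required: ['ship'],
        properties: { ship: { type: 'string', description: 'Ship ID' } },
      },
    },
  }],
})

// Tool-capable models return the requested call(s) here; your code runs them
// and sends the results back as 'tool' messages in a follow-up chat call.
for (const call of response.message.tool_calls ?? []) {
  console.log(call.function.name, call.function.arguments)
}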
