.Net: Ollama LocalModels FunctionCalling Support #7442

Closed
RogerBarreto opened this issue Jul 25, 2024 · 8 comments · Fixed by #9488
Assignees: RogerBarreto
Labels: ai connector (Anything related to AI connectors), Ignite, sk team issue (A tag to denote issues that were created by the Semantic Kernel team, i.e., not the community)

Comments

@RogerBarreto
Member

Update the Ollama connector to support the function-calling pattern using Ollama's raw mode.
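For context, a minimal sketch of the calling pattern this issue targets, written against the SK connector surface that later shipped. The `AddOllamaChatCompletion` builder extension, `OllamaPromptExecutionSettings`, the model name, and `TimePlugin` are illustrative assumptions, not confirmed by this issue:

```csharp
using System;
using System.ComponentModel;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.Ollama;

var builder = Kernel.CreateBuilder();

// Register a local Ollama model (default Ollama endpoint assumed).
builder.AddOllamaChatCompletion(
    modelId: "llama3.1",
    endpoint: new Uri("http://localhost:11434"));

// Hypothetical plugin for illustration, defined below.
builder.Plugins.AddFromType<TimePlugin>();
var kernel = builder.Build();

// Let the model decide when to invoke registered kernel functions.
var settings = new OllamaPromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

var result = await kernel.InvokePromptAsync(
    "What time is it right now?", new(settings));
Console.WriteLine(result);

public class TimePlugin
{
    [KernelFunction, Description("Gets the current date and time.")]
    public string GetCurrentDateTime() => DateTime.Now.ToString("R");
}
```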

Reference:

@markwallace-microsoft markwallace-microsoft added .NET Issue or Pull requests regarding .NET code triage labels Jul 25, 2024
@RogerBarreto RogerBarreto self-assigned this Jul 25, 2024
@RogerBarreto RogerBarreto added ai connector Anything related to AI connectors and removed .NET Issue or Pull requests regarding .NET code triage labels Jul 25, 2024
@evchaki evchaki added the sk team issue A tag to denote issues that were created by the Semantic Kernel team (i.e., not the community) label Aug 22, 2024
@RogerBarreto RogerBarreto moved this to Sprint: Planned in Semantic Kernel Sep 9, 2024
@anktsrkr

As LLamaSharp supports tooling, will this bring changes to the overall Ollama connector, or is it specific to Mistral?

@taha-ghadirian

Hello @RogerBarreto, is there any update on this issue?

@RogerBarreto
Member Author

@taha-ghadirian this feature is coming in the following week.

Thanks for the interest!

@taha-ghadirian

taha-ghadirian commented Oct 3, 2024

> @taha-ghadirian this feature is coming in the following week.
>
> Thanks for the interest!

Thanks for your answer.
I'm eagerly waiting for the Ollama connector. Does this implementation also support function calling as available in Azure AI and OpenAI?

@Vijay-Braidable

Hey,

Any update on this functionality?

Thanks!

github-merge-queue bot pushed a commit that referenced this issue Nov 8, 2024
# Problem & Solution

This PR updates the Ollama connector to use the latest `OllamaSharp` version `4.0`, which already supports the new `Microsoft.Extensions.AI` abstractions and can be fully leveraged by SK abstractions, including function-calling support.

- Resolves #7442
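To illustrate the layering this PR describes: OllamaSharp 4.0's client can be consumed through `Microsoft.Extensions.AI`, and SK can build on that. A minimal sketch, assuming `OllamaApiClient` implements `IChatClient` and that SK exposes an `AsChatCompletionService()` adapter (treat the adapter name as an assumption):

```csharp
using System;
using Microsoft.Extensions.AI;
using Microsoft.SemanticKernel.ChatCompletion;
using OllamaSharp;

// OllamaSharp 4.0's client implements Microsoft.Extensions.AI's IChatClient.
IChatClient client = new OllamaApiClient(
    new Uri("http://localhost:11434"), "llama3.1");

// Bridge the IChatClient into SK's chat-completion abstraction; SK-managed
// function calling then works on top of this service.
IChatCompletionService chat = client.AsChatCompletionService();
```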
@github-project-automation github-project-automation bot moved this from Sprint: In Review to Sprint: Done in Semantic Kernel Nov 8, 2024
@muqeet-khan

@RogerBarreto is there an ETA for #9488 to be included in a published package? I believe the current 1.28.0-alpha does not include it.

@RogerBarreto
Member Author

@muqeet-khan As this is already part of main, it will be available as soon as we publish our next release, which normally happens on Tuesdays or Wednesdays.

@bauann

bauann commented Nov 14, 2024

@RogerBarreto Hi, Roger

I have updated all the packages to version 1.29.0. Following the example code in Samples->Demo->OllamaFunctionCalling (using `chatCompletionService.GetChatMessageContentAsync`), the call works as expected and successfully invokes the plugin. However, when using `chatCompletionService.GetStreamingChatMessageContentsAsync`, it behaves the same as before, returning a response like the one below without invoking the function call.

{"name": "CurrentDateTimePlugin", "parameters": {}}

I'm wondering if this is a current limitation?
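For reference, a minimal sketch of the two call paths being compared in this report, assuming `kernel` and `chatCompletionService` are wired up as in the OllamaFunctionCalling demo (the prompt and setup names are illustrative):

```csharp
// Same settings for both calls: let SK auto-invoke registered functions.
var settings = new OllamaPromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

var history = new ChatHistory();
history.AddUserMessage("What is the current date and time?");

// Works as expected: the plugin is invoked and its result is used.
var reply = await chatCompletionService.GetChatMessageContentAsync(
    history, settings, kernel);

// Reported failure: instead of invoking the function, the stream yields
// the raw tool-call JSON shown above as plain text.
await foreach (var chunk in chatCompletionService.GetStreamingChatMessageContentsAsync(
                   history, settings, kernel))
{
    Console.Write(chunk.Content);
}
```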

Projects
Status: Sprint: Done
9 participants