.Net: Ollama LocalModels FunctionCalling Support #7442
As LLamaSharp supports tooling, will this bring changes to the overall Ollama Connector, or is it specific to Mistral?
Hello @RogerBarreto, is there any update on this issue?
@taha-ghadirian this feature is coming next week. Thanks for the interest!
Thanks for your answer. |
Hey, any update on this functionality? Thanks!
# Problem & Solution

This PR updates the Ollama Connector to use the latest `OllamaSharp` version, `4.0`, which already supports the new `Microsoft.Extensions.AI` abstractions and can be fully leveraged by the SK abstractions, including Function Calling support.

- Resolves #7442
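For illustration, a minimal sketch of what function calling through the Ollama connector looks like from the SK side, assuming the alpha `Microsoft.SemanticKernel.Connectors.Ollama` package, a local Ollama endpoint, and a tool-capable model; the `TimePlugin` here is a hypothetical plugin, not part of the library:

```csharp
using System;
using System.ComponentModel;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.Ollama;

var builder = Kernel.CreateBuilder();
builder.AddOllamaChatCompletion(
    modelId: "llama3.1",                       // assumption: any tool-capable local model
    endpoint: new Uri("http://localhost:11434"));
builder.Plugins.AddFromType<TimePlugin>();     // hypothetical plugin, defined below

var kernel = builder.Build();

// FunctionChoiceBehavior.Auto lets SK advertise kernel functions as tools
// and automatically invoke the ones the model requests.
var settings = new OllamaPromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

var chat = kernel.GetRequiredService<IChatCompletionService>();
var result = await chat.GetChatMessageContentAsync(
    "What time is it?", settings, kernel);
Console.WriteLine(result.Content);

public class TimePlugin
{
    [KernelFunction, Description("Returns the current local time.")]
    public string GetCurrentTime() => DateTime.Now.ToString("T");
}
```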
@RogerBarreto is there an ETA for #9488 to be included in a published package? I believe the current 1.28.0-alpha does not include it.
@muqeet-khan As this is already part of …
@RogerBarreto Hi Roger, I have updated all the packages to version 1.29.0. Following the example code in Samples->Demo->OllamaFunctionCalling (using chatCompletionService.GetChatMessageContentAsync), the call works as expected and successfully invokes the plugin. However, when using chatCompletionService.GetStreamingChatMessageContentsAsync, it behaves the same as before, returning a response like the one below without invoking function calling.

I'm wondering if this is a current limitation?
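For clarity, a sketch of the two call shapes being compared, assuming `kernel` was built with the Ollama connector and a plugin registered as in the sketch above, and `chatHistory` holds the user message:

```csharp
var chatCompletionService = kernel.GetRequiredService<IChatCompletionService>();
var settings = new OllamaPromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

// Non-streaming: reported above to invoke the plugin as expected.
var reply = await chatCompletionService.GetChatMessageContentAsync(
    chatHistory, settings, kernel);

// Streaming: reported above to return plain text without invoking the function.
await foreach (var chunk in chatCompletionService.GetStreamingChatMessageContentsAsync(
    chatHistory, settings, kernel))
{
    Console.Write(chunk.Content);
}
```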
Update the Ollama Connector to allow usage of the function calling pattern via Ollama's raw mode.
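For context, raw mode bypasses Ollama's server-side prompt templating, so the caller controls the full prompt, including any tool-call markup. A minimal sketch of calling Ollama's documented `/api/generate` endpoint directly with `"raw": true`; the Mistral-style tool tags in the prompt are model-specific and shown only for illustration:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Json;

using var http = new HttpClient { BaseAddress = new Uri("http://localhost:11434") };

// "raw": true disables Ollama's prompt templating; "stream": false returns
// the full response in a single JSON body.
var response = await http.PostAsJsonAsync("/api/generate", new
{
    model = "mistral",
    raw = true,
    stream = false,
    prompt = "[AVAILABLE_TOOLS] [...] [/AVAILABLE_TOOLS][INST] What time is it? [/INST]"
});

Console.WriteLine(await response.Content.ReadAsStringAsync());
```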