Support semantic SKFunctions for chat completion backends #676
Hi @danpere, we added this a couple of weeks ago: https://github.com/microsoft/semantic-kernel/blob/main/samples/dotnet/kernel-syntax-examples/Example26_SemanticFunctionsUsingChatGPT.cs. It allows using chat endpoints for semantic functions, e.g.:

```csharp
kernel.Config.AddAzureChatCompletionService("id", "gpt-35-turbo", "https://....openai.azure.com/", "...API KEY...");

var func = kernel.CreateSemanticFunction(
    "List the two planets closest to '{{$input}}', excluding moons, using bullet points.");

var result = await func.InvokeAsync("Jupiter");
```

Does it provide the functionality you were looking for?
The solution above hides the fact that you're dealing with a chat history object; in fact, the history is created from scratch every time. About the second part of your message, where functions interact with a chat history, I would use traditional native functions, passing the chat object around and invoking semantic functions manually when needed (see the sketch below).
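A minimal sketch of that pattern, assuming a plain class used as a native skill that owns the chat history and calls a semantic function manually. The class itself, the prompt text, and the `ChatHistory` member names (`Messages`, `Role`, `Content`, `AddUserMessage`, `AddAssistantMessage`) are illustrative assumptions, not code from the repository:

```csharp
using System.Linq;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.AI.ChatCompletion;

public class ChatManagerSkill
{
    private readonly ISKFunction _reply;
    private readonly ChatHistory _history;

    public ChatManagerSkill(IKernel kernel, ChatHistory history)
    {
        // The caller owns the history and passes the same instance around.
        _history = history;

        // Hypothetical semantic function; the prompt is illustrative only.
        _reply = kernel.CreateSemanticFunction(
            "Continue this conversation, answering the last user message:\n{{$input}}");
    }

    public async Task<string> ChatAsync(string userMessage)
    {
        // Assumed helper; older builds may use AddMessage(AuthorRoles.User, ...) instead.
        _history.AddUserMessage(userMessage);

        // Flatten the structured history into text so the text-completion-backed
        // semantic function can consume it (member names are assumptions).
        var flattened = string.Join("\n",
            _history.Messages.Select(m => $"{m.Role}: {m.Content}"));

        var result = await _reply.InvokeAsync(flattened);
        _history.AddAssistantMessage(result.Result);
        return result.Result;
    }
}
```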
I think it looks like that would be possible by making our own implementation of … I can see if we can fit what we want to do into that, to get a better understanding of what we want out of SemanticKernel.
After further discussion, it sounds like …
Currently semantic functions are only compatible with text completion backends. That is, `IKernel.RunAsync()` can be given a semantic function that will fill in a template and call sub-skills before making a call to `ITextCompletion.CompleteAsync()`, but there's no mechanism to get it to call `IChatCompletion.GenerateMessageAsync()` to use a chat completion backend, other than writing an adapter that makes the chat completion backend pretend to be a text completion backend, which therefore can't take advantage of chat-specific features like providing the history in a structured way (a sketch of such an adapter follows).
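For reference, a rough sketch of what such an adapter could look like, assuming `ITextCompletion` exposes only `CompleteAsync`; the settings types, `CreateNewChat`, and the message-append helper are assumptions about the SK API of the time rather than code from the repo:

```csharp
using System.Threading;
using System.Threading.Tasks;
using Microsoft.SemanticKernel.AI.ChatCompletion;
using Microsoft.SemanticKernel.AI.TextCompletion;

// Wraps a chat backend so it can be registered where a text backend is expected.
public class ChatAsTextCompletion : ITextCompletion
{
    private readonly IChatCompletion _chat;

    public ChatAsTextCompletion(IChatCompletion chat) => _chat = chat;

    public async Task<string> CompleteAsync(
        string text,
        CompleteRequestSettings requestSettings,
        CancellationToken cancellationToken = default)
    {
        // A fresh, single-message history is built on every call, so the
        // structured back-and-forth that chat models support is thrown away,
        // which is exactly the limitation described above.
        var chatHistory = _chat.CreateNewChat();
        chatHistory.AddUserMessage(text); // assumed helper; older builds use AddMessage(AuthorRoles.User, text)
        return await _chat.GenerateMessageAsync(chatHistory, new ChatRequestSettings(), cancellationToken);
    }
}
```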
The straightforward version of this that I would expect is that a semantic function's computed prompt should be added to the chat history as the user message, and the assistant response should be returned. The chat history would be kept around by the kernel. This branch implements that change... but is not at all ready to merge, because the use of `ITextCompletion` is tied into the `ISKFunction` interface. In that branch I just replaced those references with `IChatCompletion`, but more likely the desired behavior would be to support both, and I'm not sure exactly what that should look like.

Also, there are more possible ways the API client might want to manipulate the chat history. For instance, maybe subskills should include the history in the requests but not add the back-and-forth to the chat history. Or maybe the client may want to summarize or filter the history to keep it shorter or on-topic. The latter is supported in my PR by making the history setter `public`, but that might not be the best way to do it.
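A rough sketch of the flow described above, not the branch's actual code: the rendered prompt is appended to a shared history as the user message, the assistant reply is recorded and returned, and a settable history lets the client summarize or filter the conversation between turns. Type and member names (`ChatRequestSettings`, `AddUserMessage`, `AddAssistantMessage`) are assumptions:

```csharp
using System.Threading;
using System.Threading.Tasks;
using Microsoft.SemanticKernel.AI.ChatCompletion;

public class ChatBackedSemanticFunctionRunner
{
    private readonly IChatCompletion _chat;

    // A public, settable history lets the API client summarize or filter the
    // conversation between turns, as discussed above.
    public ChatHistory History { get; set; }

    public ChatBackedSemanticFunctionRunner(IChatCompletion chat)
    {
        _chat = chat;
        History = chat.CreateNewChat();
    }

    // renderedPrompt is the semantic function's computed prompt, i.e. the
    // template after variable substitution and sub-skill calls.
    public async Task<string> InvokeAsync(string renderedPrompt, CancellationToken ct = default)
    {
        History.AddUserMessage(renderedPrompt);  // prompt becomes the user message
        var reply = await _chat.GenerateMessageAsync(History, new ChatRequestSettings(), ct);
        History.AddAssistantMessage(reply);      // reply is kept in the shared history
        return reply;
    }
}
```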