
.Net: Bug: SemanticKernel does not support Azure o1 series models #9749

Open
aeras3637 opened this issue Nov 19, 2024 · 2 comments
Labels: bug (Something isn't working), .NET (Issue or Pull requests regarding .NET code)

Comments

@aeras3637

Describe the bug
The latest version of the Semantic Kernel library does not support the Azure OpenAI o1 series models.
Reason for the error: "Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead."

To Reproduce

using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.AzureOpenAI;

static async Task Main(string[] args)
{
    // Build a kernel backed by an Azure OpenAI o1 deployment.
    var kernel = Kernel.CreateBuilder()
        .AddAzureOpenAIChatCompletion(
            "deploy",    // deployment name
            "endpoint",  // Azure OpenAI endpoint
            "key"        // API key
        ).Build();

    // Setting MaxTokens makes the connector send 'max_tokens',
    // which o1 models reject in favor of 'max_completion_tokens'.
    KernelArguments arguments = new(new AzureOpenAIPromptExecutionSettings() { MaxTokens = 4096 });
    Console.WriteLine(await kernel.InvokePromptAsync("What is the Semantic Kernel? Include citations to the relevant information where it is referenced in the response.", arguments));
}

Platform

  • OS: windows
  • IDE: Visual Studio
  • Language: C#
  • Source: NuGet package version 1.29.0

Additional context

Microsoft.SemanticKernel.HttpOperationException
HResult=0x80131500
Message=HTTP 400 (invalid_request_error: unsupported_parameter)
Parameter: max_tokens

Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.
Source=Microsoft.SemanticKernel.Connectors.OpenAI
Stack trace:
   at Microsoft.SemanticKernel.Connectors.OpenAI.ClientCore.d__73`1.MoveNext()
   at Microsoft.SemanticKernel.Connectors.OpenAI.ClientCore.<GetChatMessageContentsAsync>d__16.MoveNext()
   at Microsoft.SemanticKernel.KernelFunctionFromPrompt.<GetChatCompletionResultAsync>d__25.MoveNext()
   at Microsoft.SemanticKernel.KernelFunctionFromPrompt.<InvokeCoreAsync>d__6.MoveNext()
   at System.Threading.Tasks.ValueTask`1.get_Result()
   at Microsoft.SemanticKernel.KernelFunction.<>c__DisplayClass27_0.<b__0>d.MoveNext()
   at Microsoft.SemanticKernel.Kernel.d__34.MoveNext()
   at Microsoft.SemanticKernel.Kernel.d__33.MoveNext()
   at Microsoft.SemanticKernel.KernelFunction.d__27.MoveNext()
   at AOAI.Program.d__0.MoveNext() in D:\sandbox\AOAI\Program.cs: line 35

This exception was originally thrown at this call stack:
   [External Code]

Inner Exception 1:
ClientResultException: HTTP 400 (invalid_request_error: unsupported_parameter)
Parameter: max_tokens

Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.

@aeras3637 aeras3637 added the bug Something isn't working label Nov 19, 2024
@markwallace-microsoft markwallace-microsoft added .NET Issue or Pull requests regarding .NET code triage labels Nov 19, 2024
@github-actions github-actions bot changed the title Bug: SemanticKernel does not support Azure o1 series models .Net: Bug: SemanticKernel does not support Azure o1 series models Nov 19, 2024
@RogerBarreto
Member

RogerBarreto commented Nov 19, 2024

This error is related to the Azure SDK and can be found here:

In case it helps with the issue above: to avoid this error, do not set the max tokens in the execution settings, and the prompt should execute.

Another approach is to intercept the HTTP request with a custom HttpHandler and rewrite the request body to send max_completion_tokens instead of max_tokens when targeting an o1 model; a sketch is shown below.
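
A minimal sketch of such a handler, assuming the request body is JSON containing a max_tokens property (the class name O1TokenParameterHandler is illustrative and not part of Semantic Kernel):

using System.Net.Http;
using System.Text;
using System.Text.Json.Nodes;
using System.Threading;
using System.Threading.Tasks;

// Illustrative DelegatingHandler: rewrites 'max_tokens' to 'max_completion_tokens'
// in the outgoing request body before it reaches the Azure OpenAI endpoint.
public class O1TokenParameterHandler : DelegatingHandler
{
    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        if (request.Content is not null)
        {
            var body = await request.Content.ReadAsStringAsync(cancellationToken);
            if (JsonNode.Parse(body) is JsonObject json && json["max_tokens"] is JsonNode maxTokens)
            {
                // Move the token limit to the parameter name the o1 models accept.
                json.Remove("max_tokens");
                json["max_completion_tokens"] = maxTokens.GetValue<int>();
                request.Content = new StringContent(json.ToJsonString(), Encoding.UTF8, "application/json");
            }
        }

        return await base.SendAsync(request, cancellationToken);
    }
}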

@mathieumack
Contributor

Hi @aeras3637 ,
Here is a sample of a fix that can be done with a custom HttpHandler:
https://github.com/mathieumack/MDev.Dotnet.SemanticKernel/blob/feature%2F4-net9/src%2FMDev.Dotnet.SemanticKernel.Connectors.AzureAIStudio.Gpt4o1%2FHttpClientHandlers%2FAzureOpenAIHttpClientHandler.cs
Note: it is not just system messages that fail; temperature and tools do as well.
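
For reference, a handler like the one sketched above could be wired into the kernel through an HttpClient. This assumes the AddAzureOpenAIChatCompletion overload that accepts an httpClient parameter (present in recent Semantic Kernel versions) and reuses the illustrative O1TokenParameterHandler class from the earlier sketch:

using System.Net.Http;
using Microsoft.SemanticKernel;

// Route the connector's traffic through the rewriting handler.
var httpClient = new HttpClient(new O1TokenParameterHandler
{
    InnerHandler = new HttpClientHandler()
});

var kernel = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion(
        "deploy",
        "endpoint",
        "key",
        httpClient: httpClient)
    .Build();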
