
Python: Bug: When using Azure OpenAI, even if stream options are enabled, it is not reflected in the usage. #9751

Open
yuichiromukaiyama opened this issue Nov 19, 2024 · 1 comment
Labels
bug (Something isn't working) · python (Pull requests for the Python Semantic Kernel)

Comments

@yuichiromukaiyama
Contributor

yuichiromukaiyama commented Nov 19, 2024

Describe the bug

Setting stream_options.include_usage to True is supposed to return the token usage in the stream, but usage always ends up being None.

model: Azure OpenAI gpt-4o / gpt-4o-mini
api version: 2024-10-21
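
For comparison, the usage-only final chunk can be observed with the raw openai SDK when stream_options is set. A minimal sketch, assuming openai >= 1.26; the endpoint, key, and deployment values are placeholders, not taken from this issue:

import asyncio
from openai import AsyncAzureOpenAI

async def raw_sdk_usage_check() -> None:
    client = AsyncAzureOpenAI(
        azure_endpoint="https://<resource>.openai.azure.com",
        api_key="<api-key>",
        api_version="2024-10-21",
    )
    stream = await client.chat.completions.create(
        model="<deployment-name>",
        messages=[{"role": "user", "content": "hello"}],
        stream=True,
        stream_options={"include_usage": True},
    )
    async for chunk in stream:
        # The final chunk arrives with an empty choices list and the
        # aggregated token counts in chunk.usage.
        if not chunk.choices and chunk.usage is not None:
            print("usage:", chunk.usage)

asyncio.run(raw_sdk_usage_check())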

To Reproduce

# Imports assumed for this snippet (Semantic Kernel ~1.15):
import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import (
    AzureChatCompletion,
    AzureChatPromptExecutionSettings,
)
from semantic_kernel.contents import ChatHistory
from semantic_kernel.functions import KernelArguments

# The AZURE_OPENAI_COMPLETION_* constants are assumed to come from environment/config.
async def stream_sample() -> None:
    kernel = sk.Kernel()
    service_id: str = "dummy"

    kernel.add_service(
        AzureChatCompletion(
            service_id=service_id,
            deployment_name=AZURE_OPENAI_COMPLETION_DEPLOYMENT_NAME,
            endpoint=AZURE_OPENAI_COMPLETION_ENDPOINT,
            api_key=AZURE_OPENAI_COMPLETION_API_KEY,
        )
    )

    service = kernel.get_service(service_id=service_id)
    settings = service.get_prompt_execution_settings_class()(service_id=service_id)

    if isinstance(settings, AzureChatPromptExecutionSettings):
        settings.extra_body = {
            "stream_options": {
                "include_usage": True,
            }
        }

    history = ChatHistory()
    history.add_user_message("hello")

    async for chunk in service.get_streaming_chat_message_contents(
        chat_history=history,
        settings=settings,
        kernel=kernel,
        arguments=KernelArguments(settings=settings),
    ):
        print("app chunk result: ", chunk)

Actual behavior

[StreamingChatMessageContent(choice_index=0, inner_content=ChatCompletionChunk(id='chatcmpl-AVE9a9WOtYJbFwkUqcBrvMe6CWIRr', choices=[], created=1732005938, model='gpt-4o-mini', object='chat.completion.chunk', service_tier=None, system_fingerprint='fp_d54531d9eb', usage=None), ai_model_id='gpt-4o-mini', metadata={'id': 'chatcmpl-AVE9a9WOtYJbFwkUqcBrvMe6CWIRr', 'created': 1732005938, 'system_fingerprint': 'fp_d54531d9eb', 'usage': None}, content_type='message', role=<AuthorRole.ASSISTANT: 'assistant'>, name=None, items=[], encoding=None, finish_reason=None)]

Expected behavior

[StreamingChatMessageContent(choice_index=0, inner_content=ChatCompletionChunk(id='chatcmpl-AVE9a9WOtYJbFwkUqcBrvMe6CWIRr', choices=[], created=1732005938, model='gpt-4o-mini', object='chat.completion.chunk', service_tier=None, system_fingerprint='fp_d54531d9eb', usage=CompletionUsage(completion_tokens=9, prompt_tokens=8, total_tokens=17)), ai_model_id='gpt-4o-mini', metadata={'id': 'chatcmpl-AVE9a9WOtYJbFwkUqcBrvMe6CWIRr', 'created': 1732005938, 'system_fingerprint': 'fp_d54531d9eb', 'usage': CompletionUsage(prompt_tokens=8, completion_tokens=9)}, content_type='message', role=<AuthorRole.ASSISTANT: 'assistant'>, name=None, items=[], encoding=None, finish_reason=None)]
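
Once the usage-bearing chunk is surfaced, a caller could read the counts from each message's inner_content. A sketch against the loop in the repro above; attribute names follow the outputs shown here:

    async for chunk in service.get_streaming_chat_message_contents(
        chat_history=history,
        settings=settings,
        kernel=kernel,
        arguments=KernelArguments(settings=settings),
    ):
        # Each yielded item is a list of StreamingChatMessageContent; the final
        # one should carry CompletionUsage on its inner ChatCompletionChunk.
        for message in chunk:
            if message.inner_content and message.inner_content.usage:
                print("total tokens:", message.inner_content.usage.total_tokens)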

Screenshots
none

Platform

  • OS: Mac
  • IDE: VS Code
  • Language: Python
  • Source: Semantic Kernel 1.15

Additional context
The cause is clear: the final streamed chunk can arrive with an empty choices list (length 0), and the Azure connector skips any such chunk entirely with continue. In the OpenAI connector's implementation, a chunk that includes usage is still processed appropriately.

(Referenced code: the azure and openai connector streaming implementations.)
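
A minimal sketch of the kind of guard that would address this; the loop shape and the yielded values are illustrative assumptions, not the actual connector internals:

from collections.abc import AsyncIterable

from openai.types.chat import ChatCompletionChunk

async def handle_stream(response_stream: AsyncIterable[ChatCompletionChunk]):
    # Illustrative only: mirrors the described fix, not the real connector code.
    async for chunk in response_stream:
        if len(chunk.choices) == 0:
            # With stream_options.include_usage, the final chunk has an empty
            # choices list but carries the aggregated token counts; skipping it
            # unconditionally (the current Azure behavior) drops that usage.
            if chunk.usage is not None:
                yield chunk.usage
            continue
        yield chunk.choices[0].delta.content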

@yuichiromukaiyama added the bug label Nov 19, 2024
@markwallace-microsoft added the python and triage labels Nov 19, 2024
@github-actions github-actions bot changed the title Bug: Python When using Azure OpenAI, even if stream options are enabled, it is not reflected in the usage. Python: Bug: Python When using Azure OpenAI, even if stream options are enabled, it is not reflected in the usage. Nov 19, 2024
@yuichiromukaiyama yuichiromukaiyama changed the title Python: Bug: Python When using Azure OpenAI, even if stream options are enabled, it is not reflected in the usage. Python: Bug: When using Azure OpenAI, even if stream options are enabled, it is not reflected in the usage. Nov 19, 2024
@yuichiromukaiyama
Contributor Author

I created PR: #9753
