Enable streaming support for openai ChatCompletion #217 #465
Conversation
As you know, OpenAI v1.0 is a total rewrite of the library with many breaking changes.
These changes may require adjustments in the AutoGen codebase. However, once those updates are in place, implementing streaming should be straightforward, based on my testing. Aside from the new response structure, I haven't noticed significant changes in streaming behavior in v1.0.
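The "new structure" mentioned above refers to openai v1.0 returning typed objects (attribute access) instead of plain dicts for streamed chunks. Below is a hedged sketch of accumulating streamed deltas under that shape; the `Delta`/`Choice`/`Chunk` stand-in classes are illustrative mocks, not the library's real types — a real stream would come from `client.chat.completions.create(..., stream=True)`.

```python
from dataclasses import dataclass
from typing import List, Optional

# Illustrative stand-ins mimicking the attribute-style chunk objects
# returned by openai v1.0 streaming (not the library's actual classes).
@dataclass
class Delta:
    content: Optional[str] = None

@dataclass
class Choice:
    delta: Delta

@dataclass
class Chunk:
    choices: List[Choice]

def collect_stream(chunks) -> str:
    """Accumulate each streamed chunk's delta content into the full reply."""
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta
        if delta.content is not None:  # some chunks (e.g. finish markers) carry no text
            parts.append(delta.content)
    return "".join(parts)

fake_stream = [
    Chunk([Choice(Delta("Hel"))]),
    Chunk([Choice(Delta("lo"))]),
    Chunk([Choice(Delta(None))]),
]
print(collect_stream(fake_stream))  # → Hello
```

The key migration point is attribute access (`chunk.choices[0].delta.content`) where pre-1.0 code used dict indexing (`chunk["choices"][0]["delta"]["content"]`).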
That's great. The changes are already in place in the PR branch #393; the core functionality works with openai v1.0 in that branch. If you could experiment with streaming support based on that branch, we might be able to integrate streaming into v0.2 before its release.
@Alvaromah I get the error below when I try to run this (not running openai 1.0): ERROR:root:Message can't be converted into a valid ChatCompletion message. Either content or function_call must be provided.
Traceback (most recent call last):
File "/Users/ragy/Documents/RNA/GitHub.nosync/agentcloud/agent-backend/src/utils/log_exception_context_manager.py", line 17, in log_exception
yield
File "/Users/ragy/Documents/RNA/GitHub.nosync/agentcloud/agent-backend/src/agents/base.py", line 107, in init_socket_generate_team
user_proxy.initiate_chat(
File "/Users/ragy/Library/Caches/pypoetry/virtualenvs/agent-backend-rnB-mn_B-py3.10/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 594, in initiate_chat
self.send(self.generate_init_message(**context), recipient, silent=silent)
File "/Users/ragy/Library/Caches/pypoetry/virtualenvs/agent-backend-rnB-mn_B-py3.10/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 346, in send
raise ValueError(
ValueError: Message can't be converted into a valid ChatCompletion message. Either content or function_call must be provided.
Could you please provide sample code to reproduce this error? Thanks.
I made some changes to the streaming implementation to support more scenarios. However, this version is not compatible with OpenAI v1.0.
Created PR #491 for dev/v0.2, tested on openai v1.0.0b3.
Sorry, it was actually an issue on my end. Working well for me now. Thank you |
Codecov Report
Attention: Patch coverage is
Additional details and impacted files

@@ Coverage Diff @@
##             main    #465      +/-   ##
==========================================
- Coverage   42.59%   1.25%   -41.35%
==========================================
  Files          21      21
  Lines        2526    2559      +33
  Branches      566     573       +7
==========================================
- Hits         1076      32    -1044
- Misses       1359    2527    +1168
+ Partials       91       0      -91

Flags with carried forward coverage won't be shown. View full report in Codecov by Sentry.
You are amazing!!!
…ntation (#465)

* host agent runtime API and docs
* graceful shutdown of worker
* HostAgentRuntime --> WorkerAgentRuntimeHost
* Add unit tests for worker runtime
* Fix bug in worker runtime adding sender field to proto. Documentation.
* wip
* Fix unit tests; refactor API
* fix formatting
* Fix
* Update
* Make source field optional in Event proto
Why are these changes needed?
The OpenAI API, like other LLM frameworks, offers streaming, which speeds up debugging by removing the need to wait for complete responses.
This is a simple and minimally intrusive mechanism to support streaming.
To enable streaming, just use this code:
Related issue number
Related to #217
Checks