Dev/v0.2 - Enable streaming support for openai v1.0.0b3 #491
Conversation
Let me find a few reviewers who are interested in streaming support. Feel free to invite if you know any.
Meanwhile, the structure looks good to me. It's a good time to add a test.
else:
    # If streaming is not enabled, send a regular chat completion request
    # Ensure streaming is disabled
    params["stream"] = False
Is this needed? I'd like to avoid modifying the original dict if not necessary.
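One way to address this concern is to build the request from a shallow copy of the params instead of assigning into the caller's dict. A minimal sketch (the helper name is illustrative, not from the PR):

```python
def build_request_params(params):
    """Return a copy of params with streaming explicitly disabled,
    leaving the caller's dict untouched (hypothetical helper)."""
    return {**params, "stream": False}

original = {"model": "gpt-4", "messages": []}
request = build_request_params(original)
# `original` still has no "stream" key; only the copy carries stream=False
```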
This works great for me. We've also managed to integrate it with #394 (see gif below). Given that streaming is essentially a UX-centric issue, it would probably make sense to open another PR to formally integrate streaming with some sort of messaging framework (e.g. sockets).
LGTM
Are there any plans to merge into microsoft:dev/v0.2?
After the conversation is resolved and there are no further revision requests from other reviewers.
@ragyabraham 🙏 Please please please, can you explain how you are running the UI? Is it part of your autogen branch?
Hey @franciscoabenza yeah, this is a web UI we are building for autogen. You can find it here
dev/v0.2 is merged into main. Could you recreate the PR to target main? @Alvaromah
Sure! I will, as soon as possible. |
Created PR #597 to target main. |
Why are these changes needed?
The OpenAI API, like other LLM frameworks, offers streaming, which improves debugging workflows by eliminating the wait for complete responses.
This is a simple mechanism to support streaming.
Tested on openai v1.0.0b3.
To enable streaming, just use this code:
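As a rough sketch of the pattern this PR implements (the names and chunk shape here are illustrative, not the PR's exact API): when `stream=True`, the client iterates delta chunks, echoes tokens as they arrive, and concatenates them into the final reply.

```python
def collect_stream(chunks):
    """Assemble the full reply from streamed delta chunks, printing
    each token as it arrives (illustrative names, not the PR's API)."""
    parts = []
    for chunk in chunks:
        delta = chunk.get("content")
        if delta:  # the final chunk may carry no content
            print(delta, end="", flush=True)
            parts.append(delta)
    print()
    return "".join(parts)
```

With a fake stream like `[{"content": "Hel"}, {"content": "lo"}, {}]`, this prints the tokens incrementally and returns the joined string.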
Related issue number
Related to #465, #217
Checks