forked from openai/openai-python
Commit a7446d5 (1 parent: 8e0f37a)

* docs(readme): assistant streaming (openai#1238)
* release: 1.14.1

Showing 5 changed files with 172 additions and 3 deletions.
```diff
@@ -1,3 +1,3 @@
 {
-  ".": "1.14.0"
+  ".": "1.14.1"
 }
```
@@ -0,0 +1,161 @@
# Streaming Helpers

OpenAI supports streaming responses when interacting with the [Assistant](#assistant-streaming-api) APIs.

## Assistant Streaming API

OpenAI supports streaming responses from Assistants. The SDK provides convenience wrappers around the API
so you can subscribe to the types of events you are interested in as well as receive accumulated responses.

More information can be found in the documentation: [Assistant Streaming](https://platform.openai.com/docs/assistants/overview?lang=python)
#### An example of creating a run and subscribing to some events

You can subscribe to events by creating an event handler class and overriding the relevant event handlers.

```python
from typing_extensions import override
from openai import AssistantEventHandler

# First, we create an EventHandler class to define
# how we want to handle the events in the response stream.

class EventHandler(AssistantEventHandler):
    @override
    def on_text_created(self, text) -> None:
        print("\nassistant > ", end="", flush=True)

    @override
    def on_text_delta(self, delta, snapshot):
        print(delta.value, end="", flush=True)

    def on_tool_call_created(self, tool_call):
        print(f"\nassistant > {tool_call.type}\n", flush=True)

    def on_tool_call_delta(self, delta, snapshot):
        if delta.type == 'code_interpreter':
            if delta.code_interpreter.input:
                print(delta.code_interpreter.input, end="", flush=True)
            if delta.code_interpreter.outputs:
                print("\n\noutput >", flush=True)
                for output in delta.code_interpreter.outputs:
                    if output.type == "logs":
                        print(f"\n{output.logs}", flush=True)

# Then, we use the `create_and_stream` SDK helper
# with the `EventHandler` class to create the Run
# and stream the response.
# (`client`, `thread`, and `assistant` are assumed to have been created earlier.)

with client.beta.threads.runs.create_and_stream(
    thread_id=thread.id,
    assistant_id=assistant.id,
    instructions="Please address the user as Jane Doe. The user has a premium account.",
    event_handler=EventHandler(),
) as stream:
    stream.until_done()
```
### Assistant Events

The Assistants API provides the following events that you can subscribe to.

```python
def on_event(self, event: AssistantStreamEvent)
```

This allows you to subscribe to all the possible raw events sent by the OpenAI streaming API.
In many cases it will be more convenient to subscribe to a more specific set of events for your use case.

More information on the types of events can be found here: [Events](https://platform.openai.com/docs/api-reference/assistants-streaming/events)
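
For example, a handler that works with raw events can branch on the event name. A minimal sketch (the `event.event` / `event.data` field names follow the event objects linked above; verify them against your SDK version):

```python
from typing_extensions import override
from openai import AssistantEventHandler
from openai.types.beta import AssistantStreamEvent


class RawEventHandler(AssistantEventHandler):
    @override
    def on_event(self, event: AssistantStreamEvent) -> None:
        # Each raw event carries its name in `event.event` and its payload in `event.data`.
        if event.event == "thread.run.failed":
            print(f"run failed: {event.data.last_error}")
```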

```python
def on_run_step_created(self, run_step: RunStep)
def on_run_step_delta(self, delta: RunStepDelta, snapshot: RunStep)
def on_run_step_done(self, run_step: RunStep)
```

These events allow you to subscribe to the creation, delta and completion of a RunStep.

For more information on how Runs and RunSteps work, see the documentation: [Runs and RunSteps](https://platform.openai.com/docs/assistants/how-it-works/runs-and-run-steps)
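
As a small illustration, a handler could log each step as it completes. A sketch, assuming the `type` and `status` fields commonly present on a `RunStep`:

```python
from typing_extensions import override
from openai import AssistantEventHandler
from openai.types.beta.threads.runs import RunStep


class RunStepLogger(AssistantEventHandler):
    @override
    def on_run_step_done(self, run_step: RunStep) -> None:
        # e.g. "step done: tool_calls (completed)"
        print(f"step done: {run_step.type} ({run_step.status})")
```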

```python
def on_message_created(self, message: Message)
def on_message_delta(self, delta: MessageDelta, snapshot: Message)
def on_message_done(self, message: Message)
```

This allows you to subscribe to Message creation, delta and completion events. Messages can contain
different types of content that can be sent from a model (and events are available for specific content types).
For convenience, the delta event includes both the incremental update and an accumulated snapshot of the content.

More information on messages can be found in the documentation page [Message](https://platform.openai.com/docs/api-reference/messages/object).
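
For example, a handler could print just the text portions of each completed message from its accumulated content blocks. A sketch, assuming the content-block layout described in the Message documentation:

```python
from typing_extensions import override
from openai import AssistantEventHandler
from openai.types.beta.threads import Message


class MessagePrinter(AssistantEventHandler):
    @override
    def on_message_done(self, message: Message) -> None:
        # A message holds a list of content blocks; text blocks expose their
        # value under `block.text.value`.
        for block in message.content:
            if block.type == "text":
                print(block.text.value)
```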

```python
def on_text_created(self, text: Text)
def on_text_delta(self, delta: TextDelta, snapshot: Text)
def on_text_done(self, text: Text)
```

These events allow you to subscribe to the creation, delta and completion of Text content (a specific type of message content).
For convenience, the delta event includes both the incremental update and an accumulated snapshot of the content.
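
A handler might also wait for the completed text and inspect its annotations (file citations, file paths) in one place. A brief sketch:

```python
from typing_extensions import override
from openai import AssistantEventHandler
from openai.types.beta.threads import Text


class TextCollector(AssistantEventHandler):
    @override
    def on_text_done(self, text: Text) -> None:
        # `text.value` is the fully accumulated text; `text.annotations` lists
        # any citations or file paths the assistant referenced.
        print(f"\n[{len(text.annotations)} annotation(s)]\n{text.value}")
```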

```python
def on_image_file_done(self, image_file: ImageFile)
```

Image files are not sent incrementally, so an event is provided for when an image file is available.
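
If you want to persist a generated image, one option is to fetch its bytes through the Files API when this event fires. A sketch, assuming a `client` in the enclosing scope and a hypothetical `image.png` output path:

```python
from typing_extensions import override
from openai import OpenAI, AssistantEventHandler
from openai.types.beta.threads import ImageFile

client = OpenAI()


class ImageSaver(AssistantEventHandler):
    @override
    def on_image_file_done(self, image_file: ImageFile) -> None:
        # Download the generated image via the Files API and save it locally.
        content = client.files.content(image_file.file_id)
        content.write_to_file("image.png")  # hypothetical output path
```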

```python
def on_tool_call_created(self, tool_call: ToolCall)
def on_tool_call_delta(self, delta: ToolCallDelta, snapshot: ToolCall)
def on_tool_call_done(self, tool_call: ToolCall)
```

These events allow you to subscribe to the creation, delta and completion of a ToolCall.

More information on tools can be found here: [Tools](https://platform.openai.com/docs/assistants/tools)
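
For instance, a handler could report each completed function tool call by name along with its JSON arguments (code interpreter calls are already covered by the earlier example). A sketch:

```python
from typing_extensions import override
from openai import AssistantEventHandler
from openai.types.beta.threads.runs import ToolCall


class ToolCallLogger(AssistantEventHandler):
    @override
    def on_tool_call_done(self, tool_call: ToolCall) -> None:
        # Function tool calls expose the function name and its JSON-encoded arguments.
        if tool_call.type == "function":
            print(f"\n{tool_call.function.name}({tool_call.function.arguments})")
```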

```python
def on_end(self)
```

The last event sent when a stream ends.

```python
def on_timeout(self)
```

This event is triggered if the request times out.

```python
def on_exception(self, exception: Exception)
```

This event is triggered if an exception occurs during streaming.
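
Taken together, `on_end`, `on_timeout` and `on_exception` are a convenient place for logging and cleanup. A small sketch combining the three:

```python
from typing_extensions import override
from openai import AssistantEventHandler


class LifecycleLogger(AssistantEventHandler):
    @override
    def on_end(self) -> None:
        print("\nstream finished")

    @override
    def on_timeout(self) -> None:
        print("\nstream timed out")

    @override
    def on_exception(self, exception: Exception) -> None:
        print(f"\nstream raised an exception: {exception!r}")
```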

### Assistant Methods

The assistant streaming object also provides a few methods for convenience:

```python
def current_event()
def current_run()
def current_message_snapshot()
def current_run_step_snapshot()
```

These methods are provided to allow you to access additional context from within event handlers. In many cases
the handlers should include all the information you need for processing, but if additional context is required it
can be accessed.

Note: There is not always a relevant context in certain situations (these will be undefined in those cases).
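
As an illustration only, an event handler could pull the current run while processing a delta. A hypothetical sketch that follows the accessor signatures listed above (in some SDK versions these are exposed as plain properties instead, in which case the parentheses are dropped):

```python
from typing_extensions import override
from openai import AssistantEventHandler


class ContextAwareHandler(AssistantEventHandler):
    @override
    def on_text_delta(self, delta, snapshot) -> None:
        # Access additional stream context from inside an event handler.
        run = self.current_run()  # may be undefined if there is no relevant context yet
        prefix = f"[run {run.id}] " if run else ""
        print(f"{prefix}{delta.value}", end="", flush=True)
```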

```python
def get_final_run(self)
def get_final_run_steps(self)
def get_final_messages(self)
```

These methods are provided for convenience to collect information at the end of a stream. Calling these methods
will consume the stream until completion and then return the relevant accumulated objects.
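
For example, after subscribing with an event handler you can block until the run finishes and then read back the accumulated messages. A sketch that reuses the `client`, `thread`, `assistant` and `EventHandler` names assumed in the earlier example:

```python
with client.beta.threads.runs.create_and_stream(
    thread_id=thread.id,
    assistant_id=assistant.id,
    event_handler=EventHandler(),
) as stream:
    # get_final_messages() consumes the remainder of the stream before returning.
    messages = stream.get_final_messages()

for message in messages:
    # Print only the text portions of each accumulated message.
    for block in message.content:
        if block.type == "text":
            print(block.text.value)
```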
```diff
@@ -1,4 +1,4 @@
 # File generated from our OpenAPI spec by Stainless.

 __title__ = "openai"
-__version__ = "1.14.0"  # x-release-please-version
+__version__ = "1.14.1"  # x-release-please-version
```