feat(client): add support for streaming raw responses (openai#1072)
As an alternative to `with_raw_response`, we now also provide
`with_streaming_response`. When using these methods, you must
use a context manager to ensure that the response is always
cleaned up.
stainless-bot authored and megamanics committed Aug 14, 2024
1 parent 2074a8b commit 0bf971b
Showing 61 changed files with 4,273 additions and 563 deletions.
37 changes: 35 additions & 2 deletions README.md
@@ -414,7 +414,7 @@ if response.my_field is None:

### Accessing raw response data (e.g. headers)

-The "raw" Response object can be accessed by prefixing `.with_raw_response.` to any HTTP method call.
+The "raw" Response object can be accessed by prefixing `.with_raw_response.` to any HTTP method call, e.g.,

```py
from openai import OpenAI
@@ -433,7 +433,40 @@ completion = response.parse() # get the object that `chat.completions.create()`
print(completion)
```

-These methods return an [`APIResponse`](https://github.com/openai/openai-python/tree/main/src/openai/_response.py) object.
+These methods return a [`LegacyAPIResponse`](https://github.com/openai/openai-python/tree/main/src/openai/_legacy_response.py) object. This is a legacy class, as we're changing it slightly in the next major version.

For the sync client this will mostly be the same, with the exception
that `content` & `text` will be methods instead of properties. In the
async client, all methods will be async.

A migration script will be provided & the migration in general should
be smooth.
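To make the property-to-method change concrete, here is a toy sketch. These classes are illustrative stand-ins only, not the real `openai` response classes:

```python
# Toy stand-ins to illustrate the planned change -- not the real openai classes.


class ToyLegacyResponse:
    """Legacy behavior: `content` and `text` are properties."""

    def __init__(self, raw: bytes) -> None:
        self._raw = raw

    @property
    def content(self) -> bytes:
        return self._raw

    @property
    def text(self) -> str:
        return self._raw.decode("utf-8")


class ToyNewResponse:
    """Planned behavior: the same accessors become methods."""

    def __init__(self, raw: bytes) -> None:
        self._raw = raw

    def content(self) -> bytes:
        return self._raw

    def text(self) -> str:
        return self._raw.decode("utf-8")


legacy = ToyLegacyResponse(b"hello")
new = ToyNewResponse(b"hello")
print(legacy.text)   # property access today
print(new.text())    # method call after the migration
```

The data and parsing are unchanged; only the call syntax differs, which is why a mechanical migration script is feasible.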

#### `.with_streaming_response`

The above interface eagerly reads the full response body when you make the request, which may not always be what you want.

To stream the response body, use `.with_streaming_response` instead, which requires a context manager and only reads the response body once you call `.read()`, `.text()`, `.json()`, `.iter_bytes()`, `.iter_text()`, `.iter_lines()` or `.parse()`. In the async client, these are async methods.

As such, `.with_streaming_response` methods return a different [`APIResponse`](https://github.com/openai/openai-python/tree/main/src/openai/_response.py) object, and the async client returns an [`AsyncAPIResponse`](https://github.com/openai/openai-python/tree/main/src/openai/_response.py) object.

```python
with client.chat.completions.with_streaming_response.create(
    messages=[
        {
            "role": "user",
            "content": "Say this is a test",
        }
    ],
    model="gpt-3.5-turbo",
) as response:
    print(response.headers.get("X-My-Header"))

    for line in response.iter_lines():
        print(line)
```

The context manager is required so that the response will reliably be closed.
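The async client follows the same pattern with `async with` and async iteration. A sketch of the equivalent call (this assumes a valid `OPENAI_API_KEY` in the environment, so it is not executed here):

```python
import asyncio

from openai import AsyncOpenAI

client = AsyncOpenAI()


async def main() -> None:
    async with client.chat.completions.with_streaming_response.create(
        messages=[
            {
                "role": "user",
                "content": "Say this is a test",
            }
        ],
        model="gpt-3.5-turbo",
    ) as response:
        print(response.headers.get("X-My-Header"))

        async for line in response.iter_lines():
            print(line)


asyncio.run(main())
```

Exiting the `async with` block closes the underlying HTTP response even if iteration raises partway through.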

### Configuring the HTTP client

16 changes: 10 additions & 6 deletions examples/audio.py
@@ -12,14 +12,18 @@

def main() -> None:
    # Create text-to-speech audio file
-    response = openai.audio.speech.create(
-        model="tts-1", voice="alloy", input="the quick brown fox jumped over the lazy dogs"
-    )
-
-    response.stream_to_file(speech_file_path)
+    with openai.audio.speech.with_streaming_response.create(
+        model="tts-1",
+        voice="alloy",
+        input="the quick brown fox jumped over the lazy dogs",
+    ) as response:
+        response.stream_to_file(speech_file_path)

    # Create transcription from audio file
-    transcription = openai.audio.transcriptions.create(model="whisper-1", file=speech_file_path)
+    transcription = openai.audio.transcriptions.create(
+        model="whisper-1",
+        file=speech_file_path,
+    )
    print(transcription.text)

# Create translation from audio file
1 change: 1 addition & 0 deletions src/openai/__init__.py
@@ -10,6 +10,7 @@
from ._utils import file_from_path
from ._client import Client, OpenAI, Stream, Timeout, Transport, AsyncClient, AsyncOpenAI, AsyncStream, RequestOptions
from ._version import __title__, __version__
+from ._response import APIResponse as APIResponse, AsyncAPIResponse as AsyncAPIResponse
from ._exceptions import (
APIError,
OpenAIError,