Is Server-Sent Events (SSE) stream of HTTP/2 supported? #207
Hi @SokichiFujita, I took a look, and there are a few things interacting here.

```diff
 responses:
   "200":
     description: OK
     content:
       application/json:
         schema:
           $ref: "#/components/schemas/CreateChatCompletionResponse"
+      text/event-stream:
+        schema:
+          type: string
+          format: binary
```

Once you do that, you will get a new case generated in the Output enum for the binary data. You can then process that raw data according to OpenAI's instructions, for example by splitting the data on empty newlines and then decoding each JSON event. However, for now, you'll get a buffered data blob, until we address #9, which is planned for next month - at which point, you'd be able to fully asynchronously stream the data and parse it as it comes in.

To summarize:
Hope this helps!
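The split-then-decode approach described above can be sketched in Swift. This is a minimal sketch, not the library's API: `parseSSE` and `ChatChunk` are hypothetical names, and it assumes each event carries a single `data:` line, as OpenAI's chat API sends.

```swift
import Foundation

// Hypothetical minimal chunk model; OpenAI's real schema has more fields.
struct ChatChunk: Decodable {
    let id: String
    let object: String
}

// Split a buffered SSE blob into raw JSON payloads: events are separated
// by empty newlines, and each payload carries a "data: " prefix.
// Assumes one "data:" line per event (true for OpenAI's chat API).
func parseSSE(_ blob: String) -> [String] {
    blob.components(separatedBy: "\n\n")
        .map { $0.trimmingCharacters(in: .whitespacesAndNewlines) }
        .filter { $0.hasPrefix("data: ") && $0 != "data: [DONE]" }
        .map { String($0.dropFirst("data: ".count)) }
}
```

Each extracted payload can then be handed to `JSONDecoder().decode(ChatChunk.self, from:)`.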
There are some OSS Swift libraries for parsing the Server-Sent Events/EventSource format, which you could use to parse the stream and get the individual events out, which you can then feed into a decoder. Closing this issue, but I'll file a new one to document how to describe an SSE endpoint in OpenAPI.
Added this pattern to our docs: #208
Document using Server-sent Events with OpenAPI

### Motivation

Inspired by #207. While OpenAPI doesn't provide extra support for Server-sent Events, it still makes sense to document what you can achieve today - turns out it's quite a lot.

### Modifications

Documented how to spell an OpenAPI operation that returns SSE.

### Result

Folks looking to use SSE can quickly see how to consume them (how to produce them would be a similar inverse process, left as an exercise to the reader).

### Test Plan

N/A

Reviewed by: gjcairo

Builds:
- ✔︎ pull request validation (5.8) - Build finished.
- ✔︎ pull request validation (5.9) - Build finished.
- ✔︎ pull request validation (docc test) - Build finished.
- ✔︎ pull request validation (integration test) - Build finished.
- ✔︎ pull request validation (nightly) - Build finished.
- ✔︎ pull request validation (soundness) - Build finished.

#208
Thank you for the very helpful, quick response. I understand the situation and the plan. Regarding SSE, OpenAI does not provide the schema in their documentation, but they use only simple … Thank you.
I just tried this, and it worked well with manual parsing. For now this solution seems to be enough for me, and I am looking forward to using your next release with async streaming. Thank you.
```yaml
/chat/completions:
  post:
    operationId: createChatCompletion
    tags:
      - OpenAI
    summary: Creates a model response for the given chat conversation.
    requestBody:
      required: true
      content:
        application/json:
          schema:
            $ref: "#/components/schemas/CreateChatCompletionRequest"
    responses:
      "200":
        description: OK
        content:
          application/json:
            schema:
              $ref: "#/components/schemas/CreateChatCompletionResponse"
          text/event-stream:
            schema:
              $ref: "#/components/schemas/CreateChatCompletionStreamResponse"
```

```yaml
components:
  schemas:
    CreateChatCompletionStreamResponse:
      type: string
```

A part of the response of the generated code (before parsing):

```
Ok(headers: OpenAPITest.Operations.createChatCompletion.Output.Ok.Headers(), body: OpenAPITest.Operations.createChatCompletion.Output.Ok.Body.text("data: {\"id\":\"chatcmpl-***\",\"object\":\"chat.completion.chunk\",\"created\":1692339765,\"model\":\"gpt-3.5-turbo-0613\",\"choices\":[{\"index\":0,\"delta\":{\"role\":\"assistant\",\"content\":\"\"},\"finish_reason\":null}]}\n\ndata: {\"id\":\"chatcmpl-***\",\"object\":\"chat.completion.chunk\",\"created\":1692339765,\"model\":\"gpt-3.5-turbo-0613\",\"choices\":[{\"index\":0,\"delta\":{\"content\":\"Sure\"},\"finish_reason\":null}]}\n\n
```
Ah interesting - I didn't realize we generate a String underlying type for the bytes, rather than raw data (because the MIME type starts with `text/`).
Yes, for the details see https://swagger.io/docs/specification/data-models/data-types/#object
Right, in this case we derive the underlying raw type not from the JSON schema, but from the content type. So even if you changed the schema, you'd get the same generated code.
FYI #494
I'm trying to receive a stream response based on Server-Sent Events (SSE) over HTTP/2 from OpenAI's chat API, with `stream: true` in the request body and the `Accept` header set to `text/event-stream`. The status code is 200, but the following fatal client error occurs. I would like to know whether SSE is supported or not.