
Commit 3505814

do not exclude object field in CompletionStreamResponse (vllm-project#6196)

Signed-off-by: Alvant <[email protected]>
kczimm authored and Alvant committed Oct 26, 2024
1 parent 98e31e8 commit 3505814
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions vllm/entrypoints/openai/serving_completion.py
@@ -301,7 +301,7 @@ async def completion_stream_generator(
 else:
     chunk.usage = None

-response_json = chunk.model_dump_json(exclude_unset=True)
+response_json = chunk.model_dump_json(exclude_unset=False)
 yield f"data: {response_json}\n\n"

 if (request.stream_options
@@ -314,7 +314,7 @@ async def completion_stream_generator(
 usage=usage,
 )
 final_usage_data = (final_usage_chunk.model_dump_json(
-    exclude_unset=True, exclude_none=True))
+    exclude_unset=False, exclude_none=True))
 yield f"data: {final_usage_data}\n\n"

 except ValueError as e:
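For context, a minimal sketch of why this change matters, assuming pydantic v2 semantics: with exclude_unset=True, fields that are never explicitly assigned (such as the defaulted object field) are dropped from the serialized chunk, so switching to exclude_unset=False keeps them in the streamed JSON. SimpleChunk below is a hypothetical stand-in, not the real CompletionStreamResponse model.

# Minimal sketch, assuming pydantic v2; SimpleChunk is a hypothetical
# stand-in for CompletionStreamResponse, not the actual vLLM model.
from pydantic import BaseModel

class SimpleChunk(BaseModel):
    id: str
    object: str = "text_completion"  # defaulted, hence "unset" unless assigned explicitly
    model: str

chunk = SimpleChunk(id="cmpl-123", model="some-model")

# exclude_unset=True drops the defaulted `object` field from the payload:
print(chunk.model_dump_json(exclude_unset=True))
# {"id":"cmpl-123","model":"some-model"}

# exclude_unset=False (this commit) keeps it in the streamed response:
print(chunk.model_dump_json(exclude_unset=False))
# {"id":"cmpl-123","object":"text_completion","model":"some-model"}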
