Removed Extraneous Print Message From OAI Server (vllm-project#3440)
robertgshaw2-neuralmagic authored Mar 16, 2024
1 parent 9b0a1af commit a60b105
1 changed file: vllm/entrypoints/openai/serving_completion.py (0 additions, 3 deletions)
@@ -309,10 +309,7 @@ async def completion_stream_generator(
             except ValueError as e:
                 # TODO: Use a vllm-specific Validation Error
                 data = self.create_streaming_error_response(str(e))
-                print("yield", f"data: {data}\n\n")
                 yield f"data: {data}\n\n"

-        print("yield", "data: [DONE]\n\n")
         yield "data: [DONE]\n\n"

     def request_output_to_completion_response(
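For context, the generator this diff touches streams completions as Server-Sent Events: each payload is framed as `data: <payload>\n\n`, and the stream ends with the OpenAI-style `data: [DONE]` sentinel. The removed `print` calls merely echoed each yielded frame to stdout. Below is a minimal, hypothetical sketch of that framing pattern (not the actual vLLM implementation; the function and chunk shape here are illustrative assumptions):

```python
import asyncio
import json


async def completion_stream_generator(chunks):
    # Hypothetical simplified sketch: frame each chunk as an SSE message
    # ("data: <payload>\n\n") and terminate with the "data: [DONE]"
    # sentinel, with no debug printing of the yielded frames.
    try:
        for chunk in chunks:
            data = json.dumps(chunk)
            yield f"data: {data}\n\n"
    except ValueError as e:
        # Errors are framed the same way so streaming clients can
        # surface them instead of seeing a broken connection.
        data = json.dumps({"error": str(e)})
        yield f"data: {data}\n\n"
    yield "data: [DONE]\n\n"


async def collect(gen):
    # Drain the async generator into a list for inspection.
    return [event async for event in gen]


events = asyncio.run(collect(completion_stream_generator([{"text": "hi"}])))
```

Clients consuming the stream split on the blank line between frames and stop when they see the `[DONE]` sentinel.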
