add example for custom x-request-id header
cjackal committed Oct 26, 2024
1 parent bfd5118 commit 5be6191
Showing 1 changed file with 4 additions and 1 deletion.
5 changes: 4 additions & 1 deletion docs/source/serving/openai_compatible_server.md
@@ -35,14 +35,17 @@ vLLM also provides experimental support for OpenAI Vision API compatible inference
 ## Extra Parameters
 vLLM supports a set of parameters that are not part of the OpenAI API.
 In order to use them, you can pass them as extra parameters in the OpenAI client.
-Or directly merge them into the JSON payload if you are using HTTP call directly.
+Or directly merge them into the HTTP headers or JSON payload if you are making HTTP calls directly.
 
 ```python
 completion = client.chat.completions.create(
     model="NousResearch/Meta-Llama-3-8B-Instruct",
     messages=[
         {"role": "user", "content": "Classify this sentiment: vLLM is wonderful!"}
     ],
+    extra_headers={
+        "x-request-id": "sentiment-classification-00001",
+    },
     extra_body={
         "guided_choice": ["positive", "negative"]
     }
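The changed paragraph also mentions merging the extra parameters into the HTTP headers or JSON payload when calling the server over raw HTTP. A minimal sketch of that equivalent request, using only the Python standard library; the server URL `http://localhost:8000` is an assumption (a locally running vLLM server), not part of the diff:

```python
import json
import urllib.request

headers = {
    "Content-Type": "application/json",
    # Custom header from the example above, passed as a plain HTTP header
    "x-request-id": "sentiment-classification-00001",
}

payload = {
    "model": "NousResearch/Meta-Llama-3-8B-Instruct",
    "messages": [
        {"role": "user", "content": "Classify this sentiment: vLLM is wonderful!"}
    ],
    # vLLM-specific extra parameter merged straight into the JSON body
    "guided_choice": ["positive", "negative"],
}

# Build (but do not send) the request; the URL is an assumed local deployment.
req = urllib.request.Request(
    "http://localhost:8000/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers=headers,
    method="POST",
)
# urllib.request.urlopen(req) would actually send it.
```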
