[Bugfix] Bump fastapi and pydantic version (vllm-project#8435)
DarkLight1337 authored and MengqingCao committed Sep 30, 2024
1 parent ae15dae commit 188612f
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions requirements-common.txt
@@ -7,11 +7,11 @@ py-cpuinfo
transformers >= 4.43.2 # Required for Chameleon and Llama 3.1 hotfix.
tokenizers >= 0.19.1 # Required for Llama 3.
protobuf # Required by LlamaTokenizer.
-fastapi
+fastapi >= 0.114.1
aiohttp
openai >= 1.40.0 # Ensure modern openai package (ensure types module present)
uvicorn[standard]
-pydantic >= 2.8 # Required for OpenAI server.
+pydantic >= 2.9 # Required for fastapi >= 0.113.0
pillow # Required for image processing
prometheus_client >= 0.18.0
prometheus-fastapi-instrumentator >= 7.0.0
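As a quick sanity check (not part of this commit), the bumped minimums can be verified against an installed environment. A minimal sketch, assuming `fastapi` and `pydantic` are importable locally; the `parse` helper and the `MINIMUMS` mapping are illustrative, not from the vllm codebase:

```python
# Sketch: check installed package versions against the bumped minimums
# from this commit (fastapi >= 0.114.1, pydantic >= 2.9).
from importlib.metadata import version, PackageNotFoundError

# Hypothetical mapping of the new floors introduced by the diff.
MINIMUMS = {"fastapi": (0, 114, 1), "pydantic": (2, 9)}

def parse(v: str) -> tuple:
    """Parse the leading numeric release segment of a version string.

    e.g. "2.9.0" -> (2, 9, 0); a pre-release like "2.9.0b1" -> (2, 9, 0).
    """
    parts = []
    for piece in v.split("."):
        digits = ""
        for ch in piece:
            if ch.isdigit():
                digits += ch
            else:
                break
        if not digits:
            break
        parts.append(int(digits))
    return tuple(parts)

def satisfied(pkg: str, minimum: tuple) -> bool:
    """True if the installed version of pkg meets the minimum tuple."""
    try:
        return parse(version(pkg)) >= minimum
    except PackageNotFoundError:
        return False

if __name__ == "__main__":
    for pkg, floor in MINIMUMS.items():
        print(pkg, "ok" if satisfied(pkg, floor) else "too old / missing")
```

Tuple comparison handles the check naturally: `(2, 8, 2) < (2, 9)` because the second component differs, so a pydantic 2.8.x install is flagged as too old.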
