Actions: triton-inference-server/vllm_backend

pre-commit

364 workflow runs
perf: Upgrade vLLM version to 0.6.3.post1
pre-commit #390: Pull request #76 synchronize by oandreeva-nv
December 20, 2024 23:11 · 2m 57s · jacky-vllm-0.6.3.post1

Setting shutdown asyncio event in a thread-safe manner
pre-commit #389: Pull request #78 synchronize by oandreeva-nv
December 20, 2024 20:07 · 21s · oandreeva_asyncio_fix

Setting shutdown asyncio event in a thread-safe manner
pre-commit #388: Pull request #78 opened by oandreeva-nv
December 20, 2024 18:44 · 3m 1s · oandreeva_asyncio_fix

Followup with some fixes
pre-commit #387: Pull request #77 opened by oandreeva-nv
December 20, 2024 02:06 · 2m 56s · oandreeva_metrics_refactor

Add resolve_model_relative_to_config_file config option
pre-commit #386: Pull request #29 synchronize by Legion2
December 16, 2024 23:35 · Action required · Legion2:local-vllm-models

perf: Upgrade vLLM version to 0.6.3.post1
pre-commit #385: Pull request #76 synchronize by kthui
December 7, 2024 00:45 · 24s · jacky-vllm-0.6.3.post1

perf: Upgrade vLLM version to 0.6.3.post1
pre-commit #384: Pull request #76 synchronize by kthui
December 7, 2024 00:27 · 23s · jacky-vllm-0.6.3.post1

perf: Upgrade vLLM version to 0.6.3.post1
pre-commit #383: Pull request #76 opened by kthui
December 6, 2024 18:39 · 2m 56s · jacky-vllm-0.6.3.post1

feat: Auto unload model if vLLM health check failed
pre-commit #378: Pull request #73 synchronize by kthui
November 27, 2024 00:04 · 17s · jacky-vllm-health

feat: Auto unload model if vLLM health check failed
pre-commit #377: Pull request #73 synchronize by kthui
November 26, 2024 23:49 · 23s · jacky-vllm-health

Update main branch post 24.11
pre-commit #376: Pull request #74 synchronize by mc-nv
November 26, 2024 23:31 · 18s · mchornyi/after-24.11

feat: Auto unload model if vLLM health check failed
pre-commit #375: Pull request #73 synchronize by kthui
November 26, 2024 23:05 · 24s · jacky-vllm-health

feat: Auto unload model if vLLM health check failed
pre-commit #374: Pull request #73 synchronize by kthui
November 26, 2024 19:10 · 22s · jacky-vllm-health

feat: Auto unload model if vLLM health check failed
pre-commit #373: Pull request #73 synchronize by kthui
November 26, 2024 18:57 · 19s · jacky-vllm-health

feat: Auto unload model if vLLM health check failed
pre-commit #372: Pull request #73 synchronize by kthui
November 26, 2024 18:08 · 26s · jacky-vllm-health

feat: Support sending additional outputs from vLLM inference
pre-commit #370: Pull request #70 synchronize by kthui
November 25, 2024 23:48 · 2m 54s · jacky-vllm-additional-outputs