[issue templates] add some issue templates #3360
Conversation
And we might need to add some
This is what the output currently looks like:
Should have enough information for most issues!
Note: you can preview the UI at https://github.com/youkaichao/vllm/issues/new/choose .
If @simon-mo is bandwidth-bound, maybe @WoosukKwon can merge this PR?
For the env, can we also collect the GPU topology, like SXM or PCIe, and whether the cards are connected by NVLink? (Basically some sort of output of `nvidia-smi topo -m`.)
Please run the following and paste the output below.

```sh
wget https://raw.githubusercontent.com/vllm-project/vllm/main/collect_env.py
# For security purposes, please check the contents of collect_env.py before running it.
```
Suggested change:

```diff
-# For security purposes, please check the contents of collect_env.py before running it.
+# For security purposes, please feel free to check the contents of collect_env.py before running it.
```
```python
# All necessary imports at the beginning
import torch

# A succinct reproducing example trimmed down to the essential parts:
t = torch.rand(5, 10)  # Note: the bug is here, we should pass requires_grad=True
t.sum().backward()
```
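For illustration, here is a corrected version of the template snippet above (the fix follows the note in the snippet's own comment and is shown only as an example, not as part of the PR): passing `requires_grad=True` lets autograd track the tensor, so `backward()` populates its gradient.

```python
import torch

# With requires_grad=True, autograd tracks operations on t,
# so backward() can populate t.grad.
t = torch.rand(5, 10, requires_grad=True)
loss = t.sum()
loss.backward()
print(t.grad.shape)  # gradient has the same shape as t
```

Since the loss is a plain sum, the resulting gradient is a tensor of ones with the same shape as `t`.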
Change this to a vLLM hello world?
collect_env.py (Outdated)
@@ -0,0 +1,676 @@
Cite the PyTorch link?
Good suggestions, will do today.
Currently, developers are heavily overloaded, and we need to classify issues first.
cc @zhuohan123 @WoosukKwon @simon-mo