
Error while inference #8

Open
satani99 opened this issue Apr 6, 2023 · 4 comments

Comments

@satani99

satani99 commented Apr 6, 2023

After installing all dependencies, when I run the torchrun command I get this error:
raise RuntimeError("Distributed package doesn't have NCCL " "built in")
I can't figure out what I'm doing wrong.
Thanks

@csuhan
Collaborator

csuhan commented Apr 6, 2023

Hi @satani99 , this issue may help:
meta-llama/llama#112
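(Not part of the original thread.) The error usually means the installed PyTorch build was compiled without NCCL support, e.g. a CPU-only, Windows, or macOS wheel. A common workaround discussed in the linked issue is to fall back to the `gloo` backend when NCCL is unavailable; a minimal sketch, assuming PyTorch is installed:

```python
import torch.distributed as dist

# NCCL is only bundled with CUDA-enabled Linux builds of PyTorch.
# On CPU-only, Windows, or macOS builds, is_nccl_available() returns
# False, and initializing with backend="nccl" raises the RuntimeError
# quoted above. Picking the backend dynamically avoids the crash:
backend = "nccl" if dist.is_nccl_available() else "gloo"
print(f"Using distributed backend: {backend}")

# This backend string would then be passed to
# dist.init_process_group(backend=backend, ...) when running under
# torchrun with the usual RANK/WORLD_SIZE environment variables set.
```

Note that `gloo` only makes single-machine CPU (or debugging) runs possible; multi-GPU inference still needs a CUDA build of PyTorch with NCCL.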

@baiyuting

+1

theAdamColton pushed a commit to theAdamColton/LLaMA-Adapter that referenced this issue May 8, 2023
@dunanyang

+1

@dunanyang

Why does the requirements file specify a CPU-only torch version instead of one built with GPU support?
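(Not part of the original thread.) A quick way to check which kind of torch wheel is installed: CPU-only builds report `torch.version.cuda` as `None`. A minimal sketch, assuming PyTorch is installed:

```python
import torch

# A CPU-only wheel has no CUDA toolkit compiled in, so torch.version.cuda
# is None. A CUDA wheel reports the toolkit version (e.g. "11.8").
print("torch version:", torch.__version__)
print("CUDA build:", torch.version.cuda is not None)

# Even a CUDA build can lack a usable GPU at runtime (no driver, no device):
print("CUDA available at runtime:", torch.cuda.is_available())
```

If `CUDA build` prints `False`, reinstalling torch from a CUDA wheel index (rather than the default CPU requirement) is the likely fix.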

Labels
None yet
Projects
None yet
Development

No branches or pull requests

4 participants