
Installation of Flash-Attention Stalls During Wheel Building on Windows #16

Open
hl2dm opened this issue Dec 18, 2024 · 3 comments


hl2dm commented Dec 18, 2024

1. Initial DeepSpeed issue
   I first hit an issue with DeepSpeed when installing it with `python setup.py install`:
   - The build process stalled while compiling certain modules (`cpu_adam`, `aio.lib`).
   - I resolved this by modifying `setup.py` to force `DS_BUILD_OPS=0`, which defers op compilation.
2. Flash-Attention issue
   When attempting to install Flash-Attention, the build process stalled at the following step:
       Building wheel for flash-attn (setup.py) ...
3. Troubleshooting steps
   - Reinstalled Ninja: installed Ninja 1.11.1.3 successfully, but the problem persists.
   - Disabled Ninja: set the environment variable `USE_NINJA=OFF` to fall back to a single-core setuptools build. The issue remains unresolved.
   - Verified dependencies: confirmed that PyTorch (matching my CUDA version) and other dependencies such as `nvidia-ml-py` and `torch-optimi` are installed.
   - Tried different Python environments: attempted the installation under both Python 3.10 and 3.12, but hit the same stall in both.
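For reference, the two workarounds described above can be combined into one install recipe. This is a sketch, not a guaranteed fix: `DS_BUILD_OPS` is DeepSpeed's documented switch for skipping ahead-of-time op compilation, and `MAX_JOBS` is the flash-attn README's knob for capping parallel ninja jobs (an apparent "stall" is often the compiler exhausting RAM); the exact values are assumptions.

```shell
# Skip pre-compiling DeepSpeed's optional C++/CUDA ops (cpu_adam, aio, ...);
# they can still be JIT-compiled on first use.
DS_BUILD_OPS=0 pip install deepspeed

# flash-attn compiles many CUDA kernels; unlimited ninja parallelism can
# exhaust RAM and look like a hang. Cap the number of parallel compile jobs.
MAX_JOBS=4 pip install flash-attn --no-build-isolation
```

`--no-build-isolation` makes the build see the already-installed torch instead of pulling a fresh one into an isolated build environment, which is what the flash-attn README recommends.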


wwwffbf commented Dec 18, 2024

@hl2dm
Same issue here with WSL2. I downloaded the wheel file and installed it manually:
https://github.com/Dao-AILab/flash-attention/releases
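Installing a prebuilt wheel as suggested above might look like this (a sketch; the filename is a placeholder, not a real release asset -- pick the file from the releases page that matches your Python, torch, CUDA, and C++ ABI versions):

```shell
# Download the matching wheel from the flash-attention releases page first,
# then install it directly instead of building from source.
# The filename below is a placeholder.
pip install flash_attn-<version>-cp312-cp312-linux_x86_64.whl
```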


RichGua commented Dec 24, 2024

I hope this helps you:
https://www.deepspeed.ai/tutorials/advanced-install/
To use it correctly on Python 3.12, all package versions in the environment must be compatible with each other.
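One quick way to check that the environment lines up before attempting a build is to print the interpreter, torch, and CUDA versions side by side. A minimal sketch (`build_env_report` is a hypothetical helper, not part of any library; it assumes only that flash-attn must be built against the same CUDA version torch was compiled with):

```python
import sys

def build_env_report():
    """Collect the interpreter / torch / CUDA versions relevant to a flash-attn build."""
    report = {"python": sys.version.split()[0]}
    try:
        import torch
        report["torch"] = torch.__version__
        report["cuda"] = torch.version.cuda  # None means a CPU-only torch build
    except ImportError:
        report["torch"] = None  # torch missing entirely
    return report

if __name__ == "__main__":
    for key, value in build_env_report().items():
        print(f"{key:6}: {value}")
```

If `cuda` prints `None`, a CPU-only torch build is installed and the flash-attn build cannot succeed regardless of other settings.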


hl2dm commented Dec 25, 2024

I just gave up and executed it on WSL. Now there is no problem at all. Thank you.
