
Are there any plans to support flash-attn 1? #25

Open
mateiradu88 opened this issue Jul 27, 2024 · 4 comments

Comments

@mateiradu88

Are there any plans to support flash-attn 1? This is a serious limitation at the moment for anyone still running the Turing architecture, as flash-attn 2 does not support Turing GPUs.

@mateiradu88
Author

To anyone struggling with this issue on Turing: simply disabling the flash-attn 2 checks in the code seems to work just fine! You will then be able to run on flash-attn 1 without errors, and the results look similar, if not identical, to the Hugging Face demo.

@thomashay

@mateiradu88 How exactly did you do this? I tried `config=self.config, use_flash_attention_2=False` on line 115 of meshanything.py, but that results in errors later from shape_opt.py.
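
(For context, in plain Hugging Face transformers this flag is the usual load-time switch away from the flash-attention-2 backend. The sketch below is generic and uses a placeholder checkpoint; it is not MeshAnything's actual meshanything.py code, and as reported above, flipping this flag alone is not enough here.)

```python
# Generic Hugging Face transformers example, NOT MeshAnything's meshanything.py:
# choosing a non-flash attention backend at load time.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "facebook/opt-125m",          # placeholder checkpoint
    attn_implementation="eager",  # current API for selecting a non-flash backend
)
# Older transformers releases exposed a boolean `use_flash_attention_2` flag
# instead, which is the one mentioned above.
```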

@mateiradu88
Author

I've since deleted the modified files, but there was an `if` condition checking for flash-attention version 2; deleting the condition and running the branch anyway worked like a charm, with no other modifications as far as I remember.

@thomashay

Also worked for me, thanks! For anyone else: I just removed the if/else on line 347 of shape_opt.py (only the `if` and `else` statements, not the loop body, so just lines 347, 356, and 357) and could then run the point-cloud command-line inference test.
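
A minimal sketch of the kind of edit being described, assuming the deleted guard was a flash-attn version check wrapped around the attention call; all names below are illustrative stand-ins, not the literal shape_opt.py source:

```python
# Hypothetical reconstruction of the pattern being edited; names and shapes
# are illustrative, NOT the actual MeshAnything/shape_opt.py code.
import torch


def run_flash_attention(q, k, v):
    # Stand-in for the real flash-attn kernel call; the actual code calls into
    # whichever flash_attn package is installed (1.x on Turing, 2.x on Ampere+).
    scale = q.shape[-1] ** -0.5
    return torch.softmax(q @ k.transpose(-2, -1) * scale, dim=-1) @ v


def attention_forward(q, k, v):
    # Original (guarded) form, as described in the thread:
    #     if flash_attn_version_is_2():               # <- deleted
    #         out = run_flash_attention(q, k, v)      #    body kept
    #     else:                                       # <- deleted
    #         raise RuntimeError("flash-attn 2 required")  # <- deleted
    #
    # Patched form: always take the flash-attn branch, so an installed
    # flash-attn 1.x is used without tripping the version check.
    return run_flash_attention(q, k, v)


q = k = v = torch.randn(1, 8, 16, 64)
print(attention_forward(q, k, v).shape)  # torch.Size([1, 8, 16, 64])
```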
