[FEATURE] Update PyTorch dependency #884
Comments
Sorry, I am not proficient enough in either the latest PyTorch changes or in how GaNDLF uses distributed training. Do we have any tests for multi-GPU training? I cannot find any. If yes, then maybe just running the tests should be enough to ensure the new version is OK for us.
Unfortunately, we do not have any GPU tests right now. 😞 I am fine with updating the dependency right now, but I would like to get the opinion of other developers/contributors/maintainers. 😄
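For reference, a minimal sketch of how GPU-gated tests could be structured with pytest, so they become no-ops on CPU-only CI runners; the test bodies below are illustrative placeholders, not existing GaNDLF tests:

```python
# Sketch: pytest tests that skip themselves when no (or too few) GPUs exist.
# The model and shapes are arbitrary placeholders, not GaNDLF code.
import pytest
import torch


@pytest.mark.skipif(not torch.cuda.is_available(), reason="requires a CUDA GPU")
def test_forward_pass_on_gpu():
    model = torch.nn.Linear(16, 4).cuda()
    batch = torch.randn(8, 16, device="cuda")
    assert model(batch).shape == (8, 4)


@pytest.mark.skipif(torch.cuda.device_count() < 2, reason="requires >= 2 GPUs")
def test_multi_gpu_forward_pass():
    # DataParallel splits the batch across all visible GPUs.
    model = torch.nn.DataParallel(torch.nn.Linear(16, 4)).cuda()
    batch = torch.randn(8, 16, device="cuda")
    assert model(batch).shape == (8, 4)
```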
Hey,
Sounds good, thanks! Just waiting for @Geeks-Sid to respond and then we can start.
Looks like the backwards compatibility issue does not affect us; however, it would be good to run the tests on GPUs. We are good to go, but is there any issue with staying at the current version?
Agreed - I am in discussion with a couple of CI providers to give us some extremely limited free GPU compute. Let's see how it goes.
Nothing specific. Moving to the latest stable release just ensures that we aren't too far behind on the latest bug fixes from PyTorch. And since we will be making a jump with the new API branch anyway, I figured it might make sense to go to the latest version.
Dears, regarding the torch version: as of version 2.2, torch has a built-in flash attention mechanism; see https://pytorch.org/blog/pytorch2-2/. @sarthakpati mentioned that in the future we may integrate flash attention to speed up some models that employ attention. This would also be useful for the synthesis module, where some diffusion models use it too. So, considering version updates, we may look directly at 2.2, as that solves both the version update and flash attention.
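To illustrate the entry point in question: PyTorch exposes the fused attention kernel through `torch.nn.functional.scaled_dot_product_attention`, which dispatches to a FlashAttention backend when the device and dtype support it. A minimal sketch (the shapes and fp16/CUDA setup are arbitrary assumptions for the example):

```python
# Sketch: torch's fused scaled-dot-product attention. On supported GPUs and
# dtypes it dispatches to a FlashAttention kernel automatically; otherwise
# it falls back to the memory-efficient or math implementation.
import torch
import torch.nn.functional as F

batch, heads, seq_len, head_dim = 2, 8, 1024, 64
q = torch.randn(batch, heads, seq_len, head_dim, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)

# One call replaces a manual softmax(Q K^T / sqrt(d)) V implementation.
out = F.scaled_dot_product_attention(q, k, v)
```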
Since #845 also involves a torch version update, I think it might be best to let it get merged and tagged before working on this update.
So, if there is no further issue with this, I am going to assign this to @scap3yvt to start work.
Is your feature request related to a problem? Please describe.
Now that PyTorch 2.3.0 has been out for a while, does it make sense to make the switch? There are a few backward-incompatible changes [ref] that potentially relate to the work being done by @Geeks-Sid, so I will definitely wait for his comments.
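(As a hedged sketch of how such an update could be guarded at runtime, independent of the packaging pin: the exact bounds `>=2.3.0,<2.4.0` below are an assumption for illustration, not the project's actual constraint.)

```python
# Sketch: fail fast with a clear message if the installed torch version is
# outside the range the code was validated against. Bounds are hypothetical.
import torch
from packaging.version import Version

_MIN, _MAX = Version("2.3.0"), Version("2.4.0")
_installed = Version(torch.__version__.split("+")[0])  # drop local tag, e.g. "+cu121"
if not (_MIN <= _installed < _MAX):
    raise ImportError(
        f"torch {_installed} detected; this code was validated against "
        f">= {_MIN} and < {_MAX}"
    )
```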
Describe the solution you'd like
N.A.
Describe alternatives you've considered
N.A.
Additional context
Comments/suggestions, @VukW, @szmazurek?