I'm really happy to be using DIGITS on a multi-GPU system. It reduces training time by almost 20x.
However, it still takes almost a full day to train a moderately large network.
Do you have any plans to release a distributed version of DIGITS?
For a sufficiently large dataset (such as ImageNet or the Places DB), data parallelism would be a suitable solution for DIGITS.
I hope it can be released soon.
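To clarify the request: by data parallelism I mean each worker trains on its own shard of the dataset and the gradients are averaged (all-reduced) before every replica applies the same update. Below is a minimal, hypothetical sketch of that idea in plain NumPy on a toy least-squares problem; it is only an illustration of the scheme, not DIGITS or Caffe code.

```python
import numpy as np

def local_gradient(w, X, y):
    # Gradient of mean squared error 0.5*||Xw - y||^2 / n on one data shard.
    n = len(y)
    return X.T @ (X @ w - y) / n

def data_parallel_step(w, X, y, n_workers=4, lr=0.1):
    # Data parallelism: split the samples across simulated workers.
    shards = zip(np.array_split(X, n_workers), np.array_split(y, n_workers))
    grads = [local_gradient(w, Xs, ys) for Xs, ys in shards]
    # "All-reduce": average the per-shard gradients, then every replica
    # would apply this same update, keeping the weights in sync.
    return w - lr * np.mean(grads, axis=0)

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w

w = np.zeros(3)
for _ in range(200):
    w = data_parallel_step(w, X, y)
```

With equal shard sizes, the averaged gradient equals the full-batch gradient, so this converges exactly as single-worker SGD would; a real distributed implementation would place each shard on a separate node and exchange gradients over the network.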