
Any plan for distributed DIGITS? #412

Closed
dccho opened this issue Nov 11, 2015 · 1 comment

Comments

dccho commented Nov 11, 2015

I'm really happy using DIGITS on a multi-GPU system; it reduces training time by almost 20×.
However, training a somewhat large network still takes almost a day.
Do you have any plan to release a distributed version of DIGITS?
If a dataset is big enough (such as ImageNet or the Places DB), data parallelism is a suitable solution for DIGITS.
I hope you can release it as soon as possible.
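
For context, here is a minimal NumPy sketch of the synchronous data-parallel step being asked for: each worker computes a gradient on its own shard of the mini-batch, the per-worker gradients are averaged (an all-reduce), and one shared set of weights is updated. Everything here (`loss_grad`, `data_parallel_step`, the linear model) is illustrative only and not part of DIGITS or Caffe:

```python
import numpy as np

def loss_grad(w, x, y):
    # Gradient of mean squared error for a linear model x @ w ≈ y.
    return x.T @ (x @ w - y) / len(y)

def data_parallel_step(w, batch_x, batch_y, n_workers, lr=0.05):
    # Shard the batch across workers (serial here; each shard would
    # live on its own GPU or node in a real system).
    xs = np.array_split(batch_x, n_workers)
    ys = np.array_split(batch_y, n_workers)
    grads = [loss_grad(w, x, y) for x, y in zip(xs, ys)]
    # "All-reduce": average the per-worker gradients, apply one update.
    return w - lr * np.mean(grads, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])
    x = rng.normal(size=(256, 2))
    y = x @ true_w
    w = np.zeros(2)
    for _ in range(500):
        w = data_parallel_step(w, x, y, n_workers=4)
    print(w)  # converges toward [2, -1]
```

Because averaging gradients over equal-sized shards reproduces the full-batch gradient, adding workers scales throughput without changing the update, which is why this scheme suits large datasets like ImageNet.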

lukeyeager (Member) commented

Do you want to run one training job across multiple nodes? See BVLC/caffe#3252.

Do you want to schedule single-node training jobs on multiple nodes? See #108.

2 participants