Running Deep Occlusion Fully on CPU #25
Hello,
Yes, it is normal. The inference part runs only on the CPU, but the other parts should be run on the GPU.
Cheers
@emmanuec Does the system you are working on have an Nvidia GPU? Mine does not, and I am keen to run the model. Could you help me in any case? Thank you!
@nagasanthoshp My CPU is an Intel i7 and my GPU is an Nvidia card. You will usually need an Nvidia GPU to run the model in GPU mode. In CPU mode, any CPU should work.
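For context on switching between the two modes: assuming the model runs on Theano (which this repo's setup suggests), the compute device is selected through the `THEANO_FLAGS` environment variable. This is a hedged sketch of that configuration, not a repo-specific recipe; `device=cuda` applies to the newer libgpuarray backend, while older Theano versions used `device=gpu`:

```shell
# CPU-only fallback: works on any machine, but conv nets run much slower.
export THEANO_FLAGS="device=cpu,floatX=float32"

# With an Nvidia GPU and CUDA installed, use one of these instead
# (depending on the Theano version):
#   export THEANO_FLAGS="device=cuda,floatX=float32"   # newer backend
#   export THEANO_FLAGS="device=gpu,floatX=float32"    # older backend

echo "$THEANO_FLAGS"
```

Theano prints the device it mapped at import time, so a quick `python -c "import theano"` after setting the flag confirms which mode is active.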
@pierrebaque Hello, I have the same question. I ran RunParts.ipynb successfully, but it takes about 5 minutes per image. I have checked that both the CPU and GPU are in use, yet nvidia-smi reports a GPU-Util between 0% and 1%. Can you confirm whether this is normal? My CPU is an Intel i7 and my GPU is a Titan X. Thank you.
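To check whether the GPU is actually being exercised, the GPU-Util figure mentioned above can be polled programmatically instead of watching nvidia-smi by hand. A minimal sketch (the helper name `gpu_utilization` is ours; it returns an empty list on machines without an Nvidia driver):

```python
import shutil
import subprocess

def gpu_utilization():
    """Return a list of per-GPU utilization percentages, or [] if
    nvidia-smi is not available (e.g. no Nvidia driver installed)."""
    if shutil.which("nvidia-smi") is None:
        return []
    result = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True,
    )
    if result.returncode != 0:
        return []
    # One line per GPU, each a bare integer percentage.
    return [int(line) for line in result.stdout.split()]

if __name__ == "__main__":
    util = gpu_utilization()
    if not util:
        print("No Nvidia GPU visible; the model will fall back to CPU.")
    else:
        print("GPU-Util per device:", util)
```

A utilization that stays pinned near 0% while the run is CPU-bound (as reported above) is consistent with the GPU sitting idle and the heavy work happening on the CPU.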
@emmanuec Can RunUnaries.ipynb and RunPom.ipynb run successfully on CPU? |
Hello,
Thank you for your work!
I have a quick question. I think the answer is yes, but I wanted to confirm in case I set something up incorrectly:
Is it normal for the Deep Occlusion model to run really slow when it is used on CPU only?
Thank you!