
Can a model trained on the CPU be used for GPU inference? In my tests the two take about the same time #11

Open
BuLuoPiaoYu opened this issue Jun 1, 2021 · 5 comments

Comments

@BuLuoPiaoYu

No description provided.

@BuLuoPiaoYu
Author

[screenshot: GPU inference timing]

[screenshot: CPU inference timing]

@LiKangyuLKY
Owner

> [quoted screenshots: GPU / CPU inference timings]
It doesn't matter what hardware the model was trained on; what matters is whether the dll was compiled with GPU support or with CPU support only. In the former case you can run inference on either CPU or GPU; in the latter, only CPU inference is supported. Also check how many MB the model is. Normally, GPU inference should be quite a bit faster than CPU.
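A point worth checking when the two timings above come out nearly equal: a fair CPU-vs-GPU comparison should discard the first call (which can include one-off initialization cost) and average over repeats. A minimal sketch in Python, where `predict()` is a hypothetical stand-in for the actual dll inference call:

```python
import time

def predict(image):
    # Hypothetical stand-in for the real dll inference call.
    return [0.0] * 10

def time_inference(predict_fn, image, repeats=20):
    """Average per-call inference time in seconds, excluding warm-up."""
    predict_fn(image)  # warm-up: the first call may include one-off init cost
    start = time.perf_counter()
    for _ in range(repeats):
        predict_fn(image)
    return (time.perf_counter() - start) / repeats

avg = time_inference(predict, image=None, repeats=5)
print(f"avg inference time: {avg * 1000:.3f} ms")
```

If the GPU average is still no better than the CPU average after warm-up, that usually points to the inference library itself running on CPU.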

@BuLuoPiaoYu
Author

BuLuoPiaoYu commented Jun 1, 2021

The dll is the one you shared via Baidu Cloud; I didn't compile it myself.

Training the model:
[screenshot: training settings]

Exporting the model:
[screenshot: export settings]

@LiKangyuLKY
Owner

> The dll is the one you shared via Baidu Cloud; I didn't compile it myself.
>
> [quoted screenshot: model]

In that case both CPU and GPU should work. I'd recommend using the GPU for both training and inference; set the use GPU parameter to True.
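For reference, a hedged sketch of how a "use GPU" switch typically maps onto the Paddle Inference Python API (`Config.enable_use_gpu` / `Config.disable_gpu` are the standard toggles; the model/params file names below are placeholders, and the C# dll in this repo may expose the switch differently):

```python
def build_config(model_file, params_file, use_gpu=True):
    """Build a Paddle Inference Config with the GPU switch applied."""
    # Requires paddlepaddle (CPU build) or paddlepaddle-gpu (GPU build);
    # a CPU-only build is the usual reason enabling GPU has no effect.
    from paddle.inference import Config
    config = Config(model_file, params_file)
    if use_gpu:
        # 100 MB initial GPU memory pool on device 0
        config.enable_use_gpu(100, 0)
    else:
        config.disable_gpu()
    return config
```

With such a config, a predictor built from it runs on the GPU only if the installed inference library was itself built with GPU support.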

@BuLuoPiaoYu
Author

> The dll is the one you shared via Baidu Cloud; I didn't compile it myself.
>
> [quoted screenshot: model]
>
> In that case both CPU and GPU should work. I'd recommend using the GPU for both training and inference; set the use GPU parameter to True.

I retrained with the GPU, but it doesn't seem to make a difference.
[screenshot: new timing results]
