
need to expose Device Type and Data Type configuration for users in fluid python API #6141

Closed
QiJune opened this issue Dec 1, 2017 · 0 comments

Comments

@QiJune
Member

QiJune commented Dec 1, 2017

We now have ProgramDesc to store the neural network topology. However, one operator may have many kernels, and ProgramDesc only contains operators, which is not enough to decide which kernel to run.
Currently, DataType and DeviceType are the keys used to choose which kernel to run, so we have to expose these two options to users.

  • Low precision is efficient in both training and inference. Usually we would have both fp32 and fp16 in a model, so we have to set a reasonable data type for each operator.
  • In inference, we may target CPU/GPU/FPGA, so the device type has to be set for each operator.
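
As a toy illustration of the dispatch problem (plain Python, not the real fluid kernel registry): the operator type recorded in ProgramDesc does not identify a kernel on its own; the (device, dtype) pair is also needed.

```python
# Toy sketch of kernel dispatch -- one operator name maps to several
# kernels, keyed by (device, dtype). Not the actual fluid registry.
KERNELS = {
    ("mul", "GPU", "float16"): lambda x, y: "run fp16 GEMM on GPU",
    ("mul", "GPU", "float32"): lambda x, y: "run fp32 GEMM on GPU",
    ("mul", "CPU", "float32"): lambda x, y: "run fp32 GEMM on CPU",
}

def run_op(op_type, device, dtype, *inputs):
    # ProgramDesc gives us op_type; device and dtype must come from
    # somewhere else -- which is what this issue proposes exposing to users.
    kernel = KERNELS.get((op_type, device, dtype))
    if kernel is None:
        raise KeyError("no kernel for %s on %s with %s" % (op_type, device, dtype))
    return kernel(*inputs)

print(run_op("mul", "GPU", "float16", None, None))
```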

I am not sure whether our framework can always choose both the data type and the device type reasonably on its own, but exposing them to users gives flexibility for both research and online/mobile inference.
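
One possible shape for the user-facing configuration, written as pseudocode only: the `dtype` keyword on `fc` and the `fluid.device_scope` context manager below are hypothetical stand-ins to show the intent, not existing fluid APIs.

```python
# Pseudocode sketch of the requested configuration surface.
# `dtype=` on fc and `fluid.device_scope(...)` are hypothetical.
import paddle.fluid as fluid

image = fluid.layers.data(name="image", shape=[784], dtype="float32")

# Per-operator data type: ask for an fp16 kernel for this layer.
hidden = fluid.layers.fc(input=image, size=256, act="relu",
                         dtype="float16")        # hypothetical kwarg

# Per-operator device type: pin the final layer to CPU (or FPGA) kernels.
with fluid.device_scope("CPU"):                  # hypothetical context manager
    prediction = fluid.layers.fc(input=hidden, size=10, act="softmax")
```

Either form (per-layer arguments or a scoped context) would end up recorded as operator attributes so that kernel selection can read them at runtime.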
