Fail to quantize custom symbols exported from HybridBlock #11794
Comments
Could you share the symbol you want to quantize?
@reminisce I'm not sure whether you want this, and I printed
I need the json file/str which describes the network structure to debug. Can you provide that?
@reminisce Sure, it can be found in this repo
It contains only the quantized symbol. Do you have the original symbol and param files?
@reminisce In fact that quantized symbol file is for another issue, sorry for the trouble. I have uploaded the related file to Baidu Drive (the access key is
@reminisce feel free to remove the Bug label and apply the appropriate label after your analysis
@CodePlay2016 Sorry for the late reply. Can you just attach the .json file here? Baidu Drive is very slow and I have difficulty downloading a 266 MB file from there.
@reminisce So I attached the symbol files before and after quantization here. Due to the upload size limit, I couldn't upload the .params file.
The quantized network has duplicate dequantize operators. It is a bug in the quantization flow. @xinyu-intel found similar issues before and has already pinpointed the source of the error.
@reminisce @CodePlay2016 a fix PR has been opened and may be helpful for this issue. You can give it a try, and if you still see errors please let me know. Thanks!
@CodePlay2016 do you still see this issue? |
@reminisce @xinyu-intel I still see the same bug where the quantized network has duplicate dequantize operators when I use custom symbols. Attached are the network symbols before and after quantization of a tiny example where I use the custom op only once in the network.
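The duplicated output described above can be checked directly from the exported symbol file, without binding a module. Below is a minimal sketch that inspects the standard MXNet `*-symbol.json` layout (a `nodes` list plus `heads` entries that index into it); the function name and the tiny synthetic graph are illustrative, not from the issue:

```python
import json
from collections import Counter

def find_duplicate_outputs(symbol_json):
    """Return output node names that appear more than once in `heads`.

    `symbol_json` is the dict loaded from an MXNet *-symbol.json file:
    `nodes` is a list of {"op": ..., "name": ...} entries, and each
    entry in `heads` references a node by index in its first element.
    """
    graph = json.loads(symbol_json) if isinstance(symbol_json, str) else symbol_json
    head_names = [graph["nodes"][h[0]]["name"] for h in graph["heads"]]
    return [name for name, n in Counter(head_names).items() if n > 1]

# Tiny synthetic graph with the same dequantize node listed twice as an
# output, mimicking the duplicate-output symptom reported in this issue.
example = {
    "nodes": [
        {"op": "null", "name": "data"},
        {"op": "Convolution", "name": "conv0"},
        {"op": "_contrib_dequantize", "name": "conv0_dequantize"},
    ],
    "heads": [[2, 0, 0], [2, 0, 0]],
}
print(find_duplicate_outputs(example))  # -> ['conv0_dequantize']
```

A clean quantized graph should print an empty list here; a graph hit by this bug reports the repeated dequantize output.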
Description
When I quantize my custom symbol exported from a HybridBlock using the quantization tool, there is always a duplicated output node, which leads to an error when I bind the module.
Environment info (Required)
Error Message:
Minimum reproducible example
Steps to reproduce
I have tried replacing the custom blocks with the original MXNet Gluon blocks (gluon.nn.LeakyReLU instead of the custom PReLU), which solves the problem, but I need that custom layer, so this is not a feasible solution for me.
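For context on why the swap above is not equivalent: LeakyReLU uses a fixed negative slope chosen as a hyperparameter, while PReLU learns its slope during training, so a trained PReLU generally cannot be replaced by LeakyReLU without changing the network's function. A plain-Python sketch of the two activations (the alpha values are illustrative only):

```python
def leaky_relu(x, alpha=0.01):
    """LeakyReLU: fixed negative slope, set once as a hyperparameter."""
    return x if x > 0 else alpha * x

def prelu(x, alpha):
    """PReLU: identical form, but alpha is a *learned* parameter."""
    return x if x > 0 else alpha * x

# With a learned alpha of 0.3, the two activations diverge on negative inputs:
print(leaky_relu(-2.0))        # -0.02
print(prelu(-2.0, alpha=0.3))  # -0.6
```

This is why swapping in `gluon.nn.LeakyReLU` sidesteps the quantization bug only at the cost of discarding the learned PReLU parameters.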