
[BUG][TUTORIAL] Tutorial for quantization needs update #5145

Closed
Robeast opened this issue Mar 25, 2020 · 4 comments · Fixed by #5150

Comments

Robeast commented Mar 25, 2020

Dear community,
I've noticed that the stock quantization tutorial fails if the following call is switched from data-aware (local) scales to global scales:

...
mod = quantize(mod, params, data_aware=True)  # fails when changed to data_aware=False
...

ValueError: Unknown calibrate mode global

The fix is straightforward:

-        with relay.quantize.qconfig(calibrate_mode='global', global_scale=8.0):
+        with relay.quantize.qconfig(calibrate_mode='global_scale', global_scale=8.0):
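For reference, a minimal sketch of the tutorial's quantize helper with the corrected mode string; the calibrate_dataset() generator and the kl_divergence settings on the data-aware path are assumed from the tutorial, not verified here:

from tvm import relay

def quantize(mod, params, data_aware):
    if data_aware:
        # Data-aware path: per-layer scales are calibrated from a dataset.
        # calibrate_dataset() is assumed to yield {'data': batch} dicts, as in the tutorial.
        with relay.quantize.qconfig(calibrate_mode='kl_divergence', weight_scale='max'):
            mod = relay.quantize.quantize(mod, params, dataset=calibrate_dataset())
    else:
        # Global path: the mode string must be 'global_scale', not 'global'.
        with relay.quantize.qconfig(calibrate_mode='global_scale', global_scale=8.0):
            mod = relay.quantize.quantize(mod, params)
    return mod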

I would like to kindly ask @vinx13 to update the tutorial. Thank you very much in advance & best regards!

tqchen (Member) commented Mar 25, 2020

feel free to send a PR to fix this

Robeast (Author) commented Mar 25, 2020

@tqchen I'm extremely ashamed to say this, but due to company policy I'm not allowed to contribute to anything other than company code bases. Especially shameful since it's only 1 string change... :( I'm really sorry.

tqchen (Member) commented Mar 25, 2020

no problem, thanks @Robeast for reporting the problem :)

Robeast (Author) commented Mar 25, 2020

Thank you for your understanding... and thank you very much to @vinx13 for fixing it!
