🐛 Bug
To Reproduce
Steps to reproduce the behavior (always include the command you ran):
Code sample
Expected behavior
Environment
How you installed fairseq (pip, source):
Additional context
I've been trying to add the I-BERT quantization modules to DistilBERT and ran into this issue.
I-BERT/fairseq/quantization/utils/quant_modules.py, line 658 (commit 45cb6da)
At this line, scaling_factor is a plain Python float and is returned as is. I believe it should be converted to a tensor on the appropriate device before returning, something like:
scaling_factor = torch.tensor([1 / 2 ** self.output_bit], device=exp_int.device)
Please let me know your thoughts on this. Thanks!
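For concreteness, here is a minimal sketch of what the proposed change would look like inside a forward pass. The class name, signature, and return convention below are illustrative assumptions, not the actual quant_modules.py code:

```python
import torch
import torch.nn as nn


class IntSoftmaxSketch(nn.Module):
    """Illustrative stand-in, not the real I-BERT IntSoftmax; it only shows
    how the output scaling factor could be built as a tensor."""

    def __init__(self, output_bit=8):
        super().__init__()
        self.output_bit = output_bit

    def forward(self, exp_int):
        # Reported behavior: a plain Python float is returned, e.g.
        #   scaling_factor = 1 / 2 ** self.output_bit

        # Suggested change: create the scaling factor as a tensor on the same
        # device as the integer activations, so callers that expect a tensor
        # (e.g. when wiring these modules into DistilBERT) do not break.
        scaling_factor = torch.tensor(
            [1 / 2 ** self.output_bit], device=exp_int.device
        )

        # Return the integer output together with its scaling factor
        # (assumed (tensor, scaling_factor) convention for this sketch).
        return exp_int, scaling_factor
```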