🐛 Bug
x_scaling_factor in transformer_sentence_encoder.py:
I think the code here should be modified to pass x_scaling_factor instead of scaling_factor. Or do we use the same scaling factor for all transformer encoder layers?
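A minimal toy sketch (not the fairseq code; the layer and variable names are only for illustration) of why the x_scaling_factor returned by each layer should be threaded into the next layer instead of reusing the initial scaling_factor:

```python
# Toy illustration: each integer-only "layer" returns an activation together
# with a new scaling factor, so the caller must feed the returned
# x_scaling_factor into the next layer rather than reusing the initial one.
import torch

class ToyIntLayer(torch.nn.Module):
    def forward(self, x, scaling_factor):
        # pretend requantization: the output lives on a finer grid
        return x * 0.5, scaling_factor * 0.5

layers = torch.nn.ModuleList(ToyIntLayer() for _ in range(3))
x, scaling_factor = torch.ones(4), torch.tensor(1.0)

# thread the per-layer scaling factor forward
x_scaling_factor = scaling_factor
for layer in layers:
    x, x_scaling_factor = layer(x, x_scaling_factor)

print(x_scaling_factor)  # tensor(0.1250): the per-layer factors compose
# passing the original scaling_factor to every layer would instead leave each
# layer believing its input still carries the embedding-level scale
```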
freeze_model for IntSoftmax:
I think we should implement a fix function in IntSoftmax because we have a QuantAct here. If we don't implement the fix function, we will skip fixing the QuantAct inside IntSoftmax from here.
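A hedged sketch of what such a fix/unfix pair could look like, assuming the inner QuantAct is stored as self.act and that freezing the model works by walking modules and calling their fix() methods (both the attribute name and that convention are assumptions based on the other quantized modules):

```python
import torch.nn as nn

class IntSoftmax(nn.Module):
    # __init__ / int_exp / forward omitted; self.act is assumed to be the
    # internal QuantAct that re-quantizes the softmax output.

    def fix(self):
        # freeze the running activation range of the inner QuantAct so that
        # freezing the model also covers IntSoftmax
        self.act.fix()

    def unfix(self):
        # re-enable range tracking when the model is un-frozen
        self.act.unfix()
```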
Problem
I have recently been trying to implement I-BERT on TVM, and I found that I need to add FixedPointMul and SymmetricQuantFunction operators to TVM. Do you have any existing implementation? Thanks!
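For reference, here is a NumPy sketch of the arithmetic I would expect these two operators to reproduce; the function names, rounding, and clamping details are my reading of the PyTorch autograd functions and may differ from the exact implementation:

```python
# NumPy reference for the arithmetic the two missing operators perform;
# names and exact rounding are assumptions, not a drop-in port.
import numpy as np

def symmetric_quant(x, num_bits, scale):
    """Symmetric linear quantization: round x / scale and clamp to the
    signed num_bits integer range."""
    n = 2 ** (num_bits - 1)
    return np.clip(np.round(x / scale), -n, n - 1)

def fixed_point_mul(q, scale_in, scale_out, num_bits):
    """Re-quantize integer values q from scale_in to scale_out, i.e.
    q_out ~= q * scale_in / scale_out, clamped to the signed range.
    An integer-only TVM kernel would express the ratio as a fixed-point
    multiplier plus a right shift instead of this float division."""
    n = 2 ** (num_bits - 1)
    return np.clip(np.round(q * (scale_in / scale_out)), -n, n - 1)

# quick check: quantize to 8 bits, then rescale onto a coarser grid
x = np.array([0.12, -0.50, 0.98])
q8 = symmetric_quant(x, 8, scale=1.0 / 127)
q8_out = fixed_point_mul(q8, scale_in=1.0 / 127, scale_out=1.0 / 63, num_bits=8)
print(q8, q8_out, q8_out * (1.0 / 63))
```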