Bugs in the code #10

Open
hsiehjackson opened this issue Jul 23, 2021 · 0 comments
Labels: bug (Something isn't working)

Comments

@hsiehjackson

🐛 Bug

  1. `x_scaling_factor` in `transformer_sentence_encoder.py`:
    I think the code here should change `scaling_factor` to `x_scaling_factor`. Or do we intentionally use the same scaling factor for all transformer encoder layers? (See the first sketch after this list.)
  2. `freeze_model` for `IntSoftmax`:
    I think we should implement a `fix` function in `IntSoftmax` because it contains a `QuantAct`. If we don't implement the `fix` function, `freeze_model` will skip fixing the `QuantAct` inside `IntSoftmax` here. (See the second sketch after this list.)
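On point 1, here is a minimal runnable sketch of the scaling-factor propagation I would expect: each layer's output scaling factor is fed into the next layer, instead of reusing the value computed before the loop. `ToyIntLayer` and `ToyEncoder` are illustrative stand-ins, not the repository's actual code:

```python
import torch
import torch.nn as nn

class ToyIntLayer(nn.Module):
    """Stand-in for a quantized encoder layer: consumes the input's
    scaling factor and returns its output with a new scaling factor."""
    def forward(self, x, x_scaling_factor):
        out_scaling_factor = x_scaling_factor * 0.5  # placeholder rescale
        return x * 0.5, out_scaling_factor

class ToyEncoder(nn.Module):
    def __init__(self, num_layers=3):
        super().__init__()
        self.layers = nn.ModuleList(ToyIntLayer() for _ in range(num_layers))

    def forward(self, x, scaling_factor):
        x_scaling_factor = scaling_factor
        for layer in self.layers:
            # The point of the fix: pass the previous layer's output
            # scaling factor (x_scaling_factor) forward, not the stale
            # `scaling_factor` computed before the loop.
            x, x_scaling_factor = layer(x, x_scaling_factor)
        return x, x_scaling_factor

enc = ToyEncoder()
y, s = enc(torch.ones(2, 4), torch.tensor(1.0))
print(s)  # 0.125: each layer's factor propagates, not the initial 1.0
```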
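On point 2, a sketch of the missing `fix`/`unfix` hooks, assuming `IntSoftmax` holds a `QuantAct` internally. The `QuantAct` below is a stub standing in for the repository's quantizer, and the attribute name `self.act` is illustrative:

```python
import torch.nn as nn

class QuantAct(nn.Module):
    """Stub standing in for the repository's QuantAct quantizer."""
    def __init__(self):
        super().__init__()
        self.fixed = False
    def fix(self):
        self.fixed = True
    def unfix(self):
        self.fixed = False

class IntSoftmax(nn.Module):
    """Sketch: expose fix()/unfix() so that a freeze_model pass, which
    freezes modules through their fix() method, also reaches the
    internal QuantAct instead of skipping it."""
    def __init__(self):
        super().__init__()
        self.act = QuantAct()

    def fix(self):
        self.act.fix()    # the missing hook: forward the freeze to QuantAct

    def unfix(self):
        self.act.unfix()

sm = IntSoftmax()
sm.fix()
print(sm.act.fixed)  # True: the internal quantizer is now frozen
```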

Problem

I have recently been trying to implement I-BERT on TVM, and I found that I need to add `FixedPointMul` and `SymmetricQuantFunction` operators to TVM. Do you have any existing implementation? Thanks!
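For reference, here is a hedged NumPy sketch of the arithmetic these two operators typically perform in an integer-only pipeline (round-to-nearest symmetric quantization, and rescaling via a dyadic multiplier `m / 2**shift`, i.e. an integer multiply followed by a right shift). The exact signatures, rounding modes, and bit widths in I-BERT may differ; the 30-bit shift is an illustrative choice:

```python
import numpy as np

def symmetric_quant(x, scale, num_bits=8):
    """Reference semantics of a symmetric quantizer: divide by the
    scale, round to nearest, clamp to the signed integer range."""
    n = 2 ** (num_bits - 1) - 1
    return np.clip(np.round(x / scale), -n - 1, n).astype(np.int64)

def fixed_point_mul(q, scale_in, scale_out, num_bits=8, shift=30):
    """Reference semantics of a fixed-point multiply: rescale an integer
    tensor from scale_in to scale_out with a dyadic multiplier, so no
    floating-point math is needed at inference time."""
    m = int(round(scale_in / scale_out * 2 ** shift))
    n = 2 ** (num_bits - 1) - 1
    out = (q.astype(np.int64) * m) >> shift
    return np.clip(out, -n - 1, n)

x = np.array([0.5, -1.2, 2.0])
q = symmetric_quant(x, scale=0.02)     # int8-range representation of x
print(q)                               # [ 25 -60 100]
print(fixed_point_mul(q, 0.02, 0.04))  # rescaled to scale 0.04: [ 12 -30  50]
```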
