[CIF] Deal With Replicate Codes #1749
Conversation
MrSupW commented on Mar 14, 2023
- reuse the MHA class of wenet/transformer/attention.py
- reuse the DecoderLayer class of wenet/transformer/decoder_layer.py
- reuse the make_pad_mask method of wenet/utils/mask.py
- reuse the PositionalEncoding class of wenet/transformer/embeddings.py
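To illustrate what the reused make_pad_mask helper provides, here is a minimal pure-Python sketch of its semantics (an assumption for illustration only; the real wenet helper operates on torch tensors and its exact signature may differ):

```python
from typing import List

def make_pad_mask(lengths: List[int]) -> List[List[bool]]:
    """Sketch of make_pad_mask semantics: True marks padded positions.

    Row i is True at every position >= lengths[i], i.e. positions that
    hold padding rather than real frames/tokens.
    """
    max_len = max(lengths)
    return [[pos >= n for pos in range(max_len)] for n in lengths]
```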
MultiHeadedAttentionSANMDecoder, MultiHeadedAttentionCrossAtt
from wenet.cif.decoder_layer import DecoderLayer, DecoderLayerSANM
from wenet.cif.embedding import PositionalEncoding
from wenet.utils.mask import make_pad_mask
Please sort the imports alphabetically.
return mask
def sequence_mask(lengths, maxlen: Optional[int] = None,
sequence_mask has the same function as subsequent_mask in utils/mask.py; we can reuse it.
I added some comments; we can refine this in the future.
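For context, a pure-Python sketch of what a sequence_mask with the signature above typically computes: a boolean matrix that is True at valid positions, i.e. the logical inverse of a pad mask. This is inferred from the signature, not taken from the PR's actual implementation:

```python
from typing import List, Optional

def sequence_mask(lengths: List[int],
                  maxlen: Optional[int] = None) -> List[List[bool]]:
    # True at valid positions (pos < lengths[i]); padding positions are False.
    # Inverting this matrix yields the same information as a pad mask,
    # which is why the reviewer suggests reusing the existing helper.
    if maxlen is None:
        maxlen = max(lengths)
    return [[pos < n for pos in range(maxlen)] for n in lengths]
```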