
[CIF] Deal With Replicate Codes #1749

Merged · 7 commits into wenet-e2e:main · Mar 15, 2023

Conversation

@MrSupW (Collaborator) commented Mar 14, 2023

  1. reuse the MHA class of wenet/transformer/attention.py
  2. reuse the DecoderLayer class of wenet/transformer/decoder_layer.py
  3. reuse the make_pad_mask method of wenet/utils/mask.py
  4. reuse the PositionalEncoding class of wenet/transformer/embedding.py
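The reuse argument for point 3 rests on what make_pad_mask computes. wenet's actual make_pad_mask operates on torch tensors and returns a boolean tensor; the following is a minimal pure-Python sketch of the same semantics (True marks a padded position):

```python
from typing import List, Optional

def make_pad_mask(lengths: List[int],
                  max_len: Optional[int] = None) -> List[List[bool]]:
    """Sketch of wenet.utils.mask.make_pad_mask semantics:
    True marks a padded position, False a valid one."""
    if max_len is None:
        max_len = max(lengths)
    return [[t >= length for t in range(max_len)] for length in lengths]

# A batch of two sequences with lengths 3 and 1, padded to length 3:
print(make_pad_mask([3, 1]))
# → [[False, False, False], [False, True, True]]
```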

MultiHeadedAttentionSANMDecoder, MultiHeadedAttentionCrossAtt
from wenet.cif.decoder_layer import DecoderLayer, DecoderLayerSANM
from wenet.cif.embedding import PositionalEncoding
from wenet.utils.mask import make_pad_mask
Collaborator:

please sort the imports alphabetically
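Alphabetical import order can be checked mechanically. A small sketch using the three fully-visible import lines from the diff (the truncated first line is left out rather than guessed at):

```python
# The three fully-visible import lines from the diff, as strings:
imports = [
    "from wenet.cif.decoder_layer import DecoderLayer, DecoderLayerSANM",
    "from wenet.cif.embedding import PositionalEncoding",
    "from wenet.utils.mask import make_pad_mask",
]

# Alphabetical order by module path means the list equals its sorted copy.
print(imports == sorted(imports))  # this trio happens to be in order already
```

In practice a tool such as isort automates this check across a whole file.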


return mask


def sequence_mask(lengths, maxlen: Optional[int] = None,
Collaborator:

sequence_mask has the same function as subsequent_mask in utils/mask.py; we can reuse it.
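The duplication the reviewer points at is easy to see from the mask semantics: a sequence_mask built from lengths is, position for position, the complement of a pad mask, so wenet's existing masking helpers in utils/mask.py already cover it. A pure-Python sketch of that equivalence (the real functions operate on torch tensors; True is assumed to mark valid positions in sequence_mask):

```python
from typing import List, Optional

def sequence_mask(lengths: List[int],
                  maxlen: Optional[int] = None) -> List[List[bool]]:
    # True at valid positions (t < length)
    maxlen = maxlen or max(lengths)
    return [[t < n for t in range(maxlen)] for n in lengths]

def make_pad_mask(lengths: List[int],
                  maxlen: Optional[int] = None) -> List[List[bool]]:
    # True at padded positions (t >= length): the exact complement
    maxlen = maxlen or max(lengths)
    return [[t >= n for t in range(maxlen)] for n in lengths]

lengths = [3, 1]
# One mask is the elementwise negation of the other:
assert sequence_mask(lengths) == [
    [not x for x in row] for row in make_pad_mask(lengths)
]
```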

@robin1001 (Collaborator):

I added some comments; we can refine it in the future.

@robin1001 robin1001 merged commit 0c9e40f into wenet-e2e:main Mar 15, 2023