Change to kwargs (#5475)
Signed-off-by: MaximumEntropy <[email protected]>

MaximumEntropy authored Nov 22, 2022
1 parent 8552c95 commit 4a523ad
Showing 1 changed file with 7 additions and 7 deletions.
14 changes: 7 additions & 7 deletions nemo/collections/nlp/modules/common/megatron/transformer.py
@@ -2185,13 +2185,13 @@ def custom_forward(*inputs):
                 for index in range(start, end):
                     layer = self._get_layer(index)
                     hidden_states = layer(
-                        hidden_states,
-                        attention_mask,
-                        encoder_output,
-                        enc_dec_attn_mask,
-                        rotary_pos_emb,
-                        self_attention_relative_position_bias,
-                        cross_attention_relative_position_bias,
+                        hidden_states=hidden_states,
+                        attention_mask=attention_mask,
+                        encoder_output=encoder_output,
+                        enc_dec_attn_mask=enc_dec_attn_mask,
+                        rotary_pos_emb=rotary_pos_emb,
+                        self_attention_relative_position_bias=self_attention_relative_position_bias,
+                        cross_attention_relative_position_bias=cross_attention_relative_position_bias,
                     )
                 if isinstance(hidden_states, tuple):
                     pass
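Why this change matters: with many optional tensor arguments, a positional call binds each value by position, so inserting or reordering a parameter in the layer's signature silently shifts every later argument. Passing by keyword binds each value by name instead. The sketch below illustrates the idea with a hypothetical, heavily simplified stand-in for the layer signature (the names mirror the diff, but the body is invented for demonstration):

```python
# Hypothetical simplified stand-in for the transformer layer call.
# The parameter names mirror the diff; the body just echoes what it
# received so the bindings can be inspected.
def layer(hidden_states, attention_mask=None, encoder_output=None,
          enc_dec_attn_mask=None, rotary_pos_emb=None,
          self_attention_relative_position_bias=None,
          cross_attention_relative_position_bias=None):
    return {
        "hidden_states": hidden_states,
        "encoder_output": encoder_output,
        "self_attention_relative_position_bias":
            self_attention_relative_position_bias,
    }

# Keyword style (the "after" side of the diff): each argument is bound
# by name, so the call stays correct even if new parameters are later
# inserted between existing ones.
out = layer(
    hidden_states="h",
    encoder_output="enc",
    self_attention_relative_position_bias="bias",
)
```

With positional style, a call like `layer("h", "enc", "bias")` would bind `"enc"` to `attention_mask` rather than `encoder_output`, a bug that raises no error.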
