Change to kwargs (NVIDIA#5475) (NVIDIA#5477)
Signed-off-by: MaximumEntropy <[email protected]>
Co-authored-by: Sandeep Subramanian <[email protected]>
Signed-off-by: Hainan Xu <[email protected]>
2 people authored and Hainan Xu committed Nov 29, 2022
1 parent 0f0dae1 commit d832b45
Showing 1 changed file with 7 additions and 7 deletions.
nemo/collections/nlp/modules/common/megatron/transformer.py
@@ -2332,13 +2332,13 @@ def custom_forward(*inputs):
     for index in range(start, end):
         layer = self._get_layer(index)
         hidden_states = layer(
-            hidden_states,
-            attention_mask,
-            encoder_output,
-            enc_dec_attn_mask,
-            rotary_pos_emb,
-            self_attention_relative_position_bias,
-            cross_attention_relative_position_bias,
+            hidden_states=hidden_states,
+            attention_mask=attention_mask,
+            encoder_output=encoder_output,
+            enc_dec_attn_mask=enc_dec_attn_mask,
+            rotary_pos_emb=rotary_pos_emb,
+            self_attention_relative_position_bias=self_attention_relative_position_bias,
+            cross_attention_relative_position_bias=cross_attention_relative_position_bias,
         )
         if isinstance(hidden_states, tuple):
             pass
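For context on why this change matters: the layer call takes several optional tensor arguments of the same type, and a positional call binds them strictly by order, so inserting or reordering a parameter in the layer signature silently shifts every later tensor into the wrong slot. Below is a minimal sketch of that failure mode; ToyLayer and its parameter defaults are hypothetical stand-ins, not NeMo's actual transformer layer.

import torch


class ToyLayer(torch.nn.Module):
    # Hypothetical layer mirroring the parameter names in the diff above.
    def forward(
        self,
        hidden_states,
        attention_mask=None,
        encoder_output=None,
        enc_dec_attn_mask=None,
        rotary_pos_emb=None,
        self_attention_relative_position_bias=None,
        cross_attention_relative_position_bias=None,
    ):
        # Real attention logic omitted; the binding of arguments to
        # parameters is the point of the example.
        return hidden_states


layer = ToyLayer()
h = torch.zeros(2, 4)
bias = torch.zeros(1, 4, 4)

# Fragile: six placeholder Nones are needed to reach the bias slot, and
# if the signature ever gains or reorders a parameter, `bias` silently
# binds to the wrong one.
out = layer(h, None, None, None, None, bias, None)

# Robust: keyword arguments keep each tensor bound to the intended
# parameter regardless of signature order.
out = layer(h, self_attention_relative_position_bias=bias)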
