
Commit

Update
pglorio committed Aug 14, 2024
1 parent e6c6278 commit 6281d93
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion src/transformers/models/zamba/modeling_zamba.py
@@ -880,7 +880,7 @@ def slow_forward(self, input_states, cache_params: HybridMambaAttentionDynamicCa
hidden_states = self.act(self.conv1d(hidden_states)[..., :seq_len]) # (b d l)
if not torch.all(attention_mask==1):
hidden_states = hidden_states * attention_mask.unsqueeze(1)

# 3. State Space Model sequence transformation
# 3.a. Selection: [batch, seq_len, self.time_step_rank + self.ssm_state_size * 2]
hidden_states = hidden_states.reshape(
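The hunk above zeroes the convolution output at padded positions: `attention_mask` has shape `(batch, seq_len)`, and `unsqueeze(1)` lets it broadcast over the channel dimension of `hidden_states`, which is `(batch, channels, seq_len)` after `conv1d`. A minimal sketch of that masking step, with hypothetical shapes (batch 2, 4 channels, seq_len 5; the second sample has two padded tail positions):

```python
import torch

# Stand-in for the post-conv1d activations, shape (b, d, l).
hidden_states = torch.ones(2, 4, 5)
# 1 = real token, 0 = padding; shape (b, l).
attention_mask = torch.tensor([[1, 1, 1, 1, 1],
                               [1, 1, 1, 0, 0]])

# Skip the multiply entirely when no sample is padded (the common fast path).
if not torch.all(attention_mask == 1):
    # unsqueeze(1) -> (b, 1, l), broadcast across the d channels.
    hidden_states = hidden_states * attention_mask.unsqueeze(1)

print(hidden_states[1, 0].tolist())  # → [1.0, 1.0, 1.0, 0.0, 0.0]
```

The guard matters because `torch.all(attention_mask == 1)` is cheap relative to a full broadcasted multiply, and for unpadded batches the mask is all ones, so the multiply would be a no-op anyway.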
