Commit

Typo: lenth -> length
tomaarsen committed Dec 13, 2023
1 parent 01d5ae3 commit b875550
Showing 1 changed file with 1 addition and 1 deletion.
sentence_transformers/models/Pooling.py: 1 addition & 1 deletion
@@ -139,7 +139,7 @@ def forward(self, features: Dict[str, Tensor]):
     # attention_mask shape: (bs, seq_len)
     # Get shape [bs] indices of the last token (i.e. the last token for each batch item)
     # argmin gives us the index of the first 0 in the attention mask; We get the last 1 index by subtracting 1
-    # Any sequence where min == 1, we use the entire sequence lenth since argmin = 0
+    # Any sequence where min == 1, we use the entire sequence length since argmin = 0
     values, indices = torch.min(attention_mask, 1, keepdim = False)
     gather_indices = torch.where(values==0, indices, seq_len) - 1 # Shape [bs]
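The comments in the hunk above describe a trick for finding each sequence's last real token: `torch.min` over the attention mask returns the index of the first `0` (the first padding position), and subtracting 1 gives the last `1`; sequences with no padding (min == 1) are mapped to `seq_len` first so the subtraction lands on the final position. A minimal standalone sketch of that trick (the helper name `last_token_indices` is ours, not part of the library):

```python
import torch

def last_token_indices(attention_mask: torch.Tensor) -> torch.Tensor:
    # attention_mask shape: (bs, seq_len); 1 = real token, 0 = padding.
    seq_len = attention_mask.shape[1]
    # torch.min returns the index of the first minimal value, i.e. the first 0.
    values, indices = torch.min(attention_mask, 1, keepdim=False)
    # Where min == 1 there is no padding, so substitute seq_len before
    # subtracting 1, landing on the last position.
    return torch.where(values == 0, indices, seq_len) - 1  # shape: (bs,)

mask = torch.tensor([[1, 1, 1, 0],   # padded: last real token at index 2
                     [1, 1, 1, 1]])  # full:   last real token at index 3
print(last_token_indices(mask).tolist())  # [2, 3]
```

Without the `torch.where` guard, a fully unpadded row would yield `argmin = 0` and an index of `-1`, which is why the patched comment calls out the `min == 1` case.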

