
Commit 8930c26

Merge pull request #16683 from balvisio:ba/fix-doc-multi-head-attention
PiperOrigin-RevId: 455435004
tensorflower-gardener committed Jun 16, 2022
2 parents (418d246 + c380d05), commit 8930c26
Showing 1 changed file with 2 additions and 2 deletions.
keras/layers/attention/multi_head_attention.py: 2 additions & 2 deletions
@@ -499,8 +499,8 @@ def _compute_attention(
         Args:
             query: Projected query `Tensor` of shape `(B, T, N, key_dim)`.
-            key: Projected key `Tensor` of shape `(B, T, N, key_dim)`.
-            value: Projected value `Tensor` of shape `(B, T, N, value_dim)`.
+            key: Projected key `Tensor` of shape `(B, S, N, key_dim)`.
+            value: Projected value `Tensor` of shape `(B, S, N, value_dim)`.
             attention_mask: a boolean mask of shape `(B, T, S)`, that prevents
                 attention to certain positions.
             training: Python boolean indicating whether the layer should behave in
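For context, the corrected docstring matches the layer's shape convention: the query carries the target sequence length T, while key and value carry the source sequence length S, so the projected key/value tensors are (B, S, N, key_dim) and (B, S, N, value_dim). A minimal sketch (not part of this commit; batch size and dimensions chosen arbitrarily) of cross-attention with T != S using the public tf.keras.layers.MultiHeadAttention API:

    import tensorflow as tf

    B, T, S = 4, 8, 16  # batch size, target (query) length, source (key/value) length

    layer = tf.keras.layers.MultiHeadAttention(num_heads=2, key_dim=32)

    query = tf.random.normal((B, T, 64))  # (B, T, dim)
    value = tf.random.normal((B, S, 64))  # (B, S, dim)
    key = tf.random.normal((B, S, 64))    # (B, S, dim), same length as value

    # After the internal projections, key/value have shapes (B, S, N, key_dim) and
    # (B, S, N, value_dim), as the corrected docstring states; the attention scores
    # relating target to source positions have shape (B, N, T, S).
    output, scores = layer(query, value, key, return_attention_scores=True)
    print(output.shape)  # (4, 8, 64)   -> (B, T, dim)
    print(scores.shape)  # (4, 2, 8, 16) -> (B, num_heads, T, S)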
