Fix doc of attention_hidden_size
pglorio authored Sep 27, 2024
1 parent 9c10afe commit 1880455
Showing 1 changed file with 1 addition and 1 deletion.
src/transformers/models/zamba/configuration_zamba.py: 1 addition & 1 deletion
@@ -44,7 +44,7 @@ class ZambaConfig(PretrainedConfig):
             model has a output word embedding layer.
         hidden_size (`int`, *optional*, defaults to 3712):
             Dimension of the hidden representations.
-        attention_hidden_size (`int`, *optional*, defaults to `None`):
+        attention_hidden_size (`int`, *optional*):
             Dimension of the hidden representations of the inputs to the Attention layer.
         intermediate_size (`int`, *optional*, defaults to 14848):
             Dimension of the MLP representations.
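
For context on why "defaults to `None`" was dropped: in the transformers docstring convention, *optional* with no stated default already implies `None`, and the Zamba config resolves a concrete value when the argument is left unset. Below is a minimal sketch of that behavior; the assumption that the fallback is derived from `hidden_size` is mine and is not shown in this diff.

```python
# A sketch of the documented behavior, assuming ZambaConfig is importable
# from a transformers release that includes the Zamba model.
from transformers import ZambaConfig

# When attention_hidden_size is left unset (None), the config is assumed
# to derive a concrete value from hidden_size internally, so the attribute
# is not None on a constructed config.
default_config = ZambaConfig()
print(default_config.attention_hidden_size)  # derived value

# An explicit int bypasses the derivation.
custom_config = ZambaConfig(attention_hidden_size=1024)
print(custom_config.attention_hidden_size)  # 1024
```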
