The `T5Config` has the parameter `n_positions` set to 512, with `max_position_embeddings` referring to `n_positions`. However, neither `max_position_embeddings` nor `n_positions` is used in `T5Model`, and T5 is not limited to `max_position_embeddings`. E.g.:
```python
import torch
from transformers import T5Model

model = T5Model.from_pretrained("t5-small")
model.config.max_position_embeddings  # shows 512

input_ids = torch.tensor([600 * [0]])  # input of length > 512
model(input_ids, decoder_input_ids=input_ids)  # works fine
```
I think we should delete the parameter.
@thomwolf - do you remember why we added `max_position_embeddings` and `n_positions` to T5? The model does not seem to use these params, and it also should not be limited to 512 due to its relative position embeddings.
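To illustrate why relative position embeddings impose no length limit: T5 buckets the *relative distance* between query and key positions into a fixed number of learned-bias buckets, so arbitrarily large distances still land in a valid bucket. Below is a simplified sketch of the bidirectional bucketing scheme (function name and exact arithmetic are illustrative, not the library's implementation):

```python
import math

def relative_position_bucket(relative_position, num_buckets=32, max_distance=128):
    # Simplified sketch of T5-style relative position bucketing
    # (bidirectional case). Half the buckets are reserved for
    # positive distances, half for non-positive ones.
    bucket = 0
    num_buckets //= 2
    if relative_position > 0:
        bucket += num_buckets
    n = abs(relative_position)
    max_exact = num_buckets // 2
    if n < max_exact:
        # small distances get one bucket each
        bucket += n
    else:
        # larger distances share logarithmically spaced buckets,
        # capped at the last bucket
        val = max_exact + int(
            math.log(n / max_exact)
            / math.log(max_distance / max_exact)
            * (num_buckets - max_exact)
        )
        bucket += min(val, num_buckets - 1)
    return bucket

# Distances far beyond 512 still map into the fixed bucket range:
print(relative_position_bucket(600))    # → 31
print(relative_position_bucket(-600))   # → 15
```

Because every distance, however large, maps into this fixed range, nothing in the attention bias depends on an absolute position table of size `n_positions`.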