The max-model-input-length argument is not applied when preparing inputs: because padding="max_length" is used, max_length falls back to the value in the model weights' config.json, which leads to a Python kernel crash when that value is too large.
From eland/eland/ml/pytorch/transformers.py:

class _TraceableTextEmbeddingModel(_TransformerTraceableModel):
    def _prepare_inputs(self) -> transformers.BatchEncoding:
        return self._tokenizer(
            "This is an example sentence.",
            padding="max_length",  # may lead to python kernel crash
            return_tensors="pt",
        )
Maybe change it to padding="longest"?
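For reference, a minimal sketch (outside of Eland) of how the two padding strategies behave with a Hugging Face tokenizer; the model name below is only an assumed example:

from transformers import AutoTokenizer

# A minimal sketch (not Eland code) comparing the two padding strategies.
# The model name is only an assumed example.
tokenizer = AutoTokenizer.from_pretrained("sentence-transformers/all-MiniLM-L6-v2")

text = "This is an example sentence."

# padding="max_length" pads up to tokenizer.model_max_length, which comes from the
# model/tokenizer config; an overly large value there produces huge input tensors.
padded_to_max = tokenizer(text, padding="max_length", return_tensors="pt")

# padding="longest" pads only to the longest sequence in the batch, so the traced
# input stays small no matter what model_max_length is.
padded_to_longest = tokenizer(text, padding="longest", return_tensors="pt")

print(padded_to_max["input_ids"].shape)      # e.g. torch.Size([1, 512]) if model_max_length is 512
print(padded_to_longest["input_ids"].shape)  # only the actual token count, e.g. torch.Size([1, 8])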
Which model were you importing when you saw the error, and can you please share the command you used so I can reproduce it? Also, which version of Eland are you using?