Deberta onnx pipeline issue #968
Hi @rcshubhadeep, this looks like a bug in the fill-mask pipeline, will fix! Apart from that, microsoft/deberta-base looks to be quite bad; compare with:

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-base")
model = AutoModelForMaskedLM.from_pretrained("microsoft/deberta-base")
pipe = pipeline("fill-mask", model=model, tokenizer=tokenizer)
res = pipe("I am a [MASK] engineer.")
print(res[0])
# prints {'score': 0.002210365841165185, 'token': 44452, 'token_str': ' Patreon', 'sequence': 'I am a Patreon engineer.'}
```
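For context on where that score comes from: the fill-mask pipeline essentially takes the model's logits at the `[MASK]` position, applies a softmax over the vocabulary, and reports the top-scoring tokens. A simplified, self-contained sketch with a toy vocabulary and made-up logits (an illustration of the idea, not the transformers implementation):

```python
import math

def top_fill_mask_candidates(mask_logits, vocab, k=2):
    """Softmax the logits at the [MASK] position and return the top-k (token, score) pairs."""
    m = max(mask_logits)
    # Subtract the max before exponentiating for numerical stability.
    exps = [math.exp(x - m) for x in mask_logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    ranked = sorted(zip(vocab, probs), key=lambda pair: pair[1], reverse=True)
    return ranked[:k]

# Toy stand-ins for a real model's output row over the vocabulary.
vocab = ["software", "Patreon", "train", "sound"]
logits = [2.0, 3.5, 0.5, 1.0]
print(top_fill_mask_candidates(logits, vocab))
```

A low top score like the 0.002 above simply means the probability mass at the mask position is spread thinly across the vocabulary, which is why the comparison against the plain PyTorch pipeline is informative.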
Hello, thanks so much for this. I can confirm that this issue does not exist for BERT (and DistilBERT, RoBERTa, etc.), and I can also confirm that the issue is there for at least

Do you want me to have a deeper look and try to see what is going on here? If so, please let me know. I will also need an initial guide on how to set things up.
Hi @fxmarty, it is not a bug in Optimum. As you mentioned, it is a bug in the pipeline classes. In fact, for the pipelines the
Here is a mention of this issue: #207 (comment)
The (extremely) non-elegant solution that I am using at the moment looks like the following:
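The workaround code itself was not preserved in this thread. A plausible sketch of the kind of fix being discussed, i.e. dropping tokenizer outputs (such as `token_type_ids`) that the exported ONNX graph does not declare as inputs, might look like this (the helper name and the hard-coded input list below are assumptions for illustration, not the author's actual code):

```python
def filter_to_model_inputs(encoded, model_input_names):
    """Keep only the tokenizer outputs that the ONNX model actually declares.

    `encoded` is the dict the tokenizer returns; in a real setup
    `model_input_names` would come from something like
    [i.name for i in session.get_inputs()] on an onnxruntime session.
    """
    return {name: value for name, value in encoded.items() if name in model_input_names}

# Toy stand-ins: the tokenizer emits token_type_ids, but the exported
# DeBERTa graph only expects input_ids and attention_mask.
encoded = {
    "input_ids": [[1, 2, 3]],
    "attention_mask": [[1, 1, 1]],
    "token_type_ids": [[0, 0, 0]],
}
onnx_input_names = ["input_ids", "attention_mask"]
print(filter_to_model_inputs(encoded, onnx_input_names))
# → {'input_ids': [[1, 2, 3]], 'attention_mask': [[1, 1, 1]]}
```

Feeding the filtered dict to the session directly, instead of going through the pipeline, avoids the unexpected-input error at the cost of reimplementing the pipeline's pre/post-processing by hand.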
Posted by @rcshubhadeep:
Hi,
I am really lost with something related to exporting DeBERTa to ONNX. I have the following code:
This results in the following error:
Whatever I do, I can't get rid of this error. I am a noob with HF; I have posted the issue on both the forum and Stack Overflow, but got no replies, so this is my last hope. What am I doing wrong? I have also read the PR that adds the V2 support, and I still can't figure out what is wrong. Even when I use
return_token_type_ids=False
in my tokenizer call, it does not solve the problem. My transformers version is 4.27.0 and my optimum version is also the latest.

Originally posted by @rcshubhadeep in #555 (comment)