[Web] 1.20.0 breaks SkipSimplifiedLayerNormalization backwards compatibility. Missing Input: model.layers.0.input_layernorm.weight #22704
Comments
It also breaks for the WASM EP (WebGPU still works): https://jsfiddle.net/9v4fa3gw/
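For context, a minimal sketch of how the two execution providers can be compared in onnxruntime-web (the model file name, input name, and shapes are placeholders, not taken from the fiddle; depending on the version, the webgpu EP may need the 'onnxruntime-web/webgpu' bundle instead):

```js
// Minimal sketch: compare the 'webgpu' and 'wasm' execution providers in
// onnxruntime-web. Model file, input name, and shapes are placeholders.
import * as ort from 'onnxruntime-web';

async function tryProvider(ep) {
  try {
    // The reported failure shows up when loading/running an f16 model on the
    // CPU ('wasm') path; the 'webgpu' path is unaffected.
    const session = await ort.InferenceSession.create('model_fp16.onnx', {
      executionProviders: [ep],
    });
    const inputIds = new ort.Tensor('int64', BigInt64Array.from([1n, 2n, 3n]), [1, 3]);
    await session.run({ input_ids: inputIds });
    console.log(`${ep}: ok`);
  } catch (e) {
    console.error(`${ep}: failed -`, e);
  }
}

await tryProvider('webgpu'); // still works on 1.20.0
await tryProvider('wasm');   // hits the missing-input error on 1.20.0
```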
@xenova Thank you for the issue report! I did some investigation and identified that the issue is in the CPU f16 implementation. This issue is not web specific: all language bindings may run into it when using CPU/f16 on any of the 4 operators. We will do the fix ASAP and will publish a dev build of onnxruntime-web once it's done. We will also work on a patch release to include the fix.
@fs-eire Amazing - thanks so much! 🥳 I'll upgrade the build when you're ready 👍 Will this include a dev version of
I see this issue in onnxruntime-DirectML as well; does this fix help with ort-dml?
Will investigate this issue. It looks like the problem is in CPU EP and
Currently the pipeline does not support this, but I can do a manual publish if necessary.
I am not sure if the problem that you saw is exactly caused by this. If it is, the fix should help.
Yes please! 😇 Transformers.js v3.1.0 will include this fix
Here is the new error message I get now:
Did some updates and this is the latest fix -> dist.zip
Great! That fixed it @fs-eire 🥳 Please let me know when you put a dev build out 👍
👀 |
onnxruntime-node 1.20.1 is released and should include the fixes.
I can confirm that fixes it. Thanks! |
Describe the issue
After upgrading to onnxruntime-node 1.20.0, I obtain the following error when trying to run models which were previously exported (and working) with earlier versions of onnx/onnxruntime:
To reproduce
Attempt to run one of the following models:
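For reference, a minimal reproduction sketch with onnxruntime-node; the model path and input name are illustrative placeholders for one of the affected f16 models, not the exact reproduction steps:

```js
// Minimal repro sketch, assuming a local copy of one of the affected f16
// models. The model path and input name are placeholders; a real model will
// typically need additional inputs (attention_mask, etc.).
const ort = require('onnxruntime-node');

async function main() {
  // Uses the default CPU execution provider. On 1.20.0 this path fails for
  // models containing f16 SkipSimplifiedLayerNormalization.
  const session = await ort.InferenceSession.create('./model_fp16.onnx');

  const inputIds = new ort.Tensor('int64', BigInt64Array.from([1n]), [1, 1]);
  const output = await session.run({ input_ids: inputIds });
  console.log(Object.keys(output));
}

main().catch((err) => {
  // On 1.20.0 the error surfaced here matches the issue title:
  // "Missing Input: model.layers.0.input_layernorm.weight".
  console.error(err);
});
```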
Urgency
Blocks upgrading transformers.js to use onnxruntime-node v1.20.0
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
1.20.0
Execution Provider
'wasm'/'cpu' (WebAssembly CPU)