optimizer_test.py failure on libprotobuf ERROR #38
The underlying issue appears the same as microsoft/onnxruntime#5035. Using protobuf-lite is a working workaround for me:
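A minimal sketch of the protobuf-lite workaround, assuming onnxoptimizer's build honors CMake's FindProtobuf variables; the library path is illustrative and varies by distribution:

```shell
# Hypothetical sketch: point CMake's FindProtobuf at the lite runtime
# instead of the full libprotobuf. Variable name is from CMake's
# FindProtobuf module docs; the .so path below is an assumption.
git clone --recursive https://github.com/onnx/optimizer onnxoptimizer
cd onnxoptimizer
cmake -B build -DProtobuf_LIBRARY=/usr/lib/libprotobuf-lite.so
cmake --build build
```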
Other related discussions: onnx/onnx#3030, protocolbuffers/protobuf#1941
Thanks for the suggestion @yan12125, that helped! I'm now getting the below failure
Looks like a protobuf version issue. I can proceed to build the project and install the whl package if I remove the code associated with
Hmm,
I think that is a good way. Let's see what @daquexian thinks.
Sorry for the late reply. @yan12125 @jiafatom @BowenBao Could you please try linking to the protobuf static lib? I found that if onnx and onnxoptimizer are both built from source and linked to the protobuf shared lib, the "File already exists in database: onnx/onnx-ml.proto" error will happen. If protobuf is statically linked instead (i.e. linking libprotobuf.a), the error will not happen.
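A sketch of the static-linking suggestion, assuming onnxoptimizer's CMakeLists goes through CMake's FindProtobuf module (which documents the `Protobuf_USE_STATIC_LIBS` hint); whether this project honors that variable is an assumption here:

```shell
# Ask FindProtobuf to prefer libprotobuf.a over the shared library.
# Protobuf_USE_STATIC_LIBS is documented by CMake's FindProtobuf module.
cmake -B build -DProtobuf_USE_STATIC_LIBS=ON
cmake --build build
```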
Thanks a lot for the hint. I managed to build onnxoptimizer with a static protobuf library, and it indeed works. I ran into some issues during the build, so I'd like to share my steps in the hope of helping others:
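A rough reconstruction of the build-from-source steps (versions, paths, and the `pip install` invocation are illustrative assumptions, not the exact commands from this thread). The key point is building protobuf as a static library with position-independent code, since it ends up linked into a Python extension module:

```shell
# 1) Build a static protobuf with -fPIC. The CMake options used
#    (protobuf_BUILD_SHARED_LIBS, protobuf_BUILD_TESTS) are documented
#    protobuf build options; v3.19.4 is just an example tag.
git clone -b v3.19.4 https://github.com/protocolbuffers/protobuf
cmake -S protobuf/cmake -B protobuf/build \
      -Dprotobuf_BUILD_SHARED_LIBS=OFF \
      -Dprotobuf_BUILD_TESTS=OFF \
      -DCMAKE_POSITION_INDEPENDENT_CODE=ON \
      -DCMAKE_INSTALL_PREFIX="$PWD/protobuf-install"
cmake --build protobuf/build --target install
# 2) Point the onnxoptimizer build at that installation via CMAKE_PREFIX_PATH.
CMAKE_PREFIX_PATH="$PWD/protobuf-install" pip install -e onnxoptimizer
```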
I feel the process is more complex than using the protobuf-lite provided by Linux distributions. Also, I maintain an unofficial python-onnxoptimizer package for Arch Linux, and the packaging guidelines there recommend against static libraries due to security and package-size concerns, so I use protobuf-lite for that package.
@yan12125 wow! I had been an Arch Linux user until one year ago 😂 (I'm now using ChromeOS/Windows, and most of my daily work is to ssh into a server.) Thanks for your great contributions to archlinuxcn and the AUR! But I'm afraid using libprotobuf-lite is a "trick". The reason it solves the problem is that it makes onnxoptimizer link to libprotobuf-lite.so while onnx links to a different library, libprotobuf.so. Though it works, it relies on the "implementation" of how onnx is packaged, which is fragile.
Thanks for your kind words! I'm glad that my efforts are also helpful to others :) Yep, using protobuf-lite is a trick, and it is indeed a little fragile. I shared my findings in the hope of helping others. Building onnxoptimizer against static protobuf does prevent the conflict between
Looks like multiple onnx-ml.proto copies are allowed in protobuf-lite. See also [1] for the protobuf issue. [1] onnx/optimizer#38
* Add an external MAML implementation as a submodule
* Load omniglot
* Model rewriting in transform.py
  - Change inputs for KWS to eliminate unsupported operators from the final ONNX model.
  - Save models at each step of rewriting for debugging
  - Infer dynamic shapes to get rid of a complex graph from forward steps like `x = x.view(x.size(0), -1)`
  - Constant folding for Squeeze and Reshape nodes with known new shape and constant input
  - Reduce global variables so that the latest model is always used after model rewriting
* Other changes in transform.py
  - Move more ONNX helpers to utils.py
  - Make transformation of input samples more robust in terms of input shape
  - Don't use a default batch size; make that argument required.
* Implement BatchNormalization
* Use libc_nano instead of libc as the model omniglot-maml is too large

[1] onnx/optimizer#38
[2] microsoft/onnxruntime#5577
@yan12125 @daquexian I ran into the same issue in 2022. Why is this still an issue? Uninstalling the system protobuf seems to break everything I rely on; any better suggestion here? The optimizer became standalone but shares an onnx-ml.proto with onnx, which doesn't look like a good approach. Why not just use onnx as an included lib and remove onnx-ml.proto from this repo?
You don't need to uninstall system protobuf if your system protobuf is 3.9.0 or newer. Using the command at #38 (comment) to build onnxoptimizer should be enough.
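A quick way to check whether an installed protobuf meets the 3.9.0 threshold mentioned above. Naive string comparison gets this wrong ("3.19" sorts before "3.9" lexically), so the sketch below uses GNU `sort -V`; the `have` value is an example, typically taken from `protoc --version`:

```shell
# Compare a protobuf version against the 3.9.0 minimum using version sort.
have="3.19.4"   # e.g. the second field of: protoc --version
need="3.9.0"
newest="$(printf '%s\n' "$need" "$have" | sort -V | tail -n1)"
if [ "$newest" = "$have" ]; then
  echo "protobuf $have is new enough"
else
  echo "protobuf $have is too old"
fi
```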
That is exactly what I wanted to do, but it requires the non-trivial changes discussed in onnx/onnx#3030, as you already know. I'm not a developer of onnx or onnxoptimizer and I'm OK with the status quo, so I haven't put more effort into this issue.
@yan12125 I believe this error will happen then:
I saw you mentioned M1 Mac in another issue (onnx/onnx#4013). Do you use a package manager like Homebrew or MacPorts? Both provide protobuf 3.19.x [1,2] and
[1] https://formulae.brew.sh/formula/protobuf
I think it is not a protobuf lib issue; it's a protobuf schema conflict issue.
This is still a problem now, in 2024: pyg-team/pytorch_geometric#9220
I installed onnx/optimizer from source as mentioned. Then I ran
python onnxoptimizer/test/optimizer_test.py
and got the failure below. Any suggestions? Thanks!