When I export ONNX with export.py and convert it to MNN, the resulting model contains a pile of glue operators, but the MNN model you provide for download has none. Could you tell me which step I got wrong?

I downloaded the official v5lite-e.pt model, exported it to v5lite-e.onnx with export.py, optimized it with onnxsim, and then converted v5lite-e-sim.onnx to v5lite-e-sim.mnn with mnnconvert.

Comparing in Netron against the downloaded v5lite-e-mnnd_fp16.mnn, my converted model has many extra glue operators.

When exporting to ONNX I tried both with and without the --mnnd/--concat options; either way the converted model still seems to contain many Const operators. The exact commands are below.

Export to ONNX:
python export.py --weights v5lite-e.pt --mnnd --batch-size 1
onnxsim optimization:
python -m onnxsim v5lite-e.onnx v5lite-e-sim.onnx
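For reference, the same simplification can also be run from Python, which makes it easy to confirm that onnxsim considers the simplified graph valid before handing it to mnnconvert (a minimal sketch, assuming the onnx and onnxsim packages are installed; the file names match the commands above):

import onnx
from onnxsim import simplify

# Load the exported model and run onnx-simplifier on it
model = onnx.load("v5lite-e.onnx")
model_simp, ok = simplify(model)
assert ok, "onnxsim could not validate the simplified model"
onnx.save(model_simp, "v5lite-e-sim.onnx")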
Export with mnnconvert (version 2.9.3):
mnnconvert -f ONNX --modelFile v5lite-e-sim.onnx --MNNModel v5lite-e-sim.mnn --bizCode MNN
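To narrow down whether the extra Const/glue operators are already present in the ONNX graph or are only introduced by mnnconvert, one option is to count the operator types in the ONNX files before and after simplification (a rough diagnostic sketch using the onnx Python package; file names are the ones from the commands above, and note that the ONNX-side equivalents of MNN's Const are ops such as Constant/ConstantOfShape):

from collections import Counter
import onnx

def op_counts(path):
    # Count how many nodes of each operator type the graph contains
    graph = onnx.load(path).graph
    return Counter(node.op_type for node in graph.node)

before = op_counts("v5lite-e.onnx")
after = op_counts("v5lite-e-sim.onnx")
print("before:", before.most_common(10))
print("after:", after.most_common(10))
print("constant-like ops after simplification:",
      {k: v for k, v in after.items()
       if k in ("Constant", "ConstantOfShape", "Identity")})

If the simplified ONNX graph is already free of Shape/Gather/Concat-style glue but the converted MNN file still shows it, the difference is more likely to come from the MNNConvert version or options used to produce the released v5lite-e-mnnd_fp16.mnn (judging by its name, the released file was also saved as fp16, which the command above does not do).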