hello, how to export the projection onnx? #8
Comments
The PR is here, but it hasn't been merged yet since I haven't had time to make some changes (to the testing and inference scripts). However, you can use the export script for now.
@EmreOzkose Hi, I only found that these onnx models were exported; I didn't find encoder_proj.onnx or decoder_proj.onnx, which are needed by your onnx C++ inference code. I have 2 questions:
@csukuangfj Hello, can you help me out with this issue? I cannot get your project to run.
What have you done, and what are the error messages?
Hi @jinfagang, I am updating the PR now. You can use the export script.
Export script is updated.
Initially, the aim was to combine all parts into one .onnx file, like model.pt. There is a branch that uses only all-in-one.onnx to decode, but it is written in Python. This repo uses ONNX Runtime without a dependency on Libtorch, and I couldn't extract each model (encoder, decoder, etc.) internally, so the models are given separately for now.
@EmreOzkose thanks, now I can get the proj models as well. For the all-in-one inference, from what I can see here: https://github.com/EmreOzkose/sherpa/blob/887ddd0924cf5c4216a8671c39b04e8e8371356d/sherpa/bin/pruned_transducer_statelessX/offline_asr.py#L298 this still uses the model's decoder during greedy search. That is not convenient with onnxruntime.
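To illustrate the point being made: transducer greedy search only needs to *call* the encoder, decoder, and joiner forward passes, so it can be driven entirely by ONNX Runtime sessions with no Libtorch dependency. Below is a minimal sketch of that loop; `run_encoder`, `run_decoder`, and `run_joiner` are hypothetical numpy stand-ins for `ort.InferenceSession(...).run(...)` calls, and the vocabulary size, dimensions, and context length are made up for the demo.

```python
# Hedged sketch: transducer greedy search driven purely by three
# forward functions, as an ONNX Runtime pipeline would be.
import numpy as np

rng = np.random.default_rng(0)
VOCAB, DIM, BLANK, CTX = 10, 8, 0, 2  # hypothetical sizes

E = rng.normal(size=(VOCAB, DIM))          # fake decoder embedding table
W_joiner = rng.normal(size=(2 * DIM, VOCAB))

def run_encoder(features):
    # stand-in for encoder.onnx: (T, feat_dim) -> (T, DIM)
    return rng.normal(size=(features.shape[0], DIM))

def run_decoder(context):
    # stand-in for decoder.onnx: (CTX,) token ids -> (DIM,)
    return np.tanh(E[context].mean(axis=0))

def run_joiner(enc, dec):
    # stand-in for joiner.onnx: two (DIM,) vectors -> (VOCAB,) logits
    return np.concatenate([enc, dec]) @ W_joiner

def greedy_search(features):
    enc_out = run_encoder(features)
    hyp = [BLANK] * CTX                    # decoder context starts as blanks
    dec_out = run_decoder(np.array(hyp[-CTX:]))
    for t in range(enc_out.shape[0]):      # at most one emission per frame
        logits = run_joiner(enc_out[t], dec_out)
        tok = int(np.argmax(logits))
        if tok != BLANK:                   # emit token, refresh decoder state
            hyp.append(tok)
            dec_out = run_decoder(np.array(hyp[-CTX:]))
    return hyp[CTX:]                       # drop the initial blank context

hyp = greedy_search(np.zeros((5, 4)))
```

In a real pipeline each stub would wrap a session's `run()` call, so no part of the search would touch the PyTorch model.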
Oh, I get it. Will this also be ported to C++?
This repo contains the first working version, but I have to do some refactoring (adding OfflineASR, OfflineRecognizer, etc.). I am planning to do it in a few days.
Please use
From the export script, I couldn't see how to export the projection onnx models either. How is that done?