
Context leak detected with CoreML #1449

Open
thewh1teagle opened this issue Oct 20, 2024 · 2 comments

@thewh1teagle (Contributor)

When using the keyword spotter from the C-API example, the log ends with a context-leak warning (full output below). Is this something internal to onnxruntime, or something we have control over?

sherpa-onnx git:(v1.10.28) ✗ ./build/bin/keywords-spotter-buffered-tokens-keywords-c-api
/Volumes/Internal/sherpa-rs/sys/sherpa-onnx/sherpa-onnx/c-api/c-api.cc:SherpaOnnxCreateKeywordSpotter:710 KeywordSpotterConfig(feat_config=FeatureExtractorConfig(sampling_rate=16000, feature_dim=80, low_freq=20, high_freq=-400, dither=0), model_config=OnlineModelConfig(transducer=OnlineTransducerModelConfig(encoder="sherpa-onnx-kws-zipformer-gigaspeech-3.3M-2024-01-01/encoder-epoch-12-avg-2-chunk-16-left-64.int8.onnx", decoder="sherpa-onnx-kws-zipformer-gigaspeech-3.3M-2024-01-01/decoder-epoch-12-avg-2-chunk-16-left-64.int8.onnx", joiner="sherpa-onnx-kws-zipformer-gigaspeech-3.3M-2024-01-01/joiner-epoch-12-avg-2-chunk-16-left-64.int8.onnx"), paraformer=OnlineParaformerModelConfig(encoder="", decoder=""), wenet_ctc=OnlineWenetCtcModelConfig(model="", chunk_size=16, num_left_chunks=4), zipformer2_ctc=OnlineZipformer2CtcModelConfig(model=""), nemo_ctc=OnlineNeMoCtcModelConfig(model=""), provider_config=ProviderConfig(device=0, provider="coreml", cuda_config=CudaConfig(cudnn_conv_algo_search=1), trt_config=TensorrtConfig(trt_max_workspace_size=2147483647, trt_max_partition_iterations=10, trt_min_subgraph_size=5, trt_fp16_enable="True", trt_detailed_build_log="False", trt_engine_cache_enable="True", trt_engine_cache_path=".", trt_timing_cache_enable="True", trt_timing_cache_path=".",trt_dump_subgraphs="False" )), tokens="", num_threads=1, warm_up=0, debug=True, model_type="", modeling_unit="cjkchar", bpe_vocab=""), max_active_paths=4, num_trailing_blanks=1, keywords_score=3, keywords_threshold=0.1, keywords_file="")

/Volumes/Internal/sherpa-rs/sys/sherpa-onnx/sherpa-onnx/csrc/online-transducer-model.cc:GetModelType:52 num_heads=4,4,4,8,4,4
num_encoder_layers=1,1,1,1,1,1
cnn_module_kernels=31,31,15,15,15,31
model_type=zipformer2
T=45
model_author=k2-fsa
version=1
comment=streaming zipformer2
left_context_len=64,32,16,8,16,32
decode_chunk_len=32
value_head_dims=12,12,12,12,12,12
encoder_dims=128,128,128,128,128,128
onnx.infer=onnxruntime.quant
query_head_dims=32,32,32,32,32,32

/Volumes/Internal/sherpa-rs/sys/sherpa-onnx/sherpa-onnx/csrc/online-zipformer2-transducer-model.cc:InitEncoder:100 ---encoder---
num_heads=4,4,4,8,4,4
num_encoder_layers=1,1,1,1,1,1
cnn_module_kernels=31,31,15,15,15,31
model_type=zipformer2
T=45
model_author=k2-fsa
version=1
comment=streaming zipformer2
left_context_len=64,32,16,8,16,32
decode_chunk_len=32
value_head_dims=12,12,12,12,12,12
encoder_dims=128,128,128,128,128,128
onnx.infer=onnxruntime.quant
query_head_dims=32,32,32,32,32,32

/Volumes/Internal/sherpa-rs/sys/sherpa-onnx/sherpa-onnx/csrc/online-zipformer2-transducer-model.cc:operator():122 encoder_dims: 128 128 128 128 128 128 

/Volumes/Internal/sherpa-rs/sys/sherpa-onnx/sherpa-onnx/csrc/online-zipformer2-transducer-model.cc:operator():122 query_head_dims: 32 32 32 32 32 32 

/Volumes/Internal/sherpa-rs/sys/sherpa-onnx/sherpa-onnx/csrc/online-zipformer2-transducer-model.cc:operator():122 value_head_dims: 12 12 12 12 12 12 

/Volumes/Internal/sherpa-rs/sys/sherpa-onnx/sherpa-onnx/csrc/online-zipformer2-transducer-model.cc:operator():122 num_heads: 4 4 4 8 4 4 

/Volumes/Internal/sherpa-rs/sys/sherpa-onnx/sherpa-onnx/csrc/online-zipformer2-transducer-model.cc:operator():122 num_encoder_layers: 1 1 1 1 1 1 

/Volumes/Internal/sherpa-rs/sys/sherpa-onnx/sherpa-onnx/csrc/online-zipformer2-transducer-model.cc:operator():122 cnn_module_kernels: 31 31 15 15 15 31 

/Volumes/Internal/sherpa-rs/sys/sherpa-onnx/sherpa-onnx/csrc/online-zipformer2-transducer-model.cc:operator():122 left_context_len: 64 32 16 8 16 32 

/Volumes/Internal/sherpa-rs/sys/sherpa-onnx/sherpa-onnx/csrc/online-zipformer2-transducer-model.cc:InitEncoder:131 T: 45
/Volumes/Internal/sherpa-rs/sys/sherpa-onnx/sherpa-onnx/csrc/online-zipformer2-transducer-model.cc:InitEncoder:132 decode_chunk_len_: 32
/Volumes/Internal/sherpa-rs/sys/sherpa-onnx/sherpa-onnx/csrc/online-zipformer2-transducer-model.cc:InitDecoder:153 ---decoder---
vocab_size=500
context_size=2
onnx.infer=onnxruntime.quant

/Volumes/Internal/sherpa-rs/sys/sherpa-onnx/sherpa-onnx/csrc/online-zipformer2-transducer-model.cc:InitJoiner:178 ---joiner---
onnx.infer=onnxruntime.quant
joiner_dim=320

sample rate: 16000, num samples: 267440, duration: 16.72 s
Context leak detected, msgtracer returned -1
0:FOREVER

Related
thewh1teagle/sherpa-rs#23

@csukuangfj (Collaborator)

The logs look normal.

Could you describe the issue you have?

@thewh1teagle (Contributor, Author)

> The logs look normal.
>
> Could you describe the issue you have?

As you can see, the logs show the following warning:
Context leak detected, msgtracer returned -1
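One way to check whether the warning is tied to the CoreML execution provider is to run the same model with the CPU provider and see if the message disappears. Below is a minimal sketch of that configuration, using the model paths from the log above; the struct and field names are assumed to match sherpa-onnx's `c-api.h` (verify against your version), and the tokens path is a hypothetical placeholder since the log shows `tokens=""`.

```c
// Sketch: reproduce the keyword-spotter setup from the log, but with
// provider = "cpu" instead of "coreml", to isolate the CoreML warning.
// Field names assume sherpa-onnx's c-api/c-api.h; check your version.
#include <string.h>
#include "sherpa-onnx/c-api/c-api.h"

int main() {
  SherpaOnnxKeywordSpotterConfig config;
  memset(&config, 0, sizeof(config));  // zero all fields, then set what we need

  // Model paths as printed in the log above.
  config.model_config.transducer.encoder =
      "sherpa-onnx-kws-zipformer-gigaspeech-3.3M-2024-01-01/"
      "encoder-epoch-12-avg-2-chunk-16-left-64.int8.onnx";
  config.model_config.transducer.decoder =
      "sherpa-onnx-kws-zipformer-gigaspeech-3.3M-2024-01-01/"
      "decoder-epoch-12-avg-2-chunk-16-left-64.int8.onnx";
  config.model_config.transducer.joiner =
      "sherpa-onnx-kws-zipformer-gigaspeech-3.3M-2024-01-01/"
      "joiner-epoch-12-avg-2-chunk-16-left-64.int8.onnx";
  config.model_config.tokens = "tokens.txt";  // hypothetical path

  config.model_config.provider = "cpu";  // swap "coreml" -> "cpu" to compare
  config.model_config.num_threads = 1;
  config.model_config.debug = 1;

  config.max_active_paths = 4;
  config.keywords_score = 3.0f;
  config.keywords_threshold = 0.1f;
  config.keywords_file = "keywords.txt";  // hypothetical path

  const SherpaOnnxKeywordSpotter *kws = SherpaOnnxCreateKeywordSpotter(&config);
  if (!kws) return 1;
  // ... feed audio as in keywords-spotter-buffered-tokens-keywords-c-api.c ...
  SherpaOnnxDestroyKeywordSpotter(kws);
  return 0;
}
```

If the warning only appears with `provider = "coreml"`, that would point at the CoreML execution provider inside onnxruntime rather than at sherpa-onnx itself.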
