Cannot load any network for model person-detection-retail-0013 #2704
Comments
Hi @kafan1986 Could you try loading the model using the Inference Engine demo application? I was able to load the model on CPU and run inference with the following steps:
1. Source the environment
2. Download the model
3. Build the demo application
4. Run the application

Test Environment
I told you in the original post that the model files work fine with the sample application. Only when I use them inside my C++ app does it give the following error. It makes no sense, but I have tried nearly everything.
@jgespino As mentioned earlier, the official sample app works but my own application does not. I have created a mini sample app that reproduces the issue at my end. You will need to change `root_path` in `main_application.cpp` to the model file directory (already present inside the `data` folder). Then run `cd build && cmake .. && make -j2`.
@jgespino Did you get time to test the above application? I am looking for some guidance towards a solution.
Hi @kafan1986 I haven't had a chance to deep dive into your code and debug. However, the error message recommends using the CNNNetwork::reshape() method to specialize shapes before the conversion. Briefly looking through your code, I did not see any reshape method called to match the input shape. Please take a look at the following documentation for additional information: https://docs.openvinotoolkit.org/latest/openvino_docs_IE_DG_ShapeInference.html#usage_of_reshape_method Regards,
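For illustration, a minimal sketch of the suggested reshape call, assuming a single-input network and the 1x3x320x544 input shape documented for person-detection-retail-0013 (the model path and device name below are placeholders, not values from this issue):

```cpp
#include <string>
#include <inference_engine.hpp>

int main() {
    InferenceEngine::Core ie;

    // Placeholder path to the IR; substitute the actual model location.
    auto network = ie.ReadNetwork("person-detection-retail-0013.xml");

    // Collect the current input shapes and specialize them before loading.
    auto shapes = network.getInputShapes();               // map<inputName, SizeVector>
    const std::string inputName = shapes.begin()->first;  // assumes a single input
    shapes[inputName] = {1, 3, 320, 544};                 // N, C, H, W
    network.reshape(shapes);

    // LoadNetwork should now see fully specialized shapes.
    auto executable = ie.LoadNetwork(network, "CPU");
    return 0;
}
```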
Hi @jgespino https://github.com/openvinotoolkit/openvino/blob/master/inference-engine/samples/object_detection_sample_ssd/main.cpp My code is exactly the same apart from a few lines that I removed so it does not support ResNet. Can you build the test app at your end and check whether you get the error message too? Maybe it is an issue with my environment setup.
@jgespino After many hours of hair pulling, I finally figured out the issue. I am not sure if this is the intended behaviour of the project. The sample project, or any other project that uses it, runs OK only when built with "Release". Change it to "Debug" and the build goes fine, but at runtime it starts throwing different errors, giving the illusion that something is wrong with the model/weight files. Maybe this is a bug and should be corrected in a future release. To reproduce the issue, go to build_samples.sh, change `cmake -DCMAKE_BUILD_TYPE=Release "$SAMPLES_PATH"` to build type Debug, and try to run the generated samples with the model files.
Hi @kafan1986 I am glad you figured it out! I was able to reproduce the error with your test app and by building the Inference Engine samples in Debug. I have opened a bug with the development team to take a look. Regards, Ref. 41509
Hi @kafan1986 I apologize for the delay in my response. The build_samples.sh script was not designed with a 'make install' of the source code in mind; it was created to make using the OpenVINO toolkit release packages more convenient for users. Regards,
System information (version)
Detailed description
Steps to reproduce
At the line below, all three versions of the model (fp32, fp16, fp16-int8) throw different errors.
NOTE: The strange thing is that almost identical code (in the samples directory, object_detection_sample_ssd) works with the model files. I can't figure out the issue here.
openvino_ = ie.LoadNetwork(network, device_name);
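For context, a minimal self-contained flow that reaches this call looks roughly like the sketch below (the model path and device name are illustrative placeholders, not the actual values from my application):

```cpp
#include <string>
#include <inference_engine.hpp>

int main() {
    InferenceEngine::Core ie;
    const std::string device_name = "CPU";  // placeholder device

    // Read the IR files for person-detection-retail-0013 (placeholder path).
    InferenceEngine::CNNNetwork network =
        ie.ReadNetwork("person-detection-retail-0013.xml");

    // This is the call that fails for all three precisions.
    InferenceEngine::ExecutableNetwork openvino_ =
        ie.LoadNetwork(network, device_name);

    InferenceEngine::InferRequest request = openvino_.CreateInferRequest();
    return 0;
}
```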
fp32
fp16
fp16-int8
My Code: