Custom Layer Support [Bug] #8249
You'll need to use the OpenVINO Model Optimizer to convert a native model into IR before feeding it to your custom inference app.
First of all, thanks for your reply. When I use mo.py to convert the Caffe model into IR, the errors reported are as above. For now, I did not feed this BackwardWarp custom layer to my custom inference app.
I want to know how to add the custom layer correctly. For now, I have already followed the OpenVINO guide to add the BackwardWarp layer, but it did not work well.
FYI, this is the list of supported layers for Caffe. It seems that your layer (BackwardWarp) is not listed; therefore it's not supported.
Hello, thanks for your reply. Is there any other guide on how to add a custom layer? Or can you figure out what's wrong with the code or process I appended?
It seems that your layer (BackwardWarp) is not listed in the supported framework layers; therefore it's not supported. You may refer to the link I shared in my previous comment: the list of supported layers for Caffe.
I followed the guide you shared. I followed the section 'Convert the Frozen TensorFlow* Model to Intermediate Representation', adding backward.py into extensions/ops and backward_ext.py into extensions/front/caffe/. After these two steps, as the guide describes, the custom layer should convert successfully. However, I still cannot convert it correctly. Here is the conversion log file.
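For context, the file placed in extensions/ops typically defines a Model Optimizer Op whose `infer` callback fills in the output shape. A minimal sketch of that shape logic as a standalone function (the function name and the two-input layout are assumptions, not taken from this thread):

```python
# Hypothetical sketch: the shape-inference logic an extensions/ops Op
# for BackwardWarp might use. In a real backward.py this would be set
# as the 'infer' attribute of an mo.ops.op.Op subclass; it is shown
# here as a pure function so the idea is clear.

def backward_warp_infer(input_shapes):
    """BackwardWarp warps the first input (the image) by the flow in
    the second input, so the output shape matches the image shape."""
    image_shape, flow_shape = input_shapes
    # Sanity checks: image and flow must agree on batch and spatial
    # dims (a flow field typically has 2 channels: dx and dy).
    assert image_shape[0] == flow_shape[0], "batch size mismatch"
    assert image_shape[2:] == flow_shape[2:], "spatial size mismatch"
    return list(image_shape)
```

The front extractor in extensions/front/caffe would then only need to match the layer type from the .prototxt and attach this inference logic to the node.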
TensorFlow and Caffe models require different conversion parameters; that is why I recommended the Model Optimizer steps for a Caffe model in the first place to get the IR file. The guide you are referring to uses TensorFlow; you'll need to adjust it if your model is not TensorFlow. TensorFlow requires the frozen file format (.pb), whereas Caffe uses .caffemodel. Your log clearly shows "[ ERROR ] BackwardWarp (1)". Again, as I mentioned previously, the BackwardWarp layer in a Caffe model is not supported in OpenVINO; only what is on this list is supported: the list of supported layers for Caffe. Both your model's topology (e.g. a Caffe model with topology MobileNet) and its layers (e.g. Softmax) need to be supported by OpenVINO in order to use them, no matter the operation. Could you share your Caffe model so I can confirm on my side whether its topology/layers are supported? I couldn't access the Google Drive link you shared before.
Hello, thanks for your reply. As you mentioned previously, the BackwardWarp layer in a Caffe model is not supported in OpenVINO. Therefore, as the guide describes, I added the BackwardWarp layer in the specified path when converting the network. Thanks!
Is there any relevant progress? |
Hi @DunguTmp I've been trying to add your custom layer by following the documentation and the code you provided but was unsuccessful. I will need to check with the development team for additional guidance. Regards, Ref. 70316 |
Hi @Iffa-Meah: There is another issue. When I use the following code to load the IR files: ie = IECore() The reported issue is as follows. Is this a common issue? What should I do to deal with it?
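For reference, the usual 2021.x Python API pattern is to register the custom-layer extension library on the IECore before reading and loading the network. A sketch with the IECore instance injected and all paths as placeholders (none of these names come from this thread):

```python
def load_with_cpu_extension(ie, model_xml, weights_bin, extension_path):
    """Sketch: load an IR that contains a custom layer using the
    OpenVINO 2021.x Python API (`ie` is an IECore instance).

    The extension library must be registered *before* load_network,
    otherwise the CPU plugin reports the custom layer as unsupported.
    """
    ie.add_extension(extension_path, "CPU")
    net = ie.read_network(model=model_xml, weights=weights_bin)
    return ie.load_network(network=net, device_name="CPU")
```

With a real IECore, `extension_path` would be the compiled CPU extension (.so/.dll) built for the custom layer, the same one passed to the benchmark app via -l.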
Hi @DunguTmp Glad to see you are making progress! Have you implemented a custom operation for the Inference Engine Plugin?
Please take a look at the Custom Operation Support Overview for additional information. Regards, |
Hi, @jgespino Thanks for your reply. Recently I have read the part about creating an operation set and implementing a custom nGraph operation.
Thanks! |
Hi @DunguTmp
No, you should be able to add the custom layer using the Intel Distribution of OpenVINO toolkit without building from source.
We have a document that has more detailed information, however, it's based on 2020.2 release. I will reach out to my peers and find out if there is a more recent version. Regards, |
Hi @jgespino Thanks for your new document. Regards, |
Hi, @wdkwyf Best regards, |
@DunguTmp Understood. I think the only gap is the IE implementation. You can easily modify the template file to insert the BackwardWarp operation. The only difficulty, I think, is the kernel file. I think the OpenCV https://drive.google.com/drive/folders/13LxBUg6k_yfxh8c4ynKJSRpA9JBTbDhk?usp=sharing
Hi! We have an example implementation of |
Hi, @dkurt Best regards, |
Hi, @wdkwyf Best regards, |
Hi @DunguTmp
I've checked with the team, unfortunately, there isn't an updated version available. Regards, |
Hi, @wdkwyf, @jgespino and @dkurt Thanks for all of your work. Now, I have followed the guides and already implemented this BackwardWarp operation. I tested it successfully using the benchmark app with the -l parameter.
However, some errors are reported as follows: I don't know if you have encountered the above problems. Is there a problem with my usage? Best regards,
Is your kernel implementation based on CPU or GPU? Running
Hi, @wdkwyf Following is the log when I try to use this
Maybe it's related to the Windows path. I used
Thanks a lot. This method works for me. Best regards, |
Hi, Recently I have implemented the BackwardWarp operation with OpenCL. Following the OpenVINO guide, I have built backward_warp_extension.xml. However, when I use the following command,
some errors occurred as follows: I don't know if you have encountered the above problems. Is there a problem with my usage? Best regards,
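For the GPU plugin, custom OpenCL kernels are typically registered by pointing the plugin's CONFIG_FILE option at the kernel-description XML before loading the network. A hedged sketch of that pattern with the IECore instance injected and placeholder paths (not taken from this thread):

```python
def load_on_gpu_with_custom_kernels(ie, model_xml, weights_bin, config_xml):
    """Sketch: register a GPU custom-kernel description file with the
    OpenVINO 2021.x Python API before loading on GPU.

    `config_xml` is the XML mapping the custom layer to its OpenCL
    kernel (e.g. a file like backward_warp_extension.xml); all paths
    here are placeholders.
    """
    ie.set_config({"CONFIG_FILE": config_xml}, "GPU")
    net = ie.read_network(model=model_xml, weights=weights_bin)
    return ie.load_network(network=net, device_name="GPU")
```

If the GPU plugin itself is missing or the graphics driver is outdated, this call fails before the custom kernel is ever compiled, which matches the checks suggested below.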
Hi, I suspect you didn't install the GPU plugin during installation; you can check whether this file exists.
Hi, @wdkwyf I use this code:
File "ie_api.pyx", line 372, in openvino.inference_engine.ie_api.IECore.load_network
C:\workspace\LenovoVideoInterp\openvino_pytorch_layers-master>set path
Oh, I think you didn't update the graphics driver: https://docs.openvino.ai/latest/openvino_docs_install_guides_installing_openvino_windows.html#optional-steps-for-intel-processor-graphics-gpu
Hi, @wdkwyf The same error occurred:
Hi, @wdkwyf |
Hello, I tried the first method. From the log, it seems that the extended operation is not running on the GPU device. |
Oh, my mistake. GPU should be placed first:
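In other words, HETERO tries devices in the order they are listed, so the device string should name GPU before CPU when the custom kernel is implemented in OpenCL. A tiny helper to illustrate the ordering (the helper itself is just an illustration, not part of the OpenVINO API):

```python
def hetero_device_string(devices):
    """Build a HETERO device string. The plugin assigns each layer to
    the first listed device that supports it, so putting GPU first
    runs the custom OpenCL kernel there while unsupported layers fall
    back to the next device in the list."""
    return "HETERO:" + ",".join(devices)
```

So the call becomes load_network(network=net, device_name="HETERO:GPU,CPU") rather than "HETERO:CPU,GPU".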
Hi, @wdkwyf :
When I change the This is our model. This is our extension operator source code and dll file. |
I think it's because some operations have two outputs. That's allowed in the master branch. #6844
Hi, BR, |
Hi, you can check it; it's a very simple modification.
Closing due to inactivity, please re-open if additional assistance is needed. |
Following the OpenVINO guide, I added myop_ext.py in extensions/front/caffe and myop.py in extensions/ops. However, when converting the Caffe model to IR, some errors are reported as follows:
[ WARNING ] Consider building the Inference Engine Python API from sources or reinstall OpenVINO (TM) toolkit using "pip install openvino==2021.4"
[ ERROR ] List of operations that cannot be converted to Inference Engine IR:
[ ERROR ] BackwardWarp (1)
[ ERROR ] backwardwarp0
[ ERROR ] Part of the nodes was not converted to IR. Stopped.
Is there detailed documentation to help me add custom layers correctly?
Thanks!
The code of myop_ext.py is :
The code of myop.py is: