Looking for help testing Intel OpenVino support for NCS2 #3797
Replies: 12 comments 77 replies
-
I've posted this around on a few different existing discussions and issues; hopefully some users will be able to help.
-
Happy to help, but it will have to be arm64 for me.
-
Nice, I should be able to help out with this later today as well.
-
Awesome, will give it a try over the weekend!
-
Hello, I just tested ghcr.io/natemeyer/frigate:0.11.0-openvino-f0d0893 with a Core i7-3770S, and this is what I got:
-
I was able to run this image with my Neural Compute Stick 2, and Frigate is reporting an inference speed of 10 ms. One word of caution if you are running the container inside a VM: the Neural Compute Stick 2 does some odd things when virtualized. When the driver instantiates the device, the device ID seems to change in /dev on the host, so the mapping into the VM is no longer valid and the device disappears. This could explain some issues. I have the container running directly on bare metal with the NCS2 and it's working without a hitch. I actually gave up on Frigate earlier this year due to Coral unavailability, but have had this NCS2 sitting in my closet for almost two years. Glad to finally put it to use and be running Frigate again.
-
I don't have an NCS2, but I have got it working on my server with an Intel J4105, using the GPU device as the detector. Inference speed is now 25 ms, which is a nice improvement over the default CPU detector's roughly 150 ms. Additionally, CPU usage is now very low (about 15% at idle and 30% during detection, much better than the full load I saw with the default CPU detector).
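For anyone wanting to try the same thing, here is a minimal sketch of a GPU-based detector config. It is based on the OpenVINO detector as documented for later Frigate releases; the detector name `ov` is arbitrary and the exact keys in this dev image may differ, so treat it as an assumption and check the PR for specifics.

```yaml
# Hypothetical sketch: OpenVINO detector running on the Intel iGPU.
# "ov" is an arbitrary detector name; "GPU" is the OpenVINO device string.
detectors:
  ov:
    type: openvino
    device: GPU
```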
-
@NateMeyer I finally got it working! I tried with privileged and with the cgroup rule; both worked, so cgroup rule it is. I believe the issue all this time was the way I was mounting /dev/bus/usb. Getting an inference speed of about 62 ms, which is in line with @indieaz's results.
-
Just wondering, is there a 0.11.1 version of the OpenVINO Docker image, or should I still use 0.11.0? I have an NCS2 kicking around that I'd be happy to plug in and put to use.
-
Great job, thank you very much. It was exactly what I was looking for. I installed it on a Raspberry Pi 3 with an Intel NCS2 USB stick. :)
-
Can someone explain to me how to do this with an Intel iGPU like I'm a baby?
-
Has anyone used the stick on an aarch64 SBC recently? I can't get it to work on a Pi 4 or a Vim 3, both running a fully up-to-date Debian Bookworm. It's working fine on an Intel server with all parameters strictly identical.
-
I am working on a build that includes OpenVINO support on x86 platforms (#3768). I currently have it working with the iGPU on my i5-1135G7 laptop; more details are in the PR.
I was wondering if there is anyone with a Neural Compute Stick 2 who wouldn't mind helping me test the VPU device integration.
I have an image pushed to GitHub that should be ready to test.
For now, this will only work on an x86 host. (Update: now with arm64/arm32 builds.)
docker pull ghcr.io/natemeyer/frigate:0.11.0-openvino-a3afb6f
With this image, you can add the VPU as a detector with the following config:
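A minimal sketch of what that detector block could look like is below. The detector name `ov` is arbitrary, and `MYRIAD` is OpenVINO's device string for the NCS2; these are assumptions based on the upstream detector docs, so check the PR for the exact keys this image expects.

```yaml
# Hypothetical sketch: NCS2 (Myriad X VPU) as an OpenVINO detector.
detectors:
  ov:
    type: openvino
    device: MYRIAD
```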
Additionally, you will have to add the following configuration to use the included OpenVINO model.
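The bundled model config would look roughly like this sketch. The path, input dimensions, tensor layout, and labelmap name are assumptions based on the OpenVINO docs for released Frigate versions; check the PR for the values this dev image actually ships.

```yaml
# Hypothetical sketch of the bundled OpenVINO SSDLite MobileNet v2 model config.
model:
  path: /openvino-model/ssdlite_mobilenet_v2.xml
  width: 300
  height: 300
  input_tensor: nhwc
  input_pixel_format: bgr
  labelmap_path: /openvino-model/coco_91cl_bkgr.txt
```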
With this dev image, you need to add the following options to your docker run command:
--device-cgroup-rule='c 189:* rmw' -v /dev/bus/usb:/dev/bus/usb
or in your compose file:
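As a sketch, the equivalent compose options would look something like the following. The service name `frigate` is an assumption; the cgroup rule and USB bus mount mirror the docker run flags above.

```yaml
# Hypothetical sketch of the equivalent docker-compose options.
services:
  frigate:
    image: ghcr.io/natemeyer/frigate:0.11.0-openvino-a3afb6f
    device_cgroup_rules:
      - "c 189:* rmw"               # allow access to USB character devices (major 189)
    volumes:
      - /dev/bus/usb:/dev/bus/usb   # pass the USB bus through for the NCS2
```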
Thanks!