Convert to OVMS
Tasks:
- Integrate OVMS for model inferencing instead of DL Streamer Pipeline Server @NeethuES-intel
- Preprocess frames and send them to OVMS (gRPC call) for model inferencing @NeethuES-intel
- Postprocess the inference results to get ROI coordinates and publish them to the MQTT topic AnalyticsData (see the sketch after the open question below)

Open question: can pre- and post-processing be chained with inferencing inside OVMS, so that only the ROI coordinates are sent back to the CV ROI service, which then publishes them to the MQTT topic AnalyticsData?
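As a rough illustration of the preprocess → OVMS gRPC → postprocess → MQTT flow described in the tasks above, here is a minimal Python sketch. The OVMS endpoint, model name, input tensor name ("data"), input size, confidence threshold, and the SSD-style [image_id, label, conf, x_min, y_min, x_max, y_max] output layout are all assumptions and would need to match the actual deployed model and broker configuration.

```python
# Hypothetical sketch only: names, shapes, and output layout are assumptions.
import json

import cv2
import numpy as np
import paho.mqtt.client as mqtt
from ovmsclient import make_grpc_client

OVMS_URL = "localhost:9000"        # assumed OVMS gRPC endpoint
MODEL_NAME = "person-detection"    # assumed model name in the OVMS config
MQTT_BROKER = "localhost"
MQTT_TOPIC = "AnalyticsData"
CONF_THRESHOLD = 0.5


def preprocess(frame, size=(300, 300)):
    """Resize and reorder the frame to NCHW float32, as many OpenVINO detection models expect."""
    resized = cv2.resize(frame, size)
    return resized.transpose(2, 0, 1)[np.newaxis].astype(np.float32)


def postprocess(detections, frame_shape):
    """Convert normalized [image_id, label, conf, x_min, y_min, x_max, y_max] rows to pixel ROIs."""
    h, w = frame_shape[:2]
    rois = []
    for det in np.asarray(detections).reshape(-1, 7):
        conf = float(det[2])
        if conf < CONF_THRESHOLD:
            continue
        x_min, y_min, x_max, y_max = det[3:7]
        rois.append({
            "confidence": conf,
            "roi": [int(x_min * w), int(y_min * h), int(x_max * w), int(y_max * h)],
        })
    return rois


grpc_client = make_grpc_client(OVMS_URL)
mqtt_client = mqtt.Client()            # paho-mqtt 1.x style constructor
mqtt_client.connect(MQTT_BROKER, 1883)

frame = cv2.imread("frame.jpg")        # stand-in for a frame from the video source
# "data" is an assumed input name; predict() returns a numpy array for single-output models.
outputs = grpc_client.predict({"data": preprocess(frame)}, MODEL_NAME)
rois = postprocess(outputs, frame.shape)
mqtt_client.publish(MQTT_TOPIC, json.dumps({"rois": rois}))
```

In this sketch the CV ROI service keeps both pre- and post-processing on the client side; if OVMS can chain them (per the open question above), the service would only receive ROI coordinates and forward them to the MQTT topic.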
Based on discussions with Brian, the architecture diagram has been updated accordingly.