Synchronizing multiple devices #2148
If you cannot use hardware sync, the multiple camera white paper has advice about software sync. It is also possible to align frames in software using either their timestamp or frame number. Both approaches are valid, but you need to be careful during the initial step of subtracting the offsets that the frames were not read across a frame boundary; for example, one camera could deliver a frame belonging to the previous frame time. The benefit of using frame counters is that they will not drift over time.
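The timestamp-based alignment described above could be sketched like this. It is a minimal illustration, not librealsense API: the function name, the tuple format, and the 8 ms tolerance (roughly a quarter of a 30 fps frame period) are all assumptions, and `offset_ms` is the slave-minus-master clock offset measured once at startup, before any drift accumulates.

```python
def align_by_timestamp(master_frames, slave_frames, offset_ms, tolerance_ms=8.0):
    """Pair frames from two cameras whose clocks differ by a fixed offset.

    master_frames / slave_frames: lists of (timestamp_ms, frame) tuples,
    sorted by timestamp. offset_ms is the slave-minus-master clock offset.
    """
    pairs = []
    j = 0
    for ts_m, frame_m in master_frames:
        # Skip slave frames whose corrected timestamp is behind the
        # master timestamp by more than the tolerance.
        while j < len(slave_frames) and (slave_frames[j][0] - offset_ms) < ts_m - tolerance_ms:
            j += 1
        # Accept the candidate only if it falls within the tolerance window.
        if j < len(slave_frames) and abs((slave_frames[j][0] - offset_ms) - ts_m) <= tolerance_ms:
            pairs.append((frame_m, slave_frames[j][1]))
    return pairs
```

Using the frame counter instead of the timestamp works the same way, with the counter offset subtracted rather than a clock offset, and a tolerance of zero.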
Hi @rishikant-sharma
What we have noticed is that there is no frame number correspondence guarantee. This is primarily because when you set the device as slave, it still acquires frames, even if no master signal is received. However, even if that were not the case, one would have to first set up the slaves (which would not be receiving frames at all), and then fire up the master in order to establish frame number correspondences. Nonetheless, as reported in #2121, it seems that HW syncing sometimes causes the color stream to not produce any frames (could it be because of intra-frame syncing?), therefore limiting the applicability of HW sync as the number of cameras increases. Thank you for your immediate responses; we will have more feedback on these once some more concrete information is available, as we have everything set up (cabling, housings, pins, multiple sensors connected).
@MartyG-RealSense @dorodnic It would help to be able to software-synchronize the clocks of different cameras with the host via a function that returns the current camera time. Synchronization up to the function's latency would then be possible. This would eliminate drift between the different clocks, and one could rely on timestamps instead of frame counters.
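The suggestion above could be sketched on the host side as follows. This is hypothetical: librealsense does not expose such a "get current camera time" helper, so the sample format (host time before the query, returned camera timestamp, host time after) and the function itself are assumptions for illustration only.

```python
def estimate_clock_offset(samples):
    """Estimate the camera-minus-host clock offset from round-trip samples.

    Each sample is (host_before_ms, camera_ts_ms, host_after_ms): the host
    time immediately before and after querying the camera clock. The sample
    with the smallest round-trip latency gives the tightest bound, so the
    camera timestamp is compared against the midpoint of that round trip.
    """
    best = min(samples, key=lambda s: s[2] - s[0])
    host_before, camera_ts, host_after = best
    midpoint = (host_before + host_after) / 2.0
    # Adding this offset to a host timestamp maps it onto the camera clock.
    return camera_ts - midpoint
```

Repeating this periodically per camera would let the host re-estimate each offset and compensate for clock drift, which is the failure mode the comment above is concerned with.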
[Realsense Customer Engineering Team Comment] We would like to know why you are not using HW sync? It would be easier and more direct.
@RealSense-Customer-Engineering |
Can this work for capturing images from multiple cameras powered by different Jetson TX2 boards, where up to 33 ms of latency is OK for my use case?
[Realsense Customer Engineering Team Comment] Did you try the HW sync to see how it works based on the multiple camera white paper? |
Hi, I successfully HW synced three D415 cameras, but because they do not start at exactly the same time, the frame numbers are not aligned. Is there any way to align or sync the frame numbers during HW sync?
Hi there, When the camera is set to "slave", why does it still acquire images? The behavior I expected is that a camera set to "slave" would wait to receive the hardware trigger before acquiring any images. This would guarantee that frame numbers, clocks, etc. would all be synchronized, presuming you have a good hardware trigger. Is there any way to accomplish this? |
A slave camera continues streaming even when it is not synched. It listens for a sync trigger and if it does not receive it within a certain time then it gives up listening and triggers itself independently as though it were an unsynced camera. I am not aware of a means of having the stream inactive until a sync signal is received. Given that a stream starting from a dead stop will take several frames before the auto-exposure settles down, having the stream on 'hot standby' listening for a trigger whilst it streams is likely to produce more accurate results. |
Thanks for your reply! I'm not using auto-exposure, so that doesn't seem to be an issue. I'm using this in a scientific application, where it's absolutely crucial to know when frames were acquired in order to sync with other hardware. I'll open a new issue to try to find a way to keep the stream inactive until a signal is received.
In Intel's January 2018 demonstration of multi-cam capture with D435 cameras at the Sundance Festival (about 6 months before the D435 officially supported sync in the firmware), they used a commercial flash to signal the start and end of capture by causing a spike in the data to make sure the camera footage was properly synched together. https://realsense.intel.com/intel-realsense-volumetric-capture/ Edit: I knew I had seen a video of the demo area before. I managed to track it down. |
This is now implemented as part of 2.22 |
Issue Description
I am trying to synchronize the depth streams from multiple D415 cameras.
Currently I set up a pipeline for each device and configure 2 infrared, 1 depth, and 1 color stream.
I create a thread to run a function which receives the pipeline along with the serial number of the device.
The function saves the next 300 framesets in a file.
In each iteration I wait_for_frames() and then write them to the file so that the frames are in sync.
The frames captured from the same device are more or less in sync (I'm satisfied with the performance), but they are not in sync with the frames from the other devices.
Is there a way for me to set up the syncer so that frames from all devices are in sync with each other?
I believe there should be a way to use the low level API for my purpose, but I can't seem to figure it out as I am new to the library.
Unfortunately, hardware sync is not an option for me at the moment.
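One software-only approach to the cross-device problem described above: collect framesets per device (as the per-pipeline threads already do), then group them by nearest timestamp, using one device as the reference timeline. This is a sketch, not librealsense's syncer: the function, tolerance, and data layout are illustrative, and it assumes per-camera clock offsets have already been subtracted from the timestamps.

```python
def bundle_framesets(per_device, tolerance_ms=8.0):
    """Group framesets from multiple devices into cross-device bundles.

    per_device: dict mapping serial -> list of (timestamp_ms, frameset),
    sorted by timestamp, with per-camera clock offsets already removed.
    The first device is used as the reference timeline; a bundle is
    emitted only when every other device has a frameset within
    tolerance_ms of the reference timestamp.
    """
    serials = list(per_device)
    ref = serials[0]
    cursors = {s: 0 for s in serials[1:]}
    bundles = []
    for ts_ref, fs_ref in per_device[ref]:
        bundle = {ref: fs_ref}
        for s in serials[1:]:
            frames = per_device[s]
            i = cursors[s]
            # Skip framesets that are too old to match this reference frame.
            while i < len(frames) and frames[i][0] < ts_ref - tolerance_ms:
                i += 1
            cursors[s] = i
            if i < len(frames) and abs(frames[i][0] - ts_ref) <= tolerance_ms:
                bundle[s] = frames[i][1]
        # Emit only complete bundles (one frameset from every device).
        if len(bundle) == len(serials):
            bundles.append(bundle)
    return bundles
```

Each capture thread would append `(frameset.get_timestamp(), frameset)` to its device's list after `wait_for_frames()`, and the bundling would run once capture finishes (or incrementally). Without hardware sync the cameras free-run, so some reference frames will have no match within the tolerance and are dropped rather than mispaired.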