Imagine I have a very long USB cable, or very unpredictable latency, between my image being ready in the camera firmware and my pyrealsense SDK receiving the data. If I call get_timestamp(), I can imagine one of two values being returned, depending on how get_timestamp() syncs its clock with my system clock:
a) My host computer's timestamp from when the frames were fully collected in the camera's firmware
b) My host computer's timestamp when the frames were received by the host computer
Which of those values is returned by get_timestamp()?
If this doesn't make sense yet, here's an example:
The camera and my computer are clock-synced. The camera takes a picture at t=100ms.
It transmits this information back to the computer. This process takes 10ms.
So the SDK receives the full batch of data at t=110ms.
Does get_timestamp() return 100 or 110?
Or, alternatively, am I misunderstanding how this works? I did my best to parse the other questions on here but am still confused.
Hi @amm385. The documentation for get_timestamp() in the link below states that it retrieves the time at which the frame was captured. At runtime, the SDK selects the most correct representation from the different timestamp types available, based on both device and host capabilities.
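As a quick way to see which representation your own setup ends up with, you can query the frame's timestamp domain alongside get_timestamp(). A minimal sketch, assuming a default pipeline and the standard pyrealsense2 Python bindings:

```python
import pyrealsense2 as rs

pipeline = rs.pipeline()
pipeline.start()

try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()

    # get_timestamp() returns the frame time in milliseconds.
    # Which clock it refers to is reported by the timestamp domain:
    #   hardware_clock - stamped by the camera when the frame was captured
    #   system_time    - stamped by the host when the frame arrived
    #   global_time    - camera capture time mapped onto the host clock
    ts = depth.get_timestamp()
    domain = depth.get_frame_timestamp_domain()
    print(f"timestamp = {ts:.3f} ms, domain = {domain}")
finally:
    pipeline.stop()
```

In terms of your 100 ms / 110 ms example: a hardware_clock or global_time domain corresponds to the capture-time reading (100), while system_time corresponds to the arrival-time reading (110).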
The timestamps of RealSense camera sensors are generated as frame metadata attributes. These can be produced by the camera firmware or by the host clock. The links below describe how metadata is created.
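If you want to compare the firmware-side timing with the host-side arrival time directly, the individual metadata attributes can be read from the frame. A hedged sketch; which attributes are actually populated varies by camera model, firmware version, and whether metadata support is enabled on the host:

```python
import pyrealsense2 as rs

pipeline = rs.pipeline()
pipeline.start()

try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()

    # Metadata attributes that can help separate capture time from arrival
    # time (availability depends on device, firmware, and host setup):
    candidates = {
        "frame_timestamp": rs.frame_metadata_value.frame_timestamp,    # firmware frame time
        "sensor_timestamp": rs.frame_metadata_value.sensor_timestamp,  # sensor-level time
        "time_of_arrival": rs.frame_metadata_value.time_of_arrival,    # host receive time
        "backend_timestamp": rs.frame_metadata_value.backend_timestamp,
    }
    for name, attr in candidates.items():
        if depth.supports_frame_metadata(attr):
            print(f"{name}: {depth.get_frame_metadata(attr)}")
        else:
            print(f"{name}: not available on this device/firmware")
finally:
    pipeline.stop()
```

Comparing frame_timestamp (or sensor_timestamp) with time_of_arrival on your own hardware should make the transmission latency in your scenario visible directly.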