When is time stamp of frame captured? #11330
Hi @reinzler There are a range of different timestamps that have different characteristics.

At #2188 (comment) a RealSense team member provides a list of timestamp types and descriptions of them, and there is further information about how the timestamps work at #4525. Regarding the references in these links to Global Time (which is enabled by default on 400 Series cameras), information can be found at #3909.

Documentation for the get_timestamp() instruction can be found at the link below. My understanding from the above information resources is that if hardware timestamps are available then get_timestamp() provides the frame_timestamp, which designates the beginning of UVC frame transmission (the first USB chunk sent towards the host); readout begins after exposure is completed.
Thanks for the fast response. How can I get FRAME_TIMESTAMP using Python?
First, disable Global Time using the Python code at #9172 (comment). You should then be able to obtain frame_timestamp from the frame metadata.
An example of Python code for printing the frame_timestamp metadata value is at #9891
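Putting the two steps together, a minimal sketch (untested here, and assuming a 400 Series camera is connected) of disabling Global Time and printing the frame_timestamp metadata might look like this:

```python
import pyrealsense2 as rs

pipeline = rs.pipeline()
profile = pipeline.start()

# Disable Global Time on the depth sensor so that raw hardware
# timestamps are reported instead of host-corrected ones
sensor = profile.get_device().first_depth_sensor()
if sensor.supports(rs.option.global_time_enabled):
    sensor.set_option(rs.option.global_time_enabled, 0)

try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()
    # frame_timestamp marks the start of UVC frame transmission
    if depth.supports_frame_metadata(rs.frame_metadata_value.frame_timestamp):
        print(depth.get_frame_metadata(rs.frame_metadata_value.frame_timestamp))
finally:
    pipeline.stop()
```

Note that metadata support depends on the camera firmware and the OS-level metadata patch being in place, so the supports_frame_metadata() check is important.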
Thanks. How can I convert the hardware clock to epoch time to compare it with the global time?
The section of the global timestamp system's code that makes use of EPOCH is at the link below. My understanding from IntelRealSense/realsense-ros#1906 (comment) is that global time is not a direct conversion of the hardware time; instead, it uses the system clock as a means to perform correction of the hardware timestamp.
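To illustrate that idea in isolation (a minimal sketch, not librealsense's actual implementation; all names below are hypothetical): rather than converting the hardware clock directly, one collects (hardware timestamp, system epoch time) sample pairs and fits a linear model, which can then map any hardware timestamp onto the epoch timeline.

```python
class HardwareToSystemClock:
    """Sketch of a global-time-style correction: fit a linear model
    mapping hardware timestamps (ms, arbitrary epoch) onto the host's
    system epoch time, instead of converting the hardware clock directly."""

    def __init__(self):
        self.samples = []  # (hw_ms, sys_epoch_s) pairs

    def add_sample(self, hw_ms, sys_epoch_s):
        """Record a pair: hardware timestamp and the host time it arrived."""
        self.samples.append((hw_ms, sys_epoch_s))

    def to_epoch(self, hw_ms):
        """Least-squares fit sys = a*hw + b over collected samples,
        then evaluate the fit at the requested hardware timestamp."""
        n = len(self.samples)
        sx = sum(h for h, _ in self.samples)
        sy = sum(s for _, s in self.samples)
        sxx = sum(h * h for h, _ in self.samples)
        sxy = sum(h * s for h, s in self.samples)
        a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
        b = (sy - a * sx) / n
        return a * hw_ms + b
```

In practice librealsense also has to handle hardware clock drift and wrap-around, so the real correction is an ongoing process rather than a one-off fit; this sketch only shows the core mapping.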
Hi @reinzler Was the information in the above comment helpful to you, please? Thanks!
Yes, it was helpful, thanks. Another question: is it possible to fuse data from two different IMUs (VN100, Bosch BMI055), GPS (HERE 3), and SLAM from an Intel RealSense D455 using the ROS robot_localization package?
Multiple sensors, such as multiple IMUs, can be fused in robot_localization using state estimation nodes, as described in the robot_localization documentation at the link below. Intel's RealSense SLAM guide for ROS makes use of robot_localization, though there have been no previous cases of the SLAM data being used with sensors other than a single IMU-equipped RealSense camera. https://github.com/IntelRealSense/realsense-ros/wiki/SLAM-with-D435i
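As a rough illustration of what fusing two IMUs plus camera odometry could look like, a robot_localization EKF node accepts numbered sensor inputs (imu0, imu1, odom0, ...), each with a 15-element boolean config selecting which state variables it contributes. The topic names below are hypothetical and would need to match your actual drivers:

```yaml
# Hypothetical ekf_localization_node parameters; topic names are examples only.
frequency: 30
imu0: /vn100/imu/data
# 15 booleans: x y z, roll pitch yaw, vx vy vz, vroll vpitch vyaw, ax ay az
imu0_config: [false, false, false,
              true,  true,  true,
              false, false, false,
              true,  true,  true,
              true,  true,  true]
imu1: /bmi055/imu/data
imu1_config: [false, false, false,
              true,  true,  true,
              false, false, false,
              true,  true,  true,
              true,  true,  true]
odom0: /camera/odom/sample
odom0_config: [true,  true,  true,
               false, false, false,
               true,  true,  true,
               false, false, false,
               false, false, false]
```

GPS would typically go through a separate navsat_transform_node rather than being fed to the EKF directly.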
Thank you very much for the fast and useful response!
Thank you for your help. I have another question: can I pass data from the D455 that was prepared in advance to the ROS robot_localization library in order to run a simulation?
Intel's SLAM guide suggests replaying a saved RealSense bag file by setting the rosparam use_sim_time to true and then reading the bag with rosbag play.
The link below suggests a method for loading a bag file directly into robot_localization.
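For reference, the replay approach described in the SLAM guide amounts to the following commands (the bag filename is hypothetical; these assume a sourced ROS 1 environment with a roscore running):

```shell
# Tell all nodes to use simulated time from the bag's /clock topic
rosparam set use_sim_time true
# Replay the recording, publishing /clock so subscribers stay in sync
rosbag play --clock d455_recording.bag
```

With use_sim_time enabled, robot_localization consumes the replayed sensor topics exactly as it would live data.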
Hi @reinzler Do you require further assistance with this case, please? Thanks!
Case closed due to no further comments received.
I need to know the precise time of the end of exposure; how can I get it?
I saw the get_timestamp() function, but I'm not sure it is exactly what I need.