How to find position (x,y,z) and orientation (Euler angles) of the measurement frame for D435i? #7884
Comments
Hi @CarMineo It sounds as though you already know how to position and rotate the individual clouds to align them using an affine transform; instead, you want to be able to calculate the transform between the camera device and the flange. Is this correct, please? If you do need details of how to perform an affine transform, an example of one in librealsense is the instruction rs2_transform_point_to_point.

In regard to the flange: this reminds me of a tool that a RealSense team member wrote to work out the extrinsic calibration between a 400 Series camera and a T265 Tracking Camera on a custom mount. I wonder if it may be applicable to this situation if the flange represented the position on the mount where the T265 would have been. The use of gimballing with RealSense calibration for the 400 Series cameras was also recently discussed.
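For reference, here is a minimal numpy sketch of such an affine (rigid) transform applied to a whole cloud. The function and variable names are illustrative, not from librealsense; rs2_transform_point_to_point performs the same rotation-plus-translation on a single point.

```python
import numpy as np

def transform_cloud(cloud_xyz, R, t):
    """Apply a rigid transform to an Nx3 point cloud.

    cloud_xyz : (N, 3) points in the camera frame
    R         : (3, 3) rotation matrix into the target frame
    t         : (3,)   translation of the camera origin in the target frame
    """
    return cloud_xyz @ R.T + t

# Example: rotate 90 degrees about Z and shift 0.5 m along X.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([0.5, 0.0, 0.0])
cloud_in_target = transform_cloud(np.random.rand(100, 3), R, t)
```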
Hello MartyG, thanks for your reply. I am going to look at the links you shared. Many thanks,
I carefully considered the information that you kindly provided. This is not one of my specialist areas of knowledge, but I wonder if what you are describing is hand-eye calibration, which can be used to get the transformation between the end-effector of a robot arm and the camera. If that is the case, the links below have information about hand-eye calibration in regard to RealSense: https://support.intelrealsense.com/hc/en-us/community/posts/360051325334/comments/360013640454
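For reference, OpenCV ships a hand-eye solver, cv2.calibrateHandEye, which computes exactly this fixed end-effector-to-camera transform from paired robot and target poses. A minimal sketch, assuming the per-pose rotations and translations have already been collected (the wrapper function name and the choice of the Tsai method are illustrative):

```python
import cv2

def camera_to_flange(R_gripper2base, t_gripper2base,
                     R_target2cam, t_target2cam):
    """Solve for the fixed camera-to-flange (hand-eye) transform.

    Each argument is a list with one entry per robot pose:
      R_gripper2base, t_gripper2base -- flange pose in the robot base frame,
                                        from the controller's positional feedback
      R_target2cam, t_target2cam     -- calibration-target pose in the camera
                                        frame, e.g. from cv2.solvePnP
    """
    R_cam2gripper, t_cam2gripper = cv2.calibrateHandEye(
        R_gripper2base, t_gripper2base,
        R_target2cam, t_target2cam,
        method=cv2.CALIB_HAND_EYE_TSAI)
    return R_cam2gripper, t_cam2gripper
```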
Hi @CarMineo Do you require further assistance with this case, please? Thanks!
Hello Marty, |
Okay, thanks very much for the update @CarMineo
Hi @CarMineo Do you require further assistance with this case, please? Thanks!
Hello @MartyG-RealSense, thanks for checking how this is going. I have found quite a straightforward procedure to calibrate the centre of the RGB sensor of the D435i, based on the MATLAB Camera Calibrator app (https://it.mathworks.com/help/vision/ref/cameracalibrator-app.html). I manipulate the D435i with a robotic arm to take 10 pictures of a chessboard pattern (https://github.com/opencv/opencv/blob/master/doc/pattern.png) from different positions, recording the colour frame and the robot positional feedback (the position of the robot flange on which the D435i is mounted) at every pose. The MATLAB Camera Calibrator computes the camera intrinsics, extrinsics, and lens distortion parameters; thus it also provides the pose of the camera at the acquisition of each colour frame. I can then compute the transform between the poses of the RGB sensor and of the robot flange, which gives the robot tool parameters I need. Best regards,
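For anyone without MATLAB, roughly the same intrinsic/extrinsic estimation can be sketched with OpenCV. This assumes a chessboard with 9x6 inner corners and 25 mm squares, and illustrative file names; adjust both to the actual setup:

```python
import glob
import cv2
import numpy as np

pattern = (9, 6)    # inner corners of the chessboard (assumption)
square = 0.025      # square size in metres (assumption)

# 3D corner coordinates in the board's own frame, identical for every image.
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_pts, img_pts = [], []
for path in glob.glob("pose_*.png"):  # one colour frame per robot pose
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001))
        obj_pts.append(objp)
        img_pts.append(corners)

# K and dist are the intrinsics and distortion coefficients; rvecs/tvecs give
# the board pose in the camera frame for each image, i.e. the per-pose
# extrinsics the MATLAB app reports.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
```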
Hi @CarMineo Great news that you were successful. Thanks so much for sharing the details of your solution for the benefit of RealSense community members. :) The origin point of the 400 Series camera is always the center of the left IR sensor. The link below explains the coordinate system relative to this origin point.
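For completeness, the SDK can report the extrinsics between that left-IR origin (the depth stream) and the colour sensor directly. A minimal pyrealsense2 sketch:

```python
import pyrealsense2 as rs

pipeline = rs.pipeline()
profile = pipeline.start()

depth_stream = profile.get_stream(rs.stream.depth)
color_stream = profile.get_stream(rs.stream.color)

# The depth stream's origin is the left IR imager. This extrinsics object
# maps points from the depth frame into the colour sensor's frame
# (a 3x3 rotation plus a translation in metres).
extrin = depth_stream.get_extrinsics_to(color_stream)
print(extrin.rotation, extrin.translation)

pipeline.stop()
```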
Hi @CarMineo Do you require further assistance with this case, please? Thanks!
Case closed due to no further comments received. |
Issue Description
Hello all,
I apologise in advance if my problem is already covered in another issue. I have not been able to find a straightforward solution so far.
I need to manipulate a D435i camera with a robot to reconstruct the geometry of an object, by bringing the device to different locations around the object. I have been able to develop the software required to acquire a point cloud from the D435i at every pose. However, I need an accurate measurement of the position of the device reference system (origin coordinates and Euler angles) in the robot's absolute reference system, in order to translate and rotate each acquired point cloud. In other words, I need to find the fixed offset and rotation between the robot flange and the device reference system. I assume this should be done using a predefined object (e.g. a chessboard pattern in a fixed position) and a calibration routine to find the camera extrinsic parameters. Can anyone point me towards a well-documented algorithm applicable to the D435i?
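To make the goal concrete, here is a minimal numpy sketch of the transform chain involved, assuming the fixed flange-to-camera transform has already been found by some calibration routine (all names are illustrative):

```python
import numpy as np

def pose_to_matrix(R, t):
    """Build a 4x4 homogeneous transform from a rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def cloud_to_base(cloud_cam, T_base_flange, T_flange_cam):
    """Map an Nx3 cloud from the camera frame into the robot base frame.

    T_base_flange : flange pose in the robot base frame (robot feedback)
    T_flange_cam  : fixed flange-to-camera transform (the calibration result)
    """
    T_base_cam = T_base_flange @ T_flange_cam
    homog = np.hstack([cloud_cam, np.ones((cloud_cam.shape[0], 1))])
    return (homog @ T_base_cam.T)[:, :3]
```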
Thanks,
Carmelo