
Extrinsic/Intrinsic difference between raw calibration and reported by librealsense #10323

Closed
shphilippe opened this issue Mar 18, 2022 · 15 comments


@shphilippe


Required Info
Camera Model D410
Firmware Version 5.12.07.150
Operating System & Version Ubuntu 18.04.5
Kernel Version (Linux Only) 4.15.0-122-generic
Platform Thinkpad T480
SDK Version 2.50.0-0~realsense0.6128
Language C++
Segment Robot

Issue Description

I'm calibrating the RealSense sensor with a custom calibration mechanism that uses the raw IR stream (Y16 at full resolution, 15 fps). The calibration results are good. Our setup also includes an external high-resolution RGB camera; during this calibration I also calibrate its intrinsics and its extrinsics (left IR to RGB).

From what I understand, the ASIC reprojects both the left and right images onto a virtual plane, so the intrinsics reported by the API differ from a simple "resolution conversion" of the raw calibration. But I need to know whether the extrinsics from my RGB camera to the left imager should change too. If so, how can I correct the left-IR-to-RGB extrinsics for this change, so that I can capture a single dataset to calibrate both the RealSense and my RGB camera?
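For reference, the "resolution conversion" mentioned above is just a linear scaling of the pinhole parameters. The values below are made up for illustration, and the intrinsics librealsense reports will not match this simple scaling because rectification is applied on top:

```python
# Sketch: scaling pinhole intrinsics between two resolutions with the
# same aspect ratio. All numbers are hypothetical.

def scale_intrinsics(fx, fy, ppx, ppy, src, dst):
    """Scale intrinsics from src=(w, h) to dst=(w, h)."""
    sx = dst[0] / src[0]
    sy = dst[1] / src[1]
    return fx * sx, fy * sy, ppx * sx, ppy * sy

# Hypothetical 1920x1080 calibration scaled down to 640x360 (factor 1/3):
fx, fy, ppx, ppy = scale_intrinsics(1380.0, 1380.0, 960.0, 540.0,
                                    (1920, 1080), (640, 360))
print(fx, fy, ppx, ppy)
```

Note this only holds when the target resolution has the same aspect ratio; going to 640x480 from 1920x1080 also involves cropping.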

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Mar 19, 2022

Hi @shphilippe Typically, when this subject has been discussed in past cases, it has been recommended that synchronization with an external RGB camera be performed in C++ using the RealSense SDK's software_device() interface. Examples of this are at #4202, #8389 and #10067.

@shphilippe
Author

We already have custom synchronization using the external sync signal and the kernel timestamp of each frame. Synchronization is not the issue (we verified it using a fast-moving target). What I want to know is whether the ASIC changes the world center between the raw left-IR calibration and the left-IR depth point (i.e. is there a homography between the two, and if so, which one?).

@MartyG-RealSense
Collaborator

The left infrared sensor is always pixel-perfect aligned and calibrated with the depth map, as described in point 2 of the section of Intel's camera tuning guide linked to below.

https://dev.intelrealsense.com/docs/tuning-depth-cameras-for-best-performance#section-use-the-left-color-camera

@shphilippe
Author

Is that true even for the unrectified raw IR? That document describes the rectified stream.

@MartyG-RealSense
Collaborator

It should be, as left infrared and depth originate from the same physical imaging sensor component.

@shphilippe
Author

Well, I need to know for sure, because right now the right→left extrinsics do change.
Intel.Realsense.CustomRW outputs the following:

RotationLeftRight:    0.999997 -0.002338  0.000833
                      0.002333  0.999978  0.006250
                     -0.000848 -0.006248  0.999980
TranslationLeftRight: -55.025173 -0.182627 0.106408

while the following script:

import pyrealsense2 as rs2

pipe = rs2.pipeline()
cfg = rs2.config()

# Enable both rectified (Y8) infrared streams at 1920x1080, 15 fps
cfg.enable_stream(rs2.stream.infrared, 1, 1920, 1080, rs2.format.y8, 15)
cfg.enable_stream(rs2.stream.infrared, 2, 1920, 1080, rs2.format.y8, 15)
pipe_prof = pipe.start(cfg)

# Query the SDK-reported extrinsics between the two IR streams
stream1 = pipe_prof.get_stream(rs2.stream.infrared, 1)
stream2 = pipe_prof.get_stream(rs2.stream.infrared, 2)

print(stream1.get_extrinsics_to(stream2))

outputs something totally different (even the translation changes):

rotation: [1, 0, 0, 0, 1, 0, 0, 0, 1]
translation: [-0.0550256, 0, 0]

I know this can be due to the right imager being projected onto the left image plane. But in the RealSense Viewer's calibration data window there is a "World to Left Rotation" which is not the identity matrix, and this makes me think there is some kind of homography between the unrectified left IR and the rectified left IR.
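As a sanity check on the numbers above: if rectification is a pure rotation of each imager's frame, it preserves the distance between the optical centers, so the two translation vectors should agree in magnitude despite the different units (CustomRW reports millimetres, the SDK metres):

```python
import math

# Raw left->right translation from Intel.Realsense.CustomRW (millimetres):
t_raw_mm = (-55.025173, -0.182627, 0.106408)
# Rectified translation reported by the SDK (metres):
t_sdk_m = (-0.0550256, 0.0, 0.0)

baseline_raw_m = math.sqrt(sum(v * v for v in t_raw_mm)) / 1000.0

# Both baselines come out at ~55.026 mm:
assert abs(baseline_raw_m - abs(t_sdk_m[0])) < 1e-5
```

The magnitudes do match, which is consistent with rectification rotating the frames without moving the optical centers.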

@MartyG-RealSense
Collaborator

There is a case that references homography and rectified / unrectified images at #1526, if you have not seen it already.

Within that discussion, advice given by a RealSense team member at #1526 (comment) states: "Rectification is done by the ASIC and introduces a homography transformation from Y16 to Y8 view. It does not affect the distortion model of the frame, just reprojecting them into a shared virtual plane".
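For concreteness, a rotation-only rectification induces a homography of the form H = K_rect · R_rect · K_raw⁻¹ on undistorted pixel coordinates. A small sketch with made-up matrices (these are not the ASIC's actual parameters):

```python
import numpy as np

# Hypothetical raw and rectified pinhole matrices and a small
# rectifying rotation (all values made up for illustration).
K_raw = np.array([[1380.0,    0.0, 960.0],
                  [   0.0, 1380.0, 540.0],
                  [   0.0,    0.0,   1.0]])
K_rect = np.array([[1375.0,    0.0, 955.0],
                   [   0.0, 1375.0, 542.0],
                   [   0.0,    0.0,   1.0]])
theta = 0.004  # ~0.23 degrees about the y axis
R_rect = np.array([[ np.cos(theta), 0.0, np.sin(theta)],
                   [           0.0, 1.0,           0.0],
                   [-np.sin(theta), 0.0, np.cos(theta)]])

# Homography mapping raw (undistorted) pixels to rectified pixels:
H = K_rect @ R_rect @ np.linalg.inv(K_raw)

# Map a raw pixel into the rectified view:
p_raw = np.array([1000.0, 600.0, 1.0])
p_rect = H @ p_raw
p_rect /= p_rect[2]
print(p_rect[:2])
```

Because H is invertible, the mapping can be undone exactly, which is why it does not affect the distortion model of the frame.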

Other than that case though, I am not familiar with the subject of homography and so do not have anything that I can add regarding that particular question, unfortunately.

@shphilippe
Author

Thanks for the link to the GitHub comment. It's what I was afraid of. I therefore need more help from Intel to understand how I can adjust the extrinsics computed on the Y16 stream to compensate for this, so that I can compute one set of extrinsics usable on the depth stream. Who should I contact at Intel?

@MartyG-RealSense
Collaborator

I am the point of contact for your case. I will continue to do my best to assist with your queries.

I conducted further research in regard to extrinsic adjustment, and I wonder whether an SDK instruction called rs2_override_extrinsics() that is used for depth to RGB calibration may meet your needs.

https://intelrealsense.github.io/librealsense/doxygen/rs__sensor_8h.html#a00bb482ae06e11340711221eb3e07e67

The SDK's rs_sensor.h file provides an example of its use.

https://github.com/IntelRealSense/librealsense/blob/master/include/librealsense2/h/rs_sensor.h#L486

I will be happy to continue this support discussion when I return in 7 hours from the time of writing this.

@shphilippe
Author

I'm not sure I understand how to use this function; I'm not looking to override any calibration inside the SDK.
I'm using a D410 sensor, which does not have an RGB camera.
I'm doing my own RealSense calibration using the Y16 IR streams from the left and right cameras at 1920x1080.
While doing so, I'm also capturing images from an external RGB sensor.
I compute the following:
D410 left intrinsics from Y16@1920x1080
D410 right intrinsics from Y16@1920x1080
D410 extrinsics (left->right) from both Y16@1920x1080
RGB sensor intrinsics (unrelated to the SDK/Intel)
Extrinsics D410 left (from Y16@1920x1080) -> RGB sensor

Then I use Intel.Realsense.CustomRW to write the D410 left intrinsics, right intrinsics and extrinsics into the D410. After testing, this calibration is in our case better than the one I can get from the Intel OEM calibration target.

When I use the D410 in the product, I get a depth stream from it at 640x480. This depth stream has exactly the same position ("pixel perfect") as the left IR in Y8 (rectified IR). I want to align the external RGB camera with the depth stream. I therefore need the transformation depth stream -> external RGB sensor, but I only have D410 left (from Y16@1920x1080) -> RGB sensor.

So Intel needs to tell me (or provide a binary that computes it) how to transform the extrinsics "D410 left (from Y16@1920x1080) -> external RGB sensor" into "D410 left (from Y8@1920x1080) -> external RGB sensor".
This basically boils down to: I have a rotation matrix and a translation vector computed in the Y16 frame. How can I transform them so they are valid in the Y8/depth frame?
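If the rectified left frame is related to the raw left frame by a pure rotation R_rect about the optical center (plausibly the Viewer's "World to Left Rotation", though that is an assumption), then only the rotation part of the left→RGB extrinsics needs correcting and the translation is unchanged. A sketch of the algebra with made-up values:

```python
import numpy as np

# Hypothetical rectifying rotation such that p_rect = R_rect @ p_raw:
theta = 0.01
R_rect = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [          0.0,            0.0, 1.0]])

# Hypothetical raw-left -> external-RGB extrinsics: p_rgb = R @ p_raw + t
phi = 0.2
R = np.array([[1.0,         0.0,          0.0],
              [0.0, np.cos(phi), -np.sin(phi)],
              [0.0, np.sin(phi),  np.cos(phi)]])
t = np.array([0.1, 0.02, -0.01])

# Extrinsics valid in the rectified-left frame:
# p_rgb = R @ (R_rect.T @ p_rect) + t, so the rotation becomes R @ R_rect.T
# while the translation stays the same.
R_prime = R @ R_rect.T
t_prime = t

# Check: a point transformed either way lands at the same RGB coordinates.
p_raw = np.array([0.3, -0.2, 1.5])
p_rect = R_rect @ p_raw
assert np.allclose(R @ p_raw + t, R_prime @ p_rect + t_prime)
```

This is only a sketch under the stated assumption; confirming which rotation the ASIC actually applies is exactly the information being requested from Intel here.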

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Mar 22, 2022

Finding the extrinsics between multiple cameras is typically not a straightforward problem to solve. A couple of ways it has been approached with RealSense are to point the cameras at fiducial marker images to perform alignment and calibration between them, or to arrange the cameras around a chessboard / checkerboard image and calculate the extrinsics between the cameras.

Fiducial markers


Chessboard - demonstrated by SDK 'box_dimensioner_multicam' Python example

https://github.com/IntelRealSense/librealsense/tree/master/wrappers/python/examples/box_dimensioner_multicam

https://dev.intelrealsense.com/docs/box-measurement-and-multi-camera-calibration


When the cameras have been positioned close to each other and facing in the same direction (such as when mounted on the same bracket), another approach that has been taken to finding the extrinsics between them is demonstrated by a Python script for the SDK that was designed to calculate the extrinsics between D435 and T265 RealSense cameras.

acb0bbd#diff-d17bb35407d5baa40b3af64e8fe9fbef1c6cfb571502c275143d782914a0bd6f
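For what it's worth, the board-based approaches above all reduce to chaining poses: if each camera's pose relative to the same board is known (e.g. from OpenCV's solvePnP), the camera-to-camera extrinsics follow by composition. A minimal sketch with made-up poses:

```python
import numpy as np

def compose(T_a, T_b):
    """Compose two (R, t) transforms: apply T_b first, then T_a."""
    Ra, ta = T_a
    Rb, tb = T_b
    return Ra @ Rb, Ra @ tb + ta

def invert(T):
    """Invert an (R, t) rigid transform."""
    R, t = T
    return R.T, -R.T @ t

# Hypothetical board->camera poses for two cameras viewing the same board:
ang = 0.1
R1 = np.array([[np.cos(ang), -np.sin(ang), 0.0],
               [np.sin(ang),  np.cos(ang), 0.0],
               [        0.0,          0.0, 1.0]])
T_cam1_from_board = (R1, np.array([0.0, 0.0, 1.0]))
T_cam2_from_board = (np.eye(3), np.array([-0.06, 0.0, 1.0]))

# Extrinsics from camera 1 to camera 2: go back to the board, then to cam2.
T_cam2_from_cam1 = compose(T_cam2_from_board, invert(T_cam1_from_board))

# Check: both paths send a board point to the same cam2 coordinates.
p_board = np.array([0.2, 0.1, 0.0])
p_cam1 = R1 @ p_board + T_cam1_from_board[1]
R21, t21 = T_cam2_from_cam1
assert np.allclose(R21 @ p_cam1 + t21, p_board + np.array([-0.06, 0.0, 1.0]))
```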


Another approach may be to use Y8 instead of Y16 and perform an undistort operation to undo the distortion model. OpenCV has undistort capabilities if you wish to investigate that possibility; the SDK does not have an undistortion feature.

https://docs.opencv.org/4.5.2/dc/dbb/tutorial_py_calibration.html
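Undistortion amounts to inverting the distortion polynomial. A minimal sketch of that inversion for a radial-only (k1, k2) model, using essentially the fixed-point iteration that cv2.undistortPoints applies internally (coefficients made up for illustration):

```python
# Sketch: undoing radial (k1, k2) distortion on normalized image
# coordinates by fixed-point iteration. Coefficients are hypothetical.

def distort(x, y, k1, k2):
    r2 = x * x + y * y
    f = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * f, y * f

def undistort(xd, yd, k1, k2, iters=10):
    x, y = xd, yd
    for _ in range(iters):
        r2 = x * x + y * y
        f = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / f, yd / f
    return x, y

k1, k2 = -0.05, 0.01
xd, yd = distort(0.3, -0.2, k1, k2)
xu, yu = undistort(xd, yd, k1, k2)
assert abs(xu - 0.3) < 1e-9 and abs(yu + 0.2) < 1e-9
```

For mild distortion the iteration converges in a handful of steps, recovering the original coordinates.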

A RealSense user at #3880 tried an OpenCV undistort of a RealSense image but found that it made little difference.


Ultimately though, if your custom calibration system is already producing good results, it may not be necessary to implement further measures unless your project requires precisely documenting every detail of how it works.

@shphilippe
Author

I'm not asking how to find the extrinsics; I'm using a ChArUco board plus custom OpenCV-based code. I must use our own calibration for both the RealSense sensor and our RGB camera. To speed up factory calibration (our own, not Intel's), I don't want to perform one calibration in Y16 for the RealSense and an additional one in Y8 for the external camera extrinsics, so I calibrate everything on Y16 with the same set of images.
What I'm asking is: given extrinsics computed from the Y16 stream, how do I convert them into extrinsics I can use with the depth/Y8 stream? The D410 applies a homography between the two streams, which affects any extrinsics computed relative to them. I need to know which correction to apply to the extrinsics.

We are ready to sign an NDA if needed, or to use binary code. This discussion seems to be going nowhere: I'm asking Intel for a specific piece of information and keep getting generic answers. How can I get in contact with an engineer who will understand my question?

@MartyG-RealSense
Collaborator

MartyG-RealSense commented Mar 22, 2022

If you email me your full contact details at the email address below then I will pass them to Intel so that they can be directed onward to the appropriate person. I am not involved in the NDA registration process and so will not be able to provide updates about the progress of your application or a time estimate for processing of your application.

Please include in the email your full name, email address, telephone number and country. Also include a company name, company postal address and website address if you represent a company.

@MartyG-RealSense
Collaborator

I have received your email containing your contact details and passed the details to Intel to forward to the persons responsible for NDA registration. Thanks very much for your patience!

@shphilippe
Author

Thanks, closing.
