Attempt with a D435i camera #102
If you have stereo cameras but want to track them independently, you can set … The fly-away at the beginning can be caused by many things, but if the threshold properly detects the pickup, then it is likely incorrect extrinsics, intrinsics, or IMU noise values. If you calibrated yourself (we recommend Kalibr), then make sure you are using the T_CAMERAtoIMU transform, which should be in the results.txt file. I additionally see that your …
@goldbattle thank you very much for your message. Indeed I redid the calibration and got the following file:
The IMU calibration was performed with the tool you open sourced, kalibr_allan. The values were computed from 2 hours of IMU data with the camera at rest. However, I am wondering whether the reprojection error is too big? Here is a screenshot of the Kalibr report:
Finally, I can observe that at the beginning OpenVINS is able to compute the pose of the camera, but then it suddenly drifts away, as you can see in the screenshot below:
I am working through the math in order to better understand why this behavior occurs, but your feedback is welcome :) Thanks again for sharing such tremendous work, the code but also all this wonderful documentation!
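For anyone repeating this step: the IMU noise parameters come from an Allan-variance analysis of a long stationary IMU log. As a rough illustration of the quantity kalibr_allan computes (a minimal overlapping Allan deviation in plain NumPy; the function and variable names here are made up for illustration, not the tool's API):

```python
import numpy as np

def allan_deviation(omega, fs, taus):
    """Overlapping Allan deviation of a single IMU channel.

    omega : 1-D array of samples (e.g. gyro in rad/s)
    fs    : sample rate in Hz
    taus  : iterable of averaging times in seconds
    """
    theta = np.cumsum(omega) / fs            # integrate the signal
    N = len(theta)
    out = []
    for tau in taus:
        m = int(round(tau * fs))             # samples per cluster
        if m < 1 or 2 * m >= N:
            out.append(np.nan)
            continue
        # second difference of the integrated signal over cluster length m
        d = theta[2*m:] - 2.0 * theta[m:N-m] + theta[:N-2*m]
        avar = np.sum(d * d) / (2.0 * tau**2 * (N - 2*m))
        out.append(np.sqrt(avar))
    return np.array(out)
```

On a log-log plot of deviation versus tau, the white-noise region has slope -1/2 and the bias-instability region is the flat minimum; those are the values that feed the IMU noise parameters.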
The distortion parameters found are still extremely large. I would ensure you are covering the entire camera field of view. Can you try calibrating a single camera instead of two cameras at the same time? You can also try manually setting these distortion parameters to zero and then running OpenVINS to try to calibrate them. I wouldn't expect the IMU-to-camera Kalibr calibration to work well if the intrinsics are not correct. I see there have been a few issues trying to get this sensor calibrated (including your own struggles):
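As a sketch of the "set the distortion to zero and let OpenVINS refine it" suggestion, using parameter names as they appear in the OpenVINS ROS1 launch files (double-check them against your version):

```xml
<!-- sketch only: start the distortion coefficients at zero -->
<rosparam param="cam0_d">[0.0, 0.0, 0.0, 0.0]</rosparam>
<rosparam param="cam1_d">[0.0, 0.0, 0.0, 0.0]</rosparam>
<!-- let the filter refine intrinsics (and extrinsics) online -->
<param name="calib_cam_intrinsics" type="bool" value="true" />
<param name="calib_cam_extrinsics" type="bool" value="true" />
```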
You can also try looking at the …
@goldbattle thank you very much for your comments. Actually I recalibrated, but with the pinhole-radtan camera model - I noticed that it computed very low distortion parameters. And I am starting to get something not bad, in the sense that there is no drift anymore, as shown in the picture below: However, what I observe is the following:
May I ask a "naive" question? Why is the IMU initialization necessary, and how do I know which threshold is most appropriate? For reference, since it may be useful for other people with a D435i, I have included my current launch file:
Thanks again
To initialize the system we assume that we are standing still, thus we need to be able to detect this case. To tune this parameter, record a bag, set the threshold to be extremely large, and then visually inspect the images / threshold values to see what a good threshold is. You shouldn't need to change it after determining it, as it really just depends on how noisy your IMU is.

My recommendation for getting the calibration is to record a bag with relatively aggressive motion and play it back using the serial node. Excite all axes with good orientation rotations (but try not to cause too much motion blur). You can then increase the feature count to be extremely large (maybe 500-800 tracks, and 100 or so SLAM features) so that it runs non-realtime. Then, after processing the whole bag, use the final estimated values as your initial guess for the rest of your runs. Also, it looks like you are using the rectified image streams, so one would expect the distortion parameters to be very close to zero.

To handle the case where you come back to stationary, you will need to leverage zero-velocity updates or SLAM features. I would first recommend enabling 25-50 SLAM features, which will be continuously tracked over many frames. To handle the stationary case, you will want to enable the zero-velocity update, which basically detects the stationary case, stops visual tracking, and updates the filter assuming this motion model. See the kaist launch file for an example of that.

Additionally, you might want to try inflating your IMU noise parameters and reducing your image pixel noise. A pixel noise above 2 typically shouldn't be necessary, and suggests your camera isn't providing good information.
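Such a stationarity check boils down to something like the following (a hand-rolled toy sketch, not the actual OpenVINS detector; the threshold values are placeholders to be tuned from a recorded bag):

```python
import numpy as np

GRAVITY = 9.81  # m/s^2

def is_stationary(accel, gyro, accel_thresh=0.5, gyro_thresh=0.1):
    """Crude stationarity check over a short window of IMU samples.

    accel, gyro : (N, 3) arrays, accelerometer in m/s^2, gyro in rad/s.
    The thresholds are placeholders: record a bag, set them very large,
    and inspect the printed magnitudes while the sensor is known to be
    still vs. picked up, to find sensible values for your IMU.
    """
    # deviation of the specific-force magnitude from gravity
    accel_dev = np.abs(np.linalg.norm(accel, axis=1) - GRAVITY).max()
    # peak angular rate over the window
    gyro_mag = np.linalg.norm(gyro, axis=1).max()
    return bool(accel_dev < accel_thresh and gyro_mag < gyro_thresh)
```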
@goldbattle regarding the calibration, I guess that you are publishing the updated parameters (of the left camera in the IMU frame) on:
- /ov_msckf/keyframe_intrinsics
- /ov_msckf/keyframe_extrinsic
Am I right? If so, I can take the same values for the right camera.
Yes, you are right, I am using the rectified image streams, and yes, the distortion parameters should be almost equal to 0. It means that only the transformations C0toI and C1toI have to be computed. Thank you again for your support
That is only the left / first camera. Just look at what gets printed to the console at the end of a dataset (or after you ctrl+c the node).
If you could upload a bag and launch file to Google Drive, I can try to take a peek as well. I think part of the issue is that you are using rectified image feeds (from what I can see). I also think it might be the IMU, but I have no experience with this sensor so I don't have much of a feel for it.
Here is a link to download a rosbag from my google drive: and here is the launch file I used:
Discussing with Intel support for RealSense products, they told me that the Infra1/Infra2 Y8 streams are rectified but the Y16 streams are unrectified, so I will investigate whether I can switch to this mode (640x400 at 25 Hz) and then do a recalibration pass.
I was unable to get it to work really well. I recommend setting the following in your launch:
@goldbattle thank you for the parameters - is there a place where all the parameters are described? When I used ov_serial with a bag to get the computed calibration parameters from open_vins, I noticed the following:
I would like to ask you for some clarification about the frames. At initialization, there is an IMU vector expressed in the camera frame following the RDF convention (X right, Y down, Z front). The orientation value of the IMU vector is

Because I would like to couple open_vins with our path planner, I have to define a "base_link" frame respecting the FLU convention (X forward, Y left, Z up) - a zero yaw angle means the MAV is pointing along the X axis. Maybe I am a bit confused, but the image below should clarify my issue:

Actually I tried many different things, like multiplying by q_rot = (0.0, 0.0, 0.707, 0.707), which is a 90-degree rotation around the Z axis; it works, but the camera frame changes, and thus the perception module projects the point cloud in the wrong direction.

Also, I did not find a place describing the conversion between JPL and Hamilton quaternions - I probably need to look at ov::core, which transforms JPL quaternions into rotation matrices, but I was wondering whether you have an easier way in mind? Thank you again for your help
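For the frame-convention part, the fixed rotation between camera-style RDF axes (X right, Y down, Z forward) and an FLU base_link (X forward, Y left, Z up) can be written down directly. A small NumPy sketch (the name R_FLU_RDF is made up for illustration):

```python
import numpy as np

# Fixed rotation taking camera RDF axes into base_link FLU axes:
#   x_flu = z_rdf (forward), y_flu = -x_rdf (left), z_flu = -y_rdf (up)
R_FLU_RDF = np.array([
    [ 0,  0, 1],
    [-1,  0, 0],
    [ 0, -1, 0],
])

def rdf_to_flu(v):
    """Re-express a vector given in the RDF camera frame in the FLU frame."""
    return R_FLU_RDF @ np.asarray(v)
```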
There is no guarantee that the calibration will converge to a better guess. As for the conversion from Hamilton to JPL, it basically just flips the direction of the rotation. As for your base link, it sounds like you just need to publish a static TF in relation to the IMU?
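To make the "flips the direction of the rotation" statement concrete: the same four numbers interpreted as a JPL quaternion describe the inverse (transpose) of the rotation they describe when interpreted as a Hamilton quaternion, so conjugating (negating the vector part) moves between the conventions. A minimal NumPy sketch (helper names made up for illustration, not ov_core's API):

```python
import numpy as np

def quat_conjugate(q):
    """q = [x, y, z, w]; negate the vector part."""
    x, y, z, w = q
    return np.array([-x, -y, -z, w])

def rot_hamilton(q):
    """Rotation matrix from a unit Hamilton quaternion [x, y, z, w]."""
    x, y, z, w = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def rot_jpl(q):
    """JPL convention: the same numbers describe the inverse rotation."""
    return rot_hamilton(quat_conjugate(q))
```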
Feel free to re-open when you re-visit this or have further issues.
@goldbattle thank you very much for your support. We are doing some tests and it is quite clear that we need an additional more accurate IMU than the one embedded in the D435i. Do you have any recommandation on your side? |
If you want an Intel RealSense, then we have used the T265.
For an IMU, we normally use an Xsens IMU or a MicroStrain.
First, thank you for sharing the code - this is an amazing project!
I am trying to test it with a D435i, and I would like to share some results with you and get your feedback to understand what is going on.
I created a launch file, based on the one provided in the "getting started page" and at the beginning I got these messages:
Then the Init phase:
Then the filter started and I got a "crazy path", with the following messages:
and here is a screenshot (the camera just moved a few centimeters):
When it prints "xxx seconds for MSCKF update (0 features)", does it mean that no new features are tracked, and is this why the trajectory looks so weird? When I look at the images in RVIZ it looks like there are many features detected, so can you tell me why this is happening?

Regarding the launch file, among the parameters available in parse_ros.h, I would like to know more about use_klt and use_stereo: in the first case you are using the Kanade-Lucas-Tomasi (KLT) feature tracker, and if use_stereo is set, you are doing stereo correspondence?

Thank you very much for your help, and again thanks for this great work!
The launch file is the following: