
Tentative with a D435i camera #102

Closed
FaboNo opened this issue Sep 21, 2020 · 16 comments
Labels
question: Theory or implementation question
user-platform: User has trouble running on their own platform

Comments

@FaboNo

FaboNo commented Sep 21, 2020

First thank you for sharing the code, this is an amazing project!

I am trying to test it with a D435i and got some results I would like to share with you, to get your feedback and understand what is going on.

I created a launch file based on the one provided on the "getting started" page, and at the beginning I got these messages:

InertialInitializer::initialize_with_imu(): no IMU excitation, below threshold 0.0328 < 2.0000
InertialInitializer::initialize_with_imu(): no IMU excitation, below threshold 0.0332 < 2.0000
InertialInitializer::initialize_with_imu(): no IMU excitation, below threshold 0.0334 < 2.0000

Then the Init phase:

InertialInitializer::initialize_with_imu(): no IMU excitation, below threshold 1.9928 < 2.0000
[INIT]: orientation = -0.6987, -0.0217, -0.0212, 0.7147
[INIT]: bias gyro = -0.0033, -0.0031, -0.0015
[INIT]: velocity = 0.0000, 0.0000, 0.0000
[INIT]: bias accel = 0.0012, -0.0193, 0.0004
[INIT]: position = 0.0000, 0.0000, 0.0000

Then the filter started, and I got a "crazy path" with the following messages:

[TIME]: 0.0274 seconds for total
q_GtoI = -0.469,-0.445,0.447,0.618 | p_IinG = 15468.675,8256.258,-9576.151 | dist = 21827.09 (meters)
bg = -0.0418,0.0576,-0.0003 | ba = -0.0634,-0.0300,-0.0254
^C[TIME]: 0.0192 seconds for tracking
[TIME]: 0.0008 seconds for propagation
[TIME]: 0.0013 seconds for MSCKF update (0 features)
[TIME]: 0.0010 seconds for marginalization (11 clones in state)
[TIME]: 0.0223 seconds for total
q_GtoI = -0.468,-0.445,0.448,0.618 | p_IinG = 15475.082,8264.223,-9582.148 | dist = 21838.94 (meters)
bg = -0.0418,0.0576,-0.0003 | ba = -0.0634,-0.0300,-0.0254
[TIME]: 0.0232 seconds for tracking
[TIME]: 0.0004 seconds for propagation
[TIME]: 0.0009 seconds for MSCKF update (0 features)
[TIME]: 0.0010 seconds for marginalization (11 clones in state)
[TIME]: 0.0256 seconds for total
q_GtoI = -0.467,-0.446,0.448,0.618 | p_IinG = 15481.772,8272.536,-9588.407 | dist = 21851.31 (meters)
bg = -0.0418,0.0576,-0.0003 | ba = -0.0634,-0.0300,-0.0254
[rviz-2] killing on exit
[run_serial_msckf-1] killing on exit
[TIME]: 0.0246 seconds for tracking
[TIME]: 0.0003 seconds for propagation
[TIME]: 0.0007 seconds for MSCKF update (0 features)
[TIME]: 0.0007 seconds for marginalization (11 clones in state)
[TIME]: 0.0264 seconds for total
q_GtoI = -0.467,-0.446,0.449,0.618 | p_IinG = 15488.356,8280.715,-9594.567 | dist = 21863.48 (meters)
bg = -0.0418,0.0576,-0.0003 | ba = -0.0634,-0.0300,-0.0254
TIME: 100.598 seconds

and here is a screenshot (the camera just moved a few centimeters):
Screenshot from 2020-09-21 14-14-04

When it says "xxx seconds for MSCKF update (0 features)", does that mean no new features are being tracked, and is that why the trajectory looks so weird? When I look at the images in RViz there appear to be many features detected, so can you tell me why this is happening?

Regarding the launch file and the parameters available in parse_ros.h, I would like to know:

  • What is the difference between use_klt and use_stereo? In the first case you use the Kanade-Lucas-Tomasi (KLT) feature tracker, and if use_stereo is set you perform stereo correspondence?
  • Are grid_x and grid_y parameters for use_klt?

Thank you very much for your help, and again thanks for this great work!

The launch file is the following:

<launch>

    <node name="run_serial_msckf" pkg="ov_msckf" type="run_subscribe_msckf" output="screen">
    
        <!-- bag topics -->
        <param name="topic_imu"      type="string" value="/camera/imu" />
        <param name="topic_camera0"  type="string" value="/camera/infra1/image_rect_raw" />
        <param name="topic_camera1"  type="string" value="/camera/infra2/image_rect_raw" />
    
        <!-- bag parameters 
        <param name="path_bag"    type="string" value="/<path>/V1_01_easy.bag"/>
        <param name="bag_start"   type="int"    value="0" />
        <param name="bag_durr"    type="int"    value="-1" />
        -->
    
        <!-- world/filter parameters -->
        <param name="max_clones"             type="int"    value="11" />
        <param name="max_slam"               type="int"    value="0" />
        <param name="max_cameras"            type="int"    value="2" />
        <param name="init_window_time"       type="double" value="0.5" />
        <param name="init_imu_thresh"        type="double" value="2.0" />
        <rosparam param="gravity">[0.0,0.0,9.81]</rosparam>
        <param name="feat_rep_msckf"         type="string" value="GLOBAL_3D" />
        <param name="feat_rep_slam"          type="string" value="GLOBAL_3D" />
    
        <!-- tracker/extractor parameters -->
        <param name="use_klt"          type="bool"   value="true" />
        <param name="fast_threshold"   type="int"    value="10" />
        <param name="grid_x"           type="int"    value="5" />
        <param name="grid_y"           type="int"    value="3" />
        <param name="min_px_dist"      type="int"    value="10" />
        <param name="num_pts"          type="int"    value="400" />
    
        <!-- sensor noise values / update -->
        <param name="up_msckf_sigma_px"            type="double"   value="1" />
        <param name="up_msckf_chi2_multipler"      type="double"   value="1" />
        <param name="gyroscope_noise_density"      type="double"   value="0.000488" />
        <param name="gyroscope_random_walk"        type="double"   value="4.88e-05" />
        <param name="accelerometer_noise_density"  type="double"   value="0.00147" />
        <param name="accelerometer_random_walk"    type="double"   value="0.00147" />
    
        <!-- camera intrinsics -->
        <param name="cam0_is_fisheye" type="bool" value="false" />
        <param name="cam1_is_fisheye" type="bool" value="false" />
        <rosparam param="cam0_k">[395.5922288679944, 395.5957501002353, 319.88738279106286, 239.07386401539998]</rosparam>
        <rosparam param="cam0_d">[0.3775752250096366, -0.08329757139021714, 0.6504570262599587, -0.4530039925701512]</rosparam>
        <rosparam param="cam1_k">[396.5827833428163, 396.75838144377565, 319.1950768721882, 238.53973946284336]</rosparam>
        <rosparam param="cam1_d">[0.31220975539351176, 0.48299676137352554, -1.4010717249687126, 2.191483852686]</rosparam>
    
        <!-- camera extrinsics -->
        <rosparam param="T_C0toI">
            [0.999848715163262, 0.0018345448477758066, 0.017296856118190283, -0.009035083963879727,
            -0.001867318311452875, 0.999996491602708, 0.0018788040335944412, 0.002129300542443833,
            -0.01729334868368733, -0.0019108185351930207, 0.9998486329759259, -0.005593943362795322,
             0.0, 0.0, 0.0, 1.0]
        </rosparam>
        <rosparam param="T_C1toI">
            [0.9997911085949877, 0.0019231918046627172, 0.020347985347160178, -0.058894395640646836,
            -0.0019818392674274344, 0.999993939322739, 0.0028624519050287085, 0.0021865657711008357,
            -0.02034235698054308, -0.002902180399802527, 0.9997888606407872, -0.005416418875488448,
             0.0, 0.0, 0.0, 1.0]
        </rosparam>
    
    </node>
    
    <node type="rviz" name="rviz" pkg="rviz" args="-d $(find ov_msckf)/launch/display.rviz" />
    
</launch>
@goldbattle goldbattle added the question Theory or implementation question label Sep 22, 2020
@goldbattle
Member

If you have stereo cameras but want to track them independently, you can set use_stereo to false. This could be the case if you have two cameras with non-overlapping fields of view, or if stereo tracks can't be found. I would also make sure you disable ArUco tag tracking by setting use_aruco to false.

The fly-away in the beginning can be caused by many things, but if the threshold properly detects the pickup, then it is likely incorrect extrinsics, intrinsics, or IMU noise values. If you calibrated yourself (we recommend Kalibr), make sure you are using the T_CAMERAtoIMU transform, which should be in the results.txt file.

I additionally see that your cam1_d has values 0.48299676137352554, -1.4010717249687126, 2.191483852686, which are orders of magnitude larger than they should be. I would double-check your calibration here.
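Since Kalibr's results.txt reports T_ci (IMU-to-camera) while the launch files in this thread take T_C0toI (camera-to-IMU), the needed inverse is a one-liner for an SE(3) transform. A minimal sketch in Python (assuming NumPy is available), using the T_ci values from the calibration report later in this thread:

```python
import numpy as np

def invert_se3(T):
    """Invert a 4x4 homogeneous transform: inv([R t; 0 1]) = [R^T, -R^T t; 0, 1]."""
    R, t = T[:3, :3], T[:3, 3]
    T_inv = np.eye(4)
    T_inv[:3, :3] = R.T
    T_inv[:3, 3] = -R.T @ t
    return T_inv

# Kalibr's T_ci (IMU to cam0) from the report below; OpenVINS wants the
# camera-to-IMU direction, i.e. the inverse (Kalibr also prints it as T_ic).
T_ci = np.array([[ 0.99997641,  0.00162013,  0.00667461, -0.00052136],
                 [-0.00167256,  0.99996774,  0.00785674, -0.00058756],
                 [-0.00666166, -0.00786771,  0.99994686, -0.02053795],
                 [ 0.,          0.,          0.,          1.        ]])
T_C0toI = invert_se3(T_ci)
```

The translation of `T_C0toI` computed this way matches the T_ic block Kalibr prints, which is a quick way to confirm which direction a given matrix goes.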

@FaboNo
Author

FaboNo commented Sep 30, 2020

@goldbattle thank you very much for your message. Indeed, I redid the calibration and got the following file:

Calibration results
===================
Reprojection error squarred (cam0):  mean 0.181828966343, median 0.128803531869, std: 0.261264120998
Reprojection error squarred (cam1):  mean 0.192048225728, median 0.133705189856, std: 0.275139997516
Gyro error squarred (imu0):          mean 1.30007728271, median 0.525949739893, std: 7.1650882246
Accelerometer error squarred (imu0): mean 4.09278230677, median 1.44344041382, std: 36.2048978964

Transformation (cam0):
-----------------------
T_ci:  (imu to cam0): [m]
[[ 0.99997641  0.00162013  0.00667461 -0.00052136]
 [-0.00167256  0.99996774  0.00785674 -0.00058756]
 [-0.00666166 -0.00786771  0.99994686 -0.02053795]
 [ 0.          0.          0.          1.        ]]

T_ic:  (cam0 to imu): [m]
[[ 0.99997641 -0.00167256 -0.00666166  0.00038355]
 [ 0.00162013  0.99996774 -0.00786771  0.0004268 ]
 [ 0.00667461  0.00785674  0.99994686  0.02054495]
 [ 0.          0.          0.          1.        ]]

timeshift cam0 to imu0: [s] (t_imu = t_cam + shift)
0.00378274603115


Transformation (cam1):
-----------------------
T_ci:  (imu to cam1): [m]
[[ 0.99997628  0.00164768  0.006687   -0.05059321]
 [-0.00170216  0.99996534  0.00815026 -0.00054962]
 [-0.00667334 -0.00816145  0.99994443 -0.02019003]
 [ 0.          0.          0.          1.        ]]

T_ic:  (cam1 to imu): [m]
[[ 0.99997628 -0.00170216 -0.00667334  0.05045634]
 [ 0.00164768  0.99996534 -0.00816145  0.00046819]
 [ 0.006687    0.00815026  0.99994443  0.0205317 ]
 [ 0.          0.          0.          1.        ]]

timeshift cam1 to imu0: [s] (t_imu = t_cam + shift)
0.00376747873273

Baselines:
----------
Baseline (cam0 to cam1): [m]
[[ 1.          0.00002764  0.00001218 -0.05007158]
 [-0.00002764  0.99999996  0.00029372  0.00004395]
 [-0.00001217 -0.00029372  0.99999996  0.00034774]
 [ 0.          0.          0.          1.        ]]
baseline norm:  0.0500728071756 [m]


Gravity vector in target coords: : [m/s^2]
[ 0.14496581 -9.80800984  0.13426805]


Calibration configuration
=========================

cam0
-----
  Camera model: pinhole
  Focal length: [381.94250504355887, 381.9381171436429]
  Principal point: [321.9842053161529, 237.0879675196309]
  Distortion model: equidistant
  Distortion coefficients: [0.27568982109914647, 0.40013398188381283, -0.4193836605327471, 0.33030055149870974]
  Type: aprilgrid
  Tags: 
    Rows: 6
    Cols: 6
    Size: 0.062 [m]
    Spacing 0.01922 [m]


cam1
-----
  Camera model: pinhole
  Focal length: [382.4444663321073, 382.4078509628303]
  Principal point: [322.16499428069085, 236.93981267329306]
  Distortion model: equidistant
  Distortion coefficients: [0.27738816086310025, 0.388197970648877, -0.3943980302832256, 0.3197838466213332]
  Type: aprilgrid
  Tags: 
    Rows: 6
    Cols: 6
    Size: 0.062 [m]
    Spacing 0.01922 [m]

IMU configuration
=================

  Update rate: 400.0
  Accelerometer:
    Noise density: 0.00111607 
    Noise density (discrete): 0.0223214 
    Random walk: 8.971e-05
  Gyroscope:
    Noise density: 0.00018108
    Noise density (discrete): 0.0036216 
    Random walk: 1.63e-06

IMU calibration was performed with the kalibr_allan tool you open-sourced. The values were computed from 2 hours of IMU data with the camera at rest.
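As a side note on the report's numbers: the "discrete" noise densities are just the continuous-time ones scaled by the square root of the update rate, which is easy to check (a quick sanity check in Python, not part of any tool in this thread):

```python
import math

rate = 400.0              # IMU update rate [Hz], from the report above
accel_noise = 0.00111607  # continuous-time accelerometer noise density
gyro_noise = 0.00018108   # continuous-time gyroscope noise density

# sigma_discrete = sigma_continuous * sqrt(rate)
accel_discrete = accel_noise * math.sqrt(rate)  # ≈ 0.0223214, as reported
gyro_discrete = gyro_noise * math.sqrt(rate)    # ≈ 0.0036216, as reported
```

OpenVINS' noise parameters expect the continuous-time values, so the discrete columns should not be copied into the launch file.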

However, I am wondering whether the reprojection error is too big. Here is a screenshot of the Kalibr report:
Screenshot from 2020-09-30 14-34-13

Finally, I can observe that at the beginning OpenVINS is able to compute the pose of the camera, but it suddenly drifts away, as you can see in the screenshot below:
Screenshot from 2020-09-30 14-20-16

I am working on the math in order to better understand why this behavior occurs. However, your feedback is welcome :)

Thanks again for sharing such tremendous work: the code, but also all this wonderful documentation!

@goldbattle
Member

The distortion parameters found are still extremely large. I would ensure you are covering the entire camera field of view. Can you try calibrating a single camera instead of two cameras at the same time? You can also try manually setting these distortion parameters to zero and then running OpenVINS to calibrate them. I wouldn't expect the IMU-to-camera Kalibr calibration to work well if the intrinsics are not correct.

I see there have been a few issues trying to get the sensor calibrated (including your own struggles):

You can also try looking at the /topic/..../camera_info topics to see if you can just grab these values. You can additionally guess the orientation transformation between the camera and IMU (move the sensor and see how gravity rotates) and just have OpenVINS calibrate everything from these rough guesses. You could also get this transformation, if the RealSense driver publishes TF, using the tf_echo utility.

@FaboNo
Author

FaboNo commented Oct 1, 2020

@goldbattle thank you very much for your comments. Actually I recalibrated with the pinhole-radtan camera model, and I noticed that it computed very low distortion parameters.

And I am starting to get decent results, in the sense that there is no drift anymore, as shown in the picture below:

Screenshot from 2020-10-01 13-43-18

However what I observe is the following:

  1. If I move the camera and then stop, the estimated pose sometimes keeps moving a bit, then stops and oscillates. Can you tell me why this happens and which parameters (if any) to tune to reduce this phenomenon?
  2. The precision is not really good: after IMU initialization I go to a given position, move around, and come back to the same position, but I think this is related to the point above.

May I ask a "naive" question? Why is the IMU initialization necessary, and how do I know which threshold is most appropriate?

For reference, since it may be useful to other people with a D435i, here is my current launch file:

<launch>

    <node name="run_serial_msckf" pkg="ov_msckf" type="run_subscribe_msckf" output="screen">
    
        <!-- bag topics -->
        <param name="topic_imu"      type="string" value="/camera/imu" />
        <param name="topic_camera0"  type="string" value="/camera/infra1/image_rect_raw" />
        <param name="topic_camera1"  type="string" value="/camera/infra2/image_rect_raw" />
    
        <!-- bag parameters 
        <param name="path_bag"    type="string" value="/<path>/V1_01_easy.bag"/>
        <param name="bag_start"   type="int"    value="0" />
        <param name="bag_durr"    type="int"    value="-1" />
        -->
    
        <!-- world/filter parameters -->
        <param name="max_clones"             type="int"    value="11" />
        <param name="max_slam"               type="int"    value="0" />
        <param name="max_cameras"            type="int"    value="2" />
        <param name="init_window_time"       type="double" value="0.5" />
        <param name="init_imu_thresh"        type="double" value="0.8" />
        <rosparam param="gravity">[0.0,0.0,9.81]</rosparam>
        <param name="feat_rep_msckf"         type="string" value="GLOBAL_3D" />
        <param name="feat_rep_slam"          type="string" value="GLOBAL_3D" />
    
        <!-- tracker/extractor parameters -->
        <param name="use_klt"          type="bool"   value="true" />
        <param name="fast_threshold"   type="int"    value="10" />
        <param name="grid_x"           type="int"    value="5" />
        <param name="grid_y"           type="int"    value="3" />
        <param name="min_px_dist"      type="int"    value="20" />
        <param name="num_pts"          type="int"    value="400" />
    
        <!-- sensor noise values / update -->
        <param name="up_msckf_sigma_px"            type="double"   value="5" />
        <param name="up_msckf_chi2_multipler"      type="double"   value="1" />
        <param name="gyroscope_noise_density"      type="double"   value="0.00018108" />
        <param name="gyroscope_random_walk"        type="double"   value="0.00000163" />
        <param name="accelerometer_noise_density"  type="double"   value=" 0.00111607" />
        <param name="accelerometer_random_walk"    type="double"   value="0.00008971" />
    
        <!-- camera intrinsics   
         cam0: 378.8116759999369, 379.21955433765766, 316.8014073043034, 236.15148238616524 
         cam1: 378.70925395397694, 379.16298201882034, 316.8988186752809, 235.6921605734767
	-->
        <param name="cam0_is_fisheye" type="bool" value="false" />
        <param name="cam1_is_fisheye" type="bool" value="false" />
        <rosparam param="cam0_k">[380.8116759999369, 379.21955433765766, 316.8014073043034, 236.15148238616524]</rosparam>
        <rosparam param="cam0_d">[0.00885028521747932, -0.0007953730420560086, 0.0007297255155325815, -0.0013976518637811187]</rosparam>
        <rosparam param="cam1_k">[380.70925395397694, 379.16298201882034, 316.8988186752809, 235.6921605734767]</rosparam>
        <rosparam param="cam1_d">[0.009899555714257014, -0.001550788276986606, 0.00029394757646159756, -0.00126722854928818]</rosparam>
    
        <!-- camera extrinsics -->
        <rosparam param="T_C0toI">
            [
             0.99996091, -0.00270352, -0.00841862, -0.00159719,
             0.00263387,  0.9999623,  -0.00827416, -0.00137637,
             0.00844067,  0.00825166,  0.99993033,  0.01048336,
             0.0, 0.0, 0.0, 1.0]

        </rosparam>
        <rosparam param="T_C1toI">
            [ 0.99995563, -0.00257844, -0.00906,     0.04826278,
              0.00249648,  0.99995596, -0.00904647, -0.00140386,
              0.00908292,  0.00902345,  0.99991804,  0.01110032,
              0.0, 0.0, 0.0, 1.0]
        </rosparam>
    
    </node>
    
    <node type="rviz" name="rviz" pkg="rviz" args="-d $(find ov_msckf)/launch/display.rviz" />
    
</launch>

Thanks again

@goldbattle
Member

To initialize the system we assume that we are standing still, thus we need to be able to detect this case. To tune this parameter, record a bag, set the threshold to be extremely large, and then visually inspect the images / threshold values to see what a good threshold is. You shouldn't need to change it after determining it, as it really just depends on how noisy your IMU is.
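The detection the initializer performs can be sketched roughly as follows: buffer accelerometer readings over the init window and compare their spread against init_imu_thresh. This is an illustrative simplification, not OpenVINS' actual code:

```python
import math
import random

def imu_excitation(accel_norms):
    """Standard deviation of accelerometer norms over the init window (simplified)."""
    mean = sum(accel_norms) / len(accel_norms)
    var = sum((a - mean) ** 2 for a in accel_norms) / len(accel_norms)
    return math.sqrt(var)

def should_initialize(accel_norms, thresh):
    # Below thresh -> sensor considered still, keep waiting; this corresponds
    # to the "no IMU excitation, below threshold" log lines earlier in the thread.
    return imu_excitation(accel_norms) >= thresh

random.seed(42)
still = [9.81 + random.gauss(0, 0.01) for _ in range(200)]   # resting on a table
moving = [9.81 + random.gauss(0, 3.0) for _ in range(200)]   # picked up / jerked
```

With a noisy IMU the "still" statistic sits higher, which is why the right threshold is sensor-dependent and best read off a recorded bag.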

My recommendation for getting the calibration is to record a bag with relatively aggressive motion and play it back using the serial node. Excite all axes with good orientation rotations (but try not to cause too much motion blur). You can then increase the feature count to be extremely large (maybe 500-800 tracks, and 100 or so SLAM features) and have it run non-realtime. Then, after processing the whole bag, use the final estimated values as your initial guess for the rest of your runs.

Also it looks like you are using the rectified image streams, so one would expect the distortion parameters to be very close to zero.

To handle the case where you come back to stationary, you will need to leverage zero-velocity updates or SLAM features. I would first recommend enabling 25-50 SLAM features, which will be continuously tracked over many frames. For the stationary case, you will want to enable the zero-velocity update, which detects that you are stationary, stops visual tracking, and updates the filter assuming this model. See the KAIST launch file for an example of that.

Additionally, you might want to inflate your IMU noise parameters and reduce your image pixel noise. A pixel noise above 2 typically shouldn't be needed; if it is, your camera isn't providing good information.

@FaboNo
Author

FaboNo commented Oct 2, 2020

@goldbattle regarding the calibration, I guess that you are publishing the updated parameters (of the left camera in the IMU frame) in:

  • /ov_msckf/keyframe_intrinsics
  • /ov_msckf/keyframe_extrinsic

Am I right? If so, I can then take the same values for the right camera.

Yes, you are right, I am using the rectified image streams, and the distortion parameters should indeed be almost zero. That means only the transformations T_C0toI and T_C1toI have to be computed.

Thank you again for your support

@goldbattle
Member

goldbattle commented Oct 2, 2020 via email

@FaboNo
Author

FaboNo commented Oct 8, 2020

@goldbattle

I followed your recommendation: I recorded a bag with relatively aggressive motion and played it back using the serial node (800 tracks and 100 SLAM features). I got intrinsics fx/fy varying between 277 and 283, for instance, depending on the bag, so I took an average over several bags.

With the settings I have (which are far from perfect) I started to test the positioning accuracy by moving around our lab and then going back to the "home/starting" pose. I wanted to share some pictures so that you can suggest which parameter(s) I can tune to improve the accuracy, if possible:

Most of the time there is a significant discrepancy between the starting pose and the end pose:
Capture du 2020-10-08 16-04-49

But sometimes the result can be good (it has happened once so far):
Capture du 2020-10-07 15-10-00

Even with the zero-velocity update activated, the discrepancy persists, but when I do not move there is no drift.
Capture du 2020-10-07 15-42-42

I am already impressed by the performance of OpenVINS. In your experience, does the problem lie in the calibration and/or in an IMU that is too noisy? May I ask what your setup for testing OpenVINS is, i.e. which IMU did you choose?

Sorry for diverging a bit, but I am puzzled by Kalibr and the distortion models: with the equidistant distortion model, the distortion parameters are in the 10^-1 range, yet the reprojection errors in the IMU calibration are under 1 pixel. But with the radtan distortion model, the distortion parameters are in the 10^-3 range, yet the reprojection errors in the IMU calibration are large, sometimes greater than 2 pixels.

Thank you again for your support

@goldbattle
Member

If you could upload a bag and launch file to google drive I can try to take a peek also.

I think part of the issue is that you are using rectified image feeds (from what I see).
This can cause feature tracking artifacts and it is really hard to calibrate an already calibrated image.

I also think it might be the IMU, but I have no experience with this sensor so I don't have much of a feel for it.
From reading online it seems to be reasonably noisy, so that might also hurt performance.

@FaboNo
Author

FaboNo commented Oct 8, 2020

@goldbattle

Here is a link to download a rosbag from my google drive:
https://drive.google.com/file/d/1kxAARz1HNdxNVr9sNryURRJkXOJsuyyC/view?usp=sharing

and here is the launch file I used:

<launch>

    <node name="run_serial_msckf" pkg="ov_msckf" type="run_serial_msckf" output="screen">
    
        <!-- bag topics -->
        <param name="topic_imu"      type="string" value="/camera/imu" />
        <param name="topic_camera0"  type="string" value="/camera/infra1/image_rect_raw" />
        <param name="topic_camera1"  type="string" value="/camera/infra2/image_rect_raw" />
    
        <!-- bag parameters --> 
        <param name="path_bag"    type="string" value="/home/fabrice/catkin_ov_ws/2020-10-07-11-54-19.bag"/>
        <param name="bag_start"   type="int"    value="0" />
        <param name="bag_durr"    type="int"    value="-1" />
        
        <!-- world/filter parameters -->
        <param name="max_clones"             type="int"    value="11" />
        <param name="max_slam"               type="int"    value="0" />
        <param name="max_cameras"            type="int"    value="2" />
        <param name="init_window_time"       type="double" value="0.5" />
        <param name="init_imu_thresh"        type="double" value="0.8" />
        <param name="use_stereo"             type="bool"   value="true" />
        <rosparam param="gravity">[0.0,0.0,9.81]</rosparam>
        <param name="feat_rep_msckf"         type="string" value="GLOBAL_3D" />
        <param name="feat_rep_slam"          type="string" value="GLOBAL_3D" />
        <param name="calib_cam_extrinsics"   type="bool"   value="true" />
        <param name="calib_cam_intrinsics"   type="bool"   value="true" />
        <param name="calib_cam_timeoffset"   type="bool"   value="true" />
        <param name="calib_camimu_dt"        type="double" value="0.0" />
        <param name="max_clones"             type="int"    value="11" />
        <param name="max_slam"               type="int"    value="100" />
        <param name="max_slam_in_update"     type="int"    value="25" /> <!-- 25 seems to work well -->
        <param name="max_msckf_in_update"    type="int"    value="999" />
        <!-- tracker/extractor parameters -->
        <param name="use_klt"          type="bool"   value="true" />
        <param name="fast_threshold"   type="int"    value="10" />
        <param name="grid_x"           type="int"    value="5" />
        <param name="grid_y"           type="int"    value="3" />
        <param name="min_px_dist"      type="int"    value="20" />
        <param name="num_pts"          type="int"    value="800" />
    
        <!-- sensor noise values / update -->
        <param name="up_msckf_sigma_px"            type="double"   value="5" />
        <param name="up_msckf_chi2_multipler"      type="double"   value="1" />
        <param name="gyroscope_noise_density"      type="double"   value="0.00018108" />
        <param name="gyroscope_random_walk"        type="double"   value="0.0000163" />
        <param name="accelerometer_noise_density"  type="double"   value=" 0.00111607" />
        <param name="accelerometer_random_walk"    type="double"   value="0.0008971" />
    
        <!-- camera intrinsics   
         cam0: 378.8116759999369, 379.21955433765766, 316.8014073043034, 236.15148238616524 
         cam1: 378.70925395397694, 379.16298201882034, 316.8988186752809, 235.6921605734767
	-->
        <param name="cam0_is_fisheye" type="bool" value="false" />
        <param name="cam1_is_fisheye" type="bool" value="false" />
        
        <rosparam param="cam0_k">[381.82470703125, 381.82470703125, 319.0743408203125, 237.6382293701172]</rosparam>
        <rosparam param="cam0_d">[0.000885028521747932, -0.00007953730420560086, 0.00007297255155325815, -0.00013976518637811187]</rosparam>
        <rosparam param="cam1_k">[381.82470703125, 381.82470703125, 319.0743408203125, 237.63822937011727]</rosparam>
        <rosparam param="cam1_d">[0.0009899555714257014, -0.0001550788276986606, 0.000029394757646159756, -0.000126722854928818]</rosparam>
    
        <!-- camera extrinsics -->
        <rosparam param="T_C0toI">
            [
             0.99996091, -0.00270352, -0.00841862, -0.00159719,
             0.00263387,  0.9999623,  -0.00827416, -0.00137637,
             0.00844067,  0.00825166,  0.99993033,  0.01048336,
             0.0, 0.0, 0.0, 1.0]

        </rosparam>
        <rosparam param="T_C1toI">
            [ 0.99995563, -0.00257844, -0.00906,     0.04826278,
              0.00249648,  0.99995596, -0.00904647, -0.00140386,
              0.00908292,  0.00902345,  0.99991804,  0.01110032,
              0.0, 0.0, 0.0, 1.0]
        </rosparam>
    
    </node>
    <node type="rviz" name="rviz" pkg="rviz" args="-d $(find ov_msckf)/launch/display.rviz" />
</launch>

Discussing with Intel support for RealSense products, they told me that the infra1/infra2 Y8 streams are rectified but Y16 is unrectified, so I will investigate whether I can switch to that mode (640x400 at 25 Hz) and then do a recalibration pass.

@goldbattle
Member

I was unable to get it to work really well.
Is the structured light enabled? If so, can you try disabling it?
Also, what method are you using to "unite" the IMU messages?
My hunch is that it isn't really related to the camera since it is able to triangulate features with relative success in the beginning.

I recommend setting the following in your launch:

<param name="use_fej"                type="bool"   value="true" />
<param name="use_imuavg"             type="bool"   value="true" />
<param name="use_rk4int"             type="bool"   value="true" />
<param name="use_aruco"              type="bool"   value="false" />

@FaboNo
Author

FaboNo commented Oct 21, 2020

@goldbattle thank you for the parameters - is there a place where all the parameters are described?

When I used ov_serial with a bag to get the computed calibration parameters from OpenVINS, I noticed the following:

  1. I started with the data provided by camera_info; OpenVINS then computed cam0_k = [385.601, 386.218, 322.093, 241.831] (for instance).
  2. If I put these parameters in the launch file and reran the bag, I got new parameters such as cam0_k = [394.201, 395.607, 321.09, 239.594]. It is as if there is no convergence; is this normal behavior?

I would like to ask you some clarifications about the frames.

At initialization, there is an IMU vector expressed in the camera frame following the RDF convention (X right, Y down, Z front). The orientation of the IMU is xyzw = -0.703386, 0.0155119, -0.0424232, 0.709371 (JPL convention), and it is aligned along the X axis of the camera frame.

Because I would like to couple OpenVINS with our path planner, I have to define a "base_link" frame respecting the FLU convention (X forward, Y left, Z up); a zero yaw angle means the MAV is pointing along the X axis. Maybe I am a bit confused, but the image below should clarify my issue:

Screenshot from 2020-10-21 12-07-40

Actually I tried many different things, like multiplying by q_rot = (0.0, 0.0, 0.707, 0.707), which is a 90-degree rotation around the Z axis; it works, but the camera frame changes, and the perception module then projects the point cloud in the wrong direction. I also did not find a place describing the conversion between JPL and Hamilton quaternions. I probably need to look at how ov::core transforms JPL quaternions into rotation matrices, but I was wondering whether you have an easier way in mind?

Thank you again for help

@goldbattle
Member

There is no guarantee that the calibration will converge to a better guess.
In terms of the global frame, this is arbitrary in terms of the xyz and the yaw.
For example here we pick the xyz to be zero, and the yaw to be zero at initialization by convention.
This means you don't know which direction the MAV is facing on startup I believe (and there is no way to recover it).
Everything published to ROS should be standard notation R_ItoG and p_IinG.

As for the conversion from Hamilton to JPL it basically just flips the direction of the rotation.
In our case, if we take the xyzw values of our R_GtoI quaternion and use them to compute a Hamilton rotation, it will be R_ItoG.
Unless you are changing code inside OpenVINS, you should treat them as Hamilton (i.e. the ROS convention to my understanding).
https://github.com/rpng/open_vins/blob/master/ov_msckf/src/core/RosVisualizer.cpp#L340-L374

As for your base link, it sounds like you just need to publish a static TF in relation to the IMU?
If so the global frame of reference shouldn't play into this at all.
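To make the JPL/Hamilton relationship above concrete: for identical xyzw components, the JPL rotation matrix is the transpose of the Hamilton one, so a JPL quaternion representing R_GtoI, read as a Hamilton quaternion, yields R_ItoG. A small sketch (assuming NumPy; formulas per the usual Trawny-Roumeliotis and ROS conventions):

```python
import numpy as np

def rot_hamilton(x, y, z, w):
    """Rotation matrix from a Hamilton (ROS-style) xyzw quaternion."""
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ])

def rot_jpl(x, y, z, w):
    """Rotation matrix from a JPL xyzw quaternion (Trawny & Roumeliotis convention)."""
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y + z*w),     2*(x*z - y*w)],
        [2*(x*y - z*w),     1 - 2*(x*x + z*z), 2*(y*z + x*w)],
        [2*(x*z + y*w),     2*(y*z - x*w),     1 - 2*(x*x + y*y)],
    ])

# The initialization quaternion quoted earlier in the thread.
q = np.array([-0.703386, 0.0155119, -0.0424232, 0.709371])
q /= np.linalg.norm(q)
R_GtoI_jpl = rot_jpl(*q)   # JPL reading: G-to-I rotation
R_ItoG_ham = rot_hamilton(*q)  # Hamilton reading of the same numbers: I-to-G
```

So no component reshuffling is needed when handing the quaternion to ROS; only the interpretation of which direction it rotates flips.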

@goldbattle
Member

Feel free to re-open when you re-visit or have further issues.

@FaboNo
Author

FaboNo commented Nov 18, 2020

@goldbattle thank you very much for your support. We are doing some tests, and it is quite clear that we need an additional, more accurate IMU than the one embedded in the D435i. Do you have any recommendation on your side?

@goldbattle
Member

goldbattle commented Nov 18, 2020 via email
