
Merge pull request #66 from rpng/pose_graph
Development v2.1 - Secondary Pose Graph Support
rpng-guest authored Jul 7, 2020
2 parents 2295e7f + 28f0a52 commit 1c8ff20
Showing 8 changed files with 297 additions and 22 deletions.
22 changes: 21 additions & 1 deletion ReadMe.md
@@ -19,6 +19,7 @@ Please take a look at the feature list below for full details on what the system

## News / Events

* **May 18, 2020** - Released the secondary pose graph example repository [ov_secondary](https://github.com/rpng/ov_secondary), based on [VINS-Fusion](https://github.com/HKUST-Aerial-Robotics/VINS-Fusion). OpenVINS now publishes marginalized feature tracks, feature 3D positions, and the first camera's intrinsics and extrinsics. See [PR#66](https://github.com/rpng/open_vins/pull/66) for details and discussion.
* **April 3, 2020** - Released the [v2.0](https://github.com/rpng/open_vins/releases/tag/v2.0) update to the codebase with some key refactoring, ROS-free building, improved dataset support, and a single inverse depth feature representation. Please check out the [release page](https://github.com/rpng/open_vins/releases/tag/v2.0) for details.
* **January 21, 2020** - Our paper has been accepted for presentation in [ICRA 2020](https://www.icra2020.org/). We look forward to seeing everybody there! We have also added links to a few videos of the system running on different datasets.
* **October 23, 2019** - OpenVINS placed first in the [IROS 2019 FPV Drone Racing VIO Competition
@@ -63,9 +64,27 @@ Please take a look at the feature list below for full details on what the system
* Out-of-the-box evaluation on EurocMav and TUM-VI datasets
* Extensive evaluation suite (ATE, RPE, NEES, RMSE, etc.)

## Demo Videos


## Codebase Extensions

* **[ov_secondary](https://github.com/rpng/ov_secondary)** -
This is an example secondary thread which provides loop closure in a loosely coupled manner for [OpenVINS](https://github.com/rpng/open_vins).
It is a modification of the code originally developed by the HKUST Aerial Robotics Group, which can be found in their [VINS-Fusion](https://github.com/HKUST-Aerial-Robotics/VINS-Fusion) repository.
We stress that this is a loosely coupled method, so no information is returned to the estimator to improve the underlying OpenVINS odometry.
The codebase has been modified in a few key areas: exposing more loop closure parameters, subscribing to camera intrinsics, simplifying configuration so that only topics need to be supplied, and tweaking the loop closure detection to improve its frequency.
A minimal sketch of a node subscribing to the topics OpenVINS publishes for this purpose is shown after this list.

* **[ov_maplab](https://github.com/rpng/ov_maplab)** -
This codebase contains the interface wrapper for exporting visual-inertial runs from [OpenVINS](https://github.com/rpng/open_vins) into the ViMap structure used by [maplab](https://github.com/ethz-asl/maplab).
The state estimates and raw images are appended to the ViMap as OpenVINS runs through a dataset.
After the dataset completes, features are re-extracted and triangulated with maplab's feature system.
This can be used to merge multi-session maps, or to perform a batch optimization after first running the data through OpenVINS.
Some examples have been provided, along with a helper script to export trajectories into the standard groundtruth format (see the export sketch after this list).
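
For illustration, here is a minimal, hypothetical sketch of a ROS node that subscribes to the keyframe topics OpenVINS now publishes for secondary consumers such as ov_secondary. The topic names and message types are taken from the `RosVisualizer.cpp` changes in this pull request; the node name `keyframe_listener` and the callback bodies are placeholders, not code from ov_secondary.

```cpp
#include <ros/ros.h>
#include <nav_msgs/Odometry.h>
#include <sensor_msgs/PointCloud.h>
#include <sensor_msgs/CameraInfo.h>

// Marginalized keyframe pose in the global frame
void pose_cb(const nav_msgs::Odometry::ConstPtr &msg) {
    ROS_INFO("keyframe pose at t = %.3f", msg->header.stamp.toSec());
}

// Feature tracks seen at that keyframe (3d positions + uv channels)
void feats_cb(const sensor_msgs::PointCloud::ConstPtr &msg) {
    ROS_INFO("keyframe has %zu features", msg->points.size());
}

// IMU-to-camera0 extrinsic transform and camera0 intrinsics
void extrinsic_cb(const nav_msgs::Odometry::ConstPtr &msg) {}
void intrinsics_cb(const sensor_msgs::CameraInfo::ConstPtr &msg) {}

int main(int argc, char **argv) {
    ros::init(argc, argv, "keyframe_listener");
    ros::NodeHandle nh;
    ros::Subscriber sub_pose = nh.subscribe("/ov_msckf/keyframe_pose", 1000, pose_cb);
    ros::Subscriber sub_feat = nh.subscribe("/ov_msckf/keyframe_feats", 1000, feats_cb);
    ros::Subscriber sub_extr = nh.subscribe("/ov_msckf/keyframe_extrinsic", 1000, extrinsic_cb);
    ros::Subscriber sub_intr = nh.subscribe("/ov_msckf/keyframe_intrinsics", 1000, intrinsics_cb);
    ros::spin();
    return 0;
}
```

As a rough illustration of the trajectory-export step mentioned for ov_maplab, the sketch below writes poses out as plain text, assuming a TUM-style convention of `timestamp tx ty tz qx qy qz qw` per line. The exact columns of the "standard groundtruth format" are an assumption here; the helper script that ships with ov_maplab is the authoritative reference.

```cpp
#include <cstdio>
#include <string>
#include <vector>
#include <Eigen/Dense>

// One stamped pose: position in the global frame plus an (x, y, z, w) quaternion
struct StampedPose {
    double timestamp;
    Eigen::Vector3d p;
    Eigen::Vector4d q;
};

// Write the trajectory as "timestamp tx ty tz qx qy qz qw", one pose per line
bool write_trajectory(const std::string &path, const std::vector<StampedPose> &traj) {
    FILE *fp = fopen(path.c_str(), "w");
    if (fp == nullptr) return false;
    fprintf(fp, "# timestamp(s) tx ty tz qx qy qz qw\n");
    for (const auto &s : traj) {
        fprintf(fp, "%.9f %.6f %.6f %.6f %.6f %.6f %.6f %.6f\n",
                s.timestamp, s.p(0), s.p(1), s.p(2), s.q(0), s.q(1), s.q(2), s.q(3));
    }
    fclose(fp);
    return true;
}
```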



## Demo Videos

[![](docs/youtube/KCX51GvYGss.jpg)](http://www.youtube.com/watch?v=KCX51GvYGss "OpenVINS - EuRoC MAV Vicon Rooms Flyby")
[![](docs/youtube/Lc7VQHngSuQ.jpg)](http://www.youtube.com/watch?v=Lc7VQHngSuQ "OpenVINS - TUM VI Datasets Flyby")
[![](docs/youtube/vaia7iPaRW8.jpg)](http://www.youtube.com/watch?v=vaia7iPaRW8 "OpenVINS - UZH-FPV Drone Racing Dataset Flyby")
@@ -76,6 +95,7 @@ Please take a look at the feature list below for full details on what the system
[![](docs/youtube/ExPIGwORm4E.jpg)](http://www.youtube.com/watch?v=ExPIGwORm4E "OpenVINS - UZH-FPV Drone Racing Dataset Demonstration")



## Credit / Licensing

This code was written by the [Robot Perception and Navigation Group (RPNG)](https://sites.udel.edu/robot/) at the University of Delaware.
4 changes: 2 additions & 2 deletions ov_msckf/launch/pgeneva_ros_tum.launch
@@ -4,11 +4,11 @@
<arg name="max_cameras" default="2" />
<arg name="use_stereo" default="true" />
<arg name="bag_start" default="0" />
<arg name="bag" default="/home/patrick/datasets/tum/dataset-room2_512_16.bag" />
<arg name="bag" default="/home/patrick/datasets/tum/dataset-room1_512_16.bag" />

<!-- imu starting thresholds -->
<arg name="init_window_time" default="1.0" />
<arg name="init_imu_thresh" default="0.75" />
<arg name="init_imu_thresh" default="0.70" /> <!-- room1:0.70, room2:0.75, room3:0.70 -->

<!-- saving trajectory path and timing information -->
<arg name="dosave" default="false" />
2 changes: 1 addition & 1 deletion ov_msckf/launch/pgeneva_serial_tum.launch
@@ -32,7 +32,7 @@
<param name="max_cameras" type="int" value="2" />
<param name="dt_slam_delay" type="double" value="3" />
<param name="init_window_time" type="double" value="1.0" />
<param name="init_imu_thresh" type="double" value="0.8" />
<param name="init_imu_thresh" type="double" value="0.70" /> <!-- room1:0.70, room2:0.75, room3:0.70 -->
<rosparam param="gravity">[0.0,0.0,9.80766]</rosparam>
<param name="feat_rep_msckf" type="string" value="GLOBAL_3D" />
<param name="feat_rep_slam" type="string" value="ANCHORED_FULL_INVERSE_DEPTH" />
126 changes: 126 additions & 0 deletions ov_msckf/src/core/RosVisualizer.cpp
@@ -59,6 +59,12 @@ RosVisualizer::RosVisualizer(ros::NodeHandle &nh, VioManager* app, Simulator *si
pub_pathgt = nh.advertise<nav_msgs::Path>("/ov_msckf/pathgt", 2);
ROS_INFO("Publishing: %s", pub_pathgt.getTopic().c_str());

// Keyframe publishers
pub_keyframe_pose = nh.advertise<nav_msgs::Odometry>("/ov_msckf/keyframe_pose", 1000);
pub_keyframe_point = nh.advertise<sensor_msgs::PointCloud>("/ov_msckf/keyframe_feats", 1000);
pub_keyframe_extrinsic = nh.advertise<nav_msgs::Odometry>("/ov_msckf/keyframe_extrinsic", 1000);
pub_keyframe_intrinsics = nh.advertise<sensor_msgs::CameraInfo>("/ov_msckf/keyframe_intrinsics", 1000);

// option to enable publishing of global to IMU transformation
nh.param<bool>("publish_global_to_imu_tf", publish_global2imu_tf, true);
nh.param<bool>("publish_calibration_tf", publish_calibration_tf, true);
@@ -133,6 +139,9 @@ void RosVisualizer::visualize() {
// Publish gt if we have it
publish_groundtruth();

// Publish keyframe information
publish_keyframe_information();

// save total state
if(save_total_state)
sim_save_total_state_to_file();
@@ -670,6 +679,123 @@ void RosVisualizer::publish_groundtruth() {



void RosVisualizer::publish_keyframe_information() {


// Check if we have subscribers
if(pub_keyframe_pose.getNumSubscribers()==0 && pub_keyframe_point.getNumSubscribers()==0 &&
pub_keyframe_extrinsic.getNumSubscribers()==0 && pub_keyframe_intrinsics.getNumSubscribers()==0)
return;


// Skip if we don't have a marginalized frame yet
double hist_last_marginalized_time;
Eigen::Matrix<double,7,1> stateinG;
if(!_app->hist_last_marg_state(hist_last_marginalized_time, stateinG))
return;

// Default header
std_msgs::Header header;
header.stamp = ros::Time(hist_last_marginalized_time);

//======================================================
// PUBLISH IMU TO CAMERA0 EXTRINSIC
// need to flip the transform to the IMU frame
Eigen::Vector4d q_ItoC = _app->get_state()->_calib_IMUtoCAM.at(0)->quat();
Eigen::Vector3d p_CinI = -_app->get_state()->_calib_IMUtoCAM.at(0)->Rot().transpose()*_app->get_state()->_calib_IMUtoCAM.at(0)->pos();
nav_msgs::Odometry odometry_calib;
odometry_calib.header = header;
odometry_calib.header.frame_id = "imu";
odometry_calib.pose.pose.position.x = p_CinI(0);
odometry_calib.pose.pose.position.y = p_CinI(1);
odometry_calib.pose.pose.position.z = p_CinI(2);
odometry_calib.pose.pose.orientation.x = q_ItoC(0);
odometry_calib.pose.pose.orientation.y = q_ItoC(1);
odometry_calib.pose.pose.orientation.z = q_ItoC(2);
odometry_calib.pose.pose.orientation.w = q_ItoC(3);
pub_keyframe_extrinsic.publish(odometry_calib);


//======================================================
// PUBLISH CAMERA0 INTRINSICS
sensor_msgs::CameraInfo cameraparams;
cameraparams.header = header;
cameraparams.header.frame_id = "imu";
cameraparams.distortion_model = (_app->get_state()->_cam_intrinsics_model.at(0))? "equidistant" : "plumb_bob";
Eigen::VectorXd cparams = _app->get_state()->_cam_intrinsics.at(0)->value();
cameraparams.D = {cparams(4), cparams(5), cparams(6), cparams(7)};
cameraparams.K = {cparams(0), 0, cparams(2), 0, cparams(1), cparams(3), 0, 0, 1};
pub_keyframe_intrinsics.publish(cameraparams);


//======================================================
// PUBLISH HISTORICAL POSE ESTIMATE
nav_msgs::Odometry odometry_pose;
odometry_pose.header = header;
odometry_pose.header.frame_id = "global";
odometry_pose.pose.pose.position.x = stateinG(4);
odometry_pose.pose.pose.position.y = stateinG(5);
odometry_pose.pose.pose.position.z = stateinG(6);
odometry_pose.pose.pose.orientation.x = stateinG(0);
odometry_pose.pose.pose.orientation.y = stateinG(1);
odometry_pose.pose.pose.orientation.z = stateinG(2);
odometry_pose.pose.pose.orientation.w = stateinG(3);
pub_keyframe_pose.publish(odometry_pose);


//======================================================
// PUBLISH FEATURE TRACKS IN THE GLOBAL FRAME OF REFERENCE

// Get historical feature information
std::unordered_map<size_t, Eigen::Vector3d> hist_feat_posinG;
std::unordered_map<size_t, std::unordered_map<size_t, std::vector<Eigen::VectorXf>>> hist_feat_uvs;
std::unordered_map<size_t, std::unordered_map<size_t, std::vector<Eigen::VectorXf>>> hist_feat_uvs_norm;
std::unordered_map<size_t, std::unordered_map<size_t, std::vector<double>>> hist_feat_timestamps;
_app->hist_get_features(hist_feat_posinG, hist_feat_uvs, hist_feat_uvs_norm, hist_feat_timestamps);

// Construct the message
sensor_msgs::PointCloud point_cloud;
point_cloud.header = header;
point_cloud.header.frame_id = "global";
for(const auto &feattimes : hist_feat_timestamps) {

// Skip if this feature has no extraction in the "zero" camera
if(feattimes.second.find(0)==feattimes.second.end())
continue;

// Skip if this feature does not have a measurement at this time
auto iter = std::find(feattimes.second.at(0).begin(), feattimes.second.at(0).end(), hist_last_marginalized_time);
if(iter==feattimes.second.at(0).end())
continue;

// Get this feature information
size_t featid = feattimes.first;
size_t index = (size_t)std::distance(feattimes.second.at(0).begin(), iter);
Eigen::VectorXf uv = hist_feat_uvs.at(featid).at(0).at(index);
Eigen::VectorXf uv_n = hist_feat_uvs_norm.at(featid).at(0).at(index);
Eigen::Vector3d pFinG = hist_feat_posinG.at(featid);

// Push back 3d point
geometry_msgs::Point32 p;
p.x = pFinG(0);
p.y = pFinG(1);
p.z = pFinG(2);
point_cloud.points.push_back(p);

// Push back the norm, raw, and feature id
sensor_msgs::ChannelFloat32 p_2d;
p_2d.values.push_back(uv_n(0));
p_2d.values.push_back(uv_n(1));
p_2d.values.push_back(uv(0));
p_2d.values.push_back(uv(1));
p_2d.values.push_back(featid);
point_cloud.channels.push_back(p_2d);

}
pub_keyframe_point.publish(point_cloud);


}


void RosVisualizer::sim_save_total_state_to_file() {
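For anyone writing a consumer of the `/ov_msckf/keyframe_feats` message built above: `publish_keyframe_information()` pushes one `ChannelFloat32` per feature, with its values ordered as normalized uv, raw uv, then the feature id. A small hedged helper that unpacks that layout could look like the following (the function name `print_keyframe_features` is illustrative, not part of this commit):

```cpp
#include <cstdio>
#include <sensor_msgs/PointCloud.h>

// Unpack the [u_norm, v_norm, u_raw, v_raw, feature_id] channel layout
void print_keyframe_features(const sensor_msgs::PointCloud &msg) {
    for (size_t i = 0; i < msg.points.size(); i++) {
        const geometry_msgs::Point32 &p = msg.points.at(i);       // 3d feature position in the "global" frame
        const std::vector<float> &c = msg.channels.at(i).values;  // one ChannelFloat32 per feature
        float u_n = c.at(0), v_n = c.at(1);                       // normalized image coordinates
        float u = c.at(2), v = c.at(3);                           // raw pixel coordinates
        size_t featid = (size_t)c.at(4);                          // feature id (stored as a float)
        printf("feat %zu: uv=(%.1f,%.1f) uv_n=(%.3f,%.3f) p_FinG=(%.2f,%.2f,%.2f)\n",
               featid, u, v, u_n, v_n, p.x, p.y, p.z);
    }
}
```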
18 changes: 9 additions & 9 deletions ov_msckf/src/core/RosVisualizer.h
@@ -27,8 +27,10 @@
#include <sensor_msgs/Image.h>
#include <sensor_msgs/Imu.h>
#include <sensor_msgs/NavSatFix.h>
#include <sensor_msgs/PointCloud.h>
#include <sensor_msgs/PointCloud2.h>
#include <sensor_msgs/point_cloud2_iterator.h>
#include <sensor_msgs/CameraInfo.h>
#include <std_msgs/Float64.h>
#include <nav_msgs/Path.h>
#include <nav_msgs/Odometry.h>
@@ -102,6 +104,9 @@ namespace ov_msckf {
/// Publish groundtruth (if we have it)
void publish_groundtruth();

/// Publish keyframe information of the marginalized pose and tracks
void publish_keyframe_information();

/// Save current estimate state and groundtruth including calibration
void sim_save_total_state_to_file();

@@ -115,23 +120,18 @@
Simulator* _sim;

// Our publishers
ros::Publisher pub_poseimu;
ros::Publisher pub_odomimu;
ros::Publisher pub_pathimu;
ros::Publisher pub_points_msckf;
ros::Publisher pub_points_slam;
ros::Publisher pub_points_aruco;
ros::Publisher pub_points_sim;
ros::Publisher pub_poseimu, pub_odomimu, pub_pathimu;
ros::Publisher pub_points_msckf, pub_points_slam, pub_points_aruco, pub_points_sim;
ros::Publisher pub_tracks;
ros::Publisher pub_keyframe_pose, pub_keyframe_point, pub_keyframe_extrinsic, pub_keyframe_intrinsics;
tf::TransformBroadcaster *mTfBr;

// For path viz
unsigned int poses_seq_imu = 0;
vector<geometry_msgs::PoseStamped> poses_imu;

// Groundtruth information
ros::Publisher pub_pathgt;
ros::Publisher pub_posegt;
ros::Publisher pub_pathgt, pub_posegt;
double summed_rmse_ori = 0.0;
double summed_rmse_pos = 0.0;
double summed_nees_ori = 0.0;
104 changes: 95 additions & 9 deletions ov_msckf/src/core/VioManager.cpp
@@ -494,6 +494,13 @@ void VioManager::do_feature_propagate_update(double timestamp) {
//===================================================================================


// Collect all features used in this update (MSCKF + SLAM) into a single vector
std::vector<Feature*> features_used_in_update = featsup_MSCKF;
features_used_in_update.insert(features_used_in_update.end(), feats_slam_UPDATE.begin(), feats_slam_UPDATE.end());
features_used_in_update.insert(features_used_in_update.end(), feats_slam_DELAYED.begin(), feats_slam_DELAYED.end());
update_keyframe_historical_information(features_used_in_update);


// Save all the MSCKF features used in the update
good_features_MSCKF.clear();
for(Feature* feat : featsup_MSCKF) {
@@ -516,17 +523,15 @@ void VioManager::do_feature_propagate_update(double timestamp) {
// First do anchor change if we are about to lose an anchor pose
updaterSLAM->change_anchors(state);

// Marginalize the oldest clone of the state if we are at max length
if((int)state->_clones_IMU.size() > state->_options.max_clone_size) {
// Cleanup any features older than the marginalization time
trackFEATS->get_feature_database()->cleanup_measurements(state->margtimestep());
if(trackARUCO != nullptr) {
trackARUCO->get_feature_database()->cleanup_measurements(state->margtimestep());
}
// Finally marginalize that clone
StateHelper::marginalize_old_clone(state);
// Cleanup any features older than the marginalization time
trackFEATS->get_feature_database()->cleanup_measurements(state->margtimestep());
if(trackARUCO != nullptr) {
trackARUCO->get_feature_database()->cleanup_measurements(state->margtimestep());
}

// Finally marginalize the oldest clone if needed
StateHelper::marginalize_old_clone(state);

// Finally if we are optimizing our intrinsics, update our trackers
if(state->_options.do_calib_camera_intrinsics) {
// Get vectors arrays
@@ -635,10 +640,91 @@ }
}


void VioManager::update_keyframe_historical_information(const std::vector<Feature*> &features) {


// Loop through all features that have been used in the last update
// We want to record their historical measurements and estimates for later use
for(const auto &feat : features) {

// Get position of feature in the global frame of reference
Eigen::Vector3d p_FinG = feat->p_FinG;

// If it is a slam feature, then get its best guess from the state
if(state->_features_SLAM.find(feat->featid)!=state->_features_SLAM.end()) {
p_FinG = state->_features_SLAM.at(feat->featid)->get_xyz(false);
}

// Push back any new measurements if we have them
// If the feature has already been added, just append the new measurements
if(hist_feat_posinG.find(feat->featid)!=hist_feat_posinG.end()) {
hist_feat_posinG.at(feat->featid) = p_FinG;
for(const auto &cam2uv : feat->uvs) {
if(hist_feat_uvs.at(feat->featid).find(cam2uv.first)!=hist_feat_uvs.at(feat->featid).end()) {
hist_feat_uvs.at(feat->featid).at(cam2uv.first).insert(hist_feat_uvs.at(feat->featid).at(cam2uv.first).end(), cam2uv.second.begin(), cam2uv.second.end());
hist_feat_uvs_norm.at(feat->featid).at(cam2uv.first).insert(hist_feat_uvs_norm.at(feat->featid).at(cam2uv.first).end(), feat->uvs_norm.at(cam2uv.first).begin(), feat->uvs_norm.at(cam2uv.first).end());
hist_feat_timestamps.at(feat->featid).at(cam2uv.first).insert(hist_feat_timestamps.at(feat->featid).at(cam2uv.first).end(), feat->timestamps.at(cam2uv.first).begin(), feat->timestamps.at(cam2uv.first).end());
} else {
hist_feat_uvs.at(feat->featid).insert(cam2uv);
hist_feat_uvs_norm.at(feat->featid).insert({cam2uv.first,feat->uvs_norm.at(cam2uv.first)});
hist_feat_timestamps.at(feat->featid).insert({cam2uv.first,feat->timestamps.at(cam2uv.first)});
}
}
} else {
hist_feat_posinG.insert({feat->featid,p_FinG});
hist_feat_uvs.insert({feat->featid,feat->uvs});
hist_feat_uvs_norm.insert({feat->featid,feat->uvs_norm});
hist_feat_timestamps.insert({feat->featid,feat->timestamps});
}
}


// Go through our historical records and find any features that should be removed
// We have no use for features whose measurements are all older than the last marginalization time
std::vector<size_t> ids_to_remove;
for(const auto &id2feat : hist_feat_timestamps) {
bool all_older = true;
for(const auto &cam2time : id2feat.second) {
for(const auto &time : cam2time.second) {
if(time >= hist_last_marginalized_time) {
all_older = false;
break;
}
}
if(!all_older) break;
}
if(all_older) {
ids_to_remove.push_back(id2feat.first);
}
}

// Remove those features!
for(const auto &id : ids_to_remove) {
hist_feat_posinG.erase(id);
hist_feat_uvs.erase(id);
hist_feat_uvs_norm.erase(id);
hist_feat_timestamps.erase(id);
}

// Remove any historical states older than the marg time
auto it0 = hist_stateinG.begin();
while(it0 != hist_stateinG.end()) {
if(it0->first < hist_last_marginalized_time) it0 = hist_stateinG.erase(it0);
else it0++;
}

// If we have reached our max window size record the oldest clone
// This clone is expected to be marginalized from the state
if ((int) state->_clones_IMU.size() > state->_options.max_clone_size) {
hist_last_marginalized_time = state->margtimestep();
assert(hist_last_marginalized_time != INFINITY);
Eigen::Matrix<double,7,1> imustate_inG = Eigen::Matrix<double,7,1>::Zero();
imustate_inG.block(0,0,4,1) = state->_clones_IMU.at(hist_last_marginalized_time)->quat();
imustate_inG.block(4,0,3,1) = state->_clones_IMU.at(hist_last_marginalized_time)->pos();
hist_stateinG.insert({hist_last_marginalized_time, imustate_inG});
}

}



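The accessors called from `RosVisualizer::publish_keyframe_information()` (`hist_last_marg_state()` and `hist_get_features()`) are declared in a part of the diff not shown above. As a hedged sketch only, implementations consistent with those call sites and with the historical containers maintained in `update_keyframe_historical_information()` might look like the following; the `-1` "not yet set" sentinel is an assumption, not taken from this commit.

```cpp
// Sketch (not from this diff): return the last marginalized clone, if one exists yet
bool VioManager::hist_last_marg_state(double &timestamp, Eigen::Matrix<double,7,1> &stateinG) {
    if (hist_last_marginalized_time == -1) return false;  // assumed "no clone marginalized yet" sentinel
    timestamp = hist_last_marginalized_time;
    stateinG = hist_stateinG.at(hist_last_marginalized_time);
    return true;
}

// Sketch (not from this diff): copy out the historical feature records accumulated above
void VioManager::hist_get_features(std::unordered_map<size_t, Eigen::Vector3d> &feat_posinG,
                                   std::unordered_map<size_t, std::unordered_map<size_t, std::vector<Eigen::VectorXf>>> &feat_uvs,
                                   std::unordered_map<size_t, std::unordered_map<size_t, std::vector<Eigen::VectorXf>>> &feat_uvs_norm,
                                   std::unordered_map<size_t, std::unordered_map<size_t, std::vector<double>>> &feat_timestamps) {
    feat_posinG = hist_feat_posinG;
    feat_uvs = hist_feat_uvs;
    feat_uvs_norm = hist_feat_uvs_norm;
    feat_timestamps = hist_feat_timestamps;
}
```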
