
Rotation issue on camera position #478

Open
TiagoMartinss opened this issue Oct 14, 2024 · 3 comments
Labels
user-platform User has trouble running on their own platform.

Comments

@TiagoMartinss

Hello, I am a junior full-stack developer, so I don't have much expertise in the field of image processing.

I am using an Azure Kinect DK as a color camera and its IMU.
My server, connected to the camera, uses OpenVINS and sends the OpenVINS transformation matrix and the depth image to the clients to reconstruct the environment.
The steps:

First, I captured a 3-hour static IMU rosbag from my camera to generate an imu.yaml file using Allan variance analysis.
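For reference, the generated imu.yaml follows the usual Kalibr IMU noise format; the sketch below uses illustrative placeholder values, not my actual Allan results:

```yaml
# Kalibr-style IMU noise file (placeholder values, not my calibration)
accelerometer_noise_density: 2.0e-03  # [m/s^2/sqrt(Hz)] white noise
accelerometer_random_walk:   4.0e-04  # [m/s^3/sqrt(Hz)] bias random walk
gyroscope_noise_density:     1.7e-04  # [rad/s/sqrt(Hz)] white noise
gyroscope_random_walk:       2.0e-05  # [rad/s^2/sqrt(Hz)] bias random walk
update_rate: 200.0                    # [Hz] IMU sample rate
rostopic: /imu
```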

Azure Kinect IMU coordinate system:
x forward
y right
z up

Azure Kinect point-cloud coordinate system:
x right
y down
z forward
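From these two axis definitions, the fixed IMU-to-camera rotation should work out as below (my own sanity-check derivation, ignoring the small mounting misalignment that Kalibr estimates):

```cpp
// Sanity check: rotation taking a vector from the Azure Kinect IMU frame
// (x fwd, y right, z up) into the camera/point-cloud frame (x right, y down, z fwd).
#include <Eigen/Dense>

Eigen::Matrix3d R_ItoC() {
  Eigen::Matrix3d R;
  // rows = camera axes written in IMU coordinates
  R << 0,  1,  0,   // cam x (right)   =  imu y
       0,  0, -1,   // cam y (down)    = -imu z
       1,  0,  0;   // cam z (forward) =  imu x
  return R;         // v_C = R * v_I
}
```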

Then, I used Kalibr with the intrinsic and extrinsic parameters of the Azure Kinect and the imu.yaml file to generate two output YAML files.

I am using OpenVINS without ROS, with the following code:

[code screenshots: openvinscode1, openvinscode2, openvinscode3]
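Since my actual code is only in the screenshots above, here is roughly its shape, as a minimal sketch assuming the standard ov_msckf API (filenames, stamps, and image sizes are placeholders):

```cpp
// Minimal ROS-free OpenVINS driver sketch (error handling and I/O omitted)
#include <memory>
#include <opencv2/opencv.hpp>
#include "core/VioManager.h"
#include "core/VioManagerOptions.h"
#include "utils/opencv_yaml_parse.h"
#include "utils/sensor_data.h"

int main() {
  // Load the estimator config shown below
  auto parser = std::make_shared<ov_core::YamlParser>("estimator_config.yaml");
  ov_msckf::VioManagerOptions params;
  params.print_and_load(parser);
  auto vio = std::make_shared<ov_msckf::VioManager>(params);

  // Per IMU sample (timestamps in seconds, same clock as the images!)
  ov_core::ImuData imu;
  imu.timestamp = 0.0;       // placeholder stamp
  imu.wm << 0.0, 0.0, 0.0;   // gyro [rad/s]
  imu.am << 0.0, 0.0, 9.81;  // accel [m/s^2]
  vio->feed_measurement_imu(imu);

  // Per color frame (mono setup, sensor id 0)
  ov_core::CameraData cam;
  cam.timestamp = 0.0;       // placeholder stamp
  cam.sensor_ids.push_back(0);
  cam.images.push_back(cv::Mat(720, 1280, CV_8UC1, cv::Scalar(0)));
  cam.masks.push_back(cv::Mat::zeros(720, 1280, CV_8UC1));  // no mask
  vio->feed_measurement_camera(cam);

  // Read back the pose that I send to the clients
  if (vio->initialized()) {
    auto state = vio->get_state();
    Eigen::Matrix3d R_GtoI = state->_imu->Rot();  // global-to-IMU rotation
    Eigen::Vector3d p_IinG = state->_imu->pos();  // IMU position in global
  }
  return 0;
}
```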

And my configuration:

%YAML:1.0 # need to specify the file type at the top!

verbosity: "INFO" # ALL, DEBUG, INFO, WARNING, ERROR, SILENT

use_fej: true # if first-estimate Jacobians should be used (enable for good consistency)
integration: "rk4" # discrete, rk4, analytical (if rk4 or analytical used then analytical covariance propagation is used)
use_stereo: false # if we have more than 1 camera, if we should try to track stereo constraints
max_cameras: 1 # how many cameras we have 1 = mono, 2 = stereo, >2 = binocular (all mono tracking)

calib_cam_extrinsics: false # degenerate motion
calib_cam_intrinsics: false
calib_cam_timeoffset: false # degenerate motion
calib_imu_intrinsics: false
calib_imu_g_sensitivity: false

max_clones: 11
max_slam: 50
max_slam_in_update: 25
max_msckf_in_update: 50
dt_slam_delay: 1

gravity_mag: 9.81

feat_rep_msckf: "ANCHORED_MSCKF_INVERSE_DEPTH"
feat_rep_slam: "ANCHORED_MSCKF_INVERSE_DEPTH"
feat_rep_aruco: "ANCHORED_MSCKF_INVERSE_DEPTH"

# zero velocity update parameters we can use
# we support either IMU-based or disparity detection.
try_zupt: true
zupt_chi2_multipler: 0.5 # set to 0 for only disp-based
zupt_max_velocity: 0.1
zupt_noise_multiplier: 1
zupt_max_disparity: 0.4 # set to 0 for only imu-based
zupt_only_at_beginning: false

# ==================================================================
# ==================================================================

init_window_time: 2.0
init_imu_thresh: 0.5
init_max_disparity: 1.5
init_max_features: 50

init_dyn_use: true
init_dyn_mle_opt_calib: false
init_dyn_mle_max_iter: 50
init_dyn_mle_max_time: 0.05
init_dyn_mle_max_threads: 6
init_dyn_num_pose: 6
init_dyn_min_deg: 5.0 # traj is mostly straight line motion

init_dyn_inflation_ori: 10
init_dyn_inflation_vel: 100
init_dyn_inflation_bg: 10
init_dyn_inflation_ba: 100
init_dyn_min_rec_cond: 1e-20 # traj is mostly straight line motion

init_dyn_bias_g: [0.0, 0.0, 0.0]
init_dyn_bias_a: [0.0, 0.0, 0.0]


# ==================================================================
# ==================================================================

record_timing_information: false
record_timing_filepath: "/tmp/traj_timing.txt"

save_total_state: false
filepath_est: "/tmp/ov_estimate.txt"
filepath_std: "/tmp/ov_estimate_std.txt"
filepath_gt: "/tmp/ov_groundtruth.txt"

# ==================================================================
# ==================================================================

# our front-end feature tracking parameters
# we have a KLT and descriptor based (KLT is better implemented...)
use_klt: true
num_pts: 100
fast_threshold: 30
grid_x: 5
grid_y: 5
min_px_dist: 20
knn_ratio: 0.65
track_frequency: 31.0
downsample_cameras: false
num_opencv_threads: 4 # -1: auto, 0-1: serial, >1: number of threads
histogram_method: "HISTOGRAM" # NONE, HISTOGRAM, CLAHE

fi_min_dist: 0.25
fi_max_dist: 150.0
fi_max_baseline: 200
fi_max_cond_number: 20000
fi_triangulate_1d: false

# aruco tag tracker for the system
# DICT_6X6_1000 from https://chev.me/arucogen/
use_aruco: false
num_aruco: 1024
downsize_aruco: true

# ==================================================================
# ==================================================================

# camera noises and chi-squared threshold multipliers
up_msckf_sigma_px: 1.5
up_msckf_chi2_multipler: 1
up_slam_sigma_px: 1.5
up_slam_chi2_multipler: 1
up_aruco_sigma_px: 1
up_aruco_chi2_multipler: 1

# masks for our images
use_mask: false

# imu and camera spatial-temporal calibration
# imu config should also have the correct noise values
relative_config_imu: "azure-imu.yaml"
relative_config_imucam: "azure-camchain-imucam.yaml"
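For completeness, the azure-camchain-imucam.yaml from Kalibr has the standard camchain shape, sketched below with placeholder values. The T_cam_imu block (here filled with the ideal axis-swap rotation derived above) is the extrinsic that matters most for this issue:

```yaml
# Shape of a Kalibr camchain-imucam file (illustrative placeholders,
# not my actual calibration)
cam0:
  T_cam_imu:
  - [0.0, 1.0,  0.0, 0.0]
  - [0.0, 0.0, -1.0, 0.0]
  - [1.0, 0.0,  0.0, 0.0]
  - [0.0, 0.0,  0.0, 1.0]
  camera_model: pinhole
  intrinsics: [600.0, 600.0, 640.0, 360.0]  # fx, fy, cx, cy
  distortion_model: radtan
  distortion_coeffs: [0.0, 0.0, 0.0, 0.0]
  resolution: [1280, 720]
  timeshift_cam_imu: 0.0                    # cam-to-imu time offset [s]
  rostopic: /cam0/image_raw
```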

During my debugging session (focusing on the camera's translation/position for now), the translation reacts very well, and I am even able to reconstruct my environment quite accurately; you can see the green line, which is the camera's movement history.

[screenshot: openvins1 — reconstruction with the green trajectory line]

However, when I apply a rotation to my camera, OpenVINS outputs a completely incorrect translation (a 1-meter translation even though the camera is stationary and only rotating). I applied multiple rotations around the X axis (in the OpenVINS coordinate system, which is my viewer's Z axis):

[screenshot: openvins2 — trajectory during pure rotation]

In summary, my issue is that the OpenVINS transformation matrix is incorrect during rotations but not during translations. It behaves as if significant drift is being applied, making the movements completely inaccurate.

Since translation works but rotation doesn't, this clearly indicates an issue with my setup. I can't tell yet whether it's my configuration, code, or logic that creates the problem; maybe you can help me! I apologize if the error seems obvious :)
Let me know if you need more information. This is pretty urgent, as it blocks me on a project :/

@TiagoMartinss
Author

TiagoMartinss commented Oct 15, 2024

As I said in my original post, I calibrated everything (Allan variance and Kalibr) in an Ubuntu 18.04 Docker container with ROS Melodic.

@goldbattle goldbattle added the user-platform User has trouble running on their own platform. label Nov 23, 2024
@goldbattle
Member

You might want to try to use the internal quaternion-to-rotation function, as we use JPL quaternions; your reconstructed transformation matrix likely has a flipped orientation.
https://docs.openvins.com/namespaceov__core.html#adfb06397034cc6b346efb9517ed3757e
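Something like this untested sketch:

```cpp
// Untested sketch: build the world-frame pose with the JPL helper instead of a
// Hamilton-convention quaternion class.
#include "utils/quat_ops.h"

Eigen::Matrix<double, 4, 1> q_GtoI = state->_imu->quat(); // JPL, [x, y, z, w]
Eigen::Vector3d p_IinG = state->_imu->pos();

Eigen::Matrix3d R_GtoI = ov_core::quat_2_Rot(q_GtoI);
Eigen::Matrix4d T_ItoG = Eigen::Matrix4d::Identity();
T_ItoG.block<3, 3>(0, 0) = R_GtoI.transpose(); // R_ItoG = IMU orientation in global
T_ItoG.block<3, 1>(0, 3) = p_IinG;             // position is already in global
```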

@TiagoMartinss
Author

TiagoMartinss commented Dec 5, 2024

> You might want to try to use the internal quaternion-to-rotation function, as we use JPL quaternions; your reconstructed transformation matrix likely has a flipped orientation. https://docs.openvins.com/namespaceov__core.html#adfb06397034cc6b346efb9517ed3757e

As I said, I'm not an expert; for now I only debug the translation in my viewer, and applying a rotation actually makes my Vec3f (the translation) drift endlessly.
[screenshot: how I build my transformation matrix]
You can see here how I build my transformation matrix; however, on my server side I only use the translation values to draw the green line.
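If I follow the suggestion correctly, the trap is that the same four numbers produce transposed rotations under the two conventions, so feeding the state quaternion straight into a Hamilton-convention constructor (like Eigen's) flips the orientation. An illustrative comparison (not my exact viewer code):

```cpp
// Illustrative comparison of the two quaternion conventions
Eigen::Matrix<double, 4, 1> q = state->_imu->quat();  // JPL, [x, y, z, w]

// Eigen's constructor is (w, x, y, z) and uses the Hamilton convention...
Eigen::Matrix3d R_hamilton =
    Eigen::Quaterniond(q(3), q(0), q(1), q(2)).toRotationMatrix();
// ...while OpenVINS interprets the same numbers as a JPL quaternion:
Eigen::Matrix3d R_jpl = ov_core::quat_2_Rot(q);       // R_GtoI

// The two come out transposed: R_hamilton == R_jpl.transpose() (i.e. R_ItoG)
```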
