This project demonstrates how to associate regions in a camera image with Lidar points in 3D space.
For more details, please check my blog.
- cmake >= 2.8
  - All OSes: click here for installation instructions
- make >= 4.1 (Linux, Mac), 3.81 (Windows)
  - Linux: make is installed by default on most Linux distros
  - Mac: install Xcode command line tools to get make
  - Windows: click here for installation instructions
- Git LFS
  - Weight files are handled using LFS
- OpenCV >= 4.1
  - This must be compiled from source using the `-D OPENCV_ENABLE_NONFREE=ON` cmake flag to enable testing of the SIFT and SURF detectors.
  - The OpenCV 4.1.0 source code can be found here
- gcc/g++ >= 5.4
  - Linux: gcc / g++ is installed by default on most Linux distros
  - Mac: same deal as make - install Xcode command line tools
  - Windows: recommend using MinGW
- Clone this repo.
- Make a build directory in the top level project directory: `mkdir build && cd build`
- Compile: `cmake .. && make`
- Run it: `./project_lidar_to_camera`
## Filtering Lidar Points
The code below shows how a filter can be applied to remove Lidar points that do not satisfy a set of constraints, i.e. they are …
- … positioned behind the Lidar sensor and thus have a negative x coordinate.
- … too far away in x-direction and thus exceeding an upper distance limit.
- … too far off to the sides in y-direction and thus not relevant for collision detection.
- … too close to the road surface in negative z-direction.
- … showing a reflectivity close to zero, which might indicate low reliability.
```cpp
for (auto it = lidarPoints.begin(); it != lidarPoints.end(); ++it)
{
    float maxX = 25.0, maxY = 6.0, minZ = -1.4;
    if (it->x > maxX || it->x < 0.0 || std::abs(it->y) > maxY || it->z < minZ || it->r < 0.01)
    {
        continue; // skip points outside the region of interest or with low reflectivity
    }
    // ... remaining points are projected into the image (see below)
}
```
Convert the current Lidar point into homogeneous coordinates and store it in the 4D variable `X`.

```cpp
cv::Mat X(4, 1, cv::DataType<double>::type); // 4x1 homogeneous point, declared before the loop

X.at<double>(0, 0) = it->x;
X.at<double>(1, 0) = it->y;
X.at<double>(2, 0) = it->z;
X.at<double>(3, 0) = 1;
```
Then, apply the projection equation to map `X` onto the image plane of the camera and store the result in `Y`.

```cpp
Y = P_rect_00 * R_rect_00 * RT * X;
```
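For reference, the dimensions in this product work out as follows (assuming the KITTI setup, where `P_rect_00` is the 3×4 intrinsic projection matrix and `R_rect_00` and `RT` are the rectifying rotation and the Lidar-to-camera extrinsic transform, both expanded to 4×4):

$$
Y_{3\times 1} = P_{rect,00}\; R_{rect,00}\; RT\; X_{4\times 1}
= \begin{pmatrix} u \cdot w \\ v \cdot w \\ w \end{pmatrix}
$$

The pixel coordinates (u, v) are therefore recovered in the next step by dividing the first two components of `Y` by the third.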
Once this is done, transform `Y` back into Euclidean coordinates and store the result in the variable `pt`. Note that both pixel coordinates are obtained by dividing by the third component of `Y`.

```cpp
cv::Point pt;
pt.x = Y.at<double>(0, 0) / Y.at<double>(2, 0);
pt.y = Y.at<double>(1, 0) / Y.at<double>(2, 0);
```
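Putting the steps together, a minimal sketch of the complete projection loop could look like the following. This is only an illustration under some assumptions: the calibration matrices `P_rect_00`, `R_rect_00`, and `RT` are assumed to have been loaded beforehand (e.g. from the KITTI calibration files), the `LidarPoint` struct is a simplified stand-in for the project's own type, and `visImg` and `projectLidarToCamera` are hypothetical names used here for visualization.

```cpp
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <cmath>
#include <vector>

struct LidarPoint { double x, y, z, r; }; // simplified: x, y, z in [m], r = reflectivity

// Projects the filtered Lidar points into the camera image and draws them (sketch).
void projectLidarToCamera(const std::vector<LidarPoint> &lidarPoints,
                          const cv::Mat &P_rect_00, const cv::Mat &R_rect_00,
                          const cv::Mat &RT, cv::Mat &visImg)
{
    cv::Mat X(4, 1, cv::DataType<double>::type); // homogeneous Lidar point
    cv::Mat Y(3, 1, cv::DataType<double>::type); // homogeneous image point

    for (auto it = lidarPoints.begin(); it != lidarPoints.end(); ++it)
    {
        // 1. Filter out points that violate the distance / reflectivity constraints
        float maxX = 25.0, maxY = 6.0, minZ = -1.4;
        if (it->x > maxX || it->x < 0.0 || std::abs(it->y) > maxY || it->z < minZ || it->r < 0.01)
        {
            continue;
        }

        // 2. Convert the current Lidar point into homogeneous coordinates
        X.at<double>(0, 0) = it->x;
        X.at<double>(1, 0) = it->y;
        X.at<double>(2, 0) = it->z;
        X.at<double>(3, 0) = 1;

        // 3. Project the point onto the image plane of the camera
        Y = P_rect_00 * R_rect_00 * RT * X;

        // 4. Transform back into Euclidean (pixel) coordinates
        cv::Point pt;
        pt.x = Y.at<double>(0, 0) / Y.at<double>(2, 0);
        pt.y = Y.at<double>(1, 0) / Y.at<double>(2, 0);

        // Visualize the projected point (color chosen only for illustration)
        cv::circle(visImg, pt, 3, cv::Scalar(0, 255, 0), -1);
    }
}
```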