PX4 computer vision algorithms packaged as ROS nodes for depth sensor fusion and obstacle avoidance. This repository contains two different implementations, targeting different use cases:
- A local, VFH+ based planner that plans (including some history) in a vector field histogram
- A global, graph based planner that plans in a traditional occupancy grid
The full PX4 Developer documentation is available at https://dev.px4.io and includes more computer vision examples (including VIO).
A ROS container based on Ubuntu 16.04 is available to quickly try the simulation as a demo. Running it is as simple as installing docker and docker-compose, and running $ docker-compose up from the right folder. Find the corresponding instructions here.
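A minimal sketch of those steps, assuming an Ubuntu host and that the compose file lives in the folder the instructions point to (package names may differ on other distributions):
# Install docker and docker-compose (Ubuntu package names)
sudo apt-get install docker.io docker-compose
# From the folder containing the demo's docker-compose.yml
docker-compose up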
For deployment instructions, check "Deploying with Docker".
If you want to leverage docker in your development environment, check the "Developing with Docker" section.
This is a step-by-step guide to install and build all the prerequisites for running this module on Ubuntu 16.04. You may want to skip some steps if parts of your system are already set up. A corresponding docker container is defined here for reference.
Note that in the following instructions, we assume your catkin workspace (in which we will build the avoidance module) is in ~/catkin_ws and the PX4 Firmware directory is ~/Firmware. Feel free to adapt the paths to your setup.
- Add ROS to sources.list.
echo "deb http://packages.ros.org/ros/ubuntu $(lsb_release -sc) main" > /etc/apt/sources.list.d/ros-latest.list
apt-key adv --keyserver hkp://ha.pool.sks-keyservers.net:80 --recv-key 421C365BD9FF1F717815A3895523BAEEB01FA116
apt-get update
- Install gazebo with ROS (many dependencies including gazebo7 will come with those two packages).
apt install ros-kinetic-gazebo-ros-control ros-kinetic-gazebo-ros-pkgs
# Source ROS
source /opt/ros/kinetic/setup.bash
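As an optional sanity check (not part of the original instructions), you can confirm that Gazebo 7 was pulled in and that the ROS Kinetic environment is active:
gazebo --version
rosversion -d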
- Initialize rosdep.
rosdep init
rosdep update
- Install catkin and create your catkin workspace directory.
apt install python-catkin-tools
mkdir -p ~/catkin_ws/src
- Install mavros. The package coming from the ROS repository should be fine. Just in case, instructions to install it from sources can be found here: https://dev.px4.io/en/ros/mavros_installation.html.
apt install ros-kinetic-mavros
- Install avoidance module dependencies (pointcloud library and octomap).
apt install libpcl1 ros-kinetic-octomap-*
- Clone this repository in your catkin workspace in order to build the avoidance node.
cd ~/catkin_ws/src
git clone https://github.com/PX4/avoidance.git
- Actually build the avoidance node.
catkin build -w ~/catkin_ws
If it fails, try building again; the build occasionally fails in a nondeterministic way.
Note that you can build the node in release mode this way:
catkin build -w ~/catkin_ws --cmake-args -DCMAKE_BUILD_TYPE=Release
- Clone the PX4 Firmware and all its submodules (it may take some time).
cd ~
git clone https://github.com/PX4/Firmware.git
cd ~/Firmware
git submodule update --init --recursive
- Install Firmware dependencies.
apt install libopencv-dev python-jinja2 protobuf-compiler
- We will now build the Firmware once in order to generate SDF model files for Gazebo. This step will actually run a simulation that you can quit right away.
# Add the models from the avoidance module to GAZEBO_MODEL_PATH
export GAZEBO_MODEL_PATH=${GAZEBO_MODEL_PATH}:~/catkin_ws/src/avoidance/sim/models
# This is necessary to prevent some Qt-related errors (feel free to try to omit it)
export QT_X11_NO_MITSHM=1
# Setup some more Gazebo-related environment variables
. ~/Firmware/Tools/setup_gazebo.bash ~/Firmware ~/Firmware/build/posix_sitl_default
# Build and run simulation
make posix_sitl_default gazebo
- Add the Firmware directory to ROS_PACKAGE_PATH so that ROS can start PX4.
export ROS_PACKAGE_PATH=${ROS_PACKAGE_PATH}:~/Firmware
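Since these exports only apply to the current shell, you may want to persist them so that new terminals pick them up automatically (a convenience suggestion, not required by this guide):
# Append the environment setup to ~/.bashrc
echo "source /opt/ros/kinetic/setup.bash" >> ~/.bashrc
echo "export GAZEBO_MODEL_PATH=\${GAZEBO_MODEL_PATH}:~/catkin_ws/src/avoidance/sim/models" >> ~/.bashrc
echo "export ROS_PACKAGE_PATH=\${ROS_PACKAGE_PATH}:~/Firmware" >> ~/.bashrc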
You should now be ready to run the simulation.
A full step-by-step guide is not available for this configuration. You might find the instructions for Ubuntu 16.04 and ROS Kinetic useful, though; the PX4 dev guide is another source of information.
Notable differences from ROS Kinetic are:
- Installing the point cloud library requires a different package from a separate repository:
# Install PCL
sudo add-apt-repository ppa:v-launchpad-jochen-sprickerhof-de/pcl
sudo apt-get update
sudo apt-get install libpcl-all
- Octomap has different packages for Indigo:
# Install the Octomap Library
sudo apt-get install ros-indigo-octomap ros-indigo-octomap-mapping
- Make sure that you have sourced the catkin setup.bash from your catkin workspace.
source ~/catkin_ws/devel/setup.bash
- Run the simulation.
roslaunch global_planner global_planner_sitl_mavros.launch
You should now see the drone unarmed on the ground, and the octomap should show 2 red arrows and the visible world, as pictured below.
To start flying, put the drone in OFFBOARD mode and arm it. The avoidance node will then take control of it.
# In another terminal
rosrun mavros mavsys mode -c OFFBOARD
rosrun mavros mavsafety arm
Initially the drone should just hover at 3.5m altitude.
From the command line, you can also make Gazebo follow the drone, if you want.
gz camera --camera-name=gzclient_camera --follow=iris
During the simulation, the ROS node /path_handler_node continuously publishes positions to the topic /mavros/setpoint_position/local.
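You can watch these setpoints from the command line, for example:
# Show the setpoints the planner is sending to MAVROS, and their publishing rate
rostopic echo /mavros/setpoint_position/local
rostopic hz /mavros/setpoint_position/local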
The graph of the ROS nodes is shown below:
One can plan a new path by setting a new goal with the 2D Nav Goal button in rviz. The planned path should show up in rviz and the drone should follow the path, updating it when obstacles are detected. It is also possible to set a goal without using the obstacle avoidance (i.e. the drone will go straight to this goal and potentially collide with obstacles). To do so, set the position with the 2D Pose Estimate button in rviz.
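For reference, the 2D Nav Goal tool in rviz publishes a geometry_msgs/PoseStamped on /move_base_simple/goal by default. Assuming the planner subscribes to that topic in this setup (check with rostopic info /move_base_simple/goal), a goal can also be sent from a terminal; the coordinates and frame_id below are illustrative only:
rostopic pub --once /move_base_simple/goal geometry_msgs/PoseStamped \
  '{header: {frame_id: "world"}, pose: {position: {x: 10.0, y: 5.0, z: 0.0}, orientation: {w: 1.0}}}'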
To simulate obstacle avoidance with stereo cameras, the package stereo-image-proc must be installed.
# Launch simulation
roslaunch global_planner global_planner_stereo.launch
Simulated stereo-vision is prone to errors due to artificial texture, which may make obstacles appear extremely close to the camera. For better results, choose a more natural world.
The disparity map from stereo-image-proc is published as a stereo_msgs/DisparityImage message, which is not supported by rviz or rqt. To visualize the message, either run:
rosrun image_view stereo_view stereo:=/stereo image:=image_rect_color
or publish the DisparityImage as a simple sensor_msgs/Image:
rosrun topic_tools transform /stereo/disparity /stereo/disparity_image sensor_msgs/Image 'm.image'
Now the disparity map can be visualized by rviz or rqt under the topic /stereo/disparity_image.
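For example, assuming the rqt_image_view plugin is installed, the republished image can be viewed with:
rosrun rqt_image_view rqt_image_view /stereo/disparity_image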
The local planner is based on the 3DVFH+ algorithm. To run the algorithm simulating stereo vision:
# if stereo-image-proc not yet installed
sudo apt-get install ros-kinetic-stereo-image-proc
roslaunch local_planner local_planner_stereo.launch
Another option is to simulate a kinect depth sensor:
roslaunch local_planner local_planner_depth-camera.launch
You will see the Iris drone unarmed in the Gazebo world. To start flying, put the drone in OFFBOARD mode and arm it. The avoidance node will then take control of it.
# In another terminal
rosrun mavros mavsys mode -c OFFBOARD
rosrun mavros mavsafety arm
The drone will first change its altitude to reach the goal height. It is possible to modify the goal altitude with the rqt_reconfigure GUI.
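The dynamic reconfigure GUI can be started with:
rosrun rqt_reconfigure rqt_reconfigure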
Then the drone will start moving towards the goal. The default x, y goal position can be changed in rviz by clicking on the 2D Nav Goal button and then choosing the new goal x and y position by clicking on the visualized gray space. If the goal has been set correctly, a yellow sphere will appear where you have clicked in the gray world.
It is possible to log the point cloud, the polar histogram, the waypoints, and the drone position when simulating the local planner. To enable or disable logging, set the following arguments to true or false in the local_planner_depth-camera.launch launch file:
<arg name="record_point_cloud" default="true" />
<arg name="record_histogram" default="true" />
<arg name="record_waypoints" default="true" />
<arg name="record_position" default="true" />
You can retrieve the logs from the ~/.ros/ folder on your computer.
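Assuming standard roslaunch argument syntax, the same flags can also be overridden on the command line without editing the launch file, and (if the logs are written as rosbag files) inspected afterwards:
# Disable point cloud and histogram logging for a single run (argument names as defined above)
roslaunch local_planner local_planner_depth-camera.launch record_point_cloud:=false record_histogram:=false
# Inspect any recorded bag files (assumes the logs are stored as .bag files)
rosbag info ~/.ros/*.bag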
Check that some camera topics (including /camera/depth/points) are published with the following command:
rostopic list | grep camera
If /camera/depth/points is the only one listed, it may be a sign that Gazebo is not actually publishing data from the simulated depth camera. Verify this by running:
rostopic echo /camera/depth/points
When everything runs correctly, the previous command should show a lot of unreadable data in the terminal. If you don't receive any messages, it probably means that Gazebo is not publishing the camera data.
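You can also check the publication rate; a healthy depth stream should report a steady rate:
rostopic hz /camera/depth/points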
Check that the clock is being published by Gazebo:
rostopic echo /clock
If it is not, you have a problem with Gazebo (did it finish loading the world? Do you see the buildings and the drone in the Gazebo UI?). However, if it is publishing the clock, then it might be a problem with the depth camera plugin. Make sure the package ros-kinetic-gazebo-ros-pkgs is installed. If not, install it and rebuild the Firmware (with $ make posix_sitl_default gazebo as explained above).
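Another quick check (not from the original guide) is to confirm that the Gazebo node is registered with ROS at all:
rosnode list | grep gazebo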
Is the drone in OFFBOARD mode? Is it armed and flying?
# Set the drone to OFFBOARD mode
rosrun mavros mavsys mode -c OFFBOARD
# Arm
rosrun mavros mavsafety arm
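You can confirm the mode and arming state reported by the flight controller through MAVROS:
# Print one /mavros/state message; it should show mode: OFFBOARD and armed: True
rostopic echo -n 1 /mavros/state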
Some tuning may be required in the file "<Firmware_dir>/posix-configs/SITL/init/rcS_gazebo_iris".
Some parameters can be tuned in rqt_reconfigure.
The global planner uses the octomap_server to get probabilistic information about the environment. The octomap_server needs a stream of point clouds to generate the accumulated data.
If a point-cloud stream already exists, this step can be skipped.
Assuming there already exists a stream of depth maps on the ROS topic <depthmap_topic>, we need to generate a corresponding stream of point clouds. Start by following the instructions from PX4/disparity_to_point_cloud. Now run the point-cloud generation with the parameters for the camera intrinsics:
rosrun disparity_to_point_cloud disparity_to_point_cloud_node \
fx_:=fx fy_:=fy cx_:=cx cy_:=cy base_line_:=base_line disparity:=<depthmap_topic>
A stream of point-clouds should now be published to /point_cloud.
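To confirm the conversion is running (a quick check, not part of the original instructions):
# The rate should roughly match the incoming disparity stream
rostopic hz /point_cloud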
The planner can be built with:
catkin build
Note that you can build it in release mode by adding --cmake-args -DCMAKE_BUILD_TYPE=Release as an argument to catkin:
catkin build --cmake-args -DCMAKE_BUILD_TYPE=Release
And then just run with the corresponding launch file:
roslaunch global_planner global_planner_offboard.launch point_cloud_topic:=<point_cloud_topic>
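If the map does not build up, it can help to confirm that point clouds are arriving and that the octomap_server is publishing a map; /octomap_full and /octomap_binary are the server's default output topics, though the exact topic names in this setup are an assumption:
rostopic hz <point_cloud_topic>
rostopic hz /octomap_full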
Read the Running on Odroid instructions.