The AWS DeepRacer inference ROS package creates the `inference_node`, which is part of the core AWS DeepRacer application and is launched from the `deepracer_launcher`. For more information about the application and its components, see the aws-deepracer-launcher repository.

This node is responsible for running inference on the selected model using the Intel OpenVINO Inference Engine APIs. For more information about the Intel OpenVINO Inference Engine, see the Inference Engine Developer Guide.

The source code is released under Apache 2.0 (https://aws.amazon.com/apache-2-0/).
Follow these steps to install the AWS DeepRacer inference package.
The AWS DeepRacer device comes with all the prerequisite packages and libraries installed to run the `inference_pkg`. For more information about the preinstalled set of packages and libraries on the AWS DeepRacer device, and about installing the required build systems, see Getting started with AWS DeepRacer OpenSource.

The `inference_pkg` specifically depends on the following ROS 2 packages as build and run dependencies:
- `deepracer_interfaces_pkg`: This package contains the custom message and service type definitions used across the AWS DeepRacer core application.
- `cv_bridge`: This package contains CvBridge, which converts between ROS image messages and OpenCV images.
- `image_transport`: This package provides transparent support for transporting images in low-bandwidth compressed formats.
- `sensor_msgs`: This package defines messages for commonly used sensors, including cameras and scanning laser rangefinders.
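In a standard ament package layout, these dependencies would be declared in the package's `package.xml` roughly as follows (an illustrative sketch, not the exact file shipped with the package):

```xml
<!-- Sketch of the dependency declarations in package.xml (illustrative only) -->
<depend>deepracer_interfaces_pkg</depend>
<depend>cv_bridge</depend>
<depend>image_transport</depend>
<depend>sensor_msgs</depend>
```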
Open a terminal on the AWS DeepRacer device and run the following commands as the root user.
- Switch to the root user before you source the ROS 2 installation:

  ```shell
  sudo su
  ```

- Source the ROS 2 Foxy setup bash script:

  ```shell
  source /opt/ros/foxy/setup.bash
  ```

- Set the environment variables required to run Intel OpenVINO scripts:

  ```shell
  source /opt/intel/openvino_2021/bin/setupvars.sh
  ```

- Create a workspace directory for the package:

  ```shell
  mkdir -p ~/deepracer_ws
  cd ~/deepracer_ws
  ```

- Clone the `inference_pkg` on the AWS DeepRacer device:

  ```shell
  git clone https://github.com/aws-deepracer/aws-deepracer-inference-pkg.git
  ```

- Fetch unreleased dependencies:

  ```shell
  cd ~/deepracer_ws/aws-deepracer-inference-pkg
  rosws update
  ```

- Resolve the `inference_pkg` dependencies:

  ```shell
  cd ~/deepracer_ws/aws-deepracer-inference-pkg && rosdep install -i --from-path . --rosdistro foxy -y
  ```

- Build the `inference_pkg` and `deepracer_interfaces_pkg`:

  ```shell
  cd ~/deepracer_ws/aws-deepracer-inference-pkg && colcon build --packages-select inference_pkg deepracer_interfaces_pkg
  ```
The `inference_node` provides core functionality for running inference on reinforcement learning models trained in the AWS DeepRacer simulator. Intel OpenVINO provides APIs to load a model's intermediate representation (IR) file and create a core object that can be used to run inference. Although the node is built to work with the AWS DeepRacer application, it can be run independently for development, testing, and debugging purposes.
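The result of an inference pass can be pictured as a probability over the model's discrete action space. The following pure-Python sketch (the `softmax` helper and the `infer_results` name are illustrative, not the node's actual API) shows how raw network outputs might be turned into per-action probabilities:

```python
import numpy as np

def softmax(logits):
    """Convert raw network outputs into a probability distribution."""
    shifted = logits - np.max(logits)  # subtract the max for numerical stability
    exp = np.exp(shifted)
    return exp / exp.sum()

# Hypothetical raw outputs for a model with a 5-action discrete action space.
logits = np.array([1.2, 0.3, -0.5, 2.1, 0.0])
probs = softmax(logits)

# Pair each action index with its probability, mirroring the idea of an
# array of (class, probability) inference results.
infer_results = [(i, float(p)) for i, p in enumerate(probs)]
best_action = max(infer_results, key=lambda r: r[1])[0]
```

Downstream consumers can then either take the highest-probability action or use the full distribution.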
To launch the built `inference_node` on the AWS DeepRacer device, open another terminal and run the following commands as the root user:
- Switch to the root user before you source the ROS 2 installation:

  ```shell
  sudo su
  ```

- Source the ROS 2 Foxy setup bash script:

  ```shell
  source /opt/ros/foxy/setup.bash
  ```

- Set the environment variables required to run Intel OpenVINO scripts:

  ```shell
  source /opt/intel/openvino_2021/bin/setupvars.sh
  ```

- Source the setup script for the installed packages:

  ```shell
  source ~/deepracer_ws/aws-deepracer-inference-pkg/install/setup.bash
  ```

- Launch the `inference_pkg` using the launch script:

  ```shell
  ros2 launch inference_pkg inference_pkg_launch.py
  ```
The `inference_pkg_launch.py` file, included in this package, provides an example demonstrating how to launch the node independently from the core application:
```python
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        Node(
            package='inference_pkg',
            namespace='inference_pkg',
            executable='inference_node',
            name='inference_node'
        )
    ])
```
**Subscribed topics**

| Topic name | Message type | Description |
| --- | --- | --- |
| `/sensor_fusion_pkg/sensor_msg` | `EvoSensorMsg` | Message with the combined sensor data. Contains single-camera or two-camera images and LiDAR distance data. |
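The subscribed message bundles camera frames and LiDAR readings together. A minimal pure-Python stand-in for that shape (the class and field names here are illustrative, not the actual `EvoSensorMsg` definition from `deepracer_interfaces_pkg`):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class EvoSensorMsgSketch:
    """Illustrative stand-in: one or two camera frames plus LiDAR distances."""
    images: List[bytes] = field(default_factory=list)      # camera frames
    lidar_data: List[float] = field(default_factory=list)  # LiDAR distance readings

# A two-camera message with a few hypothetical LiDAR distance values.
msg = EvoSensorMsgSketch(images=[b"front_left_frame", b"front_right_frame"],
                         lidar_data=[0.5, 0.7, 1.2])
```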
**Published topics**

| Topic name | Message type | Description |
| --- | --- | --- |
| `/inference_pkg/rl_results` | `InferResultsArray` | Publishes a message with the reinforcement learning inference results, with class probabilities for the state input passed through the model currently selected in the device console. |
**Services**

| Service name | Service type | Description |
| --- | --- | --- |
| `load_model` | `LoadModelSrv` | Service responsible for setting the preprocessing algorithm and the inference task for the specific type of model loaded. |
| `inference_state` | `InferenceStateSrv` | Service responsible for starting and stopping inference tasks. |
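Together, the two services define a simple contract: a model must be loaded before inference can start, and inference can be toggled on and off. A minimal pure-Python sketch of that start/stop contract (the class and method names are illustrative, not the actual rclpy service implementation):

```python
class InferenceStateTracker:
    """Illustrative model of the load_model / inference_state contract."""

    def __init__(self):
        self.running = False
        self.loaded_model = None

    def load_model(self, model_path):
        # Mirrors load_model: select the model that inference will run against.
        self.loaded_model = model_path
        return True

    def set_inference_state(self, start):
        # Mirrors inference_state: start or stop the inference task.
        if start and self.loaded_model is None:
            return False  # cannot start inference without a loaded model
        self.running = start
        return True

tracker = InferenceStateTracker()
started_without_model = tracker.set_inference_state(True)  # rejected: no model yet
tracker.load_model("model.xml")
started_with_model = tracker.set_inference_state(True)     # accepted
```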