
Venus-Blue: Turtlebot Control with Foot Pedals and Localisation Display

Project Brief:

This project is built on the Raspberry Pi platform. A small rover chassis is to be controlled to drive a circuit in a 2 m × 2 m space. The speed and direction of the rover are controlled by a foot pedal. The foot pedal is connected via USB OTG to the Nucleo board, which then broadcasts the commands via ROS. The rover is also tracked using a motion model and the onboard LiDAR, with the result displayed on a web dashboard.

Project System Overview

An overview block diagram of the system is shown below. There are four main nodes:

  • Nucleo Board Node: processes data from the peripherals (foot pedals) and transmits it to the Raspberry Pi over serial. This process is handled in Zephyr.
  • Raspberry Pi Node: receives data from the Nucleo board and publishes it to a specified MQTT topic in JSON format (a plausible example payload is sketched after the diagram).
  • PC Dashboard Node: receives the localisation and peripheral information via MQTT and displays it on a web interface.
  • Turtlebot Node: the robot to be controlled. It receives driving commands via MQTT and republishes them on a ROS network, then publishes the localised position of the Turtlebot back to MQTT.

The block diagram below uses dashed lines to represent wireless network connections and solid lines for wired connections.

flowchart LR
    A[Footpedal] -->|OTG wired| B[Nucleo Board]
    B --> |UART| C[Raspberry Pi]
    C -.-> |MQTT| D[PC Dashboard]
    C -.-> |MQTT| EA
    EB -.-> |MQTT| D
    EC -.-> |MQTT| D

    subgraph E[Turtlebot]
        EA[Raspberry Pi] ---|ROS| EB[Localisation Node]
        EA ---|ROS| EC[Telemetry Node]
    end

%% Defining the styles
    classDef Input fill:#017d22;
    classDef Node fill:#03368a;

%% Assigning styles to nodes
    class A Input;
    class B,C,D,EB,EC Node;
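
This page doesn't pin down the MQTT topic name or the exact JSON schema, so the snippet below is only a plausible sketch of the Raspberry Pi node's pedal message. The topic name and field names are assumptions for illustration; the three axes come from the X, Y, Z pedal values mentioned in the dashboard section.

import json

# Hypothetical pedal message published by the Raspberry Pi node.
# The topic name and field names are assumptions, not the project's actual schema.
PEDAL_TOPIC = "venusblue/pedal"

def make_pedal_message(x: float, y: float, z: float) -> str:
    """Serialise one pedal sample (three axes) as a JSON string."""
    return json.dumps({"x": x, "y": y, "z": z})

# Example: full forward pedal, no turn.
payload = make_pedal_message(1.0, 0.0, 0.0)
# client.publish(PEDAL_TOPIC, payload)  # with a connected paho-mqtt client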

User Guide

Hardware

Components required:

  • Foot pedal
  • Nucleo board
  • Two USB-A to micro-USB cables

Setup:

  • Using a USB A to micro cable, plug the foot pedal into the Nucleo board.
  • Using a second cable, plug the Nucleo board into the host machine.

Turtlebot Controller

Clone the repository:

git clone https://github.com/RachelChiong/Venus-Blue.git

Start the serial-to-MQTT node:

cd pedal && python3 pi_serial.py
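
pi_serial.py itself isn't reproduced on this page; the sketch below only shows the general shape such a serial-to-MQTT bridge might take. The device path, baud rate, line format, broker address, and topic name are all assumptions.

import json
import serial                      # pyserial
import paho.mqtt.client as mqtt

SERIAL_PORT = "/dev/ttyACM0"       # assumed device path for the Nucleo
MQTT_BROKER = "localhost"          # assumed broker address
PEDAL_TOPIC = "venusblue/pedal"    # assumed topic name

def main() -> None:
    client = mqtt.Client()
    client.connect(MQTT_BROKER, 1883)
    client.loop_start()

    with serial.Serial(SERIAL_PORT, 115200, timeout=1) as port:
        while True:
            line = port.readline().decode(errors="ignore").strip()
            if not line:
                continue
            # Assume the Nucleo sends comma-separated pedal axes, e.g. "0.5,0.0,1.0"
            try:
                x, y, z = [float(v) for v in line.split(",")]
            except ValueError:
                continue  # skip malformed frames
            client.publish(PEDAL_TOPIC, json.dumps({"x": x, "y": y, "z": z}))

if __name__ == "__main__":
    main()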

Initialise the ROS environment to point to the TurtleBot (this example is for a waffle_pi with domain ID 32; change these values for another TurtleBot model or domain ID):

export ROS_DOMAIN_ID=32
export TURTLEBOT3_MODEL=waffle_pi
export RMW_IMPLEMENTATION=rmw_cyclonedds_cpp
source /opt/ros/humble/setup.bash

Build and run the controller. The MQTT server can be selected by passing the mqtt_server and mqtt_port arguments when launching (an example follows the commands below).

source /opt/ros/humble/setup.bash
cd turtlebot
colcon build --symlink-install --packages-select venusbluepy
source install/setup.bash
ros2 launch venusblue venusblue.launch
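
Assuming mqtt_server and mqtt_port are exposed as launch arguments, the invocation would look something like the following; the broker address and port here are placeholders:

ros2 launch venusblue venusblue.launch mqtt_server:=broker.example.com mqtt_port:=1883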

Visualisation Dashboard

Pre-requisites:

  • Docker
  • Docker-compose
  • UQ internet connection (to interact with MQTT nodes)

Tech-stack:

  • React-Typescript (frontend)
  • Flask-Python (backend)

Steps:

  1. Ensure Docker is running
  2. Go to dashboard/ and run
docker-compose up

This should spin up two containers:

  • frontend: localhost:3000
  • backend: localhost:5001
  3. To close the dashboard, run
docker-compose down -v

Team Roles

Gabe

  • My responsibility is to integrate the TurtleBot3 Waffle Pi ROS interface with MQTT to handle wheel movement, to handle deployment and execution on the TurtleBot3, and to create the team poster. This contribution involves writing the ROS node that drives the wheels and interfaces with the MQTT pedal and dashboard messages, and maintaining the development environment for ROS. A minimal sketch of such an MQTT-to-ROS bridge is shown below.
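
The actual node is in the repository; this sketch only illustrates the general MQTT-to-cmd_vel bridge pattern, reusing the assumed pedal JSON payload from earlier. The broker address, topic name, and velocity scaling are placeholders.

import json
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist
import paho.mqtt.client as mqtt

PEDAL_TOPIC = "venusblue/pedal"   # assumed MQTT topic

class PedalBridge(Node):
    """Subscribes to pedal messages over MQTT and republishes them as Twist commands."""

    def __init__(self) -> None:
        super().__init__("pedal_bridge")
        self.cmd_pub = self.create_publisher(Twist, "cmd_vel", 10)
        self.mqtt = mqtt.Client()
        self.mqtt.on_message = self.on_pedal
        self.mqtt.connect("localhost", 1883)   # assumed broker address
        self.mqtt.subscribe(PEDAL_TOPIC)
        self.mqtt.loop_start()

    def on_pedal(self, client, userdata, msg) -> None:
        data = json.loads(msg.payload)
        twist = Twist()
        # Placeholder mapping: one pedal axis drives, another steers.
        twist.linear.x = 0.2 * data["x"]
        twist.angular.z = 1.0 * data["y"]
        self.cmd_pub.publish(twist)

def main() -> None:
    rclpy.init()
    rclpy.spin(PedalBridge())

if __name__ == "__main__":
    main()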

Lachie

  • My responsibility is to handle the localisation of the Turtlebot, including algorithm selection and implementation. The main implementation exists within the ROS system on the Turtlebot's Raspberry Pi. Here, LiDAR and Inertial Measurement Unit (IMU) data are used to determine the change in position of the Turtlebot and to create a map via SLAM. The LiDAR point cloud is filtered using Adaptive Monte Carlo Localisation (AMCL), and the surroundings are mapped using Simultaneous Localisation and Mapping (SLAM). The localisation of the rover is backed up by sensor fusion of the IMU and mapping data through an Extended Kalman Filter (EKF), which increases the accuracy of the data and improves the accuracy and reliability of the map produced by the SLAM algorithm. This data will be displayed on the base-station computer via the web application. A minimal sketch of the EKF fusion step follows.
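
The concrete filter lives in the ROS stack; as a pointer to what the EKF fusion step does, here is a textbook predict/update cycle on a simple planar state. The state layout, motion model, and noise matrices are all placeholder assumptions, and with these linear models it reduces to a standard Kalman filter; the real EKF would linearise a nonlinear motion model at each step.

import numpy as np

# Illustrative planar state: [x, y, heading]. Models and noise values are placeholders.
F = np.eye(3)                       # state transition (identity: small time steps)
Q = np.diag([0.01, 0.01, 0.005])    # process noise (assumed)
H = np.eye(3)                       # measurement model: pose observed directly
R = np.diag([0.05, 0.05, 0.02])     # measurement noise (assumed)

def ekf_step(x, P, u, z):
    """One predict/update cycle.

    x: state estimate, P: covariance, u: odometry increment [dx, dy, dtheta],
    z: pose measurement (e.g. from scan matching / AMCL).
    """
    # Predict: apply the motion increment and grow the uncertainty.
    x_pred = F @ x + u
    P_pred = F @ P @ F.T + Q

    # Update: blend in the measurement, weighted by the Kalman gain.
    y = z - H @ x_pred                    # innovation
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(3) - K @ H) @ P_pred
    return x_new, P_new

# Example: start at the origin, move forward, then observe a slightly offset pose.
x, P = np.zeros(3), np.eye(3) * 0.1
x, P = ekf_step(x, P, u=np.array([0.1, 0.0, 0.0]), z=np.array([0.12, 0.01, 0.0]))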

James

  • My role within the team is focused on the Pedal Node. There are several key functions that must be implemented. First, the USB OTG protocol is used to communicate between the Nucleo board and the pedal inputs. These inputs must then be forwarded to a Raspberry Pi connected to the Nucleo via serial. The Raspberry Pi listens for these serial data packets and publishes them on an MQTT topic for pedal data (a hypothetical framing for those packets is sketched below). My role for the milestone was to develop documentation for wireless network communications / IoT protocols and web dashboards.
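
The wire format between the Nucleo and the Pi isn't specified on this page. Purely as an illustration, a framed binary packet with a sync byte and checksum could be parsed on the Pi side like this; every field choice below is an assumption:

import struct

SYNC = 0xAA              # assumed start-of-frame marker
FRAME_LEN = 14           # sync (1) + three float32 axes (12) + checksum (1)

def parse_frame(frame: bytes):
    """Return (x, y, z) pedal axes from one frame, or None if invalid."""
    if len(frame) != FRAME_LEN or frame[0] != SYNC:
        return None
    # Simple additive checksum over the payload bytes (an assumption).
    if sum(frame[1:13]) & 0xFF != frame[13]:
        return None
    return struct.unpack("<fff", frame[1:13])

# Example frame carrying x=1.0, y=0.0, z=0.5
payload = struct.pack("<fff", 1.0, 0.0, 0.5)
frame = bytes([SYNC]) + payload + bytes([sum(payload) & 0xFF])
print(parse_frame(frame))  # (1.0, 0.0, 0.5)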

Rachel

  • My role within the team is to design and create the web dashboard. This dashboard will display the X, Y, Z values received from the foot pedals and the telemetry and localisation data from the Turtlebot. This information will be retrieved using MQTT nodes; a minimal sketch of the backend's MQTT listener is shown below.
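
The real backend lives in dashboard/; the sketch below shows one plausible shape for it, assuming a paho-mqtt subscriber that caches the latest message per topic and a single Flask endpoint polled by the React frontend. The topic names, broker address, and route are placeholders (the port matches the backend address listed above).

import json
from flask import Flask, jsonify
import paho.mqtt.client as mqtt

app = Flask(__name__)
latest = {"pedal": None, "pose": None}   # most recent message per topic

def on_message(client, userdata, msg):
    # Cache the latest payload under a short key for the frontend to poll.
    key = "pedal" if msg.topic.endswith("pedal") else "pose"
    latest[key] = json.loads(msg.payload)

client = mqtt.Client()
client.on_message = on_message
client.connect("localhost", 1883)            # assumed broker address
client.subscribe([("venusblue/pedal", 0),    # assumed topic names
                  ("venusblue/pose", 0)])
client.loop_start()

@app.route("/api/state")
def state():
    """Latest pedal and localisation data, polled by the React frontend."""
    return jsonify(latest)

if __name__ == "__main__":
    app.run(port=5001)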
