Adds a maintenance guide page with information on maintaining vision #73

Merged: 48 commits, Aug 6, 2021

Commits
17f20b2
Add vision guide and rearrange page numbers
ysims Jun 11, 2020
233c68f
Merge remote-tracking branch 'origin/master' into sims/guide-vision
ysims Jun 22, 2020
a317242
Merge branch 'master' into sims/guide-vision
ysims Jul 17, 2020
fc178bc
Merge remote-tracking branch 'origin/master' into sims/guide-vision
ysims Jul 17, 2020
442217f
Changes wording (minor)
ysims Jul 17, 2020
f8804ca
Merge branch 'master' into sims/guide-vision
ysims Jul 25, 2020
7427f28
Fixes merge
ysims Jul 25, 2020
a5a9dc8
Merge branch 'master' into sims/guide-vision
ysims Oct 8, 2020
2d99661
Renames to a general maintenance page
ysims Oct 8, 2020
1e607ed
Merge branch 'master' into sims/guide-vision
ysims Dec 6, 2020
5ddc67e
Merge remote-tracking branch 'origin/master' into sims/guide-vision
ysims Jan 7, 2021
2e4cbd9
Update src/book/03-guides/01-main-codebase/02-maintaining-subsystems.mdx
ysims Mar 8, 2021
414d31e
Update src/book/03-guides/01-main-codebase/02-maintaining-subsystems.mdx
ysims Mar 8, 2021
368d027
Merge remote-tracking branch 'origin/sims/guide-vision' into sims/gui…
ysims Mar 16, 2021
ed93bbf
Merge branch 'master' into sims/guide-vision
ysims Mar 30, 2021
fb6627c
Simplify NUgan and NUpbr sections
ysims Mar 31, 2021
908ce20
Merge remote-tracking branch 'origin/master' into sims/guide-vision
ysims Jun 29, 2021
abaeae3
Redo headings
ysims Jun 29, 2021
0b9bdf5
Remove NUgan - not useful anyway
ysims Jun 29, 2021
7f9691c
Apply suggestions from code review
ysims Jun 29, 2021
08adac1
Merge remote-tracking branch 'origin/sims/guide-vision' into sims/gui…
ysims Jun 29, 2021
47d2ab4
Update src/book/03-guides/01-main-codebase/02-maintaining-subsystems.mdx
ysims Jun 29, 2021
3301fc3
Just link to Visual Mesh page
ysims Jun 29, 2021
43cb092
How to export weights
ysims Jun 29, 2021
181372a
bash
ysims Jun 29, 2021
be9788c
NUgan will make its return!!
ysims Jul 3, 2021
1a83416
Updates a lot of things
ysims Jul 3, 2021
b227f06
Merge branch 'master' into sims/guide-vision
ysims Jul 3, 2021
048a509
Very small grammar mistake!!!!!
ysims Jul 3, 2021
8b8c95b
Merge branch 'master' into sims/guide-vision
ysims Jul 4, 2021
95ffb9b
Merge branch 'master' into sims/guide-vision
KipHamiltons Jul 8, 2021
6c36817
Apply suggestions from code review
ysims Jul 10, 2021
2225cd4
Better instruction for config file
ysims Jul 10, 2021
a304793
Better instructions for testing
ysims Jul 10, 2021
2b88725
Clarify recording and playing back for this purpose
ysims Jul 10, 2021
77eb137
Remove info on nano
ysims Jul 10, 2021
b2cd3a4
Fix up numbering
ysims Jul 10, 2021
071d1d2
Merge branch 'master' into sims/guide-vision
ysims Jul 11, 2021
dd8d4e1
Merge branch 'master' into sims/guide-vision
ysims Jul 13, 2021
edefb49
Merge branch 'master' into sims/guide-vision
ysims Jul 25, 2021
249fdb1
Better linking and wording for dataset stuff
ysims Jul 25, 2021
46ea8cd
Reword detector stuff
ysims Jul 25, 2021
e6fa254
link seg masks when first mentioned
ysims Jul 25, 2021
e711ad8
Merge branch 'master' into sims/guide-vision
ysims Aug 1, 2021
8f70a84
Merge remote-tracking branch 'origin/master' into sims/guide-vision
ysims Aug 5, 2021
fc90c73
Mention the mesh too with kinematics and odometry
ysims Aug 5, 2021
3be8d4d
Formatting
ysims Aug 5, 2021
3920a38
Word it a bit better
ysims Aug 5, 2021

121 changes: 121 additions & 0 deletions src/book/03-guides/01-main-codebase/02-maintaining-subsystems.mdx
@@ -0,0 +1,121 @@
---
section: Guides
chapter: Main Codebase
title: Maintaining Subsystems
description: How to maintain subsystems within the main codebase.
slug: /guides/main/maintaining-subsystems
---

This page details how to maintain various subsystems within the main codebase.

## Vision

The accuracy of the vision system relies on the accuracy of odometry and kinematics, since they affect the placement of the mesh and the green horizon. It is important that these systems work reasonably well, otherwise the robot may have issues detecting objects.

### Dataset Generation

Synthetic and semi-synthetic training data for vision can be generated using [NUpbr](/system/tools/nupbr). Pre-generated datasets for training the Visual Mesh are on the <abbr title="Network-Attached Storage">NAS</abbr> in the lab.

#### NUpbr

NUpbr is a Physically Based Rendering tool created in Blender. It creates semi-synthetic images with corresponding [segmentation masks](https://paperswithcode.com/task/semantic-segmentation) for training.

Find out how to get and use NUpbr on the [NUpbr NUbook page](/system/tools/nupbr) and the [NUpbr GitHub repository](https://github.com/NUbots/NUpbr).

#### Setting Up The Data

The Visual Mesh requires raw images, segmentation masks and metadata, as outlined in the [Quick Start Guide](https://github.com/Fastcode/VisualMesh/blob/main/readme/quickstart.md#gathering-the-data). NUpbr can provide all of these as output, and premade data is available on the NAS. The data then needs to be converted to the TFRecord format using a script in the Visual Mesh repository. [The Quick Start Guide](https://github.com/Fastcode/VisualMesh/blob/main/readme/quickstart.md#building-the-dataset) describes how to use it.

### The Visual Mesh

#### Training and Testing

Go to the [NUbook Visual Mesh Getting Started guide](/guides/tools/visualmesh) to find out how to train and test a network, with an example dummy dataset.

#### Exporting Configuration

The resulting network should be exported to a [YAML](https://yaml.org/) file and added to the NUbots codebase by completing the following steps.

1. Create a base configuration file. Example yaml files can be found [in the Visual Mesh repository](https://github.com/Fastcode/VisualMesh/blob/master/example/model.yaml) and [in the NUbots repository](https://github.com/NUbots/NUbots/blob/master/module/vision/VisualMesh/data/config/VisualMesh.yaml).

2. Export the weights of your trained Mesh to this configuration file using the following command, where `<output_dir>` is the directory of the configuration file:

```bash
./mesh.py export <output_dir>
```

3. Add this configuration file to the NUbots repository in the [VisualMesh module](https://github.com/NUbots/NUbots/tree/master/module/vision/VisualMesh/data/config/networks). Replace or add a configuration file depending on the use case of the Mesh - `RobocupNetwork.yaml` is for playing soccer on the real robot and `WebotsNetwork.yaml` is for playing soccer in the Webots simulator. View the [Git Guide](/guides/general/git) for information on using Git and submitting this change in a pull request.
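
As a rough sketch of this last step, the Git workflow might look like the following (the branch name and commit message are examples only, and the path assumes you are replacing the real-robot network):

```bash
# Example only: commit the exported network on a new branch and push it for review
git checkout -b update-visualmesh-network
git add module/vision/VisualMesh/data/config/networks/RobocupNetwork.yaml
git commit -m "Update Visual Mesh network weights"
git push -u origin update-visualmesh-network
```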

### Camera Calibration

The vision system cannot work optimally if the cameras are not calibrated correctly. [The input page](/system/subsystems/input) describes the camera parameters that can be calibrated.

An automatic camera calibration tool is available in the NUbots repository. See the [camera calibration guide](/guides/main/camera-calibration#page-content) to find out how to use this tool.

### Testing

After updating the Visual Mesh in the NUbots repository, it should be tested before merging. Refer to the [Getting Started guide](/guides/main/getting-started) for assistance with the following steps.

1. Build the code, ensuring `ROLE_visualmesh` is set to `ON` in `./b configure -i`, and install it to the robot. Ensure the new configuration file is installed by using the `-cu` or `-co` options when installing - check out the [Build System page](/system/foundations/build-system#install) to find out more about these options.
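
A minimal sketch of this step is shown below, assuming the robot is reachable at `<address>` as in the next step; see the Build System page linked above for the exact behaviour of each flag:

```bash
./b configure -i          # set ROLE_visualmesh to ON in the interactive menu
./b build                 # build the code
./b install <address> -co # install to the robot; -co/-cu control how configuration files are installed
```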

2. Once the new Visual Mesh is installed on the robot, connect to it using:

```bash
ssh nubots@<address>
```

3. Ensure NUsight is on and the required messages are enabled:

```bash
nano config/NetworkForwarder.yaml
```

Turn `vision object` and `compressed images` on. Run NUsight using `yarn prod` and navigate to the NUsight page in your browser. More on NUsight can be found on [the NUsight NUbook page](/system/tools/nusight).

4. Run the `visualmesh` role:

```bash
./visualmesh
```

5. Wait for the cameras to load and then watch the Vision tab in NUsight. To determine if the output is correct, consult the [vision page](/system/subsystems/vision) for images of the expected output.

If you would like to see the Visual Mesh output in NUsight, you will need to log the data and play it back in NUsight using DataPlayback, since the data is too large to send over the network. Use the steps in the [DataLogging and DataPlayback guide](/guides/main/data-recording-playback) to record and play back data, adjusting those instructions using the following hints (a rough sketch of the workflow follows the list):

- In step 1 of Recording Data, use the [`visualmesh`](https://github.com/NUbots/NUbots/blob/master/roles/visualmesh.role) role to record the data.
- In step 2 of Recording Data and step 4 of Playing Back Data, set `message.output.CompressedImage` to `true` and add `message.vision.VisualMesh: true` in both [`DataLogging.yaml`](https://github.com/NUbots/NUbots/blob/master/module/support/logging/DataLogging/data/config/DataLogging.yaml) and [`DataPlayback.yaml`](https://github.com/NUbots/NUbots/blob/master/module/support/logging/DataPlayback/data/config/DataPlayback.yaml).
- In steps 1, 2 and 5 of Playing Back Data, use the [`playback`](https://github.com/NUbots/NUbots/blob/master/roles/playback.role) role to play back the data, without changes.
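
With those hints applied, the overall flow looks roughly like the sketch below. Treat it as an outline only; the linked guide is the authoritative reference for where the recorded files end up and how `DataPlayback.yaml` points at them.

```bash
# On the robot: enable the extra messages, then record while the visualmesh role runs
nano config/DataLogging.yaml   # message.output.CompressedImage: true, message.vision.VisualMesh: true
./visualmesh                   # data is logged while this role runs

# For playback: enable the same messages, then run the playback role with NUsight open
nano config/DataPlayback.yaml  # message.output.CompressedImage: true, message.vision.VisualMesh: true
./playback
```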

### Tuning Detectors

The Visual Mesh may give positive results after training but perform poorly when used on a robot. In this case, the detectors may need tuning.

[BallDetector.yaml](https://github.com/NUbots/NUbots/blob/master/module/vision/BallDetector/data/config/BallDetector.yaml) and [GoalDetector.yaml](https://github.com/NUbots/NUbots/blob/master/module/vision/GoalDetector/data/config/GoalDetector.yaml) contain the values for tuning the ball and goal detectors respectively.

1. Build and install the `visualmesh` role to a robot.
2. SSH onto the robot.
3. Enable NUsight messages on the robot by running

```bash
nano config/NetworkForwarder.yaml
```

and set `message.vision.Balls` and `message.vision.Goals` to `true`.

4. Run NUsight using `yarn prod` on a computer. Set up NUsight using the [Getting Started page](/guides/main/getting-started) if necessary.
5. Run `./visualmesh` on the robot.
6. Alter the configuration files for the detectors while the binary is running on the robot. In a new terminal, SSH onto the robot again and run:

```bash
nano config/BallDetector.yaml
```

Change the values; upon saving, the changes will be applied immediately by the robot **without** needing to rebuild or rerun the `./visualmesh` binary.

7. Repeat step 6 for the goal detector by running

```bash
nano config/GoalDetector.yaml
```

In general, it may be useful to adjust the `confidence_threshold` on both detectors to improve the results. Other variables, apart from `log_level` and the covariances (`goal_projection_covariance` and `ball_angular_cov`), may also give better results with different values.
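
For example, to quickly check the current threshold values on the robot before editing them (the `confidence_threshold` key is the one mentioned above, and the paths match the `nano` commands in the steps):

```bash
# Print the current confidence thresholds for both detectors
grep confidence_threshold config/BallDetector.yaml config/GoalDetector.yaml
```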