This project implements hand-pose tracking from MediaPipe and uses it to control and create music within a Max 8 spatial programming project environment.
This example demonstrates both the hand tracking and gesture recognition provided by the jweb-hands-gesture-recognizer from MediaPipe. Note that this is taken from the GitHub repository of the original project:
The program uses the hand landmarks identified by MediaPipe, described here, to control the volume and the note-trigger threshold.
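To make that mapping concrete, below is a minimal, hypothetical sketch (not the repository's actual code) of how a landmark result from the gesture recognizer could drive a volume value inside jweb. It measures the thumb–index pinch distance using MediaPipe's 21-point hand model (index 4 = thumb tip, index 8 = index fingertip) and forwards a normalized value to Max. The `handleResults` callback name, the 0.25 scaling constant, and the specific outlet message are assumptions for illustration.

```js
// Hypothetical sketch: map the thumb–index pinch distance of the first
// detected hand to a 0..1 volume value and send it out of jweb to Max.
// Landmark indices follow MediaPipe's 21-point hand model.
function handleResults(results) {
  if (!results.landmarks || results.landmarks.length === 0) return;

  const hand = results.landmarks[0]; // first detected hand
  const thumbTip = hand[4];
  const indexTip = hand[8];

  // Euclidean distance in normalized image coordinates (0..1 per axis).
  const pinch = Math.hypot(thumbTip.x - indexTip.x, thumbTip.y - indexTip.y);

  // Rescale and clamp to 0..1; the 0.25 span is an assumed tuning constant.
  const volume = Math.min(Math.max(pinch / 0.25, 0), 1);

  // jweb exposes window.max for messaging back into the Max patch.
  if (window.max) {
    window.max.outlet("volume", volume);
  }
}
```

In the patch, the `volume` message leaving the jweb object could then be routed to a gain stage, while a similar distance or landmark-height measure could gate notes against the threshold.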
The license for the original project is also saved in this repository.
This example is inspired by an example by Rob Ramirez, which is in turn inspired by MediaPipe in JavaScript.