This project aims to control excavation equipment and forklifts through hand gestures. It applies modern deep-learning-based hand gesture recognition built on trt_pose. The project has been tested on a Jetson Nano with the following environment:
- Python 3.6.9
- trt_pose 0.0.1
- PyTorch 1.0.0a0
- scikit-learn 0.24.2
- Jetson Nano Robot library
The robot recognizes the following hand gestures and performs the corresponding actions (a lookup-table sketch of this mapping follows the list).
Movement
2. One finger = Left
3. Palm = Go straight
4. Two fingers (V) = Right
5. Okay = Back up
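For reference, the mapping above can be written as a simple lookup table. This is only a sketch: the indices follow the gesture_joints values used in the control code near the end of this README (index 1 is the stop gesture), and the action names are labels rather than library calls.

# Gesture class index -> robot action (indices match gesture_joints in the notebook)
GESTURE_ACTIONS = {
    1: "stop",      # stop gesture
    2: "left",      # one finger
    3: "forward",   # palm
    4: "right",     # two fingers (V)
    5: "backward",  # okay
}

def action_for(gesture_joints):
    # Unknown classes fall back to "stop" for safety, as in the control code
    return GESTURE_ACTIONS.get(gesture_joints, "stop")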
- Many industrial accidents involving construction machinery result in deaths.
- These accidents are often caused by the operator's limited visibility from inside the cab and the inability to see the surroundings.
- With this hand-gesture robot control system, heavy equipment can be operated safely from outside, where the operator has a clear view of the work site.
- "Samsung Electronics' All-in-One PC" that recognizes hand movements and works
- Amazon AI Secretary Recognizing Sign Language "Alexa"
- Recognizes a person's hand gestures through a camera
- Controls the robot according to each gesture; the gesture-to-action mapping can be customized
- Forklift truck
- Basic forklift movements
Movement
2. Up – down
3. Tilt forward – back
- Excavator
- How the excavator works
Movement
2. Handle cylinder (raise – lower)
3. Bucket cylinder (raise – lower)
- Companion robot
- Responds to gestures such as high-five, stop, and take a picture, and can be extended to express various other actions
Step 1 - Download PyTorch and Torchvision for the Jetson Nano. link
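Once the wheels are installed, a quick sanity check (not part of the original steps) confirms that PyTorch and Torchvision import and that CUDA is visible on the Nano:

# Sanity check for Step 1: PyTorch / Torchvision / CUDA on the Jetson Nano
import torch
import torchvision

print("torch:", torch.__version__)            # expected 1.0.0a0 per the environment list above
print("torchvision:", torchvision.__version__)
print("CUDA available:", torch.cuda.is_available())  # should print True on the Nano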
Step 2 - Install torch2trt
$git clone https://github.com/NVIDIA-AI-IOT/torch2trt
$cd torch2trt
$sudo python3 setup.py install --plugins
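To check that torch2trt built correctly, you can convert a small Torchvision model and compare outputs. This is only a smoke test and assumes the TensorRT runtime that ships with JetPack is present:

# Smoke test for Step 2: convert a small model with torch2trt and compare outputs
import torch
from torch2trt import torch2trt
from torchvision.models import resnet18

model = resnet18(pretrained=False).eval().cuda()
x = torch.randn(1, 3, 224, 224).cuda()

model_trt = torch2trt(model, [x])   # builds a TensorRT engine from the example input
print(torch.max(torch.abs(model(x) - model_trt(x))))  # difference should be close to zero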
Step 3 - Install other miscellaneous packages
$sudo pip3 install tqdm cython pycocotools
$sudo apt-get install python3-matplotlib
Step 4 - Install trt_pose
$git clone https://github.com/NVIDIA-AI-IOT/trt_pose
$cd trt_pose
$sudo python3 setup.py install
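A short import check (again, not part of the original steps) verifies that trt_pose and its keypoint-parsing utilities installed correctly:

# Smoke test for Step 4: trt_pose and its post-processing utilities
import trt_pose.coco
import trt_pose.models
from trt_pose.parse_objects import ParseObjects   # parses keypoints from the network output

print("trt_pose imported OK")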
Step 5 - Install dependencies for hand pose
$pip install traitlets
Step 6 - Download model weight
| Model | Weight |
|---|---|
| hand_pose_resnet18_baseline_att_224x224_A | download model |
- Download the model weight using the link above.
- Place the downloaded weight in the model directory
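With the weight file in place, the hand-pose network can be built and loaded roughly as in the upstream trt_pose_hand example. The JSON path and the .pth filename below are assumptions based on that example and on the table above, so adjust them to this repository's layout; the torch2trt conversion at the end is optional but speeds up inference on the Nano considerably.

# Sketch: build the hand-pose model, load the downloaded weight, and (optionally) optimize it
import json
import torch
import trt_pose.coco
import trt_pose.models
from torch2trt import torch2trt

# Assumed paths: hand_pose.json as shipped with trt_pose_hand, weight placed in model/
with open('preprocess/hand_pose.json', 'r') as f:
    hand_pose = json.load(f)

topology = trt_pose.coco.coco_category_to_topology(hand_pose)
num_parts = len(hand_pose['keypoints'])    # 21 hand keypoints
num_links = len(hand_pose['skeleton'])

model = trt_pose.models.resnet18_baseline_att(num_parts, 2 * num_links).cuda().eval()
model.load_state_dict(torch.load('model/hand_pose_resnet18_baseline_att_224x224_A.pth'))

# Optional: convert to TensorRT for real-time inference on the Nano
data = torch.zeros((1, 3, 224, 224)).cuda()
model_trt = torch2trt(model, [data], fp16_mode=True, max_workspace_size=1 << 25)
torch.save(model_trt.state_dict(), 'model/hand_pose_resnet18_trt.pth')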
Step 7 - Open and follow the robot_control_with_hand_gestures.ipynb notebook
If you get a static TLS block error when importing sklearn in the notebook, add the following before the import:
import os
os.environ['LD_PRELOAD']='<your sklearn path>/scikit_learn.libs/libgomp-d22c30c5.so.1.0.0'
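If you are not sure where scikit-learn's bundled libgomp lives, a small search like the one below prints candidate paths to use as the LD_PRELOAD value; it only looks in the standard site-packages directories, which is an assumption about how scikit-learn was installed.

import glob
import os
import site

# Look for the libgomp shipped inside the scikit-learn wheel (scikit_learn.libs/)
for d in site.getsitepackages():
    for path in glob.glob(os.path.join(d, 'scikit_learn.libs', 'libgomp-*.so*')):
        print(path)   # use this path as the LD_PRELOAD value above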
To move the robot in the real world, adjust the movement parameters in the gesture-handling code below to suit your robot and environment (the calls shown are JetBot-style examples; replace them with your robot library's equivalents):
if gesture_joints == 1:    # stop
    robot.stop()
elif gesture_joints == 2:  # left
    robot.left(speed=0.3)      # example JetBot call; tune the speed for your robot
elif gesture_joints == 3:  # forward
    robot.forward(speed=0.3)   # example JetBot call; tune the speed for your robot
elif gesture_joints == 4:  # right
    robot.right(speed=0.3)     # example JetBot call; tune the speed for your robot
elif gesture_joints == 5:  # backward
    robot.backward(speed=0.3)  # example JetBot call; tune the speed for your robot
else:
    robot.stop()
- JetCam - An easy to use Python camera interface for NVIDIA Jetson
- JetBot - An educational AI robot based on NVIDIA Jetson Nano
- trt_pose - Real-time pose estimation accelerated with NVIDIA TensorRT
- trt_pose_hand - Real-time hand pose estimation based on trt_pose
- deepstream_pose_estimation - trt_pose deepstream integration
- ros2_trt_pose - ROS 2 package for "trt_pose": real-time human pose estimation on NVIDIA Jetson Platform
- torch2trt - An easy to use PyTorch to TensorRT converter