This project contains the code for the ICRA 2021 paper "Learning Human-like Hand Reaching for Human-Robot Handshaking". Please cite our paper if you use this work, either in part or in whole.
This codebase was developed and tested with ROS Melodic on Ubuntu 18.04.
- Clone this package to your catkin workspace:
  ```bash
  git clone https://github.com/souljaboy764/icra_handshaking
  ```
- Go to the `src/` folder and make the scripts executable:
  ```bash
  chmod a+x *.py
  ```
- Run `catkin_make` or `catkin build` to build the workspace.
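After building, source the workspace so that ROS can find the package:

```bash
source devel/setup.bash
```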
This codebase assumes that you have downloaded the skeleton data from the NTU RGB+D Dataset.
The preprocessing step selects the reaching phases of the right-handed handshakes. It saves the upper-body skeletons (used to train an LSTM) and the joint angles (used to train the ProMP model) to an npz file `handreach_data.npz`:

```bash
python src/preprocessing.py --src-dir /path/to/dataset --dst-dir /path/to/destination
```
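As a quick sanity check that the expected input files are present (`preprocessing.py` does its own file handling), note that NTU RGB+D names its skeleton files like `S001C001P001R001A058.skeleton`, where `A058` is the handshaking action class. A minimal sketch, assuming that naming scheme and a hypothetical `dataset_dir`:

```python
import glob
import os

dataset_dir = '/path/to/dataset'  # hypothetical path; point this at the .skeleton files
handshake_files = sorted(glob.glob(os.path.join(dataset_dir, '*A058.skeleton')))
print('%d handshaking sequences found' % len(handshake_files))
```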
The data can be loaded with:

```python
import numpy as np

with open('/path/to/handreach_data.npz', 'rb') as f:
    data = np.load(f, allow_pickle=True, encoding='bytes')
    skeletons = data['skeletons']        # Array of sequences of shape Tx15x3 containing 3D positions of 15 upper-body skeleton joints for T timesteps
    joint_angles = data['joint_angles']  # Array of sequences of shape Tx4 containing right-arm joint angles (excluding the wrist) for T timesteps
```
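The training code itself is still on the to-do list below. Purely as an illustrative sketch (not the authors' implementation), ProMP weights for the loaded joint-angle sequences could be fit by least squares on normalized-phase radial basis features:

```python
import numpy as np

def promp_weights(traj, n_basis=8, width=0.05):
    """Fit ProMP basis weights for one trajectory of shape TxD by least squares."""
    T = traj.shape[0]
    z = np.linspace(0, 1, T)                             # normalized phase in [0, 1]
    centers = np.linspace(0, 1, n_basis)
    phi = np.exp(-(z[:, None] - centers[None, :]) ** 2 / (2 * width))
    phi /= phi.sum(axis=1, keepdims=True)                # normalize basis activations
    w, _, _, _ = np.linalg.lstsq(phi, traj, rcond=None)  # (n_basis x D) weight matrix
    return w.ravel()

# Fit weights per sequence, then a Gaussian over them (the ProMP distribution);
# assumes `joint_angles` was loaded as shown above.
weights = np.stack([promp_weights(seq) for seq in joint_angles])
mu_w = weights.mean(axis=0)
sigma_w = np.cov(weights, rowvar=False)
```

New reaching trajectories can then be sampled by drawing weight vectors from the fitted Gaussian and mapping them back through the basis functions.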
Start the nodes for nuitrack and Pepper's MoveIt stack, and make sure the human interaction partner is visible to nuitrack. Then run:

```bash
roslaunch icra_handshaking experiment.launch
```
In case the Pepper ROS stack is unavailable, run:

```bash
rosrun icra_handshaking pepper_promp_naoqi.py
```
```bibtex
@inproceedings{prasad2021learning,
  title={Learning Human-like Hand Reaching for Human-Robot Handshaking},
  author={Prasad, Vignesh and Stock-Homburg, Ruth and Peters, Jan},
  booktitle={IEEE International Conference on Robotics and Automation (ICRA)},
  year={2021}
}
```
- Add training code
- Add nuitrack node information
- Update README