Here is a demonstration of follow mode in our project:
holospot.mp4
For other modes, please see our project website.
Course project for Mixed Reality, Fall 2022, at ETH Zurich.
A HoloLens 2 application that enables users (especially amputees) to control the Boston Dynamics Spot robot’s movement and arm using:
- Eye tracking.
- Head movements.
- Voice control.
We implement three control modes: follow mode (the robot follows the user’s eye gaze), select mode (the robot goes directly to a selected location), and arm mode (the robot arm mimics the user’s head pose). We also provide a user-friendly interface with voice control, arm camera visualization, and a help panel that lists the detailed commands available to the user.
More details about this project can be found on our project website and in our report.
Authors: Ganlin Zhang *, Deheng Zhang *, Longteng Duan *, Guo Han *
(* Equal contribution)
Requirements:
- Windows for the Unity application; Ubuntu 20.04 and ROS Noetic for the Spot robot (a quick way to verify the robot-side environment is sketched right after this list).
- The Unity version matters: we recommend `2020.3.40`.
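As a quick sanity check of the robot-side environment, the following minimal sketch assumes nothing beyond a standard Ubuntu 20.04 + ROS Noetic installation:

```bash
# Robot-side sanity check (Ubuntu 20.04 + ROS Noetic expected)
lsb_release -rs   # should print 20.04
rosversion -d     # should print noetic
```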
Clone this repository:

```bash
git clone https://github.com/dehezhang2/holo-spot.git
```
- Connect to the robot using ssh (a hedged command sketch is given below, after this list).
- Follow the instructions in ROS_ws to set up the robot-side ROS workspace.
- Open this project using Unity.
- In the Unity project, open `File > Build Settings > Universal Windows Platform` and use the following settings: Target Device: HoloLens, Architecture: ARM64. Then click `Switch Platform`.
- Set your anchor account information: in `Assets > Scenes > SampleScene`, edit the game object `AzureSpatialAnchors` and fill in the `Spatial Anchors Account Id`, `Spatial Anchors Account Key`, and `Spatial Anchors Account Domain` under Credentials.
- Please don’t move the project folder once you have created it!
- You can refer to the MRTK Tutorial to build and deploy the project.
- Make sure the Unity version is consistent.
- Connecting the HoloLens 2 via USB can make deployment faster.
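For the robot-side steps at the top of this list (ssh connection and the ROS_ws setup), the exact hostname, username, package, and launch file depend on your Spot configuration and on the instructions in ROS_ws; the commands below are only a hedged sketch of a typical ssh + catkin workflow, with placeholder names rather than values taken from this repository.

```bash
# Hedged sketch only: replace the placeholders with the values from your own
# setup and from the ROS_ws instructions.
ssh <user>@<robot-pc-ip>        # connect to the PC that talks to the Spot robot

# On the robot-side PC (Ubuntu 20.04 + ROS Noetic):
cd ~/ROS_ws                     # path of the cloned ROS workspace (placeholder)
catkin_make                     # or `catkin build`, depending on the workspace
source devel/setup.bash
roslaunch <spot_package> <spot_bringup>.launch   # placeholder package/launch names
```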
We thank our supervisor Eric Vollenweider from the Microsoft Mixed Reality & AI Lab Zurich for his help and the many pieces of useful advice on this project.
We also thank Boyang Sun for supporting our use of the Spot robot from the CVG lab at ETH Zurich.
@article{zhang2023accessible,
title={Accessible Robot Control in Mixed Reality},
author={Zhang, Ganlin and Zhang, Deheng and Duan, Longteng and Han, Guo},
journal={arXiv preprint arXiv:2306.02393},
year={2023}
}
The code is released under the GPL-3.0 license.