- Launch the real or simulated Baxter:
  roslaunch baxter_gazebo baxter_world.launch
  rosrun baxter_tools enable_robot.py -e
- Launch camera image publication:
  [Open a new terminal] roslaunch openni2_launch openni2.launch
  OR (to use the left hand camera as input):
  rosrun birl_kitting_experiment setup_baxter_left_hand_camera.py
  rosrun birl_kitting_experiment set_left_arm2static_pose.py
- Launch alvar marker recognition with bundles and set up the camera-robot transform:
  [Open a new terminal] roslaunch birl_kitting_experiment alvar_marker_demo.launch
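The camera-robot transform set up above is what lets marker poses detected in the camera frame be expressed in the robot's base frame. A minimal, self-contained sketch of that frame change with plain 4x4 homogeneous matrices (illustrative only; the launch file does this via ROS tf, and the numbers below are made-up placements):

```python
# Composing homogeneous transforms: base <- camera <- marker.
def mat_mul(A, B):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(x, y, z):
    """Pure-translation homogeneous transform."""
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

# Assumption for illustration: camera sits 0.5 m above the robot base, axes aligned.
T_base_cam = translation(0.0, 0.0, 0.5)
# A marker detected 0.2 m in front of the camera (camera frame).
T_cam_marker = translation(0.2, 0.0, 0.0)
T_base_marker = mat_mul(T_base_cam, T_cam_marker)
print([row[3] for row in T_base_marker[:3]])  # marker in base frame: [0.2, 0.0, 0.5]
```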
- Set up motion services:
  [Open a new terminal] rosrun birl_baxter_online_urdf_update update_urdf.py
  rosrun baxter_interface joint_trajectory_action_server.py
  roslaunch birl_moveit_r_ft_config birl_baxter_gripper.launch
- Set up sensors: download and install birl_sensors, then run:
  [Open a new terminal] rosrun robotiq_force_torque_sensor rq_sensor
  [Open a new terminal] rosrun tactilesensors4 PollData4
  [Open a new terminal] rosrun smach_based_introspection_framework timeseries_publisher.py
- SSH command:
  [Open a new terminal] ssh [email protected]
- Run the experiment:
  [Open a new terminal] rosrun birl_kitting_experiment smach_based_kitting_experiment_runner.py
  To run the experiment again, repeat this step.
- Start the anomaly detection service:
  rosrun smach_based_introspection_framework anomaly_detection.py
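The framework's detection (see the cited papers) monitors the multimodal signal's likelihood under a learned model and flags an anomaly when it drops below a threshold derived from training. A self-contained toy version of that idea, with a 1-D Gaussian standing in for the actual latent-state model (nothing here is the framework's code; all names are my own):

```python
import math

def fit_gaussian(samples):
    """Return (mean, std) of 1-D training samples."""
    n = len(samples)
    mu = sum(samples) / n
    var = sum((x - mu) ** 2 for x in samples) / n
    return mu, math.sqrt(var) or 1e-9

def log_likelihood(x, mu, sigma):
    """Log-density of x under N(mu, sigma^2)."""
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

def make_detector(train, k=3.0):
    """Build a detector whose threshold is k std-devs below the
    mean training log-likelihood (mimicking a likelihood test)."""
    mu, sigma = fit_gaussian(train)
    lls = [log_likelihood(x, mu, sigma) for x in train]
    ll_mu, ll_sigma = fit_gaussian(lls)
    threshold = ll_mu - k * ll_sigma
    def is_anomalous(x):
        return log_likelihood(x, mu, sigma) < threshold
    return is_anomalous

detect = make_detector([10.0, 10.2, 9.8, 10.1, 9.9, 10.0, 10.3, 9.7])
print(detect(10.0))  # False: nominal reading
print(detect(25.0))  # True: far outside the trained regime
```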
- Start the anomaly classification service:
  rosrun smach_based_introspection_framework redis_based_anomaly_classification.py
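Conceptually, classification matches a detected anomaly against previously labelled anomaly classes. A rough nearest-centroid sketch of that idea (the redis-based service is more sophisticated; the feature vectors and labels below are invented for illustration):

```python
import math

def centroid(vectors):
    """Mean vector of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(sample, labelled):
    """labelled: {label: [feature vectors]} -> label of the nearest centroid."""
    best, best_d = None, float("inf")
    for label, vecs in labelled.items():
        d = math.dist(sample, centroid(vecs))
        if d < best_d:
            best, best_d = label, d
    return best

# Hypothetical labelled anomaly examples (2-D features for brevity).
known = {"tool_collision": [[1.0, 0.1], [0.9, 0.2]],
         "object_slip":    [[0.1, 1.0], [0.2, 0.9]]}
print(classify([0.95, 0.15], known))  # tool_collision
```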
- To record a demonstration:
  cd ../birl_kitting_experiment/scripts/
  python record_demonstration.py --name "DEMONSTRATION_NAME"
- Kinesthetically guide the robot arm to record the demonstration; when finished, press CTRL+C.
- To train DMP models from all recorded demonstrations:
  cd ../birl_kitting_experiment/scripts/
  python cook_dmp_models_for_smach_states.py
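cook_dmp_models_for_smach_states.py fits one Dynamic Movement Primitive per smach state from the recorded demonstrations. As a rough, generic illustration of what a discrete DMP learns (a textbook-style sketch, not the framework's implementation; the class name, gains, and basis setup are all my own choices, and the demonstration is assumed to span 1 second):

```python
import math

class DMP1D:
    """Minimal 1-D discrete DMP: spring-damper toward the goal plus a
    learned forcing term that decays with the canonical phase x."""
    def __init__(self, n_basis=20, alpha=25.0):
        self.n = n_basis
        self.alpha = alpha              # spring gain
        self.beta = alpha / 4.0         # damper gain (critically damped)
        self.ax = 3.0                   # canonical-system decay rate
        self.centers = [math.exp(-self.ax * i / (n_basis - 1)) for i in range(n_basis)]
        self.widths = [1.0 / (self.centers[i + 1] - self.centers[i]) ** 2
                       for i in range(n_basis - 1)]
        self.widths.append(self.widths[-1])
        self.w = [0.0] * n_basis

    def _psi(self, x):
        """Gaussian basis activations at phase x."""
        return [math.exp(-h * (x - c) ** 2) for c, h in zip(self.centers, self.widths)]

    def fit(self, traj, dt):
        """Learn basis weights from one demonstrated trajectory (duration ~1 s)."""
        y0, g = traj[0], traj[-1]
        # Finite-difference velocities and accelerations (endpoints clamped).
        yd = [(traj[min(i + 1, len(traj) - 1)] - traj[max(i - 1, 0)]) / (2 * dt)
              for i in range(len(traj))]
        ydd = [(yd[min(i + 1, len(yd) - 1)] - yd[max(i - 1, 0)]) / (2 * dt)
               for i in range(len(yd))]
        num = [0.0] * self.n
        den = [0.0] * self.n
        for i, y in enumerate(traj):
            x = math.exp(-self.ax * i * dt)
            # Forcing term needed to reproduce the demonstrated acceleration.
            f = ydd[i] - self.alpha * (self.beta * (g - y) - yd[i])
            s = x * (g - y0)
            for j, p in enumerate(self._psi(x)):
                num[j] += p * s * f
                den[j] += p * s * s
        self.w = [num[j] / den[j] if den[j] > 1e-10 else 0.0 for j in range(self.n)]
        return self

    def rollout(self, y0, g, T, dt):
        """Integrate the learned DMP toward a (possibly new) goal g."""
        y, yd, out = y0, 0.0, []
        for i in range(int(T / dt)):
            x = math.exp(-self.ax * i * dt)
            psi = self._psi(x)
            f = sum(p * w for p, w in zip(psi, self.w)) / (sum(psi) + 1e-10) * x * (g - y0)
            ydd = self.alpha * (self.beta * (g - y) - yd) + f
            yd += ydd * dt
            y += yd * dt
            out.append(y)
        return out

# Demonstration: a minimum-jerk move from 0 to 1 over 1 s.
demo = [10 * t**3 - 15 * t**4 + 6 * t**5 for t in (i / 100 for i in range(101))]
dmp = DMP1D().fit(demo, 0.01)
repro = dmp.rollout(0.0, 1.0, 1.0, 0.01)   # reproduce the demo
scaled = dmp.rollout(0.0, 2.0, 1.0, 0.01)  # generalize to a new goal
```

The point of the DMP encoding is the last line: the same learned weights generalize the demonstrated motion shape to a new goal, which is what lets one demonstration per smach state cover varying object poses.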
- To manually trigger an anomaly:
  rosrun smach_based_introspection_framework send_manual_anomaly_signal.py
- Check whether the variable in smach_based_introspection_framework/configurables.py is set as desired:
  HUMAN_AS_MODEL_MODL = True/False
- Pause the robot's movement when it encounters an anomaly and label the anomaly carefully, for instance "tool collision".
- Then follow the instructions to record the human teaching behavior:
start end
- Extract the adaptation demonstration data:
  roscd smach_based_introspection_framework
  cd src/smach_based_introspection_framework/offline_part
  python process_experiment_record_to_dataset.py
  python process_dataset_to_models.py
  (We commonly comment out the commands for generating the introspection and classification models.)
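The core idea of the extraction step, slicing a time-ordered experiment recording into contiguous per-skill (per-smach-state) segments for later model training, can be sketched as follows (the function name, tuple layout, and sample values are assumptions for illustration, not the framework's API):

```python
from collections import defaultdict

def split_by_state(samples):
    """samples: time-ordered iterable of (timestamp, state_id, value) tuples.
    Returns {state_id: [contiguous segments of (timestamp, value)]}."""
    segments = defaultdict(list)
    prev_state = None
    for t, state, value in samples:
        if state != prev_state:
            segments[state].append([])  # a new visit to this state begins
            prev_state = state
        segments[state][-1].append((t, value))
    return dict(segments)

# Toy recording: the robot visits "pick", then "place", then "pick" again.
record = [(0.0, "pick", 1.0), (0.1, "pick", 1.1),
          (0.2, "place", 0.5), (0.3, "place", 0.4),
          (0.4, "pick", 1.2)]
ds = split_by_state(record)
print(len(ds["pick"]))   # 2 segments: "pick" was visited twice
print(len(ds["place"]))  # 1 segment
```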
If you use this toolbox, please cite the following related papers:

[1] Hongmin Wu, Yisheng Guan, Juan Rojas, "A Latent State-based Multimodal Execution Monitor with Anomaly Detection and Classification for Robot Introspection", Appl. Sci. 2019, 9(6), 1072; doi: 10.3390/app9061072. http://www.juanrojas.net/files/papers/2019AS-Wu-LatState_MMExecMntr_AnmlyDetClassif.pdf

[2] Hongmin Wu, Yisheng Guan, Juan Rojas, "Robot Introspection for Process Monitoring in Robotic Contact Tasks through Bayesian Nonparametric Autoregressive Hidden Markov Models", International Journal of Advanced Robotic Systems, 2018 (IF 0.952). http://www.juanrojas.net/files/papers/2019IJARS-Wu_Rojas-AnalysisHDPVARHMM.pdf