- Install Jupyter Notebook and the requirements with `pip install -r requirements.txt`
- Download the processed videos (trajectories of body landmarks) from Google Drive and unzip them into the `videos/np/` directory
- Run `GetMetrics.ipynb` to derive all metrics used for the statistical analysis; they will be saved in `results.csv`
- Start an RStudio project in the `stats` directory
- Run the `sit2stand_clean-data_v15.Rmd` notebook with the `results.csv` file derived previously, or with the provided `dataClean.csv` file.
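The shared trajectories are NumPy arrays of per-frame keypoint coordinates. As a minimal sketch of working with one of these arrays, the snippet below builds a dummy trajectory in the assumed `(frames, keypoints, 2)` layout and computes a simple per-landmark summary; the file name in the comment and the array shape are illustrative assumptions, not the repository's actual layout.

```python
import numpy as np

# Assumed layout: a (frames, keypoints, 2) array of (x, y) coordinates.
# Here we fabricate one; in practice you would load a shared file, e.g.:
# traj = np.load("videos/np/some_video.npy")  # path is illustrative
n_frames, n_keypoints = 120, 25
rng = np.random.default_rng(0)
traj = rng.uniform(0.0, 1.0, size=(n_frames, n_keypoints, 2))

# A simple summary metric: total vertical displacement of each landmark
# over the recording (max y minus min y, per keypoint).
vertical_range = traj[:, :, 1].max(axis=0) - traj[:, :, 1].min(axis=0)
print(traj.shape, vertical_range.shape)
```

Metrics like the vertical range above are only placeholders; the actual metrics used in the paper are computed in `GetMetrics.ipynb`.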
If you prefer not to use our preprocessed video trajectories, you can process the videos yourself. Note that your results may differ slightly from ours, since we share only deidentified videos, while we ran OpenPose on the raw videos.
- Download the videos from our Google Drive.
- Run OpenPose on the videos, for example as we did here
- Process the videos to get the (x, y) trajectories of the keypoints and save them as NumPy arrays, as we did here
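The last step above can be sketched as follows. OpenPose typically writes one JSON file per frame, with a `people` list whose `pose_keypoints_2d` field is a flat `[x, y, confidence, ...]` array. The snippet builds a tiny in-memory example of that format and converts it to an `(frames, keypoints, 2)` trajectory array; the keypoint count and the save path are assumptions for illustration, not the repository's actual processing code.

```python
import json
import numpy as np

# Hypothetical OpenPose-style per-frame JSON: one person, 3 keypoints.
# Real BODY_25 output would carry 25 keypoints in the same flat format.
frames_json = [
    json.dumps({"people": [{"pose_keypoints_2d":
        [10.0, 20.0, 0.9, 11.0, 21.0, 0.8, 12.0, 22.0, 0.7]}]}),
    json.dumps({"people": [{"pose_keypoints_2d":
        [10.5, 20.5, 0.9, 11.5, 21.5, 0.8, 12.5, 22.5, 0.7]}]}),
]

def frame_to_xy(frame_str):
    """Return a (keypoints, 2) array of (x, y) for the first detected person."""
    data = json.loads(frame_str)
    flat = np.array(data["people"][0]["pose_keypoints_2d"]).reshape(-1, 3)
    return flat[:, :2]  # drop the confidence column

traj = np.stack([frame_to_xy(f) for f in frames_json])  # (frames, keypoints, 2)
# np.save("videos/np/example.npy", traj)  # illustrative save path
print(traj.shape)  # (2, 3, 2)
```

A real pipeline would also need to handle frames with no detected person and to pick a consistent person when several are detected; those details are omitted here.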