MotionMixer: MLP-based 3D Human Body Pose Forecasting

Official PyTorch Implementation of the paper: MotionMixer: MLP-based 3D Human Body Pose Forecasting.

Arij Bouazizi, Adrian Holzbock, Ulrich Kressel, Klaus Dietmayer and Vasileios Belagiannis

[Proceedings] [Papers with Code] [arXiv]
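MotionMixer forecasts future 3D poses by alternately mixing features along the temporal (frame) axis and the spatial (flattened joint-coordinate) axis with simple MLPs. The following is a minimal, hypothetical PyTorch sketch of such a spatial-temporal mixing block; the class name, layer sizes, and residual layout are illustrative and do not reproduce the authors' exact implementation (see the training scripts for that).

```python
import torch
import torch.nn as nn

class MixerBlock(nn.Module):
    """Hypothetical sketch of a spatial-temporal MLP mixing block.

    x has shape (batch, seq_len, pose_dim), where pose_dim = joints * 3.
    One MLP mixes along the temporal axis, another along the pose axis.
    """

    def __init__(self, seq_len: int, pose_dim: int, hidden: int = 128):
        super().__init__()
        self.norm_t = nn.LayerNorm(pose_dim)
        self.temporal_mlp = nn.Sequential(
            nn.Linear(seq_len, hidden), nn.GELU(), nn.Linear(hidden, seq_len))
        self.norm_s = nn.LayerNorm(pose_dim)
        self.spatial_mlp = nn.Sequential(
            nn.Linear(pose_dim, hidden), nn.GELU(), nn.Linear(hidden, pose_dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Temporal mixing: move the frame axis last, mix, move it back.
        y = self.norm_t(x).transpose(1, 2)
        x = x + self.temporal_mlp(y).transpose(1, 2)
        # Spatial mixing over the flattened joint coordinates.
        x = x + self.spatial_mlp(self.norm_s(x))
        return x

# Example: 10 observed frames, 22 joints x 3 coordinates = 66 pose dimensions.
block = MixerBlock(seq_len=10, pose_dim=66)
out = block(torch.randn(8, 10, 66))  # -> (8, 10, 66)
```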

Installation

To set up the environment:

cd MotionMixer
conda create -n MotionMixer python=3.8.8
conda activate MotionMixer
pip install -r requirements.txt

Data

Due to licensing restrictions, we cannot provide the datasets. Please refer to STSGCN for instructions on preparing the dataset files.

Training

To train the model on Human3.6M (h36m) or AMASS (amass), use one of the following commands:

python h36m/train_mixer_h36m.py --input_n 10 --output_n 25 --skip_rate 1 
python amass/train_mixer_amass.py --input_n 10 --output_n 25 --skip_rate 5 
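The flags above set the sliding-window layout of the motion sequences: --input_n observed frames, --output_n frames to forecast, and --skip_rate as the stride between consecutive windows. Below is a minimal sketch of that windowing, assuming a pose sequence stored as a (frames, pose_dim) array; the function name is hypothetical and is not the repository's data loader.

```python
import numpy as np

def make_windows(sequence, input_n=10, output_n=25, skip_rate=1):
    """Split one motion sequence into (observed, future) training pairs.

    sequence: array of shape (num_frames, pose_dim).
    Returns a list of (input_n, pose_dim) / (output_n, pose_dim) pairs,
    taking a new window every skip_rate frames.
    """
    window = input_n + output_n
    pairs = []
    for start in range(0, len(sequence) - window + 1, skip_rate):
        observed = sequence[start:start + input_n]
        future = sequence[start + input_n:start + window]
        pairs.append((observed, future))
    return pairs

seq = np.random.randn(200, 66)              # e.g. 200 frames, 22 joints x 3
pairs = make_windows(seq, input_n=10, output_n=25, skip_rate=1)
print(len(pairs), pairs[0][0].shape, pairs[0][1].shape)
```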

Evaluation

To evaluate the pretrained models, use one of the following commands:

python h36m/test_mixer_h36m.py --input_n 10 --output_n 25 --skip_rate 1 
python amass/test_mixer_amass.py --input_n 10 --output_n 25 --skip_rate 5 
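Forecasting accuracy on these benchmarks is commonly reported as the mean per-joint position error (MPJPE) over the predicted horizon. The sketch below is a generic implementation of that metric for orientation only, not the repository's exact evaluation code.

```python
import torch

def mpjpe(pred, target):
    """Mean per-joint position error, averaged over batch, frames, and joints.

    pred, target: tensors of shape (batch, frames, joints, 3),
    in the units of the data (e.g. millimeters for Human3.6M).
    """
    return torch.linalg.norm(pred - target, dim=-1).mean()

pred = torch.randn(8, 25, 22, 3)
target = torch.randn(8, 25, 22, 3)
print(mpjpe(pred, target).item())
```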

Models

We release the pretrained models for academic purposes. You can download them from Google Drive and unzip the .zip file into the /checkpoints directory.

Citation

If you find this code useful for your research, please consider citing the following paper:

@inproceedings{ijcai2022p111,
  title     = {MotionMixer: MLP-based 3D Human Body Pose Forecasting},
  author    = {Bouazizi, Arij and Holzbock, Adrian and Kressel, Ulrich and Dietmayer, Klaus and Belagiannis, Vasileios},
  booktitle = {Proceedings of the Thirty-First International Joint Conference on
               Artificial Intelligence, {IJCAI-22}},
  publisher = {International Joint Conferences on Artificial Intelligence Organization},
  pages     = {791--798},
  year      = {2022},
  month     = {7},
}

Acknowledgments

Some of our code was adapted from HisRepsItself and STSGCN. We thank the authors for making their code public.

License

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.