Code base for the project: Motion Matching for Responsive Animation For Digital Humans.
We recommend JetBrains CLion for development. It is paid software, but JetBrains has a student plan that provides free licenses. See this for a quick-start guide.
- Fork this repository and download the code.
- Build the project (or build the `mocapApp` target). You can build the project in CMake Release mode for real-time performance: see this for a guide on CMake profiles in CLion.
- Run `mocapApp`.
- Click the `Main Menu > Mocap Data > Import` button and navigate to the example BVH mocap data directory `data/mocap/mann`. Select the folder to import all the clips within the directory. Once you have successfully imported the data, you will see the list of motion clips.
- Click a motion clip to play it. The character will show up on the screen.
- Press the space bar or click the play toggle to play the clip.
- Play around with it, and try to understand the code. Don't hesitate to contact Dongho ([email protected]) if you have any questions regarding the implementation.
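The build-and-run steps above can be sketched as shell commands. The target name `mocapApp` comes from this README; the build directory layout and binary path are assumptions that may differ from the project's actual CMake setup:

```shell
# From the repository root, configure an out-of-source Release build
# (Release mode is recommended above for real-time performance).
cmake -S . -B build -DCMAKE_BUILD_TYPE=Release

# Build only the mocapApp target.
cmake --build build --target mocapApp

# Run the app (the binary location may differ depending on the CMake setup).
./build/mocapApp
```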
- Find a BVH-format mocap dataset of human motion recordings and try importing it. You may need to do a bit of minor debugging for that.
- Implement motion matching without blending: see the following references.
- Implement a blending algorithm (e.g., inertialization).
- (Optional) Implement motion retargeting for the Bob model.
- If you want to start from scratch (instead of using this code), feel free to do so!
- This repo will be kept updated, so please stay tuned. If you want to sync your repo with the new commits, use `git rebase` instead of `git merge`: see this for more details on `git rebase`.
- Please actively use GitHub issues for questions and for reporting problems with the code base!