
CHORE (ECCV'22)

Official implementation of the ECCV 2022 paper: Contact, Human and Object REconstruction from a single RGB image

[ArXiv] [Project Page]

(Teaser figure)

News: our new CVPR'23 paper can track humans and objects consistently from monocular videos. Check it out here!

Contents

  1. Dependencies
  2. Run demo
  3. Training
  4. Testing
  5. Citation
  6. License

Dependencies

The code is tested with torch 1.6, CUDA 10.1, and Debian 11. We recommend using an anaconda environment:

conda create -n chore python=3.7
conda activate chore 

The main dependencies are listed in requirements.txt; install them with:

git clone https://github.com/xiexh20/CHORE.git && cd CHORE 
pip install -r requirements.txt

Install the other dependencies:

  1. psbody-mesh library: see installation.
  2. igl library: conda install -c conda-forge igl
  3. Detectron2 library:
python -m pip install detectron2 -f https://dl.fbaipublicfiles.com/detectron2/wheels/cu101/torch1.6/index.html

Update July 2024: the psbody-mesh library is no longer maintained. The core functionality I use from it is the Mesh class for loading/exporting meshes and the MeshViewer for fast visualization. If you don't need the visualizer, you can simply copy the source code of the Mesh class and import Mesh from there. One example can be found here: https://github.com/xiexh20/ProciGen/blob/main/render/mesh.py. Simply replace from psbody.mesh import Mesh with from mesh import MyMesh as Mesh. A minimal sketch of such a class is given below.
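If you prefer not to copy the linked file, a drop-in class for loading and export could look like the following. This is only an illustration, assuming the trimesh package; MyMesh and write_ply mirror the psbody-mesh names:

import numpy as np
import trimesh

class MyMesh:
    """Minimal stand-in for psbody.mesh.Mesh (loading/export only)."""
    def __init__(self, v=None, f=None, filename=None):
        if filename is not None:
            m = trimesh.load(filename, process=False)  # keep original vertex order
            v, f = np.asarray(m.vertices), np.asarray(m.faces)
        self.v = v  # (N, 3) vertex positions
        self.f = f  # (M, 3) triangle indices

    def write_ply(self, path):
        trimesh.Trimesh(vertices=self.v, faces=self.f, process=False).export(path)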

  4. Neural Mesh Renderer: pip install external/neural_renderer
  5. Mesh intersection library:
export CHORE_PATH=${PWD}
git clone https://github.com/NVIDIA/cuda-samples.git external/cuda-samples
export CUDA_SAMPLES_INC=${CHORE_PATH}/external/cuda-samples/Common/
git clone https://github.com/vchoutas/torch-mesh-isect external/torch-mesh-isect
cp external/torch-mesh-isect/include/double_vec_ops.h external/torch-mesh-isect/src/

Add these lines to external/torch-mesh-isect/src/bvh.cpp before the first use of AT_CHECK (reference):

#ifndef AT_CHECK 
#define AT_CHECK TORCH_CHECK 
#endif 

Finally, run pip install external/torch-mesh-isect/
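To sanity-check the build, you can run a tiny collision query. This sketch follows the examples in the torch-mesh-isect repository; the exact module path is an assumption and may differ across versions:

import torch
from mesh_intersection.bvh_search_tree import BVH

bvh = BVH(max_collisions=8)                 # BVH-based triangle collision search
triangles = torch.rand(1, 32, 3, 3).cuda()  # (batch, num_faces, 3 vertices, xyz)
collisions = bvh(triangles)                 # indices of intersecting triangle pairs
print(collisions.shape)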

Run demo

The pretrained model can be downloaded from here. Download and extract it into the code directory with: unzip chore-pretrained.zip -d experiments

We use the SMPL-H body model; please obtain it from the official website and modify SMPL_MODEL_ROOT in PATHS.yml accordingly.

This demo also requires the object templates from the BEHAVE dataset. Download them from here and modify BEHAVE_PATH in PATHS.yml accordingly.
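For reference, the three path entries this README mentions could look like the following in PATHS.yml (a hypothetical layout; replace the placeholder paths with your own):

SMPL_MODEL_ROOT: /path/to/smplh/models    # SMPL-H body model files
BEHAVE_PATH: /path/to/behave              # BEHAVE data and object templates
PROCESSED_PATH: /path/to/processed-data   # where preprocessed data is written (see Training)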

Run demo example with:

python demo.py chore-release -s example -on basketball 

Results will be saved to example/000000117377/demo.

Training

Please follow the instructions here to download the BEHAVE dataset.

After downloading and unzipping, modify BEHAVE_PATH in PATHS.yml accordingly.

Data preprocessing

Set the root path PROCESSED_PATH in PATHS.yml to the directory where the processed data should be saved.

Preprocess one sequence:

python preprocess/preprocess_scale.py -s [path to one sequence]

Alternatively, you can run python preprocess/preprocess_scale.py -a to process all sequences sequentially.
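The -a option amounts to a loop over all sequences. A rough equivalent, assuming one sub-directory per sequence (the seq_root path here is a placeholder):

import glob, os, subprocess

seq_root = "/path/to/behave/sequences"  # directory containing one folder per sequence
for seq in sorted(glob.glob(os.path.join(seq_root, "*"))):
    if os.path.isdir(seq):
        # same as invoking the script manually with -s for each sequence
        subprocess.run(["python", "preprocess/preprocess_scale.py", "-s", seq], check=True)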

Train the model

Specify the number of GPUs with nproc_per_node and run the following to start training:

python -m torch.distributed.launch --nproc_per_node=4 --use_env train_launch.py -en chore-release
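With --use_env, each launched process reads its rank from environment variables instead of a --local_rank argument. The initialization inside the training script follows this standard pattern (illustrative only; the actual setup lives in train_launch.py):

import os
import torch
import torch.distributed as dist

local_rank = int(os.environ["LOCAL_RANK"])  # set by torch.distributed.launch --use_env
torch.cuda.set_device(local_rank)           # bind this process to its own GPU
dist.init_process_group(backend="nccl")     # rendezvous via the default env:// method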

Testing

For the BEHAVE dataset, we test on images from Kinect camera k1 and evaluate only on images where the object is occluded by less than 70%. Computing this occlusion ratio requires rendering full (unoccluded) object masks.

Our rendered full object masks can be downloaded here. Download and extract them to the same path where the BEHAVE sequences were extracted.
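Conceptually, the occlusion ratio compares the visible object mask against the rendered full mask. The sketch below only illustrates the idea; the function name and mask loading are hypothetical, not the repo's exact code:

import cv2

def occlusion_ratio(visible_mask_file, full_mask_file):
    """Fraction of the full object mask that is hidden in the visible mask."""
    vis = cv2.imread(visible_mask_file, cv2.IMREAD_GRAYSCALE) > 127
    full = cv2.imread(full_mask_file, cv2.IMREAD_GRAYSCALE) > 127
    full_area = full.sum()
    if full_area == 0:
        return 1.0  # object does not appear in the full render at all
    return 1.0 - (vis & full).sum() / full_area

# a frame is kept for evaluation only if occlusion_ratio(...) < 0.7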

Test on BEHAVE test set

After downloading the pretrained model, you can test one sequence from the BEHAVE data with:

python recon/recon_fit_behave.py chore-release --save_name chore-release -s [path to one sequence]

To align the reconstructions with the input images for BEHAVE data, set crop_cent to False when calling this function.

Evaluate

Run the following to compute the errors reported in Table 1:

python recon/evaluate.py 
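For intuition, a generic symmetric Chamfer distance over sampled point sets, the kind of surface error reported in the paper's tables, looks like this (an illustration, not the repo's exact evaluation code):

import numpy as np
from scipy.spatial import cKDTree

def chamfer_distance(points_a, points_b):
    """Symmetric Chamfer distance between (N, 3) and (M, 3) point sets."""
    d_ab, _ = cKDTree(points_b).query(points_a)  # nearest point in b for each point in a
    d_ba, _ = cKDTree(points_a).query(points_b)  # nearest point in a for each point in b
    return d_ab.mean() + d_ba.mean()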

Citation

If you use our code, please cite:

@inproceedings{xie2022chore,
    title = {CHORE: Contact, Human and Object REconstruction from a single RGB image},
    author = {Xie, Xianghui and Bhatnagar, Bharat Lal and Pons-Moll, Gerard},
    booktitle = {European Conference on Computer Vision ({ECCV})},
    month = {October},
    organization = {{Springer}},
    year = {2022},
}

If you use the BEHAVE dataset, please also cite:

@inproceedings{bhatnagar22behave,
    title = {BEHAVE: Dataset and Method for Tracking Human Object Interactions},
    author = {Bhatnagar, Bharat Lal and Xie, Xianghui and Petrov, Ilya and Sminchisescu, Cristian and Theobalt, Christian and Pons-Moll, Gerard},
    booktitle = {{IEEE} Conference on Computer Vision and Pattern Recognition ({CVPR})},
    month = {June},
    organization = {{IEEE}},
    year = {2022},
}

License

Copyright (c) 2022 Xianghui Xie, Max-Planck-Gesellschaft

Please read carefully the following terms and conditions and any accompanying documentation before you download and/or use this software and associated documentation files (the "Software").

The authors hereby grant you a non-exclusive, non-transferable, free of charge right to copy, modify, merge, publish, distribute, and sublicense the Software for the sole purpose of performing non-commercial scientific research, non-commercial education, or non-commercial artistic projects.

Any other use, in particular any use for commercial purposes, is prohibited. This includes, without limitation, incorporation in a commercial product, use in a commercial service, or production of other artefacts for commercial purposes.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

You understand and agree that the authors are under no obligation to provide either maintenance services, update services, notices of latent defects, or corrections of defects with regard to the Software. The authors nevertheless reserve the right to update, modify, or discontinue the Software at any time.

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. You agree to cite the CHORE: Contact, Human and Object REconstruction from a single RGB image paper in documents and papers that report on research using this Software.
