
What's in your hands? 3D Reconstruction of Generic Objects in Hands

Yufei Ye, Abhinav Gupta, Shubham Tulsiani in CVPR 2022

Our work aims to reconstruct hand-held objects from a single RGB image. In contrast to prior works, which typically assume a known 3D template and reduce the problem to 3D pose estimation, our method reconstructs generic hand-held objects without knowing their 3D templates.

[Project Page] [Video] [Colab Demo] [Demo Code] [Arxiv]

Installation

See install.md

Quick Start

  • Step by step interactive notebook

  • Or python script

    python -m demo.demo_image --filename demo/test.jpg --out output/ -e weights/
    

We also provide other example images (docs/demo_%02d.jpg) for you to play around with.
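If you want to run the demo over all of the bundled example images at once, here is a minimal sketch. It is a hypothetical helper, not part of the repo: it assumes the CLI flags shown in the Quick Start command above and that the extra images match docs/demo_*.jpg.

    # run_all_demos.py -- hypothetical helper, not part of the repo.
    # Loops over the bundled example images and invokes the demo script
    # with the same flags used in the Quick Start command above.
    import glob
    import subprocess

    for image in sorted(glob.glob("docs/demo_*.jpg")):
        subprocess.run(
            [
                "python", "-m", "demo.demo_image",
                "--filename", image,
                "--out", "output/",
                "-e", "weights/",
            ],
            check=True,  # stop if any demo run fails
        )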

Train your own model

Preprocess data

preprocess.md (Coming Soon)

Start training

# obman
python -m models.ihoi --config experiments/obman.yaml  --slurm 

# finetune on MOW
python -m models.ihoi --config experiments/mow.yaml  --ckpt PATH_TO_OBMAN_MODEL/obman/checkpoints/last.ckpt --slurm

# finetune on HO3D
python -m models.ihoi --config experiments/ho3d.yaml  --ckpt PATH_TO_OBMAN_MODEL/obman/checkpoints/last.ckpt --slurm
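If you prefer to launch the whole pipeline from one script, a minimal sketch is shown below. It is a hypothetical wrapper, not part of the repo: it mirrors the commands above, runs locally (drop or add --slurm as appropriate for your cluster), and keeps the PATH_TO_OBMAN_MODEL placeholder for you to fill in.

    # train_pipeline.py -- hypothetical wrapper, not part of the repo.
    # Pretrains on ObMan, then finetunes on MOW and HO3D from the
    # resulting checkpoint, mirroring the commands listed above.
    import subprocess

    OBMAN_CKPT = "PATH_TO_OBMAN_MODEL/obman/checkpoints/last.ckpt"

    def run(*args):
        """Run one training command and fail loudly if it exits non-zero."""
        subprocess.run(["python", "-m", "models.ihoi", *args], check=True)

    # 1. Pretrain on ObMan.
    run("--config", "experiments/obman.yaml")

    # 2. Finetune on MOW and HO3D from the ObMan checkpoint.
    for cfg in ("experiments/mow.yaml", "experiments/ho3d.yaml"):
        run("--config", cfg, "--ckpt", OBMAN_CKPT)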

Citation

If you find this code helpful, please consider citing:


@inproceedings{ye2022hand,
    author = {Ye, Yufei
              and Gupta, Abhinav
              and Tulsiani, Shubham},
    title = {What's in your hands? 3D Reconstruction of Generic Objects in Hands},
    booktitle = {CVPR},
    year={2022}
}