
HaMeR: Hand Mesh Recovery

Code repository for the paper: Reconstructing Hands in 3D with Transformers

Georgios Pavlakos, Dandan Shan, Ilija Radosavovic, Angjoo Kanazawa, David Fouhey, Jitendra Malik

Links: arXiv paper · project website · Colab demo · Hugging Face Spaces demo


News

  • [2024/06] HaMeR received the 2nd place award in the Ego-Pose Hands task of the Ego-Exo4D Challenge! Please check the validation report.
  • [2024/05] We have released the evaluation pipeline!
  • [2024/05] We have released the HInt dataset annotations! Please check here.
  • [2023/12] Original release!

Installation

First you need to clone the repo:

git clone --recursive https://github.com/geopavlakos/hamer.git
cd hamer

We recommend creating a virtual environment for HaMeR. You can use venv:

python3.10 -m venv .hamer
source .hamer/bin/activate

or alternatively conda:

conda create --name hamer python=3.10
conda activate hamer

Then, install the rest of the dependencies. The commands below assume CUDA 11.7; adapt the PyTorch index URL to your CUDA version:

pip install torch torchvision --index-url https://download.pytorch.org/whl/cu117
pip install -e .[all]
pip install -v -e third-party/ViTPose
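
As a quick sanity check (not part of the official instructions), you can confirm that the CUDA-enabled PyTorch build was installed and can see your GPU:

import torch

print(torch.__version__)          # should match the wheel installed above
print(torch.cuda.is_available())  # True if the CUDA build can see your GPU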

You also need to download the trained models:

bash fetch_demo_data.sh

Besides these files, you also need to download the MANO model. Please visit the MANO website and register to get access to the downloads section. We only require the right hand model. You need to put MANO_RIGHT.pkl under the _DATA/data/mano folder.
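
A small sanity check for the MANO setup (the path comes from the instructions above; the snippet itself is just a convenience, not part of the original setup):

from pathlib import Path

mano_path = Path("_DATA/data/mano/MANO_RIGHT.pkl")
print(mano_path, "found" if mano_path.exists() else "missing")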

Docker Compose

If you wish to use HaMeR with Docker, start the container with:

docker compose -f ./docker/docker-compose.yml up -d

After the image is built successfully, open a shell inside the container:

docker compose -f ./docker/docker-compose.yml exec hamer-dev /bin/bash

Continue with the installation steps:

bash fetch_demo_data.sh

Demo

python demo.py \
    --img_folder example_data --out_folder demo_out \
    --batch_size=48 --side_view --save_mesh --full_frame
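
If you prefer to use the model from Python rather than through demo.py, a minimal sketch is shown below. It assumes the load_hamer helper and DEFAULT_CHECKPOINT constant used by demo.py; check demo.py for the exact entry points in your version of the code.

# Minimal sketch: load the pretrained HaMeR model in Python.
# Assumes hamer.models exposes load_hamer and DEFAULT_CHECKPOINT as in demo.py;
# verify these names against demo.py before relying on them.
import torch
from hamer.models import load_hamer, DEFAULT_CHECKPOINT

model, model_cfg = load_hamer(DEFAULT_CHECKPOINT)
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device).eval()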

HInt Dataset

We have released the annotations for the HInt dataset. Please follow the instructions here.

Training

First, download the training data to ./hamer_training_data/ by running:

bash fetch_training_data.sh

Then you can start training using the following command:

python train.py exp_name=hamer data=mix_all experiment=hamer_vit_transformer trainer=gpu launcher=local

Checkpoints and logs will be saved to ./logs/.

Evaluation

Download the evaluation metadata to ./hamer_evaluation_data/. Additionally, download the FreiHAND, HO-3D, and HInt dataset images and update the corresponding paths in hamer/configs/datasets_eval.yaml.

Run evaluation on multiple datasets as follows:

python eval.py --dataset 'FREIHAND-VAL,HO3D-VAL,NEWDAYS-TEST-ALL,NEWDAYS-TEST-VIS,NEWDAYS-TEST-OCC,EPICK-TEST-ALL,EPICK-TEST-VIS,EPICK-TEST-OCC,EGO4D-TEST-ALL,EGO4D-TEST-VIS,EGO4D-TEST-OCC'

Results for HInt are stored in results/eval_regression.csv. For FreiHAND and HO-3D, the output is a .json file that can be used with their respective evaluation pipelines.
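
To inspect the aggregated numbers afterwards, you can load the CSV with pandas (a small convenience sketch; the exact column names depend on the evaluation code):

import pandas as pd

results = pd.read_csv("results/eval_regression.csv")
print(results)  # regression metrics written by eval.py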

Acknowledgements

Parts of the code are taken or adapted from the following repos:

Additionally, we thank StabilityAI for a generous compute grant that enabled this work.

Citing

If you find this code useful for your research, please consider citing the following paper:

@inproceedings{pavlakos2024reconstructing,
    title={Reconstructing Hands in 3{D} with Transformers},
    author={Pavlakos, Georgios and Shan, Dandan and Radosavovic, Ilija and Kanazawa, Angjoo and Fouhey, David and Malik, Jitendra},
    booktitle={CVPR},
    year={2024}
}
