
questions about root depth #36

Open
yanglilian opened this issue Jan 4, 2021 · 8 comments
@yanglilian

It seems that this project uses the absolute depth obtained from RootNet for all datasets in the evaluation procedure, so how do you train RootNet for the hand task? As far as I know, RootNet is designed for human body pose. Can you release the training code for RootNet on hand pose? Thanks.

@mks0601 (Collaborator) commented Jan 4, 2021

You don't have to change much from the RootNet repo. All you have to do is change bbox_real to (300, 300) and write data/InterHand26M/InterHand26M.py in the RootNet repo. (300, 300) is the hand scale along the x- and y-axes in millimeters. You can write data/InterHand26M/InterHand26M.py by referring to the other dataloaders of the RootNet repo and this repo.
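
For reference, here is a minimal sketch of what such a dataloader could look like; it is not the official implementation. The bbox_real = (300, 300) attribute encodes the 300 mm hand scale mentioned above, while the field names (img_path, bbox, root_cam, f, c) and the annotation file name are assumptions modeled loosely on the other RootNet dataloaders, so check e.g. data/Human36M/Human36M.py in the RootNet repo for the exact interface your version expects.

```python
# Hypothetical data/InterHand26M/InterHand26M.py skeleton for the RootNet repo.
# Annotation file name and dictionary keys are assumptions, not the official code.
import os
import json
import numpy as np

class InterHand26M:
    def __init__(self, data_split):
        self.data_split = data_split                     # 'train', 'val' or 'test'
        self.img_dir = os.path.join('..', 'data', 'InterHand26M', 'images')
        self.annot_dir = os.path.join('..', 'data', 'InterHand26M', 'annotations')
        self.joint_num = 21                              # joints per hand
        self.root_idx = 20                               # wrist index (assumption)
        self.bbox_real = (300, 300)                      # hand scale in mm along x and y
        self.data = self.load_data()

    def load_data(self):
        # Hypothetical annotation file; adapt to the JSON files shipped with InterHand2.6M.
        annot_path = os.path.join(self.annot_dir, self.data_split, 'annotation.json')
        with open(annot_path) as f:
            annots = json.load(f)

        data = []
        for ann in annots:
            data.append({
                'img_path': os.path.join(self.img_dir, ann['file_name']),
                'bbox': np.array(ann['bbox'], dtype=np.float32),          # (x, y, w, h)
                'root_cam': np.array(ann['root_cam'], dtype=np.float32),  # wrist in camera space (mm)
                'f': np.array(ann['focal'], dtype=np.float32),            # focal length (fx, fy)
                'c': np.array(ann['princpt'], dtype=np.float32),          # principal point (cx, cy)
            })
        return data
```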

@yanglilian (Author)

> You don't have to change much from the RootNet repo. All you have to do is change bbox_real to (300, 300) and write data/InterHand26M/InterHand26M.py in the RootNet repo. (300, 300) is the hand scale along the x- and y-axes in millimeters. You can write data/InterHand26M/InterHand26M.py by referring to the other dataloaders of the RootNet repo and this repo.

OK, I will try. Another question: have you ever tried merging the absolute-depth prediction branch into this project?

@mks0601 (Collaborator) commented Jan 4, 2021

I found the code. Use this one.
https://drive.google.com/drive/folders/1dYCgmHqW1qoD3osoB_Z4XWiaKcG7__-8?usp=sharing

You can set up the paths by reading the README of this repo.

@mks0601 (Collaborator) commented Jan 4, 2021

No, I haven't. RootNet is my previous work, and it can be applied to 3D body/hand pose/shape estimation.

@yanglilian (Author)

> I found the code. Use this one.
> https://drive.google.com/drive/folders/1dYCgmHqW1qoD3osoB_Z4XWiaKcG7__-8?usp=sharing
>
> You can set up the paths by reading the README of this repo.

OK, I will try. Thanks.

@shreyashampali

Hi @mks0601,
Thanks for this great work and dataset. I have a question regarding the MPJPE numbers in your paper (Table 4 in the ECCV paper) for the InterHand2.6M dataset. Are these numbers calculated using the RootNet output or using ground-truth root depths?

@mks0601 (Collaborator) commented Feb 1, 2021

All numbers in the paper are obtained using RootNet.
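
For readers wondering what that implies for the metric, below is an illustrative sketch of how absolute joints can be reconstructed from a root-relative prediction and a root position estimated by RootNet before computing MPJPE. It is not the repo's evaluation code, and the exact protocol (e.g. any root alignment) is the one defined in the paper; the function and variable names are hypothetical.

```python
import numpy as np

def mpjpe_with_estimated_root(rel_pred, root_pred, joints_gt):
    """Illustrative MPJPE (mm) when absolute joints are reconstructed from a
    root-relative prediction plus a root position estimated by RootNet.

    rel_pred  : (J, 3) root-relative joint prediction from the pose network
    root_pred : (3,)   absolute root position from RootNet, camera space, mm
    joints_gt : (J, 3) ground-truth absolute joints, camera space, mm
    """
    abs_pred = rel_pred + root_pred[None, :]   # translate prediction into camera space
    return np.linalg.norm(abs_pred - joints_gt, axis=1).mean()
```

In this sketch, swapping root_pred for the ground-truth root removes the root-localization error from the result, which is why RootNet-based numbers and ground-truth-root numbers are not directly comparable.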

@dlwsds commented Mar 10, 2021

> I found the code. Use this one.
> https://drive.google.com/drive/folders/1dYCgmHqW1qoD3osoB_Z4XWiaKcG7__-8?usp=sharing
>
> You can set up the paths by reading the README of this repo.

Hi @mks0601,
I ran train.py in InterHand2.6M-RootNet, but I get this error:
[Errno 2] No such file or directory: '../data/RHD/data/RHD_training.json'
Could you tell me how to get this file?
