MuViHand is a synthetic hand pose dataset created with Mixamo, a free web-based service for 3D characters and animations; background images from the Pxfuel website; and Blender, a 3D computer graphics application used to render the data. The data were captured from 10 distinct subjects with 12 cameras: 6 fixed cameras capture the whole body and the other 6 track a specific hand. The dataset provides 2D and 3D locations for 21 hand keypoints.
Please see this paper for more details.
- Download the MuViHand dataset here;
- Unzip dataverse_files.zip to /path/to/MuViHand;
- Your dataset structure should look like this (a sketch for extracting the per-subject .rar archives follows this list):
MuViHand/
    F1_Subject.01.rar    # Fixed cameras data for subject 1
    F2_Subject.01.rar    # Fixed cameras data for subject 1
    T_Subject.01.rar     # Tracking cameras data for subject 1
    ...
    F1_Subject.10.rar    # Fixed cameras data for subject 10
    F2_Subject.10.rar    # Fixed cameras data for subject 10
    T_Subject.10.rar     # Tracking cameras data for subject 10
- A Python script showing basic use of the data can be found here;
- Keypoints available: 0: wrist, 1-4: thumb [palm to tip], 5-8: index, 9-12: middle, 13-16: ring, 17-20: pinkie;
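
If you want to unpack the per-subject archives programmatically, the minimal sketch below walks the dataset root and extracts each .rar file into a same-named folder. It assumes the directory layout shown above, a placeholder path `/path/to/MuViHand`, and the third-party `rarfile` package (which needs an unrar-compatible backend installed); none of this is shipped with the dataset itself.

```python
# Minimal sketch: extract every per-subject .rar archive under the dataset root.
# Assumes the layout shown above; `rarfile` is a third-party package
# (pip install rarfile) and requires an unrar-compatible tool on the system.
import glob
import os

import rarfile

DATA_ROOT = "/path/to/MuViHand"  # placeholder: where dataverse_files.zip was unzipped

for archive_path in sorted(glob.glob(os.path.join(DATA_ROOT, "*.rar"))):
    # e.g. F1_Subject.01.rar is extracted into F1_Subject.01/
    dest_dir = os.path.splitext(archive_path)[0]
    os.makedirs(dest_dir, exist_ok=True)
    with rarfile.RarFile(archive_path) as rf:
        rf.extractall(dest_dir)
    print(f"Extracted {os.path.basename(archive_path)} -> {dest_dir}")
```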
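
For visualization or post-processing, the sketch below encodes the keypoint ordering listed above as joint names plus parent-child bone pairs for drawing the hand skeleton. The joint names and the connectivity are an illustrative convention, not identifiers used by the dataset files.

```python
# The 21-keypoint layout listed above, as names and skeleton connections.
KEYPOINT_NAMES = (
    ["wrist"]
    + [f"thumb_{i}" for i in range(1, 5)]   # 1-4: thumb, palm to tip
    + [f"index_{i}" for i in range(1, 5)]   # 5-8: index
    + [f"middle_{i}" for i in range(1, 5)]  # 9-12: middle
    + [f"ring_{i}" for i in range(1, 5)]    # 13-16: ring
    + [f"pinky_{i}" for i in range(1, 5)]   # 17-20: pinky
)

# Bones as (parent, child) index pairs, useful for drawing the skeleton.
SKELETON = [
    (0, 1), (1, 2), (2, 3), (3, 4),         # thumb
    (0, 5), (5, 6), (6, 7), (7, 8),         # index
    (0, 9), (9, 10), (10, 11), (11, 12),    # middle
    (0, 13), (13, 14), (14, 15), (15, 16),  # ring
    (0, 17), (17, 18), (18, 19), (19, 20),  # pinky
]

assert len(KEYPOINT_NAMES) == 21 and len(SKELETON) == 20
```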
Please cite our paper if this dataset helps your research.
@article{khaleghi2022multi,
  title={Multi-view video-based 3D hand pose estimation},
  author={Khaleghi, Leyla and Sepas-Moghaddam, Alireza and Marshall, Joshua and Etemad, Ali},
  journal={IEEE Transactions on Artificial Intelligence},
  year={2022},
  publisher={IEEE}
}
In this video, you can learn how to release your own dataset on Dataverse.