PARE: Part Attention Regressor for 3D Human Body Estimation
Muhammed Kocabas, Chun-Hao Paul Huang, Otmar Hilliges, Michael J. Black
International Conference on Computer Vision (ICCV), 2021
PARE is an occlusion-robust human pose and shape estimation method. This repository contains the demo and evaluation code for PARE, implemented in PyTorch.
- 13/10/2021: Demo and evaluation code are released.
PARE has been implemented and tested on Ubuntu 18.04 with Python >= 3.7. If you don't have a suitable device, try running our Colab demo.
Clone the repo:
git clone https://github.com/mkocabas/PARE.git
Install the requirements using virtualenv or conda:
# pip
source scripts/install_pip.sh
# conda
source scripts/install_conda.sh
First, you need to download the required data (i.e., our trained model and the SMPL body model parameters); the download is approximately 1.3GB. To do this, simply run:
source scripts/prepare_data.sh
Run the command below. See scripts/demo.py for more options.
python scripts/demo.py --vid_file data/sample_video.mp4 --output_folder logs/demo
Sample demo output:
To run the demo on a folder of images instead of a video:
python scripts/demo.py --image_folder <path to image folder> --output_folder logs/demo
If the demo runs successfully, it creates a file named pare_output.pkl in the --output_folder.
We can inspect the contents of this file as follows:
>>> import joblib # native pickle works here as well
>>> output = joblib.load('pare_output.pkl')
>>> print(output.keys())
dict_keys([1, 2, 3, 4]) # these are the track ids for each subject appearing in the video
>>> for k,v in output[1].items(): print(k,v.shape)
pred_cam (n_frames, 3) # weak perspective camera parameters in cropped image space (s,tx,ty)
orig_cam (n_frames, 4) # weak perspective camera parameters in original image space (sx,sy,tx,ty)
verts (n_frames, 6890, 3) # SMPL mesh vertices
pose (n_frames, 72) # SMPL pose parameters
betas (n_frames, 10) # SMPL body shape parameters
joints3d (n_frames, 49, 3) # SMPL 3D joints
joints2d (n_frames, 21, 3) # 2D keypoint detections from STAF if pose tracking is enabled, otherwise None
bboxes (n_frames, 4) # bbox detections (cx,cy,w,h)
frame_ids (n_frames,) # frame ids in which subject with tracking id #1 appears
smpl_joints2d (n_frames, 49, 2) # SMPL 2D joints
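As an example of consuming this output, the SMPL parameters can be fed back into an SMPL layer to rebuild the meshes. Below is a minimal sketch using the smplx package; the model path is an assumption based on the data/ layout created by prepare_data.sh:

import joblib
import torch
import smplx  # pip install smplx

output = joblib.load('logs/demo/pare_output.pkl')
track = output[list(output.keys())[0]]  # pick the first track id

# Path assumed from the layout created by scripts/prepare_data.sh
smpl = smplx.SMPL('data/body_models/smpl')

pose = torch.from_numpy(track['pose']).float()    # (n_frames, 72) axis-angle
betas = torch.from_numpy(track['betas']).float()  # (n_frames, 10)

out = smpl(global_orient=pose[:, :3],  # first 3 values: root orientation
           body_pose=pose[:, 3:],      # remaining 69: 23 joints x 3
           betas=betas)
print(out.vertices.shape)  # (n_frames, 6890, 3); should closely match track['verts']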
Training instructions will follow soon.
You need to download the 3DPW and 3DOH datasets before running the evaluation script. After the download, the data folder should look like this:
data/
├── body_models
│ └── smpl
├── dataset_extras
├── dataset_folders
│ ├── 3doh
│ └── 3dpw
└── pare
└── checkpoints
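If you want to sanity-check the layout before evaluating, a quick sketch (paths taken from the tree above):

import os

# Expected directories based on the layout above; adjust if you placed data elsewhere.
expected = [
    'data/body_models/smpl',
    'data/dataset_extras',
    'data/dataset_folders/3doh',
    'data/dataset_folders/3dpw',
    'data/pare/checkpoints',
]
for path in expected:
    print(('OK      ' if os.path.isdir(path) else 'MISSING ') + path)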
Then, you can evaluate PARE by running:
python scripts/eval.py \
--cfg data/pare/checkpoints/pare_config.yaml \
--opts DATASET.VAL_DS 3doh_3dpw-all
To evaluate the version trained with 3DPW (PARE w. 3DPW in the table below), run:
python scripts/eval.py \
--cfg data/pare/checkpoints/pare_w_3dpw_config.yaml \
--opts DATASET.VAL_DS 3doh_3dpw-all
You should obtain the results in the table below on the 3DPW test set (all errors in mm):

| | MPJPE | PA-MPJPE | V2V |
|---|---|---|---|
| PARE | 82 | 50.9 | 97.9 |
| PARE (w. 3DPW) | 74.5 | 46.5 | 88.6 |
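For reference: MPJPE is the mean Euclidean distance between predicted and ground-truth 3D joints, and PA-MPJPE is the same error after a Procrustes (similarity) alignment that removes global rotation, translation, and scale. A minimal NumPy sketch of both metrics (not the repo's exact evaluation code):

import numpy as np

def mpjpe(pred, gt):
    # pred, gt: (J, 3) joint positions in mm; mean per-joint position error
    return np.linalg.norm(pred - gt, axis=-1).mean()

def pa_mpjpe(pred, gt):
    # Procrustes-align pred to gt (rotation, translation, scale), then MPJPE
    mu_p, mu_g = pred.mean(0), gt.mean(0)
    p, g = pred - mu_p, gt - mu_g
    U, S, Vt = np.linalg.svd(p.T @ g)
    d = np.sign(np.linalg.det(U @ Vt))       # guard against reflections
    R = U @ np.diag([1.0, 1.0, d]) @ Vt
    scale = (S * np.array([1.0, 1.0, d])).sum() / (p ** 2).sum()
    return mpjpe(scale * p @ R + mu_g, gt)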
We provide a script to run the occlusion sensitivity analysis proposed in our paper. The analysis slides an occluding patch over the image and visualizes how the human pose and shape estimates are affected.
python scripts/occlusion_analysis.py \
--cfg data/pare/checkpoints/pare_config.yaml \
--ckpt data/pare/checkpoints/pare_checkpoint.ckpt
Sample occlusion test output:
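The core loop can be sketched as follows; `model` here is a stand-in for a PARE forward pass returning 3D joints and is an assumption, not the repo's API:

import numpy as np

def occlusion_sensitivity(model, image, gt_joints3d, patch=40, stride=40):
    """Slide a gray square over `image` (H x W x 3 uint8) and record how much
    the 3D joint error grows when each location is occluded."""
    H, W = image.shape[:2]
    ys = range(0, H - patch + 1, stride)
    xs = range(0, W - patch + 1, stride)
    heatmap = np.zeros((len(ys), len(xs)))
    for i, y in enumerate(ys):
        for j, x in enumerate(xs):
            occluded = image.copy()
            occluded[y:y + patch, x:x + patch] = 127  # neutral gray patch
            pred_joints3d = model(occluded)           # hypothetical call
            heatmap[i, j] = np.linalg.norm(pred_joints3d - gt_joints3d,
                                           axis=-1).mean()
    return heatmap  # high values = estimate is sensitive to occlusion there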
If you use this code, please cite:

@inproceedings{Kocabas_PARE_2021,
  title = {{PARE}: Part Attention Regressor for {3D} Human Body Estimation},
  author = {Kocabas, Muhammed and Huang, Chun-Hao P. and Hilliges, Otmar and Black, Michael J.},
  booktitle = {Proc. International Conference on Computer Vision (ICCV)},
  pages = {11127--11137},
  month = oct,
  year = {2021}
}
This code is available for non-commercial scientific research purposes as defined in the LICENSE file. By downloading and using this code you agree to the terms in the LICENSE. Third-party datasets and software are subject to their respective licenses.
Each file indicates where a function or script is borrowed from external sources. Consider citing those works if you use them in your project.
For questions, please contact [email protected]
For commercial licensing (and all related questions for business applications), please contact [email protected].