
# Download

## Getting Started

  1. Create the folders that will store pretrained models and datasets.

    export REPO_DIR=$PWD
    mkdir -p $REPO_DIR/models  # pre-trained models
    mkdir -p $REPO_DIR/datasets  # datasets
  2. Download pretrained models. Pretrained weights and config YAML files for HRNet and AggPose can be downloaded from the following links:
    https://github.com/HRNet/HRNet-Human-Pose-Estimation
    https://github.com/PediaMedAI/AggPose

    Our trained models can be downloaded from here:
    https://drive.google.com/drive/folders/1ATX1FXS1hz1HIN_LKaDEEuudTiBMdMN4?usp=drive_link

    The recommended data structure follows the hierarchy below. The location where the models are stored can be changed by modifying src/config_path.py.

    ${REPO_DIR}  
    |-- models  
    |   |-- deformer_release
    |   |   |-- deformer_h36m_state_dict.bin
    |   |   |-- deformer_h36m_state_dict_s.bin
    |   |-- backbone
    |   |   |-- res50_256x192_d256x3_adam_lr1e-3.yaml
    |   |   |-- pose_resnet_50_256x192.pth.tar
    |   |   |-- w48_256x192_adam_lr1e-3.yaml
    |   |   |-- pose_hrnet_w48_256x192.pth
    |   |   |-- w48_384x288_adam_lr1e-3.yaml
    |   |   |-- pose_hrnet_w48_384x288.pth
    |   |   |-- aggpose_L_256x192_adamw_lr1e-3.yaml
    |   |   |-- AggPose-L_256x192_COCO2017.pth
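
After downloading, the layout can be sanity-checked with a short script. This is a minimal sketch, not part of the repository: the file names come from the tree above, and `check_models` is a hypothetical helper.

```python
import os

# Expected model files, mirroring the tree above (adjust if your
# checkpoints differ).
EXPECTED_MODEL_FILES = [
    "models/deformer_release/deformer_h36m_state_dict.bin",
    "models/deformer_release/deformer_h36m_state_dict_s.bin",
    "models/backbone/res50_256x192_d256x3_adam_lr1e-3.yaml",
    "models/backbone/pose_resnet_50_256x192.pth.tar",
    "models/backbone/w48_256x192_adam_lr1e-3.yaml",
    "models/backbone/pose_hrnet_w48_256x192.pth",
    "models/backbone/w48_384x288_adam_lr1e-3.yaml",
    "models/backbone/pose_hrnet_w48_384x288.pth",
    "models/backbone/aggpose_L_256x192_adamw_lr1e-3.yaml",
    "models/backbone/AggPose-L_256x192_COCO2017.pth",
]

def check_models(repo_dir):
    """Return the list of expected model files missing under repo_dir."""
    return [f for f in EXPECTED_MODEL_FILES
            if not os.path.isfile(os.path.join(repo_dir, f))]

if __name__ == "__main__":
    for f in check_models(os.environ.get("REPO_DIR", ".")):
        print("missing:", f)
```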
    
  3. Download SMPL models from their official websites

    To run our code smoothly, please visit the following websites to download SMPL and MANO models.

    • Download basicModel_neutral_lbs_10_207_0_v1.0.0.pkl from SMPLify, and place it at ${REPO_DIR}/src/modeling/data.
    • Download J_regressor_extra.npy, J_regressor_h36m.npy and mesh_downsampling.npz from https://github.com/nkolot/GraphCMR/tree/master/data, and place them under the ${REPO_DIR}/src/modeling/data directory. The data structure should follow the hierarchy below.
    ${REPO_DIR}  
    |-- src  
    |   |-- modeling
    |   |   |-- data
    |   |   |   |-- basicModel_neutral_lbs_10_207_0_v1.0.0.pkl
    |   |   |   |-- J_regressor_extra.npy
    |   |   |   |-- J_regressor_h36m.npy
    |   |   |   |-- mesh_downsampling.npz
    |-- datasets
    |-- ... 
    |-- ... 
    

    Please check src/modeling/data/README.md for further details.
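
To confirm the assets ended up in the right place, the following minimal sketch (not part of the repository; `check_data_files` is a hypothetical helper) checks for the files listed in the tree above:

```python
import os

# Expected SMPL-related assets, mirroring the tree above.
EXPECTED_DATA_FILES = [
    "src/modeling/data/basicModel_neutral_lbs_10_207_0_v1.0.0.pkl",
    "src/modeling/data/J_regressor_extra.npy",
    "src/modeling/data/J_regressor_h36m.npy",
    "src/modeling/data/mesh_downsampling.npz",
]

def check_data_files(repo_dir):
    """Return the expected SMPL assets missing under repo_dir."""
    return [f for f in EXPECTED_DATA_FILES
            if not os.path.isfile(os.path.join(repo_dir, f))]

if __name__ == "__main__":
    for f in check_data_files(os.environ.get("REPO_DIR", ".")):
        print("missing:", f)
```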

  4. Download datasets and pseudo labels for training.

    We use the same data as METRO.

    Please visit their project page to download the datasets and annotations for the experiments.

  5. Set the path configuration file and preprocess image datasets.

    Please set image_dir in src/config_path.py and modify the dataset paths if needed. Since loading images from the .tsv files is slow, we provide a command to save the images as .png files:

python -m torch.distributed.launch --nproc_per_node=8 src/tools/run_deformer_bodymesh2.py \
         --train_yaml Tax-H36m-coco40k-Muco-UP-Mpii/train.yaml \
         --val_yaml human3.6m/valid.protocol2.yaml \
         --num_workers 16 \
         --run_data_process \
         --per_gpu_train_batch_size 16 \
         --per_gpu_eval_batch_size 16 \
         --data_dir 'path_to_datasets'
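
For reference, a path configuration along the lines of src/config_path.py might look as follows. This is a hypothetical sketch: only image_dir is named in the instructions above, and the other variable names (dataset_dir, model_dir) are illustrative assumptions to be matched against the actual file.

```python
# Hypothetical sketch of src/config_path.py; only image_dir is named in
# the instructions above -- the other entries are illustrative assumptions.
import os

REPO_DIR = os.environ.get("REPO_DIR", os.getcwd())

# Where the preprocessed .png images are stored (set this before training).
image_dir = os.path.join(REPO_DIR, "datasets", "images")

# Dataset and model roots; adjust if your layout differs.
dataset_dir = os.path.join(REPO_DIR, "datasets")
model_dir = os.path.join(REPO_DIR, "models")
```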