Clone this repo.
git clone https://github.com/tzt101/MichiGAN.git
cd MichiGAN/
This code requires PyTorch 1.0 and Python 3+. Please install the dependencies with
pip install -r requirements.txt
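Before installing further dependencies, a quick interpreter check can catch version mismatches early. This is an optional sanity-check snippet, not part of the repository:

```python
import sys

def check_python_version(min_major=3, min_minor=5):
    """Return True if the running interpreter meets the minimum version.

    The repo requires Python 3+; PyTorch 1.0 additionally assumes >= 3.5.
    """
    return sys.version_info >= (min_major, min_minor)

if __name__ == "__main__":
    assert check_python_version(), "Please run under Python 3.5 or newer"
    print("Python version OK:", sys.version.split()[0])
```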
If necessary, download the Synchronized-BatchNorm-PyTorch repository.
cd models/networks/
git clone https://github.com/vacancy/Synchronized-BatchNorm-PyTorch
cp -rf Synchronized-BatchNorm-PyTorch/sync_batchnorm .
cd ../../
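The copy step above must place the `sync_batchnorm` package inside `models/networks/`. A small illustrative helper (not part of the repo) can confirm that it landed in the right place:

```python
from pathlib import Path

def sync_batchnorm_installed(repo_root="."):
    """Check that sync_batchnorm was copied into models/networks/.

    Purely illustrative: it only looks for the package directory and its
    __init__.py where the cp command above places them.
    """
    pkg = Path(repo_root) / "models" / "networks" / "sync_batchnorm"
    return pkg.is_dir() and (pkg / "__init__.py").is_file()
```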
The FFHQ dataset can be downloaded from Baidu Netdisk with the extraction code ichc. Once downloaded, specify the dataset root through --data_dir.
Once the dataset is ready, the result images can be generated using pretrained models.
- Download the pretrained models from the Google Drive Folder and save them in 'checkpoints/MichiGAN/'.
- Generate a single image using the pretrained model:
python inference.py --name MichiGAN --gpu_ids 0 --inference_ref_name 67172 --inference_tag_name 67172 --inference_orient_name 67172 --netG spadeb --which_epoch 50 --use_encoder --noise_background --expand_mask_be --expand_th 5 --use_ig --load_size 512 --crop_size 512 --add_feat_zeros --data_dir [path_to_dataset]
- The output images are stored in ./inference_samples/ by default.
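To run inference over several sample IDs, the single-image command above can be assembled programmatically. `build_inference_cmd` below is a hypothetical helper, not part of the repo; it only mirrors the flags shown in the example, with all three inference_* options pointing at the same sample:

```python
def build_inference_cmd(sample_id, data_dir, which_epoch=50):
    """Assemble the inference command line for one sample ID."""
    return [
        "python", "inference.py",
        "--name", "MichiGAN",
        "--gpu_ids", "0",
        "--inference_ref_name", str(sample_id),
        "--inference_tag_name", str(sample_id),
        "--inference_orient_name", str(sample_id),
        "--netG", "spadeb",
        "--which_epoch", str(which_epoch),
        "--use_encoder", "--noise_background",
        "--expand_mask_be", "--expand_th", "5",
        "--use_ig",
        "--load_size", "512", "--crop_size", "512",
        "--add_feat_zeros",
        "--data_dir", data_dir,
    ]
```

Each returned list can then be passed to `subprocess.run(cmd)` in a loop over sample IDs.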
New models can be trained with the following command.
python train.py --name [name_experiment] --batchSize 8 --no_confidence_loss --gpu_ids 0,1,2,3,4,5,6,7 --no_style_loss --no_rgb_loss --no_content_loss --use_encoder --wide_edge 2 --no_background_loss --noise_background --random_expand_mask --use_ig --load_size 568 --crop_size 512 --data_dir [path_to_dataset] --checkpoints_dir ./checkpoints
[name_experiment] is the directory name under which the checkpoint files are saved. If you want to train the model with the orientation inpainting model (option --use_ig), please first download the pretrained inpainting model from the Google Drive Folder and save it in ./checkpoints/[name_experiment]/.
- train.py, inference.py: the entry points for training and inference.
- trainers/pix2pix_trainer.py: harnesses and reports the progress of training.
- models/pix2pix_model.py: creates the networks and computes the losses.
- models/networks/: defines the architecture of all models.
- options/: creates option lists using the argparse package. More options are dynamically added in other files as well. Please see the section below.
- data/: defines the classes for loading data.
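The pattern of dynamically adding options (which this code inherits from SPADE) can be sketched roughly as follows. The class and flag names here are illustrative, not the repo's actual API: a base options class collects common flags, then hands the parser to a model class so it can append its own:

```python
import argparse

class BaseOptions:
    """Collect common flags, then let a model class add its own."""

    def initialize(self, parser):
        parser.add_argument("--name", type=str, default="MichiGAN")
        parser.add_argument("--batchSize", type=int, default=8)
        return parser

    def gather_options(self, model_cls, argv=None):
        parser = argparse.ArgumentParser()
        parser = self.initialize(parser)
        # The model appends its own options dynamically, as in SPADE.
        parser = model_cls.modify_commandline_options(parser)
        return parser.parse_args(argv)

class ExampleModel:
    @staticmethod
    def modify_commandline_options(parser):
        parser.add_argument("--use_ig", action="store_true")
        return parser
```

This keeps model-specific flags next to the model that uses them instead of in one monolithic option list.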
This code borrows heavily from SPADE. We thank Jiayuan Mao for his Synchronized Batch Normalization code.