diff --git a/README.md b/README.md
index ec8de652..17419c67 100644
--- a/README.md
+++ b/README.md
@@ -98,7 +98,7 @@ cd CUT
 
 ### CUT and FastCUT Training and Test
 
-- Download the grumpifycat dataset (Fig 8 of the paper. Russian Blue -> Grumpy Cats)
+- Download the `grumpifycat` dataset (Fig 8 of the paper. Russian Blue -> Grumpy Cats)
 ```bash
 bash ./datasets/download_cut_dataset.sh grumpifycat
 ```
@@ -119,7 +119,7 @@ The checkpoints will be stored at `./checkpoints/grumpycat_*/web`.
 
 - Test the CUT model:
 
 ```bash
-python test.py --dataroot ./datasets/grumpifycat --name grumpycat_CUT --CUT_mode CUT
+python test.py --dataroot ./datasets/grumpifycat --name grumpycat_CUT --CUT_mode CUT --phase train
 ```
 
@@ -177,4 +177,4 @@ If you use this code for your research, please cite our [paper](https://arxiv.or
 
 ### Acknowledgments
 
-We thank Allan Jabri and Phillip Isola for helpful discussion and feedback. Our code is developed based on [pytorch-CycleGAN-and-pix2pix](https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix). We also thank [pytorch-fid](https://github.com/mseitzer/pytorch-fid) for FID computation and [drn](https://github.com/fyu/drn) for mIoU computation, and [stylegan2-pytorch](https://github.com/rosinality/stylegan2-pytorch/) for the PyTorch implementation of StyleGAN2 used in single-image translation.
+We thank Allan Jabri and Phillip Isola for helpful discussion and feedback. Our code is developed based on [pytorch-CycleGAN-and-pix2pix](https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix). We also thank [pytorch-fid](https://github.com/mseitzer/pytorch-fid) for FID computation, [drn](https://github.com/fyu/drn) for mIoU computation, and [stylegan2-pytorch](https://github.com/rosinality/stylegan2-pytorch/) for the PyTorch implementation of StyleGAN2 used in our single-image translation setting.
diff --git a/models/cut_model.py b/models/cut_model.py
index 6b17a336..1768905a 100644
--- a/models/cut_model.py
+++ b/models/cut_model.py
@@ -2,12 +2,16 @@
 import torch
 from .base_model import BaseModel
 from . import networks
-from .nce import PatchNCELoss
+from .patchnce import PatchNCELoss
 import util.util as util
 
 
 class CUTModel(BaseModel):
-    """ This class implements CUT and FastCUT model
+    """ This class implements the CUT and FastCUT models, described in the paper
+    Contrastive Learning for Unpaired Image-to-Image Translation
+    Taesung Park, Alexei A. Efros, Richard Zhang, Jun-Yan Zhu
+    ECCV, 2020
+
     The code borrows heavily from the PyTorch implementation of CycleGAN
     https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix
     """
 
diff --git a/models/networks.py b/models/networks.py
index aa834a4b..4c80bac1 100644
--- a/models/networks.py
+++ b/models/networks.py
@@ -7,8 +7,6 @@ import numpy as np
 
 from .stylegan_networks import StyleGAN2Discriminator, StyleGAN2Generator, TileStyleGAN2Discriminator
 
-# from IPython import embed
-
 ###############################################################################
 # Helper Functions
 ###############################################################################
diff --git a/models/nce.py b/models/patchnce.py
similarity index 100%
rename from models/nce.py
rename to models/patchnce.py
diff --git a/models/stylegan_networks.py b/models/stylegan_networks.py
index 2f113dee..230dc11e 100644
--- a/models/stylegan_networks.py
+++ b/models/stylegan_networks.py
@@ -1,3 +1,10 @@
+"""
+The network architecture is based on the PyTorch implementation of StyleGAN2Encoder.
+Original PyTorch repo: https://github.com/rosinality/style-based-gan-pytorch
+Original StyleGAN2 repo: https://github.com/NVlabs/stylegan2
+We use this network architecture for our single-image training setting.
+"""
+
 import math
 import numpy as np
 import random
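For context on the `nce.py` -> `patchnce.py` rename and the updated import in `cut_model.py`: the module provides the patch-wise contrastive (InfoNCE) loss at the core of CUT. Below is a minimal sketch of such a loss, not the repository's actual `PatchNCELoss` (which handles batching and additional options); the class name, `temperature` default, and the "diagonal pairs are positives, all other patches are negatives" simplification are assumptions for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class PatchNCELoss(nn.Module):
    """Simplified patch-wise InfoNCE loss (illustrative sketch only).

    Given features of N patches from the output image (queries) and the
    N corresponding patches from the input image (keys), each query's
    positive is the key at the same spatial location; every other key
    serves as a negative.
    """

    def __init__(self, temperature=0.07):
        super().__init__()
        self.temperature = temperature
        self.ce = nn.CrossEntropyLoss()

    def forward(self, feat_q, feat_k):
        # feat_q, feat_k: (num_patches, feature_dim)
        feat_q = F.normalize(feat_q, dim=1)
        feat_k = F.normalize(feat_k, dim=1).detach()  # no gradient through keys
        # Cosine-similarity logits between every query and every key;
        # the diagonal entries correspond to the positive pairs.
        logits = feat_q @ feat_k.t() / self.temperature
        targets = torch.arange(feat_q.size(0), device=feat_q.device)
        # Cross-entropy pulls each patch toward its own counterpart and
        # pushes it away from all other patches in the image.
        return self.ce(logits, targets)
```

The key design point this sketch captures is that negatives are drawn from *other patches of the same image* rather than from other images, which is what lets CUT drop CycleGAN's second generator and cycle-consistency loss.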