"Gated Fusion Network for Degraded Image Super Resolution" by Xinyi Zhang*, Hang Dong*, Zhe Hu, Wei-Sheng Lai, Fei Wang, Ming-Hsuan Yang (accepted by IJCV; the first two authors contributed equally).
[arXiv]
You can find more details on the Project Website.
Dependencies:
- Python 3.6
- PyTorch >= 0.4.0
- torchvision
- numpy
- skimage
- h5py
- MATLAB
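Before running the scripts, it can help to verify that the Python dependencies above are importable. Below is a minimal sketch (not part of the repository); it assumes the standard importable module names, i.e. PyTorch imports as `torch` and scikit-image as `skimage`:

```python
import importlib.util

# Importable module names for the requirements listed above.
REQUIRED = ["torch", "torchvision", "numpy", "skimage", "h5py"]

def missing_packages(names=REQUIRED):
    """Return the subset of `names` that cannot be imported in this environment."""
    return [n for n in names if importlib.util.find_spec(n) is None]

if __name__ == "__main__":
    missing = missing_packages()
    if missing:
        print("Missing packages:", ", ".join(missing))
    else:
        print("All required Python packages found.")
```

MATLAB is needed separately for the evaluation and HDF5-generation scripts and is not covered by this check.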
- Git clone this repository.

$git clone https://github.com/BookerDeWitt/GFN-IJCV
$cd GFN-IJCV/DBSR

- Download the trained model `GFN_G3D_4x.pkl` from here, then unzip and move `GFN_G3D_4x.pkl` to the `GFN-IJCV/DBSR/models` folder.

Then, you can follow the instructions here to test and train our network with our latest code and pre-trained model.
Test on LR-RESIDE
- Git clone this repository.

$git clone https://github.com/BookerDeWitt/GFN-IJCV
$cd GFN-IJCV/DHSR

- Download the LR-RESIDE dataset (including both the test and training sets) from [Google Drive] or BaiduYun (Code:2tnh) and unzip it.
- Download the trained model `GFN_epoch_60.pkl` from Google Drive or BaiduYun (Code:v01z), then unzip and move `GFN_epoch_60.pkl` to the `GFN-IJCV/DHSR/models` folder.
- Run `GFN-IJCV/DHSR/test.py` with cuda on the command line:

GFN-IJCV/DHSR/$python test.py --dataset your_downloads_directory/LR-RESIDE/Validation_4x
Then the dehazed and super-resolved images, with names ending in GFN_4x.png, are saved in the directory your_downloads_directory/LR-RESIDE/Validation_4x/Results.
- Calculate the PSNR using the Matlab function `GFN-IJCV/DHSR/evaluation/test_RGB.m`. The reported average PSNR is 25.77456 dB. You can also use `GFN-IJCV/DHSR/evaluation/test_bicubic.m` to calculate the PSNR of the bicubic baseline.

>> folder = 'your_downloads_directory/LR-RESIDE/Validation_4x';
>> test_RGB(folder)
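For reference, the quantity test_RGB.m reports is the standard PSNR = 10·log10(peak² / MSE). A minimal pure-Python sketch of that formula (a hypothetical helper operating on flat pixel sequences, not the authors' evaluation code; the MATLAB script above remains the reference for the reported numbers):

```python
def psnr(ref, test, peak=255.0):
    """Peak signal-to-noise ratio between two equal-length pixel sequences.

    PSNR = 10 * log10(peak^2 / MSE); higher is better, infinite for identical inputs.
    """
    if len(ref) != len(test):
        raise ValueError("inputs must have the same length")
    mse = sum((a - b) ** 2 for a, b in zip(ref, test)) / len(ref)
    if mse == 0:
        return float("inf")
    return 10.0 * (peak ** 2 / mse).__rpow__  if False else 10.0 * __import__("math").log10(peak ** 2 / mse)
```

In practice you would flatten the RGB arrays of each restored image and its ground truth, compute the per-image PSNR, and average over the validation set, which is what the evaluation script reports.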
Train on LR-RESIDE dataset
You should complete the first two steps in Test on LR-RESIDE before the following steps.
- Generate the training HDF5 files of the RESIDE dataset: run the Matlab function `LR_RESIDE_HDF5_Generator.m` in the directory `GFN-IJCV/DHSR/h5_generator`. The generated HDF5 files are stored in your_downloads_directory/LR-RESIDE/RESIDE/RESIDE_train256_4x_HDF5.

>> folder = 'your_downloads_directory/LR-RESIDE/RESIDE';
>> LR_RESIDE_HDF5_Generator(folder)
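Before starting training, it can be useful to sanity-check the generated files. A small h5py sketch (the function name and error handling are illustrative, not part of the repository; the actual dataset keys inside the files depend on the generator script):

```python
import os

def describe_h5(path):
    """Return a {dataset_name: shape} mapping for the top-level datasets of an HDF5 file.

    Returns None when h5py is not installed; raises FileNotFoundError for a missing file.
    """
    try:
        import h5py  # listed in the dependencies above
    except ImportError:
        return None
    if not os.path.exists(path):
        raise FileNotFoundError(path)
    with h5py.File(path, "r") as f:
        return {name: tuple(f[name].shape) for name in f.keys()}
```

Point it at any .h5 file produced by the generator to confirm the datasets and patch shapes look as expected before launching a multi-day training run.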
- Run `GFN-IJCV/DHSR/train.py` with cuda on the command line:

GFN-IJCV/DHSR/$python train.py --dataset your_downloads_directory/LR-RESIDE/RESIDE/RESIDE_train256_4x_HDF5
- The intermediate models of the three training steps will be saved in `models/1/`, `models/2/`, and `models/3/`, respectively. You can also use the following command to test intermediate results during training. Run `GFN-IJCV/DHSR/test.py` with cuda on the command line:

GFN-IJCV/DHSR/$python test.py --dataset your_downloads_directory/LR-RESIDE/Validation_4x --intermediate_process models/1/GFN_epoch_30.pkl # An example of step 1, epoch 30; you can replace it with another pkl file in models/.
Since the training process takes 3 or 4 days, you can use the following command to resume training from any checkpoint. Run `GFN-IJCV/DHSR/train.py` with cuda on the command line:

GFN-IJCV/DHSR/$python train.py --dataset your_downloads_directory/LR-RESIDE/RESIDE/RESIDE_train256_4x_HDF5 --resume models/1/GFN_epoch_25.pkl # Just an example of step 1, epoch 25.
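The checkpoint names encode the training step (the models/<step>/ directory) and the epoch, so a small helper can pick which file to pass to --resume. The functions below are a hypothetical convenience, not part of the repository, and assume the models/<step>/GFN_epoch_<n>.pkl layout shown above:

```python
import re
from pathlib import Path

def checkpoint_step_and_epoch(path):
    """Extract (step, epoch) from a checkpoint path like 'models/1/GFN_epoch_25.pkl'."""
    p = Path(path)
    m = re.fullmatch(r"GFN_epoch_(\d+)\.pkl", p.name)
    if m is None:
        raise ValueError(f"unexpected checkpoint name: {p.name}")
    return int(p.parent.name), int(m.group(1))

def latest_checkpoint(paths):
    """Pick the checkpoint with the highest (step, epoch) from a list of paths."""
    return max(paths, key=checkpoint_step_and_epoch)
```

For example, glob `models/*/GFN_epoch_*.pkl` and pass `latest_checkpoint(...)` to --resume to continue from the most recent saved state.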
Test on LR-Rain1200
This model is the result of the third training step at epoch 37.
- Git clone this repository.

$git clone https://github.com/BookerDeWitt/GFN-IJCV
$cd GFN-IJCV/DRSR

- Download the LR-Rain1200 dataset (including both the test and training sets) from Google Drive or BaiduYun (Code:v7e1) and unzip it.
- Download the trained model `GFN_epoch_37.pkl` from Google Drive or BaiduYun (Code:koeu), then unzip and move `GFN_epoch_37.pkl` to the `GFN-IJCV/DRSR/models` folder.
- Run `GFN-IJCV/DRSR/test.py` with cuda on the command line:

GFN-IJCV/DRSR/$python test.py --dataset your_downloads_directory/LR_Rain1200/Validation_4x
Then the derained and super-resolved images, with names ending in GFN_4x.png, are saved in the directory your_downloads_directory/LR_Rain1200/Validation_4x/Results.
- Calculate the PSNR using the Matlab function `GFN-IJCV/DRSR/evaluation/test_RGB.m`. The reported average PSNR is 25.248834 dB. You can also use `GFN-IJCV/DRSR/evaluation/test_bicubic.m` to calculate the PSNR of the bicubic baseline.

>> folder = 'your_downloads_directory/LR_Rain1200/Validation_4x';
>> test_RGB(folder)
Train on LR-Rain1200 dataset
You should complete the first two steps in Test on LR-Rain1200 before the following steps.
- Generate the training HDF5 files of the LR_Rain1200 dataset: run the Matlab function `rain_hdf5_generator.m` in the directory `GFN-IJCV/DRSR/h5_generator`. The generated HDF5 files are stored in your_downloads_directory/LR_Rain1200/Rain_HDF5.

>> folder = 'your_downloads_directory/LR_Rain1200';
>> rain_hdf5_generator(folder)
- Run `GFN-IJCV/DRSR/train.py` with cuda on the command line:

GFN-IJCV/DRSR/$python train.py --dataset your_downloads_directory/LR_Rain1200/Rain_HDF5
- The intermediate models of the three training steps will be saved in `models/1/`, `models/2/`, and `models/3/`, respectively. You can also use the following command to test intermediate results during training. Run `GFN-IJCV/DRSR/test.py` with cuda on the command line:

GFN-IJCV/DRSR/$python test.py --dataset your_downloads_directory/LR_Rain1200/Validation_4x --intermediate_process models/1/GFN_epoch_25.pkl # An example of step 1, epoch 25; you can replace it with another pkl file in models/.
Since the training process takes 3 or 4 days, you can use the following command to resume training from any checkpoint. Run `GFN-IJCV/DRSR/train.py` with cuda on the command line:

GFN-IJCV/DRSR/$python train.py --dataset your_downloads_directory/LR_Rain1200/Rain_HDF5 --resume models/1/GFN_epoch_25.pkl # Just an example of step 1, epoch 25.
If you use these models in your research, please cite:
@article{GFN_IJCV,
  author = {Zhang, Xinyi and Dong, Hang and Hu, Zhe and Lai, Wei-Sheng and Wang, Fei and Yang, Ming-Hsuan},
  title = {Gated Fusion Network for Degraded Image Super Resolution},
  journal = {International Journal of Computer Vision},
  year = {2020},
  pages = {1--23}
}
@inproceedings{GFN_BMVC,
  title = {Gated Fusion Network for Joint Image Deblurring and Super-Resolution},
  author = {Zhang, Xinyi and Dong, Hang and Hu, Zhe and Lai, Wei-Sheng and Wang, Fei and Yang, Ming-Hsuan},
  booktitle = {BMVC},
  year = {2018}
}