
Sebica: Lightweight Spatial and Efficient Bidirectional Channel Attention Super Resolution Network

We propose Sebica, a lightweight spatial and efficient bidirectional channel attention super-resolution network. This study is inspired by similar works such as the AIS 2024 challenge, and aims to find a network architecture that saves computational resources without reducing accuracy. We reserve further compression techniques, e.g., distillation and reparameterization, for future work. The network supports real-time 4K video processing.

Dataset

  • DIV2K and Flickr2K. We filter out the small images and crop the rest to a uniform size of 1152 x 2040 for training and testing
  • Use the Jupyter notebook to process the original data, or:
  • Download the processed DIV2K dataset via here.
  • Flickr2K has not been uploaded due to its large size; please download it from the official website and convert it yourself
  • Put the datasets in ~/Documents/Datasets, or any location you like, but revise the path in the code accordingly
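The filtering-and-cropping rule above can be sketched as a small helper that computes the center-crop box for a given image size; images smaller than 1152 x 2040 are discarded. This is an illustrative sketch, not the repo's notebook code, and the function name is an assumption. The returned box uses PIL's (left, top, right, bottom) convention, so it could be passed directly to `Image.crop`.

```python
# Sketch of the preprocessing described above: filter images smaller than
# 1152 x 2040 and center-crop the rest to exactly that size.
TARGET_H, TARGET_W = 1152, 2040

def crop_box(width: int, height: int):
    """Return the center-crop box (left, top, right, bottom),
    or None if the image is too small and should be filtered out."""
    if height < TARGET_H or width < TARGET_W:
        return None  # discarded during preprocessing
    left = (width - TARGET_W) // 2
    top = (height - TARGET_H) // 2
    return (left, top, left + TARGET_W, top + TARGET_H)

if __name__ == "__main__":
    print(crop_box(2048, 1200))  # large enough -> (4, 24, 2044, 1176)
    print(crop_box(640, 480))    # too small    -> None
```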

Pre-trained weights

  • Pre-trained weights are in the logs/chpts folder

Train

  • Set the configuration options in configs/conf.yaml
  • Run train.py
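The exact schema of configs/conf.yaml depends on the repo; a plausible minimal sketch is below. All field names and values here are assumptions, except the "standard"/"mini" variants, the data->test key, and the dataset location, which come from this README.

```yaml
# Hypothetical conf.yaml sketch -- keys and values are assumptions,
# not the repo's actual schema.
network:
  mode: standard                       # or "mini"
data:
  train: ~/Documents/Datasets/DIV2K/train
  test: ~/Documents/Datasets/DIV2K/test
```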

Visualize inference results

  • Set the mode, e.g., "standard" or "mini", and the .pth file path in the name function
  • Run infer.py accordingly
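The mode/checkpoint selection described above might look like the sketch below inside the entry point of infer.py. The variant names ("standard", "mini") and the logs/chpts folder come from this README, but the checkpoint filenames and helper are illustrative assumptions, not the repo's API.

```python
# Hypothetical sketch of selecting a model variant and its weight file.
CHECKPOINTS = {
    "standard": "logs/chpts/sebica_standard.pth",  # filename is an assumption
    "mini": "logs/chpts/sebica_mini.pth",          # filename is an assumption
}

def select_checkpoint(mode: str) -> str:
    """Map a model variant to its weight file, rejecting unknown modes."""
    if mode not in CHECKPOINTS:
        raise ValueError(f"unknown mode {mode!r}; expected one of {sorted(CHECKPOINTS)}")
    return CHECKPOINTS[mode]

if __name__ == "__main__":
    print(select_checkpoint("mini"))  # logs/chpts/sebica_mini.pth
```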

Evaluate PSNR and SSIM

  • Set the mode, e.g., "standard" or "mini", and the .pth file path in the name function of psnr_ssim_evaluate.py
  • Set the network and dataset path (in data->test) in conf.yaml accordingly
  • Run psnr_ssim_evaluate.py
  • A comparison with the baseline is shown in the figure below (figure: comparison with baseline)

Object detection test

  • Download the full dataset from the official website, or
  • Download a subset via here

Acknowledgement

Some of this work is based on Bicubic++ and RVSR, thanks to their valuable contributions.

Citation

  • Our study has been published on arXiv; citations are welcome:
@article{liu2024sebica,
  title={Sebica: Lightweight Spatial and Efficient Bidirectional Channel Attention Super Resolution Network},
  author={Liu, Chongxiao},
  journal={arXiv preprint arXiv:2410.20546},
  year={2024}
}
