PyTorch implementation of our paper "High-Efficiency Lossy Image Coding Through Adaptive Neighborhood Information Aggregation" [arXiv].
More details can be found on the project homepage.
- [22.10.27] The latest version of our TinyLIC is released, with a more efficient network architecture in both the transform and entropy coding modules. More details can be found in the paper.
To get started locally and install the development version of our work, run the following commands (a Docker environment is recommended):
git clone https://github.com/lumingzzz/TinyLIC.git
cd TinyLIC
pip install -U pip && pip install -e .
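To verify that the editable install succeeded, a quick import check is enough; a minimal sketch, assuming only that the fork keeps CompressAI's package layout:

```python
# Sanity check: the fork should be importable as the compressai package.
import compressai
print(compressai.__version__)  # prints the installed (editable) version
```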
We use the Flicker2W dataset for training, together with the script for preprocessing.
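If the training script follows CompressAI's ImageFolder convention (as the upstream examples/train.py does), the dataset root passed via `-d` is expected to contain separate train and test splits; a sketch of the assumed layout:

```
/path/to/my/image/dataset/
├── train/   # training images, e.g. preprocessed Flicker2W crops
└── test/    # held-out images used for validation during training
```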
Run the script for a simple training pipeline:
python examples/train.py -m tinylic -d /path/to/my/image/dataset/ --epochs 400 -lr 1e-4 --batch-size 8 --cuda --save
The training checkpoints will be saved in the "checkpoints" folder in the current directory. You can change the default folder by modifying the "init()" function in "./examples/train.py".
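To sanity-check a finished run, the saved weights can be loaded back into the model. A minimal sketch, assuming the trainer stores weights under a "state_dict" key and uses CompressAI's usual checkpoint filename, and that the model class is exposed as TinyLIC in compressai.models.tinylic; check the code for the exact names:

```python
import torch
from compressai.models.tinylic import TinyLIC  # assumed import path and class name

net = TinyLIC()
ckpt = torch.load("checkpoints/checkpoint_best_loss.pth.tar", map_location="cpu")  # assumed filename
net.load_state_dict(ckpt["state_dict"])  # assumes the trainer saves a "state_dict" key
net.update()  # rebuild the entropy coder's CDF tables before actual coding
net.eval()
```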
Pre-trained models can be downloaded from NJU Box. The MSE-optimized R-D results on three popular datasets can be found in /results for reference.
An example of evaluating a model:
python -m compressai.utils.eval_model checkpoint path/to/eval/data/ -a tinylic -p path/to/pretrained/model --cuda
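For programmatic use beyond the CLI, a single image can be coded end-to-end through the standard CompressAI model interface (compress/decompress). The sketch below reuses the `net` loaded above and assumes the input height and width are already multiples of the model's downsampling factor (pad the image otherwise):

```python
import math
import torch
from PIL import Image
from torchvision import transforms

# Load one RGB image as a 1xCxHxW tensor in [0, 1].
x = transforms.ToTensor()(Image.open("kodim01.png").convert("RGB")).unsqueeze(0)

with torch.no_grad():
    enc = net.compress(x)                               # {"strings": ..., "shape": ...}
    dec = net.decompress(enc["strings"], enc["shape"])  # {"x_hat": ...}

x_hat = dec["x_hat"].clamp_(0, 1)
num_pixels = x.size(2) * x.size(3)
bpp = sum(len(s[0]) for s in enc["strings"]) * 8.0 / num_pixels  # bits per pixel
psnr = -10 * math.log10(torch.mean((x - x_hat) ** 2).item())
print(f"bpp = {bpp:.4f}, PSNR = {psnr:.2f} dB")
```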
If you find this work useful for your research, please cite:
@article{lu2022high,
  title={High-Efficiency Lossy Image Coding Through Adaptive Neighborhood Information Aggregation},
  author={Lu, Ming and Ma, Zhan},
  journal={arXiv preprint arXiv:2204.11448},
  year={2022}
}
This framework is based on CompressAI; our modifications are mainly in compressai.models.tinylic and compressai.layers. You can refer to the paper to understand the modified parts.
The TinyLIC model is partially built upon the Neighborhood Attention Transformer and the open-sourced unofficial implementation of the Checkerboard Shaped Context Model. We thank the authors for sharing their code.
If you have any questions, please contact me via [email protected].