From https://github.com/MAZiqing/FEDformer
Frequency Enhanced Decomposed Transformer (FEDformer) is more efficient than the standard Transformer, with complexity linear in the sequence length. Our empirical studies on six benchmark datasets show that, compared with state-of-the-art methods, FEDformer reduces prediction error by 14.8% and 22.6% for multivariate and univariate time series, respectively.
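The linear complexity comes from operating on a small, fixed set of frequency modes instead of full pairwise attention. Below is a minimal sketch of that idea (the paper's Fourier enhanced block), assuming illustrative names and shapes such as `FourierBlockSketch` and `n_modes`; it is not the repo's actual implementation:

```python
# A minimal sketch of the frequency-enhanced block idea, assuming
# illustrative names and shapes (NOT the repo's implementation):
# FFT along time, keep a small random subset of frequency modes, mix
# each retained mode with a learned complex weight, inverse-FFT back.
import torch
import torch.nn as nn

class FourierBlockSketch(nn.Module):
    def __init__(self, d_model: int, seq_len: int, n_modes: int = 16):
        super().__init__()
        n_freqs = seq_len // 2 + 1  # length of the rfft output
        # Randomly choose which frequency modes to keep (fixed at init).
        self.modes = sorted(torch.randperm(n_freqs)[: min(n_modes, n_freqs)].tolist())
        # One learned complex channel-mixing matrix per retained mode.
        scale = 1.0 / (d_model * d_model)
        self.weights = nn.Parameter(
            scale * torch.randn(len(self.modes), d_model, d_model, dtype=torch.cfloat)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, L, D = x.shape                        # x: (batch, seq_len, d_model)
        x_ft = torch.fft.rfft(x, dim=1)          # complex, (B, L//2 + 1, D)
        out_ft = torch.zeros_like(x_ft)          # dropped modes stay zero
        for i, m in enumerate(self.modes):
            out_ft[:, m, :] = x_ft[:, m, :] @ self.weights[i]
        return torch.fft.irfft(out_ft, n=L, dim=1)  # back to (B, L, D), real

# Usage sketch:
# feb = FourierBlockSketch(d_model=64, seq_len=96)
# y = feb(torch.randn(8, 96, 64))
```

Only a fixed number of modes carries learned mixing, so the cost of the block (beyond the FFT itself) does not grow quadratically with the sequence length, which is the source of the efficiency claim above.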
- Install Python 3.6 and PyTorch 1.9.0 (a possible environment setup is sketched after this list).
- Download the data. You can obtain all six benchmarks from xxxx.
- Train the model. We provide the experiment scripts for all benchmarks under the folder `./scripts`. You can reproduce the experiment results by:
bash ./scripts/run_M.sh
bash ./scripts/run_S.sh
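For reference, one possible way to set up the environment listed above (the environment name and the extra packages are assumptions, not taken from the repo):

```bash
# Assumed setup sketch: the env name and extra packages are illustrative.
conda create -n fedformer python=3.6
conda activate fedformer
pip install torch==1.9.0 numpy pandas
```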
If you find this repo useful, please cite our paper:
@inproceedings{zhou2022fedformer,
  title={Fedformer: Frequency enhanced decomposed transformer for long-term series forecasting},
  author={Zhou, Tian and Ma, Ziqing and Wen, Qingsong and Wang, Xue and Sun, Liang and Jin, Rong},
  booktitle={International Conference on Machine Learning},
  pages={27268--27286},
  year={2022},
  organization={PMLR}
}
If you have any questions or want to use the code, please contact xxx@xxxx.
We appreciate the following GitHub repos for their valuable code bases and datasets:
- https://github.com/thuml/Autoformer
- https://github.com/zhouhaoyi/Informer2020