FreDF is an open-source library for deep learning researchers, especially for deep time series analysis.
We provide a neat code base to evaluate advanced deep time series models or develop your own model in the transformed domain, covering three mainstream tasks: long- and short-term forecasting, and imputation.
🚩News (2023.12) We add implementations to train and evaluate deep learning models in the transformed domain (frequency domain) on three main tasks.
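For intuition, a frequency-domain loss of the kind FreDF advocates can be sketched as follows. This is a minimal illustration using `numpy.fft.rfft`, not the repo's actual implementation, which may differ in normalization and weighting:

```python
import numpy as np

def frequency_domain_loss(pred, true):
    """Mean absolute error between the FFT spectra of prediction and label.

    Illustrative sketch of a frequency-domain (FreDF-style) loss; the
    repo's actual loss may differ in normalization and weighting.
    """
    # Transform both sequences to the frequency domain along the time axis.
    pred_freq = np.fft.rfft(pred, axis=-1)
    true_freq = np.fft.rfft(true, axis=-1)
    # Average the complex magnitudes of the residual spectrum.
    return np.abs(pred_freq - true_freq).mean()

# Identical sequences have zero loss; a shifted copy does not.
t = np.linspace(0, 2 * np.pi, 96)
signal = np.sin(t)
print(frequency_domain_loss(signal, signal))                   # 0.0
print(frequency_domain_loss(signal, np.roll(signal, 5)) > 0)   # True
```

Unlike a step-wise time-domain loss, each frequency coefficient summarizes the whole horizon, which mitigates label autocorrelation across forecast steps.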
As of October 2023, the top-ranked models for the three supported tasks are:
| Model<br>Ranking | Long-term<br>Forecasting | Short-term<br>Forecasting | Imputation |
| --- | --- | --- | --- |
| 🥇 1st | FreDF + iTrans. | FreDF + FreTS | FreDF + iTrans. |
Note: We will keep updating this leaderboard. If you have proposed an advanced model, you can send us your paper/code link or raise a pull request. We will add it to this repo and update the leaderboard as soon as possible.
Models compared in this leaderboard. ☑ means that the model's code is already included in this repo.
- ☑ iTransformer - iTransformer: Inverted Transformers Are Effective for Time Series Forecasting [arXiv 2023] [Code].
- ☑ PatchTST - A Time Series is Worth 64 Words: Long-term Forecasting with Transformers [ICLR 2023] [Code].
- ☑ TimesNet - TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis [ICLR 2023] [Code].
- ☑ DLinear - Are Transformers Effective for Time Series Forecasting? [AAAI 2023] [Code].
- ☑ FEDformer - FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting [ICML 2022] [Code].
- ☑ Autoformer - Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting [NeurIPS 2021] [Code].
- ☑ Transformer - Attention is All You Need [NeurIPS 2017] [Code].
- ☑ TiDE - Long-term Forecasting with TiDE: Time-series Dense Encoder [arXiv 2023] [Code].
- ☑ Crossformer - Crossformer: Transformer Utilizing Cross-Dimension Dependency for Multivariate Time Series Forecasting [ICLR 2023] [Code].
- ☑ TCN
- ☑ LSTM
See our latest paper [FreDF] for the comprehensive benchmark. We will release a real-time updated online version soon.
- Install Python 3.8. For convenience, execute the following command.

```shell
pip install -r requirements.txt
```
- Prepare Data. You can obtain the well-preprocessed datasets from [Google Drive] or [Baidu Drive], then place the downloaded data in the folder `./dataset`. Here is a summary of supported datasets.
- Train and evaluate models. We provide experiment scripts for all benchmarks under the folder `./scripts/`. You can reproduce the experiment results as in the following examples:

```shell
# long-term forecast
bash ./scripts/fredf_exp/ltf_overall/ETTh1_script/iTransformer.sh
# short-term forecast
bash ./scripts/fredf_exp/stf_overall/FreTS_M4.sh
# imputation
bash ./scripts/fredf_exp/imp_autoencoder/ETTh1_script/iTransformer.sh
```
- Develop your own model.
  - Add the model file to the folder `./models`. You can follow the example of `./models/iTransformer.py`.
  - Include the newly added model in `Exp_Basic.model_dict` of `./exp/exp_basic.py`.
  - Create the corresponding scripts under the folder `./scripts`.
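The registration step above can be sketched as follows. `MyModel` and the `configs` contents are hypothetical placeholders; the real registry is `Exp_Basic.model_dict` in `./exp/exp_basic.py`, and real models are PyTorch modules following `./models/iTransformer.py`:

```python
# Hypothetical sketch of registering a new model. The real registry is
# Exp_Basic.model_dict in ./exp/exp_basic.py, and real models subclass
# torch.nn.Module following ./models/iTransformer.py.

class MyModel:  # stand-in for the class defined in ./models/MyModel.py
    def __init__(self, configs):
        self.configs = configs

    def forward(self, x):
        return x  # identity stand-in for the real forward pass

# Step 2: add the new class to the experiment's model dictionary.
model_dict = {
    # "iTransformer": iTransformer.Model,  # existing entries look like this
    "MyModel": MyModel,
}

# The experiment runner then instantiates the model by its name,
# typically passed via the --model argument in the scripts.
model = model_dict["MyModel"](configs={"seq_len": 96, "pred_len": 96})
print(type(model).__name__)  # MyModel
```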
This library is mainly constructed based on the following repos; we follow their training-evaluation pipelines and baseline model implementations:
- Time-Series-Library: https://github.com/thuml/Time-Series-Library.
All the experiment datasets are public, and we obtain them from the following links:
- Long-term Forecasting and Imputation: https://github.com/thuml/Autoformer.
- Short-term Forecasting: https://github.com/ServiceNow/N-BEATS.