In this paper, we present Corrformer with the Multi-Correlation mechanism, which unifies temporal auto-correlation and spatial correlation in a learned multiscale tree structure (a simplified sketch of the temporal component is given after the highlights below).
- Corrformer reduces the canonical quadratic complexity of spatiotemporal modeling in both dimensions to linear complexity for spatial modeling and log-linear complexity for temporal modeling, achieving, for the first time, collaborative forecasts for tens of thousands of stations within a unified deep model.
- Corrformer generates interpretable predictions based on inferred propagation directions of weather processes, facilitating a fully data-driven AI paradigm for discovering insights in meteorological science.
- Corrformer yields state-of-the-art forecasts on global, regional, and citywide datasets with high confidence, outperforming classical statistical methods and the latest deep models, and comparing favorably to numerical methods in near-surface forecasting.
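As a rough illustration (not the repository implementation), the sketch below shows only the temporal half of the Multi-Correlation mechanism: correlations over all time delays are computed with FFTs, which is where the log-linear temporal complexity comes from, and the top-correlated delays are aggregated. In the full model, spatial correlation over the learned multiscale tree is combined with this step; all function and variable names here are illustrative.

```python
import torch

def temporal_auto_correlation(queries, keys, values, top_k=2):
    """Simplified FFT-based temporal auto-correlation (illustrative only).

    queries, keys, values: [batch, length, channels] tensors.
    Correlations over all delays are obtained in O(L log L) via the
    Wiener-Khinchin relation, then values are aggregated over the
    top-k delays weighted by their correlation scores.
    """
    B, L, C = queries.shape

    # Correlation for every delay tau: R(tau) = IFFT(FFT(q) * conj(FFT(k)))
    q_fft = torch.fft.rfft(queries, dim=1)
    k_fft = torch.fft.rfft(keys, dim=1)
    corr = torch.fft.irfft(q_fft * torch.conj(k_fft), n=L, dim=1)  # [B, L, C]

    # Select the top-k delays by channel-averaged correlation
    mean_corr = corr.mean(dim=-1)                           # [B, L]
    weights, delays = torch.topk(mean_corr, top_k, dim=-1)  # [B, top_k]
    weights = torch.softmax(weights, dim=-1)

    # Aggregate the delayed (rolled) value series, weighted by correlation
    out = torch.zeros_like(values)
    for b in range(B):
        for i in range(top_k):
            rolled = torch.roll(values[b], shifts=-int(delays[b, i]), dims=0)
            out[b] += weights[b, i] * rolled
    return out

# Toy usage: one series of length 32 with 8 channels
x = torch.randn(1, 32, 8)
y = temporal_auto_correlation(x, x, x, top_k=3)
print(y.shape)  # torch.Size([1, 32, 8])
```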
🚩News (2023.06) Our paper has been published in Nature Machine Intelligence as the Cover Article.
The repository is organized as follows:

```
|-- Corrformer
|   |-- data_provider            # Data loader
|   |-- exp                      # Pipelines for training, validation, and testing
|   |-- layers
|   |   |-- Embed.py             # Eq. (1) of the paper
|   |   |-- Corrformer_EncDec.py # Eq. (2) and Eq. (3) of the paper
|   |   |-- Causal_Conv.py       # Causal convolution for Cross-Correlation
|   |   |-- Multi_Correlation.py # Eq. (5)-(10) of the paper
|   |-- models
|   |   |-- Corrformer.py        # Overall framework
|   |-- utils
|   |-- scripts                  # Running scripts
|   |-- dataset                  # Place the downloaded datasets here
|   |-- checkpoints              # Place the output or pretrained models here
```
- Find a device with GPU support. Our experiments were conducted on a single RTX GPU with 24 GB of memory under Linux.
- Install Python 3.6 and PyTorch 1.7.1. The following command installs the required packages:
```bash
pip install -r requirements.txt  # takes about 5 minutes
```
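Optionally (this check is not part of the provided scripts), you can confirm that the installed PyTorch build matches the expected version and can see the GPU:

```python
import torch

print(torch.__version__)          # expected: 1.7.1
print(torch.cuda.is_available())  # should print True on a GPU machine
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))
```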
- Download the datasets from [Code Ocean] and place them under the `./dataset` folder.
- Train and evaluate the model with the following scripts.
```bash
bash ./scripts/Global_Temp/Corrformer.sh  # takes about 18 hours
bash ./scripts/Global_Wind/Corrformer.sh  # takes about 18 hours
```
Note: since the raw Global Temp and Global Wind data from the NCEI were multiplied by ten, the reported MSE and MAE on these two benchmarks should be divided by 100 and 10, respectively, to recover values in the original units.
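Since this conversion is easy to get backwards, here is a minimal stand-alone sketch (not part of the repository) of the rescaling, using placeholder metric values rather than actual benchmark results:

```python
SCALE = 10.0  # raw NCEI values in these two benchmarks were multiplied by 10

def rescale(reported_mse, reported_mae, scale=SCALE):
    """MSE scales with the square of the data scale; MAE scales linearly."""
    return reported_mse / scale ** 2, reported_mae / scale

# Placeholder numbers, not actual benchmark results
mse, mae = rescale(reported_mse=100.0, reported_mae=10.0)
print(mse, mae)  # 1.0 1.0
```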
For a quick demo, we recommend the experiments with the pre-trained models, which provide a fast test of our code. Here are the detailed instructions:
- Configure the environment with the above instructions. Note that the following experiments take about 4 GB of GPU memory.
- Download the datasets and pretrained models from [Datasets] and [Pretrained Models]. Place the pretrained models under the `./checkpoints` folder.
- Execute the demo with the following scripts.
```bash
bash ./scripts/Demo/Global_Temp_demo.sh  # takes about 35 minutes
bash ./scripts/Demo/Global_Wind_demo.sh  # takes about 35 minutes
```
Note again: since the raw Global Temp and Global Wind data from the NCEI were multiplied by ten, the actual MSE and MAE for these two benchmarks are obtained by dividing the reported values by 100 and 10, respectively (see the snippet above).
If you find this repo useful, please cite our paper.
```bibtex
@article{wu2023corrformer,
  title={Interpretable Weather Forecasting for Worldwide Stations with a Unified Deep Model},
  author={Haixu Wu and Hang Zhou and Mingsheng Long and Jianmin Wang},
  journal={Nature Machine Intelligence},
  year={2023},
}
```
If you have any questions or suggestions, feel free to contact Haixu Wu ([email protected] or [email protected]).