This is a repository about Time Series Forecasting with deep learning methods, with particular attention to multivariate time series.
Updated daily.
Contact me: [email protected]
-
Learning Informative Representation for Fairness-aware Multivariate Time-series Forecasting: A Group-based Perspective. [Paper]
This paper formulates the multivariate time series (MTS) fairness modeling problem as learning informative representations that attend to both advantaged and disadvantaged variables.
-
A Time Series is Worth 64 Words: Long-term Forecasting with Transformers. [Paper] [Code]
The authors propose an efficient design of Transformer-based models for multivariate time series forecasting and self-supervised representation learning.
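A minimal sketch of the patching idea behind the title, assuming illustrative shapes, patch length, and stride (this is not the official code): each channel is split into subseries-level patches that act as the Transformer's input tokens.

```python
import torch
import torch.nn as nn

class PatchEmbed(nn.Module):
    """Split each channel into overlapping patches and embed them as tokens."""
    def __init__(self, patch_len=16, stride=8, d_model=128):
        super().__init__()
        self.patch_len, self.stride = patch_len, stride
        self.proj = nn.Linear(patch_len, d_model)

    def forward(self, x):                                     # x: (batch, channels, seq_len)
        patches = x.unfold(-1, self.patch_len, self.stride)   # (batch, channels, n_patches, patch_len)
        return self.proj(patches)                             # (batch, channels, n_patches, d_model)

tokens = PatchEmbed()(torch.randn(32, 7, 336))
print(tokens.shape)  # torch.Size([32, 7, 41, 128]) -> 41 patch "words" per channel
```

Each channel's patch sequence can then be fed channel-independently to a standard Transformer encoder.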
-
Crossformer: Transformer utilizing cross-dimension dependency for multivariate time series forecasting. [Paper] [Code]
This paper proposes Crossformer, a Transformer-based model utilizing cross-dimension dependency for multivariate time series (MTS) forecasting.
-
Pyraformer: Low-Complexity Pyramidal Attention for Long-Range Time Series Modeling and Forecasting. [Paper] [Code]
The authors propose Pyraformer, which explores a multiresolution representation of the time series.
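The sketch below illustrates only the multiresolution part with simple average pooling (the scales are assumptions); Pyraformer's pyramidal attention additionally connects these scales within a single attention graph.

```python
import torch
import torch.nn.functional as F

x = torch.randn(8, 1, 96)                                    # (batch, channels, time)
pyramid = [x] + [F.avg_pool1d(x, 2 ** s) for s in (1, 2, 3)] # coarser views of the series
for level in pyramid:
    print(level.shape)                                       # time lengths 96, 48, 24, 12
```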
-
Multivariate Time Series Prediction Based on Temporal Change Information Learning Method. [Paper]
A temporal change information learning (CIL) method is proposed in this article.
-
Multivariate Time Series Forecasting with Latent Graph Inference. [Paper]
This paper introduces a new approach for Multivariate Time Series forecasting that jointly infers and leverages relations among time series.
-
Transformers in Time Series: A Survey. [Paper]
The authors systematically review Transformer schemes for time series modeling, highlighting their strengths as well as limitations, and propose a new taxonomy that summarizes existing time series Transformers from two perspectives.
-
Bayesian optimization based dynamic ensemble for time series forecasting. [Paper]
This paper proposes a Bayesian optimization-based dynamic ensemble (BODE) that overcomes the limitations of single-model methods and provides a dynamic ensemble forecast combination for time series with time-varying underlying patterns.
-
Probabilistic Time Series Forecasting with Shape and Temporal Diversity. [Paper] [Code]
The authors introduce the STRIPE model for representing structured diversity based on shape and temporal features, producing predictions that are plausible while remaining sharp and accurate.
-
CoST: Contrastive Learning of Disentangled Seasonal-Trend Representations for Time Series Forecasting. [Paper] [Code]
This paper proposes CoST, a new representation learning framework for time series forecasting that applies contrastive learning to learn disentangled seasonal-trend representations.
-
Self-Supervised Contrastive Pre-Training For Time Series via Time-Frequency Consistency. [Paper]
These authors define a decomposable pre-training model, where the self-supervised signal is provided by the distance between time and frequency components, each individually trained by contrastive estimation.
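A rough sketch of the time-frequency consistency signal, with assumed toy linear encoders and a plain squared distance standing in for the paper's contrastive estimation:

```python
import torch
import torch.nn as nn

time_enc = nn.Sequential(nn.Flatten(), nn.Linear(128, 64))   # embeds the raw series
freq_enc = nn.Sequential(nn.Flatten(), nn.Linear(65, 64))    # embeds its spectrum

x = torch.randn(32, 1, 128)
z_time = time_enc(x)
z_freq = freq_enc(torch.fft.rfft(x, dim=-1).abs())           # magnitude spectrum, 65 bins
consistency_loss = (z_time - z_freq).pow(2).sum(-1).mean()   # pull the two views together
```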
-
Preformer: Predictive Transformer with Multi-Scale Segment-wise Correlations for Long-Term Time Series Forecasting. [Paper]
This paper proposes a predictive Transformer-based model called Preformer, which introduces a novel and efficient Multi-Scale Segment-Correlation mechanism that divides the time series into segments and uses segment-wise correlation-based attention for encoding.
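A hedged sketch of the segment-wise idea (segment length and shapes are assumptions, not the paper's code): queries and keys are chunked into segments, and attention weights are computed between whole segments rather than single time steps.

```python
import torch

def segment_correlation(q, k, v, seg_len=12):
    """Attention over whole segments instead of individual time steps."""
    B, L, D = q.shape
    n = L // seg_len
    qs = q[:, : n * seg_len].reshape(B, n, seg_len * D)       # (batch, n_seg, seg_len*D)
    ks = k[:, : n * seg_len].reshape(B, n, seg_len * D)
    vs = v[:, : n * seg_len].reshape(B, n, seg_len, D)
    scores = torch.softmax(qs @ ks.transpose(1, 2) / qs.shape[-1] ** 0.5, dim=-1)
    out = torch.einsum("bnm,bmld->bnld", scores, vs)          # mix whole segments
    return out.reshape(B, n * seg_len, D)

q = k = v = torch.randn(8, 96, 64)
print(segment_correlation(q, k, v).shape)                     # torch.Size([8, 96, 64])
```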
-
Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting. [Paper] [Code]
This paper designs Autoformer as a novel decomposition architecture with an Auto-Correlation mechanism. It breaks with the pre-processing convention of series decomposition and renovates it as a basic inner block of deep models.
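A minimal sketch of such an inner decomposition block, assuming a simple moving-average trend (Autoformer's implementation also pads by replicating the series ends, which this zero-padded pooling does not):

```python
import torch
import torch.nn as nn

class SeriesDecomp(nn.Module):
    """Split a series into a smooth trend and a seasonal remainder."""
    def __init__(self, kernel_size=25):
        super().__init__()
        self.avg = nn.AvgPool1d(kernel_size, stride=1, padding=kernel_size // 2)

    def forward(self, x):                                     # x: (batch, seq_len, channels)
        trend = self.avg(x.transpose(1, 2)).transpose(1, 2)   # moving average per channel
        return x - trend, trend                               # (seasonal, trend)

seasonal, trend = SeriesDecomp()(torch.randn(8, 96, 7))
print(seasonal.shape, trend.shape)                            # both (8, 96, 7)
```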
-
MixSeq: Connecting Macroscopic Time Series Forecasting with Microscopic Time Series Data. [Paper]
The authors assume that the microscopic time series follow some unknown mixture probability distributions.
-
TimeVAE: A Variational Auto-Encoder for Multivariate Time Series Generation. [Paper] [Code]
These authors propose a novel architecture for synthetically generating time-series data with the use of Variational Auto-Encoders (VAEs). The proposed architecture has several distinct properties: interpretability, ability to encode domain knowledge, and reduced training times.
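A minimal sketch of the VAE generation mechanism with assumed linear encoder/decoder layers; the paper's architecture differs and adds the interpretable, domain-knowledge components mentioned above.

```python
import torch
import torch.nn as nn

class TinyTimeVAE(nn.Module):
    def __init__(self, seq_len=48, latent=8):
        super().__init__()
        self.enc = nn.Linear(seq_len, 2 * latent)             # outputs mean and log-variance
        self.dec = nn.Linear(latent, seq_len)

    def forward(self, x):                                     # x: (batch, seq_len)
        mu, logvar = self.enc(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization trick
        return self.dec(z), mu, logvar

model = TinyTimeVAE()
synthetic = model.dec(torch.randn(16, 8))                     # sample new series from the prior
print(synthetic.shape)                                        # torch.Size([16, 48])
```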
-
Time Series Analysis and Modeling to Forecast: a Survey. [Paper]
The authors cover a sufficiently broad spectrum of models while nonetheless offering substantial methodological developments. They describe three major linear parametric models, together with two nonlinear extensions, and present five categories of nonlinear parametric models.
-
A Robust and Efficient Multi-Scale Seasonal-Trend Decomposition. [Paper]
The authors propose a general and efficient multi-scale seasonal-trend decomposition algorithm for time series with multiple seasonalities.
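Not the paper's algorithm: as a stand-in illustration of decomposing a series with several seasonal periods, the snippet below uses statsmodels' MSTL on synthetic hourly data with assumed daily and weekly cycles.

```python
import numpy as np
from statsmodels.tsa.seasonal import MSTL

t = np.arange(24 * 7 * 8)                       # 8 weeks of hourly observations
y = (10 + 0.01 * t                              # trend
     + 3 * np.sin(2 * np.pi * t / 24)           # daily seasonality
     + 5 * np.sin(2 * np.pi * t / (24 * 7))     # weekly seasonality
     + np.random.randn(t.size))                 # noise
res = MSTL(y, periods=(24, 24 * 7)).fit()
print(res.trend.shape, res.seasonal.shape)      # one seasonal component per period
```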
-
NeuralProphet: Explainable Forecasting at Scale. [Paper] [Code]
This paper introduces NeuralProphet, a successor to Facebook Prophet, which set an industry standard for explainable, scalable, and user-friendly forecasting frameworks.
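A hedged usage sketch, assuming the Prophet-style `ds`/`y` dataframe convention; exact arguments may vary across NeuralProphet versions.

```python
import numpy as np
import pandas as pd
from neuralprophet import NeuralProphet

ds = pd.date_range("2020-01-01", periods=365, freq="D")
y = np.sin(np.arange(365) * 2 * np.pi / 7) + 0.1 * np.random.randn(365)  # weekly toy signal
df = pd.DataFrame({"ds": ds, "y": y})

m = NeuralProphet()
metrics = m.fit(df, freq="D")                     # train on the history
future = m.make_future_dataframe(df, periods=30)  # extend 30 days ahead
forecast = m.predict(future)                      # 'yhat1' holds the point forecast
```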
-
Probabilistic Forecasting with Temporal Convolutional Neural Network. [Paper] [Code]
Combined with representation learning, the approach learns complex patterns such as seasonality and holiday effects within and across series, and leverages those patterns for more accurate forecasts, especially when historical data are sparse or unavailable.
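A minimal sketch of the dilated causal convolution backbone such models build on (channel counts and depth are assumptions, and the paper's probabilistic output head is omitted):

```python
import torch
import torch.nn as nn

class CausalConv(nn.Module):
    """Conv1d that only looks at the past via left-padding."""
    def __init__(self, channels=32, kernel=2, dilation=1):
        super().__init__()
        self.pad = (kernel - 1) * dilation        # pad left so no future leaks in
        self.conv = nn.Conv1d(channels, channels, kernel, dilation=dilation)

    def forward(self, x):                         # x: (batch, channels, time)
        return self.conv(nn.functional.pad(x, (self.pad, 0)))

tcn = nn.Sequential(*[CausalConv(dilation=2 ** i) for i in range(4)])  # receptive field 16
print(tcn(torch.randn(8, 32, 100)).shape)         # torch.Size([8, 32, 100])
```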
-
N-BEATS: Neural basis expansion analysis for interpretable time series forecasting. [Paper] [Code] [Dataset Used1] [Dataset Used2]
This paper proposes a deep neural architecture based on backward and forward residual links and a very deep stack of fully-connected layers.
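A minimal sketch of the doubly residual stacking with a generic basis and assumed sizes: each block subtracts its backcast from the running input and adds its partial forecast to the running output.

```python
import torch
import torch.nn as nn

class Block(nn.Module):
    def __init__(self, backcast=96, horizon=24, width=256):
        super().__init__()
        self.fc = nn.Sequential(nn.Linear(backcast, width), nn.ReLU(),
                                nn.Linear(width, width), nn.ReLU())
        self.to_backcast = nn.Linear(width, backcast)
        self.to_forecast = nn.Linear(width, horizon)

    def forward(self, x):
        h = self.fc(x)
        return self.to_backcast(h), self.to_forecast(h)

class NBeats(nn.Module):
    def __init__(self, n_blocks=3, **kw):
        super().__init__()
        self.blocks = nn.ModuleList([Block(**kw) for _ in range(n_blocks)])

    def forward(self, x):                         # x: (batch, backcast_len)
        forecast = 0
        for block in self.blocks:
            backcast, f = block(x)
            x = x - backcast                      # backward residual link
            forecast = forecast + f               # forward residual link
        return forecast

print(NBeats()(torch.randn(32, 96)).shape)        # torch.Size([32, 24])
```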