Home
On this page, we highlight the papers, blogs, pre-trained models, and open-source code from IBM Research's TSFM group.
Pre-trained Models

- TinyTimeMixer (TTM): https://huggingface.co/ibm-granite/granite-timeseries-ttm-v1
  🚀 Downloads: 2 Million+, Likes: 210 (as of Nov 1, 2024) 🚀
- PatchTSMixer: https://huggingface.co/docs/transformers/en/model_doc/patchtsmixer
- PatchTST: https://huggingface.co/docs/transformers/en/model_doc/patchtst
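For readers who want to try these checkpoints directly, below is a minimal sketch of zero-shot inference. It assumes `torch`, `transformers`, and the `tsfm_public` package from the granite-tsfm repository are installed; the class names and the `past_values`/`prediction_outputs` names follow the respective model docs, but treat exact field names and shapes as assumptions to verify against the model cards.

```python
# Minimal sketch (not an official example): zero-shot forecasting with the
# pre-trained checkpoints listed above.
# Assumes: pip install torch transformers
#          pip install git+https://github.com/ibm-granite/granite-tsfm.git
import torch
from transformers import PatchTSTConfig, PatchTSTForPrediction
from tsfm_public.models.tinytimemixer import TinyTimeMixerForPrediction

# TTM v1: 512-step context -> 96-step forecast (default checkpoint revision).
ttm = TinyTimeMixerForPrediction.from_pretrained(
    "ibm-granite/granite-timeseries-ttm-v1"
)
past = torch.randn(1, 512, 3)  # (batch, context_length, num_channels) dummy data
with torch.no_grad():
    out = ttm(past_values=past)
# Field name assumed from the model implementation; verify against the docs.
print(out.prediction_outputs.shape)  # expected: torch.Size([1, 96, 3])

# PatchTST via Hugging Face transformers, randomly initialized here;
# load a fine-tuned checkpoint with .from_pretrained(...) for real forecasts.
config = PatchTSTConfig(
    num_input_channels=3,
    context_length=512,
    prediction_length=96,
    patch_length=16,
    patch_stride=8,
)
patchtst = PatchTSTForPrediction(config)
with torch.no_grad():
    out = patchtst(past_values=past)
print(out.prediction_outputs.shape)  # expected: torch.Size([1, 96, 3])
```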
Publications

Our papers have appeared at 4 KDD, 1 NeurIPS, 1 ICLR, 2 AAAI, and 1 ICML.
🚀 Total citations: 1700 (as of 26 Aug 2024) 🚀
- TST: Zerveas, G., Jayaraman, S., Patel, D., Bhamidipaty, A., & Eickhoff, C. A Transformer-Based Framework for Multivariate Time Series Representation Learning. In KDD 2021. (citations: 840)
- PatchTST: Nie, Y., Nguyen, N. H., Sinthong, P., & Kalagnanam, J. A Time Series is Worth 64 Words: Long-Term Forecasting with Transformers. In ICLR 2023. (citations: 656)
- PatchTSMixer: Ekambaram, V., Jati, A., Nguyen, N., Sinthong, P., & Kalagnanam, J. TSMixer: Lightweight MLP-Mixer Model for Multivariate Time Series Forecasting. In KDD 2023. (citations: 59)
- NPF: Ekambaram, V., Manglik, K., Mukherjee, S., Sajja, S. S. K., Dwivedi, S., & Raykar, V. Attention-Based Multi-Modal New Product Sales Time-Series Forecasting. In KDD 2020. (citations: 68)
- TLAE: Nguyen, N., & Quanz, B. Temporal Latent Auto-Encoder: A Method for Probabilistic Multivariate Time Series Forecasting. In AAAI 2021. (citations: 67)
- TTM: Ekambaram, V., Jati, A., Nguyen, N. H., Dayama, P., Reddy, C., Gifford, W. M., & Kalagnanam, J. Tiny Time Mixers (TTMs): Fast Pre-trained Models for Enhanced Zero/Few-Shot Forecasting of Multivariate Time Series. In NeurIPS 2024. (citations: 10)
- HPRO: Jati, A., Ekambaram, V., Pal, S., Quanz, B., Gifford, W. M., Harsha, P., Siegel, S., Mukherjee, S., & Narayanaswami, C. Hierarchical Proxy Modeling for Improved HPO in Time Series Forecasting. In KDD 2023. (citations: 6)
- AutoMixer: Palaskar, S., Ekambaram, V., Jati, A., Gantayat, N., Saha, A., Nagar, S., Nguyen, N. H., Dayama, P., Sindhgatta, R., Mohapatra, P., & Kumar, H. AutoMixer for Improved Multivariate Time-Series Forecasting on Business and IT Observability Data. In AAAI 2024. (citations: 3)
- ConCerNet: Zhang, W., Weng, T.-W., Das, S., Megretski, A., Daniel, L., & Nguyen, L. M. ConCerNet: A Contrastive Learning Based Framework for Automated Conservation Law Discovery and Trustworthy Dynamical System Prediction. In ICML 2023. (citations: 1)
- Tran, T. H., Nguyen, L. M., Yeo, K., Nguyen, N., Phan, D., Vaculin, R., & Kalagnanam, J. An End-to-End Time Series Model for Simultaneous Imputation and Forecast. arXiv preprint, 2023.
- Nguyen, A. D., Tran, T. H., Pham, H. H., Nguyen, P. L., & Nguyen, L. M. Learning Robust and Consistent Time Series Representations: A Dilated Inception-Based Approach. arXiv preprint, 2023.
- Jati, A., Ekambaram, V., Dayama, P., Nguyen, N. H., & Kalagnanam, J. Light-Weight Pre-Trained Mixer Models for Effective Transfer Learning in Multivariate Time Series Forecasting. Presented at the 44th International Symposium on Forecasting (ISF), 2024, Dijon, France.
- Mukherjee, S., Kamanchi, C., Dayama, P., Ekambaram, V., Jati, A., & Sampath, K. Intervention-Aware Forecasting for Process Control with Sparse Data. Presented at the 44th International Symposium on Forecasting (ISF), 2024, Dijon, France.
- Nguyen, L. M., Tran, T. H., Zhang, W., Das, S., & Weng, T.-W. When Machine Learning Meets Dynamical Systems: Theory and Applications. Workshop at the 37th AAAI Conference on Artificial Intelligence (AAAI 2023).
Blogs

We sincerely thank all the blog authors for dedicating their valuable time to analyzing and exploring our TSFM models. The analysis and conclusions presented are entirely the work of the respective authors.
- October 17, 2024: https://www.linkedin.com/pulse/introducing-ibm-tiny-time-mixer-new-era-forecasting-rodrigo-andrade-qaxqc
- September 3, 2024: https://www.ibm.com/blog/time-series-forecasting/
- August 30, 2024: https://www.ibm.com/granite/docs/models/time-series
- August 28, 2024: IBM Developer Blog
- June 4, 2024: At Think, IBM showed how generative AI is set to take automation to another level
- March 14, 2024: Generative AI could offer a faster way to test theories of how the universe works
- February 1, 2024: A crystal ball made of AI transformers
- February 1, 2024: Patch Time Series Transformer in Hugging Face - Getting Started (joint blog post with Hugging Face)
- January 19, 2024: PatchTSMixer in HuggingFace - Getting Started (joint blog post with Hugging Face)
- June 26, 2023: TS Foundation Models - The Battle of Time-series Transformers
External Blogs, Media, and Articles

- September 30, 2024: Introduction to Granite TTM: Revolutionizing Time Series Forecasting
- August 29, 2024 [Forbes]: IBM Improves Generative AI Forecasting Using Time, Not Just Attention
- August 28, 2024 [Techopedia]: Exclusive: IBM Explains New ‘TinyTimeMixer’ AI
- August 23, 2024 [The Stack]: IBM reveals why its "tiny" AI models punch well above their weight
- August 19, 2024 [Fierce Network]: ‘Tiny’ AI, big world: New models show smaller can be smarter
- July 31, 2024 [Medium]: Tiny Time Mixers (TTMs) for Next-Level Time Series Forecasting
- July 17, 2024 [Medium]: Exploring the Latest Advances in Foundation Time-Series Models
- June 11, 2024 [IndiaAI.gov]: IBM showcased GenAI’s potential to reshape business automation at Think 2024
- June 4, 2024 [Substack]: Tiny Time Mixers (TTMs): Powerful Zero/Few-Shot Forecasting Models by IBM
- May 14, 2024 [Medium]: Predicting Venetian Lagoon Tide Levels with Multivariate Time Series Modeling
- June 20, 2023 [Towards Data Science]: PatchTST: A Breakthrough in Time Series Forecasting
- May 25, 2023 [Towards Data Science]: The Return of the Fallen: Transformers for Forecasting
- May 17, 2023 [Medium]: PatchTST for Time Series Forecasting: Original Results and My Single-Channel Experiments
We sincerely thank all the tutorial authors for dedicating their valuable time to analyzing and exploring our TSFM models. The analysis and conclusions presented are entirely the work of the respective authors.
Open-Source Repositories

🚀 Stars: 2600+, Forks: 500+ (as of 26 Aug 2024) 🚀

| Model | Repository | Stars | Forks | Comment |
|---|---|---|---|---|
| TTM | granite-tsfm | 406 | 178 | From the authors |
| TTM | sktime | -- | -- | -- |
| PatchTSMixer | HuggingFace | -- | -- | From the authors |
| PatchTST | HuggingFace | -- | -- | From the authors |
| PatchTST | GluonTS | -- | -- | -- |
| PatchTST | Nixtla | -- | -- | -- |
| PatchTST | yuqinie98/PatchTST | 1.5k | 256 | From the authors |
| TST | tsai | -- | -- | -- |
| TST | mvts_transformer | 734 | 171 | From the authors |

[-- indicates that the model is included in the library, but there is currently no capability to track stars at the individual-model level.]
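As an illustration of how these third-party integrations are typically used, here is a minimal sketch of forecasting with PatchTST through Nixtla's neuralforecast library. The `PatchTST` class, its `h`/`input_size` arguments, and the bundled `AirPassengersDF` dataset are based on that library's public API; verify against its documentation.

```python
# Minimal sketch (an assumption, not an official example) of the Nixtla
# integration listed in the table above, using the bundled AirPassengers data.
from neuralforecast import NeuralForecast
from neuralforecast.models import PatchTST
from neuralforecast.utils import AirPassengersDF

# Forecast 12 months ahead from a 24-month input window.
model = PatchTST(h=12, input_size=24, max_steps=100)
nf = NeuralForecast(models=[model], freq="M")
nf.fit(df=AirPassengersDF)   # expects columns: unique_id, ds, y
forecasts = nf.predict()     # one row per (unique_id, future timestamp)
print(forecasts.head())
```

The same pattern, with each library's own estimator interface, applies to the GluonTS, sktime, and tsai integrations listed in the table.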