Foundation Models for Dynamical Systems

This repo tracks progress on Time Series Foundation Models.

For each project, the list aims to track availability (code, weights, data, hyperparameters) and documentation (pre-training, fine-tuning, and zero-shot usage).
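As a concrete illustration of the zero-shot usage tracked above, the sketch below loads one of the models listed in this repo (Chronos) and forecasts a toy series. It assumes the `chronos-forecasting` package and the `amazon/chronos-t5-small` checkpoint; treat the exact API and model id as assumptions taken from that project's documentation rather than as part of this list.

```python
# Minimal zero-shot forecasting sketch with Chronos.
# Assumes `pip install chronos-forecasting`; the API and checkpoint name follow that
# package's README and may change -- verify against the Chronos repo before use.
import torch
from chronos import ChronosPipeline

pipeline = ChronosPipeline.from_pretrained(
    "amazon/chronos-t5-small",   # small pretrained checkpoint on the Hugging Face Hub
    device_map="cpu",
    torch_dtype=torch.float32,
)

# Toy univariate history: a noiseless sine wave of length 120.
context = torch.sin(torch.arange(120, dtype=torch.float32) * 0.2)

# Sample-based probabilistic forecast: shape (num_series, num_samples, prediction_length).
forecast = pipeline.predict(context=context, prediction_length=24)

# Reduce the samples to a point forecast via the median.
median = forecast.float().quantile(0.5, dim=1)
print(median.shape)  # torch.Size([1, 24])
```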

Time Series Foundation Model

Survey & Benchmark

2023

  • Large models for time series and spatio-temporal data: A survey and outlook.

2024

  • A Survey of Deep Learning and Foundation Models for Time Series Forecasting. John A. Miller(University of Georgia), Mohammed Aldosari, Farah Saeed, Nasid Habib Barna, Subas Rana, I. Budak Arpinar, and Ninghao Liu. link 🔗18
  • Foundation Models for Time Series Analysis: A Tutorial and Survey. Yuxuan Liang(The Hong Kong University of Science and Technology (Guangzhou)), Haomin Wen(Beijing Jiaotong University). link 🔗26
  • FoundTS: Comprehensive and Unified Benchmarking of Foundation Models for Time Series Forecasting. Zhe Li, Xiangfei Qiu, Peng Chen, Yihang Wang, Hanyin Cheng, Yang Shu, Jilin Hu, Chenjuan Guo, Aoying Zhou, Qingsong Wen, Christian S. Jensen, Bin Yang. link code
  • GIFT-Eval: A Benchmark For General Time Series Forecasting Model Evaluation. Taha Aksu, Gerald Woo, Juncheng Liu, Xu Liu, Chenghao Liu, Silvio Savarese, Caiming Xiong, Doyen Sahoo. link
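Benchmarks such as FoundTS and GIFT-Eval compare models with standard point-forecast metrics; the snippet below is a small, self-contained sketch of one common choice, MASE. The seasonal period and the exact metric set used by those benchmarks vary per dataset, so this is illustrative only, not their evaluation code.

```python
import numpy as np

def mase(y_true, y_pred, y_train, season=1):
    """Mean Absolute Scaled Error: forecast MAE divided by the in-sample MAE
    of a seasonal-naive forecaster (illustrative, not a benchmark's exact code)."""
    y_true, y_pred, y_train = map(np.asarray, (y_true, y_pred, y_train))
    naive_mae = np.mean(np.abs(y_train[season:] - y_train[:-season]))
    return float(np.mean(np.abs(y_true - y_pred)) / naive_mae)

# Toy usage: score a naive "repeat the last value" forecast on a sine wave.
history = np.sin(np.arange(100) * 0.3)
future = np.sin(np.arange(100, 112) * 0.3)
naive_forecast = np.full(12, history[-1])
print(round(mase(future, naive_forecast, history), 3))
```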

Work

2024

  • Exploring Representations and Interventions in Time Series Foundation Models. Michał Wiliński, Mononito Goswami, Willa Potosnak, Nina Żukowska, Artur Dubrawski. link 🔗9

  • Towards Neural Scaling Laws for Time Series Foundation Models. Qingren Yao, Chao-Han Huck Yang, Renhe Jiang, Yuxuan Liang, Ming Jin, Shirui Pan. link 🔗1

  • UniMTS: Unified Pre-training for Motion Time Series. Xiyuan Zhang, Diyan Teng, Ranak Roy Chowdhury, Shuheng Li, Dezhi Hong, Rajesh K. Gupta, Jingbo Shang. link 🔗14 code

  • UNITS: A Unified Multi-Task Time Series Model. Shanghua Gao, Teddy Koker, Owen Queen, Thomas Hartvigsen, Theodoros Tsiligkaridis, Marinka Zitnik. link 🔗14 code

  • TIME-FFM: Towards LM-Empowered Federated Foundation Model for Time Series Forecasting. Qingxiang Liu, Xu Liu, Chenghao Liu, Qingsong Wen, Yuxuan Liang. link 🔗1

  • S^2IP-LLM: Semantic Space Informed Prompt Learning with LLM for Time Series Forecasting. Zijie Pan, Yushan Jiang, Sahil Garg, Anderson Schneider, Yuriy Nevmyvaka, Dongjin Song. link 🔗10 code

  • ROSE: Register Assisted General Time Series Forecasting with Decomposed Frequency Learning. Yihang Wang, Yuying Qiu, Peng Chen, Kai Zhao, Yang Shu, Zhongwen Rao, Lujia Pan, Bin Yang, Chenjuan Guo. link 🔗0

  • In-Context Fine-Tuning for Time-Series Foundation Models. Abhimanyu Das(Google Research), Matthew Faw, Rajat Sen, Yichen Zhou. link 🔗0

  • Moirai-MoE: Empowering Time Series Foundation Models with Sparse Mixture of Experts. Xu Liu(Salesforce AI Research, National University of Singapore), Juncheng Liu, Gerald Woo, Taha Aksu, Yuxuan Liang, Roger Zimmermann, Chenghao Liu, Silvio Savarese, Caiming Xiong, Doyen Sahoo. link 🔗0 code

  • FoMo: A Foundation Model for Mobile Traffic Forecasting with Diffusion Model. Haoye Chai(Tsinghua University), Shiyuan Zhang, Xiaoqian Qi, Yong Li. link

  • Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts. Xiaoming Shi(Princeton University), Shiyu Wang, Yuqi Nie, Dianqi Li, Zhou Ye, Qingsong Wen, and Ming Jin. link 🔗0 code

  • Tiny Time Mixers (TTMs): Fast Pre-trained Models for Enhanced Zero/Few-Shot Forecasting of Multivariate Time Series. Vijay Ekambaram(IBM Granite), Arindam Jati, Nam H. Nguyen, Pankaj Dayama, Chandra Reddy, Wesley M. Gifford, and Jayant Kalagnanam. link 🔗7 code

  • Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting. Kashif Rasul, Arjun Ashok, Andrew Robert Williams, Hena Ghonia, Rishika Bhagwatkar, Arian Khorasani, Mohammad Javad Darvishi Bayazi, et al. link 🔗15 code

  • Unified Training of Universal Time Series Forecasting Transformers. Gerald Woo(Salesforce AI Research), Chenghao Liu, Akshat Kumar, Caiming Xiong, Silvio Savarese, and Doyen Sahoo. link 🔗31 code

  • Chronos: Learning the Language of Time Series. Abdul Fatir Ansari(Amazon), Lorenzo Stella, Caner Turkmen, et al. link 🔗46 code

  • Moment: A family of open time-series foundation models. Mononito Goswami, Konrad Szafer, Arjun Choudhry, Yifu Cai, Shuo Li, Artur Dubrawski. link 🔗22 code

  • Timer: Generative Pre-trained Transformers Are Large Time Series Models. Yong Liu(Tsinghua University), Haoran Zhang, Chenyu Li, Xiangdong Huang, Jianmin Wang, Mingsheng Long. link 🔗4 code

2023

  • Large Language Models Are Zero-Shot Time Series Forecasters. Nate Gruver(NYU), Marc Finzi(CMU), Shikai Qiu(NYU), and Andrew G. Wilson(NYU). link 🔗174 code (a serialization sketch follows this list)
  • A decoder-only foundation model for time-series forecasting. Abhimanyu Das(Google Research), Weihao Kong, Rajat Sen, and Yichen Zhou. link 🔗55 code
  • One Fits All: Power General Time Series Analysis by Pretrained LM. Tian Zhou, Peisong Niu, Xue Wang, Liang Sun, Rong Jin. link 🔗220 code
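The Gruver et al. paper above treats forecasting as next-token prediction by serializing numeric values into text an LLM can continue. The sketch below shows one plausible serialization/deserialization round trip in plain Python; the scaling, digit precision, and separators are assumptions for illustration, not the paper's exact encoding scheme, and the actual LLM call is omitted.

```python
# Plausible LLMTime-style serialization: turn a numeric series into a digit string an
# LLM can continue, then decode the continuation back into numbers. Precision and
# separators are illustrative assumptions, not the paper's exact tokenization scheme.

def serialize(series, precision=2):
    """Encode values as fixed-precision integers separated by commas,
    with spaces between digits so each digit maps to its own token."""
    scale = 10 ** precision
    return " , ".join(" ".join(str(int(round(v * scale)))) for v in series)

def deserialize(text, precision=2):
    """Invert serialize(): strip digit spacing and rescale."""
    scale = 10 ** precision
    return [int(chunk.replace(" ", "")) / scale for chunk in text.split(",")]

history = [0.1, 0.25, 0.4, 0.55]
prompt = serialize(history)
print(prompt)               # "1 0 , 2 5 , 4 0 , 5 5"
print(deserialize(prompt))  # [0.1, 0.25, 0.4, 0.55]
# In practice the prompt is sent to an LLM and the sampled continuation is
# deserialized into forecast values; that call is intentionally left out here.
```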

Dataset

Spatio-temporal Foundation Model & Multivariate Time Series Foundation Model

Survey & Benchmark

Work

Dataset

Foundation Models for Dynamical Systems

Survey & Benchmark

Work

Dataset
