Starred repositories
ReMoDiffuse: Retrieval-Augmented Motion Diffusion Model
MOMENT: A Family of Open Time-series Foundation Models
Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts
Unified Training of Universal Time Series Forecasting Transformers
ProbTS is a benchmarking toolkit for time series forecasting.
Codes for "Retrieval-Augmented Diffusion Models for Time Series Forecasting"
[AAAI-23 Oral] Official implementation of the paper "Are Transformers Effective for Time Series Forecasting?"
The official GitHub repository for the paper "Informer", accepted at AAAI 2021.
A PyTorch implementation of the Transformer model in "Attention is All You Need".
Translate PDFs, EPUBs, webpages, metadata, annotations, and notes to the target language. Supports 20+ translation services.
Slides and assignments for Hung-yi Lee's Spring 2021/2022/2023 machine learning courses
Deep Learning Specialization by Andrew Ng on Coursera.
Official codebase for the paper "Retrieval-Augmented Diffusion Models"
A latent text-to-image diffusion model
CLIP (Contrastive Language-Image Pretraining): predicts the most relevant text snippet for a given image
Raw data and processing scripts of Weather Captioned Dataset in TGTSF
Additional exercises and data for EE364a. No solutions; for public consumption.
TimesFM (Time Series Foundation Model) is a pretrained foundation model for time-series forecasting developed by Google Research.
Official code, datasets and checkpoints for "Timer: Generative Pre-trained Transformers Are Large Time Series Models" (ICML 2024)
An Extensible Toolkit for Finetuning and Inference of Large Foundation Models. Large Models for All.
Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting
Chronos: Pretrained Models for Probabilistic Time Series Forecasting
Awesome Learn From Model Beyond Fine-Tuning: A Survey
A Library for Advanced Deep Time Series Models.