Stars
Enchanted is an iOS and macOS app for chatting with private, self-hosted language models such as Llama 2, Mistral, or Vicuna using Ollama.
PyTorch implementation of multi-task learning architectures, incl. MTI-Net (ECCV2020).
Official PyTorch Code for ICLR24 Workshop Paper: PostRainBench
A global community dataset for large-sample hydrology
[Mamba-Survey-2024] Paper list for State-Space-Model/Mamba and its applications
A collection of learning resources for curious software engineers
Transformer training code for sequential tasks
A little walk-through of different types of blocks, with their corresponding implementations in PyTorch
A benchmark for the next generation of data-driven global weather models.
Deep learning PyTorch library for time series forecasting, classification, and anomaly detection (originally for flood forecasting).
1st place solution for "xView2: Assess Building Damage" challenge.
Time-Series Work Summary in CS Top Conferences (NIPS, ICML, ICLR, KDD, AAAI, WWW, IJCAI, CIKM, ICDM, ICDE, etc.)
Adversarial Sparse Transformer for Time Series Forecasting
[AAAI-23 Oral] Official implementation of the paper "Are Transformers Effective for Time Series Forecasting?"
Hierarchical Image Pyramid Transformer - CVPR 2022 (Oral)
A comprehensive paper list on Vision Transformers/attention, including papers, code, and related websites
Graph Transformer Networks (Authors' PyTorch implementation for the NeurIPS 19 paper)
A Library for Advanced Deep Time Series Models.
An official implementation of PatchTST: "A Time Series is Worth 64 Words: Long-term Forecasting with Transformers" (ICLR 2023). https://arxiv.org/abs/2211.14730
A professionally curated list of awesome resources (paper, code, data, etc.) on transformers in time series.
Multivariate Time Series Transformer, public version
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
A Large-scale Benchmark Dataset for Data-Driven Streamflow Forecasting
ECCV18 Workshops - Enhanced SRGAN. Champion PIRM Challenge on Perceptual Super-Resolution. The training codes are in BasicSR.