Stars
AcadHomepage: A Modern and Responsive Academic Personal Homepage
A static single-page resume builder developed with React.js and the JSON Resume schema (https://suddi.io/)
MMGeneration is a powerful toolkit for generative models, based on PyTorch and MMCV.
FBCNet: An Efficient Multi-view Convolutional Neural Network for Brain-Computer Interface
The official code for "PMC-LLaMA: Towards Building Open-source Language Models for Medicine"
Code for the paper "Federated Transfer Learning for EEG Signal Classification", published at IEEE EMBS 2020 (42nd Annual International Conference of the IEEE Engineering in Medicine and …
PyTorch implementations of Generative Adversarial Networks.
PyTorch code for BLIP: Bootstrapping Language-Image Pre-training for Unified Vision-Language Understanding and Generation
Code for the paper "Improved Techniques for Training GANs"
🧑🏫 60+ Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), ga…
Code release for "Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting" (NeurIPS 2021), https://arxiv.org/abs/2106.13008
An official implementation of PatchTST: "A Time Series is Worth 64 Words: Long-term Forecasting with Transformers" (ICLR 2023), https://arxiv.org/abs/2211.14730
Dual Attention Network for Scene Segmentation (CVPR2019)
A Python package for SSVEP datasets and algorithms. The corresponding paper has been accepted by IEEE TNSRE (DOI: 10.1109/TNSRE.2024.3424410)
A framework for few-shot evaluation of language models.
A curated collection of open-source Chinese large language models, focusing on smaller models that can be privately deployed and trained at low cost, including base models, domain-specific fine-tuning and applications, datasets, and tutorials.
[ICLR 2024] M/EEG-based image decoding with contrastive learning: (i) proposes a contrastive learning framework to align images and EEG; (ii) resolves brain activity for biological plausibility.
A comprehensive list of Heterogeneous Transfer Learning methods with their resources (paper, code, and data).
[ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods
A Library for Advanced Deep Time Series Models.
A flexible PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments