A curated list of neural network pruning resources.
An up-to-date (2024) list of datasets, codebases, and papers on Multi-Task Learning (MTL), from a machine learning perspective.
A curated list of papers presenting interesting empirical studies and insights on deep learning. Continually updated.
An implementation of "Federated Learning with Non-IID Data via Local Drift Decoupling and Correction"
NFNets and Adaptive Gradient Clipping for SGD implemented in PyTorch. An explanation is available at tourdeml.github.io/blog/ (a minimal AGC sketch follows this list).
[ICLR'21] FedBN: Federated Learning on Non-IID Features via Local Batch Normalization (an aggregation sketch follows this list)
A flexible Federated Learning Framework based on PyTorch, simplifying your Federated Learning research.
"Multi-Source Collaborative Gradient Discrepancy Minimization for Federated Domain Generalization", AAAI 2024, MindSpore version
Official repo for the WACV 2023 paper: Federated Domain Generalization for Image Recognition via Cross-Client Style Transfer.
Localize-and-Stitch: Efficient Model Merging via Sparse Task Arithmetic (a task-arithmetic sketch follows this list)
[ICLR24 (Spotlight)] "SalUn: Empowering Machine Unlearning via Gradient-based Weight Saliency in Both Image Classification and Generation" by Chongyu Fan*, Jiancheng Liu*, Yihua Zhang, Eric Wong, D…
FusionBench: A Comprehensive Benchmark/Toolkit of Deep Model Fusion
Diabetic Retinopathy Grading with Weakly-Supervised Lesion Priors
Model Merging in LLMs, MLLMs, and Beyond: Methods, Theories, Applications and Opportunities. arXiv:2408.07666.
RETFound - A foundation model for retinal images
Official Implementation of ICLR 2024 paper "Harnessing Explanations: LLM-to-LM Interpreter for Enhanced Text-Attributed Graph Representation Learning"
Exploring the Potential of Large Language Models (LLMs) in Learning on Graphs
PyTorch implementation of "Supervised Contrastive Learning" (and SimCLR incidentally); a sketch of the SupCon loss follows this list.
Published papers focusing on graph domain adaptation
A Python library that assists deep learning on graphs
A comprehensive resource hub compiling all graph papers accepted at the International Conference on Learning Representations (ICLR) 2024.
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
An autoregressive character-level language model for making more things
Tutel MoE: An Optimized Mixture-of-Experts Implementation
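
A few of the entries above are simple enough to sketch. For the NFNets entry, Adaptive Gradient Clipping (AGC) rescales each gradient so its norm stays below a fixed fraction of the corresponding parameter's norm, computed unit-wise for weight tensors. A minimal PyTorch sketch, assuming the defaults reported in the NFNets paper (this is not the linked repo's code, and the function name is mine):

```python
import torch

@torch.no_grad()
def adaptive_grad_clip(parameters, clipping=0.01, eps=1e-3):
    """Illustrative AGC: clip each gradient to at most `clipping` times
    the norm of its parameter, one norm per output unit for >=2-D tensors."""
    for p in parameters:
        if p.grad is None:
            continue
        if p.ndim > 1:
            dims = tuple(range(1, p.ndim))             # norm over fan-in dims
            p_norm = p.norm(dim=dims, keepdim=True).clamp(min=eps)
            g_norm = p.grad.norm(dim=dims, keepdim=True)
        else:                                           # biases / scalars
            p_norm = p.norm().clamp(min=eps)
            g_norm = p.grad.norm()
        # rescale only the units whose gradient norm exceeds the threshold
        scale = (clipping * p_norm / g_norm.clamp(min=1e-6)).clamp(max=1.0)
        p.grad.mul_(scale)
```

It would be called between `loss.backward()` and `optimizer.step()`.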
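The FedBN entry boils down to one change to federated averaging: batch-normalization parameters are never aggregated at the server, so each client keeps BN statistics matched to its local feature distribution. A hedged sketch of that aggregation step (`client_states` and the `'bn'` key convention are assumptions of mine, not the repo's API):

```python
import torch

def fedbn_average(client_states, is_bn_key=lambda k: 'bn' in k.lower()):
    """FedBN-style aggregation sketch: average all tensors across clients
    EXCEPT batch-norm entries, which stay local to each client.

    client_states: list of state_dicts with identical keys.
    Returns one updated state_dict per client.
    """
    shared_keys = [k for k in client_states[0] if not is_bn_key(k)]
    # plain FedAvg on the shared (non-BN) tensors (cast to float for the mean)
    avg = {k: torch.stack([s[k].float() for s in client_states]).mean(0)
           for k in shared_keys}
    new_states = []
    for state in client_states:
        updated = dict(state)            # BN stats/affine params kept as-is
        updated.update(avg)
        new_states.append(updated)
    return new_states
```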
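The Localize-and-Stitch entry builds on task arithmetic: the difference between fine-tuned and pretrained weights acts as a "task vector", and merging adds sparse pieces of one or more task vectors onto a base model. The sketch below localizes with a simple magnitude mask purely for illustration; the paper itself learns the masks, so treat all names here as hypothetical:

```python
import torch

def sparse_task_arithmetic(base_state, finetuned_state, keep_ratio=0.1):
    """Toy sparse task arithmetic: add only the largest-magnitude entries
    of the task vector (finetuned - base) back onto the base weights."""
    merged = {}
    for name, base_w in base_state.items():
        if not torch.is_floating_point(base_w):
            merged[name] = base_w                      # skip integer buffers
            continue
        task_vec = finetuned_state[name] - base_w      # "task vector"
        k = max(1, int(keep_ratio * task_vec.numel()))
        # localize: magnitude-based sparse mask over the task vector
        kth = task_vec.numel() - k + 1                 # k-th largest entry
        thresh = task_vec.abs().flatten().kthvalue(kth).values
        mask = task_vec.abs() >= thresh
        # stitch: graft only the localized deltas onto the base model
        merged[name] = base_w + task_vec * mask
    return merged
```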
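Finally, for the Supervised Contrastive Learning entry: the SupCon loss pulls each anchor toward every same-class sample in the batch and pushes it away from the rest. A condensed single-view sketch of the loss (the linked repo additionally handles multiple augmented views per image and further numerical-stability details):

```python
import torch
import torch.nn.functional as F

def supcon_loss(features, labels, temperature=0.1):
    """Single-view sketch of the supervised contrastive (SupCon) loss.

    features: (N, D) embeddings; labels: (N,) integer class labels.
    Anchors with no same-class partner in the batch are skipped.
    """
    z = F.normalize(features, dim=1)               # cosine-similarity space
    sim = z @ z.T / temperature                    # (N, N) scaled similarities
    n = z.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, -1e9)         # exclude self-comparisons
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    # log p(positive | anchor): softmax over all other samples in the batch
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_counts = pos_mask.sum(dim=1)
    valid = pos_counts > 0
    mean_log_prob_pos = (log_prob * pos_mask).sum(dim=1)[valid] / pos_counts[valid]
    return -mean_log_prob_pos.mean()
```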