Compression
[ICLR 2020] Drawing Early-Bird Tickets: Toward More Efficient Training of Deep Networks
This repository contains a PyTorch implementation of the paper "The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks" by Jonathan Frankle and Michael Carbin (a generic magnitude-pruning sketch follows this list).
PyTorch implementation of various Knowledge Distillation (KD) methods (see the KD-loss sketch after this list).
Awesome Knowledge-Distillation. Knowledge-distillation papers (2014-2021), organized by category.
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
Awesome Knowledge Distillation
MXNet implementation of Single Path One-Shot NAS with a full training and searching pipeline. Supports both block and channel selection. Searched models that outperform those reported in the original paper are provided.
An open-source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression, and hyper-parameter tuning.
Official code for our ECCV'22 paper "A Fast Knowledge Distillation Framework for Visual Recognition"
Code for SkipNet: Learning Dynamic Routing in Convolutional Networks (ECCV 2018)
Conditional channel- and precision-pruning on neural networks
Official PyTorch Implementation of Dynamic Hyperpixel Flow, ECCV 2020
Mayo: Auto-generation of hardware-friendly deep neural networks. Dynamic Channel Pruning: Feature Boosting and Suppression.
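The lottery-ticket and early-bird-ticket entries above extract sparse subnetworks via magnitude pruning. The following is a minimal, repo-agnostic sketch of global unstructured magnitude pruning, not code from the listed repositories; the helper name `lottery_ticket_masks` and its defaults are illustrative assumptions.

```python
import torch
import torch.nn as nn

def lottery_ticket_masks(model: nn.Module, sparsity: float = 0.8):
    """Global unstructured magnitude pruning (illustrative sketch):
    zero out the `sparsity` fraction of smallest-magnitude weights,
    as done when drawing a winning ticket. Returns {name: binary mask}
    for every weight matrix/kernel (biases and 1-D params are skipped)."""
    weights = {n: p.detach().abs() for n, p in model.named_parameters() if p.dim() > 1}
    scores = torch.cat([w.flatten() for w in weights.values()])
    k = max(1, int(sparsity * scores.numel()))
    threshold = scores.kthvalue(k).values  # k-th smallest magnitude across all layers
    return {n: (w > threshold).float() for n, w in weights.items()}

# Usage sketch: apply the masks to (re)initialised weights, then retrain.
# masks = lottery_ticket_masks(model, sparsity=0.8)
# for n, p in model.named_parameters():
#     if n in masks:
#         p.data.mul_(masks[n])
```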
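Several of the listed repositories are PyTorch knowledge-distillation codebases. As quick orientation, here is a minimal sketch of the classic temperature-scaled distillation loss (KL divergence between softened teacher and student logits blended with cross-entropy); the function name `kd_loss` and the defaults `T=4.0`, `alpha=0.9` are illustrative assumptions, not code from any listed repo.

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Hinton-style distillation loss (illustrative sketch)."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),   # student: log-probabilities
        F.softmax(teacher_logits / T, dim=1),       # teacher: probabilities
        reduction="batchmean",
    ) * (T * T)                                     # rescale soft-target gradients by T^2
    hard = F.cross_entropy(student_logits, labels)  # ordinary supervised loss
    return alpha * soft + (1.0 - alpha) * hard
```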