Practice on CIFAR-100 (ResNet, DenseNet, VGG, GoogLeNet, InceptionV3, InceptionV4, Inception-ResNetV2, Xception, ResNet in ResNet, ResNeXt, ShuffleNet, ShuffleNetV2, MobileNet, MobileNetV2, SqueezeNet…
TFDS is a collection of datasets ready to use with TensorFlow, Jax, ...
Contrastive unpaired image-to-image translation, faster and lighter training than CycleGAN (ECCV 2020, in PyTorch)
Advanced Deep Learning with Keras, published by Packt
PyTorch implementation of various Knowledge Distillation (KD) methods (a minimal distillation-loss sketch appears after this list).
A toolkit to optimize Keras and TensorFlow ML models for deployment, including quantization and pruning (a pruning sketch appears after this list).
The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distillation-Oriented Trainer https://openaccess.thecvf.com/content…
Pretrained TorchVision models on the CIFAR-10 dataset (with weights)
Deep Learning Computer Vision Algorithms for Real-World Use
Convert trained PyTorch models to Keras, and the other way around
Scripts for the ImageNet-32 dataset
High-accuracy (>0.7) models (ResNet, ResNeXt, DenseNet, SENet, SE-ResNeXt) in TensorFlow.
Compress neural networks with pruning and quantization using TensorFlow.
🔬 Some personal research code on analyzing CNNs. Started with a thorough exploration of Stanford's Tiny-Imagenet-200 dataset.
AutoEncoder trained on ImageNet
Train ResNet on ImageNet in TensorFlow 2.0; complete ImageNet training code for ResNet.
Parse Robinhood 1099 Tax Document from PDF into CSV
[ACLW'24] LMPT: Prompt Tuning with Class-Specific Embedding Loss for Long-tailed Multi-Label Visual Recognition
Unofficial PyTorch implementation of Deep Compression on CIFAR-10
Keras + TensorFlow experiments with knowledge distillation on the EMNIST dataset
95.76% accuracy on CIFAR-10 with TensorFlow 2
Keywords: Image Denoising, CNNs, Autoencoders, Residual Learning, PyTorch (a residual-denoiser sketch appears after this list)
[ICCV 2023] Efficient Joint Optimization of Layer-Adaptive Weight Pruning in Deep Neural Networks
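For the knowledge-distillation entries above (the PyTorch KD collection and the Keras/EMNIST experiments), a classic Hinton-style distillation loss is a useful reference point. This is a minimal sketch of the generic technique, not the specific losses implemented in those repositories; the temperature and weighting values are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.9):
    """Hinton-style KD: match temperature-softened teacher and student
    distributions with KL divergence, mixed with the hard-label loss."""
    log_student = F.log_softmax(student_logits / temperature, dim=1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
    # Scale by T^2 so the soft-target gradients keep a comparable magnitude.
    kd = F.kl_div(log_student, soft_teacher, reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# Toy usage with random logits (batch of 8, 100 classes, e.g. CIFAR-100):
student = torch.randn(8, 100, requires_grad=True)
teacher = torch.randn(8, 100)
labels = torch.randint(0, 100, (8,))
loss = distillation_loss(student, teacher, labels)
loss.backward()
```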
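The model-optimization toolkit entry and the TensorFlow compression repository above both revolve around pruning and quantization. The sketch below assumes the TensorFlow Model Optimization Toolkit (`tensorflow_model_optimization`) and shows magnitude-based pruning of a placeholder Keras model; the sparsity schedule and architecture are illustrative, not taken from those repositories.

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

# Placeholder model; in practice this would be the network to compress.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(256, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dense(10),
])

# Ramp sparsity from 0% to 80% over the first 1000 steps (illustrative values).
schedule = tfmot.sparsity.keras.PolynomialDecay(
    initial_sparsity=0.0, final_sparsity=0.8, begin_step=0, end_step=1000)

pruned = tfmot.sparsity.keras.prune_low_magnitude(model, pruning_schedule=schedule)
pruned.compile(optimizer="adam",
               loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
               metrics=["accuracy"])

# Pruning needs this callback to update the sparsity masks during training.
callbacks = [tfmot.sparsity.keras.UpdatePruningStep()]
# pruned.fit(x_train, y_train, epochs=2, callbacks=callbacks)

# Strip the pruning wrappers before export so only the sparse weights remain.
final_model = tfmot.sparsity.keras.strip_pruning(pruned)
```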
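For the image-denoising entry (CNNs, autoencoders, residual learning), a common pattern is a DnCNN-style network that predicts the noise and subtracts it from the input. The sketch below is a generic illustration under that assumption, not the architecture from the repository itself; layer counts and sizes are arbitrary.

```python
import torch
import torch.nn as nn

class ResidualDenoiser(nn.Module):
    """Small DnCNN-style denoiser: convolutional layers predict the noise,
    and the clean estimate is the input minus that prediction (residual learning)."""
    def __init__(self, channels=3, features=64, depth=5):
        super().__init__()
        layers = [nn.Conv2d(channels, features, 3, padding=1), nn.ReLU(inplace=True)]
        for _ in range(depth - 2):
            layers += [nn.Conv2d(features, features, 3, padding=1),
                       nn.BatchNorm2d(features),
                       nn.ReLU(inplace=True)]
        layers += [nn.Conv2d(features, channels, 3, padding=1)]
        self.body = nn.Sequential(*layers)

    def forward(self, noisy):
        noise = self.body(noisy)   # predict the noise component
        return noisy - noise       # residual connection: subtract it

# Toy usage: denoise a random 32x32 RGB batch and compute an MSE loss.
model = ResidualDenoiser()
clean = torch.rand(4, 3, 32, 32)
noisy = clean + 0.1 * torch.randn_like(clean)
loss = nn.functional.mse_loss(model(noisy), clean)
loss.backward()
```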