IResNet ArcFace in PyTorch

My master's thesis on optimizing deep learning face recognition models using quantization, pruning, and knowledge distillation.

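As background on the optimization techniques named above, the sketch below shows post-training dynamic quantization and L1 magnitude pruning using only standard PyTorch APIs. The stand-in model and the 30% pruning ratio are illustrative assumptions; this is not the thesis's actual optimization pipeline.

import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Stand-in model for illustration; the thesis targets IResNet backbones instead.
model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 512))

# Magnitude (L1) pruning: zero out 30% of the smallest weights in each Linear layer.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")  # bake the pruning mask into the weights

# Post-training dynamic quantization: Linear weights are stored as int8.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)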
Requirements

Training

To train a model, run /KD_Iresnet/train_kd.sh with the correct paths set.
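For orientation, here is a minimal sketch of the template-level knowledge distillation objective described in the Huber et al. (2021) paper cited below: the student is trained with the usual ArcFace classification loss plus an MSE term that pulls its embeddings toward a frozen teacher's. The function and argument names, the loss weight, and the use of plain MSE are illustrative assumptions; the actual training logic lives in the code driven by train_kd.sh.

import torch
import torch.nn.functional as F

def kd_step(student, teacher, arcface_head, images, labels, kd_weight=1.0):
    # Hypothetical training step: ArcFace loss + embedding-level distillation.
    with torch.no_grad():
        teacher_emb = teacher(images)               # frozen teacher embeddings
    student_emb = student(images)                   # student embeddings
    logits = arcface_head(student_emb, labels)      # additive angular margin logits (assumed head interface)
    cls_loss = F.cross_entropy(logits, labels)      # standard classification loss
    kd_loss = F.mse_loss(student_emb, teacher_emb)  # pull student templates toward the teacher's
    return cls_loss + kd_weight * kd_loss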

Testing

To test a model, run mainOne.sh with the correct model path.
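For context, a minimal sketch of how a trained model is typically evaluated for 1:1 face verification: embeddings of an image pair are L2-normalized and compared by cosine similarity against a threshold. The function name and the threshold value are illustrative assumptions; the repository's actual evaluation is done by mainOne.sh.

import torch
import torch.nn.functional as F

@torch.no_grad()
def verify_pair(model, img_a, img_b, threshold=0.3):
    # Hypothetical 1:1 verification on batches of aligned face crops.
    emb_a = F.normalize(model(img_a), dim=1)   # unit-length embeddings
    emb_b = F.normalize(model(img_b), dim=1)
    similarity = (emb_a * emb_b).sum(dim=1)    # cosine similarity of unit vectors
    return similarity > threshold, similarity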

Accuracy

Credits

This project is made possible by the wonderful people and projects listed below.

  1. Fadi Boutros - Masked-Face-Recognition-KD
  2. Jia Guo - Insightface
  3. Jiankang Deng - Insightface
  4. Xiang An - Insightface
  5. Jack Yu - Insightface
  6. Baris Gecer - Insightface

Citation

@inproceedings{deng2019arcface,
  title={Arcface: Additive angular margin loss for deep face recognition},
  author={Deng, Jiankang and Guo, Jia and Xue, Niannan and Zafeiriou, Stefanos},
  booktitle={Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition},
  pages={4690--4699},
  year={2019}
}
@inproceedings{an2020partical_fc,
  title={Partial FC: Training 10 Million Identities on a Single Machine},
  author={An, Xiang and Zhu, Xuhan and Xiao, Yang and Wu, Lan and Zhang, Ming and Gao, Yuan and Qin, Bin and
  Zhang, Debing and Fu, Ying},
  booktitle={Arxiv 2010.05222},
  year={2020}
}
@inproceedings{huber2021maskinvariant,
  author={Huber, Marco and Boutros, Fadi and Kirchbuchner, Florian and Damer, Naser},
  booktitle={2021 16th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2021)},
  title={Mask-invariant Face Recognition through Template-level Knowledge Distillation},
  year={2021},
  pages={1-8},
  doi={10.1109/FG52635.2021.9667081}
}
