MRKD - A Masked Reverse Knowledge Distillation Method Incorporating Global and Local Information for Image Anomaly Detection

This is the official PyTorch implementation of “A Masked Reverse Knowledge Distillation Method Incorporating Global and Local Information for Image Anomaly Detection” (MRKD), accepted by Knowledge-Based Systems.

Paper link.

Abstract: Knowledge distillation is an effective image anomaly detection and localization scheme. However, a major drawback of this scheme is its tendency to overly generalize, primarily due to the similarities between input and supervisory signals. In order to address this issue, this paper introduces a novel technique called masked reverse knowledge distillation (MRKD). By employing image-level masking (ILM) and feature-level masking (FLM), MRKD transforms the task of image reconstruction into image restoration. Specifically, ILM helps to capture global information by differentiating input signals from supervisory signals. On the other hand, FLM incorporates synthetic feature-level anomalies to ensure that the learned representations contain sufficient local information. With these two strategies, MRKD is endowed with stronger image context capture capacity and is less likely to be overgeneralized. Experiments on the widely used MVTec anomaly detection dataset demonstrate that MRKD achieves impressive performance: image-level 98.9% AU-ROC, pixel-level 98.4% AU-ROC, and 95.3% AU-PRO. In addition, extensive ablation experiments have validated the superiority of MRKD in mitigating the overgeneralization problem.
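As a rough illustration of the image-level masking idea described above (a minimal sketch only; the patch-based masking function, its parameters, and the teacher/student variable names are illustrative assumptions, not the paper's exact procedure):

import torch

def random_patch_mask(images, patch_size=32, num_patches=4):
    # Hypothetical ILM sketch: zero out a few random square patches per image,
    # so the student must restore, rather than merely reconstruct, the input.
    masked = images.clone()
    b, _, h, w = images.shape
    for i in range(b):
        for _ in range(num_patches):
            top = int(torch.randint(0, h - patch_size + 1, (1,)))
            left = int(torch.randint(0, w - patch_size + 1, (1,)))
            masked[i, :, top:top + patch_size, left:left + patch_size] = 0.0
    return masked

# The frozen teacher encodes the unmasked image (the supervisory signal),
# while the student works from the masked view:
# teacher_features = teacher(batch)
# student_features = student(random_patch_mask(batch))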

Keywords: Image anomaly detection, Knowledge distillation, Deep learning

Implementation

  1. Environment.

pytorch == 1.12.0

torchvision == 0.13.0

numpy == 1.21.6

scipy == 1.7.3

matplotlib == 3.5.2

tqdm
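An equivalent pip install (a suggestion only; it assumes a CUDA build of PyTorch appropriate for your machine) is roughly:

pip install torch==1.12.0 torchvision==0.13.0 numpy==1.21.6 scipy==1.7.3 matplotlib==3.5.2 tqdm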

  2. Dataset.

Download the MVTec dataset here.
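After extraction, MVTec AD follows its standard per-category layout (shown here for reference; the exact dataset path expected by main.py is not specified in this README and may need to be adjusted in the script):

mvtec_anomaly_detection/
    bottle/
        train/good/
        test/good/, test/broken_large/, ...
        ground_truth/broken_large/, ...
    cable/
    ...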

  3. Run the following command to start training and evaluation.
python main.py

Visualization

Reference

@article{jiang2023masked,
  title={A masked reverse knowledge distillation method incorporating global and local information for image anomaly detection},
  author={Jiang, Yuxin and Cao, Yunkang and Shen, Weiming},
  journal={Knowledge-Based Systems},
  volume={280},
  pages={110982},
  year={2023},
  publisher={Elsevier}
}
