Official PyTorch Implementation of MetaUAS: Universal Anomaly Segmentation with One-Prompt Meta-Learning, NeurIPS 2024.
MetaUAS unifies anomaly segmentation into change segmentation and provides a pure visual foundation model that requires only a single normal image as a prompt and no additional training, yet effectively and efficiently segments any visual anomaly. MetaUAS significantly outperforms most zero-shot, few-shot, and even full-shot anomaly segmentation methods.
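To illustrate the idea of one-prompt anomaly segmentation as change segmentation, here is a minimal, hypothetical PyTorch sketch: a query image is compared against a single normal prompt image in feature space, and the feature difference is decoded into a per-pixel anomaly map. The tiny encoder/decoder below are toy stand-ins for illustration only, not the actual MetaUAS architecture or weights.

```python
# Hypothetical sketch: one-prompt anomaly segmentation as change
# segmentation. The encoder/decoder are toy stand-ins with random
# weights, NOT the MetaUAS model.
import torch
import torch.nn as nn
import torch.nn.functional as F


class OnePromptSegmenter(nn.Module):
    def __init__(self, channels: int = 16):
        super().__init__()
        # Toy convolutional encoder (a real system would use a
        # pretrained backbone).
        self.encoder = nn.Sequential(
            nn.Conv2d(3, channels, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Toy decoder mapping prompt/query feature change to a mask logit.
        self.decoder = nn.Conv2d(channels, 1, 1)

    def forward(self, query: torch.Tensor, prompt: torch.Tensor) -> torch.Tensor:
        fq, fp = self.encoder(query), self.encoder(prompt)
        # "Change" features: where the query deviates from the normal prompt.
        diff = torch.abs(fq - fp)
        logits = self.decoder(diff)
        # Upsample to input resolution and squash scores into [0, 1].
        return torch.sigmoid(
            F.interpolate(logits, size=query.shape[-2:],
                          mode="bilinear", align_corners=False))


if __name__ == "__main__":
    model = OnePromptSegmenter().eval()
    query = torch.rand(1, 3, 64, 64)   # image to inspect
    prompt = torch.rand(1, 3, 64, 64)  # one normal reference image
    with torch.no_grad():
        anomaly_map = model(query, prompt)
    print(anomaly_map.shape)  # torch.Size([1, 1, 64, 64])
```

At inference time only one normal prompt image is needed per object category, which is what makes the approach training-free for novel products.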
You can use our Online Demo to test your custom data for a quick start. Note that the online demo currently runs on CPU. You can also deploy the demo application to your own CPU/GPU server with the following commands:
pip install -r requirements.txt
python app.py
If you find this code useful in your research, please consider citing us:
@inproceedings{gao2024metauas,
  title     = {MetaUAS: Universal Anomaly Segmentation with One-Prompt Meta-Learning},
  author    = {Gao, Bin-Bin},
  booktitle = {Advances in Neural Information Processing Systems},
  pages     = {39812--39836},
  year      = {2024}
}