Shilong Liu, Feng Li, Hao Zhang, Xiao Yang, Xianbiao Qi, Hang Su, Jun Zhu, Lei Zhang
Here we provide the pretrained DAB-DETR weights based on detrex.
| Name | Backbone | Pretrain | Epochs | box AP | download |
| --- | --- | --- | --- | --- | --- |
| DAB-DETR-R50 | R-50 | IN1k | 50 | 43.3 | model |
| DAB-DETR-R101 | R-101 | IN1k | 50 | 44.0 | model |
| DAB-DETR-Swin-T | Swin-T | IN1k | 50 | 45.2 | model |
Here are the pretrained weights converted from the official DAB-DETR repo.
| Name | Backbone | Pretrain | Epochs | box AP | download |
| --- | --- | --- | --- | --- | --- |
| DAB-DETR-R50-3patterns | R-50 | IN1k | 50 | 42.8 | model |
| DAB-DETR-R50-DC5 | R-50 | IN1k | 50 | 44.6 | model |
| DAB-DETR-R50-DC5-3patterns | R-50 | IN1k | 50 | 45.7 | model |
| DAB-DETR-R101-DC5 | R-101 | IN1k | 50 | 45.7 | model |
All configs can be trained with:

```shell
cd detrex
python tools/train_net.py --config-file projects/dab_detr/configs/path/to/config.py --num-gpus 8
```
By default, we use 8 GPUs with a total batch size of 16 for training.
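If you train on fewer GPUs, you will typically want to scale the total batch size down to match. A minimal sketch, assuming detrex follows detectron2's LazyConfig convention of `key=value` overrides on the command line (as used for `train.init_checkpoint` below); the `dataloader.train.total_batch_size` key is an assumption, so check the actual key name in your config file:

```shell
cd detrex
# Train on 4 GPUs and halve the total batch size to keep per-GPU batch size at 2.
# NOTE: the override key below is an assumption; verify it against your config.
python tools/train_net.py \
    --config-file projects/dab_detr/configs/path/to/config.py \
    --num-gpus 4 \
    dataloader.train.total_batch_size=8
```

Changing the effective batch size may also require adjusting the learning rate in the config to reproduce the reported results.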
Model evaluation can be done as follows:

```shell
cd detrex
python tools/train_net.py --config-file projects/dab_detr/configs/path/to/config.py --eval-only train.init_checkpoint=/path/to/model_checkpoint
```
If you find our work helpful for your research, please consider citing the following BibTeX entry.
```BibTeX
@inproceedings{liu2022dabdetr,
  title={{DAB}-{DETR}: Dynamic Anchor Boxes are Better Queries for {DETR}},
  author={Shilong Liu and Feng Li and Hao Zhang and Xiao Yang and Xianbiao Qi and Hang Su and Jun Zhu and Lei Zhang},
  booktitle={International Conference on Learning Representations},
  year={2022},
  url={https://openreview.net/forum?id=oMI9PjOb9Jl}
}
```