Transformer for Object Re-Identification: A Survey. arXiv
- An implementation of UntransReID for unsupervised Re-ID is HERE.
- An implementation of UntransReID for cross-modality visible-infrared unsupervised Re-ID is HERE.
- An implementation of the unified experimental standard for animal Re-ID is HERE.
- An in-depth analysis of the Transformer's strengths, highlighting its impact across four key Re-ID directions: image/video-based Re-ID, Re-ID with limited data/annotations, cross-modal Re-ID, and special scenarios.
- A new Transformer-based unsupervised baseline, UntransReID, achieving state-of-the-art performance on both single- and cross-modality Re-ID.
- A unified experimental standard for animal Re-ID, designed to address its unique challenges and evaluate the potential of Transformer-based approaches.
Please kindly cite this paper in your publications if it helps your research:
@article{ye2024transformer,
title={Transformer for Object Re-Identification: A Survey},
author={Ye, Mang and Chen, Shuoyi and Li, Chenyue and Zheng, Wei-Shi and Crandall, David and Du, Bo},
journal={arXiv preprint arXiv:2401.06960},
year={2024}
}
Deep Learning for Person Re-identification: A Survey and Outlook. PDF with supplementary materials. arXiv
- An implementation of AGW for cross-modality visible-infrared Re-ID is HERE.
- An implementation of AGW for video Re-ID is HERE.
- An implementation of AGW for partial Re-ID is HERE.
- A simplified introduction in Chinese on Zhihu (知乎).
- A comprehensive survey with in-depth analysis of closed- and open-world person Re-ID in recent years (2016-2020).
- A new evaluation metric, namely mean Inverse Negative Penalty (mINP), which measures the ability to find the hardest correct match (see the sketch after this list).
- A new AGW baseline with a non-local Attention block, Generalized-mean pooling, and a Weighted regularization triplet loss (a pooling sketch also follows below). It achieves competitive performance on FOUR challenging Re-ID tasks: single-modality image-based Re-ID, video-based Re-ID, partial Re-ID, and cross-modality Re-ID.
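A minimal NumPy sketch of how mINP can be computed from ranked retrieval results, based on the definition above (the function name and input format are illustrative, not the repo's evaluation code):

```python
import numpy as np

def mean_inverse_negative_penalty(ranked_matches):
    """mINP over a set of queries.

    ranked_matches: list of 1-D binary arrays; entry j of array i is 1 if the
    gallery item ranked (j+1)-th for query i is a correct match.
    For each query, INP = |G| / R_hard, where R_hard is the rank of the
    hardest (last) correct match and |G| is the number of correct matches.
    """
    inps = []
    for matches in ranked_matches:
        good = np.flatnonzero(np.asarray(matches))
        if good.size == 0:                 # skip queries with no correct match
            continue
        hardest_rank = good[-1] + 1        # 1-based rank of the last correct match
        inps.append(good.size / hardest_rank)
    return float(np.mean(inps))

# Example: 3 correct matches, the hardest one sitting at rank 5 -> INP = 3/5
print(mean_inverse_negative_penalty([[1, 0, 1, 0, 1]]))   # 0.6
```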
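Among the AGW components, Generalized-mean (GeM) pooling interpolates between average and max pooling through a learnable exponent p. A minimal PyTorch sketch (the class name and the default p=3 are assumptions for illustration, not the repo's exact module):

```python
import torch
import torch.nn as nn

class GeneralizedMeanPooling(nn.Module):
    """GeM pooling: ((1/|X|) * sum_i x_i^p)^(1/p) over the spatial dimensions.

    p = 1 recovers average pooling; p -> infinity approaches max pooling.
    """
    def __init__(self, p=3.0, eps=1e-6):
        super().__init__()
        self.p = nn.Parameter(torch.ones(1) * p)   # learnable exponent
        self.eps = eps

    def forward(self, x):                          # x: (B, C, H, W)
        x = x.clamp(min=self.eps).pow(self.p)
        return x.mean(dim=(-2, -1)).pow(1.0 / self.p)   # -> (B, C)

# Example: pool a batch of backbone feature maps into per-image descriptors.
feat = torch.rand(2, 2048, 16, 8)
print(GeneralizedMeanPooling()(feat).shape)        # torch.Size([2, 2048])
```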
Method | Pretrained | Rank@1 | mAP | mINP | Model | Paper |
---|---|---|---|---|---|---|
BagTricks | ImageNet | 86.4% | 76.4% | 40.7% | Code | Bag of Tricks and A Strong Baseline for Deep Person Re-identification. In ArXiv 19. PDF |
ABD-Net | ImageNet | 89.0% | 78.6% | 42.1% | Code | ABD-Net: Attentive but Diverse Person Re-Identification. In ICCV 19. PDF |
AGW | ImageNet | 89.0% | 79.6% | 45.7% | GoogleDrive | Deep Learning for Person Re-identification: A Survey and Outlook. In ArXiv 20. arXiv |
Method | Pretrained | Rank@1 | mAP | mINP | Model | Paper |
---|---|---|---|---|---|---|
BagTricks | ImageNet | 94.5% | 85.9% | 59.4% | Code | Bag of Tricks and A Strong Baseline for Deep Person Re-identification. In ArXiv 19. arXiv |
ABD-Net | ImageNet | 95.6% | 88.3% | 66.2% | Code | ABD-Net: Attentive but Diverse Person Re-Identification. In ICCV 19. PDF |
AGW | ImageNet | 95.1% | 87.8% | 65.0% | GoogleDrive | Deep Learning for Person Re-identification: A Survey and Outlook. In ArXiv 20. arXiv |
Method | Pretrained | Rank@1 | mAP | mINP | Model | Paper |
---|---|---|---|---|---|---|
BagTricks | ImageNet | 58.0% | 56.6% | 43.8% | Code | Bag of Tricks and A Strong Baseline for Deep Person Re-identification. In ArXiv 19. PDF |
AGW | ImageNet | 63.6% | 62.0% | 50.3% | GoogleDrive | Deep Learning for Person Re-identification: A Survey and Outlook. In ArXiv 20. arXiv |
Method | Pretrained | Rank@1 | mAP | mINP | Model | Paper |
---|---|---|---|---|---|---|
BagTricks | ImageNet | 63.4% | 45.1% | 12.4% | Code | Bag of Tricks and A Strong Baseline for Deep Person Re-identification. In ArXiv 19. arXiv |
AGW | ImageNet | 68.3% | 49.3% | 14.7% | GoogleDrive | Deep Learning for Person Re-identification: A Survey and Outlook. In ArXiv 20. arXiv |
- Create a directory to store Re-ID datasets under this repo, taking Market1501 as an example:
cd ReID-Survey
mkdir toDataset
- Set _C.DATASETS.ROOT_DIR = ('./toDataset') in config/defaults.py.
- Download the dataset to toDataset/ from http://www.liangzheng.org/Project/project_reid.html
- Extract the dataset and rename it to market1501. The data structure should look like:
toDataset
    market1501
        bounding_box_test/
        bounding_box_train/
        ......
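A quick, optional sanity check that the layout above is in place (a small hypothetical helper, not part of the repo):

```python
from pathlib import Path

def check_market1501_layout(root='./toDataset/market1501'):
    """Verify the Market1501 folders shown above exist under the dataset root."""
    root = Path(root)
    missing = [d for d in ('bounding_box_train', 'bounding_box_test')
               if not (root / d).is_dir()]
    if missing:
        raise FileNotFoundError(f'Missing folders under {root}: {missing}')
    print(f'Market1501 layout looks OK under {root}')

check_market1501_layout()
```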
Partial-REID and Partial-iLIDS datasets are provided by https://github.com/lingxiao-he/Partial-Person-ReID
- pytorch=1.0.0
- torchvision=0.2.1
- pytorch-ignite=0.1.2
- yacs
- scipy=1.2.1
- h5py
To train an AGW model on Market1501 with GPU device 0, run:
python3 tools/main.py --config_file='configs/AGW_baseline.yml' MODEL.DEVICE_ID "('0')" DATASETS.NAMES "('market1501')" OUTPUT_DIR "('./log/market1501/Experiment-AGW-baseline')"
To test an AGW model on Market1501 with the weight file './pretrained/market1501_AGW.pth', run:
python3 tools/main.py --config_file='configs/AGW_baseline.yml' MODEL.DEVICE_ID "('0')" DATASETS.NAMES "('market1501')" MODEL.PRETRAIN_CHOICE "('self')" TEST.WEIGHT "('./pretrained/market1501_AGW.pth')" TEST.EVALUATE_ONLY "('on')" OUTPUT_DIR "('./log/Test')"
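The trailing KEY VALUE pairs in these commands are yacs-style config overrides; yacs parses each value string with ast.literal_eval, which is why string options are written as "('0')" or "('market1501')": the quotes-inside-parentheses form keeps them as Python strings after shell quoting. A minimal sketch of the mechanism, assuming the standard yacs merge_from_list API (the defaults below are illustrative, the real ones live in config/defaults.py):

```python
from yacs.config import CfgNode as CN

# Illustrative defaults; see config/defaults.py for the repo's actual ones.
cfg = CN()
cfg.MODEL = CN()
cfg.MODEL.DEVICE_ID = '0'
cfg.DATASETS = CN()
cfg.DATASETS.NAMES = 'dukemtmc'

# Command-line overrides arrive as a flat [KEY, VALUE, KEY, VALUE, ...] list;
# each VALUE goes through ast.literal_eval, so "('market1501')" becomes the
# plain string 'market1501' rather than being coerced to another type.
opts = ['MODEL.DEVICE_ID', "('0')", 'DATASETS.NAMES', "('market1501')"]
cfg.merge_from_list(opts)
print(cfg.DATASETS.NAMES)   # market1501
```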
Please kindly cite this paper in your publications if it helps your research:
@article{pami21reidsurvey,
title={Deep Learning for Person Re-identification: A Survey and Outlook},
author={Ye, Mang and Shen, Jianbing and Lin, Gaojie and Xiang, Tao and Shao, Ling and Hoi, Steven C. H.},
journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
year={2021},
}
Contact: [email protected]