
SeqCo-DETR: Sequence Consistency Training for Self-Supervised Object Detection with Transformers

This is the implementation of the paper SeqCo-DETR: Sequence Consistency Training for Self-Supervised Object Detection with Transformers.

The paper is currently under review and the code is being prepared for release; please be patient.

Keywords

  • Self-supervised learning
  • Object detection
  • Sequence consistency
  • Detection with transformers

Method

[Method overview figure]
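
Since the official code is not yet released, the sketch below is only a rough, hypothetical illustration of the general idea suggested by the title: enforcing consistency between the sequences of transformer decoder outputs produced for two augmented views of the same image. All names, shapes, and the matching strategy are assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a sequence-level consistency loss between the
# transformer decoder outputs of two augmented views of the same image.
# This is NOT the released SeqCo-DETR code; names, shapes, and the
# matching strategy are assumptions for illustration only.
import torch
import torch.nn.functional as F
from scipy.optimize import linear_sum_assignment


def sequence_consistency_loss(queries_a: torch.Tensor,
                              queries_b: torch.Tensor) -> torch.Tensor:
    """Cosine-similarity consistency between two sequences of object queries.

    queries_a, queries_b: (num_queries, dim) decoder outputs for two
    augmented views of the same image.
    """
    a = F.normalize(queries_a, dim=-1)
    b = F.normalize(queries_b, dim=-1)

    # Pair queries across the two views by maximising cosine similarity
    # (Hungarian matching on the negated similarity matrix).
    sim = a @ b.t()                                   # (N, N)
    row, col = linear_sum_assignment(-sim.detach().cpu().numpy())
    row, col = torch.as_tensor(row), torch.as_tensor(col)

    # Encourage matched query pairs to agree: loss = 1 - cosine similarity.
    return (1.0 - sim[row, col]).mean()


if __name__ == "__main__":
    q1 = torch.randn(100, 256)   # e.g. 100 object queries, 256-d embeddings
    q2 = torch.randn(100, 256)
    print(sequence_consistency_loss(q1, q2))
```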

Results

Table 1: Comparison of detection results fine-tuned on MS COCO train2017 and evaluated on val2017.

[Table image: detection results on MS COCO]

Table 2: Comparison of detection results fine-tuned on PASCAL VOC trainval07+12 and evaluated on test07.

[Table image: detection results on PASCAL VOC]

Table 3: Comparison of results on the few-shot detection task on COCO, evaluated on the novel classes.

[Table image: few-shot detection results on COCO]
