updating readme and requirements
imelekhov committed Apr 4, 2019
1 parent 1c99152 commit b73b724
Showing 3 changed files with 43 additions and 6 deletions.
36 changes: 36 additions & 0 deletions README.md
@@ -1,6 +1,40 @@
# DGC-Net: Dense Geometric Correspondence Network
This is a PyTorch implementation of our work ["DGC-Net: Dense Geometric Correspondence Network"](https://arxiv.org/abs/1810.08393)

## Installation
- create and activate a conda environment with Python 3.x
```
conda create -n my_fancy_env python=3.7
source activate my_fancy_env
```
- install PyTorch v1.0.0 and the torchvision library (an optional import check is sketched at the end of this section)
```
pip install torch torchvision
```
- install all dependencies by running the following command:
```
pip install -r requirements.txt
```
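- optionally, run a quick import check (not part of the original instructions; a minimal sketch assuming the environment above is active). Note that ```pip install torch torchvision``` installs the latest released versions; pinning explicitly (e.g. ```pip install torch==1.0.0```) may work if a matching wheel is published for your platform.
```
# Optional sanity check (not from the original README): confirm that
# PyTorch and torchvision import correctly in the activated environment.
import torch
import torchvision

print(torch.__version__)          # ideally 1.0.0, as recommended above
print(torchvision.__version__)
print(torch.cuda.is_available())  # True if a CUDA-capable GPU is usable
```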

## Getting started
- ```eval.py``` demonstrates the results on the HPatches dataset; the transformation CSV files it reads can be inspected as sketched at the end of this section. To run the ```eval.py``` script:
* Download the archive with pre-trained models (click) and unpack it in the project folder
* Download the HPatches dataset (full image sequences); it is available [here](https://github.com/hpatches/hpatches-dataset), at the end of that page
* Run the following command:
```
python eval.py --image-data-path /path/to/hpatches-geometry
```

- ```train.py``` is a script to train a DGC-Net/DGCM-Net model from scratch. To run this script, follow this procedure:
* Download the [TokyoTimeMachine dataset](https://www.di.ens.fr/willow/research/netvlad/)
* Run the command:
```
python train.py --image-data-path /path/to/TokyoTimeMachine
```
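- the transformation CSV files read by ```eval.py``` live in ```data/csv``` by default (see its ```--csv-path``` argument). Below is a minimal, hedged sketch for inspecting one of them with pandas (already listed in ```requirements.txt```); the file name is just one assumed instance of the ```hpatches_1_{k}.csv``` pattern used in ```eval.py```, and the printed columns are whatever that CSV actually contains:
```
# Hedged sketch: peek at one transformation CSV that eval.py reads.
# 'hpatches_1_1.csv' is assumed to exist as one instance of the
# hpatches_1_{k}.csv naming pattern from eval.py.
import pandas as pd

df = pd.read_csv('data/csv/hpatches_1_1.csv')
print(df.columns.tolist())  # column names as stored in the CSV
print(df.head())
```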


## Performance on [HPatches](https://github.com/hpatches/hpatches-dataset) dataset
Method / HPatches ID|Viewpoint 1|Viewpoint 2|Viewpoint 3|Viewpoint 4|Viewpoint 5
:---|:---:|:---:|:---:|:---:|:---:
@@ -13,6 +47,8 @@
DGCM-Net (repo) | 2.33 | 5.62 | 9.55 | **11.59** | **16.48**

Note: There is a difference between the numbers presented in the original paper and those obtained by the models in this repo. It might be related to the fact that both models (DGC-Net and DGCM-Net) were originally trained using ```PyTorch v0.3```.

More qualitative results are presented on the [project page](https://aaltovision.github.io/dgc-net-site/).

## How to cite
If you use this software in your own research, please cite our publication:

5 changes: 3 additions & 2 deletions eval.py
@@ -22,7 +22,8 @@
# Paths
parser.add_argument('--csv-path', type=str, default='data/csv',
help='path to training transformation csv folder')
parser.add_argument('--image-path', type=str, default='data/hpatches-geometry',
parser.add_argument('--image-data-path', type=str,
default='data/hpatches-geometry',
help='path to folder containing training images')
parser.add_argument('--model', type=str, default='dgc',
help='Model to use', choices=['dgc', 'dgcm'])
@@ -78,7 +79,7 @@
test_dataset = \
HPatchesDataset(csv_file=osp.join(args.csv_path,
'hpatches_1_{}.csv'.format(k)),
image_path_orig=args.image_path,
image_path_orig=args.image_data_path,
transforms=dataset_transforms)

test_dataloader = DataLoader(test_dataset,
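The change above renames the evaluation flag from ```--image-path``` to ```--image-data-path``` (matching the README) and updates the dataset construction to read ```args.image_data_path```. A minimal, standalone sketch (standard library only, not the full ```eval.py```) of how the renamed flag is parsed:
```
# Minimal sketch of the renamed flag: argparse maps '--image-data-path'
# to the attribute args.image_data_path (hyphens become underscores).
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--image-data-path', type=str,
                    default='data/hpatches-geometry',
                    help='path to folder containing training images')

args = parser.parse_args(['--image-data-path', '/path/to/hpatches-geometry'])
print(args.image_data_path)  # -> /path/to/hpatches-geometry
```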
8 changes: 4 additions & 4 deletions requirements.txt
@@ -1,4 +1,4 @@
termcolor=1.1.0
opencv-python=4.0.0.21
tqdm=4.31.1
pandas=0.24.2
termcolor==1.1.0
opencv-python==4.0.0.21
tqdm==4.31.1
pandas==0.24.2
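The fix replaces the bare ```=``` separator, which pip does not accept as a version specifier, with the exact-pin operator ```==```. For reference (not part of the commit), valid requirement lines look like:
```
termcolor==1.1.0   # exact pin, as used in this repo
tqdm>=4.31.1       # minimum-version constraint, also valid pip syntax
```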
