SuperPoint: Self-Supervised Interest Point Detection and Description
This work is based on:
- Tensorflow implementation by Rémi Pautrat and Paul-Edouard Sarlin
- Official SuperPointPretrainedNetwork.
- pytorch-superpoint
- Kornia
If this repository helps you, please give it a star!
- Detector repeatability: 0.67
- Homography estimation on images with viewpoint changes in HPatches dataset: 0.698
(The corresponding result in rpautrat's repository is 0.712. This PyTorch implementation can achieve better performance, 0.725, if trained on the data generated by rpautrat's MagicPoint. Our MagicPoint generally produces more points, which may explain the lower accuracy; in other words, you may achieve better performance by setting proper values for `det_threshold`, `nms` and `top_k`.)
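The interplay of these three knobs can be sketched as below. The function name `postprocess_heatmap` and its default values are illustrative only, not the repository's actual function or settings:

```python
import torch
import torch.nn.functional as F

def postprocess_heatmap(heatmap, det_threshold=0.015, nms_radius=4, top_k=1000):
    """Select keypoints from a (H, W) score heatmap.

    Illustrative sketch: threshold, max-pool-based NMS, then keep the
    strongest top_k detections. Returns (N, 2) (row, col) coords and scores.
    """
    # NMS: a pixel survives only if it is the maximum in its window
    # and its score clears the detection threshold.
    pooled = F.max_pool2d(heatmap[None, None],
                          kernel_size=2 * nms_radius + 1,
                          stride=1, padding=nms_radius)[0, 0]
    is_max = (heatmap == pooled) & (heatmap >= det_threshold)
    scores = torch.where(is_max, heatmap, torch.zeros_like(heatmap))
    ys, xs = scores.nonzero(as_tuple=True)
    vals = scores[ys, xs]
    # Keep at most top_k strongest detections.
    if vals.numel() > top_k:
        vals, order = vals.topk(top_k)
        ys, xs = ys[order], xs[order]
    return torch.stack([ys, xs], dim=1), vals
```

Lowering `det_threshold` or raising `top_k` yields more (but noisier) points; a larger NMS radius spreads the points out, which tends to help homography estimation.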
- Convert the TensorFlow pretrained weights to PyTorch
- Usage:
- 1 Construct the network with superpoint_bn.py (refer to train.py for more details)
- 2 Set eps=1e-3 for all BatchNormalization functions in model/modules/cnn/*.py
- 3 Set momentum=0.01 (not tested)
- 4 Load the pretrained weights superpoint_bn.pth and run forward propagation
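Steps 2–4 above can be sketched as follows. The helper `adjust_batchnorm` and the network class name in the usage comment are hypothetical; the repository's actual class lives in superpoint_bn.py:

```python
import torch
import torch.nn as nn

def adjust_batchnorm(model, eps=1e-3, momentum=0.01):
    """Match the TF BatchNormalization defaults before loading converted weights.

    TF uses eps=1e-3 while PyTorch defaults to eps=1e-5, so the converted
    checkpoint only reproduces the TF outputs after this adjustment.
    """
    for m in model.modules():
        if isinstance(m, (nn.BatchNorm1d, nn.BatchNorm2d)):
            m.eps = eps
            m.momentum = momentum
    return model

# Hypothetical usage (class name is an assumption):
# net = SuperPointBNNet(config)
# adjust_batchnorm(net)
# net.load_state_dict(torch.load('superpoint_bn.pth', map_location='cpu'))
# net.eval()
```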
- 0 Update your repository to the latest version (if you have pulled it before)
- 1 Prepare your data. Create the directories `data` and `export`. The `data` directory should look like:
```
data
|-- coco
|   |-- train2017
|   |   |-- a.jpg
|   |   |-- ...
|   `-- test2017
|       |-- b.jpg
|       |-- ...
|-- hpatches
    |-- i_ajuntament
    |   |-- 1.ppm
    |   |-- ...
    |   |-- H_1_2
    |-- ...
```

You can create soft links if you already have the COCO and HPatches data sets:

```
cd data
ln -s dir_to_coco ./coco
ln -s dir_to_hpatches ./hpatches
```
- 2 The training steps are very similar to rpautrat/SuperPoint. However, we strongly suggest you read the scripts first so that you can choose the correct settings for your environment.
- 2.0 Modify the model-saving condition in train.py, line 61:

```python
if (i % 118300 == 0 and i != 0) or (i + 1) == len(dataloader['train']):
```

and set a proper number of epochs in *.yaml.
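The condition above saves a checkpoint every fixed number of iterations and at the end of each epoch; a minimal sketch (the helper name and interval variable are illustrative, not code from train.py):

```python
# Save every `save_interval` iterations and at the end of each epoch.
save_interval = 118300  # tune this to your dataset size

def should_save(i, steps_per_epoch):
    """Mirror of the checkpointing condition in train.py, line 61."""
    return (i % save_interval == 0 and i != 0) or (i + 1) == steps_per_epoch
```

For a smaller dataset you would lower `save_interval` accordingly, otherwise checkpoints are only written at epoch boundaries.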
- 2.1 Train MagicPoint (>1 hour):

```
python train.py ./config/magic_point_train.yaml
```

(Note that you have to delete the directory ./data/synthetic_shapes whenever you want to regenerate it.)
- 2.2 Export the COCO labels data set v1 (>50 hours):

```
python homo_export_labels.py  # use your data dirs
```
- 2.3 Train MagicPoint on the COCO labels data set v1 (exported in step 2.2):

```
python train.py ./config/magic_point_coco_train.yaml  # with correct data dirs
```
- 2.4 Export the COCO labels data set v2 using the MagicPoint trained in step 2.3.
- 2.5 Train SuperPoint on the COCO labels data set v2 (>12 hours):

```
python train.py ./config/superpoint_train.yaml  # with correct data dirs
```
- Others: validate detection repeatability or description:

```
python export_detections_repeatability.py  # very fast
python compute_repeatability.py            # very fast
# or
python export_descriptors.py   # >5.5 hours
python compute_desc_eval.py    # >1.5 hours
```
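The repeatability metric behind these scripts can be sketched as below; this mirrors the classic HPatches-style definition, and the exact counting in compute_repeatability.py may differ in detail:

```python
import numpy as np

def repeatability(kpts_a, kpts_b, H, dist_thresh=3.0):
    """Fraction of keypoints repeated across a homography-related image pair.

    kpts_a, kpts_b: (N, 2) arrays of (x, y) pixel coordinates.
    H: 3x3 homography mapping image A coordinates into image B.
    """
    # Warp A's keypoints into B's frame (homogeneous coordinates).
    pts = np.concatenate([kpts_a, np.ones((len(kpts_a), 1))], axis=1)
    warped = (H @ pts.T).T
    warped = warped[:, :2] / warped[:, 2:3]
    # Pairwise distances between warped A points and B points.
    d = np.linalg.norm(warped[:, None, :] - kpts_b[None, :, :], axis=2)
    # A keypoint is "repeated" if some keypoint in the other image
    # lies within dist_thresh pixels of it.
    matched_a = (d.min(axis=1) <= dist_thresh).sum()
    matched_b = (d.min(axis=0) <= dist_thresh).sum()
    return (matched_a + matched_b) / (len(kpts_a) + len(kpts_b))
```

Note that the score depends on how many points each detector fires, which is why `det_threshold`, `nms` and `top_k` affect the reported numbers.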
```yaml
model:
  name: superpoint # magicpoint ...
data:
  name: coco # synthetic
  image_train_path: ['./data/mp_coco_v2/images/train2017',] # several data sets can be listed here
  label_train_path: ['./data/mp_coco_v2/labels/train2017/',]
  image_test_path: './data/mp_coco_v2/images/test2017/'
  label_test_path: './data/mp_coco_v2/labels/test2017/'
```
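Reading such a config is a plain YAML load; a minimal sketch (train.py presumably does something similar, and the key names below follow the snippet above):

```python
import yaml  # PyYAML

config_text = """
model:
  name: superpoint
data:
  name: coco
  image_train_path: ['./data/mp_coco_v2/images/train2017']
"""

# In the repository this would be yaml.safe_load(open(path_to_yaml)).
config = yaml.safe_load(config_text)
print(config['model']['name'], config['data']['image_train_path'])
```

Because `image_train_path` and `label_train_path` are lists, several data sets can be concatenated simply by appending more directories.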