Workshop on Long-Term Visual Localization under Changing Conditions
During ECCV 2020 we will be hosting another installment of the Long-Term Visual Localization under Changing Conditions workshop. We will again run three competitions for different visual localization scenarios; please see the main workshop page for details. You can find the workshop page for last year's competition here.
Results can be submitted under the Submission tab above; select the relevant challenge from the dropdown menu. For the autonomous vehicles and handheld devices challenges, please submit each method once per dataset, using the same method name for all submissions. Datasets for which a method has no results are displayed with zeroes.
The ranking is performed using the Schulze method.
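For context, below is a minimal Python sketch of the Schulze method. It is not the benchmark's own ranking code: it assumes one "ballot" per dataset/condition that orders the methods from best to worst on that condition, and that every ballot ranks every method.

```python
import itertools

def schulze_ranking(ballots, candidates):
    """Rank candidates (methods) with the Schulze method.

    ballots: list of rankings, each a list of candidates ordered from most
             to least preferred (here, one ballot per dataset/condition).
             Every ballot is assumed to contain every candidate.
    Returns the candidates sorted from strongest to weakest.
    """
    # d[a][b] = number of ballots that prefer a over b
    d = {a: {b: 0 for b in candidates} for a in candidates}
    for ballot in ballots:
        pos = {c: ballot.index(c) for c in candidates}
        for a, b in itertools.permutations(candidates, 2):
            if pos[a] < pos[b]:
                d[a][b] += 1

    # p[a][b] = strength of the strongest path from a to b
    p = {a: {b: (d[a][b] if d[a][b] > d[b][a] else 0) for b in candidates}
         for a in candidates}
    for i in candidates:
        for j in candidates:
            if i == j:
                continue
            for k in candidates:
                if k in (i, j):
                    continue
                p[j][k] = max(p[j][k], min(p[j][i], p[i][k]))

    # a beats b if the strongest path a -> b is stronger than b -> a
    wins = {a: sum(p[a][b] > p[b][a] for b in candidates if b != a)
            for a in candidates}
    return sorted(candidates, key=lambda c: -wins[c])
```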
The pose error thresholds used to compute the fraction of correctly localized images are listed for each dataset on the Benchmark page.
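As an illustration of how the per-cell numbers below are obtained, here is a minimal sketch, assuming per-query position errors (in metres) and rotation errors (in degrees). The function and variable names are hypothetical, and the thresholds in the usage comment are examples only; the actual per-dataset thresholds are the ones listed on the Benchmark page.

```python
import numpy as np

def fraction_localized(position_errors_m, rotation_errors_deg, thresholds):
    """Percentage of query images whose pose error is within each threshold.

    position_errors_m / rotation_errors_deg: one entry per query image.
    thresholds: list of (max_position_m, max_rotation_deg) pairs,
                ordered from strictest to loosest.
    Returns one percentage per threshold, i.e. the "X / Y / Z" table cells.
    """
    pos = np.asarray(position_errors_m)
    rot = np.asarray(rotation_errors_deg)
    return [100.0 * np.mean((pos <= t_pos) & (rot <= t_rot))
            for t_pos, t_rot in thresholds]

# Example call with hypothetical thresholds:
# fraction_localized(pos_err, rot_err, [(0.25, 2.0), (0.5, 5.0), (5.0, 10.0)])
```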
Listed below are the public results for the three challenges. Each table cell reports the percentage of query images localized within each of the three pose error thresholds, from strictest to loosest.
Visual localization for autonomous vehicles challenge
Method | Extended CMU Seasons: urban | Extended CMU Seasons: suburban | Extended CMU Seasons: park | RobotCar Seasons v2: day all | RobotCar Seasons v2: night all | SILDa: evening | SILDa: snow | SILDa: night
---|---|---|---|---|---|---|---|---
Hierarchical Localization - SuperPoint + SuperGlue (multi-camera) | 98.1 / 99.8 / 99.9 | 98.3 / 99.5 / 100.0 | 94.2 / 97.1 / 98.5 | 63.8 / 95.0 / 100.0 | 45.0 / 86.2 / 94.6 | 35.5 / 75.0 / 97.1 | 0.0 / 2.4 / 86.3 | 31.7 / 54.4 / 81.9 |
KAPTURE-R2D2-FUSION | 97.0 / 99.1 / 99.8 | 95.0 / 97.0 / 99.4 | 89.2 / 93.4 / 97.5 | 66.0 / 95.1 / 100.0 | 46.2 / 76.5 / 91.4 | 32.4 / 67.4 / 93.3 | 0.2 / 4.1 / 88.9 | 30.4 / 54.2 / 81.1 |
isrf_5k_o2s | 93.8 / 96.6 / 98.2 | 83.5 / 86.8 / 90.5 | 76.4 / 80.9 / 85.7 | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 | 31.9 / 65.2 / 87.7 | 2.9 / 14.9 / 67.8 | 30.5 / 53.8 / 78.8 |
test1 | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 |
Local feature challenge
Method | Aachen v1.1: night
---|---
LoFTR [local feature challenge] | 72.8 / 88.5 / 99.0 |
SuperPoint + SuperGlue [local feature challenge] | 73.3 / 88.0 / 98.4 |
isrf_o2s_[local_feature_challenge] | 69.1 / 87.4 / 98.4 |
LISRD + SuperPoint keypoints + Adalam | 73.3 / 86.9 / 97.9 |
R2D2_40K | 71.2 / 86.9 / 97.9 |
LISRD + SuperPoint keypoints | 72.3 / 86.4 / 97.4 |
test1 | 34.6 / 41.4 / 49.7 |
Visual localization for handheld devices challenge
Method | Aachen v1.1: day | Aachen v1.1: night | InLoc: DUC1 | InLoc: DUC2
---|---|---|---|---
Hierarchical Localization - SuperPoint + SuperGlue | 89.8 / 96.1 / 99.4 | 77.0 / 90.6 / 100.0 | 49.0 / 68.7 / 80.8 | 53.4 / 77.1 / 82.4 |
LoFTR | 88.7 / 95.6 / 99.0 | 78.5 / 90.6 / 99.0 | 47.5 / 72.2 / 84.8 | 54.2 / 74.8 / 85.5 |
RLOCS_v1.0 | 86.0 / 94.8 / 98.8 | 72.3 / 88.5 / 99.0 | 47.0 / 71.2 / 84.8 | 58.8 / 77.9 / 80.9 |
KAPTURE-R2D2-FUSION | 90.9 / 96.7 / 99.5 | 78.5 / 91.1 / 98.4 | 41.4 / 60.1 / 73.7 | 47.3 / 67.2 / 73.3 |
isrf_5k_o2s | 87.1 / 94.7 / 98.3 | 74.3 / 86.9 / 97.4 | 39.4 / 58.1 / 70.2 | 41.2 / 61.1 / 69.5 |
KAPTURE-R2D2-APGeM_top20_0.1mm | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 | 36.9 / 53.0 / 65.7 | 34.4 / 52.7 / 59.5 |
KAPTURE-R2D2-APGeM_top20_0.5mm | 0.0 / 0.0 / 0.0 | 0.0 / 0.0 / 0.0 | 36.4 / 52.0 / 64.1 | 30.5 / 53.4 / 58.0 |