evolin


tests License: Apache License 2.0

EVOLIN is a benchmark for evaluation of line detection and association results. We provide a set of docker-packed line detection and association algorithms, metrics to evaluate them, and line-annotated data. Additional information can be found on our web page and in the article.

Installation

Dependencies

  1. Install dependencies
sudo apt update && sudo apt upgrade && sudo apt install --no-install-recommends -y libeigen3-dev cmake
  1. Install our custom g2opy:
git clone https://github.com/anastasiia-kornilova/g2opy
cd g2opy
git checkout lines_opt
mkdir build
cd build
cmake ..
make -j8
cd ..
python setup.py install
  1. Clone this repository
git clone https://github.com/prime-slam/evolin

Annotated data

To evaluate line detectors and associators, we annotated lr kt2 and of kt2 trajectories from ICL NUIM, as well as fr3/cabinet and fr1/desk trajectories from TUM RGB-D. Only breaking segments have been annotated, such as ceilings, floors, walls, doors, and furniture linear elements. The datasets can be downloaded here.

Metrics

The following detection metrics are implemented:

  • Heatmap-based and vectorized classification
    • precision
    • recall
    • F-score
    • average precision
  • Repeatability
    • repeatability score
    • localization error

The following association metrics are implemented:

  • Matching classification
    • precision
    • recall
    • F-score
  • Pose error
    • angular translation error
    • absolute translation error
    • angular rotation error
    • pose error AUC

Get detection and association results

A list of algorithms and instructions for running them can be found in our repository.

Evaluation

The scripts required for evaluation and examples, as well as the documentation are located in evaluation folder. The results of the evaluation of adapted detection and association algorithms can be found in our article.

Cite us

If you find this work useful in your research, please consider citing:

@article{evolin2023,
title={EVOLIN Benchmark: Evaluation of Line Detection and Association},
author={Kirill Ivanov, Gonzalo Ferrer, and Anastasiia Kornilova},
journal={arXiv preprint arXiv:2303.05162},
year={2023}}
 1"""
 2<div align="center">
 3  <img src="https://raw.githubusercontent.com/prime-slam/evolin/main/assets/logo.png">
 4</div>
 5
 6---
 7[![tests](https://github.com/prime-slam/evolin/actions/workflows/ci.yml/badge.svg?branch=main)](https://github.com/prime-slam/evolin/actions/workflows/ci.yml)
 8[![License: Apache License 2.0](https://img.shields.io/github/license/saltstack/salt)](https://opensource.org/license/apache-2-0/)
 9
10EVOLIN is a benchmark for evaluation of line detection and association results. We provide a set of docker-packed line detection and association algorithms, metrics to evaluate them, and line-annotated data.
11Additional information can be found on our [web page](https://prime-slam.github.io/evolin/) and in the [article](https://arxiv.org/abs/2303.05162).
12
13## Installation
14
15### Dependencies
16
171. Install dependencies
18```bash
19sudo apt update \
20&& sudo apt upgrade \
21&& sudo apt install --no-install-recommends -y libeigen3-dev cmake
22```
232. Install our custom `g2opy`:
24```bash
25git clone https://github.com/anastasiia-kornilova/g2opy
26cd g2opy
27git checkout lines_opt
28mkdir build
29cd build
30cmake ..
31make -j8
32cd ..
33python setup.py install
34```
353. Clone this repository
36```bash
37git clone https://github.com/prime-slam/evolin
38```
39
40## Annotated data
41To evaluate line detectors and associators,
42we annotated `lr kt2` and `of kt2` trajectories from [ICL NUIM](https://www.doc.ic.ac.uk/~ahanda/VaFRIC/iclnuim.html),
43as well as `fr3/cabinet` and `fr1/desk` trajectories from [TUM RGB-D](https://cvg.cit.tum.de/data/datasets/rgbd-dataset).
44Only breaking segments have been annotated,
45such as ceilings, floors, walls, doors, and furniture linear elements.
46The datasets can be downloaded [here](https://disk.yandex.com/d/DLAOGP6FI_27hQ).
47
48## Metrics
49
50The following detection metrics are implemented:
51* Heatmap-based and vectorized classification
52  * precision
53  * recall
54  * F-score
55  * average precision
56* Repeatability
57  * repeatability score
58  * localization error
59
60The following association metrics are implemented:
61* Matching classification
62  * precision
63  * recall
64  * F-score
65* Pose error
66  * angular translation error
67  * absolute translation error
68  * angular rotation error
69  * pose error AUC
70
71## Get detection and association results
72A list of algorithms and instructions for running them can be found in our [repository](https://github.com/prime-slam/line-detection-association-dockers).
73
74## Evaluation
75The scripts required for evaluation and examples, as well as the documentation are located in `evaluation` folder.
76The results of the evaluation of adapted detection and association algorithms can be found in our [article](https://arxiv.org/abs/2303.05162).
77## Cite us
78If you find this work useful in your research, please consider citing:
79```bibtex
80@article{evolin2023,
81title={EVOLIN Benchmark: Evaluation of Line Detection and Association},
82author={Kirill Ivanov, Gonzalo Ferrer, and Anastasiia Kornilova},
83journal={arXiv preprint arXiv:2303.05162},
84year={2023}}
85```
86"""
87from evolin import metrics
88from evolin import typing