EVOLIN is a benchmark for evaluating line detection and association results. We provide a set of Docker-packed line detection and association algorithms, metrics to evaluate them, and line-annotated data. Additional information can be found on our web page, in the article, and in the documentation.
- Install dependencies:

  ```bash
  sudo apt update \
    && sudo apt upgrade \
    && sudo apt install --no-install-recommends -y libeigen3-dev cmake
  ```
- Install our custom [g2opy](https://github.com/anastasiia-kornilova/g2opy):

  ```bash
  git clone https://github.com/anastasiia-kornilova/g2opy
  cd g2opy
  git checkout lines_opt
  mkdir build
  cd build
  cmake ..
  make -j8
  cd ..
  python setup.py install
  ```
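  A quick import check can confirm the installation; this is only a sketch and assumes the custom build exposes the usual `g2o` Python module, as upstream g2opy does:

  ```python
  # Sanity check for the custom g2opy build; assumes it installs the standard
  # `g2o` module name used by upstream g2opy.
  import g2o

  optimizer = g2o.SparseOptimizer()
  print("g2o imported successfully:", optimizer is not None)
  ```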
- Clone this repository:

  ```bash
  git clone https://github.com/prime-slam/evolin
  ```
To evaluate line detectors and associators, we annotated the `lr kt2` and `of kt2` trajectories from ICL-NUIM, as well as the `fr3/cabinet` and `fr1/desk` trajectories from TUM RGB-D. Only breaking segments have been annotated: linear elements of ceilings, floors, walls, doors, and furniture. The datasets can be downloaded here.
The following detection metrics are implemented (a toy classification sketch follows this list):
- Heatmap-based and vectorized classification
  - precision
  - recall
  - F-score
  - average precision
- Repeatability
  - repeatability score
  - localization error
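As a rough illustration of how the vectorized classification metrics behave, the sketch below matches predicted segments to ground truth and computes precision, recall, and F-score. The segment format, matching rule, and threshold are assumptions for the sketch, not the benchmark's exact implementation.

```python
# Toy sketch of vectorized classification for line detection.
# Segments are arrays [x1, y1, x2, y2]; a prediction counts as a true positive
# if its endpoint distance to a still-unmatched ground-truth segment is below
# a threshold (a structural-distance-style criterion, assumed here).
import numpy as np


def structural_distance(pred, gt):
    """Endpoint distance under the better of the two endpoint pairings."""
    d1 = np.linalg.norm(pred[:2] - gt[:2]) + np.linalg.norm(pred[2:] - gt[2:])
    d2 = np.linalg.norm(pred[:2] - gt[2:]) + np.linalg.norm(pred[2:] - gt[:2])
    return min(d1, d2)


def classification_metrics(pred_lines, gt_lines, threshold=5.0):
    matched_gt = set()
    tp = 0
    for pred in pred_lines:
        # Greedily match each prediction to the closest unmatched ground truth.
        candidates = [
            (structural_distance(pred, gt), i)
            for i, gt in enumerate(gt_lines)
            if i not in matched_gt
        ]
        if candidates:
            dist, best = min(candidates)
            if dist < threshold:
                matched_gt.add(best)
                tp += 1
    fp = len(pred_lines) - tp
    fn = len(gt_lines) - tp
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f_score = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f_score


pred = [np.array([0.0, 0.0, 10.0, 0.0]), np.array([50.0, 50.0, 60.0, 60.0])]
gt = [np.array([0.0, 1.0, 10.0, 1.0])]
print(classification_metrics(pred, gt))  # ≈ (0.5, 1.0, 0.667)
```

Average precision typically aggregates such precision/recall values over detection-score thresholds, and the heatmap-based variants apply analogous classification at the pixel level.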
The following association metrics are implemented (a minimal pose-error sketch follows this list):
- Matching classification
  - precision
  - recall
  - F-score
- Pose error
  - angular translation error
  - absolute translation error
  - angular rotation error
  - pose error AUC
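For the pose-error group, a minimal sketch of the three per-pair errors is given below. The formulas follow common conventions and are assumptions rather than the benchmark's exact code; pose error AUC is then typically the area under the curve of the fraction of pose pairs whose error falls below a varying threshold.

```python
# Minimal sketch of pose errors between an estimated and a ground-truth pose.
# R_* are 3x3 rotation matrices, t_* are 3-vectors; conventions assumed here.
import numpy as np


def angular_rotation_error(R_est, R_gt):
    """Geodesic distance between two rotations, in degrees."""
    cos = (np.trace(R_gt.T @ R_est) - 1.0) / 2.0
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))


def angular_translation_error(t_est, t_gt):
    """Angle between the two translation directions, in degrees."""
    cos = np.dot(t_est, t_gt) / (np.linalg.norm(t_est) * np.linalg.norm(t_gt))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))


def absolute_translation_error(t_est, t_gt):
    """Euclidean distance between the two translation vectors."""
    return np.linalg.norm(t_est - t_gt)


theta = np.radians(5.0)
R_est = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                  [np.sin(theta),  np.cos(theta), 0.0],
                  [0.0,            0.0,           1.0]])
R_gt = np.eye(3)
t_est, t_gt = np.array([0.9, 0.1, 0.0]), np.array([1.0, 0.0, 0.0])

print(angular_rotation_error(R_est, R_gt))      # ≈ 5.0 degrees
print(angular_translation_error(t_est, t_gt))   # ≈ 6.3 degrees
print(absolute_translation_error(t_est, t_gt))  # ≈ 0.14
```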
A list of algorithms and instructions for running them can be found in our repository.
The scripts required for evaluation and usage examples, as well as the documentation, are located in the `evaluation` folder.
The results of the evaluation of adapted detection and association algorithms can be found in our article.
If you find this work useful in your research, please consider citing:
```bibtex
@article{evolin2023,
  title={EVOLIN Benchmark: Evaluation of Line Detection and Association},
  author={Kirill Ivanov and Gonzalo Ferrer and Anastasiia Kornilova},
  journal={arXiv preprint arXiv:2303.05162},
  year={2023}
}
```