Evaluation of Visual SLAM Methods in USAR Applications Using ROS/Gazebo Simulation

  • Conference paper
  • First Online:
Proceedings of 15th International Conference on Electromechanics and Robotics "Zavalishin's Readings"

Abstract

The problem of simultaneously determining a robot's position and building a map of its environment is referred to as simultaneous localization and mapping (SLAM). A SLAM system generally outputs an estimated trajectory (a sequence of poses) and a map. In practice, ground truth for the map is hard to obtain; hence, only trajectory ground truth is considered. Various works provide datasets for evaluating SLAM algorithms across different sensor configurations, robots, and environments. Collecting a dataset in a real-world environment is a complicated task that requires an elaborate sensor and robot setup, and because different SLAM systems demand different sensors, finding an appropriate dataset for their evaluation is itself a problem. In this paper, a solution based on ROS/Gazebo simulation is therefore proposed. Two indoor environments, with flat and uneven terrain, are created to evaluate laser range and visual SLAM systems; changing the sensor configuration or the environment does not require an elaborate setup. Evaluation results for two popular SLAM methods, ORB-SLAM2 and RTAB-Map, are presented.
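Trajectory-only evaluation of the kind described above is commonly done by rigidly aligning the estimated trajectory to the simulator's ground truth and reporting the absolute trajectory error (ATE). The sketch below illustrates that standard metric with a least-squares (Kabsch/Umeyama-style) alignment; the function name and interface are illustrative and not taken from the paper.

```python
import numpy as np

def ate_rmse(gt, est):
    """Absolute trajectory error (RMSE) between ground-truth and
    estimated positions, after a least-squares rigid alignment
    (rotation + translation) of the estimate to the ground truth."""
    gt = np.asarray(gt, dtype=float)
    est = np.asarray(est, dtype=float)
    # Center both point sets.
    mu_g, mu_e = gt.mean(axis=0), est.mean(axis=0)
    G, E = gt - mu_g, est - mu_e
    # Kabsch: SVD of the cross-covariance gives the optimal rotation.
    U, _, Vt = np.linalg.svd(E.T @ G)
    S = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:  # avoid a reflection
        S[2, 2] = -1.0
    R = Vt.T @ S @ U.T
    aligned = (R @ E.T).T + mu_g
    err = aligned - gt
    return float(np.sqrt((err ** 2).sum(axis=1).mean()))
```

With a perfect estimate that differs from ground truth only by a rigid transform, the aligned error is zero up to numerical precision; any residual RMSE reflects genuine drift in the SLAM trajectory.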



Acknowledgements

This research was funded by the Russian Foundation for Basic Research (RFBR), project ID 18-58-45017. This work was partially supported by the research grant of Kazan Federal University.

Author information


Correspondence to Roman Lavrenov.


Copyright information

© 2021 The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Safin, R., Lavrenov, R., Martínez-García, E.A. (2021). Evaluation of Visual SLAM Methods in USAR Applications Using ROS/Gazebo Simulation. In: Ronzhin, A., Shishlakov, V. (eds) Proceedings of 15th International Conference on Electromechanics and Robotics "Zavalishin's Readings". Smart Innovation, Systems and Technologies, vol 187. Springer, Singapore. https://doi.org/10.1007/978-981-15-5580-0_30
