
An Efficient Approach for Detecting Moving Objects and Deriving Their Positions and Velocities

  • Conference paper
  • Published in: Advances in Computer Vision (CVC 2019)

Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 944)

Abstract

Well-functioning autonomous robots rely heavily on fast and correct navigation. The presence of dynamic (moving) objects in the environment poses a challenge because it increases the risk of collision. To derive the best and most forward-looking re-routing decisions when the planned route suddenly involves a risk of colliding with a moving object, the robot’s navigation system must be provided with information about such objects’ positions and velocities.

Based on sensor readings provided as either 2-dimensional polar range scans or 3-dimensional point cloud data streams, we present an efficient and effective method that detects objects in the environment and derives their positions and velocities. The method has been implemented on top of the Robot Operating System (ROS), and we also present an evaluation of it. The evaluation shows that the method achieves good accuracy in the position and velocity calculations, has a small memory footprint, and requires little CPU time.
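To make the 2-dimensional case concrete, the following is a minimal, illustrative rospy sketch, not the implementation evaluated in the paper: it clusters a sensor_msgs/LaserScan into objects and estimates each object's velocity by matching centroids against the previous scan. The topic name "scan", the 0.2 m clustering gap threshold, and the nearest-centroid matching are assumptions made only for this example.

```python
#!/usr/bin/env python
# Illustrative sketch only (assumed topic "scan", assumed 0.2 m gap threshold,
# nearest-centroid matching); not the implementation evaluated in the paper.
import math

import rospy
from sensor_msgs.msg import LaserScan


class MovingObjectSketch(object):
    def __init__(self):
        self.prev_centroids = []  # object centroids (x, y) from the previous scan
        self.prev_stamp = None    # time stamp of the previous scan, in seconds
        rospy.Subscriber("scan", LaserScan, self.on_scan, queue_size=1)

    def on_scan(self, scan):
        # 1. Cluster adjacent valid range readings; a jump larger than 0.2 m
        #    between consecutive points starts a new cluster (a new object).
        clusters, current = [], []
        for i, r in enumerate(scan.ranges):
            if not (scan.range_min <= r <= scan.range_max):
                if current:
                    clusters.append(current)
                    current = []
                continue
            angle = scan.angle_min + i * scan.angle_increment
            point = (r * math.cos(angle), r * math.sin(angle))
            if current and math.hypot(point[0] - current[-1][0],
                                      point[1] - current[-1][1]) > 0.2:
                clusters.append(current)
                current = []
            current.append(point)
        if current:
            clusters.append(current)

        # 2. Represent every detected object by the centroid of its cluster.
        centroids = [(sum(p[0] for p in c) / len(c),
                      sum(p[1] for p in c) / len(c)) for c in clusters]

        # 3. Estimate each object's velocity by matching it to the nearest
        #    centroid of the previous scan and dividing the displacement by dt.
        stamp = scan.header.stamp.to_sec()
        if self.prev_centroids and self.prev_stamp is not None:
            dt = stamp - self.prev_stamp
            if dt > 0.0:
                for cx, cy in centroids:
                    px, py = min(self.prev_centroids,
                                 key=lambda p: math.hypot(cx - p[0], cy - p[1]))
                    rospy.loginfo("object at (%.2f, %.2f) m, velocity (%.2f, %.2f) m/s",
                                  cx, cy, (cx - px) / dt, (cy - py) / dt)
        self.prev_centroids = centroids
        self.prev_stamp = stamp


if __name__ == "__main__":
    rospy.init_node("moving_object_sketch")
    MovingObjectSketch()
    rospy.spin()
```

A 3-dimensional point cloud stream could be handled analogously by clustering PointCloud2 points instead of scan ranges; robust tracking would additionally require data association across more than two frames, which this sketch deliberately omits.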

Acknowledgments

The research presented herein was funded by the Knowledge Foundation through the research profile “DPAC - Dependable Platforms for Autonomous systems and Control”.

Author information

Corresponding author

Correspondence to Andreas Gustavsson.

Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Gustavsson, A. (2020). An Efficient Approach for Detecting Moving Objects and Deriving Their Positions and Velocities. In: Arai, K., Kapoor, S. (eds) Advances in Computer Vision. CVC 2019. Advances in Intelligent Systems and Computing, vol 944. Springer, Cham. https://doi.org/10.1007/978-3-030-17798-0_25
