
Computer Vision for Micro Air Vehicles

Chapter in: Advances in Embedded Computer Vision

Abstract

Autonomous operation of small UAVs in cluttered environments rests on three foundations: fast and accurate pose estimation for control; obstacle detection and avoidance for safe flight; and real-time execution of both onboard the vehicle. This is a challenge for micro air vehicles, since their limited payload demands small, lightweight, and low-power sensors and processing units, favoring vision-based solutions that run on small embedded computers equipped with smartphone-class processors. In this chapter, we present the JPL autonomous navigation framework for micro air vehicles, which addresses these challenges. Our approach enables power-up-and-go deployment in highly cluttered environments without GPS, using information from an IMU and a single downward-looking camera for pose estimation, and a forward-looking stereo camera system for disparity-based obstacle detection and avoidance. As an example of a high-level navigation task that builds on these autonomous capabilities, we introduce our approach for autonomous landing on elevated flat surfaces, such as rooftops, using only monocular vision inputs from the downward-looking camera.
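The chapter itself is not reproduced here, but the disparity-based obstacle detection the abstract mentions rests on the standard stereo relation Z = fB/d (depth from focal length, baseline, and disparity). The sketch below is a generic illustration of that one step, not the authors' implementation; all function names and parameter values are our own assumptions.

```python
import numpy as np

def disparity_to_depth(disparity, focal_px, baseline_m):
    """Depth Z = f * B / d for valid (positive) disparities; invalid pixels -> inf."""
    depth = np.full(disparity.shape, np.inf, dtype=float)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth

def obstacle_mask(disparity, focal_px, baseline_m, min_clearance_m):
    """Flag pixels whose reconstructed depth falls inside the clearance volume."""
    depth = disparity_to_depth(disparity, focal_px, baseline_m)
    return depth < min_clearance_m

# Toy disparity map: a single near blob at 20 px disparity.
d = np.zeros((4, 4))
d[1:3, 1:3] = 20.0
# With f = 400 px and B = 0.1 m, the blob sits at 400 * 0.1 / 20 = 2.0 m,
# inside a 3.0 m clearance threshold, so it is flagged as an obstacle.
mask = obstacle_mask(d, focal_px=400.0, baseline_m=0.1, min_clearance_m=3.0)
```

A real system would derive the disparity map from a stereo matcher and reason about obstacle regions rather than raw pixels, but the depth-thresholding idea is the same.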



Acknowledgments

This work was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.

Author information

Correspondence to Roland Brockers.


Copyright information

© 2014 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Brockers, R., Humenberger, M., Kuwata, Y., Matthies, L., Weiss, S. (2014). Computer Vision for Micro Air Vehicles. In: Kisačanin, B., Gelautz, M. (eds) Advances in Embedded Computer Vision. Advances in Computer Vision and Pattern Recognition. Springer, Cham. https://doi.org/10.1007/978-3-319-09387-1_4


  • DOI: https://doi.org/10.1007/978-3-319-09387-1_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-09386-4

  • Online ISBN: 978-3-319-09387-1

  • eBook Packages: Computer Science (R0)
