
Feature Detection and Tracking

Chapter in: Concise Computer Vision

Part of the book series: Undergraduate Topics in Computer Science (UTICS)


Abstract

This chapter describes the detection of keypoints and the definition of descriptors for them; a keypoint together with a descriptor defines a feature. The examples given are SIFT, SURF, and ORB, where BRIEF and FAST are introduced as the building blocks of ORB. We discuss the invariance of features in general, and of the given examples in particular. The chapter also discusses three ways of tracking features: KLT, the particle filter, and the Kalman filter.
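
As a minimal, hypothetical illustration of such a feature pipeline (not code from the book), the Python sketch below detects ORB keypoints, which combine the FAST detector with a BRIEF-style binary descriptor, and matches them between two frames using OpenCV; the file names and parameter values are placeholders.

    import cv2

    # Load two consecutive grayscale frames (placeholder file names).
    img1 = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

    # ORB = FAST keypoint detection + rotation-aware BRIEF descriptors.
    orb = cv2.ORB_create(nfeatures=500)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)

    # Binary (BRIEF-style) descriptors are compared with the Hamming distance.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    print(len(kp1), "and", len(kp2), "keypoints,", len(matches), "matches")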


Notes

  1. The described generation of 3D flow vectors has been published in [J.A. Sanchez, R. Klette, and E. Destefanis. Estimating 3D flow for driver assistance applications. Pacific-Rim Symposium Image Video Technology, LNCS 5414, pp. 237–248, 2009].

  2. See [Z. Song and R. Klette. Robustness of point feature detection. In Proc. Computer Analysis Images Patterns, LNCS 8048, pp. 91–99, 2013].

  3. See [Y. Zeng and R. Klette. Multi-run 3D streetside reconstruction from a vehicle. In Proc. Computer Analysis Images Patterns, LNCS 8047, pp. 580–588, 2013].

  4. The presentation follows the Lucas–Kanade tracker introduction by T. Svoboda at cmp.felk.cvut.cz/cmp/courses/Y33ROV/Y33ROV_ZS20082009/Lectures/Motion/klt.pdf; an illustrative tracking sketch is given after these notes.

  5. We use a (practically acceptable) approximation of the Hessian: instead of mixed second-order derivatives, we apply products of the first-order derivatives; see the matrix sketch after these notes.

  6. A particle filter for lane detection was suggested in [S. Sehestedt, S. Kodagoda, A. Alempijevic, and G. Dissanayake. Efficient lane detection and tracking in urban environments. In Proc. European Conf. Mobile Robots, pp. 126–131, 2007].

  7. This is the retinal point where lines parallel to the translatory motion meet, also assuming a corresponding direction of gaze; a pinhole-model formula for this point is sketched after these notes.
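
As an illustration of KLT-style tracking (not Svoboda's material and not code from the book; parameter values are assumptions), the following Python sketch selects corners in one frame and tracks them into the next frame with OpenCV's pyramidal Lucas–Kanade optical flow.

    import cv2

    prev_img = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
    next_img = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

    # Shi-Tomasi corners serve as the initial keypoints to be tracked.
    p0 = cv2.goodFeaturesToTrack(prev_img, maxCorners=200,
                                 qualityLevel=0.01, minDistance=7)

    # Pyramidal Lucas-Kanade optical flow propagates the keypoints.
    p1, status, err = cv2.calcOpticalFlowPyrLK(prev_img, next_img, p0, None,
                                               winSize=(21, 21), maxLevel=3)

    # Keep only the points that were tracked successfully.
    tracked = p1[status.ravel() == 1]
    print("tracked", len(tracked), "of", len(p0), "keypoints")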
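
Note 5 can be read as the usual Gauss–Newton style simplification in Lucas–Kanade-type estimation: over a tracking window W, second-order image derivatives are replaced by products of the first-order derivatives I_x and I_y. In this common notation (an assumption; the book's symbols may differ), the approximated Hessian is

    \[
      \mathbf{H} \;\approx\; \sum_{(x,y)\in W}
      \begin{bmatrix}
        I_x^2    & I_x I_y \\
        I_x I_y  & I_y^2
      \end{bmatrix}.
    \]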
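
Note 7 describes what is commonly called the focus of expansion. As a brief sketch under a pinhole camera model with focal length f (an assumption; the chapter's notation may differ), a pure translation t = (t_X, t_Y, t_Z) with t_Z ≠ 0 makes the image motion field vanish at

    \[
      (x_{\mathrm{FOE}},\, y_{\mathrm{FOE}})
      \;=\; \left( f\,\frac{t_X}{t_Z},\; f\,\frac{t_Y}{t_Z} \right).
    \]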


Copyright information

© 2014 Springer-Verlag London

About this chapter

Cite this chapter

Klette, R. (2014). Feature Detection and Tracking. In: Concise Computer Vision. Undergraduate Topics in Computer Science. Springer, London. https://doi.org/10.1007/978-1-4471-6320-6_9


  • DOI: https://doi.org/10.1007/978-1-4471-6320-6_9

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-4471-6319-0

  • Online ISBN: 978-1-4471-6320-6

