
Part of the book series: Advanced Sciences and Technologies for Security Applications (ASTSA, volume 2)


Abstract

A robust moving object detection system for an outdoor scene must handle adverse illumination conditions, such as sudden illumination changes or a lack of illumination in the scene. This is of particular importance for scenarios where active illumination cannot be relied upon. Utilizing infrared and video sensors, we develop a novel sensor fusion system that automatically adapts to the environmental changes that affect sensor measurements. Adaptation is achieved by a cooperative coevolutionary algorithm that fuses scene contextual and statistical information via a physics-based method. The sensor fusion system maintains high detection rates under a variety of conditions. Results are shown for a full 24-hour diurnal cycle.
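The chapter itself is behind the access wall, but the core idea of cooperative coevolution (one population per subcomponent, each evolved against the other's current best, following Potter and De Jong's scheme cited by the authors) can be sketched. The toy fitness function, the reduction of fusion to two scalar sensor weights, and all numeric settings below are illustrative assumptions, not the authors' actual algorithm.

```python
import random

def fitness(ir_w, vis_w):
    # Stand-in objective: a placeholder for detection accuracy. It rewards
    # weights that sum to 1 and mildly favors the IR channel (optimum at
    # ir_w = 0.6, vis_w = 0.4). The real system would score weights by
    # detection performance on fused imagery.
    return -(ir_w + vis_w - 1.0) ** 2 - (ir_w - 0.6) ** 2

def evolve(pop, score, steps=200, sigma=0.05):
    # Evolve one species while the collaborating species is frozen:
    # mutate the current best and replace the worst member if improved.
    pop = list(pop)
    for _ in range(steps):
        parent = max(pop, key=score)
        child = min(1.0, max(0.0, parent + random.gauss(0.0, sigma)))
        worst = min(range(len(pop)), key=lambda i: score(pop[i]))
        if score(child) > score(pop[worst]):
            pop[worst] = child
    return pop

random.seed(0)
ir_pop = [random.random() for _ in range(10)]    # species 1: IR weight
vis_pop = [random.random() for _ in range(10)]   # species 2: visible weight
best_ir = max(ir_pop, key=lambda w: fitness(w, vis_pop[0]))
best_vis = max(vis_pop, key=lambda w: fitness(best_ir, w))

for _ in range(5):  # alternate rounds of cooperative coevolution
    ir_pop = evolve(ir_pop, lambda w: fitness(w, best_vis))
    best_ir = max(ir_pop, key=lambda w: fitness(w, best_vis))
    vis_pop = evolve(vis_pop, lambda w: fitness(best_ir, w))
    best_vis = max(vis_pop, key=lambda w: fitness(best_ir, w))

print(f"ir weight = {best_ir:.2f}, visible weight = {best_vis:.2f}")
```

The key design point is that neither species is evaluated in isolation: each candidate weight is scored jointly with the other species' best collaborator, so the two populations converge to mutually consistent fusion parameters.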




Copyright information

© 2006 Springer Science+Business Media, Inc.


Cite this chapter

Bhanu, B., Nadimi, S. (2006). Evolutionary Sensor Fusion for Security. In: Javidi, B. (eds) Optical Imaging Sensors and Systems for Homeland Security Applications. Advanced Sciences and Technologies for Security Applications, vol 2. Springer, New York, NY. https://doi.org/10.1007/0-387-28001-4_13
