Abstract
A robust moving object detection system for outdoor scenes must handle adverse illumination conditions, such as sudden changes in illumination or its complete absence. This is of particular importance for scenarios where active illumination cannot be relied upon. Using infrared and video sensors, we develop a novel sensor fusion system that automatically adapts to the environmental changes that affect sensor measurements. The adaptation is performed by a cooperative coevolutionary algorithm that fuses scene contextual and statistical information through a physics-based method. The sensor fusion system maintains high detection rates under a variety of conditions; results are shown for a full 24-hour diurnal cycle.
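To make the coevolutionary adaptation concrete, below is a minimal sketch of a generic cooperative coevolutionary optimizer in the style of Potter and De Jong: two species, one per sensor modality, each evolving its own fusion parameters and scored in cooperation with the other species' best representative. The `fused_fitness` objective, the population sizes, and the one-dimensional weight encoding are illustrative stand-ins, not the chapter's actual fitness function or representation, which would score fused detection masks against the observed scene.

```python
import random

# Hypothetical stand-in objective: a real system would evaluate the
# fused moving-object mask against the scene. Here, higher is better
# when the combined modality weights are balanced (sum near 1.0).
def fused_fitness(ir_ind, vid_ind):
    return -abs(sum(ir_ind) + sum(vid_ind) - 1.0)

def step(pop, rep, score):
    """One generation: rank individuals by fitness in cooperation with
    the other species' representative, keep the top half, and refill
    the population with Gaussian-mutated copies of the survivors."""
    ranked = sorted(pop, key=lambda ind: score(ind, rep), reverse=True)
    elite = ranked[: len(pop) // 2]
    offspring = [[g + random.gauss(0.0, 0.1) for g in p] for p in elite]
    return elite + offspring

random.seed(0)
ir_pop = [[random.random()] for _ in range(20)]   # IR fusion weight(s)
vid_pop = [[random.random()] for _ in range(20)]  # video fusion weight(s)
ir_rep, vid_rep = ir_pop[0], vid_pop[0]           # initial representatives

for _ in range(50):
    # Each species evolves against the other's current representative.
    ir_pop = step(ir_pop, vid_rep, lambda i, r: fused_fitness(i, r))
    vid_pop = step(vid_pop, ir_rep, lambda i, r: fused_fitness(r, i))
    ir_rep = max(ir_pop, key=lambda i: fused_fitness(i, vid_rep))
    vid_rep = max(vid_pop, key=lambda i: fused_fitness(ir_rep, i))

print("fused weights:", ir_rep, vid_rep)
```

The decomposition into per-modality species is what distinguishes this scheme from a single monolithic evolutionary search: each population solves only its own subproblem, and credit is assigned through joint evaluation with the cooperating species.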
Copyright information
© 2006 Springer Science+Business Media, Inc.
Cite this chapter
Bhanu, B., Nadimi, S. (2006). Evolutionary Sensor Fusion for Security. In: Javidi, B. (eds) Optical Imaging Sensors and Systems for Homeland Security Applications. Advanced Sciences and Technologies for Security Applications, vol 2. Springer, New York, NY. https://doi.org/10.1007/0-387-28001-4_13
Publisher Name: Springer, New York, NY
Print ISBN: 978-0-387-26170-6
Online ISBN: 978-0-387-28001-1