Published online by Cambridge University Press: 30 September 2008
This paper describes real-time computer vision algorithms for detecting, identifying, and tracking moving targets in video streams generated by a moving airborne platform. A moving platform introduces instabilities into image acquisition: disturbances and camera ego-motion distort the apparent motion of the targets. When the camera is mounted on a moving observer, the entire scene (background and targets) appears to move, and the true motion of the targets must be separated from the background motion. The motion of the airborne platform is modeled as an affine transformation whose parameters are estimated from corresponding feature sets in consecutive images. Once this motion is compensated, the platform can be treated as stationary and moving targets are detected accordingly. Several tracking algorithms, including particle filters, mean-shift, and connected-component tracking, were implemented and compared. A cascaded boosted classifier with Haar wavelet feature extraction was developed for moving-target classification and integrated with a recognition system that uses joint-feature spatial distributions. The integrated smart video surveillance system has been tested successfully on the Vivid datasets provided by the Air Force Research Laboratory. The experimental results show that the system operates in real time and successfully detects, tracks, and identifies multiple targets in the presence of partial occlusion.
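The core compensation step described above, fitting an affine transformation to feature correspondences between consecutive frames and subtracting the estimated ego-motion so that independently moving targets stand out, can be sketched as follows. This is an illustrative least-squares reconstruction in NumPy on synthetic point data, not the authors' implementation; the feature points, the true transform, and the target's extra displacement are all made up for the example.

```python
import numpy as np

def estimate_affine(src, dst):
    """Least-squares affine fit: dst ~ A @ src + t.

    src, dst: (N, 2) arrays of corresponding feature points
    from consecutive frames. Returns (A, t)."""
    n = src.shape[0]
    # Design matrix for the six affine parameters [a11 a12 tx a21 a22 ty]:
    # each correspondence contributes one row for x' and one for y'.
    X = np.zeros((2 * n, 6))
    X[0::2, 0:2] = src      # x' = a11*x + a12*y + tx
    X[0::2, 2] = 1.0
    X[1::2, 3:5] = src      # y' = a21*x + a22*y + ty
    X[1::2, 5] = 1.0
    p, *_ = np.linalg.lstsq(X, dst.reshape(-1), rcond=None)
    return np.array([[p[0], p[1]], [p[3], p[4]]]), np.array([p[2], p[5]])

# Synthetic example: background features displaced only by camera
# ego-motion (slight rotation plus translation), and one target that
# moves independently on top of that.
rng = np.random.default_rng(0)
A_true = np.array([[0.99, -0.02], [0.02, 0.99]])
t_true = np.array([3.0, -1.5])

bg_prev = rng.uniform(0, 640, size=(40, 2))         # frame k-1 features
bg_curr = bg_prev @ A_true.T + t_true               # same features, frame k

tgt_prev = np.array([[320.0, 240.0]])
tgt_curr = tgt_prev @ A_true.T + t_true + np.array([8.0, 5.0])  # extra motion

A_est, t_est = estimate_affine(bg_prev, bg_curr)

# After warping frame k-1 points with the estimated transform, the
# background residual vanishes while the moving target's does not,
# which is what makes detection on the compensated frames possible.
bg_residual = np.linalg.norm(bg_curr - (bg_prev @ A_est.T + t_est), axis=1)
tgt_residual = np.linalg.norm(tgt_curr - (tgt_prev @ A_est.T + t_est), axis=1)
```

In practice the correspondences would come from a feature detector and matcher (and a robust estimator such as RANSAC would guard against matches that land on moving targets), but the separation principle, large post-compensation residual implies independent motion, is the same.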