Please use this identifier to cite or link to this item: http://theses.ncl.ac.uk/jspui/handle/10443/5194
Title: Novel data association methods for online multiple human tracking
Authors: Fu, Zeyu
Issue Date: 2020
Publisher: Newcastle University
Abstract: Video-based multiple human tracking has played a crucial role in many applications such as intelligent video surveillance, human behavior analysis, and health-care systems. The detection-based tracking framework has become the dominant paradigm in this research field, and the major task is to accurately perform the data association between detections across frames. However, online multiple human tracking, which relies only on the detections available up to the present time for the data association, becomes more challenging with noisy detections, missed detections, and occlusions. To address these challenging problems, three novel data association methods for online multiple human tracking are presented in this thesis: online group-structured dictionary learning, enhanced detection reliability, and multi-level cooperative fusion.

The first proposed method aims to address noisy detections and occlusions. In this method, sequential Monte Carlo probability hypothesis density (SMC-PHD) filtering is the core element for accomplishing the tracking task, where the measurements are produced by the detection-based tracking framework. To enhance the measurement model, a novel adaptive gating strategy is developed to aid the classification of measurements. In addition, online group-structured dictionary learning with a maximum voting method is proposed to robustly estimate the target birth intensity. It enables the new-born targets in the tracking process to be accurately initialized from noisy sensor measurements. To improve the adaptability of the group-structured dictionary to target appearance changes, the simultaneous codeword optimization (SimCO) algorithm is employed for the dictionary update.

The second proposed method concerns accurate measurement selection, which further refines the noisy detections before they enter the tracking pipeline. To obtain more reliable measurements in the Gaussian mixture (GM)-PHD filtering process, a global-to-local enhanced confidence rescoring strategy is proposed by exploiting the classification power of a mask region-based convolutional neural network (R-CNN). Then, an improved pruning algorithm, namely soft-aggregated non-maximal suppression (Soft-ANMS), is devised to further enhance the selection step. In addition, to avoid the misuse of ambiguous measurements in the tracking process, person re-identification (ReID) features driven by convolutional neural networks (CNNs) are integrated to model the target appearances.

The third proposed method focuses on addressing the issues of missed detections and occlusions. This method integrates two human detectors with different characteristics (full-body and body-parts) in the GM-PHD filter, and investigates their complementary benefits for tracking multiple targets. For each detector domain, a novel discriminative correlation matching (DCM) model is proposed for integration in the feature-level fusion and, together with spatio-temporal information, is used to reduce ambiguous identity associations in the GM-PHD filter. Moreover, a robust fusion center is proposed within the decision-level fusion to mitigate the sensitivity to missed detections in the fusion process, thereby improving the fusion performance and tracking consistency.

The effectiveness of these proposed methods is investigated using the MOTChallenge benchmark, a framework for the standardized evaluation of multiple object tracking methods. Detailed evaluations on challenging video datasets, as well as comparisons with recent state-of-the-art techniques, confirm the improved multiple human tracking performance.
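As general background for readers, all three methods build on the PHD filter, whose standard recursion (in the widely used Vo-Ma formulation, with spawning terms omitted for brevity; this is textbook notation, not the thesis's own) propagates a target intensity rather than individual tracks:

```latex
% Standard PHD recursion (Vo & Ma's formulation), shown as background;
% the SMC- and GM-PHD filters in the thesis approximate this intensity
% with particles and Gaussian mixtures respectively.
\[
D_{k|k-1}(x) = \gamma_k(x)
  + \int p_{S,k}(\zeta)\, f_{k|k-1}(x \mid \zeta)\, D_{k-1}(\zeta)\, d\zeta
\]
\[
D_k(x) = \bigl(1 - p_{D,k}(x)\bigr)\, D_{k|k-1}(x)
  + \sum_{z \in Z_k}
    \frac{p_{D,k}(x)\, g_k(z \mid x)\, D_{k|k-1}(x)}
         {\kappa_k(z) + \int p_{D,k}(\xi)\, g_k(z \mid \xi)\, D_{k|k-1}(\xi)\, d\xi}
\]
```

Here \(\gamma_k\) is the birth intensity (which the first method estimates via group-structured dictionary learning), \(p_{S,k}\) the survival probability, \(p_{D,k}\) the detection probability, \(g_k\) the measurement likelihood, and \(\kappa_k\) the clutter intensity.

Similarly, the pruning step named in the abstract builds on soft non-maximal suppression. The sketch below shows the standard Soft-NMS baseline (Gaussian decay) that a soft-aggregated variant such as Soft-ANMS would extend; the function names, box format, sigma, and score threshold are illustrative assumptions, not the thesis's exact algorithm or settings.

```python
# Minimal sketch of standard Soft-NMS with Gaussian score decay.
# Illustrative only: Soft-ANMS in the thesis is a variant of this idea.
import numpy as np

def iou(box, boxes):
    """IoU between one box and an array of boxes, all as [x1, y1, x2, y2]."""
    x1 = np.maximum(box[0], boxes[:, 0])
    y1 = np.maximum(box[1], boxes[:, 1])
    x2 = np.minimum(box[2], boxes[:, 2])
    y2 = np.minimum(box[3], boxes[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area = lambda b: (b[..., 2] - b[..., 0]) * (b[..., 3] - b[..., 1])
    return inter / (area(box) + area(boxes) - inter)

def soft_nms(boxes, scores, sigma=0.5, score_thresh=0.001):
    """Decay overlapping detection scores instead of discarding boxes."""
    scores = scores.copy()
    keep = []
    idx = np.arange(len(scores))
    while len(idx) > 0:
        best = idx[np.argmax(scores[idx])]  # highest-scoring remaining box
        keep.append(int(best))
        idx = idx[idx != best]
        if len(idx) == 0:
            break
        # Gaussian penalty: the more a box overlaps the kept box,
        # the more its confidence is suppressed.
        overlaps = iou(boxes[best], boxes[idx])
        scores[idx] *= np.exp(-(overlaps ** 2) / sigma)
        idx = idx[scores[idx] > score_thresh]
    return keep, scores

# Example: the two heavily overlapping boxes are rescored, not dropped.
boxes = np.array([[0, 0, 10, 10], [1, 1, 11, 11], [50, 50, 60, 60]], float)
scores = np.array([0.9, 0.8, 0.7])
kept, rescored = soft_nms(boxes, scores)
```

Unlike classical NMS, which hard-deletes every box above an IoU cutoff, the decay keeps heavily occluded detections alive at reduced confidence, which is what makes a soft selection step less brittle when targets overlap.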
Description: PhD Thesis
URI: http://hdl.handle.net/10443/5194
Appears in Collections: School of Engineering

Files in This Item:
File               Size      Format
Zeyu F 2020.pdf    35.21 MB  Adobe PDF
dspacelicence.pdf  43.82 kB  Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.