Real-Time Event-Triggered Object Tracking in the Presence of Model Drift and Occlusion
Date of Issue: 2018
School of Electrical and Electronic Engineering
ST Engineering-NTU Corporate Lab
In this paper, we propose a novel event-triggered tracking (ETT) framework for fast and robust visual tracking in the presence of model drift and occlusion. The resulting tracker not only operates in real time but is also resilient to tracking failures caused by factors such as heavy occlusion. Specifically, the tracker consists of an event-triggered decision model as the core module, which coordinates the other functional modules: a short-term tracker, occlusion and drift identification, target re-detection, short-term tracker updating, and on-line discriminative learning for the detector. Each functional module is associated with a defined event that is triggered when a set of proposed conditions is met. The occlusion and drift identification module performs on-line evaluation of the short-term tracking. When a model drift event occurs, the event-triggered decision model activates the target re-detection module to relocate the target and reinitialize the short-term tracker. The short-term tracker is updated at each frame with a variable learning rate that depends on the degree of occlusion. A sampling pool is constructed to store discriminative samples that are used to update the detector model. Extensive experiments on large benchmark datasets demonstrate that ETT can effectively detect model drift and restore tracking.
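The per-frame control flow described above can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the thresholds (DRIFT_THRESH, OCCLUSION_THRESH), the base learning rate, and all function names are hypothetical stand-ins for the conditions and modules named in the abstract.

```python
# Hypothetical sketch of the event-triggered decision model described in
# the abstract. Thresholds and names are assumptions, not from the paper.

DRIFT_THRESH = 0.3      # assumed confidence below which model drift is declared
OCCLUSION_THRESH = 0.5  # assumed occlusion level that suppresses model updates

def learning_rate(occlusion, base_lr=0.02):
    """Variable learning rate: shrink the update as occlusion grows,
    and stop updating entirely under heavy occlusion."""
    if occlusion >= OCCLUSION_THRESH:
        return 0.0
    return base_lr * (1.0 - occlusion / OCCLUSION_THRESH)

def step(confidence, occlusion):
    """One frame of the decision loop. Given the short-term tracker's
    confidence and the estimated occlusion degree, return the list of
    triggered events."""
    events = []
    if confidence < DRIFT_THRESH:
        # Drift event: relocate the target and reinitialize the tracker.
        events.append("redetect")
    lr = learning_rate(occlusion)
    if lr > 0.0:
        # Update the short-term tracker with an occlusion-dependent rate.
        events.append(("update", lr))
    if confidence >= DRIFT_THRESH and occlusion < OCCLUSION_THRESH:
        # Reliable frame: store a discriminative sample in the sampling
        # pool for on-line learning of the detector.
        events.append("store_sample")
    return events
```

For example, a confident, unoccluded frame triggers both a tracker update and sample storage, a low-confidence frame triggers re-detection, and a heavily occluded frame triggers no model update at all, which is the mechanism the abstract credits for resisting drift under occlusion.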
IEEE Transactions on Industrial Electronics
© 2018 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at: [http://dx.doi.org/10.1109/TIE.2018.2835390].