Robust Visual Tracking via Parallel Kernel Sparse Representation and NormalHedge
- DOI
- 10.2991/3ca-13.2013.23
- Keywords
- Parallel Kernel Sparse Representation; NormalHedge; Visual Tracking
- Abstract
In this paper, a novel visual object tracking method based on NormalHedge (NH) and parallel kernel sparse representation (PKSR) is proposed to achieve robust tracking under challenging conditions such as the target sharing similar patterns with its background, occlusion, and deformation. Kernel functions can improve classification performance by mapping features into a high-dimensional kernel space; however, the standard coordinate descent-based sparse representation method is not efficient enough for tracking. We therefore propose a kernel parallel coordinate descent (KPCD) method to efficiently solve the ℓ1 minimization in the kernel space, together with a classification framework that computes the loss value of each particle. An adaptive dictionary updating method is then proposed to build the over-complete dictionary. In addition, the states of the target are estimated by NormalHedge (NH), a recently developed online learning method similar to the particle filter, whose effective re-sampling scheme avoids the degeneracy problem. Extensive experiments show that the proposed method outperforms several state-of-the-art trackers under complex conditions.
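To make the two building blocks of the abstract concrete, the following is a minimal sketch (not the authors' implementation): a serial coordinate-descent solver for the kernelized ℓ1 coding problem min_x 0.5·xᵀK_DD x − k_Dyᵀx + λ‖x‖₁ (the parallel KPCD variant would update coordinates concurrently), and a NormalHedge weight update over candidate particles. The Gaussian kernel, the regularization value, and all variable names are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    """k(a, b) = exp(-||a - b||^2 / (2 sigma^2)) for columns of A and B (assumed kernel choice)."""
    d2 = (np.sum(A**2, axis=0)[:, None] - 2.0 * A.T @ B + np.sum(B**2, axis=0)[None, :])
    return np.exp(-d2 / (2.0 * sigma**2))

def kernel_cd_lasso(K_DD, k_Dy, lam=0.01, n_iter=50):
    """Coordinate descent for the kernel sparse coding of a candidate y over dictionary D:
    min_x 0.5*x'K_DD x - k_Dy'x + lam*||x||_1, using soft-thresholding per coordinate."""
    n = K_DD.shape[0]
    x = np.zeros(n)
    for _ in range(n_iter):
        for j in range(n):
            # correlation of the residual with atom j, excluding coordinate j itself
            r = k_Dy[j] - K_DD[j] @ x + K_DD[j, j] * x[j]
            # closed-form 1-D minimizer: soft-threshold by lam, scale by K_jj
            x[j] = np.sign(r) * max(abs(r) - lam, 0.0) / K_DD[j, j]
    return x

def normalhedge_weights(regret):
    """NormalHedge weighting: find scale c with mean(exp([R]_+^2 / (2c))) = e,
    then weight each expert (particle) proportionally to ([R]_+ / c) * exp([R]_+^2 / (2c))."""
    Rp = np.maximum(np.asarray(regret, dtype=float), 0.0)
    if not np.any(Rp > 0):
        return np.full(Rp.size, 1.0 / Rp.size)
    lo, hi = Rp.max()**2 / 1400.0, Rp.max()**2 / 2.0 + 1e-12  # bracketing values for c
    for _ in range(60):  # bisection: the mean potential is decreasing in c
        c = 0.5 * (lo + hi)
        if np.mean(np.exp(Rp**2 / (2.0 * c))) > np.e:
            lo = c
        else:
            hi = c
    w = (Rp / c) * np.exp(Rp**2 / (2.0 * c))
    return w / w.sum()
```

As a usage sketch, each candidate patch would be coded with `kernel_cd_lasso`, its reconstruction or classification loss converted into a regret, and `normalhedge_weights` would then concentrate weight on low-loss particles; particles whose positive regret is zero receive zero weight, which is the property that lets NormalHedge discard poor candidates without an explicit degeneracy-prone re-sampling step.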
- Copyright
- © 2013, the Authors. Published by Atlantis Press.
- Open Access
- This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
Cite this article
TY  - CONF
AU  - Jinjun Kuang
AU  - Cheng Cheng
PY  - 2013/04
DA  - 2013/04
TI  - Robust Visual Tracking via Parallel Kernel Sparse Representation and NormalHedge
BT  - Proceedings of the 2nd International Symposium on Computer, Communication, Control and Automation
PB  - Atlantis Press
SP  - 89
EP  - 93
SN  - 1951-6851
UR  - https://doi.org/10.2991/3ca-13.2013.23
DO  - 10.2991/3ca-13.2013.23
ID  - Kuang2013/04
ER  -