Incremental multi-linear discriminant analysis using canonical correlations for action recognition

Cheng-Cheng Jia, Su-Jing Wang, Xu-Jun Peng, Wei Pang, Can-Yan Zhang, Chunguang Zhou, Zhe-Zhou Yu

Research output: Contribution to journal › Article

13 Citations (Scopus)

Abstract

Canonical correlations analysis (CCA) is often used for feature extraction and dimensionality reduction. However, vectorizing images for CCA breaks the spatial structure of the original image, and the resulting high-dimensional vectors often cause the curse of dimensionality problem. In this paper, we propose a novel CCA-based feature extraction method in a multi-linear discriminant subspace, encoding each action sample as a high-order tensor. An optimization approach is presented that iteratively learns the discriminant subspace by unfolding the tensor along its different modes, so that most of the underlying data structure, including spatio-temporal information, is retained and the curse of dimensionality is alleviated. In addition, an incremental scheme is developed for online learning of the multi-linear subspace, which improves the discriminative capability efficiently and effectively. The nearest neighbor classifier (NNC) is then employed for action classification. Experiments on the Weizmann database show that the proposed method outperforms state-of-the-art methods in terms of accuracy and time complexity, and that it is robust against partial occlusion.
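To make the abstract's core idea concrete, the sketch below illustrates the two mechanisms it describes: mode-n unfolding of a tensor-encoded action clip, and alternating per-mode learning of discriminant projections. This is not the authors' implementation; the paper couples the subspace with canonical correlations, whereas this sketch substitutes a standard Fisher-style scatter criterion, so it shows the alternating mode-wise structure rather than the exact objective. All names, shapes, and the iteration count are illustrative assumptions.

```python
# Minimal sketch (assumed details, not the paper's method): mode-n unfolding
# plus alternating per-mode discriminant subspace learning on 3rd-order
# action tensors (e.g. x * y * t clips).
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: move `mode` to the front, then flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def learn_multilinear_subspace(samples, labels, ranks, n_iters=5):
    """Alternate over tensor modes, refitting one projection at a time.

    samples : list of equally shaped 3rd-order tensors (one per action clip)
    labels  : class label per sample
    ranks   : target subspace dimension per mode
    Returns one projection matrix per mode.
    """
    labels = np.asarray(labels)
    n_modes = samples[0].ndim
    # Initialize each mode's projection with a truncated identity.
    U = [np.eye(samples[0].shape[m])[:, :ranks[m]] for m in range(n_modes)]
    classes = np.unique(labels)
    for _ in range(n_iters):
        for m in range(n_modes):
            # Project every sample along all modes except m, then unfold.
            mats = []
            for X in samples:
                Y = X
                for k in range(n_modes):
                    if k != m:
                        Y = np.tensordot(U[k].T, Y, axes=([1], [k]))
                        Y = np.moveaxis(Y, 0, k)
                mats.append(unfold(Y, m))
            mats = np.array(mats)
            mean_all = mats.mean(axis=0)
            # Between- and within-class scatter in mode m (a stand-in
            # discriminant criterion; the paper uses CCA-based correlations).
            Sb = np.zeros((mats.shape[1], mats.shape[1]))
            Sw = np.zeros_like(Sb)
            for c in classes:
                Mc = mats[labels == c]
                mc = Mc.mean(axis=0)
                d = mc - mean_all
                Sb += len(Mc) * d @ d.T
                for M in Mc:
                    e = M - mc
                    Sw += e @ e.T
            # Leading eigenvectors of pinv(Sw) @ Sb give the mode-m basis.
            vals, vecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
            order = np.argsort(-vals.real)
            U[m] = vecs[:, order[:ranks[m]]].real
    return U
```

Projecting a new clip through the learned per-mode bases and comparing it against stored training projections with a nearest neighbor rule would correspond to the NNC classification step the abstract mentions; the incremental online update of the subspace is a further refinement specific to the paper and is not sketched here.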
Original language: English
Pages (from-to): 56-63
Number of pages: 8
Journal: Neurocomputing
Volume: 83
Issue number: -
Early online date: 27 Dec 2011
DOIs
Publication status: Published - 15 Apr 2012

Keywords

  • Canonical correlations analysis
  • Multi-linear subspace
  • Discriminant information
  • Incremental learning
  • Action recognition
