Dynamical Regularity for Action Analysis

Vinay Venkataraman, Ioannis Vlachos and Pavan Turaga

Abstract

In this paper, we propose a new approach for quantifying 'dynamical regularity' as applied to modeling human actions. We use an approximate entropy-based feature representation to model the dynamics of human movement, enabling temporal segmentation of untrimmed motion capture data and fine-grained quality assessment of diving actions in videos. The principle herein is to quantify regularity (the frequency of typical patterns) in the dynamical space computed from trajectories of action data. We extend conventional ideas for modeling dynamics in human movement by introducing multivariate and cross approximate entropy features. Our experimental evaluation on theoretical models and two publicly available databases shows that the proposed features can achieve state-of-the-art results on applications such as temporal segmentation and quality assessment of actions.
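For readers unfamiliar with the underlying statistic: approximate entropy (ApEn) measures how often patterns of length m that are close in a time series remain close when extended to length m+1. The sketch below is a minimal, generic ApEn computation in the standard Pincus formulation, not the authors' multivariate or cross approximate entropy features; the function name and the defaults m=2 and r=0.2*std are common conventions, not values taken from the paper.

```python
import numpy as np

def approximate_entropy(u, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D time series u.

    Low values indicate regular dynamics (similar length-m patterns stay
    similar at length m+1); high values indicate irregularity.
    """
    u = np.asarray(u, dtype=float)
    N = len(u)
    if r is None:
        r = 0.2 * np.std(u)  # common rule of thumb, not from the paper

    def phi(m):
        # Embed the series: each row is a window u[i:i+m]
        x = np.array([u[i:i + m] for i in range(N - m + 1)])
        # Chebyshev (max-coordinate) distance between all pairs of windows
        d = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=-1)
        # Fraction of windows within tolerance r (self-matches included)
        C = np.mean(d <= r, axis=1)
        return np.mean(np.log(C))

    return phi(m) - phi(m + 1)

# A periodic signal should yield lower ApEn than white noise.
t = np.linspace(0, 10 * np.pi, 500)
print(approximate_entropy(np.sin(t)))              # low: highly regular
print(approximate_entropy(np.random.randn(500)))   # high: irregular
```

In an action-analysis setting, such a regularity score would be computed over joint or pose trajectories rather than synthetic signals, as the abstract describes.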

Session

Poster 1

Files

Extended Abstract (PDF, 524K)
Paper (PDF, 1294K)

DOI

10.5244/C.29.67
https://dx.doi.org/10.5244/C.29.67

Citation

Vinay Venkataraman, Ioannis Vlachos and Pavan Turaga. Dynamical Regularity for Action Analysis. In Xianghua Xie, Mark W. Jones, and Gary K. L. Tam, editors, Proceedings of the British Machine Vision Conference (BMVC), pages 67.1-67.12. BMVA Press, September 2015.

Bibtex

@inproceedings{BMVC2015_67,
	title={Dynamical Regularity for Action Analysis},
	author={Vinay Venkataraman and Ioannis Vlachos and Pavan Turaga},
	year={2015},
	month={September},
	pages={67.1--67.12},
	articleno={67},
	numpages={12},
	booktitle={Proceedings of the British Machine Vision Conference (BMVC)},
	publisher={BMVA Press},
	editor={Xianghua Xie and Mark W. Jones and Gary K. L. Tam},
	doi={10.5244/C.29.67},
	isbn={1-901725-53-7},
	url={https://dx.doi.org/10.5244/C.29.67}
}