Loughborough University

File(s) under permanent embargo

Reason: This item is currently closed access.

Tracking human pose with multiple activity models

journal contribution
posted on 2016-02-09, 12:25 authored by John Darby, Baihua Li, Nicholas Costen
Tracking unknown human motions using generative tracking techniques requires the exploration of a high-dimensional pose space which is both difficult and computationally expensive. Alternatively, if the type of activity is known and training data is available, a low-dimensional latent pose space may be learned and the difficulty and cost of the estimation task reduced. In this paper we attempt to combine the competing benefits of these two generative tracking scenarios, flexibility and efficiency, within a single approach. We define a number of "activity models", each composed of a pose space with unique dimensionality and an associated dynamical model, and each designed for use in the recovery of a particular class of activity. We then propose a method for the fair combination of these activity models for use in particle dispersion by an annealed particle filter. The resulting algorithm, which we term multiple activity model annealed particle filtering (MAM-APF), is able to dynamically vary the scope of its search effort, using a small number of particles to explore latent pose spaces and a large number of particles to explore the full pose space. We present quantitative results on the HumanEva-I and HumanEva-II datasets, demonstrating robust 3D tracking of known and unknown activities from fewer than four cameras.
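The abstract builds on the generic annealed particle filter: at each frame, a softened (annealed) likelihood is progressively sharpened over several layers, with resampling and shrinking diffusion noise concentrating the particle set on good pose hypotheses. The toy sketch below illustrates only that generic mechanism in one dimension; the function name `apf_frame`, the linear annealing and noise schedules, and the quadratic log-likelihood are illustrative assumptions, not the paper's MAM-APF, which additionally allocates particles across multiple activity models of differing dimensionality.

```python
import numpy as np

def apf_frame(log_lik, particles, n_layers=5, noise0=1.0, rng=None):
    """One frame of a generic annealed particle filter (toy sketch).

    log_lik:   maps an (N, D) array of pose hypotheses to N log-likelihoods
    particles: (N, D) array of hypotheses carried over from the previous frame
    Returns the refined (N, D) particle set after all annealing layers.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    n = len(particles)
    for layer in range(1, n_layers + 1):
        beta = layer / n_layers           # annealing exponent: broad early, sharp late
        logw = beta * log_lik(particles)  # softened likelihood at early layers
        w = np.exp(logw - logw.max())     # stabilised, unnormalised weights
        w /= w.sum()
        idx = rng.choice(n, size=n, p=w)  # resample in proportion to weight
        particles = particles[idx]
        # Diffuse the survivors; the noise shrinks as the search sharpens.
        sigma = noise0 * (1.0 - layer / (n_layers + 1))
        particles = particles + rng.normal(0.0, sigma, particles.shape)
    return particles

# Toy usage: a 1-D "pose" with a quadratic log-likelihood peaked at 3.0.
rng = np.random.default_rng(42)
log_lik = lambda x: -np.sum((x - 3.0) ** 2, axis=1)
particles = rng.uniform(-10.0, 10.0, size=(200, 1))
out = apf_frame(log_lik, particles, n_layers=8, noise0=2.0, rng=rng)
```

In MAM-APF, the analogous step would be run with separate particle allocations per activity model: a few particles suffice in each low-dimensional latent space, while a larger share explores the full pose space for unknown activities.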

History

School

  • Science

Department

  • Computer Science

Published in

Pattern Recognition

Volume

43

Issue

9

Pages

3042 - 3058

Citation

DARBY, J., LI, B. and COSTEN, N., 2010. Tracking human pose with multiple activity models. Pattern Recognition, 43 (9), pp. 3042-3058.

Publisher

© Elsevier

Version

  • NA (Not Applicable or Unknown)

Publisher statement

This work is made available according to the conditions of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) licence. Full details of this licence are available at: https://creativecommons.org/licenses/by-nc-nd/4.0/

Publication date

2010

ISSN

0031-3203

Language

  • en