ePrints

Bootstrapping Personalised Human Activity Recognition Models Using Online Active Learning

Lookup NU author(s): Tudor Miu, Professor Paolo Missier (ORCID), Dr Thomas Ploetz


Full text for this publication is not currently held within this repository. Alternative links are provided below where available.


Abstract

In Human Activity Recognition (HAR), supervised and semi-supervised training are important tools for devising parametric activity models. For the best modelling performance, typically large amounts of annotated sample data are required. Annotation often represents the bottleneck in the overall modelling process, as it usually involves retrospective analysis of experimental ground truth, such as video footage. These approaches typically neglect that prospective users of HAR systems are themselves key sources of ground truth for their own activities. We therefore propose an Online Active Learning framework to collect user-provided annotations and to bootstrap personalised human activity models. We evaluate our framework on existing benchmark datasets and demonstrate how it outperforms standard, more naive annotation methods. Furthermore, we conduct a user study in which participants provide annotations using a mobile app that implements our framework. We show that Online Active Learning is a viable method to bootstrap personalised models, especially in live situations without expert supervision.
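The core idea of online active learning as described in the abstract is that the model itself decides, sample by sample, when a user annotation is worth requesting. The sketch below is a minimal illustration of that pattern, not the authors' actual method: it uses a hypothetical nearest-centroid classifier updated incrementally, a margin-based uncertainty score, and a fixed query budget. All class and parameter names are invented for this example.

```python
import numpy as np


def margin_uncertainty(probs):
    """Uncertainty = 1 - (top prob - runner-up prob); high when classes compete."""
    top2 = np.sort(probs)[-2:]
    return 1.0 - (top2[1] - top2[0])


class OnlineActiveLearner:
    """Toy online active learner: a nearest-centroid model that asks the
    user (oracle) for a label only while classes are unseen or when its
    prediction is uncertain, up to a fixed annotation budget."""

    def __init__(self, n_classes, n_features, budget, threshold=0.5):
        self.counts = np.zeros(n_classes)               # labels seen per class
        self.sums = np.zeros((n_classes, n_features))   # running feature sums
        self.budget = budget
        self.threshold = threshold
        self.queries = 0

    def _probs(self, x):
        """Softmax over negative distances to the centroids seen so far."""
        seen = self.counts > 0
        if not seen.any():
            return np.full(len(self.counts), 1.0 / len(self.counts))
        d = np.full(len(self.counts), np.inf)
        centroids = self.sums[seen] / self.counts[seen][:, None]
        d[seen] = np.linalg.norm(centroids - x, axis=1)
        sim = np.exp(-d)
        return sim / sim.sum()

    def process(self, x, oracle):
        """Classify one streamed sample; query the oracle if warranted."""
        probs = self._probs(x)
        cold_start = (self.counts == 0).any()
        if self.queries < self.budget and (
            cold_start or margin_uncertainty(probs) > self.threshold
        ):
            y = oracle(x)                # ask the user for ground truth
            self.queries += 1
            self.counts[y] += 1
            self.sums[y] += x
            return y
        return int(np.argmax(probs))     # confident: use own prediction
```

Fed a stream of sensor feature vectors, a learner like this typically spends its annotation budget early (bootstrapping) and then coasts on its own predictions, which is the cost-saving behaviour the paper evaluates against naive annotate-everything baselines.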

Publication metadata

Author(s): Miu T, Missier P, Plotz T

Publication type: Conference Proceedings (inc. Abstract)

Publication status: Published

Conference Name: 2015 IEEE International Conference on Computer and Information Technology - Ubiquitous Computing and Communications - Dependable, Autonomic and Secure Computing - Pervasive Intelligence and Computing (CIT/IUCC/DASC/PICOM)

Year of Conference: 2015

Pages: 1139-1148

Print publication date: 01/01/2015

Online publication date: 28/12/2015


Publisher: IEEE


DOI: 10.1109/CIT/IUCC/DASC/PICOM.2015.170