
ePrints (Open Access)

Learning disentangled behaviour patterns for wearable-based human activity recognition

Lookup NU author(s): Dr Jie Su, Dr Zhenyu Wen, Dr Yu Guan

Downloads

Full text for this publication is not currently held within this repository. Alternative links are provided below where available.


Abstract

© 2022 ACM. In wearable-based human activity recognition (HAR) research, one of the major challenges is the large intra-class variability problem. The collected activity signal is often, if not always, coupled with noise or bias caused by personal, environmental, or other factors, making it difficult to learn effective features for HAR tasks, especially with inadequate data. To address this issue, we propose a Behaviour Pattern Disentanglement (BPD) framework, which can disentangle behaviour patterns from irrelevant noise such as personal styles or environmental conditions. Based on a disentanglement network, we design several loss functions and use an adversarial training strategy for optimisation, which separates activity signals from irrelevant noise with the least dependency between them in the feature space. Our BPD framework is flexible and can be used on top of existing deep learning (DL) approaches for feature refinement. Extensive experiments were conducted on four public HAR datasets, and the promising results of the proposed BPD scheme suggest its flexibility and effectiveness. This is an open-source project, and the code can be found at http://github.com/Jie-su/BPD
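To make the core idea concrete, the following is a minimal illustrative sketch, not the authors' implementation (see the linked repository for the actual code). It assumes two separate encoders, one for activity-relevant features and one for nuisance features (personal style, environment), and measures their statistical dependency with a simple cross-covariance penalty; minimising such a penalty during training is one plausible way to encourage the disentanglement described in the abstract. All names and the linear encoders here are hypothetical stand-ins for the paper's deep networks and loss functions.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, W):
    # Stand-in for a deep encoder: a linear map with a tanh nonlinearity.
    return np.tanh(x @ W)

def cross_covariance_penalty(z_act, z_nui):
    # Dependency proxy between activity features and nuisance features:
    # squared Frobenius norm of their cross-covariance matrix.
    # A value near zero means the two feature sets are (linearly) decorrelated.
    z_act_c = z_act - z_act.mean(axis=0)
    z_nui_c = z_nui - z_nui.mean(axis=0)
    cov = z_act_c.T @ z_nui_c / len(z_act)
    return float(np.sum(cov ** 2))

# Toy batch of "sensor windows": 32 windows, each flattened to 16 dims.
x = rng.standard_normal((32, 16))
W_act = 0.1 * rng.standard_normal((16, 8))  # activity-encoder weights (hypothetical)
W_nui = 0.1 * rng.standard_normal((16, 8))  # nuisance-encoder weights (hypothetical)

z_act = encode(x, W_act)
z_nui = encode(x, W_nui)
penalty = cross_covariance_penalty(z_act, z_nui)  # term a trainer would minimise
```

In the paper's setting this dependency term would be combined with task losses (e.g. activity classification on the activity branch) and optimised adversarially; the sketch above only shows the dependency measurement step.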


Publication metadata

Author(s): Su J, Wen Z, Lin T, Guan Y

Publication type: Article

Publication status: Published

Journal: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies

Year: 2022

Volume: 6

Issue: 1

Online publication date: 29/03/2022

Acceptance date: 02/04/2018

ISSN (electronic): 2474-9567

Publisher: Association for Computing Machinery

URL: https://doi.org/10.1145/3517252

DOI: 10.1145/3517252

