Open Access

Generative adversarial networks to create synthetic motion capture datasets including subject and gait characteristics

Lookup NU author(s): Dr Metin Bicer

Licence

This work is licensed under a Creative Commons Attribution 4.0 International License (CC BY 4.0).


Abstract

© 2024 The Authors.

Resource-intensive motion capture (mocap) systems challenge predictive deep learning applications, which require large and diverse datasets. We tackled this by modifying generative adversarial networks (GANs) into conditional GANs (cGANs) that can generate diverse mocap data, including 15 marker trajectories, lower-limb joint angles, and 3D ground reaction forces (GRFs), based on specified subject and gait characteristics. The cGAN comprised (1) an encoder compressing mocap data to a latent vector, (2) a decoder reconstructing the mocap data from the latent vector under specified conditions, and (3) a discriminator distinguishing random vectors with conditions from encoded latent vectors with conditions. Single-conditional models were trained separately for age, sex, leg length, mass, and walking speed, while an additional model (Multi-cGAN) combined all conditions simultaneously to generate synthetic data. All models closely replicated the training dataset (experimental and synthetic kinematics and GRFs differed over <8.1 % of the gait cycle), while a subset with narrow condition ranges was best replicated by the Multi-cGAN, which produced similar kinematics (<1°) and GRFs (<0.02 body weight) when averaged across walking speeds. The Multi-cGAN also generated synthetic datasets and results for three previous studies using the reported means and standard deviations of subject and gait characteristics. Additionally, unseen test data were best predicted by the walking-speed-conditional model, showcasing the diversity of the synthetic data. The same model also matched the dynamical consistency of the experimental data (32 % average difference throughout the gait cycle), meaning that transforming the gait-cycle data to the original time domain yielded accurate derivative calculations. Importantly, synthetic data pose no privacy concerns, potentially facilitating data sharing.
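The three-component architecture described in the abstract (encoder, conditional decoder, and a discriminator that compares encoded latents with conditions against random latents with conditions) can be sketched as a data-flow diagram in code. This is a minimal illustration, not the authors' implementation: all dimensions, the number of conditions, and the random linear layers standing in for trained networks are assumptions made here for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not taken from the paper):
T = 101            # time points per normalized gait cycle
C = 15 * 3 + 6 + 3 # 15 markers x 3 coords + joint angles + 3D GRFs (hypothetical channel count)
D = T * C          # one flattened mocap sample
LATENT = 32        # latent vector size (assumption)
COND = 5           # age, sex, leg length, mass, walking speed

def linear(in_dim, out_dim):
    """Random linear map standing in for a trained network component."""
    W = rng.standard_normal((in_dim, out_dim)) * 0.01
    b = np.zeros(out_dim)
    return lambda x: x @ W + b

encoder = linear(D, LATENT)               # (1) mocap -> latent vector
decoder = linear(LATENT + COND, D)        # (2) latent + conditions -> reconstructed mocap
discriminator = linear(LATENT + COND, 1)  # (3) latent + conditions -> real/fake score

x = rng.standard_normal(D)                        # a stand-in mocap sample
cond = np.array([30.0, 1.0, 0.9, 70.0, 1.3])      # age, sex, leg length (m), mass (kg), speed (m/s)

z = encoder(x)                                    # compress to latent vector
x_hat = decoder(np.concatenate([z, cond]))        # reconstruct under the given conditions

# The discriminator sees encoded latents vs. random (prior) latents, each with conditions:
score_encoded = discriminator(np.concatenate([z, cond]))
z_prior = rng.standard_normal(LATENT)
score_random = discriminator(np.concatenate([z_prior, cond]))

print(z.shape, x_hat.shape)  # latent and reconstruction dimensionalities
```

At generation time only the decoder is needed: sampling a random latent vector and concatenating it with the desired subject and gait characteristics yields a new synthetic mocap sample, which is why the trained model can reproduce study populations from reported means and standard deviations of the conditions.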


Publication metadata

Author(s): Bicer M, Phillips ATM, Melis A, McGregor AH, Modenese L

Publication type: Article

Publication status: Published

Journal: Journal of Biomechanics

Year: 2024

Volume: 177

Print publication date: 01/12/2024

Online publication date: 04/10/2024

Acceptance date: 03/10/2024

Date deposited: 19/11/2024

ISSN (print): 0021-9290

ISSN (electronic): 1873-2380

Publisher: Elsevier Ltd

URL: https://doi.org/10.1016/j.jbiomech.2024.112358

DOI: 10.1016/j.jbiomech.2024.112358



