Recognizing Gaits Across Walking and Running Speeds

For decades, very few methods have been proposed for cross-mode (i.e., walking vs. running) gait recognition, so it remains largely unexplored how to recognize persons by the way they walk and run. Existing cross-mode methods handle the walking-versus-running problem in two ways: either by exploring a generic mapping relation between the walking and running modes, or by extracting gait features that are invulnerable, or less vulnerable, to the changes across these two modes. However, for the first approach, a mapping relation that fits one person may not be applicable to another; there is no generic mapping relation, given that walking and running are two highly self-related motions. The second approach pays little attention to the disparity between the walking and running modes, since mode labels are not involved in its feature learning process. Distinct from these existing cross-mode methods, our method uses mode labels in the feature learning process and hybridizes a mode-invariant gait descriptor for cross-mode gait recognition. This article further investigates the disparity between walking and running: running differs from walking not only in speed but also, more significantly, in prominent gesture/motion changes. Based on these rationales, the proposed method pays particular attention to the differences between the walking and running modes, and a robust gait descriptor is developed by hybridizing mode-invariant spatial and temporal features. Two multi-task learning-based networks are proposed to extract these mode-invariant features: spatial features describe the body parts unaffected, or less affected, by mode changes, and temporal features depict the intrinsic motion relation of each person. Mode labels are also adopted in the training phase to guide the networks toward the disparity across the walking and running modes. Experiments on the OU-ISIR Treadmill Dataset A confirm the effectiveness and feasibility of the proposed method, which achieves a state-of-the-art result on this dataset.
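The abstract describes a multi-task formulation in which walk/run mode labels supervise feature learning alongside identity. As a rough, hypothetical sketch of that general idea only (not the authors' actual networks, inputs, or loss design, and collapsing the paper's two spatial/temporal networks into a single branch for brevity), the following Python/PyTorch snippet shows a shared backbone over a silhouette-based gait template with an identity head and a walk/run mode head trained jointly. The class name GaitMultiTaskNet, the layer sizes, the GEI-like input, and the loss weight lambda_mode are illustrative assumptions.

```python
# Illustrative sketch only: a generic multi-task gait network with an identity
# head and a walk/run mode head. The architecture, input format, and loss
# weighting are assumptions for illustration, not the paper's actual design.
import torch
import torch.nn as nn


class GaitMultiTaskNet(nn.Module):
    def __init__(self, num_subjects: int, feat_dim: int = 256):
        super().__init__()
        # Shared backbone over a single-channel gait template (e.g., a GEI-like image).
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim), nn.ReLU(),
        )
        self.id_head = nn.Linear(feat_dim, num_subjects)  # who the subject is
        self.mode_head = nn.Linear(feat_dim, 2)           # walking (0) vs. running (1)

    def forward(self, x):
        feat = self.backbone(x)  # shared descriptor intended to be mode-invariant
        return self.id_head(feat), self.mode_head(feat), feat


def training_step(model, x, id_labels, mode_labels, lambda_mode: float = 0.1):
    # Joint loss: mode labels enter the objective so the network is explicitly
    # aware of the walk/run disparity while learning identity features.
    id_logits, mode_logits, _ = model(x)
    ce = nn.CrossEntropyLoss()
    return ce(id_logits, id_labels) + lambda_mode * ce(mode_logits, mode_labels)
```

At test time, the shared feature vector (rather than either head's logits) would serve as the gait descriptor for cross-mode matching, mirroring the abstract's notion of a mode-invariant descriptor; how the actual method combines its spatial and temporal features is described in the article itself.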


Bibliographic Details
Main Authors: Lingxiang Yao, Worapan Kusakunniran, Qiang Wu, Jingsong Xu, Jian Zhang
Other Authors: University of Technology Sydney
Format: Article
Published: ACM Transactions on Multimedia Computing, Communications and Applications, Vol. 18, No. 3 (2022)
DOI: 10.1145/3488715
ISSN: 1551-6857, 1551-6865
Subjects: Computer Science
Online Access: https://repository.li.mahidol.ac.th/handle/123456789/73729
Scopus: https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85127448325&origin=inward
Institution: Mahidol University