Egocentric hand detection via dynamic region growing
Egocentric videos, which mainly record the activities carried out by the users of wearable cameras, have drawn much research attention in recent years. Due to their lengthy content, a large number of ego-related applications have been developed to abstract the captured videos. As users are accustomed to interacting with target objects using their own hands, and their hands usually appear within their visual fields during the interaction, an egocentric hand detection step is involved in tasks such as gesture recognition, action recognition, and social interaction understanding. In this work, we propose a dynamic region-growing approach for hand region detection in egocentric videos that jointly considers hand-related motion and egocentric cues. We first determine seed regions that most likely belong to the hand by analyzing motion patterns across successive frames. The hand regions can then be located by growing outward from the seed regions according to scores computed for adjacent superpixels. These scores are derived from four egocentric cues: contrast, location, position consistency, and appearance continuity. We also discuss how to apply the proposed method in real-life scenarios, where multiple hands irregularly appear in and disappear from the videos. Experimental results on public datasets show that the proposed method achieves superior performance compared with state-of-the-art methods, especially in complicated scenarios.
Main Authors: HUANG, Shao; WANG, Weiqiang; HE, Shengfeng; LAU, Rynson W. H.
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2018
Collection: Research Collection School Of Computing and Information Systems
Subjects: Egocentric videos; egocentric hand detection; seed region generation; hand region growing; Information Security
Online Access: https://ink.library.smu.edu.sg/sis_research/7856
DOI: 10.1145/3152129
Institution: Singapore Management University
Record ID: sg-smu-ink.sis_research-8859
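
The abstract above outlines a two-stage pipeline: motion-based seed selection, followed by region growing over adjacent superpixels scored with four egocentric cues (contrast, location, position consistency, appearance continuity). As a rough illustration of the growing step only, here is a minimal Python sketch; the function names, the greedy growth strategy, the equal cue weights, and the acceptance threshold are all illustrative assumptions and are not taken from the paper.

```python
# Hypothetical sketch of superpixel-based region growing for hand masks.
# It mirrors the growing step described in the abstract, but every name,
# weight, and threshold below is an illustrative assumption, not the
# paper's actual implementation.
import numpy as np


def superpixel_adjacency(labels):
    """Build an adjacency set per superpixel id from a 2-D label map."""
    adj = {int(l): set() for l in np.unique(labels)}
    # Pixels that differ from their right or bottom neighbour mark a boundary
    # between two adjacent superpixels.
    for a, b in ((labels[:, :-1], labels[:, 1:]), (labels[:-1, :], labels[1:, :])):
        boundary = a != b
        for u, v in zip(a[boundary].ravel(), b[boundary].ravel()):
            adj[int(u)].add(int(v))
            adj[int(v)].add(int(u))
    return adj


def combine_cues(sp, contrast, location, consistency, appearance,
                 w=(0.25, 0.25, 0.25, 0.25)):
    """Linear combination of four per-superpixel cue scores in [0, 1]
    (contrast, location, position consistency, appearance continuity).
    Equal weights are a placeholder choice."""
    return (w[0] * contrast[sp] + w[1] * location[sp]
            + w[2] * consistency[sp] + w[3] * appearance[sp])


def grow_hand_mask(labels, seed_ids, score_fn, accept_thresh=0.5):
    """Greedily grow a hand region from motion-derived seed superpixels.

    labels    : (H, W) integer superpixel map
    seed_ids  : ids of superpixels believed to lie on the hand
    score_fn  : callable(superpixel_id) -> float in [0, 1]
    Returns a boolean (H, W) hand mask.
    """
    adj = superpixel_adjacency(labels)
    region = set(int(s) for s in seed_ids)
    frontier = set().union(*(adj[s] for s in region)) - region
    while frontier:
        best = max(frontier, key=score_fn)
        if score_fn(best) < accept_thresh:
            break  # no neighbouring superpixel looks hand-like enough
        region.add(best)
        frontier |= adj[best]
        frontier -= region
    return np.isin(labels, list(region))
```

In practice, `score_fn` would close over the four precomputed cue maps, e.g. `lambda sp: combine_cues(sp, contrast, location, consistency, appearance)`, and `seed_ids` would come from the motion-analysis stage that the abstract describes but which is not sketched here.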