Context-aware and scale-insensitive temporal repetition counting

Temporal repetition counting aims to estimate the number of cycles of a given repetitive action. Existing deep learning methods assume repetitive actions are performed at a fixed time-scale, which is invalid for the complex repetitive actions in real life. In this paper, we tailor a context-aware and scale-insensitive framework to tackle the challenges in repetition counting caused by unknown and diverse cycle lengths.
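The coarse-to-fine idea described in the abstract — score a sparse grid of candidate cycle lengths first, then refine only around the best coarse hit — can be sketched in toy form. The sketch below is illustrative, not the authors' implementation: it scores candidates by the self-similarity of a 1-D signal rather than by the paper's learned regression network, and the function name and parameters are invented for this example.

```python
import numpy as np

def estimate_cycle_length(signal, min_len=8, max_len=512,
                          coarse_step=32, refine_radius=16):
    """Coarse-to-fine search for a dominant cycle length of a 1-D signal.

    Candidates are scored by the normalized correlation of the signal with
    a copy of itself shifted by the candidate length (a hand-rolled
    stand-in for the paper's learned scoring).
    """
    def score(length):
        if length >= len(signal):
            return -np.inf
        a = signal[:-length] - signal[:-length].mean()
        b = signal[length:] - signal[length:].mean()
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(a @ b) / denom if denom > 0 else -np.inf

    # Coarse pass: evaluate only a sparse grid of candidate lengths,
    # avoiding an exhaustive search over every possible cycle length.
    best = max(range(min_len, max_len + 1, coarse_step), key=score)

    # Fine pass: refine within a small window around the coarse hit.
    lo = max(min_len, best - refine_radius)
    hi = min(max_len, best + refine_radius)
    return max(range(lo, hi + 1), key=score)
```

Note that in this naive stand-in, integer multiples of the true period also score highly (a pure sine of period 50 may resolve to 100, 150, or 200); disambiguating such harmonics from context is one of the things a learned, context-aware estimator handles that raw self-similarity does not.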

Full description

Bibliographic Details
Main Authors: ZHANG, Huaidong, XU, Xuemiao, HAN, Guoqiang, HE, Shengfeng
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2020
Subjects:
Online Access:https://ink.library.smu.edu.sg/sis_research/8523
https://ink.library.smu.edu.sg/context/sis_research/article/9526/viewcontent/Context_Aware_and_Scale_Insensitive_Temporal_Repetition_Counting.pdf
Institution: Singapore Management University
Language: English
id sg-smu-ink.sis_research-9526
record_format dspace
spelling sg-smu-ink.sis_research-95262024-01-22T15:01:40Z Context-aware and scale-insensitive temporal repetition counting ZHANG, Huaidong XU, Xuemiao HAN, Guoqiang HE, Shengfeng Temporal repetition counting aims to estimate the number of cycles of a given repetitive action. Existing deep learning methods assume repetitive actions are performed at a fixed time-scale, which is invalid for the complex repetitive actions in real life. In this paper, we tailor a context-aware and scale-insensitive framework to tackle the challenges in repetition counting caused by unknown and diverse cycle lengths. Our approach combines two key insights: (1) Cycle lengths of different actions are unpredictable and require large-scale searching, but once a coarse cycle length is determined, the variation between repetitions can be handled by regression. (2) Determining the cycle length cannot rely solely on a short fragment of video; it requires contextual understanding. The first insight is implemented by a coarse-to-fine cycle refinement method. It avoids the heavy computation of exhaustively searching all cycle lengths in the video; instead, it propagates the coarse prediction for further refinement in a hierarchical manner. Second, we propose a bidirectional cycle length estimation method for context-aware prediction. It is a regression network that takes two consecutive coarse cycles as input and predicts the locations of the previous and next repetitive cycles. To support training and evaluation in temporal repetition counting, we construct a new benchmark, the largest to date, containing 526 videos with diverse repetitive actions. Extensive experiments show that the proposed network, trained on a single dataset, outperforms state-of-the-art methods on several benchmarks, indicating that the proposed framework is general enough to capture repetition patterns across domains.
2020-06-01T07:00:00Z text application/pdf https://ink.library.smu.edu.sg/sis_research/8523 info:doi/10.1109/CVPR42600.2020.00075 https://ink.library.smu.edu.sg/context/sis_research/article/9526/viewcontent/Context_Aware_and_Scale_Insensitive_Temporal_Repetition_Counting.pdf http://creativecommons.org/licenses/by-nc-nd/4.0/ Research Collection School Of Computing and Information Systems eng Institutional Knowledge at Singapore Management University Coarse to fine Context-Aware Contextual understanding Cycle length Learning methods Number of cycles Refinement methods State-of-the-art methods Databases and Information Systems
institution Singapore Management University
building SMU Libraries
continent Asia
country Singapore
content_provider SMU Libraries
collection InK@SMU
language English
topic Coarse to fine
Context-Aware
Contextual understanding
Cycle length
Learning methods
Number of cycles
Refinement methods
State-of-the-art methods
Databases and Information Systems
spellingShingle Coarse to fine
Context-Aware
Contextual understanding
Cycle length
Learning methods
Number of cycles
Refinement methods
State-of-the-art methods
Databases and Information Systems
ZHANG, Huaidong
XU, Xuemiao
HAN, Guoqiang
HE, Shengfeng
Context-aware and scale-insensitive temporal repetition counting
description Temporal repetition counting aims to estimate the number of cycles of a given repetitive action. Existing deep learning methods assume repetitive actions are performed at a fixed time-scale, which is invalid for the complex repetitive actions in real life. In this paper, we tailor a context-aware and scale-insensitive framework to tackle the challenges in repetition counting caused by unknown and diverse cycle lengths. Our approach combines two key insights: (1) Cycle lengths of different actions are unpredictable and require large-scale searching, but once a coarse cycle length is determined, the variation between repetitions can be handled by regression. (2) Determining the cycle length cannot rely solely on a short fragment of video; it requires contextual understanding. The first insight is implemented by a coarse-to-fine cycle refinement method. It avoids the heavy computation of exhaustively searching all cycle lengths in the video; instead, it propagates the coarse prediction for further refinement in a hierarchical manner. Second, we propose a bidirectional cycle length estimation method for context-aware prediction. It is a regression network that takes two consecutive coarse cycles as input and predicts the locations of the previous and next repetitive cycles. To support training and evaluation in temporal repetition counting, we construct a new benchmark, the largest to date, containing 526 videos with diverse repetitive actions. Extensive experiments show that the proposed network, trained on a single dataset, outperforms state-of-the-art methods on several benchmarks, indicating that the proposed framework is general enough to capture repetition patterns across domains.
format text
author ZHANG, Huaidong
XU, Xuemiao
HAN, Guoqiang
HE, Shengfeng
author_facet ZHANG, Huaidong
XU, Xuemiao
HAN, Guoqiang
HE, Shengfeng
author_sort ZHANG, Huaidong
title Context-aware and scale-insensitive temporal repetition counting
title_short Context-aware and scale-insensitive temporal repetition counting
title_full Context-aware and scale-insensitive temporal repetition counting
title_fullStr Context-aware and scale-insensitive temporal repetition counting
title_full_unstemmed Context-aware and scale-insensitive temporal repetition counting
title_sort context-aware and scale-insensitive temporal repetition counting
publisher Institutional Knowledge at Singapore Management University
publishDate 2020
url https://ink.library.smu.edu.sg/sis_research/8523
https://ink.library.smu.edu.sg/context/sis_research/article/9526/viewcontent/Context_Aware_and_Scale_Insensitive_Temporal_Repetition_Counting.pdf
_version_ 1789483258536263680