Eliciting usable gestures for multi-display environments

Multi-display environments (MDEs) have advanced rapidly in recent years, incorporating multi-touch tabletops, tablets, wall displays and even position tracking systems. Designers have proposed a variety of interesting gestures for use in an MDE, some of which involve a user moving their hands, arms, body or even a device itself. These gestures are often used as part of interactions to move data between the various components of an MDE, which is a longstanding research problem. But designers, not users, have created most of these gestures, and concerns over implementation issues such as recognition may have influenced their design. We performed a user study to elicit these gestures directly from users, but found a low level of convergence among the gestures produced. This lack of agreement is important, and we discuss its possible causes and the implications it has for designers. To assist designers, we present the most prevalent gestures and some of the underlying conceptual themes behind them. We also provide analysis of how certain factors such as distance and device type impact the choice of gestures and discuss how to apply them to real-world systems.


Bibliographic Details
Main Authors: SEYED, Teddy, BURNS, Chris, COSTA SOUSA, Mario, MAURER, Frank, TANG, Anthony
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2012
Subjects: cross-device interaction; gestures; mobile devices; multi-display environments; multi-display interaction; multi-surface environments; tabletop; touch; Graphics and Human Computer Interfaces
DOI: 10.1145/2396636.2396643
Online Access: https://ink.library.smu.edu.sg/sis_research/7988
https://ink.library.smu.edu.sg/context/sis_research/article/8991/viewcontent/2396636.2396643.pdf
Institution: Singapore Management University (SMU Libraries, InK@SMU)
Collection: Research Collection School Of Computing and Information Systems
License: http://creativecommons.org/licenses/by-nc-nd/4.0/
Published: 2012-12-01