Interaction techniques for 3D visual exploration on large displays
As 3D visual data such as 3D medical images and astrophysical simulations become increasingly detailed and complex, large interactive displays not only provide a good venue for visualizing them at higher resolution but also enable multiple users to explore them simultaneously and collaboratively...
Saved in:
Main Author: | Song, Peng |
---|---|
Other Authors: | Fu Chi-Wing |
Format: | Theses and Dissertations |
Language: | English |
Published: | 2013 |
Subjects: | DRNTU::Engineering::Computer science and engineering::Information systems::Information systems applications |
Online Access: | http://hdl.handle.net/10356/54549 |
Institution: | Nanyang Technological University |
id |
sg-ntu-dr.10356-54549 |
---|---|
record_format |
dspace |
spelling |
sg-ntu-dr.10356-54549 2023-03-04T00:33:03Z Interaction techniques for 3D visual exploration on large displays Song, Peng. Fu Chi-Wing School of Computer Engineering DRNTU::Engineering::Computer science and engineering::Information systems::Information systems applications Doctor of Philosophy (SCE) 2013-06-24T01:52:28Z 2013-06-24T01:52:28Z 2013 2013 Thesis http://hdl.handle.net/10356/54549 en 134 p. application/pdf |
institution |
Nanyang Technological University |
building |
NTU Library |
continent |
Asia |
country |
Singapore |
content_provider |
NTU Library |
collection |
DR-NTU |
language |
English |
topic |
DRNTU::Engineering::Computer science and engineering::Information systems::Information systems applications |
description |
As 3D visual data such as 3D medical images and astrophysical simulations become increasingly detailed and complex, large interactive displays not only provide a good venue for visualizing them at higher resolution but also enable multiple users to explore them simultaneously and collaboratively. However, most interactive visualization and manipulation techniques are designed specifically for desktop computers and cannot be readily ported to large-display-based interaction systems. In recent years, large multi-touch surfaces have developed rapidly, enabling general users to interact with graphical content directly with their hands. In addition, mobile devices with multi-touch screens and 3D-tilt sensing have become widely available and can serve as remote controllers for interacting with a co-located large display. More recently, a low-cost 3D scene acquisition sensor, the Kinect, has been gaining popularity quickly and has significantly reduced the cost barrier to implementing mid-air interaction systems.
This thesis investigates effective interaction techniques for manipulating and exploring 3D scientific data on large displays using these emerging interaction devices. Since it is impractical to design a large-display-based interaction system for every single piece of scientific data, this thesis selects three representative categories of scientific data to explore: 3D medical volume data, large-scale astrophysical simulations, and conventional 3D virtual environments, with the aim of obtaining design experience and knowledge that can be applied to general 3D scientific data.
A set of guiding principles is proposed to motivate large-display-based interaction system design. Based on these principles, five interaction systems were designed and implemented to allow users to visually explore the selected 3D scientific data effectively:
- A multi-touch tabletop interface that enables general users to interactively create volume exploded views using a family of novel and intuitive whole-hand multi-touch gestures.
- An affordable interaction system for volume data exploration and annotation that combines the strengths of a standard upright multi-touch display and a commonly available handheld device. A slicing plane can be directly and intuitively manipulated at any desired position within the volume data using the handheld device, so that various cross-sections of the volume data can be visualized interactively.
- A novel matching technique called tilt correlation for identifying smartphones that make concurrent two-point contacts on a multi-touch wall display, so that multiple users can perform exploration tasks simultaneously on the wall display using their phones.
- A phone-based interface that supports a nontrivial set of operations for exploring large-scale astrophysical simulations. The interface defines and organizes the necessary interactions into control modes and uses a novel double-tap mode-switching mechanism to switch seamlessly between modes.
- A novel handle bar metaphor for virtual object manipulation using mid-air freehand gestures tracked by the Kinect sensor. It mimics the familiar situation of handling objects skewered with a bimanual handle bar. Designing the mid-air interaction around the relative 3D motion of the two hands gives users precise control despite the Kinect sensor's low image resolution. |
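The slicing-plane idea in the second system above can be illustrated with a small sketch: the handheld device's tilt is mapped to a plane normal, and a slider-style depth value offsets the plane through the volume. The function name, the axis convention, and the unit-volume assumption are all illustrative choices, not the thesis's actual implementation.

```python
import math

def slicing_plane_from_pose(pitch, roll, depth, extent=1.0):
    """Map a handheld device's tilt to a slicing plane inside a unit volume.

    pitch, roll: device tilt angles in radians (e.g. from its motion sensors)
    depth: plane offset along its normal, in [0, 1] across the volume
    Returns (normal, point): a unit normal and a point on the plane.
    """
    # Start from a plane facing the viewer (+z) and tilt it by pitch (about x)
    # and roll (about y), mirroring how the user tilts the phone.
    nx = math.sin(roll) * math.cos(pitch)
    ny = -math.sin(pitch)
    nz = math.cos(roll) * math.cos(pitch)
    normal = (nx, ny, nz)

    # Slide the plane along its normal through the volume's center.
    center = (0.5, 0.5, 0.5)
    offset = (depth - 0.5) * extent
    point = tuple(c + offset * n for c, n in zip(center, normal))
    return normal, point
```

With the phone held flat (`pitch = roll = 0`) and the depth slider centered, this yields an axial plane through the middle of the volume; tilting the phone tilts the cross-section accordingly.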
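The tilt-correlation technique in the third system rests on a simple observation: a phone pressed against the wall with two contact points implies a line whose orientation should agree with the tilt the phone itself senses. A minimal sketch of that matching, assuming a greedy nearest-angle pairing (the function names and the greedy strategy are illustrative simplifications, not the thesis's algorithm):

```python
import math

def contact_angle(p1, p2):
    """Orientation (radians, mod pi) of the line through two contact points."""
    return math.atan2(p2[1] - p1[1], p2[0] - p1[0]) % math.pi

def match_phones_to_contacts(phone_tilts, contact_pairs):
    """Pair each phone's reported tilt with the closest contact-pair orientation.

    phone_tilts: {phone_id: tilt_angle_radians} reported by each device's sensors
    contact_pairs: {pair_id: ((x1, y1), (x2, y2))} concurrent two-point contacts
    Returns {phone_id: pair_id}.
    """
    def angle_diff(a, b):
        d = abs(a - b) % math.pi
        return min(d, math.pi - d)  # a line's orientation is ambiguous by 180 degrees

    matches = {}
    remaining = dict(contact_pairs)
    for phone_id, tilt in phone_tilts.items():
        if not remaining:
            break
        best = min(remaining,
                   key=lambda pid: angle_diff(tilt, contact_angle(*remaining[pid])))
        matches[phone_id] = best
        del remaining[best]
    return matches
```

For example, a phone reporting a horizontal tilt is matched to the horizontal contact pair, and one reporting a vertical tilt to the vertical pair, which lets several users touch the wall at once and still be told apart.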
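The handle bar metaphor in the last system derives the object transform from the relative 3D motion of the two tracked hands: the bar's midpoint drives translation, the change in the hand-to-hand vector drives rotation, and the change in hand separation drives scaling. A minimal sketch under those assumptions (limited here to yaw rotation for brevity; this is not the thesis's implementation):

```python
import math

def handle_bar_update(prev_l, prev_r, cur_l, cur_r):
    """Derive a translate/rotate/scale update from two tracked hand positions.

    Hands are (x, y, z) points, e.g. from a depth sensor's skeleton tracking.
    Returns (translation, yaw_delta, scale_factor):
      translation  - motion of the bar's midpoint
      yaw_delta    - rotation of the bar about the vertical axis (radians)
      scale_factor - ratio of current to previous hand separation
    """
    def mid(a, b):
        return tuple((ai + bi) / 2 for ai, bi in zip(a, b))

    def sub(a, b):
        return tuple(ai - bi for ai, bi in zip(a, b))

    def length(v):
        return math.sqrt(sum(c * c for c in v))

    prev_mid, cur_mid = mid(prev_l, prev_r), mid(cur_l, cur_r)
    translation = sub(cur_mid, prev_mid)

    prev_bar, cur_bar = sub(prev_r, prev_l), sub(cur_r, cur_l)
    # Yaw: rotation of the bar projected onto the horizontal (x, z) plane.
    yaw_delta = math.atan2(cur_bar[2], cur_bar[0]) - math.atan2(prev_bar[2], prev_bar[0])

    scale_factor = length(cur_bar) / length(prev_bar)
    return translation, yaw_delta, scale_factor
```

Because every update depends only on the hands relative to each other, small per-frame jitter in absolute hand positions largely cancels out, which is one way such a design can stay precise despite a low-resolution depth sensor.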
author2 |
Fu Chi-Wing |
format |
Theses and Dissertations |
author |
Song, Peng. |
title |
Interaction techniques for 3D visual exploration on large displays |
publishDate |
2013 |
url |
http://hdl.handle.net/10356/54549 |
_version_ |
1759857258744774656 |