Title: Interaction techniques for 3D visual exploration on large displays
Authors: Song, Peng.
Keywords: DRNTU::Engineering::Computer science and engineering::Information systems::Information systems applications
Issue Date: 2013
Abstract: As 3D visual data such as 3D medical images and astrophysical simulations become increasingly detailed and complex, large interactive displays not only provide an effective means of visualizing such data at higher resolution but also enable multiple users to explore it simultaneously and collaboratively. However, most interactive visualization and manipulation techniques are designed specifically for desktop computers and cannot be readily ported to large-display-based interaction systems. In recent years, large multi-touch surfaces have developed rapidly, enabling general users to interact with graphical content directly with their hands. In addition, mobile devices with multi-touch screens and 3D tilt-sensing capabilities have become widely available and can serve as remote controllers for supporting user interaction with a co-located large display. More recently, a low-cost 3D scene acquisition sensor, the Kinect, has been rapidly gaining popularity and has significantly reduced the cost barrier to implementing mid-air interaction systems. This thesis investigates effective interaction techniques for manipulating and exploring 3D scientific data on large displays by employing these emerging interaction devices. Since it is impractical to design large-display-based interaction systems for every single kind of scientific data, this thesis selects three typical categories of scientific data to explore: 3D medical volume data, large-scale astrophysical simulations, and conventional 3D virtual environments, with the aim of obtaining design experience and knowledge applicable to general 3D scientific data. A set of guiding principles is proposed to motivate large-display-based interaction system design.
Based on these principles, five interaction systems were designed and implemented to allow users to visually explore the selected 3D scientific data effectively:
- A multi-touch tabletop interface that enables general users to interactively create volume exploded views using a family of novel and intuitive whole-hand multi-touch gestures.
- An affordable interaction system for volume data exploration and annotation that combines the strengths of a standard upright multi-touch display and a commonly available handheld device. A slicing plane can be directly and intuitively positioned anywhere within the volume data using the handheld device, so that various cross-sections of the volume data can be visualized interactively.
- A novel matching technique called tilt correlation for identifying smartphones that make concurrent two-point contacts on a multi-touch wall display, so that multiple users can perform exploration tasks simultaneously on the wall display using their phones.
- A phone-based interface that supports a nontrivial set of operations for exploring large-scale astrophysical simulations. The interface defines and organizes the necessary interactions into control modes, and uses a novel double-tap mode-switching mechanism to switch seamlessly between modes.
- A novel handle-bar metaphor for virtual object manipulation using mid-air freehand gestures tracked by the Kinect sensor. It mimics the familiar situation of handling objects skewered on a bimanual handle bar. Designing the mid-air interaction around the relative 3D motion of the two hands gives users precise controllability despite the Kinect sensor's low image resolution.
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Theses

Files in This Item:
File: PhD Thesis (Restricted Access)
Size: 18.74 MB
Format: Adobe PDF

Page view(s): 50 (updated on Nov 23, 2020)
Download(s): 50 (updated on Nov 23, 2020)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.