VisHap: Augmented reality combining haptics and vision

Guangqi Ye, Jason J. Corso, Gregory D. Hager, Allison M. Okamura

Research output: Contribution to journal › Conference article › peer-review

Abstract

Recently, haptic devices have been successfully incorporated into the human-computer interaction model. However, a drawback common to almost all haptic systems is that the user must be attached to the haptic device at all times, even though force feedback is not always being rendered. This constant contact hinders perception of the virtual environment, primarily because it prevents the user from feeling new tactile sensations upon contact with virtual objects. We present the design and implementation of an augmented reality system called VisHap that uses visual tracking to seamlessly integrate force feedback with tactile feedback to generate a "complete" haptic experience. The VisHap framework allows the user to interact with combinations of virtual and real objects naturally, thereby combining active and passive haptics. An example application of this framework is also presented. The flexibility and extensibility of our framework are promising in that it supports many interaction modes and allows further integration with other augmented reality systems.
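The core idea in the abstract, that the user remains free of the device until visual tracking predicts contact with a virtual object, can be sketched as a simple control loop. The sketch below is purely illustrative: the class, function names, and threshold are assumptions for exposition, not the authors' actual implementation.

```python
# Hedged sketch of the interaction loop suggested by the abstract: the
# fingertip is tracked visually, and the haptic device engages only when
# contact with a virtual object is imminent, so the user feels tactile
# sensation exactly at the moment of "touch". All names are hypothetical.

from dataclasses import dataclass

@dataclass
class VirtualObject:
    surface_z: float  # height of the virtual surface (meters)

def control_step(fingertip_z: float, obj: VirtualObject,
                 engage_threshold: float = 0.02) -> str:
    """Decide the haptic device's action from the tracked fingertip height.

    Far from the surface, the device stays parked (free-hand motion, no
    contact). Within `engage_threshold`, the device moves its end-effector
    to the predicted contact point; at or below the surface, it renders
    force feedback.
    """
    distance = fingertip_z - obj.surface_z
    if distance > engage_threshold:
        return "park"             # user moves freely, feels nothing
    elif distance > 0.0:
        return "move_to_contact"  # position end-effector at the surface
    else:
        return "render_force"     # contact: render force/tactile feedback

obj = VirtualObject(surface_z=0.10)
print(control_step(0.50, obj))  # -> "park"
print(control_step(0.11, obj))  # -> "move_to_contact"
print(control_step(0.09, obj))  # -> "render_force"
```

This staging is one plausible reading of how "combining active and passive haptics" could work; the paper itself should be consulted for the actual tracking and device-control details.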

Original language: English (US)
Pages (from-to): 3425-3431
Number of pages: 7
Journal: Proceedings of the IEEE International Conference on Systems, Man and Cybernetics
Volume: 4
State: Published - Nov 24 2003
Event: System Security and Assurance - Washington, DC, United States
Duration: Oct 5 2003 - Oct 8 2003

Keywords

  • Computer vision
  • Haptics
  • Human-computer interaction
  • Visual tracking

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Hardware and Architecture

