VisHap: Augmented reality combining haptics and vision

Guangqi Ye, Jason J. Corso, Gregory Hager, Allison M. Okamura

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Recently, haptic devices have been successfully incorporated into the human-computer interaction model. However, a drawback common to almost all haptic systems is that the user must be attached to the haptic device at all times, even though force feedback is not always being rendered. This constant contact hinders perception of the virtual environment, primarily because it prevents the user from feeling new tactile sensations upon contact with virtual objects. We present the design and implementation of an augmented reality system called VisHap that uses visual tracking to seamlessly integrate force feedback with tactile feedback to generate a "complete" haptic experience. The VisHap framework allows the user to interact naturally with combinations of virtual and real objects, thereby combining active and passive haptics. An example application of this framework is also presented. The flexibility and extensibility of our framework are promising in that it supports many interaction modes and allows further integration with other augmented reality systems.

Original language: English (US)
Title of host publication: Proceedings of the IEEE International Conference on Systems, Man and Cybernetics
Number of pages: 7
State: Published - 2003
Event: System Security and Assurance - Washington, DC, United States
Duration: Oct 5 2003 - Oct 8 2003


Other: System Security and Assurance
Country: United States
City: Washington, DC



Keywords

  • Computer vision
  • Haptics
  • Human-computer interaction
  • Visual tracking

ASJC Scopus subject areas

  • Hardware and Architecture
  • Control and Systems Engineering

Cite this

Ye, G., Corso, J. J., Hager, G., & Okamura, A. M. (2003). VisHap: Augmented reality combining haptics and vision. In Proceedings of the IEEE International Conference on Systems, Man and Cybernetics (Vol. 4, pp. 3425-3431).