Virtual wayfinding using simulated prosthetic vision in gaze-locked viewing

Lin Wang, Liancheng Yang, Gislin Dagnelie

Research output: Contribution to journal › Article › peer-review

10 Scopus citations

Abstract

Purpose: To assess virtual maze navigation performance with simulated prosthetic vision in gaze-locked viewing, under conditions of varying luminance contrast, background noise, and phosphene dropout.

Methods: Four normally sighted subjects performed virtual maze navigation using simulated prosthetic vision in gaze-locked viewing, under five conditions of luminance contrast, background noise, and phosphene dropout. Navigation performance was measured as the time required to traverse a 10-room maze using a game controller, and as the number of errors made during the trip.

Results: Navigation time (1) became stable after 6 to 10 trials, (2) was similar on average at luminance contrasts of 68% and 16% but varied more at 16%, (3) was not significantly affected by background noise, and (4) increased by 40% when 30% of the phosphenes were removed. Navigation time and number of errors were significantly and positively correlated.

Conclusions: If the simulated gaze-locked viewing conditions extend to implant wearers, such prosthetic vision can support wayfinding in simple mobility tasks, though phosphene dropout may interfere with performance.
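
The phosphene-dropout manipulation described above is straightforward to illustrate in code. The following is a minimal Python sketch, not the authors' simulator: the 10 × 10 grid, the block-averaging sampling, the Gaussian noise model, and the function name simulate_phosphenes are all illustrative assumptions.

```python
# Minimal sketch of a phosphene rendering with dropout; grid size,
# sampling scheme, and noise model are illustrative assumptions.
import numpy as np

def simulate_phosphenes(image, grid=(10, 10), dropout=0.0, noise_sd=0.0,
                        rng=None):
    """Render a grayscale frame (2-D float array in [0, 1]) as a coarse
    phosphene grid, optionally adding background noise and removing a
    random fraction of phosphenes to mimic electrode dropout."""
    rng = np.random.default_rng() if rng is None else rng
    h, w = image.shape
    gh, gw = grid
    # Each phosphene takes the mean luminance of its grid cell.
    trimmed = image[:h - h % gh, :w - w % gw]
    cells = trimmed.reshape(gh, h // gh, gw, w // gw).mean(axis=(1, 3))
    # Additive background noise, clipped back to the displayable range.
    if noise_sd > 0:
        cells = np.clip(cells + rng.normal(0.0, noise_sd, cells.shape),
                        0.0, 1.0)
    # Randomly darken a fraction of phosphenes (30% in the paper's hardest
    # condition); a persistent dropout pattern would reuse one mask across
    # frames rather than redrawing it per call.
    keep = rng.random(cells.shape) >= dropout
    return cells * keep

# Example: one random 100x100 frame rendered with 30% dropout.
frame = np.random.default_rng(0).random((100, 100))
phosphenes = simulate_phosphenes(frame, dropout=0.3,
                                 rng=np.random.default_rng(1))
print(phosphenes.shape)  # (10, 10)
```

Scaling the luminance of the input frame and varying noise_sd would stand in for the paper's contrast and background-noise conditions, respectively.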

Original language: English (US)
Pages (from-to): E1057-E1063
Journal: Optometry and Vision Science
Volume: 85
Issue number: 11
DOIs
State: Published - Nov 2008

Keywords

  • Low vision
  • Low vision rehabilitation
  • Mobility
  • Orientation
  • Prosthetic vision
  • Retinal prosthesis
  • Simulation
  • Wayfinding

ASJC Scopus subject areas

  • Ophthalmology
  • Optometry
