Dynamic composition of tracking primitives for interactive vision-guided navigation

Darius Burschka, Gregory Hager

Research output: Contribution to journal › Conference article › peer-review


We present a system architecture for robust target following with a mobile robot. The system is based on tracking multiple cues in binocular stereo images using the XVision toolkit. Fusing complementary information in the images, including texture, color, and depth, combined with fast, optimized processing, reduces the possibility of losing the tracked object in a dynamic scene with several moving targets on intersecting paths. The presented system is capable of detecting both objects obstructing its way and gaps. This supports operation in more cluttered terrain, where a wheeled mobile robot cannot take the same path as a walking person. We describe the basic principles of fast feature extraction and tracking in the luminance, chrominance, and disparity domains. The optimized tracking algorithms compensate for illumination variations and perspective distortions, as presented in our previous publications on the XVision system.
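The abstract's central idea, fusing complementary cues so that no single failing cue (e.g., color under changed illumination) loses the target, can be sketched as a confidence-weighted combination of per-cue position estimates. The interfaces, cue names, and weights below are illustrative assumptions for exposition; they are not taken from the XVision toolkit.

```python
# Sketch of confidence-weighted fusion of tracking cues (texture, color,
# disparity). All names and thresholds are hypothetical, not XVision API.

def fuse_cues(estimates, min_confidence=0.2):
    """Fuse per-cue (x, y) target estimates by confidence weighting.

    estimates: dict mapping cue name -> ((x, y), confidence in [0, 1]).
    Cues below min_confidence are discarded, so one unreliable cue
    cannot drag the fused estimate onto a crossing target.
    Returns the fused (x, y) position, or None if every cue failed.
    """
    valid = [(pos, c) for pos, c in estimates.values() if c >= min_confidence]
    if not valid:
        return None  # target lost in all cues
    total = sum(c for _, c in valid)
    x = sum(p[0] * c for p, c in valid) / total
    y = sum(p[1] * c for p, c in valid) / total
    return (x, y)

# Example frame: texture and disparity agree; color is unreliable.
cues = {
    "texture":   ((100.0, 50.0), 0.9),
    "color":     ((300.0, 80.0), 0.1),   # below threshold, ignored
    "disparity": ((102.0, 52.0), 0.6),
}
fused = fuse_cues(cues)  # dominated by the two agreeing cues
```

The key design point is that fusion happens on confidences rather than raw pixels, so a cue can drop out and rejoin without the tracker resetting.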

Original language: English (US)
Pages (from-to): 114-125
Number of pages: 12
Journal: Proceedings of SPIE - The International Society for Optical Engineering
State: Published - 2001
Event: Mobile Robots XVI - Newton, MA, United States
Duration: Oct 29 2001 - Oct 30 2001


Keywords

  • 3D tracking
  • Color tracking
  • Vision-based navigation

ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Condensed Matter Physics
  • Computer Science Applications
  • Applied Mathematics
  • Electrical and Electronic Engineering
