Dynamic composition of tracking primitives for interactive vision-guided navigation

Darius Burschka, Gregory Hager

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We present a system architecture for robust target following with a mobile robot. The system is based on tracking multiple cues in binocular stereo images using the XVision toolkit. Fusing complementary information in the images, including texture, color, and depth, combined with fast, optimized processing, reduces the possibility of losing the tracked object in a dynamic scene with several moving targets on intersecting paths. The presented system is capable of detecting both objects obstructing its way and gaps. This supports operation in more cluttered terrain, where a wheeled mobile robot cannot take the same path as a walking person. We describe the basic principles of the fast feature extraction and tracking in the luminance, chrominance, and disparity domains. The optimized tracking algorithms compensate for illumination variations and perspective distortions, as presented in our previous publications on the XVision system.
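
The abstract describes fusing texture, color, and depth cues from a binocular stereo pair into a single tracking decision. The sketch below is illustrative only and does not use the XVision API; all function names, parameters (target_uv, target_disp, cue weights), and the simple Gaussian confidence models are assumptions chosen to make the multi-cue fusion idea concrete.

# Illustrative sketch (not the XVision API): fusing color, luminance-texture,
# and stereo-disparity cues into one confidence map for target tracking.
import numpy as np

def color_cue(image_yuv, target_uv, tol=12.0):
    """Per-pixel confidence that chrominance matches the tracked target."""
    uv = image_yuv[..., 1:3].astype(np.float64)
    d = np.linalg.norm(uv - np.asarray(target_uv, dtype=np.float64), axis=-1)
    return np.exp(-(d / tol) ** 2)

def texture_cue(image_yuv, window=5):
    """Local luminance variance as a simple texture measure."""
    y = image_yuv[..., 0].astype(np.float64)
    pad = window // 2
    yp = np.pad(y, pad, mode='edge')
    # Sliding-window variance via stacked shifts (fine for small windows).
    stack = np.stack([yp[i:i + y.shape[0], j:j + y.shape[1]]
                      for i in range(window) for j in range(window)])
    var = stack.var(axis=0)
    return var / (var.max() + 1e-9)

def depth_cue(disparity, target_disp, tol=3.0):
    """Confidence that a pixel lies at roughly the target's depth."""
    return np.exp(-((disparity - target_disp) / tol) ** 2)

def fuse_cues(cues, weights):
    """Weighted geometric fusion: a pixel must score well on every cue."""
    fused = np.ones_like(cues[0], dtype=np.float64)
    for c, w in zip(cues, weights):
        fused *= np.clip(c, 1e-6, 1.0) ** w
    return fused

# Usage (shapes only; real inputs would come from a calibrated stereo pair):
# img_yuv = ...  # H x W x 3 YUV image from the left camera
# disp    = ...  # H x W disparity map from stereo matching
# conf = fuse_cues([color_cue(img_yuv, target_uv=(110.0, 140.0)),
#                   texture_cue(img_yuv),
#                   depth_cue(disp, target_disp=22.0)],
#                  weights=[1.0, 0.5, 1.0])
# target_px = np.unravel_index(conf.argmax(), conf.shape)

The geometric (product) fusion is one plausible choice: a candidate pixel is kept only if every cue assigns it non-negligible confidence, which is one way to reduce the chance of locking onto a distractor that matches a single cue.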

Original language: English (US)
Title of host publication: Proceedings of SPIE - The International Society for Optical Engineering
Editors: D. W. Gage, H. M. Choset
Pages: 114-125
Number of pages: 12
Volume: 4573
DOI: 10.1117/12.457436
State: Published - 2001
Event: Mobile Robots XVI - Newton, MA, United States
Duration: Oct 29, 2001 – Oct 30, 2001



Keywords

  • 3D tracking
  • Color tracking
  • Vision-based navigation

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Condensed Matter Physics

Cite this

Burschka, D., & Hager, G. (2001). Dynamic composition of tracking primitives for interactive vision-guided navigation. In D. W. Gage & H. M. Choset (Eds.), Proceedings of SPIE - The International Society for Optical Engineering (Vol. 4573, pp. 114-125). https://doi.org/10.1117/12.457436
