Time-to-x: analysis of motion through temporal parameters

Philippe Burlina, Rama Chellappa

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Situations involving navigation among maneuvering agents are critical for the study of visual guidance of autonomous vehicles. This paper addresses the case of translational motion with polynomial regimes and defines a general class of temporal parameters (TP) relevant for navigation that enable a qualitative description of the observed agents' depth trajectories. These parameters are shown to be visually recoverable. Instances of such temporal parameters include Time-to-Collision (TTC) and Time-to-Synchronization (TTS), useful for docking or platooning maneuvers. The results are specialized to lower-order motions, and the recovery of TTC and TTS for arbitrary regimes follows as a corollary of the analysis. Computations from direct and feature-based methods are described. A scheme for addressing model-order determination, collision detection, and temporal-parameter estimation is proposed and tested. Experimental results on synthetic and real images are given.
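To illustrate the lowest-order case the abstract mentions (constant closing velocity), the classical result is that TTC is visually recoverable without knowing depth or speed: for an approaching object, tau = Z / |dZ/dt| equals the object's image size divided by its rate of expansion, tau ≈ s / (ds/dt). The sketch below is a minimal, hypothetical illustration of that first-order estimate, not the paper's direct or feature-based methods; the function name and the two-frame finite-difference scheme are assumptions for the example.

```python
def time_to_collision(s_prev: float, s_curr: float, dt: float) -> float:
    """Estimate Time-to-Collision from image-plane scale in two frames.

    Uses the first-order (constant-velocity) relation tau = s / (ds/dt),
    where s is the apparent size of the object in the image. Depth and
    velocity never need to be known, only the rate of image expansion.
    """
    ds_dt = (s_curr - s_prev) / dt   # backward-difference expansion rate
    if ds_dt <= 0.0:
        return float("inf")          # object is not approaching
    return s_curr / ds_dt
```

For example, an object of real size S at depth Z projects to image size s = f*S/Z under a pinhole model, so as Z shrinks linearly the image expands and s/(ds/dt) recovers the remaining time to contact; higher-order (polynomial) depth regimes, as treated in the paper, require estimating more than one temporal parameter.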

Original language: English (US)
Title of host publication: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
Publisher: Publ by IEEE
Pages: 461-468
Number of pages: 8
ISBN (Print): 0818658274, 9780818658273
DOIs
State: Published - Jan 1 1994
Event: Proceedings of the 1994 IEEE Computer Society Conference on Computer Vision and Pattern Recognition - Seattle, WA, USA
Duration: Jun 21 1994 - Jun 23 1994

Publication series

Name: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
ISSN (Print): 1063-6919


ASJC Scopus subject areas

  • Software
  • Computer Vision and Pattern Recognition

