Scale-invariant registration of monocular stereo images to 3D surface models

Darius Burschka, Ming Li, Russell Taylor, Gregory D. Hager

Research output: Chapter in Book/Report/Conference proceeding (Conference contribution)

Abstract

We present an approach for scale recovery from monocular stereo images of an endoscopic camera, with simultaneous registration to dense 3D surface models. We assume the camera motion to be unknown or at least uncertain. An example application is the registration of endoscope images to pre-operative CT scans, which allows instrument navigation during surgical procedures. The approach is not restricted to the medical field: it can be extended to the registration of monocular video images to laser-based surface reconstructions, e.g., in mobile navigation, or to autonomous aircraft navigation from topological surveys. A novel method for depth estimation from arbitrary camera motion is presented. In this paper, we focus on the robust initialization of the system and on scale recovery for the reconstructed 3D point clouds, with accurate registration to the candidate surfaces extracted from the CT data. We provide experimental validation of the algorithm with data obtained from our experiments with a phantom skull.
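The scale-recovery-with-registration step the abstract describes can be illustrated with a short sketch: an ICP-style loop that alternates between finding closest-point correspondences on the reference surface (e.g., points sampled from the CT surface model) and re-estimating a full similarity transform (scale s, rotation R, translation t) in closed form via Umeyama's method. This is a minimal sketch under assumptions of our own: the function names (similarity_umeyama, scaled_icp), the k-d tree correspondence search, and the NumPy/SciPy tooling are illustrative, not the authors' implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def similarity_umeyama(src, dst):
    """Closed-form (s, R, t) minimizing sum ||s*R*src_i + t - dst_i||^2 (Umeyama, 1991)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)            # 3x3 cross-covariance of centered sets
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1                            # guard against a reflection solution
    R = U @ S @ Vt
    var_src = (src_c ** 2).sum() / len(src)     # variance of the source points
    s = np.trace(np.diag(D) @ S) / var_src      # optimal isotropic scale
    t = mu_d - s * R @ mu_s
    return s, R, t

def scaled_icp(points, surface, iters=50):
    """Align an up-to-scale reconstruction 'points' (N,3) to 'surface' samples (M,3)."""
    tree = cKDTree(surface)
    s, R, t = 1.0, np.eye(3), np.zeros(3)
    for _ in range(iters):
        moved = (s * (R @ points.T)).T + t      # apply current similarity estimate
        _, idx = tree.query(moved)              # closest surface point for each point
        s, R, t = similarity_umeyama(points, surface[idx])
    return s, R, t
```

Note that such a loop only converges from a reasonable starting pose and scale, since closest-point correspondences are purely local; this is consistent with the paper's emphasis on robust initialization of the system.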

Original language: English (US)
Title of host publication: 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Pages: 2581-2586
Number of pages: 6
State: Published - Dec 1, 2004
Event: 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) - Sendai, Japan
Duration: Sep 28, 2004 – Oct 2, 2004

Publication series

Name: 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Volume: 3

Other

Other: 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Country: Japan
City: Sendai
Period: 9/28/04 – 10/2/04

ASJC Scopus subject areas

  • Engineering (all)
