A saccade based framework for real-time motion segmentation using event based vision sensors

Abhishek Mishra, Rohan Ghosh, Jose C. Principe, Nitish V. Thakor, Sunil L. Kukreja

Research output: Contribution to journal › Article

Abstract

Motion segmentation is a critical pre-processing step for autonomous robotic systems, facilitating the tracking of moving objects in cluttered environments. Event-based sensors are low-power analog devices that represent a scene through asynchronous updates of only its dynamic details at high temporal resolution and hence require significantly less computation. However, motion segmentation on such spatiotemporal data is challenging because the data arrive asynchronously. Prior approaches to object tracking with neuromorphic sensors perform well only when the sensor is static or a known model of the object to be followed is available. To address these limitations, in this paper we develop a technique for generalized motion segmentation based on spatial statistics across time frames. First, inspired by human saccadic eye movements, we induce micromotion on the sensor platform to facilitate the separation of static and dynamic elements of a scene. Second, we introduce the concept of spike-groups, a methodology for partitioning spatiotemporal events, which facilitates the computation of scene statistics and the characterization of the objects in it. Experimental results show that our algorithm classifies dynamic objects from a moving camera with a maximum accuracy of 92%.
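The record itself contains no implementation details. As a purely illustrative aid, the following Python sketch shows one plausible reading of the spike-group idea: asynchronous events in the standard event-camera format (x, y, timestamp, polarity) are partitioned into spatio-temporal bins, and simple per-group statistics are computed. The grouping rule (a fixed time window plus coarse spatial cells) and the names group_events, group_stats, window_s, and cell_px are assumptions made for illustration, not the authors' method.

# Illustrative sketch only (not the paper's implementation): partition
# asynchronous event-camera output into spatio-temporal "spike-groups".
# The (x, y, t, polarity) tuple is the standard DVS event format; the
# grouping rule below is a hypothetical stand-in for the paper's definition.
from collections import defaultdict
from typing import Dict, List, Tuple

Event = Tuple[int, int, float, int]  # (x, y, timestamp in seconds, polarity)
GroupKey = Tuple[int, int, int]      # (time bin, spatial cell x, spatial cell y)

def group_events(events: List[Event],
                 window_s: float = 0.005,
                 cell_px: int = 8) -> Dict[GroupKey, List[Event]]:
    """Bin events by a fixed time window and coarse spatial cells."""
    groups: Dict[GroupKey, List[Event]] = defaultdict(list)
    for x, y, t, p in events:
        key = (int(t // window_s), x // cell_px, y // cell_px)
        groups[key].append((x, y, t, p))
    return groups

def group_stats(group: List[Event]) -> Tuple[float, float, int]:
    """Per-group spatial centroid and event count, usable as scene statistics."""
    n = len(group)
    cx = sum(e[0] for e in group) / n
    cy = sum(e[1] for e in group) / n
    return cx, cy, n

Under this reading, groups that keep accumulating events while the platform performs a saccade-like micromotion would be candidates for dynamic objects, which in the sketch reduces to thresholding the count returned by group_stats.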

Original language: English (US)
Article number: 83
Journal: Frontiers in Neuroscience
Volume: 11
Issue number: MAR
State: Published - Mar 3, 2017

Keywords

  • Asynchronous signal processing
  • Dynamic vision sensors
  • Motion segmentation
  • Robotics
  • Temporal information
  • Tracking and following

ASJC Scopus subject areas

  • Neuroscience (all)
