A saccade based framework for real-time motion segmentation using event based vision sensors

Abhishek Mishra, Rohan Ghosh, Jose C. Principe, Nitish V. Thakor, Sunil L. Kukreja

Research output: Contribution to journal › Article

Abstract

Motion segmentation is a critical pre-processing step for autonomous robotic systems, facilitating the tracking of moving objects in cluttered environments. Event-based sensors are low-power analog devices that represent a scene through asynchronous updates of only its dynamic details at high temporal resolution and hence require significantly less computation. However, motion segmentation using such spatio-temporal data is challenging because of its asynchrony. Prior approaches to object tracking with neuromorphic sensors perform well only while the sensor is static or when a known model of the object to be followed is available. To address these limitations, in this paper we develop a technique for generalized motion segmentation based on spatial statistics across time frames. First, inspired by human saccadic eye movements, we induce micromotion on the platform to facilitate the separation of the static and dynamic elements of a scene. Second, we introduce spike-groups, a methodology for partitioning spatio-temporal events, which facilitates the computation of scene statistics and the characterization of the objects in it. Experimental results show that our algorithm classifies dynamic objects from a moving camera with a maximum accuracy of 92%.
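The abstract describes the pipeline only at a high level. The following is a minimal, illustrative Python sketch of the core idea, not the authors' implementation: under saccade-like micromotion, coarse spatial bins of events (standing in for the paper's spike-groups) are compared across successive time frames, and bins whose event statistics vary strongly are labeled as dynamic. The event format, grid size, frame count, and threshold below are all assumptions.

import numpy as np

def bin_events(events, t0, dt, grid=(16, 16), sensor=(128, 128)):
    # Accumulate events with timestamps in [t0, t0 + dt) into a coarse
    # spatial histogram. `events` is an (N, 3) array of (x, y, t) rows,
    # as an event-camera driver might deliver them. Each grid cell acts
    # as one spike-group: a spatio-temporal bundle of events over which
    # statistics are computed.
    mask = (events[:, 2] >= t0) & (events[:, 2] < t0 + dt)
    xs, ys = events[mask, 0], events[mask, 1]
    gx = np.clip((xs * grid[0] // sensor[0]).astype(int), 0, grid[0] - 1)
    gy = np.clip((ys * grid[1] // sensor[1]).astype(int), 0, grid[1] - 1)
    hist = np.zeros(grid)
    np.add.at(hist, (gx, gy), 1.0)
    return hist

def dynamic_mask(events, t0, dt, n_frames=8, z_thresh=2.0):
    # Static background excited only by camera micromotion should produce
    # near-constant event counts from frame to frame; cells covering an
    # independently moving object show high temporal variability.
    frames = np.stack([bin_events(events, t0 + k * dt, dt)
                       for k in range(n_frames)])
    mean, std = frames.mean(axis=0), frames.std(axis=0)
    variability = std / (mean + 1e-9)   # coefficient of variation per cell
    score = (variability - variability.mean()) / (variability.std() + 1e-9)
    return score > z_thresh             # True where motion is inferred

A real system would operate on the raw asynchronous event stream rather than fixed frames, but the frame-based variant keeps the underlying statistics easy to see.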

Original language: English (US)
Article number: 83
Journal: Frontiers in Neuroscience
Volume: 11
Issue number: MAR
ISSN: 1662-4548
Publisher: Frontiers Research Foundation
DOI: 10.3389/fnins.2017.00083
PubMed ID: 28316563
State: Published - Mar 3, 2017

Keywords

  • Asynchronous signal processing
  • Dynamic vision sensors
  • Motion segmentation
  • Robotics
  • Temporal information
  • Tracking and following

ASJC Scopus subject areas

  • Neuroscience (all)

Cite this

Mishra, A., Ghosh, R., Principe, J. C., Thakor, N. V., & Kukreja, S. L. (2017). A saccade based framework for real-time motion segmentation using event based vision sensors. Frontiers in Neuroscience, 11(MAR), 83. https://doi.org/10.3389/fnins.2017.00083
