AUTHOR=Mishra Abhishek, Ghosh Rohan, Principe Jose C., Thakor Nitish V., Kukreja Sunil L. TITLE=A Saccade Based Framework for Real-Time Motion Segmentation Using Event Based Vision Sensors JOURNAL=Frontiers in Neuroscience VOLUME=11 YEAR=2017 URL=https://www.frontiersin.org/journals/neuroscience/articles/10.3389/fnins.2017.00083 DOI=10.3389/fnins.2017.00083 ISSN=1662-453X ABSTRACT=

Motion segmentation is a critical pre-processing step for autonomous robotic systems to facilitate tracking of moving objects in cluttered environments. Event-based sensors are low-power analog devices that represent a scene through asynchronous updates of only its dynamic details at high temporal resolution and, hence, require significantly fewer computations. However, motion segmentation using such spatio-temporal data is a challenging task because of its asynchrony. Prior approaches to object tracking with neuromorphic sensors perform well only when the sensor is static or a known model of the object to be followed is available. To address these limitations, in this paper we develop a technique for generalized motion segmentation based on spatial statistics across time frames. First, inspired by human saccadic eye movements, we induce micromotion on the platform to facilitate the separation of the static and dynamic elements of a scene. Second, we introduce the concept of spike-groups as a methodology for partitioning spatio-temporal event data, which facilitates the computation of scene statistics and the characterization of objects within the scene. Experimental results show that our algorithm classifies dynamic objects from a moving camera with a maximum accuracy of 92%.
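
To make the spike-group idea concrete, the following is a minimal illustrative sketch (not the authors' implementation, whose details are not given in the abstract) of partitioning an asynchronous event stream into groups by spatio-temporal proximity and computing simple per-group spatial statistics. The event representation (x, y, t) and the radius and time-window parameters are assumptions introduced here for illustration only.

from collections import namedtuple

Event = namedtuple("Event", ["x", "y", "t"])  # pixel coordinates and timestamp (microseconds)

def partition_spike_groups(events, spatial_radius=3, time_window=5000):
    """Greedily assign each event to an existing group whose most recent event
    lies within `spatial_radius` pixels and `time_window` microseconds;
    otherwise start a new group. Assumes `events` is sorted by timestamp."""
    groups = []  # each group is a list of Events
    for ev in events:
        placed = False
        for group in groups:
            last = group[-1]
            if (abs(ev.x - last.x) <= spatial_radius
                    and abs(ev.y - last.y) <= spatial_radius
                    and ev.t - last.t <= time_window):
                group.append(ev)
                placed = True
                break
        if not placed:
            groups.append([ev])
    return groups

def group_statistics(group):
    """Simple per-group spatial statistics (size, centroid, spread) of the kind
    that could feed a subsequent static-vs-dynamic classification step."""
    n = len(group)
    cx = sum(e.x for e in group) / n
    cy = sum(e.y for e in group) / n
    spread = sum((e.x - cx) ** 2 + (e.y - cy) ** 2 for e in group) / n
    return {"size": n, "centroid": (cx, cy), "spread": spread}

This greedy nearest-group assignment is only one plausible way to form spatio-temporal partitions; the paper itself should be consulted for the actual spike-group construction and the statistics used for classification.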