Microsaccades for asynchronous feature extraction with spiking networks

Publication type
Conference
Author(s)
Kaiser, Jacques and Lindner, Gerd and Tieck, J Camilo Vasquez and Schulze, Martin and Hoff, Michael and Roennau, Arne and Dillmann, Rüdiger
Year
2018
Book title
International Conference on Development and Learning and Epigenetic Robotics (ICDL-EPIROB)
Editor
IEEE
Abstract
While extracting spatial features from images has been studied for decades, extracting spatio-temporal features from event streams is still a young field of research. A particularity of event streams is that the same network architecture can be used to recognize both static objects and motions. However, it is not clear which features provide a good abstraction, and in what scenario. In this paper, we evaluate the quality of the features of a spiking HMAX architecture by computing classification performance before and after each layer. Three different classifiers are used: a linear Support Vector Machine (SVM), a histogram classifier and a Liquid State Machine (LSM). We demonstrate the abstraction capability of classical edge features, as found in the V1 area of the visual cortex, combined with fixational eye movements. Specifically, our performance on the N-Caltech101 dataset outperforms previously reported $F_1$ scores on Caltech101, with a similar architecture but without an STDP learning layer. However, we show that the same edge features fail to abstract motions observed with a static DVS on the DvsGesture dataset. In our experiments, purely unsupervised STDP learning in the S2 layer did not lead to stable and discriminative patterns. Additionally, we show that liquid state machines are a promising computational model for the classification of DVS data with temporal dynamics. This paper is a step towards understanding and reproducing biological vision.
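The abstract describes benchmarking feature quality by training a classifier on the representation obtained before and after each layer of the spiking HMAX hierarchy. The following is a minimal illustrative sketch of such a layer-wise evaluation with a linear SVM and macro $F_1$ scoring; it is not the authors' code, and the feature-extraction placeholder, layer names and random stand-in data are assumptions used purely for illustration.

    # Hypothetical sketch: layer-wise feature evaluation with a linear SVM.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import f1_score
    from sklearn.svm import LinearSVC

    def features_after_layer(n_samples, layer_idx, dim=128):
        # Placeholder for real feature extraction: in practice this would return,
        # e.g., accumulated spike counts of the given layer for each recording.
        rng = np.random.default_rng(layer_idx)
        return rng.random((n_samples, dim))

    n_samples = 200
    labels = np.arange(n_samples) % 10  # stand-in class labels

    for layer_idx, layer in enumerate(["input", "S1", "C1", "S2", "C2"]):
        X = features_after_layer(n_samples, layer_idx)
        X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)
        clf = LinearSVC().fit(X_tr, y_tr)
        print(layer, f1_score(y_te, clf.predict(X_te), average="macro"))

In this setup, comparing the scores across layers indicates how much each stage of the hierarchy improves (or degrades) the separability of the classes.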
Research fields
Service Robotics and Mobile Manipulation
Project
Human Brain Project
Entered by
Jacques Kaiser