Dr.-Ing. Arne Rönnau
Head of Department
Career
From 2002 to 2008, Arne Rönnau studied Electrical Engineering and Information Technology at the Universität Karlsruhe (TH), today's Karlsruhe Institute of Technology (KIT). His studies focused on control engineering and robotics. His diploma thesis dealt with sensor-based 3D environment modelling and foothold planning for the six-legged walking robot LAURON.
From 2008 to 2011, Arne Rönnau worked as a research scientist in the FZI department Interactive Diagnosis and Service Systems (IDS). During this time he worked on and researched the localisation of mobile robot systems, real-time control of automated guided vehicles, and the optimisation of multi-legged walking robots.
In 2019, Arne Rönnau completed his doctorate at the Karlsruhe Institute of Technology (KIT) summa cum laude with the dissertation "Modellbasierter Entwurf und Optimierung mehrbeiniger Laufroboter" (model-based design and optimisation of multi-legged walking robots).
The focus of his current work lies in open-source software (OSS), in particular ROS (Robot Operating System), human-robot collaboration (HRC), the design of innovative service robotics applications such as the BratWurst Bot, and deep neural networks / artificial intelligence. His most important concern is the evaluation and transfer of these technologies into practical industrial applications. Arne Rönnau is project manager in numerous public and industrial projects and coordinator of several nationally and state-funded collaborative projects such as AuRoA, VeriKI and CyberProtect.
Since 2011, Arne Rönnau has headed the department Interactive Diagnosis and Service Systems (IDS) within the research division Intelligent Systems and Production Engineering (ISPE), and since 2012 he has also been head of the FZI Living Lab Service Robotics.
Publications
Arne Rönnau regularly serves as a reviewer for international conferences such as IEEE ICRA, IEEE/RSJ IROS and IEEE ICAR, as well as for recognised journals such as IEEE Robotics and Automation Letters and IEEE Transactions on Robotics. He has also been a member of the program committees of international robotics conferences such as IEEE ECMR 2019.
Details and statistics on his publications can be found in his Google Scholar profile.
Journal and magazine articles (4)
- Connecting Artificial Brains to Robots in a Comprehensive Simulation Framework: The Neurorobotics Platform
Egidio Falotico and Lorenzo Vannucci and Alessandro Ambrosano and Ugo Albanese and Stefan Ulbrich and Juan Camilo Vasquez Tieck and Georg Hinkel and Jacques Kaiser and Igor Peric and Oliver Denninger and Nino Cauli et al., 2017
Combined efforts in the fields of neuroscience, computer science and biology have made it possible to design biologically realistic models of the brain based on spiking neural networks. For a proper validation of these models, an embodiment in a dynamic and rich sensory environment, where the model is exposed to a realistic sensory-motor task, is needed. Because these brain models are, at the current stage, too complex to meet real-time constraints, they cannot be embedded in a real-world task; rather, the embodiment has to be simulated as well. While adequate tools exist to simulate either complex neural networks or robots and their environments, there has so far been no tool that makes it easy to establish communication between brain and body models. The Neurorobotics Platform is a new web-based environment that aims to fill this gap by offering scientists and technology developers a software infrastructure that allows them to connect brain models to detailed simulations of robot bodies and environments and to use the resulting neurorobotic systems for in-silico experimentation. In order to simplify the workflow and reduce the level of required programming skills, the platform provides editors for the specification of experimental sequences and conditions, environments, robots, and brain-body connectors. In addition, a variety of existing robots and environments are provided. This work presents the architecture of the first release of the Neurorobotics Platform developed in subproject 10 "Neurorobotics" of the Human Brain Project (HBP). In its current state, the Neurorobotics Platform allows researchers to design and run basic experiments in neurorobotics using simulated robots and simulated environments linked to simplified versions of brain models. We illustrate the capabilities of the platform with three example experiments: a Braitenberg task implemented on a mobile robot, a sensory-motor learning task based on a robotic controller, and a visual tracking task embedding a retina model on the iCub humanoid robot. These use cases allow the applicability of the Neurorobotics Platform to be assessed for robotic tasks as well as for neuroscientific experiments. (A minimal conceptual sketch of the brain-to-body coupling idea described here is given directly after this article list.)
- A framework for coupled simulations of robots and spiking neuronal networks
Hinkel, Georg and Groenda, Henning and Krach, Sebastian and Vannucci, Lorenzo and Denninger, Oliver and Cauli, Nino and Ulbrich, Stefan and Roennau, Arne and Falotico, Egidio and Gewaltig, Marc-Oliver and others, Springer, 2017
- Autonomous navigation for reconfigurable snake-like robots in challenging, unknown environments
L. Pfotzer and S. Klemm and A. Roennau and J.M. Zöllner and R. Dillmann, 2017
- Scaling up liquid state machines to predict over address events from dynamic vision sensors
Jacques Kaiser and Rainer Stal and Anand Subramoney and Arne Roennau and Rüdiger Dillmann, 2017
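As a rough illustration of the brain-to-body coupling idea described in the Neurorobotics Platform abstract above, the following sketch couples two toy leaky integrate-and-fire neurons to a simulated differential-drive robot in a Braitenberg-style light-following task. It is a minimal conceptual sketch only: it does not use the Neurorobotics Platform API, and the sensor model, parameter values and all names are illustrative assumptions.

```python
# Conceptual sketch only (not Neurorobotics Platform code): two leaky
# integrate-and-fire neurons steer a simulated differential-drive robot
# toward a light source, Braitenberg-style. All parameters are assumptions.
import numpy as np

DT = 0.01        # simulation step [s]
TAU = 0.05       # membrane time constant [s]
V_THRESH = 1.0   # firing threshold
LIGHT = np.array([2.0, 1.5])   # light source position [m]

def sensor_intensity(pos, heading, side):
    """Light intensity seen by the left/right sensor (simple 1/d^2 model)."""
    offset = heading + (np.pi / 4 if side == "left" else -np.pi / 4)
    sensor = pos + 0.1 * np.array([np.cos(offset), np.sin(offset)])
    d = np.linalg.norm(LIGHT - sensor)
    return 1.0 / (d * d + 1e-6)

def lif_step(v, input_current):
    """One Euler step of a leaky integrate-and-fire neuron; returns (v, spiked)."""
    v += DT * (-v / TAU + input_current)
    return (0.0, True) if v >= V_THRESH else (v, False)

pos, heading = np.zeros(2), 0.0
v_left = v_right = 0.0

for _ in range(2000):
    # "Transfer function" robot -> brain: sensor readings become input currents.
    v_left, spike_left = lif_step(v_left, 50.0 * sensor_intensity(pos, heading, "left"))
    v_right, spike_right = lif_step(v_right, 50.0 * sensor_intensity(pos, heading, "right"))

    # "Transfer function" brain -> robot: crossed wiring (left sensor neuron
    # drives the right wheel) turns the robot toward the light.
    wheel_left = 0.2 + (0.5 if spike_right else 0.0)
    wheel_right = 0.2 + (0.5 if spike_left else 0.0)

    # Differential-drive kinematics with a 0.2 m wheel base.
    heading += DT * (wheel_right - wheel_left) / 0.2
    pos += DT * 0.5 * (wheel_left + wheel_right) * np.array([np.cos(heading), np.sin(heading)])

print("final distance to light: %.3f m" % np.linalg.norm(LIGHT - pos))
```

The two "transfer functions" marked in the comments stand in for the brain-body connectors mentioned in the abstract; in the actual platform these couple a full spiking brain simulation to a physics-based robot and environment simulation.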
Conference papers (13)
- Fast online collision avoidance for mobile service robots through potential fields on 3D environment data processed on GPUs
C. Juelg and A. Hermann and A. Roennau and R. Dillmann, 2017
- Spiking Convolutional Deep Belief Networks
Kaiser, Jacques and Zimmerer, David and Tieck, J. Camilo Vasquez and Ulbrich, Stefan and Roennau, Arne and Dillmann, Rüdiger, Springer International Publishing, 2017
Understanding visual input as perceived by humans is a challenging task for machines. Today, most successful methods work by learning features from static images. Based on classical artificial neural networks, those methods are not adapted to process event streams as provided by the Dynamic Vision Sensor (DVS). Recently, an unsupervised learning rule to train Spiking Restricted Boltzmann Machines has been presented [9]. Relying on synaptic plasticity, it can learn features directly from event streams. In this paper, we extend this method by adding convolutions, lateral inhibitions and multiple layers. We evaluate our method on a self-recorded DVS dataset as well as the Poker-DVS dataset. Our results show that our convolutional method performs better and needs fewer parameters. It also achieves comparable results to previous event-based classification methods while learning features in an unsupervised fashion. (An illustrative sketch of spike-based synaptic plasticity, the family of local learning rules such methods build on, is given after the conference paper list below.)
- Probabilistic Symbol Encoding for Convolutional Associative Memories
Igor Peric and Alexandru Lesi and Daniel Spies and Stefan Ulbrich and Arne Roennau and Marius Zoellner and Ruediger Dillmann, 2017
- Exact spike timing computational model of convolutional associative memories
I. Peric and F. Schneider and C. H. Price and S. Ulbrich and A. Roennau and M. Zoellner and R. Dillmann, 2017
- Mensch-Roboter-Kollaboration von der Forschung in die Praxis (human-robot collaboration: from research into practice)
A. Roennau, 2017
- Towards Grasping with Spiking Neural Networks for Anthropomorphic Robot Hands
Tieck, J. Camilo Vasquez and Donat, Heiko and Kaiser, Jacques and Peric, Igor and Ulbrich, Stefan and Roennau, Arne and Zöllner, Marius and Dillmann, Rüdiger, Springer International Publishing, 2017
Representation and execution of movement in biology is an active field of research relevant to neurorobotics. Humans can remember grasp motions and modify them during execution based on the shape of and the intended interaction with objects. We present a hierarchical spiking neural network with a biologically inspired architecture for representing different grasp motions. We demonstrate the ability of our network to learn from human demonstration using synaptic plasticity on two exemplary grasp types (pinch and cylinder). We evaluate the performance of the network in simulation and on a real anthropomorphic robotic hand. The network demonstrates the ability to learn finger coordination and synergies between joints that can be used for grasping.
- Model-based polynomial function approximation with spiking neural networks
S. Ulbrich and T. Steward and I. Peric and A. Roennau and J. M. Zöllner and R. Dillmann, 2017
- Design of an exchangeable, compact and modular bio-inspired leg for six-legged walking robots
T. Buettner and A. Roennau and G. Heppner and R. Dillmann, 2017
- Model-based Localisation and Segmentation of Modular Satellites using 3D Lidar Point Clouds
M. Grosse Besselmann and A. Roennau and R. Dillmann, 2017
- Exploration and Sample-Return Missions with a Walking Robot
G. Heppner and A. Roennau and R. Dillmann, 2017
- Service robots in the field: The BratWurst Bot
F. Mauch and A. Roennau and G. Heppner and T. Buettner and R. Dillmann, 2017
- Predictive motion synchronization for two arm dynamic mobile manipulation
J. Mangler and F. Mauch and A. Roennau and R. Dillmann, 2017
- Forward Dynamics Compliance Control (FDCC): A new approach to cartesian compliance for robotic manipulators
S. Scherzinger and A. Roennau and R. Dillmann, 2017
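Several of the papers above, for example the spiking convolutional deep belief networks and the spiking grasp-learning work, rely on spike-based synaptic plasticity to learn directly from spike or event streams. The following is a generic, illustrative sketch of pair-based spike-timing-dependent plasticity (STDP) with eligibility traces; it is not the learning rule from any of these papers, and the Poisson input, network size and parameter values are assumptions.

```python
# Illustrative sketch of pair-based STDP with eligibility traces.
# Not code from the papers above; all parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)

N_PRE, N_POST = 64, 8          # presynaptic / postsynaptic neurons
T_STEPS, DT = 1000, 1e-3       # 1 s of activity at 1 ms resolution
TAU_PLUS = TAU_MINUS = 20e-3   # plasticity time constants [s]
A_PLUS, A_MINUS = 0.01, 0.012  # potentiation / depression amplitudes

weights = rng.uniform(0.0, 0.5, size=(N_POST, N_PRE))
trace_pre = np.zeros(N_PRE)    # low-pass filtered presynaptic spike trains
trace_post = np.zeros(N_POST)  # low-pass filtered postsynaptic spike trains

for t in range(T_STEPS):
    # Poisson input spikes stand in for DVS events or demonstrated motion.
    pre_spikes = rng.random(N_PRE) < 0.02          # roughly 20 Hz
    # Crude postsynaptic model: spike if the weighted input crosses a threshold.
    post_spikes = weights @ pre_spikes > 0.5

    # Decay the traces, then add the new spikes.
    trace_pre += DT * (-trace_pre / TAU_PLUS) + pre_spikes
    trace_post += DT * (-trace_post / TAU_MINUS) + post_spikes

    # Pre-before-post potentiates, post-before-pre depresses (pair-based STDP).
    weights += A_PLUS * np.outer(post_spikes, trace_pre)     # LTP on post spike
    weights -= A_MINUS * np.outer(trace_post, pre_spikes)    # LTD on pre spike
    np.clip(weights, 0.0, 1.0, out=weights)

print("mean weight after learning: %.3f" % weights.mean())
```

Pre-before-post spike pairings strengthen a synapse while post-before-pre pairings weaken it; this locality is what allows such event-driven methods to learn features directly from spike streams without a global error signal.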
Contact
Phone: +49 721 9654-228
E-mail: roennau@fzi.de