CoCar - the instrumented Cognitive Car

Future vehicles will provide active driving support to the driver. Built-in sensors, cameras, and intelligent software and hardware systems gather information about the vehicle's surroundings and interpret this input, drastically improving safety, driving comfort, and energy efficiency. With the instrumented research vehicle “CoCar” of the FZI Living Lab Automotive, such concepts are developed, evaluated, and tested under real conditions.

Goals and Innovations

With our research and development focused on driving comfort, safety and energy efficiency, CoCar is a basic research platform for:

  • realization and evaluation of new, advanced features
  • online integration of new data processing and control components
  • testing under real-life conditions
  • collection of real benchmark data
  • evaluation of individual components and of the entire system in a real-life environment

The Vehicle CoCar

The vehicle is based on an Audi Q5 and was extensively modified to enable safe autonomous driving as well as advanced perception and control. Additional sensors and hardware components were installed; they blend into the car's design and appearance while keeping the platform flexible for various use cases. For instance, the fixed cameras can easily be exchanged for other camera models, and different cameras and similar sensors can be installed in and removed from the car interior. A multi-functional plug-in connector on the roof allows external sensors, such as a 360° laser scanner or a dome camera, to be connected to the vehicle's data processing and power supply systems.

Autonomous Driving

Software control of the acceleration and brake functions is enabled by real-time control signals transmitted over a CAN bus. A steering actuator controls the steering wheel, while a shift actuator moves the gear stick. The vehicle's safety concept provides three different modes. In normal mode, the driver controls the vehicle and no physical software assistance is performed. In autonomous mode, the vehicle is completely software-controlled; however, as soon as the driver touches the brake pedal or the steering wheel, the vehicle instantaneously switches back to normal mode and the driver regains full control. In semi-autonomous mode, the driver and the software share control of the vehicle. The driver can take over at any time by intervening, without the vehicle switching to normal mode. This addresses an emerging field of research in the area of shared control.
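
The three modes and the intervention rule can be pictured as a small state machine. The following Python sketch is only an illustration with hypothetical names; the actual safety concept runs as real-time software on the vehicle's control hardware:

```python
from enum import Enum, auto

class Mode(Enum):
    NORMAL = auto()           # driver has full control, no software assistance
    AUTONOMOUS = auto()       # vehicle is completely software-controlled
    SEMI_AUTONOMOUS = auto()  # driver and software share control

class ModeSupervisor:
    """Hypothetical supervisor mirroring the safety concept described above."""

    def __init__(self):
        self.mode = Mode.NORMAL

    def on_driver_input(self, brake_touched: bool, wheel_touched: bool):
        # In autonomous mode, any touch of the brake pedal or steering wheel
        # immediately hands full control back to the driver.
        if self.mode is Mode.AUTONOMOUS and (brake_touched or wheel_touched):
            self.mode = Mode.NORMAL
        # In semi-autonomous mode the driver may intervene at any time
        # without the vehicle leaving the shared-control mode.

    def request_mode(self, requested: Mode):
        # Mode changes are otherwise only made on explicit request.
        self.mode = requested

supervisor = ModeSupervisor()
supervisor.request_mode(Mode.AUTONOMOUS)
supervisor.on_driver_input(brake_touched=True, wheel_touched=False)
assert supervisor.mode is Mode.NORMAL  # driver has regained control
```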

Integrated Sensors

Current vehicles use ultrasound and radar sensors to acquire data about the environment, as needed for parking assistance and ACC features. For a wider range of advanced functions, additional sensors that allow a more detailed perception of the environment must be installed.

To measure the distance to other objects, laser scanners, time-of-flight cameras, and/or stereo cameras can be used. Three laser scanners with four parallel scanning layers each (Ibeo Lux) are integrated into the front and rear valances. They are complemented by two PMD time-of-flight cameras, one at the front and one at the rear. Optionally, a 32-layer laser scanner with a 360° field of view (Velodyne) can be mounted on the roof.
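
As a simple illustration of how a stereo camera yields distance, the sketch below applies the standard disparity-to-depth relation Z = f·B/d for a rectified stereo pair; the numbers used are placeholders, not CoCar's calibration values:

```python
def stereo_depth(disparity_px: float, focal_length_px: float, baseline_m: float) -> float:
    """Depth of a point from a rectified stereo pair: Z = f * B / d.

    disparity_px     horizontal pixel offset of the point between left and right image
    focal_length_px  focal length in pixels (from camera calibration)
    baseline_m       distance between the two camera centres in metres
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the rig")
    return focal_length_px * baseline_m / disparity_px

# Placeholder numbers for illustration: a 700 px focal length, a 30 cm baseline
# and a 10 px disparity correspond to a distance of about 21 m.
print(stereo_depth(disparity_px=10.0, focal_length_px=700.0, baseline_m=0.30))
```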

Video cameras are integrated in various areas of the vehicle. A camera in the front grille provides images in the driving direction, and the rear-view mirrors are equipped with cameras that cover the blind spots. In the car interior, additional cameras can be installed to monitor the inner and outer area. Most commonly, one 2D or 3D camera monitors the driver, while another one (or two, for a stereo view) faces forward and captures the driver's frontal view.

The position of the vehicle is estimated by an inertial navigation sensor (OXTS RT3003). The sensor is connected to two integrated GPS antennas and fuses their measurements with its own motion data, achieving centimeter-level precision.
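
The RT3003 performs this GPS/inertial fusion internally. Purely as an illustration of the principle, and not of the device's actual algorithm, the following toy sketch dead-reckons with accelerometer data and pulls the estimate back towards each incoming GPS fix:

```python
class SimpleGpsImuFusion:
    """Toy 1-D complementary filter: integrate IMU acceleration between GPS fixes
    and blend each GPS measurement into the dead-reckoned position estimate."""

    def __init__(self, gps_gain: float = 0.2):
        self.position_m = 0.0
        self.velocity_ms = 0.0
        self.gps_gain = gps_gain  # how strongly a GPS fix corrects the estimate

    def predict(self, accel_ms2: float, dt_s: float):
        # Dead reckoning from the inertial measurement.
        self.velocity_ms += accel_ms2 * dt_s
        self.position_m += self.velocity_ms * dt_s

    def correct(self, gps_position_m: float):
        # Pull the estimate towards the GPS fix.
        self.position_m += self.gps_gain * (gps_position_m - self.position_m)

fusion = SimpleGpsImuFusion()
for _ in range(100):                 # one second of IMU samples at 100 Hz
    fusion.predict(accel_ms2=1.0, dt_s=0.01)
fusion.correct(gps_position_m=0.5)   # GPS fix arriving once per second
print(fusion.position_m, fusion.velocity_ms)
```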

Example Applications

Automated Parking

With regard to future technical solutions, parking scenarios are being researched that enable autonomous parking and automated battery recharging of electric cars in compact, dense parking systems. At the opening of the HOLL, concepts and approaches for autonomous parking were demonstrated.

The driver leaves his vehicle at the entrance to a parking lot. The vehicle plans a path to a free charging station and drives there autonomously after access to the parking lot has been approved by a green traffic light. When the vehicle is going to be used again, the driver can call the vehicle. The vehicle then drives autonomously to the exit of the parking lot where the driver enters the vehicle and takes over control.
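
One way to picture this workflow is as a simple event-driven state sequence. The sketch below uses hypothetical state and event names and only mirrors the steps described above; it is not an actual FZI implementation:

```python
from enum import Enum, auto

class ParkingState(Enum):
    DROPPED_OFF = auto()         # driver left the car at the entrance
    WAITING_FOR_ACCESS = auto()  # waiting for the parking lot to grant access
    DRIVING_TO_CHARGER = auto()  # autonomously following the planned path
    CHARGING = auto()            # parked at a free charging station
    DRIVING_TO_EXIT = auto()     # autonomously driving to the exit
    PICKED_UP = auto()           # driver has taken over control again

TRANSITIONS = {
    (ParkingState.DROPPED_OFF, "request_access"): ParkingState.WAITING_FOR_ACCESS,
    (ParkingState.WAITING_FOR_ACCESS, "green_light"): ParkingState.DRIVING_TO_CHARGER,
    (ParkingState.DRIVING_TO_CHARGER, "arrived_at_charger"): ParkingState.CHARGING,
    (ParkingState.CHARGING, "driver_calls_vehicle"): ParkingState.DRIVING_TO_EXIT,
    (ParkingState.DRIVING_TO_EXIT, "arrived_at_exit"): ParkingState.PICKED_UP,
}

def next_state(state: ParkingState, event: str) -> ParkingState:
    # Ignore events that do not apply in the current state.
    return TRANSITIONS.get((state, event), state)

state = ParkingState.DROPPED_OFF
for event in ["request_access", "green_light", "arrived_at_charger",
              "driver_calls_vehicle", "arrived_at_exit"]:
    state = next_state(state, event)
print(state)  # ParkingState.PICKED_UP
```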

Driver Monitoring

For the development of future driving assistance systems, the driver's attention must be taken into careful account. With this goal in mind, several systems were developed at the FZI to determine the driver's head position, head rotation, and viewing direction, using both 2D video cameras and 3D depth cameras.
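
The FZI systems themselves are not spelled out here, but the general approach of recovering head pose from camera images can be sketched. The following example uses OpenCV's solvePnP with placeholder head-model and landmark coordinates and assumes that 2D facial landmarks have already been detected by some landmark detector (an assumption made only for the sake of the sketch):

```python
import numpy as np
import cv2

# Rough 3D positions (in millimetres) of a few facial landmarks in a generic
# head model; placeholder values, not the model used at the FZI.
MODEL_POINTS = np.array([
    (0.0,    0.0,   0.0),    # nose tip
    (0.0,  -63.6, -12.5),    # chin
    (-43.3,  32.7, -26.0),   # left eye, outer corner
    (43.3,   32.7, -26.0),   # right eye, outer corner
    (-28.9, -28.9, -24.1),   # left mouth corner
    (28.9,  -28.9, -24.1),   # right mouth corner
])

def head_pose(image_points, image_width, image_height):
    """Estimate head rotation and translation from 2D landmarks via PnP.
    image_points: 6x2 array of pixel coordinates matching MODEL_POINTS."""
    focal = image_width  # crude guess; a calibrated camera matrix would be used in practice
    camera_matrix = np.array([[focal, 0.0, image_width / 2],
                              [0.0, focal, image_height / 2],
                              [0.0, 0.0, 1.0]])
    dist_coeffs = np.zeros(4)  # assume no lens distortion in this sketch
    ok, rvec, tvec = cv2.solvePnP(MODEL_POINTS, image_points, camera_matrix, dist_coeffs)
    return rvec, tvec  # head rotation (Rodrigues vector) and translation

# Example call with made-up landmark coordinates in a 1280x720 image:
landmarks = np.array([(640, 360), (640, 500), (560, 300),
                      (720, 300), (590, 430), (690, 430)], dtype=np.float64)
rotation, translation = head_pose(landmarks, 1280, 720)
```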

Energy-efficient Driving

The FZI is testing new assistance systems that help the driver choose the right driving strategy and offer hints for energy-optimized maneuvers. These assistance systems can be integrated and tested in CoCar, making use of its precise localization and the various sensors for environment perception. The user interface is realized through displays behind the steering wheel and in the centre console, as well as a touchscreen. To enable tactile feedback, the integration of an active gas pedal is planned.
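
As an illustration of the kind of hint such a system could compute, the following sketch estimates how far before a lower speed limit the driver should lift off the accelerator so that the car coasts down to the target speed. The assumed coasting deceleration is a placeholder; a real system would estimate it from vehicle and environment data:

```python
def liftoff_distance_m(current_speed_kmh: float, target_speed_kmh: float,
                       coast_decel_ms2: float = 0.5) -> float:
    """Distance needed to coast from the current speed down to the target speed,
    from v_target^2 = v_current^2 - 2*a*d  =>  d = (v_current^2 - v_target^2) / (2*a).
    coast_decel_ms2 is an assumed average deceleration while coasting (rolling and
    air resistance plus engine drag)."""
    v0 = current_speed_kmh / 3.6  # km/h to m/s
    v1 = target_speed_kmh / 3.6
    if v1 >= v0:
        return 0.0                # already at or below the target speed
    return (v0 ** 2 - v1 ** 2) / (2.0 * coast_decel_ms2)

# Coasting from 100 km/h down to 70 km/h at about 0.5 m/s² takes roughly 390 m,
# so the lift-off hint would appear about that far before the speed limit sign.
print(round(liftoff_distance_m(100.0, 70.0)))
```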
