CoCar - The Instrumented Cognitive Car

Future vehicles will provide active driving support to the driver. Built-in sensors, cameras and intelligent hardware and software systems gather information about the vehicle's surroundings and interpret this input. In this way, safety and driving comfort, but also energy efficiency, will be drastically improved. With the instrumented research vehicle “CoCar” of the FZI Living Lab Automotive, such concepts are developed, evaluated and tested under real conditions.

Objectives and innovations

Within the FZI's research and development on driving comfort, safety and energy efficiency, CoCar serves as a basic platform for:

  • realisation and evaluation of new, advanced features
  • online integration of new data processing- and control components
  • testing under real-life conditions
  • collection of real benchmark data
  • evaluation of single components and of the entire system in real-life environment

The vehicle CoCar

Based on an Audi Q5, extensive modifications were made in order to enable safe autonomous driving, advanced perception and control. Additional sensors and hardware components were installed for this purpose. They fit well into the car's design and appearance while preserving flexibility for various use cases. For instance, the fixed cameras can easily be exchanged for other camera models, and in the car interior different cameras and similar sensors can be installed and removed with little effort. A multi-functional plug-in connector on the roof of the car allows external sensors, such as a 360° laser scanner or a dome camera, to be connected to the vehicle's data processing and power supply systems.

Autonomous driving

Software control of acceleration and braking is enabled via real-time control signals transmitted over a CAN bus. The steering wheel is operated by a steering actuator, while the gear stick is moved by a shift actuator. The vehicle's safety concept provides three modes. In normal mode, the driver controls the vehicle and no software intervention takes place. In autonomous mode, the vehicle is completely software-controlled; however, as soon as the driver touches the brake pedal or the steering wheel, the vehicle instantly switches to normal mode and the driver regains full control. In semi-autonomous mode, on the other hand, the driver and the software share control of the vehicle: the driver can intervene at any time without triggering a switch to normal mode. This addresses an emerging field of research in the area of shared control.
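The switching logic between the three modes can be summarised as a small state machine. The following sketch is an illustration of that logic only, not CoCar's actual control software; the type and function names are hypothetical.

```python
from enum import Enum

class Mode(Enum):
    NORMAL = "normal"          # driver in full control, no software intervention
    AUTONOMOUS = "autonomous"  # vehicle completely software-controlled
    SEMI_AUTONOMOUS = "semi"   # driver and software share control

def next_mode(mode: Mode, driver_brakes: bool, driver_steers: bool) -> Mode:
    """Mode after a possible driver intervention, per the safety concept:
    any brake or steering input in autonomous mode instantly hands control
    back to the driver; in semi-autonomous mode the driver may intervene
    at any time without leaving the shared-control mode."""
    if mode is Mode.AUTONOMOUS and (driver_brakes or driver_steers):
        return Mode.NORMAL
    return mode

# Braking in autonomous mode drops back to normal mode; the same input
# in semi-autonomous mode changes nothing.
assert next_mode(Mode.AUTONOMOUS, driver_brakes=True, driver_steers=False) is Mode.NORMAL
assert next_mode(Mode.SEMI_AUTONOMOUS, driver_brakes=True, driver_steers=False) is Mode.SEMI_AUTONOMOUS
```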

Integrated sensors

Current vehicles use ultrasound and radar sensors to acquire data about the environment, as needed for parking assistance and ACC (adaptive cruise control) features. For a wider range of advanced functions, it is necessary to install additional sensors that allow a more detailed perception of the environment.

To measure the distance to other objects, laser scanners, time-of-flight cameras and stereo cameras can be used. Three four-layer laser scanners (Ibeo Lux) are integrated into the front and rear valances. They are supported by two PMD time-of-flight cameras, one at the front and one at the rear. Optionally, a 32-layer laser scanner with a 360° field of view (Velodyne) can be mounted on the roof.

Video cameras are integrated at various points on the vehicle. A camera in the front grille provides images in the driving direction. The rear-view mirrors are also equipped with cameras that cover the blind spots. In the car interior, additional cameras can be installed to monitor the inner and outer areas. Most commonly, one 2D or 3D camera monitors the driver while another (or two, for a stereo view) faces forward, capturing the driver's frontal view.

The position of the vehicle is estimated by an inertial measurement unit (OXTS RT3003) coupled with two integrated GPS antennas. By fusing the inertial movement data with the absolute GPS fixes, the position estimate achieves centimetre-level precision.
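The idea behind such GPS/inertial fusion can be shown with a deliberately simplified 1-D complementary filter: trust the smooth, high-rate inertial estimate, but pull it slowly toward the drift-free GPS fix. This is a toy sketch of the principle; the RT3003 itself uses far more sophisticated filtering, and the function below is hypothetical.

```python
def fuse_position(imu_pos: float, gps_pos: float, gps_weight: float = 0.02) -> float:
    """One step of a 1-D complementary filter: keep most of the inertial
    dead-reckoning estimate, but blend in a small fraction of the absolute
    GPS fix to cancel accumulated IMU drift over time."""
    return (1.0 - gps_weight) * imu_pos + gps_weight * gps_pos

# Dead reckoning has drifted 1 m ahead of the GPS fix; each fusion step
# removes a small fraction of that drift.
pos = 101.0   # inertial estimate (m along the road)
gps = 100.0   # GPS fix (m)
for _ in range(100):
    pos = fuse_position(pos, gps)
print(pos - gps)  # residual drift shrinks toward zero
```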

Application examples

Automated parking

With regard to future technical solutions, parking scenarios are being researched that enable autonomous parking and automated battery recharging of electric cars in compact, dense parking systems. At the opening of the HOLL, concepts and approaches for autonomous parking were presented.

The driver leaves the vehicle at the entrance of a car park. The vehicle plans a path to a free charging station and drives there autonomously once access to the parking spot has been approved by a green traffic light. When the vehicle is to be used again, the driver can call it; the vehicle then drives autonomously to the exit, where the driver enters and takes over control.
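The "plan a path to a free charging station" step can be illustrated with a breadth-first search on a toy occupancy grid. This is a minimal sketch of the planning idea, not the planner actually used in CoCar; the grid and function names are assumptions.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search on an occupancy grid (0 = free, 1 = occupied).
    Returns the shortest list of cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        r, c = path[-1]
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append(path + [(nr, nc)])
    return None

# Entrance at (0, 0), free charging station at (2, 2),
# a row of parked cars blocking part of the middle aisle.
lot = [[0, 0, 0],
       [1, 1, 0],
       [0, 0, 0]]
print(plan_path(lot, (0, 0), (2, 2)))
```

A real planner would of course work on continuous poses and respect the car's turning radius; the grid search only conveys the shortest-route idea.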

Driver monitoring

For the development of future driving assistance systems, the driver's attention must be taken into careful consideration. With this objective in mind, several systems were developed at the FZI to determine the driver's head position, head rotation and viewing direction, using both 2D video cameras and 3D depth cameras.
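Once head rotation is estimated, a viewing direction can be derived from the yaw and pitch angles and compared against a cone around "straight ahead" as a rough attention cue. The snippet below is an illustrative sketch, assuming a simple vehicle coordinate frame (x forward, y left, z up); it is not the FZI's actual monitoring algorithm.

```python
import math

def view_direction(yaw_deg: float, pitch_deg: float):
    """Unit view-direction vector from head yaw (left/right) and pitch
    (up/down), in vehicle coordinates; yaw = pitch = 0 is straight ahead."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.cos(yaw),
            math.cos(pitch) * math.sin(yaw),
            math.sin(pitch))

def looks_at_road(yaw_deg: float, pitch_deg: float, cone_deg: float = 20.0) -> bool:
    """Rough attention check: is the view direction within a cone
    around the forward axis?"""
    x, _, _ = view_direction(yaw_deg, pitch_deg)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, x))))
    return angle <= cone_deg

print(looks_at_road(5.0, -3.0))   # True: nearly straight ahead
print(looks_at_road(70.0, 0.0))   # False: head turned toward the side window
```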

Energy-efficient driving

The FZI is testing new assistance systems that help the driver choose the correct driving strategy and offer hints for energy-optimised manoeuvres. These assistance systems can be integrated and tested in CoCar, using the precise localisation and the various environment sensors described above. The user interface is implemented via displays behind the steering wheel and in the centre console, as well as a touchscreen. In order to enable tactile feedback, it is planned to integrate an active accelerator pedal.
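One kind of hint such a system could give is when to lift off the accelerator and coast toward a stop instead of braking late, based on current speed and the distance to the stopping point. The following is a toy sketch of that idea under simple constant-deceleration physics; it is not the FZI's actual strategy logic, and the deceleration value is an assumption.

```python
def coast_advice(speed_kmh: float, distance_m: float, decel_mps2: float = 0.6) -> str:
    """Toy energy-saving hint: if lifting off now would let the car coast
    (gentle, assumed-constant deceleration) to roughly zero speed within
    the remaining distance, advise coasting; otherwise keep driving."""
    v = speed_kmh / 3.6                      # km/h -> m/s
    coast_dist = v * v / (2.0 * decel_mps2)  # stopping distance when coasting
    return "lift off and coast" if coast_dist <= distance_m else "maintain speed"

# At 50 km/h the assumed coast-down distance is about 161 m, so coasting
# pays off 200 m before a red light but not 100 m before it.
print(coast_advice(50.0, 200.0))  # lift off and coast
print(coast_advice(50.0, 100.0))  # maintain speed
```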