Research Projects

ITERATION
Individual Training obsERvation and AdapTION
Start: 12/2020
End: 08/2022

The ITERATION project is developing a cost-effective, AI-based system for the automated analysis of fitness exercises. Using an Nvidia Jetson Nano and a standard webcam, 3D joint points are extracted from the video image in real time (BlazePose algorithm) and used to calculate joint angles, cadence, and position on the FitterYOU mat. The results are immediately fed back to the exerciser via a traffic-light feedback display (red, yellow, green). In a two-day evaluation with 22 test subjects, the four core exercises (classic push-up, butterfly tension killer, body plank with foot takeoff, classic squat plus) were tested, and the system demonstrated high consistency with the assessments of a training scientist. In addition, heart rate variability (HRV) is recorded contactlessly via rPPG measurement from the facial ROI. The result is a modular, easy-to-integrate prototype suitable for both B2C and B2B environments.
ITERATION aims to develop an inexpensive, AI-based analysis tool for fitness and dance exercises that is based exclusively on a standard webcam and a compact edge computer (Nvidia Jetson Nano). After an initial evaluation of various pose recognition algorithms (trt_pose, MoveNet, BlazePose), BlazePose was chosen because it can reconstruct up to 33 joint points in 3D from a 2D image, thus providing important reference points for hands, feet, and the entire body.
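The sketch below illustrates how such 3D landmarks could be obtained with MediaPipe's publicly available BlazePose implementation; the webcam index, model complexity, and the choice of landmark are illustrative assumptions, not the project's actual configuration.

```python
# Minimal sketch: extract 33 3D pose landmarks from a webcam stream with
# MediaPipe BlazePose (assumption: the mediapipe package is used here; the
# project's actual integration on the Jetson Nano may differ).
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

cap = cv2.VideoCapture(0)  # standard webcam (index 0 is an assumption)
with mp_pose.Pose(model_complexity=1, enable_segmentation=False) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # BlazePose expects RGB input; OpenCV delivers BGR frames
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_world_landmarks:
            # 33 landmarks with x, y, z coordinates relative to the hip midpoint
            landmarks = results.pose_world_landmarks.landmark
            left_wrist = landmarks[mp_pose.PoseLandmark.LEFT_WRIST]
            print(left_wrist.x, left_wrist.y, left_wrist.z)
cap.release()
```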
The software stack includes Linux, TensorFlow, PyTorch, and OpenCV. In a two-stage detector-tracker pipeline, BlazePose first locates the person's region of interest (ROI) and then estimates the joint points. From the joint angles, the range of motion (RoM) is calculated for each repetition; a constant RoM across repetitions yields a high angle score, while strong fluctuations indicate overload. The cadence (time per repetition) is evaluated in the same way, and the position on the FitterYOU mat is determined by comparing the recognized hand or foot coordinates with the marked zones. Each subscore yields a value from 0 to 100; their mean forms the total score, which is translated into the traffic-light display (red < 75, yellow < 90, green ≥ 90).
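A minimal sketch of this scoring idea follows, assuming 3D joint coordinates are already available; the angle computation and example values are illustrative, while the 0–100 subscore range, the mean aggregation, and the traffic-light thresholds are taken from the description above.

```python
# Sketch: joint angle from three 3D points, plus aggregation of the three
# subscores (angle, cadence, mat position) into the traffic-light feedback.
import numpy as np

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by the points a-b-c, each a 3D coordinate."""
    v1, v2 = np.asarray(a) - np.asarray(b), np.asarray(c) - np.asarray(b)
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def traffic_light(angle_score, cadence_score, mat_score):
    """Average the three subscores (each 0-100) and map to the feedback colour."""
    total = np.mean([angle_score, cadence_score, mat_score])
    if total >= 90:
        return total, "green"
    if total >= 75:
        return total, "yellow"
    return total, "red"

# Example: elbow angle during a push-up repetition (coordinates are made up)
elbow = joint_angle((0.0, 0.4, 0.0), (0.0, 0.1, 0.1), (0.0, -0.2, 0.0))
print(round(elbow, 1), traffic_light(92, 88, 95))
```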
For the evaluation, 22 test subjects (7 women, 15 men) were observed at four test stations over two days. Each subject performed 8 repetitions of the four selected exercises, with the system completing the video analysis within approximately one second of the end of the exercise and immediately displaying the traffic-light feedback. In addition to pose metrics, the raw data also yielded vital parameters: rPPG analysis of the facial ROI was used to determine heart rate and HRV (AVNN, SDNN, RMSSD, LF/HF power, etc.). Heart rate correlated perfectly (r = 1) with an ECG reference sensor, and the low-frequency HRV parameters likewise showed high correlations (r = 0.82–0.98). The project demonstrates that combining AI pose recognition, real-time feedback, and contactless HRV measurement yields a practical, scalable system for digital fitness analysis.
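For illustration, the time-domain HRV metrics named above can be computed from a series of RR/NN intervals as sketched below; the interval values are made up, and the frequency-domain LF/HF analysis is omitted.

```python
# Sketch: time-domain HRV metrics (AVNN, SDNN, RMSSD) from RR intervals in ms.
import numpy as np

def hrv_time_domain(rr_ms):
    rr = np.asarray(rr_ms, dtype=float)
    diffs = np.diff(rr)
    return {
        "AVNN": rr.mean(),                      # mean NN interval
        "SDNN": rr.std(ddof=1),                 # standard deviation of NN intervals
        "RMSSD": np.sqrt(np.mean(diffs ** 2)),  # root mean square of successive differences
        "HR_bpm": 60000.0 / rr.mean(),          # mean heart rate in beats per minute
    }

print(hrv_time_domain([812, 798, 830, 805, 790, 845, 820]))  # example intervals
```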
Role of the FZI
The FZI Research Center for Information Technology was the central research partner from the start of the project. It selected the hardware platform (Nvidia Jetson Nano + webcam) and set up the Linux operating system, including all necessary libraries (TensorFlow, PyTorch, OpenCV). The FZI carried out the systematic evaluation of pose recognition algorithms, made the final decision in favor of BlazePose, and implemented the two-stage detector-tracker pipeline. In addition, the FZI developed the extension for calculating joint angles, integrated the three subscore modules (angle, cadence, mat position), and implemented the traffic light feedback system.
As part of the evaluation, the FZI provided the two test stations, programmed the video recording and analysis pipeline (≈ 1 s processing time), and monitored the entire test procedure. In addition, the institute implemented rPPG-based heart rate and HRV measurement, optimized the face alignment algorithm to run at 35–40 fps, and performed the statistical evaluation of the vital parameters. Finally, the FZI integrated all subcomponents into a modular Python architecture that enables easy connection to the existing FitterYOU app. In doing so, the FZI provided the technical basis, algorithmic expertise, and evaluation infrastructure that turned the ITERATION system into a functional prototype.
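A rough sketch of the rPPG principle mentioned here follows, assuming OpenCV's bundled Haar face detector and a fixed frame rate; the project's actual face-alignment and HRV pipeline (35–40 fps) is more elaborate, and the band limits and filter order are illustrative assumptions.

```python
# Sketch: contactless pulse estimation via rPPG - average the green channel over
# the face ROI per frame, band-pass filter to the heart-rate band, count peaks.
import cv2
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

FPS = 30.0  # assumed webcam frame rate
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def green_trace(frames):
    """Mean green value of the first detected face ROI in each BGR frame."""
    trace = []
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, 1.3, 5)
        if len(faces):
            x, y, w, h = faces[0]
            trace.append(frame[y:y + h, x:x + w, 1].mean())
    return np.asarray(trace)

def heart_rate_bpm(trace, fps=FPS):
    """Band-pass 0.7-4 Hz (42-240 bpm), then estimate HR from mean peak spacing."""
    b, a = butter(3, [0.7 / (fps / 2), 4.0 / (fps / 2)], btype="band")
    filtered = filtfilt(b, a, trace - trace.mean())
    peaks, _ = find_peaks(filtered, distance=fps / 4)
    return 60.0 * fps / np.mean(np.diff(peaks))
```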

Contact person
Staff
Division: Embedded Systems and Sensors Engineering
Headquarters Karlsruhe

Research focus
Applied Artificial Intelligence

In this research focus, the FZI concentrates on practical research into the key technology of Artificial Intelligence (AI). On behalf of our partners and customers, innovative AI solutions are developed and transferred to application areas such as mobility, robotics, healthcare technology, logistics, production, and supply and disposal.

Funding notice:
The ITERATION project is funded by the Federal Ministry for Economic Affairs and Energy.

Project partners:
