Test drive in the virtual interior

What at first looks like the latest car driving simulator is in fact a modern test environment for developing new safety functions in cars. In this simulated environment, IAV uses synthetic sensor data generated on the computer, together with tools from game development. Photorealistic renderings stand in for real people and test vehicles.

Now that vehicles have a firm grasp of their external surroundings, the interior is increasingly being equipped with sensors too. They enable new safety functions to find their way into the vehicle, such as adaptive restraint systems or the monitoring of vital signs, allowing, for instance, an emergency call to be placed automatically. One of these innovations, "Child Present Detection," is likely to become NCAP-relevant in the future: the function is intended to let the vehicle determine whether a child has been left behind in the passenger compartment and is now in potentially fatal danger, for example from heat exposure in summer.

Cameras and radar sensors thus become the "eyes" of the passenger compartment and provide the data for the new safety functions. The algorithms behind them are based on machine learning, which makes safeguarding a challenge. "We need very large amounts of data for this process, which is impossible to generate exclusively through real test drives with real people," says Maximilian Brenneis from IAV Fahrzeugsicherheit GmbH. "That's why we resort to synthetic sensor data, which we generate relatively easily and in great variety on the computer. This allows us to develop and safeguard safety functions for our customers quickly and reliably."
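The "great variety on the computer" that Brenneis describes is often achieved by randomizing scene parameters before each frame is rendered. The following is a minimal sketch of that idea, not IAV's actual tool chain: all parameter names, categories, and value ranges are illustrative assumptions, and a real pipeline would hand these parameters to a rendering engine such as the Unreal Engine.

```python
import random

# Hypothetical scene parameters for a rendered vehicle interior.
# Every name and range below is an illustrative assumption.
OCCUPANT_TYPES = ["adult", "child", "infant_seat", "empty"]
LIGHTING = ["noon_sun", "overcast", "night_streetlight", "tunnel"]

def sample_scene(rng: random.Random) -> dict:
    """Draw one randomized interior configuration for the renderer."""
    return {
        "occupant": rng.choice(OCCUPANT_TYPES),
        "lighting": rng.choice(LIGHTING),
        "seat_recline_deg": rng.uniform(0.0, 40.0),
        "head_yaw_deg": rng.uniform(-60.0, 60.0),
    }

def generate_dataset(n: int, seed: int = 0) -> list:
    """Generate n scene descriptions; a fixed seed keeps runs reproducible."""
    rng = random.Random(seed)
    return [sample_scene(rng) for _ in range(n)]

dataset = generate_dataset(10_000)
print(len(dataset))  # 10000
```

Because the sampling is seeded, the same synthetic data set can be regenerated exactly, which is useful when a safety function has to be re-validated after a change.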


A tool from game development

The technical basis for these photorealistic simulations is the Unreal Engine from U.S. manufacturer Epic Games, which is normally used to produce computer games and Hollywood films. IAV’s experts use the Unreal Engine to create vehicle interiors and people in high quality and detail. Variations in the interior are just as easy to customize as the deceptively real representation of people of all ethnicities, ages and genders.

Moreover, the digital occupants are not frozen in place: they move during the simulated journey (for example when the car accelerates or takes a sharp turn) and change their facial expressions, just like real passengers. Light and shadow also constantly change the reflections in the interior, giving the algorithms extremely realistic training data. The synthetic sensor values have another appealing feature: they come perfectly segmented, so there is no need to manually assign individual pixels to different body parts, which speeds up the entire process enormously.
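The "perfectly segmented" property follows from how synthetic rendering works: the engine already knows which object every pixel belongs to, so it can emit a per-pixel class mask alongside the camera image. The sketch below illustrates this pairing in plain Python; it is a stand-in, not IAV's pipeline, and the class list and random pixel values are assumptions made for illustration.

```python
import random

# Hypothetical class list for interior segmentation (an assumption).
CLASS_NAMES = ["background", "seat", "torso", "head", "hand"]

def render_frame(height, width, rng):
    """Stand-in for a render call: returns an RGB image and its label mask.
    A real engine would compute both from the 3D scene; here the pixel
    values are random placeholders. The key point is that image and mask
    are produced together, so the ground truth needs no manual labeling."""
    rgb = [[(rng.randrange(256),) * 3 for _ in range(width)]
           for _ in range(height)]
    mask = [[rng.randrange(len(CLASS_NAMES)) for _ in range(width)]
            for _ in range(height)]
    return rgb, mask

rng = random.Random(42)
rgb, mask = render_frame(48, 64, rng)

# Every pixel arrives labeled; no annotation pass is needed.
print(sum(len(row) for row in mask))  # 3072
```

With real camera data, each of those pixels would have to be annotated by hand or by a separate model before it could serve as training data, which is exactly the step the synthetic approach eliminates.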

End-to-end tool chain for the entire development process

“The safeguarding of functions in the interior is becoming increasingly important,” says Brenneis. “This applies to all future vehicles, but especially to self-driving cabs. If the person behind the wheel is removed, the technology must be able to keep an eye on the interior and its occupants.”

IAV is meeting the increasing need for testing that this entails with a hybrid safeguarding approach: the bulk of the data comes from the computer, and real test drives are only scheduled at the end of the process. This means that new interiors and derivatives can be tested without great effort. Moreover, the approach is not limited to safeguarding: “The synthetic sensor data is also suitable for designing the functions,” explains Brenneis. This enables IAV to offer its customers an end-to-end tool chain for the entire development process. The virtual interior is available for simulated test drives at any time.