Virtual Testing of ADAS & AV Systems

Edge Case Simulations, by Mike Dempsey – MD, Claytex

Cars today are delivered with a plethora of advanced driver assistance systems (ADAS) such as lane keeping assistance, adaptive cruise control, automated emergency braking and much more. These systems are complex and expensive to develop, and yet customers' perception and experience of them is often quite negative.

There are a number of factors behind this: customer expectations often exceed what the systems are designed to do, and it is difficult to explain the limitations in clear, easily understandable terms. Another issue is that the systems are developed to meet regulatory requirements, but those test cases do not reflect the real world in which they need to perform.

What does this have to do with simulation and edge cases?

Well, if we want our ADAS features to perform better in the real world, we need to be able to test them in scenarios that are representative of it. However, it is difficult to safely recreate real-world scenarios on a physical proving ground. For instance, we don't really want to risk crashing our prototype vehicle into another vehicle during a test.

But the real challenge is the number of edge cases that need to be considered. We define an edge case as a scenario that is individually unlikely; considered together, however, these unlikely scenarios make up all of the risk.

Autonomous vehicle developers now recognise that to achieve commercial viability their systems will need to be trained, tested and validated on a huge number of edge cases.  Similarly, for ADAS features it is increasingly apparent that they need to be developed and validated on the relevant set of edge cases.

At Claytex we have been developing autonomous vehicle simulators that are designed to support the testing, training and validation of the vehicle systems. We focus on scenario-based testing of the complete system, which means we combine vehicle dynamics, sensor models, control systems and a detailed virtual world, complete with traffic and pedestrians, to challenge the vehicle. ADAS developers can, and should, utilise the same simulation technology, since they are using the same sensors and control methods.

The type of simulation tool needed to effectively test an ADAS feature or AV controller is quite different from the simulation tools that have been used for the past 10-20 years of vehicle development. These are complex closed-loop systems, where simplifications in any one part of the system model can have a significant impact on the overall capability of the system. For example, if you have a great vehicle dynamics model with the real control system but use a smooth road and basic animation, the scene presented to the perception sensors will not be representative, which in turn means the object detection will find it easy to identify and track targets. The result is that you will be limited in how far you can use the simulation tools to develop, test and validate your system.

Figure 1: Block diagram of ADAS and AV appropriate simulator

Our simulators are built around rFpro, a driving simulation tool that provides our virtual environment. A unique feature of rFpro, compared to traditional driving simulation solutions, is that it allows driving simulation to be used to test the vehicle dynamics of road vehicles. By delivering a high-resolution road surface in real time, while generating accurate, realistic graphics without lag, it enables professional test drivers to contribute to the engineering process while the car design is still model-based.

The vehicle model can be developed using any of the major vehicle dynamics tools including Dymola, CarMaker, CarSim, Simulink and many more.  At Claytex, we favour the use of Dymola as this allows our vehicle model to include more than just the suspension, we can also model the powertrain, battery, thermal management, and all the other vehicle systems.

rFpro has the industry’s largest library of digital twins of public roads, test tracks and proving grounds, spanning North America, Asia and Europe. These include multi-lane highways, urban, rural, mountain routes and automotive proving grounds, all replicated faithfully from the real world using their unique 3D reconstruction process. 

For drivers testing aspects of vehicle dynamics, these models come with accurately modelled digital road surfaces, built from kinetic LiDAR surveys, using rFpro’s TerrainServer to map the entire drivable surface to a 1cm grid along, and across, the road. Every bump, ripple and discontinuity will find its way through your tyre model into your vehicle under test. 
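To illustrate the idea of a regular height grid feeding a tyre model, the sketch below stores a road surface as a 2D array sampled at 1 cm spacing and looks up the height under a tyre contact point with bilinear interpolation. This is a minimal illustration only; the grid layout and function names are our assumptions, not rFpro's or TerrainServer's actual interface.

```python
import numpy as np

GRID_SPACING = 0.01  # metres between samples (a 1 cm grid, as described above)

def road_height(heights: np.ndarray, x: float, y: float) -> float:
    """Bilinearly interpolate the road surface height at (x, y) metres.

    `heights[i, j]` is assumed to hold the surveyed height at the grid
    point (i * 1 cm, j * 1 cm) along and across the road.
    """
    gx, gy = x / GRID_SPACING, y / GRID_SPACING
    i, j = int(gx), int(gy)          # lower-left grid cell corner
    fx, fy = gx - i, gy - j          # fractional position within the cell
    h = heights
    return ((1 - fx) * (1 - fy) * h[i, j]
            + fx * (1 - fy) * h[i + 1, j]
            + (1 - fx) * fy * h[i, j + 1]
            + fx * fy * h[i + 1, j + 1])
```

A query like this would run inside the simulation loop for each tyre contact patch, so every bump in the surveyed surface feeds directly into the vehicle dynamics model.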

What this means for ADAS and AV development is that we can develop the vehicle dynamics model and test scenes to achieve a very high level of correlation between the real and virtual worlds. This ensures that the motion, and the related noise sources that affect the sensors, are captured in the simulation.

Physics-based sensor models

ADAS and AV systems rely on their perception sensors to detect and understand the world around them. They typically use a suite of different sensor types, including camera, LiDAR and radar, to measure the real world, with sensor fusion within the control system to interpret the data. Detailed sensor models are required to support the development of these systems because, when idealised sensors are used, it becomes too easy for the systems to identify and understand the scene and react. This leads to, for example, automated emergency braking systems identifying pedestrians much earlier in the idealised simulation than in the real world, which could lead you down the wrong development path.

Our camera sensors rely on the rendering capabilities of rFpro, which supports both real-time and non-real-time simulation modes. When running in real-time mode we can easily achieve full HD resolution at 60 fps; typically our driving simulators run at even higher resolutions and frame rates. Camera sensors can be calibrated to include lens distortion and tone mapping effects, enabling the simulation to match the real camera you are using.

Figure 2: Simulation of an RCCC camera with lens distortion
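As a rough illustration of the kind of lens calibration mentioned above, the sketch below applies a Brown-Conrady radial distortion model to normalised image coordinates. The model and coefficients are a standard textbook example, not rFpro's actual camera pipeline; the function name and parameters are our own.

```python
import numpy as np

def distort(x: np.ndarray, y: np.ndarray, k1: float, k2: float):
    """Apply radial lens distortion to normalised image coordinates.

    (x, y) are ideal pinhole coordinates; the returned coordinates are
    displaced radially by the polynomial factor 1 + k1*r^2 + k2*r^4,
    mimicking the barrel/pincushion distortion of a real lens.
    """
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor
```

Fitting coefficients like `k1` and `k2` to a real camera's calibration data is one way a simulated image can be made to match the footage the perception stack was trained on.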

Claytex has developed detailed LiDAR and radar models that include environmental and weather effects. For example, our real-time model of the Velodyne Puck LiDAR sensor runs at 325 frames per second, and its position is updated at each frame based on the underlying vehicle dynamics model and the rotational speed of the sensor. The weather model has been developed using real test data to determine the effect of rain, and of other weather effects such as fog, on range accuracy, intensity and the number of returns.
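To give a flavour of how rain can degrade a LiDAR return, here is a deliberately simplified sketch: a Beer-Lambert style two-way attenuation of echo intensity, with weak echoes probabilistically dropped. The coefficients are invented for illustration and are not Claytex's calibrated weather model.

```python
import math
import random

def attenuated_return(range_m, intensity, rain_mm_per_hr, rng=random.random):
    """Return (range, intensity) for one detected echo, or None if it is lost.

    A hypothetical rain model: extinction grows with rain rate, the echo is
    attenuated over the two-way path, and weaker echoes are more likely to
    fall below the detection threshold.
    """
    alpha = 0.01 * rain_mm_per_hr ** 0.6          # made-up extinction coefficient, 1/m
    intensity_out = intensity * math.exp(-2.0 * alpha * range_m)
    # probability of detection falls with the attenuated fraction
    if intensity <= 0 or rng() >= intensity_out / intensity:
        return None
    return (range_m, intensity_out)
```

A calibrated model would replace the invented `alpha` law with curves fitted to real test data, as described above, and apply similar effects to fog.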

The end result is that the sensor models are capable of generating representative data feeds that include the appropriate noise features. To support the training and validation of the perception systems, these data feeds are backed up with a wealth of ground-truth data such as depth maps, bounding box information, object velocities and much more.

Figure 3: Velodyne Ultra-Puck sensor model output from simulation of a complex scenario

Scenario based testing

Harnessing all this simulation power effectively is challenging, and scenario-based testing is the most appropriate approach when developing the control systems. Scenario-based testing means that we have a way to specify every aspect of the test, including the scenery, static objects such as traffic cones, dynamic objects such as traffic and pedestrians, the weather conditions and the intended path of the ego vehicle. A specific combination of these, taken together, defines a scenario.

This presents another big challenge: the definition and management of the scenarios within some form of database. For instance, if we consider a generic scenario where a pedestrian steps out in front of a moving vehicle, there are a huge number of parameter variations that we might need to consider. These include the basic mechanics of the scenario (vehicle speed, the distance from the vehicle to the pedestrian when they step out, other traffic and parked cars) as well as factors such as time of day, weather conditions and pedestrian clothing. One simple conceptual scenario very quickly leads to a huge number of potential scenarios.
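The combinatorial explosion is easy to see. The sketch below discretises a few parameters of the pedestrian step-out scenario and enumerates every combination; the parameter names and values are illustrative examples, not a real scenario schema.

```python
from itertools import product

# Illustrative parameter axes for the pedestrian step-out scenario.
axes = {
    "vehicle_speed_kph": [20, 30, 40, 50],
    "step_out_distance_m": [5, 10, 20, 35],
    "time_of_day": ["day", "dusk", "night"],
    "weather": ["dry", "rain", "fog"],
    "pedestrian_clothing": ["bright", "dark"],
    "parked_cars": [True, False],
}

# Every combination of values defines one concrete scenario.
scenarios = [dict(zip(axes, combo)) for combo in product(*axes.values())]
print(len(scenarios))  # 4 * 4 * 3 * 3 * 2 * 2 = 576
```

Even this coarse discretisation of one conceptual scenario yields 576 concrete tests; finer steps, more parameters or continuous ranges push the total far beyond what can be exhaustively executed.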

Figure 4: Test scenario including pedestrians and parked cars after a short shower has made the road surface wet
Figure 5: Same test scenario replayed at night which presents a different challenge to the perception systems

As part of a collaborative R&D project, we are working with several partners on novel approaches to managing these scenarios, and to testing and assessing the performance of your system so that weak points can be identified without testing every possible parameter combination for every conceivable scenario, which would be impractical.
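One simple alternative to an exhaustive sweep is to draw a random sample of the parameter space. The sketch below does exactly that; it is only an illustration of the problem, not one of the methods being developed in the project, and the parameter axes are invented.

```python
import random

def sample_scenarios(axes, n, seed=0):
    """Draw n random scenarios from a discrete parameter space.

    `axes` maps each parameter name to its list of candidate values.
    A fixed seed keeps the sampled test set reproducible between runs.
    """
    rng = random.Random(seed)
    return [{name: rng.choice(values) for name, values in axes.items()}
            for _ in range(n)]

# Example parameter space (illustrative values only).
example_axes = {
    "vehicle_speed_kph": [20, 30, 40, 50],
    "weather": ["dry", "rain", "fog"],
    "time_of_day": ["day", "dusk", "night"],
}
batch = sample_scenarios(example_axes, 100, seed=42)
```

More sophisticated approaches bias the sampling towards regions where earlier runs showed weak performance, but even naive random sampling scales far better than a full grid.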

To summarise

The effective development of ADAS features to meet real-world usage requirements can be enabled through simulation, but you will find that the tools you need are more complex and must integrate every aspect of the system's performance. This migration to new and improved simulation tools to support ADAS development is perhaps even more important in a post-Covid world, where physical testing has become even more complicated due to additional safety requirements related to social distancing.


CM Corporation

234 Whitechapel Road, London E1 1BJ. United Kingdom.
Tel. 44(0) 203 3711914 © 2021. All Rights Reserved.