In the pursuit of developing Advanced Driver Assistance Systems (ADAS) and Automated Driving (AD) technologies, the mobility industry faces an intricate challenge: subjecting sensors to the immense array of scenarios they might encounter in the real world. The complexity and diversity of these scenarios make real-world testing impractical due to its inherent limitations in time, cost and safety. Simulation has emerged as the only feasible solution, allowing for the comprehensive exploration of countless scenarios. However, a critical hurdle has persisted: the fidelity of simulated data often falls short of accurately replicating the nuanced perceptions of sensors in real-world conditions. Bridging this gap demands a transformative leap in simulation quality. Simulation software specialist rFpro has introduced its new ray tracing technology, offering unprecedented simulation fidelity and, for the first time, accurately replicating what sensors ‘see’.

The evolution of rendering
Traditional simulation rendering engines were developed for human users, for example in Driver-in-the-Loop (DIL) simulators. This places performance limitations on the simulation, as the data delivered to the driver must arrive in real time; the latest DIL simulators deliver graphical frames every 4 ms.
Most real-time simulations use a rendering technique called rasterization, in which the simulation calculates light taking a single bounce through the scene. This is a highly efficient way of rendering quickly enough for real-time simulation, but it compromises fidelity.
rFpro’s ray tracing engine has been developed from the ground up for ADAS and AD sensor development, providing a Software-in-the-Loop (SIL) testing solution and creating the highest-value synthetic training datasets. Models of vehicle sensors, such as camera, lidar and radar, can be decoupled from real time, which presents an opportunity to significantly increase the fidelity of the simulation.
Ray tracing reliably simulates the huge number of reflections created by multiple light sources, considering the properties of the materials the light is hitting, and applies this to every element in the scene as perceived by a vehicle-mounted sensor moving through it.
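To illustrate the principle (this is a toy sketch, not rFpro’s engine, and all names and material values are illustrative assumptions), the snippet below traces a ray against a single reflective floor plane. The direct, single-bounce term corresponds to what a rasterization-style renderer would capture; the recursive reflected term, weighted by the material’s reflectivity, adds the extra light that multi-bounce ray tracing recovers.

```python
# Toy multi-bounce trace: one reflective floor plane (y = 0) under a
# uniform "sky" light. Values are illustrative, not real material data.

REFLECTIVITY = 0.5   # assumed floor reflectivity
SKY = 0.8            # light arriving from above
FLOOR_ALBEDO = 0.3   # direct brightness of the floor itself

def trace(origin, direction, depth, max_depth=4):
    """Follow a ray, accumulating light from each bounce, with every
    reflected contribution attenuated by the surface reflectivity."""
    dx, dy, dz = direction
    if dy >= 0:                        # ray escapes upward: hits the sky
        return SKY
    t = -origin[1] / dy                # intersect the floor plane y = 0
    hit = (origin[0] + t * dx, 0.0, origin[2] + t * dz)
    colour = FLOOR_ALBEDO              # direct (single-bounce) term
    if depth < max_depth:
        reflected = (dx, -dy, dz)      # mirror about the floor normal
        colour += REFLECTIVITY * trace(hit, reflected, depth + 1)
    return colour

camera = (0.0, 1.0, 0.0)
ray = (0.0, -1.0, 0.3)
single_bounce = FLOOR_ALBEDO           # rasterization-style estimate
full = trace(camera, ray, depth=0)
print(single_bounce, full)             # the reflected sky light is missing
                                       # from the single-bounce estimate
```

A real engine traces millions of such rays per frame against measured material properties, but the structure is the same: the gap between the two printed values is exactly the reflected light that single-bounce rendering discards.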
Ray tracing’s ability to trace the path of light rays through a scene enables the generation of highly realistic images. This capability is particularly crucial in scenarios with complex lighting conditions, such as low-light environments, multi-source illumination, or reflections and shadows in urban night driving. These environments have proven to be the most challenging to navigate for AD systems.
This type of rendering is computationally demanding but results in the highest-fidelity, engineering-grade data. The rate of frame rendering is adjusted to suit the level of detail required. This enables high-fidelity rendering to be carried out offline to create machine learning training datasets, or linked to SIL testing systems, or played back in subsequent real-time runs for Hardware-in-the-Loop (HIL) testing. This overcomes the usual trade-off between rendering quality and running speed.

Replicating the way sensors ‘see’ our world
Modern HDR (High Dynamic Range) cameras used in the automotive industry capture multiple exposures of varying duration per frame, for example a short, a medium and a long exposure. Simulating this accurately is important because it ensures that the simulated images contain the accurate blurring caused by fast vehicle motion or road vibration.
One of the standout features of the ray tracing technology is its accuracy in simulating motion blur, the rolling shutter effect, and LED light source flicker. These phenomena can make a vehicle appear longer than it is, a traffic cone appear slanted or a road sign difficult to interpret, for example. Accurately simulating these effects is the only way to truly represent what sensors ‘see’ in the real world.
Most AD technologies use machine learning to train the system to correctly identify objects in the scene and make the correct decisions, but the output of these systems can only be as good as the training data used. To use synthetic training datasets, a simulation environment that mirrors reality is required, and rFpro’s new ray tracing engine delivers this.

Collaboration with Sony Semiconductor Solutions
Correlating results with the real world is the cornerstone of any simulation programme. To this end, rFpro collaborated with Sony Semiconductor Solutions (Sony), one of the world’s leading providers of sensor components, and Sony’s sensor models have been integrated into rFpro. The partnership is focused on further improving the fidelity of simulation solutions, aiming to reduce the industry’s reliance on real-world data collection during the sensor development cycle.
The companies have worked together to provide an end-to-end simulation pipeline, developing an efficient interface between Sony’s sensor models and the rendering system that is now common across all of Sony’s automotive image sensors. This facilitates a seamless transition to new sensor models, keeping pace with the rapid evolution of sensor technologies.
Accelerating Development and Reducing Costs
The integration of advanced ray tracing technology and authentic sensor models into simulation environments offers unprecedented efficiency in the development of ADAS and AD systems. Simulation enables the exploration of a limitless array of scenarios, allowing for comprehensive testing of sensor systems.
One notable efficiency gain is the ability to identify and address edge cases quickly. In traditional testing, waiting for an edge case to occur in the real world can be time-consuming, and the majority of miles driven are uneventful. With simulation, vehicles can drive thousands of high-value, high-activity virtual miles every day, significantly accelerating the development and training process. The growth of High-Performance Computing (HPC) and cloud computing is enabling these simulations to be massively scaled.
The cost savings are two-fold. Firstly, the reduction in reliance on real-world testing minimises the need for extensive physical data collection, cutting down on expenses associated with test vehicles, equipment, and personnel. Secondly, the ability to develop and test sensors in a virtual environment before they physically exist reduces the overall development cycle time. This not only saves costs but also contributes to the timely deployment of advanced technologies.
Collaborative Projects Strengthening the UK’s Mobility Supply Chain
Simulation is a critical part of a much wider toolchain for the development of ADAS and AD technologies. Collaborating with other industry partners is the quickest way to bring these technologies to market safely. rFpro has recently been announced as a consortium partner in two collaborative projects with the aim of advancing the UK’s connected and automated mobility supply chain.
The Centre for Connected and Autonomous Vehicles (CCAV) is facilitating £18.5 million of funding through the “Commercialising Connected and Automated Mobility: Supply Chain” (CCAMSC) competition. This underscores the commitment to supporting the UK’s capabilities in self-driving technologies. rFpro is involved in two projects, Sim4CAMSens and DeepSafe:
Sim4CAMSens: Advancing Sensor Evaluation Frameworks
Sim4CAMSens, receiving £2 million in funding, is led by Claytex with a consortium that includes rFpro, Syselek, Oxford RF, WMG, National Physical Laboratory, Compound Semiconductor Applications Catapult, and AESIN. The project’s primary objective is to enable an accurate representation of Automated Driving System (ADS) sensors in simulation.
DeepSafe: Breaking Barriers in Self-Driving Vehicle Deployment
A further £2 million has been allocated to the DeepSafe project, led by dRISK.ai. The DeepSafe consortium, featuring DG Cities, Imperial College London, Claytex, and rFpro, will focus on overcoming critical barriers to the commercialisation and deployment of self-driving vehicles, spanning data collection, simulation fidelity and scalability.
In addition to technological advancements, the grants awarded to these projects contribute to supporting innovation in the industry, fostering job creation, and attracting investment. The objective is to build the capacity to develop autonomous vehicle technology in the UK and export it to the rest of the world.

Conclusion
As the industry moves towards a future dominated by more automated and intelligent vehicles, the role of simulation in sensor development becomes increasingly vital. The synthetic training dataset advancements highlight a commitment to pushing technological boundaries, with a focus on efficiency, cost-effectiveness, and, ultimately, the realisation of safer vehicles sooner.
Matt Daley, Technical Director rFpro