self-drive

Virtual world created to test self-drive cars

Riley

Breakthrough technology will enable engineers to test and approve self-drive cars in a virtual environment.

UK simulation software specialist rFpro has developed a new technique that helps overcome the safety concerns around testing and developing connected autonomous vehicles (CAVs) on public roads.

By developing a physical model of the real world, known as the Ground Truth, rFpro can accurately test a vehicle’s perception of its surroundings, something that has not previously been possible.

This will enable legislators to define an approval process for a vehicle within a completely virtual environment, certifying it as safe to use on real roads.

The technique is already being used to validate vehicle safety in Euro NCAP tests.

rFpro provides driving simulation software and 3D content for Deep Learning Autonomous Driving, Advanced Driver Assistance Systems (ADAS) and Vehicle Dynamics testing and validation.

It is being used by OEMs and Tier-1s to train, test and validate Deep Learning systems for ADAS and autonomous applications.

When developing systems based on machine learning from sensor feeds, such as camera, LiDAR and radar, the quality of the 3D environment model is very important.

The more accurate the virtual world is, the greater the correlation will be when progressing to real-world testing.
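The notion of correlation here can be made concrete. As an illustrative sketch only (the article does not specify rFpro's actual metric), the Pearson correlation coefficient between a real sensor trace and its simulated counterpart gives a single figure of merit, with values near 1.0 meaning the virtual world closely tracks the real one:

```python
import math

def pearson_correlation(real, simulated):
    # Pearson correlation between a real sensor trace and its
    # simulated counterpart. Assumes equal-length, non-constant traces.
    n = len(real)
    mean_r = sum(real) / n
    mean_s = sum(simulated) / n
    cov = sum((r - mean_r) * (s - mean_s) for r, s in zip(real, simulated))
    sd_r = math.sqrt(sum((r - mean_r) ** 2 for r in real))
    sd_s = math.sqrt(sum((s - mean_s) ** 2 for s in simulated))
    return cov / (sd_r * sd_s)
```

A perfectly tracking simulated trace scores 1.0; systematic discrepancies in the virtual world pull the score down.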

“Most system modelling begins with ideal sensor models in order to validate the algorithms and control systems of the vehicle, but this bypasses any limitations in the sensors themselves,” rFpro’s technical director, Chris Hoyle, said.

“Difficult lighting conditions, or the reflections in a shop window, can corrupt a sensor’s perception of the vehicle’s surroundings, leading to potentially catastrophic errors.

“Thorough validation of a CAV or ADAS-equipped vehicle must include the sensors’ ability to recognise and characterise the features of its environment.”
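Hoyle's distinction between an ideal sensor model and a physically limited one can be sketched in a few lines. The parameters below (full-well capacity, Gaussian noise level) are illustrative assumptions, not rFpro's actual sensor model:

```python
import random

def ideal_pixel(luminance):
    # Ideal sensor model: perfect linear response, no noise, no limits.
    # Useful for validating control algorithms, but it hides the
    # sensor's own weaknesses.
    return luminance

def physical_pixel(luminance, full_well=1.0, noise_sd=0.02):
    # Physically motivated sketch: Gaussian read noise plus saturation
    # at the sensor's full-well capacity. A bright shop-window
    # reflection saturates the pixel and the detail is lost.
    noisy = luminance + random.gauss(0.0, noise_sd)
    return min(max(noisy, 0.0), full_well)

glare = 1.6                    # reflected sunlight, beyond sensor range
perfect = ideal_pixel(glare)   # 1.6: information preserved
clipped = physical_pixel(glare)  # saturates at 1.0: information lost
```

Testing against the ideal model alone would never expose the saturation failure the physical model reveals.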

The ability to evaluate sensor perception during simulation matters because future legislation is likely to dictate the virtual testing and approval of any autonomous system, before its use on public roads is permitted.

The whole system must therefore be tested in a fully representative virtual environment, not just elements of it.

Because validating an autonomous vehicle requires covering a vast number of miles across a huge number of different environments, it is not feasible to do this in the real world.

Sensor perception is the most challenging aspect because it requires a physically accurate virtual world with high levels of correlation to the real world and physically modelled sensors.

“Physical modelling means simulating the materials and properties of every object encountered by the vehicle and its onboard sensors, rather than just an abstract representation of it as used by most systems,” Hoyle said.

“With several years’ experience of creating digital twins of city streets, rural roads, proving grounds and test tracks, we understand the complexities of modelling features like changing weather conditions or road surfaces.

“Our engineers are constantly being challenged by our customers to bridge the gap between simulated and real-world testing.

“Whether this is eight stereo 4K cameras with live exposure control and real motion blur modelled, or even a radar model picking up the micro-Doppler from a pedestrian moving their arm.

“All of this must be possible for successful simulation and can all be done using rFpro.”
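The micro-Doppler example is concrete physics: a swinging arm imposes a small additional Doppler shift on the radar return, given by the standard two-way relation f_d = 2 * v_r / wavelength. The 77 GHz carrier below is a typical automotive radar band, assumed here for illustration:

```python
C = 3.0e8            # speed of light, m/s
CARRIER_HZ = 77e9    # typical automotive radar band (assumed)

def micro_doppler_shift(radial_velocity_mps, carrier_hz=CARRIER_HZ):
    # Two-way Doppler shift, f_d = 2 * v_r / wavelength, that a
    # physically modelled radar must reproduce for each moving body part.
    wavelength = C / carrier_hz
    return 2.0 * radial_velocity_mps / wavelength

# An arm swinging toward the radar at about 1 m/s shifts the return
# by roughly 500 Hz on top of the pedestrian's bulk Doppler.
arm_shift = micro_doppler_shift(1.0)
```

Reproducing these few-hundred-hertz signatures is what separates an abstract radar model from a physically modelled one.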

Hoyle believes rFpro is unique in providing closed-loop, end-to-end simulation of autonomous systems. A vital element of this process is the inclusion of an accurate vehicle model that responds in a fully representative manner to road-surface changes and control inputs.

“Perception is critical not just from a safety point of view but also for consumer satisfaction,” he said.

“For example, with our system, minute road surface differences are accurately modelled, which means ride comfort can be assessed.

“We can explore the CAV’s ability to identify and avoid potholes, like a human would. Without this, passengers are unlikely to want to ride in an autonomous vehicle again.”

A current UK government project, dRISK, uses rFpro software to validate sensor models against the real world so they can be correlated with actual sensors.

This is seen as an essential stepping stone towards end-to-end validation of autonomous systems through simulation.

“Due to the infinite inputs possible from these sensors, there is a big emerging need for autonomous system test engineers making use of simulation software,” he said.

“Previously there has been around a 1:1 ratio of software engineers to test engineers in the automotive sector.

“We believe this is moving closer towards the avionics sector where there are around 5 test engineers per software engineer to ensure the safety of passengers.”

rFpro’s HiDef models are built around a graphics engine that includes a physically modelled atmosphere, weather and lighting, as well as physically modelled materials for every object in the scene.

Hundreds of kilometres of public road models are available off-the-shelf from rFpro, spanning North America, Asia and Europe, and including multi-lane highways and urban, rural and mountain routes, all copied faithfully from the real world.

rFpro scales from a desktop workstation to a massively parallel real-time test environment connecting to customers’ autonomous driver models and human test drivers.
