Simulation-Based Validation: An Efficient Validation Solution for Automated Driving Systems
The safety of high-level automated driving systems (SAE Level 3-5 ADS) poses new challenges to validation tools beyond the functional safety validation of advanced driver assistance systems (ADAS). Unlike with ADAS, the driver no longer plays an indispensable role in the operation of a high-level automated driving system: when the system fails to monitor the driving environment correctly or does not choose the right driving behavior for the next moment, the driver cannot be expected to step in with a corrective action. According to SAE, an ADS must at least be able to monitor the driving environment on its own, without a driver. This not only raises requirements on the functional safety of the systems themselves, but also places higher demands on the validation of that functional safety.
To meet these higher validation requirements, stricter validation standards and a broader, richer collection of validation test cases are necessary. This, however, brings cost and efficiency challenges for validation testing. To address the challenges of ADS validation testing and the demands on its effectiveness and efficiency, simulation-based testing has become a favorable tool to simplify scene construction, accelerate test execution and reduce validation cost.

GaiA Framework: The GaiA (Generatable artificial interactive Automation) system is a simulation-based development framework and tool developed by PilotD Automotive (Shanghai) Co., Ltd., based on core German technology in automated driving simulation, for the development and validation of automated driving systems and advanced driver assistance systems.

Software features:
● Multi-coordinate Road Constructor: The software can build road sections of different widths, lengths, pitch/roll angles and curvatures, as well as intersections. The coefficient of friction and road accessories such as traffic lanes, non-motorized lanes or sidewalks can be defined separately for each road segment. In addition, roadside objects from the model library can be added to the simulated scene, and the user can extend the library at any time. This module also supports the reproduction of captured real-world scenes.
● Traffic Object Builder: Traffic participants (including bicycles and pedestrians) can be generated easily. Their motion states and behaviors can be set individually or by a standard automatic motion mode (AI intelligent traffic) that covers most real-world traffic conditions.
● Vehicle Characteristics: The vehicle can be given a realistic dynamic model, and perception sensor models for different purposes (ACC sensors, automatic parking sensors, etc.) can be mounted at different locations on the vehicle, while simulation efficiency is preserved.
● Automatic Test Control: The system can load and run a set of test scenes and record whenever a specific event occurs. After the tests, it passes this information to the test report generator, enabling unattended automatic testing.
● Weather Representation: GaiA's weather control module can change the weather during automatic testing to simulate real-world conditions such as fog, rain and snow, including their influence on perception sensors such as millimeter-wave radar, and on vehicle control.
● Chart Drawing: In the graphics interface, GaiA can plot data from vehicles and sensors in real time, which helps test engineers locate system problems more efficiently.
● Rich High-fidelity Extendable Tool Model Library: GaiA can integrate tools and models from PilotD's extensive simulation tool library, including, for example, two-track vehicle dynamics models, physical millimeter-wave radar models and physical lidar models. With these extensions, the simulation results become more realistic, and the validation results carry more constructive value for product development and validation.
● Interactive Automated Driving Simulation: GaiA offers interconnected simulation of multiple automated driving hosts in the same virtual scene, enabling tests of system-to-system interaction stability, reliability, and system functions based on interactive communication.

Software editions:
● Standard Edition: The Standard Edition provides a stand-alone solution for simulation-based automated driving testing. The program is optimized for stand-alone use: while providing most functions of the software, it guarantees efficient operation, the possibility of accelerated testing and the ability to run long-term tests. It is suitable for SiL validation of all types of automated driving decision algorithms, sensor data processing algorithms and low-level control algorithms.
● Net Edition: The Net Edition is a cloud-computing and interactive testing solution based on the Standard Edition.
1) Built on a server/client architecture, the Net Edition enables large-scale parallel computing in the cloud. Multiple clients can connect at the same time, and simulations can be controlled in batches.
2) Simultaneous online operation by multiple users makes it possible to perform interactive validation tests between multiple automated driving strategies, or between multiple samples of a single strategy.

● RT Edition: The RT (Real-Time) Edition is optimized to guarantee strong real-time performance of the simulation, making validation tests with high data-synchronization requirements, such as HiL/ViL, possible. At the same time, the RT Edition provides a rich set of external data interfaces so users can get started quickly and begin experimenting right away.
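The batch execution and result aggregation described above can be sketched as follows. This is purely illustrative: `run_scene` is a hypothetical stand-in for one simulation job, and in the real Net Edition the work would be dispatched to cloud-side server instances rather than local worker threads.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for one client-submitted simulation job. A real
# Net Edition job would run on a cloud-side server instance.
def run_scene(scene_id: str) -> dict:
    # ...load the scene, run it, record events of interest...
    return {"scene": scene_id, "passed": True}

scene_ids = [f"scene_{i:03d}" for i in range(8)]

# Batch manipulation: submit all scenes at once and collect the results.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_scene, scene_ids))

# Aggregate the outcomes for a test report generator.
report = {"total": len(results),
          "passed": sum(r["passed"] for r in results)}
print(report)  # {'total': 8, 'passed': 8}
```

The same pattern scales from a handful of local workers to a large cloud pool: only the executor changes, not the per-scene logic.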
PilotD offers a variety of models for the testing, calibration and high-fidelity simulation of environment perception sensors for automated driving systems. These allow our customers to test sensors, characterize their specific performance, and take that performance into account through simulation-based validation while the automated driving system is being developed.
Test & Calibration: PilotD has a comprehensive test and calibration system and process for environment perception sensors. Through this process, PilotD provides customers with the various technical indicators of different sensor types, as well as an evaluation of the performance, strengths and weaknesses of sensor characteristics in essential specific driving situations. Especially for active sensors (millimeter-wave radar/lidar), PilotD can also provide technical support and services in environment and target measurement and reproduction (e.g. RCS) through this process.

Simulation and Modeling: The models fall into two categories: active environment perception sensor models (for example, millimeter-wave radar, lidar or ultrasonic sensor models) and passive environment perception sensor models (for example, camera or far-infrared sensor models). In addition, through PilotD's simulation-based validation environment for automated driving systems, GaiA, and its flexible internal modular structure, the two types of models described above can also be integrated. The integrated model can even be used to simulate some of the new compound sensors that are still in development (e.g. 3D-ToF sensors).

Physical Simulation of Passive Environment Perception Sensors (Camera/Far Infrared Sensors):
1. For passive environment perception sensors, image processing-based recognition algorithms work primarily through signal processing of the images taken by the camera.
2. The signal-processing chain of the recognition process is reproduced, including the camera's A/D converter, the nonlinear sensitivity of the sensor, its dynamic response characteristics, filter masks, etc.
3. Physically, the effects of external conditions (e.g. lighting conditions, weather, etc.) on the world as detected by passive environment perception sensors are also simulated.

Physical Simulation of Active Environment Perception Sensors (Millimeter-Wave Radar/Lidar):
1. Compared to the real world visible to humans, the world seen by active environment perception sensors is completely different.
2. In simulation-based validation, the simulated vehicle needs to be equipped with high-fidelity virtual environment perception sensor models to make the validation meaningful.
3. We simulate detailed physical phenomena such as multipath reflection of electromagnetic waves, as well as dynamic sensor performance such as detection dropout rate, target resolution, measurement inaccuracy and "ghost" objects, to achieve the high fidelity required of the sensor models.
4. As sensor output, in addition to the standard object list, the user can even obtain the field-strength distribution of the electromagnetic wave received by the sensor, also known as raw measurement data. This type of data enables an effective combination of accurate sensor-based validation and efficient system development.
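The dynamic sensor effects listed above (detection dropout, measurement noise, "ghost" objects) can be illustrated with a toy range-only model. All function names and parameter values here are invented for demonstration; a physical radar model is vastly more detailed.

```python
import random

def radar_scan(true_ranges_m, dropout_p=0.1, sigma_m=0.15, ghost_p=0.05,
               rng=None):
    """Toy active-sensor model: drop some detections, perturb the
    measured range, and occasionally insert a spurious 'ghost' target."""
    rng = rng or random.Random(0)  # seeded for reproducibility
    detections = []
    for r in true_ranges_m:
        if rng.random() < dropout_p:                    # detection dropout
            continue
        detections.append(r + rng.gauss(0.0, sigma_m))  # measurement noise
    if rng.random() < ghost_p:                          # "ghost" object
        detections.append(rng.uniform(1.0, 100.0))
    return detections

dets = radar_scan([12.0, 35.0, 60.0])
```

A validation pipeline fed with such perturbed object lists, rather than ground truth, exposes how robust the downstream tracking and decision logic really is.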
We have developed a dynamic model for the test vehicle based on a two-track model.
For the development and commissioning of traditional vehicles, this model is of great significance for parameterization and parallel development in the initial stage of vehicle development. Before a prototype vehicle exists, the engineer can use the high-fidelity vehicle dynamics model:
1) to parameterize the vehicle and predict the influence of dynamic parameters and vehicle geometry on vehicle dynamics, so that potential dynamic problems and design flaws are discovered in time;
2) to develop the vehicle's other system components in parallel in the early development stage. Based on the vehicle dynamics computed by the model, systems that depend on vehicle dynamic behavior can be developed without a real prototype. After the prototype vehicle is produced, only fine-tuning against the real vehicle's performance is necessary. Such a parallel development process saves the automobile manufacturer development time and cost compared to the traditional procedure.
In the development and validation of automated driving systems, in addition to the advantages above, the model can be integrated into a simulation-based validation environment (for example, PilotD's GaiA system) to explore the influence of dynamic behavior on detection and measurement, as well as the plausibility of the system's response at the limits of vehicle dynamics.
Through this model, the rotation, displacement, force and torque of the four wheels and the vehicle body on each coordinate axis can be calculated. On the one hand, the environment perception sensors (radar, lidar, camera, etc.) are fixed to the vehicle body, and their detection results mostly reflect the relative relationship between the ego vehicle and other objects (for example, relative speed, relative distance, relative angle). Instability in the vehicle's dynamic behavior therefore strongly affects the quality of detection and of the subsequent environment comprehension built on it. Consequently, in the design of an automated driving system and in subsequent hardware- and software-in-the-loop testing, the accuracy of the vehicle dynamics simulation has a decisive influence on the operating quality and efficiency of the automated driving system. This is one of the reasons why such a model is indispensable in automated driving simulation.
On the other hand, in addition to the sensing system, an automated driving system also includes the behavior decision system and the low-level control system. The decision and control quality of these two parts is closely related to the dynamic characteristics of the vehicle. For example, if a car's braking distance is relatively long, the required braking behavior must be decided earlier.
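The braking-distance point can be made concrete with elementary kinematics. The numbers below are illustrative, and constant deceleration is assumed:

```python
# Stopping distance = reaction distance + v^2 / (2 * a).
# A vehicle that decelerates more weakly must decide to brake earlier.
def braking_distance(v_mps: float, decel_mps2: float,
                     reaction_s: float = 0.0) -> float:
    return v_mps * reaction_s + v_mps ** 2 / (2.0 * decel_mps2)

d_strong = braking_distance(30.0, 9.0)  # strong brakes: 50.0 m
d_weak = braking_distance(30.0, 5.0)    # weak brakes / heavy load: 90.0 m
# At 30 m/s the weaker vehicle must commit to braking 40 m earlier,
# i.e. more than a second sooner -- a decision-timing difference the
# behavior planner must know about.
```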
HiFi Vehicle Dynamic Model:
1. The vehicle model must include a high-fidelity dynamic model that can truly represent every important physical quantity in vehicle dynamics.
2. At the same time, the model must be simple enough that simulation-based validation tests can still be performed efficiently.
3. Therefore, we developed a vehicle dynamic model based on a realistic two-track model that does not compromise simulation efficiency.
4. The user can see exactly how the forces and torques acting on parts such as the body and the tires are calculated. In complex applications, the user can modify the model accordingly; and when an error is found during validation, it is easier to trace the specific cause: is it a problem in the system design, or a simulation error?
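As a sketch of what such a model computes, here is a heavily simplified planar two-track step: four wheels, a linear lateral tire model, and lateral/yaw dynamics only. All parameter values are illustrative, and a production model is far more detailed (longitudinal dynamics, load transfer, nonlinear tires, suspension, etc.).

```python
import math

class TwoTrack:
    """Toy planar two-track model: lateral velocity and yaw rate only."""
    def __init__(self):
        self.m = 1500.0              # vehicle mass [kg]
        self.Iz = 2500.0             # yaw inertia [kg m^2]
        self.lf, self.lr = 1.2, 1.4  # CG to front/rear axle [m]
        self.tw = 1.6                # track width [m]
        self.c = 80000.0             # cornering stiffness per tire [N/rad]
        self.vy = 0.0                # lateral velocity [m/s]
        self.r = 0.0                 # yaw rate [rad/s]

    def wheel_positions(self):
        h = self.tw / 2.0            # (x, y) of each wheel relative to CG
        return [(self.lf, h), (self.lf, -h), (-self.lr, h), (-self.lr, -h)]

    def step(self, vx, steer_front, dt):
        fy_total, mz_total = 0.0, 0.0
        for (x, y) in self.wheel_positions():
            delta = steer_front if x > 0 else 0.0   # front-wheel steering
            # slip angle of this wheel from the local velocity direction
            alpha = delta - math.atan2(self.vy + self.r * x,
                                       vx - self.r * y)
            fy = self.c * alpha       # linear lateral tire force
            fy_total += fy
            mz_total += fy * x        # yaw moment about the CG
        self.vy += (fy_total / self.m - self.r * vx) * dt
        self.r += (mz_total / self.Iz) * dt

veh = TwoTrack()
for _ in range(100):   # 1 s at 100 Hz with a small constant left steer
    veh.step(vx=20.0, steer_front=0.02, dt=0.01)
```

Because every per-wheel force and moment is computed explicitly, a user can inspect or replace any term, which is exactly the traceability point 4 above describes.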
Subjective Driving Evaluation:
The use of automated driving systems from Level 1 to Level 3 is inseparable from the driver's intervention. Regardless of whether the driver is responsible for the driving control itself, the driver's subjective feeling directly affects driving safety. The evaluation of the driver's subjective feeling is therefore an important indicator in the evaluation of such systems. Moreover, the driver's subjective feeling remains crucial even for automated driving systems at Level 4 and above, as the most direct indicator of ride comfort. In summary, subjective driver evaluation is an important standard reflecting the quality, and even the safety, of an automated driving system.

Driving Simulator: The driving simulator is today considered one of the main ways to test and improve the driver's subjective feeling during automated driving development. Its advantages are:
1) The driving simulator can provide the driver with a complete driving or riding experience from the very beginning of the development of an automated driving system, without building complex hardware prototypes or prototype vehicles.
2) All algorithms and hardware (such as the spring-damper system) can be replaced and adjusted simply by changing parameters, without modifying hardware or incurring the large mounting and dismounting costs of hardware adjustments.
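The "adjust by parameterization" point can be illustrated with a toy quarter-car spring-damper model: in simulation, swapping the damper is just changing a number. All values below are invented for demonstration.

```python
def settle_time(m=300.0, k=30000.0, c=3000.0, dt=0.001, z0=0.05):
    """Time for a quarter-car body (mass m, spring k, damper c) to settle
    after an initial 5 cm displacement, via semi-implicit Euler."""
    z, v = z0, 0.0                 # displacement [m], velocity [m/s]
    for i in range(20000):         # integrate up to 20 s
        a = (-k * z - c * v) / m   # spring + damper acceleration
        v += a * dt
        z += v * dt
        if abs(z) < 1e-4 and abs(v) < 1e-3:
            return i * dt          # settled
    return None

# "Replacing" the damper hardware is one keyword argument:
t_soft = settle_time(c=1500.0)     # softly damped suspension
t_firm = settle_time(c=4000.0)     # firmly damped suspension
```

Each variant can then be driven in the simulator and rated subjectively, with no physical mounting or dismounting in between.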
However, using the driving simulator to evaluate the driver's subjective feeling places high demands on the realism of the simulator's representation. PilotD applies the latest virtual reality (VR/AR/MR) technology to visual representation in the construction of its driving simulator. Combined with the driving environment simulation platform GaiA, a highly realistic driving experience is reproduced, which makes the evaluation of the driver's subjective feeling more accurate and convincing.
The progression from VR to AR to MR embodies three phases of virtual reality technology.

VR: VR (Virtual Reality) technology integrates human exploration and interaction into a computer-generated three-dimensional world. VR is the beginning and the foundation of AR and MR. Through VR, humans can freely observe a purely virtual world and act within it to change the perspective of observation. The capabilities of the VR device limit the degrees of freedom of the behavior humans can perform in this three-dimensional environment; devices are generally divided into three-degree-of-freedom and six-degree-of-freedom systems. PilotD's VR reproduction achieves six-degree-of-freedom exploration: the device supports the driver's translation and rotation along the X, Y and Z axes, providing the most realistic cockpit experience.

AR: AR (Augmented Reality) technology is a derivative of VR technology. Where VR is purely virtual visual representation, AR superimposes the virtual world on the real world. In automated driving simulation and visual representation, VR alone is not enough to represent reality: in a VR world, the driver's hands and feet, and the various HMIs (such as the steering wheel, brake pedal and gas pedal), are three-dimensional objects simulated by the computer. The behavior of these virtual objects does not necessarily follow the driver's actual behavior, which creates a sense of unreality and prevents the driver from adapting to the operation. Driving is a complex operational behavior for humans, based largely on conditioned reflexes; any unreality leads to differences in subjective feeling, and hence to differences in the quality of the evaluation. PilotD therefore also integrates AR technology into the VR-based driving simulator. The system automatically locates the real cockpit elements and reproduces their positions in the driver's perspective.
Real HMIs such as the steering wheel, as well as the hands and feet operating them, remain real in the driver's view. The discomfort described above is thereby avoided, and the quality of subjective feeling measurement and evaluation is further improved.

MR: MR (Mixed Reality) technology is an improvement and enhancement of AR technology in terms of realism. Building on AR, MR further analyzes real-world information and embeds the virtual part generated by AR technology into the real world more accurately. An MR-based driving simulator provides stronger resistance to interference and a more seamless combination of the real scene inside the cockpit with the virtual scene outside it, giving the driver the most realistic driving experience possible.