
Authored by: Janet Ooi, IoT Industry and Solutions – Keysight Technologies
As you develop autonomous vehicles (AVs), how do you ensure your automotive radar sensors “see” the intended scenarios through software simulations early in the development cycle? How do you know that your lab tests are thorough enough to ensure vehicle safety on the road?
Developing autonomous driving algorithms is complex because the task involves many variables, including replicating complex, repeatable scenes in a lab environment. The more accurate the scenes, the faster your algorithms can be trained. This is where current in-lab solutions fall short.
The Evolution of Radar Sensors
Radar sensors have evolved significantly over the last decade. In the automotive industry, radar sensors are a critical part of Advanced Driver Assistance Systems (ADAS) and enable features such as blind-spot detection, lane departure warning or correction, automatic emergency braking and so much more.
Radar technology continues to evolve with higher frequencies, wider bandwidth and better resolution. In fact, advances in radar sensor technology push the automation level in vehicles to Level 3+ or 4, requiring the test and validation of more and more scenarios. As a result, automotive original equipment manufacturers (OEMs) and Tier 1 suppliers need to perform more tests with a higher degree of complexity.
Reimagine Test Tactics
Picture a densely populated urban area with many road intersections and turning scenarios. There are numerous pedestrians, cyclists, e-scooters, and even three-wheeled cargo delivery bikes.
The conventional way of testing the functionalities and algorithms of radar sensors is by driving on roads for millions of miles. But this tactic cannot cover all the potential scenarios, including the one-in-a-million ones. In fact, most of the tests needed to develop and validate AV systems must move to simulation much earlier in the development cycle.
But How?
Automotive OEMs need to emulate real-world scenarios that enable validation of actual sensors, electronic control unit code, Artificial Intelligence logic, and more. Testing the physical hardware in a simulated environment close to real-world scenarios ensures that autonomous vehicles will behave as expected on the road.
The simulated environment and the rendered conditions need to include vehicle dynamics, weather conditions, and surrounding objects as well as real-time responses, in order to test the responses of the radar sensors. However, gaps remain in the technology today that hinder real-world scene renderings.
Technology Gaps in Radar Target Simulation
Number of targets and field of view
Some systems use multiple radar target simulators (RTSs), each presenting point targets to radar sensors and simulating horizontal and vertical positions by mechanically moving antennas around. Mechanical automation slows overall test time. Other solutions create a wall of antennas driven by only a few RTSs; these allow the radar sensor to see only a handful of objects within a very narrow angle in front of it, leaving blind spots.
Minimum distance
Realistic traffic scenes require objects to be emulated very close to the radar unit. For example, when approaching a stoplight, cars may be two metres or less apart, bikes or scooters may weave through the lane, and pedestrians may cross the road very near the car. Passing such tests is of utmost importance for the vehicle's ADAS safety features.
Resolution between objects
The resolution between objects refers to the details of the scene and the confidence to know that the algorithm you are testing can distinguish between two objects that are close together. If you cannot identify the objects correctly, it is difficult to fully test the sensors, the algorithms, and the decisions that rely on the data streaming from the radar sensors.
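The ability to separate two closely spaced objects in range is tied to the radar's bandwidth, which is why the article links wider bandwidth to better resolution. As a rough illustration (using the standard range-resolution relationship Δr = c / 2B, not any product-specific figure), a quick sketch:

```python
# Illustrative sketch: how radar bandwidth limits range resolution.
# The relationship delta_r = c / (2 * B) is the standard textbook formula;
# the bandwidth values below are examples, not specifications of any device.

C = 3.0e8  # speed of light in m/s (approximate)

def range_resolution(bandwidth_hz: float) -> float:
    """Smallest range separation at which two targets can be distinguished."""
    return C / (2.0 * bandwidth_hz)

# A 1 GHz sweep resolves objects ~0.15 m apart; 4 GHz gets below 4 cm,
# which is why wider-bandwidth automotive radar can separate a pedestrian
# from the car they are standing next to.
print(range_resolution(1e9))  # 0.15 (metres)
print(range_resolution(4e9))  # 0.0375 (metres)
```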
Enable Next-Generation Vehicle Autonomy with In-Lab Full Scene Emulation
The robustness of autonomous driving (AD) algorithms depends on how comprehensive the testing is. This is why Keysight created the Radar Scene Emulator (RSE). Keysight RSE enables OEMs in the automotive industry to test autonomous drive systems with radar sensors faster and with highly complex, multi-target scenes. The RSE allows you to create scenarios with up to 512 objects, and at distances as close as 1.5 metres from the vehicle.
The scenarios can also have dependent attributes, including speed, direction, distance from the vehicle, angle and more.
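To make the object-count and minimum-distance figures concrete, here is a minimal sketch of how a test harness might represent and sanity-check such a scene. All class and field names are hypothetical illustrations, not Keysight's RSE API; only the 512-object and 1.5-metre limits come from the text above.

```python
# Illustrative scene model for in-lab radar testing.
# SceneObject and validate_scene are invented for this sketch;
# MAX_OBJECTS and MIN_DISTANCE_M are the limits cited in the article.
from dataclasses import dataclass

MAX_OBJECTS = 512     # maximum objects per emulated scene
MIN_DISTANCE_M = 1.5  # closest emulated distance from the vehicle

@dataclass
class SceneObject:
    kind: str           # e.g. "car", "pedestrian", "cyclist"
    distance_m: float   # range from the radar under test
    angle_deg: float    # azimuth relative to boresight
    speed_mps: float    # radial speed of the object
    heading_deg: float  # direction of travel

def validate_scene(objects: list) -> list:
    """Return a list of constraint violations for a candidate scene."""
    errors = []
    if len(objects) > MAX_OBJECTS:
        errors.append(f"{len(objects)} objects exceeds the {MAX_OBJECTS} limit")
    for i, obj in enumerate(objects):
        if obj.distance_m < MIN_DISTANCE_M:
            errors.append(
                f"object {i} ({obj.kind}) at {obj.distance_m} m is "
                f"closer than {MIN_DISTANCE_M} m"
            )
    return errors

scene = [
    SceneObject("car", 2.0, 0.0, 0.0, 0.0),
    SceneObject("pedestrian", 1.2, -15.0, 1.4, 90.0),  # too close: flagged
]
print(validate_scene(scene))
```

A real emulator would of course drive RF hardware from such a description; the point here is only that each object carries the dependent attributes the article lists (speed, direction, distance, angle).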
With Keysight’s Radar Scene Emulator, automotive OEMs can shift testing of complex driving scenarios to the lab, eliminating the need to drive millions of miles and dramatically accelerating testing. Using the RSE’s industry-first approach to thoroughly test decisions earlier in the cycle against complex, repeatable, high-density scenes with both stationary and moving objects, automotive OEMs can accelerate the insights that come from ADAS and AD algorithms.
Sharpen Your ADAS Radar Vision—learn more about Keysight Radar Scene Emulator.

