Author: Lifelong Learning
Introduction
Autonomous driving perception today relies on a suite of sensing devices mounted on the vehicle, such as LiDAR, cameras, and millimeter-wave radar. These sensors collect data about the surrounding traffic environment so that the vehicle can perceive the world, in some respects even better than the human eye, providing richer and more accurate environmental information to the decision-making and planning modules and thereby enabling safe autonomous driving.
A rich sensor stack makes autonomous driving safer, but no single device can capture all of the required data, and each struggles in certain extreme environments. For example, LiDAR measurements can deviate significantly in fog or heavy rain; traditional millimeter-wave radar cannot measure height, so it is hard to tell whether a stationary object ahead is on the ground or overhead; and cameras only capture 2D images, so even with the help of deep learning they cannot accurately measure the distance between surrounding objects and the vehicle. Autonomous vehicles therefore need different sensors working together so that their perception accuracy and capability are no worse than a human driver's under any circumstances. The emergence of 4D millimeter-wave radar stands to revolutionize autonomous driving.
What Is 4D Millimeter-Wave Radar
Millimeter-wave radar is one of the most important sensors in autonomous driving perception. However, because traditional radar cannot measure height, it struggles to determine whether a stationary object ahead is on the ground or in the air: when it encounters objects such as manhole covers, speed bumps, overpasses, or traffic signs, it cannot measure their height. 4D millimeter-wave radar, also known as imaging radar, solves this problem by adding a fourth measurement dimension, elevation (height), to the range, velocity, and azimuth data of traditional millimeter-wave radar, enabling a better understanding and mapping of the environment and making the collected traffic data more accurate.
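To make the fourth dimension concrete, the minimal Python sketch below (not taken from any radar SDK; the names and the sensor mounting height are illustrative assumptions) shows how a single 4D detection, range, azimuth, elevation, and Doppler velocity, can be projected into vehicle coordinates. Without the elevation angle, the height term simply cannot be recovered, which is exactly the limitation of traditional 3D radar.

```python
import math
from dataclasses import dataclass

@dataclass
class Detection4D:
    """One 4D radar detection: the classic three measurements plus elevation."""
    range_m: float        # radial distance to the target (m)
    azimuth_deg: float    # horizontal angle, 0 = straight ahead (deg)
    elevation_deg: float  # vertical angle, the "fourth dimension" (deg)
    doppler_mps: float    # radial (relative) velocity (m/s)

def to_cartesian(d: Detection4D, sensor_height_m: float = 0.5):
    """Project a detection into vehicle coordinates (x forward, y left, z up).

    Without the elevation angle, z cannot be recovered, which is why a
    traditional 3D radar cannot tell an overhead sign from an obstacle.
    """
    az = math.radians(d.azimuth_deg)
    el = math.radians(d.elevation_deg)
    x = d.range_m * math.cos(el) * math.cos(az)
    y = d.range_m * math.cos(el) * math.sin(az)
    z = d.range_m * math.sin(el) + sensor_height_m  # height above the road
    return x, y, z

# Example: a stationary return 80 m ahead, 3.5 degrees above the horizon,
# sits roughly 5 m above the road -- an overpass, not an in-path obstacle.
print(to_cartesian(Detection4D(80.0, 0.0, 3.5, 0.0)))
```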
As early as 2020, Tesla announced that it would add 4D sensing technology to its cars, doubling the working range to capture more traffic information. A 4D millimeter-wave radar can effectively resolve the contour, behavior, and category of detected targets, adapt to more complex roads, recognize smaller objects, and monitor occluded, stationary, or laterally moving objects, providing more reliable information for decision-making.
4D Millimeter-Wave Radar Solutions
4D millimeter-wave radar was first proposed by an Israeli company in 2019. In early 2020, Waymo announced 4D millimeter-wave radar in its fifth-generation autonomous driving perception suite. In the same year, Continental released the first mass-production solution for 4D millimeter-wave radar and stated that BMW would be the first automaker to adopt it. At CES 2021, many manufacturers unveiled products using 4D millimeter-wave radar, and companies such as NXP, Texas Instruments, and Mobileye successively launched or updated their own 4D millimeter-wave radar solutions. Last year, Aptiv unveiled its next-generation L1-L3 autonomous driving platform, stating that its sensor suite includes 4D millimeter-wave radar. ZF announced that it had received a production order for 4D millimeter-wave radar from SAIC Group and would begin supplying it in 2022. Bosch introduced the top configuration of its fifth-generation radar, a 4D millimeter-wave radar, to the Chinese market for the first time.
Like traditional radar, 4D millimeter-wave radar does not show significant deviations in extreme weather. With the added elevation (pitch) angle it can output a point cloud, which means it can measure not only the range, relative velocity, and azimuth of objects but also the vertical height of objects ahead, including stationary and laterally moving ones, closing the gap in traditional radar's poor detection of static targets.
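As a simple illustration of how that height estimate closes the static-target gap, the sketch below gates a stationary detection by its estimated height above the road. The thresholds are illustrative assumptions for this example, not values from the article or from any production system.

```python
def classify_stationary(height_above_road_m: float,
                        overhead_clearance_m: float = 4.5,
                        ground_clutter_m: float = 0.2) -> str:
    """Rough gating of a stationary radar return by estimated height.

    A traditional radar has no height channel, so it must either ignore
    stationary returns (missing real obstacles) or brake for overpasses
    and signs. With an elevation measurement, the decision becomes simple.
    """
    if height_above_road_m >= overhead_clearance_m:
        return "overhead structure (bridge/sign) - ignore"
    if height_above_road_m <= ground_clutter_m:
        return "ground clutter (manhole cover, joint) - drivable"
    return "in-path obstacle - react"

print(classify_stationary(5.4))   # overhead structure (bridge/sign) - ignore
print(classify_stationary(0.05))  # ground clutter (manhole cover, joint) - drivable
print(classify_stationary(1.2))   # in-path obstacle - react
```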
Currently, there are two main technical solutions for 4D millimeter-wave radar:
- One is to independently develop multi-channel array RF chipsets, radar processors, and AI-based post-processing software algorithms.
- The other builds on traditional radar chipset suppliers' solutions, achieving dense point cloud output and recognition through multi-chip cascading or software algorithms (see the sketch below).
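To see why multi-chip cascading yields denser point clouds, recall that a MIMO radar forms a virtual array with (total TX antennas) × (total RX antennas) channels. The rough estimate below is a sketch only: the 3TX/4RX-per-chip counts are hypothetical, and the resolution formula is the standard Rayleigh-type approximation for a uniform half-wavelength-spaced array, assuming, purely for illustration, that all virtual elements line up in one dimension.

```python
import math

def virtual_channels(n_chips: int, tx_per_chip: int = 3, rx_per_chip: int = 4) -> int:
    """MIMO virtual array size: total TX antennas x total RX antennas."""
    return (n_chips * tx_per_chip) * (n_chips * rx_per_chip)

def approx_angular_resolution_deg(n_virtual: int) -> float:
    """Rayleigh-type estimate for a uniform half-wavelength-spaced array:
    resolution ~ 2 / N radians."""
    return math.degrees(2.0 / n_virtual)

for chips in (1, 2, 4):
    n = virtual_channels(chips)
    print(f"{chips} chip(s): {n:4d} virtual channels, "
          f"~{approx_angular_resolution_deg(n):.1f} deg angular resolution")
```

The point of the exercise is that cascading several conventional chips grows the virtual aperture quadratically in the chip count, which is what turns a sparse handful of detections into an image-like point cloud.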
The main feature of 4D millimeter-wave radar is its high angular resolution: a front-facing 4D millimeter-wave radar can achieve roughly 1 degree of resolution in azimuth and 2 degrees in elevation. Mounted on an autonomous vehicle, it can directly detect the outlines of surrounding objects while mapping road information. In dense scenes, such as mixed pedestrian and vehicle traffic, it can directly identify individual objects and determine whether, and in which direction, they are moving. It can also measure geometric shapes, such as the length and width of a tunnel in tunnel scenes.

The advent of 4D millimeter-wave radar targets the shortcomings of traditional millimeter-wave radar. It is not merely an upgrade from 3D to 4D; it brings comprehensive improvements in detection accuracy, sensitivity, resolution, and overall performance, giving autonomous driving a higher level of safety, and it is expected to make millimeter-wave radar one of the core sensors in autonomous driving systems.
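To put the 1-degree azimuth and 2-degree elevation figures above in physical terms, cross-range resolution is approximately range × angular resolution (in radians). A quick check, as a rough small-angle approximation rather than a specification of any particular sensor:

```python
import math

def cross_range_resolution_m(range_m: float, angular_res_deg: float) -> float:
    """Smallest lateral separation two targets need at a given range for a
    radar with the given angular resolution to resolve them as two objects."""
    return range_m * math.radians(angular_res_deg)

for r in (25, 50, 100, 200):
    print(f"at {r:3d} m: azimuth (1 deg) ~ {cross_range_resolution_m(r, 1.0):.2f} m, "
          f"elevation (2 deg) ~ {cross_range_resolution_m(r, 2.0):.2f} m")
```

At 100 m, for example, a 1-degree azimuth resolution corresponds to roughly 1.7 m of lateral separation, enough to separate adjacent vehicles, while at short range it becomes fine enough to outline pedestrians next to cars.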
Future Development Trends of 4D Millimeter-Wave Radar
According to industry insiders, large-scale deployment of 4D millimeter-wave radar is imminent. Commercially, the technology is maturing and many innovative algorithms are being commercialized. Many new models from various automakers already have demand for it, especially for automated parking and for Level 3 and higher autonomous driving. In fact, since last year many products equipped with 4D millimeter-wave radar have undergone road testing in preparation for mass production.
For example, NXP announced that the industry's first dedicated 16 nm millimeter-wave radar processor, the S32R45, will enter mass production in the first half of this year. Mobileye, a subsidiary of Intel, is also actively promoting the development and application of 4D millimeter-wave radar. In his speech at CES this year, Mobileye CEO Amnon Shashua emphasized the application scenarios of 4D imaging millimeter-wave radar in cars. He said, "By 2025, we want only millimeter-wave radar, not LiDAR, except for the front of the car." In Mobileye's plan, consumer-level autonomous driving solutions based on millimeter-wave radar and LiDAR will launch by 2025: a vehicle equipped with a LiDAR subsystem will only need a forward-facing LiDAR plus 360-degree millimeter-wave radar coverage around the vehicle to carry out automated driving tasks.

The consensus within the industry is that no single sensor will dominate autonomous driving. Given the many segmented use cases in the current market and the varying levels of autonomy, there is no one-size-fits-all sensor solution. Cameras and radars are highly likely to coexist because their strengths and weaknesses are complementary. LiDAR is a special case: the author believes it can be replaced by solutions based on 4D millimeter-wave radar. 4D millimeter-wave radar is still in the early stages of development, but the author expects its performance to improve significantly and, ideally, to reach the point where it can replace LiDAR.
This article is a translation by ChatGPT of a Chinese report from 42HOW. If you have any questions about it, please email bd@42how.com.