On December 8, at the second Baidu Apollo Ecosystem Conference, Baidu released ANP (Apollo Navigation Pilot), a high-level intelligent driving solution commonly known as navigation-assisted driving, built on Apollo Lite, China's only L4-level pure-vision autonomous driving technology.
ANP brings L4-level autonomous driving technology down into the assisted-driving domain. Its distinguishing feature is a pure-vision perception system built on 10 cameras and less than 30 TOPS of computing power, with which it can handle urban driving tasks such as recognizing traffic lights, passing through roundabouts, and making unprotected left turns.
In contrast to other industry solutions centered on lidar, Baidu's pure-vision L4 autonomous driving solution offers a low-cost alternative, given the current shortage of high-performance computing chips that can meet FSD-class or automotive-grade real-time requirements.
With autonomous driving set to become the decisive high ground of the future automotive industry, Baidu hopes to use ANP's low-cost advantage to reach mass production and large-scale commercial deployment quickly, improving the user experience of production autonomous driving systems and helping automakers build the next generation of intelligent cars.
In fact, Baidu first unveiled Apollo Lite to the public last year. The technology can process data from 10 cameras in parallel at 200 frames per second, with the frame-loss rate of any single visual link kept below 5‰. The perception system provides full 360° real-time environmental awareness, with stable detection of forward obstacles out to 240 meters. Equipped with this 10-camera perception system, Baidu's autonomous vehicles can already complete end-to-end, closed-loop autonomous driving on urban roads without relying on high-line-count rotating lidar.

Here, "lightweight" refers both to the sensors (10 cameras delivering 360-degree real-time environmental perception) and to the computation (more than 30 deep learning networks supported by a single GPU, with a total compute requirement of under 30 TOPS). A lightweight product means lower costs and faster large-scale production.
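For a rough sense of scale, the short Python sketch below runs the arithmetic implied by those figures. The even split of frames across cameras and of compute across frames are illustrative assumptions of this sketch, not details given by Baidu.

```python
# Back-of-envelope check of the Apollo Lite figures quoted above.
# Assumptions (not from the article): the 200 fps aggregate is split evenly
# across the 10 cameras, and the <30 TOPS budget is shared evenly per frame.

NUM_CAMERAS = 10
AGGREGATE_FPS = 200          # frames per second across all cameras
MAX_FRAME_LOSS = 5 / 1000    # 5 per mille, per visual link
COMPUTE_BUDGET_TOPS = 30     # total compute budget, tera-ops per second

per_camera_fps = AGGREGATE_FPS / NUM_CAMERAS          # 20 fps per camera (assumed even split)
tops_per_frame = COMPUTE_BUDGET_TOPS / AGGREGATE_FPS  # ~0.15 tera-ops available per frame
max_drops_per_minute = per_camera_fps * 60 * MAX_FRAME_LOSS  # ~6 frames/min per camera link

print(f"Per-camera rate:        {per_camera_fps:.0f} fps")
print(f"Compute per frame:      {tops_per_frame:.2f} tera-ops")
print(f"Allowed drops per link: {max_drops_per_minute:.0f} frames/minute")
```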
In addition, Baidu has integrated its high-precision maps and V2X into ANP, providing users with a better autonomous driving experience.
In fact, in addition to ANP, Baidu Apollo also launched its mass-producible AVP (automated valet parking) system in September this year and has already begun mass-production cooperation with brands such as GAC, WM Motor, and Great Wall.
From parking to driving, Baidu Apollo's intelligent driving products now form a mass-producible solution for full-scenario autonomous driving on complex urban roads. Baidu says that over the next 3 to 5 years, Apollo's intelligent driving products are expected to be installed on 1 million vehicles.
Pure vision or lidar?
It is somewhat surprising that Baidu, which has long relied on lidar as its primary solution, has begun to adopt a pure-vision approach.
As the industry well knows, autonomous driving is divided into levels L0 through L5, with L3 as the dividing line: above it lies higher-level autonomous driving that requires no human intervention, while everything below is collectively classed as advanced driver-assistance systems (ADAS). Currently, most cars that automakers have mass-produced sit at L3 or below.
Looking at the full development cycle of the automotive industry, higher-level autonomous driving above L3 will be difficult to achieve in the short term, as it involves many technical details as well as safety and legal issues.
At this stage, as mentioned earlier, autonomous driving solutions built mainly around lidar lack support from high-compute chips and, most importantly, remain expensive.
In fact, regarding lidar, Qi Ping, R&D director at ZF China, previously said that the industry generally considers solid-state lidar more meaningful for L4-level autonomous driving, and that a great deal of work remains before L4 is widely used in passenger cars.

Against this backdrop, from Baidu's perspective, although ANP is packaged as an L4 pure-vision autonomous driving solution, the scenarios it empowers still sit below L4. Below L4, pure-vision solutions cost less and can be mass-produced and installed on vehicles quickly, a win-win for both automakers and Baidu.
In fact, in Baidu's earlier autonomous driving solutions, lidar has always accounted for a relatively small share of perception compared with other sensing technologies, with only a single main lidar in use.
Baidu has therefore also stated that in its autonomous driving research it has consistently pushed to increase the share of vision, and that vision may one day become the primary focus.
However, Wang Liang also emphasized that Baidu's decision to invest resources in developing a pure-vision perception solution does not mean abandoning its existing lidar-based technology route. Through technical practice, Baidu has come to fully appreciate the importance of true redundancy in an autonomous driving system, and it intends to use its maturing surround-view vision technology to reinforce the existing multi-sensor fusion perception framework.
The debate over whether cameras or lidar will dominate the future development of the autonomous driving industry has never stopped. The industry's skepticism toward lidar stems mainly from its high cost. Huawei previously stated that its lidar product is about to enter mass production with a significant cost advantage, and that it has already discussed cooperation with several automakers.
So, once the cost of lidar falls quickly, will there still be a market for pure-vision solutions? That remains to be seen.
This article is a translation by ChatGPT of a Chinese report from 42HOW. If you have any questions about it, please email bd@42how.com.