Author: MuMu
A few days ago, Baidu released a video of less than two and a half minutes on ANP 3.0.
ANP 3.0, short for Apollo Navigation Pilot 3.0, is an L2+ intelligent driving software and hardware integrated product solution launched by Baidu Apollo for the new generation of smart cars.
After watching this video, I suddenly wanted to score the functions inside it.
In my mind, I made a quick comparison between ANP 3.0 and the features of the XPeng P5 CNGP, the Mocha DHT-PHEV Urban NOH, Tesla FSD, and the Lynk & Co 01 Hybrid CITY NCA, and came to a rough conclusion:
Setting mass production aside and judging purely by the technical level of the features, ANP 3.0 and FSD sit together at the top of the pyramid, slightly ahead of the other players in their respective positions.
ANP 3.0 can be understood as the kind of intelligent driving solution a Waymo might offer car manufacturers: the product of Baidu's many years of technological accumulation.
In terms of technology, if 10 is a perfect score, ANP 3.0 would probably get 8.
After all, it is the product of Baidu downgrading its L4 stack to L2+, fed from birth by a huge amount of L4 autonomous driving data.
Now that Baidu, which has cultivated L4 for many years, is entering the L2+ driving market with ANP 3.0, whether car manufacturers will give this latecomer top student a passing grade is an open question.
Currently, apart from self-developed WeRide, Haval's Horizon ADAS, and Huawei, which is deeply tied to Lynk & Co, Changan, and GAC, no other Tier 1 intelligent driving supplier can deliver a solution as powerful as ANP 3.0 that is ready for mass production.
Apollo, together with ANP 3.0, will have many highlights next year.
A little over two minutes to showcase three years of industry capability
In 2019, Tesla was the first in the automotive industry to introduce the Navigate on Autopilot (NoA) feature. After that, domestic players such as WeRide followed suit.
This year, Chinese companies such as XPeng began to introduce city navigation assistance, and people began to hear of various solutions such as CNGP, NCA City Navigation Assistance, and City NZP.
ANP 3.0 appeared among this batch of intelligent driving functions. In the demo video, it showcases intelligent driving in both highway and city scenarios, encompassing the culmination of the industry's development over the past three years.
Let’s start with the highway scenario:
Highway assisted driving features such as Tesla's Navigate on Autopilot and XPeng's highway NGP are considered the admission ticket to intelligent driving.
In the video, ANP 3.0 demonstrates four abilities on the highway: automatic ETC payment, intelligent avoidance of large vehicles, passing through tunnels, and navigating curves.
Currently, most car manufacturers can already handle curve navigation and intelligent avoidance of large vehicles smoothly.
The challenging feature is passing through tunnels, especially when road construction is under way inside; even Tesla and NIO struggle in that situation.
The difficulty of tunnel passage lies in the considerable challenge that positioning signal loss poses to the accuracy of the perception system. In addition, when the signal is restored the instant the car exits the tunnel, keeping the car stable without swerving and staying centered in its lane are still obstacles for the industry.
Performing tunnel passage as demonstrated in the video basically places ANP 3.0 in the top tier.
A noteworthy point is that ANP 3.0 is the industry's first automated ETC solution. It is capable of intelligent lane selection, barrier-lift recognition, and passage through the narrow space of a toll booth, essentially bridging the last kilometer of the highway scenario.
The ability to automate ETC payment is unique to ANP 3.0, as the others haven’t rolled out this feature yet.
Moving to the urban scenario:
The city is where road complexity and corner cases multiply exponentially, making it a tougher test of an intelligent driving system's capability.
In the video, ANP 3.0 accomplished six functions on urban roads: traffic light recognition and response, obstacle bypassing, close-range cut-in response, handling intersections mixed with non-motorized traffic, unprotected U-turns, and unprotected left turns.

The unprotected left turn has long been one of the challenges of the global autonomous driving industry. It requires negotiating with oncoming vehicles going straight, which many robotaxis still find difficult, occasionally freezing at intersections.
As can be seen in crowded road conditions, when making the unprotected U-turn, ANP 3.0 first yielded to a tricycle at the intersection, then completed the turn after the oncoming vehicles had passed. For the unprotected left turn, it first entered the left-turn lane, then completed the turn once the oncoming traffic had cleared.
Next are several road scenarios with Chinese characteristics: intersections mixed with non-motorized traffic, close-range cut-ins, and obstacle avoidance.
In these scenarios, Chinese roads can be more complicated than foreign ones: road participants may include delivery two-wheelers, makeshift tricycles, and even pedestrians appearing out of nowhere.
Obstacle avoidance
In the video, ANP 3.0 smoothly bypasses obstacles on a road with a continuous stream of delivery trucks, parked cars, and passengers stepping out of them. These conditions are more complicated than some of those car companies have exhibited when showcasing their city navigation assistance functions.
However, ANP 3.0 performed very smoothly.
Non-motorized mixed intersection response
At the intersection mixed with non-motorized traffic, ANP 3.0 prepared to turn right. A delivery truck was blocking the crosswalk ahead, and the system first detected two-wheelers preparing to cross the road; ANP 3.0 then applied the brakes as the two-wheelers slid past in front of the vehicle. I noticed one detail: seen from the opposite perspective, ANP 3.0 decelerated in advance while turning right, yet kept creeping forward until its front wheels crossed the stop line. Only when the two-wheelers crossed did it brake, suggesting it was forecasting their crossing in advance.

Response to cutting in at close range
What impressed me most in the video was ANP 3.0's response to a close-range cut-in.
From the information displayed in the cabin, ANP 3.0's 360° cameras had already detected the vehicle in the adjacent right lane before it began to change lanes.
Subsequently, ANP 3.0 perceived the other vehicle's entire maneuver: accelerating, activating its left turn signal, and changing lanes into ANP 3.0's lane.
When the other vehicle cut in, ANP 3.0 was traveling at 55 km/h. The system quickly decelerated to 30 km/h, restoring a safe distance to the vehicle now in front.
Perception, decision-making, and control were executed seamlessly.
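That perceive-then-decelerate flow can be illustrated with a toy decision rule. This is purely my own sketch, not Baidu's logic; the function name, the 20 m minimum gap, and the 30 km/h comfort floor are all hypothetical, with only the 55 km/h and 30 km/h figures taken from the video.

```python
def cutin_response_speed(ego_kmh: float, gap_m: float,
                         min_gap_m: float = 20.0,
                         floor_kmh: float = 30.0) -> float:
    """Toy cut-in decision rule (hypothetical): if the cut-in vehicle
    leaves less than the minimum safe gap, scale speed down in
    proportion to the remaining gap, but never below a comfort floor."""
    if gap_m >= min_gap_m:
        return ego_kmh          # gap is safe, keep current speed
    scaled = ego_kmh * gap_m / min_gap_m
    return max(scaled, floor_kmh)

# Video scenario: ego at 55 km/h, cut-in closes the gap to ~10 m.
print(cutin_response_speed(55.0, 10.0))  # 30.0
```

A real stack would of course reason over relative velocity and time headway rather than a static gap, but the shape of the decision — detect early, brake proportionally, settle at a safe speed — is what the video shows.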
Currently, the XPeng P5 CNGP handles cut-ins well at speeds of 30 km/h and above. Judging from ANP 3.0's performance against a cut-in at 55 km/h, I have great confidence in its cut-in handling.
Response to Traffic Light Recognition
The video also demonstrated the ANP 3.0’s ability to recognize traffic lights.
In 2020, Tesla was the first to propose traffic light recognition, followed by XPeng, Jixue Motors, and Haylion Technologies, among others.
The solutions can be roughly divided into two categories. Tesla uses a purely visual front camera-based solution. Other companies mainly rely on the fusion of cameras and high-precision maps, i.e., visual-fusion-based solutions.
The pure vision solution relies not only on the front-facing camera but also on deep learning and "shadow mode" to continuously optimize itself. The visual-fusion solution starts from a high-precision map and fuses it with visual perception.
So far, both types of solutions have achieved high accuracy in traffic light recognition. Among them, the visual-fusion-based solution can even achieve innovative features such as countdown display.
ANP 3.0 has followed the mainstream path in China by adopting a camera and high-precision map fusion-based approach. Based on Baidu's self-developed high-precision map, the system takes the car's position as input, queries the map for the locations of nearby traffic lights, combines this with the camera image at the current position, and outputs information such as the color state of those lights.
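The fusion idea — the map says where the lights are, the camera says what state they are in — can be sketched in a few lines. Everything here is hypothetical scaffolding (the in-memory map, the `classify_color` stub standing in for a real CNN classifier); it only illustrates the query-then-classify flow, not Baidu's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class TrafficLight:
    light_id: str
    position_m: tuple   # (x, y) map coordinates of the light, in metres
    state: str = "unknown"

# Hypothetical in-memory stand-in for an HD-map lookup: lights indexed by road segment.
HD_MAP_LIGHTS = {
    "segment_42": [TrafficLight("tl_1", (105.0, 6.5))],
}

def classify_color(image_roi: dict) -> str:
    """Stub for the camera-based classifier; a real system would run a CNN
    on the cropped image region here."""
    return "red" if image_roi.get("dominant_hue", 0) < 30 else "green"

def fuse_traffic_lights(segment_id: str, camera_frame: dict):
    """Fusion step: query the map for expected lights, then ask the camera
    for the state of each one."""
    results = []
    for light in HD_MAP_LIGHTS.get(segment_id, []):
        # Projecting the map position into the image yields a region of
        # interest; stubbed here as a precomputed crop per light ID.
        roi = camera_frame["rois"].get(light.light_id, {})
        light.state = classify_color(roi)
        results.append((light.light_id, light.state))
    return results

frame = {"rois": {"tl_1": {"dominant_hue": 10}}}
print(fuse_traffic_lights("segment_42", frame))  # [('tl_1', 'red')]
```

The design point is that the map removes the detection problem (where to look), leaving the camera only the classification problem (what color), which is why fusion solutions reached high accuracy earlier than pure vision.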
The fact that it can accurately identify temporary traffic lights and hidden or partially obstructed ones shows that ANP 3.0's visual perception is already excellent.
In fact, Baidu ANP, XPeng, and Hozon are all moving toward Tesla's pure vision approach, that is, de-emphasizing high-precision maps and enabling traffic light recognition on roads without map coverage.
Currently, ANP 3.0 can recognize traffic lights even without high-precision maps.
Among domestically produced vehicles, XPeng plans to open this function on the G9 MAX version of urban NGP in the first half of 2023.
Therefore, ANP 3.0’s research and development progress is not slow compared to the entire industry.
It is worth mentioning that the ANP 3.0 architecture pairs pure visual perception with LiDAR as a dual system. The generalization tests shown in the video, covering both city and highway, were run on pure vision, making ANP 3.0 the industry's first three-domain navigation-assisted driving solution based on pure vision.
When it truly goes into mass production, ANP 3.0 will use LiDAR data as a second layer of redundancy.
Having watched the entire video: ANP 3.0 can handle a very rich set of environments, especially urban roads, where it passed through multiple sections mixed with vehicles and pedestrians. Its performance across scenarios basically places it in the top tier.
Being able to achieve this has a lot to do with its background.
Behind ANP 3.0, Apollo L4
To break down ANP 3.0’s capabilities, we need to start with the hardware.
The head of autonomous driving R&D at XPeng once said, "The main benefit of simplifying sensor hardware accrues to the car manufacturer. Unlike some approaches, XPeng's criteria stand more on the side of user experience than of simply maximizing corporate profit."
This represents the approach of Chinese autonomous driving players: even when money is tight, configurations cannot be compromised. Everyone is willing to invest in sensors, and some could even be called extravagant.

The hardware configuration of ANP 3.0 is no exception, though it takes a relatively pragmatic approach: 11V5R12S2L + 2 Nvidia Orin X (508 TOPS), specifically:
- 7 8-megapixel HD cameras
- 4 3-megapixel high-sensitivity omnidirectional cameras
- 5 millimeter-wave radars and 12 ultrasonic radars
- 2 forward-facing high-line-count solid-state LiDARs
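The "11V5R12S2L" shorthand simply counts sensors by modality (V = vision, R = millimeter-wave radar, S = ultrasonic, L = LiDAR). A minimal sketch, with the class and field names my own invention, shows how the list above collapses into that code:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SensorSuite:
    hd_cameras: int          # 8 MP HD cameras
    surround_cameras: int    # 3 MP high-sensitivity surround cameras
    mmwave_radars: int       # millimeter-wave radars
    ultrasonic_radars: int   # ultrasonic radars
    lidars: int              # solid-state LiDARs

    def shorthand(self) -> str:
        # V counts all vision sensors; R, S, L map to the other modalities.
        v = self.hd_cameras + self.surround_cameras
        return f"{v}V{self.mmwave_radars}R{self.ultrasonic_radars}S{self.lidars}L"

# ANP 3.0's configuration as listed above.
anp30 = SensorSuite(hd_cameras=7, surround_cameras=4,
                    mmwave_radars=5, ultrasonic_radars=12, lidars=2)
print(anp30.shorthand())  # 11V5R12S2L
```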
This sensor configuration is basically the same as that of XPeng G9.
Speaking of the G9: XPeng's autonomous driving chief has compared them, saying the G9 Max's CNGP is greatly improved over today's P5 in passage efficiency, negotiation ability, intelligence, and perception range.
That is to say, with ANP 3.0's sensor configuration, it should outperform the autonomous driving capability of the XPeng P5.
Like the G9 Max's setup, ANP 3.0 is designed for urban intelligent driving, and it will surely surpass the capability of existing mass-produced cars such as the XPeng P5.
That said, ANP 3.0 differs from the G9 Max, and indeed from all Chinese players, in one respect: the scale of data behind its LiDAR capability.
As mentioned earlier, ANP 3.0 is still in the generalized testing phase and mainly uses pure visual perception. When it enters the mass production phase, ANP 3.0 will add lidar as a perception redundancy.
The advantage is that it can further cope with unconventional obstacles, obstacles at night, dynamic and static occlusion detection, cross-floor parking in garages, and other complex urban scenarios, improving the safety and stability of intelligent driving and making the experience safer and more comfortable.
The difference is that the LiDAR in the ANP 3.0 scheme rests on a better data foundation than at XPeng, JiHu, and others.
With the support of Baidu Apollo, ANP 3.0 can draw on the LiDAR point-cloud data and models from 600 robotaxis and 40 million kilometers of road tests across nearly 30 Chinese cities, which generate massive amounts of L4 LiDAR data every day. Companies like XPeng, by contrast, are still in the early stages of applying LiDAR.
Therefore, in terms of LIDAR integration, ANP 3.0 can move faster than XPeng and other car companies.
On the other hand, installing ANP 3.0 means that Baidu Apollo will provide services to car companies as Tier 1.
It is understood that, in terms of LiDAR configuration, Apollo can flexibly arrange the position, number, and performance of the LiDARs according to partner automakers' needs, to fit different manufacturers' schemes and speed up deployment across vehicle models and regions.
In terms of software, ANP 3.0 carries an all-scenario intelligent driving software system integrating the urban, highway, and parking domains, along with a complete self-developed closed-loop data system.
Under this software system, ANP 3.0 has mature pure visual perception algorithm capabilities.
It is based on the Apollo Lite++ pure vision perception scheme and uses "BEV panoramic 3D perception" technology with distinct Apollo characteristics. In the R&D testing phase, relying on this technology, ANP 3.0 became the only intelligent driving solution in China able to handle urban scenarios on visual perception alone.
ANP 3.0 also has Baidu’s self-developed high-precision maps as support.
Baidu is the only domestic intelligent driving supplier that self-develops both the autonomous driving stack and high-precision maps. Its high-precision maps usable for advanced navigation-assisted driving now cover more than 400,000 kilometers, and Baidu has obtained map review approval numbers for the three first-tier cities of Shenzhen, Guangzhou, and Shanghai.
This map ensures that ANP 3.0 can still run smoothly even under the complex road topology in the city.
More importantly, it is the know-how that Baidu Apollo has accumulated on its robotaxis that matters. The urban scenario is the territory it knows best, and the original visual algorithms and the data accumulated from handling countless corner cases make ANP 3.0 like a top student who has already sat through many practice exams.

So, how will car companies view this academic overachiever?
Will ANP 3.0 rock the automotive industry?
The launch of ANP 3.0 readily brings to mind Huawei's full-suite smart driving solution, which has been mounted on the FAW Hongqi Alpha S Hitech and Changan Oushang A800 Hitech models.
However, Huawei has also met resistance, such as SAIC's "soul theory," so it is hard to see Huawei's smart driving system reaching more models from other automakers.
Can ANP 3.0 successfully penetrate the automotive industry?
In theory, ANP 3.0 would first be deployed by Jidu, a subsidiary of Baidu.
It is also possible for other automakers to choose ANP 3.0.
Currently, the landscape for autonomous driving looks like this: Great Wall pursues full-stack self-development; Huawei partners with the Chinese brands of BAIC, Changan, and GAC; SAIC works with Momenta; Geely with Mobileye. The rest, which have no firm plan yet, could in theory work with ANP 3.0. And even brands with established partnerships are not necessarily tied exclusively; some of a brand's models may adopt third-party solutions.
For instance, some brands under Geely and SAIC have already begun working with domestic smart driving suppliers. At this early stage of intelligent driving partnerships, the whole market is still up for grabs.
In theory, the faster Tesla and XPeng advance, the sooner ANP 3.0 may be adopted. In practice, there are not many automotive smart driving solutions that can both compete with ANP 3.0 and be mass-produced at any time.
In my personal opinion, it is technically the most advanced smart driving solution available.
This article is a translation by ChatGPT of a Chinese report from 42HOW. If you have any questions about it, please email bd@42how.com.