Assisted driving with zero takeovers the whole way, so why does it still fall short?

It's fair to say everyone was impressed by the capabilities shown in the Huawei HI version of the Arcfox Alpha S, whose demo went viral not long ago. Its ability to execute unprotected left turns, yield to pedestrians at zebra crossings, and handle complex road conditions was truly striking.

With those expectations in mind, we went to experience Baidu Apollo's city navigation assisted driving. Our test car was a WM Motor W6 fitted with Baidu Apollo's software and hardware.

One crucial clarification: the WM Motor W6 currently on sale does not have the capabilities demonstrated in this test. It only offers the AVP automated valet parking function.

Baidu Apollo's system relies on the following sensors, compute, and data:

  • 12 2-megapixel cameras

  • 5 millimeter-wave radars

  • 12 ultrasonic radars

  • NVIDIA Orin X chip

  • High-precision maps (a required dependency)

Yes, you read that right. This hardware architecture skips the currently fashionable lidar in favor of a lower-cost vision-based solution. Baidu's answer is that they aim to offer mid-range customers advanced assisted driving, and even partially autonomous driving, at a relatively low price.
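For concreteness, here is a minimal sketch of how such a vision-first sensor suite might be summarized in configuration. The class and field names are our own illustration, not Baidu Apollo's actual software:

```python
from dataclasses import dataclass

# Hypothetical summary of the suite described above; names and defaults
# are our own illustration, not Baidu Apollo's actual configuration.
@dataclass(frozen=True)
class SensorSuite:
    cameras: int = 12            # 2-megapixel each
    mmwave_radars: int = 5
    ultrasonic_radars: int = 12
    lidars: int = 0              # deliberately omitted to keep cost down
    soc: str = "NVIDIA Orin X"
    needs_hd_map: bool = True    # high-precision maps are a hard dependency

suite = SensorSuite()
assert suite.lidars == 0  # the cost-sensitive, camera-centric bet
```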

Although Baidu Apollo has its own ACU (Apollo Computing Unit) line, the "Wuren" and "Sixi" units simply lack the computing power that ANP requires. The "Sanxian" unit's 200 TOPS is impressive, but it won't be available until 2023. Weighing compute against timing, Orin X is a very fitting choice.

As for the driving experience: I sat in the back seat and never felt uncomfortable at any point in the journey. It felt very natural, almost like a human driver. Both the linearity of the steering and the stability it maintained through turns gave passengers a sense of safety.

Like Huawei's system, Baidu Apollo's traffic-light countdown function relies on cameras, which places high demands on visual perception.
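To illustrate the general idea (Baidu's actual pipeline is undisclosed, and production systems use learned detectors rather than hand-tuned rules), here is a minimal sketch of camera-based traffic-light state recognition using simple color thresholding. All names and threshold values are our own assumptions:

```python
import cv2
import numpy as np

# Rough HSV ranges for the three lamp colors. These thresholds are
# illustrative guesses, not values from any production system.
HSV_RANGES = {
    "red":    [((0, 100, 100), (10, 255, 255)),
               ((160, 100, 100), (180, 255, 255))],  # red hue wraps around 0
    "yellow": [((18, 100, 100), (35, 255, 255))],
    "green":  [((40, 100, 100), (90, 255, 255))],
}

def classify_light(crop_bgr: np.ndarray) -> str:
    """Return the dominant lamp color in a cropped traffic-light image."""
    hsv = cv2.cvtColor(crop_bgr, cv2.COLOR_BGR2HSV)
    scores = {}
    for color, ranges in HSV_RANGES.items():
        mask = np.zeros(hsv.shape[:2], dtype=np.uint8)
        for lo, hi in ranges:
            mask |= cv2.inRange(hsv, np.array(lo), np.array(hi))
        scores[color] = int(mask.sum())  # pixels matching this color
    return max(scores, key=scores.get)

# Usage: classify a region found by an upstream detector, e.g.
#   state = classify_light(frame[y0:y1, x0:x1])
```

Reading the countdown digits themselves would add a recognition step on top of this, which is part of why the function is demanding on camera quality.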

However, there were some regrets during the experience. The entire route involved only right turns, so we never saw it attempt the challenging unprotected left turn. And with cars parked on both sides of the road and almost no pedestrians about, we couldn't gauge how it handles tricky situations like jaywalkers. Overall, the experience was not as stunning as Huawei's, but it still beats every L2 system on the market.

It's also worth mentioning that the visualization left me uneasy. Visualization is essentially a means of communicating with the human driver: it conveys what the vehicle perceives while driving itself, establishing a window of trust. Although what we saw today is still a demo version, the stuttering display and the mismatch between the number of vehicles shown and the number actually on the road made me a bit nervous…

Baidu stated that their goal is to have this technology installed in one million vehicles by 2023, a target they are very confident of reaching. They also plan to complete high-precision map surveys of 20 cities by the end of this year.

Before the New Year, people were still comparing L2 experiences: which system was more capable, which offered the better user experience. With technology companies like Huawei and Baidu getting involved, however, the user experience has moved to a whole new level. In fact, Baidu can hardly be called a newcomer; they are simply transplanting their L4 capabilities into a vision-only L2 system. The final experience went well beyond our expectations, and can be summed up in one line: things are changing fast!

This article is a translation by ChatGPT of a Chinese report from 42HOW. If you have any questions about it, please email bd@42how.com.