What level has Zhi Ji's city NOA demo achieved?

Recently, some netizens shared a video showcasing Zhi Ji's city NOA capability for the first time.

Upon viewing the video, we can spot several highlights of this demo test, which we will discuss in detail.

First, we need to look at the testing environment and conditions. According to the video, the test route was in the Jiading District of Shanghai, at night and during light rainfall.

Although the rainfall did not directly interfere with sensors such as the camera and LiDAR, the reflective road surface caused by the wet weather could still have some impact on perception.

Urban scenes are full of small, easily missed targets such as traffic lights, bicycles, and pedestrians, and the driving behaviors and intentions of other vehicles are harder to predict than in highway scenarios. Zhi Ji's city NOA demo video showcased many scenes in which the vehicle had to navigate around obstacles.

Shortly after starting, the vehicle encountered cars waiting to turn left at an intersection. Zhi Ji's city NOA made a slight move to the right and drove around them. This kind of maneuver happens a lot in urban scenes, and the system must be able to perceive 360 degrees around the vehicle and plan its route based on the surrounding environment.

These small-scale maneuvers are relatively simple, as the camera and LiDAR can identify the type, contour, and specific location of the obstacles.
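To make the idea concrete, here is a minimal Python sketch, not Zhi Ji's actual planner, of how such a small bypass can be reduced to a lateral-offset calculation once perception has reported the obstacle's contour in the ego-lane frame. The function name, clearance, and lane dimensions are illustrative assumptions.

```python
# A minimal sketch (not Zhi Ji's actual planner) of reducing a small bypass
# maneuver to a lateral-offset calculation. All names and dimensions here
# are illustrative assumptions.
from typing import Optional


def lateral_nudge(obstacle_right_edge_m: float,
                  clearance_m: float = 0.5,
                  lane_half_width_m: float = 1.75,
                  ego_half_width_m: float = 0.95) -> Optional[float]:
    """Return a lateral offset (positive = toward the right) that clears an
    obstacle poking in from the left, or None if it cannot be passed while
    staying inside the current lane.

    obstacle_right_edge_m: lateral position of the obstacle's rightmost
    point relative to the lane centerline (negative = left of center).
    """
    # The ego's left side must clear the obstacle's right edge by clearance_m.
    required_offset = obstacle_right_edge_m + clearance_m + ego_half_width_m
    # The ego's right side must stay inside the right lane boundary.
    max_offset = lane_half_width_m - ego_half_width_m
    if required_offset > max_offset:
        return None  # not enough room: brake or change lanes instead
    return max(0.0, required_offset)


if __name__ == "__main__":
    # A car waiting to turn left, its right edge 1.0 m left of our lane
    # centerline, can be passed with a nudge of roughly 0.45 m to the right.
    print(lateral_nudge(obstacle_right_edge_m=-1.0))
```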

The following scenario is more challenging.

At this intersection, most vehicles were crowded on the left side, where the left-turn and through lanes are, so the left lanes held more cars. In this scenario, Zhi Ji's city NOA quickly moved to the right, choosing the lane with fewer cars to wait for the traffic light. This kind of intelligent lane selection is very human-like. However, it also exposed a small flaw: the system's starting speed is relatively slow, so the vehicle was left behind by other cars after passing the intersection.

In a similar scenario, when the vehicle approached an accident-prone intersection, it did not just wait passively in its own lane, but moved to the right and actively selected a more efficient route. However, after signaling and changing lanes, the system did not accelerate boldly and its speed remained low. My guess is that the rightmost lane before the intersection is a bus lane, which the system treats as a non-drivable area. The lane-change decision was therefore correct and similar to human driving behavior, but the overall efficiency did not actually improve.
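The kind of lane-selection behavior seen in these two scenes can be sketched as a simple cost heuristic: prefer the lane with the shortest queue, exclude bus lanes and lanes whose turn arrows do not match the route, and only change lanes when the saving outweighs a small penalty. The data fields, penalty value, and function name below are assumptions, not Zhi Ji's published decision logic.

```python
# A hedged sketch of an intersection lane-selection heuristic; the fields,
# penalty value, and function name are assumptions for illustration only.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Lane:
    lane_id: str
    queue_length: int     # vehicles observed waiting in this lane
    is_bus_lane: bool     # treated as non-drivable for a private car
    allows_route: bool    # the lane's turn arrows match the planned route


def pick_lane(lanes: List[Lane], current_lane_id: str,
              lane_change_penalty: float = 1.5) -> Optional[Lane]:
    """Choose the drivable lane with the lowest cost, where cost is the
    observed queue length plus a small penalty for leaving the current lane
    (so the car only changes lanes when the saving is worth it)."""
    candidates = [l for l in lanes if l.allows_route and not l.is_bus_lane]
    if not candidates:
        return None

    def cost(lane: Lane) -> float:
        penalty = 0.0 if lane.lane_id == current_lane_id else lane_change_penalty
        return lane.queue_length + penalty

    return min(candidates, key=cost)


if __name__ == "__main__":
    lanes = [
        Lane("left-turn", 6, False, False),
        Lane("through-1", 5, False, True),
        Lane("through-2", 1, False, True),
        Lane("bus-lane", 0, True, True),   # shortest queue, but off limits
    ]
    print(pick_lane(lanes, current_lane_id="through-1").lane_id)  # through-2
```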

In addition, regarding avoidance, Zhi Ji's road test also included a classic truck-intrusion scene.

The turning truck intruded into the vehicle's path, and the vehicle automatically moved to the right to avoid it. The difficulty of visually perceiving a large truck is that if only part of its body is exposed in the forward field of view, the system will struggle to identify what it is and may even mistake it for "a wall". To recognize it, the system can stitch together multiple camera angles and add temporal information so that it has short-term memory, which is indeed the trend in BEV solutions, but this is difficult and takes time to mature. Another approach is to continuously identify individual features of the truck so that the system can "recognize" it. With LiDAR involved, however, even if the system does not recognize the object, it only needs to know that it is an obstacle; as long as it clearly knows the obstacle's outline and position, it can plan the corresponding avoidance maneuver.
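A minimal illustration of that last point, not the production pipeline: even an unclassified cluster of LiDAR returns can be reduced to an occupied region and checked against the planned driving corridor. The function names, inflation width, and example numbers are assumptions.

```python
# A minimal illustration (not the production pipeline) of why LiDAR
# sidesteps the classification problem: an unrecognized object can still be
# reduced to an occupied region and checked against the planned path.
import numpy as np


def points_to_box(points_xy: np.ndarray):
    """Axis-aligned 2D bounding box (x_min, x_max, y_min, y_max) of a point
    cluster in the ego frame (x forward, y left), e.g. returns from the
    exposed side of a turning truck."""
    x_min, y_min = points_xy.min(axis=0)
    x_max, y_max = points_xy.max(axis=0)
    return x_min, x_max, y_min, y_max


def path_blocked(path_xy: np.ndarray, box, half_width_m: float = 1.1) -> bool:
    """True if any planned path point, inflated by the vehicle half-width,
    overlaps the obstacle box."""
    x_min, x_max, y_min, y_max = box
    in_x = (path_xy[:, 0] >= x_min - half_width_m) & (path_xy[:, 0] <= x_max + half_width_m)
    in_y = (path_xy[:, 1] >= y_min - half_width_m) & (path_xy[:, 1] <= y_max + half_width_m)
    return bool(np.any(in_x & in_y))


if __name__ == "__main__":
    # A cluster of LiDAR returns from a trailer cutting into the lane ahead.
    truck_points = np.array([[12.0, 0.5], [13.0, 1.8], [14.5, 3.0]])
    # The current straight-ahead path, sampled every metre.
    path = np.stack([np.arange(0.0, 30.0, 1.0), np.zeros(30)], axis=1)
    if path_blocked(path, points_to_box(truck_points)):
        print("Obstacle in the driving corridor: nudge right or brake.")
```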

The above scenarios mainly reflect Zhi Ji city NOA's avoidance and bypass capabilities. The most complicated situations in urban driving are unprotected left and right turns, as well as avoiding vulnerable road users such as two-wheelers.

Relatively speaking, the behavior of two-wheelers is harder to predict; some riders do not follow traffic rules, running red lights and lingering in blind spots, which poses an even greater challenge to the system.

In the following scenario, while Zhi Ji's city NOA was making a left turn, it suddenly encountered an oncoming two-wheeled electric vehicle, and the system decelerated noticeably after recognizing it. Likewise, when encountering two-wheelers riding side by side, the system made a slight avoidance maneuver.
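One common way such a deceleration can be triggered is a time-to-collision (TTC) check against the conflict point; the sketch below is hedged, with the threshold and function name assumed rather than taken from Zhi Ji's actual logic.

```python
# A hedged sketch of a time-to-collision (TTC) check of the kind that could
# trigger this sort of deceleration; the threshold and function name are
# assumptions, not Zhi Ji's actual logic.
def should_yield(distance_m: float, closing_speed_mps: float,
                 ttc_threshold_s: float = 3.0) -> bool:
    """Yield if the oncoming rider would reach the conflict point too soon."""
    if closing_speed_mps <= 0.0:
        return False  # not closing on us: no need to brake
    return distance_m / closing_speed_mps < ttc_threshold_s


if __name__ == "__main__":
    # An oncoming e-bike 15 m from the conflict point, closing at 7 m/s,
    # gives a TTC of roughly 2.1 s, so the turn should be slowed or paused.
    print(should_yield(15.0, 7.0))  # True
```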

In addition, on this test route, Zhi Ji's city NOA also demonstrated an unprotected left turn. Unfortunately, there were no oncoming vehicles, so the system's decision-making ability was not fully displayed. When passing through the intersection, the system actively yielded to pedestrians on the zebra crossing.

There is also the performance on narrow roads. Urban scenes vary widely, and road width and standardization cannot compare with highways. At lower speeds, however, the vehicle can still plan a reasonable, centered path using its own positioning and LiDAR point cloud data.
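A minimal sketch of that idea, assuming the planner has already extracted left and right road-boundary points from the LiDAR point cloud at matching stations along the road: the centered path is simply their midpoint. This is an illustration, not Zhi Ji's planner, and the example boundary values are made up.

```python
# A minimal sketch: given left/right road-boundary points (e.g. from the
# LiDAR point cloud) sampled at the same stations, the centered path is
# their midpoint. Illustration only, not Zhi Ji's planner.
import numpy as np


def centered_path(left_xy: np.ndarray, right_xy: np.ndarray) -> np.ndarray:
    """Midline between the left and right boundaries, sampled station by
    station in the ego frame (x forward, y left)."""
    return (left_xy + right_xy) / 2.0


if __name__ == "__main__":
    x = np.arange(0.0, 20.0, 2.0)
    left = np.stack([x, np.full_like(x, 2.2)], axis=1)    # parked cars / wall
    right = np.stack([x, np.full_like(x, -1.6)], axis=1)  # kerb
    print(centered_path(left, right)[:3])  # the path runs along y = 0.3 m
```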

Overall, Zhi Ji's city NOA has demonstrated solid basic skills, and it handles common urban cases well. If we have to point out a weakness, it is that its traffic efficiency still has room for optimization; as the system keeps improving through continuous training in the city, its performance should become more "human-like".

Finally, there is the question of timing. Zhi Ji's intelligent driving effort is pursuing two parallel tracks: on the one hand, delivering highway NOA at scale, and on the other, pushing forward city NOA. At the LS7 launch event earlier this year, Liu Tao, co-CEO of Zhi Ji Automotive, stated that city NOA based on high-precision maps would begin public testing before the end of this year.

This article is a translation by ChatGPT of a Chinese report from 42HOW. If you have any questions about it, please email bd@42how.com.