Recall that Musk previously said Tesla would use only a pure-vision solution and stop using millimeter-wave radar. Yet based on the HW 4.0 hardware revealed by Green, Tesla appears to be putting the radar it once planned to cut back into its vehicles.
Tesla has launched four versions of its self-driving hardware, iterating through versions 1.0, 2.0, 2.5, and 3.0. Versions 2.5 and 3.0 are the most familiar, since the arrival of the Model 3 drew far more attention to the brand.
From the earliest Mobileye solution, to self-developed algorithms running on the NVIDIA Drive PX platform, to its own chip from 2019 onward, Tesla has taken a path much like Apple's MacBook: in-house algorithms combined with in-house silicon make for a superior solution.
HW 3.0, with only eight cameras, can already perceive the vehicle's surroundings and theoretically achieve self-driving. So why upgrade to HW 4.0? Let's look at the hardware changes:
- The number of cameras increases from eight to eleven, suggesting the current eight-camera setup is not enough, and resolution jumps from 1.2 megapixels to 5 megapixels.
- The CPU core count in the SoC grows from 12 to 20, i.e., from three quad-core clusters to five. The NPU count rises from two to three. The board carries two SoCs, one of which provides computing redundancy. Single-chip compute is roughly 216 TOPS, three times that of the third generation (see the quick check after this list). The process node should be 7 nm; whether Samsung or TSMC fabricates it will be announced later.
- A millimeter-wave radar has been added.
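As a quick sanity check on that "three times" figure: HW 3.0's single chip is commonly cited at roughly 72 TOPS (that number comes from outside this report), which lines up with the ~216 TOPS quoted above.

```python
# Per-chip compute in TOPS. The ~72 TOPS figure for HW 3.0 is a commonly
# cited spec, not from this report; ~216 TOPS for HW 4.0 is quoted above.
HW3_PER_CHIP_TOPS = 72
HW4_PER_CHIP_TOPS = 216

print(HW4_PER_CHIP_TOPS / HW3_PER_CHIP_TOPS)  # 3.0, matching the "3x" claim
# With two SoCs on the board (one for redundancy), usable compute per
# driving task is still the single-chip figure, not the sum.
```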
Perhaps a pure-vision solution simply cannot yet deliver true self-driving. On the algorithm side, Tesla can estimate the distance to the vehicle ahead, but with a non-trivial error rate. In complex traffic conditions like China's, or in unusual weather, radar is still needed to solve these problems properly.
This time the 4D millimeter-wave radar comes from Arbe Robotics of Israel. It can be regarded as a replacement for low-end (16-line) lidar: where traditional millimeter-wave radar perceives only horizontal distance, direction of movement, and speed of movement, this one adds a fourth dimension, object height.
The most practical everyday scenario is measuring the height of an overpass above the road, which prevents the system from misjudging it as an obstacle; the radar can also be used for imaging, and the traffic data it collects gives the autonomous-driving computer richer road information.
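To make the fourth dimension concrete, here is a minimal sketch (my own illustration, not Arbe's or Tesla's actual API; all names and values are assumptions) of how an elevation measurement turns a radar return into a height estimate:

```python
import math

def radar_return_height(range_m: float, elevation_deg: float,
                        sensor_height_m: float = 0.5) -> float:
    """Estimate an object's height above the road from one 4D radar return.

    Traditional radar reports only range, azimuth, and radial velocity;
    the fourth dimension, elevation, is what makes this possible.
    """
    # Vertical offset of the return relative to the sensor, by basic trigonometry.
    return sensor_height_m + range_m * math.sin(math.radians(elevation_deg))

# Example: a return from an overpass girder 120 m ahead, 2.0 degrees up:
# 0.5 m (sensor) + 120 * sin(2.0 deg) ~= 4.7 m, so the car can pass under it.
print(f"{radar_return_height(120.0, 2.0):.1f} m")
```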
In addition, Arbe Robotics has launched its Phoenix architecture, which, combined with algorithms, achieves an angular resolution of 1° horizontally and 1.7° vertically, reaching the performance of entry-level lidar and good enough to fill the gaps. Most importantly, the detection range has grown from 160 meters to 300 meters, farther than some lidars.
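To put those angular-resolution numbers in perspective, a back-of-the-envelope calculation (mine, not from the report): the cross-range size of a resolution cell is simply range times angular resolution in radians.

```python
import math

def cross_range_resolution_m(range_m: float, angular_res_deg: float) -> float:
    """Smallest lateral separation resolvable at a given range:
    arc length = range * angular resolution (in radians)."""
    return range_m * math.radians(angular_res_deg)

# At the full 300 m detection range, 1 deg of azimuth resolution spans ~5.2 m;
# at a 50 m highway following distance it shrinks to ~0.9 m, enough to
# separate two vehicles in adjacent lanes (~3.5 m apart).
print(f"{cross_range_resolution_m(300, 1.0):.1f} m")  # ~5.2 m
print(f"{cross_range_resolution_m(50, 1.0):.2f} m")   # ~0.87 m
```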
With this hardware and these algorithms, Tesla's autonomous driving no longer has to wrestle with a purely visual approach and can focus on the algorithms themselves. That does not mean Tesla is simply stacking hardware. It is somewhat like Google's Pixel: in the era when dual cameras were needed to compute depth of field, the Pixel 2 could surprisingly estimate depth with a single camera and achieve a portrait-blur effect, which was remarkable at the time.
For Tesla, the game of autonomous driving may have just begun.
This article is a translation by ChatGPT of a Chinese report from 42HOW. If you have any questions about it, please email bd@42how.com.