Horizon: "We are not good at PPT; we prefer to focus on mass production."

Author: French Fries Fish

Spring has arrived, and with it the season of floating willow catkins in Beijing. Whenever this time comes, I always remember that classic line.

“In the spring, everything comes back to life, and the animals on the grasslands…”

I’m not sure how the animals on the grasslands are doing, but I do know that spring has indeed come to the automotive industry.

This week could be described as the busiest week for the automotive industry since 2022, with not only the simultaneous release of new cars from various manufacturers but also the concentrated appearance of various intelligent driving technologies.

As a first-tier domestic AI chip manufacturer for intelligent driving, Horizon chose this moment to hold an experience day for its mass-produced intelligent driving technology.

Nine months ago, we noted that Horizon was still a relatively unknown company to many of our readers; nine months later, judging by the follower count of Horizon's official Weibo account, that unfamiliarity likely still holds for ordinary consumers.

In fact, Horizon has already launched three mass-produced intelligent driving solutions, including the Horizon Matrix Mono front-view assistance driving solution, the Horizon Matrix Pilot navigation assistance solution, and the Horizon Halo vehicle-mounted intelligent interactive solution. My experience will also be presented in this order.

Mono: L2 Level ADAS Capability with a Single Camera

Mono achieves L2-level ADAS perception with a single camera. It was the earliest and most mature front-view assisted driving solution launched by Horizon and is already equipped on multiple mass-produced models. This time, the Mono experience took place in a mass-produced Changan UNI-V.

As soon as I got in the car, before I even had time to fasten my seatbelt, Horizon staff raised my expectations by a few notches with a bold claim.

“Running L2 on the highway is too easy; this time, we’ll experience Mono’s L2 functions on urban roads.”

With that said, the Changan UNI-V merged into the flow of traffic near Horizon headquarters.

One slightly disappointing thing: I had expected "L2" to mean a state where both Full-Speed Range ACC and LKA (Lane Keeping Assist) are active at the same time. However, perhaps due to the condition of this particular vehicle, this Mono demo enabled only the Full-Speed Range ACC function.

That said, Mono's Full-Speed Range ACC is genuinely top-notch.

It was close to 5 pm when we entered the traffic stream, and rush hour had begun. The situation on the road grew more complicated: private cars, buses, motorcycles, electric bicycles, pedestrians… We encountered almost every kind of road user you can see at rush hour. Yet even under these conditions, the Mono-equipped car passed more than ten traffic lights with the ACC indicator on the dashboard staying lit the entire time.

The most reassuring thing about Mono is that it intuitively displays the distance to the vehicle ahead on the dashboard.

When the displayed distance to the car ahead shrinks and I feel the brakes engage, I know Mono still has things under control, and I can continue to sit back and relax.

Of course, Mono also has room for improvement.

For example, perhaps limited by the chip's computing power, Mono does not show a motorcycle icon on the dashboard the moment it recognizes a motorcycle. Instead, it displays a car icon first, then switches to a motorcycle icon after about a second.

However, this is only a minor perception quibble and does not affect Mono's speed-control function.

Pilot: High-speed NOA with a Pure-Vision Implementation

If the Mono experience was just the “appetizer” of this event, then the Pilot experience was undoubtedly the “core dish” of this event.

Before the event, I posted a photo of a "wild Ideal ONE" on Weibo. It was one of Horizon's development and testing vehicles, and the additionally installed lidar sensors on the car attracted plenty of attention from netizens.

But in fact, Pilot is a pure-vision solution that relies on 6 cameras plus high-precision maps to achieve high-speed navigation NOA functionality. The lidar here serves only as ground truth for verifying the accuracy of Pilot's visual perception; on mass-production cars, you won't see lidar at all.

Since it is a high-speed NOA system, the highway is naturally where it shows its strength.

As soon as we entered the highway, the Horizon staff member lightly flicked the gear lever, and a voice prompt announced that NOA was active. The additionally installed HMI (human-machine interface) display also switched from a 2D navigation map to a 3D real-time lane view, showing 360° of lane and vehicle information around our car.

Overall, Pilot can perform automatic car-following, automatic lane changes, actively overtaking slow vehicles, actively avoiding large trucks, driving through long sweeping curves, and automatically entering and exiting ramps.

You might imagine that a long, sweeping curved ramp would be a major challenge for high-speed NOA.

But it's not. For a mature pure-vision NOA solution, as long as the lane lines on the road are clear, long curves and sharp turns are not particularly difficult. Even on a small-radius curve, the system automatically slows the vehicle and passes through smoothly.
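As a rough illustration of why slowing on tight curves works (my own sketch of the underlying physics, not Horizon's actual control law), a comfort-based speed cap follows directly from bounding lateral acceleration: since a_lat = v²/R, the cap is v = √(a_lat·R). The 2 m/s² comfort bound below is an assumption, not a Horizon parameter.

```python
import math

def max_curve_speed(radius_m: float, max_lat_accel: float = 2.0) -> float:
    """Comfort speed cap for a curve.

    Lateral acceleration on a curve is a_lat = v^2 / R, so bounding it
    at max_lat_accel gives v_max = sqrt(max_lat_accel * R).
    The 2 m/s^2 default is an illustrative comfort bound, not a
    production value.
    """
    return math.sqrt(max_lat_accel * radius_m)

# A tight 80 m-radius ramp allows roughly 12.6 m/s (about 45 km/h)
# at a 2 m/s^2 lateral-acceleration bound.
print(round(max_curve_speed(80.0), 1))
```

The same relation explains why a vision system only needs the curve's radius (recoverable from lane-line geometry) to choose a safe speed.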

In contrast, a Y-junction ramp with unclear lane lines poses a greater challenge for the NOA system.

The difficulty of this scenario lies in the abrupt change in lane lines.

Many ramps before a Y-junction are single lanes. As the fork approaches, the two lane lines on either side gradually widen apart, but no additional line is drawn in the middle of the road in advance to distinguish the two diverging lanes.

To human eyes, a small road splitting into two directions is clear at a glance. But a computer relies heavily on lane-line information to determine the direction of travel. In this scenario, the lane lines only widen without changing direction, making it hard for the system to judge the road ahead; in many cases, it will conclude that the lane has not changed. When the single ramp lane then suddenly splits into two diverging lanes, the consequence of that misjudgment is the vehicle driving straight into the guardrail at the fork.

Horizon Pilot's performance in this scenario is exceptionally impressive.

Upon recognizing a Y-junction, Pilot steers the vehicle slightly toward the chosen branch. The car stays centered in the lane, but I can feel it leaning toward the intended direction.

These subtle movements enable me to anticipate the next move of the vehicle with clarity, providing me with a sense of safety and trust.
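The geometric cue described above, lane width growing steadily with no new center line appearing, can be sketched as a toy heuristic. This is purely illustrative (Horizon's perception stack is not public); the width samples, thresholds, and function name are all assumptions for the example.

```python
def detect_lane_split(widths_m: list[float],
                      normal_width: float = 3.75,
                      ratio: float = 1.5) -> bool:
    """Flag a probable lane split ahead.

    `widths_m` holds lane-width measurements sampled at increasing
    distances in front of the car. A split is suspected when the width
    grows monotonically and the farthest sample exceeds `ratio` times
    a normal lane width. All thresholds are illustrative assumptions,
    not production values.
    """
    monotonic = all(a <= b for a, b in zip(widths_m, widths_m[1:]))
    return monotonic and widths_m[-1] > ratio * normal_width

# Width steadily widening from 3.7 m to 6.5 m suggests a Y-fork ahead,
# so a planner could commit to one branch early, as described above.
print(detect_lane_split([3.7, 4.2, 5.0, 5.8, 6.5]))  # True
```

A real system would fuse this with map data and road-edge detection, but the core signal, divergence of the measured lane width, is the same one the article describes.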

As for the curved-ramp driving mentioned earlier, even in low-visibility conditions with the sun shining directly into the cameras, Pilot needed only small steering adjustments and prompt speed control to continue driving smoothly.

If we must find faults in Horizon's high-speed NOA system, two points come to mind.

Firstly, when changing lanes, the system is overly conservative in judging the gap to vehicles ahead and behind. By my visual estimate, it will only change lanes when the gap to both the leading and trailing cars exceeds roughly 20 meters.

Secondly, the system currently cannot independently pass through toll stations. The explanation given by Horizon is that this is for safety considerations. Due to the sudden increase in toll lanes and the more complex road conditions, drivers need to take control of the vehicle before entering the toll station.

I believe that this is also a problem that Horizon Pilot will solve in its future evolution process.

Halo: Multi-Modal Interaction

Whether it's Mono or Pilot, both reflect Horizon's strength in AI perception. With that foundation, building an intelligent in-cabin interaction solution, Halo, was a logical next step for Horizon.

Halo’s core selling point is always-on wake-up.

Currently, mainstream voice interaction usually requires a specific "Hello, XX" wake word to activate the voice system. For the system, a specific wake word avoids misrecognition during everyday conversation, but it also makes interaction feel unnatural. Especially with other passengers in the car, suddenly shouting "Hello, XX" can be a little embarrassing.

The highlight of Halo's always-on wake-up is that it can automatically distinguish, based on semantics, whether you are speaking to it or to another passenger.

"For example, suppose we feel a bit hot now and I mention wanting to open the window. That sounds like a command, but I'm not speaking to Halo directly. Halo recognizes this, so our windows stay closed. But when I actually want Halo to execute my command, like this: 'open the car window'. You can see, the window opens."

Horizon's staff used this example to demonstrate Halo's always-on wake-up.

This is indeed a wonderful experience. As humans, we can tell whether a sentence is addressed to us from tone, volume, and even the direction the speaker is facing. For a computer, however, the recognized text of "open the car window" is practically identical regardless of tone. Yet with AI perception, Halo makes precisely this distinction.
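As a toy illustration of semantics-based addressee detection (my own sketch, not Halo's algorithm), a system might treat only imperative utterances that name a controllable device as direct commands, while conversational mentions of the same device pass through. The verb list and device set below are invented for the example.

```python
# Hypothetical device vocabulary for the sketch; not Halo's actual ontology.
DEVICES = {"window", "air conditioner", "sunroof", "radio"}

def is_direct_command(utterance: str) -> bool:
    """Crude heuristic: a direct command starts with an action verb and
    names a known device; a conversational mention does not start with
    an imperative verb. Real systems use learned intent classifiers
    over full semantics, not keyword rules like these.
    """
    words = utterance.lower().rstrip(".!?").split()
    starts_imperative = bool(words) and words[0] in {"open", "close", "turn"}
    mentions_device = any(d in utterance.lower() for d in DEVICES)
    return starts_imperative and mentions_device

print(is_direct_command("open the car window"))                      # True
print(is_direct_command("it's hot, maybe we should open a window"))  # False
```

The hard part Halo solves is exactly where this heuristic fails: deciding addressee from meaning and context rather than surface wording.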

Because of labels like "autonomous driving" and "smart driving," Horizon is often compared with Tesla. However, they are not really competitors; strictly speaking, their businesses sit several layers apart in the supply chain.

Therefore, although it has developed three highly usable intelligent driving solutions, Horizon emphasized throughout the event that it is a Tier 2 chip manufacturer. The three solutions exist to demonstrate to OEMs what its chip platform can achieve.

Beyond these end-user functions, Horizon says it has invested even more in the backend that consumers never see: from AI chips to the operating system, AI algorithms, AI function-development toolchains, and AI cloud-based SaaS development platforms, providing automaker customers with a complete, easy-to-use intelligent driving development chain.

With more than 20 automakers signed as partners, more than 50 models with pre-installed design wins, more than one million chips shipped, and a growing number of users choosing Horizon, the company's business model is being validated.

Back to the chips themselves: the Pilot 3 solution, built on three Journey 3 chips with only 15 TOPS of total computing power, already delivers a very mature high-speed NOA experience. That makes me look forward to how the upcoming Journey 5, with 128 TOPS, and Journey 6, with up to 1,000+ TOPS, will perform.

The new car equipped with the Journey 5 chip will also be unveiled in the fourth quarter of this year.

Horizon, see you in Q4.

This article is a translation by ChatGPT of a Chinese report from 42HOW. If you have any questions about it, please email bd@42how.com.