Author: Song Shuanghui
Turning the windshield into a display screen: the interactive interface once seen only in sci-fi movies has become reality with the arrival of AR-HUD (Augmented Reality Head-Up Display) technology.
Head-up display (HUD) technology was first used in military fighter jets, allowing pilots to see essential information on the windshield without looking down. In 1988 the technology made its way into cars, initially appearing on models from brands such as Oldsmobile, BMW, and Mercedes-Benz.
Over the decades, the head-up display has gone through technological iterations from C-HUD to W-HUD, yet market reception has remained lukewarm. For many car owners the HUD still feels somewhat redundant, and most people still prefer glancing at phone navigation while driving.
This is because traditional HUDs generally have a small projection area and can display only limited information; the projected content cannot blend into the road environment in real time and may even interfere with the driver's vision.
It wasn't until Mercedes-Benz launched the AR-HUD on the new S-Class in 2020 that this technology began to capture people's imagination. AR-HUD breaks the old limitation on projection area: it can present far richer information, anchor guidance precisely to real-world decisions, and continuously blend the virtual with the real, opening up room for interaction with the car's other sensors.
So what does a better AR-HUD experience actually look like?
AR stands for augmented reality. Creating an AR-HUD with an excellent user experience requires the synergy of hardware and software: suppliers must provide hardware with strong specifications, and car manufacturers must truly integrate the displayed content with the real road environment in software. For a smart car, stacking high-performance hardware is not enough; the manufacturer also needs to optimize the full experience, squeeze the most out of the hardware, and turn a pile of parameters consumers cannot parse into tangible user experience.
After trying the AR-HUD functions in many new cars, the Feifan R7 left the deepest impression on me, thanks to four major advantages: a superior hardware foundation, content that closely matches actual road conditions, rich content that is laid out sensibly, and the discovery of the optimal scenarios for AR-HUD.
Today, taking the Feifan R7 as an example, I will share what kind of AR-HUD it takes for users to completely abandon phone navigation while driving.
Excellent AR-HUD technology requires advanced hardware parameters
Previously, automakers promoted AR-HUD mainly to differentiate it from ordinary HUDs. In reality the principle was the same, only with a wider field of view and a larger display area. Such AR-HUDs are little more than gimmicks with no real AR immersion; I think this kind of product should really be called HUD MAX.
To achieve true AR effects, the hardware itself must reach MAX-level specs. When we drive, we take in the complex road environment ahead with a broad field of vision. If the area the HUD covers occupies only a tiny part of that visual field, the immersive effect is lost no matter how good the software is.
Here we must mention the Feifan R7, a large five-seat pure electric SUV that has locked onto the industry benchmark, the Tesla Model Y, and shows us more of AR-HUD's potential. First, the R7's field of view is an industry-leading 13° × 5°, and its resolution reaches 1920 × 730, giving users a finer picture. At the same time, thanks to its projection distance, this AR-HUD delivers the visual effect of a 70-inch giant screen floating 7.5 meters ahead of the driver.
For comparison, the AR-HUD in the million-yuan-class Mercedes EQS has a 10° × 5° field of view, while those in the Volkswagen ID series and the Audi e-tron measure 10° × 4°. Feifan's AR-HUD has the largest field of view of them all.
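The relation between field of view, projection distance, and apparent screen size follows from simple geometry. As a back-of-the-envelope check, assuming the virtual image is a flat plane at the stated distance, the quoted 13° × 5° at 7.5 m does indeed work out to roughly a 70-inch diagonal:

```python
import math

def virtual_screen_size(h_fov_deg, v_fov_deg, distance_m):
    """Apparent size of a HUD virtual image, treated as a flat plane
    at `distance_m` that subtends the given horizontal/vertical FOV."""
    width = 2 * distance_m * math.tan(math.radians(h_fov_deg / 2))
    height = 2 * distance_m * math.tan(math.radians(v_fov_deg / 2))
    diagonal_in = math.hypot(width, height) / 0.0254  # metres -> inches
    return width, height, diagonal_in

w, h, diag = virtual_screen_size(13, 5, 7.5)
print(f"{w:.2f} m x {h:.2f} m, ~{diag:.0f}-inch diagonal")
# → 1.71 m x 0.65 m, ~72-inch diagonal
```

The same formula shows why a 10° × 4° unit at the same distance would look noticeably smaller: apparent size scales with the tangent of the half-angle.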
Good AR-HUD, the key is fusing virtual and real
To give a more concrete example: a regular HUD covers only about one lane marking ahead of you, a certain brand's HUD MAX covers two, while Feifan's true AR-HUD can cover up to three. This means the AR-HUD can project information across the area our eyes naturally scan while driving, making the information richer and clearer.
In addition, the human head inevitably shakes a little while driving. If the image does not follow the eyes, the AR effect breaks down, and a shaky image can even cause motion sickness. AR devices therefore generally use an eye-tracking camera to ensure the image follows the eyes and stays relatively still in motion.
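Why must the image follow the eyes? Because the virtual image and the real-world anchor sit at different depths, so head movement creates parallax between them. A minimal 2D sketch (my own simplification, not the actual algorithm any vendor uses) shows how far the overlay must shift on the image plane to stay locked onto a distant target when the eye moves:

```python
def overlay_shift(eye_shift_m, image_dist_m, target_dist_m):
    """Lateral shift (metres, on the virtual image plane) needed to keep
    an overlay locked onto a world target when the eye moves sideways.
    Simple parallax model: virtual image at `image_dist_m`, real-world
    anchor (e.g. a lane marking) at `target_dist_m`."""
    return eye_shift_m * (1 - image_dist_m / target_dist_m)

# Eye moves 2 cm; image plane 7.5 m away; arrow anchored 30 m down the road:
print(f"{overlay_shift(0.02, 7.5, 30.0) * 1000:.1f} mm")
# → 15.0 mm
```

Even a 2 cm head wobble demands a visible millimetre-scale correction, which is why an eye-tracking camera feeding this kind of compensation loop is essential to the AR illusion.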
The Feifan R7's AR-HUD has such a design as well. With Huawei supplying the technical capability at the hardware level, Feifan could concentrate on optimizing the end-to-end experience.
Previously, the hit game Pokémon Go was one of the most scenario-driven AR applications on the Internet: just by scanning with your phone camera, you could see Pikachu standing in front of you, breaking the dimensional barrier between the virtual and the real.
Applied to vehicles, a qualified AR-HUD should likewise break down the barrier between virtual prompts and the real road environment, so the two feel seamlessly integrated, with no sense of dissonance.
Let's start with the problem of road fitting. If AR projects a Pikachu hovering in mid-air in your house, the game will feel very unrealistic. But if Pikachu stands on the table and can even climb the books stacked on it, you will find it delightful, because in your mind even Pikachu must obey Newton's laws of motion to seem plausible.

The same goes for the Feifan R7. Our brain has its own "common sense" to satisfy, much like Newtonian mechanics: the navigation arrow should be attached to the ground, not floating in the air perpendicular to it, and the guiding lane line should appear in the lane the driver actually sees on the road. Only by getting these right can drivers feel the immersion AR is supposed to bring.
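"Attached to the ground" has a precise geometric meaning: a ground-anchored arrow must be rendered with the same perspective foreshortening the road itself has. A toy pinhole projection (illustrative only; `eye_height_m` and `focal` are made-up parameters, not Feifan's calibration) shows how points painted on the road shrink toward the horizon with distance:

```python
def project_ground_point(x_m, z_m, eye_height_m=1.2, focal=1000.0):
    """Project a point lying on the road surface (world y = 0) into image
    pixels with a pinhole model, the eye `eye_height_m` above the road.
    Returns (u, v); v grows downward and the horizon sits at v = 0."""
    u = focal * x_m / z_m
    v = focal * eye_height_m / z_m  # ground points fall below the horizon
    return u, v

# A navigation arrow "painted" on the lane centre from 15 m to 25 m ahead:
for z in (15, 20, 25):
    u, v = project_ground_point(0.0, z)
    print(f"z={z} m -> v={v:.0f} px below horizon")
# → 80, 60, 48 px: farther segments sit closer to the horizon
```

An arrow drawn without this foreshortening reads as "floating in the air", which is exactly the dissonance the text describes.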
Imagine the difference: in one case you look down at a 6-inch screen for navigation; in the other you never look down at all, and a dynamic guiding arrow appears directly on the lane ahead of you. The latter not only matches our intuition but also feels like a private, exclusive "privilege" service.
Good AR-HUD, rich content is a highlight
Speaking of rich content: using an AR-HUD only for navigation is underselling it. Beyond common information like speed, speed limit, and navigation, the Feifan R7 can also display the planned route, estimated arrival time, remaining range, and other data on the HUD. Even ADAS information can be presented in the AR-HUD.
In addition, the Feifan R7 brings some core, high-frequency POIs from the map into the HUD, such as charging stations, shopping malls, and cafes. For example, while you are driving, the AR-HUD shows the logo of a charging station 150 meters ahead, and even that two fast-charging piles inside are available. You glance over, see that the remaining range is getting low, and can decide on the spot to top up before heading home.
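The glance-and-decide moment described above is really a small range calculation. A toy heuristic (my own illustration; the function name, parameters, and margin are all hypothetical, not anything in the car) captures the judgment:

```python
def should_detour_to_charge(remaining_range_km, km_to_home, km_to_station,
                            safety_margin_km=20):
    """Detour to the charger if remaining range cannot cover the trip home
    plus a safety buffer, but can still reach the station."""
    return (remaining_range_km < km_to_home + safety_margin_km
            and remaining_range_km >= km_to_station)

# 30 km of range left, 25 km to home, station 150 m ahead:
print(should_detour_to_charge(30, 25, 0.15))
# → True
```

The point of surfacing the POI and pile availability on the HUD is that the driver can run this judgment in a single glance, without opening a charging app.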
Beyond charging stations and the battery-swap stations Feifan plans to build, future OTA upgrades could open imaginative space for commercial information on the R7's AR-HUD, such as the rating of a restaurant you are about to pass on the left or the availability of its free parking lot, which could noticeably improve quality of life. This AR-HUD makes such possibilities real.

Already today, in parking mode, the Feifan R7's AR-HUD offers a cinema mode that projects a screen in front of the driver, letting them enjoy a cinematic experience through the AR-HUD.
This is where the charm of the AR-HUD lies. It could even become the defining configuration of the next generation of intelligent cars, because the room for imagination AR carries is so vast.
However, you may wonder whether all this information displayed together will intrude on the driver's field of vision. With the Feifan R7's AR-HUD this is not a concern, as the picture demonstrates.
We can see that Feifan understands both the distribution of content and the driver's zones of attention. Because the AR-HUD screen is large, Feifan has enough space to place speed and other dense information in the driver's non-core attention area. As shown in the picture, the first things you notice are the stationary car and the adjacent lane markings; then you notice that, apart from a guiding arrow, no HUD-projected information appears in that area.
You will also notice that its UI is very concise. Although there is a lot of information, no single item occupies too much of your vision, and there are reasonable gaps between them; they do not overlap or block our view of the actual road environment.
This works because we have binocular vision, and our brains naturally "fill in the blanks": as long as an obstruction stays small, it does not interfere with perception. For example, if you hold a finger up in front of the screen right now, you can still see, or guess, the part of this article it covers.
This is professionalism: presenting a rich content service without interfering with your view of the traffic ahead. I have driven many cars with large HUDs, such as our Ideal L9, and I was often troubled by information blocking my field of vision, the content being extremely cluttered and dense. Especially when I first took delivery of that car, the HUD simply mirrored the content from the main screen onto the glass, which was a terrible experience.

The Feifan R7 I received was delivered only recently, yet even this first-generation version has clearly been meticulously polished. Before delivering the car to users, the team worked out which areas should display information, which areas must never be obstructed, and how to present each item in a way that suits our habits.
From this we can see that the Feifan team genuinely focuses on product experience and takes responsibility toward its users, something many car companies that claim to be "user-oriented" could learn from.
True AR-HUD, intelligent driving visualization is the soul
Lastly, let's talk about why Feifan has found the best user scenario for AR-HUD: the visualization of assisted driving information. If you have ever experienced navigation-assisted driving, how did you check the system's status? Typically through the real-time 3D environment model rendered on the center console or instrument screen, which is what most people call SR.
However, SR is essentially a reconstruction of what the system sees, re-expressed in a way you can understand. It's like someone making you a PowerPoint, so the efficiency is relatively low. The Feifan R7 also offers the advanced RISING PILOT navigation-assisted driving function, and in addition to SR it uses the AR-HUD to convey the system's running status in a far more intuitive way.
For example, when the system changes lanes automatically, the AR-HUD draws a virtual guide line on the target lane ahead of time, making the intent of the lane change clear at a glance. And in the classic large-vehicle avoidance scenario, the system flashes a red outline on the edge of the target vehicle in the real environment you see, telling you intuitively that this truck is not to be messed with and is best kept at a distance.
If SR is like a PPT presentation, what Feifan has built on the AR-HUD is more like face-to-face communication: the communication cost is far lower and more in line with our psychological expectations. You immediately understand what the system wants to do, every step falls within your expectations, and both the driving experience and safety improve greatly.

Using the AR-HUD to display intelligent driving information is not only practical but also very cool. Isn't this the world through the eyes of a Marvel superhero, seeing things ordinary people cannot see? A bank of high-precision sensors collects data, a high-compute chip crunches it, and the result is superimposed on the real environment right before your eyes. Just like Iron Man!
Focusing on the uniqueness of scenario experience
The AR-HUD is just one microcosm of the driving experience you can feel directly in the Feifan R7. Rather than piling on hardware and functions, the Feifan team pays more attention to the uniqueness of software and the scenario experience.
For example, after shopping or on rainy days, when a passenger's hands are full and they cannot open the door themselves, the driver can open the passenger door with a shortcut key on the screen. When listening to music or watching videos on the passenger screen, you can connect Bluetooth headphones so as not to disturb the driver. These detail-oriented designs may not be advanced functions, but they greatly improve everyday convenience.
This pursuit of excellence in the details extends to the advanced driving assistance functions. The Feifan R7's RISING PILOT highway navigation-assisted driving can automatically change lanes according to the navigation route, overtake slow cars, and steer clear of large trucks. On city roads, the R7's basic ACC + LCC adaptive cruise and lane keeping also perform excellently, with a better driving experience than most L2 driver-assistance systems on the market.
Take the most challenging scenario, intersections: many systems simply disengage there, while others mechanically follow the car in front, creating safety risks. The Feifan R7's lane keeping, however, passes through intersections reliably and does not get led astray.
The Feifan R7 also takes a more humane approach to human-machine co-driving. When I used to commute in a Tesla Model Y, the most annoying thing was that the slightest turn of the steering wheel would make the system exit, turning assisted driving into a struggle with the machine. On the Feifan R7, the steering torque threshold is just right: it neither exits frequently on accidental input nor creates that Tesla-style tug-of-war between human and system.
Another headache is driving through sharp curves. Taking a Tesla Model Y through a high-curvature bend, I always felt the car drifting toward the guardrail, so for safety I would take over before such curves. The Feifan R7, by contrast, handles turns excellently, riding stable and smooth without ever feeling like it is about to swerve, so I could trust the system and let it do its job.
One-pedal mode is another highlight of electric vehicles, but in a Tesla it is mandatory and cannot be turned off. In the Feifan R7, by contrast, the strength of the one-pedal mode and energy recovery is continuously adjustable, giving users full control over the choice.
When a new technology arrives, winning customer acceptance quickly is very hard. One of the most direct and effective approaches is to find the best user scenario and let the experience itself impress customers. Previously, pioneers such as the Model Y showed us the imagination of technology; today, Feifan has landed that technology in concrete scenarios.
On the Feifan R7, the AR-HUD that lets drivers completely abandon their smartphones while driving, along with its other intelligent designs, all embody one principle: technology should serve users' real usage scenarios. For users, the ultimate experience outweighs the stacking of countless functions. This is precisely what gives Feifan the confidence to face competitors like the Model Y head-on, with the slogan "Feifan R7 instead of WHY."
The arrival of the Feifan R7 makes us believe that abandoning the smartphone while driving is only the beginning. When a truly intelligent car breaks down the barriers between software and hardware, it will unleash the infinite potential of intelligent mobility.
This article is a translation by ChatGPT of a Chinese report from 42HOW. If you have any questions about it, please email firstname.lastname@example.org.