Author: Mr. Yu
Before we start today’s topic, let’s discuss a question:
Do you think self-driving technology is “killing” or “saving” people?
Don’t rush to answer; keep reading.
Recently, a Tesla enthusiast shared a video with Elon Musk: a white Tesla Model S was driving down the road while its driver lay slumped motionless over the steering wheel.
Norwegian police later confirmed that the video was not a piece of performance art in the “meanwhile in Russia” vein of Slavic humor, but a bizarre real incident, and released details in a follow-up:
“A Tesla was parked in a tunnel. The 24-year-old driver was asleep behind the wheel. He was heavily intoxicated, but stubbornly denied driving the car. Despite having video footage of him in the car… we have collected the necessary samples.”
Local Norwegian police later confirmed that the owner had been drunk and unconscious while the Model S was driving. After detecting that the driver had been unresponsive for some time, the car eventually came to a stop. The location was far from ideal, however: the system stopped the car in the middle of a tunnel.
Fortunately, no one was hurt and no vehicles were damaged, and the owner received medical attention afterwards. Police said they had run the necessary tests on the owner, provisionally revoked his driver’s license, and filed drunk-driving charges against him.
Setting aside the Huawei autonomous-driving executive who was reassigned last week over inappropriate remarks, Tesla’s Autopilot driver-assistance system has, in this case, arguably saved at least one life.
When Musk shared details of Autopilot’s early development, he said Tesla had “made a lot of effort” to bring the “V1” version of Autopilot to consumers.
Previously, a driver had fallen asleep at the wheel of a Tesla without Autopilot and struck a cyclist, who later died. What left Musk aggrieved was that the driver then sued Tesla, claiming the new-car smell had put him to sleep.
Fortunately, the judge who heard the case was sensible and dismissed the lawsuit.

Perhaps to proactively dispel doubts about the safety of its vehicles, Tesla has published quarterly vehicle safety reports since October 2018, and has also published data on vehicle fire incidents since July 2019.
According to the Q1 2021 report, with Autopilot engaged there was on average one reported traffic accident per 6.74 million km driven. Without Autopilot but with active safety features engaged, the average was about one reported accident per 3.3 million km; with neither Autopilot nor active safety features engaged, it was one per 1.57 million km.
Tesla claims that according to the latest data from the National Highway Traffic Safety Administration (NHTSA), there is an average of one collision per 780,000 km driven in the US.
In other words, it seems that Tesla vehicles with Autopilot engaged may actually be more reliable than human drivers.
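The comparison above can be made concrete with some quick arithmetic. The sketch below uses the per-kilometer figures quoted in the article (the category labels are paraphrased) to compute how each driving mode compares with the US average as reported by Tesla:

```python
# Accident-rate figures quoted above, in km driven per reported accident.
# These are the article's numbers, reproduced for illustration only.
rates_km_per_accident = {
    "Autopilot engaged": 6.74e6,
    "Active safety only (no Autopilot)": 3.3e6,
    "No Autopilot, no active safety": 1.57e6,
    "US average (NHTSA, per Tesla)": 0.78e6,
}

baseline = rates_km_per_accident["US average (NHTSA, per Tesla)"]
for mode, km in rates_km_per_accident.items():
    # Ratio > 1 means more distance driven per accident than the US average.
    print(f"{mode}: {km / baseline:.1f}x the US-average distance per accident")
```

By this arithmetic, Autopilot-engaged driving covers roughly 8.6 times the US-average distance per reported accident, though note that Tesla’s figures and NHTSA’s are not collected under identical definitions, so the ratio is only indicative.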
Elon Musk maintains that Autopilot is, at its core, a safety system meant to make driving easier and more convenient for human drivers. Using an array of sensors and algorithms, it is intended to avoid collisions with other road users and, through emergency braking, to prevent the vast majority of accident types.
The data looks reliable and the vision is compelling, but the system is not infallible. In one case, an Apple engineer driving a 2017 Tesla Model X crashed into a concrete barrier and died; the vehicle caught fire and even exploded. His family later sued Tesla, arguing that Autopilot should have applied emergency braking and failed to prevent the tragedy.
We should assess technology on its merits, but safety is always the critical issue. In the end it is pass or fail: there are no minor issues where safety is concerned.
Just like that lucky man in Norway. We don’t know how much he had drunk, or how long the white Model S had been driving before it stopped, but at least neither he nor anyone else ended up in a body bag. That is the luckiest possible outcome.
The same story may frighten some and comfort others.

Believe it or not, we have all heard heart-wrenching stories of bus or coach drivers who suffered a sudden heart attack mid-journey, used their last moments of consciousness to bring the vehicle safely to the roadside, and then died at the wheel, leaving behind grateful passengers spared from harm.
We would like to believe that advanced driver-assistance systems (ADAS) or autonomous driving features can mitigate such risks: taking over from a driver who cannot stop the vehicle in time, preventing a larger accident, and buying a driver who feels unwell precious time to seek help.
Clearly, some drivers have already taken matters into their own hands, long before autonomous driving is widely available. Several reported incidents involve drunk drivers, like the Norwegian man who was too drunk to drive but let his Tesla take him home, and some cases are even more reckless.
In May 2019, Taylor Jackson of Los Angeles was filmed performing sexual acts with her boyfriend while driving a Tesla with Autopilot engaged. She claimed she trusted the autonomous driving function and had no safety concerns, although by her own account Autopilot disengaged when she accidentally touched the steering wheel.
Taylor’s reckless behavior went viral on social media and even made it onto Bloomberg and other mainstream news outlets. She even tweeted at Elon Musk to brag about making Tesla the top search term on Pornhub. Musk responded ambiguously: “it turns out there are lots of use cases for autonomous driving that we hadn’t considered.”
We don’t know if Musk’s ambiguous attitude will encourage the frenzy of autonomous driving fans on the internet, but we can safely say that we wouldn’t want to share the road with someone like Taylor Jackson.
Whether it’s the LiDAR-based approach favored by most Chinese EV start-ups, the vision-based approach championed by Tesla, or a hybrid of the two, these represent different technological and product paths. Turning them into selling points is the marketing department’s job; when it comes to life and safety, what ultimately matters is drivers’ understanding of traffic laws and the value they place on their own and others’ lives.
Just like those viral videos of people sleeping behind the wheel with Autopilot on: it’s a story if nothing goes wrong, and an accident if something does.

This is what we want to discuss today. The question is not just whether the technology is advanced; it is what responsible companies need to convey to the public and to consumers: rational expectations and a clear understanding of both the technology and its selling points.
Conclusion
Robert Passmore, the APCIA vice president responsible for automobile insurance claims, pointed out that when there is no driver available to question, insurers must gather information from the vehicle itself, such as its speed at the time of the accident and when its systems detected and reacted to the collision with people, vehicles, or objects. Coverage may vary by situation. For vehicles equipped with advanced driver-assistance features, if the vehicle loses control because of driver error, the driver bears the responsibility.
Of course, when it comes to discussing specific liability, something irreversible has already happened.
The incident in Norway did not end in an accident, which at least proves that the assisted-driving system fulfilled its most basic responsibility. So, back to our initial question: do you think autonomous driving technology is about “killing” or “saving” people?
This article is a translation by ChatGPT of a Chinese report from 42HOW. If you have any questions about it, please email bd@42how.com.