Jia Haonan, reporting from the Front Passenger Temple
Reference for Intelligent Vehicles | AI4Auto Public Account
For the first time in history, the police were confused by a self-driving car…
A strange incident involving a RoboTaxi from Cruise, General Motors’ self-driving subsidiary, took place on the streets of San Francisco at night. The vehicle was pulled over by the police for driving without its headlights on, but when officers approached, they discovered there was no one behind the wheel.
This was likely the first time the police had encountered this type of situation, and upon returning to their own car, they may have sought assistance.
But wait! Just a few seconds later, while the officers had their backs turned, the Cruise RoboTaxi sped off and escaped:
General Motors’ Cruise probably never expected their self-driving car to become famous in this way.
However, the core issue behind this incident is one that all self-driving car companies have yet to address: how to deal with traffic violations and police interactions during operation.
What happened?
The basic details of the incident have already been mentioned above, but there are a few other details worth investigating.
Firstly, did the police know there was nobody inside the Cruise RoboTaxi before they pulled it over?
It’s likely they did not: the video shows the officers were on patrol when they noticed a car driving at night without its headlights on, and in the dark it would have been hard to see inside.
Furthermore, if the police did know that this was a self-driving car and pulled it over, it would at least indicate that the San Francisco Police Department has a standard procedure for dealing with self-driving car violations.
However, based on the fact that the police inspected the car before returning to their patrol vehicle, it appears that they may not have encountered this type of situation before.
The Cruise RoboTaxi’s escape raises more questions:
Most importantly, neither the San Francisco police nor Cruise have explained how the police managed to stop the RoboTaxi in the first place.
Could it be that the Cruise RoboTaxi can simply stop on command? That would be very unsafe for users and passengers…
Perhaps the police used an aggressive method to “force stop” the vehicle?
Or maybe the self-driving car had already switched to remote safety mode when the police approached?
However, whatever the circumstances, the behavior of “stopping and then escaping” is not very clever. It astonished passersby: in the video, as the Cruise self-driving car “escaped,” people can clearly be heard exclaiming, “There is no one in the car!”
Fortunately, the chase scene only lasted a short while, and the Cruise self-driving car obediently pulled over and turned on its emergency flashers:
At this point, it can be inferred that both Cruise’s remote safety monitors and its driving algorithm had realized something was wrong with the situation.
The police then operated the screen on the car, possibly contacting Cruise.
This self-driving “violation and escape” challenges both regulation and technology, and as a first in history it is certainly significant. However, it also leaves many open questions.
Let’s hear Cruise’s official explanation of the incident.
Cruise’s Response: Good Interaction and No Ticket
Cruise’s response:
This was not an “escape”; the car was pulling over to a safe location to stop, which is exactly what it should have done.
The self-driving car had a good interaction with the police and fully cooperated with law enforcement.
The police later contacted Cruise personnel, and no ticket was issued.
Cruise did not explain whether it was appropriate for the car to suddenly drive off without first notifying the police.
Nor did it say whether a remote operator took over once the car had parked at the roadside.
However, Cruise claims to be in close communication with the police to establish a set of procedures to deal with violations.
Currently, the method is to post a telephone number inside the car so that Cruise personnel can be contacted when something goes wrong.
The Latest Challenge for Self-Driving Cars: What Happens When They Violate Traffic Laws?
This incident highlights the core issue of what to do when a self-driving car violates traffic laws.
This so-called “handling” refers to the self-driving car’s developers, operators, and traffic authorities establishing a standard process for violations, one that keeps the vehicle’s actions legal, safe, and compliant.
Assuming no remote safety monitor took over Cruise’s self-driving car and it relied solely on its AI, then judging by the outcome, that solution is clearly not up to the task.
More likely, what we saw was a simple precautionary maneuver rather than a deliberate response to a traffic stop.
In other words, the sensors may have detected the police car’s flashing lights and siren, but only registered them as a change in the environment and moved to the side to avoid an accident.
The system did not recognize this as a traffic stop, had no procedure for dealing with law enforcement, and very likely was not even aware of what had happened.
Danger and misunderstandings can easily occur again.
What should autonomous driving companies do?
For RoboTaxis that operate on the road, AI drivers may need to learn more atypical scenarios in addition to regular driving skills.
For example, no autonomous driving company appears to treat police cars and their flashing lights as a distinct recognition target.
As such, there is no existing dataset for the AI to learn from, which makes this another complex engineering effort that starts from scratch.
And even once the system can recognize police cars, how does it distinguish the signals they give? For instance, when a police car signals a pull-over, how does the AI tell whether the order is directed at itself or at another vehicle?
This involves target behavior prediction…
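To make the problem concrete, here is a minimal, purely illustrative Python sketch of one such heuristic. The TrackedVehicle fields, the thresholds, and the pull_over_signal_targets_ego function are all assumptions made for this example; they do not describe how Cruise or any other company actually solves this.

```python
from dataclasses import dataclass

# Hypothetical perception output for a tracked emergency vehicle.
# Field names and thresholds are illustrative assumptions, not
# anything Cruise (or any other company) has disclosed.
@dataclass
class TrackedVehicle:
    is_police: bool           # classifier flags this as a police car
    lights_flashing: bool     # emergency lights detected
    siren_on: bool            # siren detected by microphones
    distance_m: float         # gap between it and the ego vehicle
    behind_ego: bool          # is it in the lane(s) behind us?
    seconds_following: float  # how long it has stayed behind us

def pull_over_signal_targets_ego(v: TrackedVehicle,
                                 min_follow_s: float = 5.0,
                                 max_gap_m: float = 30.0) -> bool:
    """Crude heuristic: treat the stop signal as addressed to the ego
    vehicle only if a police car with lights or siren on has been
    closely following it for a sustained period. A real system would
    fuse far richer cues (lane geometry, loudspeaker audio, gestures)."""
    if not (v.is_police and (v.lights_flashing or v.siren_on)):
        return False
    return (v.behind_ego
            and v.distance_m <= max_gap_m
            and v.seconds_following >= min_follow_s)

# Example: a police car 12 m behind with its lights on for 8 seconds.
print(pull_over_signal_targets_ego(
    TrackedVehicle(True, True, False, 12.0, True, 8.0)))  # True
```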
Autonomous driving stacks are usually divided into driving and parking domains, which is really just a way of distinguishing working modes: the AI understands which scenario it is in and calls the appropriate resources to take the corresponding actions.
Similarly, can’t RoboTaxis have a “violation domain”?
In fact, it need not be complicated at all: after recognizing the pull-over signal from the traffic police, simply pull over and stop properly, stay put, and then display the operator’s contact information on the interaction screen.
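As a thought experiment, here is a minimal Python sketch of what such a “violation domain” could look like as a small state machine. The ViolationDomain class, its states, and the placeholder phone number are hypothetical illustrations, not a description of any real RoboTaxi software.

```python
from enum import Enum, auto

# States of a hypothetical "violation domain" -- the operating mode the
# article proposes, sketched for illustration only.
class ViolationState(Enum):
    NORMAL_DRIVING = auto()
    PULLING_OVER = auto()
    STOPPED_AWAITING_POLICE = auto()
    OPERATOR_CONTACTED = auto()

class ViolationDomain:
    """Minimal flow: detect a pull-over order, stop at the roadside,
    stay put, and show the operator's contact details on the in-cabin
    screen until a human resolves the situation."""

    def __init__(self, operator_phone: str):
        self.state = ViolationState.NORMAL_DRIVING
        self.operator_phone = operator_phone

    def on_pull_over_signal(self) -> None:
        if self.state == ViolationState.NORMAL_DRIVING:
            self.state = ViolationState.PULLING_OVER
            print("Pull-over order detected: finding a safe spot, hazards on.")

    def on_vehicle_stopped(self) -> None:
        if self.state == ViolationState.PULLING_OVER:
            self.state = ViolationState.STOPPED_AWAITING_POLICE
            print(f"Parked. Screen shows: 'Please call the operator at "
                  f"{self.operator_phone}.' The vehicle will not move.")

    def on_operator_takeover(self) -> None:
        if self.state == ViolationState.STOPPED_AWAITING_POLICE:
            self.state = ViolationState.OPERATOR_CONTACTED
            print("Remote operator engaged; only a human releases the car.")

# Walking through the incident the way the article suggests it should go.
domain = ViolationDomain(operator_phone="(555) 010-0042")  # placeholder number
domain.on_pull_over_signal()
domain.on_vehicle_stopped()
domain.on_operator_takeover()
```

The key design choice in this sketch is that once the vehicle enters the violation domain it never resumes driving on its own; only a human operator can release it, which would have avoided the “stop, then escape” behavior seen in the video.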
Of course, if the “violation domain” cannot be implemented for some time, there is also the simplest Chinese solution:
5G cloud valet driving, anyone?
— The End —
This article is a translation by ChatGPT of a Chinese report from 42HOW. If you have any questions about it, please email bd@42how.com.