A recent, deadly accident has raised questions that we want to see answered.
You may have missed it, but a motorist was recently charged with vehicular manslaughter over a crash that occurred while he was allegedly using Tesla Autopilot. Kevin George Aziz Riad reportedly crashed into another car, killing two people. The state is not going after Tesla, but the families of the two victims are suing the famous EV maker.
This is not the first time a person has been charged with a serious crime while using an autonomous system. In 2018, Uber tested its fully autonomous fleet, based on the Volvo XC90, on public roads in Arizona.
One of these cars killed a pedestrian, but the car's safety driver, Rafaela Vasquez, was charged with negligent homicide. She was allegedly streaming an episode of The Voice at the time of the incident.
Here we have two severe cases of semi-autonomous and autonomous technology allegedly being involved in the deaths of three people. I use "allegedly" because both cases are still ongoing, and the legal system operates under the principle that a person or company is innocent until proven otherwise.
To me, the Uber case is straightforward. Uber received permission from the authorities to test the autonomous technology, but not without a safety driver behind the wheel. Said safety driver had one job: to keep a close eye on the car, make sure it followed all the rules, and override it when it failed. If the allegations against Vasquez are found to be valid, she's at fault.
The infamous Uber crash resulted in the ride-sharing company giving up on self-driving cars and selling that part of its business, the Advanced Technologies Group, to Aurora.
In short, these are two very different cases, but they do have something important in common. If Vasquez was indeed watching The Voice as claimed, it can only mean that she had grown complacent with the self-driving XC90. Hypothetically speaking, it was doing a good enough job, so why bother keeping an eye on it?
That brings us neatly to Tesla, Autopilot, and Full Self-Driving. In Riad's case, the state again chose not to go after the manufacturer but rather the driver. And though the Teslarati love to brag about the abilities of their vehicles, Tesla has an explicit proviso on its website for Full Self-Driving: "The currently enabled features require active driver supervision and do not make the vehicle autonomous."
Yet, we keep on finding videos of morons doing all sorts of things while they should be focusing on the road. Just so there's no doubt, allow me to state my feelings about Autopilot and Full Self-Driving clearly.
You are 100% free to use them while still concentrating on the road like the rest of us. If you stream something, read, nap, eat, or engage in any other activity that takes your attention away from the road, you are a danger to society, and your license should be revoked. There is no such thing as a self-driving car, and to state otherwise is dangerous.
Let me make something else clear. Tesla's Autopilot is a brilliant system. It's probably the best adaptive cruise control out there, but that's all it is. Autopilot was a dumb name from the start, as it created a false sense of superiority amongst the EV elite.
Autopilot was also good enough to create a certain sense of complacency, as we've seen many times before. Why not glance down at your phone for a second? And before you know it, a car that can't actually drive itself has been driving for ten minutes.
The big difference here is that Uber's self-driving cars were geofenced, and the company paid a person to pay attention at all times. A Tesla with Autopilot and its driver are free to roam wherever they want.
This specific Tesla crash will spark a massive debate. It will likely start in the comments section below, with the Teslarati calling me a hater. Do me a favor and read all our Tesla reviews. We give credit where credit is due, and we criticize where it's fair to do so. The main problem is the rabid fanboys who worship at the altar of Elon. They keep on serving the master, even though he has made more empty promises than actual models. Yes, we have access to the S3XY range, but where are the Cybertruck, Roadster, Semi, Robotaxi, et cetera?
As a reminder, check out the video below. It's a nice collection of empty promises made by the infamous CEO.
I also find it interesting how positive Autopilot articles are used to prove how advanced the system is. Yet, the Tesla fan club will steer clear of this particular accident (and some other controversial crashes in earlier years) and blame it on the driver. Where do you draw the line?
Let me illustrate with an example. Late last year, a man was caught driving drunk in a Tesla. Actually, he was passed out behind the wheel of his Model S. It navigated the highway perfectly until it eventually pulled off the road. One life saved. And Autopilot fans have used the story many times over to demonstrate how the system saves lives.
But let's go a bit deeper than that. What likely made this young man think that he could get away with drunk driving? Yup, it's complacency again. The false narrative that the car can drive itself gave him the confidence to get behind the wheel, because surely the vehicle could drive him home. That very same complacency makes others believe they can get up to all sorts of antics while the Tesla eats up the miles.
Why are you willing to own this story and not the one where two people were killed? It doesn't fit the narrative, does it?
When it comes to Autopilot, there's good and evil. I like that it forced legacy manufacturers to develop their own systems just to keep up. I'm also happy to admit that it saves lives. I've seen it do so. I've also seen video footage of it misbehaving. That's why I call it a fancy adaptive cruise control system. You can't use it without paying attention, and if Kevin George Aziz Riad is found guilty, the state will likely use him as an example to warn us all.
Tesla not being culpable (for now) also tells us something important about the autonomous future. At the moment, all manufacturers still require you to hold the wheel and pay attention. Why? Because as long as you're holding the wheel, even metaphorically, you're the guilty party.
The manufacturer will be liable once the world switches over to fully autonomous driving. And when that happens, it will only take one crash to unleash the might of class-action lawsuits. The kind of lawsuits that topple companies. Think Dieselgate, but a hundred times worse.
Is Tesla guilty? No, not according to the law. It has all the correct disclaimers in place, but it could have chosen better names for its safety systems. Autopilot was a poor choice, as is Full Self-Driving. There is no such thing as full self-driving, and there won't be until the legalities are ironed out and the tech catches up. Getting that done will require input from various fields, spanning from engineers to philosophers.
What's the message here? Stop with the self-driving BS. Your car can't drive without you paying attention to the road. To state otherwise is irresponsible and creates the illusion that it's perfectly okay not to watch the road while the car is driving.
We're not there yet.