Autonowashing Kills Again, But Our Community Can Solve This Problem
A recent death in California shows us that “autonowashing” can kill people. A lot of freedom comes with Autopilot, but with that freedom comes responsibility. As a community, we can do a lot better.
(Note: Since this article was published, the California Highway Patrol has walked back its strong statement that Autopilot was active during the crash discussed below, so we added a note in that section to correct this and update readers.)
Freedom Means Personal Responsibility
If you’ve read many of my past articles or follow me on Twitter, you probably know I’m not the stereotypical Democrat greenie. I’m not a Republican or a Libertarian, either. My positions are all over the map, and one position some readers here find unpalatable is that I’m into guns. Honestly, I get it. I was raised in a completely different cultural background, and thus have a very different worldview. I disagree with some of you, but I respect where you come from.
I only bring this up because I want to make a point by sharing another thing you might not know: I’m a certified firearms safety instructor. I stand up for gun rights, sure, but I also stand up and put my time in for gun responsibility. I want people to have freedom, but everyone exercising that right also needs to know what kind of responsibility they’re taking on when they put a gun in a holster and walk out their front door with it.
Out of roughly 1,000 students, I had a couple dozen walk out of the class once they realized what kind of trouble they could get into. One guy even interrupted the class and said, “This is too much for me. I didn’t know I could end up in prison,” and then walked next door to the gun shop to sell his gun. That’s how much I emphasized responsibility in my classes.
But you didn’t come to CleanTechnica to read about guns (and I didn’t mention it to start a flame war in the comments). The point of bringing this up is to illustrate that one can stand up for the freedom of something while also advocating for responsibility.
Spider-Man’s uncle didn’t tell him that being a man is bad, or that all of the aspects of being a man (being bigger, stronger, and naturally more aggressive) are toxic. Quite the opposite. He wanted Peter to know that the more power you have, the more responsibility you’ve taken on, and that you have to use wisdom in the application of that power if you want to do good with it.
He didn’t know that his nephew was a superhero, but what he said would apply to that, too.
Bottom line here: The more freedom you have to do something, the more power you personally have, but it’s a two-edged sword. You can ruin yourself with power just as easily as you can help yourself and others with it.
Autopilot Is Like A Superpower Elon Musk Lets You Borrow
Let’s have some fun (or bear with me) for a minute looking at a different version of Spider-Man, and what happened when he misused power given to him by a billionaire tech genius.
I know Elon Musk doesn’t like being compared to Tony Stark, but as a billionaire who works with his employees to launch rockets, experiment with AI and brain-computer interfaces, and accelerate the transition to clean energy, it’s hard not to see some similarities.
Sorry, Elon, you’re the closest thing we have to Tony Stark. Just don’t die on us, OK?
When we get behind the wheel of a Tesla, we’re like Peter Parker when he got the EDITH glasses from Tony Stark (who had passed away). It gave him immense power, which he accidentally misused at first. He almost killed his classmates with a space-based drone strike.
If Elon is our Tony Stark, Autopilot (and whatever FSD features are working when you read this) is like the EDITH glasses. It gives you the power to make your driving up to 10 times safer* when used responsibly, but also gives you the power to endanger yourself and the public when not used responsibly.
Unlike with the systems in other cars with driver assist features, you have a lot of freedom. There’s no glitchy eye-tracking computer that won’t let you wear sunglasses. The “wheel nag” doesn’t bother you that often compared to other vehicles. The car doesn’t know whether you’re actually sitting there or paying attention. Even law enforcement doesn’t seem to take Autopilot abuse that seriously, so they’re probably not going to force you to do the right thing.
You can get away with quite a bit, in other words. With that freedom comes responsibility, though.
What Happens When We Aren’t Responsible
At worst, it can kill you and others. Most recently, we have the case of a California man whose Tesla rammed into an overturned truck on the freeway at night. He died instantly, and others were hurt. Investigators went out of their way to tell us that Autopilot was on and in use at the time of the crash:
“While the CHP does not normally comment on ongoing investigations, the Department recognizes the high level of interest centered around crashes involving Tesla vehicles,” the agency said in a statement. “We felt this information provides an opportunity to remind the public that driving is a complex task that requires a driver’s full attention.”
Correction Note: Since we published this article, the CHP has walked back its earlier strong statement indicating that Autopilot was active when the crash occurred. “To clarify,” a new CHP statement said, “There has not been a final determination made as to what driving mode the Tesla was in or if it was a contributing factor to the crash.” We apologize for sharing information that later turned out to be unconfirmed.
We’re also pretty sure that autonowashing (the overstatement of a driver assist feature’s capabilities) played a role in this tragedy.
The most troubling posts are on what appears to be his TikTok account. The posts show him using his Tesla’s driving automation in traffic, describing it as “FSD” and “selfdriving,” and describing his state as “tired” and “bored” https://t.co/D5UvMQM1Sr https://t.co/TaywbHVq7V pic.twitter.com/Dt6AXDmPTj
— E.W. Niedermeyer (@Tweetermeyer) May 6, 2021
I know many of you don’t like Niedermeyer, mostly because of his book**. But he has a great point here. When people think Autopilot is Full Self Driving, they are a lot more likely to do dumb things. If you’re lucky, you can get away with it for months, like this guy did. If you’re not so lucky, you can end up dead.
How The Tesla Community Can Stop This
This whole thing reminds me of the time Homer Simpson got into guns. He chases his family away with his unsafe handling and use of a gun, and then assumes the NRA will approve, because people think the NRA doesn’t give a damn about gun safety.
When he hosts a meeting at his house, he finds to his horror that nobody, not even the hillbilly or the actual clown, thinks what he’s doing is OK.
It’s really this way in the real world. If you attend a gun event and handle one unsafely, not only will you get screamed at, but you could be tackled to the ground and have the gun pried from your hand to keep people safe (if you don’t stop it yourself). If you post a picture on a gun-oriented social media group or where gun friends can see it, and you’re mishandling a gun in the photo, expect to have dozens or hundreds of people telling you that you’re a moron.
Even giving bad advice will result in an angry response from other gun people.
There are plenty of ranges, clubs, and other places that will permanently kick you out for misusing a gun, and they’ll call the cops if you come back. It’s taken that seriously (as it should).
We Can Learn From This
The Tesla fan community should be the first people making a stink when people misuse Autopilot or engage in autonowashing. I do see this sometimes, but I don’t think I see it nearly enough.
People posting videos of Autopilot abuse should face the same horde of angry Tesla owners that Lora Kolodny or Jason Torchinsky face when they say something bad about Tesla, but they don’t. People with an Autopilot Buddy or other defeat devices should feel too ashamed of it to even consider showing up at a Supercharger station, and it should be treated like the “cone of shame” that it really is.
We should always correct each other when someone even remotely suggests that Autopilot or the FSD Beta is ready for prime time or safer than it really is, or says anything that could be construed that way. It’s really that important.
We also shouldn’t be afraid to take on the company when it makes this mistake. When Elon or somebody else at Tesla suggests that regulatory approval is all that holds FSD back, that’s autonowashing. When they are anything less than crystal clear that Autopilot (even when you’ve paid for FSD) is dangerous without human supervision, that’s autonowashing. When they fail to speak out against blatant abuse of the technology, like we saw recently in San Francisco, that’s autonowashing.
“Elon Musk really knows what he’s doing, and I think people are just tripping, and they’re scared of the future,” he said, before vowing to never stop abusing the feature. “I paid ten thousand for the Full Self Driving Feature, and it does what it’s designed to do…”
You can’t tell me that a few tweets from Elon Musk calling Param Sharma an idiot wouldn’t change his mind (and the minds of people who think like him).
Bottom line: abusing driver assist features should make one a complete pariah in the Tesla community, and both we and the company are failing to make that happen. That needs to change.
Featured image: screenshot from “Lavish P.” Autopilot abuse video.
*There is some debate about claims that Autopilot is far safer than human driving. It’s mostly used on highways, which are roughly 2.5 times safer than city driving, and that skews the numbers a bit.
**His book actually isn’t as bad as people say. He has both good and bad things to say about Tesla in it.