Tesla Autopilot & FSD Are Not Perfect (Obviously), But Attacks On Them Are Nonsensical & Counterproductive
You’ve got a technology that helps drivers not leave their lanes accidentally, not run into cars in front of them, not run through red lights, not fall asleep driving, and change lanes carefully (with 10 eyes on the roads around you) — yet there’s a whole army of critics calling this technology unsafe, reckless*, and a great risk to good, honest Americans. It’s a little wild.
It feels like this “debate” comes up daily, and having followed this tech for about 7 years, I think I have a right to say that some of the arguments are getting quite “old.”
What triggered another response to the almost religious hate of Tesla driver-assist technology and Tesla’s goal of fully autonomous vehicles this time? This tweet:
Let’s take this in parts. Before we get into the substance, though, I think it’s worth noting that this isn’t just some guy with an uninformed, poor take on Tesla Autopilot & FSD. As his Twitter bio states, David is a “Visiting Fellow at Harvard Kennedy School, focused on the future of cities, tech & mobility. Startup advisor/policy wonk/writer (@CityLab, @Slate, @WIRED, etc).”
I’m going to irritate or even piss off some Tesla fans in this piece, as well as people who think the argument in the tweet above is sensible (because why wouldn’t I try to piss off 99% of people reading this?). One way I’ll do the former is by saying that I assume David is actually a smart, good, thoughtful guy on many topics. But that’s why this kind of logic is so irritating. And also why I find it bewildering at times. It’s pretty easy to understand why the logic in the tweet is faulty, and it certainly shouldn’t be beyond the capability of a visiting fellow at Harvard Kennedy School.
What Do Tesla Autopilot & FSD Do?
I think this matter gets glossed over too much. I was going to skip over a large chunk of it myself, but then I realized that much of the problem with this debate may stem from critics not understanding what Tesla Autopilot and FSD do. I have had Autopilot for years, including the FSD suite. I don’t yet have the most advanced FSD Beta version that certain Tesla owners have been testing and helping to improve for months. So, much of the info below is from personal experience, and the rest (regarding FSD Beta) is from watching FSD videos from others on social media or examining tweets and statements from Tesla CEO Elon Musk.
Autopilot & FSD …
- Keeps your car in its lane.
- Keeps your car centered in your lane better or more consistently than any human can.
- Prevents you from running into a vehicle that you didn’t notice slowing down or stopping in front of you (and even reacts to a car slowing or stopping two vehicles ahead).
- Changes lanes for you with much greater vision than you could ever muster on your own (10 eyes are better than 2).
- Stops at red lights and stop signs for you, whether you are paying attention or not.
- Drives from onramp to offramp on a highway intelligently on its own (passing cars as it deems useful).
FSD Beta …
- Does everything above, but better and more smoothly.
- Drives you basically from parking space to parking space with little or no need for intervention, avoiding all kinds of potential obstacles and hazards in the way.
These are all driver-assist features that, theoretically, make driving safer. Tesla’s driver safety statistics also suggest that they do in fact make driving safer.
What Else Do Tesla Autopilot & FSD Do?
Perhaps some critics know all of the above but don’t know something else: a Tesla will warn you repeatedly, obsessively, and with increasing aggressiveness if you are not doing your part — keeping your hands on the wheel and paying attention to what’s going on. You cannot really use Autopilot or FSD and think they are systems that you can just turn on and ignore.
Just to emphasize it: Tesla’s system has clear, strong, effective methods of warning you to pay attention while driving. It’s beyond simple ignorance to experience a Tesla on Autopilot/FSD and think that it doesn’t adequately warn you to pay attention and take over the car if needed. The warnings are ample and clear. You would have to pretend not to see them to claim that drivers can’t tell whether or not they have to pay attention.
What Do Tesla Autopilot & FSD NOT Do?
Tesla Autopilot and FSD do not yet perfectly drive people around a city, and certainly not to such an extent that the driver could take a nap, work, play a game, or watch tennis while the car is driving itself. I do not know anyone who thinks the tech can do this. And I’ve yet to see anyone using FSD under the assumption that they are not supposed to monitor the system carefully and intervene from time to time.
How Is Tesla Autopilot Or FSD Risking Lives?
With that long intro out of the way, how is Tesla Autopilot or Full Self-Driving risking lives? The software is saving lives, not risking them, no?
There is one potential area of weakness. I wrote about it several years ago, but I think Tesla has been handling it well.
If it’s someone’s job to pay close attention to a task, but it turns out they almost never have anything to do, it is nearly impossible to keep them focused on what they are supposed to monitor. You might say Tesla Autopilot and Full Self-Driving are risking lives if Tesla allows drivers to ignore their role as driver, if Tesla makes them feel like they do not need to pay attention (or lets them think they do not need to pay attention). But Tesla doesn’t do that. There are several things in place to keep drivers attentive.
So, where is the huge extra risk? What is dangerous about a car helping you to not drive into things?
Addressing The Names
Okay, let’s do this.
The names “Autopilot” and “Full Self-Driving” are highly controversial. Why?
For years, “Autopilot” has spooked people who think it means fully automatic driving. However, that’s not what it means, and it never has been. Elon Musk has a background as a pilot, and he knows that autopilot is assistance technology meant to help pilots out, not to replace them. It is the same in Tesla vehicles. Planes do not ditch their pilots just because they have autopilot, and the same is true of cars.
“Full Self-Driving,” or FSD, is a bit of another matter. In my opinion, using this terminology was indeed premature. Yes, full self-driving is the long-term intention, and that’s why Tesla used the term. The hardware in the car is supposed to be adequate for that, but the software is still in development. I bought FSD fully understanding this, with the knowledge that I won’t have to pay for anything else once the software is ready. That’s sensible to me, and I don’t think you can really buy FSD without understanding this.

However, I certainly see how selling a package called “Full Self-Driving” confuses some outsiders, makes them nervous, and gets them upset. Many people think Tesla is misleading buyers simply by using this terminology. I don’t think that’s the case, for the reasons explained above and because FSD costs thousands of dollars (so you don’t just buy it without figuring out what it is), but I can see how the name upsets people and makes them think Tesla is being “reckless.” I think it would be a good idea for Tesla to just use different terminology — but, honestly, that’s basically just for outsiders and public perception, not Tesla owners.
*Well, the tech does help people to wreck less.