The Problem with Tesla’s Full Self-Driving

Dan Straub
4 min read · Jan 13, 2021

One of the most exciting reasons to own a Tesla is the car’s ability to drive itself. It is one of the main features that initially piqued my interest and eventually pushed me to purchase a Tesla of my own. However, one year after taking delivery, true full self-driving continues to be elusive for Tesla. The latest full self-driving software is currently available in beta for a select group of owners. While the rest of us wait for a wider release, I want to identify a problem that this software has yet to address.

Tesla’s current full self-driving (FSD) software is highly capable. There is no denying that. But after spending over 10,000 miles with the public build of FSD, I have found there is one area where the software needs refinement.

My biggest gripe with FSD is its driver etiquette. Or rather, its lack thereof.

Most of my experience with FSD is on the highway, where the program tends to perform exceptionally well. It holds itself perfectly within the lane, even when the lines are difficult to see. It aims for the horizon, and the ping-pong movement between lane lines common in early iterations of this software is now nonexistent. FSD reduces the mental engagement needed for long-distance driving and allows me to shift my focus elsewhere while lightly monitoring the car’s performance and keeping my hand on the wheel.

This low engagement form of driving can carry on for miles until you throw other drivers into the mix. This is where using FSD can get frustrating.

Picture this:

You are driving along the interstate through the rolling hills of central Pennsylvania. There are few other cars on the road in this sparsely populated area. The temperature is in the mid-seventies (perfect conditions for a battery-electric vehicle) and the clear conditions allow for the impeccable visibility of wind turbines dotting the horizon.

FSD is engaged and has been for some time now. For the past one hundred miles, you have not interacted with your car other than a slight tug on the steering wheel to let it know you are still awake. As you approach the crest of a hill, an on-ramp joins the highway and another vehicle is suddenly attempting to merge into your lane. You look at your display screen and see the other car represented there: a perfect virtual rendering of the road around you.

Any moment now your car will surely slow down or move into the left lane to allow the other traveler onto the highway.

Any moment…

Maybe?

Nope, nothing.

You promptly pull down on the turn signal and your car moves into the left lane. The merging vehicle is now able to enter the right lane.

Crisis averted.

Had you not intervened, the other driver would have been forced to slam on their brakes or drive off the road. All of this would have occurred while your Tesla smugly maintained its speed and position.

This is the kind of interaction I am referring to when I claim that Teslas lack driver etiquette. Every time I use FSD, I have to take over to allow other traffic onto the highway. Moving into the left lane is something any respectable driver would do for other motorists. While I understand that Tesla is focused on making FSD safe and effective, the nuance of driver etiquette should also have some representation in the software.

For a car with 8 cameras and a slew of ultrasonic sensors, it is remarkable how it seems to ignore the trajectories of other drivers on the road. Perhaps this illustrates the current limits of machine learning and artificial intelligence, as my Tesla Model 3 has never been able to appropriately address merging traffic.

Now on to situation number two.

Tesla’s radar and cameras are highly effective at recognizing slower-moving traffic ahead on the highway. So much so that the car will initiate and execute a lane change all on its own if it notices traffic ahead is moving too slowly. The problem with this maneuver is that in some cases it will nearly cut off faster-moving traffic in the passing lane. While the car is very adept at analyzing the speed of motorists up ahead, it seems to struggle with, or completely disregard, faster vehicles approaching from behind. I often have to cancel lane changes because there is a swiftly approaching vehicle in the passing lane that my car is trying to merge in front of.

While my car has never initiated a lane change when another vehicle was in its blind spot, it has opted to change lanes when approaching traffic would have to slow down significantly.

Again, a move that demonstrates a lack of driver etiquette.
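To make the rear-traffic problem concrete, here is a toy sketch of the kind of check a lane-change planner could run before committing to a maneuver. This is not Tesla’s actual logic; the function name, thresholds, and units are all invented for illustration. The idea is simple: if a vehicle behind you in the target lane is closing the gap too quickly, the lane change should be deferred.

```python
def safe_to_change_lanes(gap_behind_m, rear_speed_mps, own_speed_mps,
                         min_time_to_close_s=3.0):
    """Toy check: is a lane change courteous given a vehicle approaching
    from behind in the target lane? All names and thresholds here are
    invented for illustration, not Tesla's actual planner logic."""
    closing_speed = rear_speed_mps - own_speed_mps  # > 0 means they are gaining
    if closing_speed <= 0:
        return True  # rear vehicle is not gaining on us
    time_to_close = gap_behind_m / closing_speed
    return time_to_close >= min_time_to_close_s

# A car 40 m back, gaining at 10 m/s, closes the gap in 4 s: acceptable.
print(safe_to_change_lanes(40, 35, 25))   # True
# The same car only 20 m back would close it in 2 s: defer the change.
print(safe_to_change_lanes(20, 35, 25))   # False
```

Even a crude time-to-close threshold like this would rule out the lane changes I find myself canceling, which suggests the gap is less about sensing and more about how the planner weighs traffic behind the car.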

In shedding some light on these two specific examples, my desire is not to bemoan the current state of Tesla’s autonomous driving, but to illustrate how complex the problem of autonomous driving continues to be. At its core, Tesla and its competitors are developing a system that can recognize any roadway and respond accordingly. On top of that, the car needs to handle a near-infinite number of external variables, all while keeping its occupants safe.

Finding the balance between courteous and effective driving will continue to be a problem for Tesla’s FSD program and will take a great deal of time to address. Regardless, I do believe that artificial intelligence will eventually drive better than humans. But Elon Musk’s claim that level 5 autonomy (the car drives and handles all situations with no help from a human) will be here by the end of 2021 seems to be a long shot.

If you would like to see Tesla’s FSD in action, check out my YouTube channel:

https://www.youtube.com/channel/UCxeIGd0Yxb8oGBo6n2y-PnA


Dan Straub

A pharmacist with a love for learning.