Even after a recall, Tesla’s Autopilot does dumb, dangerous things

On the streets of San Francisco, the updated version of Tesla’s driver-assistance software still took the wheel in places it wasn’t designed to handle, including blowing through stop signs


Last weekend, my Tesla Model Y received an over-the-air update to make its driver-assistance software safer. In my first test drive of the updated Tesla, it blew through two stop signs without even slowing down.

In December, Tesla issued its largest-ever recall, affecting almost all of its 2 million cars. The fix arrived like the software updates you get on your phone, except this one was supposed to prevent drivers from misusing Tesla’s Autopilot software.

After testing my Tesla update, I don’t feel much safer — and neither should you, knowing that this technology is on the same roads you use.

During my drive, the updated Tesla steered itself on urban San Francisco streets Autopilot wasn’t designed for. (I was careful to let the tech do its thing only when my hands were hovering by the wheel and I was paying attention.) The recall was supposed to force drivers to pay more attention while using Autopilot by sensing hands on the steering wheel and checking for eyes on the road. Yet my car drove through the city with my hands off the wheel for stretches of a minute or more. I could even activate Autopilot after I placed a sticker over the car’s interior camera used to track my attention.


The underlying issue is that while a government investigation prompted the recall, Tesla got to drive what went into the software update — and it appears not to want to alienate some customers by imposing new limits on its tech. It’s a warning about how unprepared we are for an era where vehicles can seem a lot more like smartphones, but are still 4,000-pound speed machines that require a different level of scrutiny and transparency.

Tesla’s recall follows an investigation by the National Highway Traffic Safety Administration into crashes involving Autopilot. My Washington Post colleagues found that at least eight fatal or serious crashes have involved Tesla drivers using Autopilot on roads where the software was not intended to be used, such as streets with cross traffic.

These crashes have killed or severely wounded not only Tesla drivers, but bystanders. Tesla says its Autopilot software makes its cars safer overall than those without it.

Announcing the recall, NHTSA said the update was supposed to “encourage the driver to adhere to their continuous driving responsibility” when using the technology, and would include “additional checks” on drivers “using the feature outside controlled access highways.” But Tesla wasn’t specific about what, exactly, would change to counteract misuse.

Tesla didn’t respond to my request for comment. NHTSA’s director of communications, Veronica Morales, said the agency’s “investigation remains open” and the agency will “continue to examine the performance of recalled vehicles.”

I found we have every reason to be skeptical that this recall does much of anything.

How I tested Tesla’s recall

It goes without saying: Don’t try this at home. I was quite surprised the Tesla would just blow through a stop sign, so I activated Autopilot near stop signs only when no one else was around. I simulated inattention only to understand the software’s capabilities and limitations, which are now clear.

I took my Tesla out on two identical test drives, before and after the update. My family leases a blue Tesla Model Y, one of America’s best-selling cars, which we’ve been largely content with. (Tesla can be very clever with software, and one time my car even bore witness to its own hit-and-run accident.)

The process of simply getting the recall was itself a red flag for a lack of urgency about this fix. Unlike on a phone, where you can go to settings to look for updates, my car had no button to look for or prompt a download. Tesla’s user manual advised updates would download automatically if I had strong WiFi, so I moved my router outdoors near my parked car. When the recall finally arrived — a week and a half later — it contained a number of other unrelated features as well as a patch on top of its original release.

I was using an Autopilot function known as Autosteer, which Tesla dubs “Beta” software but makes widely available. It automatically turns the wheel to keep the car within lane lines. Drivers of recent Tesla models can easily activate it by pushing down twice on the right-hand stalk next to the wheel.


In fine print and user manuals most drivers probably haven’t pored over, Tesla says that Autosteer “is designed for use on highways that have a center divider, clear lane markings, and no cross-traffic.” It adds: “Please use it only if you will pay attention to the road, keep your hands on the steering wheel, and be prepared to take over at any time.”

As the crashes spotlighted by The Post’s investigation indicate, it isn’t clear to some drivers where they’re supposed to use Autosteer and what, exactly, it will do for them. It’s not nearly as advanced as Tesla’s “Full Self-Driving” capability, which requires a $200 per month subscription to access and is designed to be used on city streets.

Unfortunately, little about the recall forces Autosteer to operate only in situations it was designed to handle.

Nothing changed after the recall about what seems to me to be the most critical issue: the places in which Autosteer will activate. I was able to use it well beyond highways, including city streets with stop signs, stop lights and significant curves. Autosteer flew into speed bumps at full speed, causing a raucous ride.

This is bad software design. Teslas already contain mapping systems that know which street you’re on. Tesla’s surround-view cameras can identify stop signs and cross traffic. Why doesn’t Autopilot’s software pay attention to that data and allow Autosteer to activate only on roads it was designed for? The only factor I encountered that seemed to keep it from operating (and flash a “temporarily unavailable” message) was a street without clear paint lines.
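To make that argument concrete, here is a minimal sketch, written in Python, of the kind of gate I’m describing. The data model and names (RoadContext, autosteer_allowed) are my own hypothetical illustration, built only from the conditions in Tesla’s manual; Tesla’s actual software is not public.

    from dataclasses import dataclass

    # Hypothetical illustration only; Tesla's real software and data model are not public.
    # The idea: use map and camera data the car already collects to limit a
    # lane-keeping feature to the roads its own manual says it was designed for.

    @dataclass
    class RoadContext:
        road_class: str                # e.g. "controlled_access_highway", "city_street"
        has_center_divider: bool       # from map data
        clear_lane_markings: bool      # from camera perception
        cross_traffic_expected: bool   # stop signs, traffic lights or intersections ahead

    def autosteer_allowed(ctx: RoadContext) -> bool:
        """True only when the road matches the manual's conditions: a divided
        highway with clear lane markings and no cross traffic."""
        return (
            ctx.road_class == "controlled_access_highway"
            and ctx.has_center_divider
            and ctx.clear_lane_markings
            and not ctx.cross_traffic_expected
        )

    # A San Francisco street with stop signs would fail the check:
    print(autosteer_allowed(RoadContext("city_street", False, True, True)))  # False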

The two times Autosteer allowed my car to roll right through intersections with stop signs were especially nerve-racking. I could tell from icons on the car’s screen that it could see the signs, yet it did not disengage Autosteer or stop. After digging around Tesla’s website, I discovered that Tesla says obeying stop signs and stop lights is a function included for those who pay for Full Self-Driving. Should you really have to pay extra to keep the software your car comes with by default from doing reckless things?

Tesla’s superfans may argue they don’t want their car (or the government) telling them where they can use certain functions. But only Tesla is truly able to judge the conditions in which its Autosteer software is safe. That information is opaque to drivers, and clearly people keep misjudging it. I believe cars will get safer with self-driving and driver-assistance software, but they need to tap into all available data to do so.

“NHTSA must set their sights beyond this recall and limit Tesla’s Autosteer feature to the limited-access highways for which it was designed,” said Sen. Edward J. Markey (D-Mass.), with whom I shared my test results.

The biggest change my tests did reveal was in how the car warned me to stay attentive to the road while Autosteer was activated. But even that is subtle at best.

At the top of Tesla’s release notes for the recall is a line saying it has “improved visibility” of driver-warning alerts on its main screen. Looking at my own before-and-after photos, I can see these newer messages, which often ask you to apply slight force to the wheel, have larger type, include an icon and now show up in the upper third of the screen.

It is good for critical messages to not require reading glasses. But I also wonder whether more distractions on a screen might actually take people’s attention away from the road.

Tesla’s recall release notes also suggest the warnings will come more often, saying there is increased “strictness” of driver attentiveness requirements when Autosteer is active and the car is approaching “traffic lights and stop signs off-highway.”

Online, some frequent Autosteer users have complained that the recall gives them hands-on-the-wheel warning “nags” much too often. In my pre-recall test drive, I was able to go for 75 seconds on a San Francisco street with traffic lights without my hands on the wheel before getting a warning. On the same road after the update, I could go for 60 seconds without my hands on the wheel.

I wasn’t able to discern what prompted the hands-on-the-wheel alerts I received. On roads with stop lights, I did sometimes get a warning ahead of the intersection — but usually just deactivated the software myself to stay safe. Ahead of the two stop signs the car ran through, one time I got a hands-on warning, and one time I did not.

More worrisome is how the recall handled my car’s interior camera. It’s used along with pressure on the steering wheel to check whether the driver is paying attention and not looking at their phone.

When I covered the lens with a smiley-face sticker — a trick I read about on social media from other Tesla owners — the car would still activate Autosteer. The system did send more warnings about keeping my hands on the wheel while the camera was covered. But I don’t understand why Tesla would allow you to activate Autosteer at all when the camera is either malfunctioning or being monkeyed with.
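For comparison, here is a minimal sketch of a stricter policy in which a covered or broken cabin camera blocks activation instead of merely triggering more wheel warnings. The signal names and logic are my assumptions about how such a check could work, not a description of Tesla’s implementation.

    # Hypothetical policy sketch; the signal names and logic are assumptions,
    # not Tesla's implementation.

    def may_activate_autosteer(camera_available: bool,
                               camera_occluded: bool,
                               hands_on_wheel: bool) -> bool:
        """Fail safe: if the cabin camera is broken or covered, refuse to
        activate rather than falling back to wheel-torque warnings alone."""
        if not camera_available or camera_occluded:
            return False
        return hands_on_wheel

    # A smiley-face sticker over the lens would block activation under this policy:
    print(may_activate_autosteer(camera_available=True,
                                 camera_occluded=True,
                                 hands_on_wheel=True))  # False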

Finally, the update release notes said Tesla’s systems would suspend Autopilot for drivers who collect five “Forced Autopilot Disengagements,” a term for when the software shuts itself off after detecting improper use. I was not suspended during my tests, and I received only one forced disengagement, which didn’t stop me from re-engaging Autopilot shortly afterward.
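As a rough illustration of that strikeout policy, here is a minimal sketch of a counter that suspends the feature after five forced disengagements. Only the five-strike threshold comes from Tesla’s release notes; the class names and the absence of any reset rules are my assumptions.

    # Illustrative sketch of the suspension policy described in the release notes.
    # Only the five-strike threshold comes from Tesla; the rest is assumed.

    SUSPENSION_THRESHOLD = 5

    class DisengagementTracker:
        def __init__(self) -> None:
            self.forced_disengagements = 0
            self.suspended = False

        def record_forced_disengagement(self) -> None:
            # Called when the system shuts Autopilot off for detected improper use.
            self.forced_disengagements += 1
            if self.forced_disengagements >= SUSPENSION_THRESHOLD:
                self.suspended = True

        def can_engage_autopilot(self) -> bool:
            return not self.suspended

    tracker = DisengagementTracker()
    tracker.record_forced_disengagement()  # my single forced disengagement
    print(tracker.can_engage_autopilot())  # True: one strike does not suspend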

How could the government let this pass?

I also shared my results with Sen. Richard Blumenthal (D-Conn.), who told me we need a recall of the recall. “This is tragedy waiting to happen,” he said. “We are going to be demanding additional action from Tesla, and also that NHTSA show some real legal muscle against [CEO] Elon Musk’s mockery.”

NHTSA’s Morales declined to comment on the specifics of my experience. But she said in a statement that the law, known as the Vehicle Safety Act, “puts the burden on the manufacturer” to develop safety fixes.

“NHTSA does not preapprove remedies,” she said. Instead, “the agency will monitor field and other data to determine its adequacy, including field monitoring of the effects of the remedy in addressing the safety problem and testing any software or hardware changes in recalled vehicles.”

Which aspects of the performance would violate NHTSA’s requirements? And how long will this take? Morales said only that the agency’s Vehicle Research and Test Center in Ohio has several Tesla vehicles that it will use for testing.

“Consumers should never attempt to create their own vehicle test scenarios, or use real people or public roadways to test the performance of vehicle technology,” Morales added. “Intentional unsafe use of a vehicle is dangerous and may be in violation of State and local laws.”

Yet every Tesla driver who is using Autopilot with the update is testing the performance of the technology while we wait for NHTSA to do its own testing. It’s hard to see how post-release review serves public safety in an era when software, and especially driver-assistance capabilities, introduces new kinds of risk.

Compare a current Tesla to your phone. Apps are subjected to prerelease review by Apple and Google before they’re made available to download. They must meet transparency requirements.

Why should a car get less scrutiny than a phone?

“Tesla’s recall makes clear that the cars of the future require smarter safety solutions than of the past,” Markey said.
