Washington Post Challenges Tesla on Autopilot

The electronic user manual accessible in every Tesla states that its Autopilot driver assistance system is “intended for use on controlled-access highways” with “a center divider, clear lane markings, and no cross traffic.”

But you can turn it on anywhere.

The Washington Post, in its second major examination of accidents blamed on the system, asks why.

Related: Washington Post – Tesla Autopilot Behind More Crashes Than Rival Systems

“Even though the company has the technical ability to limit Autopilot’s availability by geography, it has taken few definitive steps to restrict use of the software,” the Post explains.

That approach is rare. Other automakers offer similar systems but typically limit where they can be used.

Many Rival Systems Geofenced

Today’s cars generally know where they are; most are equipped with GPS, and many maintain constant internet connections.

So, Ford’s Blue Cruise and General Motors’ Super Cruise limit where drivers can use them. Those systems engage only on pre-mapped highways.

That approach scores better in consumer testing.

Few consumer testing organizations have formally evaluated hands-free driving systems, but Tesla’s Autopilot hasn’t fared well in the tests that have been published. Consumer Reports ranked it the seventh-best system on the market last year and the eighth-best this year; Blue Cruise took the top spot both times.

In Some Accidents, Owners Were Using It Where Tesla Says Not To

The geofenced approach may also be safer than Tesla’s.

The Post analysis found “at least eight fatal or serious wrecks involving Tesla Autopilot on roads where the driver assistance software could not reliably operate.” The report includes dash camera footage from a Tesla “blowing through a stop sign, a blinking light and five yellow signs warning that the road ends.”

The Tesla then crashed into a parked car, killing one person and severely injuring another.

The Autopilot system has triggered a rare dispute between two federal safety agencies, the Post reports.

Federal Safety Agencies Disagree on What to Do

The National Transportation Safety Board (NTSB) investigates traffic accidents but can only recommend safety changes. The board has called for rules requiring automakers to restrict where drivers can activate automation systems.

The National Highway Traffic Safety Administration (NHTSA) can make actual rules about road safety. The Post reports NHTSA has “said it would be too complex and resource-intensive to verify that systems such as Tesla Autopilot are used within the conditions for which they are designed, and it potentially would not fix the problem.”

Tesla says it’s up to the driver to determine when Autopilot is appropriate. “The driver determines the acceptable operating environment,” the company said in a letter to the NTSB.

NHTSA says it is conducting at least one “active investigation” of Autopilot’s safety, and last year forced changes to the more expensive, more advanced system Tesla calls Full Self-Driving capability.

“But NHTSA has not adopted any rules that would limit the technology to where it is meant to be used, despite exploring how it could ensure such software adheres to its design limits,” the Post explains.
