Published: Wed, September 13, 2017
Money | By Armando Alvarado

Investigators fault driver in Tesla Autopilot crash

The National Transportation Safety Board's finding that Tesla's Autopilot shares the blame for a fatal crash with a truck in Florida last year underscores the need for Federal Motor Vehicle Safety Standards covering automated driver-assistance technologies, Consumer Watchdog said today.

Reuters reported Monday that the NTSB was expected to find that the system was a contributing factor. The Tesla Autopilot system monitored the driver's attention through his interaction with the steering wheel, engagement that previous findings determined was extremely limited.

Joshua Brown was driving his 2015 Model S in Williston, Florida, when a truck made a left turn in front of the car.

The report found that while the truck driver failed to yield the right of way to the Tesla, Brown relied too heavily on the car's automated system, which is likely why he made no attempt to avoid the oncoming collision.

While the board faulted Brown for not paying attention in the seconds before the crash, it noted that Autopilot did not adequately detect other traffic and did not warn the driver early enough to allow sufficient reaction time.

Tesla maintains that Autopilot is not a self-driving system.

NTSB directed its recommendations to automakers generally, rather than just Tesla, saying the oversight is an industrywide problem.

It said the collision should never have happened.

However, NTSB staff pointed to other shortcomings in Autopilot that have not been addressed, such as the fact that Brown was driving on a road that was not intended for Autopilot use. "Tesla allowed the driver to use the system outside of the environment for which it was designed and the system gave far too much leeway to the driver to divert his attention".

"NHTSA should have been a partner with the NTSB in this investigation, but they were not", said John M. Simpson, Consumer Watchdog's Privacy Project Director.

In the aftermath of the crash, Tesla put more stringent limits on hands-off driving, disabling the Autopilot feature if drivers repeatedly ignore the audible and dashboard warnings.

Two minutes earlier, according to reports, Brown had set the speed at nearly 10 miles per hour above the posted speed limit.

The driver had Autopilot engaged for 37 of the 41 total minutes of the trip.

Monitoring driver attention by measuring the driver's touching of the steering wheel "was a poor surrogate for monitored driving engagement", said the board. "There was a small window of time when neither Joshua nor the Tesla features noticed the truck making the left-hand turn in front of the auto", the statement said.

The Model S is rated Level 2 on the self-driving scale of 0 to 5.

Today's meeting comes three months after the NTSB released a 538-page docket of preliminary information gathered by investigators on the May 2016 crash.

"That is simply not the case", said a statement from Brown's family, which broke its silence on the crash.

The NTSB recommended that makers of semi-autonomous vehicles prevent the use of the technology on roads where the vehicles are not suited to travel without human control.

The Model S was traveling at 74 miles per hour using Autopilot, which failed to identify the truck and never slowed the vehicle down, the NTSB said.
