
NTSB will convene on Sept. 12 to determine probable cause of fatal 2016 Tesla crash

FrankCanada97

Roughly the size of a baaaaaarge
Automobile Crash Involving Driver Assist System Focus of NTSB Meeting

WASHINGTON (Aug. 22, 2017) — The NTSB announced Tuesday it scheduled a board meeting for Sept. 12, 2017, to determine the probable cause of the fatal, May 7, 2016, crash of a Tesla car near Williston, Florida.

One person was killed when a 2015 Tesla Model S collided with a 2014 Freightliner Cascadia semitractor-trailer on US Highway 27A. The Tesla's Traffic-Aware Cruise Control and Autosteer lane-keeping assistance features were being used by the driver at the time of the crash. The Tesla was traveling at 74 mph just prior to impact.

A team of five NTSB investigators traveled to Williston to conduct the on-scene phase of the investigation, using three-dimensional laser scanning to document the crash location, the damaged trailer and the damaged car. As the investigation progressed, the team expanded to cover nine areas of inquiry.

The board meeting is scheduled to be held here in the NTSB's Board Room and Conference Center, 429 L'Enfant Plaza SW, Sept. 12, beginning at 9:30 a.m. The meeting is open to the public. For those who cannot attend in person, the meeting will be webcast and a link to the webcast will be available shortly before the start of the meeting at http://ntsb.capitolconection.org/.

U.S. board to vote on likely cause of Tesla 'Autopilot' crash
WASHINGTON (Reuters) - The U.S. National Transportation Safety Board will vote at a Sept. 12 hearing on the probable cause of a May 2016 crash that killed a man using the semi-autonomous driving system on his Tesla Model S sedan, the agency said on Monday.

The fatal incident raised questions about the safety of systems that can perform driving tasks for long stretches with little or no human intervention but which cannot completely replace human drivers.

In June, the NTSB said the driver, Joshua Brown, kept his hands off the wheel for extended periods of time despite repeated automated warnings not to do so. Brown was killed near Williston, Florida, when his Model S collided with a truck while operating in "Autopilot" mode.
...
...
...
During a 37-minute period of the trip when Brown was supposed to have his hands on the wheel, he apparently did so for just 25 seconds, the NTSB said in June.

Tesla in September 2016 unveiled improvements in Autopilot, putting new limits on hands-off driving and other features that its chief executive officer said likely would have prevented the crash death.

A board spokesman, Eric Weiss, said the NTSB could use the September board meeting to make policy recommendations. The board cannot order recalls or force regulatory changes.

In January, the National Highway Traffic Safety Administration said it had found no evidence of defects in the aftermath of Brown's death.

The NHTSA said Brown did not apply the brakes and his last action was to set the cruise control at 74 miles per hour (119 kph), less than two minutes before the crash - above the 65-mph speed limit.

You may have heard about the Tesla "Autopilot" crash from last year. The NTSB has concluded its investigation and will meet on Sept. 12 to vote on the probable cause and, if necessary, issue recommendations to relevant parties. The final report with all the analysis and conclusions will be released shortly after.

The public docket containing factual information gathered during the investigation:
https://dms.ntsb.gov/pubdms/search/hitlist.cfm?docketID=59989&CurrentPage=1&EndRow=15&StartRow=1&order=1&sort=0&TXTSEARCHT=
 

Al-ibn Kermit

Junior Member
The driver is the one who broke the rules, but I don't see any reason why Tesla wouldn't have expected people to do this, especially when auto sites have been raving about doing long-distance trips without touching the steering wheel.
 

captive

Joe Six-Pack: posting for the common man
Wasn't this the guy who was watching Harry Potter?

Not sure how you can blame anyone but him.
 

mnannola

Member
Even though it's mostly the driver's fault, this:

Tesla in September 2016 unveiled improvements in Autopilot, putting new limits on hands-off driving and other features that its chief executive officer said likely would have prevented the crash death.

doesn't bode well for Tesla saying it wasn't their fault at all. If an update could have prevented this, then it means they are at least partially responsible, right?
 

Paz

Member
Even though it's mostly the driver's fault, this:

Tesla in September 2016 unveiled improvements in Autopilot, putting new limits on hands-off driving and other features that its chief executive officer said likely would have prevented the crash death.

doesn't bode well for Tesla saying it wasn't their fault at all. If an update could have prevented this, then it means they are at least partially responsible, right?

No? That's not how improvements to things work.

If I make a knife and warn you that it's sharp, and you cut yourself with it, I don't become responsible for you cutting yourself because I later invent a knife that auto-retracts when it detects human skin.

If safety improvements retroactively made people responsible for accidents that occurred before they existed, it would break much of the world.
 

numble

Member
No? That's not how improvements to things work.

If I make a knife and warn you that it's sharp, and you cut yourself with it, I don't become responsible for you cutting yourself because I later invent a knife that auto-retracts when it detects human skin.

If safety improvements retroactively made people responsible for accidents that occurred before they existed, it would break much of the world.
They shouldn't call it Autopilot if it isn't meant to be used as an autopilot. They actually changed the name of the feature in China because the term they had used there translates to "automatic driving."
 

Future

Member
Autopilot is definitely a bad term. It's cool because of the implications, but the implications cause recklessness.
 

captive

Joe Six-Pack: posting for the common man
They shouldn't call it Autopilot if it isn't meant to be used as an autopilot. They actually changed the name of the feature in China because the term they had used there translates to "automatic driving."

Autopilot is definitely a bad term. It's cool because of the implications, but the implications cause recklessness.

People are stupid, that's the argument you're going with?

Since no one answered, I looked it up: this is the guy who was watching Harry Potter. https://www.theguardian.com/technol...illed-autopilot-self-driving-car-harry-potter

No sympathy.
 

Al-ibn Kermit

Junior Member
People are stupid, that's the argument you're going with?

Since no one answered, I looked it up: this is the guy who was watching Harry Potter. https://www.theguardian.com/technol...illed-autopilot-self-driving-car-harry-potter

No sympathy.

It's been established that he was reckless. It's not just idiots who are going to get into accidents.

By your logic, cars would have zero side-impact protection because that's an accident that only happens due to user error. The fact of the matter is that you need to protect against the most frequent and predictable crash situations, and when you promise people that the car can drive itself, which you are basically doing when you name it Autopilot, 99.9% accuracy from the self-driving system isn't good enough.

Tesla getting spanked for this is necessary for the car industry as a whole.
 

numble

Member
People are stupid, that's the argument you're going with?

Since no one answered, I looked it up: this is the guy who was watching Harry Potter. https://www.theguardian.com/technol...illed-autopilot-self-driving-car-harry-potter

No sympathy.
He didn't watch Harry Potter (or anything else, for that matter). You can read the NHTSA and NTSB reports.

Where did I say people are stupid? You don't see an issue with calling something "automatic driving" and expecting people to know it actually means "assisted driving"?
 

captive

Joe Six-Pack: posting for the common man
It's been established that he was reckless. It's not just idiots who are going to get into accidents.

By your logic, cars would have zero side-impact protection because that's an accident that only happens due to user error. The fact of the matter is that you need to protect against the most frequent and predictable crash situations, and when you promise people that the car can drive itself, which you are basically doing when you name it Autopilot, 99.9% accuracy from the self-driving system isn't good enough.

Tesla getting spanked for this is necessary for the car industry as a whole.
Disagree. It's the driver's fault; the crash was 100% preventable if he had been paying attention to the road.

He didn't watch Harry Potter (or anything else, for that matter). You can read the NHTSA and NTSB reports.
Just going by the news reports I've seen.
Where did I say people are stupid? You don't see an issue with calling something "automatic driving" and expecting people to know it actually means "assisted driving"?
Because harping on the name assumes people will think it means one thing despite the TOS stating otherwise and despite the system's repeated warnings to keep your hands on the wheel and eyes on the road. You're assuming that people are too stupid to understand the difference and/or follow directions.

What does it matter if the autopilot is set on 74 instead of 65?
Because he was speeding, technically.
 

captive

Joe Six-Pack: posting for the common man
Further establishes the driver was reckless and was speeding.

I mean, I'm not sure I would go that far. I'm not familiar with where this guy is from, but generally here in Texas there's the speed limit and "the speed limit." Generally, and especially on highways, the flow of traffic is going about 10 miles above the speed limit, and people going the speed limit or under are more dangerous because the flow of traffic has to avoid them.

Now if he were doing 85 or 90 in a 65, I would agree with you.
 

FrankCanada97

Roughly the size of a baaaaaarge
I mean, I'm not sure I would go that far. I'm not familiar with where this guy is from, but generally here in Texas there's the speed limit and "the speed limit." Generally, and especially on highways, the flow of traffic is going about 10 miles above the speed limit, and people going the speed limit or under are more dangerous because the flow of traffic has to avoid them.

Now if he were doing 85 or 90 in a 65, I would agree with you.

Funny that you mention this: the whole attitude surrounding speed limits was part of a study conducted by the NTSB.
http://m.neogaf.com/showthread.php?t=1418848&page=1
 

nynt9

Member
I mean, I'm not sure I would go that far. I'm not familiar with where this guy is from, but generally here in Texas there's the speed limit and "the speed limit." Generally, and especially on highways, the flow of traffic is going about 10 miles above the speed limit, and people going the speed limit or under are more dangerous because the flow of traffic has to avoid them.

Now if he were doing 85 or 90 in a 65, I would agree with you.

I'd wager that when driving an experimental vehicle, I'd actually pay attention to the real speed limit.
 

sangreal

Member
They shouldn't call it Autopilot if it isn't meant to be used as an autopilot. They actually changed the name of the feature in China because the term they had used there translates to "automatic driving."

I agree with your point, but autopilot doesn't mean full autonomy in any context. Planes still need engaged human pilots even with autopilot (for now).
 
Even though it's mostly the driver's fault, this:

Tesla in September 2016 unveiled improvements in Autopilot, putting new limits on hands-off driving and other features that its chief executive officer said likely would have prevented the crash death.

doesn't bode well for Tesla saying it wasn't their fault at all. If an update could have prevented this, then it means they are at least partially responsible, right?

You're having a common reaction to a safety improvement being made, but it's not a helpful one. If improving something meant you were responsible for all accidents that happened before the improvement, no manufacturer would ever make safety improvements. We want to encourage them, which is why you can't even make the argument you just did in a lawsuit: evidence of subsequent remedial measures is generally inadmissible to prove fault, precisely so that fixing a problem doesn't count as an admission.
 

sangreal

Member
I'm not so sure the layman understands this, though.

Right, that's why I agree it's a terrible name. I guess my point is just that it's interesting people have this perception, considering the first systems to carry the name were far more rudimentary than Tesla's Autopilot.
 

darscot

Member
74 is speeding, but it may not have been enough to make any difference; even if he had been driving, he may not have reacted fast enough. The truck driver is 100% at fault; the Tesla clearly had the right of way. The report states that there is no evidence he was using electronics. It seems like the truck cut him off, the car failed to detect it, and he failed to react in time. It could easily have been the same result if he was in control.
 

FrankCanada97

Roughly the size of a baaaaaarge
74 is speeding, but it may not have been enough to make any difference; even if he had been driving, he may not have reacted fast enough. The truck driver is 100% at fault; the Tesla clearly had the right of way. The report states that there is no evidence he was using electronics. It seems like the truck cut him off, the car failed to detect it, and he failed to react in time. It could easily have been the same result if he was in control.

If I am reading the facts correctly, the car and truck were visible to each other at around 1,100 feet. If the Tesla driver had been aware, and if we consider the sole witness's testimony reliable, he would have had around 6-7 seconds to react. If the Tesla had been going the speed limit, he would have had maybe an extra second. From my quick analysis, I would guess that if he had started slowing the moment the truck turned into the intersection, he might have missed it.
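
A rough back-of-envelope sketch of that comparison in Python (the ~1,100 ft sight line is from the docket; treating the speed as constant is my own simplification, and the 6-7 second figure above also accounts for when the truck actually started its turn):

# Time available over the ~1,100 ft sight line at 74 mph vs. the 65 mph limit.
SIGHT_LINE_FT = 1100  # approximate mutual visibility distance from the docket

def seconds_to_cover(distance_ft, speed_mph):
    """Time to cover distance_ft at a constant speed (1 mph = 5280/3600 ft/s)."""
    return distance_ft / (speed_mph * 5280 / 3600)

for mph in (74, 65):
    print(f"{mph} mph: {seconds_to_cover(SIGHT_LINE_FT, mph):.1f} s")
# 74 mph: 10.1 s; 65 mph: 11.5 s -- going the limit buys roughly an extra
# second and a half, consistent with the "maybe an extra second" estimate.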

https://dms.ntsb.gov/public/59500-59999/59989/604694.pdf

Witness Interview Transcript:
https://dms.ntsb.gov/public/59500-59999/59989/604743.pdf
 

numble

Member
74 is speeding, but it may not have been enough to make any difference; even if he had been driving, he may not have reacted fast enough. The truck driver is 100% at fault; the Tesla clearly had the right of way. The report states that there is no evidence he was using electronics. It seems like the truck cut him off, the car failed to detect it, and he failed to react in time. It could easily have been the same result if he was in control.
It wasn't that he wasn't reacting fast enough; it was that he wasn't reacting at all. He did not brake or even put his hands on the wheel during those 7 seconds.

The NHTSA report is already out, and it concludes the Tesla driver was distracted:
the driver took no braking, steering or other actions to avoid the collision; and 4) the last recorded driver action was increasing the cruise control set speed to 74 mph less than two minutes prior to impact. The crash occurred on a clear day with dry road conditions. On June 21, 2016, NHTSA deployed a Special Crash Investigations team to the crash site to evaluate the vehicle and study the crash environment. NHTSA’s crash reconstruction indicates that the tractor trailer should have been visible to the Tesla driver for at least seven seconds prior to impact.

...

The Florida fatal crash appears to have involved a period of extended distraction (at least 7 seconds). Most of the incidents reviewed by ODI involved events with much shorter time available for the system and driver to detect/observe and react to the pending collision (less than 3 seconds). An attentive driver has superior situational awareness in most of these types of events, particularly when coupled with the ability of an experienced driver to anticipate the actions of other drivers. Tesla has changed its driver monitoring strategy to promote driver attention to the driving environment.
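
To put that 7-second window in perspective, here's a quick constant-deceleration sketch in Python; the 0.7 g braking figure and the 1.5 s perception-reaction time are my own assumptions, not numbers from the report:

# Could an attentive driver have stopped from 74 mph within the ~7 s that
# NHTSA says the trailer was visible? Assumed figures, not from the report.
G_FTS2 = 32.2                   # gravity, ft/s^2
DECEL = 0.7 * G_FTS2            # assumed hard braking on dry pavement, ft/s^2
SPEED_FPS = 74 * 5280 / 3600    # 74 mph in ft/s (~108.5)
REACTION_S = 1.5                # assumed perception-reaction time, s

stop_time = REACTION_S + SPEED_FPS / DECEL
stop_dist = REACTION_S * SPEED_FPS + SPEED_FPS**2 / (2 * DECEL)
print(f"~{stop_time:.1f} s, ~{stop_dist:.0f} ft")
# ~6.3 s over ~424 ft -- inside both the 7 s window and the ~1,100 ft sight
# line, which is why the reports stress that no braking or steering happened.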
 