
NTSB says driver in fatal Tesla crash was overreliant on the car’s ‘Autopilot’ system

FrankCanada97

Roughly the size of a baaaaaarge
Driver Errors, Overreliance on Automation, Lack of Safeguards, Led to Fatal Tesla Crash

Via Washington Post
The driver of a Tesla who was killed in a crash that drew worldwide attention last year was too reliant on the car's “Autopilot” system when he plowed into the side of a tractor-trailer at more than 70 miles per hour, federal investigators concluded Tuesday.

The National Transportation Safety Board said Joshua Brown's overreliance on the autopilot system “permitted his prolonged disengagement from the driving task and his use of automation in ways inconsistent with guidance and warnings from the manufacturer.”

The Tesla's “Autopilot” system functioned as designed in the May 7, 2016, crash. However, the system is meant to augment, not replace, the driver, the NTSB said.

“In this crash, Tesla's system worked as designed, but it was designed to perform limited tasks,” NTSB Chairman Robert Sumwalt said. “The result was a collision that should not have happened. System safeguards were lacking.”

The board said the “operational design” of the vehicle's autopilot encouraged Brown's overreliance on it.

“Drivers must always be prepared to take the wheel or apply the brakes,” Sumwalt said.

The NTSB findings came as a partial exoneration of Tesla and a relief for those working to put autonomous vehicles on the road. Linking the Tesla crash to the coming generation of fully autonomous cars had fueled public fears of such vehicles, surveys found.

“I think it's important to clear up a possible misconception,” Sumwalt said. “The automobile involved in the collision was not a self-driving car.”

In the aftermath of the crash, Tesla put more stringent limits on hands-off driving, disabling the autopilot feature if drivers repeatedly ignore the audible and dashboard warnings. Among the NTSB recommendations Tuesday, the board said automakers should incorporate similar measures and restrict use on highways with cross traffic.

An NTSB investigator testified Tuesday that “collision mitigation systems” do not reliably detect cross traffic.


The crash has been documented by at least three teams of investigators, including one from the NTSB, which issued a preliminary report in June.

Brown, 40, a former Navy SEAL, was driving down a four-lane highway near Williston, Fla., on a sunny Saturday afternoon with his Tesla Model S set in autopilot mode. The system allows the vehicle to guide itself — using multiple sensors linked to a computer system — like a greatly enhanced cruise control system, and comes with automatic emergency braking designed to avoid frontal collisions.
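For readers wondering what "a greatly enhanced cruise control system" means in practice, here is a rough, hypothetical sketch of how a driver-assist loop like the one described is commonly structured: fused sensor readings feed lane-keeping, speed-holding, and emergency-braking logic. Every name and threshold below is invented for illustration; this is not Tesla's software.

```python
import math
from dataclasses import dataclass

@dataclass
class SensorFrame:
    lane_offset_m: float     # lateral offset from lane center
    lead_distance_m: float   # distance to object ahead (math.inf if none)
    lead_closing_mps: float  # closing speed toward that object
    ego_speed_mps: float     # our own speed

def control_step(frame: SensorFrame, set_speed_mps: float):
    """One tick of a toy assist loop: returns (steering, throttle, brake)."""
    # Lane keeping: proportional steer back toward lane center.
    steering = -0.1 * frame.lane_offset_m

    # Cruise control: nudge throttle toward the driver-set speed.
    throttle = max(0.0, min(1.0, 0.05 * (set_speed_mps - frame.ego_speed_mps)))

    # Automatic emergency braking: brake hard if time-to-collision is short.
    brake = 0.0
    if frame.lead_closing_mps > 0 and math.isfinite(frame.lead_distance_m):
        time_to_collision_s = frame.lead_distance_m / frame.lead_closing_mps
        if time_to_collision_s < 2.0:  # illustrative threshold
            throttle, brake = 0.0, 1.0
    return steering, throttle, brake
```

Note how forward-looking the braking branch is: a trailer crossing perpendicular to the lane may never register as a "lead object" at all, which is exactly the cross-traffic limitation the NTSB investigator describes above.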

Two minutes earlier, according to reports, Brown had set the speed at almost 10 miles per hour above the posted speed limit.

At about 4:40 p.m., a 53-foot tractor-trailer loaded with blueberries that had been traveling in the opposite direction turned left toward a side road, blocking the path of Brown's Tesla.


The Tesla careened under the truck's trailer, traveled almost 300 feet farther and snapped off a utility pole, spinning around into a front yard about 50 feet away.

The driver of the blueberry truck, Frank Baressi, 62, told the Associated Press that Brown was “playing Harry Potter on the TV screen.” The Florida Highway Patrol said a DVD player was found in the Tesla, but two of the NTSB investigators on Tuesday disputed that it was being used to watch a video.

“We are quite certain that was not the case,” the NTSB's Ensar Becic told the board members.


In its preliminary report, the NTSB said that Brown had his hands on the wheel for just 25 seconds in the final 37 minutes of his drive. The report said that he had received six audible warnings and seven visual dashboard warnings from the autopilot system telling him to keep his hands on the steering wheel.
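To make that warning timeline concrete, here is a toy sketch of the kind of hands-off escalation the article describes, including the lockout Tesla added after the crash. The thresholds and strike count are invented; the article does not give Tesla's actual timings.

```python
HANDS_OFF_VISUAL_S = 30    # hypothetical: seconds hands-off before a visual warning
HANDS_OFF_AUDIBLE_S = 45   # hypothetical: seconds before an audible warning
MAX_IGNORED_WARNINGS = 3   # hypothetical: strikes before Autopilot disables itself

class HandsOnMonitor:
    def __init__(self):
        self.hands_off_s = 0.0
        self.ignored = 0
        self.locked_out = False

    def tick(self, hands_on_wheel: bool, dt_s: float = 1.0) -> str:
        if self.locked_out:
            return "AUTOPILOT_DISABLED"
        if hands_on_wheel:
            self.hands_off_s = 0.0   # any torque on the wheel resets the timer
            return "OK"
        self.hands_off_s += dt_s
        if self.hands_off_s >= HANDS_OFF_AUDIBLE_S:
            self.ignored += 1
            self.hands_off_s = 0.0   # restart the countdown after each strike
            if self.ignored >= MAX_IGNORED_WARNINGS:
                self.locked_out = True   # stays off until the car is parked
                return "AUTOPILOT_DISABLED"
            return "AUDIBLE_WARNING"
        if self.hands_off_s >= HANDS_OFF_VISUAL_S:
            return "VISUAL_WARNING"
        return "OK"
```

Under logic like this, 25 seconds of hands-on time across 37 minutes would generate exactly the kind of repeated audible and visual warnings the report counts.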

The National Highway Traffic Safety Administration joined the NTSB, the highway patrol and Tesla in investigating the crash. NHTSA determined that Tesla's autopilot feature was not at fault, and its investigators said Brown never tried to avoid the truck or apply the brakes before the crash.
...
...
...
Traditional automakers plan to gradually introduce features until the day arrives when they've produced a fully autonomous vehicle. But newcomers to the market, like Waymo, plan to put fully autonomous vehicles on the road from day one. Waymo, which spun off from Google's self-driving car project to develop an independent brand, concluded that a vehicle without a steering wheel or pedals was the way to go after discovering its own employees often got distracted when driving autonomous cars equipped with steering wheels.

Anticipating the attention paid to Tuesday's NTSB hearing, Brown's family issued a statement Monday through its lawyer.

“We heard numerous times that the car killed our son. That is simply not the case,” the family statement said. “There was a small window of time when neither Joshua nor the Tesla features noticed the truck making the left-hand turn in front of the car.”

The full report will be released in the coming days. I'll highlight the specific safety recommendations that the NTSB has issued.

Public docket containing all information gathered by investigators:
https://dms.ntsb.gov/pubdms/search/hitlist.cfm?docketID=59989&CFID=1126988&CFTOKEN=b1b9a5b7e849bb32-88C471C8-01C8-077F-A22C23E2E219927F
 
They really should have named it something other than autopilot.

At some point, though, a fully autonomous car will crash and kill someone. It's inevitable.
 
They really should have named it something other than autopilot.

At some point, though, a fully autonomous car will crash and kill someone. It's inevitable.

How do you stop this? Pretty simple. Make every car on the road autonomous and have them communicate with each other locally over a secure, encrypted LAN.

Human beings should not be operating motor vehicles, period. It should be illegal.

Yes I know this is a dream today, but this is what absolutely needs to happen.
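For what it's worth, here is a toy sketch of the "cars talking over a secure local network" idea. Real vehicle-to-vehicle standards (e.g., IEEE 1609.2) use certificate-based signatures rather than the shared key used here; this only illustrates that position broadcasts must be authenticated and fresh before any car acts on them.

```python
import hashlib
import hmac
import json
import time

# Toy shared key; a real deployment would use per-vehicle certificates.
SHARED_KEY = b"demo-key-not-for-real-cars"

def make_beacon(car_id: str, lat: float, lon: float, speed_mps: float) -> bytes:
    """Serialize and authenticate one position broadcast."""
    msg = json.dumps({"id": car_id, "lat": lat, "lon": lon,
                      "speed": speed_mps, "ts": time.time()}).encode()
    tag = hmac.new(SHARED_KEY, msg, hashlib.sha256).digest()
    return tag + msg

def read_beacon(packet: bytes, max_age_s: float = 1.0):
    """Return the beacon dict, or None if forged, corrupted, or stale."""
    tag, msg = packet[:32], packet[32:]
    expected = hmac.new(SHARED_KEY, msg, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        return None  # fails authentication: ignore it
    beacon = json.loads(msg)
    if time.time() - beacon["ts"] > max_age_s:
        return None  # stale: could be a replayed packet
    return beacon

print(read_beacon(make_beacon("car-42", 29.35, -82.45, 28.0)))
```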
 

BriGuy

Member
These half-steps sound like they would be more worrisome than anything. Like if I'm the one driving, I only need to worry about what I'm doing. With auto not-auto pilot, I would have to worry about how I'm driving and how the computer may or may not assist me. I think I'm going to keep my hands glued to the wheel until full autopilot is not just a thing, but more prevalent than not.
 
I agree with everyone about the 'AutoPilot' branding. That's a really misleading characterization of the feature and is likely to get Tesla sued no matter what warnings come along with it.

Kinda bullshit that because of this driver's actions, self-driving cars had a minor setback.

That was always going to happen: people are hypercritical of technology that they're skeptical of to begin with. Despite tens of thousands of people dying per year in traditional automobiles, a handful of autonomous or semi-autonomous crashes will get the majority of negative attention by the 'AHA! I knew it!' crowd that are intimidated by the technology. I'm sure there were horse owners talking shit about the original automobiles when they came about.
 

captive

Joe Six-Pack: posting for the common man
They really should have named it something other than autopilot.

At some point, though, a fully autonomous car will crash and kill someone. It's inevitable.

Absolutely. Something like "Assisted Driving" would have worked. But that doesn't sell cars so....

yes it was the name that caused the driver to ignore the warnings in the TOS that you HAVE to accept before using the auto pilot software. It was the name that caused him to ignore multiple audible and visual warnings that he was not touching the steering wheel and thus not paying attention.
 

RuGalz

Member
I never understand why so many are fine beta testing an evolving technology with their lives. Maybe I am just too old school. And "auto pilot" is just a horrible name for something so immature.
 

Stinkles

Clothed, sober, cooperative
My car has automatic emergency braking and it's terrifying. It tends to engage when I'm backing into a parallel space and "panics" when cars (rightfully and safely) drive around me. And it's a screechy, grating braking, not smooth at all.
 

toxicgonzo

Taxes?! Isn't this the line for Metallica?
An NTSB investigator testified Tuesday that “collision mitigation systems” do not reliably detect cross traffic.
So that's kind of a problem.

The technology still has some hurdles to overcome.
 
I didn't know autopilot was not autopilot. I'd want it all or nothing. Having the car take some actions while I'm driving would take quite a bit of getting used to.
 

TripleBee

Member
It's not about autonomous cars never getting into an accident (even a fatal one, sad as that is) - it's about whether they do it statistically less often than human drivers.

I realize there's an element of "at least I'm in control of what happens to me" that many people have - but that does little to help when somebody runs a red light straight into you.

If the bar for autonomous cars to gain traction as the default is that no accidents ever happen, then it'll never happen.
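The arithmetic is easy to make concrete with hypothetical numbers (the rates below are illustrative, not measured):

```python
# Purely illustrative rates, per 100 million vehicle-miles traveled.
human_rate = 1.2         # hypothetical human-driven fatality rate
autonomous_rate = 0.4    # hypothetical autonomous rate, one third of human

# US drivers cover roughly 3 trillion vehicle-miles per year.
annual_miles_in_100m_units = 3.0e12 / 1.0e8

print(f"human-driven: {human_rate * annual_miles_in_100m_units:,.0f} deaths/year")
print(f"autonomous:   {autonomous_rate * annual_miles_in_100m_units:,.0f} deaths/year")

# Under these assumptions the autonomous fleet still kills thousands of
# people a year, yet prevents tens of thousands of deaths. Each of those
# thousands will make headlines; the prevented deaths never will.
```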
 

Stinkles

Clothed, sober, cooperative
There's very good data to suggest that automated driving will be VASTLY safer than human driving, but rationality and statistics will be pitted against emotional reaction the first time there's a fatality in a networked system.

You see it in this very thread.
 

THE:MILKMAN

Member
I didn't know autopilot was not autopilot. I'd want it all or nothing. Having the car take some actions while I'm driving would take quite a bit of getting used to.

The hype and hyperbole around driverless cars isn't helping. It will be one of those things that is very close but very far away, with the huge problem of mixing semi-autonomous cars with fully human-driven ones.

The final 0.1-1.0 percent of getting to 100% driverless is going to be very tough, I think.
 

Soul Beat

Member
yes it was the name that caused the driver to ignore the warnings in the TOS that you HAVE to accept before using the auto pilot software. It was the name that caused him to ignore multiple audible and visual warnings that he was not touching the steering wheel and thus not paying attention.
I wasn't saying the name of the feature was what caused the crash. I do, however, believe it was a factor in the crash. People hardly ever read the TOS at all, so it's not too far-fetched that this person truly believed the feature was 100% autopilot.

Plus, if he was actually watching a movie while driving, as the truck driver claims, it would also have impaired his ability to notice both the visual and auditory warnings from the car.
 

WaterAstro

Member
How do you stop this? Pretty simple. Make every car on the road autonomous and have them communicate with each other locally over a secure, encrypted LAN.

Human beings should not be operating motor vehicles, period. It should be illegal.

Yes I know this is a dream today, but this is what absolutely needs to happen.

I've been thinking of this since forever.

It will be inevitable as the population rises. Humans are too stupid to drive.
 
How do you stop this? Pretty simple. Make every car on the road autonomous and have them communicate with each other locally over a secure, encrypted LAN.

Human beings should not be operating motor vehicles, period. It should be illegal.

Yes I know this is a dream today, but this is what absolutely needs to happen.

Ten years ago I would have laughed at you. But today I'm in complete agreement.
 
They really should have named it something other than autopilot.

At some point, though, a fully autonomous car will crash and kill someone. It's inevitable.

Okay? Fully manned cars can kill people too. What's your point? That systems never claimed to be 100% perfect are, in fact, not 100% perfect?

So that's kind of a problem.

The technology still has some hurdles to overcome.

Of course.
 

Mesoian

Member
How do you stop this? Pretty simple. Make every car on the road autonomous and have them communicate with each other locally over a secure, encrypted LAN.

Human beings should not be operating motor vehicles, period. It should be illegal.

Yes I know this is a dream today, but this is what absolutely needs to happen.

This is literally the plot of Motorcity.
 

FrankCanada97

Roughly the size of a baaaaaarge
I wasn't saying the name of the feature was what caused the crash. I do, however, believe it was a factor in the crash. People hardly ever read the TOS at all, so it's not too far-fetched that this person truly believed the feature was 100% autopilot.

Plus, if he was actually watching a movie while driving, as the truck driver claims, it would also have impaired his ability to notice both the visual and auditory warnings from the car.
He wasn't watching a movie. It's unclear what impaired his awareness.
 

pr0cs

Member
I've been thinking of this since forever.

It will be inevitable as the population rises. Humans are too stupid to drive.
People shouldn't be allowed to own cars either. They're bad for the environment (yes, even electric ones) and we've seen time and again that we can't be trusted to do the right thing.
Designing for the lowest common denominator needs to be a requirement for any vehicle.
 

Ivan 3414

Member
How do you stop this? Pretty simple. Make every car on the road autonomous and have them communicate with each other locally over a secure, encrypted LAN.

Human beings should not be operating motor vehicles, period. It should be illegal.

Yes I know this is a dream today, but this is what absolutely needs to happen.

Why do people suggest ideas like this as if there aren't people who enjoy driving themselves? You don't think there's going to be humongous blowback if self-driving vehicles are the only vehicles allowed on the road?

Any decision like that would be the Prohibition of the 21st century.
 
Why do people suggest ideas like this as if there aren't people who enjoy driving themselves? You don't think there's going to be humongous blowback if self-driving vehicles are the only vehicles allowed on the road?

Any decision like that would be the Prohibition of the 21st century.

I think it's incredibly important that people be allowed to break the rules and take manual control at a moment's notice. Sometimes emergencies require unsafe driving.

Imagine a future hurricane or wildfire where too many fleeing cars overload the system and it can't pathfind its way out of gridlock, so no one can escape in time. Imagine a loved one is dying or giving birth and having to do 30 MPH on the way to the hospital. Maybe the system doesn't know there's construction on the way but you already know which routes not to take. etc. etc.
 

jwk94

Member
One thing that scares me about the Tesla is how the windshield's glass wraps around to the back. Isn't that dangerous?
 

FyreWulff

Member
Why do people suggest ideas like this as if there aren't people who enjoy driving themselves? You don't think there's going to be humongous blowback if self-driving vehicles are the only vehicles allowed on the road?

Any decision like that would be the Prohibition of the 21st century.

There will be grumbling for a bit, but at a certain point insurance companies will not insure your car unless it is self-driving capable, and the premiums for driving yourself will be astronomical, because self-driving cars will be massively safer at driving than you will be. You don't see people keeping horses and feeding them for transportation anymore. They let someone else take care of them and ride them recreationally during the summer.
 

commedieu

Banned
I feel like they should warn drivers thoroughly. Most people feel like they can just fuck off in the passenger seat like it's 2029.

It's cool tech, but you still need to pay attention and be ready to take control. Sort of like if a child or Donald Trump were driving.
 

captive

Joe Six-Pack: posting for the common man
I feel like they should warn drivers thoroughly. Most people feel like they can just fuck off in the passenger seat like it's 2029.

It's cool tech, but you still need to pay attention and be ready to take control. Sort of like if a child or Donald Trump were driving.

They do. IIRC, if you take your hands off the wheel for longer than 30 seconds or a minute, it beeps at you.
 
I feel like they should warn drivers thoroughly. Most people feel like they can just fuck off in the passenger seat like it's 2029.

It's cool tech, but you still need to pay attention and be ready to take control. Sort of like if a child or Donald Trump were driving.

I mean, it said he received 6 audible warnings and 7 visual ones.
 
I don't see the point of autopilot if I always have to have my hands on the wheel and be paying attention. Get back to me when it's fully automated and I don't even need a wheel.

How do you stop this? Pretty simple. Make every car on the road autonomous and have them communicate with each other locally over a secure, encrypted LAN.

Human beings should not be operating motor vehicles, period. It should be illegal.

Yes I know this is a dream today, but this is what absolutely needs to happen.

I agree with you, but the problem is that getting to a 100% conversion rate is probably 25+ years off, if not more. I mean, people still drive classic cars made 50 or more years ago.
 
I think it's incredibly important that people be allowed to break the rules and take manual control at a moments' notice. Sometimes emergencies require unsafe driving.

Imagine a future hurricane or wildfire where too many fleeing cars overload the system and it can't pathfind its way out of gridlock, so no one can escape in time. Imagine a loved one is dying or giving birth and having to do 30 MPH on the way to the hospital. Maybe the system doesn't know there's construction on the way but you already know which routes not to take. etc. etc.

An automated system would eliminate gridlock in a catastrophic event. If anything, humans driving their way out of a natural disaster would make things worse and cause more accidents, because humans panic and won't follow orders so that everyone can get out in time and safely; automated machines would not have this problem.

In regard to an emergency like someone dying or giving birth, the car could have an emergency mode that would make every other automated car move out of the way automatically and give your car priority/right of way while speeding the vehicle up to the same speeds as emergency vehicles. Imagine a human trying to do this today without causing complete chaos on the road.

Sorry folks, automation in cars solves too many critically important issues and increases the quality of life for humans in so many unimaginable ways that it cannot be ignored or downplayed in any way.
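Sketching that emergency mode out makes the abuse worry obvious: a car should only yield if the priority request carries an authorization it can verify. Everything below — the names, the message fields, the stubbed signature check — is invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class PriorityRequest:
    vehicle_id: str
    route_segment: str            # road segment the requester needs cleared
    dispatcher_signature: bytes   # proof that an authority granted priority

def signature_is_valid(req: PriorityRequest) -> bool:
    # Stub: a real system would verify a cryptographic signature from a
    # dispatch authority, so drivers cannot self-grant priority.
    return req.dispatcher_signature == b"signed-by-dispatch"

def on_priority_request(req: PriorityRequest, my_segment: str) -> str:
    if not signature_is_valid(req):
        return "IGNORE"      # unauthorized request: nobody moves
    if req.route_segment != my_segment:
        return "NO_ACTION"   # we are not on the corridor being cleared
    return "PULL_OVER"       # yield the lane, then resume the route

print(on_priority_request(
    PriorityRequest("ambulance-7", "US-27A-north", b"signed-by-dispatch"),
    my_segment="US-27A-north"))
```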
 
An automated system would eliminate gridlock in a catastrophic event. If anything, humans driving their way out of a natural disaster would make things worse and cause more accidents, because humans panic and won't follow orders so that everyone can get out in time and safely; automated machines would not have this problem.

In regard to an emergency like someone dying or giving birth, the car could have an emergency mode that would make every other automated car move out of the way automatically and give your car priority/right of way while speeding the vehicle up to the same speeds as emergency vehicles. Imagine a human trying to do this today without causing complete chaos on the road.

Sorry folks, automation in cars solves too many critically important issues and increases the quality of life for humans in so many unimaginable ways that it cannot be ignored or downplayed in any way.

Systems go down and malfunction, especially in times of crisis when you might have cellular towers getting destroyed or power loss. I'm specifically talking about extenuating circumstances when it might NOT work, you can't simply imply that it would always work. It wouldn't always work. Nothing always works. That's why there always needs to be a manual option.

And an emergency mode that gives your car priority, are you kidding me? You think people could seriously be trusted not to abuse that, in a fully automated setting?

What are you proposing for more recreational vehicles? Motorcycles? Boats? Snowmobiles? Off-road cars or ATVs? In an emergency evacuation I'm picturing utter standstill on the roads as automated driving algorithms fail while the dude on his ATV waves at them as he gets to safety.
 

Smiley90

Stop shitting on my team. Start shitting on my finger.
"In it’s preliminary report, the NTSB said that Brown had his hands on the wheel for just 25 seconds in the final 37 minutes of his drive"

I first read this as "25 seconds out of 37" and thought "well that's not so bad"

then re-read it and saw it was MINUTES.

Wow, yeah, that's totally the driver's fault. That's not how Tesla's autopilot works at all.
 

KingV

Member
yes it was the name that caused the driver to ignore the warnings in the TOS that you HAVE to accept before using the auto pilot software. It was the name that caused him to ignore multiple audible and visual warnings that he was not touching the steering wheel and thus not paying attention.

That's just human nature though. You have to plan around it to some extent (which they later did by requiring people to put their hands on the steering wheel).

Posting a warning that says "attention, you must be just as attentive while using autopilot as you would be if there was NO autopilot to ensure safe operation of the vehicle" might be sufficient to absolve them of legal responsibility, but it's probably not going to move the needle on how many people actually heed said advice.

It's one thing to say "oh, this dude was irresponsible", but many, many people are irresponsible and will continue to be irresponsible, so these systems need to plan for that in ways that effectively shut the vehicle down if the driver is not operating it properly.
 

FrankCanada97

Roughly the size of a baaaaaarge
I can't wait for the news articles and threads when that happens. Should be fun.

I'm thinking what will really blow up in the news is whether, in an unavoidable accident, the self-driving car would prioritize saving its passenger(s) or bystander(s).
 

FyreWulff

Member
I'm thinking what will really blow up in the news is whether, in an unavoidable accident, the self-driving car would prioritize saving its passenger(s) or bystander(s).

Which is a stupid edge case that already happens with human drivers.

A properly designed car makes it survivable for both, even in the current human-driver situation.
 