
Initech

(100,102 posts)
Sun Jul 3, 2016, 04:01 PM Jul 2016

DVD Player Found At Scene Of Tesla Model S Crash Likely To Be Cause, Not Autopilot

Not. The. Onion.

A digital video disc player was found in the Tesla car that was on autopilot when its driver was killed in a May 7 collision with a truck, Florida Highway Patrol officials said on Friday.

Whether the portable DVD player was operating at the time of the crash has not been determined, however, and witnesses who came upon the wreckage of the 2015 Model S sedan gave differing accounts on Friday about whether the player was showing a movie.

Questions of why the car did not stop for a turning truck, and whether the victim, Joshua Brown, was watching the road are critical for Tesla. The Palo Alto luxury electric car maker is facing a preliminary inquiry by federal regulators over the safety of the Model S Autopilot system that was engaged at the time of the crash in Williston, Florida.

It could be weeks if not months before officials make a final determination of the cause of the crash, the first known fatality of a Model S driver while using Autopilot. Meanwhile, the accident is stoking the debate on whether drivers are being lulled into a false sense of security by such technology.

A man who lives on the property where Brown’s car came to rest some 900 feet from the intersection where the crash occurred said when he approached the wreckage 15 minutes after the crash, he could hear the DVD player. An FHP trooper on the scene told the property owner, Robert VanKavelaar, that a “Harry Potter” movie was showing on the DVD player, VanKavelaar told Reuters on Friday.

http://venturebeat.com/2016/07/03/dvd-player-found-inside-tesla-model-s-involved-in-fatal-crash/


You know those stories you see where one side automatically goes off on a shitstorm and takes the blame game way too far while more information is needed? This is that more information.
DVD Player Found At Scene Of Tesla Model S Crash Likely To Be Cause, Not Autopilot (Original Post) Initech Jul 2016 OP
And here's why you don't want to watch a movie while on "Autopilot" progree Jul 2016 #1
yeah that seems kinda basic. Schema Thing Jul 2016 #2
They let it on the road because the driver is supposed to pay attention to the road, even if the car is LisaL Jul 2016 #3
Bad decision IMHO -- they know or should know that a driver who is led to believe that progree Jul 2016 #4
terrible idea Locrian Jul 2016 #10
Yeah I think that's a warning they need to put on cars. Initech Jul 2016 #5
On that stretch of road a driver might have had trouble seeing the truck csziggy Jul 2016 #6
So "autopilot" is a good name for this, at this stage of development? To release to the progree Jul 2016 #11
I can't speak to the engineering or software aspects csziggy Jul 2016 #13
Out of curiosity, do the autopilots with LIDAR also suffer that deficiency? Crash2Parties Jul 2016 #14
I doubt that a human assisted self-driving car Urchin Jul 2016 #18
I drive a 65 Plymouth Valiant, this is not an issue I worry about Kilgore Jul 2016 #7
the root cause of the collision was a truck driver who turned in front of a car. uncle ray Jul 2016 #8
Yeah but if this guy turned on the autopilot thinking it was actually going to autopilot... Initech Jul 2016 #9
Were there skid marks? Urchin Jul 2016 #19
Brakes were never applied, so no skid marks, at least not before the collision progree Jul 2016 #22
Autopilot is just more like next generation cruise control, not driverless car tech like google andym Jul 2016 #12
Perhaps they shouldn't market it as autopilot (or make it really really clear it isn't in the media) Crash2Parties Jul 2016 #15
I think that is correct-- autopilot is probably too strong a word andym Jul 2016 #16
Yeah, naming it 'autopilot' was a BAD idea... something like Driver Assist System would have been JCMach1 Jul 2016 #23
I am thinking this driver got the autopilot confused with driverless car. Initech Jul 2016 #20
This doesn't add any more information! We already knew it was both driver and system error. Yo_Mama Jul 2016 #17
Would the driver have been watching his DVD player if there was no "autopilot"? nationalize the fed Jul 2016 #21
Regardless, this is pretty scary KeizoOshima2 Jul 2016 #24

progree

(10,918 posts)
1. And here's why you don't want to watch a movie while on "Autopilot"
Sun Jul 3, 2016, 04:23 PM
Jul 2016
Following fatal crash, Tesla partner says Autopilot can’t avoid these kind of accidents, BGR News, 7/1/16

https://www.yahoo.com/tech/following-fatal-crash-tesla-partner-says-autopilot-t-170657301.html?nhp=1

Electrek got its hands on a statement from a Mobileye executive, who explained that the Automatic Emergency Braking system is only capable of protecting drivers from rear-end collisions, not lateral collisions (as was the case in this crash).

Here's the full statement regarding the crash from Dan Galves, Mobileye’s Chief Communications Officer, provided by Electrek:

"We have read the account of what happened in this case. Today’s collision avoidance technology, or Automatic Emergency Braking (AEB) is defined as rear-end collision avoidance, and is designed specifically for that. This incident involved a laterally crossing vehicle, which current-generation AEB systems are not designed to actuate upon. Mobileye systems will include Lateral Turn Across Path (LTAP) detection capabilities beginning in 2018, and the Euro NCAP safety ratings will include this beginning in 2020."


Amazingly, they let a car on the road with a so-called "autopilot" capability whose software is this deficient -- one that won't react to a vehicle turning broadside into your lane right in front of you!
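To make the limitation concrete, here is a minimal sketch (hypothetical Python, not Mobileye's or Tesla's actual code) of a braking rule scoped only to rear-end threats, which is the kind of system the statement above describes -- it simply never fires for a vehicle crossing laterally in front of you:

from dataclasses import dataclass

@dataclass
class Target:
    range_m: float             # distance to the vehicle ahead, meters
    closing_speed_mps: float   # rate at which that gap is shrinking, m/s
    lateral_speed_mps: float   # target's sideways speed across our lane, m/s

def should_brake_rear_end_only(t: Target,
                               ttc_threshold_s: float = 1.5,
                               lateral_cutoff_mps: float = 2.0) -> bool:
    """Brake only for an in-path, rear-end style threat; ignore crossing traffic."""
    if t.closing_speed_mps <= 0:
        return False                                 # gap is opening: no threat
    if abs(t.lateral_speed_mps) > lateral_cutoff_mps:
        return False                                 # laterally crossing vehicle: out of scope
    time_to_collision_s = t.range_m / t.closing_speed_mps
    return time_to_collision_s < ttc_threshold_s     # brake only when impact is imminent

# A truck turning across the lane shows large lateral motion, so this rule
# never commands the brakes even though a collision is about to happen.
crossing_truck = Target(range_m=30.0, closing_speed_mps=29.0, lateral_speed_mps=4.0)
print(should_brake_rear_end_only(crossing_truck))    # False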

LisaL

(44,974 posts)
3. They let it on the road because the driver is supposed to pay attention to the road, even if the car is
Sun Jul 3, 2016, 04:36 PM
Jul 2016

on autopilot.

progree

(10,918 posts)
4. Bad decision IMHO -- they know or should know that a driver who is led to believe that
Sun Jul 3, 2016, 04:42 PM
Jul 2016

it is in effect a self-driving car that might have some bugs will be less attentive. Especially with a grossly misleading name like "autopilot".

If it is a professionally trained driver who has training for this specific purpose -- knowledgeably testing out "autopilot"'s capabilities and limitations -- then that is one thing. But it sounds to me like they let anyone drive these cars -- albeit with lots of warnings in the manual, etc. Not ready for the general public. Especially when it is called "autopilot".

Locrian

(4,522 posts)
10. terrible idea
Sun Jul 3, 2016, 05:40 PM
Jul 2016

Humans make TERRIBLE backup systems. They get distracted, etc.
Computers as backups -- that makes sense.

Initech

(100,102 posts)
5. Yeah I think that's a warning they need to put on cars.
Sun Jul 3, 2016, 05:05 PM
Jul 2016

And maybe a passive aggressive one is needed like this:

"Don't watch fucking movies while you're driving!!!"

csziggy

(34,137 posts)
6. On that stretch of road a driver might have had trouble seeing the truck
Sun Jul 3, 2016, 05:06 PM
Jul 2016

I've driven that road many, many times since 1972. The road is headed WNW - at this time of year in the late afternoon you're driving directly into the sun. Other reports on the accident indicate that the Tesla cameras could not distinguish between the white trailer and the bright sky.

Because of the bad lighting in the late afternoon on that road (US 27) I plan my trips so I don't hit that in the hour before full sunset. If I end up getting close to that time when heading west, I stop and eat dinner and let the sun go down.

People on that stretch of road tend to go anywhere between extremely slow and way way too fast - I've had to swerve to avoid a guy driving a lawn tractor in the right lane going maybe 5 mph, passed farm tractors going 20 mph, or gotten stuck behind locals lazing along at 40 mph. I've also been passed while I was going 70 by cars that made me feel as if I were in park.

That road also has numerous driveways, small country roads and random intersections. A driver on that road needs to pay attention because you never know what will pull out in front of you!

The key point is that Tesla is not selling these cars as self-driving. Yeah, they call it "AutoPilot," but just as airplane pilots generally do not rely on an autopilot to handle all aspects of flying, a Tesla driver should not rely on AutoPilot to handle all aspects of driving.
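For a rough sense of why a white trailer against a bright sky is hard for a camera, here is a small illustration (made-up pixel values, not Tesla's actual vision pipeline) using Weber contrast, a standard measure of how much an object stands out from its background:

def weber_contrast(object_luminance: float, background_luminance: float) -> float:
    """Weber contrast: how far the object's brightness departs from its background."""
    return (object_luminance - background_luminance) / background_luminance

# Rough 8-bit pixel intensities, purely for illustration
print(weber_contrast(60.0, 230.0))    # dark trailer vs. bright sky: about -0.74, easy to pick out
print(weber_contrast(225.0, 230.0))   # white trailer vs. bright sky: about -0.02, nearly invisible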

progree

(10,918 posts)
11. So "autopilot" is a good name for this, at this stage of development? To release to the
Sun Jul 3, 2016, 05:45 PM
Jul 2016

general public?

That it brakes only to avoid a rear-end collision? That it won't brake for any other situation -- a child running out into the road, someone stepping off the curb, a vehicle turning across your path (regardless of lighting or weather conditions), a car traveling in the opposite direction entering your lane? (Hopefully it brakes for red traffic lights.)

Yeah, I read the early explanations of how lighting conditions were to blame -- a big white truck broadside against a bright white sky -- and that's one thing. But according to this, the software simply isn't designed to react to anything other than an impending rear-end collision, even under ideal lighting conditions, and that puts it in a totally different realm.


csziggy

(34,137 posts)
13. I can't speak to the engineering or software aspects
Sun Jul 3, 2016, 06:29 PM
Jul 2016

I just have a lot of experience with that stretch of road, enough to avoid it in the late afternoon. In fact, these days when I drive between Tallahassee and Lakeland I take 27-19-98 and stay on 98 all the way into Lakeland (27 and Alt-27 turn off at Perry and Chiefland, respectively; 19 continues south when 98 turns east south of Homosassa Springs).

But I don't expect totally self-driving cars to be practical in my lifetime. Though maybe in the next thirty years it will be a toss-up as to which is safer, the human drivers or the computers.

This driver should have known better than to leave everything up to the Tesla - according to what I have read he's posted a lot of videos about the car and was very familiar with it and its features. If he wanted to experiment with totally self-driving that particular stretch of road was a very bad choice. A limited access highway, such as an interstate, in a much less populated part of the country would have been a much better choice.

Crash2Parties

(6,017 posts)
14. Out of curiosity, do the autopilots with LIDAR also suffer that deficiency?
Sun Jul 3, 2016, 06:38 PM
Jul 2016

Since, you know, Tesla refuses to use that particular technology.

 

Urchin

(248 posts)
18. I doubt that a human assisted self-driving car
Sun Jul 3, 2016, 07:28 PM
Jul 2016

I doubt that a human assisted self-driving car is a practical combination.

If a human driver were in a car for which a computer did most of the driving, that person would become bored to tears, and would likely be unable to avoid the temptation to do something else, like text, watch a DVD, surf the web, read, etc.

But when most of the driving is done by the human, they are too occupied by the act of driving to be easily bored.

Kilgore

(1,733 posts)
7. I drive a 65 Plymouth Valiant, this is not an issue I worry about
Sun Jul 3, 2016, 05:23 PM
Jul 2016

However, just about everything else is. This is a car one "herds" down the road; there is no way the driver can't be engaged.

uncle ray

(3,157 posts)
8. the root cause of the collision was a truck driver who turned in front of a car.
Sun Jul 3, 2016, 05:28 PM
Jul 2016

the fact that a DVD player was found means little; we don't know if he was listening to a movie, watching it, or if the accident somehow caused the disc to begin to play. whatever was going on in the Tesla is secondary to the fact that a commercial truck driver failed to yield the right of way and turned in front of a passenger car.

Initech

(100,102 posts)
9. Yeah but if this guy turned on the autopilot thinking it was actually going to autopilot...
Sun Jul 3, 2016, 05:37 PM
Jul 2016

And then watched a movie while the car was in operation, that would not be Tesla's fault, would it?

progree

(10,918 posts)
22. Brakes were never applied, so no skid marks, at least not before the collision
Sun Jul 3, 2016, 09:59 PM
Jul 2016
http://www.latimes.com/business/la-fi-tesla-crash-20160630-snap-story.html

The male driver died in a May 7 crash in Williston, Fla., when a big rig made a left turn in front of his Tesla.

In a blog post, Tesla Motors Inc. said the car passed under the trailer, with the bottom of the trailer hitting the Model S’ windshield.

“Neither autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied,” Tesla said.


http://www.mcclatchydc.com/news/politics-government/national-politics/article87081747.html
Sixty-two-year-old Frank Baressi, the driver of the truck and owner of Okemah Express LLC, said the Tesla driver was "playing Harry Potter on the TV screen" at the time of the crash and driving so quickly that "he went so fast through my trailer I didn't see him."

andym

(5,445 posts)
12. Autopilot is just more like next generation cruise control, not driverless car tech like google
Sun Jul 3, 2016, 06:20 PM
Jul 2016

has been testing. The driver is supposed to pay attention for good reason -- the Tesla technology does not do everything for you; it just assists you. Probably a bad idea -- people really need all or nothing in order to pay sufficient attention.

andym

(5,445 posts)
16. I think that is correct-- autopilot is probably too strong a word
Sun Jul 3, 2016, 06:45 PM
Jul 2016

probably should call it driver-assistant or something.

JCMach1

(27,572 posts)
23. Yeah, naming it 'autopilot' was a BAD idea... something like Driver Assist System would have been
Sun Jul 3, 2016, 10:13 PM
Jul 2016

better...

Initech

(100,102 posts)
20. I am thinking this driver got the autopilot confused with driverless car.
Sun Jul 3, 2016, 09:05 PM
Jul 2016

One of these things is definitely not like the other.

Yo_Mama

(8,303 posts)
17. This doesn't add any more information! We already knew it was both driver and system error.
Sun Jul 3, 2016, 07:12 PM
Jul 2016

However I have got to say that a system that is not 100% competent to drive the car (as in more reliable than a human driver in all foreseeable circumstances) should not be deployed in vehicles without specially trained drivers who are paid to test the system.

The reason for this is that:
1) A system that works great 90% of the time will build driver trust in it,
2) The human "driver's" attention will then inevitably wander,
3) Even if it doesn't, the time between a complacent driver's realization that the car is NOT reacting correctly and the moment that driver can get hands in place and correct will often be too short. A driver who is "coasting" and letting the vehicle drive itself, even with hands on the wheel and eyes on the road, will react more slowly than otherwise.

Human nature is predictable, and deploying this system in its current state was going to predictably produce accidents.

None of the self-driving systems works 100% of the time -- all are tripped up by some circumstances, especially sensor limitations (dirt, snow, rain, clouds, etc.). But most of them are being tested with special drivers who are adequately trained.

nationalize the fed

(2,169 posts)
21. Would the driver have been watching his DVD player if there was no "autopilot"?
Sun Jul 3, 2016, 09:23 PM
Jul 2016

Worse, Tesla says this "autopilot" is in Beta. How much sense does that make?

This is absurd, and NHTSA should make this "autopilot" illegal until it's out of Beta.

No one could have possibly predicted problems with a Beta "Autopilot" system released by a car company that has trouble making falcon wing doors.

Surrounding yourself with Yes Men is sometimes a bad idea. Tesla will be lucky to get out of this without being sued out of existence because this won't be the first instance of an "autopilot" crash.

Musk didn't help by talking about reading a book while his Beta "autopilot" drives the $100,000+ car down the road.



^ Remember that next time you see a Model S careening towards you.
 

KeizoOshima2

(17 posts)
24. Regardless, this is pretty scary
Mon Jul 4, 2016, 02:03 AM
Jul 2016

Self-driving cars just seem to be a recipe for disaster, I have to admit.
