General Discussion
DVD Player Found At Scene Of Tesla Model S Crash Likely To Be Cause, Not Autopilot
Not. The. Onion.
Whether the portable DVD player was operating at the time of the crash has not been determined, however, and witnesses who came upon the wreckage of the 2015 Model S sedan gave differing accounts on Friday about whether the player was showing a movie.
Questions of why the car did not stop for a turning truck, and whether the victim, Joshua Brown, was watching the road are critical for Tesla. The Palo Alto luxury electric car maker is facing a preliminary inquiry by federal regulators over the safety of the Model S Autopilot system that was engaged at the time of the crash in Williston, Florida.
It could be weeks if not months before officials make a final determination of the cause of the crash, the first known fatality of a Model S driver while using Autopilot. Meanwhile, the accident is stoking the debate on whether drivers are being lulled into a false sense of security by such technology.

A man who lives on the property where Brown's car came to rest some 900 feet from the intersection where the crash occurred said that when he approached the wreckage 15 minutes after the crash, he could hear the DVD player. An FHP trooper on the scene told the property owner, Robert VanKavelaar, that a Harry Potter movie was showing on the DVD player, VanKavelaar told Reuters on Friday.
http://venturebeat.com/2016/07/03/dvd-player-found-inside-tesla-model-s-involved-in-fatal-crash/
You know those stories you see where one side automatically goes off on a shitstorm and takes the blame game way too far while more information is needed? This is that more information.
progree
(10,918 posts)https://www.yahoo.com/tech/following-fatal-crash-tesla-partner-says-autopilot-t-170657301.html?nhp=1
Electrek got its hands on a statement from a Mobileye executive, who explained that the Automatic Emergency Braking system is only capable of protecting drivers from rear-end collisions, not lateral collisions (as was the case in this crash).
Here's the full statement regarding the crash from Dan Galves, Mobileye's Chief Communications Officer, provided by Electrek:
"We have read the account of what happened in this case. Today's collision avoidance technology, or Automatic Emergency Braking (AEB), is defined as rear-end collision avoidance, and is designed specifically for that. This incident involved a laterally crossing vehicle, which current-generation AEB systems are not designed to actuate upon. Mobileye systems will include Lateral Turn Across Path (LTAP) detection capabilities beginning in 2018, and the Euro NCAP safety ratings will include this beginning in 2020."
Amazingly, they let a car on the road with a so-called "autopilot" capability whose software is this deficient -- one that won't react to a vehicle turning broadside into your lane right in front of you!
Schema Thing
(10,283 posts)LisaL
(44,974 posts)on autopilot.
progree
(10,918 posts)A driver who believes it is in effect a self-driving car that might have some bugs will be less attentive. Especially with a grossly misleading name like "autopilot".
If it is a professionally trained driver who has training for this specific purpose -- knowledgeably testing out "autopilot"'s capabilities and limitations -- then that is one thing. But it sounds to me like they let anyone drive these cars -- albeit with lots of warnings in the manual, etc. Not ready for the general public. Especially when it is called "autopilot".
Locrian
(4,522 posts)Humans make TERRIBLE back up systems. They get distracted etc.
Computers as backups - makes sense.
Initech
(100,102 posts)And maybe a passive aggressive one is needed like this:
"Don't watch fucking movies while you're driving!!!"
csziggy
(34,137 posts)I've driven that road many, many times since 1972. The road is headed WNW - at this time of year in the late afternoon you're driving directly into the sun. Other reports on the accident indicate that the Tesla cameras could not distinguish between the white trailer and the bright sky.
Because of the bad lighting in the late afternoon on that road (US 27) I plan my trips so I don't hit that in the hour before full sunset. If I end up getting close to that time when heading west, I stop and eat dinner and let the sun go down.
People on that stretch of road tend to go anywhere between extremely slow and way way too fast - I've had to swerve to avoid a guy driving a lawn tractor in the right lane going maybe 5 mph, passed farm tractors going 20 mph, or gotten stuck behind locals lazing along at 40 mph. I've also been passed while I was going 70 by cars that made me feel as if I were in park.
That road also has numerous driveways, small country roads and random intersections. A driver on that road needs to pay attention because you never know what will pull out in front of you!
The key point is that Tesla is not selling these cars as self-driving. Yeah, they call it "AutoPilot," but just as airplane pilots generally do not rely on autopilot to handle all aspects of flying, the Tesla driver should not rely on AutoPilot to handle all aspects of driving.
progree
(10,918 posts)general public?
That it brakes only to avoid a rear-end collision? That it won't brake in any other situation -- a child running out into the road, someone stepping off the curb, a vehicle turning across your path (regardless of lighting or weather conditions), a car traveling in the opposite direction entering your lane? (Hopefully it brakes for red traffic lights.)
Yeah, I read the early explanations of how lighting conditions were to blame -- a big white truck broadside against a bright white sky -- and that's one thing. But according to this, if the software is designed (or undesigned) not to react to any situation other than an impending rear-end collision, even under ideal lighting conditions, then it is in a totally different realm.
csziggy
(34,137 posts)I just have a lot of experience with that stretch of road, enough to avoid it in the late afternoon. In fact, these days when I drive between Tallahassee and Lakeland I take 27-19-98 and stay on 98 all the way into Lakeland (27 and Alt-27 turn off at Perry and Chiefland, respectively; 19 continues south when 98 turns east south of Homosassa Springs).
But I don't expect totally self-driving cars to be practical in my lifetime. Though maybe in the next thirty years it will be a toss-up as to which is safer, human drivers or computer-driven cars.
This driver should have known better than to leave everything up to the Tesla - according to what I have read he's posted a lot of videos about the car and was very familiar with it and its features. If he wanted to experiment with totally self-driving that particular stretch of road was a very bad choice. A limited access highway, such as an interstate, in a much less populated part of the country would have been a much better choice.
Crash2Parties
(6,017 posts)Since, you know, Tesla refuses to use that particular technology.
Urchin
(248 posts)I doubt that a human assisted self-driving car is a practical combination.
If a human driver were in a car for which a computer did most of the driving, that person would become bored to tears, and would likely be unable to avoid the temptation to do something else, like text, watch a DVD, surf the web, read, etc.
But when most of the driving is done by the human, they are too occupied by the act of driving to be easily bored.
Kilgore
(1,733 posts)However just about everything else is. This is a car one "herds" down the road; there is no way the driver can't be engaged.
uncle ray
(3,157 posts)the fact that a DVD player was found means little, we don't know if he was listening to a movie, watching it, or if the accident somehow caused the disc to begin to play. whatever was going on in the Tesla is secondary to the fact that a commercial truck driver failed to yield the right of way and turned in front of a passenger car.
Initech
(100,102 posts)And then watched a movie while the car was in operation, that would not be Tesla's fault would it?
Urchin
(248 posts)Were there skid marks?
progree
(10,918 posts)The male driver died in a May 7 crash in Williston, Fla., when a big rig made a left turn in front of his Tesla.
In a blog post, Tesla Motors Inc. said the car passed under the trailer, with the bottom of the trailer hitting the Model S windshield.
Neither autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied, Tesla said.
Sixty-two-year-old Frank Baressi, the driver of the truck and owner of Okemah Express LLC, said the Tesla driver was "playing Harry Potter on the TV screen" at the time of the crash and driving so quickly that "he went so fast through my trailer I didn't see him."
andym
(5,445 posts)has been testing. The driver is supposed to pay attention for good reason-- the Tesla technology does not do everything for you, it just assists you. Probably a bad idea-- people really need all or nothing in order to really pay sufficient attention.
Crash2Parties
(6,017 posts)andym
(5,445 posts)probably should call it driver-assistant or something.
JCMach1
(27,572 posts)better...
Initech
(100,102 posts)One of these things is definitely not like the other.
Yo_Mama
(8,303 posts)However I have got to say that a system that is not 100% competent to drive the car (as in more reliable than a human driver in all foreseeable circumstances) should not be deployed in vehicles without specially trained drivers who are paid to test the system.
The reason for this is that:
1) A system that works great 90% of the time will build driver trust in it,
2) The human "driver's" attention will then inevitably wander,
3) Even if it doesn't, the gap between a complacent driver's realization that the car is NOT reacting correctly and the moment hands are in place to correct will often be too short. A driver who is "coasting" and letting the vehicle drive itself, even with hands on the wheel and eyes on the road, will react more slowly than otherwise.
Human nature is predictable, and deploying this system in its current state was going to predictably produce accidents.
None of the self-driving systems work 100% -- all are messed up by some circumstances, most especially sensor limitations (dirt, snow, rain, clouds, etc.). But most of them are being tested with special drivers who are adequately trained.
nationalize the fed
(2,169 posts)Worse, Tesla says this "autopilot" is in Beta. How much sense does that make?
This is absurd, and NHTSA should make this "autopilot" illegal until it's out of Beta.
No one could have possibly predicted problems with a Beta "Autopilot" system released by a car company that has trouble making falcon wing doors.
Surrounding yourself with Yes Men is sometimes a bad idea. Tesla will be lucky to get out of this without being sued out of existence, because this won't be the last instance of an "autopilot" crash.
Musk didn't help by talking about reading a book while his Beta "autopilot" drives the $100,000+ car down the road.
^ Remember that next time you see a Model S careening towards you.
KeizoOshima2
(17 posts)Self-driving cars just seem to be a recipe for disaster, I have to admit.