
FrodosPet

(5,169 posts)
Mon Sep 5, 2016, 09:02 PM

Top Safety Official Doesn’t Trust Automakers to Teach Ethics to Self-Driving Cars


Federal rules could lay out how cars decide whom to protect or harm in a crash.

https://www.technologyreview.com/s/602292/top-safety-official-doesnt-trust-automakers-to-teach-ethics-to-self-driving-cars/

by Andrew Rosenblum September 2, 2016


Rapid progress on autonomous driving has led to concerns that future vehicles will have to make ethical choices, for example whether to swerve to avoid a crash if it would cause serious harm to people outside the vehicle.

Christopher Hart, chairman of the National Transportation Safety Board, is among those concerned. He told MIT Technology Review that federal regulations will be required to set the basic morals of autonomous vehicles, as well as safety standards for how reliable they must be.

Hart said the National Highway Traffic Safety Administration (NHTSA) will likely require designers of self-driving cars to build in fail-safes for critical components of their vehicles, much as aircraft manufacturers do.

~ snip ~

“The government is going to have to come into play and say, ‘You need to show me a less than X likelihood of failure, or you need to show me a fail-safe that ensures that this failure won’t kill people,’” said Hart.

~ snip ~
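Hart’s criterion reads like a two-pronged certification gate: demonstrate a failure probability below some regulator-chosen X, or declare a fail-safe that covers the failure. A minimal sketch of that logic in Python; every component name and number here is invented for illustration, not taken from any real standard:

```python
# Hypothetical sketch of the certification gate Hart describes: a critical
# component passes if its demonstrated failure probability is below a
# regulator-set threshold, OR if it declares a fail-safe behavior for when
# it does fail. All names and numbers are illustrative.

from dataclasses import dataclass
from typing import Optional

@dataclass
class CriticalComponent:
    name: str
    failure_prob_per_hour: float   # demonstrated in testing
    fail_safe: Optional[str]       # e.g. "controlled stop", or None

MAX_FAILURE_PROB = 1e-9  # the regulator's "X"; purely illustrative

def certifiable(c: CriticalComponent) -> bool:
    """Pass if failure is rare enough, or a fail-safe covers the failure case."""
    return c.failure_prob_per_hour < MAX_FAILURE_PROB or c.fail_safe is not None

components = [
    CriticalComponent("steering actuator", 2e-10, None),
    CriticalComponent("forward lidar", 1e-6, "fall back to radar, slow to a stop"),
    CriticalComponent("planning computer", 1e-7, None),
]

for c in components:
    status = "passes" if certifiable(c) else "FAILS the gate"
    print(f"{c.name}: {status}")
```

Accepting either prong mirrors how aviation certification treats redundancy and fail-safe design as an alternative to demonstrating vanishingly small failure rates.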

Top Safety Official Doesn’t Trust Automakers to Teach Ethics to Self-Driving Cars (Original Post) FrodosPet Sep 2016 OP
There are no built-ins with human drivers metroins Sep 2016 #1
The government may need to stifle this technology Major Nikon Sep 2016 #5
moral principles that govern a person's or group's behavior. elleng Sep 2016 #2
I don't trust them travelling in winter and mountain passes, either. longship Sep 2016 #3
How ethical would the software on a luxury car be? NT Jerry442 Sep 2016 #4
Some people want your car to kill you instead of them FrodosPet Sep 2016 #6

metroins

(2,550 posts)
1. There are no built-ins with human drivers
Mon Sep 5, 2016, 09:06 PM

Autonomous vehicles would save thousands of lives. If a few errant deaths occur, those issues can be worked on; the end result is fewer deaths overall.

Government does not need to stifle this technology.

Major Nikon

(36,827 posts)
5. The government may need to stifle this technology
Mon Sep 5, 2016, 11:57 PM

Fully autonomous vehicles are not going to just appear one day. They will arrive through a progression of semi-autonomous technologies that build toward full automation. That progression is already upon us, and the obvious hazard is that people will progressively let the car take over and spend less time being attentive enough to intervene when needed.
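That handover hazard is exactly what driver-monitoring watchdogs in today’s semi-autonomous systems try to mitigate: escalate warnings the longer the driver shows no sign of attention. A minimal sketch, with both thresholds invented:

```python
# Minimal sketch of an attentiveness watchdog for semi-autonomous driving.
# Escalate the longer the driver goes without showing attention (steering
# torque, eye contact, etc.). Thresholds are invented for illustration.

import time

WARN_AFTER_S = 10        # hypothetical: gentle chime
DISENGAGE_AFTER_S = 30   # hypothetical: slow the car, demand takeover

def watchdog_state(seconds_inattentive: float) -> str:
    if seconds_inattentive >= DISENGAGE_AFTER_S:
        return "DISENGAGE: slow to a stop and demand manual takeover"
    if seconds_inattentive >= WARN_AFTER_S:
        return "WARN: audible and visual alert"
    return "OK"

# In a real control loop this timer would reset on steering input or an
# eye-tracker confirmation; here we simulate 12 seconds of inattention.
last_attention = time.monotonic() - 12.0
print(watchdog_state(time.monotonic() - last_attention))  # -> WARN
```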

longship

(40,416 posts)
3. I don't trust them travelling in winter and mountain passes, either.
Mon Sep 5, 2016, 11:06 PM

And especially when combined. When one traverses a winter mountain pass one wants intelligence behind the wheel, not Microsoft fucking Windows.

I don't want autonomous vehicles anywhere near the Grapevine on I-5 in SoCal. Especially not a truck. And especially not in winter.
Tejon Pass

FrodosPet

(5,169 posts)
6. Some people want your car to kill you instead of them
Tue Sep 6, 2016, 12:01 AM

On the other hand, they want their car to kill you instead of them.
People Want Self-Driving Cars That Save Lives. Especially Theirs

https://www.wired.com/2016/06/people-want-self-driving-cars-save-lives-especially/

Jack Stewart | 06.23.16 2:00 PM



WOULD YOU BUY a driverless car that is programmed to kill you? Of course not. Ok, how about a car programmed to kill you if it’s the only way to avoid plowing into a crowd of dozens?

That’s one of the conundrums an international group of researchers put to 2,000 US residents through six online surveys. The questions varied the number of people that would be sacrificed or saved in each instance—if you want to try it for yourself, see if you’d make a good martyr here. The study, just published in the journal Science, is the latest attempt to answer ethics’ classic “trolley problem”—forcing you to choose between saving one life and saving many more.

~ snip ~

So what kind of god do people want as a chauffeur? The study found most people think driverless cars should minimize the total number of deaths, even at the expense of the occupants. The respondents stuck to that utilitarian thinking, although some decisions were harder than others. “It seems that from the responses people gave us, saving their coworkers was not a priority,” says Jean-Francois Bonnefon of the Toulouse School of Economics. But overall, “do the greater good” always won, even with children in the car.

That’s great—until the researchers asked people if they’d buy one of these greater good-doing cars for themselves. Not a chance. People want cars that protect them and their passengers at all costs. They think it’s great if everyone else drives an ethical car, but they certainly don’t want one for their family.

~ snip ~
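The utilitarian rule the respondents endorsed in the abstract is disarmingly easy to state as code, which is part of what makes the survey’s contradiction so sharp. A toy sketch, with both the scenarios and the probabilities invented:

```python
# Toy sketch of the "minimize total deaths" objective the respondents
# endorsed: pick the maneuver with the fewest expected deaths, weighting
# occupants and bystanders equally. All numbers are invented.

def expected_deaths(option: dict) -> float:
    return (option["p_occupant_death"] * option["occupants"]
            + option["p_bystander_death"] * option["bystanders"])

options = [
    {"name": "stay in lane", "occupants": 1, "p_occupant_death": 0.05,
     "bystanders": 10, "p_bystander_death": 0.80},
    {"name": "swerve into barrier", "occupants": 1, "p_occupant_death": 0.90,
     "bystanders": 0, "p_bystander_death": 0.0},
]

choice = min(options, key=expected_deaths)
print(choice["name"])  # "swerve into barrier": the car sacrifices its occupant
```

The study’s punchline is that buyers endorse this objective for everyone else’s car while wanting their own to weight the occupant’s risk far more heavily.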
