General Discussion
Should A Self-Driving Car Kill Its Passengers In A “Greater Good” Scenario?
http://www.iflscience.com/technology/should-self-driving-car-be-programmed-kill-its-passengers-greater-good-scenario
October 26, 2015 | by Jonathan O'Callaghan
This is a moral and ethical dilemma that a team of researchers led by Jean-Francois Bonnefon of the Toulouse School of Economics has discussed in a new paper posted to arXiv. They note that some accidents like this are inevitable with the rise of self-driving cars, and that what the cars are programmed to do in these situations could play a huge role in public adoption of the technology.
"It is a formidable challenge to define the algorithms that will guide AVs [Autonomous Vehicles] confronted with such moral dilemmas," the researchers wrote. "We argue to achieve these objectives, manufacturers and regulators will need psychologists to apply the methods of experimental ethics to situations involving AVs and unavoidable harm."
In their paper, the researchers surveyed several hundred people on Amazon's Mechanical Turk, an online crowdsourcing tool. They presented the participants with a number of scenarios, including the one mentioned earlier, and also varied the number of people in the car, the number of people in the group, the age of the people in the car (to include children), and so on.
~ snip ~
The technology is rapidly developing. But will people trust it? How will the tech be protected from hackers? Will the government control where the car is allowed to go for political reasons, as opposed to safety or trespassing concerns?
SheilaT
(23,156 posts)
and this thing was actually discussed, although somewhat briefly.
Agnosticsherbet
(11,619 posts)
Sometimes physics will win.
Xithras
(16,191 posts)
The computer driving the car doesn't engage in hoping and wishing. If a crowd of children runs into the road in front of you, you're going to slam on your brakes and swerve in the hopes that you can avoid them. Maybe you will, maybe you won't, but you did what you could and the rest is up to fate.
A computer doesn't deal with hope, it has facts. It knows the road conditions, the vehicle mass, the current speed, and the distance to the target. When those children run out onto the roadway, it will calculate in under 50 milliseconds that, given the laws of physics, there is a 0% possibility of avoiding harm. It knows, with certainty, that someone is going to get hit today.
But the computer has a choice. There are nine trick-or-treaters under six years old standing in the roadway, and you are 0.882 seconds from barreling through the middle of them at 40 MPH. It can lock up its brakes and stay on course, in which case those nine children are less than a second away from death. OR, it can swerve to the right, which will cause the vehicle to jump the curb and run down the two parents and three teenagers who were trying to catch the trick-or-treaters. OR, it can swerve to the left, missing the trick-or-treaters entirely but colliding with a tree, placing only you at risk of injury.
These types of accidents occur every single day. Which does the computer pick?
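The "calculate in under 50 milliseconds that there is a 0% possibility of avoiding harm" step described above is, at its core, a stopping-distance check. Here is a toy sketch of that check; the friction coefficient and all numbers are illustrative assumptions, not real autonomous-vehicle parameters.

```python
# Toy sketch of a "can we stop in time?" check a planner might run.
# The friction coefficient and scenario numbers are assumed values.

G = 9.81   # gravitational acceleration, m/s^2
MU = 0.7   # assumed tire-road friction coefficient (dry asphalt)

def braking_distance(speed_ms: float) -> float:
    """Minimum distance (m) to stop from speed_ms under ideal full braking."""
    return speed_ms ** 2 / (2 * MU * G)

def collision_unavoidable(speed_ms: float, obstacle_m: float) -> bool:
    """True if the car cannot stop before reaching the obstacle."""
    return braking_distance(speed_ms) > obstacle_m

# The scenario in the post: 40 MPH, 0.882 seconds from the obstacle.
speed = 40 * 0.44704        # 40 mph in m/s (~17.9 m/s)
distance = speed * 0.882    # ~15.8 m to the obstacle

print(round(braking_distance(speed), 1))       # ~23.3 m needed to stop
print(collision_unavoidable(speed, distance))  # True: impact is certain
```

Under these assumed numbers the car needs roughly 23 meters to stop but has only about 16, so braking alone cannot prevent the collision, which is exactly the dilemma the post sets up.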
Agnosticsherbet
(11,619 posts)
within its limits to avoid hitting any living objects or the least number of living objects. It would be required to do the least harm. That, indeed, might be running into a tree. I suspect with a good automatic driving system there will be far fewer accidents.
If we get to the point where machine intelligence is fully self aware and can discuss philosophy, it should not be driving a car.
Yes, those types of accidents happen every day. A car using radar/sonar to recognize potential hazards at the edge of the curb would have slowed long before a human being could react, even if we noticed the kids. It could also be programmed with the dates of holidays like Halloween and be required to take precautions ahead of time, such as driving well below the speed limit in areas where children are likely to run across the street.
That is why I say it should be programmed to do the least harm. That would include insisting that its passengers wear seatbelts, which reduces the chance of injury when the vehicle hits the tree.
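The "do the least harm" rule proposed above could, in its crudest form, be a minimum search over candidate maneuvers. This sketch is purely illustrative: the maneuver list, the harm estimates, and the equal weighting of pedestrians and occupants are all invented for the example, not taken from any real system.

```python
# Crude sketch of a "least harm" maneuver selector.
# Maneuvers and harm estimates are invented for illustration; a real
# system would derive them from perception and physics models.

maneuvers = {
    "brake_straight": {"pedestrians_at_risk": 9, "occupants_at_risk": 0},
    "swerve_right":   {"pedestrians_at_risk": 5, "occupants_at_risk": 0},
    "swerve_left":    {"pedestrians_at_risk": 0, "occupants_at_risk": 1},
}

def estimated_harm(outcome: dict) -> int:
    # Simplest possible rule: count everyone at risk equally.
    return outcome["pedestrians_at_risk"] + outcome["occupants_at_risk"]

# Pick the maneuver that puts the fewest people at risk.
best = min(maneuvers, key=lambda m: estimated_harm(maneuvers[m]))
print(best)  # swerve_left: hit the tree, risk only the occupant
```

Note that the entire ethical debate in this thread lives inside `estimated_harm`: change the weighting between occupants and pedestrians and the selected maneuver changes with it.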
Thor_MN
(11,843 posts)
The computer has no ability to distinguish what the obstacle is, be it 9 trick-or-treaters or a sofa dropped off the back of a truck. Some things at this stage might not even register as an obstacle. It knows, with certainty, that someTHING is going to get hit today.
For the near future, the code is going to be: stay on the road and minimize the speed of impact. It knows nothing of what ISN'T in the road. It doesn't know the difference between a brick wall and a hay bale. It likely doesn't know a grassy field from the edge of a cliff. The car is not going to swerve off the road.
Until computers are quite a bit more advanced, there can be no considerations of "greater good".
Humanist_Activist
(7,670 posts)
the 6 year olds, and would most likely not have the time to react appropriately in any case.
MADem
(135,425 posts)
Run AHEAD of the object trying to smash into it, for as long as possible. If the attacking vehicle is self driving, too, they should be able to work it out.
It's not like a self-driving car has trouble backing up in a straight line, after all....
These things are coming, and they are going to be a boon to the elderly and disabled, making them more mobile, more active, more engaged, and vastly improving their quality of life.
I am the "self driving car" for many elderly neighbors and acquaintances. I do it to honor my grandmothers, who had to rely on public transport well into their dotage, and it was often a hardship for them as the stops were not convenient to their homes.
Xithras
(16,191 posts)
In an unavoidable collision where SOMEONE is going to be injured, the computer's directive should be to follow the path that harms the smallest number of people. If that means that the occupants of the self-driving car are the ones being injured, then so be it.
The problem is that this will have to be mandated by law. Given the choice between a car that might kill you in an accident and one that will do everything possible to save you, most people will choose the latter.
Travis_0004
(5,417 posts)
Load up 800 lbs, and tell my car there are 5 people in the car, so it leans further towards 'protect me'.
kcr
(15,318 posts)
A human driver would not sacrifice themselves by driving into a wall to save 5 pedestrians. A self-driving car shouldn't be expected to either.
Action_Patrol
(845 posts)
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
This doesn't solve your query, but it gave me a reason to post the Laws of Robotics. WhooHoo!
haikugal
(6,476 posts)Warren DeMontague
(80,708 posts)FrodosPet
(5,169 posts)
Killing Hitler after about 1938 or 1939 would have prolonged the war, and maybe even handed victory, or at least a stalemate, to the Germans.
Imagine WWII Germany ruled by someone militarily competent?
Kalidurga
(14,177 posts)
Not that it matters, but with just that information it seems those 10 people shouldn't be where they are. In any case I am thinking the car doesn't know people from lampposts and it will try to avoid hitting them head on as it would any objects.
Bucky
(54,041 posts)Kalidurga
(14,177 posts)
I was with OWS a few times in the middle of the road. I would not be surprised to see Bernie people do the same. It's still stupid though even if I do it. OTOH I don't know why these 10 people are in the middle of the road, maybe they were rescuing kittens and I can't blame them for that.
msongs
(67,433 posts)Hassin Bin Sober
(26,335 posts)Bucky
(54,041 posts)
Your scenario seems to assume that AI cars will be wildly swerving around narrow street corners and won't have functioning brakes.
Solly Mack
(90,779 posts)
The back wheels peeled off the rims and I barely had control of the car (lurching and rocking). I wasn't the owner of the car. This matters because the accident was ruled owner's fault (bad tires). Anyway, I was on the highway, heading north, heavy, though fast moving, traffic. I wasn't speeding but the speed limit was 70 MPH. The tires go, the car lurches, and I opt to head into the median rather than risk hitting other cars and hurting other people. I was in the outside lane - could not make it to the shoulder without hitting others. I chose to limit the injuries and possible deaths to those in the car I was driving. It was seconds between the tires going and my decision.
The car rolled several times, all across the median, almost landing in south bound traffic. 4 injured, 3 without any injuries. No one was seriously injured, though one did spend a week in the hospital. Broken hand, needed surgery - but nothing life threatening. Everyone made a full recovery.
That said....not sure I want the car making the choice for me.
hunter
(38,322 posts)
Walking, running, human powered bicycles, tricycles, quadcycles, electric mobility devices, and sailing ships are the very highest art of human transportation.
Other transportation, most especially the fossil fuel machines, do not make this world a better place.
0rganism
(23,962 posts)
<you are safer where you are>
"hey car, really we're gonna be late"
<the only winning move is not to play>