Crash: how computers are setting us up for disaster
We increasingly let computers fly planes and carry out security checks. Driverless cars are next. But is our reliance on automation dangerously diminishing our skills?
By Tim Harford
When a sleepy Marc Dubois walked into the cockpit of his own aeroplane, he was confronted with a scene of confusion. The plane was shaking so violently that it was hard to read the instruments. An alarm was alternating between a chirruping trill and an automated voice: "STALL STALL STALL." His junior co-pilots were at the controls. In a calm tone, Captain Dubois asked: "What's happening?" Co-pilot David Robert's answer was less calm. "We completely lost control of the aeroplane, and we don't understand anything! We tried everything!" The crew were, in fact, in control of the aeroplane. One simple course of action could have ended the crisis they were facing, and they had not tried it. But David Robert was right on one count: he didn't understand what was happening.
As William Langewiesche, a writer and professional pilot, described in an article for Vanity Fair in October 2014, Air France Flight 447 had begun straightforwardly enough: an on-time take-off from Rio de Janeiro at 7.29pm on 31 May 2009, bound for Paris. With hindsight, the three pilots had their vulnerabilities. Pierre-Cédric Bonin, 32, was young and inexperienced. David Robert, 37, had more experience, but he had recently become an Air France manager and no longer flew full-time. Captain Marc Dubois, 58, had experience aplenty, but he had been touring Rio with an off-duty flight attendant. It was later reported that he had only had an hour's sleep.
<snip>
It was still not too late to save the plane, if Dubois had been able to recognise what was happening to it. The nose was now so high that the stall warning had stopped: the system, like the pilots, simply rejected the information it was getting as anomalous. A couple of times, Bonin did push the nose of the aircraft down a little and the stall warning started up again ("STALL STALL STALL"), which no doubt confused him further. At one stage he tried to engage the speed brakes, worried that they were going too fast. The opposite was true: the plane was clawing its way forwards through the air at less than 60 knots, about 70 miles per hour, far too slow. It was falling twice as fast. Utterly confused, the pilots argued briefly about whether the plane was climbing or descending.
Bonin and Robert were shouting at each other, each trying to control the plane. All three men were talking at cross-purposes. The plane was still nose up, but losing altitude rapidly.
Robert: "Your speed! You're climbing! Descend! Descend, descend, descend!"
Bonin: "I am descending!"
Dubois: "No, you're climbing."
Bonin: "I'm climbing? OK, so we're going down."
Nobody said: "We're stalling. Put the nose down and dive out of the stall."
<snip>
Robert announced that he was taking control and pushed the nose of the plane down. The plane began to accelerate at last. But he was about one minute too late; that's 11,000 feet of altitude. There was not enough room between the plummeting plane and the black water of the Atlantic to regain speed and then pull out of the dive. In any case, Bonin silently retook control of the plane and tried to climb again. It was an act of pure panic. Robert and Dubois had, perhaps, realised that the plane had stalled, but they never said so. They may not have realised that Bonin was the one in control of the plane. And Bonin never grasped what he had done. His last words were: "But what's happening?" Four seconds later the aircraft hit the Atlantic at about 125 miles an hour. Everyone on board, 228 passengers and crew, died instantly.
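The figures quoted in the article hang together. A quick arithmetic sketch (standard unit-conversion factors only; the variable names are mine, and the input numbers are the ones quoted in the text above):

```python
# Sanity check of the article's speed and altitude figures.
KNOT_TO_MPH = 1.15078   # 1 knot = 1.15078 miles per hour
FEET_PER_MILE = 5280

forward_speed_mph = 60 * KNOT_TO_MPH            # "less than 60 knots"
descent_rate_mph = 11_000 * 60 / FEET_PER_MILE  # "one minute ... 11,000 feet"

print(f"forward speed: ~{forward_speed_mph:.0f} mph")        # ~69 mph, "about 70"
print(f"descent rate:  ~{descent_rate_mph:.0f} mph")         # 125 mph
print(f"ratio: {descent_rate_mph / forward_speed_mph:.1f}")  # ~1.8
```

An 11,000ft-per-minute descent works out to exactly 125mph, matching the reported impact speed, and is roughly 1.8 times the plane's forward airspeed, consistent with "falling twice as fast".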
<snip>
This problem has a name: the paradox of automation. It applies in a wide variety of contexts, from the operators of nuclear power stations to the crew of cruise ships, from the simple fact that we can no longer remember phone numbers because we have them all stored in our mobile phones, to the way we now struggle with mental arithmetic because we are surrounded by electronic calculators. The better the automatic systems, the more out-of-practice human operators will be, and the more extreme the situations they will have to face. The psychologist James Reason, author of Human Error, wrote: "Manual control is a highly skilled activity, and skills need to be practised continuously in order to maintain them. Yet an automatic control system that fails only rarely denies operators the opportunity for practising these basic control skills. When manual takeover is necessary something has usually gone wrong; this means that operators need to be more rather than less skilled in order to cope with these atypical conditions."
https://www.theguardian.com/technology/2016/oct/11/crash-how-computers-are-setting-us-up-disaster
RKP5637
(67,109 posts)
nothing. The computers/machines ran everything and the inhabitants were totally clueless over generations as to how anything worked.
ProfessorPlum
(11,257 posts)
You stop being so worried about watching where you are going as you back up, or paying as much attention behind the wheel.
Driving should be a fully engaging and engaged activity. And who is to blame if the car takes over a little bit and an accident results?
Weird stuff.
milestogo
(16,829 posts)
which does not hold them responsible in the event of an accident.
Straw Man
(6,625 posts)
You never know when you might need to smash into something head on.
I'm only half kidding. Trust in technology is often misplaced. For example, I have a lot of experience driving in winter and could always get my car up my steep, icy driveway -- until I got a car with "traction control." Then it became impossible. Whereas a tiny bit of wheel-slip never used to impede my progress (Momentum is your friend!), now it results in a complete cut of power to the wheels. There I am, halfway up, with no hope of making it out. Thanks, Modern World.
Fast Walker 52
(7,723 posts)
it is worrisome