A Blog by Jonathan Low


Jun 26, 2014

Google Car, Meet Asiana's Pilots

The problem with driverless auto technology is not the technology. It's us, the humans who use it.

This is unlikely to come as a surprise to anyone who has ever driven on a busy highway or in a crowded city. Or probably anywhere for that matter.

But the real issue goes beyond cars to the heart of the human-technology interface. And it has to do with our increasing reliance on - and belief in - the infallibility of the systems that surround us.

The release of a report on the causes of the crash of an Asiana Airlines flight at San Francisco International Airport (the heart of the known tech universe!) highlighted the problem. The pilots of that flight thought the automated airspeed control was still watching over them, not realizing (or not remembering, depending on whom you believe) that when they activated the manual override, the automated system ceased functioning.

The pilots thought the machine was in control and the machine thought the pilots were in control. Now imagine how that might play itself out during a rainy Friday afternoon rush hour wherever you live.

We want to believe that technology is infallible. That makes our lives so much more convenient. The less we have to think about the mundane, the happier we are. But the interaction of judgment, influence, knowledge and emotion may render technological systems defenseless or suboptimal. Being an optimistic species, we will continue to hope that these challenges can and will be overcome. Just don't forget to buckle your seatbelt. JL

Holman Jenkins comments in the Wall Street Journal:

Human beings will find it difficult to teach themselves to be assiduous monitors of systems that increasingly never fail. In practice, all cars would likely have to be driverless—or at least capable of taking control away from a driver—for any cars to be driverless.
Pilots to Boeing: "You screwed up. You trusted us!"
Until Tuesday morning's release, partisans were going at each other over whether a National Transportation Safety Board report on last July's crash at San Francisco International Airport should cite Boeing, the plane's maker, as a contributing factor in the only fatal U.S. airline crash in more than five years.
As nobody disagrees, Asiana Flight 214's crew failed in a basic task, keeping track of the plane's airspeed on final approach. Before they could correct their error, the plane's tail smacked a sea wall, breaking off. Amazingly, only three passengers died, two of whom apparently weren't wearing seat belts, while the third is believed actually to have perished after being run over by a fire truck.
But Asiana also blames Boeing's auto-throttle system, which its three-man crew believed would automatically maintain a set airspeed of 137 knots. That gripe received some endorsement in Tuesday's NTSB report though the board deadlocked over whether to require Boeing to change the system. But a deeper trouble here is the increasingly problematic handoff between computers and human beings who are being primed to fail.
San Francisco's automatic glide-slope system was temporarily out of commission, but the weather was perfect. The South Korean crew had been reminded before the flight about a significant feature of the Boeing 777, but apparently forgot that physically moving the throttle levers while using a separate automatic system to regulate their descent would put the auto-throttle system into sleep mode.
Critics now insist Boeing should have included an alert or automatic override in case pilots might fly the plane into the ground using the tools Boeing gave them. That's a cop-out. The chief pilot later claimed "it was very difficult to perform a visual approach with a heavy airplane," according to the NTSB, which would seem to indicate the real problem: The crew was nonplussed, perhaps nearly panicked, at the prospect of having to maintain a proper glidepath without help from the airport's sophisticated landing aid.
Diligent annotators of this column will recall Captain Malcolm Scott from nearly a decade ago, who criticized a British Airways decision to ban manual thrust control (which Asiana's pilots should have employed to maintain the plane's airspeed) by its Airbus pilots. Flying skills would atrophy, he warned, suggesting that the industry's implicit goal was to remove the human factor from the cockpit altogether.
Aviation Week magazine has since called the Asiana crash a "rich case study in monitoring deficiencies," adding that "a growing mountain of data suggest that such unpreparedness is closer to endemic than isolated."
JetBlue, for one, insists that its pilots should be constantly "mentally flying" the plane even when the computer is flying, saying, "If no one is mentally flying the airplane, then no one is flying the airplane."
Unfortunately, human beings will find it difficult to teach themselves to be assiduous monitors of systems that increasingly never fail. And soon, except for landing and takeoff, manual flying may be all but impossible in densely used airspace as controllers pack in planes more tightly and precisely to save fuel and time and to make way for a horde of unmanned vehicles. Already, even as the skies become safer, the greatest risk to passengers is pilots accidentally crashing well-functioning aircraft during those rarer and rarer parts of the flight when they are physically in control.
How does this apply to you as a motorist in the age of Google? It means those who think the driverless car is just around the corner will be sorely disappointed.
The obstacles to the future that the visionaries hold out are nearly insurmountable. In their lush vision, America's parking lots and driveways could return to nature as a relative handful of always-handy robot cars would supplant the mostly idle cars owned in the millions by Americans today.
In practice, though, all cars would likely have to be driverless—or at least capable of taking control away from a driver in heavy traffic situations—for any cars to be driverless. Otherwise, effectively one jerk in a '74 Buick would own the only right of way.
Doing so, though, would require not only expensive onboard systems in every car but wireless networking that would likely raise privacy and personal autonomy fears far more alarming to many Americans than whether NSA computers are scanning their mostly boring emails and text messages. Imagine a National Rifle Association for car owners.
All this means—sorry—you won't soon be catching up on "Dexter" during your morning commute. But the news isn't all bad. Technology already finding its way into cars will increasingly intervene to relieve us of the accidents we inflict on ourselves by misjudging a curve or failing to brake to avoid the dodderer in front of us who finally is making the left turn he's been signaling since he left Florida seven hours ago.
