The challenge, as the following article explains, is not primarily one of functionality but of imagination. JL
Greg Satell comments in Digital Tonto:
As machines begin to assist us in mental tasks, we’re starting to see social skills trump cognitive skills. We will need to learn to collaborate with humans as well as machines in ways that aren’t obvious now, but in a generation will be undeniable. As technology advances, its function evolves and those unable to shift their mental models (will be) unable to compete. Our failure to adapt to the future is less likely to be due to a lack of intelligence than a lack of imagination.
I recently got a call from my mother asking me to help her watch House of Cards on Netflix. She was frustrated and complained, “I keep pressing the thing and nothing happens!” It was hard to get her to understand I had no idea what thing she was pressing or what was supposed to happen when she did.
I’m still not exactly sure what the problem was, but getting her to understand that the buttons on her remote had little to do with the TV in her bedroom and everything to do with giving instructions to servers in faraway places seemed to help. Before long, her frustration with technology turned to fascination with the political machinations of Frank Underwood.
Many businesses have the same problem as my mother. As technology advances, its function evolves and those that are unable to shift their mental models find themselves unable to compete. This is especially true of digital technology, where every generation sees a new crop of players emerge while old titans falter. Only a rare few are able to cross the chasm.
The Rise Of Mathematical Machines
The term “computer” used to refer to a person, not a machine. Teams of people, usually women, would sit in a room carrying out arithmetic for complex calculations that scientists needed to do their work. So it shouldn’t be surprising that the first digital computers were used in the same way, as purely calculating machines.
Three machines can claim to be the first true digital computer. The first, called Colossus, was designed in secret at Bletchley Park outside London. The second, the ENIAC, was built at the University of Pennsylvania and the third, the IAS machine was built in Princeton, New Jersey. All were monstrous and built at great expense by governments for military applications.
They soon found commercial applications as well. The design of the IAS machine was open sourced and before long a computer industry made up of IBM and the BUNCH companies (Burroughs, UNIVAC, NCR, Control Data Corporation, and Honeywell) began to serve private corporations with heavy computational tasks, such as payroll and accounting.
By the 1960s, hard drives and databases had been developed and computers took on a storage function as well as pure calculation, but they were still largely confined to the back office. Very few people actually worked directly with computers, although executives would occasionally see a printout of their results.
From Calculation To Communication
In 1968, Douglas Engelbart presented a completely new conception of the computer at an event that was so consequential it is now referred to as The Mother of All Demos. Within a few years, Bob Taylor, who financed Engelbart’s work at DARPA, began working to make it a commercial reality at Xerox PARC’s Computer Science Lab.
While the giants of industry still considered computers to be mathematical machines, Taylor saw that interactive technology could transform them into communication machines. The product his team built, the Alto, had many of the features of the machines we know today, such as a graphical user interface set up as a desktop, a mouse, and Ethernet connections.
Yet as Michael Hiltzik reported in his history of PARC, Dealers of Lightning, the Xerox brass was less than impressed. Used to dealing with top level executives, they didn’t see the value of personal computers. On the other hand, their wives, many of whom had previously worked as secretaries, were fascinated by its ability to automate basic office tasks.
The BUNCH companies didn’t see it either. Like the Xerox executives, they remained stuck in old mental models as a new crop of companies, including Apple, Microsoft, Compaq and Dell came to dominate the industry. IBM, along with Intel, whose chips powered the revolution, survived intact, but they were rare exceptions.
Automating Cognitive Tasks
Today, digital technology is being transformed yet again. Computers were once large machines that took up entire rooms, then bulky boxes placed under desks. Now we not only carry around smartphones more powerful than yesterday’s supercomputers, but microprocessors are also embedded in everything from toasters to traffic lights.
These new advances are creating several trends that are converging into a completely new paradigm. First, small devices such as smartphones and embedded chips are collecting an unprecedented amount of data. Second, that data is being stored in a network of servers that make up the “cloud.” Third, that data can be accessed and analyzed in real time.
Yet what is truly revolutionary is the way that data is being analyzed. In the past, the logic of analysis was fairly rigid: a particular set of inputs would lead to a predetermined set of outputs. Now, however, an increasingly powerful cadre of learning algorithms takes all that information in and makes judgments based on context.
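To make that contrast concrete, here is a minimal, hypothetical Python sketch; the toy readings, the 0.7 threshold, and the use of scikit-learn are illustrative assumptions rather than anything described in the article. The hand-written rule maps each input to a predetermined output, while the learning algorithm fits its own decision boundary from examples.

# A toy contrast between the two approaches described above: a rigid rule
# with a predetermined output versus a model that learns from examples.
from sklearn.linear_model import LogisticRegression

# Hypothetical sensor readings and whether each signalled a problem (1) or not (0).
readings = [[0.2], [0.4], [0.6], [0.8], [1.0], [1.2], [1.4], [1.6]]
labels   = [0,     0,     0,     1,     0,     1,     1,     1]

# Old paradigm: a hand-written rule with a fixed threshold chosen in advance.
def rule_based(reading):
    return 1 if reading[0] > 0.7 else 0

# New paradigm: a learning algorithm infers the boundary from the data itself,
# so its judgments change whenever the examples it is given change.
model = LogisticRegression().fit(readings, labels)

print([rule_based(r) for r in readings])  # fixed outputs for fixed inputs
print(list(model.predict(readings)))      # outputs learned from the data

The point of the sketch is only that the second approach has no predetermined answer wired into it; change the examples and the same code produces different judgments.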
Essentially, much like the steam engine automated physical tasks in the 19th century, today digital technology is automating cognitive tasks, from medical diagnoses and legal discovery to even creative work. And, just as in previous paradigm shifts, most enterprises won’t survive the transformation. We’ll see old giants fall and new leaders emerge.
Learning To Collaborate With Machines—And Each Other
None of these shifts came easily. When the first commercial computer, the UNIVAC, debuted on CBS to predict the results of the 1952 election, the network executives found the results so out of line with human predictions that they refused to air them. Nevertheless, UNIVAC had it right and the “experts” had to contend with being outsmarted by a machine.
Later, when Bob Taylor was building the Alto, he had to devote two-thirds of its power to running the display. That seemed crazy at a time when computers were devoted to back office calculations, but it was absolutely essential to create a truly interactive computer. Very few executives at the time knew how to type, so putting a machine with a keyboard on every desk was far from obvious.
In both cases, the paradigm shift was so profound that earlier attitudes seem silly today. Of course computers can make predictions that humans can’t! Of course we need computers to do our jobs! Who would wait for their secretary to get back from lunch so that an email could be typed up? Yet it took years—decades even—for these things to become clear.
Today, as machines begin to assist us in mental tasks, we’re starting to see social skills trump cognitive skills and the basic rules for success will change yet again. We will need to learn to collaborate with humans as well as machines in ways that aren’t obvious now, but in a generation will be as undeniable as the need for a personal computer.
Yet one thing should already be abundantly clear: our past mental models will hold us back. Our failure to adapt to the future is less likely to be due to a lack of intelligence than a lack of imagination.