A Blog by Jonathan Low

 

Nov 15, 2015

Where Does Technological Innovation Come From?

This debate is reminiscent of an early lite beer ad which posed a fundamental question about its advantage: "Tastes great - or less filling?"

The reality in that case, as in this one - whether tech innovation comes from scientific research or a serendipitous, self-organizing network effect - is, of course, both. JL

Nathan Myhrvold and Matt Ridley comment in the Wall Street Journal:

The true origin of the Internet does not lie in brilliant individuals, nor in private companies, nor in government funding. It lies in open-source, peer-to-peer networking. Technology evolves through the interactions of many ordinary people, rather than being handed down from the ivory towers of a few elite geniuses.
Nathan Myhrvold, founder and CEO of Intellectual Ventures:
In his recent essay “The Myth of Basic Science” (Oct. 24), Matt Ridley argues that technological growth owes little to basic scientific research. Instead, he depicts technology as some sort of mystical living entity that is self-organizing and immune to any human intent to direct or accelerate it. “Technology will find its inventors,” he writes.
Government funding of basic science has long been regarded as the foundation of the technological economy and thus an investment that returns far more in economic growth and other benefits than it consumes. Mr. Ridley says no, this is all wrong. The returns on publicly funded R&D are negligible. Patents and Nobel Prizes are “fundamentally unfair things.” When a government pays for scientific research, it actually harms science itself by crowding out the “good” science that corporations and philanthropists would otherwise fund on their own.
The stakes in this debate are very high, and Mr. Ridley’s argument could do real damage to one of the indispensable drivers of the global economy and our way of life.
To support his case, Mr. Ridley trots out instances in which multiple inventors or scientists came up with a similar idea around the same time. Innovation, he extrapolates, is inevitable. But Mr. Ridley misunderstands how parallel discovery works and why it matters.
One day in Beijing this past August, nine men dashed down a track. Usain Bolt of Jamaica won the gold medal with a time of 9.79 seconds, beating out Justin Gatlin of the U.S. by only one hundredth of a second. Isn’t that as fundamentally unfair as a Nobel Prize? It’s such a tiny amount of time! Mr. Ridley’s logic suggests not only that it is unfair to laud Mr. Bolt but also that there is little point in organizing such an event; the race would have happened spontaneously.
I doubt the runners would agree. They were there precisely because one of them would win the top prize. That’s competition, and our species thrives on it. We hold sports championships and festoon the winners with medals, fame and lucrative endorsement contracts because these events and rewards bring out the best performances that humans can muster.
In science, the gold medals have the face of Alfred Nobel on them. Tonight, in labs across the world, ambitious scientists will be working late because they think their team might win one. Some Nobel Prizes reward serendipitous discoveries, but many are the spoils of deliberate campaigns waged against famously hard, “Nobel-worthy” problems. All academic scientists are driven by an unforgiving, publish-or-perish incentive system. Jobs, tenure and funding go to those who regularly produce great ideas first.
It is this relentless, reward-driven mandate to publish that gives rise to parallel discoveries and inventions. Scientists must continually show their work, even to their rivals, so investigators in any field know who is making progress and how. We should be no more surprised by several researchers arriving at the same conclusion independently than we are by the silver medalist crossing the finish line a fraction of a second behind the gold.
In the realm of technological invention, patents are the equivalent of scientific papers and prizes: If you get there first, and your idea is novel and passes rigorous review, then you own it—but only for 20 years. A scientific breakthrough earns fame forever (they’re still “Newton’s laws” some 328 years later), but patents are limited in both geography and time. And they are given only in exchange for documenting the idea. In truth, patents are the original open-source movement.
A patent application becomes public well before the patent issues, so any serious inventor has a pretty good idea what the competition is doing. Word also spreads through technical conferences, where academic and industrial technologists talk about their work to attract funding and customers. And the market itself spreads ideas by showing them in action.
Inventors and scientists can choose to opt out and keep their ideas secret, but fortunately, few do. Patents spread ideas broadly so that others can build and improve on them—just as scientific journal articles do. Both create an incentive for clever people to find better solutions. Mr. Ridley castigates patents as unfair, but the truth is that other kinds of business advantages—such as know-how, brand power, capital plants and political favoritism—are more prone to monopolistic abuse because they are harder to work around and have no expiration date.
Mr. Ridley seems surprised that Edison’s lightbulb wasn’t the first, but the title of the patent was “Improvement in Electric Lights.” Electric lighting was a problem dozens of prior inventors had tried to solve. Edison’s innovation was to make the first bulb with a practical lifetime. He (and his investors) only jumped into the fray because they knew a patent would let them own the result; it would have been pointless otherwise.
Over time, Edison and then General Electric, which he helped to found, both cooperated and competed with inventors all over the world to make a better lightbulb. The patent system didn’t fail here; it worked just as it should.
Mr. Ridley argues that technology is some sort of living entity and that humans are as hapless in the face of its powers as are the coral polyps that make up a tropical reef. He believes that “technology will find its inventors,” without bothering to understand how that happens.
Human society organizes systems of cooperation and competition—like foot races, or science journals or patents—governed by rules and rewarded with incentives. It is self-organizing in the sense that no central authority decides what creative people work on; they vote with their time and resources. The people doing this aren’t brainless polyps; instead they are responding to the rules and incentives. The very things Mr. Ridley decries are what make the system work.
Mr. Ridley writes, “Even the most explicit paper or patent application fails to reveal nearly enough to help another to retrace the steps through the maze of possible experiments.” That’s just flat wrong. The most damning criticism that can be made of a scientific paper is that it is irreproducible, and patents can be invalidated if they don’t fully describe the invention.
Consider CRISPR/Cas9, an exciting new technique for editing DNA that has huge potential applications for medicine but emerged out of basic, government-funded, curiosity-driven research on how bacteria protect themselves against viruses. Jennifer Doudna, Emmanuelle Charpentier and co-workers published a gene-editing method using CRISPR in 2012. Other labs quickly reproduced the work and extended it to human cells within months. Within the past three years, some 20,000 technical papers have been devoted to CRISPR, and several new biotech startups have been founded around it. Meanwhile, Dr. Doudna and Dr. Charpentier won a $3 million scientific prize and likely have a Nobel in their future.
The quick uptake of the CRISPR/Cas9 technique is typical, especially in a well-funded, competitive field like molecular biology. Indeed it is in the interest of the discoverers for this to happen—prizes aren’t given for techniques nobody uses. We’ve seen it over and over again—in the discovery of recombinant DNA, in the use of fluorescent tagging of cells, and in myriad advances in gene sequencing—as Mr. Ridley, who wrote a book on the genome, ought to know.
Another astoundingly incorrect claim by Mr. Ridley is that copying an invention is so expensive as to be pointless. In support, he quotes a study by economist Edwin Mansfield on New England manufacturing companies in the 1970s. His finding: that copying an idea costs 65% as much money and 70% as much time as original invention.
That is an enormous advantage for the copier! By cutting time to market by 30%, a copier could grow 142% as fast as an innovator—and realize much higher profits to boot. Dr. Mansfield’s figures demonstrate why investors rely on patents, and they refute Mr. Ridley’s argument.
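(A rough reading of that arithmetic, which the column leaves implicit: if each copied product cycle takes only 70% of the innovator's development time, then per unit of calendar time the copier completes about 1 / 0.70 ≈ 1.43 cycles for every one the innovator completes, or roughly 142% of the innovator's pace; the 35% saving in cost compounds the advantage.)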
Were economists to redo the study today, they would find that the digital revolution has made copying vastly easier in virtually all fields. And as for manufacturing—well, there just isn’t the same proportion of those jobs in America as there was in the 1970s. Did those jobs go to companies renowned for original invention, or did they go to copiers?
The natural experiment has been run, and we all know the outcome. Manufacturing soared in Asian countries with low labor costs, while shrinking in the U.S. Mr. Ridley’s hypothesis that freedom to copy would generate innovation was tested by the real world, and it failed.
Mr. Ridley makes unsupported straw-man assertions, like “Politicians believe that innovation can be turned on and off like a tap.” Really? Research policy makers understand full well that most science doesn’t create either new technology or economic benefit in the short term; it is only over decades that high returns from the rare winners pay for the rest.
So says Edwin Mansfield, who also studied the social returns on government-funded research—work that Mr. Ridley curiously fails to mention. In a series of papers, Dr. Mansfield found that the return to society on government funding in basic science was 28% to 40% a year, returns that would make a hedge-fund manager blush. A string of subsequent studies through 2014 has confirmed that basic science funding yields high returns, not just in the U.S. but also in the U.K., Europe and Japan.
This research convincingly refutes the unsupported assertions of Terence Kealey, whom Mr. Ridley relies on. It also casts doubt on a 2003 OECD study that Mr. Ridley says “has never been challenged or debunked.” Except that it was challenged in an economics journal, almost as soon as it appeared, by an expert reviewer who wrote: “I end up concluding that this data set does not provide very clear guidance as to the role or importance of R&D to growth.” Mr. Ridley accuses economists of ignoring inconvenient arguments, but he seems to reserve that privilege for himself.
Mr. Ridley says “The discovery of the structure of DNA depended heavily on X-ray crystallography of biological molecules, a technique developed in the wool industry to try to improve textiles.” Huh? X-ray crystallography was invented by Max von Laue, improved by the father-son team of W.H. and W.L. Bragg, and adapted to biomolecules by John Bernal and Dorothy Hodgkin and to DNA by Rosalind Franklin—all academic scientists, not yarn spinners. He argues that industry, “[h]aving invented the steam engine…will pay for thermodynamics.” Well, no, thermodynamics was developed by men like Carnot, Clausius, Maxwell, Boltzmann and Gibbs—again, academics, not steam mechanics.
He quotes the 18th-century economist Adam Smith saying inventions are often not done by “learned men,” but we’re in the 21st century today. Indeed all of Mr. Ridley’s examples are from the distant past. For the last 100 years, basic scientific knowledge has directly driven or informed the invention of an enormous amount of new technology. Mr. Ridley’s argument might have made some sense in 1915, but not 2015.
The most stunning example of a seismic economic shift to emerge from government-funded labs is the Internet. Mr. Ridley utterly dismisses it.
A McKinsey study found that Internet-related expenditures accounted in 2013 for about 4.3% of the U.S. GDP, or about $721 billion a year (and growing rapidly). Across 13 of the largest economies, the Internet adds about $2.2 trillion a year to GDP.
How does that return compare to the investment? The Internet had many origins, but the three most cited contributors are the invention of the TCP/IP protocol (funded by DARPA), the invention of the World Wide Web (at CERN), and the creation of the Mosaic Web browser (at NCSA/University of Illinois). DARPA’s budgets from its formation in 1958 through 2015 come to about $121 billion after adjusting for inflation. The expenses of CERN, which mostly focuses on particle physics, total roughly $50.5 billion. NCSA budgets are harder to come by, but as a gross overestimate we can take the entire amount spent by the U.S. federal government on all science and research (including health, climate, energy, and all other fields except NASA) since 1962, which comes to $372 billion.
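(Summing those figures as a deliberately generous upper bound on the public outlay: $121 billion + $50.5 billion + $372 billion comes to roughly $543.5 billion in total, accumulated over decades, against an estimated $721 billion of U.S. Internet-related GDP every single year.)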
So, even if CERN never found the Higgs boson, the National Institutes of Health never saved a child’s life, and all the knowledge we have accumulated from science in the past 50 years was utterly worthless—even then the taxpayer investment would have been a roaring success for society, based on the Internet alone.
Does government funding “crowd out” some private funding of science and favor some research questions over others, as Mr. Ridley argues? Perhaps to some degree, but once again the world has run another revealing natural experiment: Countries vary in how much governments spend on research, with the U.S. leading by a healthy margin. By Mr. Ridley’s argument, it should therefore have the worst science and the least technology, but the opposite is true. Meanwhile in Europe, where governments spend much less on research, economists anxiously study the “European paradox” to understand why EU nations underperform in the tech economy.
The system we have for motivating and funding basic science is far from perfect, and it could use thoughtful criticism to improve it. But basic science isn’t a myth—it is the foundation of the modern world. You’d think that one of our best science writers could see that.
Matt Ridley replies:
The “myth” I was challenging in my article is that basic science is the source of all innovation. I do not argue in the article, or anywhere else, that basic science does not matter, that it plays no part in innovation, or that it should be defunded. I have spent nearly my whole career cataloging and celebrating the achievements of basic science, mainly in genetics and evolutionary biology. I would like to see more money spent on it, and not just by government, because I think it is the richest fruit of our civilization—worth doing for its own sake, not just because it might help advance technological innovation. I think science is probably less well funded than it could be because it is often trapped in a mind-set that only government will fund it and that it must therefore make a utilitarian appeal rather than an inspirational one.
The focus of my article was technology and innovation in technology. One of my aims was to speak up for the little guy, to point out how much innovation is a bottom-up process achieved by ordinary people, not superhuman geniuses, and often by people working in technology itself, rather than in pure science. To put it another way, much innovation happens before the principles underlying the phenomenon are fully understood. That’s not to say that understanding them does not matter, or cannot help with further innovation; it can.
The field of molecular biology is a good example. The technique for reading sequences of DNA bases is fundamental to everything in the field. That technology—and yes, it is a technology, even if it was invented in a publicly funded science lab—was the mother, more than it was the daughter, of a great deal of scientific discovery. A lot of what happens in molecular biology is best described as technology, rather than science, and it rightly attracts much private funding as well as public. CRISPR-Cas9 is a technology, too, and a very valuable one for science.
I am surprised to find Nathan Myhrvold effectively defending the “linear model” of science, in which scientists generously hand their results to technologists to be applied. As long ago as 1982, Nathan Rosenberg in his book “Inside the Black Box” showed clearly that the idea that basic science (and its funding by government) underpinned technological and thus economic growth was wrong. Few students of the economics of science now subscribe to this view.
I have my doubts that the desire to win a Nobel Prize is a significant motivator to scientists, as Mr. Myhrvold implies, though it surely does not hurt. My point in calling Nobel Prizes unfair was not that certain people do not deserve praise, but that every Nobel Prize goes to one team that put one link in a chain of discovery and could often just as easily have gone to the team that put the previous or subsequent link in the chain. It thus misleads both the public and scientists themselves into thinking of discovery as more “eureka” and less collaborative than it is.
As for patents, Mr. Myhrvold holds to the view of patents as incentives to discovery and transparency. This does sometimes happen. But there is a vast literature showing that, especially in non-pharmaceutical areas, they are also increasingly misused as barriers to entry and toll booths for rent seeking. See “Patent Failure: How Judges, Bureaucrats, and Lawyers Put Innovators at Risk” by James Bessen and Michael Meurer (2008) and “Against Intellectual Monopoly” by Michele Boldrin and David Levine (2008). Alex Tabarrok, in his 2011 book “Launching the Innovation Renaissance,” argues persuasively that a little intellectual property is better than none, but a lot is a bad thing, and that U.S. patent law is well beyond the optimal point.
Mr. Myhrvold says that the 2003 OECD study I cite, which found no evidence that public funding of innovation correlated with economic growth, was “challenged in an economics journal, almost as soon as it appeared.” He refers to a book review in an obscure publication called International Productivity Monitor, whose author’s dislike of the study is based solely on the fact that he finds the results on R&D counterintuitive. The magazine boasts on its own website of publishing largely nontechnical articles. Some journal, some challenge.
On the question of copying, Mr. Myhrvold is wrong to think that Mansfield’s 65% to 70% implies good returns for copying, because that was only the direct costs of copying: add the fixed costs of employing skilled copiers in the first place, and it’s reasonable to assume that the full costs of copying are north of 100%. As Alex Tabarrok argues, in practice imitation is often more costly than innovation, because the learning curve of the imitator is so steep.
Mr. Myhrvold also says that, if copying were a route to innovation, Asia would be much more innovative than it is. But Asia is the source of much innovation, and I cite one spectacular recent example: the electronic cigarette, which is currently driving down smoking rates in Europe at a rapid rate. It was developed in China with little input from science, just a combination of technologies.
As for the Internet, the old claim that it was the product of government-funded science is very misleading. Sure, it depends on many innovations and discoveries, including packet switching and, for that matter, electromagnetism and gravity. But, much as I admire Sir Tim Berners-Lee, it’s inconceivable that the World Wide Web or something equivalent would have gone uninvented, given the state of computing and communication.
As Steven Berlin Johnson argues in his book “Future Perfect” (2012), the true origin of the Internet does not lie in brilliant individuals, nor in private companies, nor in government funding. It lies in open-source, peer-to-peer networking. He says it is not even a bottom-up thing, since a bottom implies a top, and there is no top of the Internet. I agree, and this makes my point nicely. Technology evolves through the interactions of many ordinary people, rather than being handed down from the ivory towers of a few elite geniuses.
It needed to be said. And it could not have been done better. But then, consider the source: Myhrvold, had he pursued research, would have been one of the top scientists in the world.
