One is the notion of the lone inventor, fighting the odds and the skepticism of colleagues, as the source of the greatest technological advances.
The other is that the success of the end product is the only measurable outcome worthy of note.
But as the following article explains, enterprises are learning that collaborative teams may be more productive than lone wolves, especially at scale, and that process improvements may ultimately be the most valuable result of an innovation initiative.
Organizations capable of and disciplined enough to ignore traditional forms and measures may be better able to recognize - and act upon - improvements wherever they find them, achieving optimal returns on their investment and effort. JL
Greg Satell comments in Digital Tonto
Innovative enterprises have learned the value of instilling iterative process(es) across integrated, multidisciplinary teams within their organizations. As it turns out, if we are to solve our biggest and toughest problems, we need to learn how to implement that same level of collaboration across our entire society.
By the mid-1980’s, the American semiconductor industry seemed like it was doomed. Although US firms had pioneered and dominated the technology for decades, they were now getting pummeled by cheaper Japanese imports. Much like cars and electronics, microchips seemed destined to become another symbol of American decline.
The dire outlook had serious ramifications for both US competitiveness and national security. So in 1986, the American government created SEMATECH, a consortium of government agencies, research institutions and private industry. By the mid 1990’s, the US was once again dominating semiconductors.
Today, SEMATECH is a wholly private enterprise, funded by its members, but its original model is being widely deployed to solve new problems, such as creating next generation batteries, curing cancer and reviving American manufacturing. The truth is that some of the problems we face today are simply too big and complex to be solved by any one organization.
The Valley of Death
To understand the nature of the problem let’s look at how penicillin was brought to market. As many people know, Alexander Fleming originally discovered the miracle cure in 1928, when his bacteria culture was contaminated by a mold. What most people don’t realize is that his findings sat in an obscure medical journal for a full decade before anyone noticed them.
In fact, penicillin wasn’t deployed until 1943, when the US military used it to cure soldiers in World War II. And it wasn’t until 1945—nearly two decades after the initial discovery—that penicillin was made available to the general public. In the interim, researchers needed to figure out how to isolate penicillin, make it stable and produce it in large quantities.
But lest you think that type of thing is a relic of the past, consider the case of immunotherapy, a radical new cancer treatment that has given hope to even the most terminal cancer patients. Some, who once dreamed of living long enough to see their children's high school graduations, are now living happy, vibrant lives and enjoying their grandchildren. The initial breakthrough came in 1987, but the first drug wasn't approved until 2012—25 years later.
This gap between initial discovery and commercialization is known as the “Valley of Death,” because while research institutions have a clear mandate for basic research and private enterprises are highly adept at bringing new products to market, many pathbreaking new technologies languish in the stage in between, when there is promise but no proof.
The Era of Big Science
Much has changed since penicillin was discovered. The US government now invests nearly $150 billion in R&D, about half of that in fundamental, discovery driven research focused on exploring new horizons. The private sector spends three times that, mostly on translational research intended to transform basic discoveries into new products.
Yet the Valley of Death remains. There is a wide chasm that separates an initial discovery from a viable product idea. Every year, tens of thousands of papers are published in scientific journals and any one of them, potentially, could hold the key to the next big thing. But for a private firm to invest millions of dollars in a new idea, there has to be more to go on.
Another problem is that the research institutions themselves—government labs and research driven universities—have become so large that they’ve become notoriously hard to navigate. At the same time, the marketplace has become so fiercely competitive—and investors so demanding—that few are willing to take a flyer on an unproven technology.
To bridge that gap, new organizations have risen up that build on the original SEMATECH model to solve our toughest problems. The Joint Center For Energy Storage Research (JCESR) has a five year mandate to develop next generation battery technologies. The Institute for Applied Cancer Science at MD Anderson (IACS) is exploring revolutionary new cures and the National Network for Manufacturing Innovation (NNMI) is working to revive US production capacity.
Innovating The Innovation Process
The key to making these organizations work is integrating the work of discovery driven researchers, applied scientists and engineers in the private sector. “Usually discovery propagates at the speed of publication,” George Crabtree, Director at JCESR told me. “But here, we can operate within the time frame of the next coffee break.”
That’s really essential, because the different players often have widely divergent incentives. Giulio Draetta, the Director of IACS who formerly headed up global basic research at Merck, points out that for academic researchers, publication is the coin of the realm, so they are focused on uncovering results that are new and exciting, not on marketability.
Profit driven companies, on the other hand, feel so much pressure to go to market that they often pass on ideas with vast potential. “Venture capitalists are looking for rapid exits and researchers are pretty much stuck with the idea that they came to the party with. That means it’s very hard to cull bad ideas and lots of time, effort and money are wasted,” Draetta says. Yet he has also found that when you put people together, they tend to widen their perspectives.
”Once you have everybody sitting around the table, it’s much easier to come up with new ideas and to discard others that will not work,” he notes. He has also found that, over time, academic researchers realize that there is a great opportunity to put out better publications and win more funding, while drug developers learn to build relationships, offer input and be more flexible about exploring new directions.
Killing Ideas Faster
One of the major advantages of this more integrated approach is that product developers can steer discovery driven researchers in more fruitful directions. In battery research, for example, scientists have long focused on finding materials with greater energy density. Manufacturers, on the other hand, value safety just as much as performance, so that they don’t have to add extra shielding to the battery that increases weight and diminishes overall efficiency.
Draetta of IACS has found many of the same issues arise in pharmaceutical research. “That’s exactly what we face all the time,” he told me. “Often there are factors related to manufacturing costs, potential drug interactions and other things that research scientists aren’t aware of.” Armed with that knowledge, work can be directed toward paths that offer greater viability.
This is absolutely crucial to accelerating innovation because, once a scientist embarks on a particular direction, years can be spent performing the necessary research, verifying the results and preparing it for publication. If that work turns out not to be useful, enormous amounts of time and effort can be wasted. However, if nonviable research can be pruned at the next coffee break, resources can be focused on areas that are more likely to lead to a true breakthrough.
Simply put, integrating efforts earlier in the process can save tremendous amounts of time, money and other resources, while at the same time producing better results.
Putting Manufacturing At The Center
Getting input from manufacturers may seem like a no-brainer, but the truth is that it’s fairly difficult for scientists to do so in a systematic manner. An academic doing exploratory research in, say, genetics or materials science, can’t simply pick up the phone and ring up a manufacturer’s hotline to see what engineers are thinking. Often, that information is considered proprietary.
Yet President Obama’s Advanced Manufacturing Initiative is working to change that. In 2011, he commissioned a report that called for an “innovation policy” rather than an “industrial policy.” The result of that report has been to set up a network of institutes that act as hubs for innovation in areas like 3D printing and integrated photonics.
So far, the program has exceeded the expectations of the initial plan. While the program was intended to receive matching funds for the initial 5-7 year setup period, manufacturers have exceeded that goal by roughly 50%. Firms have also invested additional money to build facilities near the hubs to better integrate their operations with them. Congress has recently passed legislation to expand the program and ensure its continuity.
Manufacturers are excited about the program as well. Dr. Mukesh V. Khare, IBM’s VP of Semiconductor Research told me, “We found in our previous involvement with SEMATECH, that these types of consortia help us adapt investment, supply chain and retooling for the future. So for photonics, which is a really crucial area for us, it made all the sense in the world to join the NNMI hub.” The company has organized its own Research Frontiers Institute along similar lines.
America has lost 5 million manufacturing jobs since 2000, so integrating scientists into the production process is just as important as integrating manufacturers into the discovery process.
A National Innovation Ecosystem
The innovation architecture set up after World War II has served America extremely well. Publicly funded research, paired with a vigorous private sector has made the US a leader in virtually every area of advanced technology. Other nations may challenge that leadership in one industry or another, but nobody can match the breadth and depth of the United States.
However, the problems we need to solve today are far more complex than before. The journal Nature recently noted that the average scientific paper today has four times as many authors as one did in 1950 and the work they are doing is far more interdisciplinary and done at greater distances than in the past. That greater complexity means that we need to design new approaches to address market failures like the “Valley of Death.”
So it is imperative that we build a more complete ecosystem on top of the post-war architecture. “In the 1980’s and before there was a continental divide that separated basic and applied science,” Ron DePinho, President of MD Anderson Cancer Center, told me. “Since then, the idea of translational research has taken hold, but we still have an inefficient organizational infrastructure to pursue conversion of discovery into new therapies in any systematic way.”
“As one example, there are 8,000 early detection candidate cancer biomarkers in the literature. Yet only two have progressed to where they can be clinically effective. At MD Anderson, we’ve tried to change that by reaching out to all areas of the ecosystem, including academic institutions, venture capitalists and industrial players ranging from big pharma to startup biotechnology firms, because we believe that the best science happens when we get everybody’s perspective.”
Many innovative enterprises have learned the value of instilling this type of iterative process across integrated, multidisciplinary teams within their organizations. As it turns out, if we are to solve our biggest and toughest problems, we need to learn how to implement that same level of collaboration across our entire society.