A Blog by Jonathan Low

 

Jun 8, 2015

Intel and Rivals Race to Prepare for the New Moore's Law

Co-evolution rules: as algorithms and data come to dominate technological applications, changes in hardware become imperative in order to keep up. JL

Cade Metz reports in Wired:

The giants of the Internet are changing the hardware—including the chips—that drive their services. Google, Facebook, Chinese search company Baidu, and others are now using GPUs—chips originally built for rendering graphics—to power the latest artificial intelligence tools, including speech recognition, image recognition, and natural language processing.
Microsoft researcher Doug Burger received more than his usual share of email.
On Monday, Intel told the world it was spending $16.7 billion in cash to acquire a company called Altera. And perhaps more than anyone, Burger understands why this deal makes sense for the world’s largest chip maker. At Microsoft, he cooked up a new way of powering the company’s Bing search engine using the low-power programmable chips sold by Altera, pairing them with traditional microprocessors from Intel.
Asked how he views the Intel acquisition, Burger is understandably coy. “For me,” he says, “it meant I answered a lot of extra mail.” But he sees a very real future for the unexpectedly effective chips offered by Altera. As he points out, Microsoft has publicly said that Altera’s chips can boost the speed of Bing in big ways, and that they’re suited for use inside the “deep learning” systems that are rapidly giving online services the ability to identify faces in photos and recognize speech.
In acquiring Altera, Intel is embracing a movement poised to reshape so many of the data centers that underpin the world's online services. IBM is also exploring the use of Altera chips, as are researchers at the University of Michigan. On the whole, these chips aren't as powerful as the CPUs that traditionally drive computer servers, but engineers can program them to perform specific tasks with a new level of efficiency. "People are doing so many experiments," Burger says, "trying to figure out the right combination of algorithm and platform that gives you the best results."

Streamlining the Future

Altera makes what are called field programmable gate arrays, or FPGAs. Such chips have been around for years. Typically, engineers use them as a way of prototyping new processors—trying out new designs. Companies also put them into specialized hardware, including gear that routes data across computer networks. But a few years ago, Burger realized that if Microsoft put them into servers, they could also streamline the operation of a sweeping online service like Bing.
Last summer, Microsoft unveiled a pilot project—Project Catapult—in which it tested a network of about 1,600 servers that pair Intel CPUs with Altera FPGAs. According to Burger, the Altera chips could process certain Bing algorithms about 40 times faster than traditional CPUs, providing a boost that could make Microsoft's search engine twice as fast on the whole. Basically, the chips help decide how to rank the items that turn up on Bing's results page.
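Those two figures—a 40x speedup on certain algorithms yielding roughly a 2x end-to-end gain—are consistent with Amdahl's law. As a rough sketch (the fraction of Bing's work actually offloaded to the FPGAs is an assumption derived from the article's numbers, not something Microsoft has published), you can back out how much of the workload would need to be accelerated:

```python
def overall_speedup(p, s):
    """Amdahl's law: overall speedup when a fraction p of the
    total work is accelerated by a factor s."""
    return 1.0 / ((1.0 - p) + p / s)

# Solve 1 / ((1 - p) + p/40) = 2 for p: the offloaded fraction
# that turns a 40x kernel speedup into a 2x overall speedup.
p = 20.0 / 39.0  # ≈ 0.513, i.e. roughly half the work

print(round(overall_speedup(p, 40), 2))  # → 2.0
```

In other words, if about half of Bing's per-query work runs 40 times faster on the FPGAs, the whole pipeline roughly doubles in speed—which matches the article's framing.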
At the time, Burger told WIRED that Microsoft would move Bing to this kind of hardware in early 2015. He now says the company has yet to make the transition, but it will. “My team is really heads-down right now,” he says, “trying to make that happen for live customer traffic in the data center.”

‘Absolutely Critical’

The move is just one way that the giants of the Internet are changing the hardware—including the chips—that drive their services. Google, Facebook, Chinese search company Baidu, and others are now using GPUs—chips originally built for rendering graphics—to power the latest in artificial intelligence tools, including speech recognition, image recognition, and natural language processing. And like Microsoft, various companies are exploring other low-power chips that can reduce costs and perhaps boost speeds in the data centers. Google and Facebook are looking at silicon based on the ARM chips that power most smartphones.
According to a recent study from the University of Michigan, if you operate a voice recognition service akin to Apple Siri on traditional hardware, it requires about 168 times more machines, space, and power than a text-based search engine along the lines of Google Search. GPUs and FPGAs, the study shows, can shrink that gap. “It’s going to be absolutely critical that future data center designs include GPUs or FPGAs,” Michigan professor Jason Mars recently told WIRED. “You can get at least an order-of-magnitude improvement.”
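The scale of that gap is easy to make concrete. As a back-of-the-envelope sketch using the study's numbers (the baseline fleet size here is a hypothetical illustration, not a figure from the study), a 168x machine requirement trimmed by an order-of-magnitude accelerator improvement still leaves a large footprint:

```python
# Illustrative arithmetic from the Michigan study's figures.
TEXT_SEARCH_MACHINES = 100    # hypothetical baseline fleet for text search
VOICE_FACTOR = 168            # voice needs ~168x the machines (per the study)
ACCELERATOR_SPEEDUP = 10      # "at least an order-of-magnitude improvement"

voice_cpu_only = TEXT_SEARCH_MACHINES * VOICE_FACTOR       # CPU-only fleet
voice_accelerated = voice_cpu_only // ACCELERATOR_SPEEDUP  # with GPUs/FPGAs

print(voice_cpu_only, voice_accelerated)  # → 16800 1680
```

Even with accelerators, a voice service on this sketch still needs roughly 17x the machines of its text-based counterpart—hence Mars's insistence that GPUs or FPGAs are "absolutely critical" to future data center designs.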
Though Microsoft initially described Project Catapult as a way of boosting the performance of Bing—a good old text-based search engine—the company is also exploring FPGAs as a way of running voice recognition tools and other artificially intelligent systems, as it described in a recent research paper. Like GPUs, the company believes, FPGAs are suited for use with "neural nets"—networks of machines that approximate the networks of neurons in the human brain—used to power services that recognize images and spoken words. "The results in that paper were pretty compelling," Burger says.
According to Burger, FPGAs are better suited to a much wider variety of tasks than GPUs. “The FPGA is less efficient for the stuff that’s exactly suited to GPUs,” he explains, “but for everything else, they can be more efficient, because you can customize the pipeline.” He points out, however, that programming an FPGA may require more work and new engineering skills.
"The FPGA solutions are harder to change than software," he says. "A really, really good FPGA designer can bring up some of these stacks pretty quickly. But there tend to be, I think, fewer people that do that really well than there are GPU programmers."

The New Moore’s Law

As the Microsofts and the Googles began to explore these new types of chips, Intel was very much on the outside looking in. GPUs are made by companies like Nvidia. Burger calls Altera and its rival Xilinx "the Coke and the Pepsi" of FPGAs. But things are changing.
Intel sees where the market is headed, and it wants to be there. The company has already built experimental motherboards that include both Intel CPUs and Altera FPGAs, and after agreeing to acquire Altera, it intends to offer such boards “as highly customized, integrated products,” even as it works to “enhance Altera’s products through design and manufacturing improvements.”
Burger declines to discuss the Intel-Altera deal, except to say that with Intel controlling a majority of the worldwide market for server chips, this is sure to create a "very interesting dynamic" in the world of FPGAs. Intel says it can better integrate CPUs with FPGAs. But in a way, the deal also creates less competition in the marketplace. As Burger puts it, it's no longer Altera's Coke to Xilinx's Pepsi.
Citing various analyst reports—"I'm not sharing my opinion," he says—Burger does indicate that the deal is a sign of an even larger transformation in the chip world. As we approach the demise of Moore's Law—the notion that raw chip power will double every 18 months—chip manufacturers like Intel must find new ways of moving their business forward. If they can't build faster CPUs, they can at least offer a new breed of chip. As Microsoft is using it, that's what the FPGA amounts to.
