Don Clark reports in the Wall Street Journal:
HP has created a working prototype of an unusual system that leans heavily on memory technology to boost calculating speed. In conventional computers, processors are the focal point of design while memory is regarded as a scarce commodity. One component already in production is designed to lower the cost of sending data over fiber-optic connections and could make it as fast to transmit data from one machine to another as within a single system. It could also let designers stack components vertically to save space.
Hewlett Packard Enterprise Co. has reached a milestone in a high-profile plan to deliver a new kind of computer. But the company is betting its business may benefit as much from components it developed for the project as from the complete system.
The company on Monday used an event in London to disclose it has created a working prototype of what it calls the Machine, an unusual system that leans heavily on memory technology to boost calculating speed. Engineers have booted up an early version of the hardware in a lab in Fort Collins, Colo., a test of the design the company says will allow programmers to begin working on new software that will be needed to exploit the system.
HP Enterprise isn’t ready to set a timetable for a commercial version of the Machine, a research project unveiled in 2014 that will require novel memory chips that aren’t expected to be widely available until at least 2018. Its near-term plan is to use components developed for the system in its conventional server systems, disclosing Monday it expects to use some of the technology in systems by 2018 or 2019.

Antonio Neri, the executive vice president who oversees HP Enterprise hardware products, said he has directed his staff to “bring those technologies into our current set of products and road maps faster.”
One part of the Machine that is already in production is X1, a chip and related components designed to sharply lower the cost of sending data over speedy fiber-optic connections. HP Enterprise engineers say X1 could make it as fast to transmit data from one machine to another as within a single system. It could also allow computer designers to stack components vertically to save space, breaking from the horizontal circuit boards used in most servers, the company said.
Participants in the project say a complete version of the Machine, a design expected to appear first in servers and later spread to smaller systems, could bring more than hundredfold speedups to chores such as helping airlines respond to service disruptions, preventing credit-card fraud and assessing the risk of vast financial portfolios.
One target application is sifting through vast numbers of computing transactions for signs of abnormal activity that indicate a cyberattack. Today’s systems can analyze 50,000 events a second and compare them with five minutes of prior events, HP Enterprise says. With the Machine, 10 million events a second could be compared with 14 days of prior activity, detecting many more problems, the company said.
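To put those figures in perspective, here is a back-of-the-envelope sketch in Python. The event rates and time windows are the ones quoted above; the bytes-per-event figure is a hypothetical assumption for illustration only:

```python
# Back-of-the-envelope comparison of the two security-analytics
# workloads described above (rates and windows from the article).

events_per_sec_today = 50_000            # events analyzed per second today
window_today_sec = 5 * 60                # compared with 5 minutes of history

events_per_sec_machine = 10_000_000      # events per second on the Machine
window_machine_sec = 14 * 24 * 60 * 60   # compared with 14 days of history

BYTES_PER_EVENT = 100  # hypothetical record size, not from the article

today_set = events_per_sec_today * window_today_sec * BYTES_PER_EVENT
machine_set = events_per_sec_machine * window_machine_sec * BYTES_PER_EVENT

print(f"Today's working set:   {today_set / 1e9:,.1f} GB")    # ~1.5 GB
print(f"Machine's working set: {machine_set / 1e15:,.2f} PB") # ~1.21 PB
print(f"Scale-up factor:       {machine_set / today_set:,.0f}x")
```

Under that assumption, the comparison window grows from roughly 1.5 gigabytes to over a petabyte of event history, which is the kind of working set that motivates a vast shared memory pool.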
The company, created a year ago from the breakup of Hewlett-Packard Co., is the No. 1 manufacturer of the server systems that are fixtures in corporate data centers. It is a tough business, as some customers gravitate toward inexpensive commodity-style systems and others turn to cloud computing services.
Broad pressures are pushing HP Enterprise and others toward new computing approaches. More Silicon Valley companies and researchers argue that new kinds of servers will become necessary to cope with a growing flood of data from connected devices.
Another force driving innovation is the slowing pace of improvements in the microprocessor chips used in most computers. Chip makers for decades packed double the number of transistors on chips every two years or so, a pattern known as Moore’s Law. Those gains in miniaturization used to bring dramatic jumps in speed and energy efficiency, but not lately.
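As a rule of thumb, that doubling pattern can be written as a simple formula; a minimal sketch, with the starting count and time span chosen purely for illustration:

```python
def transistors(initial: float, years: float, doubling_period: float = 2.0) -> float:
    """Moore's Law as a rule of thumb: count doubles every `doubling_period` years."""
    return initial * 2 ** (years / doubling_period)

# Illustrative only: a chip starting at 1 million transistors, 20 years on.
print(f"{transistors(1e6, 20):,.0f} transistors")  # ~1,024,000,000: a thousandfold
```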
“We are in a stage where stuff we were doing before has to change,” said Erik DeBenedictis, a technical staff member at Sandia National Laboratories who studies nonconventional computing technologies. “That hasn’t happened for a very long time.”
The Machine, a project involving hundreds of HP Enterprise engineers, diverges from principles the industry has followed since they were laid out in a 1945 paper by mathematician and physicist John von Neumann.
“It’s breaking every rule,” said Meg Whitman, HP Enterprise’s chief executive.
In conventional computers, processors are the focal point of system design while memory is regarded as a scarce commodity, said Kirk Bresniker, an HP Enterprise chief architect and vice president who also holds the title of fellow. Big calculating jobs typically are broken up to be handled by individual microprocessors, which retrieve data from sets of memory chips or devices like disk drives. In many cases, the processors have to wait as data are copied and transferred back and forth among the various components, he said.
The Machine is designed to make those data transfers unnecessary, Mr. Bresniker said. Hundreds to thousands of processors simultaneously can tap into data stored in a vast pool of memory contained in one server or multiple boxes in a data center.
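The article doesn’t describe HP Enterprise’s programming interface, but the contrast Mr. Bresniker draws can be sketched with standard tools: in the conventional model each worker receives its own copy of the data, while in a shared-pool model many workers attach to one region of memory and read it in place. A rough, hypothetical Python illustration using the standard library’s multiprocessing.shared_memory module:

```python
# Conceptual contrast only; not HP Enterprise's actual interface.
# Conventional: each worker gets its own copy of the data (copy cost).
# Memory-centric: workers attach to one shared pool and read in place.

from multiprocessing import Process, shared_memory

def conventional_worker(data: bytes) -> None:
    # `data` was serialized and copied into this process's own memory.
    total = sum(data)

def shared_worker(pool_name: str, size: int) -> None:
    # Attach to the existing pool; no copy of the payload is made.
    pool = shared_memory.SharedMemory(name=pool_name)
    total = sum(pool.buf[:size])
    pool.close()

if __name__ == "__main__":
    payload = bytes(range(256)) * 4096  # ~1 MB of sample data

    # Conventional model: the payload is pickled and copied to the child.
    p = Process(target=conventional_worker, args=(payload,))
    p.start()
    p.join()

    # Memory-centric model: children share one pool by name.
    pool = shared_memory.SharedMemory(create=True, size=len(payload))
    pool.buf[:len(payload)] = payload
    workers = [Process(target=shared_worker, args=(pool.name, len(payload)))
               for _ in range(4)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    pool.close()
    pool.unlink()
```

On a single machine the copy cost is modest; the Machine’s bet, per the description above, is that the same attach-in-place model can span hundreds to thousands of processors and a far larger pool.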
The prototype system, described in a briefing at HP Enterprise headquarters, comprises long, drawer-like modules that slide into a conventional chassis. Two modules contain 8 terabytes of memory—roughly 30 times the amount found in many conventional servers—with hundreds of terabytes expected in the future as more chips and modules are linked together, Mr. Bresniker said.
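Read literally, the comparison above implies a baseline figure for a conventional server’s memory; a quick check of that reading:

```python
# Implied by the figures above: 8 TB described as ~30x a typical server.
prototype_tb = 8
ratio = 30
print(f"~{prototype_tb * 1024 / ratio:.0f} GB per conventional server")  # ~273 GB
```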
The project has been hampered somewhat by delays in the advent of new, inexpensive memory chips that retain data when a system is switched off. HP Enterprise designed its prototype using industry-standard DRAMs, or dynamic random-access memory chips. Those chips are fast but expensive, and they lose data when a system is powered down. Meanwhile, the company continues to work on options that include the memristor, a new kind of circuitry that retains data without power, but it hasn’t set a date for delivery.
The DRAM-powered system lets programmers write software for the Machine, but it won’t make for a commercially viable product.
“We do need one of these memory technologies to pop for us,” said Andrew Wheeler, an HP Enterprise deputy lab director who also holds the title of fellow.