That the agency which funded the invention of the internet is now focusing on chip technology should tell us something about where some of the brightest minds in tech see a great opportunity and/or threat. JL
Devin Coldewey reports in TechCrunch:
DARPA is throwing an event around its “Electronics Resurgence Initiative,” an effort to leapfrog existing chip tech by funding powerful but unproven new ideas percolating in the industry. The list of partners and participants includes MIT, Stanford, Princeton, Yale, the UCs, IBM, Intel, Qualcomm and National Labs. Think of it as trying to spur American innovation in important areas that also may happen to have military significance down the line.
The Defense Department’s research arm, DARPA, is throwing an event around its “Electronics Resurgence Initiative,” an effort to leapfrog existing chip tech by funding powerful but unproven new ideas percolating in the industry. It plans to spend up to $1.5 billion on this over the coming years, of which about $75 million was earmarked today for a handful of new partners.
The ERI was announced last year in relatively broad terms, and since then it has solicited proposals from universities and research labs all over the country, arriving at a handful that it has elected to fund.
The list of partners and participants is quite long: think along the lines of MIT, Stanford, Princeton, Yale, the UCs, IBM, Intel, Qualcomm, National Labs and so on. Big hitters. Each institution is generally associated with one of six sub-programs, each (naturally) equipped with its own acronym:
- Software-defined Hardware (SDH) — Computing is often done on general-purpose processors, but specialized ones can get the job done faster. Problem is, these “application-specific integrated circuits,” or ASICs, are expensive and time-consuming to create. SDH is about making “hardware and software that can be reconfigured in real-time based on the data being processed.”
- Domain-specific System on Chip (DSSoC) — This is related to SDH, but is about finding the right balance between custom chips — for instance, for image recognition or message decryption — and general-purpose ones. DSSoC aims to create a “single programmable framework” that would let developers easily mix and match parts like ASICs, CPUs, and GPUs (a toy software analogy of this dispatch idea appears after this list).
- Intelligent Design of Electronic Assets (IDEA) — On a related note, creating such a chip’s actual physical wiring layout is an incredibly complex and specialized process. IDEA is looking to shorten the time it takes to design a chip from a year to a day, “to usher in an era of the 24-hour design cycle for DoD hardware systems.” Ideally no human would be necessary, though doubtless specialists would vet the resulting designs (a toy placement sketch after this list shows the flavor of the optimization involved).
- Posh Open Source Hardware (POSH) — This self-referential acronym refers to a program under which specialized SoCs like the ones these programs are looking into would be pursued under open-source licenses. Licensing can be a serious obstacle to creating the best system possible — one chip may use a proprietary system that can’t exist in concert with another chip’s proprietary system — so to enable reuse and easy distribution they’ll look into creating and testing a base set that has no such restrictions.
- 3-Dimensional Monolithic System-on-a-chip (3DSoC) — The standard model of having processors and chips connected to a central memory and execution system can lead to serious bottlenecks (a small benchmark after this list illustrates the effect). So 3DSoC aims to combine everything into stacks (hence the 3D part) and “integrate logic, memory and input-output (I/O) elements in ways that dramatically shorten — more than 50-fold — computation times while using less power.” The 50-fold number is, I’m guessing, largely aspirational.
- Foundations Required for Novel Compute (FRANC) — That “standard model” of processor plus short-term and long-term memory is known as a Von Neumann architecture, after one of the founders of computing technology and theory, and is how nearly all computing is done today. But DARPA feels it’s time to move past this and create “novel compute topologies” with “new materials and integration schemes to process data in ways that eliminate or minimize data movement.” It’s rather sci-fi right now, as you can tell, but if we don’t try to escape Von Neumann, he will dominate us forever.
These are all extremely ambitious ideas, as you can see, but don’t think about it like DARPA contracting these researchers to create something useful right away. The Defense Department is a huge supporter of basic science; I can’t tell you how many papers I read where the Air Force, DARPA or some other quasi-military entity has provided the funding. So think of it as trying to spur American innovation in important areas that also may happen to have military significance down the line.
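To give the SDH/DSSoC idea some shape, here is a minimal software analogy (my own sketch, not anything from DARPA or the article): a dispatcher inspects each task and routes it to a specialized kernel when one matches its domain, falling back to a general-purpose path otherwise. All of the names are hypothetical, and the real programs would do this reconfiguration in silicon, not in a Python dictionary.

```python
# Toy analogy of domain-specific dispatch: route each task to a
# specialized "kernel" (standing in for an ASIC or GPU block) when one
# exists, otherwise fall back to a general-purpose path (the CPU).
# All names here are invented for illustration.
from typing import Callable, Dict

def general_purpose(task: dict) -> str:
    return f"CPU handled {task['kind']}"          # slow but universal

def image_kernel(task: dict) -> str:
    return f"image ASIC handled {task['kind']}"   # fast, narrow

def crypto_kernel(task: dict) -> str:
    return f"crypto block handled {task['kind']}"

# The "single programmable framework" reduced to a table mapping task
# domains to whatever specialized hardware happens to be available.
KERNELS: Dict[str, Callable[[dict], str]] = {
    "image": image_kernel,
    "decrypt": crypto_kernel,
}

def dispatch(task: dict) -> str:
    # Pick specialized hardware if the task's domain matches one;
    # otherwise run on the general-purpose processor.
    return KERNELS.get(task["kind"], general_purpose)(task)

for t in [{"kind": "image"}, {"kind": "decrypt"}, {"kind": "parse"}]:
    print(dispatch(t))
```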
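The IDEA program is about automating optimization problems of roughly this shape at enormous scale. As a drastically simplified illustration (my own toy, with invented cells and nets, nothing like a production EDA flow), this sketch places a few cells on a grid and keeps random swaps that don't lengthen the total wiring:

```python
# Toy automated placement: put 6 "cells" on a 3x3 grid, then hill-climb
# by swapping positions to shorten total Manhattan wire length.
# Real design tools solve vastly larger versions of this problem; the
# cells and nets below are invented for illustration.
import random

CELLS = ["alu", "reg", "dec", "io", "mem", "clk"]
NETS = [("alu", "reg"), ("reg", "mem"), ("dec", "alu"),
        ("io", "mem"), ("clk", "reg"), ("clk", "alu")]
SLOTS = [(x, y) for x in range(3) for y in range(3)]

def wirelength(place: dict) -> int:
    # Sum of Manhattan distances over every net.
    return sum(abs(place[a][0] - place[b][0]) +
               abs(place[a][1] - place[b][1]) for a, b in NETS)

random.seed(0)
place = dict(zip(CELLS, random.sample(SLOTS, len(CELLS))))
best = wirelength(place)
for _ in range(2000):
    a, b = random.sample(CELLS, 2)
    place[a], place[b] = place[b], place[a]       # propose a swap
    cost = wirelength(place)
    if cost <= best:
        best = cost                               # keep if no worse
    else:
        place[a], place[b] = place[b], place[a]   # undo the swap
print(f"total wire length {best}: {place}")
```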
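The data-movement problem that both 3DSoC and FRANC attack is visible even from Python. The sketch below (mine, assuming NumPy is installed; the numbers are entirely machine-dependent) times identical additions when the operands stream from main memory versus when they fit in cache:

```python
# Crude illustration of the Von Neumann bottleneck: the same in-place
# additions run much faster when the operands fit in cache than when
# they must stream from DRAM, because the dominant cost is data
# movement, not arithmetic. An informal demo, not a rigorous benchmark.
import time
import numpy as np

def ns_per_add(arr: np.ndarray, passes: int) -> float:
    # Average nanoseconds per element-wise addition over `arr`.
    start = time.perf_counter()
    for _ in range(passes):
        arr += 1.0
    return (time.perf_counter() - start) / (passes * arr.size) * 1e9

big = np.ones(20_000_000)    # ~160 MB: every pass streams from DRAM
small = np.ones(50_000)      # ~400 KB: stays resident in cache

print(f"memory-bound: {ns_per_add(big, 10):6.2f} ns per add")
print(f"cache-bound : {ns_per_add(small, 10_000):6.2f} ns per add")
```

On typical desktop hardware the memory-bound case is several times slower per operation, and that gap is exactly what stacking memory on logic, or processing data where it sits, is meant to close.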
A DARPA representative explained that $75 million is set aside for funding various projects under these headings, though the specifics are known only to the participants at this point. That’s the money just for FY18, and presumably more will be added according to the merits and requirements of the various projects. That all comes out of the greater $1.5 billion budget for the ERI overall.
The ERI summit is underway right now, with participants and DARPA reps sharing information, comparing notes and setting expectations. The summit will no doubt repeat next year when a bit more work has been done.