Can the US Military Re-Invent the Microchip for the AI Era?
As conventional microchip design reaches its limits, DARPA is pouring money into the specialty chips that might power tomorrow’s autonomous machines.
The coming AI revolution faces a big hurdle: today’s microchips.

It’s one thing to get a bunch of transistors on an integrated circuit to crunch numbers, even very large ones. But what the brain does is far more difficult. Processing vast amounts of visual data for use by a huge, multicellular organism is very different from the narrow calculations of conventional math. The algorithms that will drive tomorrow’s autonomous cars, planes, and programs will be incredibly data-intensive, with needs well beyond what conventional chips were ever designed for. This is one reason for the hype surrounding quantum computing and neurosynaptic chips.
That challenge has a sister predicament: the end of Moore’s Law. The integrated circuit revolution that gave birth to the modern computer, the smartphone, and basically all of Silicon Valley is in its twilight. In the 1960s, Gordon Moore observed that the number of transistors per square inch on integrated circuits was doubling roughly every 18 months. That won’t be true after 2020, according to Robert Colwell, formerly of the Defense Advanced Research Projects Agency, or DARPA.
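For a sense of what that doubling pace implies, here is a back-of-the-envelope sketch in Python. The 18-month doubling period is the figure cited above; the function name and the relative starting density are illustrative only.

# Rough Moore's Law arithmetic: transistor density doubling every
# 18 months, per the observation cited above. Values are relative
# to a starting density of 1.0.
def relative_density(years_elapsed, doubling_period_years=1.5):
    """Transistor density relative to the starting point."""
    return 2 ** (years_elapsed / doubling_period_years)

# A decade of 18-month doublings multiplies density roughly 100-fold.
print(relative_density(10))  # ~101.6

That compounding, not any single breakthrough, is what made general-purpose chips keep pace with software’s demands for five decades.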
“There’s a $300 billion-a-year global semiconductor industry that cares deeply about the answer of what comes next,” said Arati Prabhakar, DARPA’s former director. Apple, for example, is said to be working on a processor devoted specifically to AI-related tasks. But DARPA has money in the game as well. The magic of the integrated circuit — the reason it will be so hard to replace — is that it was “a computational unit that you could use to do the broadest possible class of problems,” she said. The way forward will be building chips for specific purposes. “If you’re willing to work on specialized classes of problems, you can actually get a lot more out of specialized architectures,” she said. “Special architectures will give us many more steps forward.”
Bottom line: there is no silver-bullet replacement for the integrated circuit on the horizon. But you could achieve something Moore’s Law-like by creating chips built to crunch one specific type of data very fast. Some of these already exist; they’re called application-specific integrated circuits, or ASICs.
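To see why specialization pays, consider a loose software analogy. This is purely illustrative; real ASIC gains come from dedicated silicon, and the general Python loop versus the single NumPy call below merely stands in for flexible versus fixed-function hardware.

import numpy as np

data = np.random.rand(1_000_000)

# General-purpose path: a fully flexible loop that decides what to do
# at every element, like a processor interpreting arbitrary instructions.
def general_sum_of_squares(xs):
    total = 0.0
    for x in xs:
        total += x * x
    return total

# Specialized path: one fixed-function operation, analogous to hardware
# wired to do exactly this computation and nothing else.
def specialized_sum_of_squares(xs):
    return float(np.dot(xs, xs))

print(general_sum_of_squares(data))      # slow, flexible
print(specialized_sum_of_squares(data))  # same answer, far faster

The two functions return the same result, but the fixed-function version runs orders of magnitude faster; that gap is a small-scale echo of the ASIC advantage.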
On Wednesday, DARPA announced several new next-generation chip design initiatives meant to build off that approach.
One, Software Defined Hardware, seeks “a hardware/software system that allows data-intensive algorithms to run at near ASIC efficiency without the cost, development time or single application limitations associated with ASIC development.”
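What that might look like in miniature: a fixed, efficient execution engine whose datapath is rebuilt by software for each new algorithm, rather than a new chip being fabricated per application. The sketch below is a hypothetical illustration of the concept, not anything drawn from the DARPA program itself.

from typing import Callable, List

class ReconfigurablePipeline:
    """Toy stand-in for software-defined hardware: a fixed engine
    whose processing stages are swapped in by software."""

    def __init__(self):
        self.stages: List[Callable] = []

    def configure(self, stages: List[Callable]) -> None:
        # Reconfiguration: analogous to retargeting the hardware for a
        # new data-intensive algorithm instead of taping out a new ASIC.
        self.stages = list(stages)

    def run(self, value):
        for stage in self.stages:
            value = stage(value)
        return value

pipe = ReconfigurablePipeline()
pipe.configure([lambda x: x * x, lambda x: x + 1])  # one workload
print(pipe.run(3))   # 10
pipe.configure([lambda x: x - 1])                   # reconfigured for another
print(pipe.run(3))   # 2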
A second program, Domain-Specific System on a Chip, takes a dual approach, letting architects “mix and match general purpose, special purpose (e.g., ASICs), and hardware accelerator coprocessors, as well as memory and [input/output] elements, into easily programmed [system on a chip] for applications within specific technology domains.”
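The mix-and-match idea can be sketched the same way. In the hypothetical dispatcher below, each kind of task is routed to the on-chip block suited to its domain, with the general-purpose core as a fallback; the block names are invented for illustration.

def general_core(task):
    return f"general-purpose core ran {task}"

def signal_block(task):
    return f"signal-processing ASIC ran {task}"

def neural_accelerator(task):
    return f"neural accelerator ran {task}"

# A domain-specific system on a chip, in miniature: a roster of
# specialized blocks keyed by the kind of work they were built for.
SOC_BLOCKS = {
    "signal": signal_block,
    "inference": neural_accelerator,
}

def dispatch(task, domain):
    # Route the task to the matching block; anything unrecognized
    # falls back to the general-purpose core.
    return SOC_BLOCKS.get(domain, general_core)(task)

print(dispatch("radar FFT", "signal"))
print(dispatch("image classification", "inference"))
print(dispatch("bookkeeping", "control"))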
In many ways, the premise of these programs comes again from Moore’s 1965 paper. In this case, it’s his observation that, eventually, “the matching and tracking of similar components in integrated structures will allow the design of differential amplifiers of greatly improved performance.”
“With an eye toward the times we now live in, he laid out the technical directions to explore when the conditions under which scaling will be the primary means for advancement are no longer met,” DARPA program managers observed in the Broad Agency Announcement for “Page Three Investments” — an allusion to the actual page in Moore’s paper where the ideas first appear.
The two programs join several others in the Electronics Resurgence Initiative, a $216 million effort to create chip designs for 2030 to 2050.