By definition, a computer is a machine that processes and stores data as ones and zeroes. But the U.S. Department of Defense wants to tear up that definition and start from scratch.
Through its Defense Advanced Research Projects Agency (Darpa), the DoD is funding a new program called UPSIDE, short for Unconventional Processing of Signals for Intelligent Data Exploitation. Basically, the program will investigate a brand-new way of doing computing without the digital processors that have come to define computing as we know it.
The aim is to build computer chips that are a whole lot more power-efficient than today's processors, even if they make mistakes every now and then.
The way Darpa sees it, today's computers, especially those used by mobile spy cameras in drones and helicopters that have to do a lot of image processing, are starting to hit a dead end. The problem isn't processing. It's power, says Daniel Hammerstrom, the Darpa program manager behind UPSIDE. And it's been brewing for more than a decade.
"One of the things that's happened in the last 10 to 15 years is that power-scaling has stopped," he says. Moore's law, the maxim that processing power will double every 18 months or so, continues, but battery life just hasn't kept up. "The efficiency of computation is not increasing very rapidly," he says.
Hammerstrom, who helped build chips for Intel back in the 1980s, wants the UPSIDE chips to do computing in a whole different way. He's looking for an alternative to straight-up Boolean logic, where the voltage in a chip's transistor represents a zero or a one. Hammerstrom wants chipmakers to build analog processors that can do probabilistic math without forcing transistors into an absolute one-or-zero state, an operation that burns energy.
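Darpa hasn't published any circuit designs, but the flavor of the idea shows up in the well-known stochastic-computing trick: encode a value between 0 and 1 as the density of 1s in a random bitstream, and a single AND gate multiplies two such values. The answer is only statistically correct, which is exactly the accuracy-for-power trade Hammerstrom is describing. Here is a rough software analogy in Python; the function name and bitstream length are illustrative, not anything from the UPSIDE program.

```python
import random

def stochastic_multiply(p_a, p_b, n_bits=10_000):
    """Estimate p_a * p_b by AND-ing two random bitstreams.

    Each value in [0, 1] is encoded as the fraction of 1s in its
    stream, so one AND gate per bit pair performs the multiplication.
    The result is only statistically correct: longer streams give
    better answers, shorter streams save work but add noise.
    """
    stream_a = (random.random() < p_a for _ in range(n_bits))
    stream_b = (random.random() < p_b for _ in range(n_bits))
    return sum(a and b for a, b in zip(stream_a, stream_b)) / n_bits

print(stochastic_multiply(0.8, 0.5))  # prints roughly 0.4, plus or minus some noise
```

In hardware, the appeal of working this way is that a gate never has to be driven all the way to a clean one or zero, which is where much of the energy in a conventional digital chip goes.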
It seems like a new idea, and probabilistic computing chips are still years away from commercial use, but it's not entirely new. Analog computers were used in the 1950s, but they were overshadowed by the transistor and the amazing computing capabilities that digital processors have pumped out over the past half-century, according to Ben Vigoda, the general manager of the Analog Devices Lyric Labs group.
"The people who are just retiring from university right now can remember programming analog computers in college," says Vigoda. "It's been a long time since we really questioned the paradigm that we're using."
Probabilistic computing has been picking up over the past decade, Vigoda says, and it's being spurred now by Darpa's program. "They're bringing an emerging technology into the limelight," he says.
Darpa's 54-month program will run in two phases. During the first, companies will build chips using probabilistic techniques. During the second, they will build mobile imaging systems using those chips. Hammerstrom expects the systems to be faster and "orders of magnitude more power-efficient."
"There's a sense that it's time to revisit some of these issues," says Darpa's Hammerstrom. "And this is what Darpa does. We look around and we say, 'This is a place and a time where we could make a difference.'"
Hammerstrom wouldn't say how much Darpa is investing in UPSIDE, but he described it as a "moderate-sized Darpa program."
Six years ago, Vigoda started a company called Lyric Semiconductor to build a "probability processor" that can do the work of many chips. Lyric was acquired by Analog Devices, a maker of chips for medical, cellular, industrial and consumer systems, and Vigoda says that the probability processor could be used in any of those markets.
Probabilistic computing holds two basic promises. One is to open the door to low-power, high-performance computing, especially in areas where the answer doesn't have to be completely perfect, such as image rendering.
That's what researchers at Rice University were getting at earlier this year when they designed a low-power chip that uses probabilistic computing techniques to do energy-efficient, if occasionally inexact, calculations.
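The Rice chip's gate-level details aren't spelled out here, but the trade-off is easy to simulate. The sketch below fakes a hypothetical "inexact adder" that occasionally drops the two lowest bits of a pixel sum; the function name, error rate, and bit-dropping rule are invented for illustration. The point is simply that small, rare errors are tolerable when the output is an image rather than, say, a bank balance.

```python
import random

def inexact_add(a, b, error_rate=0.05):
    """Add two 8-bit pixel values, occasionally dropping the low-order bits.

    Most sums come back exact; a small fraction are off by at most 3.
    An image assembled from these sums still looks fine to the eye,
    which is the kind of accuracy-for-power trade described above.
    """
    exact = min(a + b, 255)           # clamp to the 8-bit pixel range
    if random.random() < error_rate:
        return exact & ~0x03          # "cheap" result: two least-significant bits lost
    return exact

# Blend two rows of pixels; the occasional off-by-a-few value is invisible on screen.
row_a = [random.randrange(128) for _ in range(16)]
row_b = [random.randrange(128) for _ in range(16)]
print([inexact_add(a, b) for a, b in zip(row_a, row_b)])
```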
Another promise is to build new types of chips that can solve some of the complex data analysis problems that are on the cutting edge of today's computer science.
"We're using a few percent of the U.S.'s electricity bill on server farms and we can only do very basic machine learning," says Vigoda. "We're just doing really, really simple stuff because we don't have the compute power to do it. One of the ways to fix this is to design chips that do machine learning."