
A chip that is probably right

March 07, 2009 17:57 IST

Give your computer a break. Does it really have to give you the right answer every single time? Can't it be allowed to screw up once in a while?

Rice University computer science professor Krishna Palem thinks so. He wants to relax the standards of the microchip. "Errors are not bad things if you think about them the way we do," assures Palem, who also directs the Institute for Sustainable Nanoelectronics at Singapore's Nanyang Technological University.

As any SAT-taker knows, getting the exact right answer every time takes a tremendous amount of energy. The same goes for microchips. When trying to add 8,132,004 to 7,081,974, it's much faster and easier--both for a microchip and a brain--to spit out the rough answer "15 million" than to crunch all the numbers to 15,213,978.
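
To see how cheap "roughly right" can be, here is a minimal sketch in Python. It is a software illustration of the arithmetic above, not a model of how Palem's hardware computes: the hypothetical helper rough_add rounds each operand to its two most significant digits before adding, and the rough sum still lands within a tenth of a percent of the exact one.

```python
def rough_add(a: int, b: int, keep_digits: int = 2) -> int:
    """Add two integers after rounding each to its `keep_digits` most significant digits."""
    def round_to_msd(n: int) -> int:
        digits = len(str(abs(n)))
        scale = 10 ** max(digits - keep_digits, 0)
        return round(n / scale) * scale
    return round_to_msd(a) + round_to_msd(b)

a, b = 8_132_004, 7_081_974
exact = a + b                       # 15,213,978
rough = rough_add(a, b)             # 15,200,000
error = abs(exact - rough) / exact  # roughly 0.09%
print(f"exact={exact:,}  rough={rough:,}  relative error={error:.2%}")
```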

By developing and deploying a new type of logic called probabilistic logic, Palem has invented a computer chip that uses roughly one-thirtieth the electricity of a conventional chip while running seven times faster.

Sure, it doesn't always get the answer right, but it is precise enough to be very useful, Palem hopes. It runs on standard silicon CMOS (complementary metal oxide semiconductor) technology and it could boost the life of your cellphone battery by days or even weeks.

"We trade off a small amount of error, get a lot of energy savings and make [the chip] incredibly faster," he says.

Palem's idea first came to him in 2002, after hearing a lecture that the Nobel physicist Richard Feynman had given toward the end of his life. Feynman was pondering what might be the very smallest amount of energy necessary to compute one bit of information.

Feynman was thinking about a bit the way most people would--namely, that the bit would have to deliver the correct answer, a one or a zero. But Palem, aware of another problem facing computing, realized there might be another way to tackle this issue.

As classic computer chips continue to get smaller, it takes fewer and fewer electrons to switch a transistor "off" or "on." At some point, possibly in as little as six to eight years, it will be hard for circuits to distinguish between a meaningful electrical signal and the general "noise" of stray electrons bouncing around the atoms that make up chips.

One way to deal with that noise is to employ logic that takes into account the probability of noise getting in the way, so-called probabilistic logic.

"This logic will prove extremely important because basic physics dictates that future transistor-based logic will need probabilistic methods," said Shekhar Borkar, an Intel Fellow and director of Intel's Microprocessor Technology Lab, in a statement about Palem's discovery.

Palem realized that if he could develop and use probabilistic logic, he could build a chip that didn't have to always be right, while still delivering a "right enough" result. This logic, which he calls PCMOS (the "p" for probabilistic), would make for a faster, more efficient chip, and it could also someday be used to help allow chips to continue to get smaller.

Palem explained the concept to his graduate students using a metaphor of a bank balance. If you have $1,000.01 in your bank account, it is much more important for the bank to get that first "one" in the series right, the one that says you have $1,000.

It is also important, though a little less so, that the bank get the numbers in the hundreds and tens columns right. It's even less important that the bank get the individual dollars right, and it hardly matters at all if the bank gets that last number, the penny, right.

This is how Palem's chip crunches data. It is designed to invest as much energy as needed getting the important numbers right, and much, much less energy getting the pennies right. "Why put in a lot of energy investment, if it's only returning you cents?" Palem reasons.
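
The same intuition can be put into a quick simulation. The sketch below is only an illustration (the per-bit flip probabilities in FLIP_PROB are invented, not PCMOS's actual voltage-scaled error rates, and noisy_read is a hypothetical helper), but it shows what happens when only the lowest-order bits of a stored balance are allowed to be unreliable.

```python
import random

# Hypothetical per-bit flip probabilities: only the lowest-order bits (the
# "pennies") are allowed to be unreliable; all higher bits stay error-free.
FLIP_PROB = {0: 0.20, 1: 0.10, 2: 0.02}

def noisy_read(value: int) -> int:
    """Return `value` with each low-order bit independently flipped per FLIP_PROB."""
    out = value
    for bit, p in FLIP_PROB.items():
        if random.random() < p:
            out ^= 1 << bit
    return out

random.seed(0)
balance_cents = 100_001  # $1,000.01 expressed in cents
errors = [abs(noisy_read(balance_cents) - balance_cents) for _ in range(10_000)]
print(f"average error: {sum(errors) / len(errors):.2f} cents, worst case: {max(errors)} cents")
```

The thousands, hundreds, tens and dollar columns are never touched; the damage stays down in the pennies.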

The logic that now powers chips, Boolean logic, can't handle random failures. So Palem, along with his then doctoral student, Lakshmi Chakrapani, designed a new logic that uses classical Boolean terms like "and" and "or" but allows for some flexibility by incorporating the term "may."

As in, the result may be x or y. From there, the chip finds the traditional Boolean circuit that has the best chance of delivering the right answer.
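
A toy version of that first step can be written in software. In the sketch below, which is an illustration rather than Palem and Chakrapani's actual PCMOS logic, each gate returns the correct Boolean answer with some probability p (the 0.99 figure is arbitrary) and the wrong one otherwise; wiring a few such gates into a one-bit adder shows how often the circuit as a whole still comes out right.

```python
import random

def p_gate(correct: int, p: float) -> int:
    """Return the correct bit with probability p, its inverse otherwise."""
    return correct if random.random() < p else correct ^ 1

def p_and(a: int, b: int, p: float = 0.99) -> int:
    return p_gate(a & b, p)   # an AND gate that "may" be wrong

def p_xor(a: int, b: int, p: float = 0.99) -> int:
    return p_gate(a ^ b, p)   # an XOR gate that "may" be wrong

def p_full_adder(a: int, b: int, carry_in: int, p: float = 0.99):
    """One-bit full adder built from probabilistic gates; returns (sum, carry_out)."""
    partial = p_xor(a, b, p)
    total = p_xor(partial, carry_in, p)
    carry = p_and(a, b, p) | p_and(partial, carry_in, p)  # final OR kept ideal for brevity
    return total, carry

random.seed(1)
runs = hits = 0
for a in (0, 1):
    for b in (0, 1):
        for carry_in in (0, 1):
            expected = ((a + b + carry_in) & 1, (a + b + carry_in) >> 1)
            for _ in range(10_000):
                runs += 1
                hits += p_full_adder(a, b, carry_in) == expected
print(f"full adder correct {hits / runs:.1%} of the time with p = 0.99")
```

Raising p buys correctness and lowering it buys energy, which is the trade-off Palem describes; the sketch does not attempt the second step of searching for the circuit with the best chance of being right.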

This kind of logic might have helped Intel back in the 1990s, when it was lambasted by critics for a flaw in its Pentium processor. Executives at the time contended that the chance of an erroneous calculation was one in 9 billion. Turns out Intel's real miscalculation, however, was misjudging the public outcry over "defective" chips. The company ultimately agreed to take back processors from grumpy customers.

Palem's chip wouldn't be used to calculate something so sacrosanct as money, but audio and video feeds to a small cellphone screen don't need to be nearly as precise as they now are. "Most of the time we over-provide quality," Palem says.

That's partly because, with things like audio and video, there is another computer on the receiving end of the data that can work around the errors--your brain. "There is that chip, the CPU, in your cellphone and there is the chip in your head," he says.
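
A quick numerical sketch suggests how much headroom there is. It is not a model of any PCMOS audio path: it simply generates one second of a 16-bit sine tone in Python, zeroes out the four least significant bits of every sample (the four-bit figure is an arbitrary stand-in for the kind of low-order detail an imprecise chip might get wrong), and measures how little of the signal is actually lost.

```python
import math

SAMPLE_RATE, FREQ = 44_100, 440.0   # one second of a 440 Hz tone at CD sample rate
samples = [
    int(32_000 * math.sin(2 * math.pi * FREQ * n / SAMPLE_RATE))
    for n in range(SAMPLE_RATE)
]
degraded = [s & ~0xF for s in samples]  # zero the 4 least significant bits of each sample

signal_power = sum(s * s for s in samples)
noise_power = sum((s - d) ** 2 for s, d in zip(samples, degraded))
snr_db = 10 * math.log10(signal_power / noise_power)
print(f"signal-to-noise ratio after corrupting 4 low bits: {snr_db:.0f} dB")
```

The degraded tone still measures roughly 68 dB above its error noise, which is the kind of slack Palem means when he says quality is usually over-provided.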

Palem wants to rely on the brain's ability to make a cohesive picture out of fragments, the way it sees light from a bulb as steady, even though it is flickering on and off 60 times per second.

Palem first published his concept in 2003, and he has been working to develop chips ever since, first modeling the math then building actual chips. Last week Palem presented the results of the tests of his first chip, one that can be used to quickly and efficiently generate random numbers for applications like encryption, at the International Solid-State Circuits Conference in San Francisco.

Palem is nearly finished with the design of a more flexible chip, one that could be used in cellphones or iPods, and hopes to produce prototypes by next year. He is also studying the physiology of vision and hearing to see how much "glitchiness" the brain can handle.

"We want to see how much of the chip's load can be offloaded to us," he says.

It's a new world order: To err is digital, to forgive, human.

Jonathan Fahey, Forbes