Thursday, December 06, 2007

IBM researchers build supercomputer-on-a-chip

From Infoworld online:

I know that most of you are familiar with Moore's Law, which concerns the size of computer chips, or tech in general. To simplify it: Moore basically said that every 2 years the number of transistors that can be placed on a chip will double, and at half the cost. Now, with the research going on at IBM, that very formula could apply to supercomputer tech.
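Just to make that doubling concrete, here's a rough back-of-the-envelope sketch in Python. The starting transistor count is purely an illustrative figure I picked, not a spec from the article:

# Moore's Law, roughly: transistor count doubles about every 2 years.
# The starting figure is hypothetical, just to show the curve.
start_year = 2007
start_transistors = 800_000_000  # illustrative ~800 million transistor chip

for years_out in (2, 4, 6, 8, 10):
    doublings = years_out / 2
    projected = start_transistors * 2 ** doublings
    print(f"{start_year + years_out}: ~{projected / 1e9:.1f} billion transistors")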

The technology, called silicon nanophotonics, replaces some of the wires on a chip with pulses of light on tiny optical fibers for quicker and more power-efficient data transfers between cores on a chip. The technology, which can transfer data up to a distance of a few centimeters, is about 100 times faster than wires and consumes one-tenth as much power. The improved data bandwidth and power efficiency of silicon nanophotonics will bring massive computing power to desks. We'll be able to have hundreds or thousands of cores on a chip.
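Putting those two numbers together (and this is my own arithmetic, not a figure from IBM or Infoworld): 100 times the speed at one-tenth the power works out to roughly a thousandfold more data moved per watt.

# Back-of-the-envelope math on the article's figures, not IBM's numbers:
speed_factor = 100    # relative data rate vs. copper wire
power_factor = 0.1    # relative power draw vs. copper wire

data_per_watt = speed_factor / power_factor
print(f"Roughly {data_per_watt:.0f}x more data moved per watt")  # ~1000x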

Now this tech is designed to bring down the total amount of "wire" inside a chip. Each chip contains millions, and in the near future billions, of transistors, and each transistor needs some wire to function. So in the short term, wire will not be completely replaced. On the horizon, though, is a transistor, or light valve if you would, that will store digital information as light instead of electrical energy. This may further shrink a supercomputer to the size of today's cell phone.

Nanophotonics is still a decade away, and storing data as light is only theory and even further off. However, it is conceivable that within a lifetime it may be possible to carry, in one's pocket, the computing power now reserved for governments and thousands of square feet of floor space. Does that sound familiar or what?

2 comments:

Anonymous said...

This is outside my knowledge area, but it seems that the optical side would reduce heat as well (i.e., another bugbear of chips?)

Beam Me Up said...

Oh yes, heat is always a concern of computer engineers, though heat dissipation has always struck me as a bit of an afterthought in their designs. But you're right, at this level of CPU density, heat becomes more of a problem than data latency. Even though it's not mentioned, the added bonus here has to be a major reduction in heat. However, there are differences in how much data certain wavelengths of light can handle. Wouldn't it be terrible if the best data bandwidth is smack dab in the middle of the infra-red?