Moore's Law may soon be broken


The future is almost never what you expect it to be. Case in point: Over 40 years ago, Intel co-founder Gordon Moore famously predicted that the number of transistors on a microchip would double every two years. His prediction, dubbed "Moore's Law" by computer scientists, has been used countless times over the past several decades to assure consumers that their electronics will always get faster and better at a rapid clip. Moore's Law has also been a pet theory among futurists and science fiction writers who believe that electronics will soon grow in complexity at a rate so fast it could upend civilization.


And now, at last, it looks as if Moore's Law is about to be broken. It's not because transistors aren't shrinking; it's because we simply don't have the energy to power them once enough of them are loaded onto one chip. Does this mean technological change is going to slow down? Maybe.

The New York Times' John Markoff published an interesting article last week about new research into a problem Moore may not have foreseen:

A paper presented in June at the International Symposium on Computer Architecture summed up the problem: even today, the most advanced microprocessor chips have so many transistors that it is impractical to supply power to all of them at the same time. So some of the transistors are left unpowered - or dark, in industry parlance - while the others are working. The phenomenon is known as dark silicon.

As early as next year, these advanced chips will need 21 percent of their transistors to go dark at any one time, according to the researchers who wrote the paper. And in just three more chip generations - a little more than a half-decade - the constraints will become even more severe. While there will be vastly more transistors on each chip, as many as half of them will have to be turned off to avoid overheating.

"I don't think the chip would literally melt and run off of your circuit board as a liquid, though that would be dramatic," Doug Burger, an author of the paper and a computer scientist at Microsoft Research, wrote in an e-mail. "But you'd start getting incorrect results and eventually components of the circuitry would fuse, rendering the chip inoperable."

The problem has the potential to counteract an important principle in computing that has held true for decades: Moore's Law . . . If that rate of improvement lags, much of the innovation that people have come to take for granted will not happen, or will happen at a much slower pace.
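The arithmetic behind the excerpt above can be sketched with a toy model (my own illustration, not the paper's methodology): transistor counts double each generation, but once Dennard scaling ends, power per transistor falls more slowly than density rises, so a fixed power budget lights a shrinking fraction of the chip. The 1.6x power-scaling factor here is an assumption chosen so the numbers land near the article's figures.

```python
# Toy model of "dark silicon" (illustrative only, not from the paper).
# Assumptions: transistor count doubles per generation; power per
# transistor improves only ~1.6x per generation; the chip's total
# power budget stays fixed at generation zero's level.

def dark_fraction(generations, density_gain=2.0, power_scaling=1.6):
    transistors = density_gain ** generations      # relative transistor count
    power_per = power_scaling ** -generations      # relative power per transistor
    total_power = transistors * power_per          # power to light the whole chip
    powered = min(1.0, 1.0 / total_power)          # share a fixed budget can cover
    return 1.0 - powered                           # the rest must go "dark"

for gen in range(4):
    print(f"generation {gen}: {dark_fraction(gen):.0%} dark")
```

With these assumed constants, one generation out roughly a fifth of the transistors go dark, and three generations out nearly half do, which tracks the figures the researchers cite.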

This doesn't mean that Moore wasn't right back in the 1960s and 70s — in fact, he never predicted that the exponential increase in computing power would continue forever. In the 1970s, he suggested that the "every two years" rate would slow after ten years. He might not have anticipated that the slowdown would be due to power and overheating, but he was clear on the fact that exponential growth can't be maintained forever.

Nevertheless, pundits like Ray Kurzweil have used Moore's Law as evidence that our future will be a technological wonderland where changes are so rapid that we reach a singularity in a matter of decades. He calls it "the law of accelerating returns." And Kurzweil's influential work has helped popularize this idea that our future is going to be the result of a Moore's Law scenario spread out over every kind of scientific field.

Now that Moore's Law is reaching its natural breaking point, will futurists also have to put the brakes on their scenarios for an ultra-fast, high-tech tomorrow? Maybe a slightly slower future won't be so bad, if it allows us to conserve energy for the long term.



Sorry, but this article is just wrong. Observe:

"Almost 50 years ago, Intel co-founder Gordon E. Moore came up with a little idea called Moore’s Law , which basically says that computer processors roughly double in efficiency every two years due to advances in technology along with affordability. So how much smaller, faster and cheaper can computers go? Lots, if graphene , the nanomaterial of the new millennium, has anything to say about that."

Full article here: []

Also, watch this video. Moore's law is alive and doing fine.

Annalee Newitz wrote:

"Maybe a slightly slower future won't be so bad,

if it allows us to conserve energy for the long term."

- What does that even mean? We should just kick back and not continue to work on technological advances that lead to cures for diseases and increases in our own life spans?

I think not.

And lastly:

"Intel President and CEO Paul Otellini said, "Intel's scientists and engineers have once again reinvented the transistor, this time utilizing the third dimension. Amazing, world-shaping devices will be created from this capability as we advance Moore's Law into new realms.""