October 30, 2006

Ignoring Moore's Law

Moore's law soldiers on, with CPUs doubling their computing power every 18 months. Graphical performance is increasing even faster, doubling every six months.
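For a sense of how quickly those rates compound, here is a back-of-the-envelope sketch in Python. It simply takes the 18-month and 6-month doubling periods above at face value; real hardware is lumpier than a clean exponential.

```python
# Back-of-the-envelope compounding of the doubling rates quoted above.
# Assumes clean exponential growth, which real hardware only approximates.

def growth_factor(years, doubling_period_months):
    """How many times faster after `years`, doubling every `doubling_period_months` months."""
    return 2 ** (years * 12 / doubling_period_months)

decade = 10
print(f"CPU (18-month doubling):     {growth_factor(decade, 18):,.0f}x")  # ~100x
print(f"Graphics (6-month doubling): {growth_factor(decade, 6):,.0f}x")   # ~1,000,000x
```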

What is the right thing to do with all this delicious computing power?

Ignore it!

Keeping Lean in a World of Plenty

We live in a world of Moore's law overshoot, where a child's toy does more computation than the world's fastest supercomputers did just a few years ago.

For decades, our computers have been speed-constrained, so we engineers have been trained to use CPU power as completely as we can. When I was learning to program, it took a long time to recompute a big spreadsheet; it took even longer for a word processor to preview a document; and of course it took ages to render a single 3d scene. So we are accustomed to trying to get the fastest CPU possible and work it to the bone. We programmers tweak and tune our algorithms to squeeze out every last CPU cycle. We all love a fast program.

But the fact is, since about 2000, CPU power has overshot what is needed to solve all but a few compute-constrained problems. The same thing that has happened in food and automobiles is happening in computers: we are discovering that more is not better. Better is better.

The interesting unsolved problems now lie elsewhere, in better displays, lower power, smaller form factors... Or going back to the drawing board and asking the basic human questions again....

A Ticket to Uncanny Valley

Not everything is cheap to compute. One domain that continues to suck up computing power is 3d graphics, and AMD (with their purchase of ATI) and Intel and Nvidia and others continue to bet on the hunger for faster graphics chips. Yet arguably even in graphics, Moore's law has gone way beyond what people actually want.

Consider the PS3. This upcoming game console from Sony rides the Moore's law curve ("Moore's law is too slow for us," said Sony's Okamoto). The PS3 chips run at 1.8 teraflops, which is about as fast as the world's fastest supercomputer in 1996. This power will actually be handy for doing protein folding. But beyond advancing medical science, what is all this power doing in your video game machine in 2006?
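For a rough sense of scale, here is a quick arithmetic sketch. It takes the 1.8-teraflop figure at face value and assumes an illustrative 1080p frame at 60 frames per second; those display numbers are my own assumptions, not Sony's spec.

```python
# Rough sense of scale for 1.8 teraflops, under assumed display parameters.
# The resolution and frame rate below are illustrative assumptions, not PS3 specs.

flops = 1.8e12          # the 1.8-teraflop figure quoted above
fps = 60                # assumed frame rate
pixels = 1920 * 1080    # assumed 1080p frame

ops_per_frame = flops / fps
ops_per_pixel = ops_per_frame / pixels
print(f"{ops_per_frame:.1e} floating-point operations per frame")   # ~3.0e10
print(f"{ops_per_pixel:,.0f} floating-point operations per pixel")  # ~14,500
```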

More polygons? Better, more realistic graphics? Sure, the graphics will be more realistic, but "better" is probably not the right word, because it is not clear that gamers actually want more realism than they already enjoy. Early reports on the PS3 suggest that the newfound gigaflops serve mainly to plunge players even more deeply into Uncanny Valley.

Uncanny Valley is the name given to computer animation that is so realistic that the figures look like they are almost made of live human flesh. It is a place inhabited by eerie motion-capture movies like Final Flight of the Osiris and The Polar Express. In Uncanny Valley, bodies are perfectly proportioned, the skin glows and flexes, the human stride bounces, the physics looks real. And yet it is deeply disturbing. What we see in Uncanny Valley are real human bodies that lack the control and emotion of real human minds. Uncanny Valley is a strange, unpleasant place, inhabited by the computerized walking dead.

And many people say that this is one of the reasons graphics seems like an interesting area for Moore's law. Maybe infusing these bodies with the organic imperfection of human minds will take a few more teraflops. Or maybe next year some clever game engineers will figure out how to implement human emotion efficiently within the CPU resources of the PS3. We are not quite there yet.

But still, chasing the CPU-constrained problem is the wrong thing to do.

The Obvious Question

Sony is pumping out every possible ounce of computing power in an effort to improve graphical realism. Sony is pushing the limits; they are not wasting CPU power at all. And that is a sign that they are not asking the right question.

"More realism" answers the question, "what problem can I solve with Moore's law bounty of CPU power?" With the way Moore's law dominates our industry, it is the obvious question for any computer engineer to ask. But it is still the wrong question.

The truth is that the 3d graphics CPU problem for games was solved long ago, when Doom was published and when the Nintendo 64 came out. Since then, the problem with video games hasn't been that they aren't realistic enough. The real problem is that they aren't fun enough.

What would be more fun?

Being Canny

The right solution in today's world of plentiful computing power is to ignore CPU issues completely.

In terms of CPU power, the Nintendo Wii doesn't even keep up with Moore's law. The 2006 Nintendo machine does not even double the power of the already-underpowered 2001 GameCube. Looking at the specs, you get the impression that Nintendo engineers went to the factory and basically said, "make me the cheapest motherboard we can make." And in a twist that will probably puzzle most computer engineers, if you look at Nintendo's showcase games, they don't seem to push the modest CPU to its limits at all. Graphically, they look like 1995. The wasted CPU is probably sitting around idle most of the time.

Instead of obsessing about CPU, the Nintendo engineers focused on the way people play. Kids like to get on a bike and steer it; they like to swing a bat; they like to dress up their stuffed animals and make them talk. My kids have more fun scooting around in a cardboard box than they do with a scale-model replica of a spaceship. Why should we assume that the best way to have fun on a computer is to watch an ultrarealistic simulation and then push two little knobs with your thumbs?

Answering that question produces something remarkable.

More gameplay videos here.

A review of the Wii here.

ABC News suggests that if you're holding off on buying a new PC until Vista has settled, you should consider getting one of these instead.

Based on early reviews, fans are starting to ask: will Zelda TP be the best game ever made?

The three-week wait for the Wii is too much for Cartman on South Park, so he must cryogenically freeze himself until the launch.

Talk about buzz!

Nintendo's innovative design is only half the story, however. Their low-tech bet gives them a second advantage, which I've written about here.

Posted by David at October 30, 2006 02:10 PM
Comments

I think we will first hit the "enough" limit for hard drive space. We will still need more CPU speed for some time, but in general I agree with you. At some point we have to find better ways to use what we already have, instead of throwing more power at n^2 problems or doing 3d simulations.

Posted by: Artak at November 8, 2006 03:26 AM