Tuesday, 14 July 2009

More law? Jude Law? No law.

Moore's Law: PC technology will improve by a factor of two every two years. Similar to the level of hair recession (recedence? Is that a word?!?) that Jude Law has: 50% less every two years; coincidence? I think not... In actual fact, it wasn't Moore who created this "law". Much like Bell and Edison, he simply refined an existing concept(ual statement), took the credit for it and attached his name to it, much like leeches stick themselves to juicy people... If you look at a graph of it, you'll find it mind-numbingly dull. It is in fact a law that is massively exaggerated and only vaguely accurate. PC performance changes far more erratically than a neat doubling every two years. The law is really only concerned with the doubling of transistor counts on a chip, with clock speed tagging along by association. If you're not so interested in computers, then clock speed is how fast time moves. On wheels. Maybe a skateboard.
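For those who like their hand-waving with numbers attached, here's a toy Python sketch of what that "doubling every two years" claim works out to. It's purely illustrative arithmetic from the law as stated above, not real benchmark data:

    # Toy arithmetic for Moore's Law as stated above: the relevant figure
    # (transistor count) doubles roughly every two years.
    # Purely illustrative; not a benchmark, and reality is far more erratic.
    def moores_law_factor(years, doubling_period=2.0):
        """Projected improvement factor after a given number of years."""
        return 2 ** (years / doubling_period)

    for years in (2, 3, 4, 6):
        print(f"After {years} years: roughly {moores_law_factor(years):.1f}x")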

Recently there's been a monumental shift in computing that completely negates pretty much any law dictating speed increases. Graphics cards can now be used as general-purpose processors. The common term is actually GPGPU (General-Purpose computing on Graphics Processing Units), the centre G standing for Gorgeous. Did I mention that I find silicon and circuits very erotic? Doing common computing tasks like encoding or decoding video is about ten or more times faster on a powerful graphics card than on a powerful CPU. Well, maybe encoding and decoding isn't exactly a "common computing task"... That would be more along the lines of googling "cheap tarts" (and getting a Sainsbury's lemony type...). So these last two years have seen the effective power of a PC increase by maybe 15 to 20 times, even though the actual transistor count may have only doubled.
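Again, for the arithmetically inclined, the same sort of toy sketch covers my 15-to-20-times claim. These are the rough figures from the paragraph above, not measurements:

    # Rough figures from the paragraph above, not measurements:
    # transistor counts roughly doubled over two years (plain Moore's Law),
    # and offloading suitable work (e.g. video encoding) to the GPU is
    # reckoned to be ten or more times faster than doing it on the CPU.
    transistor_gain = 2
    gpgpu_speedup = 10
    effective_gain = transistor_gain * gpgpu_speedup
    print(f"Effective improvement: about {effective_gain}x, "
          f"versus {transistor_gain}x from Moore's Law alone")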

I find that part of the problem with PCs is the constant hardware upgrade cycle. It's producing the laziest programming since the dawn of computing; that hallowed moment when some far-sighted genius realised that all of humanity's learning, and our capability to create essentially god-like power, would be best served by allowing us to look up pictures of cats and breasts. However, not to wander off topic: you only have to look at the demoscene coders to realise that with these uber PCs we now have, we're barely pushing them.

Let me give you an example... I've got an over-clocked quad core with four gigabytes of RAM, terabytes of hard-drive space, and a graphics card that could probably calculate the combined gravitational pull of the entire solar system in a few seconds. Yet the games themselves improve by very small increments each year, with regular but very small steps in the graphics department. I'm still playing games that looked great three years ago, yet by Moore's Law they should look nearly three times better than they did then. The demo coders believe in pushing a small amount of processor, memory and graphics to its absolute limit. It's dull to those of you with lives, yet the things these people do with less than a single megabyte of combined memory really put to shame the run-of-the-mill developers releasing "Generic War Game 11".
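I'm not claiming to be one of them, but just to give a flavour of the "small code, big output" spirit, here's a silly little Python doodle: a handful of lines that animate a crude ASCII plasma. Real demo coders do unimaginably more in 64 kilobytes; treat this as a crayon sketch of the idea:

    import math, time

    # A crude animated ASCII "plasma" in a handful of lines.
    # Nothing like a real demo, just a toy nod to the small-code ethos.
    WIDTH, HEIGHT, SHADES = 60, 20, " .:-=+*#%@"
    for frame in range(200):
        rows = []
        for y in range(HEIGHT):
            row = ""
            for x in range(WIDTH):
                v = math.sin(x * 0.2 + frame * 0.1) + math.cos(y * 0.3 - frame * 0.15)
                row += SHADES[int((v + 2) / 4 * (len(SHADES) - 1))]
            rows.append(row)
        print("\n".join(rows))
        time.sleep(0.05)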

I'd like to see these coders combine their talents, but keep their ethics on programming, to see what they can really do with the uber computers almost all of us use on a daily basis now. We should be at a point where games look as good as Pixar movies. Yet because developers are sloppy in their understanding of what the PC can actually do, we're still playing the same-looking Marines vs Aliens(tm, me) goddamn games we were playing years ago, and the big difference now? Daylight can turn into night. Wow-fucking-eee. Still, it could be worse; the last big thing before that was the lens flare. Games industry types kept going on and on about them for about two years, but what are they exactly, for those of you who don't read technical documents on this kind of boring crap? Well, you know those irritating circles and lines you see when you look at a light with your eyes slightly screwed up? Those. Cool, huh. *cough*

We see a similar level of improvement in game graphics on the consoles as on the PC, yet they have static hardware... How can that possibly be, if Moore's Law dictates a doubling of hardware performance every two years? Well, the console-only developers, who actually push the graphics of those systems, TRY HARD! It's like magic, really: they look to themselves and each other, combine their collective talents and work at improving something through increased knowledge. The shame is that PC developers berate them for sticking with end-of-life hardware, but there's nothing substantially better here on the PC than on those aging boxes sitting under your TVs.

Where do the non-MHz, non-speed-based improvements factor into this, I ask, not knowing the answer... Recently-ish some slightly mentally deficient people created a computer dress made of tiles of circuit boards, powered by other tiles of solar cells. Is this monumental piece of technology that's pushed our understanding of the world getting added to the Moore's Law equation? Actually, I hope not. It's silly.

So, what's my point? It may be that I've found a new law. It's going to state: "PCs will double in performance over some unquantified period of time, and man's hairline will recede over a similar period." Jude's Law indeed...

