So, for the last year or so, I've been planning to get a new computer (maybe someday it'll actually happen...). Something close enough to top of the line that I won't need to buy another one for several years. I got the computer I have now back in 2001, when Warcraft III was going into beta (boy, did that chew up a lot of time; kinda like when World of Warcraft came out...). At the time it had an Athlon XP 1700+ (1.47 GHz), 512 megs of PC133 RAM, and a GeForce 2 video card. I've since upgraded to 1 gig of RAM for World of Warcraft, to an Athlon XP 2200+ (1.8 GHz) when my CPU fried (thanks to the heat sink fan failing), and to a GeForce 3 that a friend happened to have lying around (and wasn't going to try to sell). With upgrades, that makes this roughly a 2002 or 2003 computer, which makes it about due for replacement (particularly considering it's a "gaming" computer). With the improved graphics in Burning Crusade (the World of Warcraft expansion pack), the average frame rate I get is about 13 FPS (down from 17 before the expansion).
What I'm currently considering (though if I don't get a new computer till after summer, I could probably get more for the same price) is a Core 2 Duo E6600 (2.4 GHz), 2 gigs of DDR2 memory, and either a GeForce 7900 GS, Radeon X1950 PRO, or Radeon X1950 XT. In terms of raw clock rate, that CPU would be 2.67x as fast as my current one (two cores at 2.4 GHz versus one core at 1.8 GHz), and improvements to per-instruction performance (fewer cycles per instruction) would make the number even larger. But there's a problem: I'd really only see about half of that performance (1.33x my current performance - the single-core speedup - plus the per-instruction improvements). Why? Because nobody is very good at multithreading yet.
We're currently in a transition period. We're rapidly approaching the physical cap on clock speed (my personal prediction is that we won't see clock rates go much past 5 GHz, about twice current speeds, using transistor technology). Already it's much more practical to increase the number of cores/ALUs than to increase the clock speed. That's why dual cores have become almost standard, and quad cores will become standard in the next 5 years or so (though honestly, most people - those that don't do anything CPU intensive - don't NEED a multi-core CPU).
Yet programmers aren't keeping up. This move to parallelization is recent enough that very few programmers have the skills needed to write effective multithreaded code. That's why the Cell, with its 8 cores, is such a terror to program (and the Xbox 360 CPU, with 3 cores, to a lesser extent). Some things are easy to split among many threads - web servers, for example, where you're dealing with a lot of short, independent tasks (not that you'd need a lot of CPU power for a plain web server; maybe it's a digital signature verification server or something); there's a sketch of that pattern below. But can you imagine trying to split your core game logic equally into 8 threads? "Nontrivial" would be a rather dramatic understatement.
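Just to make the "short, independent tasks" case concrete, here's a minimal sketch in C++ (my own toy code, not from any real server - the class and task names are invented): requests go into a shared queue, and a pool of worker threads pulls them off as they arrive. No worker cares what the others are doing, which is exactly why this case parallelizes so easily.

    #include <condition_variable>
    #include <cstdio>
    #include <functional>
    #include <mutex>
    #include <queue>
    #include <thread>
    #include <vector>

    // A bare-bones shared work queue: any idle worker takes the next task.
    class WorkQueue {
    public:
        void push(std::function<void()> task) {
            {
                std::lock_guard<std::mutex> lock(mutex_);
                tasks_.push(std::move(task));
            }
            cv_.notify_one();
        }
        std::function<void()> pop() {
            std::unique_lock<std::mutex> lock(mutex_);
            cv_.wait(lock, [this] { return !tasks_.empty(); });
            std::function<void()> task = std::move(tasks_.front());
            tasks_.pop();
            return task;
        }
    private:
        std::queue<std::function<void()>> tasks_;
        std::mutex mutex_;
        std::condition_variable cv_;
    };

    int main() {
        WorkQueue queue;
        const int kWorkers = 4;  // e.g. one worker per core
        std::vector<std::thread> workers;
        for (int i = 0; i < kWorkers; ++i)
            workers.emplace_back([&queue] {
                for (;;) {
                    std::function<void()> task = queue.pop();
                    if (!task) break;  // empty task = shutdown signal
                    task();
                }
            });
        for (int i = 0; i < 20; ++i)  // 20 short, independent "requests"
            queue.push([i] { std::printf("handled request %d\n", i); });
        for (int i = 0; i < kWorkers; ++i)
            queue.push(std::function<void()>());  // one shutdown signal per worker
        for (std::thread& w : workers)
            w.join();
        return 0;
    }

Contrast that with game logic, where this frame's physics depends on last frame's, the AI reads the physics results, and so on - the tasks aren't independent, so no simple queue will save you.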
Current PC games, such as World of Warcraft (the only game I play at the moment), are still primarily single-threaded. They may have some helper threads that do various things (a music streaming thread is an easy one to write), but those threads use very little CPU compared to the main (single) game logic thread.
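To show how little a helper thread like that demands, here's a toy sketch (the decode and update calls are hypothetical placeholders, commented out so the example still compiles): the streaming thread wakes up periodically to top off an audio buffer and sleeps the rest of the time, while all the heavy lifting stays on the single main thread.

    #include <atomic>
    #include <chrono>
    #include <cstdio>
    #include <thread>

    std::atomic<bool> g_running(true);

    // Helper thread: tops off the streaming buffer, then goes back to sleep.
    // It uses almost no CPU compared to the main game logic thread.
    void stream_music() {
        while (g_running) {
            // decode_next_chunk();  // hypothetical: refill the audio buffer
            std::this_thread::sleep_for(std::chrono::milliseconds(50));
        }
    }

    int main() {
        std::thread music(stream_music);
        for (int frame = 0; frame < 100; ++frame) {
            // update_and_render();  // hypothetical: all the real work, on one thread
        }
        g_running = false;  // tell the helper to stop
        music.join();
        std::printf("main thread did all the heavy lifting\n");
        return 0;
    }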
I can only hope that this will get better with time. My prediction is that in 5-10 years, the ability to break complex tasks down evenly into multiple threads will be mandatory for programming positions at every company producing software that uses a significant amount of CPU (e.g. games). Unfortunately, that doesn't help me now.
Of course, to be fair, multi-core CPUs do have their uses right now. Anything that runs large numbers of threads, including several potentially CPU-intensive ones, can benefit from multiple cores (assuming you don't have a broken driver that disables all but one of them; I'm looking at you, Creative!). Our company just got a dual quad-core system (8 cores in all) for its VM server (hosting many VMs). That's an excellent use, because for the most part individual threads won't consume much CPU, and the load balances well across cores. And, of course, you'll see a noticeable improvement if you typically run something moderately CPU-hungry in the background while you play a game (or run a second CPU-intensive program). But again, neither of those helps me :P
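For the "lots of independent threads" case, here's a small illustration of my own (nothing to do with how a real VM host schedules guests): ask the OS how many hardware threads it has, spawn one CPU-heavy worker per core, and let the scheduler spread them out.

    #include <cstdio>
    #include <thread>
    #include <vector>

    // An independent CPU-heavy job; each instance can occupy its own core.
    long busy_work(unsigned id) {
        long sum = 0;
        for (long i = 0; i < 50000000L; ++i)
            sum += i % (id + 2);
        return sum;
    }

    int main() {
        unsigned cores = std::thread::hardware_concurrency();  // 8 on a dual quad-core box
        if (cores == 0) cores = 1;  // the call is allowed to return 0 as "unknown"
        std::vector<std::thread> pool;
        for (unsigned i = 0; i < cores; ++i)
            pool.emplace_back([i] { std::printf("job %u -> %ld\n", i, busy_work(i)); });
        for (std::thread& t : pool)
            t.join();  // the OS load-balances one busy thread per core
        return 0;
    }

Because the jobs never touch each other's data, the speedup really does scale with core count - which is exactly what single-threaded game logic can't do.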
Finally, a recommendation: if your school offers a course on multithreading optimization, take it. Better to learn now than later, and hopefully that will help accelerate this transition.
Wednesday, May 02, 2007
1 comment:
You can't buy a Dell gaming laptop without acquiring two cores. I suppose owning them will be helpful later on, maybe, but mostly I think we are just expected to hear "this machine has 32 cores!" and go "ooh" and "aah" without realizing that two cores probably doesn't make that much of a difference. :P