Thursday, October 19, 2006

2007: The Year of Linux on the Desktop? (hah)

Can you hear that? It’s still faint, but you can hear it if you listen closely. It’s the sound of a million nerds clapping. Sony has been talking for months about the possible availability of a distribution of Linux for their super soaraway PS3, and now it looks like that will actually come to pass. Linus, your operating system may soon see the light of day.

Intel, Microsoft, Dell and HP Squirm

On October 17th, a company called Terra Soft circulated a press release stating that they would be releasing a distribution of Linux that will run on the hot PS3 hardware. It’s called Yellow Dog Linux, and they’re claiming compatibility for a few very important applications. OpenOffice.org, Firefox, and Thunderbird will all run like a charm on the PS3, or so they say. What’s that? Internet, email, and your term paper. Now, Linux isn’t quite ready for prime time yet; it’s still a little complicated for the average user. But this will change. And when it does, people will have one less reason to buy a desktop PC, and one more reason to buy a PS3.

That means a slow shift from desktop PCs to consoles, with laptops as the only mainstay of big OEMs like Dell and HP.

Two Sides of the Fight

This has a couple of other implications besides just Sunday-night-term-papers-on-the-PS3. The world of desktop PCs has several things on the line here, and PC gaming hangs in the balance. There are two possible outcomes: either the PS3 usurps the desktop PC and no one buys them anymore, or the balance holds and people keep buying desktops. Now, with recent desktop/laptop sales figures, it’s pretty apparent that desktops are on the way out as Joe Sixpack machines. Me, you, and everyone we know are buying laptops instead of desktops. It’s already happening. In 2005, more laptops were sold than desktops in the United States for the first time. What do desktops have that laptops are missing? Speedy video cards. You can’t play Oblivion on a 1000-dollar laptop, but you can on a 1000-dollar desktop.

Let’s recap. Desktop sales are dropping. Laptop sales are increasing. You can type up your papers on your shiny new PlayStation 3. Where does that leave Dell? It leaves Dell with lower margins across the line, and no desktop market. Where does it leave EA? Investing in console games. Where does it leave you? Either paying an arm and a leg for a desktop PC you build yourself that consumes an ungodly amount of power (thank ATI and nVidia for that – and I’ll get into it later), or buying a PS3/360. I haven’t mentioned it yet, but don’t think for a second that Microsoft will rest on its laurels while Sony positions the PS3 as a desktop-replacement productivity powerhouse.

Now, I’m not trying to say that the desktop PC market is going to disappear all at once by summer 2007. I’m just pointing out that there is a trend forming here, and that it doesn’t bode well for the desktop market (or for PC gaming).

There’s hope. It’s called ATIMD. And Intel. And nVidia, come to think of it. These three do not want to see consoles win this bloody war, and they’re going to do everything they can to stay in the game.

The Future of Modern Computing

You’ve no doubt heard that AMD recently acquired ATI. You may not have heard that Intel has started hiring GPU (graphics processing unit) engineers, and that nVidia recently started developing a CPU. What we have here is a CPU company that now owns a GPU company, a CPU company developing GPUs, and a GPU company developing CPUs. The timeframe that matters here is 2008; that’s when this all comes together.

Desktops are dead, and the big reason is laptops. The little reason is consoles, but that’s not as important. So how do you keep the PC market alive? Through the only segment that still has life in the US: laptops.

Currently there are a couple of problems with gaming on laptops. The first is heat. To dissipate the heat that your CPU/GPU/RAM create, you need fans, and space. Space means big laptops, and big laptops aren’t so hot, if you know what I mean. The other problem is power consumption. Currently, midrange and high-end laptops have a GPU, a CPU, and a Northbridge/Southbridge, and each one of these consumes power. With the AMD Athlon 64/Turion, the memory controller lives on the CPU itself, so no separate Northbridge is needed; that’s an improvement. The next step is dealing with the CPU and GPU.

Over the course of the last year or so we’ve seen a strong push towards dual-core processors by both Intel and AMD. Intel released their Core Duo processor for laptops in January of 2006, and tied it directly into their Centrino marketing to put two cores in every laptop. The important part is this: what if one of those cores was not a CPU, and instead was a GPU? Then we’d be eliminating a chip, dropping power consumption and heat dissipation, all while increasing the usability of a laptop for things like games.

This means survival. That’s what AMD was thinking about when it bought ATI. That’s what Intel is thinking about when it hires experienced GPU engineers. That’s what nVidia is thinking about when it starts development on a CPU. These are all companies whose doom would largely be spelled by the end of the PC gaming market. If there’s no new game that requires a faster CPU/GPU, why would people upgrade their computers? Furthermore, if laptops are the PC market, how do we keep the PC market alive without games for them? Where does my market go if no one buys my product?

The solution is not to make the games for the laptop; it’s to make the laptop for the games. Increase power without destroying battery life. You keep that laptop a laptop. I want a laptop that I can take to class during the day and plug into my monitor to play games on at night. That does exist today, but it’s big and nasty and I don’t want to pay three thousand dollars for a laptop that weighs 14 pounds. By creating a CPU with powerful GPU elements built in, AMD and Intel can stay in the processor game and keep their margins safe from IBM. By developing a CPU/GPU, nVidia will stay alive, as opposed to falling by the wayside.

There’s even an in-between option: a CPU with minimal GPU elements that can run Windows Vista, so that the real GPU in the laptop can turn off and save energy while on battery power. This is probably what we’ll see first, before the technology reaches its later generations. But eventually, keep an eye out for a transition from buying a CPU and a video card to buying a CPU and a GPU for a second socket, or simply a CPU with a GPU integrated into the processor die.

That’s the future. CPUs that have GPUs on board. It’s going to be a big fight. I like PCs more than Consoles, so you know which side I’m on.

Sunday, October 08, 2006

Why Quad-Core Processors Are a Waste of Money

…for at least the next 18 months.

Because they are. Is that a surprise? How important is it to you, personally, to encode three DivX movies and play Counterstrike at the same time? Extremely important? Congratulations. You just defined yourself as a virtually non-existent minority, freak. Let’s examine why for just a second.

Dual-Core Processors (and why they make sense, kind of)

You like writing papers. And listening to music. You might even like downloading and watching movies. Probably, you’ve got at least two programs open on your computer apart from the operating system: AIM and iTunes. Chances are you leave those running most of the time. Welcome to 2006; you don’t need a dual-core processor.

You play games as well? Whoa there, things just got a little more complicated. Do you make a habit of turning off torrents or Limewire or Morpheus or Bearshare or AIM or Word or PowerPoint or Outlook or Firefox when you play games? If you do, welcome to your savior: dual-core processors. For the first time ever, having two processor cores has a price tag that is reasonable for the average computer user. Before, to enjoy the benefits of Symmetric Multiprocessing (SMP from here on out), you would have had to buy a dual-processor motherboard (expensive) and two processors that had SMP enabled (expensive x2). Now, largely due to competition rather than consumer demand, you can get a single processor with two cores. So instead of needing all that extra expensive hardware, you can just get a normal system. It looks like Dell sells a dual-core Athlon 64 X2 system (with a 19 inch LCD, video card, and a gig of RAM) for about 700 bucks – a whole system that’s equivalent to a dual-processor computer, for around the same price as a PS3 with a couple of controllers and no TV. If you build it yourself you can get an even better deal.

Now when you boot up Counterstrike to FRAG some NUBS, you can leave all your fancy piracy programs open. Counterstrike will run on one core and most of the rest of the programs (and your OS) will use the other one. Now, instead of getting 62 frames-per-second (FPS) while you’re downloading Zarathura, you can get 88.
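The OS scheduler does that split for you automatically, but if you want to see the idea made concrete, here’s a minimal sketch using the Windows API that pins the current thread to the first core and leaves everything else free to use the second. The 0x1 mask and the printouts are just for illustration, not something any game actually requires you to do:

```cpp
// Minimal sketch: confine this thread to core 0 and leave core 1 for
// everything else. The scheduler normally handles this balancing on its own;
// this just makes "game on one core, the rest on the other" concrete.
#include <windows.h>
#include <cstdio>

int main() {
    // Bit 0 set = this thread may only run on the first logical core.
    DWORD_PTR previous = SetThreadAffinityMask(GetCurrentThread(), 0x1);
    if (previous == 0) {
        std::printf("SetThreadAffinityMask failed (error %lu)\n", GetLastError());
        return 1;
    }
    std::printf("Pinned to core 0; torrents, iTunes, and AIM get the other core.\n");
    // ...the game loop would run here...
    return 0;
}
```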

What I’m getting at here is that dual-core processors have practical uses. They can have an effect on your user experience. They CAN. That doesn’t mean that they WILL. Most people consider 35 FPS to be playable for a game. Once you hit 60, you’ve satisfied the vast majority of the minority that plays games on PCs. In my example, you increased your FPS by 26, from 62 to 88. Most people wouldn’t notice the difference there. My single-core computer doesn’t choke when I start up a game while I’m torrenting; it never has. Dual-cores may have an effect, but for most people that effect is negligible.

It is, however, convenient. The game runs on one core, everything else on the other. That makes sense, right? But the difference is really not all that tangible at the moment. In the future it will be.

There are a couple of other cases in which a difference really can be seen. Do you use CAD, ray-tracing, 3D rendering, or video transcoding applications? Dual-core processors are for you. In fact, quad-core processors are for you too, as these are applications that are very easily multithreaded (for the meek: glossing over some details, multithreading basically splits a program into multiple threads, each of which can run on a different processor/core). Don’t know what any of those things I mentioned are? Welcome to the rest of the human race. You don’t really need more than one core.
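For the meeker still, here’s a rough sketch of what “splitting a program into multiple threads” looks like. The transcode_frame() function is a made-up stand-in for any heavy per-frame job (video transcoding, ray-tracing, rendering); the point is that the work gets divided among worker threads, and on a dual- or quad-core machine those workers really do run at the same time:

```cpp
// Rough sketch of multithreading: divide a pile of frames among worker
// threads, one per core. transcode_frame() is a hypothetical stand-in for
// the real per-frame work.
#include <algorithm>
#include <cstdio>
#include <thread>
#include <vector>

void transcode_frame(int frame) { /* ...heavy number crunching... */ }

void worker(int first, int last) {
    for (int f = first; f < last; ++f) transcode_frame(f);
}

int main() {
    const int total_frames = 1000;
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency()); // 2 on a dual-core

    std::vector<std::thread> workers;
    const int per_worker = total_frames / static_cast<int>(cores);

    for (unsigned i = 0; i < cores; ++i) {
        const int first = static_cast<int>(i) * per_worker;
        const int last = (i + 1 == cores) ? total_frames : first + per_worker;
        workers.emplace_back(worker, first, last); // one thread per core
    }
    for (auto& t : workers) t.join(); // wait for every slice to finish

    std::printf("Processed %d frames using %u threads\n", total_frames, cores);
    return 0;
}
```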

The other case is the case of “the future.” Right now most of the benefits of dual- and multi-core processors cannot be seen. The same goes for the ability of modern processors to work in 64-bit chunks, and for virtualization. There are notable exceptions, but for the most part there are no end-user/consumer applications that take advantage of multiple processors or 64-bit. This is the problem of application support. FOR HARDWARE TO BE USEFUL, THERE MUST BE AN APPLICATION BASE TO TAKE ADVANTAGE OF IT! Currently, that application base does not exist. But it will come; there’ll be pressure from many directions on developers to start multithreading their applications. So, if you’re upgrading now and don’t plan to upgrade again for a while, splurge on a dual-core. The software will catch up with your computer.

A Case for Dual-Core: Alan Wake

Let’s look at a good example. Remedy, the studio that developed the Max Payne franchise, is set to release a new game in the not-so-distant future. It’s called Alan Wake. At IDF, Intel used Alan Wake as an example of a game that takes advantage of its new quad-core processors. According to Remedy, the damn thing won’t even run without a dual-core processor. Now, I don’t think that’s how it will end up, but I think it’s reasonable to assume that unless you have a dual-core, you won’t be able to have all the eye candy and advanced physics turned on.

There’s a reason Alan Wake takes advantage of more than one processor core: Remedy designed it from the very beginning to do so. One core was used for the graphics thread (preparing graphical information for processing by the GPU), another for game logic (AI and whatnot), a third for physics, and the last for anything else the computer needed to do. This is both a case and a caveat. The case is that Alan Wake is an example of a mainstream program that will take advantage of being able to run multiple threads simultaneously. The caveat is that Remedy had to build it as a multithreaded application from the start, and that of those threads the most taxing one still only runs on one core. This means that most games coming out in the next 12-18 months probably won’t be all that well multithreaded. Some might be able to do what Alan Wake does, but even that is crude at this point. Of all those tasks, the really important one is the graphics information; the entire game waits on it to move forward. That’s still assigned to one core. That’s a problem, and it’s not going to be fixed this year, or next year. Probably not even in 2008.
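To make that caveat concrete, here’s a rough sketch of the shape of the idea: one thread per subsystem, with the frame unable to finish until the graphics thread does. This is not Remedy’s actual code, and the subsystem functions are hypothetical; it’s just an illustration of why the whole game still moves at the speed of that one core:

```cpp
// Rough sketch (not Remedy's code): one thread per subsystem each frame.
// The frame isn't done until the graphics-preparation thread is done, so
// the game as a whole still runs at the speed of that single core.
#include <cstdio>
#include <thread>

// Hypothetical per-frame work for each subsystem.
void prepare_render_data(int frame) { /* build the data the GPU will draw */ }
void update_game_logic(int frame)   { /* AI, scripting, input */ }
void step_physics(int frame)        { /* collisions, rigid bodies */ }

int main() {
    for (int frame = 0; frame < 3; ++frame) {   // stand-in for the game loop
        std::thread render(prepare_render_data, frame);
        std::thread logic(update_game_logic, frame);
        std::thread physics(step_physics, frame);

        logic.join();
        physics.join();
        render.join(); // everything waits on the graphics work
        std::printf("frame %d finished\n", frame);
    }
    return 0;
}
```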

Quad-Core Processors (and why you’re an idiot if you buy one)

So you need that extra power, eh? Those extra cores? That extra FPU power? My roommate just got a system with two dual-core processors in it. What’s it doing right now with all that power? It’s running SETI@home. Four times. It’s really doing a great job processing those work units. What else does it do? Not a whole lot. You do not need four cores to run games, read email, chat with your friends, or write papers. So don’t buy a quad-core. Period. Wait two years, and then buy one. Then they might be useful, you know, when the software catches up with the hardware. We still don’t even have programs that really take advantage of SSE3 optimizations, and that was released with the Pentium 4 more than two years ago.

What’s the moral of the story? If you must be on the cutting edge, get a dual core. You might even get a chance to take advantage of it. If you’re happy with what you’ve got, sit tight and watch the goofy masses spend their thousands on power that they couldn’t use if they tried.