Wednesday, April 26, 2006

Yes. YES.

So I was browsing the news wires this afternoon when I stumbled upon something I didn't expect to see. "Techniques of Written Storytelling Applied to Game Design," by Jeff Noyle on Gamasutra. In my first post on here I discussed the break between the story and the gameplay in most modern games. What I pointed out was that the story was decidedly separate from the game, and that was a big problem because it interfered with the player experience.

Well, I'm not the only one who thinks this, thank god. Jeff Noyle brings up some fantastic points, the most important in my opinion being the "Show, don't tell" part of the article.

In our little progression that flows from books to movies to games, we have varying degrees of visualization. Books are all about interpretation of events for personal reflection. Movies take that a step in the visual direction, giving us a visual representation of what the director/writer wants us to see. In games, it's important to take this even further so that the player is not removed from the game experience by unsightly text or undue intrusions into the player's agency.

There are several different ways to affect player agency negatively. The moment you take control from the player and give it to the system, you've got a problem. The moment you pop up text on the screen explaining ANY situation you've got a problem. The moment you help the player figure out a puzzle or a question using anything other than the construct the player is already in, you have a big problem.

Just like Noyle says, "the player is smarter than you think." The moment you assume the player is stupid and simply tell them that the book on the table has the code for the door to their right, you pull the player out of the experience; they're removed from the game instead of having one. Not that a book with a code in it would have been a particularly engaging experience in the first place, but you get my drift.

It is important for the story to be an integral part of the gameplay. That way, the gameplay will guide the player instead of having the story tell the player what to do. Always allow the player to assimilate information on his/her own; don't tell them "We're equipping your boat with a weapon, make it through the swamps and get to our base on the other side because otherwise the aliens will win." Instead, make it apparent that it would be a good idea to get through the swamps, and have the player ask for the weapon upgrade. Maybe the aliens are hot on the player's tail. Allow the player to play a game of discovery instead of just listening and reacting. The more involved the player is in the storyline, the better. In fact, why not just make discovery of the storyline part of the storyline?

Thursday, April 20, 2006

Oh God Sony, What Have You Done?

Doomed, we're all doomed!

So many people are holding on to their spending dollars for the ps3 that we're actually seeing a slump in game sales, even more than is normal when a new generation of consoles is released. You can bet that when the ps3 is released we won't be seeing any of these month-on-month decreases. With Sony and Nintendo both shooting for Christmas 06, I have a feeling that's going to be quite the quarter for games... maybe pull us out of this horrible black hole?


Edit (Jan 07): Looks like it doesn't even matter because THERE ARE NO GAMES.

Wednesday, April 19, 2006

A Little Birdie Told Me Intel Would Lose the War

So why are console games more popular than PC games? Let's see. You don’t need an expensive computer to run them… no technical knowledge required to buy them and set them up. You don’t have to update your video card drivers. Oh yeah, and you use a controller instead of a keyboard and mouse.

Could this be the biggest reason that consoles are breaking into living rooms more than PCs are breaking into the worldwide gaming market?

Over the last six or so months, Intel has started hyping its ViiV platform. Much like the “Centrino” mobile platform (which requires a specific CPU, chipset, and wireless card), for computers to sport the ViiV sticker they will be required to have very specific hardware. CPU, GPU, OMG WTFDRM. This whole situation has been (in my opinion) a response to the positioning of the new consoles (360 and ps3 specifically) as more than just game consoles. Sony and Microsoft don’t have to work to get their hardware into the living room, whereas Intel is already well situated in the office. Computers are not for fun, they’re for work. Joe Sixpack buys a computer so he can check email and find porn, not so he can play Oblivion – and he certainly doesn’t intend to upgrade his video card every 8-12 months. He ESPECIALLY will not pay more than 300 dollars for the aforementioned video card when the total cost of his computer was probably between 1.5 and 3 times that.

So what am I saying here? Well, Intel has a big problem. Let's look at the players here – Sony, Microsoft, Nintendo, Intel, and AMD. There’s someone missing there. Where are the similarities between the ps3, the 360 and the Revolution?


All three of the new next-gen consoles sport processors developed and produced by IBM. You can bet that Intel isn’t happy watching IBM beat it to the living room and consoles eat into its high-end CPU sales, and it’s trying to do something about it. Enter ViiV, stage right: a new platform for “Digital Media.” Intel is using it (too late, in my opinion) to try to break the stranglehold that Sony and Microsoft essentially already have on the American living room. So where does interface play a part in all of this?

Well, let’s see here. The Nintendo Entertainment System launched in October of 1985. It could play games, and it single-handedly pulled the US video game market out of the doldrums. How did one interact with this particular system? Well, it had an A button and a B button and a d-pad and… well, you get the idea. It was a controller. How did the system work? Plug it into the TV and turn it on. Cool. Easy. No technical knowledge required. Because there was no learning curve, people with no computer savvy bought consoles in droves. And they still do to this day.

Enter my generation. I personally didn’t have access to any consoles before the 64 because my parents didn’t like them, but I did have a computer. I played Empire, Stronghold, and Railroad Tycoon. Just to give an example of why most people didn’t play games on PCs back then: every time I wanted to play Stronghold I had to edit and reload the autoexec.bat file in DOS. Know what that means? Yeah, didn’t think so. It basically closed PCs out of being the gaming system of choice. People bought consoles. Now, we don’t have to deal with that any more, but it was a big problem back then – DOS was not created in order to play games, it was (stolen and) developed so that Mr. Gates had something to show to IBM after they made a deal. Oh, there’s IBM again. Funny they keep coming up, now that they’re not a part of the PC market at all. I’m not saying that there’s some kind of rivalry between IBM and Intel, but… well, maybe I am. Intel wants a piece of that sweet living room CPU market, and its offerings right now simply are not designed for it – but IBM found an easier way in, in the form of console game systems that Americans now feel comfortable buying without coming off as computer nerds.
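For readers who never had to do this, here's a sketch of the kind of boot-file surgery I mean. These lines are representative of the era's memory juggling, not the exact files from my childhood machine (every PC and every game demanded slightly different incantations):

```bat
REM --- CONFIG.SYS: load DOS high and open up upper memory blocks ---
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE RAM
DOS=HIGH,UMB

REM --- AUTOEXEC.BAT: load drivers high to free conventional memory ---
LH C:\MOUSE\MOUSE.COM
SET BLASTER=A220 I5 D1
```

Get one line wrong and the game would refuse to start for lack of conventional memory, and every edit meant a full reboot to take effect. Compare that to "plug it into the TV and turn it on."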

So basically, my generation and the generations after me will have grown up with consoles, and the interface of a controller is now much more intuitive and better understood for gaming than a keyboard and mouse. It’s even becoming natural. Why is it becoming natural? Because consoles are cheaper, easier, and more prolific than gaming computers. As such, console games sell more and reach a much wider audience than PC games do. And it will continue that way, maybe forever, or maybe just until technical knowledge is about a million times more available than it is today. I can build a PC, but most people can’t, and that’s where the problem begins. It ends at Microsoft, but they’re cleaning up their act the rest of the way with DirectX 10 and Vista. RIP OpenGL, I’ll cry a tear for you if no one else will.

Does this also explain why Guitar Hero, Karaoke Revolution, and DDR attract such wide audiences? I think so. Give a person a game that uses their body instead of a keyboard and they’ll be able to pick it up real quick. Why? They use their body constantly. They know how it works. They can learn to manipulate it naturally because that’s how our bodies work. So let’s all sit back and watch consoles continue to destroy PCs in game sales year after year. Just wait until MMOs start coming out on consoles – there will be no more safe genres on the PC at all!

Monday, April 17, 2006

Games can scare me, but they have to immerse me first

So I finally started playing F.E.A.R. Cool game, great AI. I have to admit, I’m enjoying it. I can only play for a little while at a time, though; there seems to be some kind of break with it for me. It’s fun, and the gameplay is pretty compelling. But there’s a problem… I feel like I’m playing a movie. The game is very linear, and while that’s not necessarily a problem – this game has gotten quite a lot of acclaim – I’m starting to really tire of this. And it’s not even the linear style, it’s the fact that it’s apparent. When it’s apparent, it’s difficult for me to become immersed. It happens every once in a while, but not for very long.

It’s almost as if the story itself was designed entirely separately from the gameplay and mechanics. This brought something to mind – I haven’t enjoyed a non Civ-style game in quite a while. Half-Life 2, the new Prince of Persia, and especially the new Tomb Raider; these games all felt like an exercise: action story action story action story. There seems to be a serious break between the design of the story and the actual design of the game.

This reminded me of a conversation I had with Aaron Ruby a couple of months back while I was avoiding my intern duties at DICE. The meat of it started when he was answering gameplay-related questions about a reading they did from Smart Bomb. He felt like people were approaching games from the wrong direction, and at the time I was really excited because I'd been tossing around the same idea. In fact, I’m pretty frustrated about it; it’s really starting to bother me. Our conversation went into a lot of depth about interactive theory and interface theory – and how the two are related – but I’ll spare you all that until later. Anyway, an internet episodic comedy…. thing… about gaming called “Pure Pwnage” later summed it up pretty well. Sure, World of Warcraft is great, and casual, and addicting. But… it seems like everyone is playing World of Warcraft instead of trying to create games that have interesting, innovative mechanics.

Now I don’t actually think that’s true. But I do think that what Aaron and I chatted about is a really important part of the problem with current game design. It appears, to me at least, that when you go to Best Buy or EB and buy a new game you’re buying a story that harbors a game - not a game that harbors a story, or that is a story. And this is a problem. A big one. This break between the story and the gameplay really hurts the interactive experience. Designing a film is not like designing a game. Neither is designing a book. Just look at the language – one does not design a book, one writes a book; one also writes a script. The experience of the reader (not player) is just that: an experience solely dictated by the author. The same goes for movies. As the creative lead for a movie (director/writer) or a book (author), you have complete control over each element of your media. Green is green and red is red. A murder may appear different to me than to you, but it is still a murder, and the story is still determined by you, as is the pacing through it. Yes, different things have different effects on everyone, but you still do not allow for any emergent narrative in a movie or book, whereas in every game where you give the player the ability to move you have an inherent level of emergence.

When you’re designing a game, you have control over all of the variables except for the most important one in an interactive system – the player. Effectively, this means that you also do not have complete control over the story – more specifically, you do not have control over the story that the player experiences. That is, unless you design it from the point of view of the interactive experience that the player might have. Otherwise what you have is a book that a player plays through, and during play it is dreadfully apparent that while you may shoot these guys differently than you did before, you’re still playing within a framework that is not flexible. Half-Life 2 is a fantastic example. The atmosphere in that game was truly compelling; I absolutely loved the first 10 minutes of it. The colors were great, and it really did create the appearance and atmosphere – using a whole lot of film theory – of an Orwellian city. But after I started playing it I was incredibly underwhelmed. It appeared cookie-cutter to me. Do this, do that, here’s your goal, make it happen. You played as a character who had a story; but you were not the character, you just played as the character. The game did an absolutely horrible job of combining the gameplay and the story, and as such it felt like you were playing Half-Life 2 the movie. No room for anything new within the construct, and the story wasn’t even that good.

I hope we start seeing a shift. Hope hope hope.