
Immersion (Why Games Are Special)

(Originally posted here; has 13,123 views)

I read a forum thread somewhere recently—I want to say NeoGAF, but I can’t find it ’cause my registration’s pending so I can’t access search—that talked a bit about words and concepts we’d like to see removed from gaming. It was a pretty fascinating topic, and I was happy to see that the used-to-the-point-of-meaninglessness word “visceral” and the anti-game “cinematic” were frequently cited. It was perfect timing, then, for Kirk to post an article highlighting a video arguing against the use of the term “immersion” in video games the next day.

I disagreed rather vehemently. I still do, which is why I’ve spent several hours (as opposed to my normal twenty minutes) preparing a response.

Before I get into this, I must warn you that I might be somewhat harsh on Mr. Abraham and those who agree with him. He’s gotten so much fluffy praise from people who consider themselves to be at the forefront of games criticism (a field which, from what I’ve read, is incredibly circlejerky and not nearly as knowledgeable on the subject as it thinks it is) that I think some harshness is in order.

Anyone who believes that “immersion” is a term that should not apply to gaming, or that ideas involving immersive design should be removed from video games, is frighteningly wrong. Not only that, but the arguments that “immersion” is a bad term, or that games should not be made with immersion in mind, are as dangerous to the medium as attempts to ban it.

Guess I should back myself up, huh?

I’ll be covering two main points, because it appears that these guys either fail to understand what immersion means or genuinely want the concept of immersion to die.

Let’s start with the English language.

Okay, so, first things first, a little English language primer (thanks to squibsforsquid‘s responses to my initial response to Abraham’s video):

The English language is incredibly nuanced. Words that seem to be identical to each other can actually have subtly different meanings that aren’t covered by others. “Immerse/Immersed/Immersion” is a great example of this. A simple dictionary lookup reveals it to be something along the lines of “engrossed” or “attention-grabbing,” but if that were the case, then one would wonder why similar words and phrases would not suffice. Why do “immerse” and its various forms exist?

The answer lies in its other definition: to be submerged entirely in a body of water.

Imagine, if you will, that the English language is all the food in a grocery store. Words like “engrossed” and “immersed” are like varieties of lettuce. Sure, you might think that iceberg and romaine lettuce are both leafy green veggies, so they can be used interchangeably, but nothing could be further from the truth: indeed, romaine has a radically different texture and moisture than iceberg (I prefer the darker, bitter taste of romaine, personally, but some people like the cool crunchiness of iceberg).

An English-language example of this would be the substitution of “good” for the word “like.” What we like is something inherently personal and subjective—it’s something that matches up to our own personal standards of enjoyment. What is good is something that compares favorably to set standards—usually ones external to us, like cultural standards. Saying something is “good” does not inherently mean that we like it; likewise, saying that we “like” something does not necessarily mean that it is a good thing.

Similar terms are not identical ones.

Immersion isn’t simply “paying a lot of attention to a thing.” There’s more nuance to it than that. Merriam-Webster’s example, “We were surprised by his complete immersion in the culture of the island,” hints at a level of integration into something. When someone says “he was immersed in the water,” they’re not talking about being engrossed with water; they’re talking about going under.

The people who first used the term “immersion” when applied to game design didn’t choose the word lightly. There’s a reason that the immersive sim genre of video games is called the immersive sim and not “engrossing games” or something else. The unique texture of “immersion” within English makes it a term uniquely suited to discussing an element of video games that other mediums don’t have (you can pay attention to any medium; you can only be immersed in something interactive).

Any game can be engrossing—Tetris is engrossing, for instance—but few games can be truly immersive. Few games can make their players a part of the world within them.

This is an important point, because immersion, in this sense, is something that’s entirely unique to video games. Nothing—no movie, no play, no book—can be truly immersive the way a video game can be.

Basically, to sum things up so far, “immersion” is a term that isn’t always used correctly. When referring merely to the act of being deeply involved in a game, yes, immersion is an improper term, but we should not remove it from our gaming lexicon entirely, because it’s a term that accurately describes one of the primary elements of what separates video games from other entertainment mediums.

Where am I getting this from, you ask?

Right, so, let’s jump back to 1974. Gary Gygax and Dave Arneson (sorry, Dave, but while you take alphabetical precedence, Gary wins for having alliteration and an x in his name, which just makes him cooler) created this game called Dungeons & Dragons.

It was a role-playing game.

I’m not talking about stat-based adventure JRPG stuff, either. I’m talking about a true role-playing game (speaking of role-play, there’s another thing that will confuse you if you try to find a dictionary definition—understanding the use of the word, specifically regarding its origins and relationship to improvisational theatre, is key to understanding what is and isn’t a role-playing game). Basically, they created an instruction set for how to role-play.

The goal was to empower players to have adventures in worlds of their own creation, a radical departure from other games (sports, Milton Bradley-style board games, etc). At the same time, it wasn’t a performance thing, like theater. It was just “hey, let’s explore a world!”

The rules behind DnD served the purpose of making sure players didn’t get overpowered or do absurd things. You don’t actually need a turn-based system, stat points, party members, and so on and so forth to have an RPG; they just make things a bit easier for a GM to handle.

Jumping forward a bit, we hit 1981 and two games, Ultima and Wizardry. They were effectively the birth of the video game RPG; other games had preceded them (I once read that a computer game called DnD showed up in 1975), but these two games were the watershed moment. Ultima and Wizardry used the incredibly limited technology of the time to try to emulate the RPG experience.

A necessary digression: when Japanese developer Yuji Horii saw Wizardry for the first time, he got really excited by the prospect, and, apparently being unaware of the purpose of Wizardry’s mechanics, cloned a lot of the ideas and created Dragon Quest, the game from which all JRPGs since have descended. Most of the time, things don’t work out quite this well and new genres aren’t created, but in the JRPG’s case, things worked because Horii is a boss. The lesson here is that you shouldn’t go creating a game unless you understand why the mechanics behind it exist. This is also the reason why regenerating health is used in a lot of games it has no business being in.

While the JRPG gained popularity and became its own thing (and confused a bunch of people as to what the RPG actually is), Western devs were still quietly making their own RPGs, but with added computer power. Instead of making turn-based, top-down games with various battle systems, they were focusing on evolving the genre, making it distinct even from the pen and paper games which had birthed it, while at the same time, keeping the spirit of the RPG intact.

Now, I should point out that video game RPGs are still absurdly limited! Computers cannot improvise the way that GMs can. That said, there are some areas where they excel… and that’s where Looking Glass comes in.

If you understand one thing about the history of video games, it should be that no game studio on the planet will ever be more important than Looking Glass Studios was. These guys pioneered first-person games, sandbox games (what, you thought Shenmue or GTAIII was the first sandbox game?), flight simulation (when they died, the flight sim industry died), stealth games, and a bunch of other stuff. Their employees have gone off to help invent the Xbox (forever transforming the gaming landscape and eliminating Japan’s stranglehold on the console industry), work on Guitar Hero and Rock Band, revitalize The Elder Scrolls (heavy immersive elements in those games), create Deus Ex, work for Valve, and so on and so forth.

Oh, and one of the first games they ever made was Madden, so there’s that.

Perhaps their most important contribution to game design, however, was immersion.

The Looking Glass guys, in the early 90s, had a revelation: they could use simulation elements to add new life to their worlds! From this, the immersive sim was born.

Basically, you take that core idea behind role-play (I want to be someone in another world) and use computers to create a world players can interact with. That’s really all there is to it. You make the game in first-person, to reiterate the fact that the player is his or her character. You create levels that feel like real spaces, then populate them with complex AI that can do more than just fight. If you can, you try to throw in elements like physics, good graphics, a high degree of interactivity, and so on and so forth. You also cut down as many abstractions as possible (abstractions in a game context are basically just mechanics that provide a simpler way of approaching real-life ideas—such as turn-based gameplay when a computer can’t handle a real-time approach).

What we’ve found is that immersive games, provided they are easy enough to get into (Deus Ex, for instance, inundates players with information in its training level and summarily throws players into the deep end with Liberty Island; this is a bad way to do things), actually have a huge draw and significant lasting appeal. Some recent examples of immersive games include STALKER (more than 4 million units sold—not bad for a Ukrainian studio with next to no marketing), Fallout 3, and Skyrim. Other games, like Assassin’s Creed and Dark Souls, use immersive elements to enhance their experience.

People love these games. They love being able to enter a new world and interact with it. They love emergent gameplay—why else do you think GTA is such a popular series? Skyrim was successful because it facilitated exploration. Crysis was unique because it allowed deeper physical interaction with the world. STALKER’s advanced AI and player needs (eating, for instance) helped its players sink completely into the role of the amnesiac Marked One.

Far Cry 2, flawed as it was, got the love it got because it let players treat the world as an actual world. Yesterday, I read about someone who stacked up cars in Far Cry 2, blew them up, set fire to a field, caused the base he was attacking to catch on fire (which burned some of his enemies alive and confused others), and then walked in and took what he needed without anyone realizing he was there.

(I realize that I could probably write an entire essay on the power of emergent gameplay and why Dwarf Fortress and STALKER are the greatest games ever made, but I’ve got enough stuff to talk about as it is).

Immersion is the future of video games.

I realize that “the future of video games” is a phrase that gets used a lot, primarily to describe whatever trend is currently popular (Facebook games, iOS games, casual games, motion control, you name it), but I’m using it in a slightly different context: I’m talking about progress.

Most people don’t really think about the future advances in tech. What can Kinect really do for us? What does Goal-Oriented Action Planning AI do to enhance video games? What does procedural generation mean to video games? How does the RPG fit in with all this? What can we do with interactivity, that sacred ideal that elevates video games beyond all other mediums by eliminating passivity?

The people arguing that games shouldn’t be immersive are as ignorant as the people who argue that Role-Playing Games are nothing more than stat-based adventures. These people want to hold the industry back—to keep it at some larval stage where they’re most comfortable. Maybe it’s out of fear (after all, I don’t doubt that bards objected strongly to novels, nor do I doubt that novelists objected strongly to the medium of film), or maybe they just… really enjoy stat-based adventure games or strategy titles or what have you (I know I do!); I don’t really know their motives.

What I do know is that they’re trying to fight human nature.

Don’t believe me?

Let’s go back to the beginning.

The Epic of Gilgamesh is one of humanity’s oldest surviving works of fiction. It’s a massive adventure story. Fast-forward to ancient Greece and Homer; note the vast influence of his works (basically all of Western fiction owes its existence to Homer and Plato/Aristotle/Socrates). Jump ahead even further, and take a gander at the increasing believability of fiction (Shakespeare, particularly), as well as the increasing accessibility of entertainment. Check out how the integration of music and storytelling in the 1500s led to the birth of the opera. Pay attention to the rise of global exploration during the Renaissance, as well as the scientific leaps and bounds made by a formerly-repressed society. Study the emergence of 19th century literary criticism, as well as the explosive popularity of novels. Read up on the birth of film, radio, television, comics, and their subsequent popularity.

What do these all have in common?

Well, I was hoping to have a word for you, but I don’t. Curiosity, maybe? Discovery? Newness? Escapism? None of these really quite sum up what I’m trying to get at, so I’ll put it like this: people only enjoy the mundane so much. At some point, every single one of us is going to seek out new experiences. We crave new sensations. We savor them. Experiencing the new is one of the primary motivating factors of human existence.

Humanity, as a whole, has a fascination with the new. When we look back at fiction, we can observe humanity’s fascination with the idea of exploring other worlds. CS Lewis’s Narnia adventures cover this. Lev Grossman’s The Magicians explores it too (fun fact: his brother apparently worked at Looking Glass). Fantasy and science fiction stories sell like crazy. There’s a reason that films like The Girl With The Dragon Tattoo didn’t do nearly as well as Avatar. One is mundane. The other is not.

The fact of the matter is that we, the human race, are a bunch of insatiably curious creatures who constantly desire new experiences. Discovery is humanity’s raison d’être (oh yeah, I can be just as pretentious as the self-styled game critics; if I say it in another language, does that make it profound?).

So what’s the future going to be like?

We are creatures driven by discovery. Why do you think Skyrim did so well? Why do you think New Vegas failed? The former facilitated discovery and exploration; the latter was too focused on being a good RPG to care about the world it had created.

The future of games is going to capitalize on this. Arguing that we should eliminate the concept of immersion in games, that the immersive sim should be dead, or anything else along similar lines, is like arguing that we shouldn’t have voice acting and ought to stick with scrolling text. It is an argument that says “games should not be more than they already are!”

Modder Robert Yang may consider immersion to be a fallacy, but he’s mistaken: the future of video games really is the holodeck. All those things I mentioned earlier—Kinect, procedural technology, better AI, and so on and so forth—are the tools that are slowly pushing us towards that end.

…I haven’t even begun to talk about the real-world benefits of creating immersive games. Someone smarter than me could surely go on at length about the possibilities of immersive simulations that allow people to live through various simulated events for… a wide variety of reasons. Someone training to be an EMT could be forced to go through a triage situation, with accurate simulations of panicking people, secondary threats, sensory barrages, and so on and so forth. Researchers could study crowd dynamics (using more advanced AI than anything presently available) in the aftermath of a disaster in order to better understand how to design environments to protect against them. The military already uses immersive sims to save training costs. There are a ton of non-entertainment applications for immersion. Saying we should kill the concept is horrifying, because it’s so limiting.

…and so we come to the conclusion.

There will still be room for the [insert any unimmersive game here] of the world. I’m not saying that they should die; there’s nothing inherently wrong with them. Instead, I’m looking at this in a long-term perspective—not the next week, or the next month, or the next year, but the next century of game development. Games are… going to become something else. Traditional video games will still exist, but this new thing, this transportation to another world… that’s the future. Saying we should kill the concept of immersion and only give credence to attention is a terrible idea.

Considering the way they seem to feel about immersion, it would appear that Ben Abraham, Robert Yang, and Richard Lemarchand don’t just misunderstand the term, but want the legitimate usage to die as well. While I don’t know a lot about Abraham’s personal philosophies, Yang’s made his pretty clear in his Dark Past series of blog posts—he thinks the immersive sim should die. Lemarchand’s philosophies are made clear by the games he creates.

Do I sound upset?

These guys seem smart—really, they do—but by failing to understand the nuance of the word “immersion,” they seem primed to damage the medium.

Look, I may be just a poor college student (I can’t even afford a good school) who is trying to learn game design while his school falls down around his head (seriously, I’m not kidding about the good school thing). Unlike Lemarchand and Yang, I’ve never made a video game in my life. I’ve worked on some other forms of RPG before, and I’m trying to work on an indie game right now, but I obviously don’t have the body of work behind me that these guys do. I may never have the body of work behind me, at the rate things are going.

…but… I feel like they’ve got it all wrong. If they’re the guys who tell us where games should go—if we follow them—I know we’ll be worse off for it.

They scare me.

(Also, in case anyone is wondering, yes, this is one of the reasons I prefer Western to Japanese games. Japan tends to prefer to design more abstract, non-immersive games, which is a totally valid method of expression, but not one I personally enjoy)

On Art and Smart Games

(Posted here)

Hey guys, thought I’d write another longish #speakup post; this one’s about the list of artistic games on Brainygamer. I wrote it as an open letter to Michael Abbott, who runs the place. I cut out the introduction for Kotaku, since you guys already know who I am, hopefully.

I think something’s missing. See, the thing Clark was getting at–and the thing that precious few people seem to understand–is that he’s not talking about just “art.” He uses these qualifiers, like “true art,” or uses words and phrases like “puerile” and “intellectually lazy.” Those qualifiers are very important, because he’s referring to a divide in art that’s rarely (maybe never; I’ve never actually seen anyone bring this up) mentioned: high and low art.

Citizen Kane was, arguably, the first high art film. Most of what had come before were merely adaptations of other works (Wizard of Oz, Gone with the Wind, Ben Hur, etc), and while there had been a few stepping stones (like Metropolis, M, and The Cabinet of Dr. Caligari), Citizen Kane was the game changer. Most of the people who say that games “don’t need a Citizen Kane” don’t really understand what Orson Welles did to the cinema landscape. After Kane, everything changed.

Before I get into the high/low art thing, however, I need to back up one quick second and define art:

Art is a thing that is created or performed with the primary intent to stir up emotions within the audience.

In this way, many things are art. For every The Four Seasons, there is a Baby. For every Jane, there is a Twilight. Actually, if we go by Sturgeon’s Law, for every one good thing, there are nine bad things, but whatever. The point here is that we see a distinct schism in art. Some things are timeless and will spawn endless discussion centuries after their creators have passed on, while others are transient, their laughable, short-sighted attempts at profitability greatly robbing them of artistic merit. The former are high art; the latter, of course, are the opposite.

The question should never have been “can games be art?” It should have been “can video games be high art?”

We then run into another problem: broad generalizations. The simple fact of the matter is that we can’t ask whether any one medium is art of any kind, because there are a lot of little differences. Baraka and Casablanca are art, but an instructional video telling you how to interact with customers or an advertisement for cold cereal is not. The Scream and Starry Night are art, but the handicapped sign on a bathroom stall and a full-page newspaper advertisement with pictures of cars at low, low prices are not. The same is true of games: some are, some aren’t.

Perhaps the best description I’ve heard of games is actually Wikipedia’s: games are “structured play.” It gets right to the point and encompasses every game type, from board games, like chess, to sports, like basketball, to video games… except… well… video games are a bit broader than all that. There’s a reason no one says pente or basketball (the performance art that is the Harlem Globetrotters aside) are art–they call them games and sports. Unlike Risk or ōllamaliztli (sorry), video games use a lot of artistic elements, and I’m not talking about the craftsmanship of board pieces or illustrations on cards or anything. Some tell stories. Some exist more to craft mood than anything else. These video games are more than just games–they’re hybrids: instead of being merely tools that structure gameplay, they combine elements of other art forms, like storytelling, with rules-based systems.

As technology has progressed, however, things have gotten really weird. Some might argue that, at some point, they stop being games and become something else. We’ve added all these simulation elements–instead of enemies adhering to specific rule sets, with rigid, turn-based battles, we’ve got things that try their hardest to simulate actual encounters. Games like STALKER aren’t really games anymore–they’re entire worlds to explore. Some games use their mechanics like a sculptor might use his tools, shaping an artistic experience out of it. Somewhere along the line, some video games moved beyond just structured play and got into something more.

In other words, some video games are art, some aren’t.

This leads me to modify the question: “Can some video games be high art?”

And, since we’re asking that question, we’re going to want proof one way or the other, so the next question that follows is: “are any games worthy of being called high art?”

The answer to the first question, I think, is yes. There is very little academic discussion centering around games-as-art, and what few attempts are made tend to be weak attempts at justifying one’s love for a particular title. Most of the “intellectuals” (oh yes, scare quotes seem very well deserved) who debate games are little more than educated fanboys, and they rarely seem to be educated about the right sort of things. I’ve encountered more enlightening discussion of game and game story through random commenters I’ve met (we get into these cool discussions about Aristotelian philosophy and the strengths and weaknesses of the medium and how the medium doesn’t lend itself well to traditional storytelling) than I have reading about games by people who fancy themselves serious critics.

Now, you may have noticed by now that I haven’t mentioned intellectual stimulation at all. There’s a good reason for that: when Clark talks about intellectual stimulation, he’s not talking about puzzle or strategy games. He’s talking about the intellectual stimulation that comes from artistic merit–the part where we start critically discussing things.

This brings me to my primary criticism of the list: most of the submitters don’t seem to know what they’re talking about. A ton of games are on there only because “they make me think a lot,” which, again, isn’t what Clark is looking for. It provides an idiotic counterpoint to his claim. Many of the submissions are riddled with spelling errors and barely give reasoning beyond “I really like it and it moved me.” Being moved is all well and good, and listening to dubstep makes me feel something on an emotional level, but that certainly doesn’t make it high art, which is what Clark very clearly wants.

You have basically two kinds of art games: narrative and mood. Narrative games would, of course, be games with the primary intent to tell a story. In a perfect world, the game mechanics are subservient to the story (the “gameplay > story” fallacy is a really big subject I could get into, but I don’t have the time; maybe later?), functioning as the language or technique that conveys the story, but more often than not, people focus too much on the gameplay and not enough on the story, which is the equivalent of a novel with a lot of very nice words and a story not worth telling. Mood games are… things like STALKER or Shadow of the Colossus. They are the ambient music and the abstract paintings of gaming.

Many of the games on the list have no right being there. Uncharted 2 possesses the narrative depth of Transformers 2: Revenge of the Fallen. Mass Effect 2 is a poorly-written (there’s no second act where the team gels, jarring the suspension of disbelief) white supremacist game. Half-Life 2 is a mess, its narrative structure oddly centered on Eli Vance. Red Dead Redemption is a predictable, unevenly-written (FBI man’s random speech, for instance; characters waxing eloquent at random), ludonarratively dissonant game. I’ve given up watching television shows that are better-written than Heavy Rain, like Lie to Me. Strategy and puzzle games don’t really belong there at all because of the whole sports thing (in fact, Starcraft 2 is kind of the majority of the esports scene).

I’ve noticed that some people mention how a work is referential as if that makes it an intelligent work, but merely being referential isn’t what makes a work good. The other day, someone, when discussing Metal Gear Solid with me, argued that it was excellent because it featured science, science fiction, and real-world events. I’m sorry, but there’s more to art than that. Metal Gear Solid is a narrative joke, and no one with any knowledge on the subject of good storytelling could honestly call it a work of art on that front. Yes, it does a lot of interesting things and plays with the medium, but there are many films with excellent traits that fall flat on their faces where the script is concerned, which prevents them from being considered high art. Transformers 3, for instance, has some fantastic direction, camera work, lighting, and special effects, and is one of the best uses of both IMAX and 3D filmmaking ever, but that doesn’t save it from being low art.

A lot of people, no doubt, will feel defensive about this: that’s good. The current arguments for why some of these games should be considered the pinnacle of the medium are weak, and an intelligent defense of a great many of these games needs to be made. I can’t really do a lot to back up my claims in the interest of time and space, but I’d gladly do so at a later date. Also, I assume there will be a number of people who read this and get really upset; people tend to be more invested in games than other mediums, presumably because of some sort of involvement bias (I realize this isn’t a real term, but, as far as I know, there is no term to describe the cognitive bias where you spend time with a thing you enjoy and become reluctant to admit its faults because you feel as though you’re admitting that you wasted your time), which, I think, is part of the reason gamers seem to be significantly more prone to anger and fanboyism than fans of other mediums.

Games, even the most-loved, highest-rated games out there, deserve a lot more criticism than they get, especially when it comes to narrative, and that is what Clark is upset about. As I look at the video gaming landscape, I see a small number of games (ten, at present count) that might be considered the Gone with the Winds and Wizard of Ozes of the world. Clark believes he’s found it in Braid and Journey, but only that strange, weirdly-insular, self-involved field (the same few critics seem to pop up over and over again, for one thing) that calls itself games criticism really seems to care. I see no Citizen Kane of gaming, or even a Watchmen, but I hope we get one soon.

So… um, yeah.

Basically, I’m disappointed in the list as it stands. I feel like it would benefit from some filtering.

Also, I think you only need one entry per title, but maybe that’s just me.

~DocSeuss

PS – As a synesthete, I’ve always found Rez to be a bit lacking. Still qualifies as a mood game, I guess.

Why First-Person Stealth is Best

(Originally posted here; has 16,434 views)

I’m done playing third-person stealth games.

I can’t do it anymore.

I want to, believe me, but… yeah. No. I can’t. I love stealth. I love the idea of stealth. I love sneaking through a level, either ghosting it or taking out everyone without being noticed. There’s a feeling of empowerment there that comes with solving the puzzle that is a good stealth level.

Look, have you ever played a game that you’ve broken? I’m talking about a game like Skyrim, where you mod a sword to have 9999 damage so it kills everything in one hit, completely removing the challenge from the game. I’m talking about cheating.

Like many of you taffers, I once believed that third person was the only way to do stealth. I thought that it was the only way to figure out whether you could move, because as soon as you get into an AI’s line of sight, they’ll notice and start looking for you, and that really makes or breaks a stealth game.

I love the genre—whether it’s Assassin’s Creed or Splinter Cell or Hitman or whatever—but little did I know that they were all doing it wrong. Harsh words, I know, but bear with me.

Recently, an old stealth game was rereleased after a thirteen-year absence from store shelves. It was called Thief, and it was developed by the guys who went on to make games like Deus Ex and Skyrim.

Unlike most stealth games, it was in first person.

How did they get around the line of sight stealth problems, you might ask? Well… they didn’t. See, line-of-sight is actually horrible. In real life, stealth doesn’t work that way. Line-of-sight is a method that’s used only because it’s incredibly simple to create. It is, in fact, rather lazy. A third person camera basically exists as a gameplay abstraction designed to keep the player from giving their position away whenever they want to see if they can move.

In real life, you could listen to the position of people, poke your head around corners without being noticed, and hide in the shadows without being seen. In a game where all stealth is based on line-of-sight, you can’t do that, so you have to be in third person, or it sucks.

…well…

What if you made a stealth game where you could listen to the position of other people, poke your head around corners without being noticed, and hide in the shadows without being seen? That would be a lot better, right?

Turns out it is.

It adds a whole new layer of challenge to stealth. It requires intelligence to play. Sound becomes a fantastic method of level navigation. It means you don’t need to cheat and look around corners unrealistically, because now you can hear guards snorting or sneezing or chatting or whistling or even just walking.
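To make the difference concrete, here’s a rough sketch of the two detection models—pure line-of-sight versus the light-and-sound approach—written in Python. To be clear, this is purely illustrative: the function names, ranges, and thresholds are mine, not anything from Thief’s actual code.

```python
# A hypothetical sketch comparing "lazy" line-of-sight detection with a
# Thief-style model where light and sound matter. All numbers are made up.
from dataclasses import dataclass
import math

@dataclass
class Agent:
    x: float
    y: float

def distance(a: Agent, b: Agent) -> float:
    return math.hypot(a.x - b.x, a.y - b.y)

def line_of_sight_detection(guard: Agent, player: Agent, has_clear_line: bool,
                            view_range: float = 15.0) -> bool:
    # The "lazy" model: an unobstructed line within range means you're spotted,
    # no matter how dark or quiet you are.
    return has_clear_line and distance(guard, player) <= view_range

def light_and_sound_detection(guard: Agent, player: Agent, has_clear_line: bool,
                              light_level: float, noise_level: float,
                              view_range: float = 15.0) -> bool:
    # The richer model: sound can give you away with no line of sight at all,
    # and how well-lit you are scales how far a guard can actually see you.
    if noise_level > 0.7:                       # loud footsteps on tile, say
        return True
    if not has_clear_line:
        return False
    effective_range = view_range * light_level  # standing in shadow shrinks it
    return distance(guard, player) <= effective_range

# The same situation, judged by both models: eight meters away, in deep shadow.
guard, player = Agent(0, 0), Agent(0, 8)
print(line_of_sight_detection(guard, player, has_clear_line=True))      # True: spotted
print(light_and_sound_detection(guard, player, has_clear_line=True,
                                light_level=0.2, noise_level=0.1))      # False: hidden
```

The second model is what lets first-person stealth work: shadow shrinks how far a guard can see, and noise can betray you even when nobody has a line on you, so the information you need lives in the world instead of in a floating third-person camera.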

Do you have any idea how amazingly badass it is to hide in the shadows right in front of a guy, step out like Batman himself, and stab him in the face? It’s incredible! There’s no feeling like it in the world (besides being Batman!).

I can’t go back to third-person stealth after this. There’s no depth to it—no challenge beyond an arbitrary, unrealistic, and unforgiving line-of-sight system and the occasional “DON’T MAKE NOISE!” component.

Thief is the best stealth game I’ve ever played.

Food for Thought

“Where is the game that questions governments, challenges society, hell, asks a bloody question? Let alone issues. Good heavens, imagine a game that dealt with issues!”

John Walker, of RPS fame, recently posed this question in his article bemoaning the lack of games that have any real substance to them. As someone who had a conversation just yesterday about all the games that I consider to have terrible stories (which is almost all of them), you can’t get much more dismissive than me. So… when I recommend a game, understand that I do it because I have incredibly high standards.

…aaaaand that’s why I was surprised when John went on to say this: “I want there to continue to be Call Of Duty games. But I also want there to be gaming’s All Quiet On The Western Front. It’s our 1935, and it’s about time it happened.”

Well, um.

Call of Duty: Modern Warfare 2 is that game, John.

I know, I know. Everyone wants to hate on Modern Warfare 2. They say it’s a dumb, stupid Baysplosionfest. It’s strange, really: before Modern Warfare 2 came out, Call of Duty was one of the best (the absolute best) gaming experiences in the known universe. Call of Duty 4 was one of the finest games in a year that saw Portal and Bioshock released. The nuke scene, the sniper level… so much of that was memorable and superb in every way. Like the original Half-Life before it, Call of Duty transformed the industry through the best scripted events (which are not evil in and of themselves) that had ever been seen, while maintaining a high level of linear, corridor-shooting interactivity.

Then came Modern Warfare 2, and all that changed. Unlike most people, who seem to think that one of the finest development studios out there would suddenly be the worst ever, I’m going to blame the rushed, eighteen month development cycle, the fact that the studio had very little love for this game (much like Call of Duty 2) and were only making it as part of a deal that would let the now-stillborn-but-possibly-at-Respawn Future Warfare project come into being, and the fact that the game, in a series birthed on the PC, became nothing more than a bad PC port. Remember the “it has mouse support” debacle? The lack of dedicated servers? For some people, those wounds are still fresh. I think that’s where a lot of the hate really comes from.

Plus, the internet is a thing. The internet is a vast hate machine. It hates what’s popular–look at all the grief Halo–a franchise from one of the best shooter developers to ever walk the Earth, with games nothing less than stellar (barring Halo 2’s campaign)–received when it was popular. Look in comment threads around the internet, and you’ll still find people coming out of the closet, admitting that “it wasn’t really that bad,” or “I never really hated it.” Ignore Call of Duty’s longest-time fans and dumb the game down, and you’ll get people screaming about how stupid everything about it is. Have it beat the highest-grossing movie ever made in the span of a month or two, and you can bet there will be a backlash against the game’s popularity as well.

Call of Duty: Modern Warfare 2 did deserve some of the complaints it got, mind you. The rushed development led to bugs and imbalances, and the campaign was a bit rough. That said, the people who complained about it are drooling morons.

They love to complain that it was dumbed down and stupid, despite it being much like every Call of Duty before it, all of which were praised as intelligent and awesome.

First things first: Modern Warfare moves at a much, much faster pace than most video games. There is no time to stop and have a conversation–imagine if you never had time to stop in Mass Effect 2: you’d never get to know a single character in the game, beyond the occasional “I WILL DESTROY YOU!” or whatever from a teammate. They’d be empty shells. Because of this unrelenting pace, you never really get a feel for the characters, even if they’re actually pretty well defined. I’m not just talking about Soap and Price, either. Makarov is a particularly interesting character. Even Dunn and Foley’ve got personalities.

I believe, across the Modern Warfare series, you play from the perspectives of at least fourteen different characters: Soap, Roach, Price, Yuri, a British soldier, three marines, an astronaut, a father on vacation, two gunners in an AC-130 gunship, a dictator, a CIA spy, and others I can’t recall off the top of my head. That can get pretty confusing. The game is a lot like 24, the action show where Kiefer Sutherland punches people in the face to stop terrorism and it works. That many protagonists in a fast-paced game that never gives you the time to get to know anyone is going to confuse the shit out of people, especially those who are expecting the story to be stupid and aren’t paying attention. If you pay attention to the Modern Warfare games, nearly everything makes sense. Gamesradar’s infamous and rather absurd ‘plot holes in Modern Warfare 2’ article falls apart. The only real plot holes I can remember having any validity are “how, exactly, did Price survive and not manage to be returned to the UK, why is Task Force 141 under Shepherd’s control, and how did so many Russian airplanes make it across the continental United States without being noticed before they got to Washington, DC?”

But, hey, eighteen month development time. Three mistakes. That’s really not bad.

Ultimately, however, that fast pace and rapid character shift means that all the plot bits that are there–the frequently good writing–are often ignored.

I can’t say I got much of a feel for the characters in, say, Mass Effect 2. Grunt and Miranda were Pinocchios. Jacob was… inoffensive. Legion was a robot. Garrus was just Space Batman/Punisher because The Dark Knight was a cool movie. Jack was a sensitive girl who kept everyone at bay with anger. Samara was a ronin (Samara? Samurai? get it?). You can’t really say much for those characters. They’re walking encyclopedia entries with loads of personal information. Rarely do they make observations about the world (unless that observation seems to exist to contrast them to the world around them, like someone writing about a kid from the country showing up in the city and going “wow, you people are strange!”), or ask questions, or demonstrate any real personality. In the gameplay, it’s even worse.

It’s interesting to get a feel for the characters of Modern Warfare 2, however. Makarov is very much a chess player. He’s arrogant. Patriotic. A complete bastard. Zakhaev’s death scarred him tremendously. Shepherd’s blind patriotism to America leads him to cross the line, murdering his own people and innocent civilians to put some pride back on America’s face. Price, however, transcends nationalism, ultimately going rogue, becoming a man without a country for the greater good of the human race.

Those three characters actually sum up one of Modern Warfare 2’s major themes (did Mass Effect 2 have a theme? Nah, it was just a bad, grimdark Dirty Dozen knockoff without the all-important team-building second act): that nationalism and misguided patriotism is a terrible thing indeed. One of the quotes used in the game was from Albert Einstein, who said, “Nationalism is an infantile disease; it is the measles of mankind.” Many of the series’ trademark “death quotes” revolve around themes of nationalism, patriotism, and the dangers thereof.

People like to say that No Russian was a publicity stunt–well, it wasn’t. It was the other half of Modern Warfare 2’s point. Modern Warfare 2 flipped the war on terror on its head, putting the US in the shoes of Afghanistan and Iraq, and asked “is this just?”

Think about it! For reasons you believe to be just, you are made to do a morally questionable act because it might help stop a bad thing. Doing so turns the world on its head. Your country is framed for the actions of a few–perhaps by the country that was already planning to invade you for other reasons. One half of the game has you playing the part of the confused soldier, not knowing what’s going on, being given random, seemingly disconnected objectives, and trying to stave off a surprise invasion. The other half has you playing as the man trying to catch the people responsible.

Modern Warfare 2 ends with you stealthily murdering American soldiers in Afghanistan to pound the point home, as if it wasn’t clear enough.

Was this right? Was this just? Was this invasion a good thing?

The game’s a bit of a “blood for oil!” conspiracy-type story, I’ll admit (Russia took out the US satellite that gave them entry into the US before No Russian took place). It dwells a bit too much on the events and not enough on the characters (but… what would you do? Cutscenes? All the character time is spent during loading sequences and in gameplay dialog; the game’s as efficient as a shark when it comes to gameplay–it’s even better than Half-Life in its relentless desire to keep you in the experience–it never locks you in a room and lets you run around like a madman for ten minutes). It’s got a great deal of failings. But… it does ask questions. It bothers to be more than just an action game. I think the only other post-2007 games I played that really did that were Bioshock 2 and Minerva’s Den, and I’ll write about them elsewhere.

You may dislike the theme, the unrealism, or even disagree with the argument it puts forth. But you can’t disagree that it tries, and it would be hard to disagree with the suggestion that few games try as hard. The only reason it failed was because no one came in expecting it to have a good story, and then, when they did play it, nobody bothered to pay attention to what was actually there. It’s as if they were like “nah, it’s not going to be good, so I don’t care,” or maybe they just fell for the fantastic set pieces. Or, hey, maybe they all just played the multiplayer.

Whatever the case was, people ignored Modern Warfare 2’s story and point, and then they went on about how bad it was. Say what you will about its shortcomings–I can point out many shortcomings in All Quiet on the Western Front–but Modern Warfare 2 made an effort to make a point about the world around us, and there are damn few games I can say the same for.

Also, it’s the only game I’ve played with homages to one of the best action movies ever: The Rock. Saving the White House, riding on the underwater subthingies, and fighting through the showers were all direct references to some of the best bits in the movie.

The story has its problems, don’t get me wrong, but in terms of actually bothering to ask good questions, Modern Warfare 2 does its job. If you want something greater than baby food… give Modern Warfare 2 a thoughtful go. To run with Walker’s food comparison, I’d say that Modern Warfare 2 is to game stories as a jelly sandwich is to baby food–it’s food for five year olds as opposed to food for infants. Games do need to grow up. They suck. I hate nearly every game story I’ve encountered, unless I’m in a mood for bad stories (which I am, on occasion), but Modern Warfare 2, despite all the hate it gets, is actually one of the few steps in the right direction.

Freedom: On the Authority of the Character

Hey guys. This post is older than it looks, so it might not look as if it were intended to be part of a series. I don’t think it needs editing, though. Previous posts are here and here.

I’ve been playing Skyrim a bit in my free time. Also, I’ve been thinking about character interactions in Bioware games, as news about Mass Effect 3 reaches fever pitch. In addition, I was reading a thread a few weeks ago about graphics, so Uncharted 3 is getting mentioned (mostly by two or three people with Uncharted/Sony-exclusive-title avatars), as is The Witcher 2. I was also in a discussion a month or so ago about Deus Ex: Human Revolution and a (not the winter one, an earlier one) Steam sale allowed me to purchase the DLC at $7.49.

These things all have something in common: Freedom. The other day, I read an article about 2011 being the year of the sandbox title (often associated with freedom), and, of course, I just wrote about the idea of total freedom a few posts ago. There’s a reason for this, but I’ll write about it at a later date. For now, let’s just talk about a hypothetical game and hypothetical freedom.

Game Q, as we’ll call it, generally offers you a lot of freedom. There are a few points, however, when it takes that freedom away. It’s not a mechanical breakdown, though. Where Deus Ex: Human Revolution taught you to expect freedom and build your character as you saw fit, then turned everything on its head in a fit of stupidity, Game Q takes the freedom away when the plot demands it.

Let’s say, for instance, that you’ve pissed off Evil Mister X. You’re playing a mission, sneaking around Factory Z in order to find evidence pointing to the location of The MacGuffin (though you could just as easily have gone in guns blazing, or maybe stealthily executed everyone in your path; whatever you wanted), when, suddenly, Evil Mister X calls you out on the PA system, locks the doors to the room you’re in, and fills it with sleeping gas. You wake up, tied to a chair, bright lights shining on you, with Evil Mister X’s favorite interrogator preparing to stab you with a few exotic-looking needles or something.

You’ve just lost the freedom to play the way you wanted.

Let’s back the story up a bit. Earlier in the game, you did a favor for Evil Mister X. Turning him down puts you in the first situation. He doesn’t hate you this time around, however, so when doing the mission, suddenly the alarms go off, soldiers pop out of nowhere, aggressively looking for the intruder. It turns out that Evil Mister X sent his favorite assassin in to help you out, but, being Evil Mister X, he wanted it done with some style, so the assassin went in guns blazing, ruining your stealthy plans.

Isn’t that a better game than one where you have total freedom to do whatever you want?

See, Evil Mister X is a pretty big bad guy. He doesn’t take kindly to doing things someone else’s way. He does them the way he wants. For him to be a valid character, he needs to appear as if he’s making choices, even if those choices conflict with the outcomes you had in mind. If everyone just listens to you and does whatever you want no matter what, they start to feel less fully realized. There’s something wrong with a game that gives you plenty of freedom, but bends over backwards keeping everyone else in check so they only ever do what you want.

Let’s look at Infamous 2 for a moment.

Nyx, the fire-wielding hot-head (a cliche that annoys me, but whatever) conduit, offers, a few times, to do things that sound totally batshit crazy, like crashing a trolley car into an enemy base to take out all the bad guys with relative ease (but it’ll kill lots of cops). If you choose not to do it, she gets pissed, but that’s about it. So far, she won’t do anything to contradict you (I haven’t beaten the story yet), and that actually kind of bothers me. It’d be nice if I planned to do something my way, and Nyx went ahead with her plan and made a mess of things anyways.

The one obvious problem is that you essentially have the same outcome, no matter what. If you do Nyx’s plan, other people will be mad at you and cops will be dead. If you don’t do Nyx’s plan, she’ll be mad at you… and the cops will be dead. All that really changes is whether or not you wanted it to happen, and then players run the risk of feeling like their choices have no consequence, which, as I’ve previously discussed, is a bad thing. There’s no point in having a choice if the outcome is always the same, after all.

Uncharted is a pretty great example of doing the opposite. It never lets you make a choice, and as a result, its characters can feel more like real people. Never mind that Infamous 2’s characters are way better than anything Uncharted has to offer–they’re held back by having to remain secondary to your choices. Uncharted’s aren’t. They can do whatever the writer wants them to do.

It’s a prickly problem: do you want freedom or do you want real characters?

…why not have both?

If Evil Mister X doesn’t know you’re going on this mission, maybe neither of those things will occur.

I’ve been running with the idea that, like Deus Ex, Game Q is an immersive sim. The idea behind immersive sims is that the AI often uses non-scripted behavior to make the world feel more alive. Wolves will hunt bunnies because it’s in their nature, not because the game designer said “okay, as you round this corner, those wolves will chase that bunny.” It’s a genre that more effectively creates game worlds which feel alive, and being able to transport us to worlds by making them feel alive is something that games really ought to be doing more often. After all, if they try to tell us a story and allow us to participate in it, then nothing should break that illusion, right? (Oh, man, that’s going to have to be another post for another day. Too long.)
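For the curious, here’s a toy sketch (in Python) of what that kind of unscripted, need-driven behavior might look like. None of this is from a real engine—the needs, weights, and actions are made up purely to illustrate the idea:

```python
# A hypothetical sketch of need-driven AI: the agent scores its options
# against its current needs and surroundings and picks the best one.
def choose_action(needs: dict, nearby: set) -> str:
    # Nobody scripted "chase the bunny at this corner"; the behavior
    # falls out of the rules and whatever happens to be nearby.
    options = {
        "hunt":   needs["hunger"]  * (1.0 if "prey" in nearby else 0.0),
        "flee":   needs["fear"]    * (1.0 if "threat" in nearby else 0.0),
        "sleep":  needs["fatigue"],
        "wander": 0.1,  # a baseline so idle agents still do *something*
    }
    return max(options, key=options.get)

# A hungry wolf that has spotted a rabbit goes hunting...
print(choose_action({"hunger": 0.8, "fear": 0.1, "fatigue": 0.3}, {"prey"}))
# ...but the same wolf, with the player looming over it, runs instead.
print(choose_action({"hunger": 0.8, "fear": 0.9, "fatigue": 0.3}, {"prey", "threat"}))
```

The wolf chases the bunny because hunger happens to outweigh everything else at that moment, not because a designer placed a trigger around the corner; change the world state and the same rules produce a different scene.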

See, scripting can be good–just look at the original Half-Life, one of the greatest games of all time, for proof of that. At the same time, it can be bad when used in excess (see Uncharted, which is so much worse than Call of Duty when it comes to scripting and level design reducing freedom that it isn’t even funny–yet another post for another day). I think Game Q should operate with some level of scripting, but it should only do so in a way that enhances the story or the characters. Evil Mister X shouldn’t do a thing because the game designer wanted him to–Evil Mister X should be ready and able to do a lot of things dependent on the player’s behavior in the game, because that’s who he is.
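One way to picture that is scripting expressed as condition-and-reaction pairs instead of fixed trigger points—Evil Mister X owns a repertoire of big moves, and which one fires depends on what the player actually did. Again, this is a hypothetical sketch with made-up conditions and event names:

```python
# A hypothetical sketch of "scripting as elaborate AI behavior": each big,
# authored reaction is paired with a condition over the game state, and the
# one that matches the player's actual history is the one that fires.
from typing import Optional

def evil_mister_x_reacts(world: dict) -> Optional[str]:
    repertoire = [
        # (condition over the world state, the big scripted event)
        (lambda w: w["refused_his_favor"] and w["infiltrating_factory_z"],
         "gas_the_room_and_capture_player"),
        (lambda w: not w["refused_his_favor"] and w["infiltrating_factory_z"],
         "send_loud_assassin_as_backup"),
        (lambda w: w["player_found_his_spy"],
         "relocate_headquarters"),
    ]
    for condition, event in repertoire:
        if condition(world):
            return event
    return None  # no big reaction; the baseline AI just carries on

# The same mission plays out differently depending on your earlier choices.
print(evil_mister_x_reacts({"refused_his_favor": True,
                            "infiltrating_factory_z": True,
                            "player_found_his_spy": False}))
# -> gas_the_room_and_capture_player
```

The big moments are still authored, but the game decides which one (if any) happens based on the state the player created, which is the whole point.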

Ultimately, those scripted behaviors throughout Game Q mean that the player feels like they need to interact in a specific way with any NPC they meet.

If Friendly Boss might help you out for sneaking in to Base Y, maybe you should let him know. If the game is able to track your play style (“player completes missions with 30% sneaking, 10% shooting, 60% disguises”), maybe NPCs might recognize that you did a mission if you keep using that play style, so you might want to consider changing things up. Maybe you know that one of Evil Mister X’s spies has infiltrated your organization (it might even be Friendly Boss!), so you decide not to tell anyone and do everything off the grid so nobody learns about your mission until it’s done.
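That “30% sneaking, 10% shooting, 60% disguises” idea is simple enough to prototype. Here’s a minimal sketch in Python—the categories and the recognition threshold are hypothetical, and a real game would feed this from mission telemetry rather than a hand-typed list:

```python
# A hypothetical sketch of play-style tracking and NPC "recognition."
from collections import Counter

class PlayStyleTracker:
    def __init__(self):
        self.counts = Counter()

    def record(self, approach: str) -> None:
        # Call once per objective or mission with how the player handled it.
        self.counts[approach] += 1

    def profile(self) -> dict:
        # Fraction of the player's history spent on each approach.
        total = sum(self.counts.values()) or 1
        return {k: v / total for k, v in self.counts.items()}

    def npcs_recognize(self, approach: str, threshold: float = 0.5) -> bool:
        # If one approach dominates, NPCs could plausibly recognize the player's handiwork.
        return self.profile().get(approach, 0.0) >= threshold

tracker = PlayStyleTracker()
for approach in ["disguise", "disguise", "sneak", "disguise", "shoot",
                 "disguise", "sneak", "disguise", "disguise", "disguise"]:
    tracker.record(approach)

print(tracker.profile())                   # {'disguise': 0.7, 'sneak': 0.2, 'shoot': 0.1}
print(tracker.npcs_recognize("disguise"))  # True: time to change things up
```

Once the game has that profile, NPCs reacting to your dominant approach—or Evil Mister X’s spies recognizing your handiwork—becomes a couple of comparisons rather than a scripting problem.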

Basically, I think removing player freedom doesn’t necessarily mean the game stops being free. If you lose your freedom as the result of your actions, then… it was your freedom that got you there. If anything, your freedom is enhanced when it gets taken away. Gandhi once said (more or less) that freedom doesn’t matter unless you have the freedom to screw up. If you choose something that screws you over… well, that’s still freedom, even if it means being tied to a chair and beaten for it. As long as Game Q doesn’t permanently take that freedom from you, it should be fine.

Somebody else once said that the people who value freedom most are the ones who have had it taken away. It seems to me that the game would matter more if you were put in situations where you had no freedom (as a direct result of your freedom, as just discussed), and you had to re-earn your freedom in some way.

Game Q should be able to combine the player freedom and unscripted nature of the immersive sim with the scripted nature of more story-focused games, topping both by having characters that appear to make intelligent decisions based on player actions. They’re still reactive characters, like you’ll find in story-focused games like Mass Effect (I never said they had to be good stories, did I?), ultimately doing what they do based on what you do, but at least they’re not either simple AI behaviors or set-in-stone scripted behaviors.

I guess you could think of this implementation of scripting as… really elaborate AI behaviors. Jamie Griesemer and Chris Butcher, in their presentation “The Illusion of Intelligence,” which discusses the implementation of Halo’s AI, mention how part of the illusion of enemy intelligence came from giving Halo’s enemies a wide variety of things to do and letting them stick around long enough to use some of those abilities. The scripting is just a really large event that occurs based on the context the characters find themselves in. It makes them seem smarter.

Complete (not total) freedom gives you a game that doesn’t feel genuine because its characters don’t do anything big. There’s rarely any human X-Factor in there. You just do things the way you want to do them, the end. The world doesn’t change as a result of your actions beyond, of course, “oh, this mission’s sub-objective was not to be detected, so you lost a chance to earn 500 XP and some dialog options changed.” The choices don’t really have consequences, and, as you should know by now, choices are meaningless without consequences. Likewise, a scripted game is going to be the same no matter what, so, once again, your choices have no consequences, because you have no choice. You do what you’re told and nothing ever changes.

A hybrid of these two should offer the strengths of both while eliminating their weaknesses.

That’s the theory, anyways.

Kubrick & Me

I think my first Kubrick film must have been Spartacus. Even then, I could tell that the film, and, consequently, its director, were far above and beyond their peers. It’s astonishing just how timeless Kubrick’s films all are. In the ensuing years, I’ve devoured many of his other classics, from 2001: A Space Odyssey to Full Metal Jacket. Most recently, I found myself watching The Shining, which is a terrifying film, not only because of the excellent performances and curious script, but also because of the clever lighting and camera work. Kubrick fascinates me for a wide variety of reasons, from his minimal yet extremely high-quality output to the fact that he never made the same film twice. If you don’t know much about the man, this documentary is a good place to start.

To say that he influenced me heavily would be an understatement–few creative types have had the level of impact that Kubrick had. There are two specific traits of his which I’ve found myself emulating quite a lot throughout my life, and I think they’re quite important things that everyone could apply in their lives, particularly if they’re intent on working in an artistic medium like games.

What am I referring to?

“…there are thousands of decisions that have to be made, and if you don’t make them yourself, and you’re not on the same wavelength as the people who are making them, it becomes a painful experience, which it was.”

Over the years, in various team roles, this has always proven to be the case. Many people, it seems, have interpreted this the wrong way, assuming that whoever makes all the decisions is also the only one coming up with ideas, which is, of course, absurd. In the documentary linked above, R. Lee Ermey, the Sergeant from Full Metal Jacket, talks about how he improvised one of the most memorable lines in the movie. While he could have cut the line because it wasn’t what he wanted (and Kubrick was known for being a perfectionist who would take waaay more shots than most directors before he found the right one), Kubrick made the decision to keep it in, and Full Metal Jacket was better as a result.

Executive decisions are important, particularly when it comes to artistic merit. Art is not created by committee, nor should it be. Kubrick was the artist behind all of his films, even if he worked with hundreds, if not thousands, of others. Those movies were very much his. Having a singular artistic vision driving a work is absolutely vital to the process, though the value of collaboration and improvisation cannot be overstated. Collaboration brought us some of the best episodes of The Simpsons and movies like Airplane!, after all.

Art is not a democratic process, nor should it be.

"But if you don't support democracy, the Communists win! And steal our bodily fluids!"

The second major lesson to be learned from Stanley Kubrick comes from an anecdote, found, again, in the documentary, where one of the crew, responsible for designing things like the War Room, mentions how Kubrick, upon seeing his designs, wanted to know why he had chosen the various elements that he did. A great attention to detail is a common theme in the work of many great artists, actually. Steven Spielberg, in a retrospective on his career by the Directors Guild of America, talked about how, in Lawrence of Arabia, when the eponymous protagonist is at a well, you can see a trail laid down over years and years of visits by the various Arab tribes. It's an element that, while unnecessary, adds more depth to the world. Orson Welles, the greatest director of all time (and how could he not be? the man directed Citizen Kane, considered by pretty much everyone to be the greatest film ever made, masterminded the War of the Worlds radio broadcast, the greatest hoax ever pulled, and created F for Fake, the best film about forgery you will ever see), was an absolute master at this, with everything from lighting to camera angles (Citizen Kane).

This isn't often an element of video games, unfortunately. Once you start asking yourself why the game's writers and artists made the decisions they did, things start to fall apart all too easily. Even in an artistic triumph like Uncharted, where an uncanny amount of detritus litters the screen, making the world feel lived in, you'll start to see the cracks in the facade. If you don't believe me, well, here's an excellent look at the game. It's unfortunate that this is the case, which makes me all the more delighted to see how id's artists have taken game art to the next level with Rage.

It's a remarkably solid game, and I'm still powering through it, so I probably won't talk about it much until I'm done, but I do want to point this out: While the whiners seem to love complaining about how the game's story isn't very original (it's not, but the topic of a natural apocalypse hasn't actually been done in games before; they've all been man-made or unexplained zombiepocalypses so far, and this is an id game, so I'm not sure why people would suddenly be concerned about stories), or how the weapons are weak (which is a flat-out LIE), or how the enemies are uninspired (also lies and maybe heresy; in terms of enemy movement, there is nothing like them, and they vary widely; however, despite the massive variation between enemy types, I've only run across nine major types in the first ten hours, not counting vehicles), the art direction is either not commented on (because yay for texture pop-in!) or called derivative of Fallout 3 or Borderlands, which is just silly.

Let me see if I can explain that awful butchery of a paragraph that was sort of a mini-review: Everything in Rage seems to be hand-sculpted. There's this insane attention to detail, so intense that you'd be hard-pressed to find much artistic repetition, which is rampant in most games, even Uncharted 2. The game just does not reuse props that often! Instead, it makes absolutely every single thing you will ever do, well… new. The only time you do the same thing twice is when you accept a mission (which is totally optional!) to go back to a place you've already been.

When id Tech 5 isn't being cantankerous, you can really get a feel for just how unique and hand-crafted everything is. There's a sense that this is a real world shaped by real people, not some world designed by artists trying to make a game as quickly as possible. Whether it's the semi-photorealistic (albeit exaggerated for mood) use of color or the shapes of the structures, Rage is just…

It’s everything a 3D game ought to be, artistically.

You know, I like this post. I like the first half and I like the Rage portion. I like how they fit together. But… while they’re connected ideas, they don’t make for a very coherent article, and I apologize for that. I like this too much to break it apart, even if it doesn’t work well. It’s not like I’m being graded on this or anything, and I have to go to class now, where I will be graded on something, so… adios for now!

Remember Me?

"We who are about to rock salute you."

Let’s play a game!

It’s quite simple, and you should enjoy it, unless you’re some sort of daft punk who hates music. All you need to do is adjust your volume to proper levels, click on a link, and close your eyes. Once you’ve guessed the name of the song playing, you can open your eyes, check to see if you were correct, and repeat, until you’ve heard all the songs listed. Then we’ll have a think about it.

  • If you can guess what this is, congratulations, you are alive.
  • Here’s another easy one.
  • Honestly, if you haven’t heard this, there is no hope for you.

Right, so, easy, no?

  • This one, you should recognize, but… well… hm. I’ll get to it in a moment.
  • Again, I expect you to recognize this, maybe.
  • I’m not sure if you could call it a theme, but it should be somewhat recognizable.

Was that more difficult? I expect it was.

Surely you recognized those songs, Princess Leia!

I didn't actually see Star Wars, Mission: Impossible, or The Pink Panther until very recently, but I knew those themes. Ask me to hum them for you, and I probably could. Ask most people and, unless they're horribly tone-deaf, chances are they can hum, whistle, or sing those tunes with great ease. The latter songs, though? I'm sure many people have played or heard of the games they come from, but I doubt many people, when asked "hey, do you know Fallout 3's/Bioshock's/Assassin's Creed 2's theme," would be able to recall the tunes right away. Show them a picture of the games, and you'll have instant recognition–but you can do that with the films, too. Star Wars looks distinctively Star Wars, and pretty much everyone should be familiar with the cartoon Pink Panther.

See, video games have a problem. Any sort of audiovisual entertainment, whether it's a television show, a movie, or a video game, relies on certain cultural shorthand for recognizability. A theme song, a logo, or a character are all symbols that can be used to convey a lot of information in a very small amount of space. The Jurassic Park theme immediately conjures up pictures of dinosaurs in my head, and at the same time, the fact that it has multiple connections to Indiana Jones and Jaws (John Williams composed their soundtracks, and Steven Spielberg directed the films) means that I might start thinking about them as well. Essentially, it's about conveying an idea, or many ideas, without words. It's like a signature–a thing that says "yes, I am a part of this idea or from this person, whatever it or whoever they may be."

This Bat-symbol means something to most people:

A picture may be worth a thousand words, but a symbol is worth a million.

This one does too:

Atomic batteries to power! Turbines to speed!

The Bat-symbol is visual shorthand for the idea of Batman, but there are multiple interpretations of the character. Adam West's Batman isn't like Michael Keaton's or Christian Bale's, and as a result, the Bat-symbol changes. It's still a Bat, so you know it's about Batman, but one particular symbol makes you think about the Nolan films and another makes you think about the Burton ones. You see, it's all about identity. Give a game a symbol, a logo, a distinct character design, or a song, and you give it a unique identity. If someone were to replace the letters B-I-O-S-H-O-C-K with H-A-L-O, you'd still know it was Bioshock's logo. But if you replaced the Bioshock theme… well, it doesn't really matter. Bioshock's theme is nice, but ultimately insignificant. It's soundtrack music–designed to stay in the background. The same is true for most video games; the music simply isn't good enough, or at least distinct enough, to burrow itself into our cultural identity.

I know a good number of people who think that certain JRPGs they played as children and teenagers have great music. As a former music teacher and honor band member, I honestly couldn't tell you that Chrono Trigger or Final Fantasy have great music. It's mostly forgettable unless you have a certain emotional attachment to the game itself; otherwise, the music doesn't really stay with you. Firelake's Dirge for the Planet is a fairly forgettable song, except for the fact that it's inextricable from my time playing STALKER. It's a song that plays on the radio in the few safe places in the entire game. Hearing it is intensely comforting to me, but to people who haven't played STALKER (or who had their music turned off), Dirge has no meaning. Half-Life 2, Earthbound, and Dragon Age have such forgettable soundtracks that I have replaced them in my head with Armin van Buuren, the Scorpions (specifically, Rhythm of Love), and Electric Six/Led Zeppelin, respectively. Mass Effect may have a theme song, but that didn't stop BioWare from using Two Steps from Hell's Heart of Courage in place of it during the trailer.

Now, I did say that it's true for most video games. There are a few exceptions, most notably Halo and (before the game is even out, no less) Skyrim. Hear those themes once, and it's unlikely you will ever forget them.

Ultimately, I think games could do with a more distinctive identity. With greater, more distinctive themes, they could push themselves further into the general pop-cultural consciousness. As it is now, game music is basically forgettable dreck, cared for only by the people who already love the games each song comes from. Come on, games industry, let's do better, shall we?