Archive for the ‘ About Me ’ Category

Blog All the Things!

According to advice from Lifehacker, I should blog every day. Cool. I’ll get right on that. Today, I’ll repost some posts I’ve made on Kotaku. After that, we’ll see what happens. Hopefully, I’ll post daily, but probably not. I will try, though.



The final Freedom post is incoming! Should hit some time Thursday!

Not sure what all I’m going to be blogging about soon, though I’ve got some ideas. I’d like to touch on the concept of people who only react to obviousness (most gamers), but I need to come up with some theories of how to circumvent it. An article on what makes a good PC game might be neat, and perhaps a discussion of Microsoft Flight and Age of Empires Online, or, rather, The Fall of Microsoft Game Studios, would be cool. I’m working on the Half-Life 2 breakdown, but I don’t really want to dig at it until it’s complete. Then there’s Dark Souls and my frustrations with it. I’m playing Thief right now, and it’s cool, but I’d like to get further into the game before I really touch on things. I’ve got a neat thought about it that I don’t think people have really touched on before, and it relates to old-school 3D games in general.

I’ve Been Away (and I’ve got a great reason)

I’ve been away because I’ve been making a video game, on top of starting the new semester and picking up more hours at work.


I’ve been making a video game. It’ll be awesome.

Semi-regular posting will resume next week.

Kubrick & Me

I think my first Kubrick film must have been Spartacus. Even then, I could tell that the film, and, consequently, its director, were far above and beyond their peers. It’s astonishing just how timeless Kubrick’s films are. In the ensuing years, I’ve devoured many of his other classics, from 2001: A Space Odyssey to Full Metal Jacket. Most recently, I found myself watching The Shining, which is a terrifying film, not only because of the excellent performances and curious script, but also because of the clever lighting and camera work. Kubrick fascinates me for a wide variety of reasons, from his minimal yet extremely high-quality output to the fact that he never made the same film twice. If you don’t know much about the man, this documentary is a good place to start.

To say that he influenced me heavily would be an understatement; few creative types have had the level of impact that Kubrick had. There are two specific traits of his that I’ve found myself emulating quite a lot throughout my life, and I think they’re quite important things that everyone could apply, particularly anyone intent on working in an artistic medium like games.

What am I referring to?

“…there are thousands of decisions that have to be made, and if you don’t make them yourself, and you’re not on the same wavelength as the people who are making them, it becomes a painful experience, which it was.”

Over the years, in various team roles, this has always proven to be the case. Many people, it seems, have interpreted this the wrong way, assuming that whoever makes all the decisions is also the only one coming up with ideas, which is, of course, absurd. In the documentary linked above, R. Lee Ermey, the Sergeant from Full Metal Jacket, talks about how he improvised one of the most memorable lines in the movie. Kubrick could have cut the line because it wasn’t what he had written (and he was known as a perfectionist who would take far more shots than most directors before he found the right one), but he made the decision to keep it, and Full Metal Jacket was better as a result.

Executive decisions are important, particularly when it comes to artistic merit. Art is not created by committee, nor should it be. Kubrick was the artist behind all of his films, even if he worked with hundreds, if not thousands, of others. Those movies were very much his. Having a singular artistic vision driving a work is absolutely vital to the process, though the value of collaboration and improvisation cannot be overstated. Collaboration brought us some of the best episodes of The Simpsons and movies like Airplane!, after all.

Art is not a democratic process, nor should it be.

"But if you don't support democracy, the Communists win! And steal our bodily fluids!"

The second major lesson to be learned from Stanley Kubrick comes from an anecdote, found, again, in the documentary: one of the crew, responsible for designing things like the War Room, mentioned how Kubrick, upon seeing his designs, wanted to know why he had chosen the various elements that he did. Great attention to detail is a common theme in the works of many great artists. Steven Spielberg, in a Directors Guild of America retrospective on his career, talked about how, in Lawrence of Arabia, when the eponymous protagonist is at a well, you can see a trail laid down over years and years of visits by the various Arab tribes. It’s an element that, while unnecessary, adds more depth to the world. Orson Welles was an absolute master at this, with everything from lighting (The Third Man) to camera angles (Citizen Kane). And how could Welles not be the greatest director of all time? The man directed Citizen Kane, considered by pretty much everyone to be the greatest film ever made; masterminded The War of the Worlds radio broadcast, the greatest hoax ever pulled; and created F for Fake, the best film about forgery you will ever see.

This isn’t often an element of video games, unfortunately. Once you start asking yourself why the game’s writers and artists made the decisions they did, things start to fall apart all too easily. Even in an artistic triumph like Uncharted, where an uncanny amount of detritus litters the screen, making the world feel lived-in, you’ll start to see the cracks in the facade. If you don’t believe me, well, here’s an excellent look at the game. It’s unfortunate that this is the case, and I’m positively delighted to see how id’s artists have taken game art to the next level with Rage.

It’s a remarkably solid game, and I’m still powering through it, so I probably won’t talk about it much until I’m done, but I do want to point this out: while the whiners seem to love complaining about how the game’s story isn’t very original (it’s not, but a natural apocalypse hasn’t actually been done in games before; they’ve all been man-made or unexplained zombiepocalypses so far, and this is an id game, so I’m not sure why people would suddenly be concerned about stories), or how the weapons are weak (which is a flat-out LIE), or how the enemies are uninspired (also lies, and maybe heresy; in terms of enemy movement, there is nothing like them, and they vary widely; despite that massive variation, though, I’ve only run across nine major enemy types in the first ten hours, not counting vehicles), the art direction is either not commented on (because yay for texture pop-in!) or called derivative of Fallout 3 or Borderlands, which is just silly.

Let me see if I can explain that awful butchery of a paragraph, which was sort of a mini-review: everything in Rage seems to be hand-sculpted. There’s an attention to detail so intense that you’d be hard-pressed to find much artistic repetition, which is rampant in most games, even Uncharted 2. The game just does not reuse props that often! Instead, it makes absolutely every single thing you will ever do, well… new. The only time you do the same thing twice is when you accept a mission (which is totally optional!) to go back to a place you’ve already been.

When id Tech 5 isn’t being cantankerous, you can really get a feel for just how unique and hand-crafted everything is. There’s a sense that this is a real world shaped by real people, not a world designed by artists trying to ship a game as quickly as possible. Whether it’s the semi-photorealistic (albeit exaggerated for mood) use of color or the shapes of the structures, Rage is just…

It’s everything a 3D game ought to be, artistically.

You know, I like this post. I like the first half and I like the Rage portion. I like how they fit together. But… while they’re connected ideas, they don’t make for a very coherent article, and I apologize for that. I like this too much to break it apart, even if it doesn’t work well. It’s not like I’m being graded on this or anything, and I have to go to class now, where I will be graded on something, so… adios for now!

Subjectivity is Absurd: Why Being an Expert is a Real Thing

(I’ll add pictures later.)

What follows is the first of two blog posts about how I think about games and games criticism. I have run into a few instances of people telling me that I can or can’t criticize certain things due to their popular reception, or because those things are subjective. I feel it would be instructive to lay out my history and my mental approach to games and criticism, to help people understand that it’s not an issue of fanboyism, or a case of the contrarian blues, or a resistance to popularity, or nostalgia, or any of those things; I simply try to take an intellectual approach to games criticism. Hope this helps.

I am a critic.

Generally, you hear that word given a negative connotation (“everybody’s a critic!” when someone criticizes another), and in part, that’s because of the word I just used: criticize. What you may not realize, however, is that that’s the second definition in the dictionary (which, for our purposes today, is Merriam-Webster’s online dictionary). The first definition is “to consider the merits and demerits of, and judge accordingly.” A merit is, of course, whatever the object of criticism does well, and a demerit is whatever it does poorly. As a critic, it is my responsibility to judge a work and determine its quality based on its individual merits.

A core tenet of criticism, not just games criticism but criticism in general, is the understanding that things can be criticized. The field of criticism cannot exist without this. When people argue that “oh, well, what I think is good is subjective…” well, no, not really. They don’t hire me to judge skating competitions because I am not an expert. Likewise, a skating judge isn’t generally hired to criticize literature. If culture were entirely subjective, and the quality of a work were judged solely by the individual, then honestly, academic discourse on the arts would not exist. There would be no one to study the merits of art, only psychologists to study why people like certain things.

As you can see, this is not the case. Serious, knowledgeable, and intelligent people don’t believe in subjectivity, not when it comes to defining the quality of a work.

What many people fail to understand is that one’s enjoyment of a thing does not affect the quality of that thing. There is no Observer Effect in art. How you perceive something does not change that thing; it’s merely your perception. A critic’s responsibility is to look at that thing and judge it by the qualities that human culture has deemed good or bad. That’s what “good” and “bad” are: words that express how culture perceives something, not how you perceive it. The very dictionary definition of the word good is that it is “of a high quality or standard, either on an absolute scale or in relation to another or others.” If perception determined good or bad, then cultural ideas of good would have no purpose. There would be nothing inherently bad about the mass murder of innocents, so long as the person committing the act perceived it to be a good thing. Instead, we determine that their perception was wrong, that they were insane or evil, and that they deserved punishment.

Good and bad, then, can never be tied to perception. Instead, as that dictionary definition shows, good is all about comparison, and where do we get things to compare against? Culture. Generally, a thing that is good bears qualities that, in other, comparable works, have stood the test of time.

“But,” you may protest, “what do we use to express how we perceive a work? How do we convey our opinions?” Don’t fret. The English language has you covered. If you enjoy something, you “like” it. If you don’t enjoy something, you “dislike” it. I realize I might sound condescending there, but, in my experience, many of the people who confuse “like” and “dislike” with “good” and “bad” tend to be very insistent people who just don’t like it when you say bad things about things that they enjoy.

And really, that’s why the idea of subjectivity is so attractive. It allows people who like bad things to defend themselves. People who know that Twilight is bad can defend their liking it by saying “everything’s subjective!” After all, most people don’t want to admit that they like bad things (though for cases like this, the English language once again comes to the rescue: they’re our guilty pleasures. I love listening to Speedycake’s Caramelldansen remix. It’s awful).

There is absolutely nothing I, or anyone else, can do to make you like or dislike something. If I can prove, using cultural standards, that something is good or bad, this should in no way affect whether or not you enjoy it! What you enjoy is a massive complex ultimately defined by you. Your life experience, such as pleasant memories accompanying a specific song, will determine how you feel about that song. Likewise, everything from neurological quirks (such as the wonderful gift that is synesthesia) to belief systems to simply whatever art you have consumed will affect your perception. Despite a pretty good relationship with my mother, Pink Floyd’s “Mother,” from The Wall (which I believe is the greatest album ever recorded), speaks to me on a very deep level. Watching my grandmother slip into mental decline means that “Wish You Were Here,” again by Pink Floyd, also carries extreme meaning. On the flip side, I think Mozart’s music is absolutely soulless.

Each person is a unique, incredible individual. It might sound corny to say that, but, hey, it’s true. Each person’s quiddity (the “whatness” of a person) is different, and what we like and what we dislike is a part of that. What’s good or bad… well, that’s culture. It might be argued that, indeed, everything is subjective, but that’s a debate best left to the philosophers. Critics exist, and the English language has accounted for both standards: good and bad on one hand, like and dislike on the other. One is cultural, the other is personal.

Have I hammered this point home enough? I hope so.

I would be remiss if I did not mention two absolutely vital exceptions. First, there is always room for new art and cultural shift. People didn’t like Alexandre Dumas’s mixed-race heritage and extravagant lifestyle, so they downplayed the importance of The Count of Monte Cristo, which is one of the greatest works of fiction ever produced. The Treachery of Images, by the Belgian René Magritte, was not something that had been done before. It was a new kind of art, and as such, could not readily be compared with existing works. One could say “oh, well, you know, that’s not a very good-looking pipe; Van Gogh did it better,” but that would be missing the point. Also, Van Gogh’s pipe totally sucks.

An equally important exception is that sometimes, the rules can be broken. The rule of thirds is a good rule to follow when taking pictures. In fact, there’s a neat little blog post here that I just ran across (it uses the same blog skin I just switched to!) that explains the rule of thirds quite nicely. Check it out. However, that rule can be broken; totally centering an image has some profound effects on the viewer’s mind. It’s actually quite unpleasant to see, and when coupled with great sound design, editing, and lighting, you’ll end up with something like this scene from The Shining. By breaking this rule, Stanley Kubrick made an incredibly unpleasant image that conveyed the horror of that scene far more effectively than if he had followed the rules of good filmmaking. It’s important to realize that the rules matter, but they aren’t hard and fast. Deep characterization and growth is absolutely important to a good story… except when it isn’t. As an example, the story (if it can be called that) of T. S. Eliot’s The Waste Land simply would not work with well-defined characters. Breaking the rules is okay, but unless you’re approaching something with the calculated genius of Eliot or Kubrick, you’re more likely to benefit from following them.

As an aside, you may wonder why I’m not really going in-depth on which rules are good and which are bad. Well, there are rather a lot of them, and some are better described as techniques than rules. A lot of them aren’t universal, either, even within storytelling. There are lines that convey an idea in a book that simply won’t work in a movie; the rules for television editing are totally different from those for film editing; and narrative structure in music, like Pink Floyd’s The Wall, just wouldn’t work in, say, comics. Each art medium has its own rules and techniques to follow.

I have experience as a critic. I’m not just some green kid with a little WordPress blog; I’ve actually worked as a film and comics reviewer on a now-dead reviews website (the site imploded when its editor decided to control everything, so the entire staff up and left). I’ve also got an extensive background in literature, test extremely well on English and literature exams (top 1% of US scores, for what it’s worth), study film and game design, and read like a maniac. I’ve also won a handful of writing awards. I don’t say this to toot my own horn, but to emphasize my experience. I’m no Doctor of Literature, nor am I a famous published author, but I do have more of a background than the average joe. In other words, I have more of a grasp on things like good storytelling than the average game reviewer does, even if they are paid to write about games and I’m not. When I criticize games, I’m not just saying “hey, I didn’t have fun with the game”; I’m giving you the reasons, drawn from a background most people don’t have, as to why a game is bad. Gaming, particularly from a narrative perspective, isn’t seen as art because, hey, it’s not particularly artistic. Game writing is often quite bad. In class today, we watched part of The Room, which is generally considered The Worst Film Ever. I’ve seen worse game cutscenes.

While I realize that this sounds arrogant, I don’t really know how else to put it: most game writers are just journalists who grew up playing games. Some of them aren’t even journalists, just enthusiasts who, again, grew up playing games. You’ve got the odd few Actual Games Writers, like Kieron Gillen, and generally, the magazines hire better writers than the blogs do (there are exceptions: Leigh Alexander, who is a head honcho at Gamasutra and a great games writer with a lot of really thought-provoking articles, got into writing through, if I remember right, Destructoid’s community blogs), but overall, games writing just hasn’t got the level of critical backing that a lot of other fields do. Partly, this is an academic failure. Next semester, I can sign up for any one of a number of film, literature, music, or art courses. Video game criticism and narrative courses, on the other hand, don’t really exist.

I’m probably overselling myself here, and I certainly don’t mean to. It’s just… when you have someone who grew up with games like Final Fantasy VIII, they’re extremely likely to go “wow, this is one of the greatest love stories of all time,” even when the story was actually a case of the writers trying to force two completely unlikable (Squall seems entirely apathetic throughout the game and Rinoa is “an outspoken and passionate woman who follows her heart in all situations,” or, in other words, is a Mary Sue) characters together while going on at length about a bunch of random witches who murder people while crowds cheer and suddenly halfway through “oh hey do you remember that we all grew up in an orphanage together?”

Did my sentence lose coherency there? Yeah, well, Final Fantasy VIII was worse. Within the niche that is video gaming, however, Final Fantasy VIII is extremely well liked. If you gave it to the average person, they’d call it like it is: strange, confusing dreck.

It’s not just nostalgia, though. In games criticism, there are two major issues that people seem to forget about. First, game reviews are often based on initial impressions. If film criticism worked that way, Transformers 3 would be considered one of the greatest films of all time. It’s easy to get swept up in the “HOLY COW, THOSE ROBOTS JUST PUNCHED EACH OTHER!” of the moment and totally miss the fact that the writing and characters suck. In games writing, that initial “HOLY CRAP!” feeling gets carried right into the reviews, and there’s rarely a pause for critical thought. The initial emotions become the review, and people just don’t bother to go “hey, wait a minute…” This is exacerbated by the second major issue (which can be seen simply by reading the game reviews aggregated on Metacritic): many reviewers post a review before they’ve bothered to finish the game. This is partly due to extremely busy schedules and partly due to wanting to get reviews out as early as possible around a game’s release. What you have is a recipe for going “wow, that was a really dramatic moment, so I’m going to praise this game!” and being done with it.

So! Where was I going with this?

Just this: there is such a thing as objective criticism of an art form, but not everything is absolutely concrete, and while my opinions may differ from the norm, I also have a background most professional games writers lack, and I’m not in the kind of environment that produces the initial-impression reviews that make up most games coverage. After my first playthrough of Mass Effect 2, I was screaming about how it deserved Game of the Year. After my second, I started seeing cracks in the foundation of the game that I couldn’t ignore. I’ve got the time and expertise to talk about that; most people, sadly, don’t have both. I’m not better than anyone. I just have a different scope of experience, one that lets me perceive things a bit differently and explore them in ways I hope will be beneficial to others. Ultimately, I hope my criticism results in better games being made (most likely through consumers demanding better games; I have no pretensions of nabbing a game designer’s attention and causing a radical paradigm shift in her brain), and in a broader acceptance of games as an art form. After all, criticism’s entire purpose is to point out what’s good and what’s bad in the hope that the good will be seen more and the bad will be eliminated.

Don’t we want that?

That said, it doesn’t mean I’ll never be subjective. I’m totally going to write about what I like and dislike here on alphatown. Looking at everything with a purely analytical eye can suck, and this blog would become dry and boring if that were all I ever did.

One thing I didn’t touch on, by the way: I really didn’t start gaming until 2007, so I’m relatively untainted by game nostalgia. You may be going “wait… what?” right about now. Unfortunately, we’re out of time… so we’ll talk later, okay?