Subjectivity is Absurd: Why Being an Expert is a Real Thing

(I’ll add pictures later.)

What follows is the first of two blog posts about how I think about games and games criticism. I have run into a few instances of people telling me that I can or can't criticize certain things because of their popular reception, or because these things are subjective. I feel it would be instructive to lay out my history and my mental approach to games and criticism so people can more fully understand my thought process: it's not an issue of fanboyism, or a case of the contrarian blues, or a resistance to popularity, or nostalgia, or any of those things; I simply try to take an intellectual approach to games criticism. Hope this helps.

I am a critic.

Generally, you hear that word given a negative connotation (“everybody’s a critic!” when someone criticizes another), and in part, it’s because of the word I just used, criticize. What you may not realize, however, is that that’s the second definition in the dictionary (which, for our purposes today, is Merriam-Webster’s online dictionary). The first definition is “to consider the merits and demerits of, and judge accordingly.” A merit is, of course, whatever the object of criticism does well, and a demerit is whatever it does poorly. As a critic, it is my responsibility to judge a work and determine its quality based on its individual merits.

A core tenet of criticism–not just games criticism, but criticism in general–is the understanding that things can be criticized. The field of criticism cannot exist without this. When people argue that "oh, well, what I think is good is subjective…" well, no, not really. They don't hire me to judge skating competitions, because I am not an expert. Likewise, a skating judge isn't generally hired to criticize literature. If culture were entirely subjective, and the quality of a work were judged solely by the individual, then honestly, academic discourse on the arts would not exist. There would be no one to study the merits of art, only psychologists studying why people like certain things.

As you can see, this is not the case. Serious, knowledgeable, and intelligent people don't believe in subjectivity, not when it comes to defining the quality of a work.

What many people fail to understand is that one's enjoyment of a thing does not affect the quality of that thing. There is no Observer Effect in art. How you perceive something does not change that thing; it's merely your perception. A critic's responsibility is to look at a work and judge it by the qualities that human culture has deemed good or bad. That's what "good" and "bad" are. Those words exist to express how culture perceives something, not to describe how you perceive it. The very dictionary definition of the word good is that it is "of a high quality or standard, either on an absolute scale or in relation to another or others." If perception determined good or bad, then cultural ideas of good would serve no purpose. There would be nothing inherently bad about the mass murder of innocents, so long as the person committing the act perceived it to be a good thing. Instead, we determine that their perception was wrong, that they were insane or evil, and that they deserved punishment.

Good and bad, then, can never be tied to perception. Instead, as we see by that dictionary definition, good is all about comparison, and where do we get things to compare from? Culture. Generally, a thing that is good is a thing that bears qualities that, in other, comparative works, have stood the test of time.

“But,” you may protest, “what do we use to express how we perceive a work? How do we convey our opinions?” Don’t fret. The English language has you covered. If you enjoy something, you “like” it. If you don’t enjoy something, you “dislike” it. I realize I might sound condescending there, but, in my experience, many of the people who confuse “like” and “dislike” with “good” and “bad” tend to be very insistent people who just don’t like it when you say bad things about things that they enjoy.

And really, that’s why the idea of subjectivity is so attractive: it allows people who like bad things to defend themselves. People who know that Twilight is bad can defend their liking of it by saying “everything’s subjective!” After all, most people don’t want to admit that they like bad things (though for cases like this, the English language once again comes to the rescue! They’re our guilty pleasures. I love listening to Speedycake’s Caramelldansen remix. It’s awful.)

There is absolutely nothing I, or anyone else, can do to make you like or dislike something. If I can prove, using cultural standards, that something is good or bad, that should in no way affect whether or not you enjoy it! What you enjoy is a massive complex ultimately defined by you. Your life experience, such as pleasant memories accompanying a specific song, will determine how you feel about that song. Likewise, everything from brain quirks (such as the wonderful gift that is synesthesia) to belief systems to simply whatever art you have consumed will affect your perception. Despite a pretty good relationship with my mother, Pink Floyd’s “Mother,” from The Wall (which I believe is the greatest album ever recorded), speaks to me on a very deep level. Watching my grandmother slip into mental decline means that “Wish You Were Here,” again by Pink Floyd, also carries extreme meaning for me. On the flip side, I think Mozart’s music is absolutely soulless.

Each person is a unique, incredible individual. It might sound corny to say that, but, hey, it’s true. Each person’s quiddity (the “whatness” of a person) is different for each and every one of us. What we like and what we dislike are a part of that. What’s good or bad… well, that’s culture. It might be argued that, indeed, everything is subjective, but that’s a debate best left to the philosophers. Critics exist, and the English language has accounted for both standards of good and bad and expressions of like and dislike. One is cultural, the other personal.

Have I hammered this point home enough? I hope so.

I would be remiss if I did not mention two absolutely vital exceptions. First, there is always room for new art and cultural shift. People didn’t like Alexandre Dumas’s mixed-race heritage and extravagant lifestyle, so they downplayed the importance of The Count of Monte Cristo, which is one of the greatest works of fiction ever produced. The Treachery of Images, by the Belgian René Magritte, was not something that had been done before. It was a new kind of art, and as such, could not readily be compared with already existing works. One could say “oh, well, you know, that’s not a very good-looking pipe; Van Gogh did it better,” but that would be missing the point. Also, Van Gogh’s pipe totally sucks.

An equally important exception is that sometimes, the rules can be broken. The rule of thirds is a good rule to follow when taking pictures. In fact, there’s a neat little blog post here that I just ran across (it uses the same blog skin I just switched to!) that explains the rule of thirds quite nicely. Check it out. However, that rule can be broken; totally centering an image has profound effects on the viewer’s mind. It’s actually quite unpleasant to see, and when coupled with great sound design, editing, and lighting, you end up with something like this scene from The Shining. By breaking the rule, Stanley Kubrick created an incredibly unpleasant image that conveyed the horror of that scene far more effectively than if he had followed the rules of good filmmaking. The rules matter, but they aren’t hard and fast. Deep characterization and growth are absolutely important to a good story… except when they aren’t. As an example, the story (if it can be called that) of T. S. Eliot’s The Waste Land simply would not work with well-defined characters. Breaking the rules is okay, but unless you’re approaching something with the calculated genius of Eliot or Kubrick, you’re more likely to benefit from following them.

As an aside, you may wonder why I’m not really going in-depth on which rules are good and which are bad. Well, there are rather a lot of them, and some are better described as techniques than rules. Also, a lot of these rules aren’t universal, even in terms of storytelling: there are lines that convey an idea in a book that simply won’t work in a movie; the rules for television editing are totally different from those for film; and narrative structure in music, like Pink Floyd’s The Wall, just wouldn’t work in, say, comics. Each art medium has its own rules and techniques to follow.

I have experience as a critic. I’m not just some green kid with a little WordPress blog; I’ve actually worked as a film and comics reviewer on a now-dead reviews website (the site imploded when its editor decided to control everything, so the entire staff up and left). I’ve also got an extensive background in literature, I test extremely well on English and literature exams (top 1% of US scores, for what it’s worth), I study film and game design, and I read like a maniac. I’ve also won a handful of writing awards. I don’t say this to toot my own horn, but to emphasize my experience. I’m no Doctor of Literature or anything, nor am I a famous published author, but I do have more of a background than the average Joe. In other words, I have more of a grasp on things like good storytelling than the average game reviewer does, even if they are paid to write about games and I’m not. When I criticize games, I’m not just saying “hey, I didn’t have fun with the game”; I’m giving you the reasons, drawn from a lifetime of experience that most people don’t have, as to why a game is bad. Gaming, particularly from a narrative perspective, isn’t seen as art because, hey, it’s not particularly artistic. Game writing is often quite bad. In class today, we watched part of The Room, which is generally considered The Worst Film Ever. I’ve seen worse game cutscenes.

While I realize that this sounds arrogant, I don’t really know how else to put it: most game writers are just journalists who grew up playing games. Some of them aren’t even journalists–just enthusiasts who, again, grew up playing games. You’ve got the odd few Actual Games Writers, like Kieron Gillen, and generally, the magazines hire better writers than the blogs do (there are exceptions: Leigh Alexander, a head honcho at Gamasutra and a great games writer with a lot of really thought-provoking articles, got into writing through, if I remember right, Destructoid’s community blogs), but overall, games writing just hasn’t got the level of critical backing that a lot of other fields have. Partly, this is an academic failure. Next semester, I can sign up for any one of a number of film, literature, music, or art courses. Video game criticism and narrative courses, on the other hand, don’t really exist.

I’m probably overselling myself here, and I certainly don’t mean to. It’s just… when you have someone who grew up with games like Final Fantasy VIII, they’re extremely likely to go “wow, this is one of the greatest love stories of all time,” even when the story was actually a case of the writers trying to force two completely unlikable characters together (Squall seems entirely apathetic throughout the game, and Rinoa is “an outspoken and passionate woman who follows her heart in all situations,” or, in other words, a Mary Sue) while going on at length about a bunch of random witches who murder people while crowds cheer and suddenly halfway through “oh hey do you remember that we all grew up in an orphanage together?”

Did my sentence lose coherency there? Yeah, well, Final Fantasy VIII was worse. Within the niche that is video gaming, however, Final Fantasy VIII is extremely well liked. If you gave it to the average person, they’d call it like it is: strange, confusing dreck.

It’s not just nostalgia, though. In games criticism, there are two major issues that people seem to forget about. First, game reviews are often based on initial impressions. If film critics worked that way, Transformers 3 would be considered one of the greatest films of all time. It’s easy to get swept up in the “HOLY COW, THOSE ROBOTS JUST PUNCHED EACH OTHER!” of the moment and totally miss the fact that the writing and characters suck. In games writing, that initial “HOLY CRAP!” feeling gets carried right into the reviews, and there’s rarely pause for critical thought. The initial emotions become the review, and people just don’t bother to go “hey, wait a minute…” This is exacerbated by the second major issue (which you can see simply by reading the reviews aggregated on Metacritic): many games reviewers publish a review before they’ve bothered to finish the game. This is partly due to extremely busy schedules and partly due to the pressure to get reviews out as early as possible around a game’s release. What you have is a recipe for “wow, that was a really dramatic moment, so I’m going to praise this game!” and be done with it.

So! Where was I going with this?

Just this: there is such a thing as objective criticism of an art form, but not everything is absolutely concrete, and while my opinions may differ from the norm, I also have a background most professional games writers lack, and I am not in the kind of environment that produces the initial-impressions reviews that make up most games coverage. After my first playthrough of Mass Effect 2, I was screaming about how it deserved Game of the Year. After my second, I started seeing cracks in the foundation of the game that I couldn’t ignore. I’ve got the time and expertise to talk about that; most people, sadly, don’t have both. I’m not better than anyone; I just have a different scope of experience that lets me perceive things a bit differently and, I hope, explore them in a way that will be beneficial to others. Ultimately, I hope my criticism results in better games being made (most likely through consumers demanding better games; I have no pretensions of nabbing a game designer’s attention and causing a radical paradigm shift in her brain) and in a broader acceptance of games as an art form. After all, criticism’s entire purpose is to point out what’s good and what’s bad in the hope that the good will be seen more and the bad will be eliminated.

Don’t we want that?

That said, it doesn’t mean I won’t be subjective. I’m totally going to write about what I like and dislike here on alphatown. Looking at everything with a purely analytical eye can suck, and this blog would become dry and boring if it’s all I ever did.

One thing I didn’t touch on, by the way: I really didn’t start gaming until 2007, so I’m relatively untainted by game nostalgia. You may be going “wait… what?” right about now. Unfortunately, we’re out of time… so we’ll talk later, okay?

Richard W. Arthur
October 9th, 2011

    Thank you for your exposition on the nature of criticism, which strikes me as intelligently considered, although like most matters that involve distinctions between subject and object, the point may be made that judgments regarding value, point of view, beauty, ethical worth, etc., are at their best a smooth synthesis of both–that in assessing the relative values of some phenomenon we are revealing our own and our contemporary values. Nature and nurture are both operative; up and down create dimensionality; male and female create species; heads and tails create a coin. Both are necessary. Regards, R. Arthur.
