Yesterday (tumblr version is here), I wrote a piece on The Citizen Kane of Video Games. I explained what that meant and why—so if you're rolling your eyes and saying we don't need one, go read that article first, then get back to me. At the end, you might have noticed that I mentioned that whatever the Citizen Kane of Video Games will be, if there is one (the way gaming impacted our daily lives has been a much slower infiltration than film's), it's going to be an FPS.
Time to explain why.
First, I'm going to repeat a story that goes something like this: while playing a popular video game, the storyteller's parent walks in the room and notices that the player is trying to figure out how to cross a river. The parent asks "why don't you chop down a tree and make a bridge?"
The storyteller, of course, thinks that's crazy and tries to explain why, but the parent doesn't get it. The storyteller's been playing games for years, and is intimately familiar with the concept of abstraction, even if they can't explain what that means. The parent, on the other hand, sees something very different.
The parent sees another world.
Video games come in two different varieties***. First, we've got the arcadey games. They're the ones that rely on rules and heavy layers of abstraction to communicate an idea. Then we've got simulationy games, which feature significantly less abstraction, attempting to communicate the world in very real terms. The expectation of the non-gamer, upon seeing a 3D world, is that said 3D world will operate much the same way the real world does. In the real world, you can chop down a tree to make a bridge.
Abstraction is the concept of simplifying basic human interactions. Civilization is a very abstract game because you're not actually chatting with political leaders, having nice, lengthy talks about diplomacy, nor are you building a city; you're dealing with these concepts through very limited means. Another example of this would be an RPG—and for the purposes of this discussion, I do mean a game that actually features roleplay (so no JRPGs)—like Dragon Age: Origins.
The player views the game from a third-person, top-down perspective. They interact with the world by clicking on an object—click and a door opens. Defeat bad guys by clicking a button that makes your character repeat a stab animation, causing the bad guys to lose HP, and you gain the ability to do it more quickly next time. Transfer objects from your inventory to another character, and that character will open up new dialog options that may provide you with new equipment to change their stats. The game's tactics system lets you take simplified situations and plug in your own simplified response: "if [health] is under 10%, then use [smallest health pack]," or "if [enemy] is under 10% [health], then use [ranged attack]."
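That tactics system is essentially a little rule engine: an ordered list of condition/action pairs, checked top to bottom each turn. Here's a minimal sketch of the idea in Python—the names and the one-rule-per-turn behavior are my assumptions for illustration, not Dragon Age's actual implementation:

```python
# Hypothetical sketch of an "if [condition], then [action]" tactics system.

def make_rule(condition, action):
    """Pair a predicate on the game state with an action to run."""
    return (condition, action)

# "If [health] is under 10%, then use [smallest health pack]."
rules = [
    make_rule(lambda s: s["health"] < 0.10 * s["max_health"],
              lambda s: s["actions"].append("use smallest health pack")),
    # "If [enemy] is under 10% [health], then use [ranged attack]."
    make_rule(lambda s: s["enemy_health"] < 0.10 * s["enemy_max_health"],
              lambda s: s["actions"].append("use ranged attack")),
]

def run_tactics(state, rules):
    """Fire the first rule whose condition matches the current state."""
    for condition, action in rules:
        if condition(state):
            action(state)
            break  # assume one tactic fires per turn
    return state

state = {"health": 8, "max_health": 100,
         "enemy_health": 90, "enemy_max_health": 100,
         "actions": []}
run_tactics(state, rules)
print(state["actions"])  # the low-health rule fires
```

The point is how heavily abstracted this is: the entire messy reality of "deciding what to do in a fight" collapses into a handful of threshold checks.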
This is abstraction. It's a wonderful way to communicate really complex ideas through very simple means. Plus, plenty of people simply enjoy playing games as games. They enjoy playing Monopoly or Settlers of Catan or Fiasco or Dungeons & Dragons or any one of a number of cool games.
In a simulation, things are much more like the real world. The goal is to be as realistic as possible; that usually means avoiding turn-based play and attempting finer control over things. A game like Receiver removes as much abstraction as possible, focusing entirely on how you would actually manipulate a gun if you had one. The intent is to be as close to reality as possible.
And these are very general terms. Plenty of games walk the line between simulation and gaminess; Far Cry 2, for instance, was very much a sim-oriented game, but every single safe house functioned in just about the same way, the diamond system was very simplified, all stores featured an identical setup, and so on. Fable features the third person perspective, but attempts to simulate your interactions with other people in as many ways as possible (talking, actions, whether you own their house or not, etc), which, as an aside, is one of the reasons that Fable would be a lot better as a computer game than a console game.
Oh, right, I should mention that: console games tend to be rooted much more in being a game than computer games do. It's why the immersive sim genre never really took off on consoles until a lot of the simulation stuff was reduced and the games became more gamelike. There's a distinct way of thinking when it comes to console and computer design that's very different. Even abstract computer games are often very focused on abstract simulation concepts—see games like Ultima VII or X-COM. They're both played from a bird's-eye perspective, and both seem to be heavily influenced by tabletop gaming more than anything else. Despite this, they still attempted simulation in different ways than we might expect. We could spend hours talking about how they work and function, though, so I'll just paraphrase something that Julian Gollop, the creator of the original X-COM, said about Firaxis' XCOM, released in 2012:
“I guess my original game was a bit more simulation-ny and the new game is a bit more board game-y.”
Okay, so, why bring up the dichotomy between console and computer games?
Simple. Back at the start of this generation, before Steam had gained traction, and even somewhat during the sixth generation, Western developers really started moving into the console space. Many of these Western developers, like Bungie, Epic, and Ensemble, had formerly been computer-exclusive, rather than console developers.
What we can see, as Western developers transition over to consoles, is that the market begins to explode. People begin to react in a big way, buying games from Western developers. They can't stop, won't stop. Games from Japanese or console-exclusive developers stop getting picked up as much.
Part of this, I think, is because Western games really brought in a bit more of a simulation focus, letting people do things they thought they ought to be able to do. The big game for this, of course, was Grand Theft Auto 3, which said "hey, here's a city, now do whatever you want in it." And sure, you couldn't do everything, but there was so much to do, to see, to try, and audiences lapped it up. It was incredible in that regard.
Japanese games kinda stayed where they were. Many Western developers who'd been console-only tended to either change or die. Console sales from Western devs ended up exploding in popularity, computer developers started moving over to consoles, and it looked like the PC was going to die, though that ended up not being the case.
Thing is, the pure simulations never really did that well. Oh, sure, Microsoft Flight Simulator is a franchise that's older than Pac-Man, and sold pretty well (until Microsoft went "oh no, the PC is going to die; even though FSX is selling really well, we should shut down ACES!"), but by and large, developers who focused on simulation mechanics, like Looking Glass Studios, ended up dying or changing drastically.
One of the reasons that a lot of older players have a hard time enjoying newer games is because newer games feature this curious overlap between simulation and arcadiness. Games feel washed-out, hollow. There's been a sort of massive merging between studios, with people trying to make sandbox arcade games. Largely, people think that games have to be games because that's the medium's name; on top of that, we've got a lot of developers throwing things like level progression into the mix because either other people are doing it and it seems cool, or because they want to manipulate players into spending time with the game/enjoying the game/whatever.
As such, we’ve got a development culture that, on computers, seems really focused on board games (a big discussion in and of itself), and everywhere else, we’ve got a bunch of AAA games that walk the line between arcade and simulation play, sometimes without a real focus on either one, other times, deliberately trying to be both at the same time.
One of these games is Skyrim.
On one hand, it’s a gamey game. It’s got XP leveling stuff, skills, stats, and the like. On the other hand, it’s trying to be a living, breathing world. Instead of random battles that you might see in a more abstract, game-driven game like Arcanum: Of Steamworks and Magick Obscura (an amazing game everyone should play), Skyrim has creatures and people that live their lives, walking around, doing things that living creatures would do. During the day, they do things that living things would do during the day. During the night, same deal.
Some people are really upset about this. They don’t like that the game has become simultaneously more abstract than previous games (by removing skills) and less abstract (by allowing players to level up based on what they do, rather than building characters in the traditional way). It’s a very jarring game to someone who expects a more traditional RPG. The same can be said of a game like Fable, which is even more of an unusual take on the RPG genre than Skyrim.
But it’s worth noting that these franchises do really well. Fable is Microsoft’s best-selling franchise in Europe (implied by Spencer when he said “Europe’s biggest franchise” regarding a reveal that turned out to be Fable Legends), and Skyrim has done crazy-good in terms of sales.
So why the FPS?
First off, okay, it doesn’t have to be a first-person shooter, but I think it should be a first-person game. The first-person camera puts the player’s eye physically closer to any interaction they can have with a game, it allows the player to identify with the game by not showing a third-person puppet that merely acts out the player’s commands, and it changes the way we think about experiences by limiting what we can see, by changing the way we perceive sound, and a bunch of other stuff. Plus, it can be enhanced by the Oculus Rift.
The first-person game, in other words, puts us in a virtual world more closely than anything else. It's going to make us say I did this, I chose to do this, this was my call. Any other kind of game might give us an excuse, but not the first-person game. A first person game can put us, actually us, in another world, and lets us do what we choose to do.
As such, the person who approaches a game and thinks “I should be able to make a bridge by chopping down a tree” is the person who may connect more, on an intellectual and emotional level, with Minecraft (indeed, Minecraft has proven hugely appealing to non-gamers and new gamers) than with other games. The person who sees a 3D world and thinks “I should be able to treat this like a real space” is very different from someone who has been playing console games for a very long time. They make a logical assumption about how people interact with worlds, rather than an assumption educated by the experience of playing games.
Essentially, people who don’t play games come to games with an expectation that these virtual worlds are, in a way, a kind of proto-holodeck. I know, I know, people have said that’s not possible, or shouldn’t be possible, but I feel that’s a perspective built on a lifetime in games and thinking about games in the context of games. It’s too inside the box; people from the outside, much like Orson Welles when he first made Citizen Kane, have new and exciting ways to think about games.
I think the holodeck idea only fails if you’re really focused on the fact that “game” is a word inside “video games”; if you focus on making a game, you’re inherently limiting yourself. The games that really resonate with people seem to be the games that put people in other worlds, tapping into the innate human craving for exploration. In other words, I feel that we need to divorce ourselves from the concept of game, and look a bit more toward crafting experiences.
Interaction is what makes games pretty unique, but not in a ‘choose your own adventure’ sense, which is largely the way games currently treat it. It’s possible to create introspection. Developers can use everything from gameplay mechanics to art to story to sound—every single element of a video game—to get players to do things and then make players reflect on those things. We’ve already started to see this happen in games like Spec Ops: The Line, Papers, Please, and Bioshock 2 (though it’s at least as old as Marathon Infinity (spoilers)).
The Citizen Kane of Video Games, in other words, is a game that is going to have broad appeal while putting players in a unique headspace and making them reflect on who they are and what they’ve done. Instead of being a game that tells us about the world, though it will do that, the Citizen Kane of Video Games is going to be a game that tells us more about ourselves than we’ve ever known.
Or it may never exist.
Citizen Kane didn’t make a lot of money. While it influenced many people who came after, for the better, and while it’s widely considered the best because it was both influential and remains practically technically perfect so many years later, it also came about in a time when people didn’t have as many distractions as they do now. Games are different beasties than film; it’s hard for individuals like myself to play many games from before 1996, and I know plenty of people who struggle to enjoy wonderful games like System Shock 2. We exist in a world where people are constantly talking about The Next Big Thing, where developers can receive immediate feedback via Twitter (Mike Laidlaw, Dragon Age: Inquisition’s boss man, told me he’d talk to the programmers about the possibility of adjusting the game’s FOV).
We live in a world where Citizen Kane may not be possible, where Grand Theft Auto 3 had a tremendous impact, where people are gradually accepting video games as low art, where indie developers exist and can begin making high art independent of the system. We don’t live in the Hollywood of the 1940s, where the only way to make a movie was to have a studio do it.
So maybe things change by degrees, and we don’t get a perfect analog. Maybe Gone Home* is our Citizen Kane, and instead of being the source of all change, it’s merely the tipping point, the watershed moment that things have built up to.
Maybe we just don’t have a Citizen Kane.
But hey, let’s hope developers make something truly excellent, something that speaks to what it means to be human, and something that positively influences the games that come later. That alone is good enough.
***This is a generalization. The dichotomy between 'game' and 'simulation' experiences is way more nuanced and complex than I'm making it out to be. I'm speaking in extremely, extremely broad terms.
*I don't own Gone Home. I can't afford it. Merely responding to what I've heard about the game from trusted friends, which is really encouraging stuff.