When Kotaku published their “The Best and Worst of the Last Generation of Gaming” piece, they said they were going to draw the line somewhere. For them, that ‘somewhere’ meant cutting out mobile platforms and the PC. I think this is problematic; no scientist says, when discussing the Cambrian Period, that they’re going to ignore the Cambrian explosion, after all. That explosion was kind of a big deal, and something that happened at the same time, and something that completely reshaped life as it was known at the time.
Me, I’m a PC gamer. I love my platform of choice—and it’s my platform of choice in part because of the tremendous, rapid… dare I say explosion of growth during this last generation.
So what is this essay?
This essay is me arguing not only that the PC has games that deserve to be counted among the best this generation has to offer, but that the PC should be recognized as the single most important platform of the generation. Maybe even of all time.
Let’s begin, shall we?
Generation shifts aren’t just marked by changes in hardware; they’re marked by philosophical design shifts as well. The previous generation was defined by Western developers switching from the PC to the console. The Elder Scrolls made its first console appearance on the Xbox. Valve began to rise to prominence. Epic started work on its first console titles. Remedy’s Max Payne series blew minds with its wonderful John Woo-esque gunplay.
Consoles were the easiest way to game, and a known quantity. PC titles—and Western development in general—have always been more simulation-oriented, and people value that a great deal. It’s one of the reasons why, for good or ill, Grand Theft Auto III defined the last generation: it was a game that let people do just about anything they felt they should reasonably be able to do.
Of course, the Exodus of PC developers to consoles was troubling, too. People started to believe that PC gaming was dying—consoles, after all, were where the new mainstream money was coming from. Where the PC had commanded excellent sales throughout the 1990s, by the early 2000s that niche wasn’t enough anymore.
So if the Sixth Console Generation was defined by the West’s move from PC to consoles and the explosion of growth within consoles as a result, the Seventh Console Generation was about developers coming to terms with the fact that they were no longer niche and trying to cater to larger audiences. The Exodus to consoles was nearing its apex, and it seemed, for sure, that the PC was dying.
For the PC gamer, this generation began worryingly.
Console publisher Square Enix took over the PC-focused Eidos. Microsoft shuttered some of the best studios of all time, but not before trying to get them to release their games on consoles. Ensemble, a studio that released games that sold in the millions back when selling in the millions was a pipe dream for just about everyone, got shuttered after releasing an RTS on consoles that actually worked. FASA, developers of acclaimed franchises like MechWarrior and Crimson Skies—both of which had been amongst the most-played series on the original Xbox Live—was shut down after developing the multiplayer-only* Shadowrun. And then there was ACES, the studio that worked on the venerable Flight Simulator franchise, a series so old that it predates not just Mario or Pac-Man, but Ultima. Gone in a puff of smoke. Epic’s Unreal Tournament 3 never managed to be quite as wonderful as previous titles in the series, and Gears of War on the PC performed poorly, so Epic seemingly abandoned the platform that had supported it for so long.
And then there was Windows Vista.**
Sure, we had FEAR. We had Half-Life. But let’s not kid ourselves—our best and brightest were jumping ship if they hadn’t already. Our OS was a mess. Building computers was expensive and time-consuming, especially where troubleshooting was involved. Consoles could auto-patch their games, and the PC… while it had the technology, didn’t use it that often. The genres that made us unique either ended up bastardized on consoles or dead entirely. The vague promises of a few PC games in development, like Alan Wake or STALKER, were hardly convincing.
Everyone knew the PC was on life support.
As the saying goes, evolve or die.
Looking back on it, we really shouldn’t have worried. The PC’s always been at the forefront of games development. It’s not just power—great things have come from constraints, after all—it’s the freedom. You can create anything on the PC. There are no gatekeepers. Valve wouldn’t have had a chance to debut Steam on consoles. On a platform where you can do anything, you're always going to trump the competition. Always.
Evolve or die? Evolution is the PC’s jam. Consoles evolve once every few years, like the ticking of some incredibly slow clock. The PC, on the other hand? It’s evolving by the second. The PC can’t die because the PC can’t stop evolving. Consoles had the PC on the ropes, but the PC never dies.
That’s why, in 2007, the second-best year in video game history after 1998,**** the PC came out swinging.
Nvidia released the legendary 8000 series of graphics cards, and suddenly hardware stopped being quite so much of a problem; CPUs hit 3 gigahertz and seemed to stay there. DDR3 would follow shortly after, making a graphics card the only real upgrade anyone would need.
But 2007 really got going with the release of some of the best, most exciting video games ever made. STALKER: Shadow of Chernobyl, the long-promised, long-delayed first-person survival epic, hit that March. Perhaps one of the buggiest games of all time, STALKER is also one of the most brilliant, best-designed games ever released. Call of Duty 4: Modern Warfare hit that fall, and what had once been a predominantly PC series hit the mainstream so hard it blinked. PC gaming followed that up almost immediately with Crysis—a game that didn’t just remind the world that PC gaming still had the best graphics, but did it with style. Sure, its detractors tried to argue that it was nothing more than a tech demo,*** but the truth is, Crysis and the subsequent Crysis Warhead had some of the best shooter gameplay ever.
And then, of course, Valve released The Orange Box.
As a developer, just how do you compete when one of the most acclaimed developers of all time drops three new games in your lap, and tosses in two more for free? How do you even deal with that? Here you are, being told that consoles are the future, and Valve goes “hey, everyone, check out our Steam platform. Here’s five games to check it out. Oh, and if you already own the two older games we have, pass ‘em on to your friends who don’t.”
The plan was brilliant. Not only did it introduce a crazy number of people to Steam through the highly-anticipated release of Team Fortress 2 and Half-Life 2: Episode Two, but it encouraged them to give games to their friends. It made sure that more and more people were going to check out what this Steam thing was. I was one of those people. A friend bought the Orange Box, gave me her copy of Half-Life 2, and introduced me to the fascinating, somewhat troubled, hilariously-bad-looking Steam. It was annoying, having to take a hard drive to school every day to play Half-Life 2, instead of simply playing the game at home, but I managed. Suffice it to say, The Orange Box included the second-through-fifth video games I ever purchased for myself, right after the original Bioshock. Not needing a copy of Half-Life 2, I passed my extra copy to another friend, who promptly opened a Steam account.
But Valve wasn’t done there: near as I can tell, their first Christmas sale hit in December of 2007—all games went on sale, no matter what they were, for between 10 and 50% off. That would only get better the following year.
Consoles, of course, wouldn’t take this assault lying down. Sure, the PC releases that year—I didn’t even mention The Witcher, Supreme Commander, Command & Conquer 3: Tiberium Wars, Penumbra: Overture, Medieval II, or World in Conflict—were some of the best, most interesting games released, but consoles had interesting plans in play. First, and most obvious, was Halo 3, which practically cemented consoles as the place to play multiplayer video games.
Second was the release of Braid in 2008.
Indie gaming had existed before Braid—Spiderweb Software’s Jeff Vogel had been developing indie RPGs since 1994—but it’s hard to deny that Braid was a watershed moment in the way people perceived games. It said “hey, remember when all those computer programmers made games in their basements? Turns out, people can still do that.” The Xbox Live release of Braid showed everyone that games could be independently developed and lucrative.
And what better platform for indie games than Steam?
Things started moving quickly after Braid. Steam embraced indie games. Facebook provided a way for small studios to release social games, introducing a whole new breed of gamer to the mix. The Asian free-to-play model really took off, and even Valve’s Team Fortress 2 aggressively went after it. League of Legends caught on like wildfire, the entire streaming trend took off… and Steam sales just kept getting better and better. While indie games sort of killed major modding efforts, Steam helped revive them with the Steam Workshop. BioWare’s biggest game for a while was the PC-influenced Dragon Age: Origins; the console-fueled Dragon Age II, by comparison, was a disaster so bad that retailers allegedly refused to sell a Game of the Year edition of the game.
Last November, Steam had 50,000,000 users. It’s growing at a rate faster than Xbox Live and PSN.
It’s not just Steam, of course. Good Old Games, later just GOG, became a haven for classic PC titles under the guiding hand of CD Projekt; it’s the best place for a retro gaming fix right now, in part because the fine folks at GOG make sure that the games all run on modern PCs. Windows 7 released to critical acclaim. The Humble Bundle made us rethink the way games were sold. Services like Mumble helped redefine voice chat standards. Kickstarter changed the way we thought about our relationships with publishers, both as consumers and developers. CryEngine, Unity, and Unreal all made themselves freely available to indie developers, empowering people to create their own experiences more than ever before. Amazon and Green Man Gaming have gotten into the business of selling keys with ludicrous discounts—why spend $100 to pick up Borderlands 2 and its Season Pass when you can spend $60 and get some extra credit on your account? HDMI ports on graphics cards, combined with Microsoft’s 360-for-PC controller and Steam’s Big Picture Mode, let PCs do what consoles do, but with a great deal more freedom.
Honestly, can you think of a single meaningful change in the way we play games that didn’t come from the PC? We’re going into the Eighth Generation thinking about the Oculus Rift, which came from the PC. The console with the biggest positive press is the Steam Machine. Publishers are looking at things like free-to-play and social gaming, which came from the PC. Hugely famous developers, like Cliff Bleszinski, are out there saying things like “[I’m] really realizing that there is a direct correlation, bugs notwithstanding, between how good your game is and how many unique YouTube videos it can yield. And that is one of the mantras I am continuing to hammer.”
And yeah, you’ve still got studios like Eidos Montreal and Bioware trying to make games that everyone will buy, rather than realizing that niches can be profitable.
Suddenly, the ultra-curated arcade experience offered by consoles rings rather hollow. It’s the Minecrafts and the modded Skyrims that capture our attention. Nobody really cares about watching a livestream of someone playing through a tightly controlled game like Uncharted, but give ‘em something more free-form, more open, and you’ve got their attention.
Gaming, in all its forms, is being influenced by the PC. The Wii’s motion controls did little, despite selling well thanks to a perfect storm of limited supply and “this can make your kid healthy” advertising from the kind of people who give parenting advice but don’t know much about games. PlayStation’s only contribution was a negative one: excessively cinematic titles like Uncharted and God of War. The single worst aspect of this generation—the “cinematic without really understanding cinema,” the crappy gameplay, the “hey, let’s take control away to show you how hard we worked on animation”—that’s Sony’s contribution to gaming this generation, and what a terrible contribution it is. Microsoft had some neat things going with indie games, but a single monolithic corporation can’t make the changes that allow things like Steam, Kickstarter, and the Humble Bundle to pop up. It may have standardized console multiplayer, but it’s a distant second place when it comes to “platform that’s changed the way everyone responds to games.”
And the thing is, despite all this, the PC is still a bit of an underdog.
Simulation-oriented design has largely fallen by the wayside in favor of the XP-driven, arcadey mechanics of console games. Plenty of traditional genres have died in favor of more mobile-driven play. Games journalists almost totally ignore the platform, unless they’re British or write for a PC-specific publication. They seem more excited to talk about indie games now that they’re arriving on the next-gen consoles than they ever were when those games were announced for the PC. Few people speak with nostalgia about older PC games, even though some of those older games warrant not just discussion, but more attention than many of their counterparts. You'll get a post about Kojima's Twitter, but most people don't even know who Craig Hubbard or Doug Church are.
The PC changed gaming. It led the way—it still leads the way—in everything, from gameplay to graphics to AI to new ways to experience video games. People abandoned it and it still proved it was the best. It’s influenced everything. And… at the end of the day, when Kotaku publishes articles about the best and worst games of this generation, they’re not going to talk about the brilliance of STALKER or The Witcher. They’re not going to write any significant retrospectives about Thief any time soon. They’re not going to analyze games in the context of their PC history.
But you’ll get another fifty-thousand posts about how much fun the latest Nintendo game is, and that really bugs me, not because the latest Nintendo game isn't fun, but because it feels like all this promise, this potential, this amazing stuff... nothing less than the future of video gaming as a medium, well, it gets left by the wayside. When I hear that the Editor in Chief of one of the biggest gaming websites out there (not Kotaku, mind) is finally going to build a PC—but he wants to ensure the experience is just like a console... it leaves me pretty sad.
And I get that, I do: we write about what we enjoy, and the truth is that most games journalists grew up with console games and just tend to value them more. PC gaming may be the future, but it’s also a niche. I shouldn't expect all that much, and maybe, maybe I should just be happy with the occasional retrospective series on No One Lives Forever. But... these are the best games I've ever played. The console titles I've tried... they're good, don't get me wrong, but when I go back and play games I've never tried before, it's the PC games that grab me, and the console games that make me raise an eyebrow and question just how thick the rose-colored glasses of the fans are. When I played Thief for the first time back in 2011, I wasn't expecting a game that would completely, utterly, and definitively annihilate every single stealth game ever made, from Metal Gear Solid to Splinter Cell to Assassin's Creed, but it does. They're not even worthy to be called stealth games, not in Thief's presence—and it was first released in 1998.