How the Video Game Industry Is Failing Its Fans

As little as five years ago, it seemed as if video games were on the precipice of a breakthrough moment. A generation that had grown up in front of a console had come of age and was ready to shrug off the criticism that video games were disposable garbage as just an older crowd’s dismissal of the new. Technology had advanced to the point where interactive storytelling was no longer clunky, and the possibilities seemed endless.

Japanese-style role-playing games such as the Final Fantasy series had scored major successes with releases that were almost nothing but plot, BioWare’s Mass Effect series had proven itself a space opera worthy of the Star Wars crowd, the Grand Theft Auto franchise had moved from senseless violence to telling real stories about the American experience, and Irrational Games’ BioShock had delivered a twist ending so shocking that even hardcore film snobs were forced to reckon with video games as a serious storytelling medium.

Now, the industry finds itself back at a crossroads as the proliferation of cheap and profitable multiplayer titles threatens to drown out sophisticated, plot-driven games. What happened? The answer can be given in a single word: “multiplayer.” A slightly longer version of that answer: The gaming industry faces the same pressures as any other entertainment business — rapidly changing technology, a fickle audience and production companies that want maximum profits from minimal investment.

As broadband technology improved and became commonplace, video games began to explore the possibilities it offered. The two most dramatic and impactful changes were the introduction of “downloadable content” (DLC) — that is, additions to a game that can be purchased after the initial purchase — and the expansion of multiplayer content, or games played with (or against) other players via the Internet. For some games this made immediate sense. Sports games and fighting games, which had often been two-player games even in their earliest iterations, now let you play a soccer match or kung-fu fight against a player in England, India or California rather than someone next to you on the couch.

Soon, other games began to experiment with this type of play. Most important was the Call of Duty series, which puts players in the shoes of a soldier in combat theaters throughout history. Though even the current version of the game has a single-player “story mode,” multiplayer quickly became the series’ most wildly popular (and most used) feature.

Initially, this multiplayer experience required physical proximity. Gamers would gather in rooms with multiple computers connected to one server, and some businesses even rented out space as “networked gaming rooms.” Faster, more reliable broadband rendered all of that unnecessary. Now, you can do it from your own couch.

From the developers’ point of view, it’s easy to see the appeal of this content, and in a way it mirrors the reality TV boom of the early 2000s. Both offer the key advantage of letting producers focus on what they can control (game play, art and sound design) while removing the most uncontrollable of the creative elements (the writer).

Writers become all but unnecessary. At this point, multiplayer content is fairly templatized. Drop a “squad” of soldiers/knights/wizards/space marines into a hot zone of Middle Earth/Mars/WWII Germany/Afghanistan and fight the various Orcs/Nazis/Aliens/Insurgents the game throws at you. No additional plot, and only the barest need for dialogue. Simple as that.

Given the number of ambitious titles that have flopped over the past five years, you can understand developers’ hesitation to invest in such projects. Lest there be any doubt, video games are a massive and massively profitable industry, pulling in an estimated $46.5 billion in sales last year. But developing a top-of-the-line title for one of the big studios is also a drain on time and resources that can be on par with the production and promotion of a major motion picture. When a AAA title flops, money is lost, and usually lots of it.

Titles such as Rockstar’s exceptionally ambitious but ultimately boring L.A. Noire, or the recent flop The Order, which was scandalously brief, were the kind of failures that can kill a company. The failure of Big Huge’s Kingdoms of Amalur: Reckoning was so harsh that it not only took down the company, but also put a decent-sized dent in Rhode Island’s economy.

And then there was Mass Effect 3. Mass Effect 2 had been an almost instant success and is widely considered one of the greatest games of its generation. The follow-up, released in 2012, was widely regarded as an improvement in terms of game play. The problem was that, to fans of the series, the ending felt like a betrayal. A video game that was extremely fun to play and sold lots of copies was branded an industry-changing failure because people didn’t like the ending.

Imagine that in the NES days. “The princess is in another castle? That’s just B.S., I’m never playing Super Mario Brothers or any other Nintendo game ever again.”

But gaming and storytelling have both come a long way since the ‘80s. And so, in the same way that television turned to the reality show and films turned to comic books, game developers (those who haven’t completely retreated to the simple and extremely profitable world of mobile gaming) use multiplayer to avoid the risks of storytelling. The problem is that some gamers still want a story and are still invested in the medium — and their backlash is growing.

At the time of the release of the current generation of consoles, much fuss was made about the systems’ requirement that they be constantly connected to the Internet. This felt, to some, like an invasion of privacy, and to others like a naked money grab. Sony and Microsoft both relented to an extent, but many games remain unplayable without a network connection. Next came the fury around the release of Titanfall, one of the first major AAA titles for the Xbox One, which was exclusively multiplayer and extremely reliant on the health of its network.

As more and more titles get multiplayer releases, the protests grow. In the spring, it was announced that Star Wars Battlefront would not have a single-player campaign at all; the only way to play the game would be online. Fans threatened a boycott. Similar threats followed Nintendo’s announcement at this year’s E3 show that its next Metroid release would be a multiplayer title (without the iconic Samus).

The ultimate solution for television (and indirectly for film) was to back away from the traditional networks. Through cable and streaming, quality scripted television emerged and succeeded again. There was no need to bow to corporate sponsors, network censors or suits worried about storylines being “too dark.” (Can you imagine a ‘90s-style, focus-grouped Game of Thrones?)

There really is no comparable path for video games. Indie studios have gained a lot from the technological advances, but they still don’t have the resources or access to creative talent to produce a truly competitive blockbuster title.

There is nothing inherently wrong with multiplayer games, in the same way that there was nothing inherently wrong with reality TV. The problem is that studios increasingly have no incentive to release any other kind of title, leaving certain hardcore players to worry that the kind of game they love will soon be a relic of the past — and that they will be left talking about the 2000s the way film buffs talk about the ‘70s. “Remember when movie theaters showed The Godfather and Taxi Driver instead of Teenage Mutant Ninja Turtles?” becomes “Remember when they made games like Mass Effect?”
