
Wednesday, March 28, 2018

Paying to Win, Progress, and Level Up: The Road to Industry Suicide?

From top left to bottom right: Candy Crush Saga, Farmville, Dungeon Keeper Mobile, Call of Duty: WWII, Assassin's Creed: Origins, and NBA 2K18. These games contain in-game currency that allows players to purchase extra lives, faster progression, and increased chances of getting desired items via randomized loot boxes.

In an age long since past, video games were an avenue of pure adventurous escapism for those sitting on the living room or bedroom floor across from the TV with a controller in hand, hunched over their computers with fingers flying across the keyboard, or hanging out with friends at the local arcade. The moment we fired up our PCs and consoles or inserted our quarters into the arcade machines, gamers of my generation, myself included, were transported into various worlds: uncovering treasures, slaying monsters, discovering secrets, playing through stories rivaling those of other media, feeling like badasses, and experiencing the thrill of chasing a high score to show off our skills. These times were what we would call the Golden and Silver Ages of gaming. But since the latter half of the seventh console generation, we have entered what many of us would probably call the Dark Ages.

Those Dark Ages are marked by the increased use of season passes, content locked behind paywalls, downloadable content (commonly abbreviated as DLC), pre-orders sold on little more than the promise of exclusive content, final products inferior to what was shown at trade shows, incomplete releases, pricey special editions, always-online connectivity, microtransactions, and (much more recently) loot boxes. Given that video games have become one of the largest entertainment industries in the world, with some budgets rivaling those of Hollywood movies, it should come as no surprise that major developers and publishers look for ways to squeeze more money out of it. Yet that pursuit seems increasingly driven by a handful of business practices that appeal to those running the companies for their profitability while appearing to ignore the financial well-being of their customer base. One of those practices is, of course, the use of microtransactions: payments for additional content in a game that has either been downloaded for free or bought at full retail price.

To understand how this practice became profitable and why it has been building consumer resentment, one must look at its history. The free-to-play business model was pioneered in the late 1990s and early 2000s in the MMORPG PC marketplace by a South Korean game company called Nexon. Among the titles made under this model, one of the company's most notable and long-lasting is MapleStory, a 2D side-scrolling MMORPG. In addition to its basic RPG features, the game featured an in-game shop called the Cash Shop, in which players could buy cosmetic items and pets for their virtual avatars for small amounts of real-world money, which led to the term 'microtransactions.' This feature, coupled with the free download, garnered up to 72 million users and up to $16 million in revenue within around four years of its initial 2003 release in its native South Korea. That sort of success led Min Kim, CEO of Nexon's American branch, to hail the free-to-play model as "the future of gaming." Of course, that claim has since been put to the test as other companies experimented with and implemented the model in their own PC games, smartphones, social media platforms like Facebook, and even big-budget AAA video games.

Screenshot of MapleStory from MMOs.com.

In addition to Asian game companies competing with Nexon, Western companies sought their own approach to the business model Nexon pioneered during the mid-to-late 2000s. As social media and smartphones saw widespread adoption, some companies started developing games specifically for those platforms. Two of the most notable are Zynga, known for its Farmville series, and King, developer and publisher of Candy Crush Saga. Rather than just selling cosmetic items, Farmville, Candy Crush Saga, and other Western mobile and web-based games made money from virtual goods and advertising, with the former generating the most revenue. The virtual goods provided benefits ranging from extra lives to faster progress, such as speeding up army-building in a strategy game. These microtransactions, combined with appeal to the casual market, social-psychological manipulation, and addictive game design, have led to a multi-billion-dollar mobile-phone/web-browser bonanza spearheaded by Farmville, Candy Crush Saga, Clash of Clans, and other titles of what would become known as free-to-play (F2P) games, freemium games, and (more infamously) pay-to-win games. According to one report, 79% of iOS and Google Play app store revenues come from microtransactions in F2P games. While this business model has seen, and continues to see, large financial success, it has caused a great deal of controversy on economic, legal, and ethical grounds that continues to this day.

From left to right: Candy Crush Saga, Farmville, and Clash of Clans.

One part of the controversy is how some F2P companies have used their games to generate revenue. In Farmville's early days, Zynga ran advertising offers that pushed players to buy in-game currency with real-world money in ways that resembled scams. Shortly after being called out, Zynga removed those offers, and the company's co-founder and CEO Mark Pincus admitted to doing "every horrible thing in the book just to get revenues." Some F2P games have faced accusations of duping young children into using their parents' credit cards on in-app purchases without realizing they were spending real money. One such game was The Smurfs' Village, released for the iPhone and other Apple devices in late 2010; in one incident in February 2011, an 8-year-old second grader unknowingly racked up a $1,400 bill on her mother's Apple account by buying Smurfberries. Another problem with most F2P games is their tendency to generate revenue from only a small percentage of players known as "whales," people willing to spend hundreds if not thousands of dollars on in-app purchases. In one extreme case, a whale spent around $1 million on in-app purchases in Game of War: Fire Age, a mobile game mostly known for its 2015 Super Bowl commercial featuring Kate Upton. On a more disturbing note, a senior producer at one F2P company admitted to working with teams that focused on gathering user data to target as many whales as possible rather than on game design.

Left: An advertisement for Game of War: Fire Age featuring Kate Upton. Right: A screenshot of Game of War: Fire Age.

On the other side of the coin, microtransactions first started appearing in traditional video games in early 2006, among the first being the infamous Horse Armor Pack for The Elder Scrolls IV: Oblivion on the Xbox 360 and PC. As the years passed and the F2P market boomed, the use of microtransactions by AAA game companies increased drastically as a way to cover rising development costs and to combat the second-hand games market driven by retailers like GameStop. They have been selling not only cosmetic items like weapon skins but also items that affect players' experience in single-player campaigns (in-game resources, one-time consumables, faster progression, custom characters) and multiplayer modes (better weapons, random game-altering supply drops, in-game insurance). Things that could once be unlocked entirely through gameplay, like additional characters and costumes in fighting games, are now being offered for additional money. In some games, players can get these items by purchasing and opening loot boxes or crates that dispense their contents at random. This practice, like the F2P business model, had its roots in the Chinese and Korean MMO and F2P markets and was adopted by Western game companies in the early 2010s as a way to generate more revenue, with games like Team Fortress 2, Star Trek Online, and Lord of the Rings Online. As loot boxes and microtransactions became widespread across AAA video games over the years, disapproval and resentment have been building among the gaming community and average consumers, some of whom have gone so far as to lament what seems to be the decline of one-time purchases of complete gaming experiences, saying that microtransactions are ruining video games. Only recently did that resentment reach a boiling point with the release of games containing excessive amounts of loot crates and microtransactions, ranging from pay-to-progress to pay-to-win, such as Star Wars: Battlefront II (I am, of course, referring to the one developed and released by EA on the PS4, Xbox One, and PC in November 2017, not the game originally developed by Pandemic and published by LucasArts in the early 2000s on the PS2, Xbox, and PC), Middle-earth: Shadow of War, Destiny 2, Assassin's Creed: Origins, Mass Effect: Andromeda, and Forza Motorsport 7.

From top left to bottom right: Star Wars: Battlefront II, Middle-earth: Shadow of War, Forza Motorsport 7, and Assassin's Creed: Origins. These games contain a variety of in-game purchases, many of which break the games' balance and encourage players to pay in order to progress faster.

The eruption of protest against video game microtransactions began late last year, when players complained to EA about the multiplayer mode in Star Wars: Battlefront II prior to its release on November 17. The complaints concerned the use of microtransactions and loot boxes in the multiplayer progression system, particularly the option to pay extra money to unlock hero characters like Darth Vader and Luke Skywalker faster than by grinding. In an official Reddit post on November 12, which would rapidly go down as the most downvoted in the website's history with around 664,000 negative votes (and become the subject of a PC modder's joke), EA claimed that this approach was intended to give players "a sense of pride and accomplishment." By release day, EA had made some last-minute alterations to the multiplayer progression system by turning off all in-game purchases, but the Pandora's box of consumer resentment and controversy had already been opened. Professional game journalists and YouTubers took note of the frustrating progression system in their reviews, most of which were negative. Fans protested the game across the Internet by, for instance, flooding its Metacritic user reviews with negative scores. A few days prior, one player calculated that it would take 4,528 hours of playtime or a payment of around $2,100 to unlock all of the base multiplayer content. Concerns were raised as to whether in-game purchases lead to gambling addiction, especially among a small percentage of vulnerable people, as attested by a 19-year-old who spent $10,000 on such purchases across a couple of microtransaction-driven games. The backlash against Battlefront II's loot box system was so fierce that a few state governments caught wind of it, with Hawaii State Representative Chris Lee labeling the game a "Star Wars-themed online casino, designed to lure kids into spending money." If that wasn't enough, several other jurisdictions, including Belgium, the United Kingdom, Sweden, and the Australian state of Victoria, have launched investigations into whether loot boxes constitute gambling. No definitive conclusions have been reached, and those investigations are still ongoing. After over four months of backlash from the gaming community and scrutiny from a few US senators and representatives, EA overhauled Battlefront II's entire progression system, removing loot boxes entirely, among other changes. Of course, the whole Star Wars: Battlefront II fiasco is just the tip of the iceberg when it comes to the larger issue of microtransactions in video games.

Arguments for and against the use of microtransactions and loot boxes have been polarizing, to say the least. On the 'for' side, there are people who say that gamers are "overreacting" to the issue and that publishers should simply raise prices; that microtransactions now make up more revenue than full-price releases; that the controversy is having no impact on game sales (at least not immediately); that microtransactions are necessary to cover rising development costs; that this is the future of gaming whether consumers like it or not; and that any concern about a potential economic crash resulting from this practice is just apocalyptic rhetoric. On the 'against' side, it is argued that microtransactions and loot boxes are destroying video games; that players are being turned into payers; that games are being turned into never-ending products designed to churn out revenue; that some companies, like Ubisoft, are looking to turn video games into 'games as a service'; and that this business practice is becoming more predatory and intrusive, judging by matchmaking patents filed by Activision and psychological research funded by EA into difficulty adjustment and engagement as ways to entice players into making in-game purchases in online multiplayer games. All these polarizing arguments lead to a couple of questions which, as far as I know, have no definitive answer. Is the AAA video game industry's continued reliance on microtransactions and loot boxes going to lead to a crash on a scale equivalent to, if not greater than, the one that occurred in 1983? Is the gap between the industry's desired quarterly earnings and consumer dissatisfaction becoming wide enough to drive a lot of gamers away and put a significant number of game companies out of business?

Of course, this is not to say that every video game company is or will be jumping on the microtransaction bandwagon. As the controversy surrounding Star Wars: Battlefront II was heating up, CD Projekt RED, a Polish game company best known for The Witcher series of action RPGs, took to Twitter to slam the loot box craze, saying that its upcoming sci-fi RPG Cyberpunk 2077 will be an "honest" gaming experience like The Witcher 3 and that the studio will "leave greed to others." CD Projekt RED recently confirmed that Cyberpunk 2077 will not have any microtransactions in its single-player campaign. When asked on Twitter whether the upcoming PS4 title God of War will have any microtransactions, director Cory Barlog answered, "No freakin' way!!!" And just because microtransactions make some companies a lot of money doesn't mean that video games cannot be profitable without them. Capcom's recent title Monster Hunter: World sold 7.5 million copies on the Xbox One and PlayStation 4, in both physical and digital sales, in its first five weeks after release. Prior to that, the game's developers went on record saying that including loot boxes and microtransactions "wouldn't make much sense" as doing so would "create friction" with players. Ninja Theory's latest game, Hellblade: Senua's Sacrifice, managed to turn a profit after selling 500,000 copies on the PlayStation 4 and PC in less than four months after release; a version for the Xbox One was recently announced. Late last February, Guerrilla Games, best known for the PlayStation-exclusive FPS series Killzone, released a sci-fi action RPG called Horizon Zero Dawn on the PlayStation 4. By its one-year anniversary, it had sold more than 7.6 million copies. There are also a large number of independent titles that have sold very well without microtransactions or loot boxes, such as Hotline Miami, Undertale, Stardew Valley, Cuphead, and Five Nights at Freddy's.

From top left to bottom right: Monster Hunter: World, Hellblade: Senua's Sacrifice, God of War (PS4), and Cyberpunk 2077. The first two games have done quite well without microtransactions. The last two are promised not to have microtransactions in their single-player campaigns.

You might say, "Well, that's just your opinion." Perhaps this is all just that: the opinion of one man. But that should not excuse business practices that deliberately entice, or even force, consumers to keep spending money on video games beyond the price they initially paid at the store or the free download they agreed to. If done in a consumer-friendly way, the free-to-play business model can allow paying customers to expand their experience of a game while letting non-paying customers enjoy it without spending anything, as games like MapleStory did when they launched. But it's the games that use the model to monetize in ways that frustrate consumers and waste their time and money that tend to get the most news and social media attention. As a gamer of around twenty years, I value my time and money, and I constantly search for the games whose quality and reputation make them worth both. I am willing to pay to play the role of a martial artist, a supersoldier, a superhero, a pilot, a warrior, or an adventurer as I experience a fighting tournament, an intergalactic war against a massive alien force, a covert operation deep behind enemy lines, a monster-infested house, city, or landscape with limited resources, or a grand quest accompanied by reliable and trustworthy companions. I will not pay to level up, to progress, or to win. As an aspiring game developer and designer seeking a career in the field, I want to create video games inspired by the ones I've been playing and sell them to interested and willing customers. With the right know-how, I'm confident that I can build a strong source of income without resorting to draining wallets and bank accounts; without needing the most advanced photorealistic graphics possible, intrusive advertising, Hollywood-style theatrics and effects, or anything else that ends up costing more than the making of the game itself. In my opinion, adopting this sort of mindset is the best way for the video game industry to minimize or avoid the risk of an economic bust or, to put it another way, to stay off the road to industry suicide.
