

‘Crunch’ Is No Cheat Code for Better Video Games

The arms race between studios is burning out developers, and nobody is benefiting except the bosses.

Months after The Last of Us Part II (TLoU2), the latest video game by studio Naughty Dog, came out in June 2020, gamers kept finding new animations and gameplay mechanics that proved how much effort the developers had put into the smallest details. Blood would melt snow, glass shards would stick to enemies’ faces if you tossed a bottle at them, and exploded zombie bits would stick to the ceiling before falling off. For less gruesome examples, you could look for tadpoles in a dirty, secluded hot-tub, or gawk at characters’ realistic hair movement. Such a level of detail makes sense: game developers want players to be immersed in the environments they’ve created, so they’ll try to make the environments as lifelike and responsive as possible.

Except the vast majority of players will never even encounter most of these details. Those tadpoles? I only ever saw them because an astute Redditor pointed them out. On my next playthrough, I went out of my way to find them, and sure enough, there they were, in a rundown post-apocalyptic backyard, swirling in a filthy little hot-tub. “Huh, that’s neat,” I probably muttered before continuing the game and forgetting about the tadpoles until I wrote this article.

Even in 2021, the internet was in awe of TLoU2 yet again: in a gameplay video, a player runs up to an enemy they’ve just shot and, with a timely button-press, grabs the enemy’s ammo while it’s still mid-air—a mechanic the community had not noticed for over half a year. And since the game is so painstakingly animated, the character actually reaches for where the ammo is and turns her head to follow it.

Mind you, in my three playthroughs of TLoU2, I never once noticed this myself. You could argue that’s a positive thing. Shouldn’t a good game be able to react to any player input, regardless of how unlikely that input might be? What’s the harm in making the experience more immersive, particularly with these little moments of serendipity?

But what’s left unsaid is the labor necessary to produce all these details. In fact, bullshit reactionary controversies regarding TLoU2—about a lesbian couple, a muscular female character, a trans character—overshadowed what ought to have been the actual controversy: work conditions. By the end of 2018 (over a year before the original release date in February 2020) the staff at Naughty Dog were in “crunch,” working grueling schedules for prolonged periods, as much as a hundred hours per week. Though any project in any industry might require some late nights to pull off on time, the phenomenon has become a norm at game studios, to the point that it’s been dubbed “crunch culture.” Think 12-hour days and coming in on weekends, usually done by salaried workers who don’t make overtime pay—though the boss might bring in some sandwiches! And not just at the end of the project when deadlines loom, but for months, or even years.

The human cost is difficult to overstate. As journalist Jason Schreier wrote about the TLoU2 production for game news website Kotaku:

Employees would come in wearing sick masks so they could keep working even with bad coughs (before the recent coronavirus outbreak). They’d skip meals—or showers. One developer told me they had seen people so shackled to their desks that they wouldn’t even take the time to go to the kitchen and grab the free crunch dinners.

(Okay, maybe forget about those sandwiches I mentioned.)

When the game’s release was initially delayed by three months (a later one-month delay was due to the pandemic), the developers didn’t breathe a sigh of relief. After all, the delay wasn’t meant to relieve the overwhelming schedule, but simply to extend it. A developer interviewed by Schreier said, “The first thing that they wanted to reiterate is that we aren’t slowing down the pace.” In its quest to push the technological and storytelling bounds of the medium, the studio did not relent, despite the stress on its workers.

Naughty Dog ought to have known better. After facing horrible crunch during production on Uncharted 4, they’d expressed a desire to avoid such stressful schedules going forward. However, the crunch on their very next game, Uncharted: The Lost Legacy, turned out even worse for some developers. Then there was TLoU2, a game that, I admit, looks stunning and was an absolute joy to play. But it could’ve been shorter, and worse-looking, and filled with fewer little details, and it still would’ve been great. (The organic conversations between enemies were awesome; the tadpoles almost no one notices, less so.) And maybe the developers would’ve been able to go home at a reasonable time.

Naughty Dog isn’t unique in this regard. Crunch culture doesn’t refer to the practices within any single company, but rather those of the entire industry. Despite promises by Polish game studio CD Projekt Red that they’d avoid crunch during their latest production, Cyberpunk 2077, game news website Polygon reported that “Employees at CD Projekt Red […] have reportedly been required to work long hours, including six-day weeks, for more than a year.” The “non-obligatory crunch policy” proudly touted by studio co-founder Marcin Iwiński became quite obligatory for some employees, and the same crunch practices extended to the studio’s overseas contractors.

Another popular studio, Rockstar Games, has come under fire for its reliance on crunch. For Red Dead Redemption 2, a game similar to TLoU2 in being packed with details that most players will miss, a seemingly innocuous change—putting black bars on the top and bottom of cutscenes—added weeks of work for several developers. When the schedule couldn’t budge, they turned to crunch instead. For the original Red Dead Redemption, the schedule was so exhausting that employees’ spouses published an open letter in 2010 criticizing the company for damaging its workers’ health.

In a similar incident six years prior, the partner of a developer at Electronic Arts used her blog to lambast the company for its labor practices: an initial “pre-crunch” of 48-hour workweeks (meant to alleviate later workloads) turned into 72-hour workweeks, and finally into 84-hour workweeks. The “EA Spouse” post, as it’s come to be known, led to greater awareness of crunch culture and several successful class action lawsuits against the studio. But little changed in the industry.

A 2017 survey by the International Game Developers Association (IGDA) reported that 51 percent of respondents experienced crunch in their jobs, with an additional 44 percent working long or extended hours that they did not refer to as “crunch.” Forty-three percent of respondents had experienced crunch more than twice in the previous two years—which, considering the long production cycles of video games, often implies multiple crunch periods for the same game (contrary to the occasional retort that crunch is merely a feature of the final stretch before release).

Unsurprisingly, this additional labor is rarely properly compensated. The survey reports that over a third of respondents (37 percent) received no compensation at all. Some received free meals or future time off, while only 18 percent received paid overtime. And overtime it most certainly is: 37 percent reported working 50-59 hours per week, 29 percent reported 60-69 hours per week, and 14 percent reported working over 70 hours per week.

As the aforementioned Kotaku journalist Jason Schreier wrote about a developer on Legend of Heroes: Trails in the Sky, “Fresh off nine months of 80-hour work weeks, Jessica Chavez took a pair of scissors to her hair. She’d been working so hard on a video game—14 hours a day, six days a week—that she hadn’t even had a spare hour to go to the barber.”

The health effects of working under such conditions are hardly surprising. Chavez also “dropped 10% of her body weight during this period.” Nathan Allen Ortega, a former Telltale Games employee, was so overstressed while working at the studio that he developed an ulcer and started coughing blood. A developer at BioWare—the studio behind Anthem and Mass Effect: Andromeda—described an “epidemic” of depression and anxiety during production. Another described mental breakdowns, or “stress casualties,” as a common occurrence. The “EA Spouse” blog post included descriptions of the writer’s partner suffering from chronic headaches and stomach aches. The open letter regarding Rockstar specified stress, fatigue, and depression symptoms as results of the studio’s practices. Human bodies are quite literally not built to withstand crunch.

And yet, these practices aren’t even a guarantee of high-quality output. A number of high-profile games with crunch problems have still been critically panned; the two BioWare games I’ve mentioned, Anthem and Mass Effect: Andromeda, were both flops.

Cyberpunk 2077, while widely praised for its story and graphics, was so replete with glitches upon release that a review described the game as “virtually unplayable: rife with errors, populated by characters running on barely functional artificial intelligence, and largely incompatible with the older gaming consoles meant to support it.” All that was the product of months (and months, and months) of crunching.

As a 2015 study titled “The Game Outcomes Project” showed by comparing developer responses about crunch to their games’ Metacritic results (which aggregate critic reviews), productions that involved crunch actually tended to result in worse games. Which makes sense: overworking in stressful conditions might initially boost productivity, but is unlikely to foster a long-term creative environment.

While there are exceptions—Naughty Dog has found great success with the Uncharted and Last of Us series—I can’t help but wonder about all the great games that could’ve been made if those developers had stayed longer in the industry. After all, this is a line of work notorious for its high turnover. In the 2018 Game Developers Conference survey, nearly two-thirds (63 percent) of respondents had been in the industry for 10 years or less. The IGDA survey concludes that the “prototypical” game developer is a “32 year old […] who does not have children.” Which isn’t surprising, since it’s difficult to participate in crunch with an aging body and kids at home. To quote developer John Veneron, who’s worked on the Subnautica series and is now with Unknown Worlds Entertainment, “The older I get, the less tolerant I am of making those sorts of sacrifices.”

To reiterate: game developers work unreasonable schedules and receive poor compensation while suffering health issues, all for a potentially crappy product; these conditions then drive them out of the industry.

So why are studios still turning to crunch? Stardock developer Derek Paxton identifies two main driving forces behind it: external pressures (from publishers, stockholders, financial limitations, and so on), and poor project management, which often stems from studios favoring the ambition and idealism of designers over a pragmatic approach that wouldn’t overwhelm the developers.

The latter is especially interesting, since productions regularly seem to bite off more than they can chew: studios have in recent years opted for a “fewer, bigger, better” approach, pursuing a handful of oversized projects in hopes of finding the next big thing. In 2010, Electronic Arts put out 48 games; in 2018, they were down to 12. In the same years, Sony went from 46 to 20, Activision Blizzard from 26 to 10, Microsoft from 21 to 6, Take-Two from 19 to 9, and so on. There are a few notable exceptions (both Sega and Square Enix had more output in 2018 than they did in 2010), but the overall trend points toward fewer and fewer games, as studios push for lifelike graphics and huge open worlds packed with content. Such overambitious scope requires immense effort from developers, regardless of whether they can meet the expectations. Apparently, a developer on Cyberpunk 2077 initially thought the 2020 release date was a joke, expecting the game to be ready no earlier than 2022. The decision, of course, wasn’t up to them, and the studio kept pushing.

Though it’s often easy to understate the amount of work that goes into creating art, with games—whose productions require armies of programmers and designers—the labor ought to be obvious. Games are made off the backs of workers, and crunch culture promotes horrible work environments.

And sure, some people enjoy crunch. Game writer Walt Williams, who worked on the Star Wars, BioShock, and Borderlands franchises, wrote a controversial piece titled “Why I Worship Crunch.” In it, he admits that crunch is exploitative, unnecessary, and harmful. However, he also offers perhaps the best description of how intoxicating crunch can feel in the middle of production: an out-of-body experience, “the disgustingly sweet taste of Red Bull,” the adrenaline of being in a fight, the destructive behavior of guzzling free soda and “knowing that each delicious slurp sells off tiny pieces of your soul.”

Frankly, reading Williams’ paean to crunch is difficult. He is afraid of stepping away from work for too long because he might start questioning why he’s doing this to himself. He describes having chronic muscle pain and a sleep disorder; when he does get to sleep, he suffers horrible nightmares. He writes:

I’ve edited scripts in ICU rooms, responded to emails while begging lovers not to walk out the door, sent brainstorming lists during the birth of my child. I held my grandfather’s hand while he passed away, then went into his office and wrote text for mission descriptions.

But Williams also tells a lie essential to the status quo of crunch: “It’s only crunch if you don’t want to do it.” These practices are destructive regardless of whether individuals choose them—a competent manager needs to keep their staff from burning out. And since threats and social pressure are rarely explicit, it’s not always obvious whether crunch is voluntary in the first place.

When he finally wonders whether the crunch was worth it, Williams concludes that “the price was fair.” Whether he’s right or not (I lean to the latter), he’s glorifying a toxic work environment that affects his colleagues.

Even if they don’t enjoy it, a lot of developers see crunch as an opportunity to prove themselves. Kevin Agwaze, a developer at U.K.-based Studio Gobo, recounts going to a specialized school that taught coding for game development, where the workweek was brutal and the dropout rates high—a trial by fire for the industry ahead. As Take This, a mental health advocacy organization, puts it, “some in the industry consider crunch a rite of passage, a kind of hazing to demonstrate you are tough enough, dedicated enough, and passionate enough to be in game development.” Which is a fascinating bit of circular reasoning: studios impose crunch so developers can earn a spot on their productions, but developers only need to earn it this way because studios impose crunch.

At Naughty Dog, they’re open about expectations of crunch. During hiring, they make it clear they’re looking for dedicated perfectionists. Afterward, they expect the same quality of work from junior contractors as they do from senior employees (read: crunch or you’re gone). Game designer Byron Atkinson-Jones has told the story of his first job with a game company—when he and a group of colleagues left on time one day due to burnout, they were warned the next day that they were “naturally selecting” themselves (again: crunch or you’re gone). Another developer described having a rolling contract without clear renewal conditions, and walking on eggshells to please their bosses (in case it wasn’t clear: crunch or you’re gone).

The large-scale layoffs at game companies—often a result of certain workers being deemed “disposable” during lengthy productions—only make these work environments more stressful. In early 2019, Activision Blizzard fired roughly 800 employees even as it bragged about record revenues and awarded a $15 million signing bonus to a high-ranking executive; the company’s CEO had just made $31 million. In a more recent incident, the same CEO received a staggering $200 million bonus while laying off as many as 190 employees—though they got gift cards! Other prominent game studios like Electronic Arts, ArenaNet, and Goodgame Studios have had mass layoffs in recent years. With so little job security, a game developer has to please their boss so they’re not next on the chopping block (for the people in the back: crunch or you’re gone).

These threats, implicit or explicit, consistently equate working long hours with passion, and leaving at a reasonable time with a lack of commitment. In her book Work Won’t Love You Back, labor journalist Sarah Jaffe frames the “labor of love” rhetoric as a tool of exploitation. If jobs can be viewed as extensions and realizations of our personal passions, we can be convinced to ignore a poor work environment. “Do what you love and you’ll never work a day in your life” has a nice ring to it, but it is fundamentally false: work, no matter how enjoyable, is still work, and ought to be compensated as such.

Similarly, Jaffe criticizes the rhetoric of “workplace as family” (notably prevalent in the gaming industry, where one studio even refers to itself as a “fampany,” an obnoxious portmanteau of “family” and “company”). You might find yourself befriending your coworkers regardless—but Jaffe’s point isn’t to criticize solidarity between colleagues, nor does she mind people enjoying what they do. Rather, she asks us to be wary of how these rhetorical tools might lead us to accept exploitation. Personal gratification isn’t a replacement for fair wages and reasonable work hours.

Something sinister lurks beneath the cheerful veneer of fampany-speak. Studio Gobo has promoted “Gobo Friday Lunch,” where employees can bond over “a warm home-cooked meal.” Again, company-provided lunch isn’t necessarily the point of criticism here—but it isn’t a “home-cooked” meal. Your workplace isn’t your home, even if pandemic-related work-from-home practices may have blurred the line. Of course, if you were repeatedly told that your workplace was your home, your colleagues were your family, and your job was actually a dream come true, it might not seem that strange to stay at the office after hours. After all, what are sleepless nights compared to fulfilling your dreams alongside family? (You love your family, don’t you?)

Jaffe draws a line from the modern game industry’s crunch culture to early ARPANET developers programming rudimentary Dungeons & Dragons adaptations and distributing them across the network they were building. The games allowed them to play with the network, learning its capabilities without it feeling like real work—what would later be termed “playbor” (play + labor). Their long nights at the office, along with the way their adventure games merged play and work, blurred the distinction between work and home. Emerging software companies (and game companies in particular) quickly adopted the playbor practice, and crunch soon followed.

Which is not to say that all game studios are the same. Ironically, even though you might expect small indie studios to be rife with crunch practices due to their limited financial resources, it’s mostly large studios that are the culprits—Electronic Arts and Activision Blizzard are as big as they get, and Naughty Dog has the pedigree of prestige television. Indie studios, on the other hand, seem to have figured it out.

Supergiant Games is the best example: since 2011 they’ve released four critically acclaimed games, and the latest, Hades, has been rubbing shoulders with TLoU2 at award shows. And yet, they also prioritize “sustainability” and a work-life balance for their employees. In case that’s a little too vague, they also offer unlimited time off, mandate rest days, and require employees to stop sending emails past 5 p.m. on Fridays. At Crispy Creative, co-founder Kylan Coates insists on properly compensating employees, which includes paying contract workers for design tests (a practice which isn’t commonplace in the industry).

But the split between indie and large studios isn’t black and white; some indie studios have been criticized for poor working environments, and Rockstar Games has taken steps toward alleviating crunch after the backlash over Red Dead Redemption 2. Regardless of a studio’s size, game developers can’t rely on benevolent bosses. More often than not, companies will choose profits over people. Ideally, game developers would control the (cough means of production cough) studios themselves, which is already true for some smaller developer-owned cooperatives: Talespinners, Motion Twin, and Future Club, among others. At these companies there are no bosses. The workers might elect managers among themselves, but they all control the studio together, rather than being beholden to the whims of stockholders or layoff-prone corporate overseers.

Utopian visions like the BioWare Cooperative or Naughty Dog Worker Collective are far off, though. For every worker-owned Motion Twin, there’s a significantly larger CD Projekt Red, with a CEO admitting crunch practices to investors and game directors insisting on “non-obligatory” crunch while mandating it behind the scenes. What the game industry needs right now is organized labor.

In recent years, there have been some promising developments on this front. The first-ever strike in the gaming industry was conducted by unionized voice actors represented by SAG-AFTRA. They did not win all their demands, but they received pay raises and demonstrated their collective power in an industry notoriously wary of unionization. The first fully successful strike in the industry came a few years later, when contracted writers on the mobile game Lovestruck: Choose Your Romance went on strike with the help of the Communications Workers of America (CWA). More recently, workers at South Korean studio Nexon formed a union, Paradox Interactive signed a collective bargaining agreement with their employees in Sweden, and developers at Glitch have unionized with the CWA.

Developers themselves are supportive of labor unions: in the 2019 Game Developers Conference survey, 47 percent of respondents answered “Yes” when asked whether game developers should organize, with another 26 percent answering “Maybe.” It’s a promising sign that nearly three-quarters of developers are open to fighting for better working conditions. To win, however, they’ll need the labor movement: formal unions and labor advocates alike. Then maybe, just maybe, they’ll be able to go home before fucking midnight.

Meanwhile, consider not buying games from studios with documented crunch practices. Doing so wouldn’t even require much sacrifice beyond waiting a few weeks and buying used physical copies, since games are commonly resold soon after release. You might not get much of a discount, but none of the money will go to the crunch-reliant studios. Sure, personal consumption choices won’t resolve the issue on their own (“voting with your dollar” will never be as effective as mass organizing), but it’s the least we can do until game developers unionize.
As podcast producer Jordan Mallory once tweeted, and hopefully presciently, “I want shorter games with worse graphics made by people who are paid more to do less and I’m not kidding.”
