Did you know video games make more money than the film and music industries combined? It’s unsurprising considering approximately 3.09 billion people – almost half of the world’s population – are active gamers.
It wasn’t always this way. For most of their existence, video games have fallen under scrutiny. From the belief only good-for-nothing non-achievers take interest in them to a severe moral panic when video games were blamed for a school shooting, the industry – and its community – has seen it all.
Let’s explore how, against all odds, video games went from the world’s biggest pariah to the most lucrative entertainment form of them all.
Video Games Are Older Than You Realize
Off the top of your head, what would you say the very first video game was? Pong, Space Invaders, or digital versions of chess or tic-tac-toe are excellent guesses, but they’re all incorrect. The honor goes to Nimatron, an electro-mechanical machine designed by the physicist Edward Condon.
Nimatron hosted a strategic math game called Nim. First exhibited at the New York World’s Fair in 1940, it predates ENIAC, the first programmable electronic computer, by five years.
A hit among fairgoers, Nimatron saw more than 100,000 games played during its exhibition. Although it wasn’t digital, and had little influence on modern computers, it pioneered the concept all video games are based on – machines used for fun. For this reason, it’s considered the ancestor of all video games.
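Part of what made Nim a natural fit for a machine is that it has a simple mathematical winning strategy: take the binary XOR (the “nim-sum”) of the heap sizes, and move so the nim-sum becomes zero. A minimal sketch in Python, assuming standard “normal play” rules (whoever takes the last object wins); the function name is ours, not anything from Nimatron:

```python
from functools import reduce
from operator import xor

def winning_move(heaps):
    """Return (heap_index, new_count) for an optimal move,
    or None if every move loses against perfect play."""
    nim_sum = reduce(xor, heaps, 0)
    if nim_sum == 0:
        return None  # current player loses with perfect opposition
    for i, h in enumerate(heaps):
        target = h ^ nim_sum  # reducing heap i to this makes nim-sum zero
        if target < h:
            return (i, target)

# From heaps (3, 4, 5) the nim-sum is 3^4^5 = 2, so a winning move exists:
print(winning_move([3, 4, 5]))  # → (0, 1)
```

After reducing the first heap to 1, the heaps (1, 4, 5) have nim-sum zero, leaving the opponent with no winning reply – which is why an electro-mechanical machine could play perfectly.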
No One Knows What the First Digital Video Game Was
Over the next decade, new ideas for computer games were conceived, but none came to fruition. The first of these was the Cathode-Ray Tube Amusement Device, built and patented in 1947.
It was a simple machine, and its objective was for users to direct a CRT beam at on-screen targets. Even though it’s considered the earliest interactive electronic game, many disregard it because it wasn’t computerized – and was more a simulator or oscilloscope than it was a game. More importantly, it was never manufactured for public use.
Shortly after this, in 1948, Alan Turing and David Champernowne conceptualized and wrote a chess program on paper, but the computers of the time were too limited to ever run it.
Even as computers grew more capable, they remained confined to military facilities and universities. Many games could have been developed in secret – for research, testing, or personal use – never to be released to the public. For all we know, the very first digital computer game is lost to time.
The Father of Gaming Didn’t Care about Video Games
As for games released to the public, Bertie the Brain – the first arcade game – was first displayed in 1950, at the Canadian National Exhibition.
The funny part is, its creator, Josef Kates, never set out to pioneer modern video games. He simply wanted to show off his latest invention – the Additron Tube, a component that promised smaller, more efficient computers. Aware most visitors wouldn’t make sense of the technology itself, he programmed his new machine to play tic-tac-toe, a game everyone knew how to play.
According to Popular Mechanics, Kates went on to remember Bertie the Brain fondly, but felt his other accomplishments, like inventing automated traffic lights, were more significant.
Baby Boomers Had a Ton of Games, but Most Couldn’t Play Them
Throughout the 50s and 60s, many games were developed, but most, if not all, of them were experimental – they were designed to test the limits of, or flaunt, new technology.
Soon after Bertie became a hit, the Nimrod was born. By most definitions, it was the first gaming computer, designed specifically to play a digital version of Nim. It was short-lived, and was retired and dismantled after only three weeks on display in 1951.
At the same time, various games were created at, and remained exclusive to, universities and military bases. One of these was OXO (another tic-tac-toe game) – the first game written for a stored-program computer, in 1952. It ran only on the University of Cambridge’s EDSAC, and was intended for research, not entertainment.
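The perfect play that machines like OXO and Bertie the Brain computed can be sketched with exhaustive game-tree search. This is a hypothetical modern reconstruction of the idea, not the original EDSAC code; all names are ours:

```python
# All eight winning lines on a 3x3 board, indexed 0..8.
LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
         (0, 3, 6), (1, 4, 7), (2, 5, 8),
         (0, 4, 8), (2, 4, 6)]

def winner(board):
    """Return 'X' or 'O' if a line is complete, else None."""
    for a, b, c in LINES:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    """Best achievable score for `player`, who moves next:
    +1 forced win, 0 forced draw, -1 forced loss."""
    w = winner(board)
    if w:
        return 1 if w == player else -1
    if ' ' not in board:
        return 0  # board full: draw
    opponent = 'O' if player == 'X' else 'X'
    scores = []
    for i, cell in enumerate(board):
        if cell == ' ':
            board[i] = player
            scores.append(-minimax(board, opponent))  # opponent's loss is our gain
            board[i] = ' '
    return max(scores)

# With best play from an empty board, tic-tac-toe is a draw:
print(minimax([' '] * 9, 'X'))  # → 0
```

Searching every line of play is feasible here because tic-tac-toe has only a few hundred thousand positions – small enough even for 1950s hardware to handle with lookup tables and clever encoding.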
1958 brought Tennis for Two – the first game created purely for fun, and arguably the first two-player game. It was set up at Brookhaven National Laboratory to entertain visitors, and was such a hit it even received an upgrade. It, too, was dismantled after a year.
In 1962, Spacewar! became the first game available on more than one machine, but since there was no commercial market for computers, it remained limited to academic institutions. This was the case for all games that followed in the 60s and early 70s – they were fun novelties only a few people got to experience or enjoy.
The Sumerian Game, created by Mabel Addis in 1964, was the first strategy game, the first story-based game, and the first game developed by a woman.
Arcade Games Boom
1971 saw the release of Computer Space, a space shooter based on Spacewar! from a decade earlier. Its creators, Nolan Bushnell and Ted Dabney, attempted to develop the game for computers, but couldn’t find a financially viable way to do so.
Inspired by coin-operated electro-mechanical devices, like Nimatron, they decided to give it a modern spin – and arcade games as we know them were born.
Even though Computer Space wasn’t the triumph Bushnell and Dabney hoped it would be, it was the world’s first commercialized video game, so it paved the way for the video game industry.
In almost no time at all, arcade games boomed, beginning with Pong in 1972 – the first commercially successful, and critically acclaimed, video game.
Pong was so popular, other manufacturers developed blatant knock-offs in attempts to mimic its success. None of these clones ever reached the same heights as Pong, but they oversaturated the arcade industry, causing it to plummet.
By 1974, the arcade business hung by a thread. Developers were abundant, but audiences had lost interest in paddle-style games. To compensate, new genres took center stage, including shoot ‘em ups and racing games.
The First Controversy
At this point in time, video games were largely considered adult entertainment, and arcade machines weren’t common in child-friendly spaces. Still, when Death Race released in 1976, the media couldn’t help but condemn it.
The objective was to chase and run over gremlins with a car. When a player hit a target, it would turn into an obstructive tombstone. The more gremlins hit, the more clutter filled the screen, and the harder the game became.
The problem, according to Wendy Walker, writing for the Associated Press, was that the gremlins “resembled humans,” and let out a child-like shriek when killed. According to her, this gave the impression of murder. Her widely distributed article criticizing the grisly gameplay is regarded as the spark that ignited the video game industry’s first moral panic.
Concerns about Death Race spread nationwide, and were even covered by the New York Times. Petitions to remove the misunderstood machine from public access grew enough to have the game pulled from some venues, but for the most part, they had the opposite effect.
The game, now shrouded in controversy, grew in popularity as curious and rebellious minds hoped to find out what all the fuss was about.
The funniest part in all of this is what Death Race actually looked like. The game the media labeled “insidious,” “morbid,” “gross,” and “violently graphic” was a crude black-and-white screen of blocky stick figures.
A New Invasion
Thanks to waning interest in popular games and concerned moral guardians, the arcade industry dwindled in the mid 70s. Games were still developed, but they didn’t receive much hype, and gaming as a whole fell by the wayside – until 1978, that is, when Space Invaders arrived. Unlike the arcade games before it, which were popular thanks to novelty, Space Invaders captivated audiences with several never-before-seen features.
It was the first fixed shooter, it had a continuous, interactive soundtrack that intensified with the gameplay, and it introduced persistent high-score tables. Space Invaders was also the first game to feature enemies that fired back, and consequently the first to end when the player died rather than when a timer ran out.
These features were so innovative, Space Invaders was an instant commercial success. By the end of the year, it had sold over 100,000 cabinets and grossed more than $600 million – roughly $3 billion when adjusted for inflation – making it the highest-grossing video game of its time, an honor it kept until 1980.
Guinness World Records deemed it the “Top Arcade Game of All Time,” and The Times ranked it the most influential video game of all time.
Video Games Become Mainstream
Reverence for Space Invaders paved the way for the golden age of arcade gaming – an era which debuted a slew of beloved titles including Centipede, Donkey Kong, Frogger, Galaxian, and Pac-Man.
Donkey Kong was the first platformer, and the first game to follow a visually unfolding storyline. Pac-Man – the game that knocked Space Invaders from its #1 spot – introduced power-ups, cutscenes, and, most importantly, characters.
As silly as it may seem, the pie-chart-shaped yellow guy we all know and love is widely considered the first video game character in history.
Even more fascinating, Pac-Man’s form was deliberately made ambiguous in hopes it would appeal to women, and it worked. Its simple controls, which didn’t require aiming or pushing buttons, also opened gaming up to everyone, not just those who were, or wanted to be, good at games.
Pac-Man’s widespread appeal allowed it to expand as a franchise, and so it spawned related merchandise, films, TV shows, novels, spin-offs, sequels, and even mainstream music. It may not be official, but it’s fair to say Pac-Man was the first video game to grow a fandom.
The End of an Era
The golden age not only gave the world legendary games; it turned what was once a niche industry into a goldmine. The popularity of Pac-Man, Donkey Kong, and the aforementioned titles of the early 80s meant arcade machines were in high demand.
Soon enough, they made their way into family friendly spaces like malls, grocery stores, cafes, and eventually dedicated game arcades, but the industry flew too close to the sun.
By 1983, sales started to wane. A number of factors contributed to this, including expensive upkeep, parental concerns, oversaturation, and increasing difficulty which pushed casual players away.
As arcades watched the sun begin to set, gaming as a whole had nowhere else to go but home.
Home Consoles Come into Play
The first home console, the Magnavox Odyssey, was released in 1972, and home computer gaming followed later in the decade with machines like the Apple II (1977). These, as well as other gaming devices, were relatively popular for their era, but they were considered supplementary rather than mainstream. With a few exceptions, they appealed only to hobbyists.
That is, until the Atari 2600 hit shelves in 1977. Initially, it had a steady but underwhelming sales run, so Atari – desperate to stay in the game – obtained licenses to reproduce popular games, including Space Invaders. Within a year, the Atari 2600 skyrocketed in popularity, selling by the millions.
Atari stumbled upon a formula that worked, and it was only a matter of time before its competition replicated it, taking Atari down a notch or two. In an effort to regain control of the industry, Atari made some questionable choices, including a disappointing port of Pac-Man and the rushed E.T. movie tie-in – the final nails in the coffin.
Consumers – who were already moving away from consoles and toward PCs – lost faith in Atari, and, by association, video games as a whole. The once thriving industry lost up to 97% of its revenue, and many developers went under. It wasn’t solely Atari’s fault, but some believe it catalyzed what’s now known as the video game crash of 1983.
For the next two years, gaming came to a standstill, with little interest and even less innovation. Little did anyone know what was about to unfold.
Nintendo Changes the Game
During the video game crash in the US, the industry still boomed in other countries, most notably Japan.
Nintendo was already well established by 1983, thanks to a number of popular arcade titles including Wild Gunman, Mario Bros., Popeye, and Donkey Kong. Unlike the developers forced to fold amid the crash, Nintendo had other plans: to release its Family Computer, better known as the Famicom.
At first, the Famicom wasn’t received well. Manufacturing faults led to issues with the console’s chip set, and Nintendo soon had to recall its run. After some tweaking and design improvements, the Famicom was put back into circulation, and went on to become Japan’s highest-selling console in 1984.
The next logical step was to release the Famicom internationally. Nintendo moved to redesign and rebrand the Famicom to better suit American consumers, and the Nintendo Entertainment System, or NES, was born.
Released in North America in 1985, the NES sold 1 million units in its first year, and 7 million units in 1988 alone. By 1990, the NES had outsold all other gaming consoles, and was more prominent in American households than PCs were. Gaming was back in business, and this time, it wasn’t going anywhere.
Nintendo Takes Control
A combination of excellent timing, marketing, and a great selection of games (including Super Mario Bros.) meant Nintendo single-handedly reignited a dying industry, and brought the video game crash to a close.
Something more significant is often overlooked, though. At first glance, the NES may seem like a standard piece of archaic hardware, but it introduced what may be the most influential development in gaming so far: the modern controller.
Console peripherals were vastly different before the NES. Most were built to move Pong-style paddles, and their rudimentary designs showed it. The Atari 2600 introduced joysticks – digital sticks for movement – accompanied by a single button for other functions.
Nintendo revolutionized gaming with its handheld controllers. They were as intuitive as they were ergonomic, and offered more functionality than any console before them. The addition of the directional pad meant game developers could design more complex games. The controllers were also detachable, making room for a world of accessories – like the Zapper, a light gun used with Wild Gunman, Duck Hunt, and other shooting games.
The NES not only breathed new life into the gaming industry, it changed how people play games, and, therefore, how games are designed. Say what you want about Nintendo, but its contribution to the world is more valuable than it gets credit for.
The Console Wars Begin
By the end of the 1980s, video gaming saw unprecedented popularity and the video game crash was all but forgotten.
The Famicom and Sega’s first console, the SG-1000, were released on the same day in 1983 in Japan. Although the latter wasn’t as innovative, it sold 110,000 more units than Sega anticipated. The console was an unlikely hit, but once Nintendo found its footing, Sega simply couldn’t keep up, mostly due to its lack of popular titles.
The NES outsold the SG-1000 in a landslide, but the rivalry would continue for many years to come – just in time for the dawn of a new era.
A Whole New World
The 1990s saw a video game renaissance of sorts, thanks to new technology and revitalized interest in gaming. New consoles, peripherals, and original games were in high demand, so the industry thrived.
The decade also introduced a number of fresh genres, including first-person shooters, 3D platformers, action RPGs, survival horror, and one-on-one fighting games. The latter were such a hit, Street Fighter II revived the arcade industry for a time.
It was also the birth decade of dozens of franchises we know and love, including Sonic the Hedgehog, Pokémon, Mortal Kombat, GTA, Civilization, Crash Bandicoot, Spyro the Dragon, Fallout, Tomb Raider, Baldur’s Gate, Resident Evil, and FIFA.
The Technological Revolution
The 90s were when PCs became commonplace, so computing, and its many byproducts – including game development and design – saw immense improvement.
3D graphics gradually replaced 2D sprites; CDs, DVDs, and memory cards increased storage and allowed for save files; and the advent of the internet enabled online play. Game consoles became more sophisticated and remained the device of choice, and handheld consoles, most notably the Nintendo Game Boy family, were smash hits.
Halfway through the decade, a new sensation hit shelves: the Sony PlayStation. It sold more than 100,000 units on its release day, and went on to become the first game console to sell more than 100 million units. It was so successful and so far ahead of anything else that it rendered the Sega-Nintendo rivalry irrelevant.
Video games took a giant leap forward in the 90s, and many gamers remember it as the best and most definitive decade of video games. It wasn’t all good, though. The gaming world was about to be rocked with its most notorious controversy yet.
Columbine Causes a Moral Panic
Remember when I said video games were blamed for a school shooting? On April 20th, 1999, Columbine High School became the scene of what was, at the time, the deadliest school shooting in US history.
Two seniors, Eric Harris and Dylan Klebold, took the lives of 13 students, one teacher, and themselves, and wounded dozens more. The massacre sparked a media frenzy, and in an effort to understand how two young boys could ever commit such an atrocity, everything they took interest in – from music to film – came under fire.
One of the scapegoats was a video game called Doom. The first-person shooter was among the first games to receive a mature rating, and faced heavy criticism for its excessive violence and gore, “heavy” soundtrack, “satanic” imagery, and depictions of hell.
A moral panic ensued when it emerged that Harris was an avid Doom player, had claimed the massacre would be “like playing Doom”, and referred to himself as “just another Doom demon.”
At one point, a rumor that Harris created a custom Doom level resembling the high school surfaced. Many believed the shooters played the game to practice what they would go on to do, with some claiming the game itself was a “murder simulator.”
The aftermath of the Columbine massacre sparked the most heated and prominent discussion about violence in video games yet. Families affected by the shooting attempted to sue 25 entertainment companies involved in the creation and distribution of Doom, but the case was dismissed.
The New Millennium
You’d think after the hullabaloo surrounding Doom, game developers would soften up a little, but the opposite happened. In the 2000s, video games became bolder, less filtered, and more popular by the day – but they were also more refined, and in a lot of cases, more realistic. This led to more than a few controversies.
This was the decade video games truly began to soar, with some of the biggest titles of all time making their debut. We’re talking Grand Theft Auto: San Andreas, Star Wars: Knights of the Old Republic, Half-Life 2, Tony Hawk’s Pro Skater 2, The Elder Scrolls III and IV (Morrowind and Oblivion), Counter-Strike, Prince of Persia: Warrior Within, God of War, Call of Duty 4: Modern Warfare, Uncharted, Devil May Cry, Resident Evil 4, Guitar Hero, World of Warcraft, Pokémon’s third and fourth generations, Wii Sports, The Sims, and what many consider one of the greatest games ever made, BioShock.
A New Player
Sony had dominated gaming since the mid 90s, and had every intention of keeping its place at the top. Right at the turn of the millennium, the PlayStation 2 dropped, and to this day it remains the highest-selling game console of all time. This didn’t stop other manufacturers from continuing to compete – except for Sega, which officially threw in the towel and shifted to third-party development.
Some of the most revered consoles of all time arrived in the 2000s, including Nintendo’s GameCube, Game Boy Advance (GBA), Wii, and DS; and Sony’s PlayStation Portable (PSP) and PlayStation 3.
Sega’s departure wasn’t the end of the console war, though. A new entity was about to take its place and begin a rivalry so intense it’s continued for 20 years.
On November 15th, 2001, Microsoft released the first Xbox – and immediately gave Sony a run for its money. Preorders sold out within 30 minutes, and the Xbox sold more than a million units within three weeks in North America alone. The Xbox 360 followed in 2005.
Meanwhile, PC gaming was going strong, but mostly took a back seat in favor of consoles. Cross-platform gaming was yet to find its feet, and Sony and Microsoft entered their long-standing race for exclusive titles. This meant gaming became more divided than ever before.
Mobile Games Level Up
Mobile games have been around since 1994, but back then, they were limited to their hardware. Snake, Tetris, or badly rendered sports games and platformers were typically built into phones, and although they were challenging, addictive, and excellent time killers, they weren’t anything to be lauded.
This all changed in the 2000s when mobile phones became more sophisticated. Graphics improved, and Wireless Application Protocol (WAP) made mobile games downloadable for the first time.
Then, in 2007, Apple changed the course of history by introducing the iPhone. Its large touch screen made games infinitely easier to play. A year later, Apple introduced the App Store, which pushed mobile game development and monetization to new heights, and put mobile games on the map.
Controversy Strikes Again
The 2000s was one of the most controversial decades for video games – and most of it centered around GTA.
In June 2003, lives were lost in two separate shootouts. In both cases, the perpetrators claimed they were inspired by GTA games – III and Vice City, to be exact. Then, in 2008, a group of teens wreaked havoc in New Hyde Park, New York. After they attacked an elderly man and hijacked a car, it was alleged the teens were obsessed with, and attempting to emulate, GTA IV.
These crime cases weren’t GTA’s only controversy. In 2005, San Andreas sparked worldwide outrage when modders uncovered hidden code for a sexually explicit minigame – the infamous “Hot Coffee” mod. The discovery sent shockwaves through the gaming industry.
During the debacle, San Andreas was pulled from stores, and outright banned in Australia. It fell under Federal Trade Commission investigation, forced the Entertainment Software Rating Board (ESRB) to re-rate it Adults Only, and even inspired Hillary Clinton to put forward a bill that would make selling explicit games to minors a federal offense. Rockstar Games also found itself embroiled in multiple class action lawsuits.
On top of this, many more video game-based crimes were committed in the 2000s, which once again led to discussions about violence in video games and video game addiction. Over the years, regulatory boards reassessed how video games are rated, and implemented new standards to protect minors.
Gaming Is Redefined
The 2010s was the busiest era for video games so far. Gaming expanded its horizons and, thanks to social media and content creation, was largely destigmatized. Gaming found new respect as a creative art, a sport, and a valid – if not cool – hobby. More people, regardless of age or gender, became avid gamers, and the industry and hobby alike diversified.
Chances are you’re still playing some of the gems we got in the 2010s, including GTA V, Red Dead Redemption 1 and 2, The Last of Us, Dark Souls, Dishonored, Bloodborne, The Legend of Zelda: Breath of the Wild, Minecraft, Fortnite, and, of course, The Elder Scrolls V: Skyrim.
The “PC Master Race” Emerges
Not much changed in terms of the console war in the 2010s. Sony released its PlayStation 4 in 2013, followed by the PlayStation 4 Pro in 2016, to much success. Microsoft lost ground with the Xbox One and its bundled Kinect, both of which underperformed commercially.
Meanwhile, Nintendo had its ups and downs. The 3DS was bogged down by a hefty price tag and a lack of varied games, while the Wii U was one of the biggest flops in video game history. Even though Nintendo seemed to fall further behind with each new release, it made a surprising comeback with the best-selling Switch in 2017 and the Switch Lite in 2019.
It was an exciting decade for console players, but ultimately consoles were outshone by further advancements in computing and PC gaming. As more emphasis was placed on performance and graphical power, gaming PCs slowly became the standard – although some would say developers didn’t get the memo.
And so a new war began – between the “PC Master Race” and the “Console Peasants,” as the two camps came to call each other.
On the bright side, online gaming, and to a degree cross platform games, grew more popular, and by the end of the decade, the once heated great debate about how to play mostly fizzled out.
Social Media Takes Over
Perhaps the most significant development of the 2010s was how rapidly gaming’s sense of community grew, as social media made it easy for gamers to meet and engage with each other.
As YouTube, Twitch, Instagram, and other platforms expanded, they also became a means for gaming-related content creation to dominate. Let’s plays, playthrough challenges, community commentary, news channels, and a plethora of other gaming-related channels found mainstream success – to the degree that, for much of the decade, the most-subscribed channel on YouTube belonged to a gaming creator.
Social media also gave gamers a louder voice, and made it easier to moderate game communities. As one example, the disgraced Pokémon speedrunner Jadiwi was caught largely due to his carelessness in his streams – something everyone could see and scrutinize.
Social media also had a huge role to play in validating video games as a career. It’s difficult for parents to argue when professional gamers and content creators alike make millions.
Gamergate and Even More Controversies
Social networking might have improved the game sphere, but it was also weaponized. The most notorious example is Gamergate, an online harassment campaign that shook the gaming world in 2014.
It started when indie developer Zoe Quinn published a text-based game called Depression Quest. The release found an audience, but soon after, Quinn’s ex-boyfriend posted a lengthy smear piece about them. Harassers ran with a fabricated claim that Quinn had exchanged sexual favors for positive reviews.
Quinn had to face months of online harassment and violent threats, before the perpetrators shifted their attention to any and all prominent female gamers – who were then harassed, threatened, and doxxed as well.
Gamergate exposed discrimination, gatekeeping, toxic masculinity, and cyberbullying in gaming communities, and became the subject of much controversy. It also deepened the persistent moral panic over violence in video games and its psychological effects on their majority-male audience.
… And Then, the Pandemic Happened
The lockdowns may have been a disaster for most industries, but they had the opposite effect on video games. With everyone stuck indoors around the clock, gamers had more time to play, and non-gamers – with nothing better to do – turned to video games as boredom busters.
The video game industry generated more than $150 billion in revenue in 2020 – more than films, music, and sports. It’s not unbelievable when you consider gaming was the perfect pastime during the pandemic. It didn’t require going outdoors, it was immersive, and it had a social element to it, if one enjoyed multiplayer games.
Nintendo grew to dominance once more as the Switch became the top platform of the year, and Animal Crossing: New Horizons – released in March 2020 – became one of the most played and highest grossing games of the year.
The pandemic’s effect on gaming wasn’t all positive. Combining the standstill in production during the lockdowns and an increase in demand for electronic devices, a global chip shortage ensued. This not only meant fewer goods could be produced, but it also skyrocketed the prices of electronics, including graphics cards and consoles.
Speaking of consoles, the war briefly reignited when the PlayStation 5 and the Xbox Series X and S both released in November 2020 – just as everyone and their grandmother was turning to video games to escape the worst year in recent memory.
The Future of Video Games
We’ve arrived at the here and now. The chip shortage is only just starting to wane, and gamers are pleased with the many delights we’ve enjoyed so far, including Elden Ring, Pokémon Scarlet and Violet, The Legend of Zelda: Tears of the Kingdom, Hogwarts Legacy, and the recent smash hit, Baldur’s Gate 3. Heck, even Bethesda woke from its slumber to publish Starfield and tide Elder Scrolls fans over.
Mobile games are starting to catch up to traditional games in terms of revenue, gaming content is experiencing unprecedented success, the Game Awards have become one of the most talked about annual events, and some of our favorite games and franchises are being renewed and rebirthed for us to experience again.
It’s not as lovely as it seems, though. While the games currently being produced are of higher quality than ever before, the gaming industry is in shambles.
Unity recently came under fire for attempting to charge independent developers exorbitant fees for use of its engine. Pokémon Go saw a mass exodus of players after new management bulldozed the game’s initial charm and affordability. Game companies everywhere are reshuffling and laying off hundreds, if not thousands, of personnel in favor of cheaper labor and bigger profits – and all this doesn’t even scratch the surface of the sociopolitical drama regarding diversity, inclusivity, and accessibility.
The gaming world is changing, and it might not be for the better. While we’re headed into a brave new world of VR and sophisticated AI bound to make gaming more immersive and challenging than it’s ever been before, we’re also on the verge of a modern video game crash.
Though it’s unlikely video games will ever come to a standstill again, gamers are growing more exhausted by the day with game companies that abuse their communities and cut corners. Tensions are mounting, and the industry is facing severe criticism amid desperate calls for change.
Our economy could also play a role here. A renaissance in gaming followed the last crash – and if games are this brilliant now, imagine the possibilities if the industry is forced to stop, take a step back, and aim even higher.