The 30 moments that defined the video game industry

From consoles to genres, from lawsuits to graphics, these are the 30 big moments that defined the video game industry—from 1977 to the early 00s.

In 1977 the Atari 2600 launched, and along with it much of what we know today as the videogame industry. How did we get here? We have surveyed our history and chosen one moment from each year that defines who and what we are today.

1977: This is your hardware, now and forever

The wildly popular Atari 2600 gave us the modern console: a general-purpose CPU, dedicated graphics and sound hardware, a standard audio/video output, generic controller I/O ports, and an interface for swappable media, all powered by a wall outlet. For comparison, the battery-powered Magnavox Odyssey had neither sound nor color graphics, and the Fairchild Channel F had an internal speaker and hardwired controllers.

Thirty years ago Atari cultivated the image of a console sitting prominently in front of a television, surrounded by stacks of games, spare controllers, and happy people at play. Nintendo uses nearly identical imagery, minus a few wires, to sell its Wii.

1978: Space Invaders

Shortly after Atari put its first Pong machine into Andy Capp’s Tavern, the owner called to report that the machine was broken. When Al Alcorn, creator of the Pong machine, arrived to examine and remove it he discovered the problem: an overflowing coin container.

Taito’s Space Invaders repeated that coin-box magic on a far grander scale in 1978, and when it appeared exclusively on the Atari 2600 in 1980 it defined the arcade-to-console hit and the concept of a system-selling game. Such arcade conversions are less important today, but killer applications and system exclusives like Super Mario 64 and Metal Gear Solid 4 still drive the console and handheld hardware markets.

1979: Learning to Fly an Apple

Interest in Bruce Artwick’s 1976 articles on 3D graphics and flight simulation led him to create and eventually publish Flight Simulator for the Apple ][ in 1979. The realistic instrumentation ignited imaginations and the World War I dogfighting mode demonstrated its videogame potential. Cutting-edge wireframe graphics dazzled everyone, making Flight Simulator one of the best-selling pieces of Apple ][ software in the 1980s.

Microsoft bought a license from subLOGIC and released their Microsoft Flight Simulator 1.00 in 1982. Twenty-four years later they’re still releasing new versions, although the graphics and scale have gotten a wee bit of an upgrade.

Runner-up: Speaking of simulations, in 1979 Mattel created more than just a better football game for its new console, the Intellivision. Beyond the full-length quarters and callable plays, NFL Football was an officially licensed sports game. The Intellivoice add-on wasn’t quite up to color commentary, but John Madden would more than suffice once EA got into the game.

1980: Activision beats Atari on its own system

Hell hath no fury like a programmer scorned. After receiving no bonuses and no credit for creating games that generated millions of dollars for Atari, a group of programmers — David Crane, Larry Kaplan, Alan Miller, and Bob Whitehead — broke away and created Activision, the first third-party developer for a videogame console. That rebellion defined the third-party system which dominates today’s videogame market.

The liberated ex-Atari programmers created a new experience for videogame players. By holding itself to a higher standard Activision created games free of the sprite flicker seen in Atari’s own software. Original games like Kaboom! and River Raid came to define the Atari 2600. Crane’s Pitfall! could be considered the foundation of today’s platform games. And programmer photos and messages in the game manuals no doubt inspired a generation of budding developers.

Runner-up: Programmer Warren Robinett found his own way around Atari’s rigid policies: he hid his name in a secret room inside Adventure, arguably the first graphical adventure game. Through an elaborate process discovered after the game was released, players could access that secret room and the flashing words “Created by Warren Robinett”. The world’s first easter egg surely puzzled many youngsters at the time, but consider how much less interesting our games would be without them.

1981: Back before they were called “gaming rags”

The article you’re reading right now and much of the writing on this very website owes its roots to the videogame press of the early 1980s, in particular to Electronic Games Magazine. Several terms we use regularly to describe games — easter egg, scrolling, and screenshot — were coined in Electronic Games editorials penned by co-founder Bill Kunkel. The magazine also created the sections we still use today to categorize content on videogame websites: news, previews, reviews, hints, and editorials. Only the letters section has fundamentally changed — we now call them user comments and they are published instantaneously rather than weeks after they’re written.

Runner-up: After the accidental success of his D&D-to-Apple conversion, Akalabeth, Richard Garriott set out to create a more polished commercial product. The result was Ultima, or what we know today as Ultima I: The First Age of Darkness. Ultimately Garriott did more than just create Ultima and its sequels — their popularity laid the foundation on which an entire genre of graphical role-playing games was built.

1982: E.T. stuck in a hole, Atari too

At the end of summer 1982, the Atari 2600 dominated the home console market and Atari obtained exclusive rights to a game based on the box office smash movie E.T.: The Extra-Terrestrial. All Atari had to do was program the game, package it up, and enjoy the profits. On a five-week deadline, it enlisted Howard Scott Warshaw (who had spent four to five months on the hit game Yars’ Revenge), who designed the game in two days and then coded it up in the remaining weeks. Without time to test the game with real players, Atari produced 4 million cartridges and started shipping to retailers.

The problem? Few people enjoyed flying E.T. out of those damned holes long enough for him to phone home, and Atari sold only 1.5 million cartridges. Making a stinker of a game is one thing; being left with 2.5 million unsold copies is quite another. After the E.T. debacle and other missteps, Atari reported a half-billion-dollar loss the next year and was sold off in pieces in 1984.

It’s not hard to see smaller versions of Atari’s hubris and fall throughout history: Romero’s Daikatana, Nintendo’s spurning of Sony’s CD-based console design, the death of 3dfx. At this point, it’s probably best that we tastefully avoid mentioning the PlayStation 3.

1983: Big patent lawsuits had to start somewhere

For many, videogame patent lawsuits may seem a new development, but their roots actually go back to the dawn of the industry itself. One of the first big fights involved Coleco’s Expansion Module #1 which played Atari 2600 games through a ColecoVision. Atari sued in December 1982 for patent violations and unfair competition, asking for $350 million in damages. Plucky Coleco upped the ante and countersued for $500 million. By March 1983 they’d reached an historic settlement: Coleco would pay to license Atari’s patents, continue making their Expansion Module #1, and produce the Atari 2600-compatible Gemini console.

Since that time no console manufacturer has tried to make its system compatible with its competitors’ games, although Bleemcast and Connectix Virtual Game Station are third party software solutions that come close. The real lesson here is how these disputes are typically resolved: mutual agreements between the two parties. Look no further than the recent Immersion vs. Sony, in which Sony finally relented and licensed Immersion’s force-feedback technology.

Runner-up: While Dragon’s Lair was functionally little more than a Simon Says video player, the gimmick of its amazing graphics carried it through to fame and fortune. It pioneered the storage of games on optical media and more than any previous game showed the potential for what videogames could be, graphically at least.

1984: A sandbox in space

Before Grand Theft Auto III made three-dimensional sandbox games fashionable, David Braben and Ian Bell created one of the first and certainly one of the most influential such games: Elite. A player starts as the harmless Commander Jameson, owner of 100 credits and a modest trading ship, and determines his lot by how he lives and works. He can take on military missions, mine asteroids, or simply trade as an honest member of society. On the darker side, he may take to piracy or become a bounty hunter. The player is granted something quite special — the ability to choose, and then to live with the rewards and consequences. To create a suitably rich sandbox on limited 8-bit machines, Elite used a procedurally generated universe, creating eight galaxies from a seed and a random number generator.
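Elite’s space-saving trick still underpins procedural games today: feed a fixed seed into a pseudo-random generator and the same universe unfolds every time, so nothing has to be stored. A minimal sketch in Python — the syllables, attributes, and ranges here are invented for illustration, not Elite’s actual algorithm:

```python
import random

def generate_galaxy(seed, systems=256):
    """Deterministically expand a small seed into a full galaxy."""
    rng = random.Random(seed)  # same seed -> same galaxy every time
    syllables = ["la", "ve", "ti", "di", "so", "ri", "ge", "za"]
    galaxy = []
    for _ in range(systems):
        # Each star system is derived purely from the stream of random numbers
        name = "".join(rng.choice(syllables) for _ in range(3)).capitalize()
        galaxy.append({
            "name": name,
            "economy": rng.randrange(8),
            "government": rng.randrange(8),
            "tech_level": rng.randrange(1, 15),
        })
    return galaxy

# Nothing is saved to disk, yet the galaxy is always reproducible:
assert generate_galaxy(1984) == generate_galaxy(1984)
```

Eight such seeds give eight galaxies at essentially zero storage cost, which is how a machine with a few dozen kilobytes of memory could hold thousands of star systems.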

1985: I see blocks falling in my sleep

It began on a PDP-11 clone in Moscow and quickly spread to IBM PCs. From there to Hungary, then Britain and the United States, and then the world. Not a virus, but Tetris — which has proven nearly as debilitating to work productivity. Surely Alexey Pajitnov, a mathematician, had no idea just how powerful his falling-block creation would be, but even today variations show up on practically every new consumer device with a screen and a couple of buttons. Is there any other game from 1985 that people still play obsessively, its graphics and mechanics essentially unchanged?

Runner-up: In the mid-1980s life rarely got any better than raiding a dungeon with your D&D friends. More than a few teenagers dropped pencil and paper and rushed to the arcade to play Gauntlet and act out their fantasies with color graphics and booming sound effects. Today we play more sophisticated games — Diablo II and World of Warcraft come to mind — but the mechanics of hacking and slashing are still there.

1986: Nintendo, industry savior

Today it sounds ridiculous, but in late 1985 Nintendo had to cajole toy retailers — painfully stung by the 1983-1984 videogame industry crash — to stock its new Nintendo Entertainment System. By the end of 1986 the industry had begun to turn the corner, and Nintendo gripped the North American videogame market in an iron fist that would not be significantly weakened until the 1990s. The new age would bring many wonders — Super Mario Bros., Metroid, The Legend of Zelda, Mega Man, Castlevania, und so weiter — and millions upon millions of dollars into Nintendo’s coffers.

As we will see in later decades, Nintendo has always experimented with controls; the D-pad on the NES controller, appropriated from Gunpei Yokoi’s Game & Watch devices, is still with us more than 20 years later. No system since has shipped a controller without such a directional pad. In addition to this new control method, Nintendo offered a light gun and R.O.B., and would later offer the Power Pad, a sort of electronic Twister mat which today’s kids would assume was for Dance Dance Revolution.

1987: A Link to the Future

American NES sales soared on the strength of Super Mario Bros., and The Legend of Zelda, Shigeru Miyamoto’s other game, would push them further upon its 1987 American release. Designed as a nonlinear complement to Mario’s sequential levels, Zelda drew in gamers with its open world and hidden dungeons, despite Nintendo’s own fears that consumers would reject its complexity. How quaint, given that every Nintendo console since (with the exception of the unfortunate Virtual Boy) has hosted at least one, if not two, Zelda games. Only Mario is more important to Nintendo’s bottom line.

Runner-up: While youngsters were out rescuing Zelda, Larry Laffer — the original 40-year-old virgin — was trying to get laid. Dressed in his leisure suit and cruising the city of Lost Wages, Larry found sex and love after a few false starts. Despite several Larry sequels, a BMX game with strippers, and a cup of hot coffee the videogame industry of the 21st century is still struggling to understand just what sex is and why it’s so bad but feels so good.

1988: A 20-year Football Dynasty

The Madden NFL Football series from Electronic Arts was born on an unlikely platform: the Apple ][. The 1988 release of what would become the longest-running sports franchise innovated in several key areas: full 11-man teams, play editing, and the behind-the-line view. From the Apple ][ the series moved to other home computers like the Commodore 64 and then came into its own on consoles like the Super Nintendo and Genesis, where it began annual updates. As consoles with 3D graphics abilities, like the Sony PlayStation, became available, the Madden series moved to a fully 3D game, and since then has continued to add features, like online play, to exploit its host platforms. The exclusive licensing deal signed between EA, the NFL, and the NFL Players Association in 2005 drew stiff criticism as players wondered whether the series would slow its pace of innovation. EA perhaps feels no need to respond; the 1.8 million copies of the PlayStation 2 version of Madden NFL 06 that it sold in 2006 — a rate of nearly 5,000 copies per day — should make it confident that Madden fans are plenty happy.

1989: The first money-printing handheld

It is difficult to overstate the importance of Nintendo’s Game Boy in videogame history. It was the first successful handheld game system, soaring where the obscure Microvision had crashed a decade earlier. Many people first enjoyed networked games on the Game Boy via its serial link port and cable. Nintendo’s own NES didn’t even have stereo sound yet, and the Game Boy’s 160 x 144 LCD display, while not as good as a TV, was more than enough for games. At $90, consumers snapped up the Game Boy to the near exclusion of all competitors. By comparison, Nintendo made ten times more Game Boy Tetris cartridges than Atari sold Lynx handhelds.

And for more than 16 years the Game Boy reigned supreme among portables. Perhaps it is fitting that the handheld which finally stole the Game Boy’s throne was Nintendo’s own creation, the Nintendo DS.

1990: Keen to make the PC shine

John Carmack discovered a programming technique (adaptive tile refresh) that made smooth-scrolling 16-color EGA games possible, and from it he and his friends created a complete clone of Nintendo’s Super Mario Bros. 3 — which Nintendo appreciated but declined to license. So Carmack and crew turned their project into Commander Keen, and true action gaming for the PC was born. Earlier CGA action games had existed, but with the advent of EGA and then MCGA and VGA the PC was finally on an equal footing with the consoles of the day. Beyond 1990 the battle lines were clearly drawn, and the war rages to this day: PC or console?
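The core idea behind that smooth-scrolling technique can be sketched in a few lines: treat the screen as a grid of tiles, let the hardware pan the view, and redraw only the tiles that changed since the last frame. This toy version uses plain Python lists and omits all the real EGA plane and latch tricks:

```python
def dirty_tiles(prev, curr):
    """Return (x, y) coordinates of tiles that differ between two frames."""
    changed = []
    for y, (prev_row, curr_row) in enumerate(zip(prev, curr)):
        for x, (a, b) in enumerate(zip(prev_row, curr_row)):
            if a != b:
                changed.append((x, y))  # only these tiles need redrawing
    return changed

prev_frame = [[0, 1, 2],
              [3, 4, 5]]
next_frame = [[0, 1, 2],
              [3, 9, 5]]  # a single tile changed between frames
assert dirty_tiles(prev_frame, next_frame) == [(1, 1)]
```

Redrawing one tile instead of the whole screen every frame is what made console-smooth scrolling feasible on 1990 PC hardware.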

1991: Fighting in the Streets

Street Fighter II.

What if Street Fighter II hadn’t been made? It sounds a tad silly, but it is a serious question. The original Street Fighter wasn’t a runaway success and it took Capcom four years to create a sequel. But oh, what a sequel!

Modern fighting games are largely built on features that either premiered with or were popularized by Capcom’s Street Fighter II: the six-button layout, special moves, canceling moves, and stringing those moves into combos. The game’s competitive versus mode even engendered a code of etiquette in players.

In any arcade a player could stand in line at the machine and eventually put his quarter on the machine’s glass to claim the next game. Losers went to the back of the line for the chance to buy into the game again later. A winner stayed as long as he won, his skills theoretically stretching a single quarter into a glorious all-day game. This brilliant “loser pays” strategy paid off, converting a player’s competitive nature directly into money in Capcom’s coffers.

Street Fighter II inspired a wave of fighting games which kept the arcade scene alive in the United States for almost another decade. The SF2 effect was a double-edged sword, though, with both positive and negative effects on the industry. On the one hand the fighting game market eventually produced Virtua Fighter and Tekken, but did we really need breast physics too?

Runner-up: More than fifteen years after its introduction Civilization’s blend of civics, economics, diplomacy, warfare, and history continues to addict new players. As much as any other game Civ epitomizes the “simple to grasp, difficult to master” ideal to which game developers still aspire. As Civilization has spread to numerous platforms, spawned multiple revisions and sequels, and inspired dozens of copycat games, one gets the distinct impression that the game’s original tagline, “Build an Empire to Stand the Test of Time”, was actually Sid Meier’s business plan all along.

1992: Capcom didn’t invent modern survival horror

Stop me if you’ve heard this one: you’re stuck in a zombie-infested mansion with only your wits and modest weapons to help you live long enough to learn the house’s terrible secret and then escape. If you blurted out Resident Evil, you’re probably a product of the PlayStation generation. If you said Alone in the Dark from Infogrames, then have a cookie. (If you sniveled Sweet Home, nice try.)

Like Capcom’s later B-movie survival horror game, Alone in the Dark used 3D characters rendered over 2D backgrounds and featured both action and puzzle sequences. It even had in-game texts that gave clues not only to the underlying history of the mansion but also hints to puzzles. With two sequels out before the first Resident Evil game and a new one due in 2008, Alone in the Dark is not only the starting point for modern survival horror but also its longest running example.

1993: Death Match Point

The single-player component of id Software’s DOOM alone transformed the PC industry overnight, driving people to upgrade to computers with faster processors and sound cards. But it is DOOM’s multiplayer deathmatch component that still influences how games are made today. Earlier networked competitive games certainly existed, but the advent of ethernet networks on university campuses and id Software’s liberal shareware distribution introduced deathmatch to a huge audience of players with powerful computers and lots of time to kill. As players learned to modify DOOM and then create their own levels, deathmatch-specific levels began to appear. Since DOOM, id and games based on its engines have continued to define competitive networked gaming. As a result, nearly every first- or third-person shooter on the market today includes a deathmatch component wherein two or more players attempt to kill each other, often repeatedly, with outrageous virtual weaponry and pungent language.

1994: I am become ESRB, destroyer of games

The gore in Mortal Kombat and the sexual undertones of Night Trap upset some very important people — parents and congresspersons. After various hearings, the U.S. Congress gave the collective videogame industry one year to start regulating itself. So in 1994 the Interactive Digital Software Association (now the Entertainment Software Association) created the Entertainment Software Rating Board, or ESRB. The board established a straightforward system: a publisher submitted a video of a game to the ESRB, which then assigned a rating (like Teen or Mature) along with descriptors elaborating on why the game received that rating. The gambit worked: the formation of the ESRB has mostly satisfied critics and kept government regulation at bay.

For years the ESRB rating process was a formality, but like any good regulating body the ESRB occasionally causes an uproar, most often by assigning controversial ratings to highly visible games. The key issues for the ESRB are still gore and sex. Rockstar, the popular developer, caused the biggest uproar in 2005 when it slipped a hidden sex minigame dubbed Hot Coffee past the ESRB. The game, Grand Theft Auto: San Andreas, was re-rated Adults Only (AO), up from Mature (M), and many stores either returned their stock of the game to the publisher, Take Two Interactive, or sold it with the AO label until updated copies were released. The ESRB hit Take Two and Rockstar a second time when it assigned an AO rating to Manhunt 2, this time not for sex but for violence. With the moral battles of the last decade still being fought by and with the ESRB, one wonders if the industry is ready for the next generation of games and their promise of user-generated content.

1995: An Expo is Born

Not content with creating the ESRB the year before, the IDSA held its first Electronic Entertainment Expo, colloquially E3, in Los Angeles during May 1995. The timing could not have been better: Sega had released its Saturn, Sony showed off its PlayStation, and Nintendo revealed details of its Nintendo 64. (Nintendo also crowed about the Virtual Boy. Oof.) Console manufacturers and software producers quickly grasped the value of a centralized time and place for announcing new products and outspending competitors on lavish displays of conspicuous consumption. Publishers used E3 to premiere games like Super Mario 64, Half-life, Metal Gear Solid 2, and Halo (the shooter, not the third-person action/adventure game shown at the July 1999 MacWorld) at press conferences more akin to pep rallies than legitimate news events.

For a little over a decade game companies and an eager gaming press converged on Los Angeles (and Atlanta, briefly) to pat each other on the back and be photographed with scandalously-clad girls holding game boxes. The show floor budgets ballooned, the expectations rocketed skyward, and eventually game companies thought of better ways to spend their money — like making games. E3 2006 was the last traditional big trade show, replaced in July 2007 with the invite-only E3 Media & Business Summit.

1996: With SGI graphics, back when that meant something

Nintendo comes out of nowhere every 10 years and revolutionizes videogame controls. The NES popularized the D-pad in 1986 and in 1996 the Nintendo 64 would bring analog joystick controls to the masses. The Atari 5200 and Vectrex had tried over a decade earlier, but people weren’t playing games that benefited from analog controls. Ever tried Pac-man with an analog stick? Dreadful. Beyond that, most analog controls were dedicated to a particular type of game: flight sticks for airplanes and steering wheels for cars.

But Nintendo’s innovative 3D platformer Super Mario 64, rendered in Silicon Graphics quality, cried out for analog controls, and Nintendo’s M-shaped controller with its analog joystick fit perfectly. Within months Sega had released its own 3D Analog Pad for the Saturn, and in 1997 Sony released its first Dual Analog Controller, later replaced by the DualShock. Every new console today has some method of analog control, and even handhelds like the Nintendo DS and PlayStation Portable have gotten into the act.

1997: Lord British invites you to pay him promptly every month

For geeks who enjoyed D&D but wanted it to go faster and with less paperwork, Ultima Online must have seemed like a dream come true. In Lord British’s online world you could create a character, specialize in various skills, go out on quests with groups of friends, and eventually buy your own virtual home. All you had to do was pay a monthly fee and then spend the requisite hundreds of hours to earn the money and experience. Within six months over 100,000 people had signed up, making it that era’s largest massively multiplayer online game (MMOG). Ten years and a half-dozen expansion packs later, Ultima Online is still running, with a new fully 3D client and a modest 140,000 regular subscribers.

The success of Ultima Online itself isn’t so important as what it proved: that there are hundreds of thousands of people willing to pay a monthly fee and give up other activities to live an alternative existence. UO was the first hard evidence that millions of dollars could be made annually providing persistent online worlds where people scripted their own game experiences. Origin’s success spurred other companies to create similar games: Everquest, Lineage, and eventually World of Warcraft.
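The back-of-the-envelope math explains the gold rush. UO’s original subscription ran $9.95 a month (a detail recalled here, not stated in the article), so the 100,000 subscribers cited above translate directly into eight figures of recurring annual revenue, box sales excluded:

```python
subscribers = 100_000
monthly_fee_cents = 995  # $9.95/month, kept in cents to avoid float rounding
annual_revenue_dollars = subscribers * monthly_fee_cents * 12 // 100
print(annual_revenue_dollars)  # 11940000 -- roughly $12 million per year
```

No cartridges to manufacture, no shelf space to fight over: just servers, bandwidth, and a customer who pays every month.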

Runner-up: In 1996 id Software’s Quake was a remarkable 3D game that extended the deathmatch concepts first employed in DOOM. However, the network code was not optimized for the internet. In December 1996 John Carmack and others released QuakeWorld, a client designed to provide responsiveness even for the majority of people who connected to the internet via phone line and modem. The online first-person shooter scene coalesced in 1997 with the rise of QuakeWorld, Team Fortress (the most popular QW mod), and GameSpy (the popular third-party server browser) along with various community sites focused on clans, tournaments, and hardware optimization. From those roots sprang today’s networked competitive teamplay games and communities.
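QuakeWorld’s responsiveness on modem connections came largely from client-side prediction: rather than freezing until the server acknowledges a move, the client re-applies every not-yet-confirmed input locally and corrects later if the server disagrees. A toy one-dimensional sketch, with illustrative tick and speed values rather than Quake’s real networking code:

```python
def predict_position(confirmed_pos, confirmed_tick, inputs, dt=0.05, speed=320.0):
    """Re-simulate unacknowledged inputs on top of the server's last known state."""
    pos = confirmed_pos
    for tick, move in inputs:
        if tick > confirmed_tick:        # the server hasn't confirmed these yet
            pos += move * speed * dt     # apply the movement locally, immediately
    return pos

# Server last confirmed tick 10; the player kept pressing forward for 3 more ticks.
pending = [(10, 1), (11, 1), (12, 1), (13, 1)]
assert predict_position(100.0, 10, pending) == 148.0
```

The player sees movement the instant a key is pressed; the round trip to the server only determines how the guess gets corrected, not how the game feels.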

1998: The irrational exuberance of 3dfx

If the idea of three video cards in one computer seems a bit outrageous today, imagine what it must have been like in 1998. That’s when 3dfx introduced its add-on Voodoo2 graphics accelerator. A normal 2D VGA card was still required; the Voodoo2 took over only when a 3D game like GLQuake was running, and for optimal performance you actually needed two Voodoo2 cards running in scan-line interleave (SLI) mode. Whether you had one Voodoo2 card or two, the effect was stunning. While the PlayStation and Nintendo 64 had been doing 3D graphics for years already, the Voodoo2 marks the start of a race to faster, smoother consumer-level computer graphics the likes of which we haven’t seen since.
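Scan-line interleave is exactly what it sounds like: each Voodoo2 renders every other horizontal line of the frame, roughly halving the fill-rate load per card. A toy model of the work split (illustration only, no real rasterization):

```python
def split_scanlines(height, cards=2):
    """Assign each scanline of a frame to one card, alternating line by line."""
    return {card: [y for y in range(height) if y % cards == card]
            for card in range(cards)}

work = split_scanlines(480)       # e.g. a 640x480 frame across two Voodoo2s
assert work[0][:3] == [0, 2, 4]   # card 0 draws the even lines
assert work[1][:3] == [1, 3, 5]   # card 1 draws the odd lines
assert len(work[0]) == len(work[1]) == 240
```

Because adjacent scanlines cover nearly identical geometry, the two cards finish in almost the same time, which is what made the simple even/odd split effective.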

3dfx soon found itself competing with NVIDIA and then ATI, both of whom began offering superior products which not only played GLQuake better, but did so at a lower price and in true 32-bit color. The software market itself began to rely on 3D cards, creating games which could not be played without one, like Kingpin: Life of Crime from Xatrix.

Runner-up: In November 1998 Valve Software released Half-life to the acclaim of players and critics alike. The cinematic, immersive action-adventure sold over 8 million copies and set a standard by which players would measure games for years to come. It also hosted what is undoubtedly the most popular modification of a commercial game to date, Counter-strike. On this success Valve went on to build Half-life 2 and, perhaps more importantly, Steam.

1999: An online dream cast aside

Despite its ultimate failure, the Sega Dreamcast will stand as the starting point of modern online console gaming. Earlier online console services were rudimentary and required additional hardware. The Sega Dreamcast was the first home console that could go online out of the box, and the first to offer pay-to-play online games. These features clearly affected Sega’s primary rival, Sony, who promised many online features for the upcoming PlayStation 2 in press reports from 1999.

Once Sega abandoned the Dreamcast, Sony quietly dropped its plans for online gaming and movie distribution and settled for a much less ambitious patchwork strategy. Sony thus left room for a competitor to pick up the pieces and build the next natural step in online console gaming. Even so, Sega’s dedication to the online Dreamcast lasted through 2007, when it finally shut down its Phantasy Star Online servers.

2000: Seinfeld, the videogame

Can you sell a game about nothing? Many game developers had tackled this open problem, first posed by Nolan Bushnell in 1978, but it would take one of the greatest game designers of all time, Will Wright, to provide the proof. In 2000 he showed that you can sell a game about the minutiae of people’s lives. To date, 16 million copies of his proof have been sold to consumers, along with millions of copies of corollaries (or expansion packs) which extend Wright’s results.

All mathematical joshing aside, The Sims in 2000 reached the audience that many companies, like Nintendo, seek today: casual gamers. If nothing else, The Sims is a proof that the casual market exists, but like many existence proofs it fails to be practical, for it does not provide details on just how to attract and extract money from those casual gamers. In fact, The Sims Online showed that even Electronic Arts didn’t know how to extend its initial success very far at all.

Also see our piece, The games about nothing, which covers the Seinfeld genre.

2001: A sandbox named Liberty

Some will say that the videogame world is too parochial, but Grand Theft Auto III proved that the industry could make products that would engage all parts of society: precocious children, inattentive parents, state and federal legislators, and of course Jack Thompson. Regrettably most of these people were blinded by the violence of GTA3 and missed the point.

What really makes Grand Theft Auto III special is not cop-killings or hooker-beatings, although those have an undeniable charm, but the feeling of standing on a street corner in a living, breathing city knowing you can choose to go just about anywhere and attempt just about anything. It’s the freedom to choose to kill cops or beat hookers — or do something else entirely, like drive a taxi or a fire engine. A player learned the ebb and flow, the night life and gang territories, the shortcuts and secrets of Liberty City almost as well as he knew his own hometown. The rich, intriguing sandbox set the standard not only for its own sequels but also for countless other games from other companies.

2002: Live from Redmond

If the Dreamcast dragged consoles online in 1999, Microsoft pioneered the coherent online experience in November 2002 with Xbox Live. The idea: provide a seamless, uniform experience for any online game on the Xbox. The system uses a single user identity across all Live games, called a gamertag, which players use to connect to the system and communicate with friends. A player’s gamertag also carries a reputation, an attempt to temper online misbehavior similar to eBay user feedback. Since each Xbox contained a hard drive, Microsoft could offer downloads of additional data, like extra levels, to complement Xbox Live-enabled games. Many games even included online voice communication, a technology still uncommon even in computer games, where users were more likely to have the necessary hardware.

Of course Microsoft didn’t give away its new service for free. The original starter package cost a cool $50 and included 12 months of service along with a headset. After the first year Microsoft began charging $50 for 12 months of service and bumped the starter package to $70. By the end of 2003, Microsoft was offering Premium Content, downloadable add-ons for games at a modest price of $5 or more. Microsoft’s killer app, Halo 2, launched in 2004 with Xbox Live multiplayer, and within 12 months the number of Xbox Live accounts doubled to 2 million.

Xbox Live did more than line Microsoft’s pockets, however. It made an online service essential to the modern console. Even recalcitrant Nintendo, previously dismissive of taking consoles online, put its new Wii online within months of its Holiday 2006 launch.

Runner-up: Microsoft wasn’t the only company dreaming of selling 0s and 1s online. Valve released its Steam 1.0 client along with Counter-strike 1.4 in early 2002, sowing the seeds of what would eventually become an empire of virtual game sales. While initially limited to game patching, Steam grew into a large content-delivery system and online community. By 2004, Steam was used to pre-install Valve’s Half-life 2 before the game reached store shelves, and by 2007 it was selling games for independent developers and traditional publishers alike.

2003: Virtual violence begets violence?

Two horrible incidents in 2003 would influence public opinion on videogame depictions of violence and murder. On June 7th, after being booked for car theft, 18-year-old Devin Moore shot and killed two policemen and a dispatcher in the Fayette, Alabama police station. He fled in a patrol car but was later apprehended, telling the Associated Press “Life is a video game. Everybody’s got to die sometime.”

A later report by the television news magazine 60 Minutes would make the connection to Grand Theft Auto: Vice City, a game that Moore had played heavily. Just weeks after Moore’s crime, stepbrothers Joshua and William Buckner (ages 14 and 16, respectively) shot a .22-caliber rifle at cars on Interstate 40 in Tennessee as they claimed they’d seen in Grand Theft Auto. Their bullets killed one person and wounded another.

In both cases, the defendants were found guilty despite their claimed videogame inspirations. Separate civil suits against Take-Two and other videogame companies are still pending in both the Moore and Buckner cases, however. Pressure on the industry has mounted since 2003 as it has fought restrictive legislation in a variety of states. Michigan, Minnesota, Louisiana, Oklahoma, and Illinois have all recently passed laws limiting the sale of videogames; courts have struck down every one of those laws, a clear victory for the industry.

Still, a cloud of suspicion hangs over videogames. Within hours of the murders at Virginia Tech this year, Jack Thompson was on Fox News attempting to draw a connection to videogames, even though no evidence of such a connection exists.

2004: Married to the company

Working conditions for videogame developers were deplorable long before November 2004, but when an anonymous blogger known as EA_Spouse explained the cost in plainly human terms, the world took notice. She called Electronic Arts, her spouse’s employer, a “money factory” that granted time “off for good behavior,” unleashing an outpouring of grief and disgust across the industry. Electronic Arts and many other companies soon found themselves rethinking crunch-heavy development practices while simultaneously trying to contain class action suits filed by their employees.

By 2005 EA had settled one such case brought by graphic artists for $15.6 million, and then in 2006 negotiated a $14.9 million settlement in a similar case brought by programmers and engineers. In June 2007 Sony Computer Entertainment America settled a February 2005 case with a group of current and former employees for unpaid overtime to the tune of $8.5 million. Activision is currently being sued by California employees for overtime pay denied them because the company claims they are exempt under state law.

Since the original outrage, some changes have apparently been made. The original EA_Spouse, since identified as Erin Hoffman, has recently said “From what I understand, the [EA] Los Angeles studio [where her husband, Leander Hasty had worked] has made a really big turnaround”.

2005: Everyone wants to be a Hero

After the abstract music games FreQuency and Amplitude failed to take off, developer Harmonix took a more familiar road to fame: the electric guitar. Despite minimal fanfare and a premium $80 price, Guitar Hero became the sleeper hit of 2005. Bundled with a plastic guitar controller and a truckload of popular music tracks, the game sold as fast as its publisher, RedOctane, could supply it. Dance and karaoke games had been around for years, but nothing had approached the popularity of Guitar Hero. The game represents an essential realization: singing or performing isn’t enough on its own. People want to be a star, and the little plastic guitar gives them that.

Activision quickly bought up the rights, and Guitar Hero II for PlayStation 2 launched to record sales during Holiday 2006, priced $10 higher than its predecessor. In 2007 the Xbox 360 received its own version of Guitar Hero II (priced at a hefty $90), and the PlayStation 2 version received a compilation of 1980s music. Month after month, the Guitar Hero games claim spots in the top 10 software sales charts.

The effect on the market has been profound. Electronic Arts quickly cut a publishing deal with Harmonix, the original Guitar Hero developer, which will soon crank the amp up to eleven with Rock Band. More importantly, Guitar Hero revenues have made Activision the largest third-party publisher in the industry, displacing Electronic Arts, which had held that title since the early 00s. Combined with Nintendo’s moves in 2006, Guitar Hero marks two major shifts in the industry. First, developers are stepping away from traditional videogame controllers toward more intuitive input models. Second, Nintendo and Activision now hold the controlling positions that Sony and EA occupied for years.

2006: Why didn’t the Wii launch with a lightsaber game?

Nintendo is used to skepticism. With the exception of the Game Boy, they’ve had difficulty attracting support outside their core constituency since the time of the Nintendo 64 and Virtual Boy. The industry initially looked with doubt upon the Nintendo DS but is now falling all over itself to cash in on the handheld’s success. With a decade’s experience as underdog, only Nintendo could have presented the unusually monikered Wii and its motion controller to the world so proudly, so brazenly. And, as with Nintendo’s faith in the DS, they knew precisely what they were doing.

You need look no further than Sony’s own palindromic Sixaxis controller (which also responds to motion) and Microsoft’s pledge to appeal to children and casual gamers to realize that Nintendo is in the industry’s driver’s seat. The Wii’s intuitive controller has opened the system up to a giant audience of customers: that majority of human beings who don’t like hunkering on the couch tapping out abstract patterns on a button-studded hunk of plastic. Wii Sports, the pack-in game, is instantly understood by everyone from 3-year-olds to great-grandparents. Even older games like Resident Evil 4 are getting new life through judicious Wii controller integration. Nor is the appeal merely theoretical: the Wii continues to sell as many systems as Nintendo can put on shelves, and could sell as many systems in its first year as the Xbox 360 will have sold in two.

Now the industry is struggling to understand Nintendo’s success. Is the market really growing? Are casual games the future? Are the Xbox 360 and the PlayStation 3 akin to the final glorious generation of Cretaceous dinosaurs? Ask us again in 30 years.
