And it's because of the business model.
Hop into any thread about Sony vs Microsoft vs Nintendo and you'll likely see comments about how "competition is good" and how "competition means the gamers win". The idea here is that if the game companies are locked in bitter competition over your dollar, they will strive to produce better games. Makes sense. I agree with the logic.
With that in mind, I don't think there was ever a fiercer environment of competition than in the arcades of the 80s and early 90s.
Here's how it worked:
Game companies were responsible for producing the game, developing the hardware, and collaborating with vendors to sell their cabinets to arcade proprietors. The arcade proprietor (and bar owners, since that's where "arcades" first popped up) wanted a cabinet that was popular enough to recoup the initial investment.
And that initial investment was not small. Many arcades even had to lease cabinets because of the expense. How much was each machine? Back in the early 1980s, a cabinet cost between $1,500 and $3,000, or roughly $5,000-$10,000 in today's buying power. That's over 6,000 quarters dropped merely to break even. Understandably, proprietors wanted only the best games, the ones that were sure to make them money.
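To make that concrete, here's a quick back-of-the-envelope sketch of the break-even math. The numbers are assumptions for illustration (a flat $0.25 per play, ignoring leasing, electricity, floor space, and maintenance), not figures for any specific cabinet:

```python
# Rough break-even estimate for an early-80s arcade cabinet.
# Assumed values: cabinet prices of $1,500-$3,000 and $0.25 per play.
PRICE_PER_PLAY = 0.25

for cabinet_cost in (1_500, 3_000):
    plays_to_break_even = cabinet_cost / PRICE_PER_PLAY
    print(f"${cabinet_cost:,} cabinet -> {plays_to_break_even:,.0f} quarters just to break even")
```

Even at the low end, that's thousands of individual decisions by players to drop another coin before the machine earns a dime of profit.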
However, the arcade proprietor was not the only customer in this situation. The gamer who walked up to the machine was also a customer. So, arcade games had to walk a tightrope, balancing the profit margins of the arcade owner against the stingy pockets of the average gamer.
The results of this "arcade tightrope" business model?
- graphics and art design had to be flashy enough to attract that first coin. For the first half of gaming's history, the best-looking games were coming out in arcades. That only makes sense: the hardware was far more powerful than anything you could reasonably buy at home. This ensured gaming would progress in the hardware/graphical departments instead of racing to the bottom for the sake of low overhead (e.g. smartphone games).
- controls had to be acceptable. Broken or frustrating controls meant the player would not bother with dropping a second coin.
- gameplay had to be addictive. Players would not tolerate boring or derivative gameplay. This meant that innovation was more important than imitation, but fine-tuning was more important than both.
- difficulty had to be "fair". This is highly subjective, of course, but the idea that arcade games were nothing but "quarter munchers" isn't exactly true. For an arcade game to make a profit, it had to get repeat customers. Those customers had to feel as though the difficulty was fair and that their skills could pay off (in the form of more play time).
This attitude carried over to consoles up through the mid-90s. This era of the NES, TurboGrafx, Genesis, Amiga, Commodore 64, Atari 2600, Super Nintendo, etc. is often referred to as the "golden age of gaming". Many of the favorite games from this era were direct arcade ports or were heavily influenced by arcade games. Most of the developers during this time had their fingers in the arcade pie, in some form.
However, the market for home platforms (consoles and personal computers) was very different. Cinematic flourish and storyline were used to pad the value of a game. No one would accept a game that could be beaten in 15 minutes. It would either get returned or sold. When you spend several months' worth of allowance on a title, you want it to last.
So, console gaming and PC gaming adjusted accordingly. More padding. More "content". Easier difficulty. None of that is inherently wrong, but since the game companies only had to cater to one set of customers -- the owner of the console -- they no longer walked the arcade tightrope like before. I believe this is why gaming fundamentally changed during the 90s. In many cases, it changed for the better. But in some cases, I think it changed for the worse.
"Hard games" are seeing a renaissance as the market swings back the other way. Drawn-out, 100-hour experiences are simply too long for the average consumer. Gamers from every walk of life are playing shorter, snappier, simpler games. The Wii, DS, and rise of smartphones each demonstrate this to be true. From my perspective, this is a return to the old arcade mentality: extending a game's length with tough-but-fair difficulty is a valid alternative to artificial padding.