
I think there's a reason (other than nostalgia) that 80s and 90s games are "better"

DunDunDunpachi

Patient MembeR
And it's because of the business model.

[Image: video arcade]


Hop into any thread about Sony vs Microsoft vs Nintendo and you'll likely see comments about how "competition is good" and how "competition means the gamers win". The idea here is that if the game companies are locked in bitter competition over your dollar, they will strive to produce better games. Makes sense. I agree with the logic.

With that in mind, I don't think there was ever a fiercer environment of competition than in the arcades of the 80s and early 90s.

Here's how it worked:

Game companies were responsible for producing the game, developing the hardware, and collaborating with vendors to sell their cabinets to arcade proprietors. The arcade proprietor (and bar owners, since that's where "arcades" first popped up) wanted a cabinet that was popular enough to recoup the initial investment.

And that initial investment was not small. Many arcades even had to lease cabinets because of the expense. How much was each machine? Back in the early 1980s, a cabinet cost between $1,500 and $3,000, or roughly $5,000-$10,000 in today's buying power. At a quarter per play, that's 6,000 to 12,000 plays merely to break even. Understandably, proprietors wanted only the best games that were sure to make them money.
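To put the break-even arithmetic in one place, here's a back-of-envelope sketch (purely illustrative; the 25-cent price per play is assumed as the era's typical coin drop, and the cabinet prices are the figures above):

```python
# Back-of-envelope: how many paid plays does a cabinet purchase represent?
# Cabinet prices are the early-80s figures quoted above; 25 cents per play
# is assumed as the typical coin drop of the era.
PLAY_PRICE = 0.25  # dollars per credit

for cabinet_cost in (1500, 3000):
    plays = cabinet_cost / PLAY_PRICE
    print(f"${cabinet_cost:,} cabinet -> {plays:,.0f} plays just to break even")

# $1,500 cabinet -> 6,000 plays just to break even
# $3,000 cabinet -> 12,000 plays just to break even
```

And that's just to recoup the hardware.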

However, the arcade proprietor was not the only customer in this situation. The gamer who walked up to the machine was also a customer. So, arcade games had to walk a tightrope, balancing the profit margins of the arcade owner against the stingy pockets of the average gamer.

The results of this "arcade tightrope" business model?

- graphics and art design had to be flashy enough to attract that first coin. For the first half of gaming's history, all of the best-looking games were coming out in arcades. That only makes sense: the hardware was far more powerful than anything you could reasonably buy at home. This ensured gaming would progress in the hardware/graphical departments instead of racing to the bottom for the sake of low overhead (e.g. smartphone games).
- controls had to be acceptable. Broken or frustrating controls meant the player would not bother with dropping a second coin.
- gameplay had to be addictive. Players would not tolerate boring or derivative gameplay. This meant that innovation was more important than imitation, but fine-tuning was more important than both.
- difficulty had to be "fair". This is highly subjective, of course, but the idea that arcade games were nothing but "quarter munchers" isn't exactly true. For an arcade game to make a profit, it had to get repeat customers. Those customers had to feel as though the difficulty was fair and that their skills could pay off (in the form of more play time).

This attitude carried over to consoles up through the mid-90s. This era of the NES, TurboGrafx-16, Genesis, Amiga, Commodore 64, Atari 2600, Super Nintendo, etc. is often referred to as the "golden age of gaming". Many of the favorite games from this era were direct arcade ports or were heavily influenced by arcade games. Most of the developers during this time had their fingers in the arcade pie in some form.

However, the market for home platforms (consoles and personal computers) was very different. Cinematic flourish and storylines were used to pad the value of a game. No one would accept a game that could be beaten in 15 minutes; it would either get returned or resold. When you spend several months of your allowance on a title, you want it to last.

[Image: Amiga]


So, console gaming and PC gaming adjusted accordingly. More padding. More "content". Easier difficulty. None of that is inherently wrong, but since the game companies only had to cater to one set of customers -- the owner of the console -- they no longer walked the arcade tightrope like before. I believe this is why gaming fundamentally changed during the 90s. In many cases, it changed for the better. But in some cases, I think it changed for the worse.

"Hard games" are seeing a renaissance as the market swings back the other way. Drawn-out, 100-hour experiences are simply too long for the average consumer. Gamers from every walk of life are playing shorter, snappier, simpler games. The Wii, DS, and rise of smartphones each demonstrate this to be true. From my perspective, this is a return to the old arcade mentality: extending a game's length with tough-but-fair difficulty is a valid alternative to artificial padding.
 

cdthree

Member
It's funny 'cause I get that vibe and feeling playing some GaaS mobile games. It kind of does feel like I'm playing an arcade game from the late 80s/early 90s, with the flash and the vibe. The get-in-and-play-immediately aspect is reminiscent, too.
 
I am hoping there is a balance in the near future where short/arcade games can provide a tough-but-fair difficulty and, for those who are less skilled such as myself, offer an easier difficulty if you struggle too much (e.g. cannot beat the first level, or die X amount of times).

I felt that Nintendo was on the right track with the Super Guide, but I didn't like how it essentially tells you how to play the game, or takes the controls away from you so the game essentially plays itself.

I think arcade-styled games can make money in this day and age through the console space, and pricing them lower than the more cinematic experiences is the way to go!

I think they are coming back because the average person playing these games is older and prefers shorter games overall, but there are still a few who will drop a game if it gets too difficult and blame the game for wasting their time, which is why I think the option to lower the difficulty should be there and should only appear under certain parameters (so a skilled player will never see it unless they intentionally play badly).

I do prefer the pixel art of 2D games of yesteryear, but I am not that into the games from the 80s or 90s due to some of the era's bizarre gameplay choices (I still very much enjoy looking at these games as case studies, but I am not very tolerant of difficulty spikes or obnoxious game design. Grew up to dislike it, honestly).

It is very interesting that you have looked into this thoroughly, and I quite enjoy reading people's perspectives on the decade. We certainly didn't have to pay for extras or patches if a game was broken on release. Some of these practices should come back, such as quality control and ensuring the customer is satisfied whether they start off bad at the games or want more of a challenge; but developers should know their audience well, too.

Here's hoping some of the things that enhanced the 80s and 90s come back and leave some of the more questionable design choices behind (including the modern Day One Patch Nonsense) :)
 

nkarafo

Member
The whole "difficulty" part is misunderstood in the industry. You can see what happened after Dark Souls got popular. Everyone thought it was the difficulty of the action that defined these games. How many times you died, how often, the "nope" memes and shit like that.

It's not that though. Every game can be like Dark Souls this way. Just play it on the highest difficulty. Pretty sure even some dumb mainstream game like CoD can be a nightmare at the highest difficulty.

Dark Souls' difficulty comes from one much more important aspect: the "don't treat players like dumb little children" one. Don't spoon-feed them and don't hand-hold them. It's just as simple as that. Dark Souls' difficulty is mostly about figuring out the game yourself, not just about being killed every few minutes.

There are very few games anymore that do this. Everyone has to cater to the lowest common denominator because it has to sell to everybody, gamers or not. Ever since gaming became a mainstream trend (after the PS1) this is the biggest issue for me.
 

Arbitrary

Neo Member
Lower-latency controls and higher framerates were kind of a big deal, too. Arcade games almost all ran at 50/60Hz, often with <16ms of latency between input and on-screen response.

These days we accept 30fps as standard on consoles with only the occasional moan. And not just that: it's common to have around 100ms of lag between input and on-screen response (the sum of TV input lag plus latency from the graphics pipeline and game engine architecture).

Yet there's this silly obsession with adding more pixels (720p -> 1080p -> 4K, and now even discussion of 8K) while not addressing the much bigger issues of responsiveness and framerate.
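To make that lag arithmetic concrete, here's a minimal sketch; the three-frame pipeline depth and 30ms of TV lag are illustrative assumptions, not measurements:

```python
# Rough input-to-screen latency budget. Pipeline depth and TV lag below
# are illustrative assumptions, not measured values.
def frame_ms(fps: float) -> float:
    """Duration of one frame in milliseconds."""
    return 1000.0 / fps

PIPELINE_FRAMES = 3  # assumed: input sampling + render + display queue
TV_LAG_MS = 30.0     # assumed flat-panel processing delay

for fps in (60, 30):
    total = PIPELINE_FRAMES * frame_ms(fps) + TV_LAG_MS
    print(f"{fps}fps: ~{total:.0f}ms from button press to photons")

# 60fps: ~80ms, 30fps: ~130ms. That's how a 30fps console game lands near
# (or past) the ~100ms figure above, while a 60Hz arcade board driving a
# CRT with a one-frame pipeline sits around a single ~17ms frame.
```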

Old games, at least on console, were also far quicker to get into. In the 8/16-bit days, you could plug in a cartridge, power on the console, navigate a very simple menu, and be playing almost instantly. These days you've all too often got system updates, game updates, unskippable cutscenes, and massive tutorials to slog through before reaching the fun.
 

DunDunDunpachi

Patient MembeR
The whole "difficulty" part is misunderstood in the industry. You can see what happened after Dark Souls got popular. Everyone thought it was the difficulty of the action that defined these games. How many times you died, how often, the "nope" memes and shit like that.

It's not that though. Every game can be like Dark Souls this way. Just play it on the highest difficulty. Pretty sure even some dumb mainstream game like CoD can be a nightmare at the highest difficulty.

Dark Souls' difficulty comes from one much more important aspect: the "don't treat players like dumb little children" one. Don't spoon-feed them and don't hand-hold them. It's just as simple as that. Dark Souls' difficulty is mostly about figuring out the game yourself, not just about being killed every few minutes.

There are very few games anymore that do this. Everyone has to cater to the lowest common denominator because it has to sell to everybody, gamers or not. Ever since gaming became a mainstream trend (after the PS1) this is the biggest issue for me.
The overcomplication of games is partially to blame. I remember Nintendo remarking that their design philosophy behind the Wii and DS was that modern controllers had "too many buttons". Granted, I don't want things to be simplified for the sake of it, but sometimes less is more. Let the player figure it out. Good game design will shine whether it is explained in a tutorial or not.

Lower-latency controls and higher framerates were kind of a big deal, too. Arcade games almost all ran at 50/60Hz, often with <16ms of latency between input and on-screen response.

These days we accept 30fps as standard on consoles with only the occasional moan. And not just that: it's common to have around 100ms of lag between input and on-screen response (the sum of TV input lag plus latency from the graphics pipeline and game engine architecture).

Yet there's this silly obsession with adding more pixels (720p -> 1080p -> 4K, and now even discussion of 8K) while not addressing the much bigger issues of responsiveness and framerate.

Old games, at least on console, were also far quicker to get into. In the 8/16-bit days, you could plug in a cartridge, power on the console, navigate a very simple menu, and be playing almost instantly. These days you've all too often got system updates, game updates, unskippable cutscenes, and massive tutorials to slog through before reaching the fun.
The time spent loading up a game, watching the unskippable splash screen (UE4 Engine! Havok physics! Such-and-such CGI studio! Developer who made this game!), and then starting up a new save file is often longer than a full session of Donkey Kong. This was especially bad last gen when game installs and patches could set you back an hour or more.
 
Game companies were responsible for producing the game, developing the hardware, and collaborating with vendors to sell their cabinets to arcade proprietors. The arcade proprietor (and bar owners, since that's where "arcades" first popped up) wanted a cabinet that was popular enough to recoup the initial investment.

Interesting topic. There's not a whole lot wrong with your assessment, but allow me to play devil's advocate in order to spice things up a little.

First of all, for somebody claiming that this is a non-nostalgic view on retro gaming, your OP is full of nostalgia. The 80's and 90's had their fair share of absolutely sh*tty games too, and as a consumer you had far fewer ways to make an informed decision. Most games were bought either through word of mouth or on the pure magic of the box art alone. This allowed a lot of crappy games to thrive, and as a kid I burned my fingers quite a number of times.

As somebody who poured quite a lot of money into arcade machines back in the day, let me tell you that arcade gaming wasn't all roses and sunflowers. The difficulty of many arcade games was artificially inflated so that you would pour more money into those darn machines. Lots of games had a flashy intro in order to hook you, only to fall flat mid-game. The appeal of many arcade games was their graphics, because it was something you could not experience on your console at home. Don't get me wrong, I always enjoyed the social aspect of arcade gaming, but not all was great.

As Scorsese would say, the "proliferation of images" has led to a decline in cinema culture. With games, it's kind of the same. How many games were released each year in the 80's and 90's? Certainly a lot fewer than by today's standards. This meant that each game had a much bigger impact, and consumers needed to be less picky because there simply wasn't as much choice. Nowadays games have become so ubiquitous that they struggle to compete for your attention. Games of the past didn't have that problem. As a kid I used to stick with my games a lot longer because new games were scarcer; nowadays I'm glad if I can finish a game before jumping to the next one.

One of the reasons why Star Wars, Lawrence of Arabia, Casablanca, or Ben Hur became such classics, so ingrained in our culture, is that there just weren't enough movies to go around. Nowadays you have an overabundance of products thrown at a jaded consumer, with only a select few ever bubbling up into the collective public consciousness. To an extent, it's the same with video games, and maybe a reason why we look back at the past with rose-tinted glasses.
 

Petrae

Member
Arcade games had to hook players in less than a minute. If they didn’t, players didn’t drop any more tokens into the games, and arcade operators would often rotate them out. This stands in stark contrast to the legions of games today that take hours to hook players. I don’t have time for that nonsense, especially with so many options out there. If a game doesn’t hook me within the first hour, I quit and don’t play it again. Video games aren’t meant to be a penance or time-wasting toll booths. I refuse to have to invest 10+ hours into a game just so I can “get to the good parts”.

Arcade games also had fair difficulty curves. This was, again, due to the fact that they had to hook you into wanting to drop more tokens inside. Games like Ninja Gaiden for the Xbox and the Soulsborne titles don’t have this. They pummel you from the jump and are inaccessible for many players. Contrast the former against the Ninja Gaiden trilogy for the NES: games that are still quite difficult, but at least give players a few stages to adjust to the play controls and gain some confidence before progressively becoming more challenging and eating extra lives. Itagaki and Team Ninja had their own vision, but I will never be convinced that it was a good vision.

I still primarily play coin-ops today. My Switch is loaded with stuff like Moon Patrol, Double Dragon, and Donkey Kong. I spend plenty of time with coin-op compilation discs, like Taito Legends and Namco Museum. These are the games that I grew up playing and the ones that respect my time the most. Very little narrative/setup, then all gameplay that lasts as long as my skills and endurance allow.
 

nkarafo

Member
Arcade games had to hook players in less than a minute.

I agree and they used their fancy graphics to achieve that.

Arcade games were the pinnacle of graphics technology, the state of the art. There wasn't anything you could buy for the home that came close to what you could see in the arcades. I mean, look at Space Harrier. That thing was released in 1985. Do you have any idea how home video games looked in 1985? It took 10 years to get an "arcade perfect" Space Harrier port. This gap continued to exist until the late 90's, when companies decided it was cheaper to use mainstream console/PC hardware to drive their arcades. And this is where the decline started.
 

DunDunDunpachi

Patient MembeR
Interesting topic. There's not a whole lot wrong with your assessment, but allow me to play devil's advocate in order to spice things up a little.
Of course. Thanks for the in-depth reply. I will try to cross sabers in a worthy fashion.

First of all, for somebody claiming that this is a non-nostalgic view on retro gaming, your OP is full of nostalgia. The 80's and 90's had their fair share of absolutely sh*tty games too, and as a consumer you had far fewer ways to make an informed decision. Most games were bought either through word of mouth or on the pure magic of the box art alone. This allowed a lot of crappy games to thrive, and as a kid I burned my fingers quite a number of times.
I won't deny that crappy games exist. This is true for any entertainment product, however. It's immaterial that crappy games existed back then. The question is whether they were able to survive and thrive in that market.

And I would point out that it was significantly more difficult for said crappy games to survive (and thrive) in the arcade environment compared to the home platforms, let alone compared to today. My evidence is that when the market did attempt to stuff the shelves with crap, the market crashed for several years (in the USA, at least).

There were degrees and tradeoffs to the whole thing. For instance, beat 'em ups often had the best-looking sprites and the widest variety of enemies, but at the cost of being coin-eaters in most cases. Lightgun and racing games -- which often ended the session even if you won the race/mission, and/or imposed a time limit that made it much harder to keep playing -- were also b.s. The bargain was that you were playing the best-looking games on the market, and you were still paying far less than if you bought a brand-new game.

Rentals were sort of the middle ground. You could test out the game in much the same way without compromising the experience.

So yes, there's a degree of nostalgia, which I admit in my title. But I think there's a reason other than nostalgia which causes some people to look back and call those old games "better".

As somebody who poured quite a lot of money into arcade machines back in the day, let me tell you that arcade gaming wasn't all roses and sunflowers. The difficulty of many arcade games was artificially inflated so that you would pour more money into those darn machines. Lots of games had a flashy intro in order to hook you, only to fall flat mid-game. The appeal of many arcade games was their graphics, because it was something you could not experience on your console at home. Don't get me wrong, I always enjoyed the social aspect of arcade gaming, but not all was great.
Right, and I am definitely weighing the comparison in my favor because we are talking about looking back at these games and seeing if they are "better". We tend to ignore the crap and remember the good stuff.

The thrust of my argument isn't that every game from back then is better. Rather, that the business model resulted in a baseline quality that we do not see in today's market. A product is shaped by its environment and market in many respects. One of the old models (arcade) is no longer shaping our games, so it makes sense that games may have lost something.

Yes, I am making the case that the old arcade tightrope was a good "something", but there's no denying something indeed was lost.

As Scorsese would say, the "proliferation of images" has led to a decline in cinema culture. With games, it's kind of the same. How many games were released each year in the 80's and 90's? Certainly a lot fewer than by today's standards. This meant that each game had a much bigger impact, and consumers needed to be less picky because there simply wasn't as much choice. Nowadays games have become so ubiquitous that they struggle to compete for your attention. Games of the past didn't have that problem. As a kid I used to stick with my games a lot longer because new games were scarcer; nowadays I'm glad if I can finish a game before jumping to the next one.

One of the reasons why Star Wars, Lawrence of Arabia, Casablanca, or Ben Hur became such classics, so ingrained in our culture, is that there just weren't enough movies to go around. Nowadays you have an overabundance of products thrown at a jaded consumer, with only a select few ever bubbling up into the collective public consciousness. To an extent, it's the same with video games, and maybe a reason why we look back at the past with rose-tinted glasses.
My simple counter to this is: where are the modern equivalents, then? A larger market with a larger pool of developers should have been able to surpass the old games and kick them into obsolescence. This has occurred in some gaming genres, like adventure games.

In some cases, these equivalents can be found in the indie scene. I think Metroidvanias have been co-opted by indies to the point where a modern developer might be unable to match that quality with a big budget alone.

But otherwise, where are the cutting-edge games from the biggest game companies putting these old games out of fashion? We do have examples of modern equivalents being made -- like Pac-Man Championship Edition, Castle Crashers, and modern 2D Mario -- and they sell well enough. It can't all be nostalgia, since we also have examples of old games being brought back and flopping. We also have examples of conventions and expos and modern arcades where folks who didn't grow up during that era are playing these games and enjoying them. At best, we might argue that they were influenced by the proliferation of these games over the decades, but it can't be chalked up only to nostalgia.

To me, this indicates there is an inherent quality to the old games that still provides entertainment value today. This isn't too unusual. Chess and Go are still played today. Is it because they were "first", or because there's still an inherent quality to their mechanics?

In the case of film and in the case of music, there's always an element of nostalgia, but I could argue those old directors and musicians made heavier use of fundamental tropes, and their works are more enduring as a result. There are thousands of black-and-white movies -- older than the movies you referenced -- that have fallen completely out of style. Why was their early influence not as enduring as Ben Hur or Star Wars, since they came out earlier? There are hours upon hours of old African-American spirituals, ragtime music, and Irish folk ballads, yet it's the jazz and rock 'n' roll that we still play and remember fondly.
 

zenspider

Member
I couldn't agree more here. There are these underlying ethics of design for arcade, console, and PC that are all about the value proposition down to the root. The console space acts like the battleground for arcade vs. PC, but every era has a successful ethic that's codified.

One of the easiest examples to point to for the death of the arcade ethic was EGM's (I think) review of Strider 2 for PS1. They called the game too short and too easy for the $49.99 asking price. Capcom's concession to the console ethic was 'unlimited continues'. You can fill in the rest: spamming the continues made the game short and easy. Putting the impetus on the player was a design failure.

Now, the latest Strider is a Metroidvania, with upgrades and backtracking to add 'content', similar to the NES adaptation (a console ethic of design).

I think the thing we lost as a hobby is the value of mastery. For the most part, we just want to see the end of the movie, and I think it shows in what passes for quality content.
 

DunDunDunpachi

Patient MembeR
I couldn't agree more here. There are these underlying ethics of design for arcade, console, and PC that are all about the value proposition down to the root. The console space acts like the battleground for arcade vs. PC, but every era has a successful ethic that's codified.

One of the easiest examples to point to for the death of the arcade ethic was EGM's (I think) review of Strider 2 for PS1. They called the game too short and too easy for the $49.99 asking price. Capcom's concession to the console ethic was 'unlimited continues'. You can fill in the rest: spamming the continues made the game short and easy. Putting the impetus on the player was a design failure.

Now, the latest Strider is a Metroidvania, with upgrades and backtracking to add 'content', similar to the NES adaptation (a console ethic of design).

I think the thing we lost as a hobby is the value of mastery. For the most part, we just want to see the end of the movie, and I think it shows in what passes for quality content.
The counterpoint is that a console or PC game dev could crank up the difficulty on a barebones game, call it "challenge", and charge full price while laughing all the way to the bank.

We have definitely lost the value of mastery. As I seek out those sorts of experiences more, I play each game longer and I enjoy it more. But on the flip side, I bin (so to speak) a great number of games because they don't offer enough depth of mechanics to be worthy of my skill investment. The market is large enough to support players like me and players who want to "see the end of the movie", but what's strange is that one style of gaming gets all the attention, all the money, and all the development time, while the other style has been forgotten or left for indie devs to pick up the slack.
 

Raven117

Member
I think this is a very nostalgic look at those games. Especially the 80s. They were still trying to figure out what made the games fun to begin with. “Design” didn’t really come into play until the 90s.
 

Toe-Knee

Member
The overcomplication of games is partially to blame. I remember Nintendo remarking that their design philosophy behind the Wii and DS was that modern controllers had "too many buttons". Granted, I don't want things to be simplified for the sake of it, but sometimes less is more. Let the player figure it out. Good game design will shine whether it is explained in a tutorial or not.


The time spent loading up a game, watching the unskippable splash screen (UE4 Engine! Havok physics! Such-and-such CGI studio! Developer who made this game!), and then starting up a new save file is often longer than a full session of Donkey Kong. This was especially bad last gen when game installs and patches could set you back an hour or more.

Astro Bot is a shining example of this. Only two buttons, attack and jump, with just a normal and a charge attack; as a result, the gameplay is instantly understood and instantly rewarding.

I like to describe it as pure joy in game format.
 

ZywyPL

Banned
Back in the day the games were made by passionate people; now they're made by accountants. A "slight" difference that makes all the difference. Plus, the technology limits made the old games book-like: you had to use your imagination, and everyone had their own vision of what those low-res 2D graphics represented. Now it's all on the table, just like movies, and all the magic is gone.
 

DunDunDunpachi

Patient MembeR
I think this is a very nostalgic look at those games. Especially the 80s. They were still trying to figure out what made the games fun to begin with. “Design” didn’t really come into play until the 90s.
Since we know these games were designed and went through extensive playtesting before they were packaged and sold to vendors, you'll have to shine a light on what you mean by "design didn't really come into play until the 90s". Like, design by committee?

The devs weren't asking themselves "I wonder if responsive controls would make this game fun". That was a given.
 

#Phonepunk#

Banned
They were developed by smaller teams. Thus, there is a more personal touch in every level of the game. It's similar to music: if you notice, most of the more generic pop songs are written by dozens of people. Design by committee kills creativity.

Also, games were not big money back then. Smaller budgets mean less oversight. Not expecting every game to make a huge profit to cover costs means more experimentation.

Finally, it was the first generation of developers, artists, musicians, etc., inspired more by books, movies, TV, comics, and so on. Compare that to people who go into games now because they grew up loving games and that's all they really know. The first generations were more about multiple disciplines than the factory-line stuff we have now.
 
The thing that I do miss about the older generations is the small-scale budgets and smaller teams.

I don't see how these big companies can't have a smaller studio (with extremely multi-talented people) with a smaller budget and say, "Make whatever the hell you want and we will give it a little marketing for the dedicated gamers."

Child of Light is a good example of what I am talking about from Ubisoft, and it has done extremely well. Let some individuals come together, spend an hour or two messing about with some ideas, and see if a good game can be made out of it!
 

JimmyRustler

Gold Member
Back in the day the games were made by passionate people; now they're made by accountants. A "slight" difference that makes all the difference. Plus, the technology limits made the old games book-like: you had to use your imagination, and everyone had their own vision of what those low-res 2D graphics represented. Now it's all on the table, just like movies, and all the magic is gone.
Pretty much this. Today everything is so big business, and you can just feel it most of the time. Sure, here and there games made with passion still get released, and sometimes even the business approach gives us great games, but yeah... it's just not the same. There is a reason there is such nostalgia for the 80s and 90s but none whatsoever for the 00s, and it sure as hell doesn't look like anyone will have nostalgia for the current decade either.
 
I was sure this was an Afro Republican thread, but there was no poll, so I got confused.

Gaming was better for a bunch of reasons. The first game designers were techies, not ex-game journalists. Game design was tightly coupled with the technology, making the game mechanics more explicit on the screen. The people who played video games were capable of playing games with more complexity and willing to put more effort into games before discarding them. Early game design was built around what game designers wanted to see, not whatever the psychologist in charge of hooking whales tells the shareholders is the best way to retain players who hate their game. Also, remember when two- to four-player games were played on the same screen? Holy shit, that was awesome.
 

Knightime_X

Member
It's only really better when newer games in a series fail to live up to the standards the older games set.
Usually this is based on gameplay, fun factor, and graphical appeal. Maybe some other aspects as well, but those stick out the most for me.

Take Castlevania: Lords of Shadow 2.
It's a relatively modern game with much more advanced graphics and maybe other modern gaming concepts.
It's still nowhere near as fun as playing Castlevania IV, Rondo of Blood, or whatever much older entry in that series you find more fun.

I think the problem is that companies are trying to fix what isn't broken in an attempt not to appear like a "rehash" or cookie-cutter game.
They change everything you like to make it different.

Here is the problem...
A % of players are guilty of this:

Gamers hate it when a game never changes.
Gamers hate it when a game DOES change.

It's only acceptable when a game changes but not in a bad or unwanted way.

I would argue that everyone loves Resident Evil 2, but the Resident Evil 2 remake will get even more love and better sales.

Ultimately I think it just boils down to which era has more games you enjoy.

I only think the 90s were better because there was more to it than just the games themselves.

Stores had demo stations.
More games had demos for you to try out.
Fewer people complained about whether a game had difficulty settings or not.
Arcades were alive and well.
Games back then were more complete and had fewer scummy practices.

I love both but for different reasons.
 

Raven117

Member
Since we know these games were designed and went through extensive playtesting before they were packaged and sold to vendors, you'll have to shine a light on what you mean by "design didn't really come into play until the 90s". Like, design by committee?

The devs weren't asking themselves "I wonder if responsive controls would make this game fun". That was a given.
It would help if I remembered the documentary that talked about it. Especially in the Atari and early arcade days, there was usually only one developer/designer across many games.
 

petran79

Banned
Computers were also popular for complex strategy and simulation games, genres that by their nature never became as popular, despite pushing technology forward.

But arcades, despite their supposedly kiddy games like Bubble Bobble, were never a place meant for kids. They were associated with shady places and gambling. A ridiculous anti-gambling law was the death blow to arcades in the early 2000s where I live. Small arcade operators closed and the scene never recovered. It was not just new technology trends that led them to their demise.

Console ports of arcade games were usually easier too, e.g. the Atari version of Dig Dug, the NES version of Kung Fu Master, etc. The NES was considered easy and for kids even back then.


It would help if I remembered the documentary that talked about it. Especially in the Atari and early arcade days, there was usually only one developer/designer across many games.


Andrew Brezier mentioned in a university speech that back then, in order to produce video games, you needed to have a license, meaning you had to invest tens of thousands of pounds in equipment and in hiring staff for the company.
Whereas now anyone can create and sell commercial games.
It is no surprise, then, that games and developers were fewer, but games were usually more polished relative to the available technology.
 

dionysus

Yaldog
I think the OP is only remembering the best arcade games. My recollection of arcades is that most games were the arcade version of shovelware.
 

Airola

Member
Those arcade games weren't made for everyone to experience all the way through. They were made as a challenge where the default position is that the game beats you, not the other way around. If someone wanted to get further, he had to pay each time he wanted to progress. And every time he failed, he had essentially lost a concrete coin. It feels as if that couldn't work at all; why would a person want to put more coins into a machine that doesn't let him see the whole game through? Well, people didn't expect to see the full game through. Their coins were sort of a BET on a challenge where seeing the end was a known "impossibility", but they wanted to see if they could get further in the game or score more points than before. Arcade gamers knew they were not entitled to win the game. And very often, just playing the earlier parts again was a fun experience in itself. It was fun to go through the first level of Outrun or Vigilante again.

So the creators of the games not only had to make the game interesting to play at its core, they also had to design it so that people would be invested in whatever small progress they could make. Getting a higher score had to feel like you earned it with your skills. It had to feel like, with a bit more focus and skill, you could get the sensation of scoring a few more points than before. Or, in games with non-score-based progression, the new parts had to feel exciting to get into.

Today people expect to see the end of the game, and games are designed with that in mind. That makes the core of these games fundamentally different from what it used to be. Just progressing is not enough: if that progression doesn't guarantee you will eventually beat the game, then the game is thought to be bad. In the arcades, progression wasn't a guarantee of anything. It was only ever "this is how far you got last time; do you want to try again?" And even if you weren't able to proceed further than before, the game still had to be fun to play. In games today, people don't think it's fun to replay a part they've already played through once. Hell, games often don't even allow you to play those parts again -- or at least they are designed so that, by default, you never have to redo anything you have already done.

Like, think about Tetris. Imagine someone saying "hey, I've already cleared 10 lines -- let me start from that point again", and the game being built so that continuing from where you finished last time is the default option. No, the game is designed so well that people love to start it from the beginning over and over again. People would put coins in a Tetris arcade machine. People would pay a bigger price to have the game as their own. The design of the game is so good that people are willing to pay for it even though they know it's really unlikely they will get very far. That game is fun even if you don't beat your last record. And that's the mentality in old games in general. The games were made to be fun both when you put money in the machine for another continue and when you ran out of money and had to start all over again at a later time. They HAD to be designed that way, or else they couldn't exist.

As I mentioned earlier, it was fun to play the first levels of games like Outrun and Vigilante more than once. The overall game design made the games that way. A game like Contra is still amazingly fun today to start all over again from the beginning. Scores, lives, and the difficult nature of the game aren't just archaic relics of the past. In a game like Contra, it's really exciting and tense to know you have only a few lives and will lose one from a single hit, but that you can gain an extra life if you score enough points. That, mixed with amazingly well-designed, really interesting levels, makes the game a masterpiece. It just wouldn't be as fun if you had infinite lives and never had to start from the beginning again. Sure, the NES version has the 30-lives code, but those will run out at some point too, and every life you lose is one less try. If you aren't good enough, you will not see the end of the game even with that code. I think Contra is one of the best examples of perfect arcade design.
 
Console games were at the time a sort of third option between arcade and computer games: longer and more in-depth than the former by necessity (a single purchase to justify, less flashy visuals), yet more streamlined (less data space, simpler input) and faster-paced than the latter.

We got several new types of games out of the deal, such as the Metroidvania and the JRPG.

Those were some pretty interesting times.

Granted, games did ship with bugs, but it's safe to say they were less egregious and easier to isolate, thanks to the shorter/simpler development cycles of the day, and they were fewer and farther between. Don't quote me on that, though.
 

DunDunDunpachi

Patient MembeR
Those arcade games weren't made for everyone to experience all the way through. They were made as a challenge where the default position is that the game beats you, not the other way around. If someone wanted to get further, he had to pay each time he wanted to progress. And every time he failed, he had essentially lost a concrete coin. It feels as if that couldn't work at all; why would a person want to put more coins into a machine that doesn't let him see the whole game through? Well, people didn't expect to see the full game through. Their coins were sort of a BET on a challenge where seeing the end was a known "impossibility", but they wanted to see if they could get further in the game or score more points than before. Arcade gamers knew they were not entitled to win the game. And very often, just playing the earlier parts again was a fun experience in itself. It was fun to go through the first level of Outrun or Vigilante again.

So the creators of the games not only had to make the game interesting to play at its core, they also had to design it so that people would be invested in whatever small progress they could make. Getting a higher score had to feel like you earned it with your skills. It had to feel like, with a bit more focus and skill, you could get the sensation of scoring a few more points than before. Or, in games with non-score-based progression, the new parts had to feel exciting to get into.

Today people expect to see the end of the game, and games are designed with that in mind. That makes the core of these games fundamentally different from what it used to be. Just progressing is not enough: if that progression doesn't guarantee you will eventually beat the game, then the game is thought to be bad. In the arcades, progression wasn't a guarantee of anything. It was only ever "this is how far you got last time; do you want to try again?" And even if you weren't able to proceed further than before, the game still had to be fun to play. In games today, people don't think it's fun to replay a part they've already played through once. Hell, games often don't even allow you to play those parts again -- or at least they are designed so that, by default, you never have to redo anything you have already done.

Like, think about Tetris. Imagine someone saying "hey, I've already cleared 10 lines -- let me start from that point again", and the game being built so that continuing from where you finished last time is the default option. No, the game is designed so well that people love to start it from the beginning over and over again. People would put coins in a Tetris arcade machine. People would pay a bigger price to have the game as their own. The design of the game is so good that people are willing to pay for it even though they know it's really unlikely they will get very far. That game is fun even if you don't beat your last record. And that's the mentality in old games in general. The games were made to be fun both when you put money in the machine for another continue and when you ran out of money and had to start all over again at a later time. They HAD to be designed that way, or else they couldn't exist.

As I mentioned earlier, it was fun to play the first levels of games like Outrun and Vigilante more than once. The overall game design made the games that way. A game like Contra is still amazingly fun today to start all over again from the beginning. Scores, lives, and the difficult nature of the game aren't just archaic relics of the past. In a game like Contra, it's really exciting and tense to know you have only a few lives and will lose one from a single hit, but that you can gain an extra life if you score enough points. That, mixed with amazingly well-designed, really interesting levels, makes the game a masterpiece. It just wouldn't be as fun if you had infinite lives and never had to start from the beginning again. Sure, the NES version has the 30-lives code, but those will run out at some point too, and every life you lose is one less try. If you aren't good enough, you will not see the end of the game even with that code. I think Contra is one of the best examples of perfect arcade design.
Thanks for the thoughtful post. That aspect of "trying again" yet still having fun is definitely missing from modern gaming. My gaming habits have gradually shifted to re-playing what I already own instead of buying all of the new stuff (and selling off games I don't plan to replay). Some modern games are still very enjoyable to replay, but a lot aren't designed with this in mind. Or, they artificially segment off portions of the experience to route B, C, and D to "add replayability" without changing anything meaningful.

Gamers' attitudes have shifted too, which you pointed out. It's almost anathema for a game to ask you to replay content. "What?!? This level again?!" If the game asks you to re-fight a boss or return to a previous area, there'd BETTER be a new hidden chest or a new way to interact with the environment.

Yet games like Hotline Miami were praised for their trance-inducing repetition of levels. Roguelikes are also mind-numbingly repetitive in certain respects, but since each new attempt is also a fresh roll of the dice, we don't mind it so much. To reference an equivalent from the 90s: there was a shmup developer called Psikyo that would often shuffle the order of its stages. This reduced the feeling of playing the same thing over and over again.
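That stage-shuffling trick is simple enough to sketch. Here's a toy version (the stage names are made up, and this only illustrates the idea, not Psikyo's actual logic):

```python
import random

# Toy sketch of stage-order shuffling: randomize the early stages each
# credit while keeping the finale fixed, so every run feels a bit fresh.
def plan_run(early_stages, final_stages):
    order = early_stages[:]  # copy so the caller's list is untouched
    random.shuffle(order)    # a fresh order on every credit
    return order + final_stages

print(plan_run(["forest", "harbor", "canyon", "ruins"], ["flagship", "core"]))
```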
 
It depends on what you mean by "better", because that could be your nostalgia speaking. AAA gaming has, post-Atari/Coleco, always been the result of boardroom meetings and checklists. People seem to forget that, but that's how consoles used to be. It was computers and PCs that had the creative AAA games, due to the different environment during the 80's and 90's, and those being ported TO consoles were the exception.

Otherwise, look at some of the biggest games of the 80's and 90's that had the then-equivalent of a AAA budget today, and you see a lot of meh games, flops, and games playing it safe. Some of them people liked a lot, some people didn't.

A lot of people's favorite games from that time frame were A-AA games, but because of technical limitations it was possible not to look too far off from the other games. The split became a bigger issue once the 3DO/PSX half of the 90's started, and that set things up for what we have today.

The biggest issue today is that AAA gaming involves too much money -- as in massive advertising/marketing -- so most of the games you hear of are AAA games, whereas during the 3DO/PSX through Xbox/PS2/early-360 era, A/AA developers could still reach audiences or even get bigger publishers to put out their games. Now you have to "look" through the shelves or keep up with niche gaming mags/forums to find the new cool A-AA games, because AAA gaming owns the vast majority of mainstream media presence. This leads to the false belief that games "got worse", so to speak.

I remember late last gen, while this one was starting, AAA took up so much of the marketing the "average" consumer would see that some people would bring up games like Bayonetta, among other games, and pull out crap statements like "the Japanese still focus on gameplay", when Japanese AAA devs go through the same checklists. The issue is that you're comparing A-AA Japanese games to AAA Western games, and that doesn't make any sense. But these people don't want to look for A-AA Western games, because AAA Western games have such a marketing presence that some people think that's all that's out there.

It's really something to think about.
 