
Were 8 and 16-bit games 60fps?

Futaba

Member
There's more to it than just framerates. Generally the consoles did not operate on any fixed framerate; they simply wrote to the screen to update the image whenever they needed to, once a complete image had been shown. The TV is constantly refreshing at 60Hz or 50Hz depending on where you're from, and the console sends new images to the TV only when needed: it keeps outputting the same image and moves on to the next frame when the set is ready for a new one.

So your answer is: 8-bit/16-bit/32-bit games were essentially 1-60fps, with display-forced vsync when using RF out.
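If it helps, here's a rough sketch in C of what that loop looks like (purely illustrative, not real console code; wait_for_vblank() and update_game_logic() are hypothetical stand-ins for the hardware's vblank interrupt and the game's own code):

```c
#include <stdio.h>

/* Hypothetical stand-ins for the console's vblank wait and game code. */
static void wait_for_vblank(void)   { /* real hardware blocks here: 60 Hz NTSC, 50 Hz PAL */ }
static void update_game_logic(void) { /* move sprites, scroll the level, etc. */ }

int main(void)
{
    for (int frame = 0; frame < 3; frame++) {
        update_game_logic(); /* may take longer than one frame when busy */
        wait_for_vblank();   /* the TV scans out at a fixed rate regardless */
        /* New sprite/scroll values go to the video chip here, during vblank.
           If the logic overran, the TV simply re-displayed the old image. */
        printf("frame %d presented\n", frame);
    }
    return 0;
}
```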
 

Alo81

Low Poly Gynecologist
Isn't there a code to make the entire game half-speed?

Not sure! If there is one I don't know of it.

There is a "lightning speed" mod that I was able to modify to slow down the game speed, but it didn't actually render more frames. I'm certain that a cheat COULD be made to make the slow motion universal, but I don't think anyones made it yet.
 
The problem here is that people are trying too hard to describe complex behaviors with a single value.

Does SMB2 target 60fps? Absolutely. Does it run at 60fps when it's not under heavy load? Yep. Does it hold 60fps all the time? No.

Is it 60fps? Within the bounds of how people use English, that's a very imprecise question.

Except for when it slows down, which is fairly often.

I don't mind, I persevered. 25FPS, 30FPS, 60FPS - as long as it is not too slow I don't really care. But if the reason for targeting 60FPS is that it is smooth and makes for a better experience, and there are parts of the game where it is apparently not smooth, I don't think it is really fair to say that it succeeded at its goal of running at 60FPS.

HTupolev knows what's up.

The question was never whether the games were locked at 60 fps. The hardware in those times almost always ensured that there would be slowdowns, especially in graphics-intensive games like those churned out by Konami. It is fairer to say that they ran at 60 fps than to say that they definitely didn't.
 

kudoboi

Member
Why are people so obsessed with 60FPS as of late?

1080P/60FPS should have been the bare minimum for this generation. This has been the standard on PC for many years now, and it is unbelievable that this generation's consoles still cannot meet the requirement. Meanwhile, PC is already moving to 1440P and 120FPS.

I'd imagine that 1440P/2160P and 60FPS/120FPS will be the standard on PC mid-gen.
 

Reg

Banned
1080P/60FPS should have been the bare minimum for this generation. This has been the standard on PC for many years now, and it is unbelievable that this generation's consoles still cannot meet the requirement. Meanwhile, PC is already moving to 1440P and 120FPS.

I usually run my games at 1080p/30fps on pc. I've got a 5ghz 2500k and a 6950 btw.
 

Eusis

Member
1080P/60FPS should have been the bare minimum for this generation. This has been the standard on PC for many years now, and it is unbelievable that this generation's consoles still cannot meet the requirement. Meanwhile, PC is already moving to 1440P and 120FPS.
The reason it worked for PCs, though, was that they were held back by consoles, or failing that by needing to address a wide variety of computers, plus the fact that vendors are free to make comparatively overpowered hardware for people to buy. Consoles, being locked hardware, have the luxury of optimizing the code and min-maxing the graphical effects for solid performance, and arguably settling for 1080p/60FPS just means playing last-gen games better than last gen could.

Of course, the other angle is that we have hit diminishing returns, though the likes of AC4 do have impressive detail on PS4 over prior consoles, and at a minimum it would be nice to reliably get 1080p with 60 FPS for the games/genres that need it more. I don't know whether the XB1 helps there by being weaker, though I suspect so, as I imagine many of those developers would rather see what they could've done at 900p if the two were closer to equal, instead of doing that (or even 720p) on XB1 and just setting the resolution higher on PS4.
 

davepoobond

you can't put a price on sparks
ummm

most, if not all, SDTVs in NTSC were interlace-only, so it was 60 FIELDS per second.

nowadays all good, modern HDTVs are 60 FRAMES per second. computer screens were always progressive, so PC games benefited from progressive scan.

a field = half a frame.


PAL was progressive scan, so even though the frame rate was lower as a standard, it was better because it rendered a full frame per second, rather than a half frame per half second. broadcast bandwidth had a lot to do with those standards being set in place.


so, yeah most games were 60 fields per second... emulators and the such convert it into progressive scan.
 

danielcw

Member
computer screens were always progressive,

[...]


PAL was progressive scan

No and No.

PAL is just a color standard and none of the widely used pal-based TV (b/g/d/m) broadcast standards were progressive.

And there were interlaced computer monitors. I used one for my SNES and for watching TV via composite video.
 

davepoobond

you can't put a price on sparks
No and No.

PAL is just a color standard and none of the widely used pal-based TV (b/g/d/m) broadcast standards were progressive.

I forgot exactly what the benefit was -- it actually had more lines of resolution rather than a better refresh technology...
 

Zoggy

Member
1080P/60FPS should have been the bare minimum for this generation. This has been the standard on PC for many years now, and it is unbelievable that this generation's consoles still cannot meet the requirement. Meanwhile, PC is already moving to 1440P and 120FPS.

I'd imagine that 1440P/2160P and 60FPS/120FPS will be the standard on PC mid-gen.

there is no standard for pc. i've been using a 1680 x 1050 monitor for years, some play on 1280 x 720 and some play on 1440p or 2160p, then build and set their game settings accordingly to get the desired framerate.

and for the love of god, framerate has always been a design choice for consoles. they could make the game any goddamn framerate they want and sacrifice graphics effects, or vice versa.

same way you tinker with your settings on every pc game.

do you literally set everything on ultra 16x msaa and run at 4k? no you don't, and you don't have the pc to do it.

this new wave of pc gamers over the past few years (probably because it's so easy to build now) is getting really, really ignorant.
 
Why are people so obsessed with 60FPS as of late?

Since the graphical push the next gen offers is not as significant as it was from Xbox to X360 (or PS2 to PS3), people are looking at what else can improve.

60fps is a vastly more enjoyable gaming experience than 30fps (in most games) and therefore people FINALLY deem it important.

See this rise of people suddenly knowing what fps stands for as a blessing. Developers have a lot more pressure on them now to create 60fps games, which makes consoles a viable product for people who have played on PCs for a while.

I tried to go back to the X360 to play Halo 4... nope... I was intrigued by the campaign, but the 25-30 fps was brutal and I couldn't deal with it.
 

HTupolev

Member
most, if not all, SDTVs in NTSC were interlace-only, so it was 60 FIELDS per second.
a field = half a frame.
The timing characteristics make it pretty easy for a 480i-scanning CRT to also support native 240p scanning by placing both even and odd fields in the same locations on-screen. Incidentally, this is exactly what older consoles do with 240-line content. You can call it "60 fields", but it actually is frames, albeit 240-line frames with visible scanlines because of how skinny the electron beam is relative to the line separation.

It's later 480i games that started taking advantage of the interlacing, very common in the sixth gen.

PAL was progressive scan, so even though the frame rate was lower as a standard, it was better because it rendered a full frame per second, rather than a half frame per half second. broadcast bandwidth had a lot to do with those standards being set in place.
No, PAL is 576i at 50 fields per second.

Also, if PAL had targeted progressive scan by halving the refresh rate as you said, that would:
1) not have offered a bandwidth advantage, and
2) have been completely unwatchable, because 25Hz is far below the flicker-fusion threshold; CRT flicker at that rate would be glaring.
 

Steiner84

All 26 hours. Multiple times.
Because to me, in almost all games, it is the better experience.
Especially in every game that has its focus on gameplay and not on cinematics.
Definitely in every fast-paced game, like most first- and third-person games and all racing games.
 

petran79

Banned
Sorry to break it to you, but every part of this is wrong. The refresh rate for VGA Mode X or 13h* (which is what 320x200, 256-colour DOS games like Doom and Jazz Jackrabbit used) was 70Hz, not 60Hz like NTSC. The Doom engine was capped at half that, meaning it maxed out at 35 FPS regardless of your monitor's capabilities. Koren already explained how console games could run at 60 FPS on ordinary TVs.

*Edit: I originally wrote "Mode X" alone here. Doom used a modified 13h with similarities to Mode X. Hey, at least I didn't call it Mode 7!

Yes, probably it was due to computer monitors' higher refresh rate and bandwidth, and the lack of flickering, that gave the impression of a smoother picture.
But a monitor running at 60Hz does not mean the game can reach that number; movement can be choppy even at 60 fps.

E.g. compare the Genesis versions of Aladdin and The Lion King on a TV with the MS-DOS versions on a computer monitor.
The MS-DOS versions run much smoother, probably also due to the difference in CPU and GPU power...

But in the late '90s on Windows, monitors could reach 120+ Hz at 640x480 resolutions. It was at that time that 3D games began to make a difference. Then LCD monitors brought the refresh rate down again...

But the VGA card also mattered: the more memory it had, the smoother 2D MS-DOS games would run. I had a 512 KB card, so a lot of games had a choppy frame rate.
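To put numbers on the quoted Mode 13h explanation above, here's a tiny illustrative C program (not from any real engine) showing why a 35 Hz tic cap on a 70 Hz display means each rendered frame is shown for exactly two refreshes:

```c
#include <stdio.h>

int main(void)
{
    const double vga_refresh_hz = 70.0; /* VGA mode 13h refresh */
    const double doom_tic_hz    = 35.0; /* Doom's fixed simulation rate */

    /* Each rendered frame sits on screen for exactly two 70 Hz refreshes. */
    printf("refreshes per frame: %.0f\n", vga_refresh_hz / doom_tic_hz); /* 2 */
    printf("frame time: %.1f ms\n", 1000.0 / doom_tic_hz);               /* ~28.6 */
    return 0;
}
```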
 
Man, the same old ignorance over and over...

* People mixing up a game's framerate with how many animation frames a sprite had. Mario's jumping in Super Mario Bros. only had 2 frames of animation, so it was a 2fps game! LOLs, no.

* Spouting about field and frames, and how everything can only be technically 60 fields per second. As described exhaustively in this thread, you can stack fields on top of each other to make a frame. 480i becomes 240p, which is what most of these consoles did. Seriously, NES had progressive scan!

* Saying nonsense like, "Sure, the game goes at 60fps, until it slowed down!" Yeah, obviously.

Gamers had it good back then in terms of frame rate and control. Most of the best games were 60fps games. Since the PS1/N64 era, and the advent of polygons, more and more games are targeting 30fps, sacrificing that smoothness for cinematics, which some find unfortunate.
 
But the VGA card also mattered: the more memory it had, the smoother 2D MS-DOS games would run. I had a 512 KB card, so a lot of games had a choppy frame rate.

Video cards back then didn't do much more than act as a frame buffer. You copied a piece of memory onto the card, and at v-sync that memory was converted to a video signal and displayed on the screen.

The amount of video memory only determined how big your frame buffer was, and hence your max resolution.

A 320x240 screen, at 8 bits per pixel, only needed 76,800 bytes (~75 KB).
640x480 = 307,200 bytes (300 KB)

Having less video memory means smaller resolutions, which means less memory to blit to the video card, which means faster framerates, not slower. :)
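That arithmetic as a tiny program, if anyone wants to check it (illustrative C):

```c
#include <stdio.h>

/* Frame buffer size for an 8-bit (256-colour) mode: width * height bytes. */
static long fb_bytes(long w, long h, long bpp) { return w * h * bpp / 8; }

int main(void)
{
    printf("320x240 @ 8bpp = %ld bytes (~75 KB)\n", fb_bytes(320, 240, 8));
    printf("640x480 @ 8bpp = %ld bytes (300 KB)\n", fb_bytes(640, 480, 8));
    /* Both fit in a 512 KB card; extra VRAM bought resolution, not speed. */
    return 0;
}
```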
 

Z3M0G

Member
Why are people so obsessed with 60FPS as of late?

People need something they can quantify when comparing this gen to last gen... directly comparing things like lighting, polygon count, and even texture resolution can be difficult and even subjective to some/most people. But 1080p vs 720p, 60fps vs 30fps are improvements that can be easily understood and expected.

Basically, they are already the buzzwords of the new generation. It's unclear if people will still care as much a year from now, or if more or fewer games will reach a new 1080/60 standard... or if it will even become a casual standard.
 
PAL TVs could only display 25fps. So I thought they couldn't run games any higher.

PAL = 50 FPS (great colors, sharp picture)
NTSC = 60 FPS ("dirty" colors, smooth picture)
Today's TVs: All 60 FPS

60 FPS should be the standard for quality graphics. It's a pity that only Nintendo is developing almost all their games with 60 FPS (at least on Wii and Wii U).

Most developers today sacrifice visual quality for cheap graphic effects.
 

DonMigs85

Member
They were ideally supposed to scroll at 60FPS. Usually if the system couldn't keep up you would get slowdown rather than frame drops/skipped frames (for example, having a swarm of bees on the screen in A Link to the Past)
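A rough sketch of why that showed up as slowdown rather than dropped frames (illustrative C, not real console code): game logic advanced one fixed tick per presented frame, so a frame that needed two vblanks halved the game's speed.

```c
#include <stdio.h>

int main(void)
{
    /* vblanks each frame's rendering consumed; 2 = overloaded (bee swarm!) */
    int cost[6] = { 1, 1, 2, 2, 2, 1 };

    int vblank = 0, game_time = 0;
    for (int frame = 0; frame < 6; frame++) {
        vblank    += cost[frame]; /* busy frames occupy extra vblanks */
        game_time += 1;           /* but logic still advances only one tick */
        printf("vblank %2d: game time %d\n", vblank, game_time);
    }
    /* During the busy stretch the game advances at half the vblank rate:
       30 logic ticks per second on a 60 Hz TV, which players saw as slowdown. */
    return 0;
}
```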
 

nded

Member
Yes, most games from the 8-bit and 16-bit console eras targeted the maximum refresh rate of the displays they were running on, so 60 or 50 fps depending on your region. The fact that they used fields/half-frames does not mean they were actually 30 frames per second; visual information is still being updated every 16.66ms, just at half the vertical resolution.
 

Audette

Member
X-Men 2: Clone Wars for the Sega Genesis. I recently noticed, playing this on my CRT, that it's a great example of refresh rate.
Single player really does have that high-framerate feel/look to it. When playing 2-player mode the frame rate is heavily reduced.
Noticed this when playing two players: when one of us would game over, BOOM, blast processing :p My buddy is one of those people who can't tell FPS differences, and chalked it up to the camera only following 1 player, but it's definitely more than that happening.

While I don't know if any older games actually did hit 60FPS, this one's a pretty good example that some 16-bit games were able to display at a much smoother refresh.
 

tronic307

Member
They sort of had to be 60FPS because they were mostly 240p. The NTSC standard was 480i 30 locked @ 15.75kHz, so if the resolution was halved to 240p the refresh rate had to double to maintain the same line frequency. Games that output 480i 30, rendered 3D, or updated the frame buffer at 1/2 or 1/3 refresh rate were few and far between until the 5th generation (PlayStation, N64, Saturn). RPM Racing for the SNES was 480i; so was the two-player mode in Sonic the Hedgehog 2. I'm sure there were a few more examples.
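The scan-rate arithmetic spelled out, with approximate NTSC numbers (illustrative C):

```c
#include <stdio.h>

int main(void)
{
    const double line_rate_hz = 15734.0; /* NTSC horizontal scan frequency */

    /* 480i: 525 lines per full interlaced frame -> ~30 frames (60 fields)/s */
    printf("480i: %.2f frames/s\n", line_rate_hz / 525.0); /* ~29.97 */

    /* 240p: ~262 lines per progressive frame -> ~60 frames/s,
       at the very same line rate the TV was built for */
    printf("240p: %.2f frames/s\n", line_rate_hz / 262.0); /* ~60.05 */
    return 0;
}
```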
 

Koren

Member
No, PAL is 576i at 50 fields per second.
Technically, PAL is a color encoding (a way to get color with fewer than 3 signals) and has nothing to do with resolution.

Most specifications based on PAL (such as PAL-B/G, PAL-I, etc.) are indeed 576i at 50 fields per second, but PAL-M is 480i at 60.

Besides that, I concur.


60 interlaced is really only 30fps.
I disagree, but in any case, it's progressive on 8/16-bit consoles and not interlaced (as on 32-bit ones).

Maybe but you couldn't have a full frame at 60 fps so I guess it depends on how you interpret it.
You could have (and you HAD) a full frame at 60fps, just at half the resolution. It was ~224p (progressive) at 60fps, for example, for the SNES in NTSC regions.
 

Koren

Member
Basically, they are already the buzzwords of the new generation. It's unclear if people will still care as much a year from now, or if more or fewer games will reach a new 1080/60 standard... or if it will even become a casual standard.
I already cared about 60Hz (like many people, judging by publications I read at the time) when I was playing F-Zero X on N64 vs other racers. Ditching backgrounds and visual effects to get 30 cars at 60fps was a GREAT decision.

It's not a new idea in any way, and it isn't going anywhere. You'll find many requests for 60fps in every generation since the PSOne. I don't want to dig for those too much, but see for example a petition about Project Gotham 2 on the original Xbox:
https://groups.google.com/forum/#!topic/microsoft.public.xbox/80v77Z3sJSs

Edit: Didn't know "Petition online" was a banned site (why? I'm curious). Sorry, replaced the link with a Google Groups one.
 
so, yeah most games were 60 fields per second... emulators and the such convert it into progressive scan.

You're wrong. Go do some more research. NES/SNES-era games outputted in progressive, not interlaced, hence why they didn't suffer from flickering. This TV mode is known as 240p. Look it up.
 

dark10x

Digital Foundry pixel pusher
smoother 2D scrolling in MS-DOS mostly

for MS-DOS 1-2 MB were enough for most games.

For Windows games, 4 MB and up was necessary, especially for 3D.
It's interesting, as CRT monitors did indeed offer a great number of different refresh rates compared to 60 Hz TVs, but that didn't actually result in increased fluidity on the PC, as the PC wasn't particularly well suited to moving tiles around at high frame-rates. 2D games were generally pretty choppy on the PC up until 1994-ish and, even after that point, a lot of them still suffered from issues that weren't present on consoles.

I know, I was there and was desperately trying to match console 2D performance on my PC. There just weren't many games that could pull it off and those that did (like Jazz Jackrabbit) did so with severe sacrifices (very simplistic backgrounds, for instance).

When console games WERE ported over they either had performance issues or super high requirements.

Of course, other non-x86 computer hardware was much better suited to this type of scrolling. Amiga, C64, and the like were all very capable in this regard.

It should be noted that the approach to 2D on 3DO was similar to the PC, which is why, despite its faster overall hardware, it struggled to handle 60 fps 2D platformers. Gex, for instance, ran at 30 fps with serious slowdown that dropped it well below that in many cases. The games that could deliver smooth performance were rare. Shame everything was output in interlaced mode (despite being internally handled at 320x240).

Maybe but you couldn't have a full frame at 60 fps so I guess it depends on how you interpret it.
As others have noted, this is not a good way to look at it.

More importantly, it's not relevant to this thread. After all, the OP was specifically talking about 8 and 16-bit games which *DID* output in progressive.

When resolutions increased we moved into interlacing due to limitations of TV technology but that doesn't mean what you see will appear as anything other than 60 fps.

There are examples of games rendering 60 fields per second which actually DO take advantage of interlacing to increase performance. They still visually appear as 60 fps, though with artifacts. But most games from the PS2/GC/Xbox era rendered internally in progressive scan regardless of what was output on screen. Many PS2 games can be forced to output at a full 480p (any game that wasn't using field rendering, basically).
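For anyone curious, field rendering boils down to something like this sketch (hypothetical C, not any real PS2 API): each 60 Hz pass rasterizes only the scanlines of the current field, halving the per-pass fill cost at the price of interlace artifacts.

```c
#include <stdint.h>
#include <stdio.h>

enum { WIDTH = 640, HEIGHT = 480 };
static uint8_t framebuffer[HEIGHT][WIDTH];

/* Stub for the real rasterizer: fill one scanline. */
static void render_scanline(int y)
{
    for (int x = 0; x < WIDTH; x++)
        framebuffer[y][x] = (uint8_t)(x + y);
}

/* Field rendering: draw only this field's lines (0 = even, 1 = odd). */
static void render_field(int field)
{
    for (int y = field; y < HEIGHT; y += 2)
        render_scanline(y); /* half the lines touched per 60 Hz pass */
}

int main(void)
{
    render_field(0); /* even field, one pass */
    render_field(1); /* odd field, next pass */
    printf("two fields = one full %dx%d frame\n", WIDTH, HEIGHT);
    return 0;
}
```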
 

bengraven

Member
When I was a kid we had a thing called "slowdown".

That's when the framerate dropped like a motherfucker down to like 10fps. We didn't know about frames then, we just called it 'slowdown".

Sometimes it helped save our lives.

These days you kids call THAT "bullet time".
 

Wonko_C

Member
You see... 60fps didn't matter back then. Even amazing games like Metal Slug weren't 60fps (had no clue). Games were simple and enjoyable.

PC gaming ruined everything.

PC gaming did, seriously? Are you purposefully ignoring decades of arcade and 8/16 bit console gaming? What about those who aren't "PC gamers" and know their stuff?

believe me. During the nes days nobody even talked about graphics, much less framerate. We were just happy to have good games after years of atari. Lol

Not me, I still remember talking to my uncle, excited about the new game Legend of Zelda on the NES, and telling him the graphics were great, a term I learned by reading Nintendo Power and other gaming magazines when I was still 8 years old... Then I got reprimanded because I got too excited over a videogame. (Back then videogames were the ONLY thing I talked about as a kid.)

Man, the same old ignorance over and over...

* People mixing up a game's framerate with how many animation frames a sprite had. Mario's jumping in Super Mario Bros. only had 2 frames of animation, so it was a 2fps game! LOLs, no.

* Spouting about field and frames, and how everything can only be technically 60 fields per second. As described exhaustively in this thread, you can stack fields on top of each other to make a frame. 480i becomes 240p, which is what most of these consoles did. Seriously, NES had progressive scan!

* Saying nonsense like, "Sure, the game goes at 60fps, until it slowed down!" Yeah, obviously.

Gamers had it good back then in terms of frame rate and control. Most of the best games were 60fps games. Since the PS1/N64 era, and the advent of polygons, more and more games are targeting 30fps, sacrificing that smoothness for cinematics, which some find unfortunate.

This. I've never understood people who talk like they know what they're talking about; in the cases where I'm wrong, I see it as a learning experience.
 
Speaking of which, where are some instructions on how to do this?

Whoops, didn't realize this thread kept going

I used Skyfireblaze's method. By far the easiest way to do this stuff; the other UI-less versions and XMedia made me want to kick a puppy.

I recorded with FRAPS at half-size @ 120fps (this fixes the res to 960x540 if your game is 1080p), otherwise my CPU can't keep up. This res is probably the ideal for any other program too.

BTW, if you use free FRAPS instead of something else and you hate the watermark, you'll have to crop it out. I used Handbrake for that. It's in one of the settings tabs, the top arrow set (I'm not at my desktop right now). The correct value depends on the resolution. For 540p, it's 28-36 or something. Be sure your new "master" is high quality :)
 

Eusis

Member
You're wrong. Go do some more research. NES/SNES-era games outputted in progressive, not interlaced, hence why they didn't suffer from flickering. This TV mode is known as 240p. Look it up.
You can see this on some HDTVs too if you hook up a PS2 and play a PS1 game (or the right PS2 game) on it. It'll either tell you it's in a 240 resolution, or it'll just not display anything at all, which kind of indicates it's a distinct signal from 480i in and of itself.
 
PAL = 50 FPS (great colors, sharp picture)
NTSC = 60 FPS ("dirty" colors, smooth picture)
Today's TVs: All 60 FPS

60 FPS should be the standard for quality graphics. It's a pity that only Nintendo is developing almost all their games with 60 FPS (at least on Wii and Wii U).

Most developers today sacrifice visual quality for cheap graphic effects.

I wouldn't call NTSC colour dirty at all; if anything it was nicely saturated and the picture really 'popped', but to me the reds always looked a little off.

PAL TVs more often than not could display 60Hz too, even very old pre-SCART models, if you were willing to face electrocution by adjusting the horizontal sync, although they couldn't decode the colour.

Had to play my imported Super Famicom for weeks in black and white :(
 

Gattsu25

Banned
Why are people so obsessed with 60FPS as of late?

After nearly a decade of stating that framerate doesn't matter and thinking that graphical fidelity can only be achieved at 30fps with tearing... people are now starting to see just how utterly wrong they were.

They got a taste of good looking games running at 60fps and they crave more.

 

Eusis

Member
After nearly a decade of stating that framerate doesn't matter and thinking that graphical fidelity can only be achieved at 30fps with tearing... people are now starting to see just how utterly wrong they were.

They got a taste of good looking games running at 60fps and they crave more.
Doubt it's the same people, most of the time anyway. No doubt we've had some converts with games like MGR, Wii U titles, or some early PS4/XB1 games, but it's probably more that the people who wanted 60 FPS are getting louder again now that we can have games that look as amazing as Killzone stay above 30 reliably. If we can have that, then dialing down means that basically any game type that isn't overly hectic, and perhaps not too wide open, can look great at 60 FPS!

Though that may crumble later in the gen anyway. And I do stand by it being case-by-case whether 30 FPS is "good enough"; only a rare few games do I actively prefer to stay at 30 FPS, like South Park.
 

VillageBC

Member
I miss my CRT and its fluid display, capable of handling frame rate changes with little effect on your eyes. With my Dell 2312 I'm forced into vsync/triple buffering to prevent tearing, and anything that pans slowly looks like a jittery mess.

Yeah, I need to tweak my settings more.
 