
Why did the GB, GBC and GBA displays/games run at 59.7 frames per second?

Peltz

Member
GB, GBC and GBA games ran at 59.7 FPS (as far as I know). You can see this if you try to play these games on the GB Player - there will be some input lag even on a CRT, and a slight "skip" every few frames.

There is now a homebrew application called Game Boy Interface Ultra Low Lag (GBI ULL) that will actually allow the GB Player to run at its native refresh rate (and resolution!) on some rare CRTs that can support it. The Sony PVM 20M2U is one such CRT that I can personally attest allows this. Not all PVMs and BVMs will sync properly though, and as far as I'm aware, no consumer sets will sync at this odd frame rate either.

My question is, why did Nintendo not develop the hardware to run at a clean 60fps in the first place? Did DS hardware have a similar FPS quirk? In fact, do any other video game devices have odd refresh rate caps?
 

Stopdoor

Member
Never heard about this - it's not often you hear framerate talk about the Game Boy. Is the Game Boy Player's performance really comparable to the original hardware, though? I just assumed any input lag was due to software quirks. But I guess you probably know more than me.
 

McSpidey

Member
You might be surprised to find out practically nothing runs at a clean 60. Even what we think of as a 60Hz screen really runs at 59.94Hz.

It's just about finding a near-enough multiple of a single source clock circuit.
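For what it's worth, the familiar 59.94 number falls straight out of that kind of clock division. A quick back-of-the-envelope check using the standard NTSC timing constants (nothing Game Boy specific, just an illustration):

```python
# NTSC doesn't target "60" directly; the field rate is derived from a master clock.
line_rate = 4_500_000 / 286      # horizontal scan rate: ~15,734.27 Hz
field_rate = line_rate / 262.5   # 262.5 lines per interlaced field
print(f"{field_rate:.4f} Hz")    # ~59.9401 Hz - the "almost 60" everyone quotes
```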
 
Bound to CPU cycles, it seems.

The GBA has a TFT color LCD that is 240 x 160 pixels in size and refreshes every 280,896 CPU cycles, which works out to around 59.73 Hz. Most GBA programs will need to structure themselves around this refresh rate. Each refresh consists of a 160-scanline vertical draw (VDraw) period followed by a 68-scanline blank (VBlank) period. Furthermore, each of these scanlines consists of a 1004-cycle draw period (HDraw) followed by a 228-cycle blank period (HBlank). During the HDraw and VDraw periods the graphics hardware processes background and obj (sprite) data and draws it on the screen, while the HBlank and VBlank periods are left open so that program code can modify background and obj data without risk of creating graphical artifacts.
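You can check those numbers against the GBA's 16.78 MHz (2^24 Hz) CPU clock and the 59.73 figure drops right out. A quick sanity check (the clock value is the documented GBA master clock; everything else comes from the quote above):

```python
# GBA master clock: 2^24 Hz = 16,777,216 CPU cycles per second
cpu_hz = 2 ** 24

cycles_per_scanline = 1004 + 228            # HDraw + HBlank, per the quote
scanlines_per_frame = 160 + 68              # VDraw + VBlank
cycles_per_frame = cycles_per_scanline * scanlines_per_frame
refresh_hz = cpu_hz / cycles_per_frame

print(cycles_per_frame)                     # 280896, matching the quoted figure
print(f"{refresh_hz:.4f} Hz")               # ~59.7275 Hz - the GB/GBA "59.7 fps"
```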
 

Mike Golf

Member
If I remember correctly, it was to alleviate the very poor pixel response times of the screens, especially on the original GB and its derivatives: instead of a consistently smooth smearing of dark shades you'd get a slight judder, making it a little easier to track movement and see what you're doing.
 
You might be surprised to find out practically nothing runs at a clean 60. Even what we think of as a 60Hz screen really runs at 59.94Hz.

It's just about finding a near-enough multiple of a single source clock circuit.

This. Hell, some games will display the exact refresh rate in the graphics options. I forget examples off the top of my head, but my 60Hz Samsung (with FreeSync) reports as 59.8 or something like that.
 

NoKisum

Member
Because the Game Boy line never had to run a Soulsborne, or a competitive FPS, or anything else involving "git gud" tendencies.
 

Alpha17x

Neo Member
My experience on the hardware end of things is limited, and I've never worked for or with Nintendo, but based on my own experience doing my own nonsense with hardware, I suspect it was an issue of system optimization, probably input or overall system latency. The original intent may have been to run the system at 60Hz, but doing so might have caused issues with certain flagship titles, or perhaps all titles. With hundreds of thousands of parts already ordered, binned and accepted, the system engineers get creative and find that reducing the refresh rate by a few hertz solves the problem and isn't enough of a change to be detrimental.

The product is shipped on time, hits shelves on time, Nintendo doesn't have to spend millions to fix what could be solved with a 'firmware' tweak, and customers don't notice anything out of place, because perceptibly, it isn't.

This is of course assumption and conjecture on my part, and my experience with hardware is with open dev boards like the Nvidia Jetson or Hardkernel Odroid systems.
 

Paragon

Member
To annoy me when I try to use Retroarch for those games and get slight judders or input lag.
If you're using RetroArch you should be synchronizing the framerate to the refresh rate.
Enable V-Sync (preferably Hard GPU Sync unless it stutters) and Audio Sync, then disable the Maximum Run Speed option.
Wait until the Estimated Screen Framerate deviation settles as low as it can - ideally below 1% - and then select it to set the Vertical Refresh Rate to that value.
Now the game speed should be locked to your refresh rate and there won't be any more stuttering due to <1% timing differences between the emulated system and the display.
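If you'd rather set this once in the config file instead of through the menus, the relevant entries live in retroarch.cfg. The key names below are from memory and may differ slightly between RetroArch versions, and the refresh value is only an example - use whatever your own Estimated Screen Framerate settles on:

```
# retroarch.cfg - lock game speed to the display's measured refresh rate
video_vsync = "true"
video_hard_sync = "true"         # "Hard GPU Sync" - lower latency, disable if it stutters
audio_sync = "true"
video_refresh_rate = "59.950"    # example only; use your own measured value
```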

To solve this problem any other way, you need a Variable Refresh Rate display. (G-Sync/FreeSync)
No other display will be able to run at a refresh rate completely locked to the original game speed.
Unfortunately, it seems that there is no way to fully disable the sync features in RetroArch, and so the massive advantage it has over other emulators when running on a fixed refresh rate display becomes a disadvantage when trying to use a variable refresh rate display.
I get constant stuttering in RetroArch when using G-Sync, which disappears when using the above V-Sync settings.
 
In fact, do any other video game devices have odd refresh rate caps?

A lot of arcade hardware runs at very non-standard timings. For example, Seibu Kaihatsu hardware ran at 54Hz. MVS and a lot of Konami games run at 51.98Hz, which is why character shadows flicker so badly when emulating the hardware.

Fire up a version of mameuifx and you will see there are close to a hundred different outputs in the refresh tab. Bring on FreeSync technology in mainstream LCD televisions!
 

LewieP

Member
I'm pretty sure the DS has GBA hardware inside, which means that GBA games should probably run at the same framerate as they do on a GBA.

Sure, I just wasn't sure if that applied to the display or not.

As an aside, I guess that a Wide Boy 64 is a similar deal to the GB Player?
 

Peltz

Member
A lot of arcade hardware runs at very non-standard timings. For example, Seibu Kaihatsu hardware ran at 54Hz. MVS and a lot of Konami games run at 51.98Hz, which is why character shadows flicker so badly when emulating the hardware.

Fire up a version of mameuifx and you will see there are close to a hundred different outputs in the refresh tab. Bring on FreeSync technology in mainstream LCD televisions!
MVS ran at 51.98?! How does that not cause screen tearing on my CRT? (I have a CMVS). What was the AES's frame rate?

Yea all this is pretty surprising. Now I wonder why only GB Player has stuttering issues.

Because the Game Boy line never had to run a Soulsborne, or a competitive FPS, or anything else involving "git gud" tendencies.

This is obviously not the reason. We're also finding out that arcade hardware ran at lower-than-60 fps rates as well, and that hardware had fighting games with frame-specific inputs. I'd argue that fighting games rely on smooth frame rates as much as Souls games or FPSes... if not more so.
 

Borman

Member
MVS ran at 51.98?! How does that not cause screen tearing on my CRT? (I have a CMVS). What was the AES's frame rate?

Yea all this is pretty surprising. Now I wonder why only GB Player has stuttering issues.

This is obviously not the reason. We're also finding out that arcade hardware ran at lower-than-60 fps rates as well, and that hardware had fighting games with frame-specific inputs. I'd argue that fighting games rely on smooth frame rates as much as Souls games or FPSes... if not more so.

MVS isn't that low.

MVS vertical refresh rate is an odd 59.1856060~ Hz.
http://www.neo-geo.com/forums/archive/index.php/t-224444.html?
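That figure lines up with the commonly cited MVS video timings: a 6 MHz pixel clock, 384 pixel clocks per line, and 264 lines per frame. Quick check (those constants are my assumption, not from the thread):

```python
# Commonly cited Neo Geo MVS video timings (assumed here for illustration)
pixel_clock_hz = 6_000_000        # 6 MHz dot clock
clocks_per_line = 384             # total pixel clocks per scanline, incl. blanking
lines_per_frame = 264             # total scanlines per frame, incl. vblank

refresh_hz = pixel_clock_hz / (clocks_per_line * lines_per_frame)
print(f"{refresh_hz:.7f} Hz")     # ~59.1856061 Hz - matches the figure above
```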
 