
Blind test shows gamers overwhelmingly prefer (and can identify) 120Hz refresh rates

Maybe this will finally put a cork in the people who claim nobody notices the difference or cares about it. God, I hope devs pursue high framerates so we can get back to the golden era of gaming visually, instead of the current muddy, slow crap we get on consoles.
 
People who say they cannot tell the difference between 30 and 60fps, or 60 and 120Hz, or 720p, 1080p and 4K, are doing one of two things. They're fanboyishly parroting their favorite company's marketing talking points. Or they really mean that they can tell but just don't care.

Or they just want better visuals over something they'd have to stare at for an hour in a comparison to even remotely notice.

I don't want to sacrifice even a single blade of grass for a frame rate higher than 30fps, because I find it the most useless thing ever in gaming.

Then again, I like slower-paced games like RPGs the most :), so graphics are 1000x more important than frame rate there.
 
Who cares what gamers think? I use my PC monitor for games 10% of the time I use it.
120Hz displays have ghastly, crap colours. I've got a 30" Dell for a reason, and even then I can go into the lounge, pull up a photo on the plasma via the HTPC, and it embarrasses the Dell 30".

I'm settled personally: OLED or bust for my next display, lounge or PC. I'll accept nothing less.

Is your Dell a U3011 by any chance? Because there is no way your plasma embarrasses it when it comes to colour reproduction.
 
That top product link goes to your cart, not the product.

Also, I'm fairly certain that the 4k/60fps thing is partially dependent on what your GPU is as well.

A little late, but I fixed it.

It's an ASUS VG248QE.

As for the GPU, it's a GTX 780.

My down sampling settings are just slightly modified from Corky's and scitek's

scitek's settings work; Corky's make my monitor blink. Both of them are using older GPUs, and I'm assuming 60Hz screens.
 
I play at 2560x1440 with dual 7970s. The high res makes the cards run HOT HOT HOT.
If I don't cap my framerate at 60fps, temps creep up into the 87-88 degree range.

Granted if I drop to 1920x1080 the temps are not a problem. But why on earth would anyone do that :p

Point: 120fps gaming would probably burn out my cards pretty quickly. So, from that perspective, I think I'll stick to 60.
 
You should check this out and see if you can easily tell the 60|30 split.

ibqTqCfQJ3qbor.gif

My browser loaded this gif at a slow speed initially and the difference was quite easy to see then. However once it got up to full speed I couldn't tell the difference.
 
I wonder if this is what causes a lot of my motion sickness in first person games? I have a PC and a PS3/XBOX hooked up to the same TV, I can play Borderlands 2 fine on the PC, but it makes me sick on XBOX (everything else is the same, even the controller I use).
 
People who say they cannot tell the difference between 30 and 60fps, or 60 and 120Hz, or 720p, 1080p and 4K, are doing one of two things. They're fanboyishly parroting their favorite company's marketing talking points. Or they really mean that they can tell but just don't care.

I don't really agree with that.

I think that some people honestly just can't tell a difference. What's annoying is when people who can't tell the difference say "NO ONE can tell a difference" simply because they can't.
 
Seems like a silly test. 120Hz is strictly better than 60Hz, all else being equal.

Why not a more fair test representative of the real world trade-offs involved?

Compare 120Hz at 720p versus 60Hz at 1080p.

My guess is that those results would be quite different.
 
BTW, for those who use Crysis 3 as an example of unattainable performance:


Textures: Very High

Shadows: Medium

Water: Low (unless you like performance problems)

Everything Else: Low

AA: None

Congrats, you are on your way to 120hz, with a negligible loss in eyecandy. Drop your resolution as needed.
 
Seems like a silly test. 120Hz is strictly better than 60Hz, all else being equal.

Why not a more fair test representative of the real world trade-offs involved?

Compare 120Hz at 720p versus 60Hz at 1080p.

My guess is that those results would be quite different.

What would be the point of your proposed test, exactly?
 
All I've been playing lately has been L4D2, at a locked 120fps.

It's amazing.

Actually, my friends were impressed by my reaction speed when killing enemies, saying "you're not like that in other games".

Then I realized.
 
I think the jump between 30 and 60 is bigger than 60 and 120.

60 is already silky smooth as it is.

I'll be happy with 60fps. We don't need 120fps.
 
A game doesn't have to run at 120fps to benefit from a 120Hz panel.
120Hz adds new displayable, vsynced framerates.

When the framerate drops below 60fps it doesn't fall straight to 30fps; it goes to 40.
Likewise, from 30 the next step down is 24fps.
120Hz monitors are a lot better, especially for sub-60fps gaming.

Personally, I cannot wait for proper 240Hz monitors, as they will give plenty more vsynced framerates, including 48fps for watching The Hobbit. ;)
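The framerates mentioned in this post fall out of simple arithmetic: with vsync, each frame is held on screen for a whole number of refresh cycles, so the evenly-paced framerates are exactly the integer divisors of the refresh rate. A minimal sketch of that maths (hypothetical helper, just to illustrate):

```python
def vsynced_rates(refresh_hz, min_fps=20):
    # With vsync, a frame is displayed for a whole number of refresh
    # cycles, so even frame pacing is only possible at framerates that
    # divide the refresh rate exactly.
    return [refresh_hz // n for n in range(1, refresh_hz // min_fps + 1)
            if refresh_hz % n == 0]

print(vsynced_rates(60))   # [60, 30, 20]
print(vsynced_rates(120))  # [120, 60, 40, 30, 24, 20]
print(vsynced_rates(240))  # includes 48, for 48fps film
```

This is why a dip below 60fps on a 120Hz panel lands at 40fps instead of plummeting to 30, as the post above describes.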

At least this poster understands Hertz vs FPS. Everyone else is still arguing that they can't reach 120fps, which this thread is not about. It's about the refresh rate of the monitor.
 
At least this poster understands Hertz vs FPS. Everyone else is still arguing that they can't reach 120fps, which this thread is not about. It's about the refresh rate of the monitor.
Yeah, people always assume that 120 (or 144) Hz displays are only useful if you achieve those framerates, but they improve vsynced responsiveness and fluidity at any rendering speed.
 
Yeah, people always assume that 120 (or 144) Hz displays are only useful if you achieve those framerates, but they improve vsynced responsiveness and fluidity at any rendering speed.

And they're also better for movies if you ever watch them on your monitor.
 
What about TVs? My next computer is probably gonna be hooked up to an HDTV that I think has 120Hz functionality. Usually I turn that crap off and leave the TV in Game Mode for console games, but I wonder if I should try it for PC games.
 
ibqTqCfQJ3qbor.gif


Going by the gifs above, I can't tell the difference between 30fps and 60fps at all.

So I wonder why anyone would sacrifice extra graphics for something that isn't noticeable even in a side-by-side comparison.

I won't even get into 120fps; I think 30 vs 60 is already 99.99% similar (going by the gifs posted, and actually 100% similar, since I can't tell any difference).

I also won't mention slower-paced games, where this near-zero difference would be below zero or something.
 
What about TVs? My next computer is probably gonna be hooked up to an HDTV that I think has 120Hz functionality. Usually I turn that crap off and leave the TV in Game Mode for console games, but I wonder if I should try it for PC games.

That's not actual 120Hz by the sound of things.
 
What about TVs? My next computer is probably gonna be hooked up to an HDTV that I think has 120Hz functionality. Usually I turn that crap off and leave the TV in Game Mode for console games, but I wonder if I should try it for PC games.
TVs don't support 120Hz input. Their "120 Hz" feature is useless (or even detrimental) for gaming.

Good color at 60fps > 120fps.
We should have both.
 
So do LED TVs actually have 120Hz, or is it all just interpolation? How does 120Hz work anyway; do you need two inputs?


edit: seems it was answered two posts up...
 
Got the ASUS VG248QE, which can refresh at 144Hz. However, I had to change the refresh rate to 120Hz because for some reason 144Hz made me dizzy.
 
So do LED TVs actually have 120Hz, or is it all just interpolation? How does 120Hz work anyway; do you need two inputs?
It's generally all just interpolation. In terms of input, dual-link DVI or DisplayPort has sufficient bandwidth with just one cable.
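On the "two inputs" question, a back-of-the-envelope check shows why one dual-link DVI cable is enough for 1080p at 120Hz. Rough numbers only; this ignores blanking intervals, which push the real requirement somewhat higher:

```python
def pixel_data_gbps(width, height, refresh_hz, bits_per_pixel=24):
    # Raw pixel payload per second, in gigabits; a real link also
    # carries blanking intervals on top of this.
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Dual-link DVI carries 2 x 165 MHz pixel clocks x 24 bits ~= 7.92 Gbps.
need = pixel_data_gbps(1920, 1080, 120)
print(f"1080p120 needs ~{need:.2f} Gbps of pixel data")  # ~5.97 Gbps
```

So 1080p120 sits comfortably under the dual-link ceiling on a single cable, even after accounting for blanking overhead.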
 
ibqTqCfQJ3qbor.gif


Going by the gifs above, I can't tell the difference between 30fps and 60fps at all.

So I wonder why anyone would sacrifice extra graphics for something that isn't noticeable even in a side-by-side comparison.

I won't even get into 120fps; I think 30 vs 60 is already 99.99% similar (going by the gifs posted, and actually 100% similar, since I can't tell any difference).

I also won't mention slower-paced games, where this near-zero difference would be below zero or something.
Your browser may not be displaying the gifs correctly, or maybe you just can't discern the difference. Most PC gamers can tell the difference between 30 and 60fps, which is why PC games locked at 30 receive a ton of complaints. People are starting to request 120fps support in greater numbers now.

The best thing about PC gaming is that you're free to customize the experience to suit your needs, instead of having the dev dictate the IQ trade-offs.
 
Your browser may not be displaying the gifs correctly, or maybe you just can't discern the difference. Most PC gamers can tell the difference between 30 and 60fps, which is why PC games locked at 30 receive a ton of complaints. People are starting to request 120fps support in greater numbers now.

The best thing about PC gaming is that you're free to customize the experience to suit your needs, instead of having the dev dictate the IQ trade-offs.

I believe that some people may be able to tell a difference; I just oppose the posts saying that everyone should see a huge difference, when for some people there may be none (I am definitely one of them).
 
The last time I posted this the immediate criticisms were the following:

Within-subjects design, which means there was no control group.

Selection bias, as it was advertised as a "120 Hz event" and thus the attendees were self-selected.
First off, that does NOT disqualify the test results; it only shifts them towards a higher percentage.

Second, it seems they used FPS games to judge this. Completely wrong. Use a test where you slowly scroll across an SC2 arena, with no match running. Then ask which monitor stays clearer during scrolling. I'd bet on something like a 95% rate. The difference while slowly scrolling across terrain top-down, without gameplay to interfere, is very, very visible.

edit: how would I go about trying to OC my panel on an AMD card?
 
Stick to consoles mate, you're out of your depth here. You haven't played games on a 120Hz monitor so your opinion has no validity whatsoever.
I have actually, my friend is a PC gaming obsessive so I've had this stuff shoved in my face on countless occasions recently.

And which modern plasma have you played games on which somehow makes your opinion have some validity in that argument?

Also don't call me 'mate', it makes you come across like a condescending little thing.
 
ibqTqCfQJ3qbor.gif


Going by the gifs above, I can't tell the difference between 30fps and 60fps at all.

So I wonder why anyone would sacrifice extra graphics for something that isn't noticeable even in a side-by-side comparison.

I won't even get into 120fps; I think 30 vs 60 is already 99.99% similar (going by the gifs posted, and actually 100% similar, since I can't tell any difference).

I also won't mention slower-paced games, where this near-zero difference would be below zero or something.

Just because you can't see the difference doesn't mean others can't, or that it's a minuscule difference to others.
 
I only ever witnessed 120fps back in my CRT days, when every cheap CRT could do 120Hz at 640x480. It was a spinning 3D cube, and it looked magical: the movement was so smooth and natural it almost felt like I could reach out and touch it.

Just because you can't see the difference doesn't mean others can't, or that it's a minuscule difference to others.

Also, using GIFs is beyond stupid, because not all browsers/devices can display them at 60fps.
 
Is your Dell a U3011 by any chance? Because there is no way your plasma embarrasses it when it comes to colour reproduction.

It is a U3011, and it most definitely embarrasses it in regards to colour reproduction; it's a PLASMA, not an LCD.
I think you need to go back to display tech 101, sorry man.
 
I have actually, my friend is a PC gaming obsessive so I've had this stuff shoved in my face on countless occasions recently.

And which modern plasma have you played games on which somehow makes your opinion have some validity in that argument?

Also don't call me 'mate', it makes you come across like a condescending little thing.

I have a P50GT60 so I'm well versed in how good plasmas can be. Still doesn't hold a candle to 120fps for GAMING.

It is a U3011, and it most definitely embarrasses it in regards to colour reproduction; it's a PLASMA, not an LCD.
I think you need to go back to display tech 101, sorry man.

Maybe your U3011 has been poorly calibrated. There's a reason most professionals use IPS displays.
 
It is a U3011, and it most definitely embarrasses it in regards to colour reproduction; it's a PLASMA, not an LCD.
I think you need to go back to display tech 101, sorry man.
You need to clarify what you mean by "color reproduction" then. The U3011 achieves 115% AdobeRGB coverage and a delta E of ~1 when calibrated; it's not really "embarrassed" by anything in terms of color reproduction (well, maybe outside of some $3000 Eizo monitors).
 
That DmC gif was busted for me. When I click "show element" it pops into a new window, and then the difference becomes much more apparent. In my regular browser window it looks different, so Firefox definitely does something weird to it. Anyway, here's the classic comparison to look at the difference:

http://boallen.com/fps-compare.html
 
You need to clarify what you mean by "color reproduction" then. The U3011 achieves 115% AdobeRGB coverage and a delta E of ~1 when calibrated; it's not really "embarrassed" by anything in terms of color reproduction (well, maybe outside of some $3000 Eizo monitors).

I'm not even going to bother googling it for you: backlit LCD monitors look like shit compared to plasma, and the dynamic range is a mess. Believe whatever you like; good for you.
 
I have a P50GT60 so I'm well versed in how good plasmas can be. Still doesn't hold a candle to 120fps for GAMING.



Maybe your U3011 has been poorly calibrated. There's a reason most professionals use IPS displays.

It's called burn-in and price.
It's also genuinely difficult to make plasmas under a certain size; this is all well documented. One of the CNET writers is a complete videophile nutcase who actually knows his stuff; quite good articles.

Picture quality going from the study to the lounge HTPC: no question the plasma wins. Skin tones alone on a simple JPG; one looks like a picture on a monitor, the other looks vastly more real.
 