
240Hz vs 144Hz vs 120Hz vs 60Hz

Wait, wait! So if my monitor is 60Hz, is there a point to cranking my FPS higher than 60? This is news to me. If so, do you also need to find the refresh rate of your TV?
 
I've got a PG279Q, which is 1440p and 144hz. I can game perfectly fine at 60fps, and in fact I usually do run games around there since it can be difficult to get games up to 144hz at 1440p (and I don't play games that benefit a whole lot from 144hz, like CS:GO).

The refresh rate difference is actually most noticeable to me outside of gaming, when I'm just moving my mouse around or dragging windows around. There's something about the smoothness of the mouse cursor and being able to read windows as I move them around that I really like.

120hz + ULMB is also kind of incredible, though I don't find myself using it very often. This test in particular is amazingly clear and readable on that setting.
 
I'm currently on a Swift 1440p gsync monitor and I had a 4k monitor at one time as well. Yeah 4k makes everything much more crisp, but the refresh rate is god awful coming from a 144hz panel. It's extremely noticeable. I eventually got to the point where I'd rather have the refresh rate over the fidelity of 4k. Feels much better.

Now that 4k/144hz monitors will be releasing soon, you can get the best of both worlds.

For those with high refresh panels, are you adjusting settings in games to hit 144fps? I found that anything around 90+ feels really good with gsync.

For single player games 4k@60 is perfectly fine. It's best of both worlds, maxed 4k is gorgeous and 60 is smooth as hell.
 
Yup non-K, tired of seeing them maxed out while my GPU is chilling.

Some games behave like that and it's normal. Though, if I were you I'd go look up some 1080 Ti reviews of the games you're playing. They will likely be playing with very high end CPUs, so you can see roughly how much of a bottleneck your CPU is. If I had a 1080 Ti I'd probably be eyeing up Coffee Lake or X299 (or whatever it's called), depending on what games you play. :p
 
Yup non-K, tired of seeing them maxed out while my GPU is chilling.

FWIW I'm currently on a 6-core i7 4930K @ 4.5GHz, and I don't see any bottlenecking problems on a single 1080 Ti @ 1440p. BF1 MP pushes it hard, but that game will take as many cores/threads as you can give it.
 
I just bought the Asus VG248QE and I'm playing at 1080p/144Hz and it's amazing. Not that great of a monitor, but for $230, why not? Using it with an MSI 1070 card.
 
The refresh rate difference is actually most noticeable to me outside of gaming, when I'm just moving my mouse around or dragging windows around. There's something about the smoothness of the mouse cursor and being able to read windows as I move them around that I really like.
Yup and that difference is there in games too, smoother motions with fast cameras and the details are shown perfectly.

This is why 30fps with motion blur is such an image quality killer, and strangely it's so often hyped for how pretty it looks, like in Driveclub and Forza Horizon 3. They look hot in photo mode, but in gameplay it's a whole other story: not only do you still see the stutters, because the motion blur simply can't hide that much stuttering, you also lose all those pretty details everyone is hyping whenever something moves fast.

After upgrading to Windows 10 and getting access to Forza Horizon 3 on PC I'll never go back to play that on consoles, it's a night and day difference.

Driveclub is unfortunately locked into the console prison and isn't getting a 60fps upgrade on Pro either since Sony closed the studio. Hopefully the new studio makes games for PC too.
 
I mean, yeah, it goes without saying that if you compare them by playing them back at a minute fraction of the speed they'll ever realistically be displayed at the difference is pretty clear.

I love high framerates, but I'll never actively chase higher than 60fps because A) I have no interest in spending the many, many thousands of dollars required to build and maintain a PC capable of consistently outputting those framerates, and B) I game on console a lot, and I know that anything higher than 60fps is a pipe-dream in that space.
 
So far GSync does not seem to ameliorate the jarring drops below 60 as advertised; in fact I still prefer locked 30fps to unlocked 40-59 with GSync.
Yeah, unfortunately what I tried of G-Sync was not the god-level gaming upgrade I was led to believe. Then again, I was never one to find tearing that big of a deal in the first place.
 
People curious could visit an Apple Store and try out the new iPad Pros. They have 120Hz displays and the non-Pro iPads are 60Hz, so it should be a good opportunity to see the difference. Also, I'm not sure if the Safari browser will play 120fps videos, but you might also be able to try the Battlefield 3 120fps video sample from Blur Busters on those iPads:

http://www.blurbusters.com/hfr-120fps-video-game-recording/

What's funny about this is that my gf and I went to the AT&T store today to test out the new iPad since they had the display models out. She told me she wanted one and filled me in on the specs, which I wasn't aware of, and one of the things she mentioned was that the display itself is 120Hz. I didn't believe her at first, thinking she was tricking me into regretting my iPad Pro purchase, but lo and behold, the scrolling/video experience was a completely different one compared to the old one. I was so upset that I didn't hold off on the new one, but that's what I get for being impatient :(.

But hey lol, I have the Samsung CFG70 27" and I have to say, the monitor provides a completely different experience when it comes to gaming compared to my old one.
 
This is why 30fps with motion blur is such an image quality killer and strangely so often hyped for how pretty they are, like in Driveclub and Forza Horizon 3.
I was beginning to think I was the only one. Motion blur is routinely praised on this forum and said to look great in some games. For example FH3 is regularly praised for its motion blur. Yet the way I see it is, you have all this IQ, then proceed to SMEAR THE FUCK OUT OF IT, ON PURPOSE, then people say it looks great? WHAT?! Motion blur is one of those settings that I immediately turn OFF in order to preserve IQ when things are in motion. Motion blur looks fucking terrible, I don't get it at all.
 
Been playing PS4 games for the first time in a long while and Horizon is really driving me crazy with its framerate, because I just haven't played a 30fps action game in probably 2 years now. It's worse because I've also been playing the hell out of Nioh and flipping between The Surge running at 100fps, Nioh at 60, and then Horizon at 30...

Yeah, I really don't care how good the game is supposed to look in screenshots. 30fps is just ugly looking and tarnishes the game's otherwise great visual presentation.
 
I was beginning to think I was the only one. Motion blur is routinely praised on this forum and said to look great in some games. For example FH3 is regularly praised for its motion blur. Yet the way I see it is, you have all this IQ, then proceed to SMEAR THE FUCK OUT OF IT, ON PURPOSE, then people say it looks great? WHAT?! Motion blur is one of those settings that I immediately turn OFF in order to preserve IQ when things are in motion. Motion blur looks fucking terrible, I don't get it at all.

They often cite movies/CGI as an example. I think it's a similar problem to what the Hobbit movies faced with HFR: a lot of people think something looks right because that is how it always was their entire life, then something comes along which is actually more accurate to real life than what they are used to, and it looks off to them just because it's not the same thing. In a technically perfect world, the only motion blur that exists is an effect of our own visual system and how we interpret what we see, with no artificial motion blur on the tech side.

But the details are way more complex; there could also be a reason why many PC gamers are more adamant about framerate than console players, described here (because you can train your visual system):
http://www.pcgamer.com/how-many-frames-per-second-can-the-human-eye-really-see/

tldr from the article: Jordan DeLong (assistant professor of psychology at St Joseph's College in Rensselaer; the majority of his research is on visual systems):
"I think typically, once you get up above 200 fps it just looks like regular, real-life motion"
 
I was beginning to think I was the only one. Motion blur is routinely praised on this forum and said to look great in some games. For example FH3 is regularly praised for its motion blur. Yet the way I see it is, you have all this IQ, then proceed to SMEAR THE FUCK OUT OF IT, ON PURPOSE, then people say it looks great? WHAT?! Motion blur is one of those settings that I immediately turn OFF in order to preserve IQ when things are in motion. Motion blur looks fucking terrible, I don't get it at all.
They say it looks great because it hides some stuttering, but look at the sides when driving and it could be mistaken for an N64 game. Well, no, it's not quite that bad lol, but the pretty details shown in the photo threads surely are nowhere to be found. It's awful. And the lower the framerate, the more it destroys the image quality, because the frames it tries to smear together are further apart. Which obviously is catastrophic for a fast racing game, since the faster something moves the worse it will look.
And the worst thing is, IT'S STILL STUTTERING!

Edit: Since I bought the PC I've been gaming on triple screens at 144hz, middle screen gsync, and it really is a game changer for racing games. Can't wait to play Forza 7 on this rig! =)
 
I mean, I have a 144Hz G-Sync monitor and it's nice when it hits 100+ FPS, but anyone saying 60 is bad in comparison is being super hyperbolic. Hell, even 30 is fine for me still.
Well if you consider 30 FPS "fine", it's not surprising that you also think 60 FPS is too.
After spending a week playing games at 100 FPS, and then playing a game locked to 60 FPS, the difference was very jarring to me.
I genuinely thought it was running at 30 FPS until I confirmed it was running at 60.

And that's not even double the framerate.
I almost regret not buying a 240Hz monitor, if not for the fact that they're smaller TN panels rather than IPS.
I wish those new 3440x1440@200Hz monitors were IPS instead of VA, because I refuse to buy another VA panel.
I can't wait for 120Hz HDMI 2.1 VRR OLED displays, but I hope 240Hz is not that far behind them—or maybe they will do 240Hz at 1080p instead of 4K.
I'd pay a lot of money for a high refresh rate 1080p native OLED. It's the controller hardware holding things back, the response time is low enough that they should be able to handle well above 240Hz.

Didn't Valve have some study that 90 fps was the point where you can't see the difference anymore, and thus also the best fps for VR or something?
It's more like 90 FPS strobed is the minimum they could get away with without making the majority of people using it motion sick—not the ideal framerate.
Just like 24 FPS was the minimum they could get away with for film, back when it was actually projected film using a single shutter.
As soon as they moved to a double-shutter—and especially now on today's flicker-free displays—24 FPS became inadequate.
That's why interpolation exists to smooth things out—because 24 FPS doesn't look smooth on current displays any more. It used to.

even apple has joined the 120+hz vrr future
I believe Apple are still only switching between fixed refresh rates based on the content being displayed rather than actually having true Variable Refresh Rate support that syncs up with the source framerate.

Motion resolution is the most important in my opinion. My VT60 plasma is able to recreate 1080-1200 lines of motion resolution compared to the 300-400 lines that 4K OLED TVs are able to produce. What's the point of having a really high framerate like 120fps (at 4K for example) if the panel can only reproduce 300 lines of motion? I've read that higher Hz on the TV improves motion resolution, so that should be the next evolution for OLEDs/QLEDs; a 1,000Hz panel would help tremendously.
Plasma TV scores highly on those tests, but there are a number of issues.
To begin with, even top marks are not that impressive.
The test scrolls at a fairly low speed, and this was specifically chosen to give Plasma TVs top marks.
CRTs still have significantly higher motion clarity, but don't score any better in this test.
Games are also significantly faster moving.

Plasma TVs use PWM to draw the image so a "60Hz" plasma might actually be updating at 600Hz or more.
That means considerably reduced motion clarity in the real-world, as anything moving quickly will break up into multiple colored images:

(images: VT60 test pattern photographed static vs. in motion)

Higher refresh rates do nothing on their own to improve motion clarity without accompanying framerates.
60 FPS on a 600Hz flicker-free display would have the same motion clarity as 60 FPS on a 60Hz panel.
Interpolation from 60 FPS to 600 FPS would help, but that also introduces visual artifacts and latency, making it unsuitable for gaming.

What you need to do to improve motion clarity is strobe the image—and the rate of the strobe must be equal to the framerate. Any higher and you get multiple images displayed, which is much worse than any motion blur.
Here's an example of that from a CRT, showing 96/72/48/24Hz with a 24 FPS source.

The problem with strobing the image is that the framerate must be equal to the refresh rate, and never drop—it's just like VR in that regard.

Another issue is that, at 60Hz, people notice a lot of flickering.
Flicker is still noticeable at 120Hz or even 240Hz, but is considerably reduced.
The problem is that you then need to run the game at a locked 120/240 FPS which is very challenging.

And finally, strobing the display significantly reduces its brightness.
To double motion clarity, you halve the brightness.
To match a CRT's motion clarity, we need roughly a 20x improvement compared to 60Hz flicker-free displays.
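The clarity/brightness tradeoff above can be put into rough numbers. This is a back-of-envelope sketch, assuming motion blur scales with how long each frame stays lit (persistence) and brightness scales with the backlight duty cycle; `persistence_ms` is an illustrative helper, not any display's real spec.

```python
# Back-of-envelope strobing math. Assumptions: perceived motion blur
# scales with image persistence (how long each frame stays lit), and
# perceived brightness scales with the backlight duty cycle.

def persistence_ms(refresh_hz, duty_cycle=1.0):
    """Milliseconds each frame remains visible."""
    return 1000.0 / refresh_hz * duty_cycle

# Flicker-free (sample-and-hold) 60Hz display: ~16.7ms persistence.
full = persistence_ms(60)

# Strobe at a 50% duty cycle: half the persistence, half the light.
half = persistence_ms(60, duty_cycle=0.5)

# Reaching CRT-like sub-1ms persistence from a 60Hz base needs
# roughly a 5% duty cycle: a ~20x clarity gain, paid in brightness.
crt_like = persistence_ms(60, duty_cycle=0.05)

print(full, half, crt_like)
```

Halving the duty cycle halves both numbers at once, which is where the "double the clarity, halve the brightness" rule of thumb comes from.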

There is another benefit to strobing the image, beyond improving motion clarity.
It also makes motion appear much smoother.

Caution: low framerate flickering images.
This image should demonstrate the difference in smoothness that strobing the image can make.
Both white circles are moving at the same framerate.
Cover up each half of the image, and the lower circle appears to be moving much smoother than the upper one.
That's why 60 FPS on a 60Hz CRT used to be so much smoother than 60 FPS is on a 60Hz LCD, and why film at 24 FPS used to appear smooth.


I can't be the only one that accepts 30 and 60 fps. Obviously monitors and setups capable of hitting 100+ are better, but I'm just not bothered when I go from 60+ to 30 fps.
You're not, because console games wouldn't sell if most people couldn't tolerate ≤30 FPS.
But 30 FPS gives me migraines and/or makes me motion sick.
I couldn't even watch the E3 conferences this year without using interpolation to smooth out the framerate.

This is hyperbolic.
It's noticeable but not "insane".
60fps is a slideshow now according to some people in this thread. The world has gone mad.
Or: people have different perceptions and standards.

I'm assuming people like you guys basically can't play console games?
Correct.
The majority of console exclusives may as well not exist for me, when they make me ill.
I really tried to play Zelda: Breath of the Wild—even with significant input lag by enabling interpolation on my TV—but I just couldn't play it without having to go lie down for the rest of the day.

Yeah I just got a Viewsonic GSync on the idea of "making sub-60 smoother" but I can still feel it when it hits below.
I think it is still preferable to stay far above 60 at all costs.
Buying a G-Sync monitor did the opposite of that for me.
Since it makes framerates above 60 FPS viable without requiring an absolute lock on those framerates, I'm now trying to hit 80 FPS+ in every game instead of just trying to prevent drops below 60.

I think the people happy with G-Sync at low framerates were fine with low framerates to begin with, they just didn't tolerate bad frame-pacing.

tldr from the article: Jordan DeLong (assistant professor of psychology at St Joseph's College in Rensselaer; the majority of his research is on visual systems):
“I think typically, once you get up above 200 fps it just looks like regular, real-life motion”
I'm not convinced, unless there are some conditions which go along with that statement.
200Hz displays are not enough to eliminate motion blur for example.
On a flicker-free display like an LCD or OLED, that results in 5ms of image persistence, which is considerably more motion blur than a CRT has (below 1ms).
You need at least 1000Hz and probably more like 2000Hz+ to minimize motion blur on a flicker-free display.
That's part of the reason why companies like NVIDIA are working on prototype displays like that (they have demonstrated a 1700Hz display).

If you are strobing the image at 200Hz to bring the image persistence down below 1ms, you will have good motion clarity but some flicker will still be visible.
And stroboscopic artifacts are still going to be a problem for games unless you are still adding a lot of motion blur via the game.

200Hz video might be fine, but games move very quickly compared to video.
They can easily move at several-thousand pixels/second if you're playing games with a mouse.
Until your refresh rate and framerate are high enough that you have ≤1 pixel of motion per frame, you're never going to have perfect motion, and have to cover it up with motion blur to smooth things over.
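The "≤1 pixel of motion per frame" point can be made concrete with a tiny calculation. A hedged sketch: the 4000 px/s figure is just an illustrative fast mouse flick, not a measured value.

```python
# How far an object travels per refresh at a given on-screen speed.
# On a sample-and-hold display this distance is roughly the length
# of the blur trail; "perfect" motion needs <= 1 pixel per frame.

def pixels_per_frame(speed_px_per_s, refresh_hz):
    return speed_px_per_s / refresh_hz

# Hypothetical fast mouse flick in an FPS: ~4000 px/s on screen.
for hz in (60, 144, 240, 1000, 4000):
    print(f"{hz}Hz: {pixels_per_frame(4000, hz):.1f} px/frame")
# Even 1000Hz still leaves 4 px of motion per frame at this speed;
# only around 4000Hz gets it down to the 1 px ideal.
```

This is why video at 200Hz can look fine while games do not: panning video rarely moves anywhere near that fast.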
 
I just got a 165hz monitor a few days ago and it's an absolute game changer, high refresh rate coupled with GSYNC is something to behold.

Have you tried ULMB yet? Truly god-tier monitor tech that's IMO better and more important than G-Sync. I don't know if it's even possible or practical given technology may move beyond needing it in the future, but my ultimate desire in monitor tech is a screen that can do ULMB and G-Sync at the same time. Playing games like Vanquish at a locked 120 fps with near-CRT-level image persistence is almost like a religious experience to behold. Check it out if you haven't!
 
It is rough to go back to 60hz after playing at 144hz with g-sync on PC.
The image also feels calmer when I overclock the monitor to 180Hz.
 
60 is plenty for me. Heck, I started playing Forza Horizon 3 and was surprised when I saw the frame-rate set at 30 in the options. Damn good motion blur, I guess.
 
I'd like to upgrade from 144Hz to 240Hz, but none of them support 3D, and I won't be giving that up anytime soon.
 
In order to take advantage of 144hz do you also need a high dpi mouse?

I've seen some models reach 16000 dpi.
I currently use a 2000dpi mouse and play at 100hz max.
 
And to think that there are people who actually prefer 30fps. Shame that most AAA titles will stick to 30fps on consoles and the mid-gen upgrades only focused on higher resolutions :(
 
I got a 144hz monitor a year ago and I will never go back to anything lower than that (120 maybe). 60 fps doesn't feel like 60 fps anymore and playing first person shooters on my brothers 60hz monitor feels bad.
 
Having to choose between the PG27UQ and PG35VQ this year is going to be a pain. Leaning towards the PG35VQ though.
 
I mean, I have a 144 hz g-sync monitor and its nice when it hits 100+ FPS, but anyone saying 60 is bad in comparison is being super hyperbolic. Hell, even 30 is fine for me still.

It is bad when compared to the higher Hz combined with a strobed backlight.

The motion blur reduction becomes significantly different at 100hz and above.
 
I thought 3D only needed 120hz

You do, but monitors that are 120Hz+ don't just automatically support 3D Vision, unfortunately. Fairly sure it only works with TN panels, and it definitely needs support added. To be fair, there is one 240Hz monitor with 3D support, but of course it's the piss-takingly expensive Asus PG258Q, a bargain at £550 for a 1080p monitor.
 
To be honest OP, I start to not see a difference between 100Hz and 144Hz anymore, but that's probably just me. The jump from 60Hz to 120Hz is nice, but not as significant as going from 30Hz to 60Hz. At least in my subjective perception. For example, I don't feel the "oh man, this is very bad" effect when going from my brother's 120Hz display back to my 60Hz display, in contrast to going from 60fps on my PC to 30fps on my Pro.
The biggest problem with very high framerates is the cost for me, though. Very high FPS is more demanding than high resolutions because you need a good CPU and GPU for it, especially if you like to play modern AAA games at high settings. For example, my 7700K/1080 setup is able to push 1440p/60 at reasonable ultra settings in games, but there isn't enough headroom left for me to even consider getting a 1440p/100Hz+ G-Sync monitor. I'd need to constantly upgrade my CPU/GPU and always go for the high-end models to be able to reach 100+ fps at high settings.
 
Personally, I'll only start to get satisfied when the first 1920hz screens hit the market.

Anything below that is just an eye straining slideshow to me.
 
Are there any 120/144Hz monitors that have/approach IPS panel color quality? Most of what I've seen have TN panels. I'd love to get a high refresh rate monitor, but my work (digital arts) probably means I'm stuck with IPS, right? I still love my Dell U2515H though lol
 
I just wish that 120Hz became the new standard. It divides nicely to 60, 30, 24 Hz
In order to take advantage of 144hz do you also need a high dpi mouse?

I've seen some models reach 16000 dpi.
I currently use a 2000dpi mouse and play at 100hz max.

No, DPI is related to resolution.
 
Have you tried ULMB yet? Truly god-tier monitor tech that's IMO better and more important than G-Sync. I don't know if it's even possible or practical given technology may move beyond needing it in the future, but my ultimate desire in monitor tech is a screen that can do ULMB and G-Sync at the same time. Playing games like Vanquish at a locked 120 fps with near-CRT-level image persistence is almost like a religious experience to behold. Check it out if you haven't!

I really hope this is achievable, motion blur is still noticeable at 144hz and pretty bad at 60. I still use g-sync over ulmb because getting modern games to run at a locked 120 fps needs serious hardware or is in some cases downright impossible.
 
In order to take advantage of 144hz do you also need a high dpi mouse?
I've seen some models reach 16000 dpi.
I currently use a 2000dpi mouse and play at 100hz max.
No, you need a mouse that updates at a high polling rate to avoid stutters, not high DPI.
The majority of gaming mice support 1000Hz now.
ASUS have some mice advertising "2000Hz" but I believe some people have shown that it is not really updating at a true 2000Hz polling rate.

High DPI mice are for higher resolution displays, to avoid angle/pixel skipping.
You use a very high DPI on the mouse, and a very low sensitivity in the game.
Turning speed is the same, but movement is smoother and you have finer control.
Many games do not properly support high DPI mice though, so you can't use it in all of them - but that's why most mice now have DPI-changing buttons.
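The "high DPI, low in-game sensitivity" trade described above can be shown with a little arithmetic. This is an illustrative sketch: `DEG_PER_COUNT` is a hypothetical per-count rotation constant, not a value from any particular game.

```python
# Effective turn speed depends on the product DPI x sensitivity, so
# raising DPI while lowering sensitivity keeps the turn speed the
# same but moves the view in smaller, smoother steps.

DEG_PER_COUNT = 0.022  # hypothetical rotation per mouse count at sens 1.0

def turn_degrees(swipe_inches, dpi, sensitivity):
    counts = swipe_inches * dpi  # hardware reports this many movements
    return counts * DEG_PER_COUNT * sensitivity

# Same 2-inch physical swipe with two different configurations:
lo_dpi = turn_degrees(2.0, dpi=400, sensitivity=2.0)
hi_dpi = turn_degrees(2.0, dpi=3200, sensitivity=0.25)
print(lo_dpi, hi_dpi)  # same total rotation either way

# But the high-DPI setup rotates in 8x smaller increments per count,
# which is what avoids visible angle/pixel skipping.
```

The caveat from the post applies: a game that mishandles high DPI (or rounds per-count input) breaks this equivalence, hence the DPI-switch buttons.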

And to think that there are people who actually prefer 30fps. Shame that most AAA titles will stick to 30fps on consoles and the mid-gen upgrades only focused on higher resolutions :(
I don't believe that anyone would prefer 30 FPS if you could choose between 30 or 60 without making any other sacrifices.

I just wish that 120Hz became the new standard. It divides nicely to 60, 30, 24 Hz
You shouldn't need to worry about that with variable refresh rate displays.
The refresh rate automatically synchronizes to the framerate.
I don't know of any video players which properly support VRR yet though.
But I just use SVP to interpolate everything to my display's maximum refresh rate anyway, instead of changing refresh rates.
The only thing that's frustrating is that 100Hz on the desktop means I have to hit a button on the monitor to switch over to 60Hz temporarily for YouTube videos.

Have you tried ULMB yet? Truly god-tier monitor tech that's IMO better and more important than G-Sync. I don't know if it's even possible or practical given technology may move beyond needing it in the future, but my ultimate desire in monitor tech is a screen that can do ULMB and G-Sync at the same time. Playing games like Vanquish at a locked 120 fps with near-CRT-level image persistence is almost like a religious experience to behold. Check it out if you haven't!
NVIDIA are working on it.
There are multiple problems with variable strobed displays though.

As framerate changes, so does flicker frequency, and so does display brightness.
When you start dropping to lower framerates, flicker becomes very obvious.

What NVIDIA have shown so far is that they will double up the refresh rate when your framerate drops below half of the monitor's maximum.
So a 144Hz display would refresh at 73Hz for 73 FPS, but 144Hz again at 72 FPS.

The problem is that double-strobing results in double-images in motion.
I don't think there's really any way to solve that without interpolation.

And 73Hz is already lower than NVIDIA allows ULMB to operate at.
For some reason they restrict it to a handful of refresh rates, starting at 85Hz, instead of allowing 60Hz strobing for games locked to 60 FPS for example.

If they do get it working or release it as an update for existing monitors, it seems like there are going to be some severe compromises with it.
It should be great for games that you can keep running at high - but not locked - framerates though.
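The refresh-doubling behavior described above can be sketched as a tiny function. This is an illustrative model, not NVIDIA's actual algorithm: the 85Hz floor matches the ULMB minimum mentioned, and `max_hz` assumes a 144Hz panel.

```python
# Illustrative model of variable-rate strobing with refresh doubling:
# strobe once per frame when the framerate is high enough, otherwise
# repeat each frame until the strobe rate clears a flicker threshold.

def strobe_rate(fps, max_hz=144, min_hz=85):
    multiple = 1
    while fps * multiple < min_hz and fps * (multiple + 1) <= max_hz:
        multiple += 1
    # multiple > 1 means each frame is flashed repeatedly, which
    # shows up as double/triple images in motion.
    return fps * multiple, multiple

print(strobe_rate(100))  # (100, 1): single strobe, clean motion
print(strobe_rate(72))   # (144, 2): double-strobed, ghost images
print(strobe_rate(73))   # (73, 1): single strobe, but visible flicker
```

This shows the compromise in the posts above: at 73 FPS you get flicker, and at 72 FPS you get double images instead. There's no framerate in that region that avoids both.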
 
You do, but monitors that are 120Hz+ don't just automatically support 3D Vision, unfortunately. Fairly sure it only works with TN panels, and it definitely needs support added. To be fair, there is one 240Hz monitor with 3D support, but of course it's the piss-takingly expensive Asus PG258Q, a bargain at £550 for a 1080p monitor.

I have the PG258Q and it's an amazing monitor. If you want to use either ULMB or 3D Vision it's head and shoulders over any other monitor on the market. You can use ULMB and 3D Vision at 105+ nits.

I find the biggest improvement with the PG258 is the motion clarity. When games are in motion it's so much clearer than my old VG248QE at 144hz. It's still not quite as clear as ULMB, but it's a huge step in that direction.
 
I thought it wouldn't matter that much until I started playing FPS games on PC with high mouse sensitivity. For FPS I definitely feel held back by 60Hz. Vast majority of games though, who cares.
 
For anyone who wants to viscerally feel the difference between 60Hz and 120 Hz: go to an Apple Store or a Best Buy, and use the regular iPad (60) and then the iPad Pro (120). It's insane.

I'm afraid by doing that I'll be ruining my experience with all my current devices' refresh rate!!
 
I picked up a 144Hz Predator 1080p monitor second-hand from eBay for £100. It's ace. Not sure if I even need G-Sync with this tbh; at low framerates (according to Black Desert Online, as low as 45fps) it still feels buttery smooth.
 
My current display does 1080p@120hz or 4k@60hz

Honestly, I'm one of those "ugh, fuck, I can't stand 30fps unless I absolutely have to" kind of dickheads, but when I tried to play a game at 120Hz and then dropped it to 60, it was noticeable but not a huge difference to me.

If the jump from going 30fps to 60fps is a 10/10 leap, then 60 to 120 was like a 3/10.

BUTTTTT, I have read it does more for twitch FPS gaming and things like that, which I do not play.
 
Not this shit again
Using slow motion to show that 144hz is better

144hz is a joke

10000hz is where it's at
Gotta have a ultra high speed camera
 
...

I'm not convinced, unless there are some conditions which go along with that statement.
...

There are; they're outlined in the article in detail, but the answer is not a simple technical-numbers answer. As for how accurate they are? That's not for me to decide, but it makes sense to some degree!
 
This is a pointless video, of course 60 Hz is four times choppier than 240 Hz. It doesn't tell you anything about the experience of seeing a 60 Hz game. For the record, I think 100+ FPS is necessary for shooters.
 
30fps/hz is fine for certain games.

60fps/hz is fine for most games.

85-100fps/hz is a definite improvement for certain very fast paced competitive twitch games.

144Hz or higher is welcome overkill, which I haven't personally experienced, but I'm pretty sure I already stopped noticing much of a difference by the time I got up to 100.

Currently in the awkward situation of choosing a monitor to replace my 1440p 25" Dell 60Hz.

1) 25" is my favorite size for sitting in front of at a desk, and 1440p looks much cleaner on it than on a 27", and it fills my field of vision. But there are virtually zero 25" screens :(

2) I really wish there was any kind of 4k monitor that could be clocked to higher refreshes at lower resolutions only - a compromise monitor. Anything higher than 60hz at 4k is obviously ridiculous, but couldn't it do lower resolutions at higher refresh rates as an alternative?

3) The market for this high refresh stuff is TINY, and this is obvious when you look at the prices.

This is a pointless video, of course 60 Hz is four times choppier than 240 Hz. It doesn't tell you anything about the experience of seeing a 60 Hz game. For the record, I think 100+ FPS is necessary for shooters.

Haha, this is so true.
 
Are there any 120/144Hz monitors that have/approach IPS panel color quality? Most of what I've seen have TN panels. I'd love to get a high refresh rate monitor, but my work (digital arts) probably means I'm stuck with IPS, right? I still love my Dell U2515H though lol

They do exist. The ASUS PG279Q ROG SWIFT and the ACER XB271HU BMIPRZ are popular options.
 