
Sony confirms PS5 is getting VRR

Geki-D

Banned
Considering that pretty much every multiplat game runs better on PS5, what is the actual benefit of VRR that the Series X is getting right now but the PS5 isn't?
 

Kuranghi

Member
It’s a KS8000 under European numbering, which I think was the 9000 in US models. Thanks for the advice. I don’t like OLED, but I’ll take you up on your offer when I’m looking for advice on the next toy ;)

Sounds good about PM, I'm an LCD man myself so I can recommend you some LCDs.

I think you are right about the model numbers, that should be a FALD, so you have a really nice set anyway so no rush to upgrade.
 

Kuranghi

Member
Considering that pretty much every multiplat game runs better on PS5, what is the actual benefit of VRR that the Series X is getting right now but the PS5 isn't?

When the v-sync implementation is shite and you get tearing/bad frame-pacing, VRR massively reduces it. PS5 isn't tearing as much as XSX, but it does still stutter when the framerate drops below its v-sync target. So with VRR on PS5 you wouldn't have the XSX tearing, and more importantly the framerate would appear smoother even if it dropped to 58/59 for one second and then locked back to 60 the next.

So VRR on XSX right now will fix the problem it's having, i.e. tearing. PS5 doesn't have (nearly as much, but some) tearing, but it does have stutter from prioritising complete frames over torn ones, so VRR on PS5 would solve that problem instead.
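The trade-off described above can be sketched as a toy frame-pacing model (all numbers are illustrative, Python purely for demonstration): with fixed-refresh v-sync, a frame that misses its 16.7 ms budget stays on screen until the next 60 Hz tick, while a VRR panel refreshes as soon as the frame is ready.

```python
# Toy model of what VRR changes, as described above. Numbers are
# illustrative, not measurements of either console.

def presented_ms(render_ms, vsync_hz=60, vrr=False,
                 vrr_min_hz=48, vrr_max_hz=120):
    """How long a frame ends up on screen, in milliseconds."""
    if vrr:
        # VRR: the panel refreshes when the frame is ready, clamped to
        # its supported window (48-120 Hz -> roughly 8.3-20.8 ms here).
        return min(max(render_ms, 1000 / vrr_max_hz), 1000 / vrr_min_hz)
    # Fixed refresh with v-sync: wait for the next whole refresh tick,
    # so an 18 ms frame occupies two 16.7 ms ticks (a visible hitch).
    tick = 1000 / vsync_hz
    ticks = -(-render_ms // tick)  # ceiling division
    return ticks * tick

print(presented_ms(18, vrr=False))  # ~33.3 ms: the "58/59 fps" stutter
print(presented_ms(18, vrr=True))   # 18 ms: barely off the 16.7 ms ideal
```

With v-sync the 18 ms frame costs a full extra tick; with VRR it costs only the 1.3 ms it actually overran by, which is why the framerate "appears smoother" even while dropping frames.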
 

FranXico

Member
Considering that pretty much every multiplat game runs better on PS5, what is the actual benefit of VRR that the Series X is getting right now but the PS5 isn't?
Even multiplats that happen to run better on PS5 still exhibit some tearing. It is better to have the feature than not have it.

Having said that, as John Linneman has pointed out multiple times already, VRR is not something developers should rely on as a substitute for optimising their engines.
 

Radical_3d

Member
Sounds good about PM, I'm an LCD man myself so I can recommend you some LCDs.

I think you are right about the model numbers, that should be a FALD, so you have a really nice set anyway so no rush to upgrade.
I’m pretty sure this model is edge lit. I don’t think Samsung had a KS model with FALD.
 

Radical_3d

Member
You are right on your model but there was one FALD KS model, it was called the 9500 here and the 9800 in the US: https://www.flatpanelshd.com/article.php?subaction=showfull&id=1460114926

That's the worst thing you could've told me! It's time for an uuuuuuupgraaaaaade, my man.

Besides, I’m not buying into another TV technology at the start of its cycle, like VRR right now. That, and better techs like inorganic LEDs are about to hit the market, rendering VA LED and OLED sets obsolete.
 

MastaKiiLA

Member
Good for the ones jumping in to the 4K now. Personally I’m not upgrading until the HDR format war is over or at least every TV supports both.
Same here. I have a Samsung UA55NU7100, and see no reason to upgrade anytime soon. Maybe in a few years, or if my screen goes bad like my 2 previous Samsung HDTVs did. Otherwise, it does what I need. I don't need 120Hz features. It's big, bright, and sharp. That's all I require for now.
 

thelastword

Banned
Don't let people fool you into thinking VRR is the answer. For a game that performs less than optimally it makes the experience more tolerable, but it does not improve controller response...

If I'm playing a fighter at 120fps and it drops to 92fps in one level but holds 120 everywhere else, that's not the end of the world; all this judder talk is way overblown by extremist framerate purists who care more about the moment a frame drops than about the overall experience. Alternatively, if I'm playing the same game at 120Hz and its framerate is falling more often across all levels and it's tearing all over the place, VRR does not replace those frames. That's a worse experience for controller response, which is what matters most, and it's why a higher framerate is more important.

If a game is 60fps and drops 1-2 frames, it's not the end of the world. One spike in one level does not make the whole experience unplayable. As a matter of fact, the lower your framerate target, the more framerate consistency matters: 30fps should be a flat line, and dips below 30fps are much worse than dips from 60fps, which in turn are worse than dips from 120fps; it's an ascending slope. I know that if you play on a 60Hz or 120Hz panel, that is how those displays are synced, yet controller response is still much better when you're dropping from the higher framerates, so it's not in the same category as 30 with dips; you're still playing with high frames.

Even if a game never dips below 30fps and holds a clean 33.3ms line, I prefer an unlocked 60fps mode for controller feedback and response. If people want a cap, let them have it, but give me the option of an unlocked framerate just the same; don't lock it just because someone else prefers to play that way. I'll take better controller response every time.
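The frame-budget arithmetic behind the "30fps should be a flat line" argument can be sketched as follows (a hedged illustration, not anything from either console's SDK): on a fixed-refresh panel, a frame that just misses v-sync persists for a second full tick, so the hitch is twice the budget, and the budget grows as the target falls.

```python
# Why dips hurt more at lower targets: on a fixed-refresh display, a
# frame that just misses v-sync stays on screen for an extra full tick.

def budget_ms(target_hz):
    """Per-frame time budget at a given target framerate."""
    return 1000 / target_hz

def worst_hitch_ms(target_hz):
    """On-screen time of a frame that just misses its v-sync deadline."""
    return 2 * budget_ms(target_hz)

for hz in (120, 60, 30):
    print(f"{hz} fps: {budget_ms(hz):.1f} ms budget, "
          f"a dropped frame lingers for {worst_hitch_ms(hz):.1f} ms")
```

A dip at 120fps shows for about 16.7 ms, the same as a normal 60fps frame, while a dip at 30fps lingers for roughly 66.7 ms, which is why drops below 30fps feel so much worse.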


As for VRR, that won't be standard for everybody for years. The focus should be for devs to ship their games in a better state. Yet that's a slippery slope; people will complain either way. Devs will have to lower fidelity/settings to hit framerates, and then people will complain about graphics ("it's not as good as the other version's"), but boosting graphics comes at the cost of framerate. It's the same thing really: many people wanted 60fps this gen, devs are pushing 60, even offering options and many modes, and people still complain about the modes outside of what they asked for. You asked for 60fps, they give it to you locked, and now you're complaining about the unlocked resolution mode, or that the 60fps mode is too low in rez, just to hit some magical number. Godfall is 1350p; why can't they make it 1440p? Because then it would dip, and there would be another conversation about how they could have reduced the resolution. It's a never-ending round.

People have to understand that different hardware has different constraints; you can't get perfect framerates and your ideal resolution with every game and every developer. Games have to ship sometimes, deadlines have to be met. That is why devs are so adamant that a console should have as few bottlenecks as possible and be quick to triangle: the turnaround in performance and development is heightened many times over, and better results come faster. That should always be the focus, as opposed to depending on a feature that will take years to standardise. For better framerates, I'm more interested in Super Resolution for consoles than in VRR trying to mitigate bad performance; at least with SR my framerates can be locked down even more, and my resolution would stay high instead of being dynamic or low. I think that's the future for better, locked framerates across the board.
 

ZywyPL

Banned



Better late than never. Now let's hope devs will start offering an unlocked framerate option in their games, rather than fixed performance/resolution modes.
 

Radical_3d

Member
Same here. I have a Samsung UA55NU7100, and see no reason to upgrade anytime soon. Maybe in a few years, or if my screen goes bad like my 2 previous Samsung HDTVs did. Otherwise, it does what I need. I don't need 120Hz features. It's big, bright, and sharp. That's all I require for now.
I have a UE-KS8000 and Samsung never again. Great panels, abysmal operating system.
 

REDRZA MWS

Member
I have a UE-KS8000 and Samsung never again. Great panels, abysmal operating system.

Like Mastakilla said, Samsung panels, in my experience, are shit. My cousin has had two Samsung panels shit the bed. I myself was always a Sony TV guy up until 6-7 years ago, when I bought an LG OLED, the E7P, and I'll never look back. I've owned 3 LG OLEDs total: the E7P, the C8, and this year I bought a 77” GX anticipating the new consoles' HDMI 2.1 features.
I game extensively, and there's not one hint of the dreaded burn-in. LG adds some safeguards that are really great for gaming. Countless hours gaming on these sets with zero issues. Perfect picture, zero input lag; there's just no comparison to them right now.
 
Only in VRR compatible games.

So yeah, that's the worst possible way to apply VRR, which suggests it won't be used on PS5 all that much.

VRR compatibility must be mandatory, otherwise game devs will ignore it and/or use custom v-sync modes that won't work with VRR if you force it on globally (which is what happened on Xbox One; not all of its games work with VRR).

Also, the way the PS5 works, staying at 60Hz output and only activating 120Hz for games that support it, means 30FPS games can never benefit from VRR low framerate compensation. For 30FPS games to work with LFC frame doubling you need roughly a 2.5x ratio between max and min refresh, which means 120Hz output at all times; you cannot do it with a 60Hz output.

So for the PS5 you're looking at a scenario where:

At 60Hz output, only 60FPS VRR-compatible games benefit, from the 48-60Hz VRR range.

At 120Hz output, only 120FPS VRR-compatible games benefit, from the 48-120Hz VRR range.

No VRR help for 30FPS titles.
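The scenario above can be sketched as a quick check (the 48-60 / 48-120 windows come from the post, and the ~2x min-to-max headroom requirement is the common FreeSync LFC rule of thumb; this is an illustration, not Sony's actual implementation):

```python
# Sketch of the LFC argument above: LFC multiplies a low frame rate
# until it lands inside the panel's VRR window, which only works when
# the window has enough headroom (max >= ~2x min, per the usual rule).

def lfc_available(vrr_min_hz, vrr_max_hz, headroom=2.0):
    """Rule of thumb: LFC needs max >= ~2x min so multiplied frame
    rates stay inside the window as the game's rate fluctuates."""
    return vrr_max_hz >= headroom * vrr_min_hz

def lfc_refresh_hz(fps, vrr_min_hz, vrr_max_hz):
    """Smallest integer multiple of fps inside the VRR window, or None."""
    if not lfc_available(vrr_min_hz, vrr_max_hz):
        return None
    mult = 1
    while fps * mult <= vrr_max_hz:
        if fps * mult >= vrr_min_hz:
            return fps * mult
        mult += 1
    return None

print(lfc_refresh_hz(30, 48, 120))  # 60 -> 30 fps is doubled on a 120 Hz link
print(lfc_refresh_hz(30, 48, 60))   # None -> no LFC at a 60 Hz-only output
```

At 120Hz output the 48-120Hz window has the headroom to double 30fps up to 60Hz, but the 48-60Hz window of a 60Hz output does not, which is why 30FPS titles get no VRR help in this scheme.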

Also, the PS5 does not support the 120Hz PC video timings used on PC monitors (there is a difference between TV 120Hz and PC 120Hz, and not all PC displays support TV 120Hz), and of course the PS5 probably will not support FreeSync or 1440p, further limiting its use.

That is not ideal, clearly VRR was not a priority at Sony.

Would it really have been so hard for Sony to:
1) make VRR compatibility mandatory for all PS5 games;
2) buy a few 144Hz FreeSync monitors and make sure the PS5 worked with those at 144Hz (capped to 120FPS), so you could just buy a 144Hz monitor;
3) allow 144/120Hz output at all times, so you actually benefit from the improved panel response time and reduced input lag even if your game isn't 120fps.

That's all they had to do to make it a slam dunk.
 

Radical_3d

Member
Like Mastakilla said, Samsung panels, in my experience, are shit. My cousin has had two Samsung panels shit the bed. I myself was always a Sony TV guy up until 6-7 years ago, when I bought an LG OLED, the E7P, and I'll never look back. I've owned 3 LG OLEDs total: the E7P, the C8, and this year I bought a 77” GX anticipating the new consoles' HDMI 2.1 features.
I game extensively, and there's not one hint of the dreaded burn-in. LG adds some safeguards that are really great for gaming. Countless hours gaming on these sets with zero issues. Perfect picture, zero input lag; there's just no comparison to them right now.
Leaving aside that OLED has fewer nits and is more for dark-room movies/games, and that I live in the sunniest country on the planet with a fantastic, sunny living room... Man, you have owned 3 OLEDs in 6-7 years. Of course none of them burned in XD My budget for TVs is not that splendid, so I'll just wait until micro-LED, inorganic LED or something like that hits the market. And buy an effing Sony.
 

thelastword

Banned
So if both consoles have VRR, what will they say? "Both consoles are performing like for like," ultra smooth, no tearing... No need for stats, no need to show framerate disparities or tearing on each console... VRR solves all problems, mate... No need for tech channels, just VRR...
 

JohnnyFootball

GerAlt-Right. Ciriously.
Leaving aside that OLED has fewer nits and is more for dark-room movies/games, and that I live in the sunniest country on the planet with a fantastic, sunny living room... Man, you have owned 3 OLEDs in 6-7 years. Of course none of them burned in XD My budget for TVs is not that splendid, so I'll just wait until micro-LED, inorganic LED or something like that hits the market. And buy an effing Sony.

You're going to be waiting a LONG while. MicroLED is very difficult to produce right now (the micro-LEDs pretty much have to be placed one at a time), and unless they can make a breakthrough it will be at least 3 years before you can buy one.
 

Bojanglez

The Amiga Brotherhood
I have a UE-KS8000 and Samsung never again. Great panels, abysmal operating system.
I'm with you on this. It started off bad and seemed to get worse with each firmware update, with more and more bloated crap and adverts. I have got it somewhat usable by completely disconnecting it from the internet, deleting all the apps and putting it in accessibility UI mode.

Rumour has it there may be Samsung QD-OLEDs coming to market next year. It sounds like good tech, but I'd probably only get one if they sell the panels to others (e.g. Sony) and I can get one from them.
 

Radical_3d

Member
You're going to be waiting a LONG while. MicroLED is very difficult to produce right now (the micro-LEDs pretty much have to be placed one at a time), and unless they can make a breakthrough it will be at least 3 years before you can buy one.
3 years is more than enough. I plan to stay on 4K HDR HDMI 2.0 this whole generation. And once Apple starts ordering micro-LED for everything they make, I think they'll come up with something. They always push the industry forward.
Rumours are there may be Samsung QD-OLEDs coming on the market next year, it sounds like good tech, but I'd probably only get one if they sell the panels to others (e.g. Sony) and I can get one from them.
Every flagship Samsung display since 2016 has been a quantum dot display.
 
Last edited:
That's... nice, I guess. Tbh, not really relevant, but it's not bad at all to have it available.
It's one of those enthusiast features that gets a lot of attention (it's also really nice to have when you happen to have a TV that supports it; I'm sure next year it will be on most midrange TVs).
 

Bojanglez

The Amiga Brotherhood
3 years is more than enough. I plan to stay 4K HDR HDMI 2.0 this whole generation. And once Apple starts ordering micro LED for their everything I think they’ll come up with something. They always push the industry forward.

Every flagship Samsung display since 2016 has been a quantum dot display.
Yes QLED. I'm talking about QD-OLED.

QD-OLED is a hybrid of the two technologies and should be the best of both worlds (the black levels of OLED with greater peak brightness).
 

RoadHazard

Gold Member
3 years is more than enough. I plan to stay 4K HDR HDMI 2.0 this whole generation. And once Apple starts ordering micro LED for their everything I think they’ll come up with something. They always push the industry forward.

Every flagship Samsung display since 2016 has been a quantum dot display.

Current Samsung QD displays are LCDs; they have no OLED in them.
 