
Samsung Announces First Freesync 2 Monitors & Hints at First Games

Thraktor

Member
Anandtech have posted an article on three new Samsung monitors, which are the first to support AMD's new Freesync 2 standard for adaptive frame-rate and HDR.

What's Freesync 2?

Freesync 2 is an extension of the Freesync standard with two main additions. The first is that all monitors must support full-range adaptive-sync with low framerate compensation (LFC). This effectively means that a game running at anywhere from 0fps up to the maximum framerate of the monitor will get all the benefits of adaptive sync. Many Freesync monitors already support LFC, but not all do.
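
To make the LFC idea concrete, here's a minimal sketch of the frame-multiplication logic, assuming a hypothetical 48-144Hz panel (my own illustration, not AMD's actual algorithm):

```python
# Minimal sketch of low framerate compensation (LFC), assuming a hypothetical
# 48-144Hz VRR window. Illustrative only, not AMD's actual algorithm: when the
# game renders below the panel's minimum refresh rate, each frame is scanned out
# a whole number of times so the effective refresh stays inside the VRR window.

def lfc_refresh(game_fps, panel_min_hz=48, panel_max_hz=144):
    """Return (times each frame is shown, effective panel refresh in Hz)."""
    if game_fps >= panel_min_hz:
        # Already inside the native VRR window: one refresh per rendered frame.
        return 1, min(game_fps, panel_max_hz)
    # Below the window: repeat each frame until the refresh rate is back in range.
    multiplier = 2
    while game_fps * multiplier < panel_min_hz:
        multiplier += 1
    return multiplier, game_fps * multiplier

for fps in (20, 30, 45, 60, 100):
    mult, hz = lfc_refresh(fps)
    print(f"{fps:>3} fps -> each frame shown {mult}x, panel refreshes at {hz} Hz")
```

Without LFC, anything below the panel's minimum (48fps in this example) falls back to plain v-sync or tearing, which is why making it mandatory matters.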

The second, and much more important, aspect of Freesync 2 is a standardised way of dealing with HDR monitors. Instead of using an intermediate colour space like HDR10, Freesync 2 allows games to tone map directly to the display's internal colour space, eliminating the need for an extra tone-mapping step on the monitor, which would add lag (this is why latency typically increases on TVs when you enable HDR). It also, in theory at least, allows for the support of arbitrarily wide colour gamuts, beyond that of HDR10 or Dolby Vision.

freesync_2_presentation.jpg
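
To illustrate why cutting out that intermediate step matters, here's a purely conceptual comparison of the two pipelines, with a toy tone-mapping curve and made-up numbers (not the actual FreeSync 2 API or real transfer functions):

```python
# Conceptual comparison of the two HDR paths; toy curve and toy numbers only,
# not real transfer functions and not the actual FreeSync 2 API.

def tone_map(nits, target_peak):
    """Toy operator: compress scene luminance toward a target peak brightness."""
    return target_peak * (nits / (nits + target_peak))

scene_nits = 4000.0   # a bright highlight as rendered by the game
panel_peak = 600.0    # what this particular panel can actually display

# Standard path: the game maps to a generic HDR10-style container, and the
# monitor's own processor then re-maps that to the panel - an extra pass inside
# the display, which is where the added latency comes from.
intermediate = tone_map(scene_nits, 10000.0)
two_step = tone_map(intermediate, panel_peak)

# FreeSync 2-style path: the display reports its real capabilities up front and
# the game tone maps straight to them, so the monitor's pass can be skipped.
one_step = tone_map(scene_nits, panel_peak)

print(f"two-step (generic container, then monitor): {two_step:.0f} nits")
print(f"one-step (direct to panel):                 {one_step:.0f} nits")
```

The exact numbers are meaningless; the point is that the second pass happens inside the monitor's electronics, which is exactly where FreeSync 2 claims back the latency.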

It does have some drawbacks, though. Firstly, unsurprisingly enough, it will only work with AMD graphics cards (while I believe it's technically open for Nvidia to adopt, I wouldn't hold your breath). The second is that games actually need to explicitly add support for it, so it's all dependent on how many developers do so.

Samsung C49HG90

Samsung's flagship Freesync 2 monitor is a slightly ludicrous 32:9 aspect ratio, 49", 3840 × 1080 curved display.

samsung_C49HG90.jpg


It supports 144Hz, 1 ms moving picture response time (MPRT) and 95% DCI-P3 at up to 600 nits, with a quantum dot VA panel. It's due to be available in late June for $1499.

Samsung C32HG70 and C27HG70

The other two Freesync 2 displays (which I can't find any photos of) are somewhat more traditional 16:9 ratio 2560 × 1440 curved displays, at 32" and 27" respectively. Like the 49" model, they use quantum dot VA panels which operate up to 144Hz, 1ms MPRT, 95% DCI-P3 and 600 nits. The 32" model is $699 and the 27" is $599.

Software

Interestingly enough, the article also gives some hints on software support for Freesync 2:

Anandtech said:
Samsung says that it had collaborated with DICE and Ghost Games to enable HDR in the upcoming Star Wars Battlefront II and Need for Speed Payback games, which may indicate that these two titles will be among the first to support AMD’s FreeSync 2.

Anandtech said:
Samsung plans to demonstrate its CHG70- and the CHG90-series monitors in action at the Ubisoft booth at the E3 convention this week, just a couple of weeks before the devices will hit the market. This in turn implies that Ubisoft is also set to support AMD’s FreeSync 2 (good news for the GPU developer) in at least some of its titles, but this is speculation for now.

Having EA and Ubisoft on board seems like a good start; hopefully we'll get some more news over the next few weeks before these hit shelves.
 

Thraktor

Member
Come on Nvidia, just support Freesync already

Nvidia do have their own HDR solution, which afaik operates in a similar way. The real issue here is Microsoft not implementing proper colour management in Windows, which would eliminate the need for these vendor-specific solutions.
 
So do the lower-tier monitors have HDR?

Or is that a thing that all Freesync 2 monitors have to support in order to call themselves that?
 

llien

Member
Nvidia do have their own HDR solution, which afaik operates in a similar way. The real issue here is Microsoft not implementing proper colour management in Windows, which would eliminate the need for these vendor-specific solutions.

What does Microsoft have to do with what is happening between a graphics card and a monitor?
 

CamHostage

Member
The real issue here is Microsoft not implementing proper colour management in Windows, which would eliminate the need for these vendor-specific solutions.

Does the impending next-gen HDMI (which will have its own variable refresh rate functionality) fix this lack of solidarity, or does it just add to the mess?
 

wildfire

Banned
Come on Nvidia, just support Freesync already

Freesync 2 is almost as restrictive as Gsync.

Freesync 2 is nothing like Freesync and, as a result, won't be as widely supported, because it demands higher quality control, exactly like Gsync.
 

dr_rus

Member
Really? It's not like USB 1 and USB 2 situation?

Well, Freesync 2 still has Freesync in it obviously (the VRR sync tech based on the VESA adaptive sync spec), but the rest of FS2 is actually about HDR; the VRR itself hasn't evolved beyond AMD demanding stricter h/w support for FS2 certification. I dunno how it ended up, but they were pondering the idea of charging display manufacturers for this certification as well. I'm also not 100% sure, but it seems that the FS2 HDR spec contains some proprietary h/w which must be implemented by the display maker to be compatible.

Basically, as I've said a multitude of times, "supporting Freesync" is a lot of work and it's not clear how exactly NV would get any profit from such support. So still don't count on them supporting anything in place of Gsync other than a mandated industry standard, which Freesync is not.

http://www.anandtech.com/show/10967...improving-ease-lowering-latency-of-hdr-gaming
 

Thraktor

Member
What does Microsoft have to do with what is happening between a graphic card and a monitor?

MacOS handles wide-gamut displays seamlessly at the OS level, so there's no reason Windows shouldn't be able to.

To expand on this, in MacOS, any application can define the colour space it operates in and reliably expect the OS to accurately display the image as far as the monitor will allow. So, if a MacOS application is designed to operate in sRGB (the default) and the display supports the wider P3 colour space, the operating system will properly map the app within the display's wider space. Conversely, if an app is defined to operate in P3, but is running on an sRGB display, the colours outside sRGB will be clipped, but everything within it will be accurate. An app can even have a mixture of assets in different colour spaces and display seamlessly, and can query the OS for the details of the monitor (i.e. ICC profile, etc.) if it wants to handle colour mapping internally.
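
If it helps to see what that mapping actually involves, here's a rough sketch of the sRGB-to-Display-P3 conversion a colour-managed OS performs behind the scenes (the matrices are the commonly published D65 primaries, but treat the exact numbers as illustrative):

```python
# Rough sketch of what OS-level colour management does: map an sRGB colour into
# Display P3 through a shared reference space (CIE XYZ), so "full red" in an sRGB
# app lands on the same physical colour rather than the panel's wider, more
# saturated red. Matrices are the commonly published D65 primaries (illustrative).
import numpy as np

SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])

P3_TO_XYZ = np.array([[0.4866, 0.2657, 0.1982],
                      [0.2290, 0.6917, 0.0793],
                      [0.0000, 0.0451, 1.0439]])

def srgb_to_display_p3(rgb_linear):
    """Convert a linear-light sRGB triplet to linear Display P3 coordinates."""
    xyz = SRGB_TO_XYZ @ np.asarray(rgb_linear, dtype=float)
    return np.linalg.inv(P3_TO_XYZ) @ xyz

# A colour-managed OS sends the panel roughly [0.82, 0.03, 0.02] for sRGB red;
# an unmanaged app would send [1, 0, 0] raw and get the panel's most saturated red.
print(srgb_to_display_p3([1.0, 0.0, 0.0]))
```

An app that isn't colour managed skips this conversion entirely, which is exactly the oversaturation problem described below.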

Windows has its own colour management solution, but it's entirely reliant on application support, which is pretty slim. A properly colour managed app, such as Photoshop, can query the OS for the monitor's ICC profile, as above, and achieve accurate colours on any calibrated monitor, but any app which isn't colour managed will just be rendered as-is, without being converted to the appropriate colour space. For a piece of software which assumes sRGB, this means that it will be oversaturated on a wide-gamut display, perhaps significantly.

This wouldn't be a problem if everything on Windows was colour managed, but in reality very little of it is. Even large portions of Microsoft's own software aren't colour managed, 14 years after they started supporting colour management in the OS. Hell, the Windows 10 Photos app isn't even colour managed, which says something about their enthusiasm for colour management.

What this means is that if you use a wide-gamut display on a Windows PC, you're getting better colour reproduction in some software, but you're actually getting worse colour reproduction in a lot of other software at the same time. When it comes to games, if you use a wide-gamut display (i.e. an HDR display) then the default behaviour is that the vast majority of games will be oversaturated and look terrible, and when you stop playing games then a lot of other software will be oversaturated and look terrible as well.

Because of this, both AMD and Nvidia have had to implement workarounds in their drivers in order to bypass Windows colour management and properly support HDR in selected games without messing up colours in other software. As this has to be done at the driver level and requires games to be designed to interface with said drivers, it's almost inevitably going to end up splitting the market with separate solutions for the two GPU brands.

Does the impending next-gen HDMI (which will have its own variable refresh rate functionality) fix this lack of solidarity, or does it just add to the mess?

If both AMD and Nvidia support the adaptive sync feature of HDMI 2.1, then it would provide a platform agnostic solution for adaptive sync, but it wouldn't change anything about the split in HDR implementations.
 

Thraktor

Member
Possibly my next monitor, i wish they'd make a 4K variant though.

I feel 1440p is a nice medium at the moment (I don't anticipate getting a GPU any time soon which could handle 4K at 60Hz, let alone 144Hz). My only issue is the curve, I prefer flat monitors, but I'd imagine we'll see more Freesync 2 monitors of varying shapes and resolutions over the next few months.
 

llien

Member
FreeSync 2 = FreeSync 1 + mandatory Low Framerate Compensation + HDR lag reduction trick.

Well, Freesync 2 still have Freesync in it obviously (the VRR sync tech based on VESA adaptive sync spec)

Pushed in there by AMD, last time I checked. ;)

I dunno how it ended up but they were pondering the idea of charging display manufacturers for this certification as well.

Any source on that?

Basically, as I've said a multitude of times, "supporting Freesync" is a lot of work and it's not clear how exactly NV would get any profit from such support.

I'm constantly being told that "supporting Freesync" is exactly how "gsync notebooks" work, by supporting the cursed VESA standard pushed by AMD. Is that not the case?


AMD and Nvidia need to get behind the same fucking standard already

Before FreeSync and its sibling VESA standard were out, nVidia quite clearly stated that it's their tech and they weren't going to license it to anyone.
So AMD or Intel can't use GSync, even if they so wished.

MacOS handles wide-gamut displays seamlessly at the OS level, so there's no reason Windows shouldn't be able to.

To expand on this, in MacOS, any application can define the colour space it operates in and reliably expect the OS to accurately display the image as far as the monitor will allow. So, if a MacOS application is designed to operate in sRGB (the default) and the display supports the wider P3 colour space, the operating system will properly map the app within the display's wider space. Conversely, if an app is defined to operate in P3, but is running on an sRGB display, the colours outside sRGB will be clipped, but everything within it will be accurate. An app can even have a mixture of assets in different colour spaces and display seamlessly, and can query the OS for the details of the monitor (i.e. ICC profile, etc.) if it wants to handle colour mapping internally.

I get your point, but the tech we are talking about happens exclusively between the GPU and the monitor. It is about how to tell an HDR screen what to display.
The normal (standard) way of doing it involves extra calculation steps, where the monitor needs to figure out how to map the colours it is sent to what it is actually able to handle (the creators of that standard apparently didn't worry much about lag). AMD's FreeSync 2 covers that by letting the GPU (which is so much more powerful than what screens have) do these calculations instead.
 

Buggy Loop

Member
C'mon AMD, I hope Vega disrupts the market a bit like Ryzen did, in the bang-for-the-buck category (not top end, I can never afford that...).
 

llien

Member
Alienware Announces 240 Hz Gaming Monitors: AW2518H (G-Sync) and AW2518HF (FreeSync)


As for the specifications: both of these displays operate at 240 Hz, native, not overclocked. To achieve this rate, its panel is 24.5-inch, 1080p, and TN. The structure itself has a thin bezel on the top, left, and right side, although the bottom has a bit more thickness for the Alienware typeface logo and buttons. Despite being otherwise identical, the G-Sync model (AW2518H) has an MSRP of $699.99, while the FreeSync model (AW2518HF) is $200 cheaper at $499.99.

https://www.pcper.com/news/Displays...-Monitors-AW2518H-G-Sync-and-AW2518HF-FreeSyn
 

dr_rus

Member
FreeSync 2 = FreeSync 1 + mandatory Low Framerate Compensation + HDR lag reduction trick.

It's a bit more than that, but this is essentially true.

Pushed in there by AMD, last time I checked. ;)

Pushed in where? Into the HDMI 2.1 specs? The HDMI Foundation consists of eighty companies, half of which are several times bigger than AMD. I seriously doubt that AMD can "push" anything in there.

Any source on that?

The link I've posted.

I'm constantly being told that "supporting Freesync" is exactly how "gsync notebooks" work, by supporting the cursed VESA standard pushed by AMD. Is that not the case?

...What?

a) It's not exactly the same because the implementations are obviously different. I don't know why it's so hard for people to understand, but let me give you an example here: both Windows and Linux run on the same x86 h/w - does that mean they are the same and it's no bother at all to make Windows programs work on Linux? The same can be applied to porting games between Windows and PS4, for example.

b) You may have noticed that not all notebooks with GeForce GPUs in them support Gsync, even though the adaptive sync spec is required in eDP so technically they all can do it - that's precisely because of a): it's not easy and not free to support adaptive sync on some specific product - notebook or display, doesn't matter - as you have to make sure that this product works as intended and support it on the s/w side of the implementation (drivers).

c) Again, the VESA adaptive sync standard was not "pushed by AMD"; it existed in the eDP spec for some time prior to AMD deciding to use it for Freesync after they were hit by Gsync out of nowhere. Adaptive sync has a side effect of lowering display power consumption because it can throttle down display refreshes when you're low on power, which is why the eDP spec has had it for some time now. What AMD did was request that this part of the eDP spec be added as an optional feature to the DP 1.2+ specification.

Before FreeSync and its sibling VESA standard were out, nVidia quite clearly stated that it's their tech and they weren't going to license it to anyone.
So AMD or Intel can't use GSync, even if they so wished.

It's exactly the same with Freesync - and Freesync 2 specifically - you can't license it for use in your GPU. Nobody but AMD uses it.
 

llien

Member
Might pre order the 32 in. These are HDR right?

Yes, with a wider gamut, vs. simply supporting HDR input (which exists only to deceive customers and should be banned, in my opinion)


Pushed in where? Into HDMI 2.1 specs? HDMI Foundation consist out of eighty companies, half of which is several times bigger than AMD. I seriously doubt that AMD can "push" anything in there.

First into VESA DisplayPort 1.2a, then into HDMI 2.1.
I don't get the "80 companies" point; there is hardly any standard of this kind with a small number of participants.
Are you seriously challenging that adaptive sync in 1.2a was initiated by AMD?

The link I've posted.

Basically, that's the Anandtech author speculating.
Anyhow, I think nobody expects anything along the lines of nVidia's $200 premium for such certification.


...What?

a) It's not exactly the same because the implementations are obviously different. I don't know why it's so hard for people to understand but let me give you an example here:

I think you are missing the point here.
Both AMD's and nVidia's GPUs use the same way of communicating with a notebook's screen that supports adaptive sync. There is no "gsync chip" involved. The communication is akin to what is used by FreeSync, based on eDP by VESA, which has been there since... 2009.

b) You may have noticed that not all notebooks with GeForce GPUs in them support Gsync even though the adaptive sync spec is required in eDP so technically they all can do it - that's precisely because of a) as it's not easy and not free to support adaptive sync on some specific product - notebook or display, doesn't matter - as you have to make sure that this product works as intended and support this in your s/w side of the implementation (drivers).

Not all monitors support FreeSync either, even though most of them use upscaler chips that have built-in support for it. I am not getting how this addresses "nVidia, like AMD, uses eDP, not its own custom chip, in its 'gsync' notebooks".

c) Again, VESA adaptive sync standard was not "pushed by AMD", it existed in the eDP spec for some time prior to AMD deciding to use it for Freesync after they were hit by Gsync out of nowhere.

It existed back in 2009 and covered only the "embedded" DisplayPort.
AMD pushed it into normal DP.
Had eDP not existed, AMD would not have been able to respond to GSync so quickly.

It's exactly the same with Freesync - and Freesync 2 specifically - you can't license it to use it in your GPU. Nobody but AMD uses it.

Yeah, but you don't need AMD's license to use DP 1.2a's adaptive sync features, nor to support HDMI 2.1's adaptive sync features.

The reason nVidia isn't doing it apparently has nothing to do with licensing, but more with income from GSync chips and other positive side effects, like vendor lock-in.
 

dr_rus

Member
First into VESA Display Port 1.2a, then into HDMI 2.1.
I don't get the "80 companies" point, there is hardly any standard of this kind with small number of participants.
Are you seriously challenging that adaptive sync in 1.2a was initiated by AMD?

The point is that you can't "push" something into the HDMI specs without the whole board agreeing with you, and if they do, then it's not really you "pushing" anything there but the joint effort of the HDMI Foundation. DP 1.2a is a totally different situation, as I've already explained - and it's also optional, even in DP 1.4.

Basically, that's the Anandtech author speculating.
Anyhow, I think nobody expects anything along the lines of nVidia's $200 premium for such certification.

Prices are irrelevant; the sole fact of FS2 possibly requiring paid licensing is what matters. NV can adjust the prices of Gsync h/w anytime they want. The fact that they haven't so far means only that they are fine with how Gsync is selling.

I think you are missing the point here.
Both AMD's and nVidia's GPU's use the same way to communicate with notebook's screen supporting adaptive sync. There is no "gsync chip" involved in it. The communication is akin to what is used by FreeSync, based on eDP by VESA, which is there since... 2009.

No, it's you who are missing the point - the communication means zilch; it's just a protocol running on top of the h/w specs of DP (there are no h/w changes in DP 1.2a+ to support this, it's pure s/w). What matters is how this protocol is used to achieve the needed results - and this is the real implementation of adaptive sync, which is completely different between Freesync and Gsync Mobile.

Not all monitors support FreeSync either, even though most of them use upscaler chips that have built-in support for it. I am not getting how this addresses "nVidia, like AMD, uses eDP, not its own custom chip, in its 'gsync' notebooks".

It's addressing it directly, and I've already explained how. People like you assume that this is a switch somewhere in the drivers, while it's in fact a lot of work which the GPU vendor must do for you to enjoy it later, and NV will get no profit from supporting Freesync this way. Their Gsync business is built in a way which makes it sustainable via Gsync h/w sales, even though I'd wager that at least 75% of the Gsync module cost goes into actual s/w support cost and not h/w production cost. There is no way for NV to recover these costs from supporting Gsync over the eDP spec right now other than to fold them into GPU costs, which would mean either lower ASPs for them (with no apparent gain whatsoever) or higher prices for all consumers, even those who don't have, or even plan to have, anything Gsync. Basically, it makes no business sense.

It existed back in 2009 and was covering only "embedded" display port.
AMD pushed it to normal DP.
Had eDP not existed, AMD would not be able to respond to GSync so quickly.

Exactly, and their "push" into DP was the addition of an optional extension, which has remained optional for three DP versions since then - which clearly shows the extent of AMD's "pushing" capabilities without VESA. No reason to think that it's any different with HDMI.

Yeah, but you don't need AMD's license to use DP 1.2a's adaptive sync features, nor to support HDMI 2.1's adaptive sync features.

The reason nVidia isn't doing it apparently has nothing to do with licensing, but more with income from GSync chips and other positive side effects, like vendor lock-in.

You need a license to use Freesync, and you need to build your own adaptive sync solution if you want to use the VESA adaptive sync spec, which is costly not only to build but to support going forward. HDMI is a licensed standard by default, thus everyone has to pay licensing fees if they want an HDMI port on their devices. And yes, you are correct: NV just don't see any benefit for them, as a commercial organization, in supporting Freesync. Free things are nice, but they will never work if somebody isn't paying for them.
 

Charcoal

Member
Sorry for the bump, this is the newest thread I could find.

Are these new monitors ones to keep an eye on? I'm in the market for a new one and I've never had HDR, 144Hz, etc. Does anyone have any suggestions?
 

DieH@rd

Banned
Sorry for the bump, this is the newest thread I could find.

Are these new monitors ones to keep an eye on? I'm in the market for a new one and I've never had HDR, 144Hz, etc. Does anyone have any suggestions?

If you are looking at new freesync monitors, these new ones from Samsung look to be great.

I am, however, looking for a good 40-43" HDR monitor/TV, and since I cannot get my hands on the Sony 43XD80xx model [not available anymore in my region], my next best hope is some of these monitors.

Hopefully there will be Freesync 2 + HDR announcements for those of us who are aiming for the 40-43" range.
 