
Will the PC version of Forza Horizon 3 support HDR?

So it's the upscaling of color basically?


For console games I think the 8-bit source gets resampled to 10-bit, and that's why the OG PS4 became HDR-compatible with a firmware update.

Internally the console applies the extended dynamic range (the light range between the deepest black and the brightest white) to the 8-bit source.

I don't know about PC; I think only Alien Isolation has true 10-bit support.
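If that resampling really is happening, the simplest way to picture it is plain bit replication; a minimal sketch (my own illustration, not anything from an actual console firmware):

```python
# Toy example: widening an 8-bit code value to a 10-bit range by bit
# replication, so black stays black and full white stays full white.
def expand_8_to_10(value_8bit: int) -> int:
    # Shift left by 2 and copy the top 2 bits into the new low bits,
    # mapping 0 -> 0 and 255 -> 1023 exactly.
    return (value_8bit << 2) | (value_8bit >> 6)

print(expand_8_to_10(0), expand_8_to_10(128), expand_8_to_10(255))  # 0 514 1023
```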
 
For console games I think the 8-bit source gets resampled to 10-bit, and that's why the OG PS4 became HDR-compatible with a firmware update...

I don't know about PC; I think only Alien Isolation has true 10-bit support.

Yeah, I assumed the OG PS4 was fudging it and the Pro has true 10-bit. I'm more confused about older PC GPU capability in that area. It will be nice to see a side-by-side once the Pro is released showing the difference between it and the patched OG PS4.
 
That's the output color depth.

The actual game rendering often happens in HDR framebuffers, and has done so for a long time. Here is a 2004 Gamedev.net article about HDR rendering.

The difference now is that a larger part of that color and contrast space can be mapped to the display -- but that doesn't require more performance.
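To make that concrete, here is a tiny numpy sketch (my own toy example, not any engine's actual code) of "render in HDR, then map to the display": the scene lives in a float framebuffer with values well above 1.0, and only the final tonemap and quantize step decides how much of that range the output keeps.

```python
import numpy as np

# Toy HDR framebuffer: linear scene luminance as floats, values above 1.0 allowed
hdr = np.array([0.01, 0.5, 1.0, 4.0, 16.0], dtype=np.float32)

# Reinhard-style tonemap squeezes the whole range into [0, 1) for an SDR display
sdr = hdr / (1.0 + hdr)

# Final quantization to the 8-bit output older display links required
out_8bit = np.round(sdr * 255).astype(np.uint8)
print(out_8bit)  # a 4x jump in scene brightness (4.0 -> 16.0) collapses into ~36 codes
```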

The main technology site here in Italy says otherwise: rendering games in full 10-bit depth requires more hardware resources than 8-bit....

"Games are developed in most cases already in HDR, said developer Michele Caletti. Rendering Engines that manage the light already support a wider range than normal TVs, But in Gaming, HDR refers only to the additional light information because color bit-depth is still limited to 8 bit, and will be for the foreseeable future. Also the color space is still the standard sRGB, no wide color gamut"

Yeah, I assumed the OG PS4 was fudging it and the Pro has true 10-bit. I'm more confused about older PC GPU capability in that area. It will be nice to see a side-by-side once the Pro is released showing the difference between it and the patched OG PS4.

This also applies to the Pro and the Xbone S/Scorpio. In videogames you get one part out of three of the HDR experience. What we generally call TRUE HDR consists of (rough numbers in the sketch after this list):

1) Broad Light Range
2) Wide Color Space
3) High Color Depth (10/12 bit)
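
Quick back-of-envelope on point 3 (my own numbers):

```python
# Rough numbers: how many code values each bit depth gives per color channel.
for bits in (8, 10, 12):
    print(f"{bits}-bit: {2 ** bits} levels per channel")
# 8-bit: 256 levels, 10-bit: 1024 levels, 12-bit: 4096 levels.
# Spreading a much wider light range over only 256 steps is what causes
# visible banding, which is why the HDR formats pair 1) with 3).
```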
 
So the issue actually is lagging UWP support. I was aware that it would be added to DirectX, but had no idea of the timeline.

Note that the hardware SDK solution supports every version of Windows :p

I mean... saying it's lagging UWP support is putting words in their mouth and a bit disingenuous to what they talk about in the video and at the GDC talk. It applies to Win32 as much as UWP.

Don't forget it's the entire OS as a whole which will support HDR; don't think only about games. How will display hardware surface itself in the control panel and be controlled by users? Where will HDR and WCG be used throughout the OS design itself? What standards will they support? How will this be surfaced to creators in UWP, Win32 and also DX12? The DRM requirements Netflix has for UHD content are different from those for 1080p content, so they likely need to work on adding that if necessary. MS also needs to update the Windows compositor itself to support HDR, and all of the internal display code to handle these formats, because as of right now all of the hardware SDKs require you to be in Exclusive Fullscreen, since the Windows compositor doesn't support HDR.

Regarding the standards, they said Windows will be supporting two formats: one going up to 10k nits, and the other going up to 5.2 million nits. Sweet jesus, that would be one bright display.
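
Assuming the 10k-nit format is the PQ / SMPTE ST 2084 curve that HDR10 uses (my guess, not something stated outright), the encoding from nits to a signal value looks roughly like this:

```python
# PQ (SMPTE ST 2084) inverse EOTF: absolute luminance in nits -> [0, 1] signal.
# The curve is defined up to a 10,000 cd/m^2 ceiling.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    y = min(max(nits / 10000.0, 0.0), 1.0)
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

for nits in (100, 1000, 10000):
    print(nits, round(pq_encode(nits), 3))  # 100 nits already lands around 0.5
```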
 

Durante

Member
The main technology site here in Italy says otherwise: rendering games in full 10-bit depth requires more hardware resources than 8-bit....

"Games are developed in most cases already in HDR, said developer Michele Caletti. Rendering Engines that manage the light already support a wider range than normal TVs, But in Gaming, HDR refers only to the additional light information because color bit-depth is still limited to 8 bit, and for the foreseeable future. Also the color space is still the standard sRGB, no wide color gamut"
Color gamut is an authoring / art thing, and doesn't require any additional hardware resources.

I don't really understand what that quote is trying to say. Color bit depth is most definitely not limited to 8 bit when you e.g. render to a 16 bit per component framebuffer.
The final output image used to be limited to 8 bits per component (in consumer displays), and that is exactly what these new HDR signaling standards address.

Perhaps it's easier to understand if you think of it like downsampling: the games were actually rendering at higher precision and then had to downsample for the display -- except that it isn't spatial pixels being downsampled, it's color and contrast information.
 
Color gamut is an authoring / art thing, and doesn't require any additional hardware resources.

I don't really understand what that quote is trying to say. Color bit depth is most definitely not limited to 8 bit when you e.g. render to a 16 bit per component framebuffer.
The final output image used to be limited to 8 bits per component (in consumer displays), and that is exactly what these new HDR signaling standards address.

Perhaps it's easier to understand if you think of it like downsampling: the games were actually rendering at higher precision and then had to downsample for the display -- except that it isn't spatial pixels being downsampled, it's color and contrast information.

They get downsampled because they need to run the games at high resolution and at a good frame-rate.

I don't get how you can think that rendering the final game with a higher color bit-depth and/or color space won't affect performance and resolution....
 

Durante

Member


Seriously, wouldn't something like that generate a ton of heat and need an active cooling solution? I do understand it isn't ALL areas lighting up at once, but it still seems like you'd need better heat dissipation for that much brightness in a display.

Display technology will end up being different from what it is today by the time we get 5.2 million nit displays. I wouldn't expect that for a longgg while in terms of consumer hardware, though the point is that Windows itself, its SDKs, and the applications and games developers create would then at the very least be compatible with those new displays whenever they come out.
 
Rendering at a higher bit depth absolutely does affect performance, significantly so.

What I'm trying to tell you is that many high-end games have been rendering at a higher bit depth for roughly a decade. Here's a HDR rendering / non-HDR rendering comparison dating from 2006.

OK, but does it matter? In the end you have to release the final game on consoles that have spec limits. So if you use a lot of that power for 10-bit depth, you will have to sacrifice something else.

Maybe in the PC world with a lot of power you could try. But would it be worth it?
 

Theonik

Member


Seriously, wouldn't something like that generate a ton of heat and need an active cooling solution? I do understand it isn't ALL areas lighting up at once, but it still seems like you'd need better heat dissipation for that much brightness in a display.
Yes and no: the more efficient the light source, the less heat. Samsung's panels this year ramp peak brightness up and down, only showing it in short bursts, partly because of this.

OK, but does it matter? In the end you have to release the final game on consoles that have spec limits. So if you use a lot of that power for 10-bit depth, you will have to sacrifice something else.

Maybe in the PC world with a lot of power you could try. But would it be worth it?
What they are trying to tell you is that most games already do it. There are many benefits to it. Switching to HDR output only involves removing the tonemapping step of your rendering pipeline, or altering it for 10-bit output with HDR metadata.
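A toy sketch of that last point (my own code, not any engine's pipeline; paper_white_nits is just a made-up scaling choice for the example): both outputs read the same HDR buffer and only the final encode differs.

```python
import numpy as np

hdr_buffer = np.array([0.05, 0.18, 1.0, 4.0], dtype=np.float32)  # linear scene values

def sdr_output(x):
    tonemapped = x / (1.0 + x)               # simple Reinhard stand-in
    encoded = np.power(tonemapped, 1 / 2.2)  # rough gamma, not exact sRGB
    return np.round(encoded * 255).astype(np.uint16)   # 8-bit codes

def hdr10_output(x, paper_white_nits=200.0):
    nits = np.clip(x * paper_white_nits, 0, 10000)     # scale scene values to nits
    encoded = np.power(nits / 10000.0, 0.159)          # crude stand-in for the PQ curve
    return np.round(encoded * 1023).astype(np.uint16)  # 10-bit codes, plus HDR metadata

print(sdr_output(hdr_buffer))
print(hdr10_output(hdr_buffer))  # same source data, different final encode
```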
 
So, is this tech retroactive and old games can look better with hdr screens?

Developers have to provide the appropriate HDR metadata for the game for it to take advantage of it, so no, just booting up an old game on an HDR display won't give you the fuller spectrum of brightness; there has to be a handshake between the software metadata and the display. Sony devs explained this in the Pro presser, iirc. For example, TLOU Remastered will need the HDR patch for it to take advantage of HDR.
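
For the curious, that handshake amounts to the source sending static HDR metadata describing the content and the mastering display; here's a hypothetical sketch of the kind of fields involved (the names are mine, loosely modelled on HDR10's static metadata, not anything from a console SDK):

```python
from dataclasses import dataclass

@dataclass
class Hdr10StaticMetadata:
    # Mastering display description (in the spirit of SMPTE ST 2086)
    max_mastering_luminance_nits: float  # e.g. 1000.0
    min_mastering_luminance_nits: float  # e.g. 0.005
    # Content light level info
    max_cll_nits: int   # brightest single pixel anywhere in the content
    max_fall_nits: int  # brightest frame-average light level

meta = Hdr10StaticMetadata(1000.0, 0.005, 1000, 400)
# The display reads something like this during the handshake and tone-maps
# anything beyond its own capabilities accordingly.
```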

Yes and no: the more efficient the light source, the less heat. Samsung's panels this year ramp peak brightness up and down, only showing it in short bursts, partly because of this.

Yeah, I read that earlier about the ramping up and down of brightness, and I also read that edge-lit displays (like the Samsung models) can have problems showing games with the proper brightness distribution across the display. Not sure what that would look like, but I believe GAF user Karak said it has a "pillaring" effect of light trailing off behind a character in the center of the display, for instance.
 

Hawk269

Member
Thank you for that info. I've been looking for HDR content for the PC for weeks, ever since I bought my 4K HDR TV.

I can also confirm that the 980Ti is HDR compatible:

[screenshot of the NVIDIA HDR test running on a 980 Ti]


PS: The difference on this test between HDR and SDR is mind blowing.

Would you mind listing and providing links to what you have to download to get this to work and the link for whatever content you are testing?
 

Hawk269

Member

While a 1080 Ti does not exist, even if it did, it could not do 4K at 144fps unless you turn everything down except resolution, and even that is questionable. I have Titan X Pascals in SLI, and The Division maxed out at 4K averages around 78fps in the in-game benchmark. To think a theoretical 1080 Ti could drive 4K/144/HDR is dreaming.
 

dsk1210

Member
Would you mind listing and providing links to what you have to download to get this to work and the link for whatever content you are testing?

I provided a link on the last page; go to the website and, where it asks what to do next, select the SDK rather than the white paper.
 
Would you mind listing and providing links to what you have to download to get this to work and the link for whatever content you are testing?

1) Download these files from Nvidia.

2) Run the file run_hdr.

3) Inside the test you can choose between several different pictures from the UE4 tech demos and switch between HDR, SDR, etc.
 

Durante

Member
So, is this tech retroactive and old games can look better with hdr screens?
No, it needs to be patched.

My point in that discussion was that doing so will not introduce any performance penalty if the game is already rendering to HDR buffers internally (which many high-end games are).
 

mario_O

Member
No, it needs to be patched.

My point in that discussion was that doing so will not introduce any performance penalty if the game is already rendering to HDR buffers internally (which many high-end games are).

Well, this is great news. If all the hdr data is there and it doesn't introduce any performance penalty, I can imagine many devs will patch their games to support this standard.
 

Momentary

Banned
I picked up the LG OLED55e6p, but I hadn't watched anything that supported HDR until just now. It makes content look pretty vivid, I'm not gonna lie. I like it a lot, but it's not gonna make me haul my computer downstairs once games start supporting it.
 
What GPU do you have?

It wasn't my GPU I was worried about.

I have a Pascal Titan X.

What does concern me are the bandwidth limitations of HDMI 2.0.

It seems like it won't be possible to enjoy HDR at 4K:60 with full 4:4:4 chroma.

Color output is limited to 8bpc in this mode. If I drop to 4K:30 or 4K:60 4:2:2, I can pick 10 or 12bpc in the Nvidia control panel.
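
Back-of-envelope math on why (simplified, my own numbers; real HDMI timings include blanking, which pushes the figures a bit higher): HDMI 2.0 carries roughly 14.4 Gbit/s of video data after encoding overhead, and 4K60 with full 4:4:4 chroma only fits at 8bpc.

```python
HDMI20_DATA_GBPS = 14.4  # ~18 Gbit/s TMDS rate minus 8b/10b encoding overhead

def video_gbps(width, height, fps, bits_per_channel, chroma="4:4:4"):
    # 4:4:4 carries 3 full samples per pixel; 4:2:2 averages out to 2 per pixel
    samples_per_pixel = 3 if chroma == "4:4:4" else 2
    return width * height * fps * bits_per_channel * samples_per_pixel / 1e9

for bpc, chroma in ((8, "4:4:4"), (10, "4:4:4"), (10, "4:2:2")):
    rate = video_gbps(3840, 2160, 60, bpc, chroma)
    verdict = "fits" if rate < HDMI20_DATA_GBPS else "doesn't fit"
    print(f"4K60 {chroma} {bpc}bpc: {rate:.1f} Gbit/s -> {verdict}")
```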
 

dsk1210

Member
I picked up the LG OLED55e6p, but I hadn't watched anything that supported HDR until just now. It makes content look pretty vivid, I'm not gonna lie. I like it a lot, but it's not gonna make me haul my computer downstairs once games start supporting it.

Well, you will be missing out on 4K 60fps and HDR. I can't understand the reluctance to hook the PC up to the TV; I have done so for years now. I also have a G-Sync monitor at the side, but it hardly gets used these days.

There is Big Picture mode in Steam for booting games, and DPI scaling can be adjusted at the OS level for 4K use as well.
 

Pagusas

Elden Member
It wasn't my GPU I was worried about.

I have a Pascal Titan X.

What does concern me are the bandwidth limitations of HDMI 2.0.

It seems like it won't be possible to enjoy HDR at 4K:60 with full 4:4:4 chroma.

Color output is limited to 8bpc in this mode. If I drop to 4K:30 or 4K:60 4:2:2, I can pick 10 or 12bpc in the Nvidia control panel.

To be fair, 4K:60 4:2:2 on a TV with a fast-moving game is barely noticeable at all in terms of quality loss. It's more than worth it for HDR during gaming; then you'll switch back to 4:4:4 8-bit for the OS.
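
For anyone wondering what that 4:2:2 trade-off actually is, a tiny sketch (my own illustration): luma stays at full resolution and only the two chroma planes get halved horizontally, which is why it's hard to spot in a fast-moving game.

```python
import numpy as np

# Toy 4x4 frame in Y'CbCr: keep luma (Y) at full resolution and store the
# chroma planes (Cb, Cr) at half horizontal resolution -> 4:2:2
y  = np.random.rand(4, 4)
cb = np.random.rand(4, 4)
cr = np.random.rand(4, 4)

cb_422 = cb[:, ::2]  # one chroma sample per two pixels horizontally
cr_422 = cr[:, ::2]

full   = y.size + cb.size + cr.size          # 4:4:4 sample count
packed = y.size + cb_422.size + cr_422.size  # 4:2:2 sample count
print(packed / full)  # ~0.67: a third of the signal saved, all of it chroma
```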
 

Momentary

Banned
Well, you will be missing out on 4K 60fps and HDR. I can't understand the reluctance to hook the PC up to the TV; I have done so for years now. I also have a G-Sync monitor at the side, but it hardly gets used these days.

There is Big Picture mode in Steam for booting games, and DPI scaling can be adjusted at the OS level for 4K use as well.

Well, I don't want to hog the TV, because my wife likes to watch her shows as well. I have an office with a comfy couch and a big enough monitor that I game on. I'm not reluctant about hooking up a PC to my TV; I'm reluctant about constantly moving a full hard-line water loop build with a Titan X Pascal in it that took me time and a good bit of money to put together. The case also has tempered glass, so if I accidentally knock the edge of it against a wall it will literally explode. I found that out the hard way.

I'll enjoy movies and shows for now. I'll wait for an HDR monitor with 120Hz or more and lower latency to game on. Right now LG has just released a new 38" 21:9 3840x1600 panel that will hopefully get HDR and a 120Hz refresh rate at some point, if companies like Acer and ASUS decide to utilize it.
 
From a similar thread about HDR support on the PC version of Gears of War 4: The Coalition confirmed that the PC version has no HDR support.

Gears 4 has HDR on Xbox One S. It does not support it on PC. PC just started incorporating HDR support on cards recently and many monitors don't have the functionality.

The PC version does support full 4K, Ultrawide, etc.


I believe that we should have a similar confirmation about Forza Horizon 3.

I have pre-ordered the game for the PC and I have an HDR TV, so I would like to know (as many others would, I assume) whether the game will have HDR support or not (at launch, or even later with a patch).
 

Outrun

Member
From a similar thread about HDR support on the PC version of Gears of War 4: The Coalition confirmed that the PC version has no HDR support.




I believe that we should have a similar confirmation about Forza Horizon 3.

I have pre-ordered the game for the PC and I have an HDR TV, so I would like to know (as many others would, I assume) whether the game will have HDR support or not (at launch, or even later with a patch).

Yes, I was about to post the same thing.

It is funny how this is a thing now.
 

dark10x

Digital Foundry pixel pusher
Thank you for that info. I've been looking for HDR content for the PC for weeks, ever since I bought my 4K HDR TV.

I can also confirm that the 980Ti is HDR compatible:

[screenshot of the NVIDIA HDR test running on a 980 Ti]


PS: The difference on this test between HDR and SDR is mind blowing.
Be sure to try the other test images included. Some of the darker scenes are far more impressive.
 

Vipu

Banned
While a 1080 Ti does not exist, even if it did, it could not do 4K at 144fps unless you turn everything down except resolution, and even that is questionable. I have Titan X Pascals in SLI, and The Division maxed out at 4K averages around 78fps in the in-game benchmark. To think a theoretical 1080 Ti could drive 4K/144/HDR is dreaming.

Yeah, maxed, but you can set the settings smartly so it looks almost the same while you get a lot more fps.
In all games there are settings that eat tons of fps but make barely any visible difference.
So yeah, it would be possible; the goal would not be a stable 144fps anyway, just 80-100 would already be enough.
 

x3sphere

Member
It wasn't my GPU I was worried about.

I have a Pascal Titan X.

What does concern me are the bandwidth limitations of HDMI 2.0.

It seems like it won't be possible to enjoy HDR at 4K:60 with full 4:4:4 chroma.

Color output is limited to 8bpc in this mode. If I drop to 4K:30 or 4K:60 4:2:2, I can pick 10 or 12bpc in the Nvidia control panel.

I noticed I can't pick Full dynamic range when in 10bpc mode; is that normal? It's locked to Limited.

In the HDR SDK test... 8bpc + Full dynamic range looks better than 10bpc + Limited. The colors seem way oversaturated in 10bpc. I have a C6 OLED + Titan X Pascal
 

MazeHaze

Banned
I noticed I can't pick Full dynamic range when in 10bpc mode; is that normal? It's locked to Limited.

In the HDR SDK test... 8bpc + Full dynamic range looks better than 10bpc + Limited. The colors seem way oversaturated in 10bpc. I have a C6 OLED + Titan X Pascal
It's a limitation of HDMI; you need to lock to 30Hz or drop your resolution from 4K to get 10bpc Full/4:4:4.
 

MazeHaze

Banned
Just tried out the HDR demo myself. I feel like some of the SDR examples are more washed out than SDR games actually look on my HDR display. The one on the dark train tracks is legit, though: as soon as you switch it from SDR to HDR, those bright lights get way brighter and the darks get way darker without losing any detail.
 

IMACOMPUTA

Member
Thank you for that info. I've been looking for HDR content for the PC for weeks, ever since I bought my 4K HDR TV.

I can also confirm that the 980Ti is HDR compatible:

[screenshot of the NVIDIA HDR test running on a 980 Ti]


PS: The difference on this test between HDR and SDR is mind blowing.

Man, that's obviously some watered down bargain basement HDR and not REAL HDR.

/sarcasm

I really want to know if FH3 on PC is going to support the xb1 impulse triggers.
 