
Game Graphics Technology | 64bit, procedural, high-fidelity debating

HTupolev

Member
I get that retrieving data (that's already baked
The irradiance data doesn't get "retrieved." Outside of runtime, it doesn't exist anywhere. Far Cry 3's probes don't store irradiance samples for different environment conditions that get looked up and interpolated, they store a set of coefficients to a transfer function.

I guess it would be possible in a weird philosophical sense to describe the transfer function as an encoding of a large implicit data set, and the real-time parameters as indices, but you'd be drawing that line in an odd place. If we really want to go there, we could just as easily describe all real-time GI methods as baked, since the scene and light sourcing together form an index with which we can look up the lighting result.
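To make that concrete, here is a deliberately toy Python sketch of the baked-coefficients-plus-runtime-parameters idea; the basis and the numbers are hypothetical, not Far Cry 3's actual data layout:

```python
# Toy sketch of "baked transfer function, real-time evaluation."
# Hypothetical basis and coefficients, not Far Cry 3's actual representation.

def basis(sun_dir):
    """A tiny SH-like basis evaluated from the runtime sun direction."""
    x, y, z = sun_dir
    return [1.0, x, y, z]  # band-0 constant plus three linear terms

def probe_irradiance(coeffs, sun_dir):
    """Irradiance = dot(baked coefficients, basis(runtime parameters)).
    The coefficients are fixed offline; only the inputs change per frame."""
    return sum(c * b for c, b in zip(coeffs, basis(sun_dir)))

# Baked offline for one probe; evaluated every frame as the sun moves.
coeffs = [0.8, 0.1, 0.5, 0.2]
noon = probe_irradiance(coeffs, (0.0, 1.0, 0.0))  # sun overhead
dusk = probe_irradiance(coeffs, (1.0, 0.0, 0.0))  # sun at the horizon
```

The point being that `coeffs` is the only thing stored on disk; the irradiance values never exist anywhere until evaluation happens each frame.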

I'm just speaking to the word "dynamic" in my own warped definition.
It's not an issue of definition, it's an issue of context. I'm saying "X is done in real time", and you're objecting with "Y is not done in real time." Or, more precisely, I'm saying "the transfer function is evaluated in real time", and you're objecting with "it's not real-time because the transfer function isn't itself modified in real time."
 

gamerMan

Member
According to the Senior Shading Artist at Naughty Dog: fur and stubble are not using alpha cards or displacement, they are using similar tech called parallax, combined with "shell" tech,

Here is her answer in the comments: https://www.artstation.com/artwork/4lGm8

Are you sure she is talking about the facial hair? That page you linked to is talking about the fur shader. It is obvious that the fur shader is using the shell technique and parallax mapping.

The facial hair looks like it is using hair sheets with alpha-transparent textures. Here is an explanation of the technique.

http://udn.epicgames.com/Three/CreatingHairUsingAlpha.html

Hair_03.jpg


Hair_04.jpg
 
Yeah, it seems like a much easier technique (i.e. overlapping shells that use a PRNG to pick a pre-masked texture from which hairs grow out of the surface) than sprite cards, although the approach seems similar. Among the drawbacks I see, getting rid of aliasing would be high on the list. Also, the diffuse wrap lighting (evaluating NdotL beyond the normal range) doesn't really add self-shadowing to the hair. I'd rather see ray-tracing for shadows, or some form of occlusion to give the hair more shape (especially when the hair is indirectly lit). Obviously getting it to react to forces, gravity, etc. would be a plus.
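For anyone unfamiliar with the shell approach being discussed, here is a minimal Python sketch of the masking idea (hypothetical, for illustration only; the game's actual implementation is not public): each shell keeps only the texels whose per-texel random value clears a density threshold that shrinks toward the tips.

```python
import random

def shell_masks(width, height, layers, density=0.3, seed=42):
    """Build alpha masks for stacked fur shells: a texel survives on layer k
    only if its per-texel random value clears a threshold that shrinks with
    layer height, so fewer 'hairs' reach the outer shells (tapered tips)."""
    rng = random.Random(seed)
    base = [[rng.random() for _ in range(width)] for _ in range(height)]
    masks = []
    for k in range(layers):
        thresh = density * (1.0 - k / layers)  # thinner toward the tips
        masks.append([[1 if v < thresh else 0 for v in row] for row in base])
    return masks

masks = shell_masks(8, 8, layers=4)
# Hair coverage never increases as we move outward through the shells.
counts = [sum(map(sum, m)) for m in masks]
```

Rendering then draws the layers innermost to outermost, each offset along the surface normal, which is where the parallax-like depth impression comes from.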


https://www.artstation.com/artwork/JOY4A

Here are some features we used to create the volumetric look of the hair, realtime in-game:
1. Using a baked shadow map and wrap diffuse to fake the shadow. It helps us get the feeling of depth and volume in the hair.
2. Scatter is really important for blonde hair to look right, because light scatters between the hair strands.
We can't afford ray-tracing, of course, so we approximated a similar result by wrapping the lights and offsetting dot(N,V).
3. We used a typical Kajiya-Kay specular model for the hair's specular. We also use hair ID maps to mask out the offset value of the spec.
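The wrap diffuse and Kajiya-Kay terms mentioned above are standard, published formulas; here is a minimal Python sketch of both (my own parameter values, not Naughty Dog's):

```python
import math

def wrap_diffuse(n_dot_l, wrap=0.5):
    """Wrap lighting: lets light 'bleed' past the terminator (NdotL < 0),
    a cheap stand-in for scattering between strands."""
    return max(0.0, (n_dot_l + wrap) / (1.0 + wrap))

def kajiya_kay_spec(t_dot_h, exponent=64.0):
    """Kajiya-Kay hair specular: built from the strand tangent T rather than
    the surface normal. The highlight peaks when H is perpendicular to T."""
    sin_th = math.sqrt(max(0.0, 1.0 - t_dot_h * t_dot_h))
    return sin_th ** exponent

# Back-facing strands still receive some light with wrap:
assert wrap_diffuse(-0.25) > 0.0
```

Wrap lighting keeps thin strands from going fully black at the terminator, and the tangent-based specular gives the characteristic elongated hair highlight.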

Are you sure she is talking about the facial hair? That page you linked to is talking about the fur shader. It is obvious that the fur shader is using the shell technique and parallax mapping.

The facial hair looks like it is using hair sheets with alpha-transparent textures. Here is an explanation of the technique.

I don't know exactly, since she didn't say that much about the hair type on the hair page, but I presumed she used the same tech as the fur. We will see a more in-depth tech analysis at SIGGRAPH.
 

Frozone

Member
The irradiance data doesn't get "retrieved." Outside of runtime, it doesn't exist anywhere. Far Cry 3's probes don't store irradiance samples for different environment conditions that get looked up and interpolated, they store a set of coefficients to a transfer function.

I guess it would be possible in a weird philosophical sense to describe the transfer function as an encoding of a large implicit data set, and the real-time parameters as indices, but you'd be drawing that line in an odd place. If we really want to go there, we could just as easily describe all real-time GI methods as baked, since the scene and light sourcing together form an index with which we can look up the lighting result.

That's really what I'm getting at. :)

It's not an issue of definition, it's an issue of context. I'm saying "X is done in real time", and you're objecting with "Y is not done in real time." Or, more precisely, I'm saying "the transfer function is evaluated in real time", and you're objecting with "it's not real-time because the transfer function isn't itself modified in real time."

I get what you are saying.
 
She was only speaking about the fur collars, which are clearly using (pretty oldschool) shell tech that I first remember seeing back in 2002 in Star Fox Adventures (Conker Live & Reloaded and Shadow of the Colossus also used this). The stubble and all other hair on the characters are done with alpha cards (read her post again and the title of the image set, she's pretty clear about the shells being used only for fur).

The parallax she's referring to is just the result of having a number of instances of fur geo stacked on top of one another with little dots of fur visible (with the rest of the polys alpha masked out). It looks great looking straight down at the tips of the fur up to about a 45 degree angle, but you can see the stacked polys start to separate around the silhouette edges when the fur volume becomes perpendicular to the camera.

I don't know whether you have the game or the opportunity to play around with it in photo mode, but check it out. It's pretty easy to see the hair polygons if you zoom in and drop the FOV all the way down to get way closer than you're really supposed to. Nadine's hair in particular is fantastic looking by any modern standards, but it's still clearly made up of a bunch of hanging polygon strips (with some basic physics) and alpha-masked curls. Interspersed are a few 'hero curls' that have the full spiral rings of hair modeled out, but the majority is essentially flat planes with textured images of curly hair.

For the beards and stubble, check a page or two back (or just about any of the making-of videos on their animation tech) and you'll see the hundreds of little triangles that have alpha mapped beard stubble on them for Drake and the other guys.

EDIT: You can see the separation and breakdown of the effect that takes place on the glancing silhouette edges in her shot here (exact same effect as Sully and Sam's fur lined jackets but just with fewer shell steps):

Yeah I know that tech. There is a similar third party solution for UE4.
 

Javin98

Banned
I'm not sure how many of you are aware of this, but patch 1.03 of The Witcher 3 on consoles claims to reduce vegetation shadow pop-in. For this to be possible, shadow draw distance would have to be increased, so I set out to test it. Until this week, I couldn't find the exact location that DF used for a perfect comparison, but a recent thread mentioned it and I was able to get the exact location: Drahim Castle. Without further ado, here's the comparison:


10KtO7h.png


Both PS4 versions, top one is DF's shot on patch 1.01. Bottom one is a shot I took earlier with the latest patch installed. While the improvement is marginal, it's interesting to see shadow draw distance improved via a patch. This is a lot of work, but I wonder if someone could go to Drahim Castle and test out the foliage setting on PC.

Also, as a side note, I'm on my second playthrough of Uncharted 4, anyone want shots for any graphical techniques?
 

nOoblet16

Member
Also, as a side note, I'm on my second playthrough of Uncharted 4, anyone want shots for any graphical techniques?

PoM with self shadowing please.
Also shadow draw distance; I want to check whether the distant shadows they use are baked or real-time, since ND have been known to mix both.
 

Javin98

Banned
PoM with self shadowing please.
Also shadow draw distance; I want to check whether the distant shadows they use are baked or real-time, since ND have been known to mix both.
Hmm, POM with self shadowing can be kinda hard. I've certainly seen a lot of POM, but I'm not very good at spotting self shadowing. I'll try my best regardless. Be back with shots tomorrow. As for shadows, I think it's safe to say that anything dynamic, moving and destructible is using dynamic shadows. Unless I'm mistaken you can't bake shadows unless the objects are static and non-interactive, right?
 

KKRT00

Member
Hmm, POM with self shadowing can be kinda hard. I've certainly seen a lot of POM, but I'm not very good at spotting self shadowing. I'll try my best regardless. Be back with shots tomorrow. As for shadows, I think it's safe to say that anything dynamic, moving and destructible is using dynamic shadows. Unless I'm mistaken you can't bake shadows unless the objects are static and non-interactive, right?

It's easy. In a scene where Drake has his flashlight on, turn him toward a surface with POM, then go into photo mode and look at the POM surface from an angle that also faces the flashlight.

POM doesn't create geometry, so it doesn't get real cast shadows; any self-shadowing has to be handled inside the algorithm that does the POM shading.
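The usual way to add that is a second height-field march toward the light inside the pixel shader. A 1-D Python sketch of the test (illustrative only, not any particular engine's code):

```python
def pom_self_shadow(heights, x, light_slope, steps=16):
    """March from the hit point toward the light over a 1-D heightfield.
    If any sample of the heightfield rises above the light ray, the point
    is self-shadowed. heights: list of heights in [0, 1]; x: start index;
    light_slope: rise of the light ray per texel toward +x."""
    h0 = heights[x]
    for i in range(1, steps + 1):
        xi = x + i
        if xi >= len(heights):
            break  # marched off the surface: unoccluded
        ray_h = h0 + light_slope * i
        if ray_h >= 1.0:
            break  # ray left the height volume: unoccluded
        if heights[xi] > ray_h:
            return True  # a bump blocks the light
    return False

field = [0.1, 0.1, 0.1, 0.9, 0.1]  # flat ground with one tall ridge
shallow = pom_self_shadow(field, 0, light_slope=0.1)  # grazing light: blocked
steep = pom_self_shadow(field, 0, light_slope=0.5)    # high light: clears it
```

The real shader version marches in UV space against the same height texture POM already samples, which is why the shadow term comes almost for free.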
 

Javin98

Banned
It's easy. In a scene where Drake has his flashlight on, turn him toward a surface with POM, then go into photo mode and look at the POM surface from an angle that also faces the flashlight.

POM doesn't create geometry, so it doesn't get real cast shadows; any self-shadowing has to be handled inside the algorithm that does the POM shading.
Oh, good point. Thanks! I'll give it a try and post shots here tomorrow. Also, as a tip to those who have the game: if you want to verify that a bumpy surface is POM, simply turn on Tri Color mode. If it isn't rendered, it simply isn't geometry. From what I noticed, footprints in the snow are POM.
 
This is a lot of work, but I wonder if someone could go to Drahim Castle and test out the foliage setting on PC.

Here you go, though it's with a desaturated Reshade. It's interesting that the foliage shadows have their own setting. Buildings still cast shadows far away, even on PS4.

I also experimented (a while ago) with pushing the foliage detail setting beyond ultra. Check out the distant trees in the bottom two screens, between the big oak on the left and the windmill on the right. Very performance-intensive, though...

EDIT: Huh. Also just noticed it added extra branches to the oak on the left as well. Presumably those would have popped in as I got closer on default ultra settings, but they render from farther away when you bump up the foliage detail setting. So it's much more than just shadow range.

27158815190_a9bb6edd0f_o.png


ULTRA
26826355324_81985be686_o.jpg


BEYOND ULTRA
26826354484_cf73941144_o.jpg
 

Javin98

Banned
Here you go, though it's with a desaturated Reshade. It's interesting that the foliage shadows have their own setting. Buildings still cast shadows far away, even on PS4.

I also experimented (a while ago) with pushing the foliage detail setting beyond ultra. Check out the distant trees in the bottom two screens, between the big oak on the left and the windmill on the right. Very performance-intensive, though...
Thanks, but I wasn't asking for ultra screens. Obviously the console versions are no match for the PC version on Ultra. :p

I was hoping for Medium or High comparisons.
 
I thought this was a nice touch in ROTR. I don't think I've seen a projector cast an image onto a character before in a game. It seems fancy.

Many games have done this.
Doom 3 can only render stencil shadows with hard distinct lines as per the shadows in image 1.
DOOM 3 can render any kind of shadow that the OGL engine of that time supports.

You can have full soft-shadowing in DOOM 3 with mods like Sikkmod or other graphics mods.
 
Thanks, but I wasn't asking for ultra screens. Obviously the console versions are no match for the PC version on Ultra. :p

I was hoping for Medium or High comparisons.

I can do that too! I just doubled the foliage shadows from Ultra to see how that would look here out of curiosity, and now I don't want to live without it =(

 

Javin98

Banned
I can do that too! I just doubled the foliage shadows from Ultra to see how that would look here out of curiosity, and now I don't want to live without it =(
Thanks! What setting is this pic set to? The shadows still render pretty far compared to my PS4 shot.
 
Thanks! What setting is this pic set to? The shadows still render pretty far compared to my PS4 shot.

That one is set to almost double Ultra's setting. Ultra setting is 1.8, Low is 0.8, and the one above is 3.0. It's a setting called FoliageDistanceScale. Very pretty, very painful for my PC =P

Below are the game's default low, medium, and then high settings for this location. Note that it also affects the draw distance of trees, most noticeable at the low setting. Ultra adds trees to the back left of this scene that even High doesn't quite render.

LOW
26826678244_9a2b59a5a5_o.png


MEDIUM
26826690134_e0b2cc0409_o.png


HIGH
26826699254_fa75050ed5_o.png
 
DOOM 3 can render any kind of shadow that the OGL engine of that time supports.

You can have full soft-shadowing in DOOM 3 with mods like Sikkmod or other graphics mods.

Default idtech 4 as it is present in DOOM 3 cannot do that though, which was what I was referencing.
 

Javin98

Banned
That one is set to almost double Ultra's setting. Ultra setting is 1.8, Low is 0.8, and the one above is 3.0. It's a setting called FoliageDistanceScale. Very pretty, very painful for my PC =P

Below are the game's default low, medium, and then high settings for this location. Note that it also affects the draw distance of trees, most noticeable at the low setting. Ultra adds trees to the back left of this scene that even High doesn't quite render.
Thanks for the comparisons! Looks like the shadow draw distance is between Low and Medium, then.
 

Javin98

Banned
So, as per request, I've been looking for POM with self shadowing in Uncharted 4 and I'll just admit it, I'm not that good at spotting it. There are many instances where I'm quite sure it's POM, but I'm not sure if it self shadows. As such, I'm a bit afraid of posting those pics because I don't want to spread misinformation. However, Beyond3D is a good source for this and one user in particular has many examples. Credits to Clukos:

uncharted4_athiefsend0uxg6.jpg

uncharted4_athiefsendiwa3o.jpg


Now I'm not sure if he is entirely right, so hopefully someone with more technical know-how can weigh in on this. Also, I got a shot of how far the dynamic shadows can draw. I'll post it tomorrow.

Note to Clukos if he reads this thread:
Sorry for always stealing your pics. Take it as a compliment that I use them because they are good examples :p
 

dr guildo

Member
Are you sure she is talking about the facial hair? That page you linked to is talking about the fur shader. It is obvious that the fur shader is using the shell technique and parallax mapping.

The facial hair looks like it is using hair sheets with alpha-transparent textures. Here is an explanation of the technique.

http://udn.epicgames.com/Three/CreatingHairUsingAlpha.html

Hair_03.jpg


Hair_04.jpg

In-game hair model:
27156671996_66f522f707_o.png

27156674726_bd4fb15264_o.png

27156680586_29f73b5c19_o.png

27156679606_b128a7570b_o.png


Their solution for hair is simply amazing!
 

dr guildo

Member
It's good; it has pretty nice shading on Drake, though.
An amazing solution for hair is in Paragon; I really hope that more devs will copy what they did there.

I didn't know about the Paragon solution for hair. I found this:
maxresdefault.jpg


I completely agree with you, it's the most convincing thing I have seen for hair. Is it really this quality in-game?
 

Moosichu

Member
Has anyone here done a graphics project at University?

We have to meet the following objectives:

To display a range of Computer Science skills involved in the design, implementation and testing of a significant computer system. Usually this is a piece of software but it could be hardware or even the assembly of a knowledge base or a mechanically-assisted proof.

To demonstrate your ability to plan and carry out a large project in a coherent and effective way, adhering to the principles of design, quality and management required for good software engineering.

To show an understanding of the context in which your selected project lies. This includes the relationship of the task to the broad surrounding areas of Computer Science and other project-specific fields as well as an awareness of known results and the literature that supports your particular specialist area.

To select (and justify your selection of) suitable programming languages, techniques, algorithms, tools and data structures and convince the Examiners that you can learn new ones as necessary.

To plan and organise the collection and presentation of evidence that will show that the end result behaves in the way intended.
To prepare a formal report (the dissertation) in clear and concise expository form which will convince its readers that objectives 1-5 have all been achieved.

Any good graphics ideas that could do this?
 

Noobcraft

Member
Has anyone here done a graphics project at University?

We have to meet the following objectives:



Any good graphics ideas that could do this?
Someone I know made his own 3D rendering software for a computer science project. Now he works at Microsoft.
 

Javin98

Banned
Nooblet also asked for a shot of the draw distance for dynamic shadows, so here it is:
sSevvUq.png


You can see the trees in the distance still cast shadows. The trees sway in the wind and the shadows will follow the movement of the trees. So unless I'm mistaken, I'm guessing that proves that it is dynamic and not baked into the environment. In fact, Uncharted 4 should be relying much less on baked lighting and shadowing than previous games (and The Order), because most objects can be destroyed or will at least react to wind and player movements, unlike the previous games which had mostly static environments.
 

KKRT00

Member
I didn't know about the Paragon solution for hair. I found this:
maxresdefault.jpg


I completely agree with you, it's the most convincing thing I have seen for hair. Is it really this quality in-game?

I have not played the game yet, but that was the goal of this tech, and I don't think they have another solution for hair, really.
 
So, as per request, I've been looking for POM with self shadowing in Uncharted 4 and I'll just admit it, I'm not that good at spotting it. There are many instances where I'm quite sure it's POM, but I'm not sure if it self shadows. As such, I'm a bit afraid of posting those pics because I don't want to spread misinformation. However, Beyond3D is a good source for this and one user in particular has many examples. Credits to Clukos:

uncharted4_athiefsend0uxg6.jpg

uncharted4_athiefsendiwa3o.jpg


Now I'm not sure if he is entirely right, so hopefully someone with more technical know-how can weigh in on this. Also, I got a shot of how far the dynamic shadows can draw. I'll post it tomorrow.
I am not seeing any POM self-shadows here, just the even shading you would get from normals alone (so there is proper darkening of the discrete surfaces, but no shadow casting).
The best tell is darkening from the shaded side of one offset surface being projected onto another, or beyond the confines of the offset but still within the texture surface, like here in Crysis 2 (you can even see the shadow hitting the top part of another offset):
crysis2_2015_04_01_18waox2.png
 

Javin98

Banned
I am not seeing any POM self-shadows here, just the even shading you would get from normals alone.
Yep, like I said, it's not necessarily right info. I wasn't sure, so I brought it here for debate. Just to clarify, are you saying this isn't POM? Or just POM without self shadowing?
 
Yep, like I said, it's not necessarily right info. I wasn't sure, so I brought it here for debate. Just to clarify, are you saying this isn't POM? Or just POM without self shadowing?

Yeah, to be clear, it looks like POM, but without self-shadowing.
I can't remember, did Crysis 3 feature PBR?

I think they may have added it to the engine for Ryse.

Unfortunately this still has to be done (as I have been lazy), but we need a better definition of this term.

Crysis 3 has elements of what one would consider PBR features, like linear lighting and proper light fall-off, as well as things like Fresnel and local reflections. But it did not use a normalised microfacet model to represent surfaces, nor did it use a library of materials to compose object surfaces. Rather, a lot of it was purely artist-driven on a per-level basis, with a lot of lighting information baked into the diffuse textures, which goes against most PBR workflow ideas of decoupling lighting from the diffuse.

With Ryse they changed this as you mentioned for the latter part.
 

Javin98

Banned
Yeah to be clear, it looks like pom, but without self shadowing.
Ah, I see. That's the problem I had when I was looking for self shadowing POM myself. I could find many instances with POM, but without self shadowing. I think self shadowing is rarely, if ever used on POM in Uncharted 4, really. Also, I think I should clarify that the examples are not geometry. With Tri-Color mode enabled, those bumpy surfaces simply disappear.
 

nOoblet16

Member
Has anyone here done a graphics project at University?

We have to meet the following objectives:



Any good graphics ideas that could do this?

I made a real time particle rendering system using CUDA, for my final year project as part of my undergraduate degree in traditional Computer Science.
Everything was coded from scratch rather than using any tools or an engine, of course. Something like this:
https://www.youtube.com/watch?v=2VRQE_jNyXo

Or you can try a random terrain generator:
https://www.youtube.com/watch?v=03vn1Nv9hyk

Or if you are feeling smart you can mix both and have a terrain generator with fluid simulation on top of it where the fluid flows through this randomly generated terrain...you can most likely use something like Blender3D or Unity for this and your supervisors won't mind as long as it looks nice.

You can also do ocean wave simulation using FFT.
https://www.youtube.com/watch?v=IrUehq6vJss
https://www.youtube.com/watch?v=ujnkAQ52t0Y
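The core of the FFT ocean approach fits in a few lines of numpy: shape random frequency-domain noise with a spectrum, then inverse-transform to get a tileable height map. A toy sketch (the spectrum here is only loosely Phillips-like, not a production Tessendorf implementation):

```python
import numpy as np

def ocean_heightfield(n=64, wind=1.0, seed=0):
    """Tiny Tessendorf-style sketch: fill a frequency-domain grid with
    random complex amplitudes shaped by a Phillips-like spectrum, then
    inverse-FFT to get a tileable ocean height map."""
    rng = np.random.default_rng(seed)
    kx = np.fft.fftfreq(n)[:, None]
    ky = np.fft.fftfreq(n)[None, :]
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                      # avoid divide-by-zero at DC
    phillips = np.exp(-1.0 / (k2 * wind**2)) / k2**2
    phillips[0, 0] = 0.0                # remove the DC component
    noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    spectrum = noise * np.sqrt(phillips / 2.0)
    return np.fft.ifft2(spectrum).real

h = ocean_heightfield()  # 64x64 tileable, zero-mean height map
```

Animating it amounts to advancing each frequency bin's phase by its dispersion-relation frequency before the inverse FFT, which is why the technique is so cheap per frame.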
 

riflen

Member
General question: since when has HDR been featured on GPUs (for both nVidia and AMD)?

I'm pretty confident that GPUs have been rendering games with high dynamic range for over 10 years already. Once we got programmable shaders it became feasible I think. The confusion really is that HDR is a very loose term and usually just means "higher-than-we-had-before range".

I believe this new burst of marketing around HDR refers to several things coming together.

-New HDMI revision that can support enough bandwidth to transmit the data.
-New video codec that supports greater precision and can preserve a greater range.
-OLED panels that can resolve a greater range.

This is about video, so to support this, GPUs don't have to do much other than feature output ports that meet a particular standard.

If you're wondering about HDR in games today, we already have it to a degree, but it would be nice to have OLED monitors to do it better justice.
I think Dell has released the first one; it costs $5k and features a 400,000:1 contrast ratio, a 0.1 ms response time, 4K resolution and 120 Hz.
 

nOoblet16

Member
Yeah, that's significantly worse than Uncharted.

What's to say Uncharted's hair doesn't look like ass without TAA?

Any kind of hair would look crap without AA, if it's alpha blended rather than purely polygonal then some sort of temporal AA will be needed too.
 
No, it's dither city in the game. It pretty much relies on the temporal AA to hide it. Here's the game maxed out, AA off/on

That is kinda cool honestly (making such a smart use of TAA, presuming you are OK with the side effects)... though it would be nice if they offered actual transparency shading support in the engine.

----

I just saw this and thought it looked fantastic btw.

Reset's procedural cloud / atmosphere simulation:
https://giant.gfycat.com/IdenticalNimbleFerret.webm
 
What's to say Uncharted's hair doesn't look like ass without TAA?

Any kind of hair would look crap without AA, if it's alpha blended rather than purely polygonal then some sort of temporal AA will be needed too.

The right side has AA enabled, which is what I'm focusing on.


Yeah, that's awesome.
 

dr_rus

Member
General question: since when has HDR been featured on GPUs (for both nVidia and AMD)?

If you mean the processing in FP16/FP32 then since FX and R300.
If you mean using >32-bit buffers to store the results, then since NV40 for NV and R520 for AMD (although that's mostly for blending-enabled targets, allowing them to be used as frame buffers; storing was probably possible before that).
If you mean the >24bit output then I don't really remember but it's been some time.
If you mean the support for standards used in modern HDR displays (including HDR video on Blu-Ray as well) then it's since Pascal/Polaris.
 

dr guildo

Member
That is kinda cool honestly (making such a smart use of TAA, presuming you are OK with the side effects)... though it would be nice if they offered actual transparency shading support in the engine.

----

I just saw this and thought it looked fantastic btw.

Reset's procedural cloud / atmosphere simulation:
https://giant.gfycat.com/IdenticalNimbleFerret.webm

For the moment, I think that no studio surpasses what Guerrilla Games has achieved with their cloudscape engine:
https://www.youtube.com/watch?v=ezKR3VuqD9k

https://www.guerrilla-games.com/read/the-real-time-volumetric-cloudscapes-of-horizon-zero-dawn

But Theory's work is amazing too, just a bit below GG's, because GG's cloudscape is more varied: it can create many types of clouds in the same sky. In the video I shared, you can see cirrus + cumulus in the sky at the 30-second mark. In Theory's sky, I see only cumulus.
 
If you mean the processing in FP16/FP32 then since FX and R300.
If you mean using >32-bit buffers to store the results, then since NV40 for NV and R520 for AMD (although that's mostly for blending-enabled targets, allowing them to be used as frame buffers; storing was probably possible before that).
If you mean the >24bit output then I don't really remember but it's been some time.
If you mean the support for standards used in modern HDR displays (including HDR video on Blu-Ray as well) then it's since Pascal/Polaris.

Doesn't AMD state that all GCN GPUs will support HDR monitors? They have been using it as a point of marketing against nVidia, IIRC, at least when it comes to games.
 

riflen

Member
Doesn't AMD state that all GCN GPUs will support HDR monitors? They have been using it as a point of marketing against nVidia, IIRC, at least when it comes to games.

Yeah but it's marketing. Games already render in HDR, it's just that they tone map down for SDR displays. Definitely Maxwell (maybe Kepler, not sure) already has everything you need for HDR games. The game just requires a patch to have the tone mapping revised for HDR displays and it's pretty simple to implement, according to developers. I think DisplayPort 1.2 is enough for HDR support at typical panel resolutions and refresh rates.
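For context on what "revising the tone mapping" means: it is conceptually a tiny final step. Here is the extended Reinhard operator as one common example in Python (a generic illustration, not what any particular game ships):

```python
def reinhard_tonemap(luminance, white=4.0):
    """Extended Reinhard: compress unbounded HDR luminance into [0, 1],
    with values at 'white' mapping to exactly 1.0. An HDR-display patch
    essentially swaps or re-parameterizes this one step."""
    return luminance * (1.0 + luminance / (white * white)) / (1.0 + luminance)

# 0 maps to 0; the chosen white point maps to exactly 1.0.
lo = reinhard_tonemap(0.0)
hi = reinhard_tonemap(4.0)
```

Targeting an HDR display mostly means choosing a different output curve (and color encoding) here, rather than compressing everything down to SDR's range.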
 
Yeah but it's marketing. Games already render in HDR, it's just that they tone map down for SDR displays. Definitely Maxwell (maybe Kepler, not sure) already has everything you need for HDR games. The game just requires a patch to have the tone mapping revised for HDR displays and it's pretty simple to implement, according to developers. I think DisplayPort 1.2 is enough for HDR support at typical panel resolutions and refresh rates.

I thought I remembered reading that something was missing from all nVidia GPUs pre-Pascal for HDR monitor support.
 

dr_rus

Member
Doesn't AMD state that all GCN GPUs will support HDR monitors? They have been using it as a point of marketing against nVidia, IIRC, at least when it comes to games.

A GPU must conform to the HDMI 2.0a+ / DP 1.4+ standards to actually support the upcoming HDR monitors. Afaik no AMD GCN GPU prior to Polaris has such support, so I don't know what they mean when they say that. NV GPUs are hardly lacking in standards support on their outputs, so I'm pretty sure that whatever HDR-related stuff GCN supported prior to Polaris, Kepler/Maxwell support as well.

I guess they mean that they can output more than 24 bits from their GPUs' outputs, but the same is true of any modern NV GPU. I've been running my TV at 48-bit HDMI output for some time now. The issue is that this won't really work with new HDR displays, because the output still lacks the signaling needed to control HDR on such displays; that was added to the standards only in HDMI 2.0a / DP 1.4. Maybe GCN cards can program these in somehow, I dunno.
 

riflen

Member
I thought I remembered reading that something was missing from all nVidia GPUs pre-Pascal for HDR monitor support.

Well we'll see I guess. There's not much information on it yet, because I don't think there are any HDR monitors available in the consumer space at the moment.
AMD did say in their CES presentation this year that the older R9 300 range will support HDR for games and photos, so I don't think Polaris is giving you much in this area apart from HDR support at higher resolutions and refresh rates via DP 1.3. This is just concerning HDR for games, mind you.
 

Jux

Member
There are two different things here: HDR rendering and HDR output.
- HDR rendering is what happens inside an engine. Various engines have been doing HDR rendering for quite some time, with different levels of quality (a lot of approximation was used up to last gen, like using RGBM encoding instead of FP16 textures, for instance).
- HDR output requires an HDR TV, and those are only just starting to be shown (some TVs support 10/12-bit inputs, but that's just more precision, not HDR). HDR output from GPUs has been available for quite some time too: for instance, the PS3 could output a backbuffer in FP16, but it was pretty useless with the screens available at the time. With true HDR screens, engines will no longer require a tonemapping step and will output the full dynamic range to the TV, letting the device do the tonemapping.
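The RGBM encoding mentioned above packs an HDR color into four low-precision channels by storing a shared multiplier in alpha. A simplified Python sketch (real implementations quantize to 8 bits and choose the range per title):

```python
def rgbm_encode(rgb, max_range=6.0):
    """Pack an HDR color into RGBM: RGB rescaled into [0, 1] plus a shared
    multiplier M (stored in alpha), a cheap stand-in for FP16 targets."""
    m = min(1.0, max(max(rgb), 1e-6) / max_range)
    return tuple(c / (m * max_range) for c in rgb), m

def rgbm_decode(rgb, m, max_range=6.0):
    """Undo the packing: scale the stored RGB back up by M * range."""
    return tuple(c * m * max_range for c in rgb)

# Round-trip an HDR value brighter than 1.0:
enc, m = rgbm_encode((3.0, 1.5, 0.25))
dec = rgbm_decode(enc, m)  # recovers (3.0, 1.5, 0.25) up to float error
```

The trade-off is one shared exponent-like scale per texel instead of full per-channel float precision, which is exactly the kind of approximation last-gen engines leaned on.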
 