(wall of text incoming)
The technology, both software and hardware, has existed for years and years now.
I'm trying to understand exactly what it is, and it all sounds like PR to me.
This is my understanding of it; I don't know if I'm right, so can someone explain it to me?
There are two aspects to it, software and hardware.
For software, this was introduced (or at least my first exposure to it was) in the Source engine, where it was one of the biggest selling points/technological achievements of the engine.
At the end of the day, in software, you're manipulating how the lighting works and is displayed, thus giving the HDR effect.
It's been so many years since then that I assumed all game engines use HDR lighting to some degree, but the way Sony and Microsoft put it, this is some new thing that only some game developers are programming for now and that needs to be switched on? (This part confuses me.)
Which leads me to the hardware. HDR-enabled games will only work on HDR screens. This I somewhat get.
If you can't display the colours accurately, you'll never see the details.
As a PC monitor user, I believe I have never had this problem, though. My monitor, the Dell U2410, is an 8-bit panel with a 10-bit (or 12-bit) internal colour processor, which was a top-tier colour reproducer for its time (and should still hold up today).
I know that most monitors, even some low-tier IPS panels, are still 6-bit in colour.
tl;dr
Is 'HDR-enabled' TV just the non-technical term for a TV with a 10-bit panel?
If so, I should have had a pseudo 'true' HDR experience for a long time now, since my monitor was capable of processing 10 (or 12) bits of colour.
Were games previously not programmed for HDR, in lighting or colour palette usage (i.e. were developers using fewer than 10 bits of colour all this time)?
:answer:
This is a related but different technique called HDRR (HDR rendering). It uses internal HDR buffers, but the result is tone-mapped down to SDR, with a bloom effect to simulate overbrightness. An HDR display can show a much larger brightness range natively, without the use of bloom.
Software developers need to change the output format and the way the tone-mapping works.
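To make the difference concrete, here's a minimal sketch of that HDRR pipeline step: render into a floating-point buffer, then tone-map it down to 8-bit SDR. The Reinhard operator and the bloom threshold are illustrative choices, not what any particular engine (Source included) actually uses.

```python
import numpy as np

def reinhard_tonemap(hdr: np.ndarray) -> np.ndarray:
    """Compress linear HDR radiance (any value >= 0) into [0, 1] for SDR."""
    return hdr / (1.0 + hdr)

def simple_bloom_mask(hdr: np.ndarray, threshold: float = 1.0) -> np.ndarray:
    """Energy above the SDR maximum; an engine would blur this and add it
    back in, to fake overbrightness on a display that can't show it."""
    return np.clip(hdr - threshold, 0.0, None)

# A fake "scene": mid-grey wall (0.2), bright sky (5.0), the sun (50.0).
scene = np.array([0.2, 5.0, 50.0])

sdr = np.round(reinhard_tonemap(scene) * 255).astype(np.uint8)
print(sdr)                       # [ 42 212 250] -- sun and sky end up nearly identical
print(simple_bloom_mask(scene))  # [ 0.  4. 49.] -- the range that bloom has to stand in for
```

Note how a 10x brightness difference between the sky and the sun collapses into a handful of 8-bit code values; on a true HDR display that range could be output directly.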
The number of bits only tells you the number of brightness steps, not the range or distribution of those steps. HDR monitors have a much higher peak brightness and use a transfer function called SMPTE ST 2084 (or HLG) instead of the "gamma" curves SDR monitors use.
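To illustrate the "range and distribution" point, here's a sketch comparing an SDR gamma-2.2 curve with the SMPTE ST 2084 (PQ) curve. The PQ constants are the ones defined in the standard, which covers an absolute 0 to 10,000-nit range; the 100-nit SDR peak is a common reference assumption, not something from the post above.

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants, as defined in the standard.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits: np.ndarray) -> np.ndarray:
    """Absolute luminance in nits -> PQ signal in [0, 1] (inverse EOTF)."""
    y = np.clip(nits / 10000.0, 0.0, 1.0)  # PQ is defined up to 10,000 nits
    return ((C1 + C2 * y**M1) / (1 + C3 * y**M1)) ** M2

def gamma22_encode(nits: np.ndarray, peak: float = 100.0) -> np.ndarray:
    """Relative SDR encoding, assuming a 100-nit reference display."""
    return np.clip(nits / peak, 0.0, 1.0) ** (1 / 2.2)

levels = np.array([0.1, 1.0, 100.0, 1000.0, 10000.0])
for nits, pq, g in zip(levels, pq_encode(levels), gamma22_encode(levels)):
    print(f"{nits:8.1f} nits  PQ={pq:.3f}  gamma2.2={g:.3f}")
# Everything above 100 nits clips to 1.0 on the SDR curve, while PQ
# still has roughly half of its code values left for 100..10,000 nits.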
HDR =
* 10-bit+ panel
* High-contrast panel (FALD LED LCD or OLED, some projectors)
* High peak brightness (>500 nits)
* SMPTE ST 2084 (for HDR10 and Dolby Vision) and/or HLG transfer function support
* Metadata support for tone-mapping
* Usually also a wide color gamut
Nope, there are no consumer HDR monitors available right now.
I'll add more as there are better answers.