(wall of text incoming)
The technology, both software and hardware, has existed for years and years now.
I'm trying to understand exactly what it is, and it all sounds like PR to me.
This is my understanding of it. I don't know if I'm right, so can someone explain it to me?
There are two aspects to it, software and hardware:
For software, this was introduced (or at least my first exposure to it was) in the Source engine, where it was one of the engine's biggest selling points/technological achievements.
At the end of the day, in software, you're manipulating how the lighting is computed and displayed, which gives the HDR effect.
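To make that "manipulating the lighting" bit concrete, here's a minimal sketch of what software HDR rendering generally means: lighting is computed in floating point (values can exceed 1.0), then a tone-mapping step squashes it into the 0..1 range the display can actually show. I'm using Reinhard's simple operator here as an illustration; I don't know what curve Source actually uses, so treat this as an assumption about the general technique, not that engine.

```python
# Sketch of software "HDR": light is computed in an unbounded linear range,
# then tone-mapped down to the displayable 0..1 range.
# Reinhard's operator x / (1 + x) is one simple, well-known example
# (assumed here for illustration; real engines use fancier curves).

def reinhard_tonemap(hdr_value):
    """Compress an unbounded linear light value into the 0..1 display range."""
    return hdr_value / (1.0 + hdr_value)

# A very bright highlight (10x reference white) still fits under 1.0,
# while a dim surface (0.25) stays roughly where it was:
print(reinhard_tonemap(10.0))   # bright areas compress toward (but never reach) 1.0
print(reinhard_tonemap(0.25))   # dark areas remain roughly linear
```

The point is that the dynamic range lives in the lighting math, and the final image is squeezed to fit whatever the screen can display.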
It's been so many years since then that I assumed all game engines use HDR lighting to some degree, but the way Sony and Microsoft put it, this is some new thing that only some game developers are programming for now, and that needs to be switched on? (This part confuses me.)
Which leads me to the hardware. HDR-enabled games will only work on HDR screens. This I somewhat get.
If you can't display the colours accurately, you'll never see the details.
As a PC monitor user, I believe I have never had this problem though. My monitor, the Dell U2410, is an 8-bit panel with a 10-bit (or 12-bit) internal colour processor, which was a top-tier colour reproducer for its time (and should still hold up today).
I know that most monitors, even some low-tier IPS panels, are still 6-bit in colour.
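For context on those bit-depth numbers, the arithmetic is simple: bits per channel determine how many discrete levels each of R, G, B can take, which is where colour banding comes from on lower-bit panels. Nothing monitor-specific here, just counting:

```python
# Levels per colour channel and total displayable colours at each bit depth.
for bits in (6, 8, 10, 12):
    levels = 2 ** bits        # discrete steps per R/G/B channel
    total = levels ** 3       # combinations across all three channels
    print(f"{bits}-bit: {levels} levels/channel, {total:,} colours")
```

So going from 8-bit to 10-bit is a jump from 256 to 1,024 steps per channel, i.e. far finer gradients, not a wider range by itself.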
tl;dr
Is 'HDR-enabled' TV just the non-technical term for a TV with a 10-bit panel?
If so, I should have had a pseudo 'true' HDR experience for a long time now, since my monitor was capable of processing 10 (or 12) bits of colour.
Were games previously not programmed for HDR, in lighting or colour-palette usage (i.e. were developers using fewer than 10 bits of colour all this time)?
Answers:
I'll add more as better answers come in.