
HDTVTest - TCL Unleash 163-inch Micro LED TV with 10,000 Nits HDR

ShirAhava

Plays with kids toys, in the adult gaming world
I don't understand the brightness race

anything more than 600 nits and I'm like
giphy.gif


just place your screen in a dimly lit room like a sane person
 

CamHostage

Member
We're getting to the point where the size constraint is no longer the TV itself, but getting it into your home. My 85" LCD, due to its depth, only just made it up my stairwell (which is wide and modern). I know of others who have taken out their lounge window to get a TV in.

C5fC9bW5wBX3Vpg5ZkHEWc-320-80.gif


Then I think back to four of us struggling to move a 36" Panasonic CRT into a lounge... wow, do things move on.

For a long while, we had a TV back behind a bigger TV on the same TV stand because we took one try at moving the CRT and said, "That fucker ain't moving." Eventually we had professional movers for a different reason and the CRT finally went away, but at some point we might have just walled that big beast in like a godforsaken Cask of Amontillado...
 

King Dazzar

Member
I don't understand the brightness race

anything more than 600 nits and I'm like
giphy.gif


just place your screen in a dimly lit room like a sane person
Well, 4,000 nits has always been one of the main mastering options, especially with some Dolby Vision content. And that's with full awareness that HDR is meant for dark-room viewing. But it doesn't mean you're permanently getting blasted with maximum luminance.

I was doing some gaming tonight on my 3,000-nit set, as usual with the lights all off. The game (HZD) can go up to 10,000 nits, so all 3,000 nits were often being used. But with that, my panel illuminates the entire room, lol, so peripheral vision all gets evened out by the luminance coming from the 85". So maybe having a larger set helps with it all, as it spreads the general luminance more. Anyway, I never get eye strain any more than when I watch my 65" OLED. Yes, it makes my retinas flinch at times, but that's what I want from my HDR.

Just don't ask me how much power it consumes - it's thirsty!
 

King Dazzar

Member
C5fC9bW5wBX3Vpg5ZkHEWc-320-80.gif




For a long while, we had a TV back behind a bigger TV on the same TV stand because we took one try at moving the CRT and said, "That fucker ain't moving." Eventually we had professional movers for a different reason and the CRT finally went away, but at some point we might have just walled that big beast in like a godforsaken Cask of Amontillado...
When I sold my Panny 36" CRT, I said to the mate buying it: "I'll help you get it in the van, but after that it's all yours forever and ever!!" A few years later he split up with his girlfriend, and when I asked what happened to the Panny, he replied, "She can fucking keep it," lol
 
OLED users: my mole eyes get burned already by my ultra-dim 150-nit screen! N-nobody wants this much brightness!

Lol, ok. Can't wait for this tech to shrink down. The final endgame display technology.
 

Fake

Member
Still a ways off, at the earliest. There are still a number of production and yield issues that don't appear to have a good solution at present.

I guess OLED had similar problems, and the price was way, way higher.

Years have passed and OLED still has screen burn-in issues to fix.
 

Schmendrick

Member
Cool PR stunt, but let's not forget that this thing is only as big as it is because the micro LEDs aren't nearly as micro as they need to be for normal TV sizes with decent PPI... yet.
 

NeoIkaruGAF

Gold Member
OK, cool.
Now show me how a 4K movie moves on it without frame interpolation.
We can have a perfect still image or demo video on any screen.
 

Mattyp

Gold Member
Believe it or not, OLED was once crazy expensive and out of reach for people.
My first Pioneer (not sure if they even make TVs anymore) plasma, a 42", was $10,000. We're talking 15 years ago.

Young cunts on here are dumb as shit. Can't wait for this to be available. Even at $20k I'd replace my 160" projector.

But then again, maybe not - a projector in a dedicated theatre space just gives the right image for movies.
 
Last edited:

Bojji

Member
Came in looking for clueless responses and it didn't take Bojji long.

I know Micro LED is the endgame tech for displays; it has all the advantages of OLED and none of the weaknesses (like Blade...).

What I don't get is this nit obsession; at some point it just gets ridiculous. 4,000 nits is needed to display the full BT.2020 color space, but let's be real: the majority of people (men especially...) won't be able to perceive the differences, and many, many games (I don't know the exact %) never go beyond the sRGB color space. But 10,000 nits? Probably just future-proofing from Dolby; right now there's no logical need for it.

This TV looks like it's in vivid mode (most highlights are probably cranked up to maximum) and requires massive amounts of power; in my opinion it's beyond reasonable levels.
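For what it's worth, the 10,000-nit figure isn't arbitrary: it's the hard ceiling of the PQ transfer function (SMPTE ST 2084) that HDR10 and Dolby Vision are built on. A minimal Python sketch of the PQ EOTF (constants taken from the ST 2084 spec; the function name is mine) shows how a normalized signal value maps to nits, with the full-scale code landing exactly at 10,000:

```python
# PQ (SMPTE ST 2084) EOTF: normalized signal value V in [0, 1] -> luminance in nits.
# The m1/m2/c1/c2/c3 constants are the values defined in the ST 2084 spec.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(v: float) -> float:
    """Decode a normalized PQ code value to absolute luminance in cd/m^2 (nits)."""
    p = v ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(pq_eotf(1.0))  # full-scale signal -> 10000.0 nits, the format's ceiling
print(pq_eotf(0.5))  # half the code range only reaches ~92 nits
```

Note how steep the curve is: half the signal range sits around 92 nits, which is why PQ content spends most of its code budget on the dim-to-moderate range rather than blasting maximum luminance constantly.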
 
Last edited:

NeoIkaruGAF

Gold Member
My first Pioneer (not sure if they even make TVs anymore) Plasma 42" was $10,000. We're talking 15 years ago.
No way you paid that much for a plasma in 2008, especially a 42". It must have been way earlier than that. Unless you're talking Canadian bucks.
 
Is TCL any good? I see lots of deals here in the Netherlands... I'd never heard of it and assumed it was an AliExpress brand.
 
Last edited:

TheStam

Member
I know Micro LED is the endgame tech for displays; it has all the advantages of OLED and none of the weaknesses (like Blade...).

What I don't get is this nit obsession; at some point it just gets ridiculous. 4,000 nits is needed to display the full BT.2020 color space, but let's be real: the majority of people (men especially...) won't be able to perceive the differences, and many, many games (I don't know the exact %) never go beyond the sRGB color space. But 10,000 nits? Probably just future-proofing from Dolby; right now there's no logical need for it.

This TV looks like it's in vivid mode (most highlights are probably cranked up to maximum) and requires massive amounts of power; in my opinion it's beyond reasonable levels.

I've been wondering what the maximum needed nits would be. It's also usually measured on a microscopic part of the screen, or, on my OLED, ABL kicks in and white turns to grey. At some point it's just going to hurt your eyes and get turned down anyway. On a small area of the screen with a dark background, like in Ori, 1,000 nits seems perfectly fine to me, but the frustrating part now is playing on, say, a snow map or in bright daylight and the screen looking dull. It's as if full-screen average nits should be more relevant.
 

S0ULZB0URNE

Member
I know Micro LED is the endgame tech for displays; it has all the advantages of OLED and none of the weaknesses (like Blade...).

What I don't get is this nit obsession; at some point it just gets ridiculous. 4,000 nits is needed to display the full BT.2020 color space, but let's be real: the majority of people (men especially...) won't be able to perceive the differences, and many, many games (I don't know the exact %) never go beyond the sRGB color space. But 10,000 nits? Probably just future-proofing from Dolby; right now there's no logical need for it.

This TV looks like it's in vivid mode (most highlights are probably cranked up to maximum) and requires massive amounts of power; in my opinion it's beyond reasonable levels.
cant-unsee-my-eyes.gif
 

Diddy X

Member
I've been wondering what the maximum needed nits would be. It's also usually measured on a microscopic part of the screen, or, on my OLED, ABL kicks in and white turns to grey. At some point it's just going to hurt your eyes and get turned down anyway. On a small area of the screen with a dark background, like in Ori, 1,000 nits seems perfectly fine to me, but the frustrating part now is playing on, say, a snow map or in bright daylight and the screen looking dull. It's as if full-screen average nits should be more relevant.

Right, it's full bright screens where OLEDs lack brightness.
 
I know Micro LED is the endgame tech for displays; it has all the advantages of OLED and none of the weaknesses (like Blade...).

What I don't get is this nit obsession; at some point it just gets ridiculous. 4,000 nits is needed to display the full BT.2020 color space, but let's be real: the majority of people (men especially...) won't be able to perceive the differences, and many, many games (I don't know the exact %) never go beyond the sRGB color space. But 10,000 nits? Probably just future-proofing from Dolby; right now there's no logical need for it.

This TV looks like it's in vivid mode (most highlights are probably cranked up to maximum) and requires massive amounts of power; in my opinion it's beyond reasonable levels.
My eyes! OMG, who wants 10,000 nits!? What purpose could this possibly serve in accurately capturing the dynamic range of real life, the very thing HDR was created to attempt!?
Meanwhile, in the real, everyday life that YOU live in and never have any problems with brightness:
4Rk27xe.jpeg
 

HeisenbergFX4

Gold Member
My eyes! OMG, who wants 10,000 nits!? What purpose could this possibly serve in accurately capturing the dynamic range of real life, the very thing HDR was created to attempt!?
Meanwhile, in the real, everyday life that YOU live in and never have any problems with brightness:
4Rk27xe.jpeg
The thing I don't get is why so many people feel the need to tell others what they don't need, what they don't want, or what they think is best for my personal viewing pleasure.

If 10k nits were an affordable option tomorrow and someone doesn't want it, then great - I'm not forcing you to buy it.

But if it's something I want, then who cares :)

Hisense announced a 10k-nit, 20k-dimming-zone 110" mini LED TV, the 110UX, that I can't wait to experience in person when my local audio/video place gets one in.
 

dave_d

Member
What I don't get is this nit obsession; at some point it just gets ridiculous. 4,000 nits is needed to display the full BT.2020 color space, but let's be real: the majority of people (men especially...) won't be able to perceive the differences, and many, many games (I don't know the exact %) never go beyond the sRGB color space. But 10,000 nits? Probably just future-proofing from Dolby; right now there's no logical need for it.
One advantage I've seen mentioned: the better forms of black frame insertion (like rolling scan) require brighter displays, since BFI dims things down. Anyway, like someone else wrote, let's see the motion clarity race.
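The BFI point comes down to simple duty-cycle math: if the panel is dark for part of each refresh, perceived brightness scales roughly with the fraction of time it is lit, so holding the same apparent brightness at a shorter duty cycle demands a proportionally brighter panel. A back-of-the-envelope sketch (the function names are mine, not from any display API, and this ignores non-linearities in perception):

```python
def perceived_nits(peak_nits: float, duty_cycle: float) -> float:
    """Approximate time-averaged brightness when the panel is lit for only
    duty_cycle (0..1] of each refresh, e.g. with black frame insertion."""
    return peak_nits * duty_cycle

def required_peak_nits(target_nits: float, duty_cycle: float) -> float:
    """Peak brightness needed to keep target_nits of perceived brightness under BFI."""
    return target_nits / duty_cycle

# A 1000-nit panel with 50% BFI averages out to ~500 nits...
print(perceived_nits(1000, 0.5))      # 500.0
# ...and matching a plain 500-nit image at a CRT-like 25% duty cycle
# needs a 2000-nit panel - one reason motion clarity wants brightness headroom.
print(required_peak_nits(500, 0.25))  # 2000.0
```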
 

Bojji

Member
My eyes! OMG, who wants 10,000 nits!? What purpose could this possibly serve in accurately capturing the dynamic range of real life, the very thing HDR was created to attempt!?
Meanwhile, in the real, everyday life that YOU live in and never have any problems with brightness:
4Rk27xe.jpeg

Eyes adapt to brightness; this is obvious, and it's one of the reasons why chasing ever-higher nits is becoming stupid.

Adaptacion-por-cambios-de-dimensiones-de-la-pupila.png


With lower ambient brightness you can show high dynamic range with far fewer nits.

Interesting stuff about HDR from movie theaters:

Chang reported that the tests involved both direct-view technologies, such as LED panels, and projection. They set up double-blind testing with a wide variety of test viewers, from members of AMPAS and NATO and other professional organizations to students and other non-pros, who were asked to rate the visual experiences.

The results indicated significantly higher ratings as they lowered black levels down towards .05 nits (five millinits) with a flattening of the curve below that level. In terms of highlights, the 108-nit Dolby Cinema spec was “marginally preferred over SDR’s 48-nit” standard, but ratings increased significantly at peak brightness levels up to 300 nits, and then flattened out with the experience of viewing 500- and 800-nit versions being rated roughly the same. While there’s a lot more to do before new standards are written, this testing suggested 300 nits maximum brightness and five millinits darkest black as a possible sweet spot.
Zell also touched on the issue of how our eyes actually respond to light and why more nits doesn't necessarily equate to a better experience. "When we watch a movie at 48 nits or 100 nits," he said, "our irises stay open. When we're looking at 150 nits, they start to close down, and by 300 nits, they are really small." A screen at full white at 300 nits "makes viewers' heads jerk backwards; it makes them cover their eyes with their hands."

 

King Dazzar

Member
My eyes! OMG, who wants 10,000 nits!? What purpose could this possibly serve in accurately capturing the dynamic range of real life, the very thing HDR was created to attempt!?
Meanwhile, in the real, everyday life that YOU live in and never have any problems with brightness:
Great post.
Eyes adapt to brightness; this is obvious, and it's one of the reasons why chasing ever-higher nits is becoming stupid.

Adaptacion-por-cambios-de-dimensiones-de-la-pupila.png


With lower ambient brightness you can show high dynamic range with far fewer nits.

Interesting stuff about HDR from movie theaters:




Why is it stupid? If we can get a TV that can 100% reflect the original mastering, then why not? Quite a lot of mastering at 4,000 nits has already been done and will likely increase - DV was always originally focused on 4,000-nit mastering. MLA emissive is becoming more capable and QD-OLED will advance further too. Do I personally want 10k nits? Not really. But 4,000 nits would be great.

I keep saying this, but many miss the point. Bear in mind that 4,000-nit-capable displays will hit 4,000 nits peak brightness only in small windows; APL and larger windows will be far less, and even then only if you keep the panel in a calibrated mode. Viewing at the mid point/paper white will remain exactly the same. But when you hit a higher-luminance scene, current panels' automatic brightness limiting is no longer a concern, and the panel will also be able to push the specular highlights as originally intended during mastering.

Funny how people seem to have adapted to 1,500-nit MLA and QD-OLED with no complaints at all about them being too bright.
 
Last edited:

Meicyn

Gold Member
I'm sure I'm conversing with vampires
vampire dracula GIF
It's no big deal. It's like when Apple users insisted they didn't need larger screens back when Samsung started making larger phones. Apple sycophants couldn't stop parroting Steve Jobs about how your thumb can only reach so far when using a phone one-handed, so 3.5 inches is what everyone should be happy with.

Then Apple made bigger screens once they started losing market share to Samsung, and now small phones rarely get updates from them.

Same shit will happen with OLED. The moment the tech achieves a breakthrough and manages a sustained 500+ nits full-window, the current talking points will be abandoned. In the interim, we have to suffer through CrossFit-inspired rhetoric.
 