
Old CRT TVs had minimal motion blur; LCDs have a lot of motion blur. LCDs will need 1,000fps@1,000Hz in order to have as little motion blur as CRT TVs
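A quick back-of-the-envelope sketch of where the 1,000fps@1,000Hz figure comes from: on a sample-and-hold panel, eye-tracking blur is roughly object speed multiplied by how long each frame stays lit. The 960 px/s pan speed below is just an illustrative assumption, not a measurement.

```python
# Rough sample-and-hold persistence-blur estimate (illustrative numbers only).
# An eye tracking a moving object smears each held frame across roughly:
#   blur_px ~ speed_px_per_s * persistence_s

def blur_px(speed_px_per_s: float, persistence_ms: float) -> float:
    """Approximate eye-tracking blur width in pixels."""
    return speed_px_per_s * (persistence_ms / 1000.0)

speed = 960.0  # assumed pan speed in pixels/second

for hz in (60, 120, 240, 1000):
    persistence_ms = 1000.0 / hz  # full-persistence sample-and-hold
    print(f"{hz:>4} Hz -> ~{blur_px(speed, persistence_ms):.1f} px of blur")
```

At this assumed speed, 60Hz sample-and-hold smears about 16 px while 1,000Hz gets under 1 px, which is roughly where CRT-like clarity kicks in.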

01011001

Member
No dude, you're wrong, but you're being stubborn about it. The average person is not too tech-savvy, and telling them that flat-panel TVs have a lot of motion blur is something they can easily understand. Please, you're being stubborn; just accept that you're wrong on this one.

I mean you don't know if that's so easy for the average person to understand either.

if you tell a non-tech savvy person that a TV has a lot of motion blur I bet many will then ask back what that means, just like people would if you told them it has persistence blur.

because what does the average joe know about motion blur or what that means in practice? Motion blur is not a term that anyone outside of tech savvy people and/or gamers or film nerds really ever think or talk about.

the only way you can rightfully claim that motion blur is easier to understand than persistence blur is if you tested it in the field at an electronics store: alternate between saying Motion Blur and Persistence Blur for every other customer, and then write down how often each one asked back what that means.

and then after like 100 customers, or maybe 1,000 if you wanna be more precise, you see which term was asked back for clarification more :)
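The store experiment described above is just a tally of clarification rates per term. A minimal sketch of the bookkeeping, with entirely made-up log entries standing in for real customers:

```python
# Sketch of tallying the hypothetical store experiment: alternate the term used
# per customer, record whether they asked for clarification, compare the rates.
# The log data here is invented purely to show the bookkeeping.

from collections import Counter

# (term_used, asked_for_clarification) -- hypothetical entries
log = [
    ("motion blur", True), ("persistence blur", True),
    ("motion blur", False), ("persistence blur", True),
    ("motion blur", True), ("persistence blur", False),
]

asked = Counter(term for term, confused in log if confused)
total = Counter(term for term, _ in log)

for term in total:
    rate = asked[term] / total[term]
    print(f"{term}: {asked[term]}/{total[term]} asked back ({rate:.0%})")
```

Whichever term has the lower ask-back rate would win the argument, at least for that store's customers.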
 

JohnnyFootball

GerAlt-Right. Ciriously.
All these 1,000, 10,000, a-million-or-whatever refresh rates don't mean much.

In order to achieve clarity in these displays, the content must also run at such high frame rates.

I do have a 240Hz monitor, and when I feed it 240fps content the clarity is improved (better than 60Hz, not as good as CRT). But if the content is the usual 60fps, which is what you get from most games or when you emulate console games, there is no difference; you're still getting a very blurred moving image. The refresh rate doesn't matter in this instance.

So there needs to be a hardware change, a technological change, for this to work. All the software tricks in the world will never be as good, because they need processing, and processing creates artifacts and input lag.

And all this just so we can get quality similar to what we had for decades before flat panels took over. Talk about a huge step back.
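The point about 60fps content on a 240Hz panel can be sketched with the same persistence logic: if each 60fps frame is simply repeated four times on a sample-and-hold panel, the image is still held on screen for a full 1/60 s, so the blur is governed by the content frame rate, not the panel refresh. The pan speed is an illustrative assumption.

```python
# Illustrative: on a sample-and-hold panel that repeats frames, the hold time
# (and therefore the eye-tracking blur) is set by the CONTENT frame rate,
# not the panel refresh rate.

def hold_time_ms(content_fps: float) -> float:
    """How long each unique content frame stays on screen."""
    return 1000.0 / content_fps

def blur_px(speed_px_per_s: float, content_fps: float) -> float:
    """Approximate eye-tracking blur width in pixels."""
    return speed_px_per_s * hold_time_ms(content_fps) / 1000.0

speed = 960.0  # assumed pan speed, px/s
for refresh, fps in ((240, 240), (240, 60), (60, 60)):
    print(f"{fps:>3} fps on a {refresh} Hz panel -> ~{blur_px(speed, fps):.0f} px blur")
```

Under this model, 60fps content blurs identically on a 60Hz and a 240Hz sample-and-hold panel, which matches the poster's observation.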



240Hz is not enough; I test it myself every day against my CRT TV and monitor.

BFI is also pretty bad because it mangles the colors and brightness.
Bullshit
 

JRW

Member
ya, whenever I play games on my 2008 Kuro Plasma I'm reminded of how far behind LCDs still are in motion clarity, even when comparing the same games on my 27" Dell 144Hz G-Sync PC monitor. Last game was God of War PC: 60fps/60Hz on the Plasma looks better than 144fps/144Hz on the LCD.
 

BadBurger

Gold Member
here are the specs for the Sony Trinitron GDM-FW900


That's nice as a kind of theoretical, I guess, but who back in 2014 or so would be spending $2,000 on archaic tech unless they needed the best CAD experience out there? And how would it benefit modern media, in any form? Slightly less motion blur in some games? And that's all before we even get into the nitty-gritty of a TV that weighed more than a healthy teenager in 2014 and lacked all of the (even then) modern attributes and technologies.

Not a convincing argument to me.
 

Type_Raver

Member
Using a console on CRT monitor represent!

I miss my 21" Dell Trinitron CRT, but I recently acquired a free 19" Hyundai CRT (shadow mask); it works quite well, and is quite bright too!

I've got an old 21" Apple CRT, which looks quite nice and works well on PC, but it lacks geometry adjustment buttons and isn't good for consoles.

Always on the lookout for a 19" or 21" early-2000s model Sony, Dell, Sun, or Mitsubishi.

 