
4K to be called "Ultra High-Definition"

Status
Not open for further replies.
I still use an SD TV. I've seen and used HD TVs. I can tell the difference. I don't care. High Def doesn't make what I'm watching actually BE better, just look better. I'm not going to get an HD TV until the one I have dies, and considering it is only 10 years old or so, I don't see it dying for quite a while. The only issues I've had are some video games where the text is either cut off or really small. I'll watch blu ray rips on my computer monitor (which actually IS a small HD television), and I'll watch those same rips on my SD TV through my Xbox, and they look pretty damn good on my SD TV, too. I don't need a movie screen in my house.

I watch 15-20 year old VHS tapes on an SD TV when I visit my friend and her daughters. Yes, Aladdin 3 looks a bit mushy after 20 years and on tape on an old TV. But it is still completely watchable. The movie itself isn't going to be any better just because Genie's color is extra sharp. And that theory is why SD TVs are still in the majority in America. "But it looks SO MUCH BETTER" "Eh, it is fine".

If regular HD TVs still aren't dominant over SD, and 3D TVs are even less of a percentage of market share, movie theater quality TVs probably aren't going to do very well. How expensive will a 4k TV be?
 
These companies are literally bankrupt of ideas on how to push TVs out the door.
This isn't the problem. Their struggle lies in the extremely harsh competition and the race to the bottom. Prices drop way too fast, and it's hard to sell TVs at a higher price point unless you can push an obvious advantage. These companies think 4K is the next big thing they can sell with a luxury tax.

Sigh, I wish they would focus more on other aspects (power consumption, image quality, latency, build quality) instead of resolution.
 
It's really worth it for games. 720p is a considerable improvement over 480p.
Burnout Paradise was so difficult to play on an SDTV since it was harder to make out the oncoming road and traffic.
 
I wonder if the people who can tell/can't tell are partly separated by how often and how vigorously they use computers.

If I set Windows to 640x480 on my 24" monitor, it won't look good from any distance, at least until you're far enough away that your eyes stop bleeding. Even 1920x1200 on my 24" feels too low at this point. Especially when I look from my S3 to the monitor.

If anyone has any doubts, one of the most obvious examples is football. The difference between football on an HD channel and an SD channel is impossible not to notice.

I don't think you can separate them this way.
I like high resolution when using computers because I sit very close to the screen. 2560x1440 is great and I won't mind 4K for monitors.

TV is a different matter, I'm at least 3 m away, most of the time somewhere between 3 and 4.5 m. In this case I don't even mind SD (though I surely prefer HD).
 
What a poor choice for a piece of technology. Calling it Ultra implies it.... Oh forget it.

Ultra 64 was a cool name, though it was sensible to change it :)
 
Eh, it's cool. Reminds me of VHF (Very High Frequency) and UHF (Ultra High Frequency).

Marketing will come up with a new name for 12K in the future.

Edit: Sorry guys, but it had to be done.

[image]
 
I watch 15-20 year old VHS tapes on an SD TV when I visit my friend and her daughters. Yes, Aladdin 3 looks a bit mushy after 20 years and on tape on an old TV. But it is still completely watchable. The movie itself isn't going to be any better just because Genie's color is extra sharp.
"Tv isn't going to better only because it has color."

If you can tolerate VHS you must be half blind...
 
It's a pity that 90% of films are shot at no higher than 4K resolution.

Wha? If they're shot on film, they have a much higher resolution than 4K by default.

The movies that are going to suffer most from upgrades like this are movies made digitally in the early days of digital filmmaking. A 4K remaster of Star Wars Episode II is going to look awful unless they actually re-render the effects at a higher resolution.
 
"Tv isn't going to better only because it has color."

If you can tolerate VHS you must be half blind...

No. It is a bit shitty, but you get used to it again in a hurry. I don't have a VCR and I wouldn't watch tapes at my house, but when little kids want to watch some Disney movies they have on tape, I'm not going to tell them "No! You're blind! We can only watch this on blu ray!" Because it isn't that big of a deal.

I can see and tell the very noticeable difference. I just don't care. Shit doesn't bother me. I'm fine with DVD quality. My eyes work pretty well. VHS kids movies don't make me angry.
 
Anyone know why 4K isn't named 2K? And 8K isn't 4K?

After all...

480p = 480 pixels vertically
720p = 720 pixels vertically
1080p = 1080 pixels vertically

So this makes no sense:

4K = 3840 pixels horizontally
8K = 7680 pixels horizontally

This would:

2K = 2160 pixels vertically
4K = 4320 pixels vertically
 
If we get to 16K we can probably make out spermatozoa.

[image]



Anyone know why 4K isn't named 2K? And 8K isn't 4K?

After all...

480p = 480 pixels vertically
720p = 720 pixels vertically
1080p = 1080 pixels vertically

So this makes no sense:

4K = 3840 pixels horizontally
8K = 7680 pixels horizontally

This would:

2K = 2160 pixels vertically
4K = 4320 pixels vertically

When in doubt go with the bigger sounding name.
 
Anyone know why 4K isn't named 2K? And 8K isn't 4K?

After all...

480p = 480 pixels vertically
720p = 720 pixels vertically
1080p = 1080 pixels vertically

So this makes no sense:

4K = 3840 pixels horizontally
8K = 7680 pixels horizontally

This would:

2K = 2160 pixels vertically
4K = 4320 pixels vertically

The 4K terminology comes from the digital film camera industry, which followed the conventions of traditional film, where formats are measured horizontally across the picture frame (35 mm, 70 mm, etc.).
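The two conventions are easy to sketch in code. This is just an illustration of the naming math (the resolution figures are the standard consumer ones quoted above; the helper names are made up for the example):

```python
# "p" names use the vertical pixel count (TV convention);
# "K" names round the horizontal pixel count to the nearest
# thousand (digital cinema convention).
RESOLUTIONS = {
    "SD":     (640, 480),
    "HD":     (1280, 720),
    "FullHD": (1920, 1080),
    "UHD-1":  (3840, 2160),
    "UHD-2":  (7680, 4320),
}

def p_name(width, height):
    """TV-style name from the vertical resolution."""
    return f"{height}p"

def k_name(width, height):
    """Cinema-style name from the horizontal resolution."""
    return f"{round(width / 1000)}K"

for label, (w, h) in RESOLUTIONS.items():
    print(f"{label:>7}: {w}x{h} -> {p_name(w, h)} / {k_name(w, h)}")
```

Run it and you can see why the same panel is "2160p" under one convention and "4K" under the other.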
 
So when is this going to be consumer friendly? By consumer friendly I mean: when can I buy one for $1,000-1,500 and not $50k?

And even when they become affordable, when will we actually get content that supports it? I mean really supports it, not just upscaling. Most major cable/sat providers have only just hit 1080i broadcast, or 1080p (so they claim), and Blu-ray has just become the "norm". Are we gonna get Blu-ray 2.0? And when this 4K TV does become the new standard, what about 8K?

At least 5 years out, and that would only be for Blu-ray. Cable TV and gaming? Probably 10 years out, if ever.




Yeah I don't know what I was thinking. I'm running on very little sleep. ;_;

But why are you sitting 2.5ft from your TV? That's too close.
 
Well I guess if you're PC gaming at your desk, that may be OK. But that would be way too close for me gaming on my couch.

Why would it be any different? It's all a question of how much field of view your display takes.

On my couch I sit about 8 feet away from a 90" projected image.
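The "it's all field of view" point can be sanity-checked with basic trigonometry. A rough sketch (assuming 16:9 screens viewed straight-on; the function name is mine, not from any library):

```python
import math

def horizontal_fov(diagonal_inches, distance_feet, aspect=16 / 9):
    """Horizontal viewing angle in degrees for a flat screen
    viewed head-on. Assumes a 16:9 panel by default."""
    # Width from the diagonal: w = d * a / sqrt(a^2 + 1)
    width = diagonal_inches * aspect / math.sqrt(aspect**2 + 1)
    distance = distance_feet * 12  # feet -> inches
    return math.degrees(2 * math.atan(width / (2 * distance)))

# 90" projected image at 8 ft vs. a 24" monitor at 2.5 ft:
print(round(horizontal_fov(90, 8), 1))    # ~44 degrees
print(round(horizontal_fov(24, 2.5), 1))  # ~38 degrees
```

Both setups fill a broadly similar slice of your vision, which is exactly the point: distance alone isn't "too close" or "too far", the screen-size-to-distance ratio is what matters.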
 
Weird, I thought they had been using the term ultra-high-def for resolutions over 1080p for many years. I remember seeing it, maybe on Nvidia's website or something graphics-card/display related.

I guess it wasn't official and was just something they made up as a buzzword.
 
Why would it be any different? It's all a question of how much field of view your display takes.

On my couch I sit about 8 feet away from a 90" projected image.

I just would never game if I had to sit 2.5ft from the display. Too close for me. I sit about 6ft away from my new 55".
 
The technology is moving so fast they can't even think of original names in time for it to come out.

I think they should go the Starbucks route. My TV has Venti Vision for movies but it plays games in Grande!
 
One thing that hasn't been talked about much is that they are at least forcing QuadHD input as part of the requirements to get UHD certification. I'm glad they're at least going that far.

While 8K may be part of the UHD spec (much like HD included several resolutions), it is going to be quite some time before we see 8K sets. Even longer before most regions have any sort of content that supports it (possibly decades). With this move, consumers will not really need to research individual model specifications. Any UHD display they purchase will be guaranteed to actually accept input at its native resolution.

That's quite a turnaround from what happened with those early 1080p displays that could only accept 720p/1080i input. While I suppose down the road this could eventually become problematic with 8K, it's quite possible they'll avoid that situation. Even if they don't, it's much better than the 1080p/i issue. The problem there was interlacing, since many sets did not have proper 1080i de-interlacing. With the CEs, broadcasters, and media companies all skipping an interlaced half-step this time, we won't have similar problems. No motion artifacts. Plus scaling 4K (or 1080p) to 8K is trivial.
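"Trivial" here is because 3840x2160 -> 7680x4320 is an exact 2x integer ratio (and 1080p -> 8K is exactly 4x), so every source pixel maps cleanly to a square block with no interpolation needed. A toy sketch of the idea (frames modeled as nested lists for illustration only, not a real video pipeline):

```python
def upscale_2x(frame):
    """Nearest-neighbour 2x upscale by pixel replication.
    Exact for integer ratios like 4K -> 8K: each source pixel
    becomes a 2x2 block, with no interpolation artifacts."""
    out = []
    for row in frame:
        doubled = [px for px in row for _ in range(2)]  # duplicate columns
        out.append(doubled)
        out.append(list(doubled))  # duplicate the row
    return out

frame = [[1, 2],
         [3, 4]]
print(upscale_2x(frame))
# -> [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Compare that with de-interlacing, where the set has to reconstruct missing lines from temporally offset fields: integer scaling has one obviously correct answer, de-interlacing doesn't.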





No because "HD" is nothing but a marketing term anyways.
Yes and no. At least for displays, HD always meant support for one or more resolutions in the set of 720p, 1080i, and 1080p. Essentially it was derived from the ATSC standard.

The same applies here. The UHD CE designation is derived from the UHD broadcasting standard.




Anyone know why 4K isn't named 2K? And 8K isn't 4K?

After all...

480p = 480 pixels vertically
720p = 720 pixels vertically
1080p = 1080 pixels vertically

So this makes no sense:

4K = 3840 pixels horizontally
8K = 7680 pixels horizontally

This would:

2K = 2160 pixels vertically
4K = 4320 pixels vertically
It was borrowed from digital cinema terminology. Unlike CEs, which used vertical resolution in the HD era, cinemas have always used horizontal resolution.

As to why CEs switched to this (though now UHD will take over), I can guess. While 720p and 1080p aren't too hard to remember since it's only two resolutions, adding two more seemingly random numbers for J6P to remember is a bit more difficult in marketing terms. Not to mention, 2160p and 4320p simply don't roll off the tongue as nicely as 720p and 1080p due to the extra syllables. So they went with a simpler notation. And right or wrong, they felt that, to avoid confusion since some people know about film resolutions, they'd jump on pre-existing terminology.





[image]

We know where this is going.
perfect :)
 
[image: LG 4K TV review, front of screen]


It’s difficult not to be dazzled by 4K’s superior resolution on a screen of this size. Sure, we’ve seen 70-, 80- and 90-inch displays before, but those TVs were limited to 1080p resolution. The difference between 1080p and 4K above 70 inches is both substantial and easily discernible.

While 1080p HD images look very good on screen sizes 70-inches and larger, 4K Ultra HD images are better described as spectacular. The level of detail that can be seen is something nearly anyone can appreciate. The improvement trickles down to other performance points that casual viewers may not be cognizant of, such as shadow detail and fine-line detail.

Bottom line: Higher pixel density makes images on the screen look more like reality than TV. Scenes of the bright blue Mediterranean looked so convincing, it was hard not to want to jump right in. The stars in night skies looked much more like stars – tiny pinpoints of light – than a smattering of blurry white spots.

http://www.digitaltrends.com/tv-reviews/lg-84lm9600-review/

let the 4K revolution begin.
 
Sharp's 80-inch models look like shit for the most part. Once 4K becomes the norm for TVs that size, it'll be pretty awesome.
 
Next-gen consoles are going to be sub-30fps again, just for the sake of having "ULTRA HIGH DEF" as a bullet point.
 