
4K Video Gaming is already here (Toshiba Video Demo)

Krilekk

Banned
I remember when people scoffed at the idea of 1080p. Next gen we might see games running at 4K. Costs need to come down first, along with mainstream acceptance of the televisions. This is still a good 6-7 years off from mainstream, which is perfect to coincide with PS5/Xbox1080.

TVs are driven by content. Not a single channel is gonna switch to anything above 1080p in the next ten years because

a) they just bought all the equipment for 1080p (a lot of them only 1080i/720p) and spent millions on it
b) higher resolution means more bandwidth, which means a much higher cost to get your program to the viewers
c) it doesn't make any sense for developers to support it. A game that runs at 30 fps in 1080p (and I guess we'll see 720p as the standard next gen) will struggle to reach even half of that frame rate in 4K. And let's not get started on stuff like RAM requirements and fill rates. It. Just. Won't. Work! 4K is for professional picture and video editing, not for games and TV.
 
The 'no point' posts are always so funny ... because we all know deep down that this will be standard in a couple of years and we'll wonder how anyone lived with 1080p.
 

DieH@rd

Banned
You can see the FF Agni demo in the background. So I guess FFXV is going to be 4k.

There's one video from IFA where the reporter went to those 3 screens. One showed a 1080p image [Agni and other demos], the second an upscaled 1080p image, and the third native 4K.
 
TVs are driven by content. Not a single channel is gonna switch to anything above 1080p in the next ten years because

a) they just bought all the equipment for 1080p (a lot of them only 1080i/720p) and spent millions on it
b) higher resolution means more bandwidth, which means a much higher cost to get your program to the viewers
c) it doesn't make any sense for developers to support it. A game that runs at 30 fps in 1080p (and I guess we'll see 720p as the standard next gen) will struggle to reach even half of that frame rate in 4K. And let's not get started on stuff like RAM requirements and fill rates. It. Just. Won't. Work! 4K is for professional picture and video editing, not for games and TV.

Content is driven by TVs as well. It works both ways. Verizon is already looking into broadcasting stations in 4K, with ESPN also contributing 4K shows.

a) A lot of that equipment is capable of capturing and broadcasting resolutions above 1080. You don't think all that expensive equipment was only future-proofed up to 1080i, do you?

b) true

c) It will be slow at first, but it does work today. 2K works just fine for me for video games at frame rates way beyond 60fps. 4K isn't that far off.

Also, a lot of the shows you currently watch on TV were originally shot at 2K/4K, btw.
 

Corky

Nine out of ten orphans can't tell the difference.
I'd like to have a 4K monitor/TV.
I won't buy one until they're at the same price level as today's 1080p sets.
I'll never, ever understand the resistance against new and more advanced tech. It's expensive; you don't HAVE to buy it.

"I'll never need X GB of RAM"
"Who needs Y MB of HDD space?"
"4K? I can't even tell the difference"
"How could you possibly fill an entire DVD?!"
"Why would you even need a 1 Mb/s connection?"
 

Mandoric

Banned
Damn, 25k and they couldn't use a 300MHz HDMI chip? I know the 7970 has one, because 1080p 120Hz (or 60Hz 3D) over HDMI was one of their bragging points. 1080p/120 and 4K/60 should require the same bandwidth, so I can only assume Toshiba's still using an old gimped chip.

1080p/120 and 4K/30 are the same bandwidth. 4K is twice the width and twice the height, for a total of 4x the pixels.
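A quick sanity check of that pixel math (raw throughput only; this ignores HDMI blanking intervals and bit depth, which are the same on both sides of each comparison):

```python
# Raw pixel throughput: width x height x refresh rate.
def pixel_rate(width, height, fps):
    return width * height * fps

print(pixel_rate(1920, 1080, 120))  # 248,832,000 px/s
print(pixel_rate(3840, 2160, 30))   # 248,832,000 px/s -- same as 1080p/120
print(pixel_rate(3840, 2160, 60))   # 497,664,000 px/s -- double
```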

The Toshiba set in that video costs about as much in Japan as their same-size 1080p set did in 2005.
 

mrklaw

MrArseFace
I'd like to have a 4K monitor/TV.
I won't buy one until they're at the same price level as today's 1080p sets.
I'll never, ever understand the resistance against new and more advanced tech. It's expensive; you don't HAVE to buy it.

"I'll never need X GB of RAM"
"Who needs Y MB of HDD space?"
"4K? I can't even tell the difference"
"How could you possibly fill an entire DVD?!"
"Why would you even need a 1 Mb/s connection?"

I don't see anyone saying 'begone, this evil technology'. We're more discussing whether it is relevant.

Visual acuity isn't the same as 'faster, higher, stronger', though. You can always download faster (and things get bigger, so it's a constant race).

It's more like audio. Blu-ray has lossless audio with TrueHD or DTS-HD MA, so you don't need anything more (you can just spend your money on speakers to do it justice). There is presumably a similar point with video. It's a mix of a few things: resolution, colour depth to avoid banding, frame rate, and compression.

Resolution and colour depth are probably the most objective measures, and it should be possible to define what PPI is enough for 90% of consumers not to distinguish pixels. It's the retina argument, basically, and you have to factor in viewing angle (not distance, because that will vary with screen size).

Compression and frame rate might be more subjective. Lossless would be great but is impractical, so it's about what we perceive as good enough.
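To put rough numbers on that viewing-angle framing, here's a minimal sketch; the ~60 pixels-per-degree threshold for 20/20 acuity is a standard rule of thumb, not a figure from this thread, and the 47.9 PPI example corresponds to a 46" 1080p panel:

```python
import math

def pixels_per_degree(ppi, viewing_distance_in):
    """How many pixels one degree of visual field spans at a
    given viewing distance (distance in inches)."""
    inches_per_degree = 2 * viewing_distance_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

# 20/20 vision resolves roughly 60 pixels per degree (one arcminute per
# pixel); beyond that, individual pixels are no longer distinguishable.
print(pixels_per_degree(47.9, 9 * 12))  # 46" 1080p panel at 9 ft: ~90 ppd
```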
 

sol_bad

Member
Anyone who thinks you need a massive screen as big as a wall to appreciate 4K resolution is a bit silly.
I have a mate who has a 27 inch iMac with a screen resolution of 2560x1440. It looks way better, with way more detail, than my 54 inch 1080p screen.
 

gofreak

GAF's Bob Woodward
I don't see anyone saying 'begone, this evil technology'. We're more discussing whether it is relevant.

Visual acuity isn't the same as 'faster, higher, stronger', though. You can always download faster (and things get bigger, so it's a constant race).

It's more like audio. Blu-ray has lossless audio with TrueHD or DTS-HD MA, so you don't need anything more (you can just spend your money on speakers to do it justice). There is presumably a similar point with video. It's a mix of a few things: resolution, colour depth to avoid banding, frame rate, and compression.

Resolution and colour depth are probably the most objective measures, and it should be possible to define what PPI is enough for 90% of consumers not to distinguish pixels. It's the retina argument, basically, and you have to factor in viewing angle (not distance, because that will vary with screen size).

Compression and frame rate might be more subjective. Lossless would be great but is impractical, so it's about what we perceive as good enough.

I've always wondered, though, if it's a bit more complicated with rendered graphics vs video?

I mean, I know on my monitor that when I ratchet the resolution up as far as it can go, the improvements to IQ continue to be noticeable... to temporal artifacts and aliasing (both edge and texture/shader aliasing) etc. I'm not sure I've reached a point on my PC where it stops making a difference.

Now I haven't tried 4K, but... I just wonder if there's a difference here, whether we can apply the same measures used to talk about the relative returns of resolution in video (movies) to computer graphics. Like how people used to wonder why at 640x480 you didn't see 'jaggies' in video while you did in computer graphics... the sampling of light against film is a very different process than sampling in rendering.
 

KageMaru

Member
Lol without even looking, I knew this thread was made by onQ123. The man is on a mission to make 4k seem relevant. =P
 

onQ123

Member

How can people not want that?



Lol without even looking, I knew this thread was made by onQ123. The man is on a mission to make 4k seem relevant. =P

No, I just report & bring info about 4K & other gaming-related stuff.


 

Blasty

Member
I'd like to have a 4K monitor/TV.
I won't buy one until they're at the same price level as today's 1080p sets.
I'll never, ever understand the resistance against new and more advanced tech. It's expensive; you don't HAVE to buy it.

"I'll never need X GB of RAM"
"Who needs Y MB of HDD space?"
"4K? I can't even tell the difference"
"How could you possibly fill an entire DVD?!"
"Why would you even need a 1 Mb/s connection?"

4K is completely different from all the other examples. Some people can't even see the difference between their CRTs and HDTVs. Why would you expect those people not to say they don't see the point?
 
Some have quickly written that "the human eye can't tell the difference between 1080p and 4K" as if it's so obvious you would have to be an idiot not to agree. The reality is that it's too complicated a scenario to just throw out there as fact. It depends entirely on your viewing distance in relation to the size of the screen. There are really good arguments against upgrading to 4K, but your average tech-savvy NeoGAF poster probably could easily tell the difference between 4K and 1080p on a 55" TV.

But that also means 4K will be relegated to a very hardcore niche market (read: LaserDisc/SACD), as the average consumer won't buy a TV large enough to popularize the format. Not to mention HD penetration was hard enough - hell, both my and my wife's parents regularly watch the non-HD ESPN/CBS/SPEED signal on their TVs as it is. The infrastructure costs required to get such a signal to a home through cable/satellite also mean we won't see anything but a small Blu-ray selection in 4K for a long time.
 
I'm against 4K. Why? Because the majority of these demonstrations/TVs are actually sub-4K.
Call me when they actually do 4K, a la 4096x2160.

Won't happen, because 4096x2160 isn't a 16:9 ratio. Sure, it's fun to put an arbitrary limit on what you think is acceptable, but to avoid black bars and stupid scaling issues, consumer 4K is "sub-4K" for a reason.

Isn't stinkles Frankie, formerly with Bungie and now with 343? Yeah, it was a joke.

Oh? That makes the pixel "halo guy" picture that onQ posted even funnier.
 

-COOLIO-

The Everyman
Look at your TV! Now back to real life! Now back to your TV! Now back to real life! Sadly, your TV is not as sharp as real life, but it could be if it were 4K. I'm on a horse.
 

padlock

Member
Your brain can resolve the resolution pretty easily at 46-60"+ sizes. Don't get res mixed up with PPI.

Only if you're sitting 5 feet away from them.

A person with 20/20 vision can resolve around 300 pixels per inch divided by the viewing distance in feet.

A 46 inch 1080p set has 47.88 pixels per inch. That means you need to sit no more than 6.26 feet away to be able to resolve all the detail. In most homes people sit much farther back than that, so 4K is only useful for very large screen sizes.
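That arithmetic as a short sketch (the helper names are just illustrative; the rule of thumb is the same 300-PPI-at-one-foot figure used above):

```python
import math

def ppi(diagonal_in, width_px=1920, height_px=1080):
    """Pixels per inch of a panel, from its diagonal and resolution."""
    return math.hypot(width_px, height_px) / diagonal_in

def max_resolving_distance_ft(panel_ppi, acuity_ppi_at_1ft=300):
    """Farthest distance (in feet) at which 20/20 vision can still
    resolve individual pixels."""
    return acuity_ppi_at_1ft / panel_ppi

p = ppi(46)
print(f"{p:.2f} PPI, resolvable within {max_resolving_distance_ft(p):.2f} ft")
# -> 47.89 PPI, resolvable within 6.26 ft (the figures above)
```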
 
The 'no point' posts are always so funny ... because we all know deep down that this will be standard in a couple of years and we'll wonder how anyone lived with 1080p.
1080p isn't even standard yet and you expect 4K to be in a couple of years? The earliest I would say is at least a little over a decade. But people are right that 4K screens for monitors and tablets are great.
 

Mandoric

Banned
Only if you're sitting 5 feet away from them.

A person with 20/20 vision can resolve around 300 pixels per inch divided by the viewing distance in feet.

A 46 inch 1080p set has 47.88 pixels per inch. That means you need to sit no more than 6.26 feet away to be able to resolve all the detail. In most homes people sit much farther back than that, so 4K is only useful for very large screen sizes.

Reminder: those "recommended distance" charts (which actual testing with retina-class displays, now that we can make them, is quickly revealing to be wrong) do not mean "sit one inch further away and 4K is useless". They mean "sit one inch closer than the 1080p distance and a 4K screen will give you a visible improvement".
 

scitek

Member
I think this is a fair question: which would look better, 3840x2160 at native res on a 4K TV, or 3840x2160 downsampled to 1080p?
 

Mihos

Gold Member
1080p isn't even standard yet and you expect 4K to be in a couple of years? The earliest I would say is at least a little over a decade. But people are right that 4K screens for monitors and tablets are great.

It would be 10 years from whenever they first start introducing them, so they might as well start now.
 
OK, if the fastest video card on the market is struggling to hit 30 fps at that res, how in the hell is the PS4 gonna manage it? I am not looking forward to sub-HD games being upscaled to 4K.

Keep playing on an HD TV: you save money by not buying a 4K TV and you get 60fps.
KO_ YOU WIN
 
Going up in resolution ain't free. Next gen we will probably see many 1080p games, but there will always be developers choosing 720p in order to have more complex graphics.
 
Anyone who has ever used a PC for gaming knows how much of a difference resolution/PPI makes. Can't wait, personally. My GNex needs big-screen company.
 

mrklaw

MrArseFace
I've always wondered, though, if it's a bit more complicated with rendered graphics vs video?

I mean, I know on my monitor that when I ratchet the resolution up as far as it can go, the improvements to IQ continue to be noticeable... to temporal artifacts and aliasing (both edge and texture/shader aliasing) etc. I'm not sure I've reached a point on my PC where it stops making a difference.

Now I haven't tried 4K, but... I just wonder if there's a difference here, whether we can apply the same measures used to talk about the relative returns of resolution in video (movies) to computer graphics. Like how people used to wonder why at 640x480 you didn't see 'jaggies' in video while you did in computer graphics... the sampling of light against film is a very different process than sampling in rendering.

Oh, definitely. But I think that's perhaps a function of realtime computer graphics being a compromise in many ways - especially AA etc.

So if you have edge AA, for instance - it makes sense that at higher resolutions those edges will be thinner and any artifacts less noticeable.

How about if you spent more effort on IQ but at a lower destination resolution, vs a higher resolution with less IQ? E.g. 1080p with high-quality AA vs 4K with no AA. If it's just downsampling, it could even be the same source image.
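As a sketch of what "just downsampling" would mean here (a plain 2x2 box filter; an actual driver or engine might use a better resampling kernel):

```python
import numpy as np

def downsample_2x(frame):
    """Average each 2x2 block: a 3840x2160 render becomes a 1920x1080
    image, i.e. ordered-grid 4x supersampling."""
    h, w, c = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

frame_4k = np.random.rand(2160, 3840, 3)  # stand-in for a rendered 4K frame
frame_1080 = downsample_2x(frame_4k)
print(frame_1080.shape)  # (1080, 1920, 3)
```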

I'm not a huge fan of the crispy look that PC games can have sometimes. I don't want vaseline filters, but I'd like something more.
 

grendelrt

Member
So funny how these threads roll around every time tech updates. My favorite was Blu-ray. People yelling, "I can't see a difference between an upscaled DVD and a Blu-ray, there is no point to Blu-ray!!!" I have a 64" plasma; I will wait until 4K and OLED (or some new tech) are both in harmony before I upgrade.
 

majik13

Member
I understand films can capture insane amounts of detail and exceed 1080p resolution, but I have little desire to see films at a higher resolution than 1080p. Mind you, I'm in the pro-4K camp, as I have the luxury of sitting close enough, but I predict that filmmakers aren't going to utilize 4K that much - few films made up to this point have even taken full advantage of 1080p.

Computer GUIs, games, and photography are what will benefit most from 4K.

Eh? Not sure what exactly you mean by "taken advantage", but pretty much every film shown in theaters is rendered and packaged at 2K, with IMAX being the exception at 4K. 2K is a bit bigger than 1080p.

I am in the camp that says 4K is unnecessary for 95%+ of the population for home TV viewing. At these numbers we are hitting significant diminishing returns for the price.

The only benefit I see is higher-res stereo 3D. And probably PC monitors, if you sit close enough and the screen is big enough.
 

majik13

Member
So funny how these threads roll around every time tech updates. My favorite was Blu-ray. People yelling, "I can't see a difference between an upscaled DVD and a Blu-ray, there is no point to Blu-ray!!!" I have a 64" plasma; I will wait until 4K and OLED (or some new tech) are both in harmony before I upgrade.

People keep bringing up arguments like this, saying that back in the day everyone was saying 1080p was unnecessary or, like above, Blu-ray.

But honestly, I don't remember this. I don't remember people screaming that the SD standards of the time were good enough and that no one would want or need HD. I'm sure a few people said it. But I remember quite the opposite.

Eventually you hit a ceiling with this stuff in the mass market, and all it ends up being is a bigger number to most consumers.

I am certain that 4K and such will eventually come, but I think consumers won't really see a difference or care. It won't be much of a selling point.
 

grendelrt

Member
People keep bringing up arguments like this, saying that back in the day everyone was saying 1080p was unnecessary or, like above, Blu-ray.

But honestly, I don't remember this. I don't remember people screaming that the SD standards of the time were good enough and that no one would want or need HD. I'm sure a few people said it. But I remember quite the opposite.

Eventually you hit a ceiling with this stuff in the mass market, and all it ends up being is a bigger number to most consumers.

I am certain that 4K and such will eventually come, but I think consumers won't really see a difference or care. It won't be much of a selling point.

It definitely happened, and if you ever go on AV forums like avsforum.com, it was all over the place. Not to mention the awesome reporting from the mass media saying it as well.

Here ya go, http://www.neogaf.com/forum/showthread.php?t=340590
 

DR2K

Banned
Games have been able to hit over 1080p for years, don't tie in just consoles to this discussion.

We're talking about consoles that need to be affordable to the masses, not unrealistic gaming rigs that easily exceed that.
 
I don't see the point of 4K.


Ultimately the point of 4K, as with theater projectors, is to display extremely large images while maintaining a very high PPI.

4K may not seem like a big deal now, but manufacturers are just adjusting to the typical screen size people have at home. Back in the early days of HD, 32" and 42" sets were common, and 720p was just fine for those, but as 50" and 60" TVs became more popular, 1080p started to make more sense.

Soon 85" displays will become more affordable, and people who get an 85" display can do so without sacrificing the PPI they'd get from a 50" 1080p set.

Remember, as screens get bigger and resolutions stay the same, your PPI suffers. In the future, when I upgrade from my current 64" TV, likely to an 85" or 100"+ set (I just bought my 64"), I will definitely want a 4K set so that I don't lose any detail ;)
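The PPI arithmetic behind that claim, as a quick sketch: doubling the pixel count on each axis while doubling the diagonal leaves PPI unchanged, so a 100" 4K set matches a 50" 1080p set exactly.

```python
import math

def ppi(diagonal_in, width_px, height_px):
    """Pixels per inch from a panel's diagonal and native resolution."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'50" 1080p: {ppi(50, 1920, 1080):.1f} PPI')   # ~44.1
print(f'85" 4K:    {ppi(85, 3840, 2160):.1f} PPI')   # ~51.8
print(f'100" 4K:   {ppi(100, 3840, 2160):.1f} PPI')  # ~44.1, same as 50" 1080p
```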
 

Zaptruder

Banned
So funny how these threads roll around every time tech updates. My favorite was Blu-ray. People yelling, "I can't see a difference between an upscaled DVD and a Blu-ray, there is no point to Blu-ray!!!" I have a 64" plasma; I will wait until 4K and OLED (or some new tech) are both in harmony before I upgrade.

Extra resolution will be perceivable up until around the 8K mark.

But that relies on the image taking up more and more of your field of view.

If you sit at a normal distance from a normal-size screen, you're not going to see much benefit, if any at all (especially if your visual acuity is less than 20/20 - which most people's* is).

*given that 20/20 is the average optimum, and that people are generally unable to notice subtle visual acuity deterioration until it becomes severe enough to impact their day-to-day activities - by which time they're around the 20/60 to 20/100 mark.
 
Unfortunately, I am worried about older consoles the further we go along. When I try to show my grandson Super Mario Bros. on a real NES in 50 years, it will look terrible on the 12K TV.
 

gofreak

GAF's Bob Woodward
Oh, definitely. But I think that's perhaps a function of realtime computer graphics being a compromise in many ways - especially AA etc.

So if you have edge AA, for instance - it makes sense that at higher resolutions those edges will be thinner and any artifacts less noticeable.

How about if you spent more effort on IQ but at a lower destination resolution, vs a higher resolution with less IQ? E.g. 1080p with high-quality AA vs 4K with no AA. If it's just downsampling, it could even be the same source image.

Sure, that speaks to the heart of the debate I think we've seen elsewhere about full resolution increases vs lower res + AA. There was some talk recently about how Pixar actually renders at 1080p for some movies - but adds a shit-tonne of AA and other work to the frame. AA is ultimately an attempt to deal with finite resolution. It may well be cheaper and more economical than a full-blown increase in res.

However, my point is, when I see some of these visual acuity charts and all of that... I'm increasingly questioning their relevance to games and other rendered graphics vs film (which is always effectively 'supersampled' from the infinite signal of light). I think the benefit of resolution sits on a different curve for rendered content - or at least a more complex one that depends on lots of other variables, like whether there's any sampling going on beyond one sample per pixel. I remember the same kind of charts being used by some back in '05/'06 to dismiss 1080p as unnecessary... and while there may have been borderline truth to that for some screen-size/distance ratios when it comes to movies, it seems crazy now if we're talking about graphics. Then again, 720p with a shit-tonne of supersampling etc. would look really good too, so... :)
 

mrklaw

MrArseFace
Ultimately the point of 4K, as with theater projectors, is to display extremely large images while maintaining a very high PPI.

4K may not seem like a big deal now, but manufacturers are just adjusting to the typical screen size people have at home. Back in the early days of HD, 32" and 42" sets were common, and 720p was just fine for those, but as 50" and 60" TVs became more popular, 1080p started to make more sense.

Soon 85" displays will become more affordable, and people who get an 85" display can do so without sacrificing the PPI they'd get from a 50" 1080p set.

Remember, as screens get bigger and resolutions stay the same, your PPI suffers. In the future, when I upgrade from my current 64" TV, likely to an 85" or 100"+ set (I just bought my 64"), I will definitely want a 4K set so that I don't lose any detail ;)


The increase from a standard 32" TV to a 40" or 46" TV just requires you to leave a bit more space in the corner of your room. An 84" TV requires you to organise your life and room around it. For that kind of TV to become even remotely common would require a change in how we live. The increase over time doesn't just keep going ad infinitum.
 