
4K Video Gaming is already here (Toshiba Video Demo )

majik13

Member
Click on the 4K Wall-E image and open it at full zoom with only Wall-E in the picture; that's what you will be seeing on a 46" 4K TV if you're just looking at 1/4 of the screen (that's if you're doing this on a 23" 1080p monitor).


You can step back 8 feet and you will still notice more detail than you would if you were somehow able to just make your 23" monitor 46" with the same pixels.

I am at work and don't have a long enough mouse cord to do this, and I'm not sure how accurate it would be since the images are not identical. There are also other variables involved. Do we know the exact source of these images? Is the 1080p one from a compressed screengrab?

Would 4K video movie stills be the same as 4K poster image stills? There will most likely be some sort of compression on 4K video.

Is a monitor a comparable device to a TV screen? (I really don't know; someone probably does.)
 

onQ123

Member
"no it wouldn't be pointless because if you have a 23" 1080P monitor you can just look at the 4K image at full size & what will be on your screen will be like looking at 1/4 of a 4K 46" screen."


I think we may be talking about different things. On a computer there is fitting to the screen, which is how most images are viewed by default, and then, if the image is larger than the screen, you can zoom in to 100% or a 1:1 pixel ratio.

Not sure how you can say there is no zooming in and then say you fully zoom in on an image, in the same sentence.

So when someone says full zoom, or filling the screen, or "looking at 1/4" of an image on a screen, I considered that zooming in, as opposed to the full image fitting inside the screen.

Is this not what we are talking about?


That was just to help you get a grasp of what I was saying. It's not really zoomed; it's 1:1, and it gives you an idea of what you will be seeing on a 4K TV that's 4X the size of a 1080p monitor.

You wouldn't have to be zoomed in on the 4K TV that's 4X the size of the 1080p monitor, because the other 3/4 of the picture will be on the screen along with the 1/4 that you are seeing on your 1080p 23" monitor.
 

majik13

Member
That was just to help you get a grasp of what I was saying. It's not really zoomed; it's 1:1, and it gives you an idea of what you will be seeing on a 4K TV that's 4X the size of a 1080p monitor.

You wouldn't have to be zoomed in on the 4K TV that's 4X the size of the 1080p monitor, because the other 3/4 of the picture will be on the screen along with the 1/4 that you are seeing on your 1080p 23" monitor.

Yeah, no, I got that, and always grasped that. That is what I said, or meant, in my original post: you would zoom in on the 4K image until it was 1:1.

There is no point in zooming in past 100%. Not sure why anyone would think that is what I meant.

My original comment was just saying that it's pointless if that is all people were going to do. Of course you will see more detail on a computer monitor.

But doing the sit-back test, and upscaling an identical 2K image to fit the same size as the 4K one, I suppose could be comparable to viewing something on a 4K TV, unless we are missing something.
 

whitehawk

Banned
Good luck convincing all those people who finally upgraded their tube TVs and can't tell the difference between 480p and 1080p to upgrade again.
It's not going to be a fast progression like with HDTV; no one said it would.

HDTVs had more benefits than just a larger resolution.

- Bigger screens
- Lighter
- Thinner

A lot of people bought them for those reasons alone and still watch SD programming on them.
 
Take the Wall-E screen for example.

http://0.tqn.com/d/kidstvmovies/1/0/I/H/walle008.jpg


On a 4K 46" screen I would be able to read "solar charge level" from that screen cap from about 3 feet away while the whole screen is in view.


I have 20/10 vision, so this will be different for other people, but that's the kind of detail 4K will bring.

I've got 30/40, and I can tell the difference. This whole "4K isn't needed" thing is a running joke. It has to be.
 

onQ123

Member
Yeah, no, I got that, and always grasped that. That is what I said, or meant, in my original post: you would zoom in on the 4K image until it was 1:1.

There is no point in zooming in past 100%. Not sure why anyone would think that is what I meant.

My original comment was just saying that it's pointless if that is all people were going to do. Of course you will see more detail on a computer monitor.

But doing the sit-back test, and upscaling an identical 2K image to fit the same size as the 4K one, I suppose could be comparable to viewing something on a 4K TV, unless we are missing something.

That's not the same, because you can't take a 2K image with 2MP, just upscale it, and expect it to create 6 more MP and look the same as a 4K image with 8MP.



I'm telling you the best way to get an idea of what you would see on a 4K TV, because a 4K/8MP image at 1:1 on a 1080p monitor is just like seeing 1/4 of a 4K TV 4X the size of that monitor, but instead of only being able to see that one corner, you will be able to see the whole image with that same detail.
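
A quick back-of-the-envelope check of why that 1:1 trick works; this is just a sketch, assuming 16:9 panels and the 3840x2160 flavour of 4K (the 23" and 46" sizes are the ones already being thrown around in this thread):

```python
import math

def pixel_pitch_mm(diagonal_in, h_res, v_res):
    """Physical pixel size in millimetres for a 16:9 panel
    of the given diagonal and resolution."""
    aspect = h_res / v_res
    width_in = diagonal_in * aspect / math.sqrt(aspect ** 2 + 1)
    return width_in * 25.4 / h_res

print(pixel_pitch_mm(23, 1920, 1080))  # ~0.265 mm on a 23" 1080p monitor
print(pixel_pitch_mm(46, 3840, 2160))  # ~0.265 mm on a 46" 4K TV -- same pixel size
```

Same pixel pitch on both, which is why a 1:1 view of a 4K frame on the 23" monitor shows exactly the detail one quarter of a 46" 4K screen would.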
 

majik13

Member
That's not the same, because you can't take a 2K image with 2MP, just upscale it, and expect it to create 6 more MP and look the same as a 4K image with 8MP.



I'm telling you the best way to get an idea of what you would see on a 4K TV, because a 4K/8MP image at 1:1 on a 1080p monitor is just like seeing 1/4 of a 4K TV 4X the size of that monitor, but instead of only being able to see that one corner, you will be able to see the whole image with that same detail.

Eh? Yeah, I get it; we are saying the same thing. I am just saying that to be able to switch back and forth between the 4K image and a 2K image, you will have to scale up the 2K image to the same size as the 4K one, so that when you switch back and forth they line up exactly. That way it is easier to tell the difference, is all, since they will be 1:1 in size and placement.
 

Sky Chief

Member
Seems a bit confusing, as they're pushing it for people who want really high-end products and have the best PC setups... yet only 30Hz... It could use a few more years in development before a major consumer release... perhaps to coincide with the PS5.

It's not like Blu-rays would make the most of it, right? Or?

When the first 1080p TVs came out they could not even take 1080p signals, because the HDMI standard at that time was maxed out at 720p/1080i and Blu-ray didn't even exist. This was in mid-2005, less than a year and a half before the PS3 launched.

The first 1080p TV released in the USA was the Sony Qualia 006, a 70" SXRD model, and it cost $13,000.

Less than 3 years later I bought a 60" Sony SXRD 1080p TV that also supported 120Hz and accepted native 1080p input for $1,500, a few months before buying an MGS4 PS3 bundle.

I have been watching 1080p Blu-rays and even playing (a few) 1080p games on that TV ever since.

To say that 4K needs a few more years of development is just crazy. People said exactly the same thing about 1080p.
 

majik13

Member
When the first 1080p TVs came out they could not even take 1080p signals, because the HDMI standard at that time was maxed out at 720p/1080i and Blu-ray didn't even exist. This was in mid-2005, less than a year and a half before the PS3 launched.

The first 1080p TV released in the USA was the Sony Qualia 006, a 70" SXRD model, and it cost $13,000.

Less than 3 years later I bought a 60" Sony SXRD 1080p TV that also supported 120Hz and accepted native 1080p input for $1,500, a few months before buying an MGS4 PS3 bundle.

I have been watching 1080p Blu-rays and even playing (a few) 1080p games on that TV ever since.

To say that 4K needs a few more years of development is just crazy. People said exactly the same thing about 1080p.

You also have to realize that movies have been made for decades at pretty much 2K resolution, before there were any real consumer HDTVs. So for HDTV adoption there was tons and tons of usable content driving the technology: pretty much anything shot on 35mm. The only real 4K content is IMAX movies. So the benefits of adoption are not nearly as great for 4K, and it will certainly be slower than HDTV adoption.
 

SapientWolf

Trucker Sexologist
Here's some equivalent backup to your "science".


http://www.clarkvision.com/articles/eye-resolution.html

Enjoy yourself.
That was dealing with what it would take to reconstruct an image with the same theoretical resolution that a human with maximum visual acuity might see. But the question here is at what point two pixels become indistinguishable, causing the benefits from an increase in resolution to be wasted. That threshold is around 1 arcminute, and pixels subtending less than that can't be resolved by those with 20/20 vision.

Which is a different question from what the optimal capture resolution is for a photo being shown on a particular display. A better source can improve the image quality even when adding more pixels to the display does not, due to the nature of digital sampling.

As it stands now, non-PC gamers wouldn't get any benefit from this because consoles won't drive 4k. PC gamers with very large monitors could derive some benefit, but it's not proportional to the decrease in performance. People with viewing angles under a certain threshold won't see any benefit at all, and that includes a large majority of living room setups.
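
To put a number on that threshold, here's a minimal sketch of where the 1 arcminute rule lands for the screen size being discussed (the 46" diagonal and 16:9 aspect are assumptions pulled from elsewhere in the thread):

```python
import math

ARCMIN = math.radians(1 / 60)  # one arcminute, in radians

def max_useful_distance_ft(diagonal_in, h_res, v_res):
    """Distance beyond which a 20/20 eye can no longer resolve single pixels
    (the 1-arcminute rule of thumb), for a 16:9 panel."""
    aspect = h_res / v_res
    width_in = diagonal_in * aspect / math.sqrt(aspect ** 2 + 1)
    pixel_in = width_in / h_res
    return pixel_in / math.tan(ARCMIN) / 12

print(max_useful_distance_ft(46, 1920, 1080))  # ~5.9 ft for a 46" 1080p set
print(max_useful_distance_ft(46, 3840, 2160))  # ~3.0 ft for a 46" 4K set
```

At the 9-foot average couch distance cited later in the thread, both figures are already exceeded, which is the basis of the "most living room setups won't see it" argument.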
 

onQ123

Member
Ok, here you go.
http://i.imgur.com/n6H90.jpg
This is in true 4K 4096x2160, not 4KHD.


Now take this image, set it to its full size on a 23" 1080p monitor or a 15" 720p laptop, place his face in the view of the screen, and step back to whatever distance you would watch a 46" or so 4K TV from; that's the same detail you would be getting on his face, with the rest of the image on the screen.
 

majik13

Member
Now take this image, set it to its full size on a 23" 1080p monitor or a 15" 720p laptop, place his face in the view of the screen, and step back to whatever distance you would watch a 46" or so 4K TV from; that's the same detail you would be getting on his face, with the rest of the image on the screen.

Someone should take these same images, however it is done, make them 2K, and then reformat/upscale them back to 4K size.

Also of note, the average viewing distance from a TV is 9 feet, according to CNET:
Why 4K TVs are stupid (still)
 

majik13

Member
Well, I did something like that, but nobody seems to give a fuck :p

Yeah, I did notice it when you posted it; it took me a while to figure out it was an animated GIF of the difference. I don't think most people noticed that, though. Maybe a sign that when you are not looking for it, the resolution detail increase doesn't pop out.

I think what would be better is a much larger image that can fill a whole screen, though.

Edit: a GIF is a good way to do it, as long as converting it or uploading it isn't destroying the original IQ.

Also, from roughly 5 feet or so (I don't have any measuring devices), I can't see any difference in that image.
 
You also have to realize that movies have been made for decades at pretty much 2K resolution, before there were any real consumer HDTVs. So for HDTV adoption there was tons and tons of usable content driving the technology: pretty much anything shot on 35mm. The only real 4K content is IMAX movies. So the benefits of adoption are not nearly as great for 4K, and it will certainly be slower than HDTV adoption.

There have been many movies shot in 4K since, and TV shows as well. Not to mention that digital transfers of film can go to 4K and beyond. There is plenty of content available.
 

majik13

Member
Whatever you did to downsample that image destroyed the IQ, because I could see the difference even when I dropped down to 640x480 on my CRT.

Probably because it is only a portion of a larger image, I am assuming. So when looking at it on a TV, it is essentially scaled up; it's not the full image. So that would have an effect on the original IQ.
 

Loofy

Member
Here's the picture resized 50% in MS Paint, then resized 200%.
I'm thinking this would be the equivalent of a really crappy upscale. Actual hardware upscalers should work much better.
http://i.minus.com/ilxWJ1ZmTVTRe.png

Even then it isn't much different. People will appreciate the pixel density of 4K displays, but a 4K video format doesn't necessarily mean (a lot) more visual information compared to 1080p.
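
For anyone who wants to reproduce that test without MS Paint, here's a minimal sketch using Pillow; the filenames are placeholders, and nearest-neighbour vs. bicubic is just my stand-in for "really crappy" vs. "decent" upscaling:

```python
from PIL import Image

src = Image.open("walle_4k.png")  # placeholder name for the original 4K frame
w, h = src.size

# Throw away 3/4 of the pixels, then blow the result back up to the original size.
half = src.resize((w // 2, h // 2), Image.BICUBIC)
crappy = half.resize((w, h), Image.NEAREST)  # naive, "MS Paint"-style upscale
better = half.resize((w, h), Image.BICUBIC)  # closer to what a decent scaler does

crappy.save("walle_upscale_nearest.png")
better.save("walle_upscale_bicubic.png")
```

Flipping between the saved files and the original at 1:1 is the same A/B test people have been doing in this thread, just without the paint program in the middle.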
 
Here's the picture resized 50% in MS Paint, then resized 200%.
I'm thinking this would be the equivalent of a really crappy upscale. Actual 4K upscalers should work much better.
http://i.minus.com/ilxWJ1ZmTVTRe.png

Even then it isn't much different. People will appreciate the pixel density of 4K displays, but a 4K video format doesn't necessarily mean (a lot) more visual information compared to 1080p.

No, just 4 times as much information.
 

Zaptruder

Banned
OK. So I've done some research and testing myself, and I'll concede that there is certainly more nuance to the nature of visual acuity than the familiar 1 arcminute rule of thumb that is cited as the limit of normal 20/20 visual acuity.

The 1 arcminute number, for those who were curious, is derived from the Snellen acuity test (the letter chart that we normally see at the doctor's office), with the 20/20 mark (second or third line from the bottom) being used to determine that acuity threshold. The problem with that rule of thumb, however, is that while some letters start to blur into the shape of some other letters, not all letters blur into the shape of all other letters. By which I mean, something like an I at that distance will not turn into something like an O, while something like an M will be difficult to discern from an H.

From a less-used test, basically a simpler one where people have to figure out whether something is 2 lines or 1 thicker line, visual acuity goes all the way down to around 0.3 arcminute... Better still is our ability to detect whether or not two lines are lined up against each other properly, by way of the Vernier acuity test. We have a 'hyperacuity' for that sort of difference, with a 0.13 arcminute sensitivity in a healthy eye.

From 10 feet on a 40" screen, doing some A/B testing of 1080p and 720p on the harbour scene provided earlier in this thread, I can tell the difference when flipping between the images, but only the finest white line details appear to be different; all other elements appear the same.

From 7 feet, I can start to resolve other small detail differences: specifically, text on the boat and other areas of similar contrast.

From 5 feet, I can tell the difference across most of the image when flipping back and forth: all the edges and some of the textures.

Given that 40" @ 1080p is roughly 0.5 arcminute per pixel at that 10-foot distance, and @ 720p is 0.77 arcminute, it would seem even average acceptable acuity can discern some differences all the way up to 4K @ 35-degree FOV (each pixel would be roughly 0.45 arcminute on the retina).
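
The arithmetic behind those arcminute figures, as a quick sketch (assuming a 16:9 40" panel viewed head-on from the 10-foot distance in the test above):

```python
import math

def arcmin_per_pixel(diagonal_in, h_res, distance_ft, aspect=16 / 9):
    """Angle subtended by one pixel, in arcminutes, for a 16:9 panel
    viewed head-on from the given distance."""
    width_in = diagonal_in * aspect / math.sqrt(aspect ** 2 + 1)
    pixel_in = width_in / h_res
    return math.degrees(math.atan(pixel_in / (distance_ft * 12))) * 60

print(arcmin_per_pixel(40, 1920, 10))  # ~0.52 arcmin -- the "roughly .5" 1080p figure
print(arcmin_per_pixel(40, 1280, 10))  # ~0.78 arcmin -- the 720p figure
print(arcmin_per_pixel(40, 3840, 10))  # ~0.26 arcmin -- a 4K panel from the same spot
```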


In a way, it's fair to say that the benefits of 4K (over 1080p) start from roughly a 30-degree FOV and up... if you're willing to have a larger TV, move closer, or both, then you'll be happy with the differences.

The funny thing for me is that, because I use my 1080p TV from so close anyway (I use it as a computer monitor much of the time), I would've always stood to benefit from a 4K upgrade in resolution.

But nonetheless, a modified version of my contention still holds: this upgrade is far less useful for most people (people who aren't AV enthusiasts) than the previous jump from 480p to 1080p. As others have mentioned before, there are physical limitations to the size of screens that we can house in our homes, and 46" appears to be a nice sweet spot for many households.

For me, even when I stand to benefit, I don't find myself overly excited by the idea, but I will be more than happy to take the upgrade when it drops in price to a commodity level in 5 years' time.
 

Shady859

Member
As a console gamer I might care about this technology when I buy my PS5 and Xbox 1440. When the current generation released, people complained about unreadable text, etc., because games were aimed at 720p HDTVs and half the population was still on old-school TVs. By now 1080p is dirt cheap and the norm, and consoles have no push to cater to these 4K owners.
 

coldfoot

Banned
These kinds of arguments are a joke.
It's like when someone says, "Show some gameplay to some people and many of them couldn't tell you how many frames per second it's running at."
It doesn't matter if they "can tell" or not: a better framerate still feels a lot better, smoother, more fluid, even if people aren't consciously capable of pointing a finger at and quantifying the difference.
Diminishing returns:

Everyone will notice the difference between 15fps and 30fps
Many people will notice the difference between 30fps and 60fps
Few will notice the difference between 60 fps and 120fps
No one will notice the difference between 120fps and 240 fps

Same goes with resolution, when comparing typical living room TV sizes and resolutions.
 

onQ123

Member
Here's the picture resized 50% in MS Paint, then resized 200%.
I'm thinking this would be the equivalent of a really crappy upscale. Actual 4K upscalers should work much better.
http://i.minus.com/ilxWJ1ZmTVTRe.png

Even then it isn't much different. People will appreciate the pixel density of 4K displays, but a 4K video format doesn't necessarily mean more visual information compared to 1080p.

Did you miss the part where I could read the words "solar charge level"?


People trying to take 1080p images and convert them to 4K are missing the point of 4K, because the point of 4K is to have 8 million pixels with bits of detail in each one of those pixels. Upscaling a 1080p image to 4K is going to give you 6 million fake pixels to fill in the gaps, so there will be a lot of lost detail, but even then it's no longer 1080p; it's 1080p upscaled to 4K, so it's still not giving you a real comparison.


Look at 1/4 of a 4K image on a 1080p monitor and just picture having 3 more 1080p monitors surrounding that monitor with the rest of the scene in them; that's what you will be getting with 4K.

It's not about making a 1080p scene bigger; it's about being able to see that 1080p scene and 4X more.

And that 1/4 of the scene that used to be the full screen at 1080p will still have the same detail that it had before, but the difference is that you can now see everything that was surrounding it.
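
For reference, the pixel-count arithmetic behind the "6 million fake pixels" point, as a tiny sketch (assuming the 3840x2160 flavour of 4K; the thread also mentions 4096x2160):

```python
native_1080p = 1920 * 1080  # 2,073,600 real pixels
native_4k = 3840 * 2160     # 8,294,400 real pixels

interpolated = native_4k - native_1080p  # 6,220,800 pixels a scaler has to invent
print(f"{interpolated / native_4k:.0%} of an upscaled 4K frame is guessed, not captured")
# -> 75% of an upscaled 4K frame is guessed, not captured
```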
 

mnannola

Member
What about 8K? 16K? 32K? There has to be a point where we primitive humans cannot tell the difference at a normal TV viewing distance. Without seeing this in action I can't assume 1080p is that point, but Blu-rays look damn good on my 52" in the living room.
 
Literally the first thing I thought of when I read this is

Barney-s-House-barney-stinson-840618_624_352.jpg


and now I want.....
 

OMT

Member
OK. So I've done some research and testing myself, and I'll concede that there is certainly more nuance to the nature of visual acuity than the familiar 1 arcminute rule of thumb that is cited as the limit of normal 20/20 visual acuity.

The 1 arcminute number, for those who were curious, is derived from the Snellen acuity test (the letter chart that we normally see at the doctor's office), with the 20/20 mark (second or third line from the bottom) being used to determine that acuity threshold. The problem with that rule of thumb, however, is that while some letters start to blur into the shape of some other letters, not all letters blur into the shape of all other letters. By which I mean, something like an I at that distance will not turn into something like an O, while something like an M will be difficult to discern from an H.

From a less-used test, basically a simpler one where people have to figure out whether something is 2 lines or 1 thicker line, visual acuity goes all the way down to around 0.3 arcminute... Better still is our ability to detect whether or not two lines are lined up against each other properly, by way of the Vernier acuity test. We have a 'hyperacuity' for that sort of difference, with a 0.13 arcminute sensitivity in a healthy eye.

From 10 feet on a 40" screen, doing some A/B testing of 1080p and 720p on the harbour scene provided earlier in this thread, I can tell the difference when flipping between the images, but only the finest white line details appear to be different; all other elements appear the same.

From 7 feet, I can start to resolve other small detail differences: specifically, text on the boat and other areas of similar contrast.

From 5 feet, I can tell the difference across most of the image when flipping back and forth: all the edges and some of the textures.

Given that 40" @ 1080p is roughly 0.5 arcminute per pixel at that 10-foot distance, and @ 720p is 0.77 arcminute, it would seem even average acceptable acuity can discern some differences all the way up to 4K @ 35-degree FOV (each pixel would be roughly 0.45 arcminute on the retina).


In a way, it's fair to say that the benefits of 4K (over 1080p) start from roughly a 30-degree FOV and up... if you're willing to have a larger TV, move closer, or both, then you'll be happy with the differences.

The funny thing for me is that, because I use my 1080p TV from so close anyway (I use it as a computer monitor much of the time), I would've always stood to benefit from a 4K upgrade in resolution.

But nonetheless, a modified version of my contention still holds: this upgrade is far less useful for most people (people who aren't AV enthusiasts) than the previous jump from 480p to 1080p. As others have mentioned before, there are physical limitations to the size of screens that we can house in our homes, and 46" appears to be a nice sweet spot for many households.

For me, even when I stand to benefit, I don't find myself overly excited by the idea, but I will be more than happy to take the upgrade when it drops in price to a commodity level in 5 years' time.

Thanks for being eminently reasonable in a thread where stubborn, angry ignorance seems to be the norm.
 
It's arguable that at a certain point of resolution the human eye couldn't tell the difference (for native 4K on a 4K display I'd say you would absolutely be able to tell in person), but take into consideration other things than just the resolution bump. A higher resolution reduces the visibility of jaggies even without much or any AA. Throw AA on top of it and it'd look even better.
For games, you can also think of the bump as more "working space" for developers. With higher poly counts and everything, expanding the resolution capacity beyond 1920x1080 is more than appropriate.

Now, what I don't understand are the rumors of 8K in next-generation consoles, when modern PC 4K gaming is only at the beginning of consumer availability and games render natively at much lower resolutions.

Iceblade said:
And you probably didn't see the point of 720/1080p when you were still rocking a CRT in 2003.
I was too young in 2003 to understand or care.

I'd say you're still too young to understand or care.
 

Quixz

Member
I can do a comparison shot of a 4K frame vs. a 1080p frame in case anyone wants to try stuff out with it. I own a RED. Give me a relatively simple scene to do and I'll light it and shoot it at 4K and 1080p resolution.

I can shoot in these resolutions; the third one would probably be the one that becomes the standard for UHDTV:

5K (5120 x 2700): 1-12fps (HDRx 6fps)
4K (4096 x 2160): 1-30fps (HDRx 12fps)
4K QHD (3840 x 2160): 1-30fps (HDRx 15fps)

Isn't QHD 960×540?
 

onQ123

Member
Pictures from my 14MP camera aren't twice as detailed as those from my 7MP camera.

Tell me where you were able to see your 14MP pictures at full resolution to be able to say they're not twice as detailed as your 7MP camera's? Unless you printed out some big pictures from your 14MP camera and 7MP camera and didn't see much difference. But even then, the lenses and sensors play a big part in the IQ.
 

Loofy

Member
Did you miss the part where I could read the words "solar charge level"?
It does look sharper, but you can sort of make out "solar" in the upscaled one too...
Again, this is just using MS Paint. I wonder what it would look like using Toshiba's super-resolution upscaling.
i5pHk3q1eaIoz.png
 