
4K Video Gaming is already here (Toshiba Video Demo)

Hcoregamer00

The 'H' stands for hentai.
This is exciting, but AT&T doesn't even offer 1080p, for goodness' sake. They only offer 720p and 1080i.

:(

I shudder to think what kind of corners they will cut once 4k and 8k become standard.
 

TheExodu5

Banned
Unfortunately, I am worried about older consoles the further and further we go along. When I try to show my grandson Super Mario Bros. on a real NES in 50 years, it will look terrible on the 12k TV.

It won't look any worse on the 12K TV than it will on a current LCD. In fact, it should look a lot better since scaling artifacts will be far less noticeable.
 
The increase from a standard 32" TV to a 40 or 46" TV requires you to leave a bit more space in the corner of your room. An 84" TV requires you to organise your life and room around it. That kind of TV becoming even remotely common requires a change in how we live. The increase over time doesn't just keep going ad infinitum.

Right, I could explain to someone the point of owning a Ferrari; that doesn't mean it applies to everyone :p

I have the space for an 85" right now; unfortunately, the largest consumer plasma was 65" when I bought mine, or I would've gotten a bigger one. In the future, when the price is right (under $5,000 for me, at least), I will upgrade to a larger TV, and in doing that I will also want to move to a 4K set to keep my pixels per inch high as well.

Today we have 1080p 42" TVs, which I think is silly unless you're using them as a monitor and sitting right in front of them :p Likewise, 4K 50" TVs would be dumb, but I'm sure as time goes by and manufacturing gets a lot cheaper, all sets will support 4K regardless, even if just for marketing; 85"+ screens, though, will benefit from the boost.
 

emb

Member
I'm more in the "I don't really care" group personally. I understand the benefits of higher resolutions, and I can even notice HD if I'm looking for it, but going without it generally doesn't bother me much.

Maybe these 4K TVs will bring down the price of 1080p sets though. I need a good one for 360/PS3 games with tiny font sizes.
 

gofreak

GAF's Bob Woodward
Following from my last post, I went to look again at those Wall-E shots used to show the 1080p original render vs the 5kx2k 'marketing' render.

http://hq55.com/disney/walle/walle-disneyscreencaps.com-521.jpg

http://0.tqn.com/d/kidstvmovies/1/0/I/H/walle008.jpg

While the point that you can have a nice smooth image at 1080p is well made - and while there appears to be a colouring difference here too - viewing both on my 1080p 22" shows a clear detail improvement in the latter. The 1080p render blurs and misses some small details completely. There aren't enough pixels to give some detail enough definition.

AA techniques can be a performance-efficient way to try and deal with the problems of finite res, but ultimately they are trying to mimic higher resolution - and they probably won't be quite as faithful. I'd love to see a 1080p and 4K rendering of the same material side by side on 1080p and 4K displays respectively - but I am increasingly unconvinced by arguments that resolution differences are completely indiscernible with computer graphics beyond 1080p on a typical TV. We can argue about performance cost-efficiency, but that's another matter... :)
 

Zaptruder

Banned
Following from my last post, I went to look again at those Wall-E shots used to show the 1080p original render vs the 5kx2k 'marketing' render.

http://hq55.com/disney/walle/walle-disneyscreencaps.com-521.jpg

http://0.tqn.com/d/kidstvmovies/1/0/I/H/walle008.jpg

While the point that you can have a nice smooth image at 1080p is well made - and while there appears to be a colouring difference here too - viewing both on my 1080p 22" shows a clear detail improvement in the latter. The 1080p render blurs and misses some small details completely. There aren't enough pixels to give some detail enough definition.

AA techniques can be a performance-efficient way to try and deal with the problems of finite res, but ultimately they are trying to mimic higher resolution - and they probably won't be quite as faithful. I'd love to see a 1080p and 4K rendering of the same material side by side on 1080p and 4K displays respectively - but I am increasingly unconvinced by arguments that resolution differences are completely indiscernible with computer graphics beyond 1080p on a typical TV. We can argue about performance cost-efficiency, but that's another matter... :)

It's a shame that the two images aren't showing the exact same scene - the film render may have been done purposely with depth of field and other cinematic elements that are not present in the hi-res marketing shot.

Nonetheless, I don't doubt for a moment that supersample anti-aliasing generally provides better image quality than other AA techniques.

The only way to do a proper A-B test is to get a native 4K image, downsample it to 1080p, show both images on a 4K screen at a 35-degree field of view, show them to a random person, and have them report the severity of the difference, if any.
 

gofreak

GAF's Bob Woodward
Nonetheless, I don't doubt for a moment that supersample anti-aliasing generally provides better image quality than other AA techniques.

Sure. I think it's the closest to having extra pixels.

But the point is that this 1080p rendering - with, presumably, SSAA out the wazoo - appears to be blurring some detail vs a higher native res. Someone familiar with the mechanics of SSAA can outline whether that should be the case or not - maybe, if each extra sample is equivalent to a full extra pixel, the total number of samples here still isn't as high as the 5Kx2K render, and that would account for the difference. Or maybe they're not using SSAA, but high-quality MSAA. SSAA is - I believe - basically as expensive as rendering natively at the higher res anyway, so if we were going to do that, we might as well offer the higher res if the display can take it.*

Anyways, point taken about the slightly different angles etc. But I have a big old skeptic-hat on at the moment, wrt 'it won't make a difference' argument.

Shame we can't easily sample 4K vs 1080p right now. Although if someone were so inclined and had the equipment, they could probably compare 1080p to other, higher resolutions on higher-res monitors and see if it lines up with the predictions those charts make. I've a feeling they're really only relevant to live movie footage.

* edit - reading accompanying blog posts, it seems Pixar uses MSAA or something like it in the 1080p renders
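
The mechanics being discussed here - ordered-grid SSAA amounting to rendering at a higher resolution and averaging each block of samples down to one output pixel - can be sketched in a few lines. A toy illustration (NumPy, with random data standing in for a rendered frame; not how Pixar's renderer actually works):

```python
import numpy as np

def ssaa_resolve(hires, factor=2):
    """Average each factor x factor block of samples down to one output pixel.

    This is what ordered-grid supersampling amounts to: render at
    (factor*H) x (factor*W), then box-filter down to H x W.
    """
    h, w, c = hires.shape
    out_h, out_w = h // factor, w // factor
    blocks = hires[:out_h * factor, :out_w * factor, :].reshape(
        out_h, factor, out_w, factor, c)
    return blocks.mean(axis=(1, 3))

# Toy example: a random "4K" frame resolved down to a 1080p grid,
# i.e. 4 samples per output pixel.
frame_hi = np.random.rand(2160, 3840, 3).astype(np.float32)
frame_lo = ssaa_resolve(frame_hi, factor=2)
print(frame_lo.shape)  # (1080, 1920, 3)
```

This also shows why SSAA costs roughly the same as rendering natively at the higher resolution: every one of those samples still has to be shaded.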
 

SapientWolf

Trucker Sexologist
Following from my last post, I went to look again at those Wall-E shots used to show the 1080p original render vs the 5kx2k 'marketing' render.

http://hq55.com/disney/walle/walle-disneyscreencaps.com-521.jpg

http://0.tqn.com/d/kidstvmovies/1/0/I/H/walle008.jpg

While the point that you can have a nice smooth image at 1080p is well made - and while there appears to be a colouring difference here too - viewing both on my 1080p 22" shows a clear detail improvement in the latter. The 1080p render blurs and misses some small details completely. There aren't enough pixels to give some detail enough definition.

AA techniques can be a performance-efficient way to try and deal with the problems of finite res, but ultimately they are trying to mimic higher resolution - and they probably won't be quite as faithful. I'd love to see a 1080p and 4K rendering of the same material side by side on 1080p and 4K displays respectively - but I am increasingly unconvinced by arguments that resolution differences are completely indiscernible with computer graphics beyond 1080p on a typical TV. We can argue about performance cost-efficiency, but that's another matter... :)

The post-processing that Pixar does typically introduces some blur to the image, which they actually prefer for film. It's PC gamers who are used to that super-sharp look. We argued with Timothy Lottes over this because TXAA looked blurry as hell. He said "Well, look at film CG."

You don't necessarily need a 4k display to see the benefits of a sharper source image.
 

gofreak

GAF's Bob Woodward
You don't necessarily need a 4k display to see the benefits of a sharper source image.

Yes, and that will be one of the nice things about any games that do offer >1080p res. But if you're going to render a sharper image - to downsample to 1080p - you might as well offer it directly for displays that can take it natively as well.

It would be great to see the exact same source at different res. I doubt 1080p is the limit for quality resolution of small detail in rendered images, though - be your TV big or small. 1080p with MSAA probably won't be quite as good as a full-blown resolve... although I guess the Wall-E images aren't quite apples-to-apples.
 

majik13

Member
It was definitely here, and if you ever go on any AV forums like avsforum.com, it was all over the place. Not to mention the awesome reporting from the mass media saying it as well.

Here ya go, http://www.neogaf.com/forum/showthread.php?t=340590

This link seems to be more about the viability of Blu-ray at the time. Yeah, some in there are saying it is not needed or discernible, but very few. And the media/journalists are often inaccurate or biased on everything, probably just for sensationalism.

But most commenters are just saying that Blu-ray discs and players are too expensive right now (in '08), or that physical media is dying and not necessary. Both of which are pretty true, especially for me. I probably only own 2 Blu-ray discs, and I think I only watched 1 of them once, about 3 years ago.

I mostly download or stream all my HD movie content.

That thread is more about Blu-ray as a medium than about resolution. I guess my original comment could be corrected specifically regarding Blu-ray, but not HD resolution.
 
Following from my last post, I went to look again at those Wall-E shots used to show the 1080p original render vs the 5kx2k 'marketing' render.

http://hq55.com/disney/walle/walle-disneyscreencaps.com-521.jpg

http://0.tqn.com/d/kidstvmovies/1/0/I/H/walle008.jpg

While the point that you can have a nice smooth image at 1080p is well made - and while there appears to be a colouring difference here too - viewing both on my 1080p 22" shows a clear detail improvement in the latter. The 1080p render blurs and misses some small details completely. There aren't enough pixels to give some detail enough definition.

AA techniques can be a performance-efficient way to try and deal with the problems of finite res, but ultimately they are trying to mimic higher resolution - and they probably won't be quite as faithful. I'd love to see a 1080p and 4K rendering of the same material side by side on 1080p and 4K displays respectively - but I am increasingly unconvinced by arguments that resolution differences are completely indiscernible with computer graphics beyond 1080p on a typical TV. We can argue about performance cost-efficiency, but that's another matter... :)

I was like "that looks mostly the sa.. OH MY GOD" cuz I saw the details of the paint and stuff.
 
Following from my last post, I went to look again at those Wall-E shots used to show the 1080p original render vs the 5kx2k 'marketing' render.

http://hq55.com/disney/walle/walle-disneyscreencaps.com-521.jpg

http://0.tqn.com/d/kidstvmovies/1/0/I/H/walle008.jpg

While the point that you can have a nice smooth image at 1080p is well made - and while there appears to be a colouring difference here too - viewing both on my 1080p 22" shows a clear detail improvement in the latter. The 1080p render blurs and misses some small details completely. There aren't enough pixels to give some detail enough definition.

AA techniques can be a performance-efficient way to try and deal with the problems of finite res, but ultimately they are trying to mimic higher resolution - and they probably won't be quite as faithful. I'd love to see a 1080p and 4K rendering of the same material side by side on 1080p and 4K displays respectively - but I am increasingly unconvinced by arguments that resolution differences are completely indiscernible with computer graphics beyond 1080p on a typical TV. We can argue about performance cost-efficiency, but that's another matter... :)

those arguments never held any merit in the first place.

just old men shouting at trees and pretending they don't see the obvious differences. There will be no question that minds will be blown when they get a chance to see 4K material on a 4K television sitting next to the same content in 1080p on a 1080p television.

everything else old men like Zap are saying is just noise. they'll "get it" soon enough.
 

SapientWolf

Trucker Sexologist
those arguments never held any merit in the first place.

just old men shouting at trees and pretending they don't see the obvious differences. There will be no question that minds will be blown when they get a chance to see 4K material on a 4K television sitting next to the same content in 1080p on a 1080p television.

everything else old men like Zap are saying is just noise. they'll "get it" soon enough.
lol

Those photos were intentionally blurred for artistic purposes. It has nothing to do with resolution.

"We do what is essentially MSAA. Then we do a lens distortion that makes the image incredibly soft (amongst other blooms/blurs/etc). Softness/noise/grain is part of film and something we often embrace. Jaggies we avoid like the plague and thus we anti-alias the crap out of our images," added Pixar's Chris Horne, adding an interesting CG movie perspective to the discussion - computer generated animation is probably the closest equivalent gaming has in Hollywood.

"In the end it's still the same conclusion: games oversample vs film. I've always thought that film res was more than enough res. I don't know how you will get gamers to embrace a film aesthetic, but it shouldn't be impossible."

http://www.neogaf.com/forum/showpost.php?p=34186165&postcount=152
 
I can do a comparison shot of a 4K frame vs a 1080p frame in case anyone wants to try stuff out with it. I own a RED. Give me a relatively simple scene to do and I'll light it and shoot it at 4K and 1080p resolution.

I can shoot in these resolutions; the third one is probably the one that will become the UHDTV standard:

5K (5120 x 2700): 1-12fps (HDRx 6fps)
4K (4096 x 2160): 1-30fps (HDRx 12fps)
4K QHD (3840 x 2160): 1-30fps (HDRx 15fps)
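
For context, here are the per-frame pixel counts those modes imply (quick arithmetic in Python; the 1080p row is added here for comparison):

```python
# Pixel counts per frame for the recording modes listed above,
# plus 1080p for comparison.
modes = {
    "5K":     (5120, 2700),
    "4K":     (4096, 2160),
    "4K QHD": (3840, 2160),
    "1080p":  (1920, 1080),
}
for name, (w, h) in modes.items():
    mp = w * h / 1e6
    ratio = (w * h) / (1920 * 1080)
    print(f"{name:7s} {w}x{h}  {mp:5.1f} MP  ({ratio:.1f}x 1080p)")
```

The 3840x2160 raster works out to exactly 4x the pixels of 1080p, which is why it fits so neatly as the broadcast/TV standard.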
 

majik13

Member
I can do a comparison shot of a 4K frame vs a 1080p frame in case anyone wants to try stuff out with it. I own a RED. Give me a relatively simple scene to do and I'll light it and shoot it at 4K and 1080p resolution.

I can shoot in these resolutions; the third one is probably the one that will become the UHDTV standard:

5K (5120 x 2700): 1-12fps (HDRx 6fps)
4K (4096 x 2160): 1-30fps (HDRx 12fps)
4K QHD (3840 x 2160): 1-30fps (HDRx 15fps)

Wouldn't this be kinda pointless if we don't have 4K devices to view it on? Yeah, it will look better on a computer monitor, because we would essentially be able to zoom in on the detail.
 

onQ123

Member
Wouldn't this be kinda pointless if we don't have 4K devices to view it on? Yeah, it will look better on a computer monitor, because we would essentially be able to zoom in on the detail.



no it wouldn't be pointless because if you have a 23" 1080P monitor you can just look at the 4K image at full size & what will be on your screen will be like looking at 1/4 of a 4K 46" screen.
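
The equivalence holds on pixel density, assuming 16:9 panels and 3840x2160 for "4K": a 23" 1080p monitor and a 46" 4K set both come out to roughly 96 pixels per inch, so a 1:1 view on the monitor shows one quarter of the 4K screen at the same physical size. A quick check:

```python
import math

def ppi(diagonal_in, width_px, height_px):
    """Pixels per inch of a flat panel from its diagonal and pixel raster."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'23" 1080p monitor: {ppi(23, 1920, 1080):.1f} ppi')  # ~95.8
print(f'46" 4K (UHD) set:  {ppi(46, 3840, 2160):.1f} ppi')  # ~95.8, same density
```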
 

Stet

Banned
[image: L97Lu.png]

It's funny, but the NES actually looks really good when it's upscaled crisply.
 

orioto

Good Art™
That's interesting. Bluerei and onQ123 gave me the idea to test it, actually.

[image: 4Ktest.gif]


On my 40" monitor, this is a real-size crop of an 80" 4K picture, done with a random D3200 sample from DPReview: 4K and downsampled to 2K.

Meaning at 40" it's a portion of an 80" screen. What you can see, at least, is that even at a proper distance, the 2K version is noticeably blurry.
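
For anyone who wants to reproduce that kind of test: take a still of at least 4K resolution, downsample it to 1080p, scale it back up so both versions cover the same physical area, and crop matching regions at 1:1. A rough sketch with Pillow (the filename and crop window are placeholders, not the exact images used above):

```python
from PIL import Image

# Placeholder filename: any still of at least 3840x2160, e.g. a full-size
# camera sample like the D3200 shots mentioned above.
src = Image.open("sample_4k.png").convert("RGB").resize((3840, 2160), Image.LANCZOS)

# "2K" version: downsample to 1080p, then scale back up to 4K size so both
# crops cover the same physical area when shown 1:1 on a monitor.
soft = src.resize((1920, 1080), Image.LANCZOS).resize((3840, 2160), Image.BICUBIC)

# Put the same crop from each version side by side.
box = (1600, 800, 2400, 1250)            # arbitrary (left, top, right, bottom)
w, h = box[2] - box[0], box[3] - box[1]
out = Image.new("RGB", (w * 2, h))
out.paste(src.crop(box), (0, 0))
out.paste(soft.crop(box), (w, 0))
out.save("4k_vs_2k_crop.png")
```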
 

majik13

Member
That is a lie.

The whole set of, "you can only see around so many FPS so 30 is enough and you can't really tell the difference" and, "on a 40 inch TV, from 6 -8 feet away, you can't tell the difference between 720p and 1080p." NEEDS TO STOP.

Fuck what you read or hear.
I HAD a 40inch and I could EASILY tell the difference between 720p and 1080p, 30fps and 60fps.

Hell, I can tell the difference between raw 1080p and downsampled 1620.

I can also usually tell the difference between 50-60 FPS.

When watching a movie, I can kind of understand this concept, sometimes...maybe.
But with videogames, you're displaying raw, aliased, sharp lines.

IIRC some big name said something like: if we can anti-alias in a way that resembles the focus of film, and also get motion blur to a point similar to that in film, we would be able to get away with normal resolutions and 24 fps without compromising IQ.

It is certainly no lie, and I am not sure why you would say that, since you are basically agreeing with me. Reread what I wrote.

"you need at least a 43inch tv to see a difference between 720p and 1080p at reasonable viewing distance"


I never said you can't tell a difference on a 40" screen. 6-8 feet is a "reasonable viewing distance", so within that range you will see a difference.

I agree about frames per second, you can definitely tell the difference between 30/60 fps. But screen size and viewing distance definitely have an effect on resolving pixel count.

On a 42" TV, 1080p resolution is only fully noticeable from 5.5 feet and closer; at 8.2 feet and beyond, the difference between 1080p and 720p is not discernible, and that is with the best eyesight.
http://carltonbale.com/home-theater/home-theater-calculator/
This is scientific "fact", not some anecdotal evidence.

This is the TV I once owned, the first 42" 1080p plasma.

"you'll have to sit quite close to the screen to appreciate the benefits of 1080p"
"in fact, depending on how close you like to sit, those extra pixels don't matter at all"


http://reviews.cnet.com/flat-panel-tvs/panasonic-viera-th-42pz700u/4505-6482_7-32466828.html
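
Those figures come out of the usual acuity math: 20/20 vision resolves roughly one arcminute, so extra pixels stop paying off once a single pixel subtends less than that. A sketch of the calculation behind that kind of calculator (16:9 panels and the 1-arcminute rule of thumb assumed) reproduces the 5.5 ft / 8.2 ft numbers quoted above for a 42" set:

```python
import math

def max_useful_distance_ft(diagonal_in, horizontal_px, aspect=16 / 9):
    """Farthest distance (feet) at which one pixel still subtends ~1 arcminute
    for 20/20 vision; beyond this, the extra horizontal resolution blurs together.
    """
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_pitch_in = width_in / horizontal_px
    return pixel_pitch_in / math.tan(math.radians(1 / 60)) / 12

for px, label in ((1280, "720p"), (1920, "1080p"), (3840, "4K")):
    print(f'42" {label:5s}: useful out to ~{max_useful_distance_ft(42, px):.1f} ft')
# 720p ~8.2 ft, 1080p ~5.5 ft, 4K ~2.7 ft
```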
 

majik13

Member
no it wouldn't be pointless because if you have a 23" 1080P monitor you can just look at the 4K image at full size & what will be on your screen will be like looking at 1/4 of a 4K 46" screen.

Well, it is obvious the 4K will have more detail, but this is still pointless; the argument is essentially about resolving pixel detail in accordance with screen size and viewing distance on a native 4K TV.

Like I already said in my first comment: yeah, you can zoom in on a computer, but this has nothing to do with viewing 4K content on a TV. You can't and won't be zooming in to look at detail. You can't judge how well you can resolve the pixel count on a native 4K TV by zooming in on 4K images on a 2K computer.

The argument is that for most consumers the detail will be indiscernible and overkill on roughly anything smaller than 80", depending on a reasonable viewing distance.
 

majik13

Member
lol, everyone quotes the same dumbass. I suppose because he's the only one saying this dumb shit. you call that "scientific fact"? do you know what the phrase "scientific fact" even means? holy shit.

and this is augmented with a random-ass comment in a C|Net review of a 2007 television. I've seen it all.

man, your arguments are so sound, and well backed up.

And you will notice that I put fact inside quotation marks.

Anyways, it is scientific "fact" until there are other scientific observations that refute this data. Honestly, I would like to see them.
 
man, your arguments are so sound, and well backed up.

here's some equivalent backup to your "science".

"How many pixels are needed to match the resolution of the human eye? Each pixel must appear no larger than 0.3 arc-minute. Consider a 20 x 13.3-inch print viewed at 20 inches. The Print subtends an angle of 53 x 35.3 degrees, thus requiring 53*60/.3 = 10600 x 35*60/.3 = 7000 pixels, for a total of ~74 megapixels to show detail at the limits of human visual acuity.

The 10600 pixels over 20 inches corresponds to 530 pixels per inch, which would indeed appear very sharp. Note in a recent printer test I showed a 600 ppi print had more detail than a 300 ppi print on an HP1220C printer (1200x2400 print dots). I've conducted some blind tests where a viewer had to sort 4 photos (150, 300, 600 and 600 ppi prints). The two 600 ppi were printed at 1200x1200 and 1200x2400 dpi. So far all have gotten the correct order of highest to lowest ppi (includes people up to age 50)."
http://www.clarkvision.com/articles/eye-resolution.html

enjoy yourself.
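
Re-running the arithmetic in that quote, using the numbers as given (53 x 35 degrees of angular size, 0.3 arcminute per pixel, 20-inch print width):

```python
# Numbers as quoted above from clarkvision.com.
width_deg, height_deg = 53, 35      # angular size of the print at 20 inches
arcmin_per_pixel = 0.3              # pixel size at the limit of visual acuity
print_width_in = 20

px_wide = width_deg * 60 / arcmin_per_pixel       # 53*60/0.3 = 10600
px_high = height_deg * 60 / arcmin_per_pixel      # 35*60/0.3 =  7000
print(px_wide, px_high, px_wide * px_high / 1e6)  # ~74.2 megapixels
print(px_wide / print_width_in)                   # 530 pixels per inch
```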
 

majik13

Member
here's some equivalent backup to your "science".


http://www.clarkvision.com/articles/eye-resolution.html

enjoy yourself.

What? What does that have to do with TVs and viewing distance? The word "distance" isn't even on that page anywhere. Way to stay on topic. What are you arguing, anyways? How does this refute what I originally said?

Here is more evidence, since you somehow think no other evidence exists. Again, my original argument:

"you need at least a 43inch tv to see a difference between 720p and 1080p at reasonable viewing distance"

This graph shows that at a 5-foot distance you see the benefits of 1080p on a 40" screen.

[image: resolution_chart.jpg]
 

mrklaw

MrArseFace
Well, it is obvious the 4K will have more detail, but this is still pointless; the argument is essentially about resolving pixel detail in accordance with screen size and viewing distance on a native 4K TV.

Like I already said in my first comment: yeah, you can zoom in on a computer, but this has nothing to do with viewing 4K content on a TV. You can't and won't be zooming in to look at detail. You can't judge how well you can resolve the pixel count on a native 4K TV by zooming in on 4K images on a 2K computer.

The argument is that for most consumers the detail will be indiscernible and overkill on roughly anything smaller than 80", depending on a reasonable viewing distance.


Wouldn't a full-screen image on a 42" 1080p TV be the equivalent of watching a corner of an 84" 4K TV? Physically the same size, and the pixels would be the same size - an 84" 4K TV is literally four 42" 1080p TVs in a 2x2 grid.

You'd then need to adjust the viewing distance, e.g. about 10 ft away for an 84" screen.
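
The geometry checks out: a 42" 1080p panel and an 84" UHD panel have the same pixel pitch, and one 1920x1080 quadrant of the 84" set covers the same physical area as the whole 42" screen. A quick sketch (16:9 panels and 3840x2160 for "4K" assumed):

```python
import math

def pixel_pitch_in(diagonal_in, width_px, height_px):
    """Physical size of one pixel, in inches."""
    return diagonal_in / math.hypot(width_px, height_px)

p42 = pixel_pitch_in(42, 1920, 1080)   # 42" 1080p
p84 = pixel_pitch_in(84, 3840, 2160)   # 84" UHD
print(round(p42, 4), round(p84, 4))    # same pitch, ~0.0191 in per pixel

# One 1920x1080 quadrant of the 84" set is physically the size of the
# whole 42" screen (~36.6 x 20.6 inches).
print(round(1920 * p84, 1), round(1080 * p84, 1))
```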
 
Well, it is obvious the 4K will have more detail, but this is still pointless; the argument is essentially about resolving pixel detail in accordance with screen size and viewing distance on a native 4K TV.

Like I already said in my first comment: yeah, you can zoom in on a computer, but this has nothing to do with viewing 4K content on a TV. You can't and won't be zooming in to look at detail. You can't judge how well you can resolve the pixel count on a native 4K TV by zooming in on 4K images on a 2K computer.

The argument is that for most consumers the detail will be indiscernible and overkill on roughly anything smaller than 80", depending on a reasonable viewing distance.

He's not talking about zooming in.

What most consumers can't do at the time isn't a good reason not to push media towards that resolution. Bad argument.
 

onQ123

Member
Well, it is obvious the 4K will have more detail, but this is still pointless; the argument is essentially about resolving pixel detail in accordance with screen size and viewing distance on a native 4K TV.

Like I already said in my first comment: yeah, you can zoom in on a computer, but this has nothing to do with viewing 4K content on a TV. You can't and won't be zooming in to look at detail. You can't judge how well you can resolve the pixel count on a native 4K TV by zooming in on 4K images on a 2K computer.

The argument is that for most consumers the detail will be indiscernible and overkill on roughly anything smaller than 80", depending on a reasonable viewing distance.

The point is you wouldn't have to zoom in, because the same detail that you get from looking at a 4K / 8MP image at full zoom on a 23" 1080P screen is the same thing that you will be getting on a 46" 4K screen with the whole image on the screen without it having to be zoomed in.


what's so hard to understand about that?
 

majik13

Member
The point is you wouldn't have to zoom in, because the same detail that you get from looking at a 4K / 8MP image at full zoom on a 23" 1080P screen is the same thing that you will be getting on a 46" 4K screen with the whole image on the screen without it having to be zoomed in.


what's so hard to understand about that?

Perhaps. I am not certain they would be exactly the same, but maybe they would be. For this experiment to actually work, though, you would also need to sit back as far as you comfortably would when watching a 46-inch TV (8 feet?), switch back and forth between the 1080p and 4K images, and see how much discernible difference there is.

People will not be sitting as close to one of those TVs as they would to a computer monitor.
 

majik13

Member
What most consumers can't do at the time isn't a good reason not to push media towards that resolution. Bad argument.

I am sure 4K will be standard at some point, and I am not arguing that. I am just saying that for most people, even people here, it will not be a big jump unless you are willing and able to get an 80+ inch screen TV in your room.

I think 4K adoption will be a lot harder and slower than HD adoption, because of diminishing returns. It is not necessary or useful for most people. I think for a considerably long time 4K will be practically pointless for home TVs.

But I can possibly see it actually becoming useful and standard with some unknown future tech: video walls, or VR, or some holodeck or something, I don't know.

no it wouldn't be pointless because if you have a 23" 1080P monitor you can just look at the 4K image at full size & what will be on your screen will be like looking at 1/4 of a 4K 46" screen.
He's not talking about zooming in.

To look at a 4K image at full size on a 1080p monitor, you would have to zoom in to see the pixels at 1:1 scale. So as far as I know, yes, he is talking about zooming in.
 

Kyuur

Member
Do not want 4K resolution on my computer monitor. Any applications that have a fixed pixel width/height will be incredibly hard to see on anything that isn't a gigantic TV.
 
Perhaps. I am not certain they would be exactly the same, but maybe they would be. For this experiment to actually work, though, you would also need to sit back as far as you comfortably would when watching a 46-inch TV (8 feet?), switch back and forth between the 1080p and 4K images, and see how much difference there is.

People will not be sitting as close to one of those TVs as they would to a computer monitor.

Depends on the person and application. For me it depends on the game or media too. If it's a casual platformer, I'll sit back. If it's a fighting game I want to be real close.

So sure, sit back far enough and you won't notice, but not everyone sits at that distance every time for every type of media. That's why saying "you can't notice the difference" really shouldn't be the argument - it isn't the scenario every time you sit down to play or watch something - and why using that (not really) scientific chart is a horrible basis for saying 4K is overkill, not needed, or indiscernible.
 
I am sure 4K will be standard at some point, and I am not arguing that. I am just saying that for most people, even people here, it will not be a big jump unless you are willing and able to get an 80+ inch screen TV in your room.

I think 4K adoption will be a lot harder and slower than HD adoption, because of diminishing returns. It is not necessary or useful for most people. I think for a considerably long time 4K will be practically pointless for home TVs.

But I can possibly see it actually becoming useful and standard with some unknown future tech: video walls, or VR, or some holodeck or something, I don't know.



To look at a 4K image at full size on a 1080p monitor, you would have to zoom in to see the pixels at 1:1 scale. So as far as I know, yes, he is talking about zooming in.
The point is you wouldn't have to zoom in

As far as he said, no, he's not talking about zooming in.
 

onQ123

Member
Perhaps. I am not certain they would be exactly the same, but maybe they would be. For this experiment to actually work, though, you would also need to sit back as far as you comfortably would when watching a 46-inch TV (8 feet?), switch back and forth between the 1080p and 4K images, and see how much difference there is.

People will not be sitting as close to one of those TVs as they would to a computer monitor.

Click on the 4K Wall-E image, open it at full zoom with only Wall-E in the picture; that's what you will be seeing on a 46" 4K TV if you're just looking at 1/4 of the screen (that's if you're doing this on a 23" 1080p monitor).


You can step back 8 feet and you will still notice more detail than you would if you were somehow able to just make your 23" monitor 46" with the same pixels.
 

majik13

Member
As far as he said, no, he's not talking about zooming in.


The point is you wouldn't have to zoom in, because the same detail that you get from looking at a 4K / 8MP image at full zoom on a 23" 1080P screen is the same thing that you will be getting on a 46" 4K screen with the whole image on the screen without it having to be zoomed in.

"no it wouldn't be pointless because if you have a 23" 1080P monitor you can just look at the 4K image at full size & what will be on your screen will be like looking at 1/4 of a 4K 46" screen."


I think we may be talking about different things. On a computer there is fitting to the screen, which is how most images are viewed by default, and then, if the image is larger than the screen, you can zoom in to 100%, or a 1:1 pixel ratio.

Not sure how you can say there is no zooming in and then say you fully zoom in on an image, in the same sentence.

So when someone says full zoom, or fills the screen, or is "looking at 1/4" of an image on a screen, I consider that zooming in, as opposed to the full image fitting inside the screen.

Is this not what we are talking about?
 