I remember when people scoffed at the idea of 1080p. Next-gen we might see games running at 4K. Costs need to come down first, though, along with mainstream acceptance of the televisions. This is still a good 6-7 years off from mainstream, which is perfect timing to coincide with the PS5/Xbox1080.
You can see the FF Agni demo in the background. So I guess FFXV is going to be 4k.
There's a video from IFA where a reporter went over to those three screens. One showed a 1080p image [Agni and other demos], the second showed upscaled 1080p, and the third native 4K.
TVs are driven by content. Not a single channel is gonna switch to anything above 1080p in the next ten years because:
a) they just bought all the equipment for 1080p (a lot of them only 1080i/720p) and spent millions on it
b) higher resolution equals more bandwidth, which equals a much higher cost to get your program to the viewers
c) it doesn't make any sense for developers to support it. A game that runs at 30 fps in 1080p (and I guess we'll see 720p as the standard next gen) will struggle to reach even half of that frame rate in 4K (rough numbers in the sketch below). And let's not get started on stuff like RAM requirements and fill rates. It. Just. Won't. Work! 4K is for professional picture and video editing, not for games and TV.
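Rough back-of-envelope of that scaling claim (a minimal Python sketch; the 30 fps baseline and the purely fill-rate-bound assumption are just illustrative):

```python
# Back-of-envelope: how per-frame pixel work scales with resolution,
# naively assuming frame time is dominated by fill rate / pixel shading.
RESOLUTIONS = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
}

def megapixels(name):
    w, h = RESOLUTIONS[name]
    return w * h / 1e6

BASE, BASE_FPS = "1080p", 30.0   # assumed: a game that just hits 30 fps at 1080p

for name in RESOLUTIONS:
    scale = megapixels(name) / megapixels(BASE)
    est_fps = BASE_FPS / scale   # ignores vertex work, CPU cost, memory limits
    print(f"{name:>5}: {megapixels(name):5.2f} MPix, "
          f"{scale:3.1f}x the pixel work, ~{est_fps:4.1f} fps")
```

Under those (crude) assumptions 4K is 4x the pixels of 1080p, so the hypothetical 30 fps game lands around 7-8 fps before you even touch RAM or bandwidth.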
Damn, 25k and they couldn't use a 300MHz HDMI chip? I know the 7970 has one because 1080p 120Hz (or 60Hz 3D) over HDMI was one of their bragging points. 1080p/120 and 4K/30 need roughly the same bandwidth (4K/60 is about twice that), so I can only assume Toshiba's still using an old, gimped chip.
I'd like to have a 4K monitor/TV.
I won't buy one until they're at the same price level as today's 1080p sets.
I'll never, ever understand the resistance to new and more advanced tech. It's expensive? You don't HAVE to buy it.
"I'll never need X GB of RAM"
"Who needs Y MB of HDD space?"
"4K? I can't even tell the difference"
"How could you possibly fill an entire DVD?!"
"Why would you even need a 1 Mb/s connection?"
I don't see anyone saying 'begone, evil technology'. We're more discussing whether it's relevant.
Visual acuity isn't the same as "faster, higher, stronger", though. You can always download faster (and files keep getting bigger, so it's a constant race).
It's more like audio. Blu-ray has lossless audio with TrueHD or DTS-HD MA, so you don't need anything more (you can just spend your money on speakers to do it justice). There is presumably a similar point with video. It's a mix of a few things: resolution, colour depth to avoid banding etc., frame rate, and compression.
Resolution and colour depth are probably the most objective measures, and it should be possible to define what PPI is enough for 90% of consumers not to distinguish pixels. It's the retina argument basically, and you have to factor in viewing angle (not distance, because that will vary with screen size); a rough sketch of that follows below.
Compression and frame rate might be more subjective. Lossless would be great but is impractical, so it's about what we perceive as good enough.
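To make that retina/viewing-angle point concrete, here's a minimal sketch in angular terms (Python; the 60 pixels-per-degree threshold is the usual one-arcminute 20/20 rule of thumb, and the setups are just example numbers):

```python
import math

def pixels_per_degree(h_res, screen_width_in, distance_in):
    """Angular pixel density at the centre of a flat panel."""
    ppi = h_res / screen_width_in
    # how many inches of screen one degree of visual field covers
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

THRESHOLD = 60  # ~1 arcminute per pixel, the common "20/20 / retina" cutoff

# (label, horizontal res, screen width in inches, viewing distance in inches)
setups = [
    ("46in 1080p @ 9 ft", 1920, 40.1, 108),
    ("46in 1080p @ 5 ft", 1920, 40.1, 60),
    ("46in 4K    @ 5 ft", 3840, 40.1, 60),
]
for label, res, width, dist in setups:
    ppd = pixels_per_degree(res, width, dist)
    verdict = "pixels indistinguishable" if ppd >= THRESHOLD else "pixels visible"
    print(f"{label}: {ppd:5.1f} px/deg -> {verdict}")
```

Pixels per degree folds screen size and distance into a single viewing-angle number, which is the point above: a figure like "60 px/deg" works regardless of how big the screen is.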
IFA 4K and 8K report from Sky News
http://www.youtube.com/watch?v=eX4cte2qeBs
Lol without even looking, I knew this thread was made by onQ123. The man is on a mission to make 4k seem relevant. =P
I'm against 4K. Why? Because the majority of these demonstrations/TVs are actually sub-4K.
Call me when they actually do 4K, a la 4096x2160.
Isn't Stinkles Frankie, formerly with Bungie and now with 343?
Yeah, it was a joke.
Ah that's the point of 4K.
Your brain can resolve the resolution pretty easily on 46-60+ inch sizes. Don't get resolution mixed up with PPI.
Hudson can finally release Hi-Ten Bomberman!
...
1080p isn't even standard and you expect 4K in a couple of years? The earliest I would say is at least a little over a decade. But people are right, 4K screens for monitors and tablets are great.
The posts of 'no point' are always so funny... because we all know deep down that this will be standard in a couple of years and we'll wonder how anyone lived with 1080p.
Only if you're sitting 5 feet away from them.
A person with 20/20 vision can resolve roughly 300 pixels per inch divided by the viewing distance in feet.
A 46-inch 1080p set has 47.88 pixels per inch. That means you need to sit no more than 6.26 feet away to be able to resolve all the detail. In most homes, people sit much farther back than that, so 4K is only useful for very large screen sizes.
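That rule of thumb, worked through (a quick Python sketch; the 300/distance constant and the 46-inch example are from the post above, and 16:9 geometry is assumed):

```python
import math

def ppi(diagonal_in, h_res=1920, v_res=1080):
    """Pixels per inch of a 16:9 panel with the given diagonal."""
    aspect = h_res / v_res
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    return h_res / width_in

def max_useful_distance_ft(ppi_value, acuity_const=300):
    """Rule of thumb from the post: 20/20 vision resolves ~300/distance_ft PPI."""
    return acuity_const / ppi_value

for label, h, v in [("46in 1080p", 1920, 1080), ("46in 4K", 3840, 2160)]:
    d = ppi(46, h, v)
    print(f"{label}: {d:5.2f} PPI, fully resolvable within "
          f"{max_useful_distance_ft(d):.2f} ft")
```

That reproduces the ~6.26 ft figure for 46" 1080p; the same screen at 4K only pays off inside roughly 3 ft, which is monitor distance, not couch distance.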
I think this is a fair question: which would look better, 3840x2160 at native resolution on a 4K TV, or 3840x2160 downsampled to 1080p?
The bottom one is 720p.
Did you guess right?
OK, if the fastest video card on the market is struggling to hit 30 fps at that res, how in the hell is the PS4 gonna manage it? I am not looking forward to sub-HD games being upscaled to 4K.
I've always wondered though if it's a bit more complicated with rendered graphics vs video?
I mean, I know on my monitor that when I ratchet the resolution up as far as it can go, the improvements to IQ continue to be noticeable: temporal artifacts and aliasing (both edge and texture/shader aliasing), etc. I'm not sure on my PC that I've reached the point where it stops making a difference.
Now I haven't tried 4K, but... I just wonder if there's a difference here, whether we can apply the same measures used to talk about diminishing returns of resolution in video (movies) to computer graphics. Like how people used to wonder why at 640x480 you didn't see 'jaggies' in video while you did in computer graphics... the sampling of light against film is a very different process than sampling in rendering.
I'm not a huge fan of the crispy look that PC games can have sometimes.
I understand films can capture insane amounts of detail and exceed 1080p resolution, but I have little desire to see films at a higher resolution than 1080p. Mind you, I'm in the pro-4K camp, as I have the luxury of sitting close enough, but I predict that filmmakers aren't going to utilize 4K that much; few films made up to this point have even taken full advantage of 1080p.
Computer GUIs, games, and photography are what will benefit most from 4K.
So funny how these threads roll around every time tech updates. My favorite was Blu-ray. People yelling, "I can't see a difference between upscaled DVD and Blu-ray, there is no point to Blu-ray!!!" I have a 64" plasma; I will wait until 4K and OLED (or some new tech) are both in harmony before I upgrade.
People keep bringing up arguments like this, saying that back in the day everyone said 1080p was unnecessary, or, like above, that Blu-ray was pointless.
But honestly, I don't remember this. I don't remember people screaming that the SD standards of the time were good enough and that no one would want or need HD. I'm sure a few people said it, but I remember quite the opposite.
Eventually you hit a ceiling with this stuff in the mass market, and all it ends up being is just a bigger number to most consumers.
I am certain that eventually 4K and such will come, but I think consumers won't really see a difference or care, and it won't be much of a selling point.
Games have been able to hit over 1080p for years; don't tie in just consoles to this discussion.
I don't see the point of 4K.
Oh definitely. But I think that's perhaps a function of real-time computer graphics being a compromise in many ways, especially AA etc.
So if you have edge AA, for instance, it makes sense that at higher resolutions those edges will be thinner and any artifacts less noticeable.
How about spending more time on IQ but with a lower destination resolution, vs. a higher resolution with less IQ? E.g. 1080p with high-quality AA vs. 4K with no AA. If it's just downsampling, it could even be the same source image; see the sketch below.
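A minimal sketch of the "same source image" idea (Python with Pillow assumed; frame_4k.png is a placeholder filename): box-filtering a 3840x2160 frame down to 1920x1080 averages a 2x2 block of source pixels per output pixel, which is essentially 4x ordered-grid supersampling.

```python
from PIL import Image

# Downsample a 4K render to 1080p. Each output pixel averages a 2x2 block of
# source pixels, so edge and shader aliasing get smoothed out, the same
# image-quality win as 4x SSAA, at the cost of rendering 4x the pixels.
src = Image.open("frame_4k.png")            # placeholder: a 3840x2160 frame
assert src.size == (3840, 2160)

dst = src.resize((1920, 1080), Image.BOX)   # simple box (average) filter
dst.save("frame_1080p_downsampled.png")
```

Whether that looks better than the native image on a 4K panel then comes back to the PPI and viewing-distance question above.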
Ultimately the point of 4K, as with theater projectors, is to display extremely large images while maintaining a very high PPI.
4K may not seem like a big deal now, but manufacturers are just adjusting to the typical screen size people have at home. Back in the early days of HD, 32" and 42" sets were common and 720p was just fine for those, but as 50" and 60" TVs became more popular, 1080p started to make more sense.
Soon 85" displays will become more affordable and for people who get an 85" display, can do so without sacrificing the PPI they'd get from a 50" 1080p set.
Remember, as screens get bigger and the resolution stays the same, your PPI suffers. In the future, when I upgrade from my current 64" TV, likely to an 85" or 100"+ set (I just bought my 64"), I will definitely want to get a 4K set so that I don't lose any detail.
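For what it's worth, here's that PPI claim in numbers (a quick Python sketch; the sizes are the ones mentioned above and 16:9 geometry is assumed):

```python
import math

def ppi(diagonal_in, h_res, v_res):
    """Pixels per inch of a 16:9 panel with the given diagonal."""
    aspect = h_res / v_res
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    return h_res / width_in

for label, size, h, v in [("50in 1080p", 50, 1920, 1080),
                          ("64in 1080p", 64, 1920, 1080),
                          ("85in 1080p", 85, 1920, 1080),
                          ("85in 4K",    85, 3840, 2160)]:
    print(f"{label}: {ppi(size, h, v):5.1f} PPI")
```

An 85" 1080p panel drops to about 26 PPI, while an 85" 4K panel (~52 PPI) actually beats a 50" 1080p set (~44 PPI), which is exactly the "bigger screens without sacrificing PPI" point.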