
Intel: Ivy Bridge GPU to Support 4K Resolutions of up to 4096x4096

Status
Not open for further replies.
I don't think we should be looking at increased resolutions above 1080p for a while. A few big titles struggle to do even 720p, so I can't see this being considered any time soon. I don't want the power of the next-generation consoles being slurped up by ridiculous and unnecessary resolution leaps at this time, and I doubt Microsoft or Sony will want that either. Perhaps when there is large penetration of Ultra HD in homes around the world it may start to make sense, but I don't see that being the case for at least another decade. Chips capable of 4K resolutions make for nice PR headlines but make little sense for home gaming at any point in the near future, PC or otherwise.
 

Suairyu

Banned
wsippel said:
It certainly does. Ultra high resolution monitors exceed the resolution of the human eye even at close viewing distances. You waste time and performance drawing pixels you can't even see.
Resolution of the... human eye? Wat.
 

dr_rus

Member
8192x8192 supported on all GeForces since 8800.
4096x4096 supported on most of GeForces from 7 series and below.

Intel should stop making noises.
 

Valnen

Member
sentry65 said:
PC monitors suck

1080p HD formats pretty much killed monitor R&D to higher resolutions

in 1997 it was all about 800x600
in 1999 it was mostly 1024x768
2001 was finally seeing 1600x1200

2003 saw 1920x1200

2005 went with mainly 1920x1080 and pretty much stayed there for the last 6 years...
I don't really see much reason to go higher. Honestly I'm pretty happy with my monitor and it's only 1680x1050.
 

wsippel

Banned
Suairyu said:
Resolution of the... human eye? Wat.
You think the human eye has infinite resolution? It doesn't. Just like you can only see a limited color spectrum and process a limited number of frames per second. Many of the claims I read about resolutions and color depth and how much difference they supposedly make to some people aren't all that different from people claiming they can lift five tons or hold their breath for an hour.
 

Jocchan

Ὁ μεμβερος -ου
rhfb said:
edit: I know what 4096 is, just seems strange to not go for 4x 1080p, instead going for 4.5whatever
This is what irks me about 4K. The fact it's not a perfect multiple.
 

amdnv

Member
Warm Machine said:
Yeah, we sat at a top level of 720x486 for 50 years. The infrastructure shift to move to 1080p was colossal. When we can confidently move 1080p through a cable to everyone's homes for a decade, we might move on to a new resolution. The change is more than just the display; it goes from cameras for content creation through post to editing to delivery.
Content may be the biggest problem. Most of today's films use 2K digital intermediates, so they simply won't be available in native 4K unless all post is redone entirely.
Funny enough, older movies that had no digital processing could simply be rescanned at a higher resolution and deliver a better image. Of course, 35mm doesn't resolve infinitely anyway, and for many films there probably wouldn't be a noticeable detail increase between 3K and 4K.

Digital cameras are replacing film, and the cost of shooting digital 4K will be more manageable than shooting 70mm or IMAX. So future content will use 4K more and more. I wouldn't be surprised if DSLR cameras are soon able to shoot 4K or better. They've already got the sensors after all.
 

Dennis

Banned
mr stroke said:
Just thinking about a 4k OLED PC monitor makes me wet.
I am sure I will buy one and be first on the GAF block. My bragging will never end!

Seriously though, I hope you can wait, because that 4K OLED will cost $2999 - in 2014!
 

Dennis

Banned
Since we are talking monitors: if I were going to buy one today, I would buy a 27" 2560x1440 monitor, because its pixel density is better than a 30" 2560x1600's.
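The pixel-density comparison above is easy to check yourself. A minimal sketch (the PPI formula is just diagonal pixel count divided by diagonal inches; the `ppi` helper name is mine, not from this thread):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# The two monitors being compared in the post above.
print(round(ppi(2560, 1440, 27), 1))  # 27" 1440p -> ~108.8 PPI
print(round(ppi(2560, 1600, 30), 1))  # 30" 1600p -> ~100.6 PPI
```

So the 27" panel does come out noticeably denser, despite having fewer total pixels.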
 

Zaptruder

Banned
I've long thought that 4K cinema was kinda pointless, due to 1080p already pushing the edge of retinal perceptual differences at reasonable viewing distances.

But then I recently thought: with good wearable displays, like the incoming Sony HMZ-T1, maybe that's where 4K resolution can really be exploited. Create content at the necessary resolution now, so that a decade down the track, when wearable displays are blasting us with complete-field-of-vision images (as opposed to the 45 degree field of vision found in the HMZ-T1, and which you would get when sitting in the sweet spot of a 1080p display), 4K is going to be about right.

Of course the trick that content providers now are missing is that it may be appropriate to reconsider how they set up shots - if 4k vids are going to be used in a manner where it encompasses the whole field of vision of a person - the edges of the screen are going to be stretched right into our periphery. A traditional super close up shot would have a person's forehead and chin blurred out due to how much space they occupy on your retina.
 
Suairyu said:
But it isn't. The human eye is very capable of perceiving increases in pixel density. There have been scientific tests and everything. Keep increasing the pixel density of a display (and footage playing on that display) and eventually people start suffering from motion sickness because everything looks more and more real.

I'm struggling to understand what you mean by "perceive" increased resolution. If you can't discern one pixel from the next what does this perceived increase give you except a really expensive form of anti-aliasing?

Seems like a pointless (and expensive) thing for manufacturers and developers to aspire to.
 

SmokyDave

Member
Rabble rabble! That chart! Rabble rabble!

I see no need for this 4K madness yet. Let's get consoles that can handle 1080p first, eh?
 
retina cellphone, Awesome
retina computer screen, Maybe
retina tv, Fuck you, that is useless. Why would you ever be that close to it? 4K at 50" (my current size) is going to be crazy hi-res.
 

Zaptruder

Banned
MickeyKnox said:
That bullshit chart is close to a bannable offense

Yo, you've been called out - explain why the chart is bullshit.

If you can't, then stop repeating the counter-bullshit meme.

There is a limitation to our own optical capabilities - the number of cones we have available in our fovea limits how much information we can resolve on our retinas.

At a certain angle, additional information will not be resolved by our retinas.

The chart is suggesting that those angles are reached at those resolutions at those seating distances.

Of course, that's not the whole story, but it's not flat-out inaccurate. The real story is that we can tell large differences in contrast much better than small differences in contrast - a white-to-black contrast can be seen, even at the size of a single pixel, from a much larger distance than a gray-to-gray contrast.

Still - move back far enough, and it doesn't matter how big the contrast is - it won't be seen.
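The angle argument above can be sketched numerically. Assuming the standard 20/20 figure of roughly one arcminute per resolvable detail (an assumption, not something the chart itself states; function names are mine), you can compute the distance beyond which adjacent pixels merge:

```python
import math

ARCMIN = math.radians(1 / 60)  # ~20/20 vision: about one arcminute per pixel

def max_useful_distance_in(ppi):
    """Distance in inches beyond which adjacent pixels subtend less
    than one arcminute and blur together for a 20/20 viewer."""
    pixel_pitch = 1 / ppi  # inches between adjacent pixels
    return pixel_pitch / math.tan(ARCMIN)

# A 50" 1080p TV has roughly hypot(1920, 1080) / 50 ~= 44.06 PPI.
print(round(max_useful_distance_in(44.06) / 12, 1))  # ~6.5 feet
```

Past roughly that distance, extra resolution buys nothing under this acuity model - which is exactly the kind of claim the chart plots per screen size.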
 

Dennis

Banned
ElectricBlue187 said:
Why do people say "retina ____" like that's a term that has significance outside of mobile apple products?
because everyone knows it means 'extremely high pixel density' but is faster to type....
 

Septimius

Junior Member
Suairyu said:
Resolution of the... human eye? Wat.

YOU'RE A DUCK, NOT A HAWK. And really, you know there are tiny things inside your eye that analyze the light that hits that point? Kind of like an inverse pixel. HMM.

Also, really, I don't need a 120 inch display with 16kx9k. I'm ok. Just make good games and throw on like 4x AA, and we're good.
 

amdnv

Member
Zaptruder said:
if 4k vids are going to be used in a manner where it encompasses the whole field of vision of a person - the edges of the screen are going to be stretched right into our periphery. A traditional super close up shot would have a person's forehead and chin blurred out due to how much space they occupy on your retina.
This is why I don't appreciate IMAX for regular motion pictures. Cinematographers frame their shots very precisely, and when you can only see a tiny portion of the image without substantially moving your head, all of this is lost.
As such, I find that 45 degrees is the perfect horizontal viewing angle for a movie. It's easy to approximate, too: your viewing distance has to be roughly identical to the width (not the diagonal) of your screen.

Games are completely different of course. The image can't be big enough. Ideally it should fill out your entire field of view (with the game camera's field of view matching). So ultimately we'll want VR helmets or whatever.
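As a sanity check on the distance-equals-width rule of thumb above: the horizontal viewing angle is simple trigonometry, and sitting at exactly one screen-width gives closer to 53 degrees, with 45 degrees landing at about 1.21 screen-widths (function name mine):

```python
import math

def h_view_angle_deg(screen_width, distance):
    """Horizontal viewing angle in degrees; both args in the same units."""
    return math.degrees(2 * math.atan(screen_width / (2 * distance)))

print(round(h_view_angle_deg(1.0, 1.0), 1))    # distance == width -> ~53.1 degrees
print(round(h_view_angle_deg(1.0, 1.207), 1))  # ~1.21x width -> ~45.0 degrees
```

So the rule of thumb is a slight overshoot of 45 degrees, but in the right ballpark.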
 

Zaptruder

Banned
DennisK4 said:
because everyone knows it means 'extremely high pixel density' but is faster to type....

More specifically, 'retina' is shorthand for a pixel density higher than your retinal 'sensor density'.

In other words, it should mean that there's no point adding more pixels, because you can't tell the difference.

In practice, unless you're dealing with a head-mounted display (HMD), it depends on how far you hold it from you or how far you're sitting away from it.

Most people hold their mobile phones at navel to sternum height - so you'd assume that with the Apple screen, at that distance, the display has enough PPI that you can't see much if any difference... assuming you have 20/20 vision.

Or something like that.
 

jmdajr

Member
I can understand 4K for theaters, with such a large screen, but at least for the moment 2K is fine at home.

The day I can get a 4K 100-inch screen TV for 1500 dollars is the day I go all in.

I didn't go 1080p (which is basically 2K) until I could get a 50"+ screen for that amount.

Computers are a totally different story. I'm sure 4K is awesome for photo editing and all sorts of graphics applications.
 

padlock

Member
It's simple really. If you want something that approaches the limit of visual acuity, this simple formula is remarkably accurate:

DPI x viewing distance (in feet) >= 300

A 16:9 screen at 1080p has a DPI of: 2203 / diagonal size of screen (in inches)


A 60 inch 1080p screen (36.7 DPI) from 10 feet away has roughly the same 'effective resolution' as a retina display used 1 foot away from your face.
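Note the product form of the rule: pixels per degree scale with DPI times viewing distance, so the usual "300 DPI at one foot" retina baseline generalizes to any screen. A minimal sketch of the worked example above (the 300 baseline and the helper names are the thread's rule of thumb and my labels, not an official standard):

```python
import math

RETINA_BASELINE = 300.0  # dpi x feet: ~300 DPI viewed from one foot

def dpi_1080p(diagonal_in):
    """DPI of a 16:9 1080p panel: ~2203 diagonal pixels / diagonal inches."""
    return math.hypot(1920, 1080) / diagonal_in

def effective_acuity(dpi, distance_ft):
    """Angular pixel density relative to the retina baseline;
    >= 1.0 means pixels are at or below the acuity threshold."""
    return dpi * distance_ft / RETINA_BASELINE

# 60" 1080p screen viewed from 10 feet vs. a retina display at 1 foot.
print(round(dpi_1080p(60), 1))               # ~36.7 DPI
print(round(effective_acuity(36.7, 10), 2))  # ~1.22, i.e. slightly past retina
```

That matches the post's claim: at 10 feet, a 60" 1080p set already sits a bit beyond the retina threshold.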
 
I would much rather have a higher screen refresh rate at this point than higher resolution.

Though an iPhone-like "Retina Display" (buzzword, I know) would be attractive.
 


In early 2012 the PS3 will support picture output to a 4K monitor. So yes, the PS3 and PS4 consoles will display 4K for MEDIA.

The point of the OP is video playback media support at 4K, not static displays at higher resolutions or games.

4K media support is a done deal as far as Sony is concerned: media in 4K, but games at some lower resolution.

Eastman's comments were spot on except for his assumption that the OP was referring to games, not media; the title of the cited article was 4K video support.

Consoles supporting 4K media is now a given, so let's stop arguing THIS point.
 

Vanillalite

Ask me about the GAF Notebook
The bigger thing is just how awesome integrated graphics are slowly becoming for gaming. They aren't ever gonna reach the point of a dedicated card, but ever since the 1st round of the i series, Intel's really stepped up their game. Integrated in the 2nd gen isn't THAT bad, and this is gonna be another HUGE jump.
 

LiquidMetal14

hide your water-based mammals
It means little to me. I'm interested to see if I have an upgrade path with my current motherboard though. Would be awesome.

The resolutions we're getting now are good enough. Outside of the enthusiast or theaters, I think 1080 is a good peak.
 

soultron

Banned
If guys on PC have figured out how to do [crazy number to do with AA]x [crazy acronym to do with AA] so that no jaggies occur, what's the point?

If us console peasants sit back on our comfy couches and can't see the jaggies in the first place, what's the point?

Other than selling PC hardware and monitors/HDTVs, I don't see one.

In the last ten years I've spent nearly $2K on HDTVs and monitors. I don't plan on dropping that kind of money again in the next half-decade.

It's exciting new tech, that's for sure. Still, I don't feel like upgrading my viewing devices just yet.
 

Iadien

Guarantee I'm going to screw up this post? Yeah.
soultron said:
If guys on PC have figured out how to do [crazy number to do with AA]x [crazy acronym to do with AA] so that no jaggies occur, what's the point?

what's the point?

What's the point of improving on things? I have no idea. =p
 

dr_rus

Member
Brettison said:
The bigger thing is just how awesome integrated graphics are slowly becoming for gaming. They aren't ever gonna reach the point of a dedicated card, but ever since the 1st round of the i series intel's really stepped up their game. Integrated in the 2nd gen isn't THAT bad, and this is gonna be another HUGE jump.
And all that will return to right where it was once the next generation of console h/w is here.
 
Brettison said:
The bigger thing is just how awesome integrated graphics are slowly becoming for gaming. They aren't ever gonna reach the point of a dedicated card, but ever since the 1st round of the i series intel's really stepped up their game. Integrated in the 2nd gen isn't THAT bad, and this is gonna be another HUGE jump.


Well, unless you wanna play Pong, integrated graphics (perhaps minus Llano's) still suck.
 

onQ123

Member
Are there any doubters among us who still don't believe in the great onQ? Step forward & bow your heads.

Do it or I'll bow 'em for you!

ok on a serious note why is it that people don't believe that the PS4/Xbox Next will have resolutions above 1080P when Tablets will soon have 2K resolutions.

& when have you ever known Sony to just stay with the tech that's been used for years? Sony wouldn't even come out with a PS4 if it didn't bring a new standard.
 
onQ123 said:
ok on a serious note why is it that people don't believe that the PS4/Xbox Next will have resolutions above 1080P when Tablets will soon have 2K resolutions.

& when have you ever known Sony to just stay with the tech that's been used for years? Sony wouldn't even come out with a PS4 if it didn't bring a new standard.

Yes they will. They won't put out a PS4 that costs more than $399 this time around. It will be using older tech.
 

Corky

Nine out of ten orphans can't tell the difference.
dr_rus said:
8192x8192 supported on all GeForces since 8800.
4096x4096 supported on most of GeForces from 7 series and below.

Intel should stop making noises.

I thought the GeForce 5xx series only supported up to 2560x1600? Or am I missing something here?
 