
1080p: is it all that critical for next-gen?

But those power cables (and any other line structures) are exactly what will make the illusion break down in the end. The weakest link in the display chain, so to speak.

Also, source: http://www.michaelbach.de/ot/lum_hyperacuity/
I guess I was on the low end, it's actually 5-10x higher.
I like that interactive test. Too bad I know of it so I can't compare it blinded on my 1080p 13" laptop vs. my 1080p 55" TV while sitting at the regular distances I'm on.
 
It's funny, every single post you make is about moving goalposts. Yes, there are lots of situations that can be fine-tuned to fit your arguments, but you can't go "Try this on your PC!!" and then "Awww, but you need to hook it up to a TV and sit far, far away".

You assumed most of that; 1:1 pixel mapping should be enough for you to run some tests.

I have nothing against 1080p, I can clearly notice the difference; still, I think there are far more critical aspects to overall visual quality than resolution. Devs aren't clueless, and if they chose to use 720p or 900p it's probably because they couldn't achieve the same visual quality on that hardware at 1080p.
 
I was recently playing DiRT 2 on 360 which runs in 720p and it looks FLAWLESS. Really clean and no jaggies whatsoever. If all next-gen games look at least as good as that I would be very happy.

So no, 1080p isn't essential, provided that the game has great IQ and great anti-aliasing to compensate for the lower res.
 
Ideally I want my entertainment at the native resolution of whatever I'm watching it on. My TV is 1080p, so that means 1920x1080. No more, no less.
 
It matters due to scaling and the prevalence of 1080p displays. However, it's not as important as frame rate, which actually affects gameplay as well as looking better in motion. I think they should sacrifice resolution where needed to get things running at 60 more often.
 
Actually, I'm no graphic whore but I'd still really like to get my Zelda and Metroid in 1080 :)

I'd like that in all games. But it's one thing to like a better resolution and another to act like "OMG conzole gamez unplayable" over 720p, like some people do on a daily basis.

Gimme a break.
 
You assumed most of that; 1:1 pixel mapping should be enough for you to run some tests.

I have nothing against 1080p, I can clearly notice the difference; still, I think there are far more critical aspects to overall visual quality than resolution. Devs aren't clueless, and if they chose to use 720p or 900p it's probably because they couldn't achieve the same visual quality on that hardware at 1080p.

Yeah, assuming you are talking about PC monitors when you want people to do a test on a PC -- it's definitely not your job to state the testing environment, people just need to assume you meant a TV. With 1:1 pixel mapping, what the hell does the test prove? That would mean the picture just gets smaller in a black box; how is that relevant to anything? The entire point is to see the pixels being stretched over a wider area, causing blur and increased pixelation -- you know, the stuff that differentiates the various resolutions and makes running non-native resolutions that much less desirable.

And you then state that very basic principle we're all talking about, you aren't coming with any sort of magical revelation. People KNOW why they are doing it, that has been established a long time ago. People are arguing why they'd rather not have that compromise and how they feel the compromise isn't actually a positive.
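For concreteness, here is the arithmetic behind the two options being argued, assuming a 720p source on a 1080p panel (my own numbers, just a sketch):

```python
# 1:1 pixel mapping vs. upscaling a 720p image on a 1080p panel.
SRC_W, SRC_H = 1280, 720
DST_W, DST_H = 1920, 1080

# Fraction of the panel actually lit when the image is shown unscaled in a box.
boxed = (SRC_W * SRC_H) / (DST_W * DST_H)
print(f"1:1 mapping fills {boxed:.0%} of the screen")    # 44%
print(f"scaling stretches each pixel {DST_W / SRC_W}x")  # 1.5x per axis
```

Either way you see the compromise: a small windowboxed picture, or every source pixel smeared over 1.5x its size.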
 
I was recently playing DiRT 2 on 360 which runs in 720p and it looks FLAWLESS. Really clean and no jaggies whatsoever. If all next-gen games look at least as good as that I would be very happy.

Then 1080p would look amazing.
 
Ideally I want my entertainment at the native resolution of whatever I'm watching it on. My TV is 1080p, so that means 1920x1080. No more, no less.

How do you feel about Blu-rays that are 2.40:1? Those giant black bars are eating up your resolution. I'm sure someone will comment: "that's how it is in teh theater!" But I fucking hate it. It sucks.
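To put a number on how much of the panel those bars eat, here's a quick sketch (my own arithmetic, assuming a standard 16:9 1920x1080 panel):

```python
# How much of a 1080p panel a letterboxed 2.40:1 film actually uses.
SCREEN_W, SCREEN_H = 1920, 1080

def active_pixels(aspect_ratio):
    """Pixels lit by an image of the given aspect ratio letterboxed on a 16:9 panel."""
    image_h = round(SCREEN_W / aspect_ratio)
    return SCREEN_W * min(image_h, SCREEN_H)

full = SCREEN_W * SCREEN_H   # 2,073,600 pixels for 16:9 content
scope = active_pixels(2.40)  # 1920 x 800 = 1,536,000 pixels
print(f"2.40:1 uses {scope / full:.0%} of the panel")  # about 74%
```

So roughly a quarter of the screen is black bars on a scope film.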
 
The truth is we might get a few 1080p titles, but as this gen gets longer in the tooth devs will figure out that to bring more eye candy they need to lower the pixel count... so sub-1080p is a sure thing in a very short time.
 
I don't even understand the OP's question asking whether 1080p is critical for this next gen, when it's already very clear that there will be a lot of games at 720p, others at odd resolutions somewhere in between 720p and 1080p, and many others at full 1080p.

I personally would say it's critical that no games have sub-HD resolutions, meaning something below native 720p.


I think what I'm saying is fair, given the current realities of next-gen.


I agree with those that say framerate is more important than resolution. I think it is much more important, actually. It is not that I do not notice the difference between 720p and 1080p, I absolutely can. It's a matter of what is more important.
To me, and I know I'm not alone, 60fps (vs 30fps) makes a bigger difference than 1080p (vs 720p). Framerate affects not only the visual smoothness of games but the gameplay as well.

So I would be happier with a mix of 1080p and 720p games with more consistent framerates. Nothing below 30, and a higher percentage of games that are 60.
 
That chart is fucking stupid.

[chart image]


better chart. less bullshit.
 
I'd say it was more embarrassing that a large amount of 360/PS3 games were below native 720p, thus not technically HD at all.

Don't get me wrong, I'd love it if every single PS4/One/Wii U game was native 1080p, but that's 100% not happening.

The same will happen next-next gen (let us call it the "2020+-gen") in that not every game will be Ultra HD (4K or 8K).


edit:

HDTV went into development (by NHK) in the 1960s and was demo'd throughout the 1980s. The first HD (analog) broadcasts in Japan happened in the very late 80s and early 90s. Look how long it took for gaming to even reach HD, let alone 1080p.


Ultra HD began development around 1995. NHK expects the first broadcasts to start (in Japan) in 2020. They've said Ultra HD broadcasts are expected to be in 8K only, not 4K.

Anyway the point is, look how long it took for HD to become adopted and widespread. It certainly did not help that there were 3 different formats, 720p, 1080i and 1080p, with only 720p and 1080i being used for broadcasting. Because of this, the "HD era" of gaming this last gen had 720p, 1080p, resolutions in-between them, as well as sub-HD. Now with this coming gen, we're finally going to see the end of sub-HD.
 
It's hard to see the difference between 720p and 1080p on my 55" 1080p LCD. Has to be an A/B comparison to see the better clarity of the 1080p image. Compression of the HD TV broadcasts is more noticeable than the TV show being broadcast in 720p.
If devs need to drop the resolution of some games to make sure the frame-rate isn't shit and that there's no screen tearing, then I'm all for it. PS4 and XB1 aren't powerhouses, so I'll take the trade-off.

Well I am not saying 1080p and sacrifice everything to achieve that. The next gen consoles should be well capable of hitting 60fps and 1080p and in fact they pretty much are capable.

I am confounded that you cannot see the difference on a 55-inch screen (are high-res textures enabled? I assume you're using a PC to make the comparison). I can certainly tell the difference (as in, a remarkable improvement). I don't have amazing eyesight either; in other words, I'm pretty average on that front.
 
I don't think so.

We have games on current gen running at 720p or below that still look absolutely fantastic. Look at games like Halo 4 and TLoU. Both are great-looking yet only run at 720p.

I think 900p is a great sweet spot, as it allows the devs to maximise the performance of the consoles without expending too much on pointless pixel density. Yes, it may look slightly sharper, but I'd prefer to have more particles, or better shaders.
 
Not at all to me, and I PC game mostly. Just as long as the FPS(30 or 60) are constant with no screen-tearing, I'm good with 1280 x 720 and above.
 
Yes 1080p is a must. I'm fine with black bars as well, as long as the stuff that is on screen is actually rendered at native resolution.
 
Well I am not saying 1080p and sacrifice everything to achieve that. The next gen consoles should be well capable of hitting 60fps and 1080p and in fact they pretty much are capable.

Both Xbox One and PS4 are capable of running all games at 1080p/60. But all those games wouldn't be the graphical leap we want going from one gen to the next. Some devs will drop the resolution to give the graphical leap they want to provide.
 
Same here. Never understood why since I'm usually very good at noticing technical differences.

I can only really tell on my PC as well. However I can only ever tell if there is a difference in resolution with the UI. Font always looks a lot sharper at high resolution. However I couldn't tell the difference at all between 900p and 1080p if I was just looking at the game.
 
GTAV runs at 720p and broke probably every sales record.

Every Call of Duty game released runs at sub-HD and has consistently broken records every time a new title releases.

If you ask me whether the millions of gamers who have been happily gaming away at mostly sub HD resolutions are going to suddenly turn into PC level graphics whores?

I'm going to say no.

Even most non-casuals couldn't tell you what anti-aliasing is, let alone pass a blind 720p vs 900p vs 1080p test and/or make purchasing decisions based on a game's native resolution.

Just like with most things on the internet, it will drive an underground war, but everyone else in the world will just carry on playing their call of duty and Mario.

Same here. Never understood why since I'm usually very good at noticing technical differences.

You likely sit much further back from your TV than your computer monitor.
 
Yes, it is.

It's time to rid ourselves of scaling artifacts. 1920x1080 just happens to be what our TVs are, so that's what we need.

I'm not a resolution fanatic, but this is where I stand as well (although "critical" might be too strong a word for how I feel). Native-resolution content just looks better, without the need for a resource-intensive AA solution.

Now, if they come up with post-processing techniques that make a lower resolution image look indistinguishable from a low-AA 1080p image, then I won't complain about it. But for fuck's sake give me a CLEAN picture.
 
How do you feel about Blu-rays that are 2.40:1? Those giant black bars are eating up your resolution. I'm sure someone will comment: "that's how it is in teh theater!" But I fucking hate it. It sucks.

Well, it doesn't ruin the movie for me. But yeah, I prefer watching 16:9 movies that fill the entire screen.
 
Most TV broadcasts are still in 720p as far as I know, so no, it doesn't matter except for the graphics enthusiasts who have hijacked the term "gamer" in the past few years.
 
Just depends on how much of a videophile you are really. I don't notice a ton of difference between 720p/1080i channels on DirecTV and watching a Blu-ray at 1080p personally. Maybe if I bought an identical TV and put them side by side... but I'd never make that kind of comparison, and I have never been very picky about picture quality. Hell, I still own and watch a bunch of DVDs that I don't plan on upgrading to Blu-ray -- I did care enough to upgrade my favorites that were more special-effects heavy or had great cinematography etc. -- so I do care a tad. Just nowhere near videophile level.
 
1080p is the reason why I hooked my PC up to my HDTV (to play console ports). The jump in resolution already looked massively better for current gen PC ports. Even when it was a straight port with no graphic enhancements.
 
The Wii was 480p when it came out. A lot of people here took the piss, as 720p televisions were gaining popularity at the time. Now, 1080p TVs are the most popular. It seems that any new console game in 720p is going to be in an identical situation to a Wii game. Have people's standards changed, or have they decided that the current tech is all they need? 720p looks pretty much as bad on a 1080p set as 480p did on a 720p one.
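That comparison actually holds up in raw pixel counts: assuming widescreen 480p at 854x480 (my assumption; DVDs vary), each step up is roughly the same 2.25x jump.

```python
# Pixel counts per format (854x480 assumed for widescreen 480p).
pixels = {
    "480p": 854 * 480,     # 409,920
    "720p": 1280 * 720,    # 921,600
    "1080p": 1920 * 1080,  # 2,073,600
}
print(pixels["720p"] / pixels["480p"])   # ~2.25x: the jump the Wii skipped
print(pixels["1080p"] / pixels["720p"])  # 2.25x: the jump being argued now
```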
 
It'll only be critical if some Mad Men make it so through marketing and making the average customer care about 1080p being a critical video game standard. However, I do agree that 1080p and 60fps should be the standard, and it'll be interesting to see how Valve markets their own machines.
 
One of my friends has a 42'' sitting almost six feet off of the floor, and they sit about twelve feet away. The wife doesn't care about "HD" because "standard is fine" to her, yet my friend swears he can tell when a show is in 1080p from that distance.

Considering I did a bunch of blind tests and he failed the majority -- bullshit.

Most folks will probably see some improvement with a 1080p screen and media over 480i/p content but they won't be getting the full benefit of the higher resolution. That said, very few will gain any noticeable (from their viewing distance) improvement going from say 900p to 1080p.
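The viewing-distance point can be sketched with a common rule of thumb: 20/20 vision resolves details of about 1 arcminute, so a pixel subtending less than that is hard to see. A rough calculation (my own formula and numbers, assuming a 16:9 panel):

```python
import math

def pixel_arcmin(diagonal_in, horiz_px, distance_in, aspect=16 / 9):
    """Angle (in arcminutes) that one pixel subtends at the given viewing distance."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # panel width from diagonal
    pixel_in = width_in / horiz_px
    return math.degrees(2 * math.atan(pixel_in / (2 * distance_in))) * 60

# The 42" set viewed from 12 feet (144") in the anecdote above:
print(pixel_arcmin(42, 1920, 144))  # ~0.46 arcmin, well under the ~1 arcmin threshold
```

At that distance even full 1080p pixels sit below the acuity threshold, which fits the failed blind tests.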
 
Just depends on how much of a videophile you are really. I don't notice a ton of difference between 720p/1080i channels on DirecTV and watching a Blu-ray at 1080p personally. Maybe if I bought an identical TV and put them side by side... but I'd never make that kind of comparison, and I have never been very picky about picture quality. Hell, I still own and watch a bunch of DVDs that I don't plan on upgrading to Blu-ray -- I did care enough to upgrade my favorites that were more special-effects heavy or had great cinematography etc. -- so I do care a tad. Just nowhere near videophile level.
May I ask if your TV has been calibrated or not since purchase?
 
I'd like that in all games. But it's one thing to like a better resolution and another to act like "OMG conzole gamez unplayable" over 720p, like some people do on a daily basis.

Gimme a break.
Lol, yeah I hear ya, that visual fidelity elitist scum... it gets tiring. I feel the same a lot of the time when dealing with FPS. WW is 30 due to the original animations I believe, and it doesn't bother me in the slightest, but hey, to each their own I guess?
 
It should take fewer resources to render at a higher resolution than adding on smoothing effects like AA.

When I went from a 1280x1024 monitor to 1600p my framerate actually stayed the same for most games. Not needing all the post-processing BS makes a huge difference.
 
I don't even understand how anyone could argue that it's not noticeable.
And sorry, but that chart is bullshit.
I don't notice, but then again my TV is this weird RCA L32HD31 from 2008 that somehow magically displays in 1080p when the highest rating on the box is 1080i.

I know it is 1080p too because I have 1080i disabled in my PS3's video-output settings and the resolution changes (flashes to blue for a second) when I play a game that doesn't run at 1080p since the XMB is 1080p. I also confirmed doubly with recorded gameplay from my capture card (over HDMI with the DVI workaround).

So I have no idea really of the difference between 1080p and 720p on my TV.
 
Prefer 1080p simply because my monitor is 1080p, no scaling = less input lag = better gameplay.

This was mostly an issue with the PS3, where most games didn't scale up to 1080p using the console's GPU (due to limitations), and instead just output a 720p signal to the TV, letting it scale the picture. The opposite is true of the 360. I'm hoping the PS4 will always output a 1080p signal and the scaling will be done by the PS4 itself, because my TV is crap at doing it.
 
This was mostly an issue with the PS3, where most games didn't scale up to 1080p using the console's GPU (due to limitations), and instead just output a 720p signal to the TV, letting it scale the picture. The opposite is true of the 360. I'm hoping the PS4 will always output a 1080p signal and the scaling will be done by the PS4 itself, because my TV is crap at doing it.

God I hope so. It would be crazy if not. Hated that about the PS3.
 
If "Next-Gen" consoles still haven't caught up to mass market television and PC standards, what's the point? 1080p isn't even considered high end for pc's anymore... It's just disappointing to have a nice 1080p tv and everything is rendered at 720p.
 
Pretty much what I just said.

Actually:
Well I am not saying 1080p and sacrifice everything to achieve that. The next gen consoles should be well capable of hitting 60fps and 1080p and in fact they pretty much are capable.
Reads as: PS4/XB1 are capable of 1080p/60 without sacrificing anything. Maybe not everything would need to be sacrificed, but 1080p and 60fps both take away from some of the extra processing power these new consoles have over current gen. Only so much is left for the graphical leap us core gamers want. Devs don't lower the resolution because they think it's cute. PS4 and XB1 are not the graphical leap a lot were hoping for, so either MS and Sony need to delay the systems to beef up the specs so 1080p/60 can be achieved while maintaining the graphical bump one would expect, or devs have to optimize and find the best balance of IQ and performance for what they've been handed.

Sure, more powerful consoles wouldn't mean devs won't drop the resolution again to achieve better results; it's just that the graphics would be better than what we're getting now.
 