
Are visual expectations for PS4/720 way off from reality?

EVIL said:
It will take another generation for 4K resolutions to go mainstream, so they will probably have some sort of support for it, like upscaling or stuff.
If the next generation lasts 10-15 years, then maybe.
 
I'd be perfectly happy with 1080p60 across the board for the type of graphics we've had this gen. Any less and I'll be slightly disappointed, any more and I'll be giggling like a schoolgirl.
 
SmokyDave said:
I'd be perfectly happy with 1080p60 across the board for the type of graphics we've had this gen. Any less and I'll be slightly disappointed, any more and I'll be giggling like a schoolgirl.
The reality is most devs won't choose that course. We'll get better graphics than this gen, but the framerates won't be 60
 
Raistlin said:
The reality is most devs won't choose that course. We'll get better graphics than this gen, but the framerates won't be 60
I hope and kind of expect that after The Hobbit and Avatar 2 are released developers will see that higher framerates are in vogue and will make locked 60 fps a priority.
 
I don't want 4k or any other wacky resolution, 1080p is twice what we usually get now and I don't want to have to buy another TV just so I can see the frigging text on my almost new 1080p TV, fuck that shit hard.

1080p almost-standard + 4xMSAA + better framerates (I'm sure we'll get them, just like every gen) = Awesome and MUCH better than what we have now. And they could offer a 720p mode where the game runs at 60fps, how cool would that be? ;D (I know I know, that's not how things work :P)
 
Raistlin said:
The reality is most devs won't choose that course. We'll get better graphics than this gen, but the framerates won't be 60
Unfortunately, I suspect you're correct.

I have a horrible feeling we'll still be seeing a few sub-720p games next gen.
 
I'd better see a dramatic change in graphics or I'm not upgrading. Games are being limited and everything is feeling dated. Yes, I have a PC, but I want it on my consoles too. They should be equal to a second tier PC - not bleeding edge, but close to it. I want everything at least 1080p with lots of AA and no stutter with rich environments showing lots of DX11 type niceties.

Hopefully new middleware will be developed to help companies get over the long development cycles associated with HD games.

Raistlin said:
Unfortunately this thread has reached the point where basically everything is coming full circle. All the same questions and points are being reiterated.

You've just described every thread on GAF after it reaches three pages.
 
HomerSimpson-Man said:
Just have enough horsepower, memory, and bandwidth to take current console games, render them natively at 1080p with gobs of anti-aliasing, plus the usual extra kick that comes with a new DirectX API level's effects (DX11 in this case), and we should be good.

This is what I am expecting, no more, no less. I expect the biggest jump will be in image quality; Crysis 2 at full scale @ 60fps should run easily, and I won't need more. Maybe more interactivity, physics and better AI, but that's about it.
 
SmokyDave said:
I'd be perfectly happy with 1080p60 across the board for the type of graphics we've had this gen. Any less and I'll be slightly disappointed, any more and I'll be giggling like a schoolgirl.
I think 720p, 60fps, good AA, high-res textures and no screen tearing will still be the reality next gen.
 
DSN2K said:
I've seen a few statements during E3 about what people would be happy with for a next-gen console... and from my own view I think many are going to end up disappointed.

Many have mentioned examples like the recent Unreal Engine 3 2011 tech demo...

but the reality is that making a full-fledged game look like that would take 10 years lol. The amount of time it would take to create all the assets, animations, etc... imagine trying to keep that level of consistency for a game like Skyrim?

Procedural generation of assets (and more) will finally (need to) take off. Then again, maybe the future is moving away from giant AAA games and more towards XBLA-type gaming? TBH I am inclined to think that a lot of developers are approaching game design in a totally wrong, old-fashioned (Hollywood) way, with big scripted sections and too many cut-scenes. L4D is a great example of a game where interesting global AI can make a game quicker to make. Now add procedurally generated maps and mobs (think Spore) and we might see the next generation in action.

Then again.. that is my job ;) so I might be prejudiced.
 
I already know they'll never come close to high-end PCs in hardware; 1080p @ 30fps with MLAA or 4x MSAA is enough for me. Oh, and good water shaders in every game, that too.
 
I think we'll probably see more devs shooting for 1080p rather than 60fps. My suspicion is someone (maybe Crytek, maybe DICE, maybe someone else) is going to target a 720p 30fps game that blows away everything else, everyone is going to be asking "why doesn't game X look like that?", and then it's all over.

1080p would be nice to hold onto but I honestly couldn't care less about 60fps, particularly on console.
 
I would just like to reiterate my desire for 60fps over 1080p. Outside of AV buffs, the number of people sitting close enough to appreciate the difference over 720p is minimal, whereas the benefits of 60fps can be appreciated on any TV at any viewing distance.

Edit: though you can guarantee next gen the vast majority of games will be 1080p and under 30fps.
 
onken said:
I would just like to reiterate my desire for 60fps over 1080p. Outside of AV buffs, the number of people sitting close enough to appreciate the difference over 720p is minimal, whereas the benefits of 60fps can be appreciated on any TV at any viewing distance.

I don't get this, it's double the amount of pixels on screen. How can anyone not be able to tell the difference? It's night and day to me.

Given the choice though I'd take 720p 60fps and some AA since that'd probably strain the gpu less.
 
Wazzim said:
Blurry poop, like most 'efficient' AA methods; just look at the vaseline crap we got in Crysis 2. MSAA is the only way.
Crysis 2 used frame blending "AA", which looks hideous. God of War 3 and Killzone 3 use MLAA, which isn't as good as FXAA and they look very sharp. There's a GoW3 comparison at DF, comparing MSAA (E3 demo) and MLAA (final). No difference in sharpness at all.
 
ChackanKun said:
OK, can anyone explain to me why exactly both Sony and MS included so little RAM in their consoles? Why not 1GB minimum?

$$$
 
All I know is that this concept art is atrocious.
[image: playstation-4-concept665x.jpg]
 
It's time all of you forget about 60 fps. It will NOT be a standard next gen. Devs will almost always go for higher visual fidelity and 30 fps over 60 fps. Doesn't matter how much more powerful the next gen consoles will be, 30 fps will be the norm.
 
toasty_T said:
I don't get this, it's double the amount of pixels on screen. How can anyone not be able to tell the difference? It's night and day to me.

Given the choice though I'd take 720p 60fps and some AA since that'd probably strain the gpu less.
I don't think it's that people can't notice; it's that most don't really care at some point.
 
eastmen said:
Because of small changes?

1280x720 = 921,600 pixels.

If, say, a game is 1024x576, that is 589,824 pixels.

A difference of 331,776 pixels.

UHDTV is 4320p, or 7680x4320, which is 33,177,600 pixels.

HDTV is 1920x1080, or 2,073,600 pixels.

Even if we go with digital cinema 4K, you're still looking at 3996x2160, or 8,631,360 pixels.

There is a much greater difference in pixel counts between those formats than when a game company fudges its resolution for a 720p game.

In fact, if we assume digital cinema 4K, 8,631,360 divided by 2,073,600 is over 4 times the pixel count.

The PS2 did 480p, which is 345,600 pixels. 720p = 921,600 / 345,600 = a 2.7x increase in resolution.

People will certainly know the difference.
It's a matter of diminishing returns. Average people can't tell the difference between SD and HD unless they're specifically looking for it. Just remember the whole Alan Wake fiasco: everything was fine until a pixel counter revealed its resolution.
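The pixel-count arithmetic in the post above can be double-checked with a few lines of Python. This is a throwaway sketch, not part of the thread; the 1024x576 sub-HD figure is the common "fudged" resolution the numbers in the post correspond to.

```python
# Quick check of the pixel-count arithmetic quoted above.

def pixels(width, height):
    """Total pixels in a width x height frame."""
    return width * height

sub_hd   = pixels(1024, 576)   # common "fudged" sub-720p render
hd_720   = pixels(1280, 720)   # 720p
hd_1080  = pixels(1920, 1080)  # 1080p
cinema4k = pixels(3996, 2160)  # digital cinema 4K
uhdtv    = pixels(7680, 4320)  # UHDTV (4320p)
sd_480   = pixels(720, 480)    # PS2-era 480p

print(hd_720 - sub_hd)     # 331776 pixels lost to a sub-HD render
print(hd_1080 / hd_720)    # 2.25: 1080p is a bit over double 720p
print(cinema4k / hd_1080)  # ~4.16x: the 1080p -> cinema 4K jump
print(hd_720 / sd_480)     # ~2.67x: the 480p -> 720p jump
```

The 2.25x ratio between 1080p and 720p is also why posters in this thread describe 1080p as roughly "double the pixels" of 720p.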
 
I actually kind of hope 48fps really takes off in movies if only because it will make games look choppy running at 30fps in comparison. Though this isn't something we can really hope to happen by next gen.
 
cuevas said:
I don't think its that people can't notice it's most don't really care at some point.
Yup. The other day someone on GAF said that he wouldn't even spend $5 on an HDMI cable for the 360/PS3 so that he could get an HD image. He didn't care that the game looks like ass in 480i on an HDTV.
 
Shalashaska161 said:
I actually kind of hope 48fps really takes off in movies if only because it will make games look choppy running at 30fps in comparison. Though this isn't something we can really hope to happen by next gen.
As I said earlier in the thread, I really think after The Hobbit (shot at 48fps) and Avatar 2 (probably will be shot at 60fps) are released we'll see a shift in developer priorities and a boom in 60fps games.
It will be a glorious time.
 
MLAA looks soft.
Games are getting more and more visually cluttered (post process effects and more detail in environments), so any contrast you can get you should jump on!

MSAA all the way. (MSAA is already softer than FS/SSAA, but at least has a limited performance hit, assuming you have half-decent hardware.)

Metroid-Squadron said:
It's a matter of diminishing returns. Average people can't tell the difference between SD and HD unless they're specifically looking for it. Just remember the whole Alan Wake fiasco: everything was fine until a pixel counter revealed its resolution.

It's a difference in standards and what you are used to.
Live in the city for a while and you can't smell the smog and stink anymore.
Spend a week in a forested area, come back and PIIIEEWWWW.

I'll stick to my foresty 1600x1200 60fps thank you very much.
 
AndyMoogle said:
It's time all of you forget about 60 fps. It will NOT be a standard next gen. Devs will almost always go for higher visual fidelity and 30 fps over 60 fps. Doesn't matter how much more powerful the next gen consoles will be, 30 fps will be the norm.
Well, I'm happy that the top-selling franchise of this generation (CoD) realized the benefits of 60fps and stuck with their convictions on it. I've heard casual gamers who are used to CoD play a game of BF or MoH and wonder aloud why the game feels slower or choppier than CoD.

As for those saying that current-gen tech with 1080p and some AA would be all they'll ever need... you guys lack serious imagination. I'm glad you don't develop games.

The success of the Wii, coupled with the mass exodus of top PC developers over to consoles, has resulted in a serious slowdown of video game tech. In any other generation, a 4-year-old game like Crysis would've looked archaic by PC standards by now. Rather, it is still, years later, the most impressive technical achievement available in gaming. Why? Many top PC devs have moved over to consoles and have no interest in outdoing Crytek. They're more interested in making sure their games run on consoles with 6-year-old hardware.

Also, the console generation is going on longer than ever due to the success of the Wii. The big three don't feel they have to sell high end hardware at a loss anymore and progress has all but completely halted on that end.

I hate the fact that, at the rate things are going now, I'll likely be an old man by the time gaming technology is at a place where developers can literally do anything they want. Whereas in an alternate universe, where the Wii never existed and Crysis had been topped a few times over already, we'd be much closer to that realization by now.
 
-bakalhau- said:
I think a single GTX580, for example, on a console... would make for some very good strides forward.

You're crazy if you think it's going to have a GTX 580; I think a GTX 460 is more likely.
 
Give me open-world games that look as good as Crysis on consoles and closed-world games that can look like the Samaritan demo, and I will be satisfied.

Give me less and I will be annoyed.
 
Sanjay said:
You're crazy if you think it's going to have a GTX 580; I think a GTX 460 is more likely.
Why? The consoles are a year-plus out, and history tells us that Sony and Microsoft typically use top-of-the-line GPUs with custom features.

Sony isn't going to have to eat huge costs like they did with Blu-ray again, and Move has only been moderately successful for them. Plus, they have shown with the Vita that they are still in the game of pushing the cutting edge.

I don't find it unreasonable to expect the equivalent of a near top-of-the-line GPU from the year the console is released. Most likely 2013.
 
Majanew said:
Yep and I'm hoping Epic and Crytek are on Microsoft's ass with their next system, making sure it can be as beastly as possible for a console releasing in 2013.
I'm still lolling at everyone that thinks that MS and Sony are going to jump in in 2013. Doubly so those who think MS is going to rush in in 2012 to defend them against Nintendo.
 
Majanew said:
Yep and I'm hoping Epic and Crytek are on Microsoft's ass with their next system, making sure it can be as beastly as possible for a console releasing in 2013.
I think the Microsoft of then and the Microsoft of now are two different companies. I doubt Microsoft today would give two shits if Epic told them to spend more money for better graphical returns. All of their money is going towards developing this motion sensing stuff and selling their next console around it.
 
Raistlin said:
Far more than 90% can see a difference ... it's more of an issue of the majority not really giving a fuck.
90% can learn how to spot the difference between 30fps and 60fps games. But that's something different from noticing it while playing.

The funny thing though is that most people game on LCDs. I suspect 99% of those gamers are not aware of how sample-and-hold displays work... nor of the implications this has for motion resolution. If they were aware of that, I suspect a lot more would give a fuck.
You're talking about response time?
 
Grampa Simpson said:
I'm still lolling at everyone that thinks that MS and Sony are going to jump in in 2013. Doubly so those who think MS is going to rush in in 2012 to defend them against Nintendo.
If new consoles aren't announced by next E3, then this generation would quickly go from being one of the best to being one of the worst as far as I'm concerned. I'm sick of wallowing in last gen's rotten meat. Sick of choppy frame rates and pop-in in AC: Brotherhood. Sick of characters that just stand around in Fallout: New Vegas rather than bringing the city to life. Sick of small, enclosed, linear games being the only ones that run at 60fps.

Games have so much potential.
 
Was looking at the Xbox power brick yesterday. How much wattage do you think you can fit into that space?

Just asking because, up to a certain point, all those PC power supplies are the same size. Can they really not do the wattage? Is it more a factor of keeping it cool?
 