In reply to GABDEG:
Really that doesn't grant you a free pass. At this time it's hardly a corner case. MLAA and related algorithms simply can't get you jaggy-free images at this point.
EVIL said: it will take another generation for 4K resolutions to go mainstream, so they will probably have some sort of support for it, like upscaling or stuff.
If the next generation lasts 10-15 years, then maybe.
SmokyDave said: I'd be perfectly happy with 1080p60 across the board for the type of graphics we've had this gen. Any less and I'll be slightly disappointed, any more and I'll be giggling like a schoolgirl.
The reality is most devs won't choose that course. We'll get better graphics than this gen, but the framerates won't be 60.
Raistlin said: The reality is most devs won't choose that course. We'll get better graphics than this gen, but the framerates won't be 60.
I hope and kind of expect that after The Hobbit and Avatar 2 are released, developers will see that higher framerates are in vogue and will make locked 60 fps a priority.
Raistlin said: The reality is most devs won't choose that course. We'll get better graphics than this gen, but the framerates won't be 60.
Unfortunately, I suspect you're correct.
GABDEG said: I doubt they'll waste processing power on MSAA.
http://images.eurogamer.net/assets/articles//a/1/3/7/0/3/3/5/fxaa_000.bmp.jpg
Looks pretty good to me and most people.
Raistlin said: Unfortunately this thread has reached the point where basically everything is coming full circle. All the same questions and points are being reiterated.
GABDEG said: I doubt they'll waste processing power on MSAA.
http://images.eurogamer.net/assets/articles//a/1/3/7/0/3/3/5/fxaa_000.bmp.jpg
Looks pretty good to me and most people.
Yup. MSAA is expensive (especially if you're rendering at 1080p). MLAA and derivatives provide great IQ and are cheap to implement.
GABDEG said: I doubt they'll waste processing power on MSAA.
http://images.eurogamer.net/assets/articles//a/1/3/7/0/3/3/5/fxaa_000.bmp.jpg
Looks pretty good to me and most people.
Blurry poop like most 'efficient' AA methods; just look at the Vaseline crap we got in Crysis 2. MSAA is the only way.
HomerSimpson-Man said: Just have enough horsepower, memory, and bandwidth to take current console games and render them natively at 1080p with gobs of anti-aliasing, plus the usual extra kick that comes with a new DirectX feature level (DX11 in this case), and we should be good.
SmokyDave said: I'd be perfectly happy with 1080p60 across the board for the type of graphics we've had this gen. Any less and I'll be slightly disappointed, any more and I'll be giggling like a schoolgirl.
I think 720p, 60fps, good AA, high-res textures and no screen tearing is still a worthy target next gen.
DSN2K said: I've seen a few statements during E3 about what people would be happy with in a next-gen console, and from my own view I think many are going to end up disappointed.
Many have mentioned examples like the recent Unreal Engine 3 2011 tech demo, but the reality is that making a full-fledged game look like that would take 10 years, lol. Think of the amount of time it would take to create all the assets, animations, etc. Imagine trying to keep that level of consistency across a game like Skyrim.
onken said: though you can guarantee next gen the vast majority of games will be 932p and under 30fps.
Wazzim said: Blurry poop like most 'efficient' AA methods, just look at the Vaseline crap we got in Crysis 2. MSAA is the only way.
onken said: I would just like to reiterate my desire for 60fps over 1080p. Outside of AV buffs, the number of people sitting close enough to appreciate the difference over 720p is minimal, whereas the benefits of 60fps can be appreciated on any TV at any viewing distance.
Wazzim said: Blurry poop like most 'efficient' AA methods, just look at the Vaseline crap we got in Crysis 2. MSAA is the only way.
Crysis 2 used frame-blending "AA", which looks hideous. God of War 3 and Killzone 3 use MLAA, which isn't as good as FXAA, and they look very sharp. There's a GoW3 comparison at DF, comparing MSAA (E3 demo) and MLAA (final): no difference in sharpness at all.
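For anyone wondering what the post-process AA family (FXAA, MLAA and friends) actually does versus MSAA: MSAA resolves extra geometric samples per pixel (expensive in fill rate and memory), while post-process filters run over the finished image, find hard steps between neighbouring pixels, and blend across them. Here's a toy sketch of that second idea; it is not the real FXAA or MLAA algorithm (both classify edge shapes and do sub-pixel work), just the core detect-and-blend step, which is also where the "blurry" complaint comes from.

```python
def postprocess_aa(row, threshold=64):
    """Soften hard steps along one scanline of grayscale values (0-255).

    Where two neighbours differ by more than `threshold`, nudge both
    toward their average, turning a hard 2-pixel step into a ramp.
    Purely illustrative -- real filters work in 2D on luma with edge
    classification.
    """
    out = list(row)
    for i in range(len(row) - 1):
        if abs(row[i] - row[i + 1]) > threshold:
            avg = (row[i] + row[i + 1]) / 2
            out[i] = (row[i] + avg) / 2
            out[i + 1] = (row[i + 1] + avg) / 2
    return out

# A hard black-to-white edge becomes a gentler ramp:
print(postprocess_aa([0, 0, 255, 255]))  # [0, 63.75, 191.25, 255]
```

Low-contrast areas pass through untouched, which is why a well-tuned filter mostly hits jaggies; crank the threshold down (or blend too wide, as Crysis 2's frame blending effectively did) and everything goes soft.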
ChackanKun said: Ok, can anyone explain to me why exactly both Sony and MS included so little RAM in their consoles? Why not 1GB minimum?
Ushojax said:
ChackanKun said: Really? Would it be that expensive to include an extra 512MB of RAM? :|
slopeslider said: Thought that was 24fps
Well yeah, it's a movie.
toasty_T said: I don't get this, it's double the amount of pixels on screen. How can anyone not be able to tell the difference? It's night and day to me.
I don't think it's that people can't notice; it's that most don't really care at some point. Given the choice, though, I'd take 720p, 60fps and some AA, since that'd probably strain the GPU less.
eastmen said: Because of small changes?
1280x720 = 921,600 pixels.
If, say, a game is 1024x576, that is 589,824.
A difference of 331,776 pixels.
UHDTV is 4320p, or 7680x4320, which is 33,177,600.
HDTV is 1920x1080, or 2,073,600.
Even if we go with digital cinema 4K, you're still looking at 3996x2160, or 8,631,360.
There is a much greater difference in pixel counts between those formats than there is when a game company fudges their resolution on a 720p game.
In fact, assuming digital cinema 4K, 8,631,360 divided by 2,073,600 is over 4 times the pixel count of 1080p.
The PS2 did 480p, which is 345,600 pixels. 921,600 / 345,600 = a 2.7x increase in resolution.
People will certainly know the difference.
It's a matter of diminishing returns. Average people can't tell the difference between SD and HD unless they're specifically looking for it. Just remember the whole Alan Wake fiasco: everything was fine until a pixel counter revealed its resolution.
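The pixel arithmetic above checks out; here's a quick sanity check of the quoted numbers (the "sub-HD" entry uses 1024x576, the exact 16:9 size behind the ~589,824 figure):

```python
# Pixel counts for the resolutions discussed in the thread.
resolutions = {
    "PS2 480p":    (720, 480),
    "sub-HD":      (1024, 576),   # exact 16:9 sub-HD size
    "720p":        (1280, 720),
    "1080p":       (1920, 1080),
    "cinema 4K":   (3996, 2160),  # DCI 4K 'flat' image area
    "UHDTV 4320p": (7680, 4320),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["720p"] - pixels["sub-HD"])      # 331776 -- the 'fudged 720p' gap
print(pixels["cinema 4K"] / pixels["1080p"])  # 4.1625 -- the 'over 4 times' jump
print(pixels["720p"] / pixels["PS2 480p"])    # ~2.67  -- the SD-to-720p jump
```

So the 1080p-to-4K step really is a bigger pixel jump than the SD-to-HD transition was, which is the poster's point.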
cuevas said: I don't think it's that people can't notice; it's that most don't really care at some point.
Yup. The other day someone on GAF said that he wouldn't even spend $5 on an HDMI cable for his 360/PS3 so that he could get an HD image. He didn't care that the game looks like ass in 480i on an HDTV.
Shalashaska161 said: I actually kind of hope 48fps really takes off in movies, if only because it will make games look choppy running at 30fps in comparison. Though this isn't something we can really hope to happen by next gen.
As I said earlier in the thread, I really think that after The Hobbit (shot at 48fps) and Avatar 2 (probably to be shot at 60fps) are released, we'll see a shift in developer priorities and a boom in 60fps games.
Metroid-Squadron said:It's a matter of diminishing returns. Average people can't tell the difference between SD and HD unless they're specifically looking for it. Just remember the whole Alan Wake fiasco, everything was fine until a pixel counter revealed its resolution.
AndyMoogle said: It's time for all of you to forget about 60 fps. It will NOT be a standard next gen. Devs will almost always go for higher visual fidelity at 30 fps over 60 fps. No matter how much more powerful the next-gen consoles are, 30 fps will be the norm.
Well, I'm happy that the top-selling franchise of this generation (CoD) realized the benefits of 60fps and stuck with its convictions. I've heard casual gamers who are used to CoD play a game of BF or MoH and wonder aloud why the game feels slower or choppier than CoD.
-bakalhau- said: I think a single GTX 580, for example, on a console... would make for some very good strides forward.
ChackanKun said: Ok, can anyone explain to me why exactly both Sony and MS included so little RAM in their consoles? Why not 1GB minimum?
Dead Man Typing said: Microsoft was only going to put 256MB in the 360, but apparently Epic Games said that would be a mistake:
http://www.eurogamer.net/articles/news241006gears
Yep, and I'm hoping Epic and Crytek are on Microsoft's ass with their next system, making sure it can be as beastly as possible for a console releasing in 2013.
Sanjay said: You're crazy if you think it's going to have a GTX 580; I think a GTX 460 is more likely.
Why? The consoles are a year-plus out, and history tells us that Sony and Microsoft typically ship top-of-the-line GPUs with custom features.
Majanew said: Yep, and I'm hoping Epic and Crytek are on Microsoft's ass with their next system, making sure it can be as beastly as possible for a console releasing in 2013.
I'm still lolling at everyone who thinks MS and Sony are going to jump in in 2013, and doubly so at those who think MS is going to rush in in 2012 to defend against Nintendo.
Majanew said: Yep, and I'm hoping Epic and Crytek are on Microsoft's ass with their next system, making sure it can be as beastly as possible for a console releasing in 2013.
I think the Microsoft of then and the Microsoft of now are two different companies. I doubt today's Microsoft would give two shits if Epic told them to spend more money for better graphical returns. All of their money is going toward developing this motion-sensing stuff and selling their next console around it.
Raistlin said: Far more than 90% can see a difference ... it's more of an issue of the majority not really giving a fuck.
90% can learn how to spot the difference between 30fps and 60fps games. But that's something different from noticing it while playing.
Raistlin said: The funny thing though is that most people game on LCDs. I suspect 99% of those gamers are not aware of how sample-and-hold displays work, nor of the implications for motion resolution. If they were aware of that, I suspect a lot more would give a fuck.
You're talking about response time?
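No, the sample-and-hold point is about persistence, not panel response time: a full-persistence LCD shows each frame for the entire frame period, so when your eye tracks a moving object, the static frame smears across the retina by roughly (speed in pixels per second) ÷ (frames per second). A rough illustration; the 1920 px/s panning speed is an assumed figure, not from the thread:

```python
def hold_blur_px(speed_px_per_s, fps):
    """Approximate eye-tracking smear, in pixels, on a sample-and-hold
    display: the frame sits still for 1/fps seconds while the eye keeps
    moving at the tracked object's speed."""
    return speed_px_per_s / fps

# A pan that crosses a 1920-pixel screen in one second:
print(hold_blur_px(1920, 30))  # 64.0 px of smear at 30 fps
print(hold_blur_px(1920, 60))  # 32.0 px at 60 fps -- half the blur
```

Which is why 60fps (or strobing/black-frame insertion) does more for perceived sharpness in motion than a resolution bump does.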
Grampa Simpson said: I'm still lolling at everyone who thinks MS and Sony are going to jump in in 2013, and doubly so at those who think MS is going to rush in in 2012 to defend against Nintendo.
If new consoles aren't announced by next E3, then this generation will quickly go from being one of the best to one of the worst, as far as I'm concerned. I'm sick of wallowing in last gen's rotten meat: sick of choppy frame rates and pop-in in AC: Brotherhood, sick of characters that just stand around in Fallout: New Vegas rather than bringing the city to life, sick of small, enclosed, linear games being the only ones that run at 60fps.