I never said 720p. A modified 1080p or 945p, that's it. A 30% difference is like comparing a game running at 30 FPS on PS4 to 21 FPS on the next Xbox. The moment you rip out AA and the ultra effects, parity becomes easy.
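For what it's worth, here's the quick math behind those percentages (a rough sketch, assuming 16:9 resolutions for each of the vertical pixel counts being thrown around):

```python
# Rough pixel-count comparison for the resolutions in this thread (16:9 assumed).
def pixels(height, aspect=16 / 9):
    """Total pixels for a 16:9 frame of the given vertical resolution."""
    return int(height * aspect) * height

p1080 = pixels(1080)  # 1920 x 1080
p945 = pixels(945)    # 1680 x 945
p720 = pixels(720)    # 1280 x 720

print(f"945p is {p945 / p1080:.0%} of 1080p's pixels")  # ~77%
print(f"720p is {p720 / p1080:.0%} of 1080p's pixels")  # ~44%
print(f"21 FPS is {21 / 30:.0%} of 30 FPS")             # 70%
```

So 945p is only about a 23% pixel cut from 1080p, roughly in line with the 30% figure, while 720p would be more than half the pixels gone.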
Really? You think when a system has 2.5 GB more RAM for games and a lot more bandwidth, it will only go down to 945p, even if they take out all the AA and whatever else? I guess they would have to use lower-res textures too.
One thing to remember about PS3/360 games: the assets are identical 95% of the time, omitting the very early years on the PS3. As are the effects. A change there alone would make the gap quite a bit bigger than anything last gen, hence the Xbox vs PS2 comparison. Look at the PS2 version of RE4 vs the GC version. Those differences are way bigger than what we saw between PS3/360 games.
Also, an average of 20 fps vs 30 fps is huge, compared to something running at 60 fps vs 50 fps. If it's averaging 20 fps (and sometimes dipping as low as 15 fps), that's getting close to unplayable IMO.
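The frame-time math backs this up. A minimal sketch (just converting fps to milliseconds per frame):

```python
# Frame-time deltas: why 20 vs 30 fps feels far worse than 50 vs 60 fps.
def frame_time_ms(fps):
    """Milliseconds spent on each frame at a given framerate."""
    return 1000.0 / fps

gap_low = frame_time_ms(20) - frame_time_ms(30)    # 50.0 - 33.3 ms
gap_high = frame_time_ms(50) - frame_time_ms(60)   # 20.0 - 16.7 ms

print(f"20 vs 30 fps: {gap_low:.1f} ms extra per frame")   # ~16.7 ms
print(f"50 vs 60 fps: {gap_high:.1f} ms extra per frame")  # ~3.3 ms
```

Each frame at 20 fps hangs on screen about 16.7 ms longer than at 30 fps, five times the gap between 50 and 60 fps, which is why the low end feels so much choppier.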
If the 32 MB of ESRAM really takes up a huge amount of die space (half a 680 equivalent?), then that could mean MS needed to cut back on the size of the GPU and still ended up with a larger (= more expensive) chip to produce.
Yes. They're really hindering themselves by shoving this vision down our throats. But hey, that's their business model, and that's what they think will make them the most successful. If they end up selling more consoles by broadening their audience this way, then good for them. But for the hardcore, the PS4 seems the way to go.
I guess everybody wins except the hardcore Xbox gamer...