
Are visual expectations for PS4/720 way off from reality?

duk said:
Not sure, because the Xenos chip was based on the R400 architecture, which launched before its PC counterpart came out.
Well, not exactly. The way it came together was kind of like the Intel Core line of processors: based on some of the stuff ATi had been working on for the next generation, blended with mainstream consumer-level parts. Xenos, strictly speaking, doesn't have a PC counterpart. It was still a pretty powerful chip for its time, pushed some impressive results early on, and still proves to be a fairly beefy piece of hardware.
 
Grampa Simpson said:
I think a lot of it rests on what features are deemed required for the new console. It seems that added features add extra heat, and by going a little older on the architecture you can trade features for heat/speed.

Surely they will do DX11+, just like the 360 did DX9+ (I think).
 
TheVampire said:
I think that as long as the next consoles can do full AA and AF everyone will be happy graphics wise.
Personally, I think underdog graphics technologies will be what defines the "look" of the PS4/Xbox 720 generation. Stuff that we've touched upon but not really seen pushed to its limits, like sophisticated depth of field, more prolific physics eye candy, greatly improved animation and lip sync, good object motion blur, and maybe even tessellation or voxels.
 
PortTwo said:
Maybe we should define what expectations are?

Like, are we expecting Crysis on Ultra? We need some kind of loose metric to agree upon.

My expectation is that we'll see games during the PS4/720 generation that exceed the Samaritan demo by a clear margin.

That demo has some nice effects, but it's brute-forcing a lot. Plus, wasn't it knocked up in a few weeks? The '3x 580' comment is just what it took to get it up and running; it isn't meant as an indicator of what level of tech will be needed to have games like that. Epic said themselves it could feasibly run on one card.

Console devs are good at working around problems - look at MLAA compared to MSAA on PCs, for instance. Efficiency is the key.

Apply the game development and design skills of a team like Naughty Dog to a fixed architecture console with power approximating a current modern PC and I salivate at the mere thought.
 
I would hope that by the time these consoles come out, 60fps will be mandatory.

Also, I just took a gander at the BF3 thread and saw how nice the PC version looks. Is it safe to assume that is basically how good the graphics will be for these systems? Would running The Witcher 2 at its highest settings be a preview of what to expect?
 
mrklaw said:
That demo has some nice effects, but it's brute-forcing a lot. Console devs are good at working around problems - look at MLAA compared to MSAA on PCs, for instance. Efficiency is the key.
MLAA is used not only because of performance differences, but also because MSAA just doesn't work well with deferred rendering.
mrklaw said:
Apply the game development and design skills of a team like Naughty Dog to a fixed architecture console with power approximating a current modern PC and I salivate at the mere thought.
Also, as most people have pointed out, next-generation consoles probably won't use modern hardware.


Edit: I could be wrong about MLAA, though. But I do think I read somewhere that some rendering engines have problems with MSAA and need to use other techniques like MLAA or temporal AA.
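For what it's worth, the post-process nature of MLAA is why it plays nicely with deferred rendering: it only needs the final color buffer, whereas MSAA needs per-sample geometry coverage during rasterization, which clashes with a G-buffer pipeline. Here's a toy sketch of the idea - not real MLAA, which classifies edge shapes (L/Z/U patterns) and computes coverage-based blend weights; the threshold and 50/50 blend below are illustrative assumptions:

```python
# Toy MLAA-style pass over a grayscale "framebuffer" (list of lists).
# It detects hard luminance edges in the finished image and blends
# across them - no geometry or per-sample coverage needed.

THRESHOLD = 0.5  # luminance jump that counts as an edge (assumption)

def mlaa_like_pass(img):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w - 1):
            # a hard horizontal neighbor edge -> blend toward the average
            if abs(img[y][x] - img[y][x + 1]) > THRESHOLD:
                avg = (img[y][x] + img[y][x + 1]) / 2
                out[y][x] = (img[y][x] + avg) / 2
                out[y][x + 1] = (img[y][x + 1] + avg) / 2
    return out

# a hard black/white vertical edge, two rows tall
frame = [[0.0, 0.0, 1.0, 1.0],
         [0.0, 0.0, 1.0, 1.0]]
smoothed = mlaa_like_pass(frame)
```

Running it turns each row [0.0, 0.0, 1.0, 1.0] into [0.0, 0.25, 0.75, 1.0] - the staircase becomes a gradient, which is fundamentally all these filters do.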
 
Souther said:
I would hope that by the time these consoles come out, 60fps will be mandatory.

Also, I just took a gander at the BF3 thread and saw how nice the PC version looks. Is it safe to assume that is basically how good the graphics will be for these systems? Would running The Witcher 2 at its highest settings be a preview of what to expect?

As much as I'd like to see 60fps become mandatory, sadly I don't see it happening. This sort of rule could be implemented on any console in theory, because any game can be made to run at 60fps; it's just a matter of the developer building the game with that rule in mind.

But it's hard to see a console maker demanding this of a developer for a host of reasons. The first of them being that a console maker loves to have amazing looking games built for their platform, and it's simply easier to make games prettier when the framerate doesn't have to be 60fps at all times.


Regarding the next-gen consoles, after 7-8 years the leap is going to be quite significant, even if the hardware makers go cheap and not bleeding edge. I believe what is the cutting edge of PC games in 2010/2011 is going to be about what we see in the next-gen consoles when they arrive in 2-3 years.
 
BruiserBear said:
As much as I'd like to see 60fps become mandatory, sadly I don't see it happening. This sort of rule could be implemented on any console in theory, because any game can be made to run at 60fps; it's just a matter of the developer building the game with that rule in mind.

But it's hard to see a console maker demanding this of a developer for a host of reasons. The first of them being that a console maker loves to have amazing looking games built for their platform, and it's simply easier to make games prettier when the framerate doesn't have to be 60fps at all times.


Regarding the next-gen consoles, after 7-8 years the leap is going to be quite significant, even if the hardware makers go cheap and not bleeding edge. I believe what is the cutting edge of PC games in 2010/2011 is going to be about what we see in the next-gen consoles when they arrive in 2-3 years.


I probably should have worded my reply better concerning 60fps. I really meant that I would hope it would be the norm by then. Whereas the norm now is pretty much 30fps, I would hope that by then 60fps would be the norm.
 
jmdajr said:
1280x720p 60fps > 1920x1080p 30fps

Either way, as long as you can do either it be just fine.
1920x1080p 60fps > *



Very, very few AAA developers are going to attempt 60fps, though. There aren't enough people on consoles to appreciate the extra work going into it and realise its benefits.
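As a rough sanity check on those comparisons, raw pixel throughput can be tallied directly. It's only a crude proxy for GPU cost (shading, bandwidth, and geometry don't scale linearly with pixel count), but it shows why the trade-off is real - and, interestingly, that 1080p30 actually pushes slightly more raw pixels per second than 720p60, so preferring 720p60 is about smoothness, not total work:

```python
# Raw pixel throughput for the resolution/framerate combos above.
# Crude proxy only: real GPU cost also depends on shading, bandwidth,
# and geometry, which don't scale linearly with pixel count.

def pixels_per_second(width, height, fps):
    return width * height * fps

p720_60  = pixels_per_second(1280, 720, 60)   # 55,296,000 pixels/s
p1080_30 = pixels_per_second(1920, 1080, 30)  # 62,208,000 pixels/s
p1080_60 = pixels_per_second(1920, 1080, 60)  # 124,416,000 pixels/s
```

So 1080p60 is 2.25x the raw fill of 720p60, which is part of why so few AAA console titles attempt it.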
 
Haunted said:
1920x1080p 60fps > *



Very, very few AAA developers are going to attempt 60fps, though. There aren't enough people on consoles to appreciate the extra work going into it and realise its benefits.
Some are slowly starting to. CoD kind of opened up a lot of eyes to it and now some developers like Id for instance are sticking to 60fps as a conviction, no matter what other sacrifices are to be made.
 
Souther said:
I would hope that by the time these consoles come out that 60FPS will be mandatory.

Also I just took a gander in the BF3 thread and seen how nice the PC version looks. Is it safe to assume that is basically how good the graphics will be for these systems? Would running Witcher 2 in highest settings be a preview of what to expect?
I used to think that the PC games 1-2 years before a new generation gave a rough outline of what to expect, but the PS3 and 360 have been keeping up surprisingly well, so I don't know quite what to make of it.
 
Orayn said:
I used to think that the PC games 1-2 years before a new generation gave a rough outline of what to expect, but the PS3 and 360 have been keeping up surprisingly well, so I don't know quite what to make of it.
It's not so much that console games have been keeping up well as it is that graphical progression in PC games has slowed. Some of the best PC developers that would've been likely to attempt to top Crysis have moved over to consoles. In a normal generation, Crysis would've been topped several times over by prolific PC developers as well as by Crytek themselves by now. Instead, it remains the benchmark for video game visuals even 4 years later.

Epic caters their engines around consoles now. Id caters their engines around consoles now. Crytek moved over to creating engines that work well on current consoles. Call of Duty games look the same on consoles as they do on PC because console is the priority now, and whatever they can get to run on a console gets ported over to PC.

So yes, you give the current-gen consoles too much credit if you think the reason the games on PC aren't blowing them away is that the hardware in the consoles was just so ahead of its time when it was released 6 years ago. There's something to be said when a 4-year-old game like Crysis can't even run on consoles, but now virtually every game runs on consoles with the hit being little more than a lower resolution and lower frame rates.
 
Reallink said:
Are you sure you're not confusing 1.4a with an as of yet non-existent 1.4b. The original 3D formats were covered in 1.4 (unmodified--no a or b suffix). 1.4a is its successor and the newest spec I've ever heard of. It's the one that mandated side/side and top/bottom formats for broadcast content.
Arg - you're right. I suck. I keep forgetting HDMI is wacky and starts with numbers only. I knew it was the first revision, but as you stated that's actually 1.4a.

:(




Anywho ... regarding the other points, you seem to misunderstand what I've been saying. If you read the top section of this post, it explains everything (though maybe not clearly).

It appears you are getting the opposite meaning from it, but what I was trying to state was that the 1.4a spec does include FP 1080p60 as an optional format - and that a new spec isn't necessary for that. What is necessary is for Silicon Image to actually produce a full-speed Tx card that supports all the optional formats. To my knowledge, they simply don't exist right now.

I only mentioned HDMI 1.5 being needed for 4K resolution, not 1080p 3D. As a matter of fact I stated at the end of the section that I expected a full-featured 1.4a Tx would probably be the 'future proofing' Sony would include in PS4.
 
The way I see it, even if graphics peak, more power can always be used for things like draw distance and the number of things on screen. Maybe we could get a GTA V or VI with hundreds of citizens and cars on screen at once in a city, which would probably be more impressive than tessellation or other such details.
 
RedSwirl said:
You could argue that this happened because publishers only allowed it to.

All I'm saying is that a console's "type of userbase" doesn't just show up out of thin air. The success of the PS1 and PS2 largely came about because of third parties.

This isn't a situation where suppliers can manufacture demand; it's where the product itself satisfies most of the demand, and developers would be chasing demand that's simply not there.

If developers are at fault, it's for not understanding what kinds of demand existed on the Wii. My point is that for much of the Wii's userbase, the demand didn't extend past core first-party titles. Of course, this is what happens when game development has become a multi-billion dollar business with multi-million dollar costs: developers by and large trend toward conservatism and incremental advancement of the status quo.
 
Dan Yo said:
It's not so much that console games have been keeping up well as it is that graphical progression in PC games has slowed. Some of the best PC developers that would've been likely to attempt to top Crysis have moved over to consoles. In a normal generation, Crysis would've been topped several times over by prolific PC developers as well as by Crytek themselves by now. Instead, it remains the benchmark for video game visuals even 4 years later.

Epic caters their engines around consoles now. Id caters their engines around consoles now. Crytek moved over to creating engines that work well on current consoles. Call of Duty games look the same on consoles as they do on PC because console is the priority now, and whatever they can get to run on a console gets ported over to PC.

So yes, you give the current-gen consoles too much credit if you think the reason the games on PC aren't blowing them away is that the hardware in the consoles was just so ahead of its time when it was released 6 years ago. There's something to be said when a 4-year-old game like Crysis can't even run on consoles, but now virtually every game runs on consoles with the hit being little more than a lower resolution and lower frame rates.
I suppose they are "keeping up" in the sense that PC gaming hasn't been making the kinds of leaps and bounds it has in the past. Battlefield 3 looks like it'll drive a wedge between consoles and PC, though.
 
The PS4/720 graphics bump will be completely underwhelming. "That's it?" is going to be the resounding vocal opinion when these systems come out with their big fat price points and middling graphical hops.
 
Souther said:
I would hope that by the time these consoles come out, 60fps will be mandatory.

Also, I just took a gander at the BF3 thread and saw how nice the PC version looks. Is it safe to assume that is basically how good the graphics will be for these systems? Would running The Witcher 2 at its highest settings be a preview of what to expect?

No - graphics will be better, especially poly count and tessellation. Also fancier shaders.
 
It would be nice if 60fps became the norm for shooters, just like it is for fighting games, racing games, and rhythm games, but no platform will ever instate a mandatory framerate.
 
OMT said:
This isn't a situation where suppliers can manufacture demand; it's where the product itself satisfies most of the demand, and developers would be chasing demand that's simply not there.

If developers are at fault, it's for not understanding what kinds of demand existed on the Wii. My point is that for much of the Wii's userbase, the demand didn't extend past core first-party titles. Of course, this is what happens when game development has become a multi-billion dollar business with multi-million dollar costs: developers by and large trend for conservatism and incremental advancement of the status quo.

So you're blaming developers for being conservative at a time when Nintendo is trying to be anything but conservative (in certain aspects), after being conservative themselves got them nowhere.
 
Orayn said:
I suppose they are "keeping up" in the sense that PC gaming hasn't been making the kinds of leaps and bounds it has in the past. Battlefield 3 looks like it'll drive a wedge between consoles and PC, though.
Yup. That's pretty much what I was saying. BF3 looks like the first game in a long time that will make console gamers yearn for new hardware. However, it's kind of a small victory in that, by now, top PC games should just flat out not be possible on consoles. Crysis is still the only one that I can think of, and that is, like I said, the first and only example despite being so old at this point.
 
Jonm1010 said:
No, it wasn't from the horse's mouth; that's why it was a rumour.

The job advertisements weren't, though. The threads discussing them are there if you feel the need to search.

On a personal note, I don't care when they release; I'd be fine waiting four more years and just getting a better PC in the interim. I just see all these threads about job advertisements, potential chip deals, developer talk, rumours and supposed behind-the-scenes games being shown - throw in the Wii U - and all the signs point me to the conclusion that Sony and Microsoft are looking to drop a new system sooner rather than later.
Jack Tretton said:
When will you start talking about PlayStation 4?

PlayStation 3 is really just hitting its stride. And technologically, I don’t think it’s possible to provide any advancement beyond what we have. What we’ve seen from the competition is trying to add features that already exist in PlayStation 3. We invested heavily in that, we rolled a very heavy rock up a steep hill through the launch period. But now I think that all pays off, and we’ve got a long runway behind it. So, I wouldn’t look for any discussion of a next generation PlayStation for quite some time.

I think there’s ground to be carved out for everybody. But I didn’t see anything about Nintendo’s announcement that said ‘Oh, we’d better get working on rolling out a new PlayStation here pretty soon.’

Our attitude is kind of ‘welcome to the party.’ If you’re looking at being a multimedia entertainment device, if you’re looking at high def gaming, that was 2006 for us.
 
The answer to the thread is YES, YES, YES.


Graphics technology will not be getting any better; we have more or less peaked.

Making a big, cinematic 3D game is one of the hardest things to do; the number of different skill sets required is pretty much unmatched. The logistics of putting a big game together are insane.

Games take too much time to make, have become too expensive, and even now tons of games don't even pull a profit. This generation has already had loads of studios (fantastic, talented studios) close because the HD generation screwed them over.




The Samaritan demo is pretty, but that's just because it's more polished. All of the technology in it has been put into games already to some extent - for example, bokeh DOF is in Just Cause 2 and subsurface scattering is in Metro 2033. The only real difference is that it's a COMPLETELY authored demo, so every aspect could be at the highest fidelity possible and polished to hell and back.

It apparently took them 3 months to make the assets for that tiny demo, never mind the research behind it. How long would a typical 8-hour shooter take?

The demo itself ran on three cutting-edge GTX 580 cards in tandem; that is INSANELY expensive. Even with graphics cards moving forward as they are, no single card will come even CLOSE to that for years.

The next consoles can't afford to wait that long, especially with the Wii U coming out. They will have to launch soon, and they can't afford to eat up debt like they did this generation.




The PC used to steamroll consoles visually, and that isn't the case anymore.

Crysis 2
Witcher 2
Battlefield 3

All of these are on the 360, which is practically ancient. The only game that ISN'T on consoles is Crysis 1, because the consoles simply can't handle the scale of its levels at that fidelity (nowhere near enough RAM), and CryEngine 2 is also less optimized.

What the PC HAS been able to do is run these games BETTER than the consoles with sheer brute force.




So what I'm saying is that we will see far more polish than we are used to this generation: games that run at a more stable level (maybe even in HD as advertised this time ¬_¬), bigger worlds, better use of shader effects, and supplemental technology like motion capture (as in L.A. Noire) and realistic physics engines.


We are already at the edge of photorealism, and the next generation is about EFFICIENCY.


All of those saying that the Wii U is going to be outdated? HAHAHAHAHAHA
 
I think a lot of people are setting their expectations too low. I don't think Samaritan is out of the question; actually, I feel that's what we should be shooting for, in all honesty. I'm no game developer, nor am I graphics-tech savvy, but regardless of that I'd say every generation has gone so far beyond the graphical beast of the previous gen it isn't even funny. Half-Life 2, Doom 3 and Far Cry were the tech pushers last gen, and they were being done on the Xbox late into its life with decent results. Now look how far we've come: we have people arguing over whether games have met their target renders that were CG (KZ2), whether games have surpassed that (Crysis 2 and Rage), and games that look beyond all three (BF3).

I don't know about you guys, but games with visual fidelity beyond BF3 should be mandatory and expected right out of the gate. If we are setting our sights as low as double TW2 performance (a game being done on 360 as I write this), then what is the point in even releasing new consoles or shelling out another $300-$400? New consoles should once again be a paradigm shift, not a simple upgrade. As developers grasp the tech and truly harness it, early Pixar stuff should be within reach.

So please, for our benefit, don't be like Nintendo and release something that isn't worth buying. I think Sony's plan for PS3 is in line with mine; the Vita itself shows that Sony isn't (hopefully) content with releasing something just for the sake of a quick buck or to guard against Nintendo gaining market share. I'm certain Nintendo's decision will ultimately bite them in the ass. Their customers will now always be a generation behind and have to shell out more money to get what should be standard in the first place. Nintendo's pocket is going to take some damage because of their decision to upset the console release cycle. So, to sum up: Sony & Microsoft are taking the right approach, Samaritan should be possible, and accepting anything that's current gen is robbery by the big three.
 
Tenck said:
What the PS4 or the 720 will do, PCs beat them to it 5 years ago.
I doubt that. The same companies making the GPUs in PCs are the ones supplying them for the consoles, and the new consoles will get, at the very least, whatever GPU technology is current.
 
The argument that Crysis, in some shape or form, can't run on consoles isn't entirely accurate in my opinion. Far Cry 3 looks at least as good as Crysis on medium, and is likely to be an open-world game even larger than Crysis.
 
Phonomezer said:
How far away is DX12? Wouldn't surprise me if the next xbox were to debut it.
New OS, new shader model. DX12 will hit with Windows 8 next year, though who knows what shape it'll be in. It might not be ready for primetime until there's a 12.1 or 12.2 or something.


The problem is the timing. Kepler and Southern Islands (R-1k) are hitting this year. Maxwell and whatever AMD/ATI has planned (they haven't really extended their roadmap yet, iirc) won't hit until 2013, if not later. They aren't going to redesign this year's cards to support DX12 midstream... so unfortunately the consoles will probably miss the boat.




Dr_Peace said:
Graphics technology will not be getting any better; we have more or less peaked.

...

We are already at the edge of photorealism
Absurd.
 
Dan Yo said:
Some are slowly starting to. CoD kind of opened up a lot of eyes to it and now some developers like Id for instance are sticking to 60fps as a conviction, no matter what other sacrifices are to be made.

Rage will be 60fps because Carmack believes that 60fps is the best way to experience the game, not because he's following in the footsteps of Call of Duty. Remember, Doom 4 is going to be 30fps.

nelsonroyale said:
The argument that Crysis, in some shape or form, can't run on consoles isn't entirely accurate in my opinion. Far Cry 3 looks at least as good as Crysis on medium, and is likely to be an open-world game even larger than Crysis.

Crysis in its current form certainly couldn't, because CryEngine 2 is horribly optimised.
 
Allonym said:
I think a lot of people are setting their expectations too low. I don't think Samaritan is out of the question; actually, I feel that's what we should be shooting for, in all honesty. I'm no game developer, nor am I graphics-tech savvy, but regardless of that I'd say every generation has gone so far beyond the graphical beast of the previous gen it isn't even funny.
Even assuming your premise is correct, the issue here is that Samaritan isn't current gen. It's not a game, or even using a finished engine. It's a tech demo meant to demonstrate where Epic wants to go in the future, built for high-end, next-gen PC hardware.
 
When the next consoles come out, they will look considerably better than a PC game right now, particularly an average one.

Think PS1... way better than PCs right when it came out.

Dreamcast... nothing on PC was as good, even close to Soul Calibur.

The 360 was a little anomaly: it looked better, but didn't really drop jaws until Gears a year later (short dev cycle for launch games). That was the standard until Crysis, but that, like others pointed out, was a year later and generally ran like poo.

Since it will probably be 1.5+ years until the next PS/Xbox consoles come out, compared with today's games, yes, they will look better. Games that year will look as good as on a high-end PC, maybe even with higher res/AA.

PCs are really held back and seem to jump along with consoles, since development started to run in parallel and/or be console-led.
 
Dan Yo said:
It's not so much that console games have been keeping up well as it is that graphical progression in PC games has slowed. Some of the best PC developers that would've been likely to attempt to top Crysis have moved over to consoles. In a normal generation, Crysis would've been topped several times over by prolific PC developers as well as by Crytek themselves by now. Instead, it remains the benchmark for video game visuals even 4 years later.

Epic caters their engines around consoles now. Id caters their engines around consoles now. Crytek moved over to creating engines that work well on current consoles. Call of Duty games look the same on consoles as they do on PC because console is the priority now, and whatever they can get to run on a console gets ported over to PC.

So yes, you give the current-gen consoles too much credit if you think the reason the games on PC aren't blowing them away is that the hardware in the consoles was just so ahead of its time when it was released 6 years ago. There's something to be said when a 4-year-old game like Crysis can't even run on consoles, but now virtually every game runs on consoles with the hit being little more than a lower resolution and lower frame rates.
This is why I don't understand comments that we'll see current-level PC games in next-gen consoles. We're seeing current-level PC games on consoles *now*, just with lower resolution and poorer textures. If all you got next gen was a resolution/texture bump, then why even bother releasing a new console?

The bar will be raised significantly, IMO. And PCs will benefit from the new baseline, being able to take advantage of it. Initial games may look like little more than souped-up PS3 games, as throwing more polys and textures at the problem will be the easy route to take.
 
MS and Sony can't afford to not wow people next-gen. A little better isn't enough in a world where everything but your toilet can run games.
 
To be honest, I don't expect much. I don't even expect Sony or MS to announce a new console in the next few years. It's just too expensive, I believe. They are struggling to make money with the current iterations, aren't they?

And I also don't think the technology has advanced far enough to justify a new console generation. What I don't want is for them to spice up their next consoles with gimmicks just to separate themselves from the other competitors on the market, like the Wii U did.
 
Ptaaty said:
The 360 was a little anomaly: it looked better, but didn't really drop jaws until Gears a year later (short dev cycle for launch games). That was the standard until Crysis, but that, like others pointed out, was a year later and generally ran like poo.

PGR3
 
What I don't think people realise is that the cost of games is rising, but there is also an ever-increasing market for games, so publishers that choose to support these games stand to gain a lot of money by investing a lot of money.

However, it will constrain the variety of games that receive a big budget. But that is why we have the indie market, which is also growing thanks to digital distribution, allowing developers to showcase their creativity and skill in order to attract publishers to give them a budget to work with.

The middle ground will probably continue to contract, and games such as Enslaved will become less and less common, but there will still be the support of middleware such as CryEngine or Unreal to give developers a bit of relief, and these tools will continue to become more attractive and versatile, allowing reduced-budget games to be created while still having high production values.
 
It's not really about HYPER-REAL, CAN'T-TELL-THE-DIFFERENCE-FROM-IRL graphics anymore. That's an art/software limitation.

What's going to happen is taking the best-looking graphics out right now and being able to apply them on a ridiculous scale with amazing detail.

It's all about the scale and possibilities of games now, not so much the lifelike graphics.
 
Ptaaty said:
When the next consoles come out, they will look considerably better than a PC game right now, particularly an average one.

Think PS1... way better than PCs right when it came out.

Dreamcast... nothing on PC was as good, even close to Soul Calibur.

The 360 was a little anomaly: it looked better, but didn't really drop jaws until Gears a year later (short dev cycle for launch games). That was the standard until Crysis, but that, like others pointed out, was a year later and generally ran like poo.

Since it will probably be 1.5+ years until the next PS/Xbox consoles come out, compared with today's games, yes, they will look better. Games that year will look as good as on a high-end PC, maybe even with higher res/AA.

PCs are really held back and seem to jump along with consoles, since development started to run in parallel and/or be console-led.
Kingpin on PC destroyed Dreamcast at the time.
 
Either devs get better at procedural asset generation, or expect dev cost/time to get exponentially worse than even this gen. Photorealism takes a lot of work, even if we have the hardware to do it. And with that extra cost, game prices will go up, and people already bitched so much this gen when games went up to $60-$70. They could easily hit $80 next gen.
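The procedural-generation point can be made concrete with a toy sketch. This isn't any real engine's pipeline - the asset fields and ranges below are made up for illustration - but it shows the core property: one artist-authored template plus a seed yields unlimited deterministic variations, instead of hand-authoring every tin can and pebble:

```python
import random

# Toy seed-driven asset variation: one template, many deterministic
# variants. The fields (scale, tint) are illustrative assumptions,
# not from any real engine.

def generate_variants(base_name, count, seed):
    rng = random.Random(seed)  # private RNG: same seed -> same variants
    return [
        {
            "name": f"{base_name}_{i}",
            "scale": round(rng.uniform(0.5, 2.0), 2),
            "tint": rng.choice(["grey", "brown", "moss"]),
        }
        for i in range(count)
    ]

# 100 rock variants from one authored template, reproducible by seed
rocks = generate_variants("rock", 100, seed=7)
```

Because the variants are reproducible from the seed, only the template and the seed have to be authored and shipped, not 100 hand-made assets - which is exactly the cost lever the post is talking about.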
 
CrunchyFrog said:
Either devs get better at procedural asset generation, or expect dev cost/time to get exponentially worse than even this gen. Photorealism takes a lot of work, even if we have the hardware to do it. And with that extra cost, game prices will go up, and people already bitched so much this gen when games went up to $60-$70. They could easily hit $80 next gen.
An extra cost from what? High quality assets are already being created for games today. The jump next gen won't be as big as it was this generation.
 
Shaka said:
An extra cost from what? High quality assets are already being created for games today. The jump next gen won't be as big as it was this generation.
That's what I'm saying. People expecting the leap we had from last gen to this gen to be the standard leap for next gen are in for a disappointment - not because of hardware/software limitations, but because of manpower. And yeah, high-quality assets are being made, but it's taking longer and longer. BF3 looks beautiful, but it's been in development for over 4 years; summer blockbusters don't take that long. It's hitting a point where content creation is becoming prohibitively expensive when you have to put THAT much detail into the environment of a game: every tin can, every pebble, every leaf on every bush, every pocket flap and decal on each of the dozens of different types of soldiers and weapons. I'm not saying the process isn't accelerated somewhat by middleware and engine development, but there's just so much more to consider now when creating the "AAA" video game environment than there was a few years ago. A ceiling will be hit somewhere.
 
TxdoHawk said:
The PS4/720 graphics bump will be completely underwhelming. "That's it?" is going to be the resounding vocal opinion when these systems come out with their big fat price points and middling graphical hops.
To be fair, we all said the same thing when the 360 launched. The first impressive games didn't come out anywhere near launch; hell, pretty much everything prior to Fight Night Round 3 was more or less upscaled last-gen games. People tend to forget how underwhelming the 360 was up until Gears came out a year later (albeit Dead Rising came out over the summer and looked amazing to me). It took devs a loooong time to figure out how to utilize the new techniques. I suspect it'll be like that every generation.
 