Truth101 said:
> Well then you are going to be incredibly disappointed.

Nah, I think you are underestimating how well developers (especially for exclusives) optimize for consoles.

2San said:
> Nah, I think you are underestimating how well developers (especially for exclusives) optimize for consoles.

What the PS4 or the 720 will do, PCs beat them to it 5 years ago.

Tenck said:
> What the PS4 or the 720 will do, PCs beat them to it 5 years ago.

So both consoles will struggle with Crysis running at 30fps on medium settings?

rapid32.5 said:
> So both consoles will struggle with Crysis running at 30fps on medium settings?

Zing, lol.
PortTwo said:
> Maybe we should define what expectations are? Like, are we expecting Crysis on Ultra? We need some kind of loose metric to agree upon.
Violater said:
> Well, those certainly won't be launch-title graphics.

Never said I was talking about launch titles.
soles papillons sexuels said:
> and was hardly playable on the hardware of its time.

The 8800GT cards were out before Crysis, I think.
Nirolak said:
> You have to consider console image quality and performance standards.
> We don't have any PC games that can only run on a really high-end card at 30 FPS (with drops) at 720p with no anti-aliasing, or with only something like MLAA.
> These are the lengths people go to when wringing more power out of a console, though, which is why graphics are where they are on consoles.
> If we actually cared about performance and standards, we'd be holding our stuff a lot further back.
2San said:
> Never said I was talking about launch titles.

ARXIN said:
> 60fps at 720p > 30fps at 1080p.
> If I had to choose...

Realistic expectations. I don't mind a resolution trade-off either, if devs want to push more on the screen. Insomniac were pushing Ratchet and Clank TOD at 720p/60fps at first, and the game looked gorgeous.
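For scale on that 60fps/720p versus 30fps/1080p choice, the raw pixel throughput of the two modes is surprisingly close; a back-of-envelope sketch (it deliberately ignores overdraw, post-processing, and CPU cost, which is where the real budget goes):

```python
# Rough pixel-throughput comparison of the two modes being debated.
# Assumes each pixel is shaded exactly once per frame (no overdraw or scaling).
def pixels_per_second(width, height, fps):
    return width * height * fps

print(pixels_per_second(1280, 720, 60))   # 55296000 pixels/s at 720p60
print(pixels_per_second(1920, 1080, 30))  # 62208000 pixels/s at 1080p30
```

1080p30 actually pushes slightly more raw pixels per second, so the choice mostly comes down to how much each frame costs beyond fill rate.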
SneakyStephan said:
> A thousand intelligent people can repeat this a thousand times over, and people will still refuse to understand it.
> That's not even taking into account all the 'cheating' that has to be done: aggressive LODs (or photo-mode LOD models in menus versus what you actually get in-game, to bullshit people some more), bad draw distances and closed-off areas that bring back memories of the PSX era, low update rates on shadows, more and more use of sprites, and promising GI and day/night transitions but ending up with separate day and night levels because half the environment shadows are baked, etc.
> If a PC-exclusive title were made for a high-end graphics card with what you mentioned, plus all the cheating I mentioned, people would have to pick their jaws up off the floor.
> Too bad that designing a game like this always requires huge sacrifices in design (look at all the corridor games with areas separated by a ton of load times), and the illusion falls apart every time you take a closer look at things in-game.
> I just finished playing some Ghost of Sparta on my PSP, and it shows what even that little machine is capable of when going all out on optimising, or faux 'optimising' (everything mentioned above).
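The aggressive-LOD 'cheating' described above amounts to swapping in cheaper models as camera distance grows; a minimal sketch (the thresholds, LOD count, and sprite fallback are invented for illustration, not taken from any real engine):

```python
# Distance-based LOD selection: lower index = more detailed model.
# An "aggressive" scheme just pulls these thresholds closer to the camera.
def select_lod(distance, thresholds=(10.0, 30.0, 80.0)):
    for lod, limit in enumerate(thresholds):
        if distance < limit:
            return lod
    return len(thresholds)  # past the last threshold: sprite/imposter

print([select_lod(d) for d in (5.0, 20.0, 50.0, 200.0)])  # [0, 1, 2, 3]
```

Real engines also blend or add hysteresis between levels so the swap doesn't visibly pop.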
Violater said:
> I understand that, but the general knee-jerk reaction to launch titles will play a huge role in the overall perception. I for one was disappointed with early PS3 and even 360 titles.

The perception is fairly meaningless when discussing the actual technical performance of the systems.
rapid32.5 said:
> Realistic expectations. I don't mind a resolution trade-off either, if devs want to push more on the screen. Insomniac were pushing Ratchet and Clank TOD at 720p/60fps at first, and the game looked gorgeous.

None of the R&Cs are 720p.
Tenck said:
> What the PS4 or the 720 will do, PCs beat them to it 5 years ago.

Do let me know what PC game looked like this:
http://www.youtube.com/watch?v=pJPnClZQTXc&hd=1
or this:
http://ps3media.ign.com/ps3/image/a...rts-fight-night-round-3-20061016111558441.jpg
in 2001. Or like this:
http://www.youtube.com/watch?v=G-sHalDn3UA&hd=1
in 2000.

Metalmurphy said:
> Do let me know what PC game looked like this in 2001, or like this in 2000.

Don't even go launch; the consoles don't change spec, and the poster didn't specify launch. God of War 3 pretty much outclasses anything from before Crysis that I know of. What's generally considered the best-looking 360 game? Gears 2, I guess? I thought Alan Wake, but I know it got massive GAF hate visually.
2San said:
> I'm honestly expecting the games to look like that Samaritan UE3 demo, though. :O
SolarPowered said:
> Yes, people actually expect the new consoles to make the Wii U look as underpowered as the Wii. The only way that would happen is if the 360 or PS3 packed SLI/Crossfire configs of the highest caliber and Nintendo built a bottom-of-the-barrel AMD/Nintendo GPU.

Of course the disparity won't be as large as 360 -> Wii or PS3 -> Wii, but you're kidding yourself if you think the Wii U will be able to keep up with consoles released at minimum a year later.
Zerokku said:
> Expectations are way, way off. Not because of graphical capability, but $$$. We've seen more than a handful of developers go bankrupt because a game didn't sell a million copies. We've seen a lot of small Japanese studios move toward handhelds and DD because they simply can't afford the development costs and mass production in the console scene. Can the industry afford even more exorbitant development costs?
> We'll see a noticeable jump, but it won't be nearly as large as people are expecting.

There is no reason there will be a development cost increase.
Linkzg said:
> Isn't a goal for Frostbite 2 and CryEngine 3 to make some of those tasks easier? At least I remember reading/watching an interview, specifically about the tools becoming more efficient in general.

I'm guessing you are talking about the lighting and shadows, with the global illumination from both engines?
PortTwo said:
> Observe the 4GHz "limit" of PCs over the last few years.

That's not how it works.
StuBurns said:
> Don't even go launch; the consoles don't change spec, and the poster didn't specify launch.
> God of War 3 pretty much outclasses anything from before Crysis that I know of.
> What's generally considered the best-looking 360 game? Gears 2, I guess? I thought Alan Wake, but I know it got massive GAF hate visually.

onQ123 said:
> I'm thinking Sony will have a 32-SPU Cell with a 16-core GPU, and it will blow away what you see on PC right now, because no one is pushing high-end PCs to the limit.

I seriously doubt that anything will look better than The Witcher 2 or Battlefield 3 on next-gen consoles.
saunderez said:
> Of course the disparity won't be as large as 360 -> Wii or PS3 -> Wii, but you're kidding yourself if you think the Wii U will be able to keep up with consoles released at minimum a year later.

The Wii U is already nearly running the CryEngine, it's running UE3, and it's using the engine made for Assassin's Creed. It'll hold its own no matter what any of us say at this point.

SolarPowered said:
> I meant to type 720 and PS4, lol. The Wii U is already nearly running the CryEngine, it's running UE3, and it's using the engine made for Assassin's Creed. It'll hold its own no matter what any of us say at this point.

And? The 360 is running all of those engines; that doesn't mean anything. The Wii U will be lacking in power compared to its competitors, and that will be obvious when games are ported from the new consoles to the Wii U.
Izayoi said:
> I seriously doubt that anything will look better than The Witcher 2 or Battlefield 3 on next-gen consoles.

PortTwo said:
> For multicore, yes. It's a good point, and I probably should have been more specific. But those are out-of-order procs, not directly comparable.
rapid32.5 said:
> I expect no sub-720p games ever, and 60fps for AAA titles; 1080p at 30fps, rock solid. If that's not possible, Sony/MS fail in my book.