
Reality of console visuals surpassing PC visual fidelity

mephixto

Banned
Yeah, like Crysis 1?

You ran that at 60fps@1080p when it was released?

That is how PC gaming used to be. The scalability meant you could run it at an OK framerate on high-end hardware.

Not like it is today, when an average PC can run the latest multiplatform games at 1080p@60fps, simply because the game was developed with the 360 in mind.

Once the base moves on to next-gen consoles, PC is going back to sub-30fps.

Crysis 1 to this day runs worse than Crysis 2 on my PC (680 SLI), and that's because Crysis 1 is the pinnacle of bad engine optimization.
 

dark10x

Digital Foundry pixel pusher
Yeah, like Crysis 1?

You ran that at 60fps@1080p when it was released?

That is how PC gaming used to be. The scalability meant you could run it at an OK framerate on high-end hardware.

Not like it is today, when an average PC can run the latest multiplatform games at 1080p@60fps, simply because the game was developed with the 360 in mind.

Once the base moves on to next-gen consoles, PC is going back to sub-30fps.
Yep, that's how it always was.

Crysis 1 to this day runs worse than Crysis 2 on my PC, and that's because Crysis 1 is the pinnacle of bad engine optimization.
The pinnacle of bad engine optimization? What a crock of shit.

It is not super optimized or anything, but that kind of claim makes it sound like a broken piece of shit.
 

Salsa

Member
Yeah, like Crysis 1?

You ran that at 60fps@1080p when it was released?

That is how PC gaming used to be. The scalability meant you could run it at an OK framerate on high-end hardware.

Not like it is today, when an average PC can run the latest multiplatform games at 1080p@60fps, simply because the game was developed with the 360 in mind.

Once the base moves on to next-gen consoles, PC is going back to sub-30fps.

Give me more examples of well-optimized games that no high-end machine could run better than a console.
 
No, it won't. There is not a single reason for it. There won't be anything revolutionary or really high-end like the Cell, or Xenon + Xenos with eDRAM, in them.

Look at benchmarks of current games. A GTX 680 runs Far Cry 3 at exactly 1080p/60fps. Same for Hitman and AC3. Those are current-gen titles. Add anything substantial to these games graphics-wise and your 60 fps are gone.

Of course you can always make the argument for SLI and six core CPUs or whatever but I'm talking about a standard high-end PC, not the top-end. When it comes to PCs there's always some machine somewhere that will run everything.
 

Serra

Member
The biggest factor to discuss is cost, really. Do you want to spend $1200 on a fairly high-end gaming PC, or do you want to spend $399 for a console that can do almost what a $1200 PC can? I say almost because developers can write code straight to the metal on a console, since it's a fixed platform. PC requires writing for the lowest common denominator, and also a hypervisor of sorts... so you are never going to be able to squeeze out all the power of a PC, whereas you definitely can with a console.

The average consumer doesn't want to manage drivers, edit INI files, sit at a computer, or deal with random crashes, etc. The average consumer will always pick the cheaper and better-optimized option.

The average consumer will buy a console because it's arguably the better option. The enthusiast, on the other hand, would argue the opposite, with just cause. It's just that the enthusiast is the enthusiast, and their platform is never going to be the best bang-for-the-buck option. PC gamers can argue until they're blue in the face, but it will continue to fall on deaf ears. People don't care that your $1200 PC can edge out more IQ... they really don't.

I'm a PC gamer as well.

All true.

But this OP is about whether the new consoles' games will look better on PC or on the console. Price is not an issue, as stated in the OP. I don't think anyone except people like Shadow of the BEAST can make an argument that the PC does not have better IQ.
 

mephixto

Banned
Yep, that's how it always was.


The pinnacle of bad engine optimization? What a crock of shit.

It is not super optimized or anything, but that kind of claim makes it sound like a broken piece of shit.

Explain to me, then, why that pile of crap runs ~20fps lower than Crysis 2.
 

LCGeek

formerly sane
Here's the thing: Assuming 720 and PS4 titles are @ native 1080p, what major graphical benefits would the PC have over consoles? The biggest problem has always been resolution, not frame rate, when talking about the most immediately visible aspect of graphics.

So if, and it's still a big IF, next-gen consoles hit that 1080p mark, I see no reason why they can't keep up with PC for years. Why? Because barely anyone goes higher than 1080p, like a tiny, tiny percentage would even consider that. So if the consoles "check that box", next-gen is going to be very exciting. This is exactly why I'm so hyped up about it.

Couple all this with the fact that 1st-party next-gen games (especially from Sony's 1st-party studios) are going to take 100% advantage of the hardware, and I wouldn't be surprised at all to see the "best-looking game" title given to a console game for years on end.

1080p isn't a high mark for PC gamers; anyone with an overclocked i5 and a newer AMD or Nvidia card can go well beyond that and still keep their frames. The only thing holding back some PC gamers now is their monitor capabilities, be it a single- or multi-display setup. The point isn't that very few can do it; it's that we can do it, period, and do it well in a way a console can't. That's where the bulk of enthusiast PC gamers go. Also, I will sum up one big advantage of a PC over a console: my GPU control panel. Fucking consoles need to grow up and give me something like Nvidia Inspector, or the ability to alter my profiles to the same degree. As long as we can supersample and jack up the AF, AO, and other features of games, console games will always be mid-tier.

Consoles always enjoy a year or two of benefit, yet it's quickly erased almost every gen. Console devs right now are making those titles, and certain PC devs like Crytek are already starting to push their stuff with amazing results.

Also, Crysis 1 is the PC exception when talking graphics. You can configure so much, and when you push it, it can still choke high-end machines. The fact that you can is a testament to why PC gamers at times like the PC, whereas on a console the game is stuck as it is, glitches, bugs, performance issues and all.

$1200 for a high-end PC is fine, but the GAF PC thread has builds starting in the low $400s that can still perform. Stop repeating this bad meme like it has merit; it hasn't had any, basically since Crysis 1, when build threads started showing up on the site.
 

King_Moc

Banned
We are talking about the aficionados, aren't we? The OP states that price doesn't matter. I doubt a $1500 rig won't be able to play next-gen titles at 1080p/60fps.

Square have said their tech demo ran on a single GeForce 680 at 60fps. I believe Epic have now said the same of Samaritan.

The biggest factor to discuss is cost, really. Do you want to spend $1200 on a fairly high-end gaming PC, or do you want to spend $399 for a console that can do almost what a $1200 PC can? I say almost because developers can write code straight to the metal on a console, since it's a fixed platform. PC requires writing for the lowest common denominator, and also a hypervisor of sorts... so you are never going to be able to squeeze out all the power of a PC, whereas you definitely can with a console.

The average consumer doesn't want to manage drivers, edit INI files, sit at a computer, or deal with random crashes, etc. The average consumer will always pick the cheaper and better-optimized option.

The average consumer will buy a console because it's arguably the better option. The enthusiast, on the other hand, would argue the opposite, with just cause. It's just that the enthusiast is the enthusiast, and their platform is never going to be the best bang-for-the-buck option. PC gamers can argue until they're blue in the face, but it will continue to fall on deaf ears. People don't care that your $1200 PC can edge out more IQ... they really don't.

I'm a PC gamer as well.

Much of the point of going the PC route is that you are no longer reined in by what the console developer thought was a reasonable cost. Different people have different incomes. If I think that a new PC is worth £1200, then I can choose to spend that. With PCs nowadays, you'll probably find that a £400 one can run the same games, just not as well.

Yeah, like Crysis 1?

You ran that at 60fps@1080p when it was released?

That is how PC gaming used to be. The scalability meant you could run it at an OK framerate on high-end hardware.

Not like it is today, when an average PC can run the latest multiplatform games at 1080p@60fps, simply because the game was developed with the 360 in mind.

Once the base moves on to next-gen consoles, PC is going back to sub-30fps.

No, but Doom 3 and Half-Life 2 sure did (2004). And countless games before that. Unreal, AVP; I ran them all in HD at 60fps. Crysis was an anomaly. They specifically stated that they were targeting future hardware with it, and that you wouldn't be able to run the ultra setting for a couple of years.
 

Grief.exe

Member
Look at benchmarks of current games. A GTX 680 runs Far Cry 3 at exactly 1080p/60fps. Same for Hitman and AC3. Those are current-gen titles. Add anything substantial to these games graphics-wise and your 60 fps are gone.

Of course you can always make the argument for SLI and six core CPUs or whatever but I'm talking about a standard high-end PC, not the top-end. When it comes to PCs there's always some machine somewhere that will run everything.

And they run those games at 1080p/60fps with performance to spare, usually well below 50-60% usage.

Six-core CPUs and Hyper-Threading have almost no effect on gaming currently. Very few games are properly multithreaded.
 

dark10x

Digital Foundry pixel pusher
Give me more examples of well-optimized games that no high-end machine could run better than a console.
Are you kidding me? That is how it always was prior to, say, 2007 or so. PC gaming was always about pushing the limits of what hardware could deliver. PC gamers today are fucking spoiled or have a very short memory.

Back in the day you'd bring home the newest Origin game (something like Crusader, System Shock, or a new flight sim) and you'd be lucky to get a smooth framerate on even a high-end rig. Things improved once 3D cards hit but even then 60 fps was far from common.

Anti-aliasing wasn't even really a usable thing until 2003/2004.

This was the first generation in PC history where you could expect to max out a game on day 1 at a perfect framerate with image quality enhancements.

No, but Doom 3 and Half-Life 2 sure did (2004). And countless games before that. Unreal, AVP; I ran them all in HD at 60fps.
Oh? Which video card did you run Unreal on? Which resolution?

You're remembering wrong.
 
The biggest factor to discuss is cost, really. Do you want to spend $1200 on a fairly high-end gaming PC, or do you want to spend $399 for a console that can do almost what a $1200 PC can? I say almost because developers can write code straight to the metal on a console, since it's a fixed platform. PC requires writing for the lowest common denominator, and also a hypervisor of sorts... so you are never going to be able to squeeze out all the power of a PC, whereas you definitely can with a console.

The average consumer doesn't want to manage drivers, edit INI files, sit at a computer, or deal with random crashes, etc. The average consumer will always pick the cheaper and better-optimized option.

The average consumer will buy a console because it's arguably the better option. The enthusiast, on the other hand, would argue the opposite, with just cause. It's just that the enthusiast is the enthusiast, and their platform is never going to be the best bang-for-the-buck option. PC gamers can argue until they're blue in the face, but it will continue to fall on deaf ears. People don't care that your $1200 PC can edge out more IQ... they really don't.

I'm a PC gamer as well.

The reality is that the PC has become the platform for serious gamers. It used to be the other way around: if you wanted the best games, you went console. Now it would appear that consoles have reached the mainstream; they're plug and play, relatively affordable, and there's very little to mess about with, a.k.a. limited.

PCs are messy and expensive, require a lot of research and tweaking, and then there's the wild, wild world of modding. PC gaming is strictly in the enthusiast realm; it's definitely not for the average consumer.
 
Are you kidding me? That is how it always was prior to, say, 2007 or so. PC gaming was always about pushing the limits of what hardware could deliver. PC gamers today are fucking spoiled or have a very short memory.

Back in the day you'd bring home the newest Origin game (something like Crusader, System Shock, or a new flight sim) and you'd be lucky to get a smooth framerate on even a high-end rig.
Anti-aliasing wasn't even really a usable thing until 2003/2004.
Things improved once 3D cards hit but even then 60 fps was far from common.

This was the first generation in PC history where you could expect to max out a game on day 1 at a perfect framerate with image quality enhancements.


Oh? Which video card did you run Unreal on? Which resolution?

You're remembering wrong.

Nailed it.
 

Salsa

Member
Look at benchmarks of current games. A GTX 680 runs Far Cry 3 at exactly 1080p/60fps. Same for Hitman and AC3. Those are current-gen titles. Add anything substantial to these games graphics-wise and your 60 fps are gone.

This argument isn't really valid. GPUs are made and sold according to the tech used to make games at the moment. Not only that, but there are even older games that look and run way better than Far Cry 3 (which, btw, I'm running at 80+ fps 99 percent of the time at 2048x1152 on a single card).
 
As far as a standalone pre-built PC goes, sure, it could happen.

Whereas if an individual with any minuscule knowledge of PC gaming were to build a computer with literally the best available parts (at the time), there is no way in hell a console would stand a chance.

Don't get me wrong: I would love a console that costs $599, has a CPU comparable to a high-end i7, a GTX 680, a 1TB HDD, and 32 gigs of RAM, uses Blu-ray, and supports a grandiose variety of surround sound options. But it won't happen.

It's just too expensive for the big 3 to do that. All consoles will release with "last year's" tech.
 
Explain to me, then, why that pile of crap runs 20fps lower than Crysis 2.

The fact that the biggest Crysis 2 level is a fraction of the size of all but the last Crysis level, all the while being completely static in nature? The (initial) lack of one of Crysis' heavier-hitting effects, namely POM, and greatly reduced amounts of per-object motion blur?

The only reason people carry the misconception that C2 is "way more optimized" (a bullshit buzz phrase) is that C2 was a significantly shallower and less ambitious game than its predecessor.
 
This was the first generation in PC history where you could expect to max out a game on day 1 at a perfect framerate with image quality enhancements.

You're remembering wrong.

I maxed out Quake II at around 100 fps back in 1998 running 3dfx in the OG SLI mode. But yeah, no image quality enhancements; it's a first for me. This PC gen has been a revelation.
 

Salsa

Member
Are you kidding me? That is how it always was prior to, say, 2007 or so. PC gaming was always about pushing the limits of what hardware could deliver. PC gamers today are fucking spoiled or have a very short memory.

Back in the day you'd bring home the newest Origin game (something like Crusader, System Shock, or a new flight sim) and you'd be lucky to get a smooth framerate on even a high-end rig. Things improved once 3D cards hit but even then 60 fps was far from common.

Anti-aliasing wasn't even really a usable thing until 2003/2004.

This was the first generation in PC history where you could expect to max out a game on day 1 at a perfect framerate with image quality enhancements.


Oh? Which video card did you run Unreal on? Which resolution?

You're remembering wrong.

How is this responding to what I'm asking? We are comparing PCs to consoles, afaik.
 

Grief.exe

Member
As far as a standalone pre-built PC goes, sure, it could happen.

Whereas if an individual with any minuscule knowledge of PC gaming were to build a computer with literally the best available parts (at the time), there is no way in hell a console would stand a chance.

Don't get me wrong: I would love a console that costs $599, has a CPU comparable to a high-end i7, a GTX 680, a 1TB HDD, and 32 gigs of RAM, uses Blu-ray, and supports a grandiose variety of surround sound options. But it won't happen.

It's just too expensive for the big 3 to do that. All consoles will release with "last year's" tech.

I seriously wish that consoles would charge that much for launch units. Put some top-of-the-line parts in there so they don't become such a joke at the end of the life cycle.
 

LCGeek

formerly sane
I maxed out Quake II at around 100 fps back in 1998 running 3dfx in the OG SLI mode. But yeah, no image quality enhancements; it's a first for me. This PC gen has been a revelation.

3dfx cards were great for speed. TNT was about image quality; give me that 32-bit color and AA, plz.
 
Just as an example, I have yet to see a PC game as good as Beyond in character design.

You've yet to see a PC game look as good as the promo shots of a game you haven't played? I'm all for comparisons if needs be, but those comparing shots of two unreleased games [including some bullshots no less] aren't adding anything to that kind of discussion except confusion.
 
As you know, prior to the 360, or before 2005.

Most new PC games ran sub-30fps, because they actually pushed the hardware.

It wasn't until the PC community moved on to the 360 and set it as the lowest common denominator that we got the modern pristine image quality and high framerates that newbie PC aficionados seem to think have been the status quo for 30 years.

Lol. You're out of your fucking mind.
 

Reiko

Banned
The fact that the biggest Crysis 2 level is a fraction of the size of all but the last Crysis level, all the while being completely static in nature? The (initial) lack of one of Crysis' heavier-hitting effects, namely POM, and greatly reduced amounts of per-object motion blur?

The only reason people carry the misconception that C2 is "way more optimized" (a bullshit buzz phrase) is that C2 was a significantly shallower and less ambitious game than its predecessor.

C2 in DX9 was more optimized. It's once you hit DX11 that Crysis 2 reverts to the classic choke point.
 

shandy706

Member
I'd almost bet half the people (more?) arguing that PC always plays games easily, or blows past 60fps, can't even run Battlefield 3, Assassin's Creed 3, or Max Payne 3 on Ultra/Max with 16xAA (or even 4-8xAA) and obtain even 30-40fps. I bet many of you can't even get 30fps.

LOL

I have a liquid cooled i7, an overclocked 2GB GTX 660 Ti, 9GB RAM (triple channel) and I JUST...JUST...manage 60 frames per second at 8-16x AA on those games. I see 42fps in the downtown areas when the action really picks up in AC:III.

My best friend, who has a similar setup with a GTX 670, hits the 45fps mark in the same area. We both manage close to 60 everywhere else. We both manage 60-plus in Max Payne 3 and Battlefield 3.

Next-gen games will look better than the above games... period. Microsoft and Sony should have no problem doing so.

NO, I DO NOT THINK... in fact, I almost guarantee... my PC with its new/recent tech will not handle "next gen" games at 1080p/60fps. Not on Ultra/Max settings.

Heck, look at Crysis 3. I played it, and with EVERYTHING on Very High (there's no Ultra setting) it brought my system to its knees. I can see Crysis 3 being an example of next-gen console capability... perhaps.
 

XiaNaphryz

LATIN, MATRIPEDICABUS, DO YOU SPEAK IT
A new generation of consoles will once again push PC graphics. Even now, other CGI-centric industries aren't pushing graphics anywhere near the real-time rendering demands that video games put on systems.

?

Are you saying what current renderers in games are doing is more system taxing than what CG houses are doing?
 
3dfx cards were great for speed. TNT were about image quality give me that 32bit color and aa plz.

Yeah, I played with the Riva after that; I was really amazed by the real-time videos playing on the billboards in G-Police, a feature only available on those cards. Good times!

Amazingly, I spent the same amount of money on the dual 3dfx cards as I did on the GTX 690 over a decade later, but the power-level comparison is just insane. To an old dude like me, it almost seems like a good deal :)
 

kinggroin

Banned
I maxed out Quake II at around 100 fps back in 1998 running 3dfx in the OG SLI mode. But yeah, no image quality enhancements; it's a first for me. This PC gen has been a revelation.


What enhancements were even there to take advantage of at the time? It ran at 60fps with the best image quality expected for the time on top-end hardware.

Yesterday's 1280x1024 (was it even that commonly high?) is today's 1080p.

Edit: In fact, weren't 3dfx cards limited to 800x600 or 1024x768 when SLI'd?
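As a rough aside, the raw pixel counts behind that comparison are easy to sketch (just the common display modes mentioned in this thread; purely illustrative):

```python
# Pixel counts for common display modes of each era,
# compared against today's 1080p.
resolutions = {
    "800x600": 800 * 600,
    "1024x768": 1024 * 768,    # roughly the Voodoo2 SLI-era ceiling
    "1280x1024": 1280 * 1024,
    "1920x1080": 1920 * 1080,  # 1080p
}

p1080 = resolutions["1920x1080"]
for name, px in resolutions.items():
    print(f"{name}: {px:,} pixels ({px / p1080:.0%} of 1080p)")
```

1280x1024 works out to about 63% of 1080p's pixel count, so the "yesterday's high res vs today's" jump is real but not enormous.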
 

LCGeek

formerly sane
I'd almost bet half the people (more?) arguing that PC always plays games easily, or blows past 60fps, can't even run Battlefield 3, Assassin's Creed 3, or Max Payne 3 on Ultra/Max with 16xAA (or even 4-8xAA) and obtain even 30-40fps. I bet many of you can't even get 30fps.

LOL

I have a liquid cooled i7, an overclocked 2GB GTX 660 Ti, 9GB RAM (triple channel) and I JUST...JUST...manage 60 frames per second at 8-16x AA on those games. I see 42fps in the downtown areas when the action really picks up in AC:III.

My best friend, who has a similar setup with a GTX 670, hits the 45fps mark in the same area. We both manage close to 60 everywhere else. We both manage 60-plus in Max Payne 3 and Battlefield 3.

Next-gen games will look better than the above games... period. Microsoft and Sony should have no problem doing so.

NO, I DO NOT THINK... in fact, I almost guarantee... my PC with its new/recent tech will not handle "next gen" games at 1080p/60fps. Not on Ultra/Max settings.

Heck, look at Crysis 3. I played it, and with EVERYTHING on Very High (there's no Ultra setting) it brought my system to its knees. I can see Crysis 3 being an example of next-gen console capability... perhaps.

Because owning a PC isn't necessarily about maxing it out, nor should it be. The point is that most mid-range and high-end PC gamers can roll over whatever console gamers get, say, three years into a generation. Consoles can't even enter this debate; if you were to stick most console games on the higher-end settings like you're mentioning for the PC, they couldn't do a solid 30.

Yeah, I played with the Riva after that; I was really amazed by the real-time videos playing on the billboards in G-Police, a feature only available on those cards. Good times!

Amazingly, I spent the same amount of money on the dual 3dfx cards as I did on the GTX 690 over a decade later, but the power-level comparison is just insane. To an old dude like me, it almost seems like a good deal :)

I totally feel that; I just moved off the old quad/C2D onto the i5/i7 era and it's good. I'm amazed at how far about 700 bucks takes me. If you'd asked a version of myself in my teens whether I'd be doing this on a console or a PC, I would've laughed and said arcade hardware forever.
 

mephixto

Banned
The fact that the biggest Crysis 2 level is a fraction of the size of all but the last Crysis level, all the while being completely static in nature? The (initial) lack of one of Crysis' heavier-hitting effects, namely POM, and greatly reduced amounts of per-object motion blur?

The only reason people carry the misconception that C2 is "way more optimized" (a bullshit buzz phrase) is that C2 was a significantly shallower and less ambitious game than its predecessor.

Did Crysis 1 have these: Tessellation, Displacement Mapping, Realtime Local Reflections and Tone Mapping, Bokeh Filters, Ambient Occlusion, Hi-res textures?

No
 

Salsa

Member
LOL

I have a liquid cooled i7, an overclocked 2GB GTX 660 Ti, 9GB RAM (triple channel) and I JUST...JUST...manage 60 frames per second at 8-16x AA on those games. I see 42fps in the downtown areas when the action really picks up in AC:III.

My best friend, who has a similar setup with a GTX 670, hits the 45fps mark in the same area. We both manage close to 60 everywhere else. We both manage 60-plus in Max Payne 3 and Battlefield 3.

Because the 660 Ti is mid-range at best, with a ridiculous 192-bit memory bus, and AC3 is well known to be a terribly optimized game?

Holy bad post, Batman.
 
Square have said their tech demo ran on a single GeForce 680 at 60fps. I believe Epic have now said the same of Samaritan.

Samaritan ran at 30 fps

When they made it run on a single GTX 680 instead of three GTX 580s, they bumped the resolution down to 720p too.


The UE4 editor demo they showed (also running on a single GTX 680) ran at about 40 fps, with not much going on, and also not at full 1080p resolution.
http://www.youtube.com/watch?v=MOvfn1p92_8&feature=player_detailpage#t=468s
 

King_Moc

Banned
You've yet to see a PC game look as good as the promo shots of a game you haven't played? I'm all for comparisons if needs be, but those comparing shots of two unreleased games [including some bullshots no less] aren't adding anything to that kind of discussion except confusion.

And using Beyond as the example, of all games. Yeah, the faces look good, because they skimp everywhere else. Plus there's no physics or gameplay to speak of. Anyone that played Heavy Rain knows exactly how this one's going to pan out graphically.
 

Kokonoe

Banned
Consoles cannot surpass PCs.

Computers can constantly be upgraded, and have much more room for bigger, more powerful parts.
 

Reiko

Banned
Because owning a PC isn't necessarily about maxing it out, nor should it be. The point is that most mid-range and high-end PC gamers can roll over whatever console gamers get, say, three years into a generation. Consoles can't even enter this debate; if you were to stick most console games on the higher-end settings like you're mentioning for the PC, they couldn't do a solid 30.

People need benchmark games to see if their rigs are powerful enough for the latest games.

Games like Crysis were more than just a game; they were also an entry-level benchmarking tool.


Crysis 2 DX11 runs at a smooth 60fps at Ultra on a single 680; on an SLI config it never drops below 90 and tops out up to 200fps.


Well now it does. At the time of release it wasn't so easy.
 
What enhancements were even there to take advantage of at the time? It ran at 60fps with the best image quality expected for the time on top end hardware.

Yesterday's 1280x1024 (was it even that commonly high?) Is today's 1080p.

There wasn't, hence dark10x's point.

Funnily enough, running games at 1080p was a little disappointing for me; even knowing that 1080p is 2.25 times the pixel count of 720p, the image is still not sharp or clear enough for me. Thankfully, downsampling fixed that.

So yeah, back on topic: if the next-gen consoles only do 1080p, it won't be good enough for me on a visual-fidelity level compared to what I can get on the PC.
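The pixel-count arithmetic behind the 720p/1080p comparison and downsampling is easy to check (a quick sketch; 2880x1620 is just one hypothetical downsampling resolution for a 1080p display):

```python
def pixels(width, height):
    """Total pixel count of a resolution."""
    return width * height

p720 = pixels(1280, 720)     # 921,600
p1080 = pixels(1920, 1080)   # 2,073,600
p4k = pixels(3840, 2160)     # 8,294,400

# 1080p is 2.25x the pixels of 720p; the "4x" jump is 1080p -> 4K.
print(p1080 / p720)  # 2.25
print(p4k / p1080)   # 4.0

# Downsampling: render at 2880x1620, scale down to a 1080p display,
# and each output pixel is built from 2.25 rendered samples.
print(pixels(2880, 1620) / p1080)  # 2.25
```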
 
Did Crysis 1 had these: Tessellation, Displacement Mapping, Realtime Local Reflections and Tone Mapping, Bokeh Filters, Ambient Occlusion, Hi-res textures?

No (neither did C2 at launch, and when it got it, performance fucking tanked); yes; no (neither did C2 at launch); no; yes; yes.

And that's without acknowledging the existence of xZero's (now cry-Steve's) shaders, which added all those features to CE2; he was hired on by Crytek as a result.
 
I totally feel that; I just moved off the old quad/C2D onto the i5/i7 era and it's good. I'm amazed at how far about 700 bucks takes me. If you'd asked a version of myself in my teens whether I'd be doing this on a console or a PC, I would've laughed and said arcade hardware forever.

There's definitely a big shift happening. Maybe it's because this console generation dragged on too long, or maybe because gaming has gone mainstream. But right now, for serious gamers (hey, this is still GAF!), the PC is the most exciting frontier. Can't wait for Japanese devs like Platinum Games to jump on board.

Japanese art style/game design + high res + 60 fps + copious amounts of AA = old-school arcade gaming heaven.
 

LCGeek

formerly sane
People need benchmark games to see if their rigs are powerful enough for the latest games.

Games like Crysis were more than just a game; they were also an entry-level benchmarking tool.

Not denying that, but don't try to say average consumers use benchmarks or such tools as frequently as you or I would. Most people using benchmarks are in enthusiast territory, as most of that culture is, regardless of how much they have to spend. Average consumers do not care for most of the advanced tools PCs offer that make games better or help them look or play like they should.

There wasn't, hence dark10x's point.

Funnily enough, running games at 1080p was a little disappointing for me; even knowing that 1080p is 2.25 times the pixel count of 720p, the image is still not sharp or clear enough for me. Thankfully, downsampling fixed that.

So yeah, back on topic: if the next-gen consoles only do 1080p, it won't be good enough for me on a visual-fidelity level compared to what I can get on the PC.

Because it is; I came down to 1080p off a 1200p 16:10 monitor. It's a decent 16:9-based resolution, but it was made to be the standard for TVs, which isn't saying all that much considering how much better monitors are in terms of tech progression.
 
Did Crysis 1 have these: Tessellation, Displacement Mapping, Realtime Local Reflections and Tone Mapping, Bokeh Filters, Ambient Occlusion, Hi-res textures?

No
Actually, it did have displacement mapping (parallax occlusion mapping, to be accurate), tone mapping (though not with a filmic S-curve like they're touting for Crysis 2), ambient occlusion, and hi-res textures (in fact, the textures Crysis 2 launched with were lower-resolution than Crysis's, hence the PC gamer outcry). The only things that are new in Crysis 2 are the DX11 features (tessellation and real-time local reflections) and the bokeh (which can easily be done in DX9).


That aside, this topic is incredibly silly. We don't know what kind of silicon the hardware makers are going to be launching with. They could be launching with silicon on par with a 680. That would make them high-end for the first year. Then they'd fall behind again. (Well, they're already behind me with my 2x GTX 670s, but obviously I'm an outlier there.)

Also, the comparison between PC versions and console versions as a way to suggest that one is more optimized than the other is horse shit. PC ports frequently have huge image quality gains as well as graphics options that simply do not appear on the console versions. If you want to make a like:like comparison you actually have to set the graphics settings equal to each platform's respective version.

And don't even bring up console Crysis, that one is clearly running at medium settings. Not high, not very high, medium. That should tell you enough about how big the gulf is between them now.
 

kinggroin

Banned
There wasn't, hence dark10x's point.

Funnily enough, running games at 1080p was a little disappointing for me; even knowing that 1080p is nearly 4 times the pixel count of 720p, the image is still not sharp nor clear enough for me. Thankfully downsampling fixed that.

So yeah, back on topic: if the next-gen consoles only do 1080p, it won't be good enough for me on a visual fidelity level compared to what I can get on the PC.

This was the post that set it all off:

http://m.neogaf.com/showpost.php?p=45279066

While there is SOME truth to the end point (mainly in regards to image quality), it's still a false blanket statement that doesn't take into account that PCs aren't limited by fixed hardware. While there are some software exceptions, many launch performance issues could be circumvented with obscenely priced hardware purchases.

I mean, saying that MOST games prior to 2005 ran at less than 30fps at launch is a silly thing to say, no?
 

KKRT00

Member
The fact that the biggest Crysis 2 level is a fraction of the size of all but the last Crysis level, all the while being completely static in nature? The (initial) lack of one of Crysis' heavier-hitting effects, namely POM, and the greatly reduced amount of per-object motion blur?

The only reason people carry the misconception that C2 is "way more optimized" (a bullshit buzz phrase) is that C2 was a significantly shallower and less ambitious game than its predecessor.

Crysis 2 had smaller levels, but with more complicated geometry, more varied textures and more demanding lighting conditions. Motion blur was much better in C2's DX9 mode than it was in C1; DX11 just added a per-pixel HDR-correct version. POM was out because you can't have POM and anisotropic filtering together in DX9 mode.

And C2 is much more optimized. What does "shallower" even mean? C2 is superior to C1 in every way as far as tech goes. The engine is heavily multithreaded on both the GPU and CPU side and scales down and up. Heck, C1 wasn't even deferred-rendered, and that changes a lot.

Look at benchmarks of current games. A GTX 680 runs Far Cry 3 at exactly 1080p/60fps. Same for Hitman and AC3. Those are current-gen titles. Add anything substantial to these games graphics-wise and your 60 fps are gone.

Of course you can always make the argument for SLI and six core CPUs or whatever but I'm talking about a standard high-end PC, not the top-end. When it comes to PCs there's always some machine somewhere that will run everything.

Awful examples. They aren't optimized properly in terms of multithreading or effects optimization. I'm pretty sure that SSDO, the best SSAO implementation to date, made by Crytek, is 4 or 5 times cheaper than any SSAO implementation in Far Cry 3 or Hitman.
And even though those games don't fully utilize the CPU and GPU, they don't run at console settings either. So that 680 not only renders games at double the frame rate and twice the resolution, but also with unoptimized MSAA, higher LOD, higher-res textures, and effects that each cost half of a console version's entire GPU budget.

The only things that are new in Crysis 2 are the DX11 features (tessellation & real-time local reflections) and the bokeh (which can easily be done in DX9).

And lighting :)

And don't even bring up console Crysis, that one is clearly running at medium settings. Not high, not very high, medium. That should tell you enough about how big the gulf is between them now.
Actually it's closer to high/very high with some medium settings.
 

shandy706

Member
Because the 660 Ti is mid-range at best, with a ridiculous 192-bit memory bus, and AC is well known as a terribly optimized game?

Holy bad post batman

Neither the 670 nor the 680 can hold 60+ either (on max settings at 1080p).

The 670 drops into the 40s, the 680 and 690 drop into the 50s (turn on TXAA and watch them hit 30s/40s). This is most obvious in Boston.

Compared to the 680, the non-overclocked 660 Ti's average minimum frame rate is only 10fps lower: 43fps vs 53fps at 1080p. At 2560x1600 the 680 drops into the 30s...but that's not really comparable.
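To put those numbers in per-frame budget terms (standard fps-to-milliseconds arithmetic, not data from the benchmark itself), the gap looks like this:

```python
# Converting the quoted frame rates into per-frame time budgets.
def frame_time_ms(fps):
    return 1000.0 / fps

budget_60fps = frame_time_ms(60)   # ~16.7 ms available per frame at 60fps
gtx680_min   = frame_time_ms(53)   # ~18.9 ms at the 680's 53fps minimum
gtx660ti_min = frame_time_ms(43)   # ~23.3 ms at the 660 Ti's 43fps minimum
```

Seen that way, a "10fps" gap at the low end is roughly 4.5 ms of extra render time per frame, which is why minimums matter more than averages for perceived smoothness.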



Anyway, on subject. I expect next-gen console games to look and run better than what pushes current high-end PCs to sub-30fps. I wouldn't expect more than 4-8x AA or some other form of AA though.
 