
But Can It Run Crysis? 10 years of flagship GPUs compared

MrOogieBoogie

BioShock Infinite is like playing some homeless guy's vivid imagination
I recently played through Warhead on an i5-6500 + GTX 1060 6GB + 16GB RAM and still saw framerates drop into the 30s during intense firefights. This is at 1680x1050, too. To be fair, though, the average framerate hovered between 85 and 100fps.
 

TaroYamada

Member
I first played Crysis on a GTX 460 1GB and I'm glad I could play it maxed out. It's one of my favorite FPS games and it deserves to be experienced at that fidelity.

Just to be clear, this was at like 1600x900 if memory serves, and I wasn't always hitting 60fps.
 

LordOfChaos

Member
Wasn't it more CPU than GPU? :x



Sorta, but I mean, the 8800 from the same era as Crysis still gets brutalized here on a modern CPU at 1080p


 
I got an 8800 GTX at the time partly to play that. I remember tweaking the config files for hours to squeeze every frame possible out of that thing.
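For the curious, most of that tweaking meant dropping cvar overrides into an autoexec.cfg in the game folder. A rough sketch from memory of the sort of thing people ran (the sys_spec/renderer cvar names are Crysis's real ones, but the values here are just illustrative, so check a proper tweak guide before copying):

-- autoexec.cfg, placed in the Crysis install directory
sys_spec_Shadows = 2           -- shadows to Medium, usually the biggest single win
sys_spec_Water = 2             -- cheaper ocean/water shaders
sys_spec_VolumetricEffects = 2
e_view_dist_ratio = 60         -- pull the draw distance in a bit
r_TexMaxAnisotropy = 4         -- 4x anisotropic filtering instead of 16x
r_VSync = 0                    -- uncap the framerate
r_DisplayInfo = 1              -- on-screen FPS counter while tuning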

Stalker came out the same year. What a great time for shooters.
 

Inviusx

Member
I remember being enraptured by the thought of owning an 8800GTX back then. Looking at it 10 years on makes me remember why I got out of PC gaming.
 
That transistor/framerate graph is awesome. Vega is quite a shame next to the 980 Ti: roughly 50% more transistors for almost the same performance.
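The gap is easy to put rough numbers on. A back-of-envelope sketch in Python; the transistor counts are the widely published die figures, but the fps values are placeholders standing in for whatever the article's chart shows, not its actual measurements:

# Perf per transistor, back of the envelope.
# Transistor counts: published die figures.
# fps values: illustrative placeholders only.
cards = {
    "8800 GTX (G80)":     (0.681, 13),    # (billions of transistors, fps)
    "GTX 980 Ti (GM200)": (8.0,   110),
    "Vega 64 (Vega 10)":  (12.5,  115),
}

for name, (xtors, fps) in cards.items():
    print(f"{name:20} {fps / xtors:6.1f} fps per billion transistors")

Whatever the exact fps, the transistor side of the eyeball checks out: Vega 10's 12.5B against GM200's 8.0B is about 56% more silicon for nearly the same framerate.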
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Sorta, but I mean, the 8800 from the same era as Crysis still gets brutalized here on a modern CPU at 1080p



The 8800 was my shit and it got abused by Crysis.
I upgraded a few years later to a GTX 260 and it still got its ass beat.

But I think it's worth noting that 1080p wasn't the go-to gaming resolution back then.
I believe 1366x768 was pretty big, and 1440x900 was also popular with monitors at the time.

1080p was still relatively niche.
 
I would love to give Crysis a new playthrough, but for the life of me I cannot get this fucker running on Windows 10. I have tried everything under the sun and nothing works. Ah well, such is life, but it's one hell of a black mark on Crytek's rep (if that still exists) that their game is so problematic.
 

LordOfChaos

Member
Isn't the game DX9? It's just not optimized for modern hardware.

First page of the article shows DX9/DX10 graphics comparisons. iirc DX10 was a patch later.

I really hope they make an HD trilogy. The 360 version plays, looks, and runs great, and on PC it would be amazing.

I actually didn't finish the 360 version because of the long stretches of 20fps. At first it was impressive that it was ported to 7th-gen consoles at all, but the sacrifices just made it unplayable for me.

1-3 remastered on the 8th gen would be something though, especially Pro/X.
 

rodrigolfp

Haptic Gamepads 4 Life
I recently played through Warhead on an i5-6500 + GTX 1060 6GB + 16GB RAM and still saw framerates drop into the 30s during intense firefights. This is at 1680x1050, too. To be fair, though, the average framerate hovered between 85 and 100fps.

You know that you can scroll down to see resolutions higher than 1680x1050?

Isn't the game DX9? It's just not optimized for modern hardware.

Default is DX10, but you can also run a DX9 exe.

First page of the article shows DX9/DX10 graphics comparisons. iirc DX10 was a patch later.

Not really. The DX10 option was there since day 1.
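If memory serves, you could also force the renderer from a shortcut instead of picking an exe; something like the switches below, though it's worth double-checking against the game's readme:

Crysis.exe -DX9   (forces the DirectX 9 renderer)
Crysis.exe -DX10  (forces the DirectX 10 renderer; Vista and up only)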

I really hope they make an HD trilogy. The 360 version plays, looks, and runs great, and on PC it would be amazing.

14-30fps never was and never will be great: https://www.youtube.com/watch?v=YwD2ty2UfBM
 

AP90

Member
980 Ti is a beast.

Agreed =)

And it's fairly easy to overclock. I remember playing Crysis on my 9800 GTX at 1600x900, and it took me months to get a stable OC on the core. I wanted every extra drop of performance out of that card just so Crysis could be smoother.
 
I tried the game again when I got my GTX 680. Unpatched, on max settings at 1080p, I constantly got under 30fps in the first area

Which I thought was insane
 

bobone

Member
Awesome article.
I had just built a shiny new box with a 3870 around that time and was happily playing everything at high settings and resolutions until I got Crysis. That game floored me, and it made me really want to buy a second GPU just to be able to run it.

I have a 1080 now, and it wasn't part of that test, so I'm going to have to give it a try myself.
 

Cerbero

Member
The 8800 was my shit and it got abused by Crysis.
I upgraded a few years later to a GTX 260 and it still got its ass beat.

But I think it's worth noting that 1080p wasn't the go-to gaming resolution back then.
I believe 1366x768 was pretty big, and 1440x900 was also popular with monitors at the time.

1080p was still relatively niche.

Actually, in 2007 1680x1050 was the go-to resolution, with the Samsung 226BW as the hottest girl in town.
 

rodrigolfp

Haptic Gamepads 4 Life
This just proves that Crysis is poorly optimized.

Crysis: Warhead runs better and without the jank, as do Crysis 2 and 3.

Not saying that C1 is well optimized, but the map sizes and physics of Crysis 2 and 3 are nowhere near the level of C1. And Warhead isn't much better to run.
 

Futaleufu

Member
I remember the DirectX 10 setting of this game was broken: in the last stage you would fall through the floor at random times.
 

Noleh8r

Neo Member
This just proves that Crysis is poorly optimized.

Crysis: Warhead runs better and without the jank, as do Crysis 2 and 3.

Crysis Warhead, Crysis 2, and Crysis 3 might as well be from a completely different series when compared to the sprawling levels of the original game.
 

120v

Member
Glad I wasn't a PC gamer then. I would've gone insane trying to get it to run OK, even if I had no particular interest in playing it.
 

old

Member
It's neat to see how much architecture generation plays a role in the Radeon cards. Big jump from TeraScale to GCN. Another big jump to GCN 2.

Some years it's a huge jump in hardware. Some years it's apparently just a small clock boost.
 

LordOfChaos

Member
It's neat to see how much architecture generation plays a role in the Radeon cards. Big jump from TeraScale to GCN. Another big jump to GCN 2.

Some years it's a huge jump in hardware. Some years it's apparently just a small clock boost.

Also, OG GCN got some classic AMD FineWine on the driver side and held up pretty well over time. They seem to have the unfortunate habit of leaving more performance on the table than their competitor on launch day, but then also supporting and tuning it longer.
 

LordOfChaos

Member
Why did they skip 1060/1050? (or am I blind)



Indeed:

[chart: transistor counts vs. performance]



Holy moly does Nvidia do a lot more with a lot fewer transistors. No wonder their relative margins are what they are.

Then again, AMD doesn't split compute cards from gaming cards like Nvidia does, so depending on how you look at it you either get exceptionally priced compute or a lot of transistor baggage for gaming.
 
980 Ti is a beast.
Arguably the best $200 I've ever spent was on one. I was pretty set at 1920x1080, but now at 3440x1440 I'm starting to feel its limits.

Still, amazing card. I don't ever want to upgrade if I can avoid it without paying that much again.
 

PantsuJo

Member
Could the Switch run Crysis?

Crysis 3 was being developed for Nvidia Shield (tablet and TV), so a Switch port is indeed possible given the nearly identical hardware.

You can find info about this port on the Nvidia Shield homepage or, as usual, on YouTube.
 