But Can It Run Crysis? 10 years of flagship GPUs compared

MrOogieBoogie

BioShock Infinite is like playing some homeless guy's vivid imagination
#3
I recently played through Warhead on an i5-6500 + GTX 1060 6GB + 16GB RAM and still saw framerates drop into the 30s during intense firefights. This is at 1680x1050, too. To be fair, though, the average framerate hovered between 85 and 100fps.
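That gap between an 85-100fps average and dips into the 30s is a good example of why averages alone don't tell the whole story. As a rough sketch (the frame-time numbers below are made up for illustration, not measurements from the article or this post), the usual approach is to average over frame times and report the slowest ~1% of frames separately:

```python
# Minimal sketch (hypothetical numbers): why an 85-100fps average can still
# hide drops into the 30s. The average is computed over frame times, and the
# slowest frames are reported separately as a "1% low" figure.
def fps_stats(frame_times_ms):
    """Return average fps and the fps of the slowest 1% of frames."""
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    worst = sorted(frame_times_ms)[-max(1, n // 100):]  # slowest 1% of frames
    one_pct_low_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, one_pct_low_fps

# 990 quick frames (~10.5 ms, ~95fps) plus 10 heavy firefight frames (30 ms, ~33fps)
samples = [10.5] * 990 + [30.0] * 10
avg, low = fps_stats(samples)
print(f"average: {avg:.0f}fps, 1% low: {low:.0f}fps")  # average ~94fps, 1% low ~33fps
```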
 
#4
I first played Crysis on a GTX 460 1GB and I'm glad I could play it maxed out. It's one of my favorite FPS games and it deserves to be experienced at that fidelity.

Just to be clear, this was at like 1600x900 if memory serves, and I wasn't always hitting 60FPS.
 
#12
I got an 8800 GTX at the time partly to play that. I remember tweaking the config files for hours to squeeze every frame possible out of that thing.

Stalker came out the same year. What a great time for shooters.
 
#13
I remember being enraptured by the thought of owning an 8800GTX back then. Looking at it 10 years on makes me remember why I got out of PC gaming.
 

Black_Stride

do not tempt fate do not constrain Wonder Woman's thighs do not do not
#15
Sorta, but I mean, the 8800 that was from the same era as Crysis is still brutalized here on a modern CPU at 1080p.


8800 was my shit and it got abused by Crysis.
Upgraded a few years later to a GTX 260 and it still got its ass beat.

But I think it's worth noting that 1080p wasn't the go-to gaming resolution.
I believe it was 1366x768 that was pretty big, and 1440x900 was also popular with monitors at that time.

1080p was still relatively niche.
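For a sense of scale, a quick bit of arithmetic over just the resolutions mentioned above shows how much more work 1080p asks of a GPU than the panels that were common in 2007 (this is plain pixel counting, not benchmark data from the article):

```python
# Rough pixel-count comparison: how much more work 1920x1080 asks of a GPU
# than the resolutions that were common around Crysis' 2007 launch.
resolutions = {
    "1366x768":  (1366, 768),
    "1440x900":  (1440, 900),
    "1680x1050": (1680, 1050),
    "1920x1080": (1920, 1080),
}

base_w, base_h = resolutions["1920x1080"]
base_pixels = base_w * base_h  # ~2.07 million pixels

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} MP ({pixels / base_pixels:.0%} of 1080p)")

# 1366x768  -> ~1.05 MP (~51% of 1080p)
# 1440x900  -> ~1.30 MP (~62% of 1080p)
# 1680x1050 -> ~1.76 MP (~85% of 1080p)
```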
 
#16
I would love to give Crysis a new playthrough, but for the life of me I cannot get this fucker running on Windows 10. I have tried everything under the sun to get it running, but nothing works. Ah well, such is life, but it's one hell of a black mark on Crytek's rep (if that still exists) that their game is so problematic.
 
#18
Isn't the game DX9? It's just not optimized for modern hardware.
First page of the article shows DX9/DX10 graphics comparisons. IIRC DX10 was added in a later patch.

I really hope they make an HD trilogy. The 360 version plays, looks, and runs great, and on PC it would be amazing.
I actually didn't finish the 360 version because of the long stretches of 20fps. At first it was impressive that it had been ported to 7th-gen consoles at all, but the sacrifices just made it unplayable for me.

Crysis 1-3 remastered on the 8th gen would be something though, especially on Pro/X.
 
#19
I recently played through Warhead on an i5-6500 + GTX 1060 6GB + 16GB RAM and still saw framerates drop into the 30s during intense firefights. This is at 1680x1050, too. To be fair, though, the average framerate hovered between 85 and 100fps.
You know that you can scroll down to see resolutions higher than 1680x1050?

Isn't the game DX9? It's just not optimized for modern hardware.
Default is DX10, but you can also run a DX9 exe.

First page of the article shows DX9/DX10 graphics comparisons. IIRC DX10 was added in a later patch.
Not really. The DX10 option was there since day 1.

I really hope they make an HD trilogy. The 360 version plays, looks, and runs great, and on PC it would be amazing.
14~30fps never was and never will be great: https://www.youtube.com/watch?v=YwD2ty2UfBM
 
#22
980 Ti is a beast.
Agreed =)

And it is fairly easy to overclock. I remember playing Crysis on my 9800 GTX at 1600x900, and it took me months to get a stable OC on the core. I wanted every extra drop of performance out of my card just so Crysis could be smoother.
 
#25
Awesome article.
I had just built a shiny new box with a 3870 around that time and was happily playing everything at high settings and resolution till I got Crysis. That game floored me. And made me really want to buy a second GPU to actually be able to run it.

I have a 1080 now, and it wasn't part of that test, so I'm going to have to give it a try myself.
 
#26
8800 was my shit and it got abused by Crysis.
Upgraded a few years later to a GTX 260 and it still got its ass beat.

But I think it's worth noting that 1080p wasn't the go-to gaming resolution.
I believe it was 1366x768 that was pretty big, and 1440x900 was also popular with monitors at that time.

1080p was still relatively niche.
Actually, in 2007 1680x1050 was the go-to resolution, with the Samsung 226BW as the hottest girl in town.
 
#36
It's neat to see how much architecture generation plays a role in the Radeon cards. Big jump from Terascale to GCN. Another big jump to GCN 2.

Some years it's a huge jump in hardware. Some years it's apparently just a small clock boost.
 
#37
It's neat to see how much architecture generation plays a role in the Radeon cards. Big jump from Terascale to GCN. Another big jump to GCN 2.

Some years it's a huge jump in hardware. Some years it's apparently just a small clock boost.
Also, OG GCN got some classic AMD FineWine treatment on the driver side and held up pretty well over time. They seem to have the unfortunate habit of leaving more performance on the table than their competitor on launch day, but then also supporting and tuning it for longer.
 
#41
Why did they skip 1060/1050? (or am I blind)

Indeed:

Holy moly, Nvidia does a lot more with a lot fewer transistors. No wonder their relative margins are what they are.

Then again, AMD doesn't split compute cards from gaming cards like Nvidia does, so depending on how you look at it you either get exceptionally priced compute or a lot of transistor baggage for gaming.
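One way to put rough numbers on the "more with fewer transistors" point is a simple performance-per-transistor ratio. The sketch below uses the commonly cited transistor counts for the era's flagship dies (GM200 in the 980 Ti at roughly 8.0 billion, Fiji in the Fury X at roughly 8.9 billion); the framerates are hypothetical placeholders, not results from the article's charts:

```python
# Back-of-the-envelope "performance per transistor" comparison.
# Transistor counts are the commonly cited approximate figures for each die;
# the fps numbers are hypothetical placeholders, NOT results from the article.
cards = {
    # name: (transistors in billions, average fps in some fixed test - placeholder)
    "GTX 980 Ti (GM200)": (8.0, 100.0),  # hypothetical fps
    "R9 Fury X (Fiji)":   (8.9,  90.0),  # hypothetical fps
}

for name, (transistors_b, fps) in cards.items():
    print(f"{name}: {fps / transistors_b:.1f} fps per billion transistors")

# With these placeholder numbers the Nvidia die lands at ~12.5 fps per billion
# transistors vs ~10.1 for Fiji - the kind of ratio the "more with fewer
# transistors" remark is pointing at, whatever the real fps figures are.
```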