LordOfChaos
Member
(11-14-2017, 02:39 AM)
LordOfChaos's Avatar
http://www.tomshardware.com/reviews/...arks,5329.html


Thought this was an interesting article despite the tired old meme, just to see how far we've come per generation over the last 10 years.
Beerman462
Member
(11-14-2017, 03:01 AM)
Beerman462's Avatar
OMG I can finally play at 60fps! GTX1070
MrOogieBoogie
BioShock Infinite is like playing some homeless guy's vivid imagination
(11-14-2017, 03:05 AM)
MrOogieBoogie's Avatar
I recently played through Warhead on an i5-6500 + 1060 GTX 6GB + 16GB RAM and still managed to see framerates drop into the 30s during intense firefights. This is at 1680x1050, too. To be fair, though, the average framerate hovered between 85 and 100fps.
TaroYamada
Member
(11-14-2017, 03:06 AM)
TaroYamada's Avatar
I first played Crysis on a GTX 460 1gb and I'm glad I could play it maxed out. It's one of my favorite FPS games and it deserves to be experienced at that fidelity.

Just to be clear this was at like 1600x900 if memory serves and I wasn't always hitting 60FPS.
DocEbok
Member
(11-14-2017, 03:38 AM)
DocEbok's Avatar
Wasn't it more cpu than gpu? :x
LordOfChaos
Member
(11-14-2017, 03:41 AM)
LordOfChaos's Avatar

Originally Posted by DocEbok

Wasn't it more cpu than gpu? :x



Sorta, but I mean, the 8800 that was from the same era as Crysis is still brutalized here on a modern CPU at 1080p


Dreadnought
Member
(11-14-2017, 03:58 AM)
Dreadnought's Avatar
You need a 1080ti to play at 4k60 what the hell
Lister
Member
(11-14-2017, 04:35 AM)
Lister's Avatar
Isn't the game DX9? It's just not optimized for modern hardware.
Ubername
Member
(11-14-2017, 05:03 AM)
Ubername's Avatar

Originally Posted by Lister

Isn't the game DX9? It's just not optimized for modern hardware.

It wasn't optimized when it came out, lol.
Finaika
Member
(11-14-2017, 07:32 AM)
Finaika's Avatar

Originally Posted by LordOfChaos

Sorta, but I mean, the 8800 that was from the same era as Crysis is still brutalized here on a modern CPU at 1080p


980 Ti is a beast.
DonMigs85
Member
(11-14-2017, 09:13 AM)
DonMigs85's Avatar

Originally Posted by Finaika

980 Ti is a beast.

This is only due to a CPU bottleneck. The gap widens at higher resolutions
ClaptoVaughn
Member
(11-14-2017, 09:28 AM)
ClaptoVaughn's Avatar
I got an 8800GTX at the time partly to play that. I remember tweaking the config files for hours to squeeze every frame possible out of that thing.

Stalker came out the same year. What a great time for shooters.
Inviusx
Member
(11-14-2017, 10:12 AM)
Inviusx's Avatar
I remember being enraptured by the thought of owning an 8800GTX back then. Looking at it 10 years on makes me remember why I got out of PC gaming.
metareferential
Member
(11-14-2017, 11:34 AM)
metareferential's Avatar
That transistor/framerate graph is awesome: Vega is quite a shame compared to the 980 Ti, with 50% more transistors and almost the same performance.
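
A quick back-of-the-envelope check of that transistor comparison. The counts below are publicly reported figures for the two chips (GM200 in the 980 Ti, Vega 10 in RX Vega), not numbers taken from the article, so treat them as assumptions:

# Rough sketch: how much more transistor budget Vega 10 carries than GM200.
# Transistor counts are publicly reported figures, not from the Tom's Hardware article.
gm200_transistors = 8.0e9    # GeForce GTX 980 Ti (GM200)
vega10_transistors = 12.5e9  # Radeon RX Vega (Vega 10)

extra = vega10_transistors / gm200_transistors - 1
print(f"Vega 10 has about {extra:.0%} more transistors than GM200")
# Prints roughly 56%, in the same ballpark as the "50% more" figure quoted above.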
Black_Stride
do not tempt fate do not contrain Wonder Woman's thighs do not do not
(11-14-2017, 12:56 PM)
Black_Stride's Avatar

Originally Posted by LordOfChaos

Sorta, but I mean, the 8800 that was from the same era as Crysis is still brutalized here on a modern CPU at 1080p


8800 was my shit and it got abused by Crysis.
Upgraded a few years later to a GTX 260 and it still got its ass beat.

But I think it's worth noting that 1080p wasn't the go-to gaming resolution.
I believe 1366 x 768 was pretty big and 1440 x 900 was also popular with monitors at that time.

1080p was still relatively niche.
CharmingCharlie
Member
(11-14-2017, 01:04 PM)
CharmingCharlie's Avatar
I would love to give Crysis a new playthrough, but for the life of me I cannot get this fucker running on Windows 10. I have tried everything under the sun to get it running but nothing works. Ah well, such is life, but it's one hell of a black mark on Crytek's rep (if that still exists) that their game is so problematic.
Oemenia
Member
(11-14-2017, 02:11 PM)
Oemenia's Avatar
I really hope they make an HD trilogy. The 360 version plays, looks and runs great, and on PC it would be amazing.
LordOfChaos
Member
(11-14-2017, 02:25 PM)
LordOfChaos's Avatar

Originally Posted by Lister

Isn't the game DX9? It's just not optimized for modern hardware.

The first page of the article shows DX9/DX10 graphics comparisons. IIRC DX10 was added in a later patch.

Originally Posted by Oemenia

I really hope they make an HD trilogy. The 360 version plays, looks and runs great, and on PC it would be amazing.

I actually didn't finish the 360 version because of the long stretches of 20fps. At first it was impressive that it was ported to 7th-gen consoles at all, but the sacrifices just made it unplayable for me.

1-3 remastered on the 8th gen would be something though, especially Pro/X.
rodrigolfp
Member
(11-14-2017, 02:55 PM)
rodrigolfp's Avatar

Originally Posted by MrOogieBoogie

I recently played through Warhead on an i5-6500 + 1060 GTX 6GB + 16GB RAM and still managed to see framerates drop into the 30s during intense firefights. This is at 1680x1050, too. To be fair, though, the average framerate hovered between 85 and 100fps.

You know that you can scroll down to see resolutions higher than 1680x1050?

Originally Posted by Lister

Isn't the game DX9? It's just not optimized for modern hardware.

Default is DX10, but you can also run a DX9 exe.

Originally Posted by LordOfChaos

The first page of the article shows DX9/DX10 graphics comparisons. IIRC DX10 was added in a later patch.

Not really. The DX10 option was there since day 1.

Originally Posted by Oemenia

I really hope they make an HD trilogy. The 360 version plays, looks and runs great, and on PC it would be amazing.

14~30fps never was and never will be great: https://www.youtube.com/watch?v=YwD2ty2UfBM
LordOfChaos
Member
(11-14-2017, 03:13 PM)
LordOfChaos's Avatar

Originally Posted by rodrigolfp


Not really. The DX10 option was there since day 1.

Oh right, it was Crysis 2 that had the DX11 patch and ultra graphics mode.
DonF
Member
(11-14-2017, 03:31 PM)
DonF's Avatar
Related video by Lazy Game Reviews
AP90
Member
(11-14-2017, 04:10 PM)
AP90's Avatar

Originally Posted by Finaika

980 Ti is a beast.

Agreed =)

And it is fairly easy to overclock. I remember playing Crysis on my 9800 GTX @ 1600x900, and it took me months to get a stable OC on the core. I wanted every extra drop of performance out of my card just so Crysis could be smoother.
Lovely Salsa
List of trash:
FTL
Chivalry
Shovel Knight
Divinity Original Sin
Broken Age
Kentucky Route Zero

I also dislike graphic novels, bus stops, maps, reading for kids, classical music, board games, opera, art, and geiger counters.
(11-14-2017, 04:19 PM)
Lovely Salsa's Avatar
I tried the game again when I got my GTX 680. Unpatched, on max settings at 1080p, I constantly got under 30 fps in the first area.

Which I thought was insane.
bobone
Member
(11-14-2017, 04:41 PM)
bobone's Avatar
Awesome article.
I had just built a shiny new box with a 3870 around that time and was happily playing everything at high settings and resolutions till I got Crysis. That game floored me, and made me really want to buy a second GPU to actually be able to run it.

I have a 1080 now, and it wasn't part of that test, so I'm going to have to give it a try myself.
Cerbero
Member
(11-14-2017, 05:16 PM)

Originally Posted by Black_Stride

8800 was my shit and it got abused by Crysis.
Upgraded a few years later to a GTX 260 and it still got its ass beat.

But I think it's worth noting that 1080p wasn't the go-to gaming resolution.
I believe 1366 x 768 was pretty big and 1440 x 900 was also popular with monitors at that time.

1080p was still relatively niche.

Actually, in 2007 1680x1050 was the go-to resolution, with the Samsung 226BW as the hottest girl in town.
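
Since the thread keeps coming back to which resolution was the norm in 2007, here is a minimal sketch comparing raw pixel counts of the resolutions mentioned above (the resolutions come from the posts themselves; the arithmetic is just illustrative):

# Pixel counts for the display resolutions discussed in this thread.
resolutions = [(1366, 768), (1440, 900), (1680, 1050), (1920, 1080)]

base = 1366 * 768  # the common laptop/budget monitor resolution of the era
for w, h in resolutions:
    pixels = w * h
    print(f"{w}x{h}: {pixels:,} pixels ({pixels / base:.2f}x of 1366x768)")
# 1920x1080 pushes roughly twice the pixels of 1366x768, which is part of why
# 2007-era cards fare so much worse in a modern 1080p benchmark run.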
Phoenix RISING
Member
(11-14-2017, 05:21 PM)
Phoenix RISING's Avatar
This just proves that Crysis is poorly optimized.

Crysis: Warhead runs better and without the jank, as do Crysis 2 and 3.
rodrigolfp
Member
(11-14-2017, 05:40 PM)
rodrigolfp's Avatar

Originally Posted by Phoenix RISING

This just proves that Crysis is poorly optimized.

Crysis: Warhead runs better and without the jank, as do Crysis 2 and 3.

Not saying that C1 is well optimized, but the map sizes and physics of Crysis 2 and 3 are nowhere near the same level as C1. And Warhead isn't much better to run.
Futaleufu
Member
(11-14-2017, 07:44 PM)
Futaleufu's Avatar
I remember the DirectX 10 setting of this game was broken; in the last stage you would fall through the floor at random times.
rodrigolfp
Member
(11-14-2017, 08:34 PM)
rodrigolfp's Avatar

Originally Posted by Futaleufu

I remember the DirectX 10 setting of this game was broken; in the last stage you would fall through the floor at random times.

Never even heard of this.
TheLaughingStock
Member
(11-14-2017, 11:11 PM)
TheLaughingStock's Avatar
Games still look great even though they're old.
Noleh8r
Junior Member
(11-14-2017, 11:43 PM)
Noleh8r's Avatar

Originally Posted by Phoenix RISING

This just proves that Crysis is poorly optimized.

Crysis: Warhead runs better and without the jank, as do Crysis 2 and 3.

Crysis Warhead, Crysis 2, and Crysis 3 might as well be from a completely different series when compared to the sprawling levels of the original game.
Phatosaurus
Member
(11-15-2017, 01:13 AM)
Phatosaurus's Avatar

Originally Posted by Finaika

980 Ti is a beast.

The CPU likely was the bottleneck there. The 980Ti falls off in performance with AA and at higher resolutions.
Quixz
Member
(11-15-2017, 02:30 AM)
Quixz's Avatar
This game must have sold a lot of 8800GTXs, because I bought one just to play it.
120v
Member
(11-15-2017, 04:09 AM)
Glad I wasn't a PC gamer then. I would've gone insane trying to get it to run OK, even if I had no particular interest in playing it.
old
Member
(11-15-2017, 05:24 AM)
old's Avatar
It's neat to see how much architecture generation plays a role in the Radeon cards. Big jump from Terascale to GCN. Another big jump to GCN 2.

Some years it's a huge jump in hardware. Some years it's apparently just a small clock boost.
LordOfChaos
Member
(11-15-2017, 05:29 AM)
LordOfChaos's Avatar

Originally Posted by old

It's neat to see how much architecture generation plays a role in the Radeon cards. Big jump from Terascale to GCN. Another big jump to GCN 2.

Some years it's a huge jump in hardware. Some years it's apparently just a small clock boost.

Also, OG GCN got some classic AMD FineWine treatment on the driver side and held up pretty well over time. They seem to have the unfortunate habit of leaving more performance on the table than their competitor on launch day, but then also supporting and tuning it for longer.
zeroOman
Member
(11-15-2017, 12:49 PM)
zeroOman's Avatar

Originally Posted by Phatosaurus

The CPU likely was the bottleneck there. The 980Ti falls off in performance with AA and at higher resolutions.

You mean it will suck at resolutions higher than 1080p?
llien
Member
(11-15-2017, 01:23 PM)
llien's Avatar
Why did they skip 1060/1050? (or am I blind)

Originally Posted by Finaika

980 Ti is a beast.

Indeed:

DonMigs85
Member
(11-16-2017, 12:19 AM)
DonMigs85's Avatar

Originally Posted by llien

Why did they skip 1060/1050? (or am I blind)



Indeed:

Flagship GPUs only
LordOfChaos
Member
(11-16-2017, 03:17 AM)
LordOfChaos's Avatar

Originally Posted by llien

Why did they skip 1060/1050? (or am I blind)



Indeed:



Holy moly, Nvidia does a lot more with a lot fewer transistors. No wonder their relative margins are what they are.

Then again AMD doesn't split compute cards from gaming cards like Nvidia, so depending on how you look at it you either get exceptionally priced compute, or a lot of transistor baggage for gaming.
Greatest Ever
pretentious trash
(11-16-2017, 03:32 AM)
Greatest Ever's Avatar

Originally Posted by Finaika

980 Ti is a beast.

Arguably the best $200 I've ever spent was on one. I was pretty set when it came to 1920 x 1080 but now at 3440 x 1440 I'm starting to feel its limits.

Still, amazing card. I don't ever want to upgrade if I can avoid it without paying that much again.
Morbid
Banned
(11-19-2017, 08:35 PM)
This game still looks better than some modern games, crazy.



Laserdisk
Member
(11-19-2017, 10:48 PM)
When I bought my 1060 6gb I reinstalled this and warhead.
Still amazing and beautiful games.
jufonuk
Member
(11-19-2017, 11:17 PM)
jufonuk's Avatar
Could the Switch run Crysis?
Phatosaurus
Member
(11-19-2017, 11:18 PM)
Phatosaurus's Avatar

Originally Posted by zeroOman

You mean it will suck at resolutions higher than 1080p?

Suck? No. The performance gap between it and better GPUs just widens. It's still a good card.
Morbid
Banned
(11-19-2017, 11:19 PM)

Originally Posted by jufonuk

Could the Switch run Crysis?

The 360 & PS3 did, so sure. I doubt it'd do 1080p 60fps though.
DonMigs85
Member
(Yesterday, 12:29 AM)
DonMigs85's Avatar

Originally Posted by Morbid

The 360 & PS3 did, so sure. I doubt it'd do 1080p 60fps though.

Yeah, we need to remember that the Switch is like a bit less than half the power of a GTX 750 (non-Ti) at best.
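
A rough sanity check on that comparison. The core counts and clocks below are commonly cited specs (Tegra X1 GPU in the docked Switch, GM107 in the GTX 750), not figures from this thread, so treat them as assumptions:

# Theoretical FP32 throughput = CUDA cores * 2 ops per clock (FMA) * clock in GHz.
def fp32_gflops(cuda_cores, clock_ghz):
    return cuda_cores * 2 * clock_ghz

switch_docked = fp32_gflops(256, 0.768)  # Tegra X1 GPU at the docked clock (assumed)
gtx_750 = fp32_gflops(512, 1.02)         # GTX 750 (GM107) at its base clock (assumed)

print(f"Switch (docked): ~{switch_docked:.0f} GFLOPS")
print(f"GTX 750:         ~{gtx_750:.0f} GFLOPS")
print(f"Ratio: ~{switch_docked / gtx_750:.2f}")
# Comes out around 0.4, consistent with "a bit less than half the power".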
PantsuJo
Member
(Yesterday, 01:19 AM)
PantsuJo's Avatar

Originally Posted by jufonuk

Could the Switch run Crysis?

Crysis 3 was being developed for the Nvidia Shield (tablet and TV), so a Switch port is indeed possible given the nearly identical hardware.

You can find info about this port on the Nvidia Shield homepage or, as usual, on YouTube.
