
Middle-earth: Shadow of Mordor PC Performance Thread

Kinthalis

Banned
Some other people are playing at 1080p/60fps and decent settings with lesser hardware. Pretty sure he doesn't need to settle for 30fps.

I meant that he could stick to 30 FPS and likely increase his video settings over what the guy he quoted said he was running at. Or keep fairly decent settings and go for a higher frame rate.

Until we have a really nice breakdown of performance hits per feature, we don't really know the exact settings he can expect to run at without some testing on his own.

Any performance guides out yet?
 

UnrealEck

Member
AMD FX 8120
EVGA GTX 770 SC with 2GB of VRAM (oddly, dxdiag incorrectly reports it has 4GB)
8GB of RAM
I have the same card (MSI)
Get that texture cranked to high.
Live on the edge. Take risks. Go pro, etc.

I think the 4GB DxDiag reports includes the main system memory the card has access to or is using alongside the on-board memory. I'm guessing that's where a lot of extra data gets stored before being moved to VRAM.
But that's just a guess. I don't really know much at all about this stuff.
 
This has to be one of the strangest errors I've ever seen in a game. It may have been the lack of sleep, but I saw this post last night around 1:30am and I literally started crying because I was laughing so hard, I'm so sorry... :( It just looks so ridiculous with the weird EKG graphics at the top right; it looks as if it was photoshopped in. Have you posted to their forums?

Well, that is a framerate counter. The picture in picture thing is weird though, reminds me of DSFix.
 
Your CPU is better than what's in a PS4, your GPU is slightly better than what's in a PS4. The real console advantage here is CPU overhead. The game would probably run badly on something like a PS4 CPU.

Still, matching a PS4 on an almost 4-year-old CPU plus a mid-range GPU that's over 2 years old is not bad. Not to mention that in the future, thanks to DX12 and other APIs, that CPU overhead will diminish, giving your rig even more breathing room later this gen, even assuming you don't upgrade at some point.


Yeah I hear you. I mean the one real world example I have is BF4. I run BF4 at 1920x1080 on ultra (with 4xMSAA enabled) and I get 50-60fps. Dips to 45ish occasionally.

Considering the PS4 runs BF4 at 900p and what I would assume are High graphics, it seems like my rig is still a good amount better.

The main game I'm worried about (an ode to your sig) is Witcher 3. Really hoping my rig can perform well with that game.

As for Mordor, I think I will opt for PC.
 

Qassim

Member
Is anyone else limited to 99fps? For some reason it won't go above that - I have fps limit set to 'no limit'.

Otherwise in the first 20 minutes of the game it is playing rather nicely, great performance, etc @ 1080p, max settings ('high textures').
 
I still can't believe Shadow of Mordor needs 6GB of VRAM for Ultra. I know the game looks good, but not that good to need such an amount of power; there are/will be better looking games on PC and PS4 that will need less power.
 

JB1981

Member
How does the game perform on a GTX 660/i7/8gb ram setup? Seems the game won't look so hot since my card only has 1.5gb of ram.
 

Gbraga

Member
At 60fps, I've found that motion blur really isn't noticeable most of the time. If I'm running below 60, I'll keep it on, otherwise not.

I might give it a shot.

But I really like the exaggerated motion blur some games use, so I'm just assuming this is one such case since the performance hit is quite big.
 

Levyne

Banned
Is anyone else limited to 99fps? For some reason it won't go above that - I have fps limit set to 'no limit'.

Otherwise in the first 20 minutes of the game it is playing rather nicely, great performance, etc @ 1080p, max settings ('high textures').

Hard 100 fps cap
 
I still can't believe Shadow of Mordor needs 6GB of VRAM for Ultra. I know the game looks good, but not that good to need such an amount of power; there are/will be better looking games on PC and PS4 that will need less power.

You can't look at that as the total picture. The VRAM requirement is just for the texture setting, even if it may still feel like a lot of VRAM for the quality of the textures.
 

Kinthalis

Banned
Yeah I hear you. I mean the one real world example I have is BF4. I run BF4 at 1920x1080 on ultra (with 4xMSAA enabled) and I get 50-60fps. Dips to 45ish occasionally.

Considering the PS4 runs BF4 at 900p and what I would assume are High graphics, it seems like my rig is still a good amount better.

The main game I'm worried about (an ode to your sig) is Witcher 3. Really hoping my rig can perform well with that game.

As for Mordor, I think I will opt for PC.

Not even all high on PS4 for BF4:

[image: r6b6xZg.gif]
 
Agreed. MSAA is outdated and useless for transparencies and texture aliasing, so you end up needing to pair it with some other filter to get a decent result.

I'll take good FXAA over MSAA any day of the week (by good FXAA I mean dev-implemented, with no pass on the GUI elements). Of course if I have a better choice than FXAA, I'll take that too.

This part isn't necessarily true btw. Just throwing that out there.
 

Red Comet

Member
So has anybody with a 3gb card tried ultra textures in-game (aside from the benchmark)? I won't be home for a few hours to test it for myself.
 

Sevenfold

Member
So has anybody with a 3gb card tried ultra textures in-game (aside from the benchmark)? I won't be home for a few hours to test it for myself.

Yes, 780 Ti. Smooth as butter 80% of the time but bogging down in fights; not crazy bad, but noticeable. I've uninstalled the Ultra textures and gone for 150% resolution until a better AA solution is found.

780ti
i7 2600k mild overclocks on both
16GB RAM
W7 (64bit exe)
 

Levyne

Banned
So has anybody with a 3gb card tried ultra textures in-game (aside from the benchmark)? I won't be home for a few hours to test it for myself.

Yep I have a 780. It's not a hard locked 60 at 1920x1200 but the dips into the 50s are pretty infrequent. Happens sometimes if I swing the camera around too fast, never seems to dip during combat so far.
 

Durante

Member
Hopefully SMAA will completely marginalize FXAA. It's cheap and does not blur as much. It should be the default AA option, with SSAA for those with multi-GPU setups.
FXAA has a few advantages over SMAA though. It's even cheaper and blurs more. The latter can actually be an advantage if you combine it with a sharp downsampling filter.

Soft FXAA combined with high-quality, sharp downsampling results in pretty good IQ, both in motion and in stills, and you can get it in almost every game.
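If anyone wants to see what the downsampling filter alone contributes, here's a rough Python/Pillow sketch (my own toy example, nothing game-specific, and the filenames are placeholders): it resizes the same supersampled screenshot once with a soft bilinear filter and once with sharp Lanczos, so you can compare how much pixel-level detail each one keeps.

```python
# Toy comparison of soft vs. sharp downsampling filters (Pillow).
# Feed it a supersampled screenshot (e.g. a 4K capture of an FXAA'd frame)
# and compare the two outputs at the target resolution.
from PIL import Image

def downsample(path, factor=2):
    img = Image.open(path)
    target = (img.width // factor, img.height // factor)

    # Soft resize: few taps, averages fine detail away along with the aliasing.
    img.resize(target, resample=Image.BILINEAR).save("downsampled_soft.png")

    # Lanczos: windowed-sinc kernel, keeps edges and pixel-level detail much sharper.
    img.resize(target, resample=Image.LANCZOS).save("downsampled_sharp.png")

if __name__ == "__main__":
    downsample("screenshot_4k.png", factor=2)  # placeholder filename
```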
 

Sober

Member
Is there any way to skip the intro stuff before the title screen? Also, during loading screens, when it says hit Esc to skip, it doesn't do anything and I have to listen to the whole audio clip.
Just downloaded the game but here is what I found.

Go to \ShadowofMordor\game\interface\videos and simply rename:

all the .vib files in \legal\

intro.vib (or the respective language one, _fr, _de, etc.)
nvidia_splash.vib

Those are all the intro vids I could find, so when you launch, all you get is a black screen with the autosave icon, then straight to the menu screen.

Not sure about loading screens in the game though, I haven't gotten the chance to play it yet.
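If you'd rather script the renames than do them by hand, here's a rough Python sketch of the same idea (the install path is a placeholder, so point VIDEO_DIR at your own \ShadowofMordor\game\interface\videos folder; it appends .bak so it's easy to undo):

```python
# Rename the intro/legal .vib files so the game skips them, per the post above.
# VIDEO_DIR is a placeholder -- adjust it to your own install location.
import os

VIDEO_DIR = r"C:\Games\ShadowofMordor\game\interface\videos"

targets = ["intro.vib", "nvidia_splash.vib"]
# language-specific intros (intro_fr.vib, intro_de.vib, ...)
targets += [f for f in os.listdir(VIDEO_DIR)
            if f.startswith("intro_") and f.endswith(".vib")]
# everything under \legal\
legal_dir = os.path.join(VIDEO_DIR, "legal")
if os.path.isdir(legal_dir):
    targets += [os.path.join("legal", f) for f in os.listdir(legal_dir)
                if f.endswith(".vib")]

for name in targets:
    path = os.path.join(VIDEO_DIR, name)
    if os.path.isfile(path):
        os.rename(path, path + ".bak")  # .bak instead of deleting, easy to restore
        print("renamed", path)
```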
 

Kezen

Banned
FXAA has a few advantages over SMAA though. It's even cheaper and blurs more. The latter can actually be an advantage if you combine it with a sharp downsampling filter.
Soft FXAA combined with high-quality, sharp downsampling results in pretty good IQ, both in motion and in stills, and you can get it in almost every game.

True enough, but I would still like to see developers implement quality AA modes in their games. Not that downsampling is horribly difficult, but I don't think it's acceptable to ship a modern PC game with only SSAA as anti-aliasing.
 

Xeroblade

Member
Just downloaded the game but here is what I found.

Go to \ShadowofMordor\game\interface\videos and simply rename:

all the .vib files in \legal\

intro.vib (or the respective language one, _fr, _de, etc.)
nvidia_splash.vib

Those are all the intro vids I could find, so when you launch, all you get is a black screen with the autosave icon, then straight to the menu screen.

Not sure about loading screens in the game though, I haven't gotten the chance to play it yet.


Was waiting for someone to figure this out. Thank you!
 

SapientWolf

Trucker Sexologist
I honestly think the aversion to FXAA is like mind over matter. It's actually... getting rid of ALL of the aliasing. Like the name "anti-aliasing" implies.

I never understood the term "real AA". What is real AA? As long as the aliasing is gone, isn't that the point?

And when you consider that FXAA gets rid of all of the aliasing in the scene, and at a tiny, minuscule performance hit, compared to something like MSAA (what people say is "true AA") which has a massive performance hit and potentially doesn't address aliasing in large parts of the scene, I think FXAA is pretty damn good.

I honestly think post-process AA is the future of AA, or that it's already the best alternative. There is no reason eliminating aliasing should take up like 30-40% of your performance bandwidth.

I'll take FXAA or SMAA any day over the huge performance hit of MSAA.
 

Seanspeed

Banned
I honestly think the aversion to FXAA is like mind over matter. It's actually... getting rid of ALL of the aliasing. Like the name "anti-aliasing" implies.

I never understood the term "real AA". What is real AA? As long as the aliasing is gone, isn't that the point?

And when you consider that FXAA gets rid of all of the aliasing in the scene, and at a tiny, minuscule performance hit, compared to something like MSAA (what people say is "true AA") which has a massive performance hit and potentially doesn't address aliasing in large parts of the scene, I think FXAA is pretty damn good.

I honestly think post-process AA is the future of AA, or that it's already the best alternative. There is no reason eliminating aliasing should take up like 30-40% of your performance bandwidth.

I'll take FXAA or SMAA any day over the huge performance hit of MSAA.
FXAA does not even come close to getting rid of ALL aliasing. At least no FXAA solution I've ever seen.
 
I honestly think the aversion to FXAA is like mind over matter. It's actually... getting rid of ALL of the aliasing. Like the name "anti-aliasing" implies.

I never understood the term "real AA". What is real AA? As long as the aliasing is gone, isn't that the point?

And when you consider that FXAA gets rid of all of the aliasing in the scene, and at a tiny, minuscule performance hit, compared to something like MSAA (what people say is "true AA") which has a massive performance hit and potentially doesn't address aliasing in large parts of the scene, I think FXAA is pretty damn good.
I don't know what I'm reading. FXAA removing all the aliasing in a scene? Hah. No way. ESPECIALLY in games like GTA4 where downsampling from 4K still doesn't come close to removing all the aliasing. I need to see screenshots of the magical FXAA you seem to have discovered :)
 

Gbraga

Member
Just downloaded the game but here is what I found.

Go to \ShadowofMordor\game\interface\videos and simply rename:

all the .vib files in \legal\

intro.vib (or the respective language one, _fr, _de, etc.)
nvidia_splash.vib

Those are all the intro vids I could find, so when you launch, all you get is a black screen with the autosave icon, then straight to the menu screen.

Not sure about loading screens in the game though, I haven't gotten the chance to play it yet.

Thanks for this!
 
A 670 has a bit more shading power; I think on average it performs about 25% better than a 650 Ti.

It's possible you can tick a few items to high at 30 FPS, or get a higher frame rate. The 670 is also a bit more powerful than what the PS4 is sporting.

So in the end you'll probably be playing at settings so similar to the PS4 that you'll have to squint to notice the difference, and at similar frame rates.

Except that you would have paid $30 less for the game. Your choice.

I have an i5-2500k, a 580 1.5GB, and 16GB of RAM running two 1920x1200 monitors. Do you think I'd be able to get 30fps with medium/high settings? Or should I go with PS4 (paying $41 for PS4 vs getting it for $30 in the b/s/t thread)?
 

Durante

Member
I honestly think the aversion to FXAA is like mind over matter. It's actually... getting rid of ALL of the aliasing. Like the name "anti-aliasing" implies.

I never understood the term "real AA". What is real AA? As long as the aliasing is gone, isn't that the point?

And when you consider that FXAA gets rid of all of the aliasing in the scene, and at a tiny, minuscule performance hit, compared to something like MSAA (what people say is "true AA") which has a massive performance hit and potentially doesn't address aliasing in large parts of the scene, I think FXAA is pretty damn good.

I honestly think post-process AA is the future of AA, or that it's already the best alternative. There is no reason eliminating aliasing should take up like 30-40% of your performance bandwidth.

I'll take FXAA or SMAA any day over the huge performance hit of MSAA.
First of all, FXAA, like all post-processing AA, is completely, fundamentally incapable of getting rid of all aliasing. It completely fails for both sub-pixel and temporal aliasing (and can in fact make matters worse for the latter). If you want to know why exactly that is the case, and which AA methods are capable of dealing with which types of aliasing, then read this article.

FXAA also has the additional issue (not shared to the same extent by e.g. SMAA) that it blurs actual detail which is not an aliasing artifact. That's because, like all other post-processing methods, it has to guess what's aliasing and what isn't. The tradeoff for that is between coverage, unintentional blur and complexity (and thus cost) of the guessing process.

The reason why FXAA is a good choice (IMHO) when combined with downsampling is that you can recover the lost pixel-level detail, because the downsampling process uses information from multiple FXAA'd pixels for each final pixel. And since FXAA is very soft, it deals better with more types of aliasing than standard SMAA.

As for "real AA", the definition of that varies, but usually people mean anything which at least uses more than one sample per resulting pixel. In terms of performance, the state of the art is to gather these samples mostly (or exclusively) by reprojection and offsets across multiple frames, which greatly reduces the performance impact. This is implemented as the main AA method in UE4, and also in recent versions of CryEngine.
 

Durante

Member
The problem I have with sharpened shots being posted when discussing aliasing and anti-aliasing is that the stills don't show the impact of sharpening on temporal stability. Yeah, in a screenshot (moderate) sharpening will almost always look better, but when actually playing the game? Not necessarily.
 
The game runs pretty well (a mixture of high/ultra, and medium textures because of 2 GB VRAM).

It has crashed once on me though; I went to pick up some intel (papers lying on a table) and it crashed the Nvidia drivers (344.11).
 

neorej

ERMYGERD!
i5 3.6 GHz, 8Gigs DDR3, GTX650Ti boost, nice mix of high and medium, steady 30FPS.
I only got a steady 60FPS once I set most settings to low.
 
Anyone know if this has Crossfire support? And if I possibly need to upgrade driver versions?

Posted this earlier:

Ok, so a workaround for AMD cards to somewhat enable Crossfire. Pull up CCC, go to 3D Application Settings. Add the game's exe file. Under AMD CrossFireX, set Frame Pacing to On and CrossFireX Mode to AFR Friendly.

1440p
i7 2600@4.5
2x290x
16GB RAM
Everything set to max + HD Texture Pack
 

SapientWolf

Trucker Sexologist
The problem I have with sharpened shots being posted when discussing aliasing and anti-aliasing is that the stills don't show the impact of sharpening on temporal stability. Yeah, in a screenshot (moderate) sharpening will almost always look better, but when actually playing the game? Not necessarily.
I am really insensitive to temporal aliasing at high framerates, which is why I typically prefer post processing AA over MSAA or downsampling. But you do occasionally get scenes where the sharpening algorithm craps all over itself (like the water in BF4).
 

mkenyon

Banned
Ok, so a workaround for AMD cards to somewhat enable Crossfire. Pull up CCC, go to 3D Application Settings. Add the game's exe file. Under AMD CrossFireX, set Frame Pacing to On and CrossFireX Mode to AFR Friendly.

1440p
i7 2600@4.5
2x290x
16GB RAM
Everything set to max + HD Texture Pack
Do you have FRAPS?
 