Middle-earth: Shadow of Mordor PC Performance Thread

Few more ultra texture pack pics @ 1920x1200, this time with more of the open world and a bit of combat:

[IMG]http://i1.minus.com/ibbKYrzuzYbvFW.png[/IMG]

[IMG]http://i6.minus.com/ibjHpDZwcAcP2x.png[/IMG]

[IMG]http://i.minus.com/iblxbFmWeavw7n.png[/IMG]

In that combat screen I had my 970 OC'd further as a test.

I don't get it... Where is the 6GB of VRAM in these pictures? If someone told me it was running on 1GB of VRAM, I'd say that was plausible. Heh, on the other hand, if someone posted pics from The Vanishing of Ethan Carter and told me it was running on 6GB of VRAM, I'd also say that was plausible. Shadow of Mordor doesn't appear to be a good preview of where GPUs are headed. It doesn't even look that great in general. Seems like a good game though.
 
i7 4770k
16 GB RAM
GTX 780
1920x1080

Benchmark Results:
Average FPS: 66.20
Max FPS: 162.13
Min FPS: 38.57

I'll have to play with it when I get back from work to see if I can adjust any settings from Ultra to High to try and bump up the minimum FPS, but otherwise I'm not super-worried about performance.
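For anyone comparing numbers across tools: those three figures fall out of per-frame times in a pretty mechanical way. A quick sketch (the sample frametimes below are made up, not taken from the game's benchmark):

```python
# Sketch: derive average/min/max FPS from per-frame times (milliseconds),
# the way frametime loggers record them. Sample data is illustrative only.
frametimes_ms = [15.1, 16.7, 14.9, 25.9, 16.2, 6.2, 16.8]

fps_per_frame = [1000.0 / ft for ft in frametimes_ms]

# Average FPS should come from total elapsed time, not from averaging
# the per-frame FPS values, which overweights the fast frames.
avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
min_fps = min(fps_per_frame)   # slowest frame in the run
max_fps = max(fps_per_frame)   # fastest frame in the run

print(f"Average: {avg_fps:.2f}  Min: {min_fps:.2f}  Max: {max_fps:.2f}")
```

Note that a huge max and a low min around one decent average is normal: the max is usually a single cheap frame (menus, fades), so the min is the number that actually matters for smoothness.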
 
So is there a breakdown of the visual differences of these settings? Like Ambient Occlusion on Ultra is HBAO+? What is it on High? And so on.


I hate when they just have "medium, high, ultra", as it doesn't mean much.
Yeah, much of what people are posting means very little to me because I don't know exactly what these settings are doing. The game seems to run well on a wide range of machines.

Hopefully at some point, somebody can do some screenshot comparisons of some of the settings and the performance hit so people can figure out what's worth turning on/off.
 
I can't believe that with all those graphics options they didn't at least include FXAA. I know it looks shit, but at 1440p the higher FXAA options in some games aren't too bad. Better than nothing if you can't use an SMAA injector, anyway.
 
I don't get it... Where is the 6GB of VRAM in these pictures? If someone told me it was running on 1GB of VRAM, I'd say that was plausible. Heh, on the other hand, if someone posted pics from The Vanishing of Ethan Carter and told me it was running on 6GB of VRAM, I'd also say that was plausible. Shadow of Mordor doesn't appear to be a good preview of where GPUs are headed. It doesn't even look that great in general. Seems like a good game though.
Doesn't seem to be the most staggering use of 3 extra GB of VRAM I've ever seen.
It doesn't use 6GB of VRAM.

It just requires a bit more than 4GB, and 6GB is the next common VRAM amount.

Also it doesn't seem that all textures are better.

So do we know if these new textures are actually higher resolution? Or are they just not using any texture compression?

(edit) Also does anyone have comparisons between low/med as well as high/ultra?
Since you need to download them separately I'd think there would be higher resolution textures in some places.

EDIT: Looking at the ground, there does seem like the texture is improved.
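On the compression question, a back-of-envelope sketch of what a single large texture costs in VRAM with and without block compression. The sizes and formats are generic illustrations, not figures from the game's actual assets:

```python
# Back-of-envelope VRAM cost of one texture, to show why compression
# matters. Formats and sizes are illustrative, not from the game.
def texture_bytes(width, height, bits_per_pixel, mipmaps=True):
    base = width * height * bits_per_pixel // 8
    # A full mip chain adds roughly one third on top of the base level.
    return base * 4 // 3 if mipmaps else base

MB = 1024 * 1024
uncompressed = texture_bytes(4096, 4096, 32)  # RGBA8: 32 bits per pixel
bc7 = texture_bytes(4096, 4096, 8)            # BC7/DXT5: 8 bits per pixel

print(f"4096x4096 RGBA8 texture: {uncompressed / MB:.0f} MB")  # ~85 MB
print(f"Same texture in BC7:     {bc7 / MB:.0f} MB")           # ~21 MB
```

GPUs sample block-compressed formats like BC7 directly, so shipping uncompressed (or only lightly compressed) textures would roughly quadruple the footprint without necessarily looking four times better, which would be one way to burn gigabytes without an obvious visual payoff.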
 
Textures don't look that good. Cross-gen holding everyone back as usual.
With the amount of space they take up, I can't see how cross-gen makes a jot of difference. Textures taking up several gigs of VRAM in games won't suddenly get better just because they no longer have to be downgraded for the old consoles.
 
I'm actually below minimum specs for the CPU (Phenom II X4 945) and only using a 760, and I can run the game at 60fps on a mixture of medium and high with DoF, order-independent transparency, and tessellation on.

I'm actually really pleased with how this is running.
 
Well, I fell for that benchmark too, when I had a quick glance at the performance of Mordor.

You will get quite a bit lower fps in-game. It also seems that not as many textures get loaded, making VRAM consumption somewhat lower than when you're actually playing.

I played with everything set to ultra (textures too) and the resolution scale set to 150%, and then the game really starts to eat VRAM. I have 4 gigs on my R9 290X and that's just not enough. I get small hitches once in a while, which are not really game-breaking but quite annoying. Those hitches are also there when I'm using native resolution.

Also, there are some effects like the rain which tank my framerate quite a bit making it go as low as 30.

I'm not sure what they did with the textures, they don't look like they should be so heavy on memory in any way.

There's a comparison between the two settings here; it's in German, but you can click through the pics anyway. Ultra textures use more than 5 gigs on a GTX Titan (classic), so the recommendations for the game are not wrong in any way. The textures used in those pics were placeholders though; "ultra" does look nicer. There's an update to that article coming up.

You can enable ultra textures by going to the DLC tab and then clicking HD Content or Ultra Content (or something like that). It should be another ~4 GB download.

[EDIT:] Oops, I got ninja'd with that link... anyway, those pics are not what ultra textures look like; those are placeholders.
 
Is there any chance that Ultra textures simply won't load (even when selected) if you don't have the required memory? I'm just trying to understand how 4gigs of supposedly higher quality textures don't look any different from the examples posted here.

And all this talk about the game showing respect to PC gamers with what would be a good port, how can it have important settings missing?
 
So I lied. The game actually uses 5.8GB of VRAM when playing the single player @1440p, and only 4.8GB of VRAM when using the benchmark.

You need a card with 6GB of VRAM if playing on ultra, just a heads up.
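If you want to sanity-check numbers like that yourself, NVIDIA's driver ships the `nvidia-smi` tool, which can report current VRAM usage. A small wrapper sketch, assuming an NVIDIA card and `nvidia-smi` on your PATH:

```python
import subprocess

def parse_used_mib(line):
    # nvidia-smi with the flags below prints one line per GPU,
    # e.g. "4912 MiB"; grab just the number.
    return int(line.strip().split()[0])

def vram_used_mib():
    # Ask the driver for current VRAM usage on the first GPU.
    # Assumes an NVIDIA GPU and nvidia-smi available on PATH.
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_used_mib(out.splitlines()[0])
```

Running `nvidia-smi --query-gpu=memory.used --format=csv,noheader -l 1` directly in a console gives the same readout once per second, which is handy to glance at while alt-tabbing out of the game.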
 
Is there any chance that Ultra textures simply won't load (even when selected) if you don't have the required memory? I'm just trying to understand how 4gigs of supposedly higher quality textures don't look any different from the examples posted here.

And all this talk about the game showing respect to PC gamers with what would be a good port, how can it have important settings missing?
I don't think it will fall back to the lower textures, since the description only seems to recommend the extra VRAM. But it doesn't seem to affect all textures. I think it's supposed to make a bigger difference with characters?

Except for anti-aliasing (for which it at least has an alternative, even if not an ideal one) and triple buffering, I think the feature set seems very good. Oh, and the issues with 1080p are weird.


So I lied. The game actually uses 5.8GB of VRAM when playing the single player @1440p, and only 4.8GB of VRAM when using the benchmark.

You need a card with 6GB of VRAM if playing on ultra, just a heads up.
It's unreliable to judge how much VRAM something needs by having more and checking how much it uses. For example, that German site also listed 5.8GB of VRAM... but for 4K.
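Part of that resolution dependence is just render-target math: the frame's working buffers scale with pixel count, on top of whatever the textures cost. A rough sketch with an illustrative buffer count (real engines vary widely):

```python
# Rough render-target math: why a 4K (or 150% resolution scale) VRAM
# reading says little about 1080p/1440p needs. Buffer count is made up.
def target_mb(width, height, bytes_per_pixel=4, buffers=6):
    # 'buffers' stands in for back buffer, depth, and a few G-buffer /
    # post-process targets; actual engine layouts differ.
    return width * height * bytes_per_pixel * buffers / (1024 * 1024)

for name, (w, h) in {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "1440p @ 150% scale": (3840, 2160),  # 1.5x per axis = 2.25x pixels
}.items():
    print(f"{name:>20}: ~{target_mb(w, h):.0f} MB of render targets")
```

Even at 4K that's only a few hundred MB, so the multi-gigabyte swings presumably come from how much texture data the engine keeps resident at each resolution; either way, a reading taken at 4K tells you little about what the game needs at 1080p.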
 
Uh, so why don't I have the option for 1080p? Using a 970

Edit: Nevermind. Went to advanced settings, changed preset to Ultra (was on Custom for some reason) - went back to settings, was in there.
 
Despite the fact that I have a strong gaming rig, I'm so happy I'm not obsessed with maxing everything at 60fps.

As long as it looks decent and plays smooth I'm all good.
 
The ultra texture details look virtually identical to the high texture details in those screenshots; something is clearly wrong.
Those are placeholders for the ultra textures, which were already in the review version of the game. Once you download the "HD content" DLC, they are supposed to look better. The article will get an update shortly with some new pics.
 
Those are placeholders for the ultra textures, which were already in the review version of the game. Once you download the "HD content" DLC, they are supposed to look better. The article will get an update shortly with some new pics.
Ah, there does seem to be something going on.

http://www.pcgameshardware.de/Mittelerde-Mordors-Schatten-PC-258069/News/Ultra-HD-Texture-Pack-installieren-1137744/

In this article the texture pack seems to change things for them, so they will have to update the comparison.
 
Did people really think this sudden massive jump in VRAM requirements was because these games were going to look massively better than anything that's ever come out? It's purely because these games are now being designed for next-gen consoles and resources are being allocated accordingly. Console ports.
 
Did people really think this sudden massive jump in VRAM requirements was because these games were going to look massively better than anything that's ever come out? It's purely because these games are now being designed for next-gen consoles and resources are being allocated accordingly. Console ports.
That would justify a requirement of 3GB of VRAM. You'd have different expectations when they recommend double that.
 
I have a MSI GTX 970 with 4GB of VRAM and with the HD texture pack installed my Nvidia driver crashed within ten minutes of playing the game. Put the textures on high and not a single problem since, so I guess you really do need 6GB to max out the game.

Mind that I have 'High Quality' set for texture filtering in my Nvidia global settings, so the driver isn't applying any filtering optimizations outside of the game. Turning it down might improve performance, but I doubt it will make the game look much better than regular high textures, since you'd be slightly blurring the textures if you do that. Might be worth a try though?

Anyway, all that doesn't change the fact that the VRAM usage is completely ridiculous. The game doesn't look that impressive and some of the textures are kinda shitty even on higher setting. The game seems to run fine for the most part and I wouldn't call it a bad port, but with the way these textures look there clearly could've been some extra optimization.

E: Going to try and see if I run into any problems when I put the Nvidia texture setting on 'Quality'. Will report back.
 
i5-2320@3.00GHz
GTX670 2GB factory OC'd
8GB RAM
Running at 2560x1440 borderless with 16xAF and FXAA forced in the Nvidia control panel.




The actual minimum framerate was ~32fps and the highest was 45 with a 60fps cap. Switching from driver AF to in-game Ultra AF gave me considerably worse performance, dropping as low as 25fps at times during the benchmark. Could there be a chance that driver AF isn't working at all? It was kinda hard to tell in the benchmark.

Also, since this game requires you to restart to change several of the graphics settings, has anyone found a way to skip the intro videos?

Edit: For those of you who want SMAA, I think you'll have to use RadeonPro (works on Nvidia hardware too) to inject it. RadeonPro allows you to use sweetfx and SMAA with 64-bit games like Watch Dogs and this one.
 
i5-2320@3.00GHz
GTX670 2GB factory OC'd
8GB RAM
Running at 2560x1440 borderless with 16xAF and FXAA forced in the Nvidia control panel.


The actual minimum framerate was ~32fps and the highest was 45 with a 60fps cap. Switching from driver AF to in-game Ultra AF gave me considerably worse performance, dropping as low as 25fps at times during the benchmark. Could there be a chance that driver AF isn't working at all? It was kinda hard to tell in the benchmark.

Also, since this game requires you to restart to change several of the graphics settings, has anyone found a way to skip the intro videos?
That's actually pretty good for 1440p. I have similar specs (i5-2500k, but otherwise the same) and I play on a 1080p monitor. You think I could run this on high/ultra near 60fps? (textures on high or normal obviously)
 
With ultra textures

With High textures


i7 4770k@4.5
16gb 2133 DDR3
AMD 290x@1150/1550
Windows 8.1, latest AMD beta driver


Max settings @ 1080p, Vsync turned off. I would toss out the low end of both numbers, as they happen while the benchmark loads in. I bet if I installed this on my SSD the low numbers would be higher. I wouldn't go by these numbers anyway; in-game benchmarks like these are hardly ever representative of the full game. Tomb Raider, for example, would show good results in the benchmark and then dip into the 20s in certain levels. Your mileage may vary.
 

Anyway, all that doesn't change the fact that the VRAM usage is completely ridiculous. The game doesn't even look that impressive and a lot of textures are kinda shitty even on higher setting. The game seems to run fine for the most part and I wouldn't call it a bad port, but with the way these textures look there clearly could've been some extra optimization.

E: Going to try and see if I run into any problems when I put the Nvidia texture setting on 'Quality'. Will report back.
The game warrants a breakdown of how GPU resources are being utilized, because it's a bit confusing that the jump to the higher texture setting doesn't make the game look better.

Possibly limitations of the engine, or of how texture assets are loaded into the game (in what sizes/chunks, etc.). LithTech!
 
That justified the requirements for 3GB of VRAM. You'd have different expectations if they recommend double that.
You'd think so, but it's clear that the PC is going to require more VRAM than the consoles to do the same thing. And the consoles have quite a lot to start with now.

We all thought the 970 was some great deal by Nvidia for once, but in reality they've just been preparing for this VRAM apocalypse. It's probably that cheap because more and more games will come out that a 970 won't even be able to max out.

There should have been more clarity about this sea change in PC requirements ages ago. How many people bought high-end 2GB or 3GB cards in the past year when Nvidia/ATI knew those people were probably pissing money away?
 
So glad I went with my gut and got a 4GB 770. Everyone I asked said it was a waste of money.
The jury's still out on that one. You might be able to put the texture level up a notch, but you're going to have to turn down a few other options to get a reliable 60fps.

Of course Nvidia haven't released any new drivers for this game yet (which is unusual since it's Nvidia-branded), so perhaps we'll get a performance bump when they do.
 
Did people really think this sudden massive jump in VRAM requirements was because these games were going to look massively better than anything that's ever come out? It's purely because these games are now being designed for next-gen consoles and resources are being allocated accordingly. Console ports.
I don't think it's too much to expect games with next-gen as a baseline (and thus much higher VRAM requirements) to have improved texture quality all-round.

Shadow of Mordor (2014)



Assassin's Creed Revelations (2011)

 
People saying FEAR 3 SLI bits work, please post a shot of your NV Inspector window, a game screenshot showing scaling, your game settings, and your system config, because tests on my Win 7 x64, TITAN SLI system show 10-15% at best: