
Middle-earth: Shadow of Mordor PC Performance Thread

Grief.exe

Member
This benchmark is bizarre. Despite no noticeable drops from 60, it apparently reports my minimum fps as 33. Weird.

EDIT: Hmm. That Steam ID trick doesn't seem to be working for me. No download is happening; it just flashes to my games library.

Windows key + Run + the install ID

Then select the option from Shadow of Mordor's Properties > DLC menu.

Is the Steam Overlay working for anyone?

Wasn't working for me either.
 

Kieli

Member
An average of 73 fps on highest settings at 1080p, crippled by stuttering because of the lack of VRAM, is a ridiculous sight.

Goddammit, nVIDIA.
 

Zeliard

Member
Few more ultra texture pack pics @ 1920x1200, this time with more of the open world and a bit of combat:

[screenshots: ibbKYrzuzYbvFW.png, ibjHpDZwcAcP2x.png, iblxbFmWeavw7n.png]


In that combat screen I had my 970 OC'd further as a test.
 
Actually, memory manufacturers still strictly manufacture memory in decimal units, with the byte as the lowest common denominator. So whenever you see MB (and not MiB) on an info box, it is definitely megabytes. So in this case it is 1.28 GB of VRAM.

Give it up mate!

Do you have a source for that? Because semiconductor manufacturing using binary rather than decimal definitions (1024 vs. 1000) seems to be part of the JEDEC standards, including their standards for DDR3 and GDDR5 memory. Furthermore, the wiki article on MB mentions that in storage and memory contexts it is much more common to use the binary rather than the decimal definition. Additionally, my "3GB" 780 video card has 3072 MB, not 3000 MB, and my new "4GB" (according to the label) video card has 4096 MB.
 

Psy

Neo Member
For those having issues with the HD Pack not showing up under "DLC" in Steam: try completely exiting Steam and then re-launching it. That did the trick for me. Downloading...

[screenshot: SCWWtGw.jpg]
 
Few more ultra texture pack pics @ 1920x1200, this time with more of the open world and a bit of combat:

[screenshots: ibbKYrzuzYbvFW.png, ibjHpDZwcAcP2x.png, iblxbFmWeavw7n.png]


In that combat screen I had my 970 OC'd further as a test.

So you're getting a locked 60? VRAM is staying in the 3.5 GB range? Guess I'd be OK then.
 
Either those shots are heavily compressed or the game is rough as hell without decent AA.

For 'ultra' settings they are pretty underwhelming.
 
Going by the results so far: PC. It seems the VRAM fears were greatly exaggerated.

Well, with that said, I decided to take a chance and purchased it off GMG. Hopefully it runs well for me.

Guess I ordered it too late, though, because it looks like there's no one there to send me a code so I can register it. I was hoping to install it overnight.
 
Ground textures look like ass in that third pic. The cross-gen roots are showing, but it might be because it's open-world (or close to it), so you can't realistically give everything great textures. Still assy.


What little foliage there is on the ground looks last-gen as well. Can you take a pic with more foliage? Bushes, trees, or in a forest, whatever.
 

UnrealEck

Member
Ultra ambient occlusion has a pretty large impact on performance over the high setting.
Shadows also have a fair impact on performance.

If anyone's trying to get the best graphics-to-performance balance, you could try changing those first.
 

Denton

Member
OH NOES 6GB VRAM NEEDED

[image: 3a0.gif]


I wish companies wouldn't inflate their specs and I wish people wouldn't go crazy because of it. Maybe next time.
 

Kieli

Member
Quite frankly, judging from the screens in this thread, the game looks like shit.

It doesn't look bad, but it really does not look like a game that needs 6 GB of VRAM. But, eh, what do I know about programming. I'm just an armchair gamer.
 

Araris

Neo Member
Pretty good performance actually. Nothing to complain about.

EVGA ACX GTX 780 - i7 5820k @ 4.4Ghz - 16GB DDR4 2667Mhz - 500GB Samsung EVO 840 SSD

1920x1080 - Everything Ultra (HD Texture Pack not yet downloaded), Tessellation on, Vsync on (not sure why it goes above 60 in the benchmark), etc...
[screenshot: gRtvo76.jpg]
 

cheezcake

Member
Do you have a source for that? Because semiconductor manufacturing using binary rather than decimal definitions (1024 vs. 1000) seems to be part of the JEDEC standards, including their standards for DDR3 and GDDR5 memory. Furthermore, the wiki article on MB mentions that in storage and memory contexts it is much more common to use the binary rather than the decimal definition. Additionally, my "3GB" 780 video card has 3072 MB, not 3000 MB, and my new "4GB" (according to the label) video card has 4096 MB.

Have you ever noticed the discrepancy between the HDD size reported by the manufacturer and the size reported in Windows (My Computer, etc.)? E.g. my 250 GB Samsung SSD shows as 232 GB in Windows. That's because Windows actually uses the mebibyte/gibibyte definition but refuses to change its prefixes. Manufacturers, however, have to be strict about reporting size in MiB or MB, because the two are clearly differentiated by all major standards organisations.

An easy way to check is to convert the manufacturer's reported size to bytes by multiplying by 1000 as many times as needed, then divide by 1024 the same number of times, and you get the size reported by Windows.

TL;DR: a lot of the confusion stems from the way Windows mixes up MiB and MB when reporting memory size. And just to be clear, it IS more common to use the prefixes in a binary sense, but that doesn't mean it's correct.
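The conversion described above can be sketched in a few lines of Python (the function name is made up for illustration, not anything from this thread):

```python
# Sketch of the decimal-vs-binary size discrepancy described above.
# Manufacturers label capacity in decimal units (1 GB = 1000**3 bytes);
# Windows divides the byte count by 1024**3 but still prints "GB" (really GiB).
def decimal_gb_as_windows_gb(decimal_gb: float) -> float:
    total_bytes = decimal_gb * 1000**3   # manufacturer's decimal GB -> bytes
    return total_bytes / 1024**3         # bytes -> Windows' binary "GB"

print(round(decimal_gb_as_windows_gb(250), 1))  # a "250 GB" SSD -> 232.8
```

Which matches the ~232 GB that Windows reports for a 250 GB drive.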
 

Buburibon

Member
GTX 980 SC here. VRAM is often maxed out with "ultra" textures enabled, which I believe led to a couple of stutters while fighting a large group of enemies in a more populated area. The difference between "high" and "ultra" textures is pretty small from where I'm sitting (~9 ft/2.74 m from a 60" TV), so I'll be sticking with "high" for a guaranteed buttery smooth experience at all times. Now, if I were to complain about one thing, it'd be the lack of AA options. SMAA T2X would do wonders for this game without sacrificing performance.
 
Hmm, so this isn't a repeat of The Witcher 2 then, never mind the positive 570 results.

I guess a 560 Ti IS good enough for the game, though I'd probably play it safe anyway.
I messed up the texture comparison, though. I forgot to restart before taking the shots, so the models are different. Here's another quick look at how the textures change from highest to lowest, though it isn't a great shot:
https://farm4.staticflickr.com/3905/15398889385_19b0509f25_o.png

https://farm3.staticflickr.com/2948/15398889525_198a5a2275_o.png
 
It's dipped on occasion to 55 or so without further OC, but it's generally pretty smooth. Performance should improve further with driver updates.

Thanks. I'll probably wait for the SLI profile then.

You don't really need AA at 1440p, though. I'll use it if it doesn't hurt performance (why not?), but I don't miss it at 1440p.

(Why the hell are some of you rocking 970's and 980's and still playing @1080p?)

Even at 4K, some games need AA...
 

dark10x

Digital Foundry pixel pusher
What the heck is with those benchmark numbers? Is it simply a matter of a demanding benchmark or is the game really running that poorly?
 

Mechazawa

Member
OH NOES 6GB VRAM NEEDED

[image: 3a0.gif]


I wish companies wouldn't inflate their specs and I wish people wouldn't go crazy because of it. Maybe next time.

The game is using more than 4 gigs for plenty of people who crank it to max. Since there aren't really any 5-gig cards, it's Monolith covering their ass and saying 6 gigs is what you need to fully avoid that bottleneck.
 

Eusis

Member
I messed up the texture comparison, though. I forgot to restart before taking the shots, so the models are different. Here's another quick look at how the textures change from highest to lowest, though it isn't a great shot:
https://farm4.staticflickr.com/3905/15398889385_19b0509f25_o.png
https://farm3.staticflickr.com/2948/15398889525_198a5a2275_o.png
Oh yeah, it really shows on the ground there. Still, if the RAM requirements are grossly overstated, I guess it COULD be fine with a 560 Ti, but I remember how The Witcher 2 would stutter unless you threw more RAM at the game, and I can imagine the same happening here.

People sweating this with anything that has 2 GB+ of VRAM from the newer generation (6xx/7xxx) ARE idiots, though. May as well just go PC unless you really want to do Nemesis stuff on consoles, and on XB1 you'd presumably be hit harder by lower performance.
 
TL;DR: a lot of the confusion stems from the way Windows mixes up MiB and MB when reporting memory size. And just to be clear, it IS more common to use the prefixes in a binary sense, but that doesn't mean it's correct.

You're certainly right about hard drive capacities, but I have never seen the same thing happen with system memory or video card memory capacities, or at least I don't remember it. For example, this work laptop I am typing on reports 8.00 GB in Windows. Slightly less than that is usable, but that's because of memory addressing, not because it was manufactured in decimal GB rather than binary GB.
 

Grumbul

Member
So what is the best way to get some AA on this puppy then?

Would SweetFX be a better option, performance-wise, than the in-game supersampling option?
 
I'm constantly impressed at how well my i5-2500k @ 4.2GHz and 2GB 7850 hold up. I'm playing with a mix of high and ultra, with textures on medium, and I'm getting around 45-60 fps constantly. The game looks incredible and plays incredibly; I'm just impressed overall. The only bummer is that you can kind of see the low resolution of the medium textures when it zooms in for the orc shit-talking, but it's entirely unnoticeable in game. Might try kicking the textures up to high to see what happens.

EDIT: Btw, this is the shit-talking bastard that cheap-shotted me and went to the top of my kill list. I'm coming for you. It's also a good representation of the medium texture quality I was talking about above.

[screenshot: dvcwrs.png]
 

Dryk

Member
So the 6GB VRAM thing was bull?
It's rounded up from 4.8 GB, apparently, which makes sense considering that 4 GB -> 6 GB is the most common jump in consumer cards. So a 4 GB card might get some stuttering, but they played it as safe as possible.
 

hawk2025

Member
Something has gone very wrong :(


I'm getting 60fps, but a bizarre image-within-an-image effect depending on the camera angle. Almost like a picture-in-picture display, duplicating the image.

No idea how to fix it :(


I'm on a 780 6GB
 