
Middle-earth: Shadow of Mordor PC Performance Thread

Romir

Member
SLI seems to work fine using the F.E.A.R. 3 SLI bits with Nvidia Inspector. I also added the game's exe to the profile, but I don't know if that was necessary.
 

JaseC

gave away the keys to the kingdom.
It's rounded up from 4.8GB, apparently, which makes sense considering that 4GB -> 6GB is the most common jump in consumer cards. So a 4GB card might get some stuttering, but they played it as safe as possible.

That includes SSAA, though, as some initially suspected. If you look at the screenshots Zeliard has been posting, you'll see that when the game is actually rendering at 1920x1200 it uses ~3.5GB.
 

hawk2025

Member
[screenshot of the garbled output]



Help :( :( :(



Forgot to add: I tested several other games since the issue started and they are all just fine. So it's not my GPU :(
 

Romir

Member
Quick one.

What program do you guys use to monitor VRAM use? Would be for Nvidia.

GPU-Z will do it in the background on the Sensors tab with "continue refreshing this screen while it's in the background" checked.

In-game, people use the RivaTuner overlay in EVGA Precision X or MSI Afterburner.
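
For anyone who'd rather log VRAM use to a console than watch an overlay, here's a minimal sketch using NVIDIA's NVML bindings for Python; the pynvml package, device index 0, and the one-second polling interval are my assumptions, not something anyone in the thread mentioned:

```python
# Minimal VRAM polling sketch via NVML (assumes the pynvml package is installed).
import time
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)  # first GPU; adjust the index for SLI setups

try:
    while True:
        mem = nvmlDeviceGetMemoryInfo(handle)  # .used and .total are in bytes
        print(f"VRAM: {mem.used / 1024**2:.0f} / {mem.total / 1024**2:.0f} MiB")
        time.sleep(1.0)  # poll once per second
except KeyboardInterrupt:
    nvmlShutdown()
```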
 

DarkoMaledictus

Tier Whore
Wait, some games don't support SLI at launch?

I just bought two 970s so this is my first time with SLI :(

Sorry dude, my experience is that SLI works well at launch, and then after a little while drivers and games stop supporting it and you're stuck with a single-card solution. Might have changed by now, but my experience with SLI was not good.
 

JaseC

gave away the keys to the kingdom.
Sorry dude, my experience is that SLI works well at launch, and then after a little while drivers and games stop supporting it and you're stuck with a single-card solution. Might have changed by now, but my experience with SLI was not good.

I've been using SLI for two years and literally the only games I've ever touched within that period that don't support SLI, either officially or by using the SLI bits of another game, are id Tech 5 titles (namely Rage and Wolf14) and Dead Rising 3. Games that don't support SLI are the exception, not the rule.
 

Zeliard

Member
These are my benchmark results with a Gigabyte 970 + i5 2500K @ 4.5GHz, 1920x1200, ultra everything + textures, vsync off. And yes, the minimum framerate shown here is just the initial stutter upon load; it's not indicative of a true drop.

Gigabyte 970 @ stock:

[benchmark screenshot]


Gigabyte 970 further OC'd:

[benchmark screenshot]
 

DarkoMaledictus

Tier Whore
I've been using SLI for two years and literally the only games I've ever touched within that period that don't support SLI, either officially or by using the SLI bits of another game, are id Tech 5 titles (namely Rage and Wolf14) and Dead Rising 3. Games that don't support SLI are the exception, not the rule.

That was not the case not too long ago; I remember seeing very poor SLI benefits and many games that just didn't care about SLI. Looking at you, Diablo 3... Blizzard has a very poor track record on supporting SLI :(!
 

knitoe

Member
My native res is 1440p. The closest I can get to 1080/1200p is 2048x1152.

Max settings:

Low: 2.1GB VRAM
[screenshot]

Medium: 3.1GB VRAM
[screenshot]

High: 3.8GB VRAM
[screenshot]

Ultra: 5.3GB VRAM
[screenshot]

Going by my results, I don't see how you can truly get Ultra unless you're on a 6GB VRAM card. The game may let you select it, but it might not actually be running at Ultra.
 

Rhaknar

The Steam equivalent of the drunk friend who keeps offering to pay your tab all night.
I've been using SLI for two years and literally the only games I've ever touched within that period that don't support SLI, either officially or by using the SLI bits of another game, are id Tech 5 titles (namely Rage and Wolf14) and Dead Rising 3. Games that don't support SLI are the exception, not the rule.

but... but you don't play games? :(
 

DarkoMaledictus

Tier Whore
Because it would make no sense; the game doesn't tax a single card, so why throw two at it?

Yep, but getting worse results when enabling SLI was definitely not a point in its favor ;).

Other games like Metro 2033 have terrible SLI scaling. Just saying I don't buy into SLI's benefits; the performance, scaling, and driver issues are just not worth it. Anyway, just my two cents!
 

AndyBNV

Nvidia
Sure is, but Blizzard officially stated that SLI was unsupported.

Diablo 3 SLI is supported. Runs great at higher resolutions, with good to near perfect scaling depending on the res.

For all games we work to have an SLI profile, and if it doesn't happen it's not for a lack of trying.
 

dhlt25

Member
My native res is 1440p. The closest I can get to 1080/1200p is 2048x1152.

Max settings:

Low: 2.1GB VRAM

Medium: 3.1GB VRAM

High: 3.8GB VRAM

Ultra: 5.3GB VRAM

Going by my results, I don't see how you can truly get Ultra unless you're on a 6GB VRAM card. The game may let you select it, but it might not actually be running at Ultra.

Damn, looks like I won't be able to max this with my 970.

The game looks kinda ugly, though. I can't see where all the VRAM is being used in the screenshots; the textures look worse than The Witcher 2.
 
Yep, but getting worse results when enabling SLI was definitely not a point in its favor ;).

Other games like Metro 2033 have terrible SLI scaling. Just saying I don't buy into SLI's benefits; the performance, scaling, and driver issues are just not worth it. Anyway, just my two cents!

Metro 2033 (the original) has near-linear SLI scaling. What are you on about?!
 

Smash88

Banned
Diablo 3 SLI is supported. Runs great at higher resolutions, with good to near perfect scaling depending on the res.

For all games we work to have a SLI profile, and if it doesn't happen it's not for a lack of trying.

Andy, when are drivers for Shadow of Mordor being released? I've also noticed some random dips, which leads me to believe the drivers aren't optimized for this game.
 

R1CHO

Member
My native res is 1440p. The closest I can get to 1080/1200p is 2048x1152.

Max settings:

Low: 2.1GB VRAM
[screenshot]

Medium: 3.1GB VRAM
[screenshot]

High: 3.8GB VRAM
[screenshot]

Ultra: 5.3GB VRAM
[screenshot]

Going by my results, I don't see how you can truly get Ultra unless you're on a 6GB VRAM card. The game may let you select it, but it might not actually be running at Ultra.

Going by your results, nobody with a 2GB VRAM card or lower would be able to run the game even at the lowest setting.

Regardless of the setting, the game will cache different amounts of VRAM if there is enough available.

The interesting part is knowing how a lack of VRAM affects performance. Streaming problems like in Watch Dogs? Maybe an overall framerate loss? More aggressive LOD, perhaps?

We will have to wait for deeper benchmarks to know.
 
Yep, but getting worse results when enabling SLI was definitely not a point in its favor ;).

Other games like Metro 2033 have terrible SLI scaling. Just saying I don't buy into SLI's benefits; the performance, scaling, and driver issues are just not worth it. Anyway, just my two cents!

That's a flat-out lie. Metro had terrible SLI results at launch, but once support was added, performance with SLI is around 90% higher than a single card.

Anyway, this is veering wildly off-topic.
 
Yep, but getting worse results when enabling SLI was definitely not a point in its favor ;).

Other games like Metro 2033 have terrible SLI scaling. Just saying I don't buy into SLI's benefits; the performance, scaling, and driver issues are just not worth it. Anyway, just my two cents!

Metro scales fine for me, Diablo has a nice profile (as Andy said), and your statement that "it runs fine at launch and then stops" makes no sense either. Maybe don't grace this thread with your SLI experiences from 10 years ago.
 

AJLma

Member
80fps average @ 1440p in the benchmark. Everything maxed, no motion blur or DoF.

i7-3770 @ 3.4GHz
R9 290 @ stock clock
8GB RAM
 

DarkoMaledictus

Tier Whore
Diablo 3 SLI is supported. Runs great at higher resolutions, with good to near perfect scaling depending on the res.

For all games we work to have a SLI profile, and if it doesn't happen it's not for a lack of trying.

It may be supported now, but it was not supported by Blizzard for the longest time. Other games like WoW also do not support SLI at all.

(http://www.blizzposts.com/topic/en/298342/nvidia-sli-not-working)

SLI or CrossFire is great for big resolutions using multiple screens. However, minimum framerates, stutter, and driver issues drove me away from it.

Now I just get the biggest and baddest single-GPU card I can, and my life is much simpler!
 

erawsd

Member
With my OC'd 7950 and 2500K, the game auto-picked a mixture of Ultra/High settings @ 1080p. The benchmark hits an average of 63fps. I guess I'll just run with it and see if I notice any issues during actual play.
 

knitoe

Member
Going by your results, nobody with a 2GB VRAM card or lower would be able to run the game even at the lowest setting.

Regardless of the setting, the game will cache different amounts of VRAM if there is enough available.

The interesting part is knowing how a lack of VRAM affects performance. Streaming problems like in Watch Dogs? Maybe an overall framerate loss?
Yeah. It looks like the game dynamically loads based on your card's VRAM and/or performance, so the settings you select in the game may not actually be what it's running with. It's going to make it very hard to do comparisons without taking pictures and zooming in.
 

cheezcake

Member
You're certainly right about hard drive capacities, but I have never seen the same thing occur for system memory and video card memory capacities, or at least I don't remember it. For example, the work laptop I am typing from has 8.00 GB as Windows reports it. It has slightly less than that usable, but this is because of memory addressing, not because it was manufactured in decimal GB rather than binary GB.

HDD/SSD storage is manufactured in decimal amounts, with sizes reported correctly in MB/GB. Main memory is manufactured in binary amounts (stupid me forgot my computer architecture courses and how memory addressing works, otherwise I should have known that from the start) and is advertised, incorrectly, in MB/GB, because I'm guessing the majority of people have no idea what a MiB or GiB is. Though technically, when you see RAM reported as 8GB, they mean GiB. The end user doesn't give a crap. Funnily enough, this means we were both wrong and that my 1280 MiB card is actually 1.342177 GB.

Also, can I say how much I hate this inconsistency? Everyone should get on the MiB/GiB train.
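
To make the decimal-vs-binary arithmetic above concrete, here's a quick sketch; the 1280 MiB figure comes from the post, and the unit constants are just the standard definitions:

```python
# The decimal-vs-binary unit math from the post above, in code.
MIB = 1024**2   # mebibyte, in bytes (binary)
GIB = 1024**3   # gibibyte, in bytes (binary)
GB = 1000**3    # gigabyte, in bytes (decimal)

vram_bytes = 1280 * MIB   # the 1280 MiB card mentioned above
print(vram_bytes / GIB)   # 1.25       GiB
print(vram_bytes / GB)    # 1.34217728 GB
```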
 

Corpekata

Banned
That's interesting. I'll probably be turning those off, then, if you're getting those framerates on Ultra at 1440p.

Generally not a big hit on framerates in most games (well, maybe DoF in some). I think those are more about personal taste than performance. I have blur off in pretty much every game unless it's well done, like Crytek's.
 

nbthedude

Member
Quite frankly, the game, from the screens in this thread, looks like shit.
It's not the prettiest game, but I played for two hours and goddamn is it fun.

In terms of performance, I'm on an i7-4770 at stock with a factory-OC'd 780, and I have ice-smooth framerates with everything maxed except the ultra texture pack.

This game is pretty fucking awesome once you get going.
 
Just installing the texture pack now, but I got a strange resolution, same as others were reporting, when I looked briefly in the options menu on first boot. I don't use a custom res and never have, so what gives?
 

knitoe

Member
My friend said that he can play on Ultra with a 2GB VRAM card.

See my post above. In short, the game lets him select Ultra and he assumes he's running Ultra, but in actuality the game is probably running some other settings to fit his VRAM, which would make it a bitch to do accurate comparisons.
 
HDD/SSD storage is manufactured in decimal amounts, with sizes reported correctly in MB/GB. Main memory is manufactured in binary amounts (stupid me forgot my computer architecture courses and how memory addressing works, otherwise I should have known that from the start) and is advertised, incorrectly, in MB/GB, because I'm guessing the majority of people have no idea what a MiB or GiB is. Though technically, when you see RAM reported as 8GB, they mean GiB. The end user doesn't give a crap. Funnily enough, this means we were both wrong and that my 1280 MiB card is actually 1.342177 GB.

Also, can I say how much I hate this inconsistency? Everyone should get on the MiB/GiB train.

How did you get 1.342177?

1280 / 1024 = 1.25.
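
Both numbers are right once you pick a unit; a two-line sketch of the two conversions, using the same 1280 MiB figure as above:

```python
# 1280 MiB expressed in binary (GiB) and decimal (GB) units.
print(1280 / 1024)               # 1.25       GiB (binary divisor)
print(1280 * 1024**2 / 1000**3)  # 1.34217728 GB  (bytes / 10^9)
```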
 

hawk2025

Member
Attention!

I figured out my issue: Nvidia's stereoscopic 3D was completely messing up the image as seen above. Turned it off and it's all gone.
 

R1CHO

Member
Thanks, will try.

Edit: just tried. Exact same issue windowed and with other resolutions.

I don't know, mate, seems like a weird bug.

If you want to keep trying things, you could do stuff like installing an older driver, using another video output, or another screen if you can... things like that. I can't think of anything concrete.

PS: Nice to see it's solved.
 

Smash88

Banned
Okay, I downloaded EVGA Precision X 5.2.2.

This thing is garbage. It's so hard to find anything on it; I hate it, but I digress. Where the hell do I go to change the color of the OSD?

Also, how do I find the graphs?
 

knitoe

Member
Okay, I downloaded EVGA Precision X 5.2.2.

This thing is garbage. It's so hard to find anything on it; I hate it, but I digress. Where the hell do I go to change the color of the OSD?

Also, how do I find the graphs?
Use MSI Afterburner. It will also install RivaTuner for the in-game display. Go into that and change the colors.
 