
Shadow of Mordor offers Ultra texture optional download, recommends 6GB VRAM @ 1080p

He says it didn't stutter. It did fill up his 4GB of VRAM, but he didn't notice any stuttering. Not sure if he was playing at 1080p or 1440p. Will ask him when he's back if anyone's interested.

I am sure there will be plenty of good impressions soon.

This is a little misleading; it seems to be maxing out VRAM on 4GB cards at the very least, even if performance isn't bad on them.

I can believe that, but since it seems to just barely run, I think they wanted to take the safe route and not recommend 4GB. The trouble is that there are close to no cards in the 4-6GB VRAM range. So even if the actual requirement were only 4.5GB or so, they'd still need to recommend 6GB, as that's the next common VRAM amount. So I doubt the VRAM requirements will quickly go above 6GB.

That video says "Textures:Ultra"

So I beg to differ.

The game only comes with "Textures: High" by default. Ultra has to be downloaded separately.

As the post above me says, the description tells you that if you put it on Ultra and you don't have the pack installed, it uses the high textures. Since it was so soon after release, I wouldn't be surprised if he didn't have the pack.
 

BONKERS

Member
Read the description in the settings: Ultra will only differ from High IF the HD textures are installed.

Well, the user doesn't state whether or not he has them either, right?

One would assume that if it's legally possible to get the game somewhere in the world, the textures would be available too.
 

KingSnake

The Birthday Skeleton
Reading through the PC performance thread and seeing the framerate reports for PS4 (probably 30fps, maybe 30-40fps) makes this thread stupidly funny in retrospect.

I really hope that someone gave up on the PC version to buy the PS4 version because of this thread. That would be so good.
 

Seanspeed

Banned
Reading through the PC performance thread and seeing the framerate reports for PS4 (probably 30fps, maybe 30-40fps) makes this thread stupidly funny in retrospect.
I'm pretty sure it's been said that the PS4 version is 60fps.

EDIT: Oh, I'm seeing in the other thread that it's not actually 60fps. Just a case of bad information becoming widespread?

Yea, that'd be hilarious if somebody bought the PS4 version instead thinking it would run better. Sounds like even very modest PCs are able to run this at 1080p/60fps with decent settings.
 

KingSnake

The Birthday Skeleton
I'm pretty sure it's been said that the PS4 version is 60fps.

EDIT: Oh, I'm seeing in the other thread that it's not actually 60fps. Just a case of bad information becoming widespread?

Yea, that'd be hilarious if somebody bought the PS4 version instead thinking it would run better. Sounds like even very modest PCs are able to run this at 1080p/60fps with decent settings.

It's not bad information, more like wishful thinking. The developers said they were targeting 60fps but never confirmed they'd succeeded; everyone just took it for granted. And yeah, this game actually seems to run decently on a mid-range PC, which makes it a good PC port.
 

Kieli

Member
And I thought I would make it to The Witcher 3 before a full upgrade... damn.

Yep....

For the games that are actually lookers, I shudder to think what their VRAM requirements will be when a game that looks like Shadow of Mordor already requires 6GB. =.=
 

Teremap

Banned
Yep....

For the games that are actually lookers, I shudder to think what their VRAM requirements will be when a game that looks like Shadow of Mordor already requires 6GB. =.=
But we already have better-looking games that require less VRAM (in some cases, considerably less).

Hell, Crysis 3 came out early last year and it uses considerably less VRAM while also being visually superior in many ways. Now, I don't expect every dev to live up to Crytek standards, but Monolith is obviously doing something wrong.
 

SapientWolf

Trucker Sexologist
Yep....

For the games that are actually lookers, I shudder to think what their VRAM requirements will be when a game that looks like Shadow of Mordor already requires 6GB. =.=
What would you think if Ryse looked and ran better on worse hardware?
 

pa22word

Member
What would you think if Ryse looked and ran better on worse hardware?

Ryse is a corridor simulator that's more linear and closed off than your average CoD game, with an FOV of about 30 and a game world about as static and spatially uninteresting as a fucking board that's painted prettily, and it was designed that way intentionally so the Xbone wouldn't explode running it.

Hence I wouldn't give a shit if it performed better, because false equivalencies and tales from the ass from people who don't really know what they're talking about (see: 99.9% of this thread prior to launch) don't really mean anything; at the end of the day, what matters is performance relative to demands.
 
Yep....

For the games that are actually lookers, I shudder to think what their VRAM requirements will be when a game that looks like Shadow of Mordor already requires 6GB. =.=

But it doesn't require 6GB. That's for the ultra settings, and besides, it doesn't even fully utilize the 6GB.

But we already have better-looking games that require less VRAM (in some cases, considerably less).

Hell, Crysis 3 came out early last year and it uses considerably less VRAM while also being visually superior in many ways. Now, I don't expect every dev to live up to Crytek standards, but Monolith is obviously doing something wrong.

Being visually superior in many ways doesn't matter much here. These requirements are only for the ultra textures; the rest of the graphical effects don't have a big impact on VRAM. Also, Crysis 3 is almost a corridor shooter compared to this.
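
To put rough numbers on why textures dominate VRAM while most other effects barely register, here's a back-of-the-envelope sketch in Python. The deferred-rendering target count and formats are assumptions for illustration, not Shadow of Mordor's actual pipeline:

def mb(n_bytes):
    return n_bytes / (1024 ** 2)

w, h = 1920, 1080                        # 1080p
targets = 4                              # hypothetical G-buffer layout
bpp = 8                                  # RGBA16F: 8 bytes per pixel
gbuffer = w * h * targets * bpp          # all G-buffer render targets
depth = w * h * 4                        # 32-bit depth/stencil buffer
print(f"G-buffer + depth: {mb(gbuffer + depth):.0f} MB")    # ~71 MB

one_tex = 4096 * 4096 * 1 * (4 / 3)      # BC3: 1 byte/texel, mips add ~1/3
print(f"One 4096^2 BC3 texture: {mb(one_tex):.1f} MB")      # ~21.3 MB
# A few hundred resident textures reach into the gigabytes; the per-frame
# buffers never get anywhere close.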
 

K.Jack

Knowledge is power, guard it well
Yep....

For the games that are actually lookers, I shudder to think what their VRAM requirements will be when a game that looks like Shadow of Mordor already requires 6GB. =.=

Batman: Arkham Knight's VRAM requirements will be very interesting, and telling of the future.
 

Elsolar

Member
Being visually superior in many ways doesn't matter much here. These requirements are only for the ultra textures; the rest of the graphical effects don't have a big impact on VRAM. Also, Crysis 3 is almost a corridor shooter compared to this.

I dunno, Crysis 3 had some pretty open environments with a lot of variety and detail. Even if the gameplay was linear, can you really say a level like the dam is anything like a corridor shooter from a rendering standpoint?
 

SapientWolf

Trucker Sexologist
Ryse is a corridor simulator that's more linear and closed off than your average CoD game, with an FOV of about 30 and a game world about as static and spatially uninteresting as a fucking board that's painted prettily, and it was designed that way intentionally so the Xbone wouldn't explode running it.

Hence I wouldn't give a shit if it performed better, because false equivalencies and tales from the ass from people who don't really know what they're talking about (see: 99.9% of this thread prior to launch) don't really mean anything; at the end of the day, what matters is performance relative to demands.
Draw distance in and of itself doesn't really determine scene complexity. Objects out in the distance are going to have a lower LOD and less detail. In fact, according to Crytek, rendering the cityscape of Crysis 2/3 proved to be more technologically challenging than the more open jungle of the original Crysis. Things such as the number of light sources, the polygon count, materials, the number and quality of the shaders, etc. all play a much larger role. And since open world games stream in their assets the memory requirements aren't necessarily greater either.

The salient point is that the game is demanding relative to how it looks. If the reports are true, then The Witcher 3 is running at around 60fps on a 780 while looking quite a bit better than Mordor and on par with Ryse. I watched the Infiltrator demo run in real time on a single 780. It seems that going current-gen only allows for more efficient rendering, so I wouldn't be surprised if we get improved graphics in the future without the system requirements skyrocketing.
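
A toy illustration of the LOD point above, with made-up distance thresholds (no real engine works off a table this crude, but the principle holds):

# Minimal sketch of distance-based LOD selection: per-frame cost is driven
# by what's near the camera, not by total world size. Thresholds invented.

LOD_THRESHOLDS = [(50.0, 0), (150.0, 1), (400.0, 2)]  # (max distance, LOD)

def pick_lod(distance):
    """Return the LOD index (0 = full detail) for an object at `distance`."""
    for max_dist, lod in LOD_THRESHOLDS:
        if distance <= max_dist:
            return lod
    return 3  # lowest-detail imposter/billboard beyond 400 units

# Distant objects resolve to cheap LODs, so a 200km world doesn't render
# 100x the triangles of a 2km world at similar visible-scene fidelity.
print([pick_lod(d) for d in (10, 120, 350, 5000)])  # [0, 1, 2, 3]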
 
Reading through the PC performance thread and seeing the framerate reports for PS4 (probably 30fps, maybe 30-40fps) makes this thread stupidly funny in retrospect.

I really hope that someone gave up on the PC version to buy the PS4 version because of this thread. That would be so good.
This post is full of stupidity.
 
It's only a matter of time before PC modders optimize and enhance things in the PC version in ways the devs either didn't have time for or didn't care to do, and then this game will REALLY look good instead of 'pretty nice', as it does now. I mean, a single guy fixed Watch Dogs' shortcomings and helped it look as good as the 2012 reveal, which, of course, we were originally supposed to believe would be representative of the console versions.
 
I dunno, Crysis 3 had some pretty open environments with a lot of variety and detail. Even if the gameplay was linear, can you really say a level like the dam is anything like a corridor shooter from a rendering standpoint?

Ah, I should have been clearer. I am not talking about the impact from a rendering standpoint; it is possible that both are just as challenging. However, Crytek can afford to put a lot of effort into each area of the game, since players are much more restricted. Filling a world the size of Shadow of Mordor's with the detail of a Crysis game would demand an enormous amount of time.
 
Batman: Arkham Knight's VRAM requirements will be very interesting, and telling of the future.

I'd imagine 3GB, or somewhere between 3 and 4, for high textures. I just can't see it changing, since the consoles have the same amount of VRAM to work with, unless they're going to be very, very efficient with their RAM. Sure, you might get an option for better textures, but that would likely demand 4 or 6GB of VRAM (since you can't really have anything in between).
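
For what it's worth, that "nothing in between" observation falls out of how cards are built: VRAM capacity comes in steps set by the memory bus width and the per-chip density. A rough sketch with era-typical GDDR5 chip sizes (illustrative, not an exhaustive list of real configurations):

# One GDDR5 chip sits on each 32-bit channel of the bus, so capacity
# scales in fixed steps per bus width.

def vram_options(bus_width_bits, chip_gb=(0.25, 0.5)):
    chips = bus_width_bits // 32
    return [chips * c for c in chip_gb]

print(vram_options(256))  # [2.0, 4.0]  e.g. GTX 770-class cards
print(vram_options(384))  # [3.0, 6.0]  e.g. GTX 780/Titan-class cards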

This post is full of stupidity.

Nah, I agree. There were many people overreacting, and many who considered the PS4 version because of it. It turns out the PC was the better choice in many of these situations.


And so it seems that the PS4 is indeed using high textures. So the high textures on PC and PS4 likely use the same amount of VRAM, and nothing outrageous is going on here. And this quote says what many people in this thread should have known:

"They are on monster PCs making the highest possible quality stuff and then we find ways to optimise it, to fit onto next-gen, to fit onto PCs at high-end specs. Then obviously there's going to be that boundary where our monster development PCs are running it OK - but why not give people the option to crank it up? It makes sense to get it out into the world there - we have it, we built it that way to look as good as possible. You might as well, right?"

Based on the natural progress of technology.

It is inevitable. Come on.

Yes, eventually, but that doesn't mean that kind of increase in demands over these specs will happen this gen or any time soon.
 

BONKERS

Member
The differences seem pretty marginal given the massive 6GB recommendation; there are a lot of textures I can see in any given shot Eurogamer posted that could've used a higher-quality texture, such as this obviously low-res, pixelated-looking pillar: http://screenshotcomparison.com/comparison/93883
Or the floor that isn't in the white outline. The ground texture that IS improved in that one shot still looks fairly poor in quality. Even if it's an OW game, there should be a larger increase in quality for DOUBLE the VRAM.

Though I REALLY wish Eurogamer would STOP saving screenshots as JPGs with chroma subsampling, THEN putting a text overlay on them and saving them AGAIN as JPGs.

FFS, use PNG screenshots, then convert to highest-quality JPGs after the overlay has been added.



Here's another one I found:
http://screenshotcomparison.com/comparison/93719
There are some clear differences in the definition of the textures on the ground.
But the guy used maximum JPG compression.
C'mon guys, use some common sense.
 
The differences seem pretty marginal given the massive 6GB recommendation; there are a lot of textures I can see in any given shot Eurogamer posted that could've used a higher-quality texture, such as this obviously low-res, pixelated-looking pillar: http://screenshotcomparison.com/comparison/93883
Or the floor that isn't in the white outline. The ground texture that IS improved in that one shot still looks fairly poor in quality. Even if it's an OW game, there should be a larger increase in quality for DOUBLE the VRAM.

Though I REALLY wish Eurogamer would STOP saving screenshots as JPGs with chroma subsampling, THEN putting a text overlay on them and saving them AGAIN as JPGs.

FFS, use PNG screenshots, then convert to highest-quality JPGs after the overlay has been added.



Here's another one I found:
http://screenshotcomparison.com/comparison/93719
There are some clear differences in the definition of the textures on the ground.
But the guy used maximum JPG compression.
C'mon guys, use some common sense.

It doesn't really make sense to put a lot more detail into the textures if the vast majority of players can't appreciate it anyway. And you just can't make detailed textures for everything in such a big game. This isn't a technical issue.

Although I do agree they really should have used better quality pictures.
 

Seanspeed

Banned
It's only a matter of time before PC modders optimize and enhance things in the PC version in ways the devs either didn't have time for or didn't care to do, and then this game will REALLY look good instead of 'pretty nice', as it does now. I mean, a single guy fixed Watch Dogs' shortcomings and helped it look as good as the 2012 reveal, which, of course, we were originally supposed to believe would be representative of the console versions.
Not quite. The Watch Dogs mods were basically just reactivating a few bits that got cut from the final release, but not everything. And the features that did come back weren't always working 100% correctly either (which is probably why they were cut).

So it helped get the game closer to the 2012 reveal, but definitely not all the way there.

It's unlikely that the 'deactivated features' situation will happen again with this game. If modders are going to make this game look better, it will require work of their own, and I'm not optimistic that this game will see a lot of modding attention. Hopefully somebody at least figures out a way to get some decent AA in.
 
Instead of comparing similar but different images in-game... has anyone tried just extracting the textures and viewing them directly?

I'm still curious whether the textures are actually higher resolution, or simply uncompressed versions of the high ones (which would account for their larger size without much visible improvement).
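
Once the textures are extracted, the "just uncompressed" theory is easy to sanity-check with arithmetic. A sketch for a hypothetical 4096x4096 texture (the format assumptions are mine, not pulled from the game's files):

# If "ultra" were the same texture stored uncompressed, the size jump alone
# would be roughly 4x with no extra detail in the pixels.

texels = 4096 * 4096
bc3 = texels * 1 * (4 / 3) / 2**20       # BC3/DXT5: 1 byte/texel + ~1/3 mips
rgba8 = texels * 4 * (4 / 3) / 2**20     # uncompressed RGBA8: 4 bytes/texel

print(f"BC3 compressed: {bc3:.1f} MB")   # ~21.3 MB
print(f"Uncompressed:   {rgba8:.1f} MB") # ~85.3 MB, 4x larger, same pixels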
 

scitek

Member
So the game requires 6GB and launches with no SLI support; was Ultra just intended for Titan users?

The Ultra textures were basically them saying "what the hell?" and just throwing them in for those who want them. By all accounts, High is still very close to Ultra in overall quality. I turned it down myself just to see and couldn't really tell a difference. Ultra runs fine on my 4GB 970, though, so I haven't felt the need to keep it at High.
 

BONKERS

Member
People who do have the game:

Is it possible you could please make better/more comparisons than EG's?

As close to 1:1 as possible? If using JPGs, please use the highest quality settings.



It doesn't really make sense to put a lot more detail into the textures if the vast majority of players can't appreciate it anyway. And you just can't make detailed textures for everything in such a big game.

Then why even bother at all? And why double the VRAM requirement for such marginal differences?

And while it's true you can't make detailed textures for everything (not to mention they're going to be mipped anyway), if you're going to offer a higher-quality version, there should at least be big detail improvements to as many obviously low-resolution textures as possible.
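
On the "mipped anyway" aside: a simplified model of mip selection shows why a surface covering only a few hundred screen pixels never samples the top mip, no matter how detailed the source texture is. Real GPUs pick mips from texel-space derivatives; this is just the intuition:

import math

def mip_level(texture_size, screen_pixels_covered):
    """Approximate mip chosen when a texture_size^2 texture covers
    screen_pixels_covered^2 pixels on screen (clamped to the chain)."""
    max_mip = int(math.log2(texture_size))
    level = math.log2(texture_size / max(screen_pixels_covered, 1))
    return min(max(round(level), 0), max_mip)

# A 4096^2 ultra texture on a wall filling only 256 pixels across:
print(mip_level(4096, 256))  # 4 -> the 256^2 mip gets sampled, not the 4096^2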
 

KKRT00

Member
Ryse is a corridor simulator that's more linear and closed off than your average CoD game with a fov of about 30 with a game world about as static and spatially uninteresting as a fucking board that's painted prettily, and designed that way intentionally so the xbone wouldn't explode running it.

Hence I wouldn't give a shit if it performed better, because false equivalencies and tales from the ass from people who don't really know what they're talking about (see: 99.9% of this thread prior to launch) don't really mean anything at the end of the day, just performance over demand.
What? The game has environments that are 2km long; what are you talking about?

Static? In what way?

From a rendering standpoint it doesn't matter whether you have a 2km or a 200km game world; they demand the same amount of resources per frame if the fidelity of the visible scene is similar. The only difference is in art asset creation, i.e. the artists' capacity to fill the environments.
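
A toy sketch of chunked streaming makes that concrete; the chunk size and residency radius are made-up numbers, not from any actual engine:

# An open-world streamer only keeps chunks within a radius of the camera
# resident, so memory tracks the window around the player, not the world.

CHUNK = 100.0          # chunk edge length in world units
RADIUS = 2             # chunks kept resident around the camera

def resident_chunks(cam_x, cam_y):
    cx, cy = int(cam_x // CHUNK), int(cam_y // CHUNK)
    return {(x, y)
            for x in range(cx - RADIUS, cx + RADIUS + 1)
            for y in range(cy - RADIUS, cy + RADIUS + 1)}

# Whether the world is 2km or 200km across, only (2*RADIUS+1)^2 = 25 chunks
# are ever loaded at once.
print(len(resident_chunks(12345.0, 678.0)))  # 25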
 

Seanspeed

Banned
What? The game has environments that are 2km long; what are you talking about?

Static? In what way?

From a rendering standpoint it doesn't matter whether you have a 2km or a 200km game world; they demand the same amount of resources per frame if the fidelity of the visible scene is similar. The only difference is in art asset creation, i.e. the artists' capacity to fill the environments.
Can't you use a lot of tricks with background scenery, though?
 
People who do have the game:

Is it possible you could please make better/more comparisons than EG's?

As close to 1:1 as possible? If using JPGs, please use the highest quality settings.





Then why even bother at all? And why double the VRAM requirement for such marginal differences?

And while it's true you can't make detailed textures for everything (not to mention they're going to be mipped anyway), if you're going to offer a higher-quality version, there should at least be big detail improvements to as many obviously low-resolution textures as possible.

I'll just post this again, then, quoted from the developer:

"They are on monster PCs making the highest possible quality stuff and then we find ways to optimise it, to fit onto next-gen, to fit onto PCs at high-end specs. Then obviously there's going to be that boundary where our monster development PCs are running it OK - but why not give people the option to crank it up? It makes sense to get it out into the world there - we have it, we built it that way to look as good as possible. You might as well, right?"
 
A quick comparison I made. The difference is minor (and it's only some textures, not all), so anyone with a less powerful system shouldn't worry about not being able to run Ultra :)

https://farm3.staticflickr.com/2946/15418808602_68946c1b17_o.jpg

https://farm4.staticflickr.com/3928/15232397619_d459e26204_o.jpg
 
A quick comparison I made. The difference is minor (and it's only some textures, not all), so anyone with a less powerful system shouldn't worry about not being able to run Ultra :)

https://farm3.staticflickr.com/2946/15418808602_68946c1b17_o.jpg
https://farm4.staticflickr.com/3928/15232397619_d459e26204_o.jpg

Actually, the difference seems quite pronounced in some of the screens I've seen, including these; you can see a lot more detail in the ultra textures. The high textures still look fine, though. If you keep your expectations in check and understand that ultra is just a more detailed version of the high textures, there isn't really anything to be disappointed about.

It's not like the textures suddenly change completely.
 

kodecraft

Member
Looks like I'll need to hang on to my PS4 for multiplats.

Console developers are ridiculous.

I really hope Batman doesn't need some shit like 8GB minimum.

You already know the epic Arkham Knight will require epic specifications. Plus, you just said you'll stick with the PS4, so grab the PS4 version of Arkham Knight.
 

pestul

Member
Interesting. The video stutters a fair bit for me, but that may just have been the encoding.

http://forums.guru3d.com/showpost.php?p=4927049&postcount=543

For those interested in the 6GB VRAM rumors for Shadow of Mordor: I can show that they were false.

I just ran the Shadow of Mordor benchmark, and with maxed settings at 1440p (yes, I downloaded the Ultra HD texture pack) I get an average of 94fps.

I also played through the game a bit without an issue.

YouTube video of my benchmark below.

https://www.youtube.com/watch?v=DFg1FpWExY8&feature=youtu.be
 