
Dragon Age: Inquisition PC performance thread

The TL;DR of the above is that the game is extremely CPU intensive due to the high view distance and number of NPCs.
It's also quite a looker, making good use of all the new rendering techniques like PBS, POM and tessellation.
 
Maybe that's why people in this thread are saying that Mantle provides a huge performance boost?

I'm not talking about the Mantle numbers, I'm talking about the DirectX 11 numbers; the charts show both. I expect it to gain from using Mantle, but it's somehow outperforming a clearly more powerful card under DX11, which doesn't make sense.
 
The TL;DR of the above is that the game is extremely CPU intensive due to the high view distance and number of NPCs.
It's also quite a looker, making good use of all the new rendering techniques like PBS, POM and tessellation.

This game does not employ PBR. Looking at even one screenshot makes that abundantly clear.
 

MattyG

Banned
Ehh, I'm starting to think I'd be better off grabbing this on PS4. I don't think my 3570K @ 3.4 and 760 2GB would handle this as well as I had hoped. Maybe I'm wrong though.
 

Chaos17

Member
Some parts quickly translated by me:

Introduction:
Now, using the engine known from the online shooter Battlefield 4, the studio shows that it has listened to fan criticism. Technically, Dragon Age: Inquisition has succeeded brilliantly and is sure to be the best-looking role-playing game thus far.
It takes only seconds to make clear that BioWare wants to distance itself as far as possible from the heavily criticised second part of the franchise. While the constantly repeating levels of Dragon Age 2 were small and mostly kept in boring red-brown tones, the sheer complexity of the outdoor scenes here nearly overwhelms us; from now on, every upcoming role-playing game must measure itself against this unbelievable view distance and graphical finesse. Yes, even The Witcher 3. BioWare strikingly demonstrates what it can accomplish with new (console) hardware, modern technology and, above all, enough time on its hands. Wow!

[...] (battlefield 4)

Features:
For the BioWare role-playing game, the lighting and shading model was changed to Physically Based Rendering, and the developers additionally make consistent use of all modern engine features: high-resolution, albeit subtle, textures; soft cascaded shadows; volumetric lighting; parallax mapping; tessellation; image-based lighting; subsurface scattering; and so forth. There is also AMD's Mantle API, which especially helps older processors and those with rather below-average per-clock performance. In some cases this is sorely needed, because the gigantic levels, with their scarcely perceptible level-of-detail transitions and many small touches, create a hardware hunger that can't be satisfied with just a fast GPU.

[...] (character creation and depth of field)

Almost every texture is high resolution, and not just within two metres of the character, but all the way to the horizon. Even the mountains in the backdrop are covered by exceptionally high-resolution textures. Many of the surfaces use parallax maps, often in combination with tessellation, and the resulting sense of depth is by and large excellent. Okay, with a bit of effort you can find a few blurry textures or low-poly level objects, but those are really high-level nitpicks.

Issues:
One annoyance is the 30 fps lock during cutscenes, which always feel quite stuttery while the game itself runs with a 200 fps cap (which can be disabled). Additionally, the lip movements aren't synced to the German localisation, and some of the animations feel a bit wooden, even though the characters react very nicely to external circumstances like the slant of the ground. There is also pronounced specular flickering, and sadly this isn't easily solvable. The many, perhaps slightly exaggerated, glossy and reflective surfaces look very beautiful in most cases thanks to Physically Based Shading, but they have a severe downside: typical anti-aliasing methods like MSAA or FXAA reduce shader aliasing insufficiently or not at all, and the bokeh depth of field in cutscenes amplifies the issue. The few methods that would help are only partly functional: in Dragon Age: Inquisition the resolution can only be scaled up from a lower internal resolution to the native one. Internal resolutions above 100% don't work, not even via console command. That is irritating, but given the steep system requirements, not dramatic.
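To make the render-scale limitation concrete, here's a minimal Python sketch (my own illustration, not code from the game or the article) of a resolution-scale setting that is clamped at 100%, i.e. you can render below native and upscale, but never supersample above it:

```python
def internal_resolution(native_w, native_h, scale_pct):
    """Internal render resolution for a given scale setting.

    The scale is clamped at 100%: rendering below native resolution
    and upscaling works, supersampling above native does not.
    """
    scale = min(scale_pct, 100) / 100.0
    return round(native_w * scale), round(native_h * scale)

# 1080p at 75% render scale: the game renders at 1440x810 and upscales.
print(internal_resolution(1920, 1080, 75))   # (1440, 810)
# Requesting 150% is clamped back to native 1920x1080.
print(internal_resolution(1920, 1080, 150))  # (1920, 1080)
```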

Requirements:
On the highest settings, Dragon Age: Inquisition is quite the hardware devourer. While the graphics card requirements will be understandable to every gamer, it is first and foremost the CPU that is challenged. For hardware-savvy gamers this might seem reasonable; after all, the wide vistas, level of detail and number of NPCs are quite impressive. But after many years of stagnation in this area, the role-playing game's demands on the processor will surprise one or two people. Gamers with a moderate Intel quad-core at 3 GHz need not worry much; older or weaker processors with outdated per-clock performance may well get into a pretty pickle. For instance, the author's Intel Core i7-920 doesn't even reach 30 fps at max details in Full HD, despite an overclock to 3.8 GHz. And even our test PC, an Intel Core i7-4790K @ 4.5 GHz, has one thread near maximum load under DX11. Admittedly, it is possible to distribute the load a bit if we reduce the resolution and anti-aliasing (the Haswell manages 120 fps at 720p), but every additional megahertz translates into additional fps, even at 1080p with 4x MSAA.

Enter AMD's low-level API, Mantle. The performance improvements are especially impressive for old or weak CPUs like the author's: in combination with an R9 290X, the aged Bloomfield gains 45% over DX11. That is the difference between an intolerable stutter-fest and an adequately fluid gaming experience. Even with a Core i7-4790K @ 4.5 GHz, the low-level API gains 10% over the overhead-plagued Microsoft API. As a result, the R9 290X can clearly pull ahead of the much stronger, overclocked GTX 980.
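The kind of gain described here can be sketched with a toy model of per-draw-call CPU cost (the numbers below are illustrative assumptions, not measurements from the article): a low-level API like Mantle cuts the overhead of submitting each draw call, so the same scene costs less CPU time per frame.

```python
def cpu_limited_fps(draw_calls_per_frame, per_call_overhead_us, other_cpu_ms):
    """Frame rate when the render-submission thread is the bottleneck.

    Total CPU frame time = fixed per-frame work plus per-draw-call
    overhead; fps is simply the reciprocal of that frame time.
    """
    frame_ms = other_cpu_ms + draw_calls_per_frame * per_call_overhead_us / 1000.0
    return 1000.0 / frame_ms

# Hypothetical scene: 5000 draw calls, 10 ms of other CPU work per frame.
dx11 = cpu_limited_fps(5000, 4.0, 10.0)    # 4 us per call -> 30 ms frame
mantle = cpu_limited_fps(5000, 1.0, 10.0)  # 1 us per call -> 15 ms frame
print(round(dx11), round(mantle))          # prints: 33 67
```

The weaker the CPU (i.e. the larger both overhead terms are), the bigger the relative win, which matches the observation that the old i7-920 benefits far more than the 4790K.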

[...] (Benchmarks and conclusion)

Thanks!
 

Tovarisc

Member
Makes me slightly more concerned that my i5 750 @ 3.8 GHz won't have the juice to run this well enough after all :/

Rest is 8GB DDR & GTX760 2GB.
 

Tovarisc

Member
Maybe you won't be able to run Ultra @ 60 fps, but I'm sure it should perform smoothly enough at lower settings.

I don't go for Ultra, but being required to run at Low would be such a letdown :(

Edit: For reference, the Nvidia optimizing thing [name escapes me] suggests a mix of Ultra and High for BF 4. I'd be comfortable with mostly High settings in BF 4 if I want eye candy with decent performance.
 
Sorry, did you not say that the game does not support PBR?

Because the video shows PBR around the 40-second mark...

You're confused about what PBR is.

It can have physically based shading without transitioning fully to PBR, like Crysis 2 and 3, for example.

No part of the shading is physically correct. And Crysis 2 and 3 are very poor examples of physically correct shading, despite Crytek's claims.
 

Hugstable

Banned
Damn those numbers look pretty bad. Oh well probably just gonna get it on my PS4 at this point anyway since I hate Origin.
 

ISee

Member
It's a pipeline that allows materials to react to light based on their physical properties.

OK... sorry for asking again, but the lighting on the shoulder armor and gloves: is that PBR, or is that another method? I don't want to piss you off, but now I'm very confused.

[screenshot: h98pFs0.jpg]
 

Hugstable

Banned

Ehh, well, it's not like I really "hate" it, but I'm usually the type who's really picky about having all their games in one place -___-. The only game I have on Origin that's exclusive to Origin is Titanfall, since my PC is filled with Steam stuff, so I figured I might as well add this one to the PS4 collection if I couldn't add it to my PC collection.

Those are Max + 4x MSAA numbers, which will kill GPUs.

Ahh, thanks for that. I really need to think this decision over; I've been mulling it for the last two days, since I only started getting hyped for this game this week, not having really followed it much before.
 
OK... sorry for asking again, but the lighting on the shoulder armor and gloves: is that PBR, or is that another method? I don't want to piss you off, but now I'm very confused.

[screenshot: h98pFs0.jpg]

You're not upsetting me at all, no worries. That's the standard specular/fake reflection we've had since shaders were invented, probably just hand-tweaked on certain parts of the characters. But if you look at any material in the environment, you'll notice that every single surface reacts to light in exactly the same way. None of the materials look like anything that actually exists in real life.
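For what it's worth, the difference can be sketched in a few lines of Python (my own simplification, not engine code): classic hand-tweaked specular is just an arbitrary power curve, while a physically based model adds energy conservation and a Fresnel term so every material responds consistently to light.

```python
import math

def classic_specular(n_dot_h, shininess):
    # Ad-hoc Blinn-Phong: artists hand-tune shininess and intensity per
    # material, with no guarantee different surfaces respond consistently.
    return n_dot_h ** shininess

def normalized_specular(n_dot_h, shininess):
    # Energy-conserving variant: the (s + 8) / (8 * pi) factor means
    # tightening the highlight automatically raises its peak brightness,
    # while a rougher surface gets a dimmer, wider highlight.
    return (shininess + 8.0) / (8.0 * math.pi) * n_dot_h ** shininess

def schlick_fresnel(f0, cos_theta):
    # Schlick's Fresnel approximation: every material becomes mirror-like
    # at grazing angles, a behaviour hand-tweaked specular usually lacks.
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5
```

With the classic model the peak is always 1 regardless of roughness; with the normalized one the renderer gets consistent behaviour across all materials for free, which is the whole point of PBR.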
 

ISee

Member
You're not upsetting me at all, no worries. That's the standard specular/fake reflection we've had since shaders were invented, probably just hand-tweaked on certain parts of the characters. But if you look at any material in the environment, you'll notice that every single surface reacts to light in exactly the same way. None of the materials look like anything that actually exists in real life.

Ah I think I get it now. thx.
 
Added an edit to my original post about BF 4 performance. Maybe I'm just too aware of my CPU being pretty old these days, and it makes me too cautious :b

GeForce Experience?

A tip, disable it and try for yourself.

I've had several performance issues in games like Shadow of Mordor or Wolfenstein TNO that were solved just by shutting down GFExp. And I have a 2500K, GTX 970 G1 and 16GB RAM.

Just try the settings by yourself.
 
I'll probably be able to squeeze out 30 with my 7950 and i5-4670K + Mantle. It most likely won't run with that AA though. Is the game really that demanding in the open world, or is it just that drivers aren't available yet? Either way, I'm excited.

Edit: oh, it's MSAA; yeah, that's costly. This game is going to look fantastic.
 
"One annoyance is the 30 fps lock during cutscenes..."

Why, Bioware? Why?!

Games should let you know about 30 fps locks on cutscenes in their PC releases. I spent almost an hour trying to figure out why Saints Row IV was running so poorly one second and at a smooth 60 fps the next, and it was driving me mad. Turns out SRIV has locked cutscenes, so no amount of setting changes and adjustments would affect them.

Just let me know in advance, dammit. I don't mind it being during cutscenes; I just don't wish to waste my time.
 

garath

Member
I'm on the fence for buying this on release. I'd love to see what I'd be able to get with my specs:

i5 2500k @ 4.2
970
8 gigs RAM

I expect everything maxed, maybe with no MSAA. I realize the game is CPU heavy, but the 4.2 GHz should help.
 
Bought an MSI GTX 970 a few weeks ago. I was hoping it would last slightly longer before I had to dial down the quality settings substantially. Jesus.
 