
Mass Effect: Andromeda PC Performance Thread

Aeana

Member
Without vsync, dips and tearing are normal (in exclusive fullscreen mode). Limiting framerate doesn't sync the frames.

BTW, for Frostbite 3 games, the best framerate limiter is the built-in one. But you have to access the game console.
I have a gsync monitor, so tearing isn't much of a concern. I've tried a lot of different permutations now: borderless or exclusive fullscreen, with in-game vsync and triple buffering on/off, with it forced via the Nvidia control panel, with fast sync, etc.

Most of them produce a stutter after the latest driver (I didn't have it on the old driver, as far as I could tell), but general performance is better in the new driver.

I've yet to settle on something I'm completely happy with, but I think I have it to the point now where it doesn't stutter in gameplay and just does it during quick camera shifts in conversations.
 

Lister

Banned
Question MassGAF: Is RGB 8bpc the best (at full) or Ybr422 or Ybr444 with 10bpc for HDR/1080p (upscaled to 4K)?

Also, triple buffering: that helps framerate if you have Vsync on, right?

Man, sometimes I forget I know hardly anything about tech stuff :/

i7 3770K / 16GB / 980 G1 / SSHD

Any help appreciated! :)

Not sure about the first, but triple buffering allows smooth output of frames between the usual evenly divisible steps. Double-buffered vsync would keep you at 60 FPS so long as your GPU could manage it, but if it can't, it's going straight down to 30, then 15. With triple buffering you get the full range of frame rates from 30 to 60. This comes at the cost of a bit more latency.
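To make that concrete, here's a minimal sketch (my own illustration, assuming a fixed render time per frame and a 60 Hz panel) of how double buffering quantizes output to 60/30/15 while triple buffering follows the actual render rate:

Code:
import math

REFRESH_MS = 1000.0 / 60  # one 60 Hz refresh interval, ~16.7 ms

def double_buffered_fps(render_ms):
    # The finished frame waits for the next vblank, so frame time
    # rounds up to a whole number of refresh intervals: 60, 30, 15...
    intervals = math.ceil(render_ms / REFRESH_MS)
    return 1000.0 / (intervals * REFRESH_MS)

def triple_buffered_fps(render_ms):
    # A third buffer lets the GPU start the next frame right away,
    # so output follows the real render rate, capped at refresh.
    return min(1000.0 / render_ms, 60.0)

for ms in (15.0, 20.0, 25.0):
    print(f"{ms} ms render: double={double_buffered_fps(ms):.0f} fps, "
          f"triple={triple_buffered_fps(ms):.0f} fps")
# 15 ms: 60 vs 60 | 20 ms: 30 vs 50 | 25 ms: 30 vs 40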

I've found that in some games, turning off vsync and turning on Fast Sync in the Nvidia control panel + limiting the framerate to my panel's refresh of 60 = higher frame rates. But not all games benefit.
 

dr_rus

Member
Question MassGAF: Is RGB 8bpc the best (at full) or Ybr422 or Ybr444 with 10bpc for HDR/1080p (upscaled to 4K)?

At 1080p you should be able to use 10 bits with RGB or Ybr444 (doesn't much matter which one, as the quality should be the same between them) for HDR10.

HDMI bandwidth issues kick in if you want to use HDR10 in 4K at 60+ fps.
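Rough numbers for the curious (standard CTA-861 4K60 timing and the HDMI 2.0 ceiling; the arithmetic is my own back-of-the-envelope sketch). At 1080p the pixel clock is roughly a quarter of this, so everything fits easily:

Code:
# 4K60 uses a 4400 x 2250 total raster (incl. blanking) at 60 Hz = 594 MHz pixel clock.
# HDMI 2.0 is 18 Gbit/s raw, ~14.4 Gbit/s of payload after 8b/10b encoding.
PIXEL_CLOCK_HZ = 4400 * 2250 * 60
HDMI20_PAYLOAD_BPS = 18e9 * 8 / 10

modes = [
    ("RGB / Ybr444 8-bit", 24),
    ("RGB / Ybr444 10-bit", 30),
    ("Ybr422 10/12-bit", 24),  # 4:2:2 always travels in a 24 bpp container
]
for name, bpp in modes:
    gbps = PIXEL_CLOCK_HZ * bpp / 1e9
    fits = "fits" if gbps * 1e9 <= HDMI20_PAYLOAD_BPS else "exceeds HDMI 2.0"
    print(f"{name}: {gbps:.1f} Gbit/s ({fits})")
# 8-bit 4:4:4 squeezes in at 14.3; 10-bit 4:4:4 needs 17.8 and doesn't; 4:2:2 fits.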
 

MaLDo

Member
I have a gsync monitor, so tearing isn't much of a concern. I've tried a lot of different permutations now: borderless or exclusive fullscreen, with in-game vsync and triple buffering on/off, with it forced via the Nvidia control panel, with fast sync, etc.

Most of them produce a stutter after the latest driver (I didn't have it on the old driver, as far as I could tell), but general performance is better in the new driver.

I've yet to settle on something I'm completely happy with, but I think I have it to the point now where it doesn't stutter in gameplay and just does it during quick camera shifts in cutscenes.

What I said doesn't apply to gsync :)

In your case, try changing the console commands (if it's possible to enable the console)

GameTime.MaxVariableFps
RenderDevice.VSyncEnable
RenderDevice.TripleBufferingEnable
RenderDevice.RenderAheadLimit

until you get the best result. If you want to test with gsync disabled, the best combo for other games is usually

GameTime.MaxVariableFps 60
RenderDevice.VSyncEnable 1
RenderDevice.TripleBufferingEnable 1
RenderDevice.RenderAheadLimit 2 (sometimes 1)
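If you'd rather not retype these every launch, other Frostbite games also read the same commands from a user.cfg file in the game's install folder. I haven't verified that MEA picks it up, but it would look like:

Code:
GameTime.MaxVariableFps 60
RenderDevice.VSyncEnable 1
RenderDevice.TripleBufferingEnable 1
RenderDevice.RenderAheadLimit 2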
 

Enclose

Member
Mine too, especially when panning the camera. I'm going to try enabling vsync in the Nvidia control panel and capping at 60 with RTSS.

I'm getting the weird frametimes just in cutscenes. With every cut I drop from a capped 60 to nearly 40-45. These scenes don't feel smooth at all. Gameplay is smooth.
 
Anybody able to get Steam DualShock 4 support running properly in MEA? I've got my DS4 on the emulate gamepad profile, but I still have a mouse cursor and keyboard controls.
 

Skyr

Member
Does anybody else get stuttering when panning the camera around while using the scanner?

It's weird because the framerate stays stable at 60, but the stutter still occurs, tho only when I've got the scanner equipped.
 

Ivory Samoan

Gold Member
At 1080p you should be able to use 10 bits with RGB or Ybr444 (doesn't much matter which one, as the quality should be the same between them) for HDR10.

HDMI bandwidth issues kick in if you want to use HDR10 in 4K at 60+ fps.

Thanks for the response :)

I don't have a Ybr444 option with 10 bit, only 8bpc, but Ybr422 and 420 have 10 and 12 bit... I thought Ybr444 with 10 bit wasn't possible yet?
I do have an RGB 10 bit option though, think I'll use that.
 

Enclose

Member
These are my frametimes and fps in cutscenes; everything else is butter smooth for me:

[image: frametime/FPS graph]


I'm out of ideas as to what the problem is here.
 

bongpig

Neo Member
Bah! Installed the latest NV driver and now the game isn't running as well. Still getting good solid framerates according to Afterburner, but it has introduced a horrible periodic stutter when panning around.

I hate being in this situation: hold on to the game, confident an update will fix it, or learn from many, many previous mistakes and get my refund while I can.
 
The game stutters way too hard for me. I get 50 fps indoors and 80 fps outdoors, but the game often drops to single digits. It's unplayable.
Maybe because I only have 8GB of RAM, but RAM usage never passes 7GB for me.
I got a refund because of this. The Origin support is very helpful in my region tho.
 

Gojeran

Member
Bah! Installed the latest NV driver and now the game isn't running as well. Still getting good solid framerates according to Afterburner, but it has introduced a horrible periodic stutter when panning around.

I hate being in this situation: hold on to the game, confident an update will fix it, or learn from many, many previous mistakes and get my refund while I can.

This happened to me as well. When panning the camera it will have a horrible stutter (showed like a 30 fps loss for a brief moment). It seemed to clear up mostly after the game had been running for a few mins but when it first loaded up it was fucking terrible. Had no such issues with the previous driver.
 
At my viewing distance, and with the game's TAA and sharpening, I have to say going over 0.8 x 4K scaling is diminishing returns. The game looks very crisp, yet the performance gain from scaling the resolution is great.
 

Profanity

Member
Does applying anisotropic filtering via the Nvidia control panel have a performance penalty vs just forcing it via editing the ProfileOptions_profile thing in the game's save directory?

Huh, I had been wondering about the lack of an AF option in the graphics menu.

The default line in ProfOpts_profile is 'GstRender.AnisotropicFilter 4' - I wonder if that means it's 4x AF, or if it's like in some game configs where it means the fourth level of AF, which would usually be 16x.

Edit: Googling around, it seems like the latter.
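If it is the level-index scheme, the usual mapping (my assumption based on other games' configs, not verified for MEA) would be:

Code:
GstRender.AnisotropicFilter 0 -> off/1x
GstRender.AnisotropicFilter 1 -> 2x
GstRender.AnisotropicFilter 2 -> 4x
GstRender.AnisotropicFilter 3 -> 8x
GstRender.AnisotropicFilter 4 -> 16x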
 
Huh, I had been wondering about the lack of an AF option in the graphics menu.

The default line in ProfOpts_profile is 'GstRender.AnisotropicFilter 4' - I wonder if that means it's 4x AF, or if it's like in some game configs where it means the fourth level of AF, which would usually be 16x.

Edit: Googling around, it seems like the latter.

It's labelled as "texture filtering" under Graphics
 

bongpig

Neo Member
This happened to me as well. When panning the camera it will have a horrible stutter (showed like a 30 fps loss for a brief moment). It seemed to clear up mostly after the game had been running for a few mins but when it first loaded up it was fucking terrible. Had no such issues with the previous driver.

That's the thing; my framecounter is showing a solid locked 60fps with my frametimes at 16.6. In the section I last looked at (in the Nexus) my CPU and GPU were both under 60%, and yet there was a clear visible stutter every few seconds or so.

Dropping from ultra to high mostly clears it up. I also noticed that leaving mostly the ultra settings but lowering only texture quality to high was also a big help. So I'm thinking there's something up with the texture loading/streaming. I had a similar issue with Gears of War 4, where the game was running comfortably when looking at the stats, yet there was a stutter. When I lowered texture quality the problem disappeared.

All speculation of course. Going to do another round of testing this evening and see if I can narrow it down some more.
 
Ah I'm an idiot, you're totally right. It never registered with me because I remember not seeing any kind of 4/8/16 options in the menu, so I assumed it wasn't even in there.

Honest thing to miss; I think all Frostbite games use low/med/high/ultra for their graphics settings.
 
I just wanted to say that I've got no stuttering apart from a single hiccup at times when the game loads new sections of the world or something. It's always at the same spots and shows a brief spike in CPU timing.

72Hz monitor, RTSS locked at 37 FPS, Radeon target framerate at 38, vsync on, triple buffering off. Basically a close-to-perfect half-refresh.

In before: "36 FPS is a fucking slideshow, you must mean there's constant stutter!"
 

Profanity

Member
Honest thing to miss, I think all frostbite games use low,med,high,ultra for their graphics settings.

Yeah I remember it was like that in BF1, but interestingly after looking, the option was absent in DA:I. Maybe there was something in the back of my head yelling at me that Bioware had a history of hardcoding AF options into the presets.
 

nullref

Member
These are my frametimes and fps in cutscenes, everything else is butter smooth for me:

Yeah, I see the same. (6700k + GTX 1080, 1080p, Ultra preset.) Without a framerate cap or vsync, I'm well above 60 always, but with any combination of vsync and framerate caps I've tried, I'm smooth while playing, but cutscenes and conversations have minor framerate dips into the mid-50s and frametime spikes.

I'm currently using the in-game vsync + triple buffering with an RTSS cap of 60 (which seems to smooth out the frametimes during actual gameplay), but I can't seem to do anything about the conversations and cutscenes. Doesn't affect play much, I guess.
 
How's the HDR implementation?

Definitely up there with some of the best, in my opinion anyway

I have a JS8500 in the living room and an X800D as a monitor, and it looks like I may be moving my PC to the living room for a while, as the X800D can't be set to 4:2:0 12bit so there is abundant banding :(
 
Do any of you know if we are going to get a Digital Foundry PC vs console teardown? Only asking because I've heard various people say that the PS4 Pro version looks exactly the same as PC @ 4K Ultra... This can't be true right? I mean, I've tried to compare screens/videos and I can already see a huge difference myself but I haven't actually seen either version in person so...
 

Ghazi

Member
Do any of you know if we are going to get a Digital Foundry PC vs console teardown? Only asking because I've heard various people say that the PS4 Pro version looks exactly the same as PC @ 4K Ultra... This can't be true right? I mean, I've tried to compare screens/videos and I can already see a huge difference myself but I haven't actually seen either version in person so...

People can have opinions and they can be wrong. We'll most likely get a Pro/PC comparison so you can just link people to that when they bring it up.
 

Akronis

Member
Do any of you know if we are going to get a Digital Foundry PC vs console teardown? Only asking because I've heard various people say that the PS4 Pro version looks exactly the same as PC @ 4K Ultra... This can't be true right? I mean, I've tried to compare screens/videos and I can already see a huge difference myself but I haven't actually seen either version in person so...

I can almost guarantee that the PS4 Pro does not look the same as PC @ native 4K Ultra. The PS4 Pro does not even run at 4K checkerboard.
 

nOoblet16

Member
Do any of you know if we are going to get a Digital Foundry PC vs console teardown? Only asking because I've heard various people say that the PS4 Pro version looks exactly the same as PC @ 4K Ultra... This can't be true right? I mean, I've tried to compare screens/videos and I can already see a huge difference myself but I haven't actually seen either version in person so...

If someone said that there's a 1080p mode in the game that has settings equal to ultra settings from PC, I'd be ready to believe that, as that's in the realm of possibility. But to say that it looks exactly like 4K/ultra when the Pro version isn't even checkerboard 4K is just too ridiculous to believe.
 

Lister

Banned
Do any of you know if we are going to get a Digital Foundry PC vs console teardown? Only asking because I've heard various people say that the PS4 Pro version looks exactly the same as PC @ 4K Ultra... This can't be true right? I mean, I've tried to compare screens/videos and I can already see a huge difference myself but I haven't actually seen either version in person so...

It's definitely not true. The most striking differences aside from frame rate (even via compressed internet videos) are shadows and LOD, with noticeable pop-in on consoles vs PC.

The rest we won't know for sure until we get a teardown, and even then we'll probably have to take the commentator's word for it, since a lot of detail is lost to video compression.
 

ISee

Member
Do any of you know if we are going to get a Digital Foundry PC vs console teardown? Only asking because I've heard various people say that the PS4 Pro version looks exactly the same as PC @ 4K Ultra... This can't be true right? I mean, I've tried to compare screens/videos and I can already see a huge difference myself but I haven't actually seen either version in person so...

It's a big release, so probably.
But their PC vs console comparisons have been a bit disappointing over the last couple of months anyway. They no longer try to find out what settings the console versions are running at, they miss some PC features, and often enough they just run the game at 4K with a Titan X while praising how good PS4 Pro upscaling is.
I also miss their old i3/750 Ti @ console settings tests, for example, which were a good way to see how well a game scaled and performed in comparison to consoles (aka good port or not).
Overall they did a better job in the past tbh, but they also seem to be a bit understaffed. Things will probably get even worse with Scorpio because they'll have even more to do.
 

Lister

Banned
It's a big release, so probably.
But their PC vs console comparisons have been a bit disappointing over the last couple of months anyway. They no longer try to find out what settings the console versions are running at, they miss some PC features, and often enough they just run the game at 4K with a Titan X while praising how good PS4 Pro upscaling is.
I also miss their old i3/750 Ti @ console settings tests, for example, which were a good way to see how well a game scaled and performed in comparison to consoles (aka good port or not).
Overall they did a better job in the past tbh, but they also seem to be a bit understaffed. Things will probably get even worse with Scorpio because they'll have even more to do.

I think the 750 Ti was more about seeing how it performed with entry-level contemporary hardware, not so much equivalent hardware, since the 750 is definitely below PS4 spec. The awesome thing about the 750 Ti was that even though it was LESS powerful than the PS4's GPU, it was still performing as well as it in many games, and even more surprisingly, BETTER in others.

I still remember when that GAF user was banned for pointing that fact out in a thread once. Or was he juniored, I forget. People got Maaaad.

Anyway, a better build for DF to include for comparison's sake these days would be something like a 1050 Ti or 1060 (or AMD equivalent), showing what you get with a console and what you get with both entry-level and high-end PC hardware, rather than a 2-gen-old entry-level GPU which is probably hard to find these days and, looking at Amazon, definitely not competitively priced vs a 1050 Ti.
 

ISee

Member
I think the 750 Ti was more about seeing how it performed with entry-level contemporary hardware, not so much like-for-like hardware, since the 750 is definitely below PS4 spec. The awesome thing about the 750 Ti was that even though it was LESS powerful than the PS4, it was still performing as well as it in many games, and even BETTER in others.

I still remember when that GAF user was banned for pointing that fact out in a thread once. Or was he juniored, I forget.

Anyway, a better build to include for comparison's sake these days would be something like a 1050 Ti or 1060, showing what you get with a console and what you get with both entry-level and high-end PC hardware.

They normally overclocked the 750 Ti by ~200MHz. The OC should boost the 750 Ti to a similar level as the PS4 GPU, in TFLOPS at least, which is of course misleading because comparing TFLOPS between Nvidia and AMD doesn't really work.
And yes, a 1050 Ti/1060 would make more sense today, but even a 970/480 could do the trick, which is (again) similar in performance to a PS4 Pro (sort of).
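For reference, the rough math (approximate public specs; FP32 TFLOPS = shader cores x 2 FMA ops per clock x clock speed):

Code:
def tflops(shader_cores, clock_mhz):
    # FP32 throughput: each core retires 2 ops per clock via fused multiply-add
    return shader_cores * 2 * clock_mhz * 1e6 / 1e12

print(f"750 Ti stock boost: {tflops(640, 1085):.2f} TFLOPS")        # ~1.39
print(f"750 Ti +200 MHz OC: {tflops(640, 1285):.2f} TFLOPS")        # ~1.64
print(f"PS4 GPU (1152 @ 800 MHz): {tflops(1152, 800):.2f} TFLOPS")  # ~1.84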
 

Smokey

Member
Do any of you know if we are going to get a Digital Foundry PC vs console teardown? Only asking because I've heard various people say that the PS4 Pro version looks exactly the same as PC @ 4K Ultra... This can't be true right? I mean, I've tried to compare screens/videos and I can already see a huge difference myself but I haven't actually seen either version in person so...

I'd suggest you ignore whatever advice these people try to give you in the future.
 

Rellik

Member
Do any of you know if we are going to get a Digital Foundry PC vs console teardown? Only asking because I've heard various people say that the PS4 Pro version looks exactly the same as PC @ 4K Ultra... This can't be true right? I mean, I've tried to compare screens/videos and I can already see a huge difference myself but I haven't actually seen either version in person so...

I'm playing it on the Pro and I can tell you that those people are bullshitting. It's pop-in city and 20fps. Stick to the PC version if you can.
 
What I said doesn't apply to gsync :)

In your case, try changing the console commands (if it's possible to enable the console)

GameTime.MaxVariableFps
RenderDevice.VSyncEnable
RenderDevice.TripleBufferingEnable
RenderDevice.RenderAheadLimit

until you get the best result. If you want to test with gsync disabled, the best combo for other games is usually

GameTime.MaxVariableFps 60
RenderDevice.VSyncEnable 1
RenderDevice.TripleBufferingEnable 1
RenderDevice.RenderAheadLimit 2 (sometimes 1)

I'm on a 970 and would be fine with 30fps locked if I can have all the graphical settings on ultra. Can I just do
GameTime.MaxVariableFps 30?
 
How's the HDR implementation?

I am not a fan. In my opinion the implementation in MEA is more akin to an "alternative lighting version" of the base SDR than a natural/realistic expansion of it. For instance, you'll come across environments bathed in "soft light" in the SDR version that look completely different in HDR without (or with a heavily tweaked version of) the volumetric lighting. That's not to say the HDR version doesn't have some really "cool" effects, though. It's just that it can look pretty unnatural at times, which to me is the opposite of what HDR is about.
 

Lashley

Why does he wear the mask!?
Do any of you know if we are going to get a Digital Foundry PC vs console teardown? Only asking because I've heard various people say that the PS4 Pro version looks exactly the same as PC @ 4K Ultra... This can't be true right? I mean, I've tried to compare screens/videos and I can already see a huge difference myself but I haven't actually seen either version in person so...

Ahahahahaha
 
Checkerboard 1800p is a lower pixel count than native 1440p, no? Something like 1600x1800 (could be wrong).

Either way, its graphics are also at lower settings, with frames below 30 fps.
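Quick math backs that guess up (assuming checkerboarding renders half the pixels of a 3200x1800 target each frame):

Code:
checkerboard_1800p = 3200 * 1800 // 2     # 1600x1800 worth of pixels: 2,880,000
native_1440p = 2560 * 1440                # 3,686,400
print(checkerboard_1800p / native_1440p)  # ~0.78, so yes, fewer pixels than 1440p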
 

Deepo

Member
I'm not sure it's confirmed that works in this game, or if it's just speculation based on other Frostbite games.

The command is accessible via the console (it shows up in autocomplete), but setting it to e.g. 30 doesn't do anything, at least not on my rig.
 

Nekrono

Member
I'm on a 970 and would be fine with 30fps locked if I can have all the graphical settings on ultra. Can I just do
GameTime.MaxVariableFps 30?

I'm on a 970 G1, paired with a 2500k @4.0GHz and 8GB RAM.

I can play at 1080p with everything maxed out, including the settings that go a step beyond the Ultra preset, and I've locked the game at 40 FPS. So far I haven't dropped below that; maybe in a cutscene, but I didn't really notice it. In gameplay I can say that it runs at a locked 40 FPS 99% of the time.

I have vsync and triple buffering enabled in game and locked it to 40 FPS in RTSS. I'm also playing with a controller, since that framerate doesn't feel that good with M&K, although it's not that bad in Andromeda.
 

oneils

Member
The game has crashed three times for me. Seems to happen during cutscenes. Annoying.

Edit: weird, I can't seem to get past a certain cutscene after Habitat 7. It just keeps crashing.
 
The game has crashed three times for me. Seems to happen during cutscenes. Annoying.

Edit: weird, I can't seem to get past a certain cutscene after Habitat 7. It just keeps crashing.

Latest drivers? I swear people have been having quite a few issues with them. I haven't installed them yet.
 

Nekrono

Member
The game has crashed three times for me. Seems to happen during cutscenes. Annoying.

Edit: weird, I can't seem to get past a certain cutscene after Habitat 7. It just keeps crashing.

Are you playing on PC? If so, are you getting a directx error regarding memory?
 
The "auto 720p" and "auto 900p" options aren't for dynamic resolution scaling, right? It just scales the rendering load to those resolutions so you don't have to muck around with the slider, I'm guessing?
 