
The Division PC performance thread

This is "nvidia sponsored" game (GameWorks) and yet 4GB Fury X has more FPS than 6GB 980Ti? And nano is almost as fast as 980Ti? I'd say that's unexpected.
I wonder which setting there were using. My VRAM usage was around 4.5GB

EDIT: ok, no PCSS+ and HBAO+, that explains it. GimpWorks strikes again! ;)

GCN is in both consoles. Just be glad you're not a Kepler owner. Eesh, it's bad. Yet another one.
 

Durante

Member
EDIT: ok, no PCSS+ and HBAO+, that explains it. GimpWorks strikes again! ;)
Calling two of the very best high-end diffuse lighting effects implemented in games "GimpWorks" reflects more on your perspective than those effects.

I swear, it often feels like you people would rather have a straight console port without any options.
 

cyen

Member
Calling two of the very best high-end diffuse lighting effects implemented in games "GimpWorks" reflects more on your perspective than those effects.

I swear, it often feels like you people would rather have a straight console port without any options.

NVIDIA defense force? I kid. It would be better if they were using in-house effects that could be optimized by each vendor, not just one.
Love HBAO+ btw, but we need open standards, not closed ones. That's a discussion for another topic, though.
 

UnrealEck

Member
Ubisoft said there were minor graphical issues on NVIDIA hardware before the beta, so maybe performance is also lower than they anticipate it will be closer to release.
 

SimplexPL

Member
Calling two of the very best high-end diffuse lighting effects implemented in games "GimpWorks" reflects more on your perspective than those effects
The ridiculous phrase "strikes again" and an appropriate emoticon were there for a reason - i.e. so that people wouldn't take it dead seriously. Guess that didn't work out.
I have a 980Ti, and I like and use HBAO+. I also have the utmost respect for all your work for the community, so I hope that cleared things up.
 
30-40fps medium



:(....why did you have to tell me the truth?
 

Smokey

Member
Gsync is bugged in fullscreen mode; try borderless window and it works. At least Gsync does. Maybe SLI + Gsync works in borderless window mode.

Is it? I just set it to Windowed (Fullscreen) after reading this post, and my GPU usage tanks (GSYNC on) to the 50%-60% area. Put it back on Full Screen and it's right back up to the normal 99%.
 
Is it? I just set it to Windowed (Fullscreen) after reading this post, and my GPU usage tanks (GSYNC on) to the 50%-60% area. Put it back on Full Screen and it's right back up to the normal 99%.

Windowed video modes do not work for AFR on Windows under DX. Never have, really. They should work under Mantle, Vulkan, and DX12, though.

Never ever use windowed, fullscreen or otherwise, with SLI.
 

Spinifex

Member
[gamegpu.ru benchmark chart: Tom Clancy's The Division beta, NVIDIA GPUs]


I wonder why they are recommending a resolution in which the game runs at 40 fps at max settings.

We're moving away from being able to run games at max settings at launch -- that's okay, but benchmarkers and reviewers put such a high emphasis on max settings that it's causing heaps of knock-on issues, IMO.
 

Akronis

Member
NVIDIA defense force? I kid. It would be better if they were using in-house effects that could be optimized by each vendor, not just one.
Love HBAO+ btw, but we need open standards, not closed ones. That's a discussion for another topic, though.

Why would Ubisoft Massive spend time creating their own contact-hardening shadow and AO methods if they're already provided to them? Plus, they've been used in other Ubisoft titles.

They're incredibly taxing effects for both AMD and NVIDIA cards.
 

Yibby

Member
Is it? I just set it to Windowed (Fullscreen) after reading this post, and my GPU usage tanks (GSYNC on) to the 50%-60% area. Put it back on Full Screen and it's right back up to the normal 99%.

I get 99% GPU load in both fullscreen and windowed mode, but in fullscreen the monitor refresh rate doesn't match my fps. The refresh rate jumps between 90Hz and 144Hz, and that's nowhere near my fps of around 45-60.
 

Smokey

Member
Windowed video modes do not work for AFR on Windows under DX. Never have, really. They should work under Mantle, Vulkan, and DX12, though.

Never ever use windowed, fullscreen or otherwise, with SLI.

I'm running a single Titan X.

I get 99% GPU load in both fullscreen and windowed mode, but in fullscreen the monitor refresh rate doesn't match my fps. The refresh rate jumps between 90Hz and 144Hz, and that's nowhere near my fps of around 45-60.

A little confused by this post. What do you mean, the refresh rate jumps? Some clarification if you could.

I played for an hour plus, and averaged 66 fps @ 2560x1440, everything on max except for PCSS+. Smooth, and I'm impressed by the beta's state. I can't really tell if GSYNC is working or not (not really sure how you could tell anyway), but this beta period has been a good experience for me.

[gamegpu.ru benchmark chart: Tom Clancy's The Division beta, NVIDIA GPUs]


I wonder why they are recommending a resolution in which the game runs at 40 fps at max settings.

2560x1440 is being used more and more by enthusiasts. They are not really "recommending" the resolution, but just saying you probably need the listed card for a nice, playable experience at that resolution.
 

Qassim

Member
[gamegpu.ru benchmark chart: Tom Clancy's The Division beta, NVIDIA GPUs]

I wonder why they are recommending a resolution in which the game runs at 40 fps at max settings.

Because there are settings other than max? People seriously need to get over this weird obsession with 'max settings' or it's going to change PC games for the worse.
 

Kezen

Banned
Because there are settings other than max? People seriously need to get over this weird obsession with 'max settings' or it's going to change PC games for the worse.

Indeed. I fear the bad rep some very good PC versions got will make devs much more risk-averse. If "max" settings performance is the yardstick by which PC SKUs are judged, then straight console ports will become the norm.

No doubt max settings would run very, very easily if no scalability beyond the consoles were involved.
 

Yibby

Member
A little confused by this post. What do you mean, the refresh rate jumps? Some clarification if you could.

I check my fps with ShadowPlay and it's around 45-60. My monitor has an overlay where I can see the current refresh rate. The framerate does not match the refresh rate shown by my monitor, which means G-Sync is not working in fullscreen.
 

darthbob

Member
After about 2 months I finally resolved my random rebooting problem (it was the PSU) and now my upgrade is complete!



Runs the Beta at High/Ultra settings, 1080p 60 fps flawless.
 

GHG

Member
I would be fine if they hid "ultra" or "extra high" or insane settings behind .ini files

Why?

To make people feel better about themselves so that they can ignorantly brag about "maxing" the game out on their mid range GPU?

I think people need to come to terms with the fact that PC games are becoming more demanding again, and embrace it. It's not a bad thing if you can put all the options on "medium" or "high" and still have a damn good looking game, while "ultra" is reserved only for those with enthusiast hardware, or until a future upgrade for most people.

This is how it always used to be, until developers started making games for consoles first and then just porting them over to PC with minimal improvements. Now we are actually starting to get PC games that are doing things the consoles can only dream of, and yet we are starting to see more complaints? We have people saying games are "unoptimised" because they can't max them out at 60fps on a GTX 970. I mean, give me a break...

It doesn't make sense to me. A GTX 970 is not supposed to be able to see you through a whole generation (4 years) while maxing every game out at 60fps. It doesn't work like that. If it did, the PC gaming space would be in a very bad place, as it would mean we are in a state of stagnation.

So no. The ultra/very high settings should absolutely be left in the graphical settings menu. If people can't handle the fact that they have to lower/tweak some options or upgrade then tough shit. Either that or they can go crying back to console gaming and live in ignorant bliss.
 

Smokey

Member
Why?

To make people feel better about themselves so that they can ignorantly brag about "maxing" the game out on their mid range GPU?

I think people need to come to terms with the fact that PC games are becoming more demanding again, and embrace it. It's not a bad thing if you can put all the options on "medium" or "high" and still have a damn good looking game, while "ultra" is reserved only for those with enthusiast hardware, or until a future upgrade for most people.

This is how it always used to be, until developers started making games for consoles first and then just porting them over to PC with minimal improvements. Now we are actually starting to get PC games that are doing things the consoles can only dream of, and yet we are starting to see more complaints? We have people saying games are "unoptimised" because they can't max them out at 60fps on a GTX 970. I mean, give me a break...

It doesn't make sense to me. A GTX 970 is not supposed to be able to see you through a whole generation (4 years) while maxing every game out at 60fps. It doesn't work like that. If it did, the PC gaming space would be in a very bad place, as it would mean we are in a state of stagnation.


It's mostly a "well my video card cost $300+ so I should be able to max everything because reasons" thing. Massive have put a good amount of work into the PC version. It runs and looks great. I dunno where this need to "max" everything came from either, but it needs to die.
 

GHG

Member
It's mostly a "well my video card cost $300+ so I should be able to max everything because reasons" thing. Massive have put a good amount of work into the PC version. It runs and looks great. I dunno where this need to "max" everything came from either, but it needs to die.

I honestly think it's coming from people who only started gaming on PC in the last 4-5 years or so, when games were just not pushing the available hardware (with the exception of a couple of titles). It was pretty much a given that if you had a $300 GPU you could max everything out at 1080p, so people who were introduced to PC gaming at that time were wrongly conditioned into thinking that was the norm.

Those of us who have been gaming on PCs for longer than that, however, know it is not the norm. Games used to come out and you would think "can I even run this at all?", never mind "oh no, there's this one place where it dropped from 60fps to 50fps on my GTX 970 while maxed out, what an unoptimised mess, fix this shit Ubi".
 
It was just a few days ago I said that, with time, you'd need a 970 just for console parity, and everyone replied with "NO WAY".

These days it takes only a few days for predictions to come true. Just imagine what happens in the next couple of years when DX12 hardware comes out...
 
We have people saying games are "unoptimised" because they can't max them out at 60fps on a GTX 970. I mean, give me a break...

It doesn't make sense to me. A GTX 970 is not supposed to be able to see you through a whole generation (4 years) while maxing every game out at 60fps.

This game is 1080p, 30 fps at high settings on consoles. A GTX 770 is about twice that power and should give double the fps of the console at similar settings and resolution.

The 970 is a step above the 770, several times the power of a PS4. It's NATURAL people expect double the framerate at way higher graphic settings than consoles.

This game doesn't even come close to that.

It's not about "coming to terms that you can't max settings anymore". It's coming to term that the gap between expensive PC hardware and cheap console hardware is narrowing overnight.
 

Smokey

Member
This game is 1080p, 30 fps at high settings on consoles. A GTX 770 is about twice that power and should give double the fps of the console at similar settings and resolution.

The 970 is a step above the 770, several times the power of a PS4. It's NATURAL people expect double the framerate at way higher graphic settings than consoles.

This game doesn't even come close to that.

That's not how things work, and it has never worked like that.

Where is a link saying the consoles run at the equivalent of "high" settings? Also if you expect a game to run at more than double the frame rate of the console version along with highly intensive graphical effects because you spent $300+ on a GPU, that's a you problem.
 

GHG

Member
This game is 1080p, 30 fps at high settings on consoles. A GTX 770 is about twice that power and should give double the fps of the console at similar settings and resolution.

The 970 is a step above the 770, several times the power of a PS4. It's NATURAL people expect double the framerate at way higher graphic settings than consoles.

This game doesn't even come close to that.

It's not natural, it's idiotic. Sorry to be blunt.

Without getting into whether the consoles are running the game at the equivalent of high settings or not, people need to take into account how resource intensive the settings they are selecting are. People are not thinking, they are just typing.

If you whack up things like reflections, PCSS, HBAO, subsurface scattering, particle detail and volumetric fog, why are you expecting linear performance gains?
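A toy frame-time sketch of why the gains aren't linear (the per-effect millisecond costs below are invented, purely for illustration):

```python
# fps = 1000 / frame_ms, so effects that each add a fixed per-frame time
# cost eat disproportionately into high framerates. Costs are made up.
base_ms = 10.0  # 100 fps with the heavy effects off
effects = {"PCSS": 4.0, "HBAO+": 2.0, "volumetric fog": 3.0}  # ms each

frame_ms = base_ms
print(f"baseline: {1000 / frame_ms:.0f} fps")
for name, cost_ms in effects.items():
    frame_ms += cost_ms
    print(f"+ {name}: {1000 / frame_ms:.0f} fps")
# 100 fps -> 71 -> 62 -> 53: each effect adds a constant slice of time,
# but the fps counter falls non-linearly.
```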

It was just a few days ago I said that, with time, you'd need a 970 just for console parity, and everyone replied with "NO WAY".

These days it takes only a few days for predictions to come true. Just imagine what happens in the next couple of years when DX12 hardware comes out...

And as for this post... Console parity means running at equivalent settings with the same target resolution AND framerate. Just think about that for a second.
 
It's not natural, it's idiotic. Sorry to be blunt.

Without getting into whether the consoles are running the game at the equivalent of high settings or not, people need to take into account how resource intensive the settings they are selecting are. People are not thinking, they are just typing.

If you whack up things like reflections, PCSS, HBAO, subsurface scattering, particle detail and volumetric fog, why are you expecting linear performance gains?

Of course we have to see what the console settings correspond to on PC.

Experience just says that console settings are usually equivalent to "high". In that case a PC with a 970, even without PCSS or HBAO, still can't hold a stable 60fps at 1080p.

That's absurd.

It's not absurd if it turns out that console settings are dialed down even compared to "high" on PC. But that's hardly ever the case. We'll see.
 
And as for this post... Console parity means running at equivalent settings with the same target resolution AND framerate. Just think about that for a second.

Yes. I said two things. One is that the gap between PC hardware and consoles is narrowing. The other is that the 970 will reach parity with consoles within the next couple of years. Right now you can see that you can't even double the fps at the same resolution, which is INSANE for a 970 compared to a console.

As I said, right now and for all games up to this point (say, Battlefield), a 770 CAN double the fps of a PS4 at similar settings. There are almost zero exceptions; all games out there follow this, outside a few poorly optimized cases.

The 970 is not a 770. The 770 can hold a stable 60 fps at 1080p in Battlefield at slightly better than console settings. Battlefield doesn't even run at 1080p on console. Do you see how big the gap is?

Now in this game the 770 might not even offer a fixed 30 fps at console settings and resolution.

There are countless Eurogamer articles out there demonstrating how console parity is achievable with an i3 and a 750 Ti. Now try to run this game, at similar settings, on that hardware.

The gap is narrowing overnight.
 

Tovarisc

Member
Of course we have to see what the console settings correspond to on PC.

Experience just says that console settings are usually equivalent to "high". In that case a PC with a 970, even without PCSS or HBAO, still can't hold a stable 60fps at 1080p.

That's absurd.

It's not absurd if it turns out that console settings are dialed down even compared to "high" on PC. But that's hardly ever the case. We'll see.

Isn't subsurface scattering exclusive to the PC version of the game? Also, I wonder how extensive the volumetric fog is in the console version. To argue that a 970 can't do 1080p60 at console-quality settings, we'd first need to know exactly what quality settings the consoles use and what techniques they are actually running.

Edit: What about rendering distance and object quality? Those are also adjustable on PC.

edit:
There are countless Eurogamer articles out there demonstrating how console parity is achievable with an i3 and a 750 Ti. Now try to run this game, at similar settings, on that hardware.

What are those same or similar settings, though? I'm actually curious whether they have been mapped out for the Division beta already.
 

GHG

Member
Yes. I said two things. One is that the gap between PC hardware and consoles is narrowing. The other is that the 970 will reach parity with consoles within the next couple of years. Right now you can see that you can't even double the fps at the same resolution, which is INSANE for a 970 compared to a console.

As I said, right now and for all games up to this point (say, Battlefield), a 770 CAN double the fps of a PS4 at similar settings. There are almost zero exceptions; all games out there follow this, outside a few poorly optimized cases.

The 970 is not a 770. The 770 can hold a stable 60 fps at 1080p in Battlefield at slightly better than console settings. Battlefield doesn't even run at 1080p on console. Do you see how big the gap is?

Now in this game the 770 might not even offer a fixed 30 fps at console settings and resolution.

There are countless Eurogamer articles out there demonstrating how console parity is achievable with an i3 and a 750 Ti. Now try to run this game, at similar settings, on that hardware.

The gap is narrowing overnight.

How can you argue any of this without even knowing what the console equivalent settings are? You're just guessing at this point because it suits what you want to believe.

Keep on believing I guess.
 
How can you argue any of this without even knowing what the console equivalent settings are? You're just guessing at this point because it suits what you want to believe.

Yes, it's just a guess, but it's a guess based on the cases that came before this.

Eurogamer will likely do these comparisons soon, so we'll see where the truth lies. In the meantime we know that Tomb Raider is another of those games demonstrating the gap is narrowing.

The difference is that if it happens in one game, you can blame it on that game's lack of optimization. But if more games come out, big titles, and they all follow a similar trend, then you have to adjust your expectations.

People here are saying that the narrowing gap is due to the PC version offering higher settings than usual. We'll see soon if that's the case or not. I say it's not.
 

Helznicht

Member
This game is 1080p, 30 fps at high settings on consoles. A GTX 770 is about twice that power and should give double the fps of the console at similar settings and resolution.

The 970 is a step above the 770, several times the power of a PS4. It's NATURAL people expect double the framerate at way higher graphic settings than consoles.

This game doesn't even come close to that.

It's not about "coming to terms that you can't max settings anymore". It's coming to term that the gap between expensive PC hardware and cheap console hardware is narrowing overnight.

OK, turn the settings down to console levels and you're close (they are not high; high on PC looks noticeably better than the Xbox). Shots taken at 1080p on a 970 with an i5 @ 3.5GHz. Framerate ran between 105 and 130 the entire time, so close to 4x the Xbox. We just have to come to terms with the fact that those high-to-ultra effects take a toll on the framerate.

[five 1080p screenshots]
 

napata

Member
Yes. I said two things. One is that the gap between PC hardware and consoles is narrowing. The other is that the 970 will reach parity with consoles within the next couple of years. Right now you can see that you can't even double the fps at the same resolution, which is INSANE for a 970 compared to a console.

As I said, right now and for all games up to this point (say, Battlefield), a 770 CAN double the fps of a PS4 at similar settings. There are almost zero exceptions; all games out there follow this, outside a few poorly optimized cases.

The 970 is not a 770. The 770 can hold a stable 60 fps at 1080p in Battlefield at slightly better than console settings. Battlefield doesn't even run at 1080p on console. Do you see how big the gap is?

Now in this game the 770 might not even offer a fixed 30 fps at console settings and resolution.

There are countless Eurogamer articles out there demonstrating how console parity is achievable with an i3 and a 750 Ti. Now try to run this game, at similar settings, on that hardware.

The gap is narrowing overnight.

Why would you only look at NVIDIA? If I look at performance, I see a 7870 beating the PS4. The 7870 is right where it should be with regard to PS4 performance. It doesn't seem like the gap is narrowing at all for AMD. Kepler is just bad with new games, and this has been true for a while. Btw, this game seems to favor AMD by a wide margin in general. Hell, even the minimum & recommended requirements show this. Usually a 7770 isn't anywhere close to a 760.

Also console settings aren't high at all. They're usually a combination of high, medium and low with the most expensive settings like LOD and draw distance at low.
 

JaseC

gave away the keys to the kingdom.
Gah! When is Pascal out? I need the 1080Ti or whatever they will call it in my life.

Current rumours have it pencilled in for some point in the back half of the year, so it's still around five months away at best.
 

SimplexPL

Member
So what should be the yardstick for good performance/optimization? If an overclocked 980Ti paired with an overclocked i7-6700K cannot sustain 60fps in the Division beta on max settings, is that OK, or should we expect better performance? I wonder what the consensus is on that (if any).

@dictator93
I played on PC and then on PS4. The console version looked surprisingly good, at first glance not visibly gimped compared to PC, and it also ran at a perfect 30fps at 1080p. I'm sure there will be visible differences when comparing them side by side, but I was not able to spot any (I didn't try very hard, though). Even some form of real-time reflections was present, which surprised me the most.
 

Smokey

Member
So what should be the yardstick for good performance/optimization? If an overclocked 980Ti paired with an overclocked i7-6700K cannot sustain 60fps in the Division beta on max settings, is that OK, or should we expect better performance? I wonder what the consensus is on that (if any).

Yes... it is OK. What difference does max make from not-max? Is it a mental thing? I get an avg of 62 fps with all settings at max except for PCSS, at 2560x1440. Technically it's not maxed because of that one setting. In Rise of TR I get 60+ with only shadows one notch below max. Is that a problem too? I'm getting performance far beyond the console versions here, and over 60fps.
 

jaaz

Member
So what should be the yardstick for good performance/optimization? If an overclocked 980Ti paired with an overclocked i7-6700K cannot sustain 60fps in the Division beta on max settings, is that OK, or should we expect better performance? I wonder what the consensus is on that (if any).

I have this setup. As stated above, going with High instead of PCSS will yield a 60+ FPS average. Going with Ultra instead of HBAO+ will get you even more FPS. I think it's a question of driver and game optimization at this point, both of which can be worked on prior to launch. I can't tell the difference between the settings.

I'm more concerned about G-sync, frankly. I can't prove it, but it doesn't seem to be working.
 

SimplexPL

Member
In the previous post I forgot to mention the resolution, so the sentence should be:
" If overclocked 980Ti paired with overclocked i7-6700K cannot sustain 60fps in Division Beta on max settings at 1080p is that ok, or should we expect better performance? "


I have this setup. As stated above, going with High instead of PCSS will yield a 60+ FPS average.
Yes... it is OK. What difference does max make from not-max? Is it a mental thing? I get an avg of 62 fps with all settings at max except for PCSS, at 2560x1440. Technically it's not maxed because of that one setting. In Rise of TR I get 60+ with only shadows one notch below max. Is that a problem too?
I'm getting performance far beyond the console versions here, and over 60fps.

Sure, if one setting can make that big a difference, then by all means maxing it just for the sake of maxing makes no sense. However, I was getting around 40fps with everything maxed out at 1440p, so I have to check whether disabling PCSS will really increase performance by 50%.
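A quick sanity check on what that 50% would mean in frame time (simple arithmetic, no assumptions beyond the numbers above):

```python
# 40 -> 60 fps is a 50% fps increase, but in frame time it means shaving
# 1000/40 - 1000/60 = ~8.3 ms off every frame, i.e. PCSS alone would have
# to cost roughly 8 ms per frame at these settings for the claim to hold.
for fps in (40, 60):
    print(f"{fps} fps = {1000 / fps:.1f} ms per frame")
print(f"PCSS would need to cost ~{1000 / 40 - 1000 / 60:.1f} ms per frame")
```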

Going with Ultra instead of HBAO+ will get you even more FPS.
I remember NVIDIA marketing HBAO+ as a feature that offers superior quality with better performance, so in theory HBAO+ should be the best option for NVIDIA users, but maybe I misunderstood.
 

Tovarisc

Member
I remember NVIDIA marketing HBAO+ as a feature that offers superior quality with better performance, so in theory HBAO+ should be the best option for NVIDIA users, but maybe I misunderstood.

HBAO+ is maybe the best AO solution out there atm when it comes to how it looks, but I wouldn't say it's less demanding on hardware than a generic AO solution. I'd say it costs a few frames more than your run-of-the-mill AO solution.
 