
Watch_Dogs PC Performance Thread

Netboi

Banned
Sleeping Dogs Ultra + HD Texture Pack Vs. Watch Dogs Ultra Settings

No Sweet FX added to Watch Dogs.

TfraPKn.jpg
 
I get no stuttering using the NVIDIA recommended settings (except DoF which I turned off). Hardware is:

Qt5rfhO.png


Nothing is overclocked and GPU temperature is limited to a max of 80C. I don't know what the frame rate is, but I'm not sensitive to low frame rates and whatever it is seems fine to me.
 

b0bbyJ03

Member
Tbh pretty much everything looks different. The ground, the grass, you can even still see the difference on the building to the left, which is a bit further. The bottom part of the machine to the right shows much more detail. Sure in the distance you won't notice anything, but I'd say the difference is pretty noticeable.

Talking about this comparison:





And this is just plain wrong.

If you read NVIDIA's comparison, they even mention that the higher texture settings add more geometry. Compare the Medium and High settings and you can clearly see it.

http://www.geforce.com/whats-new/guides/watch-dogs-graphics-performance-and-tweaking-guide#textures
 

Nethaniah

Member
I think it looks good because of all the glow-ish effects on it, like a CG character in a movie. Watch Dogs opts for realism.

All the characters just look......oiled up? I found it jarring when I played it. Not that I consider the characters of Watch Dogs to be the best; it's not a truly fair comparison either way.
 

UrbanRats

Member
Sleeping Dogs Ultra + HD Texture Pack Vs. Watch Dogs Ultra Settings

No Sweet FX added to Watch Dogs.

TfraPKn.jpg

Sleeping Dogs looks amazing, but Watch Dogs just looks better; let's not hop on the crazy train now.

The HBAO+ alone, compared with that shite they have in Sleeping Dogs, makes a huge difference.
 

Guri

Member
I haven't started playing yet, but there are a bunch of known issues on Ubi's forums. It might be a good idea for people to go to the specific threads for one or more issues and provide the details they ask for, so maybe they can get to solutions quickly.
 

irishcow

Member
Just started playing this. I have an i5 2500K @ 4.5GHz and an R9 290 4GB card. Running at 1080p, maxed-out settings, 4xMSAA.

I locked it at 30fps and it never drops below it. Runs great. No stuttering.

Unlocked it runs between 38 and 70fps.

I would rather have a locked 60fps, especially considering it doesn't look all that great, but it's not bad at all at 30fps.

Definitely better than it would be on my PS4 I presume.
 
Why? Your frame time is probably dropping out of its arse (and causing the stutters) with only 2GB of VRAM. We know the game is a RAM monster for both card and system.

Stuttering isn't VRAM. It's something else. For me SLI causes stuttering, as when I disable SLI, the stuttering disappears.
 

Netboi

Banned
All the characters just look......oiled up? I found it jarring when it played it, not that i consider the characters of Watch Dogs to be the best, not a truly fair comparison though either way.

That SSAO is what gives everything that look in Sleeping Dogs. I'm just putting it up because people think Sleeping Dogs looks better. Visually, it doesn't look better than Watch Dogs.
 

-Deimos

Member
i7 2600k @ 4.2 GHz
4GB GTX 670 reference clock
8GB RAM

Everything ultra, DOF, vsync and motion blur off with 2x TXAA gets me 30-55 fps and stutters when driving fast at 1080p.

Is pop-in a problem for anyone else? Extremely noticeable when driving on the highway.
 
I ran the game for about an hour this morning after it d/l last night and here are my settings. Luckily I have not run into any screen tearing issues, massive framerate drops, or really bad slowdown. I was worried that my setup would not be able to run this game at all.


i5 2500k@ 3.3ghz
770 @ stock
8GB RAM 1600mhz
Installed on Evo 120GB SSD
50" Plasma

- Resolution 1920 x 1080
- Refresh rate 60hz
- Fullscreen
- Vsync "1"
- GPU max buffered frames "3"
- Textures "High"
- Anti-aliasing "MSAA4x"
- Level of detail "Ultra"
- Shadows "High"
- Reflections "High"
- AO "HBAO+Low"
- Motion blur "On"
- Depth of field "On"
- Water "Ultra"
- Shader "High"

Any recommendations on what you guys think I can bump up a bit with my setup? Going to tinker around with this a bit more later tonight. I was thinking of maybe OC my 2500K to see if that does much. I am pretty happy with the results, but wouldn't mind to try and squeeze out some more performance from my rig.
 

Netboi

Banned
I ran the game for about an hour this morning after it d/l last night and here are my settings. Luckily I have not run into any screen tearing issues, massive framerate drops, or really bad slowdown. I was worried that my setup would not be able to run this game at all.


i5 2500k@ 3.3ghz
770 @ stock
8GB RAM 1600mhz
Installed on Evo 120GB SSD
50" Plasma

- Resolution 1920 x 1080
- Refresh rate 60hz
- Fullscreen
- Vsync "1"
- GPU max buffered frames "3"
- Textures "High"
- Anti-aliasing "MSAA4x"
- Level of detail "Ultra"
- Shadows "High"
- Reflections "High"
- AO "HBAO+Low"
- Motion blur "On"
- Depth of field "On"
- Water "Ultra"
- Shader "High"

Any recommendations on what you guys think I can bump up a bit with my setup? Going to tinker around with this a bit more later tonight. I was thinking of maybe OC my 2500K to see if that does much.

Overclock your GPU to 1200/1800
Motion Blur Off
AA: TemporalSmaa or TXAA 2x

Change your XML file in documents/mygames/watchdogs
DeferredFXQuality=PC
PostFXQuality=Off (Unless you want it to look grainy like a film)
Resolution: 1920 x 1080
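For anyone hunting for those lines, this is roughly what the entries look like in the config file (on my install it's GamerProfile.xml in that folder; the attribute names follow the post above, but the exact casing and surrounding structure may differ from this sketch):

```xml
<!-- Documents/My Games/Watch_Dogs/<profile id>/GamerProfile.xml (approximate) -->
<!-- DeferredFxQuality="pc" enables the higher-quality deferred effects;
     PostFxQuality="off" removes the grainy film filter. -->
<RenderProfile ResolutionX="1920" ResolutionY="1080"
               DeferredFxQuality="pc"
               PostFxQuality="off" />
```

Back the file up before editing; the game may overwrite or reset it if it doesn't like a value.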
 

Wag

Member
So after all this, after holding it back, this is what we get? If it's in this state now, I can only imagine what state the game was in six months ago. So is this the glitchiest big-game release in recent memory?
 

Saiyan-Rox

Member
i5 3350 not over clocked

12GB RAM

AMD radeon 7950

Windows 8.1

Frame rate takes a dive when I'm driving around the city, sometimes below 20, which is bad. Any idea what settings would be good for me?
 
I ran the game for about an hour this morning after it d/l last night and here are my settings. Luckily I have not run into any screen tearing issues, massive framerate drops, or really bad slowdown. I was worried that my setup would not be able to run this game at all.


i5 2500k@ 3.3ghz
770 @ stock
8GB RAM 1600mhz
Installed on Evo 120GB SSD
50" Plasma

- Resolution 1920 x 1080
- Refresh rate 60hz
- Fullscreen
- Vsync "1"
- GPU max buffered frames "3"
- Textures "High"
- Anti-aliasing "MSAA4x"
- Level of detail "Ultra"
- Shadows "High"
- Reflections "High"
- AO "HBAO+Low"
- Motion blur "On"
- Depth of field "On"
- Water "Ultra"
- Shader "High"

Any recommendations on what you guys think I can bump up a bit with my setup? Going to tinker around with this a bit more later tonight. I was thinking of maybe OC my 2500K to see if that does much. I am pretty happy with the results, but wouldn't mind to try and squeeze out some more performance from my rig.

Overclock your GPU to 1200/1800
Motion Blur Off
AA: TemporalSmaa or TXAA 2x

Change your XML file in documents/mygames/watchdogs
DeferredFXQuality=PC
PostFXQuality=Off (Unless you want it to look grainy like a film)
Resolution: 1920 x 1080

Looks like this guy just got banned. LOL! Are these settings that Netboi recommended worth trying?
 

Saiyan-Rox

Member
What are your current settings?

Not at the PC atm but from what I remember

Resolution 1920 x 1080

- Vsync off
- GPU max buffered frames 1
- Textures Ultra
- Anti-aliasing Off
- Level of detail Ultra
- Shadows High
- Reflections High
- AO off
- Motion blur On
- Depth of field On
- Water Ultra
- Shader High
 

mkenyon

Banned
What does a card do when it runs out of VRAM and has to swap in new assets?

Essentially, there is a huge chug in performance where frame times spike well above the norm and the card basically stops rendering frames. In the following graph, you can see frame times spike to levels that make the game essentially unplayable for a few seconds at a time. In-game, it will feel like a momentary pause.

VurndvA.png


Here are FPS charts with the exact same data as above.

okh8Xv8.png


Notice that the "FPS" doesn't drop below 25 FPS despite having frame times well above 150ms (6.6 FPS)? This is why games can often feel much worse than what the FPS meter is showing. FPS takes all frames rendered over one second and gives you an average of those frames, which can obfuscate major issues.
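To see that averaging effect concretely, here's a quick, illustrative Python sketch (the numbers are invented for illustration, not taken from the charts above):

```python
# One second of gameplay: 40 smooth 20 ms frames plus a single 200 ms hitch.
frame_times_ms = [20] * 40 + [200]

total_ms = sum(frame_times_ms)                  # 1000 ms = exactly one second
fps = len(frame_times_ms) / (total_ms / 1000.0)
worst_frame_fps = 1000.0 / max(frame_times_ms)

print(f"average FPS: {fps:.0f}")                # reads as a healthy 41 FPS
print(f"worst frame: {max(frame_times_ms)} ms, "
      f"i.e. a momentary {worst_frame_fps:.0f} FPS")  # the 5 FPS hitch you feel
```

The FPS counter says 41, but one of those frames took a fifth of a second, which is exactly the kind of pause the frame time graph exposes and the FPS graph hides.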



What is the difference between this and frequent stuttering?

This sort of frequent hitching and odd frame timing is often described as microstutter. It happens when frames are rendered at inconsistent times in rapid succession. In the next graph, you can see that the 680 SLI line in blue is very thick, as each frame has a fairly significant time difference from all the other frames. In-game, it will feel consistently choppy, but without any huge chugging.

Avl2om4.png


Will adding more cards in SLI/X-Fire help solve the issue of running out of memory?

No. Imagine the first graph above with a new line that essentially mirrors the other one, but is otherwise about 60-90% faster most of the time. The huge pauses and stutter would not go away at all, as you are still hitting the upper boundaries of VRAM.

So what should I look at buying to have awesome performance with Ultra textures in Watch_Dogs?

In the words of a well respected GAF member, "anyone upgrading for WD has brain damage or is looking for an excuse". Honestly, at this point, it's hard to really give any specific recommendations because the game seems to be wonky. The VRAM situation from many accounts seems to be a coding issue or bug, and could be ironed out over the next few weeks.

As it stands though, AMD's cards are generally performing much better than NVIDIA's offerings at each of the price points. The R9 290 4GB seems to be the most obvious choice for smooth performance, and can often be found for $330-400 right now. If you're okay with buying used, you can find them as cheap as $250. If you can't afford that much, then the R9 280X can handle a light amount of AA with Ultra textures at 1080p no problem.

If you're dead set on NVIDIA, the GTX 780 is the only choice. The 770 4GB, while tempting at a lower price, can't even hope to compete with the R9 290. They're not even really in the same league. The 770 is literally the same card as the GTX 680. To say it is a bit long in the tooth would be an understatement. It's a mid-range GPU masquerading as a top tier card, and the limited memory bandwidth exposes this. For this reason, I wouldn't look at anything other than the GTX 780.

But mkenyon, I read xxx's Watch_Dogs performance review and the AMD cards aren't anything special.

It's a brand new game. If you aren't planning on playing it any longer than the first week or two, by all means, go ahead and follow one of those reviews. But really, the performance in the game is going to change pretty radically over the next few weeks as new drivers and patches come through. To give you a rough idea of how current videocards stack up against each other, this is something both NVIDIA and AMD have had a long time to optimize for.

0Mj6QdE.png
 
[H]ardOCP's Watch_Dogs Performance Preview is out.


14011841102ZXDC1lHwG_3_3.gif
-"In this test we are pushing the game up to its maximum in-game quality setting levels to compare performance. This means we are going beyond the "Ultra" overall quality setting provided in the game. We are raising the HBAO setting to its highest HBAO+ High setting along with "Ultra" textures to provide the highest settings possible. We still have AA disabled here. We see XFX Radeon R9 290X DD performing faster than NVIDIA GeForce GTX 780 Ti even at HBAO+ High mode. This is mostly due to the performance drops that are present with "Ultra" textures. Still, these are the highest possible in-game settings this game is capable of, and the AMD Radeon R9 290X GPU based video card is coming out on top."

-"We hope we have smashed the rumors and provided facts based on gameplay and not some quick-use benchmark tool that will many times tell you little. We actually found the Radeon R9 290X slightly faster in some scenarios compared to the GeForce GTX 780 Ti. We also found out that gameplay consistency was a lot better on Radeon R9 290X with "Ultra" textures enabled thanks to its 4GB of VRAM."

-"AMD made a move years ago with a bold statement to release video cards (AMD Radeon 7950 and 7970) with 3GB of VRAM at a time when most video cards were carrying 2GB. It seems that sort of forward thinking has paid off finally in 2014 with Watch Dogs.
Thankfully NVIDIA now has the GTX 780 and GTX 780 Ti with 3GB of VRAM. Video cards like the GeForce GTX 770 have a reference specification of 2GB. You can purchase 4GB GTX 770 cards and the like, but these do cost more for that VRAM. However, in this game, we think that will be beneficial to gameplay if you want to run with the "Ultra" textures. In fact, it will be required.
We found out that even on the GeForce GTX 780 Ti with 3GB of VRAM performance can drop as new textures and scenery are loaded into memory. This was most notable moving the camera, or driving in the open world city. With the XFX Radeon R9 290X DD with its 4GB of VRAM we experienced smooth gameplay with no drops in performance using "Ultra" textures."



Looks like the 290 is beating out the 780 Ti. So it seems for now that if you're on the fence about these two high-end cards, the 290X DD is going to outperform with more stable framerates. Dat VRAM, doe.
 

mkenyon

Banned
[H]ard is pretty outdated now, refusing to use the newest, most accurate testing tools. I wouldn't go anywhere outside of Guru3D, PCPer, and TechReport, IMO.
 
Not at the PC atm but from what I remember

Resolution 1920 x 1080

- Vsync off
- GPU max buffered frames 1
- Textures Ultra
- Anti-aliasing Off
- Level of detail Ultra
- Shadows High
- Reflections High
- AO off
- Motion blur On
- Depth of field On
- Water Ultra
- Shader High

I'd take the textures and LOD to High, put AO on MHBAO, and AA on Temporal SMAA. I'd much rather have AO and AA than higher res textures.
 
I'm getting a tiny bit of stuttering in the Downtown areas when driving. It reminds me of the old Oblivion loading pause.

There doesn't seem to be anything I can change in the settings to get rid of it, but I've only messed around for a little bit. V-Sync, buffered frames, filtering, AA, and Level of Detail don't seem to make any difference.
 

Seanspeed

Banned
So the fix for stuttering when driving is setting textures to high? Or does -disablepagefile actually work?
There's no sure-fire fix, it seems. Some people seem to have alleviated it in different ways, but many people are still experiencing it after trying these methods. People with high-end setups as well.

There's something going wrong somewhere.
 
[H]ard is pretty outdated now, refusing to use the newest, most accurate testing tools. I wouldn't go anywhere outside of Guru3D, PCPer, and TechReport, IMO.

How so? Watch_Dogs has no benchmark tools and their tests are always done from real-world ingame gameplay, not some useless timedemo like most sites use.
 

mkenyon

Banned
How so? Watch_Dogs has no benchmark tools and their tests are always done from real-world ingame gameplay, not some useless timedemo like most sites use.
They still use FPS charts rather than frame time analysis because the latter is more time intensive. However, if you look at my post above, only going with FPS data can really obfuscate problems.
 

Skyzard

Banned
GeForce Experience tells me to put everything to max for my 780 Ti.

No. I don't want a slowdown-fest WD.


---Actually, it's fine! Just saw that the suggested AA was Temporal SMAA.
 
If you read NVIDIA's comparison, they even mention that the higher texture settings add more geometry. Compare the Medium and High settings and you can clearly see it.

http://www.geforce.com/whats-new/guides/watch-dogs-graphics-performance-and-tweaking-guide#textures

That's not between Ultra and High, though. I don't think there are geometry differences between those, but I'd say the textures are pretty significantly different.

There are geometry differences between High and Medium textures in NVIDIA's article/comparison shots (which could help A LOT with performance, IMO).

I wish we could separate these two things, textures and model quality, into two separate selections, because I'd like higher-quality textures and don't care much for the added model quality.

http://international.download.nvidi...ogs-textures-comparison-1-high-vs-medium.html

They still use FPS charts rather than frame time analysis because the latter is more time intensive. However, if you look at my post above, only going with FPS data can really obfuscate problems.
Excuse my ignorance, but what is the difference between frame time and frames per second? What's so important about frame time, and how does it translate into gameplay? How can end-users measure it, and ultimately, does it matter more than FPS?
 
So I am trying to lock this game at 30 using the NVIDIA control panel, with vertical sync at half the refresh rate. It doesn't seem to take hold in-game, though. I would use the in-game option, but the input delay is way too terrible.
 

Vuze

Member
Thought this would be the most appropriate thread to share our findings.
Those are the default render configurations that the dev team seemed to aim for on PS4/XO :p
They're contained in an internal engine config that also allows for some interesting graphical enhancements on PC (see screenshots below, taken by Vortex), but it also seems to cause some glitches with AA if modified.

Eventually this could help to tweak performance; will keep you updated.

 

DSN2K

Member
Did Ubi put out patches to increase performance for ACIV? Wondering what kind of long-term support the game will get, because right now it's not really optimised well for any build.
 

riflen

Member
It is weird. For me, when vsync is set to 1, it does not drop to 30 the second I get under 60; however, it still drops framerate a lot compared to vsync being off, and definitely more than it should if triple buffering worked properly. I tweeted Jon Morin to ask his team to implement proper triple buffering. It really sucks right now. And vsync off with tearing is ugly too.

DirectX doesn't feature triple buffering. Optionally, developers can use a render queue, which uses 3 buffers by default but can be adjusted from 2 to 8. This has some of the advantages of triple buffering (it won't crash your frame rate down to sync rate / 2), but it adds latency compared to true triple buffering, because it displays old frames instead of being able to discard them.

I don't know how much Ubisoft can actually do. DirectX has needed true triple buffering for years and years now.
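To illustrate the latency point above, here's a toy Python model (all numbers invented; this is a sketch of the steady-state behavior, not how the driver actually schedules frames):

```python
# Toy model: why a deep FIFO render queue shows older frames than true
# triple buffering when the GPU renders faster than the display refreshes.
REFRESH_MS = 16.7    # 60 Hz display: one scanout every ~16.7 ms
RENDER_MS = 10.0     # GPU finishes a new frame every 10 ms

vblank_t = 95.0      # examine the scanout that happens at t = 95 ms

# Completion times of frames finished by then (frame i finishes at i * RENDER_MS).
completed = [i * RENDER_MS for i in range(1, 100) if i * RENDER_MS <= vblank_t]

# True triple buffering: flip to the newest finished frame, discard stale ones.
triple_age_ms = vblank_t - completed[-1]   # 95 - 90 = 5 ms old

# FIFO render queue of depth 3: frames are shown strictly in order, so at
# steady state the frame on screen entered the queue ~3 scanouts ago.
queue_age_ms = 3 * REFRESH_MS              # ~50 ms old

print(f"true triple buffering: frame on screen is {triple_age_ms:.1f} ms old")
print(f"3-deep render queue:   frame on screen is ~{queue_age_ms:.1f} ms old")
```

Same frame rate either way, but the queued frame you're looking at is several scanouts stale, which is the extra input latency people feel.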
 

mkenyon

Banned
I had a feeling you were talking about latency, but how can frame latency actually be improved/affected by an end-user?
I think you're still confused. You should read that post again with this in your head: FPS and frame time are the same data, presented differently. FPS simply averages out frame times over a given second, which makes it a less accurate metric.

If you have 20 frames over one second, frame time analysis looks at each frame individually, while FPS just averages those 20 frames out as "20 FPS". In reality, you may have one frame at 200ms, one at 15ms, one at 60ms, etc. For translation: 16.7ms = 60 FPS, 33.3ms = 30 FPS.
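That translation is just 1000 divided by the frame time in milliseconds; a throwaway Python helper makes it easy to check any value:

```python
# Frame time (ms) <-> instantaneous FPS: the same data, two presentations.
def frame_time_to_fps(ms: float) -> float:
    return 1000.0 / ms

for ms in (16.7, 33.3, 60.0, 200.0):
    print(f"{ms:6.1f} ms  ->  {frame_time_to_fps(ms):5.1f} FPS")
```

So the 200ms frame in the example above is a momentary 5 FPS, even if the counter still reads "20 FPS" for that second.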
 