
Watch_Dogs PC Performance Thread

rashbeep

Banned
Would people sacrifice other settings to have HBAO+ high? I really like the way it looks, but I'd probably have to lower some other stuff to keep the game running the way I want it.
 

maneil99

Member
Badly done benchmark - it's obvious he ran it in a non-CPU-intensive place


Here's what Purepc got:

http://www.purepc.pl/karty_graficzn...e_test_kart_graficznych_i_procesorow?page=0,6

R means cores, W means threads

Dual-core CPUs like the Pentium are dead; going for an i3 gives huge gains over two cores alone

A 4-core i5 without OC gives good performance, but an i7 at the same 3.5 GHz clock speed will give around 15% more fps

A 4.5 GHz i5 is around 5% faster than a 3.5 GHz i7

A 6-core i7 @ 3.5 GHz is close in performance to a 4-core i7 @ 4.5 GHz

The game can use all 12 threads of Sandy Bridge-E
Uh, in the benchmark you linked a 4-core is only 3 fps slower than a 4-core with HT... Did you even read it properly?
There's a 5% difference in that graph between a 4.5 GHz HT CPU and a non-HT one
 

belmonkey

Member
That's for you ;)
http://imgur.com/b1UDroH

Looks like everything with 2 GB or less has a lower min FPS than cards with 3 GB+.

I'm also a bit surprised to see an 8 GB 290X benchmarked; it's the first I've seen in a benchmark.
 

Kinthalis

Banned
No stuttering at all? Even driving around fast?

I have an i7 3770k @ 4.4ghz, 16GB of ram, 2x GTX 780s in SLI and I get stuttering when driving around fast on ultra, even w/ the newest nvidia driver.

It looks like you need more than 3 GB of VRAM for ultra textures, especially if you are using an MSAA-style solution (MSAA x2/x4/x8, TXAA x2/x4).

At least I do, but I'm running at 1440p, so there's that.
 
I've maxed out the settings and I'm using SMAA. I have two 4 GB GTX 680s and I'm getting horrible stuttering that drags the framerate down from a solid 50 to the mid 20s. Is a horrible SLI implementation causing this? It was rock solid during the indoor tutorial... I'm using the latest drivers.
 

DarkoMaledictus

Tier Whore

Serandur

Member
It looks like you need more than 3 GB of VRAM for ultra textures, especially if you are using an MSAA-style solution (MSAA x2/x4/x8, TXAA x2/x4).

At least I do, but I'm running at 1440p, so there's that.

I hope with all my heart (i.e. my wallet), Watch Dogs is an exception here rather than the norm from here on out. Your avatar reminds me, I fear for TW3.
 
Has anyone gotten the EVGA on-screen display to work with this game? :/


@Wounded I'll have a few tests here later on my i7 920 and I'll be doing them at stock speeds :)
 

Kinthalis

Banned
I hope with all my heart (i.e. my wallet), Watch Dogs is an exception here rather than the norm from here on out. Your avatar reminds me, I fear for TW3.

I hope it does actually - in a way. I mean, ultra textures are more than the PS4 can handle. That means the devs made sure to include higher quality assets on PC. I hope THAT trend continues, just with a little more polish and optimization to boot.

If in 3 years we're pushing double the texture resolution/higher texture variance than consoles and to do so we need 4+ GB video cards, I'm ok with that. That's what happened last gen.
 

Kinthalis

Banned
Has anyone gotten the EVGA on-screen display to work with this game? :/


@Wounded I'll have a few tests here later on my i7 920 and I'll be doing them at stock speeds :)

Afterburner works fine. EVGA's PrecisionX sometimes lags behind the injection service releases. I'd first make sure you've got the latest version of PrecisionX, and then see if you can manually update the RivaTuner Statistics Server app.

Otherwise, give Afterburner a shot.
 

dsk1210

Member
Runs and looks really nice on an i5 2500K at 4.4 GHz and a 780 Ti, although I dropped down to high textures as ultra was causing quite severe stuttering on 3 GB of VRAM.

Installed it on an SSD as well in the hope I could totally remove any stutter; that never made a huge difference as far as I can tell. I also disabled the in-game v-sync as it was all over the place, and used adaptive v-sync from the Nvidia control panel instead - huge improvement.
 

Serandur

Member
I hope it does actually - in a way. I mean, ultra textures are more than the PS4 can handle. That means the devs made sure to include higher quality assets on PC. I hope THAT trend continues, just with a little more polish and optimization to boot.

If in 3 years we're pushing double the texture resolution/higher texture variance than consoles and to do so we need 4+ GB video cards, I'm ok with that. That's what happened last gen.

I'm fine with it if the textures actually incur a real performance hit that 780s wouldn't cope with even if they did have the VRAM. I'm just feeling insecure over my new purchase (powerful, but arbitrarily held back) and not particularly hopeful that my next upgrade will even be for the actual chip's performance.

In other words, while I hope for higher quality assets and all, I'm mostly hoping Nvidia's decision to go with 3 GB on the 780s wasn't a huge mistake from a consumer point of view and that it actually proves to match the GPU's abilities. But if people with 6 GB 780s, or even just R9 290s or 4 GB 770s, end up running future demanding games with ultra textures without a hitch while equally or more powerful 780s/Tis struggle, I would be rather upset.
 

Aucool

Member
FX 8350 + GTX 770 (2 GB) + 8 GB RAM. At 1080p I can essentially choose between everything on ultra (except textures) + HBAO+ + temporal SMAA at 45 fps with dips, or high settings + HBAO+ + temporal SMAA at 60 fps. I opted for the second.
 

Kinthalis

Banned
I'm fine with it if the textures actually incur a real performance hit that 780s wouldn't cope with even if they did have the VRAM. I'm just feeling insecure over my new purchase (powerful, but arbitrarily held back) and not particularly hopeful that my next upgrade will even be for the actual chip's performance.

In other words, while I hope for higher quality assets and all, I'm mostly hoping Nvidia's decision to go with 3 GB on the 780s wasn't a huge mistake from a consumer point of view and that it actually proves to match the GPU's abilities. But if people with 6 GB 780s or even just R9 290s end up running future demanding games with ultra textures without a hitch while equally or more powerful 780s/Tis struggle, I would be rather upset.

True enough, and I do think 3 GB was a bad decision. I fully expect to be unloading mine on eBay in 2 years for an 800 series card, specifically because, as you said, the actual GPU is more than capable of these next-generation games, and it seems VRAM is the bottleneck in terms of a truly "ultra" experience.

'Course, the average PC gamer won't care much. He's still getting a "next gen" experience, even if he isn't running games at the ultra texture setting.
 

Nivash

Member
It hurts so bad having a 780 Ti with only 3 GB of VRAM. Why, Nvidia? Why?
1440p am cry.

I have:
i5 3570K @ 4.5 GHz
Asus DCUII 780 Ti 3 GB
8 GB RAM

All settings ultra with FXAA.

With vsync off I hover around 50, give or take 10 fps.

I resorted to playing with vsync 1 and it's been locked at 30, while in some very rare cases dropping below that.
:(

Also running a GTX 780 Ti, and I've checked - it's not the VRAM. Overall I get wildly differing fps; it hovers around 60 for the most part but dips hard into the 40s whenever I drive. Then there are some spots that are even worse. I did some testing on one of the worst - conveniently located just outside the safehouse. All tests were done at 1920x1080 and with temporal SMAA.

First: everything at lowest possible, vsync off (obviously) http://i.imgur.com/yuEkZVe.jpg


57 fps. VRAM at ~2500 MB. GPU load 45%. Oh, and a funny thing happened - the GPU strain was so low that it automatically clocked itself down to 901 MHz! But still not even 60 fps! CPU load didn't get included in this picture, but it was about the same as in the other tests, 60-70%.

Next: textures at medium (aka lowest), everything else at max.


39 fps. VRAM at ~2300 MB. GPU load 50% (but at least at normal clocks), CPU load 70%.

Finally: ultra textures, everything else at max.


38 fps. VRAM at ~2900 MB, which is basically the card's max. GPU load 53%. CPU load 73%.

So yeah... it's not a VRAM bottleneck, it's not a CPU bottleneck, and there's no justification for why the game can't run this scene at higher than 57 fps with the absolute lowest settings. I imagine that an i7 could be what lets some people with a 780 Ti run it above 60 fps (I'm running an i5), but it still reeks of horrible optimization. No wonder it got delayed. They should have delayed it longer from the looks of it; guess we'll have to pray that it gets patched in the future.
 

Netboi

Banned

It was a tech demo. Tech demos often end up different from the final release due to optimization.

The only tech demo that made it to the final game without any changes was Crysis. Look how many people complained about its performance. So CryEngine 3 ended up being toned down for optimization's sake.

I bet if they had shipped the full force of the Watch Dogs engine, you wouldn't be running it at a good frame rate. Even on ultra settings, most mid- and high-end cards are not constantly over 60 fps unless in SLI.

People will find something to complain about in any game. I think Watch Dogs still looks amazing on ultra settings compared to other open-world games out there.
 

Serandur

Member
True enough, and I do think 3 GB was a bad decision. I fully expect to be unloading mine on eBay in 2 years for an 800 series card, specifically because, as you said, the actual GPU is more than capable of these next-generation games, and it seems VRAM is the bottleneck in terms of a truly "ultra" experience.

'Course, the average PC gamer won't care much. He's still getting a "next gen" experience, even if he isn't running games at the ultra texture setting.

Think we'll still be able to get decent prices for them in a couple years (to those average PC gamer consumers)? I was planning on holding off until late 2016/early 2017 and doing a major overhaul with Skylake-E, a Pascal card, and a 4K monitor.
 

Vuze

Member
So yeah... it's not a VRAM bottleneck, it's not a CPU bottleneck, and there's no justification for why the game can't run this scene at higher than 57 fps with the absolute lowest settings. I imagine that an i7 could be what lets some people with a 780 Ti run it above 60 fps (I'm running an i5), but it still reeks of horrible optimization. No wonder it got delayed. They should have delayed it longer from the looks of it; guess we'll have to pray that it gets patched in the future.

Same here, I just can't hit a stable 60 even at the lowest settings with AA, AO etc. turned off; it's ridiculous. Hoping for a performance patch soon, but it's Ubi...
 

ShadyJ

Member
Judging by this thread, it's not the VRAM that's the issue - it's the optimization...

I mean, if a 780 Ti isn't getting you 60 fps with everything maxed, something is wrong. Does Ubisoft have some next-gen killer card that they used to build this game on?
 
I get definite performance gains from SLI at 2560x1440... I had heard from others that it had a negative impact at resolutions of 1920x1080 or lower, which is weird... but that hasn't been my experience at 1440p.
 

Kinthalis

Banned
Also running a GTX 780 Ti, and I've checked - it's not the VRAM. Overall I get wildly differing fps; it hovers around 60 for the most part but dips hard into the 40s whenever I drive. Then there are some spots that are even worse. I did some testing on one of the worst - conveniently located just outside the safehouse. All tests were done at 1920x1080 and with temporal SMAA.

First: everything at lowest possible, vsync off (obviously) http://i.imgur.com/yuEkZVe.jpg



57 fps. VRAM at ~2500 MB. GPU load 45%. Oh, and a funny thing happened - the GPU strain was so low that it automatically clocked itself down to 901 MHz! But still not even 60 fps! CPU load didn't get included in this picture, but it was about the same as in the other tests, 60-70%.

Next: textures at medium (aka lowest), everything else at max.



39 fps. VRAM at ~2300 MB. GPU load 50% (but at least at normal clocks), CPU load 70%.

Finally: ultra textures, everything else at max.



38 fps. VRAM at ~2900 MB, which is basically the card's max. GPU load 53%. CPU load 73%.

So yeah... it's not a VRAM bottleneck, it's not a CPU bottleneck, and there's no justification for why the game can't run this scene at higher than 57 fps with the absolute lowest settings. I imagine that an i7 could be what lets some people with a 780 Ti run it above 60 fps (I'm running an i5), but it still reeks of horrible optimization. No wonder it got delayed. They should have delayed it longer from the looks of it; guess we'll have to pray that it gets patched in the future.

Nivash, what CPU? Something is up, because my 780 Ti sees 90%+ utilization most of the time.

I'm running at 1440p with an i7 3770K, though, so I'm not CPU bound. Are you running the latest beta drivers that hit yesterday?
 

Arkanius

Member
AMD said they would post the drivers today and, well... they haven't yet. Guru3D had them early for review and article purposes. Since most websites decided to post them before AMD did, Guru3D released them as well.

Install them, they're legit, but I wonder what's wrong; maybe AMD will release a different version with some extra hotfixes.
 

damaph

Member
My specs are
3570k @ 4500 MHz
GT 670 @ 1200 MHz with 337.88 drivers
256 GB Samsung 830
This game seems to not be very optimized for PC at all. I tried the recommended settings on the Nvidia website which is a mixture of ultra and high settings and it drops to around 45 fps very often. With the high preset, it stays at 60 fps most of the time, but will have drops to 50 fps when driving. If I want a solid 60 fps, I need to drop my settings to a mixture of medium and high.
 
My specs are
3570k @ 4500 MHz
GT 670 @ 1200 MHz with 337.88 drivers
256 GB Samsung 830
This game seems to not be very optimized for PC at all. I tried the recommended settings on the Nvidia website which is a mixture of ultra and high settings and it drops to around 45 fps very often. With the high preset, it stays at 60 fps most of the time, but will have drops to 50 fps when driving. If I want a solid 60 fps, I need to drop my settings to a mixture of medium and high.
The Nvidia article was aiming for a 35 fps average, I believe.
 

riflen

Member
Nivash, what CPU? Something is up, because my 780 Ti sees 90%+ utilization most of the time.

I'm running at 1440p with an i7 3770K, though, so I'm not CPU bound. Are you running the latest beta drivers that hit yesterday?

His screenshots show a 3570K at 4 GHz.
 

syoaran

Member
So is the GAF consensus now that an i5 4670K is still slightly better value than an i7 4770K at the same speeds?

Wondering which of these processors to get [the extra cash would probably go to water cooling to overclock past 4 GHz].
 