
Battlefield 1 RX480 VS GTX 1060

ethomaz

Banned
Fascinating to see that unlimited console optimization power is still a misleading fantasy almost 3 years into this generation.

The PS4 and X1 struggle to run this game, but the PS4 Pro should be able to maintain 1080p at 60 fps, possibly with minor drops to the 50s at the most alongside improvements to the visual fidelity, potentially sporting Ultra settings or a balance between the High and Ultra settings available on PC.
Struggle is a bit harsh... both consoles run at a solid 60fps.
 

spyshagg

Should not be allowed to breed
Game-ready drivers for both companies will hit the web later this week.

Today's results aren't from officially representative drivers.
 

V_Arnold

Member
NVidia 372.xx drivers.

Well thanks, so the 480 is STILL ahead of the 1060 at 1080p. The supposedly inferior product, as I recall. That was my point. (The 480 is cheaper than the 1060 as well.)

It's not Nvidia who has to improve DX12 performance on their GPUs.

It's not AMD either, because they seem to have pretty great DX12 performance with the new cards. I wonder where the spin goes if AMD can't have an improvement in any field without somehow being called out on something else, like, say, poor DX7 support.
 
BF1 and Gears 4 were two games I thought would kill off my 290x, but amazing optimisation means I can max both at 60fps @ 1080p.

Amazing.
 

warheat

Member
NVidia 372.xx drivers.



http://www.sweclockers.com/test/22533-snabbtest-doom-med-vulkan

That's without async compute, which isn't implemented in the Vulkan patch for Nvidia yet... they are working with Nvidia to release a new patch with async compute for Pascal.

BTW, OpenGL and Vulkan performance are close on Nvidia, which shows how bad AMD's OpenGL support is; Nvidia supports both APIs pretty well and stays ahead of AMD.

Is there any other benchmark besides Sweclockers with the newest driver? I'm going to build a new PC soon and still can't decide between the Galax GTX 1060 and the XFX GTR RX 480, but I'm inclined towards the RX 480 due to it overall having slightly better performance in DX12 games.
 
It's dropping below 30 FPS at times apparently. I'd definitely consider that struggling.

Ugh. Well, that just sealed it for me then. I'll probably wait a year and pick up BF1 for Xbox, assuming it has Scorpio optimizations similar to the PS4 Pro's. I'm not interested in PS4 MP games as I have zero interest in PS+ these days... I do all my MP gaming on Xbone. I'd pick it up for PC, but I prefer playing MP with a controller from the couch and don't feel like getting owned by mouse/kb players (even if there's controller aim assist, it presumably won't overcome a good kb/m player).
 
assuming it has Scorpio optimizations similar to PS4 Pro.
If Xbox One S is any indication, it won't need them. Unlike the Pro, the Xbox One S doesn't downclock its superior hardware for the sake of "compatibility", so I'm not expecting the Scorpio to start doing it.

Pretty sure Phil Spencer said as much in an interview but don't quote me on that.
 

ethomaz

Banned
Is there any other benchmark besides Sweclockers with the newest driver? I'm going to build a new PC soon and still can't decide between the Galax GTX 1060 and the XFX GTR RX 480, but I'm inclined towards the RX 480 due to it overall having slightly better performance in DX12 games.
I didn't find any other website that redid (or revisited) the reviews. That usually happens when a new GPU is launched, but we haven't had any since the patch.

Need to wait for a new GPU launch.

BTW, both cards are solid purchases, and there is actually no game that really makes use of DX12 yet.
 
160x90, damn. Lowest resolution of the generation.

VG Tech - Battlefield 1 Conquest PS4 Frame Rate Test

rdB0VPKl.png

It's a very bizarre and rare bug.
 

dr_rus

Member
It is not AMD either, cause they seem to have pretty great DX12 performance with the new cards. I wonder where the spin goes if AMD cant have an improvement in any field without somehow them being called out on something else, like, say, poor DX7 support.

Who said anything about AMD?

The general idea that if DX12 performance sucks compared to DX11 then it's the h/w vendor who should somehow fix it is completely wrong, as the reason for such behavior in 99% of cases is a badly coded DX12 renderer. That's what you get with thin APIs: you have to fix your program's bad performance yourself; the IHV can't do it for you.
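A toy Python sketch of that thin-API point (all names invented, nothing to do with real D3D): when the API is thick, the driver has enough latitude to reorder or batch work behind the app's back; when the API is thin, whatever order the app recorded is exactly what runs, and no driver update can rescue it.

```python
# Toy cost model: rendering cost is dominated by "state changes"
# (e.g. shader switches). A thick API's driver can sort the
# submissions to minimize them; a thin API runs them as recorded.
# (Real drivers can't always legally reorder draws; this is only
# an illustration of where the responsibility sits.)

def cost(submissions):
    # One unit of cost per draw, plus 10 per state change.
    changes = sum(1 for a, b in zip(submissions, submissions[1:]) if a != b)
    return len(submissions) + 10 * changes

app_order = ["shaderA", "shaderB"] * 50        # naive interleaving by the app

thick_api_cost = cost(sorted(app_order))       # driver reorders: 1 change
thin_api_cost = cost(app_order)                # app's order runs as-is

print(thick_api_cost, thin_api_cost)           # 110 vs 1090
```

The badly interleaved submission order costs roughly ten times more on the "thin" path, and only the app developer can fix it by recording the work sensibly in the first place.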
 
Another game where DX12 is a bad choice for actually playing it, regardless of GPU vendor:

frametimes_nvcdymf.png

frametimes_amdy1xpy.png

Yep, even in situations where DX12 offers an improvement in average FPS, frametimes are so bad that DX11 is still the better choice, even for AMD GPUs.

https://www.computerbase.de/2016-10/battlefield-1-benchmark/ - The full article for people who want to read up.

Some of the more particularly hilarious results are on the Intel/NV side of things, the Core i7 6700k + GTX 1080 combo loses nearly 20% performance going from DX11 to DX12, a testament to how good NV's DX11 CPU utilization is and how bad Frostbite's DX12 CPU utilization is.

There is one little bright spot: the DX12 multithreading is better than DX11's, despite the worse per-core performance. So if you happen to hate yourself enough to have an FX CPU + Radeon GPU combo, you do see some notable gains in performance, as DX12 multithreading helps bypass both the FX CPU single-core performance bottleneck and the Radeon DX11 driver CPU bottleneck; the Fury X in particular sees huge boosts. However, it doesn't stop FX CPUs from being hilariously CPU limited, particularly in MP, to the point that there's only a 10 FPS difference between an R9 380 and a Fury at 1080p.
 
If I remember right, Digital Foundry only tested the single player mode. In SP, the CPU workload is so light that it's never the bottleneck and the horrid CPU performance isn't really demonstrated.

In MP, it all goes to shit.
 

Durante

Member
Why doesn't Digital Foundry mention this?
You'd have to ask them. Generally, Digital Foundry is simply not as good or thorough at performance analysis of PC games as dedicated PC hardware sites (who have been doing this kind of thing for decades) are.
 

thelastword

Banned
Yep, even in situations where DX12 offers an improvement in average FPS, frametimes are so bad that DX11 is still the better choice, even for AMD GPUs.
I'm not sure that's right. The RX480 never drops below 60fps in DX12 and maintains a much higher lead over the 1060. There's no way its frametime is worse in DX12. Even in DX11 the RX480 does not fall below 60fps, but you could see the 1060 going below that in DX12. If you're playing this game on the RX480, DX12 is definitely the best way to play; the opposite is true for the 1060.

Beautiful Ninja said:
Some of the more particularly hilarious results are on the Intel/NV side of things, the Core i7 6700k + GTX 1080 combo loses nearly 20% performance going from DX11 to DX12, a testament to how good NV's DX11 CPU utilization is and how bad Frostbite's DX12 CPU utilization is.
Or perhaps how bad NV's DX12 performance has been overall... I do agree that their DX11 performance is amazing and wish that AMD would catch up; they've made some strides in that department with gains in older DX11 titles on their hardware, but obviously Nvidia is still ahead. In the case of the GTX 1060, I see two things...

1.) It appears a bit more powerful than the RX480 in most DX 11 titles.

2.) The RX 480 appears a bit more powerful in most DX12 titles


We'd have a better answer as to which hardware is stronger if AMD's DX11 code were better and if NV's hardware were more in tune with DX12. I do believe AMD's hardware is more future proof here, but there's no doubt NV knocked it out of the park with their DX11 drivers.

Beautiful Ninja said:
There is one little bright spot: the DX12 multithreading is better than DX11's, despite the worse per-core performance. So if you happen to hate yourself enough to have an FX CPU + Radeon GPU combo, you do see some notable gains in performance, as DX12 multithreading helps bypass both the FX CPU single-core performance bottleneck and the Radeon DX11 driver CPU bottleneck; the Fury X in particular sees huge boosts. However, it doesn't stop FX CPUs from being hilariously CPU limited, particularly in MP, to the point that there's only a 10 FPS difference between an R9 380 and a Fury at 1080p.
That is not a bad thing, because Zen has much improved single-core performance over previous AMD chips and it will be packing many cores as well. There's no doubt that Battlefield would pull in more CPU resources in MP, though.

Do you have the graphs of the fury vs 380 MP benches?
 
I'm not sure that's right. The RX480 never drops below 60fps in DX12 and maintains a much higher lead over the 1060. There's no way its frametime is worse in DX12. Even in DX11 the RX480 does not fall below 60fps, but you could see the 1060 going below that in DX12. If you're playing this game on the RX480, DX12 is definitely the best way to play; the opposite is true for the 1060.

Or perhaps how bad NV's DX12 performance has been overall... I do agree that their DX11 performance is amazing and wish that AMD would catch up; they've made some strides in that department with gains in older DX11 titles on their hardware, but obviously Nvidia is still ahead. In the case of the GTX 1060, I see two things...

1.) It appears a bit more powerful than the RX480 in most DX 11 titles.

2.) The RX 480 appears a bit more powerful in most DX12 titles


We'd have a better answer as to which hardware is stronger if AMD's DX11 code were better and if NV's hardware were more in tune with DX12. I do believe AMD's hardware is more future proof here, but there's no doubt NV knocked it out of the park with their DX11 drivers.

That is not a bad thing, because Zen has much improved single-core performance over previous AMD chips and it will be packing many cores as well. There's no doubt that Battlefield would pull in more CPU resources in MP, though.

Do you have the graphs of the fury vs 380 MP benches?

Durante linked the frametime charts just a few posts above us, you can see the difference in frametimes in both DX11 and DX12 for the RX 480, when paired with a FX-8370.

The RX 480's DX12 frametimes are all over the place, ranging from notably lower to notably higher than DX11. Overall it's a much more spiky experience, even if average frame rates are higher in DX12. The GTX 1060 actually has less pronounced frametime spikes in DX12 compared to the RX 480; however, it still spikes more than in DX11. In both cases, DX11 provides a much smoother experience, even if DX12 can provide the occasional higher-FPS scene.

And you're quick to blame Nvidia for the DX12 performance issues, when said performance issues are much more in the hands of the game developers than NV/AMD nowadays. It's much harder for NV to ship a driver update that can bypass crappy developer code in DX12 or Vulkan, because their DX12/Vulkan drivers don't handle as many responsibilities as older APIs did.

AMD gets away looking better in crappy DX12 implementations, because their DX11 drivers were poor enough, particularly in regards to CPU overhead, that even bad DX12 paths are as good or better than their DX11 paths.

Nvidia gets shafted on crappy DX12 implementations, because their DX11 drivers perform so well, that it's much easier for performance regressions to appear.

https://www.computerbase.de/2016-10...gramm-battlefield-1-auf-dem-fx-8370-1920-1080 - Here's the link for the FX CPU performance. The game is horribly, horribly bottlenecked in MP on a FX CPU, it's so bottlenecked that the Fury X and GTX 1080 are CPU bottlenecked even at 4K, which is basically unheard of.
 

SapientWolf

Trucker Sexologist
Well thanks, so 480 is STILL ahead of 1060 in 1080p. The supposedly inferior product, as I recall. That was my point. (As the 480 is cheaper than the 1060 as well..).



It is not AMD either, cause they seem to have pretty great DX12 performance with the new cards. I wonder where the spin goes if AMD cant have an improvement in any field without somehow them being called out on something else, like, say, poor DX7 support.
Frostbite loves AMD. It's been that way since BF3.
 

Armaros

Member
Yup, devs need to step up and bite the bullet. The transition is going to be super rough for consumers as well, though :/

Devs are not going to take on more work for themselves to just help out AMD. Most devs teams are spread thin as it is.
 

napata

Member
I'm not sure that's right. The RX480 never drops below 60fps in DX12 and maintains a much higher lead over the 1060. There's no way its frametime is worse in DX12. Even in DX11 the RX480 does not fall below 60fps, but you could see the 1060 going below that in DX12. If you're playing this game on the RX480, DX12 is definitely the best way to play; the opposite is true for the 1060.

Don't give advice when you have no idea what you're talking about. I don't get how you can be in so many tech threads and not even know what frametimes are. How is that possible?

A frametime is basically the amount of time it takes to deliver one frame, while a framerate is the amount of frames delivered per unit of time.

60 fps tells you that within 1 second 60 frames were delivered, but it doesn't tell you how evenly those frames were delivered. Ideally, within that 1 second each frame arrives every 16.7 ms (1000/60).
This doesn't have to be the case, however. For example, one frame could be delivered after 30 ms, the next after 20 ms, the one after that after 8 ms, and so on, with the remaining frames averaging things out, because you still need 60 delivered frames to get a reading of 60 fps.

These uneven deliveries cause stutter, and thus bad performance, even though your framerate counter still shows the number 60. This is what's called framepacing issues (Bloodborne, for example, suffers from this). In such a case the framerate is more or less useless for determining actual performance.
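That difference between even and uneven delivery can be shown in a few lines of Python (a toy sketch; the millisecond values are made up so that both runs sum to exactly one second):

```python
FRAMES = 60
steady = [1000 / FRAMES] * FRAMES            # every frame ~16.7 ms apart
spiky = [30.0, 12.0, 8.0] * (FRAMES // 3)    # same 60 frames in 1 s, uneven

for name, ft in (("steady", steady), ("spiky", spiky)):
    fps = len(ft) / (sum(ft) / 1000)         # both counters read "60 fps"
    print(f"{name}: {fps:.0f} fps, worst frame {max(ft):.1f} ms")
```

Both runs report 60 fps, but the spiky run has frames arriving nearly twice as far apart as the average; that gap is the stutter a frametime graph exposes and an fps counter hides.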

So based on the images posted by Durante, you don't ever want to use DX12 in this game, even when owning a 480. Do you now understand why? Probably not, but I hope I can at least educate some other people.
 

dr_rus

Member
Frostbite loves AMD. It's been that way since BF3.

Not really, it's more about Battlefield's team love for AMD than FB3 in general. There are a lot of examples of FB3 games which perform on par on NV's h/w, recent one being Mirror's Edge Catalyst.

2.) The RX 480 appears a bit more powerful in most DX12 titles

Are these DX12 titles all from AMD's Gaming Evolved program by any chance?
 

thelastword

Banned
The RX 480's DX12 frametimes are all over the place, ranging from notably lower to notably higher than DX11. Overall it's a much more spiky experience, even if average frame rates are higher in DX12. The GTX 1060 actually has less pronounced frametime spikes in DX12 compared to the RX 480; however, it still spikes more than in DX11. In both cases, DX11 provides a much smoother experience, even if DX12 can provide the occasional higher-FPS scene.

Snip... I don't get how you can be in so many tech threads and not even know what frametimes are. How is that possible?...Snip
You guys realize that framerate is tied to frametime, right? There are no framepacing issues noted with BF1 on AMD hardware as far as I know. So bringing Bloodborne into this conversation is odd, because that could have been fixed.


Frametime is a tricky thing because so many things can affect a frametime graph, so anyone can selectively take a portion of a game which is heavily CPU bound, like BF's MP, pair the GPUs with a weaker CPU, and say... "Hey, its frametime is out of whack". Notwithstanding that one of the GPUs may be weaker, the drivers worse, the CPU utilization worse, etc... Traditionally in DX11, when you pair an AMD CPU with an NV card you get a better framerate and better frametimes than an AMD CPU/GPU combo... Why? Because the CPU footprint is less with NV, period... There's so much more to consider too, but as knowledgeable as you are, I'm sure you know.

This is why I don't trust any frametime graph. You know how easy it is to put up a wacky frametime graph to skew a conversation? There are so many variables to consider; it all depends on how accurate your software is, and so many other intangibles...

In any case, CPU utilization is better/more efficient in DX12 for AMD cards, and there's a clear reason why you would have bigger spikes in MP with the FX's weaker single-core performance. Funny enough, this test was done for SP by Digital Foundry (not that I defend them that much), but fair is fair... Yet this conversation has turned to MP with an FX 8370. How about testing MP with the i5 Skylake which they used in their test?

In any case, here's a link to a better done frametime analysis comparison if you care for it. Move on from the linked page and you can see frametimes with a FX 8370 and i7 5960X etc...I'm sure we will get more graphs on MP frametimes when the game actually releases......a credible mp frametime analysis is always harder to do obviously.
 
You guys realize that framerate is tied to frametime, right? There are no framepacing issues noted with BF1 on AMD hardware as far as I know.
...so the graphs are 'skewed'. Really, wtf.
In any case, cpu utilization is better/more efficient in DX12 for AMD cards and there's a clear reason why you would have bigger spikes in MP with the FX's weaker single core performance

Did you really think this through before posting it? That makes absolutely no sense. DX12 is (in theory) more able to spread itself across multiple cores, especially in comparison to AMD's DX11 drivers, which typically hog a single core more disproportionately than Nvidia's. And yet the API with supposedly better CPU utilization has more frametime spikes?

You're literally seeing data which indicates "This processor is a bottleneck under this API on this game", and responding with "Well yeah it's bottlenecking because it's more efficient". That's really...something.
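The multithreading point being argued over can be sketched in a few lines of toy Python (invented function names; real D3D12 command lists are obviously not Python lists): a DX11-style driver funnels all recording and submission through one thread, while a DX12-style app records independent command lists on several worker threads and submits them in order.

```python
from concurrent.futures import ThreadPoolExecutor

DRAW_CALLS = list(range(10_000))

def record_command_list(batch):
    # Stand-in for recording one command list's worth of work.
    return [("draw", call) for call in batch]

def submit_dx11_style(calls):
    # DX11-style: everything is recorded and submitted on one
    # thread, so a weak single core becomes the bottleneck.
    return record_command_list(calls)

def submit_dx12_style(calls, workers=4):
    # DX12-style: record independent command lists in parallel,
    # then submit them to the queue in the original order.
    chunk = len(calls) // workers
    batches = [calls[i * chunk:(i + 1) * chunk] for i in range(workers - 1)]
    batches.append(calls[(workers - 1) * chunk:])
    with ThreadPoolExecutor(max_workers=workers) as pool:
        lists = pool.map(record_command_list, batches)
    return [cmd for cl in lists for cmd in cl]

# Same commands reach the GPU either way; only the CPU-side
# recording is spread across cores in the DX12-style path.
assert submit_dx11_style(DRAW_CALLS) == submit_dx12_style(DRAW_CALLS)
```

(In CPython the GIL means this toy won't actually run faster; it only illustrates the structure, i.e. why an API that lets the app parallelize recording helps a CPU with weak single-core performance like the FX chips discussed above.)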
 
Remember the frametime graph revolution a number of years ago? Exposing the fact that games looked and played terribly on AMD hardware in comparison to NV? Remember that? Remember how important that was to make AMD work on a better driver?

I think we should just ignore that nigh revolutionary change in benching and game performance because it exposes a bias, namely my own.
 

tr1p1ex

Member
I ordered an RX 480, mostly based on a few sources showing that it runs faster in BF1. I'll post some observations by the end of the weekend for anyone wanting another data point. I get it Friday, but will probably play with my current card a bit before putting in the RX 480, just to see the difference.
 

Caayn

Member
Another game where DX12 is a bad choice for actually playing it, regardless of GPU vendor:

frametimes_nvcdymf.png

frametimes_amdy1xpy.png

How do the frametimes look on an i5 or i7?
 

dr_rus

Member
I'm so happy to see an AMD card beating out the Nvidia option.

It doesn't though. Modern rival cards are pretty much even; a cheaper 1070 is mostly beating the Fury X, and the only real win for AMD happens when you compare several-years-old cards (meaning the Kepler gen for NV), which are likely VRAM limited here. The game does run better on AMD h/w on average, but it's not so much better that you can say it's beating anything.
 

thelastword

Banned
...so the graphs are 'skewed'. Really, wtf

Anybody can use any frametime graph to prove a point. There could be many reasons why frametimes are worse in MP on AMD hardware in this particular sample. We need more samples and more testing to determine consistency, especially after the game launches. Perhaps this particular title is not as multi-core-centric as we're thinking. Perhaps a driver or game patch may improve this. Hell, perhaps the final game may have already sorted that out, or perhaps the dev is not aware of the issue atm.

Don't try and pretend that every combination of hardware on PC is perfectly optimized on day one. Do you think Dice was aware that their game drops to 160x90 res on consoles? Does that mean the consoles just can't run Dice games properly? Can't things get fixed or improved, especially if they're not released yet?

This is a thread about the SP portion, yet the poster who pasted these frametime graphs never bothered to post the SP graphs from the same site; he only posted the MP ones. You see how that works: conveniently leaving out the actual subject of this thread. Obviously there's an issue with the FX8370, with both cards spiking a bit more in MP due to higher CPU load; it's not like the DX12 spike-line is perfect on NV either. There's definitely an issue, which I'm sure drivers/patches will sort out.

DX12 coding is still in its infancy; as we can clearly see, there are titles where the DX11 code clearly outpaces the DX12 code on Nvidia hardware. So let's not pretend everything works 100% correctly and all CPU cores are leveraged in DX12 for all titles. It's a work in progress; for all we know, CPU utilization may even be lower on NV cards in DX12... Strange concept, I know.



Because they just tested single player, where that does not seem to be as big an issue:

13-630.1476869928.png


15-630.1476876580.png
Ahh, so there were SP graphs? Coupled with the ones I posted from Guru showing no issues with AMD CPUs. Obviously, we will get more MP tests later today and tomorrow etc., when new drivers and patches will be available. It's crazy that that German site did not use an Intel CPU in 2016. I mean, who couples an RX480 or NV 1060 with an FX8370 anyway... I will look forward to more MP tests in the next few days.
 

GSG Flash

Nobody ruins my family vacation but me...and maybe the boy!
Great performance by the RX480. Curious to see how this game runs on the RX470 as both the 480 and 470 are definitely the best value for the money.

It doesn't though. Modern rival cards are pretty much even; a cheaper 1070 is mostly beating the Fury X, and the only real win for AMD happens when you compare several-years-old cards (meaning the Kepler gen for NV), which are likely VRAM limited here. The game does run better on AMD h/w on average, but it's not so much better that you can say it's beating anything.

LOL

I mean, I know you're an Nvidia fanboy, but the evidence is right there in video form: the RX480 in DX12 outperforms the 1060 in any scenario.

I'll make sure to save this quote, though, for next time there's a game where an Nvidia card slightly outperforms an AMD card ;)
 

V_Arnold

Member
It doesn't though. Modern rival cards are pretty much even; a cheaper 1070 is mostly beating the Fury X, and the only real win for AMD happens when you compare several-years-old cards (meaning the Kepler gen for NV), which are likely VRAM limited here. The game does run better on AMD h/w on average, but it's not so much better that you can say it's beating anything.

You are contradicting yourself in the same sentence.
Why is this so hard to just accept? It's not like there is direct competition, since the 1070/1080 are on a whole different level (both in price and in performance) from what AMD's newest lineup offers. Is it that hard to accept that the 1060 might not beat the cheaper 480 in all situations?

I mean, I know you're an Nvidia fanboy, but the evidence is right there in video form: the RX480 in DX12 outperforms the 1060 in any scenario.

I'll make sure to save this quote, though, for next time there's a game where an Nvidia card slightly outperforms an AMD card ;)

Yeah, it is very jarring to see the mental gymnastics taking place to avoid letting AMD score even a "win" in one game against one particular card. As if the takeaway here were to choose the 480 over the 1070, which it obviously isn't.
 

dr_rus

Member

They're saying that the only change compared to 16.10.1 is the addition of a CF profile for DX11 mode; single-card performance is the same.

LOL

I mean, I know you're an Nvidia fanboy, but the evidence is right there in video form: the RX480 in DX12 outperforms the 1060 in any scenario.

I'll make sure to save this quote, though, for next time there's a game where an Nvidia card slightly outperforms an AMD card ;)

I don't think that you know anything tbh.


You are contradicting yourself in the same sentence.
Why is this so hard to just accept? It's not like there is direct competition, since the 1070/1080 are on a whole different level (both in price and in performance) from what AMD's newest lineup offers. Is it that hard to accept that the 1060 might not beat the cheaper 480 in all situations?

No, I'm not. The fact that the game runs a bit better on AMD h/w doesn't mean that AMD h/w beats NV h/w here, as there are several instances where this isn't actually happening and the gap is rather small. AMD also has basically nothing to beat the 1080 or Titan XP.
 

GSG Flash

Nobody ruins my family vacation but me...and maybe the boy!
I don't think that you know anything tbh.

And this proves what exactly?

You're not fooling anyone, every GPU thread I've seen you in you are either:

A - Defending or making excuses for Nvidia performance
B - Belittling AMD performance
C - All of the above

It would be nothing short of a miracle to see you actually praise AMD performance when it's due.

No, I'm not. The fact that the game runs a bit better on AMD h/w doesn't mean that AMD h/w beats NV h/w there as there are several instances where this isn't actually happening since the gap is rather small. AMD also has basically nothing to beat 1080 or Titan XP.

Why are you bringing up the 1080 or Titan X? For this game and in this particular comparison, the RX480 beats the GTX1060, full stop. Whether it's by a small margin or large is irrelevant. The evidence is right in the DF video.
 

spyshagg

Should not be allowed to breed
Calling Digital Foundry "amateur" compared to classic websites on account of this frametime issue is a bit daft.

The Frametimes Graph is right there in front of our eyes for the entire length of the DF video.

They simply did not mention any frametime issue because, as you can see in their video, there wasn't one. Let's be serious here.
 