Is the RX Vega 56 8.2 TF or 10.5 TF?
Simpler to use TechPowerUp's performance summaries.
Very clearly gets faster relative to the 1060 over time on average.
I'm missing something, but this summary shows it didn't change anything at all... and the small difference is probably due to the games tested.
Simpler to use TechPowerUp's performance summaries.
Very clearly gets faster relative to the 1060 over time on average.
Okay, what's the theory behind Gears of War 4 with Unreal Engine 4, which also runs 6% faster on the Fury X?
AotS is very bandwidth-hungry due to how its engine functions. It's also heavy in the compute part of the rendering process, which means that no bandwidth-saving technique works there, and thus Fiji can have a lead.
I would be cautious about memory OC benchmarks indicating much in the case of Vega, as it has been stated that OCing the memory (or GPU) doesn't affect the clocks of the IF which connects them, and thus it's unclear whether a memory clock change even affects the actual bandwidth here.
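For reference, the *theoretical* HBM2 bandwidth scales linearly with memory clock; a rough sketch of the standard peak-bandwidth formula, assuming the reference Vega figures (2048-bit bus, 800 MHz on Vega 56 and 945 MHz on Vega 64). Whether an OC'd clock actually delivers that bandwidth end-to-end, given the IF question above, is exactly what's unclear:

```python
def hbm2_bandwidth_gbs(mem_clock_mhz, bus_width_bits=2048):
    """Theoretical peak bandwidth in GB/s for a double-data-rate memory bus."""
    bytes_per_clock = bus_width_bits / 8 * 2  # DDR: two transfers per clock
    return mem_clock_mhz * 1e6 * bytes_per_clock / 1e9

print(hbm2_bandwidth_gbs(800))  # Vega 56 reference: 409.6 GB/s
print(hbm2_bandwidth_gbs(945))  # Vega 64 reference: 483.84 GB/s
```

So a 10% memory OC raises the theoretical number by 10%, but the benchmarks above suggest the delivered gain may be smaller if the fabric clock stays put.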
There is nothing from AMD indicating that the DCC got any fundamental improvements.
DCC improvements between GCN3 and GCN5 may be about more than just bandwidth. Some things can be done in fewer cycles; some things can't be done at all on older archs. Tessellation is unlikely, as there are no big changes between 3 and 5 there; the rest is just guessing.
It's useless, but it does improve the 99th percentile in Deus Ex by 12% with the same average performance.
What it clearly shows is no change on average, comprised of both minor gains and minor losses, which means that it's useless in current games.
PCGH has no issues with MSAA in Crysis 3, which leads to an open field of problems with no clear answers yet.
https://www.youtube.com/watch?v=zEfRi5pBQr4
Not the first indication of MSAA having an unusually high cost on Vega. Might be due to either bandwidth or some issues with the new backend (RBEs/caches).
It is, but the results are obviously a disaster.
So Vega is just a big Polaris chip in terms of performance per clock? Wasn't Vega supposed to be the largest architectural improvement that GCN ever got?
Simpler to use TechPowerUp's performance summaries.
Very clearly gets faster relative to the 1060 over time on average.
In July 2016 the difference between RX 480 and 1060 FE is 10%
16 Games tested:
Anno 2205, AC:S, BF3, BF4, Batman AK, Blops 3, Crysis 3, FO4, Far Cry Primal, GTA 5, Hitman, JC3, Rainbow Siege, RotTR, Witcher 3, WoW
In October 2016 the difference between RX 480 and 1060 FE is 5.5%
18 Games tested:
Anno 2205, AC:S, BF4, Batman AR, Blops 3, Deus Ex:MD, DOOM, F1 2016, FO4, Far Cry Primal, GTA V, Hitman, JC3, NMS, Rainbow Siege, RotTR, Witcher 3, Warhammer
In Feb 2017 the difference between RX 480 and 1060 FE is 4.25%
21 Games tested:
Anno 2205, AC:S, BF4, Batman AR, Civ 6, Deus Ex:MD, DOOM, Dishonored 2, F1 2016, FO4, Far Cry Primal, GTA V, Hitman, JC3, Mafia 3, Rainbow Siege, RotTR, Witcher 3, Shadow Warrior 2, Warhammer, Watch Dogs 2
Nice analysis, although the one thing I wanted to point out is that BF1 is only in the 3rd benchmark; the first two use BF4.
Nice analysis, although the one thing I wanted to point out is that BF1 is only in the 3rd benchmark; the first two use BF4.
The difference between July 2016 and Feb 2017 seems to be amazing: a 6% performance gain, just because of improved drivers. But that's not the right conclusion. All we can say, based on this data, is that the base data changed (half of the games benchmarked in Feb 2017 weren't benchmarked in July 2016; that's too much).
So let's dive in even deeper and take a closer look at the benchmarks that all data sets have in common (12 games total):
*snip*
BF1
RX 480 (-25%); RX 480 (-26%); RX 480 (+4.3%)
--> Significant improvement
The biggest reason AMD hardware ages better is that it typically performs better in new games as the years go on.
10.5 TF at the boost clock; the lower figure quoted is at the base clock.
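Both numbers come from the same standard formula, peak FLOPS = shaders × 2 (one FMA per clock) × clock. A quick sketch, assuming the reference Vega 56 specs (3584 stream processors, 1156 MHz base, 1471 MHz boost):

```python
def tflops(shaders, clock_ghz):
    """Peak single-precision throughput: 2 FLOPs (one FMA) per shader per clock."""
    return shaders * 2 * clock_ghz / 1000.0

print(round(tflops(3584, 1.156), 1))  # base clock  -> 8.3
print(round(tflops(3584, 1.471), 1))  # boost clock -> 10.5
```

Real-world clocks (and thus real throughput) land somewhere between the two depending on power and thermal limits.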
Nice summary.
Is there any sort of source for this claim? I've run both for many years and never noticed any kind of difference between one brand aging better than another.
Proof? I really want to believe, but the benchmarks don't support it.
It's why AMD cards have consistently gained performance against Nvidia counterparts since the 7970 v GTX 680. Happens at virtually every single price point.
Of course using different games to make the summary will show big changes...
Look at TechPowerUp's summaries over the years.
Except it isn't. AMD cards gained on Kepler-era GPUs because the PS4 was released and game design shifted toward using more shaders and less fixed-function hardware like ROPs. It's very easy to see that in cross-gen games like GTA V, where Kepler stayed strong.
Of course using different games to make the summary will show big changes...
But where are the big changes in performance in the same games when compared with Nvidia?
It's why AMD cards have consistently gained performance against Nvidia counterparts since the 7970 v GTX 680. Happens at virtually every single price point.
AMD was always good about undercutting Nvidia's prices for competing cards. The cards aren't magically aging any better. They may cut prices more as the hardware ages, but that doesn't mean anything for Joe Shmoe the Gaming Bro who bought the card at full price when it was new. It ages... just like any other hardware.
The difference between July 2016 and Feb 2017 seems to be amazing: a 6% performance gain, just because of improved drivers. But that's not the right conclusion. All we can say, based on this data, is that the base data changed (half of the games benchmarked in Feb 2017 weren't benchmarked in July 2016; that's too much).
So let's dive in even deeper and take a closer look at the benchmarks that all data sets have in common (12 games total):
Anno 2205:
RX 480 (-25% fps on average vs. a 1060 FE); RX 480 (-26%); RX 480 (-25%)
--> no significant difference over the time period
AC:S
RX 480 (-23%); RX 480 (-25%); RX 480 (-20%)
--> no significant difference over the time period
BF1
RX 480 (-25%); RX 480 (-26%); RX 480 (+4.3%)
--> significant improvement
Batman: AK
RX 480 (+3.3%); RX 480 (+1.35%); RX 480 (+7.5%)
--> small improvement
FO4:
RX 480 (-9.3%); RX 480 (-17.5%); RX 480 (-9.35%)
--> no significant difference over the time period
Far Cry Primal
RX 480 (-10.2%); RX 480 (-9.5%); RX 480 (-8%)
--> no significant difference over the time period
GTA V
RX 480 (-10%); RX 480 (-11.8%); RX 480 (-11.73%)
--> no significant difference over the time period
Hitman
RX 480 (+1.7%); RX 480 (+3%); RX 480 (+7%)
--> small improvement
Just Cause 3
RX 480 (+3.3%); RX 480 (+4.2%); RX 480 (+6%)
--> small improvement
Rainbow Six: Siege
RX 480 (-4.6%); RX 480 (-3.4%); RX 480 (-1.3%)
--> no significant difference over the time period
Rise of the Tomb Raider
RX 480 (-11.4%); RX 480 (-3.3%); RX 480 (-0.6%)
--> significant improvement
Witcher 3
RX 480 (-11%); RX 480 (-10.7%); RX 480 (-4.5%)
--> small improvement
Out of 12 games, 2 (17%) improved in a significant way (~20.55% faster) and 4 (33%) had small improvements (~4.78% faster).
So yes, overall the RX 480 improved because of driver optimizations. Those improvements are huge in two cases, but on average we have a much smaller improvement of 2.1%, because half of the games didn't improve at all and 33% only barely.
In the end I stand by my previous statement: there isn't enough data to support the claim that "Polaris improved over time in a significant way".
Sorry for grammar and spelling mistakes; I'm obviously not a native speaker and it's sometimes a bit hard to phrase everything correctly.
edit:
Yeah, sorry, I messed up there. But my point stands, because the improvements look even less relevant without BF... not that it really matters. People will keep on repeating the same statements, based on misrepresentative data.
Or maybe there are more games that run better on AMD in 2017 than in 2016... 2018 can be the opposite.
There's a misunderstanding going on here. People who claim AMD cards 'age well' don't just mean it purely from the perspective of what you've presented above, i.e. improvements from drivers to older games. It's about the fact that Polaris cards perform relatively stronger in newer games, or games released after launch, as well. That's part of the reason why it closed the gap: older games are swapped out of older benchmarks for newer titles which have been better optimized for AMD hardware, so on average its performance is now nigh on identical to the 1060 with today's relevant games (let's say 2017 titles).
It's not just about drivers from AMD bringing up performance. It's about devs optimizing for Polaris or other AMD cards once they've been released. So in this sense performance on average improves over time compared to Nvidia where the performance is much better on day 1.
Here's a task for you: compare the gap in performance between the cards using mid-2016 games versus games released from 2017 onwards and report your findings to me, I'd love to see that.
Or maybe there are more games that run better on AMD in 2017 than in 2016... 2018 can be the opposite.
Think a bit about it...
2016: 5 (AMD favor) + 6 (NV favor) + 9 (tie)
2017: 6 (AMD favor) + 5 (NV favor) + 9 (tie)
2018: 4 (AMD favor) + 7 (NV favor) + 9 (tie)
That has nothing to do with cards aging well, but with the different games released in each period.
I understand what you are saying, the same way I said that in 2018 somebody can claim "Nvidia cards aged well" using your own metrics.
How can you still fail to understand what we are saying?
I understand what you are saying, the same way I said that in 2018 somebody can claim "Nvidia cards aged well" using your own metrics.
Or maybe there are more games that run better on AMD in 2017 than in 2016... 2018 can be the opposite.
Think a bit about it...
2016: 5 (AMD favor) + 6 (NV favor) + 9 (tie)
2017: 6 (AMD favor) + 5 (NV favor) + 9 (tie)
2018: 4 (AMD favor) + 7 (NV favor) + 9 (tie)
That has nothing to do with cards aging well, but with the different games released in each period... the games released in 2017 are different and use different engines than the games released in 2016... the quantity is different too... even the benchmarks can't be compared, because there are so many differences.
That is what happens when you skew the data to create a point.
That is what happens when you skew the data to create a point.
That is exactly the failed claim I explained before... if you choose other games, it will show the 980 Ti increasing its lead over the Fury X.
Fury X v 980 Ti at launch
https://www.techpowerup.com/reviews/AMD/R9_Fury_X/31.html
Fury X v 980 Ti now
https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_56/31.html
You can do this same type of comparison for almost every AMD GPU v its Nvidia equivalent for the last 6 years.
That is exactly the failed claim I explained before... if you choose other games, it will show the 980 Ti increasing its lead over the Fury X.
You are comparing two different sets of data to make a point.
I can make any card look different from launch if I choose the games that will lead me to bad or good results.
There's a misunderstanding going on here. People who claim AMD cards 'age well' don't just mean it purely from the perspective of what you've presented above, i.e. improvements from drivers to older games. It's about the fact that Polaris cards perform relatively stronger in newer games, or games released after launch, as well. That's part of the reason why it closed the gap: older games are swapped out of older benchmarks for newer titles which have been better optimized for AMD hardware, so on average its performance is now nigh on identical to the 1060 with today's relevant games (let's say 2017 titles).
It's not just about drivers from AMD bringing up performance. It's about devs optimizing for Polaris or other AMD cards once they've been released. So in this sense performance on average improves over time compared to Nvidia where the performance is much better on day 1.
Here's a task for you: compare the gap in performance between the cards using mid-2016 games versus games released from 2017 onwards and report your findings to me, I'd love to see that.
Btw, what's up with those numbers from TechPowerUp? 192 fps in Doom @1080p with a 1080 Ti?? Hell, TechSpot (Hardware Unboxed) gets more fps at 1440p lol
Also, TechPowerUp has the Vega 56 winning in Doom... against a 1080 (with the Fury just 3 fps behind it lol). Is this Joker guy running the site? I think I don't want to trust this site anymore 🤣
I see, and it very well could be true. But it is impossible to prove that "fact".
Mainly because we have to take 1060 performance as given and unchanging over time to prove that Polaris "aged well" (we need some sort of baseline to work with). And that assumption is obviously false, because Nvidia and graphics engineers also spend time developing and improving their engines/games for Pascal.
But okay, let's crunch some data. Not because you gave me a task, btw; just because I want to find this out for myself.
edit: and done -->
Older (2014/2015/2016) games
(in comparison with GTX 1060 FE)
Assassins Creed Unity
RX 480 -13%
COD Advanced Warfare
RX 480 -11%
Dragon Age Inquisition
RX480 -12%
Far Cry 4
RX 480 +4%
GTA 5
RX 480 -4%
Metro LL
RX 480 -18%
Witcher 3
RX 480 -5%
Doom (on release)
RX 480 +15%
Dirt Rally
-12%
F1 2015
RX 480 -12%
Fallout 4
RX 480 -13%
Far Cry Primal
RX 480 -6%
Deus Ex:MD
+3%
Watch_Dogs 2
-5%
GoW 4
-5%
Titanfall 2
+7%
Mafia 3
-10%
BF 1 (DX12)
+10%
NMS
-24%
Dishonored 2
-20%
--> on average the RX 480 is 7.25% slower than a 1060 FE (20 games total).
2017 Games only
For Honor
-3%
Hellblade
-5%
Mass Effect Andromeda
- 9%
Prey
-12%
RE 7
+20%
Ghost Recon
- 10%
PUBG
-20%
The Surge
+1%
Rime
-27%
Nier
+4%
DoW 3
+ 0%
Halo Wars 2
-3%
Dirt 4
-5%
Styx shards of darkness
-25%
Sniper: Ghost Warrior 3
+9%
The RX 480 is on average 4.7% slower than a GTX 1060 FE (18 games total)
Performance improved by 2.53% this year (if 1060 FE performance is consistent). That's something, right?
[Benchmarks are mostly from computerbase, pcgh, gamegpu and techpowerup]
1060 v 480 at launch
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1060/26.html
1060 v 480 now
https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_56/31.html
This data was shown to me already, and I even replied to it on this very page. I explained there in detail why this data is invalid, and I even ran a quick analysis of it. Just scroll up and you'll be able to find it with ease.
You are talking about drivers improving existing games. I'm not debating that at all. I'm saying AMD performance increases relative to Nvidia's in many game averages because newer games tend to perform better on AMD. If you disagree with that, what's your explanation for AMD gaining performance year after year at most price points relative to Nvidia?
Apples in 2017 get increased performance over oranges in 2016.
C'mon Icecold.
Fury X v 980 Ti at launch
https://www.techpowerup.com/reviews/AMD/R9_Fury_X/31.html
Fury X v 980 Ti now
https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_56/31.html
You can do this same type of comparison for almost every AMD GPU v its Nvidia equivalent for the last 6 years.
I meant more along the lines of performance in future titles that are more demanding, without needing to run at 4K. For example, I wouldn't be too surprised if this time next year the Fury X is equal to or faster than the 1070 at 1440p in performance summaries.
The Fury X is a lovely card, but that 4GB of memory is not going to do it any favours vs 8GB; there are even games at 1080p that want over 4GB of memory if you max out the texture quality setting.
However, if you're not maxing out the texture quality setting, it's definitely got more to give in the games that don't play well with its memory. I too wouldn't be too surprised if it is equal to or faster than the Fury X in such scenarios.
I think it's the only flagship I've seen AMD release in the past 5 years with a flaw which could impact its capabilities in the future. The 7970 and R9 290X had more VRAM than their competitors, so they were already secure in that area for the future; they pretty much bested the competing products when you see how they perform now. The Fury X, however, is not so future-proof in that aspect, unfortunately.
EDIT: Equal to or faster than the Fury X? That doesn't make any sense; the 1070 is what I meant.
Mistakenly I appear to have somehow mixed up the Fury X and GTX 1070 with the RX 480 and the performance discrepancies it has with the GTX 1060 6GB. My apologies.
Looking back on the performance of the Fury X I see that it would be unlikely for it to decrease the gap between it and the GTX 1070, regardless of the memory it has. Meanwhile the situation with the RX 480 and GTX 1060 is entirely different, with the RX 480 clawing back the majority of the performance lead the GTX 1060 once had.
GTX 1060 vs. RX 480 - An Updated Review
I'm not quite sure how I managed to mix up the Fury X and RX 480 while still taking about the Fury X, epic fail on my part sorry.
It was a lovely little card, but its performance wasn't so hot for its price point; it only had 4GB of VRAM and it also didn't have much overclocking headroom when compared to the 980 Ti in roughly the same market segment.
With reference to what I posted earlier, DF has now tinkered with some of the basic improvements in the AMD GPU software: overclocking the memory alone gives a 5% uptick in games on Vega 56, and pushing the power to the max gives an increase of about 11% in games... all on a reference cooler.
Initially I thought it would be plausible for the Fury X to close in on the GTX 1070, but then, after taking a second look at its performance in games at the time and the difference in VRAM capacity, I thought it would be unlikely; I then somehow confused myself with the RX 480 and the Fury X in the process.
Your post a couple of months ago that I responded to:
My initial thoughts:
My doubt after confusing the Fury X and RX 480 and looking back on its performance:
Looking at the Vega 56 review from TechPowerUp kind of shows that my thoughts after taking a second look at the Fury X's performance were wrong, and that your thoughts have some validity to them. The irony is that my initial thoughts of the Fury X getting closer appear to be somewhat valid too, the same thoughts I went back on after confusing myself and taking a second look at the Fury X's performance.
It hasn't even been a year and the Fury X is close to or even faster than the GTX 1070 at 1440p in some of the newer games in TechPowerUp's Vega 56 review. I find it rather intriguing that it's able to hold up so well in some of the newer games against the GTX 1070. IIRC there was a slightly larger performance deficit between the cards at launch?
Here are some of the games where this happens as well as the overall performance summary which has the GTX 1070 leading. Unfortunately these are only average frame-rates, I do wonder how the minimum frame-rates would compare though.
(Links to make the post smaller, sorry about all of the images!)
Prey and Sniper Elite 4
Relative performance summary
Hehe, I guess the real "epic fail" was my second look at the performance and coming to the seemingly wrong conclusion.
I'm more interested in seeing how memory OC actually impacts Vega at much higher speeds. There might actually be a memory bottleneck somewhere...
Pascal was a major value leap over Maxwell. 1060=980, 1070=980Ti.