
Radeon RX480 Review Thread, Launching Now!

This should be dope, but expensive

[Image: ASUS ROG Strix RX 480]

Don't do it to me.

Would snap one of those up if it can clock to 1450MHz and above.
 

gabbo

Member
It can break cheaper motherboards, but more expensive ones will be fine.

While that's a step in the right direction, I have no idea what counts as a cheap motherboard nowadays. I don't want to think my Asus P8Z77-V LK is 'cheap', but it's a bit old.
From the look of things, I don't need to worry much, but I'll hold off any purchases until I see more.
 

They really have no shame

'no guys we meant ASIC power consumption'

No one ever talks about that; it's ALWAYS the power draw of the entire card, always.

What a shameless (and probably partially successful) attempt to misdirect and derail

On a more positive note for AMD, performance-per-dollar for this card is the best on the market, and for the 4GB $199 model by some distance at that. Nothing comes close to that one.

[Chart: performance per dollar, 1920×1080]

[Chart: performance per dollar, 2560×1440]


https://www.techpowerup.com/reviews/AMD/RX_480/26.html

10 percent better performance/dollar than an R9 290, for similar performance, compared to a card that is several years old.

Again, performance/dollar improvements last gen (going from Kepler to Maxwell) were over 70 percent.

Do people have such short memories?

Worst value increase from a GPU launch since the GeForce 6800/Radeon X1800. AMD and Nvidia setting records...
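For anyone who wants to sanity-check figures like that 10 percent: perf/$ improvement is just a ratio of ratios. A quick sketch with placeholder performance scores and street prices (not TPU's actual data):

```python
# Perf-per-dollar with hypothetical inputs: a relative performance
# score and a price in dollars.
def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    return relative_perf / price_usd

# Placeholder values: R9 290 ~ 100 points at a $250 street price,
# RX 480 8GB ~ 105 points at $239.
r9_290 = perf_per_dollar(100, 250)
rx_480 = perf_per_dollar(105, 239)

improvement = (rx_480 / r9_290 - 1) * 100
print(f"perf/$ improvement: {improvement:.0f}%")  # prints "perf/$ improvement: 10%"
```

Plug in whatever perf scores and prices a given review uses; the percentage obviously swings a lot depending on whether you compare launch prices or current street prices.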
 

ethomaz

Banned
Reposting.
No.

That is games running in DX11 vs games running in DX12, not a comparison of DX11 vs DX12. What you showed is that different games perform differently on different cards.

What you need is the same game running in DX11 vs running in DX12, to see whether the DX12 patch gives a bigger performance boost to the RX 480 than to other cards.

You are claiming the RX 480 gains more going from DX11 to DX12 than other cards, but no benchmark shows that...
 

LordOfChaos

Member
Because its TDP is also the same as the 1080's. Yeah, the prices aren't equal, but from a tech point of view it's clear there is a way superior architecture, like a gen apart.

We don't know about the superior architecture part. AMD is using GlobalFoundries and Nvidia is using TSMC, unlike before when they were both on TSMC, and I think AMD is suffering for it. AMD was legally bound to do this by a wafer supply agreement.

It's likely some of both, but we won't know how much of it is Polaris vs the fab until they make TSMC parts.

They are, however, still using TSMC for their upcoming higher end.
 

chaosblade

Unconfirmed Member
Initial disappointment out of the way, I'll probably still get a non-reference 480 once they hit. At worst it's a 970; it's a decent upgrade from my 760, DX12 performance looks great, I'll get adaptive sync, and I feel like it will probably see some big performance increases because I just can't imagine it's anywhere near its potential right now.

It just baffles me that AMD's 5.8TF part with various architectural improvements would not perform close to their previous 5.9TF part, but instead their 5.1TF one. That's over a 10% regression.
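Those TF numbers all come from the standard theoretical-throughput formula (shader count × 2 FLOPs per FMA × clock), assuming spec boost clocks:

```python
# Theoretical FP32 throughput in TFLOPS: shaders * 2 FLOPs (FMA) * clock in MHz.
def tflops(shaders: int, clock_mhz: int) -> float:
    return shaders * 2 * clock_mhz / 1e6

print(f"RX 480:  {tflops(2304, 1266):.2f} TF")  # 5.83 at the rated boost clock
print(f"R9 390:  {tflops(2560, 1000):.2f} TF")  # 5.12
print(f"R9 390X: {tflops(2816, 1050):.2f} TF")  # 5.91
```

Of course these are peak figures; if the 480 spends most of its time below 1266MHz, as reviews suggest, its effective TF number drops accordingly.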
 
Right now the reference model is going for €270 over here in the Netherlands.

https://tweakers.net/pricewatch/558305/sapphire-radeon-rx-480-8gd5.html

I'm not hopeful for a sub-300 Strix release. :(

Well, £280 would be around €338. I think the AIBs will be around that.

They really have no shame

'no guys we meant ASIC power consumption'

No one ever talks about that; it's ALWAYS the power draw of the entire card, always.

What a shameless (and probably partially successful) attempt to misdirect and derail



10 percent better performance/dollar than an R9 290, for similar performance, compared to a card that is several years old.

Again, performance/dollar improvements last gen (going from Kepler to Maxwell) were over 70 percent.

Do people have such short memories?

Worst value increase from a GPU launch since the GeForce 6800/Radeon X1800. AMD and Nvidia setting records...

Well when you put it like that...

I was just posting how TPU put it, perf per dollar. But I agree, this totally ignores the historical context of the cards listed.
 

wachie

Member
10 percent better performance/dollar than an R9 290, for similar performance, compared to a card that is several years old.

Again, performance/dollar improvements last gen (going from Kepler to Maxwell) were over 70 percent.

Do people have such short memories?

Worst value increase from a GPU launch since the GeForce 6800/Radeon X1800. AMD and Nvidia setting records...
More hyperbole.

Let me put you in your place.

Kepler to Maxwell - 28% improvement in fps/$
http://techreport.com/review/27067/nvidia-geforce-gtx-980-and-970-graphics-cards-reviewed/13

Tonga to Ellesmere - 28% improvement in fps/$
http://techreport.com/review/30328/amd-radeon-rx-480-graphics-card-reviewed/13

Please carry on.
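The TR-style fps/$ comparison is easy to reproduce. A sketch with made-up fps and price numbers, chosen only to show the math (not TechReport's data):

```python
# Generational fps-per-dollar gain, in percent.
def fps_per_dollar_gain(old_fps: float, old_price: float,
                        new_fps: float, new_price: float) -> float:
    return ((new_fps / new_price) / (old_fps / old_price) - 1) * 100

# Hypothetical: old card at 60 fps for $330, new card at 64 fps for $275.
print(f"{fps_per_dollar_gain(60, 330, 64, 275):.0f}%")  # prints "28%"
```

The point of doing it this way is that the metric folds price cuts and performance gains into one number, which is exactly why the Kepler-to-Maxwell and Tonga-to-Ellesmere figures can come out identical despite very different raw performance jumps.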
 

wachie

Member
No.

That is games running in DX11 vs games running in DX12, not a comparison of DX11 vs DX12. What you showed is that different games perform differently on different cards.

What you need is the same game running in DX11 vs running in DX12, to see whether the DX12 patch gives a bigger performance boost to the RX 480 than to other cards.

You are claiming the RX 480 gains more going from DX11 to DX12 than other cards, but no benchmark shows that...
No.

The original point of disagreement was that you were insinuating that DX11 will continue to be the norm, and consequently there won't be any tangible benefit for the 480 until 2018/19. That is not the case. I never said or implied that the DX12 version of a given game ran better, but given how AMD has had CPU utilization issues in DX11, that may well be true.
 

ethomaz

Banned
No.

The original point of disagreement was that you were insinuating that DX11 will continue to be the norm, and consequently there won't be any tangible benefit for the 480 until 2018/19. That is not the case. I never said or implied that the DX12 version of a given game ran better, but given how AMD has had CPU utilization issues in DX11, that may well be true.
But that is exactly my point... DX12 won't show real benefits until 2018/2019, when games start to be built with it.

You are using DX11 games vs DX12 games to show the opposite, instead of using the same game running in DX11 vs running in DX12.

You can't use different games to gauge the benefits of a new API. I maintain that it is not evident yet, and we will need to wait until the API becomes the standard... that takes years.

DX12 will start to show real benefits when engines are built from scratch on DX12... it is DX9, DX10, DX11, etc. all over again.
 

wachie

Member
But that is exactly my point... DX12 won't show real benefits until 2018/2019, when games start to be built with it.

You are using DX11 games vs DX12 games to show the opposite, instead of using the same game running in DX11 vs running in DX12.

You can't use different games to gauge the benefits of a new API. I maintain that it is not evident yet, and we will need to wait until the API becomes the standard... that takes years.
Uhh what?

The 480 consistently performs better in DX12. If more games get a DX12 path, more benefit for 480 users, devs don't have to code from scratch in DX12 exclusively for the benefits to be obvious.
 
It just baffles me that AMD's 5.8TF part with various architectural improvements would not perform close to their previous 5.9TF part, but instead their 5.1TF one. That's over a 10% regression.

It's only 5.8TF with the boost clock of 1266MHz. It doesn't actually clock that high in most instances. The R9 390(X) on the other hand doesn't have a boost clock, and has 50% more memory bandwidth (Polaris' better memory compression can't always make up for that).
So it's not really surprising in the end.
 
Uhh what?

The 480 consistently performs better in DX12. If more games get a DX12 path, more benefit for 480 users, devs don't have to code from scratch in DX12 exclusively for the benefits to be obvious.
Nvidia has always been late to introduce DirectX features, but when it does, it tends to be successful (cough, NV30...). By the time DX12 is established you will have Volta out with async done efficiently.
 

elfinke

Member

Mrbob

Member
So is this THE 1080p card now, or would the 1070 be a better bet for solid 60fps?

From the benches on Guru3D, it seems the 1070 performs significantly better than the 480 at 1080p. It is up to you how much you want to spend, though. You aren't hitting 60FPS without some compromise on the 480 at 1080p; the 1070 seems able to max out and stay over 60FPS in most cases. I use The Witcher 3 bench as one of my biggest guides, as the game is more graphically intense than others. Huge difference between the 1070 and 480; the 480 can't even hold 60FPS.

Personally I'm happy I bought a 1070 and didn't wait for the 480. 480 isn't the performance jump I want in a video card.

For 240 bucks (get the 8GB card, people; games take over 4GB now at 1080p) it is an excellent buy. But one can argue paying the premium for the GTX 1070 is worth it as well. Simply a matter of how much you want to invest.

If you want solid 60FPS look towards the 1070.
 

ethomaz

Banned
Uhh what?

The 480 consistently performs better in DX12. If more games get a DX12 path, more benefit for 480 users, devs don't have to code from scratch in DX12 exclusively for the benefits to be obvious.
Show me the RX 480 doing better with a DX12 patch enabled vs DX11 with it disabled, and we can start to talk about how the RX 480 performs better with DX12 over DX11.

For now DX12 is a future expectation, and nobody knows what the scenario will be once devs start using it from the ground up rather than as a band-aid... I remember the FX 5000 looking good at DX9 until devs started making native DX9 games lol
 
It's only 5.8TF with the boost clock of 1266MHz. It doesn't actually clock that high in most instances. The R9 390(X) on the other hand doesn't have a boost clock, and has 50% more memory bandwidth (Polaris' better memory compression can't always make up for that).
So it's not really surprising in the end.
32 ROPs and that memory bandwidth should be enough to beat a similar-flops chip at 1080p. The arch improvements are backwards...
I wonder where the Santa Clara and Marlborough architecture teams are... they must be almost all at Nvidia.
 

wachie

Member
Show me the RX 480 doing better with a DX12 patch enabled vs DX11 with it disabled, and we can start to talk about how the RX 480 performs better with DX12 over DX11.

For now DX12 is a future expectation.
Let me repeat once again since you seem to be missing the entire point. I'm not comparing 480 DX11 vs 480 DX12.

The 480 performs between the ref 970 and ref 980 in DX11. The 480 performs like a ref 980, and on occasion faster, in DX12. AMD can leverage the DX12 performance advantage right now; they don't have to wait till 2018/19.
 

thefil

Member
This is a great popcorn thread. As someone rocking a 4-year old card that I probably won't upgrade for another year at least, the passion and angst in here makes me smile.
 

martino

Member
This is a great popcorn thread. As someone rocking a 4-year old card that I probably won't upgrade for another year at least, the passion and angst in here makes me smile.

Yeah, this is why I upgraded more than a year ago. A good time to change will be next round, with HBM and better optimization of the architecture on the current node size.
 
More hyperbole.

Let me put you in your place.

Kepler to Maxwell - 28% improvement in fps/$
http://techreport.com/review/27067/nvidia-geforce-gtx-980-and-970-graphics-cards-reviewed/13

Tonga to Ellesmere - 28% improvement in fps/$
http://techreport.com/review/30328/amd-radeon-rx-480-graphics-card-reviewed/13

Please carry on.

Not worth it, man. I posted a direct quote from videocardz about 480 benches, both stock and overclocked. He accused me of cherry-picking the higher score even though both were in the quote. He then went on to compare prices of 4GB 970s to 8GB 480s to try and make the 970 seem like a slightly-not-as-bad alternative to the 480. He will ignore any criticism and keep posting misinformation.
 
Man, I tried to be optimistic about the power draw because I thought if anything, AMD would try to make this a very efficient GPU (after all that's what they tried to market it as) regardless of performance. But it's way worse than I expected.

If it was ~120w power draw for 980 performance, it would have been impressive (and I don't doubt the 1060 will be around that). As it stands though, it's just... "ok".
 

Mrbob

Member
Many of us survived the power consumption and heat of the GTX 470 and GTX 480, I feel like we are nitpicking now.

Still it is crazy how efficient Nvidia has made their new cards alongside being powerful.
 

ZOONAMI

Junior Member
I really don't understand why so many people are concerned about power draw, especially on a 150w gpu. I guess if your energy prices are astronomical or you have a 300w or less psu...
 

wachie

Member
Not worth it, man. I posted a direct quote from videocardz about 480 benches, both stock and overclocked. He accused me of cherry-picking the higher score even though both were in the quote. He then went on to compare prices of 4GB 970s to 8GB 480s to try and make the 970 seem like a slightly-not-as-bad alternative to the 480. He will ignore any criticism and keep posting misinformation.
I expect the same again ;)
 

ethomaz

Banned
Let me repeat once again since you seem to be missing the entire point. I'm not comparing 480 DX11 vs 480 DX12.

The 480 performs between the ref 970 and ref 980 in DX11. The 480 performs like a ref 980, and on occasion faster, in DX12. AMD can leverage the DX12 performance advantage right now; they don't have to wait till 2018/19.
Are you reading what I'm posting?

The RX 480 performs better in some DX11 games too... it is not a big deal, because performance varies from game to game.

What you are trying to say is that it's because of DX12 and not the game... while there is the possibility that the same game performs better on the RX 480 even in DX11. So I asked you for benches showing the RX 480 performing better with DX12 enabled vs disabled.

You are focusing on a single point that doesn't prove anything at all... let's check the link you posted.

DX12 Ashes of the Singularity: a game that favors AMD, and the results show it between the 970 and 980, like the DX11 results.

DX12 Hitman: the RX 480 performs better than the GTX 980, but if you go back to Hitman in DX11 you will see the game already runs better on AMD, with the RX 480 beating the GTX 980.

[Chart: Hitman DX12 results]

VS

[Chart: Hitman DX11 results]


The performance is a result of the game and not DX11/DX12... in fact, the only card you could say benefits a bit from DX12 in this bench is the R9 390.

Quantum Break: the RX 480 performs better than the GTX 980, but there is no DX11 option to see if it is the API or the game.

Rise of the Tomb Raider: another game where it performs better than the GTX 980 in DX12... and in DX11 it performs below the GTX 970... that is the one case where what you are saying is actually happening.

After seeing the results for the 4 games with some DX12-related feature vs the 10 DX11 games in the same reviews, the conclusion is...

1 game shows better results in DX12 than DX11 for the RX 480
2 games show the same results across DX11 and DX12
1 game can't be compared because it only exists in DX12

Understood now? Your claim has no basis... but maybe I should blame the reviewer who made that bad graph of the DX12 performance average lol... shameless to publish such a weak analysis trying to show the RX 480 performing better in DX12.
 

DonMigs85

Member
I really don't understand why so many people are concerned about power draw, especially on a 150w gpu. I guess if your energy prices are astronomical or you have a 300w or less psu...

I don't care much if it's for a desktop, but it doesn't bode well for their mobile parts once Pascal hits laptops.
 

Xenus

Member
I really don't understand why so many people are concerned about power draw, especially on a 150w gpu. I guess if your energy prices are astronomical or you have a 300w or less psu...

Yeah, the only really bad part is the reports of drawing more than PCI-E spec from the actual PCI-E slot. Otherwise I couldn't care less if it's less efficient, as long as the price is right. The efficiency aspect has connotations for the higher-end part, assuming it's not all GloFo, and for the consoles; but those were built at TSMC previously, so they would probably be built there going forward as well.
 

Durante

Member
Since when has power draw outside of some insane scenario ever even been a passing consideration in PC building?
Well, at the very least since HTPCs have become a thing.

But far more importantly, power efficiency in GPUs for a few generations now has been directly indicative of an architecture's ability to compete at the high-end. As such, AMD's failure to catch up in efficiency could well mean that they also won't be able to directly compete in that area, which in turn means less price pressure there, which is bad news.

I expected this card to consume ~130W in-game, and people thought that was pessimistic.
 

chaosblade

Unconfirmed Member
The main concern about power draw seems to be that it exceeds PCIe spec by drawing over 75W through the slot. It's ridiculous that's a thing; if it even came close to 150W at their target specs, they should have used an 8-pin or dual 6-pin connector.

At least there is a good chance non-reference cards fix that.
 

rrs

Member
I really don't understand why so many people are concerned about power draw, especially on a 150w gpu. I guess if your energy prices are astronomical or you have a 300w or less psu...
It could matter if the card runs better on 2x6-pin or 8-pin setups instead of instantly hitting TDP caps on the reference hair dryer.
 

wachie

Member
Look at the lead in DX11 and DX12.

I'm not even sure how to react when people can read both our posts and make sense of who's blowing smoke. While you're at it, send an email to the reviewer at HC and let him know what a fool he is.
 

Irobot82

Member
Well, at the very least since HTPCs have become a thing.

But far more importantly, power efficiency in GPUs for a few generations now has been directly indicative of an architecture's ability to compete at the high-end. As such, AMD's failure to catch up in efficiency could well mean that they also won't be able to directly compete in that area, which in turn means less price pressure there, which is bad news.

I expected this card to consume ~130W in-game, and people thought that was pessimistic.

Compete in the high end, or compete in high-end efficiency? The Fury X wasn't as fast as the 980 Ti, but it competed.
 
I really don't understand why so many people are concerned about power draw, especially on a 150w gpu. I guess if your energy prices are astronomical or you have a 300w or less psu...

Yeah, I didn't buy a 650-watt power supply for nothing. I'm considering a 1070 (3-4yr card) or 480 (2yr card). If it takes 200 watts to get the 480 to 1400MHz it's not ideal, but it wouldn't stop me from buying it.

I think some people really want AMD to be more competitive (Durante made a great point), but I also think the power draw is ammo for fanboys. The 290/290X were better than Nvidia's cards, and then Maxwell came along and performance was more or less the same, so all you could point to was power draw numbers to claim superiority.
 

Mareg

Member
So is it finally the time to replace my 7950 gigahertz edition ?
That card has served me well for the past 3-5 years. I can't remember. It is still performing admirably. 1080p for life !
 

ethomaz

Banned
Look at the lead in DX11 and DX12.

I'm not even sure how to react when people can read both our posts and make sense of who's blowing smoke. While you are it, send an email to the reviewer at HC and let him know what a fool he is.
Like I said, show me the superior DX12 performance from the RX 480... if one game is all you have, then I will stay with my line of thinking that DX12 won't be a thing before 2018.

And about the email... why bother so much... let him fool his readers.

Edit - Another DX12 vs DX11 graph.

[Charts: DX12 vs DX11 benchmarks]
 
I really don't understand why so many people are concerned about power draw, especially on a 150w gpu. I guess if your energy prices are astronomical or you have a 300w or less psu...

Because lower wattage for better performance usually means a better TDP, which generally leads to better OC'ing headroom, so it has a direct correlation with performance. In addition, a lot of people are doing micro/HTPC builds with 500W or lower PSUs. For people who really think about overall costs it does factor in too. But I think mainly it's because it's shocking how much heat and how little OC'ing room the reference card has, especially compared to Nvidia's offerings over the last few years.
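On the "overall costs" point, the electricity side is small and easy to estimate. A sketch with hypothetical numbers (extra draw, play time, and rates will vary):

```python
# Rough annual cost of extra GPU power draw. All inputs are hypothetical.
def annual_cost(extra_watts: float, hours_per_day: float,
                price_per_kwh: float) -> float:
    return extra_watts / 1000 * hours_per_day * 365 * price_per_kwh

# e.g. 50 W extra draw, 3 hours of gaming per day, $0.12/kWh
print(f"${annual_cost(50, 3, 0.12):.2f} per year")  # prints "$6.57 per year"
```

Which is why, for most desktop users, the heat/noise/OC-headroom implications matter far more than the power bill itself.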
 

ZOONAMI

Junior Member
Yeah, I didn't buy a 650-watt power supply for nothing. I'm considering a 1070 (3-4yr card) or 480 (2yr card). If it takes 200 watts to get the 480 to 1400MHz it's not ideal, but it wouldn't stop me from buying it.

I think some people really want AMD to be more competitive (Durante made a great point), but I also think the power draw is ammo for fanboys. The 290/290X were better than Nvidia's cards, and then Maxwell came along and performance was more or less the same, so all you could point to was power draw numbers to claim superiority.

Exactly how I felt about the 980. It was like, really guys, it isn't even better than an aftermarket 290X, so you're just buying into Nvidia's TDP marketing? I get it if you have high energy prices or a small PSU, or if you're really concerned about a minimal change to your carbon footprint... but otherwise the 980 wasn't really that impressive imo.

Stealth edit: the 970 was a great perf/dollar card, so I changed my post ;) but then there was the lying about the VRAM setup.

It's also how I feel about the 1080, given a 980 Ti is basically just as good and launched a while ago at a cheaper price. Again Nvidia is pushing TDP like it's some huge deal, when in most usage cases it isn't, especially when we're talking about a $700 card.
 

Irobot82

Member
Like I said show me the superior DX12 performance from RX 480... if one game is all you have then I will stay with my line of thinking that DX12 will be something not before 2018.

And about the email... why bother so much... let him fool his readers.

Edit - Another DX12 vs DX11 graph.

What is this? Does this mean AMD's GCN improvements have fixed some of their driver overhead?
 
Exactly how I felt about the 980. It was like, really guys, it isn't even better than an aftermarket 290X, so you're just buying into Nvidia's TDP marketing? I get it if you have high energy prices or a small PSU, or if you're really concerned about a minimal change to your carbon footprint... but otherwise the 980 wasn't really that impressive imo.

Stealth edit: the 970 was a great perf/dollar card, so I changed my post ;) but then there was the lying about the VRAM setup.
Durante said it right. The tech behind the 980 is the tech that made the Titan X possible. In the same way, the tech behind the 1080, with double the performance at almost the same TDP as an RX 480, will make us roll our eyes at the 1080 Ti.
 

element

Member
As silly as it sounds I wish more of the benchmarks were done from the perspective of someone upgrading. People with GeForce 600, 700 and Radeon HD 7000/8000.
 
I bet we'll see some big improvements over time. But that's wishful thinking. Right now, I'm seeing the 1070 as my next GPU.

Go for it. What it really comes down to is whether you're willing to pay twice the cost for that extra 50% performance. Are you looking to get high/max settings at 60+fps at 1440p consistently? Then go the 1070 route; it's for you. If you're someone who wants high/max settings at 1080p, the 480 is the more logical choice. These cards exist for different market sectors. Maybe you're just in that higher market.
 