LordOcidax
Member
http://www.guru3d.com/articles_pages/amd_radeon_r9_rx_480_8gb_review,21.html
Where are the Witcher 3 benchmarks?
Has there been anything on the 470 yet? Can't seem to find much on it.
It's not moving anything, it lines up with what he originally said.
Some of the specs have been released, it's 2048 shaders and 4 GB only. Still no clock speed yet, just a vague ">4 TFLOPS" like the 480 was ">5 TFLOPS", so it's difficult to estimate how it'll perform.
No release date yet either.
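For a rough sense of what ">4 TFLOPS" implies: peak FP32 throughput for a GCN card is shaders × 2 FLOPs (FMA) × clock. A quick sketch, assuming the rumored 2048-shader count for the 470 (the RX 480 figures for comparison are the official 2304 shaders at 1266 MHz boost):

```python
# Peak FP32 throughput for a GCN GPU: shaders * 2 FLOPs (FMA) * clock.
def peak_tflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz * 1e6 / 1e12

# Minimum clock implied by a TFLOPS claim, given a shader count.
def min_clock_mhz(shaders, tflops):
    return tflops * 1e12 / (shaders * 2 * 1e6)

# ">4 TFLOPS" with 2048 shaders implies a clock of at least ~977 MHz.
print(min_clock_mhz(2048, 4.0))

# RX 480 at its 1266 MHz boost clock: ~5.8 TFLOPS.
print(peak_tflops(2304, 1266))
```

So if the 2048-shader figure is right, the 470 needs roughly a 1 GHz clock just to meet the floor of that claim; anything at 480-like clocks would land closer to 5 TFLOPS.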
Probably better bins. The article notes that 1 card out of the 4 they tested does those clocks, the other 3 were at 1.3-1.35GHz.
Kyle said the AIB cards will have already selected chips... so your lottery chances are better with them.
Well, that was some short-lived hope.
Well, I'm a bit underwhelmed, to be honest.
But really, a $200 card is going to perform like a $200 card. #shockhorror
How the hell does the card draw an extra 100w over the 6pin?
A $200 card is performing like Nvidia and AMD's $300+ card from the previous generation. I'm satisfied.
And the most important thing: it is much faster than the old $200 cards.
Not sure what you're looking at, but is it total system wattage maybe?
And what about the old $200 cards? The 960 and the 380... $300 would get you an 8GB R9 390 non-reference card.
A non-reference 8GB RX 480 is not $200; the reference 8GB 480s are already $240.
Add the cost of an AIB version and you're rubbing up right against the old prices.
I agree a $200 equivalent to the $300 card would be nice, but that isn't the case at all.
Reminds me of a car dealer proposing a car loan xD only x dollars a month* small text of doom
Hmm why is that one so much lower than this one?
Because the guy who runs or owns or edits the site is miffed over AMD snubbing him and HardOCP by not inviting him to Capsaicin or the E3 event and not sending cards for review, apparently. People on Reddit have been showing off some of his posts, and to be honest, yeah, he seems pretty salty about it.
Haven't read the review myself yet, but I'd be wary of it. Obviously if they're reporting lower numbers than everyone else, we know what's up.
Wow, if that's true, some people really can't be objective, and clearly don't give a damn about helping the consumer make an informed decision.
He conveniently sidestepped when I pointed out his hyperbole earlier; we know his m.o.
Since when does an AIB add $60 to the price? You have 1070 AIBs right now going for $400; that's $20 over the MSRP, not $60.
I also like how the power draw is concerning when comparing the 480 to an Nvidia card, but apparently the lower power draw of the 480 doesn't mean shit when you can get a 390 for only $60 more...
If there are 480 AIBs reaching $300, I expect them to have significant overclocks and handily beat the 390, especially after the first round of driver updates.
Your repeated attempts to make the 480 seem like a poor value are getting pathetic.
Use Guru3D, they are always spot on.
How the hell does the card draw an extra 100w over the 6pin?
HardOCP actually tests very differently than other sites. They change settings to find an acceptable frame rate (arbitrarily up to the reviewer), and run those same settings on other cards for an apples to apples comparison. Other review sites may throw on the default ultra setting and let the card rip. Results will differ because of settings.
It's funny, but HardOCP actually had the BEST results when it came to power consumption, showing a full system with an RX 480 drawing ~50 watts less than the same system using the GTX 970 during gaming. Which is in their review. On their forums they've said they're purchasing some retail cards to test further.
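Their "highest playable settings" approach can be sketched roughly like this (the preset ladder and fps target are hypothetical, just to illustrate the method, not HardOCP's actual tooling):

```python
# Sketch of HardOCP-style testing: walk quality presets from highest to
# lowest on the review card until a real gameplay run meets a playability
# target, then benchmark every other card at those exact settings.
PRESETS = ["ultra", "high", "medium", "low"]  # hypothetical preset ladder

def find_playable_preset(bench, target_min_fps=30):
    """bench(preset) -> minimum fps observed over a gameplay run."""
    for preset in PRESETS:
        if bench(preset) >= target_min_fps:
            return preset
    return PRESETS[-1]  # nothing meets the target; use the lowest preset

def compare(cards, game_bench, target_min_fps=30):
    """Pick settings on the first card, then run all cards at them."""
    preset = find_playable_preset(lambda p: game_bench(cards[0], p),
                                  target_min_fps)
    return preset, {card: game_bench(card, preset) for card in cards}
```

The key point is that every card runs the same settings, so the numbers are apples to apples, but the chosen settings themselves depend on the reviewer's playability judgment, which is why results differ from sites that just run the default ultra preset.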
They use the best settings they can get with the card.
lol
They are really good in their tests.
While the story about them not getting an invite to the event is true, the review is one of the best you can read, and it shows the RX 480 in a better light than most reviews on the internet.
It's running the PCI-e slot and 6-pin power connector out of spec.
https://www.reddit.com/r/Amd/comments/4qfwd4/rx480_fails_pcie_specification/
Yes, HardOCP are known for using actual gameplay to test. This makes their reviews the most subjective of any, but they are the only ones who actually know what it's like to play games with the cards, because they actually play games with them.
I like to read their reviews for the commentary and subjective opinion, but for raw numbers they aren't the best, nor do they pretend to be; they're not about millions of bar charts like every other review.
They use the best setting they can get with the card.
1920x1080
AA On
Graphic Preset Ultra
HBAO+
Processing Preset High
HairWorks Off
The other graph is using Max quality (I don't know what that means) and possibly HBAO disabled.
Witcher 3 specifically you shouldn't compare benchmarks from different sites, it doesn't have a built in benchmark so every site uses a different test run.
What in the hell is going on. I read through the thread. That is not looking good, at all.
If drivers end up changing the performance over time, are there any sites that give updated performance results? Not as part of the review, of course, but just in some kind of database or an update article.
The back-pedaling/misleading is ridiculous. When has the power consumption of only the GPU chip ever been relevant? People only care about the power consumption of the entire card.
Looks like the thread was even removed at one point but brought back because people were pissed.
lol
They are one of the most reliable benches you can find... their approach is different from all other reviews because they choose to test the best playable settings you can get with the card at that resolution, and they show how the framerate holds up over 30 minutes.
BTW, they are getting better results for the RX 480 than most other reviews.
No, but it's drawing power beyond the rating of the board and 6-pin, and that is cause for concern. I don't care as much about the power draw if the card stays relatively cool, OCs well, and performs well. But I don't want to damage my board by running the card.
Says you maybe...
Dude. This card annihilates the 960 and 380/380X, which were the $200 cards mere months ago. I built a PC a few months back, I know what the old recommendations at that price point were.
You are kidding yourself if you think the 480 isn't a significant improvement over the 960 and 380.
The Nitro has an 8-pin. Hopefully the other ones follow.
Hopefully AIBs fix that issue. Even so, it seems like a low threat. Although a cheap card like this will attract penny pinchers with cheaper/older MBs.
Sorry forgot the /s in my post
I was also referring to how people are disappointed the 480 doesn't decimate the 970/390/390X.
One of those things is far more likely to happen than the other >.>
if the AIB cards hit 1500 and reach Furys, I'm in
Thank goodness. I thought you were sincere.
I'm considering switching from a 970 to a 480 since I can sell my 970 and get the 480 for basically free, and I would finally be free of the shackles of 3.5 GB of VRAM. I want to support AMD.
It pulls more power than the "official" 75W + 75W = 150W that the PCI-e slot and 6-pin power connector can technically allow. TPU measured it pulling 163W, and other sites have measured similar draw. When overclocked, it can apparently pull over 200W from connectors rated for only 150W.
Is this seriously potentially harmful? Nobody seems to know, up until now video cards have respected the PCI-e power specs. The 1080 is like all Nvidia cards which have BIOS-locked power limits, you cannot make a single 8-pin connector 1080 pull more than 75 + 150 = 225W no matter how hard you try, the BIOS will throttle the card back.
There are of course custom 1080's with more than 1 connector, and those have higher power limits like the MSI 1080 Gaming X which has a BIOS that lets the card pull 240W from the 2x 8-pin connectors. Other custom 1080's are similar.
This wouldn't be a problem if AMD weren't so hellbent on fitting a 6-pin connector on a card that draws power like the 1070 and 1080, cards which have 8-pin connectors. Why they would do this is a mystery, just put a damn 8-pin connector on the reference card and be done with it. This is a really dumb mistake that AMD made and there was absolutely no good reason to make it, they know what the damn PCI-e specs are and they know damn well they shouldn't exceed them. The specs aren't decorative or anything.
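The arithmetic above can be sketched as a quick sanity check (the connector limits are the PCI-e spec figures quoted in this thread; the 163W figure is TPU's stock measurement):

```python
# PCI-e board power budget sanity check (sketch).
# Spec limits: 75W from the x16 slot, 75W per 6-pin, 150W per 8-pin.
SLOT_W = 75
CONNECTOR_W = {"6-pin": 75, "8-pin": 150}

def board_power_limit(connectors):
    """Total in-spec board power for a card with the given connectors."""
    return SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

def in_spec(measured_w, connectors):
    return measured_w <= board_power_limit(connectors)

# Reference RX 480: one 6-pin connector, TPU measured ~163W at stock.
print(board_power_limit(["6-pin"]))   # 150
print(in_spec(163, ["6-pin"]))        # False: over budget even at stock
print(in_spec(163, ["8-pin"]))        # True: an 8-pin design has headroom
```

Which is the whole complaint: a single 8-pin would have given the same card a 225W in-spec budget, comfortably covering both the stock 163W draw and the 200W+ overclocked figures.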
Oops, double post.
My cynical side tells me they wanted to match/beat the 970/390 and had to up the clocks at the last minute. If not that then they just took a calculated risk that most modern PCI-E slots could handle the extra draw and saved some pennies on the card/projected a low power consumption image.
To be honest, Nvidia and AMD reference cards are trash, just in varying degrees.