
Newest AMD R9 300 series rumors: 380 ~$330, 380X ~$400, 390 ~$700, 390X ~$700+, 300W+

Comparing two reference cards across a wide range of benchmarks is now "spreading FUD and horseshit". Ok, whatever you say.

I've been in this for a while, and reference cards are how GPUs have been compared for over two decades now.

You are deliberately comparing a model you fucking KNOW throttles under load; non-reference designs don't do that. The 980 is 10% faster than the 290X at 1080p, less than that at 4K when neither card is throttling and neither is overclocked. Hell, that bench shows the 290 only being 1% slower than the 290X, which is complete nonsense too, and I call into question the legitimacy of your source. I don't give a rat's ass about the reputation you have around here for your work on Dark Souls and GeDoSaTo; when you are full of shit I'm going to call you out on it.
 

tuxfool

Banned
You are deliberately comparing a model you fucking KNOW throttles under load; non-reference designs don't do that. The 980 is 10% faster than the 290X at 1080p, less than that at 4K when neither card is throttling and neither is overclocked. Hell, that bench shows the 290 only being 1% slower than the 290X, which is complete nonsense too, and I call into question the legitimacy of your source. I don't give a rat's ass about the reputation you have around here for your work on Dark Souls and GeDoSaTo; when you are full of shit I'm going to call you out on it.

Dude, cool down. If you can produce some evidence of benchmarks on a non-reference 290X vs 980, it would benefit everyone here.
 

The Llama

Member
Boy-That-Escalated-Quickly-Anchorman.gif


Thread pls
 

WolvenOne

Member
Okay, question. Seeing as the 390X is supposed to be about a 300-watt card, would it be safe to assume that the lower-tier cards below it would be lower wattage, or at least about the same?

My current PSU should be able to handle a 300-watt card comfortably, but I'd hate to go a tier or two down only to see the wattage requirements skyrocket.
 

FLAguy954

Junior Member
Good thing this is a rumor, because that pricing scheme looks like bullshit. I have never seen AMD sell their top two flagship cards so close in price to each other. If they want some market share back, they would be wise to price the 290/290X replacements appropriately.

I'm okay with the card being a 300W monster. Power consumption doesn't deter my choice to purchase a card (but price will).
 
Dude, cool down. If you can produce some evidence of benchmarks on a non-reference 290X vs 980, it would benefit everyone here.

http://www.guru3d.com/articles_pages/asus_geforce_gtx_980_poseidon_review,13.html

Certainly. The 290X is using the newest drivers; the model marked GTX 980 is at reference clocks. Ignore all the overclocked comparisons (it's a review of an overclocked 980). Check every fucking game; it's nowhere near 25%.

It's trading blows with the 980 in every single game at 1440p and 4K, and coming damned close at 1080p.
25% is a complete fucking lie; I don't give a rat's ass who claims it. Consider that it's a year and a half older than the 980, and it's just embarrassing for Nvidia. The 390X should have no trouble stomping the Titan X into the ground at $300-400 cheaper.
 

Durante

Member
You are deliberately comparing a model you fucking KNOW throttles under load; non-reference designs don't do that. The 980 is 10% faster than the 290X at 1080p, less than that at 4K when neither card is throttling and neither is overclocked. Hell, that bench shows the 290 only being 1% slower than the 290X, which is complete nonsense too, and I call into question the legitimacy of your source. I don't give a rat's ass about the reputation you have around here for your work on Dark Souls and GeDoSaTo; when you are full of shit I'm going to call you out on it.
You shouldn't care about my reputation, but perhaps you should be slightly less aggressive in your goalpost moving, and perhaps a bit less agitated overall. Just an idea.

Maybe it's just me, but accusing people of "spreading FUD and horseshit because of a personal bias" because they use the same benchmarking procedure which has been common in GPU comparisons for two decades might be, I don't know, insane?
 

viveks86

Member
Okay, question. Seeing as the 390X is supposed to be about a 300-watt card, would it be safe to assume that the lower-tier cards below it would be lower wattage, or at least about the same?

My current PSU should be able to handle a 300-watt card comfortably, but I'd hate to go a tier or two down only to see the wattage requirements skyrocket.

Assuming the rumor holds, we are still unclear what 300+ means. Is it 301? 310? 320? We don't know. Given that HBM is expected to be low power, it's unclear how that relates to the lower-tier non-HBM cards. It would be reasonable to expect lower-tier cards to consume less power, though.
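If you want to sanity-check your own PSU, here is a minimal headroom sketch in Python; every component wattage below is an illustrative assumption, not a figure from the rumor.

    # Rough PSU headroom check with assumed, illustrative wattages.
    gpu_w = 300   # rumored 390X board power
    cpu_w = 95    # hypothetical CPU TDP
    rest_w = 60   # rough allowance for board, RAM, drives, fans
    psu_w = 650   # example PSU rating, not a recommendation

    total_w = gpu_w + cpu_w + rest_w
    # Common rule of thumb: keep sustained draw comfortably below the PSU's
    # rating (roughly 80% or less) to leave headroom.
    print(f"Estimated load: {total_w} W, {total_w / psu_w:.0%} of a {psu_w} W unit")

With these assumed numbers the system sits around 70% of the PSU's rating; swap in your own CPU and PSU figures to see where you land.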
 
Guru3D just did benches on the newest drivers; that result is fucking nonsense as well. The 290X is NOT 33% slower, and it's not 20% slower either. In EVERY FUCKING GAME IN THE TEST SETUP the 290X is 10% slower at worst, and just as fast or slightly faster at best. Embarrassing for a $560 card that came out 16 months after the 290X, a $300 card.

Which Guru3D benches are you talking about? Could you please link them? Also, stop insulting people.
 

Momentary

Banned
You are deliberately comparing a model you fucking KNOW throttles under load; non-reference designs don't do that. The 980 is 10% faster than the 290X at 1080p, less than that at 4K when neither card is throttling and neither is overclocked. Hell, that bench shows the 290 only being 1% slower than the 290X, which is complete nonsense too, and I call into question the legitimacy of your source. I don't give a rat's ass about the reputation you have around here for your work on Dark Souls and GeDoSaTo; when you are full of shit I'm going to call you out on it.

How about using some of that energy to pull up some legitimate benchmarks from some reputable websites?
 
Which Guru3D benches are you talking about? Could you please link them?

http://www.guru3d.com/articles_pages/asus_geforce_gtx_980_poseidon_review,13.html

I did; here, I'll do it again. BioShock Infinite is the only game in the test which shows any significant, distinct advantage for the GTX 980 over the 290X, and the performance difference is so large that it throws off a combined average rather significantly, leading to results like the one posted above.

How about using some of that energy to pull up some legitimate benchmarks from some reputable websites?


I did post results from reputable sites; are you blind?
 
Lol at all the people believing Nvidia's TDP figures. At 100% load, the 980 consumes over 220 watts, 50 to 70 less than a 290X at full load. Everyone is forgetting that this card comes standard with a closed-loop water cooler. Even if you run the 290X 24/7 at full load for a year, what are you saving? 4-5 bucks a month? If you're penny-pinching like that and expect to run enthusiast cards on a 400-watt PSU, you should look at different GPUs.
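For what it's worth, the "few bucks a month" figure roughly checks out. A back-of-the-envelope sketch, assuming a ~60 W full-load gap (middle of the 50-70 W claim) and a hypothetical $0.12/kWh electricity rate:

    # Back-of-the-envelope cost of a ~60 W full-load power difference.
    delta_w = 60            # assumed gap, middle of the claimed 50-70 W range
    hours = 24 * 30         # worst case: card pegged at full load all month
    price_per_kwh = 0.12    # hypothetical electricity rate in USD

    monthly_kwh = delta_w / 1000 * hours
    print(f"~${monthly_kwh * price_per_kwh:.2f} extra per month")  # about $5

At typical rates that lands right around the $4-5 per month mentioned above, and less in practice since the card isn't at full load 24/7.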
 

Durante

Member
I was interested to see if there was anything to the claim that the 290X profits disproportionately from non-reference designs, so I had a look.

Non-reference 290Xs seem to be roughly 9% faster than reference models:
290x_nonref_2opuzd.png

290x_nonref_17cuk9.png


Non-reference 980s are 7% to 14% faster than the reference model:
980_nonrefscub3.png


In other words, I see no reason to believe that comparing non-reference with non-reference models would meaningfully change the result.
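A minimal sketch of the arithmetic behind that conclusion: the ~9% and 7-14% uplifts come from the charts above, while the 15% reference-vs-reference gap is purely an illustrative assumption.

    # If both cards gain a similar amount from non-reference designs,
    # the relative gap barely moves. Illustrative numbers only.
    ref_gap = 1.15                    # assumed: reference 980 is 15% ahead of reference 290X
    uplift_290x = 1.09                # ~9% non-reference uplift for the 290X (charts above)
    for uplift_980 in (1.07, 1.14):   # ~7-14% non-reference uplift for the 980
        gap = ref_gap * uplift_980 / uplift_290x
        print(f"980 uplift {uplift_980:.2f} -> non-ref gap ~{(gap - 1) * 100:.0f}%")
    # prints roughly 13% and 20%, i.e. the same ballpark as the reference comparison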
 
http://www.guru3d.com/articles_pages/asus_geforce_gtx_980_poseidon_review,13.html

I did; here, I'll do it again. BioShock Infinite is the only game in the test which shows any significant, distinct advantage for the GTX 980 over the 290X, and the performance difference is so large that it throws off a combined average rather significantly, leading to results like the one posted above.




I did post results from reputable sites; are you blind?

Stop insulting people lol. How does that help discussion?
 
http://www.guru3d.com/articles_pages/asus_geforce_gtx_980_poseidon_review,13.html

I did; here, I'll do it again. BioShock Infinite is the only game in the test which shows any significant, distinct advantage for the GTX 980 over the 290X, and the performance difference is so large that it throws off a combined average rather significantly, leading to results like the one posted above.




I did post results from reputable sites; are you blind?

You link the benchmark that is most even; other benches vary between 5-15 fps, with framerates around 50-70 or so, yielding about 10-20% depending on the title.
 
I was interested to see if there was anything to the claim that the 290X profits disproportionately from non-reference designs, so I had a look.

SNIP

In other words, I see no reason to believe that comparing non-reference with non-reference models would meaningfully change the result.


Non-reference at REFERENCE CLOCKS; those non-reference 980s have 10-20% overclocks from the factory. See the Guru3D tests for a reference-clocked 980 vs 290X on the newest drivers. It's nowhere even remotely close to 20%.

You link the benchmark that is most even; other benches vary between 5-15 fps, with framerates around 50-70 or so, yielding about 10-20% depending on the title.


Please look at the other pages of the article; I linked the first game test page. Look at the rest: BioShock Infinite shows a 50% performance difference, but the rest? A LOT less than 25% at 1080p, more like ~5% at 1440p or 4K. Look for yourself. Also, Durante claimed 25% across the board; it's not even halfway close to that. It's absolute FUD.
 

tuxfool

Banned
Lol at all the people believing Nvidia's TDP figures. At 100% load, the 980 consumes over 220 watts, 50 to 70 less than a 290X at full load. Everyone is forgetting that this card comes standard with a closed-loop water cooler. Even if you run the 290X 24/7 at full load for a year, what are you saving? 4-5 bucks a month? If you're penny-pinching like that and expect to run enthusiast cards on a 400-watt PSU, you should look at different GPUs.

The reference 980 comes with a blower cooler. But yes, you are correct that Nvidia's TDP figures generally lowball it.
 

viveks86

Member
I've only recently started following PC-related threads and I'm sick and tired of this shit already. This is worse than console wars and not worth my reading time. I'm out!
 

WarpathDC

Junior Member
Going to keep my R9 270X SC until I see the official specs. I was holding off on a 970, but these AMD specs are disappointing (outside of the 390 series). Oh well, I have until May to Witcher-proof my GPU. The rest of my rig is ready.
 

Sulik2

Member
Any early word on how the drivers are looking? AMD always seems to make great hardware and then shoot themselves in the foot with their drivers.
 
Non-reference at REFERENCE CLOCKS; those non-reference 980s have 10-20% overclocks from the factory. See the Guru3D tests for a reference-clocked 980 vs 290X on the newest drivers. It's nowhere even remotely close to 20%.

Hrmmm...
Let's see
Crysis 3: 49 v 41 = 16.3%

Metro LL: 59 v 50 = ~15%

Thief DX11: 54 v 47 = ~15%

Tomb: 104 v 87 = ~17%

Please look at the other pages of the article; I linked the first game test page. Look at the rest: BioShock Infinite shows a 50% performance difference, but the rest? A LOT less than 25% at 1080p, more like ~5% at 1440p or 4K. Look for yourself. Also, Durante claimed 25% across the board; it's not even halfway close to that. It's absolute FUD.

First you say 10%, then 20%, then '25%'. You are moving the goalposts and, in general, quite wrong. (The percentage math on those numbers is sketched just below.)
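A quick sketch of the percentage arithmetic on those FPS pairs. Note that the answer depends on which card you take as the baseline, which is part of why the "10%" and "15-20%" claims in this thread keep talking past each other.

    # FPS pairs quoted above, listed as (GTX 980, 290X).
    results = {
        "Crysis 3": (49, 41),
        "Metro LL": (59, 50),
        "Thief DX11": (54, 47),
        "Tomb": (104, 87),
    }
    for game, (fps_980, fps_290x) in results.items():
        lead = (fps_980 / fps_290x - 1) * 100     # how much faster the 980 is
        deficit = (1 - fps_290x / fps_980) * 100  # how much slower the 290X is
        print(f"{game}: 980 leads by {lead:.1f}%, 290X trails by {deficit:.1f}%")
    # leads come out around 15-20%, deficits around 13-16%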
 

riflen

Member
I've only recently started following PC-related threads and I'm sick and tired of this shit already. This is worse than console wars. I'm out!

To be fair to the vast majority of posters here, this is something that I've only seen happening on this level over the last few days. Generally PC threads are not like this, but we've got a few recent juniors throwing personal insults and goalpost-moving like hell at the moment.
 

Durante

Member
Non-reference at REFERENCE CLOCKS; those non-reference 980s have 10-20% overclocks from the factory.
So you are proposing comparing cards which almost don't exist? I just checked, and of the 41 non-reference 980 models sold in my country, only 4 ship with reference clocks. That's why, when comparing GPUs, you generally compare the reference models.
 
Hrmmm...

Let's see
Crysis 3: 49 v 41

Metro LL: 59 v 50

Thief DX11: 54 v 47

Tomb: 104 v 87

You are looking at the overclocked models; look at the one labeled simply GTX 980. Your numbers are, again, FUD.


So you are proposing comparing cards which almost don't exist? I just checked, and of the 41 non-reference 980 models sold in my country, only 4 ship with reference clocks. That's why, when comparing GPUs, you generally compare the reference models.

How many of the non-reference 290/290X models come with reference clocks? I know the XFX Double Dissipation does, and I know a few others that do. Look, I'm sorry for getting angry; I'm having a bad day and I shouldn't have taken it out on you. The 290/X reference cooler simply isn't adequate to prevent the card from throttling; it's almost as terrible a cooler as the Nvidia FX 5xxx series cards that would catch fire when a screensaver came on, lol. Comparisons to models using that cooler aren't valid because of this; aftermarket models don't have the issue, and it's not 25% across the board. 10-15% in some games, sure; dead-on even in other games? Also sure. Overall averages that include titles like BioShock Infinite are broken because of its 50%+ advantage on Nvidia hardware (which, btw, I have no fucking idea why that's a thing, do you?).
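A toy illustration of that last point about averages; the per-game leads below are made up for the example, and only the ~50% BioShock Infinite gap comes from the posts above.

    # Hypothetical per-game leads for the 980 over the 290X, in percent:
    # four near-even titles plus one large outlier (the claimed ~50% BioShock gap).
    leads = [5, 8, 6, 7, 50]

    with_outlier = sum(leads) / len(leads)
    without_outlier = sum(leads[:-1]) / len(leads[:-1])
    print(f"average with the outlier:    {with_outlier:.1f}%")     # 15.2%
    print(f"average without the outlier: {without_outlier:.1f}%")  # 6.5%

One lopsided title pulls the combined average up noticeably, which is why per-game numbers are worth checking alongside any overall figure.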
 
Isn't this the website that changes their settings depending on what the highest playable settings are? Kind of makes it even more difficult to compare FPS, since the settings aren't uniform across the cards.

Check the link; on the 2 pages I saw, all the cards had the same settings.

I pretty much stopped going there when they started doing that. I think it was around the time they tried to expand their website into 4 different sections. Though I'm not really into the overclocking scene like I was back in college.
 

Hazaro

relies on auto-aim
Hmm... I'm running off so I don't have time to pull up findings of my own, but I don't think it's really that bad.

Also fuck yeah single GPU performance thank you AMD. Screw watt ratings, feed that sucker POWER.
Pricing feels off, too big of a gap for me, but tiers seem about right considering the crazy 290 used market prices.
You are deliberately comparing a model you fucking KNOW throttles under load; non-reference designs don't do that. The 980 is 10% faster than the 290X at 1080p, less than that at 4K when neither card is throttling and neither is overclocked. Hell, that bench shows the 290 only being 1% slower than the 290X, which is complete nonsense too, and I call into question the legitimacy of your source. I don't give a rat's ass about the reputation you have around here for your work on Dark Souls and GeDoSaTo; when you are full of shit I'm going to call you out on it.
Whoa, dude, calm down.

FWIW I think a lot of early benches that have not been revisited do have issues with the 290(X) thermal throttling that make it look worse than it is.

IMO the comparison for someone buying a 290(X) should be against a non-reference 290, because the reference cooler was SO TERRIBLE NO ONE SHOULD EVER HAVE ONE. So aftermarket-cooler 980 vs aftermarket 290X is how I would judge the cards.
 

mephixto

Banned
Isn't this the website that changes their settings depending on what the highest playable settings are? Kind of makes it even more difficult to compare FPS, since the settings aren't uniform across the cards.

Yes. In this review they lowered the settings of the 290X model several times just to make it playable and keep up with the 980.
 

Momentary

Banned
I've only recently started following PC-related threads and I'm sick and tired of this shit already. This is worse than console wars and not worth my reading time. I'm out!

Stay in SteamGAF, my man. I'm here for the entertainment. The only reason it's like this right now is that NVIDIA and AMD are announcing new cards. Then the graphs start coming out. Then the defenders start coming out. It's like this every time. I just want someone to dive into something other than 28nm.

Yes. In this review they lowered the settings of the 290X model several times just to make it playable and keep up with the 980.

You're about to start a fire with that comment.
 

endtropy

Neo Member
Yeah, I learned during the GTX 970 "debates" that it's essentially pointless to try. Someone else will always be convinced that they need to "win" and that winning means doing so on the terms they set. Don't like the outcome of this particular test? Here are reasons why this other one, with a linked graph, is better/more accurate/truer/etc.

The one thing I laugh about, though, is folks who think *either* company gives a damn about "reasonableness"... If I see one more post about Nvidia ripping consumers off I'm just going to die from laughter... Nvidia & AMD are completely driven by market pressure. If you have the highest-performing single-chip card, guess what, you can set your price and let demand guide it. You get to charge a 30-40% premium for a 10% performance difference because you're at the top and people will pay it... That's not ripping consumers off. When you lack competition in a market segment you price accordingly; AMD will do the exact same thing if given the chance.
 

ss_lemonade

Member
Hrmmm...

Let's see
Crysis 3: 49 v 41 = 16.3%

Metro LL: 59 v 50 = ~15%

Thief DX11: 54 v 47 = ~15%

Tomb: 104 v 87 = ~17%

Aren't those OC'd 980 numbers? What 290X is being used in the Guru3D review anyway? It does seem to come close to the 980, even beating it in one benchmark (Hitman).
 
Aren't those OC'd 980 numbers? What 290X is being used in the Guru3D review anyway? It does seem to come close to the 980, even beating it in one benchmark (Hitman).

I have no idea which 290X it is; I simply know that models not labeled with a brand and model are reference clocked. It could very well be a reference model with the fan forced to 100% to prevent throttling, or it could be the XFX Double Dissipation, which is reference clocked and runs cool and quiet (the Black Editions have a ~8-9% OC; the regular XFX DDs are reference). They list "GTX 980" numbers as well, aka reference clocks.


The 8GB 290X models fare even better vs the GTX 980 at 1440p and especially 4K in quite a few games as well, and those are only $400 right now vs a $560 GTX 980, so I say that's still a valid comparison as well.
 