
AMD Radeon Fury X review thread

All PC games that started development two years ago should be targeting DX11 and OpenGL 4. There are exceptions, such as Deus Ex: MD offering DirectX 12 support (it comes out early next year), but it should also support DX11.
As helpful as always, tuxfool, many thanks.

Developers should be glad they are moving away from DX11, since from what I hear, and from my experience with the games that used it, it was a very resource-intensive API.
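
(For anyone wondering what shipping both paths actually looks like, here's a minimal sketch of DX12-first device creation with a DX11 fallback. Illustrative only: the helper name is made up, and real engines wrap this in far more adapter and feature-level logic.)

Code:
// Minimal sketch: try to create a D3D12 device, fall back to D3D11.
// Assumes the Windows SDK; error handling trimmed for brevity.
#include <d3d12.h>
#include <d3d11.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")
#pragma comment(lib, "d3d11.lib")

using Microsoft::WRL::ComPtr;

bool CreateBestDevice() // hypothetical helper name
{
    ComPtr<ID3D12Device> dev12;
    if (SUCCEEDED(D3D12CreateDevice(nullptr, // default adapter
                                    D3D_FEATURE_LEVEL_11_0,
                                    IID_PPV_ARGS(&dev12)))) {
        return true; // DX12 path: OS and driver support it
    }

    ComPtr<ID3D11Device> dev11;
    D3D_FEATURE_LEVEL flOut;
    if (SUCCEEDED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE,
                                    nullptr, 0, nullptr, 0,
                                    D3D11_SDK_VERSION,
                                    &dev11, &flOut, nullptr))) {
        return true; // DX11 fallback path
    }
    return false;
}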
 

Decker

Member
Brad Grenz said:
Unknown Soldier said:
Brad Grenz said:
Unknown Soldier said:
The Fury X gets humiliated by an overclocked 980 Ti
Wow, the Titan X is getting HUMILIATED by an overclocked 980 Ti. nVidia must be SO EMBARRASSED right now!
Yep, in the fantasy world where you can't overclock the Titan X too. Nvidia is clearly fantasy embarrassed in that fantasy world!
It's amusing that you don't even recognize how your retort here defeats your own earlier hysterics as well.

Brad Grenz has a point: you started by comparing the Fury X with an overclocked 980 Ti, and when he made the same comparison between the Titan X and an overclocked 980 Ti, you mocked his words because one card in the comparison was overclocked.
Not to mention you also sarcastically spoke about video card religion to justgames7604 while you are the one with the fanboy attitude.

I can't tell if you are being childish or simply trolling.
 

Jamex RZ

Banned
That's a GTX 980 Ti in that picture.

This is what baffles me: how in the world does a new card not have HDMI 2.0? It's marketed as a 4K card, and a lot of people (me being one of them) have 4K TVs instead of a monitor. I guess I'm going team green this time. AMD may lose some sales due to that.
 

tuxfool

Banned
This is what baffles me: how in the world does a new card not have HDMI 2.0? It's marketed as a 4K card, and a lot of people (me being one of them) have 4K TVs instead of a monitor. I guess I'm going team green this time. AMD may lose some sales due to that.

Mhm. Everything seems to point to the card being supply-constrained; they'll empty out inventory. They'll only keep this up for the next six or seven months, and then everybody will be focused on the upcoming new things.
 

joesiv

Member
Mhm. Everything seems to point to the card being supply-constrained; they'll empty out inventory. They'll only keep this up for the next six or seven months, and then everybody will be focused on the upcoming new things.
Yeah, I bet AMD never intended it to be a huge seller; I could see it as a trial run for their new memory architecture. They might not be making a profit on it, perhaps even taking a loss. Is that a thing in this market? Really, with this process node, all signs point to HBM2 and the die shrink as the real story.

But I keep thinking: what would a 390X have been like with the same shader count as the Fury? Might have been a sweet card.
 

Durante

Member
This is what baffles me: how in the world does a new card not have HDMI 2.0? It's marketed as a 4K card, and a lot of people (me being one of them) have 4K TVs instead of a monitor. I guess I'm going team green this time. AMD may lose some sales due to that.
Yeah, despite the ways people want to deflect it, it's a glaring omission for a GPU purportedly aimed at 4K. I don't really know what they were thinking with both this and the dual-link DVI issue. I can't believe cutting costs by a few cents is worth that.
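
(To put some rough numbers on why HDMI 2.0 matters for a 4K card: the figures below are pixel-data rates only and ignore blanking overhead, so treat this as a back-of-the-envelope sketch.)

Code:
// Back-of-the-envelope: why 4K60 needs HDMI 2.0-class bandwidth.
#include <cstdio>

int main()
{
    const double w = 3840, h = 2160, hz = 60, bpp = 24; // 8-bit RGB
    const double gbps = w * h * hz * bpp / 1e9;         // pixel data only
    std::printf("4K60 pixel data: ~%.1f Gbps\n", gbps); // ~11.9 Gbps
    // HDMI 1.4 carries at most ~8.2 Gbps of pixel data (340 MHz TMDS),
    // so 4K is capped at 30 Hz; HDMI 2.0 raises that to ~14.4 Gbps
    // (600 MHz TMDS), which is why 4K60 over HDMI needs 2.0.
    return 0;
}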

Mhm. Everything seems to point to the card being supply-constrained; they'll empty out inventory. They'll only keep this up for the next six or seven months, and then everybody will be focused on the upcoming new things.
Here in Austria, it's available in stock already. Non-reference 980 Tis are harder to get.
 

derFeef

Member
Hopefully new third-party revisions are better with those issues. I almost grabbed one yesterday, but held off because of them.
 

cirrhosis

Member
So the Fury X is generally around 12% slower than the 980 Ti in most games.


What's the pricing between the two on average?

Stateside, they both run around $650 on average. $650 was what Nvidia launched the 980 Ti at, which AMD decided to match upon launching the Fury X.

That doesn't include vendor-overclocked 980 Tis, which easily run to $700 (the EVGA Classified, for example) or potentially above. Most of the other factory-OCed versions of the 980 Ti fall somewhere in the $650 to $700 range.
 

tbd

Member
Wow, this thread. GPU wars are serious, I guess.

However, if it comes down to Fury X vs. 980 Ti, I'd go with the Fury X. I'm pretty sure you'll find it for a substantially lower price sooner or later, and I personally don't really dig how Nvidia is inherently connected with Ubisoft and some of the most terribly optimized games there are; see Batman.
 

Irobot82

Member
Yeah, despite the ways people want to deflect it, it's a glaring omission for a GPU purportedly aimed at 4K. I don't really know what they were thinking with both this and the dual-link DVI issue. I can't believe cutting costs by a few cents is worth that.

Here in Austria, it's available in stock already. Non-reference 980 Tis are harder to get.

Robert Hallock tweeted that "It was a feature they built specifically for the 300 series."

Twitter Conversation

So does that mean they modded the chips and are doing something at a chip level, or is this more PR bullshit?
 

Durante

Member
Robert Hallock tweeted that "It was a feature they built specifically for the 300 series."

Twitter Conversation

So does that mean they modded the chips and are doing something at a chip level, or is this more PR bullshit?
PR bullshit. Confirmed so, in fact.

People have been frame limiting with external tools for a decade, and people have already run even AMD's implementation on 200-series cards with hacked drivers.
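
(For anyone curious what those external tools actually do, the core of a frame limiter is tiny. A rough sketch of the generic technique below; this is not AMD's FRTC implementation, and render_frame() is a hypothetical stand-in.)

Code:
// Minimal external frame limiter sketch: sleep after each frame so the
// loop never runs faster than the target rate.
#include <chrono>
#include <thread>

void run_frame_loop_at(double target_fps)
{
    using clock = std::chrono::steady_clock;
    const std::chrono::duration<double> budget(1.0 / target_fps);
    auto next = clock::now() + budget; // deadline for the current frame

    while (true) {
        // render_frame(); // hypothetical: draw and present here
        std::this_thread::sleep_until(next); // idle until the frame is due
        next += budget;                      // schedule the next deadline
    }
}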
 
PR bullshit. Confirmed so, in fact.

People have been frame limiting with external tools for a decade, and people have already run even AMD's implementation on 200-series cards with hacked drivers.

Have people flashed a 390X BIOS onto a 290X yet? If they did that, they could use the 300/Fury driver branch instead of the older cards' branch.
 

Devildoll

Member
Have people flashed a 390X BIOS onto a 290X yet? If they did that, they could use the 300/Fury driver branch instead of the older cards' branch.

I did; I ran mine at 1050/6000 clocks as well, and I didn't get results as good as the 390X's in any review.

My computer's specs are worse than most review machines', and even though I got the 390X BIOS, I can't use the 300-series driver; it won't let me install it, so that's another variable.


But going from my 290X to the flashed 390X, my performance in Tomb Raider at the same preset as SweClockers was:
13.14% higher minimum framerate and 11.24% higher average framerate.

Whereas SweClockers' difference between their 290X and a proper 390X was:
15.79% higher minimum framerate and 19.35% higher average framerate.


I reran it a few times without major discrepancy. My testing is far from watertight, but it would seem the 390X benefits from more than just bumped-up frequencies.
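
(For anyone checking the math, those deltas are plain percentage gains over the 290X baseline. The framerates in the sketch below are hypothetical placeholders, not the actual benchmark numbers.)

Code:
// How the percentage gains above are computed.
#include <cstdio>

double pct_gain(double before, double after)
{
    return (after - before) / before * 100.0;
}

int main()
{
    const double avg_290x = 50.0, avg_390x = 55.6; // hypothetical FPS values
    std::printf("avg gain: %.2f%%\n", pct_gain(avg_290x, avg_390x)); // 11.20%
    return 0;
}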
 

Engell

Member
At the bottom they link to a new article describing newer CM parts that apparently eliminate or reduce the issue. It sucks to have a part lottery, but it's good that they're making fast changes.

Yeah, let's hope they have it fixed and it stays fixed; it would really benefit the market if AMD got a little success.
 
AMD sent us some engineering samples of these, so I'm putting it through its paces now. The radiator fits fine into my Fractal Design XL, although I feel the need to get an entire liquid cooling system to complement it. I'm replacing a small-form-factor Gigabyte GTX 970, so the difference in size isn't that big a change, really.
 

nib95

Banned
Why are you considering it over the 980 Ti?

Nvidia cards are massively price-hiked here in the UK. On top of that, if history repeats itself, the AMD card may fare better long-term than its Nvidia counterpart. It seems like AMD cards generally tend to close, or even flip, the gap over time.
 
Nvidia cards are massively price-hiked here in the UK. On top of that, if history repeats itself, the AMD card may fare better long-term than its Nvidia counterpart. It seems like AMD cards generally tend to close, or even flip, the gap over time.

The last couple of AMD ranges all ran more or less the same architecture, and all got a big boost at roughly the same time last year. One interpretation is that they got much better support than Nvidia's cards did; another is that they were underperforming at launch and it took a year or two before we finally saw their full potential. In any case, the Fury X has already benefited from this, as it shares a lot of tech with the previous generations. I guess the memory management side might still have room for improvement, which would be a good thing for a 4GB card in this price bracket.

If you could settle for a less powerful system, I might even recommend the 290X, since it's still a very good card and way cheaper.
 

dr_rus

Member
Nvidia cards are massively price-hiked here in the UK. On top of that, if history repeats itself, the AMD card may fare better long-term than its Nvidia counterpart. It seems like AMD cards generally tend to close, or even flip, the gap over time.

One of the reasons Kepler cards are falling behind GCN cards in newer games is their lack of DX11+ feature support, which is used rather widely in newer PC games because of the new consoles. With Maxwell and Fury it's the other way around: Fiji supports fewer features than Maxwell 2 does.

There's also the issue of 4GB of VRAM, which *will* limit Fiji's potential going forward.

Basically, I don't think the Kepler vs. GCN situation has any implications for how things will play out in the future, on different architectures and in other games.
 
Well, that's just bullshit :/ As a 290X owner, it'd be awesome to have frame targeting built into CCC.

It means they're limiting features on older cards so you'll buy new ones, but then they'll bitch about Nvidia being anti-consumer.

PR bullshit. Confirmed so, in fact.

People have been frame limiting with external tools for a decade, and people have already run even AMD's implementation on 200-series cards with hacked drivers.

Yep, Catalyst 15.7, released today, adds VSR and FRTC support to all GCN products (R9 300, R9 200, 7900/7800 series).
 

tenchir

Member
Has anyone heard about the AF performance/quality difference between the Fury X and the Titan? Apparently the Titan uses lower-quality AF at its default setting, which gave it a performance advantage over the Fury X in games. The Titan took a 5-15% FPS hit in BF4 when you set the AF quality to be comparable to the Fury X's default setting. Could that explain why the leaked charts from AMD had something like 0x AF on them?

http://hardforum.com/showthread.php?p=1041709224#post1041709224

edit: It's not going to change the Ultra-setting benchmarks, since I assume AF is set to the highest for both of them, but I could see it affecting High settings.

edit 2: I should have read to the end. Looks like it was a driver issue that turned off AF.
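
(For context on what the "AF quality" knob actually controls: games request anisotropic filtering through the graphics API's sampler state, and the driver decides how faithfully to honor it, which is what the linked investigation was probing. A minimal D3D11 sketch of requesting 16x AF:)

Code:
// A D3D11 sampler requesting 16x anisotropic filtering. The driver may
// still degrade this internally, e.g. via a control-panel "quality" setting.
#include <d3d11.h>

D3D11_SAMPLER_DESC make_aniso_sampler_desc()
{
    D3D11_SAMPLER_DESC sd = {};
    sd.Filter         = D3D11_FILTER_ANISOTROPIC;
    sd.MaxAnisotropy  = 16;                         // 16x AF
    sd.AddressU       = D3D11_TEXTURE_ADDRESS_WRAP;
    sd.AddressV       = D3D11_TEXTURE_ADDRESS_WRAP;
    sd.AddressW       = D3D11_TEXTURE_ADDRESS_WRAP;
    sd.ComparisonFunc = D3D11_COMPARISON_NEVER;
    sd.MaxLOD         = D3D11_FLOAT32_MAX;
    return sd;
}
// Usage: device->CreateSamplerState(&sd, &sampler);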
 