
AMD Radeon Fury X review thread

Crisium

Member
If Fury is 3584 shaders, it's the same cut-down ratio as the 7950 to the 7970: a 5 to 10% difference at equal clocks. Getting equal clocks without water will be harder, though.
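To spell the ratio out, here's a quick back-of-the-envelope check (the 3584 figure for the air-cooled Fury was still a rumor at this point; the 7900-series shader counts are from AMD's published specs):

# Shader-count ratios: rumored Fury vs. Fury X, and 7950 vs. 7970.
fury_x = 4096          # Fury X stream processors (confirmed)
fury_rumored = 3584    # rumored air-cooled Fury count
hd7970, hd7950 = 2048, 1792

print(fury_rumored / fury_x)   # 0.875
print(hd7950 / hd7970)         # 0.875: the same 12.5% cut, which in the
# 7950's case meant roughly 5-10% lower performance than the 7970
# at matched clocks.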
 
The only "architectural" advantage Titan X has above 980Ti is it's additional 6GB of RAM. Well, that and the price which is bigger as well.

That isn't true; in theory it has more compute cores.

Whether that makes any difference in current games is another issue
 

ricki42

Member
[TechReport frame-time plot: w3-r9.gif]

http://techreport.com/review/28513/amd-radeon-r9-fury-x-graphics-card-reviewed/6

[Guru3D FCAT frame-time plot]

http://www.guru3d.com/articles_pages/amd_radeon_r9_fury_x_review,28.html

I need to actually read more about how these sites test frame timing, but I find it curious that TechReport shows insane frame-time spikes while in Guru3D's tests frame timing is pretty much flawless.

I couldn't find anything on Guru3D about how they actually plot the data, but looking at the plots I think it might just be the binning. The TechReport plot has the frame number on the x-axis, which makes sense: it plots, for each subsequent frame, how long that frame took. The Guru3D plot has seconds into the measurement on the x-axis. Since each frame takes a different time, if you bin the results in terms of time, you get a different number of frames in each bin. Of course, they may be plotting per frame and just annotating the axis in time. But if the binning is in time rather than per frame, I would expect some degree of averaging, which would explain the difference; see the sketch below.
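Here's a minimal sketch of that averaging effect in Python, assuming Guru3D averages frames into fixed time bins (which, to be clear, is just a guess from the axis labels):

import numpy as np

# Hypothetical frame-time trace (ms): mostly 16 ms, with a few 50 ms spikes.
rng = np.random.default_rng(0)
frame_times = np.full(1000, 16.0)
frame_times[rng.choice(1000, 20, replace=False)] = 50.0

# Per-frame plot (TechReport style): every spike shows up at full height.
print(frame_times.max())  # 50.0 ms

# Time-binned plot (guessed Guru3D style): average all frames whose start
# time falls within each 100 ms window.
starts = np.concatenate(([0.0], np.cumsum(frame_times)[:-1]))
bin_idx = (starts // 100).astype(int)
binned = np.array([frame_times[bin_idx == i].mean() for i in np.unique(bin_idx)])
print(binned.max())  # well below 50 ms: each spike is diluted by the
                     # ~5 normal frames sharing its bin

The spikes don't disappear from the data; they just get averaged away by the neighboring frames in the same bin.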
 

Zalusithix

Member
Semi-passive air cards are silent during idle and low load periods. You can't turn off a pump.

Yeah, noise levels are really a matter of what you want out of your computer. On a pure gaming rig, water is the way to go. It'll be less noisy during heavy load situations. If your computer also serves as a daily driver, then you might be better served by quiet air cooling. It'll be louder while gaming, but quieter the rest of the time while the computer is idling.

Personally I find noise more annoying while doing mundane tasks or listening to music than while playing games, so I veer my builds towards idle noise performance. Ideally I'd have two machines with each built differently, but that's not going to happen any time soon.
 

LordOfChaos

Member
Did any of the reviews look at memory use? AMD claimed they found 4GB was enough because they hadn't optimized driver video memory use much before now, so I assume they're planning to pare down memory use in new drivers for the Fury, which could possibly also help with frame times if more of the relevant data stays in memory.
 

ricki42

Member
1440p vs 4k.

1440p vs 1080p?

guru3d said:
Note: The AMD Radeon Fury X does not have a DVI output. For FCAT at 2560x1440 (WHQD) we need a Dual-link DVI connector, for which we split the signal to the frame-grabber. This is not possible. We converted the HDMI output to DVI, however that's not a dual-link and as such the highest resolution supported is Full HD. So we had a dilemma, not do FCAT at all, or revert to 1920x1080. We figured you guys would love to see FCAT results, hence we compromised for Full HD over WHQD.
 
Did any of the reviews look at memory use? AMD claimed they found 4GB was enough because they hadn't optimized driver video memory use much before now, so I assume they're planning to pare down memory use in new drivers for the Fury, which could possibly also help with frame times if more of the relevant data stays in memory.

HardOCP did and they found 4 GB wanting.
 

wildfire

Banned
ATM there's really no reason to get the Fury X over the 980Ti unless you prefer cooler temps. AMD dropped the ball IMO.

And this sums up the entire value proposition.

Do you want to play at 4K above 24 FPS? Get Nvidia.
AMD didn't include HDMI 2.0, which is the input of choice for the majority of 4K displays. An active adapter won't exist for a few more months, and when released it will set you back $70 at minimum.

Do you want to play at 1440p at 120+ fps with IPS? You'd better have moved on from those affordable Korean monitors, because dual-link DVI input wasn't included either. At least that adapter would only set you back about $30.


Do you want the best possible performance but don't like to overclock? Get Nvidia.

Do you want the best performance per dollar after installing a water cooler? Still buy Nvidia as long as you're overclocking.

Do you want a GPU in a small enclosure? Check your favorite case for clearance measurements. The radiator for the Fury, while excellent, is far thicker than your average AIO's, so installing it will be a challenge even though the GPU itself is tiny.

There were too many compromises in the Fury.
 

mrklaw

MrArseFace
For those just considering jumping into variable framerate gaming, a FuryX + freesync monitor combo would give good performance with quite a monetary saving.
 
I searched the thread for anisotropic filtering but got nothing. Did any of the reviews find an issue with it and the Fury X?

This seems like a good card, but AMD really needed a great one here.
 

Xyber

Member
Huge noise reduction is the reason. Noise is a huge factor for a lot of people. Don't tell me you don't care about noise...

While I haven't heard how noisy the 980Ti with the MSI cooler is, if it's anything like my 970 then noise is a non-issue. I can barely hear my card when I game (if I take off my headset); it's really fucking quiet already and I am running a 1555MHz overclock. And I have a modded BIOS with higher voltage, so the fan spins a little faster to keep the temps down.

With stock voltage I could use a fan profile that wasn't even audible when playing a game.

At idle it is dead quiet, and no water cooling will be able to make less noise than what my card is doing now, especially with how noisy those pumps can be on closed-loop systems.

If someone releases a modded BIOS with unlocked voltage and the card can actually overclock, then the water cooler might actually be worth having.
 

x3sphere

Member
This isn't a PS4. Shouldn't be an issue.

Right, though AMD used 0xAF in the majority of the tests for the benchmark it published a week ago, which showed Fury ahead in a lot of games.

I imagine all review sites are using maxed AF, though; no one games on PC without AF.
 
My hot take, as someone who just installed a 980ti yesterday:

It looks like the Fury X is a very good, competitive card. It's got some neat features, like water-cooling, that will hopefully show up in other mainstream cards. I was looking at the EVGA Hybrid water-cooled 980ti, but decided it would have been a tight fit - I'm already using an AIO CPU cooler.

The issue seems to be that many (especially the AMD faithful) thought that the Fury X (let alone the "Fury X2") would stomp all over the 980ti. The announce thread, the 980ti thread, etc. were full of claims that Fury X would be a Titan-killer at half the cost. I can understand the disappointment!

Either way, I'm glad that it's good competition; it's pretty likely that the 980ti's pricing is directly related to what Nvidia expected the Fury X to bring to the table.
 

Seanspeed

Banned
Right, though AMD used 0xAF in the majority of the tests for the benchmark it published a week ago, which showed Fury ahead in a lot of games.

I imagine all review sites are using maxed AF, though; no one games on PC without AF.
Yea, exactly.

Those were some super strange benchmarks.
 

Xyber

Member
Right, though AMD used 0xAF in the majority of the tests for the benchmark it published a week ago, which showed Fury ahead in a lot of games.

I imagine all review sites are using maxed AF, though; no one games on PC without AF.

If anything, shouldn't 16xAF favor the Fury more than 0xAF? I might be wrong on this, but isn't AF fairly memory-bandwidth intensive?
 
For those just considering jumping into variable framerate gaming, a FuryX + freesync monitor combo would give good performance with quite a monetary saving.

Strongly disagree. You're only going to save $100 to $200 for a Freesync monitor that won't sync across the entire refresh range. If you're buying a monitor you plan on keeping for a long time and you're choosing between a Fury X and a 980 Ti, spend a little more and go with the latter.
 

Elman

Member
For those just considering jumping into variable framerate gaming, a FuryX + freesync monitor combo would give good performance with quite a monetary saving.

A Fury X and a 1440p+ Freesync monitor does have a lower entry price, but Freesync does not have feature parity with G-Sync.

The refresh rate window for Freesync varies in the monitors that have been announced and released thus far, with some monitors having minimum refresh rates as high as 48 Hz. Meaning that if the game drops below 48 FPS, you can kiss Freesync goodbye. You even get weird instances where the monitor is capable of running at 120 or 144 Hz but Freesync only covers a fraction of that range; the recently released ASUS MG279Q ranges from 35-90 Hz. Outside of that range, it's back to VSync or tearing.

G-Sync is 30 Hz minimum across the board and hits 144 Hz if the monitor supports that refresh rate. A 980 Ti + G-Sync is well worth the price difference considering how much greener the grass is in NVIDIA land.
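A toy model of the window logic described above (the numbers are the MG279Q's advertised 35-90 Hz range; the behavior outside the window is as described in this post):

VRR_MIN, VRR_MAX = 35, 90  # Hz, ASUS MG279Q's Freesync window

def freesync_behavior(fps: float) -> str:
    # Inside the window, the panel refreshes in lockstep with the GPU.
    if VRR_MIN <= fps <= VRR_MAX:
        return f"VRR active: panel refreshes at {fps:.0f} Hz"
    # Outside it, you're back to fixed refresh: VSync on (stutter/lag)
    # or VSync off (tearing).
    return "outside window: back to VSync or tearing"

for fps in (120, 60, 40, 30):
    print(fps, "FPS ->", freesync_behavior(fps))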
 

Hazaro

relies on auto-aim
Not what I wanted to wake up to in the morning.
With 1.22V, no wonder it can't OC much further right now. It's a good product; the 980Ti is just flat-out better except for load temps and load noise (pump noise aside). Or if you needed a compact card and the mini 970 wasn't enough.

If the air version (Fury) can be kept in line at $550, that should be competitive.
 

Quotient

Member
Do you want a GPU in a small enclosure? Check your favorite case for clearance measurements. The radiator for the Fury, while excellent, is far thicker than your average AIO's, so installing it will be a challenge even though the GPU itself is tiny.

Was the Fury X running so hot when air-cooled that AMD resorted to an AIO water cooler? It seems to completely nullify the small dimensions of the card when you have a thick radiator and piping to deal with.
 

tuxfool

Banned
Was the Fury X running so hot when air-cooled that AMD resorted to an AIO water cooler? It seems to completely nullify the small dimensions of the card when you have a thick radiator and piping to deal with.

You're going to need to stick a large cooler on that card, be it on the card or off it. You'll see Nvidia do the same with Pascal.
 
A Fury X and a 1440p+ Freesync monitor does have a lower entry price, but Freesync does not have feature parity with G-Sync.

The refresh rate window for Freesync varies in the monitors that have been announced and released thus far, with some monitors having minimum refresh rates as high as 48 Hz. Meaning that if the game drops below 48 FPS, you can kiss Freesync goodbye. You even get weird instances where the monitor is capable of running at 120 or 144 Hz but Freesync only covers a fraction of that range; the recently released ASUS MG279Q ranges from 35-90 Hz. Outside of that range, it's back to VSync or tearing.

G-Sync is 30 Hz minimum across the board and hits 144 Hz if the monitor supports that refresh rate. A 980 Ti + G-Sync is well worth the price difference considering how much greener the grass is in NVIDIA land.

More importantly, below 30 Hz G-Sync still has a solution that avoids both tearing and VSync. It actually does something across the entire refresh range.
 

tarheel91

Member
And this sums up the entire value proposition.

Do you want to play at 4K above 24 FPS? Get Nvidia.
AMD didn't include HDMI 2.0, which is the input of choice for the majority of 4K displays. An active adapter won't exist for a few more months, and when released it will set you back $70 at minimum.

Do you want to play at 1440p at 120+ fps with IPS? You'd better have moved on from those affordable Korean monitors, because dual-link DVI input wasn't included either. At least that adapter would only set you back about $30.


Do you want the best possible performance but don't like to overclock? Get Nvidia.

Do you want the best performance per dollar after installing a water cooler? Still buy Nvidia as long as you're overclocking.

Do you want a GPU in a small enclosure? Check your favorite case for clearance measurements. The radiator for the Fury, while excellent, is far thicker than your average AIO's, so installing it will be a challenge even though the GPU itself is tiny.

There were too many compromises in the Fury.

Unless you're using a TV, DisplayPort is the ideal input and is what most monitors come with. Why does everything have to be so black and white around here? The 980ti seems to have a clear advantage at lower resolutions and a slight-to-negligible advantage at higher resolutions at the moment, but it very much depends on the game. Overclocking is really an unknown until voltage is unlocked. I really doubt the radiator is going to be more of a clearance issue than a 12"+ GPU. The Fury X doesn't knock it out of the park like many hoped it would, but it doesn't strike out either, like you seem to believe it has.
 

ItsTheNew

I believe any game made before 1997 is "essentially cave man art."
Ah hell, as an owner of a 2GB 760, this isn't going to shake up the landscape like I imagined.
 

Durante

Member
Unless you're using a TV, DisplayPort is the ideal input and is what most monitors come with. Why does everything have to be so black and white around here? The 980ti seems to have a clear advantage at lower resolutions and a slight-to-negligible advantage at higher resolutions at the moment, but it very much depends on the game. Overclocking is really an unknown until voltage is unlocked. I really doubt the radiator is going to be more of a clearance issue than a 12"+ GPU. The Fury X doesn't knock it out of the park like many hoped it would, but it doesn't strike out either, like you seem to believe it has.
The true strikeout for the Fury X isn't HDMI 2.0 (though it's important for some), or how it compares in FPS to 980ti (it does pretty well at 4k) -- it's the frametime charts mkenyon posted earlier in this thread.

Those need to be rectified, otherwise it's simply not a good buy.
 

Nikodemos

Member
The true strikeout for the Fury X isn't HDMI 2.0 (though it's important for some), or how it compares in FPS to 980ti (it does pretty well at 4k) -- it's the frametime charts mkenyon posted earlier in this thread.

Those need to be rectified, otherwise it's simply not a good buy.
Very true. Those are the real weakness of the Fury. They're even worse than the 295X2's, and that's a dual-GPU card, which would (theoretically, at least) be expected to do worse than a single-GPU card in that test.

OTOH, that is something we know AMD can fix; after all, they did fix it on their dual-GPU cards.
 

endtropy

Neo Member
Unless you're using a TV, DisplayPort is the ideal input and is what most monitors come with. Why does everything have to be so black and white around here? The 980ti seems to have a clear advantage at lower resolutions and a slight-to-negligible advantage at higher resolutions at the moment, but it very much depends on the game. Overclocking is really an unknown until voltage is unlocked. I really doubt the radiator is going to be more of a clearance issue than a 12"+ GPU. The Fury X doesn't knock it out of the park like many hoped it would, but it doesn't strike out either, like you seem to believe it has.

Speak for yourself, man. I'm sure I am one of MANY gamers out there with 1440p monitors that use dual-link DVI. It was absolutely silly for AMD to release a card without DVI. Considering this card was built for 4K (at least according to AMD), the lack of HDMI 2.0 is just another example of missing the market. If it was a matter of being "future proof" for DisplayPort Eyefinity setups, then at $649 I expect you to provide me with the adapters so I can use the card with the monitors that are still very prevalent today.

Also, some folks seem to be hanging their hopes on "well, maybe it will OC." Guess what? 980Tis OC rather well too... So even if they find another 100-200MHz with voltage, that's nothing a third-party 980Ti can't also replicate.

Simply put, AMD's best missed the mark. They are pricing this as a competitive alternative when it's not. At best they can price it down, but I suspect they are incredibly hesitant to do so. This card likely has a very high build cost, and they need to recoup heavy investment in R&D as well. Basically, I question whether AMD can afford to sell this card for much less without taking actual losses on each card sold (factoring in not just the per-unit cost but the amortized R&D).
 

Elman

Member
More importantly, below 30 Hz G-Sync still has a solution that avoids both tearing and VSync. It actually does something across the entire refresh range.

Thanks for pointing this out...seriously! I checked online for more information and I can totally see why they went with a dedicated module.

From PC Perspective:

G-Sync treats this “below the window” scenario very differently. Rather than reverting to VSync on or off, the module in the G-Sync display is responsible for auto-refreshing the screen if the frame rate dips below the minimum refresh of the panel that would otherwise be affected by flicker. So, in a 30-144 Hz G-Sync monitor, we have measured that when the frame rate actually gets to 29 FPS, the display is actually refreshing at 58 Hz, each frame being “drawn” one extra instance to avoid flicker of the pixels but still maintains a tear free and stutter free animation. If the frame rate dips to 25 FPS, then the screen draws at 50 Hz. If the frame rate drops to something more extreme like 14 FPS, we actually see the module quadruple drawing the frame, taking the refresh rate back to 56 Hz. It’s a clever trick that keeps the VRR goals and prevents a degradation of the gaming experience. But, this method requires a local frame buffer and requires logic on the display controller to work. Hence, the current implementation in a G-Sync module.
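To make the arithmetic in that quote concrete, here's a toy model. The module's actual target threshold isn't public; repeating each frame enough times to stay at or above roughly 50 Hz happens to reproduce all three of PC Perspective's measurements:

TARGET_HZ = 50  # assumed threshold, inferred from the measured data points

def gsync_refresh(fps: float):
    # Redraw each frame enough times that the panel never has to
    # refresh below the target, avoiding pixel flicker.
    repeats = 1
    while fps * repeats < TARGET_HZ:
        repeats += 1
    return repeats, fps * repeats

for fps in (29, 25, 14):
    repeats, hz = gsync_refresh(fps)
    print(f"{fps} FPS -> each frame drawn x{repeats} -> {hz:.0f} Hz")
# 29 FPS -> each frame drawn x2 -> 58 Hz
# 25 FPS -> each frame drawn x2 -> 50 Hz
# 14 FPS -> each frame drawn x4 -> 56 Hz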
 