
AMD Ryzen Thread: Affordable Core Act

Paragon

Member
In what world does core performance scale linearly with clockspeeds?
Can I just OC my FX-6300 from 3.5GHz to 4.5GHz and get a hefty ~29% increase? Sign me up on that plane!
If this is your attempt at a joke then right on, it's about time you loosened up and that had me chuckling.
If not, in what parallel universe does CPU clockspeed + IPC correlate directly to FPS in games? This takes the biscuit in this thread for most 'out there' reasoning.

When CPU-limited, performance scaling with frequency seems to be almost perfectly linear in my tests using a 2500K.



 

pestul

Member
Hmm, interesting results with AMD vs Nvidia cards in DX12. I'm pretty sure almost every review out there was using top-end Nvidia cards too.
 

Atilac

Member
Is this thread really just arguing over irrelevant benchmarks? Firmware and support aren't there yet for Ryzen, so any comparison with Intel's more mature architecture isn't an apples-to-apples comparison. When Conroe first came out the gaming performance didn't blow AMD out of the water and people kind of flipped; look how that turned out.
 
Is this thread really just arguing over irrelevant benchmarks? Firmware and support aren't there yet for Ryzen, so any comparison with Intel's more mature architecture isn't an apples-to-apples comparison. When Conroe first came out the gaming performance didn't blow AMD out of the water and people kind of flipped; look how that turned out.

It is apples to apples for consumers whose purchasing decisions are made now and not soon™. AMD released the product; they think it's mature enough for release.

Anyways, the difference between AMD and NVIDIA GPUs used with Ryzen was interesting.
 
It is apples to apples for consumers whose purchasing decisions are made now and not soon™. AMD released the product; they think it's mature enough for release.

Anyways, the difference between AMD and NVIDIA GPUs used with Ryzen was interesting.

I don't see the same type of advancements made with Kaby Lake one month after release.
 

PFD

Member
Is this thread really just arguing over irrelevant benchmarks? Firmware and support aren't there yet for Ryzen, so any comparison with Intel's more mature architecture isn't an apples-to-apples comparison. When Conroe first came out the gaming performance didn't blow AMD out of the water and people kind of flipped; look how that turned out.

Think of it like MMOs. When you release your new MMO you aren't competing with WoW at release, you're competing with WoW after 10+ years of patches/content.
Waits for someone to point out how vanilla WoW was better
 
Well, someone apparently got a Ryzen R5 1400 early and made a video about it:
https://www.youtube.com/watch?v=DbDpMWo7XTk

Pretty good video.



Not a good video at all. He's testing a CPU by reducing draw calls (which is effectively what lowering the settings to medium does; if he didn't want to run into a GPU bottleneck with his 1060 he should have reduced the resolution instead). This is pretty misguided, as it just exaggerates frequency differences.

Secondly, he tested games that are known to work better on DX11 for Ryzen 7 in DX12 mode.
 
Is this thread really just arguing over irrelevant benchmarks? Firmware and support aren't there yet for Ryzen, so any comparison with Intel's more mature architecture isn't an apples-to-apples comparison. When Conroe first came out the gaming performance didn't blow AMD out of the water and people kind of flipped; look how that turned out.
It's not irrelevant for people who are looking at a new build in the near future and want the best price/performance for today.

Promises of future potential are well and good and relevant for discussion, but that doesn't invalidate discussing today's state of play.
 
Is this thread really just arguing over irrelevant benchmarks? Firmware and support aren't there yet for Ryzen, so any comparison with Intel's more mature architecture isn't an apples-to-apples comparison. When Conroe first came out the gaming performance didn't blow AMD out of the water and people kind of flipped; look how that turned out.

You're not wrong there.

When an RX 470 is beating a GTX 1060 by 32% there is definitely something wrong. No card from that market segment should be CPU bottlenecked on a R7 1700 so the problem lies elsewhere.

That pesky Nvidia up to their old tricks again, huh? :)

When CPU-limited, performance scaling with frequency seems to be almost perfectly linear in my tests using a 2500K.




Doesn't work like that when the CPU in question is already clocked at 3.6GHz.
 
Overclocked R5 1400 results at 3.8GHz now:


https://youtu.be/TcdmeGOsnss

He's using an RX 480 to get around the Nvidia DX12 driver bottleneck, and these are much more promising results.

He mentions that he had to run the fan at 100% speed and it still hit 80°C.

The 1500 with the beefier Wraith Spire cooler and double the L3 cache looks like the better buy to me for an extra $20. It should be capable of a decent overclock without an aftermarket cooler or resorting to pegging the fan at 100%.
 

Paragon

Member
Doesn't work like that when the CPU in question is already clocked at 3.6GHz.
Well that doesn't make any sense, since the i5-2500K I used for this test is a 3.3GHz processor stock, overclocked to 4.5GHz, and then clocked down to 2.3GHz for this comparison.

In all cases performance scaled linearly when CPU-limited.
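As a sanity check on that claim, the expected CPU-limited performance is just the clock ratio. A minimal sketch (Python) using the clocks mentioned above; the baseline FPS figure is invented for illustration, not a measured result:

```python
# Project CPU-limited FPS assuming perfectly linear frequency scaling.
# The 2.3/3.3/4.5 GHz clocks come from the post; 60 fps is a made-up baseline.

def expected_fps(baseline_fps: float, baseline_ghz: float, target_ghz: float) -> float:
    """Expected FPS at target_ghz if performance scales linearly with clock."""
    return baseline_fps * (target_ghz / baseline_ghz)

baseline = 60.0  # hypothetical CPU-limited FPS at 2.3 GHz
for clock in (2.3, 3.3, 4.5):
    print(f"{clock} GHz -> {expected_fps(baseline, 2.3, clock):.1f} fps")
```

Under this model, 2.3 GHz to 4.5 GHz is a ~96% clock increase and so a ~96% FPS increase when fully CPU-bound; any GPU limitation breaks the linearity.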
 

This is absolutely ridiculous! All along it was the damn NVIDIA drivers! I should have known, as DX12 performs much better on AMD GPUs than DX11, and can even outperform NVIDIA GPUs in software that uses DX12.

Great stuff by the guy who runs the 'AdoredTV' channel and the others who noticed things like this; this is how it's done. You experiment and find things out.

We're mostly only finding out about this now because a lot of reviewers were testing the highest-performing GPUs on the market, and those are NVIDIA's, since AMD currently has nothing to compete with the GTX 1080 and above until Vega arrives, which should be soon.
 
https://www.youtube.com/watch?list=PL_sfYUCEg8Og_I4k7nL62IsMrJv5rFRa_&v=QBf2lvfKkxA

Results (avg FPS / CPU load / GPU load):
AMD DX11: 128.4 fps CPU: 33% GPU: 71%
NVIDIA DX12: 143.9 fps CPU: 42% GPU: 67%
NVIDIA DX11: 161.7 fps CPU: 40% GPU: 80%
AMD DX12: 189.8 fps CPU: 49% GPU: 86%
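Quick arithmetic over the averages posted above makes the gap explicit (the values are copied from the post; the labels and helper are mine):

```python
# Average FPS figures as posted, keyed by GPU vendor + API.
results = {
    "AMD DX11":    128.4,
    "NVIDIA DX12": 143.9,
    "NVIDIA DX11": 161.7,
    "AMD DX12":    189.8,
}

def pct_gain(a: float, b: float) -> float:
    """Percentage by which b outperforms a."""
    return (b - a) / a * 100

# AMD gains ~48% going DX11 -> DX12, while NVIDIA *loses* ~11%.
print(f"AMD    DX11->DX12: {pct_gain(results['AMD DX11'], results['AMD DX12']):+.1f}%")
print(f"NVIDIA DX11->DX12: {pct_gain(results['NVIDIA DX11'], results['NVIDIA DX12']):+.1f}%")
```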

Shocking how the 'tech press' missed this. That's exactly the kind of thing some of these sites should have been investigating if Ryzen was throwing up big anomalies. But in this culture of 'first' and impatience, all they cared about was getting their reviews out ASAP.

But even the late reviews were completely oblivious.

Another one with the same findings:

German magazine c't did some benchmarks with a Fury X, too (and the 1800X).

https://www.reddit.com/r/Amd/commen...agazine_ct_did_a_test_with_a_fury_x/?sort=new

Rise of the Tomb Raider (DX12) 720p high details:

Titan X: 40 min/105 avg/158 max
Fury X: 49 min/130 avg/230 max

same with AotS 720p high details and DX12:

Titan X: 62 fps
Fury X: 72 fps
This is significant when you consider that the Fury X is about half the speed of a Titan X Pascal. The 720p results show that the Fury X/Ryzen combo is very close to the TXP in most of the games tested, not just in ROTTR or in DX12 only. The problem seems to affect DX11 too, since the Fury X is doing so well against the TXP in Shadow of Mordor and the other DX11 games. If they had used mGPU Fury X to simulate a TXP-equivalent AMD card, Ryzen would have easily beaten the best Intel+TXP score in all the games tested. This is more confirmation that the Nvidia GPU driver is a major factor in poor Ryzen gaming performance.
 

Sinistral

Member
Definitely interesting developments over the past few days, with AotS and Nvidia's drivers. New tech is always so fun.



With the latest 514 BIOS on the Asus Prime B350M-A board, I've been able to run my 2x16GB G.Skill TridentZ 3200 CL14 (14-14-14-34) kit at 1.35V at DDR4-2400... a step up from the default DDR4-2133 with looser timings, using a DOCP target of 2400. I tried higher targets with up to +0.2V SOC offset, but ran into boots that hang indefinitely, which is annoying as it requires me to clear the CMOS, as opposed to the defaults, where it goes through the 5-cycle failure routine. I'd try faster speeds with looser timings, but I don't have time for that right now.

It still isn't reading my 1800X temperatures correctly, it seems... but my motherboard temps seem to have dropped significantly.
 

Protome

Member
So if I have an RX480 and am looking to get a new CPU, a Ryzen 5 is probably still the best bang for my buck? It's just when paired with an Nvidia GPU it's underperforming?
 

DieH@rd

Banned
https://www.youtube.com/watch?list=PL_sfYUCEg8Og_I4k7nL62IsMrJv5rFRa_&v=QBf2lvfKkxA



Shocking how the 'tech press' missed this. That's exactly the kind of thing some of these sites should have been investigating if Ryzen was throwing up big anomalies. But in this culture of 'first' and impatience, all they cared about was getting their reviews out ASAP.

But even the late reviews were completely oblivious.

Another one with the same findings:

German magazine c't did some benchmarks with a Fury X, too (and the 1800X).

https://www.reddit.com/r/Amd/commen...agazine_ct_did_a_test_with_a_fury_x/?sort=new


This is significant when you consider that the Fury X is about half the speed of a Titan X Pascal. The 720p results show that the Fury X/Ryzen combo is very close to the TXP in most of the games tested, not just in ROTTR or in DX12 only. The problem seems to affect DX11 too, since the Fury X is doing so well against the TXP in Shadow of Mordor and the other DX11 games. If they had used mGPU Fury X to simulate a TXP-equivalent AMD card, Ryzen would have easily beaten the best Intel+TXP score in all the games tested. This is more confirmation that the Nvidia GPU driver is a major factor in poor Ryzen gaming performance.


 
Using his modus operandi dr_rus could easily argue it's actually Pascal's hardware design that's broken. ;)

Something tells me a different set of rules will be applied there lol.

Nvidia's drivers have taken a nose-dive of late. DX12 performance is shocking in these games, and I reckon it's mostly software-based, but partly hardware too.

Vega vs 1080/1080 Ti in DX12/Vulkan is going to produce fireworks.
 

Datschge

Member
Nvidia's drivers have taken a nose-dive of late. DX12 performance is shocking in these games, and I reckon it's mostly software-based, but partly hardware too.
This is getting off-topic, but I wouldn't be surprised if the cause is entirely in software. Nvidia has for years (decades?) been using a platform-agnostic binary blob as its driver, which it then adapts to interface with its different hardware as well as the different platforms and graphics APIs. This is how they manage to support new APIs quite quickly, and since the driver can also implement graphics features in software, the resulting all-encompassing driver support without differences in features has been one of Nvidia's big selling points over the years. The issue with DX12/Vulkan and Ryzen is that those work best by exploiting multithreading, while Nvidia's unified driver blob worked perfectly well without heavy multithreading up to this point.
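A toy model of that bottleneck (nothing here is real driver code; the per-call cost and draw-call count are invented): if all submission work funnels through one driver thread, CPU frame time is capped by one core, while a DX12/Vulkan-style path splits it across game threads.

```python
# Hypothetical numbers: 8000 draw calls per frame, 2 µs of CPU work each.
DRAW_CALLS = 8000
COST_PER_CALL_US = 2

def frame_time_ms(worker_threads: int) -> float:
    """CPU-side frame time if submission work splits evenly across N threads."""
    per_thread_calls = DRAW_CALLS / worker_threads
    return per_thread_calls * COST_PER_CALL_US / 1000

print(f"1 thread  (DX11-style driver): {frame_time_ms(1):.1f} ms")  # 16.0 ms
print(f"4 threads (DX12-style path):   {frame_time_ms(4):.1f} ms")  # 4.0 ms
```

In this simplified model the single-threaded path alone eats a whole 60fps frame budget, which is roughly the shape of the problem being described.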
 
This is getting off-topic, but I wouldn't be surprised if the cause is entirely in software. Nvidia has for years (decades?) been using a platform-agnostic binary blob as its driver, which it then adapts to interface with its different hardware as well as the different platforms and graphics APIs. This is how they manage to support new APIs quite quickly, and since the driver can also implement graphics features in software, the resulting all-encompassing driver support without differences in features has been one of Nvidia's big selling points over the years. The issue with DX12/Vulkan and Ryzen is that those work best by exploiting multithreading, while Nvidia's unified driver blob worked perfectly well without heavy multithreading up to this point.

Interesting. "Agnostic binary blob" is also the best technical term I've heard this year.

It seems to me AMD is going all-in with multi-threading. Vega + Ryzen could make for a killer combo for a gaming rig when the dust settles.
 

Datschge

Member
Interesting. "Agnostic binary blob" is also the best technical term I've heard this year.
FYI, "binary blob" is a common term for proprietary closed-source drivers under Linux (and by extension Android), and Nvidia is probably the best-known actor using one there (loved by gamers for its feature parity, frowned upon by those wanting a transparent system).
 
Yeah! Because the RX 480 is shit and loses in every game against the 1060!



OH WAIT


And it's funny you already have Vega benchmarks.

The RX 480, despite being the all-hyped DX12 video card, at best matches the 1060, trading blows with it: losing some benchmarks and winning others.

So you'll forgive me if I'm less than interested in how the Vega performs when I'm already planning on buying a 1080 Ti.

I like Ryzen quite a bit for what it offers and this is a Ryzen thread so I don't even know why people are bringing Vega up in it. But the Radeon line has consistently been a day late and a dollar short and Vega is unlikely to change that reality.
 
The RX 480, despite being the all-hyped DX12 video card, at best matches the 1060, trading blows with it: losing some benchmarks and winning others.

So you'll forgive me if I'm less than interested in how the Vega performs when I'm already planning on buying a 1080 Ti.

I like Ryzen quite a bit for what it offers and this is a Ryzen thread so I don't even know why people are bringing Vega up in it. But the Radeon line has consistently been a day late and a dollar short and Vega is unlikely to change that reality.


You said "Vega can't even outperform a 1080" like you were spot on. lol
 
The RX 480, despite being the all-hyped DX12 video card, at best matches the 1060, trading blows with it: losing some benchmarks and winning others.

So you'll forgive me if I'm less than interested in how the Vega performs when I'm already planning on buying a 1080 Ti.

I like Ryzen quite a bit for what it offers and this is a Ryzen thread so I don't even know why people are bringing Vega up in it. But the Radeon line has consistently been a day late and a dollar short and Vega is unlikely to change that reality.

The only thing that will be a dollar short is you after paying the Nvidia mark-up for your 1080 Ti. Vega will be cheaper at least.
 

thelastword

Banned
Yes, in much the same way Polaris vs. 1060 produced fireworks.

Wait.

LOL



If Vega can't even outperform a 1080, how's it going to take on the 1080 Ti?
Has the Vega already been released? Link me to some reviews... I don't know how I missed the launch, because I've been anticipating the thing for so long...
 
The only thing that will be a dollar short is you after paying the Nvidia mark-up for your 1080 Ti. Vega will be cheaper at least.

What mark-up? Are there currently any competitive products with the 1080 Ti that I'm unaware of? When there's only one product in a category, there is no mark-up. It's either take it or leave it.

Sure, Vega will be cheaper. And it will perform worse. So I'm not understanding what you mean by mark-up here. If it performs worse, it ought to cost less.

Anyways, this is the Ryzen thread, so I've said all I will say about Vega here.
 

Argyle

Member
What mark-up? Are there currently any competitive products with the 1080 Ti that I'm unaware of? When there's only one product in a category, there is no mark-up. It's either take it or leave it.

Sure, Vega will be cheaper. And it will perform worse. So I'm not understanding what you mean by mark-up here. If it performs worse, it ought to cost less.

Anyways, this is the Ryzen thread, so I've said all I will say about Vega here.

I have no idea about current 1080 Ti pricing, but if there is an MSRP and retailers are selling it above MSRP, isn't that the definition of a markup?

If you want to make that argument, I think you can only do that with something that doesn't have a set MSRP.
 
I have no idea about current 1080 Ti pricing, but if there is an MSRP and retailers are selling it above MSRP, isn't that the definition of a markup?

If you want to make that argument, I think you can only do that with something that doesn't have a set MSRP.

You said "Nvidia mark-up" implying Nvidia was overcharging for their products, not retailers. That's what he was arguing. On top of that, the people with the highest performing hardware always set the price where they want it. Remember how much the 7970 cost before the 680 dropped?
 

Argyle

Member
You said "Nvidia mark-up" implying Nvidia was overcharging for their products, not retailers. That's what he was arguing. On top of that, the people with the highest performing hardware always set the price where they want it. Remember how much the 7970 cost before the 680 dropped?

I actually didn't say any such thing, but OK, I did misread the post he quoted. If he was arguing that Nvidia set their price too high, then I agree with you: they can set it to whatever they think the market will bear.
 

DieH@rd

Banned
Let us remind ourselves again of this:
[table image: per-card manufacturing cost vs. MSRP for AMD and NV GPUs]


The differences between Total [manufacturing cost] and MP [MSRP] for AMD and NV cards are quite noticeable.
 
What mark-up? Are there currently any competitive products with the 1080 Ti that I'm unaware of? When there's only one product in a category, there is no mark-up. It's either take it or leave it.

What mark-up? So you think they just about break even on those £830 1080 Tis? Come on, dude.

When the 980 Ti came out, you could get them for £550. Now the new generation of that card cost £800. That's a pretty astonishing £250 mark-up in such a short space of time. There's no excuse for it.
 

Paragon

Member
What mark-up? So you think they just about break even on those £830 1080 Tis? Come on, dude.
When the 980 Ti came out, you could get them for £550. Now the new generation of that card cost £800. That's a pretty astonishing £250 mark-up in such a short space of time. There's no excuse for it.
980 Ti launched at $649 and the 1080 Ti is $699.
The UK price is £699, not £800. Blame Brexit for the increase from £549 to £699. If not for that, it probably would have been £599.
This is all getting very off-topic though.
 

kotodama

Member
A really interesting video that helps to explain some of the uneven game performance on Ryzen and why AMD's GPUs seem less affected by it:

AMD vs NV Drivers: A Brief History and Understanding Scheduling & CPU Overhead

Hopefully this is something that NVIDIA will be working to improve.

Interesting video. Makes me wonder how much Vega will improve AMD's hardware scheduler in DX11. Also, if this is true, it seems reviewers doing CPU performance testing really should be using AMD cards, as that would better isolate the CPU.
 