
AMD's Radeon Navi Review Thread: Series 5700.

thelastword

Banned
Does RIS/CAS work at native resolution?
Absolutely.....

Man, can't wait for non-blower Navi cards, all these peeps who post benches with 2 fps difference yelling how much better Super is can shut the fuck up. :messenger_tears_of_joy:
Hey, the 5700XT is 2% slower than the 2070 Super according to the Benchmark King, Hardware Unboxed, and the buck stops there as far as the most accurate figure given, in the here and now... Yet the 2070 Super is an FE, already OC'd from Nvidia, and the FE's are also some of their best chips of the lot... Yet some folk want to gloat that the 2070 Super beats the 5700XT, when the XT is using a blower cooler... Then they source other sites which bench 3 games, mostly NV-favored, and say the 2070 Super is 6% or 7% faster, etc... when some of these youtubers never raise the power limit on the Radeon cards, for crying out loud... I even saw a bench where Radeon Navi was losing to Turing in Forza 4... Just pure malarkey in some circles...

The only benches you can trust for a fair average across a wide gamut of games are Hardware Unboxed's... I see DF, for example, still using AC Unity and Crysis 3 and so many titles that they know favor Nvidia, some as old as Leadbetter himself... Yet there are so many newer titles... Where is Black Ops 4, btw? I see Fortnite all the time; Black Ops is a newer game and very popular too, nowhere to be seen... Where is DMC, RE2, Rage 2, Anthem, Destiny 2... Some don't bench Forza 4, yet that is pretty new; where is Dirt 2 in most benches... I'd even bench Black Ops 3, Doom, Infinite Warfare... People had no problem benching CSGO all those years... because they knew what team it favored...
 

LordOfChaos

Member

thelastword

Banned
And I guess the benchmark which many want to see...

$400 Radeon 5700XT vs $500 Nvidia 2070 Super... Paired with a Ryzen 3600 at 4.2GHz.





Also, I had to post this review since this morning... It's Level1Techs' review...

 

CrustyBritches

Gold Member
Supers are now on sale; I see 4 different 2060S models running $399. On Newegg, all the 2060S models $419 and under are sold out. Now the countdown begins for "mid-August" when we can get a decent 5700/5700 XT partner AiB. Just like 1060 vs 480, there's a whole period where it's partner AiB 2-3 fan GTX 1060s flooding the market while everybody waits a month for the AMD partner boards, or is forced into buying a crap blower with issues.

On the plus side, this means we could see $420 2-3 fan 5700 XT AiBs; on the other hand, I'm unlikely to wait that long when I can snag a highly overclockable 2060S.
 
got a 5700XT for playing around with today.

img_6208lkj7s.jpg


done some quick overclocking and undervolting in firestrike. you can see the parameters for wattman under the bar graphs. mind that i haven't done any stress testing with those, but they should give a good starting point. the framed one is a good compromise between power and performance (that said, the undervolt is honestly good enough though)

firestrike6pjvz.png



Power:

firestrikepowerq4kfy.png



Efficiency:

firestrikeefficiency4fkeg.png
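For anyone curious how an "efficiency" number like the one in that last chart falls out of the raw data: it's just score divided by average board power. A quick sketch; the scores and wattages below are made-up placeholders, not the results in the screenshots:

```python
# Hypothetical sketch of a perf-per-watt comparison like the bar chart above.
# All numbers here are invented placeholders, not measured 5700 XT results.

def efficiency(score: float, avg_power_w: float) -> float:
    """Fire Strike graphics score per watt of average board power."""
    return score / avg_power_w

runs = {
    "stock":     (27000, 220.0),  # (score, watts) - placeholder values
    "undervolt": (28000, 190.0),
    "overclock": (28500, 240.0),
}

for name, (score, watts) in runs.items():
    print(f"{name:10s} {efficiency(score, watts):6.1f} points/W")
```

With numbers shaped like these, the undervolt wins on efficiency even when the overclock wins on raw score, which matches the general shape of the chart.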
 

JohnnyFootball

GerAlt-Right. Ciriously.
Supers are now on sale; I see 4 different 2060S models running $399. On Newegg, all the 2060S models $419 and under are sold out. Now the countdown begins for "mid-August" when we can get a decent 5700/5700 XT partner AiB. Just like 1060 vs 480, there's a whole period where it's partner AiB 2-3 fan GTX 1060s flooding the market while everybody waits a month for the AMD partner boards, or is forced into buying a crap blower with issues.

On the plus side, this means we could see $420 2-3 fan 5700 XT AiBs; on the other hand, I'm unlikely to wait that long when I can snag a highly overclockable 2060S.
What's funny is that back in 2016 I was wanting to upgrade my GTX 670 to a 480 or a 1060, BUT I was not gonna get the shitty AMD reference version. I was gonna wait for the Sapphire version. Before the AIB models of the 480 came out, the 1060 was released. I almost bought a 1060, but they sold out on Newegg within minutes. What ended up happening is that about a week later an Asus 1070 Turbo (their cheaper blower model) went on sale for its MSRP of $379 using a coupon, and I ended up getting that. At that time Pascal cards were selling a bit above MSRP (although it was nothing compared to the mining craze), so I was pretty happy with that. A year later, the mining craze was in full swing and I was extremely happy I had my 1070.
 
Last edited:

Ascend

Member
Word is spreading. Might be possible that nVidia really is worried. If they aren't, maybe they should be. This will add to AMD's mind share.

AMD's cards handily win on performance, according to Tom's Hardware, marking the first time in a long time that NVIDIA has felt genuine competitive pressure from its rival.
...
AMD's RX 5700 has an MSRP of $349 following a $30 price cut. The faster RX 5700 XT goes for $399 following a $50 price cut. These cards go head-to-head with NVIDIA's $349 RTX 2060 and $399 RTX 2060 SUPER.

Tom's Hardware found that the RX 5700 produced 11% higher frame rates, averaged across its benchmark suite, than the RTX 2060. Had AMD kept its original pricing, the comparison would have been muddled by a higher price. But with both cards now priced the same, AMD's entry clearly comes out on top.

The RX 5700 XT also bests its competition, beating the RTX 2060 SUPER by 9.9% on average. It even comes close to the performance of NVIDIA's $499 RTX 2070 SUPER, which beats AMD's card by just 6.9% despite costing 25% more.

On top of beating NVIDIA on raw performance, AMD made huge gains in power efficiency. The company's new cards aren't quite as power efficient as NVIDIA's products, but the gap has been substantially narrowed. Power efficiency has been one of AMD's big weaknesses for years, but a new architecture coupled with the move to a 7nm manufacturing process has allowed the company to nearly catch up with NVIDIA.


 

CrustyBritches

Gold Member
A year later, the mining craze was in full swing and I was extremely happy I had my 1070.
Very nice. The mining craze really had a negative impact on gaming prices. It was typical to see an RX 480 $100 over MSRP. I sold my R9 390 for a good price and got an RX 480 Red Devil 8GB for $269 when they launched about a month after the reference cards. Ether jumped to around $198 and I hurried up and bought 2 1060s for about $260 each. Ether eventually got up around $900 and I made back all my money plus profit. Had about 70MH/s for about $580 because I sold my 390. Good times.
 
Last edited:

SonGoku

Member
got a 5700XT for playing around with today.

img_6208lkj7s.jpg


done some quick overclocking and undervolting in firestrike. you can see the parameters for wattman under the bar graphs. mind that i haven't done any stress testing with those, but they should give a good starting point. the framed one is a good compromise between power and performance (that said, the undervolt is honestly good enough though)

firestrike6pjvz.png



Power:

firestrikepowerq4kfy.png



Efficiency:

firestrikeefficiency4fkeg.png
Holy shit, 2070MHz core clock?
edit: seems to be memory clock, what's the core clock?
 
Last edited:
Holy shit, 2070MHz core clock?

no, it's just a parameter with which you adjust your frequency-voltage curve for overclocking. real observed clocks in firestrike are just shy of 2000 MHz for the overclocked variant. but that is load dependent and will be a little different in every game.

wattman2ijqs.png


you might get above 2000MHz rather easily with water cooling.


GDDR6 clock is 900 Mhz (on every module - not that "effective" bullshit nvidia is promoting) in OC. Stock it's 875Mhz.
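For context on what that 2070 figure actually is: Wattman exposes points on a frequency-voltage curve, and the card runs as high up that curve as its power and thermal budget allows. A toy model of the idea; the (MHz, mV) curve points below are invented for illustration, not the real 5700 XT values:

```python
# Toy frequency-voltage curve: the driver picks the highest frequency whose
# required voltage fits under the current power/thermal budget. The points
# here are invented placeholders, not AMD's fused values.
V_F_CURVE = [(800, 750), (1400, 900), (1750, 1020), (2070, 1200)]  # (MHz, mV)

def voltage_for(freq_mhz: float) -> float:
    """Linearly interpolate the voltage the curve demands at freq_mhz."""
    pts = V_F_CURVE
    if freq_mhz <= pts[0][0]:
        return pts[0][1]
    for (f0, v0), (f1, v1) in zip(pts, pts[1:]):
        if freq_mhz <= f1:
            t = (freq_mhz - f0) / (f1 - f0)
            return v0 + t * (v1 - v0)
    return pts[-1][1]

# Undervolting = lowering the voltage of the top point: same 2070 MHz
# setpoint, less power per clock, so the card throttles less in practice.
```

That's why the 2070 "clock" slider never shows up as a steady 2070MHz in a real run: it's the top of the curve, not a forced frequency.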
 
Last edited:

SonGoku

Member
no, it's just a parameter with which you adjust your frequency-voltage curve for overclocking. real observed clocks in firestrike are just shy of 2000 MHz for the overclocked variant. but that is load dependent and will be a little different in every game.

you might get above 2000MHz rather easily with water cooling.


GDDR6 clock is 900 Mhz (on every module - not that "effective" bullshit nvidia is promoting) in OC. Stock it's 875Mhz.
What's your oc core clock on the blower?
 
What's your oc core clock on the blower?


so i made a frequency over time plot for my firestrike run (GT1 on the right and GT2 on the left):

frequencystockvs.oc3kjab.png


so it overshoots 2000 a few times, but that's not an average.

What's wrong with that? :p
Didn't know core clocks vary on a per-game basis.

yeah, it's not much, but i think games like F1 2019 will clock a bit higher than The Division or something. all depends on which engine stresses what part of the GPU the most.
 
Last edited:

Marlenus

Member
What's wrong with that? :p
Didn't know core clocks vary on a per-game basis.

They vary massively, from 1900MHz down to 1740MHz depending on the game. Anandtech has a chart of clock speeds in each game.

I think this is why it loses to the 2070S: the clock speed is way more variable. If it were closer to 1850 at all times, it would be a lot closer.

EDIT: To be specific, the 5700XT clock speed is far more variable; the 2070S tends to be above 1850, with just one exception where it is still above 1800.
 
Last edited:

Ascend

Member
They vary massively, from 1900MHz down to 1740MHz depending on the game. Anandtech has a chart of clock speeds in each game.

I think this is why it loses to the 2070S: the clock speed is way more variable. If it were closer to 1850 at all times, it would be a lot closer.

EDIT: To be specific, the 5700XT clock speed is far more variable; the 2070S tends to be above 1850, with just one exception where it is still above 1800.
That variance will most likely be gone when the AIB models roll around. Can't wait to see what the Nitro+ model will do.
 

thelastword

Banned
So....is 5700 worth upgrading from RX 580 ?
What do you think?



Bonus vids vs other prior Radeon mid-high cards..






This one's a very interesting video



The video above is coincidentally supported by this one;

It's clear as day... Leadbetter's review was embarrassing, Hardware Canucks just as bad, OC3D? Atrocious... Even some of the guys who saw the great numbers went off on different tangents: the day of the launch, it was after the 4th of July, it was on a Sunday, the dent on the XT is awful, the blower fan... All meandering away from the performance of the damn thing. Is it great value for your money vs the competition or not?

Some of these guys are clearly not reviewers; their allegiance is based on who pays them or promises them scoops for their channel or its growth... Gone are the days when people gave honest reviews... but it's a bit too glaring now... So my question is this: what will they do when Nvidia is no longer at pole position? Suck up to AMD?

Look at Hardware Unboxed: they called out DLSS for what it was, and now Nvidia is giving them the cold shoulder... RTX is only available on 4 titles after 10 months, that's the reality... but people want to recommend it as if it's available on hundreds of games... Yet just look at how Nvidia fans react when anyone declares the truth about these things: they get all hot and bothered and start slinging, just like Nvidia. You see, the apple does not fall too far from the tree...

Look at how dirty Nvidia is playing after so many scandals and so many infractions all these years; they're still at it, yet they've never been in a position of such weakness before, so the GPU industry is poised to turn right on its head, just like the CPU industry. Truth is, AMD has a dynamite product right now in Navi... but let any techtuber say that and they're in for a world of hurt...

Then the RTX brigade come rushing in with their 4 RTX tryout games in hand, talking about 7nm vs 12nm, as if being on 7nm is a bad thing... Reviewers talk about so many things in Navi's review other than what they should: price to perf, FidelityFX, Anti-Lag, RIS, streaming, encoding (HEVC), etc... No, let's deflect instead; let's talk about blowers when we knew beforehand that's what was going to be on there, knowing that AIBs are coming in short order; let's talk about RTX because Radeon does not have it, at 1080p in 4 games; no comparison of CAS to DLSS, because they blew DLSS just as they are blowing Turing RTX now, and got embarrassed...

Nvidia gave us a 1080 Ti matcher in the 2080 for $800 a few months ago. AMD gives us a 1080 Ti matcher for $400 with only 40 CUs. Let's not talk about that; let's talk about shroud dents instead, which is a design choice. Anything trivial gets mountained up when they're only molehills... I couldn't care less how a card looks, it's all subjective anyway... All I want is good price to perf, but let's ignore that, oh ye professional techtubers... and pretend for a minute that you guys are actually concerned for AMD with a straight face... "AMD is just not doing enough"... I swear, AMD could give us a 2080 Ti beater for $600 and people would still argue for the 2080 Ti at its current price...
 

thelastword

Banned
Radeon Image Sharpening Tested, Navi's Secret Weapon For Combating Nvidia



DLSS gets schooled in quality and support... All DX9, DX12 and Vulkan games are supported, with DX11 support pending... OTOH, only a few games support DLSS... In essence, RIS just works...
 

Ascend

Member
Radeon Image Sharpening Tested, Navi's Secret Weapon For Combating Nvidia



DLSS gets schooled in quality and support... All DX9, DX12 and Vulkan games are supported, with DX11 support pending... OTOH, only a few games support DLSS... In essence, RIS just works...

That's a very nice feature. Playing at 80% of 4K and sharpening gives the same quality as native 4K and boosts your framerate. Navi only keeps looking better and better.
 
Radeon Image Sharpening Tested, Navi's Secret Weapon For Combating Nvidia



DLSS gets schooled in quality and support... All DX9, DX12 and Vulkan games are supported, with DX11 support pending... OTOH, only a few games support DLSS... In essence, RIS just works...


Yup, DLSS being a per-game, per-resolution feature that nVidia drummed up trying to sell an architecture designed for enterprise to gamers.

AMD RIS is a feature that is supported by 47 games on the day of release.
 

thelastword

Banned
Yup, DLSS being a per-game, per-resolution feature that nVidia drummed up trying to sell an architecture designed for enterprise to gamers.

AMD RIS is a feature that is supported by 47 games on the day of release.
Yes, and Remij is telling me in the other thread that this is not enough. Compared to what, exactly?

Also, add to that...

56 Vulkan games and 1779 DX9 games... AMD is not doing enough, you see... 1 Nvidia TF is equal to 10 AMD TFs, 1 DLSS game is equivalent to 1000 RIS games... and all that...
 
Yes, and Remij is telling me in the other thread that this is not enough. Compared to what, exactly?

Also, add to that...

56 Vulkan games and 1779 DX9 games... AMD is not doing enough, you see... 1 Nvidia TF is equal to 10 AMD TFs, 1 DLSS game is equivalent to 1000 RIS games... and all that...
Also, what will the excuse be when RIS eventually supports DX11?

It should be noted that AMD's solution is good for both developers and consumers. Developers have a choice between implementing FidelityFX for higher quality sharpening or can just leave it to RIS. Even if they don't use FidelityFX, consumers will still benefit from RIS.
 

thelastword

Banned
Also, what will the excuse be when RIS eventually supports DX11?

It should be noted that AMD's solution is good for both developers and consumers. Developers have a choice between implementing FidelityFX for higher quality sharpening or can just leave it to RIS. Even if they don't use FidelityFX, consumers will still benefit from RIS.
Not enough DX13 support????? :messenger_beaming:...


I swear, the hoops some of these guys jump through... Also, your second point is just as potent... RIS works on almost 2000 games, day and date, and when they sort out DX11 there will be tonnes more... If developers want to manually tinker with and enhance how the image is enhanced in their game, they can use the FidelityFX suite and go for broke; that's just genius... I really want to see what they can come up with by using FidelityFX from the ground up in their games; perhaps we will get less blurry games from devs too, because CAS in the FidelityFX suite is also said to improve TAA quality, so devs can do a lot for the image with FidelityFX...

Also, RIS, "the Radeon sharpening filter", does not just sharpen like some are thinking; it's dynamic. Not all areas of the image are sharpened: detail in low-contrast areas is boosted while high-contrast points remain mostly untouched. The general image is enhanced and made sharper, so details that you would normally spot and pick out whilst you game look more defined... It is much better technology than all the other image sharpening filters out there; it's a more evolved algorithm more than anything else...


"Following the Radeon RX 5700 series launch, AMD has now open-sourced their Contrast Adaptive Sharpening (CAS) technology under FidelityFX on GPUOpen.

Contrast Adaptive Sharpening provides sharpening and optional scaling and is implemented as HLSL and GLSL shaders for Direct3D and Vulkan. CAS is designed to provide better sharpness with fewer artifacts and to increase the quality of temporal anti-aliasing."

https://www.phoronix.com/scan.php?page=news_item&px=AMD-GPUOpen-FidelityFX-CAS
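The real CAS ships as HLSL/GLSL shaders on GPUOpen. As a rough illustration of the idea described above (boost detail where local contrast is low, back off on hard edges), here is a hypothetical grayscale NumPy sketch; it is not AMD's shader, just the general shape of contrast-adaptive sharpening:

```python
import numpy as np

def cas_like_sharpen(img: np.ndarray, strength: float = 0.2) -> np.ndarray:
    """Very rough, grayscale-only approximation of contrast-adaptive
    sharpening for values in [0, 1]: each pixel is pushed away from its
    3x3 neighborhood mean, with the push scaled DOWN where local contrast
    is already high. Not AMD's actual shader - see GPUOpen for the real CAS."""
    img = img.astype(np.float64)
    h, w = img.shape
    pad = np.pad(img, 1, mode="edge")
    # 3x3 neighborhood min/max/mean via the nine shifted views of the image
    stack = np.stack([pad[i:i + h, j:j + w]
                      for i in range(3) for j in range(3)])
    nmin, nmax, nmean = stack.min(0), stack.max(0), stack.mean(0)
    local_contrast = nmax - nmin            # 0 (flat area) .. 1 (hard edge)
    weight = strength * (1.0 - local_contrast)  # adapt: sharpen edges less
    out = img + weight * (img - nmean)
    return np.clip(out, 0.0, 1.0)
```

A flat region passes through unchanged (the pixel equals its neighborhood mean), subtle low-contrast detail gets the full boost, and a hard edge gets almost none; that "adaptive" part is what separates this family of filters from a plain unsharp mask.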
 

so i made a frequency over time plot for my firestrike run (GT1 on the right and GT2 on the left):

frequencystockvs.oc3kjab.png

so i tested the plain undervolt in a gaming scenario today - resident evil 2 / 1440p max settings. there are some distinctive differences in comparison to the undervolted results in 3DMark Fire Strike.

so in the following picture you see the over-time graphs for frequency, fps and thermals for stock (dotted line) and undervolted.

undervoltw7joo.png


so that gained us a measly 2% and not the 4% we would expect from the Fire Strike benchmark. so what went wrong?
as you can see from the frequency graph, the card now runs at an average of 1900MHz as opposed to 1800MHz. so that's even 5% up. why doesn't that materialize in fps? in contrast to the over-time Fire Strike graph above, the card does not seem to hold the clock rate steadily but dips frequently. such behavior is a clear sign of throttling. so what gives? looking at the temperature graphs, we don't see much, except that both graphs lie perfectly on top of each other. BINGO: the thermals must still be bottlenecking the card, in contrast to the Fire Strike test, where the card hit its power limit and didn't immediately go into full "save my ass" throttling mode.


ok so i ramped the fans up from auto to manual 50% or something.

undervoltfansuparka9.png


et voila: now that the card runs 10C cooler, we get a steady frequency graph at an average of 1920MHz. the card now even scales better than in Fire Strike, with roughly 5% more fps.

so what do we learn [besides the fact that if AMD hadn't gone with a fucking blower cooler, they would have been right up the arse of the 2070 Super]? the gains of overclocking and undervolting can vary depending on load and on what is actually the bottleneck. not all games/engines are the same.
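The throttling diagnosis above (a higher average clock, but with frequent dips) can also be checked programmatically instead of by eye. A minimal sketch; the sample traces are invented values, not an actual 5700 XT log:

```python
# Sketch: flag throttle dips in a logged clock trace, the way the
# frequency-over-time graphs above were read by eye. Traces are invented.

def throttle_dips(trace_mhz, drop_mhz=100):
    """Return indices where the clock drops more than drop_mhz below the
    running average so far - a crude signature of throttling."""
    dips, total = [], 0.0
    for i, f in enumerate(trace_mhz):
        total += f
        avg = total / (i + 1)
        if avg - f > drop_mhz:
            dips.append(i)
    return dips

steady  = [1920, 1915, 1925, 1918, 1922]   # holds the clock: no dips
dipping = [1900, 1905, 1710, 1895, 1700]   # similar average, but throttling

print(throttle_dips(steady))   # -> []
print(throttle_dips(dipping))  # -> [2, 4]
```

Both traces have a healthy-looking average, which is exactly why the fps gain didn't materialize until the fans were ramped up: the average hides the dips.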
 
Last edited:
Yes, and Remij is telling me in the other thread that this is not enough. Compared to what, exactly?

Also, add to that...

56 Vulkan games and 1779 DX9 games... AMD is not doing enough, you see... 1 Nvidia TF is equal to 10 AMD TFs, 1 DLSS game is equivalent to 1000 RIS games... and all that...
Did I say that... or did I say that it wasn't a whole lot of games?

Are you seriously acting like DX9 support means anything? ROFL. These GPUs have the power to slay any DX9 game at native 4K... and they aren't games that need sharpening either. lmfao.

I also said... DX11 or bust. Because that's the API with BY FAR the most games that people currently ACTUALLY PLAY.

You seem disgruntled. You might as well add 1 Radeon sale = 1000 Nvidia sales while you're at it :pie_tears_joy:
 
Last edited:

CrustyBritches

Gold Member
ASUS to Release Custom NAVI GPUs in September
In a blog post on Edge UP, ASUS said that "Our initial Navi offerings will use AMD's reference cooler design and clock speeds, but we'll be tweaking, tuning, and powering up these new Radeons with coolers of our own design soon. Stay tuned for more details in September." This means that custom cards for Radeon RX 5700 and 5700 XT are arriving later than what we previously thought. It was believed that custom designs from AIBs would arrive some time in August, but the Edge UP post now contradicts that claim. In order to find out more, we would have to wait until August at least. Additionally, it may be possible that a "paper launch" will happen in August, while the general availability is targeted for September.
 

thelastword

Banned
I'm not stoked for ASUS cards, but we shall see... They had one of the worst Vega 64s, next to Gigabyte... Sapphire and PowerColor were the go-to AIB Vega cards... I'm optimistic, though, that all cards will be excellent and that all AIBs will put their best foot forward, because they have some respect for AMD now... At least that's what it looks like, because there are quite a few Navi AIBs coming, some really whacky designs too... Some really powerful coolers... Sapphire has a water cooler on one of their cards... Navi is getting the treatment tbh... I guess they know the high-end Navis are coming, so they are going all out to impress...


As for stabilized clocks... Obviously, undervolting stabilizes your clocks; it's an AMD card after all, and that's been consistent forever... AMD always overvolts for some reason, so an easy way to get your card cooler and stabilize clocks is to UV... Same thing for Vega... Yet for Navi, you don't even necessarily have to UV to get your clocks consistently at 1970-1980MHz... Just increase the power limit and voila.

Just look at this video: no OC, memory is at 875... The power limit is just maxed out on both cards... The 5700XT stays in the high 1900s and goes over 2000MHz quite often throughout the tests... It's averaging high 1900s... Look at the core of the XT in all games...

5700XT vs 2070 FE



The only games with unstable clocks are Shadow of the TR and World War Z in some scenes... Yet though I see Navi's clocks dipping in WWZ, it still maintains a much higher fps than Turing... So maybe AMD can push more performance out of these cards in World War Z... The same for Shadow of the TR: when you hover over the village, clocks fall on Radeon and so does framerate, and in these scenes Turing pulls ahead, so maybe AMD should pay attention to that and optimize some more...

5700 vs 2060


In the case of the 5700, it's hovering close to 1700 in most tests... However, what I highlighted above in World War Z and Shadow can be seen more blatantly here...

p2yspkE.jpg


Now just before that point in the video, and in that whole village scene, Radeon clocks go under 1000MHz many times, into the 700-1000MHz range, and curiously, this is where Turing takes the lead in the whole benchmark... Turing maintains its clocks... As long as Navi maintains its clocks, it beats Turing, guaranteed... This, ladies and gentlemen, is what you call a lack of optimization, and Navi loses by a 2-3 fps average in that title... The same can be seen in World War Z, not as bad as Shadow, but Radeon is so fast in that title that you hardly take notice; yet it means World War Z can go even faster on Navi... Overall it just shows that Navi has a lot more performance left on the table in said titles... I'm curious to see if the same thing happened with Vega... Well, whatever is causing this, I hope the driver team takes notice...

This, ladies and gentlemen, is how games can get better FPS over time. Sometimes people don't notice these things, or it's a new architecture, so perhaps they are still optimizing drivers... Drivers in the next 3 months will be interesting...
 
So while testing RIS... Hardware Unboxed is using games that have internal resolution scalers. There are a lot of games where you can't even scale the internal resolution... so hitting that 70-80% of 4K sweet spot is impossible for AMD users in most games. The ACTUAL appeal of this RIS filter (the ability to get a look that is close enough to native 4K by sharpening details of a slightly lower base resolution) only has value if you can scale the internal resolution down in game and reap the performance benefits. That means in games without an internal res scaler they have to drop down to 1440p... at which point Hardware Unboxed has said DLSS can look better...

If Nvidia used 1620p or 1800p as the base resolution for 4K DLSS instead of 1440p, then visuals would be sharper, clearer, more defined, and more temporally stable.

I'm sure the algorithms will improve as well.
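The resolution-scaling argument above is easy to sanity check with pixel arithmetic. Assuming the "70-80% of 4K" figures refer to per-axis scale (as in-game resolution sliders usually do), a quick sketch:

```python
# Pixel-count arithmetic behind the resolution-scaling argument above.
# Assumes 16:9 resolutions and that "x% of 4K" means x% scale per axis.

FOUR_K = (3840, 2160)

def pixel_fraction(height: int) -> float:
    """Fraction of native 4K pixels rendered at a 16:9 resolution
    of the given height (e.g. 1440, 1620, 1800)."""
    width = height * 16 // 9
    return (width * height) / (FOUR_K[0] * FOUR_K[1])

for h in (1440, 1620, 1800, 2160):
    print(f"{h}p renders {pixel_fraction(h):5.1%} of 4K's pixels "
          f"({h / 2160:.0%} scale per axis)")
```

1440p is only about 44% of 4K's pixels (67% per axis), while 1800p is roughly 69% (83% per axis), which is why a base resolution in that upper range leaves far less detail for any reconstruction or sharpening pass to invent.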
 
Last edited:

thelastword

Banned
8vozIwc.jpg


It's interesting and funny: I was looking at this pic, and it seems Nvidia has been offering blower-style coolers for a while now. I don't remember too much noise about them in the Pascal days tbh, but now they're so unbearable to gamers and techtubers that they go on and on about it like it's the cardinal sin... Just funny, innit... The usual techtuber suspects just hate blowers with a passion... Yet nobody talks about the coil whine issue lots of FE cards seem to have...

Anyway, here's Timmy Joe putting the cooler of an R9 390 on his 5700XT: whisper quiet... I think Hardware Unboxed did a similar thing, couldn't hear a peep either. Bodes well for AIBs in the next few weeks...

 
It's interesting and funny: I was looking at this pic, and it seems Nvidia has been offering blower-style coolers for a while now. I don't remember too much noise about them in the Pascal days tbh, but now they're so unbearable to gamers and techtubers that they go on and on about it like it's the cardinal sin... Just funny, innit... The usual techtuber suspects just hate blowers with a passion... Yet nobody talks about the coil whine issue lots of FE cards seem to have...

That's because AMD cards have always run hotter, which even on blower shrouds causes them to run at higher RPMs and be louder compared to Nvidia.

Also, not all blowers are made equal... AMD obviously cheaps out on their shrouds compared to what Nvidia had even in the 900/1000 series.
 
Last edited:
blowers are acceptable for 150W-tier GPUs, bearable for 175W, and just not acceptable for 200+W

Just look at this video: no OC, memory is at 875... The power limit is just maxed out on both cards... The 5700XT stays in the high 1900s and goes over 2000MHz quite often throughout the tests... It's averaging high 1900s... Look at the core of the XT in all games...

5700XT vs 2070 FE


The only games with unstable clocks are Shadow of the TR and World War Z in some scenes... Yet though I see Navi's clocks dipping in WWZ, it still maintains a much higher fps than Turing... So maybe AMD can push more performance out of these cards in World War Z... The same for Shadow of the TR: when you hover over the village, clocks fall on Radeon and so does framerate, and in these scenes Turing pulls ahead, so maybe AMD should pay attention to that and optimize some more...


well, if you know joker, you know that he does not only max out the power limit but also the fans, to avoid thermal throttling. you can clearly see that in the reported GPU temperatures in his video. with the XT's blower that is not a realistic use case; you won't see clocks hovering around 2GHz with fan noise that anyone would consider acceptable. ... the video might be a good indication of what we will realistically see from partner models though.

the village scene from the SotTR benchmark looks like a bandwidth bottleneck to me. whether that's fixable via drivers remains to be seen. but i don't think you see that in-game during the village parts anyway; it's more of an artificial load for the benchmark.
 
Last edited:

thelastword

Banned
That's because AMD cards have always run hotter, which even on blower shrouds causes them to run at higher RPMs and be louder compared to Nvidia.

Also, not all blowers are made equal... AMD obviously cheaps out on their shrouds compared to what Nvidia had even in the 900/1000 series.
They run hotter because AMD cards in the past were always more power hungry and had more raw power, so fan speeds had to be higher... AMD does not use as much compression technology to save GPU cycles and keep their fans lower, so it all adds up...

Also, didn't AMD put watercoolers on their Fiji cards and even one of their Vega cards? Didn't hear any praise in the media for that... Don't get me wrong, if we're all moving to something better, I'm happy. It's just the hypocrisy and double standards about it all... Come to think of it, didn't the R9 295X2 also have water cooling? So it goes as far back as Hawaii...

Blowers are shit. The numbers don't lie. Nobody should be buying these hot and loud reference cards. Just wait. I know it sucks to wait while Supers are flooding the market and partner AiBs could be a month or 2 out, but it's worth it.
This blower is not the worst. People are pretending they are running open setups; if your case is closed, you're not going to hear these fans that much... Obviously I'm in for more silent coolers, but people are really tech savvy these days... You can buy a blower and put in your own cooling solution; you can buy a waterblock and put it on... People don't need a tech to do everything for them nowadays, especially if you have the DIYs to aid you...




This guy has a waterblock on his Radeon VII; if you want, you can put your own fans on as well...

Here he has the Radeon VII OC'd to 1200MHz on the memory and over 2000MHz on the core... He got a nice boost...


Granted, AIBs might be able to do more with the BIOS... It looks like there is an 1850 lock on the 5700 and a 2100 lock on the 5700XT, so these cards can be pushed much more, especially the XT, which easily does 2000MHz on a blower; with better cooling, I can see people pushing it to 2200-2400MHz... Note: at 2000+MHz on the core, one reviewer registered massive improvements to 1% lows... Obviously you wouldn't keep the cards OC'd like that, since fans were at 100%...

blowers are acceptable for 150W-tier GPUs, bearable for 175W, and just not acceptable for 200+W



well, if you know joker, you know that he does not only max out the power limit but also the fans, to avoid thermal throttling. you can clearly see that in the reported GPU temperatures in his video. with the XT's blower that is not a realistic use case; you won't see clocks hovering around 2GHz with fan noise that anyone would consider acceptable. ... the video might be a good indication of what we will realistically see from partner models though.

the village scene from the SotTR benchmark looks like a bandwidth bottleneck to me. whether that's fixable via drivers remains to be seen. but i don't think you see that in-game during the village parts anyway; it's more of an artificial load for the benchmark.
Yeah, those temps are a bit controlled, yet Joker insists all he does is increase the power limits... In any case, it bodes well for AIB coolers... There's a lot more performance to come out of these cards if their core clocks are not curtailed by a ceiling...

The village scene, even if it's bandwidth limited... that doesn't explain it... Navi is not more bandwidth-starved than Turing... GPU utilization just goes down a tonne on the Navi GPUs there... OTOH, Turing's GPU utilization is much higher...
 

kittoo

Cretinously credulous
Noob question I suppose, but where do we go from here? We are reaching the limits of lithography. Maybe 5nm will come, but what about after that? Are there any other developments in the pipeline which will keep improving performance, or will we be stuck here for a while?
 

Ascend

Member
Noob question I suppose, but where do we go from here? We are reaching the limits of lithography. Maybe 5nm will come, but what about after that? Are there any other developments in the pipeline which will keep improving performance, or will we be stuck here for a while?
3nm is in the works for 2021 by Samsung and TSMC. It's unclear whether we will reach lower than that, although they are already researching 2.1nm, 1.5nm and 1.0nm.
 
Last edited:

llien

Member
Radeon VP on "jebaiting"




02:15 – Scott’s Role As VP and General Manager of the Radeon BU
05:54 – What Did The Hardware Review Community Miss With The Radeon RX 5700 Launch?
10:45 – Radeon Image Sharpening versus NVIDIA DLSS
13:11 – What Happens During The Bring-Up Of A New GPU?
16:15 – Scott’s Thoughts On The Push For Real-Time Ray Tracing
20:23 – How Did AMD Increase IPC With Navi?
21:33 – Explaining The New Radeon RX Series Naming Convention
23:42 – With Such A Small Die Size, How Much Wiggle-Room Does AMD Have to Make Bigger / Smaller GPUs?
25:35 – JEBAITED! Yes, No, Maybe? Scott Explains...
40:21 – Are There Issues With The Radeon RX 5700 Series H.264 Encoding Engine?


Gives an interesting insight on Steam market share metrics.
 
Last edited:

shark sandwich

tenuously links anime, pedophile and incels
8vozIwc.jpg


It's interesting and funny: I was looking at this pic, and it seems Nvidia has been offering blower-style coolers for a while now. I don't remember too much noise about them in the Pascal days tbh, but now they're so unbearable to gamers and techtubers that they go on and on about it like it's the cardinal sin... Just funny, innit... The usual techtuber suspects just hate blowers with a passion...
6vwIxci.jpg


Yeah gee why would anybody complain about those whisper quiet blower coolers! Those people must have an anti-AMD agenda or something.

(BTW there were non-reference 1080s available at launch, most of which were equally priced or cheaper than the FE, so yeah of course people wouldn’t be complaining as much about the blower)
 