
AMD RX 6000 'Big Navi' GPU Event | 10/28/2020 @ 12PM EST/9AM PST/4PM UK

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
The 3080 Ti is rumoured to be based on GA102 with 4 SMs disabled and 12GB of GDDR6X, probably at the same price as the 6900 XT.

Right now the 3090 looks completely stupid price/performance wise for gamers so Nvidia need something to counter it.

Same reason they are using an even further cut-down version of GA102 as the 3070 Ti to combat the 6800, since it makes the 3070 completely redundant at only $80 more.

There is no space on the GA102 to make a 3080 Ti.
It can't be stronger than the 3090, and the space between the 3090 and 3080 would be within overclock range.
Hell, probably worse: the 3090 is already only a few percent faster than the 3080, so the 3080 Ti would be like 0.5% faster than the 3080.


A cut-down GA102 RTX 3075 actually makes a shit ton of sense.
 
Wait for AIB card announcements.

All the leaks we had in the last couple of weeks originated from AIBs or people who managed to get their hands on AIB samples.

AMD reference cards almost always clock in lower than what the AIBs manage to squeeze out.

That's fair. Everything I'm saying is still early and subject to change. We won't truly KNOW until the GPUs are out in the wild and tested.

But at this point, IMO, it looks like all of those sky high clocks are just fantasy.

Today, AMD announced their new GPUs with game clocks (sustained performance) at 1.8 to 2.0 GHz - this is NORMAL. GPUs for years have been able to hit these frequencies and hold them.

2.1GHz has been the upper end, but possible.

2.2GHz - I don't think there has been ANY GPU made so far that can sustain this clock. Even a watercooled GPU can't quite do this.

So when we hear rumors of RDNA2 GPUs that can run at 2.2 to 2.5GHz, that's a range that has so far only been possible with liquid nitrogen.

I'm not saying that these clocks are impossible to hit without LN2 (clocks get faster over time, and eventually we'll get there). But I am saying that if a GPU says 2.4 GHz on the spec sheet but in reality can only blip up to that frequency for a few milliseconds - that just doesn't count for anything IMO.
 
Last edited:

Papacheeks

Banned
So.... it seems like the rumored clock speeds were wrong. Very wrong.

With rumors of 2.5GHz, the reality isn't even close.

6800 XT - Game Clock 2015 MHz (Boost Clock 2250 MHz)
6800 - Game Clock 1815 MHz (Boost Clock 2105 MHz)

And AFAIK the game clock is the REAL clock, as in that's the number that can actually be sustained in game. But it will be interesting to see where it lands when people get to test these cards for themselves.

Here's the way Steve from Gamers Nexus describes AMD's Boost Clock...

"The peak opportunistic clock. Boost clock in AMDs spec sheet could mean for a BLIP, for a couple milliseconds and under optimal conditions"

So AMD's boost clock is BS and barely worth mentioning, but it does look nice on a spec sheet.

And what might this say about the PS5's "real" clockspeed?

At this point it doesn't look very likely to me that the PS5 can actually spend much time at all at 2.23GHz. It may only be able to "BLIP" up to that clock for a few milliseconds at a time. And when these new GPUs are out and tested, I'm gonna bet that NONE of them will be able to hold 2.23GHz for any length of time worth mentioning. And if they can't hold that clock speed, there's NO reason to believe that the GPU in the PS5 can either.

It sure does look like Microsoft gave us the Game Clock while Sony gave us the Boost Clock.

AMD's new RDNA2 GPUs seem to range between 1815 and 2015 MHz for actual sustained performance.

Giving the PS5 the benefit of the doubt, and giving it the upper range of 2015 MHz, the PS5's actual and sustained TFLOP count would be 9.28 TFLOPS.
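
For anyone who wants to sanity-check that number, here's the back-of-the-envelope math as a quick Python sketch. It uses the usual shaders × 2 ops per clock formula and the PS5's publicly stated 2304 shaders (36 CUs × 64); the 2015 MHz figure is only the assumption made above, not an official spec.

```python
# Rough FLOP estimate: shaders * 2 ops per clock (FMA) * clock speed.
# 2304 shaders = 36 CUs x 64 shaders, the PS5's stated configuration.
# The 2015 MHz "sustained" clock is only the assumption from the post above.
def tflops(shaders: int, clock_mhz: float) -> float:
    return shaders * 2 * clock_mhz * 1e6 / 1e12

print(f"PS5 @ 2015 MHz: {tflops(2304, 2015):.2f} TFLOPS")  # 9.29 (the ~9.28 above, modulo rounding)
print(f"PS5 @ 2230 MHz: {tflops(2304, 2230):.2f} TFLOPS")  # 10.28, Sony's quoted peak figure
```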

Still a bit too early to know this for a fact, but the facts that we have now are definitely pointing STRONGLY in this direction.

Actually they are true. 2.5GHz was AIB. And 2.3GHz is for your overclock, which breaks TDP; they set power limits. Notice the power usage they showed is less than what was reported with the higher clocks.

Wait to see what AIBs do.
 

Kenpachii

Member
There is no space on the GA102 to make a 3080 Ti.
It can't be stronger than the 3090, and the space between the 3090 and 3080 would be within overclock range.
Hell, probably worse: the 3090 is already only a few percent faster than the 3080, so the 3080 Ti would be like 0.5% faster than the 3080.


A cut-down GA102 RTX 3075 actually makes a shit ton of sense.

Could just get a 3080, slam 20GB of RAM on it, done, and call it a 3080 Ti.
 
Last edited:

GHG

Member
But at this point, IMO, it looks like all of those sky high clocks are just fantasy.

Yeh... You should just wait. Judging what frequencies are "normal" for a new architecture based on previous architectures is a fool's game.

Enough people (some of whom are very reliable) who have actually managed to get samples in hand have stated the card can hit above 2.2GHz with ease.

We will know for sure the moment the AIBs start unveiling their versions of the cards.

You're also not even taking into account Rage Mode, which AMD announced on stage today for these cards. How do you think that mode squeezes out extra performance? They won't state the clock speeds for that mode though, because every single card will respond differently.
 
Last edited:

KungFucius

King Snowflake
After the hell I went through to score a 3080, it is nice to see the 2 are comparable. Maybe next GPU upgrade won't be so fucking awful. I'd like to just be able to buy what I want when it is time to upgrade, not wait a year because there was nothing major new and then get sucked into 3 weeks of F5ing until I finally got lucky.

I am glad I got Nvidia for the sole reason that I would feel obligated to upgrade my CPU to get the most out of the Radeon and really that would be a big waste. My 3700x will be fine for years.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Could just get a 3080, slam 20GB of RAM on it, done, and call it a 3080 Ti.

So NOT a Ti, just a 20GB version of the exact same chip?

Shots fired :messenger_fire:



Jesus, the RTX 3090 is such a joke.
It's not even that much more powerful than the 3080; Nvidia probably just made it massive for status' sake... but now it's biting them in the ass as people make fun of it needing such a massive cooler only to get worked by a much smaller card.
 
I don't think you're understanding how this works. Wait a couple of weeks before going on the offensive.

AMD only ever quote guaranteed frequencies. I have a 3900x that boosts to 4.7 on multiple cores (in a single CCX) and 4.8 on a single core out of the box, no overclocking. According to you that shouldn't be possible because AMD quote 4.6 single core boost for my CPU.

AIBs always have factory overclocked cards. Depending on the quality of the silicon, 2.5GHz doesn't sound like a massive stretch.

I also have a 3900X and both 4.6 and 4.7 are pretty much BS. But they write that shit on the box because it looks good but it's a borderline lie IMO.

With a 3900X you CAN hit 4.6Ghz or even 4.7 but NOT in any meaningful way.

Here's what you need to hit those frequencies...

You have to be running a fairly new BIOS (none of the launch BIOSes will do)
RAM that is ideal in speed and CAS latency to match the Infinity Fabric speed of your 3900X (3600 CAS 16 or 3200 CAS 14 are pretty much your only options)
Have everything in your system tuned properly including enabling the proper power plan.

Then you MIGHT SOMETIMES be able to DETECT a brief spike (less than one second) into the 4.6-4.7GHz range, and then ONLY during light work like opening a browser.

Maintaining those clockspeeds is IMPOSSIBLE

Hitting those clockspeeds while pushing your CPU or in an actual game is IMPOSSIBLE

So using AMD CPU reported clocks doesn't exactly give me hope that AMD GPU "boost clock" speeds are anything other than marketing BS.
 
Last edited:

GHG

Member
AIB will be shown tomorrow. :messenger_fire: :messenger_fire: :messenger_fire:

If ASRock have one that is as sexy as their 5700 XT Taichi then that's what I'll be eyeing up.

Hopefully on the whole they are all much nicer looking than the Nvidia 3080 partner boards. 90% of those were hideous, even EVGA fucked up.
 

Kenpachii

Member
So NOT a Ti, just a 20GB version of the exact same chip?



Jesus, the RTX 3090 is such a joke.
It's not even that much more powerful than the 3080; Nvidia probably just made it massive for status' sake... but now it's biting them in the ass as people make fun of it needing such a massive cooler only to get worked by a much smaller card.

Sure it is, if they name it that. Nvidia's naming has been a joke for multiple generations now, with mid-range chips getting high-end labels. Ti just means better really; 20GB is better, done.
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Sure it is, if they name it that. Nvidia's naming has been a joke for multiple generations now, with mid-range chips getting high-end labels.

Titanium has always been reserved for upped versions of whatever they are maxing out.
There has never been a Titanium card that was equal to its base version.


I actually think with the 30 series they are going back to the more classic naming scheme of not having a range topping Ti but instead having midrange Titanium cards that more people are going to buy.
 
Actually they are true. 2.5GHz was AIB. And 2.3GHz is for your overclock, which breaks TDP; they set power limits. Notice the power usage they showed is less than what was reported with the higher clocks.

Wait to see what AIBs do.

I can't say that you are wrong. But I won't believe that until I see it reported in MSI Afterburner.

I'm interested in the clock speed that the GPUs can SUSTAIN. And I don't expect any of them to be able to SUSTAIN 2.5Ghz.

Going over the TDP is fine. Overclocking is fine. None of this is in any way cheating if it results in a SUSTAINED clock speed of 2.5Ghz. But I don't believe this will happen.
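
For what it's worth, this is the kind of check I mean, sketched in Python: take a clock-speed log from a long gaming session (MSI Afterburner can log to file; the CSV column name below is just a placeholder I made up, not Afterburner's actual format) and look at how much of the run sits at or above the target clock, rather than eyeballing the peak.

```python
import csv

# Sketch of the "sustained vs. blip" check described above.
# Assumes a hypothetical CSV with a "core_clock_mhz" column sampled at a fixed
# interval during gameplay; a real Afterburner log would need its own parsing.
def clock_summary(path: str, target_mhz: float) -> None:
    with open(path, newline="") as f:
        clocks = [float(row["core_clock_mhz"]) for row in csv.DictReader(f)]
    peak = max(clocks)
    average = sum(clocks) / len(clocks)
    share = sum(c >= target_mhz for c in clocks) / len(clocks)
    print(f"peak {peak:.0f} MHz, average {average:.0f} MHz, "
          f"{share:.1%} of samples at or above {target_mhz:.0f} MHz")

# Hypothetical usage: clock_summary("afterburner_log.csv", target_mhz=2500)
```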
 

Kenpachii

Member
Titanium has always been reserved for upped versions of whatever they are maxing out.
There has never been a Titanium card that was equal to its base version.


I actually think with the 30 series they are going back to the more classic naming scheme of not having a range topping Ti but instead having midrange Titanium cards that more people are going to buy.

And the x80 was always reserved for the high-end chip, until it wasn't. Anyway, they could also use Super, or simply just come up with a new name: the big fat memory edition.
 
Last edited:
Keep this in mind regarding RT performance: Ampere is faster than Turing by quite a bit BUT in actual games (that aren't path traced) the difference is almost non-existent (compared to just the straight power increase). So just because in a PURE RT scenario it's faster DOESN'T MEAN you will actually run games with RT faster - because they're still hybrid rendered!
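
To put a rough number on that, here's a toy Amdahl's-law style estimate in Python. The 40% RT share of frame time and the 2x pure-RT speedup are made-up illustrative values, not measurements from any real game.

```python
# Toy model: in a hybrid-rendered frame only the RT portion benefits from
# faster RT hardware; the rasterised portion of the frame is unchanged.
def overall_speedup(rt_share: float, rt_speedup: float) -> float:
    """Whole-frame speedup when only the RT slice of frame time gets faster."""
    return 1.0 / ((1.0 - rt_share) + rt_share / rt_speedup)

# 2x faster pure RT applied to a frame that is 40% RT work -> only ~1.25x overall.
print(f"{overall_speedup(0.4, 2.0):.2f}x")
```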

Disclaimers out of the way, 6800 XT RT = 2080 Ti RT performance (Pure)


See tests here for Ampere vs Turing:



Yep, spot on. I was debating someone in here about this: he was claiming a 50% improvement in RT, but I basically said 'show me the receipts' in games, though I worded it in a janky way as I'd had an argument with the wifey last night.

On paper, Nvidia are claiming big gains using fully path-traced canned benchmarks, which is extremely misleading.
 
I can't say that you are wrong. But I won't believe that until I see it reported in MSI Afterburner.

I'm interested in the clock speed that the GPUs can SUSTAIN. And I don't expect any of them to be able to SUSTAIN 2.5Ghz.

Going over the TDP is fine. Overclocking is fine. None of this is in any way cheating if it results in a SUSTAINED clock speed of 2.5Ghz. But I don't believe this will happen.

'Sustained' how, like 100% of the time all the time during gaming? That's not how modern boost clocks work.

In any case, the lower-CU 6800 has the potential to clock closer to 2.5GHz. The 80 CU card boosting to 2.25GHz is pretty damn impressive for the number of transistors there. Must be special bins to keep TDP @ 300W.
 

GHG

Member
I also have a 3900X and both 4.6 and 4.7 are pretty much BS. But they write that shit on the box because it looks good but it's a borderline lie IMO.

With a 3900X you CAN hit 4.6Ghz or even 4.7 but NOT in any meaningful way.

Here's what you need to hit those frequencies...

You have to be running a fairly new BIOS (none of the launch BIOSes will do)
RAM that is ideal in speed and CAS latency to match the Infinity Fabric speed of your 3900X (3600 CAS 16 or 3200 CAS 14 are pretty much your only options)
Have everything in your system tuned properly including enabling the proper power plan.

Then you MIGHT SOMETIMES be able to DETECT a brief spike (less than one second) into the 4.6-4.7GHz range, and then ONLY during light work like opening a browser.

Maintaining those clockspeeds is IMPOSSIBLE

Hitting those clockspeeds while pushing your CPU or in an actual game is IMPOSSIBLE

So using AMD CPU reported clocks doesn't exactly give me hope that AMD GPU "boost clock" speeds are anything other than marketing BS.

I don't know what your chip is doing but mine will hit 4.7 consistently across 3 cores when being hammered in single core workloads. The only caveat being the fact that it doesn't stay on the same core to run at 4.7, it jumps between the cores that can achieve that frequency. It's not in light work either, all of my testing was done running Cinebench back to back to back because I wanted to make sure my temps were ok after installing my AIO. It's also worth noting that AMD states 4.6 because that's all that can be achieved in most cases on the stock cooler. And this is without even fiddling with overclocking.

So:
  1. AIB's are not stuck with the reference AMD cooler, they can use whatever the fuck cooling solution they want to use - this results in increased frequency headroom
  2. AIB's can also increase the power limit - also results in increased frequency headroom.
  3. How AMD CPU's behave in terms of clock speed actually has little relevance here. We are talking about GPU's. I only referenced my experience with my 3900x because my feeling is that AMD tend to be more conservative (as Nvidia also are) when it comes to reporting expected clock speeds - this is to avoid lawsuits.
Like I keep on saying, wait. It's literally a couple of weeks max. Once we get the full picture and the results are in then go HAM if your feelings are confirmed. All you're asking for at the moment is egg on your face. This is a new architecture, let's wait and see how it actually performs and what AIB's can squeeze out of it (along with what it takes to do so in terms of cooling solutions).
 
'Sustained' how, like 100% of the time all the time during gaming? That's not how modern boost clocks work.

In any case, the lower-CU 6800 has the potential to clock closer to 2.5GHz. The 80 CU card boosting to 2.25GHz is pretty damn impressive for the number of transistors there. Must be special bins to keep TDP @ 300W.

It's not impressive if it only "boosts" up to that clock for a few milliseconds on occasion.

And yeah, that IS how modern boost clocks work (at least on Nvidia). Don't get caught up in the marketing vocabulary. We're talking about SUSTAINED CLOCKS here. That's what actually matters.

Take my 1080 Ti for example - Nvidia lists its base clock at 1481 MHz and its boost clock at 1582 - but those numbers are MEANINGLESS because with a bit of tuning I can SUSTAIN a clock of 1987. Not for a millisecond, but for as long as I'm playing.

So that's the REAL question - what is the max clock on these new RDNA 2 GPUs that can be SUSTAINED?
 
I don't know what your chip is doing but mine will hit 4.7 consistently across 3 cores when being hammered in single core workloads. The only caveat being the fact that it doesn't stay on the same core to run at 4.7, it jumps between the cores that can achieve that frequency. It's not in light work either, all of my testing was done running Cinebench back to back to back because I wanted to make sure my temps were ok after installing my AIO. It's also worth noting that AMD states 4.6 because that's all that can be achieved in most cases on the stock cooler. And this is without even fiddling with overclocking.

So:
  1. AIB's are not stuck with the reference AMD cooler, they can use whatever the fuck cooling solution they want to use - this results in increased frequency headroom
  2. AIB's can also increase the power limit - also results in increased frequency headroom.
  3. How AMD CPU's behave in terms of clock speed actually has little relevance here. We are talking about GPU's. I only referenced my experience with my 3900x because my feeling is that AMD tend to be more conservative (as Nvidia also are) when it comes to reporting expected clock speeds - this is to avoid lawsuits.
Like I keep on saying, wait. It's literally a couple of weeks max. Once we get the full picture and the results are in then go HAM if your feelings are confirmed. All you're asking for at the moment is egg on your face. This is a new architecture, let's wait and see how it actually performs and what AIB's can squeeze out of it (along with what it takes to do so in terms of cooling solutions).

No, it won't.

Your 3900X will NOT maintain that clockspeed during a Cinebench run. It might "blip" up to it for a brief millisecond so it can be detected, but it will NOT stay at that clock speed.
 
Last edited:
Ok my eyes must be deceiving me then.. Thanks for letting me know, I'll go get them checked.

Power plans man, power plans.

You need to check again. That 4.6 on the box does NOT mean sustained. That is an opportunistic and extremely quick blip up to that speed.

There is NO WORKLOAD you can do where you can get EVEN ONE of your cores to MAINTAIN 4.6GHz.

What software are you using to check your clocks? I bet you're just looking at the "max reported clock".
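
Just to illustrate why the "max reported clock" is so misleading (the numbers here are invented purely for the example): a single brief spike sets the max, while the average barely moves.

```python
# 999 samples at a steady 4.3 GHz plus one momentary spike to 4.6 GHz.
# Invented values; the point is how little the spike moves the average.
samples_ghz = [4.3] * 999 + [4.6]

print(f"max reported: {max(samples_ghz):.1f} GHz")                     # 4.6 GHz
print(f"average:      {sum(samples_ghz) / len(samples_ghz):.3f} GHz")  # 4.300 GHz
```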
 
Last edited:

GHG

Member
You need to check again. That 4.6 on the box does NOT mean sustained. That is an opportunistic and extremely quick blip up to that speed.

There is NO WORKLOAD you can do where you can get EVEN ONE of your cores to MAINTAIN 4.6GHz.

Did you even read any of what I said? Where did I say one of my cores maintains 4.7?

Depends on cooling though. It can boost that high if you have sufficient cooling, most won't have it.

Yep that's exactly what I was getting at in my post above.
 

rnlval

Member
I also have a 3900X and both 4.6 and 4.7 are pretty much BS. But they write that shit on the box because it looks good but it's a borderline lie IMO.

With a 3900X you CAN hit 4.6Ghz or even 4.7 but NOT in any meaningful way.

Here's what you need to hit those frequencies...

You have to be running a fairly new BIOS (none of the launch BIOSes will do)
RAM that is ideal in speed and CAS latency to match the Infinity Fabric speed of your 3900X (3600 CAS 16 or 3200 CAS 14 are pretty much your only options)
Have everything in your system tuned properly including enabling the proper power plan.

Then you MIGHT SOMETIMES be able to DETECT a brief spike (less than one second) into the 4.6-4.7GHz range, and then ONLY during light work like opening a browser.

Maintaining those clockspeeds is IMPOSSIBLE

Hitting those clockspeeds while pushing your CPU or in an actual game is IMPOSSIBLE

So using AMD CPU reported clocks doesn't exactly give me hope that AMD GPU "boost clock" speeds are anything other than marketing BS.
According to TechPowerUp, the ASUS ROG Strix RX 5700 XT has a 2GHz average clock speed.
 

Ascend

Member
I also have a 3900X and both 4.6 and 4.7 are pretty much BS. But they write that shit on the box because it looks good but it's a borderline lie IMO.

With a 3900X you CAN hit 4.6Ghz or even 4.7 but NOT in any meaningful way.

Here's what you need to hit those frequencies...

You have to be running a fairly new BIOS (none of the launch BIOSes will do)
RAM that is ideal in speed and CAS latency to match the Infinity Fabric speed of your 3900X (3600 CAS 16 or 3200 CAS 14 are pretty much your only options)
Have everything in your system tuned properly including enabling the proper power plan.

Then you MIGHT SOMETIMES be able to DETECT a brief spike (less than one second) into the 4.6-4.7GHz range, and then ONLY during light work like opening a browser.

Maintaining those clockspeeds is IMPOSSIBLE

Hitting those clockspeeds while pushing your CPU or in an actual game is IMPOSSIBLE

So using AMD CPU reported clocks doesn't exactly give me hope that AMD GPU "boost clock" speeds are anything other than marketing BS.
The numbers don't matter if the performance is there.
 
Did you even read any of what I said? Where did I say one of my cores maintains 4.7?



Yep that's exactly what I was getting at in my post above.

You are just overestimating your 3900X. I have one too and I love it, but AMD's boost clock is just marketing BS. And I suspect the same to be true about the "boost clocks" for its upcoming GPUs.

A GPU that can occasionally boost up to a higher clock for a few milliseconds if conditions are perfect is of no practical use to anyone - except the marketing department. ONLY the max sustainable clocks matter.
 
According to TechPowerUp, the ASUS ROG Strix RX 5700 XT has a 2GHz average clock speed.

I have no problem believing that. That is a very normal clock speed.

But there is a BIG difference between 2GHz and 2.3-2.5GHz. The former is normal; the latter has so far only been achievable using LN2. So if RDNA2 GPUs can actually achieve those clocks in a sustained way, that's a big BIG deal.

Seeing a GPU that can sustain 2.5GHz would be like seeing a 6GHz CPU.
 
Last edited:

rnlval

Member
Just seen this:

[image attachment]


Will be interesting to see how much this claim holds up in real world gaming scenarios.
Infinity Cache is based on Zen's L3 cache. The Zen team was involved in PC RDNA 2's design.
 

WakeTheWolf

Member
Got me hyped about PC gaming. Honestly with Nvidia's paper launch it made it feel quite bleak for a bit there but AMD delivered today. One question though. Am I best upgrading my Ryzen 5 3600 if I'm getting the RX 6800 XT?
 

Senua

Gold Member
Got me hyped about PC gaming. Honestly with Nvidia's paper launch it made it feel quite bleak for a bit there but AMD delivered today. One question though. Am I best upgrading my Ryzen 5 3600 if I'm getting the RX 6800 XT?
See how it goes once ya get the card. I think you'll be fine for a long while.
 
Last edited:

PhoenixTank

Member
AIB will be shown tomorrow. :messenger_fire: :messenger_fire: :messenger_fire:
Missed this info apparently - Where'd you read that? :messenger_grinning:
Got me hyped about PC gaming. Honestly with Nvidia's paper launch it made it feel quite bleak for a bit there but AMD delivered today. One question though. Am I best upgrading my Ryzen 5 3600 if I'm getting the RX 6800 XT?
What resolution & framerate are you aiming for? There is meant to be an extra bump with Ryzen 5000 and RX 6000 specifically, but I don't think backporting it to Ryzen 3000 later on has been ruled out either.
 

WakeTheWolf

Member
What resolution & framerate are you aiming for? There is meant to be an extra bump with Ryzen 5000 and RX 6000 specifically, but I don't think backporting it to Ryzen 3000 later on has been ruled out either.

I play on my 4K TV. I'm happy with 1440p 60 fps though, it looks just as great to me.
 

thelastword

Banned
Well.... I won't say I told you so, but there it is....

Also don't forget there is even more performance when you combine the best CPUs with the best GPUs this holiday.

Absolute slaughterhouse.... And word has it that AIBs and even these AMD cards will provide even more performance when overclocked.
 

supernova8

Banned
I think the Radeon launch success/failure will hinge on the reviewers.

Think back to Ryzen: it only seemed to get popular after a variety of YouTubers started to switch to Ryzen for their own actual builds. Up until that point most of them were like "yeah Ryzen is great! (but I'm still using an i7/i9)" and it all felt a bit hypocritical. Encouraging people to buy X while using Y because they want that sweet sweet affiliate cut. I get it, but yeah, it's obviously more reassuring if they use it themselves without being forced.

Real proof of the pudding will be whether influencers actually start using Radeon themselves.
 
Last edited:

Neo_game

Member
Even those strange people who are into funny reflections in a handful of games - remind me, what is the fastest card, and at what price, that they can buy from NV at this point?

Pretty sure most games, at least higher-budget titles, are going to have RT going forward. I do not think it is a smart idea to spend in excess of $500 and get inferior RT performance. Anyways, let us wait for benchmarks. AMD didn't bother to show RT performance in their conference, which does not bode well 🤷‍♂️
 