
(RUMOUR) RDNA 4 Will Not Have Any High End GPUs

SmokedMeat

Gamer™
They sure as hell will never win by releasing inferior products.

When’s the last time you owned an AMD GPU?

I bounce around between AMD and Nvidia, and I just don’t see this inferiority. My previous 5700XT was a fantastic card for the money, and my 7900XT flat out beats the more expensive 4070ti very often in rasterization.

People who don’t buy AMD, or owned one many generations ago, will bring up drivers, but AMD’s drivers have been great. Their Adrenalin application is way better than Nvidia’s Control Panel.

AMD’s problem is they’re not selling them at a low enough price to take marketshare and mindshare away from Nvidia. Their cards have been great.
 
AMD throwing in the towel for the high end GPU market yet again... awesome...
I don't disagree, but I'm sure they're putting their chips where they matter most for them: desktop CPUs with capable integrated GPUs (goal = playable performance with frame reconstruction and the like, so you don't need to buy an Nvidia chip), APUs (same goal, but at 720p or so), and consoles. Marketshare from Nvidia is probably unattainable for them right now without losing money at the top of the range. For the rest, they've been weak on the low end, with a bunch of subpar products, but alright in the midrange if you consider the price. The issue is the lack of DLSS.

Regardless of mass production of value-driven products that best Nvidia, it's a shame that they're not building big enterprise versions of their GPUs, for work and benchmarking purposes, so we at least know how the architecture scales. RDNA4 might be great, but if all we have for testing is a 4 TFLOP version, it'll forever be an Xbox Series S class GPU. They should make some prototype benchmarks public (perhaps allow testing for journalists), or build a product akin to Nvidia's Titan (super expensive, as a means to say "we basically produce only 5000 units of this shit").

The issue here is probably that it would draw a lot of power and still only tie with Nvidia? Not good marketing. But it's not like AMD is making bad GPUs; it's that Nvidia has a big advantage. If Nvidia were stumbling like they did in the past, AMD would be heralded as great.
 

tusharngf

Member
AMD never had powerful GPUs and is never going to have any in the future. Nvidia will come up with a super expensive GPU. I fear the worst for next gen consoles or Pro models with AMD. Intel is already beating AMD on the CPU side.
 

SmokedMeat

Gamer™
AMD simply doesn't have an answer to DLSS and until they do they're fucked.

FSR lags behind DLSS, but I really don’t think the difference is that big. In screenshots sure, you can pick little things out, but during gaming? Not really.

Even then, XeSS is open as well and can be used in place of FSR. Hopefully it sees wider adoption.
 

Senua

Member
AMD never had powerful GPUs
Yea the 7900xtx huh what a weakling

FSR lags behind DLSS, but I really don’t think the difference is that big. In screenshots sure, you can pick little things out, but during gaming? Not really.

Even then, XeSS is open as well and can be used in place of FSR. Hopefully it sees wider adoption.
The difference is pretty damn big especially on monitors
 
FSR lags behind DLSS, but I really don’t think the difference is that big. In screenshots sure, you can pick little things out, but during gaming? Not really.

Even then, XeSS is open as well and can be used in place of FSR. Hopefully it sees wider adoption.
DLSS's advantage is also down to AI. It's basically free because it's done by a part of the hardware that wouldn't otherwise be used for anything else.

And it means the hardware in question will also do well in AI, which is a plus. FSR and XeSS are competitive/interesting, but they'll never be as free unless you dedicate a few units to the job. And even then, you're taking space on the die and dedicating it to a single function (or that function plus AI, but it won't be as good as dedicated AI cores).

AMD is behind on this front right now, probably trying to close the distance, but clearly struggling.
 

Klosshufvud

Member
Will this mean there won't be any RDNA4 APUs either? I'm curious how this will impact the portable/laptop market.
 

Senua

Member
Wow, MLiD lookin real dumb right now.
 

SmokedMeat

Gamer™
The difference is pretty damn big especially on monitors

Maybe if I played one and then went to the other in the same game I’d notice.

But going from my 3070ti to a 7900XT I just don’t notice. Maybe I’m just used to FSR.

I’m not loyal to either company so maybe if my next card is Nvidia, it’ll hit me.
 

winjer

Gold Member
Gamers don't buy Nvidia because of brand power despite what AMD fan boys think.

AMD simply doesn't have an answer to DLSS and until they do they're fucked.

There have been several times in the last 25 years when ATI/AMD had a better product than Nvidia.
There has only been one time, for a few months, when AMD/ATI outsold Nvidia.
You are right in saying that AMD can't compete now. But this was not always the case.

And now the result is showing: RTG has no budget to compete with Nvidia. And so we have much higher prices from Nvidia.
The reality is that most people only want AMD to be competitive to force Nvidia to lower prices, but they never buy AMD.
 

Sanepar

Member
Well, if it's like Polaris and they have something to fight the 5070, they'll be fine. This segment gets the majority of GPU sales.
 

Mr.Phoenix

Member
Why are people freaking out?

Technically, there isn't a high-end RDNA3 GPU either. Or are we just pretending that Nvidia's top-end card is the 4080, not the 4090?

On another note, this could also mean AMD is finally skipping a generation, so to speak, to focus on architectural improvements and not just more shaders. So maybe this is the gen of cards where we start seeing things like AI cores, more RT cores, better and more robust dual issue and RT implementation, etc. Kinda like what Nvidia did going from Pascal to Turing.

Fuck it, who am I kidding, it's AMD we are talking about here. It's probably that with RDNA3 they decided they didn't need to compete with the 4090, so maybe with RDNA4 they've decided they don't even need to compete with the 5080 either.:messenger_pensive:
 

StereoVsn

Member
When’s the last time you owned an AMD GPU?

I bounce around between AMD and Nvidia, and I just don’t see this inferiority. My previous 5700XT was a fantastic card for the money, and my 7900XT flat out beats the more expensive 4070ti very often in rasterization.

People who don’t buy AMD, or owned one many generations ago, will bring up drivers, but AMD’s drivers have been great. Their Adrenalin application is way better than Nvidia’s Control Panel.

AMD’s problem is they’re not selling them at a low enough price to take marketshare and mindshare away from Nvidia. Their cards have been great.
The issue is FSR 2 is inferior to DLSS, RT performance is inferior, and power draw is much higher. The difference of about $50 between these two cards isn't worth the downsides.

7900 XTX (especially when it was on the mad $850 sale) would be a better card to compare to the $200-300 more expensive 4080, IMO.
 

StereoVsn

Member
Why are people freaking out?

Technically, there isn't a high-end RDNA3 GPU either. Or are we just pretending that Nvidia's top-end card is the 4080, not the 4090?

On another note, this could also mean AMD is finally skipping a generation, so to speak, to focus on architectural improvements and not just more shaders. So maybe this is the gen of cards where we start seeing things like AI cores, more RT cores, better and more robust dual issue and RT implementation, etc. Kinda like what Nvidia did going from Pascal to Turing.

Fuck it, who am I kidding, it's AMD we are talking about here. It's probably that with RDNA3 they decided they didn't need to compete with the 4090, so maybe with RDNA4 they've decided they don't even need to compete with the 5080 either.:messenger_pensive:
This basically means 2024 is a lost cause and potentially nothing decent is going to be seen until 2025. That's quite a long time between releases, and it also means we're stuck with terrible prices for another two years at least.
 

Sethbacca

Member
If you're losing anyway, it makes sense to aim at where the majority of the market is rather than trying to target the 1% of super high end consumers who already have their minds made up.
 

Bry0

Member
Why are people freaking out?

Technically, there isn't a high-end RDNA3 GPU either. Or are we just pretending that Nvidia's top-end card is the 4080, not the 4090?

On another note, this could also mean AMD is finally skipping a generation, so to speak, to focus on architectural improvements and not just more shaders. So maybe this is the gen of cards where we start seeing things like AI cores, more RT cores, better and more robust dual issue and RT implementation, etc. Kinda like what Nvidia did going from Pascal to Turing.

Fuck it, who am I kidding, it's AMD we are talking about here. It's probably that with RDNA3 they decided they didn't need to compete with the 4090, so maybe with RDNA4 they've decided they don't even need to compete with the 5080 either.:messenger_pensive:
Just because the 7900 XTX doesn’t compete with the 4090 doesn’t mean it’s not high end. It’s a $999 GPU lol. RDNA1 only had the 5700 XT, which was absolutely mid-range when it came out. A “5080” competitor would’ve been very attractive to me if it was priced at $999 again and brought AI improvements. This is extremely disappointing to me.
 

GreatnessRD

Member
I cannot believe AMD is throwing. I refuse to believe this is happening, lmao. I'm praying Kepler is MLID-level trolling here.

God help us, Intel. Then again, it might make sense from a business standpoint if they're trying to pull an Nvidia and go balls to the wall with AI GPUs next gen. Regardless, pretty pathetic if true.
 

Anchovie123

Member
Not much different to RDNA3. Remember, the 7900 XTX is a 4080 competitor and should have always been called the 7800 XT, but because AMD didn't have a true 4090 competitor they called it the 7900 to appear competitive when they really weren't.
 

SantaC

Member
So the 7900 XTX is the last high-end AMD GPU then. That’s very disappointing.

At least it can do 4K @ 60 fps.
 

Mr.Phoenix

Member
Just because the 7900 XTX doesn’t compete with the 4090 doesn’t mean it’s not high end. It’s a $999 GPU lol. RDNA1 only had the 5700 XT, which was absolutely mid-range when it came out. A “5080” competitor would’ve been very attractive to me if it was priced at $999 again and brought AI improvements. This is extremely disappointing to me.
Things change.

People seem to be forgetting that the 1080 launched at $600 and the 1080 Ti launched at $700 almost a year later. And those were `high-end` GPUs then. Let's stop moving the posts. High end now is the 4090, and that's a $1600 GPU. Unless we are conveniently creating a new name for the highest-end or most powerful GPU on the market, the 4080/7900 XTX is the new mid-range. And there is a low range and an entry/budget range. To me, that's what the new GPU landscape looks like.

Back in Pascal days... high-end GPUs cost $700. Now they cost $1600. Simple as that.
 

Chastten

Banned
I'm more than fine with that, as long as the midrange and budget offerings are better than they are now.

We need affordable and power-efficient 50 and 60-range cards. Not the 80/90 crap only a few people will buy.
 

DaGwaphics

Member


If the rumour is true, it means we might not get high end RDNA 4 graphics cards. Curious to see how this will impact Nvidia's strategy with Blackwell (50 series), especially when it comes to pricing.


Since this is just a random tweet with no supporting evidence, you really can't put that much faith behind it. But, I don't really think the move would be a bad thing for them at all.

The higher-end models limit what can be done at the bottom of the stack (you have to protect the price points there). The last few times they released cards that were relevant to the market, they were putting everything they had into the most popular price brackets. Right now the biggest roadblock to them creating a game-changing 7800 or 7700 is the fact that they can't undermine the 7900 series too badly.
 

Gaiff

SBI’s Resident Gaslighter
When’s the last time you owned an AMD GPU?

I bounce around between AMD and Nvidia, and I just don’t see this inferiority. My previous 5700XT was a fantastic card for the money, and my 7900XT flat out beats the more expensive 4070ti very often in rasterization.

People who don’t buy AMD, or owned one many generations ago, will bring up drivers, but AMD’s drivers have been great. Their Adrenalin application is way better than Nvidia’s Control Panel.

AMD’s problem is they’re not selling them at a low enough price to take marketshare and mindshare away from Nvidia. Their cards have been great.
And this is what it always boils down to, and why I say AMD releases inferior products. The reason people buy them is because they're cheaper than the NVIDIA alternative. Why do you think the old meme of "I want AMD to be competitive so they can drive NVIDIA's prices down so I can buy NVIDIA" exists?

The 7900 XT's MSRP is $900 and the 4070 Ti's is $800. It had better beat it, being $100 more expensive. The reason we've seen a slew of 7900 XT price drops is also because it hasn't been selling, whereas the 4070 Ti is about the best-selling new-generation GPU. Then factor in ray tracing, DLSS, ML, much better power efficiency, and CUDA, and AMD gets slapped around all over the place. Its saving grace is the far superior VRAM buffer, which will cause it to age much better than the 4070 Ti (and even helps now in some instances), but who wants to buy a card to reap the benefits four years later?

AMD needs to release better products than NVIDIA, full stop. They keep releasing inferior cards at cheaper price points, and this label of them being the budget alternative will never go away. The last time they were neck-and-neck with or even better than NVIDIA was with the 7970 GHz Edition. The 7950 was also comparable to the 670 but with 50% more VRAM, and so on.
 

theclaw135

Banned
If you're losing anyway, it makes sense to aim at where the majority of the market is rather than trying to target the 1% of super high end consumers who already have their minds made up.

Sadly, this. No user in that audience has the slightest reason to even consider anything except Nvidia.
 

Gaiff

SBI’s Resident Gaslighter
Things change.

People seem to be forgetting that the 1080 launched at $600 and the 1080 Ti launched at $700 almost a year later. And those were `high-end` GPUs then. Let's stop moving the posts. High end now is the 4090, and that's a $1600 GPU. Unless we are conveniently creating a new name for the highest-end or most powerful GPU on the market, the 4080/7900 XTX is the new mid-range. And there is a low range and an entry/budget range. To me, that's what the new GPU landscape looks like.

Back in Pascal days... high-end GPUs cost $700. Now they cost $1600. Simple as that.
No, they're not. The 4090 is the top-end GPU. The 7900 XTX/4080 are both high-end products; they're not mid-range by any definition of the word.
 

DonkeyPunchJr

World’s Biggest Weeb
Why are people freaking out?

Technically, there isn't a high-end RDNA3 GPU either. Or are we just pretending that Nvidia's top-end card is the 4080, not the 4090?

On another note, this could also mean AMD is finally skipping a generation, so to speak, to focus on architectural improvements and not just more shaders. So maybe this is the gen of cards where we start seeing things like AI cores, more RT cores, better and more robust dual issue and RT implementation, etc. Kinda like what Nvidia did going from Pascal to Turing.

Fuck it, who am I kidding, it's AMD we are talking about here. It's probably that with RDNA3 they decided they didn't need to compete with the 4090, so maybe with RDNA4 they've decided they don't even need to compete with the 5080 either.:messenger_pensive:
No. The 7900 XTX was aiming at the high end and failed to compete; it's as simple as that. It's pretty disingenuous to make it sound like they were only ever aiming to compete in the "cheaper than 4080" market (which, BTW, the 4080 was a big price increase and a disappointing performance increase vs the 3080).

Bottom line: AMD does have a high end, it just sucks compared to the competition. And next gen they won’t even try to compete in that space (if the rumor is to be believed).
 

tkscz

Member
As long as the prices match, I can see this going well for AMD. Most PC gamers are younger and don't really care for top of the line cards that cost $500 and up. The $200 GPU that runs their games is the sweet spot. They aren't trying to run ULTRA RAY TRACED 4K settings; they just want the game to run and feel playable. Mid to low range cards that can pull 60-120 fps at medium to high settings is where the money is. It's 2023 and the most-used GPUs on Steam are the 1650, followed by the 3060 and then the 1060. That's the crowd that's clamoring for a new mid-range $200-$300 card.
 

Buggy Loop

Member
I cannot believe AMD is throwing. I refuse to believe this is happening, lmao. I'm praying Kepler is MLID-level trolling here.

God help us, Intel. Then again, it might make sense from a business standpoint if they're trying to pull an Nvidia and go balls to the wall with AI GPUs next gen. Regardless, pretty pathetic if true.

Intel will also focus on the mid-range, really.

There's little to no return in focusing on the high end. It's for epeen wars.

 

Bry0

Member
Things change.

People seem to be forgetting that the 1080 launched at $600 and the 1080 Ti launched at $700 almost a year later. And those were `high-end` GPUs then. Let's stop moving the posts. High end now is the 4090, and that's a $1600 GPU. Unless we are conveniently creating a new name for the highest-end or most powerful GPU on the market, the 4080/7900 XTX is the new mid-range. And there is a low range and an entry/budget range. To me, that's what the new GPU landscape looks like.

Back in Pascal days... high-end GPUs cost $700. Now they cost $1600. Simple as that.
I would not consider -80 class mid-range; that's still the high end. RDNA1 and Polaris were what I think is generally considered mid-range, and during those gens they didn't even try to compete with -80 class performance or higher. Those terms should be applied to a card's position in the SKU stack, not its price.
 

Dr.D00p

Member
Does that mean them rich fuckers who buy 90 class cards to play shit console ports are going to get gouged even more by Nvidia next generation?

Good, fleece the fuckers until they see sense, I say :messenger_beaming:
 

Mr.Phoenix

Member
No, they're not. The 4090 is the top-end GPU. The 7900 XTX/4080 are both high-end products; they're not mid-range by any definition of the word.
Ok got you. We are moving the posts now. Why didn't you all just say that before?

We have gone from high > mid > low-end GPUs to top-end > high > mid > low.

My bad.
 

Bry0

Member
Ok got you. We are moving the posts now. Why didn't you all just say that before?

We have gone from high > mid > low-end GPUs to top-end > high > mid > low.

My bad.
It would be news to me that “high end” only meant the absolute fastest SKU in a lineup. I’ve always considered high, medium, and low end to consist of a couple of SKUs each.
 

SmokedMeat

Gamer™
The issue is FSR 2 is inferior to DLSS, RT performance is inferior, and power draw is much higher. The difference of about $50 between these two cards isn't worth the downsides.

7900 XTX (especially when it was on the mad $850 sale) would be a better card to compare to the $200-300 more expensive 4080, IMO.

Agreed. They can’t just go $50 lower and call it a day. They need to be aggressive with pricing.
 

hinch7

Member
Well, if that's the case, we're screwed. Nvidia can price cards however they want, and AMD can just keep selling mid-range cards as the budget option.
 

Gaiff

SBI’s Resident Gaslighter
Ok got you. We are moving the posts now. Why didn't you all just say that before?
Nobody has moved the post. 80 cards have never, in the history of GPUs, been mid-range GPUs, and they still aren't now. The 1080 wasn't a mid-range card; neither was the 980 or the 780. I have no idea WTF you're talking about.
We have gone from high> mid > low-end GPUs to top-end> high > mid > low.

My bad.
There are over a dozen cards on the market. You cannot segment them into three tiers, because then you'd have the 4080 in the same category as the 4060 when it's much closer in performance to the 4090. The 80 series has been high-end for as long as it's existed. The mid-range starts at most with the 70 series, and even there in some cases you could argue it's high-end. The 60 cards are referred to by NVIDIA as their "mainstream" cards. The 80 series is an "enthusiast" card and always has been.

There are one or two cards faster than the 7900 XTX/4080. How are they mid-range when they're more powerful than 90% of the cards on the market?
 

hlm666

Member
So concentrate on a mid-range killer. That’s what Intel is doing too, strategy-wise.

Makes total sense to me
Didn't work for RDNA1, so I'm not sure how repeating the process with RDNA4 works. I guess if this is true, maybe they'll actually try harder this time; I have no faith they will, though.
 

SF Kosmo

Al Jazeera Special Reporter
Mid-range is where nVidia is most vulnerable, and applying pressure there could have positive effects for the whole stack, if AMD's mid range cards actually deliver good value.

We've been seeing gimped mid-range cards this gen, with anything under $500 basically being garbage. This also helps the higher end cards to justify their price by comparison. So I hope AMD can deliver at that price point and force nVidia to offer solid mid-range cards or lower prices.

But this also means that these expensive cards are expensive for a reason. If those products were super high margin, AMD would make them the priority.
 

Mr.Phoenix

Member
I would not consider -80 class mid-range; that's still the high end. RDNA1 and Polaris were what I think is generally considered mid-range, and during those gens they didn't even try to compete with -80 class performance or higher. Those terms should be applied to a card's position in the SKU stack, not its price.
I am sorry, this makes no sense to me. I don't care about the price.

There is ~30% performance between the 4090 and the 4080. 30%.
And another 27% between the 4080 and the 4070.

But somehow, we group the 4070 and 4080 into the same `high` end bracket?

And a simple question: if the 4080 is high-end, what does that make the 4090? Ultra-high-end?

Back in the day, when we actually started grouping these things, the most powerful model (which usually also happened to be the most expensive) of any GPU family was considered high-end. And we named the rest of them accordingly, going down the performance stack.

This is simple to me... in my mind, the most powerful GPU on the market is the high end. The two next most powerful are mid-range. The third tier is low end. And whatever comes before that is budget or entry-level. If what we are doing is simply disregarding the 4090, or creating a new class for it, then by all means everything I am saying now is wrong.
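Part of the disagreement over these gaps is just which card you pick as the 100% baseline: the same performance difference reads as a bigger percentage when measured up from the slower card than down from the faster one. A quick sketch (the index numbers below are made up for illustration, not real benchmark results):

```python
# Illustrative relative-performance indices -- NOT real benchmark data.
perf = {"4070": 100, "4080": 150, "4090": 195}

def pct_faster(a, b, perf=perf):
    """How much faster card a is than card b, as a percent of b."""
    return round((perf[a] / perf[b] - 1) * 100)

def pct_slower(a, b, perf=perf):
    """How much slower card a is than card b, as a percent of b."""
    return round((1 - perf[a] / perf[b]) * 100)

# Same gap, different baseline: with these numbers the 4080 is
# 50% faster than the 4070, but the 4070 is only 33% slower than
# the 4080.
print(pct_faster("4080", "4070"))  # 50
print(pct_slower("4070", "4080"))  # 33
print(pct_faster("4090", "4080"))  # 30
```

So a "27%" and a "50%" figure for the same two cards can both be arithmetically defensible depending on direction of comparison.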
 

Bry0

Member
But somehow, we group 4070 and 4080 into the same `high` end bracket?

And simple question, if the 4080 is high-end, what does that make the 4090? ultra-high-end?
The 70 is more mid-range. Nvidia themselves market the 80 and 90 that way, so yeah. -80 class cards were never considered mid-range.
 

Gaiff

SBI’s Resident Gaslighter
I am sorry this makes no sense to me. I don't care about the price.

There is ~30% performance between the 4090 and the 4080. 30%
And another 27% between the 4080 an 4070.
No... the 4080 is 50% faster than the 4070. It might be worth doing your research before you post.
[relative performance charts at 2560×1440 and 3840×2160]

But somehow, we group 4070 and 4080 into the same `high` end bracket?

And simple question, if the 4080 is high-end, what does that make the 4090? ultra-high-end?
As I said, top-end. There is an entry level, just like there is a top level. You simply have to see where they stack up in the market of available cards. The 4080 is very near the top, but not quite. It's high-end.
Back in the day, when we actually started grouping these things, the most powerful model (which usually also happened to be the most expensive) of any GPU family was considered high-end. And we named the rest of them accordingly going down the performance stack.

This is simple to me...in my mind, the most powerful GPU on the market is the high-end. The 2 most powerful, is mid-range. The third is ow end. And whatever comes before that, is budget or entry-level. If what we are doing is simply disregarding the 4090, or creating a new class for it, then by all means everything I am saying now is wrong.
No one has ever called an 80 card mid-range, even back in those days. It was almost always the most powerful card for consumers. Beyond that, you often got into dual-GPU or prosumer-grade products like the Titan.
 

SmokedMeat

Gamer™
And this is what it always boils down to, and why I say AMD releases inferior products. The reason people buy them is because they're cheaper than the NVIDIA alternative. Why do you think the old meme of "I want AMD to be competitive so they can drive NVIDIA's prices down so I can buy NVIDIA" exists?

The 7900 XT's MSRP is $900 and the 4070 Ti's is $800. It had better beat it, being $100 more expensive. The reason we've seen a slew of 7900 XT price drops is also because it hasn't been selling, whereas the 4070 Ti is about the best-selling new-generation GPU. Then factor in ray tracing, DLSS, ML, much better power efficiency, and CUDA, and AMD gets slapped around all over the place. Its saving grace is the far superior VRAM buffer, which will cause it to age much better than the 4070 Ti (and even helps now in some instances), but who wants to buy a card to reap the benefits four years later?

AMD needs to release better products than NVIDIA, full stop. They keep releasing inferior cards at cheaper price points, and this label of them being the budget alternative will never go away. The last time they were neck-and-neck with or even better than NVIDIA was with the 7970 GHz Edition. The 7950 was also comparable to the 670 but with 50% more VRAM, and so on.

AMD made the mistake of slapping a stupid MSRP on it. Had they come in at $700 or $750, like it is now, I think it’d be a better situation for them.

The 4070 Ti is stupid expensive for the performance, imo. If that’s indeed the best-selling 4000 series card, then I’m worried about the future.
I hope Intel lives up to our hopes. We need them to shake things up badly.
 

Gp1

Member
Great.
One company simply doesn't even care anymore, because gaming is their tertiary priority.

And the other doesn't even try.

What a great time I chose to upgrade my GPU.

Let's hope that at least AMD delivers in the mid-range market. Save us with the 7-8700/7-8800.
 

Bry0

Member
AMD made the mistake of slapping a stupid MSRP on it. Had they come in at $700 or $750, like it is now, I think it’d be a better situation for them.

The 4070 Ti is stupid expensive for the performance, imo. If that’s indeed the best-selling 4000 series card, then I’m worried about the future.
I hope Intel lives up to our hopes. We need them to shake things up badly.
With all the people leaving Intel's graphics division, I've already lost hope. Would be nice to be surprised, but I'm not counting on it. Maybe with RDNA5 they'll come out with something more interesting, like they did with RDNA2. Keeping my fingers crossed but expectations low.
 

Mr.Phoenix

Member
Nobody has moved the post. 80 cards have never, in the history of GPUs, been mid-range GPUs, and they still aren't now. The 1080 wasn't a mid-range card; neither was the 980 or the 780. I have no idea WTF you're talking about.

There are over a dozen cards on the market. You cannot segment them into three tiers, because then you'd have the 4080 in the same category as the 4060 when it's much closer in performance to the 4090. The 80 series has been high-end for as long as it's existed. The mid-range starts at most with the 70 series, and even there in some cases you could argue it's high-end. The 60 cards are referred to by NVIDIA as their "mainstream" cards. The 80 series is an "enthusiast" card and always has been.

There are one or two cards faster than the 7900 XTX/4080. How are they mid-range when they're more powerful than 90% of the cards on the market?
So what if 80 series cards have never been considered mid-range? The bottom line is that 90 series cards never existed before, and we don't just start creating new categories every time something more powerful comes out.

I couldn't have been clearer with my reasoning. If you don't agree with that... fine. That's ok.

And you know what the range in these names means, right?

The way I see it...

High end = the most powerful GPUs on the market, the 90-100% performance tier.
Mid-range = the 60-90% tier... that's why it's called a range.
Low range = the 20-60% tier.
Entry = the bottom 10%.

And GPUs move down that stack as newer ones come out. And those tiers have their own ranges, e.g. the 4080/7900 XTX would be considered upper mid-range.
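The tier scheme described above amounts to ranking every card against the fastest one on the market. A tiny sketch of it; the cut-offs are the poster's, and the performance index numbers are purely illustrative assumptions, not real benchmarks:

```python
# Illustrative relative-performance indices -- NOT real benchmark data.
CARDS = {"4090": 100, "7900XTX": 83, "4080": 80, "4070": 53, "4060": 35, "4050": 8}

def tier(card, cards=CARDS):
    """Tier = a card's performance as a percentage of the fastest card."""
    pct = cards[card] / max(cards.values()) * 100
    if pct >= 90:
        return "high end"
    if pct >= 60:
        return "mid-range"
    if pct >= 20:
        return "low range"
    return "entry"

print(tier("4090"))  # high end
print(tier("4080"))  # mid-range
print(tier("4060"))  # low range
```

Under this scheme (and these made-up numbers), the 4080 and 7900 XTX do land in the mid-range bucket, which is exactly the point being argued; the counter-argument in the thread is that tiers should track the SKU stack, not a percentage of the top card.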
 