
(RUMOUR) RDNA 4 Will Not Have Any High-End GPUs

64bitmodels

Reverse groomer.
Intel will be dangerous in the mid-range market, and they have more market share to lose to Intel than Nvidia does.

High-end cards are niche, maybe ~1-2% of the user base. That's not how Nvidia takes a chunk of the market share either; everything plays out in the xx60/xx70 range.

They probably can't compete against Nvidia's high end. Next gen, with all the AI chip design and AI foundry optimization, MCM and so on, Nvidia probably has a monster, but an expensive one. For all the R&D it would cost AMD to compete against it, they would barely break 1%; just look at the 7900 XTX, which only now broke into the Steam hardware survey after all this time.

So concentrate on a mid-range killer. That's also what Intel is doing, strategy-wise.
Limiting your cards to mid-range doesn't mean shit when Intel also has mid-range cards at cheaper prices and with better tech. Even Apple is doing upscaling and ray tracing better than them, and it simply comes down to the lack of AI acceleration in their cards.

What AMD needs to do is simply what Nvidia has done for the past 5 years: lean into software more and add more AI cores to their products. Then and only then will they be competitive.
 

Mr.Phoenix

Member
No...the 4080 is 50% faster than the 4070. It might be worth it to do your research before you post.
[Relative performance charts at 2560×1440 and 3840×2160]


As I said, top-end. There is entry level, just like there is top level. You simply have to see where they stack up in the market of available cards. The 4080 is very near the top but not quite. It's high-end.

No one has ever called an 80 card mid-range even back in those days. It was almost always the most powerful card for consumers. Beyond that, you often got into dual-GPUs or prosumer grade products like the Titan.
I actually did. Just check the relative performance column and click between GPUs.

Anyways... this argument isn't worth the pixels... this is a simple naming thing. You look at the top end as the 4090 and the next one down as high-end. I look at high-end as the most powerful GPUs on the market.
 

CuNi

Member
Sadly, this. No user in that audience has the slightest reason to even consider anything except Nvidia.

I'd argue the complete opposite.
Especially the 1% would go wherever the power is, no matter the money.

The issue is, especially when spending so much, you don't want issues. You want the premium product, and sadly AMD still has, nowadays unjustifiably, a reputation for bad drivers etc. If the difference is barely there, just one or two FPS, people will stick to what they know.

If AMD were ever able to create a card that's 5-10% better than what Nvidia has to offer, while having feature parity (ray tracing, ML support, competitive FSR, etc.), I'm sure the high end would eat out of their hands.

But as many have said before, consumers aren't generating as much revenue as B2B sales nowadays, so there is no incentive to push in that direction anymore; it's way more profitable to forgo consumers and push harder on business sales.

It sucks as a consumer but I can understand their decision from a business point of view.
 
No need to worry. Consoles use improved and customized parts. AMD's mid-range is very competitive against Nvidia's offers. So no problem; AMD will probably come back into the high end when they have new, improved graphics cards, probably RDNA 5.
 

Bry0

Member
No need to worry. Consoles use improved and customized parts. AMD's mid-range is very competitive against Nvidia's offers. So no problem; AMD will probably come back into the high end when they have new, improved graphics cards, probably RDNA 5.
Yeah, RDNA 2 was decent coming after the 5700 XT. Hopefully they are cooking up something cool for the long term.
 
No need to worry. Consoles use improved and customized parts. AMD's mid-range is very competitive against Nvidia's offers. So no problem; AMD will probably come back into the high end when they have new, improved graphics cards, probably RDNA 5.
Speaking of RDNA 5, I have heard firsthand information from my source that it's going to have improved rasterization and ray-tracing performance over RDNA 3. You heard it here first!

/s
 

Puscifer

Member
They never use mid-tier GPUs either..
Hard disagree; the 360/PS3 GPUs didn't even have a PC equivalent for almost 2 years. But that also came with them losing $200-400 per console. FWIW that generation was long in the tooth, but we got A LOT out of them, even TLOU, which was breaking consoles by the end of it.
 

DonkeyPunchJr

World’s Biggest Weeb
Didn't work for RDNA 1, so not sure how repeating the process with RDNA 4 works. I guess if this is true maybe they will actually try harder this time; I have no faith they will, though.
They’re kinda “damned if they do, damned if they don’t”.

When you think about it, the market for their high end GPU now is “gamers willing to spend over $1000 but not willing to spend $1200-$1600 for a better Nvidia GPU” which is a really, really tiny segment.

I’m not exactly optimistic for their future mid-range GPUs, but at least in that segment there are a lot more gamers who go for the “cheaper and almost as good” option.
 

DonkeyPunchJr

World’s Biggest Weeb
Hard disagree; the 360/PS3 GPUs didn't even have a PC equivalent for almost 2 years. But that also came with them losing $200-400 per console. FWIW that generation was long in the tooth, but we got A LOT out of them, even TLOU, which was breaking consoles by the end of it.
The PS3 GPU was basically just a 7900 GT gimped with a 128-bit memory bus IIRC. Nothing special compared to PC GPUs at the time (and pretty old and busted compared to the 8800 series, which launched at almost the same time).
 

XesqueVara

Member
I can’t believe I’m gonna say something positive for AMD..

@winjer



But this is the most logical move

Intel will be dangerous in the mid-range market, and they have more market share to lose to Intel than Nvidia does.

High-end cards are niche, maybe ~1-2% of the user base. That's not how Nvidia takes a chunk of the market share either; everything plays out in the xx60/xx70 range.

They probably can't compete against Nvidia's high end. Next gen, with all the AI chip design and AI foundry optimization, MCM and so on, Nvidia probably has a monster, but an expensive one. For all the R&D it would cost AMD to compete against it, they would barely break 1%; just look at the 7900 XTX, which only now broke into the Steam hardware survey after all this time.

So concentrate on a mid-range killer. That's also what Intel is doing, strategy-wise.

Makes total sense to me
Blackwell is monolithic though, and Intel is still a long way from being a worthy competitor; their GPU has to be much bigger than AMD's/Nvidia's to offer the same performance.
 

Dream-Knife

Banned
When’s the last time you owned an AMD GPU?

I bounce around between AMD and Nvidia, and I just don't see this inferiority. My previous 5700 XT was a fantastic card for the money, and my 7900 XT flat-out beats the more expensive 4070 Ti very often in rasterization.

People who don't buy AMD, or owned one many generations ago, will bring up drivers, but AMD's drivers have been great. Their Adrenalin application is way better than the Nvidia Control Panel.

AMD's problem is they're not selling them at enough of a lower price to take market share and mindshare away from Nvidia. Their cards have been great.
I had a 6800 for 9 months. Awful card due to Adrenalin.

It ended up dying, and I sold the replacement on eBay for $1300 and bought a 3080.
 

hinch7

Member
Blackwell is monolithic though, and Intel is still a long way from being a worthy competitor; their GPU has to be much bigger than AMD's/Nvidia's to offer the same performance.
I wouldn't count them out. They have good engineers. XeSS is already superior to FSR, and current Arc's ray tracing is on par or nearly up there with Ampere, in their first dGPU. And the performance of the A770 isn't bad by any stretch.

Battlemage (A870??) is due next year. If they manage to get that to around 4080 performance, while costing significantly less and having similar RT capabilities and upscaling, that might swing the pendulum towards Intel for prospective buyers, away from AMD. It would spell trouble for AMD cards in the sub-$700 price brackets, even going into the next generation.
 

Neo_game

Member
It may be bad news for AMD guys, but honestly a $1500 or $2000 graphics card is not necessary. Good for those who like to buy them. Nvidia may charge another $100 or $200 more for the premium 🤷‍♂️
 

Buggy Loop

Member
Blackwell is monolithic though, and Intel is still a long way from being a worthy competitor; their GPU has to be much bigger than AMD's/Nvidia's to offer the same performance.

Eh, the rumours changed again? Anyway, those are rumours, and we can't rule out that Nvidia is looking at both options, as they also did during Ada according to kopite.

It's a question of optimization in the end; TSMC might still be delivering enough progress and good enough yields that monolithic makes more sense again for another gen. MCM is inevitable, but mostly when we hit bottlenecks in foundries (or crazy server chips).
 

XesqueVara

Member
Just think our only hope for

I wouldn't count them out. They have good engineers. XeSS is already superior to FSR, and current Arc's ray tracing is nearly on par with Ampere, in their first dGPU. And the performance of the A770 isn't bad by any stretch.

Battlemage (A870??) is due next year. If they manage to get that to around 4080 performance, while costing significantly less and having similar RT capabilities and upscaling, that might swing the pendulum towards Intel for prospective buyers, away from AMD. It would spell trouble for AMD cards in the sub-$700 price brackets, even going into the next generation.
I'm talking hardware IP here; they still have a long way to go to compete with AMD/Nvidia. The A770 was a 406 mm² chip competing with a 230 mm² and a 276 mm² one.
 

XesqueVara

Member
Eh, the rumours changed again? Anyway, those are rumours, and we can't rule out that Nvidia is looking at both options, as they also did during Ada according to kopite.

It's a question of optimization in the end; TSMC might still be delivering enough progress and good enough yields that monolithic makes more sense again for another gen. MCM is inevitable, but mostly when we hit bottlenecks in foundries (or crazy server chips).
Kopite hinted at it before the Lovelace launch.
Blackwell is mostly about a new SM design, btw; the last big change was Volta.
 

ReBurn

Gold Member
If AMD can't keep up with Nvidia on the high end that's OK. Most people don't buy the top end anyway. Hopefully they'll work harder to deliver value in the midrange.
 

StereoVsn

Member
Remember, GAF:

It's the Golden Age of PC Gaming
In one sense it kind of is. A lot of games are coming out, and a lot of them are even great games.

On the other hand, the performance of a ton of AAA titles is shit and the GPU price/performance ratio is plain terrible.

But every other hardware component is a lot more reasonable these days, even AM5 motherboards.
 

Red5

Member
Remember, GAF:

It's the Golden Age of PC Gaming

So far yes; you don't even need a 4090 to play the best games out there on PC, whether AAA, AA or indie. Baldur's Gate 3 is running well on my 1070.

The chip shortages due to AI demand are going to hit consoles as well as PCs and every market that isn't AI, since AI companies are gobbling up all the chips on the market anyway.
 

Reallink

Member
Gamers don't buy Nvidia because of brand power, despite what AMD fanboys think.

AMD simply doesn't have an answer to DLSS, and until they do, they're fucked.

Not just DLSS, but also objectively and significantly inferior RT performance. AMD simply makes an inferior product, full stop. It has nothing to do with fanboyism. People buying $1000+ GPUs are buying with a future outlook, not for performance in 5-year-old rasterized games.
 

Mister Wolf

Gold Member
The issue is FSR 2 is inferior to DLSS, RTX performance is inferior, and power draw is much higher. The difference of about $50 between these two cards isn't worth the detriments.

The 7900 XTX (especially when it was on the mad $850 sale) would be a better card to compare to the $200-300 more expensive 4080, IMO.

I'll even add Frame Generation to your list. Anyone who has actually used Frame Generation knows it's a game changer.
 

Xellos

Member
I'd be kind of excited about this, honestly. Polaris and RDNA 1 gave us the RX 480 and 5700 XT, two of their best cards. If they can get FSR and ray tracing more competitive with Nvidia in the mid-range, then I think this is a good move.
 

Red5

Member
So what if 80-series cards had never been considered mid-range. The bottom line is that 90-series cards never existed, and we don't just start creating new categories every time something more powerful comes out.

I couldn't have been clearer with my reasoning. If you don't agree with that... fine. That's OK.

And you know what the range in these names means, right?

The way I see it...

High end = the most powerful GPUs on the market, the 90-100% performance tier.
Mid-range = the 60-90% tier... that's why it's called a range.
Low range = the 20-60% tier.
Entry = the bottom 10%.

And GPUs move down that stack as newer ones come out. And those tiers have their own ranges, e.g. the 4080/7900 XTX would be considered upper mid-range.
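For illustration only, here is a minimal sketch of that tiering scheme in Python. The boundaries are the quoted ones taken at face value; the tier() function and the example percentages are hypothetical, not from the post (and since the quoted scheme leaves 10-20% unassigned, everything below 20% simply falls through to "entry" here):

    # Hypothetical sketch of the relative-performance tiers described above.
    # relative_perf is a card's performance as a percentage of the fastest
    # GPU currently on the market (100% = the top card).
    def tier(relative_perf: float) -> str:
        if relative_perf >= 90:
            return "high end"   # 90-100%: the most powerful GPUs on the market
        if relative_perf >= 60:
            return "mid-range"  # 60-90% tier
        if relative_perf >= 20:
            return "low range"  # 20-60% tier
        return "entry"          # everything below falls through here

    # Illustrative numbers only: with the 4090 as the 100% baseline and a
    # 4080 at roughly three-quarters of it, the 4080 lands in mid-range.
    print(tier(100))  # high end
    print(tier(75))   # mid-range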

x90 GPUs existed all the way back during the GeForce FX 5000 series.

In the 600 series, the 680 was the high end, and the 690 was basically two 680s slapped together operating in SLI, called the enthusiast entry.

80 was always high end while 90 was enthusiast.

The 4xxx series is the same: the 4080/4080 Ti are high end, and the 4090, considering its price/performance rating, is the enthusiast entry. That's not even new terminology; it's something Nvidia used all the way back in 2003-04.
 

Mr.Phoenix

Member
x90 GPUs existed all the way back during the GeForce FX 5000 series.

In the 600 series, the 680 was the high end, and the 690 was basically two 680s slapped together operating in SLI, called the enthusiast entry.

80 was always high end while 90 was enthusiast.

The 4xxx series is the same: the 4080/4080 Ti are high end, and the 4090, considering its price/performance rating, is the enthusiast entry. That's not even new terminology; it's something Nvidia used all the way back in 2003-04.
Thanks. I stand corrected.
 

StereoVsn

Member
I'll even add Frame Generation to your list. Anyone who has actually used Frame Generation knows it's a game changer.
Yeah, that as well, plus the low-latency functionality and a few other things.

The 7900 XT does have more VRAM and higher rasterization performance in some games vs the 4070 Ti (both are helpful at higher res); it's just, IMO, not enough to get folks to buy the 7900 XT at very close prices.
 

Leonidas

Member
Bullshit, gamers bought Nvidia even before RT/DLSS was a thing.
AMD had a much better chance at selling GPUs before RT/DLSS.

I bought an RX 580 and a Vega 56, for example. That was before RT/DLSS; those cards were pretty good, outside of the high power draw.

Ever since RT/DLSS I have been Nvidia exclusive.

Nowadays I won't even consider buying a current AMD card because I care about RT and image upscaling quality.
 

Bry0

Member
Interesting posts
[attached screenshots]

anyway he's saying only $400 and below cards
It is what it is, I guess. Still looking forward to this MCM monstrosity coming down to the consumer level, but by then I think Nvidia will be ready.
 

XesqueVara

Member
Interesting posts
[attached screenshots]

anyway he's saying only $400 and below cards
Anandtech?
Tbh, if it's for validation problems it's fair; AMD is on a 6-7 quarter cadence between every product line, and they in no way want to delay things.
 

SolidQ

Member
Anandtech?
Tbh, if it's for validation problems it's fair; AMD is on a 6-7 quarter cadence between every product line, and they in no way want to delay things.
Yep. The only question is performance for $400. 6800 XT max, or maybe near the 7900 XT with RDNA 4 RT?
 

DonkeyPunchJr

World’s Biggest Weeb
Yeah, that as well, plus the low-latency functionality and a few other things.

The 7900 XT does have more VRAM and higher rasterization performance in some games vs the 4070 Ti (both are helpful at higher res); it's just, IMO, not enough to get folks to buy the 7900 XT at very close prices.
Yup, if you're going to spend $1000-plus, why would you settle for significantly worse performance in the most demanding and graphically impressive games out there?

If you’re in that market then chances are you want to play something like Cyberpunk 2077 or Flight Simulator completely maxed out. You’re not going to compromise on that just so you can get like, 3% better max framerate in Counterstrike or something.
 

lmimmfn

Member
Hard disagree; the 360/PS3 GPUs didn't even have a PC equivalent for almost 2 years. But that also came with them losing $200-400 per console. FWIW that generation was long in the tooth, but we got A LOT out of them, even TLOU, which was breaking consoles by the end of it.
Incorrect; I had a 7800 GTX with an AMD X2 4400 in 2006, 6 months after the 360 launched, and that setup provided much higher resolutions and framerates than either the 360 or PS3.

Admittedly, it wasn't until November 2006 that the 8800 cards with unified shaders were released; those cards absolutely destroyed 360/PS3 performance in every way, and they launched 11 months after the 360 (European 360 release date).
 