
(RUMOUR) RDNA 4 Will Not Have Any High-End GPUs

Yup, if you’re going to spend $1000-plus, why would you settle for significantly worse performance in the most demanding and graphically impressive games out there?

If you’re in that market then chances are you want to play something like Cyberpunk 2077 or Flight Simulator completely maxed out. You’re not going to compromise on that just so you can get, like, 3% better max framerate in Counter-Strike or something.
If you spend that kind of money you probably want to play CP2077 with ray tracing on.
 

ChiefDada

Gold Member


 
Good.

Focus on the mid-range and lower end. That’s the majority of buyers. That’s where the competition is.

Leave the ultra high end all to Nvidia. It’s a segment AMD or Intel will never win.

They should go back to the R&D table with regards to the high end, keep it under wraps, and focus hard on low/mid/mid+. Skip a gen of high-end cards, and hopefully bring the fruit of that work out to display afterwards.
Nvidia is a fucking asshole of a company. The last 3 years have shown me they are getting worse. My best friend, who deals with the company, is even talking shit and asking me not to go Nvidia when I want to upgrade next. Ugh... fucking assholes. If they price the 5090 out of the ballpark or skip it, I will skip the 5xxx, too.
 

SABRE220

Member
Well, AMD just keeps on giving, huh... just when I thought they couldn't get any less ambitious and cowardly in terms of competing in the GPU market... they basically sealed their fate when they decided to cut GPU R&D massively to just focus on their CPUs. They had multiple chances to advance their tech with dedicated RT/ML hardware and move forward. Now they are left with a situation where Intel has leapfrogged them in terms of forward-looking tech in its first outing.
 
If AMD only makes entry-level GPUs, I wonder what will happen to PlayStation. Will they build their own GPUs, go to Nvidia or Intel, or take lots of entry-level GPUs and fuse them into one big one? Pretty interesting.
 

Dice

Pokémon Parentage Conspiracy Theorist
When’s the last time you owned an AMD GPU?

I bounce around between AMD and Nvidia, and I just don’t see this inferiority. My previous 5700 XT was a fantastic card for the money, and my 7900 XT flat out beats the more expensive 4070 Ti very often in rasterization.

People who don’t buy AMD, or who owned one many generations ago, still bring up drivers, but AMD’s drivers have been great. Their Adrenalin application is way better than Nvidia’s Control Panel.

AMD’s problem is they’re not selling them at a low enough price to take marketshare and mindshare away from Nvidia. Their cards have been great.
I have also switched around a lot and I'm currently very pleased with my 6700 XT. For the price, AMD cards perform very well; their best traits are just in different areas from Nvidia's. Right now Nvidia does much better in lighting and reflections, while AMD does much better in general rendering and textures.

My favorite strength of AMD is in whatever fire & smoke effects require. I'm not well enough educated in graphics tech to know what that is, but I know on Nvidia I'd always get big framerate drops with those, and I don't on AMD. That said, I'm a huge fan of raytracing, so that is a huge draw toward Nvidia for my next PC build if AMD doesn't get their shit together with that.

Nvidia does much better in power management, but whatever you gain in energy savings you lose in their much higher prices.
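As a rough sanity check on that trade-off, here's a minimal break-even sketch in C++; the price gap, wattage gap, and electricity tariff are made-up illustrative numbers, not measurements of any real card:

```cpp
#include <iostream>

// Toy break-even estimate: how long must a more efficient card run before
// its energy savings cover a higher purchase price? All figures below are
// assumptions for illustration, not real measurements.
int main() {
    const double price_delta_usd = 200.0;  // assumed extra cost of the efficient card
    const double watt_savings    = 75.0;   // assumed lower draw under load, in watts
    const double usd_per_kwh     = 0.15;   // assumed electricity price

    const double savings_per_hour = (watt_savings / 1000.0) * usd_per_kwh;
    const double break_even_hours = price_delta_usd / savings_per_hour;

    std::cout << "Break-even after ~" << break_even_hours << " hours of gaming (~"
              << break_even_hours / (4.0 * 365.0) << " years at 4 h/day)\n";
}
```

With these assumed numbers it works out to roughly 17,800 hours, or about 12 years at four hours a day, which is why the price premium tends to swamp the energy savings.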
 

Panajev2001a

GAF's Pleasant Genius
If AMD only makes entry-level GPUs, I wonder what will happen to PlayStation. Will they build their own GPUs, go to Nvidia or Intel, or take lots of entry-level GPUs and fuse them into one big one? Pretty interesting.
Sony already gets a semi-custom SoC from AMD: they shop from AMD's roadmap and add items they care about, kept console-only or brought to PC too (that depends on AMD), and/or redesign some units (like the Zen 2 FPU, reworked to lower power consumption and cut size). They also do custom silicon work with AMD or other partners (see the I/O unit in the main SoC, the Tempest 3D audio unit, and the custom SSD controller and NAND layout).

Also, consoles are power- and size-constrained $399-499 boxes; these are not high-end GPUs. Thanks to some customisations and low-level APIs developers can depend on, they are able to punch considerably above their weight. Consoles generally allow devs to use features way ahead of the PC, where it is more difficult to standardise and get early use out of them: a feature like DirectStorage on Xbox had an effect on console games years before some titles started introducing it on PC, and there are RT features the DX API on PC has yet to expose. Console devs have many levers to optimise their titles, thanks to low-level control and the small number of SKUs/HW profiles the APIs need to worry about.
 

PaintTinJr

Member
Given the power draw of the RDNA 3 options with very high clocks, which were excellent performers in rasterization and in RT when Rapid Packed Math half-floats and async compute were leveraged, it would make sense IMO for their next cards to rein in power consumption heavily. That could be with a view to eventually doing full APU-based discrete GPUs via PCIe, or external GPUs via mobile PCIe or USB-C for laptops, or just focusing on getting the £200-or-less GPU to be profitable, low-power and performant, with the APU ideas maybe put back to RDNA 5 or 6.
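(For context on Rapid Packed Math: it packs two 16-bit values into one 32-bit register and operates on both lanes with a single instruction, doubling FP16 throughput. Here's a toy CPU-side sketch of the packing idea using integer lanes; this is a plain C++ SWAR illustration of the concept, not AMD's actual FP16 hardware path:)

```cpp
#include <cstdint>
#include <iostream>

// Toy illustration of "packed math": two independent 16-bit lanes stored in
// one 32-bit word and added with a single 32-bit add (a classic SWAR trick).
// GPU Rapid Packed Math does the analogous thing with two FP16 values.
std::uint32_t packed_add16(std::uint32_t a, std::uint32_t b) {
    const std::uint32_t H = 0x80008000u;      // top bit of each 16-bit lane
    std::uint32_t sum = (a & ~H) + (b & ~H);  // per-lane add; carries cannot cross lanes
    return sum ^ ((a ^ b) & H);               // restore each lane's top bit
}

int main() {
    std::uint32_t a = (100u << 16) | 7u;  // lanes {100, 7}
    std::uint32_t b = (23u << 16) | 9u;   // lanes {23, 9}
    std::uint32_t r = packed_add16(a, b);
    std::cout << (r >> 16) << ", " << (r & 0xFFFFu) << "\n";  // prints "123, 16"
}
```

The point is that one 32-bit operation did two 16-bit additions, which is exactly the throughput win packed half-float math gives a GPU when full FP32 precision isn't needed.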
 

Elios83

Member
PC gaming hardware is just in full stagnation imo.
There is poor competition, which leads to high prices, slow innovation, and companies cutting costs on features (see VRAM amounts).
Now these big companies also have bigger customers to think about rather than enthusiasts.
There is the whole industry-wide AI boom to serve, and if we're talking about RDNA 4 GPUs, I wouldn't be surprised if AMD's main customers end up being console makers like Sony with their PS5 Pro GPU.
 

GreatnessRD

Member
Intel will also focus on the mid-range, really.

There's little to no return for focusing on the high end. It's for epeen wars.

Still a shame. Pathetic and weak if you ask me. Now I'm curious to find out what would be considered mid-range? 8800? 8700? 8800 XT? Oh yeah, the fun times are about to begin.
 

DaGwaphics

Member
AMD is in a position where just being a good value isn't good enough. The 6600/XT and 6700/XT have been solid values for months now, and the needle doesn't seem to have moved much (just going off the Steam survey). They need to create another 480/580/5700 XT: a card that just represents too much value to be ignored.

As others have mentioned, their AI and console business cuts into the amount of silicon they want to use for GPUs. Avoiding the largest dies should help them produce more GPUs as well; that might be a consideration in this decision.
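(A rough illustration of why avoiding the largest dies helps supply, using the classic Poisson yield approximation; the defect density and die sizes below are assumed numbers for the sketch, not foundry data:)

```cpp
#include <cmath>
#include <iostream>

// Classic Poisson yield approximation: yield = exp(-defect_density * die_area).
// All numbers are illustrative assumptions, not foundry data.
int main() {
    const double defects_per_cm2 = 0.1;  // assumed defect density
    const double mid_die_cm2     = 2.0;  // ~200 mm^2 mid-range die
    const double big_die_cm2     = 6.0;  // ~600 mm^2 high-end die

    auto yield = [&](double area_cm2) { return std::exp(-defects_per_cm2 * area_cm2); };

    std::cout << "Mid-range die yield: " << yield(mid_die_cm2) * 100.0 << "%\n"   // ~81.9%
              << "High-end die yield:  " << yield(big_die_cm2) * 100.0 << "%\n";  // ~54.9%
}
```

On top of the better yield per die, three times as many small dies fit on a wafer, so usable chips per wafer go up on both counts.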
 

Shodai

Member
I don't like the rumor, but it does make some sense. I concur the real problem here is foundry capacity, which really needs to be resolved sooner rather than later.
 

Hot5pur

Member
I am pretty confident people buy way more GPU than they need.
120 FPS for singleplayer? Why?
Raytracing on in most games? Why? It only makes a difference in a few.
Ultra settings? You don't actually notice the difference in 99% of cases.
xx80 tier GPUs for 1080p/1440p gaming? Seriously.
Not turning on FSR/DLSS when available? Craziness.

Granted, there have been shit ports that rely on brute force, which is a bad trend. I think the problem with PC gaming is FOMO, but you can have a great experience for $300-400 unless you absolutely need 4K, and even there I would not go higher than a 6950 XT. Raytracing is largely a FOMO gimmick until it becomes widely adopted and mainstream, where it actually makes a visible difference and not just puddle reflections (Cyberpunk and Dying Light 2 are notable instances where it is done well). I think we are at least two generations from where raytracing will actually be a big deal.
 
In one sense it kind of is. A lot of games are coming out, and a lot of them are even great games.

On the other hand, performance of a ton of AAA titles is shit and GPU price/performance ratio is plain terrible.

But every other hardware component is a lot more reasonable these days. Even AM5 motherboards.

And which games are available on PC but not playable on a PS5? Starfield, maybe. Gaming overall is in a new golden age, but PC gaming is certainly not.


That's a recent and temporary development, and it only affects CPUs and NAND storage. There was a supply-chain glut from ramping up chip production to satisfy demand; now, with a huge demand drop (people have already upgraded) and crazy inflation, the whole semiconductor industry is decreasing production so it can raise prices again. GPUs haven't been affected much by this; their prices remain high due to anchoring, from the abusive relationship PC gamers have been in with Nvidia since the 2080 dropped.


PC gaming has not been a good value since the pre-mining days, yes, even with the mining crash accounted for. The last big release to truly justify a high-end gaming rig was The Witcher 3. Between bad optimization, afterthought ports with bugs and issues (looking at you, Square Enix), and the near-total stagnation in VRAM, there is no compelling story to justify where GPUs are at today. It used to be that a new generation of GPUs dropped in price or stayed at the same price with huge efficiency boosts. That is no longer happening, and when you can simply buy a console for less than the price of a last-gen GPU alone and get better performance, I see no argument for PC gaming. Even Steam sales have sucked, and those used to help justify the price of expensive PC hardware.
 

hinch7

Member
I am pretty confident people buy way more GPU than they need.
120 FPS for singleplayer? Why?
Raytracing on in most games? Why? It only makes a difference in a few.
Ultra settings? You don't actually notice the difference in 99% of cases.
xx80 tier GPUs for 1080p/1440p gaming? Seriously.
Not turning on FSR/DLSS when available? Craziness.

Granted, there have been shit ports that rely on brute force, which is a bad trend. I think the problem with PC gaming is FOMO, but you can have a great experience for $300-400 unless you absolutely need 4K, and even there I would not go higher than a 6950 XT. Raytracing is largely a FOMO gimmick until it becomes widely adopted and mainstream, where it actually makes a visible difference and not just puddle reflections (Cyberpunk and Dying Light 2 are notable instances where it is done well). I think we are at least two generations from where raytracing will actually be a big deal.
This generation has barely started with 'next generation' graphics and game engines. PCs are going to need far more performance to brute-force their design inefficiencies versus consoles. And when more UE5 games come out, you're going to need a beefy system to run them well. See Remnant 2 for example, and that doesn't even use Nanite or mesh shaders.

If you're buying to play old games, then most modern GPUs will do fine. Going forward, we're still going to need big advances. Sadly, the AI boom has only just started, and GPU companies' focus is on that rather than consumer-grade GPUs. And AMD looks like it's largely throwing in the towel on the latter.
 

Hot5pur

Member
This generation has barely started with 'next generation' graphics and game engines. PCs are going to need far more performance to brute-force their design inefficiencies versus consoles. And when more UE5 games come out, you're going to need a beefy system to run them well. See Remnant 2 for example, and that doesn't even use Nanite or mesh shaders.
Is Remnant 2 the typical case for UE5, though? It seems like it was an unoptimized mess on a new engine. There are several engines that look much better and require far less.
Also, we are hitting massive diminishing returns. We have games from 5 years ago that look comparable to what we see today.
Even Cyberpunk, though it looks nice, is not a game changer in terms of visuals, pathtraced or not. I guess this is my personal opinion, but I can't see any reason to upgrade my 3080 to play at 4K60 in the typical case. I mostly turn off raytracing these days as well, as the only thing it changes is the framerate.
 

PaintTinJr

Member
What drugs did you take before writing this?
The original CELL simulator devkits by Fixstars were that very solution. My hunch is that for AMD to continue winning government contracts, arrays of powerful, low-power SoCs interconnected with their Infinity Fabric will be their solution, so having a retail product helps with scale and cost savings.

With Intel also now in the market, both AMD and Intel have the CPU and GPU technology to launch such products and pitch them to a very large market (low-end desktop and laptop buyers, up to mid-range desktop GPU buyers).

Socketed APUs seem like a logical step that is eventually coming to eliminate many of the limitations of PCIe and usher in unified memory on desktop PCs, so I would see this as a strategy to let them dip their toe in without being fully committed, while also pursuing the blue-water strategy and strengthening their low-power mobile tech position too.
 

Leonidas

Member
AMD leaving the high end is a good thing, maybe now they can be more competitive with Nvidia in the mid-range like they were in the RX 580 and 5700 XT generations. They still need to improve their RT and image upscaling though as those two things would still keep me from buying their GPUs.

I'd buy an AMD GPU if they matched Nvidia in RT, image upscaling, and efficiency and came in at a lower price.

People really have to stop buying into shady rumours so easily.

Multiple sources are saying it; it's not a shady rumour.
 

64bitmodels

Reverse groomer.
And which games are available on PC but not playable on a PS5?
How about literally everything that isn't on PlayStation: Nintendo games, older Sega games, arcade games (both old and modern, if you know where to look), Xbox exclusives, etc.

If the only selling point of PC in your eyes is the potential AAA exclusives the platform can offer, you simply aren't meant for PC gaming.
 
The leak is from MooresLawIsDead, so take it with a grain of salt, although he has been accurate in leaking certain PCBs of unreleased products in the past. Apparently this is the PCB of "Navi 4C", which was supposed to be used in the now-dead "8900 XTX". MLID claims that RDNA 4's top-end chip would have had multiple compute dies, and the design itself was way too complicated for AMD; getting it working properly would have meant much longer in the oven, which would then start clashing with the RDNA 5 timeline and schedule. He goes into much more detail in the video, but it goes way over my head; maybe the more tech-savvy folk on here can simplify it for us. I've added a time-stamped link as well.

[Image: the leaked "Navi 4C" PCB]



 
It sucks for everyone if they don't try to compete at the high end, but what they've accomplished over the last 10-15 years has almost always felt reactionary rather than revolutionary, and Nvidia's aggressive business moves helped it build a great and comfortable lead that it has maintained since. I've not once in the last few years felt excited for an AMD GPU, but usually every 4 or 5 years, when I'm in the mood for a GPU upgrade, I can see all these great features that accumulated over that time getting me excited for my eventual upgrade with Nvidia.
 
The only interesting question is how many CUs N43 will have, if N41 was going up to 200 CUs.
I don't see it going beyond 60 CUs, honestly, although they are targeting the low to mid-range with RDNA 4, so who knows.

If they can match the 7900 XTX in terms of performance but for a price of $500 then they'll have a hit on their hands.
 

Bry0

Member
The only interesting question is how many CUs N43 will have, if N41 was going up to 200 CUs.
Probably not that many more. They will likely still be monolithic for these mid-range RDNA 4 cards.

The end goal is pretty easy to see, though: they want scalability like they get on processors. Considering the weird quirks on N31, I'm not surprised this top-end design is being pushed back to RDNA 5. There is a ton of potential, but that is a veeeery complicated design, and I would imagine everyone in the Radeon group will have their hands full getting it working reliably.
Probably going to be very power hungry too.
 
I don't see the problem...

We don't know the numbers, so maybe their best-selling cards are mid-range cards... As long as those cards are feature-packed with good software, drivers, and AI, I don't see the issue here.
 