
AMD’s Flagship Navi 31 (512 MB cache) GPU Based on Next-Gen RDNA 3 Architecture Has Reportedly Been Taped Out

winjer

Gold Member
Because the old hardware will still be around and catered to, which limits the benefits you get from the new hardware. Not saying there still won't be huge benefits.

You do realize that most games on PC have graphics options that can scale to higher settings.
 

Cacadookie

Neo Member
Of course they can. PC can run at higher settings than consoles.
On PC you can run all games at true 4K. Or higher if one desires.
Or you can run games at high refresh rates. 144 Hz displays are very common on PC, and some go to 240 Hz.
Then there is ray-tracing. There are more games on PC that use ray-tracing. And most use much higher quality settings than on consoles.
And then there is the usual stuff: higher-resolution shadows, volumetric fog and illumination, increased LODs, heavier use of tessellation.
And the list goes on and on.
Considering devs already have an artistic and visual look in mind, they don't usually scale up hard enough for it to matter, especially on PC. I can understand more FPS and higher resolution, but some of us just want the next Crysis, and smaller details such as increased LODs and higher-resolution shadows don't always give the visual leap some people want. I can see illumination, volumetric fog, and tessellation being that, though. I would also want terrain deformation, higher-quality water physics, increased particle effects, destruction, splintering, etc. I think in the end we are still going to be limited by consoles. I hope not, though.
 

sendit

Member
[Lift Off Moon GIF by Stakin]

What they will actually end up being used for

Depends on the hash rate. I currently have four 2080s mining that are looking a bit old.
 

Papacheeks

Banned
It is all over for Nvidia

Honestly, we're what, 3 years or so after Nvidia pushed ray tracing really hard? What games have really shown a case for its use? Sony's big one is Spider-Man? I can really see that AMD is going for rasterization, and as time goes on it will get better at ray tracing.

I care about resolution, frames, and overall IQ in a game. Ray tracing is not being used in games in any big, meaningful way.
 
Reminds me of PC modders who have to add rain and puddles onto every square inch of any game just to show off the RT reflections lmao
 

DonkeyPunchJr

World’s Biggest Weeb
It is all over for Nvidia
Wish I had a dollar for every time I’ve heard an AMD fanboy make a hyperbolic claim about some rumored next-gen CPU/GPU.

Bonus: once it releases, I bet you’ll be here saying “oh yeah well it’s too soon to judge, just wait until we get better drivers/games are optimized for this new architecture, then we’ll really see what it’s capable of!”

We’ll see. I hope we get some fierce competition but I am not going to project that my favorite company is for real gonna dominate this time based on some rumored specs for a future product. You’d think the fanboys would’ve learned by now.
 

SantaC

Member
Wish I had a dollar for every time I’ve heard an AMD fanboy make a hyperbolic claim about some rumored next-gen CPU/GPU.

Bonus: once it releases, I bet you’ll be here saying “oh yeah well it’s too soon to judge, just wait until we get better drivers/games are optimized for this new architecture, then we’ll really see what it’s capable of!”

We’ll see. I hope we get some fierce competition but I am not going to project that my favorite company is for real gonna dominate this time based on some rumored specs for a future product. You’d think the fanboys would’ve learned by now.
Except zen 3 was the real deal.
 

Bo_Hazem

Banned
That's about 41 and 61 TFLOPs for the top two if they were both clocked @ 2000 MHz

Makes current gen consoles seem like pipsqueaks.

Seems like 4K@60fps with RT would be the baseline. Current gen would serve as cross-gen for a PS5 Pro iteration, I guess. It seems to me the only major upgrade from here, outside TFLOPs, is going full ARM architecture.
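For reference, here's the arithmetic behind those figures, a minimal sketch assuming the rumored (not confirmed) shader counts of 15360 ALUs for Navi 31 and 10240 for Navi 32:

```python
# FP32 throughput = 2 ops per ALU per clock (one fused multiply-add) x ALU count x clock (GHz)
def tflops(alus: int, clock_ghz: float) -> float:
    return 2 * alus * clock_ghz / 1000.0  # GFLOPs -> TFLOPs

print(tflops(15360, 2.0))  # rumored Navi 31: 61.44 TFLOPs
print(tflops(10240, 2.0))  # rumored Navi 32: 40.96 TFLOPs
```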
 

DonkeyPunchJr

World’s Biggest Weeb
Except zen 3 was the real deal.
Wow so after literally 10 years of “just wait until next year’s CPU!!!” you finally guessed right. And all it took was a half-decade delay in Intel’s fab roadmap to allow them to catch up.

Truly, you AMD fanboys have an almost prophetic ability to predict the outcome of future product match-ups, and this isn't just another iteration of the hype cycle. I stand corrected.
 

Dream-Knife

Banned
It's AMD, so they will make 100 reference cards, with Sapphire and PowerColor getting 500 each.
 

SantaC

Member
Wow so after literally 10 years of “just wait until next year’s CPU!!!” you finally guessed right. And all it took was a half-decade delay in Intel’s fab roadmap to allow them to catch up.

Truly, you AMD fanboys have an almost prophetic ability to predict the outcome of future product match-ups, and this isn't just another iteration of the hype cycle. I stand corrected.
You seem triggered 😁

Are you…ok?
 

martino

Member
Considering devs already have an artistic and visual look in mind, they don't usually scale up hard enough for it to matter, especially on PC. I can understand more FPS and higher resolution, but some of us just want the next Crysis, and smaller details such as increased LODs and higher-resolution shadows don't always give the visual leap some people want. I can see illumination, volumetric fog, and tessellation being that, though. I would also want terrain deformation, higher-quality water physics, increased particle effects, destruction, splintering, etc. I think in the end we are still going to be limited by consoles. I hope not, though.
Higher rendering resolution = more detail not lost on the screen.
If we really use 8K textures this gen, we will lose a part of them if the rendering resolution is 1080p/1440p reconstructed to a higher res (especially in the distance).
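A rough sketch of the mipmapping math behind that point; the texture size and screen coverage here are illustrative numbers, not from any particular game:

```python
import math

# The GPU picks the mip level whose texel density roughly matches one texel per screen pixel.
# If an object spans fewer pixels than the texture has texels, the top mips are never sampled.
def highest_useful_mip(texture_size: int, pixels_covered: int) -> int:
    if pixels_covered >= texture_size:
        return 0  # the full-resolution mip is actually visible
    return math.floor(math.log2(texture_size / pixels_covered))

# An 8K (8192 px) texture on an object spanning ~1000 px of a 1080p frame:
print(highest_useful_mip(8192, 1000))  # -> 3, i.e. only a 1024x1024 level of detail survives
```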
 

Lethal01

Member
You do realize that most games on PC have graphics options that can scale to higher settings.

I realize that; I specifically mentioned it, right? And those options are the reason there will be tons of benefit from getting a new card.

I'm just saying that even games that scale well will be held back by having to be able to scale down to work on older cards. To fully take advantage of ray tracing you have to build the entire game around real-time ray tracing being on at all times. The devs of Metro Exodus said the same thing.
 
Next year gonna be so crazy.

  • RDNA3
  • Lovelace
  • Arc
Then on the CPU side you've got Alder Lake coming in a couple of days, then next year Zen 3+, Zen 4, and Raptor Lake in the second half. Shit bout to be too wild.

But to stay on topic, those MCM GPUs gonna be some real heaters, lmao

Plus finally implementing:
DDR5
Wi-Fi 6E
PCIe 5.0
NVMe SSDs with up to 16 GB/s read/write speeds (almost equal to PS3 and X360 RAM bandwidth; quick math below)
USB 4.0
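A quick back-of-the-envelope check on that 16 GB/s figure, assuming a PCIe 5.0 x4 drive; the console numbers are the published RAM bandwidths:

```python
# PCIe 5.0 link: 32 GT/s per lane with 128b/130b encoding
lanes = 4
raw_gt_per_s = 32
encoding_efficiency = 128 / 130
pcie5_x4_gb_per_s = lanes * raw_gt_per_s * encoding_efficiency / 8  # bits -> bytes
print(f"PCIe 5.0 x4: ~{pcie5_x4_gb_per_s:.2f} GB/s")  # ~15.75 GB/s

# Published console RAM bandwidths for comparison (GB/s):
xbox360_gddr3 = 22.4
ps3_xdr = 25.6
```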
 

Interfectum

Member
Will PC gamers even get to take advantage of any of this compute performance considering consoles exist?

Most next-gen games are already compromising with a performance mode vs a quality mode, RT on/off. The PC versions could just run the quality mode at full performance, with ray tracing enabled, etc.
 

Cacadookie

Neo Member
Most next-gen games are already compromising with a performance mode vs a quality mode, RT on/off. The PC versions could just run the quality mode at full performance, with ray tracing enabled, etc.
Depends on what settings they use and how they use RT. Most games that have Ultra settings on PC never really LOOK ultra and still end up using a lot of resources that could be put to better use. With GPUs heading north of 40 to 60 TF, there will be a lot of headroom to do wayyy more than just a res bump and an FPS bump.
 

Sosokrates

Report me if I continue to console war
This thread has nothing to do with consoles, and yet you manage to shitpost a "humour" comparison with hardware from the future just to downplay another platform. At least try to be funny.
Seems more of a "you" problem.

If you're that upset by someone referring to the consoles as "pipsqueaks" in comparison to such vastly more powerful hardware, perhaps it's time to take a break.
 

Darius87

Member
Seems more of a "you" problem.

If you're that upset by someone referring to the consoles as "pipsqueaks" in comparison to such vastly more powerful hardware, perhaps it's time to take a break.
You're stating the obvious, so what's the point? Wanna derail a thread about AMD cards into AMD vs. consoles? Go on... troll 🦴👅
 

PaintTinJr

Member
Have AMD said what their solution to Nvidia's RTX IO (+ GPU compute) is yet?

I wonder if the Infinity Cache sizes are actually meant to bypass the need for RTX IO and just use the NVMe drive on the motherboard, or a PCIe bus adapter, with the DirectStorage API, using the GPU for decompression as a simpler solution, one that scales performance with the Infinity Cache size, even if only half a card's Infinity Cache is used for DirectStorage to feed decompression on the CUs.
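Some rough numbers on that idea, purely speculative on top of the rumor in the thread title, assuming half of a 512 MB Infinity Cache were set aside as a staging buffer for a 16 GB/s NVMe feed:

```python
# Hypothetical: half of a 512 MB Infinity Cache staging a 16 GB/s NVMe stream
cache_mb = 512
staging_mb = cache_mb / 2        # 256 MB reserved for streamed data
nvme_mb_per_s = 16_000           # ~16 GB/s, a PCIe 5.0 x4-class drive
buffer_s = staging_mb / nvme_mb_per_s
print(f"~{buffer_s * 1000:.0f} ms of raw stream buffered")  # ~16 ms, about one 60 fps frame
```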
 

Dream-Knife

Banned
Seems more of a "you" problem.

If you're that upset by someone referring to the consoles as "pipsqueaks" in comparison to such vastly more powerful hardware, perhaps it's time to take a break.
PCs have been more powerful than consoles for like 20 years or more, so I don't really see the point of bringing it up.

PCs offer much more than power, so if you want to wage some console vs PC war, I suggest you focus on that.

There is always more powerful hardware coming out. These new cards will make the 3090 mid-tier. That's how it always is.
 

Kenpachii

Member
It is all over for Nvidia

They do have Hopper. I wonder if their Lovelace solution is going to get ditched overnight (which would be a good thing, as I see it as a filler gen), or maybe their Lovelace solution has an MCM design also.
Still, I could see Lovelace compete against this card, if only for the RT solution, so Nvidia could bank on that.

Considering devs already have an artistic and visual look in mind, they don't usually scale up hard enough for it to matter, especially on PC. I can understand more FPS and higher resolution, but some of us just want the next Crysis, and smaller details such as increased LODs and higher-resolution shadows don't always give the visual leap some people want. I can see illumination, volumetric fog, and tessellation being that, though. I would also want terrain deformation, higher-quality water physics, increased particle effects, destruction, splintering, etc. I think in the end we are still going to be limited by consoles. I hope not, though.

There was no reason to buy a GTX 580, because it was massive overkill when it came out. Then Metro 2033 and AC Unity came out, and the 580 was crippled, needing an entire new tier of performance. With RT being a thing, games could very well make the shift to RT-only, like Metro Exodus, sooner rather than later when AMD joins the club to make use of those CUs.
 

Dream-Knife

Banned
There was no reason to buy a GTX 580, because it was massive overkill when it came out. Then Metro 2033 and AC Unity came out, and the 580 was crippled, needing an entire new tier of performance. With RT being a thing, games could very well make the shift to RT-only, like Metro Exodus, sooner rather than later when AMD joins the club to make use of those CUs.
The 580 came out because the 480 was crap. It used like 300 W and ran hot as hell.

The 580 can play BF1 at 1080p medium at 60 FPS. One of my friends still has one in his ancient Windows 7 machine.
 

Trogdor1123

Gold Member
Next year gonna be so crazy.

  • RDNA3
  • Lovelace
  • Arc
Then on the CPU side you've got Alder Lake coming in a couple of days, then next year Zen 3+, Zen 4, and Raptor Lake in the second half. Shit bout to be too wild.

But to stay on topic, those MCM GPUs gonna be some real heaters, lmao
Without the shortages it would be great as a consumer... Sad face
 

GreatnessRD

Member
Plus finally implementing:
DDR5
Wi-Fi 6E
PCIe 5.0
NVMe SSDs with up to 16 GB/s read/write speeds (almost equal to PS3 and X360 RAM bandwidth)
USB 4.0
That is also exciting, but since I mostly use my PC for gaming, by the time that matters I'll be ready to upgrade anyway. They haven't even saturated PCIe 4.0 yet. I'm more excited about Microsoft's DirectStorage. That should be a game changer, and then I'll upgrade to a Gen4 NVMe. Right now I'm fine with the two Gen3s I have.
Without the shortages it would be great as a consumer... Sad face
The shortages are a lie. In the beginning, sure, it was real. Not anymore. Nobody will convince me they aren't manufacturing the shortage to keep prices high, since they see psychopaths are willing to part with their money for low-end stuff. lol
 

Kenpachii

Member
The 580 came out because the 480 was crap. It used like 300 W and ran hot as hell.

The 580 can play BF1 at 1080p medium at 60 FPS. One of my friends still has one in his ancient Windows 7 machine.

I had two. AC Unity wasn't even playable at any level because of too little VRAM, same for Watch Dogs. Your friend having fun with his 580 is great, but 1.5 GB of VRAM didn't age well and made me rage endlessly, even while people were saying before the PS4 release that consoles would never go to such huge amounts because games were limited before. Still, performance of the 580 was trash when next-gen games started to hit and VRAM requirements started to rise drastically.

RT could easily become a requirement for newer games. Nvidia moved on a long time ago; AMD is now already moving into their second gen on this front, and that gen will see a huge leap in performance if those stats are right.

With consoles supporting RT in their hardware, we could see it become a requirement any day now, as AMD is no longer crippling the market.

Think of Metro Exodus Enhanced Edition as AC Black Flag. Now imagine AC Unity. You will need all the performance you can get.
 

xPikYx

Member
Next year gonna be so crazy.

  • RDNA3
  • Lovelace
  • Arc
Then on the CPU side you've got Alder Lake coming in a couple of days, then next year Zen 3+, Zen 4, and Raptor Lake in the second half. Shit bout to be too wild.

But to stay on topic, those MCM GPUs gonna be some real heaters, lmao
Good luck finding one 😂
 

ToTTenTranz

Banned
Are you sure? There were rumors that the release slipped into Q1 2023.
Bondrewd claimed AMD's GPUs are on a 6-quarter cadence, meaning the first RDNA3 graphics card should appear up to a year and a half after RDNA2.
Navi 21 released in the middle of Q4 2020, so Navi 31 should be 6 quarters after that, in Q2 2022.

I'll believe Bondrewd over any Weibo lurker who repeats what he reads on Twitter.
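The cadence math, for what it's worth, treating quarters as a simple offset:

```python
def add_quarters(year: int, quarter: int, n: int) -> tuple:
    """Advance a (year, quarter) pair by n calendar quarters."""
    total = year * 4 + (quarter - 1) + n
    return (total // 4, total % 4 + 1)

print(add_quarters(2020, 4, 6))  # Navi 21 in Q4 2020 + 6 quarters -> (2022, 2), i.e. Q2 2022
```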
 

GreatnessRD

Member
Good luck finding one 😂
Lmao!!

Well, I was able to grab a 3060 Ti at launch for AIB OG MSRP (MSI Ventus, $450). Then I was able to grab a 6800 XT off AMD's website in June of this year at MSRP. If I put up a decent fight, I think I'd make out. However, since I just bought this 6800 XT, I have no desire to buy the new GPUs unless they just completely blow life away. This 6800 XT should hold me until at least '25.
 
Wow so after literally 10 years of “just wait until next year’s CPU!!!” you finally guessed right. And all it took was a half-decade delay in Intel’s fab roadmap to allow them to catch up.
?????????

Did you sleep through Zen, Zen+ and Zen 2?
If Zen didn't exist, you can bet your ass we'd still be slumming it with 4c/4t i5s from Intel because of the shitty yields they're getting from their shitty nodes.

And Intel fucking up their process technology isn't AMD's problem; it's not the consumers' problem, it's not our problem. You compete with what's on the market, and if Intel, with all their size and R&D budget, can't compete with a CPU manufacturer that was less than 1/10th their size? Well... I'd say it's more embarrassing for Intel to even be losing at all.

Truly, you AMD fanboys have an almost prophetic ability to predict the outcome of future product match-ups, and this isn't just another iteration of the hype cycle. I stand corrected.
AMD fanboys?
AMD have been executing and delivering to a high level for the last 5 years now. Their CPUs are causing Intel all sorts of problems in the datacenter because Zen 3 is just better than Ice Lake. They have a better node in TSMC's N7 vs Intel's 10nm (now rebranded to Intel 7), and they have better product planning. Their bets on chiplets from way back in 2016/17, with the first Zen and the Zeppelin dies along with their Infinity Fabric, have now been proven.
And we can clown on Radeon forever for GPUs in the gaming space, and rightly so; Vega was garbage for gaming. But when it comes to HPC... well, Vega was a very good compute architecture, and their current CDNA compute accelerators are doing very good business with hyperscalers.

Whether you like it or not, since Lisa Su took over, AMD are a very effectively run business. As I've said earlier, they have been reliably executing on their roadmaps for the last five years, with delays really only stemming from the pandemic's latent effects and the silicon shortages that are impacting the entire industry. Certainly they've been able to deliver to expectations in the CPU space far more effectively than Intel has. And with CDNA and RDNA2 they seem to have corrected their approach in GPU to the point where they are competitive again.
 