
AMD's 7000 series graphics cards hit almost 4GHz

SantaC

Member
Holy smokes, if this rumor is true, Nvidia could be in trouble for the first time in like 15 years.


https://wccftech.com/amd-rdna-3-rad...power-mode-next-gen-infinity-cache-confirmed/

 

Kuranghi

Member
This sucks: I could buy a 4000-series card and then this comes out afterwards, better in every way for probably less money. But I can't wait to see both, because I risk not being able to get a 4000-series card at a reasonable price if the 7000-series turns out to be all mouth and no trousers. Or will the 7000-series be out before the 4000?

I guess I'm overthinking it, actually: if I buy a 4000 card and prices balloon, but AMD turns out better some weeks/months later, I can just sell the Nvidia card and buy the AMD one on day one.
 

Crayon

Member
I'm not in for the first round of high-end cards, but I've been on the edge of my seat waiting to see what RDNA3 can do!!!
 

DonkeyPunchJr

World’s Biggest Weeb
Holy smokes, if this rumor is true, Nvidia could be in trouble for the first time in like 15 years.


https://wccftech.com/amd-rdna-3-rad...power-mode-next-gen-infinity-cache-confirmed/

We’ll see. They have been hyping next gen as a big improvement in performance per watt, which makes me think they won’t be able to match Nvidia in raw performance.
This sucks: I could buy a 4000-series card and then this comes out afterwards, better in every way for probably less money. But I can't wait to see both, because I risk not being able to get a 4000-series card at a reasonable price if the 7000-series turns out to be all mouth and no trousers. Or will the 7000-series be out before the 4000?

I guess I'm overthinking it, actually: if I buy a 4000 card and prices balloon, but AMD turns out better some weeks/months later, I can just sell the Nvidia card and buy the AMD one on day one.
Yes this is truly a crisis. A new video card is coming out, then another video card is also coming out and it might be better. ;)

FWIW I don’t expect this to be a repeat of last gen when people were snatching up like 2-3 year old video cards for over MSRP. There’s an excess of inventory right now + GPU mining has crashed.

I bet a few dipshits are gonna try to grab them and scalp them, only to realize that a lot has changed since 2020. It’ll pay to be patient for a few months.
 

Xyphie

Member
No reason to expect a meaningful Fmax delta between the RTX 4000 series and RDNA3. Except for Navi 33, both series will be built on TSMC 5nm-derived nodes using the same logic libraries.
 

M1chl

Currently Gif and Meme Champion
So what exactly? Nvidia in trouble, haha, nice joke. Radeon is a few generations behind: it doesn't have current-gen features for compute or RT, and their software SDK stack sucks ass.

This is like saying a Pentium 4 beats Ryzen because of frequency.
 

SantaC

Member
So what exactly? Nvidia in trouble, haha, nice joke. Radeon is a few generations behind: it doesn't have current-gen features for compute or RT, and their software SDK stack sucks ass.

This is like saying a Pentium 4 beats Ryzen because of frequency.
Remember Zen 2 → Zen 3?
Massive leap.
 

GymWolf

Member
Until they have something like DLSS to prolong the life of the GPU (or just to squeeze out more performance), they are DOA for me personally.

I don't even care about their inferior RT tech, but they need a DLSS alternative that works as well or better.

(I mean, unless their GPU has better performance than Nvidia+DLSS, then we can talk)
 

ahtlas7

Member
How is power consumption?

edit: From the article: Radeon RX 7000 GPUs and next-gen iGPUs will offer a range of new technologies, including refined adaptive power management that sets workload-specific operating points, making sure the GPU only draws the power the workload requires. The GPUs will also feature next-gen AMD Infinity Cache, offering higher-density, lower-power caches and reduced power needs for the graphics memory.

It doesn't specify numbers, but this does sound promising.
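Purely as a conceptual sketch of what "workload-specific operating points" could look like (the workload states, clocks, and voltages below are invented for illustration; AMD's real tables and selection logic aren't public):

```cpp
// Hypothetical DVFS-style operating point selection. All numbers invented.
#include <cstdio>

struct OperatingPoint { int mhz; int millivolts; };

enum class Workload { Idle, VideoPlayback, Gaming, Compute };

// Pick just enough voltage/frequency for what the GPU is actually doing,
// instead of running one worst-case state for everything.
OperatingPoint pick_operating_point(Workload w) {
    switch (w) {
        case Workload::Idle:          return {  300,  650 };  // drive the display, nothing more
        case Workload::VideoPlayback: return {  800,  750 };  // fixed-function decode, low clocks
        case Workload::Gaming:        return { 2800, 1100 };  // high sustained clocks
        case Workload::Compute:       return { 2400, 1000 };  // power-dense, clocks backed off
    }
    return { 300, 650 };
}

int main() {
    OperatingPoint p = pick_operating_point(Workload::Gaming);
    printf("%d MHz @ %d mV\n", p.mhz, p.millivolts);
}
```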
 

alucard0712_rus

Gold Member
So what exactly? Nvidia in trouble, haha, nice joke. Radeon is a few generations behind: it doesn't have current-gen features for compute or RT, and their software SDK stack sucks ass.

This is like saying a Pentium 4 beats Ryzen because of frequency.
Not a few generations behind, more like 1-1.5 :messenger_grinning_smiling:
Also, I don't understand why there's so much hate and disbelief toward Nvidia. They are nerds and build stuff long before everyone else realizes they need it. There's objectively no reason to think they will 'lose' anytime soon.
 

DonkeyPunchJr

World’s Biggest Weeb
Man I wish I had a dollar for every time an AMD fanboy predicted that Radeon++ would be the one that restores AMD to their rightful place on the graphics throne.

Then fast forward a few months: “well, it may not be the best right now, but just wait a couple years when it gets better drivers and games are optimized for it! I bet Radeon is gonna have longer legs”

Then fast forward a year: “hooooo boy, Radeon++ is shaping up to be a real beast. Nvidia is in trouble this time!!”
 

GymWolf

Member
I went back in time, to when they named their CPUs in an original way. NetBurst and all that kind of good stuff
Next time you go back in time, can you please erase The Last Jedi from history? Punch the expectations destroyer in his fluffy face or some shit.

Edit: I hope your empathy reactions are not because you are Last Jedi fans, or I may have to put you both on ignore :lollipop_squinting:
 

M1chl

Currently Gif and Meme Champion
Not a few generations behind, more like 1-1.5 :messenger_grinning_smiling:
Also, I don't understand why there's so much hate and disbelief toward Nvidia. They are nerds and build stuff long before everyone else realizes they need it. There's objectively no reason to think they will 'lose' anytime soon.
Hate on Nvidia? Not from my side, baby, I am all in on Nvidia. They can't do anything wrong.
 

SantaC

Member
Man I wish I had a dollar for every time an AMD fanboy predicted that Radeon++ would be the one that restores AMD to their rightful place on the graphics throne.

Then fast forward a few months: “well, it may not be the best right now, but just wait a couple years when it gets better drivers and games are optimized for it! I bet Radeon is gonna have longer legs”

Then fast forward a year: “hooooo boy, Radeon++ is shaping up to be a real beast. Nvidia is in trouble this time!!”
RDNA2 cards are good. Only Nvidia fanboys deny it.
 
I'm still getting a 4090 because I am not dealing with AMD's drivers. People love to say "but they fixed them!" after all this time, but they're still lacking in custom support, like in emulation, where I need everything to work without a hitch.
 

lukilladog

Member
Man I wish I had a dollar for every time an AMD fanboy predicted that Radeon++ would be the one that restores AMD to their rightful place on the graphics throne.

Then fast forward a few months: “well, it may not be the best right now, but just wait a couple years when it gets better drivers and games are optimized for it! I bet Radeon is gonna have longer legs”

Then fast forward a year: “hooooo boy, Radeon++ is shaping up to be a real beast. Nvidia is in trouble this time!!”

I don't think it would be too smart to dismiss the technological advantages AMD has now just because some passionate people don't know what they are talking about.
 

DonkeyPunchJr

World’s Biggest Weeb
RDNA2 cards are good. Only Nvidia fanboys deny it.
Good? Yes

Better than Nvidia? Not if you care about DLSS or ray tracing. And I’ll remind you that fanboys were saying “well, RDNA2 has more RAM, that’s really going to pay off later once games need more memory!!” <— still waiting!

Nvidia in trouble? Hell no lol

I don't think it would be too smart to dismiss the technological advantages AMD has now just because some passionate people don't know what they are talking about.

What advantages? And I am not dismissing anything, I am remaining skeptical instead of buying into yet another hype cycle. I’d be happy to see AMD really kick Nvidia’s ass, but I’ll believe it when I see it.
 
For a while, the rumor's been that AMD may actually beat Nvidia this upcoming gen.
AMD's gone chiplet while Nvidia is still monolithic, so AMD has the edge.

That explains why Nvidia was rumored to be pushing ungodly TDPs.

Only concern is AMD's chiplet drivers... and you never know, Nvidia might have some special sauce up their sleeve... but those Nvidia TDPs make me pessimistic.
 

GloveSlap

Member
Until they have something like DLSS to prolong the life of the GPU (or just to squeeze out more performance), they are DOA for me personally.

I don't even care about their inferior RT tech, but they need a DLSS alternative that works as well or better.

(I mean, unless their GPU has better performance than Nvidia+DLSS, then we can talk)
I'm in the same boat. I can overlook ray tracing to a degree, but DLSS is just too damn good. I'll keep an open mind though.
 

lukilladog

Member
What advantages? And I am not dismissing anything, I am remaining skeptical instead of buying into yet another hype cycle. I’d be happy to see AMD really kick Nvidia’s ass, but I’ll believe it when I see it.

Efficiency and the chiplet design; they can give you more GPU for the same price or even less.
 

winjer

Gold Member
Some years ago, I heard that AMD had plans to send engineers from its CPU division to the GPU division, to improve their designs.
Getting over 3GHz from RDNA2 was impressive, but 4GHz is even better.
 

DonkeyPunchJr

World’s Biggest Weeb
Next time you go back in time, can you please erase The Last Jedi from history? Punch the expectations destroyer in his fluffy face or some shit.

Edit: I hope your empathy reactions are not because you are Last Jedi fans, or I may have to put you both on ignore :lollipop_squinting:
Haha hell no. It was actual empathy.
 

PhoenixTank

Member
Efficiency and the chiplet design; they can give you more GPU for the same price or even less.
Yeah, that is definitely the exciting thing from AMD that we know about. It changed the game for CPUs, and I'm hopeful it can have the same effect for GPUs. These dies are getting excessively large and expensive.

I also don't believe for a second that we'll see 4GHz GPU clocks in normal use. Some ridiculous LN2 OC setup? Maybe. Happy to be proven wrong when we actually see them!
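To put rough numbers on "large and expensive": here's a toy calculation with the classic Poisson yield model, yield = exp(-area × defect density). The defect density and die areas below are invented for illustration, not real TSMC figures.

```cpp
// Toy die-yield comparison using the Poisson model. Numbers are invented.
#include <cmath>
#include <cstdio>

double yield(double area_mm2, double defects_per_mm2) {
    return std::exp(-area_mm2 * defects_per_mm2);  // fraction of defect-free dies
}

int main() {
    const double d0 = 0.001;  // hypothetical: 0.1 defects per cm^2
    printf("600 mm^2 monolithic die: %.0f%% yield\n", 100 * yield(600.0, d0));  // ~55%
    printf("150 mm^2 chiplet:        %.0f%% yield\n", 100 * yield(150.0, d0));  // ~86%
    // A defect scraps only a small chiplet instead of a huge monolithic die,
    // so far less silicon is wasted per defect; that's the MCM cost argument.
}
```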
 

Crayon

Member
I'm good with current FSR2 performance vs current DLSS performance. It's close enough for me. But that's current FSR compared to current DLSS. We'll see if there's any change in how the next round of cards handles these.
 
Mark Cerny said at the first PS5 presentation that higher clocks are better for a few things, especially ray tracing. It will be interesting to see how AMD's ray tracing goes up against Nvidia's. I believe performance-wise AMD will be better than Nvidia, and close in ray tracing.
 

SeraphJan

Member
FSR 2.1 is pretty good. If AMD could get their ray tracing performance on par with Nvidia, they could really stand a chance this time.
 

Sanepar

Member
So what exactly? Nvidia in trouble, haha nice joke. Radeon is few generations behind and it does not have current-gen features for compute, RT and their software SDK stack sucks ass.

This is like saying that Pentium 4 beats Ryzen due to frequency
Lol, what? RT aside, in rasterization the 6800 XT is on par with a 3080, and it overclocks better, coming close to a 3080 Ti.

You have no idea what you are talking about.
 

Crayon

Member
FSR 2.1 is pretty good. If AMD could get their ray tracing performance on par with Nvidia, they could really stand a chance this time.

Or at least get ray tracing within striking distance. The gap between RX 6000 and RTX 3000 is way too big. Like, ridiculous.
 

M1chl

Currently Gif and Meme Champion
Lol, what? RT aside, in rasterization the 6800 XT is on par with a 3080, and it overclocks better, coming close to a 3080 Ti.

You have no idea what you are talking about.
Besides RT: Tensor units, the CUDA SDK, and so on. DLSS. If you are buying a GPU only to play games that are out now, OK. However, those of us who also work in the industry need quite a bit more. Radeon is a hollow package; it can do rasterization and that's it. If you are OK with that, fine. But RT and other compute technologies will be used more and more, so if you are buying something to last a few years, Nvidia is definitely worth the extra.
 

Irobot82

Member
Besides RT: Tensor units, the CUDA SDK, and so on. DLSS. If you are buying a GPU only to play games that are out now, OK. However, those of us who also work in the industry need quite a bit more. Radeon is a hollow package; it can do rasterization and that's it. If you are OK with that, fine. But RT and other compute technologies will be used more and more, so if you are buying something to last a few years, Nvidia is definitely worth the extra.
In terms of graphics, what other "compute technologies" does Nvidia have that Radeon doesn't? And don't count black-boxed Nvidia PhysX and other nonsense.
 

M1chl

Currently Gif and Meme Champion
In terms of graphics, what other "compute technologies" does Nvidia have that Radeon doesn't? And don't count black-boxed Nvidia PhysX and other nonsense.
Yes: it has the pretty-much-industry-standard CUDA SDK, plus additional Tensor units for the "dumb math" used in ML. The GPU by itself can do operations like that, but then you are wasting rasterization power. Nvidia is pretty much the leader in machine learning/deep learning.
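For a sense of what "CUDA SDK" means in practice, here's a minimal sketch: a plain SAXPY kernel written against the CUDA runtime API. This runs on the general shader cores (the "wasting rasterization power" case); libraries like cuBLAS/cuDNN can route big matrix math onto the Tensor cores instead. Illustrative only, not tuned code.

```cpp
// Minimal CUDA runtime example: SAXPY (y = a*x + y) on the GPU.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one element per thread
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));  // unified memory keeps the demo short
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);  // ~4096 blocks of 256 threads
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);  // expect 4.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```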
 

Sanepar

Member
Besides RT: Tensor units, the CUDA SDK, and so on. DLSS. If you are buying a GPU only to play games that are out now, OK. However, those of us who also work in the industry need quite a bit more. Radeon is a hollow package; it can do rasterization and that's it. If you are OK with that, fine. But RT and other compute technologies will be used more and more, so if you are buying something to last a few years, Nvidia is definitely worth the extra.
The 3000 series will perform like crap in future RT games. Even a 3080 can't handle RT decently in most games, so what you are saying doesn't make sense.

And with a 6800 XT or 6900 XT you don't need DLSS or FSR. The only game a 3080 or 6800 XT can't handle at native 4K@60 is Cyberpunk.
 

Brigandier

Member
It really would be nice to see AMD make some great cards again that bring serious competition to Nvidia and the market.

Nvidia has been comfortable for too long; they need some competition again.
 

Buggy Loop

Member
Man I wish I had a dollar for every time an AMD fanboy predicted that Radeon++ would be the one that restores AMD to their rightful place on the graphics throne.

Then fast forward a few months: “well, it may not be the best right now, but just wait a couple years when it gets better drivers and games are optimized for it! I bet Radeon is gonna have longer legs”

Then fast forward a year: “hooooo boy, Radeon++ is shaping up to be a real beast. Nvidia is in trouble this time!!”

Yuuupppp

I was on that underdog train since the ATI days, way back in the late '90s.

Oh, wait till AMD’s unified drivers
Oh, wait for AMD’s (insert name I have forgotten because it was irrelevant) drivers
Oh, wait till compute becomes important, clearly an advantage!

Etc., etc. With Pascal, I switched to Nvidia.

RDNA 2 had the same kind of wild rumours before the official specs too. Forums claimed BVH performance that would be 4 times Nvidia's! OMG, it's gonna crush Nvidia when console games with ray tracing get ported to PC, made for AMD... 4GHz? I think my Taiwanese uncle who sees AMD samples saw that number! Oh my gosh!

It's all crap till we see the unveilings (Nvidia rumours too, btw, same nonsense), and even then, until we actually test them.

But yes!!! This is the one. This is the time Nvidia somehow makes the mistake of not going MCM (they aren't too good at architecture/cost optimization, right?), the time the competition gets quantum-universe siphoning for out-of-nowhere clocks, less heat, and less wattage, literally breaking the laws of physics. Oh, and they also sit on their throne and stop R&D (as usual, Nvidia!!), with some Intel/Ryzen analogy layered on top of all that, somehow.

Poor Nvidia

 

M1chl

Currently Gif and Meme Champion
The 3000 series will perform like crap in future RT games. Even a 3080 can't handle RT decently in most games, so what you are saying doesn't make sense.

And with a 6800 XT or 6900 XT you don't need DLSS or FSR. The only game a 3080 or 6800 XT can't handle at native 4K@60 is Cyberpunk.
But it has DLSS, which helps a lot, and it's better in motion than FSR 2.0. Besides, native res is overrated with techniques like DLSS 2.0+.

As for the bolded part, I don't know what it means.
 