
AMD RX 6000 'Big Navi' GPU Event | 10/28/2020 @ 12PM EST/9AM PST/4PM UK

Kenpachii

Member
Not an AMD fanboy, but I will go out on a limb and say that their future super sampling offering will allow the 6800 XT to grow into a greater card than it already is. I am not too concerned about RT performance right now, and by the time this new SS is offered it will gain a much needed performance boost in RT intensive games.

For them to mention they are working on this and future technology with MS (DX12 Ultimate) to make games load faster/run smoother gives me the confidence to go team red this time.

I really wish they would have gone more in depth on this, but frankly, since they spent so little time on it, I'm afraid it's not going to matter for this generation of cards even remotely. It will most likely support

Yep, bloody AMD don't support their Ryzen 3000 with Smart Access Memory.
Not buying their 6000 series.

I expect Nvidia to launch the rumored 3080 Super/Ti with 12GB of RAM now. Probably around $900; get that instead!
Hopefully it's built on TSMC too.

12GB is laughable. They need 16GB at the minimum.

It's 10GB of GDDR6X vs 16GB of GDDR6. I wonder which will matter more... probably the 3080 will be too slow by the time games require that much VRAM anyway.

These cards will stay valid for the entire PS5 and Xbox generation, mate. Games will move up in VRAM consumption because of next gen. AMD is not throwing 16GB on their cards just to throw away money; it's going to be the new standard and everybody knows it.

16GB of slower VRAM > 10GB of faster VRAM; it's not even a question. Nvidia fucked up with their 10GB model.
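For a rough sense of what "faster" and "slower" mean here, a quick back-of-the-envelope from the reference specs (peak numbers only, not benchmark data):

```cpp
#include <cstdio>

// Peak memory bandwidth from the reference specs; real-world throughput
// depends on the workload and, on RDNA 2, on Infinity Cache hit rates.
int main() {
    // bandwidth (GB/s) = bus width (bits) / 8 * data rate (Gbps per pin)
    double bw_3080   = 320.0 / 8.0 * 19.0; // RTX 3080: 320-bit GDDR6X @ 19 Gbps
    double bw_6800xt = 256.0 / 8.0 * 16.0; // RX 6800 XT: 256-bit GDDR6 @ 16 Gbps
    printf("RTX 3080:   %.0f GB/s across 10 GB\n", bw_3080);   // 760 GB/s
    printf("RX 6800 XT: %.0f GB/s across 16 GB\n", bw_6800xt); // 512 GB/s
    return 0;
}
```

So the 3080 moves data roughly 50% faster, while the 6800 XT simply has more of it to move; which matters more depends on whether a game's working set fits in 10GB.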
 

Alienware

Member
Live look at everyone who paid for a 3080 and 3090.

I don't have a 3080 because Nvidia didn't want my money, but I would definitely NOT feel buyer's remorse if I had one in my case right now.
AMD did a solid job, but no miracles.
 

rofif

Banned
I really wish they would have gone more in depth on this, but frankly, since they spent so little time on it, I'm afraid it's not going to matter for this generation of cards even remotely. It will most likely support



12GB is laughable. They need 16GB at the minimum.



These cards will stay valid for the entire PS5 and Xbox generation, mate. Games will move up in VRAM consumption because of next gen. AMD is not throwing 16GB on their cards just to throw away money; it's going to be the new standard and everybody knows it.

16GB of slower VRAM > 10GB of faster VRAM; it's not even a question. Nvidia fucked up with their 10GB model.
Probably, but what is normal RAM doing? We also all have at least 16GB of system RAM aside from VRAM. Consoles have 16GB of total shared memory, right?
 
Ridiculous post. Why does the use of DLSS matter if the end result is indistinguishable from native resolution? Resolutions are becoming so massive (e.g. 4K and 8K) that using brute force to run games, especially with advanced and heavily taxing features such as ray tracing, makes no sense, while using intelligence (i.e. AI upscaling) does.

Then it's not really 4K if it's using DLSS.
Same with their claim of 8K with DLSS, which doesn't end up being real 8K.


Tech Jesus said it best.


I also think you're overselling this feature. AMD will have it too shortly, since they said it's coming, but keep hammering on it until that moment.
 

BluRayHiDef

Banned
AMD RX 6000 Series: Solomon Grundy, who relies on his incredible strength (i.e. CUs) to solve all of his problems and therefore fails to solve the ones that require cognition.

Nvidia RTX 30 Series: Bruce Wayne, who uses a combination of brains (i.e. tensor cores) and brawn (i.e. CUDA cores) to solve his problems, therefore being able to solve them all.
 

Deleted member 17706

Unconfirmed Member
I wouldn't bet the house on this. GDDR6X has such crazy bandwidth.

I'd bet they are probably equivalent. We'll let real benchmarks speak the true true.

Yep. None of this means shit until we get third party benchmarks.
 

GymWolf

Gold Member
Wait, I forgot about the +6GB on the 6800 XT compared to the 3080; that's a good selling point.

Now I only need to know if having 6GB more can offer similar performance gains compared to DLSS in VRAM-hungry games. How does it work usually? If you don't have enough VRAM, you can't use a 4K texture pack or RTX at high settings? It's never happened to me, so I'm new to these types of problems.

Edit: fuck, I forgot about Nvidia using the new type of memory in the 3080. Is faster VRAM a good substitute for more VRAM in general? Or is more VRAM always better?
 

Kenpachii

Member
I wouldn't bet the house on this. GDDR6X has such crazy bandwidth.

I'd bet they are probably equivalent. We'll let real benchmarks speak the true true.

Real benchmarks are useless for 16GB vs 10GB VRAM comparisons, as no game is currently built for next-generation consoles only and pushes the envelope; we still sit in the PS4 generation.
 

Kenpachii

Member
Wait, I forgot about the +6GB on the 6800 XT compared to the 3080; that's a good selling point.

Now I only need to know if having 6GB more can offer similar performance gains compared to DLSS in VRAM-hungry games. How does it work usually? If you don't have enough VRAM, you can't use a 4K texture pack or RTX at high settings? It's never happened to me, so I'm new to these types of problems.

Edit: fuck, I forgot about Nvidia using the new type of memory in the 3080. Is faster VRAM a good substitute for more VRAM in general? Or is more VRAM always better?

This is what happens when you run out of VRAM: all low settings.



With the game released before this one, he could play on ultra without issues at a stable framerate, but then next gen arrived, and even though the GPU is fast enough, the VRAM isn't holding up, to the point that the card is useless. Think about that for a second. This is why I am so anal about VRAM. You need X amount of it before anything else.

Now with the 3080, it will probably still be able to play games, just at reduced settings; even while the GPU is fast enough to push higher settings, the VRAM simply isn't holding up. This is why a 6800 XT will age perfectly fine, and the 3090 too, but the 3080 not so much, and the 3070 is just a sadistic joke at this point.

Nvidia really should have pushed 16GB cards, but they won't, because they want you to upgrade 1.5 or 2 years from now, whenever their new GPU comes out. And one way to make you do so, if the performance increase is lackluster, is more VRAM.
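To make the "you need X amount of it before anything else" point concrete, here is a rough, purely illustrative budget; the texture count, format, and buffer sizes are assumptions, not figures from any real game:

```cpp
#include <cstdio>

// Illustrative VRAM budget showing how a next-gen texture set can crowd a
// 10 GB card even when the GPU core is fast enough. All counts are assumed.
int main() {
    const double MiB = 1024.0 * 1024.0;
    // A 4096x4096 texture in BC7 (1 byte per texel) is 16 MiB;
    // a full mip chain adds roughly one third on top.
    double tex_4k_mib = 4096.0 * 4096.0 / MiB * (4.0 / 3.0);
    int resident = 400;                                    // assumed resident streaming pool
    double textures_gib = tex_4k_mib * resident / 1024.0;  // ~8.3 GiB
    double buffers_gib  = 1.5;                             // assumed render targets, shadow maps, etc.
    printf("One 4K BC7 texture + mips: ~%.1f MiB\n", tex_4k_mib);
    printf("Assumed total: ~%.1f GiB textures + %.1f GiB buffers = ~%.1f GiB\n",
           textures_gib, buffers_gib, textures_gib + buffers_gib);
    return 0;
}
```

Once that total spills past what the card physically has, the driver starts paging over PCIe and the practical fix is dropping texture quality, which is exactly the "fast GPU, low settings" situation described above.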
 

rofif

Banned
This is what happens when you run out of VRAM.


That's Unity on 2GB, so kinda extreme, but yeah. Battle for Middle-earth ran about the same on my 256MB GeForce back in the day until I got the 512MB model.
But now we have 10GB of GDDR6X. Maybe it will swap faster? Maybe it depends on the game, etc. Who knows. I sure hope that 10GB on the 3080 is enough :lol:
 

rofif

Banned
You know what else seems to be in RAGE mode? Nvidia fans.

I really don't get why. Don't you want GPU prices kept in check at least somewhat? This was a good day for PC gaming. The underfunded underdog came back from the dead today. Rejoice!
Nobody is angry. It's fantastic that AMD finally caught up. It will be good for all.
And I was hunting all week for a 3080 FE a week ago (and I got it).
 

GymWolf

Gold Member
This is what happens when you run out of VRAM: all low settings.



With the game released before this one, he could play on ultra without issues at a stable framerate, but then next gen arrived. Think about that for a second. This is why I am so anal about VRAM. You need X amount of it before anything else.


Let me ask you one thing: does using DLSS reduce the amount of VRAM that the game normally needs to work? Maybe that's why Nvidia was less anal about the VRAM quantity?
 
I love this new narrative from Nvidia fans about the 6000 series being "brute force" machines while the 3000 series are supposedly majestically efficient engineering marvels. Keep it up guys!

This is especially funny when you consider the following:
- AMD using tons of know-how and efficiency in designing their architecture, utilizing new and efficient techniques such as Infinity Cache
- Doing this with a smaller 256-bit bus
- +54% perf per watt for the 6800 XT
- +63% perf per watt for the 6900 XT
- Much smaller die size
- Lower power draw (300W)
- Based on AIB leaks, lots of OC headroom if you increase the power draw
- Performance matching their Nvidia equivalent tier
- Synergy with Ryzen 5000 series processors to grant even more performance

And yet these are supposed to be inefficient "brute force" cards compared to Ampere, which:
- Has higher power draw
- Has lower perf per watt
- Was pushed past the point of efficiency
- Does not OC well
- Is on a larger die
- Has a larger bus
- Uses GDDR6X memory instead of GDDR6

Such efficiency, much majesty... wow
 

me0wish

Member
You know what else seems to be in RAGE mode? Nvidia fans.

I don't see anyone actually raging; please keep the console-wars mentality in console war threads.

Competition is always good, but I expected cheaper from AMD. Plus, Nvidia has better ray-tracing performance, and for me, I'd rather pay $50 more for better RT performance.

I hope these two companies keep getting aggressive with pricing, so we get pre-mining pricing.

Did you read the comments?

I didn't go through the whole thread, but a guy or two whining doesn't mean "Nvidia fans are raging".
 

Kenpachii

Member
Let me ask you one thing: does using DLSS reduce the amount of VRAM that the game normally needs to work? Maybe that's why Nvidia was less anal about the VRAM quantity?

It could reduce the VRAM amount; however, consoles also use dynamic resolution scaling, which is kind of the same-ish VRAM-saving solution, even if the output quality is obviously worse. Those consoles will still budget 10GB of VRAM even with that.
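As a rough illustration of why upscaling saves some VRAM but nowhere near all of it: only the screen-sized render targets shrink with the internal resolution, while the texture pool stays the same size. The buffer count and format below are assumptions made purely for the arithmetic:

```cpp
#include <cstdio>

// Rough arithmetic: render-target memory at native 4K vs a 1440p internal
// resolution (what DLSS Quality-style upscaling would render at). Assumes
// ~8 screen-sized buffers (G-buffer, depth, HDR target, etc.) at 8 bytes/pixel.
int main() {
    const double MiB = 1024.0 * 1024.0;
    double buffers = 8.0, bytes_per_px = 8.0;
    double native_4k  = 3840.0 * 2160.0 * bytes_per_px * buffers / MiB; // ~506 MiB
    double internal_p = 2560.0 * 1440.0 * bytes_per_px * buffers / MiB; // ~225 MiB
    printf("Render targets at native 4K:      ~%.0f MiB\n", native_4k);
    printf("Render targets at 1440p internal: ~%.0f MiB\n", internal_p);
    printf("Saved: ~%.0f MiB; the multi-GB texture pool is untouched\n",
           native_4k - internal_p);
    return 0;
}
```

So under these assumptions you claw back a few hundred MiB, not multiple GB, which is why upscaling doesn't really change the texture budget question.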

At the end of the day, we've got to see where things are going and how stuff moves forward. It could be we are stuck for half of the generation with hybrid games that also run on potatoes. But it could also just flip overnight, and there we are.

All in all, I am happy AMD saw the light and is pushing 16GB very hard, but I am still mad at Nvidia for pushing their user base to 10GB of VRAM even while they know next gen is around the corner. But we've got competition back again, so that's good. I am curious how Nvidia will react.
 

Ascend

Member
AMD RX 6000 Series: Solomon Grundy, who relies on his incredible strength (i.e. CUs) to solve all of his problems and therefore fails to solve the ones that require cognition.

Nvidia RTX 30 Series: Bruce Wayne, who uses a combination of brains (i.e. tensor cores) and brawn (i.e. CUDA cores) to solve his problems, therefore being able to solve them all.
The Infinity Cache flies in the face of this.
 

Ascend

Member
I think this thread I made deserves a revisit... I wasn't far off...


All the ignorant naysayers in there though... Obviously they are now naysayers in here...
 
AMD RX 6000 Series: Solomon Grundy, who relies on his incredible strength (i.e. CUs) to solve all of his problems and therefore fails to solve the ones that require cognition.

Nvidia RTX 30 Series: Bruce Wayne, who uses a combination of brains (i.e. tensor cores) and brawn (i.e. CUDA cores) to solve his problems, therefore being able to solve them all.
This is extremely reductive.
 
Absolutely amazing showing from AMD, exceeded most expectations.

That Nvidia power king thread has sure aged badly, huh

Unknown Soldier, post if you're ok!
Imagine thinking traditional rasterization matters in 2020.

Also imagine thinking I care enough about traditional rasterization to change from my planned 3090, assuming I ever make it to the front of the EVGA queue, when I would rather have ray tracing and DLSS. I don't buy video cards for yesterday's games, I buy them for tomorrow's.
 

martino

Member
Like I decided a little before the reveal, I will wait for the first DX12U games (and for AMD to deliver their open-source solution) to see where games will go this gen.
Since HDMI 2.1 is struggling too, I will not update my screen as planned either....
But AMD has my priority and attention.
 

llien

Member
AMD won't even be supported at launch with RT
The implications of this are terrible for RT as a whole.
The whole point of Microsoft having it as an API was that it just runs, no matter what card, as long as the card supports it.

It will get absolutely nowhere if one needs to code for each manufacturer.
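That is the point of DXR living in D3D12: an engine queries the device for a raytracing tier and uses the same code path on any vendor whose driver reports support. A minimal sketch of that capability check (standard D3D12 API, Windows SDK required; error handling omitted):

```cpp
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

// Vendor-agnostic DXR capability check: the same query works on any GPU
// whose driver implements the D3D12 raytracing tiers.
int main() {
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                &opts5, sizeof(opts5));

    if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
        printf("DXR supported (tier %s)\n",
               opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_1 ? "1.1" : "1.0");
    else
        printf("DXR not supported on this device\n");

    device->Release();
    return 0;
}
```

Vendor-specific extensions layered on top of this are what break that portability, not DXR itself.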
 

silentsod

Neo Member
I only care about raster performance from these cards; if the 6900 XT is doing as-fast-or-faster raster for less money and less operating expense when the benchmarks are finally out, they get my money.

Great job from AMD after having their ass handed to them for so long.
 

HTK

Banned
Since next-gen consoles are on RDNA 2, although customized a bit, does anyone know if this will indirectly benefit these new 6000 cards when it comes to how games are developed/optimized and will ultimately perform?
 

llien

Member
Since next-gen consoles are on RDNA 2, although customized a bit, does anyone know if this will indirectly benefit these new 6000 cards when it comes to how games are developed/optimized and will ultimately perform?

UE4 was written for NV cards foremost.
UE5, however...

 
The implications of this are terrible for RT as a whole.
The whole point of Microsoft having it as an API was that it just runs, no matter what card, as long as the card supports it.

It will get absolutely nowhere if one needs to code for each manufacturer.

I imagine it will probably end up being the case that DXR works on Nvidia cards with no changes, but RTX code, which seems to be a custom fork/modification of DXR, doesn't work on AMD cards; at least that is the implication I'm potentially seeing.

Sounds about right for Nvidia practices. To make matters worse, most of the games with RTX are probably sponsored by Nvidia (free marketing, promotion, engineering time to code the RTX stuff, support, etc.), so there may be a clause in the agreement to stop them from implementing DXR, at least in some of them. (Maybe forced to delay it for 6 months to a year?)

It will be interesting to see how this pans out; the open solutions normally win in the end, but it can take an awfully long time if a custom solution has more momentum at the start. I believe Intel GPUs should also work with DXR, so it would probably be in developers' best interest to write DXR and have it work on all three manufacturers' cards, but only time will tell.
 
Genius move by AMD to make their answer to DLSS a feature included across PC, PS5, and XSX, as is the rumour. This is the type of move they should have been making to light a fire under Nvidia, by leveraging the huge factor of having that console architecture advantage going back a gen. What they put into the next-gen consoles can have a great bearing on the direction of gaming tech, and they are trying to steer it to favourable ground.
 

Rikkori

Member
What I like about the 6800 XT is that I can probably keep it for 5+ years easily. With the 3080 I thought I'd have to upgrade next cycle, because 10 GB was pretty weak. Right now I still err on the side of the 3080 because I want Ansel + CUDA (AI/ML), and my main focus for upgrading was CP2077, but it will come down to availability in the end anyway. Plus, I like re-playing older games at 5K+, so the fat FP32 of Ampere helps.

I think people grossly overvalue DLSS + RT and don't think about what's actually going on at all. Also, as you can see with DLSS in WD:L and the ultra-performance mode in any game, it's really not that great, so the best results are in certain games where the AA was already poor, leading me back to my earlier conclusion that what's really missing right now is not DLSS as "AI upscaling" nonsense, but rather better AA solutions. Worst case: just drop the resolution scale 10-20% and you get most of the way there in any game.

As for RT, it's also hit or miss, and honestly it requires many more settings options than are currently given to be really usable. I'm talking ray count, steps, resolution, distances, etc. Again, if not for Cyberpunk I'd really not give a single shit about it.

All in all, I'm pretty happy with everything that's going on. I do think people buying 3070s are fools, though; the 3060 Ti is almost the same and will be considerably cheaper, while on the other hand the 6800 is a 10x better buy in any and every circumstance involving gaming. I've gotta thank CDPR for the delay; I would've probably caved and bought one if nothing else had been available before the game launch, but now I can safely wait for stock. :)
 
Since next-gen consoles are on RDNA 2, although customized a bit, does anyone know if this will indirectly benefit these new 6000 cards when it comes to how games are developed/optimized and will ultimately perform?

That seems to have been the idea behind the design, to grant synergy both ways. At least with the Series X it should essentially be the same API calls, so they can port the work already done there to the PC version pretty quickly, I would imagine.

Also, at a lower level, in terms of optimizing engines for specific hardware, most game engines and console ports should be coded in such a way as to take advantage of the RDNA2 arch, so presumably this would also result in better performance/optimization on PC, at least in an ideal scenario.
 

llien

Member
AMD RX 6000 Series: Solomon Grundy, who relies on his incredible strength (i.e. CUs) to solve all of his problems and therefore fails to solve the ones that require cognition.

Nvidia RTX 30 Series: Bruce Wayne, who uses a combination of brains (i.e. tensor cores) and brawn (i.e. CUDA cores) to solve his problems, therefore being able to solve them all.

Tensor cores are just one more op for brute-force number crunching (akin to AVX ops in the CPU world).
What AMD is doing to get away with a 256-bit memory bus on slower RAM is much more advanced.
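A back-of-the-envelope for why that works: if a fraction h of memory traffic hits the on-die cache, the 256-bit bus only has to serve the remaining (1 - h), so effective bandwidth scales roughly as 1/(1 - h). The hit rate below is AMD's claimed 4K figure, used here as an assumption, and this ignores the cache's own bandwidth limits:

```cpp
#include <cstdio>

// Rough effective-bandwidth estimate for a large last-level cache in front
// of a narrow DRAM bus. Hit rate is AMD's marketing claim, not a measurement.
int main() {
    double dram_bw  = 256.0 / 8.0 * 16.0; // 256-bit GDDR6 @ 16 Gbps = 512 GB/s
    double hit_rate = 0.58;               // assumed Infinity Cache hit rate at 4K
    double effective = dram_bw / (1.0 - hit_rate);
    printf("Raw DRAM bandwidth:              %.0f GB/s\n", dram_bw);
    printf("Effective with %.0f%% cache hits: ~%.0f GB/s\n",
           hit_rate * 100.0, effective);
    return 0;
}
```

Under those assumptions the 256-bit bus behaves more like a ~1.2 TB/s one, which is the trick that lets it keep pace with wider, faster GDDR6X setups.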
 

notseqi

Member
And a GA102-based 3070 Ti will cost...? And be available...?
The 3070 is now irrelevant. They did their duty on that. They can't elevate that one, as it would 'shame' Nvidia and pull from their Ti releases. It sticks around as a cheapo 2080 Ti replacement to fill that niche. The 2080 Ti is €390 here already.
 

llien

Member
The 3070 is now irrelevant. They did their duty on that. They can't elevate that one, as it would 'shame' Nvidia and pull from their Ti releases. It sticks around as a cheapo 2080 Ti replacement to fill that niche. The 2080 Ti is €390 here already.

But there is only a ~10% difference between the 3080 and 3090, and the latter is the full GA102.
How could a 3080 Ti fit into this? What chip would it be based on?
 