
AMD Radeon RX6800/RX6800XT Reviews/Benchmarks Thread |OT|

llien

Member
there do seem to be more Nvidia cards closer to their MSRP, at least from a quick glance, although many of them are lower-tier models.
I've never seen cards from either manufacturer close to the claimed MSRP, apart from the amd.com site.

The 3080 is simply MIA.

3090s go for 1700+ Euro; the cheapest 3070 is 729 Euro.
The 6800 is 850, the 6800XT goes for 950.
And, curiously, here is how sales at Mindfactory looked in week 1 of 2021:

Radeon Top 5 Selling Brand Lines!

  1. RX 6800 = 200 Units.
  2. RX 580 = 200 Units.
  3. RX 5700XT = 100 Units.
  4. RX 5600XT = 80 Units.
  5. RX 570 = 40 Units.


Nvidia Top 5 Selling Brand Lines!

  1. RTX 3070 8GB = 135 Units.
  2. GTX 1650 Super = 80 Units.
  3. GT 1030 = 80 Units.
  4. RTX 2060 6GB = 70 Units.
  5. GT 710 = 70 Units.
 

thelastword

Banned
First it was better than native resolution.

Now we finally accept that it introduces blurriness.

Hopefully next you guys will be able to accept how it loses quality on a lot of distant objects, but that might be pushing it I guess.
Are you telling me that narrative that DLSS is better than Native no longer applies? Are people finally admitting to the concessions in a DLSS image?
 

Kenpachii

Member
Steve says at the end of the video there is an argument for the 3070 if you value RT and DLSS. Thing is, you're sacrificing a chunk of performance and half the VRAM if you go from a 6800 to a 3070, and that 8GB is starting to show its age.

As usual Steve is fucking wrong with his misleading videos.

An 8GB 3070 for a 500 card that sells everywhere for 200+ over MSRP, up to 700, is laughably bad. 8GB is budget-card territory at this point in time, and the 3060 Ti makes far, far more sense than the 3070 ever will.
And if Nvidia drops a 3050 Ti at 300 bucks or even lower, still with 8GB of memory, that card will make the 3060 Ti useless.
 
Last edited:

Sun Blaze

Banned
As usual Steve is fucking wrong with his misleading videos.

An 8GB 3070 for a 500 card that sells everywhere for 200+ over MSRP, up to 700, is laughably bad. 8GB is budget-card territory at this point in time, and the 3060 Ti makes far, far more sense than the 3070 ever will.
And if Nvidia drops a 3050 Ti at 300 bucks or even lower, still with 8GB of memory, that card will make the 3060 Ti useless.
That post makes no sense.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Nvidia Top 5 Selling Brand Lines!

  1. RTX 3070 8GB = 135 Units.
  2. GTX 1650 Super = 80 Units.
  3. GT 1030 = 80 Units.
  4. RTX 2060 6GB = 70 Units.
  5. GT 710 = 70 Units.

Totally off topic.
Just came to say the GT 710 will probably never die.
Seeing the GT 1030 that high also makes me assume your offices are all upgrading to 4K displays.
 

GreatnessRD

Member

So they're really gonna keep this mess going into 2021, eh? This sucks for a ton of people who want graphics cards lower down the price stack. Looks like Nvidia is about to have a ball if the rumors are true that they're releasing the 80 Ti and regular 60 cards in February.
 

RoyBatty

Banned
VRAM use in Godfall with RT (DXR 1.1) is extremely high. It uses 15GB at 1440p (Epic settings + LPM).

RT OFF (8-8.2 GB)


RT ON (15GB)



I did some tests with SAM (3900X + 32GB 3800 MHz).

Game, Highest Settings - AVG FPS (SAM OFF -----> SAM ON)

WoW Shadowlands (not 100% use / RT On)

71 -----> 83

Cyberpunk 2077 (City)
69/70 -----> 74/78

Metro Exodus (RT On)
67 -----> 69

Star Citizen (Free flight)
90 -----> 102

Horizon Zero Dawn
118 -----> 120

Shadow of the Tomb Raider (RT Off)
145 -----> 150

AC Valhalla
97 -----> 111

Godfall (RT On)
101 -----> 106
(RT Off SAM On = 113)


Some Valhalla comparisons with friends:

6800 XT (SAM)



3070



2060

 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
VRAM use in Godfall with RT (DXR 1.1) is extremely high. It uses 15GB at 1440p (Epic settings + LPM).

RT OFF (8-8.2 GB)


RT ON (15GB)


Are you using Afterburner per-process monitoring for your VRAM usage?
Because that's utter trash; there is no way this game is using 15GB at 1440p.

Fuck that shit.
 

Papacheeks

Banned


Are you using Afterburner per-process monitoring for your VRAM usage?
Because that's utter trash; there is no way this game is using 15GB at 1440p.

Fuck that shit.

That's the built-in Radeon tool, so who knows how accurate it is. For all we know, that metric could just be showing cached memory. I never really look at usage, as I play all my games at 1440p with my Radeon VII, which has 16GB of higher-bandwidth HBM2.


Also, I play at high settings, not ultra, for most of my games, with a mix of some medium settings.
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
That's the built-in Radeon tool, so who knows how accurate it is. For all we know, that metric could just be showing cached memory. I never really look at usage, as I play all my games at 1440p with my Radeon VII, which has 16GB of higher-bandwidth HBM2.


Also, I play at high settings, not ultra, for most of my games, with a mix of some medium settings.
I guarantee it's reporting wrong.
15GB of VRAM being used?

At 1440p?

Nope nope nope.
Better install the latest version of Afterburner and give that game a once-over.
If it uses over 8GB I'll be shocked... shocked!

On Topic
But where dem 6700XT leaked benchmarks?
 

Sun Blaze

Banned
VRAM use in Godfall with RT (DXR 1.1) is extremely high. It uses 15GB at 1440p (Epic settings + LPM).

RT OFF (8-8.2 GB)


RT ON (15GB)




I did some tests with SAM (3900X + 32GB 3800 MHz).

Game, Highest Settings - AVG FPS (SAM OFF -----> SAM ON)

WoW Shadowlands (not 100% use / RT On)

71 -----> 83

Cyberpunk 2077 (City)
69/70 -----> 74/78

Metro Exodus (RT On)
67 -----> 69

Star Citizen (Free flight)
90 -----> 102

Horizon Zero Dawn
118 -----> 120

Shadow of the Tomb Raider (RT Off)
145 -----> 150

AC Valhalla
97 -----> 111

Godfall (RT On)
101 -----> 106
(RT Off SAM On = 113)


Some Valhalla comparisons with friends:

6800 XT (SAM)



3070



2060
So how are cards with 8GB of VRAM not choking with this game at 4K lol?

Hint: 15GB of VRAM is bollocks.
 

RoyBatty

Banned


Are you using Afterburner per-process monitoring for your VRAM usage?
Because that's utter trash; there is no way this game is using 15GB at 1440p.

Fuck that shit.

Those were from the control panel.

I took screenshots with my fpsmonitor app (numbers from there).


RT OFF





RT ON



FPS APP




There are some videos on YouTube with a 3090 playing at 4K without RT and using 10-12GB of VRAM, so it's easy to believe.
 

Sun Blaze

Banned
Those were from the control panel.

I took screenshots with my fpsmonitor app (numbers from there).


RT OFF




RT ON



FPS APP




There are some videos on YouTube with a 3090 playing at 4K without RT and using 10-12GB of VRAM, so it's easy to believe.
Yet here it is at 4K on a 3080, and using 6GB of VRAM at Epic settings.



It's not using that either. It's allocated VRAM. The more you have, the more your card will allocate. That's why you'll see the 3090 allocate 12 while the 3080 allocates half of that. If this game was really eating 12-15GB of VRAM, the 3080 would flatline and stutter like hell.
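
Side note for anyone who wants to see the budget/usage distinction on their own machine instead of arguing about overlay numbers: on Windows, a process can ask DXGI for the video memory budget the OS has assigned to it and for its own current usage, and engines typically size their caches to that budget, which is why a 16GB or 24GB card "uses" more for the same scene. A minimal sketch of that query (just my own illustration, not anything Godfall actually does; assumes Windows 10 with the DXGI 1.4 runtime, error handling trimmed):

```cpp
// Query the OS-assigned video memory budget and this process's current usage
// via IDXGIAdapter3 (Windows 10+, DXGI 1.4). The "Budget" scales with total
// VRAM, which is why bigger cards show bigger numbers in overlays even when
// the game's real working set is much smaller.
#include <windows.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;  // skip the WARP software adapter

        ComPtr<IDXGIAdapter3> adapter3;
        if (FAILED(adapter.As(&adapter3))) continue;

        DXGI_QUERY_VIDEO_MEMORY_INFO info{};
        if (SUCCEEDED(adapter3->QueryVideoMemoryInfo(
                0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) {
            std::printf("%ls\n", desc.Description);
            std::printf("  OS budget for this process : %.2f GB\n", info.Budget / 1e9);
            std::printf("  This process currently uses: %.2f GB\n", info.CurrentUsage / 1e9);
            std::printf("  Total dedicated VRAM       : %.2f GB\n", desc.DedicatedVideoMemory / 1e9);
        }
    }
    return 0;
}
```

The Budget figure is the "allocation ceiling" that overlays tend to get confused with; CurrentUsage is what the calling process actually has resident.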
 

RoyBatty

Banned
Yet here it is at 4K on a 3080, and using 6GB of VRAM at Epic settings.



It's not using that either. It's allocated VRAM. The more you have, the more your card will allocate. That's why you'll see the 3090 allocate 12 while the 3080 allocates half of that. If this game was really eating 12-15GB of VRAM, the 3080 would flatline and stutter like hell.

That zone is really sh*tty and closed (first mission), doesn't have RT either, that's why...

BTW Black_Stride was right about MSI AB and the per-process monitoring. But it is using more than 10GB of VRAM at 1440p with RT.



 

Sun Blaze

Banned
That zone is really sh*tty and closed (first mission), doesn't have RT either, that's why...

BTW Black_Stride was right about MSI AB and the per-process monitoring. But it is using more than 10GB of VRAM at 1440p with RT.



Once again, allocated VRAM. At 4K it's using 6GB. Dropping it to 1440p and adding RT won't fucking suddenly make the usage go up by 2/3. That's a basic understanding of VRAM. Allocated isn't the same as used, and it's actually quite difficult to pinpoint how much VRAM is being used, especially with tools like Afterburner or Windows, which are completely off.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
That zone is really sh*tty and closed (first mission), doesn't have RT either, that's why...

BTW Black_Stride was right about MSI AB and the per-process monitoring. But it is using more than 10GB of VRAM at 1440p with RT.




~10GB sounds a lot more plausible; I'm gonna guess some Unreal Engine RT memory leak or something is getting it that high at 1440p.
Unreal Engine shouldn't be eating that much VRAM even with RT at 1440p... whatever implementation of RT they used must be pretty trash to increase VRAM by that much.

Considering the game uses around 8GB at 4K... dropping the resolution in half and adding RT shouldn't increase the VRAM usage by 25%.

I'm going to guess the next patch will have actual VRAM usage closer to 8GB at 1440p with RT on.
 
Last edited:

Deleted member 17706

Unconfirmed Member
Yet here it is at 4K on a 3080, and using 6GB of VRAM at Epic settings.



It's not using that either. It's allocated VRAM. The more you have, the more your card will allocate. That's why you'll see the 3090 allocate 12 while the 3080 allocates half of that. If this game was really eating 12-15GB of VRAM, the 3080 would flatline and stutter like hell.


Haven't Radeon cards long used (or allocated) quite a bit more VRAM than their Nvidia counterparts?
 

The Skull

Member
As expected, a few good jumps in performance and some not so much. I wonder if devs will be able to specifically optimize for this in future games? Interesting to see how this performs on Intel and Nvidia.
 

RoyBatty

Banned
Once again, allocated VRAM. At 4K it's using 6GB. Dropping it to 1440p and adding RT won't fucking suddenly make the usage go up by 2/3. That's a basic understanding of VRAM. Allocated isn't the same as used, and it's actually quite difficult to pinpoint how much VRAM is being used, especially with tools like Afterburner or Windows, which are completely off.

Nope. VRAM allocated is 15GB. 10GB is the usage with RT ON. I had only Godfall open.

BTW I don't get what your point is in this thread; your successive posts clearly read like an Nvidia warrior's. No interest in RDNA2 at all.

~10GB sounds a lot more plausible; I'm gonna guess some Unreal Engine RT memory leak or something is getting it that high at 1440p.
Unreal Engine shouldn't be eating that much VRAM even with RT at 1440p... whatever implementation of RT they used must be pretty trash to increase VRAM by that much.

Considering the game uses around 8GB at 4K... dropping the resolution in half and adding RT shouldn't increase the VRAM usage by 25%.

I'm going to guess the next patch will have actual VRAM usage closer to 8GB at 1440p with RT on.

Maybe. Or maybe we will see more of it in upcoming next-generation games.




Didn't Godfall's devs say it needs 12GB at 4K?

I'll give it a try without RT at home.


It's really nice to have a performance upgrade at zero cost. Hope it comes to NVIDIA and Intel too.

My test.
 

Mister Wolf

Member
Nope. VRAM allocated is 15GB. 10GB is the usage with RT ON. I had only Godfall open.

BTW I don't get what your point is in this thread; your successive posts clearly read like an Nvidia warrior's. No interest in RDNA2 at all.



Maybe. Or maybe we will see more of it in upcoming next-generation games.




Didn't Godfall's devs say it needs 12GB at 4K?

I'll give it a try without RT at home.



It's really nice to have a performance upgrade at zero cost. Hope it comes to NVIDIA and Intel too.

My test.

[4K Godfall benchmark screenshot]
 

martino

Member
Yet here it is at 4K on a 3080, and using 6GB of VRAM at Epic settings.



It's not using that either. It's allocated VRAM. The more you have, the more your card will allocate. That's why you'll see the 3090 allocate 12 while the 3080 allocates half of that. If this game was really eating 12-15GB of VRAM, the 3080 would flatline and stutter like hell.

Or we can assume the metrics are right and draw conclusions.
AMD's tech stack needs twice as much memory as Nvidia's to display the same thing.
Are AMD boyz happy with what this means, or will they try to use their brains and real GPU RAM usage numbers when comparing data in the end?
 
Last edited:

Ascend

Member
As expected, a few good jumps in performance and some not so much. I wonder if devs will be able to specifically optimize for this in future games? Interesting to see how this performs on Intel and Nvidia.
I think devs can definitely optimize for this. If you can only address 256MB at a time like right now, that means you need to constantly monitor and refresh multiple VRAM chunks by copying stuff from system RAM. This costs time and CPU cycles and actually also increases VRAM allocation, because it might take a few cycles for you to refresh the 256MB portion that you need.
If you can create multiple 4GB chunks for example, it saves quite a bit on CPU and RAM resources. That also likely means that the speed of your system RAM will matter less for games, which is a good thing.

I don't think they'll use 4GB anytime soon though. Likely 2GB, since with 4GB you'd get into trouble with 10GB cards. Everything is divisible by 2GB. Or should I say 2 GiB? Optimally, it should automatically detect your GPU's VRAM size and create one whole chunk of it for your CPU to address. But I'm not sure if that's practical for game development. It would need to be baked into the engine.
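
For what it's worth, an engine doesn't have to guess whether the big window is there. On Vulkan, resizable BAR / SAM shows up as a memory type that is both DEVICE_LOCAL and HOST_VISIBLE whose backing heap is much larger than the classic 256 MiB aperture, so a quick check is possible. A rough sketch of that detection (just my illustration of the idea, assuming a Vulkan 1.1 loader is installed; not how any shipping engine actually does it):

```cpp
// Detect whether a GPU exposes a large CPU-visible VRAM heap (the effect of
// Resizable BAR / SAM): look for a memory type that is both DEVICE_LOCAL and
// HOST_VISIBLE whose backing heap is larger than the classic 256 MiB window.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;
    VkInstanceCreateInfo ici{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ici.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, nullptr);
    std::vector<VkPhysicalDevice> gpus(count);
    vkEnumeratePhysicalDevices(instance, &count, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);
        VkPhysicalDeviceMemoryProperties mem;
        vkGetPhysicalDeviceMemoryProperties(gpu, &mem);

        // Find the largest heap reachable through a DEVICE_LOCAL + HOST_VISIBLE type.
        VkDeviceSize largestCpuVisibleVram = 0;
        for (uint32_t i = 0; i < mem.memoryTypeCount; ++i) {
            const VkMemoryPropertyFlags f = mem.memoryTypes[i].propertyFlags;
            const bool deviceLocal = (f & VK_MEMORY_PROPERTY_DEVICE_LOCAL_BIT) != 0;
            const bool hostVisible = (f & VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT) != 0;
            if (deviceLocal && hostVisible) {
                VkDeviceSize heapSize = mem.memoryHeaps[mem.memoryTypes[i].heapIndex].size;
                if (heapSize > largestCpuVisibleVram) largestCpuVisibleVram = heapSize;
            }
        }

        const double gib = largestCpuVisibleVram / (1024.0 * 1024.0 * 1024.0);
        std::printf("%s: CPU-visible VRAM heap = %.2f GiB -> %s\n",
                    props.deviceName, gib,
                    largestCpuVisibleVram > (256ull << 20)
                        ? "resizable BAR appears to be active"
                        : "classic 256 MiB window");
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```

With the BIOS toggle off you should only see the small window; with SAM on, the CPU-visible heap reports the full VRAM size, which is what an engine could key larger chunk sizes off.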
 
Last edited:

Sun Blaze

Banned
I'm curious about the extreme performance gains in AC: Valhalla and Borderlands 3. A performance uplift in the 10-20% range is nothing to sneeze at, and I really wonder why that is.
A bit saddened that in some games the performance difference is negative though. Makes it a bit of a pain in the ass as you have to reboot the whole system to disable it.
Would be nice if you could just toggle it on and off.
 

Ascend

Member
I'm curious about the extreme performance gains in AC: Valhalla and Borderlands 3. A performance uplift in the 10-20% range is nothing to sneeze at, and I really wonder why that is.
A bit saddened that in some games the performance difference is negative though. Makes it a bit of a pain in the ass as you have to reboot the whole system to disable it.
Would be nice if you could just toggle it on and off.
That would mean giving the OS low level access to BIOS settings... Doesn't seem like a good idea. The most obvious equivalent I can think of right now is being able to turn SMT/HT on and off in Windows for your CPU. That's asking for trouble.
 
Once resizable BAR support becomes more widespread across multiple GPU/CPU/mobo vendor combinations, maybe we will see modern engines updated to support it directly, which might mean even more performance gains and, at the very least, no performance loss. Would be nice if we start to see it incorporated into late 2021/2022 games.
 
Last edited:

Sun Blaze

Banned
That would mean giving the OS low level access to BIOS settings... Doesn't seem like a good idea. The most obvious equivalent I can think of right now is being able to turn SMT/HT on and off in Windows for your CPU. That's asking for trouble.
Oh yeah, that would definitely be a huge potential security hole.
SAM as far as I'm concerned is a big deal and should be getting more attention. Up to 20% performance uplift is pretty incredible, especially in games that are very demanding like AC: Valhalla or Borderlands 3.
 
SAM as far as I'm concerned is a big deal and should be getting more attention. Up to 20% performance uplift is pretty incredible, especially in games that are very demanding like AC: Valhalla or Borderlands 3.

The reason it is not getting more attention is that Nvidia doesn't have it (yet) and, more importantly, Nvidia were not first to market with it. If this were an Nvidia-first feature it would be hailed as the second coming and shouted from the rooftops by a lot of the very people in this thread essentially ignoring it, downplaying it or saying it is not a big deal. Probably by some of the tech press too, as we saw with the pressure Nvidia puts on review outlets to toe the line and push whatever features/narrative they want or face the consequences. Not to mention their promotional content with some tech reviewers like Digital Foundry etc... "better than native" and so forth.

This may sound harsh but it is just the cold hard facts. It is especially funny when people try to spin Radeon GPUs as being behind Nvidia in all features and having no advantages at all or no reasons at all to buy them, these people always conveniently leave out or somehow forget to mention SAM. But it is what it is, fanboys gonna fanboy and brigade threads. Just a pity that a group of mostly 30+ year old guys feel so insecure about their luxury GPU purchases and potential competition in the market that they resort to nonsense like that, but what can you do? 🤷‍♂️
 
Last edited:
The reason it is not getting more attention is that Nvidia doesn't have it (yet) and, more importantly, Nvidia were not first to market with it. If this were an Nvidia-first feature it would be hailed as the second coming and shouted from the rooftops by a lot of the very people in this thread essentially ignoring it, downplaying it or saying it is not a big deal. Probably by some of the tech press too, as we saw with the pressure Nvidia puts on review outlets to toe the line and push whatever features/narrative they want or face the consequences. Not to mention their promotional content with some tech reviewers like Digital Foundry etc... "better than native" and so forth.

This may sound harsh but it is just the cold hard facts. It is especially funny when people try to spin Radeon GPUs as being behind Nvidia in all features and having no advantages at all or no reasons at all to buy them, these people always conveniently leave out or somehow forget to mention SAM. But it is what it is, fanboys gonna fanboy and brigade threads. Just a pity that a group of mostly 30+ year old guys feel so insecure about their luxury GPU purchases and potential competition in the market that they resort to nonsense like that, but what can you do? 🤷‍♂️


Embarrassing as always, I see. The reason it's not getting attention wouldn't be because it's reserved for a particular set of CPUs, a particular set of GPUs and a particular set of mobos, right? It's because everyone is secretly against AMD. It's not because there's no tangible uplift in three quarters of the games tested, and even a drop in performance on occasion, right? It's because we're all pro-Nvidia. Making a pathetic post like this and ending it talking about fanboys and insecurity is just the cherry on top. Acting like you're personally persecuted because the company with the inferior product isn't fellated more. Good to see a bastion of objectivity like yourself.
 
Embarrassing as always, I see.

Uh-huh, you don't have much of a leg to stand on here I'm afraid, to put it mildly. Let's leave the personal insults and whatever out of this.

The reason it's not getting attention wouldn't be because it's reserved for a particular set of CPUs, a particular set of GPUs and a particular set of mobos, right?

In and of itself that is a fair point, and I don't totally disagree either, but then we see things like DLSS, which works in a far smaller range of games, and the hype surrounding it, and that narrative kind of falls apart a little under further inspection, no?

It's because everyone is secretly against AMD.

Where are you getting this sentiment? I've seen this type of thing brought up a few times in various threads, mostly from people with a strong Nvidia leaning to put it kindly; nobody has a secret headcanon or fanfic about the world being secretly against AMD. Repeating it indefinitely doesn't somehow make it true.

Currently Nvidia is the long-time market leader, with somewhere between 70-80% market share. Being market leader tends to afford you a certain dominance in mindshare with fans/gamers/the press. Nvidia also has absolutely amazing marketing and branding, seriously some of the best in the world; they are much better than AMD at marketing their products and features, and of course having market dominance certainly helps there.

In terms of the tech press, we have already seen with the Hardware Unboxed fiasco that Nvidia can be quite "forceful", so to speak, with the talking points, features and editorial narrative that they want people to push. How much more of that goes on behind the scenes and is never reported on or heard about? Who knows, but it is not a massive stretch to say it could be a factor in what tech channels and sites choose to push.

Granted I'm not saying that is 100% the case or anything, but Nvidia have given enough pause for thought with their shenanigans that there may always be a seed of doubt in people's minds. Although I don't really want to relitigate the whole HU fiasco, that thread was embarrassing enough as it is. But the point I'm making is that if SAM was an Nvidia feature like DLSS for example and Nvidia had come out with it and not AMD then we could potentially see a more positive press spin than we currently have seen from most outlets. Of course that is not guaranteed but probably somewhat likely all things considered.

Regarding the perception of SAM in the minds of Nvidia fans/fanboys? It would definitely be viewed more positively and added to the "long list of features Nvidia is ahead of AMD with! lol" etc... I mean this is so obvious I don't even think it needs to be said. That is the kind of thing fanboys do whether they are for Sony, MS, Nvidia, AMD, Intel or whoever so I don't really see how you could disagree with such an obvious point. For example if AMD came out with DLSS instead of Nvidia the AMD fanboys would likely be overwhelmingly positive on "AMD-DLSS" and the Nvidia fanboys would be downplaying it every chance they got and vice versa.

It's because we're all pro-Nvidia.

You certainly are, not really much question about that as it is self evident from your posts in this thread and others. There are also many others who seem to regularly post in this thread or really any Radeon related thread with mostly negative AMD related comments and mostly positive Nvidia comments. This is not rocket science and is again self evident just looking at this or any of those threads. We generally call this type of thing "brigading" or simply trolling.

Generally the motivators are insecurity over their purchase and fear of competition. This results in a need to defend their purchase in an overly zealous manner, kind of like a religious foot soldier or cult member, and as we all know the best defense is a good offense. Anyway, this is product fanboying on internet forums 101; nothing particularly new here for anyone reading this. My point was that a lot of the people who match the description above and have been vigorously posting in this thread have been either outright ignoring or downplaying SAM, simply because it is not an Nvidia feature.

Which is a logical takeaway. Especially given Nvidia's market dominance over the last decade or so, it is also logical to assume there are far more people who are fans of Nvidia than of AMD, and as such they have been used to buying Nvidia products as a no-brainer while AMD were not competitive. Now that they are, we see excessive downplaying of AMD and their features, and other acting out, as they don't know how to handle actually having competition. A pretty simple and logical conclusion, I think.

Making a pathetic post like this and ending it talking about fanboys and insecurity is just the cherry on top... Good to see a bastion of objectivity like yourself.

As explained, my original comment was pretty short, straightforward and based on logical conclusions and evidence. The fact that you were triggered enough to post the above drivel in response says everything.

I don't patrol Nvidia threads to try to dissuade people from buying 3000 series GPUs, I don't continually troll them with low effort bait and hot takes, I don't quote AMD marketing as if they were fact. I don't post overly negative comments about 3000 series GPUs in this thread either and I've given Nvidia their due with better Ray Tracing, much better Path Tracing, I think DLSS is great in general and the Nvidia GPUs have big CUDA based advantages in Blender or Adobe suites, these are great things and great reasons to buy them.

Don't take this the wrong way but being called a fanboy by someone who brigades all Radeon threads on behalf of the Nvidia mothership isn't exactly the stinging barb you believe it to be, you unfortunately don't have the credibility to pull that kind of thing off and be taken seriously, sorry.

Acting like you're personally persecuted

Again, where are you getting this from? Who is acting personally persecuted anywhere regarding GPUs? I'm certainly not. Again as above, simply repeating this over and over again doesn't make it true. They are just GPUs man, we are all technology enthusiasts but at the end of the day they are luxury consumer electronics, toys essentially. I don't think anyone is feeling personally persecuted based on toys. I'm certainly not anyway. I just dislike hypocrisy, double standards, misinformation and when people partake in these actions with a smugness multiplier. When I see these things I tend to correct where I can and when I can be bothered.
 
Last edited:
Anyway, moving on from that slight derail, I think once SAM is supported more widely by all vendors over the next 2-5 years, we are going to see some games/engines written to take advantage of it, hopefully increasing the benefits granted by SAM/Resizable BAR to the point where it is simply turned on by default all the time. But I think it will take a while to get there, and older titles that don't currently benefit from it, or are negatively affected, will likely stay that way.
 
Played around with OC'ing my card. Got it to a stable 2600MHz. Or rather 2650, which results in slightly lower actual clocks.

At 4K in Control, without AA, I can actually manage 60Hz this way. Running heavily into power limits there at 330 watts... it also gets a bit loud then...

And yes. SAM is cool. Free performance increase. (without any repercussions)

Edit:
Scratch that. 2650 isn't stable. 2600 was though ^^ (for now)
 
Last edited:
Played around with OC'ing my card. Got it to a stable 2600MHz. Or rather 2650, which results in slightly lower actual clocks.

At 4K in Control, without AA, I can actually manage 60Hz this way. Running heavily into power limits there at 330 watts... it also gets a bit loud then...

And yes. SAM is cool. Free performance increase. (without any repercussions)

Nice, RDNA2 cards really are overclocking monsters. Great to see cards that actually benefit from an OC again.

Which card and model do you have? AIB or reference?
 


Are you using Afterburner per-process monitoring for your VRAM usage?
Because that's utter trash; there is no way this game is using 15GB at 1440p.

Fuck that shit.
The PS5 literally only has 16 GB of RAM, of which only 14 GB or so is actually accessible to the dev. So there's no way Godfall on PC is somehow using more VRAM than the PS5 has available for the whole game. It's an obvious example of mistaking allocation for usage.
 

smbu2000

Member
Sold off the 5700XT that I was using and picked up an RX6800. I was looking at getting the 6800XT, but I've never seen any in stock, only standard 6800 cards and once a pair of 6900XT cards. Anyway, I got a good deal on my card (Sapphire RX6800 Nitro+), so I'm happy, especially since even my "lower end" RX6800 (non-XT) card has 16GB of VRAM. Looking forward to trying it out with the Ryzen 5900X that I picked up last month.
 

RoyBatty

Banned
The PS5 literally only has 16 GB of RAM, of which only 14 GB or so is actually accessible to the dev. So there's no way Godfall on PC is somehow using more VRAM than the PS5 has available for the whole game. It's an obvious example of mistaking allocation for usage.

It uses 10.5GB with RT and 8GB without RT at 1440p (SAM ON).
 
The PS5 literally only has 16 GB of RAM, of which only 14 GB or so is actually accessible to the dev. So there's no way Godfall on PC is somehow using more VRAM than the PS5 has available for the whole game. It's an obvious example of mistaking allocation for usage.

Correct me if I'm wrong, but don't PC versions of games normally offer higher texture quality and graphical settings than their console equivalents? Wouldn't this in turn take up more VRAM? I thought this was a fairly normal thing. And again I could be wrong here, but don't the console versions of Godfall lack Ray Tracing support for now? Which, as far as I'm aware, increases memory usage when enabled.

Granted, that doesn't mean it is definitely using 15GB or whatever large amount of VRAM. Although RoyBatty has just tested the game on his machine, and Afterburner, which is now supposed to show actual usage vs allocation, is showing 10.5GB of usage with RT turned on at 1440p. I would imagine this could increase a little again when playing at 4K?
 
Sold off the 5700XT that I was using and picked up an RX6800. I was looking at getting the 6800XT, but I've never seen any in stock, only standard 6800 cards and once a pair of 6900XT cards. Anyway, I got a good deal on my card (Sapphire RX6800 Nitro+), so I'm happy, especially since even my "lower end" RX6800 (non-XT) card has 16GB of VRAM. Looking forward to trying it out with the Ryzen 5900X that I picked up last month.
Congrats on getting a card at all to be honest, supply issues are really bad at the moment along with the crazy pricing.
 

Ascend

Member
The PS5 literally only has 16 GB of RAM, of which only 14 GB or so is actually accessible to the dev. So there's no way Godfall on PC is somehow using more VRAM than the PS5 has available for the whole game. It's an obvious example of mistaking allocation for usage.
This argument makes zero sense.

As Ryujin just pointed out above, PC generally offers higher quality assets that take up more system RAM and VRAM.

Secondly, because consoles use one pool of RAM, they can be more efficient with it. Until SAM becomes mainstream, whatever is in VRAM must also be kept in system RAM, meaning the total RAM usage of a PC will always be higher than a console's. If we estimate 14GB of use on the PS5, that means that in theory, with a PC running console settings, you'd have to have 14GB of VRAM to avoid using system RAM, all else being equal.

Lastly, the reason that games allocate more VRAM is to avoid having to fetch additional data from system RAM. Games don't allocate more VRAM for nothing. If your RAM is fast enough, this is not an issue, and the GPU can receive the data in time. If your system RAM is too slow (or the fetch request comes in too late), you'll get stutters. There is a minimum amount of VRAM required for games where the system RAM can still feed the GPU fast enough. For that, 8GB is generally enough for now, although there are a handful of exceptions already.

All this being said, I am not certain that Godfall requires more than 10GB to run smoothly with RT turned on, but it's definitely allocating it. For that, we would need to have RT enabled on cards that have 8GB or 10GB of VRAM and analyze the frametimes.
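
On the frametime point, the usual way to separate real VRAM pressure from harmless over-allocation is to log per-frame times and compare the average against the 1% / 0.1% lows, because running out of VRAM shows up as spikes in the slowest frames rather than in the average. A small sketch of that math (it assumes a plain text log with one frametime in milliseconds per line, which is my own assumption since capture tools export in different formats):

```cpp
// Reads per-frame times in milliseconds (one value per line) and reports
// average FPS plus 1% and 0.1% lows. Sustained VRAM over-commitment shows up
// as a large gap between the average and the lows (frametime spikes), whereas
// mere over-allocation does not.
#include <algorithm>
#include <cstdio>
#include <fstream>
#include <vector>

int main(int argc, char** argv) {
    if (argc < 2) { std::fprintf(stderr, "usage: %s frametimes_ms.txt\n", argv[0]); return 1; }

    std::ifstream in(argv[1]);
    std::vector<double> ms;
    for (double v; in >> v; ) ms.push_back(v);
    if (ms.empty()) { std::fprintf(stderr, "no samples\n"); return 1; }

    double totalMs = 0;
    for (double v : ms) totalMs += v;
    const double avgFps = 1000.0 * ms.size() / totalMs;

    // "1% low" here means the FPS equivalent of the frametime at the 99th
    // percentile of the sorted frametimes (i.e. the slowest 1% of frames);
    // the same percentile-based idea is used for the 0.1% low.
    std::vector<double> sorted = ms;
    std::sort(sorted.begin(), sorted.end());
    auto lowFps = [&](double pct) {
        size_t idx = static_cast<size_t>((1.0 - pct) * (sorted.size() - 1));
        return 1000.0 / sorted[idx];
    };

    std::printf("frames       : %zu\n", ms.size());
    std::printf("average FPS  : %.1f\n", avgFps);
    std::printf("1%% low FPS   : %.1f\n", lowFps(0.01));
    std::printf("0.1%% low FPS : %.1f\n", lowFps(0.001));
    return 0;
}
```

If an 8GB card at the same settings holds a similar average but its lows collapse relative to a 16GB card, that would be actual VRAM starvation rather than just a bigger allocation number in an overlay.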
 