
[NX Gamer] The Medium - The Complete Technical Analysis - PC | Series X | Series S

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Someone told me that Godfall is that game. How much VRAM does it take at the highest settings with all graphical options enabled?

[image: Godfall VRAM usage screenshot]


Not even.
 
I'm seeing people's reactions to this video on Twitter, and NX boy is being torn to pieces like an ignorant fraud.
He really seems not to know the technical workings of a lot of things, and judges by leaps of faith.

The more I see the Series S in practice, the worse I feel for those who bought into its promise (1440p next-generation graphics).

No, sub-720p doesn't cut it.
 

Stuart360

Member
Devs really need to start using proper dynamic resolution on PC; it's almost mandatory for games like these.
A lot of PC games come with dynamic resolution options now, although I find them too aggressive, so I tune things myself.
The problem with this game is that your GPU load doubles instantly when it goes split-screen. I started the game, was getting 1080p/60, and was thinking, here we go, another game with hyperbole about its performance... then the first split-screen segment started, lol.
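For what it's worth, the core of a dynamic resolution controller is small. Here's a minimal sketch, assuming a fixed 60 fps target and a clamped render-scale factor; every name and threshold is illustrative, not taken from any shipping engine:

```
# Hypothetical dynamic-resolution controller: nudge a render-scale
# factor so measured frame time converges on a target budget.
# All names and constants here are illustrative assumptions.

TARGET_MS = 1000.0 / 60.0    # 16.7 ms budget for 60 fps
MIN_SCALE, MAX_SCALE = 0.5, 1.0
STEP = 0.05                  # how aggressively to react

def update_render_scale(scale: float, frame_ms: float) -> float:
    """Return the new resolution scale given the last frame's time."""
    if frame_ms > TARGET_MS * 1.05:      # over budget: drop resolution
        scale -= STEP
    elif frame_ms < TARGET_MS * 0.90:    # clear headroom: raise it back
        scale += STEP
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# A split-screen section roughly doubles GPU cost, so frame time spikes
# and the controller walks the scale down until the frame fits again.
scale = 1.0
for frame_ms in (16.0, 16.2, 33.0, 30.0, 27.0, 23.5, 19.0, 17.0):
    scale = update_render_scale(scale, frame_ms)
    print(f"{frame_ms:5.1f} ms -> scale {scale:.2f}")
```

Real implementations typically react to GPU time specifically and smooth over several frames so the scale doesn't oscillate; the "too aggressive" feel usually comes down to how those thresholds are tuned.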
 

Md Ray

Member
By the time you fill up 8GB of VRAM, your GPU would probably have choked on performance anyway.
So giving the 3060 a lot of VRAM is pointless, because in gaming workloads it would have already died by the time it got to filling its VRAM.

There is no game right now, even with RT, that eats up over 8GB of VRAM that wouldn't already have been a slideshow at that resolution.
The 8GB limitation is very real. The RTX 2080 Ti is 82% ahead of the 3070 here, when the two are supposed to perform identically. Your next argument would be "Well, it's 4K, blah blah... it's a 1440p card".

But have you ever thought that next-gen games could very well push VRAM so hard that even at 1440p this could become problematic?

[image: 4K benchmark chart, RTX 2080 Ti vs RTX 3070]
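To put rough numbers on that worry (all assumptions mine, not measurements from any game): compressed texture memory is tied to asset resolution, not render resolution, so it doesn't shrink when you drop to 1440p.

```
# Back-of-the-envelope texture budget. Assumed numbers throughout:
# BC7-compressed colour data is ~1 byte per texel, and a full mip
# chain adds roughly a third on top.

BYTES_PER_TEXEL = 1.0
MIP_OVERHEAD = 4.0 / 3.0

def texture_mb(size: int) -> float:
    return size * size * BYTES_PER_TEXEL * MIP_OVERHEAD / 2**20

one_tex = texture_mb(4096)
print(f"one 4096x4096 texture: {one_tex:.1f} MB")           # ~21 MB

# A scene streaming ~300 unique 4K textures, before render targets,
# geometry, and any RT acceleration structures:
print(f"300 such textures: {300 * one_tex / 1024:.1f} GB")  # ~6.3 GB
```

None of that varies with output resolution, which is why "it's a 1440p card" doesn't automatically settle the VRAM question.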
 
The 8GB limitation is very real. The RTX 2080 Ti is 82% ahead of the 3070 here, when the two are supposed to perform identically...


[image: benchmark chart]




This piece of shit AMD-sponsored and specially tweaked game is literally useless for benchmarking anything. It's not representative of anything.

I see the FUD about those 10 gigs on the 3080 continues, even almost half a year after launch, with the most demanding games running flawlessly on that amount of VRAM. This year doesn't seem to have a graphical powerhouse that would make that card sweat, so we will be going into 2022 still awaiting the moment when 10 gigs becomes a bottleneck. This shows what a nonsense issue it is.

People seem to gloss over the fact that the Series X also has 10 gigs of the faster RAM used for the GPU, not 16. 10 gigs would only ever be an issue at 4K using the absolute highest settings; anything below 4K will be a non-issue, and using DLSS at 4K will also make it a non-issue.

I've never seen nonsense like this: people pretending to be concerned over a non-issue that continues to fail to materialize.
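The DLSS point does have a real mechanism behind it: resolution-dependent buffers scale with the internal render resolution. A rough illustration, assuming a five-target RGBA16F G-buffer (my assumption, not The Medium's actual layout):

```
# Resolution-dependent render targets shrink with DLSS's internal
# resolution. Assumed layout: five RGBA16F targets (8 bytes/pixel).

def rt_mb(width: int, height: int, targets: int, bytes_per_px: int) -> float:
    return width * height * targets * bytes_per_px / 2**20

native_4k = rt_mb(3840, 2160, targets=5, bytes_per_px=8)
dlss_perf = rt_mb(1920, 1080, targets=5, bytes_per_px=8)  # 4K DLSS Performance

print(f"native 4K G-buffer:  {native_4k:.0f} MB")   # ~316 MB
print(f"DLSS internal 1080p: {dlss_perf:.0f} MB")   # ~79 MB
```

The counterpoint is that texture pools are mostly resolution-independent, so upscaling only relieves part of the VRAM pressure.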
 

Md Ray

Member
This piece of shit AMD-sponsored and specially tweaked game is literally useless for benchmarking anything. It's not representative of anything....

I was talking about 8GB...
 
Yeah, shouldn't the 3070 be ahead of the 2080 Ti at 4K RT as well?


That's why this game is useless. It doesn't have typical behaviour and is one of the very few AMD-sponsored games that's heavily and specifically tweaked for Radeons. This game is not representative of anything.
 

Md Ray

Member
That's why this game is useless. It doesn't have typical behaviour and is one of the very few AMD-sponsored games that's heavily and specifically tweaked for Radeons. This game is not representative of anything.
I've seen the same thing happen with Cyberpunk 2077 as well: the 3070's 8GB of VRAM gets choked. "But... but 4K..." doesn't work because, like I said: what if next-gen games push VRAM so hard that even at 1440p this becomes problematic?
 

Md Ray

Member
The 3070 is surely BW-limited with only 448 GB/s.
The 2080 Ti has 37.5% more bandwidth than the 3070. How do you explain the 82% uplift for the 2080 Ti, then?

It's because textures, shadow maps, and the like are falling back to system memory due to the VRAM limitation, hence the perf penalty.
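The arithmetic behind those two figures (the bandwidths are the cards' published specs; the PCIe number is the theoretical x16 peak):

```
# Published memory bandwidths: RTX 2080 Ti ~616 GB/s, RTX 3070 448 GB/s.
print(f"bandwidth advantage: {616 / 448 - 1:.1%}")   # 37.5%

# If the working set spills past 8 GB, the overflow is fetched across
# PCIe instead of from local VRAM:
PCIE_4_X16_GBS = 32   # theoretical peak; real transfer rates are lower
print(f"local VRAM vs PCIe 4.0 x16: {448 / PCIE_4_X16_GBS:.0f}x faster")  # 14x
```

A 37.5% bandwidth edge can't produce an 82% gap on its own, but an order-of-magnitude-slower fallback path for part of the working set can.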
 

Neo_game

Member
The 2080 Ti has 37.5% more bandwidth than the 3070. How do you explain the 82% uplift for the 2080 Ti, then?

It's because textures, shadow maps, and the like are falling back to system memory due to the VRAM limitation, hence the perf penalty.

Not sure how you came to that conclusion. Have you seen benchmarks of other, more impressive games? The 3070's problem is more BW than VRAM. Moreover, the 2080 Ti was a $1,200 graphics card; the 3070 is doing great. Maybe in a few years, with 8K textures or some silly ultra settings, VRAM may come into play. It's pointless to judge by a mediocre game like this, or Godfall.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
The 2080 Ti has 37.5% more bandwidth than the 3070. How do you explain the 82% uplift for the 2080 Ti, then?

It's because textures, shadow maps, and the like are falling back to system memory due to the VRAM limitation, hence the perf penalty.

Because bandwidth doesn't translate 1:1 into framerate.
Second, Godfall is utter trash for benchmarking RT.
The cost of turning on RT is 5 fps at 4K on a 2080 Ti?


Also, Nvidia RT was only turned on 4 days ago, so where is that benchmark from?
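A toy roofline model shows why bandwidth doesn't map 1:1 onto framerate: a frame is gated by whichever of compute or memory traffic is slower. Every workload number below is made up for illustration:

```
# Toy roofline model: frame time is the slower of compute and memory.
# All workload figures are invented; only 448/616 GB/s are specs.

def frame_ms(flops: float, tflops: float, gbytes: float, bw_gbs: float) -> float:
    compute_ms = flops / (tflops * 1e12) * 1e3
    memory_ms = gbytes / bw_gbs * 1e3
    return max(compute_ms, memory_ms)

# Same hypothetical frame with similar compute but +37.5% bandwidth:
slow = frame_ms(120e9, 13.4, 5.0, 448)
fast = frame_ms(120e9, 13.4, 5.0, 616)
print(f"{slow:.1f} ms -> {fast:.1f} ms ({slow / fast - 1:.0%} faster)")
```

Here a 37.5% bandwidth increase buys only about a 25% speedup, because the frame becomes compute-bound partway through; the gain can just as easily be 0% if the frame was never memory-bound at all.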
 

Kenpachii

Member
By the time you fill up 8GB of VRAM, your GPU would probably have choked on performance anyway.
So giving the 3060 a lot of VRAM is pointless, because in gaming workloads it would have already died by the time it got to filling its VRAM.

There is no game right now, even with RT, that eats up over 8GB of VRAM that wouldn't already have been a slideshow at that resolution.

"A GTX 580 with 3GB of VRAM is useless. Games barely use 1.5GB at the moment, since they're designed around 512MB of VRAM, so it's all fine. The moment new GPUs come out, performance will drastically increase anyway and that 580 will fall flat. Hell, 1.5GB is, what, 6x what the PS3 has? No way next gen could require that much, and the 580 is the flagship card; the 560 with 1GB will also be fine, guys!"

Guess what happened.

That 3GB 580 played every single next-gen game under the sun perfectly fine. Guess what happened to those 1.5GB plebs? They were forced to upgrade and ditch their cards, because even on low settings nothing ran. And it didn't stop there: VRAM requirements crept up and up throughout the gen, to the point where they pushed past even the 6GB that was considered massive overkill back in the day.

This is where you guys are stuck with your logic: looking at current-gen games and that's about it, even though every single new game from this point forward could ditch the PS4 and Xbox One completely and start requiring massive amounts of VRAM. The PS5 and Xbox Series X each have about 10GB of VRAM to spend on games; now imagine PC games going even further. Suddenly 20GB isn't so useless anymore, and a 3090 cruises forward without issues while everything else Nvidia has on the market gets bottlenecked like hell.

The reason I say the 3060 will also age well, even though it doesn't have a huge memory pool, is that its pool is bigger than what the consoles have to spend on games, so it will stay valid no matter what, and the lower settings people use on budget cards anyway will help it massively to stay relevant.

The moment games require 2060 Super-level GPU performance as a minimum for high settings, with at least 10GB of memory, a 3060 will stay relevant while a 3070 will not run it.

And if you think Nvidia has your back, think again: they do this on purpose, to force you to upgrade again in two years.

The performance of a GPU is fucking useless if you don't have the VRAM to back it up.
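For reference, the console numbers being thrown around (the Series X split is Microsoft's published spec; the PC comparison framing is mine):

```
# Microsoft's published Series X layout: one 16 GB GDDR6 pool split by
# bandwidth, with part of the slower region reserved for the OS.
SERIES_X_POOLS = {
    "GPU-optimal": {"size_gb": 10, "bandwidth_gbs": 560},
    "standard":    {"size_gb": 6,  "bandwidth_gbs": 336},
}
for name, pool in SERIES_X_POOLS.items():
    print(f"{name}: {pool['size_gb']} GB @ {pool['bandwidth_gbs']} GB/s")

# A PC splits the job differently: a 10-12 GB card plus separate system
# RAM, so a shared 16 GB console pool isn't directly comparable.
```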
 

Rikkori

Member
The 8GB limitation is very real. The RTX 2080 Ti is 82% ahead of the 3070 here, when the two are supposed to perform identically. Your next argument would be "Well, it's 4K, blah blah... it's a 1440p card".

But have you ever thought that next-gen games could very well push VRAM so hard that even at 1440p this could become problematic?
The funnier part is that these benchmarks don't even have FFX LPM activated, which is where VRAM demand really skyrockets; that's why you see all these low-VRAM apologists always using screenshots from YouTube where the setting is disabled. The sad part is, it doesn't even matter, because it's not like people have a choice of which GPU to buy at the moment, so why are they still fanboying? They're simply retarded. :messenger_tears_of_joy:

 