
8 GB of VRAM is not enough even for 1080p gaming.

kingyala

Banned
I may be wrong, but I think that the RTX 3070 with its 8 GB has always been advertised as a 1440p card.
Yeah, you can advertise a GPU for any specific resolution to sell it more easily, but that has always been fraudulent. You can't say that so-and-so many GB is directly proportional to so-and-so resolution. What eats up VRAM is the amount of data per frame or scene: you can simply open up Blender and keep adding objects to the scene until you run out of memory, and it doesn't matter what resolution you're running at. It's like saying a 1 TB HDD is enough for a Windows workstation. You can never have "enough" memory; it always depends on the circumstances. I blame the cross-gen period, the lack of boundary-pushing next-gen games on PC in the way Crysis was, plus a plethora of other reasons.
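To put rough numbers on that point: render targets do scale with resolution, but they are small next to the texture and mesh data a scene keeps resident, and that resident data does not care what resolution you output at. A minimal sketch, where every figure (buffer counts, bytes per texel, texture count) is an assumed illustrative value rather than a measurement:

```python
# Back-of-the-envelope VRAM sketch. All figures are assumed/illustrative.

def render_target_mb(width, height, targets=6, bytes_per_pixel=8):
    """Approximate G-buffer + depth + post-processing buffers (assumed layout)."""
    return width * height * targets * bytes_per_pixel / 1024**2

def scene_assets_mb(num_4k_textures, mb_per_texture=22):
    """Assume ~22 MB per block-compressed 4096x4096 texture with a full mip chain."""
    return num_4k_textures * mb_per_texture

for label, w, h in [("1080p", 1920, 1080), ("4K", 3840, 2160)]:
    print(f"{label}: render targets ~{render_target_mb(w, h):.0f} MB, "
          f"scene assets ~{scene_assets_mb(250):.0f} MB")
# 1080p: ~95 MB of render targets; 4K: ~380 MB. The assumed 250 resident
# textures cost ~5500 MB either way, which is what actually fills an 8 GB card.
```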
 

SlimySnake

Flashless at the Golden Globes
The 7900XTX isn't a 53 TFLOPs card, though.
The ALUs on RDNA3 WGPs are double-pumped, meaning they can do twice the FMA operations, but they did so without doubling the caches and schedulers (which would increase the die area a lot more).
This means Navi3 cards will only achieve their peak throughput if the operations are specifically written to take advantage of VOPD (vector operation dual) or the compiler manages to find and group compatible operations (which it doesn't, yet), and if these operations don't exceed the cache limits designed for single-issue throughput.
This means that, at the moment, there's virtually no performance gain from the architectural differences between RDNA2 and RDNA3. The 7900XTX is behaving like a 20% wider and ~10% higher clocked 6900XT with more memory bandwidth, which is why it's only getting a ~35% performance increase until the 6900XT gets bottlenecked by memory bandwidth at high resolutions.


This should get better with a more mature compiler for RDNA3, but don't expect the 7900XTX to ever behave like a 6900XT if it had 192 CUs instead of 80.
So you're expecting the 7800xt to be roughly 10% faster than the 6950xt?

I wonder how they price it considering you can get the 6950xt for $650 nowadays.
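To put the quoted explanation into numbers, here is a rough sketch of the dual-issue arithmetic. The shader counts are the real ones; the clocks are assumed ballpark figures, so treat the outputs as approximations:

```python
# Paper vs. effective throughput sketch for RDNA3 dual-issue (VOPD).
# Clocks below are assumed ballpark game clocks, not official specs.

def tflops(shaders, clock_ghz, flops_per_alu_per_clock):
    return shaders * clock_ghz * flops_per_alu_per_clock / 1000

xtx_paper  = tflops(6144, 2.3, 4)  # FMA (2 flops) x dual-issue (2); ~56, a lower clock gives the ~53 figure
xtx_single = tflops(6144, 2.3, 2)  # what you get when VOPD pairs aren't issued
rx6900xt   = tflops(5120, 2.1, 2)  # RDNA2 is single-issue only

print(f"7900 XTX on paper:     ~{xtx_paper:.0f} TFLOPs")
print(f"7900 XTX without VOPD: ~{xtx_single:.0f} TFLOPs")
print(f"6900 XT:               ~{rx6900xt:.0f} TFLOPs")
print(f"effective uplift: ~{xtx_single / rx6900xt - 1:.0%}")  # ~31%, in the ballpark of the ~35% observed
```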
 

SmokedMeat

Gamer™
So you're expecting the 7800xt to be roughly 10% faster than the 6950xt?

I wonder how they price it considering you can get the 6950xt for $650 nowadays.

I’m not sure, but they could pull an Nvidia and pretend the previous gen doesn’t exist.

Like the 3070ti still selling above $600.
 
I'm firmly of the opinion that developers and the hardware manufacturers are effectively colluding to force everyone onto next-generation cards.

In effect, anyone without a 30 or 40 series card and 32 GB of RAM will end up suffering a severely impaired experience... and it'll only get worse.

Any AAA title releasing next year will basically be unplayable on 10 and 20 series Nvidia cards, even at 1080p.

They will be so poorly optimised (on purpose) and so demanding on VRAM that only those on the new hardware, who can effectively brute-force their way through these issues, will have any success.

Resident Evil 4 Remake aside, which itself is still pretty demanding, think about Forspoken, or Gotham Knights, or Hogwarts Legacy, or even Dead Space Remake.

Lots of poor performance in games that really aren't so breathtaking that they should force a 4090 to struggle to maintain 60 fps.
 

SlimySnake

Flashless at the Golden Globes
I'm firmly of the opinion that developers and the hardware manufacturers are effectively colluding to force everyone onto next-generation cards.

In effect, anyone without a 30 or 40 series card and 32 GB of RAM will end up suffering a severely impaired experience... and it'll only get worse.

Any AAA title releasing next year will basically be unplayable on 10 and 20 series Nvidia cards, even at 1080p.

They will be so poorly optimised (on purpose) and so demanding on VRAM that only those on the new hardware, who can effectively brute-force their way through these issues, will have any success.
All these devs get money from Nvidia and AMD, so I wouldn't put it past them, but the engineering on these games is hard enough as it is. They are probably starved for resources and simply don't have enough engineers to find and fix these issues. Just look at Redfall shipping without a 60 fps mode: it's clearly an engineering challenge they say they will eventually resolve, but they lack the time and resources to get it done for launch.

I had to upgrade to 32 GB just to get Hogwarts running without stuttering, and it went from consuming 15 GB to 25 fucking gigabytes. Clearly, the devs who delayed the last-gen versions for lack of resources just said fuck it and targeted the latest console specs, not to push graphics but just to make backend programming a bit easier.

P.S. I have noticed a lot of bizarre ray tracing patches on older games recently. Bizarre shadow-only RT stuff that Halo Infinite and Elden Ring got a year after launch. Who asked for this? I wouldn't be surprised if the devs took money from either Nvidia or AMD and promised to ship an RT mode.
 

yamaci17

Member
All these devs get money from Nvidia and AMD, so I wouldn't put it past them, but the engineering on these games is hard enough as it is. They are probably starved for resources and simply don't have enough engineers to find and fix these issues. Just look at Redfall shipping without a 60 fps mode: it's clearly an engineering challenge they say they will eventually resolve, but they lack the time and resources to get it done for launch.

I had to upgrade to 32 GB just to get Hogwarts running without stuttering, and it went from consuming 15 GB to 25 fucking gigabytes. Clearly, the devs who delayed the last-gen versions for lack of resources just said fuck it and targeted the latest console specs, not to push graphics but just to make backend programming a bit easier.

P.S. I have noticed a lot of bizarre ray tracing patches on older games recently. Bizarre shadow-only RT stuff that Halo Infinite and Elden Ring got a year after launch. Who asked for this? I wouldn't be surprised if the devs took money from either Nvidia or AMD and promised to ship an RT mode.
To be fair, Hogwarts Legacy is still a mystery to me, because the game clearly functions well on Series S, and unless it uses DirectStorage or some dedicated streaming tech, I really wonder how they got it to work with only an 8 GB memory budget for both GPU and CPU.

I even ran a test at 720p/low, and the game still committed 26 GB of memory. You can prevent or minimize stutters on 16 GB by killing everything in the background (2 GB idle VRAM usage + 12 GB in-game RAM usage), but the game still chomps through and fills the entire 16 GB buffer.

I really don't think Series S runs at these settings... which makes it more troubling. What is the game doing with all that RAM?

However, a frame cap around 45-50 fps weathered the storm. I couldn't find any Hogsmeade Series S benchmarks, though.

Bullshit aside, I'm just playing Cyberpunk with path tracing while RAM usage maxes out around 9-10 GB, and Hogwarts Legacy at 720p/low looks infinitely worse than Cyberpunk.

These games scale like junk.
 

SNG32

Member
I'm firmly of the opinion that developers and the hardware manufacturers are effectively colluding to force everyone onto next-generation cards.

In effect, anyone without a 30 or 40 series card and 32 GB of RAM will end up suffering a severely impaired experience... and it'll only get worse.

Any AAA title releasing next year will basically be unplayable on 10 and 20 series Nvidia cards, even at 1080p.

They will be so poorly optimised (on purpose) and so demanding on VRAM that only those on the new hardware, who can effectively brute-force their way through these issues, will have any success.

Resident Evil 4 Remake aside, which itself is still pretty demanding, think about Forspoken, or Gotham Knights, or Hogwarts Legacy, or even Dead Space Remake.

Lots of poor performance in games that really aren't so breathtaking that they should force a 4090 to struggle to maintain 60 fps.
It depends on the game. For JRPGs and fighting games, a 10 series should suffice for this gen. Open-world games will start getting bumped up to a minimum of a 20 series, though.
 

yamaci17

Member
It depends on the game. For JRPGs and fighting games, a 10 series should suffice for this gen. Open-world games will start getting bumped up to a minimum of a 20 series, though.
I think you're missing the point: GPU grunt is not the problem here. A 1080 Ti still has equal or greater grunt than a 3060 once ray tracing/DLSS are taken out of the picture.

A 1080 also has plenty of grunt, pretty much equal to or a bit better than a 3050.

The problem is VRAM amounts. The 1070 and the 3070 Ti have the same 8 GB of VRAM, which devs are apparently struggling to fit their data into. As a result, most casual users (I leave myself out of that) get PS3- or N64-like assets and textures here and there. That really has nothing to do with having a 2000 or 3000 series card; in fact, a 1080 Ti with 11 GB will have a better time with TLOU at high textures than someone with an 8 GB 3070 Ti (especially if their idle VRAM usage is high). The 4060/4060 Ti at 8 GB is a further insult.

That pretty much sums up the problem.
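A small sketch of that headroom argument; the idle usage and the game's requirement below are assumed example values, not measurements:

```python
# Free VRAM, not GPU grunt, decides whether high textures fit. Assumed numbers.

def texture_budget(total_vram_gb, idle_usage_gb):
    """VRAM left for the game after Windows/background apps take their share."""
    return total_vram_gb - idle_usage_gb

game_needs_gb = 9.0   # assumed requirement for a TLOU-style "high" texture pool
for name, vram_gb in [("GTX 1080 Ti (11 GB)", 11.0), ("RTX 3070 Ti (8 GB)", 8.0)]:
    free = texture_budget(vram_gb, idle_usage_gb=1.5)
    verdict = "fits" if free >= game_needs_gb else "spills over PCIe (stutter / muddy mips)"
    print(f"{name}: ~{free:.1f} GB free -> {verdict}")
```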
 

SNG32

Member
I think you're missing the point: GPU grunt is not the problem here. A 1080 Ti still has equal or greater grunt than a 3060 once ray tracing/DLSS are taken out of the picture.

A 1080 also has plenty of grunt, pretty much equal to or a bit better than a 3050.

The problem is VRAM amounts. The 1070 and the 3070 Ti have the same 8 GB of VRAM, which devs are apparently struggling to fit their data into. As a result, most casual users (I leave myself out of that) get PS3- or N64-like assets and textures here and there. That really has nothing to do with having a 2000 or 3000 series card; in fact, a 1080 Ti with 11 GB will have a better time with TLOU at high textures than someone with an 8 GB 3070 Ti (especially if their idle VRAM usage is high). The 4060/4060 Ti at 8 GB is a further insult.

That pretty much sums up the problem.
I still attribute it to somewhat trash ports being the issue. The PS5 version of The Last of Us was made for the PS5. I'm not saying VRAM isn't an issue if you're trying to run ultra 4K 60, but most games shouldn't be having problems on a 3060 and up at 1440p or 1080p high settings, regardless of VRAM. That is a shitty port.
 

Loxus

Member
I still attribute it to somewhat trash ports being the issue. The PS5 version of The Last of Us was made for the PS5. I'm not saying VRAM isn't an issue if you're trying to run ultra 4K 60, but most games shouldn't be having problems on a 3060 and up at 1440p or 1080p high settings, regardless of VRAM. That is a shitty port.
Did you watch the OP video?
 

Ev1L AuRoN

Member
Now that more and more games are targeting the PS5/XSX, it's only natural that not just VRAM but also CPU requirements will increase. Not that the custom Zen 2 in the consoles is amazing or anything, but we need to keep in perspective that most of what we've been playing on PC for almost a decade targeted Jaguar cores running at sub-2 GHz clocks.
 

RobRSG

Member
Funny thing is, they did. Except for the 7700XT/7800XT.
AMD ain't playing around.

Building an Enthusiast PC
More Memory Matters
Without enough video memory your experience may feel sluggish with lower FPS at key moments, more frequent pop-in of textures, or – in the worst cases – game crashes. You can always fine tune the in-game graphics settings to find the right balance of performance, but with more video memory you are less likely to have to make these compromises. For this enthusiast build, we recommend graphics cards with at least 16GB of video memory for ultimate 1440p and 4K gaming. For more mid-range graphics that are targeting 1440p, AMD Radeon™ offers 12GB GPUs that are excellent for QHD displays.

Peak Memory Usage in Newer Games
Tested with Radeon™ RX 7900 XTX at 4K Ultra Settings with RT on and off.
[charts: peak VRAM usage per game, RT on vs. off]
I took the time to read the ad, and on reflection, there is some stuff that can backfire here:

1. The double edge of using a campaign/bullet point that sounds similar to, and will be compared with, a very well-known movement in the US. Come on, AMD, your marketing guys should be smarter than that. Sponsoring games and asking them to artificially fuck up VRAM management was a genius move right up until you did this.

2. The swapalooza of comparing graphics cards to make the competitor's cards look expensive and bad. Why is the 6950XT being pitched against the 3080 when its real opponent was the original 3090? And why has the MSRP been updated on only one side? The same can be said about the 6800XT vs the 3070 Ti. I think the only fellas falling for this are the ones with short memories and the ones trying to justify a useless upgrade path.
 

PaintTinJr

Member
For the sake of two of my nephews who like higher texture settings, one with a 6GB RTX 2060 Super and the other with an 8GB RX 6650 XT, both stuck with what they consider muddy texturing lately just to avoid loading stutters, I'm hoping this is a short-term problem caused by PC ports being wasteful with VRAM, staging compressed assets before use and keeping redundant assets resident longer than necessary, with lots of latitude to improve.

Assuming Nvidia's gouging on VRAM, plus the latest consoles pushing the bar up, now means that efficient streaming and careful VRAM accounting are essential for high textures on PCs with 6 GB or more, I could see this getting fixed in many games via engine updates in the next year or two, probably just in time for 12 GB or more actually being needed, once consoles start using more of their 16 GB as VRAM and stream in more and more new data per second, increasing the effectiveness of the VRAM/RAM split they use.
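As a rough illustration of the kind of VRAM accounting being hoped for here, a minimal least-recently-used texture pool that evicts cold assets instead of keeping them resident indefinitely. The budget and texture sizes are made-up values, and real engines work at much finer granularity:

```python
# Minimal LRU texture-residency sketch; budget and texture sizes are assumed.
from collections import OrderedDict

class TexturePool:
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()          # name -> size in MB, oldest first

    def request(self, name, size_mb):
        if name in self.resident:              # already resident: mark as recently used
            self.resident.move_to_end(name)
            return
        # evict least-recently-used textures until the new one fits the budget
        while self.resident and sum(self.resident.values()) + size_mb > self.budget_mb:
            evicted, freed = self.resident.popitem(last=False)
            print(f"evicting {evicted} ({freed} MB)")
        self.resident[name] = size_mb          # a real engine would decompress/upload here

pool = TexturePool(budget_mb=6144)             # e.g. what an 8 GB card can spare for textures
for tex, mb in [("rock_albedo", 22), ("npc_head", 44), ("rock_albedo", 22)]:
    pool.request(tex, mb)
```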
 

SNG32

Member
For the sake of two of my nephews who like higher texture settings, one with a 6GB RTX 2060 Super and the other with an 8GB RX 6650 XT, both stuck with what they consider muddy texturing lately just to avoid loading stutters, I'm hoping this is a short-term problem caused by PC ports being wasteful with VRAM, staging compressed assets before use and keeping redundant assets resident longer than necessary, with lots of latitude to improve.

Assuming Nvidia's gouging on VRAM, plus the latest consoles pushing the bar up, now means that efficient streaming and careful VRAM accounting are essential for high textures on PCs with 6 GB or more, I could see this getting fixed in many games via engine updates in the next year or two, probably just in time for 12 GB or more actually being needed, once consoles start using more of their 16 GB as VRAM and stream in more and more new data per second, increasing the effectiveness of the VRAM/RAM split they use.
I think Nvidia will bring some form of DLSS 3 to the 3000 series, or make DLSS 2 more efficient with VRAM, to hit 1080p or 1440p with high and ultra textures. For ultra 4K you're definitely going to need heavy-hitting GPUs with a lot of VRAM, though.
 

Stooky

Member
Almost all of the latest games have this issue. I've been bitching about this for the last few months. Gotham Knights, Forspoken, Callisto, Hogwarts, Witcher 3, and RE4 all have really poor RT performance. Not all of it is related to VRAM like TLOU, but a lack of VRAM definitely doesn't help.

PCs will NEVER get optimized ports. You can go back two or three generations and every PC port released with issues. PCs are meant to brute-force through those poor optimizations, and these games do exactly that, until you turn on RT, which increases VRAM usage, or enable ultra textures, and boom, those same cards simply crash and do not perform according to their specs.

This will only continue as devs release unoptimized console ports. Yes, console ports. You think TLOU is properly optimized on the PS5? Fuck no. 1440p 60 fps for a game that at times looks worse than TLOU2? TLOU2 ran at 1440p 60 fps on a 4 TFLOP Polaris GPU with a Jaguar CPU. The PS5 has a far better CPU and a 3x more powerful GPU, yet all they managed to do was double the framerate. Dead Space on the PS5 runs at an internal resolution of 960p. That is not an optimized console game, I can promise you. PCs, just like consoles, are being asked to brute-force things, and the AMD and Nvidia cards with proper VRAM allocations can do exactly that.

Respawn's next Star Wars game is a next-gen exclusive. FF16 is a next-gen exclusive. Both look last-gen, but I can promise you they will not be sitting at 5 GB of VRAM usage like Cyberpunk, RDR2, and Horizon did. Those games still look better than these so-called next-gen games, but it doesn't matter: they are being designed by devs who no longer wish to target last-gen specs. And sadly, despite the 3070 being almost 35-50% faster than the PS5, its VRAM is going to hold it back going forward.
Slimy… why… every friggin' time… ugggh. Games are optimized for consoles. The optimization gets better over the life of a console as devs learn more about the system. And honestly, every game that is released is optimized. You know this.
 
I think Nvidia will bring some form of DLSS 3 to the 3000 series, or make DLSS 2 more efficient with VRAM, to hit 1080p or 1440p with high and ultra textures. For ultra 4K you're definitely going to need heavy-hitting GPUs with a lot of VRAM, though.

Unlikely. There's no magic that fits 12-14 GB of VRAM usage into Nvidia's paltry 8 GB limit on most of their cards.
 

Noxxera

Member
I can guarantee I can play games at 4K with 8 GB of VRAM. 1080p? I'm gonna have to enable resolution scaling and 8x MSAA or something to hit the cap.
 
To be fair, Hogwarts Legacy is still a mystery to me, because the game clearly functions well on Series S, and unless it uses DirectStorage or some dedicated streaming tech, I really wonder how they got it to work with only an 8 GB memory budget for both GPU and CPU.

I even ran a test at 720p/low, and the game still committed 26 GB of memory. You can prevent or minimize stutters on 16 GB by killing everything in the background (2 GB idle VRAM usage + 12 GB in-game RAM usage), but the game still chomps through and fills the entire 16 GB buffer.

I really don't think Series S runs at these settings... which makes it more troubling. What is the game doing with all that RAM?

However, a frame cap around 45-50 fps weathered the storm. I couldn't find any Hogsmeade Series S benchmarks, though.

Bullshit aside, I'm just playing Cyberpunk with path tracing while RAM usage maxes out around 9-10 GB, and Hogwarts Legacy at 720p/low looks infinitely worse than Cyberpunk.

These games scale like junk.


Are we sure it's maxing out VRAM in some of these examples, or is it just using the extra space as a cache?

I wouldn't worry unless it's stuttering or stalling the game.
 

yamaci17

Member
Are we sure it's maxing out VRAM in some of these examples, or is it just using the extra space as a cache?

I wouldn't worry unless it's stuttering or stalling the game.
It is stuttering. The game legitimately needs around 20 GB of RAM in Hogsmeade to run smoothly. You can see it pretty clearly in the video: the game is a stuttery mess on 16 GB in Hogsmeade, quite literally unplayable. The only way to make it tolerable on 16 GB is to lock to 30/40 fps, and even then you will still get stutters.

This is at 720p and the lowest possible settings, where the game will most likely look even worse than the eventual PS4 version. Series S runs the game at clearly mixed medium/high settings at 900p/60 with only a limited 8 GB total memory budget.

Logically, you can assume that all of the GPU's VRAM data is duplicated. Let's assume Series S uses 6 GB for the GPU and 2 GB for CPU operations.

That means on PC, theoretically, the game should have been fine with a total of around 8 GB of RAM usage. Instead, it hard-requires 20 GB.

Either this game uses sophisticated streaming / DirectStorage-style technology on consoles to achieve that, or something is broken on the PC side. I've never heard of it using DirectStorage on consoles, so I can't be sure.

(This is a RAM comparison, by the way, not VRAM. At those settings, there's no problem with VRAM.)
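For reference, the arithmetic behind that expectation, with the console split treated purely as an assumption:

```python
# Unified console memory vs. PC RAM+VRAM, using assumed (not measured) splits.
series_s_game_budget_gb = 8.0          # approx. memory available to the game on Series S
assumed_gpu_share_gb    = 6.0          # assumed GPU-side assets
assumed_cpu_share_gb    = series_s_game_budget_gb - assumed_gpu_share_gb

# Worst case on PC: CPU-side data plus a full system-RAM mirror of GPU uploads.
naive_pc_ram_gb = assumed_cpu_share_gb + assumed_gpu_share_gb
print(f"naive PC expectation: ~{naive_pc_ram_gb:.0f} GB of system RAM")
print("observed at 720p/low in Hogsmeade: ~20 GB committed")
```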
 
It is stuttering. The game legitimately needs around 20 GB of RAM in Hogsmeade to run smoothly. You can see it pretty clearly in the video: the game is a stuttery mess on 16 GB in Hogsmeade, quite literally unplayable. The only way to make it tolerable on 16 GB is to lock to 30/40 fps, and even then you will still get stutters.

This is at 720p and the lowest possible settings, where the game will most likely look even worse than the eventual PS4 version. Series S runs the game at clearly mixed medium/high settings at 900p/60 with only a limited 8 GB total memory budget.

Logically, you can assume that all of the GPU's VRAM data is duplicated. Let's assume Series S uses 6 GB for the GPU and 2 GB for CPU operations.

That means on PC, theoretically, the game should have been fine with a total of around 8 GB of RAM usage. Instead, it hard-requires 20 GB.

Either this game uses sophisticated streaming / DirectStorage-style technology on consoles to achieve that, or something is broken on the PC side. I've never heard of it using DirectStorage on consoles, so I can't be sure.

(This is a RAM comparison, by the way, not VRAM. At those settings, there's no problem with VRAM.)

Ah right, sorry, I thought it was VRAM-related given the thread title and the general VRAM posts lately.

System memory isn't as bad; you can upgrade or add more quite cheaply. The big problem with VRAM is that Nvidia is so stingy with it and has hugely inflated GPU prices.
 

nkarafo

Member
Truth is, the more resources you give developers, the more bloated their code becomes, because they care less about optimizing it. Give them 50 GB of VRAM today and they will cap it out in a couple of years without improving the graphics all that much.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Conclusion: if you buy a new graphics card today, you need at least 12 GB of VRAM, even for 1080p at high/ultra settings.


So don't play at high/ultra if you get an 8 GB GPU?
We will ignore that the 3080 10G is still doing fine outside of outliers that seemingly use Microsoft Paint to compress their high, medium, and low textures.
 

th4tguy

Member
All these devs get money from Nvidia and AMD, so I wouldn't put it past them, but the engineering on these games is hard enough as it is. They are probably starved for resources and simply don't have enough engineers to find and fix these issues. Just look at Redfall shipping without a 60 fps mode: it's clearly an engineering challenge they say they will eventually resolve, but they lack the time and resources to get it done for launch.

I had to upgrade to 32 GB just to get Hogwarts running without stuttering, and it went from consuming 15 GB to 25 fucking gigabytes. Clearly, the devs who delayed the last-gen versions for lack of resources just said fuck it and targeted the latest console specs, not to push graphics but just to make backend programming a bit easier.

P.S. I have noticed a lot of bizarre ray tracing patches on older games recently. Bizarre shadow-only RT stuff that Halo Infinite and Elden Ring got a year after launch. Who asked for this? I wouldn't be surprised if the devs took money from either Nvidia or AMD and promised to ship an RT mode.
A lot of times those tech upgrades are learning projects for the devs. They want that tech to be implemented from day one on the next project and don’t have experience with it so they do a smaller patch project for the last title.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Have you seen The Last of Us on medium? It looks worse than the PS3 version.
Yes, because they literally spit on the textures if you don't set them to max.
I've never seen a game say fuck you to texture quality that drastically.

Medium settings shouldn't mean a heavy LOD bias... Naughty Dog should be ashamed of that.
In most games, high and medium aren't that drastically different, and people can comfortably play on medium, especially if it isn't an open-world game.

The Last of Us on medium environment textures looks like they were preparing a Switch port.
Fucking disgusting.......
[image: Pokémon Scarlet/Violet screenshot]

But it's an outlier and a very, very bad port.
Most games' high and medium textures are totally playable... The Last of Us on medium looks much, much worse than The Last of Us Part II on a base PS4.
That should tell you something went wrong.
 

Spyxos

Gold Member
Yes, because they literally spit on the textures if you don't set them to max.
I've never seen a game say fuck you to texture quality that drastically.

Medium settings shouldn't mean a heavy LOD bias... Naughty Dog should be ashamed of that.
In most games, high and medium aren't that drastically different, and people can comfortably play on medium, especially if it isn't an open-world game.

The Last of Us on medium environment textures looks like they were preparing a Switch port.
Fucking disgusting.......
[image: Pokémon Scarlet/Violet screenshot]

But it's an outlier and a very, very bad port.
Most games' high and medium textures are totally playable... The Last of Us on medium looks much, much worse than The Last of Us Part II on a base PS4.
That should tell you something went wrong.
The problem is, we've had several of these very bad ports lately (The Last of Us, Forspoken), and there will be a lot more of them. Star Wars Jedi: Survivor and Immortals of Aveum will probably be among them as well.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
The problem is, we've had several of these very bad ports lately (The Last of Us, Forspoken), and there will be a lot more of them. Star Wars Jedi: Survivor and Immortals of Aveum will probably be among them as well.
The number of games with totally passable medium textures vastly, vastly outnumbers, and will continue to outnumber, the games that compress anything under high into mud.

Immortals of Aveum is a UE5 title; even using UE's default mipmapping won't give you what The Last of Us has... clearly an incompetent port team was working on that game.
If UE continues to be the reigning champ of most-used engines, then even indie devs like the guys making Immortals of Aveum can just use Unreal's auto-generated mipmaps and have medium textures be totally playable.
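For anyone wondering why a sane medium texture setting is normally cheap rather than muddy: dropping the top mip level of each texture cuts its footprint by roughly 75%, because every mip below it is a quarter of the size of the one above. A quick sketch, assuming block-compressed textures at roughly 1 byte per texel:

```python
# Memory of a square texture's mip chain, with the top N mips dropped.
# Assumes ~1 byte per texel (BC7-class compression); purely illustrative.

def mip_chain_mb(base_res, dropped_top_mips=0, bytes_per_texel=1.0):
    size_bytes, res = 0.0, base_res >> dropped_top_mips
    while res >= 4:                    # stop at the smallest practical mip
        size_bytes += res * res * bytes_per_texel
        res //= 2
    return size_bytes / 1024**2

for dropped, label in [(0, "ultra: full 4K chain"), (1, "high: top mip dropped"), (2, "medium")]:
    print(f"{label:24s} ~{mip_chain_mb(4096, dropped):5.1f} MB per texture")
# ~21.3 MB -> ~5.3 MB -> ~1.3 MB: big savings per step without going full mud.
```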
 

Loxus

Member
The problem is, we have several of these very bad ports lately(Last of Us, Forspoken) there will be a lot more of them. Star Wars Jedi Survival and Immortals of Aveum will probably be one of them as well.
After watching that video, it's clear those games aren't bad ports; it even said that this level of VRAM utilization is becoming a trend among recent titles.

The guy in the video also said those games are built for 4K and then downscaled to 1080p while still keeping the 4K textures and assets.

I don't know why you guys don't want progress in the PC gaming space. Not raising the minimum from 8 GB to 12 GB can hurt PC gaming in the long run, especially once the PS4/XB1 generation is left behind for good.

It's a fact that VRAM usage goes up every console generation, and it puzzles me that you guys still deny this, even after multiple test videos.

In 5-6 years, consoles will be pushing 8K. Hopefully one day PC gamers can stop holding devs back by making them optimize for 1080p.
 

Bojji

Member
So don't play at high/ultra if you get an 8 GB GPU?
We will ignore that the 3080 10G is still doing fine outside of outliers that seemingly use Microsoft Paint to compress their high, medium, and low textures.

Truth is, 12 GB is an OK amount for "next gen" games at console-quality settings, but with devs sometimes offering higher-quality settings than the consoles, and with ray tracing, 12 GB cards will struggle too.

The 3080 10GB is right on the edge; you already have to lower settings in a few games to avoid being fucked by an insufficient amount of VRAM.

I have a 6800 and I'm thinking about changing GPUs in the next 12 months, but there is no Nvidia GPU worth changing to at a reasonable price. The 4070, outside of RT, is only slightly better (and I would still have to pay 50% more), and the 4070 Ti is 100% more expensive. Both GPUs also have less VRAM...
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
After watching that video, it's clear those games aren't bad ports; it even said that this level of VRAM utilization is becoming a trend among recent titles.

The guy in the video also said those games are built for 4K and then downscaled to 1080p while still keeping the 4K textures and assets.

I don't know why you guys don't want progress in the PC gaming space. Not raising the minimum from 8 GB to 12 GB can hurt PC gaming in the long run, especially once the PS4/XB1 generation is left behind for good.

It's a fact that VRAM usage goes up every console generation, and it puzzles me that you guys still deny this, even after multiple test videos.

In 5-6 years, consoles will be pushing 8K. Hopefully one day PC gamers can stop holding devs back by making them optimize for 1080p.
You do know the most popular GPU range is the low to mid-range, right?

The upper tiers of GPUs are a minority by quite a gap.
I think only the 3080 has seen a good boost in recent months, because people have been offloading them and others have been buying them up since they dropped as low as $500.

And people don't actually buy new GPUs every time a new series comes out.
New GPUs come out every two years.
If and when enough games force PC gamers to meet min spec, there will be a generational shift, and you'll see people dumping their old GPUs just like the GTX 570/670/770 all basically vanished one day.

Sure, I fully agree that 12 GB of VRAM should be the minimum across the range, and it's kind of disappointing that Nvidia is giving everything under the RTX 4070 8 GB of VRAM.

But people with 3070s and 3080s definitely don't need to throw away their GPUs.
Medium texture quality shouldn't be something that suddenly looks muddy as hell.

It's never been like that, but because The Last of Us has terrible, terrible textures, people are acting like every game has terrible textures.
It's a huge, huge outlier; we've seen next-gen-only games that work within 10 GB of VRAM just fine.
And even in the past, people have just lowered texture quality and been fine.

You aren't going to see the top 20* GPUs on Steam suddenly vanish because of The Last of Us and Forspoken.

*The RTX 3060 12GB is on there, but I'll still count it in the top 20 because that sounds better.
 

Spyxos

Gold Member
After watching that video, it's clear those games aren't bad ports; it even said that this level of VRAM utilization is becoming a trend among recent titles.

The guy in the video also said those games are built for 4K and then downscaled to 1080p while still keeping the 4K textures and assets.

I don't know why you guys don't want progress in the PC gaming space. Not raising the minimum from 8 GB to 12 GB can hurt PC gaming in the long run, especially once the PS4/XB1 generation is left behind for good.

It's a fact that VRAM usage goes up every console generation, and it puzzles me that you guys still deny this, even after multiple test videos.

In 5-6 years, consoles will be pushing 8K. Hopefully one day PC gamers can stop holding devs back by making them optimize for 1080p.
I would rather say that it's clear these games were thrown onto the market unfinished. These games have way more problems than just high VRAM requirements on PC.

I'm all for progress, but it should also be possible to fill 8 GB of VRAM with slightly prettier textures than what we got.

There will certainly be no 8K gaming on consoles in 5-6 years. It is simply not worth wasting that many resources (hardware power, energy) on 8K.

I'm not even sure the average gamer would see the difference between 4K and 8K. However, he would certainly notice the increased prices.
 

Bojji

Member


But people with 3070s and 3080s definitely don't need to throw away their GPUs.
Medium texture quality shouldn't be something that suddenly looks muddy as hell.

The 3070 is dead, and the same goes for the new 4060/4060 Ti. 8 GB of VRAM is a dogshit amount for cards that are not low-end. Despite having the raw power to play games at reasonable settings, 3070 users will be forced to look at N64 textures in new games.

Look at Crysis from 2007: on low settings it looks worse than Far Cry from 2004 (I was there at launch with my 7600GT) and runs worse too on the same hardware. Developers never really cared about people playing below recommended settings. They make textures and materials for consoles and then some shit version of them for low-VRAM users. The Xbox Series S is in the same spot as the 8 GB cards, and some games already have horrible textures on it.
 

KXVXII9X

Member
This is why I'm going back to console and handheld gaming, so I don't have to worry about any of this crap. When does the upgrading end? Gameplay and such still continue to be ignored. I'm more impressed with Zelda TotK tbh and it runs on PS3 level hardware.
 

Bojji

Member
This is why I'm going back to console and handheld gaming, so I don't have to worry about any of this crap. When does the upgrading end? Gameplay and such still continue to be ignored. I'm more impressed with Zelda TotK tbh and it runs on PS3 level hardware.

But Zelda will be low-res and sub-30 fps; at least with the PS5 and Series X you can expect better performance and resolution targets.

I think people will be very happy playing Zelda on an emulator, lol.
 

Loxus

Member
There will certainly be no 8K gaming on consoles in 5-6 years. It is simply not worth wasting that many resources (hardware power, energy) on 8K.

I'm not even sure the average gamer would see the difference between 4K and 8K. However, he would certainly notice the increased prices.
Yeah, I thought so too, until I saw this tweet a while back.

Generating hints of object overlap by region testing while rendering for efficient multi-GPU rendering of geometry
[patent figures]


A single 40 CU RDNA4/5 chiplet on 3nm should be capable of doing 4K/60.
If I understand this patent correctly, Mark Cerny found a way to make multiple GPU chiplets work by dividing the screen into four regions and having each chiplet render its own division. Of course, a patent isn't always used in a product, but Mark Cerny having created this makes me believe the PS6 will be built from multiple GPU chiplets (GCDs).

AMD has a similar patent too, which raises the probability even higher.
[patent figures]
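For anyone curious what the basic idea looks like, here is a very loose toy sketch of the concept as I read it, not the patent's actual method: split the screen into quadrants, give each quadrant to one chiplet, and submit a piece of geometry only to the chiplets whose region its screen-space bounding box overlaps.

```python
# Toy split-frame rendering sketch: four chiplets, one screen quadrant each.
# Purely illustrative; not taken from the patent text.

def quadrant(x, y, width, height):
    """Which chiplet's region a screen-space point falls in (0..3)."""
    return (0 if x < width // 2 else 1) + (0 if y < height // 2 else 2)

def chiplets_for(bbox, width=3840, height=2160):
    """Chiplet ids that must process a draw whose bounding box is (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = bbox
    corners = [(x0, y0), (x1, y0), (x0, y1), (x1, y1)]
    return sorted({quadrant(x, y, width, height) for x, y in corners})

print(chiplets_for((100, 100, 300, 300)))       # -> [0], fully inside one region
print(chiplets_for((1800, 1000, 2100, 1200)))   # -> [0, 1, 2, 3], straddles the centre
```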
 

Spyxos

Gold Member
Yeah, I thought so too, until I saw this tweet a while back.

Generating hints of object overlap by region testing while rendering for efficient multi-GPU rendering of geometry
[patent figures]

A single 40 CU RDNA4/5 chiplet on 3nm should be capable of doing 4K/60.
If I understand this patent correctly, Mark Cerny found a way to make multiple GPU chiplets work by dividing the screen into four regions and having each chiplet render its own division. Of course, a patent isn't always used in a product, but Mark Cerny having created this makes me believe the PS6 will be built from multiple GPU chiplets (GCDs).

AMD has a similar patent too, which raises the probability even higher.
[patent figures]

I don't really understand what exactly it says.
 