It is, but it would be better if you could post the frame time graph from MSI Afterburner, or enable its logging option and build a graph in Excel. Ideally give it a 250 ms polling interval.
Show the GPU usage graph too.
In my experience the big stutters only came from moving the camera erratically like you say.
But there are some points in my graphs where GPU usage drops and frame times go very high; that's indicative of the driver shuffling memory between the 3500 MB and 500 MB sections.
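If it helps, here's a rough sketch of picking those spike points out of a log. This assumes you've already exported the Afterburner log to a plain CSV with a hypothetical `frametime_ms` column (the real log format needs a cleanup pass first), so treat it as a starting point, not a ready-made parser:

```python
# Rough sketch: flag frame-time spikes in a logged capture.
# Assumes the MSI Afterburner log was exported/cleaned into a plain CSV
# with a "frametime_ms" column -- that column name is made up here.
import csv
import io

def spike_frames(csv_text, threshold_ms=33.3):
    """Return (avg_ms, spikes): average frame time and all rows above threshold."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    times = [float(r["frametime_ms"]) for r in rows]
    avg = sum(times) / len(times)
    spikes = [t for t in times if t > threshold_ms]
    return avg, spikes

# A tiny fake capture: steady ~16.7 ms with one big hitch.
sample = "frametime_ms\n16.7\n16.9\n120.4\n16.6\n17.0\n"
avg, spikes = spike_frames(sample)
print(round(avg, 1), spikes)  # the single 120 ms hitch stands out against the baseline
```

From there the `times` list drops straight into Excel (or matplotlib) for the actual graph.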
Wow, we're having a very civil discussion here compared to the Nvidia forums right now. They're all going ballistic lol
If I'm running at 1080 will I ever run into this issue? Not planning to go above 1080 any time soon.

Probably not for a little while. Maybe a few scenarios with optional ultra textures, or if you downsample.
I can't wait for someone to bump this thread in a few years' time when someone releases a game that needs all 4GB and it just doesn't work properly.
Man, you weren't joking. Nvidia needs to apply cold water over there.
Like 4GB of VRAM to merely run? Gee, I wonder if this card will hold up in 2020.
So what do you guys think Nvidia will do about this....
nothing?
******hugs r9 290*****
Because an average doesn't take into account stuttering.
60 60 60 60 0 60 60 0 60 60 60 60 60 60 60 0 60 60 60 60 60
The total average is still high, but it will be awful to play. On average, during the course of your life you will not be on fire; in fact, probably closer to over 99% of your life. So why bother about being on fire for less than 1%? Fun with averages.
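For anyone who wants to check the arithmetic on that sequence, it works out like this:

```python
# The frame-rate sequence from the post above: mostly 60 fps with a few
# complete drops. The average hides how jarring those drops are.
fps = [60] * 4 + [0] + [60] * 2 + [0] + [60] * 7 + [0] + [60] * 5

avg = sum(fps) / len(fps)   # 1080 / 21
drops = fps.count(0)

print(round(avg, 1))  # 51.4 -- looks like a perfectly playable average
print(drops)          # 3 hard hitches the average never shows
```

A 51 fps average sounds fine on paper; three full stalls in 21 samples does not feel fine.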
They cannot say the performance of the last 0.5GB VRAM is not a big deal when they have intentionally limited access to it for performance reasons.
I for one will be pissed if my S3 ViRGE can't run games in 2020.
Pretty much. The fact that the cards try really hard to never use more than 3.5GB shows that NVIDIA knew full well that the last 500MB was not performant enough to be accessed regularly.
It's essentially a 3.5GB card with a 500MB overflow buffer. The last 500MB is very obviously not meant to be full-performance VRAM, and instead it seems that it's more like a catch-all buffer before the driver can reallocate things back down below the 3.5GB threshold.
That's not a 4GB card, even though they advertised it as such. People can defend NVIDIA all they want -- it's not 4GB of comparable VRAM. NVIDIA knew this ahead of time, and the driver trying to stay under 3.5GB is proof.
Will this have any practical implications? I can't say. But the point is that NVIDIA marketed it as a fully-fledged 4GB card, and it's clearly not.
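To be clear about what "overflow buffer" means here, a toy model might help. The 3584 MB / 512 MB split matches the figures in this thread, but the greedy fast-pool-first placement is purely an illustrative assumption, not the actual driver logic:

```python
# Toy model of a split VRAM pool: allocations land in the fast 3.5GB
# segment until it's full, then spill into the slow 0.5GB segment.
# This is an illustration of the idea, not how the real driver works.
FAST_MB, SLOW_MB = 3584, 512

def place_allocations(sizes_mb):
    """Greedily fill the fast pool first; return per-allocation placement."""
    fast_used = slow_used = 0
    placement = []
    for size in sizes_mb:
        if fast_used + size <= FAST_MB:
            fast_used += size
            placement.append("fast")
        elif slow_used + size <= SLOW_MB:
            slow_used += size
            placement.append("slow")
        else:
            placement.append("evict")  # driver would have to shuffle something out
    return placement

# A game streaming in 512MB chunks: the 8th chunk spills into the slow pool.
print(place_allocations([512] * 8))
```

The point of the toy model: everything behaves normally until that last chunk, which is exactly why the problem only shows up near the 3.5GB mark.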
This pisses me off so much. I've been noticing that something was going on any time VRAM usage went into the 3.5 range, but had no idea what it was. Not sure if I would have bought the cards knowing this. They are strong performers, but I play at 4K whenever possible and usually have all the other bells and whistles turned on, so this is an issue that I encounter semi-regularly. Considering requesting a refund, tbh.
Are there any cards on the market or even coming soon that can handle 4k? I thought playable 4k was a ways off, at least for single GPU systems
I'll reserve my outrage for when there's actual evidence of this strongly affecting actual games.
I've been playing BioShock Infinite maxed at 4K, 55 fps average, on my single OC'd 970.
What do you mean by playable? A GTX 980 achieves north of 30 fps on a lot of current-gen games at better-than-console settings at 4K. On some it can push 60, and at console settings that number of games rises.
They got away with the 660, the 2GB limit on high-end cards, and the Titan Z. PC performance threads seem to have increasingly more Nvidia issues, but people exclusively harp on AMD drivers. This thread seems to have more people defending Nvidia ("my 2013 game runs 4K fine!!!") than others. So I think Nvidia will be relatively unscathed.

So basically they have admitted to doing the same thing as they did with the 2GB 660/660 Ti cards. Because there wasn't an uproar about those, they were clearly hoping to get away with it again. Not so lucky.
Plenty of older games run like butter at 4K, and look great at it too.
Well, I was asking more in regard to games released in 2014!
Kinda shady for them to oversell half a gig, but I have a feeling I wouldn't even notice the difference.
This is more like you and a friend each got two burgers, fries and a beverage and they all tasted great until your friend exclaims 'Hey, there are no pickles on my second burger!'. You had not noticed up until then, but it turns out your second burger is missing pickles too!
It doesn't ruin the otherwise lovely burgers let alone the entire meal, but the menu clearly stated all burgers made at 'Billy's Burger Palace' have delicious pickles on them right on the front. Hell, you even remember the lady at the drive-in explicitly saying the pickles were on the burgers! This wouldn't frustrate most people to the point they would drive back to the fast food joint and demand a refund, it was relatively cheap anyway, but you're kind of annoyed nonetheless. In fact, you might have actually ordered chicken nuggets had you known this beforehand. At least you can be sure those are 100% chicken! (Or are they?)
Not a perfect analogy by any means either (unless people eat pickles to future-proof their body), but let's not act like that 0.5GB vanished into thin air because of this issue.