
Nvidia responds to GTX 970 memory issue

mephixto

Banned
It is, but it would be better if you could post the frame time graph from MSI AB, or enable the log option in MSI AB and create a graph in Excel. Ideally give it a 250ms polling time.

Show the GPU usage graph too.

In my experience the big stutters only came from moving the camera erratically like you say.

But there are some points in my graphs where GPU usage drops and frame times go very high, which is indicative of the driver managing memory between the 3500MB and 500MB sections.
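If you'd rather not build the graph in Excel by hand, a rough sketch like the following could chart an exported log instead. Purely illustrative: it assumes the Afterburner log has been exported to a CSV with "Frametime" and "GPU usage" columns, and the actual column names will depend on which sensors you log.

# Minimal sketch: plot frame time and GPU usage from an MSI Afterburner
# log exported to CSV. The column names "Frametime" and "GPU usage" are
# assumptions; adjust them to match your own log.
import pandas as pd
import matplotlib.pyplot as plt

log = pd.read_csv("afterburner_log.csv")

fig, (ax_ft, ax_gpu) = plt.subplots(2, 1, sharex=True)

ax_ft.plot(log.index, log["Frametime"])
ax_ft.set_ylabel("Frame time (ms)")

ax_gpu.plot(log.index, log["GPU usage"])
ax_gpu.set_ylabel("GPU usage (%)")
ax_gpu.set_xlabel("Sample (250ms polling)")

plt.tight_layout()
plt.show()

Frame time spikes in the top chart that line up with GPU usage dips in the bottom one are exactly the pattern described above.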

I'll try that. The game starts stuttering heavily on my end when moving the camera around, but what I noticed is that VRAM reached its max (4070MB, I believe) and that's natural; there is no more physical VRAM to allocate. I even managed to crash the game a couple of times.
 
Edit: Whoops, lol, wrong thread. This is what I get for being a tab hoarder.

Actually on-topic: I think I'm going to wait a couple more days to see more real-life scenarios, but before all this happened I was extremely close to pulling the trigger on a 970, and I'm still considering it.
 

pestul

Member
Wow, we're having a very civil discussion here compared to the Nvidia forums right now. They're all going ballistic lol

I'm pissed, but I still love the performance I've gotten out of the card thus far.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Wow, we're having a very civil discussion here compared to the Nvidia forums right now. They're all going ballistic lol

Yes. I saw that earlier. They are not happy bunnies.
 

Xdrive05

Member
Nvidia really fucked this one up, and I'm not even talking about the card itself. Their failure to disclose it, and their two-weeks-too-late response once their customers discovered it, is shameful.

It's like they're not even trying to stay in good graces with the people who pay their salaries.

And when they do respond, it's some hand waving bullshit that doesn't even address the stuttering which revealed the hardware issue in the first place. They just show average fps like this is fucking 2002.

Only thing now is what happens next. Will there be a mass market backlash? Or is this just the dirty plebeian "power users" making noise in the internet wilderness while the company keeps it from making popular waves? We'll see.
 

MadSexual

Member
While it's slightly disappointing to me, it probably won't affect my plans, as I've always intended to add one of the rumored 6 or 8 gig variants in SLI with this one as the secondary. I don't expect to suffer in performance before that hypothetical time. ...Assuming the architecture does not preclude higher VRAM models.
 

scurker

Member
Well shit. I have been looking at upgrading and had my eye on a 970. I'm not sure what to do now, since the 960 cards don't look that great and I haven't really been looking at any 290x cards.
 
If I'm running at 1080 will I ever run into this issue? Not planning to go above 1080 any time soon.

Everyone is ALREADY running into the issue. The problem is demonstrating how much it costs, given that we don't have a version of the card without this limitation to compare against.

If Shadow of Mordor tries to stick to 3.5GB then it already swaps more than it would without the soft ceiling. That alone affects performance in the form of frame pacing.

Those of you saying that the game stutters when panning the camera AND memory usage is lower than 4GB: you are all seeing concrete effects of this.

Those who think it's not an issue are misperceiving the problem and how it actually affects performance.
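To put toy numbers on that (this is not NVIDIA's actual driver logic, just an illustration of the soft-ceiling argument above):

# Toy illustration only, NOT NVIDIA's real driver behaviour: why a 3.5GB
# soft ceiling forces extra shuffling once a game's working set exceeds it.
working_set_gb = 3.8   # hypothetical per-frame VRAM footprint
fast_pool_gb = 3.5     # full-speed segment the driver prefers to stay in
total_pool_gb = 4.0    # what the box advertises

overflow_gb = max(0.0, working_set_gb - fast_pool_gb)

print(f"Uniform {total_pool_gb}GB card: 0.0GB forced out of fast memory")
print(f"{fast_pool_gb}GB soft ceiling: {overflow_gb:.1f}GB has to sit in the slow "
      f"0.5GB segment or be shuffled back under the ceiling whenever it's touched")

With the same hypothetical 3.8GB working set, a card with uniform memory just holds it; a card juggling a 3.5GB fast pool has to keep moving the overflow around, and that juggling is what shows up as frame pacing problems.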
 

Xdrive05

Member
So what do you guys think Nvidia will do about this....
nothing?

Almost certainly nothing. Their mealy-mouthed response was basically, "yeah, lol, we really hoped you wouldn't find out, but at least we sold a bunch first! But it's still pretty good, so fuck you."

They won't do anything until the market responds. Which it might not do in a large enough way to prompt action from them. The next week will decide it.
 
For those of you who misunderstand how this works:

1- The difference between 3.5GB and 4GB in actual game performance is very minimal.
2- You haven't seen this problem reported because of point 1, not because no one is actually affected.

So this is a "small" issue because the difference of 0.5GB in actual gameplay is small, not because there isn't a real problem, or because it's rare to trigger it.
 

potam

Banned
Man fuck this. Gonna look into getting a refund for my cards and just go back to rocking my 680 until AMD's cards come out.

And the nvidia defense force needs to chill the fuck out. At a MINIMUM what they did was shady. And yes, I would like all 4 gb to perform at the same speed. Sorry for being a needy, picky consumer.
 

Zukuu

Banned
I'd like to see a small price cut so I can finally get one.
Doesn't seem like a big issue, as long as they take care of optimizing the games via their drivers for the 970.
 

Falk

that puzzling face
Because an average doesn't take into account stuttering.

60 60 60 60 0 60 60 0 60 60 60 60 60 60 60 0 60 60 60 60 60

Total average is still high, but it will be awful to play. On average, during the course of your life you will not be on fire; in fact, probably for over 99% of your life. So why worry about being on fire for less than 1%! Fun with averages.
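To spell the quoted arithmetic out (just the numbers above, nothing more):

# The frame rate sequence quoted above: a healthy-looking average hides
# the drops that actually make the game feel awful.
fps = [60, 60, 60, 60, 0, 60, 60, 0, 60, 60, 60, 60, 60, 60, 60, 0, 60, 60, 60, 60, 60]

avg_fps = sum(fps) / len(fps)
hitches = fps.count(0)

print(f"Average FPS: {avg_fps:.1f}")               # ~51.4, looks fine on paper
print(f"Hard hitches (0 FPS samples): {hitches}")  # 3 stalls the average glosses over

Roughly 51 FPS on average, and it would still play terribly.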

I would like to subscribe to your newsletter.
 

Honey Bunny

Member
They cannot say the performance of the last 0.5GB VRAM is not a big deal when they have intentionally limited access to it for performance reasons. They cannot have their cake and eat it.
 
They cannot say the performance of the last 0.5GB VRAM is not a big deal when they have intentionally limited access to it for performance reasons.

Pretty much. The fact that the cards try really hard to never use more than 3.5GB shows that NVIDIA knew full well that the last 500MB was not performant enough to be accessed regularly.

It's essentially a 3.5GB card with a 500MB overflow buffer. The last 500MB is very obviously not meant to be full-performance VRAM, and instead it seems that it's more like a catch-all buffer before the driver can reallocate things back down below the 3.5GB threshold.

That's not a 4GB card, even though they advertised it as such. People can defend NVIDIA all they want -- it's not 4GB of comparable VRAM. NVIDIA knew this ahead of time, and the driver trying to stay under 3.5GB is proof.

Will this have any practical implications? I can't say. But the point is that NVIDIA marketed it as a fully-fledged 4GB card, and it's clearly not.
 

Bigbillybeef

Neo Member
They never made it clear at any point that any of the VRAM on this card was inferior. No matter the impact on performance, it is disturbing that they thought they could get away with this.

If they'd marketed the cards as 3.5GB cards with a 500MB buffer/backup/overflow memory, etc., and explained the benefits of such a system in terms of price/performance, I would probably have still purchased mine anyway. But now I feel like I've been deceived, despite the fact that I've been perfectly happy with my 970.

It's a shady practice that sets a bad precedent.
 
As it is a consistent issue, it should have already shown up in benchmarks. Nvidia's job is to selectively tell you how great the product is; reviewers should try to show the rest of the picture.
 
Pretty much. The fact that the cards try really hard to never use more than 3.5GB shows that NVIDIA knew full well that the last 500MB was not performant enough to be accessed regularly.

It's essentially a 3.5GB card with a 500MB overflow buffer. The last 500MB is very obviously not meant to be full-performance VRAM, and instead it seems that it's more like a catch-all buffer before the driver can reallocate things back down below the 3.5GB threshold.

That's not a 4GB card, even though they advertised it as such. People can defend NVIDIA all they want -- it's not 4GB of comparable VRAM. NVIDIA knew this ahead of time, and the driver trying to stay under 3.5GB is proof.

Will this have any practical implications? I can't say. But the point is that NVIDIA marketed it as a fully-fledged 4GB card, and it's clearly not.

This pisses me off so much. I've been noticing that something was going on any time VRAM usage went into the 3.5GB range, but had no idea what it was. Not sure if I would have bought the cards knowing this. They are strong performers, but I play at 4K whenever possible and usually have all the other bells and whistles turned on, so this is an issue that I encounter semi-regularly. Considering requesting a refund, tbh.
 

GHG

Gold Member
So basically they have admitted to doing the same thing as they did with the 2GB 660/660 ti cards.

Because there wasn't an uproar about those, they were clearly hoping to get away with it again.

Not so lucky.

These cards should have been 3.5 GB but... Marketing.
 

Zane

Member
This pisses me off so much. I've been noticing that something was going on any time VRAM usage went into the 3.5GB range, but had no idea what it was. Not sure if I would have bought the cards knowing this. They are strong performers, but I play at 4K whenever possible and usually have all the other bells and whistles turned on, so this is an issue that I encounter semi-regularly. Considering requesting a refund, tbh.

Are there any cards on the market or even coming soon that can handle 4K? I thought playable 4K was a ways off, at least for single-GPU systems.
 

SURGEdude

Member
I really wish PC graphics had a third player for gamers (Intel doesn't count). The choice is between a company with space heaters and crappy drivers, and one with a long history of deception and, at times, high prices. Ugh.

Of course I might be a bit biased, since I owned one of those cursed 8600M mobile GPUs that killed my MacBook. Not sure I can be objective about them, to be honest.
 

cheezcake

Member
I'll reserve my outrage for when there's actual evidence of this strongly affecting actual games.

Are there any cards on the market or even coming soon that can handle 4K? I thought playable 4K was a ways off, at least for single-GPU systems.

I've been playing BioShock Infinite maxed at 4K, 55fps avg on my single OC'ed 970.
 

Kinthalis

Banned
Are there any cards on the market or even coming soon that can handle 4K? I thought playable 4K was a ways off, at least for single-GPU systems.

What do you mean by playable? A GTX 980 achieves north of 30 fps on a lot of current-gen games at better-than-console settings at 4K. On some it can push 60, and at console settings that number of games rises.

If you mean 60 fps maxed-out Crysis 3 with full AA, then no. You need SLI for that.
 

Zane

Member
I'll reserve my outrage for when there's actual evidence of this strongly affecting actual games.



I've been playing BioShock Infinite maxed at 4K, 55fps avg on my single OC'ed 970.

What do you mean by playable? A GTX 980 achieves north of 30 fps on a lot of current-gen games at better-than-console settings at 4K. On some it can push 60, and at console settings that number of games rises.

Awesome, didn't know that.
 

Renekton

Member
I'm wondering if CDP will specifically make Witcher 3 fit under 3.5GB :D

So basically they have admitted to doing the same thing as they did with the 2GB 660/660 ti cards.

Because there wasn't an uproar about those then they were clearly hoping to get away with it again.

Not so lucky.
They got away with the 660, the 2GB limit on high-end cards, and the Titan Z. PC performance threads seem to have increasingly more Nvidia issues, but people exclusively harp on AMD drivers. This thread seems to have more people defending Nvidia ("my 2013 game runs 4K fine!!!") than others do. So I think Nvidia will be relatively unscathed.
 
Well, I was asking more in regard to games released in 2014 :)

I hadn't paid attention to that part of your question; yeah, not many there, I suppose. Although Alien: Isolation runs at 60fps at 2715x1679 (2x DSR) and between 45-55fps at 4K on my overclocked 970. I haven't played much of the game yet, though, and haven't encountered the alien yet, so I don't know if that performance would hold. The graphics settings are maxed and even pushed beyond their official limits through ini/cfg tweaks.
 

espher

Member
This is more like you and a friend each got two burgers, fries and a beverage and they all tasted great until your friend exclaims 'Hey, there are no pickles on my second burger!'. You had not noticed up until then, but it turns out your second burger is missing pickles too!

It doesn't ruin the otherwise lovely burgers let alone the entire meal, but the menu clearly stated all burgers made at 'Billy's Burger Palace' have delicious pickles on them right on the front. Hell, you even remember the lady at the drive-in explicitly saying the pickles were on the burgers! This wouldn't frustrate most people to the point they would drive back to the fast food joint and demand a refund, it was relatively cheap anyway, but you're kind of annoyed nonetheless. In fact, you might have actually ordered chicken nuggets had you known this beforehand. At least you can be sure those are 100% chicken! (Or are they?)

Not a perfect analogy by any means either (unless people eat pickles to future proof their body), but let's not act like that 0.5GB vanished into thin air because of this issue. ;)

Yeah, but I really wanted those fucking pickles, because the total package (including the pickles!) was the reason I went to Billy's instead of going to Rad Ian's Burger Place and ordering the $2.90 X Burger? And now I'm skeptical that the next burger I get from Billy's also won't have pickles despite the sign saying there are pickles and that they'll just lie again and tell me I just can't taste them because they're ground into the burger patty or part of the sauce now?

Despite the fact that the burger may have been totally delicious, it looks like I and everyone else that got the burger didn't get pickles, and the burger is supposed to come with pickles. Would we be okay with it not coming with the burger patty? The bun? The cheese? If the answer is 'no' to any of those, it should be 'no' to the pickles as well.

Fuck, now I need to go eat some pickles.

(For what it's worth, I still don't have my 970 yet, because it shipped for me along with the rest of my PC on the 21st, but had I known about this, I likely would have considered a different card or at least waited for more info before pulling the trigger. So I'm not in the "boy this burger was great but now that I know it doesn't have pickles I hate it" camp -- I'm in the "found out on my drive home from the drive thru that they didn't put the right toppings on my burger" camp.)

Edit: As an aside, it's funny to me that there was almost universal outrage over Nintendo not including an adapter w/ the n3DS (selling it alongside as an accessory) but this thread is so split down the middle.
 

SURGEdude

Member
Kinda shady for them to oversell half a gig, but I have a feeling I wouldn't even notice the difference.

What I really want are non-NVIDIA-provided benches. Averages lead to the same shit AMD got into with Mantle. Average frame rate is so beyond useless that it's insulting. How stupid do both GPU makers think PC gamers are?
 