
Nvidia responds to GTX 970 memory issue

Vamphuntr

Member
I don't think there would be a "backlash" if they had explained upfront what was inferior in the card compared to the 980. They explained the lower clock speed, the reduced GPU cores and the disabled SMMs, but strangely skipped the 3.5 GB + 0.5 GB partition, lied about the ROP and L2 cache values, and their claimed peak bandwidth of 224 GB/s is misleading to say the least. It doesn't help that the deception is apparently embedded in the BIOS, as GPU-Z reports the false info.

I'm not on the "let's throw away our GTX 970" boat at all, nor do I feel it's trash, but I feel really cheated. I know the card is still very interesting for the price, but I can't help feeling I got ripped off in a way.
 

potam

Banned
Can't speak for anyone else, but for me, had this information been known, I wouldn't have purchased a 970. It's as simple as that.

I won't go so far as to say that. Like others have said, the current performance of the card still stands, 4GB or no. But, I don't know if I would have ordered 2 on day 1. It would have probably helped me hold off until AMD announced their next cards, though.

I'm not on the "let's throw away our GTX 970" boat at all, nor do I feel it's trash, but I feel really cheated. I know the card is still very interesting for the price, but I can't help feeling I got ripped off in a way.

Just about sums it up for me.
 

newman929

Neo Member
I don't think there would be a "backlash" if they had explained upfront what was inferior in the card compared to the 980. They explained the lower clock speed, the reduced GPU cores and the disabled SMMs, but strangely skipped the 3.5 GB + 0.5 GB partition, lied about the ROP and L2 cache values, and their claimed peak bandwidth of 224 GB/s is misleading to say the least. It doesn't help that the deception is apparently embedded in the BIOS, as GPU-Z reports the false info.

I'm not on the "let's throw away our GTX 970" boat at all, nor do I feel it's trash, but I feel really cheated. I know the card is still very interesting for the price, but I can't help feeling I got ripped off in a way.

I'm with you on this.

Also, if I had known ahead of time I would have just got a 980. Now I'm about 10 days too late to return the 970.
 
Why are all you guys shitting on the peeps who have been cheated out of their money by Nvidia? I don't think anyone bought a 970 to use for a couple of months; they more than likely bought them to use over a few years, and as the years go by, games demanding 4GB of GDDR5 RAM will become more abundant... then we will start to see quite a few differences between 3.5GB and 4GB cards.

I really think we should all be siding with the consumer: firstly because Nvidia have done some underhanded shit here, and secondly so that we can avoid them doing more crap like this in the future. Do we really want to start buying hardware only to take it home and have to take it apart to make sure what we bought was actually what was advertised? I sure as hell have no interest in a future where these billion-dollar corporations can just do whatever the fuck they want because there are no repercussions.
 

potam

Banned
I'm with you on this.

Also, if I had known ahead of time I would have just got a 980. Now I'm about 10 days too late to return the 970.

I haven't begun exploring my options yet (waiting to see if Nvidia offers any official resolution), but if it does get to the point that I feel the need to return them, and Newegg/Nvidia/MSI are unhelpful, I would have no moral objections to issuing a charge back on my credit card. At the end of the day, they intentionally misrepresented the product they sold me.
 

pestul

Member
Full 4GB was a huge reason why I bought the 970. Gaming at 1440p, I wanted a little bit of future proofing. If I knew what I know now, it would have been a toss up with a 290x.
 

Rafterman

Banned
It does not have access to it at the advertised speed. There could most definitely be an argument in court that the last 0.5GB being accessed at 1/8 speed does not make up for advertisement as if it's full speed. Nvidia going "well TECHNICALLY all 4GB is accessible" isn't going to mean shit when it's obvious that the card was marketed without consumers being aware of the handicap on the top 12.5%.
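Taking the numbers in this thread at face value (the advertised 224 GB/s peak, and the claim above that the last 0.5 GB is accessed at roughly 1/8 speed; both are figures from posts here, not official specs), a quick back-of-envelope sketch:

```python
# Back-of-envelope sketch of the segmented-memory claim. All figures are
# taken from posts in this thread, not from official documentation:
# 224 GB/s advertised peak; last 0.5 GB reportedly at ~1/8 speed.
ADVERTISED_GBPS = 224.0
SLOW_FRACTION = 1.0 / 8.0  # reported relative speed of the 0.5 GB segment

fast_pool_gb = 3.5
slow_pool_gb = 0.5

slow_gbps = ADVERTISED_GBPS * SLOW_FRACTION  # bandwidth of the slow segment

# Capacity-weighted average, i.e. if accesses were spread evenly over 4 GB
weighted = (fast_pool_gb * ADVERTISED_GBPS + slow_pool_gb * slow_gbps) / (
    fast_pool_gb + slow_pool_gb
)

print(f"slow segment: {slow_gbps:.1f} GB/s")
print(f"evenly-spread average: {weighted:.1f} GB/s")
```

On these assumptions the slow segment would run at about 28 GB/s, and even an evenly-spread average lands under 200 GB/s rather than the advertised 224.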



People are just going to keep quoting the "3% difference" without understanding how averages work. It's been mentioned again and again. Then you have the people saying current games run fine so deal with it, the old benchmarks are still valid so deal with it, it can technically access 4GB so deal with it, etc.

I've already gone over this, and I agree that it's pretty shady, but saying "I paid for a 4gb card and i didn't get it" isn't accurate.

And then you have people who claim games slow to a crawl and lose 20 fps when accessing that last .5gb, without ever showing a shred of proof. Do you think you are helping by being as wrong on your side of the argument as you think people are on the other side? Do you think exaggerating the problem helps?
 

GHG

Gold Member
I don't think there would be a "backlash" if they had explained upfront what was inferior in the card compared to the 980. They explained the lower clock speed, the reduced GPU cores and the disabled SMMs, but strangely skipped the 3.5 GB + 0.5 GB partition, lied about the ROP and L2 cache values, and their claimed peak bandwidth of 224 GB/s is misleading to say the least. It doesn't help that the deception is apparently embedded in the BIOS, as GPU-Z reports the false info.

I'm not on the "let's throw away our GTX 970" boat at all, nor do I feel it's trash, but I feel really cheated. I know the card is still very interesting for the price, but I can't help feeling I got ripped off in a way.

This 100%.

Nvidia used deception to get extra sales. The moment you start lying about a product's specs you are preventing consumers from making an informed decision. You are selling a product based on a lie.

I refuse to believe this was all a misunderstanding between the tech team and the PR department. The fake specs are written into the BIOS, ffs. Last time I checked, PR is in no way responsible for coding a BIOS.

Nvidia should feel the wrath for this one and I'm mystified as to how anyone can defend them at this point given the recent information that has come to light.
 

Vamphuntr

Member
I really doubt they will get sued successfully over this or that they will do much to appease people. They knew about the peculiar design and properties of the card in the first place, since they designed it, and they were probably very careful about the wording of the specifications on the box and the product page to avoid problems. They were misleading for sure, but I think you are far too optimistic if you are expecting a replacement card or some meaningful compensation out of this.

I do wonder what the manufacturers think of this, though. It's not a "warranty" issue, since the card is performing the way it was designed to, but that design was hidden from consumers, and now MSI, Gigabyte, Asus, Zotac and all the others are stuck with products sold that weren't what people believed they'd bought.
 
I've already gone over this, and I agree that it's pretty shady, but saying "I paid for a 4gb card and i didn't get it" isn't accurate.

I paid for a single pool 4GB card that runs at the specified bandwidth at all times on the specified bus.

Why would I expect to get anything else when the manufacturer never stated at the time of purchase that this is NOT what I would be receiving?

How's that? I've simply worded it differently, but you know full well that this is what people mean when they say they paid for 4GB and didn't receive it.
 

DarkJC

Member
How can they not be offering refunds for this? They effectively lied about the specs of the card. If I had bought one I would've been pissed, even if the performance difference is negligible. They essentially pulled a fast one to make it seem like more of a deal than it actually is.
 

Serandur

Member
This 100%.

Nvidia used deception to get extra sales. The moment you start lying about a product's specs you are preventing consumers from making an informed decision. You are selling a product based on a lie.

I refuse to believe this was all a misunderstanding between the tech team and the PR department. The fake specs are written into the BIOS, ffs. Last time I checked, PR is in no way responsible for coding a BIOS.

Nvidia should feel the wrath for this one and I'm mystified as to how anyone can defend them at this point given the recent information that has come to light.

This constitutes fraud, as they intentionally misrepresented the specifications of the product for their own gain. Though it might be difficult to pursue successfully in a court of law, because the issue is easily misunderstood and intent must be proven, this situation is not only unethical but potentially unlawful as well.

They're just trying to cover their ass by saying it was a communication error, but that's almost certainly BS.
 

potam

Banned
[image: waUNUhR.png]
 

DarkJC

Member
This constitutes fraud, as they intentionally misrepresented the specifications of the product for their own gain. Though it might be difficult to pursue successfully in a court of law, because the issue is easily misunderstood and intent must be proven, this situation by Nvidia is not only unethical but potentially unlawful as well.

They're just trying to cover their ass by saying it was a communication error, but that's almost certainly BS.

When GPU-Z reports the wrong specs, it's definitely BS. It's not a simple marketing mistake when the BIOS is misrepresenting the specs of the card.
 

GHG

Gold Member
I can't believe what I'm reading in this thread...

The card is advertised as a full 4GB card. But what we have come to understand is that only 3.5 GB of that 4GB can be utilised at the advertised bandwidth. When the last .5 GB is accessed it can cause the performance to drop.

The biggest thing to understand here is that as a user you cannot control whether the card accesses that last .5 GB or not. Therefore you are at risk of your performance dropping inexplicably at any moment while gaming. You cannot have certainty of performance stability when you are pushing the 3.5GB VRAM ceiling.

Now if we had known this from the start then fine, people would make a decision whether it is worth the gamble or not. They would calculate on average whether they would need a proper 4GB card based on the type of gaming they do and then choose whether or not they buy this card.

But no. Nvidia took that choice away from the consumer. Now, I don't care how much you like your 970, whether you are still OK with your card or not, or how much you might like Nvidia to shit in your mouth: that is unacceptable.

I would like to see what would go down if Sony or Microsoft tried to pull something similar and deceived consumers about their consoles' respective specifications.
 
oh then you're good to sit and wait. by then i'm sure it'll be a less muddy picture.



both of the high-performing games i tested personally would see a solid benefit from the 980 over the 970 at 1080p (Shadow of Mordor and Far Cry 4). there's evidence that quite a few others would see non-trivial increases @ 1080p as well: COD:AW, AC:Unity, Star Citizen was mentioned; pretty much anything that can crank up the mem usage. Assuming Mad Season is in the US, the price difference is a lot less hefty than it is in Canada (250CDN upgrade from the MSI 4G 970 ($420) to the 980 ($670)). there's also good reason to believe that even though the 970 is performing well now, a key flaw has been exposed so early in its life and is only going to get more glaring as more games push the limit on textures and VRAM usage. price/performance, the 970 is the way to go, but for just that extra ~$200 you're getting a beast of a card in the 980 that's good for at least a couple of years, for ~500USD.
still think a 970 is a good upgrade over 7950 3GB?
 

bootski

Member
How come frametime spikes to 50ms (20FPS) but framerate stays at 60FPS (16.7ms)?

good question. i've seen frametime brought up a few times as the true measure of this issue but i don't understand why. frametime is just the inverse (in ms) of the framerate, i.e. milliseconds per frame (ms/frame):

Code:
ms/s ÷ frames/s = ms/s × s/frame = ms/frame

so for an fps of 20:
frametime = 1000 ms/s × 1 s/20 frames
          = 50 ms/frame

likewise fps 60 = 1000 ms/s × 1 s/60 frames
          = 16.7 ms/frame

as you can see, those values are exactly on point.
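the whole conversion is one reciprocal; as a quick python sketch:

```python
# frametime in ms is just the reciprocal of framerate, scaled to ms:
# 1000 ms/s divided by frames/s gives ms/frame (and vice versa).
def fps_to_frametime_ms(fps: float) -> float:
    return 1000.0 / fps

def frametime_ms_to_fps(ms: float) -> float:
    return 1000.0 / ms

print(fps_to_frametime_ms(20))            # 50.0 ms/frame
print(round(fps_to_frametime_ms(60), 1))  # 16.7 ms/frame
```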

EDIT: i totally misunderstood your question. sorry for the elaborate reply. the framerate spikes that you see are at least PARTIALLY explained by different things popping up on the screen or menu access. not all of them mind you, but when i access a menu, talk to an npc, trigger a cutscene and a number of other actions, the framerate does that spike.

Your max memory usage is 50 megs over 3.5 gb (3626/1024). You're still mostly within the 3.5 GB limit.

you understand that the VRAM itself is not actually a different speed right? the ram chip that's being accessed is of the exact same quality as the rest of the memory on the card. the method of accessing is what's at issue here. ANY access into that partition should show at least SOME notable decrease in performance if your theory of a 15-20fps drop held any water, which it doesn't.

now that we cleared that up, i played another hour or so of SoM and ran into the same issue as before with the weird stutters. i THINK it's caused by the card swapping out memory to try to stay under the 3.5GiB mark, similar to what would happen on a functioning card if you maxed out its memory (say a 980 with the VRAM pegged at 4096MiB). to show it a bit more clearly, i zoomed in on the mem usage portion of the graph. the dips fall fairly well in line with the times i experienced the graphical anomalies. i had the afterburner OSD running so i was able to see it happening in realtime.

[image: som1080pbus.PNG]

note: not all the dips in mem usage are caused by the card; some happened when i triggered cutscenes while others happened when i accessed the menu. in fact, i think it was the little dips more than the big ones that occurred at the same time as my gfx issues. edit: i should also note that this just goes to further show that the 970 is in essence a 3.5GiB card, or at least wants to be one. lol.

still think a 970 is a good upgrade over 7950 3GB?
i wouldn't. it's an upgrade for sure, but a 7950 is still a pretty damn capable card and spending 300+USD for what would amount to a few settings increases wouldn't seem worth it to me. depends on you though. i personally don't upgrade very often, which is why i'm not as laissez-faire about it as you'll notice some others are.
 

TSM

Member
you understand that the VRAM itself is not actually a different speed right? the ram chip that's being accessed is of the exact same quality as the rest of the memory on the card. the method of accessing is what's at issue here. ANY access into that partition should show at least SOME notable decrease in performance if your theory of a 15-20fps drop held any water, which it doesn't.

The way I read what they said is that they don't use the .5GB like they do the other 3.5GB. They use it as a third pool of RAM that they can cache stuff to instead of dragging it across the system bus. I wouldn't think they'd actually use it for anything but local cache. When they need data that's neither in the .5GB cache nor the 3.5GB pool, they drag it across the system bus, which is probably what you are seeing. If this is what they are doing, then the card is in fact much better than a 3.5GB card, but not as good as a full 4GB card would be.
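Nobody outside Nvidia knows how the driver actually manages the two segments, so purely to illustrate that reading (fast 3.5GB pool, slower 0.5GB cache, system bus as the last resort), here's a toy sketch; the names and relative costs are entirely made up:

```python
# Toy illustration of the three-tier lookup described above. None of this
# is Nvidia's actual driver logic; the relative costs are invented purely
# to show the ordering: fast pool < slow cache < system bus.
FAST_POOL_COST = 1    # 3.5 GB segment with full crossbar access
SLOW_CACHE_COST = 8   # 0.5 GB segment, reportedly ~1/8 speed
SYSTEM_BUS_COST = 40  # fetching over PCIe from system RAM, slowest

def access_cost(resource: str, fast_pool: set, slow_cache: set) -> int:
    """Relative cost of fetching `resource` under this toy model."""
    if resource in fast_pool:
        return FAST_POOL_COST
    if resource in slow_cache:
        return SLOW_CACHE_COST  # still much cheaper than the system bus
    return SYSTEM_BUS_COST      # missed both pools: drag it across PCIe

fast = {"texture_a", "texture_b"}
slow = {"texture_c"}
print(access_cost("texture_a", fast, slow))  # hits the fast pool
print(access_cost("texture_c", fast, slow))  # hits the slow cache
print(access_cost("texture_z", fast, slow))  # goes over the system bus
```

If this reading is right, the 0.5GB segment makes the card strictly better than a plain 3.5GB card, since every hit there avoids a far more expensive trip across the bus.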
 

Fularu

Banned
OK, so you don't believe his tests, like the ones he linked to where his memory usage goes much more into that last 500 megs and it doesn't "slow to a crawl" (now it's not going into the last 500 enough? *dead). You don't believe the various other tests people have done (like here, here, here, or here) or the conclusions from the tech sites today (lol bought off by nvidia amirite lolol), and everyone else that posts anything that isn't sufficiently negative and supporting the "slows to a crawl" bullshit is some corporate ball<verb>er.

Shitposts like that cloud the fact an actual issue does in fact exist. "Slows to a crawl". Hah. Give me a fucking break.
Look, whenever the card taps into the last 500 megs it starts to stutter. Some notice it and find it jarring, others don't. I do, and I thought *my card* had a problem and was trying to troubleshoot it or return it to NCIX. Now I understand why it happens.

I don't care if it maintains 60 fps 97% of the time, when it doesn't and dips and stutters, it bugs the hell out of me, it crawls for a split second and then resumes work.

When I bought this card, I bought a 4 gb unified memory card that would stand the test of the next 3 years. Not one that shows a significant flaw within weeks of release (and purchase) on the hardware level.
 

potam

Banned
Fuck it, if Newegg is going to side with the manufacturer and not the consumer, I don't have any qualms about issuing a charge back on my credit card.

http://kb.newegg.com/FAQ/Article/1729

I'm guessing they're getting hammered by people asking about this. The CSR had this ready for me:

Agent Sally W.: Thank you for holding. What you mentioned is not the fact. The GeForce GTX 970 is equipped with 4GB of dedicated graphics memory. However the 970 has a different configuration of SMs than the 980, and fewer crossbar resources to the memory system. To optimally manage memory traffic in this configuration, we segment graphics memory into a 3.5GB section and a 0.5GB section.
 

Abounder

Banned
Pretty disappointing. They should at least throw in some free game codes but that really isn't worth much to those with full rebuilds like my brother. I knew the 970 was too good to be true :(
 

daninthemix

Member
Fuck it, if Newegg is going to side with the manufacturer and not the consumer, I don't have any qualms about issuing a charge back on my credit card.

http://kb.newegg.com/FAQ/Article/1729

I'm guessing they're getting hammered by people asking about this. The CSR had this ready for me:

Thing is, Nvidia is in the wrong here. I can understand why a retailer would be reticent to take on the cost (admin/manpower) of accepting returns en masse for a misrepresented product.

Someone with a mind for law should look into whether a class action is feasible.
 

potam

Banned
Thing is, Nvidia is in the wrong here. I can understand why a retailer would be reticent to take on the cost (admin/manpower) of accepting returns en masse for a misrepresented product.

Someone with a mind for law should look into whether a class action is feasible.

Oh, I know. I even repeated multiple times that I realized it was Nvidia's fault and not theirs. I'm still going to give Nvidia some time to respond, and I may call up MSI tomorrow. I only contacted Newegg first since if they were willing to do the refund, that would solve the problem right there.

With that being said, I don't appreciate that Newegg already has a canned response and a KB page set up just regurgitating the info Nvidia released. The fact that they did those two things tells me this issue is gaining some traction with the community, though.
 

TSM

Member
Fuck it, if Newegg is going to side with the manufacturer and not the consumer, I don't have any qualms about issuing a charge back on my credit card.

http://kb.newegg.com/FAQ/Article/1729

I'm guessing they're getting hammered by people asking about this. The CSR had this ready for me:

You are attacking it from the wrong angle. They advertised 4GB and it has 4GB on board. As long as they are utilizing all 4GB of RAM in some manner, it's a 4GB card. While this is shady, it's not false advertising. The real lies are the 56 ROPs and 1792KB of L2 cache, which their engineers clearly knew about. That's where the false advertising lies.
 

mugwhump

Member
I am upset

I definitely would have thought twice about returning my just-purchased 290 for a 970 if I'd known this.

It doesn't help that apparently the deception is embedded in the bios as GPUZ reports the false info.

I thought gpuz used spec sheets for some things?
 

Serandur

Member
Thing is, Nvidia is in the wrong here. I can understand why a retailer would be reticent to take on the cost (admin/manpower) of accepting returns en masse for a misrepresented product.

Someone with a mind for law should look into whether a class action is feasible.

It should be. This is straight-up fraud, Nvidia have all but admitted it themselves.
 

SURGEdude

Member
This 100%.

Nvidia used deception to get extra sales. The moment you start lying about a product's specs you are preventing consumers from making an informed decision. You are selling a product based on a lie.

I refuse to believe this was all a misunderstanding between the tech team and the PR department. The fake specs are written into the BIOS, ffs. Last time I checked, PR is in no way responsible for coding a BIOS.

Nvidia should feel the wrath for this one and I'm mystified as to how anyone can defend them at this point given the recent information that has come to light.

This is my thought as well. It isn't so much the potential (likely minor) impact, but the lying and deception to increase sales. AMD would be wise to use this for their marketing benefit.
 

potam

Banned
You are attacking it from the wrong angle. They advertised 4GB and it has 4GB on board. As long as they are utilizing all 4GB of RAM in some manner, it's a 4GB card. While this is shady, it's not false advertising. The real lies are the 56 ROPs and 1792KB of L2 cache, which their engineers clearly knew about. That's where the false advertising lies.

Yeah, you may be right, but I feel like the 4GB shit is just semantics at this point.
 

bootski

Member
Look, whenever the card taps into the last 500 megs it starts to stutter. Some notice it and find it jarring, others don't. I do, and I thought *my card* had a problem and was trying to troubleshoot it or return it to NCIX. Now I understand why it happens.

I don't care if it maintains 60 fps 97% of the time, when it doesn't and dips and stutters, it bugs the hell out of me, it crawls for a split second and then resumes work.

When I bought this card, I bought a 4 gb unified memory card that would stand the test of the next 3 years. Not one that shows a significant flaw within weeks of release (and purchase) on the hardware level.

i hear you man. i'm in the same boat. i called this evening and held a 980 for my inevitable return of this card. after seeing how it responded in SoM, i can't really justify spending 420CDN on something that's so inherently flawed.


Fuck it, if Newegg is going to side with the manufacturer and not the consumer, I don't have any qualms about issuing a charge back on my credit card.

http://kb.newegg.com/FAQ/Article/1729

I'm guessing they're getting hammered by people asking about this. The CSR had this ready for me:

sorry man. that's really shitty to hear. newegg used to be the top dog in service for computer stuff. whatever happened to them?

Thing is, Nvidia is in the wrong here. I can understand why a retailer would be reticent to take on the cost (admin/manpower) of accepting returns en masse for a misrepresented product.

Someone with a mind for law should look into whether a class action is feasible.

what they're doing is wrong but i don't know that it was LEGALLY wrong or fraudulent. their response to this has been quite unexpected. i bought this card on friday fully knowing about this issue and thinking that nvidia would be quick to address it, considering the people who have picked up 970s so far would fall into the enthusiast crowd.

instead, the explanation we got from the SVP of GPU Engineering was a technical version of "that's the way it's meant to be played".
 

Renekton

Member
Thing is, Nvidia is in the wrong here. I can understand why a retailer would be reticent to take on the cost (admin/manpower) of accepting returns en masse for a misrepresented product.

Someone with a mind for law should look into whether a class action is feasible.
Good luck with that though.

[image: gtKYxW6.jpg]
 

XBP

Member
good question. i've seen frametime brought up a few times as the true measure of this issue but i don't understand why. frametime is just the inverse (in ms) of the framerate, i.e. milliseconds per frame (ms/frame):

Code:
ms/s ÷ frames/s = ms/s × s/frame = ms/frame

so for an fps of 20:
frametime = 1000 ms/s × 1 s/20 frames
          = 50 ms/frame

likewise fps 60 = 1000 ms/s × 1 s/60 frames
          = 16.7 ms/frame

as you can see, those values are exactly on point.

EDIT: i totally misunderstood your question. sorry for the elaborate reply. the framerate spikes that you see are at least PARTIALLY explained by different things popping up on the screen or menu access. not all of them mind you, but when i access a menu, talk to an npc, trigger a cutscene and a number of other actions, the framerate does that spike.



you understand that the VRAM itself is not actually a different speed right? the ram chip that's being accessed is of the exact same quality as the rest of the memory on the card. the method of accessing is what's at issue here. ANY access into that partition should show at least SOME notable decrease in performance if your theory of a 15-20fps drop held any water, which it doesn't.

now that we cleared that up, i played another hour or so of SoM and ran into the same issue as before with the weird stutters. i THINK it's caused by the card swapping out memory to try to stay under the 3.5GiB mark, similar to what would happen on a functioning card if you maxed out its memory (say a 980 with the VRAM pegged at 4096MiB). to show it a bit more clearly, i zoomed in on the mem usage portion of the graph. the dips fall fairly well in line with the times i experienced the graphical anomalies. i had the afterburner OSD running so i was able to see it happening in realtime.

[image: som1080pbus.PNG]

note: not all the dips in mem usage are caused by the card; some happened when i triggered cutscenes while others happened when i accessed the menu. in fact, i think it was the little dips more than the big ones that occurred at the same time as my gfx issues. edit: i should also note that this just goes to further show that the 970 is in essence a 3.5GiB card, or at least wants to be one. lol.

I just ran SOM and checked my frame rates + frame times. I'm not sure why but my frame rate isn't dropping below 60 as much as yours is. I only saw a single huge drop to 48 fps once and it stayed above 60 throughout the run. There was a lot of fluctuation between 70 to 85 though (with occasional drops to 60s). This is at 1080p with everything at ultra.

[image: NrHsAjk.png]


(the huge drop in the middle is me minimizing the game to desktop)
 

Bastables

Member
You are attacking it from the wrong angle. They advertised 4GB and it has 4GB on board. As long as they are utilizing all 4GB of RAM in some manner, it's a 4GB card. While this is shady, it's not false advertising. The real lies are the 56 ROPs and 1792KB of L2 cache, which their engineers clearly knew about. That's where the false advertising lies.

It's a pig in a poke. I'm semi-surprised that, in spite of being sold a misrepresented product, some people are arguing that dog/cat meat is just as good as the notional piglet initially advertised.
 

TSM

Member
It's a pig in a poke. I'm semi-surprised that, in spite of being sold a misrepresented product, some people are arguing that dog/cat meat is just as good as the notional piglet initially advertised.

No, I'm just saying that I don't think anyone is going to be successful with most of the retailers since it's up to nvidia how they allocate the ram on their video card. Nvidia just tells them that it has 4GB and they use all 4GB. The lies about the number of ROPs and L2 cache on the other hand are blatant falsifications on nvidia's part and should be actionable.

Also nvidia has a history of doing this with the on board memory:
This is not the first time that NVIDIA has used interesting memory techniques to adjust the performance characteristics of a card. The GTX 550 Ti and the GTX 660 Ti both used unbalanced memory configurations, allowing a GPU with a 192-bit memory bus to access 2GB. This also required some specific balancing on NVIDIA's side to make sure that the 64-bit portion of the GPU's memory controller with double the memory of the other two didn't weigh memory throughput down in the 1.5 GB to 2.0 GB range. NVIDIA succeeded there, and the GTX 660 Ti was one of the company's most successful products of the generation.
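The unbalanced layout that quote describes can be sketched numerically. Using the GTX 660 Ti's 192-bit/2GB configuration as the example (the 512/512/1024 split per controller is the commonly reported arrangement, not an official breakdown):

```python
# Sketch of an "unbalanced" memory configuration like the GTX 660 Ti's:
# a 192-bit bus is three 64-bit controllers, which would normally carry
# 512 MB each (1.5 GB total). To reach 2 GB, one controller carries double.
controllers_mb = [512, 512, 1024]  # assumed per-controller memory split

bus_width_bits = 64 * len(controllers_mb)
total_gb = sum(controllers_mb) / 1024

# Only the region striped across all controllers runs at full bus width;
# anything above it lives on the single double-loaded 64-bit controller.
full_width_gb = min(controllers_mb) * len(controllers_mb) / 1024

print(f"{bus_width_bits}-bit bus, {total_gb} GB total")
print(f"full-width region: {full_width_gb} GB")
```

So on that layout the first 1.5 GB is striped at full width, and the 1.5-2.0 GB range sits on one 64-bit controller, which is exactly the balancing problem the quote mentions.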
 

Bastables

Member
No, I'm just saying that I don't think anyone is going to be successful with most of the retailers since it's up to nvidia how they allocate the ram on their video card. Nvidia just tells them that it has 4GB and they use all 4GB. The lies about the number of ROPs and L2 cache on the other hand are blatant falsifications on nvidia's part and should be actionable.

Also nvidia has a history of doing this with the on board memory:

I agree with you; it's much easier to point to the actual downgrade in the ROP and L2 cache figures. They can fudge that 4 gigs is four gigs even if it's restricted to 3.5 with a slower .5 gig cache. I still think this is a really clear example of being sold a pig in a poke, though, by a proprietor with a history of misrepresenting what they're selling.
 

Reallink

Member
You are attacking it from the wrong angle. They advertised 4GB and it has 4GB on board. As long as they are utilizing all the 4GB of ram in some manner then it's a 4GB card. While this is shady, it's not false advertising. The real lies are the 56 ROPs and 1792k of L2 cache which their engineers clearly knew about. That's where the false advertising lies.

By shady do you mean misleading? Because there is an allowance in false-advertising claims for exactly this kind of half-truth with intent to mislead. Nvidia are fucked on this no matter how you slice it.
 
I have been catching up with all this.

The card has been performing brilliantly and my opinion hasn't been skewed on this. Like bootski, I spent a bit of time running Shadow of Mordor to see what happened when the 3.5GB limit was exceeded. I found it incredibly hard to get the game to go over the limit. The card seemingly does not want to give up the additional RAM all that readily.

Prior to the recent reports of a potential issue with the VRAM I hadn't noticed anything untoward. I got occasional stutters in Mordor but it was quite rare and the game never felt unduly affected by this. It is open-world so I expect an odd hitch here-and-there.

What bothers me is that the card's future performance will be degraded by some percentage once games do start routinely using more than 3.5GB. Going into the future: what real-world gaming performance hit will this card suffer compared to the 980 because of the segmented RAM?

I know there isn't any such thing as future proof but we have all bought into a new card that we expect to last a bit of time. I suppose only time will tell as to how much this segmented RAM affects games compared to if it was a straight 4GB amount.

I also like the nice little touch of 64 ROPs being reported in GPU-Z. How is this a marketing mistake?
 

kraspkibble

Permabanned.
I think I'll pass on the 970.

The 960 is a joke and the 980 is simply out of my price range.

I'll wait for the AMD 300 cards.
 
Why did Nvidia try to cut corners like this, though? Would VRAM partitioning save them that much money?

It's actually not cutting corners; it's solving a problem in GPU design that arises when you harvest defective dies and sell them as cut-down parts. The solution probably needs some refinement, but the concept actually required additional development work to implement.

The standard method of cutting down the GM204 would have resulted in a part with a 192-bit memory bus and 3GB of RAM. What Nvidia did was come up with a way to make the card still have a 256-bit bus and 4GB of RAM, but doing so required a shared-access trick to reach the portion of memory that is physically cut off from the crossbar.
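To put numbers on that trade-off (assuming 7 GHz effective GDDR5, which is the 970's shipping memory clock), peak bandwidth is simply bus width times data rate, so the conventional 192-bit cut would have surrendered a quarter of the headline figure:

```python
def peak_bw_gbs(bus_bits, effective_gbps=7.0):
    """Peak memory bandwidth in GB/s: bus width in bits x data rate / 8."""
    return bus_bits / 8 * effective_gbps

print(peak_bw_gbs(256))  # 224.0 -- the figure Nvidia advertised
print(peak_bw_gbs(192))  # 168.0 -- the conventional cut-down alternative
```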

They do need to work on the shared access so that it doesn't tank the performance of the RAM that cannot directly reach the crossbar by some 90%, but with software load balancing they are already trying to minimize the performance loss.
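As a toy illustration (not how the actual driver heuristics work), that load balancing amounts to an allocator which exhausts the fast 3.5GB segment before spilling anything into the slow 0.5GB one:

```python
class SegmentedVram:
    """Toy allocator: fill the fast 3.5GB segment before the slow 0.5GB."""

    def __init__(self, fast_mib=3584, slow_mib=512):
        self.fast_free = fast_mib
        self.slow_free = slow_mib

    def alloc(self, size_mib):
        """Return the segment an allocation lands in, or None when full."""
        if size_mib <= self.fast_free:
            self.fast_free -= size_mib
            return "fast"
        if size_mib <= self.slow_free:
            self.slow_free -= size_mib
            return "slow"
        return None

vram = SegmentedVram()
print(vram.alloc(3500))  # fast
print(vram.alloc(400))   # slow -- the fast segment is nearly exhausted
print(vram.alloc(200))   # None -- neither segment has room left
```

In practice the driver would also weigh how hot a resource is, keeping frequently touched buffers out of the slow segment entirely.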

If anything it's a very clever solution to an old problem in GPU design, but hiding what was going on and lying about it to customers was really not a good idea. If Nvidia had come out and said up front, "Look, this is what we did, here are the benchmarks to prove it works fine, oh, and the card is $330, now go ham," I would imagine the vast majority of 970 owners would still have bought the thing. Now they are looking down the barrel of the legal gun, and it was completely unnecessary to have hidden it from the public in the first place. No one denies the card delivers amazing performance for $330, so why the skulduggery, Nvidia?
 

filly

Member
So, so sneaky. Has any other graphics card ever done this kind of thing? Every single person who has bought a 970 never questioned that all 4GB are running at full speed. It is an assumption, but it is a very fair assumption, as 99.9% of the time when we buy RAM in any form it isn't being stepped.

It's like someone selling you a 100m swimming pool that has been filled with water, only to find the last 15m are unswimmable: it's a thick gloop and it makes no logical sense. It's unprecedented. You might not be able to swim 85m right now, but at some point you would love to make the full 100m, because that is really, in many ways, why you get a beast of a card: you want to see that game that really uses it for every bit of power it has.

I'm surprised Nvidia are that stingy. How much could it possibly cost for the last bit of RAM to have parity with the rest? My benchmarks are great and it's been a great card, but only so far. I bought it because achieving this much above consoles means a good future, but the ceiling now seems significantly lower.

Does anyone think that now we are going to get packaging reading "x GB of GDDR5 running at X MHz across ALL CHIPS"? It shouldn't be needed; assumptions are dangerous, but in this case we are completely correct to assume such things. This single act makes me lose my faith in Nvidia and makes me question what else they are hiding.

Also, did devs know this? How could they not? They must have come across it during development when optimizing. For such a mainstream high-end card, if Nvidia didn't tell them that would have been out of order, wasting their time. Did they just send their Nvidia optimization team in to make the games work when devs complained about hitting a ceiling? Or did the devs know and not say anything? Surely one would have said something by now.
 

Chozolore

Member
So, so sneaky. Has any other graphics card ever done this kind of thing? Every single person who has bought a 970 never questioned that all 4GB are running at full speed. It is an assumption, but it is a very fair assumption, as 99.9% of the time when we buy RAM in any form it isn't being stepped.

It's like someone selling you a 100m swimming pool that has been filled with water, only to find the last 15m are unswimmable: it's a thick gloop and it makes no logical sense. It's unprecedented. You might not be able to swim 85m right now, but at some point you would love to make the full 100m, because that is really, in many ways, why you get a beast of a card: you want to see that game that really uses it for every bit of power it has.

I'm surprised Nvidia are that stingy. How much could it possibly cost for the last bit of RAM to have parity with the rest? My benchmarks are great and it's been a great card, but only so far. I bought it because achieving this much above consoles means a good future, but the ceiling now seems significantly lower.

Does anyone think that now we are going to get packaging reading "x GB of GDDR5 running at X MHz across ALL CHIPS"? It shouldn't be needed; assumptions are dangerous, but in this case we are completely correct to assume such things. This single act makes me lose my faith in Nvidia and makes me question what else they are hiding.

Also, did devs know this? How could they not? They must have come across it during development when optimizing. For such a mainstream high-end card, if Nvidia didn't tell them that would have been out of order, wasting their time. Did they just send their Nvidia optimization team in to make the games work when devs complained about hitting a ceiling? Or did the devs know and not say anything? Surely one would have said something by now.

Read the post above yours.
 

filly

Member
Read the post above yours.

Unknown Soldier's post is very insightful and intelligent. That being the case, it only moots my point about cost: it definitely seems like the upgrade to a full-speed 4GB would have cost Nvidia significantly more.

I didn't think I bought a card with 87.5% of its RAM functioning at full speed. I thought I bought a card with 100% of its RAM functioning at full speed, and as that is not the case I have good reason to be disappointed.
 

espher

Member
If Nvidia had come out and said up front look this is what we did, here are the benchmarks to prove it works fine, oh yeah the card is $330 now go ham I would imagine the vast majority of 970 owners would still have bought the thing. Now they are looking down the barrel of the legal gun and it was completely unnecessary to have hidden it from the public in the first place. No one denies the card delivers amazing performance for $330, so why the skulduggery Nvidia?

I'd have no beef if all the cards were on the table.

It's not a bad card; the performance (within the newly known parameters) is fantastic for the value. My beef is with the intentional misrepresentation of specs. Or unintentional, if you want to take that part of the story at face value.

It's like having someone lie to you about something entirely innocuous. You have to wonder why they'd even bother, and then you may be apt to wonder what else they may be lying about if they felt they had to lie about that. Trust is shot, even if there is otherwise 'no damage done'.
 
Have people here read the most recent PcPer article and video on the issue? I thought it was pretty good. I look forward to more testing being done though.
 