
Semiaccurate.com: Nvidia is screwed. Again. (Fermi related)

Chittagong said:
Makes me wanna buy a Fermi just because of how expensive it is to manufacture. The rare times you get to take money from a corporation (the launch PS3 being the other one).

If the goal is to make the company lose money, wouldn't it be wiser not to buy the product and have it continue to sit on the shelf?

Eh, just a thought.
 
If Fermi is still in single-digit yields, it's going to cost way too much and have a tiny profit margin if it is coming out in a month or two. It probably won't be out until summer, at which point ATI can announce their HD6000 series for an August release. nVidia is basically skipping a generation.
 
Lonely1 said:
Even if you are ATI only, it's in your best interest to have a strong Nvidia (cheaper prices)...
Exactly. The prices on 5xxx cards aren't going to go down if there's no competition, and there's no reason to roll out the next series yet either.
 
gofreak said:
A stopped clock is right twice a day.

Charlie is permanently 'stopped' in anti-nvidia mode. So obviously when times are hard for nvidia he's going to look vindicated.

Even news aggregators have marked his card: he is sometimes listed under 'satire'. Friends don't link friends to his nvidia stories :| His info may (or may not) be right, but I'd prefer to wait and get it in a way that isn't filtered through his extreme fanboyism.

Not to mention that his site is plastered with ATI ads...
 
SapientWolf said:
Exactly. The prices on 5xxx cards aren't going to go down if there's no competition, and there's no reason to roll out the next series yet either.

Yup, they've come down slightly, but are still about 10% over MSRP. Shame. Hopefully things pick up soon, either in large sales due to people not wanting to wait (which may cause a slight decrease) or ideally, through nVidia providing more competition.

nVidia's still got quite the lead, so this is good in a way for ATI, but I really have been hoping to see Fermi benchmarks and find out what its pricing is going to be.
 
rocK` said:
Nvidia/Intel in a race for GPUs, created Fermi, a supposedly strong processor (?) which doesn't deliver? Intel was aware of its limitations and didn't spout off about 'taking it to the next level' like Nvidia did... is that somewhat accurate?

Nope, that's not what the article is talking about.

All semiconductor chips (CPUs, GPUs, whatever) are built on wafers. Once a wafer is finished, it is cut into individual dies, and each die is tested to see whether it's viable for production.

[Image: ICC_2008_Poland_Silicon_Wafer_1_edit.png — a silicon wafer]


The problem being reported is that very few units are good enough to be released to the market. Basically, when a finished wafer of the new Fermi GPUs is tested, very few dies come out "error free". A low yield (the article says fewer than 10% are good) means high prices and very few units on the market. At that rate they'd actually lose money on every Fermi GPU built.
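To put some rough numbers on it, here's a toy sketch of the classic Poisson yield model. The die areas are ballpark figures commonly cited for GF100 and Cypress, and the defect density is purely a made-up assumption for an immature process — none of this comes from the article:

```python
import math

def poisson_yield(die_area_mm2, defect_density_per_cm2):
    """Fraction of defect-free dies under the simple Poisson yield model."""
    die_area_cm2 = die_area_mm2 / 100.0
    return math.exp(-defect_density_per_cm2 * die_area_cm2)

# Illustrative numbers only: ~530 mm^2 for GF100 vs ~334 mm^2 for Cypress,
# with an assumed defect density for an immature 40nm process.
d0 = 0.5  # defects per cm^2 (pure assumption)
print(f"big die  (~530 mm^2): {poisson_yield(530, d0):.0%}")
print(f"small die (~334 mm^2): {poisson_yield(334, d0):.0%}")
```

Under those made-up assumptions the big die lands in single-digit yield territory while the smaller one is several times better, which is the whole shape of the story.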

Btw, the last paragraph in the OP is pure gratuitous hate.
 
Bad news for nvidia. But is there really a rush to grab a hold of a DX11 card? I thought the main reason the latest ATI cards were selling well was their performance for the price.
 
confused said:
Glad I jumped on the 5850 and didn't hold out for the Fermi.

I did the same on my new build, but I can't get any drivers to actually work for my 5850. I just keep getting crashes and BSOD with all of the drivers.
 
BravoSuperStar said:
Read this a few days ago. Taking it with a grain of salt considering the author. The "Laughabee" comment was funny though.

Despite people saying he's one of the most accurate (and most horribly biased) news sources for rumors in the past year, I agree, taking it with a grain of salt is probably best. I mean no matter what your take, at worst you have to wait a little under two months to find out for sure one way or the other, so it's just a matter of weeks before we know what's going on for real.

DX11 is supposed to give a small frame rate increase over DX10 I believe (if nothing else), but I don't think there's ever been a DX version that was relevant before its successor came out.

I've never had any BSODs or gray screens or icon problems or anything with my 5870, I'd probably try to RMA the card if I had such troubles though.
 
Lostconfused said:
Bad news for nvidia. But is there really a rush to grab a hold of a DX11 card? I thought the main reason the latest ATI card were selling well was because of their performance for the price.

Well yes, the performance really is impressive for the price.

I bought a 5870 3 months ago (together with an i5 750), and I have never been happier. Bought it for the performance/price, not the DX11, same as when I got the 8800 in 2007. It runs Battlefield: Bad Company 2 smooth as butter at 1920x1200, 8x adaptive multisample AA, 16x aniso.
 
Son of Godzilla said:
The whole concept of fabrication yields is really fascinating. Sometimes it's just easy to take technology for granted without realizing how truly complicated things are.


Hundreds of millions of devices on a single chip; I'm amazed these things get produced at a consumer level at all.
 
Minsc said:
Despite people saying he's one of the most accurate (and most horribly biased) news sources for rumors in the past year, I agree, taking it with a grain of salt is probably best. I mean no matter what your take, at worst you have to wait a little under two months to find out for sure one way or the other, so it's just a matter of weeks before we know what's going on for real.

DX11 is supposed to give a small frame rate increase over DX10 I believe (if nothing else), but I don't think there's ever been a DX version that was relevant before its successor came out.

I've never had any BSODs or gray screens or icon problems or anything with my 5870, I'd probably try to RMA the card if I had such troubles though.


Yeah I think there is enough technical info and the fact the chip has been delayed to merit some credibility. I would hate to see nvidia end up in the worst case scenario presented in the article. And I haven't had any issues with my 5850 either or it would have been returned.
 
Burger said:
Single digit wafer yields ?

Oh snap.

*edit* As the owner of a computer running a nVidia chip which is known to be failure prone (8600M) and nVidia's complete 'there is nothing wrong' attitude to the whole bad bump fiasco, I hope they get smashed by the competition.

Ugh the 8600M. The company I work for has many, many failed laptops because of this chipset.
 
nacire said:
I did the same on my new build, but I can't get any drivers to actually work for my 5850. I just keep getting crashes and BSOD with all of the drivers.

Send it back if that's the case. I haven't had any problems with mine.
 
fizzelopeguss said:
I'd rather stick clothing pegs on my nuts before i buy another ATi card.
That's a pretty strange shopping ritual. But I guess what you do in your free time is your own business.
 
fizzelopeguss said:
I'd rather stick clothing pegs on my nuts before i buy another ATi card.

Your narrow mind means you miss out.

Both Nvidia and ATI have fucked up BIG time in the past; they are both as capable of selling you a lemon as each other... My laptop has an 8600M GT for instance, and that fucking thing throttles back to fuck-all MHz as soon as I even think about clicking on a game, with its fan spinning at 100% ALL THE TIME.

Dell and Nvidia deny all problems though, the pack of twats.
 
Vorador said:
Btw, the last paragraph in the OP is pure gratuitous hate.

WTH, expressing my bias and stating that I never had any faith in nVidia's vision for the Fermi architecture is 'gratuitous hate'? What have I done, buried nVidia fans alive?
 
I'm just waiting for Nvidia to hit the market to see how much the 5850 will drop in price. At the moment, I won't pay that much for a graphics card (since £250 is about the price of a console, and I still need a new mobo and processor), so I'm looking at getting a 5770. I'd love a price war to sway me towards the 5850 for a clear performance increase, though!

... Unless the Nvidia cards show better performance for less and then I'll have to re-evaluate. I have absolutely no preference whatsoever.
 
pj said:
If this random internet dude can know all of this, and I can read and understand it with no hardware background, how can Nvidia fuck it up so bad when making video cards is all that they do?

Charlie gets a Monday-morning-quarterback perspective on things. It's a lot easier to point out these flaws after they happen than to predict them months in advance. ATI was just better managed, saw that such large die sizes are not viable, and/or had some collaboration from their AMD co-workers.

It really all boils down to a risk ATI took several years ago when they designed the 4000 series around then-unmanufactured GDDR5 memory on a 256-bit bus, because they believed a large die was not viable. GDDR5 delivers more bandwidth per pin than older memory types, so it allowed them to make a smaller GPU chip with fewer physical connections. NVIDIA decided not to take that risk and was forced into a larger design, so they must have assumed either that ATI was making the same choice, or that they would take advantage of the large die size since they were stuck with it anyway. There are of course other blunders by Nvidia that could have mitigated the mess they're in with the GT2XX series, but the big die size is the real killer in my opinion.

Also, Nvidia handled the bump issue horribly.
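The die-size economics above can be sketched with the standard dies-per-wafer approximation. Everything below (wafer cost, die areas, yields) is an illustrative assumption, not a real TSMC figure:

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Gross dies per wafer, with the usual edge-loss correction term."""
    d, s = wafer_diameter_mm, die_area_mm2
    return int(math.pi * (d / 2) ** 2 / s - math.pi * d / math.sqrt(2 * s))

def cost_per_good_die(wafer_cost, wafer_diameter_mm, die_area_mm2, yield_frac):
    """Spread the wafer cost over only the dies that actually work."""
    return wafer_cost / (dies_per_wafer(wafer_diameter_mm, die_area_mm2) * yield_frac)

# Assumed: $5000 per 300mm wafer; ~334 mm^2 die at 60% yield
# vs ~530 mm^2 die at 20% yield (all numbers made up for illustration).
print(f"small die: ${cost_per_good_die(5000, 300, 334, 0.60):.0f} per good die")
print(f"big die:   ${cost_per_good_die(5000, 300, 530, 0.20):.0f} per good die")
```

The big die gets hit twice: fewer candidates per wafer, and a smaller fraction of them working, so cost per good die balloons.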
 
fizzelopeguss said:
Currently I ain't missing out on shit. If Fermi is complete ass then I might have to wait for AMD to drop the price of their current cards.

FUCK ATI!!!!111

NVIDIA FOREVA!!!!
 
Minsc said:
FUCK ATI!!!!111

NVIDIA FOREVA!!!!

Not really, the only graphics card brand that holds a special place in my heart is Voodoo.

If I spend £200+ on any graphics card it has to have nHancer; if the 5850 drops to £150 I'll seriously consider picking one up.
 
fizzelopeguss said:
Not really, the only graphics card brand that holds a special place in my heart is Voodoo.

If I spend £200+ on any graphics card it has to have nHancer; if the 5850 drops to £150 I'll seriously consider picking one up.

You just came across the wrong way with your earlier comment, I apologize for my remark... we all have our preferences, I shouldn't fault your intolerance of ATI any more than my own intolerance of the 9400M chipset (used in $2000 computers) or Vista.
 
Minsc said:
You just came across the wrong way with your earlier comment, I apologize for my remark... we all have our preferences, I shouldn't fault your intolerance of ATI any more than my own intolerance of the 9400M chipset (used in $2000 computers) or Vista.

I'm like that, i seem to rub people the wrong way. :P
 
As has been stated a few posts up, here is AnandTech's illustration of defects in fabrication.

[Image: 2r7njaq.jpg] Example of 7 defects on a wafer using a smaller die (e.g. RV870)

[Image: 14cxp4z.jpg] Example of 7 defects on a wafer using a large die (e.g. Fermi)

As you can see, each defect has a higher chance of killing an entire chip on Fermi due to its sheer size. That's not to say ATi is sailing smoothly either; they just have a smaller chip and yield better by default.

Take into account TSMC having tons of yield issues as it is (even if they say it's much better...), plus nVidia betting a completely new, unproven architecture on the problematic 40nm process, and you have the shit scenario they're in now.

Anyway, you guys should really read that whole AnandTech article. It's a fantastic read and really shows how ATi pulled off some amazing feats these past 2 product cycles, from both a marketing and a product perspective.

The RV870 Story: AMD Showing up to the Fight
 
HomerSimpson-Man said:
Dammit, I want them to release this soon to lower the blasted 5xxx series prices. Weakened competition sucks.
This, and to make ATI hurry up their 28nm parts. I want 60fps Crysis on a card without a PCIe power connection.
 
Guys, chill. Fermi's fine. It'll be on the market soon enough.

And as for the bump-crack bashing, I do believe nvidia set aside $400m purely to fix anyone's laptop that's hit with this problem.

Minsc said:
my own intolerance of the 9400M chipset (used in $2000 computers) or Vista.

AFAIK, none of the Macs that cost $2000 (other than the Air) use only the 9400M...
Also, great way of blaming the Apple tax on the chipset lol
 
tahrikmili said:
WTH, expressing my bias and stating that I never had any faith in nVidia's vision for the Fermi architecture is 'gratuitous hate'? What have I done, buried nVidia fans alive?
I don't know. I find Fermi's focus on GPGPU computing appealing personally. What I don't find appealing is what will be the card's size and cost, forcing me to wait for a low-end/mid-range model.
 
nacire said:
I did the same on my new build, but I can't get any drivers to actually work for my 5850. I just keep getting crashes and BSOD with all of the drivers.

I had these but they seem to have disappeared now. I only had a black screen last week and no grey screens, which used to be pretty constant at idle and during gameplay. It was pretty annoying.

Do a complete driver uninstall then reboot, and then use Driver Sweeper, reboot again and install Catalyst 10.2.
 
Apparently Charlie got his hands on some benchmark figures: http://www.semiaccurate.com/2010/02/20/semiaccurate-gets-some-gtx480-scores/

On the GTX470 side, there are 448 shaders, and the clocks are set at 625MHz and 1250MHz in both cases. If the GTX480 is really at 600MHz and 1200MHz, and the GTX470 is slightly faster, it should really make you wonder about the thermals of the chip. Remember when we said that the GF100 GTX480 chip was having problems with transistors at minimal voltages? Basically Nvidia has to crank the voltages beyond what it wanted to keep borderline transistors from flaking out. The problem is that this creates heat, and a lot of it. Both of our sources said that their cards were smoking hot. One said they measured it at 70C at idle on the 2D clock.

The GTX480 with 512 shaders running at full speed, 600MHz or 625MHz depending on which source, ran on average 5 percent faster than a Cypress HD5870, plus or minus a little bit. The sources were not allowed to test the GTX470, which is likely an admission that it will be slower than the Cypress HD5870.

The GF100 GTX480 was not meant to be a GPU, it was a GPGPU chip pulled into service for graphics when the other plans at Nvidia failed. It is far too heavy on DP FP math to be a good graphics chip, but roping shaders into doing tessellation is the one place where there is synergy. This is the only place where the GTX480 stood out from a HD5870. The benchmarks that Nvidia showed off at CES were hand-picked for good reason. They were the only ones that Nvidia could show a win on, something it really needs to capture sales for this card and its derivatives, if any.

If true, nVidia is fucked.
 
I'm not really concerned about those idles. It's the load temperature that matters. Before I put an aftermarket cooler on my 4890, it idled at around 64C on the 2D clock and only hit about 78C on a full 3D load. I bet it'll be fine, Nvidia can't release a product that literally cooks itself - and besides, I bet an Accelero could fit it anyway.
 
Tenacious-V said:
As you can see, each defect has a higher chance of fucking an entire chip on fermi due to it's sheer size.
Nope. The opposite, actually. With a larger die, the chance of any single defect ruining an additional die decreases (because the chance that multiple defects land in the same, already-ruined die increases).

But, of course, a larger die is more expensive, which makes each individual defect more expensive too. A large dead die wastes a larger proportion of the overall wafer area than a small one.

... I thought most modern logic designs above a certain size include redundancy (which makes this way more complicated), but perhaps not in this case.
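Both effects are easy to see in a quick Monte Carlo sketch. The die counts and defect count below are made up — the point is just small vs large dies on the same wafer, ignoring redundancy:

```python
import random

def simulate_yield(wafer_dies, defects, trials=2000):
    """Monte Carlo: scatter `defects` point defects uniformly over
    `wafer_dies` equal dies; return the average fraction of clean dies."""
    total_good = 0
    for _ in range(trials):
        hit = {random.randrange(wafer_dies) for _ in range(defects)}
        total_good += wafer_dies - len(hit)
    return total_good / (trials * wafer_dies)

random.seed(0)
# Same wafer, same 7 defects; the large die fits ~1/4 as many dies per wafer.
small = simulate_yield(wafer_dies=160, defects=7)
large = simulate_yield(wafer_dies=40, defects=7)
print(f"small die yield: {small:.1%}, large die yield: {large:.1%}")
```

With the large die, defects do collide more often (fewer than 7 dies die per wafer on average), so each extra defect is indeed less likely to kill a fresh die — but the fraction of the wafer lost is still far worse, which is what actually sets the cost.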
 
I NEED SCISSORS said:
I'm not really concerned about those idles. It's the load temperature that matters. Before I put an aftermarket cooler on my 4890, it idled at around 64C on the 2D clock and only hit about 78C on a full 3D load. I bet it'll be fine, Nvidia can't release a product that literally cooks itself - and besides, I bet an Accelero could fit it anyway.

Why not though? It's better for hardware to idle at around the 30C range or less, it keeps your whole case cooler and thus quieter as less active cooling is needed.

tahrikmili said:

Well, I still think it will sell out. There's got to be enough nVidia die-hards (especially those who want faster 3DVision) that will buy it at any price, so if you really want it, do it before it starts selling for over $800 because it's unavailable everywhere.
 
tahrikmili said:

Customer reviews are epic :lol :lol

Pros: I started folding on this card and immediately I hit 500000000000000 ppd. Then i received the message that I had just cured all cancer with the insane amount of folding this card did.

Cons: It got so hot when folding that my case burned and melted through my apartment floor. It then crashed on the cancer kid at the bottom.

Other Thoughts: Guess you can say in addition to folding and finding a cure for cancer it also helped take someone's life as well.

Pros: Good investment

Cons: MX440 running Deus Ex 2 actually feels faster.

Other Thoughts: Consider getting MX440 which is also available at this reseller.


Pros: 12,983,249,893,843 cuda cores of graphical rendering goodiness RENDERS MIDGET PORN LIKE NO OTHER, THOSE LITTLE BASTARDS LOOK LIFE SIZED WITH THESE CARDS !!!!

Cons: It's mythical; you can hear about its greatness more than you can see it, because it's a force of nature, not a material object

Other Thoughts: I have 8 of these in octuplet SLI, it blew up my first 10 monitors it's so good


Pros: Great for terrorism

Cons: Plugged two of them in and the power station went into meltdown and the police are looking for me

Other Thoughts: I'm posting from an internet cafe, please tell my parents I lov

:lol :lol :lol :lol @ last one, police got him
 