
NVIDIA to release GeForce Titan

It's not as simple as that. Another die of, say, 6 billion transistors minus the compute hardware, caches, etc. is still a big fucking die.

Die size increases cut yields exponentially.
[Figure: Fig. 9. (a) and (b) have the same defects, but the yields are 38% and 79% respectively, with the chip edge of (a) double that of (b), i.e. a fourfold increase in chip area.]
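
That exponential relationship falls out of the classic Poisson yield model. A minimal sketch (the defect density D is an assumed value, chosen here so the numbers roughly match the figure; real fabs use fancier models):

import math

# Poisson yield model: yield = exp(-D * A), where D is the defect
# density and A is the die area.
def die_yield(defect_density, area):
    return math.exp(-defect_density * area)

D = 0.236  # assumed defect density; gives ~79% yield for a unit-area die
print(die_yield(D, 1.0))  # ~0.79 -- the small die (b) from the figure
print(die_yield(D, 4.0))  # ~0.39 -- double the edge, 4x the area: die (a)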
 

DieH@rd

Banned
But they already have GK110 in professional cards (Tesla and Quadro). The GeForce line is supposed to be consumer gaming cards.

According to PCPer, Nvidia will just sell their professional Titan cards to the general population, without changes to the design.
 
But they already have GK110 in professional cards (Tesla and Quadro). The GeForce line is supposed to be consumer gaming cards.

Seems like a lot of the die size would be wasted on consumers. I don't get why the high-end single chip wouldn't just be a GK104 with more cores, clock speed, and bandwidth. The double-precision units of GK110 are unnecessary for games.

That's not how production of multi-billion-transistor silicon parts works.

You want as few lines as possible, and then you optimize them all the time to improve yields.
 
Oh no, it has a larger bus?! ... Wait, how is that bad again?

Read again, I said no such thing.
It has the same bus width as the GTX 580 and a lower TDP (which suggests a lower voltage; combined with the transistor counts, 3B @ 40nm vs 7B @ 28nm, it also suggests the same or a smaller die than the GTX 580).

Nothing about this card makes it anything more than the successor to the GTX 580. The lower TDP (and lower voltage) makes it cheaper too, since they don't need to cherry-pick parts that can run at very high clocks like they did with the GTX 580.

I hope I expressed myself better this time.
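
For what it's worth, that die-size reasoning checks out on the back of an envelope. A sketch assuming ideal area scaling with the square of the feature size (real processes only approximate this):

# GTX 580: ~3B transistors @ 40nm; Titan: ~7B @ 28nm
density_gain = (40 / 28) ** 2               # ~2.04x transistors per mm^2
relative_area = (7e9 / 3e9) / density_gain  # transistor ratio / density gain
print(f"~{relative_area:.2f}x the GTX 580's die area")  # ~1.14x, same ballpark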

As xesphere and shambles have said, prices have been on the rise since the 460 was released (with a sharp increase this gen), and it's turning out to be a repeat of 2005-2007.

Heh, I stuck with my Radeon 9800 Pro all the way till 2009 because I refused to partake back then; looks like it'll be a repeat of that for me. I need to find me a new hobby.
 
If this had come out 6 months ago I definitely would have gotten two of them. People claiming that this is overkill obviously aren't running high-end rigs. Resolutions >1080p aside, there are still games that need a card like this to hit 60 fps. Hell, I'd be floored if Crysis 3 didn't need two of these to run at 60 fps @ 1080p alone.

Having said that, charging $900 for cards that should have come out at $600 a year ago is egregious. I couldn't justify spending this much on a year-old video card when new video cards and consoles are on the horizon.
 

BigTnaples

Todd Howard's Secret GAF Account
Awesome, a 900 dollar piece of hardware that has no real purpose...

Uh.


Anyone here who is a fan of 60 fps, 3D, Oculus Rift, supersampling, higher-than-1080p resolutions, tessellation, etc. could benefit greatly from this, and still need more horsepower. Especially with next gen fast approaching, and Crysis 3. PC gamers are finally starting to have reasons to upgrade again.
 
Uh.


Anyone here who is a fan of 60 fps, 3D, Oculus Rift, supersampling, higher-than-1080p resolutions, tessellation, etc. could benefit greatly from this, and still need more horsepower. Especially with next gen fast approaching, and Crysis 3. PC gamers are finally starting to have reasons to upgrade again.

Yup. Seeing those kinds of comments makes me glad most members of GAF have no involvement with technology development/advancement.
 
Whoa... DAT meltdown! It's like... premium products are a bad thing! And no one should be caring about visual fidelity.

I know right. I only have 2x 580s in my machine, but playing BF3 on a 120hz monitor at 120 fps, all Ultra settings, is a sight to behold.

Bulletstorm in 3D is impressive as well.

Go big or go console.
 

Izayoi

Banned
We should be blaming AMD for the price, because they're seemingly incapable of producing hardware powerful enough to compete with nVidia.

I'll definitely be waiting for a price drop of some kind, but this is good as I probably don't need to upgrade anyway, seeing as my 2yo PC is way more powerful than either of the next-gen consoles as it is.
 
We should be blaming AMD for the price, because they're seemingly incapable of producing hardware powerful enough to compete with nVidia.

I'll definitely be waiting for a price drop of some kind, but this is good as I probably don't need to upgrade anyway, seeing as my 2yo PC is way more powerful than either of the next-gen consoles as it is.

Ummm, AMD has had the most powerful card for the past 6 months...

The reason the prices are so high is that people are eating this stuff up. They're willing to pay, so companies release cards at these prices.
 

artist

Banned
Die size increases cut yields exponentially.
[Figure: Fig. 9. (a) and (b) have the same defects, but the yields are 38% and 79% respectively, with the chip edge of (a) double that of (b), i.e. a fourfold increase in chip area.]
You are comparing dies with a 100% size difference, while we're talking about a hypothetical difference of 14%. Try again.
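
Plugging that 14% into the same Poisson yield sketch from earlier (same assumed defect density) shows the scale of the difference:

import math

D = 0.236                   # same assumed defect density as the earlier sketch
print(math.exp(-D * 1.00))  # ~0.79 baseline yield
print(math.exp(-D * 1.14))  # ~0.76 -- 14% more area costs only ~3 points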
 

mkenyon

Banned
Artist, could you give me a rundown of what's going on with the production then? I'm no engineer; I just understand performance and how to test it. You seem to know what's going on here, but you're only really responding with 'no' instead of explanations.
 

DieH@rd

Banned
If AMD drops the ball again, this will probably be more powerful than the high-end 700 series card.

AMD has no intention of making a chip that has 7.1 billion transistors.

Their upcoming flagship card [8970] will have 5.1 billion [a lot!], and will be sold for ~$400-450.
 

artist

Banned
Artist, could you give me a rundown of what's going on with the production then? I'm no engineer; I just understand performance and how to test it. You seem to know what's going on here, but you're only really responding with 'no' instead of explanations.
I've already dropped quite a bit of info - the number of wafers you order for your Tesla products is less than 1/12th of what you order for your GeForce line.
 

mkenyon

Banned
NM, found this on the last page:

Not that you're not right, but the process node drop also significantly increases the number of chips they can get off a wafer. Excluding R&D costs, it's now significantly cheaper for them to produce these chips than before we hit 2Xnm. It looks like they increased their profit margins at first to make up for the R&D setbacks, and now that they realize they can jack up the prices, they're seeing how far they can push it.
Is there anything other than inductive reasoning to make us believe this is the case? Would love to see some more data on this (rough dies-per-wafer numbers below).
I've already dropped quite a bit of info - the number of wafers you order for your Tesla products is less than 1/12th of what you order for your GeForce line.
You mean in terms of # of products sold? Essentially getting a cut on the price of the manufacturing because of economies of scale?
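
On the chips-per-wafer point, here's a rough sketch using the classic dies-per-wafer approximation; the die areas are assumptions for illustration (roughly GF110-class vs GK104-class):

import math

# Dies-per-wafer approximation: wafer area over die area, minus an
# edge-loss term proportional to the wafer circumference.
def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

print(dies_per_wafer(520))  # ~106 candidate dies at ~520 mm^2 (GF110-class)
print(dies_per_wafer(294))  # ~201 candidate dies at ~294 mm^2 (GK104-class)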
 

artist

Banned
You mean in terms of # of products sold? Essentially getting a cut on the price of the manufacturing because of economies of scale?
What I meant was that Tesla K20 being in short supply does not indicate bad yields but rather inaccurate demand projections for K20 within Nvidia.
 
I know right. I only have 2x 580s in my machine, but playing BF3 on a 120hz monitor at 120 fps, all Ultra settings, is a sight to behold.

I can't even imagine BF3 at max settings at 120 frames.

I was able to play Portal 2 at max settings at 120 FPS and it really was something else. It really helps keep you from getting disoriented when everything is that smooth. Same with Counter-Strike: GO; I just feel like I have a HUGE advantage over everyone else.

120 frames truly does change the experience significantly. Sometimes when I clone my monitor to the TV and forget that I left it on, I'll feel my mouse on the desktop at 60 frames and it just feels disastrous. It blows me away how we allow ourselves to settle for 60 FPS.
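
The arithmetic behind that feeling is simple; going from 60 to 120 fps halves the frame time:

# Frame time at each refresh rate -- the gap you can feel on the desktop.
for fps in (60, 120):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 60 fps -> 16.7 ms, 120 fps -> 8.3 ms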
 

Ceebs

Member
This is of actual interest to me. It would be much preferable to an SLI setup for pushing my high-res monitor.

I can probably still unload my card for $350 or so.

I'll want to see the 1440p or 1600p benchmarks first. This is assuming they are not stupid and put enough VRAM on the thing.
 

mkenyon

Banned
This is of actual interest to me. It would be much preferable to an SLI setup for pushing my high-res monitor.

I can probably still unload my card for $350 or so.

I'll want to see the 1440p or 1600p benchmarks first. This is assuming they are not stupid and put enough VRAM on the thing.
Supposedly 6GB. I think it comes in either 5GB or 6GB variants, so it'll be one of those two.
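
If it keeps the GTX 580-style 384-bit bus mentioned earlier in the thread, the plausible capacities almost pick themselves. A sketch assuming twelve 32-bit GDDR5 channels and common chip densities:

bus_width = 384
chips = bus_width // 32  # twelve 32-bit GDDR5 channels -> 12 memory chips
for gbit in (2, 4):      # common GDDR5 chip densities
    print(f"{gbit} Gbit chips -> {chips * gbit / 8} GB")
# 2 Gbit -> 3.0 GB, 4 Gbit -> 6.0 GB; a 5 GB config would instead
# imply a narrower 320-bit bus.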
 
Quoting numbers isn't what I'm going to try to do, since I'm not here to argue about who has the bigger e-peen. Only that people should think before they blame something.

This thread is full of people who want this card at $900. It doesn't matter what AMD does if people are buying this stuff at that price.
 

Dennis

Banned
Supposedly 6GB. I think it comes in either 5GB or 6GB variants, so it'll be one of those two.

lol, 5GB or 6GB for a single GPU is so ridiculous outside of using it for a real-time GPU render engine.

So if I buy two 6GB VRAM Titans I will have more VRAM than most people have regular RAM in their PCs, let alone the next-gen consoles.

This appeals to me.


DO you even lift?

fixed
 

Ceebs

Member
lol, 5GB or 6GB for a single GPU is so ridiculous outside of using it for a real-time GPU render engine.

So if I buy two 6GB VRAM Titans I will have more VRAM than most people have regular RAM in their PCs, let alone the next-gen consoles.

This appeals to me.




fixed

Two of them would be getting close to what my car is worth XD. Maybe a bit overkill there.
 

mkenyon

Banned
Quoting numbers isn't what I'm going to try to do, since I'm not here to argue about who has the bigger e-peen. Only that people should think before they blame something.
What did I say that you're responding to with this?

*edit* ohhhh, TechReport's really accurate and awesome graphs.

Listen man, my test bench has a 7970 on it. I'm not some fanboy. I'll point out any hardware flaw if I see it.
 

squidyj

Member
lol, 5GB or 6GB for a single GPU is so ridiculous outside of using it for a real-time GPU render engine.

So if I buy two 6GB VRAM Titans I will have more VRAM than most people have regular RAM in their PCs, let alone the next-gen consoles.

This appeals to me.




fixed

but... that's not how it works :l (SLI mirrors the same data into each card's memory, so two 6GB Titans still give you 6GB of usable VRAM)
 
What did I say that you're responding to with this?

*edit* ohhhh, TechReport's really accurate and awesome graphs.

Listen man, my test bench has a 7970 on it. I'm not some fanboy. I'll point out any hardware flaw if I see it.

Not blaming you or anything, just saying blaming AMD isn't going to lower the price of this card.
 