
NVIDIA to release GeForce Titan

mkenyon

Banned
Why not just drop the extra hundred and go for the 690? You HAVE to be rolling in fucking money at that point.
Is this a result of gamers being poor or just young? $1000 isn't a lot of money for someone in their 20s-30s with an okay job.

Staying on the high end with PCs is less expensive than most hobbies out there. A lot of folks consistently buy and sell the high-end stuff as well, so you don't have to pay a ton of money out of pocket when you upgrade.

But yeah, I had the 690, and I'd far prefer this. I can't tell you the number of times I'd have to deal with halved performance because SLI profiles are borked.
 
$900 huh? Nah, my $40 GTS 250 still runs most games at high (I downsample to 1080p), so I'll pass.

 

Jtrizzy

Member
Since all the regulars are in here could someone summarize the "roadmap" a little? I've been out of town for a while and am lost. Currently I have a 1.5gb 580, and a 2600k@4.2.

My plan is/was to wait till the next gen consoles come out, but my brief reading on this seems to say that they won't be much better than a 580, is that correct? I'd want something that's really going to shit all over them. My other annoyance is that I'd like to wait till there are 120hz IPS monitors and get 3 of those. Or alternatively a Panny plasma that can display 120hz.
 

Batman

Banned
So this would be preferred over the 690 because it's basically a single-GPU card, and not SLI on one card like the 690?
 

tipoo

Banned
So assuming I had way more money than I had good uses for, would I really take a 1/10th reduction in price for 85% of the performance of the 1000 dollar (690) card?

And besides, weren't the things they cut down from Big Kepler often unrelated to gaming, like double precision float performance? Adding more compute units always helps performance, but why add things in a gaming card that aren't needed for gaming? A GK104 based design with more CUDA cores seems to make more sense to me, rather than waste all those transistors.
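A quick back-of-envelope check of that price/performance question, using the thread's rumored figures ($900 Titan at ~85% of a $1000 GTX 690's performance; these are assumptions from the thread, not official specs):

```python
# Rumored figures from the thread, not official specs.
titan_price, titan_perf = 900.0, 0.85      # ~85% of a GTX 690
gtx690_price, gtx690_perf = 1000.0, 1.00   # baseline

# Performance per dollar for each card.
titan_value = titan_perf / titan_price
gtx690_value = gtx690_perf / gtx690_price

# Relative value: below 1.0 means Titan is the worse deal per dollar.
print(round(titan_value / gtx690_value, 3))  # 0.944
```

So a 10% price cut for 15% less performance is actually a slightly worse perf-per-dollar deal, which is the point of the question.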
 
Since all the regulars are in here could someone summarize the "roadmap" a little? I've been out of town for a while and am lost. Currently I have a 1.5gb 580, and a 2600k@4.2.

My plan is/was to wait till the next gen consoles come out, but my brief reading on this seems to say that they won't be much better than a 580, is that correct? I'd want something that's really going to shit all over them. My other annoyance is that I'd like to wait till there are 120hz IPS monitors and get 3 of those. Or alternatively a Panny plasma that can display 120hz.

wait two years from now and buy whatever the top-end card is; that will be the card that lasts.
 
Is this a result of gamers being poor or just young? $1000 isn't a lot of money for someone in their 20s-30s with an okay job.

Staying on the high end with PCs is less expensive than most hobbies out there. A lot of folks consistently buy and sell the high-end stuff as well, so you don't have to pay a ton of money out of pocket when you upgrade.

But yeah, I had the 690, and I'd far prefer this. I can't tell you the number of times I'd have to deal with halved performance because SLI profiles are borked.

Exactly. You have other people in their twenties spending thousands of dollars on cars, motorcycles, partying, etc., but it's okay because those are "cool" hobbies.
 

TheExodu5

Banned
I'm excited about a new high end GPU, but the price is completely off-putting. NVidia is charging insane amounts for a card with likely the same die size as a GTX 580. They can do it since they basically have no competition, but I'll be skipping out. I can stomach spending $500 on a GPU... $900 is pushing it too far.
 

Durante

Member
Why not just drop the extra hundred and go for the 690? You HAVE to be rolling in fucking money at that point.
Multi-GPU rendering has a variety of issues -- profiles, game compatibility, microstuttering, latency, you name it. When you can achieve similar performance with a single GPU that's a much better option.
 

artist

Banned
Since all the regulars are in here could someone summarize the "roadmap" a little? I've been out of town for a while and am lost. Currently I have a 1.5gb 580, and a 2600k@4.2.

My plan is/was to wait till the next gen consoles come out, but my brief reading on this seems to say that they won't be much better than a 580, is that correct? I'd want something that's really going to shit all over them. My other annoyance is that I'd like to wait till there are 120hz IPS monitors and get 3 of those. Or alternatively a Panny plasma that can display 120hz.
[chart: relative performance at 2560]


This will be 2x 580 in performance (not talking SLI).
 
So what some of you are saying is that if AMD had released a better card then this card would have been like $600 a year ago???

So I guess we have AMD to blame???
 

nbthedude

Member
Since all the regulars are in here could someone summarize the "roadmap" a little? I've been out of town for a while and am lost. Currently I have a 1.5gb 580, and a 2600k@4.2.

My plan is/was to wait till the next gen consoles come out, but my brief reading on this seems to say that they won't be much better than a 580, is that correct? I'd want something that's really going to shit all over them. My other annoyance is that I'd like to wait till there are 120hz IPS monitors and get 3 of those. Or alternatively a Panny plasma that can display 120hz.

Given the current rumors, your 580 w/ 2600K is probably more powerful than the next-gen consoles will be. Wait a few years; you already have something better than or roughly equivalent to them.
 

Durante

Member
I'm excited about a new high end GPU, but the price is completely off-putting. NVidia is charging insane amounts for a card with likely the same die size as a GTX 580. They can do it since they basically have no competition, but I'll be skipping out. I can stomach spending $500 on a GPU... $900 is pushing it too far.
Which is why we should all hope for AMD to get their shit together and be competitive again.
 

nbthedude

Member
So what some of you are saying is that if AMD had released a better card then this card would have been like $600 a year ago???

So I guess we have AMD to blame???

Or you could say we have AMD to thank. If they weren't in the game at all, just consider what the price gouging would look like. They already offer a lot more bang for the buck in their top-end cards, and that is at the level where Nvidia has to compete with them.
 

artist

Banned
So assuming I had way more money than I had good uses for, would I really take a 1/10th reduction in price for 85% of the performance of the 1000 dollar (690) card?

And besides, weren't the things they cut down from Big Kepler often unrelated to gaming, like double precision float performance? Adding more compute units always helps performance, but why add things in a gaming card that aren't needed for gaming? A GK104 based design with more CUDA cores seems to make more sense to me, rather than waste all those transistors.
Multiple ASIC designs like you're suggesting add a lot of cost as well as risk factors.
 
Or you could say we have AMD to thank. If they weren't in the game at all, just consider what the price gouging would look like. They already offer a lot more bang for the buck in their top-end cards, and that is at the level where Nvidia has to compete with them.

True too. BTW, I'm not an nVidia zealot; I actually own a 7970 that I've been using since launch, and I love it for the price I paid for it.
 

artist

Banned
So what some of you are saying is that if AMD had released a better card then this card would have been like $600 a year ago???

So I guess we have AMD to blame???
It really comes down to AMD's sweet-spot GPU strategy. If they are willing to go balls out and make (another) huge die, this won't be priced at $899 for long.
 

nbthedude

Member
True too. BTW, I'm not an nVidia zealot; I actually own a 7970 that I've been using since launch, and I love it for the price I paid for it.

I own a 7970 too, but I also own a media PC with a low-to-mid-range Nvidia card. I'm brand agnostic, but right now the price-to-value difference between the two brands is pretty crazy.
 
Since all the regulars are in here could someone summarize the "roadmap" a little? I've been out of town for a while and am lost. Currently I have a 1.5gb 580, and a 2600k@4.2.

My plan is/was to wait till the next gen consoles come out, but my brief reading on this seems to say that they won't be much better than a 580, is that correct? I'd want something that's really going to shit all over them. My other annoyance is that I'd like to wait till there are 120hz IPS monitors and get 3 of those. Or alternatively a Panny plasma that can display 120hz.

the rumors are saying the PS4 will have something a little stronger than the 580 in it, but you really cannot compare chips running together in a console vs parts in a PC.

you might need a beefier PC setup for future console ports because these new consoles have small Jaguar cores, but they're able to run 16 threads vs 4 threads on an Intel quad. Devs might just brute-force it with the Intel chips in our PCs, but it's still an unknown right now.

if you want to take a dookey on the next consoles, i'd suggest grabbing a GTX 780 when it's released this year and going from there.
 

FACE

Banned
I'm excited about a new high end GPU, but the price is completely off-putting. NVidia is charging insane amounts for a card with likely the same die size as a GTX 580. They can do it since they basically have no competition, but I'll be skipping out. I can stomach spending $500 on a GPU... $900 is pushing it too far.

That's what happens when AMD drops the ball, unfortunately.
 
I'm excited about a new high end GPU, but the price is completely off-putting. NVidia is charging insane amounts for a card with likely the same die size as a GTX 580. They can do it since they basically have no competition, but I'll be skipping out. I can stomach spending $500 on a GPU... $900 is pushing it too far.

Amen, this fact can't be pointed out enough.

Personally I balk at this not because I have a problem with spending 900 dollars on gaming, but because I won't spend 900 dollars on 450 dollars' worth of hardware.

The 'you can spend that much' argument is worthless, overpaying is overpaying.
Buying two of these to put in SLI for a combined price of 900 dollars would be sane, spending 900 dollars on a gtx 580 sized die with 580 sized bus is madness no matter how big an enthusiast you are. Might as well light your money on fire.

As for blaming AMD or NVIDIA, I blame both.
AMD kicked this off by massively overpricing the 7xxx series before Kepler released (they had six months of the undisputed-by-a-country-mile performance/watt crown and abused it, just like Nvidia abuses it now).
Thanks to the efficiency gains (performance/watt), people were actually buying them at these stupid prices.
Nvidia saw that and probably went: holy SHIT, there are people who will pay double for no reason?
So now they've both settled into moving lower volumes at far higher profits. And we as consumers lose, while analysts ponder the self-fulfilling prophecy of the 'stalling growth of the PC market' (since this same scenario is going on in the HDD market and partly in the CPU market).

And again, looking at this 'Titan' card's low TDP and clock speeds compared to the GTX 580, yields will be high (because voltage will be lower), so Nvidia is double-stiffing you with this thing.

My input: if you have any self respect as a consumer then don't buy this.
 
Could you please tell me where/how you are building it, and what the specs are? (Or at least what games the machine can handle on max settings)

Just type "budget PC gaming build" into Google. I'm in Canada, so this is a DIY project; I'm getting all my stuff off Newegg.ca.

Toms Hardware has a lot of nice guides also.

Hope this helps!
 

Shambles

Member
nVidia has kept their flagship pricing relatively constant over the past few generations, so this represents a significant leap in the pricing of their flagship GPU. While AMD has been cranking up the price of their flagship GPUs, nVidia has held the high end steady and instead been raising the floor on mainstream parts. Titan may be an oddball product, but it also suggests we should expect the GTX 780 to launch at $600, to justify high prices at every price point.

460 $230
560 Ti $250
660 Ti $300

4870 $300
5870 $380
6970 $369
7970 $550

As technology progresses, both performance and pricing normally improve. Lately, however, while total performance keeps improving, the price keeps increasing along with it.
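That claim can be made concrete from the launch prices listed above (generation-over-generation change for the AMD flagships, using the MSRPs as quoted in the post):

```python
# Launch MSRPs for AMD flagships as quoted in the post above (USD).
amd_flagships = [("4870", 300), ("5870", 380), ("6970", 369), ("7970", 550)]

# Generation-over-generation price change, in percent.
for (prev, p0), (cur, p1) in zip(amd_flagships, amd_flagships[1:]):
    change = (p1 - p0) / p0 * 100
    print(f"{prev} -> {cur}: {change:+.0f}%")
# 4870 -> 5870: +27%
# 5870 -> 6970: -3%
# 6970 -> 7970: +49%
```

The 7970's near-50% jump is the outlier dragging the whole price curve upward.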

Whossshhhh

Judging by your posts, and since you treat 2008 as the same distant past as 1960, I'm just going to go ahead and assume you're 14. Best to keep quiet until you know what you're talking about.
 

tipoo

Banned
This is clearly meant to be a professional high end card.

[photo: NVIDIA Tesla GPU]


A jump from 1536 CUDA cores in the GTX 680 to 2688 in Titan, lower core and memory clocks, but higher bandwidth [the bus goes from 256-bit to 384-bit].



But they already have GK110 in professional cards (Tesla and Quadro). The Geforce line is supposed to be consumer gaming cards.

Seems like a lot of the die size would be wasted on consumers. I don't get why the high-end single chip wouldn't just be a GK104 with more cores, clock speed, and bandwidth. The double precision units of GK110 are unnecessary for games.
 

artist

Banned
As technology progress both the performance and pricing improves. However lately while the total performance improves, the price seems to be increasing in relation to it.
Has a lot to do with the 20/22nm node getting delayed and TSMC wafer pricing and allotment.
 

tipoo

Banned
Multiple ASIC designs like you're suggesting add a lot of cost as well as risk factors.

True, but I would think the smaller die size my approach would require would make up for that. 7 billion transistors is HUGE, any reduction to that would greatly help.
 

artist

Banned
True, but I would think the smaller die size my approach would require would make up for that. 7 billion transistors is HUGE, any reduction to that would greatly help.
It's not as simple as that. Another die of, say, 6 billion transistors, minus the compute units and caches etc., is still a big fucking die.
 

x3sphere

Member
nVidia has kept their flagship pricing relatively constant over the past few generations, so this represents a significant leap in the pricing of their flagship GPU. While AMD has been cranking up the price of their flagship GPUs, nVidia has held the high end steady and instead been raising the floor on mainstream parts. Titan may be an oddball product, but it also suggests we should expect the GTX 780 to launch at $600, to justify high prices at every price point.

460 $230
560 Ti $250
660 Ti $300

4870 $300
5870 $380
6970 $369
7970 $550

Back when Nvidia didn't have much competition from AMD, their flagship GPUs were priced _very_ high. The 8800 Ultra debuted at around $899, I believe, in 2007, so this wouldn't be a first. If they're pricing this new GPU that high, it means they're confident AMD won't outdo them anytime soon.
 

Shambles

Member
Has a lot to do with the 20/22nm node getting delayed and TSMC wafer pricing and allotment.

Not that you're wrong, but the process node drop also significantly increases the number of chips they can get off a wafer. Excluding R&D costs, it's now significantly cheaper for them to produce these chips than before we hit 2Xnm. It appears they raised profit margins at first to make up for the R&D setbacks, and now that they realize they can jack up prices, they're seeing how far they can push it.
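A rough sketch of the die-size economics behind that point, using the standard first-order dies-per-wafer approximation (the die areas below are illustrative assumptions, roughly big-die vs mid-size Kepler, and yield loss is ignored):

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """First-order estimate: wafer area / die area, minus an
    edge-loss term for partial dies lost at the wafer's rim."""
    radius = wafer_diameter_mm / 2
    gross = math.pi * radius ** 2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

# Illustrative die areas on a 300 mm wafer: ~550 mm^2 (big die)
# vs ~294 mm^2 (mid-size die). Yield, which also falls sharply
# with die area, is ignored here.
print(dies_per_wafer(300, 550))  # 100
print(dies_per_wafer(300, 294))  # 201
```

Roughly twice the candidate dies per wafer for the smaller chip, before yield differences widen the gap further.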

Back when Nvidia didn't have much competition from AMD their flagship GPUs were priced _very_ high. The 8800 Ultra debuted at around $899 I believe in 2007, so this wouldn't be a first. If they're pricing this new GPU that high, it means they have confidence that AMD won't outdo them anytime soon.

Let's hope this cycle goes full circle again and AMD can push the envelope like ATI did. :(
 

artist

Banned
Not that you're wrong, but the process node drop also significantly increases the number of chips they can get off a wafer. Excluding R&D costs, it's now significantly cheaper for them to produce these chips than before we hit 2Xnm. It appears they raised profit margins at first to make up for the R&D setbacks, and now that they realize they can jack up prices, they're seeing how far they can push it.
That is exactly how it is.
 