
NVIDIA to release GeForce Titan

LiquidMetal14

hide your water-based mammals
This would be a viable upgrade considering it would use both PCIe 3.0 slots at full x16 speed.

I'm curious whether I would need a bigger PSU to handle two of these, as I already ordered the Seasonic X-850 a few days back. I read a review that mentioned the system could pull almost 1000W, and I'm not sure if that's OK when the PSU is rated for 850W. Either way, I'm pretty sure I could run two of these in SLI.
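Rough napkin math for the two-card case, if it helps anyone else; every wattage below is a placeholder guess, not an official spec or a measured number:

Code:
# Back-of-the-envelope PSU budget for a 2-card setup (Python).
# All wattages are rough assumptions, not official specs.
card_tdp_w = 250   # assumed TDP per Titan-class card
num_cards  = 2
cpu_w      = 130   # overclocked quad-core, ballpark
rest_w     = 75    # motherboard, RAM, drives, fans

total_w = card_tdp_w * num_cards + cpu_w + rest_w
print(total_w)        # 705 W estimated peak draw
print(total_w / 850)  # ~0.83, so ~83% load on an 850 W unit: tight but doable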
 

Jarmel

Banned
Wat.

15+15 =/= 90

In case this isn't clear:

Each GTX 690 is two GTX 680s on the same circuit board: literally two GPUs, not just equivalent performance. Two 690s means you have four 680s across two circuit boards. That's as far as SLI goes; you can't SLI three 690s.

Two 690s will only give a small performance increase over a single 690 due to how poorly SLI scales with 3-4 GPUs. For really rough argument's sake, let's say two 690s = 115% the performance of one 690. One Titan is 85% the performance of a 690. If two Titans scale at ~90% efficiency, that means two Titans are 153% the performance of a 690 (0.85 × 2 × 0.9).

Alright, fine, so we're theoretically looking at a roughly 40% increase, mainly for SLI rigs. At this point, though, convenience is out the window; we're now looking at pure performance within certain price ranges and in general. That would mean we need a comparison of tri-SLI 680s against SLI Titans in the $1.5k to $2k price range.
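To make the hand-waving above concrete, here's the same arithmetic as a sketch; every scaling factor is the rough assumption from this post, not a benchmark:

Code:
# Rough SLI scaling math using the assumed numbers from the post above (Python).
perf_690      = 1.00   # baseline: one GTX 690 (2 GPUs)
perf_two_690s = 1.15   # assumed: 4-GPU SLI scales terribly
perf_titan    = 0.85   # assumed: one Titan is ~85% of a 690

sli_eff         = 0.90                  # assumed 2-way Titan SLI efficiency
perf_two_titans = perf_titan * 2 * sli_eff
print(perf_two_titans)                  # 1.53, i.e. 153% of a single 690
print(perf_two_titans - perf_two_690s)  # 0.38, the ~40-point gap argued above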
 

artist

Banned
Kind of interesting how this turned out:

Rumor: Nvidia adopting AMD's GPU strategy next year, massive die to return in '13?

A lot of the specs/codenames in there were wrong, but the overview (that Nvidia would have no massive die in 2012, only in '13) was right on. From the rumblings on the Maxwell side of things, they intend to repeat the same pattern: a compute-light GM104 releasing first and the big daddy showing up later. A lot will obviously depend on the TSMC node ramp-up and the competitive landscape from AMD & Intel.
 

teh_pwn

"Saturated fat causes heart disease as much as Brawndo is what plants crave."
It's ~1.7x a GTX680 for $899.

What concerns me is that this pricing implies the GTX 780 is going to be a mild improvement, like 1.2x.
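For what it's worth, the price/performance picture; the GTX 680 street price below is an assumption for illustration, not a quote:

Code:
# Perf-per-dollar sketch (Python). The 680 price is an assumed street price.
titan_perf,  titan_price  = 1.7, 899.0   # perf relative to a GTX 680 = 1.0
gtx680_perf, gtx680_price = 1.0, 470.0   # assumed street price

print(titan_perf / titan_price)     # ~0.0019 perf per dollar
print(gtx680_perf / gtx680_price)   # ~0.0021, so the 680 still wins on value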
 

Zaptruder

Banned
Horrendous price, but I've paid around that range for my high-end rigs before.

Looking forward to cramming two top-end cards (hopefully the generation after this Titan) into my rig when the consumer Rift releases around mid-2014.
 

LiquidMetal14

hide your water-based mammals
It's ~1.7x a GTX680 for $899.

What concerns me is that this pricing implies the GTX 780 is going to be a mild improvement, like 1.2x.

At that point the entry-level price may be around $500-600, though. At least I'm hoping for that.
 

Weenerz

Banned
What setups are people running, monitor-wise, that would make this a sensible choice? And what games? It's such a small market of people who would not only need this but actually use it. I also wonder, if someone is upgrading, what their current specs are. I have absolutely no reason to be jealous of anybody buying these.

I run 2560x1440, and my 7970 will have issues depending on the game. Issues meaning not a constant 60 fps.
 

LiquidMetal14

hide your water-based mammals
I run 2560x1440, and my 7970 will have issues depending on the game. Issues meaning not a constant 60 fps.

Precisely why we need better GPUs, in hopes of maintaining level performance around 60fps at near-1080p resolutions. I play 1440p downsampled and it's great, but I've been lowering it slightly to a custom res of 2049x1150 (ballpark) to retain near-60fps most of the time on my 6970. I can only imagine performance being nearly a constant 60fps with a 680, but I'm only speculating about that GPU, not this new one, which will perform nearly as well as a 690.
 

Mononoke

Banned
I run 2560x1440, and my 7970 will have issues depending on the game. Issues meaning not a constant 60 fps.

I have a weird 2560x1080 monitor (I also have a 2560x1440 one), and even that can have FPS dips in the more intense games.

Although maybe that's from some other issue (maybe another part of my setup is weak). I downsample + max out all my settings, so I don't find it all that surprising that some people need a better card than the basic GTX 680.
 
I'd laugh at anyone with a 4-way SLI rig, regardless of which cards it uses. 3 extra frames of input lag is not my idea of a good time.

Wait, what? SLI adds a frame of input lag?
Haha, wow, that defeats the entire purpose of wanting higher framerates then.

For me this puts ANY SLI user in camp chumpington.

lol, people aren't jealous. If anything, they're just disappointed that consumers will probably end up justifying Nvidia's decision to raise their video card prices.

No one is going to dignify this statement :P People don't want to hear it, gotta rationalise.
 

Alexios

Cores, shaders and BIOS oh my!
Wait, what? SLI adds a frame of input lag?
Haha, wow, that defeats the entire purpose of wanting higher framerates then.

For me this puts ANY SLI user in camp chumpington.
I'm not an SLI user, but this is a shitty thing to say. It's not like all they play is SSFIV and Counter-Strike; for some, extremely high resolution/AA/3D/whatever else with a silky-smooth overall framerate outweighs 1-3 frames of input lag. It's hardly a big deal if you're running at something like 120fps. I don't need my Skyrim or whatever to have zero input lag (and it doesn't seem to).

And no, high frame rates aren't only for response in competitive games; they're also for how amazing games look, duh.
 
I'm not an SLI user, but this is a shitty thing to say. It's not like all they play is SSFIV and Counter-Strike; for some, extremely high resolution/AA/3D/whatever else with a silky-smooth overall framerate outweighs 1-3 frames of input lag. It's hardly a big deal if you're running at something like 120fps. I don't need my Skyrim or whatever to have zero input lag (and it doesn't seem to).

It may not be a nice thing to say, but it's the truth.
Unless you are struggling to hit a minimum framerate of 30 fps without SLI, you are not gaining anything.


The entire point of higher framerates is more responsiveness. SLI scaling isn't 100 percent, so you aren't just failing to gain responsiveness, you're actively losing some.

EVERY game benefits from 60 fps (as in half the input lag) over 30; just because it's only mandatory for CS or Street Fighter doesn't mean you don't get the same benefit (the benefit being the feeling of responsiveness, not the competitive edge) in every other game.
Doubling your input lag defeats the entire purpose of aiming for a higher framerate.
I'm just amused considering how many people on this forum brag about hitting 120 fps with their SLI rigs; it makes them look like chumps if you take this into account.
 

bee

Member
It may not be a nice thing to say, but it's the truth.
Unless you are struggling to hit a minimum framerate of 30 fps without SLI, you are not gaining anything.


The entire point of higher framerates is more responsiveness. SLI scaling isn't 100 percent, so you aren't just failing to gain responsiveness, you're actively losing some.

EVERY game benefits from 60 fps (as in half the input lag) over 30; just because it's only mandatory for CS or Street Fighter doesn't mean you don't get the same benefit (the benefit being the feeling of responsiveness, not the competitive edge) in every other game.
Doubling your input lag defeats the entire purpose of aiming for a higher framerate.
I'm just amused considering how many people on this forum brag about hitting 120 fps with their SLI rigs; it makes them look like chumps if you take this into account.

If SLI gives you 120fps, which reduces input lag even further over 60fps, but adds a single extra frame of lag in the process (normal two-card SLI), then surely you're still coming out a winner.

Can't notice any input lag here in games like Natural Selection 2 running in SLI.
 

Sethos

Banned
It may not be a nice thing to say, but it's the truth.
Unless you are struggling to hit a minimum framerate of 30 fps without SLI, you are not gaining anything.


The entire point of higher framerates is more responsiveness. SLI scaling isn't 100 percent, so you aren't just failing to gain responsiveness, you're actively losing some.

EVERY game benefits from 60 fps (as in half the input lag) over 30; just because it's only mandatory for CS or Street Fighter doesn't mean you don't get the same benefit (the benefit being the feeling of responsiveness, not the competitive edge) in every other game.
Doubling your input lag defeats the entire purpose of aiming for a higher framerate.
I'm just amused considering how many people on this forum brag about hitting 120 fps with their SLI rigs; it makes them look like chumps if you take this into account.

I love this crusade you're on; you just sound really bitter.

The fact that you have some ancient GPU, sit in a thread about a high-end enthusiast card bitching and moaning, and then proceed to comment on something you've never even tried is all the more hilarious.

Pretending you can see a little over 16ms of input lag at 60FPS. Without even owning an SLI setup?
 

Theonik

Member
But again, wouldn't that only matter if games supported it?
PC games don't talk directly to the graphics card; they go through DX/OGL/whatever and then the drivers (which in turn issue instructions to the card).
There is no such thing as games supporting the extra cores or not; that's down to the drivers. It would be pretty insane if devs had to explicitly code for 1500+ GPU cores, lol.
The clock can also be raised; more cores or bus channels can't be.
 

sk3tch

Member
It's ~1.7x a GTX680 for $899.

What concerns me is that this pricing implies the GTX 780 is going to be a mild improvement, like 1.2x.

What do you expect? That was the performance gain from GTX 580 -> GTX 680. It has always been around the 20% mark. That's how Nvidia gets the dollars, boys/girls.
 

Smokey

Member
What do you expect? That was the performance gain from GTX 580 -> GTX 680. It has always been around the 20% mark. That's how Nvidia gets the dollars, boys/girls.

Are you ready to start up the arms race again, my friend?

Was pretty fun last time!
 

sk3tch

Member
Are you ready to start up the arms race again, my friend?

Was pretty fun last time!

Ha, we'll see. I have 3 boxes to equip. So I may buy 4 for them all and do some 2-way, 3-way, 4-way testing...but right now I'm rolling with just a GTX 690 + a GTX 650 Ti (dedicated PhysX). Been waiting for the next gen.
 
I love this crusade you're on; you just sound really bitter.

The fact that you have some ancient GPU, sit in a thread about a high-end enthusiast card bitching and moaning, and then proceed to comment on something you've never even tried is all the more hilarious.

Pretending you can see a little over 16ms of input lag at 60FPS. Without even owning an SLI setup?

You are the one having to justify running SLI, and I assume owning a 680 ^^
I'm not bitter; what do I have to be bitter about? I have no rationalising to do to avoid buyer's remorse.

Also, I game on a CRT monitor at 85 Hz with vsync off; I'm well familiar with input lag, and I lower my settings in games until I hit that framerate. My ancient GPU may be dated now, but it still has 60 percent of the performance of a GTX 680, so not that ancient (and twice the performance per dollar).

If you can't tell an extra 16 ms of input lag, why don't you lock your framerate at 30 fps to begin with? This is what I'm talking about when I say chump: trumpeting about PC gaming in every topic while apparently being unable to appreciate the benefits you trumpet about, or to notice when they aren't there.

Explaining how prices were jacked up due to a lack of competition makes me bitter now, or gives me some kind of agenda?

I think you are a chump, and I'm going to leave it at that; not worth getting banned over putting someone in their place.

As I said in my first post, I didn't expect people to respond reasonably to uncomfortable truths; they never do, and you are living proof in this case.
 

Smokey

Member
You are the one having to justify running SLI, and I assume owning a 680 ^^
I'm not bitter; what do I have to be bitter about?

Also, I game on a CRT monitor at 85 Hz with vsync off; I'm well familiar with input lag.

If you can't tell an extra 16 ms of input lag, why don't you lock your framerate at 30 fps to begin with?

Explaining how prices were jacked up due to a lack of competition makes me bitter now, or gives me some kind of agenda?

As I said in my first post, I didn't expect people to respond reasonably to uncomfortable truths; they never do, and you are living proof in this case.

I have a 580 SLI setup, and had a 690. I also have a 2560x1600 monitor and a 120Hz monitor. I play FPS games 95% of the time and do just fine on either monitor. Gaming on SLI is fine; I don't care what the numbers say. Unless there is a specific issue with a title not having proper SLI support or a profile, it's fine.
 

mkenyon

Banned
You are the one having to justify running SLI, and I assume owning a 680 ^^
I'm not bitter; what do I have to be bitter about?

Also, I game on a CRT monitor at 85 Hz with vsync off; I'm well familiar with input lag.

If you can't tell an extra 16 ms of input lag, why don't you lock your framerate at 30 fps to begin with?

Explaining how prices were jacked up due to a lack of competition makes me bitter now, or gives me some kind of agenda?

As I said in my first post, I didn't expect people to respond reasonably to uncomfortable truths; they never do, and you are living proof in this case.
I'll be back with some data that will help explain. Gimme like 45 mins.
 

Corky

Nine out of ten orphans can't tell the difference.
Clark Gable said:
People going hungry in the streets and people tossing money around like it's going out of style. Yeah, I'd sooner drop $900 on a charity than spend it on a GPU which will last a year and a half max.

Man, that's very admirable of you.

If I had the money I would buy my ass an Nvidia Titan, a Wii U/PS4/720, a nice 4K OLED TV, AND donate to a charity/worthy cause of my choice.
 

Tain

Member
I'd laugh at anyone with a 4-way SLI rig, regardless of which cards it uses. 3 extra frames of input lag is not my idea of a good time.

I knew about microstuttering but I didn't know you'd see that much lag with a 3-way or 4-way setup.

I really can't imagine myself going SLI ever.
 
What do you expect? That was the performance gain from GTX 580 -> GTX 680. It has always been around the 20% mark. That's how Nvidia gets the dollars, boys/girls.

The 20 percent number makes no sense when you switch to a new architecture AND get a massive process node shrink like going from 40nm to 28nm.

It was as big a tock as a tock can get.

I'll be back with some data that will help explain. Gimme like 45 mins.
I'm all ears and more than willing to respond to reason :)
 

Smokey

Member
Man, that's very admirable of you.

If I had the money I would buy my ass an Nvidia Titan, a Wii U/PS4/720, a nice 4K OLED TV, AND donate to a charity/worthy cause of my choice.

Why is he posting on a phone/monitor/TV? Why didn't he donate the money for that product to charity?

People hungry in these streets mayne :(
 

Amish

Neo Member
This is just something they put out because they can, not because they will make big bucks on it. Because yeah, $900 isn't really something every gamer has lying around. It will be fun to see the performance on this, and maybe in SLI if it's supported.

But I'm more interested in optimizing games on PC: an average gaming PC is not only a few times faster than any console, it's also just so unoptimized compared to any console.
 

Sethos

Banned
You are the one having to justify running SLI, and I assume owning a 680 ^^
I'm not bitter; what do I have to be bitter about? I have no rationalising to do to avoid buyer's remorse.

Also, I game on a CRT monitor at 85 Hz with vsync off; I'm well familiar with input lag, and I lower my settings in games until I hit that framerate. My ancient GPU may be dated now, but it still has 60 percent of the performance of a GTX 680, so not that ancient (and twice the performance per dollar).

If you can't tell an extra 16 ms of input lag, why don't you lock your framerate at 30 fps to begin with? This is what I'm talking about when I say chump: trumpeting about PC gaming in every topic while apparently being unable to appreciate the benefits you trumpet about, or to notice when they aren't there.

Explaining how prices were jacked up due to a lack of competition makes me bitter now, or gives me some kind of agenda?

I think you are a chump, and I'm going to leave it at that; not worth getting banned over putting someone in their place.

As I said in my first post, I didn't expect people to respond reasonably to uncomfortable truths; they never do, and you are living proof in this case.

I'm not justifying a single thing; you, on the other hand, are desperately trying to justify why you won't be running SLI and why everyone running SLI is a 'chump', which in my world means some kind of bitter attack or something else bothering you. You deny being bitter, yet you keep coming back to the thread trying to shit on everyone who can actually afford this stuff, and you're slowly getting more and more immature about it.

The fact that you compare 16ms of input lag with locking at 30FPS makes my head hurt; where are you getting this from? Are you just pulling random statements from thin air? It's obvious beyond anything that you have never seen an SLI setup in action, given how your whizzing little head comes off when someone mentions a single frame of input lag per GPU in SLI. I could sit you down at 4 random computers, one of them running an SLI setup, and offer you one million dollars to find the computer running SLI purely by playing a game, and you wouldn't notice in a billion years.

The prices would be jacked up no matter what; the 8800 Ultra price was jacked up, and you know why? Because they are NOT aimed at the common man. They are limited-edition (so to speak) enthusiast cards aimed at people who would gladly pay $900 for the performance no matter how competitive the market is. These cards will be sold. This is not a damn mainstream card that people on a budget need to worry about; it's not even in their stratosphere. It's like Ford Mondeo owners starting to bitch over the price of the Ford GT. It's not aimed at them; don't worry about it, drive your Mondeo and shut up.

The fact that you constantly return to the thread to re-establish your point over and over again about why everyone is a chump, and have to quote the 2-3 people with the same narrow-minded train of thought as you just to say "OMG UR RIGHT!", is pathetic. Why are you here if you genuinely have nothing to add? You keep thinking you're holding the magic stick of truth and that everything you say is the burning bush in the desert, the epitome we need to worship and follow.

The card is not aimed at you; it's not a mainstream card. People spend big bucks on hobbies, and there are PLENTY of cards available for people who are looking to spend less or get a better price/performance ratio. Buy those and stop pestering everyone who can afford this stuff.
 

K.Jack

Knowledge is power, guard it well
Isn't 20nm going to come along next year and obsolete the Titan?

I mean, if the 2x performance per watt talk is true.
 

kpjolee

Member
I'm not justifying a single thing; you, on the other hand, are desperately trying to justify why you won't be running SLI and why everyone running SLI is a 'chump', which in my world means some kind of bitter attack or something else bothering you. You deny being bitter, yet you keep coming back to the thread trying to shit on everyone who can actually afford this stuff, and you're slowly getting more and more immature about it.

The fact that you compare 16ms of input lag with locking at 30FPS makes my head hurt; where are you getting this from? Are you just pulling random statements from thin air? It's obvious beyond anything that you have never seen an SLI setup in action, given how your whizzing little head comes off when someone mentions a single frame of input lag per GPU in SLI. I could sit you down at 4 random computers, one of them running an SLI setup, and offer you one million dollars to find the computer running SLI purely by playing a game, and you wouldn't notice in a billion years.

The prices would be jacked up no matter what; the 8800 Ultra price was jacked up, and you know why? Because they are NOT aimed at the common man. They are limited-edition (so to speak) enthusiast cards aimed at people who would gladly pay $900 for the performance no matter how competitive the market is. These cards will be sold. This is not a damn mainstream card that people on a budget need to worry about; it's not even in their stratosphere. It's like Ford Mondeo owners starting to bitch over the price of the Ford GT. It's not aimed at them; don't worry about it, drive your Mondeo and shut up.

The fact that you constantly return to the thread to re-establish your point over and over again about why everyone is a chump, and have to quote the 2-3 people with the same narrow-minded train of thought as you just to say "OMG UR RIGHT!", is pathetic. Why are you here if you genuinely have nothing to add? You keep thinking you're holding the magic stick of truth and that everything you say is the burning bush in the desert, the epitome we need to worship and follow.

The card is not aimed at you; it's not a mainstream card. People spend big bucks on hobbies, and there are PLENTY of cards available for people who are looking to spend less or get a better price/performance ratio. Buy those and stop pestering everyone who can afford this stuff.
+1
Almost exactly what I wanted to say, lol.
 

mkenyon

Banned
I'm all ears and more than willing to respond to reason :)
Okay.

So, let's say you have a single card rendering frames at 60 FPS, and you get a second card to bring that to 120 FPS. In frame times, that's going from 16.7ms to 8.3ms, halving the time it takes to render each frame. Games poll input per rendered frame, so your input is now being polled every 8.3ms instead of every 16.7ms. SLI adds in a frame of lag to help compensate for microstutter.

So essentially, you are a frame behind the action. But that frame is only 8.3ms, and in total you're no worse off than you were before. In fact, you're still getting better overall input response, as your input is being polled more frequently.

Now let's compare the dips in performance that one might see. Say that on that same single card, your performance falls to a 25ms frame time (40 FPS). With two cards, if your performance stays roughly doubled, you're looking at a frame time of 12.5ms (80 FPS). Even with the frame of lag, having your input polled every 25ms instead of every 12.5ms is pretty unacceptable for competitive gaming. Regardless of what it is showing you on screen, the input is still being polled more frequently for the game engine to take into account.

Beyond this, you get the added benefit of having smoother action that makes it easier to track at 8.3ms on a 120Hz monitor. I'm not kidding when I say it feels like you're cheating in twitch games where tracking is important, compared to 60Hz.
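If it helps, here's the same arithmetic as a sketch; the "input polled once per rendered frame, SLI queues one extra frame" model is the working assumption of this post, not a measured pipeline:

Code:
# Frame-time arithmetic from the post above (Python).
# Assumes input is polled once per rendered frame and that SLI (AFR)
# queues one extra frame.
def input_delay_ms(fps, queued_frames=0):
    frame_time = 1000.0 / fps
    return frame_time * (1 + queued_frames)

print(input_delay_ms(60))                    # 16.7 ms: single card at 60 FPS
print(input_delay_ms(120, queued_frames=1))  # 16.7 ms: SLI at 120 FPS, same
                                             # total delay, but input is now
                                             # polled every 8.3 ms
print(input_delay_ms(40))                    # 25.0 ms: single-card dip to 40 FPS
print(input_delay_ms(80, queued_frames=1))   # 25.0 ms: SLI dip to 80 FPS, a wash
                                             # on delay, polling twice as often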
 

E-Cat

Member
Isn't 20nm going to come along next year and obsolete the Titan?

I mean, if the 2x performance per watt talk is true.
[Image: Nvidia GPU roadmap chart covering Kepler and Maxwell, plotting DP GFLOPS per watt by generation]


According to this infamous chart (just ignore the dates), there should be a ~3x increase between generations. But it's referring to DP FLOPS per watt, which is largely irrelevant to gaming. Though Wikipedia says Kepler (GTX 680) is actually 2.95x the SP GFLOPS/W of Fermi (GTX 480). Huh.
 
I wouldn't call someone who buys this an idiot.

I'd call them rich and enviable. You can bet I wish I had one.

I don't know... even if I had the $900 to spare, I wouldn't buy this thing, because it's simply not economical. For half the price you can get something like 90% of the performance (I haven't looked at the numbers, just estimating from prior experience). It's similar to the super-top-of-the-line $800 Intel CPUs: the gain in performance is not nearly enough to justify the additional cost. I guess I'm also the type to pride myself on getting the best possible performance for a relatively low cost (in whatever I buy).
 

mkenyon

Banned
[Image: Nvidia GPU roadmap chart covering Kepler and Maxwell, plotting DP GFLOPS per watt by generation]


According to this infamous chart (just ignore the dates), there should be a ~3x increase between generations. But it's referring to DP FLOPS per watt, which is largely irrelevant to gaming. Though Wikipedia says Kepler (GTX 680) is actually 2.95x the SP GFLOPS/W of Fermi (GTX 480). Huh.
Well, the "per watt" is important to take into consideration. If the TDP of a card is halved, while the GFLOPS/W is tripled, that's only a 150% increase in performance.
I don't know... even if I had the $900 to spare, I wouldn't buy this thing, because it's simply not economical. For half the price you can get something like 90% of the performance (I haven't looked at the numbers, just estimating from prior experience). It's similar to the super-top-of-the-line $800 Intel CPUs: the gain in performance is not nearly enough to justify the additional cost. I guess I'm also the type to pride myself on getting the best possible performance for a relatively low cost (in whatever I buy).
Despite your name making me think of one of my favorite songs, your estimates are pretty off. Titan is ~170% of the performance of a GTX 680.
 

E-Cat

Member
Well, the "per watt" is important to take into consideration. If the TDP of a card is halved, while the GFLOPS/W is tripled, that's only a 150% increase in performance.
Right. It all depends on what kind of power envelope is acceptable. Current trends seem to point at a lower TDP for Maxwell.
 