
NVIDIA to release GeForce Titan

n0n44m

Member
Right. It all depends on what kind of a power envelope is acceptable. Current trends seem to be pointing at a lower TDP for Maxwell.

Looking at previous cards (480...), I think what TDP is acceptable for Nvidia depends on:

1. yields
2. chips performance vs similar AMD chips
3. heat (and everything that includes such as cooling noise and more VRM components on PCB)

smaller chips = less heat & better yields; bigger chips = more performance but more heat & worse yields

but you're right, if Maxwell competes against AMD chips as well as the Kepler generation does, then the TDP for regular (non-Titan) cards certainly won't increase
 

sk3tch

Member
The 20 percent number makes no sense when you switch to a new architecture AND get a massive process node shrink like going from 40nm -> 28nm.

It was as big a tock as a tock can get.

I bet if you take the ~20% gain from GTX 580 -> 680 and extrapolate that pricing to the GeForce Titan (a bigger gain from 680 -> Titan), you get why this is launching at $899 versus the GTX 680 at $499.

And architecture doesn't mean shit when it comes to marketing. They're going to price according to performance.
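
Just to put rough numbers on the "price according to performance" idea, here's a quick Python sketch (purely illustrative, it assumes price scales 1:1 with performance, and there are no Titan benchmarks yet):

gtx680_price = 499.0
titan_price = 899.0
# If NVIDIA really priced 1:1 with performance, the implied gain over the 680 would be:
implied_gain = titan_price / gtx680_price - 1
print(f"{implied_gain:.0%}")   # ~80% -- a made-up yardstick, not a benchmark

So under that framing, the $899 tag only makes sense if the 680 -> Titan jump is far bigger than the ~20% from 580 -> 680.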
 

Momentary

Banned
I'm so torn. Should I go ahead and wait for Maxwell or take the plunge with 2 Titans? I know it's not that significant, but every time I see that Maxwell chart I get pissed off because all it does is make me want to wait. I really don't know if I can wait 1 more year for my dream rig. I've been wanting to build a PC since May of last year. I'd be willing to drop $1800 and slap some water blocks on there.
 

angelfly

Member
Damn, I was planning on building a new PC around May since the 780 was going to be out by then and I was planning to get two. Now this is coming out so when I build it I'll probably just go with a single Titan and add another down the line.
 

mkenyon

Banned
I'm so torn. Should I go ahead and wait for Maxwell or take the plunge with 2 Titans? Every time I see that Maxwell chart I get pissed off because all it does is make me want to wait. I really don't know if I can wait 1 more year for my dream rig. I've been wanting to build a PC since May of last year.
Keep in mind that Maxwell chart is performance per watt, not actual gains in performance. Maxwell is more than a year out. If it looks like you need to go that direction, then sell whatever cards you have and go Maxwell.

Continually waiting for the next thing means you'll never build. Other than right before the Sandy Bridge launch, there hasn't been a bad time to build in the last... 6 years.
 

Momentary

Banned
Keep in mind that Maxwell chart is performance per watt, not actual gains in performance. Maxwell is more than a year out. If it looks like you need to go that direction, then sell whatever cards you have and go Maxwell.

Continually waiting for the next thing means you'll never build. Other than right before the Sandy Bridge launch, there hasn't been a bad time to build in the last... 6 years.

I just got pushed off the edge... The day these are released I will buy two. I'll at least wait for Haswell along with an accompanying motherboard for it. How long did it take ASUS to release their Maximus V after its accompanying socket hit?
 
Okay.

So, let's say you have a single card rendering frames at 60 FPS. You add a second card and get FPS up to 120. In terms of frame times, that's going from 16.7ms to 8.3ms, i.e. halving the time it takes to render each frame. Games poll input based on frames rendered, so your input is now being polled every 8.3ms instead of every 16.7ms. SLI adds in a frame of lag to help compensate for microstutter.

So essentially, you are a frame behind the action. But that frame is only 8.3ms, and it's no worse off than it was before. In fact you're still getting better overall input than you were before, as your input is being polled more frequently.

Now let's compare the dips in performance that one might see. Say that on that same single card, your performance falls to a 25ms frame time (40 FPS). If your SLI performance stays roughly doubled, you're looking at a frame time of 12.5ms (80 FPS). Even with a frame of lag, having your input polled every 25ms instead of every 12.5ms is what's really unacceptable for competitive gaming. Regardless of what's showing on screen, the input is still being polled more frequently for the game engine to take into account.

Beyond this, you get the added benefit of having smoother action that makes it easier to track at 8.3ms on a 120Hz monitor. I'm not kidding when I say it feels like you're cheating in twitch games where tracking is important, compared to 60Hz.
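
For anyone who wants to play with the numbers above, here's a quick Python sketch of the same math (assumptions: the game polls input once per rendered frame, and AFR SLI adds roughly one frame of display lag):

def frame_time_ms(fps):
    # Convert frames per second to milliseconds per frame.
    return 1000.0 / fps

for label, fps in [("single card", 60), ("SLI, ~2x scaling", 120),
                   ("single card, heavy scene", 40), ("SLI, heavy scene", 80)]:
    ft = frame_time_ms(fps)
    # Assume AFR SLI displays the action roughly one frame behind your input.
    sli_lag = ft if "SLI" in label else 0.0
    print(f"{label:26s} {fps:3d} FPS -> input polled every {ft:4.1f} ms, +{sli_lag:4.1f} ms display lag")

It spits out the same 16.7/8.3/25/12.5 ms figures used in the post.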

Thanks for the thorough explanation.
The input polling is a fair point I did not consider; I was thinking only of the feedback through the delayed frame.
And we both agree that, frame-lag-wise, SLI puts you right back at square one (assuming 100 percent SLI scaling btw, which you don't get :p).

The responsiveness is still better with SLI than at half the framerate without it, but the feedback (result of your input being displayed, and feedback is a big deal in making a game feel responsive and 'real time') still stays exactly the same (or worse with less than 100 percent scaling).

And yeah, I fully agree on a high framerate helping you track in a game like Quake or even CS. I perform much worse in both games if my framerate drops below 50-60.
So that's a strong argument for still getting benefit from SLI if it keeps your framerate over that threshold.

I personally get a kick out of instant feedback (through low input lag, hence why I play on a CRT monitor and can't stand vsync) and feeling like my mouse movements are displayed without delay and interpreted 1:1 (I despise any form of mouse smoothing or acceleration), so the feedback portion is all I had in mind when weighing the importance of framerates.


Sorry for the late reply btw, dinner and all that.
 

mkenyon

Banned
A month or so. However, motherboards are going to mean a lot less for performance than they do currently. One of the most important factors for enthusiast boards, VRM control, is being moved on-die.

@Sneaky
I'm right with you. I think when we're talking about games where the framerate is 150+, a single frame of lag is not going to be perceivable by even the most extreme esports folks. Even CS:GO on my bench with a single 7970 and a 3570K has a 99th percentile frame time of 6.4ms. You're not going to notice 6.4ms of input lag.


When you do need that extra oomph with graphics hogs, it's unlikely that it is a game where that very slight input lag is going to have any noticeable difference. I'm not looking to perfectly track someone across the map with a railgun when I'm playing Far Cry 3, Guild Wars 2, or something similar. That's when that SLI performance can come in handy.
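
For anyone curious how a 99th percentile frame time is computed from a capture, here's a minimal Python sketch (the sample frame times are made up, not from my bench):

def percentile_frame_time(frame_times_ms, pct=99):
    # Frame time at the given percentile; higher means worse spikes.
    ordered = sorted(frame_times_ms)
    index = min(len(ordered) - 1, int(round(pct / 100.0 * len(ordered))) - 1)
    return ordered[index]

# Hypothetical capture: mostly ~6 ms frames with a couple of slow spikes.
sample = [6.2, 6.4, 6.1, 6.3, 25.0, 6.2, 6.4, 6.3, 6.1, 12.0]
print(percentile_frame_time(sample))   # 25.0 -- the worst ~1% of frames
print(sum(sample) / len(sample))       # 8.7 -- the average hides the spikes

That's the whole point of the frame latency approach: averages (and therefore FPS numbers) hide exactly the spikes you actually feel.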
 

Timedog

good credit (by proxy)
These dudes online jealous cause I'mma style out on that. Watch me C-Walk up to Best Buy and run up on dat Geek Squad with G cheese, talkin bout SLI all up on these. Then I'mma stay shy when you meet my ass in Deathmatch cause that's my \V\Vish, ya heard me, pimp?
 

TrutaS

Member
Nvidia wants you to work hard to get it ---> it is just a proven genius marketing practice.

This surely seems to go in line with the struggles AMD is facing, but I think there is a price limit past which people simply won't give in, despite the lack of competition.
 

Momentary

Banned
6 GB of GDDR5 RAM at 5.2GHz. I'm definitely putting up $1800 for 2 of these bad boys. I've read rumors that EK will have blocks for this card sooner than a lot of people think. I wonder how that's possible?

The only other thing that could make this the most perfect time for me to build is the release of a 2560x1440@120hz monitor. Damn I can't wait for the end of February.
 

Kame

Member
You know what you can buy with $900?

A lot of other essential things, such as groceries, college tuition, rent, car payments, and insurance.
 

TheExodu5

Banned
Did I do the math wrong, or is Nvidia going to charge $900 for performance that can be matched by a 660 Ti SLI setup?

You can always match a high end setup for a cheaper SLI setup. A single card remains a far better solution than dual GPUs, as dual GPUs introduce a lot of problems, and microstutter can make a game feel like it has a low framerate even with good performance. My single GTX 680 feels faster than my old SLI GTX 570 setup, even though it's only maybe 50% faster than a single GTX 570.
 

SapientWolf

Trucker Sexologist
You can always match a high end setup for a cheaper SLI setup. A single card remains a far better solution than dual GPUs, as dual GPUs introduce a lot of problems, and microstutter can make a game feel like it has a low framerate even with good performance.
In my experience, microstutter is only a problem when the framerate does not match or exceed the refresh rate. It's also more of a problem on AMD setups than on Nvidia ones.

I would drop to 1024x768 before I let the framerate drop under the refresh rate.
 

Agauos

Neo Member
Can anyone confirm if I need to get a motherboard with PCI Express 3.0 to take advantage of that 85% performance increase? Or will my 2.0 do fine?
 

mkenyon

Banned
You can always match a high end setup for a cheaper SLI setup. A single card remains a far better solution than dual GPUs, as dual GPUs introduce a lot of problems, and microstutter can make a game feel like it has a low framerate even with good performance. My single GTX 680 feels faster than my old SLI GTX 570 setup, even though it's only maybe 50% faster than a single GTX 570.
That's because measuring performance with FPS metrics is inaccurate. Have you seen this new frame latency data? If not, check out TechReport.
 

Theonik

Member
Can anyone confirm if I need to get a motherboard with PCI Express 3.0 to take advantage of that 85% performance increase? Or will my 2.0 do fine?
Depends on whether you do SLI or not. A 16x PCI-E slot should be enough for it, I reckon. The SLI issue comes from the fact that a lot of motherboards have far fewer lanes for the additional GPUs.
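
For reference, the theoretical one-way bandwidth per slot works out like this in a quick Python sketch (standard PCI-E encoding overheads, not benchmark numbers); the SLI concern is the drop to x8 on boards with fewer lanes:

# Per-lane throughput after encoding overhead, in MB/s (theoretical maxima).
PER_LANE_MB_S = {"2.0": 500.0,                  # 5 GT/s with 8b/10b encoding
                 "3.0": 8000 * 128 / 130 / 8}   # 8 GT/s with 128b/130b encoding
for gen in ("2.0", "3.0"):
    for lanes in (16, 8):   # the second SLI card often ends up at x8
        gb_s = PER_LANE_MB_S[gen] * lanes / 1000
        print(f"PCIe {gen} x{lanes}: ~{gb_s:.1f} GB/s one-way")

So a 2.0 x16 slot (~8 GB/s) sits roughly where a 3.0 x8 slot does, which is why a single card is usually fine on 2.0.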
 

Agauos

Neo Member
Depends on whether you do SLI or not. A 16x PCI-E slot should be enough for it, I reckon. The SLI issue comes from the fact that a lot of motherboards have far fewer lanes for the additional GPUs.

Awesome, thanks. Do you think my Phenom II 965 would bottleneck?
 

mkenyon

Banned
Awesome, thanks. Do you think my Phenom II 965 would bottleneck?
1) It might bottleneck on PCI-E 2.0. Just ever so slightly.

2) Games don't have a cut-and-dried bottleneck. I think this line of thinking started with people inductively putting things together from GPU usage and CPU usage numbers, when X part never really bottlenecks Y part. It's more about limitations in game engine performance based on your hardware specs.

So what I'm saying is that your Phenom II is bottlenecking your performance in any game, much more severely than a modern socket 1155 processor would, even an i3. This is regardless of the video card you have.
 

mkenyon

Banned
Phenom II will bottleneck even a GTX 660 in CPU-intensive titles - if you are debating a $900 GPU, then an i5 K series with a mobo for <$400 should be your first priority
I agree with this.

But just so folks are clear, it's not like there's a fixed frame time limit from the CPU that won't change if you get a better video card. It's a lot more dynamic than that.
 

SapientWolf

Trucker Sexologist
I agree with this.

But just so folks are clear, it's not like there's a fixed frame time limit from the CPU that won't change if you get a better video card. It's a lot more dynamic than that.
Consistent frame times aren't going to be of any use if the framerate is consistently low.
 

isamu

OMFG HOLY MOTHER OF MARY IN HEAVEN I CANT BELIEVE IT WTF WHERE ARE MY SEDATIVES AAAAHHH
Would love to have one of these cards someday. Looks like a beast.
 

TheExodu5

Banned
From the same site (google translate):

SweClockers' sources now confirm that the GeForce GTX Titan is not only based on the same chip as the compute card Tesla K20X but also carries just as much memory. The upcoming monster model is equipped with 6 GB of GDDR5 memory, which suggests that the video card is likely to have a 384-bit memory bus.

http://www.sweclockers.com/nyhet/16495-nvidia-geforce-gtx-titan-far-6-gb-minne

Sounds like this card will really be worth its price after all.
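
If the 6 GB / 384-bit rumor holds and the 5.2GHz effective memory clock mentioned earlier in the thread is right, the back-of-the-envelope bandwidth works out like this in Python (both inputs are rumors, so treat it as a ballpark):

bus_width_bits = 384
effective_rate_gt_s = 5.2   # rumored effective GDDR5 data rate, in GT/s
bandwidth_gb_s = bus_width_bits / 8 * effective_rate_gt_s
print(f"~{bandwidth_gb_s:.0f} GB/s")   # ~250 GB/s under these assumptions

That would put it well above a stock 680 (roughly 192 GB/s).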
 
There's one risk in buying it.

All 3 consoles are using ATI architecture - that means games might favour 89xx series cards in benchmarks, so next-gen ATI could be getting close to Titan in titles from 2014 onwards.
 
There's one risk in buying it.

All 3 consoles are using ATI architecture - that means games might favour 89xx series cards in benchmarks, so next-gen ATI could be getting close to Titan in titles from 2014 onwards.

imo, this is irrelevant. DX11 and how much better-performing Nvidia parts generally are will make this a non-issue.
 
There's one risk in buying it.

All 3 consoles are using ATI architecture - that means games might favour 89xx series cards in benchmarks, so next-gen ATI could be getting close to Titan in titles from 2014 onwards.

Most PC games today are ports from the 360, which has an AMD GPU, yet they run just fine on Nvidia cards too.
 
There's one risk in buying it.

All 3 consoles are using ATI architecture - that means games might favour 89xx series cards in benchmarks, so next-gen ATI could be getting close to Titan in titles from 2014 onwards.
Next-gen GPUs are at half (less than half in the case of Durango) the power of a non-Titan 680. The Titan is in a completely different league.
 

Momentary

Banned
Last I've seen they had it down to $800. I think I saw this at Anandtech. I'll try to find the link when I have a break from work. Also, this card is not a derivative of the 680.
 
Next-gen GPUs are at half (less than half in the case of Durango) the power of a non-Titan 680. The Titan is in a completely different league.

I didn't mean it like the Titan will be too slow. What I meant is that the 8970 might be nearly as fast for $500 when it's released in H1 2013.
 

Sethos

Banned
So is the Titan essentially the GTX 780? Or is it just the Titan, with its own naming convention, and the GTX 780 is coming out down the line?

To me it sounds like the Titan is a sort of showcase project between 6xx and 7xx, based on their compute GPUs, so it'll probably compete with even the best single 7xx.

Like an early 7xx high-end preview at a premium price.
 