
Newest GTX 960 rumor: Jan 22 launch, three variants

http://wccftech.com/gtx-960-ti-benchmarks-specs-revealed/

Supposedly GM206 with a 128-bit bus and 2 GB VRAM for the base model, which goes hand in hand with an earlier rumor, and probably a (cut-down) GM204 or GM206 with a 256-bit bus and 4 GB VRAM for one of the Tis, which would confirm the even earlier rumor from the Indian shipping documents.

Turns out there are currently not one, not two, but three engineering samples of the 960. There is your base Geforce GTX 960 and then two different flavors of the GTX 960 Ti. And best of all, we have performance numbers for all three, courtesy of DG Lee over at IYD.KR.

The results, however, are not that impressive. The GTX 770 wipes the floor with the base GTX 960; heck, even the R9 280 (which costs ~$200) manages to put the card in its place. The base 960 performs around 10% faster than the GTX 760 (stock), which is a barely acceptable margin as it is. The Ti variants are another story altogether, and this is where the going gets interesting. There are currently two GTX 960 Ti variants: one with 1280 CUDA cores and another with 1536 CUDA cores. The performance difference is quite huge, and I have a feeling that we are looking at a cut-down GM204 in the ‘Ti’ cards.

[Image: Nvidia GeForce GTX 960 / GTX 960 Ti / GTX 960 Ti Ultra benchmark chart]

There you go, folks: the first 100% confirmed benchmarks of the GTX 960 and company. As you can see, the GTX 960 is at the lower end of the chart, just above the GTX 760. The 1280 SP variant of the GTX 960 Ti fares much better, beating out the GTX 770 and the R9 280X. The 1536 SP variant of the GTX 960 Ti, on the other hand, is simply brilliant. It is able to breeze past even a reference GTX 780. AMD’s R9 290 is about 7.5% faster than the GTX 960 Ti, but considering the higher price (above the $300 mark) the Ti offers superior value. One of the reasons why I suspect that the GTX 960 uses a different core than the Ti variants is the huge performance gap between the three. If the base card uses a 128-bit bus, then it would help explain the lackluster performance despite Maxwell’s amazing compression technologies. Also, I think I can now safely say that the GTX 960 spotted at Zauba.com was a Ti variant with a 256-bit bus and 4GB of GDDR5 memory.

Based on the current pricing of competing AMD solutions, my estimates for pricing would be as follows: ~$280-320 for the better GTX 960 Ti, ~$250 for the 1280SP Ti, and ~$200 for the base variant. The cards will be launching on the 22nd of January, and I expect we will find out which of the three cards, if not all, Nvidia decides to go forward with.
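(A quick back-of-the-envelope check of that value claim, sketched in Python. The ~7.5% performance gap is the article's own figure; the exact prices plugged in are assumptions taken from the estimates above, not confirmed numbers.)

# Value comparison using the article's numbers: the R9 290 is said to be ~7.5%
# faster than the 1536SP GTX 960 Ti. Prices are assumed for illustration:
# ~$300 for the Ti (middle of the ~$280-320 estimate) and ~$330 for the R9 290
# ("above the $300 mark").
cards = {
    "GTX 960 Ti (1536SP)": {"relative_perf": 1.000, "price_usd": 300},
    "R9 290":              {"relative_perf": 1.075, "price_usd": 330},
}
for name, c in cards.items():
    perf_per_kilobuck = 1000 * c["relative_perf"] / c["price_usd"]
    print(f"{name}: {perf_per_kilobuck:.2f} performance per $1000")
# The Ti comes out slightly ahead (~3.33 vs ~3.26), which is the
# "superior value" argument being made above.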

from a couple of days earlier

http://wccftech.com/nvidia-geforce-gtx-960-gm206-210/

I am by no means claiming for certain that the GTX 960 will be powered by a cut GM204, rather that there is zero authentic evidence of the existence of a GM206 core so far. In fact, the GTX 960 prototype that was spotted in Zauba had 4GB of GDDR5 and a 256-bit bus width, indicating a cut GM204 core. Therefore, I must admit that if the GTX 960 turns out to house the GM206 after all, it will surprise me quite a bit. Of course, there is one other possibility (scrapers: this is obviously speculation) that Nvidia is prepping a GTX 960 along with a GTX 965 or GTX 960 Ti variant. Here the former could house the GM206 core and the latter the GM204 core. It is worth pointing out that the Mobility variant, the GTX 965, exists already.

 
What is that graph even measuring? Relative performance to what? I'm assuming 780 since it's 100%, but boy that could have been labeled better.
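(For what it's worth, charts like that are usually each card's average benchmark score normalized so one baseline card sits at 100%; the 780 reading exactly 100% is why it looks like the reference point. A tiny Python illustration of that normalization, with made-up scores rather than anything taken from the chart:)

# "Relative performance" normalization: express every card's average score as a
# percentage of a chosen baseline card. The scores below are invented purely to
# show the arithmetic.
avg_scores = {"GTX 760": 62.0, "GTX 960": 68.0, "GTX 770": 80.0, "GTX 780": 92.0}
baseline = avg_scores["GTX 780"]
for card, score in sorted(avg_scores.items(), key=lambda kv: kv[1]):
    print(f"{card}: {100 * score / baseline:.0f}%")
# The baseline card lands at exactly 100% by construction.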
 

blastprocessor

The Amiga Brotherhood
Need to see the TDP and price of the 960 Ti, otherwise meh; wait for 20nm/16nm or whatever Nvidia eventually plumps for.
 
I want an upgrade for one TV gaming PC that has a 6970 in it. If that 1536 model came out for $250 or less with that performance, I'd bite. At $300 or less, I'd consider it.

EDIT: Also, the R9 290 has been beating the GTX 780 quite consistently since right near launch. The GTX 780 Ti is a different story.
 

Seanspeed

Banned
Doesn't the high-end 960 Ti cut into the 970 a bit?
Yea, absolutely. Which is very weird.

Makes me wonder how they'll handle pricing. 970s already start at $320. So maybe:

960(base) - $175
960Ti(mid) - $220
960Ti(high) - $275

Basically using one umbrella model for the whole 'midrange' level with the base 960 being the replacement for the 750Ti in essence.
 
So when are we getting a 6GB VRAM card? Or would HBM basically let us stay at 4GB without issue?

Shadow of Mordor is even requiring 6GB of VRAM on ultra, yet most of today's high-end cards only sport 4GB.
 
I wouldn't be surprised if they've mistakenly mixed in an unannounced 950 with the 960 and 960 Ti, because having three separate SKUs for an x60 card makes no sense to me outside of a very specific set of circumstances. Those circumstances would be a whole bunch of 970 chips being manufactured with bum cores and being labelled as 960 variants for sale, like the 570 Fermi chips that were used in the 560 Ti 448 Core GPUs.
 

Vaettir

Member
Wow.

There aren't going to be two Tis for the 960.

GTX 950
GTX 960
GTX 960 Ti

There were two Ti variants of the 560 Ti, though. I don't really think these Tis will exceed 2GB. The 1536SP would cannibalize their 970 if that were the case.
 
So when are we getting a 6GB VRAM card? Or would HBM basically let us stay at 4GB without issue?

Shadow of Mordor is even requiring 6GB of VRAM on ultra, yet most of today's high-end cards only sport 4GB.

Well, this isn't the high-end offering. I assume that comes with Nvidia's next generation.
 

Seanspeed

Banned
There were two Ti variants of the 560 Ti, though. I don't really think these Tis will exceed 2GB. The 1536SP would cannibalize their 970 if that were the case.
They've already sold a shitton of 970s, though. And it will still be faster.

Also depends on how they price. Unless they undercut the 970 price by a large amount, they could both still be options, although the 960Ti would obviously be the marginally better value even at something like $275. I mean, the difference between the 660Ti and 670 wasn't huge and both offered 2GB versions.
 

sk3tch

Member
The chart is pretty useless. Even if it were accurate - none of this really matters until we see the price AND how AMD responds. Their response to the 970 and 980 launch was pretty awesome - huge price cuts. The R9 290X is still a pretty incredible value today (despite it feeling last gen versus Maxwell). I'm guessing AMD will counter NVIDIA's tech with free games and price cuts blazing.
 

sk3tch

Member
8GB VRAM is quickly becoming a requirement. I'd hold out if at all possible.

It is? Haha. C'mon man. There are barely any 8GB cards in the marketplace AT ALL and now all of a sudden it is becoming a requirement? The 960 is a mid-tier card - if you're pushing for 6GB or 8GB or more you best stick to the high-end stuff. Wait until Spring or so and the next round will come out. We'll see what they have for us.
 
8GB VRAM is quickly becoming a requirement. I'd hold out if at all possible.

This is the PC excessive specs joke, right?

EDIT: Nvidia's CES conference happens tonight at 8PM PT/11PM ET/4AM GMT. If this rumor holds any weight at all, we may hear something then.
 

Etnos

Banned
Seems like a decent option for a budget build if priced accordingly. Really like what Nvidia is doing with this 900 series.

Quietly running BF4 at 2715x1527 downsampled to 1080p, looking really fucking awesome... We're talking about a $300 vid card here (GTX 970), impressive.
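(Quick pixel math on that resolution, as a small Python sketch: 2715x1527 is almost exactly double the pixel count of native 1080p, i.e. roughly 2x supersampling, or about 1.41x per axis.)

# Pixel-count check for the downsampling example above.
rendered = 2715 * 1527             # 4,145,805 pixels rendered internally
native = 1920 * 1080               # 2,073,600 pixels output at 1080p
ratio = rendered / native
print(f"Pixel ratio:    {ratio:.2f}x")         # ~2.00x
print(f"Per-axis scale: {ratio ** 0.5:.2f}x")  # ~1.41x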
 
I wouldn't settle for anything less than 16GB of VRAM, guys. I mean, the PS4 has 8GB of GDDR5 and because of coding to the metal, it's twice as powerful as an equivalent PC. You'd need 16GB just to match PS4. Nvidia is just too salty to give PC gamers what they need and it's honestly sad.
 

Etnos

Banned
I wouldn't settle for anything less than 16GB of VRAM, guys. I mean, the PS4 has 8GB of GDDR5 and because of coding to the metal, it's twice as powerful as an equivalent PC. You'd need 16GB just to match PS4. Nvidia is just too salty to give PC gamers what they need and it's honestly sad.

Are you being serious about this? Because properly optimized games like BF4 are already giving the PS4 a run for its money, real bad. I know because I own both a GTX 970 and a PS4.

We're still talking about x86 architecture, and you can "code to the metal" on both platforms, I mean...
 

sk3tch

Member
People who want to downsample, run at 4K resolution, or use ultra-high textures in their games, perhaps?

The problem is, if you're rolling 4K+ you need horsepower, too...not a damn GTX 960 Ti with 8GB of VRAM... :)

Are you being serious about this? Because properly optimized games like BF4 are already giving the PS4 a run for its money, real bad. I know because I own both a GTX 970 and a PS4.

Nah, he's joking.
 

Kieli

Member
Are you being serious about this? Because properly optimized games like BF4 are already giving the PS4 a run for its money, real bad. I know because I own both a GTX 970 and a PS4.

We're still talking about x86 architecture, and you can "code to the metal" on both platforms, I mean...

Yes, I'm being completely serious.
 

Seanspeed

Banned
Are you being serious about this? Because properly optimized games like BF4 are already giving the PS4 a run for its money, real bad. I know because I own both a GTX 970 and a PS4.

We're still talking about x86 architecture, and you can "code to the metal" on both platforms, I mean...
Ur must be a Nvida fanboy. lol Salty much?
 
I wouldn't settle for anything less than 16GB of VRAM, guys. I mean, the PS4 has 8GB of GDDR5 and because of coding to the metal, it's twice as powerful as an equivalent PC. You'd need 16GB just to match PS4. Nvidia is just too salty to give PC gamers what they need and it's honestly sad.

And that is just for PS4-quality settings in the IQ department. This totally ignores the 16-core processors clocked at 4.0 GHz needed to do the CPU stuff.

And what about ultra settings?
 

Teletraan1

Banned
I usually buy an x70-series GPU from Nvidia, so this is not what I am looking for, but I am glad these exist. Still holding out for 6GB cards and probably their next iteration. I just want 6GB so I won't be crucified like the person above for wanting 8GB. Crucified by the people on this board who will be the first to be rocking 8GB, but whatever.
 

Crisium

Member
Since when is an R9 290 10% faster than a 780?

Kepler performance has plummeted compared to GCN. In nearly every recent game, AMD cards are ahead of where they were relative to the 700/600 series a year+ ago. It really puts all those high-priced 780/780Ti/Titan purchases into a sad perspective.


The 7970 and 680 used to be neck and neck. The 7970 even with a 780? DA:I isn't even that much of an outlier - all across the board in the past 6 months, Kepler performance has decreased.

See for yourself:
http://gamegpu.ru/test-video-cards/igry-2014-goda-protiv-sovremennykh-videokart.html

It used to be that the 780Ti was marketed as a class above the 290X, and the 290X was supposed to compete with the 780. This is no longer consistently true. More and more, the 290X is equal to or faster than the 780Ti, and the 290 the same versus the 780. GCN and Maxwell have both improved drastically compared to Kepler recently. Looking at the 4K average really paints a bad picture for Nvidia's Kepler - these cards cost hundreds more than the 290 series, for no gain and often actually worse performance.

I'm not so sure about this 960 rumour with 3 variants, but we'll see. The 660 Ti and 670 had the same number of shaders, actually, so 1536 for the 960 Ti and 1664 for the 970 seems right. The x60 Ti simply cuts back ROPs and bandwidth more than raw shader power.
 

sk3tch

Member
Kepler performance has plummeted compared to GCN. In nearly every recent game, AMD cards are ahead of where they were relative to the 700/600 series a year+ ago. It really puts all those high-priced 780/780Ti/Titan purchases into a sad perspective.



The 7970 and 680 used to be neck and neck. The 7970 even with a 780? DA:I isn't even that much of an outlier - all across the board in the past 6 months, Kepler performance has decreased.

See for yourself:
http://gamegpu.ru/test-video-cards/igry-2014-goda-protiv-sovremennykh-videokart.html

It used to be that the 780Ti was marketed as a class above the 290X, and the 290X was supposed to compete with the 780. This is no longer consistently true. More and more, the 290X is equal to or faster than the 780Ti, and the 290 the same versus the 780. GCN and Maxwell have both improved drastically compared to Kepler recently. Looking at the 4K average really paints a bad picture for Nvidia's Kepler.

I'm not so sure about this 960 rumour with 3 variants, but we'll see. The 660 Ti and 670 had the same number of shaders, actually, so 1536 for the 960 Ti and 1664 for the 970 seems right. The x60 Ti simply cuts back ROPs and bandwidth more than raw shader power.

AMD has made improvements - but you're overblowing the amount of "regression" by Kepler (and the improvements by AMD, for that matter). Some of that comes from BIOS updates increasing the speed of the cards, some from better drivers. During the same time, NVIDIA was releasing plenty of great features (ShadowPlay, GeForce Experience, FXAA, etc.) while AMD is still playing catch-up.

The other side of the coin is AMD's hotter, louder, and more power-hungry cards versus NVIDIA's.

That said - can't wait to see what AMD drops for their next gen card.
 

Crisium

Member
I'm talking about Kepler, not Maxwell. Kepler's heat and noise advantage over AMD is much more incremental in nature, especially against the Sapphire Tri-X series.

The 780Ti cost $300-350 more than a 290 for 6+ months. Look at that average of 10 games; it was not worth it. I honestly think Kepler customers need to be contacting Nvidia about matching AMD's performance gains. Get that driver team to work.

If I had a 780 Ti I would be seriously irritated that the 290 is trading blows with a card that was going for $600 just 4 months ago. A 780Ti owner should be furious that his $700 launch card is now as fast in 2014 games as an AMD card that launched at $400 and currently sells for $250-300. They should NOT be advocating buying more Nvidia products, in my humble, consumer-oriented opinion. Although Maxwell is quite wonderful (except in SLI/Crossfire, where AMD outright wins), it makes you wonder about Nvidia's long-term support - especially as 4K gets more popular and AMD is currently king at that resolution. Things did not look like this a year ago, and it's a testament to AMD's recent performance drivers. Even just compared to the 970/980 launch reviews, the 290 series is gaining ground on Maxwell, although not as drastically as against Kepler.
 

Seanspeed

Banned
Kepler performance has plummeted compared to GCN. In nearly every recent game, AMD cards are ahead of where they were relative to the 700/600 series a year+ ago. It really puts all those high-priced 780/780Ti/Titan purchases into a sad perspective.



The 7970 and 680 used to be neck and neck. The 7970 even with a 780? DA:I isn't even that much of an outlier - all across the board in the past 6 months, Kepler performance has decreased.

See for yourself:
http://gamegpu.ru/test-video-cards/igry-2014-goda-protiv-sovremennykh-videokart.html

It used to be that the 780Ti was marketed as a class above the 290X, and the 290X was supposed to compete with the 780. This is no longer consistently true. More and more, the 290X is equal to or faster than the 780Ti, and the 290 the same versus the 780. GCN and Maxwell have both improved drastically compared to Kepler recently. Looking at the 4K average really paints a bad picture for Nvidia's Kepler - these cards cost hundreds more than the 290 series, for no gain and often actually worse performance.

I'm not so sure about this 960 rumour with 3 variants, but we'll see. The 660 Ti and 670 had the same number of shaders, actually, so 1536 for the 960 Ti and 1664 for the 970 seems right. The x60 Ti simply cuts back ROPs and bandwidth more than raw shader power.
AMD is certainly getting better, but using a game that has one of the best showings for AMD architecture (along with Ryse) and Mantle support is not necessarily the most representative benchmark to pick. Other games do not show that same level of comparison, and the 970 is often at or near 290X levels at 1080p and 1440p.

And criticizing the price of the 780Ti and 780 based on this makes little sense, since AMD has been playing catch-up this entire time. People don't buy GPUs based on what theoretical performance they might get from them in a year's time if driver support improves; they buy them for tangible day-one benefits.
 

Skyzard

Banned
I'm talking about Kepler, not Maxwell. Kepler's heat and noise advantage over AMD is much more incremental in nature, especially against the Sapphire Tri-X series.

The 780Ti cost $300-350 more than a 290 for 6+ months. Look at that average of 10 games; it was not worth it. I honestly think Kepler customers need to be contacting Nvidia about matching AMD's performance gains.

If I had a 780 Ti I would be seriously irritated that the 290 is trading blows with a card that was going for $600 just 4 months ago.
A 780Ti owner should be furious that his $700 launch card is now as fast in 2014 games as an AMD card that launched at $400 and currently sells for $250-300. They should NOT be advocating buying more Nvidia products, in my humble, consumer-oriented opinion. Although Maxwell is quite wonderful (except in SLI/Crossfire, where AMD outright wins), it makes you wonder about Nvidia's long-term support - especially as 4K gets more popular and AMD is currently king at that resolution. Things did not look like this a year ago, and it's a testament to recent performance drivers.

I wouldn't be too happy...if it weren't for amazon <3
 