
GeForce Titan X ~50% faster than GTX 980 (3072 cores, 192 TUs, 96 ROPs)

shark sandwich

tenuously links anime, pedophile and incels
This card sounds badass but at this point IMO you might as well just wait for the 20nm/16nm parts. I wouldn't be surprised to see a GTX 1070 with gaming performance very close to Titan X at less than half the price. Probably only 6-8GB of VRAM though. (Or 5.5GB lol get it?)
 

mkenyon

Banned
This card sounds badass but at this point IMO you might as well just wait for the 20nm/16nm parts. I wouldn't be surprised to see a GTX 1070 with gaming performance very close to Titan X at less than half the price. Probably only 6-8GB of VRAM though. (Or 5.5GB lol get it?)
GTX 1070 will be the 980 rebadged. :(

You're talking about 2+ years out for the next die shrink and architecture change.
 

wachie

Member
That's a 110MHz overclocked 970 vs a 30MHz overclocked 290X; at stock clocks the story would be far different.
It really wouldn't.
Nonsense.
Sean, those are outdated benchmarks with shitty metrics. I know you know better.

As it is right now, the 290X does outperform the 970. AMD has done a lot, especially since the Omega Driver release, which isn't reflected in some of the older stuff.

But also, it is a power-hungry bitch and definitely produces more heat. The other person doesn't seem to understand the correlation between watts consumed and heat in watts produced.
Correct.
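For reference, essentially every watt a card draws gets dumped into your case as heat; there's nowhere else for the energy to go. A rough sketch with ballpark TDP numbers (nominal board-power figures, not my own measurements):

```python
# Rough sketch: a GPU converts essentially all of the power it draws into
# heat, so the wattage gap between two cards is also their heat-output gap.
# TDP values below are ballpark board-power figures (assumptions).
r9_290x_tdp_w = 290  # AMD's nominal board power for the 290X
gtx_970_tdp_w = 145  # Nvidia's stated TDP for the 970

extra_heat_w = r9_290x_tdp_w - gtx_970_tdp_w
print(f"The 290X dumps roughly {extra_heat_w} W more heat into your case.")
```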
 
Take these with a huge grain of salt: I have no idea where they originally came from, so no idea if they're legit. I saw them on Anandtech, but there was no source provided.

Based on the rumored Titan X specs, the gain over the 980 seems a little lower than you'd expect: it's only ~35% at 4K, and I would have expected 40%+ given the increase in VRAM and memory bandwidth on top of a 50% shader increase.
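A quick sanity check on that expectation, using the rumored specs (so treat every number here as an assumption):

```python
# Back-of-the-envelope scaling check based on the rumored specs.
gtx_980_shaders = 2048
titan_x_shaders = 3072  # rumored

shader_gain = titan_x_shaders / gtx_980_shaders - 1
print(f"Raw shader increase: {shader_gain:.0%}")  # 50%

# The leaked charts show ~35% at 4K, i.e. the card realizes only about
# 0.35 / 0.50 = 70% of its theoretical shader advantage - presumably
# because the bigger chip runs lower clocks under power/thermal limits.
observed_gain = 0.35
print(f"Scaling efficiency vs. theory: {observed_gain / shader_gain:.0%}")
```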

[Leaked benchmark charts: okWvaAz.jpg, zzzYXYg.jpg, LAcWtv8.jpg]

That seems odd, since you have higher fps at higher resolutions?
 
It seems 28nm remains economical due to this:

http://www.eetimes.com/author.asp?doc_id=1321536

Summary (more at the link)

We have been hearing about the imminent demise of Moore's Law quite a lot recently. Most of these predictions have been targeting the 7nm node and 2020 as the end-point. But we need to recognize that, in fact, 28nm is actually the last node of Moore's Law.

Beyond this point, we can continue to make smaller transistors and pack more of them into the same size die, but we cannot continue to reduce the cost. In most cases, in fact, the same SoC will actually have a higher cost!

...


Beyond 28nm, the SRAM bit-scaling rate is about 20% per node instead of the historical 50%.
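To illustrate why cost per transistor stops falling, here's a toy model. The 50% (historical) and 20% (post-28nm) bit-area shrink rates are from the article; the wafer-cost increases are illustrative assumptions:

```python
# Toy cost-per-bit model. Shrink rates come from the article; the
# wafer-cost increases are illustrative assumptions, not real figures.
def cost_per_bit(prev_cost, area_shrink, wafer_cost_increase):
    # Smaller bits mean more bits per wafer; pricier wafers push cost back up.
    bits_per_wafer_scale = 1 / (1 - area_shrink)
    return prev_cost * (1 + wafer_cost_increase) / bits_per_wafer_scale

# Historical node: bits shrink 50%, wafers ~20% pricier -> cost per bit falls.
print(cost_per_bit(1.0, 0.50, 0.20))  # 0.6 -> 40% cheaper per bit
# Post-28nm: bits shrink only 20%, wafers ~30% pricier -> cost per bit RISES.
print(cost_per_bit(1.0, 0.20, 0.30))  # 1.04 -> slightly MORE expensive
```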
 

wachie

Member
This was being discussed earlier in the thread; not sure about the credibility of this rumor:

Fiji Radeon 390X comes with 8GB frame buffer

The decision to go for an 8GB Fiji rather than the planned 4GB version was in part attributed to Nvidia's Titan X 12GB card announcement. This is just the first part of the story. One of the main reasons is that the card is expected to perform so well in 4K gaming that the 4GB frame buffer could impose a serious limitation.

Our sources are confident that the card is coming this summer, or early summer to be precise, but we don't have a better date than that. It could be as early as Computex, which starts in early June, but we would expect it to happen slightly later than that. Our friends from Sweclockers.com were reporting that the cards should come in Q2 2015 with 4GB HBM memory, but I guess that this plan might be slightly altered.

http://www.fudzilla.com/news/graphics/37258-fiji-radeon-390x-comes-with-8gb
 
It's not 50% faster, it's ~35% faster. An overclocked Titan X is 50% faster than a non-overclocked 980. Stop spreading or believing FUD and lies from Nvidia; it's embarrassing.
Instead of assuming he read the graphs incorrectly, you assume he's pushing the PR wagon and spreading FUD. Dude, you have to calm down... seriously.
 

Nachtmaer

Member
That seems odd, since you have higher fps at higher resolutions?

These are relative numbers with the 290X as the base.

I thought gen 1 HBM was maxed at 4GB?

The limit is 1GB per memory stack. Most people figured that four stacks would make the most sense for size and cost reasons. Unless there's a way to add more stacks without reworking the GPU chip itself, I really have no idea how they're going to pull off 8GB.
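The capacity math with gen 1 HBM as publicly specced (1GB per stack; four stacks is the widely assumed Fiji configuration):

```python
# Gen 1 HBM capacity, as publicly specced: 1 GB per stack.
gb_per_stack = 1
stacks = 4  # the widely assumed Fiji configuration (assumption)

print(f"Capacity: {gb_per_stack * stacks} GB")  # 4 GB ceiling
# Reaching 8 GB means either doubling the stack count (which implies
# interposer/GPU rework) or waiting for 2 GB stacks, which gen 1 lacks.
```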
 

Marlenus

Member
That seems odd, since you have higher fps at higher resolutions?

It's not fps but performance scaling with the 290X as the base, making the Titan X and 390X 46% and 49% faster at 4K respectively. EDIT: beaten.



Not sure if they can do 8GB of HBM without reworking it slightly, and from my understanding that would double memory bandwidth.
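For what it's worth, the bandwidth math behind that: each HBM1 stack has a 1024-bit interface at 1Gbps effective, so adding stacks adds bandwidth linearly (per-stack figures are the published HBM1 specs):

```python
# Each HBM1 stack: 1024-bit bus at 1 Gbps effective per pin = 128 GB/s.
bus_bits_per_stack = 1024
gbps_per_pin = 1.0

gbs_per_stack = bus_bits_per_stack * gbps_per_pin / 8  # bits -> bytes
for stacks in (4, 8):
    print(f"{stacks} stacks: {stacks * gbs_per_stack:.0f} GB/s")
# 4 stacks: 512 GB/s; 8 stacks: 1024 GB/s -> doubled bandwidth, as noted.
```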
 

wachie

Member
Instead of assuming he read the graphs incorrectly, you assume he's pushing the PR wagon and spreading FUD. Dude, you have to calm down... seriously.
I think what he meant was the Nvidia slide in the OP and the title of this thread. I don't think he (US) perused those Chinese graphs to conclude it was 50% faster.
 
Because an aggressive strategy, especially coming right off the 970 fiasco, could really give AMD a huge boost in marketshare, something they've been lacking for a long time.

I wouldn't think $500, but $600 might just be a genius, bold move.

That seems like thin reasoning to me, predicated only on how desperate AMD is perceived to be.

I'll just say that I hope AMD really is that desperate; I really want this GPU to be in the $600 range too. lol

This was being discussed earlier in the thread; not sure about the credibility of this rumor:

Fiji Radeon 390X comes with 8GB frame buffer



http://www.fudzilla.com/news/graphics/37258-fiji-radeon-390x-comes-with-8gb

Article reads like a pile of rubbish, to be honest.
 

viveks86

Member
It's interesting that the Nvidia slide says 50% while all practical benchmarks seen so far say ~35%. I wonder if they are making theoretical claims or plan to actually back that up with benchmarks at GTC. I also wonder whether that slide is official or not (though I can't imagine it being unofficial).
 
It's interesting that the Nvidia slide says 50% while all practical benchmarks seen so far say ~35%. I wonder if they are making theoretical claims or plan to actually back that up with benchmarks at GTC. I also wonder whether that slide is official or not (though I can't imagine it being unofficial).

It's 50% clock for clock.

Obviously the bigger chip isn't going to clock as high in practical situations. It has cooling and power limits to stay inside.
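Roughly how that shakes out, using the rumored clocks (the Titan X boost figure is an assumption from the leaks):

```python
# "50% clock for clock" vs. what you get at realistic clocks.
gtx_980 = {"shaders": 2048, "boost_mhz": 1216}  # official 980 boost clock
titan_x = {"shaders": 3072, "boost_mhz": 1075}  # rumored boost (assumption)

clock_for_clock = titan_x["shaders"] / gtx_980["shaders"]  # 1.50x
at_real_clocks = clock_for_clock * titan_x["boost_mhz"] / gtx_980["boost_mhz"]

print(f"Clock for clock: +{clock_for_clock - 1:.0%}")  # +50%
print(f"At actual clocks: +{at_real_clocks - 1:.0%}")  # ~+33%, near the ~35% seen
```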
 

Seventy70

Member
...and I thought people getting defensive over consoles was bad. People getting defensive over $1000 hardware is just something else
 

wachie

Member
Article reads like a pile of rubbish, to be honest.
It probably is, which is why I posted it with a disclaimer.

And presumably an optional 12GB version too, right?

Marketing-wise, it's just 2/3 as capable versus 1/3.

(It's hard for me to imagine many games will require more than 4GB, given how almost all games are limited by having console versions.)
A lot of games can go over 3GB already.

...and I thought people getting defensive over consoles was bad. People getting defensive over $1000 hardware is just something else
They are all sad. $399, $349 or $999.
 

mkenyon

Banned
Sorry to ask again: is a 750 watt power supply expected to be sufficient, running a 5820K and 8GB DDR4?
Yep. Though if it's a poor quality 750W, and you're pushing your 5820K to the bleeding edge on an overclock, and want to do the same with a Titan X, then you might (but still probably wouldn't) run into a problem.
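A ballpark power budget to show the headroom (all figures are nominal TDPs or rough assumptions, not measurements):

```python
# Ballpark system power budget for a 5820K + Titan X build (assumptions).
titan_x_w   = 250  # rumored TDP; a heavy OC could push toward ~300 W
cpu_5820k_w = 140  # stock TDP; an aggressive OC can add 50-100 W
rest_of_rig = 75   # motherboard, RAM, drives, fans (rough guess)

total_w = titan_x_w + cpu_5820k_w + rest_of_rig
print(f"Estimated load: {total_w} W of 750 W ({total_w / 750:.0%})")
# ~465 W, about 62% of capacity - comfortable headroom on a decent unit.
```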
 

Hawk269

Member
Yep. Though if it's a poor quality 750W, and you're pushing your 5820K to the bleeding edge on an overclock, and want to do the same with a Titan X, then you might (but still probably wouldn't) run into a problem.

Agree with mkenyon (he is as wise as Yoda in the computer realm). If you OC your CPU and plan to OC the Titan X, it comes down to the quality of the PSU. At 750W you should be fine regardless, but if you are pushing both parts to their max OC and the PSU is not of good quality, it may be an issue.
 
The reference design for the R9 290/290X is really shitty.
I have one and it's loud and runs at 94°C... the worst part? It throttles a lot.
But I bought a €40 aftermarket cooler...
That made such a difference :eek:
It runs quietly... and barely reaches 75°C under load, overclocked to 1GHz.

Even the Asus 290 I bought ran super hot and super loud when I tried to push the OC beyond 5%. I had to buy an aftermarket water cooler, but after that it's been relatively cool and quiet even at a 30% OC. If they release the 390X with performance comparable to the Titan X at 60% of the price, I'd buy it in a heartbeat, but I know what I'm getting into, unlike someone on the last page.
 

dr_rus

Member
Weren't there rumors that the compute side of things was gimped this time around? Doesn't look like it.

It's likely that doubles will be gimped in the same way as on other GM2xx chips. Otherwise it should be the fastest compute chip out there for those who don't need doubles - at least until Fiji.
 

shark sandwich

tenuously links anime, pedophile and incels
GTX 1070 will be the 980 rebadged. :(

You're talking about 2+ years out for the next die shrink and architecture change.
Are we seriously expecting 2 more years of 28nm? Nvidia has been on 28nm for 3 years already (600 series launched March 2012).

The new architecture (Pascal) is coming in 2016. But I was under the impression that we'd see 16nm GPUs in 2015.
 
Are we seriously expecting 2 more years of 28nm? Nvidia has been on 28nm for 3 years already (600 series launched March 2012).

The new architecture (Pascal) is coming in 2016. But I was under the impression that we'd see 16nm GPUs in 2015.

TSMC can't deliver it until next year. AMD went with GloFo, who'll also have it ready for next year.
 

LeleSocho

Banned
This guy is pretty salty about the reference cooler... Is he right? https://youtu.be/zytKg9169Zc

Who the hell is he, even? He has like 50 views on a week-old video, and I can see why: all he does is say "I would never buy a reference cooler design" for 6 minutes straight, without giving any reason why he doesn't like it or why it's objectively bad... hell, he doesn't even know how the card performs with that cooler, because nobody has tried the thing.
 
Who the hell is he, even? He has like 50 views on a week-old video, and I can see why: all he does is say "I would never buy a reference cooler design" for 6 minutes straight, without giving any reason why he doesn't like it or why it's objectively bad... hell, he doesn't even know how the card performs with that cooler, because nobody has tried the thing.
I know he sounds extremely negative. I just wanted opinions on his points regarding the reference cooler.
 

Tenck

Member
Most people I have heard from like the Titan-style reference cooler. I will watch the video to see what he means.

He's just criticizing that they won't do a dual-fan cooler. He thinks cards run too hot with reference coolers.

I wouldn't watch the video; it's just him rambling without really thinking through what his argument is.

"I hate this." "I don't know what to think" "I hate this."

Edit: Even LeleSocho agrees it's a pretty bad video.
 

Kezen

Banned
No, roughly 35% faster at 200% of the cost. (Nvidia's charts claim 50% because they are comparing an overclocked Titan to a reference-clocked 980, which is misleading; seems typical for Nvidia these days.)

It's not Nvidia's fault there is a market willing to pay one grand (or more) for a GPU. They don't have to price it at what its specs would suggest; they price it at what the market finds acceptable.

I don't have numbers for the Titan and Titan Black; I'm curious to see exactly how many people bought them. Regardless, it must have been a success for Nvidia, considering the margin they make on such a product.
 
Hmm, interesting: 12GB may not be overkill after all (though it doesn't need to be fast).

One highly respected developer can see things moving in another direction based on the way games are now made.

"I can totally imagine a GPU with 1GB of ultra-fast DDR6 and 10GB of 'slow' DDR3," he says. "Most rendering operations are actually heavily cache dependent, and due to that, most top-tier developers nowadays try to optimise for cache access patterns... with correct access patterns, correct data preloading and swapping, you can likely stay in your L1/L2 cache all the time."

http://www.eurogamer.net/articles/digitalfoundry-2015-nvidia-geforce-gtx-970-revisited
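A minimal illustration of the access-pattern point he's making: the same amount of work gets much faster when memory is touched in the order it's laid out. (Just a NumPy sketch; real engines do this in C++/shader code.)

```python
# Cache-friendly vs. cache-hostile traversal of the same data.
import time
import numpy as np

a = np.random.rand(4096, 4096)  # row-major (C-order) array, ~128 MB

t0 = time.perf_counter()
row_sums = [a[i, :].sum() for i in range(a.shape[0])]  # sequential reads
t1 = time.perf_counter()
col_sums = [a[:, j].sum() for j in range(a.shape[1])]  # strided reads
t2 = time.perf_counter()

print(f"row-wise: {t1 - t0:.3f}s, column-wise: {t2 - t1:.3f}s")
# Column-wise is typically several times slower, purely from cache misses -
# the same reason engines optimize for access patterns, as quoted above.
```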
 