
GeForce Titan X ~50% faster than GTX 980 (3072 cores, 192 TMUs, 96 ROPs)

not to mention much more OC headroom, much less power draw, and much cooler temps.

Power draw yes, temps absolutely not. Only reference-design 290s get hot; non-reference models don't run any hotter (or louder) than GTX 970s do (~70°C under typical load). As for OC headroom, no idea; I run my 290 at 1200 MHz, stock is 947.
 
So just how fucking crazy big and hella hot will this sucker be?

The size and heat give me nightmares.

It runs cooler and uses a lot less power than a 290X. Power draw is nearly identical to a 780 Ti.

Power draw yes, temps absolutely not. Only reference-design 290s get hot; non-reference models don't run any hotter (or louder) than GTX 970s do (~70°C under typical load). As for OC headroom, no idea; I run my 290 at 1200 MHz, stock is 947.

Congrats, you have a top-1% 290X. And custom design or not, they run hotter.
 
It runs cooler and uses a lot less power than a 290X. Power draw is nearly identical to a 780 Ti.

It doesn't run cooler, see above. Even at 1200 MHz I never see temps exceed 75°C during gaming on my 290. 1200 isn't typical, I agree, but 1100 is easily done on any 290 or 290X, and 1150 is possible on most.
 
TDP = heat.

[temperature comparison chart]

Non-reference 290X vs GTX 970: the verdict is clear, non-reference 290-series cards run at the same temps as or cooler than GTX 970s and 980s.

[power draw chart]

They also don't use THAT much more power.
 
I gave you the answer and you're still trying to bat it away?

The TDP is the heat; all you're showing is how effective (noise notwithstanding) a given SKU is at getting it away from the card's sensor.

The 290/X is a hotter card by virtue of the energy it needs to shift. End of.
 

Seanspeed

Banned
[temperature comparison chart]

Non-reference 290X vs GTX 970: the verdict is clear, non-reference 290-series cards run at the same temps as or cooler than GTX 970s and 980s.

[power draw chart]

They also don't use THAT much more power.
But unfortunately, it doesn't beat the 970 in actual performance at 1080p like you claimed:

[1080p benchmark chart]
 
I gave you the answer and you're still trying to bat it away?

The TDP is the heat; all you're showing is how effective (noise notwithstanding) a given SKU is at getting it away from the card's sensor.

The 290/X is a hotter card by virtue of the energy it needs to shift. End of.

[noise comparison chart]


Funny you should mention noise: the non-reference 290-series cards win there too.

But unfortunately, it doesn't beat the 970 in actual performance at 1080p like you claimed:

[1080p benchmark chart]

That's a 110 MHz overclocked 970 vs a 30 MHz overclocked 290X; at stock clocks the story would be far different.
 
Your own link shows the max load temp of the 290X (even overclocked) at 67°C, where the air-cooled 980s are at 71, 76, or 81°C. Thanks for proving the point.

Look at the fan speed. I don't even want to imagine how unbearable that must be. This is also a 980; a 970 would be even lower in temps and power draw. 504 watts is just terrible for the performance offered.
 

pahamrick

Member
Fantastic, you're comparing the 290X to the 980 now rather than the 970 we were discussing, and you fail to see that the 290X shows lower temps than those 980s do.

Sure, if you don't mind gaming at 100% fan speed. I use a 5870; I can hear the fans at 100% even when using headphones.
 
So you're just going to ignore everything else? lol

Yea, get out of here with this shit man. You're transparent as hell.

Those benches are suspect as hell too in terms of the numbers; Guru3D shows far different results for these cards than HardOCP does.

Sure, if you don't mind gaming at 100% fan speed. I use a 5870; I can hear the fans at 100% even when using headphones.

My 290 @ 1200 MHz never spins its fan up over 60% and never exceeds 75°C, so I don't know what shit-ass cards HardOCP got ahold of for its testing. XFX Double Dissipation, btw.
 

Seanspeed

Banned
Those benches are suspect as hell too in terms of the numbers; Guru3D shows far different results for these cards than HardOCP does.
Yes, anything you can to dismiss other relevant evidence. Only the selective evidence *you* provide is valid.

Well, actually it's not, because as you say, they're at different stock clocks, so the comparison doesn't count. Right?
 
HardOCP's benchmarks are from actually playing the game.

Actually playing the game would give different test scenarios every time the test was run; a fixed loop is more useful for a direct comparison. Especially if they're playing a shooter, where player counts and on-screen action can vary wildly from one match to the next, or even moment to moment.
 

Seanspeed

Banned
Actually playing the game would give different test scenarios every time the test was run; a fixed loop is more useful for a direct comparison. Especially if they're playing a shooter, where player counts and on-screen action can vary wildly from one match to the next, or even moment to moment.
This is getting off-topic. You were proven wrong, we can all move on now.
 
Actually playing the game would give different test scenarios every time the test was run; a fixed loop is more useful for a direct comparison. Especially if they're playing a shooter, where player counts and on-screen action can vary wildly from one match to the next, or even moment to moment.

For minimum framerates, maybe. Over a run of 10 minutes or so, the average frame rate is always within 1 to 2% every single time, assuming you put even a modicum of effort into keeping the scenario similar.
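The 1-2% repeatability claim above comes down to simple arithmetic. As a sketch, with made-up FPS figures rather than numbers from any review:

```python
def percent_difference(a: float, b: float) -> float:
    """Relative difference between two benchmark averages, in percent."""
    return abs(a - b) / ((a + b) / 2) * 100

# Hypothetical average FPS from two ~10-minute runs of a similar scenario.
run1_avg_fps = 71.3
run2_avg_fps = 72.1

diff = percent_difference(run1_avg_fps, run2_avg_fps)
print(f"run-to-run difference: {diff:.2f}%")  # ~1.1%, inside the claimed 1-2% band
```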
 
Get a room, you two. You're derailing the topic's point. This ain't an AMD vs Nvidia topic!

I wasn't the one trying to make it an AMD vs Nvidia topic; I answered a question and was ganged up on by team-green fanboys desperate to prove that AMD cards automatically suck no matter what. Maybe I got a bit out of hand defending myself from attacks, maybe I didn't. I'm done; I'm just going to block them and pretend they don't exist.

Are you quite new to GAF? That kind of discourse among members is not really tolerated. Be careful what you say.

Yes, as you can see I'm a junior (no idea when that goes away, btw).
 
Red and green go perfectly well together on my plate. But PC gaming isn't about food. Although I like steaks, I like the greens better on my PC.
 

Kezen

Banned
Lol. What is going on here?

AMD lunatic getting way too defensive.

I wasn't the one trying to make it an AMD vs Nvidia topic; I answered a question and was ganged up on by team-green fanboys desperate to prove that AMD cards automatically suck no matter what. Maybe I got a bit out of hand defending myself from attacks, maybe I didn't. I'm done; I'm just going to block them and pretend they don't exist.
You seem tense.
Take a look at this, that will calm you down.
 

ZOONAMI

Junior Member
I've always seen the non-reference 970s outperform the 290X in everything at 1440p and under on max settings... so how can it give similar performance to a 980?

Once you're at 1440p or 4K it isn't really wise to be at max settings. And at high resolutions the 290X and 980 trade blows depending on the game.
 

ZOONAMI

Junior Member
[temperature comparison chart]

Non-reference 290X vs GTX 970: the verdict is clear, non-reference 290-series cards run at the same temps as or cooler than GTX 970s and 980s.

[power draw chart]

They also don't use THAT much more power.

I have a reference Diamond 290X, and it never hits 90 degrees. It will get into the low 80s, but why is this really an issue when AMD engineered the card to be fine at high temps? Because it's a blower cooler, all of the heat is pushed out the back of the case, so it's not really heating up other components. Non-reference coolers just spread the heat around inside your case.
 

viveks86

Member
Once you're at 1440p or 4K it isn't really wise to be at max settings. And at high resolutions the 290X and 980 trade blows depending on the game.

Does either of these cards have the name Titan in it? If not, this is off-topic. Let's end this here, please?
 

ZOONAMI

Junior Member
It's natural to compare the current top nvidia card to the upcoming top card, and it's also a quite natural flow of conversation to talk about AMD when talking about Nvidia.
 

Hawk269

Member
I thought I clicked on a thread about Titan X but all I am reading is a battle royale about gpus that came out a while ago that has no bearing on the Titan X.
 

ZOONAMI

Junior Member
I thought I clicked on a thread about Titan X but all I am reading is a battle royale about gpus that came out a while ago that has no bearing on the Titan X.

What more is there to say about the Titan X that Nvidia hasn't already told us? It's natural to compare current and upcoming GPUs to an announced upcoming GPU. Nvidia themselves are comparing the GTX 980 to the Titan X. It makes sense to talk about the 290X, 980, 390X, and Titan X in this thread. They're all at the top of the GPU spectrum.
 

Death2494

Member
It's natural to compare the current top nvidia card to the upcoming top card, and it's also a quite natural flow of conversation to talk about AMD when talking about Nvidia.
I'm sure he's saying this because it's clear that this has turned into a measuring contest. While discussing competitors' products is somewhat relevant, the thread has devolved into 980/970 vs 290X. It should really be about the Titan X.
 

Death2494

Member
What more is there to say about the Titan X that Nvidia hasn't already told us? It's natural to compare current and upcoming GPUs to an announced upcoming GPU. Nvidia themselves are comparing the GTX 980 to the Titan X. It makes sense to talk about the 290X, 980, 390X, and Titan X in this thread. They're all at the top of the GPU spectrum.
Why not just make another thread? Problem... solved?
Back on topic: so this thing being sub-$1k is out of the question, right?
 

ZOONAMI

Junior Member
Why not just make another thread? Problem... solved?
Back on topic: so this thing being sub-$1k is out of the question, right?

A Titan X vs 980 vs AMD thread? I bet there would be comments in there asking, why don't you just post in the Titan X thread? Lol

As for $1000? Who knows. I think $800 would be reasonable, though. We're talking about a 36% performance increase over the current high-end cards. If AMD could release the 290X in 2013 for $550, $1000 seems too much for the Titan X.
 

ZOONAMI

Junior Member
I also think the thread title shouldn't say 50% faster (it should be 36%), as it doesn't make sense to compare a stock-clocked card to an OC'd card.
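The stock-vs-OC objection above is really about clock normalization. A rough sketch, assuming performance scales near-linearly with core clock (optimistic in practice) and using illustrative clocks and FPS rather than review numbers:

```python
def normalize_to_stock(measured_fps: float, test_clock_mhz: float,
                       stock_clock_mhz: float) -> float:
    """Scale a result from an overclocked test back toward stock clocks,
    assuming performance scales linearly with core clock (a rough upper bound)."""
    return measured_fps * (stock_clock_mhz / test_clock_mhz)

# Illustrative numbers only: a card tested 110 MHz over an assumed stock clock.
stock_clock = 1164.0
oc_result_fps = 60.0

estimated = normalize_to_stock(oc_result_fps, stock_clock + 110, stock_clock)
print(f"estimated stock-clock result: {estimated:.1f} fps")
```

In other words, a headline percentage gap shrinks once both cards are put back at their stock clocks.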
 
The reference design for the R9 290/290X is really shitty.
I have one, and it's loud and runs at 94°C... the worst part? It throttles a lot.
But I bought a 40-euro aftermarket cooler...
That made such a difference :eek:
It runs quietly... and barely reaches 75°C under load, overclocked to 1 GHz.
 

Hawk269

Member
I'm sure he's saying this because it's clear that this has turned into a measuring contest. While discussing competitors' products is somewhat relevant, the thread has devolved into 980/970 vs 290X. It should really be about the Titan X.

That's what my somewhat sarcastic comment was for. The last page became a fight between two people talking about the 290X and mainly the 970, which really have no bearing on the Titan X. It wasn't about how they would stack up against the X, just flat-out bickering between those two cards.
 

Felspawn

Member
Looks to be the first real standalone card to do 4K gaming regardless of game. I'm not about to go buy one, but considering the GTX 970 is as fast as a Titan was two years ago for a fraction of the price, it makes me believe 4K will be the standard PC master race resolution in a year or two.
 

gus-gus

Banned
Would a 750-watt power supply be likely to handle this card?
I believe it's a Corsair power supply.
I have a 970 right now, a 5820K, and 8 GB DDR4. Wish I could check it out myself, but I'm at work.
 

Hawk269

Member
Looks to be the first real standalone card to do 4K gaming regardless of game. I'm not about to go buy one, but considering the GTX 970 is as fast as a Titan was two years ago for a fraction of the price, it makes me believe 4K will be the standard PC master race resolution in a year or two.

I want to believe that a single GPU can do 4K at 60 fps with most settings on high, but I don't see that happening for a while. In SLI I'm sure the X can do any current game and games for the next year or so, but not a single card.
 

mkenyon

Banned
But unfortunately, it doesn't beat the 970 in actual performance at 1080p like you claimed:

Sean, those are outdated benchmarks with shitty metrics. I know you know better.

As it is right now, the 290X does outperform the 970. AMD has done a lot, especially since the Omega Driver release, which isn't reflected in some of the older stuff.

But also, it is a power hungry bitch and definitely produces more heat. The other person doesn't seem to understand the correlation between watts consumed and heat in watts produced.
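The watts-consumed/heat-produced correlation mentioned above is nearly an identity: essentially all the electrical power a card draws leaves it as heat. A minimal sketch, using round illustrative power figures rather than measured ones:

```python
def heat_output_watts(power_draw_watts: float) -> float:
    """A GPU converts virtually all consumed electrical power to heat;
    a negligible fraction leaves as light or signal energy."""
    return power_draw_watts

# Illustrative board-power figures, not measurements.
r9_290x_draw = 290.0
gtx_970_draw = 160.0

extra_heat = heat_output_watts(r9_290x_draw) - heat_output_watts(gtx_970_draw)
print(f"extra heat the cooler must remove: {extra_heat:.0f} W")  # 130 W
```

So a better cooler changes where the heat goes and what the sensor reads, not how much heat the card dumps into the room.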
 

Marlenus

Member
Take these with a huge grain of salt: I have no idea where they originally came from, so no idea if they're legit or not. I saw them on AnandTech, but there was no source provided.

Based on the rumored Titan X specs, the performance gain over the 980 seems a little lower than you would expect: it's only 35% @ 4K, and I would have expected 40%+ given the increase in VRAM and memory bandwidth on top of a 50% shader increase.

[leaked Titan X benchmark charts]
 