
Geforce Titan X ~50% faster than GTX980 (3072 cores, 192 TUs, 96 ROPs)

jaypah

Member
The reference design for the R9 290/290X is really shitty.
I have one and it's loud and runs at 94°C... the worst part? It throttles a lot.
But I bought a €40 aftermarket cooler...
That made such a difference :eek:
Runs quietly... and barely reaches 75°C under load, overclocked to 1GHz.

Oh my God, can I get a link? Also was it hard to do?
 
The 980 doesn't seem to be worth it and the Titan will be at least $1k... so I guess I'll stick with my GTX 580 for now. I don't even know if my PC could handle a new card anyway.
 

tuxfool

Banned
Most people I have heard from like the Titan-style reference cooler. I will watch the video to see what he means.

The Titan (and 780 Ti) reference cooler was a very good blower cooler, but not everybody likes them. It was so effective because it used an expensive vapour chamber design. The 980 dropped the vapour chamber, primarily to cut costs and also because it didn't need it.

If they return to the vapour chamber then it has the potential to be quite quiet. However, blower coolers tend to be limited by the amount of air they can push, so when overclocking (where efficiency drops drastically) the cooler starts being unable to keep up with triple-fan open-air coolers (provided the case airflow is good).

Ultimately the guy only has a point if the cooler is inadequate for the TDP of the card, or if he is a heavy overclocker.
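
As a rough illustration of why airflow is the limit, here's a back-of-the-envelope estimate (my own made-up numbers, not from any review) of how much air a blower has to push to carry away a given amount of heat:

# rough airflow needed to carry away GPU heat (illustrative assumptions only)
rho_air = 1.2       # kg/m^3, density of air at room temperature
cp_air = 1005.0     # J/(kg*K), specific heat capacity of air
power_w = 250.0     # W of heat to remove (hypothetical card)
delta_t = 20.0      # K, assumed rise of exhaust air over intake (hypothetical)

flow_m3s = power_w / (rho_air * cp_air * delta_t)  # required airflow in m^3/s
flow_cfm = flow_m3s * 2118.88                      # same figure in CFM
print(round(flow_cfm, 1))                          # ~22 CFM under these assumptions

Push more heat through at the same exhaust temperature rise and the required airflow scales linearly, which is exactly where a single blower runs out of headroom against a triple-fan open-air design.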
 
The GTX 980 puts out 8.98% less heat (in watts) than the 290X under load. (Almost all power drawn gets converted to heat.)

http://images.anandtech.com/graphs/graph8526/67755.png

Also, on playing the game vs. a standard benchmark:

"Because we measure from the wall, this test means we’re seeing GPU power consumption as well as CPU power consumption, which means high performance cards will drive up the system power consumption numbers merely by giving the CPU more work to do.

For that reason, when looking at recent generation cards implementing GPU Boost 2.0 or PowerTune 3, we prefer to turn to FurMark as it essentially nullifies the power consumption impact of the CPU."
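
If anyone wants to redo that percentage themselves, the arithmetic is just this; the wattages below are placeholders, swap in the actual readings from the AnandTech graph:

# percent difference in load power draw (placeholder readings, not the real chart values)
watts_290x = 400.0   # hypothetical total system draw with the 290X
watts_980 = 364.0    # hypothetical total system draw with the GTX 980
pct_less = (watts_290x - watts_980) / watts_290x * 100
print(round(pct_less, 2))   # 9.0 with these made-up numbers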
 

Ruff

Member
Good lord, this sounds awesome. Makes me even more sad that my GPU is about to kick the bucket. The screen splits in two when I start playing games on it (left side now on the right, etc.) and flashes different coloured screens, and I won't be able to afford a new one for quite a while. At least I can still do non-gaming stuff on it for the time being.

I should really stop torturing myself by looking at these high-end specs :L
 
Oh my God, can I get a link? Also was it hard to do?



There are guides and the documentation is good.
It was the first time I'd done it, and it takes about an hour. You need to be careful with every single screw. But totally worth it. Heat and noise weren't the real problem... the GPU would drop its frequency to keep from cooking itself. And now? It can hold 1GHz. Considering it's rated to reach 90°C, sitting at 75°C still leaves a good margin.
 

Durante

Member
The GTX 980 puts out 8.98% less heat (in watts) than the 290X under load. (Almost all power drawn gets converted to heat.)

http://images.anandtech.com/graphs/graph8526/67755.png

Also, on playing the game vs. a standard benchmark:

"Because we measure from the wall, this test means we’re seeing GPU power consumption as well as CPU power consumption, which means high performance cards will drive up the system power consumption numbers merely by giving the CPU more work to do.

For that reason, when looking at recent generation cards implementing GPU Boost 2.0 or PowerTune 3, we prefer to turn to FurMark as it essentially nullifies the power consumption impact of the CPU."
That's really stupid. FurMark does not give you any idea about actual power consumption during normal usage.

As you can see here, in a game test a reference 290X consumes 85 watts more than a reference 980, resulting in 35% higher overall system power consumption.
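
Working backwards from those two figures, purely as arithmetic on my part (the exact chart values may differ slightly):

# implied total system draws if +85 W corresponds to a 35% increase (derived, approximate)
delta_w = 85.0
ratio = 1.35
base_980_system = delta_w / (ratio - 1)         # ~243 W with the reference 980
base_290x_system = base_980_system + delta_w    # ~328 W with the reference 290X
print(round(base_980_system), round(base_290x_system))   # 243 328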
 
That's really stupid. FurMark does not give you any idea about actual power consumption during normal usage.

As you can see here, in a game test a reference 290X consumes 85 watts more than a reference 980, resulting in 35% higher overall system power consumption.
Hey Durante, off topic, but what kind of PC configuration do you think will be able to run The Witcher 3 on high at 1080p and 60fps? I don't know if I should get the 980 or if getting a new Titan is overkill... Can you help!?
 

mkenyon

Banned
Hey Durante, off topic, but what kind of PC configuration do you think will be able to run The Witcher 3 on high at 1080p and 60fps? I don't know if I should get the 980 or if getting a new Titan is overkill... Can you help!?
Wait for performance reviews. Anything else is speculation.
 

JAYSIMPLE

Banned
Is it worth buying a 980 at all? Say I wanted 4K at some point? Has the 4GB been filled at 4K by most games?

I can't afford this card :(
 

Kezen

Banned
Is it worth buying a 980 at all? Say I wanted 4K at some point? Has the 4GB been filled at 4K by most games?

I can't afford this card :(

I think this is not a very good buy if you game at 4K. It shows its limits already and future games will certainly ask even more of it.

It's a killer card for 1080p gaming. Some would say it's almost overkill, but I'd disagree with that because some very impressive games are coming out and I can see a 980-grade GPU being required for maximum settings and 40-60fps. Far Cry 4 is not locked at 60fps with a 980.
 
Hey Durante, off topic, but what kind of PC configuration do you think will be able to run The Witcher 3 on high at 1080p and 60fps? I don't know if I should get the 980 or if getting a new Titan is overkill... Can you help!?

The only statement we currently have on performance is that a 980 runs the high preset at 1080p/30. The Titan X is ~35% faster than a 980.
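
If you naively scale that figure linearly (which real games won't follow exactly), you land somewhere around:

# naive linear scaling of the quoted 1080p/30 result by the ~35% gap (rough estimate only)
base_fps = 30.0
titan_x_speedup = 1.35
print(round(base_fps * titan_x_speedup))   # ~40 fps, still well short of a locked 60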
 

dr_rus

Member
Is it worth buying a 980 at all? Say I wanted 4K at some point? Has the 4GB been filled at 4K by most games?

I can't afford this card :(

The 980 won't be enough for 4K. The Titan X may be usable, but I wouldn't bet on it being able to run 4K @ 60Hz most of the time either.
 

Shozuki

Member
I hope The Witcher 3 supports SLI. Hopefully that'll do 60fps @ 3440x1440?

I wanna get two of these to replace my R9 295X2s.
 

Kezen

Banned
[image]
 
I'm actually gonna wait for the Witcher 3 benchmarks to get my new PC.

I'll play at 900p/30 on PS4, then enjoy it again when I get my new PC. Well, that's the plan anyway.

I'm hoping AMD's card releases soon after, if only to get a 980 or an upgrade at a reduced price (I expect the Titan to be crazy expensive).
 

Tablo

Member
Damn, the 390X is going to be a beast. Once Nvidia releases a realistically priced card around that level of performance, I'm in.
670 am cryyy
 