
Nvidia Launches GTX 980 And GTX 970 "Maxwell" Graphics Cards ($549 & $329)

My feeling on card upgrades is to wait for a game I want to play but can't run well enough, then upgrade. There is no better feeling than going from shoddy performance to butter-smooth performance in a game I like.

Right now, my Titan just crushes everything, has 6GB of video memory for LotR, and has obliterated the need to render HD video in Adobe Premiere (only crazy effects need to be rendered). I bet it's going to play Witcher 3 incredibly well too. I can't see myself upgrading from it for a long while.
 

Kieli

Member
why not wait for 20nm cards?

Because we won't know when nVIDIA is going to feel generous again with their pricing. And I certainly will not bet on them giving us as amazing a performance/dollar ratio with their shiny new manufacturing process and architecture.

Not unless AMD pulls a miracle (which I highly doubt....).
 

CentroXer

Banned
Because we won't know when nVIDIA is going to feel generous again with their pricing. And I certainly will not bet on them giving us as amazing a performance/dollar ratio with their shiny new manufacturing process and architecture.

Not unless AMD pulls a miracle (which I highly doubt....).

I'll buy whoever comes out with the 20 nm cards first. Be it Nvidia or AMD.
 

Evo X

Member
Just got a 780 Ti and I'm not that pleased with how much it can OC... only got the core up 50MHz before artifacts in 3DMark Fire Strike. Does flashing the BIOS affect OCability? Should I do this?

I'm aware it unlocks more voltage, but I don't think I want to up the voltage too much. Will it let me OC higher at each given voltage level?

I'm annoyed because I bought a really good aftermarket cooler. GPU temp is not bottlenecking me at all but the card just doesn't want to go as far as I hoped.

Flashing the BIOS with Skynet allows higher voltages along with the ability to lock the OC clock, meaning it's not randomly boosting to wherever it wants. The OC you set is the OC it will clock to in all games. Upside is greater stability, downside is higher power usage.

Additional voltage should not affect the life of the card too much. The way I see it, any degradation wouldn't be a problem because I would upgrade the card WAY before it starts to matter. For example, if at stock voltages, the card could last 10 years, but overclocked to the max, it only lasts 8 years. That's perfectly fine to me, because by that time, this card will be ancient history and probably equivalent to what's available in smart phones by then. lol.

Just be aware that higher voltages don't always allow higher clocks. You might have just gotten a bum chip that can't go much over stock. That's just the way it works out sometimes. The only thing the manufacturer guarantees is that the card remains perfectly stable at default clocks. Anything extra is just a bonus.
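Dialing in an offset like this is basically a manual search: bump the clock, stress test, repeat until it artifacts. A toy sketch of that loop — the `is_stable` callback here is purely hypothetical, standing in for a real stress run (e.g. looping Fire Strike and watching for artifacts):

```python
def max_stable_offset(is_stable, lo=0, hi=200, step=25):
    """Walk up in `step` MHz increments until instability hits;
    return the last offset that passed the stability check."""
    best = lo
    offset = lo
    while offset <= hi:
        if is_stable(offset):
            best = offset
        else:
            break  # first unstable step: stop, keep the last good offset
        offset += step
    return best

# Example: a chip that artifacts past +50 MHz, like the 780 Ti above.
print(max_stable_offset(lambda mhz: mhz <= 50))  # prints 50
```

In practice each `is_stable` check is a long benchmark run, which is why people settle for coarse 25MHz steps rather than a fine-grained search.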
 

Makareu

Member
why not wait for 20nm cards?

You will always have something to wait for.
The excitement around these cards, and especially the 970, comes from the consensus that this bundle of performance, power consumption, ecosystem and price is a rare occurrence.

I am half expecting Nvidia to have a press release on how well it sells.
 

studyguy

Member
Feel like I should upgrade my shit soon.
970 first though prob. I have a 2500k OC'd and that honestly still feels okay to me, but idk how well it fares against other CPUs right now. Rolling a 680 atm so it's probably a better bang for my buck to just move up to the 970 first before dealing with the hassle of a new Mobo for a CPU upgrade.
 

garath

Member
why not wait for 20nm cards?

The old adage in PC gaming: if you're always waiting for the next best thing, you'll be waiting forever.

There's ALWAYS something better in the pipe. The best you can do is try to find a good price/performance ratio that works for your budget and your gaming needs and upgrade on a cycle that keeps you playing the games you want to play. For some that's upgrading to the very best every single cycle, for others it's a 2-3 generational jump.

You can anticipate big changes around the corner if you pay attention (for example, I would not have bought a new video card in the last few months given the Nvidia rumors and leaks), but sometimes it's impossible to tell what the next cycle will bring in performance, price or timing.

This is a weird time, with new games trying to push the envelope a bit because of the new consoles. This new Nvidia card is mostly just a refresh, but the real key is the price. This is fantastic performance for the money, and hard to ignore. I personally don't like spending over $300 on a video card, certainly not $600+. The 970 is a real good value.
 

kharma45

Member
To scratch my itch while I wait.


Will the H81 need a BIOS update to boot with the G3258? I don't own a Haswell to do that.

I don't think so, it's not a DC chip.

I'll buy whoever comes out with the 20 nm cards first. Be it Nvidia or AMD.

Rumours are AMD is going 20nm; Nvidia wants to go 16nm.

Feel like I should upgrade my shit soon.
970 first though prob. I have a 2500k OC'd and that honestly still feels okay to me, but idk how well it fares against other CPUs right now. Rolling a 680 atm so it's probably a better bang for my buck to just move up to the 970 first before dealing with the hassle of a new Mobo for a CPU upgrade.

2500K is fine.
 

Zafir

Member
Does anyone know if voltage overclocking voids the MSI warranty? I mean considering they give you the app that does it with the card you'd think not, but I know some companies have been a bit funny with it in the past.
 

Evo X

Member
Does anyone know if voltage overclocking voids the MSI warranty? I mean considering they give you the app that does it with the card you'd think not, but I know some companies have been a bit funny with it in the past.

As long as you don't flash the BIOS or do hard modding, it does not void the warranty. Since it's all software based, they wouldn't even know you messed with voltages unless you outright told them.
 

Zafir

Member
As long as you don't flash the BIOS or do hard modding, it does not void the warranty. Since it's all software based, they wouldn't even know you messed with voltages unless you outright told them.

True enough, thanks for the response.
 

derExperte

Member
As long as you don't flash the BIOS or do hard modding, it does not void the warranty. Since it's all software based, they wouldn't even know you messed with voltages unless you outright told them.

Afaik there are ways they can find out, some kind of log or parts that get 'marked' when you overclock/overvolt. Don't know if they do this or care in this case though, most likely not since these cards are clearly marketed towards overclockers.
 
Then again, the 980 is $220 more than the 970. That's almost another video card's worth of price difference for only a 10% or so performance increase. I think you'd cry a lot less if you feel like you got a great performer that you want to replace in a couple years for $330 rather than for $550.

I genuinely don't believe that extra performance out of the 980 is going to buy you any additional time over the 970 - especially for that price. Just set that $220 aside toward your next card and call it a win.

Or just go 970 SLI and get a huge performance boost over a single 980.
 
Or just go 970 SLI and get a huge performance boost over a single 980.

*if the game supports SLI and if microstuttering isn't a problem*.

I'm SLIing, obviously, but let's not just throw that out there as if it's a given. You aren't playing Dead Rising 3 or The Evil Within with better framerates on SLI 970s than on a single 980.
 
*if the game supports SLI and if microstuttering isn't a problem*.

I'm SLIing, obviously, but let's not just throw that out there as if it's a given. You aren't playing Dead Rising 3 or The Evil Within with better framerates on SLI 970s than on a single 980.

No, but most games with broken SLI have inherent problems anyway and won't do better on a 980 compared to a 970 either way. (Dead Rising, Watch Dogs, AssCreed, etc.). Not sure what's the deal with Evil Within.

The 980 just seems like bad bang for the buck right now. If you really need top-notch single-card performance and have money to burn, better to wait for the Titan 2.
 

AmyS

Member
GTX 980 May Lose The Performance Crown To AMD’s R9 380X in February 2015

This leak comes from overclockers.ru and it pertains to AMD’s next generation of graphics cards. Yesterday AMD revealed some very interesting information to journalists in Japan. During AMD’s annual event in the country this year, company officials divulged a number of exciting plans to 4gamer.net journalists in a round table discussion.

According to the same folks at overclockers.ru, AMD has three new graphics cards in the works: the R9 290X successor based on the Bermuda GPU, the R9 380X based on Fiji and the R9 370X based on Treasure Island. The R9 380X should be released by February next year if everything goes according to plan.

The new series of graphics cards is supposed to be the first ever to feature TSMC's new 20nm manufacturing technology as well as 3D stacked HBM memory. The R9 390X reference design is also rumored to feature a form of AMD's hybrid "Hydra" liquid cooling currently reserved for AMD's flagship R9 295X2.

http://wccftech.com/gtx-980-lose-performance-crown-amds-r9-380x-febuary-2015/


20nm you can pretty much count on, but 3D stacked HBM memory also? It's known that AMD (as well as Nvidia) have been working on 3D stacked RAM for quite some time.

Will be really interesting if AMD can bring some strong competition early next year.
 
While I don't understand how AMD could take back the crown at all without 20nm (their chips are already too hot and huge at 28nm), I was under the impression that TSMC 20nm was never going to be a thing for GPU manufacturers.
 

AmyS

Member
While I do not understand how AMD could at all take back the crown without 20nm (they are just too hot and huge at 28nm already), I was under the impression the TSMC 20nm was never going to be a thing for GPU manufacturers.

Perhaps TSMC 20nm is just a stopgap until 16nm FinFET?
 

Kevyt

Member
http://wccftech.com/gtx-980-lose-performance-crown-amds-r9-380x-febuary-2015/


20nm you can pretty much count on, but 3D stacked HBM memory also? It's known that AMD (as well as Nvidia) have been working on 3D stacked RAM for quite some time.

Will be really interesting if AMD can bring some strong competition early next year.

Ah yes, from the same website that reported on a rumor that "AMD would be releasing a 12-core Phenom CPU on a 25nm process at 6 GHz..." over a year ago...

So I'd take their "leaks" with a grain of salt.
 

ktroopa

Member
If DSR runs a game at a higher res, say 1440p on a 1080p monitor, do I still need to bother with AA settings in-game? The higher res naturally eliminates jaggies, so is it best to turn that stuff off to keep fps high? What impact, if any, does DSR have on fps in general?
 

Addnan

Member
If DSR runs a game at a higher res, say 1440p on a 1080p monitor, do I still need to bother with AA settings in-game? The higher res naturally eliminates jaggies, so is it best to turn that stuff off to keep fps high? What impact, if any, does DSR have on fps in general?
AA still helps. DSR is probably the best method of AA, but some sort of extra on top is always a nice bonus. As for the impact, it's as if your computer is rendering at 1440p or higher, so fps will naturally be lower.
 

Xyber

Member
If DSR runs a game at a higher res, say 1440p on a 1080p monitor, do I still need to bother with AA settings in-game? The higher res naturally eliminates jaggies, so is it best to turn that stuff off to keep fps high? What impact, if any, does DSR have on fps in general?

Yes, even when downsampling from a really high resolution there can be aliasing. But adding FXAA or SMAA on top of that usually works great and it doesn't blur the image like it does at 1080p.
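The fps cost is roughly what you'd expect from the extra pixels. A back-of-the-envelope sketch (assuming render time scales linearly with pixel count, which ignores CPU-bound and bandwidth-bound cases; the DSR factors multiply pixel count, not edge length):

```python
def dsr_resolution(base_w, base_h, factor):
    """Internal render resolution for a DSR pixel-count factor
    (e.g. 4.0x on 1080p renders at 3840x2160)."""
    scale = factor ** 0.5  # factor applies to the pixel COUNT
    return round(base_w * scale), round(base_h * scale)

def estimated_fps(base_fps, factor):
    """Naive estimate: fps drops in proportion to pixel count."""
    return base_fps / factor

for factor in (1.78, 2.25, 4.00):
    w, h = dsr_resolution(1920, 1080, factor)
    print(f"{factor}x DSR -> ~{w}x{h}, ~{estimated_fps(60, factor):.0f} fps from a 60 fps base")
```

So 4x DSR on a 1080p panel is the same workload as a 4K monitor; the reason to still add cheap post-AA (FXAA/SMAA) on top is that downsampling reduces but never fully removes shader and specular aliasing.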
 
I like watching interviews with Tom because he is a master of answering a question without actually answering the question. He could have been a politician.

Yeah I think I would like talking to the guy but it is painful how he has to go through the motions of PR all the time. Ryan sure has a gut of steel.
 
im not seeing dsr in any control panel menus, how do i enable it?

edit - found it in the global 3d settings. surprised you cant enable it on a per game basis

edit 2- nm i realize how it works now
 
Do you people really think we're gonna get 20nm AMD/Nvidia cards when we didn't get the half step between 40/28, and neither TSMC nor GlobalFoundries even has a high-power 20nm process? I wish I were that optimistic...
 
No, but most games with broken SLI have inherent problems anyway and won't do better on a 980 compared to a 970 either way. (Dead Rising, Watch Dogs, AssCreed, etc.). Not sure what's the deal with Evil Within.

980 just seems like a bad bang for the buck right now. If you really need top notch single card performance and have money to burn, better wait for Titan2.

The Evil Within is id Tech 5, like Rage and the most recent Wolfenstein. The engine isn't compatible with SLI at all for some reason.
 
I like watching interviews with Tom because he is a master of answering a question without actually answering the question. He could have been a politician.

Indeed. Another reason why I like him so much is that the dude is passionate as hell. You can tell when you watch him speak about their products and their teams that he is incredibly proud of what they do, and he always has such enthusiasm. He also always has slides that make visualising complicated processes much simpler.
 

Hixx

Member
im not seeing dsr in any control panel menus, how do i enable it?

edit - found it in the global 3d settings. surprised you cant enable it on a per game basis

You don't need to, all the setting does is allow you to choose higher resolutions in the game settings.
 

Evo X

Member
No, but most games with broken SLI have inherent problems anyway and won't do better on a 980 compared to a 970 either way. (Dead Rising, Watch Dogs, AssCreed, etc.). Not sure what's the deal with Evil Within.

980 just seems like a bad bang for the buck right now. If you really need top notch single card performance and have money to burn, better wait for Titan2.

If you take price out of the equation, I think it's a little disingenuous to say that 970 is about the same as 980. Some people just want the fastest single GPU, and the 980 is a good 20-25% faster than the 970 in most games. That's definitely noticeable, especially in the most demanding situations.

I bought a 980 instead of waiting until Titan 2 because I wanted the best NOW, not later. If/when that card comes out, I'll see if it's worth upgrading.

Sure the 980 isn't as aggressively priced as the 970, but it's still not bad, especially compared to the launch prices of Titan and 780Ti.

Even if I were to do SLI, I'd much prefer 980s to 970s. That way, even in a worst-case scenario where I can only utilize one card, I know it will provide the best possible performance money can buy at this time.
 
Yeah, the price increase from 970 -> 980 is definitely not proportionate to the performance increase, but it's still a more powerful card, and not by a small margin. Just looking at the Guru3D reference model reviews...

16% faster in Battlefield 4
20% faster in Crysis 3
20% faster in Hitman Absolution
21% faster in Metro Last Light...

I'm pretty happy with the 980. Especially considering the crazy resolutions I run. And also for those games with no SLI support.
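Putting rough numbers on that trade-off, using the launch prices from the thread title and the Guru3D-style uplift figures quoted above (back-of-the-envelope only, stock clocks):

```python
# Uplift of the 980 over the 970 in the four games listed above (percent).
uplifts = {
    "Battlefield 4": 16,
    "Crysis 3": 20,
    "Hitman Absolution": 20,
    "Metro Last Light": 21,
}
avg_uplift = sum(uplifts.values()) / len(uplifts)

price_970, price_980 = 329, 549  # launch MSRPs in USD
premium = price_980 - price_970

print(f"average uplift: {avg_uplift:.1f}%")
print(f"price premium: ${premium} ({premium / price_970:.0%} more money)")
print(f"dollars per extra percent of performance: ${premium / avg_uplift:.0f}")
```

By this crude measure you pay about two-thirds more money for roughly a fifth more performance, which is the whole "bad bang for the buck" argument; whether the last 20% is worth it is exactly the single-fastest-GPU question the thread is arguing about.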
 
If you take price out of the equation, I think it's a little disingenuous to say that 970 is about the same as 980. Some people just want the fastest single GPU, and the 980 is a good 20-25% faster than the 970 in most games. That's definitely noticeable, especially in the most demanding situations.

I bought a 980 instead of waiting until Titan 2 because I wanted the best NOW, not later. If/when that card comes out, I'll see if it's worth upgrading.

Sure the 980 isn't as aggressively priced as the 970, but it's still not bad, especially compared to the launch prices of Titan and 780Ti.

Even if I were to do SLI, I'd much prefer 980s to 970s. That way, even in a worst-case scenario where I can only utilize one card, I know it will provide the best possible performance money can buy at this time.
That was my thinking too. No regrets so far.
 

BeforeU

Oft hope is born when all is forlorn.
I am going to hold out for a 20-nm card though. My 280x will carry me for another year.

My next card is probably going to have 6GB of VRAM though. I'll settle for no less than 4GB.

By that time, games will demand more VRAM. There is really no right time to buy; it will always get better if you hold off.
 

derExperte

Member
If you take price out of the equation, I think it's a little disingenuous to say that 970 is about the same as 980. Some people just want the fastest single GPU, and the 980 is a good 20-25% faster than the 970 in most games. That's definitely noticeable, especially in the most demanding situations.

That's stretching it a bit, though; comparing stock clocks, the average difference seems to be more like 15%, less in some games, more in others.
 

mkenyon

Banned
I would still prefer nvidia. I doubt that AMD can reduce the noise and power usage like nvidia did... AMD cards are so power hungry...
AMD has been pretty spot on in terms of power consumption. The 290 and 290X were pretty exceptional in that regard.

The recent 285 was right alongside the 760/770, which it competes with:

[power consumption comparison chart]
 

Momentary

Banned
Whatever high-end GPU Nvidia is sporting when the Oculus consumer version is released will be mine. Oh yes... Any guesses?

I can see it now... $100 off the GTX1090 with your Oculus Rift purchase.

Anyway, I'm surprised that a GPU thread is still getting posts after all this time. Kinda weird seeing this on NeoGAF. Times, they are changing.
 

Daingurse

Member
How does it run with +150MHz?

Here is what I get with my Gigabyte (+150 Core, +500 Memory, +0 mV, +112 Power): 1530MHz boost, stable, and it doesn't go over 55-60° at full load:

Mine is unstable as fuck at +150MHz; I even get lower Fire Strike scores when I try to clock it that high. Wish mine ran that cool. Mine's like 65-68° at full load. I rarely have luck with the silicon lottery lol.
 