
GTX 1070 is slightly faster than a Titan X and GTX 980 Ti, reveals 3DMark benchmark

I think I'll go with a factory-OC'd 1080, coming from my 670s. I want a huge boost and that would give the most, and I have no intention to SLI again; it's more hassle than it's worth.
 

Teeth

Member
Should I expect this card to manage 1440p@60fps for most games? I'm sick of playing at sub-native, and I need my monitor for work.

Check existing 1440p benchmarks for the 980Ti for a variety of games...that'll be around what you'll get. Add a few more frames if you're feeling optimistic.
 

Ulldog

Member
I'm sticking with my 970, but it's really impressive that they keep improving like this. Is there any risk of them ever hitting a limit?
 

Queen of Hunting

Unconfirmed Member
Are 4K monitors actually worth buying currently if nothing can handle them?
 

Chiggs

Member
Is this true? Maybe I shouldn't regret my 970 purchase from a couple months ago after all.

HBM2 cards are going to kick the ever-loving shit out of anything released this summer--regardless of brand.

If you're building a new PC right now and need a card, of course the 1070 and 1080 (and perhaps Polaris) are the way to go. There should be no buyer's remorse.

That said, for folks who own 970s/980s or 290X/390X and higher, why? Knowing that the real standard (HBM2) is not that far off, I just don't see how dropping serious cash right now is a great idea. Of course, if money is no object then who cares.

GDDR5x is a stop-gap solution, period. Nothing more.

Edit: remember the first HDTVs that were only 1080i and featured a 4:3 format and were still CRT? That's what GDDR5x is.
 

Durante

Member
HBM2 cards are going to kick the ever-loving shit out of anything released this summer--regardless of brand.
Why would they? I mean, cards released a year from now will naturally be faster than cards released right now, and chips with a larger die area will naturally be faster than chips with a smaller one.

But "kicking the ever-loving shit out of anything" sounds more extreme to me, and I just can't fathom why that would happen. If you have 4 times the memory bandwidth that is great and all, but if in practice your GPU is only bandwidth limited by 10-30% in the vast majority of gaming use cases then this will only manifest in a 10-30% performance increase. E.g. a Fury has almost 60% higher bandwidth than a 980ti, and yet it's still significantly slower in most games.

I feel like many people are forgetting in their euphoria over HBM that memory bandwidth is just a means to an end, and not an end in and of itself.
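
A quick sanity check of that reasoning, as an Amdahl-style model: extra bandwidth only accelerates the slice of each frame that is actually bandwidth-bound. A minimal Python sketch, where the bandwidth-bound fractions are illustrative assumptions rather than measured data:

```python
# Back-of-the-envelope Amdahl-style model: extra memory bandwidth only
# speeds up the fraction of frame time that is actually bandwidth-bound.
# The fractions below are illustrative assumptions, not measured data.

def speedup(bw_bound_fraction: float, bw_multiplier: float) -> float:
    """Overall speedup when only `bw_bound_fraction` of the frame scales
    with bandwidth and bandwidth improves by `bw_multiplier`."""
    return 1.0 / ((1.0 - bw_bound_fraction) + bw_bound_fraction / bw_multiplier)

# 4x the bandwidth, but a game that is only 10-30% bandwidth-limited:
for frac in (0.10, 0.20, 0.30):
    print(f"{frac:.0%} bw-bound, 4x bandwidth -> {speedup(frac, 4.0):.2f}x overall")
# -> 1.08x, 1.18x, 1.29x: roughly the 10-30% gain described above
```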
 

DieH@rd

Banned
For the majority of use cases, modern GPUs are not memory-bandwidth starved. HBM is all nice and pretty, but mostly it is used as a brand to promote the "next great thing".
 
Why would they? I mean, cards released a year from now will naturally be faster than cards released right now, and chips with a larger die area will naturally be faster than chips with a smaller one.

But "kicking the ever-loving shit out of anything" sounds more extreme to me, and I just can't fathom why that would happen. If you have 4 times the memory bandwidth that is great and all, but if in practice your GPU is only bandwidth limited by 10-30% in the vast majority of gaming use cases then this will only manifest in a 10-30% performance increase. E.g. a Fury has almost 60% higher bandwidth than a 980ti, and yet it's still significantly slower in most games.

I feel like many people are forgetting in their euphoria over HBM that memory bandwidth is just a means to an end, and not an end in and of itself.



Yup. What will always matter first is the number of cores and the clockspeed.
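
For the rough paper math behind "cores and clockspeed": FP32 throughput is cores × clock × 2 (one fused multiply-add per core per cycle). A small Python sketch using rated spec-sheet boost clocks; real-world GPU Boost clocks run higher, and paper TFLOPS don't map one-to-one onto frame rates, so treat these as ballpark figures:

```python
# Paper FP32 throughput = shader cores x clock (GHz) x 2 ops (one FMA
# per core per cycle). Rated boost clocks from the spec sheets; real
# GPU Boost clocks run higher, so these are ballpark figures only.

def tflops(cores: int, clock_ghz: float) -> float:
    return cores * clock_ghz * 2 / 1000.0

print(f"GTX 1070:   {tflops(1920, 1.683):.1f} TFLOPS")  # ~6.5
print(f"GTX 980 Ti: {tflops(2816, 1.075):.1f} TFLOPS")  # ~6.1
print(f"Titan X:    {tflops(3072, 1.075):.1f} TFLOPS")  # ~6.6
```

Which is how a 1070 can trade blows with the big Maxwell chips despite far fewer cores: the clock speed makes up the difference.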
 

Chiggs

Member
E.g. a Fury has almost 60% higher bandwidth than a 980 Ti, and yet it's still significantly slower in most games.

Even though it's all we really have to go by, I'm not sure how comparing two very different GPUs is a great example of GDDR5 vs HBM. The real test, of course, is a 1080 with GDDR5x against one with HBM2. Apples to apples.

HBM2 will likely cause a shift that sees high-end cards equipped with HBM2 and cheaper versions equipped with GDDR5x.
 

dr_rus

Member
HBM2 cards are going to kick the ever-loving shit out of anything released this summer--regardless of brand.

If you're building a new PC right now and need a card, of course the 1070 and 1080 (and perhaps Polaris) are the way to go. There should be no buyer's remorse.

That said, for folks who own 970s/980s or 290X/390X and higher, why? Knowing that the real standard (HBM2) is not that far off, I just don't see how dropping serious cash right now is a great idea. Of course, if money is no object then who cares.

GDDR5x is a stop-gap solution, period. Nothing more.

Edit: remember the first HDTVs that were only 1080i and featured a 4:3 format and were still CRT? That's what GDDR5x is.
This is ridiculous. HBM is just a memory standard which provides more bandwidth; it's not any more "real" than GDDR5X or DDR4 or HMC. If some other memory standard (GDDR5X, for example) is able to provide the same effective bandwidth at lower cost, then it's very easy to spot the better option - unless your brain was eaten by AMD's HBM hype a year ago.

Even though it's all we really have to go by, I'm not sure how comparing two very different GPUs is a great example of GDDR5 vs HBM. The real test, of course, is a 1080 with GDDR5x against one with HBM2. Apples to apples.

There will never be a 1080 (or any chip for that matter) with support for both HBM and GDDR5.
 

Chiggs

Member
This is ridiculous. HBM is just a memory standard which provides more bandwidth; it's not any more "real" than GDDR5X or DDR4 or HMC. If some other memory standard (GDDR5X, for example) is able to provide the same effective bandwidth at lower cost, then it's very easy to spot the better option - unless your brain was eaten by AMD's HBM hype a year ago.

HBM2 will become the premier industry standard. The only reason GDDR5X is being utilized is that it's available and it turned out better than even Micron expected. This has nothing to do with AMD, because Nvidia will make the switch too.

There will never be a 1080 (or any chip for that matter) with support for both HBM and GDDR5.

I'm trying to illustrate that an apples-to-apples comparison is the real way to gauge performance differences between the two standards. Hence why I said "apples to apples."
 

dr_rus

Member
HBM2 will become the premier industry standard. The only reason GDDR5X is being utilized is that it's available and it turned out better than even Micron expected. This has nothing to do with AMD, because Nvidia will make the switch too.
Well, sure, it might, once games actually require that much bandwidth, which isn't the case right now; at this very moment HBM is just money down the drain. The whole hype around it has everything to do with AMD, as there is no physical evidence that HBM is any better than GDDR5, let alone GDDR5X, for gaming uses currently.

I'm trying to illustrate that an apples-to-apples comparison is the real way to gauge performance differences between the two standards. Hence why I said "apples to apples."

You can put whatever you want, but the simple fact is that Fury, with HBM _and_ the DCC improvements of GCN3, isn't that much faster than Hawaii with GDDR5 (not even 5X) and a previous version of AMD's DCC. This should already tell you a lot.

Modern games are optimized for 1080p console bandwidth figures, and while that lasts, the gobs of bandwidth provided by HBM are just not necessary; GDDR5X will do fine for the time being. If you expect big jumps from future HBM GPUs over their GDDR5X counterparts, you're likely to be disappointed. HBM is just as much of an incremental increase as GDDR5X is.
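
For reference, the peak numbers behind that argument are simple to compute: bandwidth is bus width times per-pin data rate, divided by 8 to get bytes. A quick Python sketch using the published spec-sheet figures for the cards mentioned in this thread:

```python
# Peak memory bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8.
# Published spec-sheet figures for the cards discussed in this thread.

def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits * gbps_per_pin / 8.0

print(f"Fury X   (HBM,    4096-bit @  1 Gbps): {bandwidth_gb_s(4096,  1.0):.0f} GB/s")  # 512
print(f"R9 290X  (GDDR5,   512-bit @  5 Gbps): {bandwidth_gb_s(512,   5.0):.0f} GB/s")  # 320
print(f"980 Ti   (GDDR5,   384-bit @  7 Gbps): {bandwidth_gb_s(384,   7.0):.0f} GB/s")  # 336
print(f"GTX 1080 (GDDR5X,  256-bit @ 10 Gbps): {bandwidth_gb_s(256,  10.0):.0f} GB/s")  # 320
```

Note how GDDR5X on a narrow 256-bit bus lands in the same range as GDDR5 on a 512-bit one, which is the whole point of the standard.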
 

mattiewheels

And then the LORD David Bowie saith to his Son, Jonny Depp: 'Go, and spread my image amongst the cosmos. For every living thing is in anguish and only the LIGHT shall give them reprieve.'
So what's considered an ideal 4K setup if this isn't ideal? Is it always an SLI setup with two cards that people are using, or is there a single card that does the job well?
 

mcz117chief

Member
I think I will wait until at least a 1270. I did just buy a 970 this year, and it will definitely last me at least 3-4 years.
 
Well, sure, it might, once games actually require that much bandwidth, which isn't the case right now; at this very moment HBM is just money down the drain. The whole hype around it has everything to do with AMD, as there is no physical evidence that HBM is any better than GDDR5, let alone GDDR5X, for gaming uses currently.

You can put whatever you want, but the simple fact is that Fury, with HBM _and_ the DCC improvements of GCN3, isn't that much faster than Hawaii with GDDR5 (not even 5X) and a previous version of AMD's DCC. This should already tell you a lot.

Modern games are optimized for 1080p console bandwidth figures, and while that lasts, the gobs of bandwidth provided by HBM are just not necessary; GDDR5X will do fine for the time being. If you expect big jumps from future HBM GPUs over their GDDR5X counterparts, you're likely to be disappointed. HBM is just as much of an incremental increase as GDDR5X is.
What about the room HBM leaves for larger chips? Will that make a significant difference eventually?
 
R9 290X to the 1070. Worth it??

Nobody can tell you if it's worth it or not; it's entirely subjective. Some people upgrade even for 30% better performance, which I think is almost useless, but that's me. Others will wait for at least 2-2.5x the performance, which usually means at least two generations. In this case we are talking about around 60%, probably a bit more. Does that seem good to you? If yes, then it's worth it.
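
If you want to run that math yourself, it's a one-liner; the scores below are hypothetical placeholders, so substitute averaged results from whatever reviews you trust:

```python
# Toy "is it worth it" arithmetic: relative uplift between two cards.
# The example scores are hypothetical placeholders, not benchmark data.

def uplift(old_score: float, new_score: float) -> float:
    return new_score / old_score - 1.0

print(f"R9 290X -> GTX 1070: {uplift(100.0, 160.0):.0%} faster")  # the ~60% above
```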
 
I'm still limping along on a 7870, and I've been holding off on an upgrade about as long as I can manage. I was thinking about picking up a second-hand 970 once those started raining from the sky, but with this kind of performance out of the 1070, I might just have to pony up the extra cash and step up to the new hotness. I'll have to wait a little while longer for these Founders Edition shenanigans to die down, but it seems like it may be worth it.
 

dr_rus

Member
Reposting from one of 1080 threads:

[image]


Means that 1070 reviews should be coming soon.

1070 PCB: NVIDIA GeForce GTX 1070 PCB pictured

[image: NVIDIA GeForce GTX 1070 PCB]


Seems like one less power phase than the 1080; otherwise pretty much the same, save for some wiring differences.
 