
State of the GPU market summed up in one video

I have a feeling that the current gen consoles and their performance level didn't help things either. There is little reason for Nvidia to come up with a home run like the 8800gt when even the lowest-end cards provide console-level performance for peanuts.

I do think that VR will spark a new arms race though. High framerates and resolutions are a necessity for a good VR experience.

Yes. VR will definitely force these companies to change the way they talk about GPU performance and numbers. It doesn't sound so sexy when NV has to say "plays your favorite VR game on medium!" about a Titan.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Another reason not to buy an AMD machine.
Releasing a next-gen card with the same performance as their last gen. Laughable.

I know nVidia have some issues but it's much easier to buy a new nVidia card and know you're getting new tech and better performance.
 

WolvenOne

Member
The sort of naming schemes GPUs use now stems from when die shrinks happened practically every other year. We're in a situation where companies put out new lines of cards every 18 to 24 months, but without die shrinks or other new technology, each new line of cards isn't a clear improvement.

This should be alleviated next year, when we finally get a die shrink. Aside from optimizations, however, I wouldn't expect the old pace to reassert itself. With existing chip technology we only have a few die shrinks left, and each jump will take a while.
 

Bad_Boy

time to take my meds
A Titan X going for $1000+ right now is ridiculous.

I miss the days when $600 was the standard for the highest gaming card. And even then, that was a lot.
 
Isn't every type of chip across the board slowing down in improvements? CPU, GPU, laptop and mobile?

I will upgrade my 980 Tis when I see a good jump in performance.
 
Yes. VR will definitely force these companies to change the way they talk about GPU performance and numbers. It doesn't sound so sexy when NV has to say "plays your favorite VR game on medium!" about a Titan.

VR SLI will change things, though.

Rendering alternate frames is stupid in VR; being able to render one point of view per card will help a whole lot.
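
To sketch the difference (toy code only, not any real graphics API; the Gpu type and its draw calls are stand-ins made up for illustration): with alternate-frame rendering one card draws both eyes of frame N while the other is already on frame N+1, so half your frames arrive a frame late. Split the work by eye instead and both views of the same frame finish together:

```cpp
// Toy illustration of AFR vs. per-eye multi-GPU rendering for VR.
// "Gpu" and "draw" are stand-ins, not a real API.
#include <future>
#include <iostream>
#include <string>
#include <vector>

struct Gpu {
    std::string name;
    void draw(const std::string& view) const {
        std::cout << name << " renders " << view << '\n';
    }
};

// Alternate-frame rendering: one GPU owns an entire frame (both eyes),
// and whole frames alternate between cards. That adds a frame of latency,
// which is exactly what you can't afford in VR.
void render_frame_afr(const Gpu& gpu, int frame) {
    gpu.draw("frame " + std::to_string(frame) + ", left eye");
    gpu.draw("frame " + std::to_string(frame) + ", right eye");
}

// Per-eye split: each GPU renders one eye of the SAME frame in parallel,
// so you get close to 2x throughput with no extra motion-to-photon latency.
void render_frame_split(const std::vector<Gpu>& gpus, int frame) {
    auto left = std::async(std::launch::async, [&] {
        gpus[0].draw("frame " + std::to_string(frame) + ", left eye");
    });
    auto right = std::async(std::launch::async, [&] {
        gpus[1].draw("frame " + std::to_string(frame) + ", right eye");
    });
    left.get();   // wait for both eyes before compositing/presenting
    right.get();
}

int main() {
    std::vector<Gpu> gpus{{"GPU0"}, {"GPU1"}};
    render_frame_afr(gpus[0], 0);   // AFR: whole frame on one card
    render_frame_split(gpus, 1);    // VR-style: one eye per card
}
```

The per-eye split parallelizes so cleanly because both views share the same frame's simulation state and differ only by a small camera offset.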

Still, let's hope things get hot again in the competition between the two.
 

Kysen

Member
This video seemed to be a lot of AMD complaints and some vague references to ancient Nvidia rebrands. Expensive halo GPUs have always been around, so why the complaints now? There is a card at every price point now.

By no means is a 980Ti/FuryX required for a meaningful upgrade in a modern desktop.
 

Vex_

Banned
What I did to jump into PC gaming was look at the current consoles of this generation, translate that into PC parts, and buy the combination of parts that puts me just above that. This lets me play whatever consoles get, but at a better framerate, guaranteed across ~10 years. Not to mention it's cheaper than getting the very best parts. MUCH cheaper.

This is assuming the consoles last 10 years this time.

To be honest, you don't need much more than what consoles put out (PS4 is usually 30fps@1080p), but for a couple of bucks more, you can get 60fps@1080p or better.

But then again, I have been a console gamer all my life lol. What do I know :(
 

Piers

Member
I know nVidia have some issues but it's much easier to buy a new nVidia card and know you're getting new tech and better performance.


But generally, yeah, I agree with you
 

bj00rn_

Banned
I'm always interested in the truth, but then I need the whole truth, not some half-truths. There's a disconnect between the claimed "rebranding" and AMD (or Nvidia) spending money redesigning their chips/GPUs. Is it really "rebranding" if the card is more efficient and, for example, delivers better IQ? I thought "rebranding" meant taking the exact same product and wrapping it in a new name and packaging, not redesigning the architecture. I want to know the full story behind that, so I'm not just going to put all my trust in this rebranding theory from a single guy with a potential agenda (making clickbait videos; perhaps not, I'm just saying I want more data points).
 

Swass

Member
So according to this video we haven't seen many gains in several years... What would be the best card to purchase today, new or used, to get the most bang for the buck?
 
So according to this video we haven't seen many gains in several years... What would be the best card to purchase today, new or used, to get the most bang for the buck?

380/290/290X/970/980ti

Everything else is meh.


Is there any value to the video, or is it another case of someone being butthurt that the Titan costs $1000 when you get 99% of its performance from a GTX 980 Ti?
 
I've just got back into pc gaming after a long break. I paid about 5 times the cost of my console to play games at twice the framerate with better visuals. I enjoy it but there's no way I could argue it's not awful value for money.
 

Durante

Member
This is a stupid video, because its answer to the question "Why has AMD done this" is simplistic and, well, stupid.

The reason performance growth in the GPU market has slowed down somewhat is that one of the primary drivers of performance growth in all hardware is process shrinks, and we haven't seen a process shrink in GPUs in forever.

Hell, the GPU market is still in much better shape in this regard than the desktop CPU market, where you basically can't get a meaningful upgrade in performance at all. If anything, you should be applauding GPU makers for what they managed to get out of the 28nm process.
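
To put a rough number on what a shrink buys (back-of-envelope only; idealized scaling that ignores yield, power limits, and wafer cost):

```cpp
// Back-of-envelope: transistor density scales roughly with the inverse
// square of the feature size. Idealized; ignores yield, power, and cost.
#include <cstdio>

int main() {
    const double old_node = 40.0;  // nm (the 5870 generation's process)
    const double new_node = 28.0;  // nm (the 7970-through-980 Ti era)
    const double density_gain = (old_node / new_node) * (old_node / new_node);
    std::printf("40nm -> 28nm: ~%.1fx transistors per mm^2\n", density_gain);
    // Prints ~2.0x: the "free" generational jump that stops arriving
    // when the industry sits on one node for four years.
    return 0;
}
```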

I feel sorry for giving this tripe a click.

I've just got back into pc gaming after a long break. I paid about 5 times the cost of my console to play games at twice the framerate. I enjoy it but there's no way I could argue it's not awful value for money.
If you did that, then you made incredibly poor decisions in your component selection. You don't need to spend $2000 to build a PC which is twice as fast as the consoles, not by a long shot.
 
Have said this for a few years now.

People talking about how long 28nm has gone on are missing the point, although it doesn't help over such a long period.

The shit started with the 7970, the first 28nm card. AMD's previous top-end card, the 6970, launched at around $360; it was competing with the GTX 580 and priced below it. The 580 was also a DP card, like the GK110 Titan; it was the GF110, the last true top-end card released for a reasonable price.

AMD didn't do a good job in moving to 28nm. We should've had a 4870-to-5870-style performance jump, but all we got was a small bump. The 5870 launched at $400 and had huge gains thanks to moving to 40nm.

AMD priced the pathetic 7970 above the old-gen 580. $499 was the ceiling Nvidia had set with its huge DP cards, the 480 and 580. AMD priced their midrange 28nm effort at $550, and people were gladly paying $600-650 despite it being a poor effort. AMD didn't price it like they did the 5870. The 5870 had the performance crown but wasn't priced silly over old-gen crap; it actually replaced the old crap, as you'd expect, instead of slotting in above it.

Nvidia were unimpressed with the 7970, so they quickly brought in their midrange card, GK104; they saw that with some GPU Boost it could match the 7970 in some benches and beat it in others. How poor the 7970 was got further highlighted when it showed huge gains in driver updates months later, which is not typical.

When the 5870 launched, NVidia waited 6 months to launch the 480 and were forced to use the GF100. They usually wait to see what AMD can do. If AMD had done the job right and the 7970 had been the expected speed, then we would've seen NVidia wait 6 months and bring out GK110 to slightly beat it. NVidia said themselves they were expecting the 7970 to be much better.

Nvidia had the GK110 to launch later and decided not to destroy the market, pricing it against the current gen instead. Since AMD wanted to price a midrange card above an old-gen 580, NVidia's actual high-end card had to slot in front of that price-wise, and we started to see cards being slotted ahead instead of replacing. Nvidia had too much of a performance advantage, and sadly people ran out and bought 7970s for $650. Nvidia saw an opportunity to release different cards up to $1000 that wiped the floor with AMD's efforts.

The gains on 28nm from NVidia have actually been great, despite the node dragging on disappointingly long; just look at the OG Titan versus the Titan X. It comes down to the performance advantage NVidia had on 28nm over AMD, and AMD pricing their poor efforts too high.

The true successor to my 580 arrived under the guise of the OG Titan, not the GK104 680, but Nvidia could relax and not bother hurrying GK110 along, since AMD had released the 7970 on par with Nvidia's GK104 midrange.

Anyway, I bought a 580, then bought a GTX 970. I didn't fall into the trap of buying new midrange cards over and over, or fall for $700 cards. Perhaps I've been lucky, but most of the 28nm run was pure bullshit to me. The 970 is a nice stopgap, though it still has a caveat; it was a blessing after that awful run, cheap and powerful, and it's been a great buy almost a year on. The 780 Ti was perhaps the biggest joke of the 28nm run.

So yes, you can dodge most of it. A 580 and a 970: two cards over a 5-year span, and I'm still on my i7 930, which hasn't given me any problems so far. No need for new rigs all the time if you buy at the right time. There's always new stuff around the corner, but with each GPU gen you usually get a big bump; 28nm didn't have a good start, so you avoid it.

Should say I've no problem with people buying $1000 cards, that's your income. Just saying the upgrade nonsense can be dodged if you just look at the market.
 

LeleSocho

Banned
Can't see the video because I'm on a very limited connection, so I'll just say that around 2008 I bought an HD4830, which was the weakest high-end GPU ATI had at the time, for 80€ new.
Now the equivalent of that card, let's say Nvidia's 960, costs way, way more than that, and so will next year's...
This situation is really ridiculous.
 
I've just got back into pc gaming after a long break. I paid about 5 times the cost of my console to play games at twice the framerate with better visuals. I enjoy it but there's no way I could argue it's not awful value for money.

Then you completely overbuilt. You can build a PC at less than twice the cost of a current console that will do what you just said.
 

PnCIa

Member
Yep, that video was good. Rooting for a company is a bad idea anyways, and AMD is not better than Nvidia when given the chance.
 

AaronB

Member
Many valid points, but is roughly two years of little progress in graphics cards completely shocking when you think about the scope of human history? "Man, we're still using the same rock-sharpened spears we were using two years ago. Something's horribly wrong here."
 
Many valid points, but is roughly two years of little progress in graphics cards completely shocking when you think about the scope of human history? "Man, we're still using the same rock-sharpened spears we were using two years ago. Something's horribly wrong here."

That's a weird comparison. Technological progress is supposed to go faster over time, and the bigger deal is that we had constant progress on GPUs for many years and development is much slower now.

Not that I am too mad about what is happening; it isn't a result of lackluster investment or of them deliberately trying to halt progress. Although I do dislike how all the rebrands make the GPU market much less transparent.

Can't see the video because I'm on a very limited connection, so I'll just say that around 2008 I bought an HD4830, which was the weakest high-end GPU ATI had at the time, for 80€ new.
Now the equivalent of that card, let's say Nvidia's 960, costs way, way more than that, and so will next year's...
This situation is really ridiculous.

Well, there are rumors about a GTX 950 coming. And in the previous Nvidia generation we had the 750 Ti for €130 or so, and even the normal 750 for cheaper, but that one wasn't very popular. So I don't think it's too bad at all.
 

LeleSocho

Banned
Many valid points, but is roughly two years of little progress in graphics cards completely shocking when you think about the scope of human history? "Man, we're still using the same rock-sharpened spears we were using two years ago. Something's horribly wrong here."
Dumb analogy. In the early '00s the tech industry had incredible evolutions, sometimes in less than a year's time. Now we're at the point where the last process node change was in 2011 and we'll probably see another next year... That's a 5-year wait; compared to the decade prior, it's alarming. Not to mention that the jump won't even be as big as it could be, since the 14/16nm we'll see next year is in reality 20nm with FinFET.

Sure, engineers are fighting the limits of physics here, so it's not child's play, but it doesn't excuse the hardware makers' behaviour.
 

belmonkey

Member
Have the price drops for the old high-end GPUs been appreciable over the last few years? 3 years after the $550/$500 7970 and 680, we now have similar performance (380 and 960) at $200 with 4GB versions at $220. Doesn't seem too bad for the performance that used to be high-end a few years ago. Kind of a bummer though that even in 2013 it was possible to get a 7850 at under $150 and now in 2015 the R7 370 launched at $150.
 
Then you completely overbuilt. You can build a PC at less than twice the cost of a current console that will do what you just said.
Absolute nonsense. If you want a clean, tear-free image, with better visuals, on a fixed-refresh screen (i.e. a TV) now and over the *next couple of years*, you need lots of power. I know exactly how my PC performs. You're saying a sub-970-equipped PC is going to be putting out a reliable, tear-free 1080p60 at better-than-console visuals in a couple of years' time? Why exaggerate so much?
 
Basically, it's becoming increasingly difficult to keep doing process shrinks at a marketable cost, and those shrinks are what's needed to drive the next improvements. That's why Nvidia/AMD have been focusing on improving power efficiency and performance per watt instead: finding ways to do more with what they have until manufacturing costs for those smaller dies come down. Progress costs a lot.

That's what I know from a rudimentary search and doing my best to peruse information about the current state of die shrinks.
 
Absolute nonsense. If you want a clean, tear-free image, with better visuals, on a fixed-refresh screen (i.e. a TV) now and over the *next couple of years*, you need lots of power. I know exactly how my PC performs.

Although I don't think you could achieve quite the same at twice the cost of a console, you most definitely don't need to pay five times the price for what you are trying to achieve. An i5 + GTX 970 should already deliver that result.
 

LeleSocho

Banned
Well, there are rumors about a GTX 950 coming. And in the previous Nvidia generation we had the 750 Ti for €130 or so, and even the normal 750 for cheaper, but that one wasn't very popular. So I don't think it's too bad at all.
I'm not saying that you can't buy a card for the same price (even if the 750 Ti costs 50% more than what I paid for my old card); I'm saying that a card like the 750 Ti is nowhere near the tier my HD4830 occupied 7-8 years ago. The card you mentioned is never described as high-end, and never should be.
 
I'm not saying that you can't buy a card for the same price (even if the 750 Ti costs 50% more than what I paid for my old card); I'm saying that a card like the 750 Ti is nowhere near the tier my HD4830 occupied 7-8 years ago. The card you mentioned is never described as high-end, and never should be.

The HD4830 has never been high-end? What makes that one more high-end than this one? Also, I dunno when you bought it, but on release the card was also 50% more expensive than you mention.

Back then there was even a lower-priced card, the HD 4670, and no, I don't think there's really anything in that segment of the market anymore. Although if you're going for that price point, the gap is probably closed by the integrated GPUs on CPUs nowadays.
 
Although I don't think you could achieve quite the same at twice the cost of a console, you most definitely don't need to pay five times the price for what you are trying to achieve. An i5 + GTX 970 should already deliver that result.
A 970 is not going to be putting out a tear-free 1080p60 at meaningfully high settings in 2 years' time. It can't even do that now.
 
I'm not sure if you're aware, but if the 980 was a midrange card, then the 670 was a low-mid card when it came out. The 680 was to the 600 series what the 980 is to the 900 series: the "little" chip of that series, which now lasts two years instead of one. Little chip -> big chip is likely the new normal for Nvidia.

And based on AMD's hilarious rebadge this year, it seems like they don't really have a better strategy either. It's not just a lack of competition; the R9 290s were good cards, priced aggressively, and they aged better than Nvidia's contemporaries did.

Fact is, though, neither company is capable of bringing out insane new chips every 12-16 months anymore. Clockwork die shrinks are a distant memory. Increasing power draw every year isn't viable. And despite people complaining about price gouging and overpriced Titan / Fury cards, only one of these two companies is currently profitable, and we're not talking Scrooge McDuck levels of profitable. Nvidia's profit last quarter was 26 million, off total revenue of 1.15 billion.

I'm still salty about Nvidia and their shitty Kepler support. There's no reason for a 780 not to age as well as a 290.


In regards to stagnation, could it just be that there really hasn't been demand for CPUs and GPUs to be more and more powerful? The last console generation lasted so long that most of us got used to maxing out pretty much every game at 1080/60. A 2500K CPU from years ago is still performing just fine. If VR or 4K gaming becomes more mainstream, then maybe we'll see more of a push for more powerful cards.
 

Toski

Member
I'm still salty about Nvidia and their shitty Kepler support. There's no reason for a 780 not to age as well as a 290.

What about the 680/770 people? Even the "sub-par" 7970/280X has aged better, although that's probably due to AMD's long-in-the-tooth arch and originally inefficient drivers.
 
A 970 is not going to be putting out a tear-free 1080p60 at meaningfully high settings in 2 years' time. It can't even do that now.

Except it does do that, in practically every review ever, since launch. Shit, games like Alien Isolation, the Battlefields, CODs, and loads of other less demanding titles hit 120fps with minor tweaks or none at all.

The exceptions are pretty much just TW3 with Hairworks and max shadows, Crysis 3, Unity (CPU-bound), and Batman (cause that shit is broke af).
If that is true, then that is really, really sad. The 970 is a high-end GPU. I would expect it to perform well for every game that comes out this generation.

If all you care about is 1080p60, the 970 is way overkill considering that 1080p60 is mainstream level.
 

velociraptor

Junior Member
Another reason not to buy an AMD machine.
Releasing a next-gen card with the same performance as their last gen. Laughable.

I know nVidia have some issues but it's much easier to buy a new nVidia card and know you're getting new tech and better performance.

Nvidia prices are comical.

It seems like GPU prices remain stagnant for Nvidia cards.

If I spend £200+ on a GPU, I expect it to play games 1080p60 for at least 4 years.
 

velociraptor

Junior Member
A 970 is not going to be putting out a tear-free 1080p60 at meaningfully high settings in 2 years' time. It can't even do that now.

If that is true, then that is really, really sad. The 970 is a high-end GPU. I would expect it to perform well for every game that comes out this generation.
 

LeleSocho

Banned
The HD4830 has never been high-end? What makes that one more high-end than this one? Also, I dunno when you bought it, but on release the card was also 50% more expensive than you mention.

Back then there was even a lower-priced card, the HD 4670, and no, I don't think there's really anything in that segment of the market anymore. Although if you're going for that price point, the gap is probably closed by the integrated GPUs on CPUs nowadays.
Wat.
The HD4830 was a cut-down version of the 4870, which was the highest-performing GPU ATI had at the time... It was essentially the "I can't afford a 4850/4870" card; it was the weakest of them, but it was still the third most powerful card you could buy, and it had the same GPU as its bigger brothers. It's completely incomparable to the 750 Ti, which was designed with low power and thermals in mind and is essentially a proof of concept that Maxwell cards existed.
As I said, I bought it in 2008, so not that long after its launch. But even if you are right and the card cost 150€, it still means that in 2008 you could buy the third most powerful card of a company for that price... Today this situation would be a dream.
 

dr_rus

Member
I've no idea what some of you are talking about. GPU prices were always high. Back in the days of 3dfx, SLI was the top-end config; then there were Ultra and MAXX cards, etc. The highest option - like the Titan today - always cost a fortune. The only thing that has come and gone are cards like the 9500pro, 8800gt and 970 (yes, I consider it quite a performer for the buck). And those came about through competitive lineup peculiarities more than any conscious decision to create such a proposition.

I don't think that we've been limited by 28nm nearly as much as some of you think. We started at very high wafer prices, hence the relatively small chips at high prices. As wafer prices fell, we saw the introduction of bigger chips with the Titan, 780 and Maxwell series. The performance of a typical $500 offering went up pretty considerably between the 680 and 980. If you compare the 780 Ti to the 980 Ti, the noticeable performance increase actually came with a lower price, even in the top-end, biggest-die space.

All in all, the only rather bad thing that has happened lately is the 300-series AMD rebrands, which actually increased prices for the same performance - but this is mostly because of AMD's grim financial situation more than anything else.

Some people just like to whine a lot and don't really remember how it was before they bought their first video card - which, coincidentally, was some 8800gt: a temporary fluke more than a typical proposition.
 

V_Arnold

Member
If that is true, then that is really, really sad. The 970 is a high-end GPU. I would expect it to perform well for every game that comes out this generation.

This is such an arbitrary rule that it is obvious why it cannot be met, no matter how high you set the bar.

You can muster up high enough polycounts, object counts, texture sizes and enough shaders to cripple even a Titan, let alone a 970. These GPUs have limits. There were console games in the previous generation that got by with low-20s fps numbers and still would have crippled the CPUs/GPUs of their time because of their unoptimized nature.

Try playing GTA5 on a PC with specs close to an Xbox 360... see? The 970 will perform reasonably provided that the game engine's demands stay within its capabilities, and only then.
 
A 970 is not going to be putting out a tear-free 1080p60 at meaningfully high settings in 2 years' time. It can't even do that now.

As long as the ultra settings are off, which is also the case with console games, it might do that very well.

Aside from that, I have my doubts that the consoles will manage to do all their games at a tear-free 1080p30 at meaningfully high settings in 2 years' time. They can't even do that now.
 

LilJoka

Member
Absolute nonsense. If you want a clean, tear-free image, with better visuals, on a fixed-refresh screen (i.e. a TV) now and over the *next couple of years*, you need lots of power. I know exactly how my PC performs. You're saying a sub-970-equipped PC is going to be putting out a reliable, tear-free 1080p60 at better-than-console visuals in a couple of years' time? Why exaggerate so much?

Probably will be fine, since in a few years medium/low settings on PC will match consoles.
 
Wat.
The HD4830 was a cut-down version of the 4870, which was the highest-performing GPU ATI had at the time... It was essentially the "I can't afford a 4850/4870" card; it was the weakest of them, but it was still the third most powerful card you could buy, and it had the same GPU as its bigger brothers. It's completely incomparable to the 750 Ti, which was designed with low power and thermals in mind and is essentially a proof of concept that Maxwell cards existed.
As I said, I bought it in 2008, so not that long after its launch. But even if you are right and the card cost 150€, it still means that in 2008 you could buy the third most powerful card of a company for that price... Today this situation would be a dream.

I don't see how it is way worse to have a low-end card on a new architecture instead of one based on the current high-end offering. The way to compare is to look at how much performance was offered then versus now. There seemed to be more competition back then, so I'm sure it still falls in favor of the past, but there is still a low-end market and it isn't bad.

Saying it was the third most powerful card tells you nothing. What does it matter that there are more high-end cards now? Do low-end cards suddenly deliver terrible performance nowadays? Reception of the 750 Ti seemed pretty positive. And we do have even lower-end cards, but those don't get any attention at all.

And right now I'm ignoring the AMD offering, but they still have plenty of low-end graphics cards too.
 

robb_w7

Banned
It really does. I was halfway to saying fuck it and just buying a PS4/Xbone when making my build, but I'm too much of an fps snob.

Console gaming is definitely less of a headache, that's always been the case, but the rewards when things work well on PC trump everything.

But watching that video and just seeing how Nvidia and AMD are making the graphics card industry a complete mess makes console gaming that much more appealing.
 

wildfire

Banned
A Titan X going for $1000+ right now is ridiculous.

I miss the days when $600 was the standard for the highest gaming card. And even then, that was a lot.

You forgot about the GeForce 8800 Ultra?

It was around $830 when it launched in 2007, just before the economy collapsed.
 
I don't feel it's as bad as that if you buy wisely and don't have the attitude that you need the newest and best hardware. I've had an i7 2600K for the past four years and it's still fine; in all that time I've only had to update my graphics card, as my 580 died after three and a half years, and I upgraded to a 970 (which was much cheaper than I feared for the power). Obviously not ideal for 4K, but it's fine for me. I can play almost every game at high/ultra at 60fps 1080p.
 

mclem

Member
Yeah, that video could be played as a commercial for console manufacturers. What a confusing mess Nvidia and AMD have created. PC gaming continues to grow (edit: in particular at a rate faster than consoles, IIRC), but I wonder if the plateau will come sooner than we think.

I think the key is that the areas where PC Gaming is growing most significantly aren't really the areas where graphics are the big driving force.
 

tuxfool

Banned
All in all, the only rather bad thing that has happened lately is the 300-series AMD rebrands, which actually increased prices for the same performance - but this is mostly because of AMD's grim financial situation more than anything else.

The 300 series did increase performance versus the 200 series - not through architecture, but through process improvements and higher base clocks. It wasn't significant enough to typically justify a new series, but it is there.
 