
State of the GPU industry market summed up in one video

https://www.youtube.com/watch?v=MvV1KgGZtMo


I've been complaining about this for a long time but this guy is very eloquent at explaining it.
Having just spent an extortionate amount of money on a midrange GPU to upgrade my old rig, I was painfully reminded of how prices have more than doubled under the scheme detailed in the video.

My old potato was a beast in 2009 and cost me 550 euros (for everything including case, hdd, psu etc)
Now an upgrade to a year-old midrange GPU and a year-old midrange CPU cost me 850 euros.
That's less relative performance, and a full build would have cost me twice as much as in 2009.
It sucks, it's more expensive now than it was even in 2003 to get a decent gaming experience.

Try to resist derailing the thread because of the last 5 seconds of that video, thanks
 

HarryKS

Member
I tried to build one recently. Gave up for this exact reason.

Can't deal with something where I feel I'm always on the verge of missing out if I buy now rather than wait 2 weeks every 2 weeks.
 
I tried to build one recently. Gave up for this exact reason.

Can't deal with something where I feel I'm always on the verge of missing out if I buy now rather than wait 2 weeks every 2 weeks.

At some point you just bite the bullet and do it. I had the same feeling at the end of April when I was building mine. Ended up getting a 970 and no regrets. Can play every game on the market on high if not ultra settings.
 
I tried to build one recently. Gave up for this exact reason.

Can't deal with something where I feel I'm always on the verge of missing out if I buy now rather than wait 2 weeks every 2 weeks.

That's not what the video is about :p

Prices doubled. New tech being around the corner is a good thing; it generally means prices for old tech drop (making it cheaper to get into gaming).
Only since the titan/fury scheme was introduced have prices for new tech doubled while prices for old shit have stayed the same.

At some point you just bite the bullet and do it. I had the same feeling at the end of April when I was building mine. Ended up getting a 970 and no regrets. Can play every game on the market on high if not ultra settings.
I like my GTX 970 too (and how much performance they got out of 28nm with Maxwell is an engineering marvel); it's a fine card. The problem is having to pay SO much for it (380 euros). It should never have cost more than 150-200, and trying to rationalise how much I had to pay does no one any favours (except Nvidia).
 

HarryKS

Member
At some point you just bite the bullet and do it. I had the same feeling at the end of April when I was building mine. Ended up getting a 970 and no regrets. Can play every game on the market on high if not ultra settings.

Other factors came into play. I've reached an age where I don't care as much for videogames as I used to. I just go for convenience and pricing.

I can't justify those prices for top graphics cards and components when I can get a garment from a prestigious brand for the same amount. It's more useful. Just an example.


That's not what the video is about :p

Prices doubled. New tech being around the corner is a good thing; it generally means prices for old tech drop (making it cheaper to get into gaming).
Only since the titan/fury scheme was introduced have prices for new tech doubled while prices for old shit have stayed the same.


I know. It's part of it. I was looking at graphics cards and I really struggled to pick one because of the number of options available. At one point it was akin to typing 'which is the best earphone' into Google; no two articles would put down similar lists. Prices were a bit prohibitive as well. I thought the tech had evolved more. I was wrong.
 

akira28

Member
I have to strike deep with the future in mind, so I'll pay a little more up front for something that will at least be mid-level. Screw trying to be bleeding edge.
 

SapientWolf

Trucker Sexologist
I think they can get away with this because current games aren't taxing those GPUs at 1080p or below, which is what the overwhelming majority of PC gamers play at. So those old ass GPUs are still viable products. This generation of consoles didn't raise the bar as high as the previous gen did, and multiplatform games all have to scale down to the XB1.

On one hand, progress is slow, but on the other hand, a $100 GPU will play nearly everything at the same settings as the consoles, which is completely unprecedented this early in the cycle.

Fun fact: Nvidia and AMD were sued for price fixing and settled.
 

Skux

Member
Prices of 200 series cards have gone down.

He seems to defeat his own argument. Rebrands exist because of competition and because new tech like HBM can't yet be sold at mainstream prices, and he understands all of it.

Bottom line is if you don't want to pay x amount for this GPU, then don't. The Titan and Fury exist because people will buy them.
 
The price creep of graphics cards annoys me greatly. I got a 5850 shortly after launch for £200. I got a gtx 670 a good 6 months after it came out for £250 ish. I'm still using it and will probably continue to do so for at least another year, maybe two.

The 980 is a midrange card priced like a high-end one. Kepler was so far ahead of the competition that Nvidia could charge what they wanted, and it's gotten worse.
 
Yeah, I laughed quite hard when Nvidia released the Titan and charged $1000 for it. Absolutely crazy that people actually bought them.
 

Victrix

*beard*
Another issue is people not caring about 60fps (or better; you really want a 120/144Hz monitor with a computer that can push it, you really do).

There are fps-deniers in great numbers on this forum; what do you think it's like out there in the larger general gaming population? Quite a few don't notice or don't care (and some actively lobby for lower fps in exchange for improved technical graphics quality).

And then beyond that, like Wolf mentioned, lots of people running default 1080p monitors means a low bar to pass (and a not-insignificant number are on 768p laptop screens). If a hand wave could replace them all with 1440p, you might hear some noises.

But past a certain point on framerate or resolution, you find fewer and fewer people who really care that much. Hell, I don't care as much about the res as I do the fps.

I care a great deal about monitor panel type and quality, and very few people even know or care about the differences (nor should they; it's a pain).

I kind of think high-res, high-refresh monitors reaching affordable saturation might provide more of an impetus for significant performance improvements than a lot of other factors (reiterating how much you want a high-Hz monitor with no motion blur: it is the SSD of panel tech).
 

Faenix1

Member
I was one of the poor saps that bought a Geforce 250 thinking it was "new".

Really wish I'd spent a bit more for at least a 260
 
I'm not really sure I agree, despite the fact that the video raises some good points. While rebrands are definitely disappointing when they are not accompanied by a price drop, and it is true that it costs more to remain on the bleeding edge, the cost of entry for PC gaming is dramatically lower, and even low-end graphics cards and CPUs provide respectable performance.
 

Zaptruder

Banned
The whole PC-gaming industry has really stalled for quite a number of years in terms of progress.

Marginal progress. We're due for a paradigm shift - from GDDR5 to HBM.

And the shift to DX12, coupled with the VR push and relatively easy multi-GPU support (at least for VR: one GPU for each eye, or AFR if implemented by the dev)... and we'll see what is probably going to be the largest jump we've seen in a decade... and then it'll likely quickly plateau back down to relative stagnation.

I suspect in a few years, the kinds of machines needed to power 4k VR will be 4xSLI machines. It'll be... ludicrous.
 
I got a gtx 670 a good 6 months after it came out for £250 ish. I'm still using it and will probably continue to do so for at least another year, maybe two.

The 980 is a midrange card, priced like a high end one.

I'm not sure if you're aware, but if the 980 was a midrange card then the 670 was a low-mid card when it came out. The 680 was to the 600 series as the 980 was to the 900 series: the "little" chip of that series, which lasts two years instead of one. Little chip -> big chip is likely the new normal for Nvidia. And based on AMD's hilarious rebadge this year, it seems like they don't really have a better strategy either.

It's not just a lack of competition; the R9 290s were good cards, priced aggressively, and they aged better than Nvidia's contemporaries did. Fact is, though, neither company is capable of bringing out insane new chips every 12-16 months anymore. Clockwork die shrinks are a distant memory. Increasing power draw every year isn't viable. And despite people complaining about price gouging and overpriced Titan/Fury cards, only one of the two companies is currently profitable, and we're not talking Scrooge McDuck levels of profitable: Nvidia's profit last quarter was 26 million, off total revenue of 1.15 billion.
 

Victrix

*beard*
VR is a good point too, the thought of low framerates, frame stutter, and tearing in an immersive environment sounds pretty awful.
 
Prices of 200 series cards have gone down.

He seems to defeat his own argument. Rebrands exist because of competition and because new tech like HBM can't yet be sold at mainstream prices, and he understands all of it.

Bottom line is if you don't want to pay x amount for this GPU, then don't. The Titan and Fury exist because people will buy them.

His main complaint is that the Fury and Titan are the true next cards in their series, and instead of advertising/pricing them as such, the companies are making them premium products, while selling the "new series" cards, which are no different from their older models, as the next in line.

It's a new money-grabbing scheme by the companies.
 

Zaptruder

Banned
I'm not sure if you're aware, but if the 980 was a midrange card then the 670 was a low-mid card when it came out. The 680 was to the 600 series as the 980 was to the 900 series: the "little" chip of that series, which lasts two years instead of one. Little chip -> big chip is likely the new normal for Nvidia. And based on AMD's hilarious rebadge this year, it seems like they don't really have a better strategy either.

It's not just a lack of competition; the R9 290s were good cards, priced aggressively, and they aged better than Nvidia's contemporaries did. Fact is, though, neither company is capable of bringing out insane new chips every 12-16 months anymore. Clockwork die shrinks are a distant memory. Increasing power draw every year isn't viable. And despite people complaining about price gouging and overpriced Titan/Fury cards, only one of the two companies is currently profitable, and we're not talking Scrooge McDuck levels of profitable: Nvidia's profit last quarter was 26 million, off total revenue of 1.15 billion.

People just forget that technology is complicated and expensive (and reliant on a whole technology ecosystem on continued improvements - especially in the material sciences). These are literally some of the most powerful and complex consumer devices you can buy in the world.
 
His main complaint is that the Fury and Titan are the true next cards in their series, and instead of advertising/pricing them as such, the companies are making them premium products, while selling the "new series" cards, which are no different from their older models, as the next in line.

It's a new money-grabbing scheme by the companies.

Perfect summary of the 9 minute video, well said.
 
His main complaint is that the Fury and Titan are the true next cards in their series, and instead of advertising/pricing them as such, the companies are making them premium products.

Titan 1 actually made sense because it had crazy VRAM for when it came out and worked really well as a value workstation card. Like it was a "budget alternative" to Quadro cards. But they were primarily marketing it to the ultra high end enthusiasts.

Titan 2 and Fury don't have this advantage. The Titan still has arseloads of VRAM (12 GB!), so it being expensive isn't crazy. It's a damn high-margin part, but if you're a Saudi prince or Smokey it could still make some sense when you're planning to SLI these puppies and the framebuffer is limited by what each card has (not additive).

Fury is stuck in the worst possible place: the HBM memory it was banking on being the new hotness isn't quite ready for prime time yet, so it only comes in 4 GB. And it wasn't so stunning in performance that it outpaced the 6 GB 980 Ti. So they had to price it on par with that, instead of up at 750-1000, which is where they were probably secretly hoping to put it.
 

ruddiger7

Banned
I bought two 980 Tis recently. Not saying they're worth the money, but Nvidia cards hold value very well, and when I want to upgrade in the next 18 months to 2 years it's not going to cost me a fortune. I work in IT and at least can claim tax deductions, though.
 

hateradio

The Most Dangerous Yes Man
I was one of the poor saps that bought a Geforce 250 thinking it was "new".

Really wish I'd spent a bit more for at least a 260
You mean the GTS? I knew it wasn't new since I did a little research. I still got it. :p

It's nice to have a re-re-branded card, eh.

I bought two 980 Tis recently. Not saying they're worth the money, but Nvidia cards hold value very well, and when I want to upgrade in the next 18 months to 2 years it's not going to cost me a fortune. I work in IT and at least can claim tax deductions, though.
Wait, seriously?
 
I have a feeling that the current gen consoles and their performance level didn't help things either. There is little reason for Nvidia to come up with a home run like the 8800gt when even the lowest-end cards provide console-level performance for peanuts.

I do think that VR will spark a new arms race though. High framerates and resolutions are a necessity for a good VR experience.
 

Momentary

Banned
I have a feeling that the current gen consoles and their performance level didn't help things either. There is little reason for Nvidia to come up with a home run like the 8800gt when even the lowest-end cards provide console-level performance for peanuts.

I do think that VR will spark a new arms race though. High framerates and resolutions are a necessity for a good VR experience.

Hopefully Pascal cards will be the jump we need in performance. Hopefully it's equivalent to the jump of the 8800 GTX.
 

Nokterian

Member
I haven't upgraded my 780 either... so far everything works; maybe in the next couple of years I will upgrade. Same goes for my processor: I've still got my i7 2600K, and what a processor, still great horsepower. It's a shame that Nvidia is keeping the price of the Titan X and other cards high, and a shame seeing rebrands happen without any progress.
 

fatchris

Member
Thank God I stopped gaming for four years and am now permanently one generation behind with a backlog of everything from Mass Effect to Xcom. My 150 euro graphics card slays everything I play.
 

Xyphie

Member
Code:
2011: Radeon 7000 series - 28 nm
2012: Radeon 7000 series - 28 nm
2013: Radeon 200 series - 28 nm
2014: Radeon 200 series - 28 nm
2015: Radeon 300 series - 28 nm

2012: Geforce 600 series - 28 nm
2013: Geforce 700 series - 28 nm
2014: Geforce 900 series - 28 nm
2015: Geforce 900 series - 28 nm

Spot the culprit. When your foundry partner can't deliver a new node there's not much you can do except build bigger and thus more expensive and power-hungry dies.
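The "bigger dies" point is worth spelling out: on the same node, cost per chip grows faster than linearly with die area, because a larger die means both fewer candidates per wafer and a lower yield. Here's a rough sketch using the standard dies-per-wafer approximation and a simple Poisson yield model (the wafer size and defect density are illustrative assumptions, not figures from this thread):

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Standard gross dies-per-wafer approximation."""
    r = wafer_diameter_mm / 2
    return (math.pi * r**2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_rate(die_area_mm2, defects_per_cm2=0.1):
    """Poisson yield model: fraction of dies with no killer defect."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

def good_dies(die_area_mm2):
    return dies_per_wafer(die_area_mm2) * yield_rate(die_area_mm2)

small = good_dies(200)  # a midrange-sized die
big = good_dies(400)    # a high-end-sized die, roughly GTX 980 class
print(big / small)      # well under half: doubling area more than halves good dies
```

So staying on 28 nm and scaling the die up, as both vendors did, makes each flagship chip disproportionately more expensive to produce.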
 

eso76

Member
I think the point is just that there are higher 'higher-ends' today.
I mean, if there's a market for 600 and 1000$ cards, then it makes sense for manufacturers to produce them, and for devs to take advantage of that power.

Whereas in 2003 you had low-, mid- and high-end cards (where 400$ was high end), you now have low, mid, high, higher, enthusiast, extreeeeme, batshit insane... making your 250$ card look piss poor.

It's just the way the market shifted: people claim they are poorer than they were 10 years ago, yet at the same time they apparently don't have a problem with 600$ GPUs or 900$ mobile phones that would have sounded ridiculous a few years ago. People need to understand that they can, and in a lot of cases should, settle for less.
 

Inuhanyou

Believes Dragon Quest is a franchise managed by Sony
I have a feeling that the current gen consoles and their performance level didn't help things either. There is little reason for Nvidia to come up with a home run like the 8800gt when even the lowest-end cards provide console-level performance for peanuts.

I do think that VR will spark a new arms race though. High framerates and resolutions are a necessity for a good VR experience.

So, if the consoles were beasts with 680s and i7s in them, Nvidia would be making far more powerful GPUs right now and lowering their prices significantly? Shit... that's something I didn't really think about.

But I can see where you're coming from, somewhat. From what I understand, Intel doesn't compete because there's no other CPU manufacturer around on their level; hence AMD's Zen cores are what people are banking on to give them pressure.

I always thought it was just technological barriers, and Nvidia/AMD just going for profit margins that made things how they are.
 

AP90

Member
From my experience, it's better to go high end with the CPU and RAM, then replace the GPU every few years. This is due to how negligible the CPU performance increases are.

Thus, buy the 6600K CPU with motherboard and at least 16 GB of DDR4 RAM (possibly 32 if you can swing it) and don't upgrade the CPU for 4-5 years.

If you can afford an LGA 2011 CPU+motherboard, you could make it 5-6 years.
 
Yeah, being stuck at 28nm for so long is a pretty big reason why GPU performance hasn't increased like it used to when die shrinks were happening every two or three years. The current gen consoles not pushing things forward like consoles used to probably also played a part.
 
I actually just upgraded my R6950 (flashed to a 6970) last month, getting a nice custom cooled XFX 290x.

Cost me maybe $280 USD after rebate. Still need to overclock it though, but waiting for it to break in a bit.

Honestly, because DX12 games that really push the envelope are a fair ways off, at 1080p it's still a beast of a card. Didn't see the advantage of spending what, $100 for a slightly faster 390x, nor 200+ more for an R9 Fury.

At the time, the 970 was still a bit more, and iirc, the 290x is actually a fair shade faster overall.

Rarely, if ever, have I gone bleeding edge video-card-wise, but I can still usually milk 3-4 years of gaming out of a card. This one will easily last me the next three, I think. It helps that CPUs haven't grown much in the last five years or so; my 2600k still takes anything I throw at it.
 

LilJoka

Member
In 2008 I bought a HD 5850 for £210; in 2014 I bought a GTX 970 for £270. The 970 is higher quality, with a non-reference cooler, and very silent. I could have bought the reference model for £220. Nothing has changed too much.
 

Suikoguy

I whinny my fervor lowly, for his length is not as great as those of the Hylian war stallions
I'm not sure if you're aware but if the 980 was a mid range card then the 670 was a low-mid card when it came out. The 680 was to the 600 series as the 980 was to the 900 series, the "little" chip of that series, which lasts two years instead of one. Little chip -> Big chip is likely the new normal for Nvidia. And based on AMD's hilarious rebadge this year, it seems like they don't really have a better strategy either. It's not just a lack of competition, the R9 290s were good cards, priced aggressively and it aged better than Nvidia's contemporaries did. Fact is though, neither company is capable of bringing out insane new chips every 12-16 months anymore. Clockwork die shrinks are a distant memory. Increasing power draw every year isn't viable. And despite people complaining about price gouging and Titan / Fury overpriced cards, only one of these two companies is currently profitable, and we're not talking scrooge mcduck levels of profitable. Nvidia's profit last quarter was 26 million, off a total operating income of 1.15 billion.

Shit, have graphics cards reached a similar point to the one CPUs reached several years ago? I always thought they would scale laterally better than a CPU would with more cores. But admittedly, my knowledge of the inner workings of hardware is just a few notches above novice. If I had to guess, it's a fill-rate problem, and that can't be addressed laterally?

Console gaming just seems less of a headache after watching that.

Yeah, that video could be played as a commercial for console manufacturers. What a confusing mess Nvidia and AMD have created. PC gaming continues to grow (edit: in particular at a rate faster than consoles, IIRC), but I wonder if the plateau is sooner than we think.
 

Nafai1123

Banned
This is why I hate shopping for a new GPU and won't for a while longer. I was hoping that by this time in the generation, prices of the higher-performance cards would have dropped, but it just hasn't happened. We always keep saying "wait till the next generation!" and honestly, with the rather stagnant advancement so far, I haven't had a problem doing that.

I have a feeling VR is going to force my hand sooner than later though.
 
Just have a strict budget and buy the best possible hardware for that money. And upgrade your stuff only if you really need it (or if you hate your money).

Anything else just makes you crazy. And no, you don't need >850€ for a decent gaming experience.
 
Shit, have Graphics cards reached a similar point that CPUs reached several years ago? I always thought they would scale laterally better then a CPU would with more cores. But, admittedly my knowledge of the inner workings of hardware is just a few notches above novice. If I had to guess, it's a fill-rate problem, and that can't be addressed latterly?

I don't think we're entering CPU territory yet. It's just that we'll be seeing bigger jumps only every other year. It's slower progress, but it's not stagnation just yet.

Hopefully Pascal/whateverthefuck AMD has will be cool, HBM2 + node shrink. But you never know.
 
Code:
2011: Radeon 7000 series - 28 nm
2012: Radeon 7000 series - 28 nm
2013: Radeon 200 series - 28 nm
2014: Radeon 200 series - 28 nm
2015: Radeon 300 series - 28 nm

2012: Geforce 600 series - 28 nm
2013: Geforce 700 series - 28 nm
2014: Geforce 900 series - 28 nm
2015: Geforce 900 series - 28 nm

Spot the culprit. When your foundry partner can't deliver a new node there's not much you can do except build bigger and thus more expensive and power-hungry dies.

That's most likely the most important reason for inflated prices, but not the only one.

Take the GTX 980, for example. It's not really a huge chip (398 mm²), and it's on a mature and relatively cheap 28 nm process. For reference, the GTX 400 series was Nvidia's first line of 40 nm cards, yet they released the GTX 480, at 529 mm², for the same price as the GTX 980 (500€/$). Things don't get better when you look at the mid-range solutions: the GTX 460 was a far better deal at ~200€/$ than the GTX 960 is now.
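As a back-of-envelope check on those numbers (launch price divided by die area, using the figures quoted above; this obviously ignores VRAM, cooler, and board costs):

```python
gtx480_per_mm2 = 500 / 529  # GTX 480: 529 mm² on a brand-new 40 nm node
gtx980_per_mm2 = 500 / 398  # GTX 980: 398 mm² on a mature, cheaper 28 nm node

# Roughly a third more per mm² of silicon, on a process that
# should be cheaper per mm², not pricier.
print(round(gtx980_per_mm2 / gtx480_per_mm2, 2))  # 1.33
```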
 

LilJoka

Member
I think some of the problem is down to consumers too, such as the "max out" mentality, where people think anything below ultra settings is not acceptable, even when ultra provides unnoticeable improvements except for a 10 fps drop. These are the types of consumers that are also driving sales for Titan-type cards.

It's easily possible to play games at 4K 30 on a GTX 970 if you drop a few settings to High and shadows to Medium.

And do 1080p60 just by dropping a few settings to High instead of Ultra.
 