
State of the GPU industry market summed up in one video

wildfire

Banned
If you're going to refute my claims, then don't make shit up.

In GW2 at launch, my Phenom II gave me 5-10 fps during sieges; the big dragon fight (I forget the name, it's been a long time) with 30+ people fighting also dropped me to 10 fps. My friend with a stock-clocked i5 2500K who played with me dropped to 20 fps in those sieges and fights.


@wildfire, let me guess: your ancient CPU is a Nehalem or Sandy Bridge quad core?
That would indeed run MMOs fine, as my brand-new 4690K is only marginally faster than a 2011 i5 2500K.

Athlon 64 X2.

MMOs I've played are Planetside 2, EVE Online, Guild Wars 2, The Secret World and of course WoW.

Only Planetside 2 crashes for me frequently. All the other games are manageable, or even good, among crowds.
 

RiverBed

Banned
Oh, CPU progression is even worse. A decade after quad cores' release, one would assume 6- and 8-core CPUs would be mainstream by now. Hopefully in a couple of years or a little more, the 'DX12 effect' will be more than a marketing phrase and we'll see a real jump in performance and offerings from the GPU and CPU markets alike. I want either noticeable performance gains or drastically reduced prices. I am NOT about to pay a premium for a hair's-breadth bump in performance. Every time I build a PC it is a huge leap over the previous one. I only do one every few years (longer than a console generation the last time) for just that reason.
 

Odrion

Banned
Oh, CPU progression is even worse. A decade after quad cores' release, one would assume 6- and 8-core CPUs would be mainstream by now. Hopefully in a couple of years or a little more, the 'DX12 effect' will be more than a marketing phrase and we'll see a real jump in performance and offerings from the GPU and CPU markets alike. I want either noticeable performance gains or drastically reduced prices. I am NOT about to pay a premium for a hair's-breadth bump in performance. Every time I build a PC it is a huge leap over the previous one. I only do one every few years (longer than a console generation the last time) for just that reason.

But since the 2500K, the CPU has never really been the bottleneck. Six- and eight-core chips are good for heavy multitasking, but that's about it. Hell, if you really want to hack it, you can buy a $70 Pentium G3258 and get 80% of the same results.
They're trading market share (losing nearly all of it) for margins (low volume, high profit).
In the short term this is making them money; in the long term they might fuck themselves over completely.
That isn't our concern as consumers, though.

If people support an industry, you expect benefits to the consumer, like volume production lowering costs and competition further lowering prices.
We aren't getting either benefit. All we get is marketing, perceived value, and price collusion (which is hard to prove unless CEOs are dumb enough to discuss it in company emails like they did in 2008).

I'm not happy that only a fraction of the money I paid for my 970 went into actual production and R&D, with the rest going into marketing and shareholders' pockets.
Because that IS what you are paying for when you buy a Titan (or a 970): you are paying for branding and marketing.

I don't want a Beats/Apple gaming computer; my GPU is not a fashion accessory.


I got defensive, you got defensive, it was based on a misunderstanding.
My apologies as well, water under the bridge.
Breathe into a paper bag and calm down.




These aren't mid-range settings. The 970 isn't a mid-range graphics card. Just because Nvidia found out there are idiots out there, like me, who will buy their Titan cards so they can pump their settings to Ultra at 120 fps doesn't make that the new high end, with anything lower than that the low-to-mid range. If developers put some luxury setting in their game that forces the hair or capes or whatever to be a billion polygons, turning that off doesn't make it a mid-tier compromise. It reminds me of when people complain about a game "being a bad PC port" because the developers included one super-stressful setting; people saw that as poor optimization, even though without that setting the game still looks and runs a generation ahead of its console port.

If there were a trend of lower performance gains at higher price points, PC nerds would have caught wind of it by the time reviews came out. Instead, people said "Holy shit, the 970 is even nipping at the heels of the 780 Ti." Then they released the $650 980 Ti, a card as powerful as two 970s working together. Calling the 970 and 980 Ti the "Beats by Dre" of graphics cards is absurd.

There is something to be said about the $200-250 range being bad for Nvidia cards, maybe. The 960 isn't really a good bargain compared to the price:performance ratio of the 970 or 980 Ti. Heck, I personally think the 750 Ti is kinda crappy for $150, although its really low power requirement can be super beneficial if you're clever.
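To make that price:performance comparison concrete, here's a minimal Python sketch; the card names and numbers are placeholders, not benchmark results, so swap in real prices and averaged fps before drawing conclusions:

```python
# Dollars per average frame: one simple price:performance metric.
# All entries are placeholder (price_usd, avg_fps) pairs, not real data.
cards = {
    "hypothetical_card_a": (200, 45),
    "hypothetical_card_b": (330, 75),
}
for name, (price, fps) in cards.items():
    print(f"{name}: {price / fps:.2f} USD per fps")
```

On numbers like these, the pricier card can actually be the better value per frame, which is the point being made about the 970 versus the 960.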

AMD is just an absolute shitshow, though. GCN for three years, confusing rebranding, driver issues, and an interface that looks absolutely arcane.
 

SapientWolf

Trucker Sexologist
I want to upgrade my PC but every time I look at prices it just seems you can't get as much for your money at the moment.

I built it in 2011 and I still have my spreadsheet with the costs.

An HD 6870 GPU (plus a free copy of Deus Ex: Human Revolution) cost me £125 back then, and it was a great card at the time. For the same money now I can get an R9 370, which from what I can tell is basically the same card with minor changes and wouldn't be much of an upgrade at all.

It's the same with CPUs: I paid £150 for my i5 2500K. If I spent the same money again four years later, I wouldn't gain much performance.

My whole build with case + keyboard + everything else (except a monitor) cost me £580. It's hard to spend that much now and end up with a better PC than what I already have.
You only have 1GB of memory on the 6870, right? Almost anything would be an upgrade. That's a huge liability now. That's the main reason why I had to retire my 5850s.
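As a rough, assumption-laden sketch of why 1GB of VRAM bites (the texture counts and compression ratios below are illustrative, not measured from any game):

```python
# Back-of-envelope VRAM math. DXT/BC-compressed textures cost roughly
# 1 byte per pixel; uncompressed RGBA costs 4. Counts are assumptions.
def texture_mib(side_px, bytes_per_px):
    return side_px * side_px * bytes_per_px / 2**20

resident_1k_textures = 600  # assumed unique 1024x1024 textures resident
print(resident_1k_textures * texture_mib(1024, 1), "MiB compressed")  # ~600 MiB
print(4 * texture_mib(4096, 4), "MiB for four uncompressed 4K maps")  # 256 MiB
```

Either way, a 1GB card is close to full before render targets and other buffers are even counted.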
 

rrs

Member
Die sizes are stuck because mobile SoC makers can easily outbid GPU vendors for capacity on the smaller process nodes.

Also, Nvidia and AMD seem to be fighting more over a newly bloated high-end chip market (higher profit margins) than over the midrange or low-end markets.
 

Grimalkin

Member
I'm not expecting to go back to the Radeon 9800 Pro glory days, but I'm unimpressed by the current offerings. I look at the charts and I don't see a compelling reason to upgrade. My PC is an i5-2500K and a GTX 570. I can still run everything I want to play okay, if not up to my standards, but I guess I'm waiting for something that will wow me enough to spend $350-$400 on a single GPU.

It's insanity to spend $600+ on a card. I've always spent $350-400 on GPU upgrades, and lately it isn't worth the money. When is the next great leap? I'm ready, but the tech isn't there.
 
I'll admit that I don't really know much about the state of video cards in 2015 (the last time I bought a video card was right before HL2 shipped, ha! Perhaps an ATI 9800?), but isn't this mostly a non-issue, since benchmarks pretty much cover what you need to know about newly released cards?

I mean, I guess the way I see it is that either you're a casual PC gamer who doesn't really look at specs and just goes to Newegg or wherever and orders based on customer reviews, or you're a hardcore build-your-own-PC enthusiast, in which case of course you're going to be looking at benchmarks, etc.

The third option is that you're the person who actually runs the benchmarks, in which case it's not really a problem, because you're not even looking at the names to determine performance, right? You just look at the specs, run your benchmarks, and write your reviews.

Am I missing something? Again, I'm prepared to admit that I might be missing the issue due to ignorance.
 

satriales

Member
You only have 1GB of memory on the 6870, right? Almost anything would be an upgrade. That's a huge liability now. That's the main reason why I had to retire my 5850s.

I don't doubt that I'd see some improvement by upgrading, but there just isn't the same jump there used to be. For £125 I can buy a 370, which has 2GB of memory, but it's an entry-level card. I'd need to spend £200-250 for something that will last another four years.

I'm actually not against spending that much on a card, it's just disappointing that prices have risen and progress has slowed.
 

tokkun

Member
Oh, CPU progression is even worse. A decade after quad cores' release, one would assume 6- and 8-core CPUs would be mainstream by now. Hopefully in a couple of years or a little more, the 'DX12 effect' will be more than a marketing phrase and we'll see a real jump in performance and offerings from the GPU and CPU markets alike.

I think that's doubtful. There is no evidence to suggest that DX12 will make the CPU have a greater impact on game performance. Yes, you can issue more draw calls in DX12 on a 6-core CPU than on a 4-core CPU, but that is meaningless if games are not already draw-call-bound. If anything, it will make the CPU matter less, since you can use an even weaker CPU and still not be CPU-bound.

Unless, of course, you are running the Star Swarm marketing demo from that company made up of former Microsoft DX employees and Microsoft MVPs.
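One way to see the draw-call-bound point above is a toy frame-time model; the timings below are made-up illustrative numbers, not measurements:

```python
# A frame takes as long as its slowest side; the bottleneck sets frame time.
def frame_time_ms(cpu_ms_per_call, num_calls, gpu_ms):
    cpu_ms = cpu_ms_per_call * num_calls
    return max(cpu_ms, gpu_ms)

# GPU-bound: halving the per-call CPU cost (a DX12-style gain) changes nothing.
print(frame_time_ms(0.010, 500, 12.0))   # 12.0 ms (~83 fps)
print(frame_time_ms(0.005, 500, 12.0))   # still 12.0 ms

# CPU-bound (draw-call-bound): the same halving is a real win.
print(frame_time_ms(0.010, 2000, 12.0))  # 20.0 ms (50 fps)
print(frame_time_ms(0.005, 2000, 12.0))  # 12.0 ms (~83 fps)
```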

Anyway, CPU core counts have stayed low because Intel makes big margins on its Xeon processors. They are not going to cannibalize that market by putting out a 12-core mainstream CPU if they can help it.

Meanwhile, in the GPU world, Nvidia is intentionally nerfing the Titan's double-precision floating-point performance to make it less attractive to Tesla buyers, but that has a pretty negligible effect on gamers.
 

longdi

Banned
I'm not expecting to go back to the Radeon 9800 Pro glory days, but I'm unimpressed by the current offerings. I look at the charts and I don't see a compelling reason to upgrade. My PC is an i5-2500K and a GTX 570. I can still run everything I want to play okay, if not up to my standards, but I guess I'm waiting for something that will wow me enough to spend $350-$400 on a single GPU.

It's insanity to spend $600+ on a card. I've always spent $350-400 on GPU upgrades, and lately it isn't worth the money. When is the next great leap? I'm ready, but the tech isn't there.

Eh... getting a 290X for $300 would be a great leap over your 570...
 

belmonkey

Member
....Heck, I personally think the 750 Ti is kinda crappy for $150, although its really low power requirement can be super beneficial if you're clever.

It seems to be a better option now that the price has dropped to $120-$130. The general drop in price for the same level of computer hardware since last year hasn't seemed too bad. It used to be that a budget $400 PC could only manage a 750 Ti + G3258 with 4GB of RAM. Now an i3 and 8GB of RAM can fit into the same budget.



AMD is just an absolute shitshow, though. GCN for three years, confusing rebranding, driver issues, and an interface that looks absolutely arcane.

I agree fully. It would be more forgivable if their GPUs worked fine in budget builds like Nvidia's.
 
I especially hate how AMD is making me upgrade my GPU for DX12 support. Nvidia supports DX12 all the way back to the Fermi architecture, while AMD only supports it on GCN. Because of that, my gaming PC, which uses a 6xxx card, is stuck with DX11.2, while my MacBook Pro with a GT 6xxM gets DX12.
 

tuxfool

Banned
I think that's doubtful. There is no evidence to suggest that DX12 will make the CPU have a greater impact on game performance. Yes, you can issue more draw calls in DX12 on a 6-core CPU than on a 4-core CPU, but that is meaningless if games are not already draw-call-bound. If anything, it will make the CPU matter less, since you can use an even weaker CPU and still not be CPU-bound.

There is an inherent advantage to cheaper draw calls, in that you can issue them faster. It should definitely help when dealing with high frame rates and VR.

It also lets people use less powerful CPUs, and it saves CPU power, which is useful for gaming laptops and other portable devices.
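Some rough arithmetic behind the high-frame-rate/VR point (the per-call cost is an assumed ballpark, not a measured figure):

```python
# The CPU-side draw-call budget per frame shrinks as the target rate climbs.
COST_PER_CALL_MS = 0.01  # assumed CPU cost of one draw call

for fps in (60, 90, 144):
    budget_ms = 1000 / fps
    max_calls = int(budget_ms / COST_PER_CALL_MS)
    print(f"{fps} fps: {budget_ms:.1f} ms/frame, ~{max_calls} calls at most")
```

At 90 fps with two eye views to submit, the effective budget is tighter still, which is why cheaper calls matter for VR even when desktop games aren't call-bound.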
 

tuxfool

Banned
I especially hate how AMD is making me upgrade my GPU for DX12 support. Nvidia supports DX12 all the way back to the Fermi architecture, while AMD only supports it on GCN. Because of that, my gaming PC, which uses a 6xxx card, is stuck with DX11.2, while my MacBook Pro with a GT 6xxM gets DX12.

Architecturally, the 6xxx (and earlier) series is very different from the series that came after, whereas Fermi through Maxwell were all iterations on a similar microarchitecture.
 

zou

Member
The weaker euro accounts for most of the difference.

550 EUR in 2009 was about 790 USD.
850 EUR today is about 930 USD.

Inflation over 6 years accounts for the rest.
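Sanity-checking those figures (the exchange rates are approximate historical values, assumed here):

```python
print(550 * 1.43)        # ~787 USD: 550 EUR at ~1.43 USD/EUR in 2009
print(850 * 1.10)        # ~935 USD: 850 EUR at ~1.10 USD/EUR in 2015
# Cumulative US inflation 2009-2015 was roughly 10% (approximate):
print(550 * 1.43 * 1.10) # ~865 USD, closing much of the remaining gap
```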
 
The weaker euro accounts for most of the difference.

550 EUR in 2009 was about 790 USD.
850 EUR today is about 930 USD.

Inflation over 6 years accounts for the rest.

Prices in Europe did not drop when the euro/dollar rate hit 1.6:1.
In fact, people were importing PC hardware from the US back then, because international shipping plus the US price was cheaper than the EU price.

So no.
 

zou

Member
Prices in Europe did not drop when the euro/dollar rate hit 1.6:1.
In fact, people were importing PC hardware from the US back then, because international shipping plus the US price was cheaper than the EU price.

So no.

EUR/USD was only briefly at 1.6, so of course prices weren't adjusted for those few months.

The prices in USD are the same as they were in 2009, but whatever.
 

tokkun

Member
There is an inherent advantage to cheaper draw calls, in that you can issue them faster. It should definitely help when dealing with high frame rates and VR.

It also lets people use less powerful CPUs, and it saves CPU power, which is useful for gaming laptops and other portable devices.

I'm pretty sure I watched an Nvidia presentation a while back where they talked about how (and this is a software feature IIRC) they didn't need to increase the number of draw calls for VR because they could automatically adjust them for perspective. I'm not sure what AMD is doing.

Although I can agree in principle that fast draw calls are better than slow ones, isn't the crux of the issue whether they are a bottleneck or not? If they aren't, then speeding them up reduces your CPU utilization but doesn't make the game run any faster.
 

Vinland

Banned
AMD is just an absolute shitshow, though. GCN for three years, confusing rebranding, driver issues, and an interface that looks absolutely arcane.

I wouldn't call this an airtight case for AMD being a shitshow. What is wrong with the drivers, or GCN for that matter? Are you speaking from popular opinion or from fact? Are you accounting for developers' responsibility to make their games work correctly in the first place on both of the popular brands in the market space they're competing in? Why do you care if someone rebrands a card as long as it is cheaper than it was before? How is that a bad thing? How is AMD benefiting from that confusion if they priced it according to informed people's expectation that something older should be cheaper?

How is what you said any more of a shitshow than the 970 memory scandal or Nvidia GameWorks black-boxing?

What do you really define as a shitshow, and why do the looks of the GPU's interface and its branding, which are completely subjective, have anything to do with it?
 
I wouldn't call this an airtight case for AMD being a shitshow. What is wrong with the drivers, or GCN for that matter? Are you speaking from popular opinion or from fact? Are you accounting for developers' responsibility to make their games work correctly in the first place on both of the popular brands in the market space they're competing in?
IIRC, there've been GPU driver comparisons between Nvidia and AMD that show a big gap in DX11 performance. AMD's general lack of driver updates outside of betas is also a bit of a problem, because you have to rely on potentially unstable drivers if you want the best performance in big new releases, while Nvidia pushes out drivers that have been tested to some degree within a day or two of a game's release.
 
60FPS + good IQ is lovely. It's the reason to play games, especially PC games.

Though truthfully, I am a bit of a casual gamer. Hence I don't think the PC platform has any exclusive games that interest me.

Which is still nowhere close to being the only reason.

- Control over hardware
- Mod support
- Control over settings
- Image quality
- Multi-purpose
- Keyboard + Mouse controls
- Exclusives
- Most games
- Friends may be on PC

And there are more. Thinking that the only reason is to run games at 60 FPS is just plain ignorance. The reasons may not be relevant to you at all, but in that case you worded it completely wrong.
 
What bothers me a lot about these cards is the voltage locking. I have a 7970, and one of its fans died, so I have an aftermarket heatsink that keeps it incredibly cool. If I could overclock it just a little bit, I would probably get the same performance as the 300 series. It still runs games beautifully, though.
 

Vinland

Banned
IIRC, there've been GPU driver comparisons between Nvidia and AMD that show a big gap in DX11 performance. AMD's general lack of driver updates outside of betas is also a bit of a problem, because you have to rely on potentially unstable drivers if you want the best performance in big new releases, while Nvidia pushes out drivers that have been tested to some degree within a day or two of a game's release.

It is a weak argument nonetheless.

What about Nvidia's issues with Windows 10? How did that get past QA? C'mon, calling out one company as a shitshow when there's enough egg to go on everyone's face is a bit petty. History shows both companies engaging in dubious business practices to one extent or another, and both having hardware defects and driver compatibility issues.

I am having fun on my current AMD platform. I wish I had been a little more informed on the price range they were targeting; I figured they would target $650 and $750 and the 290X rebrand would be $400, so I paid $400 (possibly $389, now that I think about it). Since I am running the exact same driver version as I did with my 7870, and I have never had issues with it, I guess I am lucky compared to the guy who thinks AMD is a shitshow. Or maybe I am lucky I didn't play The Witcher 3 or Batman on PC. Though I wasn't really lucky with The Witcher 3 on PS4 either, was I?
 

tuxfool

Banned
I'm pretty sure I watched an Nvidia presentation a while back where they talked about how (and this is a software feature IIRC) they didn't need to increase the number of draw calls for VR because they could automatically adjust them for perspective. I'm not sure what AMD is doing.

Although I can agree in principle that fast draw calls are better than slow ones, isn't the crux of the issue whether they are a bottleneck or not? If they aren't, then speeding them up reduces your CPU utilization but doesn't make the game run any faster.

If you're running at a faster frame rate, then you're submitting more draw calls. The particular technique you're talking about is called timewarp, and it's used primarily to reduce latency and to compensate for dropped frames, not to reduce draw calls.

Although I can agree in principle that fast draw calls are better than slow ones, isn't the crux of the issue whether they are a bottleneck or not? If they aren't, then speeding them up reduces your CPU utilization but doesn't make the game run any faster.
Of course, but doing less work is always better than doing more.
 

tuxfool

Banned
IIRC, there've been GPU driver comparisons between Nvidia and AMD that show a big gap in DX11 performance. AMD's general lack of driver updates outside of betas is also a bit of a problem, because you have to rely on potentially unstable drivers if you want the best performance in big new releases, while Nvidia pushes out drivers that have been tested to some degree within a day or two of a game's release.

The issue with beta drivers is pretty much nonsense, especially in light of the issues and regressions Nvidia has had recently with its WHQL drivers.

It should always be noted that WHQL certification is not a guarantee of driver quality. Yes, Nvidia puts out more day-one drivers, and yes, they are often WHQL. But the fact that AMD's drivers are labeled beta isn't an issue for most people.

Think of AMD's drivers as "beta" in the Google sense of a perpetual beta label, not as genuinely unstable betas.
 

tuxfool

Banned
What bothers me a lot about the cards is the voltage locking. I have a 7970 and one of the fans on it died so i have an aftermarket heatsink that keeps it incredibly cool. If I could overclock it just a little bit I would probably get the same performance as the 300 series. It still runs games beautifully though.

You're joking, right?
 

Lulubop

Member
60FPS + good IQ is lovely. It's the reason to play games, especially PC games.

Though truthfully, I am a bit of a casual gamer. Hence I don't think the PC platform has any exclusive games that interest me.

Just because you can doesn't mean you should. The only reason to get a gaming PC is to have a PC that plays all games at 60 FPS. Otherwise, PC gaming is pointless.

Truly some of the worst stuff posted on GAF. Congrats.
 

Applecot

Member
Inflation

Shrinking gains from modern advancements

Stuck on 28nm for yonks

Proliferation and popularisation of the Ti/Titan-sized cards, which are basically XL top-end cards. Naturally these are more expensive than previous top-end cards.
 

FLAguy954

Junior Member
I've said this for a few years now.

People talking about how long 28nm has dragged on are missing the point, although the long stretch doesn't help.

The shit started with the 7970, the first 28nm card. AMD's previous top-end card, the 6970, launched at around $360, competing with the GTX 580 and priced below it. The 580 was also a DP card, like the GK110 Titan; the 580 is GF110, the last true top-end card released at a reasonable price.

AMD didn't do a good job moving to 28nm. We should've had a 4870-to-5870-style performance jump, but all we got was a small bump. The 5870 launched at $400 and had huge gains thanks to the move to 40nm.

AMD priced the pathetic 7970 above the old-gen 580. $499 was the ceiling price Nvidia had set with its huge DP cards, the 480 and 580. AMD priced their midrange 28nm effort at $550, and people gladly paid $600-650 despite it being a poor effort. AMD didn't price it like they did the 5870: the 5870 had the performance crown but wasn't priced silly above old-gen cards. It actually replaced the old cards instead of slotting in above them, as you'd expect.

Nvidia was unimpressed with the 7970, so they quickly brought in their midrange chip, GK104; they saw that with some GPU Boost it could match the 7970 in some benchmarks and beat it in others. How poor the 7970 was became even clearer when it showed huge gains from driver updates months later, which is not typical.

When the 5870 launched, Nvidia waited 6 months to launch the 480 and was forced to lead with the big GF100. They usually wait to see what AMD can do. If AMD had done the job right and the 7970 had been the expected speed, we would've seen Nvidia wait 6 months and bring out GK110 to slightly beat it. Nvidia themselves said they expected the 7970 to be much better.

Nvidia had GK110 to launch later and decided not to upend the market, pricing it in line with the current gen. Since AMD wanted to price a midrange card above an old-gen 580, Nvidia's actual high-end card had to slot in above that price, and we started seeing cards slotted above their predecessors instead of replacing them. Nvidia had too much of a performance advantage, and sadly people ran out and bought 7970s for $650. Nvidia saw an opportunity to release a range of cards at up to $1000 that wiped the floor with AMD's efforts.

The gains Nvidia squeezed out of 28nm have actually been great, despite the node's disappointingly long run; just look at the OG Titan versus the Titan X. It comes down to the performance advantage Nvidia held over AMD on 28nm, and AMD pricing their poor efforts too high.

The true successor to my 580 arrived under the guise of the OG Titan, not the GK104 680, but Nvidia could relax and not hurry GK110 along, since AMD's 7970 was only on par with Nvidia's midrange GK104.

Anyway, I bought a 580 and then a GTX 970. I didn't fall into the trap of buying new midrange cards over and over, or fall for $700 cards. Perhaps I've been lucky, but most of the 28nm run was pure bullshit to me. The 970 was a blessing after that awful run, cheap and powerful, and a nice stopgap even with its caveat; it's been a great buy almost a year on. The 780 Ti was perhaps the biggest joke of the 28nm run.

So yes, you can dodge most of it: a 580 and a 970, two cards over a 5-year span, and I'm still on my i7 930, which has given me no problems so far. There's no need for a new rig all the time if you buy at the right time. There's always new stuff around the corner, but with each GPU generation you usually get a big bump; 28nm didn't have a good start, so you avoid it.

I should say I've no problem with people buying $1000 cards; that's your income. I'm just saying the upgrade nonsense can be dodged if you watch the market.

Everything in your post is pure bullshit. When it comes to AMD's GPU architectures, GCN was the best thing that ever happened to them. When the 7970 came out, it absolutely wrecked every VLIW4 card on the market. All of the GCN cards from the 7750 to the 7970 had anywhere from 30-50% performance increases over the respective VLIW4 cards they were replacing.

Here is a visual of the jump from the 6970 and comparison between the 7970 and the 680.

As you can see the 7970 traded blows with the 680.

In regards to your point about AMD not making much progress on 28nm: the 290/290X came out and delivered an additional 35-40% performance increase over the 7970 on the same node. I purchased my 290 for $250, so perhaps that makes me biased :p. I know how to shop for PC hardware as well, and AMD's offerings still provide great value, especially since 290-series GPUs can be found for less than a 970.

Edit: Even now, the 7970 remains a viable card for 1080p gaming with high-resolution textures, as AMD was forward-thinking enough to give the card 3 GB of VRAM.
 

prag16

Banned
I'm not expecting to go back to the Radeon 9800 Pro glory days, but I'm unimpressed by the current offerings. I look at the charts and I don't see a compelling reason to upgrade. My PC is an i5-2500K and a GTX 570. I can still run everything I want to play okay, if not up to my standards, but I guess I'm waiting for something that will wow me enough to spend $350-$400 on a single GPU.

It's insanity to spend $600+ on a card. I've always spent $350-400 on GPU upgrades, and lately it isn't worth the money. When is the next great leap? I'm ready, but the tech isn't there.

A 970 is a massive jump from a 570. You see jumping up to that for $300 as absurd and not worth it? Disagree, but okay.

As a few people have said, Nvidia does seem to have a shitty range between $150 and $300. The 750 Ti is solid at the low end of that range and the 970 is great at the high end, but the 960 is all that's really in between, and it doesn't seem that great.

The people in here claiming the 980 is midrange and 970 is low/mid are smoking something.
 

Freiya

Member
Not sure where you get that.

No matter how you look at it, AMD has cards at the same or cheaper price points as Nvidia with the same or better performance. The 390, for example, is ~5% better than a 970, with double the memory bus width and VRAM. The only reasons to get a 970 are heat/power/noise, brand loyalty, and game pack-ins (which are actually a huge plus for the card).

Nearly the same is true for the 960 vs. the 380.



I'm sorry, but those are not the only reasons to go Nvidia. I have a CrossFire 290X setup and I am sick of it. I have to deal with so many weird random driver problems that it's just infuriating. Two monitors plugged in, plus audio over the HDMI port? LOL, my sound gets screwed up every time I play a game with something else open on my second monitor.

It has been a known bug for years now and AMD still hasn't fixed it. Hell, I'm also having CF problems in Tomb Raider right now. For some reason CF only works half the time in Tomb Raider, and I know it has something to do with some combination of PC settings I have going on.


Fuck AMD. I'll never do it again. EVER! And my gf feels the same way; her 290 will be her last AMD card. In fact, she still gives me crap because I'm the one who talked her into buying it. We are the only two in our circle of friends who even have AMD cards, and they all laugh at us because of the random problems we have all the time. ;(
 

Renekton

Member
I have a feeling that the current-gen consoles and their performance level didn't help things either. There is little reason for Nvidia to come up with a home run like the 8800 GT when even the lowest-end cards provide console-level performance for peanuts.

I do think that VR will spark a new arms race, though. High frame rates and resolutions are a necessity for a good VR experience.
Are you taking the Grief route of blaming consoles for everything?
 

clav

Member
This is why I buy one card every 5-6 years, usually the lower-tier ones.

That's enough time to appreciate the differences without spending a lot.

I liked how he pointed out the 8800 GT evolution chart from Nvidia; I was aware of that too.
 