My PC is just over three years old and it's still plugging away, running most stuff at high settings and hitting 1080/60. It's a 3570K at a 4.4GHz OC with 670 SLI. The second card has really come in handy in pushing those high settings towards 60fps. I've been debating whether to put a 970 in there, but for now it does the job. It's not really an old-ass PC, but I bought it because PS3 games were not only looking decidedly ropey but the frame rate was chugging like an asthmatic Darth Vader in a dust factory. In a way, Sony and Microsoft did me a solid by not going all in with the tech on their next-gen consoles. It slowed down the arms race for amazing graphics and, bar the odd terrible port, 2GB GPUs are just about handling their business. Only just, mind you. The next year will be the breaking point for those.
Yup, the 5770 was awesome for what you paid back then. Can't help but think that to get the same longevity out of a card right now, you have to get at least a 970, which is almost double the price.
As a long-term PC gamer I'm annoyed that hardware advanced amazingly fast in the 90s when I didn't have enough money to upgrade frequently, but now that I have enough money to upgrade somewhat frequently, I can't bring myself to do it because the individual steps are too small. If PC hardware requirements moved as fast as some PC gamers want them to, I'd only be playing undemanding indies on PC and otherwise be back playing mostly on console like I did in the 90s/early 2000s.
I guess we also have the weak CPU cores in the PS4 and Xbone to thank.
I've also been wondering if I should stick with the Q8200 or buy a used Q9550 or QX9650. Perhaps once I see the CPU getting pegged at 90% in certain games.
Phenom IIx6 1090T
16 GB DDR3
R9 270X
Nothing she can't do, and I love it. Still gonna upgrade the CPU though. It's long overdue.
Does it bottleneck you in anything though? I mean, I still stick with an FX 6100, which is probably roughly around that performance, and I don't feel CPU constrained in anything, like ever..
Not really, but in certain games, I feel like performance could be better, like Fallout 4. It's definitely still competitive, but at the same time, it's showing its age, and I know I need something better.
Core 2 Duo + 9800 GTX was a fucking beast, and up until this year it was mostly still hitting minimum reqs.
The stereotype that you have to upgrade your PC every year to play newest titles is just pure myth.
Ha, this is my set-up EXACTLY, except I've only OC'd to 4.2 using "turbo mode" on my motherboard (so not a genuine OC -- it ramps up to 4.2 only when it needs to). When I went to 4.3 I got BSODs after a while.
I'm in the same boat -- thinking about a 970, but the extra 670 keeps my frames up for now. I can run Witcher 3 on ultra (except HairWorks off, tree draw distance on high) and I get 55-60fps. It runs everything else I throw at it at 60fps on ultra, even Metro Last Light.
I will rock these GPUs until CDPR's Cyberpunk 2077 comes out, then we'll see.
I went from a 640 (that I got 4 years ago), to a 750ti in November. That has been my only upgrade in the last 4 years, and it cost me $100.
You should upgrade to dual GTX 980Ti's.
For the last 5 years, the main thing holding PC hardware back has been VRAM. Otherwise, you could basically still keep using a GPU from 2010 in the latest games, no matter what.
A q6600 probably still gets 30fps in nearly all games.
Seriously? The P4 3.6 + 6800GT that I tried absolutely COULD NOT run Unreal Engine 3 games properly. It was a very sub-par experience, ~20fps at best. Even Bioshock 1, which was not UE3, ran terribly. It really depends on the game, though. I had a P4 @ 3GHz paired with a 6600GT and it performed admirably in the original Gears of War.
My current PC has a Q6600 and a GTX 660. Bioshock Infinite is almost a locked 60fps on high, Assassin's Creed Black Flag is 30fps on high, and I even played CoD Blops 3 at decent settings. PC hardware has never had a longer effective lifespan.
My PC is a dinosaur, but still plays most stuff at 2560x1440 at a decent framerate.
i7 980X (1st gen) and dual GTX Titans on water, with 12GB of 2000MHz DDR3 RAM. I should upgrade my platform soon.
"Dinosaur"
"Dual GTX Titans"
"12GB RAM"
Some of you HONESTLY.
He's talking about the CPU, which is a bit long in the tooth.
Can safely lock the framerate at 30.
Core 2 Duo E8400 + 9800 GT 512MB + 4GB RAM is fine at 1024x768 even now (though BF3 crashes due to lack of memory during long play sessions).
The most common misconception about PC gaming is that you have to upgrade your PC annually because PC games have to run at 60fps at the highest settings. There are people who can't afford that, and thanks to the robust customization of PC games, people can still run games even on an old-ass PC. They can tweak the settings to their liking: sacrifice IQ for 60fps, or sacrifice framerate for maximum IQ and settle for a 30fps lock. A well-built gaming PC can last about as long as a console life cycle, or even longer.
The last few years have been pretty friendly to PC gamers. A lot of developers were hamstrung by developing first for the old consoles -- which used DirectX 9.0c -- and not really optimizing for modern hardware. Since the new consoles use DX11/12 and a newer version of OpenGL, expect games to become substantially more demanding at the high end while delivering better IQ at a given level of hardware.
If you bought an i5-2500/2500K, or even an FX-83xx, and a 660 Ti or 670 (or HD 7870/7950/7970), odds are you're still in the clear to run most games at great settings at 1080p.