
Is there a new PC arms race coming?

You guys are forgetting one big thing going for consoles: optimization. So even if the X720/PS4 have specs similar to or weaker than what's found on PC, they can still blow the PC away. Just try playing the most graphically demanding X360/PS3 games on a similar-spec PC. It can't be done. I expect a similar cycle where the X720/PS4 beat the PC early and get overcome once PC hardware is so much more powerful that console optimization means little. Thus, people buying hardware today thinking it's going to get them past next gen may be in for a rude awakening.

No. You're the one who is in for a rude awakening. PC hardware from last year outclasses the rumored specs of the PS4/720 that are due a year and a half from now. I can't make it any simpler. Yes, optimization is important, but PCs will produce significantly better image quality on top of the better graphics due to the graphical push brought on by new consoles. A huge part of that optimization is sacrificing image quality: resolution, AA, AF, real-time global illumination, a low FPS target (30), lower texture resolution, etc.
Note: I'm not arguing about similar-spec machines. Those who are avid PC gamers who like pushing their hardware most likely have much higher-spec machines already. But saying that consoles will blow away the PC is a huge generalization, and it's just wrong.

I hope you're not one of those people who are expecting something like Agni's Philosophy on PS4/720. Of the people who bring up the same argument you just did, many are also firm believers that Agni's is doable on PS4/720. You'll get a dumbed-down version of that on your console @720p, 30fps, whereas a GTX680 can do it at 1080p with better image quality (unknown, but a stable framerate). If not, then you can ignore it, but this is also for those who are misinformed.

Also, the PS4/720 are rumored to have Jaguar cores. That's not exactly cutting-edge in the sense of being high-end; Jaguar cores are low-end CPU solutions. So for those of you worrying about the CPU being an issue, I don't think you should worry. Hopefully more engines utilize compute shaders effectively within the coming years, putting less strain on your CPU. I would advise getting something better than a Radeon 7750 if you haven't done so already and are looking forward to playing next-gen games at higher settings than a console.
 
I'm kind of torn as to what video card I should get and when. I guess I should take my time.

I'd read up about the GTX 660 Ti. It may be the perfect card at this time to deliver great bang for your buck, so that in two or three years you really won't mind having to upgrade. This is kind of what I did when I bought my GTX 460 two years ago. Sub-$200, and it runs almost everything at 1080p short of turning a few features down. It brought me to where I had hoped it would in this cycle.
 
Nobody is saying they are going to blow this generation away. Saying $500-600 PCs right now will blow away the next PS and Xbox is just as ludicrous. Over $1000, sure.

Many PCs in the $500-600 price range have integrated graphics. >.> Both sides are way too extreme for my taste. The majority of PCs purchased in that price range are going to have a piss-poor GPU. Custom computers still aren't mainstream. Dell customization, maybe...
 
What arms race? My good ole 5770 will probably last me another 2 years, giving it a glorious 5-year lifespan. This thing has smoked everything thrown at it. Not until something like Watch_Dogs or 1313 hits PC (2014?) will we have to upgrade.

Yeah, I still have my 5770 from when I built my current PC about 2 years ago and it still runs just about everything great. Maybe not 60FPS, but good enough.
 
I don't think anyone that puts a little effort into building a gaming PC will have a problem overcoming what next gen throws at it.

I don't even know how people can play the current-gen consoles. Some games dip into single-digit fps at 720p. My eyes can't take it.
 
I will be upgrading my 670 in a year or two due to my resolution, but for most people I would imagine it will last way longer.
 
I think you guys strain the definition of 'optimization' sometimes. The reason Uncharted 3 has great 'visuals' is because it is running at upscaled 480p, with no AA/AF or other post-process elements, and at 30 fps.




Buy a 6850/6870 now for $150 and it will smoke the next generation consoles for the entirety of their life cycle.

Glorious Comments Sir.

SMH
 
Nobody is saying they are going to blow this generation away. Saying $500-600 PCs right now will blow away the next PS and Xbox is just as ludicrous. Over $1000, sure.

Many PCs in the $500-600 price range have integrated graphics. >.> Both sides are way too extreme for my taste. The majority of PCs purchased in that price range are going to have a piss-poor GPU. Custom computers still aren't mainstream. Dell customization, maybe...

Here's what $500 can get you, and it'll only get cheaper, probably significantly once the 660 Ti hits the market. Add ~$50 and you can replace the GPU with a 570. There's a computer that'll last you through next-gen. Of course, it's not mainstream, but that's not the argument.
 
Nobody is saying they are going to blow this generation away. Saying $500-600 PCs right now will blow away the next PS and Xbox is just as ludicrous. Over $1000, sure.

Many PCs in the $500-600 price range have integrated graphics. >.> Both sides are way too extreme for my taste. The majority of PCs purchased in that price range are going to have a piss-poor GPU. Custom computers still aren't mainstream. Dell customization, maybe...


500-600 have integrated graphics wut?
 
Yes, companies will be fighting one another to figure out who can sell you more hats. An arms race like we have never seen.
 
500-600 have integrated graphics wut?

Looking at Dell's best-selling PC in the range:
VIDEO CARD Intel® HD Integrated Graphics

I'm sure you can customize a computer to get something decent in that range, but that is not as mainstream.

Granted, I don't see my PC having problems for years to come. My i7 OC'd to 4.6GHz is nice and cool, and it looks like it will be capable for some time.
 
That's mainstream.

That's what people buy in the price range.

Nobody is arguing about whether or not a system is mainstream. We're talking about gaming PCs up against consoles. That said, check out my previous post about a $500 PC if you haven't already.
 
Buy a 6850/6870 now for $150 and it will smoke the next generation consoles for the entirety of their life cycle.

Next-gen consoles are gonna be so weak that mid-range 2010 video cards will be beating them till the end of their life cycle?
 
That's mainstream.

That's what people buy in the price range.

Nobody even buys Dells anymore! On this forum alone we constantly see people building a gaming PC for the first time or showing interest in that Alienware X51 (looks a bit like an Xbox). And we're going to have chips from AMD that will really push the standard of integrated GPUs with their APUs, a level or two above what we have now (and maybe more), and for a low price.
 
That's mainstream.

That's what people buy in the price range.

That's not the argument. Any sensible person with the intention of going into PC gaming can do so with a build costing around $500-$600, and with that money you can get a fairly modest system that outclasses current generation consoles and won't need massive upgrades to get a leg up on the next generation.

If anything, PC gamers have the most to gain from the next generation, as the tech is already there and it's the developers who need to take advantage of it.
 
What are the rumored specs (if any) of the "HD" next-gen twins?
Has any real solid rumor or info been leaked yet?

"Solid" rumors have been leaked. Seems like PS4/720 are both going with Jaguar cores for their CPUs. 720, I think has 8cores vs PS4's 4, but I think PS4 has an APU setup..

The PS4 is rumored to have the more powerful discrete GPU, akin to an underclocked AMD Pitcairn GPU @1.8TFLOPs. The 720 is supposed to have a 1.1-1.5TFLOP AMD GPU.
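For rough context, here's the back-of-the-envelope math behind those TFLOP figures (Python). The shader counts and clocks below are my own assumptions picked to match the rumored ~1.8TFLOP number, so treat them as illustrative, not confirmed specs:

# peak single-precision throughput: shaders * 2 ops per clock (fused multiply-add) * clock speed
def tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000.0

print(tflops(1152, 0.8))    # assumed cut-down/underclocked Pitcairn -> ~1.84 TFLOPs
print(tflops(1280, 1.0))    # full Pitcairn (HD 7870) for reference  -> ~2.56 TFLOPs
print(tflops(1536, 1.006))  # GTX 680 reference clocks               -> ~3.09 TFLOPs

Either way, on paper the rumored parts sit below last year's high-end desktop GPUs.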

The 720 is supposed to have 8GB of DDR3 (maybe DDR4) with a significant amount reserved for the OS (probably DVR use and some sort of Windows 8 implementation). The PS4 is supposed to have 2GB, which is likely to go up to 4GB. I think the type of RAM remains in question, but I'd guess either GDDR3/5, maybe DDR3/4.

Check out their respective threads for more info. What I just stated is off the top of my head. As you can see, they're not weak consoles, but they'll be outdated and outclassed by the time they release.
 
I think you guys strain the definition of 'optimization' sometimes. The reason Uncharted 3 has great 'visuals' is because it is running at upscaled 480p, with no AA/AF or other post-process elements, and at 30 fps.




Buy a 6850/6870 now for $150 and it will smoke the next generation consoles for the entirety of their life cycle.
Bingo.

Next-gen consoles are gonna be so weak that mid-range 2010 video cards will be beating them till the end of their life cycle?
If you run games at the same res and settings they're running at on the consoles, those cards will last a while.
I'd be willing to bet my 5670 could produce the exact same visuals as the consoles at the same res and get higher frames.
 
No. You're the one who is in for a rude awakening. PC hardware from last year outclasses the rumored specs of the PS4/720 that are due a year and a half from now. I can't make it any simpler. Yes, optimization is important, but PCs will produce significantly better image quality on top of the better graphics due to the graphical push brought on by new consoles. A huge part of that optimization is sacrificing image quality: resolution, AA, AF, real-time global illumination, a low FPS target (30), lower texture resolution, etc.
Note: I'm not arguing about similar-spec machines. Those who are avid PC gamers who like pushing their hardware most likely have much higher-spec machines already. But saying that consoles will blow away the PC is a huge generalization, and it's just wrong.

I hope you're not one of those people who are expecting something like Agni's Philosophy on PS4/720. Of the people who bring up the same argument you just did, many are also firm believers that Agni's is doable on PS4/720. You'll get a dumbed-down version of that on your console @720p, 30fps, whereas a GTX680 can do it at 1080p with better image quality (unknown, but a stable framerate). If not, then you can ignore it, but this is also for those who are misinformed.

Also, the PS4/720 are rumored to have Jaguar cores. That's not exactly cutting-edge in the sense of being high-end; Jaguar cores are low-end CPU solutions. So for those of you worrying about the CPU being an issue, I don't think you should worry. Hopefully more engines utilize compute shaders effectively within the coming years, putting less strain on your CPU. I would advise getting something better than a Radeon 7750 if you haven't done so already and are looking forward to playing next-gen games at higher settings than a console.

When I say optimization, I mean games are designed for the hardware. Right now, we have video cards with 6GB of VRAM. What games are using that much RAM? None. So even if the X720 only has 2GB of VRAM, it's not going to be a third "weaker" than that card. Plus, obviously, developers are going to be more efficient using the X720's 2GB vs. 2GB on card X on the PC.

And, no, I don't expect the X720/PS4 to be even within sniffing range of the top benchmarking specs, but I do expect them to be somewhat mid-to-high-end and have features currently not available, like the X360 being the first with a unified shader architecture.

Maybe my memory is bad, but I don't remember PC hardware 1.5 years before the X360/PS3 launch being comparable. Could be different this time, but we'll see.
 
When I say optimization, I mean games are designed for the hardware. Right now, we have video cards with 6GB of VRAM. What games are using that much RAM? None. So even if the X720 only has 2GB of VRAM, it's not going to be a third "weaker" than that card. Plus, obviously, developers are going to be more efficient using the X720's 2GB vs. 2GB on card X on the PC.

And, no, I don't expect the X720/PS4 to match the top benchmarking specs, but I do expect them to be somewhat mid-to-high-end and have features currently not available, like the X360 being the first with a unified shader architecture.

Maybe my memory is bad, but I don't remember PC hardware 1 year before the X360/PS3 launch matching up that well. Could be different this time, but we'll see.

That 6GB card you mentioned is a special edition card, with oodles of VRAM so you can run at insane resolutions (3 monitors at 2560x1600) with every effect known to man turned on.
And being efficient on consoles just means toning down some effects and pushing others.
There are no magic tricks. Optimization isn't a magic word that fixes everything.
It'd be better if consoles were a lot more powerful just so developers wouldn't have to waste so much damn time coming up with tricks to squeeze a bit more performance out of the console. It would probably save some money too, who knows.
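Just to put a number on the "insane resolutions" part, here's a quick sketch (Python) of how a single full-screen buffer scales with resolution; it ignores textures and all the other buffers a renderer keeps around, so it only illustrates the scaling factor:

# size of one uncompressed RGBA8 full-screen render target
def buffer_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 * 1024)

print(buffer_mib(1920, 1080))  # ~7.9 MiB at 1080p
print(buffer_mib(7680, 1600))  # ~46.9 MiB for a 3x 2560x1600 surround setup
# multiply by however many targets are in flight (front/back, depth, G-buffer, MSAA samples)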
 
I would not be surprised if next-gen consoles launch and are less powerful than a high end gaming PC from Fall 2013.

That was true of this generation as well. And just like this gen, I think that even though the jump in power will still leave them behind contemporary PCs overall, it will reinvigorate fans and devs enough to push the PC out of the spotlight for a little while.

The PC just doesn't have massive bricks of hype like console launches do.
 
When I say optimization, I mean games are designed for the hardware. Right now, we have video cards with 6GB of VRAM. What games are using that much RAM? None. So even if the X720 only has 2GB of VRAM, it's not going to be a third "weaker" than that card. Plus, obviously, developers are going to be more efficient using the X720's 2GB vs. 2GB on card X on the PC.

And, no, I don't expect the X720/PS4 to match the top benchmarking specs, but I do expect them to be somewhat mid-to-high-end and have features currently not available, like the X360 being the first with a unified shader architecture.

Maybe my memory is bad, but I don't remember PC hardware 1.5 years before the X360/PS3 launch being comparable. Could be different this time, but we'll see.

It is different. A high-end PC from last year will outclass and outperform both consoles when they release next year, all based on their heavily rumored/credible specs. That said, I'm not arguing about optimization in the console space. Again, a huge part of that is low image quality, but the actual code work for the specific hardware is also there, no question. PCs will benefit because the extra power will go into significantly better image quality/better effects/higher framerates/higher resolution/etc. on top of the better graphics.
 
If you run games at the same res and settings they're running at on the consoles, those cards will last a while.
I'd be willing to bet my 5670 could produce the exact same visuals as the consoles at the same res and get higher frames.

But there is still a good gap between the release of those cards and the consoles' launch.
The 6850/6870 are from 2010 and next-gen consoles will probably only arrive in 2013 or later, and the 5670 was released 3 years after the 360.
My guess is that a 660 Ti or an HD 7950 will be enough to max next-gen console ports.
 
Posing the question in the thread title because I'm wondering if the emergence of the next-gen consoles in the pipeline is going to revive -- at least temporarily -- the kind of performance arms race that defined the PC world until recent years.

PC gaming started doing well again when the arms race stopped, so if developers are smart, the answer is no.

The arms race benefits nobody - creating games that most users can't run is horrible business, and it's why, before Steam took off, WoW, The Sims, and Tycoon games completely dominated the sales charts for years straight.
 
That 6GB card you mentioned is a special edition card, with oodles of VRAM so you can run at insane resolutions (3 monitors at 2560x1600) with every effect known to man turned on.
And being efficient on consoles just means toning down some effects and pushing others.
There are no magic tricks. Optimization isn't a magic word that fixes everything.
It'd be better if consoles were a lot more powerful just so developers wouldn't have to waste so much damn time coming up with tricks to squeeze a bit more performance out of the console. It would probably save some money too, who knows.

People have to stop saying you only need X VRAM for X resolution. Heard the same thing with 1GB, 2GB, and so on. Try running Skyrim with mods, especially high-res textures, at 1080p with 1GB or 2GB of VRAM. Within a few years, I wouldn't be surprised if 6GB of VRAM wasn't enough for 1080p.

And, no, optimization doesn't just mean toning something down. When working on a closed device, you figure out the best way to move data. On an open device, you may have to use/waste more resources to get the same thing done. So the latter device would be the one that needs to "tone" down.
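To put some rough numbers on the high-res texture point, here's a quick sketch (Python). It assumes uncompressed RGBA8 textures with a full mip chain, which is roughly what uncompressed mod texture packs cost; compressed formats cut this by 4-8x:

# VRAM cost of one texture: base level * 4 bytes/pixel, plus ~1/3 extra for the mip chain
def texture_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel * 4 / 3 / (1024 * 1024)

print(texture_mib(2048, 2048))  # ~21 MiB each
print(texture_mib(4096, 4096))  # ~85 MiB each -- a couple dozen of these fill 2GB before geometry and buffers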
 
The eventual 7xx series should be fine if the rumors are anything to go by. At most, everyone will start using octo-core CPUs.

That's my guess anyway. I doubt next-gen will be a HUGE graphical leap.
 
Things change considerably when you move away from the computer desk. When you have that moderately priced PC plugged into your 65" TV in the living room, running native, pixel-perfect 1080p games with full AA/AF, longer draw distance, maybe even texture improvement downloads, you can really see the difference between the PC and the console.

Comparing your console on your big-screen TV to a computer plugged into a 22" monitor is a harder comparison to make. Of course your console graphics seem more "grand". But compare apples to apples on the same display and audio system, and you start to see the difference for real.

But in the end, I don't think the "arms race" is as prevalent. Since most games are optimized for consoles these days, as long as your PC can do console-level graphics, you can play pretty much anything that comes out adequately for the next several years. Yes, you can buy beefy upgrades to your PC on a regular basis and continue to blow away the console graphics, but as for playability and capability to match what the game's target graphics are, the race has slowed considerably.
 
I'm pretty amazed at what my C2D E8400/5870 is still accomplishing to this day. Every once in a while a game will come out that my 2-core CPU can't quite handle, but the vast majority still run just fine. And even the rare exceptions, like GTAIV, I can still get to an acceptable level. I keep thinking about upgrading, but there are really no games out there that I feel are worth it.

Honestly, I'm not sure how much I want it to change. Games look pretty good at 1080p/4xAA. Of course they can look better, but I'm not sure it's worth the money to me.

I hear that. Only in the past couple years have PC games started taking advantage of quad cores. On top of that, Intel doesn't appear to be moving beyond quads for the mainstream segment anytime soon given the focus on mobile and/or low-power parts.
 
PC gaming started doing well again when the arms race stopped, so if developers are smart, the answer is no.

The arms race benefits nobody - creating games that most users can't run is horrible business, and it's why, before Steam took off, WoW, The Sims, and Tycoon games completely dominated the sales charts for years straight.

I totally agree... but I'm not sure some players have much sense of restraint. That said, maybe we're at a point with 3D graphics where even the most bleeding-edge game can run on modest hardware.

Started the thread mainly out of curiosity... not in the market for a gaming-capable PC at the moment (equal parts cost and a huge backlog, both on console and computer)... next year is more likely... but I would consider delaying if there was a consensus that there might be a bit of a spike in specs.
 
Bingo.


If you run games at the same res and settings they're running at on the consoles, those cards will last a while.
I'd be willing to bet my 5670 could produce the exact same visuals as the consoles at the same res and get higher frames.

Bingo?

Uncharted 3 running at upscaled 480p?

Some of the most retarded crap I have read.
 
I believe the Uncharted series runs at native 720p. It has massive shortcuts and tricks for optimization, but being native 480p is not one of them.
 
Nobody is saying they are going to blow this generation away. Saying $500-600 PCs right now will blow away the next PS and Xbox is just as ludicrous. Over $1000, sure.

Many PCs in the $500-600 price range have integrated graphics. >.> Both sides are way too extreme for my taste. The majority of PCs purchased in that price range are going to have a piss-poor GPU. Custom computers still aren't mainstream. Dell customization, maybe...

I think you are severely misinformed and downright ignorant in this area.
 
When I say optimization, I mean games are designed for the hardware. Right now, we have video cards with 6GB of VRAM. What games are using that much RAM? None. So even if the X720 only has 2GB of VRAM, it's not going to be a third "weaker" than that card. Plus, obviously, developers are going to be more efficient using the X720's 2GB vs. 2GB on card X on the PC.

And, no, I don't expect the X720/PS4 to be even within sniffing range of the top benchmarking specs, but I do expect them to be somewhat mid-to-high-end and have features currently not available, like the X360 being the first with a unified shader architecture.

Maybe my memory is bad, but I don't remember PC hardware 1.5 years before the X360/PS3 launch being comparable. Could be different this time, but we'll see.

I get what you're saying in regard to a lot of PS3 games like Uncharted and God of War III, which were coded by people knowing that every single person running the game would run it on one specific hardware spec. That "optimization" typically did get a little more performance out of a little less hardware, but even that might not be the case next gen.

If the leaked specs and stories are true, next gen consoles are going to use mostly off-the-shelf parts. The days of consoles with their own original architectures might be fading away, which means that all the next gen consoles might simply be "dumb PCs" put into $400 - $500 boxes. Current computers already outperform the leaked specs, which is another difference from past generations. It used to be that newly launched consoles were slightly ahead of the PCs of their time.
 
So many wrong assumptions in one thread!

First, "future proofing" a PC doesn't exist! Your PC is as long future proof until a game comes that needs more. So if suddenly Crytek announces a new game tomorrow that needs a 670 to run smooth then you have a new benchmark to tackle. The hardware doesn't matter, it's about the software and its needs.

Plus, everybody is acting as if they can never open their PC again once it's closed. Get a good videocard now, and if 2 years down the line you need more just buy a new card and sell the old one. It will cost you less than getting a super-highend card now.

It is funny how many people ignore this huge advantage of PCs.
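A purely illustrative version of that math (Python), with made-up ballpark prices rather than real quotes:

midrange_now    = 230  # assumed price for a current mid-range card (660 Ti / 7870 class)
midrange_later  = 230  # assumed price for a similar bracket in ~2 years
resale_old_card = 80   # assumed used resale value of the first card
highend_now     = 500  # assumed price for a top single-GPU card today

staggered = midrange_now + midrange_later - resale_old_card
print(staggered, "vs", highend_now)  # ~380 vs 500, and the second card is newer tech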
 
If there's any "optimization" left that can help console games save face it's art direction. I can easily say that Uncharted 2 & 3, God of War III, and Gears 3 look nearly as beautiful as The Witcher 2 and Crysis do on PC, because they did an amazing job matching the art direction with what they could do on the hardware. Nintendo accomplished the same thing with Mario Galaxy. Despite clear technical inferiority, Uncharted and Gears 3 still impress me because the developers knew what kinds of textures and models and effects to use for each specific environment and model to cover up all the faults on that hardware. But even then, that kind of "optimization" is really only possible with first party and console-exclusive games.
 
So many wrong assumptions in one thread!

First, "future proofing" a PC doesn't exist!

Mh? When people say futureproofing, they're talking about getting a card that should be able to run most upcoming games at good settings for 2+ years. That's a thing that exists.

Of course software drives this, but there's no reason to think that buying a great graphics card now won't give you a better gaming experience for a couple of years to come; that's what futureproofing means.
 
Mh? When people say futureproofing, they're talking about getting a card that should be able to run most upcoming games at good settings for 2+ years. That's a thing that exists.

Of course software drives this, but there's no reason to think that buying a great graphics card now won't give you a better gaming experience for a couple of years to come; that's what futureproofing means.

But why worry about that? PC game requirements are not linear; they don't increase steadily over time. You get new engines from time to time, and the new console generation will most likely set the minimum requirements of upcoming stuff. But as you can see, nobody knows shit about it and everybody is speculating wildly.

So instead of worrying about 2+ years, just get something that is adequate for now, and if something comes up that needs more, upgrade accordingly then. Anything else is most likely a waste of money.
 
So instead of worrying about 2+ years, just get something that is adequate for now, and if something comes up that needs more, upgrade accordingly then. Anything else is most likely a waste of money.

But... that's what I'm saying.

I'm not saying you shouldn't keep updating your PC; I'm saying that most people would rather go "oh, I'll get the GTX 670 now so I can max out GTAIV and be ready for GTAV!" instead of going "well, I always wanted to play GTAIV, so I'll buy this GTX 460".

That's futureproofing, and it saves them money. Cards don't drop that fast and the new games aren't that far away.
 
I built my last computer in 2008 with a 4850, and it never ran into a game it couldn't run.

I sold the whole computer last year to build my current one, but that's 5 years of use out of one card.

You could easily do that now if you bought a 6850 or 660 Ti. It will last you through the next generation.
 
I built my last computer in 2008 with a 4850, and it never ran into a game it couldn't run.
My PC is from 2007 (a mid-range rig even back then), and yet even just changing the GPU from a GeForce 8800 to a GTX 460 was enough to keep it more than viable until today.

Of course, there are a couple of games I can't max, but I still play most of my stuff at way higher settings than the typical console game. 1680x1050 at 60Hz is how I usually play, often with high settings.
 
My Q6600 OC'd @ 3.0GHz with an OC'd 5850 is still playing every game I own on high. Literally no need to upgrade my aging PC... it's fucking great
 
I already feel like my 6970 is long in the tooth, and you guys are talking about sticking with 3-year-old cards for a few more years? Reminds me of my buddy who says his 460 chews up and spits out every game he throws at it. I just have to think his standards are much lower than mine when it comes to framerate, AF, AA, etc.
 
I hope not; I believe next gen will not be a huge jump. I personally couldn't care less about graphics, as some of my favorite games this gen were downloadable games with creative graphics instead of photorealistic graphics.
 
Reminds me of my buddy who says his 460 chews up and spits out every game he throws at it.
But it does.
I have one of those, and most of the time when my computer falls short with some game at high settings, it's because I'm CPU limited.

I just have to think his standards are much lower than mine when it comes to framerate, AF, AA, etc.
No need to brag like a 5-year-old. We all know you can have a better experience by investing more money. No doubt about that.
Still, with a GTX 460 these days you can run circles around console performance.
 