It's funny. Countless threads and discussions for months on end about 2GB or 3GB being enough, about some random $500 setup blowing away the consoles and being able to run next-gen games for a few years at least. It's all out the window. All of it was BS.
Well, I wasn't just about to spunk $750 on a pair of GPUs only to be relegated to playing at anything less than max settings. That's just stupidity in my book, especially when I already own a PS4.
Those expectations should have flown right out the window the moment current-gen consoles started packing at least 5GB of VRAM specifically for games nearly a year ago.
When you spend that much on a GPU alone you want to play at nothing other than max settings. That's the whole point of the high-end PC market.
If he bought with the intention of playing at medium/high settings he would have just bought a 760 or something. Then it wouldn't hurt as much.
It's all about expectations.
They're not console-equivalent prices.
The 2GB/3GB cards aren't going to last the entire generation, even on console-equivalent settings. You're wrong when you say nothing has changed. A lot has changed, almost overnight. Look at Evil Within. We're just at the start of this gen.
It's like there's some kind of psychological blow when they can't put one of the settings at the highest number, even though it's an optional texture-pack add-on designed to provide extreme quality to those with the highest-end cards. A 670 with 2GB is still going to be matching or exceeding PS4 quality. People panicking because they've got a 3GB 780 are going to look pretty silly when the actual benches come out.
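For anyone wondering why an optional ultra texture pack chews through VRAM so fast, here's some napkin math; the texture sizes and counts are illustrative guesses on my part, not anything from the game's actual asset list:

```python
# Rough VRAM cost of uncompressed RGBA textures, with mipmaps.
# Sizes and counts below are made-up round numbers for illustration.

def texture_mb(width, height, bytes_per_pixel=4, mipmaps=True):
    base = width * height * bytes_per_pixel
    # A full mip chain adds roughly one third on top of the base level.
    total = base * 4 / 3 if mipmaps else base
    return total / (1024 ** 2)

print(f"2048x2048 ('high')  : {texture_mb(2048, 2048):>5.1f} MB")  # ~21 MB
print(f"4096x4096 ('ultra') : {texture_mb(4096, 4096):>5.1f} MB")  # ~85 MB

# Quadrupling texture resolution quadruples memory, so a scene keeping
# ~60 unique 4K materials resident is already ~5 GB uncompressed.
print(f"60 resident 4K textures: {60 * texture_mb(4096, 4096) / 1024:.1f} GB")
```

Real games use BC/DXT compression, which cuts that by roughly 4-8x, but the basic point stands: ultra textures scale memory fourfold while the GPU core barely works harder, which is why a 2GB 670 can match PS4 image quality yet still choke on an optional texture tier.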
Man I don't even want to imagine the requirements of GTA5. Damn VRAM is gonna fuck PC users.
Will wait for the 970 8GB edition.
That post was a reply to a post that was speaking in a broad context, not this specific game. I even mention this like 10 posts down.
Hahahaha, what the hell are you on about?
It still blows away the consoles. It just isn't max settings. You get that there's a gradient here right?
Oh c'mon, it'll cut the mustard just fine at medium texture settings and can probably handle the tessellation, too. Your 670 still smokes a PS4.
Excuse me? 6GB? Are they serious? That's it, I'm officially over PC hardware. Not that I cared much about this game anyway, but if my 2GB GTX 670 can't cut the mustard at 1080p, someone fucked up.
It's one setting in one game. Not that I'm convinced a high-end SLI setup is going to falter on ultra textures anyway, even sans 6GB of VRAM per card. In fact, I'm willing to wager real money that an SLI 970 4GB setup will handle it fairly well, assuming there are no SLI-related shenanigans.
Yeah, those people just need to get over themselves. There's absolutely nothing wrong with a bit of future proofing in games, especially if, say, "high" ends up looking like last generation's "ultra".
I hope that turns out to be the case but I'd rather sit back, watch and wait instead of being a $750 guinea pig.
That doesn't even do the 3GB+ cards proper justice. When I played Alan Wake and Alice: Madness Returns for the first time on PC with my 2GB GTX 670, the IQ positively blew me away and practically locked me into PC gaming permanently (barring unforeseen financial troubles). A 780 or 970's "high" must be far beyond anything I've seen from last gen.
I don't see how anything has changed. The requirement of being able to run "next-gen" games at console-equivalent standards is just as you described, minus the "out the window" part. If you want your PC to keep meeting new standards as they change over time, in this case a setting (Texture Quality) that far exceeds what the console versions are using, you will need better hardware.
Twisting information to fit your arguments doesn't make them any more true.
I hope it isn't a CPU issue like last time with GTA 4 at launch on PC. If it is I'll have to go buy a 20-core processor at 4.0GHz or some nonsense.
I hadn't considered the GTA V PC requirements.
Holy shit, that will be insane.
The game will still run on the $500 machine.
Max and YouTube? Come on, PC Gamer...
All this talk about insane VRAM usage due to next-gen unified memory is making me want to hold out on buying a 4GB GTX 970 for a 16-24GB GTX 1170...
PCGamer put up a 1440p max-settings vid on YouTube. YouTube has some shit compression, but it's something, I guess.
https://www.youtube.com/watch?v=DrzSBpkxbqw
Wait 2 or 3 months; 8GB models shouldn't be far out. If you buy with EVGA, they have a nice Step-Up program where you can upgrade within 90 days of purchasing the card. E.g., if EVGA puts out an 8GB model in November, you can send them the card and 50 dollars to get the version with more memory.
I'll be playing GTA5 and Arkham Knight on my 2GB GTX 670 at 1080p without projectile vomiting at the screen or having to buy a PS4.
Will Evil Within still run on the $500 2GB-VRAM machine? Will X game out in 2015 still run on it? Technically, a 670/2GB blows away the PS4/X1's GPU, but that's irrelevant when a game coming out in the first year of next-gen recommends 4 gigs of VRAM, a cross-gen game no less. Or 3 gigs for high textures. What's going to happen in 2015 and beyond? Be realistic.
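Some napkin math on where those 3-4 gig recommendations probably come from; the OS reservation and the graphics share below are my rough guesses based on 2014-era reporting, not official figures:

```python
# Console unified memory vs. discrete VRAM, circa 2014 (rough guesses).

UNIFIED_GB = 8.0         # PS4/X1 total unified memory
OS_RESERVED_GB = 3.0     # approximate early-gen OS reservation
game_budget_gb = UNIFIED_GB - OS_RESERVED_GB   # ~5 GB shared by CPU and GPU

# A console title can shift most of that shared budget to graphics data.
graphics_share = 0.8
console_graphics_gb = game_budget_gb * graphics_share  # ~4 GB

for vram_gb in (2.0, 3.0, 4.0):
    verdict = "covers" if vram_gb >= console_graphics_gb else "falls short of"
    print(f"{vram_gb:.0f} GB of VRAM {verdict} a "
          f"~{console_graphics_gb:.0f} GB console graphics budget")
```

That's the unified-memory point in a nutshell: the console GPU isn't faster than a 670, it just gets to lean on a much bigger pool of memory, so ports can demand more VRAM than raw GPU grunt would suggest.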
I do agree with this plan. The GTA5 meltdown is going to be really awkward.
Basically, buy a PS4 and enjoy games at high/medium.
Wait 3 years until GPUs are out of the 28nm hole and Nvidia/AMD step up the VRAM offering, then buy a mainstream GPU that will again net you high/ultra settings for all these console ports until next-gen.
Then repeat process.
Sounds like a plan.
This is little more than a gut feeling, but I bet GTA5 runs fine on 2 gigs because it was originally developed with just the 360 and PS3 in mind. Evil Within and Mordor have been developed with PS4/X1 in mind all along.
believe it! /Naruto
I'll be buying one for Bloodborne just like everyone else.
*citation needed*