
Witcher 3 will not have visual cutbacks on PS4/Xbox One relative to PC

We have one of these threads for every single multiplatform game ever. Stock gaming journo question, stock game dev answer. They always say that everyone is going to get a pretty game, and they will never tell you that their game will look better on one platform than another. You aren't going to throw a platform under the bus on a product you are trying to sell on said platform. Have we ever gotten a straight answer even once? I wonder if they would even be honest about this question postmortem, when the game is out of print.
 
The console versions will probably run at a solid 30fps and have brilliant graphics. To get the same level on PC will take a monster PC, I'd bet. So no, PC versions are not always the best, not unless you have a monster PC in the first place, and not everyone does.
God PC guys are annoying.

People who think it takes monster PCs to run at console settings are also annoying
 
The console versions will probably run at a solid 30fps and have brilliant graphics. To get the same level on PC will take a monster PC, I'd bet. So no, PC versions are not always the best, not unless you have a monster PC in the first place, and not everyone does.
God PC guys are annoying.
It's not like needlessly butthurt people are any less annoying, honestly.
 
So is CDPR taking supersampling out of The Witcher 3 for lolconsole parity, then?

If they go for a gimped release like Crysis 2 did, then I'll wait till I can pick it up for 5 euros in a Steam sale with the GOTY ungimp patch.
For PC-only guys who act so smart and high and mighty, they sure ask some dumb-ass questions.


Seriously, which PC game can't you force supersampling in?
 
When developers say this they usually mean rendering features, a 1:1 set of rendering technicalities between PC and consoles. Like, both have bokeh, motion blur, tessellation, physics, and so on. No unticked boxes in the console release, which is a different scenario to what we've seen for the last few years with current gen -> PC ports. PC versions in this case might not look significantly better, but they might also make use of some rendering quirk the console isn't capable of. E.g., Tomb Raider looks good on all platforms, but only the PC version uses tessellation.

Naturally the PC version will always edge out a 1:1 console version because anti-aliasing, resolution, quality of those rendering technicalities, 3D, multi-monitor, and so on are not set features. They're not even set on PC. You could play the game with 1:1 technicalities on PC, but butcher the texture filtering, resolution, and anti-aliasing so much so that it looks worse than the console build.

Exactly.
 
I hope it looking the same as the PC version graphically doesn't mean the game will have performance issues. The Witcher 2's frame rate and screen tearing on 360 sucked.
 
The reaction to the wolf fur is weird!

Welcome to GAF :)


Great, the more even these platforms are, the better our multiplats will look.

I hope it looking the same as the PC version graphically doesn't mean the game will have performance issues. The Witcher 2's frame rate and screen tearing on 360 sucked.

Indeed. I'm replaying it on console now, just after the Live sale. There are many things that are annoying. That being said, for what the hardware is capable of, CDP did a magnificent job on it. Can't friggin' wait to see what they can achieve on the new platforms. These guys are unbelievably talented.
 
They did an excellent job with the 360 version of W2, so I actually believe them.
Yep, exactly my thoughts. The 360 version was remarkably good looking and closer to the PC version than I would have expected.

I hope it looking the same as the PC version graphically doesn't mean the game will have performance issues.
I'm just as worried about the PC version. I'm not confident that my GTX680 will be able to handle the game with consistent performance...
 
Oh, right. The PhysX fur stuff.


bring on next gen systems
 
I think a lot of PC owners are going to be surprised. I reckon those with a good i5 or better and a 680 GPU will be able to run games better than the PS4.

So it will require a computer far more powerful than the consoles, even though consoles are now 99% identical to PC architectures? lol.
 
Wow so only AMD users on PC are left out.
They are not; PhysX effects can generally be activated even if you don't have an Nvidia card. The problem is that without hardware acceleration the performance hit is so massive that practically no one does it. I don't know how it will work on next-gen consoles, but just because PhysX is supported doesn't mean it's realistic to actually run all of those fancy effects.
 
People who think it takes monster PCs to run at console settings are also annoying
With current console ports? Of course you can expect average PCs to power through it. For the next generation of games, however, I'm expecting it to change somewhat. I don't think people are going to be able to crank up AA and achieve 60 fps without breaking a sweat.
 
They are not; PhysX effects can generally be activated even if you don't have an Nvidia card. The problem is that without hardware acceleration the performance hit is so massive that practically no one does it. I don't know how it will work on next-gen consoles, but just because PhysX is supported doesn't mean it's realistic to actually run all of those fancy effects.

Gotcha. For people who run AMD cards alongside NV cards for PhysX, how well does that actually work?
 
With current console ports? Of course you can expect average PCs to power through it. For the next generation of games, however, I'm expecting it to change somewhat. I don't think people are going to be able to crank up AA and achieve 60 fps without breaking a sweat.

PCs with single 670s should have no problem running TW3 at next-gen console level at 60fps, provided it's optimized well.
 
So they're using the same assets/effects/textures, probably just a locked framerate? That's nice to know, since CDPR are incredibly ambitious and talented.
 
PCs with single 670s should have no problem running TW3 at next-gen console level at 60fps, provided it's optimized well.

I somehow doubt it, but we'll see; it would be good if that's right.

If anything, this statement from CDPR proves the Xbone is not as weak as most people make it out to be.
 
I somehow doubt it, but we'll see; it would be good if that's right.

If anything, this statement from CDPR proves the Xbone is not as weak as most people make it out to be.

Each of my OC'd 670s has 2.8 TFLOPS. If this game has feature parity between the X1 (1.2 TFLOPS) and PS4 (1.8 TFLOPS), then a 670 will be plenty for 30fps. I do agree with you that 60fps is likely to be a stretch on a single 670.
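For what it's worth, the raw numbers quoted in that post can be sanity-checked with a quick back-of-the-envelope sketch (Python just for the arithmetic; the TFLOPS figures are the ones from the thread, where 2.8 is an overclocked 670, and raw FLOPS obviously ignore bandwidth, drivers, and console-specific optimization):

```python
# Theoretical GPU throughput comparison using the TFLOPS figures
# quoted in the thread. Raw FLOPS are only a rough upper bound on
# real-world performance, so treat these ratios as ballpark numbers.
gpus = {
    "GTX 670 (OC)": 2.8,
    "PS4": 1.8,
    "Xbox One": 1.2,
}

def ratio(a: str, b: str) -> float:
    """Raw theoretical throughput of GPU a relative to GPU b."""
    return gpus[a] / gpus[b]

print(f"670 vs PS4: {ratio('GTX 670 (OC)', 'PS4'):.2f}x")
print(f"670 vs XB1: {ratio('GTX 670 (OC)', 'Xbox One'):.2f}x")
```

On paper that puts the OC'd 670 at roughly 1.56x the PS4 and 2.33x the Xbox One, so closer to 1.5x than the "damn near 2x" figure thrown around elsewhere in the thread.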
 
If anything, this statement from CDPR proves the Xbone is not as weak as most people make it out to be.
Nah, it doesn't.
If anything, this statement just proves that they are not willing to outline any difference between platforms for the sake of diplomacy. Which is understandable, when you consider how rabid reactions can be when developers act in any other way.
 
I somehow doubt it, but we'll see; it would be good if that's right.

If anything, this statement from CDPR proves the Xbone is not as weak as most people make it out to be.

Even if the Xbone isn't that weak, they wouldn't take sides regardless; they'd hurt their business if they did so this early. A single 670 is damn near 2x the PS4's GPU processing output, so I don't see a reason a 670 paired with a nice CPU wouldn't run at console quality at a higher frame rate.
 
PCs with single 670s should have no problem running TW3 at next-gen console level at 60fps, provided it's optimized well.

2GB 670? Not sure. When the new consoles start being pushed, that might be a challenge for a 2GB card, especially if you want to run at 1080p or greater with AA.
 
If a 670 can max it out with 60 FPS, come Maxwell, wouldn't we already be at the point of 100 FPS or much higher resolutions for next-gen games?
 
If a 670 can max it out with 60 FPS, come Maxwell, wouldn't we already be at the point of 100 FPS or much higher resolutions for next-gen games?
I... don't see how one thing should imply the other, honestly.
Besides, every piece of software will be pretty much a different story; there isn't a general rule.

Because you agree or disagree with him?
Because what he's saying doesn't even make sense.

1080p has been a widely embraced standard on PC for at least the past 3 years, but now suddenly it will become extremely hard to achieve because of... memory constraints?
 
Next Gen starts when the PS4 or the Xbox One arrives at my doorstep.

Next-gen started in November 2007 when Crysis was released.

If a 670 can max it out with 60 FPS, come Maxwell, wouldn't we already be at the point of 100 FPS or much higher resolutions for next-gen games?

We are already at the point of 120fps or 1600p. Maxwell will hit 60fps in BF4 at 1600p if the consoles are running it at 1080p60.
 
2GB 670? Not sure. When the new consoles start being pushed, that might be a challenge for a 2GB card, especially if you want to run at 1080p or greater with AA.

RAM won't be an issue for PC gamers this early, maybe in two years. I doubt CDPR are developing TW3 on PC with only a 6GB Titan in mind. With TW3 being open world with no load times, I expect it to be more CPU-bound, maybe.
 
There will probably be very subtle differences between next-gen consoles and PCs for the first and maybe second year after the consoles release, but for sure PC will be where it's at for performance, like 60fps and higher resolutions. If TW3 is optimized well like TW2, then I'm not even worried about running it. My two 670s don't even go past their base clock speed to run TW2 at max settings, no ubersampling though.
 
For PC-only guys who act so smart and high and mighty, they sure ask some dumb-ass questions.


Seriously, which PC game can't you force supersampling in?

Tons of games, especially on AMD cards.
Don't post if you don't know what you are talking about.

I'm not a 'PC-only guy', I just don't like arbitrarily gimping one version for parity with the lowest common denominator to appease the 'console-only guys'.
 
Each of my OC'd 670s has 2.8 TFLOPS. If this game has feature parity between the X1 (1.2 TFLOPS) and PS4 (1.8 TFLOPS), then a 670 will be plenty for 30fps. I do agree with you that 60fps is likely to be a stretch on a single 670.

Nah, it doesn't.
If anything, this statement just proves that they are not willing to outline any difference between platforms for the sake of diplomacy. Which is understandable, when you consider how rabid reactions can be when developers act in any other way.

Even if the Xbone isn't that weak, they wouldn't take sides regardless; they'd hurt their business if they did so this early. A single 670 is damn near 2x the PS4's GPU processing output, so I don't see a reason a 670 paired with a nice CPU wouldn't run at console quality at a higher frame rate.

I hear you guys; I have SLI 670s as well and I run The Witcher 2 comfortably. I do hope they optimize their next game, but I doubt a single GTX 670 will run it at a locked 60fps with all the bells and whistles (and with PhysX on top of that)... Would be extra happy to be wrong :)
 