
The Witcher 3 probably not 1080P on consoles

You are not the first to voice this opinion, and I think it's justified for you to say so. But I assume you haven't had first-hand experience with gaming on a gaming laptop?

Of course, if you move around a lot and a laptop is the only way to game, then I understand owning one, and it can still provide a good experience. But to be honest, I've just gotten spoilt: if I don't have 1080p/60fps/high, I just grumble and go upgrade. That's also why I dislike when games are console-only; I can't get my preferred settings, which sucks.
 
It's possible that, for various reasons (perhaps some per-pixel process like shading), running at the same resolution frees up resources elsewhere on the PS4, resources that they'd rather have.

So it may be that they are both 900p (who knows though if that will actually be the case), but the PS4 may feature better LOD levels, less pop-in, better shadows, more particle effects?

In other words, I doubt they will truly be on "par".

If the limiting factor is the CPU, which I really think it is and will be for this generation because of how low it's clocked on an AMD architecture that has half the IPC of Intel's... well, that would explain those two versions, since the Xbone, even with a slower GPU, has a CPU clock advantage, and since that's the obvious bottleneck, it gives it leverage.

Either way, anyone who wants 1080p and 60fps this gen would be playing on a PC to begin with.
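For a rough sense of the clock/IPC argument above, here's a back-of-the-envelope sketch. The IPC values are illustrative assumptions (echoing the "half the IPC" claim in this post), not measured figures; the 1.6 GHz (PS4) and 1.75 GHz (Xbox One) clocks are the commonly cited Jaguar speeds:

```python
# Back-of-the-envelope single-thread performance: perf ~ IPC x clock.
# The IPC figures below are illustrative assumptions, not measured numbers.

def relative_perf(ipc: float, clock_ghz: float) -> float:
    """Crude single-thread performance estimate."""
    return ipc * clock_ghz

jaguar_ps4 = relative_perf(ipc=1.0, clock_ghz=1.6)   # AMD Jaguar @ 1.6 GHz
jaguar_xb1 = relative_perf(ipc=1.0, clock_ghz=1.75)  # same core, higher clock
intel_i5 = relative_perf(ipc=2.0, clock_ghz=3.4)     # assumed desktop Intel part

print(jaguar_xb1 / jaguar_ps4)  # ~1.09: the Xbone's CPU clock advantage
print(intel_i5 / jaguar_ps4)    # ~4.25: why PCs dodge this bottleneck
```

On this crude model, the Xbone's clock bump buys it roughly 9% over the PS4's CPU in CPU-bound scenes, which is the "leverage" described above.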
 
Might be wrong, but didn't they say they weren't trying to differentiate the two versions on effects/particles and whatnot? The only two meaningful ways of harnessing the overhead the PS4 has over the X1 would have to be either fps or resolution. The difference isn't high enough for a meaningful fps gap: something like 40-42fps would look awful, better to bring it down to 30. So the only thing left is 1080p, which they had hinted at, hence why I'm confused by this backtracking, that's all.

It's rare that a game will have extra or different graphical features on one console over the other, but the quality of the same effects could be pushed up a few notches - and that is what we usually see on superior console versions.
 
Laptop gaming is fine, so long as you know what to expect, and you seem like you do, and are happy with that level of performance for the money spent. Sounds good to me!

Another piece of happy news for you: with Mantle, with DX12, with OpenGL plus extensions, your CPU will only perform better when it comes to 3D games on modern engines utilizing those modern APIs.

And yeah, sorry for continuing the hijack.

Oh I know, that is partly why I wanna keep this laptop going a bit longer, to see how far I can push it. But at the same time, the graphics demands of newer games are just gonna go up; what good is, say, Mantle when my GPU can't cut it? It would only help if I had a good GPU and a weak CPU. Oh well, at least I'm happy as it is. :P

Of course, if you move around a lot and a laptop is the only way to game, then I understand owning one, and it can still provide a good experience. But to be honest, I've just gotten spoilt: if I don't have 1080p/60fps/high, I just grumble and go upgrade. That's also why I dislike when games are console-only; I can't get my preferred settings, which sucks.
Oh I know that feeling rather well. I was rather spoilt too. Years back I always gamed on my PC with top-of-the-line cards of their time, like the Ti 4800, 9800 Pro, 6600 GT, etc.

It wasn't until that stopped being practical for me that I turned to a gaming laptop. The detox process was rather hellish, but I got used to it.

But if anything, TW3 is gonna be the game that makes me finally build a desktop again. Together with a PS4 for Bloodborne, next year is gonna be a glorious year.
 
Looks more and more like I'll be skipping Xbone/PS4 and getting a new PC this gen.

I see a lot of people are reaching this conclusion. It warms my heart :)

For the record, I don't believe CDP are involved in any backroom deals for forced parity. If they decide to drop the resolution I'm positive that it will be for good reason.
 
It seems the CPU can be a limiting factor even in GPU-bound scenarios:
[Slide: PS4-GPU-Bandwidth-140-not-176.png]


That may explain why Watch Dogs didn't run at 1080p native on PS4.
The bandwidth is shared between the GPU and CPU, so if one of them uses it, the other has less to use... it's a matter of how the dev splits the bandwidth between CPU and GPU (there is a limitation, if I'm not wrong: the CPU can use a maximum of 30GB/s... the GPU can use all of it).

Every system/hardware that uses a shared memory pool has this.

BTW, optimizations can help with the "disproportionately" part... in fact I believe the scenario shown in 2013 is no longer accurate and the SDK already makes better use of memory bandwidth, but of course that won't change the fact that the bandwidth the CPU is using can't be used by the GPU (and vice versa).

PS. Memory bandwidth is not what is holding back 1080p on consoles, in my opinion... on the Xbone it's the size of the ESRAM (32MB), and for Watch Dogs on PS4 I don't know, but I think that game could run at 1080p if Ubisoft optimized it better.
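The shared-pool trade-off described above can be sketched as simple arithmetic. This is a naive subtraction model using the figures quoted in this thread (176 GB/s total, ~30 GB/s CPU cap); the slide suggests real contention actually costs the GPU even more than a straight subtraction would:

```python
# Naive sketch of the shared memory-bandwidth trade-off described above.
# Figures (176 GB/s pool, ~30 GB/s CPU cap) are the numbers quoted in the
# thread, not confirmed specs; real contention is likely worse than linear.

TOTAL_BANDWIDTH_GBS = 176.0  # PS4's theoretical GDDR5 bandwidth
CPU_CAP_GBS = 30.0           # claimed maximum the CPU can consume

def gpu_available_bandwidth(cpu_usage_gbs: float) -> float:
    """Bandwidth left for the GPU after the CPU takes its share."""
    cpu_usage_gbs = min(cpu_usage_gbs, CPU_CAP_GBS)
    return TOTAL_BANDWIDTH_GBS - cpu_usage_gbs

# With the CPU saturating its cap, the GPU sees ~146 GB/s, not 176.
print(gpu_available_bandwidth(30.0))  # 146.0
```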
 
New voice actor for Geralt? Can't tell if serious... His voice is perfect.

Nobody's complaining about Geralt. I saw someone complain about Gerald though, a new character I guess. Withholding judgement until I hear his voice.

/s
 
Uhh, what. If the PS4 version is exactly the same as the XBO version, I'm boycotting the fuck outta that shit. Never played The Witcher 2, so it's not a must-play really.

Remember when the boycott talk was about those damn lazy developers not making the ps3 version the same as the 360's?
 
Remember when the boycott talk was about those damn lazy developers not making the ps3 version the same as the 360's?

I must be misremembering. Why would the PS3 version be the same when people thought it was more powerful?
If anything, the opposite happened (FF13 wasn't going to be better on 360, especially once it was decided the DVDs couldn't hold everything).
 
It seems the CPU can be a limiting factor even in GPU-bound scenarios:
[Slide: PS4-GPU-Bandwidth-140-not-176.png]


That may explain why Watch Dogs didn't run at 1080p native on PS4.

Not really. The GPU does not need all that bandwidth all of the time, and the CPU does not require its maximum allotted bandwidth all of the time. Part of optimisation is making sure that when the GPU needs the bandwidth, it is available. Sure, in some cases it is not possible, but with how low-level developers can get with the hardware, this level of optimisation is doable; it just takes time and effort.
 
This thread is another carnival of stupid, I feel.

What "special graphical feature" would you expect on the PS4 that couldn't be done in the other version?

If the PS4 has more horsepower than the Xbox, the difference will surely show up as a different resolution, like in so many current games; that's not a surprise. But a resolution change is not exactly a "special graphical feature".
 
In non-CPU-bottlenecked scenarios (BF4 with Mantle, for example) I would take that bet. The difference will be within the margin of error: better texture performance on PS4 vs. better pixel fill on the R7 265, with a fractional increase in memory bandwidth (2.5% more).

In situations that depend almost entirely on the GPU, ones that need a strong CPU, or even BF4 without Mantle, the R7 265 would still probably come out on top, considering it easily beats a 750 ti, which already can trade blows with PS4. I don't think the performance of the R7 265 is indicative of what to expect of PS4 in TW3, or at least current games don't make me think it will be that way.
 
This will need a 4GB VRAM GPU and a beefy i7 to run on ultra at a somewhat stable 60fps at 1080p. I can already imagine the meltdowns in the PC performance thread, not because of bad optimization or drivers, just because the game looks really good and is demanding.

I hope so. Tbh, I like it when a game does that. I like when they push things higher and higher and force the likes of Nvidia and AMD to create better products that allow higher performance in games like that. I couldn't care less if my FPS suffers on ultra settings due to the graphical features. I'm just glad they care enough to add those features, and I hope it raises the bar. We had it too easy with the long cycle last gen. Time to get out of that and get back to pushing the limits.
 
If the limiting factor is the CPU, which I really think it is and will be for this generation because of how low it's clocked on an AMD architecture that has half the IPC of Intel's... well, that would explain those two versions, since the Xbone, even with a slower GPU, has a CPU clock advantage, and since that's the obvious bottleneck, it gives it leverage.

Either way, anyone who wants 1080p and 60fps this gen would be playing on a PC to begin with.

I want 1080p and 30fps and will be quite disappointed if it doesn't hit that mark. I'll probably just wait until I get a better PC. I passed on Watch Dogs after resolution-gate.
 
Obviously I have my preferences, which I make no effort to hide, but if you play a Witcher game on anything but a PC you will not be getting the top tier experience. CDPR make PC games, and console ports. That's just how it is. Expecting them to squeeze a lot of additional goodness out of what amounts to a pitifully small difference in power on the ps4 when compared with the Xbone in the grander scheme of things is unrealistic.
 
On multi-core use in PCs and Consoles:

We are always trying to improve our multi core usage, but quad core CPUs were already quite efficiently used in the Witcher 2. The game, the renderer, the physics and audio systems are all running on different threads, and we are trying to maximize the efficiency.
This may well be the main reason CDP is struggling with consoles. The comment suggests they are comfortable with fixed threads per task (one for main logic, one to three for renderer tasks, etc.), but not with full-blown parallelism. That they declined to make The Witcher 2 for PS3 is another sign.

Having a dynamic job system (like a circular queue) is difficult new territory for them.
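To illustrate the distinction (a minimal sketch in Python, not anything resembling CDPR's actual engine code): instead of pinning one subsystem per thread, a dynamic job system lets any worker pull the next task from a shared queue.

```python
import threading
import queue

# Minimal sketch of a dynamic job system: any worker thread pulls the next
# job off a shared queue, instead of one fixed thread per subsystem.
# Illustration only; the job names below are made up.

jobs = queue.Queue()
results = []
results_lock = threading.Lock()

def worker():
    while True:
        job = jobs.get()
        if job is None:          # sentinel: no more work for this worker
            jobs.task_done()
            return
        out = job()              # run whatever task came next
        with results_lock:
            results.append(out)
        jobs.task_done()

# Mixed workload: rendering, physics, and audio tasks all share one pool.
for name in ["render_chunk", "physics_step", "audio_mix", "render_chunk"]:
    jobs.put(lambda n=name: n)

workers = [threading.Thread(target=worker) for _ in range(3)]
for w in workers:
    w.start()
for _ in workers:
    jobs.put(None)               # one shutdown sentinel per worker
for w in workers:
    w.join()

print(sorted(results))
```

The point is that render, physics, and audio work all drain from one pool, so no thread sits idle while another is overloaded, which is what a fixed thread-per-subsystem design risks.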
 
That is only a PS4 problem though, with its shared bandwidth. Interesting; where did you get that slide?
A French developer conference, one year ago.

They were talking about issues in game development and how the PS4 offers a solution... this slide came after the one on the bandwidth issue and how you split the bandwidth between GPU and CPU... it shows the solution.

Misterxmedia necroed this slide to claim that Sony lied about the theoretical memory bandwidth.
 
Actually, expecting a solid 60 on max settings with that system is pretty unrealistic. The 870M is pretty much a 660 Ti with slower memory, if I'm not mistaken. So GPU power around a PS4's, but with a much better CPU.

Wait what?

I thought the 870m was better than the PS4 GPU?

At least that's what I saw here earlier. What kind of framerates would I be looking at with stuff like the fur simulation and Ubersampling off vs. with them on, then?
 
Wait what?

I thought the 870m was better than the PS4 GPU?

Far from it.

At least that's what I saw here earlier. What kind of framerates would I be looking at with stuff like the fur simulation and Ubersampling off vs. with them on, then?

Impossible to say until the game has released. HairWorks/PhysX + Ubersampling would be unplayable, though.
 
Unplayable as in impossible to hit 60fps or unplayable as in a slideshow?

More the latter than the former. Hardware-accelerated PhysX effects are invariably expensive for the hardware of their time and Ubersampling, if memory serves, is 2x2 supersampling (e.g. 1920x1080 becomes 3840x2160).
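The cost of that is easy to put in numbers: 2x2 supersampling renders four times the pixels of the native resolution before downsampling. A quick sketch:

```python
# 2x2 supersampling (as Ubersampling is described above) renders the scene
# at double the width and height, then downsamples to the display resolution.

def supersampled_pixels(width: int, height: int, factor: int = 2) -> int:
    """Pixels actually rendered per frame with NxN supersampling."""
    return (width * factor) * (height * factor)

native = 1920 * 1080                      # 2,073,600 pixels
uber = supersampled_pixels(1920, 1080)    # 8,294,400 pixels

print(uber // native)  # 4 -- four times the shading work per frame
```

Four times the per-pixel work on top of expensive PhysX effects is why that combination lands in slideshow territory.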
 
It's strange that this is even a problem. You underestimate these issues when you've always assumed a series would be on PC. I for one wouldn't choose the console version of TW3, because PC has been the standard for me, at least.

Wouldn't any kind of downgrade be absolutely terrible at this point, considering what I just mentioned? I'd sooner hope people enjoy the series for what it is than compare graphics features on a console version. I don't want to sound offensive, but I am a tad offended by this news, because The Witcher is a well-known PC title, period. If anything, they should pull out all the bells and whistles to make the console version stand out. For what it's worth, it's hard to imagine how we got here from the time TW1 came out and all the rumors of a console version. CDPR is having to come out and clarify this and that, now that they've put it on consoles.

It's not like the console gets a straight "copy" rather than its own port. TW2 even had control schemes for the 360 controller, so we knew it was going to happen eventually. To say that the console port is 100% like the PC version would be insane, unless you're running a low-end gaming PC. I feel the largest potential for TW3 is still on PC. I sure hope it does well on console; if my PC broke, I'd get it for console for sure.
 
The Witcher 3: Wild Hunt developer CD Projekt Red has responded to a recent report that quoted a developer as saying the ambitious RPG may not run in 1080p on PlayStation 4 and Xbox One. CD Projekt Red marketing manager Tomasz Tinc tells GameSpot today that the studio is constantly optimizing the game, and explains that a final resolution has not been determined.

"As [The Witcher 3 visual effects artist Jose Teixeira] said, we are working hard on optimizing the game and nothing is set in stone," Tinc explained. "Our engine team remains optimistic and we will surely squeeze the most we can out of each console. I know you might be expecting concrete info like the final frame rate and resolution, but it's simply impossible to give you that sort of thing before the optimization process is complete."

I'm still going to say PS4 1080p/XBONE 900p MAYBE 1080p.
 
It really doesn't make much sense that during E3 CD Projekt claimed the Xbox version ran at 900p and would actually get better with more optimization, and now we're wondering whether 1080p is possible even on PS4.

Besides, it was implied back then that the PS4 version was already running at 1080p...

Also, I remember CD Projekt clearly stating that 900p was the absolute minimum for them.

I really am wondering what we will end up with.
 
I recall Witcher 2 being rather sensitive to resolution bumps. I played through W2 on a GTX580 and had to settle for 1600x900 in order to hit (mostly) 60 fps. 1080p or higher dropped the frame-rate pretty regularly.

Not surprised they're having issues here.
 
It really doesn't make much sense that during E3 CD Projekt claimed the Xbox version ran at 900p and would actually get better with more optimization, and now we're wondering whether 1080p is possible even on PS4.

Besides, it was implied back then that the PS4 version was already running at 1080p...

Also, I remember CD Projekt clearly stating that 900p was the absolute minimum for them.

I really am wondering what we will end up with.

It wasn't just implied back then, it was confirmed by the executive producer: https://twitter.com/MackDoughal/status/479916121966252032

Also, it doesn't make sense that they still aren't able to say what the final resolution will be. How is it that games like The Order, Driveclub, and Shadow of Mordor had their final resolutions set months before release?

Are they backpedalling because of the MS deal?
 
I'm going to go out on a limb and say Phil Spencer made a phone call saying they want the game to run at the same resolution on both consoles, and because the Xbox One isn't hitting 1080p, they will also lower the PS4 resolution so the two match.
 
It wasn't just implied back then, it was confirmed by the executive producer: https://twitter.com/MackDoughal/status/479916121966252032

Also, it doesn't make sense that they still aren't able to say what the final resolution will be. How is it that games like The Order, Driveclub, and Shadow of Mordor had their final resolutions set months before release?

Are they backpedalling because of the MS deal?

Agreed, this whole thing doesn't add up. If they aren't sure about the resolution now, then why did they state it with such confidence two months ago at E3?
 
I don't believe for a second CDP will knock the resolution down on PS4 for "parity" reasons. Those conspiracy theories are entertaining though.
 
I don't believe for a second CDP will knock the resolution down on PS4 for "parity" reasons. Those conspiracy theories are entertaining though.

It's probably easier to entertain the idea that this is all Microsoft and the Xbox One's fault than to accept the fact that the PS4 just isn't that powerful to begin with, relative to modern PC hardware.
 
I'm going to go out on a limb and say Phil Spencer made a phone call saying they want the game to run at the same resolution on both consoles, and because the Xbox One isn't hitting 1080p, they will also lower the PS4 resolution so the two match.

More sacks of cash?

Or could it be that the devs don't want a resolution-gate storm coming down on them while they are still coding and optimising the game?

I honestly believe that with the game coming February 2015, they are still tweaking it.
 