
The Witcher 3 probably not 1080P on consoles

Ah, you edited after I replied. Yeah, it's probably not Hairworks, but it sounded like they were at least experimenting with some sort of basic simulation for consoles at some point.

I definitely think it's possible to have hair/fur simulation (approximation) on consoles, but I wonder if Hairworks is the right solution for that. They would need to build PS4/XBO-specific, tailor-made tech from the ground up.
Unless Nvidia adds console support for Hairworks.
 
If the R7 265 is capable of running Far Cry 3 at Ultra presets with 4xMSAA @1080p while averaging 33 FPS and only dropping to 27 FPS (Source), then there is no reason that, with the same level of IQ, The Witcher 3 cannot run at 1080p30 on the PS4. Sure, that level of IQ might not be 'ultra' any more, but it is still very good.

Despite a similarity in specs, I think you'd be hard-pressed to find a benchmark where the R7 265 doesn't perform better.
 
But really, they need another voice actor for Gerald.

 
The scale of this game is huge, and even Witcher 2 needs a lot of power.

I am disappointed if it is less than 1080p; I hope it is at least 900p. But I am at least glad I can play the game. Framerate is king, though, so that should be smooth.

I could not afford both a PC and a PS4, and I like the exclusives. So PS4 it is.

I hope 1080p remains the standard for at least 80% of games.

Ubi shipping the cross-gen AC4 at just 1080p30 was, I think, a sign of things to come from them. I hope I am wrong.
 
Sorry if I missed it, but it appears there's a new interview with the lead developer of The Witcher 3, published Aug 21. Here's some stuff:
On Xbox One, ESRAM and 1080p:
On Mantle and DX12:
On console CPU clock speeds:
On how DX12 may help PCs and Xbox One:

Damn, sounds like those six cores in the consoles are really bogging things down. And it's a shame about no Mantle at launch, but it is nice to see they will at least look at it for future updates.
 
Despite a similarity in specs, I think you'd be hard-pressed to find a benchmark where the R7 265 doesn't perform better.

In non-CPU-bottlenecked scenarios (BF4 with Mantle, for example), I would take that bet. The difference would be within margin of error: better texture performance on the PS4 versus better pixel fill on the R7 265, with a fractional (~2.5%) increase in memory bandwidth.
 
I think CDPR will just release the SAME version, both on PS4 and One. So I think 900p for all.

Uhh, what. If the PS4 version is exactly the same as the XBO version, I'm boycotting the fuck outta that shit. Never played The Witcher 2, so it's not a must-play really.
 
My laptop i7 is 1.6 GHz, am cry.

That i7 is most likely quite a bit faster than those Jaguars in the consoles, though it's not exactly top of the line either. Laptops are just not meant for high-end gaming, in my humble opinion.

Uhh, what. If the PS4 version is exactly the same as the XBO version, I'm boycotting the fuck outta that shit. Never played The Witcher 2, so it's not a must-play really.

One stupid joke comment by a poster on GAF and we are already at the "boycotting the fuck outta that shit" phase?
Come on now.
 
I don't get why they can't get this to 1080p. Not a good look with well over five years to go with these consoles.
Because it's an open-world game? Because the consoles aren't as powerful as everyone seems to think? Developers have a limited hardware budget to work with. They can prioritize 1080p, sure, but they'd have to sacrifice in other areas, namely texture quality.

Your clear picture means jack when all of your assets are a muddy, soupy mess.
 
Because it's an open-world game? Because the consoles aren't as powerful as everyone seems to think? Developers have a limited hardware budget to work with. They can prioritize 1080p, sure, but they'd have to sacrifice in other areas, namely texture quality.

Your clear picture means jack when all of your assets are a muddy, soupy mess.

Or maybe because:
We always want to provide the best possible experience to all our gamers regardless of the platform and so we are not aiming to develop special graphical features for any of them.
Reading that, it seems they don't care about the platform advantage.
 
Or maybe because:
Reading that, it seems they don't care about the platform advantage.

Jesus F. Christ, this has been explained about twenty different times in this exact thread by various people, including Durante and CDP people.

So, for the twenty-first time: no, it does NOT mean that.
 
Or maybe because:
Reading that, it seems they don't care about the platform advantage.
'Special graphical features' would be things like AA, PhysX, AO... things that make the game stand out from the competition and make it, ya know, special. Pixels aren't special.
 
I don't get it. If it's CPU-bound, then why are they reducing the resolution?

I guess, though I could be wrong, that their engine uses the CPU for some graphics features, like Watch Dogs did. Some work does get done on the CPU; that's nothing crazy, and it was normal on PS3/360. The thing is, the CPUs in the PS4 and Xbone aren't well suited to those kinds of features.
 
Jesus F. Christ, this has been explained about twenty different times in this exact thread by various people, including Durante and CDP people.

So, for the twenty-first time: no, it does NOT mean that.

So you mean we'll finally see The Witcher 3 use the PS4's 8 ACEs (or some of them), rather than dropping the resolution to free up resources for the weak CPU? We'll see.
Edit:
But something in my mind says nope.
 
Never expected anything close to parity with the PC, but launching the same experience on both X1 and PS4 is... super disheartening-- but not entirely surprising-- for me as a PS4 owner.

One of these days... when I have more money and live in a bigger place, I will finally get back into PC gaming and will pick this up along with W1 & 2 and play through the entire trilogy.
 
I don't get it. If it's CPU-bound, then why are they reducing the resolution?

They never said it was purely CPU-bound. The CPU is just another obstacle, but I suspect shader throughput, and perhaps fill rate or other performance barriers on the GPU, are also an issue.

Remember, these are basically 750 Ti GPUs with some extras.
 
Never expected anything close to parity with the PC, but launching the same experience on both X1 and PS4 is... super disheartening-- but not entirely surprising-- for me as a PS4 owner.

One of these days... when I have more money and live in a bigger place, I will finally get back into PC gaming and will pick this up along with W1 & 2 and play through the entire trilogy.

These days you can build pretty beefy PCs that are only slightly bigger than an Xbone.
 
They never said it was purely CPU-bound. The CPU is just another obstacle, but I suspect shader throughput, and perhaps fill rate or other performance barriers on the GPU, are also an issue.

Remember, these are basically 750 Ti GPUs with some extras.

It seems the CPU can be a limiting factor even in GPU-bound scenarios:
[Chart: PS4 effective GPU memory bandwidth is ~140 GB/s under CPU load, not the full 176 GB/s]

That may explain why Watch Dogs didn't run at 1080p native on PS4.
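Quick back-of-envelope on that chart's numbers (a hypothetical Python sketch; the 176 GB/s peak and ~140 GB/s under-load figures come from the chart, the rest is just arithmetic):

```python
# Back-of-envelope: how much GPU memory bandwidth CPU contention costs.
# Figures taken from the chart above; this is only illustrative arithmetic.
peak_bw = 176.0                # PS4 theoretical peak bandwidth, GB/s
gpu_bw_under_cpu_load = 140.0  # effective GPU bandwidth with the CPU busy, GB/s

lost = peak_bw - gpu_bw_under_cpu_load
loss_pct = 100.0 * lost / peak_bw
print(f"CPU traffic costs the GPU {lost:.0f} GB/s (~{loss_pct:.1f}% of peak)")
```

So heavy CPU activity eats roughly a fifth of the GPU's bandwidth, which would hurt most in exactly the bandwidth-hungry, high-resolution cases being discussed.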
 
These days you can build pretty beefy PCs that are only slightly bigger than an Xbone.

At the end of the day, it really is about the space any additional thing would take up in my tiny-ass studio apartment. Not a fan of clutter, and I don't have room for anything else currently. One day, mate. One day I shall return and unshackle myself.
 
It seems the CPU can be a limiting factor even in GPU-bound scenarios:
[Chart: PS4 effective GPU memory bandwidth is ~140 GB/s under CPU load, not the full 176 GB/s]

That may explain why Watch Dogs didn't run at 1080p native on PS4.

Hmmm, looks like it. So the CPU is taking bandwidth away from the GPU under heavy utilization, but you want it heavily utilized in order to get the most out of it. A damned-if-you-do, damned-if-you-don't scenario.
 
At the end of the day, it really is about the space any additional thing would take up in my tiny-ass studio apartment. Not a fan of clutter, and I don't have room for anything else currently. One day, mate. One day I shall return and unshackle myself.

Well, PCs will only get smaller and more powerful, so it's not like waiting is a bad thing. In three years you can probably rock something that drives a 4K panel, is the size of an Xbone, and comes with a VESA mount so it goes behind your TV and disappears from view :)
 
We've been seeing differences between PS4/X1 in most multiplats, except a handful of games that aren't the most demanding. If this game is as demanding as it seems, how come both will run at the same res? Unless the devs decided to reach a certain target and don't want to push further. First they said 1080p was looking to be the PS4 res, backed by two things: the X1 managed 900p, and the power gap should be sufficient to run 1080p. Now are they backtracking on what they said?
 
We've been seeing differences between PS4/X1 in most multiplats, except a handful of games that aren't the most demanding. If this game is as demanding as it seems, how come both will run at the same res? Unless the devs decided to reach a certain target and don't want to push further. First they said 1080p was looking to be the PS4 res, backed by two things: the X1 managed 900p, and the power gap should be sufficient to run 1080p. Now are they backtracking on what they said?

If they are both the same resolution, maybe the PS4 version will have some higher-quality effects, better IQ, or better performance. Or maybe a combination?
 
We've been seeing differences between PS4/X1 in most multiplats, except a handful of games that aren't the most demanding. If this game is as demanding as it seems, how come both will run at the same res? Unless the devs decided to reach a certain target and don't want to push further. First they said 1080p was looking to be the PS4 res, backed by two things: the X1 managed 900p, and the power gap should be sufficient to run 1080p. Now are they backtracking on what they said?

It's possible that, for various reasons (perhaps some per-pixel process like shading), running at the same resolution frees up resources elsewhere on the PS4, resources that they'd rather have.

So it may be that they are both 900p (who knows if that will actually be the case), but the PS4 may feature better LOD levels, less pop-in, better shadows, more particle effects?

In other words, I doubt they will truly be on "par".
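For scale, here's the raw pixel math behind the 900p-vs-1080p tradeoff (a hypothetical Python sketch, assuming 900p means 1600x900):

```python
# Pixel counts for the two candidate resolutions.
pixels_1080p = 1920 * 1080  # 2,073,600 pixels
pixels_900p = 1600 * 900    # 1,440,000 pixels

ratio = pixels_1080p / pixels_900p
saved_pct = 100.0 * (1 - pixels_900p / pixels_1080p)
print(f"1080p is {ratio:.2f}x the pixels of 900p; "
      f"dropping to 900p saves ~{saved_pct:.0f}% of per-pixel work")
```

That roughly 30% of per-pixel shading headroom is exactly the kind of budget that could go into better LODs, shadows, or particles instead.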
 
If they are both the same resolution, maybe the PS4 version will have some higher-quality effects, better IQ, or better performance. Or maybe a combination?

I might be wrong, but didn't they say they weren't trying to differentiate the two versions on effects/particles and whatnot? The only two meaningful ways of harnessing the overhead the PS4 has over the X1 would be either fps or res. The difference isn't high enough for a meaningful, feasible fps gap (40-42 fps, for example, would look awful; better to bring it to 30). So the only thing left is 1080p, which they had hinted at, hence why I'm confused by this backtracking, that's all.
 
I might be wrong, but didn't they say they weren't trying to differentiate the two versions? The only two meaningful ways of harnessing the overhead the PS4 has over the X1 would be either fps or res. The difference isn't high enough for a meaningful, feasible fps gap (40-42 fps would look awful; better to bring it to 30). So the only thing left is 1080p, which they had hinted at, hence why I'm confused by this backtracking, that's all.

I think you are oversimplifying. What they actually said was that they weren't developing extra graphical features for one platform or another (except PC, apparently, with dat sweet wolf fur). That doesn't preclude them from tweaking shadow quality, LOD quality, the number of particles in various effects, better anti-aliasing, etc., on the PS4.
 
That i7 is most likely quite a bit faster than those Jaguars in the consoles, though it's not exactly top of the line either. Laptops are just not meant for high-end gaming, in my humble opinion.

You are not the first to voice this opinion, and I think it's justified for you to say so. But I assume you haven't had first-hand experience with gaming on a gaming laptop?

My laptop was top of its line in 2011, and since then it has been serving me quite well even with the heaviest games. To define "well" in my book: medium-high settings, with fps >30 at 900p (native). Yes, that may be considered peasant-tier to some of you, I know, but in my situation it's simply not ideal to have a desktop.

To paint a clearer picture, I can play Crysis 3 at medium, TW2 at medium-high, Metro LL at medium, BI at medium-high, and Wolfenstein TNO at high with some features turned off, to name a few, all at around 30-40 fps. Some even higher.

The only games that gave me a hard time were Metro 2033, Watch Dogs, and L.A. Noire. But those games were notoriously messy and unoptimized.

I guess gaming at 900p has its advantages.

But enough with derailing the thread.

My laptop's GPU is slightly weaker than the XBO's, at around 1.12 TFLOPS. Factoring in other elements, I believe I can run the game at least at low-medium at 900p.

Hope might not be the right word; it should be dream.
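For a sanity check on those numbers: peak single-precision throughput for these GCN-era GPUs is just 2 FLOPs per shader per clock, so the comparison is easy to reproduce (hypothetical Python; the 1.12 TF laptop figure is from the post above, and the console shader counts and clocks are the widely reported specs):

```python
def peak_tflops(shaders: int, clock_mhz: float) -> float:
    """Peak single-precision TFLOPS: 2 FLOPs (one FMA) per shader per cycle."""
    return 2 * shaders * clock_mhz * 1e6 / 1e12

xbo = peak_tflops(768, 853)   # Xbox One: 768 shaders @ 853 MHz -> ~1.31 TF
ps4 = peak_tflops(1152, 800)  # PS4: 1152 shaders @ 800 MHz -> ~1.84 TF
laptop = 1.12                 # the laptop GPU figure quoted above

print(f"XBO ~{xbo:.2f} TF, PS4 ~{ps4:.2f} TF, laptop ~{laptop:.2f} TF")
```

Peak FLOPS is only one axis, of course; memory bandwidth, ROPs, and thermal headroom matter just as much, especially in a laptop chassis.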
 
You are not the first to voice this opinion, and I think it's justified for you to say so. But I assume you haven't had first-hand experience with gaming on a gaming laptop?

My laptop was top of its line in 2011, and since then it has been serving me quite well even with the heaviest games. To define "well" in my book: medium-high settings, with fps >30 at 900p (native). Yes, that may be considered peasant-tier to some of you, I know, but in my situation it's simply not ideal to have a desktop.

To paint a clearer picture, I can play Crysis 3 at medium, TW2 at medium-high, Metro LL at medium, BI at medium-high, and Wolfenstein TNO at high with some features turned off, to name a few, all at around 30-40 fps. Some even higher.

The only games that gave me a hard time were Metro 2033, Watch Dogs, and L.A. Noire. But those games were notoriously messy and unoptimized.

I guess gaming at 900p has its advantages.

But enough with derailing the thread.

My laptop's GPU is slightly weaker than the XBO's, at around 1.12 TFLOPS. Factoring in other elements, I believe I can run the game at least at low-medium at 900p.

Hope might not be the right word; it should be dream.

Laptop gaming is fine, so long as you know what to expect, and you seem like you do, and are happy with that level of performance for the money spent. Sounds good to me!

More happy news for you: with Mantle, with DX12, and with OpenGL plus extensions, your CPU will only perform better when it comes to 3D games on modern engines utilizing those modern APIs.

And yeah, sorry for continuing the hijack.
 
I think you are oversimplifying. What they actually said was that they weren't developing extra graphical features for one platform or another (except PC, apparently, with dat sweet wolf fur). That doesn't preclude them from tweaking shadow quality, LOD quality, the number of particles in various effects, better anti-aliasing, etc.

Oh, ok, my bad :)
If it's something like UFC then I don't mind. What really pisses me off is when capable devs leave unused power on the table; if they at least try to take advantage of the PS4's advantages (be they big or small), then I'm happy. I'm not expecting PC quality here, but if they can push the game further without breaking shit, they should do so.
 
As far as I can tell, no one has made a thread about that new interview, which covers quite a few topics that would be great to talk about but aren't necessarily related to this thread... if someone wants to do that... :)
 