The Witcher 3 probably not 1080p on consoles

Awww yeah :D

I'm loving this laptop more and more lol. Thank you GAF for that gaming laptop topic that steered me away from my original choice (so glad I didn't go with that) and toward this instead. Witcher 3, here I come lol.

Actually, expecting a solid 60 on max settings with that system is pretty unrealistic. The 870M is pretty much a 660 Ti with slower memory, if I'm not mistaken. So GPU power around a PS4's, but with a much better CPU.
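(For rough context, a back-of-the-envelope comparison using approximate public specs; Kepler and GCN FLOPS aren't directly comparable, so treat this as a yardstick, not a benchmark.)

```python
# Approximate public specs, illustration only.
# Peak FLOPS = shader cores x clock x 2 ops (one fused multiply-add per cycle).
gpus = {
    "GTX 870M":   {"cores": 1344, "clock_mhz": 941, "mem_gbs": 120},
    "GTX 660 Ti": {"cores": 1344, "clock_mhz": 915, "mem_gbs": 144},
    "PS4 GPU":    {"cores": 1152, "clock_mhz": 800, "mem_gbs": 176},  # bandwidth shared with the CPU
}
for name, s in gpus.items():
    tflops = s["cores"] * s["clock_mhz"] * 1e6 * 2 / 1e12
    print(f"{name}: ~{tflops:.2f} TFLOPS, ~{s['mem_gbs']} GB/s memory bandwidth")
```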
 
Scoobs, that's not even true. Do you keep up with PC gaming/hardware at all, or are you just hypothesizing on something you're unfamiliar with?

Read the discussion afterwards about how 'max settings' was defined. I won't summarise it (and thereby speak for scoobs); best to just read it yourself.
 
I was referring to how it will run on consoles - I think that even with a resolution drop it will still be hard to maintain the 30fps that they seem to be aiming at. I think they are too ambitious with how much they are filling the world with.
On PCs, I think it won't do so well on something like a 680 without having to drop several graphics options.
My experiences with TW2 on PC (more than one PC actually) have been extremely troublesome though, so I might be biased in thinking that they aren't the best at optimizing stuff.

Again, just speculation, and hopefully I will be proven wrong.
You specifically said "and on PC as well", so no, you weren't just talking about consoles.

And most PC gamers have no problem turning down a few settings to get desired performance. That does not mean 'it doesn't run well'.

Not sure what trouble you had with TW2, but I wouldn't say it was 'unoptimized' at all considering how good it looks. Demanding, sure, but not unreasonably so. And the team got TW2 looking pretty good on the Xbox 360, so doubting their ability to 'optimize' is fairly unfounded, I think.
 
My experiences with TW2 on PC (more than one PC actually) have been extremely troublesome though, so I might be biased in thinking that they aren't the best at optimizing stuff.

Again, just speculation, and hopefully I will be proven wrong.
They certainly aren't the best, but they sure as hell tried, a lot. At the time of The Witcher 2 I would say they were at least very decent, based on my laptop being able to run the game on medium-high at 900p. They have grown a lot in terms of knowledge and manpower since then, so I hope this time TW3 will be in a better state optimization-wise at launch.
 
I believe them... I really doubt the consoles can handle that at max resolution.
My PC would actually burn if it had to run it, so I'll take whatever is released on PS4. lol
 
Read the discussion afterwards about how 'max settings' was defined. I won't summarise it (and thereby speak for scoobs); best to just read it yourself.

So are we considering 4K 'maxed' then, along with max in-game settings?

Challenge still accepted :p
 
You specifically said "and on PC as well", so no, you weren't just talking about consoles.

And most PC gamers have no problem turning down a few settings to get desired performance. That does not mean 'it doesn't run well'.

Not sure what trouble you had with TW2, but I wouldn't say it was 'unoptimized' at all considering how good it looks. Demanding, sure, but not unreasonably so. And the team got TW2 looking pretty good on the Xbox 360, so doubting their ability to 'optimize' is fairly unfounded, I think.

TW2 on PC had quite a few issues regarding stuttering and whatnot, especially on ATI cards. While it might not have been as unoptimized as something like Metro 2033 or Crysis 1, it had its fair share of technical problems.
Like I said, after watching their latest video, I feel like they still have some of those issues... the video was a stuttering mess. It just felt clunky and unoptimized. I know it's still a "WIP"; however, I wouldn't be surprised if the final product had some of these flaws, on both PC and consoles, that's all.

But I can see that you won't budge on this, and that's ok. I guess when the game releases all these discussions will be clarified.
 
TW2 on PC had quite a few issues regarding stuttering and whatnot, especially on ATI cards. While it might not have been as unoptimized as something like Metro 2033 or Crysis 1, it had its fair share of technical problems.
Like I said, after watching their latest video, I feel like they still have some of those issues... the video was a stuttering mess. It just felt clunky and unoptimized. I know it's still a "WIP"; however, I wouldn't be surprised if the final product had some of these flaws, on both PC and consoles, that's all.

But I can see that you won't budge on this, and that's ok. I guess when the game releases all these discussions will be clarified.

Emphasis mine.

Did you watch just the YouTube video? Because that version WAS stuttery, probably due to some sort of issue in the transcoding; if you download the Gamersyde.com high-quality MP4 version, there is NO stuttering.

Link (1080p/60FPS): http://www.gamersyde.com/download_the_witcher_3_wild_hunt_35_minutes_gameplay-32805_en.html
 
Alright, so I'm a little annoyed at the (potential) lack of 1080p, especially after seeing what a difference it made for AC4 on PS4. Still, CDPR do amazing work, and even if it is lower, they look to be filling their world up with a staggering amount of content, so if it has to be lower to handle it all smoothly, then so be it. My SLIed GTX 280s would be smitten by this behemoth if I even gave a passing thought to running it on my PC. :(
 
I'm amazed that people don't know how taxing a game like The Witcher 3 or AC Unity can be. Expecting those games to do 1080p @ 60fps on a modified 7850 is ridiculous, and then you have this guy in the other thread asking for 4K support on consoles :/

I expect games this gen to be 1080p on PS4, period. 30fps is fine, but native HDTV resolution isn't too much to ask.
 
Emphasis mine.

Did you watch just the YouTube video? Because that version WAS stuttery, probably due to some sort of issue in the transcoding; if you download the Gamersyde.com high-quality MP4 version, there is NO stuttering.

Link (1080p/60FPS): http://www.gamersyde.com/download_the_witcher_3_wild_hunt_35_minutes_gameplay-32805_en.html

Yes I watched the Gamersyde version.

In fact I am watching it right now and, unless my eyes are really deceiving me and I need to go to the doctor, I can tell you that not only is there quite a bit of stuttering, but the game is running at 30fps, not 60, in the video.

EDIT:

Confirmed - from Gamersyde: "It's worth mentioning that while the video is encoded at 60 fps, the game itself runs at around 30 here."
 
You cannot compare The Witcher 3 to Killzone: Shadow Fall. They're vastly different games, since one actually has a living world filled with NPCs that go about their daily routines, and Shadow Fall is a linear shooter with a beautiful backdrop that you cannot explore.

Infamous: Second Son, on the other hand, is a valid argument, but it is still a game that is not that demanding in terms of NPCs, dynamic weather, and lighting/shadows, since it has a fixed day/night setting.

If my GTX 660 can run this game at 1080p High (which I'm betting it can, since it does for the vast majority of games), then I see little reason why the PS4 can't do the same.
 
I expect games this gen to be 1080p on PS4, period. 30fps is fine, but native HDTV resolution isn't too much to ask.

They probably could do that, but then they'd have to lower something else. It's a give and take with these things.

On PC, usually YOU determine what the trade off will be (if any), on consoles, it's up to the devs.
 
If my GTX 660 can run this game at 1080p High (which I'm betting it can, since it does for the vast majority of games), then I see little reason why the PS4 can't do the same.

We don't know for sure if your 660 will run the game on high at 1080p/60. It might, but then again your CPU is probably more powerful than your PS4's. It all adds up; it all plays a role.
 
Yes I watched the Gamersyde version.

In fact I am watching it right now and, unless my eyes are really deceiving me and I need to go to the doctor, I can tell you that not only is there quite a bit of stuttering, but the game is running at 30fps, not 60, in the video.

EDIT:

Confirmed - from Gamersyde: "It's worth mentioning that while the video is encoded at 60 fps, the game itself runs at around 30 here."

Did I watch a different video? Crap, redownloading... But that sounds disappointing...
 
If my GTX 660 can run this game at 1080p High (which I'm betting it can, since it does for the vast majority of games), then I see little reason why the PS4 can't do the same.

True, but we will not know until we hear something about it or have the game in our hands.
 
TW2 on PC had quite a few issues regarding stuttering and whatnot, especially on ATI cards. While it might not have been as unoptimized as something like Metro 2033 or Crysis 1, it had its fair share of technical problems.
Like I said, after watching their latest video, I feel like they still have some of those issues... the video was a stuttering mess. It just felt clunky and unoptimized. I know it's still a "WIP"; however, I wouldn't be surprised if the final product had some of these flaws, on both PC and consoles, that's all.

But I can see that you won't budge on this, and that's ok. I guess when the game releases all these discussions will be clarified.

That's because it is unoptimized. This is the exact same demo they've been showing since E3, so we're talking about a build that's probably 2-3 months old, for a game that's not out for another 6 months. Here's a really good technical interview with the same guy from the OP; he explains a lot of the things they are doing to polish the game and improve framerate:

https://www.youtube.com/watch?v=tEFBVIKrKco

Gotta love the big gulp when the interviewer tells him people are gonna accuse them of a downgrade.
 
With MS giving the game such a marketing push, I'd wager that parity would be a salient point for consideration. As such, they may make both the PS4 and XB1 versions 900p and lock them down at 30fps w/ v-sync, thereby negating any additional framerate gains inherent to the PS4 version (visual fidelity is not being accounted for, since CDPR already stated that both versions will be identical in that regard).
 
TW2 on PC had quite a few issues regarding stuttering and whatnot, especially on ATI cards. While it might not have been as unoptimized as something like Metro 2033 or Crysis 1, it had its fair share of technical problems.
Like I said, after watching their latest video, I feel like they still have some of those issues... the video was a stuttering mess. It just felt clunky and unoptimized. I know it's still a "WIP"; however, I wouldn't be surprised if the final product had some of these flaws, on both PC and consoles, that's all.

But I can see that you won't budge on this, and that's ok. I guess when the game releases all these discussions will be clarified.
I'm not being stubborn here. I'm simply waiting for a good argument to sway me into believing:

1) CDPR are just throwing a bunch of shit in without caring about how it runs

and

2) That this won't run well on anything except a top-of-the-line machine

I'm not seeing any of this 'stuttering mess' you're seeing. I see 30fps, which looks quite stuttery if you're used to 60fps, of course.
 
Ya it can be, but at what cost? This is just one of those games where I'm really interested in seeing comparisons, because it is so strikingly "next-gen looking". You'd think they'd cut AO for sure; I'd be shocked if that was in the PS4 or XB1 versions.

The most impressive thing about this game visually is the scenery/backgrounds/water; the character models and animations do not look very next-gen. That being said, I think this game running at 900p on PS4 would be best for graphical fidelity and framerate. I also think 900p would be suitable for the XB1 with a few extra sacrifices to effects.
 
Oh god, it didn't hit the holy grail, no buy. Lol, just kidding. Hopefully the frame rate is solid and the game is fun, because, well, that's why I game. I look forward to it.
 
If the frame rate and draw distances are solid enough, then it will still look stunning even at 720p.

PS4 should be 900p though.
 
They probably could do that, but then they'd have to lower something else. It's a give and take with these things.

On PC, usually YOU determine what the trade off will be (if any), on consoles, it's up to the devs.

Yes, no kidding. They should do that. That's what developers do.
 
With MS giving the game such a marketing push, I'd wager that parity would be a salient point for consideration. As such, they may make both the PS4 and XB1 versions 900p and lock them down at 30fps w/ v-sync, thereby negating any additional framerate gains inherent to the PS4 version (visual fidelity is not being accounted for, since CDPR already stated that both versions will be identical in that regard).

At E3 they already talked about PS4 running at a higher resolution.
 
They're using Umbra, which probably makes CPU speed more of a bottleneck.

At least it has been in every other game with Umbra that I've played.
 
A piece of middleware occlusion software slows down a game? Hrm...

Well, it's 100% CPU-based, and the new consoles don't have very powerful CPUs. GW2 uses it, and it runs like ass on AMD CPUs.

Not sure what kind of tricks they could do to optimize it; there was a GDC presentation on it.
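(For anyone wondering what a CPU-side occlusion culler actually does, here's a minimal Python sketch of the general idea; it is not Umbra's actual implementation. Occluders get rasterized into a small software depth buffer, and each object's screen bounds are tested against it before anything is sent to the GPU. All of that runs on the CPU every frame, which is why a weak CPU can become the bottleneck.)

```python
# Minimal sketch of software occlusion culling (the general idea behind
# CPU-based middleware like Umbra; NOT its actual implementation).
# Convention: smaller z means closer to the camera.
W, H = 64, 36  # deliberately tiny depth buffer; real cullers are also coarse
depth = [[float("inf")] * W for _ in range(H)]

def rasterize_occluder(x0, y0, x1, y1, z):
    """Stamp an occluder (a screen-space rect here) into the depth buffer."""
    for y in range(max(0, y0), min(H, y1)):
        for x in range(max(0, x0), min(W, x1)):
            depth[y][x] = min(depth[y][x], z)

def is_visible(x0, y0, x1, y1, z):
    """An object is visible if ANY pixel it covers is closer than the buffer."""
    return any(z < depth[y][x]
               for y in range(max(0, y0), min(H, y1))
               for x in range(max(0, x0), min(W, x1)))

rasterize_occluder(10, 10, 50, 30, z=5.0)   # a big wall at depth 5
print(is_visible(20, 15, 30, 25, z=9.0))    # behind the wall -> False
print(is_visible(20, 15, 30, 25, z=2.0))    # in front of it  -> True
```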
 
Well, it's 100% CPU-based, and the new consoles don't have very powerful CPUs. GW2 uses it, and it runs like ass on AMD CPUs.
Not sure what kind of tricks they could do to optimize it; there was a GDC presentation on it.
They would not use it if only the PC benefited from it.
They are a multiplatform developer.
 
If the R7 265 is capable of running Far Cry 3 at Ultra presets with 4xMSAA @ 1080p while averaging 33 FPS and only dropping to 27 FPS (Source), then there is no reason that, with the same level of IQ, The Witcher 3 cannot run at 1080p30 on the PS4. Sure, that level of IQ might not be 'ultra' any more, but it is still very good.

All it takes is the right combination of settings to get 1080p30. Going down to 900p degrades the entire image, all of the time. It is definitely a balancing act, and if they cannot get a stable frame rate without cutting too many effects out, then dropping the resolution is of course the way to go. I think they should be able to get 1080p30 on the PS4 with a solid frame rate and good IQ, unless the engine is an unoptimised mess of course.
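(To put rough numbers on that resolution trade-off; this is plain pixel arithmetic, nothing game-specific.)

```python
# Pixels shaded per frame at common render resolutions; the ratio is roughly
# how much extra per-pixel work each step up in resolution costs.
resolutions = {"1080p": (1920, 1080), "900p": (1600, 900), "720p": (1280, 720)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.2f} MPix ({px / base:.0%} of 1080p)")
# 900p is ~69% of 1080p's pixel count, i.e. 1080p shades ~44% more pixels --
# that is the budget that would otherwise have to come out of effects.
```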
 
This will need a GPU with 4GB of VRAM and a beefy i7 to run on ultra at a somewhat stable 60fps @ 1080p. I can already imagine the meltdowns in the PC performance thread, not because of bad optimization or drivers, just because the game looks really good and is demanding.
 
This will need a GPU with 4GB of VRAM and a beefy i7 to run on ultra at a somewhat stable 60fps @ 1080p. I can already imagine the meltdowns in the PC performance thread, not because of bad optimization or drivers, just because the game looks really good and is demanding.

This is unfortunately something that will happen.

Some people expect way too much out of their hardware and blame its shortcomings on the game. Some members of the PC community can be so irrational and immature sometimes; demanding games (rightfully so) lead to appalling displays of ignorance.

I look forward to "this game is completely unoptimized because it does not run at 1440p/60fps on a [.....]".
I wish some people knew a bit more about what they are buying before complaining. Besides, PC gaming used to be that way: high-end games could not be maxed out at anything close to 60fps at release, the first Mafia for example.
I would welcome more of those games as long as they are very scalable, but ultra-demanding options are frowned upon nowadays by a non-negligible part of the PC community. When a game's specs are announced, I lose count of how many people throw around accusations of unoptimization without understanding anything about PC gaming's scalability.
Metro LL listed a Titan in its ultra specs and so many people freaked out for no reason; they don't understand what the options are supposed to mean anymore.

I'm saddened by all this. Developers are discouraged from implementing actual high-end, future-proofing presets in their games, and everyone suffers.
 
This will need a GPU with 4GB of VRAM and a beefy i7 to run on ultra at a somewhat stable 60fps @ 1080p. I can already imagine the meltdowns in the PC performance thread, not because of bad optimization or drivers, just because the game looks really good and is demanding.

3GB VRAM will be enough.
 
Sorry if I missed it, but it appears there's a new interview with the lead developer of The Witcher 3, published Aug 21. Here's some stuff:

On Xbox One, ESRAM and 1080p:
I would say that targeting Full HD with a complex rendering pipeline is a challenge in terms of ESRAM usage. Especially when this pipeline is built on deferred rendering. We are definitely taking advantage of the ESRAM on Xbox One, and we know that we have to fit into these limits, so we are constantly optimizing the usage.
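(A back-of-the-envelope sketch of why that's tight. The render-target layout below is a generic deferred-rendering example, not CDPR's actual format.)

```python
# Rough G-buffer size at 1080p vs. the Xbox One's 32 MB of ESRAM.
# Hypothetical per-pixel byte counts for a generic deferred pipeline.
W, H = 1920, 1080
targets = {
    "albedo (RGBA8)":          4,
    "normals (RGBA16F)":       8,
    "material params (RGBA8)": 4,
    "depth/stencil (D24S8)":   4,
}
total_mb = sum(W * H * bpp for bpp in targets.values()) / 2**20
print(f"G-buffer at 1080p: ~{total_mb:.1f} MB vs 32 MB of ESRAM")
# ~39.6 MB > 32 MB: the whole G-buffer can't sit in ESRAM at 1080p, so you
# drop resolution, slim the targets, or juggle them between ESRAM and DDR3.
```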

On Mantle and DX12:
There are definitely plans for exploring different graphics APIs, post launch.

On multi-core use in PCs and consoles:
We are always trying to improve our multi core usage, but quad core CPUs were already quite efficiently used in the Witcher 2. The game, the renderer, the physics and audio systems are all running on different threads, and we are trying to maximize the efficiency. Since both current gen consoles provide 6 cores for the developer, our goal is to make the engine scalable.
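(A toy illustration of that subsystem-per-thread split; a real engine would use job queues and per-frame synchronization rather than free-running threads, and the timings here are made up.)

```python
# Game, renderer, physics and audio each ticking on their own thread,
# as the interview describes; the sleeps stand in for real per-frame work.
import threading
import time

def subsystem(name, work_ms):
    for frame in range(3):
        time.sleep(work_ms / 1000.0)
        print(f"{name}: frame {frame} done")

threads = [threading.Thread(target=subsystem, args=(name, ms))
           for name, ms in [("game", 10), ("render", 12),
                            ("physics", 8), ("audio", 4)]]
for t in threads:
    t.start()
for t in threads:
    t.join()
```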

On console CPU clock speeds:
The slower clock speeds are not really a problem in terms of thread management, but they are definitely problematic in terms of performance. Optimizing the CPU side is definitely one of the challenges we have to tackle to be able to run our game on the consoles.

On taking advantage of PS4's GDDR5 and other graphical features:
We always want to provide the best possible experience to all our gamers regardless of the platform and so we are not aiming to develop special graphical features for any of them.

On how DX12 may help PCs and Xbox One:
I think there is a lot of confusion around what and why DX12 will improve. Most games out there can’t go 1080p because the additional load on the shading units would be too much. For all these games DX12 is not going to change anything. They might be able to push more triangles to the GPU but they are not going to be able to shade them, which defeats the purpose.
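(His point in toy numbers: a lower-overhead API cuts CPU submission cost, but if the frame is GPU-bound on shading, frame time doesn't move. The figures below are made up for illustration.)

```python
# Illustrative only: CPU submission and GPU shading overlap, so frame time
# is governed by whichever side is slower.
cpu_submit_ms = 6.0    # CPU time issuing draw calls under an older API
gpu_shading_ms = 38.0  # GPU time shading the frame at the target resolution
api_speedup = 0.5      # suppose a DX12-style API halves submission cost

frame_ms = max(cpu_submit_ms * api_speedup, gpu_shading_ms)
print(f"Frame time ~{frame_ms:.0f} ms -> ~{1000 / frame_ms:.0f} fps; "
      "the shading units, not the API, set the limit")
```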

Source: GamingBolt. More at the link.

Edit: if someone wants to make a new thread for this info, that's cool. I think there's actually a lot of stuff to discuss with this interview.
 
If the R7 265 is capable of running Far Cry 3 at Ultra presets with 4xMSAA @ 1080p while averaging 33 FPS and only dropping to 27 FPS (Source), then there is no reason that, with the same level of IQ, The Witcher 3 cannot run at 1080p30 on the PS4. Sure, that level of IQ might not be 'ultra' any more, but it is still very good.
I think the 'but this game does blah blah' thing is a bit overplayed here. I mean, I get what you're trying to say, but there's a lot more going on in The Witcher 3, graphically, than in games like Infamous and Far Cry 3.

And they are running on different engines and all, and won't necessarily perform equally.

All it takes is the right combination of settings to get 1080p30. Going down to 900p degrades the entire image, all of the time.
And a lack of AO makes the entire image look flat, all of the time. And a lack of AA makes the entire image look nasty, all of the time. And a lack of texture quality makes the assets look outdated, all of the time. And so on and so on.

It is all about the right combination of settings, but if there's only so much they can do with the settings before they're unhappy with the graphical quality, then resolution might have to be sacrificed a bit.
 
It's probably going to be the most demanding game to run when it releases, considering the beautiful graphics and the vast world.

I don't know if I should be mad that they're not hitting the 1080p mark, since Sony mostly delivers linear experiences and there is no point of comparison.

Is it fair to assume they're not pushing the hardware in order to have a level playing field with the Xbox One in terms of specs, or are the tools for the PS4 not mature enough to deliver a game like The Witcher in 1080p?
 
I think they said a while back that they have a much less complex version of it for consoles.

Where have you read that?
I think it's possible a much coarser variant will make it into both consoles, but it will have nothing to do with proper simulation. Basically, it won't be HairWorks; it will just be your standard fur/hair.

Here is how it looks in motion:
https://www.youtube.com/watch?v=VFWr44ZIEZc&list=UUASOkSclENSWAyIovnY8Wdw
The GameWorks page makes no mention of any other platforms aside from PC:
https://developer.nvidia.com/hairworks
 
Where have you read that?
I think it's possible a much coarser variant will make it into both consoles, but it will have nothing to do with proper simulation. Basically, it won't be HairWorks; it will just be your standard fur/hair.

Here is how it looks in motion:
https://www.youtube.com/watch?v=VFWr44ZIEZc&list=UUASOkSclENSWAyIovnY8Wdw

Ah, you edited after I replied. Yeah, it's probably not HairWorks, but it sounded like they were at least experimenting with some sort of basic simulation for consoles at some point.
 