
The Witcher 3 probably not 1080P on consoles

I never said it couldn't, I said I'm not convinced that any PC could. Lots of unknowns, I'll be proven right or wrong when the game comes out.
If you wanted to be proven right you could have simply narrowed down the argument to single GPU setups. Even then, I'm not at all convinced a Titan Black, 780ti or R9 290X won't run all of that at 1080p/60fps.

We shall see.

The "fact that it could" isn't a fact at all, given it's an unknown... How is that a fact?
Bear with me, I'm not a native English speaker.
 
People have PCs with 4 Titan Blacks in them. That is like 20 Nvidia TF. I highly doubt they are designing their 16.6ms barrier to be reached only by 20-plus TF configs...

That will run this game at 1080p 60. Nvidia will make sure of it. There is no unknown to make you insecure in your presumption... it is obvious.
Gotta disagree with you, the unknown here is the optimization and hardware demands associated with the game.
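For what it's worth, the "~20 TF" figure quoted above can be sanity-checked with a back-of-the-envelope calculation. This is just a sketch using the Titan Black's published specs (2880 CUDA cores, 889 MHz base clock) and the usual theoretical-peak formula; real-world throughput, and especially SLI scaling, will be well below this.

```python
# Theoretical single-precision peak = cores * clock (GHz) * 2 FLOPs per cycle
# (one fused multiply-add per core per cycle), expressed in TFLOPS.
def peak_tflops(cores, clock_ghz, flops_per_cycle=2):
    return cores * clock_ghz * flops_per_cycle / 1000.0

titan_black = peak_tflops(2880, 0.889)  # ~5.1 TFLOPS at base clock
quad_sli = 4 * titan_black              # ~20.5 TFLOPS combined, on paper

print(round(titan_black, 1), round(quad_sli, 1))  # 5.1 20.5
```

So four Titan Blacks do land right around 20 TF of theoretical peak, which is where that number comes from.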
 
Are you seriously comparing Infamous with The Witcher 3??

Well, Infamous might not be as big and diverse as The Witcher, and it's also a completely different setting.
But Infamous is mighty impressive and has very much going on on-screen while maintaining 1080p with the best AA I've seen on consoles and a steady 30FPS.
It also allows exceptionally fast travel in the open world without huge pop-in problems.
Look at DigitalFoundry's stress test: https://www.youtube.com/watch?v=_ZQrIksmPxc
The amount of destruction and particles is insane, and the framerate doesn't drop below 30 except when you do the super attack, but you don't have control over Delsin in that situation anyway, so it's not really an issue.
Keep in mind that the video was pre-patch, so they weren't able to lock the framerate to 30; just imagine it being a rock-solid 30 instead of fluctuating above 30.


I really think The Witcher 3 can be 1080p on Ps4.
 
I really think The Witcher 3 can be 1080p on Ps4.

Ya it can be, but at what cost? This is just one of those games I'm really interested in seeing comparisons because it is so strikingly "next-gen looking". You'd think they'd cut AO for sure, I'd be shocked if that was in PS4 or XB1 versions.
 
Gotta disagree with you, the unknown here is the optimization and hardware demands associated with the game.
Please stop trying to argue semantics. Yes, we don't know how well any game will run on release. But anyone with any understanding of PC hardware can say that there are a lot of PCs out there now that will easily be able to run this at 1080p 60fps maxed.
I can play Star Citizen at 4K @ 40fps maxed with SLI 780s, and that's not even in the first stages of optimization...
 
No game that any AAA competent dev would develop, within the next 3 years, will require 20 TFs for 1080p 60fps.

NONE.

Agreed on the 20 TFs, but you're also not considering the bottlenecks in a beastly PC like that. A GPU only handles so much; it's possible that isn't the only factor here.
 
Definitely a disappointment. This rumor was kicking around prior to their E3 statements, and I guess it might still be the case. Oh well. This is why I built a PC, though, I suppose.
 
Ya it can be, but at what cost? This is just one of those games I'm really interested in seeing comparisons because it is so strikingly "next-gen looking". You'd think they'd cut AO for sure, I'd be shocked if that was in PS4 or XB1 versions.
No way are they cutting AO in a next-gen game.
 
Ya it can be, but at what cost? This is just one of those games I'm really interested in seeing comparisons because it is so strikingly "next-gen looking". You'd think they'd cut AO for sure, I'd be shocked if that was in PS4 or XB1 versions.

At the cost of some of the worst diminishing-returns offenders: ridiculously expensive effects that don't bring huge graphical improvements.

For example high level AA, AO and tessellation.
I just hope they don't cut good AF.

No way are they cutting AO in a next-gen game.
They don't need to cut it. Just tone it down.
Like Watch Dogs on PC, which had very good AO; the PS4 version had toned-down AO and the Xbox One didn't have AO at all.
 
Are you seriously comparing Infamous with The Witcher 3??
Are there major differences in how the two play or in terms of systems complexity? I haven't followed W3 much, but from what I've played of W2 it seemed like a much simpler game than Infamous in terms of what you can do and how you do it. But from what I've seen of W3, it seems like both are the kind of games where you have an open world to traverse, get tasks to accomplish, and battle various kinds of enemies by various means. It's how you battle, and what means you use to traverse, that mainly differs more than anything.
 

Grandmaster does say AO is "significantly lowered on the Microsoft console"... but it might as well be non-existent.

 
Yeah, they seem to struggle with the framerate a bit.

At E3 they said that they have framerate problems on PC as well as consoles.

edit: For anyone that speaks German : http://www.gamestar.de/spiele/the-w.../the_witcher_3_wild_hunt,49062,3057023,3.html

Which is why they delayed the game so they could optimize and polish it before launch. Optimization is generally the last thing you do in development. They have another 6 months to get everything running as smooth as possible. And on consoles of course that could mean lowering resolution as part of the optimization.
 
Looks like I may upgrade my PC for this game after all.

I was hoping the PS4 version would meet my needs, but it sounds like it may be lacking somewhat.

Seems weird they are saying this now. Didn't they already confirm 1080P on PS4?
This, for all three points.

Hope those 800 series GPUs are good.
 
Is this game going to be that much more graphically intensive than 1st party titles like KZ: Shadow Fall or Infamous which both run at 1080p and higher than 30fps the majority of the time?
 
Looks like I may upgrade my PC for this game after all.

I was hoping the PS4 version would meet my needs, but it sounds like it may be lacking somewhat.

Seems weird they are saying this now. Didn't they already confirm 1080P on PS4?

No real performance information will be known til January at the earliest. Sometimes you just aren't able to hit the performance targets you thought you would be able to.
 
Is this game going to be that much more graphically intensive than 1st party titles like KZ: Shadow Fall or Infamous which both run at 1080p and higher than 30fps the majority of the time?

You can't compare The Witcher 3 to Killzone Shadow Fall. They're vastly different games, since one actually has a living world filled with NPCs that go about their daily routine, while Shadow Fall is a linear shooter with a beautiful backdrop that you can't explore.

InFamous Second Son, on the other hand, is a valid argument, but it is still a game that is not that demanding in terms of NPCs, dynamic weather and lighting/shadows, since it has a fixed day/night setting.
 
I'm still baffled by the comment in the OP. The Xbox One demo was running at 900p at high settings. That was before the June SDK update.

Compared to other games, we can reliably say that if the Xbox One can run a game at 900p, the PS4 can do so in 1080p. Now all of a sudden 900p is the maximum for both consoles?

http://www.eurogamer.net/articles/2...-combat-was-deliberately-easy-cd-projekt-says

A 900p resolution is CD Projekt's "minimum", Mamais said. "We will hit 900p no problem. We'd like to get it up to 1080p on Xbox One. That's our goal. Whether we can do it or not I don't know. We've got to squeeze everything we can out of the hardware.

As expected, on PlayStation 4 Witcher 3 outputs at 1080p resolution. "It's just a slightly more powerful machine," Mamais said.
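The "Xbox One at 900p implies PS4 at 1080p" reasoning above is basically pixel-count arithmetic. As a rough illustration (the ~40% figure for the PS4's raw GPU advantage is the commonly cited spec comparison, not anything from CD Projekt):

```python
# Pixel-count comparison behind the 900p-vs-1080p argument.
def pixels(width, height):
    return width * height

p900 = pixels(1600, 900)    # 1,440,000 pixels
p1080 = pixels(1920, 1080)  # 2,073,600 pixels

# 1080p pushes ~44% more pixels than 900p, which is in the same ballpark
# as the oft-cited ~40% raw GPU advantage of the PS4 over the Xbox One.
print(round(p1080 / p900 - 1, 2))  # 0.44
```

That's why the pattern of 900p on Xbox One alongside 1080p on PS4 keeps showing up across multiplatform games.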
 
I'm still baffled by the comment in the OP. The Xbox One demo was running at 900p at high settings. That was before the June SDK update.
It also had performance problems while doing so.

It's not impossible that they decide to cut back on resolution to free up resources.
 
I'm still baffled by the comment in the OP. The Xbox One demo was running at 900p at high settings. That was before the June SDK update.

Compared to other games, we can reliably say that if the Xbox One can run a game at 900p, the PS4 can do so in 1080p. Now all of a sudden 900p is the maximum for both consoles?

http://www.eurogamer.net/articles/2...-combat-was-deliberately-easy-cd-projekt-says

Either the dev is pushing for parity or the gap is officially closed
 
I understand what you're saying... all I'm asking is what the general consensus among PC GAF considers the bare minimum resolution when trying to max out a game in 2015. I get that it's technically possible at any resolution really... but there's gotta be some cut-off point where it's like... "dude, you're running the game at (insert resolution here), why are you even posting in the PC thread claiming you've maxed it out?" The argument started when someone claimed that 60 fps would definitely be doable on PC for this game... and I'm asking how far down in resolution we're talking, and whether it's even relevant, because nobody will play at that resolution regardless. Or maybe they will, how the heck would I know.
On PC the consensus is always to go with the native resolution of your monitor... so if 900p then 900p, 1080p then 1080p, 1440p then 1440p, etc.

If you can't get a solid framerate at the native resolution of your monitor, then you need new hardware.

In absolutely no case will I run a game below the native resolution of my monitor.
 
On PC the consensus is always to go with the native resolution of your monitor... so if 900p then 900p, 1080p then 1080p, 1440p then 1440p, etc.

If you can't get a solid framerate at the native resolution of your monitor, then you need new hardware.

In absolutely no case will I run a game below the native resolution of my monitor.
Agree, but I always recommend purchasing monitors with 1:1 scaling built into the display. That way you can lower resolution without scaling.
 
Agree, but I always recommend purchasing monitors with 1:1 scaling built into the display. That way you can lower resolution without scaling.

You mean with black borders? Pretty sure you can set your GPU to do the scaling, and then any monitor is supported. At least the nVidia control panel has this option; don't know about AMD.
 
You can always buy a smaller monitor :D
That's why you need to choose your monitor smartly when building a PC.

A super-high-resolution monitor with hardware that can't drive it is useless... and a low-resolution monitor paired with high-end hardware is useless too.

Both need to fit each other, otherwise you will be underutilizing one of them.

It is the same when you choose CPU, GPU, memory, etc... you can't mix a weak CPU with a strong GPU for games, for example.
 
I would say the latest video they released is a pretty good reason.
To say that CDPR are just trying to put a bunch of shit in without caring about how it runs?

No reason to think that.

To say that this game will only run ok on top-of-the-line hardware?

Again, no reason to think this.

Maybe its unfair to say absolutely no reason, but perhaps I should say 'very little' reason. You're being way too quick to jump to these conclusions.
 
I'm amazed that people don't know how taxing a game like The Witcher 3 or AC Unity can be. Expecting those games to do 1080p @ 60fps on a modified 7850 is ridiculous, and then you have this guy in the other thread asking for 4K support on consoles :/
 
I'm amazed that people don't know how taxing a game like The Witcher 3 or AC Unity can be. Expecting those games to do 1080p @ 60fps on a modified 7850 is ridiculous, and then you have this guy in the other thread asking for 4K support on consoles :/
Nobody is asking (or should I say 'expecting') for this game to be 60fps on the consoles, to be fair.

Hi scoobs.

I would like to accept your challenge and think I would win.
Ha.
 
To say that CDPR are just trying to put a bunch of shit in without caring about how it runs?

No reason to think that.

To say that this game will only run ok on top-of-the-line hardware?

Again, no reason to think this.

Maybe its unfair to say absolutely no reason, but perhaps I should say 'very little' reason. You're being way too quick to jump to these conclusions.

I was referring to how it will run on consoles - I think that even with a resolution drop it will still be hard to maintain the 30fps they seem to be aiming at. I think they are too ambitious with how much they are filling the world with.
On PCs, I think it won't do so well on something like a 680 without having to drop several graphics options.
My experiences with TW2 on PC (more than one PC, actually) have been extremely troublesome, though, so I might be biased in thinking that they aren't the best at optimizing stuff.

Again, just speculation, and hopefully I will be proven wrong.
 
I think this resolution stuff has really gone to people's heads.

I'm honestly expecting both XB1 and PS4 to be 900, but it's still gonna look freaking incredible no matter what. CDP is primarily a PC dev anyway so it shouldn't be much of a surprise, but still, the 360 port of Witcher 2 was incredible so I'm not worried at all.
I could imagine matching resolutions but PS4 pushing a higher framerate.

I think he can expect more. A few years from now, Witcher 3 will be the Oblivion of this generation in terms of graphics.
I never thought of it like this... O_O
 
Show me a PC that can run this game at 60fps and maxed out and I'll show you a PC that doesn't even exist yet.

Scoobs, that's not even true. Do you keep up with PC gaming/hardware at all, or are you just hypothesizing on something you're unfamiliar with?
 
I'm amazed that people doesn't know how taxing a game like The Witcher 3 or AC Unity can be, expecting those games to do 1080p @60fps on a modified 7850 is ridiculous, then you have this guy in the other thread asking for 4K support on consoles :/

No one is expecting 60 fps; where the hell have you read this absurdity? We are only talking about the possibility of 1080p on PS4, no more. About Unity, well, we know Ubisoft's limits perfectly well. Even WD had some problems; I can't imagine what the next-gen AC will be like. They aren't exactly wizards of optimization.
 