
The Witcher 3 probably not 1080P on consoles

Still keeping my Xbox One Collector's Edition pre-order... until I see something official that says anything like "single-digit frame rate" and/or "excessive screen tearing".
 
Jesus guys, just turn down the settings until you can get 1080p/30fps on the PS4. It's not hard.

As someone without a gaming PC I'm relying on them doing a good job here.
 
The conspiracy theorists who think their "GodStation 4" was intentionally gimped are hilarious.
Guys, it isn't as strong as you think it is.
 
900p on PS4 is the lowest I'd accept. I'd rather have slightly reduced shaders, weaker AA, or shorter fade-in of distant objects than 720p.

Even on solid gaming PC rigs, their games tend to run horribly.

It's their post processing methods.
 
That still does not make an ACE similar to an SPU. ;-)

On the GPU, (programmed) computations are performed on the CUs, and the ACEs are there to schedule and dispatch workloads to the CUs. On Cell, the SPUs might be very roughly comparable to the CUs, since they also perform programmed computations, but they do it much more autonomously. There is no fire-and-forget hardware logic (like the ACEs) to schedule workloads to them; scheduling had to be programmed manually, following one of many possible patterns.

The example is in the context of how both work in parallel. SPU work is all about batching the work on each unit so that when a job finishes you don't get batch stalls in the gaps; you keep the gaps small and staggered. The branching logic on Cell was poor, so to make it work well you had to build prediction into your methods.

With GPGPU, the large queues on the ACEs can reorganise the GPU workload each cycle, meaning if something changes mid-job they can queue up another task to fill the gap.

Layman's example, but that's generally how it works; a rough code sketch of the difference is below.
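
To make the batching point concrete, here's a minimal CPU-thread sketch of the two scheduling styles described above. This is hypothetical C++ for illustration only: run_job stands in for real work, the unit/job counts are made up, and plain threads are only a loose analogy for SPUs or CUs, not actual Cell or GCN code.

#include <atomic>
#include <thread>
#include <vector>

static void run_job(int /*job*/) { /* placeholder: do some work */ }

// SPU-style: the programmer statically batches work per unit up front.
// If batches finish unevenly, a unit sits idle -- the "batch stall" gaps.
void static_batches(int units, int jobs) {
    const int per = jobs / units;  // assumes an even split, for brevity
    std::vector<std::thread> pool;
    for (int u = 0; u < units; ++u)
        pool.emplace_back([=] {
            for (int j = u * per; j < (u + 1) * per; ++j)
                run_job(j);
        });
    for (auto& t : pool) t.join();
}

// ACE-style "fire and forget": every unit pulls the next job from a shared
// queue the moment it is free, so a slow job elsewhere never leaves it idle.
void shared_queue(int units, int jobs) {
    std::atomic<int> next{0};
    std::vector<std::thread> pool;
    for (int u = 0; u < units; ++u)
        pool.emplace_back([&] {
            for (int j; (j = next.fetch_add(1)) < jobs; )
                run_job(j);
        });
    for (auto& t : pool) t.join();
}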
Thank you for the explanations. Well, the ACEs could be really helpful in a game like The Witcher 3. I hope they try to use them in the PS4 port.
 
Really? They are six months from release... they are going to work on optimization of the game over the coming months, so give them time before expecting a clear statement about this sort of thing... jesus...

hey, i could get hit by a car & die tomorrow! i need to know now! :) ...
 
The Witcher 2 was very graphically intensive. Even with a GTX 570 and an Intel i5-3570K, a combo that murders the Xbox One and PS4, I struggled to run it smoothly on very high settings at 1080p.

The Witcher 2 ran smoothly for me on my 560 Ti on high settings. Just gotta turn down the AA and Ubersampling.
 
Of course it applies. You just proved it. Someone claimed the anims in The Witcher 3 don't look technically impressive, and you said animations are artistic only and not reliant on tech.
No, I didn't.

I pointed out how tech and hardware can significantly improve animations by making them completely dynamic instead of canned and fixed.
You pointed out something that was both obvious and contextually irrelevant.

Yes, The Witcher 3 does not use dynamic animations. That's probably why some people feel they look "last gen".
No, it's not; it's because they don't like how they look (which is fair to some extent, it just doesn't relate to "gens").
Plenty of games that I'm sure you wouldn't criticize for "looking last gen in their animations" don't use dynamically generated animations either.
But I already spoke about this on my previous post, which you probably skipped.
 
If so, then the resolution should be at least 900p, so I can't see the problem. If it's 900p now, they can keep polishing the game until February. And I think even 900p looks great here; hoping there won't be pop-in issues and such.

Yeah, I don't see it going below 900p for either console; 1080p seems like a maybe on PS4.
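
For reference, the raw pixel math behind the 900p vs. 1080p debate (rough numbers only, ignoring per-frame costs that don't scale with resolution):

1920 x 1080 = 2,073,600 pixels
1600 x 900 = 1,440,000 pixels

So 900p shades about 69% as many pixels per frame as 1080p; flipped around, 1080p is roughly 44% more pixel work at identical settings.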
 
Console only gamers should probably count themselves lucky their machines will even be able to run this game at all (let alone run it in HD) and pray there are no further compromises to come down the road.

They probably don't; even the 7-year-old Xbox 360 could run The Witcher 2 at 720p30. A next-gen console will surely run whatever game is coming out ~1 year after its launch. The Xbone footage at E3 looked like Ultra settings; if that's 720p, I'm okay with that. The PS4 is a beast of a machine: nearly 2 TFLOPs of raw power, 8 ACEs, and 18 compute units of 64 shaders each (1,152 total), backed by 8 gigs of GDDR5. With so much potential, it's only a matter of optimization whether it hits 1080p or not.
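
For what it's worth, the "nearly 2 TFLOPs" figure follows from the standard peak-throughput arithmetic, assuming the commonly cited 800 MHz GPU clock:

18 CUs x 64 shaders per CU = 1,152 shaders
1,152 shaders x 2 FLOPs per cycle (fused multiply-add) x 0.8 GHz ≈ 1.84 TFLOPS

That's a theoretical peak rather than sustained throughput, which is exactly where the optimization question comes in.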
 
They have some kind of deal with Microsoft. They said that in response to the outrage about exclusive pre-order stuff for the Xbox One version.

I doubt it will be anything more than just exclusive pre-order content.

The outrage that will commence if Microsoft paid for parity will be fun to watch, though.
 
They have some kind of deal with Microsoft. They said that in response to the outrage about exclusive pre-order stuff for the Xbox One version.
I hate doing the "Sony Too™" dance, but they have plenty of get-it-first, pre-order, and exclusive content for their games as well. I disagree with all of it anyway.
 
Send in the MS engineers, this is unacceptable.

MS should unlock the hidden GPU in the external power-supply to boost the resolution up to 1080p.
If this isn't enough, they should strip away the Xbone controller. The additional resources allow access to up to 10% additional CPU performance, I've heard.
 
Well, they'd better hope they can get it to 1080p on PS4, or else I'll just be waiting on a Steam sale, since I haven't beaten TW2 and I'll likely have to upgrade my PC to run it nicely. Hopefully this isn't just a PR codeword for "we're forcing parity on consoles because MS is nice to us".
 
I wonder what this is going to run like on my PC. I guess 60fps is out of the question. I got ~80 on TW2, without cinematic DOF.
 
I'm going to play on PC anyway because of the save carry over, but I definitely won't be running at high settings on my 7850.
 
They probably don't; even the 7-year-old Xbox 360 could run The Witcher 2 at 720p30. A next-gen console will surely run whatever game is coming out ~1 year after its launch. The Xbone footage at E3 looked like Ultra settings; if that's 720p, I'm okay with that. The PS4 is a beast of a machine: nearly 2 TFLOPs of raw power, 8 ACEs, and 18 compute units of 64 shaders each (1,152 total), backed by 8 gigs of GDDR5. With so much potential, it's only a matter of optimization whether it hits 1080p or not.

I pray that this is not a serious post.
 
I was buying it on PC anyway for the 60fps. This just makes the decision easier. But like I said in the Ryse threads, I can't tell the difference between 900p and 1080p on my TV. It still looks great if the details are there (lighting, shadows, textures, particles, AA, AF, etc.).
 
A GTX 570 is absolutely weaker than the GPU in the PS4, probably weaker than the GPU in the XB1 (I forget, tbh). Not that this matters much.

But this isn't even true...
The PS4 GPU sits roughly at the midpoint between a 7850 and a 7870. If anything, it's roughly equal to a GTX 570.

Might this be the game to push me to buy a second 4GB GTX770? We'll see. After gaming at 1440p/60fps, it's kind of difficult to go back to lower resolutions along with lower IQ.
 
Physical items only. I guess that matters to some, but the game content is identical.
Well, I don't think it is wrong, and I don't disagree with them... it is fine to me, but it contradicts what you wrote:

"CDPR has held the line on no console exclusives whether they be timed, retailer, etc."

They have console exclusivity... so it is not true.

Really? I guess I missed that. Not that it affects me in any way, but I thought they were "above" such practices.
They are not... there are exclusive items for the XB1 in the Limited Edition.

But it is fine... just don't make them out to be different from other devs.
 
I hope my 760 can handle this on high-ish settings at 1080p. I kind of would rather have played on consoles, but this is going to be PC all the way, it seems.
 