
The Witcher 3 4K Patch for Xbox One X

black070

Member
Where did you read that the patch isn't coming in a few days? Last I read, it was coming soon.

http://www.eurogamer.net/articles/2017-09-07-4k-ps4-pro-update-for-the-witcher-3-coming-in-few-days

By Robert Purchese Published 07/09/2017

UPDATE 9PM BST: CD Projekt Red has since downplayed the "few days" proximity to me. "PS4 Pro and Xbox One X: both technical updates are coming," senior PR manager Radek Adam Grabowski told me. "More details about them, including the exact moment when they will be released, are something we are going to announce when the right time comes." Will it be "just a few days", then, or significantly longer?
 

cakely

Member
So the Pro patch isn't coming in a few days then? Bummer. I hope they don't make us wait for the X release - that would just be kind of needless.

http://www.eurogamer.net/articles/2017-09-07-4k-ps4-pro-update-for-the-witcher-3-coming-in-few-days

UPDATE 9PM BST: CD Projekt Red has since downplayed the "few days" proximity to me. "PS4 Pro and Xbox One X: both technical updates are coming," senior PR manager Radek Adam Grabowski told me. "More details about them, including the exact moment when they will be released, are something we are going to announce when the right time comes." Will it be "just a few days", then, or significantly longer?

Yeah, not "a few days" but hopefully not much longer.
 
Best case it would be similar to what we got with RotTR:
* High Res mode
* Enriched graphics mode (bells & whistles)
* and Performance mode (unlocked fps)

If that's the case that would be fantastic but I would be really happy with Native 4K and much improved AA already. Or would a higher res automatically defeat those jaggies?
 

Colbert

Banned
If that's the case that would be fantastic but I would be really happy with Native 4K and much improved AA already. Or would a higher res automatically defeat those jaggies?

Higher rez would decrease the "jaggies" but not completely eliminate them.

 

KageMaru

Member
Well that sucks. I was looking forward to starting hearts of stone soon.

I doubt that CP will use "the same engine". Even if it's an evolution of the engine used for TW3, it will probably be pretty heavily modified, making any kind of backporting complex and time consuming. It's possible that Pro will get CB only if that's what they use for XBX as well.

Well yeah, I didn't really mean exactly the same engine. However, you do make a good point that the evolution of the engine may not make it possible to backport CBR if they have it implemented in the newest iteration of the engine. Would love more than just a simple bump to 1440p, but that may be wishful thinking on my part.
 
CBR renders half the pixels per frame compared to a native render pass and fills in the missing half of the "checkerboard" from the previous frame or by interpolation.
Both portions of this process, including the reprojection, are in fact rendering. You're using the term too restrictively.

Render != interpolation or (re)construction!
Yes it does. As explained to you, CBR is not the only technique in games that interpolates or reconstructs data for the current frame. All these methods are part of rendering.

Interpolation/extrapolation (I should have used the term "Extrapolate" to begin with) is deriving pixel info from already existing pixels which are a result of a render pass. So they don't pop up from nowhere but they are not "rendered" which means produced by draw calls!
You're using the term "draw calls" incorrectly. And CBR does not necessarily reduce them (though it might).

It's not graphically rendering a scene in native 4K. The GPU is only pushing half the res of 4K and then post-processing to guess the rest
Approximately true, but it's not a "guess". Reprojection uses known motion vectors, some implementations also verify using an ID buffer calculated during the current frame, and many have an error-correction stage at the end as well. All of this is consistent with the general subtlety and cleverness of graphics engineering. Trying to pack pejorative implications into your vague description is unwarranted.

It is half the pixels that are rendered and the result is updated to a full 2160p framebuffer by construction or extrapolation.
Again, you're using the word "rendering" too narrowly. Extrapolations are part of the rendering pipeline, whether reprojection for CBR or depth-difference analysis for SSAO.
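Since the thread keeps arguing past the mechanics, here's a toy sketch of the reconstruction step in Python/NumPy. To be clear, this is my own illustrative code, not CDPR's or any shipping engine's implementation: it assumes a plain alternating checkerboard, per-pixel motion vectors, and a simple neighbour-average fallback, whereas real implementations layer ID buffers and error correction on top.

```python
import numpy as np

def checker_mask(h, w, frame_index):
    """Which pixels get fully shaded this frame (alternating checkerboard)."""
    yy, xx = np.mgrid[0:h, 0:w]
    return (yy + xx + frame_index) % 2 == 0

def cbr_resolve(fresh, prev_full, motion, shaded_mask):
    """Toy checkerboard resolve into a full-size output buffer.

    fresh       -- (H, W, 3) colour buffer, valid only where shaded_mask is True
    prev_full   -- (H, W, 3) previous fully resolved frame
    motion      -- (H, W, 2) per-pixel motion vectors (dy, dx) in pixels
    shaded_mask -- (H, W) bool, True where this frame's half was freshly shaded
    """
    h, w, _ = fresh.shape
    out = fresh.copy()
    ys, xs = np.nonzero(~shaded_mask)               # pixels to reconstruct
    src_y = ys - motion[ys, xs, 0].astype(int)      # reproject into the
    src_x = xs - motion[ys, xs, 1].astype(int)      # previous frame
    ok = (src_y >= 0) & (src_y < h) & (src_x >= 0) & (src_x < w)
    out[ys[ok], xs[ok]] = prev_full[src_y[ok], src_x[ok]]
    # Fallback where history is unusable (e.g. newly revealed screen edges):
    # interpolate from the freshly shaded horizontal neighbours.
    by, bx = ys[~ok], xs[~ok]
    left = fresh[by, np.clip(bx - 1, 0, w - 1)]
    right = fresh[by, np.clip(bx + 1, 0, w - 1)]
    out[by, bx] = (left + right) / 2
    return out

# Static camera demo: zero motion, so every reconstructed pixel should land
# exactly on its history sample and no interpolation fallback is needed.
h, w = 4, 6
mask = checker_mask(h, w, 0)
fresh = np.zeros((h, w, 3))
fresh[mask] = 1.0                        # this frame's shaded half
prev = np.full((h, w, 3), 0.5)           # previous resolved frame
out = cbr_resolve(fresh, prev, np.zeros((h, w, 2)), mask)
print(out[~mask].mean())                 # 0.5 -- all pulled from history
```

In the easy cases (static or predictably moving content) every reconstructed pixel lands exactly on a fully shaded history sample, which is why CBR can be pixel-identical to native there; the interpolation fallback is what costs sharpness.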
 

Colbert

Banned
..
You're using the term "draw calls" incorrectly. And CBR does not necessarily reduce them (though it might).
...

I did not say it reduces the number of draw calls, I said it reduces the cost of draw calls (based on the number of pixels to be calculated/drawn). That's why there is CBR! I would suggest reading more carefully what was said!

Addressing the rest of your response:
And yes, I indeed use the term "rendering" very restrictively to explain the difference between a specific CBR step and a native render path step!

I am really surprised to see that kind of response from you!
 

Lt-47

Member
Interpolation/extrapolation (I should have used the term "Extrapolate" to begin with) is deriving pixel info from already existing pixels which are a result of a render pass. So they don't pop up from nowhere but they are not "rendered" which means produced by draw calls!

You have one weird (and narrow) definition of what the word rendering means... Not that I would care that much if you didn't tell people they don't understand CBR when they clearly do
 

Colbert

Banned
You have one weird (and narrow) definition of what is rendering....

Just for the purpose of explanation, as we're specifically talking about the one step in a render path that renders a scene natively or via CBR by draw calls. That specific step is the main difference in performance requirements between those methods, after all, as those draw calls target half the pixels with CBR compared to native!

I'm not talking about all the other stuff like culling or any post-processing not specific to either method!

You may find it weird; for me it is a tool to emphasize the differences more clearly!

I also give that subject a rest now as I am guilty of derailing the thread! Please forgive me!
 
Render != interpolation?? What? Interpolation is still a form of rendering. Pixels don't pop out of nowhere...

It's not really in the context of native resolution, and by the definition you provided, a simple upscale would be native as well.

There's a lot of subtlety in the rendering pipeline. Do texture samples that come from cache lines not count because they're not being read from GDDR? Do values interpolated from polygon vertexes not count because they aren't computed from scratch at every pixel? We have approximations left and right, but you're absolutely going to draw the line at using perfectly good pixel data from the prior frame as a shortcut?

CBR isn't as accurate as native rendering, but it can be a staggeringly close approximation. It can also fall back to interpolation when the prior frame isn't a good reference, as during rapid motion, at frame edges, or during cuts between scenes or camera angles. Worst case, 50% of the pixels are rendered natively. Best case, 100% were rendered through a conventional lighting pipeline, just not during the last frame.
Textures read from cache lines are still the same as if they were read from the GDDR. CBR degrades the image quality in return for performance. And it does that by only having expensive computations on half the pixels per frame.

It may be a smart algorithm and in best cases even be indistinguishable from native, but it doesn't change what it is.
 

KageMaru

Member
While what is being produced on screen can, and should, be considered rendering, I think Colbert is referring to the number of pixels processed within the framebuffer. At least that's how I've interpreted things so far.
 
While what is being produced on screen can, and should, be considered rendering, I think Colbert is referring to the number of pixels processed within the framebuffer. At least that's how I've interpreted things so far.

By rendered he means going through the rendering pipeline.

That is: Vertexes are processed, converted into fragments, pixel shaders, compute shaders, shadows, transparencies and what not are processed for these pixels, which result in a final color that is the output.

CBR only does that for half the pixels per frame, for the other half it interpolates/extrapolates the position of the already calculated color from the previous frame which is clearly less resource intensive.

Edit: The algorithm might be smarter and the results might be better than a simple upscaling, but as far as rendering goes CBR and a software upscaling doubling the res wouldn't be much different (especially if said upscaling algorithm gets more complex as using AA samples and some temporal component).
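To put actual numbers on "half the pixels per frame" (my own toy figures for a plain alternating checkerboard, not any particular engine's layout):

```python
import numpy as np

W, H = 3840, 2160  # UHD output resolution

# Alternating checkerboard: even frames shade one diagonal half of the
# pixels, odd frames shade the other half.
yy, xx = np.mgrid[0:H, 0:W]
even_half = (yy + xx) % 2 == 0
odd_half = ~even_half

print(even_half.sum())   # 4147200 pixels fully shaded on an even frame
print(W * H)             # 8294400 pixels in the full framebuffer

# Across any two consecutive frames the halves tile the whole buffer with
# no overlap: every output pixel went through the full pipeline once, just
# not necessarily during the most recent frame.
assert not np.any(even_half & odd_half)
assert np.all(even_half | odd_half)
```

That two-frame coverage is the crux of the disagreement: the full pipeline has touched every output pixel, the only constraint is *when* each half was shaded.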
 

Colbert

Banned
While what is being produced on screen can, and should, be considered rendering, I think Colbert is referring to the number of pixels processed within the framebuffer. At least that's how I've interpreted things so far.

By rendered he means going through the rendering pipeline.

That is: Vertexes are processed, converted into fragments, pixel shaders, compute shaders, shadows, transparencies and what not are processed for these pixels, which result in a final color that is the output.

CBR only does that for half the pixels per frame, for the other half it interpolates/extrapolates the position of the already calculated color from the previous frame which is clearly less resource intensive.

Edit: The algorithm might be smarter and the results might be better than a simple upscaling, but as far as rendering goes CBR and a software upscaling doubling the res wouldn't be much different (especially if said upscaling algorithm gets more complex as using AA samples and some temporal component).

Can I contract you in the future to make my comments more comprehensible?
 
Sorry, should've been more clear.. Geralt looks terrible with it.

The monsters though, top notch.

The HairWorks shader for Geralt is terrible. It makes his hair grey at night, wet when not wet, etc... and the cut makes his head flat compared to the default one, as if his skull was sliced.
On monsters, it has the best implementation of HairWorks in any game released or not (yeah, I am talking about FFXV).

Oh, btw, it was me who told the devs to make separate options for enabling HairWorks. Otherwise you would get huge performance drops.
 

BigEmil

Junior Member
Everyone is talking about HDR, 4K and checkerboarding, but I would love to see them go back to the original renderer they used to show off the game for the first time.

It's probably more time consuming, but I would imagine they wouldn't mind going back and releasing their original vision of this game. They already had that engine up and running and ditched it after the shitty Xbox and PS4 GPU specs were revealed. I think a 4.2 and 6 tflop console would easily be able to run these graphics at 1080p.

Yeah this is the visual jump I want.

Not just higher resolution on the current state of the game visuals...

This is why I'm not interested in either the Pro or the X1X: they mainly focus on higher resolutions, and visuals-wise there's not much difference - maybe they add a bit here and there and that's it.
 
About checkerboarding: many devs use checkerboarding to increase resolution on the mid-gen consoles, but not all checkerboarding techniques give equally good results. By far the best and almost flawless one is the one in Horizon. The devs arranged the drawn pixels in a very smart way, minimizing artefacts and squashed colors, which gave a result very close to native resolution. This is very impressive for such a beautiful open world game.
 

Tyaren

Member
I thought I was the only one who finds that HairWorks, even though advanced and very demanding, actually looks worse than other conventional video game hair. Same goes for Lara's hair in Tomb Raider. It has a strange texture to it and is constantly moving and floating as if the character were under water. Very unnatural.

By the way, it is a shame that the "a few days" statement wasn't true after all. Oh well... :(
 

Lady Gaia

Member
CBR degrades the image quality in return of performance.

It’s blanket statements like this that earn the pushback you see in this thread. CBR does not inherently degrade image quality. It’s perfectly capable of producing visuals that are, pixel for pixel, identical to a native rendering. It just can’t do so in all circumstances and chooses to make the performance trade off you reference for circumstances where the compromise is least likely to be visible: areas with complex motion.

Do you see a lot of people arguing that UHD Blu-ray isn’t really 4K because it uses perceptual tricks to choose what to encode and what to extrapolate? Or that the encoding specifically takes advantage of pulling content liberally from prior frames?

And it does that by only having expensive computations on half the pixels per frame.

Pixels pulled from prior frames have already had these expensive computations performed to generate them. Reusing the results of a computation is just another form of caching. Only when a prior reference isn’t readily reused does a CBR algorithm resort to interpolation, which is definitely going to lead to softness ... but typically for far fewer than half the pixels in the frame.

It may be a smart algorithm and in best cases even be indistinguishable from native, but it doesn't change what it is.

A smart algorithm that yields performance advantages while remaining potentially indistinguishable from native rendering? It seems we agree on something after all.
 

JaggedSac

Member
It’s perfectly capable of producing visuals that are, pixel for pixel, identical to a native rendering. It just can’t do so in all circumstances and chooses to make the performance trade off you reference for circumstances where the compromise is least likely to be visible: areas with complex motion.

The more motion there is, the less likely it will be identical to native rendering. This makes it work better in some genres than in others.
 

belvedere

Junior Butler
Outside of the various artifacts dependent on implementation, you'll hear in nearly every DF video that "you need a magnifying glass" or "unless you're closer than a couple of feet away" you likely won't see a difference between CB and native.
 

thelastword

Banned
Hoping the Pro patch adds Hairworks.
No, it's too demanding and it looks worse in many scenarios, especially on Geralt...60fps is a better target.

Everyone is talking about HDR, 4k and checkerboarding but i would love to see them go back to the original renderer they used to show off the game for the first time.

It's probably more time consuming but i would imagine they wouldnt mind going back and releasing their original vision of this game. they already had that engine up and running and ditched it after the shitty Xbox and PS4 GPU specs were revealed. I think a 4.2 and 6 tflop console would easily be able to run these graphics at 1080p.
I think this is a pipe dream; as pipe dreams go, that's actually a lot of work to re-do or apply the first renderer. That would require lots of optimization work, bug checks, etc. for consoles... It would also mean asset quality and textures would have to be improved along with the better lighting, shadows and physics...

But hey, if they do decide to do an Enhanced version for PS5/XBOX 3, I think that's when this will most likely be a possibility...

The HairWorks shader for Geralt is terrible. It makes his hair grey at night, wet when not wet, etc... and the cut makes his head flat compared to the default one, as if his skull was sliced.
On monsters, it has the best implementation of HairWorks in any game released or not (yeah, I am talking about FFXV).

Oh, btw, it was me who told the devs to make separate options for enabling HairWorks. Otherwise you would get huge performance drops.
Absolutely, it looks very bad compared to the conventional hair. It's a waste of computation relative to the visible results imo.

I thought I was the only one who finds that HairWorks, even though advanced and very demanding, actually looks worse than other conventional video game hair. Same goes for Lara's hair in Tomb Raider. It has a strange texture to it and is constantly moving and floating as if the character were under water. Very unnatural.

By the way, it is a shame that the "a few days" statement wasn't true after all. Oh well... :(
No, you're not the only one, but I do find Lara's hair to be one of the best implementations of hair. Not so much the first implementation of TressFX, which was a resource hog and wasn't that great looking - there was clipping, and physics-wise it didn't flow or transition well... The newer TressFX in ROTTR looks much better imo... Apparently rapid packed math seems to be a good solution for tackling believable hair at half the computation, so it will be interesting to see how it improves in shadow...
 

Lady Gaia

Member
The more motion there is, the less likely it will be identical to native rendering. This makes it work better in some genres than in others.

We’re also seeing a lot of experimentation with various flavors of CBR, so don’t be too quick to judge its limitations based on early implementations that show objectionable artifacts. Nor are all forms of motion problematic (panning across a backdrop works fine, for instance, with newly exposed areas resolving full detail within two frames.)

Even areas where interpolation is rampant due to motion aren’t likely to be readily distinguishable by a discerning eye in real time when playing a game instead of studying details, at least at 4K on typical consumer displays. They’ll be progressively more obvious at lower resolutions. In general, I’d rather see the computational headroom saved with CBR applied to other more obvious improvements (be they complex material simulations, draw distance improvements, or other wins.)

For that matter, once you realize that HDMI bandwidth limitations for HDR content at 60Hz result in chroma subsampling, the whole argument for the supremacy of native rendering starts to seem even more suspect.
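The chroma subsampling point is easy to sanity-check with back-of-the-envelope math. Assumptions on my part: the standard CTA-861 4K60 timing (4400x2250 total pixels including blanking) and roughly 14.4 Gbit/s of usable HDMI 2.0 data rate after 8b/10b encoding; exact figures vary with timing details.

```python
# Rough check of why 4K60 HDR often forces chroma subsampling over HDMI 2.0.
TOTAL_W, TOTAL_H, HZ = 4400, 2250, 60   # CTA-861 4K60 timing incl. blanking
HDMI20_DATA_GBPS = 14.4                 # ~18 Gbit/s link minus 8b/10b overhead

def gbps(bits_per_pixel):
    return TOTAL_W * TOTAL_H * HZ * bits_per_pixel / 1e9

full_444_10bit = gbps(30)   # 10 bits x 3 full-resolution channels per pixel
sub_420_10bit = gbps(15)    # 4:2:0 halves the average chroma payload

print(round(full_444_10bit, 2))  # ~17.82 Gbit/s -- doesn't fit
print(round(sub_420_10bit, 2))   # ~8.91 Gbit/s  -- fits comfortably
```

So even a "native 4K" HDR signal at 60Hz typically isn't carrying full-resolution color to the display in the first place.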
 

KageMaru

Member
By rendered he means going through the rendering pipeline.

That is: Vertexes are processed, converted into fragments, pixel shaders, compute shaders, shadows, transparencies and what not are processed for these pixels, which result in a final color that is the output.

CBR only does that for half the pixels per frame, for the other half it interpolates/extrapolates the position of the already calculated color from the previous frame which is clearly less resource intensive.

Edit: The algorithm might be smarter and the results might be better than a simple upscaling, but as far as rendering goes CBR and a software upscaling doubling the res wouldn't be much different (especially if said upscaling algorithm gets more complex as using AA samples and some temporal component).

I know render has multiple definitions, and I shouldn't have insisted on how it should be used, but it looked like there was a misunderstanding of how the word was being applied here.

Can I contract you in the future to make my comments more comprehensible?

 
It's not really in the context of native resolution, and by the definition you provided, a simple upscale would be native as well.
No one has argued that CBR is natively rendering the same number of pixels. I'm saying that it's rendering the same number of pixels, with half done in a different (less demanding but less accurate) manner.

While what is being produced on screen can, and should, be considered rendering, I think Colbert is referring to the number of pixels processed within the framebuffer. At least that's how I've interpreted things so far.
But it's very likely that there's no half-res framebuffer used in most CBR methods. People usually conceive of the process as sequential--first regular render half, then checkerboard half--but there's plenty of reason to run these paths in parallel. Thus they'd fill the same, full-size buffer.

By rendered he means going through the rendering pipeline.

That is: [vertices] are processed, converted into fragments, pixel shaders, compute shaders, shadows, transparencies and what not are processed for these pixels, which result in a final color that is the output.
That's not the pipeline for raytracing; so is raytracing not rendering? It's not the pipeline for signed distance fields; so is that approach not rendering? Your argument leads to bad conclusions. Again, what you've done is assert a narrow definition of "rendering" that doesn't cover the range of methods used to achieve the same ends, and doesn't accord with either natural sense or common usage.

Further, that entire pipeline, start to finish, has been performed for every single pixel of output--it was merely done a frame earlier for half of them. So you not only have to arbitrarily constrain the methods allowed to be "rendering", you have to also constrain the time horizon over which they're performed. This is yet more evidence that your personal definition requires some extremely fine gerrymandering to avoid ever including CBR.

Instead, you can just say that CBR renders the same number of pixels, but half of them are done in a faster but possibly less accurate way. This alternate statement has the twin benefits of being simpler and true.

By far the best and almost flawless one is the one in Horizon. The devs arranged the pixels drawn in a very smart way, thus minimizing effects artefacts and squashed colors which gave a result very close to native solution.
From what I've seen, the Infinite Warfare CBR is also very good.
 

onQ123

Member
To be honest, y'all are misusing "native", but it's what most people know, so trying to explain things would be another disaster.
 

KageMaru

Member
But it's very likely that there's no half-res framebuffer used in most CBR methods. People usually conceive of the process as sequential--first regular render half, then checkerboard half--but there's plenty of reason to run these paths in parallel. Thus they'd fill the same, full-size buffer.

Maybe I'm misunderstanding things here but I thought the quote below indicates a half size framebuffer. Now I'm sure there's a buffer where information from the previous frame is stored to be used in the next frame but I'm not sure that would mean it's the same as a full native 4K framebuffer.

Some developers - eg the developers of Spider-Man and For Honor - are producing their own 4K techniques based on four million pixel framebuffers

http://www.eurogamer.net/articles/d...tation-4-pro-how-sony-made-a-4k-games-machine
 

Space_nut

Member
Maybe I'm misunderstanding things here but I thought the quote below indicates a half size framebuffer. Now I'm sure there's a buffer where information from the previous frame is stored to be used in the next frame but I'm not sure that would mean it's the same as a full native 4K framebuffer.



http://www.eurogamer.net/articles/d...tation-4-pro-how-sony-made-a-4k-games-machine

No matter how much processing is done, CB starts off rendering a scene at half the res of 4K. All this uprendering and reconstruction just adds the rest of the pixels using previous frame data, which is HALF the res of 4K.

Plain and simple, rendering native 4K is more demanding and computationally intensive than CB or any other reconstruction.

I wonder if Quantum Break is considered 1080p CB since it reconstructs at that
 

Neith

Banned
Yeah this is the visual jump I want.

Not just higher resolution on the current state of the game visuals...

This is why I'm not interested in either the Pro or the X1X: they mainly focus on higher resolutions, and visuals-wise there's not much difference - maybe they add a bit here and there and that's it.

Honestly, with HD Reworked, the Super Turbo Lighting mod and the weather mod that goes with it, Witcher 3 looks damn amazing at 4K Ultra with ini tweaks. It looks much closer to the trailers than it ever will on anything else IMO.
 

onQ123

Member
Maybe I'm misunderstanding things here but I thought the quote below indicates a half size framebuffer. Now I'm sure there's a buffer where information from the previous frame is stored to be used in the next frame but I'm not sure that would mean it's the same as a full native 4K framebuffer.



http://www.eurogamer.net/articles/d...tation-4-pro-how-sony-made-a-4k-games-machine

That's Temporal Injection they were talking about, but even geometry rendering has a 4K framebuffer while other parts of the rendering are only HD.
 

Space_nut

Member
When did I say it renders native 4k?

Please add HDR to the game!




That's not how cb rendering works. No need for uprendering/post-processing. CB renders a full 2160p framebuffer. All pixels are being rendered. Just half of them are 100% accurate calculations. Other half is calculated using different methods.

Lmao, how do you think the other half is calculated using different methods without any uprendering/post-processing?? Do you know what you're saying?
 

onQ123

Member
Lmao how do you think the other half is caculated using different methods without no need for uprendering/post processing?? Do you know what you're saying

Checkerboard rendering does render a 4K framebuffer

 

Neith

Banned
Checkerboard rendering does render a 4K framebuffer

Hey can I ask you something? Why are you still a junior on this site? I'm pretty new so excuse me for asking.

Also, whether it is checkerboard or native, it is going to be decent, guys. All the game needs on consoles is a bit more supersampling, but IMO they should really go for draw distance and LODs if possible.
 
1440p with improved AA and draw/LoD improvements would do wonders for the Pro version.

I just hope they don't use all their power shooting for higher resolutions instead.

Anyway, I am hyped. Thanks, Scorpio.
 

onQ123

Member
Hey can I ask you something? Why are you still a junior on this site? I'm pretty new so excuse me for asking.

Also, whether it is checkerboard or native, it is going to be decent, guys. All the game needs on consoles is a bit more supersampling, but IMO they should really go for draw distance and LODs if possible.

Because I was telling people we were going to have 4K games this generation lol
 

KageMaru

Member
That's Temporal Injection they were talking about, but even geometry rendering has a 4K framebuffer while other parts of the rendering are only HD.


Sorry, no offense, but it's hard to take you seriously considering how eager you are to prove you know more than everyone else.

Checkerboard rendering does render a 4K framebuffer

You do realize a 4K framebuffer would be 8.2 million pixels, not 4 million, right?

Hey can I ask you something? Why are you still a junior on this site? I'm pretty new so excuse me for asking.

Also, whether it is checkerboard or native, it is going to be decent, guys. All the game needs on consoles is a bit more supersampling, but IMO they should really go for draw distance and LODs if possible.

IIRC it's because he made a stupid thread or stupid posts.
 
Also, this is off topic, but why tf haven't Rockstar made a Pro patch for GTA V yet? Are they that busy with GTA Online?

CDPR gets a lot of respect for doing this after originally saying that they wouldn't. They really care about their fans.
 

onQ123

Member

Sorry no offense but it's hard to take you serious considering how eager you are to prove you know more than everyone else.



You do realize a 4K framebuffer would be 8.2 million pixels, not 4 million, right?



IIRC it's because he made a stupid thread or stupid posts.

You do realize that the 4 million part is about Temporal Injection, right? Which is what I was telling you in the first place.
 

KageMaru

Member
You do realize that the 4 million part is about Temporal Injection, right? Which is what I was telling you in the first place.

Yes, I knew that before you chimed in. I still don't understand how 8.2 million, which would make up a 4K framebuffer, equals 4 million.
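For reference, the pixel counts being argued about (plain arithmetic, nothing engine-specific):

```python
# UHD ("4K") framebuffer vs. the "four million pixel" figure from the article.
uhd_pixels = 3840 * 2160
half_pixels = uhd_pixels // 2

print(uhd_pixels)    # 8294400 -- about 8.3 million pixels in a full UHD buffer
print(half_pixels)   # 4147200 -- the "four million" figure is half of UHD,
                     # i.e. the portion shaded natively per frame in a
                     # half-resolution reconstruction technique
```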
 
I came in here to get hyped about a Witcher 3 4K patch...and all I got was confused about checkerboard rendering. "This is NeoGAF". Lol I need a T-shirt with that silk-screened on it or something. Hehe

Seriously tho, I'm cautiously optimistic about this patch...hopefully they'll go for better IQ (others mentioned draw distance, LODs, etc) rather than just straight up-rez. But in any case, this is a perfect example of why so many of us love CDPR. It's why I have W3 on Xbone/PS4/PC and haven't even played it yet. Lol Bought copies for friends too. These guys rock in just about every way.

Edit:

A part of me is hoping the patch turns out to be Witcher 3:Enhanced Edition like what they did with Witcher 1 and 2.

I was thinking the exact same thing, but with how busy they are on Cyberpunk I don't want to get my hopes up for anything beyond a really bare-bones up-rez. I'd rather be pleasantly surprised than disappointed.
 

onQ123

Member
Yes I knew that before you chimed in. Still don't understand how 8.2 million, which would make up a 4K framebuffer, equal 4 million?

You said you knew before I chimed in, yet you still don't understand.

You posted a quote that was talking about 4 million pixels thinking it was about checkerboard rendering, so I posted and told you that they were talking about Temporal Injection in that quote. I also responded to someone else to show them that CB and geometry rendering have a 4K framebuffer, and that it's TI that's using half of a 4K framebuffer.
 