
The Witcher 3 4K Patch for Xbox One X

KageMaru

Member
I came in here to get hyped about a Witcher 3 4K patch...and all I got was confused about checkerboard rendering. "This is NeoGAF". Lol I need a T-shirt with that silk-screened on it or something. Hehe

Seriously tho, I'm cautiously optimistic about this patch...hopefully they'll go for better IQ (others mentioned draw distance, LODs, etc) rather than just straight up-rez. But in any case, this is a perfect example of why so many of us love CDPR. It's why I have W3 on Xbone/PS4/PC and haven't even played it yet. Lol Bought copies for friends too. These guys rock in just about every way.

Edit:



I was thinking the exact same thing, but with how busy they are on Cyberpunk I don't want to get my hopes up for anything beyond a really bare-bones up-rez. I'd rather be pleasantly surprised than disappointed.

At this point I'd be happy with 1440p, HDR, and some improved effects.

You said you knew before I chimed in yet you still don't understand.

You posted a quote that was talking about 4 million pixels thinking it was about Checkerboard rendering, so I posted & told you that they were talking about Temporal Injection in that quote. I also responded to someone else to show them that CB & Geometry Rendering have a 4K framebuffer & that it's TI that's using half of a 4K framebuffer.

I never claimed or thought the quote was directly about CBR, everyone has known Spider-Man uses TI for a long time now. I posted that quote because TI and CBR have something in common where they both use data from a previous frame and use that info with the current frame. You assume too much.
 

onQ123

Member
At this point I'd be happy with 1440p, HDR, and some improved effects.



I never claimed or thought the quote was directly about CBR, everyone has known Spider-Man uses TI for a long time now. I posted that quote because TI and CBR have something in common where they both use data from a previous frame and use that info with the current frame. You assume too much.

Dude!

[image]
 

statham

Member
haven't played any DLC yet, but it's going to look sweet at 4K. Hope we get texture upgrades w/ more memory. This OnQ stuff about CB looking as good is a riot. Do you still believe Sony's cam is as good as Kinect? Please respond.
 

onQ123

Member
haven't played any DLC yet, but it's going to look sweet at 4K. Hope we get texture upgrades w/ more memory. This OnQ stuff about CB looking as good is a riot. Do you still believe Sony's cam is as good as Kinect? Please respond.

I haven't said anything about Checkerboard Rendering looking as good.
 

statham

Member

can you quote me? you took a hard-edged opinion on two subjects while I've been here, both Sony platforms have been weaker, but you tried and are still trying to downplay the other. Kinect and XBX, why? FUD is spread through both. do you "seriously" believe Xbox is weaker, if so explain, if not explain your posts.
 

KageMaru

Member

Yeah admittedly I could have elaborated my original point better and I thought I did that with my last post. Again the reason I posted that quote is because TI and CBR are similar in how they both hold information from a previous frame and insert it into the current frame to produce a higher output resolution. That is the only reason why I made that post. So if TI uses a half size framebuffer, why do you think it would be different for CBR?
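For anyone still following the technical side, here's a minimal sketch of that shared idea (shade half the pixels each frame, fill the gaps from the previous frame). It's just an illustration in Python, not CDPR's, Insomniac's, or Sony's actual code; real implementations also use motion vectors and ID buffers to decide when a carried-over sample is still valid.

import numpy as np

def shade_half(scene, frame_index, height, width):
    # Shade only the checkerboard half of the pixels; which half alternates each frame.
    frame = np.full((height, width), np.nan)     # NaN marks pixels not shaded this frame
    ys, xs = np.indices((height, width))
    mask = (xs + ys) % 2 == frame_index % 2      # half the pixels, alternating parity
    frame[mask] = scene(xs[mask], ys[mask], frame_index)
    return frame

def reconstruct(current_half, previous_full):
    # Fill the unshaded pixels with samples carried over from the previous output frame.
    return np.where(np.isnan(current_half), previous_full, current_half)

# Toy usage on a tiny buffer so the alternating pattern is easy to inspect:
scene = lambda x, y, t: (x + 2 * y + t) % 9
output = np.zeros((4, 8))
for t in range(3):
    output = reconstruct(shade_half(scene, t, 4, 8), output)

Point being, in either scheme only about half the target pixels get shaded per frame; the rest of the full-resolution image comes from history.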
 

Space_nut

Member
Yeah admittedly I could have elaborated my original point better and I thought I did that with my last post. Again the reason I posted that quote is because TI and CBR are similar in how they both hold information from a previous frame and insert it into the current frame to produce a higher output resolution. That is the only reason why I made that post. So if TI uses a half size framebuffer, why do you think it would be different for CBR?

Let them believe cb renders a scene at 4k natively and devs just love to add more processing to create the same image for no reason lol
 

onQ123

Member
can you quote me? you took a hard-edged opinion on two subjects while I've been here, both Sony platforms have been weaker, but you tried and are still trying to downplay the other. Kinect and XBX, why? FUD is spread through both. do you "seriously" believe Xbox is weaker, if so explain, if not explain your posts.

What are you even talking about? My posts in this thread haven't had anything to do with Sony vs MS.
 
So for people who've played Witcher 3 on both console and PC, what are considered the best graphical advantages on the PC? What should CDPR be prioritising in the '4K' patch?

Native 4k >> cb the end lol

Cb isn't close to rendering a game natively computationally
Native 4K isn't better because it's computationally more expensive, though; it's better because it's more accurate and, since games involve motion, will deliver better quality in practice (it isn't dependent on certain conditions being met).

Whether that extra expense is worth the performance penalty is ultimately left to developers to decide. We've seen on the Pro that most devs think it isn't, but the Xb1x will have a different budget to work with so maybe they'll favour a different strategy there.
 
can you quote me? you took a hard-edged opinion on two subjects while I've been here, both Sony platforms have been weaker, but you tried and are still trying to downplay the other. Kinect and XBX, why? FUD is spread through both. do you "seriously" believe Xbox is weaker, if so explain, if not explain your posts.

My god, your posts are always so poorly written and strange. Why is that?
 
Tales from our ass tech tales thread? OK.


Native is obviously what you want, but I kind of feel like we're at a point where resolutions are so high now on high-end PC / PS4 Pro and soon XB1X that the difference between native 4K and 4K checkerboard is so subtle you could confuse it with the difference between anti-aliasing settings.

Stuff like 1800p CB is a tad soft, but I would chalk that up to being sub-native rather than down to the checkerboarding. It's basically this 4K gen's 900p to me so far.

Checkerboarding is a cool thing. Happy it exists. Hope it becomes a bigger thing on PC too. Personally I would rather have a 1:1 pixel image, even if it's checkerboarded, than a sub-native one.

- someone who doesn't really know what they are talking about, but I'm just saying how I feel about it.
 

Lady Gaia

Member
Native 4k >> cb the end lol

It's more accurate in ways that may or may not make a noticeable visual impact, but...

Cb isn't close to rendering a game natively computationally

Right. That's the point. There's almost invariably something more visually impressive you can do with the computational savings. That's why MS and Sony both give developers free rein to use the hardware as they see fit, and many developers building new titles opt for something other than native 4K.

There are lots of things that would be more computationally intensive that developers don't do because they have other priorities that make a bigger difference in their games. Don't get so caught up in console wars bullshit that you hitch your wagon to an arbitrary goal. In just a couple of months we're going to have a chance to see games running first-hand on this hardware and they'll speak for themselves. The Digital Foundry threads are going to be fascinating.
 

Mrbob

Member
My home theater PC is more powerful than xb1x and I'd rather play at a lower resolution and have the Nvidia control panel scale to 4k. Sub 4k image looks softer but you get used to it quickly. It's really not that big of a deal. I'd love to see cb support happen natively by Nvidia on PC because attempting to run games at 4k60 takes an incredible amount of power. HDR is more important than resolution so the increase in HDR support for games will make them look better.
 

KageMaru

Member
can you quote me? you took a hard-edged opinion on two subjects while I've been here, both Sony platforms have been weaker, but you tried and are still trying to downplay the other. Kinect and XBX, why? FUD is spread through both. do you "seriously" believe Xbox is weaker, if so explain, if not explain your posts.

He's not saying the 1X is weaker at all, come on now. I don't mind that he quoted me, I'm happy to elaborate further. I just wish he didn't talk to people as if we're all clueless.

Let them believe cb renders a scene at 4k natively and devs just love to add more processing to create the same image for no reason lol

That's the whole point of CBR. Producing results that are relatively close but at a cheaper computational cost is a good thing and shouldn't be frowned upon. I prefer native 4K, but if a developer can create an overall better picture with CBR, I trust their judgment. There will be plenty of games using CBR on the 1X and even next gen.

My god, your posts are always so poorly written and strange. Why is that?

I'm guessing English is not his first language.
 

onQ123

Member
He's not saying the 1X is weaker at all, come on now. I don't mind that he quoted me, I'm happy to elaborate further. I just wish he didn't talk to people as if we're all clueless.

To be honest, I think that's something you're dealing with on your own, because I'm not talking down to anyone; it's actually the other way around. I can simply correct something that someone said that was wrong & y'all just go off saying all types of stuff trying to twist things when I simply posted a fact.
 

Sygma

Member
Everyone is talking about HDR, 4K and checkerboarding, but I would love to see them go back to the original renderer they used to show off the game for the first time.

It's probably more time consuming, but I would imagine they wouldn't mind going back and releasing their original vision of this game. They already had that engine up and running and ditched it after the shitty Xbox and PS4 GPU specs were revealed. I think 4.2 and 6 TFLOP consoles would easily be able to run these graphics at 1080p.

[images of the original reveal footage]

Give it a good five years and we'll reach that. In general, I mean

God of War / Assassin's Creed Origins are a good beginning
 

greenmind

Banned
Give it a good five years and we'll reach that. In general, I mean

God of War / Assassin's Creed Origins are a good beginning

I remember when they first showed Morrowind, and I spooged.... and when it was released it was garbage...

I'll never trust that shiiiit again
 

thelastword

Banned
Yeah admittedly I could have elaborated my original point better and I thought I did that with my last post. Again the reason I posted that quote is because TI and CBR are similar in how they both hold information from a previous frame and insert it into the current frame to produce a higher output resolution. That is the only reason why I made that post. So if TI uses a half size framebuffer, why do you think it would be different for CBR?
I'm not a fan of T.I yet...Proper CBR at 4K is superior to me....When I played R&C, I could still see lots of aliasing/shimmering and some visual noise in parts...At 4k of course, never played it downsampled....I'm looking at Spider-Man and it looks much better IMO, AA is better that is, but I also happen to see a slight shift from a very sharp image to a slightly blurrier one at times as Spider swings at pace...not so much MB, but on the geometry itself.....I guess we shall see when the final game drops, but proper CBR like in HZD seems to maintain such a sharp image throughout....I guess pros and cons...

In just a couple of months we're going to have a chance to see games running first-hand on this hardware and they'll speak for themselves. The Digital Foundry threads are going to be fascinating.
Real world results should be quite interesting to observe on the different rendering techniques. Their pros, cons and how much software CB plays out against hardware CB, T.I or native 4k etc...It will also be interesting to observe how some of these techniques improve in 2018 etc..
 
Maybe I'm misunderstanding things here but I thought the quote below indicates a half size framebuffer.
http://www.eurogamer.net/articles/d...tation-4-pro-how-sony-made-a-4k-games-machine
It does, but that's a statement from Digital Foundry, not a direct quote from Mark Cerny. DF sometimes make false statements when rephrasing technical points.

As for the distinction between temporal injection and CBR, I still have yet to see any detailed discussion of exactly what "temporal injection" actually entails. Obviously it's some sort of reconstruction, but the specifics matter as to what benefits and drawbacks we should expect. Without more data I'd be extremely reticent to state anything definitive about what size buffers it requires, or where it diverges from CBR (which we have multiple developer tech presentations about).

Plain and simple, rendering native 4K is more demanding and computationally intensive than CB or any other reconstruction.

I wonder if Quantum Break is considered 1080p CB since it reconstructs at that
Yes, native rendering is more demanding and more accurate. That doesn't make it the only acceptable method. SSAA is more demanding and more accurate than SMAA or MLAA, but games that use those latter techniques aren't pointless, or unimpressive because they're "taking shortcuts", or however else we might arbitrarily derogate them.

I don't think Quantum Break uses CBR, but another method of some kind. We'd have to know the precise details to say what the rendering resolution is.

Let them believe cb renders a scene at 4k natively and devs just love to add more processing to create the same image for no reason lol
Again, no one is saying CBR is rendering native 4K. However, it is rendering 4K. If you don't find me convincing, ask Microsoft. They make no distinction regarding resolution from multiple methods:

[image: Microsoft's resolution listings]
 

Freeman76

Member
I'm not a fan of T.I yet...Proper CBR at 4K is superior to me....When I played R&C, I could still see lots of aliasing/shimmering and some visual noise in parts...At 4k of course, never played it downsampled....I'm looking at Spider-Man and it looks much better IMO, AA is better that is, but I also happen to see a slight shift from a very sharp image to a slightly blurrier one at times as Spider swings at pace...not so much MB, but on the geometry itself.....I guess we shall see when the final game drops, but proper CBR like in HZD seems to maintain such a sharp image throughout....I guess pros and cons...


Real world results should be quite interesting to observe on the different rendering techniques. Their pros, cons and how much software CB plays out against hardware CB, T.I or native 4k etc...It will also be interesting to observe how some of these techniques improve in 2018 etc..


I can't work out if you are intentionally sounding like you know what you are on about as a form of sarcasm in this thread, or not.
 

N21

Member
Again, no one is saying CBR is rendering native 4K. However, it is rendering 4K. If you don't find me convincing, ask Microsoft.

A weaker 4K, not true 4K.

What I mean is, you know that native is the real deal and CB is not. There's nothing wrong with CB 4K.
 

Putty

Member
What a disastrous last few pages.

OT: I have no desire to play through again but I'll reinstall the game on Pro when the patch drops to check the improvements...
 
A weaker 4K, not true 4K.

What I mean is, you know that native is the real deal and CB is not. There's nothing wrong with CB 4K.
"True 4K" has no technical definition, it can be used to mean whatever you wish. And whether something is "real deal" also has no fixed denotation. It's simply a shibboleth used to assert by fiat the presence of authenticity that demands reverence, without enumerating any justifying qualities.

Using vague terms like this only fosters unresolvable argument. Say this instead: native 4K rendering is usually more accurate (and always more demanding) than CBR 4K. This is most apparent when comparing individual frames to each other. However, some CBR implementations are good and artifact-free enough that the comparison becomes more equivocal. This is especially true when regarding the final visual effect of gameplay rather than screenshots, as display technology and human persistence and acuity of vision combine to make much of the reduced accuracy unnoticeable.

The Witcher 3 on One X will almost certainly look and perform better than on Pro. That doesn't mean it's laughable on the weaker console, or "less real", or compromised, or a poor effort.
 

Planet

Member
Why did we jump back to square one with the checkerboard rendering discussion? There have been a lot of excellent implementations which were thoroughly dissected, and the findings were just about unanimously positive.

Sure, CB 4k isn't the same quality as native 4k, but those clowns saying it's nowhere near seem to have not paid any attention to the above. It's certainly leaps and bounds better than rendering half resolution and applying regular upscaling.
 

N21

Member
"True 4K" has no technical definition, it can be used to mean whatever you wish. And whether something is "real deal" also has no fixed denotation. It's simply a shibboleth used to assert by fiat the presence of authenticity that demands reverence, without enumerating any justifying qualities.

Using vague terms like this only fosters unresolvable argument. Say this instead: native 4K rendering is sometimes more accurate (and always more demanding) than CBR 4K. The details of how and if are specific to each instance, and the acceptability--even the noticeability--of each tradeoff is a personal preference.

The Witcher 3 on One X will almost certainly look and perform better than on Pro. That doesn't mean it's laughable on the weaker console, or "less real", or compromised, or a poor effort.

Well, this is the place that had a lot of people saying that the base XB1 resolution output for games was laughable.
 

KageMaru

Member
It does, but that's a statement from Digital Foundry, not a direct quote from Mark Cerny. DF sometimes make false statements when rephrasing technical points.

As for the distinction between temporal injection and CBR, I still have yet to see any detailed discussion of exactly what "temporal injection" actually entails. Obviously it's some sort of reconstruction, but the specifics matter as to what benefits and drawbacks we should expect. Without more data I'd be extremely reticent to state anything definitive about what size buffers it requires, or where it diverges from CBR (which we have multiple developer tech presentations about).

Now I'm not sure I'd put it that way. They have made mistakes, they are only human, but they don't intentionally mislead readers. I would also hope and expect Sony to correct them on any inaccuracies in the article, especially an article that goes into detail on the inner workings of their new system. They made mistakes in the 1X article and corrected them when contacted by MS.

I also didn't post that as a negative, I just thought it was relevant to the discussion.
 

Caayn

Member
Say this instead: native 4K rendering is sometimes more accurate (and always more demanding) than CBR 4K.
For someone who seems to strive for accuracy in arguments and often "corrects" others about it, it's strange to see you imply that native isn't more accurate than checkerboard in most cases by using the word "sometimes", when it's more often than not more accurate given the nature of motion in video games.
 

Thebonehead

Banned
Say this instead: native 4K rendering is sometimes more accurate (and always more demanding) than CBR 4K.

For someone who seems to strive for accuracy in arguments and often "corrects" others about it, it's strange to see you imply that native isn't more accurate than checkerboard in most cases by using the word "sometimes", when it's more often than not more accurate given the nature of motion in video games.

Well, if you see screen tearing at native 4K then it won't be accurate. So maybe say predominantly accurate, with caveats?
 

Rellik

Member
What a disastrous last few pages.

OT: I have no desire to play through again but I'll reinstall the game on Pro when the patch drops to check the improvements...

Hopefully we get a new thread for that so we can talk about it without all the bs we have on these last few pages.
 

dr_rus

Member
Checkerboarding uses temporal accumulation too, so it's unclear to me why some people here make a differentiation between checkerboarding and temporal injection. Without knowing the details of the implementations it's really hard to say if they are even as different as people seem to think. The only thing which can be said for sure is that the temporal injection method of frame reconstruction doesn't seem to be using the MSAA h/w of the GPU, which potentially makes it more compatible with a number of rendering approaches out there.

Neither method renders into a full 4K buffer - otherwise they wouldn't be methods of decoupled shading and they just wouldn't work on the PS4 Pro.
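Some back-of-the-envelope numbers, under the simple assumption that both schemes shade roughly half the target pixels per frame (and ignoring the cost of the resolve/reconstruction pass itself):

native_4k   = 3840 * 2160       # 8,294,400 pixels shaded per frame
cbr_4k      = native_4k // 2    # 4,147,200 shaded, the other half reconstructed from history
native_1440 = 2560 * 1440       # 3,686,400, for comparison

print(cbr_4k / native_4k)       # 0.5
print(cbr_4k / native_1440)     # ~1.13, i.e. a bit more shading work than native 1440p

So the shading work is roughly halved even though the reconstructed image that ends up on screen is still 3840x2160.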

Well, if you see screen tearing at native 4K then it won't be accurate. So maybe say predominantly accurate, with caveats?

It would still be more accurate, even with screen tearing.
 
Now I'm not sure I'd put it that way. They have made mistakes, they are only human, but they don't intentionally mislead readers. I would also hope and expect Sony to correct them on any inaccuracies in the article, especially an article that goes into detail on the inner workings of their new system. They made mistakes in the 1X article and corrected them when contacted by MS.
I definitely wasn't implying any nefarious intent on their part! Just that they're humanly fallible, and error can creep in, and not always be corrected.

For someone who seems to strive for accuracy in arguments and often "corrects" others about it, it's strange to see you imply that native isn't more accurate than checkerboard in most cases by using the word "sometimes", when it's more often than not more accurate given the nature of motion in video games.
It wasn't an implication, it was a direct claim. Even in motion, some implementations of CBR have very little artifacting or other obvious anomalies. It's clear that the method isn't inherently always worse than native rendering for a vast majority of frames. Of course, not all implementations are this good, and in practice native 4K is better almost all of the time. (Especially if we concentrate on static frame output, and ignore the persistence effects of current display technologies and the human visual system.)

I may have been trying to squeeze too much into too few words, so I've gone back and modified the post.
 

Caayn

Member
It wasn't an implication, it was a direct claim. Even in motion, some implementations of CBR have very little artifacting or other obvious anomalies. It's clear that the method isn't inherently always worse than native rendering for a vast majority of frames. Of course, not all implementations are this good, and in practice native 4K is better almost all of the time. (Especially if we concentrate on static frame output, and ignore the persistence effects of current display technologies and the human visual system.)

I may have been trying to squeeze too much into too few words, so I've gone back and modified the post.
So you do believe that native and checkerboard produce the same accuracy for the majority of the time.

Checkerboard is a good technique that can produce great results and the difference can be small depending on the scene and method used, but it's not as accurate as native for the majority of the time.
 

Cincaid

Member
Awesome if the update is here soon! I've had the GOTY edition installed on my Pro for months, and since they announced the Pro update at E3 I've just been waiting for that. A bit disappointing that they later downplayed the "in a few days" statement, but hopefully we'll get a release date announcement sometime this week at least.
 

JohnnyFootball

GerAlt-Right. Ciriously.
I look at this thread for actual news and the last few pages have been nothing more than a native 4K vs checkerboard 4K gripefest.

Personal view: checkerboard 4K is fine.

Now when is this coming to PS4 Pro is the original question?
 

Colbert

Banned
I look at this thread for actual news and the last few pages have been nothing more than a native 4K vs checkerboard 4K gripefest.

Personal view: checkerboard 4K is fine.

Now when is this coming to PS4 Pro is the original question?

If this is the subject and OP of this thread, the thread title is really, really misleading LOL

Let Captain Obvious answer your question: you will get the patch between now and November!
 

thelastword

Banned
As for the distinction between temporal injection and CBR, I still have yet to see any detailed discussion of exactly what "temporal injection" actually entails. Obviously it's some sort of reconstruction, but the specifics matter as to what benefits and drawbacks we should expect. Without more data I'd be extremely reticent to state anything definitive about what size buffers it requires, or where it diverges from CBR (which we have multiple developer tech presentations about).
Well, KageMaru has already detailed that it's similar to CB, I guess you just don't want to learn ;)

In any case, I've been looking for some documentation on T.I and have not seen much, though many people are interested in the specifics...The observations I've made on it is just by playing R&C at 4K and watching the Spiderman Footage at 4K from E3...I think AA wise Spider-Man is definitely cleaner/sharper than Ratchet at 4K, but I do see this shift in sharpness on buildings and distinct geometry in Spider-Man at high pace (swinging etc.)...something I didn't observe in Ratchet, (maybe because of all the PP and effects) but on certain distant details and geometry I did observe some weird aliasing and visual noise in Ratchet at times....

Supersampling wise though, have not seen it, but I'm sure it looks much cleaner..Of course, before people get their panties twisted, I'm not saying R&C's AA is bad...I think it's pretty accomplished, but playing it I saw some anomalies.....Hopefully, when Spider-Man lands we can learn a bit more about the technique, what Insomniac is doing at render time, and determine its pros and cons relative to the final 4K image...


I can't work out if you are intentionally sounding like you know what you are on about as a form of sarcasm in this thread, or not.
Another one of those posts, bravo.....FYI, I do know what I'm talking about since my eyes were intact when I played Ratchet and watched 4k Footage of Spider-Man, I guarantee you...I probably still have the 1.5+ GB file somewhere, probably got it from Gamersyde or a presser, but for once, I don't think I will go through the trouble of getting screens because I don't think you're interested in the discussion, nor want to know what's really going on with T.I technically...
 