Yeah. that one is great too. I was surprised they dropped it for COD WW2 in favor for a lower native resolution.
They didn't drop it.
All three COD studios use their own branches of the engine. Their renderers all work differently.
I came in here to get hyped about a Witcher 3 4K patch...and all I got was confused about checkerboard rendering. "This is NeoGAF". Lol I need a T-shirt with that silk-screened on it or something. Hehe
Seriously tho, I'm cautiously optimistic about this patch... hopefully they'll go for better IQ (others mentioned draw distance, LODs, etc.) rather than just a straight up-rez. But in any case, this is a perfect example of why so many of us love CDPR. It's why I have W3 on Xbone/PS4/PC and haven't even played it yet, lol. Bought copies for friends too. These guys rock in just about every way.
Edit:
I was thinking the exact same thing, but with how busy they are on Cyberpunk I don't want to get my hopes up for anything beyond a really bare-bones up-rez. I'd rather be pleasantly surprised than disappointed.
You said you knew before I chimed in, yet you still don't understand.
You posted a quote that was talking about 4 million pixels thinking it was about checkerboard rendering, so I posted and told you that they were talking about Temporal Injection in that quote. I also responded to someone else to show them that CB and Geometry Rendering have a 4K framebuffer, and that it's TI that's using half of a 4K framebuffer.
At this point I'd be happy with 1440p, HDR, and some improved effects.
I never claimed or thought the quote was directly about CBR; everyone has known Spider-Man uses TI for a long time now. I posted that quote because TI and CBR have something in common: they both reuse data from a previous frame in the current frame. You assume too much.
Haven't played any DLC yet, but it's going to look sweet at 4K. Hope we get texture upgrades w/ more memory. This OnQ stuff about CB looking as good is a riot. Do you still believe Sony's cam is as good as Kinect? Please respond.
Dude!
Native 4k >> cb the end lol
Cb isn't close to rendering a game natively computationally
Yeah, admittedly I could have elaborated my original point better, and I thought I did that with my last post. Again, the reason I posted that quote is that TI and CBR are similar in how they both hold information from a previous frame and insert it into the current frame to produce a higher output resolution. That is the only reason I made that post. So if TI uses a half-size framebuffer, why do you think it would be different for CBR?
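To make that shared idea concrete, here's a minimal sketch of checkerboard-style frame reuse (all names and the `shade` stand-in are hypothetical; real implementations work on GPU buffers and reproject history with motion vectors rather than copying pixels directly):

```python
# Toy 4x4 "checkerboard" reconstruction: each frame freshly shades only half
# the pixels (a checkerboard pattern that alternates every frame) and fills
# the other half from the previous frame's output.

W, H = 4, 4

def shade(x, y, t):
    """Stand-in for the renderer: pretend pixel color depends on position and time."""
    return x + 10 * y + 100 * t

def checkerboard_frame(t, prev):
    out = [[0] * W for _ in range(H)]
    for y in range(H):
        for x in range(W):
            if (x + y + t) % 2 == 0:       # half the pixels: shade fresh this frame
                out[y][x] = shade(x, y, t)
            else:                           # other half: reuse last frame's result
                out[y][x] = prev[y][x] if prev else shade(x, y, t)
    return out

f0 = checkerboard_frame(0, None)
f1 = checkerboard_frame(1, f0)

# Only half the pixels were freshly shaded in frame 1:
fresh = sum(1 for y in range(H) for x in range(W) if (x + y + 1) % 2 == 0)
print(fresh, W * H)  # → 8 16
```

The point of the sketch is just the buffer math both techniques share: each frame shades half the target pixel count and leans on history for the rest.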
Can you quote me? You took a hard-edged opinion on two subjects while I've been here. Both Sony platforms have been weaker, but you tried and are still trying to downplay the other. Kinect and XBX, why? FUD is spread through both. Do you "seriously" believe Xbox is weaker? If so, explain; if not, explain your posts.
Native 4K isn't better because it's computationally more expensive, though; it's better because it's more accurate and, because games involve motion, will deliver better quality in practice (it isn't dependent on certain conditions being met).
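The raw cost gap being argued here is easy to put numbers on. A back-of-envelope count of freshly shaded pixels per frame (this ignores the reconstruction pass's own overhead and every cost that doesn't scale with resolution):

```python
# Shaded pixels per frame: native 4K vs. a checkerboard pass that shades
# half the target pixels and reconstructs the rest from the previous frame.
native_4k = 3840 * 2160          # 8,294,400 pixels shaded every frame
checkerboard = native_4k // 2    # 4,147,200 shaded; the rest are reconstructed

print(native_4k, checkerboard, native_4k / checkerboard)  # → 8294400 4147200 2.0
```

So the shading workload roughly halves, which is exactly why the technique exists, and also why nobody claims it's computationally equivalent to native.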
My god, your posts are always so poorly written and strange. Why is that?
😱
What are you even talking about? My posts in this thread haven't had anything to do with Sony vs. MS.
Let them believe cb renders a scene at 4k natively and devs just love to add more processing to create the same image for no reason lol
When did I say it renders native 4k?
He's not saying the 1X is weaker at all, come on now. I don't mind that he quoted me, I'm happy to elaborate further. I just wish he didn't talk to people as if we're all clueless.
Everyone is talking about HDR, 4K and checkerboarding, but I would love to see them go back to the original renderer they used to show off the game for the first time.
It's probably more time-consuming, but I would imagine they wouldn't mind going back and releasing their original vision of this game. They already had that engine up and running and ditched it after the shitty Xbox and PS4 GPU specs were revealed. I think 4.2 and 6 TFLOP consoles would easily be able to run these graphics at 1080p.
Give it a good five years and we'll reach that. In general, I mean.
God of War / Assassin's Creed Origins are a good beginning
I'm not a fan of T.I. yet... proper CBR at 4K is superior to me. When I played R&C, I could still see lots of aliasing/shimmering and some visual noise in parts... at 4K of course, never played it downsampled. I'm looking at Spider-Man and it looks much better IMO (the AA is better, that is), but I also happen to see a slight shift from a very sharp image to a slightly blurrier one at times as Spider swings at pace... not so much MB, but on the geometry itself. I guess we shall see when the final game drops, but proper CBR like in HZD seems to maintain such a sharp image throughout. I guess pros and cons...
Real world results should be quite interesting to observe across the different rendering techniques: their pros, cons, and how software CB plays out against hardware CB, T.I., or native 4K. It will also be interesting to observe how some of these techniques improve in 2018. In just a couple of months we're going to have a chance to see games running first-hand on this hardware, and they'll speak for themselves. The Digital Foundry threads are going to be fascinating.
Maybe I'm misunderstanding things here, but I thought the quote below indicates a half-size framebuffer.
http://www.eurogamer.net/articles/d...tation-4-pro-how-sony-made-a-4k-games-machine

It does, but that's a statement from Digital Foundry, not a direct quote from Mark Cerny. DF sometimes makes false statements when rephrasing technical points.
Plain and simple, rendering native 4K is more demanding and computationally intensive than CB or any other reconstruction.

Yes, native rendering is more demanding and more accurate. That doesn't make it the only acceptable method. SSAA is more demanding and more accurate than SMAA or MLAA, but games that use those latter techniques aren't pointless, or unimpressive because they're "taking shortcuts", or however else we might arbitrarily derogate them.
I wonder if Quantum Break is considered 1080p CB, since it reconstructs at that.
Again, no one is saying CBR is rendering native 4K. However, it is rendering 4K. If you don't find me convincing, ask Microsoft. They make no distinction regarding resolution achieved by different methods.
A weaker 4K, not true 4K.
What I mean is, you know that native is the real deal and CB is not. There's nothing wrong with CB 4K.
"True 4K" has no technical definition, it can be used to mean whatever you wish. And whether something is "real deal" also has no fixed denotation. It's simply a shibboleth used to assert by fiat the presence of authenticity that demands reverence, without enumerating any justifying qualities.
Using vague terms like this only fosters unresolvable argument. Say this instead: native 4K rendering is sometimes more accurate (and always more demanding) than CBR 4K. The details of how and if are specific to each instance, and the acceptability--even the noticeability--of each tradeoff is a personal preference.
The Witcher 3 on One X will almost certainly look and perform better than on Pro. That doesn't mean it's laughable on the weaker console, or "less real", or compromised, or a poor effort.
As for the distinction between temporal injection and CBR, I still have yet to see any detailed discussion of exactly what "temporal injection" actually entails. Obviously it's some sort of reconstruction, but the specifics matter as to what benefits and drawbacks we should expect. Without more data I'd be extremely reticent to state anything definitive about what size buffers it requires, or where it diverges from CBR (which we have multiple developer tech presentations about).
Say this instead: native 4K rendering is sometimes more accurate (and always more demanding) than CBR 4K.

For someone that seems to strive for accuracy in arguments and often "corrects" others about it, it's strange to see you imply that native isn't more accurate than checkerboard in most cases by using the word "sometimes", when it's more often than not more accurate given the nature of motion in video games.
What a disastrous last few pages.
OT: I have no desire to play through again, but I'll reinstall the game on Pro when the patch drops to check the improvements...
Hopefully we get a new thread for that so we can talk about it without all the bs we have on these last few pages.
Well, if you see screen tearing at native 4K then it won't be accurate. So maybe say predominantly accurate, with caveats?
Now I'm not sure I'd put it that way. They have made mistakes, they are only human, but they don't intentionally mislead readers. I would also hope and expect Sony to correct them on any inaccuracies in the article, especially an article that goes into detail on the inner workings of their new system. They made mistakes in the 1X article and corrected them when contacted by MS.

I definitely wasn't implying any nefarious intent on their part! Just that they're humanly fallible, and error can creep in and not always be corrected.
It wasn't an implication, it was a direct claim. Even in motion, some implementations of CBR have very little artifacting or other obvious anomalies. It's clear that the method isn't inherently always worse than native rendering for a vast majority of frames. Of course, not all implementations are this good, and in practice native 4K is better almost all of the time. (Especially if we concentrate on static frame output, and ignore the persistence effects of current display technologies and the human visual system.)
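A toy illustration of why motion is the hard case for any history-based technique: a reused sample from the previous frame can stop matching the scene, so reconstruction typically checks it against freshly shaded neighbours and falls back to interpolation when it's an outlier (hypothetical 1D logic; real CBR does this on colour buffers with motion-vector reprojection):

```python
# History rejection in one dimension: keep the sharp reused sample when it
# agrees with its freshly shaded neighbours, otherwise interpolate (softer).

def resolve(prev_sample, left_neighbor, right_neighbor, tolerance=10):
    lo = min(left_neighbor, right_neighbor) - tolerance
    hi = max(left_neighbor, right_neighbor) + tolerance
    if lo <= prev_sample <= hi:
        return prev_sample                       # history agrees: reuse it
    return (left_neighbor + right_neighbor) / 2  # history rejected: interpolate

# Static region: the old sample still fits, so the image stays sharp.
static_pixel = resolve(prev_sample=100, left_neighbor=98, right_neighbor=102)
# Moving region: the old sample is stale, so we blur instead of artifacting.
moving_pixel = resolve(prev_sample=100, left_neighbor=200, right_neighbor=210)
print(static_pixel, moving_pixel)  # → 100 205.0
```

That fallback is why a well-implemented CBR image looks native when static and only softens (rather than visibly breaking) under fast motion.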
So you do believe that native and checkerboard produce the same accuracy for the majority of the time.
I may have been trying to squeeze too much into too few words, so I've gone back and modified the post.
Wait, did someone say something else? The difference is HUGE.
I look at this thread for actual news and the last few pages have been nothing more than a native 4K vs checkerboard 4K gripefest.
Personal view: checkerboard 4K is fine.
Now, when is this coming to PS4 Pro? That's the original question.
Well, KageMaru has already detailed that it's similar to CB. I guess you just don't want to learn.
I can't work out if you are intentionally sounding like you know what you are on about as a form of sarcasm in this thread, or not.

Another one of those posts, bravo... FYI, I do know what I'm talking about, since my eyes were intact when I played Ratchet and watched 4K footage of Spider-Man, I guarantee you. I probably still have the 1.5+ GB file somewhere, probably got it from Gamersyde or a presser, but for once I don't think I will go through the trouble of getting screens, because I don't think you're interested in the discussion, nor want to know what's really going on with T.I. technically...
If this is the subject and OP of this thread, the thread title is really, really misleading LOL
Let Captain Obvious answer your question: You will get the patch between now and November!