> The display resolution (which is the only thing those charts are relevant for) and the internal rendering resolution aren't the same thing, and you'll have much better IQ if you render at a higher resolution, even if you then downscale it to a lower display resolution.

According to this chart which is based on 20/20 vision:
You need to be 7 feet from a 55” display before you even BEGIN to notice 4K over 1080p. About 4 feet away to see the full benefit. (Full benefit of 1440p would be between those figures).
That graph has the years 2006-2012 on it; 4K TVs only started releasing that same year, and they weren't available at 55 inches. It's all theory as far as I'm concerned. I don't think any real-world tests were done.
Have you done real-world testing yourself? You own a 65-inch 4K TV. Have you hooked a PC up to it and compared games at 1080p, 1440p and 4K?
The display resolution (which is the only thing those charts are relevant for) and the internal rendering resolution aren't the same thing, and you'll have much better IQ if you render at a higher resolution, even if you then downscale it to a lower display resolution.
> No I know what you mean but even if you don't resolve individual pixels with your eye you can still see what is basically sub-pixel detail that the renderer drew. You can make explicit examples with very high frequency details that illustrate this, for example with moire patterns. A supersampled image is different to one rendered at native resolution, and thus will look different. It can also make itself known temporally, as sub-pixel details will jump in and out of view if you don't supersample.

I think you have it backwards. That chart is derived from the fact that people can see detail at about 60 pixels per degree (nothing really to do with screens, it could be dots on a chalkboard). It actually assumes your display has near perfect black/white contrast (like an eye chart). IQ is moot here, except that if your IQ falls far short of perfect then your image would be considered lower resolution as far as that chart is concerned.
If you’re interested in the topic, this might be helpful to you:
That graph isn’t observations about TVs. It’s just mathematically extrapolated from a single observation that is probably decades old. It’s generally accepted that people can resolve detail at about 60 pixels per degree. It’s objective information that’s the product of actual research (not consumer product research but scientific/medical). That chart just takes that figure and uses math to calculate how much detail you can see from your TV.
Your observations are subjective and are probably the result of some other factors. Earlier in this thread there was a brief discussion about how bad scaling can cause an image to become significantly softer. My guess is that this explains some of your personal experience.
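Those charts are straightforward to reproduce yourself. Here's a minimal sketch of the math (function name is mine; assuming a 16:9 panel and the 60 pixels-per-degree figure mentioned above):

```python
import math

def full_benefit_distance_ft(diagonal_in, horizontal_px, ppd=60.0):
    """Distance (feet) at which one pixel subtends 1/ppd of a degree
    on a 16:9 panel -- any closer and you can resolve more detail."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)
    pixel_pitch_in = width_in / horizontal_px
    return pixel_pitch_in / math.tan(math.radians(1 / ppd)) / 12

# 55" panel: ~3.6 ft for the full benefit of 4K, ~7.2 ft for 1080p,
# which matches the figures quoted earlier in the thread.
print(round(full_benefit_distance_ft(55, 3840), 1))
print(round(full_benefit_distance_ft(55, 1920), 1))
```

Swap in your own diagonal and resolution to check your setup.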
Hellblade 2 looks about the same.
No I know what you mean but even if you don't resolve individual pixels with your eye you can still see what is basically sub-pixel detail that the renderer drew. You can make explicit examples with very high frequency details that illustrate this, for example with moire patterns. A supersampled image is different to one rendered at native resolution, and thus will look different. It can also make itself known temporally, as sub-pixel details will jump in and out of view if you don't supersample.
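That sub-pixel/moire effect is easy to show numerically. A toy sketch (my own construction, not from any real renderer): point-sample a stripe pattern finer than one pixel, once at native resolution and once 4x supersampled then box-averaged, and the two "renders" come out measurably different:

```python
import numpy as np

def render(width, ss=1):
    """Point-sample a high-frequency stripe pattern at width*ss samples,
    then box-average every ss samples down to `width` pixels.
    ss=1 is 'native' rendering; ss>1 is supersampling."""
    n = width * ss
    x = (np.arange(n) + 0.5) / n
    stripes = np.sign(np.sin(2 * np.pi * 150 * x))  # stripes finer than 1 px
    return stripes.reshape(width, ss).mean(axis=1)

native = render(100, ss=1)  # aliased: stripes collapse into a false pattern
super4 = render(100, ss=4)  # sub-pixel detail blends into intermediate greys
print(np.abs(native - super4).max())  # clearly nonzero: the images differ
```

The native render aliases the stripes into a bogus low-frequency pattern, while the supersampled one averages them toward grey, which is exactly why the two look different on screen.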
I don't disagree - but gaming sites, "influencers", and other commentators were telling folk that Valhalla was what "next gen" was going to look like. So, clearly, they're wrong - but it's because Microsoft demoed this stuff as their big "next gen" reveal. So, people took Microsoft at their word, believed what we saw was what the Xbox Series X was going to deliver, and believed we needed to seriously lower our expectations. For that, I put the blame on Microsoft for shitting the bed because, frankly, it's their bed and they chose to shit in it. They could've shown us anything - anything at all. It is not Ubisoft's fault Microsoft themselves decided that Ubisoft's multi-gen multi-platform game was the best choice to introduce the world to the power of their "next gen" console. Whether they meant to or not, Microsoft created the expectation that Valhalla was Series X's "next gen" standard. It's Microsoft's fault for thinking a multi-platform multi-gen title could stack up against the absolute best Sony had to show... when that title is also on Sony's own console. An utterly baffling choice from Microsoft, frankly.
To get real specific: Sony made sure they had something - anything - that offered up a taste of what we can expect from their new console. Sure, this may not be an accurate reflection of Sony's launch titles - no arguments there - but they have provided players something to help us understand what the long term investment in their console can look like. Now we know where that 10.2-ish TFLOPs is going to go, and what our money is going to buy. And, frankly, it's pretty incredible. And look around - people are very excited for the PS5 because we can see what makes it "next gen". We get it - PS4 cannot do what we saw today. We've seen "next gen".
On the opposite side of the fence, to introduce the entire world to the incredible raw potential of their turbo-charged new "next gen" console, Microsoft selected... multi-platform and multi-generational third party titles, most of which will be on Xbox One. And now everyone expects Series X games to look like launch window cross-gen titles because that's what Microsoft selected to show. Months of PR, dozens of articles and blog posts, and an entire hour long presentation, and they still haven't shown off what their 12 TFLOPs can ultimately deliver. And look around - people are very underwhelmed by Microsoft's new console. That's on them. I have no doubt Series X can deliver visuals on par or better than the demo we saw today. But Microsoft haven't really proven it to anyone.
In short: Microsoft had their chance to get out and define next gen, and they dropped the ball in a big, big way. Now, Sony's intercepted the ball and defined "next gen" on their own terms. Microsoft let that happen, so it's absolutely on them.
> If it is, like Epic said, scalable, then it will automatically increase the number of triangles/details up to the limit of the SSD bandwidth.

Let’s remember that this is a demo, not an actual game.
Let’s assume that the demo was made specifically to demonstrate the PS5’s strengths, particularly the SSD speed, and can’t run like that on the Series X or PC. There’s no evidence of that at this stage, but let’s imagine.
If that is the case then third-party games are not going to be designed around those strengths. Third-party games won’t utilise game designs that can only run efficiently on one piece of hardware, so the only games that will take advantage of those specific strengths will be first party.
So nothing really changes. The vast bulk of games are third-party and multi-platform; how they compare remains to be seen.
New fresh video:
Alex from DF just put up an Inside Unreal Engine 5 article
It's a long article; here are some snippets:
- Lumen is not ray tracing, it's using another form of tracing
- lumen also has specular reflections
- large objects are traced through voxels
- medium objects are signed distance fields
- small objects are screen space (like Gears 5 on XseX)
- you can see screen space artifacts in the demo
- uses temporal accumulation like RT, so there's a latency to lighting
- micro-polygon rendering is primarily used in offline rendering like film
- nanite uses a high resolution tiling normal map for fine details to help conserve vram through virtual texturing
- nanite scales the model complexity by how many pixels it takes up
- micro-sized objects are shadowed with screen-space shadows and combined with a virtualized shadow map
- shadow map resolution is aligned with screen resolution
- shadows are filtered to create a penumbra
- unknown if nanite applies to animated objects like foliage, hair, or characters
- demo is dynamic res (mostly 1440p) at 30fps
- resolution scaling is more expensive with this technique
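To make the "scales the model complexity by how many pixels it takes up" bullet concrete, here's a hypothetical sketch (my own, assuming a simple halve-the-triangles-per-level cluster hierarchy; Epic's actual scheme is more sophisticated):

```python
import math

def pick_lod(source_tris, screen_px):
    """Return the coarsest LOD level that still gives ~1 triangle per pixel,
    assuming each LOD level halves the triangle count."""
    if screen_px >= source_tris:
        return 0  # already at or below one triangle per pixel: full detail
    return math.ceil(math.log2(source_tris / screen_px))

print(pick_lod(1_000_000, 1_000_000))  # full-screen statue: LOD 0
print(pick_lod(33_000_000, 2_000))     # tiny on screen: a very coarse LOD
```

The point is that detail cost tracks screen coverage rather than source asset size, which is why a 33-million-triangle statue stays cheap once it's small on screen.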
> Yeah, pretty much... he had a big falling out with Phil over exclusives going to PC. We used to always argue with him and Nxtgen720 and Sonic Wolf but then they all changed to PlayStation.

Was he the one that, according to legend, was harassing people who had a PS4 so that they'd pick an XB1 instead?
> See, even the hardest of hardcore can change sides... I read some research years ago that said that the people most likely to adopt a religion were the ones who held deep religious beliefs to begin with.

Yeah, pretty much... he had a big falling out with Phil over exclusives going to PC. We used to always argue with him and Nxtgen720 and Sonic Wolf but then they all changed to PlayStation.
> How come so little is animated in the demo? I'm curious why deformations aren't a priority, or destructibility, or animation, e.g. the 500 statues are just standing there. The graphics are undeniably awesome and the pipeline savings next-gen level, but what of the gameplay? There is very little there in the way of animated enemies or AI reacting in real time to players' choices/actions.

Well, destructibility was on display. But definitely no AI beyond some bats itching to be put into a soup. That’s why it’s a tech demo and not a game, I guess.
There's more to next gen I want to see these hardware platforms used for.
It's starting to feel like every page in every thread needs the Unreal Engine 4 reveal video posted. Games didn't just match that one, they exceeded it.
> The image quality of Nanite surpasses traditional renderers at 1440p; it baffles pixel counters. Lumen might not be ray tracing, but it is global illumination and should give similar quality, perhaps indistinguishable.

Most games won't hit 1440p 30fps without raytracing?!?
> Imagine that demo trashing every 4K PC game in existence and having superior image quality that even baffled expert pixel counters.

Imagine having to run your graphics-optimized demo at 1440p and 30fps on a platform that is touted for 4K gaming at 60fps.
I assume a lot of people out there would be happy with this though as they couldn't tell the resolution difference and have no problem with 30fps as they are used to it.
> According to this video, Lumen works similarly to RT, but it's way more optimized and can run with good results even without HW acceleration.

The image quality of Nanite surpasses traditional renderers at 1440p; it baffles pixel counters. Lumen might not be ray tracing, but it is global illumination and should give similar quality, perhaps indistinguishable.
> Imagine that demo trashing every 4K PC game in existence and having superior image quality that even baffled expert pixel counters.
Can't imagine that.
So have you done your own real world tests?
> You are aware that this also runs on a PC?

https://nofilmschool.com/sites/default/files/styles/article_2500/public/unreal_engine_5_07.jpg?itok=nRzLw4sw
Look at the image quality: I cannot perceive any aliasing or artifacts in this still image. 4K PC games have pop-in, aliasing, shimmering. Even Nvidia's latest RTX demo from a few days ago had some shimmering in some areas.
> You are aware that it needs an NVMe drive, assuming that's sufficient, and at best runs at only a slightly higher resolution at 30fps.

You are aware that this also runs on a PC?
Are you asking this out of curiosity?
Or are you really seriously saying "I don't believe it unless YOU have PERSONALLY tested it"?
Because if it's the second option, that would be a sad way to live.
You couldn't believe in physics, maths, a round world, or vaccinations unless you or someone you know had done the tests and the science.
You couldn't even believe in other countries until you had visited them, or in planets/stars.
It is just physics vs. human eyes: we don't see more detail beyond a certain distance.
You can't even see the difference between 360p video and 4K video if you put the TV in a stadium and watch it from halfway up the stands.
Same with home setups: unless the viewer sits close enough, 1080p vs. 4K looks the same, or the differences are really, really small.
And most people simply sit too far away to get the (full) benefit: beyond 2-3m you need a huge TV to see a significant difference from Full HD to 4K. That is the point.
I still use a 1080p screen. I have checked 4K screens many times in stores, and while they look great from 1-2m away, from 3-4m they don't look that special.
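Running the 60-pixels-per-degree figure mentioned earlier in reverse makes the same point: it tells you how big a panel would have to be at a given couch distance before 4K is fully resolvable (a quick sketch; the function name is mine):

```python
import math

def diagonal_for_full_4k_in(distance_m, horizontal_px=3840, ppd=60.0):
    """16:9 diagonal (inches) needed so each pixel spans 1/ppd of a degree
    at the given viewing distance."""
    distance_in = distance_m / 0.0254
    pixel_pitch_in = distance_in * math.tan(math.radians(1 / ppd))
    width_in = pixel_pitch_in * horizontal_px
    return width_in * math.hypot(16, 9) / 16

# At a typical 3 m couch distance you'd need roughly a 150" panel
# to get the full benefit of 4K.
print(round(diagonal_for_full_4k_in(3.0)))
```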
> Was he the one that, according to legend, was harassing people who had a PS4 so that they'd pick an XB1 instead?

No. That one would never, but I mean, never, betray his Green God.