
Digital Foundry claims PS5 doesn't exhibit any evidence of VRS (Variable Rate Shading) in the PS5 showcase.

geordiemp

Member
I still think you are on to something, and you have kind of convinced me that Sony might not have put a lot of emphasis on VRS as it is normally viewed/implemented. I am less certain this has to do with the timing of the console; I think it is rather a design choice. This is my logic, which goes back to my previous post.

In the PC GPU space we have all seen the charts that determine success: FPS at 1080p, FPS at 1440p and FPS at 4K across titles at various graphical settings. Within your cost bracket you need to be competitive to survive. Cost = mm2 of silicon.

VRS is a hardware feature that sacrifices some (not much, but some) graphical fidelity to increase FPS: you define criteria per draw, per screen-space region and/or per triangle (roughly speaking) and apply a different amount of shading work based on those criteria. In other words, a small reduction in IQ buys a substantial increase in shading throughput per mm2 of silicon. This makes an awful lot of sense in a PC context for the reasons outlined above.

If you have a design target of 4K/30FPS - which I can see Cerny having - and you have just loaded the console with a significantly higher VRAM budget than any PC platform can achieve (i.e. more textures and higher resolution per texture), you want to make sure you maintain the IQ you have just invested in. What Matt, at least in my mind, alluded to is that Cerny seems to have created priorities per triangle/primitive on the geometry side. If you then use those priorities to guide which primitive shaders to use, you basically have a VRS solution without VRS, if that makes sense. Then you have full control of your IQ, and as long as the API for setting priorities on the geometry side is straightforward, that gives the designer very good control.

Maybe I am wrong (this is a speculation thread after all!) but the solution above would align with Cerny's speech and Matt's Twitter comments and still give VRS-like capabilities, just without the standardised VRS solution (this would also explain the lack of 'standard' VRS talk regarding the GPU). If this is correct, we should expect a slightly larger than normal allocation of mm2 of silicon to both geometry and primitive shaders on the PS5 GPU.
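As a rough illustration of the per-draw / per-region / per-triangle criteria idea described in the quote above (a purely hypothetical sketch, not any console's or API's actual implementation; the metrics and thresholds here are made up):

```python
# Illustrative only: pick a coarse shading rate per screen tile from
# simple content metrics. Real VRS hardware exposes a fixed set of rates
# (1x1 = full rate, 2x1/1x2 = half rate, 2x2 = quarter rate, ...) chosen
# per draw, per primitive, or via a screen-space rate image.

def pick_shading_rate(tile_contrast: float, tile_motion: float) -> str:
    """Return a shading rate string for one tile ('1x1' = shade every pixel)."""
    if tile_contrast > 0.5:   # lots of visible detail: keep full-rate shading
        return "1x1"
    if tile_motion > 0.5:     # fast motion hides coarser shading
        return "2x2"
    return "2x1"              # middle ground: half rate

# A low-contrast, fast-moving tile gets quarter-rate shading.
print(pick_shading_rate(tile_contrast=0.1, tile_motion=0.8))  # -> 2x2
```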

You're assuming a lot based on a first reveal of some games; last gen many PS4 Pro games had a performance mode for 60 FPS. This new gen, Bluepoint have already said Demon's Souls will have both.

So 30 FPS modes will be full 4K by the looks of it, with some RT on PS5. A performance mode will pull out more frame-time-saving tricks for sure, including temporal techniques and others...

We just have to wait; it was only a first showing on a 1080p30 stream.
 
Last edited:

Ahmady

Banned
This doesn't really mean anything. It could still be supported, just not used or shown yet. There is still plenty of time. Hopefully it is supported.
 
because VRS is not about geometry.
True but my point is still viable:

"Variable rate shading, or coarse pixel shading, is a mechanism to enable allocation of rendering performance/power at varying rates across the rendered image. Visually, there are cases where shading rate can be reduced with little or no reduction in perceptible output quality, leading to “free” performance."

In other words: if you manage to use VRS to free up performance without any visible evidence, you won't be able to say whether it uses VRS or not.

Same with antialiasing and so on and so on.

Here's an analogy for that DF statement: "I can't see atoms with my eyes, therefore I argue that my reality isn't based on atoms."
 
Last edited:

geordiemp

Member
I never said that PS5 is RDNA1.
Secondly, the Geometry Engine isn't new. It's new to PS5 and to XSX, not new to RDNA 2.
The Geometry Engine isn't some Sony term. It's an actual graphics engine.
It's like saying compute shader or pixel shader is just a Sony term.

The Geometry Engine is in PS5 and XSX and is on the PC cards.
It's the foundation and backbone of the RDNA 1 architecture. You can't take it out without completely changing the arch.
It's not something that you could or would want to remove. It's like saying "I want to remove async compute."

Now, about primitive shaders vs mesh shaders: they do things fundamentally differently but overlap in some areas.

"The way geometry is handled has however become complex over the years. The mesh shaders of Turing represent the way forward for improving efficiency, and reducing bandwidth and memory requirements. Navi seems to be somewhere in the middle with its primitive shaders which were dysfunctional in Vega. " from /r/AMD

Here is an in-depth analysis of primitive shader vs mesh shader

Nope, Cerny said they are culling the vertices before processing them, and he mentioned brand new features for the new Geometry Engine in RDNA2.

Go watch his presentation again, and try listening.

Listen to Cerny again carefully: he calls the Geometry Engine NEW in PS5 for custom RDNA2 and mentions SYNTHESISE GEOMETRY ON THE FLY as a brand new capability... timestamped.



Cerny said performance optimisations such as removal of back-faced triangles, and removal of vertices and off-screen triangles.
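For anyone unfamiliar with what that kind of pre-rasterization culling means in practice, here is a tiny, purely illustrative sketch (not Sony's or AMD's actual implementation): back-face and off-screen rejection are just cheap per-triangle tests done before any pixel gets shaded.

```python
# Illustrative only: reject triangles before they reach the rasterizer.
# A triangle is culled if it faces away from the camera (back-facing) or
# lies entirely outside the visible area. Coordinates are assumed to
# already be in normalized device space ([-1, 1] on each axis).

def signed_area(p0, p1, p2):
    """2D signed area; <= 0 means clockwise winding (back-facing here)."""
    return ((p1[0] - p0[0]) * (p2[1] - p0[1]) -
            (p2[0] - p0[0]) * (p1[1] - p0[1]))

def is_culled(tri):
    # Back-face test: clockwise winding in screen space.
    if signed_area(*tri) <= 0.0:
        return True
    # Off-screen test: every vertex outside the same edge of the [-1, 1] box.
    for axis in (0, 1):
        if all(v[axis] < -1.0 for v in tri) or all(v[axis] > 1.0 for v in tri):
            return True
    return False

triangles = [
    [(-0.5, -0.5), (0.5, -0.5), (0.0, 0.5)],   # visible, counter-clockwise
    [(-0.5, -0.5), (0.0, 0.5), (0.5, -0.5)],   # back-facing (clockwise)
    [(2.0, 2.0), (3.0, 2.0), (2.5, 3.0)],      # entirely off-screen
]
print([is_culled(t) for t in triangles])  # -> [False, True, True]
```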
 
Last edited:

martino

Member
True but my point is still viable:

"Variable rate shading, or coarse pixel shading, is a mechanism to enable allocation of rendering performance/power at varying rates across the rendered image. Visually, there are cases where shading rate can be reduced with little or no reduction in perceptible output quality, leading to “free” performance."

In other words: if you manage to use VRS to free up performance without any visible evidence, you won't be able to say whether it uses VRS or not.

Same with antialiasing and so on and so on.

Here's an analogy for that DF statement: "I can't see atoms with my eyes, therefore I argue that my reality isn't based on atoms."

How do you think they looked into it? That's where the problem is, I think...
 

DeepEnigma

Gold Member
I mean, the image quality is decreased with VRS, but I agree with you: how do you spot VRS without having seen the game without it?

It is really weird that DF talks about “evidence”.

What they believe to be “evidence” of VRS could be anything else causing IQ degradation.

It’s fascinating that DF always has these “concern” sound bites, but they’ve only been in one direction. At least once a month a new “we should be concerned” thread is spawned on the PS5 that stems from something DF said. Then something will get clarified or squashed (like native 4K with ray tracing at the same time), and yet they’ll find something else to seemingly get the “hardcore” to propagate as “fact” or twist words to argue and spread “doubt.”

I have not once seen it go in the other direction so far with these two boxes. Hell, I never see it (“concern” speculation) from any of the other tech breakdown sites (NX/RedGaming). Even Linus had to apologize for being wrong when that certain group tried to feed him BS info.

I did find it funny that Alex tried to plant the seed of the Blu-ray drive being “tacked on at the last minute”, but thankfully Richard squashed that because even that was too obvious of reaching for FUD.

I can’t wait for absolutely no “concern soundbites” in next month’s showcase. 🤭 Mainly, I can’t wait until these machines are out with actual games to argue about.
 
Last edited:
How do you think they looked into it? That's where the problem is, I think...
They looked very closely at the videos/screenshots. This is like using your eye or a lens to search for atoms.

They can't tell if it's there or not. They would have to ask developers/Sony, or have a dev console and dev build of the game, to be able to tell for sure. Everything else is pure speculation on their side.

The result of their observation seems to be in favor of MS once again, even though they can't be sure.
 

martino

Member
They looked very closely at the videos/screenshots. This is like using your eye or a lens to search for atoms.

They can't tell if it's there or not. They would have to ask developers/Sony, or have a dev console and dev build of the game, to be able to tell for sure. Everything else is pure speculation on their side.

The result of their observation seems to be in favor of MS once again, even though they can't be sure.

Or it's your bias that only allows you to come to this conclusion.
 
Last edited:

DeepEnigma

Gold Member
What? Is this real? I thought it looked like a current-gen game, so 4K@60fps should easily be possible. How is it not even 4K? This can’t be right lol

Sackboy is now the most important game for certain people to measure the console’s prowess, not the native 4K, ray-traced, impeccable-IQ, particle-and-object-physics-galore, infinitely better looking R&C, which loads up entire massive and complex levels in 1.5-second switches.

🤭
 
It’s fascinating that DF always has these “concern” sound bites, but they’ve only been in one direction. At least once a month a new “we should be concerned” thread is spawned on the PS5 that stems from something DF said. Then something will get clarified or squashed (like native 4K with ray tracing at the same time), and yet they’ll find something else to seemingly get the “hardcore” to propagate as “fact” or twist words to argue and spread “doubt.”

I have not once seen it go in the other direction so far with these two boxes. Hell, I never see it from any of the other tech breakdown sites (NX/RedGaming). Even Linus had to apologize for being wrong when that certain group tried to feed him BS info.

I did find it funny that Alex tried to plant the seed of the Blu-ray drive being “tacked on at the last minute”, but thankfully Richard squashed that because even that was too obvious of reaching for FUD.

I can’t wait for absolutely no “concern soundbites” in next month’s showcase. 🤭 Mainly, I can’t wait until these machines are out with actual games to argue about.
If the concern soundbites are only about ambiguous details or possibly missing features, and they in turn garner clarification from the platform holder, then they should be encouraged across all platforms.

If, on the other hand, they are just pointless clickbait bullshit, then...
 

geordiemp

Member
Sackboy is now the most important game for certain people to measure the console’s prowess, not the native 4K, ray-traced, impeccable-IQ, particle-and-object-physics-galore, infinitely better looking R&C, which loads up entire massive and complex levels in 1.5-second switches.

🤭

Sackboy was also a 4-player co-op game at 60 FPS; maybe 1500p-1600p at 60 is OK for a multi-player game on the same screen?
 

Ar¢tos

Member
I think it is quite normal for launch window titles to totally ditch VRS, and I expect no different from XSX games.
Efficient VRS is very time-consuming to implement, and launch window titles don't have time to waste on that (unless you leave the implementation to an algorithm and just pray it works).
 

jimbojim

Banned
I'll bite.

From https://www.tomsguide.com/news/firs...may-be-revealed-today-with-forza-motorsport-8

Forza Horizon 4

Swapping FH4’s motion blur for a VRS version enables XSX’s FH4 to reach 120 fps at 4K.

If VRS improves the frame rate by 32 percent, which yields 120 fps at 4K, then the non-VRS version would be 81.6 fps at 4K.

So you posted false performance gains, because they were based on a Reddit rumor. Anyway, VRS performance gains are around 15-20% IIRC in the strategy game Gears Tactics according to DF, but the compromise is lower IQ.
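For what it's worth, the arithmetic in that quoted claim doesn't add up either; a quick sanity check (illustrative only, using the rumored 32% figure and the rough 15-20% Gears Tactics range mentioned above):

```python
# Sanity-check the quoted claim: if VRS *improves* frame rate by 32%,
# the non-VRS baseline is 120 / 1.32, not 120 * (1 - 0.32).
vrs_fps = 120.0
claimed_gain = 0.32

baseline_if_gain = vrs_fps / (1 + claimed_gain)        # ~90.9 fps
baseline_if_subtracted = vrs_fps * (1 - claimed_gain)  # 81.6 fps (what the quote did)
print(round(baseline_if_gain, 1), round(baseline_if_subtracted, 1))

# With the 15-20% gains DF measured in Gears Tactics, a 60 fps baseline
# would only reach roughly 69-72 fps with VRS enabled.
print([round(60 * (1 + g), 1) for g in (0.15, 0.20)])  # -> [69.0, 72.0]
```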
 
Last edited:

FranXico

Member
It’s fascinating that DF always has these “concern” sound bites, but they’ve only been in one direction. At least once a month a new “we should be concerned” thread is spawned on the PS5 that stems from something DF said. Then something will get clarified or squashed (like native 4K with ray tracing at the same time), and yet they’ll find something else to seemingly get the “hardcore” to propagate as “fact” or twist words to argue and spread “doubt.”

I have not once seen it go in the other direction so far with these two boxes. Hell, I never see it (“concern” speculation) from any of the other tech breakdown sites (NX/RedGaming). Even Linus had to apologize for being wrong when that certain group tried to feed him BS info.

I did find it funny that Alex tried to plant the seed of the Blu-ray drive being “tacked on at the last minute”, but thankfully Richard squashed that because even that was too obvious of reaching for FUD.

I can’t wait for absolutely no “concern soundbites” in next month’s showcase. 🤭 Mainly, I can’t wait until these machines are out with actual games to argue about.
Why are there no concern soundbites about the Xbox Velocity Architecture? The numbers suggest I/O is simply much faster on the PS5, and all that vague talk about "bandwidth multipliers" (LOL) and the amazing BCPack compression never gets any scrutiny from DF. When it comes to PlayStation, they always doubt the PR and assume the absolute worst in the absence of information. When it comes to Xbox, they always take the PR at face value and assume the best in the absence of information.

It needs to be said, though, that one thing DF is not is a hivemind. Even in their coverage videos, John tries to be reasonable or give some kind of credit to PlayStation, but he repeatedly gets shut down by Richard - case in point, the video about PS5 BC, which Richard just used to promote Xbox in every other sentence. Or the Unreal 5 video, where Dictator was very reluctant to acknowledge that fast data streaming from the SSD was used at all.

The double standards are there though, and any discerning person will recognize them.
 
Last edited:

FALCON_KICK

Member
I don't understand the point of having VRS if the game is designed for, and performs at, native 4K and a stable 30 fps.

VRS will find more use in high-FPS games like racing sims and VR. Sony is already using their own version of VRS for PlayStation VR.

Is this not similar to saying XYZ game doesn't have checkerboard 4K when it is running at native 4K?
 

geordiemp

Member
They really should have demoed the 4-player co-op aspect more; I only noticed 2 players.

They did?

8R61e6s.png
 
Last edited:

DeepEnigma

Gold Member
Why are there no concern soundbites about the Xbox Velocity Architecture? The numbers suggest I/O is simply much faster on the PS5, and all that vague talk about "bandwidth multipliers" (LOL) and the amazing BCPack compression never gets any scrutiny from DF. When it comes to PlayStation, they always doubt the PR and assume the absolute worst in the absence of information. When it comes to Xbox, they always take the PR at face value and assume the best in the absence of information.

It needs to be said, though, that one thing DF is not is a hivemind. Even in their coverage videos, John tries to be reasonable or give some kind of credit to PlayStation, but he repeatedly gets shut down by Richard - case in point, the video about PS5 BC, which Richard just used to promote Xbox in every other sentence. Or the Unreal 5 video, where Dictator was very reluctant to acknowledge that fast data streaming from the SSD was used at all.

The double standards are there though, and any discerning person will recognize them.

Yeah, it’s sad to see John get dog-piled by certain fanatical groups on each side when he tries to remain neutral or positive about something. So much so that he deletes his tweets, when I think he should leave them up so they can show their toxic asses to everyone.

With that said, though, there are a small handful of people who give him shit on here as well whenever he posts, taking their frustration with Richard and Alex out on him unfairly.

Still chuckling at that little weasel trying to “suggest” that the Blu-ray drive was tacked on last minute and the PS5 was originally designed around the Digital Edition. I couldn’t roll my eyes hard enough. The mask stays slipping with certain ones.
 
Last edited:

FALCON_KICK

Member
Why are there no concern soundbites about the Xbox Velocity Architecture? The numbers suggest I/O is simply much faster on the PS5, and all that vague talk about "bandwidth multipliers" (LOL) and the amazing BCPack compression never gets any scrutiny from DF. When it comes to PlayStation, they always doubt the PR and assume the absolute worst in the absence of information. When it comes to Xbox, they always take the PR at face value and assume the best in the absence of information.

It needs to be said, though, that one thing DF is not is a hivemind. Even in their coverage videos, John tries to be reasonable or give some kind of credit to PlayStation, but he repeatedly gets shut down by Richard - case in point, the video about PS5 BC, which Richard just used to promote Xbox in every other sentence. Or the Unreal 5 video, where Dictator was very reluctant to acknowledge that fast data streaming from the SSD was used at all.

The double standards are there though, and any discerning person will recognize them.

Might be an off-topic discussion, but since we are discussing DF bias: DF could have run several tests with Sea of Thieves on several SSD models ranging from SATA to the latest NVMe, i.e. from MB/s to GB/s, and evaluated and compared the effectiveness of the Velocity Architecture and DirectStorage solution implemented in the Xbox Series X demo.

The only test we have is from a Twitter handle who is using a faster SSD than the one in the Xbox Series X and gets a 2-second advantage over the Xbox.
 
Last edited:
Bullcrap. They are as "custom" as the PS4 GPU was "custom". They are basically off-the-shelf parts with minimal changes, mostly to I/O. Stuff like VRS is at the core of the design, not something you can bolt on.

Moreover, AMD wouldn't develop two architectures for two consoles. They use the same arch, and the arch is what decides feature capability.
Bullcrap? Then why could the PS4 Pro do FP16 instructions and the Xbox One X not?
 
Last edited:

FranXico

Member
What the hell? Mesh shaders (and amplification shaders) have nothing to do with VRS!



VRS:

Thank you for clearing up my confusion. I am not sure when I started conflating the two, but the two demonstrations side by side made me realize my error.

If I understand correctly, then, Variable Rate Shading is a general filtering technique applied by adaptive shading, where the filtering can be based on content or motion.
Mesh shading is actually more akin to primitive shading, and both of them do geometry culling before the rasterization step.
Both filtering steps (whether on geometry or at rasterization) save processing time.

It still stands to reason that what MS patented was their own application of the VRS technique for the XSX.
But assuming that the PS5 GPU could not possibly have any form of VRS at all really sounds like a stretch.
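To make the "both save processing time, just at different stages" point a bit more concrete, here is a toy cost model (all numbers invented, purely illustrative, and it ignores that culled triangles would also shade fewer pixels): geometry culling trims work before rasterization, VRS trims pixel-shader work after it.

```python
# Toy frame-cost model, illustrative numbers only: geometry culling removes
# triangles before the rasterizer, VRS reduces how many full-rate pixel-shader
# invocations the surviving geometry generates. (Real pipelines interact more:
# culled triangles also produce fewer pixels; that is ignored here.)

def frame_cost(tris, pixels, cull_ratio=0.0, vrs_ratio=0.0,
               vertex_cost=1.0, pixel_cost=2.0):
    surviving_tris = tris * (1.0 - cull_ratio)   # fewer triangles reach the rasterizer
    shaded_pixels = pixels * (1.0 - vrs_ratio)   # fewer pixels get full-rate shading
    return surviving_tris * vertex_cost + shaded_pixels * pixel_cost

base   = frame_cost(tris=2e6, pixels=8.3e6)                  # no savings
culled = frame_cost(tris=2e6, pixels=8.3e6, cull_ratio=0.5)  # geometry-side savings
vrs    = frame_cost(tris=2e6, pixels=8.3e6, vrs_ratio=0.25)  # shading-side savings

print(f"baseline {base:.2e}, with culling {culled:.2e}, with VRS {vrs:.2e}")
```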
 
Last edited:

On Demand

Banned
Digital Foundry are a bunch of MS fanboys and over-hypers. They always downplay anything PlayStation, just like they did with ray tracing. Ignore anything they have to say about PS. There’s a very good chance they’re not telling the truth, nor do they care to. It’s all about doing the bare minimum with PS and obfuscating everything as much as possible.

That intro was not funny either. I wouldn’t be surprised if it came at the suggestion of the manbaby clubhouse discord. What kind of “professional” outlet uses something like that? With Richard sounding like a damn idiot. You can tell they were told to do it by the one who is a member of that discord.

So yeah, DF saying PS5 doesn’t have a certain feature? Not shocking at all. That’s their MO sadly.
 

martino

Member
So if I understand this thread correctly, DF will see VRS evidence during the Xbox July event because they will notice the image degradation it provokes. And the conclusion is that it will both degrade the image quality and improve the graphics at the same time.

You need to pause and zoom to detect VRS on a moving object, for example (that's one way to do it).
The objective is to not waste power on details you can't see in normal viewing conditions;
it doesn't mean you can't see its use if you're looking for it.
But is that really surprising from those who can't see the point, or who turn it into derision?
 

Thirty7ven

Banned
You need to pause and zoom to detect VRS on a moving object, for example (that's one way to do it).
The objective is to not waste power on details you can't see in normal viewing conditions;
it doesn't mean you can't see its use if you're looking for it.
But is that really surprising from those who can't see the point, or who turn it into derision?

What’s surprising to me is that we are still fighting over VRS. As if it’s the be-all-end-all solution, or something that isn’t going to be bog standard. It’s in DX12...

What’s the big deal about this? Are people so clueless that they don’t understand Gears Tactics has VRS? Do you need an XSX to enable VRS in Gears Tactics?
 
Last edited:

Dory16

Banned
Microsoft didn't have to talk about audio because Xbox One's audio was already superior. XB1 was able to be updated to Dolby Atmos; PS4 wasn't. Series X also has DTS:X, which is superior to Atmos and Tempest because it can have an unlimited number of 3D sound objects. Just because Sony will finally have 3D audio and gave it a fancy name doesn't make it special. It's not a true home theater surround sound format; no receivers will be decoding Tempest. It's basically a virtual surround format for headphones and for simulating surround sound through your TV.
Agreed. I've been using Atmos the entire generation on Xbox One, which is why I never paid attention to any 3D audio talk on consoles. Atmos and DTS:X are the best Hollywood and Blu-ray sound formats. If Tempest is that great, let's see Sony release Sony Pictures movies in it. Pitching individual raindrop sounds to console players who don't know any better sounds like more "Emotion Engine" shenanigans to me.
 

ethomaz

Banned
Thank you for clearing up my confusion. I am not sure when I started conflating the two, but the two demonstrations side by side made me realize my error.

If I understand correctly, then, Variable Rate Shading is a general filtering technique applied by adaptive shading, where the filtering can be based on content or motion.
Mesh shading is actually more akin to primitive shading, and both of them do geometry culling before the rasterization step.
Both filtering steps (whether on geometry or at rasterization) save processing time.

It still stands to reason that what MS patented was their own application of the VRS technique for the XSX.
But assuming that the PS5 GPU could not possibly have any form of VRS at all really sounds like a stretch.
MS patented their own (software logic) implementation of VRS in DirectX 12 Ultimate (it is not even exclusive to the Xbox Series X).

VRS was first shown, and the hardware side first implemented, by nVidia (you still need the software logic that MS did with DX12U).

BTW, nVidia explains pretty well where you can use VRS with their software implementation:


Hardware: Compatible with VR Ready Quadro and VR Ready GeForce Turing-based GPUs.
Software: Compatible with the following APIs: DX11, DX12, OpenGL, Vulkan.


That article is specific to VRWorks, but the feature (VRS) is not used only with VRWorks.
 
Last edited:

martino

Member
What’s surprising to me is that we are still fighting over VRS. As if it’s the be-all-end-all solution, or something that isn’t going to be bog standard. It’s in DX12...

What’s the big deal about this? Are people so clueless that they don’t understand Gears Tactics has VRS? Do you need an XSX to enable VRS in Gears Tactics?

Because in the end you are part of the clueless about some of the subtleties here, and that's a good reason to have this thread.
Being part of DirectX 12 doesn't mean all hardware magically supports the feature; only nVidia 2xxx cards can do it for now (even though VRS was added before RDNA was released). Also, Gears Tactics doesn't use the full implementation of the feature, which limits what you can do with it
(see here https://devblogs.microsoft.com/directx/gears-tactics-vrs/ and here https://docs.microsoft.com/en-us/windows/win32/direct3d12/vrs, Tier 1 vs Tier 2).

I don't see why the first batch of games (half finished, if not more, without final specs) would push the system to use VRS on PS5 or XSX,
so that is the biggest reason we won't see it at first, imo (PC versions of MS games will probably push the feature for low-end graphics cards, though).
That said, its presence or absence remains an interesting technical point to underline.
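For anyone skimming, the Tier 1 vs Tier 2 distinction in those links roughly comes down to where you are allowed to vary the rate; here is my paraphrase of the linked MS docs (check them for the authoritative list):

```python
# Paraphrased capability summary of D3D12 variable rate shading tiers
# (see the Microsoft docs linked above for the authoritative details).
D3D12_VRS_TIERS = {
    "Tier 1": [
        "per-draw shading rate (one coarse rate for an entire draw call)",
    ],
    "Tier 2": [
        "everything in Tier 1",
        "per-primitive shading rate supplied from the shader",
        "screen-space shading-rate image (a rate per small tile of pixels)",
        "combiners that merge the per-draw, per-primitive and image rates",
    ],
}

for tier, caps in D3D12_VRS_TIERS.items():
    print(tier)
    for cap in caps:
        print("  -", cap)
```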
 
Last edited:

Dory16

Banned
More than VRS or no VRS, I'm concerned about 60 fps or no 60 fps. Not enough of the Sony games shown during the reveal ran at 60 fps. There's a Madz Gaming YouTube video listing the ones that can be expected to reach 60 fps, and none of the heavy hitters are on there (Horizon, Spider-Man, Ratchet etc.):
I think Sony's studios have been getting away with beautiful art style and picture quality masking technical shortcomings. Naughty Dog hasn't made a single game this gen that wasn't 1440p and an unstable 30 fps. Yet every time they release a trailer, the entire gaming crowd goes "OMG, how are they doing that on a PS4, it's impossible".
This gen may be a good opportunity to reinstate objective metrics for technical achievement. Games like RDR2 or DMC5, which were absolute technical masterpieces, have not quite received the praise their graphics deserved because of that hype-over-facts mindset among gamers.
I don't know if it's the lack of VRS that is keeping so many PS5 games from hitting 60 fps, but there is a bigger question to be asked: why do so few people care, even going into next-gen?
 
My personal view, for what it's worth, is that Sony were most likely intending to launch in 2019. The tech at the time was RDNA 1, and that's what they ran with. The 36 CU limit also played into allowing easier hardware-based BC. Then they made the decision to hold off and swapped to the same base but RDNA 2.
Their GPU is a custom one that has different options than the XSX does.
They have AMD primitive shaders and the GE. They added ray tracing cores, and the die shrink allowed them to crank the GPU frequency to 2.23 GHz.
Just because RDNA 2 has certain abilities doesn't mean they come as a given. Each company will ask for what they want and pay AMD for it.
Sony went with AMD SmartShift; MS didn't.
With the PS4, Sony added more ROPs and ACEs; MS didn't.

While both PS5 and XSX have ray tracing, they don't do it exactly the same way, as it is also API dependent.
I think Sony haven't run with VRS, because they would have said so by now.
People are pointing to the 6-year-old Sony patent, but the fact is a 6-year-old patent is nothing like the VRS 2.0 that MS is running with now. If Sony were balls deep into VRS using their own solution, I would have expected a newer patent showing up-to-date tech to rival VRS 2.0.
Sony would have been well aware of VRS, and maybe they didn't value it. Maybe they think they have a better way to do it.
I don't know. I am only saying what I am based on what Sony have said to date.
They may well come out tomorrow and say the PS5 has VRS, and settle it.

Good point about the patent update; we've already found so many of their other patents over the past year or so that by now we probably should have expected something on the GPU along that front. Maybe it'll be found at a later date, who knows 🤷‍♂️

For RT I think, going back to some of the earlier MS press info, they basically added some tensor-core-like equivalents to the GPU for DirectML and DXR RT, to work alongside the general graphics CUs. So when they gave that "in addition to essentially 13 TF in RT performance" figure earlier, I didn't think too much of it at first. But considering MS has a major focus on ML with the system, and they would obviously want equivalents to tensor cores in their hardware for such purposes, I think there's merit to that figure, even if some types try running with it to claim 25 TF (which you can't do, since those other cores aren't intended for general graphics workloads; they have a specific use).

I still think you are on to something, and you have kind of convinced me that Sony might not have put a lot of emphasis on VRS as it is normally viewed/implemented. I am less certain this has to do with the timing of the console; I think it is rather a design choice. This is my logic, which goes back to my previous post.

In the PC GPU space we have all seen the charts that determine success: FPS at 1080p, FPS at 1440p and FPS at 4K across titles at various graphical settings. Within your cost bracket you need to be competitive to survive. Cost = mm2 of silicon.

VRS is a hardware feature that sacrifices some (not much, but some) graphical fidelity to increase FPS: you define criteria per draw, per screen-space region and/or per triangle (roughly speaking) and apply a different amount of shading work based on those criteria. In other words, a small reduction in IQ buys a substantial increase in shading throughput per mm2 of silicon. This makes an awful lot of sense in a PC context for the reasons outlined above.

If you have a design target of 4K/30FPS - which I can see Cerny having - and you have just loaded the console with a significantly higher VRAM budget than any PC platform can achieve (i.e. more textures and higher resolution per texture), you want to make sure you maintain the IQ you have just invested in. What Matt, at least in my mind, alluded to is that Cerny seems to have created priorities per triangle/primitive on the geometry side. If you then use those priorities to guide which primitive shaders to use, you basically have a VRS solution without VRS, if that makes sense. Then you have full control of your IQ, and as long as the API for setting priorities on the geometry side is straightforward, that gives the designer very good control.

Maybe I am wrong (this is a speculation thread after all!) but the solution above would align with Cerny's speech and Matt's Twitter comments and still give VRS-like capabilities, just without the standardised VRS solution (this would also explain the lack of 'standard' VRS talk regarding the GPU). If this is correct, we should expect a slightly larger than normal allocation of mm2 of silicon to both geometry and primitive shaders on the PS5 GPU.

I think this is possible. One thing of note though: I'm not sure if 4K/30 is necessarily Cerny's design target, as such choices generally come down to the development studios. He might see that as an optimal situation for game performance on the platform, but devs still get to choose what baseline native targets they aim for. Over the course of the generation as platform limits are pushed we could see more 1440p/30 scenarios (but in real-time gameplay) like the UE5 demo showed off, if it means devs reaching at least that level of visual fidelity, on PS5.

If the modifications to the GE and PS you suggest hold true, I wonder if it would still be compiler-controlled like Primitive Shaders were on RDNA1. Mesh Shaders changed that going forward, but I don't know to what extent that type of customization could've been done with PS on Sony's end. It's possible Sony wanted to move aspects of VRS-like techniques to different parts of the processing and rendering pipeline, so it'll be interesting to see how it plays out in practice.

IIRC VRS 2.0 cleans up a lot of the sacrifice in IQ VRS 1.0 had; how much so I wouldn't know. But I'm very interested to see the results of the "VRS" > Primitive Shader approach of PS5 compared to the Mesh Shading > VRS approach of XSX. And just how Sony's own takes on that compare with VRS 2.0 (a lot of it would come down to what level of support Vulkan has for it, I think).
 

kimcorecoba

Neo Member
Hi ron..
XBO's $499 price includes the Kinect cost factor, which reduced the overall budget for the GPU. The XSX APU (a ~359 mm2 chip) includes the 16 nm improvements without a Kinect cost factor.

Without the Kinect cost factor, XBO's ~363 mm2 APU could have supported a 28 CU GPU.
Hi ron. Nice seeing you here
 

Elog

Member
You're assuming a lot based on a first reveal of some games; last gen many PS4 Pro games had a performance mode for 60 FPS. This new gen, Bluepoint have already said Demon's Souls will have both.

So 30 FPS modes will be full 4K by the looks of it, with some RT on PS5. A performance mode will pull out more frame-time-saving tricks for sure, including temporal techniques and others...

We just have to wait; it was only a first showing on a 1080p30 stream.

Do not get me wrong: I did not mean that the console is not capable of doing 60 FPS. My point is that if the design vision was focused on 4K/30FPS with a very high amount and quality of textures, then the solution I proposed makes sense. You can still deprioritize shading at the primitive level to increase FPS; it is just not what the architecture was optimized around.

Once again, I fully acknowledge that this is speculation on my part and that I might be wrong, but if the geometry and primitive shader hardware is oversized, it kind of confirms my hypothesis. We will see!
 

Andodalf

Banned
Good point about the patent update; we've already found so many of their other patents over the past year or so that by now we probably should have expected something on the GPU along that front. Maybe it'll be found at a later date, who knows 🤷‍♂️

For RT I think, going back to some of the earlier MS press info, they basically added some tensor-core-like equivalents to the GPU for DirectML and DXR RT, to work alongside the general graphics CUs. So when they gave that "in addition to essentially 13 TF in RT performance" figure earlier, I didn't think too much of it at first. But considering MS has a major focus on ML with the system, and they would obviously want equivalents to tensor cores in their hardware for such purposes, I think there's merit to that figure, even if some types try running with it to claim 25 TF (which you can't do, since those other cores aren't intended for general graphics workloads; they have a specific use).

I don't think they have any bespoke ML hardware; they just customized the CUs to support int4 and int8 as well as the usual FP16 and FP32. In the DirectML definition in the XSX glossary, they say there is 24 TF of FP16 performance, which is just the 12 TF of normal FP32 doubled, as one expects from rapid packed math.


It was an impressive showing for a game that hasn't even begun to access the next generation features of the new GPU. Right now, it's difficult to accurately quantify the kind of improvement to visual quality and performance we'll see over time, because while there are obvious parallels to current-gen machines, the mixture of new hardware and new APIs allows for very different workloads to run on the GPU. Machine learning is a feature we've discussed in the past, most notably with Nvidia's Turing architecture and the firm's DLSS AI upscaling. The RDNA 2 architecture used in Series X does not have tensor core equivalents, but Microsoft and AMD have come up with a novel, efficient solution based on the standard shader cores. With over 12 teraflops of FP32 compute, RDNA 2 also allows for double that with FP16 (yes, rapid-packed math is back). However, machine learning workloads often use much lower precision than that, so the RDNA 2 shaders were adapted still further.


"We knew that many inference algorithms need only 8-bit and 4-bit integer positions for weights and the math operations involving those weights comprise the bulk of the performance overhead for those algorithms," says Andrew Goossen. "So we added special hardware support for this specific scenario. The result is that Series X offers 49 TOPS for 8-bit integer operations and 97 TOPS for 4-bit integer operations. Note that the weights are integers, so those are TOPS and not TFLOPs. The net result is that Series X offers unparalleled intelligence for machine learning."

Yeah, the DF article confirms this. IDK how 24 TF of FP16 turns into 49 TOPS of int8, but I guess that's just how it works going from floats to integers.
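The figures quoted above line up if you assume each halving of element width roughly doubles the ops per clock; quick arithmetic (illustrative, and the small mismatch is just because the FP32 figure is ~12.15 TF rather than exactly 12):

```python
# Rough check of the quoted Series X throughput figures: packed math
# roughly doubles ops per clock each time the element width halves.
fp32_tflops = 12.15              # quoted FP32 shader throughput

fp16_tflops = fp32_tflops * 2    # rapid packed math          -> ~24.3 TF
int8_tops   = fp32_tflops * 4    # 4 int8 ops per FP32 lane   -> ~48.6 TOPS (quoted as 49)
int4_tops   = fp32_tflops * 8    # 8 int4 ops per FP32 lane   -> ~97.2 TOPS (quoted as 97)

print(fp16_tflops, int8_tops, int4_tops)  # 24.3 48.6 97.2
```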
 

geordiemp

Member
Do not get me wrong: I did not mean that the console is not capable of doing 60 FPS. My point is that if the design vision was focused on 4K/30FPS with a very high amount and quality of textures, then the solution I proposed makes sense. You can still deprioritize shading at the primitive level to increase FPS; it is just not what the architecture was optimized around.

Once again, I fully acknowledge that this is speculation on my part and that I might be wrong, but if the geometry and primitive shader hardware is oversized, it kind of confirms my hypothesis. We will see!

They also designed the console around VR, which requires 90 FPS, so I don't think so; probably the opposite is true.

They just chose to showcase games in 4K30 mode because it's YouTube, is my logical conclusion; it looks better for a first reveal.
 

geordiemp

Member
Good point about the patent update; we've already found so many of their other patents over the past year or so that by now we probably should have expected something on the GPU along that front. Maybe it'll be found at a later date, who knows 🤷‍♂️

For RT I think, going back to some of the earlier MS press info, they basically added some tensor-core-like equivalents to the GPU for DirectML and DXR RT, to work alongside the general graphics CUs. So when they gave that "in addition to essentially 13 TF in RT performance" figure earlier, I didn't think too much of it at first. But considering MS has a major focus on ML with the system, and they would obviously want equivalents to tensor cores in their hardware for such purposes, I think there's merit to that figure, even if some types try running with it to claim 25 TF (which you can't do, since those other cores aren't intended for general graphics workloads; they have a specific use).



I think this is possible. One thing of note though: I'm not sure if 4K/30 is necessarily Cerny's design target, as such choices generally come down to the development studios. He might see that as an optimal situation for game performance on the platform, but devs still get to choose what baseline native targets they aim for. Over the course of the generation as platform limits are pushed we could see more 1440p/30 scenarios (but in real-time gameplay) like the UE5 demo showed off, if it means devs reaching at least that level of visual fidelity, on PS5.

If the modifications to the GE and PS you suggest hold true, I wonder if it would still be compiler-controlled like Primitive Shaders were on RDNA1. Mesh Shaders changed that going forward, but I don't know to what extent that type of customization could've been done with PS on Sony's end. It's possible Sony wanted to move aspects of VRS-like techniques to different parts of the processing and rendering pipeline, so it'll be interesting to see how it plays out in practice.

IIRC VRS 2.0 cleans up a lot of the sacrifice in IQ VRS 1.0 had; how much so I wouldn't know. But I'm very interested to see the results of the "VRS" > Primitive Shader approach of PS5 compared to the Mesh Shading > VRS approach of XSX. And just how Sony's own takes on that compare with VRS 2.0 (a lot of it would come down to what level of support Vulkan has for it, I think).

I think the RDNA1 narrative needs to stop; both MS and Sony started their designs on RDNA1 and moved to RDNA2. Anything else is just plain FUD, and both will have known what was coming years in advance.

7vcPFRF.png


Both customised RDNA2 with whatever they wanted from the same menu of logic and functional blocks; thinking any differently just shows naivety about semiconductor design.

Anything unique to Sony or MS will be patented hardware design, but it will likely also be owned by AMD as the design partner...

Both Sony and MS will have had on the menu the logic functions for culling vertices not in view, lowering resolution in parts of an image, and procedural shading, and they will have chosen whatever they think gives the best performance and given each API a different name if they wish to :messenger_beaming: .

The only things I have read about that are unique are the cache scrubbers and coherency engine so far.
 
Last edited:

Ascend

Member
For those who don't know, 'shading' is basically the technical term in graphics for 'coloring' (not exactly 100% true, since shading is a term in itself, but it should help non-native English speakers understand it better). I guess that alone clarifies quite a bit what it is. It has very little to do with geometry.
Leaving this here... a timestamped explanation of VRS with a visual representation (based on nVidia's tech, but it's the same thing). Makes it easier to understand.

 

Ar¢tos

Member
More than VRS or no VRS, I'm concerned about 60 fps or no 60 fps. Not enough of the Sony games shown during the reveal ran at 60 fps. There's a Madz Gaming YouTube video listing the ones that can be expected to reach 60 fps, and none of the heavy hitters are on there (Horizon, Spider-Man, Ratchet etc.):
I think Sony's studios have been getting away with beautiful art style and picture quality masking technical shortcomings. Naughty Dog hasn't made a single game this gen that wasn't 1440p and an unstable 30 fps. Yet every time they release a trailer, the entire gaming crowd goes "OMG, how are they doing that on a PS4, it's impossible".
This gen may be a good opportunity to reinstate objective metrics for technical achievement. Games like RDR2 or DMC5, which were absolute technical masterpieces, have not quite received the praise their graphics deserved because of that hype-over-facts mindset among gamers.
I don't know if it's the lack of VRS that is keeping so many PS5 games from hitting 60 fps, but there is a bigger question to be asked: why do so few people care, even going into next-gen?

If you are that concerned about 60fps, you should move to PC.
Graphics sell games to casuals, not framerates.
 

Dory16

Banned
If you are that concerned about 60fps, you should move to PC.
Graphics sell games to casuals, not framerates.
I could swear that high and stable framerates were one of the expected features of these next-gen systems. Everybody shat on the Jaguar CPUs for 7 years and was looking forward to finally having decent frame rates on consoles. Visit the next-gen thread on this very forum if you need a reminder of the hype that preceded the reveal of the PS5.
I'm baffled by how many times I've been sent to the PC shop whenever I've mentioned frame rates since the PS5 reveal. All of a sudden, 60 fps is not something anyone should even have expected on a console.
Hopefully the trend will be different after the Xbox event.
 

Doncabesa

Member
I could swear that high and stable framerates were one of the expected features of these next-gen systems. Everybody shat on the Jaguar CPUs for 7 years and was looking forward to finally having decent frame rates on consoles. Visit the next-gen thread on this very forum if you need a reminder of the hype that preceded the reveal of the PS5.
I'm baffled by how many times I've been sent to the PC shop whenever I've mentioned frame rates since the PS5 reveal. All of a sudden, 60 fps is not something anyone should even have expected on a console.
Hopefully the trend will be different after the Xbox event.
Phil Spencer has stated that he greatly prefers 60fps to 30fps, and their recent 1st-party output has shown that to be true, even getting Gears 5 up to it with variable-resolution techniques. I feel good about Xbox 1st party focusing on 60/120 (for MP), but the Ubisofts and Sony 1st-party types will probably still go for pretty 30 over less pretty 60.
 
na9TJZG.jpg


FUD: PS5 has no hardware-accelerated ray tracing because of 'verbiage' in what Cerny stated.
Goal: Sony's PS5 event shows games using ray tracing.

FUD: PS5 isn't RDNA2.
Goal: Sony and AMD's CEO say PS5 is RDNA2.

FUD: The SSD is just for faster load times.
Goal: Insomniac shows a glimpse of what can be done with increased I/O in Ratchet & Clank: Rift Apart, warping to entirely different worlds in mere seconds.

FUD: PS5 is a 1440p + 4K-upscaling machine.
Goal: Sony's PS5 event shows most games running at native 4K.
You could technically do RT on Series X at... 0.5 fps. The proof will be in the games when they release.
 
Last edited:

DeepEnigma

Gold Member
So if I understand this thread correctly, DF will see VRS evidence during the Xbox July event because they will notice the image degradation it provokes. And the conclusion is that it will both degrade the image quality and improve the graphics at the same time.

I almost spit out my tea!

Well played, good sir! :pie_roffles::messenger_ok:

I think the RDNA1 narrative needs to stop; both MS and Sony started their designs on RDNA1 and moved to RDNA2. Anything else is just plain FUD, and both will have known what was coming years in advance.

Both customised RDNA2 with whatever they wanted from the same menu of logic and functional blocks; thinking any differently just shows naivety about semiconductor design.

Anything unique to Sony or MS will be patented hardware design, but it will likely also be owned by AMD as the design partner...

Both Sony and MS will have had on the menu the logic functions for culling vertices not in view, lowering resolution in parts of an image, and procedural shading, and they will have chosen whatever they think gives the best performance and given each API a different name if they wish to :messenger_beaming: .

The only things I have read about that are unique are the cache scrubbers and coherency engine so far.

Ah shit, are we back on the RDNA1 FUD? FFS, come on already! You are not getting those clocks, RT, or features on RDNA1. Just stop, it's getting more and more transparent (not you, but the same people who keep circling back to it).
 
Last edited:
If you are that concerned about 60fps, you should move to PC.
Graphics sell games to casuals, not framerates.

Well, since most early games are 4K 30fps, can they implement something like checkerboard 4K at 60fps? Basically an option between resolution and performance.

Or is it that when a game has maxed out the CPU at 30fps, varying the resolution wouldn't do much?
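The raw pixel math at least works out on the GPU side (rough, illustrative numbers; as the question above notes, a CPU-bound 30 fps game wouldn't benefit from dropping shading work):

```python
# Rough pixel-throughput comparison (illustrative only): checkerboard 4K
# shades about half the pixels of native 4K each frame, so per second it
# costs roughly the same shading work at 60 fps as native 4K does at 30.
native_4k = 3840 * 2160                     # ~8.29 million pixels per frame

native_4k30 = native_4k * 30                # ~249 M shaded pixels per second
checkerboard_4k60 = (native_4k // 2) * 60   # ~249 M shaded pixels per second

print(f"{native_4k30/1e6:.0f} M/s vs {checkerboard_4k60/1e6:.0f} M/s")
# -> both ~249 M/s, so the trade-off sits elsewhere (CPU, geometry, RT...)
```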
 