
Sony PS5 Vs. Xbox Series X Technical Analysis: Why The PS5’s 10.3 TFLOPs Figure Is Misleading


CJY

Banned
I think it's nuts Mark Cerny would bring up Ray Tracing and Primitive Shaders, but say nothing at all of Variable Rate Shading (a very big deal feature) in a deep dive. I listened to that talk very carefully. And he essentially was prepping the viewers to not lock Sony down into any specific feature set based on whatever AMD may release on the PC side of things. He stressed that they chose their own path, and should specific things they've done, like their scrubbers as someone else mentioned, happen to end up in AMD PC hardware, it means their collaboration together was all the more successful. He basically stressed that theirs is its own custom RDNA 2, meaning it's entirely possible that not all features we may come to associate with RDNA2 should be automatically expected to be in PS5's custom RDNA2.

As Digital Foundry said, there was also no mention of machine learning. Mark Cerny is a very, very detail oriented kind of person. I don't see him leaving these things out by chance or coincidence.
I read somewhere that the term "VRS" is a Microsoft trademark. Do you know if there is any truth to that? Would lend some sort of reasoning as to why it wasn't mentioned if it is indeed included.
 

Bogroll

Likes moldy games
There isn't a bigger % gap between pro and X here.

Not to mention X had a massive memory bandwidth and RAM advantage. That's almost nothing now.

PS4 and Xbox One actually showed a larger GPU difference in terms of TF, but we only got 900p vs 1080p.
Is a 2 TF difference in RDNA2 more than it seems, though, compared to a 2 TF difference in GCN?
I don't know, it's just a thought.
 

RaySoft

Member
I see what you did there with the title.. cunning:)

I theorize the reason Sony downclocked the CPU is because they have other custom blocks in the APU that also consume wattage and generate heat, like the SSD controller/handler.
200-300 MHz on the CPU doesn't matter anyway, at least not compared to what the SSD block inside the APU delivers. It's a balancing act, and at the speed the CPU cores are already running, it's diminishing returns anyway on those 300 MHz. If anything it tells me Sony are trying to create a balanced box instead of just pushing the usual envelope of "faster everything".
That SSD speed will revolutionize how game engines are created, mark my words.
I see now why some devs have expressed enthusiasm with the overall design.
 

RaySoft

Member
I read somewhere that the term "VRS" is a Microsoft trademark. Do you know if there is any truth to that? Would lend some sort of reasoning as to why it wasn't mentioned if it is indeed included.

MS has a patent on their implementation of VRS, not VRS itself. VRS has been used for many years already in some shape or form, it's nothing new.
PS5 will also have VRS.
 
I read somewhere that the term "VRS" is a Microsoft trademark. Do you know if there is any truth to that? Would lend some sort of reasoning as to why it wasn't mentioned if it is indeed included.

I know Microsoft has a patented version of it, yes, but if Nvidia and AMD and countless other people can speak publicly about it, I don't see why Sony would be the exception. But Pro-Elite does make a valid point that it SHOULD be helpful for PSVR 2.0, and so maybe Sony will discuss it when they talk about that. But in all honesty, if PS5 does not have that feature, it's a bloodbath.

And just keep in mind what I'm saying here. I'm not saying the PS5 won't have unbelievable looking games that I will not hesitate for a single second to say I think look better than whatever I deem worthy of comparison on Series X based on style and artistic design, animation, effects etc, all things that are far more important than just raw specifications. It's not a weak console, and I will never label the PS5 as such. I just don't believe its power potential is really in the same realm as where the Series X can get to when a top developer puts it through its paces.
 

Jonsoncao

Banned
Even if Xbox had 40 Tflops or 100 Tflops.... I wouldn't buy one. (Most people wouldn't)

Maybe that will help you understand where Xbox is as a brand.

Why do you pretend to be a PlayStation fan to make PlayStation fans look stupid? We are talking about hardware specifications in this thread, not brands...

BTW: just to be clear, VRS and mesh shaders were not confirmed? I checked AMD's DX12U list and the PS5 stream, and I didn't find Cerny mentioning either of those.
 

Armorian

Banned
it is a DirectX feature, I don't think that Microsoft allows Sony to use Dx12

Variable-rate shading (VRS)

It's part of the architecture, not DX12 alone. For Turing:

VRS is supported by DirectX 12, a group of APIs (application programming interfaces) Microsoft made that communicates with a PC’s components for rendering 2D and 3D graphics, video rendering and playing audio. It also works with the older DirectX 11, OpenGL and Vulkan. Nvidia says it's working on integrating VRS with Unreal Engine and Unity.

If PS5 has a full-fledged RDNA2 chip, it should support VRS.
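For reference, here is roughly what the DirectX 12 side of the feature looks like to a developer. This is only a minimal C++ sketch of the standard D3D12 Tier 1 VRS calls, assuming you already have a device and command list; nothing about it is console-specific, it just illustrates the API feature being discussed.

```cpp
#include <windows.h>
#include <d3d12.h>

// Ask the driver whether VRS is exposed at all, and at which tier.
D3D12_VARIABLE_SHADING_RATE_TIER QueryVrsTier(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 options = {};
    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS6, &options, sizeof(options))))
        return options.VariableShadingRateTier;   // NOT_SUPPORTED, TIER_1 or TIER_2
    return D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED;
}

// Tier 1: one shading rate for a whole draw call. Useful for passes where
// per-pixel detail is wasted, e.g. transparent particles or motion-blurred geometry.
void DrawWithCoarseShading(ID3D12GraphicsCommandList5* cmdList)
{
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);   // shade once per 2x2 block
    // ... issue the coarse-shaded draw calls here ...
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr);   // back to full rate
}
```

Tier 2 adds per-primitive control and a screen-space rate image on top of that, which is the part that makes it interesting for foveated rendering on something like PSVR.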
 
I see what you did there with the title.. cunning:)

I theorize the reason Sony downclocked the CPU is because they have other custom blocks in the APU that also consume wattage and generate heat, like the SSD controller/handler.
200-300 MHz on the CPU doesn't matter anyway, at least not compared to what the SSD block inside the APU delivers. It's a balancing act, and at the speed the CPU cores are already running, it's diminishing returns anyway on those 300 MHz. If anything it tells me Sony are trying to create a balanced box instead of just pushing the usual envelope of "faster everything".
That SSD speed will revolutionize how game engines are created, mark my words.
I see now why some devs have expressed enthusiasm with the overall design.

Both consoles will do this. The I/O potential for both is insane. There is no adequate match for what they do on PC just yet.
 
Why do you pretend to be a PlayStation fan to make PlayStation fans look stupid? We are talking about hardware specifications in this thread, not brands...

BTW: just to be clear, VRS and mesh shaders were not confirmed? I checked AMD's DX12U list and the PS5 stream, and I didn't find Cerny mentioning either of those.

I was under the impression that Sony’s version of this stuff was part of the “geometry engine” talk
 

wintersouls

Member
Great post! Sony not only launched a system that wasn't a true generational leap but is misleading the world about the GPU.

 

Neur4lN01s3

Neophyte
MS has a patent on their implementation of VRS, not VRS itself. VRS has been used for many years already in some shape or form, it's nothing new.
PS5 will also have VRS.

VRS Tier 1 and Tier 2 are DirectX features. Do you have a link to technical documentation of a similar feature on PS5's Vulkan/OpenGL?
 

CJY

Banned
New MacBook Airs got announced yesterday; I'm thinking of buying one. (I <3 Retina displays)

Just looking at the specs, the Core i5 version has a base clock of 1.1 GHz with a Turbo Boost of up to 3.5 GHz. That's a huge boost and would probably only last for seconds. One minute, tops.

An overclock of the chip would have it run at maybe 2 GHz (obviously with substantially more cooling).

Point is: PS5 is neither boosting nor overclocking. Cerny didn't just "decide" to make PS5 variable so he could "overclock" or traditionally "boost" the APU in response to XSX. To say otherwise is truly daft.
 

CJY

Banned
MS has a patent on their implementation of VRS, not VRS itself. VRS has been used for many years already in some shape or form, it's nothing new.
PS5 will also have VRS.
I'm saying MS own a trademark on the term "VRS", not the tech as a whole. I might be wrong though. I'm probably wrong, I just read it somewhere.
 

Zero707

If I carry on trolling, report me.
Sony is definitely taking a risk with chip yields.
Also, Komachi updated his chips table,
and there is no Oberon in the list? It just holds PCI ID 13F9.
[image: Komachi's chip table]
 

SleepDoctor

Banned
Lol there's a SMALLER gap between PS5 - XsX than between Pro and One X, are you trying to pull the wool over people's eyes? How can anyone take you seriously with a line like that.

There's a tiny 16% gap on paper here, but the point is all devs and industry pros have said the same: PS5 is faster than the small flops disparity suggests, so deal with this reality and stop coming out with absurd claims.


12.1 vs 9.2 TF now, with some instances of 10.2 while overclocking.


6 TF vs 4.2 now.

Math is definitely not your strong suit
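For what it's worth, the paper figures everyone keeps throwing around all fall out of the same FP32 formula (compute units × 64 shader lanes × 2 ops per clock × clock speed), so the size of the gap is easy to check yourself. A quick sketch using the publicly stated CU counts and clocks:

```cpp
#include <cstdio>

// FP32 TFLOPs = CUs * 64 shader lanes * 2 FLOP per lane per clock * clock (GHz) / 1000
double tflops(int cus, double clock_ghz) {
    return cus * 64 * 2 * clock_ghz / 1000.0;
}

int main() {
    double xsx  = tflops(52, 1.825);   // Series X: ~12.1 TF
    double ps5  = tflops(36, 2.23);    // PS5: ~10.3 TF at the peak GPU clock
    double onex = tflops(40, 1.172);   // One X: ~6.0 TF
    double pro  = tflops(36, 0.911);   // PS4 Pro: ~4.2 TF

    std::printf("XSX %.1f vs PS5 %.1f TF -> XSX ahead by %.0f%%\n",
                xsx, ps5, (xsx / ps5 - 1.0) * 100.0);
    std::printf("One X %.1f vs Pro %.1f TF -> One X ahead by %.0f%%\n",
                onex, pro, (onex / pro - 1.0) * 100.0);
}
```

That prints roughly an 18% gap at peak clocks this time versus about 43% between One X and Pro, which is the whole point: whatever the PS5's sustained clock ends up being, the paper gap is smaller than last mid-gen's.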
 
I'm saying MS own a trademark on the term "VRS", not the tech as a whole. I might be wrong though. I'm probably wrong, I just read it somewhere.

They definitely have SOME kind of patent on the tech itself, but I don't know where that ends, or if a company would need to pay for its use or something. Hard to say. Microsoft seems to have submitted it way back in 2016 to the patent office.
 

Neur4lN01s3

Neophyte
I'm saying MS own a trademark on the term "VRS", not the tech as a whole. I might be wrong though. I'm probably wrong, I just read it somewhere.

I've read the same thing.
RDNA2 and Nvidia support VRS, which is a Microsoft DirectX feature.

For example, Nvidia's support is called Adaptive Shading (and Motion Adaptive Shading for another implementation), while AMD calls it VRS, but the trademark is owned by Microsoft, as the official documentation reports: "Variable Rate Shading - VRS".
 

NickFire

Member
So, if this ends up being the case, how would the boost really be helpful? Let's say there's an area of a game that needs the 10.28 TF; if I just stay in that area and never leave, will the PS5 melt?
It's been a couple days since watching and reading the details, and I'm no expert. But I don't think there's an issue of melting. It sounded to me like they have a set amount of available power and set max thermals, so if the GPU gets boosted the CPU slows down. Basically just allocating power and thermal limits between CPU and GPU. My takeaway from this is that PS5 will be primarily a 9+ TF machine for games that push the CPU full out. This will definitely be a factor I take into account when deciding whether to switch back to MS, but I doubt I'm switching right now. I'm still rocking all 1080p TVs, do not want to buy new TVs before an existing one dies, and with the economic impacts of this virus I feel less confident than ever in the possibility MS takes a huge loss on the SeX. So right now I'm guessing a $399 PS5, a $499-599 SeX, and a $399 SeS. If it shakes out that way it's PS5 all the way for me.
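Purely as a toy illustration of that idea (every wattage below is invented, and this is not Sony's actual algorithm — Cerny described it as driven by a model of chip activity rather than by temperature), the "fixed power budget, variable frequency" scheme amounts to something like this:

```cpp
#include <cstdio>

// Hypothetical sketch of a fixed-power, variable-frequency scheme.
// All wattages are made up for the example.
struct Clocks { double cpu_ghz; double gpu_ghz; };

Clocks allocate(double cpu_activity, double gpu_activity)     // activity in 0..1
{
    const double kBudgetW   = 230.0;                          // fixed total SoC power budget (invented)
    const double kCpuMaxGhz = 3.5,  kGpuMaxGhz = 2.23;
    const double kCpuMaxW   = 60.0, kGpuMaxW   = 180.0;       // draw at max clock, full load (invented)

    // Estimated draw if both sides ran at their maximum frequency for this workload.
    double cpu_draw = cpu_activity * kCpuMaxW;
    double gpu_draw = gpu_activity * kGpuMaxW;

    // Within budget: both simply run at max clock.
    if (cpu_draw + gpu_draw <= kBudgetW)
        return { kCpuMaxGhz, kGpuMaxGhz };

    // Over budget: trim clocks until the draw fits. Because this depends only on
    // the workload (not on temperature), every console downclocks identically for
    // the same scene. Real silicon's frequency/power curve is non-linear, which is
    // why a ~10% power cut is claimed to cost only a couple percent of clock.
    double scale = kBudgetW / (cpu_draw + gpu_draw);
    return { kCpuMaxGhz * scale, kGpuMaxGhz * scale };
}

int main() {
    Clocks typical = allocate(0.5, 0.9);   // GPU-heavy scene: everything stays at peak
    Clocks worst   = allocate(1.0, 1.0);   // CPU and GPU both maxed: both give a little back
    std::printf("typical: CPU %.2f GHz, GPU %.2f GHz\n", typical.cpu_ghz, typical.gpu_ghz);
    std::printf("worst:   CPU %.2f GHz, GPU %.2f GHz\n", worst.cpu_ghz, worst.gpu_ghz);
}
```

In other words, it isn't a laptop-style thermal boost that falls off after a minute; it's a deterministic trade of a little CPU or GPU clock whenever the combined workload would bust the budget.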
 

Jonsoncao

Banned
Sony is definitely taking a risk with chip yields.
Also, Komachi updated his chips table,
and there is no Oberon in the list? It just holds PCI ID 13F9.
[image: Komachi's chip table]

Given the current circumstances, the best guess is that Oberon is an earlier prototype of the PS5 GPU for testing BC (not RDNA2 but RDNA), so that devs can simulate what happens on the real custom RDNA2 PS5 GPU.

A guess would be that the actual manufacturing/stress-test name for the PS5 RDNA2 GPU must have changed at TSMC.
 

Neur4lN01s3

Neophyte
Even Intel uses VRS..

Right.
Being DX12 hardware, it can support the whole DX12 feature set. Intel, Nvidia, and AMD support DX VRS (well, Intel will support it when their hardware is on the market).
 

Piku_Ringo

Banned
It's been a couple days since watching and reading the details, and I'm no expert. But I don't think there's an issue of melting. It sounded to me like they have a set amount of available power and set max thermals, so if the GPU gets boosted the CPU slows down. Basically just allocating power and thermal limits between CPU and GPU. My takeaway from this is that PS5 will be primarily a 9+ TF machine for games that push the CPU full out. This will definitely be a factor I take into account when deciding whether to switch back to MS, but I doubt I'm switching right now. I'm still rocking all 1080p TVs, do not want to buy new TVs before an existing one dies, and with the economic impacts of this virus I feel less confident than ever in the possibility MS takes a huge loss on the SeX. So right now I'm guessing a $399 PS5, a $499-599 SeX, and a $399 SeS. If it shakes out that way it's PS5 all the way for me.
No way the PS5 is going to cost anything less than $599, unless they release a gimped version for those unwilling to spend that much.
 

Zero707

If I carry on trolling, report me.
Given the current circumstances, the best guess is that Oberon is an earlier prototype of the PS5 GPU for testing BC (not RDNA2 but RDNA), so that devs can simulate what happens on the real custom RDNA2 PS5 GPU.

A guess would be that the actual manufacturing/stress-test name for the PS5 RDNA2 GPU must have changed at TSMC.
I think Ariel was pre-silicon for testing, and Oberon is the same as Ariel but with RDNA 2 features.
But what's interesting is that the PS5 has the same BW as Ariel, or close to it (I don't remember the number exactly), and not 530 BW like Oberon.
 
The games are going to speak for themselves, they always do, it's a matter of time.

But how will they speak if you can't hear them over the loud fan blowing inside your box, constantly switching RPM because of boost/dynamic clocks? If Sony goes for a traditional console-sized box again, I can totally see that thing being hot and loud again despite being less powerful than MS's mini PC.

Disclaimer: Don't take that too seriously.
 
The gap is quite a bit larger, and nobody with a brain will take that boost clock seriously when Mark Cerny already confirmed the worst-case scenario (aka any truly AAA release that is expected to push the system) will see the clocks drop on CPU and GPU. The 10.2 is window dressing. The GPU is most likely an entire teraflop lower than what's been reported. Who exactly is trying to pull the wool over people's eyes? Sony's boost clock claims are the very epitome of such.

For example, Jason Schreier's statement that both are beyond an RTX 2080 is patently false based on what we now know about the PS5. We still also have no confirmation of VRS and a couple other things from Cerny's deep dive. These two systems are just not in the same ballpark in terms of performance. It became obvious the moment those specs came out officially for PS5. Is the PS5 powerful? Of course it is, nobody would dare say it isn't, but it's quite a bit behind the Series X, and I don't think a credible person can state otherwise if they're being honest.



Should I listen to one of the oldest Xbox fanboys from this site, or an AMD engineer and countless world class devs with hands on both machines? Hmmm...

The gap is smaller than it's ever been, learn maths.
 
But how will they speak if you can't hear them over the loud fan blowing inside your box, constantly switching RPM because of boost/dynamic clocks? If Sony goes for a traditional console-sized box again, I can totally see that thing being hot and loud again despite being less powerful than MS's mini PC.

Disclaimer: Don't take that too seriously.


Cerny addressed cooling solution explicitly in the talk. And your idea of the way boost works on it is also wrong. Nice try.
 
Cerny addressed cooling solution explicitly in the talk. And your idea of the way boost works on it is also wrong. Nice try.

Please enlighten me. My idea was that dynamically upping the clock leads to higher operating temps, which leads to a need for better cooling, which makes the fan spin at a higher RPM, which makes the box louder. Wrong?
 

Bogroll

Likes moldy games


Should I listen to one of the oldest Xbox fanboys from this site, or an AMD engineer and countless world class devs with hands on both machines? Hmmm...

The gap is smaller than it's ever been, learn maths.

Well, if it was a gun-to-the-head moment on which one would come out on top most times, I know which one I'd choose.
 
I think it's nuts Mark Cerny would bring up Ray Tracing and Primitive Shaders, but say nothing at all of Variable Rate Shading (a very big deal feature) in a deep dive. I listened to that talk very carefully. And he essentially was prepping the viewers to not lock Sony down into any specific feature set based on whatever AMD may release on the PC side of things. He stressed that they chose their own path, and should specific things they've done, like their scrubbers as someone else mentioned, happen to end up in AMD PC hardware, it means their collaboration together was all the more successful. He basically stressed that theirs is its own custom RDNA 2, meaning it's entirely possible that not all features we may come to associate with RDNA2 should be automatically expected to be in PS5's custom RDNA2.

As Digital Foundry said, there was also no mention of machine learning. Mark Cerny is a very, very detail oriented kind of person. I don't see him leaving these things out by chance or coincidence.
Jesus Christ, when GAF was GAF, they banned you for fanboy warring. You went to the other forum and got yourself banned for fanboy warring. You then went to Twitter to cry, claiming a persecution complex. Now you're back here doing the same shit. SenjutsuSage knows nothing. SenjutsuSage is nothing. It's impossible to imagine a more irrelevant individual in technical discussions than SenjutsuSage because such an individual does not exist. You have no qualifications, you lack the education and, frankly, your reasoning is at odds with any semblance of logic.

Why do you pretend to be something you're not? Anybody, and I mean anybody, who even works in the field can smell your bullshit from a mile away. I know I smelt it the day I came across your posts all those years ago on GAF. I studied CompSci for several years. I've worked in the field for several years and I even defer when something is out of my scope of expertise. You on the other hand are the biggest imposter I've ever seen and, like a mosquito, you continue to be nothing but a pest to all the forums you infest.
 

Jonsoncao

Banned
I think Ariel was pre-silicon for testing, and Oberon is the same as Ariel but with RDNA 2 features.
But what's interesting is that the PS5 has the same BW as Ariel, or close to it (I don't remember the number exactly), and not 530 BW like Oberon.

Based on AquariusZi's posts on PTT (https://pttweb.tw/s/4airz), which corroborated the 36 CU speculation, Ariel is not the actual PS5 GPU.

However, some of AquariusZi's posts had way too much jargon, which is extremely hard to decode.
 
I think it's nuts Mark Cerny would bring up Ray Tracing and Primitive Shaders, but say nothing at all of Variable Rate Shading (a very big deal feature) in a deep dive. I listened to that talk very carefully. And he essentially was prepping the viewers to not lock Sony down into any specific feature set based on whatever AMD may release on the PC side of things. He stressed that they chose their own path, and should specific things they've done, like their scrubbers as someone else mentioned, happen to end up in AMD PC hardware, it means their collaboration together was all the more successful. He basically stressed that theirs is its own custom RDNA 2, meaning it's entirely possible that not all features we may come to associate with RDNA2 should be automatically expected to be in PS5's custom RDNA2.

As Digital Foundry said, there was also no mention of machine learning. Mark Cerny is a very, very detail oriented kind of person. I don't see him leaving these things out by chance or coincidence.

When Cerny mentioned RT in the Wired article you called him a liar, in the other place. But now, suddenly, you need him to say VRS to believe it's real.
I'm tired of your BS. You're just an Xbox fanboy who wants to trash everything with a PS logo.

And yes, your favorite plastic box is more powerful. Congratulations.
 
Jesus Christ, when GAF was GAF, they banned you for fanboy warring. You went to the other forum and got yourself banned for fanboy warring. You then went to Twitter to cry, claiming a persecution complex. Now you're back here doing the same shit. SenjutsuSage knows nothing. SenjutsuSage is nothing. It's impossible to imagine a more irrelevant individual in technical discussions than SenjutsuSage because such an individual does not exist. You have no qualifications, you lack the education and, frankly, your reasoning is at odds with any semblance of logic.

Why do you pretend to be something you're not? Anybody, and I mean anybody, who even works in the field can smell your bullshit from a mile away. I know I smelt it the day I came across your posts all those years ago on GAF. I studied CompSci for several years. I've worked in the field for several years and I even defer when something is out of my scope of expertise. You on the other hand are the biggest imposter I've ever seen and, like a mosquito, you continue to be nothing but a pest to all the forums you infest.

My, my, aren't you the special one? You sound really mad... I wish you could tell me all about it, but Corona. All you need to know is I'm the person who was way more right than the 13 TF "insiders" and whatever else was said.

Hold this plz.

L
 