
Dusk Golem reiterates that the Xbox will be more powerful than the PS5 (Admitted to starting console wars, demodded)

jimbojim

Banned
Call it a waste if you want. It's still higher. And anyway, MS has talked about AI learning in the XSX, meaning they may come up with a DLSS alternative.

I would rather use DLSS or something else. The difference is barely noticeable if you don't zoom in 500%. LOL That's why DF said it's a waste of resources when they analyzed Death Stranding for PC. Better to spend those resources elsewhere.
 

reinking

Gold Member
I cropped to hide the labels. My point is that there IS a clear difference between Pro and One X.

The comparison here is 1:1, perfectly fair when representing 4K and working with a much bigger image.

kvyJ2Ax.png








Hey ma, look at this standalone random screenshot of RDR2 for Xbox One S. Looks fine to me! I'll just go ahead and miss the point entirely and post it anyway; after all, if I have nothing to compare it to, it should look fine on Xbox One S. Who the heck needs next-gen consoles when I can just choose to ignore that there's something better out there :messenger_winking_tongue:

UaMoHrJ.png
Side-by-side comparisons and DF videos are clearly going to show the advantages. Ask anyone playing them apart from each other, though, and it becomes less of an issue. I doubt anyone who takes home a PS5 is going to be disappointed. All of this stuff is just for epeen measuring, which MS seems to be good at lately. What is the saying? It's not the size that matters but how you use it?
 

jimbojim

Banned
I cropped to hide the labels. My point is that there IS a clear difference between Pro and One X.

The comparison here is 1:1, perfectly fair when representing 4K and working with a much bigger image.

kvyJ2Ax.png








Hey ma, look at this standalone random screenshot of RDR2 for Xbox One S. Looks fine to me! I'll just go ahead and miss the point entirely and post it anyway; after all, if I have nothing to compare it to, it should look fine on Xbox One S. Who the heck needs next-gen consoles when I can just choose to ignore that there's something better out there :messenger_winking_tongue:

UaMoHrJ.png

The reconstruction technique in RDR 2 on PS4 Pro is crap. It really sucks compared to others like Rise of the Tomb Raider and The Surge, where it's implemented well. Won't mention first-party titles. Properly used, it's barely noticeable until you zoom in hard.
 
Last edited:

Seph-

Member
I'm not taking this at face value. It's months-old information being used as if it's indicative of the end result on one game. I think people should really be questioning whether the RE8 devs are simply dropping the ball here and doing a bad job. Anyone who isn't pushing some fanboy narrative nonsense knows these consoles are close. There simply isn't enough of a difference here to account for this situation. And besides, we've already seen games running at native 4K, according to outlets, at both 30 and 60fps. Personally I would rather use a resolution upscaling technique, as I think native is a waste at this point.

Again, it sounds to me like either bad info or just the RE8 devs dropping the ball. I'm fully on the side of: Xbox will run 3rd-party titles better, each console can do things the other can't, and a SLIGHT resolution difference is an end result you likely won't even notice. But to go as far as to say one is struggling and one isn't on one specific title, and use it as an indication that "oh, X console is bad", is just silly.
 

killatopak

Gold Member
I like Dusk Golem, but it's pretty hard to believe when XBX games are hitting 4K. RE8 was even supposed to come out on current-gen consoles, so it's just not believable to me.
 

Mister Wolf

Gold Member
Honestly mate, I don't think many people care about either game. The Gunk looks like a cutesy platformer, and that's covered by R&C. Scorn has been in development hell and nobody knows anything about the game. 2 games with little hype behind them from no-name developers.


"Developer Ebb designed Scorn around the idea of "being thrown into the world." In the game, you find yourself in a dream-like world that's made up of several interconnected regions that you can explore in any order you want as each one has its own self-contained story and theme that contributes to the overall narrative.

As you explore, you'll unlock additional skills, weapons, and items that allow you to explore further and discover new areas. Of course, this being a horror game, your journey won't be without a few frights and dangerous-looking threats."

It's Metroid.
 
Last edited:

Deto

Banned
Also 40% more CUs so the Raytracing capabilities will be much better.

Okay, the astroturfing worked, at least on the typical Xbox consumer.


The lying has started.

2 TF vs 0.5 TF. If something needs 1 TF more power than the PS5 has, it doesn't matter what the percentage difference is, just how much power there is.


The differences are compared in percentages.
Source: basic mathematics.

Always the same script.

Tell me, do you actually believe that, or are you lying on the internet to "win the argument"?
 
Last edited:
What's so hard to understand? RE8 will be optimized to run at dynamic 4K on PS5.



The PS5 GPU will not stay at 2.2 GHz all the time, so the performance gap will be 18% even in the best-case scenario.
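
(For what it's worth, that 18% figure is just the ratio of the two peak-FLOPS numbers. A quick sketch of the arithmetic, assuming the publicly quoted CU counts and clocks:)

```python
# Quick check of the peak-FLOPS arithmetic behind the "18%" claim.
# FP32 TFLOPS = CUs * 64 lanes * 2 ops/cycle (FMA) * clock (GHz) / 1000.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

ps5 = tflops(36, 2.23)    # ~10.28 TF at the 2.23 GHz peak clock
xsx = tflops(52, 1.825)   # ~12.15 TF at the fixed 1.825 GHz clock
print(f"PS5 {ps5:.2f} TF, XSX {xsx:.2f} TF, gap {(xsx / ps5 - 1) * 100:.0f}%")  # ~18%
```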

I don't think it'll be at 2.23 GHz most of the time, actually (let's see how many people read past that and realize I'm not insinuating what they probably think I'm insinuating x3). Neither will Series X be at 1.825 GHz most of the time, either. It doesn't make any sense: if your game isn't demanding that much power, why have the GPU stressing itself at that level? I don't think "variable" frequency or "sustained" clocks are referring to this sort of thing.

I believe the variable frequency comes into play in scenarios where a game is actually maxing out the clocks. So it has more to do with sustaining that power over a long period of time as best as possible, based on the stress of the power load. In those scenarios where the clock is being maxed, if the power budget is exceeded and the GPU needs more power, the system will siphon some of the power budget from the CPU if it can. If not, then the GPU will downclock. It bases this on power, not heat, and the frequencies are adjusted based on any reduction in the power load.

It works differently on the Series systems; again, they aren't going to be at max clocks all the time, because a lot of game logic doesn't even require that. However, at points where max clocks are required, the system determines the power budget based on the heat being produced, and rather than using that to lower the clocks, it uses it to increase the cooling. So unless the cooling somehow fails, a game is guaranteed those max clocks at those frequencies for as long as it needs them.

Keep in mind too that in both systems' cases (PS5, Series X/Series S), you don't necessarily need max clocks to be active to cause the scenarios mentioned above. For PS5 I'd imagine that frequent peaks at high clocks for either the CPU or GPU over short periods would run the chance of the power supply not being able to draw/produce the required power quickly enough to distribute it throughout the system? Dunno, I don't really know how PSUs work xD.

For Series X/Series S they have to contend with residual heat dissipation, which could happen if excess heat isn't removed from the system fast enough in between peaks of high power usage. This is where you can run into the typical overheating issue of virtually any system (PS5 can also run into this, hence why it needs very good cooling even though it monitors power draw rather than heat production to determine the level of cooling), which can cause a shutdown. However, I'd assume this won't be an issue for the Series systems since the One X had an extremely good cooling system, and I'd also say this shouldn't be a factor for the PS5, either.

All speculation on my part, but it at least sounds like it makes the most sense.
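
To make the idea above concrete, here's a rough sketch of a power-budget-driven boost loop of the kind being described. Purely illustrative: the budget, step sizes and the 10% CPU-shift figure are made-up numbers, not Sony's actual algorithm.

```python
# Illustrative power-budget boost loop, as described above. Purely a sketch:
# the budget, step size and CPU-shift fraction are invented numbers.
def step_gpu_clock(gpu_w: float, cpu_w: float, budget_w: float,
                   gpu_mhz: float, max_mhz: float = 2230.0) -> tuple[float, float]:
    """One control step; returns (new GPU clock in MHz, new CPU power in W)."""
    total = gpu_w + cpu_w
    if total <= budget_w:
        # Under budget: hold (or climb back toward) the maximum clock.
        return min(max_mhz, gpu_mhz + 25.0), cpu_w
    deficit = total - budget_w
    shifted = min(deficit, cpu_w * 0.10)          # siphon up to 10% from the CPU
    if shifted >= deficit:
        return gpu_mhz, cpu_w - shifted           # CPU shift covers it, clock holds
    return gpu_mhz * 0.98, cpu_w - shifted        # otherwise drop the clock ~2%

print(step_gpu_clock(gpu_w=200.0, cpu_w=60.0, budget_w=230.0, gpu_mhz=2230.0))
```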



He's not wrong, but both systems have GEs. Unless AMD rebranded the GE for RDNA2 and Sony for whatever reason is going with the RDNA1 terminology there. I don't see that really being the case, and I also don't see AMD outright removing the GE from RDNA2 onward, either, unless, again, it's been retooled and renamed.

Taking the more favorable option here, let's just say both systems have GEs. But we do know Sony has made some customizations to theirs. The question is whether MS has. Only a few more days to find out (hopefully)!
 
Last edited:

onQ123

Member
As I predicted earlier, Series X will offer the best gaming performance and smaller volume/weight/heat/power consumption, at the same or lower price than PS5.
Add in Game Pass and 1TB SSD expansion, and Phil is attacking the COVID-hit gaming community with the best-value, best-designed next-gen experience.

I do hope Sony is not so drunk on PS4 success that they think they can avoid price competition with Series X.


You'd better wait for the real results. Y'all still haven't learned, even after getting burned over and over again.
 

Seph-

Member
I don't think it'll be at 2.23 GHz most of the time, actually (let's see how many people read past that and realize I'm not insinuating what they probably think I'm insinuating x3). Neither will Series X be at 1.825 GHz most of the time, either. It doesn't make any sense: if your game isn't demanding that much power, why have the GPU stressing itself at that level? I don't think "variable" frequency or "sustained" clocks are referring to this sort of thing.

I believe the variable frequency comes into play in scenarios where a game is actually maxing out the clocks. So it has more to do with sustaining that power over a long period of time as best as possible, based on the stress of the power load. In those scenarios where the clock is being maxed, if the power budget is exceeded and the GPU needs more power, the system will siphon some of the power budget from the CPU if it can. If not, then the GPU will downclock. It bases this on power, not heat, and the frequencies are adjusted based on any reduction in the power load.

It works differently on the Series systems; again, they aren't going to be at max clocks all the time, because a lot of game logic doesn't even require that. However, at points where max clocks are required, the system determines the power budget based on the heat being produced, and rather than using that to lower the clocks, it uses it to increase the cooling. So unless the cooling somehow fails, a game is guaranteed those max clocks at those frequencies for as long as it needs them.

Keep in mind too that in both systems' cases (PS5, Series X/Series S), you don't necessarily need max clocks to be active to cause the scenarios mentioned above. For PS5 I'd imagine that frequent peaks at high clocks for either the CPU or GPU over short periods would run the chance of the power supply not being able to draw/produce the required power quickly enough to distribute it throughout the system? Dunno, I don't really know how PSUs work xD.

For Series X/Series S they have to contend with residual heat dissipation, which could happen if excess heat isn't removed from the system fast enough in between peaks of high power usage. This is where you can run into the typical overheating issue of virtually any system (PS5 can also run into this, hence why it needs very good cooling even though it monitors power draw rather than heat production to determine the level of cooling), which can cause a shutdown. However, I'd assume this won't be an issue for the Series systems since the One X had an extremely good cooling system, and I'd also say this shouldn't be a factor for the PS5, either.

All speculation on my part, but it at least sounds like it makes the most sense.



He's not wrong, but both systems have GEs. Unless AMD rebranded the GE for RDNA2 and Sony for whatever reason is going with the RDNA1 terminology there. I don't see that really being the case, and I also don't see AMD outright removing the GE from RDNA2 onward, either, unless, again, it's been retooled and renamed.

Taking the more favorable option here, let's just say both systems have GEs. But we do know Sony has made some customizations to theirs. The question is whether MS has. Only a few more days to find out (hopefully)!
Probably an easier, or I guess different, way of thinking about it, at least as I understood it: the Xbox runs like your standard console (and I'll add here that the whole "sustained" thing is a bit overblown). It runs at the clock you'd assume it would, it just doesn't always have a workload for it on certain screens and so on. The PS5, meanwhile, has the ability to run at the clocks or TF (a.k.a. variable) that the developers need for a given scene. That helps explain how it's "a new paradigm": it's offering something consoles haven't allowed to date. Then we just add in the PSU either running constant or varying, etc.
 
Last edited:

Iced Arcade

Member
This guy knows as much as Matt... which is nothing.

Seriously, even this gen there was barely a difference in games between the base X1 and PS4 unless you had side-by-side videos, cropped in, counting pixels. Next gen will be the same.
 
Last edited:

jimbojim

Banned
2 TF more is a lot for rendering, dude. There will be SOME differences. That doesn't mean the PS5 isn't well designed, but the XSX is just better in numbers.

This is your 2 TF, I mean 1.8 TF, stronger GPU in numbers. Not bad for a weaker GPU, isn't it?


Whilst I'm here, I might as well post the rest. It's important to note that all the figures below are theoretical maximums, whether or not the clocks are actually fixed:

Extrapolated from RDNA1:

Triangle rasterisation is 4 triangles per cycle.

PS5:
4 x 2.23 GHz ~ 8.92 Billion triangles per second

XSX:
4 x 1.825 GHz - 7.3 Billion triangles per second

Triangle culling rate is twice the number of triangles rasterised per cycle.

PS5:
8 x 2.23 GHz - 17.84 Billion triangles per second

XSX:
8 x 1.825 GHz - 14.6 Billion triangles per second

Pixel fillrate assumes 4 shader arrays with 4 RBs (render backends) each, with each RB outputting 4 pixels. So 64 pixels per cycle.

PS5:
64 x 2.23 GHz - 142.72 Billion pixels per second

XSX:
64 x 1.825 GHz - 116.8 Billion pixels per second

Texture fillrate is based on 4 texture units (TMUs) per CU.

PS5:
4 x 36 x 2.23 GHz - 321.12 Billion texels per second

XSX:
4 x 52 x 1.825 GHz - 379.6 Billion texels per second

Raytracing in RDNA2 is alleged to be from modified TMUs.

PS5:
4 x 36 x 2.23 GHz - 321.12 Billion ray intersections per second

XSX:
4 x 52 x 1.825 GHz - 379.6 Billion Ray intersections per second
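
For anyone who wants to check the arithmetic, the whole table boils down to a few lines (same caveat as above: the per-clock rates are extrapolated from RDNA1, so treat them as assumptions rather than confirmed RDNA2 figures):

```python
# Theoretical peak rates = per-clock rate x clock in GHz (result in billions/s).
# Per-clock figures are the RDNA1 extrapolations listed above, not confirmed specs.
SPECS = {"PS5": (2.23, 36), "XSX": (1.825, 52)}  # (clock GHz, CUs)

for name, (clk, cus) in SPECS.items():
    print(f"{name}:")
    print(f"  rasterised triangles {4 * clk:.2f} B/s")       # 4 triangles per clock
    print(f"  culled triangles     {8 * clk:.2f} B/s")       # 2x the raster rate
    print(f"  pixel fillrate       {64 * clk:.2f} B px/s")   # 16 RBs x 4 pixels
    print(f"  texel fillrate       {4 * cus * clk:.2f} B texels/s")  # 4 TMUs per CU
```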
 

geordiemp

Member
I don't think it'll be at 2.23 GHz most of the time, actually (let's see how many people read past that and realize I'm not insinuating what they probably think I'm insinuating x3). Neither will Series X be at 1.825 GHz most of the time, either. It doesn't make any sense: if your game isn't demanding that much power, why have the GPU stressing itself at that level? I don't think "variable" frequency or "sustained" clocks are referring to this sort of thing.

I believe the variable frequency comes into play in scenarios where a game is actually maxing out the clocks. So it has more to do with sustaining that power over a long period of time as best as possible, based on the stress of the power load. In those scenarios where the clock is being maxed, if the power budget is exceeded and the GPU needs more power, the system will siphon some of the power budget from the CPU if it can. If not, then the GPU will downclock. It bases this on power, not heat, and the frequencies are adjusted based on any reduction in the power load.

It works differently on the Series systems; again, they aren't going to be at max clocks all the time, because a lot of game logic doesn't even require that. However, at points where max clocks are required, the system determines the power budget based on the heat being produced, and rather than using that to lower the clocks, it uses it to increase the cooling. So unless the cooling somehow fails, a game is guaranteed those max clocks at those frequencies for as long as it needs them.

Keep in mind too that in both systems' cases (PS5, Series X/Series S), you don't necessarily need max clocks to be active to cause the scenarios mentioned above. For PS5 I'd imagine that frequent peaks at high clocks for either the CPU or GPU over short periods would run the chance of the power supply not being able to draw/produce the required power quickly enough to distribute it throughout the system? Dunno, I don't really know how PSUs work xD.

For Series X/Series S they have to contend with residual heat dissipation, which could happen if excess heat isn't removed from the system fast enough in between peaks of high power usage. This is where you can run into the typical overheating issue of virtually any system (PS5 can also run into this, hence why it needs very good cooling even though it monitors power draw rather than heat production to determine the level of cooling), which can cause a shutdown. However, I'd assume this won't be an issue for the Series systems since the One X had an extremely good cooling system, and I'd also say this shouldn't be a factor for the PS5, either.

All speculation on my part, but it at least sounds like it makes the most sense.



He's not wrong, but both systems have GEs. Unless AMD rebranded the GE for RDNA2 and Sony for whatever reason is going with the RDNA1 terminology there. I don't see that really being the case, and I also don't see AMD outright removing the GE from RDNA2 onward, either, unless, again, it's been retooled and renamed.

Taking the more favorable option here, let's just say both systems have GEs. But we do know Sony has made some customizations to theirs. The question is whether MS has. Only a few more days to find out (hopefully)!

Cerny and Naughty Dog have a patent on the GE, so it will be unique to Sony if it's in the PS5: it compresses vertices and shaders, new technology.


Compressing GPU data, more efficient, and patented in the last few weeks just before the teardown... what do you think?

Maybe Paul from RGT is on to something, and Moore's Law Is Dead too.

It's unlikely to be straight RDNA2. It will have some of it, but it will highly likely be customised around this and the sector-type VRS that is also a Cerny patent.

I wonder if Sony will allow the new GPU GE patent, and the new mesh shader and VRS-type patents from last month, into RDNA3?

How do you think the Sony first-party games shown were native 4K and had RT?
 
Last edited:

onQ123

Member
This is your 2 TF, I mean 1.8 TF, stronger GPU in numbers. Not bad for a weaker GPU, isn't it?


Whilst I'm here, I might as well post the rest. It's important to note that all the figures below are theoretical maximums, whether or not the clocks are actually fixed:

Extrapolated from RDNA1:

Triangle rasterisation is 4 triangles per cycle.

PS5:
4 x 2.23 GHz ~ 8.92 Billion triangles per second

XSX:
4 x 1.825 GHz - 7.3 Billion triangles per second

Triangle culling rate is twice the number of triangles rasterised per cycle.

PS5:
8 x 2.23 GHz - 17.84 Billion triangles per second

XSX:
8 x 1.825 GHz - 14.6 Billion triangles per second

Pixel fillrate assumes 4 shader arrays with 4 RBs (render backends) each, with each RB outputting 4 pixels. So 64 pixels per cycle.

PS5:
64 x 2.23 GHz - 142.72 Billion pixels per second

XSX:
64 x 1.825 GHz - 116.8 Billion pixels per second

Texture fillrate is based on 4 texture units (TMUs) per CU.

PS5:
4 x 36 x 2.23 GHz - 321.12 Billion texels per second

XSX:
4 x 52 x 1.825 GHz - 379.6 Billion texels per second

Raytracing in RDNA2 is alleged to be from modified TMUs.

PS5:
4 x 36 x 2.23 GHz - 321.12 Billion ray intersections per second

XSX:
4 x 52 x 1.825 GHz - 379.6 Billion Ray intersections per second

There's also the faster internal cache because of the higher clocks, but people only seem to look at one number and claim victory.
 

Allandor

Member
Both? So, VRS isn't mentioned for PS5, therefore PS5 doesn't have VRS.
The geometry engine isn't mentioned for XSX, yet XSX has it.
Can you explain the double standard to me?
Really??
A few posts above GE => Mesh Shaders.

And you can't just calculate the numbers. One console has variable clocks, and I also really don't think the bigger GPU has the same amount of the other units as the PS5 GPU. That doesn't make sense at all.
 

tusharngf

Member
This is your 2 TF, I mean 1.8 TF, stronger GPU in numbers. Not bad for a weaker GPU, isn't it?


Whilst I'm here, I might as well post the rest. It's important to note that all the figures below are theoretical maximums, whether or not the clocks are actually fixed:

Extrapolated from RDNA1:

Triangle rasterisation is 4 triangles per cycle.

PS5:
4 x 2.23 GHz ~ 8.92 Billion triangles per second

XSX:
4 x 1.825 GHz - 7.3 Billion triangles per second

Triangle culling rate is twice the number of triangles rasterised per cycle.

PS5:
8 x 2.23 GHz - 17.84 Billion triangles per second

XSX:
8 x 1.825 GHz - 14.6 Billion triangles per second

Pixel fillrate assumes 4 shader arrays with 4 RBs (render backends) each, with each RB outputting 4 pixels. So 64 pixels per cycle.

PS5:
64 x 2.23 GHz - 142.72 Billion pixels per second

XSX:
64 x 1.825 GHz - 116.8 Billion pixels per second

Texture fillrate is based on 4 texture units (TMUs) per CU.

PS5:
4 x 36 x 2.23 GHz - 321.12 Billion texels per second

XSX:
4 x 52 x 1.825 GHz - 379.6 Billion texels per second

Raytracing in RDNA2 is alleged to be from modified TMUs.

PS5:
4 x 36 x 2.23 GHz - 321.12 Billion ray intersections per second

XSX:
4 x 52 x 1.825 GHz - 379.6 Billion Ray intersections per second

Thumbs up, nice summary.
 
Probably an easier, or I guess different, way of thinking about it, at least as I understood it: the Xbox runs like your standard console (and I'll add here that the whole "sustained" thing is a bit overblown). It runs at the clock you'd assume it would, it just doesn't always have a workload for it on certain screens and so on. The PS5, meanwhile, has the ability to run at the clocks or TF (a.k.a. variable) that the developers need for a given scene. That helps explain how it's "a new paradigm": it's offering something consoles haven't allowed to date. Then we just add in the PSU either running constant or varying, etc.

But this isn't making much sense to me. If they're enabling the ability to run at the clock or TF the developer needs for the scene, what does that imply about other systems? Does it mean that other systems were providing an excess of frequency or power for the task at hand, wasting power as a result? Right now I'm typing this post on a PC with my Task Manager open at all times, running an image editor with a bunch of other tabs open. Integrated graphics (don't kill me xD), but nowhere near 100% of system resources are being stressed. Nowhere near 100% of power is being drawn, as my system cooling is literally silent. And this is a PC.

I just don't think there have really been cases of any system with fixed clocks drawing an excess of power to waste on rendering that doesn't require higher frequency usage, not for any lengthy period of time anyway. You may get a few frames after a drop in system utilization where the power load takes a moment to settle down to match the lower utilization level (this might be what you're referring to?), but that fits more with what I was saying earlier in the post you responded to, and it still comes down to how efficient the cooling in the system is (extremely good cooling staves this off for the most part, because even at higher clocks such a system would never generate enough heat to need to draw in more power to sustain those clocks, and the cooling won't have to work so hard in the first place as a result).

I can imagine that with PS5's setup the power and cooling can make those adjustments a few ms faster, but in real-world practice that probably won't amount to more than a 1%-3% difference in gains compared to the sort of "fixed" frequency setup of a contemporary platform. However, for Sony's specific design it very likely brings a lot of performance gains compared to implementing a comparable "fixed" clock setup on the PS5; in fact, they more or less said as much in Road to PS5.
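
As a rough illustration of that point: dynamic power scales with the activity factor as well as with voltage and clock (P ≈ α·C·V²·f), so a fixed-clock GPU sitting on a menu screen still draws far less than its budget. The numbers below are hypothetical, not either console's real electrical characteristics.

```python
# Back-of-envelope dynamic power model: P ~ activity * C * V^2 * f.
# Hypothetical capacitance/voltage values, not real console figures.
def dynamic_power_w(activity: float, cap_nf: float, volts: float, clock_ghz: float) -> float:
    return activity * (cap_nf * 1e-9) * volts ** 2 * (clock_ghz * 1e9)

busy = dynamic_power_w(0.9, 80.0, 1.0, 1.825)  # heavy rendering load
idle = dynamic_power_w(0.1, 80.0, 1.0, 1.825)  # menu screen, same fixed clock
print(f"busy ~{busy:.0f} W, idle ~{idle:.0f} W")  # ~131 W vs ~15 W
```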

Both? So, VRS isn't mentioned for PS5, therefore PS5 doesn't have VRS.
The geometry engine isn't mentioned for XSX, yet XSX has it.
Can you explain the double standard to me?

The GE is absolutely essential for the RDNA architecture to function, hence why both systems have it. It's a term AMD coined anyway, not Sony, so either AMD rebranded it for RDNA2 onward, or it's still present in RDNA2 as the Geometry Engine.

VRS is a "nice bonus" for RDNA2 GPUs but isn't absolutely critical for RDNA to function. And I haven't even been saying PS5 doesn't have "VRS". I just said VRS is an MS-branded term, so if Sony has an equivalent, it's probably under a different name and may or may not be implemented somewhat differently in the pipeline.
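
For what the term actually means, here's a conceptual sketch of the idea behind variable rate shading: shade once per coarser tile where there's little detail. This is only the concept, not how either console's hardware or API implements it, and the thresholds are made up.

```python
# Conceptual VRS: choose how many pixels one shading sample covers, based on
# how much detail a tile has. Thresholds are invented for illustration.
def shading_rate(luma_variance: float) -> int:
    """Return the tile edge (in pixels) covered by a single shading sample."""
    if luma_variance > 0.02:    # detailed region: shade every pixel (1x1)
        return 1
    if luma_variance > 0.005:   # moderate detail: one sample per 2x2 block
        return 2
    return 4                    # flat region (sky, fog): one sample per 4x4 block

print(shading_rate(0.05), shading_rate(0.001))  # -> 1 4
```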
 
Last edited:

teezzy

Banned
It's going to show in third party games, but first party Sony stuff is going to be gorgeous.

Same way first party Nintendo knows their way around Switch hardware, Sony's devs are able to work the same magic.

Let's hope these new studios MS bought can do the same and set new industry standards.
 

jimbojim

Banned
LOL

ve38F9c.jpg




And you can't just calculate the numbers. One console has variable clocks, and I also really don't think the bigger GPU has the same amount of the other units as the PS5 GPU. That doesn't make sense at all.

Yes, you can. Because the PS5 will be at max clocks most of the time. And the clocks won't drop dramatically, only by a couple of percent, like Cerny mentioned. And those stressed scenes happen rarely.
 
Ehh... like I've always said, native 4K has always been a dick measuring contest. Hopefully with DLSS 2.0 becoming more popular, people will start dropping this obsession with native 4K. From what we're hearing, both Xbox and PS are working on their own version of DLSS 2.0, albeit a less sophisticated one. Even then, I'm sure there will be videos and screenshots, zoomed in of course, to show slight differences. If you have to pause and zoom in to see them, would they even really stand out when you're playing the game?

But for sure, Series X is more powerful than PS5. You can't argue with hard numbers, and both consoles have hard numbers that you can't explain away.
 
Last edited:

Seph-

Member
But this isn't making much sense to me. If they're enabling the ability to run at the clock or TF the developer needs for the scene, what does that imply about other systems? Does it mean that other systems were providing an excess of frequency or power for the task at hand, wasting power as a result? Right now I'm typing this post on a PC with my Task Manager open at all times, running an image editor with a bunch of other tabs open. Integrated graphics (don't kill me xD), but nowhere near 100% of system resources are being stressed. Nowhere near 100% of power is being drawn, as my system cooling is literally silent. And this is a PC.

I just don't think there have really been cases of any system with fixed clocks drawing an excess of power to waste on rendering that doesn't require higher frequency usage, not for any lengthy period of time anyway. You may get a few frames after a drop in system utilization where the power load takes a moment to settle down to match the lower utilization level (this might be what you're referring to?), but that fits more with what I was saying earlier in the post you responded to, and it still comes down to how efficient the cooling in the system is (extremely good cooling staves this off for the most part, because even at higher clocks such a system would never generate enough heat to need to draw in more power to sustain those clocks, and the cooling won't have to work so hard in the first place as a result).

I can imagine that with PS5's setup the power and cooling can make those adjustments a few ms faster, but in real-world practice that probably won't amount to more than a 1%-3% difference in gains compared to the sort of "fixed" frequency setup of a contemporary platform. However, for Sony's specific design it very likely brings a lot of performance gains compared to implementing a comparable "fixed" clock setup on the PS5; in fact, they more or less said as much in Road to PS5.
Well, first: integrated graphics... lol, kidding. But as for that, we know that in the past they let the frequency stay more or less set and let the power fluctuate. We also, for the most part, know that you can in fact change the frequency multiple times within a scene. I personally don't think PC is the right comparison, and it's also worth noting that since this is relatively new and different, it's all theory until it's in front of us. What we do know is that they have indeed said they keep the power value at a set amount and don't let it vary, and went with a varying frequency based on load (i.e. what the scene requires).

So I admit it's certainly theoretical until we obviously have hands-on experience, but I don't think it's improbable. As I said, I just see it as them allowing devs to use what their engine needs while getting as close as possible to that theoretical number everyone loves throwing around. It's just new, and all we can really do is theorize. It's just, well, different.

I do apologize if I have left some stuff out, today is a busy one and I'm just trying to chime in in my free time, or perhaps I just misunderstood something you asked.
 

pasterpl

Member
Yes, you can. Because the PS5 will be at max clocks most of the time. And the clocks won't drop dramatically, only by a couple of percent, like Cerny mentioned. And those stressed scenes happen rarely.

Maybe I am being picky here, but what does "most" mean? 99%, or 51%, or maybe some other percentage?
 

NickFire

Member
I don't think we need more names to tell us that 12 TF is higher than either 9 or 10 TF. But let's be real: Series S is going to dictate everything outside of native 4K. So there will not be a single game that Series X can run and PS5 cannot. Maybe PS5 won't have native 4K, but I don't even have a 4K TV, so I couldn't care less. And even when I do, I've already seen enough not to care about native 4K versions of last-gen games. Not a game changer to me.
 

jimbojim

Banned
Dark theme, yo

Sorry, will change to dark. LOL

Maybe I am being picky here, but what does "most" mean? 99%, or 51%, or maybe some other percentage?

Both. It can be anywhere from 50, 60, 70 to 99%. Like I've said, stressed scenes happen rarely in games. Don't worry, the PS5 isn't a 9 TF console. But it looks like some people deliberately ignored what Cerny said. LOCKED profiles exist on PS5:

"Regarding locked profiles, we support those on our dev kits, it can be helpful not to have variable clocks when optimising. Released PS5 games always get boosted frequencies so that they can take advantage of the additional power," explains Cerny.

 
Last edited:

pawel86ck

Banned
I don't think it'll be at 2.23 GHz most of the time, actually (let's see how many people read past that and realize I'm not insinuating what they probably think I'm insinuating x3). Neither will Series X be at 1.825 GHz most of the time, either. It doesn't make any sense: if your game isn't demanding that much power, why have the GPU stressing itself at that level? I don't think "variable" frequency or "sustained" clocks are referring to this sort of thing.

I believe the variable frequency comes into play in scenarios where a game is actually maxing out the clocks. So it has more to do with sustaining that power over a long period of time as best as possible, based on the stress of the power load. In those scenarios where the clock is being maxed, if the power budget is exceeded and the GPU needs more power, the system will siphon some of the power budget from the CPU if it can. If not, then the GPU will downclock. It bases this on power, not heat, and the frequencies are adjusted based on any reduction in the power load.

It works differently on the Series systems; again, they aren't going to be at max clocks all the time, because a lot of game logic doesn't even require that. However, at points where max clocks are required, the system determines the power budget based on the heat being produced, and rather than using that to lower the clocks, it uses it to increase the cooling. So unless the cooling somehow fails, a game is guaranteed those max clocks at those frequencies for as long as it needs them.

Keep in mind too that in both systems' cases (PS5, Series X/Series S), you don't necessarily need max clocks to be active to cause the scenarios mentioned above. For PS5 I'd imagine that frequent peaks at high clocks for either the CPU or GPU over short periods would run the chance of the power supply not being able to draw/produce the required power quickly enough to distribute it throughout the system? Dunno, I don't really know how PSUs work xD.

For Series X/Series S they have to contend with residual heat dissipation, which could happen if excess heat isn't removed from the system fast enough in between peaks of high power usage. This is where you can run into the typical overheating issue of virtually any system (PS5 can also run into this, hence why it needs very good cooling even though it monitors power draw rather than heat production to determine the level of cooling), which can cause a shutdown. However, I'd assume this won't be an issue for the Series systems since the One X had an extremely good cooling system, and I'd also say this shouldn't be a factor for the PS5, either.

All speculation on my part, but it at least sounds like it makes the most sense.
You are correct; technically speaking, no modern GPU runs at fixed clocks all the time. I was thinking only about demanding gaming scenarios where the GPU needs to run at max clocks for an extended period of time. XSX will offer a sustained level of performance in such a scenario for sure.
 

CuNi

Member
DF said it was 4K. Unless Capcom have some tricks in play to make it look 4K.

I'm not doubting that it will run at 4K or 4K CB, and I'm not making a statement on that.
I only defend Dusk when people accuse him of things he never said: he never said it won't even run at 1080p, he merely said it runs badly even at 1080p right now but will run fine on release.
Nothing shocking there; many games run poorly in development builds.

So if the 1080p constraints will be fixed by launch, what about the 4K constraints?

I'm not an insider; I have no clue what res it will run at in the final version. Maybe 4K CB, maybe native 4K. No clue.
My issue was simply that he never said "it will only run at 1080p" or anything close to it. He merely said the current dev build seems to have issues even at 1080p, while saying himself that it will run well on release.
My guess is that this is simply a poorly optimized dev build. Maybe the devs got kits late, maybe they haven't had time to optimize yet, I don't know.

I am very sure, just as he said, that on release it will run butter smooth on both consoles, and that's all that counts, I'd say.
 

Elog

Member
Elog... Elog... you know that's not what I'm saying. What I'm saying is simply that the Series systems DO have customizations... and they do. But people tend to ignore that reality and just say they are "PCs in a box". Which, I mean, relative to older systems like the PS2, GameCube, Saturn, SNES, Mega Drive etc., BOTH of them are PCs in a box. They're both x86-based, which is primarily a PC architecture.

What you're referring to as "information" is either unsubstantiated rumors (the RDNA3 ones from people like MLID were destroyed by a PS5 software engineer on Twitter, btw) or patents that may or may not be reflective of actual hardware in the PS5. There are just as many such patents relating to technologies that may or may not be in the Series systems; what we're trying to gauge is the probability of such patents being in a finished retail product.

I honestly was not sure when I read your post, since you seemed to add customisations to the TFLOP delta between the two machines, which implies that the PS5 has none.

Happy to get the clarification though :)

As to the actual customisations to the PS5: you list several that I would consider sophisticated BS right now - just as you do.

I expect a significantly different GE, since the GE that comes with the current RDNA2 design out of the box cannot do culling/prioritisation in the way Cerny talked about in his speech. I believe this comes with both pros and cons. For developers utilising it, it will give significant advantages across the entire rendering pipeline, but I have question marks over how a standard PC-centric engine will manage it (i.e. it might be a disadvantage in multi-platform titles, with some serious eye candy in first-party titles). How well Sony has developed the API will determine how it fares in multi-platform titles. (There's a rough sketch of the kind of per-triangle culling such a front-end does at the end of this post.)

Secondly, a lot points towards customisations on the cache/memory side of things. Early information hinted at some odd soldering of memory chips to the board. This is one of the things that intrigues me the most, but also one of the things I am most uncertain about. The person with dev-kit access was rather specific, though.

Thirdly, there is the RT. Sony has been very tight-lipped here and hinted that they did not go with the standard AMD approach. Some even interpreted that as the PS5 not having RT. Now that we know it has RT, the question is what the silicon looks like. I assume the original information is correct and it is not the standard AMD approach. That begs the question, though: what is it, if not that? The early information might of course be wrong, and they may sit with the bog-standard RDNA2 RT setup.

And then comes the API.

Hopefully we will get to know more soon.
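
As a footnote on the GE point above, here's a minimal sketch of the sort of per-triangle rejection a geometry front-end performs (backface and zero-area culling). Concept only; the real hardware does this in fixed-function silicon, and the PS5's customised version is exactly the unknown being discussed.

```python
# Minimal per-triangle culling test (backface + degenerate), concept only.
def cull_triangle(v0, v1, v2) -> bool:
    """True if the screen-space triangle of (x, y) tuples can be discarded."""
    # Twice the signed area via the 2D cross product of two edges.
    area2 = (v1[0] - v0[0]) * (v2[1] - v0[1]) - (v2[0] - v0[0]) * (v1[1] - v0[1])
    if area2 == 0:
        return True           # degenerate: zero area, never visible
    return area2 < 0          # back-facing under counter-clockwise winding

print(cull_triangle((0, 0), (1, 0), (0, 1)))  # False: front-facing
print(cull_triangle((0, 0), (0, 1), (1, 0)))  # True: back-facing
```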
 

PresetError

Neophyte
Wait so you think that just because the PS5 has more power than the One X it will be able to do everything at native 4K just because it’s the same amount of pixels?

Why do you think that not all games are 4K natively on the X? Different engines and games require different amounts of power to run at 4K.

Because the PS5 has a lot more power than the One X, and Mark Cerny said in the past that at least 8 TF were needed to render games in native 4K.
 

NullZ3r0

Banned
Okay, the astroturfing worked, at least on the typical Xbox consumer.


The lying has started.




The differences are compared in percentages.
Source: basic mathematics.

Always the same script.

Tell me, do you actually believe that, or are you lying on the internet to "win the argument"?
Jesus. No, percentage is just one metric that doesn't tell the whole story. Xbox's TF advantage is at least 2 TF. Nothing will change that. On top of that, there are around 40% more compute units, which are critical when it comes to raytracing.

Sony has come to terms with this and that's why they're shoveling money at 3rd parties to differentiate PS5 versions in some other way because the graphics comparisons will be lost.

It's time for you to come to terms with this as well.
 