
Next-Gen PS5 & XSX |OT| Console tEch threaD


Shmunter

Member
The big issue that I'm seeing is Dealer's comments. He's basically making the claim that the XSX should run games at double the framerate of the PS5.

It's very dangerous for him to set such high expectations since he runs a YouTube tech channel similar to NX Gamer.

If the results are nowhere near that there will be a lot of meltdowns. And I mean really, really bad ones.

In my opinion he should put his bias aside and look at the situation objectively instead of letting his preference cloud his judgement.
He said double framerate on XSX? Nah, that's over the top even for him.

At the end of the day he will need to change his tune when more evidence starts piling up. If he compromises any DF videos with bias he won't last in the face of facts.

Edit: I mixed up Dealer with Dictator from DF. Ignore this post of mine.
 
Last edited:

DaGwaphics

Member
Yes, you’re right this is correct. My point is “sustaining clocks” is kind of meaningless in the context of actual work done.

Calculation costs watts, watts produce heat. By targeting to run at maximum watts, you’re targeting to run at maximum calculation rate, regardless of clock speed.

This is how modern GPU overclocking is done by people that want to eliminate boost and have a repeatable stable level of performance.
You want to be power draw limited (maximum calculation per unit of time) while having a good enough cooling solution to never thermally throttle.

Clock speed doesn't equal calculation done per unit of time either, just as a GPU can idle at its peak clock and generate no heat / do no actual work.

Extreme examples help. You could have a fixed 2GHz clock doing a very computationally cheap job that draws a small amount of power and makes a small amount of heat.
You could then have a variable 2GHz peak clock that drops to 1.5GHz to keep at maximum power draw while doing a computationally very expensive job, and the second one would be crunching more FLOPS (per unit of time, which is what FLOPS measures) than the first example, despite the clocks not only dropping but being lower.

Useful work done is watts of power draw, which then becomes heat. It's not clock speed, which says nothing about the calculation being done.

Some people clearly think clock speed is a measure of work done, or that TF can be “fixed”. My only point is that it’s demonstrably wrong, as anyone used to modern CPU/GPU overclocking will know.

It’s all about power draw. That’s the measure of numbers crunched.

Agreed. It's always max power draw that controls the final workload. Sony is going a new route to achieve a fixed cooling budget, while MS is going the old-fashioned way of providing enough headroom for the 95th percentile +25%. The result will be that the XSX will likely get louder as the generation moves along and devs push the system more and more, but that's nothing new.
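To make the watts-vs-clocks point concrete, here's a minimal back-of-the-envelope sketch. It assumes the usual RDNA shader-count formula; the utilisation figures are made up purely for illustration, not measured numbers. It shows how a lower-clocked part that is kept busier can crunch more FLOPS than a higher-clocked part that is mostly waiting:

```python
# Rough illustration: the FLOPS actually crunched depend on utilisation, not clock alone.
# Peak FP32 throughput for an RDNA-style GPU: CUs * 64 shaders * 2 ops/cycle * clock.
def peak_tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

def effective_tflops(cus, clock_ghz, utilisation):
    # utilisation = fraction of cycles the ALUs do real work (made-up figure here)
    return peak_tflops(cus, clock_ghz) * utilisation

# A fixed 2.0 GHz part that is only lightly loaded vs. one that drops to 1.5 GHz
# under a heavy, power-limited job:
light_job = effective_tflops(36, 2.0, 0.30)  # ~2.8 TFLOPS of real work
heavy_job = effective_tflops(36, 1.5, 0.90)  # ~6.2 TFLOPS of real work
print(light_job, heavy_job)  # the lower clock ends up doing far more calculation per second
```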
 
He said double framerate on XSX? Nah, that's over the top even for him.

At the end of the day he will need to change his tune when more evidence starts piling up. If he compromises any DF videos with bias he won't last in the face of facts.

"lol it’s math, minimum of 40to50 frames per second on GPU Alone. In addition to better RT"

That's what I'm making a reference to. It's a pretty big claim.
 

FranXico

Member
So the expansion "slot" (as much as we know of it, or if it exists) has a direct connection to the I/O? I think folks are assuming that it will be slowed by the interface, unless the interface is also proprietary?
I'm really interested to see what they've done with it.
I think the PS5 is going to end up with a proprietary interface for the expansion slot. Pretty much the same solution as MS.
 

HeisenbergFX4

Gold Member
I think the PS5 is going to end up with a proprietary interface for the expansion slot. Pretty much the same solution as MS.

Unless something has changed, it will use the PCI Express 4.0 standard, but Sony will have to certify the drives to work. The way I understand it, that's based on the physical size of the drives, making sure they actually fit.

Someone feel free to correct me if I am wrong as my brain hasn't spun up yet this morning :)
 

patsu

Member
So, your translation suggests the start bit is possible in the editor, but at an unknown resolution.

So the 1080p comment's context is still unknown?

And then Sweeney tweets it was just a video anyway and shuts it all down.

The rest of the thread put things in context.

 

THE:MILKMAN

Member
Unless something has changed, it will use the PCI Express 4.0 standard, but Sony will have to certify the drives to work. The way I understand it, that's based on the physical size of the drives, making sure they actually fit.

Someone feel free to correct me if I am wrong as my brain hasn't spun up yet this morning :)

That's right, and to add, Mark Cerny said that drives will just need 'a little extra speed' above the 5.5GB/s to arbitrate the additional priority levels. I'm thinking ~6GB/s, but 7GB/s ones will be available by year's end anyway.
 

Thirty7ven

Banned
Regarding the size of games, let's use Spider-Man's metrics for duplicated assets on disk, which were 25%, and use improved compression at 25% over what we already have. Let's keep it simple, as the numbers will vary a bit, but not a whole lot. It's a 46.3GB game.

It becomes a 21.65GB game if they were to do it on PS5, not changing anything else.

Also, the current generation of consoles launched with 500GB models, and probably the majority of console owners still hold a 500GB model. I know in an enthusiast forum we go straight to "Where can I hold all these games, I need a 2TB HDD connected to my 1TB console because I have 30 games installed even though I only play two of them at a time."

The UE5 demo was showing what's possible, but that doesn't mean the best use of the resources is a 600MB 3D model of a single statue. Maybe the best use of the resources is just a bunch of variety on screen, just a whole lot of different assets at the same time.
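Just to show the arithmetic, here is a minimal sketch using the 25% figures quoted above; exactly how far the size falls depends on how the duplication and compression savings are counted, so treat the output as illustrative rather than a correction:

```python
# Back-of-the-envelope game-size estimate using the figures quoted above.
# Assumptions: duplicated data is 25% of the disc and can be dropped entirely,
# and the new compressor is 25% more effective than the current one.
original_gb = 46.3

dedup_gb = original_gb * (1 - 0.25)      # drop duplicated assets -> ~34.7 GB
recompressed_gb = dedup_gb * (1 - 0.25)  # 25% better compression -> ~26.0 GB
print(round(recompressed_gb, 2))

# Taken literally, these two 25% cuts land around 26 GB; a figure like ~21.65 GB
# implies somewhat larger duplication and/or compression savings.
```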
 
[...]

The UE5 demo was showing what's possible, but that doesn't mean the best use of the resources is a 600MB 3D model of a single statue. Maybe the best use of the resources is just a bunch of variety on screen, just a whole lot of different assets at the same time.
Exactly. I would rather have 4 times the asset diversity than 4 times the asset resolution.
Just imagine what kind of diverse and crowded game you'll be able to experience with all those hardware improvements alone.
Now add all the API and engine improvements on top.

I can't wait to see some exclusive next-gen games.
Cyberpunk 2077 on next-gen consoles might be a very early glimpse at what could be possible. At least I hope they'll go and try to use these new possibilities the next-gen consoles will offer.
 
Last edited:

Three Jackdaws

Unconfirmed Member
So there is this rumour/leak going around that Aloy in HZD2 will have a higher poly/triangle count than all the characters combined from the previous game.

As far as I'm aware the first person to post this was FoxyGamesUK, who claimed he had heard it from his "source", although someone on this thread mentioned that a developer on the game was actually the first to mention it.

As for Moore'sLawIsDead, who also said he heard the same thing (making it out like he heard it from his own source), he's lying imo. He said that quote in a very recent video, and just before it he had replied to FoxyGamesUK's tweet which first mentioned the poly/triangle count stuff, saying something along the lines of "you have me curious" or some shit lol

It damages his credibility imo, just recycling information which was already leaked and rumoured and then passing it off like he's got some sort of inside source. Almost as bad as Tidux 😂
 

J_Gamer.exe

Member
The big issue that I'm seeing is Dealer's comments. He's basically making the claim that the XSX should run games at double the framerate of the PS5.

It's very dangerous for him to set such high expectations since he runs a YouTube tech channel similar to NX Gamer.

If the results are nowhere near that there will be a lot of meltdowns. And I mean really, really bad ones.

In my opinion he should put his bias aside and look at the situation objectively instead of letting his preference cloud his judgement.

Exactly, this is the kind of narrative being set by some and it's setting Xbox up for a fall IMO.

We've seen it go from TFLOPS being everything, to the SSD not mattering, then suddenly it does matter and SFS and BCPack can magically make up for the lack of I/O hardware, etc.

The SSD and I/O are looking to have a much bigger impact than a lot anticipated; if that results in better quality assets than the most powerful console, then oh boy...

I personally think the CPUs will be basically equal at 3.5 vs 3.6, and Sony has been very open about Tempest's power, which will reduce the CPU load. Xbox is less open about their audio hardware, and when they're quiet it usually means something.

Also, I'm not sure about the CPU needing more usage on the Xbox side for the SSD, whereas on PS5 it's almost completely taken care of in the I/O?
 

Handy Fake

Member
Exactly, this is the kind of narrative being set by some and it's setting Xbox up for a fall IMO.

We've seen it go from TFLOPS being everything, to the SSD not mattering, then suddenly it does matter and SFS and BCPack can magically make up for the lack of I/O hardware, etc.

The SSD and I/O are looking to have a much bigger impact than a lot anticipated; if that results in better quality assets than the most powerful console, then oh boy...

I personally think the CPUs will be basically equal at 3.5 vs 3.6, and Sony has been very open about Tempest's power, which will reduce the CPU load. Xbox is less open about their audio hardware, and when they're quiet it usually means something.

Also, I'm not sure about the CPU needing more usage on the Xbox side for the SSD, whereas on PS5 it's almost completely taken care of in the I/O?
I sometimes wonder if it's the difference in marketing culture between the US and Japan.
As an outsider from across the pond, American adverts tend to be* big, brash and all about power.

* or at least seem to be
 
Last edited:

Handy Fake

Member
I find it rather amusing. Like in the US, if you have a cold and take a certain medication, you'll be out windsurfing and scaling the tallest peaks within minutes.
In the UK, if you have a cold and take a certain medication, you're probably going to manage to get the ironing done and not batter the kids to death.
 
if that results in better quality assets than the most powerful console, then oh boy...

If all the XSX provides is a slightly higher resolution, most people will not notice it. However, something like half the load times, or the elimination of them, might be more noticeable than a slight increase in resolution.

I know I say this a lot, but we have to wait and see what the real differences are. Luckily it won't be much longer until we see them.
 

THE:MILKMAN

Member
Exactly, this is the kind of narrative being set by some and it's setting Xbox up for a fall IMO.

We've seen it go from TFLOPS being everything, to the SSD not mattering, then suddenly it does matter and SFS and BCPack can magically make up for the lack of I/O hardware, etc.

The SSD and I/O are looking to have a much bigger impact than a lot anticipated; if that results in better quality assets than the most powerful console, then oh boy...

I personally think the CPUs will be basically equal at 3.5 vs 3.6, and Sony has been very open about Tempest's power, which will reduce the CPU load. Xbox is less open about their audio hardware, and when they're quiet it usually means something.

Also, I'm not sure about the CPU needing more usage on the Xbox side for the SSD, whereas on PS5 it's almost completely taken care of in the I/O?

Ultimately both will do fine, and right now I'm impatiently waiting to see the receipts (games); only then will we get a true indication of where things are heading.

Right now all the I/O, SSD speeds, XVA etc talk boils down to one comparable element both Microsoft and Sony have stated officially.

XSX I/O = 5 Zen 2 cores + 'other'
PS5 I/O = 11 Zen 2 cores + 2x co-processors and coherency engines + other

I'm sure both will have their own custom bits within the GPU/CPU (e.g. texture filters, GPU scrubbers) that are yet to be announced? In fact Microsoft seem to have gone out of their way to not show the internal SSD in the teardown videos so maybe they have some secret sauce there we don't know about yet?
 

Nickolaidas

Member
Ergo my ass. I'm not into the "lowest common denominator" thing. Was Tomb Raider on PS4 held back by the Xbox One? No. Double the framerate. If there's any obvious advantage to take that makes the job easy, developers are going to use it. I tell you, if the lowest common denominator is the targeted goal in a 3rd party project and an at least 2x difference in performance is not used in any way, you know they are in a parity arrangement.

And the same goes both ways. You think if developers can push more resolution or frames on Xbox they are not going to do it?

Of course all those things are tied to game design, time, budget, deadlines, etc. So if nobody takes advantage of the systems, those are the reasons.
Except that resolution and framerate do not change gameplay. Asset loading can affect the speed a character runs in the game world. In a Series X game, the Flash would need to run slower because the Series X won't be able to load the assets at the same speed the PS5 could. As a result, the devs are forced to limit the Flash's running speed on the PS5 because the Series X won't be able to keep up. You can't have the Flash run through a city in five seconds on the PS5 and fifteen seconds on the Series X. The Series X owners would complain of a lesser experience.

Ergo, the asset loading speeds of the PS5 would not be fully used because the game would need to have equal gameplay mechanics in both versions.

That's what I mean when I say that multiplat games won't be able to fully utilize the PS5's speed.
 
Last edited:

Larryfox

Member
It will just be connected over a pretty standard PCIe 4.0 bus with enough lanes to support 7.5GB/s to the main chip.

I know the point you were making, though. The proprietary secret sauce in the custom flash controller can't be used by the expandable M.2 slot. That's why you need a 7.5GB/s NVMe drive there, instead of just a 5.5GB/s one; you need an extra 2GB/s to close the gap to the custom flash controller, presumably based on their testing.
I'd love to know why they decided to go that route and not make a custom SSD that uses their custom flash controller.

Correct me if I'm mistaken, but did people forget that the PS5 will have configurable installs? For instance, if you want to play the campaign mode you can just install that portion. If you beat the campaign and only want to play the online mode from now on, you can delete that portion. That will free up a lot of space on the SSD. I'm not sure if XSX will have that feature.
 

TLZ

Banned
Awkward times for Dictator and the rest of tards who have been pushing this PS5/Demo/Laptop BS 👇:messenger_tears_of_joy:


Facts line up, idiot? :)
Alex you God damn fool.

Tim Sweeney, Epic founder: PS5 SSD is super fast, great IO, eliminates bottlenecks and there's nothing like it in the market now. It enabled us to create that next-gen demo you've seen.

Alex: ummt ahhh urrghrgh yea I dunno uhhh we need to wait uhmmmuhh

Some random Chinese dudes running a video of the demo on their laptop at 1080p40

Alex: Oh yus that lines up FACTS.

You God damn wanker. I seriously hope you get thrown off DF.
 

whoever81

Member
This Tuesday is the day we all have been waiting for! I promise Sony's gonna announce the June event officially. If nothing happens you can sacrifice me! You have my word! :messenger_smiling:
Ok but the day we've been waiting for is not the announcement of the reveal but the reveal itself 😊
 
Last edited:

Three Jackdaws

Unconfirmed Member
Ultimately both will do fine, and right now I'm impatiently waiting to see the receipts (games); only then will we get a true indication of where things are heading.

Right now all the I/O, SSD speeds, XVA etc talk boils down to one comparable element both Microsoft and Sony have stated officially.

XSX I/O = 5 Zen 2 cores + 'other'
PS5 I/O = 11 Zen 2 cores + 2x co-processors and coherency engines + other

I'm sure both will have their own custom bits within the GPU/CPU (e.g. texture filters, GPU scrubbers) that are yet to be announced? In fact Microsoft seem to have gone out of their way to not show the internal SSD in the teardown videos so maybe they have some secret sauce there we don't know about yet?
Let's not forget PS5's insane 12-channel interface on the SSD and its 6 levels of priority, something which Series X doesn't have.

I mentioned this before, but no amount of secret sauce is going to make the Series X SSD outperform the PS5's, in the same way no amount of secret sauce in the PS5's GPU will make it outperform the Series X's in terms of teraflops. However, this is an apples-to-oranges comparison because SSD and GPU utilisation are 2 very different things.

Both companies had to compromise on one of the 2, and Sony chose the unique SSD route whilst having a capable GPU, while Series X chose the powerful GPU whilst retaining a capable SSD.
 
Alex you God damn fool.

Tim Sweeney, Epic founder: PS5 SSD is super fast, great IO, eliminates bottlenecks and there's nothing like it in the market now. It enabled us to create that next-gen demo you've seen.

Alex: ummt ahhh urrghrgh yea I dunno uhhh we need to wait uhmmmuhh

Some random Chinese dudes running a video of the demo on their laptop at 1080p40

Alex: Oh yus that lines up FACTS.

You God damn wanker. I seriously hope you get thrown off DF.

I've been reading through the tweets and some people are accusing Tim Sweeney of lying.

Any truth to this?
 

Larryfox

Member
Correct me if I'm mistaken, but did people forget that the PS5 will have configurable installs? For instance, if you want to play the campaign mode you can just install that portion. If you beat the campaign and only want to play the online mode from now on, you can delete that portion. That will free up a lot of space on the SSD. I'm not sure if XSX will have that feature.
They have a feature that works kind of the same way on Xbox already, called Fast Start, but it doesn't allow you to pick which part to install first. I wouldn't be surprised if they improve on that feature like what Sony is planning. Sony didn't talk about it as an exclusive feature and it already happens on PC, so maybe Xbox has it 🤷🏿‍♂️
 

Bo_Hazem

Banned
Aye, there was a wee discussion about this a good few hundred pages back. I'm honestly not sure.
I do wonder if perhaps there's architecture in place for the very fast shifting of full games from external to internal SSD to make up for the lack of IO seen within the architecture.
Bo_Hazem postulated that there could possibly be some on-chip cache RAM to mitigate any slowdown of streaming from a slower SSD IO, but as you say, we'd need to see when the details emerge.

The expansion bay actually shares the same I/O. The main problem is that the NVMe architecture uses 4 channels for 16 modules/chips, as I understood it, while PS5's custom SSD uses 12 channels for 12 modules/chips. And PS5's SSD has 6 priority levels vs 2 in NVMe.

So my suggested solution was having the option to make a partition inside the main SSD that compensates for the lack of speed in current SSDs, while not necessarily having the whole game temporarily sitting in that partition; instead use a chunk each time, still with a 7GB/s NVMe, for better results and a smaller partition.

Mark Cerny has something for us; probably companies will make a new architecture, or Sony would do it themselves, as they're no joke in the memory industry when it comes to camera memory cards:



And this one is insanely expensive starting at $220 for 128GB

[Image: Sony 512GB TOUGH CFexpress Type B memory card]
 

Thirty7ven

Banned
I've been reading through the tweets and some people are accusing Tim Sweeney of lying.

Any truth to this?

You know what's happening. Xbox has been driving the narrative since November(?) and fanboys got used to the new normal. This is just reality crashing down. A lot of people aren't taking any of this lightly and Inside Xbox the other day didn't help matters.

Hopefully after the Xbox and Ps5 events we will move on, but I'm afraid the information war is only going to get worse until we get H2H. It's gonna be D day for the Xbox fanboys and there's only one outcome they are willing to accept.
 

FranXico

Member
You know what's happening. Xbox has been driving the narrative since November(?) and fanboys got used to the new normal. This is just reality crashing down. A lot of people aren't taking any of this lightly and Inside Xbox the other day didn't help matters.

Hopefully after the Xbox and Ps5 events we will move on, but I'm afraid the information war is only going to get worse until we get H2H. It's gonna be D day for the Xbox fanboys and there's only one outcome they are willing to accept.
7 years, mate.
 

THE:MILKMAN

Member
Let's not forget PS5's insane 12-channel interface on the SSD and its 6 levels of priority, something which Series X doesn't have.

I mentioned this before, but no amount of secret sauce is going to make the Series X SSD outperform the PS5's, in the same way no amount of secret sauce in the PS5's GPU will make it outperform the Series X's in terms of teraflops. However, this is an apples-to-oranges comparison because SSD and GPU utilisation are 2 very different things.

Both companies had to compromise on one of the 2, and Sony chose the unique SSD route whilst having a capable GPU, while Series X chose the powerful GPU whilst retaining a capable SSD.

I have a couple of tech questions about the SSD chips. How do you work out the speed of each chip? I know they are rated in megatransfers but what is the equation? Will the XSX SSD have 4 chips for a 4 channel interface but be stacked/3D lower density chips or just 4 higher density chips?
 

Bo_Hazem

Banned
Nope :messenger_grinning_sweat:

If I remember correctly I said 26TF and 48TF for our two best render farms.

But that is not massive. It really is mediocre. If you compare it to the thousands of render farms that ILM uses, our two render farms are like a Tamagotchi. If you want something massive check out the new Ampere and its petaflops (at a cost of $199,000 each). :messenger_tears_of_joy:

Could you imagine using those to just beat the PS5 and feel glorious :messenger_tears_of_joy: It's like bringing a nuke to a fist fight.
 

yewles1

Member
Yeah I figured as much with my experience building PCs and watching benchmarking and overclocking videos from LTT, GN and the like. Even under heavy cooling, at a certain point clocks drop under thermal throttling. That's why I am curious about the fixed-clock claim: either they have some kind of crazy cooling, or the chip they are using is more efficient / underutilized so it will not thermally throttle. And it feels wasteful for a system to run at sustained clocks capable of producing 12 TF of graphical performance even if you are just watching YouTube. Just my 2 cents.
So which is it then: is Cerny's statement on variable clocks a little overstated, and should he have focused more on the benefits of AMD SmartShift, OR are MS's statements on fixed clocks a marketing fallacy?
 

Three Jackdaws

Unconfirmed Member
I have a couple of tech questions about the SSD chips. How do you work out the speed of each chip? I know they are rated in megatransfers but what is the equation? Will the XSX SSD have 4 chips for a 4 channel interface but be stacked/3D lower density chips or just 4 higher density chips?
Good questions, although I'm not tech-savvy enough to answer them. I am curious myself. I hope someone else on the thread can answer them.
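For what it's worth, the usual back-of-the-envelope way to do it goes like this: per-channel bandwidth is roughly the NAND interface rate (MT/s) times the bus width (a flash channel is 8 bits, so one byte per transfer), and the drive's raw sequential ceiling is roughly that times the number of channels. A minimal sketch follows; the 1200 MT/s figure and channel counts are illustrative assumptions, not confirmed specs for either console.

```python
# Rough NAND bandwidth arithmetic. A flash channel is 8 bits wide, so NAND rated
# at X MT/s can move roughly X MB/s per channel (before controller/ECC overheads).
def channel_mb_s(megatransfers_per_s, bus_width_bits=8):
    return megatransfers_per_s * bus_width_bits / 8

def drive_gb_s(megatransfers_per_s, channels):
    return channel_mb_s(megatransfers_per_s) * channels / 1000

# Hypothetical numbers purely for illustration: the same 1200 MT/s NAND behind a
# 12-channel controller vs. a 4-channel one.
print(drive_gb_s(1200, 12))  # ~14.4 GB/s raw ceiling across 12 channels
print(drive_gb_s(1200, 4))   # ~4.8 GB/s raw ceiling across 4 channels
# Real drives land below these ceilings once flash timing and overheads are included.
```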
 

FranXico

Member
So which is it then: is Cerny's statement on variable clocks a little overstated, and should he have focused more on the benefits of AMD SmartShift, OR are MS's statements on fixed clocks a marketing fallacy?
Statements from both companies always have abundant embellishment around a kernel of truth.

Fixed clocks are what consoles have used for a long time already, and are quite standard, so MS did not lie when claiming fixed clocks. This is true.

The PS5 "continuous boost" was explained very superficially by Cerny while trying to be truthful, but the complexity of the frequency control did him no favors. Suffice it to say, most confusion comes from people assuming a linear relation between power consumption and GPU/CPU frequencies, which is absolutely not the case. I think very few sources explain this adequately.
 
Last edited:

Bo_Hazem

Banned
yeah, let's see how many polygons they can use for a moving character. and give me some boobs and ass physics on Aloy :pie_drooling:

I feel you, man. Here, your dream thread come true:


 

Kusarigama

Member
I've been reading through the tweets and some people are accusing Tim Sweeney of lying.

Any truth to this?
Yeah, first it was Mark Cerny who was lying about PS5 having RT, then about PS5 being on RDNA2, then the Crytek guy was also lying, and now it is Tim Sweeney who is lying. How dare anybody say anything positive about PS5, Xbox Series X has 12, I repeat 12 TFLOPS!

Why is it so hard for the Xbox fanatics to even hear someone say anything positive about PS5? You folks have the MOST POWERFUL console with all the advantages over PS5 (MS isn't even competing with Sony and is handily beating them). They have a higher-clocked CPU, a higher-TFLOPS GPU, faster RAM, a faster SSD (they are deluding themselves with claims of 3 times the 4.8GB/s but we'll bite, thanks to XVA, BCPACK and SFS), more studios (with MS' deep pockets no studio is free from the threat of acquisition), more games than they can announce and market properly.

Take all of these and have fun; don't go around spoiling the fun for others.

And if you truly love Xbox then criticize the gross mistakes that MS has made with the previous Inside Xbox event. They take the blame and then continue to do the same things. Phil Spencer tweets that he is excited to start the next-gen campaign with honesty and transparency. They outright lied, setting wrong expectations and failing to deliver on their own words. Jez Corden did a good article criticizing that, even praising State of Play and Nintendo Direct and suggesting that direction for Inside Xbox.
 
Last edited:

TLZ

Banned
Teraflops aren’t “fixed” on XSX. They are a theoretical peak on both systems.

The GPU frequency is fixed on XSX, but just frequency alone means nothing.

Any PC gamer will know that to utilise a GPU 100% you pretty much need a “burn test” benchmark running. Something pretty much designed to flip every transistor every clock-cycle, usually rendering nonsense or something pointless and mostly static.

Any PC gamer will also know that it’s easy to have a fixed high clock on something idling.
It’s only when a CPU/GPU is loaded up with real work to do that watts are consumed and turned into heat that needs to be dealt with.

For example, I can overclock my i7 to 5.0Ghz and it will sit there just fine. I can even run some games and it will be happy. However, if I run a stress/burn test like Prime95 or Linpack, within seconds it’s spiking in temperature and throttling back to deal with the heat. With the ability to throttle back disabled it instead crashes the entire system as the thermal protections kick in.
It only even gets close to its theoretical maximum FLOPS during a burn test, not when doing the tasks it generally does.

The quoted teraflop numbers for both systems are literally a hand calculation of a theoretical maximum that doesn't consider thermal constraints, because the cooling capability is unknown. The figures quoted for both are basically what you'd get if every single transistor was flipped every single clock tick, like the world's worst burn test.

There’s no such thing as a “fixed” 12 teraflops. A flop is a measurement of one kind of calculation a GPU does, relating to the programmable side rather than the fixed function parts. It depends on work-load, not on frequency.
To repeat, if it didn’t vary by work-load then a 5Ghz CPU would be the same temperature whether it was idling on the desktop or running Prime95. That is not the case. That is not how it works.

The actual peak performance either console will be able to sustain is actually calculated differently.
Microsoft has taken the traditional route of setting their clocks to match what they estimate to be a worst-case scenario in actual game code (and not a 12TF burn test), a worst-case ambient temperature, and what their cooling system is capable of dissipating.

Game developers will have access to profiling and telemetry tools to see how close they are to the designed limits.
Any condition where this estimated peak is breached, like a clogged heatsink, broken fan or crazy desert heat, would cause the game to crash if it happened during a computationally intense task.

PS5 on the other hand encourages developers to work to a maximum wattage, and sizes the fan to match that known maximum under the expected extreme of ambient heat they expect it to reasonably encounter.
Wattage is drawn by the actual calculation work being done, not by frequency alone, and wattage creates heat.
A worst-case burn test would cause it to drop frequency a "few" percent (to apparently reduce wattage 10%), and the game wouldn't crash. But this would only happen if the developer programmed the game in this way, and it's not determined by ambient temperature, as all PS5s need to be equal and deterministic.
This means it's technically easier to get the actual unknown maximum out of a PS5 during design time.

Both systems vary in the actual amount of work they do, neither will reach burn-test peaks.

Xbox's power consumption (heat) varies as the workload increases at a fixed frequency, up to the point it crashes because of the heat, with an expected margin of headroom the developer works within to make sure that doesn't happen in any reasonable scenario.

PS5's frequency varies as the workload increases at a fixed power consumption (heat), up to the point it reduces frequency to keep power consumption within the limit, which the developer works within to make sure this drop in work capability doesn't happen in any reasonable scenario, although if it does, it's less critical. There's no margin to be as concerned with, as merely peaking won't crash the game.

Both systems will be able to throttle back on frequency and power respectively when idling outside of a game.

In a Eurogamer article Cerny insisted it would take the likes of an intentionally programmed burn test that flips all transistors every clock tick to cause the GPU to downclock a few percent. He further suggested that from what they're seeing, even when the GPU spends an entire 33ms frame doing actual work with no idle, it's not reducing its clocks.
He clarified that a race to idle condition wasn’t being used to keep the clocks artificially high, and that even under constant work they stayed high.

Remember, heat is generated by calculation work, not clock speed alone.

PS5’s targeting of watts instead of an estimated peak is about cooling and efficiency. It’s not about “boosting” up in the same way a mobile or PC CPU or GPU does. The people still insisting and implying PS5 works that way have failed to understand what is going on, and are likely intentionally trying to get people to believe it’s working as a mobile or PC boost clock does, for whatever strange reason.

Tables that compare the two GPUs, like the last one posted here with TF in one column and ROPs in another, are also disingenuous.

For the TF column, that number is a performance metric that has been calculated by multiplying some figures by the GPU core clock-speed.
For the ROP column the clock-speed is then disregarded, even though ROP performance is one of those fixed function parts of the GPU that scales with clock-speed.
The two numbers being provided aren’t comparable if the clock-speeds are different.
The same applies to the cache figures. A cache of size N at 2Ghz has twice the bandwidth of a cache at size N at 1Ghz.
This is what Cerny meant by a rising tide lifting all boats.

If you want to compare teraflops, then you need to compare the other areas of the GPU also with the clock-speed in mind. Failing to do that is to be misleading. Maybe even intentionally so, although I’m more inclined to believe the people making these kinds of comparisons just don’t understand what’s going on and how things really work.

Both the PS5 and XSX GPUs will be better than each other at different workloads. The XSX clearly has an advantage in outright theoretical compute ability, and therefore ray tracing ability, but how the rest of the system feeds these machines, how efficiently they're used, how well the different APIs are written, and how skilled the game and engine developers are will have far more of an impact on the end result.

For clarity, and it’s worth repeating, I’m impressed and really happy with the job both companies are doing. Microsoft is definitely back on the right track and fans of that system should be very happy. I’m personally more excited about PS5 and what its IO might bring to the table and will be buying that first, but I’ll also be very happy to spend my money on an XSX and will be doing exactly that, too. Microsoft deserve to see that focusing the device as a gaming machine makes them money again.
Both of these babies are impressive. They'll be closer in multi-platform performance than some people are suggesting (hoping?). It's going to be a great ~7 years.
Even PC is going to take a quantum leap in response to this. Everyone wins. Be happy!
thicc_girls_are_teh_best is that you? There's only one poster I know of who loves typing this much.

You couldn't stay away from the thread for too long, so you came back under another alt :messenger_tears_of_joy:
 
Last edited:
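As a footnote to the "hand calculation" described in the long post quoted above, here is a minimal sketch of how those headline numbers are computed from the publicly stated CU counts and clocks, plus the clock-dependent parts the comparison tables tend to ignore; the ROP count is the commonly reported figure for both machines, not an official spec sheet.

```python
# The headline figure is just a hand calculation: CUs * 64 shaders * 2 FP32 ops * clock.
# Fixed-function throughput (ROPs) and cache bandwidth scale with the clock too,
# which is the "rising tide lifts all boats" point.
def fp32_tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

def pixel_fill_gpix_s(rops, clock_ghz):
    return rops * clock_ghz  # Gpixels/s, scales directly with clock

print(fp32_tflops(52, 1.825))  # XSX: ~12.15 TFLOPS (52 CUs @ 1.825 GHz, fixed)
print(fp32_tflops(36, 2.23))   # PS5: ~10.28 TFLOPS (36 CUs @ 2.23 GHz, peak)

# Both parts are reported to have 64 ROPs, so the higher PS5 clock gives a higher
# theoretical pixel fill rate despite the lower TFLOPS figure.
print(pixel_fill_gpix_s(64, 1.825), pixel_fill_gpix_s(64, 2.23))
```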
Yeah, first it was Mark Cerny who was lying about PS5 having RT, then about PS5 being on RDNA2, then the Crytek guy was also lying, and now it is Tim Sweeney who is lying. How dare anybody say anything positive about PS5, Xbox Series X has 12, I repeat 12 TFLOPS!

Why is it so hard for the Xbox fanatics to even hear someone say anything positive about PS5? You folks have the MOST POWERFUL console with all the advantages over PS5 (MS isn't even competing with Sony and is handily beating them). They have a higher-clocked CPU, a higher-TFLOPS GPU, faster RAM, a faster SSD (they are deluding themselves with claims of 3 times the 4.8GB/s but we'll bite, thanks to XVA, BCPACK and SFS), more studios (with MS' deep pockets no studio is free from the threat of acquisition), more games than they can announce and market properly.

Take all of these and have fun; don't go around spoiling the fun for others.

And if you truly love Xbox then criticize the gross mistakes that MS has made with the previous Inside Xbox event. They take the blame and then continue to do the same things. Phil Spencer tweets that he is excited to start the next-gen campaign with honesty and transparency. They outright lied, setting wrong expectations and failing to deliver on their own words. Jez Corden did a good article criticizing that, even praising State of Play and Nintendo Direct and suggesting that direction for Inside Xbox.

Just seems like more Discord-level FUD to me. The PS5's SSD has caused more meltdowns than the Series X's 12TFs, in my opinion.

It's really sad to see people behave this way.
 

Bo_Hazem

Banned
Actually Lumen does use raytracing. They don't trace triangles but they trace voxels, signed distance fields and screen space.

So it might even be possible to use a similar technique in Decima or other Sony engines while using hardware raytracing for better performance.

What I am really hoping is that techniques similar to Nanite will be used in Sony engines as well. That UE5 detail is serious business!

Assuming that they've been working closely with Sony all these years, I really do expect that specific technique was actually made possible by Mark Cerny's way of thinking, and I can't see Sony's first-party studios not playing around with a similar technique already.

Remember, Sony Pictures Imageworks does CGI for big movies from the likes of Warner Bros and Marvel, along with their own. Such tech would be extremely cost/time efficient, and they might even migrate it to their movie making for a rapid workflow.







A previous rumor suggested that Sony's cinema division, Sony Pictures, is working closely with PlayStation division this time around. If so, we can only expect great things on many levels.
 
Last edited: