This article just confirms what everybody already knew. The PS5 is a great system, the XSX is just better
That comparison actually made the PS5's stupidly high clocks look really bad. After Cerny's claim that higher clocks give better results than the same computing power achieved with lower clocks and more CUs, it turns out that even less computing power gives virtually the same results, given the same CU count. They should've really stuck to 2GHz or even 1.9GHz and called it a day. And to think Cerny was talking about effective utilization of CUs...
For their box it's true, no lies there. They maximized their system; they have 36 CUs used to the max, efficiently. But MS played it all close to the chest and outclassed and out-designed them with an out-of-the-box form factor. It is what it is. Not saying that we won't see amazing games from Sony, because we will; we have seen what their studios can do with the PS4.
> This article just confirms what everybody already knew. The PS5 is a great system, the XSX is just better.

I've been wondering, who is actually arguing otherwise?
> This article just confirms what everybody already knew. The PS5 is a great system, the XSX is just better.

This basically. It is what it is. Both will have great games, but MS has the hardware edge.
Sure, we have the “pack” shitting on PS5 and calling it a crappy design out of genuine concern.

It is not enough to have a well-made console; the other must be shit... people must not see any positive in it. Then there's the coordinated effort to ensure no positive PS5 thread is left without “intervention”, else people may start believing in such things, and that must not be allowed.
@NXGamer, can you explain why you said the GPU wouldn't "throttle", or would only drop by 50 or 60MHz? Or is it just not worth your time?
The bolded part is wrong, dude. I'm not sure why you keep thinking this. @SonGoku just summarized the DF article (it's better than the video, so PLEASE go read it). These points below (written by DF, by the way) are what you need to keep in mind (a toy sketch of the power-sharing idea follows the list).
- The CPU and GPU each have a power budget. If the CPU doesn't use its power budget - for example, if it is capped at 3.5GHz - then the unused portion of the budget goes to the GPU.
- There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower.
- GPU will spend most of its time at or near its top frequency in situations where the whole frame is being used productively in PS5 games. The same is true for the CPU, based on examination of situations where it has high utilization throughout the frame, we have concluded that the CPU will spend most of its time at its peak frequency.
- With race to idle out of the equation and both CPU and GPU fully used, the boost clock system should still see both components running near to or at peak frequency most of the time.
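To make those bullet points concrete, here is a minimal toy model of the power-sharing idea. All wattage numbers are illustrative assumptions (Sony has not published real figures); only the shift-the-unused-CPU-budget-to-the-GPU behaviour comes from the DF article.

```python
# Toy model of the SmartShift idea summarized above: a fixed total power
# budget is split between CPU and GPU, and any budget the CPU does not
# consume in a given frame is handed to the GPU. Wattages are invented
# purely for illustration.

TOTAL_BUDGET_W = 200.0                      # hypothetical total SoC budget
CPU_BUDGET_W = 60.0                         # hypothetical CPU share
GPU_BUDGET_W = TOTAL_BUDGET_W - CPU_BUDGET_W

def allocate(cpu_demand_w: float) -> tuple[float, float]:
    """Return (cpu_watts, gpu_watts) for one frame.

    The CPU gets what it asks for, capped at its own budget; the unused
    portion of the CPU budget shifts to the GPU.
    """
    cpu_w = min(cpu_demand_w, CPU_BUDGET_W)
    gpu_w = GPU_BUDGET_W + (CPU_BUDGET_W - cpu_w)
    return cpu_w, gpu_w

for demand in (60.0, 45.0, 20.0):           # heavy, medium, light CPU frames
    cpu_w, gpu_w = allocate(demand)
    print(f"CPU wants {demand:5.1f} W -> CPU {cpu_w:5.1f} W, GPU {gpu_w:5.1f} W")
```

Note that nothing in this model forces a this-or-that trade: when the CPU genuinely needs its whole budget, the GPU simply runs on its own (still larger) share.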
> For their box it's true, no lies there. They maximized their system; they have 36 CUs used to the max, efficiently. But MS played it all close to the chest and outclassed and out-designed them with an out-of-the-box form factor. It is what it is. Not saying that we won't see amazing games from Sony, because we will; we have seen what their studios can do with the PS4.
> But hardware-wise, MS outclassed them.

Shit design is shit design, mate. That box is throttling at the clocks they set. Why? Shit design, and all of this could easily have been solved.
NXGamer's video was laughably bad, as I stated in the topic that covered it. I wouldn't take anything the guy says as worth much.
About this part
- There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower
Pretty sure the DF guy said that it will not be able to push both, because that would mean fixed hardware clocks, which Cerny was also specific that it doesn't have.

If that's not the case, then Cerny did a mighty bad job of explaining what his box does.
> I've been wondering, who is actually arguing otherwise?

A very small group of people who seem to think a faster SSD, fewer CUs at a higher clock rate and Tempest Audio would make up the difference in CPU, GPU and bandwidth speed.
The game was not designed for eight-core CPUs, mate; barely any game is. Next gen will peg those CPUs to oblivion, especially with the increased density of everything: AI, physics, etc. That stuff was already going to happen this generation, until devs realized the CPUs simply couldn't handle it for shit. Also, there's a big chance the GPU will always sit at 100% usage when dynamic resolution at 4K pushes it to its max.

People can try to sugarcoat the bad design all day long, but what they should do is give Sony lots of shit so they can still make changes: redesign their box to get stable clocks, or even go so far as to redesign the entire box and slam in the same GPU Microsoft has.
> A very small group of people who seem to think a faster SSD, fewer CUs at a higher clock rate and Tempest Audio would make up the difference in CPU, GPU and bandwidth speed.

All I've seen is people saying PS5 might not be the "weak" console a certain group claim it is.
You sure about that? The consoles aren't even out yet, mate, lol. Is the only thing you care about teraflops?
You sure DF said that? Where did they say that? Are you sure you heard or read it correctly?
I think your Discord group needs to change its trolling tack a bit.
Still, you are right that Sony needs to take this into account too and not get frustrated. They keep explaining their design choices and it is good for people that are interested in such things, but it is a waste of time against the “is it really HW raytracing?” / HW accelerated RT concern troll brigade.
I’m really quite certain you don’t have a clue what you are even talking about. This post is the text equivalent of this: [embedded clip]
This sounds exactly like that "intervention" by X fans on videos about PS5; I guess that plan is already in motion.

And you're a system architect too, ugh. The lengths some X fans are going through are so embarrassing.
It was somewhere around the Doom part, where he was playing it, if I remember right. I could have heard him wrong though, as I was doing some administrative work while listening. But that quote you just posted makes it sound like those clocks are fixed and can always run at max frequency whenever they feel like it. You either can run at full clocks or you can't; there is no in-between, basically.

That the CPU and GPU clocks jump up and down when needed isn't very interesting; that's just nice for your energy bill at the end of the day, and that's about it.

Debunk what I say, or stop quoting me. This is the third time now in a topic that you start to shit on my posts with no arguments other than your flower-power feelings logic I couldn't give two shits about. If you've got information I don't know, share it and shit on my post with facts; I will even appreciate it. As of now you are just wasting my time.

Nice argument. Got some actual factual information or tech detail, instead of posts that say nothing but "you're wrong"?
Hey look, another one pops up. No argument, just straight-up shitposting. Nice job, mate.
> A very small group of people who seem to think a faster SSD, fewer CUs at a higher clock rate and Tempest Audio would make up the difference in CPU, GPU and bandwidth speed.

The CPUs are virtually identical on both machines (3% diff). Considering that on PS5 the CPU won't have the audio and the I/O to process at all (which won't be the case on XSX, contrary to what they want you to believe), the PS5 CPU may well be more potent in actual games.
Girls and Guys, hold up here for a moment!
You know the big problem that we are all going to have is the following:
At the end of the day, we all troll each other out of the love we have for each other, for games and for this industry.

I like being sarcastic and so on, yeah, but let's be very precise, honest and serious for a moment. Please hear me out; you will find yourself in here as well, I am certain:

If any of these companies fails to deliver something spectacular this generation, and by spectacular I mean great gaming experiences, it can get very ugly for single-player game experiences in a short-to-medium time frame.

My single-player experience derives almost exclusively from console gaming, and I am not going to exclude my Switch here. Switch games are freaking awesome.

At the end of the day, I started my console career with a SNES and moved over to a PS and N64, and I actually was a games kid, not a console fan. Now I am a grown-up and have to pay my bills and pay for whatever I want to do. I play Xbox occasionally, but since it has been available on PC, I am just playing on my new rig. It is THAT simple. I am just going for convenience as much as possible...

The current state of play is that the best single-player experiences I have witnessed came 40% from PS, 40% from Switch and 20%(!) from Xbox. If I am to lose my best single-player experiences to a fuck-up from any of these companies, be it Sony in regards to their console, or Xbox in regards to their game studios, or even Switch (maybe they kill Mario, who knows), it will hurt bad. And to be honest, my money is too important to me to spend it on a console or a game that is really going to be sub-par to anything we expect.

At the end of the day the games will decide, and if those games don't deliver a good experience I am going to be very sad. The graphics I am fine with, if they are a good evolution from what we have seen on PS4 Pro and Xbox One X. I do not really need the next generation of incredible 4K/120Hz graphics. Personally, what I need is a good plot, good gameplay, a good and innovative idea, and I need heart and love. I need you to make me feel I spent my time on a product that tried to do something real here.
I love games as long as they are good.
That's all I wanted to say on a more serious note.
> You sure about that? The consoles aren't even out yet, mate, lol. Is the only thing you care about teraflops?
> You sure DF said that? Where did they say that? Are you sure you heard or read it correctly?

https://www.neogaf.com/threads/digital-foundry-ps5-uncovered.1534691/post-257644442
> There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower.

If this point is true, then yes, you are right and I'm wrong. I definitely hope this is the case. However, based on what I gathered in this thread, this point is simply not true: the CPU and the GPU can't run at their max frequencies at the same time. Several people said this. What's the truth, then? We need the exact words from Cerny's mouth to decide, but I suppose he was vague; that's why the confusion. And vague PR talk rarely means good news. Again, I hope I'm wrong.
> That's right, folks. Overclocking is a better console solution than more on-chip computational power!

Brutal, but honest.
God, I am loving this generation already.
> This article just confirms what everybody already knew. The PS5 is a great system, the XSX is just better.

... in terms of graphics.
> The best way to describe it is like this: the PS5 has enough power budget to be able to run both the CPU and GPU at full power, but like Richard explained in the video from Cerny, there are going to be times where the workload isn't going to require the CPU or GPU to be running full blast. SmartShift allows the power to run back and forth between them to give one or the other extra juice when needed. The issue is that Xbox fanboys are trying to confuse people with FUD that it's a this-or-that situation; that either the CPU is up and the GPU is down and vice versa. Cerny literally said that is not the case.

So if PS5 has enough power to run both CPU/GPU "at full power", what's the point of see-sawing back and forth if both are already running at max?
> You sure about that? The consoles aren't even out yet, mate, lol. Is the only thing you care about teraflops?
> You sure DF said that? Where did they say that? Are you sure you heard or read it correctly?

I looked at the overall simplicity of both designs; MS won hands down. It is what it is.
Richard, quoting Mark Cerny, said:

With latency of just a few milliseconds, data can be requested and delivered within the processing time of a single frame, or at worst for the next frame. This is in stark contrast to a hard drive, where the same process can typically take up to 250ms. What this means is that data can be handled by the console in a very different way - a more efficient way. "I'm still working on games. I was a producer on Marvel's Spider-Man, Death Stranding and The Last Guardian," says Mark Cerny. "My work was on a mixture of creative and technical issues - so I pick up a lot of insight as to how systems are working in practice."
One of the biggest issues is how long it takes to retrieve data from the hard drive and what this means for developers. "Let's say an enemy is going to yell something as it dies, which can be issued as an urgent cut-in-front-of-everybody else request, but it's still very possible that it takes 250 milliseconds to get the data back due to all of the other game and operating requests in the pipeline," Cerny explains. "That 250 milliseconds is a problem because if the enemy is going to yell something as it dies, it needs to happen pretty much instantaneously; this kind of issue is what forces a lot of data into RAM on PlayStation 4 and its generation."
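The frame-budget arithmetic behind that passage is easy to check. A minimal sketch using only the figures quoted above: a 33.3ms frame at 30fps, "a few milliseconds" for the SSD (3ms is my stand-in), and up to 250ms for the hard drive.

```python
# Frame-budget check of the latencies quoted above. An SSD request that
# returns within a few milliseconds lands inside the current frame, while
# a 250 ms hard-drive round trip arrives several frames too late.

FRAME_MS = 1000.0 / 30.0            # ~33.3 ms per frame at 30 fps

def frames_late(latency_ms: float) -> int:
    """Whole frames that elapse before the requested data arrives."""
    return int(latency_ms // FRAME_MS)

for name, latency_ms in (("SSD, 'a few ms'", 3.0), ("HDD, worst case", 250.0)):
    print(f"{name:16s}: {latency_ms:6.1f} ms -> {frames_late(latency_ms)} frame(s) late")

# SSD: 0 frames late, i.e. usable in the same or the next frame.
# HDD: ~7 frames late, which is why the dying enemy's yell has to be
# pre-loaded into RAM on PS4-generation hardware.
```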
> The CPUs are virtually identical on both machines (3% diff). Considering that on PS5 the CPU won't have the audio and the I/O to process at all, the PS5 CPU may well be more potent in actual games.

And the XSX's bandwidth has constraints. But yeah, the XSX has a bit more TFLOPs of power; there is no denying that.
> I have you on ignore, but I'll answer this question. More than likely the fluctuation accounts for the increase in heat and potential cooling of the system. When Richard asked that exact question, Cerny said we would see soon, in a system teardown, how the cooling comes into play, and that people would be satisfied with what they've come up with.

You have me on ignore, yet you replied to me 5 minutes later. Ya, ok.
Then my question is the same as StreetsofBeige's.
If they can run at the max frequencies at the same time, then why vary the frequencies at all? What's the point? Somehow this is not coming together.
In short, the idea is that developers may learn to optimise in a different way, by achieving identical results from the GPU but doing it faster via increased clocks delivered by optimising for power consumption. "The CPU and GPU each have a power budget, of course the GPU power budget is the larger of the two," adds Cerny. "If the CPU doesn't use its power budget - for example, if it is capped at 3.5GHz - then the unused portion of the budget goes to the GPU. That's what AMD calls SmartShift. There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."
"There's another phenomenon here, which is called 'race to idle'. Let's imagine we are running at 30Hz, and we're using 28 milliseconds out of our 33 millisecond budget, so the GPU is idle for five milliseconds. The power control logic will detect that low power is being consumed - after all, the GPU is not doing much for that five milliseconds - and conclude that the frequency should be increased. But that's a pointless bump in frequency," explains Mark Cerny.
PS5 caps its CPU and GPU clocks at 3.5GHz and 2.23GHz respectively, but how stable are the frequencies? At this point, the clocks may be faster, but the GPU has no work to do. Any frequency bump is totally pointless. "The net result is that the GPU doesn't do any more work, instead it processes its assigned work more quickly and then is idle for longer, just waiting for v-sync or the like. We use 'race to idle' to describe this pointless increase in a GPU's frequency," explains Cerny. "If you construct a variable frequency system, what you're going to see based on this phenomenon (and there's an equivalent on the CPU side) is that the frequencies are usually just pegged at the maximum! That's not meaningful, though; in order to make a meaningful statement about the GPU frequency, we need to find a location in the game where the GPU is fully utilised for 33.3 milliseconds out of a 33.3 millisecond frame.
"So, when I made the statement that the GPU will spend most of its time at or near its top frequency, that is with 'race to idle' taken out of the equation - we were looking at PlayStation 5 games in situations where the whole frame was being used productively. The same is true for the CPU, based on examination of situations where it has high utilisation throughout the frame, we have concluded that the CPU will spend most of its time at its peak frequency."
Put simply, with race to idle out of the equation and both CPU and GPU fully used, the boost clock system should still see both components running near to or at peak frequency most of the time. Cerny also stresses that power consumption and clock speeds don't have a linear relationship. Dropping frequency by 10 per cent reduces power consumption by around 27 per cent. "In general, a 10 per cent power reduction is just a few per cent reduction in frequency," Cerny emphasises.
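Those two data points (10 per cent less frequency for roughly 27 per cent less power, and a few per cent less frequency for a 10 per cent power saving) line up with the classic dynamic-power model P ∝ f·V², with voltage scaling alongside frequency, i.e. roughly P ∝ f³. A quick sanity check under that assumption; the cubic model is an inference, not something DF or Cerny states.

```python
# Sanity check of Cerny's figures against an assumed cubic power model,
# P ~ f^3. Only the quoted percentages come from the article.

def relative_power(freq_ratio: float) -> float:
    """Power relative to maximum for a clock at freq_ratio of maximum."""
    return freq_ratio ** 3

# "Dropping frequency by 10 per cent reduces power consumption by around 27 per cent"
print(f"10% slower clock -> {1 - relative_power(0.90):.0%} less power")   # ~27%

# "a 10 per cent power reduction is just a few per cent reduction in frequency"
print(f" 3% slower clock -> {1 - relative_power(0.97):.0%} less power")   # ~9%

# Applied to the GPU: 2.23 GHz down to 2.17 GHz is only a ~2.7% drop.
ratio = 2.17 / 2.23
print(f"2.23 -> 2.17 GHz -> {1 - relative_power(ratio):.0%} less power")  # ~8%
```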
> Stop listening to StreetsofBeige, first of all. Second, Cerny has already answered this question: the answer is to better cool the PS5 console with more predictability. Instead of having a fan run faster to cool the system down, the clocks vary to a slight degree to remain within the power budget and keep the console cool. Again, lowering the GPU from 2.23GHz to 2.17GHz saves 10% of power, or 10% less heat (this will help the PS5 NOT run hotter).

In other words, PS5 has trouble running at full 3.5 and 2.23 at all times. If there weren't issues, they'd just leave it maxed out at all times like every other console.
> Exactly. I really hope they know what they are doing with this PS5, but day after day it seems to me more and more that it can easily be a disaster in the end. And I really hope I'm not right, because I want the best possible PS5. If that means delaying and redesigning the thing, then so be it, but unfortunately, after announcing all of this, there is almost no chance of that.

That article where they talk about the console being too hot is 99% bullshit; don't worry about that.
> You have me on ignore, yet you replied to me 5 minutes later. Ya, ok.

Yeah, I think this is it.

As for your reply that heating issues can make the system decrease clocks if things get hectic, that's exactly what many of us have been saying the whole time.

The system cannot maintain max speeds of 3.5 and 2.23 when it runs into tough operations that lead to heat issues for a prolonged time. So that's why it downclocks.

And that's where we all get the vague "most of the time" statements, because nobody knows how often it will happen, when it will happen, or how much it will happen.
Yeah, I think this is it.
So here is how I understand all this:
Cerny wanted to boost the system somehow. He achieved this by introducing a constant power draw. Constant power draw means constant fan speed and noise. This constant power budget allows the clocks to be boosted to 3.5GHz and 2.23GHz respectively. The system can theoretically do this at the same time, but if both ran at max clocks all the time, the system would overheat at that constant fan speed. To solve this problem he introduced variable clock rates.

This doesn't sound too bad in theory, but it will make game designers' jobs more difficult. I already mentioned an example: dynamic open-world games would suffer from this, because what happens in such a game cannot be controlled by the developers all the time. If, let's say, there are too many characters on screen in a moment when everything is blowing up, it can tank the framerate. Well, this also happens with existing consoles, so maybe it's not as bad as I imagined. I just hope this doesn't mean developers will "downgrade" their games because they want to avoid such cases.
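If that reading is right (a roughly fixed power budget, with clocks trimmed just enough to fit the hottest workloads), the control loop would look something like the sketch below. The wattages are invented and the cubic power model is the same assumption as above; the real logic models chip activity and is certainly more sophisticated.

```python
# Sketch of a fixed-power-budget clock controller, per the understanding
# in the post above: power draw (and therefore fan behaviour) stays
# constant, and the clock is reduced only as much as needed to fit a hot
# workload into the budget. Wattages are invented; P ~ f^3 is assumed.

GPU_MAX_GHZ = 2.23
GPU_BUDGET_W = 140.0                 # hypothetical GPU power budget

def clock_for(demand_w_at_max: float) -> float:
    """Highest clock (GHz) keeping a workload within the power budget.

    demand_w_at_max is what the workload would draw at 2.23 GHz; scaling
    the clock by r scales power by r**3 under the cubic assumption.
    """
    if demand_w_at_max <= GPU_BUDGET_W:
        return GPU_MAX_GHZ                                   # stay at max
    ratio = (GPU_BUDGET_W / demand_w_at_max) ** (1.0 / 3.0)
    return GPU_MAX_GHZ * ratio                               # shave a few %

for demand in (120.0, 140.0, 155.0):   # watts the frame would draw at max clock
    print(f"{demand:5.1f} W demand -> {clock_for(demand):.2f} GHz")
# Even a frame demanding ~10% over budget only costs ~3% of clock speed,
# which is consistent with the "at or near top frequency" claim.
```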
> In short, the idea is that developers may learn to optimise in a different way, by achieving identical results from the GPU but doing it faster via increased clocks delivered by optimising for power consumption…

That bolded paragraph doesn't even make sense.
lol. Actual factual information? Not one thing you said in that entire post made sense or proved a point. Then you say the PS5 is a shit design, yet you literally cannot prove why. Why is the PS5 a "shit design"? The burden of proof is on you, Lead System Architect. Give a good answer or I'm reporting you for console warring and trolling.
You are concern trolling and ignoring arguments made left and right, while providing no data of your own beyond a hyper-cynical look at the data Cerny provides, and you expect others to do the homework for you while you just shit on things? You keep doing you... just do not throw a fit if people do not buy into what you are saying and poke holes in it.
> Yeah, I think this is it. So here is how I understand all this: …

People have been bringing up the "the GPU only needs to downclock 2% max" kind of thing, and if it does that, everything works out. Who knows how true or false the 2% claim is.
> In other words, PS5 has trouble running at full 3.5 and 2.23 at all times. If there weren't issues, just leave it maxed out at all times like every other console. So it has to downclock to cut back on power and heat. That's exactly what all of us noticed two weeks ago.

Maybe I'm misunderstanding something, but it seems to me that Cerny says that they could both run at full speed all the time if necessary, but since that apparently is never the case, the speeds can be downclocked.
> In theory it can, but in practice it never would. Devs will decide how much it should take and plan accordingly. Devs not wanting to budget too much to it and the Tempest hardware not being fully utilized could be a reality. Speaking to the 20GB/s, even that is a ton and I doubt it will hit that number often.

I hope not; the majority of people don't give a fuck about 3D audio on crappy TV speakers.
Nobody is saying the ultra-fast SSD will give the PS5 an extra teraflop. Why would anyone say that? Teraflops don't matter anyway; that 18% advantage will only result in a higher screen output resolution that you can't notice even in YouTube comparison videos.
People are saying (including developers) that the PS5 SSD will allow an unprecedented and unmatched insane texture details and more varied art. A developer explained this in detail in one of his tweets.
In short, to get instant access to urgent data, more of it needs to be stored in RAM on the current generation consoles - opening the door to a huge efficiency saving for next-gen. The SSD alleviates a lot of the burden simply because data can be requested as it's needed, as opposed to caching a bunch of it that the console may need... but may not. There are further efficiency savings because duplication is no longer needed. Much of a hard drive's latency is a factor of the fact that a mechanical head is moving around the surface of the drive platter. Finding data can take as long as - or longer than - reading it. Therefore, the same data is often duplicated hundreds of times simply to ensure that the drive is occupied with reading data as opposed to wasting time looking for it (or "seeking" it).
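The collapse from a 50-100MB/s target to 8MB/s makes sense once seek time enters the arithmetic. Below is a minimal model of seek-dominated throughput; the seek time and chunk sizes are illustrative assumptions, and only the throughput figures are from the article.

```python
# Why hard-drive throughput collapses without duplication: every read of
# a scattered asset pays a mechanical seek before any data moves. The
# seek time and chunk sizes here are illustrative.

RAW_SPEED_MBS = 100.0    # sequential speed once the head is in place
SEEK_MS = 12.0           # illustrative average seek + rotational latency

def effective_mbs(chunk_mb: float) -> float:
    """Throughput when every chunk_mb read needs a fresh seek."""
    read_s = chunk_mb / RAW_SPEED_MBS
    return chunk_mb / (SEEK_MS / 1000.0 + read_s)

for chunk_mb in (0.1, 1.0, 10.0):
    print(f"{chunk_mb:5.1f} MB chunks -> {effective_mbs(chunk_mb):6.1f} MB/s")
# 0.1 MB scattered reads land at ~8 MB/s, matching the collapsed figure;
# duplicating assets next to their neighbours buys big contiguous reads.
```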
"Marvel's Spider-Man is a good example of the city block strategy. There are higher LOD and lower LOD representations for about a thousand blocks. If something is used a lot it's in those bundles of data a lot," says Cerny.
Without duplication, drive performance drops through the floor - a target 50MB/s to 100MB/s of data throughput collapsed to just 8MB/s in one game example Cerny looked at. Duplication massively increases throughput, but of course, it also means a lot of wasted space on the drive. For Marvel's Spider-Man, Insomniac came up with an elegant solution, but once again, it leaned heavily on using RAM.
"Telemetry is vital in spotting issues with such a system, for example, telemetry showed that the city database jumped in size by a gigabyte overnight. It turned out the cause was 1.6MB of trash bags - that's not a particularly large asset - but the trash bags happened to be included in 600 city blocks," explains Mark Cerny. "The Insomniac rule is that any asset used more than four hundred times is resident in RAM, so the trash bags were moved there, though clearly there's a limit to how many assets can reside in RAM."
It's another example of how the SSD could prove transformative to next-gen titles. The install size of a game will be more optimal because duplication isn't needed; those trash bags only need to exist once on the SSD - not hundreds or thousands of times - and would never need to be resident in RAM. They will load with latency and transfer speeds that are a couple of orders of magnitude faster, meaning a 'just in time' approach to data delivery with less caching.
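The trash-bag anecdote checks out numerically, and Insomniac's stated rule is simple to express. A small sketch using only the figures quoted above; the function and its structure are mine, not Insomniac's actual tooling.

```python
# The Spider-Man trash-bag anecdote, checked: 1.6 MB duplicated into 600
# city blocks is roughly the one-gigabyte overnight jump Cerny describes.
dup_mb = 1.6 * 600
print(f"1.6 MB x 600 blocks = {dup_mb:.0f} MB (~{dup_mb / 1024:.1f} GB)")

# Insomniac's stated rule: any asset used more than 400 times is resident
# in RAM. On an SSD neither per-block duplication nor RAM residency is
# needed for speed; the asset lives once on disk and loads just in time.
RAM_RESIDENCY_THRESHOLD = 400

def hdd_plan(asset_mb: float, uses: int) -> str:
    """Hypothetical HDD-era placement decision for one asset."""
    if uses > RAM_RESIDENCY_THRESHOLD:
        return f"keep one copy resident in RAM ({asset_mb} MB)"
    return f"duplicate into {uses} blocks ({asset_mb * uses:.0f} MB on disk)"

print("HDD plan for trash bags:", hdd_plan(1.6, 600))
print("SSD plan for trash bags: store once (1.6 MB), stream on demand")
```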
Behind the scenes, the SSD's dedicated Kraken compression block, DMA controller, coherency engines and I/O co-processors ensure that developers can easily tap into the speed of the SSD without requiring bespoke code to get the best out of the solid-state solution. A significant silicon investment in the flash controller ensures top performance: the developer simply needs to use the new API. It's a great example of a piece of technology that should deliver instant benefits, and won't require extensive developer buy-in to utilise it.
A whole lot of “stuff”