
[Digital Foundry] PS5 uncovered

TheStruggler

Report me for trolling ND/TLoU2 threads
You're not exactly grasping the concept here, bud.
Neither are you. It's a fact that the Pro version has a higher framerate compared to the X, which is the more powerful system. Most people care more about frames than resolution, which is why devs and companies like MS and Sony should give options for resolution or performance (via frame rate) going forward.
 

newtonfb

Member
Dial it back on the heavy console war vitriol.
You know what is the most annoying thing from all those threads... basically since 2013... every Sony fanboy choking on Cerny's di&k and thinking he is a god. Is it really that far-fetched that another engineer or group of engineers came up with a better design?
 
Dial it back on the console war fanbase instigating and baiting. Focus on the context of the thread topics.
You know what is the most annoying thing from all those threads... basically since 2013... every Sony fanboy choking on Cerny's di&k and thinking he is a god. Is it really that far-fetched that another engineer or group of engineers came up with a better design?
To them it's impossible, and they think they can wordsmith their way out of it. It's insanity.
 

SonGoku

Member
Ah, more funny numbers that don't take into account situational performance.
It did actually: 17-21% -> 10-10.27TF. Cerny did state it will remain at or close to peak frequency most of the time.
Furthermore, we have info from DF of devs throttling the CPU to ensure a sustained peak 10.27TF, which heavily implies cross-gen games are not taxing the CPU. And this is already the worst-case scenario, for engines not optimized around power consumption...

Just to humor you, let's say the GPU hypothetically runs at 9.2TF. That's still a smaller gap than current gen: ~30% vs ~40%.
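For anyone checking those numbers, the gap follows from the standard FP32 formula (TFLOPS = CUs × 64 shaders × 2 ops per clock × clock); a quick sketch, with 9.2TF taken purely as the post's hypothetical:

```python
# Quick check of the gap math. TFLOPS = CUs * 64 shaders * 2 ops/clock * GHz.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

ps5_peak = tflops(36, 2.23)    # ~10.28 TF
xsx = tflops(52, 1.825)        # ~12.15 TF
ps5_hypo = 9.2                 # the post's hypothetical sustained figure

print(f"gap at peak: {xsx / ps5_peak - 1:.0%}")    # ~18%
print(f"gap at 9.2TF: {xsx / ps5_hypo - 1:.0%}")   # ~32%
print(f"PS4 vs XB1:  {1.84 / 1.31 - 1:.0%}")       # ~40% (current gen)
```

So even the pessimistic 9.2TF reading gives a smaller relative gap than PS4 vs Xbox One had at launch.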
 
Lol, Alex from DF blasting ResetEra as the cesspool it is...

"How about you quote what i actually said and you will realise I did not say at all what you just typed.
how about this ResetERA moderation, how about you actually do something?

I am calling this complete and utter bullshit. I clearly never said anything of the sort and have made Videos in the topic about how the SSD enables game design.

Utter trash ResetERA - this place is utter trash."
 
*Checks username*

Larry-David-Laughter-on-Couch.gif

This is ironic considering the amount of BS you spout. Your tag is quite fitting.

You know what is the most annoying thing from all those threads... basically since 2013... every Sony fanboy choking on Cerny's di&k and thinking he is a god. Is it really that far-fetched that another engineer or group of engineers came up with a better design?

People are choking on Cerny's dick? Holy shit, you warriors are really upset about this, huh? The power difference between both systems is less than PS4/Xbone and Pro/1X, and it must just be eating you all up inside.
 
Last edited:

onQ123

Member
Spreewaldgurke Some choice quotes you can use for the OP Summary:

SSD
  • There's low level and high level access - it's the new I/O API that allows developers to tap into the extreme speed of the new hardware.
  • The concept of filenames and paths is gone in favour of an ID-based system which tells the system exactly where to find the data they need as quickly as possible.
  • With latency of just a few milliseconds, data can be requested and delivered within the processing time of a single frame, or at worst for the next frame. 'Just in time' approach to data delivery with less caching (RAM usage).
  • Developers can easily tap into the speed of the SSD without requiring bespoke code to get the best out of the solid-state solution.
  • A significant silicon investment in the flash controller ensures top performance: the developer simply needs to use the new API. Technology should deliver instant benefits, and won't require extensive developer buy-in to utilise it.
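To make the ID-based idea concrete, here is a minimal sketch of what a path-free I/O layer could look like; every name in it (`AssetRecord`, `request_asset`, the priority enum) is an illustrative assumption, not Sony's actual API:

```python
# Illustrative sketch only: a hypothetical ID-based I/O layer in the spirit
# of what DF describes. None of these names come from Sony's real API.
from dataclasses import dataclass
from enum import Enum

class ReadPriority(Enum):
    THIS_FRAME = 0   # data needed within the current frame's budget
    NEXT_FRAME = 1   # DF's stated worst case: delivered for the next frame

@dataclass
class AssetRecord:
    offset: int      # exact location on the SSD, resolved at build time
    size: int

# A build-time table replaces filenames/paths: asset ID -> exact location.
ASSET_TABLE: dict[int, AssetRecord] = {7: AssetRecord(offset=0, size=4096)}

def read_raw(offset: int, size: int, priority: ReadPriority) -> bytes:
    return bytes(size)   # stand-in for the real low-level controller read

def request_asset(asset_id: int, priority: ReadPriority) -> bytes:
    """Fetch an asset by ID: no directory walk, no filename lookup."""
    rec = ASSET_TABLE[asset_id]
    return read_raw(rec.offset, rec.size, priority)

texture = request_asset(7, ReadPriority.THIS_FRAME)
```

The "just in time" bullet also falls out of simple arithmetic: at the quoted 5.5GB/s raw throughput, a 33ms frame leaves room for roughly 180MB of fresh data per frame, so far less needs to sit in RAM as cache.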



I can picture the next GTA using a 3D scan of a whole city with a dataset too big to fit into 16GB of RAM, but because PS5 & Xbox SX will have SSDs they can stream it.
 

SonGoku

Member
I can picture the next GTA using a 3D scan of a whole city with a dataset too big to fit into 16GB of RAM, but because PS5 & Xbox SX will have SSDs they can stream it.
Indeed, remember the crazy streaming techniques GTA5 used on PS360, streaming simultaneously from disc and HDD to maximize bandwidth (a rough sketch below).
R*'s first next-gen-only game will be insane; GTA6 might be crossgen :/
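For reference, the dual-source trick is simple to sketch: issue reads against both devices concurrently so their bandwidths add. Here two temp files stand in for the optical disc and the HDD, purely for illustration:

```python
# Sketch of GTA5-style dual-source streaming: keep two storage devices busy
# at once so their bandwidths add. Temp files stand in for disc and HDD.
import tempfile
from concurrent.futures import ThreadPoolExecutor

def read_chunk(path: str, offset: int, size: int) -> bytes:
    with open(path, "rb") as f:
        f.seek(offset)
        return f.read(size)

def stream_parallel(requests: list[tuple[str, int, int]]) -> list[bytes]:
    # One worker per source: each device streams independently.
    with ThreadPoolExecutor(max_workers=len(requests)) as pool:
        return list(pool.map(lambda r: read_chunk(*r), requests))

disc = tempfile.NamedTemporaryFile(delete=False)
disc.write(b"d" * 4096)
disc.close()
hdd = tempfile.NamedTemporaryFile(delete=False)
hdd.write(b"h" * 4096)
hdd.close()

chunks = stream_parallel([(disc.name, 0, 2048), (hdd.name, 0, 2048)])
```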
 
Last edited:

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
PC clocks on the 5700 XT and similar aren't just temperature-based either... power delivery and compute load are also factors... so it looks like they've removed the temperature factor, based on the chips AND cooling solution having a consistent profile.

At the right voltage my 5700 XT clocks never vary more than 60MHz or so from game to game, and are always between 1980MHz and about 2040MHz.

If they've tuned it properly and know the profile of the cooling and APU as well as Cerny says they do, they won't have any trouble being consistent.

NXGamer estimates that the PS5 will vary by 50MHz or so. He said this weeks ago, so your findings and opinion are SPOT ON!
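A toy model of the deterministic scheme being described, where the clock is a pure function of the modeled power of the running workload rather than of temperature; the budget, the f³ power curve and the loads are all invented for illustration:

```python
# Toy model: clock is a deterministic function of modeled workload power,
# never of temperature, so every console lands on the same frequency for
# the same work. The budget and the f^3 curve are invented for illustration.
POWER_BUDGET_W = 180.0
F_MAX_GHZ = 2.23              # PS5 GPU peak clock from the DF article

def modeled_power(load: float, f_ghz: float) -> float:
    # Dynamic power ~ f * V^2, with V roughly tracking f, hence ~f^3.
    return POWER_BUDGET_W * load * (f_ghz / F_MAX_GHZ) ** 3

def pick_clock(load: float) -> float:
    """Highest clock (10MHz steps) whose modeled power fits the budget."""
    f = F_MAX_GHZ
    while f > 0.01 and modeled_power(load, f) > POWER_BUDGET_W:
        f -= 0.01
    return round(f, 2)

for load in (0.8, 1.0, 1.1):  # 1.1 = pathological "power virus" workload
    print(f"load {load:.1f} -> {pick_clock(load):.2f} GHz")
# Even a 10%-over-budget load only costs ~70MHz in this model.
```

The point of the quote above: because the model is fixed per SKU, every unit picks the same clock for the same work, which is why swings of only tens of MHz are at least plausible.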
 
NXGamer estimates that the PS5 will vary by 50MHz or so. He said this weeks ago, so your findings and opinion are SPOT ON!
Who cares what he estimates? No one has the knowledge to even make that assertion. The frequency swing is completely unknown...



It's getting really old seeing everyone running to an appeal to authority over the completely unknown, when in reality these people are not an authority of any kind.
 
Last edited:

marquimvfs

Member


A very interesting read. To me it's very clear what Cerny is trying to do. He stated very clearly that the system will not throttle and is fully capable of running at top frequencies on both CPU and GPU if needed. According to him, he's trying to optimize the system with the power budget in mind and to end a thing he calls "race to idle": that is, when a CPU or GPU running at full clock finishes a light task and then sits idle, just waiting to take part in another task when the game needs it. He wants developers to reduce the clock whenever possible, finishing all the light tasks over a longer window so as not to "waste" power running at high clocks and then idling. That's what developers saw when running light games (developed with Jaguar in mind): the CPU was running at clocks lower than the top, and some even called it a "throttle". That was not the case, and performance was not suffering, which is why the article states right after that it was more a case of calling it a "profile", for lack of a better word. Simple enough.
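The energy argument behind "race to idle" is easy to put in numbers; the voltages and the scale constant below are invented purely for illustration:

```python
# Why ending "race to idle" saves power: dynamic power ~ f * V^2, and V can
# drop with f, so doing light work slowly costs less energy for the same
# result. Voltages and the 10.0 scale constant are invented for illustration.
def energy_joules(work_units: float, f_ghz: float, volts: float) -> float:
    time_s = work_units / f_ghz          # lower clock -> longer runtime
    power_w = 10.0 * f_ghz * volts ** 2  # dynamic power model
    return power_w * time_s

race = energy_joules(1.0, 3.5, 1.20)   # full clock, finish fast, then idle
slow = energy_joules(1.0, 2.0, 0.90)   # same work spread across the frame
print(f"race to idle: {race:.1f} J, reduced clock: {slow:.1f} J")
# race to idle: 14.4 J, reduced clock: 8.1 J -> same work, ~44% less energy
```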
 
Last edited:

Neo_game

Member


really wondering what will happen on PS5 in a similar scenario
also, will the main-thread high-IPC love in games stop at some point?


CPU load is pretty much always less than 40%. Only for a few secs did it reach 61%, and ironically GPU load then dropped to 91%. What was most interesting is that this game is using 10 gigs of VRAM and 13 gigs of main RAM. This just indicates that the RAM upgrade this gen is pretty disappointing.
 

Vroadstar

Member
Who cares what he estimates? No one has the knowledge to even make that assertion. The frequency swing is completely unknown...

It's getting really old seeing everyone running to an appeal to authority over the completely unknown, when in reality these people are not an authority of any kind.

Yikes! Embarrassing.

That "18%" (minimum btw) translates to a PlayStation 4 and a half of GPU compute on top of the PlayStation 5's GPU.

It's not just teraflops; Microsoft's GPU goes places the PlayStation 5's cannot follow.

First off, my assertions are completely accurate, and 18% is the minimum because the PlayStation 5 clocks downward. You charlatans seem to really cling to that 10.28 figure like it's the result of a fixed frequency. You just run with it like it's business as usual, like you're talking to people who don't understand that it's a peak boost clock which goes down.

Secondly, the Series X will undoubtedly have more ROPs; the likely scenario is 72 or 80 vs. 64. Microsoft is running a considerably larger GPU at conservative speeds; a comparable retail unit would undoubtedly be running around 2GHz and sit more in the upper range of the stack. 72 to 80 ROPs makes much more sense for their system than 64 does. Sony's configuration is a smaller GPU that would also be in the neighborhood of 2GHz but fall more into the middle, and given the workload capability, 64 would be the peak to avoid bottlenecking on output.

More TMUs are also a given.
 

DaMonsta

Member
nope

"Developers don't need to optimise in any way; if necessary, the frequency will adjust to whatever actions the CPU and GPU are performing,...Right now, it's still difficult to get a grip on boost and the extent to which clocks may vary. "

"Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core."
It all kinda seems like doublespeak back and forth.

On one hand, they say developers don't need to optimize, but then in the same set of comments they say they expect devs to find new ways to optimize their code.

On one hand they say there's enough power to run both at max, then on the other hand they say they have different power profiles for devs to use depending on their power needs.

It all kinda seems like purposeful obfuscation, instead of giving straight answers.
 

Sagii86

Member
CPU load is pretty much always less than 40%. Only for a few secs did it reach 61%, and ironically GPU load then dropped to 91%. What was most interesting is that this game is using 10 gigs of VRAM and 13 gigs of main RAM. This just indicates that the RAM upgrade this gen is pretty disappointing.


Considering GDDR6 is 5 times faster than DDR4, I wouldn't worry about handling old titles at high fps, or 4K textures or higher for that matter. It's not the quantity that matters but the quality. Power consumption will play a huge role in next-gen memory.
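Rough arithmetic behind that comparison, assuming a typical dual-channel DDR4-3200 desktop as the baseline (the console figures are the publicly quoted specs):

```python
# Bandwidth comparison sketch. Baseline assumes dual-channel DDR4-3200;
# the console numbers are the publicly quoted specs.
ddr4_gbs = 2 * (64 / 8) * 3200 / 1000   # 2 channels * 8 bytes * MT/s ~ 51.2
ps5_gddr6_gbs = 448.0                   # 256-bit bus @ 14Gbps
xsx_fast_pool_gbs = 560.0               # Series X 10GB "GPU optimal" pool

print(f"PS5 vs DDR4: {ps5_gddr6_gbs / ddr4_gbs:.1f}x")        # ~8.8x
print(f"XSX vs DDR4: {xsx_fast_pool_gbs / ddr4_gbs:.1f}x")    # ~10.9x
# "5 times faster" is actually conservative against this baseline.
```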
 
Last edited:

ethomaz

Banned
It all kinda seems like doublespeak back and forth.

On one hand, they say developers don't need to optimize, but then in the same set of comments they say they expect devs to find new ways to optimize their code.

On one hand they say there's enough power to run both at max, then on the other hand they say they have different power profiles for devs to use depending on their power needs.

It all kinda seems like purposeful obfuscation, instead of giving straight answers.
Devs don't need to optimize for the variable frequencies to work, and that is true.
But if you optimize, you can control the workload and therefore the variation in frequency... that is true too.

The profiles are only for testing, debugging and profiling. Retail machines won't have fixed profiles.
 
Last edited:

ZehDon

Member
The one real takeaway for me was the CU count comparison - a really good piece of investigation there by DF.
While the higher clocks can produce equal performance on fewer compute units, the ceiling seems rather low for the trade-offs involved. 36 CUs compared to 52 in the Series X is now a major point of interest for me. Parallelism is a major focus for modern engines - PS5, potentially, has a real, tangible disadvantage in this area.
 

Jigga117

Member
It all kinda seems like doublespeak back and forth.

On one hand, they say developers don't need to optimize, but then in the same set of comments they say they expect devs to find new ways to optimize their code.

On one hand they say there's enough power to run both at max, then on the other hand they say they have different power profiles for devs to use depending on their power needs.

It all kinda seems like purposeful obfuscation, instead of giving straight answers.

I mentioned the same thing in previous threads: just watching the initial video you can see him contradict himself from one statement to the next. It will all come out over time.
 

ethomaz

Banned
The one real takeaway for me was the CU count comparison - a really good piece of investigation there by DF.
While the higher clocks can produce equal performance on fewer compute units, the ceiling seems rather low for the trade-offs involved. 36 CUs compared to 52 in the Series X is now a major point of interest for me. Parallelism is a major focus for modern engines - PS5, potentially, has a real, tangible disadvantage in this area.
It is a biased test.
The RX 5700 doesn't run at 2100MHz, so the performance shown in the video is with the GPU throttling.
 

DaMonsta

Member
Devs don't need to optimize for the variable frequencies to work, and that is true.
But if you optimize, you can control the workload and therefore the variation in frequency... that is true too.

The profiles are only for testing, debugging and profiling. Retail machines won't have fixed profiles.
I mean, I guess you could say devs don't "need" to do anything.

But the reality seems to be that devs will have to optimize their code in very specific ways to get the most out of PS5.
 

ethomaz

Banned
I mean, I guess you could say devs don't "need" to do anything.

But the reality seems to be that devs will have to optimize their code in very specific ways to get the most out of PS5.
Well, that is true for any hardware, even the current PS4.
 

ethomaz

Banned
In the DF video, didn't they get vastly diminishing returns on FPS as the clock went higher and higher?
Yeap.
That is why RDNA tests are useless for making any comparison.

The RDNA 2 clock/performance scaling is just too different.


Yes and no. This is a bit more specific to the PS5 and something devs won't have to consider on other platforms. Kinda like X1 eSRAM.
IMO, that kind of workload optimization is already done in all games for any platform... so it is basically the same.

Now, SSD use is a change not only in optimization but in the design phase... it is a change of game-development paradigm.
That is where game development will be different for PS5, if devs want to take full advantage of it.
 
Last edited:

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
Actually, SmartShift is a nice piece of technology; the problem is that Sony presented it in the worst way possible. It is useful for squeezing every little bit of power out of your chip if you already have a comfortable thermal/power budget, a thing I'm not so sure Sony has right now.
I firmly believe that the console was planned for 2019, and then RDNA 2 wasn't ready and neither was Sony, so they postponed to 2020. It will for sure be a nice console and first parties will be able to make some awesome games on it, but from a design and technological standpoint I don't like it; it doesn't seem to me like a balanced system, and Sony still has to prove they know how to make a silent gaming box. Being a sucker for gaming I will buy both, but the more I hear, the more I think Sony changed their plans mid-year.

The more you hear, the less you understand. That much is clear.
 

makaveli60

Member
this is exactly what I was thinking: in open-world games you can transform a calm scene into a giant mess just by playing the damn game and creating disasters; just look at how RDR2 slows down during scenes with fire, explosions, or a shitload of NPCs fighting on screen.
Exactly. I really hope they know what they are doing with this PS5, but day after day it seems to me more and more that it could easily be a disaster in the end. And I really hope I'm not right, because I want the best possible PS5. If that means delaying and redesigning the thing, then so be it, but unfortunately, after announcing all this, there is almost no chance of that.
 

SonGoku

Member
In the DF video, didn't they get vastly diminishing returns on FPS as the clock went higher and higher?
It isn't an appropriate comparison, and it's a bit misleading to present it without a disclaimer on the variables that make such a comparison futile:
  • The 5700 wasn't designed to clock that high
  • It's power-starved at higher frequencies
  • RDNA 2 is supposed to clock higher
  • The PS5 GPU's architecture, silicon design and power delivery were designed around that high frequency, which is far different from a PC gamer slapping an aftermarket cooler on a GPU and overclocking it
 
Last edited:

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
I see. Shame; the worst-case scenario always has to be taken into account = hindering what could be possible if the machine worked like previous PlayStations. I don't like this; it's similar to the concept of Lockhart hindering next gen, which I was also bitching about. It's clear now that this was designed as a less powerful machine. I'd rather they left the GPU at 9.2TF and had the machine work similarly to the PS4. I suppose this will be the first machine I buy anyway, but I'm quite disappointed. Thanks for clarifying.

You clearly do not understand what people are writing to you. The PS5 is not hindering anything that's possible on the console. You just aren't understanding how it works. If the GPU was always a constant 9.2TF, it would be a weaker console.

How can you not understand this? The design is more exotic and different, but that doesn't mean it's worse than if it was the GitHub GPU. The only thing you need to worry about is this... did Cerny create a good enough cooling system for the PS5? If he did, then the PS5 will blow you away more than any other PlayStation ever.
 

Kenpachii

Member
CPU load is pretty much always less than 40%. Only for a few secs did it reach 61%, and ironically GPU load then dropped to 91%. What was most interesting is that this game is using 10 gigs of VRAM and 13 gigs of main RAM. This just indicates that the RAM upgrade this gen is pretty disappointing.

The game was not designed for 8-core CPUs, mate; barely any game is. Next gen will peg those CPUs to oblivion, especially with the increased density of everything: AI, physics, etc. That stuff was already going to happen this generation, until devs realized the CPUs simply couldn't handle it for shit. There's also a big chance the GPU will always sit at 100% usage when dynamic resolution at 4K pushes it to its max.

People can try to sugarcoat the bad design all day long, but what they should do is give Sony lots of shit so they can still make changes by redesigning their box to get stable clocks, or even go so far as to redesign the entire box and slam in the same GPU Microsoft has.
 
Last edited:

sinnergy

Member
You clearly do not understand what people are writing to you. The PS5 is not hindering anything that's possible on the console. You just aren't understanding how it works. If the GPU was always a constant 9.2TF, it would be a weaker console.

How can you not understand this? The design is more exotic and different, but that doesn't mean it's worse than if it was the GitHub GPU. The only thing you need to worry about is this... did Cerny create a good enough cooling system for the PS5? If he did, then the PS5 will blow you away more than any other PlayStation ever.
It’s the best play station ever made!
 

Bogroll

Likes moldy games
It was about the power before the reveal, understandably. Now certain PS fans are on the defensive, and certain Xbox fans are rubbing it in. Most of us on here know that when it comes down to it there's not going to be that much in it, but if most of us are being honest, it was about that TF number. PS fans arguing now know there really isn't going to be that much difference, and Xbox fans on here should know it as well. So Xbox fans, you shouldn't argue about it now, because you're just setting yourselves up for "told you so, ner ner ner ner ner" etc. from certain PS4 people comforting themselves.
As I've already said, they are going to be fairly close, and that should be stating the obvious to most of the forum. I'm more interested now in how quiet they're going to be under constant, power-hungry, stressful games.
 

makaveli60

Member
You clearly do not understand what people are writing to you. The PS5 is not hindering anything that's possible on the console. You just aren't understanding how it works. If the GPU was always a constant 9.2TF, it would be a weaker console.

How can you not understand this? The design is more exotic and different, but that doesn't mean it's worse than if it was the GitHub GPU. The only thing you need to worry about is this... did Cerny create a good enough cooling system for the PS5? If he did, then the PS5 will blow you away more than any other PlayStation ever.
It seems that you are the one who doesn't understand what I wrote. Of course it would be weaker, but they could design their games better if they always knew their budget. With what we have now, they could squeeze in more graphics at max GPU usage, but then that might mean the CPU can't keep up = framerate tanks. Please read again and try to understand what I say before accusing me of not understanding something.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
With a 17-21% gap in GPU performance and roughly the same amount of bandwidth proportional to their GPUs' computational power, I'm curious what makes you think Sony will have to resort to parity policies?
Games will just use dynamic resolution; at worst the PS5 CPU throttles to 3GHz, but I can't see that being an issue for cross-gen games designed around Jaguar cores, as Richard pointed out.

The CPU frequency probably doesn't even have to go down that much. I'd be surprised if it drops below 3.2GHz.
 

Panajev2001a

GAF's Pleasant Genius
People can try to sugarcoat the bad design all day long, but what they should do is give Sony lots of shit so they can still make changes by redesigning their box

:LOL:, sure, we have the "pack" shitting on PS5 and calling it a crappy design out of genuine concern :rolleyes:.
It is not enough to have a well-made console; the other must be shit... people must not see any positive in it. Hence the coordinated effort to ensure no positive PS5 thread is left without "intervention", else people may start believing in such things, and that must not be allowed.
 
Last edited:

SonGoku

Member
The CPU frequency probably doesn't even have to go down that much. I'd be surprised if it drops below 3.2GHz.
Agreed; I was just making up a hypothetical worst-case scenario where a launch game didn't bother to optimize for power consumption. Even then it wouldn't be an issue, because as Richard said, cross-gen games will hardly tax the CPU (unless they aim for 120fps), and also because the power control unit would automatically redirect power to the CPU when available.

Based on the info we have, I expect next-gen-only games designed to push the consoles will balance the CPU around 3.3-3.5GHz and the GPU at 10-10.27TF across different loads.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
Who cares what he estimates? No one has the knowledge to even make that assertion. The frequency swing is completely unknown...



It's getting really old seeing everyone running to an appeal to authority over the completely unknown, when in reality these people are not an authority of any kind.

NXGamer can you explain why you said the GPU wouldn't "throttle" by more than 50 or 60MHz? Or is it just not worth your time?


It seems that you are the one who doesn't understand what I wrote. Of course it would be weaker, but they could design their games better if they always knew their budget. With what we have now, they could squeeze in more graphics at max GPU usage, but then that might mean the CPU can't keep up = framerate tanks. Please read again and try to understand what I say before accusing me of not understanding something.

The bolded part is wrong, dude. I'm not sure why you keep thinking this. SonGoku just summarized the DF article (it's better than the video, so PLEASE go read it). These points below (written by DF, by the way) are what you need to keep in mind.



  • The CPU and GPU each have a power budget. If the CPU doesn't use its power budget - for example, if it is capped at 3.5GHz - then the unused portion of the budget goes to the GPU.
  • There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower.
  • The GPU will spend most of its time at or near its top frequency in situations where the whole frame is being used productively in PS5 games. The same is true for the CPU: based on examination of situations where it has high utilisation throughout the frame, we have concluded that the CPU will spend most of its time at its peak frequency.
  • With race to idle out of the equation and both CPU and GPU fully used, the boost clock system should still see both components running near to or at peak frequency most of the time.
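A minimal sketch of the budget-sharing rule in the first bullet above; the wattages are invented, and only the transfer rule itself comes from the article:

```python
# Minimal sketch of the power-sharing rule above. All wattages are invented;
# only the "unused CPU budget shifts to the GPU" rule comes from the article.
TOTAL_CPU_SHARE_W = 60.0
TOTAL_GPU_SHARE_W = 140.0

def allocate(cpu_demand_w: float, gpu_demand_w: float) -> tuple[float, float]:
    """CPU draws up to its share; any leftover CPU budget goes to the GPU."""
    cpu_w = min(cpu_demand_w, TOTAL_CPU_SHARE_W)
    spare = TOTAL_CPU_SHARE_W - cpu_w
    gpu_w = min(gpu_demand_w, TOTAL_GPU_SHARE_W + spare)
    return cpu_w, gpu_w

# CPU-light cross-gen frame: the GPU can draw beyond its nominal share.
print(allocate(cpu_demand_w=35.0, gpu_demand_w=155.0))  # (35.0, 155.0)
# Both components fully loaded: each simply stays within its own share.
print(allocate(cpu_demand_w=60.0, gpu_demand_w=140.0))  # (60.0, 140.0)
```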
 

ZywyPL

Banned
In the DF video, didn't they get vastly diminishing returns on FPS as the clock went higher and higher?

That comparison actually made the PS5's stupidly high clocks look really bad. After Cerny's claim that higher clocks give better results compared to the same computing power achieved with lower clocks and more CUs, it turns out that even less computing power gives virtually the same results, given the same CU count. They should've really stuck to 2GHz, or even 1.9GHz, and called it a day. And to think Cerny was talking about effective utilization of CUs...
 