
DF Direct: The PS5

mcjmetroid

Member
I admit I was expecting a "PS5 reveal", but this was not it; this was more a behind-the-scenes look at how they created the PS5 than a proper reveal.
And this is the problem: a lot of people, my casual gaming friends included, expected to hear about that, obviously got bored, and said "fuck this, I'm just gonna buy a PC".

They can't be the only ones. I get what it was, but it was for sure a marketing fuck-up. People expected to actually see the PS5 and some tech demos.

Now is the key time for Xbox. Hit them hard.
 

darkinstinct

...lacks reading comprehension.
Xbox One vs PS4: 66% of the CUs, 7% higher clock speed.

A 27% deficit in raw TF led to a ~30% reduction in resolution, with lower memory bandwidth leading to lower fidelity. Clocking higher barely had an effect on overcoming the gap.

PS5 vs XSX: 69% of the CUs, up to 22% higher clock speed.

A minimum 15% deficit in raw TF may lead to at least a 15% reduction in resolution (1800p would be ~30% fewer pixels than 4K). Lower memory bandwidth will lead to lower fidelity.
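For what it's worth, those percentages can be checked against the commonly cited spec figures (PS4: 18 CUs @ 800 MHz / 1.84 TF; XB1: 12 CUs @ 853 MHz / 1.31 TF; XSX: 52 CUs @ 1825 MHz / 12.15 TF; PS5: 36 CUs @ up to 2230 MHz / 10.28 TF), which are treated as assumptions in this sketch:

```python
# Quick check of the percentages above, using the commonly cited spec figures
# (treated as assumptions here, not numbers confirmed in this thread).

def deficit(low, high):
    """Percentage shortfall of `low` relative to `high`."""
    return (1 - low / high) * 100

# Xbox One vs PS4
print(f"CU ratio:   {12 / 18:.0%}")                 # ~67% of the CUs
print(f"Clock gain: {853 / 800 - 1:.0%}")           # ~7% higher clock
print(f"TF deficit: {deficit(1.31, 1.84):.0f}%")    # ~29%

# PS5 vs XSX
print(f"CU ratio:   {36 / 52:.0%}")                 # ~69% of the CUs
print(f"Clock gain: {2230 / 1825 - 1:.0%}")         # ~22% higher clock
print(f"TF deficit: {deficit(10.28, 12.15):.0f}%")  # ~15%

# 1800p (3200x1800) vs native 4K (3840x2160) in pixel count
print(f"1800p vs 4K: {deficit(1800 * 3200, 2160 * 3840):.0f}% fewer pixels")  # ~31%
```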
 

Business

Member
After finishing the vid, the biggest question mark remaining for me is the storage expansion.

It's gonna take a specialized PCIe 4.0 NVMe drive to fit in the PS5, right? That can't be cheap.

Hate to say it, but I might prefer MS' solution here.

I think an open system where you can use PC disks, even if they have to be 'certified' because the industry doesn't have standard speeds and physical measurements, is superior to a proprietary storage expansion solution.
 

GHG

Gold Member
He is not grasping the velocity engine and the additional 100 GB of instantly accessible storage, like a cartridge of old... this will be the game changer.

Speeds aside, those look great on paper, I do agree there, but they will stay on paper.

The "velocity engine" was the marketing they cooked up the moment they realised they would fall short in that area.

I'm glad you're enjoying sipping on it.
 

sinnergy

Member
A damage control video that was made from the GDC presentation that got cancelled because of Coronavirus? JFC. The ignorance on this forum is overwhelming.
Yet MS had excellent coverage, and they're also dealing with Corona. Sony has been terrible with their PR. I've watched lots of GDC talks through the years; this was so terrible, and in layman's terms it was embarrassing imo.
 

GHG

Gold Member
If you saw the video you'd see that Cerny stated that 10.3 teraflops is the typical performance; this isn't a PC-style boost clock.

The reality is that both the CPU and the GPU cannot run at their advertised max boost clocks at the same time.

That's a problem and creates yet another balancing act for developers that they probably didn't want to have to deal with.
 

sinnergy

Member
The "velocity engine" was the marketing they cooked up the moment they realised they would fall short in that area.

I'm glad you're enjoying sipping on it.
I’ll be enjoying it in my games and DF videos for sure ! Thanks mate!
 

Jadsey

Member
So it seems the PS5 is the console equivalent of a Dexterity build in Dark Souls: all about nimbleness and speed of thought, better suited to the skilled gamer and devastating when played correctly.

The Xbox Series is the pure Strength build: very slow but hits hard, suited to the more casual gamer.

I myself play a low Intelligence build :)
 

GymWolf

Member
Why would it drop in demanding scenes? That makes no sense.
The exact opposite is what you are looking for with variable clocks. Drop the clock in scenes that are not demanding to let the CPU/GPU rest and push the clocks up when something big happens in the game.
This is the exact opposite of what Cerny said: max boost clock in normal scenes, toned-down clocks for heavy stuff.
So basically the console works at its worst when real power is needed.
 

ZehDon

Gold Member
I’m confused on a couple of points:
if the system can sustain the "boost" speeds... then it's not a "boost", it's just... the speed the thing operates at. If he meant to explain that the system clocks down when under less load to reduce power and heat, that's fundamentally different from "boosting" when under load. Which is it?

Cerny explains that the CPU and GPU power draw dictates the clock speeds, and that this is a deterministic process. But he also says you can trade CPU power for GPU power. Does this mean that the system can theoretically go higher than the caps listed, where I can trade CPU clock for more GPU clock, or does this actually mean that it can’t sustain both cap clock speeds simultaneously?

36CU at higher clocks is designed to widen the pipe, so to speak, rather than offer up more pipes. However, this limits parallelism in this aspect of the hardware. In modern software trends, threading and parallelism is basically the only way to scale up efficiently. Why did Sony go against the grain, so to speak? The amount of parallelism in modern graphics programming needs as many pipes as it can get.

Clock to heat ratio is not linear; it’s exponential. AMD have had issues with this in their hardware with their consumer cards, where they ran much hotter than their nVidia counterparts. While I don’t doubt Sony’s thermal solution will be sufficient, the noise of the PS4 Pro is not really acceptable for a consumer grade appliance. What thermal solution can Sony offer up that would be power efficient, silent, and sufficient, for those boost clock speeds?
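On the clock-to-heat point: dynamic CMOS power scales roughly with voltage squared times frequency, and voltage has to rise with frequency near the top of the range, so power (and heat) grows roughly with the cube of the clock. A toy sketch of that relationship, with made-up numbers and nothing PS5-specific:

```python
# Toy model of super-linear clock-to-power scaling: P ~ C * V^2 * f, with V
# assumed to track f near the top of the curve, giving P ~ f^3. Illustrative
# only; real voltage-frequency curves are lookup tables, not a neat cube law.

def relative_power(f_ratio, v_tracks_f=True):
    """Power relative to baseline, for a clock at `f_ratio` of baseline."""
    v_ratio = f_ratio if v_tracks_f else 1.0
    return (v_ratio ** 2) * f_ratio

for cut in (0.98, 0.95, 0.90):
    saving = 1 - relative_power(cut)
    print(f"{1 - cut:.0%} lower clock -> ~{saving:.0%} less power")
```

Under that assumption, shaving a few percent off the clock buys a disproportionate power (and heat) reduction, which is roughly the trade Cerny described.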

The audio engine is compared to, basically, sticking the entire PS4 CPU on the board and dedicating it to audio. Do developers have more control over this piece of the hardware, or is this a "hands off" area that uses the power independently to produce Sony's desired audio output? If developers can control it, can it only be used for audio? How does it interface with the rest of the machine?
 

Azurro

Banned
The reality is that both the CPU and the GPU cannot run at their advertised max boost clocks at the same time.

That's a problem and creates yet another balancing act for developers that they probably didn't want to have to deal with.

The situation where both the CPU and GPU need to run at 100% of their capability at the same time is very, very rare, and if it does happen, it will only downclock the GPU by a few percentage points. It's not ideal; I don't like this design or this console, it's a safe and middling generational leap, but I feel that because of the way the information was presented, a lot of misinformation is spreading right now.
 

Shmunter

Member
This power management may actually become standard in the near future. As mentioned, the PS4 goes nuts on menu screens because of the load/power mismatch. All of this disappears with the new approach.

It's also important to note there is no unexpected thermal throttling; it's all about balancing demand in efficient ways, unrelated to heat. If software requires full tilt, be my guest; playing an indie 2D platformer, or sitting in an inventory screen, scale it down.

It's like dynamic resolution in software to maintain FPS, but applied to hardware.
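The "dynamic resolution, but for hardware" idea can be sketched as a governor that picks the highest clock whose estimated power fits a fixed budget. Every number below is invented for illustration; nothing here is an actual PS5 figure:

```python
# Toy power-budget governor: deterministic in the workload, blind to ambient
# temperature. All figures are invented for illustration.

POWER_BUDGET_W = 200.0   # fixed budget the chip may never exceed
MAX_CLOCK_GHZ = 2.23

def estimated_power(clock_ghz, activity):
    """Crude model: power scales with workload activity and ~clock^3."""
    return 230.0 * activity * (clock_ghz / MAX_CLOCK_GHZ) ** 3

def pick_clock(activity, step_ghz=0.01):
    """Highest clock whose estimated power fits under the budget."""
    clock = MAX_CLOCK_GHZ
    while clock > 0 and estimated_power(clock, activity) > POWER_BUDGET_W:
        clock -= step_ghz
    return round(clock, 2)

print(pick_clock(0.3))  # light load (menus, indie 2D): 2.23, full clock
print(pick_clock(1.0))  # pathological full-tilt load: 2.12, a few % lower
```

The point of the model: the same workload always yields the same clock, regardless of room temperature, because temperature never enters the calculation.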
 
You will be disappointed.

PC GPUs can only sustain boost clocks with water cooling or really loud, heavy air cooling.

Your only hope is that Sony somehow managed to build a cheap, safe and effective water cooling system for the PS5.
GPU boost on a gaming PC vs the PS5 is different.
It's in the video. On PC, overclock/boost behaviour is tied to thermal and power constraints.

In the PS5, it's designed around a fixed power (and therefore thermal) load. He literally talks about how having a console perform differently in a hot room vs a cold room is bad for the end user. So the CPU and GPU frequency is tied to the workload specifically, so it performs exactly the same in all environments.

The operating frequency of the chip is dependent on the workload, not the ambient temperature.
 

sinnergy

Member
This power management may actually become standard in the near future. As mentioned, the PS4 goes nuts on menu screens because of the load/power mismatch. All of this disappears with the new approach.

It's also important to note there is no unexpected thermal throttling; it's all about balancing demand in efficient ways, unrelated to heat. If software requires full tilt, be my guest; playing an indie 2D platformer, or sitting in an inventory screen, scale it down.

It's like dynamic resolution in software to maintain FPS, but applied to hardware.
Which takes extra effort. You don't want throttling; it's weird for a console. That's why the Switch / PS4 / Series X etc. are fixed.

To me this screams catching up to MS. Weird decision.
 

longdi

Banned
GPU boost on a gaming PC vs the PS5 is different.
It's in the video. On PC, overclock/boost behaviour is tied to thermal and power constraints.

In the PS5, it's designed around a fixed power (and therefore thermal) load. He literally talks about how having a console perform differently in a hot room vs a cold room is bad for the end user. So the CPU and GPU frequency is tied to the workload specifically, so it performs exactly the same in all environments.

The operating frequency of the chip is dependent on the workload, not the ambient temperature.
Power = heat.
Like I say, unless he managed to cram a liquid AIO into the PS5, prepare to be disappointed. Unless you are playing Minecraft RT, next-gen games won't allow the PS5 to sustain 2.23 GHz under full load.

It's really a marketing cover-up from Mark. It just doesn't make sense.

I have a Ryzen 3950X; AMD built their new chips fully around temperature control.
 
Power = heat.
Like I say, unless he managed to cram a liquid AIO into the PS5, prepare to be disappointed. Unless you are playing Minecraft RT, next-gen games won't allow the PS5 to sustain 2.23 GHz under full load.

It's really a marketing cover-up from Mark. It just doesn't make sense.

I have a Ryzen 3950X; AMD built their new chips fully around temperature control.
Yes, power = heat.
Which is why having a fixed power budget means you'll have a fixed heat output.
The power cannot go higher than the designed limit, so the heat output cannot go higher.
If the PS5 is designed to use a maximum of 275 W, for example, then as long as the cooler can dissipate 275 W, it'll be fine. In all likelihood, they'll have engineered the cooler to handle more than the maximum power, for safety.
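To put that in concrete terms (the 275 W figure is the poster's example, and the cooler margin is an assumption, not an official spec):

```python
# With a hard power cap, worst-case heat output is known at design time,
# so the cooler only has to be sized once, with some margin. Figures below
# are the poster's example plus an assumed headroom value.

POWER_CAP_W = 275.0        # fixed budget the silicon can never exceed
COOLER_CAPACITY_W = 300.0  # cooler engineered with headroom above the cap

def cooler_sufficient(requested_draw_w):
    """True if the cooler covers the actual draw, which is clamped to the cap."""
    actual_draw = min(requested_draw_w, POWER_CAP_W)
    return actual_draw <= COOLER_CAPACITY_W

# Even a workload that "asks" for far more power stays within the cap:
print(cooler_sufficient(500.0))  # True
```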
 

Panajev2001a

GAF's Pleasant Genius
Power = heat.
Like I say, unless he managed to cram a liquid AIO into the PS5, prepare to be disappointed. Unless you are playing Minecraft RT, next-gen games won't allow the PS5 to sustain 2.23 GHz under full load.

It's really a marketing cover-up from Mark. It just doesn't make sense.

I have a Ryzen 3950X; AMD built their new chips fully around temperature control.

It looks like AMD and Sony have designed the cooling, noise, and power profile around fixed power consumption, with likely voltage-controlled frequency variation (a 1-2% frequency reduction bringing a 10% or greater reduction in power consumption, and thus heat, as they likely adjust voltage instead of just lowering frequency blindly). Not cheap, not rushed to hit a magic marketing number, and yes, it would be pointless if the chip could only spend a tiny fraction of its time above 1.8 GHz, but you are free to have reasonable doubt.
 

MrRenegade

Report me if I continue to troll
Power = heat.
Like I say, unless he managed to cram a liquid AIO into the PS5, prepare to be disappointed. Unless you are playing Minecraft RT, next-gen games won't allow the PS5 to sustain 2.23 GHz under full load.

It's really a marketing cover-up from Mark. It just doesn't make sense.

I have a Ryzen 3950X; AMD built their new chips fully around temperature control.
Yeah, sure. They will announce the games and you will still be calculating heat dissipation. Move on.
 

Shmunter

Member
Which takes extra effort. You don't want throttling; it's weird for a console. That's why the Switch / PS4 / Series X etc. are fixed.

To me this screams catching up to MS. Weird decision.
Yeah, I'm no fan of the approach, just trying to make sense of what Sony is up to.

We're simple creatures; we want to put a nice bow tie on some simple like-for-like figures and call it a day. Seemingly we have a curveball here that's making heads spin.
 

GymWolf

Member
You mean games that no one cares about maybe other than Ori because its new and the only thing xbox has to play next considering Halo is months away?
Cuphead was GOTY for a lot of people that year, though.
 

martino

Member
Power = heat.
Like I say, unless he managed to cram a liquid AIO into the PS5, prepare to be disappointed. Unless you are playing Minecraft RT, next-gen games won't allow the PS5 to sustain 2.23 GHz under full load.

It's really a marketing cover-up from Mark. It just doesn't make sense.

I have a Ryzen 3950X; AMD built their new chips fully around temperature control.

And what do you think of "fewer CUs is better because it's difficult to fill more"? :D
 

longdi

Banned
It looks like AMD and Sony have designed the cooling, noise, and power profile around fixed power consumption, with likely voltage-controlled frequency variation (a 1-2% frequency reduction bringing a 10% or greater reduction in power consumption, and thus heat, as they likely adjust voltage instead of just lowering frequency blindly). Not cheap, not rushed to hit a magic marketing number, and yes, it would be pointless if the chip could only spend a tiny fraction of its time above 1.8 GHz, but you are free to have reasonable doubt.
Then that's clock stretching. We see that all the time with Zen 2: while you can undervolt it, it will display the same or higher clocks, but performance is lowered.

I can show a 3950X running all cores at 4.2 GHz at 1.22 V, but it will render worse than all cores at 4.05 GHz at 1.28 V.
 

GymWolf

Member
Oh look, DF is actually excited about the PS5 and slaps down stupid TF console warriors in the first few minutes of the video.
Curious to see how they're gonna spin and retract that statement for every multiplat game that runs better on the sex (so basically 99% of them, most probably).
 

Azurro

Banned
What he said doesn't make common sense unless, like I say, the PS5 is using water AIO cooling.

No, you don't understand, this isn't boost clock as you have seen before. This is a fixed power budget and acts in a deterministic way. It is not tied to thermal constraints but instead to a power budget.

Oh wait, I see your other posts, you are just trolling.
 

longdi

Banned
No, you don't understand, this isn't boost clock as you have seen before. This is a fixed power budget and acts in a deterministic way. It is not tied to thermal constraints but instead to a power budget.

Oh wait, I see your other posts, you are just trolling.
Which parts am I trolling?
Look up AMD clock stretching.

Modern GPUs and CPUs are totally designed around thermals. Their voltage-frequency curve is fully determined by temps.

So either Sony has really expensive cooling, or they are just going to display high clocks all the time while real performance drops under load.
 

GymWolf

Member
Which parts am I trolling?
Look up AMD clock stretching.

Modern GPUs and CPUs are totally designed around thermals. Their voltage-frequency curve is fully determined by temps.

So either Sony has really expensive cooling, or they are just going to display high clocks all the time while real performance drops under load.
What is not clear to me is how you can exclude ambient temps from the equation.
I live in Catania, one of the hottest cities in Italy during summer (and the rest of the year, tbf). The difference between my room and a room in a far colder city is going to be around 20+ °C. How can a system not take such a giant difference in external temps into account? Also, the majority of people (me, for example) never clean their console; doesn't dust accumulated over the years have some sort of impact on internal temps?

Serious question, not trolling.
 
What is not clear to me is how you can exclude ambient temps from the equation.
I live in Catania, one of the hottest cities in Italy during summer (and the rest of the year, tbf). The difference between my room and a room in a far colder city is going to be around 20+ °C. How can a system not take such a giant difference in external temps into account? Also, the majority of people (me, for example) never clean their console; doesn't dust accumulated over the years have some sort of impact on internal temps?

Serious question, not trolling.

Because they go by power draw, basically, not temperatures, to determine clock speed.

That is why a PS5 at the South Pole will have the same performance as a PS5 in a hut in the jungle. All machines need to perform in a completely deterministic and predictable way.
 
I think people forget that the problem with the Pro/X wasn't that the GPU wasn't good enough, but that the CPU was shit even by 2013 standards. If PlayStation could deliver games like GOW and GoT with that old GPU/CPU, then I think there's nothing to worry about, since it seems PlayStation is going for performance rather than resolution this time around, which is fine by me.

In any case, it's too early to give this any thought until games are shown and played next year.
 

wintersouls

Member
When the first games for both hit the street, more than one of those complaining that the PS5 is not powerful is going to have to shut their mouth. Wait for studios like ND or Santa Monica to show what they have, and then we'll talk.
 

McRazzle

Member
I think people forget that the problem with the Pro/X wasn't that the GPU wasn't good enough, but that the CPU was shit even by 2013 standards. If PlayStation could deliver games like GOW and GoT with that old GPU/CPU, then I think there's nothing to worry about, since it seems PlayStation is going for performance rather than resolution this time around, which is fine by me.

In any case, it's too early to give this any thought until games are shown and played next year.
Except the bandwidth is abysmal in the PS5.
Check out the specs of the AMD 5700 XT, which has an almost identical configuration and performance.
It has the same bandwidth as the PS5 at 448 GB/s, and it's not even capable of ray tracing.
The PS5 has to share its bandwidth with an 8-core/16-thread CPU clocked at 3.5 GHz.

Sony needs to go home and get its shine box.

From TechPowerUp:

Clock Speeds
Base Clock : 1605 MHz
Game Clock : 1755 MHz
Boost Clock : 1905 MHz
Memory Clock : 1750 MHz : 14000 MHz effective

Memory
Memory Size : 8 GB
Memory Type : GDDR6
Memory Bus : 256 bit
Bandwidth : 448.0 GB/s

Render Config
Shading Units : 2560
TMUs : 160
ROPs : 64
Compute Units : 40
L2 Cache : 4 MB

Theoretical Performance
Pixel Rate : 121.9 GPixel/s
Texture Rate : 304.8 GTexel/s
FP16 (half) performance : 19.51 TFLOPS (2:1)
FP32 (float) performance : 9.754 TFLOPS
FP64 (double) performance : 609.6 GFLOPS (1:16)
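Those theoretical numbers can be reproduced from the listed configuration with the standard formulas (nothing assumed beyond the specs quoted above):

```python
# Sanity check of the TechPowerUp figures from the listed config:
# FP32 FLOPS = shading units x 2 ops per cycle (FMA) x boost clock;
# bandwidth  = (bus width / 8) bytes x effective memory clock.

shading_units = 2560
boost_mhz = 1905
bus_bits = 256
mem_effective_mhz = 14000

fp32_tflops = shading_units * 2 * boost_mhz * 1e6 / 1e12
bandwidth_gbs = (bus_bits / 8) * mem_effective_mhz * 1e6 / 1e9

print(f"FP32: {fp32_tflops:.3f} TFLOPS")       # 9.754
print(f"Bandwidth: {bandwidth_gbs:.1f} GB/s")  # 448.0
```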
 
Except the bandwidth is abysmal in the PS5.
Check out the specs of the AMD 5700 XT, which has an almost identical configuration and performance.
It has the same bandwidth as the PS5 at 448 GB/s, and it's not even capable of ray tracing.
The PS5 has to share its bandwidth with an 8-core/16-thread CPU clocked at 3.5 GHz.

Sony needs to go home and get its shine box.

From TechPowerUp:

Clock Speeds
Base Clock : 1605 MHz
Game Clock : 1755 MHz
Boost Clock : 1905 MHz
Memory Clock : 1750 MHz : 14000 MHz effective

Memory
Memory Size : 8 GB
Memory Type : GDDR6
Memory Bus : 256 bit
Bandwidth : 448.0 GB/s

Render Config
Shading Units : 2560
TMUs : 160
ROPs : 64
Compute Units : 40
L2 Cache : 4 MB

Theoretical Performance
Pixel Rate : 121.9 GPixel/s
Texture Rate : 304.8 GTexel/s
FP16 (half) performance : 19.51 TFLOPS (2:1)
FP32 (float) performance : 9.754 TFLOPS
FP64 (double) performance : 609.6 GFLOPS (1:16)
I'm no PC tech wizard, so all of that is over my head. The point is, it's an upgrade over the current gen. I remember similar "fears" about the PS4/Xbone when they were released; the games that came out this gen proved Sony/MS right.

The most important things for consoles are price and third-party support.
 

sinnergy

Member
Yeah, I'm no fan of the approach, just trying to make sense of what Sony is up to.

We're simple creatures; we want to put a nice bow tie on some simple like-for-like figures and call it a day. Seemingly we have a curveball here that's making heads spin.
Yeah, on purpose. "Less but better"? This ain't it with the PS5.
 

Amaranty

Member
Well, there is some truth to it, but some people are taking it out of context.

The example Cerny gave resulted in the two GPUs equaling the same TFLOP performance.

So instead of this:

48 CUs @ 1674 MHz = 10.28 TFLOPS

Using 36 CUs @ 2230 MHz = 10.28 TFLOPS

TFLOPS is the performance of the vector ALUs. However, at higher clock speeds rasterization performance is higher, command-buffer processing is faster in line with the clock-speed increase, and the L2 and other caches have higher bandwidth.

So while there are benefits to clocking fewer CUs very high, it's not completely a free ride: the higher the clock, the further away memory is in terms of cycles. And while Cerny said the benefits far outweigh the negatives, he is talking about two different approaches which result in the same TFLOPS.

Also, when comparing the PS5 GPU to the XSX GPU, an advantage the XSX has is that it runs at a constant performance level; there are no variables like on the PS5. We also don't know how the PS5's up-to-22% clock-speed advantage over the XSX will impact actual performance. We really need a game developer to flat out tell us the performance differences. My gut tells me the XSX GPU will still be about 20% more powerful when you factor in the 25% extra RAM bandwidth it has.

I would also be interested to see benchmarks done on a 22% lower-clocked GPU with the same TFLOPS as one clocked 22% faster.

I'm not very tech savvy, so please forgive my ignorance, but is the CU count and clock-speed difference something similar to a quad-core high-clock-speed vs octa-core lower-clock-speed CPU? I remember in the past that quad cores sometimes performed as well as or better than octa-core processors in games because devs didn't or couldn't utilize the extra cores.

Although we are talking about consoles that are supposed to last 6-7 years, so down the line we might see the XSX pulling ahead due to better optimization?
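As an aside, the 10.28 TFLOPS equality in the quoted example falls out of the standard RDNA compute formula (CUs x 64 shader lanes x 2 FMA ops per cycle x clock); a quick sketch:

```python
# RDNA-style FP32 throughput: CUs x 64 shader lanes x 2 ops per cycle (FMA)
# x clock. Both configurations in the quoted example land at ~10.28 TFLOPS.

def tflops(cus, clock_mhz):
    return cus * 64 * 2 * clock_mhz * 1e6 / 1e12

print(f"48 CUs @ 1674 MHz: {tflops(48, 1674):.1f} TFLOPS")  # 10.3
print(f"36 CUs @ 2230 MHz: {tflops(36, 2230):.1f} TFLOPS")  # 10.3
```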
 

GymWolf

Member
They can't downplay it, probably because of a paid deal 🤣
I don't think they are paid by Sony, but they can't just say out loud that the sex is the better console overall. They try to keep a non-biased tone; they have to keep the hype high for the people, and we know Sony users are double the Xbox users, so yeah.
 

Kenpachii

Member
I don't think they are paid by Sony, but they can't just say out loud that the sex is the better console overall. They try to keep a non-biased tone; they have to keep the hype high for the people, and we know Sony users are double the Xbox users, so yeah.

More like they should grow a fucking spine and actually say what they think instead of sitting there sugarcoating opinions because reasons.
 

hyperbertha

Member
XSX's SSD will not be "slow" by any stretch of the word. But the numbers Sony's custom SSD is putting out absolutely make it look slow; they make EVERYTHING look slow. I'm almost having trouble believing they're accurate.

That's why I said this generation is going to be interesting.

Sony's console absolutely is losing in the horsepower department, but this is a rare case where their secondary innovations are absolutely an edge worth factoring into the big picture.


I've always been one to roll my eyes at gimmicks that get thrown around to try and soften the impact of lower horsepower... but this is a rare case where I'd say it's possibly very much worth the effort they've put into it.
But isn't that the kind of stuff that only affects first-party games?
 