
Cerny: Devs Don’t Have to Optimize in Any Way for PS5’s Variable Clocks, It’s All Automatic

 
I don't think talking about it is helping; just show it already.

Ideally Sony would just use 40 CUs @ ~2150 MHz sustained clocks for similar power consumption but provide 11 TFLOPs (quick math below), use a larger GPU die in a revision for better binning, and just take the L.

Having said that, I plan on buying a PS5 even if they change nothing.

I think this approach, at the very least, would have looked much better on paper.
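For reference, the 11 TFLOPs figure above follows from the standard RDNA FP32 formula (64 shader ALUs per CU, two ops per clock via fused multiply-add); a quick sanity check in Python:

```python
# FP32 throughput for an RDNA-style GPU:
# TFLOPs = CUs * 64 ALUs * 2 ops/clock (FMA) * clock (GHz) / 1000
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(tflops(40, 2.15))  # ~11.0 -- the 40 CU @ ~2150 MHz proposal above
print(tflops(36, 2.23))  # ~10.3 -- PS5's announced 36 CUs @ up to 2.23 GHz
```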
 
All that has happened with the PS5 is the introduction of convolution for the developers

Sony is an electronics company, and it shows in how they design the system.

To me it looks a little like the PS3 era, but much more calculated and less risky, primarily because it's easier to develop for than even the PS4 (by Cerny's time-to-triangle measure).

They have made an impressive machine from an electronics standpoint. Standing on its own, it's marvelous. For game devs it may or may not be a headache.
 
- What species are you?
- 'GitHub folks'
ahahahahahahhahahahaha

There is no confusion in the minds of the early PS5 adopters, as someone here suggested. They, and I, trust Sony; when we don't understand the jargon and technical details, we don't go around spreading doom-and-gloom uncertainty and doubt. That is done by a certain group, already revealed in part to be the 'Discord folks' = 'GitHub folks', who mostly mask themselves as neutral until they can't anymore, posing naive questions or posting problematic media with no commentary to turn whatever legitimate argument there is in the thread into console warring.

There is no need for any of that. No one here is talking about Series X or making comparisons unless these 'GitHub folks' come and start some shit.
 
PS4 and Xbox One, I think, are pretty close in US NPD numbers. Xbox got killed in other countries. Even the UK didn't go great for Xbox One, where the 360 had done great.

If the SeX can do great with games and price, they can gain ground in the US, but I don't see them gaining much traction overseas, except maybe winning some of the UK back.

PS5 will outsell SeX. The gap now is roughly 100M vs 50M. It can grow if SeX bombs. But it can shrink if SeX catches on.


:messenger_tears_of_joy:
 

Hobbygaming

has been asked to post in 'Grounded' mode.
Hmmm, a lot of people in the Next Gen Spec thread said the PS5 was 13.3TF (back when TFs mattered) and that the people supporting the GitHub leaks would be eating crow. We all saw how that turned out; GitHub folks weren't the ones being served crow.
Look at what happened with the X and the Pro. Even the base PS4 had better-looking exclusives than anything found on the X.
 

manzo

Member
What a marketing disaster.

Microsoft is showing exciting things.

Sony has an engineer explaining talking points about power management.

Exactly this. Sony has a fine piece of hardware in their hands and they just keep falling flat on their face trying to explain it. At this point, I would stop trying to talk about power management or anything else, as it doesn’t change the fact that on paper, they are underpowered versus the competition.
It’s the games that matter. Show the fucking goods, not the hardware. People play games on these things. The moment you are in the game, you don’t think what hardware it’s running on.
 

farmerboy

Member
This just goes to show that a fixed system is universally better. All that has happened with the PS5 is the introduction of convolution for the developers because Sony is too proud to just take the compute L and lower clocks to a level of fixed stability.

Who cares if it lands at, say, 3.5 GHz on the CPU and 9.x teraflops on the GPU; it will still be a fine system, and it will be an easier development environment. This whole variable clocks nonsense is just damage control, that's all it is.

Funny how you guys have no trouble understanding the Xbox's split RAM setup, but variable clocks are convoluted damage control. Wow.
 

The_Mike

I cry about SonyGaf from my chair in Redmond, WA
I'm starting to get the feeling PlayStation will perform very badly this next gen. Xbox will be the preferred platform going forward.

Could be wrong, but they haven't felt like the same Sony for a while.

Doubtful. There's a very big chunk of PS players that follow Sony like a cult. If Sony shut down, they'd rather skip gaming altogether than get an Xbox.
 

ZywyPL

Banned
Cerny is the most technical. And he's the lead architect so it's his baby.

I don't see how Hermen Hulst, Jim Ryan, or any other exec would be able to get technical and clear things up.

That's probably why none of them have said anything. It's basically Cerny answering everything by default.

That's the point: he doesn't have the soft skills required to do the job. Anyone can go out and throw out the numbers/features, like Spencer did with his reveal post, and he's not a technical person. So far what Cerny is doing is nothing but damage and spreading FUD. Creepy nerds should stay in the lab, not go out on stage.
 
Look at what happened with the X and the Pro. Even the base PS4 had better-looking exclusives than anything found on the X.
Well, that's not entirely accurate and it's kind of disingenuous; look at Gears 5 as an example. That was built with the X in mind and it looks fantastic, and it's operating at 60 FPS on top of that. It's not cut and dried; they aim for different goals. I mean, Forza Horizon 4 is the best-looking racer there is, and that was also built with the X in mind. It's just a different modus operandi for rendering and performance goals, different engines, etc.

Exactly this. Sony has a fine piece of hardware in their hands and they just keep falling flat on their face trying to explain it. At this point, I would stop trying to talk about power management or anything else, as it doesn’t change the fact that on paper, they are underpowered versus the competition.
It’s the games that matter. Show the fucking goods, not the hardware. People play games on these things. The moment you are in the game, you don’t think what hardware it’s running on.
This is where things go sideways: if you have to reiterate your talking points, and people have to overtly explain your hardware, not necessarily to convince you that it's good but so you can attempt to get people to understand the logic and design, you've kind of shit the bed.

This is all x86; it's simple, and it should be kept simple. Xbox has a straightforward, tried-and-tested design amid its customization, and while Sony's doesn't diverge from the same overall design philosophy, they introduced convoluted elements that on the surface seem to clash, and even to a layman there are visible issues in terms of optimizing software around what appears to be a moving target.

Funny how you guys have no trouble understanding the Xbox's split RAM setup, but variable clocks are convoluted damage control. Wow.
We understand all of it, but one has what appears to be an inherent flaw while the other does not. A closed system that has to scale back and cannibalize compute from either the CPU or the GPU to offset itself is a weird design; there's no inherent advantage to it, and there's an inherent loss when things get too taxing. That is why everyone is saying this was not the core system design: they took a fixed platform, wanted more from it, and this was the only option forward.

For the Xbox, the memory pool largely functions the way a PC's does. The 10 GB is prioritized as VRAM, while the 3.5 GB remainder can also serve that role if called upon, though it undoubtedly will never need to. The remainder of the pool functions more like system RAM, as DDR would: CPU work, audio, background tasks, parsing the draw calls, etc., and in that environment you don't need huge bandwidth overhead.
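To make the split concrete, here's a toy sketch of that idea (an invented allocator, not Microsoft's actual API; the pool sizes and bandwidths are the public Series X figures):

```python
# Toy model of the Series X's split pools, using the public figures:
# 10 GB "GPU-optimal" @ 560 GB/s, plus 6 GB "standard" @ 336 GB/s,
# of which ~2.5 GB is OS-reserved, leaving ~3.5 GB for games.
# The allocator itself is invented, not Microsoft's API.
POOLS = {
    "gpu_optimal": {"size_gb": 10.0, "bandwidth_gbps": 560, "used_gb": 0.0},
    "standard":    {"size_gb": 3.5,  "bandwidth_gbps": 336, "used_gb": 0.0},
}

def alloc(kind: str, size_gb: float) -> str:
    # Bandwidth-hungry GPU resources prefer the fast pool; CPU work,
    # audio, and background tasks live happily in the slower pool.
    preferred = "gpu_optimal" if kind in ("render_target", "texture") else "standard"
    for name in (preferred, "gpu_optimal", "standard"):
        pool = POOLS[name]
        if pool["used_gb"] + size_gb <= pool["size_gb"]:
            pool["used_gb"] += size_gb
            return name
    raise MemoryError("out of memory")

print(alloc("texture", 4.0))        # -> gpu_optimal (wants the 560 GB/s bus)
print(alloc("audio_buffers", 0.5))  # -> standard (336 GB/s is plenty)
```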
 

BluRayHiDef

Banned
That's the point: he doesn't have the soft skills required to do the job. Anyone can go out and throw out the numbers/features, like Spencer did with his reveal post, and he's not a technical person. So far what Cerny is doing is nothing but damage and spreading FUD. Creepy nerds should stay in the lab, not go out on stage.
Wow, you're insulting the guy. What's so creepy about him? You're being extremely disrespectful. Just because someone isn't loud and extroverted doesn't make them creepy. Learn some respect; the guy is literally a genius.
 
Wow, you're insulting the guy. What's so creepy about him? You're being extremely disrespectful. Just because someone isn't loud and extroverted doesn't make them creepy. Learn some respect; the guy is literally a genius.
Mark Cerny is very much a geek creep; while I'm sure he's a great and nice guy, he appears incredibly awkward. That's a pretty universally accepted viewpoint. It's not an insult, it's an observation.
 

Kokoloko85

Member
Here come the Xbox fans trying to make it seem like an issue. You make an effort to make a console look bad, trying to act like Sony isn't showing anything.

To make it clear for Xbox fans:

PlayStation doesn't need to rush and reveal their plans just because Xbox did it. They have a set plan and goal, and MS isn't close to their league, so don't throw a hissy fit and make it seem like an issue that PlayStation hasn't been fully revealed yet. Stop acting like they need to rush and show, or Xbox wins, lmao.

Guys, your console sold half of what PlayStation did and is outsold by the Switch.

Xbox One is closer to Wii U sales than PS4 sales...
(This is a reality check for the Xbox fans. It's like an economy cola saying Coca-Cola needs to rush their ads or the economy cola wins, lol.)

Compare the likes on the PS5 logo reveal to your whole Xbox reveal....

Now stay home and play Ori, and stop pretending your console is gonna win a console war it has never won. Just because you want it to doesn't mean it will. It's not gonna be in a worse position than the PS3, no matter how badly you want to make it look bad and the Xbox look good.


Looks like I have to keep posting this so Xbox fans get it. Read it with me.

Xbox will never win the console war.

Xbox never has won any console war.

Xbox only came close, but it lost to the PS3, because of the following (read it with me, Xbox fans; it helps you understand):


Xbox will never be as successful as they once were with the Xbox 360. The only reason the Xbox 360 had four years of success over the PS3 was that it:

1. Released a whole year ahead of the PS3

2. Was cheaper than the PS3

3. Was easier to develop for than the PS3

4. Rode the boom of online gaming, online stores, indies, and DLC, none of which PlayStation had at the time

5. Had loads of PlayStation-associated games for the first time, like Devil May Cry, Resident Evil 5, Tekken, Final Fantasy, and Street Fighter IV

6. Had the better performance in most third-party games

7. Benefited from PlayStation launching horribly and all the bad press it got in those days

From 2010 onwards PlayStation made a comeback and hasn't looked lost since. Xbox doesn't have those advantages anymore, apart from maybe better performance in third-party games.
 

BluRayHiDef

Banned
I think this approach, at the very least, would have looked much better on paper.

That approach would have been more expensive, as it would have required a die with more than 40 CUs to account for chips with dysfunctional CUs, in order to ensure that they could get at least 40 working CUs per chip. Sony does not want to release another console that's as expensive as the PS3 was at its launch ($600) and therefore wants to minimize manufacturing costs, as they want to be able to sell at a profit.
 
That approach would have been more expensive, as it would have required a die with more than 40 CUs to account for chips with dysfunctional CUs, in order to ensure that they could get at least 40 working CUs per chip. Sony does not want to release another console that's as expensive as the PS3 was at its launch ($600) and therefore wants to minimize manufacturing costs, as they want to be able to sell at a profit.
Does this logic really hold water, though, considering the cost overhead of the SSD? Microsoft's and Sony's systems likely have incredibly similar BoMs.
 

BluRayHiDef

Banned
Mark Cerny is very much a geek creep; while I'm sure he's a great and nice guy, he appears incredibly awkward. That's a pretty universally accepted viewpoint. It's not an insult, it's an observation.
What makes him creepy? Please describe in detail what's creepy or awkward about him.
 
What makes him creepy? Please describe in detail what's creepy or awkward about him.
The cadence of his voice, the intonation of words, his physical mannerisms. He doesn't present as a normal individual, and he's not, so it's understandable. This is the last thing that needs defending by a community member; he's, on the surface, quite abnormal.

This is a weird sword for you to fall on.
 

BluRayHiDef

Banned
Does this logic really hold water, though, considering the cost overhead of the SSD? Microsoft's and Sony's systems likely have incredibly similar BoMs.

Yes, the argument holds water. They're saving costs on the APU (the CPU and the GPU) in order to allocate more of their overall budget to the I/O system; SSDs are typically cheaper than processors. As a side note, the PS5 is designed such that its I/O system handles some of the tasks that CPUs and GPUs typically handle, which means that its CPU and GPU don't have to be as powerful as those of the XSX.
 

farmerboy

Member
We understand all of it, but one has what appears to be an inherent flaw while the other does not. A closed system that has to scale back and cannibalize compute from either the CPU or the GPU to offset itself is a weird design; there's no inherent advantage to it, and there's an inherent loss when things get too taxing. That is why everyone is saying this was not the core system design: they took a fixed platform, wanted more from it, and this was the only option forward.

It's a new, novel way of controlling thermals. It's OK if you don't understand it; very smart people called "engineers" designed it, so not-so-smart people like you and me can enjoy it without melting our brains trying to figure it out.

The advantage is that this setup allows the console to take advantage of known downtimes (the CPU and GPU are hardly ever at 100% capacity at the same time, and therefore don't need full power ALL the time) by lowering frequency, and thus power, to lower thermals.

But hey, I suppose even Nvidia engineers like including "inherent flaws" in their designs.

What I think this all really points to is that Sony is still aiming for a small form factor.
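To put the power-sharing idea from the post above into code (a toy model, obviously not Sony's actual algorithm; the clock caps are the announced PS5 figures, and everything else, including the linear clock/power scaling, is invented):

```python
# Toy model of a fixed SoC power budget shared between CPU and GPU.
# The clock caps are the announced PS5 maximums (3.5 GHz / 2.23 GHz);
# the budget, demand figures, and linear scaling are all invented.
BUDGET_W = 200.0
CPU_CAP_GHZ, GPU_CAP_GHZ = 3.5, 2.23

def frame_clocks(cpu_demand_w: float, gpu_demand_w: float):
    """Demands = power each unit would draw this frame at its max clock."""
    total = cpu_demand_w + gpu_demand_w
    if total <= BUDGET_W:
        return CPU_CAP_GHZ, GPU_CAP_GHZ   # the typical case: both at cap
    scale = BUDGET_W / total              # rare worst case: trim both to fit
    return CPU_CAP_GHZ * scale, GPU_CAP_GHZ * scale

print(frame_clocks(40, 150))  # (3.5, 2.23) -- under budget, full clocks
print(frame_clocks(70, 180))  # (2.8, ~1.78) -- over budget, both trimmed ~20%
```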
 
Yes, the argument holds water. They're saving costs on the APU (the CPU and the GPU) in order to allocate more of their overall budget to the I/O system; no SSD costs as much as a processor. As a side note, the PS5 is designed such that its I/O system handles some of the tasks that CPUs and GPUs typically handle, which means that its CPU and GPU don't have to be as powerful as those of the XSX.
This entire post, from beginning to end, is total nonsense. Not only are SSDs at these kinds of speeds incredibly expensive, it's just an SSD. Not one aspect of this drive will take over CPU or GPU tasks; it's not a processor, and it's not a rendering device of any kind.

A 1 TB NVMe drive that is 2 GB/s slower than the one in the PS5 costs over $200...
 

Hobbygaming

has been asked to post in 'Grounded' mode.
Well, that's not entirely accurate and it's kind of disingenuous; look at Gears 5 as an example. That was built with the X in mind and it looks fantastic, and it's operating at 60 FPS on top of that. It's not cut and dried; they aim for different goals. I mean, Forza Horizon 4 is the best-looking racer there is, and that was also built with the X in mind. It's just a different modus operandi for rendering and performance goals, different engines, etc.

This is where things go sideways: if you have to reiterate your talking points, and people have to overtly explain your hardware, not necessarily to convince you that it's good but so you can attempt to get people to understand the logic and design, you've kind of shit the bed.

This is all x86; it's simple, and it should be kept simple. Xbox has a straightforward, tried-and-tested design amid its customization, and while Sony's doesn't diverge from the same overall design philosophy, they introduced convoluted elements that on the surface seem to clash, and even to a layman there are visible issues in terms of optimizing software around what appears to be a moving target.

We understand all of it, but one has what appears to be an inherent flaw while the other does not. A closed system that has to scale back and cannibalize compute from either the CPU or the GPU to offset itself is a weird design; there's no inherent advantage to it, and there's an inherent loss when things get too taxing. That is why everyone is saying this was not the core system design: they took a fixed platform, wanted more from it, and this was the only option forward.

For the Xbox, the memory pool largely functions the way a PC's does. The 10 GB is prioritized as VRAM, while the 3.5 GB remainder can also serve that role if called upon, though it undoubtedly will never need to. The remainder of the pool functions more like system RAM, as DDR would: CPU work, audio, background tasks, parsing the draw calls, etc., and in that environment you don't need huge bandwidth overhead.
There's a list of games on PS4 that look better than those. Gears 5 is a good game and it looks good

TLOU Part 2
Ghost of Tsushima
Uncharted 4
The Order 1886
Detroit
Death Stranding

All look better
 

darkinstinct

...lacks reading comprehension.
Yup. And this is exactly what happened in 2013 when MS execs were getting hassled with Q&A until launch. Shoe on the other foot.

No matter how many times a question was answered, gamers and the press kept coming back with repetitive kinds of dirt-digging.

All Spencer has done lately is a cheesy Skype meeting with IGN's McCaffrey, answering softball questions with vague answers while grinning the whole time.

Cerny is partly to blame though. He dug himself into this ditch by going into technical analysis paralysis, so tech nerds want more info and clarification. MS promoted key specs, some videos, and the form factor, and put up an online glossary. MS even showed the controller. That's enough to get everyone off their backs, and MS even did their showcase two days before Cerny.
Cerny is not to blame for anything. He built exactly the $399 console for a 2019 launch that Sony asked for, like he did in 2013 and in 2016. Only Sony got scared when they learned about the XSX targeting 2020 and being much more powerful, so they scrambled. But Cerny just did the same job: build a cost-efficient console that punches above its weight.
 
It's a new, novel way of controlling thermals. It's OK if you don't understand it; very smart people called "engineers" designed it, so not-so-smart people like you and me can enjoy it without melting our brains trying to figure it out.

The advantage is that this setup allows the console to take advantage of known downtimes (the CPU and GPU are hardly ever at 100% capacity at the same time, and therefore don't need full power ALL the time) by lowering frequency, and thus power, to lower thermals.

But hey, I suppose even Nvidia engineers like including "inherent flaws" in their designs.

What I think this all really points to is that Sony is still aiming for a small form factor.
They could have offset this entire process merely by going with a GPU with 4 additional CUs. Instead, it appears quite obvious that they built a base system that at one point was fixed in design at a lesser spec, but for whatever reason they decided to seek more from it. Given the thermal workload, voltage requirements, and bus size, they undoubtedly ran into a limitation. If you've essentially got a finished system and you've hit a wall, there's really only one way to sideline that wall: you have to introduce offset variability, where one component can throttle up while the other throttles down when called to.

I'm not saying it's a junk solution or anything, but it's undoubtedly born from a limitation that wasn't accounted for originally. I mean, they're doing what they've got to do; a system redesign would be costly not only financially but in terms of time, and it would push their release window back.

As for what Nvidia is doing, that's not really relevant, because PC code is not designed or optimized around a closed system; it's designed and generally optimized around thousands of configurations.
 

BluRayHiDef

Banned
This entire post, from beginning to end, is total nonsense. Not only are SSDs at these kinds of speeds incredibly expensive, it's just an SSD. Not one aspect of this drive will take over CPU or GPU tasks; it's not a processor, and it's not a rendering device of any kind.

A 1 TB NVMe drive that is 2 GB/s slower than the one in the PS5 costs over $200...

I never said that the SSD would take over tasks that are typically performed by CPUs and GPUs; I said that the I/O system (which includes the SSD but isn't only the SSD) would do so. For example, the PS5's I/O system has components called Coherency Engines, which - among other things - help the GPU manage its local memory (cache).

"Coherency comes up in a lot of places, probably the biggest coherency issue is stale data in the GPU caches," explains Cerny in his presentation. "Flushing all the GPU caches whenever the SSD is read is an unattractive option - it could really hurt the GPU performance - so we've implemented a gentler way of doing things, where the coherency engines inform the GPU of the overwritten address ranges and custom scrubbers in several dozen GPU caches do pinpoint evictions of just those address ranges."

Link: https://www.eurogamer.net/articles/...s-and-tech-that-deliver-sonys-next-gen-vision
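Conceptually, the win of those scrubbers is evicting only the overwritten lines instead of nuking every cache; a toy sketch of range-based eviction in general (invented names, not PS5's actual interface):

```python
# Toy illustration of "pinpoint eviction": instead of flushing an entire
# cache whenever the SSD overwrites a region of RAM, evict only the
# cached lines inside the overwritten address range. Names are invented.
class ToyCache:
    def __init__(self):
        self.lines = {}  # address -> cached data

    def flush_all(self):
        # The "unattractive option": everything is lost, hot data included.
        self.lines.clear()

    def scrub(self, lo: int, hi: int):
        # The "gentler way": evict only lines in the overwritten range.
        for addr in [a for a in self.lines if lo <= a < hi]:
            del self.lines[addr]

cache = ToyCache()
cache.lines = {0x1000: "mesh", 0x2000: "texture", 0x9000: "audio"}
cache.scrub(0x0000, 0x3000)  # SSD just overwrote this address range
print(cache.lines)           # {36864: 'audio'} -- unrelated data survives
```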
 
I never said that the SSD would take over tasks that are typically performed by CPUs and GPUs; I said that the I/O system (which includes the SSD but isn't only the SSD) would do so. For example, the PS5's I/O system has components called Coherency Engines, which - among other things - help the GPU manage its local memory (cache).

"Coherency comes up in a lot of places, probably the biggest coherency issue is stale data in the GPU caches," explains Cerny in his presentation. "Flushing all the GPU caches whenever the SSD is read is an unattractive option - it could really hurt the GPU performance - so we've implemented a gentler way of doing things, where the coherency engines inform the GPU of the overwritten address ranges and custom scrubbers in several dozen GPU caches do pinpoint evictions of just those address ranges."

Link: https://www.eurogamer.net/articles/...s-and-tech-that-deliver-sonys-next-gen-vision
That ties into the memory subsystem, not the GPU itself.
 

Leyasu

Banned
No one is on Sony's back except Xbox fans..... Sony fans have perfectly understood what Cerny said..... It is only Xbox fans who need a translator.... They can only understand TF.
lol How can you say that with a straight face? Sony fans were all "TFs and GDDR5" back in the day. They destroyed Microsoft when Microsoft claimed that the Xbone was a balanced system.

Now, Sony fans are like, "most balanced system and SSD".

But this is you writing this. I am always shocked every time I see you pop up. How you survived the banhammer is beyond me.
 
lol How can you say that with a straight face? Sony fans were all "TFs and GDDR5" back in the day. They destroyed Microsoft when Microsoft claimed that the Xbone was a balanced system.

Now, Sony fans are like, "most balanced system and SSD".

But this is you writing this. I am always shocked every time I see you pop up. How you survived the banhammer is beyond me.
Don't even bother with him, honestly; he was all about the 13.3-teraflop PS5, and teraflops in general, before the PS5 was revealed not to be that. The second that presentation went up, everyone including him switched tracks to SSD, SSD, SSD!
 

ZywyPL

Banned
That's what happens when non-dev, non-technical armchair-analyst gamers try to understand dev talk. It falls on deaf ears, gets misunderstood, and is spun the wrong way.

Gamers: Hey Mark, how much power and torque does this car have?
Cerny: The engine in our car delivers variable power, depending on how much you step on the gas.
Gamers: Oh, OK, makes sense. But what's its maximum power?
Cerny: You don't have to adjust your driving technique, it's all done automatically.
Gamers: ...

That's why companies around the world have spokespeople, not engineers, do the talking with customers.


Wow, you're insulting the guy. What's so creepy about him? You're being extremely disrespectful. Just because someone isn't loud and extroverted doesn't make them creepy. Learn some respect; the guy is literally a genius.

[image: a random frame of Cerny grabbed from a video]
 

Panajev2001a

GAF's Pleasant Genius
They could have offset this entire process merely by going with a GPU with 4 additional CUs. Instead, it appears quite obvious that they built a base system that at one point was fixed in design at a lesser spec, but for whatever reason they decided to seek more from it. If you've essentially got a finished system and you've hit a wall, there's really only one way to sideline that wall: you have to introduce offset variability, where one component can throttle up while the other throttles down when called to.

I'm not saying it's a junk solution or anything, but it's undoubtedly born from a limitation that wasn't accounted for originally. I mean, they're doing what they've got to do; a system redesign would be costly not only financially but in terms of time, and it would push their release window back.

Going from a 40 CU design to a 44 CU design and revisiting their clockspeed strategy is a mere small change since when? I will give you that the narrative you've built around them hitting a sudden wall and panicking by boosting the clocks is entertaining ;) (interesting that this is also the system they had readied for a Q4 2019 release, and they supposedly just retargeted it to very high clocks for the Q4 2020 release).

The design was built around high clocks and a fixed power budget from day one, and a sizeable part of the SoC was devoted to audio, SSD I/O, and other custom HW to reduce bottlenecks moving data in and out of RAM and to keep the GPU fed, with stalls at a minimum. I think their capped variable clocks solution and high-clockspeed final design were there from the very early days.

This design would also allow them to let the system adapt to developer workloads and intelligently boost speed, as well as react, yes react, to their competition if needed by adjusting the console's power consumption sweet spot a few percent in the last 8-10 months.
 
Going from a 40 CU design to a 44 CU design and revisiting their clockspeed strategy is a mere small change. I will give you that the narrative you've built around them hitting a sudden wall and panicking by boosting the clocks is entertaining ;) (interesting that this is also the system they had readied for a Q4 2019 release, and they supposedly just retargeted it to very high clocks for the Q4 2020 release).

The design was built around high clocks and a fixed power budget from day one, and a sizeable part of the SoC was devoted to audio, SSD I/O, and other custom HW to reduce bottlenecks moving data in and out of RAM and to keep the GPU fed, with stalls at a minimum. I think their capped variable clocks solution and high-clockspeed final design were there from the very early days.

This design would also allow them to let the system adapt to developer workloads and intelligently boost speed, as well as react, yes react, to their competition if needed by adjusting the console's power consumption sweet spot a few percent in the last 8-10 months.
A 44 (40 active) CU system could have given them the performance threshold they're aiming for now with less power draw. They would be able to run at a lower frequency; it would run cooler, and the GPU would require less voltage, less wattage, etc.

That's why I stand resolute in my view that none of this was the intended basis for the system. It very much reads like they had a fixed system, locked frequencies, and a set power draw, but wanted more from it, and to overcome the thermal, bus, and voltage limitations, something had to give, which brings us back to the computational offsets and variance between the CPU and GPU.

It seems as clear as day; I don't know what the big deal is or why this is even controversial.
 
GPUs typically manage their own cache. So this does pertain to the GPU itself, as it relieves the GPU of managing its cache itself and therefore enables it to focus its processing power on other things.
They manage their own cache because discrete GPUs have their own pool of memory. That's not the case here; it's a unified system detached from the GPU, which merely draws from it.
 

GymWolf

Member
Cerny is the most technical. And he's the lead architect so it's his baby.

I don't see how Hermen Hulst, Jim Ryan, or any other exec would be able to get technical and clear things up.

That's probably why none of them have said anything. It's basically Cerny answering everything by default.
Hermen probably helped create the Decima engine; he's probably a high-level expert himself.
 

Panajev2001a

GAF's Pleasant Genius
...
lol How can you say that with a straight face? Sony fans were all "TFs and GDDR5" back in the day. They destroyed Microsoft when Microsoft claimed that the Xbone was a balanced system.

Now, Sony fans are like, "most balanced system and SSD".

But this is you writing this. I am always shocked every time I see you pop up. How you survived the banhammer is beyond me.

We will see if the balance of the architecture changes massively between PS5 and PS5 Pro, like the massive memory architecture change they made going from Xbox One to Xbox One X (which came after boosting the clocks at literally the last minute for the Xbox One's launch, and then boosting clocks again for the Xbox One S).

You can try to make the case for the Xbox One (pre the two clockspeed boosts mentioned above) being more balanced than the PS4 and for how that improved performance and closed the gap, though. Aside from some idiotic cases, wherever this discussion occurred it still saw one system clearly on top in almost all scenarios. Some tradeoffs, such as needing higher clocks to claw back efficiency lost to running a fully virtualised CPU and GPU, paid back twice over when we started talking about enhanced BC. Also, yes, ESRAM's lower latency helped with compute efficiency for their reduced number of CUs, but even that was counterbalanced by PS4's bet on a higher ACE count and deeper queues.

The OG Xbox One was a more complex system to develop for (ESRAM + slower DDR3 main RAM management), more expensive, and with a far lower performance peak. The situation is not the same now.
 
...


We will see if the balance of the architecture changes massively between PS5 and PS5 Pro, like the massive memory architecture change they made going from Xbox One to Xbox One X (which came after boosting the clocks at literally the last minute for the Xbox One's launch, and then boosting clocks again for the Xbox One S).

You can try to make the case for the Xbox One (pre the two clockspeed boosts mentioned above) being more balanced than the PS4 and for how that improved performance and closed the gap, though. Aside from some idiotic cases, wherever this discussion occurred it still saw one system clearly on top in almost all scenarios. Some tradeoffs, such as needing higher clocks to claw back efficiency lost to running a fully virtualised CPU and GPU, paid back twice over when we started talking about enhanced BC. Also, yes, ESRAM's lower latency helped with compute efficiency for their reduced number of CUs, but even that was counterbalanced by PS4's bet on a higher ACE count and deeper queues.

The OG Xbox One was a more complex system to develop for (ESRAM + slower DDR3 main RAM management), more expensive, and with a far lower performance peak. The situation is not the same now.
I really hope we don't get "Pro" consoles. Don't get me wrong, I enjoy mine, but only because they bridge a gap in display technology; that is their purpose. That purpose wouldn't be repeated here, so the need for them is kind of non-existent.
 

Panajev2001a

GAF's Pleasant Genius
They manage their own cache because discrete GPUs have their own pool of memory. That's not the case here; it's a unified system detached from the GPU, which merely draws from it.
Unless you mean the GPU acting as the classical northbridge and managing main RAM access for both (main RAM being GDDR memory, an interface not used by the Ryzen cores but used by the RDNA2 ones), the situation is the same.

Not sure what your point is about cache management and owning their own pool of VRAM, which in a custom SoC like this one is essentially the same thing anyway.
 

BluRayHiDef

Banned
Gamers: Hey Mark, how much power and torque does this car have?
Cerny: The engine in our car delivers variable power, depending on how much you step on the gas.
Gamers: Oh, OK, makes sense. But what's its maximum power?
Cerny: You don't have to adjust your driving technique, it's all done automatically.
Gamers: ...

That's why companies around the world have spokespeople, not engineers, do the talking with customers.




[image: a random frame of Cerny grabbed from a video]
He looks tired. That doesn't equate to creepy. Also, when you grab a random frame of someone from a video, it's easy to make them look bad.
 

Panajev2001a

GAF's Pleasant Genius
I really hope we don't get "Pro" consoles. Don't get me wrong, I enjoy mine, but only because they bridge a gap in display technology; that is their purpose. That purpose wouldn't be repeated here, so the need for them is kind of non-existent.

Look, I agree with you; I would not like to have a Series S or Pro consoles. I like one base fixed spec for the entire generation, and that is it.
Companies like them because they help prices stay high and avoid the usual race to the bottom with redesigns.
 
Unless you mean the GPU acting as the classical northbridge and managing main RAM access for both (main RAM being GDDR memory, an interface not used by the Ryzen cores but used by the RDNA2 ones), the situation is the same.

Not sure what your point is about cache management and owning their own pool of VRAM, which in a custom SoC like this one is essentially the same thing anyway.
Because his original post made it appear as if tasks would be offloaded from the CPU and GPU to the SSD; it was a very deceptive post. Also, the memory being decoupled means its allocation is not tied to any one thing; it's system cache, not GPU cache.
 

farmerboy

Member
A 44 (40 active) CU system could have given them the performance threshold they're aiming for now with less power draw. They would be able to run at a lower frequency; it would run cooler, and the GPU would require less voltage, less wattage, etc.

That's why I stand resolute in my view that none of this was the intended basis for the system. It very much reads like they had a fixed system, locked frequencies, and a set power draw, but wanted more from it, and to overcome the thermal, bus, and voltage limitations, something had to give, which brings us back to the computational offsets and variance between the CPU and GPU.

It seems as clear as day; I don't know what the big deal is or why this is even controversial.

Well, I tell you what: after Cerny fucks this up, you should apply for the position. Cos, fuck me, you've got system architect written all over you. 😂😂😂
 
Well, I tell you what: after Cerny fucks this up, you should apply for the position. Cos, fuck me, you've got system architect written all over you. 😂😂😂
This is easily one of the worst methods of dodging criticism, and the underpinning of a failed counterpoint. I don't have to be a system architect to understand voltage and wattage draws, frequencies, thermals, and the like, and how they all relate.

More CUs at lower frequencies making up the same compute ceiling would draw less power and produce less heat.
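The back-of-the-envelope version, using the common first-order rule that dynamic power scales with frequency times voltage squared and that voltage has to rise roughly with frequency (so power per CU grows roughly with the cube of clock; the exponent is an approximation, not a datasheet figure):

```python
# Same ~10.3 TFLOPs two ways, under the rough first-order rule that
# power per CU grows with the cube of clock (dynamic power ~ f * V^2,
# with voltage rising roughly linearly with frequency). The cubic
# exponent is an approximation, not a datasheet figure.
def relative_power(cus: int, clock_ghz: float) -> float:
    return cus * clock_ghz ** 3

narrow_fast = relative_power(36, 2.23)            # PS5's announced config
wide_slow   = relative_power(40, 36 * 2.23 / 40)  # 40 CUs, same TFLOPs

print(wide_slow / narrow_fast)  # ~0.81 -- ~19% less power, same compute
```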
 

makaveli60

Member
They did answer it, in a roundabout way. The way I understand it, it's only the power draw that is constant. They did say that otherwise different people with different ambient temperatures would get different performance. So temperature can go up and down, but the power draw limit will make sure it never overheats across a reasonable range of ambient temperatures, and gives the same performance to everyone.
Admittedly, it is very confusing, and the overall gist I am getting is that Sony indeed screwed up at least a little as far as the power of the system goes. Good systems are never this complicated to explain. Or maybe their communication is horrible. But that usually happens when your system is not up to the mark, like MS last generation. Remember how clear Sony was about the PS4 and how MS kept fumbling; this seems the exact opposite this time.
Overall, it seems to me that what Cerny is hinting at is that there will almost never be a possibility of using both the CPU and GPU simultaneously at their full clocks. But since that kind of scenario never really happens in a game, it doesn't matter. So if the scene is GPU-bound, the GPU will be at full clock and the CPU will be downclocked to remain within the power budget, and vice versa. And the system will do this juggling automatically.
Of course I am not a developer, but as per my understanding, wouldn't a developer prefer to know the absolute max he can stretch the system to at every level? Adding a dynamic power shift between CPU and GPU will complicate things for a developer, no? Wouldn't they then just optimize for the minimum assured frequencies so that they don't have to worry about the dynamic stuff? And by all accounts, that means optimizing for a ~9 TF GPU level for the PS5? Or keeping the GPU at full but the CPU at lower frequencies, something Digital Foundry hinted at?
I hope I am wrong, but it shouldn't be this complicated to explain otherwise, unlike MS, where we know exactly how everything will work.
Yep. I was glad that I (thought I) finally understood how the system would work, and it finally didn't seem that horrible. But then this thought came up about the temperature thing, dust collecting, etc., and if what you say is true, that the performance can vary between systems because of the temperature, then that's terrible. It would be basically the worst console ever. I would be surprised if they designed such a shitty system, so I really hope that's not the case. However, there is no other answer to my questions, so it's worrying.
 
Yep. I was glad that I (thought I) finally understood how the system would work, and it finally didn't seem that horrible. But then this thought came up about the temperature thing, dust collecting, etc., and if what you say is true, that the performance can vary between systems because of the temperature, then that's terrible. It would be basically the worst console ever. I would be surprised if they designed such a shitty system, so I really hope that's not the case. However, there is no other answer to my questions, so it's worrying.
Basically, it would perform better or worse depending on your environment, e.g. during the summer vs. the winter?
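For what it's worth, the scheme described in the quote above keys clocks off workload activity against a fixed power budget, not off temperature sensors, so two consoles running the same scene should clock identically whether it's summer or winter. A toy contrast (all numbers invented):

```python
# Two governor styles; per the description above, PS5 behaves like the
# second one: clocks come from a power model of the workload, not a
# thermal sensor, so ambient temperature doesn't change performance.
def thermal_governor(load: float, ambient_c: float) -> float:
    die_c = ambient_c + 50 * load       # hotter room -> hotter die
    return 2.23 if die_c < 75 else 2.0  # throttles on a hot day

def activity_governor(load: float, ambient_c: float) -> float:
    modeled_w = 180 * load              # depends only on the workload
    return 2.23 if modeled_w <= 200 else 2.23 * 200 / modeled_w

for gov in (thermal_governor, activity_governor):
    print(gov.__name__, gov(0.9, 20), gov(0.9, 35))  # winter vs summer room
# thermal_governor 2.23 2.0   -- performance varies with the room
# activity_governor 2.23 2.23 -- identical behavior on every console
```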
 