
News Drama AMD 4700S CPU Reviewed: Defective PS5 Chips Find New Life

SmokSmog

Member
Apr 22, 2018
397
2,238
555

AMD 4700S CPU Reviewed: Defective PS5 Chips Find New Life


 

ethomaz

is mad because DF didn't do a video on a video of a video of a video on PS5
Mar 19, 2013
42,244
45,055
1,310
39
Brazil
Latency is the main issue for PC-centric processing… so it's really not a contender against a good desktop CPU.

DDR4 could extract way more from it for general PC tasks.
 
  • Like
Reactions: Bo_Hazem and M1chl

Lysandros

Member
Jul 28, 2020
1,130
3,997
370
That CPU is in the PS5? It's slow.
As far as I know, the PS5 doesn't use repurposed defective chips or a slow PCIe 2.0 bus, which seems to be the bottleneck here, so no, I don't consider it to be 'the PS5 CPU/chip'.

Edit: Besides, the basic cooling used in the setup isn't even remotely comparable to the PS5's in performance; temperatures north of 100 degrees lead to significant throttling, as stated in the article. By the way, the PS5 may very well use lower-latency GDDR6 chips; we don't know the timings at which the RAM operates, and these can be customized.
 
Last edited:

TrueLegend

Member
Jun 7, 2021
677
1,616
630
Why is anyone expecting the PS5 CPU to be faster than desktop CPUs? It's an APU, and only one component among many in a device that retails for $400. The CPU is more than enough for what it's paired with, especially for devs who have been working with Jaguar cores and still producing good-looking games. Even the low-end CPUs of the current-gen consoles are god-tier compared to older gens, including the Pro/One X. It is not a bottleneck on the GPU. You can run games at 4K60 on a modern 11th-gen i3 if you are not planning on very high and ultra settings. The PS5 aims for 4K30 at 1800p-2160p, or 4K60 at 1200p-1400p, with most settings on medium if not low. Testing it with ultra settings is not ideal. What matters at the end of the day is image quality and framerate. The hardware is balanced for that. And it's a damn good machine. Also, what's up with 8$ and 16$?
 

DonJuanSchlong

Spice Spice Baby
Jul 15, 2020
3,846
9,919
745
Why is anyone expecting the PS5 CPU to be faster than desktop CPUs? It's an APU, and only one component among many in a device that retails for $400. The CPU is more than enough for what it's paired with, especially for devs who have been working with Jaguar cores and still producing good-looking games. Even the low-end CPUs of the current-gen consoles are god-tier compared to older gens, including the Pro/One X. It is not a bottleneck on the GPU. You can run games at 4K60 on a modern 11th-gen i3 if you are not planning on very high and ultra settings. The PS5 aims for 4K30 at 1800p-2160p, or 4K60 at 1200p-1400p, with most settings on medium if not low. Testing it with ultra settings is not ideal. What matters at the end of the day is image quality and framerate. The hardware is balanced for that. And it's a damn good machine. Also, what's up with 8$ and 16$?
Cecret Cerny Ceause is the reason. Wish people would realize that by now. I get that console warriors get clicks here, but it's time the right info is used, rather than armchair devs.

The PS5 is a great piece of hardware for the price. Unbeatable, as a matter of fact; both next-gen consoles are sold at a steal. But they shouldn't be compared to mid-range, high-end, or enthusiast PCs.
 

lh032

I cry about Xbox and hate PlayStation.
Mar 8, 2021
1,907
4,196
470
Not surprising, just look at the pricing.

That's how you get an affordable gaming machine.
Want powerful hardware? Game on PC.
 
Last edited:

MonarchJT

Member
Sep 25, 2020
3,000
4,627
410
And here you will realize that a modern desktop PC CPU could run a PS5 (or Xbox, for that matter) game, do whatever the decompressors are doing at the same time, and probably still have power to spare.
 
Last edited:

Md Ray

Member
Nov 12, 2016
4,152
13,376
785
And here you will realize that a modern desktop PC CPU could run a PS5 (or Xbox, for that matter) game, do whatever the decompressors are doing at the same time, and probably still have power to spare.
At PCIe Gen4 speeds, no, it couldn't. Please, let's stop this. We have data from NVIDIA showing how core-heavy a task decompression is: it can use up to 24 cores on a 3rd-gen Threadripper.

And you won't find more than 6-8 cores in most modern gaming PCs, let alone 24.


This is why RTX IO GPU-based decompression and DirectStorage are being introduced.
 
Last edited:

Leo9

Member
Dec 23, 2020
85
167
210
At PCIe Gen4 speeds, no, it couldn't. Please, let's stop this. We have data from NVIDIA showing how core-heavy a task decompression is: it can use up to 24 cores on a 3rd-gen Threadripper.

And you won't find more than 6-8 cores in most modern gaming PCs, let alone 24.


This is why RTX IO GPU-based decompression and DirectStorage are being introduced.
Looking at that graph, 24 cores are needed for 14 GB/s. No game will need that bandwidth, not until 2040 at least.
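Leo9's reading of the graph can be sanity-checked with quick back-of-envelope arithmetic. This is only a sketch: the per-core rate below is inferred from the 24-core / 14 GB/s figure quoted in the posts above, and real decompression scaling won't be perfectly linear.

```python
# Back-of-envelope check of the decompression figures quoted above.
# Assumption: throughput scales roughly linearly with core count.

CORES_MEASURED = 24        # 3rd-gen Threadripper cores in NVIDIA's demo
THROUGHPUT_GBPS = 14.0     # GB/s of decompressed output at that core count

per_core = THROUGHPUT_GBPS / CORES_MEASURED   # ~0.58 GB/s per core

# A PCIe 4.0 NVMe drive peaks around 7 GB/s raw; at a ~2:1 compression
# ratio that becomes ~14 GB/s decompressed -- hence the 24 cores.
# A mainstream 8-core gaming CPU could sustain only roughly:
mainstream_8_core = 8 * per_core

print(f"per core: {per_core:.2f} GB/s")          # per core: 0.58 GB/s
print(f"8 cores:  {mainstream_8_core:.1f} GB/s") # 8 cores:  4.7 GB/s
```

Which is the thread's point in numbers: a typical 8-core PC can't feed a Gen4 SSD at full rate in software, so the work moves to the GPU (RTX IO) or to dedicated hardware (the consoles' decompression blocks).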
 

3liteDragon

Member
Mar 3, 2020
2,176
13,134
765
It never was. More like a mobile, lower-clocked Zen 2 than a desktop part.
The ones in the consoles aren't mobile versions of the Zen 2 desktop CPUs. The Jaguar CPUs are what you would call mobile CPUs, because they were designed for low-power use in notebooks and tablets. The console ones have slight adjustments (like the FPU register cuts on the PS5's CPU) and are clocked slightly lower than their PC counterparts, but they're still desktop-class CPUs. Plus, audio processing and data decompression from storage are completely offloaded from the PS5's CPU cores, so imagine how much more CPU horsepower is going to be available for current-gen-only titles in a few years (and these CPUs support hyper-threading on top of that). It's gonna be insane; the CPU jump from last gen is MORE THAN enough.
 
Last edited:

Armorian

Banned
Jan 17, 2018
3,298
5,293
520
Then why link the worst picture in the OP? Some people, smh. Thank you for the different graphs.

Mistake or...? :messenger_tears_of_joy: That was a GPU-related graph, completely irrelevant to CPU performance alone.

The ones in the consoles aren't mobile versions of the Zen 2 desktop CPUs. The Jaguar CPUs are what you would call mobile CPUs, because they were designed for low-power use in notebooks and tablets. The console ones have slight adjustments (like the FPU register cuts on the PS5's CPU) and are clocked slightly lower than their PC counterparts, but they're still desktop-class CPUs. Plus, audio processing and data decompression from storage are completely offloaded from the PS5's CPU cores, so imagine how much more CPU horsepower is going to be available for current-gen-only titles in a few years. It's gonna be insane; the CPU jump from last gen is MORE THAN enough.

Zen 2 desktop, 32MB cache:



Zen 2 Mobile, 8MB:



PS5/XSX CPU has 8MB...
 

3liteDragon

Member
Mar 3, 2020
2,176
13,134
765
Zen 2 desktop, 32MB cache:



Zen 2 Mobile, 8MB:



PS5/XSX CPU has 8MB...
Yes, the consoles only run games and are not general-purpose like PCs, hence the massive L3 cache cut on both console CPUs; that still doesn't make them "mobile" CPUs. Again, the PS4/Xbox One CPU is a mobile CPU because of its architectural design, which is best suited to tablets and other low-power devices. The Jaguar CPUs consume less than 20W of power; that's the definition of a mobile CPU.
 
Last edited:

Armorian

Banned
Jan 17, 2018
3,298
5,293
520
Yes, the consoles only run games and are not general-purpose like PCs, hence the massive L3 cache cut on both console CPUs; that still doesn't make them "mobile" CPUs. Again, the PS4/Xbox One CPU is a mobile CPU because of its architectural design, which is best suited to tablets and other low-power devices. The Jaguar CPUs consume less than 20W of power; that's the definition of a mobile CPU.

Fuck me, AMD APUs are based on mobile versions of their chips. And cache is THE MOST IMPORTANT thing for gaming. Why else would AMD make a Zen 3 refresh with more cache targeted at gamers?

 
  • Praise the Sun
Reactions: MightySquirrel

3liteDragon

Member
Mar 3, 2020
2,176
13,134
765
Fuck me, AMD APUs are based on mobile versions of their chips. And cache is THE MOST IMPORTANT thing for gaming. Why else would AMD make a Zen 3 refresh with more cache targeted at gamers?

Dude, those games are running on a Ryzen 9 5900X (12C/24T) prototype (not the 2020 version) with 192MB of stacked L3$, compared against the 2020 5900X with 64MB of L3$, and the performance uplift is decent. They're going overkill with it because they can. The consoles are gonna be just fine with 8MB of L3$ across the board, but they aren't "mobile" versions. You gotta look up the definition of a mobile CPU.
 
  • LOL
Reactions: MightySquirrel

Armorian

Banned
Jan 17, 2018
3,298
5,293
520
Dude, those games are running on a Ryzen 9 5900X (12C/24T) prototype (not the 2020 version) with 192MB of stacked L3$, compared against the 2020 5900X with 64MB of L3$, and the performance uplift is decent. They're going overkill with it because they can. The consoles are gonna be just fine with 8MB of L3$ across the board, but they aren't "mobile" versions. You gotta look up the definition of a mobile CPU.

Desktop APUs. The PS5 uses CPU chiplets from the 4700 series, but with different clocks and a different GPU, obviously:



They look like chiplets made for notebooks to me :messenger_smiling:
 
  • Like
Reactions: MightySquirrel

Xyphie

Member
Oct 4, 2007
2,938
659
1,415
Somewhere
The console's CPU floorplan is basically lifted 100% from Renoir, with similar clocks, so in that sense it's a "mobile" processor. Of course, thinking about it in terms of "mobile", "server", "desktop" or whatever is very stupid. The L3 cache cut in the consoles is because caches are very expensive in terms of die space, not because they are ineffectual; a Zen 2/3 CCD is ~50% L3 cache, for instance. If you're going to save money on your CPU design, the caches are the first thing to go.
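Xyphie's die-space point can be made concrete with rough numbers. A sketch only: the ~74 mm² CCD area is a public estimate for a Zen 2 chiplet, the ~50% L3 fraction is the figure stated in the post, and the per-MB cost derived from them is an approximation, not an AMD spec.

```python
# Rough L3 die-area arithmetic for a Zen 2 CCD.
CCD_AREA_MM2 = 74.0   # approximate Zen 2 chiplet size (public estimate)
L3_FRACTION = 0.5     # "a Zen 2/3 CCD is ~50% L3 cache" (post above)
L3_MB = 32            # desktop Zen 2 L3 per CCD

mm2_per_mb = (CCD_AREA_MM2 * L3_FRACTION) / L3_MB   # ~1.16 mm^2 per MB

# Cutting the console CPU from 32 MB to 8 MB of L3 frees roughly:
saved_mm2 = (32 - 8) * mm2_per_mb
print(f"~{saved_mm2:.0f} mm^2 freed")   # ~28 mm^2 freed
```

On a ~300 mm² console APU that is close to a tenth of the die: real area that can go to GPU compute units instead of cache.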
 
Jul 29, 2013
3,802
7,341
1,095
US
The PS5 was benchmarked on UserBenchmark in 2019. It had ~155ns memory latency and benched similarly to a Ryzen 1700. Still a great CPU for console use, and I doubt CPU bottlenecking is much of an issue outside of maybe 120fps modes.
 

winjer

Member
Aug 3, 2021
493
1,121
305
The test shown in the OP is completely limited by the PCIe 2.0 x4 bus. In CPU-centric tests it looks like this:

It's basically a Zen 1 CPU. It's still a huge jump from the Jaguar cores of the previous gen.
But they should have kept the 32MB of L3 cache. With 145ns of memory latency, the CPU will spend too much time waiting for data to process.
If only AMD had had 3D cache ready in time...

The weird part of this 4700S is that the CPU can only access part of all that memory bandwidth.
And without a full GPU, the rest is just wasted.
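winjer's latency point can be put in cycle terms. A minimal sketch, assuming a ~3.5 GHz clock (the consoles' ballpark) and the 145 ns round-trip figure from the post:

```python
# How expensive is a last-level cache miss at GDDR6 latencies?
CLOCK_GHZ = 3.5     # assumed console Zen 2 clock, in cycles per nanosecond
LATENCY_NS = 145    # memory round-trip latency quoted above

stall_cycles = LATENCY_NS * CLOCK_GHZ   # ns * (cycles/ns) = cycles
print(f"~{stall_cycles:.0f} cycles per miss")   # ~508 cycles per miss
```

Every miss that a larger L3 would have absorbed costs the core hundreds of cycles, which is why cache- and latency-sensitive workloads like games feel the 8 MB cut far more than renderers like Cinebench do.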
 
Last edited:
  • Like
Reactions: Armorian

DenchDeckard

Member
Feb 28, 2021
2,585
4,631
395
Seems OK. Just fine, tbh. Nothing crazy, and it will obviously be smoked by a 5600X etc. Perfect for a console, and much better than the POS we had with the Jaguar cores.
 

Md Ray

Member
Nov 12, 2016
4,152
13,376
785
It's basically a Zen 1 CPU. It's still a huge jump from the Jaguar cores of the previous gen.
But they should have kept the 32MB of L3 cache. With 145ns of memory latency, the CPU will spend too much time waiting for data to process.
If only AMD had had 3D cache ready in time...

The weird part of this 4700S is that the CPU can only access part of all that memory bandwidth.
And without a full GPU, the rest is just wasted.
Single-core-wise it's at Zen+ level, and multi-core-wise it performs similarly to a desktop Zen 2 Ryzen 5 3600 (a tiny bit better) in rendering workloads.

It's a monolithic design rather than chiplets, so the 32MB wasn't feasible and had to go. Even the latest Zen 3 desktop APUs have only 16MB of L3$.
 

Lysandros

Member
Jul 28, 2020
1,130
3,997
370
It's basically a Zen 1 CPU. It's still a huge jump from the Jaguar cores of the previous gen.
But they should have kept the 32MB of L3 cache. With 145ns of memory latency, the CPU will spend too much time waiting for data to process.
If only AMD had had 3D cache ready in time...

The weird part of this 4700S is that the CPU can only access part of all that memory bandwidth.
And without a full GPU, the rest is just wasted.
I think console engineers are smarter than most of us and can make sensible decisions. Die space isn't free; with that much of an increase (4x) in cache size, they would have had to make significant cuts to the GPU, which certainly isn't ideal for a mere ~10% increase in CPU IPC.
 

Clear

Member
Feb 2, 2009
13,501
9,484
1,365
The uplift moving from last-gen consoles to current gen is pretty impressive, if you ask me. Just looking at how backwards compatibility is performing, we're seeing greater-than-double framerates, usually at much improved resolution and detail levels, plus massively improved I/O performance, in boxes that cost only a little more than their predecessors.

If you consider these gains to be more or less the baseline, since they only leverage a minimal amount of the new hardware's feature set and capabilities, it seems to me that the future looks pretty bright.
 
  • Like
Reactions: Md Ray

winjer

Member
Aug 3, 2021
493
1,121
305
Single-core-wise it's at Zen+ level, and multi-core-wise it performs similarly to a desktop Zen 2 Ryzen 5 3600 (a tiny bit better) in rendering workloads.

It's a monolithic design rather than chiplets, so the 32MB wasn't feasible and had to go. Even the latest Zen 3 desktop APUs have only 16MB of L3$.

Yes, if we consider clock speeds, IPC would be near Zen+. Basically we have a Zen+ with lower clocks and 7C/14T.
But let's remember that games don't scale as well with core count.
IPC is still the most important metric, so this CPU won't scale like a 3600; more like a 2700.
But there is a problem here: programs like POV-Ray and Cinebench are not cache- or memory-latency-sensitive.
They gain almost nothing from more cache and lower latency.
But games are, a lot more. So in games, these console CPUs would perform closer to Zen 1.

Yes, it's an SoC, so space is at a premium.
That's why I said: if only AMD had already had 3D cache ready at the time.
That would have allowed them to double the L3 cache.
 
Last edited:

Zathalus

Member
Jul 16, 2020
1,084
2,828
470
I came across the following review as well. It's in German, but nothing a bit of Google Translate can't sort out. It basically confirms that the Zen 2 CPU in the PS5 has cut-down FPU units, as was speculated a while back based on the die shots.

No idea what impact (if any) this will have on games going forward.
 
Jan 29, 2019
6,903
7,716
520
The PS5 is a great piece of hardware for the price. Unbeatable, as a matter of fact; both next-gen consoles are sold at a steal. But they shouldn't be compared to mid-range, high-end, or enthusiast PCs.
Comparing is fine; people need to compare things all the time. Especially when the consoles and PC mostly run the same games.
Cecret Cerny Ceause is the reason. Wish people would realize that by now. I get that console warriors get clicks here, but it's time the right info is used, rather than armchair devs.
You are the first or second biggest console warrior on this channel.
 

Kilau

Member
Dec 12, 2013
3,951
4,962
710
Pretty cool, and not horrible performance; not sure what I would ever use one for, though. Would need a specific use case.
is fine; people need to compare things all the time. Especially when the consoles and PC mostly run the same games.

You are the first or second biggest console warrior on this channel.
But he isn’t a “console” guy at all…wait channel?
 
  • Praise the Sun
Reactions: DonJuanSchlong

SlimySnake

Member
Feb 5, 2013
12,841
36,381
1,260
Hey, I thought the PS5 CPU was a Ryzen 7 3700X, did I miss the news somehow?
Same 8-core, 16-thread Zen 2 architecture as the 3700X, yes, but with only 1/4 of the L3 cache: 32 MB vs 8 MB.

The PS5 APU is only ~300mm2 and includes the GPU as well as a pretty complex I/O block. A slightly bigger chip would've offered more performance for sure, but it would've also increased thermals and cost. For $500, you are getting a $400 CPU (albeit pared back slightly), a $379 GPU, and a $180 SSD.
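SlimySnake's value argument is simple arithmetic, taking his component figures at face value (they are his estimates from the post, not measured market prices):

```python
# Summing the rough component values quoted above.
cpu_usd = 400    # 3700X-class 8C/16T Zen 2 (pared back in the console)
gpu_usd = 379    # roughly comparable discrete GPU
ssd_usd = 180    # fast NVMe SSD

parts_total = cpu_usd + gpu_usd + ssd_usd
console_usd = 500

print(f"parts: ${parts_total} vs console: ${console_usd}")  # parts: $959 vs console: $500
```

Nearly twice the console's price in parts alone, before the case, cooling, PSU, and controller, which is the "sold at a steal" point made earlier in the thread.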
 

DonJuanSchlong

Spice Spice Baby
Jul 15, 2020
3,846
9,919
745
Same 8-core, 16-thread Zen 2 architecture as the 3700X, yes, but with only 1/4 of the L3 cache: 32 MB vs 8 MB.

The PS5 APU is only ~300mm2 and includes the GPU as well as a pretty complex I/O block. A slightly bigger chip would've offered more performance for sure, but it would've also increased thermals and cost. For $500, you are getting a $400 CPU (albeit pared back slightly), a $379 GPU, and a $180 SSD.
Why do people actually believe that, though? People were claiming 3700X/3800X well after it was confirmed to be far from the case. Then people get mad when I point this out. WTF
 

Allandor

Member
Feb 8, 2018
1,404
1,122
435
Germany
Hey, I thought the PS5 CPU was a Ryzen 7 3700X, did I miss the news somehow?
It is (just like in the Xbox) a lower-clocked 8-core Zen 2 CPU with reduced cache, and Sony also removed some things on the CPU side. But the CPU results here are also much lower because of GDDR6 memory, which has higher latencies than DDR4, and Ryzen especially likes low latencies.
Also keep in mind that the CPU would normally not have as generous a power envelope as in the tested box, and that in this test the CPU has the memory all to itself. So the CPUs should be a bit worse in the consoles (also with lower boost clocks).

And still, this is so, so, so much better than last gen's Jaguar cores ;)

Fuck me, AMD APUs are based on mobile versions of their chips. And cache is THE MOST IMPORTANT thing for gaming. Why else would AMD make a Zen 3 refresh with more cache targeted at gamers?
Yes, more cache would have worked well, especially as GDDR6 is used and those latencies are not great for a CPU. Cache is really simple (and has a low defect rate), but it would have taken a huge part of the die, and that space is better used for the GPU, as it is more important to the overall speed of the console.
 
Last edited:

Md Ray

Member
Nov 12, 2016
4,152
13,376
785
The uplift moving from last-gen consoles to current gen is pretty impressive, if you ask me. Just looking at how backwards compatibility is performing, we're seeing greater-than-double framerates, usually at much improved resolution and detail levels, plus massively improved I/O performance, in boxes that cost only a little more than their predecessors.

If you consider these gains to be more or less the baseline, since they only leverage a minimal amount of the new hardware's feature set and capabilities, it seems to me that the future looks pretty bright.
Yeah, MS claims it's about a 3x (minimum) improvement over the One X's Jaguar CPU, and up to 4x depending on the title.



And that's kind of what I'm seeing here (up to 4x higher fps) with a 3700X in this particular title, which hasn't even been optimized for 16 threads. This section is known to be CPU-heavy.

Witcher 3 on One X (900p in perf mode); note the perf-mode framerates for the comparisons.


3.5x improvement (using settings roughly equivalent to consoles)




3.8x improvement




3.8x improvement




4.0x improvement

 
Last edited: