
AMD 4700S CPU Reviewed: Defective PS5 Chips Find New Life

SmokSmog

Member

AMD 4700S CPU Reviewed: Defective PS5 Chips Find New Life

 

ethomaz

Banned
Latency is the main issue for PC-centric processing… so it is really not a contender against a good general-purpose desktop CPU.

DDR4 could extract way more from it for general PC tasks.
 

Lysandros

Member
That CPU is in the PS5? It's slow.
As far as I know, the PS5 doesn't use repurposed defective chips or the slow PCIe 2.0 bus that seems to be the bottleneck here, so no, I don't consider this to be 'the PS5 CPU/chip'.

Edit: Besides, the basic cooling used in this setup isn't even remotely comparable to the PS5's; the temperatures north of 100 degrees lead to significant throttling, as stated in the article. By the way, the PS5 may very well use lower-latency GDDR6 chips; we don't know the timings the RAM operates at, and these can be customized.
 
Last edited:

TrueLegend

Member
Why is anyone expecting the PS5 CPU to be faster than desktop CPUs? It's an APU, and only one component among many in a device that retails for $400. The CPU is more than enough for what it's paired with, especially for devs who have been working with Jaguar cores and still producing good-looking games. Even the modest CPUs of the current-gen consoles are god-tier compared to the older gen, including the Pro/One X. It's not a bottleneck on the GPU. You can run games at 4K60 on a modern 11th-gen i3 if you're not aiming for very high or ultra settings. The PS5 aims for 4K30 at 1800p-2160p, or 4K60 at 1200-1400p, with most settings on medium if not low. Testing it at ultra settings isn't ideal. What matters at the end of the day is image quality and framerate, and the hardware is balanced for that. And it's a damn good machine. Also, what's up with 8$ and 16$?
 
Why is anyone expecting the PS5 CPU to be faster than desktop CPUs? It's an APU, and only one component among many in a device that retails for $400. The CPU is more than enough for what it's paired with, especially for devs who have been working with Jaguar cores and still producing good-looking games. Even the modest CPUs of the current-gen consoles are god-tier compared to the older gen, including the Pro/One X. It's not a bottleneck on the GPU. You can run games at 4K60 on a modern 11th-gen i3 if you're not aiming for very high or ultra settings. The PS5 aims for 4K30 at 1800p-2160p, or 4K60 at 1200-1400p, with most settings on medium if not low. Testing it at ultra settings isn't ideal. What matters at the end of the day is image quality and framerate, and the hardware is balanced for that. And it's a damn good machine. Also, what's up with 8$ and 16$?
Cecret Cerny Ceause is the reason. Wish people would realize that by now. I get that the console warriors get clicks here, but it's time the right info gets used, rather than armchair-dev takes.

The PS5 is a great piece of hardware for the price. Unbeatable, as a matter of fact; both next-gen consoles are sold at a steal. But it shouldn't be compared to mid-range, high-end, or enthusiast PCs.
 

lh032

I cry about Xbox and hate PlayStation.
Not surprising, just look at the pricing.

That's how you get an affordable gaming machine.
Want powerful hardware? Game on PC.
 
Last edited:

Md Ray

Member
and here you will realize that a modern desktop PC CPU could run a PS5 (or Xbox, for that matter) game, do whatever the decompressors are doing at the same time, and probably still have power to spare
At PCIe Gen4 speeds, no, it can't. Please, let's stop this. We have data from NVIDIA showing how core-heavy a task this is: it can use up to 24 cores on a 3rd-gen Threadripper.

And you won't find more than 6-8 cores in most modern gaming PCs, let alone 24.

j1kVuie.png

This is why RTX IO GPU-based decompression and DirectStorage are being introduced.
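For a rough sense of scale, here's a back-of-envelope sketch in Python. The per-core rate is simply inferred from that NVIDIA slide (~14 GB/s across ~24 Threadripper cores); it's an assumption for illustration, not a measured figure.

```python
import math

# Back-of-envelope: CPU cores needed for software decompression at a given
# output rate. The per-core figure is inferred from the NVIDIA slide
# (~14 GB/s across ~24 Threadripper cores) -- an assumption, not a measurement.
GBPS_PER_CORE = 14 / 24  # ~0.58 GB/s of decompressed output per core (assumed)

def cores_needed(target_gbps: float) -> int:
    """Cores required to sustain target_gbps of decompressed data."""
    return math.ceil(target_gbps / GBPS_PER_CORE)

for target in (2.4, 5.5, 9.0, 14.0):  # example target rates in GB/s
    print(f"{target:5.1f} GB/s -> ~{cores_needed(target):2d} cores")
```

Even at the lower example rates, software decompression eats most of a typical 6-8 core gaming CPU, which is the point of moving it to the GPU or dedicated hardware.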
 
Last edited:

Leo9

Member
At PCIe Gen4 speeds, no, it can't. Please, let's stop this. We have data from NVIDIA showing how core-heavy a task this is: it can use up to 24 cores on a 3rd-gen Threadripper.

And you won't find more than 6-8 cores in most modern gaming PCs, let alone 24.

j1kVuie.png

This is why RTX IO GPU-based decompression and DirectStorage are being introduced.
Looking at that graph, 24 cores are needed for 14 GB/s. No game will need that much bandwidth, not until 2040 at least.
 

3liteDragon

Member
It never was. It's more like a mobile, lower-clocked Zen 2 than a desktop part.
The ones in the consoles aren't mobile versions of the Zen 2 desktop CPUs. The Jaguar CPUs are what you would call mobile CPUs, because they were designed for low-power use in notebooks and tablets. The console ones have slight adjustments (like the FPU register cuts on the PS5's CPU) and are clocked slightly lower than their PC counterparts, but they're still desktop-class CPUs. Plus, audio processing and data decompression from storage are completely offloaded from the PS5's CPU cores, so imagine how much more CPU horsepower is going to be available this time around for current-gen-only titles in a few years (and these CPUs support hyper-threading on top of that). It's going to be insane; the CPU jump from last gen is MORE THAN enough.
 
Last edited:

Armorian

Banned
Then why link the worst picture in the OP? Some people, smh. Thanks for the different graphs.

Mistake or...? :messenger_tears_of_joy: That was a GPU-related graph, completely irrelevant to CPU performance on its own.

The ones in the consoles aren't mobile versions of the Zen 2 desktop CPUs. The Jaguar CPUs are what you would call mobile CPUs, because they were designed for low-power use in notebooks and tablets. The console ones have slight adjustments (like the FPU register cuts on the PS5's CPU) and are clocked slightly lower than their PC counterparts, but they're still desktop-class CPUs. Plus, audio processing and data decompression from storage are completely offloaded from the PS5's CPU cores, so imagine how much more CPU horsepower is going to be available this time around for current-gen-only titles in a few years. It's going to be insane; the CPU jump from last gen is MORE THAN enough.

Zen 2 desktop, 32MB cache:

9UZZLAe.png


Zen 2 Mobile, 8MB:

IBVDFFx.png


PS5/XSX CPU has 8MB...
 

3liteDragon

Member
Zen 2 desktop, 32MB cache:

9UZZLAe.png


Zen 2 Mobile, 8MB:

IBVDFFx.png


PS5/XSX CPU has 8MB...
Yes, the consoles only run games and aren't general-purpose like PCs, hence the massive L3 cache cut on both console CPUs. That still doesn't make them "mobile" CPUs. Again, the PS4/Xbox One CPU is a mobile CPU because of its architectural design, which is best suited to tablets and other low-power devices. The Jaguar CPUs consume less than 20W of power; that's the definition of a mobile CPU.
 
Last edited:

Armorian

Banned
Yes, the consoles only run games and aren't general-purpose like PCs, hence the massive L3 cache cut on both console CPUs. That still doesn't make them "mobile" CPUs. Again, the PS4/Xbox One CPU is a mobile CPU because of its architectural design, which is best suited to tablets and other low-power devices. The Jaguar CPUs consume less than 20W of power; that's the definition of a mobile CPU.

Fuck me, AMD APUs are based on the mobile versions of their chips. And cache is THE MOST IMPORTANT thing for gaming. Why else would AMD make a Zen 3 refresh with more cache targeted at gamers?

Najpewniej-poznalismy-szczegoly-Zen-3-AMD.-Szykuje-sie-mala-rewolucja-4.jpg
 

3liteDragon

Member
Fuck me, AMD APUs are based on the mobile versions of their chips. And cache is THE MOST IMPORTANT thing for gaming. Why else would AMD make a Zen 3 refresh with more cache targeted at gamers?

Najpewniej-poznalismy-szczegoly-Zen-3-AMD.-Szykuje-sie-mala-rewolucja-4.jpg
Dude, those games are running on an R9 5900X (12C/24T) prototype (not the 2020 version) with 192MB of stacked L3$, being compared against the 2020 5900X with 64MB of L3$, and the performance uplift is decent. They're going overkill with it because they can. The consoles are going to be just fine with 8MB of L3$ across the board, but they aren't "mobile" versions. You've got to look up the definition of a mobile CPU.
 

Armorian

Banned
Dude, those games are running on an R9 5900X (12C/24T) prototype (not the 2020 version) with 192MB of stacked L3$, being compared against the 2020 5900X with 64MB of L3$, and the performance uplift is decent. They're going overkill with it because they can. The consoles are going to be just fine with 8MB of L3$ across the board, but they aren't "mobile" versions. You've got to look up the definition of a mobile CPU.

Desktop APUs: the PS5 uses CPU chiplets from the 4700 series, but with different clocks and (obviously) a different GPU:

g3MlLBc.png


They look like chiplets made for notebooks to me :messenger_smiling:
 

Xyphie

Member
The console CPU's floor plan is basically lifted 100% from Renoir, with similar clocks, so in that sense it's a "mobile" processor. Of course, thinking about it in terms of "mobile", "server", "desktop" or whatever is very stupid. The cut to L3 cache in the consoles is because caches are very expensive in terms of die space, not because they're ineffectual; a Zen 2/3 CCD is ~50% L3 cache, for instance. If you're going to save money on your CPU design, the caches are the first thing to go.
 

CrustyBritches

Gold Member
The PS5 was benchmarked on UserBenchmark back in 2019. It had ~155ns memory latency and benched similarly to a Ryzen 1700. Still a great CPU for console use, and I doubt CPU bottlenecking is much of an issue outside of maybe the 120fps modes.
 

winjer

Gold Member
The test shown in the OP is completely limited by the PCIe 2.0 x4 bus. In CPU-centric tests it looks like this:

It's basically a Zen 1 CPU. It's still a huge jump from the Jaguar cores of the previous gen.
But they should have kept the 32MB of L3 cache. With that 145ns of memory latency, the CPU will spend too much time waiting for data.
If only AMD had already had 3D cache ready in time...

The weird part of this 4700S is that the CPU can only access a fraction of all that memory bandwidth.
And without a full GPU, the rest is just wasted.
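To put that latency figure in perspective, here's a quick back-of-envelope sketch in Python. The clock speed and the DDR4 comparison number are ballpark assumptions for illustration, not measurements from this board.

```python
# Back-of-envelope: clock cycles a full memory round-trip costs at a given
# latency. The clock speed and the DDR4 figure are rough assumptions.
def stall_cycles(latency_ns: float, clock_ghz: float) -> float:
    # latency [ns] * clock [cycles per ns] = cycles spent waiting
    return latency_ns * clock_ghz

CLOCK_GHZ = 3.6  # assumed CPU clock
for label, latency_ns in (("GDDR6 on the 4700S (~145 ns)", 145.0),
                          ("typical desktop DDR4 (~70 ns)", 70.0)):
    print(f"{label}: ~{stall_cycles(latency_ns, CLOCK_GHZ):.0f} cycles per miss")
```

Roughly 500+ cycles lost on every last-level cache miss versus ~250 on a typical DDR4 desktop, which is why the cache cut plus GDDR6 latency hurts more than the raw specs suggest.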
 
Last edited:

DenchDeckard

Moderated wildly
Seems OK. Just fine, tbh. Nothing crazy, and it will obviously get smoked by a 5600X etc. Perfect for a console, and much better than the POS we had with the Jaguar cores.
 

Md Ray

Member
It's basically a Zen 1 CPU. It's still a huge jump from the Jaguar cores of the previous gen.
But they should have kept the 32MB of L3 cache. With that 145ns of memory latency, the CPU will spend too much time waiting for data.
If only AMD had already had 3D cache ready in time...

The weird part of this 4700S is that the CPU can only access a fraction of all that memory bandwidth.
And without a full GPU, the rest is just wasted.
Single-core-wise it's at Zen+ level, and multi-core-wise it performs similarly to a desktop Zen 2 Ryzen 5 3600 (a tiny bit better) in rendering workloads.

It's a monolithic design rather than chiplet-based, so 32MB wasn't feasible and had to go. Even the latest Zen 3 desktop APUs only have 16MB of L3$.
 

Lysandros

Member
It's basically a Zen 1 CPU. It's still a huge jump from the Jaguar cores of the previous gen.
But they should have kept the 32MB of L3 cache. With that 145ns of memory latency, the CPU will spend too much time waiting for data.
If only AMD had already had 3D cache ready in time...

The weird part of this 4700S is that the CPU can only access a fraction of all that memory bandwidth.
And without a full GPU, the rest is just wasted.
I think console engineers are smarter than most of us and can make sensible decisions. Die space isn't free; with that much of an increase (4x) in cache size they would have had to make significant cuts to the GPU portion, which certainly isn't ideal for a mere ~10% increase in CPU IPC.
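As a rough illustration of that trade-off, here's a back-of-envelope sketch in Python. The area-per-MB figure is an assumption inferred from the Zen 2 CCD (~74 mm², roughly half of which is the 32 MB of L3), and the ~300 mm² APU size is the figure quoted later in this thread.

```python
# Very rough cost of growing the console CPU's L3 from 8 MB back to 32 MB.
# ~1.2 mm^2 per MB is inferred from the Zen 2 CCD (~74 mm^2, about half of it
# being 32 MB of L3) -- an assumption, as is treating that density as
# transferable to the console APU. ~300 mm^2 is the APU size quoted in-thread.
MM2_PER_MB_L3 = (74 * 0.5) / 32   # ~1.16 mm^2 per MB of L3 (assumed)
APU_AREA_MM2 = 300.0

extra_mb = 32 - 8
extra_area = extra_mb * MM2_PER_MB_L3
print(f"Extra L3 area for +{extra_mb} MB: ~{extra_area:.0f} mm^2 "
      f"(~{100 * extra_area / APU_AREA_MM2:.0f}% of the whole APU)")
```

Call it somewhere around 9% of the die, which is a lot of GPU CUs to give up for a single-digit IPC gain.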
 

Clear

CliffyB's Cock Holster
The uplift moving from the last-gen consoles to the current gen is pretty impressive, if you ask me. Just looking at how backwards compatibility is performing, we're seeing greater-than-double framerates, usually at much improved resolution and detail levels, plus massively improved I/O performance, in boxes that cost only a little more than their predecessors.

If you consider these gains to be more or less a baseline, since they only leverage a minimal amount of the new hardware's feature set and capabilities, it seems to me that the future looks pretty bright.
 

winjer

Gold Member
Single-core-wise it's at Zen+ level, and multi-core-wise it performs similarly to a desktop Zen 2 Ryzen 5 3600 (a tiny bit better) in rendering workloads.

It's a monolithic design rather than chiplet-based, so 32MB wasn't feasible and had to go. Even the latest Zen 3 desktop APUs only have 16MB of L3$.

Yes, if we consider clock speeds, IPC would be near Zen+. Basically we have a Zen+ with lower clocks and 7C/14T.
But let's remember that games don't scale as well with core count.
IPC is still the most important metric, so this CPU won't scale like a 3600; more like a 2700.
But there is a problem here: programs like POV-Ray and Cinebench are not cache- or memory-latency-sensitive.
They gain almost nothing from more cache and lower latency.
But games are, a lot more. So in games, these console CPUs would perform closer to Zen 1.

Yes, it's an SoC, so space is at a premium.
That's why I said: if only AMD had already had 3D cache ready at the time.
That would have allowed them to double the L3 cache.
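To illustrate why a renderer scales with core count while a game often doesn't, here's a toy Amdahl's-law sketch in Python. The parallel fractions are picked purely for illustration; they aren't measured from any real game or benchmark.

```python
# Toy Amdahl's-law comparison: a renderer that's almost fully parallel vs. a
# game loop with a big serial portion. The parallel fractions are
# illustrative assumptions only, not measurements.
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for cores in (4, 8, 16):
    renderer = amdahl_speedup(0.98, cores)  # Cinebench/POV-Ray-like (assumed ~98% parallel)
    game = amdahl_speedup(0.60, cores)      # game-like workload (assumed ~60% parallel)
    print(f"{cores:2d} cores: renderer ~{renderer:.1f}x, game ~{game:.1f}x")
```

The renderer keeps scaling as you add cores, while the game flattens out quickly, which is why per-core performance (IPC, clocks, cache, latency) matters more for games.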
 
Last edited:

Zathalus

Member
I came across the following review as well. It's in German, but nothing a bit of Google Translate can't sort out. It basically confirms that the Zen 2 CPU in the PS5 has cut-down FPU units, as was speculated a while back based on the die shots.

No idea what (if any) impact this will have on games going forward.
 
The PS5 is a great piece of hardware for the price. Unbeatable, as a matter of fact; both next-gen consoles are sold at a steal. But it shouldn't be compared to mid-range, high-end, or enthusiast PCs.
Comparing is fine; people need to compare things all the time, especially when the consoles and PCs mostly run the same games.
Cecret Cerny Ceause is the reason. Wish people would realize that by now. I get that the console warriors get clicks here, but it's time the right info gets used, rather than armchair-dev takes.
You are the first or second biggest console warrior on this channel.
 

Kilau

Gold Member
Pretty cool, and not horrible performance; not sure what I would ever use one for, though. It would need a specific use case.
Comparing is fine; people need to compare things all the time, especially when the consoles and PCs mostly run the same games.

You are the first or second biggest console warrior on this channel.
But he isn't a "console" guy at all… wait, channel?
 

SlimySnake

Flashless at the Golden Globes
Hey, I thought the PS5 CPU was a Ryzen 7 3700X. Did I miss the news somehow?
Same 8-core, 16-thread Zen 2 architecture as the 3700X, yes, but with only a quarter of the L3 cache: 8 MB vs 32 MB.

The PS5 APU is only ~300mm2 and includes the GPU as well as a pretty complex I/O block. A slightly bigger chip would've offered more performance for sure, but it would've also increased the thermals and cost. For $500 you're getting a $400 CPU (albeit pared back slightly), a $379 GPU, and a $180 SSD.
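Just adding up the figures quoted in that post (Python; these are the prices stated above, not current market data):

```python
# Sum of the rough PC-equivalent part prices stated in the post above,
# compared against the console's launch price. These are the quoted figures,
# not market data.
parts = {"CPU": 400, "GPU": 379, "NVMe SSD": 180}
console_price = 500

total = sum(parts.values())
print(f"PC parts: ${total} vs. console: ${console_price} "
      f"(~{total / console_price:.1f}x the price)")
```

By those numbers you're looking at roughly $959 of equivalent parts, or about 1.9x the console's sticker price, before you even add a motherboard, RAM, PSU or case.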
 
Same 8-core, 16-thread Zen 2 architecture as the 3700X, yes, but with only a quarter of the L3 cache: 8 MB vs 32 MB.

The PS5 APU is only ~300mm2 and includes the GPU as well as a pretty complex I/O block. A slightly bigger chip would've offered more performance for sure, but it would've also increased the thermals and cost. For $500 you're getting a $400 CPU (albeit pared back slightly), a $379 GPU, and a $180 SSD.
Why do people actually believe that, though? People were claiming 3700X/3800X well after it was confirmed to be far from the case. Then people get mad when I point this out. WTF
 

Allandor

Member
Hey, I thought the PS5 CPU was a Ryzen 7 3700X. Did I miss the news somehow?
It is (just like in the Xbox) a lower-clocked 8-core Zen 2 CPU with reduced cache, and Sony also removed some things on the CPU side. But the CPU results here are also much lower because of the GDDR6 memory, which has higher latencies than DDR4, and Ryzen especially likes low latencies.
Also keep in mind that the CPU wouldn't normally have as generous a power envelope as in the tested box, and in this test the CPU gets the memory all to itself. So the CPUs should be a bit worse in the consoles (also with lower boost clocks).

And still, this is so, so, so much better than last gen's Jaguar cores ;)

Fuck me, AMD APUs are based on the mobile versions of their chips. And cache is THE MOST IMPORTANT thing for gaming. Why else would AMD make a Zen 3 refresh with more cache targeted at gamers?
Yes, more cache would have worked well, especially as GDDR6 is used and those latencies are not great for a CPU. Cache is really simple (and has a low defect rate), but it would have taken a huge part of the die, and that space is better used for the GPU, as it is more important for the overall speed of the console.
 
Last edited:

Md Ray

Member
The uplift moving from the last-gen consoles to the current gen is pretty impressive, if you ask me. Just looking at how backwards compatibility is performing, we're seeing greater-than-double framerates, usually at much improved resolution and detail levels, plus massively improved I/O performance, in boxes that cost only a little more than their predecessors.

If you consider these gains to be more or less a baseline, since they only leverage a minimal amount of the new hardware's feature set and capabilities, it seems to me that the future looks pretty bright.
Yeah, MS claims it's at least a 3x improvement over the One X's Jaguar CPU, and up to 4x depending on the title.

qPWYea6.png


And that's kind of what I'm seeing here (up to 4x higher fps) with a 3700X in this particular title, which hasn't even been optimized for 16 threads. This section is known to be CPU-heavy.

Witcher 3 on One X (900p in perf mode); note the perf-mode framerates for comparison.
YR5PISv.png


3.5x improvement (using settings roughly equivalent to consoles)

krAUF8w.png

oEI1Rr8.jpg


3.8x improvement

Pe3vUgZ.png

UiB6MfK.jpg


3.8x improvement

4VeM9KP.png

ryUtFjg.jpg


4.0x improvement

xTuQWro.png
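For reference, those multipliers are just fps ratios. Here's a minimal sketch of the calculation in Python, using placeholder fps pairs rather than the actual numbers from the screenshots above:

```python
# Speedup is simply new fps divided by baseline fps. The fps pairs below are
# hypothetical placeholders, not readings from the screenshots above.
samples = [(24, 84), (25, 95), (30, 120)]  # (One X perf-mode fps, 3700X fps) -- hypothetical
for base_fps, new_fps in samples:
    print(f"{base_fps} -> {new_fps} fps: {new_fps / base_fps:.1f}x improvement")
```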
 
Last edited: