
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.
One of those anonymous pastebin spec leaks, supposedly for the next Xbox:

Xbox Infinite (Anaconda)

Die - 352mm^2

GPU - 11.1TFLOP/s
Details: 64 CU, 8 disabled, 56 active, 1548MHz.

CPU - 8 cores, 16 threads, 3.3GHz.
Details: Zen 2.

Memory - 24GB GDDR6, 4GB DDR4.
Details: Samsung’s K4ZAF325BM-HC14 clocked at 3300MHz, 13.2Gb/s, 12 chips, 384-bit, 634GB/s. 24GB GDDR6 is available for developers, 3GB DDR4 dedicated to the OS, 1GB DDR4 dedicated to the SSD.

Storage - 256GB NVMe SSD, 2TB HDD
Details: Players don't have direct access to the SSD; the 2TB drive is replaceable, and an external drive works too. The OS manages the SSD for caching. Microsoft is using machine-learning algorithms to analyze games while they are being played on the development kit: the algorithm tracks which blocks are being used and which blocks are most likely to be used next while developers play the games for thousands of hours. The OS keeps on the SSD only the blocks relevant to the player's last known position in the game. For example, if the player is in level 3, the OS won't load level 6 to the SSD until the player reaches level 5. The SSD also keeps a compressed memory snapshot when a game is closed, for fast-launching the game to the spot the player left it. Developers have some control over what is stored on the SSD by marking a block's priority level if they wish. If a player hasn't touched a game for a while, its memory snapshot and/or data will get dumped from the SSD if space is needed.

Cooling - vapor chamber

External media drive - Blu-ray optical drive

Price - $499
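The rumored caching scheme reads like a priority-aware, position-gated block cache. A minimal sketch, assuming the one-level lookahead from the level 3/5/6 example above; every name here (`SsdBlockCache`, `dev_priority`, etc.) is a hypothetical illustration, not any real Xbox API:

```python
from collections import OrderedDict

LOOKAHEAD = 1  # per the rumor's example: at level 5 the OS starts caching level 6, not before

class SsdBlockCache:
    """Hypothetical sketch of the rumored OS-managed SSD cache."""

    def __init__(self, capacity_blocks):
        self.capacity = capacity_blocks
        self.cache = OrderedDict()  # block_id -> (level, dev_priority)

    def should_cache(self, block_level, player_level):
        # A block is "relevant" if it belongs to the player's current
        # level or the next LOOKAHEAD levels.
        return player_level <= block_level <= player_level + LOOKAHEAD

    def touch(self, block_id, block_level, player_level, dev_priority=0):
        # Called as the (rumored) ML model predicts a block will be needed soon.
        if not self.should_cache(block_level, player_level):
            return False
        self.cache[block_id] = (block_level, dev_priority)
        self.cache.move_to_end(block_id)  # most recently touched goes last
        while len(self.cache) > self.capacity:
            self._evict()
        return True

    def _evict(self):
        # Drop the lowest-priority block; among ties, the least recently
        # touched one (developers can pin hot blocks via dev_priority).
        victim = min(self.cache, key=lambda b: self.cache[b][1])
        del self.cache[victim]
```

Mentally running the rumor's example: with the player on level 3, `touch` rejects level 6 blocks; once the player reaches level 5, they qualify.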

------------------------------------------------------------------

Xbox Infinite Value (Lockhart)

Die - 288mm^2

GPU - 4.98TFLOP/s
Details: 40 CU, 4 disabled, 36 active, 1081MHz.

CPU - 8 cores, 16 threads, 3.3GHz.

Memory - 18GB GDDR6, 4GB DDR4.
Details: 2600MHz, 10.4Gb/s, 9 chips, 288-bit, 374GB/s.

Storage - 120GB NVMe SSD, 1TB HDD

Cooling - Blower fan

External media drive - None

Price - $299
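The leaked bandwidth figures for both boxes check out against the standard formula (peak bandwidth = per-pin data rate × bus width ÷ 8):

```python
# Peak memory bandwidth in GB/s from per-pin rate (Gb/s) and bus width (bits).
def peak_bandwidth(pin_rate_gbs, bus_width_bits):
    return pin_rate_gbs * bus_width_bits / 8

print(round(peak_bandwidth(13.2, 384), 1))  # → 633.6, the leak's ~634GB/s (Anaconda)
print(round(peak_bandwidth(10.4, 288), 1))  # → 374.4, the leak's ~374GB/s (Lockhart)
```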

------------------------------------------------------------------

Some of the recommendations from Microsoft to developers:
- Develop your game with the Xbox Infinite as the lead platform.
- Xbox Infinite Value was built to run even sub-4K Xbox Infinite games in Full HD.
- The Xbox Infinite Value version is allowed to run above 1080p, but it isn't allowed to have better graphical features, higher fidelity, or a higher frame rate than the Xbox Infinite.
- Microsoft recommends using any leftover headroom on the Xbox Infinite Value GPU to increase resolution.
 

LordOfChaos

Member
Why do people act like the SPUs don't exist when comparing the CELL with other cpus?


The thread was about Xenon with the three-core comment, but SPEs wouldn't fare better in "average game code". They note they weren't even getting the 0.2 IPC another dev had mentioned. So over 5x the IPC, let's say 7 usable cores each, at half the clock speed on Jaguar. The main point being that the 7th-to-8th gen transition wasn't treading water CPU-wise, even if it was a smaller leap than before.
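Spelled out, that back-of-the-envelope comparison looks like this (a sketch using the post's own rough ratios, not measured data):

```python
# Rough relative throughput: IPC ratio × core-count ratio × clock ratio.
def relative_throughput(ipc_ratio, core_ratio, clock_ratio):
    return ipc_ratio * core_ratio * clock_ratio

# Jaguar vs the 7th-gen PPC cores per the post: >5x IPC, ~7 usable cores
# on each side, Jaguar at roughly half the clock (1.6GHz vs 3.2GHz).
print(relative_throughput(ipc_ratio=5.0, core_ratio=7 / 7, clock_ratio=0.5))  # → 2.5
```

So even at half the clock, ~5x IPC still nets roughly 2.5x the general-purpose CPU throughput by this estimate.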

The link in that thread provides another MMX-P3 comparison, innocent whistling
 
Last edited:

ethomaz

Banned
Why do people act like the SPUs don't exist when comparing the CELL with other cpus?
To match with the point people are trying to make ;)

Cell vs PS4's Jaguar depends on the task you choose to compare... Cell wins some, PS4's Jaguar wins others.

Overall in games they are pretty close.

The biggest advantage of PS4's Jaguar is being x86_64, which allows easy and fast use for developers, plus a lot of tools that can run natively (the legacy of over 30 years of x86).
 
Last edited:
Why do people act like the SPUs don't exist when comparing the CELL with other cpus?
I'm not sure I get your point. You said Jaguar isn't an upgrade over 7th-gen era PowerPC CPUs and I said this is not true. I even gave a source that proves it, straight from the horse's mouth. Do you disagree with Sebastian's comment?

Why would you compare SPUs with traditional CPU cores? SPUs are not a replacement for traditional CPU horsepower. It's just that Sony had a certain silicon budget and chose to spend it on SIMD stuff (which was a necessity for grand AAA cinematic games like Uncharted 2), whereas Microsoft spent more of its silicon budget on traditional CPU cores.

When people say Jaguar is crap, do they also mention the fact that there's a Radeon GPU for number crunching? Why do people act like GPU compute doesn't exist when comparing the Jaguar with other CPUs ? ;)

You need traditional CPU horsepower to run branchy (AI) code efficiently & quickly. SIMD assistance won't help you there.

The reason consoles never had beastly CPUs (traditional CPUs, not fancy co-processors) is that console games are simpler than PC games, which have far more elaborate AI (strategy games are a prime example of this).

This is no longer true though, since console games also employ quite advanced AI these days (RDR2). And it's gonna get even better next-gen (GTA6), therefore a Zen 2 CPU is needed.

Of course that doesn't mean that the Zen 2 CPU will also process stuff like audio or decompression (like PCs do).
 

Ar¢tos

Member
I'm talking in general, in forums. It's not unusual to see people complaining that the PS4 should have BC with PS3 because "if X1 can emulate the 360's 3 PowerPC cores, there is no excuse for PS4 not to emulate the PS3's single PowerPC core" and other similar comments.
 
I'm talking in general, in forums. It's not unusual to see people complaining that the PS4 should have BC with PS3 because "if X1 can emulate the 360's 3 PowerPC cores, there is no excuse for PS4 not to emulate the PS3's single PowerPC core" and other similar comments.
I'm not going to argue that a Jaguar CPU can emulate 6 SPUs in real time, because that would be silly to suggest. Uncharted 2 is out of the question.

But what about games that don't utilize the SPUs? You know, there are plenty of indie games on PS3 and some 3rd party AAA ones (like RDR1) that could perhaps be emulated on PS4. That's the question that needs to be asked. :)
 

Ar¢tos

Member
I'm not going to argue that a Jaguar CPU can emulate 6 SPUs in real time, because that would be silly to suggest. Uncharted 2 is out of the question.

But what about games that don't utilize the SPUs? You know, there are plenty of indie games on PS3 and some 3rd party AAA ones (like RDR1) that could perhaps be emulated on PS4. That's the question that needs to be asked. :)
Probably because those aren't the games people want to play with BC.
(isn't RDR1 code a massive clusterfuck? That alone should make BC hard for it)
 
Probably because those aren't the games people want to play with BC.
(isn't RDR1 code a massive clusterfuck? That alone should make BC hard for it)
You don't have to touch the source code to emulate a game; at least RPCS3 doesn't do that (and neither does MS).

The source code is needed if you want to make a remaster, which is highly unlikely, even though it would sell like hotcakes.

I'm not sure if the Nvidia GPU could complicate things from a legal perspective. Would Sony need a licence from Nvidia to emulate RSX on PS4/PS5? Their contract was only valid for the PS3, AFAIK.

MS is lucky in a sense, because they used an ATi GPU on XBOX 360 and they still have a partnership with AMD. OG XBOX uses an Nvidia GPU and there are very few emulated OG XBOX games on XBOX ONE, which is very weird if you think about it, considering the fact that both utilize the x86 architecture.

I'm just wondering if the emulation hurdles are more legal than technical...

Either way, it would be really cool if they offered some sort of PS3 BC on PS5. Zen 2 is more than capable for the task.
 
To match with the point people are trying to make ;)

Cell vs PS4's Jaguar depends on the task you choose to compare... Cell wins some, PS4's Jaguar wins others.

Overall in games they are pretty close.
They're not close at all. PPC might have a slight SIMD advantage (debatable if you take into account Seb's FMA comment, not to mention PS4 Pro/XB1X boosted CPU clocks), but for everything else that really matters in regards to game code (AI, branch prediction etc.) they're vastly inferior.

Here's another interesting comment from early 2005 (before even the XBOX 360 was released):


"Gameplay code will get slower and harder to write on the next generation of consoles. Modern CPUs use out-of-order execution, which is there to make crappy code run fast. This was really good for the industry when it happened, although it annoyed many assembly language wizards in Sweden. Xenon and Cell are both in-order chips. What does this mean? It’s cheaper for them to do this. They can drop a lot of cores. One out-of-order core is about four times [did I catch that right? Alice] the size of an in-order core. What does this do to our code? It’s great for grinding on floating point, but for anything else it totally sucks. Rumours from people actually working on these chips – straight-line runs 1/3 to 1/10th the performance at the same clock speed. This sucks."

The "PS3/XBOX 360 had beastly CPUs" meme needs to die. ;)

There's a reason XBOX 360 had 3 CPU cores, while PCs only had 1-2 cores back in 2005. It's not magic, it's a silicon budget trade-off.

Comparing numbers (flops, CPU cores) without context is meaningless and misleading. We can all do better than this. :)

The biggest advantage of PS4’s Jaguar is to be x86_64 that allow easy and fast use for developers plus a lot of tools that can run native (the legacy of over 30 years of x86).
Not really. 8th-gen consoles aimed for 8GB of RAM, therefore 64-bit memory addressing was needed (that was the biggest selling point of AMD64 back in 2003). ARM didn't even have 64-bit CPU cores back then (minus Apple and their exclusive, semi-custom ARM cores).

x86 assembly is a convoluted mess (another myth is that it's "easy" to use) and most game devs don't even bother to write assembly (with very few exceptions). There's a reason compilers and C/C++ exist. :)
 

Evilms

Banned
 
Last edited:

psorcerer

Banned
The reason consoles never had beastly CPUs (traditional CPUs, not fancy co-processors) is that console games are simpler than PC games, which have far more elaborate AI (strategy games are a prime example of this).

You don't know what you're talking about. Sorry. "AI" code needs a fraction of a platform's power if designed correctly.
It's not 4D chess. AI in a computer game needs to provide a feeling of some resistance and then just lie low and play dead.
 
You don't know what you're talking about. Sorry. "AI" code needs a fraction of a platform's power if designed correctly.
It's not 4D chess. AI in a computer game needs to provide a feeling of some resistance and then just lie low and play dead.
Thanks for the compliment. I guess you don't remember how old-school console games used to be (yes, they had very simplistic AI compared to what we have these days).

Care to explain why most current-gen, open world games run at 30 fps?

Are you saying that even RDR2 AI is not designed "correctly"? Maybe Rockstar should hire you, since apparently "you know what you're talking about"? Hmmm...

I'll gladly accept an apology from you, as soon as Cerny shows RDR2 running at rock solid 60 fps thanks to the power of Zen 2 CPU (yes, that will be a big fat bullet point on their upcoming PS5 presentation).
 

psorcerer

Banned
Thanks for the compliment. I guess you don't remember how old-school console games used to be (yes, they had very simplistic AI compared to what we have these days).

Care to explain why most current-gen, open world games run at 30 fps?

Are you saying that even RDR2 AI is not designed "correctly"? Maybe Rockstar should hire you, since apparently "you know what you're talking about"? Hmmm...

I'll gladly accept an apology from you, as soon as Cerny shows RDR2 running at rock solid 60 fps thanks to the power of Zen 2 CPU (yes, that will be a big fat bullet point on their upcoming PS5 presentation).

1. Most console games will always run 30 fps, just because better graphics sells and higher fps - does not. On PC you cannot reliably get 32ms frames, therefore you must always aim for x2 fps just to stay playable.

2. RDR2 is a huge multiplatform game, created by a 1,000-person sweatshop. It cannot possibly be optimized. So they aimed at 60 where they could. And then just downgraded to 30 where they missed the target. Pragmatic.

3. It doesn't prove anything. Probably you don't really understand what I'm talking about.
 

ethomaz

Banned
Why can't it be 40 CUs with 36 active?
You disable 1 CU in each SE.

Navi = 8SE = minimum 8 CU disabled
Vega = 4SE = minimum 4 CU disabled

Anaconda looks like Navi... Lockhart looks like non-Navi... confusing? Everything will be clear in 4 hours.

Nothing related to that fake picture... there are only two Navi 10 cards launching... XT at $499 and Pro at $399.
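Under the one-disabled-CU-per-shader-engine rule claimed above (the poster's assumption about yield harvesting, not confirmed AMD policy), the leaked active-CU counts do line up:

```python
# Active CUs if each shader engine (SE) sacrifices one CU for yield.
def active_cus(total_cus, shader_engines, disabled_per_se=1):
    return total_cus - shader_engines * disabled_per_se

print(active_cus(64, 8))  # → 56, the rumored Anaconda (Navi-like, 8 SE)
print(active_cus(40, 4))  # → 36, the rumored Lockhart (Vega-like, 4 SE)
```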
 
Last edited:

TeamGhobad

Banned
Anaconda specs, if true, are very disappointing.

Also, a 256GB SSD... never thought I would be disappointed by storage space.
 
Last edited:

SonGoku

Member
Actually I wish he had simply stopped spinning; instead he started overblowing every minuscule difference, touting how MS engineered the perfect console. He is just an MS shill, now joined by "Dictator", another MS shill at DF. The only natural person there is John, who does the "Retro" series and usually does the best game analysis of the three.
Tom and John are unbiased.
I agree RL lets his bias interfere with his analysis.
 
1. Most console games will always run 30 fps, just because better graphics sells and higher fps - does not. On PC you cannot reliably get 32ms frames, therefore you must always aim for x2 fps just to stay playable.

2. RDR2 is a huge multiplatform game, created by a 1,000-person sweatshop. It cannot possibly be optimized. So they aimed at 60 where they could. And then just downgraded to 30 where they missed the target. Pragmatic.

3. It doesn't prove anything. Probably you don't really understand what I'm talking about.
No friendo, I'm afraid you are the one who doesn't understand what I'm talking about.

Are you saying that Rockstar (R-O-C-K-S-T-A-R, not Bethesda!) didn't properly optimize their game for current-gen consoles? What the fuck?!

Mentioning "1000s of people" is misleading, since they only have a handful of programmers (just like every other studio). The rest are doing creative work (assets, textures, audio etc.) Now I'm fairly convinced you don't know what the hell you're talking about!

"Better graphics at all costs" is another meme that needs to die ASAP. They're "sweatshops" because many "gamers" are graphics whores, but in this gen you see a lot of people caring about gameplay fluidity (that's why RAGE 2 targets 60 fps instead of 4K).

And no, PCs can reliably get 32 and 16ms frames. Stop the misinformation. There are unoptimized games everywhere (like Bloodborne with its uneven frame pacing). So what?

Oh well, another one joins my list. Not willing to argue with forumers that clearly misconstrue my posts.

And one last thing before we part ways...

That's why the whole discussion about "cpu power" is so stupid.
Right, it's so stupid that Cerny himself announced a Zen 2 CPU for the PS5, way before the official presentation. Maybe they should hire you instead?

Modern gaming machine does not need CPU.
You heard it here first folks, games don't need a CPU, they can just run off of the GPU!

Just remove all CPU traces (including Jaguar) from the APU die and add even more GPU compute units! Brilliant.

their game code which should use maybe 1% of the frame time suddenly uses 50%.
Basically you're saying that modern video games run 50 times slower than they're supposed to.

50 x 30 fps = 1500 fps

and even Lua is "too hard" for them
TIL that modern video game engines use Lua instead of C/C++.
 
Last edited:

CrustyBritches

Gold Member
I'm planning on buying the 12c/24t Ryzen 3000 variant. It would be cool to hit 5GHz, but that's not a deal breaker either way for me. Navi rumors have been all over the map. I'm into mainstream/bang-for-buck GPUs in the $349 and under bracket, preferably under $300. If what the Sapphire rep said is true I'll probably hold off until holiday deals later this year. At this point a new Ryzen CPU and better RAM sound more enticing than letting these companies push the limits on GPU pricing.
 

SonGoku

Member
The "PS3/XBOX 360 had beastly CPUs" meme needs to die. ;)

There's a reason XBOX 360 had 3 CPU cores, while PCs only had 1-2 cores back in 2005. It's not magic, it's a silicon budget trade-off.
It's a trade-off that worked on a closed console system, though. It paid dividends to those willing to optimize their code for in-order, SIMD-focused chips; the Wii U's OoO CPU, supposedly much better at general code, struggled with PS360-optimized games, and many devs spoke out about it.

I'm willing to bet that with the same die budget in 2005, an out-of-order CPU wouldn't have produced as good results on a console.
 

TeamGhobad

Banned
About CPU power.

I heard that Bulldozer has 4 times the IPC over Jaguar. Zen1 has 40% IPC over Bulldozer. Zen2 has 15% IPC over Zen1.
This would give us about 6.5x more power at the same clocks; double the clocks and it's 13x.
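Multiplying the claimed uplifts out reproduces those figures (taking the rumored ratios at face value; the 4x Bulldozer-over-Jaguar figure is disputed in later replies):

```python
# Compounding the claimed per-generation IPC uplifts.
bulldozer_over_jaguar = 4.0   # the post's claim; later replies dispute this
zen1_over_bulldozer = 1.40    # +40% IPC
zen2_over_zen1 = 1.15         # +15% IPC

ipc_gain = bulldozer_over_jaguar * zen1_over_bulldozer * zen2_over_zen1
print(round(ipc_gain, 2))      # → 6.44, the "about 6.5x" at the same clocks
print(round(ipc_gain * 2, 1))  # → 12.9, roughly the "13x" with doubled clocks
```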
 

ethomaz

Banned
About CPU power.

I heard that Bulldozer has 4 times the IPC over Jaguar. Zen1 has 40% IPC over Bulldozer. Zen2 has 15% IPC over Zen1.
This would give us about 6.5x more power at the same clocks; double the clocks and it's 13x.
The biggest performance jumps in CPUs come not from IPC but from instruction sets... SSE, SSE2, SSE3, AVX, etc. All these instructions, when used, make code way faster than the actual IPC increase.
 
Last edited:

LordOfChaos

Member
About CPU power.

I heard that Bulldozer has 4 times the IPC over Jaguar. Zen1 has 40% IPC over Bulldozer. Zen2 has 15% IPC over Zen1.
This would give us about 6.5x more power at the same clocks; double the clocks and it's 13x.

This part is throwing off the equation. It might be surprising, but Jaguar and Bulldozer had comparable IPC; the reason Jaguar wasn't on the high end was that its short pipeline didn't allow it to clock as high. Remember also that a Bulldozer two-"core" module had one FPU.


Maybe you heard Bulldozer had 4x the performance of Jaguar? If you normalize per core and per clock it doesn't hold up
 
Last edited:
It's a trade-off that worked on a closed console system, though. It paid dividends to those willing to optimize their code for in-order, SIMD-focused chips; the Wii U's OoO CPU, supposedly much better at general code, struggled with PS360-optimized games, and many devs spoke out about it.

I'm willing to bet that with the same die budget in 2005, an out-of-order CPU wouldn't have produced as good results on a console.
Yeah, that's because Wii U had a different power balance and it was a GPGPU precursor for modern consoles (that follow the same philosophy if you think about it).

Back in 2013-2014 both PS4 and XBOX ONE faced the exact same struggles with cross-gen game engines that followed the old paradigm.

Personally, I think Cell was useful in an era where the concept of GPU compute didn't exist (GeForce 8800 GTX was released in late 2006 and CUDA came in 2007), but CPU-wise it leaves a lot to be desired. CPUs like Pentium 4 and Athlon 64 were stronger at general code.

About CPU power.

I heard that Bulldozer has 4 times the IPC over Jaguar. Zen1 has 40% IPC over Bulldozer. Zen2 has 15% IPC over Zen1.
This would give us about 6.5x more power at the same clocks; double the clocks and it's 13x.
IIRC, Bulldozer and Jaguar have roughly the same IPC.

The difference is that Jaguar has pipelines with fewer stages, which means lower clocks.

Remember Pentium 4 vs 3? Pentium 4 had up to 30 stages and it clocked much higher, but IPC was abysmal.

That's why Intel ditched P4, went back to the drawing board and got Pentium 3/M as a basis for their Core series.
 
I've heard so many contradictory stories about Bulldozer; why is it so hard to get an accurate comparison?
You can read this post if you want:

 

SonGoku

Member
Yeah, that's because Wii U had a different power balance and it was a GPGPU precursor for modern consoles (that follow the same philosophy if you think about it).
That's my point: the Wii U followed 8th-gen design philosophy and it struggled to run PS360-optimized games.
The PS360 CPUs were the best choice at the time, imo.
 

LordOfChaos

Member
I've heard so many contradicting stories about Bulldozer, why is it so hard to get an accurate comparison?

For one, there's the aforementioned shared FPU: what do you test, integer, where it behaves like two cores, or floating point, where it behaves like one?
It also had major cache and bandwidth issues:
[SiSoft Sandra cache & memory bandwidth chart]


So between the two of those, what do you compare? The best case, or the worst? The results would be significantly different. It was often doodoo but could occasionally compare to the very best.

Then there's the other issue where no one really gave a shit about testing Jaguar on PC lol



Relevant bit from N Negotiator's post:
Jaguar and Piledriver IPC will be in the same ballpark. However, when running these chips at low clocks (<19W), all the transistors spent in the Piledriver design that allow the high clock ceiling are wasted, but all the disadvantages are still present. Thus Piledriver needs more power and more chip area to reach performance similar to Jaguar's. There's no way around this. The Jaguar core has better performance per watt.
 
Last edited:
That's my point: the Wii U followed 8th-gen design philosophy and it struggled to run PS360-optimized games.
The PS360 CPUs were the best choice at the time, imo.
For what they wanted to do, yeah. Jaguar was also the best choice back in 2012-2013, contrary to popular belief.

Consoles are traditionally multimedia-focused machines first and foremost. That's why they put emphasis on special co-processors and SIMD raw power.

But these days the lines have blurred between PCs and consoles, people want better framerates, they want better AI and 7nm lithography will most likely allow us to have our cake and eat it too, with no (severe) CPU compromises for the first time ever.
 

SonGoku

Member
Jaguar was also the best choice back in 2012-2013, contrary to the popular belief.
Definitely! A better CPU would have eaten into the GPU die and power consumption budget.
Jaguar was the perfect choice for Sony: it was much better at general code than the PPE, and the async-compute-optimized GPU would take over any SPE functions. ND devs commented on how working with Cell was similar to working with CUs (GPGPU).
 
Last edited: