
Substance Engine benchmark implies PS4 CPU is faster than Xbox One's

Whatever the number of cores or the clock speeds, we have no real-world evidence that the CPU is faster in the Xbox One.

Multiplatform games favour the PS4 seemingly under all in-game circumstances. This benchmark favours the PS4. If the Xbox One did have a faster CPU, there would likely be a situation in one of the multiplatform games where the Xbox One demonstrated this.

Then we have this benchmark. And a developer confirmation.

Official confirmation of clock speeds from Microsoft tells us nothing about the clock speed of the PS4.

Stick a fork in the theory that the Xbox One has a faster CPU.
 

StratJ

Banned
I try to stay away from these threads because they're simply embarrassing. To start with, this is a process running on one core; of course the i7 is going to absolutely demolish it. The single-core speed of an i7 is miles higher than the single-core speed of a Jaguar. I'd be interested to see the application in real use.

To those who are drawing a line in the sand saying that's it, we all know the PS4 is more powerful in CPU terms than the Xbox: this is one process, one benchmark and one usage, which may be affected by many other factors in the box's hardware. The CPUs are pretty much the same. You're never going to see a big difference from a 0.15GHz increase; they probably did it based on figures from hardware testing and the fact that the box could simply handle it without any issues.

This is one test, one process and one benchmark which provides figures on texture compression for one specific engine on the CPU on ONE core. It should not define the whole CPU.
 
Makes perfect sense to have more CPU power available to match the better GPU and RAM.

I wonder if anyone will still care about multiplatform game comparisons in a few years.
"...and again for the 536th time, the PS4 version performs much better in pretty much every category *yawns*" :D
 

Marlenus

Member
I try to stay away from these threads because they're simply embarrassing. To start with, this is a process running on one core; of course the i7 is going to absolutely demolish it. The single-core speed of an i7 is miles higher than the single-core speed of a Jaguar. I'd be interested to see the application in real use.

To those who are drawing a line in the sand saying that's it, we all know the PS4 is more powerful in CPU terms than the Xbox: this is one process, one benchmark and one usage, which may be affected by many other factors in the box's hardware. The CPUs are pretty much the same. You're never going to see a big difference from a 0.15GHz increase; they probably did it based on figures from hardware testing and the fact that the box could simply handle it without any issues.

This is one test, one process and one benchmark which provides figures on texture compression for one specific engine on the CPU on ONE core. It should not define the whole CPU.

If we were comparing different CPU architectures then I could agree with this, because some designs might be strong in X, Y and Z while other designs are strong at A, B and C. The thing is, both consoles use the same CPU architecture, so if there is a benefit in one area then it strongly implies that there is a benefit in all areas.

I tried to find more information on the benchmark but could not find much, other than that Substance is used by a lot of different game studios, so it is relevant.

TRega, there has never been an official announcement regarding the PS4 CPU clock speed. I was under the impression that the 1.6GHz assumption came about because that is what some dev kits had. Of course, some dev kits also had 4GB of GDDR5, and we know that changed, so assuming the dev kit specifications carried through to production hardware is a bit of a leap, in all honesty.

The difference in performance on this benchmark is 16%; even if it is partially caused by OS overhead or compiler differences, it is unlikely those would account for this much of a gap, considering Substance is used across all gaming platforms.
 

Bundy

Banned
Whatever the number of cores or the clock speeds, we have no real-world evidence that the CPU is faster in the Xbox One.

Multiplatform games favour the PS4 seemingly under all in-game circumstances. This benchmark favours the PS4. If the Xbox One did have a faster CPU, there would likely be a situation in one of the multiplatform games where the Xbox One demonstrated this.

Then we have this benchmark. And a developer confirmation.

Official confirmation of clock speeds from Microsoft tells us nothing about the clock speed of the PS4.
Well, we have this:
Yes, you can get more out of the PS4's CPU than you can the Xbox's.

Well, I'm actually not surprised ;)

Stick a fork in the theory that the Xbox One has a faster CPU.
Yep!
 

RVinP

Unconfirmed Member
The difference between the two consoles is just 2MB/s (from the benchmark); hardly worth mentioning that the PS4 CPU is faster than the Xbox One CPU for any currently released or upcoming games.

It's the GPU that matters for the consoles, the CPU not so much.
 

StratJ

Banned
If we were comparing different CPU architectures then I could agree with this, because some designs might be strong in X, Y and Z while other designs are strong at A, B and C. The thing is, both consoles use the same CPU architecture, so if there is a benefit in one area then it strongly implies that there is a benefit in all areas.
Exactly, the only thing is, we don't know what the test is doing. It could be streaming assets from RAM or flushing everything out after results have been compressed.

There are so many unknown factors around this test. I'd be intrigued if anyone on here has worked with this engine and knows exactly what it's doing here.

Definitely interesting nevertheless.
 

Bundy

Banned
The difference between the two consoles is just 2MB/s (from the benchmark); hardly worth mentioning that the PS4 CPU is faster than the Xbox One CPU for any currently released or upcoming games.

It's the GPU that matters for the consoles, the CPU not so much.
It is worth mentioning, because Penello and Major Nelson were spreading BS with their "we have a faster CPU / 10% more CPU" quotes.
 

onQ123

Member
The difference between the two consoles is just 2MB/s (from the benchmark); hardly worth mentioning that the PS4 CPU is faster than the Xbox One CPU for any currently released or upcoming games.

It's the GPU that matters for the consoles, the CPU not so much.

2MB/s more than 12MB/s is a 16-17% advantage.
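For anyone checking the arithmetic, here it is as a couple of lines of Python (the 12 and 14 MB/s per-core figures are the ones from the benchmark in the OP):

```python
# Per-core texture-compression throughput from the benchmark.
xb1_rate = 12.0  # MB/s per core, Xbox One
ps4_rate = 14.0  # MB/s per core, PS4

advantage = (ps4_rate - xb1_rate) / xb1_rate * 100
print(f"PS4 per-core advantage: {advantage:.1f}%")  # -> 16.7%
```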
 

StratJ

Banned
Go back to p2 - Matt's a multiplat developer, so presumably knows his onions.

No offence to the chap, but I'd like to understand what he's worked on and is working on before believing him flat-out.

Since he's got an original username though, that speaks volumes.
 

Bundy

Banned
No offence to the chap, but I'd like to understand what he's worked on and is working on before believing him flat-out.
Since he's got an original username though, that speaks volumes.
Matt is a developer well vetted by the mods.

It's funny how the Xbox is behind in literally every category now.
 

Marlenus

Member
Exactly, the only thing is, we don't know what the test is doing. It could be streaming assets from RAM or flushing everything out after results have been compressed.

There are so many unknown factors around this test. I'd be intrigued if anyone on here has worked with this engine and knows exactly what it's doing here.

Definitely interesting nevertheless.

Substance is integrated by default into Unity, UE3, Maya, Modo and 3DS Max, so it is pretty widespread.

Also, each 'substance' uses only a few KB of memory, so it is very small and you can get quite far with the 1MB of L2 cache on each Jaguar module.

The test also refers to 1 CPU, which to me means one core. That would suggest the PS4 CPU is clocked higher than the Xbox One's.

What we really need is for someone to hack the BIOS to see the CPU FSB and multiplier values; then we will know the actual clock speed.
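In the meantime, here's a back-of-envelope sketch of what the clock would have to be if the test really is single-core and throughput scales linearly with clock speed (both big assumptions; the 1.75GHz figure is Microsoft's announced clock, and the PS4 number that falls out is pure speculation, not a spec):

```python
# Assumption: single-core test whose throughput scales linearly with
# clock speed. Invert the ratio to get the clock the PS4 would need.
xb1_clock = 1.75                   # GHz, officially announced by Microsoft
xb1_rate, ps4_rate = 12.0, 14.0    # MB/s per core from the benchmark

implied_ps4_clock = xb1_clock * ps4_rate / xb1_rate
print(f"Implied PS4 clock: {implied_ps4_clock:.2f} GHz")  # ~2.04 GHz
# If ~2GHz sounds too high, then something other than clock speed
# (OS reservation, compiler, memory) is contributing to the gap.
```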
 

StratJ

Banned
Matt is a developer well vetted by the mods.

It's funny how the Xbox is behind in literally every category now.

Sure, man, how about he flies you over to his development studio and gives you a tour?

Because I'm critically questioning information and people I know nothing about? If he's a professional figure, how about a LinkedIn page?

Let's be honest, not many people in the world are going to be concurrently working on both systems. Even if you're working on a multi-platform next-gen team, you'll be assigned to one console.

Substance is integrated by default into Unity, UE3, Maya, Modo and 3DS Max, so it is pretty widespread.

Also, each 'substance' uses only a few KB of memory, so it is very small and you can get quite far with the 1MB of L2 cache on each Jaguar module.

The test also refers to 1 CPU, which to me means one core. That would suggest the PS4 CPU is clocked higher than the Xbox One's.

What we really need is for someone to hack the BIOS to see the CPU FSB and multiplier values; then we will know the actual clock speed.

Cheers for the clarification there. Definitely interesting stuff. You would have thought Sony would have publicly spoken about their clock speed after the PR release from MS on theirs.

Although you mentioned 1MB of cache on the CPUs, this is referring to texture compression. Surely this benchmark will be streaming and storing some assets in main RAM?
 

Skeff

Member
Because I'm critically questioning information and people I know nothing about? If he's a professional figure, how about a LinkedIn page?

Let's be honest, not many people in the world are going to be concurrently working on both systems. Even if you're working on a multi-platform next-gen team, you'll be assigned to one console.



Cheers for the clarification there. Definitely interesting stuff. You would have thought Sony would have publicly spoken about their clock speed after the PR release from MS on theirs.

Although you mentioned 1MB of cache on the CPUs, this is referring to texture compression. Surely this benchmark will be streaming and storing some assets in main RAM?

You do realize developers could be fired for saying PS4 CPU > XB1 CPU, right? That's breaking at least two NDAs.
 

Argyle

Member
Because I'm critically questioning information and people I know nothing about? If he's a professional figure, how about a LinkedIn page?

Let's be honest, not many people in the world are going to be concurrently working on both systems. Even if you're working on a multi-platform next-gen team, you'll be assigned to one console.

Why would he post his LinkedIn? He's presumably under NDA to even have access to both consoles.

Also, that's not entirely true; I know I've had every dev kit on my desk when working on a multiplatform game.
 

Hurley

Member
Because I'm critically questioning information and people I know nothing about? If he's a professional figure, how about a LinkedIn page?

I dunno, how about he might be under an NDA of sorts?
-edit- beaten, twice.

Let's be honest, not many people in the world are going to be concurrently working on both systems. Even if you're working on a multi-platform next-gen team, you'll be assigned to one console.

Let's be honest, if he's a multiplat dev then chances are he's worked with both machines.
 

RVinP

Unconfirmed Member
It is worth mentioning, because Penello and Major Nelson were spreading BS with their "we have a faster CPU / 10% more CPU" quotes.

2MB/s more than 12MB/s is a 16-17% advantage.

- Crucify the PR statement <- true
- Technically citing CPU differences between the consoles <- practically irrelevant

Irrelevant because we all already know the PS4 can accommodate more visual processing (i.e. a bit higher resolution, a few more visual effects, etc.), which requires a bit more CPU power than running games at Xbox One-level visual settings.

That 2MB/s (benchmark) gap will close up when more visuals are displayed on the PS4's output than on the Xbox One's. (Please correct me if I am wrong in this context.)

And GPU compute was also given the spotlight, which takes the CPU's slight edge further away from what's supposed to be the centre of attention.

Is the same 'Substance Engine benchmark' freely available for public download (PC version)?
 

Hubble

Member
Because I'm critically questioning information and people I know nothing about? If he's a professional figure, how about a LinkedIn page?

Let's be honest, not many people in the world are going to be concurrently working on both systems. Even if you're working on a multi-platform next-gen team, you'll be assigned to one console.



Cheers for the clarification there. Definitely interesting stuff. You would have thought Sony would have publicly spoken about their clock speed after the PR release from MS on theirs.

Although you mentioned 1MB of cache on the CPUs, this is referring to texture compression. Surely this benchmark will be streaming and storing some assets in main RAM?

To retain insider access, which GAF is well known for, you need some privacy, especially from your employers; the gaming industry is known for tough dismissals and turnover, so a lot of insider information here is anonymous. Matt is vetted by the mods; that's the mods' job, not his or anyone else's here.

And taking everything in context, it's clear MS did not plan the Xbox One hardware well. Reports from insiders a year or two ago indicated poorly optimized developer kits with tools behind the PS4's, rushed hardware improvements, Eurogamer breaking news on the ESRAM 'optimization' and its magic upclock (absolutely ridiculous to read that the engineers did not realize their peak speeds until that recent article), the big-wow 53MHz GPU upgrade, which is meaningless, etc., and now this, which is no surprise. Even Pachter on his show a week or two ago said he can't believe Microsoft screwed up on the GDDR5.

You really are pulling at straws because you don't want to believe it.
 

StratJ

Banned
You do realize developers could be fired for saying PS4 CPU > XB1 CPU, right? That's breaking at least two NDAs.

Why would he post his LinkedIn? He's presumably under NDA to even have access to both consoles.

Also, that's not entirely true; I know I've had every dev kit on my desk when working on a multiplatform game.

I dunno, how about he might be under an NDA of sorts?



Let's be honest, if he's a multiplat dev then chances are he's worked with both machines.
Jesus, I'm not on a warpath to disprove this guy's claims. They're most definitely true. To be honest, I'm more interested in what he'd be working on than in proof of his credibility.

Let's be honest, the benchmark details are very sparse. I'm definitely interested in learning more about it and about future benchmarks to come out of the consoles. As per the very obvious points you made above, there are strict NDAs.

Also, you'd be surprised how many professionals and corporations read social/online material to base their decisions on. You'd also be surprised how many colleagues know each other's internet aliases.
 
Because I'm critically questioning information and people I know nothing about? If he's a professional figure, how about a LinkedIn page?

Let's be honest, not many people in the world are going to be concurrently working on both systems. Even if you're working on a multi-platform next-gen team, you'll be assigned to one console.

You're really, really grasping at straws here.

1) Every insider on GAF is vetted by Bish, a developer himself. Pretty sure he does a better job at that than you'd ever do.

2) So you want him to be fired?

3) Let's say you're right about there being completely different programming teams for each version (even though it's bullshit), do you think each of those teams would be locked in separate basements or something? No, they'd be working side by side and talking to each other.
 

onQ123

Member
- Crucify the PR statement <- true
- Technically citing CPU differences between the consoles <- practically irrelevant

Irrelevant because we all already know the PS4 can accommodate more visual processing (i.e. a bit higher resolution, a few more visual effects, etc.), which requires a bit more CPU power than running games at Xbox One-level visual settings.

That 2MB/s (benchmark) gap will close up when more visuals are displayed on the PS4's output than on the Xbox One's. (Please correct me if I am wrong in this context.)

And GPU compute was also given the spotlight, which takes the CPU's slight edge further away from what's supposed to be the centre of attention.

Is the same 'Substance Engine benchmark' freely available for public download (PC version)?

The biggest point is the fact that we have gone from thinking that the Xbox One had a ~10% CPU advantage to seeing the PS4 CPU perform ~17% better in a benchmark and hearing from a dev that you can get more out of the PS4 CPU than the Xbox One CPU.

That's close to a 30% swing in relative CPU performance compared to what we thought it had, or it's just that the Xbox One CPU is performing a lot worse than we thought.
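To show where "close to 30%" comes from, a quick sketch (the 10% figure is the old Penello/Major Nelson claim, the 14/12 ratio is from this benchmark):

```python
# The swing combines the old assumption (XB1 10% faster) with the
# observed result (PS4 ~16.7% faster).
expected_xb1_advantage = 1.10         # the "10% more CPU" claim
observed_ps4_advantage = 14.0 / 12.0  # ~1.167 from the benchmark

swing = expected_xb1_advantage * observed_ps4_advantage
print(f"Relative swing: {(swing - 1) * 100:.0f}%")  # ~28%, close to 30%
```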
 
So let me get this straight:
PS4 | Xbox One
7 free cores | 6 free cores
14 MB/s per core | 12 MB/s per core
98 MB/s available | 72 MB/s available
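A quick sanity check of that table, with the caveat that the 7-vs-6 free-core split is this thread's assumption rather than a confirmed spec:

```python
# Aggregate throughput = free cores x per-core rate (MB/s).
ps4_total = 7 * 14.0   # 98 MB/s
xb1_total = 6 * 12.0   # 72 MB/s

print(f"PS4 aggregate advantage: {(ps4_total / xb1_total - 1) * 100:.0f}%")
# -> 36%
```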

So basically PS4:
CPU: 36% advantage
GPU: 50% advantage
(More GPGPU power)
Faster RAM
hUMA
Easier to program for (no ESRAM bottleneck)

If this is true... ESRAM was a mistake.

or it's just that the Xbox One CPU is performing a lot worse than we thought.

It is within expectations.
 

kitch9

Banned
I try to stay away from these threads because they're simply embarrassing. To start with, this is a process running on one core; of course the i7 is going to absolutely demolish it. The single-core speed of an i7 is miles higher than the single-core speed of a Jaguar. I'd be interested to see the application in real use.

To those who are drawing a line in the sand saying that's it, we all know the PS4 is more powerful in CPU terms than the Xbox: this is one process, one benchmark and one usage, which may be affected by many other factors in the box's hardware. The CPUs are pretty much the same. You're never going to see a big difference from a 0.15GHz increase; they probably did it based on figures from hardware testing and the fact that the box could simply handle it without any issues.

This is one test, one process and one benchmark which provides figures on texture compression for one specific engine on the CPU on ONE core. It should not define the whole CPU.

I'm almost embarrassed for you.
 

coldfoot

Banned
Same place we heard that XBone reserves two — the rumor mill. Of course, as I pointed out, the mill was clearly wrong about the PS4's clock, so it's possible it's wrong about this as well.
There is credible evidence from the Killzone and Battlefield presentations that the PS4 has 6 cores available. The 7-cores-available figure is tales from my ass and ONLY came about after this benchmark.

Can I ask again how you're determining that? Both the iPad 2 and iPhone 5 are dual-core machines, with the latter on a newer architecture and a 30% higher clock. How does a 66% performance advantage for the iPhone 5 show us the test is per-core rather than per-CPU?
That said, comparing the iPhone 5 to the Tegra 4 would seem to indicate a per-core test. That, or the quad-core Tegra 4 is really shitty at this test when compared to the dual-core A6. lol
I meant to say that comparing the iOS stuff to OTHER CPUs confirms that, just like you confirmed. Jaguar is weak compared to Intel, but it's still way faster than ARM.
 
If this is true... ESRAM was a mistake.
ESRAM was not a mistake; it was a solution to the limited DDR3 bandwidth problem. If they had gone for GDDR5 or a similar alternative from the get-go, there would have been no need for ESRAM and other workarounds, there would be more room for the GPU, and the system would be easier to use.

It's easy to say that now, obviously, and I'm sure there was some internal debate at some point that ended with them going for DDR3. In the end, they made the wrong decision.
 

Fafalada

Fafracer forever
StratJ said:
Let's be honest, not many people in the world are going to be concurrently working on both systems.
You don't need to be concurrently working on systems to know their relative performance. Especially within your codebase.

Surely this benchmark will be streaming and storing some assets in main RAM?
He means that the working set of the algorithm fits mostly in 1MB, not that it never reads/writes outside that. I.e. the code rarely waits for memory and is likely compute-bound, which would also mean it scales well with CPU clock speed.
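To make the compute-bound vs memory-bound distinction concrete, here's a minimal Python/NumPy sketch (nothing to do with Substance itself, and the absolute numbers depend entirely on your CPU; only the shape of the curve matters):

```python
import time
import numpy as np

# Sweep working-set sizes: small sets stay cache-resident, large ones
# spill to main memory and become bandwidth-bound.
for size_kb in (256, 512, 1024, 8192, 65536):
    data = np.ones(size_kb * 1024 // 8)   # float64 working set
    start = time.perf_counter()
    for _ in range(100):
        data.sum()                        # stream over the working set
    elapsed = time.perf_counter() - start
    gbytes = data.nbytes * 100 / 1e9
    print(f"{size_kb:>6} KB: {gbytes / elapsed:.1f} GB/s")

# Once the working set spills past the last-level cache, throughput
# drops to RAM bandwidth and clock speed matters much less; a set that
# fits in cache keeps the core compute-bound, as described above.
```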
 
ESRAM was not a mistake; it was a solution to the limited DDR3 bandwidth problem. If they had gone for GDDR5 or a similar alternative from the get-go, there would have been no need for ESRAM and other workarounds, there would be more room for the GPU, and the system would be easier to use.

It's easy to say that now, obviously, and I'm sure there was some internal debate at some point that ended with them going for DDR3. In the end, they made the wrong decision.
My favourite hypothetical: if Sony had launched with 4 gigs, would it still have been a wrong decision? Probably not. But it's moot now.
 
ESRAM was not a mistake; it was a solution to the limited DDR3 bandwidth problem. If they had gone for GDDR5 or a similar alternative from the get-go, there would have been no need for ESRAM and other workarounds, there would be more room for the GPU, and the system would be easier to use.

It's easy to say that now, obviously, and I'm sure there was some internal debate at some point that ended with them going for DDR3. In the end, they made the wrong decision.

It is really unfortunate for MS that the decision to go with cheap DDR3 memory has had a gimping domino effect on everything else in the system.

Because of the low bandwidth of DDR3, they had to make space on the chip for the ESRAM. As a result of the ESRAM taking up die space, the GPU had to have fewer transistors(?) and couldn't be as fast as the PS4's. And because the GPU is in a lower processing-power bracket, it made sense to couple it with a slower CPU.

And all this had the merry effect of increasing costs! So choosing DDR3 has been a disaster. But all of this is only really a sticking point if you know exactly what the PS4 is packing inside comparatively, and how it is technologically superior in just about every way.
 
So let me get this straight:
PS4 | Xbox One
7 free cores | 6 free cores
14 MB/s per core | 12 MB/s per core
98 MB/s available | 72 MB/s available

So basically PS4:
CPU: 36% advantage
GPU: 50% advantage
(More GPGPU power)
Faster RAM
hUMA
Easier to program for (no ESRAM bottleneck)

If this is true... ESRAM was a mistake.

ESRAM is the band-aid, not the mistake. DDR3 is the mistake. The Xbone was absolutely not designed around ESRAM; that's in there to make up for the vastly slower main pool of RAM.

Edit: the post above, like Clarissa and Cassandra before it, pretty much explains it all
 
My favourite hypothetical: if Sony had launched with 4 gigs, would it still have been a wrong decision? Probably not. But it's moot now.

If Sony had gone with 4GB they would have been a laughing stock.

I almost wonder if MS got hold of a 4GB PS4 dev kit and figured that was close to the spec of the final machine. They decided to launch with 8GB (albeit cheaper/slower RAM, but much more of it) to out-spec the PS4 by a good deal.

Then the PS4 RAM announcement...
 

CoG

Member
Hmmm... I wonder who suggested the ESRAM solution, the almighty MS engineers?

They were probably pretty proud of themselves, as it's an "efficient" solution, but it does not really change the fact that their choice of RAM has crippled the console. The fact that a game as dumb-simple as KI cannot run at 1080p does not bode well for that console's future titles.
 

Skeff

Member
Hmmm... I wonder who suggested the ESRAM solution, the almighty MS engineers?

Likely somebody who knew they were making the best out of a bad situation because their boss wanted TV TV SPORTS TV.

EDIT: I think the MS engineers have done a very good job with the XB1 hardware, considering the goals and restrictions from the management.
 

Kleegamefan

K. LEE GAIDEN
You don't need to be concurrently working on systems to know their relative performance. Especially within your codebase.


He means that the working set of the algorithm fits mostly in 1MB, not that it never reads/writes outside that. I.e. the code rarely waits for memory and is likely compute-bound, which would also mean it scales well with CPU clock speed.

StratJ, I'll preemptively save you some embarrassment and let you know that both Faf and Argyle are devs who have been posting on GAF for over a decade now. So perhaps you should choose the words in your next post carefully.
 

androvsky

Member
Honestly, the MS and AMD engineers did a pretty good job if they were told in 2010 that the system had to have 8GB of RAM. I'm not sure what else they could have done at that point. IIRC, eDRAM was having process-shrink issues and GDDR5 obviously didn't have the capacity.

What else could they have done? Split memory, which gave PS3 devs so much grief?
 

Bossofman

Neo Member
Considering that the CPUs are darn near identical, there are only a few ways this developer could say you can get "more" out of the PS4's. Either it is clocked higher or it has an extra core to work with; there really aren't many other possibilities.
 
There is credible evidence from the Killzone and Battlefield presentations that the PS4 has 6 cores available.
I'll agree that the KZ presentation is fairly damning, but again, I'll point out that demo was created well in advance of hardware finalization.

WRT BF, as I said, the fact that it runs on six cores only implies that one of the two platforms is limited to six cores, as it seems unlikely they would split their code into six threads for one platform and seven threads for the other, especially with a launch game. Why go through the extra work if you're already getting 117% performance from the same six cores anyway?

The 7-cores-available figure is tales from my ass and ONLY came about after this benchmark.
Actually, GopherD said Sony were shooting for "less than one core" long ago. I can't find the original post (likely archived by now), but here are a bunch of people talking about it.

Edit: Oh, here you go. :)

I meant to say that comparing the iOS stuff to OTHER CPUs confirms that, just like you confirmed. Jaguar is weak compared to Intel, but it's still way faster than ARM.
Gotcha. Thanks. <3
 