
Frostbite Technical Director on why Frostbite never came to Wii U

fmpanda

Member
Yeah, and Max Payne couldn't have been done on GameCube. Some developers just want every platform to be the same and have no interest in optimizing for unique architectures. "We ran some tests on a different engine" is hardly even an attempt to seriously answer this question about Frostbite 3.

The really sad thing is that Nintendo hasn't shown up with anything to prove them wrong about the system's capabilities. Either it really is a lump of shit, or even the system's creators aren't interested in pushing the metal.

I would go with the bolded, especially as the console isn't selling right now. Maybe we'll get an EA title or two during the next 5 years, but no one is going to dedicate one ounce of effort as long as the Wii U doesn't sell.
 
It's worse at SIMD. Otherwise it's actually quite good.

No, he says:

Performance varies by workload, but I'm willing to bet that they're similar at integer workloads and the Cortex-A9 definitely has more SIMD oomph thanks to NEON

So Espresso's vaunted integer power is similar to an 18 month old phone CPU, and definitely worse for SIMD. Nothing about that sounds "quite good". The Ouya literally has a faster CPU than the WiiU. Think about that for a while.
 

Daingurse

Member
Yeah, and Max Payne couldn't have been done on GameCube. Some developers just want every platform to be the same and have no interest in optimizing for unique architectures. "We ran some tests on a different engine" is hardly even an attempt to seriously answer this question about Frostbite 3.

The really sad thing is that Nintendo hasn't shown up with anything to prove them wrong about the system's capabilities. Either it really is a lump of shit, or even the system's creators aren't interested in pushing the metal.

If having a unique architecture brings only problems and no benefits, why the hell would they want that? That's the problem: Nintendo didn't give a damn about what third parties want. The only people the console caters to are Nintendo's own in-house devs. A lower tech ceiling is useful when all the teams have gotten used to GameCube-level assets for nearly a decade.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
No, he says:



So Espresso's vaunted integer power is similar to an 18 month old phone CPU, and definitely worse for SIMD. Nothing about that sounds "quite good". The Ouya literally has a faster CPU than the WiiU. Think about that for a while.
Did you somehow entirely miss the context of the discussion? Espresso was being compared to Xenon, supposedly unfavorably, which was used as a justification for FB's absence from the WiiU. What marcan said about Espresso compared to A9 has zilch to do with the discussion. He did say something about Xenon, though, which you somehow entirely discarded. I wonder why.
 
Fail0verflow has something interesting to say about the CPU



The guy who presumably cracked the WiiU.

I guess we can all stop blaming the CPU.

Isn't he talking about integer workloads?
As far as I know, linear algebra math in games is done with floating-point data types.
And linear algebra is perfect for SIMD optimization, which the 360 cores have dedicated hardware for (VMX128, I believe it was called).
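For anyone wondering what that means in practice, here's a minimal sketch (plain C++ with x86 SSE intrinsics, purely as an illustration; the idea is the same for Xenon's VMX128 or the A9's NEON). The SIMD version does the same four-float add with a single vector instruction, which is why vector-heavy game math leans so hard on SIMD width:

#include <xmmintrin.h>  // SSE intrinsics; illustrative stand-in for VMX128/NEON

// Scalar version: four separate floating-point adds.
void add_scalar(const float a[4], const float b[4], float out[4]) {
    for (int i = 0; i < 4; ++i)
        out[i] = a[i] + b[i];
}

// SIMD version: one 128-bit instruction adds all four lanes at once.
void add_simd(const float a[4], const float b[4], float out[4]) {
    __m128 va = _mm_loadu_ps(a);
    __m128 vb = _mm_loadu_ps(b);
    _mm_storeu_ps(out, _mm_add_ps(va, vb));
}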
 

wsippel

Banned
So Espresso's vaunted integer power is similar to an 18 month old phone CPU, and definitely worse for SIMD. Nothing about that sounds "quite good". The Ouya literally has a faster CPU than the WiiU. Think about that for a while.
AMD E350 1600 (Bobcat): CoreMark 10987.00 (2:PThreads, 1.6GHz)
Samsung Exynos4 (Cortex A9): CoreMark 22243.00 (4:PThreads, 1.4GHz)

Even PS4 and Xbox720 will probably have a hard time competing with the Cortex A9 in certain areas. If Espresso could indeed keep up with a quad core 1.6GHz A9 (I kinda doubt it), it would be surprisingly close to the other next generation systems in terms of CPU performance for everything that isn't SIMD.

EDIT: Somehow assumed the PS4 and 720 used quad core CPUs.
 
No, he says:



So Espresso's vaunted integer power is similar to an 18 month old phone CPU, and definitely worse for SIMD. Nothing about that sounds "quite good". The Ouya literally has a faster CPU than the WiiU. Think about that for a while.

Why do you keep bringing up the fact that it's a CPU used in phones? Are you unaware the Jaguar used in the PS4/720 is also a mobile processor?
 
Why do you keep bringing up the fact that it's a CPU used in phones? Are you unaware the Jaguar used in the PS4/720 is also a mobile processor?

But they're not. If you put an 8 core Jaguar in a tablet it would run for about 30 minutes off a battery. It's twice as big as anything anyone is planning to put in a laptop or tablet. And in both cases we're talking about devices significantly larger than an iPad. A 4 core Cortex-A9, by contrast, can run all day long on a tiny phone battery. The PS4 and Durango CPUs are 4-8 times as powerful.

blu said:
Did you somehow entirely miss the context of the discussion? Espresso was being compared to Xenon, supposedly unfavorably, which was used as a justification for FB's absence from the WiiU. What marcan said about Espresso compared to A9 has zilch to do with the discussion. He did say something about Xenon, though, which you somehow entirely discarded. I wonder why.

You seem to have lost the thread as well. And he said "per clock" which we already know, but Xenon is also clocked 3 times higher than Espresso. And a marginal advantage in integer performance over a 7 year old design is a hollow victory when we're talking about games skipping the platform because it still isn't fast enough for what the games are actually doing on the CPU using SIMD.

AMD E350 1600 (Bobcat): CoreMark 10987.00 (2:PThreads, 1.6GHz)
Samsung Exynos4 (Cortex A9): CoreMark 22243.00 (4:PThreads, 1.4GHz)

Even PS4 and Xbox720 will probably have a hard time competing with the Cortex A9 in certain areas. If Espresso could indeed keep up with a quad core 1.6GHz A9 (I kinda doubt it), it would be surprisingly close to the other next generation systems in terms of CPU performance for everything that isn't SIMD.

You realize the PS4/Durango CPU has 4 times as many cores and a SIMD unit twice as wide as that Bobcat, right? Not to mention other efficiency improvements...
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
You seem to have lost the thread as well. And he said "per clock" which we already know, but Xenon is also clocked 3 times higher than Espresso.
When a CPU is largely more efficient per clock, clock differences of the magnitude of Xenon/Espresso (which apropos is 2.58, not 3), will not help you much. Why do you think Intel dropped their 4GHz P4s in favor of ~3x slower clock designs?

And a marginal advantage in integer performance over a 7 year old design is a hollow victory when we're talking about games skipping the platform because it still isn't fast enough for what the games are actually doing on the CPU using SIMD.
So basically Sony and MS should not bother with their yet-to-be-released consoles as they will not be fast enough in SIMD compared to the 7-year-old Cell and ergo we will never see FB on those platforms? Gotcha.
 

wsippel

Banned
You realize the PS4/Durango CPU has 4 times as many cores and a SIMD unit twice as wide as that Bobcat, right? Not to mention other efficiency improvements...
Sorry, I somehow thought they both used quad core Jaguars. Still, that makes those things roughly twice as powerful as "18 months old smartphone CPUs" - well, 24 months by the time the systems come out. Yay I guess...?
 
And he said "per clock" which we already know, but Xenon is also clocked 3 times higher than Espresso. And a marginal advantage in integer performance over a 7 year old design is a hollow victory when we're talking about games skipping the platform because it still isn't fast enough for what the games are actually doing on the CPU using SIMD.

Bingo.

The only logical inference from the comparison is that unless the Xenon is 3 times slower than the A9 per clock, the XBOX 360 is still faster than the Wii U in terms of CPU power. And I SERIOUSLY doubt the 2007 PPC per clock power is 1/3 the raw computational power of the 2012 A9. ESPECIALLY for floating point calculations, which are what games use most extensively.
 
When a CPU is largely more efficient per clock, clock differences of the magnitude of Xenon/Espresso (which apropos is 2.58, not 3), will not help you much. Why do you think Intel dropped their 4GHz P4s in favor of ~3x slower clock designs?

For thermal reasons and for efficiency reasons. There's a clockspeed cap on things, beyond 3GHz things start to get really complicated. So once you hit that clockspeed on the die you need to think about architectural enhancements that will yield better power per clock. The P4 simply could not go higher on clockspeed - you were never going to see a 5GHz P4.

And just because they went to a newer and more efficient architecture doesn't mean it was automatically faster. If they couldn't have achieved good yields on higher clockspeeds, they could have very well ended up with processors that were actually slower despite being more efficient.

You see a case of this in GPUs and CPUs often. When AMD first launched its Phenom line, for example. Or when the first ATI X1000 and HD2000 series cards were launched.

The math is simple. Clockspeed x Ops per clock = Power. Just because you have more Ops per clock doesn't mean you are more powerful.
 
Sorry, I somehow thought they both used quad core Jaguars. Still, that makes those things roughly twice as powerful as "18 months old smartphone CPUs" - well, 24 months by the time the systems come out. Yay I guess...?

You didn't double the figure enough times. At least 4 times, as much as 8 times.

blu said:
So basically Sony and MS should not bother with their yet-to-be-released consoles as they will not be fast enough in SIMD compared to the 7-year-old Cell and ergo we will never see FB on those platforms? Gotcha.

Both Durango and PS4 are vastly more capable of using GPU compute performance for heavy vector workloads than WiiU, by virtue of employing the far more performant GCN GPU architecture, having far more closely coupled HSA designs, and having vastly more GPU capacity to exploit. And in any case both match Xenon's vector performance already which should mean Frostbite isn't going to have any trouble.
 

wsippel

Banned
Bingo.

The only logical inference from the comparison is that unless the Xenon is 3 times slower than the A9 per clock, the XBOX 360 is still faster than the Wii U in terms of CPU power. And I SERIOUSLY doubt the 2007 PPC per clock power is 1/3 the raw computational power of the 2012 A9. ESPECIALLY for floating point calculations, which are what games use most extensively.
From what I've seen, PPEs have the worst real world IPC since NetBurst.

EDIT: Found it. This is comparing only a single core:

PPE CoreMark/MHz: 0.97
Cortex A9 CoreMark/MHz: 3.97

So there you go. The smartphone CPUs of today are not three, but four times as powerful as Xenon clock-for-clock. Actually, the A15 is even more efficient at 4.68 CoreMark/MHz.


You didn't double the figure enough times. At least 4 times, as much as 8 times.
I think you didn't read the benchmarks correctly. It's comparing two higher clocked AMD cores to four lower clocked A9 cores, and the ARM comes out more than twice as fast. At the same number of cores, the A9 easily beats Bobcat clock-for-clock.
 

Hoo-doo

Banned
It ran like ass ported over, and it's not remotely worth the money investment to get it running up to speed with the other versions seeing as there really isn't a worthwhile install base.

Not that hard. EA are running a business, not a charity.
I'm enjoying the people who honestly think these two gigantic companies are having a "fight" and will withhold software support just out of spite. I mean, really?
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
You didn't double the figure enough times. At least 4 times, as much as 8 times.
How did you arrive at those numbers?

Both Durango and PS4 are vastly more capable of using GPU compute performance for heavy vector workloads than WiiU, by virtue of employing the far more performant GCN GPU architecture, having far more closely coupled HSA designs, and having vastly more GPU capacity to exploit. And in any case both match Xenon's vector performance already which should mean Frostbite isn't going to have any trouble.
Sure. Only that all those things you mentioned are relevant to new software. Software originally written with Cell SIMD performance in mind will never run on ps4. Cue the incapability to beat 7-year-old designs, accusations of incompetence, blah blah.

As regards FB's running on Xenon implying FB will not have any trouble running on 8 Jaguars: are you sure nothing in FB@Xenon relies on the single-thread performance of that CPU?

For thermal reasons and for efficiency reasons. There's a clockspeed cap on things, beyond 3GHz things start to get really complicated. So once you hit that clockspeed on the die you need to think about architectural enhancements that will yield better power per clock. The P4 simply could not go higher on clockspeed - you were never going to see a 5GHz P4.
Why would Intel need to release a 5GHz CPU if the P4@4GHz outperformed everybody else? Oh wait, it didn't. Why? Because their IPC was so piss-poor that to stay competitive they needed to upclock to speeds they could not viably achieve. Ergo the change of philosophy.

The math is simple. Clockspeed x Ops per clock = Power. Just because you have more Ops per clock doesn't mean you are more powerful.
Did you read what you quoted when you decided to reply to my post?

From what I've seen, PPEs have the worst real world IPC since NetBurst.

EDIT: Found it. This is comparing only a single core:

PPE CoreMark/MHz: 0.97
Cortex A9 CoreMark/MHz: 3.97

So there you go. The smartphone CPUs of today are not three, but four times as powerful as Xenon clock-for-clock. Actually, the A15 is even more efficient at 4.68 CoreMark/MHz.
Good find.
 

spekkeh

Banned
Nintendo's first party titles have never been about 'pushing the metal'. If you're looking to Nintendo to "prove" 3rd party devs are lazy, you'll be looking a long time. That's just not what Nintendo is about when it comes to their own games.

Metroid Prime on NGC and Super Mario Galaxy on Wii are some of the most beautiful games of the respective platforms. Although I guess it's a bit of an unwinnable situation for Nintendo. Make games that look too pretty -> people will only buy AAA Nintendo games, we're not going to develop for it. Make unassuming games -> see the hardware is crap, we're not going to develop for it.

Of course Nintendo should've taken the Sony route and sat down with the large third party publishers from the start, but yeah, water under the bridge.
 

MYE

Member
The original conversation between Frostbite Technical Director repi and Teletuby Hitler

You mean Teletuby Poe

Nintendo's first party titles have never been about 'pushing the metal'. If you're looking to Nintendo to "prove" 3rd party devs are lazy, you'll be looking a long time. That's just not what Nintendo is about when it comes to their own games.

Smash Brawl, Super Mario Galaxy and Metroid Prime 3 are some of the best looking games on that level of tech, and they all run at 60fps.
 
Sure. Only that all those things you mentioned are relevant to new software. Software originally written with Cell SIMD performance in mind will never run on ps4. Cue the incapability to beat 7-year-old designs, accusations of incompetence, blah blah.

As regards FB's running on Xenon implying FB will not have any trouble running on 8 Jaguars: are you sure nothing in FB@Xenon relies on the single-thread performance of that CPU?

DICE has been at the forefront of making their engine highly parallelized. I believe they use a jobs model on both PS3 and 360, so I can't imagine getting FB to work across 8 Jaguars is that big an issue for them. And aren't you the one who's so fond of bringing up that the WiiU's dedicated sound hardware saves a ton of CPU time? That's just as true of PS4 (and Durango). PS4 also doesn't need to dedicate large amounts of its CPU time to compensating for a weak GPU the way Cell did, so we're down to PS3 games that used less than half the Cell time for vertex culling and MLAA or otherwise compensating for RSX and doing sound work and which are using physics middleware that isn't going to ship a GPGPU accelerated version for PS4 they could plug straight in to. And why are they moving this PS3 game to PS4 anyway, and not working on a new title for PS4 in potentially a new engine? I'm not sure what you're even arguing for anymore, or why you're arguing at all. I mean, moving the goalposts is one thing, but you're putting them in the weirdest places for no apparent reason.
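For readers unfamiliar with the "jobs model" being referenced, here's a minimal sketch of the general idea (plain C++, not DICE's actual code): frame work is broken into small independent tasks that a pool of worker threads drains, so the same code can scale from 3 Xenon hardware threads to 8 Jaguar cores without being tied to any particular core count.

#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// Run every queued job on a pool of worker threads and wait for completion.
void run_jobs(std::queue<std::function<void()>> jobs, unsigned workers) {
    std::mutex m;
    auto worker = [&] {
        for (;;) {
            std::function<void()> job;
            {
                std::lock_guard<std::mutex> lock(m);
                if (jobs.empty()) return;
                job = std::move(jobs.front());
                jobs.pop();
            }
            job();  // e.g. animate a character, cull a chunk of geometry, mix audio
        }
    };
    std::vector<std::thread> pool;
    for (unsigned i = 0; i < workers; ++i) pool.emplace_back(worker);
    for (auto& t : pool) t.join();
}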
 

Hedja

Member
I don't understand why they can't just say the truth.

"We won't be investing time and money into porting FB3 over to the WiiU because the small userbase doesn't warrant it."

Simple. We all know that for third-parties the WiiU isn't worth developing for right now.

Why do they need to crap on about how the WiiU's not powerful enough as though it's not their decision? Or maybe it's just the fact that FB3 isn't flexible enough to be ported onto newer platforms; in which case it's more their fault. That's probably the real reason since the former statement would've been quite obviously good enough without all this drama.
 
I don't understand why they can't just say the truth.

"We won't be investing time and money into porting FB3 over to the WiiU because the small userbase doesn't warrant it."

Simple. We all know that for third-parties the WiiU isn't worth developing for right now.

Why do they need to crap on about how the WiiU's not powerful enough as though it's not their decision? Or maybe it's just the fact that FB3 isn't flexible enough to be ported onto newer platforms; in which case it's more their fault. That's probably the real reason since the former statement would've been quite obviously good enough without all this drama.

They probably have a ported version, but the userbase doesn't warrant the time and money to optimize it yet. Maybe when the Wii U gets Wii numbers EA will think differently, but from what I heard in the Just Cause dev thread, Nintendo has shitty developer support, or so it seems.
 

spekkeh

Banned
Activision had to turn down Call of Duty's lighting effects and make other changes to get it running on Wii U.

Frostbite...they simply said the results "weren't promising". That doesn't mean they couldn't get it to run ...but if you've got to re-write the engine to work on new architecture, deal with lower memory bandwidth, etc, and then expect low sales... What's the point?

Yeah, people that say devs are lazy in their ports... it's simply a cost/benefit calculation. As long as it's not too egregious, people will buy 'lazy' ports. The extra refinement will not get you an equal extra ROI; it may even mean making a loss. Mass Effect sold what, 20k? You can't even justify porting a game, let alone porting and optimizing an engine before porting and optimizing the actual games. Frostbite 3 should run pretty well on the Wii U, but right now if it doesn't run instantly = not going to put any more money into it.
Of course for EA it's easier to say 'hardware doesn't cut it' than 'you, our public, are insignificant'.
 

Log4Girlz

Member
Yeah, people that say devs are lazy in their ports... it's simply a cost/benefit calculation. As long as it's not too egregious, people will buy 'lazy' ports. The extra refinement will not get you an equal extra ROI; it may even mean making a loss. Mass Effect sold what, 20k? You can't even justify porting a game, let alone porting and optimizing an engine before porting and optimizing the actual games. Frostbite 3 should run pretty well on the Wii U, but right now if it doesn't run instantly = not going to put any more money into it.
Of course for EA it's easier to say 'hardware doesn't cut it' than 'you, our public, are insignificant'.

Well EA put some effort into NFS, perhaps that'll pay off.
 

wsippel

Banned
Good find.
Thanks. There's sadly no 750 in the database. Only the PPE, 970, 405 and POWER7. And the PPE is unsurprisingly the weakest/least efficient of the bunch. The closest thing to Espresso is probably the 405, and even that old piece of shit is more than twice as fast per clock.
 
I don't understand why they can't just say the truth.

"We won't be investing time and money into porting FB3 over to the WiiU because the small userbase doesn't warrant it."

Simple. We all know that for third-parties the WiiU isn't worth developing for right now.

Why do they need to crap on about how the WiiU's not powerful enough as though it's not their decision? Or maybe it's just the fact that FB3 isn't flexible enough to be ported onto newer platforms; in which case it's more their fault. That's probably the real reason since the former statement would've been quite obviously good enough without all this drama.

Nextbox and PS4 will have low userbases at the start as well, but there is the belief (and history) of these types of games from these companies selling well on those platforms even at launch.

The lower power is most likely a hurdle to creating a port for the engine that is financially worth it.
 

Argyle

Member
DICE has been at the forefront of making their engine highly parallelized. I believe they use a jobs model on both PS3 and 360, so I can't imagine getting FB to work across 8 Jaguars is that big an issue for them. And aren't you the one who's so fond of bringing up that the WiiU's dedicated sound hardware saves a ton of CPU time? That's just as true of PS4 (and Durango). PS4 also doesn't need to dedicate large amounts of its CPU time to compensating for a weak GPU the way Cell did, so we're down to PS3 games that used less than half the Cell time for vertex culling and MLAA or otherwise compensating for RSX and doing sound work and which are using physics middleware that isn't going to ship a GPGPU accelerated version for PS4 they could plug straight in to. And why are they moving this PS3 game to PS4 anyway, and not working on a new title for PS4 in potentially a new engine? I'm not sure what you're even arguing for anymore, or why you're arguing at all. I mean, moving the goalposts is one thing, but you're putting them in the weirdest places for no apparent reason.

Just in case anyone (not necessarily you Brad!) needs to understand how Frostbite works, so we are all at least on the same page here:

http://dice.se/wp-content/uploads/Sthlm10_ParallelFutures_Final.ppt

Honestly anyone trying to make the argument "but the PS4/Durango will have problems with Frostbite too!" should probably read the section on what Johan wants in future hardware carefully.

From what I've seen, PPEs have the worst real world IPC since NetBurst.

EDIT: Found it. This is comparing only a single core:

PPE CoreMark/MHz: 0.97
Cortex A9 CoreMark/MHz: 3.97

So there you go. The smartphone CPUs of today are not three, but four times as powerful as Xenon clock-for-clock. Actually, the A15 is even more efficient at 4.68 CoreMark/MHz.

Yup. The PPE is terrible! No really, I agree! Yet all the evidence looks to me like three of them at 3.2GHz is still faster than the Espresso at 1.25GHz on real-world game workloads.

Maybe they should release CoreMark on the eShop :)

(also...single core...so, single thread? The results box does not have any parallel execution indicated, so in other words more or less that is not measuring the entire PPE?)
 

Mithos

Member
Well EA put some effort into NFS, perhaps that'll pay off.

Only if EA invents a time machine and sends the game back in time, so EA and Criterion can advertise the game as coming to PC, 360, PS3 and Wii U, and that the Wii U version will be the best-looking console version.
 

Hedja

Member
Nextbox and PS4 will have low userbases at the start as well, but there is the belief (and history) of these types of games from these companies selling well on those platforms even at launch.

The lower power is most likely a hurdle to creating a port for the engine that is financially worth it.

They probably have a ported version, but the userbase doesn't warrant the time and money to optimize it yet. Maybe when the Wii U gets Wii numbers EA will think differently, but from what I heard in the Just Cause dev thread, Nintendo has shitty developer support, or so it seems.

That's exactly my point. They should just say it outright. That it's not worth the time and money.
 
I think you didn't read the benchmarks correctly. It's comparing two higher clocked AMD cores to four lower clocked A9 cores, and the ARM comes out more than twice as fast. At the same number of cores, the A9 easily beats Bobcat clock-for-clock.

There's something weird about that Bobcat score. The report on the site thinks it only has a single core:

[screenshot of the CoreMark result page]

I'm also having trouble finding any other Cortex-A9 based CPUs that match the performance of the one you quoted, and that includes other results from the same Exynos 4, which is to say I'm not sure how reliable these comparisons actually are.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
DICE has been at the forefront of making their engine highly parallelized. I believe they use a jobs model on both PS3 and 360, so I can't imagine getting FB to work across 8 Jaguars is that big an issue for them.
What you could or could not imagine was not the question. I'll repeat my question: what makes you sure FB@Xenon does not rely on the single-thread performance of that CPU? How do you know their job granularity would scale down favorably to slower cores? Have you seen FB performing on Bobcats/Jaguars? Oh, BTW, welcome to the discussion from several pages ago, where I was using the exact same argument about FB's parallelism capabilities to question the notion that FB was not viable on Espresso.

And aren't you the one who's so fond of bringing up that the WiiU's dedicated sound hardware saves a ton of CPU time?
Nope. But thank you for pointing that out, anyway.

That's just as true of PS4 (and Durango). PS4 also doesn't need to dedicate large amounts of its CPU time to compensating for a weak GPU the way Cell did, so we're down to PS3 games that used less than half the Cell time for vertex culling and MLAA or otherwise compensating for RSX and doing sound work and which are using physics middleware that isn't going to ship a GPGPU accelerated version for PS4 they could plug straight in to. And why are they moving this PS3 game to PS4 anyway, and not working on a new title for PS4 in potentially a new engine?
We are discussing ps3/ps4 portability issues for the same reason we are discussing FB's potential portability issues from Xenon to Espresso. Is that too hard to follow?

If a piece of software was written with the SIMD performance of 6 SPEs in mind, it might but more likely might not be portable to 7 (or whatever the rumored number-of-cores-available-to-apps is) Jaguar cores. Yes, that's too bad, Jaguars suck, etc, etc. Well not really, they were simply not designed with such usecases in mind, but nevermind that. Likewise with FB@Espresso - if the SIMD performance of 3 PPE cores is the bare minimum for the engine to be viable, then Espresso pulls the short straw (and a handful of Jaguars are not safe either, but nevermind). That would also imply, though, that DICE painted themselves into a corner (again, if you had followed the discussion from a few pages back you'd have saved me all this).
 
I wonder if Nintendo was in touch with developers when they were developing the WiiU. I can't believe how many "incompatibility" issues the WiiU has with future industry technology. Clearly they should have discussed what was coming with devs, like when Epic advised Microsoft to bump the RAM to 512MB in the 360.
 

wsippel

Banned
There's something weird about that Bobcat score. The report on the site thinks it only has a single core:

I'm also having trouble finding any other Cortex-A9 based CPUs that match the performance of the one you quoted, and that includes other results from the same Exynos 4, which is to say I'm not sure how reliable these comparisons actually are.
Bobcat is a "pseudo" dual core chip. It's only a single core with some elements like decoders and integer ALUs doubled. Jaguar will be the same if I remember correctly. Still, the run is using "both" cores - that's what "2:PThreads" means.

Also, the difference between the Exynos 4 results can be easily explained: They used different compilers for both correct runs. The third, much lower result is a single threaded run, but someone forgot to enter that detail, so the reported per-core score is wrong. Notice how the displayed result is almost exactly one quarter of the expected result?
 
Bobcat is a "pseudo" dual core chip. It's only a single core with some elements like decoders and integer ALUs doubled. Jaguar will be the same if I remember correctly. Still, the run is using "both" cores - that's what "2:PThreads" means.

No they aren't. You're thinking of Bulldozer, and even those present as multi-core at a hardware level.

Also, the difference between the Exynos 4 results can be easily explained: They used different compilers for both correct runs. The third, much lower result is a single threaded run, but someone forgot to enter that detail, so the reported per-core score is wrong. Notice how the displayed result is almost exactly one quarter of the expected result?

OK, how do we know the bobcat score is representative if the compiler used makes results so variable?

What you could or could not imagine was not the question. I'll repeat my question: what makes you sure FB@Xenon does not rely on the single-thread performance of that CPU? How do you know their job granularity would scale down favorably to slower cores? Have you seen FB performing on Bobcats/Jaguars? Oh, BTW, welcome to the discussion from several pages ago, where I was using the exact same argument about FB's parallelism capabilities to question the notion that FB was not viable on Espresso.

Actually, I know because Argyle posted a link!

You'll notice in this presentation from DICE about Frostbite towards the end they literally say OoOE is better for their gamecode, but that it's ultimately meaningless since their engine spends the vast majority of its time doing vector heavy work. Which is to say Espresso being 30-40% faster than Xenon on integer code is useless when it's also 1/8th as fast doing SIMD.

We are discussing ps3/ps4 portability issues for the same reason we are discussing FB's potential portability issues from Xenon to Espresso. Is that too hard to follow?

No one was talking about PS3 to PS4 portability, and you only brought it up in some misguided attempt to disparage the PS4's capabilities.
 

prag16

Banned
I wonder if Nintendo was in touch with developers when they were developing the WiiU. I can't believe how many "incompatibility" issues the WiiU has with future industry technology. Clearly they should have discussed what was coming with devs, like when Epic advised Microsoft to bump the RAM to 512MB in the 360.
Unreal Engine 3 games and Assassin's Creed 3 among others at launch were developed on non-final dev kits and were for the most part at parity with 360, sometimes pulling ahead of the PS3. Cryengine 3 allegedly "runs beautifully".

If the CPU was really such a diarrhea mess to the extent that some of the armchair experts here are claiming, how is that possible? Porting FB3 taking a non-zero amount of effort to optimize is probably enough at this time considering the low user base and poor outlook.
 

wsippel

Banned
Yup. The PPE is terrible! No really, I agree! Yet all the evidence looks to me like three of them at 3.2GHz is still faster than the Espresso at 1.25GHz on real-world game workloads.

Maybe they should release CoreMark on the eShop :)

(also...single core...so, single thread? The results box does not have any parallel execution indicated, so in other words more or less that is not measuring the entire PPE?)
Yeah, now that you mention it, they apparently didn't use SMT. Hard to tell how much that would do. Maybe 20 - 30%?

Anyway, if Marcan happens to be correct and Espresso is indeed comparable to a quad core A9 at 1.6GHz (I heavily doubt it), it would obviously beat Xenon under non-SIMD workloads (22243 vs ~9300 - 12400). With the info he released on how to enable the full Espresso CPU in Wii mode, it should actually be possible to benchmark the chip.
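As a sanity check on where those numbers come from (my own back-of-the-envelope math, using only figures already quoted in this thread): 3 PPE cores x 3200 MHz x 0.97 CoreMark/MHz gives roughly 9,300, and the 12,400 upper bound corresponds to assuming SMT adds roughly a third on top, slightly above the 20-30% guessed earlier.

#include <cstdio>

int main() {
    // Assumptions from the thread: 3 Xenon PPE cores at 3200 MHz,
    // 0.97 CoreMark/MHz per core, and a hypothetical ~33% SMT uplift.
    const double xenon_no_smt = 3 * 3200 * 0.97;      // ~9312
    const double xenon_smt    = xenon_no_smt * 1.33;  // ~12385
    const double exynos4_quad = 22243.0;              // quoted quad A9 @ 1.4GHz score
    std::printf("Xenon estimate: %.0f - %.0f vs quad A9: %.0f\n",
                xenon_no_smt, xenon_smt, exynos4_quad);
    return 0;
}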
 
Unreal Engine 3 games and Assassin's Creed 3 among others at launch were developed on non-final dev kits and were for the most part at parity with 360, sometimes pulling ahead of the PS3. Cryengine 3 allegedly "runs beautifully".

If the CPU was really such a diarrhea mess to the extent that some of the armchair experts here are claiming, how is that possible? Porting FB3 taking a non-zero amount of effort to optimize is probably enough at this time considering the low user base and poor outlook.

It's the second console with new architecture to be BEYOND MAXED in its infancy! That's the only explanation. The first was PS2, which proved it was inferior to the Dreamcast because the ports of Code Veronica, Rayman 2, Crazy Taxi and Grandia II told me so.
 
It's the second console with new architecture to be BEYOND MAXED in its infancy! That's the only explanation. The first was PS2, which proved it was inferior to the Dreamcast because the ports of Code Veronica, Rayman 2, Crazy Taxi and Grandia II told me so.

The PS2 hardware was shit for its time and generation, but the bad ports were not the proof. Neither are bad ports the proof that WiiU's CPU is terribly poor. The hardware speaks for itself.
 

hodgy100

Member
Don't forget that ARM processors are also RISC processors, meaning you need to perform many more instructions to get the same effect. That's another plus that x86 and PPC have over ARM.
 

wsippel

Banned
OK, how do we know the bobcat score is representative if the compiler used makes results so variable?
Compilers can make quite a difference depending on the chip, which is why the compiler is always mentioned in those benchmarks. Sony will probably use either gcc or Open64 while Nintendo uses Green Hills Optimizing Compilers.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Actually, I know because Argyle posted a link!
You know that FB performs fine on Bobcats/Jaguars because Argyle posted a link to a DICE presentation which says their game code would benefit from OoOE? You lost me here, I thought we were discussing SIMD throughput.

You'll notice in this presentation from DICE about Frostbite towards the end they literally say OoOE is better for their gamecode, but that it's ultimately meaningless since their engine spends the vast majority of its time doing vector heavy work.
Erm, the timing snapshot is for a particular game (BC2@ps3). What the engine requires in terms of vector capabilities is not something you could conclude from that snapshot.

Which is to say Espresso being 30-40% faster than Xenon on integer code is useless when it's also 1/8th as fast doing SIMD.
Which is to say you rushed to conclusions on this subject. Perhaps you need to step back and take a breath?

No one was talking about PS3 to PS4 portability, and you only brought it up in some misguided attempt to disparage the PS4's capabilities.
No, you were making general statements about Espresso. I just showed you how misguided those were by drawing a parallel with ps3/ps4 portability, which you did not like, but I continued using it as a good parallel.
 
Both Bobcat and Bulldozer are pseudo dual core chips, but the implementation is indeed different. As I wrote, Bobcat only has two decoders and two parallel integer ALUs while Bulldozer is a true CMT design. See:


And read this: http://products.amd.com/pages/NotebookAPUDetail.aspx?id=1&f1=&f2=&f3=&f4=&f5=&f6=&

That's a diagram of a single Bobcat core. Dual decoders and dual-issue pipelines are really common in modern CPU cores. The E-350 in question would have two of those cores in the picture. I'm not sure anyone should trust your technical knowledge if you don't even know this stuff.

You know that FB performs fine on Bobcats/Jaguars because Argyle posted a link to a DICE presentation which says their game code would benefit from OoOE? You lost me here, I thought we were discussing SIMD throughput.

They didn't say it benefits from OoOE, they said any gains from OoOE on game code are a drop in the bucket since the vast majority of their workload is vector heavy. You tried to claim Frostbite would be limited by single thread performance. I gave you evidence that they've been using a jobs system to spread work across many cores since Bad Company 2. The Jaguar based CPU in PS4/Durango is a 100GFlop design that should work great in that framework. I only brought up the OoOE thing because you and others in this thread keep touting the OoOE advantage Espresso enjoys over Xenon and Cell as some kind of saving grace, but here we have an engineer at DICE saying explicitly how useless that advantage is in a real game.

No, you were making general statements about Espresso. I just showed you how misguided those were by drawing a parallel with ps3/ps4 portability, which you did not like, but I continued using it as a good parallel.

So a 15 GFlop processor not being good enough for an engine designed around 100 GFlop processors means a 100 GFlop processor is not good enough for an engine designed around 100 GFlops processors? The situations are not remotely comparable.
 
Pretty good read on Dat Unprecedented Partnership.

http://playeressence.com/eas-unprecedented-partnership-with-nintendo-and-other-fairy-tails/

I understand that this is a business, but how do you go from announcing an unprecedented partnership mentioning EA Sports and the Battlefield series to not even calling the system next gen and not releasing Wii U versions of your multi-million-selling franchises? EA wants to give the Wii U the Dreamcast treatment.

I know Nintendo is also at fault, they have made mistakes, but boy, what a shitty company EA is. EA, it is your loss; sorry, but I will not be supporting you anymore, there are too many options, too many games out there.

I also have a PS3 and intend on getting a PS4 or a PC; EA, you will not have my money. I can live without EA games, and I hope this comes back to bite them.

Call me salty or a fanboy, I don't care, this shit is just too obvious to ignore. I hate it when companies engage in dubious behavior; it could have been against Sony or any other company.
 

wsippel

Banned
That's a diagram of a single bobcat core. dual decoders and dual issue pipelines are really common in modern CPU cores. The E-350 in question would have two of those cores in the picture. I'm not sure anyone should trust your technical knowledge if you don't even know this stuff.
Maybe you shouldn't, but it doesn't matter in this discussion to begin with: The chip benchmarked was an E-350, there is no single core E-350, and CoreMark used two threads. Doesn't matter what it says in the info page, both cores were used.
 

prag16

Banned
EA, it is your loss; sorry, but I will not be supporting you anymore, there are too many options, too many games out there.

I WOULD be right there with you, but unfortunately EA has Star Wars and Mass Effect. :(

I suppose there are "other ways" to get those games on PC without supporting EA, but that has never been my modus operandi.

Which is to say Espresso being 30-40% faster than Xenon on integer code is useless when it's also 1/8th as fast doing SIMD.

What? Where the hell are you getting those numbers from? 1/8 as fast on SIMD? Citation needed.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
They didn't say it benefits from OoOE, they said any gains from OoOE on game code are a drop in the bucket since the vast majority of their workload is vector heavy. You tried to claim Frostbite would be limited by single thread performance. I gave you evidence that they've been using a jobs system to spread work across many cores since Bad Company 2.
I know how FB operates, thank you very much (for the umpteenth time - check the discussion a few pages ago). The FLOPS workloads that the engine can handle (as in viable throughput) are one thing. The engine's requirements - an entirely different thing. The entire argument started from:

DICE guy: We tried FB2 on the WiiU and it did not perform well.
forumite: It must be the CPU!
me: how come their well-scalable-with-CPUs engine did not perform well? If they say it did not perform well outside of the context of any game, then the engine's requirements alone (read: for doing any meaningful workloads) must be pegged at something akin to 3x PPEs (which was a sarcastic statement on my part, I do not honestly expect that to be the case). And since the only advantage Xenon has over Espresso is (SIMD) FLOPS, that would mean the engine eats a significant amount of FLOPS for breakfast. Like how the software occlusion culling should eat FLOPS on the PS3, but that does not have to be the case on other platforms where the trisetup is on average twice as fast as on the RSX.

The Jaguar based CPU in PS4/Durango is a 100GFlop design that should work great in that framework. I only brought up the OoOE thing because you and others in this thread keep touting the OoOE advantage Espresso enjoys over Xenon and Cell as some kind of saving grace, but here we have an engineer at DICE saying explicitly how useless that advantage is in a real game.
Erm, 'real games' this gen are going to do a good portion of their FLOPS on the GPU. If 'real games' were so CPU-FLOPS-bound, we'd be seeing an evolution of the CELL in ps4 today, not 8 Jaguars and a GPU that blows them out of the water in FLOPS and which features dedicated CUs.

So a 15 GFlop processor not being good enough for an engine designed around 100 GFlop processors means a 100 GFlop processor is not good enough for an engine designed around 100 GFlops processors? The situations are not remotely comparable.
I've been aware of what you're claiming since the beginning. Here's a hypothetical for you, to perhaps help you understand what I've been asking you about single-threaded performance: do you expect FB to perform equally well compared to the 'baseline' 3x PPE if the setup contained 10 cores at ~1GHz each? How about 100 cores at 100MHz? If yes - why? If not, then how are you so sure FB would be fine on a bunch of Jaguars? We are still talking SIMD FP here, not even discussing GP code.
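One way to put rough numbers on that hypothetical (my own illustration, with a made-up serial fraction, not anything from DICE): if some slice of the frame can't be split into jobs, Amdahl's law means piling on slower cores stops helping, which is exactly the question about job granularity.

#include <cstdio>

// "GHz-equivalents" of work per frame for N cores at a given clock,
// assuming serial_fraction of the frame cannot be parallelized.
double effective_speed(double core_ghz, int cores, double serial_fraction) {
    double relative_time = serial_fraction + (1.0 - serial_fraction) / cores;
    return core_ghz / relative_time;
}

int main() {
    const double serial = 0.10;  // assumption: 10% of a frame stays single-threaded
    std::printf("3 x 3.2 GHz : %.2f\n", effective_speed(3.2, 3, serial));    // ~8.0
    std::printf("10 x 1.0 GHz: %.2f\n", effective_speed(1.0, 10, serial));   // ~5.3
    std::printf("100 x 0.1GHz: %.2f\n", effective_speed(0.1, 100, serial));  // ~0.9
    return 0;
}

With roughly equal aggregate clock in all three setups, the many-slow-cores configurations fall further and further behind, which is the point of the question above.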
 