
Bink 2 runs faster on PS4 GPU than XB1 GPU, but same on either CPU

Chobel

Member
I'll boycott Ubisoft if the final game has dancer parity!!!111



The difference in dancers (113 / 98) is roughly the same as the clock speed difference (1.75 GHz / 1.6 GHz).

Xbone CPU is clocked 9.37% higher than PS4 CPU, so it should be 107/98 dancers. Xbone has 6 more dancers than it should = more efficient.
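Quick sanity check of that scaling argument (a rough sketch: it assumes the dancer count scales linearly with CPU clock and uses the publicly reported 1.6 GHz / 1.75 GHz figures):

```python
# Rough check: does the dancer count scale with CPU clock alone?
ps4_clock, xb1_clock = 1.6, 1.75          # GHz, publicly reported clocks
ps4_dancers = 98                          # from the Ubisoft GDC slide

clock_ratio = xb1_clock / ps4_clock       # ~1.094, i.e. ~9.4% higher clock
expected_xb1 = ps4_dancers * clock_ratio  # ~107 dancers if it scaled purely with clock
print(f"clock advantage: {clock_ratio - 1:.1%}, expected XB1 dancers: {expected_xb1:.0f}")
# Measured XB1 figure is 113, i.e. ~6 dancers above the clock-only estimate.
```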
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
So the CPU tests are based purely on raw power and CPU clock, without the consoles' OS interfering?

No. How would that even be possible? The benchmark is most likely running on debug consoles where the OS of both systems is interfering normally. The OS (i.e. the virtualization layer) was probably the reason why the Xbone's CPU was initially slower despite the clock speed advantage.

Xbone CPU is clocked 9.37% higher than PS4 CPU, so it should be 107/98 dancers.

That's why I qualified it with "roughly".

The discrepancy is due to the Xbone's hidden dCPU (Dancer CPU) in the powerbrick.
 
I didn't know where to post this... so I bumped my own thread...

http://gdcvault.com/play/1020939/Efficient-Usage-of-Compute-Shaders

Ubisoft cloth simulation:

Xbox 360 CPU: 34 dancers
PS3's Cell: 105 dancers (they say "SPUs rock!")
PS4 CPU: 98 dancers... wtf :|
XBone CPU: 113 dancers
PS4 GPGPU: 1600
XBone GPGPU: 830

DAMN!

And there will be a handful of devs who actually utilize GPGPU within their engine this generation.

Sony first party teams are already pushing GPGPU stuff. But yeah, forget about it when it comes to 3rd parties. There may be a few that do good stuff with it, but it won't be many. Same as with the SPUs of Cell.

Cloth simulation is the type of thing the SPEs would have blazed through and laughed at, which is why it's right there with the current-gen CPUs.
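For context on why cloth maps so well to SPUs and GPGPU: every vertex can be integrated independently each step, so the work is embarrassingly data-parallel. Here's a minimal illustrative sketch (not Ubisoft's actual code; a generic Verlet-style update with made-up array sizes) of the kind of per-vertex update a compute shader or SPU job would run for each dancer:

```python
import numpy as np

# Toy Verlet integration step: every cloth vertex updates independently of the
# others, which is why this workload spreads so well across SPUs or GPU CUs.
# Shapes, names, and the 30 Hz step are illustrative, not from the Ubisoft talk.
def verlet_step(pos, prev_pos, accel, dt):
    new_pos = 2.0 * pos - prev_pos + accel * dt * dt  # same formula for every vertex at once
    return new_pos, pos

pos = np.random.rand(1024, 3)             # 1024 cloth vertices for one dancer
prev_pos = pos.copy()                     # no initial velocity
gravity = np.array([0.0, -9.81, 0.0])
pos, prev_pos = verlet_step(pos, prev_pos, gravity, dt=1.0 / 30.0)
```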
 
Final results slide screencap:



fake edit: PS4 am 2 times more powerful confirmed!

The discrepancy is due to the Xbone's hidden dCPU (Dancer CPU) in the powerbrick.
Don't forget the hidden "Emotion Engine" inside the Xbone's HDMI cable.
 
All will. It's becoming a standard technique that is available on all gaming platforms.

Yeah, but it is extra programming. Extra work. From what I've seen over the years anything that isn't absolutely 100% necessary will be overlooked by the majority of developers.

Unless Sony builds a very easy API to take advantage of the Compute units of the architecture I just don't see it being used as much as it should. Hope I'm wrong though!
 

sbrew

Banned
1.4ms to 2.3ms doesn't sound like a very big difference. PC to console sounds like a very big difference though (4ms to 11ms).

PC (2.8 GHz Core i5 with 4 cores and AMD R9 290X): CPU: 1.3 ms. GPU: 1.4 ms.
PS4: CPU: 2.3 ms. GPU: 1.6 ms.
Xbox: CPU: 2.3 ms. GPU: 2.3 ms.
Dat Cell. These AMD CPUs really are shit.

I wouldn't call them shit. From these numbers, it seems an i5 core is about 3 times faster, but that's why the PS4 and XB1 have 8 cores. As developers get used to using more cores, things will get better for the new consoles.
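Rough numbers behind the "about 3 times faster per core" estimate (a sketch that assumes the CPU decode path is fully parallel on both machines and that roughly six of the eight Jaguar cores are available to games; both assumptions are guesses, and they land in the same ballpark):

```python
# Rough per-core comparison from the Bink CPU timings. Assumes the decode is
# fully parallel on both machines and that ~6 of the 8 Jaguar cores are
# available to the game; both are guesses, not published facts.
i5_time_ms, jaguar_time_ms = 1.3, 2.3
i5_cores, jaguar_cores = 4, 6

i5_per_core = 1.0 / (i5_time_ms * i5_cores)            # frames per ms per core
jaguar_per_core = 1.0 / (jaguar_time_ms * jaguar_cores)
print(f"i5 core vs Jaguar core: ~{i5_per_core / jaguar_per_core:.1f}x")  # ~2.7x
```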

Yes, they are not nearly as impressive bang for the buck as the Cell or even the 360 CPU were, but that's the price we pay when both Sony and MS decided to break even from the start this gen.


Wow, PS4 GPGPU gives about twice the One's performance, interesting to see how this will develop throughout the gen.

Holy megaton, that GPGPU performance gap is much bigger than expected.

There really is nothing more to it: PS4 hardware isn't anything amazing, the Xbone is just much weaker.

I think 99% of everyone here at Gaf knows the PS4 GPU is not slightly more powerful but a LOT more powerful.

It's almost twice as fast. More than "slightly" faster.

39% faster (or XB1 is 64% slower) by my math, looks impressive to me.

Remember, folks, this is the GPU *COMPUTE* performance, and the advantage we are seeing for the PS4 makes perfect sense as we are talking 18 vs. 12 CUs. This DOES NOT mean the PS4 GPU is that much faster for graphics in games, as only 14 of the PS4's CUs can be fully utilized for those because of bandwidth limitations.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
This DOES NOT mean the PS4 GPU is that much faster for graphics in games, as only 14 of the PS4's CUs can be fully utilized for those because of bandwidth limitations.

This bullshit again? Ranger is that you?
 

Mr Moose

Member
Remember, folks, this is the GPU *COMPUTE* performance, and the advantage we are seeing for the PS4 makes perfect sense as we are talking 18 vs. 12 CUs. This DOES NOT mean the PS4 GPU is that much faster for graphics in games, as only 14 of the PS4's CUs can be fully utilized for those because of bandwidth limitations.

Says 18 in the slides.
 

Ploid 3.0

Member
God

of

War

I can't wait to see that bad boy. Sony first party are usually good at wowing me, and now with this hardware I have a feeling they will keep surprising. ND and SSM are great magicians. They made the PS3 sing.
 
God

of

War

I can't wait to see that bad boy. Sony first party are usually good at wowing me, and now with this hardware I have a feeling they will keep surprising. ND and SSM are great magicians. They made the PS3 sing.
Don't forget Guerrilla Games and Quantic Dream.

Remember, folks, this is the GPU *COMPUTE* performance, and the advantage we are seeing for the PS4 makes perfect sense as we are talking 18 vs. 12 CUs. This DOES NOT mean the PS4 GPU is that much faster for graphics in games, as only 14 of the PS4's CUs can be fully utilized for those because of bandwidth limitations.
This has already been debunked, sir.
 
Remember, folks, this is the GPU *COMPUTE* performance, and the advantage we are seeing for the PS4 makes perfect sense as we are talking 18 vs. 12 CUs. This DOES NOT mean the PS4 GPU is that much faster for graphics in games, as only 14 of the PS4's CUs can be fully utilized for those because of bandwidth limitations.

Are you talking about this slide?



The slide says 2012, so it was probably already outdated when it leaked. Back then the PS4 only had 4 gigs of gangsta RAM 5.
 

gruenel

Member
Remember, folks, this is the GPU *COMPUTE* performance, and the advantage we are seeing for the PS4 makes perfect sense as we are talking 18 vs. 12 CUs. This DOES NOT mean the PS4 GPU is that much faster for graphics in games, as only 14 of the PS4's CUs can be fully utilized for those because of bandwidth limitations.

That 14+4 nonsense still hasn't died?

On topic: Can't wait till ND unleashes dat GPGPU ;)
 

lyrick

Member
PC (2.8 GHz Core i5 with 4 cores and AMD R9 290X): CPU: 1.3 ms. GPU: 1.4 ms.
PS4: CPU: 2.3 ms. GPU: 1.6 ms.
Xbox: CPU: 2.3 ms. GPU: 2.3 ms.

Read more at http://gamingbolt.com/new-benchmark...ay-4k-video-frames-faster#d34cetGAZmluKJtt.99

How horribly unoptimized is Bink video if the difference between a GPU with 5.6 TF of compute and one with 1.8 TF of compute only amounts to a 12.5% decode difference?

And why is this not shown on the developer's site?
http://www.radgametools.com/bnkmain.htm
It is really fast - it can play 4K video frames (3840x2160) in 4 ms PCs and 11 ms PS4/Xbox One using the CPU only (or 1.4 ms PC and 2.3 ms PS4/Xbox using GPU acceleration)!
 

Insane Metal

Gold Member
Remember, folks, this is the GPU *COMPUTE* performance, and the advantage we are seeing for the PS4 makes perfect sense as we are talking 18 vs. 12 CUs. This DOES NOT mean the PS4 GPU is that much faster for graphics in games, as only 14 of the PS4's CUs can be fully utilized for those because of bandwidth limitations.
That has been debunked since the console launch. All 18 CUs are fully available to be used for either compute or graphics. No limitations. Anyway, it doesn't mean 100% more performance, yeah.
 

Conduit

Banned
Remember, folks, this is the GPU *COMPUTE* performance, and the advantage we are seeing for the PS4 makes perfect sense as we are talking 18 vs. 12 CUs. This DOES NOT mean the PS4 GPU is that much faster for graphics in games, as only 14 of the PS4's CUs can be fully utilized for those because of bandwidth limitations.

Developers can use all 18 CUs for rendering if they want. IIRC, Mark Cerny talked about that before.
 

JaggedSac

Member
No wonder, it is likely already crowded with more crucial data. From what we know, data synchronization between CPU and GPU is more costly, too.

Whether ESRAM is crowded most likely depends on what point in the process they are doing these calculations. Either way, it shows just how hard a shot to the knees the main memory bandwidth is on the Bone. ESRAM utilization is the key to performance on there and there isn't quite enough of it for most engines, lol.
 

Koobion

Member
Can't believe people are still clutching at straws to not face the fact that Xbox One's library will always and forever be littered with games that are less visually impressive and run worse than the PS4 counterpart. It is a waste of time to try to spin it otherwise.
 

Zalusithix

Member
How horribly unoptimized is Bink video if the differences between a GPU with 5.6 TB of compute and one with 1.6 TB of compute only offers a 12.5% decode difference?

And why is this not shown on the Developers site?
http://www.radgametools.com/bnkmain.htm

A) TB is a measure of space - be it memory, or storage.
B) Just because hardware A has X times the theoretical total computational power of hardware B, not every algorithm can be expected to execute X times faster on hardware A. Things aren't that simple... nowhere near that simple.
 

sbrew

Banned
Developers can use all 18 CU's for rendering if they want. IIRC, Mark Cerny talked about that before.

They can, but the +4 doesn't help that much, if at all. It's not like they magically discovered more bandwidth since that 2012 slide was made.

(If they did, I missed it)
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
They can, but the +4 doesn't help that much, if at all. It's not like they magically discovered more bandwidth since that 2012 slide was made.

(If they did, I missed it)

Let me know if you have evidence that every game engine has a bandwidth bottleneck at exactly 14 CUs. You are pushing Misterxmedia BS.

Show me one game with a memory bottleneck that leaves the CUs underutilized. I'm sure you have access to all major studio PS4 engine profiles.
 

lyrick

Member
A) TB is a measure of space - be it memory, or storage.
B) Just because hardware A has X times the theoretical total computational power of hardware B, not every algorithm can be expected to execute X times faster on hardware A. Things aren't that simple... nowhere near that simple.

Exchange the Bs with Fs, as brainfarts happen, then explain the shitty scaling.

I can almost see the 2.3ms to 1.4ms, but 1.6ms to 1.4ms with over 3x the computational power makes me think the scaling is shit. Or the revised numbers from the devs were just misinterpreted.

If they're supposedly seeing gains of over 30% [2.3ms to 1.6ms] when going from a 23% compute increase [1.3 TF GPU to a 1.8 TF GPU], but then only seeing a 12.5% gain with 320% more computational power [1.8TF GPU to a 5.6TF GPU], something is a-fucking-miss.
 

Mr Moose

Member
Exchange the Bs with Fs, as brainfarts happen, then explain the shitty scaling.

I can almost see the 2.3ms to 1.4ms, but 1.6ms to 1.4ms with over 3x the computational power makes me think the scaling is shit. Or the revised numbers from the devs were just misinterpreted.

If they're supposedly seeing gains of over 30% [2.3ms to 1.6ms] when going from a 23% compute increase [1.3 TF GPU to a 1.6 TF GPU], but then only seeing a 12.5% gain with a 350% more computational power [1.6TF GPU to a 5.6TF GPU] something is a-fucking-miss.

What's 1.6TF?
 

Zalusithix

Member
Exchange the Bs with Fs, then explain the shitty scaling.

I can almost see the 2.3ms to 1.4ms, but 1.6ms to 1.4ms with over 3x the computational power makes me think the scaling is shit.

IF they're supposedly seeing gains of over 30% [2.3ms to 1.6ms] when going from a 23% compute increase [1.3 TF GPU to a 1.6 TF GPU ] then only seeing a 12.5% gain with a 350% more computational power [1.6TF GPU to a 5.6TF GPU] something is a-fucking-miss.

There could be a limit to how "wide" they can distribute the problem. Perhaps it maxes out at somewhere around the CU count of the PS4. From there the scaling to the PC might be a byproduct of the raw MHz jump and not from additional CUs.
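That hypothesis is easy to sanity-check against the clocks (a rough sketch assuming the PS4 GPU runs at 800 MHz and the R9 290X at about 1 GHz, and that the decode stops scaling with extra CUs once the problem's parallel width is exhausted):

```python
# If the GPU decode stops scaling with CU count and only tracks clock speed,
# the PC time should roughly be the PS4 time scaled by the clock ratio.
ps4_gpu_mhz, r9_290x_mhz = 800, 1000      # assumed clocks (290X boost is around 1 GHz)
ps4_gpu_ms = 1.6

clock_only_estimate = ps4_gpu_ms * ps4_gpu_mhz / r9_290x_mhz
print(f"clock-only estimate for the 290X: {clock_only_estimate:.2f} ms (measured: 1.4 ms)")
# ~1.28 ms vs the measured 1.4 ms, so a clock-bound rather than CU-bound limit is plausible.
```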
 

lyrick

Member
What's 1.6TF?

Sorry 1.83TF... either way it's shit compared to the 5.6TF of the 290x

The CPU results already show massive gains on the 4-core i5 vs the Jaguar, so it's not limited by the CPU. In fact the scaling somehow gets worse on the PC end when it comes to GPU compute.
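Putting the ratios side by side makes the point (straight arithmetic on the quoted timings, nothing more):

```python
# Speedup of the PC over the PS4 on each path, straight from the quoted timings.
cpu_speedup = 2.3 / 1.3   # PS4 CPU ms / i5 CPU ms   -> ~1.77x
gpu_speedup = 1.6 / 1.4   # PS4 GPU ms / 290X GPU ms -> ~1.14x
print(f"CPU path: {cpu_speedup:.2f}x faster, GPU path: {gpu_speedup:.2f}x faster")
# A far bigger raw-compute gap on the GPU side buys a much smaller speedup,
# which is the scaling complaint here.
```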
 

GameSeeker

Member
I didn't know where to post this... so I upped my own thread...

http://gdcvault.com/play/1020939/Efficient-Usage-of-Compute-Shaders

Ubisoft cloth simulation:

Xbox 360 CPU: 34 dancers
PS3's Cell: 105 dancers (they say "SPUs rock!")
PS4 CPU: 98 dancers... wtf :|
XBone CPU: 113 dancers
PS4 GPGPU: 1600
XBone GPGPU: 830

DAMN!

Those numbers clearly show that for both next-gen consoles GPGPU is the way to go for physics simulations. The fact that the PS4 is almost 2X the Xbone shouldn't be surprising given that GPGPU was a key design point for the PS4. 2nd generation PS4 titles should really push GPGPU and do some amazing things.

You are wrong.

Matt has spoken and that should end any and all discussion of 14+4. Thank you Matt.
 

lyrick

Member
There could be a limit to how "wide" they can distribute the problem. Perhaps it maxes out at somewhere around the CU count of the PS4. From there the scaling to the PC might be a byproduct of the raw MHz jump and not from additional CUs.

It's certainly possible, but if that is the case it would basically mean this particular bench is worthless, as it's almost a perfect situation for the PS4's architecture.

But does this benchmark have anything to do with the TFs of the GPUs? (I honestly have no clue lol).

Yes, the number crunching for the encode/decode process is directly tied to the theoretical FLOP/s maximum, probably even more so than the normal graphical bullshit bickering.
 

NeOak

Member
They can, but the +4 doesn't help that much, if at all. It's not like they magically discovered more bandwidth since that 2012 slide was made.

(If they did, I missed it)
You seriously don't know how this shit works.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
What about this?

http://www.cinemablend.com/games/PS...x-One-CPU-According-Benchmark-Test-61203.html

Claims the PS4 CPU is faster than the Xbox One's.

That Ubisoft dev in the other thread claims that Microsoft has freed some CPU allocation recently. http://www.neogaf.com/forum/showthread.php?t=913010

Which would make sense, given the many, many rumors of the PS4 having a speed uptick as well.

http://en.wikipedia.org/wiki/PlayStation_4_technical_specifications#Central_processing_units

"The CPU's base clock speed is said to be 1.6 GHz, with an unknown uptick in speed for CPU-intensive usage"

Funny, the source quoted in that Wikipedia article doesn't say anything at all about an "uptick".
 

LordOfChaos

Member
Really want to know the frame times for the Wii U CPU and GPU. Are they available?

I loled at
But this is perhaps the first solid instance of the PS4's GPU being superior.
You know, apart from just about every cross platform game.
 