
Ubisoft GDC EU Presentation Shows Playstation 4 & Xbox One CPU & GPU Performance- RGT

The Llama

Member
Could be, but I think I remember that Intel was like "We make so much profit on our chips that it's not worth it to make cut-rate deals on consoles." I also think there is some bad blood between Intel and the console makers. Finally, AMD might be able to offer a better CPU/GPU package than Intel. Not sure about that last one. I haven't been following Intel's integrated graphics capabilities.

Oh and backwards compatibility will be easier if the console manufacturers stick with AMD. My wild ass prediction is that this console cycle will be a lot shorter than people expect. However the next gen of consoles won't replace the current gen, but will be an upgrade to it. I think Microsoft will push this trend because they will want to hit the reset switch as soon as possible on this gen. They made their name on being the most powerful console and yet now they are reduced to claiming resolution doesn't matter. That has got to hurt.

Yeah, agreed. Kinda like how Windows XP was around forever, then Vista/7/8 returned to a more normal cycle.
 
FYI, that lowest common denominator is still higher than the vast majority of Windows PCs. Want to blame the lack of technical progress on something? Blame the consumers who won't spend more on hardware, be it console or PC.

You have a valid point. Aren't games on PC designed to run on systems with much lower specs than the PS4 or Xbone? Catering only to the PC high end elite sounds great when you're not the one worried about being in the poor house.
 

Furyous

Member
Does 100 percent more GPU mean the GPU on the PS4 performs 100 percent better/more efficient/more powerful than the Xbox One? If that's true then the value proposition just increased over 25 percent, IMHO, for the life of the console.
 

Competa

Banned
Ok, let's think for a second.

The CPU is the thing that is going to hold the consoles back, even the PS4. The way I see it, MS saw this coming and is investing in "the power of the cloud" to compute all the AI and destruction as we saw in their presentation.

I hope Sony didn't "waste" a lot of money on the GPU and RAM when their CPU is a bottleneck. But isn't Sony doing something like MS Azure?
 
Ok, let's think for a second.

The CPU is the thing that is going to hold the consoles back, even the PS4. The way I see it, MS saw this coming and is investing in "the power of the cloud" to compute all the AI and destruction as we saw in their presentation.

I hope Sony didn't "waste" a lot of money on the GPU and RAM when their CPU is a bottleneck. But isn't Sony doing something like MS Azure?

Because Sony invested in the GPGPU future. Which is arguably much more useful than relying on the cloud.
 
Could be, but I think I remember that Intel was like "We make so much profit on our chips that it's not worth it to make cut-rate deals on consoles." I also think there is some bad blood between Intel and the console makers. Finally, AMD might be able to offer a better CPU/GPU package than Intel. Not sure about that last one. I haven't been following Intel's integrated graphics capabilities.

Oh and backwards compatibility will be easier if the console manufacturers stick with AMD. My wild ass prediction is that this console cycle will be a lot shorter than people expect. However the next gen of consoles won't replace the current gen, but will be an upgrade to it. I think Microsoft will push this trend because they will want to hit the reset switch as soon as possible on this gen. They made their name on being the most powerful console and yet now they are reduced to claiming resolution doesn't matter. That has got to hurt.

Heh, it must be nice to have so much power in an industry. I think I read a statement like that made by Intel last year.

For backwards compatibility, you need the same type of CPU? I thought that dealt with architecture, or is that the same thing?
 

Culex

Banned
Does 100 percent more GPU mean the GPU on the PS4 performs 100 percent better/more efficient/more powerful than the Xbox One? If that's true then the value proposition just increased over 25 percent, IMHO, for the life of the console.

Only for GPGPU calculations, though. Not every cycle running on the processor will be used for GPGPU. Those that are seem to be 100% faster on the PS4.
 
Ok, let's think for a second.

The CPU is the thing that is going to hold the consoles back, even the PS4. The way I see it, MS saw this coming and is investing in "the power of the cloud" to compute all the AI and destruction as we saw in their presentation.

I hope Sony didn't "waste" a lot of money on the GPU and RAM when their CPU is a bottleneck. But isn't Sony doing something like MS Azure?
Well you could offload some CPU processing to the "cloud" like Microsoft proposes, or you could offload some CPU processing to the GPU like Sony proposes. Personally I'd rather not be dependent on the cloud if I didn't have to be, and the trip to the GPU will have a lot less lag than having to hop out on the net.

For backwards compatibility, you need the same type of CPU? I thought that dealt with architecture, or is that the same thing?

I think the bigger issue would be with the GPU. It's coded much closer to the metal and would have the greatest change if they switched from AMD.
 

Rising_Hei

Member
FYI, that lowest common denominator is still higher than the vast majority of Windows PCs. Want to blame the lack of technical progress on something? Blame the consumers who won't spend more on hardware, be it console or PC.
This, if high-end PCs are finally showing progress right now it's because PS4 and XB1 have been released.
Nobody goes all-out on a high-end PC unless they're really rich, so the number of people with those PCs is just tiny; you have to optimize for everyone (mid-range and low-end), keeping the game playable, if you want to hit a good portion of the market.

Well you could offload some CPU processing to the "cloud" like Microsoft proposes, or you could offload some CPU processing to the GPU like Sony proposes. Personally I'd rather not be dependent on the cloud if I didn't have to be, and the trip to the GPU will have a lot less lag than having to hop out on the net.
The power of the cloud isn't a viable option like the power the PS4 already has inside.
 

Xando

Member
Ok, let's think for a second.

The CPU is the thing that is going to hold the consoles back, even the PS4. The way I see it, MS saw this coming and is investing in "the power of the cloud" to compute all the AI and destruction as we saw in their presentation.

I hope Sony didn't "waste" a lot of money on the GPU and RAM when their CPU is a bottleneck. But isn't Sony doing something like MS Azure?
If cloud AI means Titanfall AI, I'll take the GPGPU stuff.
IMO I doubt the cloud will do much unless you have god-tier internet or live near an Azure center. There are also a lot of potential risks when you rely on the cloud (an internet outage, for example).
Until they show me that it works perfectly in my home, I'd rather take a local solution.
 

Loofy

Member
As a member of PC GAF, I'm not laughing. I want technology to progress and that isn't going to happen when the lowest common denominator is that low. I do feel slightly vindicated, though, because I and many others were accused of having an agenda for stating the blindingly obvious ever since the next-gen console specs were released: these consoles are really weak compared to contemporary PC technology, and also weak at launch, in relative terms, compared to both the PS3 and 360.

In terms of CPU, it is clear that GPGPU is going to have to be pushed hard if these consoles are to last for five or six more years. I do not know however if using the GPU for CPU tasks costs GPU resources.
You should count yourself lucky. If console specs were any higher, the PC ports would probably need 8-core Haswells just to run Ubisoft's games.

Just be grateful we aren't running at the PC's lowest common denominator. Every game would look like TF2.
 

AmFreak

Member
But why the tide change in CPU performance? Is there a difference in the CPU structure that makes each CPU better suited for certain tasks? Or did the new June SDK give the XBO CPU a ~30% performance boost?

1. The One CPU is clocked 9.375% higher.
The PS4 CPU, scaled up to the One's clock, would come out at ~107.1875.
That would shrink the difference to ~5%.
Also, the GPU difference is much more questionable, because it's much bigger than it should be.

2. This is what I tried to say. These aren't benchmarks.
This is a paper to show what you can do with the GPUs in these consoles and how they did it.
We don't know how much time they spent optimizing any of these systems.
And the end result was clear from the beginning -> GPU wins.
Why would they spend endless time on any of these CPUs (old or new), if it was clear from the start that they would use the GPUs?
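
For anyone who wants to check that arithmetic, here's a quick sketch of the clock-scaling step. The dancer counts are assumptions back-solved from the ~107.1875 figure above and the ~15% gap discussed elsewhere in the thread, not numbers read off the slides:

```python
# Clock-normalisation sketch. The 1.6 / 1.75 GHz clocks are the known
# console CPU clocks; the dancer counts are assumed/back-solved figures.
ps4_clock, xb1_clock = 1.6, 1.75               # GHz
clock_ratio = xb1_clock / ps4_clock            # 1.09375 -> "9.375% higher"

ps4_dancers = 98                               # assumed PS4 CPU result
xb1_dancers = 113                              # assumed XB1 CPU result (~15% more)

ps4_at_xb1_clock = ps4_dancers * clock_ratio   # ~107.1875
remaining_gap = xb1_dancers / ps4_at_xb1_clock - 1

print(f"clock ratio:          {clock_ratio:.5f}")
print(f"PS4 scaled to 1.75:   {ps4_at_xb1_clock:.4f}")
print(f"remaining difference: {remaining_gap:.1%}")   # ~5%
```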
 

MaulerX

Member
If cloud AI means Titanfall AI, I'll take the GPGPU stuff.
IMO I doubt the cloud will do much unless you have god-tier internet or live near an Azure center. There are also a lot of potential risks when you rely on the cloud (an internet outage, for example).
Until they show me that it works perfectly in my home, I'd rather take a local solution.


But doesn't it seem like both are too much in their infancy to actually make a major impact this generation? By next generation I would expect both forms of tech to be advanced enough that both Sony and Microsoft would be using a form of BOTH solutions. Just my take.
 
Considering the diminished draw call overhead on PS4 vs DX11 XB1/PC and the gradual increase of specially-tailored code for it as the generation goes by, I wouldn't be too worried about the CPU. In fact, I actually think RAM could turn out to be the limiting factor again if Sony/MS don't free up some of it in the future.

People worried about the PS3 - PS4 CPU comparison need to be reminded that GPGPU is PS4's equivalent of the SPU sauce. If you compare the Cell's lone PPU to the 6 developer-enabled Jaguar cores... well, there's no comparison, really. If I'm not mistaken, each of the Xbox 360's 3 cores is based on the Cell PPU, so expect actual PPU performance for that scenario to be some number below the thirty-somethings managed by the 360 in that graph.
 

Locuza

Member
If I'm understanding the slides right, all that is saying is that using DX11 on the XB1 has an implied buffer copy while using DX11 on the PS4 it has to be done explicitly. It's still going through DX11 in both scenarios although it is using DX11 extensions that Sony has added to the API to manually manage the buffer on the PlayStation side. This is evident because it says DX11 on both slides.
The slides are about the porting.
From DX11, and I think they mean "PC" DX11, not the DX11.x from the Xbox One.
It doesn't go through DX11.
The engine uses the PlayStation Shader Language (PSSL) and goes through the GNM API.

I highly doubt an ARM APU would be viable even now for the current generation of consoles. The 64-bit cores are only now beginning to reach products outside Apple, and they scale poorly on the high-performance side of things. They are designed for efficiency at tight TDP/power envelopes. The x86 cores still dominate in applications where that's less of a constraint. Jaguar might not look like much vs. an i7, but it's more than capable enough vs. the highest-end ARM cores.
Nah, an ARM Cortex-A57 should be on the same level, if not higher.
Apple's Cyclone plays way above both.
 

Xando

Member
But doesn't it seem like both are too much in their infancy to actually make a major impact this generation? By next generation I would expect both forms of tech to be advanced enough that both Sony and Microsoft would be using a form of BOTH solutions. Just my take.
I'm not that deep into game development so I'm not sure, but if I remember right, ND and the ICE team mentioned they are already working with GPGPU(?).
 

CLEEK

Member
I do not know however if using the GPU for CPU tasks costs GPU resources.

Of course it does. How else would it work?

In the AMD world, both on PCs and the current consoles, the GPUs have ACEs (Asynchronous Compute Engines), each with 8 queues. You split your compute tasks into 'fine grain' chunks, and the ACEs queue them up and feed them to the GPU asynchronously. You obviously have to have spare CU capacity to do these tasks.

For reference, the PS4 has 8 ACEs with 64 queues (the same as the R9 290), whereas the XB1 only has 2 ACEs with 16 queues. The PS4 has ~500 GFLOPS more CU performance than the XB1, giving it substantially more GPGPU capability.
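
If it helps to picture the idea, here's a toy sketch of splitting compute into fine-grained jobs across the queues and draining them with whatever CU time is spare. Everything beyond the ACE/queue counts (the job names, the 300 spare slots) is made up for illustration; real ACE scheduling happens in hardware, not in code like this:

```python
from collections import deque

ACES, QUEUES_PER_ACE = 8, 8      # PS4: 8 ACEs x 8 queues = 64 (XB1: 2 x 8 = 16)

def make_queues(n_aces, per_ace=QUEUES_PER_ACE):
    return [deque() for _ in range(n_aces * per_ace)]

def submit(queues, jobs):
    """Round-robin fine-grained compute jobs across the queues."""
    for i, job in enumerate(jobs):
        queues[i % len(queues)].append(job)

def drain(queues, spare_cu_slots):
    """Run as many queued jobs as the GPU has idle CU slots this frame."""
    done = 0
    for q in queues:
        while q and done < spare_cu_slots:
            q.popleft()
            done += 1
    return done

queues = make_queues(ACES)
submit(queues, [f"cloth_chunk_{i}" for i in range(640)])  # e.g. 640 dancers' worth of work
print("compute jobs squeezed into this frame:", drain(queues, spare_cu_slots=300))
```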
 
I do not know however if using the GPU for CPU tasks costs GPU resources.

Well, of course it does, but it's rather rare to have actual 100% GPU usage during the entire frametime. With ACEs and GPGPU you can minimize this "waste". Are you familiar with The Tomorrow Children's presentation on their render methods?

I don't believe this for a second. You have some benchmarks to back this up?

Yeah, I'm a big fan of Apple's custom CPU initiative and its results, but I have a hard time believing that.
 

AlphaDump

Gold Member
Does the cloth/dancer example, with around 100 dancers for XB1/PS4, represent the current CPU solution for AC: Unity?


It seems the later slides show the PS4 being more than 6x better (600+ dancers) than the current solution when using GPU solutions, unless I am reading that wrong...
 
The slides are about the porting.
From DX11, and I think they mean "PC" DX11, not the DX11.x from the Xbox One.
It doesn't go through DX11.
The engine uses the PlayStation Shader Language (PSSL) and goes through the GNM API.

It literally says "On DirectX 11". Why would it say that if it isn't using DX11 emulation? And I'm not talking about the shader language. The PS4 has a native API for talking to the GPU called GNM. On top of that they have an API called GNMX which is meant to help port DX11 code over, but it has overhead and is not a perfect 1-to-1 match with DX11.

Naughty Dog Developers Discuss Xbox One DirectX 12 & Playstation 4 API Optimization

With that said, we do know that the PS4 has a low-level and a 'wrapper' high-level means of accessing the GPU. GNM, the lowest-level API, allows you to control many of the functions of the GPU, but requires much more work. The higher-level API available for the PS4 is known as GNMX and offers developers an easier time in porting titles over if they're more familiar with, say, DX11. It is perfect for smaller studios who aren't bothered about squeezing every last drop from the heart of the PS4's GPU due to their game not requiring the extra grunt.

As stated on the slides, Assassin's Creed is using the DX11 emulation layer; however, because it isn't exactly like PC DX11, they have to account for the differences when using the compute shaders. On the PS4, if the buffer isn't being used then it doesn't need to be copied first. That's the first "The PS4 Version" slide. However, if the buffer is being used, a copy needs to be made first, which is the second slide. This is what is explained. They are using the Sony extensions to DX11 in the GNMX API to manually control the buffer, but they are still going through GNMX.
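
A rough pseudocode sketch of the copy behaviour being described; none of these names are real D3D or GNM/GNMX calls, it's just the control flow from the slides wrapped in a toy class:

```python
class Buffer:
    """Stand-in for a GPU buffer; purely illustrative."""
    def __init__(self, data=None):
        self.data = data
    def duplicate(self):
        return Buffer(self.data)
    def write(self, new_data):
        self.data = new_data

def dx11_style_update(buf, new_data):
    # DX11 model: the runtime duplicates/synchronises behind your back
    # (the implicit copy); you don't get to decide.
    copy = buf.duplicate()
    copy.write(new_data)
    return copy

def ps4_style_update(buf, new_data, gpu_still_reading):
    # Per the slides: nothing is implicit. You only pay for a copy when
    # you know the GPU is still reading the old contents.
    if gpu_still_reading:
        buf = buf.duplicate()      # explicit copy, only when needed
    buf.write(new_data)
    return buf

cloth = Buffer("frame N cloth data")
cloth = ps4_style_update(cloth, "frame N+1 cloth data", gpu_still_reading=False)
print(cloth.data)
```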
 

beast786

Member
People DO understand that Microsoft has some of the smartest graphics programmers IN THE WORLD. They CREATED DirectX, the standard APIs that everyone programs against. So while people laud Sony for their HW skills, do you really think that MS doesn't know how to build a system optimized for maximizing graphics for programmers? Seriously? There is no way they are giving up a 30%+ advantage to Sony. And ANYONE who has seen both systems running could say there are great looking games on both systems. If there was really a huge performance difference – it would be obvious.

I get a ton of hate for saying this – but it’s been the same EVERY generation. Sony claims more power, they did it with Cell, they did it with Emotion Engine, and they are doing it again. And, in the end, games on our system looked the same or better.

I’m not saying they haven’t built a good system – I’m merely saying that anyone who wants to die on their sword over this 30%+ power advantage are going to be fighting an uphill battle over the next 10 years…
 

Inuhanyou

Believes Dragon Quest is a franchise managed by Sony
This was the best solution for the price and performance that we got, I think. Going for monster components last time was the reason the 7th gen was so long. If they can avoid that, this gen should be easy to get through.

Does the cloth/dancer example, with around 100 dancers for XB1/PS4, represent the current CPU solution for AC: Unity?


It seems the later slides show the PS4 being more than 6x better (600+ dancers) than the current solution when using GPU solutions, unless I am reading that wrong...

As said in the slides, they are still in the testing phase for these techniques. AC Unity has been in development for about 3 years, which was far too early to be utilizing that kind of tech from the start.

On the other hand, Infamous already uses GPGPU for every particle in the world, The Tomorrow Children uses it for its voxel lighting, and UC4 probably uses it to a large extent as well. Killzone uses it for task scheduling, and I think Deep Down might be using it for its fire, water and debris physics too... so it's not like taking advantage of these things should be too far off, because they are already being used.
 

Locuza

Member
I don't believe this for a second. You have some benchmarks to back this up?
Well, first of all, every comparison is rough, because we are looking at two different ISAs and the software can make a huge difference; the OS can also impact the scores by a huge amount.

You can find a good technical article about Apple's A7 on AnandTech:
http://www.anandtech.com/show/7910/apples-cyclone-microarchitecture-detailed

You will see that the Cyclone architecture isn't a cute little one.
The cores are beefy and beat everything so far on the ARM side in IPC.
A good question would be how far you could push the clock rate if not power constrained.

Sadly, most of the time you can only find Geekbench or browser benchmarks.
And even worse, I didn't find a good comparison quickly.

So I will only copy something quick and dirty from Notebookcheck:
http://www.notebookcheck.net/Apple-A7-SoC.103280.0.html

http://www.notebookcheck.net/AMD-A-Series-A4-5000-Notebook-Processor.92867.0.html

It literally says "On DirectX 11". Why would it say that if it isn't using DX11 emulation? And I'm not talking about the shader language. The PS4 has a native API for talking to the GPU called GNM. On top of that they have an API called GNMX which is meant to help port DX11 code over, but it has overhead and is not a perfect 1-to-1 match with DX11.
It says "on DX11" because it only should show you how the DX11 Model works.
Further they say "on PS4"
No implicit synchronization, no implicit buffer duplication You have to manage everything by yourself

If i must to speculate, i would say GNMX takes some responsibility, so you don't have to manage everything yourself.
One similar way MS should be going.
DX12 with explicit synchronization vs. DX11.3 with implicit.
 
1. The One CPU is clocked 9.375% higher.
The PS4 CPU, scaled up to the One's clock, would come out at ~107.1875.
That would shrink the difference to ~5%.
Also, the GPU difference is much more questionable, because it's much bigger than it should be.

2. This is what I tried to say. These aren't benchmarks.
This is a paper to show what you can do with the GPUs in these consoles and how they did it.
We don't know how much time they spent optimizing any of these systems.
And the end result was clear from the beginning -> GPU wins.
Why would they spend endless time on any of these CPUs (old or new), if it was clear from the start that they would use the GPUs?
Where I'm going is: in December 2013, the PS4 CPU ran at 1.6 GHz and the XBO CPU ran at 1.75 GHz. So they were the same speeds as today. The Substance benchmark results gave a clear advantage to the PS4 CPU, which could process 14 MB/s of texture data per core while the XBO CPU could only process 12 MB/s per core. These results made people here think that you could obtain more performance from the PS4 CPU, and a user here who I understood is in the know (Matt) confirmed that was true. Ok.

Now we have these guys at Ubi using the CPU to process a cloth simulation, and the results are the opposite of what we got in the Substance benchmark, as in now the XBO's CPU can process 15% more dancers than the PS4's CPU. My question is, and maybe it is just impossible to answer for somebody who doesn't work at Ubi, why are the performance results for their simulation the opposite of what we got last year with the Substance benchmark? I understand different engines have different requirements, etc., but I thought the Jaguar cores in both current-gen machines were essentially equal. So why the difference, and especially, why is the difference now the opposite of what it was in 2013 (the PS4 performance delta changed from +15% to -15%)?

When I saw the Substance benchmark last year, I assumed it was a good representation of what we could expect from each CPU, but now the results are the opposite, and I don't know what to make of it. What is the real performance delta in CPU? Does the PS4 have 15% more CPU performance, as I thought last year? Is the XBO's CPU 15% more powerful, according to today's info? Or is the real performance delta 9.x% in the XBO's favour, as the clock speed indicates, and we are just seeing different engines giving different results because of factors we don't know about?
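
Putting the three deltas from the question side by side (figures as quoted above):

```python
# Substance (Dec 2013) figures and the cloth-sim gap, as quoted in the post.
substance_ps4, substance_xb1 = 14.0, 12.0   # MB/s of texture data per core
dancers_gap = 0.15                          # XB1 ~15% more dancers (this talk)
clock_gap = 1.75 / 1.6 - 1                  # XB1 clock advantage

print(f"Substance, per core:  PS4 +{substance_ps4 / substance_xb1 - 1:.1%}")
print(f"Cloth sim (Ubisoft):  XB1 +{dancers_gap:.0%}")
print(f"Clock speed alone:    XB1 +{clock_gap:.3%}")
```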
 
People DO understand that Microsoft has some of the smartest graphics programmers IN THE WORLD. They CREATED DirectX, the standard APIs that everyone programs against. So while people laud Sony for their HW skills, do you really think that MS doesn't know how to build a system optimized for maximizing graphics for programmers? Seriously? There is no way they are giving up a 30%+ advantage to Sony. And ANYONE who has seen both systems running could say there are great looking games on both systems. If there was really a huge performance difference – it would be obvious.

I get a ton of hate for saying this – but it’s been the same EVERY generation. Sony claims more power, they did it with Cell, they did it with Emotion Engine, and they are doing it again. And, in the end, games on our system looked the same or better.

I’m not saying they haven’t built a good system – I’m merely saying that anyone who wants to die on their sword over this 30%+ power advantage are going to be fighting an uphill battle over the next 10 years…

Umm... your "smartest graphics programmers IN THE WORLD" had the worse API and development environment of this console generation at release, and the problem with the PS3 was with its hardware, not its software. The Cell did have more power, as is shown by the slides referenced in the article, but it was much harder to code for. That is the reason that power wasn't utilized.

The XB1 is both underpowered and harder to code for this generation. There is simply no way to make up in software for a hardware deficiency. Did you not see the last graph in the article? If Microsoft was as good as you say, then how can this graph possibly be true?

playstation-4-vs-xbox-one-vs-xbox-360-vs-ps3-cpu-and-gpu.jpg
 
Umm... your "smartest graphics programmers IN THE WORLD" had the worse API and development environment of this console generation at release, and the problem with the PS3 was with its hardware, not its software. The Cell did have more power, as is shown by the slides referenced in the article, but it was much harder to code for. That is the reason that power wasn't utilized.

The XB1 is both underpowered and harder to code for this generation. There is simply no way to make up in software for a hardware deficiency. Did you not see the last graph in the article? If Microsoft was as good as you say, then how can this graph possibly be true?

playstation-4-vs-xbox-one-vs-xbox-360-vs-ps3-cpu-and-gpu.jpg

I'm sure he is being sarcastic. Those were Penello's words.
 

androvsky

Member
Where I'm going is: in December 2013, the PS4 CPU ran at 1.6 GHz and the XBO CPU ran at 1.75 GHz. So they were the same speeds as today. The Substance benchmark results gave a clear advantage to the PS4 CPU, which could process 14 MB/s of texture data per core while the XBO CPU could only process 12 MB/s per core. These results made people here think that you could obtain more performance from the PS4 CPU, and a user here who I understood is in the know (Matt) confirmed that was true. Ok.

Now we have these guys at Ubi using the CPU to process a cloth simulation, and the results are the opposite of what we got in the Substance benchmark, as in now the XBO's CPU can process 15% more dancers than the PS4's CPU. My question is, and maybe it is just impossible to answer for somebody who doesn't work at Ubi, why are the performance results for their simulation the opposite of what we got last year with the Substance benchmark? I understand different engines have different requirements, etc., but I thought the Jaguar cores in both current-gen machines were essentially equal. So why the difference, and especially, why is the difference now the opposite of what it was in 2013 (the PS4 performance delta changed from +15% to -15%)?

When I saw the Substance benchmark last year, I assumed it was a good representation of what we could expect from each CPU, but now the results are the opposite, and I don't know what to make of it. What is the real performance delta in CPU? Does the PS4 have 15% more CPU performance, as I thought last year? Is the XBO's CPU 15% more powerful, according to today's info? Or is the real performance delta 9.x% in the XBO's favour, as the clock speed indicates, and we are just seeing different engines giving different results because of factors we don't know about?
I seem to recall one of these Ubisoft presentations mentioning something about getting MS to cut back on the CPU reservation recently, along with a bunch of other devs asking for it too.
 

AmyS

Member
Right now, with PS4 having been on the market for very nearly 1 year (just about 3 weeks short of Nov. 15th) I'd love to know what Mark Cerny thinks would be reasonable as far as CPU / GPU / APU and memory specifications for the next generation PS5.
 
Well, first of all, every comparison is rough, because we are looking at two different ISAs and the software can make a huge difference; the OS can also impact the scores by a huge amount.

You can find a good technical article about Apple's A7 on AnandTech:
http://www.anandtech.com/show/7910/apples-cyclone-microarchitecture-detailed

You will see that the Cyclone architecture isn't a cute little one.
The cores are beefy and beat everything so far on the ARM side in IPC.
A good question would be how far you could push the clock rate if not power constrained.

The A7/A8 are no doubt very impressive for a TDP-constrained, performance-per-watt-focused use case. That article says nothing about how it compares to Jaguar. The only benchmark I could find was a comparison of a lower-clocked Kabini (1.5 GHz vs. the consoles') to a higher-clocked ARM processor from Samsung. The Kabini is much, much faster, even though Intel blows it out of the water. I'm sure the A8X is better, but so is the Jaguar in the consoles (256-bit memory etc.). ARM isn't ready for non-mobile applications yet.
 

CLEEK

Member
I'm sure he is being sarcastic. Those were Penello's words.

For those who missed it, that was Albert Penello, doing an amazing job of spreading spin and FUD prior to the XB1's release. Around the time of the quoted post (it might have even been the same day), he bet that you might see at most a "single digit" real-world FPS performance difference between XB1 and PS4 games.

Yeah, Albert, about that...

I don't know how much you know about GAF meme culture (I'm an expert), but constant repetition is a huge part of it.

It's over, period. EventHorizon lost their last ace, and that's the end of their NeoGAF hopes and dreams.

It's not hyperbole, it's not fanboy drivel. It is LITERALLY it for EventHorizon. He has nothing left, nothing he can post tomorrow would fix the hole now created. There is no reason left for any one, hardcore or casual, to not block him. Except if they want to play Carnival of Stupid. Which will also come to Wii at some point.

The age of EventHorizon is done.
 
The Ubisoft benchmark just proves that the PS4's more powerful GPU, with unified fast RAM and a better compute structure, can give you a 100% advantage over the X1.

from the article:
the performance difference between the Xbox One's GPU and Playstation 4's GPU is actually slightly higher than what you might think. What Ubisoft do say is that "PS4 – 2 ms of GPU time – 640 dancers" – but no form of metric for the Xbox One, which is a bit of a shame. It's clear however that for the benchmarking Ubisoft have used here, the PS4's GPU is virtually double the speed.

Now pump that resolution Ubi !!
 
I'm sure he is being sarcastic. Those were Penello's words.

I don't know how much you know about GAF meme culture (I'm an expert), but constant repetition is a huge part of it.

Damn, I think I blew a few brain fuses trying to understand this tech stuff. That went right over my head.

It says "on DX11" because it only should show you how the DX11 Model works.
Further they say "on PS4"
But that is not how DX11 model works. The DX11 model does an implicit copy of the buffer before use, yet the first slide doesn't show the copy. That slide is how the DX11 model works on the PS4.

If I had to speculate, I would say GNMX takes over some of that responsibility, so you don't have to manage everything yourself.
MS seems to be going a similar route:
DX12 with explicit synchronization vs. DX11.3 with implicit.
GNMX is a compatibility wrapper around GNM to make it easier to port DX11 code, but it is not DX11. Here is another explanation of it:

How The Crew was ported to PlayStation 4

"Most people start with the GNMX API, which wraps around GNM and manages the more esoteric GPU details in a way that's a lot more familiar if you're used to platforms like D3D11. We started with the high-level one but eventually we moved to the low-level API because it suits our uses a little better," says O'Connor, explaining that while GNMX is a lot simpler to work with, it removes much of the custom access to the PS4 GPU, and also incurs a significant CPU hit.

It is entirely possible that I'm wrong, but according to their own slides, Assassin's Creed is using the DX11 wrapper to access the PS4's GPU, which has a significant CPU overhead. That would explain why they say that the CPU was the limiting factor and that resolution parity was the best they could achieve when their own graphs show a huge PS4 GPU advantage.
 

bobbytkc

ADD New Gen Gamer
So why did Microsoft and Sony opt for relatively weak CPUs from AMD instead of chips from Intel? A lot of products nowadays have Intel chips as their CPUs. In Microsoft's case, the Surface runs on i3, i5, and i7. Perhaps they will use profits from this gen to create powerful hardware next generation? Phones, tablets, and other stuff can be crazy expensive, but gaming consoles can't?
A console is half the price of an iPhone.
 

RoboPlato

I'd be in the dick
It is entirely possible that I'm wrong, but according to their own slides, Assassin's Creed is using the DX11 wrapper to access the PS4's GPU, which has a significant CPU overhead. That would explain why they say that the CPU was the limiting factor and that resolution parity was the best they could achieve when their own graphs show a huge PS4 GPU advantage.

This is a very good theory. Kind of nuts, though, that Ubi would be using the wrapper API, since Sony put that in for smaller studios with fewer resources.
 
"What Ubisoft do say is that “PS4 – 2 ms of GPU time – 640 dancers” – but no form of metric for the Xbox One, which is a bit of a shame."

2 ms on the XBone GPU would give you 332 dancers. The last graph in the OP shows 5 ms of time on each processor. I don't know why they broke out a 2 ms slice on the PS4, but it's just 40% of the 5 ms slice.

Probably more interesting to say that 0.3 ms on the PS4 GPU gets you 100 dancers, roughly the same as 5 ms on the CPUs of the PS3/PS4/XBone. It takes 0.6 ms on the XBone's GPU to process the same 100 dancers. So that stuff sorta represents "double the burden" on the XBone's GPU.
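
Worked out as rates, using the figures quoted above:

```python
# Dancer counts and GPU times as quoted in the post / slides.
ps4_rate_a = 640 / 2.0   # dancers per ms of PS4 GPU time  -> 320
ps4_rate_b = 100 / 0.3   # same metric from the 0.3 ms figure -> ~333
xb1_rate   = 100 / 0.6   # dancers per ms of XB1 GPU time  -> ~167

print(f"PS4 GPU: ~{ps4_rate_a:.0f}-{ps4_rate_b:.0f} dancers per ms")
print(f"XB1 GPU: ~{xb1_rate:.0f} dancers per ms")
print(f"PS4/XB1 ratio: ~{ps4_rate_b / xb1_rate:.1f}x")
```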
 

Cole Slaw

Banned
Right now, with PS4 having been on the market for very nearly 1 year (just about 3 weeks short of Nov. 15th) I'd love to know what Mark Cerny thinks would be reasonable as far as CPU / GPU / APU and memory specifications for the next generation PS5.
I'd love to know who Mark Cerny thinks could be the next Mark Cerny, so Sony can start grooming him for the job of architecting the PS6.
 

Locuza

Member
The A7/A8 are no doubt very impressive for a TDP-constrained, performance-per-watt-focused use case. That article says nothing about how it compares to Jaguar. The only benchmark I could find was a comparison of a lower-clocked Kabini (1.5 GHz vs. the consoles') to a higher-clocked ARM processor from Samsung. The Kabini is much, much faster, even though Intel blows it out of the water. I'm sure the A8X is better, but so is the Jaguar in the consoles (256-bit memory etc.). ARM isn't ready for non-mobile applications yet.
If you look at Notebookcheck, they have some benchmarks at the bottom.

The benchmark you found uses the old A15; the new A57 should be a good chunk faster:
http://images.anandtech.com/doci/7995/Screen%20Shot%202014-05-06%20at%202.59.56%20AM.png

AMD's ARM SoC Seattle is the successor to the old Jaguar-based X-"****" products:
http://community.amd.com/servlet/JiveServlet/showImage/38-2545-2550/AMD+Server+Roadmap.jpg

And the raw specs of the ARM A57 are really similar:
cache sizes, 2 GHz clocks, three-way superscalar (Jaguar is two-way).

And since the IPC of Apple's A7/A8 is much higher, I don't see Jaguar being on top of ARM.

It is entirely possible that I'm wrong, but according to their own slides, Assassin's Creed is using the DX11 wrapper to access the PS4's GPU, which has a significant CPU overhead.
I understand it totally differently than you. ^^
But on which slide do they say they are using "the DX11 wrapper to access the PS4's GPU"?
 
A console is half the price of an iPhone.
Which people keep for 2, 3 years tops. These consoles are here to stay for 5-7 years. I'm not saying consoles should be crazy expensive, but a $500 PS4 wouldn't be so bad. Oh well.

So as this gen wears on, developers will make more use of GPGPU, which will help create more advanced AI, right?
 
I’m not saying they haven’t built a good system – I’m merely saying that anyone who wants to die on their sword over this 30%+ power advantage are going to be fighting an uphill battle over the next 10 years…
Good ol' Penello.
Funny that it isn't even "30%+". It's "40%+"
 