
Ubisoft GDC EU Presentation Shows Playstation 4 & Xbox One CPU & GPU Performance- RGT

jelly

Member
It is really irritating to me just how low MS and Sony went on these CPUs. Just terrible.

I think it came down to a choice between this and nothing.

Cheap, more cores is the way to go, and heat-friendly. I think they own the designs, which leads to better cost reduction. Intel would be expensive, hot, brute force, and hard to cost-reduce; ask MS about the original Xbox.
 

Orayn

Member
How can this be true if PS4 is better at offloading CPU tasks to the GPU?

GPGPU is only useful for specific applications. PS4 could be "better at offloading" when it's actually possible, but there are plenty of cases when it's not possible at all.
 

Bl@de

Member
When games look like this, I don't really care for CPU. The GPU is where you're going to see a lot of tasks offloaded, and that will have an effect on graphics.

Well okay ... graphics ... But I'd love to see advancements in physics, AI and other fields ... And a Jaguar 1.6GHz CPU is a piece of crap when looking at performance ... That's just how it is. And this generation just started.

Maybe developers will just find more ways to use the GPU for other things too.
 

omonimo

Banned
You do realize how computers work, right?
You cannot just magically offload everything to a GPU, let alone assume there is enough bandwidth and power for doing all of that whilst maintaining high-quality shading and image quality.
But there is a CPU on the console.
 
I think it came down to a choice between this and nothing.

Cheap, more cores is the way to go, and heat-friendly. I think they own the designs, which leads to better cost reduction. Intel would be expensive, hot, brute force, and hard to cost-reduce; ask MS about the original Xbox.

That is not how it would be at all. It would just be expensive. Just because it is an Intel chip does not make it "brute force" or something. It would just be better... at everything.

But there is a CPU on the console.

Of course, but his post was pointing to offloading to the GPU because of the CPU. I am pointing out that you cannot just "offload" to the GPU; it does not work that way.
 

88random

Member
I'm pretty sure Sony's and MS's engineers know better than the majority of you guys; they wouldn't have put these CPUs in if they thought they were going to be bottlenecks. GPUs will take care of the majority of stuff.
 
I'm pretty sure that's bogus. I mean, no way Microsoft would let Sony have a 30% advantage, talk about almost 100%... right?
 

JordanN

Banned
You do realize how computers work, right?
You cannot just magically offload everything to a GPU, let alone assume there is enough bandwidth and power for doing all of that whilst maintaining high-quality shading and image quality.

You're right. But I think, given time, you'll forget about the CPU weakness in favor of the plethora of things the GPU will let you do.

I've only seen one developer point out the CPU as challenging their game, but that same developer already went on to make a far more ambitious game than others (Sucker Punch).
 
I'd like 1080p/30fps, no bugs, and weapon and clothing animations that don't clip through the fucking character. Seriously, it's been happening since AC1. You run and the sword goes through your character and the robe goes through his legs. How is this still happening?
 

The Llama

Member
It is really irritating to me just how low MS and Sony went on these CPUs. Just terrible.

If devs actually multi-thread their games properly and use their GPGPU capabilities properly, I guarantee that games will be GPU-limited far, far more often than CPU-limited. Hell, last gen devs offloaded some things from the PS3's GPU to the Cell's SPUs because they were so GPU-limited but had spare CPU cycles.
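To make the "multi-thread properly" point concrete, here is a minimal CPU-side sketch of splitting per-frame work across worker threads instead of running everything on one big game thread. The Entity struct, updateEntity and the worker count are made up for illustration; six is roughly the number of cores games typically get on these consoles.

```cpp
// Minimal sketch: splitting per-frame entity updates across worker threads.
// Entity, updateEntity and the worker count are illustrative, not from the thread.
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <thread>
#include <vector>

struct Entity { float x = 0.f, v = 1.f; };

// Stand-in for per-entity AI/animation/physics work.
void updateEntity(Entity& e, float dt) {
    e.x += e.v * dt;
    e.v -= 0.1f * std::sin(e.x) * dt;
}

int main() {
    std::vector<Entity> entities(10000);
    const float dt = 1.0f / 30.0f;
    const unsigned workers = 6; // roughly the core count available to games here

    std::vector<std::thread> pool;
    const std::size_t chunk = (entities.size() + workers - 1) / workers;
    for (unsigned w = 0; w < workers; ++w) {
        pool.emplace_back([&, w] {
            const std::size_t begin = w * chunk;
            const std::size_t end = std::min(begin + chunk, entities.size());
            for (std::size_t i = begin; i < end; ++i) updateEntity(entities[i], dt);
        });
    }
    for (std::thread& t : pool) t.join();
}
```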
 

Chaostar

Member
GPGPU is only useful for specific applications. PS4 could be "better at offloading" when it's actually possible, but there are plenty of cases when it's not possible at all.

So we're supposed to ignore this advantage because there are some situations where it won't apply? Asynchronous compute is likely to become a pretty big deal in the future progression of games on both consoles.
 
You're right. But I think, given time, you'll forget about the CPU weakness in favor of the plethora of things the GPU will let you do.

I've only seen one developer point out the CPU as challenging their game, but that same developer already went on to make a far more ambitious game than others (Sucker Punch).

Multiple devs have pointed to the CPUs being problematic, with Crytek saying that the CPU's low power almost endangered the production of Ryse.

My recommendation is just to not put all your eggs (hopes) in the basket of GPUs magically fixing any CPU problems.
 
I'm pretty sure Sony's and MS's engineers know better than the majority of you guys; they wouldn't have put these CPUs in if they thought they were going to be bottlenecks. GPUs will take care of the majority of stuff.

If the use cases are there. It's in the article: you want to offload tasks with more than 64 objects, so you can fill up a GCN wavefront and make optimal use of the GPGPU capabilities.

More info in the AMD OpenCL programmer's guide.
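For anyone wondering what "filling a wavefront" means in practice, here is a small sketch of the arithmetic. The 64 reflects GCN's wavefront width; the helper name and the object counts are just illustrative (they echo the dancer numbers mentioned elsewhere in the thread), not anything from the presentation.

```cpp
// GCN runs work-items in wavefronts of 64; dispatch sizes that aren't a multiple
// of 64 leave lanes of the last wavefront idle. Helper and counts are illustrative.
#include <cstdio>

constexpr unsigned kWavefrontSize = 64;

constexpr unsigned roundUpToWavefront(unsigned items) {
    return ((items + kWavefrontSize - 1) / kWavefrontSize) * kWavefrontSize;
}

int main() {
    const unsigned counts[] = {38, 64, 200, 1600};
    for (unsigned items : counts) {
        const unsigned launched = roundUpToWavefront(items);
        std::printf("%4u objects -> %4u work-items (%2u wavefronts, %2u idle lanes)\n",
                    items, launched, launched / kWavefrontSize, launched - items);
    }
}
```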
 

Daviii

Member
Well, sorry for bombing the thread, but it looks shockingly obvious that as the generation advances and the tasks traditionally on the CPU are more often and more efficiently offloaded to the GPU, the difference between PS4 and Xbox One should theoretically increase, rather than what I'm used to hearing in this forum. Right?
 
The only thing that endangered Ryse was the employees' payroll.

Clever post. You realize that has nothing to do with what I was talking about though...

Care to give some concrete data about this weakness? Not just the FLOPS of the CPU.

The entire inclusion of distance shadow maps in the CryEngine branch for Ryse is because the CPUs could not handle it; their tech PDFs point to this.
 

gruenel

Member
Well okay ... graphics ... But I'd love to see advancements in physics, AI and other fields ... And a Jaguar 1.6GHz CPU is a piece of crap when looking at performance ... That's just how it is. And this generation just started.

Maybe developers will just find more ways to use the GPU for other things too.

Physics processing is the perfect example of a task that can be offloaded to the GPU very well!

That's why PhysX exists.
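As a toy illustration of why physics fits GPGPU so well (the struct and the numbers below are made up, not from any engine): thousands of particles get the exact same, independent update each step, so on a GPU every loop iteration could simply become one work-item.

```cpp
// Toy particle integration: identical, independent math per element, which is
// the kind of work that maps cleanly onto GPU compute (one iteration per work-item).
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };

void integrate(std::vector<Particle>& particles, float dt) {
    const float gravity = -9.81f;
    for (Particle& p : particles) {
        p.vz += gravity * dt; // same instructions for every particle
        p.x += p.vx * dt;     // no particle depends on another
        p.y += p.vy * dt;
        p.z += p.vz * dt;
    }
}

int main() {
    std::vector<Particle> particles(100000, Particle{0, 0, 10, 1, 0, 0});
    for (int step = 0; step < 60; ++step) integrate(particles, 1.0f / 60.0f);
}
```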
 

omonimo

Banned
Multiple devs have pointed to the CPUs being problematic, with Crytek saying that the CPU's low power almost endangered the production of Ryse.

My recommendation is just to not put all your eggs (hopes) in the basket of GPUs magically fixing any CPU problems.

Multiple? Like?
 

Thrakier

Member
Oh wow, that CPU is a joke in both consoles. And here I am, thinking about upgrading my 2500K... at 4.1GHz.

What the hell.
 
Fascinating presentation that largely confirms existing speculation about both platforms. The PS4 does appear to be a forward-thinking design, given how it seems to have been built around leveraging GPU compute. I wonder what this means for multiplat games as they begin to use more and more GPU compute on both consoles. The PS4 would seem to have a lot more overhead to use for GPGPU. Multiplats right now are largely feature-equal on both platforms, and the GPU difference just shows up as better resolutions/framerates for the PS4. In the future, you could see PS4 versions of multiplats have more graphical features because the Xbox One versions won't have enough compute overhead available to do the same at acceptable resolutions and framerates.
 

vpance

Member
Well, sorry for bombing the thread, but it looks shockingly obvious that as the generation advances and the tasks traditionally on the CPU are more often and more efficiently offloaded to the GPU, the difference between PS4 and Xbox One should theoretically increase, rather than what I'm used to hearing in this forum. Right?

Yes, unless forced parity remains a thing. Hopefully a 40M install base vs 18M will be a big enough difference to dissuade such tactics.
 
You do realize how computers work, right?
You cannot just magically offload everything to a GPU, let alone assume there is enough bandwidth and power for doing all of that whilst maintaining high-quality shading and image quality.


gpgpu doesn't work against image quality, does it? i'm pretty sure i read somewhere cerny saying devs can up performance without sacrificing graphical quality. i imagine there will be tradeoffs, but gpgpu doesn't take away from rendering and vice versa.

while there will be processes that will always require the cpu or will always be better on the cpu, my guess is that a lot of traditional cpu tasks will move over to the gpu in the future.
 

hodgy100

Member
I think it came down to a choice between this and nothing.

Cheap, more cores is the way to go, and heat-friendly. I think they own the designs, which leads to better cost reduction. Intel would be expensive, hot, brute force, and hard to cost-reduce; ask MS about the original Xbox.

Intel's chips are much, much more efficient than AMD's: they use less power and produce less heat, all while being more powerful. The decision to go with AMD chips will have been a cost-reduction one, as Intel is not cheap. But Intel is better than AMD in every metric for CPUs.
 
Multiple? Like?

Crytek, Sucker Punch, and the guys making the new Call of Duty.
gpgpu doesn't work against image quality, does it? i'm pretty sure i read somewhere cerny saying devs can up performance without sacrificing graphical quality. i imagine there will be tradeoffs, but gpgpu doesn't take away from rendering and vice versa.

while there will be processes that will always require the cpu or will always be better on the cpu, my guess is that a lot of traditional cpu tasks will move over to the gpu in the future.

Even when you are using compute units on a GPU, you are feeding the GPU with information. Bandwidth is required. The consoles have shared bandwidth and a limited amount to go around. The GPU is not just some free spot where you throw some data at it and it crunches it for free. It is a trade-off like anything else.
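Some rough numbers to put the shared-bandwidth point in perspective. The 176 GB/s figure is the commonly quoted peak for the PS4's unified GDDR5; the "256 MB read + 256 MB written per frame" compute job is entirely made up for illustration, and real sustained bandwidth sits well below peak.

```cpp
// Back-of-the-envelope: how much of a shared memory budget one compute pass eats.
// 176 GB/s is the commonly quoted PS4 peak; the 256 MB in / 256 MB out job is made up.
#include <cstdio>

int main() {
    const double peak_gb_per_s = 176.0;
    const double compute_traffic_gb = (256.0 + 256.0) / 1024.0; // read + write per frame

    const double fps_values[] = {30.0, 60.0};
    for (double fps : fps_values) {
        const double frame_budget_gb = peak_gb_per_s / fps;
        const double share = 100.0 * compute_traffic_gb / frame_budget_gb;
        std::printf("%2.0f fps: ~%.2f GB/frame shared by CPU, rendering and compute; "
                    "this one pass takes ~%.1f%% of it\n", fps, frame_budget_gb, share);
    }
}
```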
 

JordanN

Banned
Multiple devs have pointed to the CPUs being problematic, with Crytek saying that the CPU's low power almost endangered the production of Ryse.

My recommendation is just to not put all your eggs (hopes) in the basket of GPUs magically fixing any CPU problems.

I don't think the system is magic. I've admitted the CPU can be a bottleneck when presented with it. But I'm sick of hearing the bashing the consoles get just for having this CPU, as if you can't make a game with it.

Even last page, someone said the PS4 is going to be held back the most by it, yet just about every multiplat has been better on it regardless of CPU clock. When it comes to value, you're not buying the console in hopes of what the CPU can do.
 

R1CHO

Member
gpgpu doesn't work against image quality, does it? i'm pretty sure i read somewhere cerny saying devs can up performance without sacrificing graphical quality. i imagine there will be tradeoffs, but gpgpu doesn't take away from rendering and vice versa.

while there will be processes that will always require the cpu or will always be better on the cpu, my guess is that a lot of traditional cpu tasks will move over to the gpu in the future.

There is no magic. If you use it for one thing, you don't have it at the same time for another.
 
I don't think the system is magic. I've admitted the CPU can be a bottleneck when presented with it. But I'm sick of hearing the bashing the consoles get just for having this CPU, as if you can't make a game with it.

Even last page, someone said the PS4 is going to be held back the most by it, yet just about every multiplat has been better on it regardless of CPU clock. When it comes to value, you're not buying the console in hopes of what the CPU can do.

Of course you will get games on 'em (they already have 'em!). I am just recommending that you temper your 'technical' conclusions about these boxes, especially in regards to CPU-related things.
 

MaulerX

Member
Really disappointing. If only Microsoft had focused on power (and not Kinect), we probably would have had a beast of a console at the same price. Oh well.
 

truth411

Member
And sell it at $600? That worked really well for the PS3...

Another generation of horrible ports, probably.

Nah, devs know how to use Cell now; it would have led to amazing exclusives, but it would have bumped up the price a bit, since you can't have Cell and a GPU on the same die. Wasn't Sony making a 4 PPE, 32 SPE Cell processor a few years ago that was aiming for 1 TFLOP? That Cell processor plus a dedicated GPU working together would have been a MONSTER of a console.
 
Wait a minute, you said they claimed this CPU is not powerful enough for the console? Crytek, of course, but the others too? I hardly believe so.

No, they never said that it is not enough to put games on them, but that they made sacrifices to quality/dynamism/framerate to get their games on it.

Heck, even Respawn has pointed to being CPU-bound.
 

enzo_gt

tagged by Blackace
PC GAF must be laughing their heads off.
Maybe one day, when the CPU bottleneck ends up impeding more than just having a bazillion AI characters on screen at once. Until then, I don't know what bottom-line difference it makes, or will make, moving forward. Don't both consoles offload a bunch of CPU processing to dedicated audio chips and to the GPU as well?
 
I'm pretty sure Sony's and MS's engineers know better than the majority of you guys; they wouldn't have put these CPUs in if they thought they were going to be bottlenecks. GPUs will take care of the majority of stuff.
Actually the business side had the last call; cost is everything to these companies now.
 

gruenel

Member
There is no magic. If you use it for one thing, you don't have it at the same time for another.

Mark Cerny disagrees.

If you look at how the GPU and its various sub-components are utilised throughout the frame, there are many portions throughout the frame – for example during the rendering of opaque shadowmaps – where the bulk of the GPU is unused. And so if you’re doing compute for collision detection, physics or ray-casting for audio during those times you’re not really affecting the graphics. You’re utilising portions of the GPU that at that instant are otherwise under-utilised. And if you look through the frame you can see that depending on what phase it is, what portion is really available to use for compute.
 
Would love for a simple filter to banish the 30 vs 60 vs 900/1080 vs 1.3 TFLOPS vs 1.8 TFLOPS threads into "peasant" hell and turn GAF into a PCGAF-only view.
I suggested this as a console-only gamer and PC gamers chewed my head off. I don't need to hear all your issues with drivers and floppy discs.
 
Mark Cerny disagrees.

Sure, you have underutilized components. But if you feed those underutilized components with info to get 'em moving... you are taking away from bandwidth. Even in the best scenario you are not just getting "free performance" due to clever programming. It is give and take... like all things on a closed platform.

Also, please be aware that there are a lot of "words" thrown about by hardware makers about their products. Even Cerny's words should be looked on with scrutiny. "Supercharged", "power of the Cell", "hUMA", etc... all the acronyms, PowerPoint presentations, etc. direct from manufacturers have to be looked upon with said scrutiny. In the end they are propagandizing a product.
 

twobear

sputum-flecked apoplexy
Of course you will get games on 'em (they already have 'em!). I am just recommending that you temper your 'technical' conclusions about these boxes, especially in regards to CPU-related things.

Doesn't this fly in the face of what this thread is about? They're saying that a task that last generation would have been done on the CPU can this time be offloaded to the GPU. So on the 360 you had to use the CPU to make 38 dancers, whereas on the PS4 you can use the GPU to do it and make 1600 of them.

I'm sure that some tasks can't be offloaded to the GPU, of course.
 