
EDGE: "Power struggle: the real differences between PS4 and Xbox One performance"

Qvoth

Member
I'm not trying to sound like a snob or a jerk, but there is no part of the Xbox One that will outperform the PS4 on a technical level. You can argue with me till you are blue in the face about latency, SHAPES, and the gymnastics of balance; but the reality is: the majority of Xbox One games, in a vacuum, will perform worse than their PS4 counterparts, all things considered.

On point, Xbox One exclusives will have a tough time against the PS4 exclusives in terms of pure tech. Art direction is a whole other story. I hear those dudes at Black Tusk got some fiya in the barrel.

Thought the Xbone's CPU is better now or something?
 

nib95

Banned
Does this mean that the CPU in PS4 is clocked the same or higher than what Microsoft has (iirc 1.75GHz) in XBONE?

That extra 9 Gflops from the CPU isn't going to make any difference in the grand scheme of a 521 Gflops disadvantage in overall performance.

But yeah, I don't think the PS4's CPU clock speed is confirmed. The best guess is 1.6GHz though, based on the rumoured specs and the KZ PDFs.
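For anyone who wants to sanity-check those figures, here's a rough back-of-the-envelope calc. Every number in it is rumoured or assumed, not confirmed spec; Jaguar is usually quoted at 8 single-precision FLOPs per cycle per core, and the GPU configs are the leaked 18 CU / 12 CU ones.

```python
# Rough FLOPS sketch; all clocks and core counts are rumoured/assumed.
CORES = 8              # Jaguar cores in both consoles (assumed)
FLOPS_PER_CYCLE = 8    # single-precision FLOPs per core per cycle (typical Jaguar figure)

cpu_ps4 = CORES * FLOPS_PER_CYCLE * 1.60   # 102.4 GFLOPS at the rumoured 1.6GHz
cpu_xb1 = CORES * FLOPS_PER_CYCLE * 1.75   # 112.0 GFLOPS at the confirmed 1.75GHz
print(f"CPU gap: {cpu_xb1 - cpu_ps4:.1f} GFLOPS")   # ~9.6, the "9 Gflops" above

# GCN GPUs do 2 FLOPs per ALU per cycle, with 64 ALUs per compute unit.
gpu_ps4 = 18 * 64 * 2 * 0.800   # 18 CUs @ 800MHz -> 1843.2 GFLOPS
gpu_xb1 = 12 * 64 * 2 * 0.853   # 12 CUs @ 853MHz -> 1310.2 GFLOPS
print(f"GPU gap: {gpu_ps4 - gpu_xb1:.1f} GFLOPS")   # ~533
```

The exact GPU gap moves around a bit depending on which XB1 clock you plug in, but it stays north of 500 GFLOPS either way.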
 
That extra 9 Gflops from the CPU isn't going to make any difference in the grand scheme of a 521 Gflops disadvantage in overall performance.

But yeah, I don't think the PS4's CPU clock speed is confirmed. The best guess is 1.6GHz though, based on the rumoured specs and the KZ PDFs.

While I don't doubt that, I'm curious just what kind of benefit that Microsoft could derive from the upclock. I mean, I'm technical but I'm not an EE/CHE.

It seems bizarre given their other decisions up to this point that Microsoft would risk even slightly harming yields for such a small increase unless they're counting on the positive PR they received from it to help them. I imagine it has been discussed to death already though.
 

RoboPlato

I'd be in the dick
That extra 9 Gflops from the CPU isn't going to make any difference in the grand scheme of a 521 Gflops disadvantage in overall performance.

But yeah, I don't think the PS4's CPU clock speed is confirmed. The best guess is 1.6GHz though, based on the rumoured specs and the KZ PDFs.
Yeah, I think the 1.6GHz is almost definite at this point with MS openly sharing the 1.75GHz number. GopherD said the goal for the CPU reserve was less than one core, although that was a long time ago.
 
Why? Anyone should be able to participate on the boards, no matter where you work. That said...

Common sense dictates that you don't use your work email account to make those kinds of posts without disclosing that association up front.

Anyways, back to the music.


Is it a bannable offense to declare my undying love for you?
 

nib95

Banned
Yeah, I think the 1.6GHz is almost definite at this point with MS openly sharing the 1.75GHz number. GopherD said the goal for the CPU reserve was less than one core, although that was a long time ago.

1 core reserve is a bit optimistic imo. I reckon it'll be 2, but I'd be impressed if it was one.
 
I'm not trying to sound like a snob or a jerk, but there is no part of the Xbox One that will outperform the PS4 on a technical level. You can argue with me till you are blue in the face about latency, SHAPES, and the gymnastics of balance; but the reality is: the majority of Xbox One games, in a vacuum, will perform worse than their PS4 counterparts, all things considered.

On point, Xbox One exclusives will have a tough time against the PS4 exclusives in terms of pure tech. Art direction is a whole other story. I hear those dudes at Black Tusk got some fiya in the barrel.

I wasn't arguing that... I was just saying they're both going to have some great exclusives after a few years that will shiiiiiiine. :) It's going to be great.
 

RoboPlato

I'd be in the dick
1 core reserve is a bit optimistic imo. I reckon it'll be 2, but I'd be impressed if it was one.
Me too, but I think it's more likely than an upclock at this point, especially if they do something like the Vita's OS, where it uses fewer resources when it's not at the forefront. I've just been curious about the CPU specifics for a while since Sony has been so open about every other architectural and technical detail of the system.
 

CoG

Member
-Mentions both consoles are similar in performance, 10 pts

I just got into an RL argument today with a former Xbox One team member who tried to sell me that one. Also the RAM situation is comparable and I don't understand how caching works, lol.
 

jusufin

Member
Does this mean that the CPU in PS4 is clocked the same or higher than what Microsoft has (iirc 1.75GHz) in XBONE?

I wouldn't count an upclock as being better; they both have the exact same CPU, and Sony could easily match or surpass that down the line. What he stated is technically true: the PS4 on paper is the superior system hardware-wise. What remains to be seen is how much of the difference will show in games.
 

nib95

Banned
I just got into an RL argument today with a former Xbox One team member who tried to sell me that one. Also the RAM situation is comparable and I don't understand how caching works, lol.

Redmond employee training centre.

 

djkeem

Unconfirmed Member
Microsoft's problem isn't that they tried to shape the narrative; everybody does that. Their problem is that they changed their narrative repeatedly in under a year. It makes it difficult to take seriously almost anything they're saying at this point, particularly given their noncommittal responses when people ask them about the possibility of DRM making a return.

Is this true? Do you have a source for this?
 

onQ123

Member
That extra 9 Gflops from the CPU isn't going to make any difference in the grand scheme of a 521 Gflops disadvantage in overall performance.

But yeah, I don't think the PS4's CPU clock speed is confirmed. The best guess is 1.6GHz though, based on the rumoured specs and the KZ PDFs.

You have to remember that the KZ PDF could have been from the Devkits with 8 Bulldozer cores clocked at 1.6GHz.





DVKT-KS000K ("Initial 1")

Runs Orbis OS
CPU: Bulldozer 8-core, 1.6 Ghz
Graphics Card: R10 with special BIOS
RAM: 8 GB (system memory)
BD Drive
HDD: 2.5" 160 GB
Network Controller
Custom South Bridge allows access to controller prototypes
 

thepotatoman

Unconfirmed Member
At first I thought the last minute overclock was because tests were better than expected. Now I'm worried that it was another panic move 180 to keep up with PS4 and we'll be seeing RRoD version 2.
 
Is this true? Do you have a source for this?

It seems that the last word was more firmly against it, but still wouldn't rule it out. Unfortunately I'm having trouble finding the article I was thinking about when I said that; it was shortly after their reversal and caused a decent stir.

I'll link you to what I did find though.

Source: Penello

Phil Harrison-"But we got clear feedback that some of the things we were proposing were perhaps a little too far into the future. So we changed."

Not that they were bad ideas; just they were too far advanced. Someday we'll understand and accept the glorious MS DRM. Someday...

Not what I was thinking of either, but it works. I think it was from an interview with one of the lesser known guys at Microsoft.
 

xaosslug

Member
*checks rinemy's email account*
*immediately recognizes Microsoft partnership*
*calls to wife for advice on how to spell 'disingenuous' for the ban message*
*goes back to playing GTA V*

You know, the fact that 'these consoles will be similar!!!' seems to be MS' new message really says a lot.
 

onQ123

Member
At first I thought the last minute overclock was because tests were better than expected. Now I'm worried that it was another panic move 180 to keep up with PS4 and we'll be seeing RRoD version 2.

I think they had room to play with the clocks this time around (no pun intended). There are Jaguar CPUs clocked at 2.0GHz & GCN GPUs clocked at 1GHz.
 
That extra 9 Gflops from the CPU isn't going to make any difference in the grand scheme of a 521 Gflops disadvantage in overall performance.

But yeah, I don't think the PS4's CPU clock speed is confirmed. The best guess is 1.6GHz though, based on the rumoured specs and the KZ PDFs.

I don't think it's fair to dismiss the CPU advantage on X1 though. It's unreasonable to measure CPU performance with flops anyways.

To put things generally, CPUs are more for logic while GPUs are for doing massive numbers of calculations.
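A toy illustration of that split, with entirely hypothetical code just to make the point: game-style logic is full of data-dependent branching where each decision depends on the data, while shading-style work is the same arithmetic repeated independently over millions of pixels.

```python
# Hypothetical sketch of "CPU work" vs "GPU work"; all names are made up.

def update_ai(agent):
    # CPU-style logic: data-dependent branches and early exits.
    # Each agent takes a different path, which is poison for wide SIMD hardware.
    if agent["hp"] <= 0:
        return "dead"
    if agent["sees_player"]:
        return "attack" if agent["ammo"] > 0 else "flee"
    return "patrol"

def shade_pixel(r, g, b, exposure=1.2):
    # GPU-style math: the exact same arithmetic for every pixel, no branching,
    # so thousands of ALUs can run it in lockstep.
    return (min(r * exposure, 1.0), min(g * exposure, 1.0), min(b * exposure, 1.0))

print(update_ai({"hp": 10, "sees_player": True, "ammo": 0}))  # flee
print(shade_pixel(0.5, 0.4, 0.9))                             # (0.6, 0.48, 1.0)
```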
 

ypo

Member
Hey guys.

To be blunt, both of these consoles are going to be extremely similar to each other in terms of performance. If anyone is making a purchase of one over the other based on these measures alone, STOP.
The gap between these consoles and the computer you're using right now is probably bigger than any gap between the PS4 and XB1.

I'm not calling anyone out... I just want to make a point and leave my 2 cents. Don't let gaming journalism play with your head.

Damn, is Microsoft so pathetically desperate that they're willing to undersell their own system? So the Core 2 Duo with an Intel graphics card I've got at home is more powerful than the Xbone? That's the message now?
 
I'm not trying to sound like a snob or a jerk, but there is no part of the Xbox One that will outperform the PS4 on a technical level. You can argue with me till you are blue in the face about latency, SHAPES, and the gymnastics of balance; but the reality is: the majority of Xbox One games, in a vacuum, will perform worse than their PS4 counterparts, all things considered.

On point, Xbox One exclusives will have a tough time against the PS4 exclusives in terms of pure tech. Art direction is a whole other story. I hear those dudes at Black Tusk got some fiya in the barrel.

And to think when the next gen rumors started you were saying the Durango would be stronger than the Orbis. I love the twists and turns leading up to a new console, nothing quite like it.
 

Codeblew

Member
I don't think it's fair to dismiss the CPU advantage on X1 though. It's unreasonable to measure CPU performance with flops anyways.

To put things generally, CPUs are more for logic while GPUs are for doing massive numbers of calculations.

Why do you think X1 has an advantage on CPU? Just because of the upclock? How about how much of the CPU is reserved for the 3 OS's? I don't think we know the answer to that but since it seems X1 is more "heavy" on the OS side, we cannot assume they have an advantage just because of a slight upclock.
 

RoboPlato

I'd be in the dick
Why do you think X1 has an advantage on CPU? Just because of the upclock? How about how much of the CPU is reserved for the 3 OS's? I don't think we know the answer to that but since it seems X1 is more "heavy" on the OS side, we cannot assume they have an advantage just because of a slight upclock.
It's two cores reserved for the XBO OS.
 

onQ123

Member
And to think when the next gen rumors started you were saying the Durango would be stronger than the Orbis. I love the twists and turns leading up to a new console, nothing quite like it.

Well, devkits with powerful GPUs & 12GB of RAM will make you think that the console is going to be a beast when the other console is said to only have 2GB of RAM.
 
Why do you think X1 has an advantage on CPU? Just because of the upclock? How about how much of the CPU is reserved for the 3 OS's? I don't think we know the answer to that but since it seems X1 is more "heavy" on the OS side, we cannot assume they have an advantage just because of a slight upclock.

As far as we know, both PS4 and X1 have 2 cores reserved. That means both systems will have at most 6 cores for games.

The fact that X1's CPU has a 9% advantage can't be questioned if PS4's CPU is clocked at 1.6GHz. That advantage may not amount to much if X1's virtualization layer affects CPU performance though. So there's no way of knowing really.
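For what the raw numbers are worth, here's that arithmetic spelled out, with the big caveat that the PS4 clock and both reserve counts are still rumour, not confirmed spec:

```python
# Assumed: 8 Jaguar cores each, 2 reserved for the OS on both, PS4 at 1.6GHz.
game_cores = 8 - 2                    # 6 cores left for games on each box

ps4_core_ghz = game_cores * 1.60      # 9.6 "core-GHz" available to game code
xb1_core_ghz = game_cores * 1.75      # 10.5 "core-GHz" available to game code
print(f"X1 CPU advantage: {(xb1_core_ghz / ps4_core_ghz - 1) * 100:.1f}%")  # ~9.4%
# ...minus whatever the virtualization layer costs, which nobody outside MS knows.
```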
 
Why? Anyone should be able to participate on the boards, no matter where you work. That said...

Common sense dictates that you don't use your work email account to make those kinds of posts without disclosing that association up front.

Anyways, back to the music.

Sorry, I meant nobody who is a shill is safe, not that you arbitrarily ban people =P
 

Krakn3Dfx

Member
As far as we know, both PS4 and X1 have 2 cores reserved. That means both systems will have at most 6 cores for games.

The fact that X1's CPU has a 9% advantage can't be questioned if PS4's CPU is clocked at 1.6GHz. That advantage may not amount to much if X1's virtualization layer affects CPU performance though. So there's no way of knowing really.

Sony has confirmed neither the number of cores reserved for the OS nor the CPU speed, so all of this is speculation. We do know the GPU in the PS4 is roughly 40 percent faster than the Xbox's, and that the PS4's memory is faster and not hampered by eSRAM requirements, which developers outside of Microsoft have all said makes it better to develop for.

At some point people need to stop ignoring the obvious and falling back on idle speculation.
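That 40 percent figure checks out against the rumoured shader counts, for what it's worth (same assumed configs as earlier in the thread, not confirmed spec):

```python
# Ratio of the two rumoured GPU throughputs; CU counts and clocks are assumptions.
ps4_gflops = 18 * 64 * 2 * 0.800   # 18 CUs @ 800MHz -> 1843.2 GFLOPS
xb1_gflops = 12 * 64 * 2 * 0.853   # 12 CUs @ 853MHz -> 1310.2 GFLOPS
print(f"PS4 GPU is ~{(ps4_gflops / xb1_gflops - 1) * 100:.0f}% faster")  # ~41%
```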
 
Sony has confirmed neither the number of cores reserved for the OS nor the CPU speed, so all of this is speculation. We do know the GPU in the PS4 is roughly 40 percent faster than the Xbox's, and that the PS4's memory is faster and not hampered by eSRAM requirements, which developers outside of Microsoft have all said makes it better to develop for.

At some point people need to stop ignoring the obvious and falling back on idle speculation.

Of course ps4 has a massive gpu advantage. Look through my post history and you'll see I'm more for ps4 than x1. I just get irked when people use flops to measure CPU performance.
 

Perkel

Banned
Of course ps4 has a massive gpu advantage. Look through my post history and you'll see I'm more for ps4 than x1. I just get irked when people use flops to measure CPU performance.

A 6% upclock won't change anything. If you have an OC'd CPU in your PC you know that. Change is noticeable when you do 20-30%.
 

KidBeta

Junior Member
Of course ps4 has a massive gpu advantage. Look through my post history and you'll see I'm more for ps4 than x1. I just get irked when people use flops to measure CPU performance.

What would you rather measure CPU perf in, giga-ops? From any metric it's only 6% faster.

The CPU crunches numbers just like the GPU; it's just that it's better suited to a different problem set due to the nature of its more complicated front end.
 
Phil Harrison-"But we got clear feedback that some of the things we were proposing were perhaps a little too far into the future. So we changed."

Not that they were bad ideas; just they were too far advanced. Someday we'll understand and accept the glorious MS DRM. Someday...

Yeah. And never forget the tag line they gave vg247 after they announced the 180 on their DRM scheme:

Finally, Whitten could not give any reassurance that Microsoft will not change its policies in the future.

Microsoft loves the future...
 
Hey guys.

To be blunt, both of these consoles are going to be extremely similar to each other in terms of performance. If anyone is making a purchase of one over the other based on these measures alone, STOP.
The gap between these consoles and the computer you're using right now is probably bigger than any gap between the PS4 and XB1.

I'm not calling anyone out... I just want to make a point and leave my 2 cents. Don't let gaming journalism play with your head.

 

szaromir

Banned
A 6% upclock won't change anything. If you have an OC'd CPU in your PC you know that. Change is noticeable when you do 20-30%.
It's 9%, and it gives exactly 9% more cycles for devs to use. What they do with it, I don't know, but citing PC benchmarks is useless here.
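To put that 9% in frame-time terms, here's a sketch that assumes a purely CPU-bound workload scaling linearly with clock, which is the best case:

```python
# Best-case frame-time saving from 1.6GHz -> 1.75GHz, assuming perfect scaling.
base_clock, new_clock = 1.60, 1.75

for budget_ms in (16.7, 33.3):        # 60fps and 30fps frame budgets
    new_ms = budget_ms * base_clock / new_clock
    print(f"{budget_ms}ms of CPU work at 1.6GHz -> {new_ms:.1f}ms at 1.75GHz "
          f"(saves {budget_ms - new_ms:.1f}ms)")
# 33.3ms -> ~30.4ms. Real, but nothing like the GPU-side difference.
```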
 
What would you rather measure CPU perf in, giga-ops? From any metric it's only 6% faster.

The CPU crunches numbers just like the GPU; it's just that it's better suited to a different problem set due to the nature of its more complicated front end.

Good question. I don't really know. I don't think there's any real metric for measuring CPU performance outside real-life benchmarks, unless the chips are of the exact same architecture.

GPUs can be vaguely measured in flops since it's generally discouraged to use branches in shader code. But with the types of programs that run on CPUs, which have lots of branches, it's not easy to measure performance by a theoretical number of some ops per second. Not to mention that different CPUs have different numbers of registers and handle caching differently, which matters a lot more for CPUs than for GPUs because memory accesses are a lot more random.

I know from your posts that you are very knowledgeable about computers. Would you seriously add CPU flops to GPU flops?

(btw, PS3's Cell was a different story, since from the way it looks, the Cell chip was basically an APU with a single-core CPU and 6 other GPU cores)

Edit: Also, the vast majority of GPU operations are floating point operations. On CPUs, that's not the case.
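Here's a quick way to see that op counts alone don't predict performance: both functions below do the same 5 million additions, yet they won't time anywhere near each other. Python exaggerates the gap, but the principle that only a real benchmark tells you holds on any hardware:

```python
# Same theoretical "FLOP" count, very different measured speed.
import timeit

data = [0.5] * 5_000_000

def manual_sum():
    total = 0.0
    for x in data:     # 5 million additions, interpreted one bytecode at a time
        total += x
    return total

def builtin_sum():
    return sum(data)   # the same 5 million additions in one optimised C loop

print("manual :", timeit.timeit(manual_sum, number=3))
print("builtin:", timeit.timeit(builtin_sum, number=3))
# Identical arithmetic, wildly different runtimes: only the benchmark decides.
```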
 

stryke

Member
Beautiful DetectiveGAFwork! Haha, yowzers.

Man, I just watched the scene in GTAV today where Lester talks about how nothing you say or do ever disappears for good in the internet age, and it's so true.

If you think that's impressive, you should check out his reddit account.
 

skdoo

Banned
You would think that after numerous developers came out and said the PS4 was significantly more powerful, this thread would have died. Apparently not.
 

KidBeta

Junior Member
Good question. I don't really know. I don't think there's any real metric for measuring CPU performance outside real-life benchmarks, unless the chips are of the exact same architecture.

GPUs can be vaguely measured in flops since it's generally discouraged to use branches in shader code. But with the types of programs that run on CPUs, which have lots of branches, it's not easy to measure performance by a theoretical number of some ops per second. Not to mention that different CPUs have different numbers of registers and handle caching differently, which matters a lot more for CPUs than for GPUs because memory accesses are a lot more random.

I know from your posts that you are very knowledgeable about computers. Would you seriously add CPU flops to GPU flops?

(btw, PS3's Cell was a different story, since from the way it looks, the Cell chip was basically an APU with a single-core CPU and 6 other GPU cores)

Edit: Also, the vast majority of GPU operations are floating point operations. On CPUs, that's not the case.

IIRC branching on GPUs has been improving in rather large strides; I'm pretty sure they don't suffer from branch misprediction the way a CPU can, as they don't attempt to predict branches (I'm not 100% sure on this). Then again, branch mispredictions aren't that big a deal on modern x86 processors such as Jaguar, as they have a shallow pipeline (IIRC the branch misprediction penalty was 13 cycles).

But yes, I agree that a lot of CPU performance has zero to do with the actual algorithm you're using (and its complexity) and more to do with how good your access pattern is, what kind of cache hit rate you are getting, and how well you handle synchronisation (for multi-core algos).

It's a bit of a false dichotomy to compare CPU and GPU FLOPs, I agree with that too. There is a pretty wide division between them: they are suited to massively different tasks, and workloads rarely translate well over to the other (I can see more GPU algos running decently on the CPU than the other way around). But to a degree it works; to what degree is anyone's guess, but it's better than nothing.
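The access-pattern point is easy to demo. The sketch below sums the same array twice, once in order and once in a shuffled order; the work per element is identical and only the order of memory accesses differs, yet the shuffled walk comes out measurably slower. The effect is far bigger in native code, where the cache hierarchy isn't hidden behind an interpreter:

```python
# Identical work per element; only the ORDER of memory accesses differs.
import random, timeit

N = 1_000_000
data = list(range(N))
seq_order = list(range(N))
rnd_order = list(range(N))
random.shuffle(rnd_order)

def walk(order):
    total = 0
    for i in order:
        total += data[i]   # same adds either way; locality is the only variable
    return total

print("sequential:", timeit.timeit(lambda: walk(seq_order), number=5))
print("shuffled:  ", timeit.timeit(lambda: walk(rnd_order), number=5))
```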
 