
EDGE: "Power struggle: the real differences between PS4 and Xbox One performance"

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
I wonder if some hacker will finally find out the final clock speeds of the PS4, now that it is in the wild.
 

RoboPlato

I'd be in the dick
I wonder if some hacker will finally find out the final clock speeds of the PS4, now that it is in the wild.

I still can't believe we don't know this. I'm almost positive it's 1.6GHz but it's so odd how much information we have on everything else but the CPU.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
I still can't believe we don't know this. I'm almost positive it's 1.6GHz but it's so odd how much information we have on everything else but the CPU.

I am also confident that it is 1.6GHz. Would still be nice to have confirmation, and maybe additional information about the final resource allocation of the operating system.
 

RoboPlato

I'd be in the dick
I am also confident that it is 1.6Ghz. Would still be nice to have confirmation, and maybe additional information about the final resource allocation of the operating system.

Same details I'm looking for as well. I'm curious if the second processor and added RAM help with OS stuff. GopherD said their aim was for less than one core for OS processes and I'm curious to see if that panned out.
 

Aklamarth

Member
Hmmmmm

 
Except the PS3 launched with an already old GPU and half the RAM, a whole year later.

This time the PS4 has a much better GPU, RAM that's several times faster, and none of the OS and Kinect baggage of the XO, which for example takes two CPU cores all the time.
Also 10% of the GPU reserved, and less RAM allocated to games than on the PS4?
It's a miracle that NFS looks/performs almost the same on Xbox One.
Well that's interesting.
 

satam55

Banned
This should be pointed out: according to the FCC filings for both the PS4 dev kit (https://apps.fcc.gov/oetcf/eas/reports/ViewExhibitReport.cfm?mode=Exhibits&RequestTimeout=500&calledFromFrame=N&application_id=435177&fcc_id=AK8DUTD1000) and the retail PS4 (https://apps.fcc.gov/oetcf/eas/reports/ViewExhibitReport.cfm?mode=Exhibits&RequestTimeout=500&calledFromFrame=N&application_id=583848&fcc_id=AK8CUH100C1), the PS4 is supposed to have Bluetooth 4.0 in addition to Bluetooth 2.1 + EDR. On both of those links, the info is in the last .pdf file, called "FCC 15C Report"; it should be on page 5.

Now that the wireless radio in the PS4 is confirmed to be a Marvell Wireless Avastar 88W8797, we can say with 100% certainty that the PS4 has Bluetooth 4.0. Here's a link to the .pdf with info on the Marvell Avastar 88W8797 SoC:
http://www.marvell.com/wireless/assets/marvell_avastar_88w8797.pdf


I wonder why Sony doesn't list Bluetooth 4.0 as a tech spec feature on the PS4.
 

Cuyejo

Member
According to the FCC filings for both the PS4 dev kit and the retail PS4, the PS4 is supposed to have Bluetooth 4.0 in addition to Bluetooth 2.1 + EDR. [...] Now that the wireless radio in the PS4 is confirmed to be a Marvell Wireless Avastar 88W8797, we can say with 100% certainty that the PS4 has Bluetooth 4.0. [...] I wonder why Sony doesn't list Bluetooth 4.0 as a tech spec feature on the PS4.

hmm, that is weird...
 

stryke

Member
Looks like PS4 had 20 CUs after all (2 disabled)

[Image: annotated die shot of the PS4 APU (CXD90026G) showing the 20-CU GPU]


Would have been nice if they weren't. The PS4 could have been a nice 2+ TFLOP machine.
 
A friend posted this on Facebook, copied from an IGN forum, and I thought it might be a bit nonsensical... but I lack the technical knowledge to debunk it. Would ya guys mind?

"You forget about the eSRAM. It is not just DDR3. eSRAM has theoretical speed of about 204gb/s. DDR3 has about 68gb/s. Now. PS4 has 176gb/s of RAM speed due to GDDR5.Now, here is how the RAM works on the X1. You can add the RAM speeds together. You are be able to theoretically get 204+68 = 272gb/s. That's not realworld though. You get that through perfect coding. Never gonna happen. What you can get is 2 states: Good coding or Bad coding. Good code, you should be able to reach 200gb/s. Bad code, you will get 140-150gb/s. So it is in the developers hands. If they code good games (which we have to assume they will, except for launch), they will hit around 190 to 200gb/s, faster than the PS4. Depends on the coding.You can expect exclusives to hit that all the time. Multiplats is another story because of against-factors. You have a little more work to do with the X1. It is not about difficulty of development. It is about time and laziness. Can you trust developers to not be lazy on the X1 when PS4 doesn't require as much effort? It's up in the air at this point."

I'm even a bit confused at what's being said? Faster and better but harder to code for?
 
A friend posted this on Facebook, copied from an IGN forum, and I thought it might be a bit nonsensical. ..but I lack the technical knowledge to debunk it, would ya guys mind



I'm even a bit confused at what's being said? Faster and better but harder to code for?

You need to pretty much redesign your rendering engine for it to work.
Not gonna happen for launch games; maybe patched in later if you're lucky.

But every optimization devs do for ESRAM will also boost PS4 performance, because there you have 5-6 GB of fast memory, not just 32 MB of faster memory + 5 GB of slower memory.
 
A friend posted this on Facebook, copied from an IGN forum, and I thought it might be a bit nonsensical. ..but I lack the technical knowledge to debunk it, would ya guys mind



I'm even a bit confused at what's being said? Faster and better but harder to code for?

The statement completely ignores the fact that this high bandwidth is not only more difficult to achieve but also only applies to the tiny 32 MB space. The Xbox One might have an advantage in some special cases, but for the most part the PS4's solution is without doubt the faster one (and that's regardless of how much optimization you do).
Also, I wouldn't talk about "dev laziness". It's all a cost-benefit equation.
 
The statement completely ignores the fact that this high bandwidth is not only more difficult to achieve but also only applies for the tiny 32 MB space. The Xbox One might have an advantage in some special cases, but for the most part PS4's solution is without doubt the faster one (and that's regardless of how much optimization you do).
Also I wouldn't talk about "dev lazyness". It's all a cost-benefit equation.

I expected it to be a silly idea. It sounds like he's saying Xbox exclusives will be better, but for multiplats it's dependent on developers maxing out and using the "better" hardware.
 
A friend posted this on Facebook, copied from an IGN forum, and I thought it might be a bit nonsensical. ..but I lack the technical knowledge to debunk it, would ya guys mind



I'm even a bit confused at what's being said? Faster and better but harder to code for?
It's all meaningless drivel.

The Bone is worse than the PS4 in terms of computational power by a third. No amount of eSRAM will make up for that kind of gap.
 

jcm

Member
A friend posted this on Facebook, copied from an IGN forum, and I thought it might be a bit nonsensical. ..but I lack the technical knowledge to debunk it, would ya guys mind

You have a little more work to do with the X1. It is not about difficulty of development. It is about time and laziness. Can you trust developers to not be lazy on the X1 when PS4 doesn't require as much effort? It's up in the air at this point."

I'm even a bit confused at what's being said? Faster and better but harder to code for?

I hate it when people talk about "lazy developers". Dudes practically live at work for 9 months at a time to make your video games and you sit on your couch and call them lazy?
 

Perkel

Banned
A friend posted this on Facebook, copied from an IGN forum, and I thought it might be a bit nonsensical. ..but I lack the technical knowledge to debunk it, would ya guys mind



I'm even a bit confused at what's being said? Faster and better but harder to code for?

Already old news here.

ESRAM is a patch for lacklustre bandwidth, and that 204 GB/s figure is bullshit to begin with. They basically added read and write together to get 204 GB/s, and in reality there is no chance that reads or writes alone will exceed ~100 GB/s.

At best we are looking at roughly 100 GB/s + 70 GB/s combined. That would put it near GDDR5 speed, but remember that this ~100 GB/s applies only to 32 MB, not to the whole 8 GB.

There is no real scenario in which the Xbone's memory configuration is better than the PS4's. And that's before we even get to ease of use or hUMA, which we've heard about from devs and can see in the games.
 

foxbeldin

Member
A friend posted this on Facebook, copied from an IGN forum, and I thought it might be a bit nonsensical. ..but I lack the technical knowledge to debunk it, would ya guys mind



I'm even a bit confused at what's being said? Faster and better but harder to code for?

This bad code/good code thing is BS.
eSRAM can only reach its peak performance (I'm not talking about theoretical) when it's reading and writing at the same time. The best they could do in "not really real game world" conditions was about 130 GB/s, and they can't achieve that all the time.
And you can't overlook the fact that everything that has to go through DDR3 is going to be capped at 68 GB/s; eSRAM won't magically extract the data from there at 133 GB/s.
The eSRAM is so tiny that it's going to be used in an extremely limited number of operations, while the rest goes through slow DDR3.
Meanwhile on PS4, every single bit of the memory is at 176 GB/s, and you never have to move data from one pool to another.
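
To make the scale of that concrete, here's a rough back-of-envelope sketch in Python (illustrative numbers only: 68 GB/s DDR3, ~130 GB/s practical ESRAM and 176 GB/s GDDR5 as quoted above, plus a made-up fraction of traffic that actually fits in the 32 MB of ESRAM; real workloads are far messier):

# Illustrative only: why "204 + 68 GB/s" is not a real-world figure.
# Effective bandwidth depends on how much traffic actually fits in the 32 MB ESRAM.
DDR3_BW  = 68.0    # GB/s, Xbox One main memory (peak)
ESRAM_BW = 130.0   # GB/s, roughly the practical figure quoted above
GDDR5_BW = 176.0   # GB/s, PS4 unified memory (peak)

def xbox_effective_bw(esram_fraction):
    """Time-weighted estimate: seconds to move 1 GB when `esram_fraction`
    of the traffic is serviced by ESRAM and the rest by DDR3."""
    seconds = esram_fraction / ESRAM_BW + (1.0 - esram_fraction) / DDR3_BW
    return 1.0 / seconds

for frac in (0.0, 0.25, 0.5, 0.75):
    print(f"{frac:4.0%} of traffic in ESRAM -> ~{xbox_effective_bw(frac):5.1f} GB/s"
          f"  (PS4: {GDDR5_BW:.0f} GB/s for everything)")

Even with an unrealistically generous 75% of traffic served from the 32 MB pool, the blended figure stays around 105 GB/s, well under the PS4's 176 GB/s.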
 

DonMigs85

Member
Its all meaningless drivel.

The Bone is worse than the PS4 in terms of computational power by a third. No amount of eSRAM will make up for that kind of gap.
Also doesn't it have only 16 ROPs versus 32 on PS4? That means it has just a little over half the latter's fillrate.
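
For reference, a quick fillrate estimate using the commonly cited clocks (853 MHz for the Xbox One GPU after its upclock, 800 MHz for the PS4) and assuming one pixel per ROP per clock:

# Rough peak pixel fillrate: ROPs * clock, assuming 1 pixel per ROP per clock.
XBO_ROPS, XBO_CLK_GHZ = 16, 0.853   # Xbox One (post-upclock)
PS4_ROPS, PS4_CLK_GHZ = 32, 0.800   # PS4

xbo_fill = XBO_ROPS * XBO_CLK_GHZ   # Gpixels/s
ps4_fill = PS4_ROPS * PS4_CLK_GHZ

print(f"Xbox One: {xbo_fill:.1f} Gpixel/s")      # ~13.6
print(f"PS4:      {ps4_fill:.1f} Gpixel/s")      # ~25.6
print(f"Ratio:    {xbo_fill / ps4_fill:.2f}")    # ~0.53, i.e. "a little over half"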
 
On what page were we talking about the comments made by the Need for Speed Rivals dev? Remember the comment about them being particularly surprised by the performance of one of the consoles? I suspected it was the XBO, because on paper the PS4 is clearly superior, so how would that be a surprise? Of course I was jumped on and told that it was the PS4 that surprised them (not sure how they'd be surprised by the obvious :p)

Any update on that now that the game is out in the wild?
 
Are there still no proper comparisons for Xbone/PS4 multiplats?

WTF is going on?

Looks like this round of multiplats is a wash anyway. Basically all devs did for the most part was build their engines for the higher-userbase last-gen systems, then recode for x86, increase resolutions a bit, and add a few post-processing effects. Some teams did a little more than others. The main difference seems to be resolution this round: 1080p for PS4 and 720p for X1, which is in line with what you would expect in the PC world when pitting a low-end 16 ROP / 1.3 TFLOP card against a mid-range 32 ROP / 1.8 TFLOP card. The results match to a T, really.

On X1 they kept texture fidelity the same as all the others and dropped the resolution to keep the frame rate in line with the others. The PS4 version is basically a direct port.
 

SRG01

Member
A friend posted this on Facebook, copied from an IGN forum, and I thought it might be a bit nonsensical. ..but I lack the technical knowledge to debunk it, would ya guys mind

I'm even a bit confused at what's being said? Faster and better but harder to code for?

First off, it's not as easy as "adding" the speeds together. The eSRAM inputs/outputs directly onto the same bus, so it's not like you can use it as another pipe to DDR3 (and that would cause some insane coherency issues).

Second, the best way to use the XBO's eSRAM is much like the eDRAM on the 360: you can essentially get "free" frame buffer operations due to its speed, but you're limited by its small size.

Third, a system's speed is never determined by the speed of its fastest component, but by the speed of its most limiting component. Think bottlenecks. It doesn't matter how fast the eSRAM is if the rest of the system is slow as molasses.
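
As a sense of scale for that second point, here's a quick check of how much render-target data actually fits in 32 MB (assuming 4 bytes per pixel per target; real G-buffer layouts vary, so this is only illustrative):

# How much render-target data fits in 32 MB of ESRAM? (4 bytes/pixel per target assumed)
ESRAM_MB = 32

def target_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 * 1024)

for name, (w, h) in {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}.items():
    color = target_mb(w, h)          # one colour target
    depth = target_mb(w, h)          # 32-bit depth/stencil
    gbuffer = 4 * color + depth      # e.g. a four-target deferred G-buffer
    print(f"{name}: colour+depth {color + depth:5.1f} MB, "
          f"4-RT G-buffer {gbuffer:5.1f} MB (ESRAM budget: {ESRAM_MB} MB)")

A single 1080p colour + depth pair fits (~16 MB), but a fat deferred G-buffer at 1080p (~40 MB here) does not, which is one reason render targets get dropped to 900p/720p.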
 
There are still forums with the technically ignorant claiming that ESRAM/DDR3 latency, tiled resources, CPU bottlenecks, display planes, and other long-debunked or irrelevant things will give the Xbox a huge performance advantage. They still want to believe, apparently.
 

DonMigs85

Member
One question though: in terms of pure integer and floating-point performance, how much better/worse is the 8-core Jaguar versus Xenon and Cell?
 

IN&OUT

Banned
I was thinking about this difference the other day and had a very interesting discussion with my friend about this very topic. My argument was that PS4 vs X1 is as big a gap as the OG Xbox vs PS2 era. I had some observations to support my argument:

Back in the PS2 era, SDTVs were the norm.
Most SDTVs could output a picture at 576i/50fps (PAL) or 480i/60fps (NTSC)*; of course, framerates could go lower than the maximum 50/60 fps.

During that era, the PS2, GC and OG Xbox HAD to target only ONE resolution (depending on PAL vs NTSC). So each console first had to hit this resolution, then use whatever power was left to improve graphics and texture quality. That's why Splinter Cell (OG Xbox) looked almost like a different game compared to its PS2 counterpart. The PS2 didn't have the luxury of lowering the resolution and stretching the image, because there was no lower resolution back then.

These days we have the PS4 with a massive power advantage over the Xbone, but we also have HDTVs with several common resolutions: 720p, 900p and 1080p. So every time the Xbone starts to struggle to keep up with the PS4, developers just lower the resolution to the next stop. That step frees up a considerable amount of power (1080p has 44% more pixels than 900p and 125% more than 720p), and devs can use the freed power to maintain the same texture quality between versions. I think this trend will continue throughout this generation, or until the PS4 starts tapping into GPGPU compute; at that point the Xbone will seriously need more than a resolution drop to keep multiplatform games looking similar (by similar I mean a game with all effects, bells and whistles aside from res + fps). Let's assume a hypothetical situation where the PS4 utilizes six of its CUs for compute:

PS4: 12 CUs rendering + 6 CUs compute.

In order to achieve similar compute performance:

Xbone: 6 CUs rendering + 6 CUs compute.

Notice that the Xbone's rendering resources are only half of the PS4's in this case (the PS4 has 100% more)! So I can see huge graphical disparities coming later this gen between the two, much greater than those during the PS2/Xbox era.

Also, let's assume another hypothetical situation where we have a single target resolution (either 720p or 1080p):

In this case the PS4 will easily pull ahead of the Xbone by using the remaining power to enhance textures or IQ, or even step up the framerate.
A small example, just to explain to the average Joe the power required to run a game at 1080p vs 720p (using COD on PS4 vs X1):

PS4 : 1080p @ 60fps
X1: 720p @ 60fps

Imagine a scenario where you have three 720p TVs hooked up to a single HDMI output. The more power the console has, the more TVs show a picture.

X1: one TV outputs the game at 720p; two TVs are black (turned off due to lack of power).
PS4: two TVs show the exact same 720p picture as the X1, plus the last TV is on with a 480i picture!

So basically the PS4 is rendering 2.25x the resolution while maintaining the same framerate, and it might have better IQ (see the pixel-count sketch after this post)!

FOR DOUBTERS: DID you see how big of a difference that is? (POWER-WISE)

My conclusion is that the idea of an Xbox-vs-PS2-sized power difference this gen is very true and realistic; it just happens to be less "visible" due to the different output technologies these days.

* I'm fully aware that some later SDTVs could output 720p, but that resolution was rarely used by games back then.

What does GAF think?

Forgive my weak grammar.
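
A quick sanity check on those resolution figures (standard pixel counts only; the actual rendering cost per pixel obviously varies by game):

# Pixel counts for the common output resolutions discussed above.
resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name}: {count:,} pixels")

print(f"1080p / 720p = {pixels['1080p'] / pixels['720p']:.2f}x")   # 2.25x
print(f"1080p / 900p = {pixels['1080p'] / pixels['900p']:.2f}x")   # 1.44x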
 

IN&OUT

Banned
^ When I see a Halo CE/Killzone disparity, I'll agree that the gulf is as big as the Xbox/PS2 gap.

Believe me, if the X1/PS4 were forced to output at a single resolution, the difference would be even bigger than Xbox/PS2.
When devs start utilizing GPGPU, the X1 will struggle to achieve similar computational power without sacrificing graphical rendering resources. The X1 is very resource-scarce.
 

iceatcs

Junior Member
Is the PS4 bandwidth-starved for the ROPs it has?

With the games we're seeing, it looks like that isn't true at all. It seems the Xbox One's "balance" isn't bringing it any closer to the PS4.

So far nearly every PS4 version is clearly superior to the Xbone one.
 

HTupolev

Member
With the games we're seeing, it looks like that isn't true at all. It seems the Xbox One's "balance" isn't bringing it any closer to the PS4.

So far nearly every PS4 version is clearly superior to the Xbone one.
PS4 versions of games being better doesn't imply that bandwidth isn't sometimes a bottleneck for the GPU's I/O units.

(If it is a bottleneck on occasion, I really wouldn't be surprised; that's not exactly unheard of for consoles without embedded memory pools.)
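
A rough illustration of why that's plausible (assumptions: 32 ROPs at 800 MHz, 4 bytes per pixel for a plain colour write, 8 bytes when blending reads and writes; real demand depends heavily on the workload):

# Back-of-envelope: bandwidth needed to keep 32 ROPs at 800 MHz fully fed.
ROPS = 32
CLOCK_HZ = 800e6
BYTES_WRITE = 4        # 32-bit colour write per pixel
BYTES_BLEND = 8        # read + write when alpha blending

pixels_per_s = ROPS * CLOCK_HZ
write_only_gbs = pixels_per_s * BYTES_WRITE / 1e9
blended_gbs    = pixels_per_s * BYTES_BLEND / 1e9

print(f"Write-only fill needs ~{write_only_gbs:.0f} GB/s")   # ~102 GB/s
print(f"Blended fill needs    ~{blended_gbs:.0f} GB/s")      # ~205 GB/s vs 176 GB/s available

So heavy blended fill alone can, in theory, ask for more than the 176 GB/s the GDDR5 provides, which is why bandwidth can occasionally be the limiter even on the PS4.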
 
I would say that the only thing Microsoft can do is upclock the CPU and GPU again. I think Harrison said they could do it again. If they don't then yeah, MS is in trouble. Just look at all the multiplats having a notable performance/resolution increase on the ps4.
 

iceatcs

Junior Member
PS4 versions of games being better doesn't imply that bandwidth isn't sometimes a bottleneck for the GPU's I/O units.

(If it is a bottleneck on occasion, I really wouldn't be surprised; that's not exactly unheard of for consoles without embedded memory pools.)

But I don't think the Xbox One got the best version just because of a PS4 bottleneck.

Remember, every hardware has bottlenecks; it's called limited capacity.
 

SRG01

Member
I would say that the only thing Microsoft can do is upclock the CPU and GPU again. I think Harrison said they could do it again. If they don't then yeah, MS is in trouble. Just look at all the multiplats having a notable performance/resolution increase on the ps4.

Upclocks won't do much in terms of performance. There are other, more fundamental factors that limit the XBO versus the PS4.
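
For a sense of scale, here's a rough shader-throughput comparison using the widely reported specs (768 shaders at 853 MHz for Xbox One, 1152 shaders at 800 MHz for PS4, 2 FLOPs per shader per clock); only a sketch, since raw FLOPs aren't the whole story:

# Rough peak shader throughput: shaders * clock * 2 FLOPs (one FMA) per clock.
def tflops(shaders, clock_ghz):
    return shaders * clock_ghz * 2 / 1000

xbo = tflops(768, 0.853)    # ~1.31 TFLOPs (Xbox One, post-upclock)
ps4 = tflops(1152, 0.800)   # ~1.84 TFLOPs (PS4)
print(f"Xbox One: {xbo:.2f} TFLOPs, PS4: {ps4:.2f} TFLOPs")

# Clock the Xbox One GPU would need just to match the PS4 on raw FLOPs:
needed_ghz = ps4 * 1000 / (768 * 2)
print(f"Xbox One would need ~{needed_ghz * 1000:.0f} MHz (vs 853 MHz today)")

That works out to roughly 1200 MHz, i.e. another ~40% upclock, which is nowhere near realistic for that APU.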
 

HTupolev

Member
But I don't think the Xbox One got the best version just because of a PS4 bottleneck.

Remember, every hardware has bottlenecks; it's called limited capacity.
Yes, but I'm not seeing how this is relevant.

Nobody is questioning whether or not the hardware "has bottlenecks" (lol) or whether they mean that the Xbox One versions are better; the question is whether BW is a bottleneck for GPU I/O.
 

onQ123

Member
I was thinking about this difference the other day [...] During that era, the PS2, GC and OG Xbox HAD to target only ONE resolution (depending on PAL vs NTSC). [...] The PS2 didn't have the luxury of lowering the resolution and stretching the image, because there was no lower resolution back then. [...] What does GAF think?

Um, consoles did run at lower resolutions in the PS2/GameCube/Xbox era.
 

IN&OUT

Banned
I would say that the only thing Microsoft can do is upclock the CPU and GPU again. I think Harrison said they could do it again. If they don't then yeah, MS is in trouble. Just look at all the multiplats having a notable performance/resolution increase on the ps4.

Risky move; the X1 APU is huge, and upclocks increase heat, shortening the lifespan of the chip.
 

IN&OUT

Banned
Um, consoles did run at lower resolutions in the PS2/GameCube/Xbox era.

What console? What resolution?

EDIT: Because I really don't remember any games during the PS2 era where the resolution was lower than 480i and the console had to scale the image back up. And even if that was the case for some games, it was never a common thing to do; devs used to target the standard resolution of the time (480i), then use the remaining resources to improve graphics.
 