
DF: Xbone Specs/Tech Analysis: GPU 33% less powerful than PS4

That video is kind of useless when you're not concerned with what sells more and more concerned with which system has better tech for its games.

Exactly what I was thinking. I don't care which one sells better, I just want the version of multiplatform games that look and run the best.
 
You will be seeing a much bigger difference than what you saw with PS3 vs 360, and I expect most multiplat games to look/run significantly better too.

1.8TF vs 1.1TF, 18 CU vs 12 CU, 5GB of slow DDR3 vs 7GB of ultra-fast GDDR5. Both machines are built on PC-based architecture, so expect the difference to show right from launch.
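For anyone wondering where those teraflop figures come from, here's a rough back-of-the-envelope sketch, assuming GCN's 64 shaders per CU, 2 FLOPs per clock (fused multiply-add), and the rumoured ~800 MHz clocks — the exact clocks weren't confirmed at the time, so treat the outputs as ballpark:

```python
def peak_gflops(cus, clock_ghz, shaders_per_cu=64, flops_per_clock=2):
    """Theoretical peak single-precision GFLOPS for a GCN-style GPU."""
    return cus * shaders_per_cu * flops_per_clock * clock_ghz

ps4 = peak_gflops(18, 0.8)    # 18 CUs at ~800 MHz -> ~1843 GFLOPS (~1.8 TF)
xbone = peak_gflops(12, 0.8)  # 12 CUs at ~800 MHz -> ~1229 GFLOPS (~1.2 TF)
print(ps4, xbone, ps4 / xbone)  # the CU gap alone gives a 1.5x ratio
```

The quoted 1.1TF figure is slightly below the 800 MHz theoretical number, consistent with the lower clock rumours circulating at the time.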

You're going to be disappointed. The PS4 is stronger, easily. The difference in real world games will not be as significant as you think. Nothing will really convince people till they start seeing the games and the differences aren't as drastic as people are dreaming up.

Perhaps because you simply don't just add the two together trying to pass it off as a sustainable rate. That said, they should have mentioned its existence separately at the least.

You don't just combine it, but you also don't ignore it. And you're talking about sustainable rate? The 102GB/s provided by the ESRAM is even more sustainable than the 68GB/s on the DDR3 due to low latency, and the fact that the ESRAM shares absolutely nothing with the CPU. The ESRAM bandwidth belongs strictly to the GPU and its clients. No contention from the CPU, I/O etc. To ignore the ESRAM is the height of stupidity. I'm not saying the ESRAM makes it stronger than the PS4, but anyone treating it like it somehow doesn't exist is, well, just spouting nonsense. It's one thing to disagree about the impact it will have, but ignoring it...
 

nib95

Banned
7770 vs 7850 is a much better comparison.

I suppose that's fair, though in reality it's somewhere in between the two, unless the rumoured clock bump comes to fruition, which is looking less likely now. Having said that, the PS4 does have the advantage of that added extra bandwidth and far more RAM. Whilst the Xbox One also has far more RAM, the reality is that because of the low DDR3 bandwidth, only about 1GB at 60fps, or 2GB at 30fps, can actually be accessed in any given frame.
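Those per-frame numbers fall straight out of dividing bandwidth by framerate — a simplistic sketch, since real engines re-read the same data many times per frame, so this is really an upper bound on the data the GPU can touch in one frame:

```python
def gb_per_frame(bandwidth_gbs, fps):
    """Upper bound on data reachable in a single frame at a given framerate."""
    return bandwidth_gbs / fps

print(gb_per_frame(68, 60))  # DDR3 at 68 GB/s, 60fps -> ~1.13 GB/frame
print(gb_per_frame(68, 30))  # at 30fps -> ~2.27 GB/frame
```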

Oh, so just ignore the ESRAM on Xbox One? How silly. I wonder why I ever bother to read any of this stuff lol.

It shouldn't be outright ignored, I agree, but it will make little difference in the grand scheme of things. It's only good for a paltry 32MB.
 

i-Lo

Member
Also consider the rumor that you can code to the metal on PS4 while you have to go through Microsoft's layers on Xbone.

The difference should easily be 50-100% better framerates for multiplats, or higher resolution/effects.

First party games developed ground up for PS4 should also show a pretty huge advantage, larger than what we saw last gen for sure.

That's not entirely accurate.

PS4 has one low level wrapper and one regular one. From what I can tell the low level is a challenging one to use with greater rewards but I doubt 3rd parties will dig that deep when a simpler alternative exists.

Mark my words: we will never see a game with double the framerate on PS4 (while both run at the same resolution). A 50% increase in power does not mean the devs will translate that into a framerate advantage. Post-processing effects will likely be the thing to get a boost.

As far as first party goes, given what they have managed on the PS3, I have little doubt that in time they'll produce really stunning results, but we should temper our expectations pertaining to the gap some here expect to see.
 

CLEEK

Member
You're going to be disappointed. The PS4 is stronger, easily. The difference in real world games will not be as significant as you think. Nothing will really convince people till they start seeing the games and the differences aren't as drastic as people are dreaming up.

The differences will be there, and traditional gamers will be able to see them easily.

What will happen is the inevitable effect of diminishing returns. Xbone games will look amazing (compared to existing games), but for the most part, it will need a trained eye to distinguish them from PS4 games.

MS have bet the farm on this. They've spec'd the Xbone so the jump over the 360 is big enough to warrant the upgrade, but to do so on a tight BOM budget.
 
Oh, so just ignore the ESRAM on Xbox One? How silly. I wonder why I ever bother to read any of this stuff lol.

It doesn't even relate to the graph; it's like a different metric. It will help with certain applications, and it will mean multitasking will be just as snappy as with the GDDR5 advantage - but there's no pretending it makes up even part of the overall performance gap. It's like a totally different engine part. You can't add them together.
 
You're going to be disappointed. The PS4 is stronger, easily. The difference in real world games will not be as significant as you think. Nothing will really convince people till they start seeing the games and the differences aren't as drastic as people are dreaming up.

Because the hardware is SO similar, it should be very easy to crank up the anti aliasing, or numerous other effects on the PS4 version. This should require very little effort from the developer. Those kinds of tweaks would be easily noticeable to anyone who is serious about games.

Is this going to be a game changing advantage at retail? I don't think so. But for those of us who care about such things, it's going to be a big reason to own a PS4.
 

beast786

Member
You're going to be disappointed. The PS4 is stronger, easily. The difference in real world games will not be as significant as you think. Nothing will really convince people till they start seeing the games and the differences aren't as drastic as people are dreaming up.

Which console this gen consistently had better-performing 3rd party games?

Just an architecture issue caused gimped ports. Imagine now: no architecture difference and 50% more power... you really think it won't make a difference?
 

nib95

Banned
The ps3 had a better (well more parallel) CPU though didn't it?

Exactly, which is why, coupled with coding to the metal (it sounds like PR fluff but I promise you in this instance it isn't), many of the first party titles still looked superior to the competition. Essentially Sony first party were using different techniques to offload GPU load onto the CPU and its many SPEs.
 

i-Lo

Member
Oh, folks, can you please stop assuming the 1GB figure is the reserved amount for non-gaming functions on PS4? Let us wait for solid numbers.

And before someone asks, no, I do not presume it to be 3GB.
 
That's not entirely accurate.

PS4 has one low level wrapper and one regular one. From what I can tell the low level is a challenging one to use with greater rewards but I doubt 3rd parties will dig that deep when a simpler alternative exists.

Mark my words: we will never see a game with double the framerate on PS4 (while both run at the same resolution). A 50% increase in power does not mean the devs will translate that into a framerate advantage. Post-processing effects will likely be the thing to get a boost.

As far as first party goes, given what they have managed on the PS3, I have little doubt that in time they'll produce really stunning results, but we should temper our expectations pertaining to the gap some here expect to see.

You're also discounting the RAM advantage, it's not just the GPU that's better.

Then why are we seeing some real-world GPU benchmarks showing a doubling of framerate between GPUs comparable to the PS4's and the Xbone's in certain games?

I'm not saying it'll always happen, but at the same time I don't think for a second that it'll NEVER happen.
 

Shayan

Banned
You're going to be disappointed. The PS4 is stronger, easily. The difference in real world games will not be as significant as you think. Nothing will really convince people till they start seeing the games and the differences aren't as drastic as people are dreaming up.

Like I said, the difference is very big and will show right off the bat at launch. Unlike the PS3, developers won't have to spend years learning how to code for the system.

GDDR5 in real applications is way more efficient than DDR3 too. You can live in your dream world, but the difference will be quite big. After all, the Xbone is based on a much weaker architecture.

It shouldn't be outright ignored, I agree, but it will make little difference in the grand scheme of things. It's only good for a paltry 32MB.

lol, most people don't even understand that, and to make things worse, only 5GB is available to developers
 

nib95

Banned
The biggest blow to Microsoft on the hardware front will happen if Sony can somehow get close to the price of the Xbox One. Getting 50%+ performance for a similar price to the competition should be a huge score among core gamers with the PS4.
 

CLEEK

Member
You're also discounting the RAM advantage, it's not just the GPU that's better.

Then why are we seeing some real-world GPU benchmarks showing a doubling of framerate between GPUs comparable to the PS4's and the Xbone's in certain games?

I'm not saying it'll always happen, but at the same time I don't think for a second that it'll NEVER happen.

I think it will happen.

We've seen it with Vita ports, where the devs make the decision on going for native resolution/effects or frame rate. Games like MvC3 run at 60fps like their console counterparts but at non-native resolution; games like Stranger's Wrath forego the 60fps of the console version to ensure native resolution.
 
The biggest blow to Microsoft on the hardware front will happen if Sony can somehow get close to the price of the Xbox One. Getting 50%+ performance for a similar price to the competition should be a huge score among core gamers with the PS4.

Exactly.

Price parity + Launch timing parity is a HUGE win for them.

People bring up the Xbox being more powerful than the PS2, but it launched much later.

Timing is critical
 

i-Lo

Member
You're also discounting the RAM advantage, it's not just the GPU that's better.

Then why are we seeing some real-world GPU benchmarks showing a doubling of framerate between GPUs comparable to the PS4's and the Xbone's in certain games?

I'm not saying it'll always happen, but at the same time I don't think for a second that it'll NEVER happen.

I am taking all things into consideration, and third parties will not be spending extra effort to double the framerate on PS4.

In terms of "never", I did qualify that as applying when both are running at the same resolution. I don't doubt it'd be easier to do if the PS4 version was running at 1280x1080 vs full 1080p on the Xbone.

As such, I think we may see dynamic resolution being used more aggressively on Xbone compared to PS4 down the line, while maintaining the same framerate.
 

JaggedSac

Member
Yep, everyone should just understand that the PS4 will have better looking games. The best looking game on the PS4 will be better than the best looking on the Bone. Taste notwithstanding. Of course, games will look great on both consoles. It will still come down to what games on what systems you are interested in.
 
I am taking all things into consideration, and third parties will not be spending extra effort to double the framerate on PS4.

In terms of "never", I did qualify that as applying when both are running at the same resolution. I don't doubt it'd be easier to do if the PS4 version was running at 1280x1080 vs full 1080p on the Xbone.

As such, I think we may see dynamic resolution being used more aggressively on Xbone compared to PS4 down the line, while maintaining the same framerate.

It also depends on how they handle ports.

Consider this -

If games are made from the ground up on PS4 as the lead platform, and they target 60 FPS, then that means on the Xbone the game is probably running at around 35-40FPS or so.

And instead of having a variable framerate, the devs will just lock the Xbone version at 30 FPS to keep it stable. OR, they could substantially reduce the resolution of the XBone to get it to 60 fps.
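The 35-40fps ballpark is roughly what naive GPU-bound scaling predicts — a toy estimate that assumes framerate scales linearly with GPU throughput, which real games (rarely purely GPU-bound) won't follow exactly:

```python
def scaled_fps(lead_fps, lead_tf, port_tf):
    """Estimate port framerate assuming a purely GPU-bound, linearly scaling workload."""
    return lead_fps * port_tf / lead_tf

# A 60fps PS4-lead title (~1.84 TF) ported to ~1.23 TF lands near 40fps
print(scaled_fps(60, 1.84, 1.23))
```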
 

Cidd

Member
Oh, folks, can you please stop assuming the 1GB figure is the reserved amount for non-gaming functions on PS4? Let us wait for solid numbers.

And before someone asks, no, I do not presume it to be 3GB.

I agree, but I think the next thing we can do is look at how the PS Vita handles its system memory. Cerny was also behind its design, so maybe they have a few similarities.

I think we also need to take into account that the system OS will be reserving some CPU/GPU power as well, maybe even 10% like the Xbone. But even if we take everything into account the PS4 still comes out ahead.
 

Truespeed

Member
You're going to be disappointed. The PS4 is stronger, easily. The difference in real world games will not be as significant as you think. Nothing will really convince people till they start seeing the games and the differences aren't as drastic as people are dreaming up.

I know I certainly won't. But, I think you're underestimating the performance gap. Consider the impact of every reviewer recommending that people buy the PS4 version each and every time for the next 7 years.
 
I suppose that's fair, though in reality it's somewhere in between the two, unless the rumoured clock bump comes to fruition, which is looking less likely now. Having said that, the PS4 does have the advantage of that added extra bandwidth and far more RAM. Whilst the Xbox One also has far more RAM, the reality is that because of the low DDR3 bandwidth, only about 1GB at 60fps, or 2GB at 30fps, can actually be accessed in any given frame.



It shouldn't be outright ignored, I agree, but it will make little difference in the grand scheme of things. It's only good for a paltry 32MB.

The size of the ESRAM isn't what's important, it's the latency and the bandwidth. Also, 32MB is perfect for a GPU targeting 1080p, if I'm not mistaken. The 32MB isn't useless just because it's small. You would think people learned their lesson about doing that after the 10MB of EDRAM on the Xbox 360 ended up being so important for the 360's overall performance.

I know I certainly won't. But, I think you're underestimating the performance gap. Consider the impact of every reviewer recommending that people buy the PS4 version each and every time for the next 7 years.

But if the game runs and also looks great on the Xbox One, most people won't care about things like that. And I say this as someone who is getting most multiplatforms on the PS4.
 

i-Lo

Member
It also depends on how they handle ports.

Consider this -

If games are made from the ground up on PS4 as the lead platform, and they target 60 FPS, then that means on the Xbone the game is probably running at around 35-40FPS or so.

And instead of having a variable framerate, the devs will just lock the Xbone version at 30 FPS to keep it stable. OR, they could substantially reduce the resolution of the XBone to get it to 60 fps.

If that was the case, then yes, I'd say that would be possible, and that to maintain the same framerate, aggressive use of dynamic resolution or simply lowering the resolution on Xbone would yield the desired result.

However, I get a sense that given the PS4 will be so easy to work with, and the Xbone somewhat more complicated (RAM configuration) with potentially less RAM, most third parties may lead on the former and later simply port it over, featuring some ancillary improvements.
 
If that was the case, then yes, I'd say that would be possible, and that to maintain the same framerate, aggressive use of dynamic resolution or simply lowering the resolution on Xbone would yield the desired result.

However, I get a sense that given the PS4 will be so easy to work with, and the Xbone somewhat more complicated (RAM configuration) with potentially less RAM, most third parties may lead on the former and later simply port it over, featuring some ancillary improvements.

That didn't happen last time with the PS3....it was more complicated to work with and games usually lead on the 360.

I think devs like to lead with the easiest platform and then deal with the complications of more convoluted platforms as an afterthought.
 

AlphaDump

Gold Member
One thing to remember is that the memory bandwidth for the PS4 will actually be around ~156GB/s, given the CPU generally utilizes <20GB/s.

I'm having a hard time following some of your arguments on the last few pages, but this point in particular is redundant because with that logic, every system would inherently have that deviation.
 

Man

Member
Oculus Rift v2 (720p per eye) + Every multiplatform game on PS4 in 720p 120 FPS.
Oculus Rift v.2 will be 1080p meaning 960x1080 per eye (according to roadmap).
More fps is always better! But note that framerate output won't be halved on Oculus, since each eye has its own dedicated screen real estate. A game that pumps out 60 frames per second will be 60fps 3D on Oculus. No flickering! :)
 
Framerate increases are a tough one to gauge, because at some point the CPU becomes the bottleneck. With the CPUs being the same in both consoles, in cases where they are maxed out they won't be able to feed the GPU to make use of the extra framerate capability. Better resolution and IQ, or more advanced effects, may be easier to optimise for in a multiplat game if the engine is CPU-bound.
 

nib95

Banned
The size of the ESRAM isn't what's important, it's the latency and the bandwidth. Also, 32MB is perfect for a GPU targeting 1080p, if I'm not mistaken. The 32MB isn't useless just because it's small. You would think people learned their lesson about doing that after the 10MB of EDRAM on the Xbox 360 ended up being so important for the 360's overall performance.

The latency is not nearly as important for rendering. It's brute speed/bandwidth that matters, which is why all newer versions of RAM sacrifice latency for more bandwidth. Latency-wise we're talking nanosecond differences that really don't matter in the bigger picture.

And then you have to remember that the 32MB of ESRAM is still slower than the GDDR5 in the PS4 (102GB/s vs 176GB/s). The ESRAM isn't useless, but it's certainly not going to make up for the slower DDR3, nor compete with the massive amount of high-speed GDDR5 RAM in the PS4.
 

i-Lo

Member
That didn't happen last time with the PS3....it was more complicated to work with and games usually lead on the 360.

I think devs like to lead with the easiest platform and then deal with the complications of more convoluted platforms as an afterthought.

Two reasons why that happened (save the exceptions):

  • Significantly different architecture, where the Xbox 360 was the far easier option. There isn't a chasm like that this time.
  • Much better development tools at the start of the generation for Xbox. PS3's situation was exacerbated by comparatively poorer dev tools. Today, that gap doesn't exist.

I'm having a hard time following some of your arguments on the last few pages, but this point in particular is redundant because with that logic, every system would inherently have that deviation.

I was pointing to the fact that the GPU will not have access to the full 176GB/s due to concurrent CPU-related tasks, and the CPU was purported to utilize somewhat less than 20GB/s most of the time. So I simply subtracted that from the total.
 

Cidd

Member
If that was the case, then yes, I'd say that would be possible, and that to maintain the same framerate, aggressive use of dynamic resolution or simply lowering the resolution on Xbone would yield the desired result.

However, I get a sense that given the PS4 will be so easy to work with, and the Xbone somewhat more complicated (RAM configuration) with potentially less RAM, most third parties may lead on the former and later simply port it over, featuring some ancillary improvements.

Isn't it easier to just down-port?
 
Framerate increases are a tough one to gauge, because at some point the CPU becomes the bottleneck. With the CPUs being the same in both consoles, in cases where they are maxed out they won't be able to feed the GPU to make use of the extra framerate capability. Better resolution and IQ, or more advanced effects, may be easier to optimise for in a multiplat game if the engine is CPU-bound.

But PS4's GPU has massive GPGPU capabilities, so it should easily be able to lead in many traditionally CPU-bound cases.
 

bobbytkc

ADD New Gen Gamer
One is subtractive and the other is additive (given their highly similar base architecture, this analysis is relevant). Choosing what to sacrifice generally proves to be more problematic.
The PC versions of multiplatform games already have graphical settings. This is a non-issue. It is just a matter of choosing graphical settings that are already planned for, so the game will run on each respective platform.
 
I know this is a tech thread... but have developers (the majority of them, at least) ever not used the market leader as the lead platform?

If I'm not mistaken:

360 targeted
PS2 targeted
PS1 targeted
SNES targeted
NES targeted
2600 targeted

(though the bottom 3 are different scenarios)

If the PS4 sells more it will be the lead platform. If it doesn't, it won't.
 
I know this is a tech thread... but have developers (the majority of them, at least) ever not used the market leader as the lead platform?

If I'm not mistaken:

360 targeted
PS2 targeted
PS1 targeted
SNES targeted
NES targeted
2600 targeted

(though the bottom 3 are different scenarios)

If the PS4 sells more it will be the lead platform. If it doesn't, it won't.

Good point. Lead platforms are based more on commercial success.
 

bobbytkc

ADD New Gen Gamer
I know this is a tech thread... but have developers (the majority of them, at least) ever not used the market leader as the lead platform?

If I'm not mistaken:

360 targeted
PS2 targeted
PS1 targeted
SNES targeted
NES targeted
2600 targeted

(though the bottom 3 are different scenarios)

If the PS4 sells more it will be the lead platform. If it doesn't, it won't.

The market leader for most of the last gen was the Wii. I thought this was obvious.
 
The size of the ESRAM isn't what's important, it's the latency and the bandwidth. Also, 32MB is perfect for a GPU targeting 1080p, if I'm not mistaken. The 32MB isn't useless just because it's small. You would think people learned their lesson about doing that after the 10MB of EDRAM on the Xbox 360 ended up being so important for the 360's overall performance.

It isn't useless, certainly, but even in ideal scenarios the total bandwidth available will be less than the PS4's, and in normal use cases most likely significantly less. It also adds development complexity; apparently devs don't like optimizing different multiplat SKUs, if some are to be believed ;)

The latency advantage is only really useful for compute tasks. Thing is, if you're doing compute on the Xbone you're taking even more resources from the weak GPU - not a good idea. I guess you also have to take into account what effect the 6 extra ACEs in the PS4 GPU will have on compute.

Also, the latency advantage may be insignificant. If we say ESRAM has 5ns latency and GDDR5 50ns (numbers I've made up out of thin air), that's a seemingly significant difference of 10x. This doesn't take into account any latency in the GPU pipeline though; if the GPU compute pipeline adds 300ns to that figure, you end up with 305ns vs 350ns latency - not as significant as it might have seemed at first.
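To make that pipeline-latency point concrete, here's a tiny sketch using the same admittedly made-up 5ns/50ns/300ns figures from the post above — the shape of the result (a fixed pipeline cost swamping the raw memory-latency gap) holds regardless of the exact numbers:

```python
def effective_latency_ratio(fast_mem_ns, slow_mem_ns, pipeline_ns):
    """Ratio of total access latencies once a fixed pipeline cost is added to both."""
    return (pipeline_ns + slow_mem_ns) / (pipeline_ns + fast_mem_ns)

print(effective_latency_ratio(5, 50, 0))    # raw memory difference: 10x
print(effective_latency_ratio(5, 50, 300))  # with a 300ns pipeline: ~1.15x
```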
 
I know this is a tech thread... but have developers (the majority of them, atleast) ever not used the market leader as the lead platform?
If i'm not mistaken

360 targeted

PS2 targeted

PS1 targeted

SNES targeted

NES targeted

2600 targeted

(though the bottom 3 are different scenarios)
If the ps4 sells more it will be the lead platform. If it doesn't, it won't.

Still, this gen, unlike any other, both systems are going to be coming out at the same time.
The market leader may then be different in certain regions.
So if anything, I say they will go with the easier platform to develop for (PS4), or just use the PC and port down to both systems.
 

Man

Member
The market leader for most of the last gen was the Wii. I thought this was obvious.
Bit of a special case though. Two different generations of hardware in the same console generation. 2:1 favored the HD systems (and the HD systems were 'comparable' to PC hardware).
And like most Nintendo machines, the Wii wasn't the best platform for third parties.
 

vpance

Member
Oculus Rift v.2 will be 1080p meaning 960x1080 per eye (according to roadmap).
More fps is always better! But note that framerate output won't be halved on Oculus, since each eye has its own dedicated screen real estate. A game that pumps out 60 frames per second will be 60fps 3D on Oculus. No flickering! :)

Very nice. So how does that work? I thought you always needed to render 2 different viewpoints.
 

Man

Member
What past consoles have had this same level of difference?
There has never been an apples-to-apples comparison before in console history, as far as I know.
PS2 to GameCube is probably the closest comparison in power, but it doesn't really fit the bill, since the PS2 was quite exotic with some real standout strengths in key areas.
The PS4 should be much, much more adept at handling 1080p with richer effects and/or a higher framerate than the Xbone.
 
But PS4's GPU has massive GPGPU capabilities, so it should easily be able to lead in many traditionally CPU-bound cases.

True, although I'm not sure things like making draw calls can be offloaded to GPU compute. Then again, CPU time saved elsewhere would give more headroom for those tasks that do need to run on the CPU. Also, if you're doing lots of compute, that's taking away some of the flops from the rendering side that would be needed for the framerate increase. Unless, of course, the compute is done while raising overall GPU efficiency, by using the spare flops available when the GPU is not ALU-limited. In the end it will be a juggling act, but either way I suspect we will see some games with better framerates and some with better IQ, and the rest a mix of both. Or they just up the eye candy and particle or physics effects.
 

Lasdrub

Member
There has never been an apples-to-apples comparison before in console history, as far as I know.
PS2 to GameCube is probably the closest comparison in power, but it doesn't really fit the bill, since the PS2 was quite exotic with some real standout strengths in key areas.
The PS4 should be much, much more adept at handling 1080p with richer effects and/or a higher framerate than the Xbone.

Thanks! I just wanted a general idea of what the difference might be.
 