
EDGE: "Power struggle: the real differences between PS4 and Xbox One performance"

No, MS won't do that. People just have to realize, I believe, that MS is targeting the XB1 differently than it did the 360. They also already said they didn't target the higher end of graphics, etc., for gaming, people just didn't listen. In the end, MS is banking on the app and TV functionality making the XB1 take off like the Wii did. Remember, the Wii's graphics capabilities were NOTHING like the X360 or the PS3, but it did very well. MS wants to do that, and the XB1 system is the way they believe they can make that happen.

The Wii wasn't $500. Targeting a mainstream audience with an enthusiast price makes no sense.
 

Marlenus

Member
What made Microsoft go with a turd of a GPU this time around? Didn't 360 have a better GPU than PS3?

It is a combination of factors, and until the design process comes out, the why is just speculation.

Based on what I have read it seems that MS had certain design targets they wanted to hit which meant they had to use the DDR3 + ESRAM setup. From there the rest of the GPU design decisions were based on APU size, the transistor budget and power envelope.

The ESRAM takes up around 2 billion transistors, this contributes to the size of the APU and adds in power. If they had gone with a similar GPU to the PS4 then it would be a larger APU that is harder and more expensive to manufacture.
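
Quick back-of-the-envelope check on that transistor figure. This assumes a plain 6-transistor SRAM cell and ignores the tag/redundancy/interface logic around the arrays, which is why it lands a bit under the ~2 billion quoted:

```python
# Sanity check of the "~2 billion transistors for the ESRAM" figure.
# Assumes a standard 6T SRAM cell; surrounding logic (decoders, sense
# amps, redundancy, the interface to the GPU) is not counted here.

ESRAM_BYTES = 32 * 1024 * 1024   # 32 MB of ESRAM
TRANSISTORS_PER_BIT = 6          # classic 6-transistor SRAM cell

transistors = ESRAM_BYTES * 8 * TRANSISTORS_PER_BIT
print(f"{transistors / 1e9:.2f} billion transistors in the cell arrays alone")
# -> ~1.61 billion, so "around 2 billion" including overhead is plausible
```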
 
I don't think ESRAM makes sense as an excuse for CoD. The engine is not using deferred rendering.

Weak GPU makes more sense. Fillrate issue.

And it fits in line with them asking for the extra 10% of GPU use. Memory might not be the weak link in this case.
 
That is also true. The 360 had a forward-looking GPU with a unified shader architecture, but the PS3 had the separate vertex and pixel pipeline design of the Nvidia 7xxx series GPUs.

At least both the X1 and the PS4 are using GCN, which is a current and up-to-date architecture.

right. architecturally the X1 GPU isn't as dated as RSX was. so there's that.
 

Skeff

Member
No, MS won't do that. People just have to realize, I believe, that MS is targeting the XB1 differently than it did the 360. They also already said they didn't target the higher end of graphics, etc., for gaming, people just didn't listen. In the end, MS is banking on the app and TV functionality making the XB1 take off like the Wii did. Remember, the Wii's graphics capabilities were NOTHING like the X360 or the PS3, but it did very well. MS wants to do that, and the XB1 system is the way they believe they can make that happen.

This is precisely why I think the XB1 will fail. People who buy a console for app and TV functionality don't pay $500 and then $60 a year. And considering the XB1's "say what you can see" voice feature is based on the OneGuide to control your TV, and the OneGuide is behind the XBLG paywall, essentially full TV control is behind the paywall.

The Wii offered something truly different to the PS360 at a much cheaper price and without a paywall.
 
And it fits in line with them asking for the extra 10% of GPU use. Memory might not be the weak link in this case.

I have only seen confirmation that COD Ghosts on next gen adds a bunch of new lighting effects (HDR, volumetric, self shadowing). I haven't seen any confirmation that it isn't using deferred rendering to make it happen.

we do not know if there is a performance difference between the two versions. we do not know if the Xbox One version is running all the same effects at the same complexity as the PS4 version.

we only know the resolutions. remember Battlefield 4, running at a lower average framerate on Xbox One and (currently) missing at least one effect to boot?

10% more pixels over 720p isn't going to get you very far at all. It would only mean that the PS4 version was about double the number of pixels instead of *more than* double.
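
For anyone who wants the raw numbers behind that (just the standard resolutions, nothing insider):

```python
# Pixel counts behind the "10% more pixels over 720p" point.
px_720p  = 1280 * 720     #   921,600 pixels
px_1080p = 1920 * 1080    # 2,073,600 pixels

px_720p_plus_10 = px_720p * 1.10

print(f"1080p vs 720p:        {px_1080p / px_720p:.2f}x the pixels")         # 2.25x
print(f"1080p vs 720p + 10%:  {px_1080p / px_720p_plus_10:.2f}x the pixels")  # ~2.05x
```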
 

Raistlin

Post Count: 9999
[image: You-Died.jpg]
Actually, it was an accurate statement in terms of performance.


right. architecturally the X1 GPU isn't as dated as RSX was. so there's that.
Though this is also true.




Basically it's a modern architecture, but in terms of raw performance ... it's actually behind where RSX was at its time. Different era though I suppose, since you could level a similar argument versus PS4?
 

Marlenus

Member
I have only seen confirmation that COD Ghosts on next gen adds a bunch of new lighting effects (HDR, volumetric, self shadowing). I haven't seen any confirmation that it isn't using deferred rendering to make it happen.

we do not know if there is a performance difference between the two versions. we do not know if the Xbox One version is running all the same effects at the same complexity as the PS4 version.

we only know the resolutions. remember Battlefield 4, running at a lower average framerate on Xbox One and (currently) missing at least one effect to boot?

10% more pixels over 720p isn't going to get you very far at all. It would only mean that the PS4 version was *almost* double the number of pixels instead of *more than* double.

The 10% extra GPU power would probably not have resulted in a resolution bump, but it may have resulted in an effects bump or a flat frame rate bump. It will be interesting to see the comparisons for COD, as the fact they asked for that extra power suggests to me that even at 720p they are (perhaps were) having some problems with something.
 
The Wii wasn't $500. Targeting a mainstream audience with an enthusiast price makes no sense.

They also don't have the charm of Nintendo. They can't be both the console for shooters and the console for families at the same time. You can't do everything for everyone. That's MS's problem.
 

Megatonne

Banned
Well, unless I'm mistaken, so is the PS4's GPU. Not to downplay the comparison too much, but yeah.

The Xbone's GPU is substantially weaker, and the difference shows right at 1080p. Look up performance for the 7770 (benchmarks will be harder and harder to find because it is so outdated by now): it can't do 1080p/60fps in anything released within the last two years. The 7850/7870-class GPU of the PS4 is right at 60fps/1080p in the same games.

[benchmark chart: Battlefield 3, Ultra settings]

[benchmark chart: Crysis 3]
 

TheTwelve

Member
ms can still win the tech wars by dropping xb1 price to $349
$499 for a clearly weaker console, lol, not even apple is that bad. ms is trying to be the new nintendo.


MS isn't winning a thing this gen, imo. However, they can stay competitive by dropping the price to $349.

However, how long will that take? A $100 price drop would do wonders in terms of sales, but I don't see even a $50 price drop until post-summer 2014. A $150 price drop? I don't see it being that low until at least 2015.

Then, would Sony just sit back and let them undercut them in price? I don't think so. They're going to make sure the PS4 is cheaper than the X1 alllll gen.
 
The 10% extra GPU power would probably not have resulted in a resolution bump, but it may have resulted in an effects bump or a flat frame rate bump. It will be interesting to see the comparisons for COD, as the fact they asked for that extra power suggests to me that even at 720p they are (perhaps were) having some problems with something.

right. people are taking what could be a bad sign about a specific game, as a reason to be optimistic about the future.

I'm not clear on how different lighting techniques need to be programmed. I know Ghosts is doing volumetric lighting on next gen consoles (which is a new feature). How many examples do we have of games that use the technique that do not use deferred techniques?

All the ones I know of use deferred techniques, but I don't understand things well enough to know if one implies the other.

Crysis was the first game I played with it, and I know Cryengine 2 and 3 have some deferred techniques (cryengine 3 has more). All UE3 games have deferred rendering techniques (and I believe Bioshock Infinite has volumetric lighting). I know Alan Wake has it, and again, uses deferred techniques.
 
The Xbone's GPU is substantially weaker, and the difference shows right at 1080p. Look up performance for the 7770 (benchmarks will be harder and harder to find because it is so outdated by now): it can't do 1080p/60fps in anything released within the last two years. The 7850/7870-class GPU of the PS4 is right at 60fps/1080p in the same games.

[benchmark chart: Battlefield 3, Ultra settings]

[benchmark chart: Crysis 3]

One of those cards has 1GB of video RAM, the other 2GB. I think that's an unfair comparison; find some benchmarks with the 1GB 7850.
 
MS isn't winning a thing this gen, imo. However, they can stay competitive by dropping the price to $349.

However, how long will that take? A $100 price drop would do wonders in terms of sales, but I don't see even a $50 price drop until post-summer 2014. A $150 price drop? I don't see it being that low until at least 2015.

Then, would Sony just sit back and let them undercut them in price? I don't think so. They're going to make sure the PS4 is cheaper than the X1 alllll gen.

MS can cut Kinect from the package to cut price substantially. It's just a question of whether or not they are too proud to do it.
 

Y2Kev

TLG Fan Caretaker Est. 2009
Both consoles are using very standard parts so it really comes down to who is able to absorb loss and how fast do parts scale.

What's the most expensive part of each system? I have to believe the PS4 actually scales better but I don't know how Kinect factors in. Does ESRAM come down in cost over time? Cause GDDR5 will get very cheap.
 

Megatonne

Banned
One of those cards has 1GB of video RAM, the other 2GB. I think that's an unfair comparison; find some benchmarks with the 1GB 7850.

The Crysis 3 comparison is with a 1GB 7850, and like I said, these benchmarks are hard to find for the 7770; the card is irrelevant by now. This isn't an indication of what the GPUs are fully capable of, but it is an indication that the PS4 GPU is much more capable at 1920x1080, and the difference is substantial.
 

goonergaz

Member
Then, would Sony just sit back and let them undercut them in price? I don't think so. They're going to make sure the PS4 is cheaper than the X1 alllll gen.

I recall something about the XBO hardware being quicker to drop in manufacturing costs compared to the PS4, so I guess it all depends on whether Sony wants to make money on the hardware or not (and vice versa).
 
Both consoles are using very standard parts so it really comes down to who is able to absorb loss and how fast do parts scale.

What's the most expensive part of each system? I have to believe the PS4 actually scales better but I don't know how Kinect factors in. Does ESRAM come down in cost over time? Cause GDDR5 will get very cheap.

This is an odd situation...

ESRAM prices generally don't change.

When memory becomes obsolete it generally goes up in price. DDR3 will be more expensive at the end of the Xbox One's life than it is currently, yet DDR3 still hasn't quite hit rock bottom in price.

GDDR5 prices will continue to become cheaper over the PS4's life.
 

Marlenus

Member
Both consoles are using very standard parts so it really comes down to who is able to absorb loss and how fast do parts scale.

What's the most expensive part of each system? I have to believe the PS4 actually scales better but I don't know how Kinect factors in. Does ESRAM come down in cost over time? Cause GDDR5 will get very cheap.

ESRAM scales really well with node shrinks compared to logic, so as these APUs are shrunk to 20nm and below, the % gap between the size of the X1 APU and the size of the PS4 APU will shrink.
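
A toy illustration of that point. The 28nm die sizes are the commonly reported ones (~363mm² for the X1 APU, ~348mm² for the PS4); the ESRAM share of the die and the per-node scaling factors are made-up assumptions purely to show the shape of the argument:

```python
# Toy model of how the X1/PS4 APU size gap could narrow on a shrink.
# 28nm die areas are the commonly reported figures; the ESRAM share of
# the die and the 28nm->20nm scaling factors are assumptions only.

x1_area_28, ps4_area_28 = 363.0, 348.0   # mm^2, reported 28nm die sizes
x1_esram_share = 0.25                    # assumed fraction of X1 die that is ESRAM
sram_scale, logic_scale = 0.65, 0.72     # assumed area scaling to 20nm

x1_esram = x1_area_28 * x1_esram_share
x1_logic = x1_area_28 - x1_esram

x1_area_20  = x1_esram * sram_scale + x1_logic * logic_scale
ps4_area_20 = ps4_area_28 * logic_scale  # treat the PS4 die as mostly logic

for node, x1, ps4 in [("28nm", x1_area_28, ps4_area_28),
                      ("20nm (modelled)", x1_area_20, ps4_area_20)]:
    print(f"{node}: X1 {x1:.0f} mm^2 vs PS4 {ps4:.0f} mm^2 "
          f"({(x1 / ps4 - 1) * 100:.1f}% larger)")
```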
 

DBT85

Member
What made Microsoft go with a turd of a GPU this time around? Didn't 360 have a better GPU than PS3?

The PS3 got a GPU at the last hour; they were planning on using another Cell until the last minute. And a design has a silicon budget of x mm². ESRAM takes up a large chunk of that area, so they had to reduce the area available for the GPU.

Both consoles are using very standard parts so it really comes down to who is able to absorb loss and how fast do parts scale.

What's the most expensive part of each system? I have to believe the PS4 actually scales better but I don't know how Kinect factors in. Does ESRAM come down in cost over time? Cause GDDR5 will get very cheap.

Well, the ESRAM is made on the same process at the same time as the rest of the APU, so as they drop from 28nm they'll make savings on the APU, which of course includes the CPU, GPU and the ESRAM. The question I have is how easy it will be for them to scale down with that larger APU.
 

pushBAK

Member
I don't think ESRAM makes sense as an excuse for CoD. The engine is not using deferred rendering.

Weak GPU makes more sense. Fillrate issue.

PS4: 1.84TF GPU (18 CUs)
PS4: 1152 shaders
PS4: 72 texture units
PS4: 32 ROPs
PS4: 8 ACEs / 64 queues
PS4: 8GB GDDR5 @ 176GB/s

Versus

Xbone: 1.31TF GPU (12 CUs)
Xbone: 768 shaders
Xbone: 48 texture units
Xbone: 16 ROPs
Xbone: 2 ACEs / 16 queues
Xbone: 8GB DDR3 @ 69GB/s + 32MB ESRAM @ 109GB/s

It's absolutely a fillrate issue; the X1 GPU is comparatively weak at rasterising a 3D scene. That is, converting the 3D scene into a 2D image that is written to memory, further modified by pixel shaders, and eventually displayed on the screen.

Now, correct me if I'm wrong but...

The ROPs in the GPU are what take care of this; they convert the 3D scene into a 2D image and write it to memory.

X1 has 16 ROP units, and writes the 2D image to DDR3 memory at 69GB/s (or eSRAM at 109GB/s).
PS4 has 32 ROP units, and writes the 2D image to GDDR5 memory at 176GB/s.

Needless to say, the PS4 will have better fillrate. This basically means that upping the resolution on the PS4 has a smaller impact on frame rate (compared to the X1), due to the PS4 being more efficient at the process.
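
Plugging the public numbers in (853MHz / 800MHz GPU clocks, one pixel per ROP per clock; these are theoretical peaks, real throughput is lower and workload-dependent):

```python
# Theoretical peak pixel fillrate from ROP count x GPU clock, plus the
# main-memory bandwidth budget per frame at 60fps. Peaks only; real
# numbers depend heavily on the workload.

def peak_fillrate_gpixels(rops, clock_mhz):
    return rops * clock_mhz * 1e6 / 1e9   # one pixel per ROP per clock

print(f"X1 : {peak_fillrate_gpixels(16, 853):.1f} GP/s peak")   # ~13.6
print(f"PS4: {peak_fillrate_gpixels(32, 800):.1f} GP/s peak")   # ~25.6

for name, bw_gb_s in [("X1 DDR3 (~68GB/s)", 68), ("PS4 GDDR5 (176GB/s)", 176)]:
    per_frame_mb = bw_gb_s * 1024 / 60
    print(f"{name}: ~{per_frame_mb:.0f} MB of memory traffic available per 60fps frame")
```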
 
Further researching...

Anvil Next (some Ubi games). It uses deferred lighting. Frostbite 3 (lots of EA games)... deferred lighting. UE4... deferred. Cryengine 3... deferred. Crystal engine (Tomb Raider, Deus Ex: Human Revolution)... deferred.

We won't have to wait long to find out for sure if resolution issues are framebuffer related or launch related.
 

Y2Kev

TLG Fan Caretaker Est. 2009
Further researching...

Anvil Next (some Ubi games). It uses deferred lighting. Frostbite 3 (lots of EA games)... deferred lighting. UE4... deferred. Cryengine 3... deferred. Crystal engine (Tomb Raider, Deus Ex: Human Revolution)... deferred.

We won't have to wait long to find out for sure if resolution issues are framebuffer related or launch related.


Everyone is going deferred. Which is why it probably killed the engineers to only have 32mb of ESRAM on xbone but I doubt the suits cared.
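
Rough numbers on why 32MB is so tight for deferred rendering. The G-buffer layout below (four 32-bit render targets plus a 32-bit depth/stencil) is just a typical example, not what any specific engine ships with:

```python
# Rough G-buffer footprint for a deferred renderer at common resolutions.
# Layout assumed here: four 32-bit colour targets + 32-bit depth/stencil
# = 20 bytes per pixel. This is a typical example, not a specific engine.

BYTES_PER_PIXEL = 4 * 4 + 4

def gbuffer_mb(width, height):
    return width * height * BYTES_PER_PIXEL / (1024 * 1024)

for w, h in [(1280, 720), (1600, 900), (1920, 1080)]:
    print(f"{w}x{h}: {gbuffer_mb(w, h):.1f} MB of G-buffer vs 32 MB of ESRAM")
# 1280x720  -> ~17.6 MB (fits)
# 1600x900  -> ~27.5 MB (barely fits)
# 1920x1080 -> ~39.6 MB (doesn't fit without tiling or spilling to DDR3)
```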
 
Everyone is going deferred. Which is why it probably killed the engineers to only have 32mb of ESRAM on xbone but I doubt the suits cared.

Yeah 32MB broken into even smaller 4x8MB chunks. Not much you can do with 8MB with the next-gen engines that are coming down the pipes.

Likely, just like the 360 era, this eSRAM will be pretty much used for post-processing type stuff and not much else. So when it comes to the main graphics engine, you will have to rely solely on the 67GB/s of bandwidth, which is just... impossible. Impossible for launch games, more than impossible for true next-gen "built from the ground up with final hardware in mind on final devkits" engines.

I don't honestly know what Microsoft can do.

According to AMD, "the Radeon HD 7770 offers up 1.28 TFLOPS of compute performance, with a texture fillrate of 40GT/s, a pixel fillrate of 16 GP/s, and peak memory bandwidth of 72GB/s."

[benchmark charts]


Do we know the X1's fillrates?
 
Everyone is going deferred. Which is why it probably killed the engineers to only have 32mb of ESRAM on xbone but I doubt the suits cared.

I'm sure the suits think 720p is good enough. There are certainly plenty of people who agree with them, and I think for a lot of people it will be.

But just like with last gen and SD, by the end of the gen the number of people okay with 720p is going to be smaller than it is at the launch of the gen.

And I still suspect Ghosts is deferred, since they've touted realtime volumetric/HDR lighting. I mean, again, it might not be and they haven't confirmed that it is, but if it isn't... well, it's really hard to explain why the Xbox One isn't running it closer in resolution to the PS4 version. Because there's no reason we should expect more than the 50% difference in resolution we see between the Xbox One version of BF4 and the PS4 version.

It makes perfect sense that the PS4 version of COD is higher resolution than BF4 is. You just have to look at the two games. So why isn't the Xbox One version, if not because of framebuffer limitations?
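
(For reference, the pixel math behind that "50% difference", using the reported 720p / 900p resolutions for BF4:)

```python
# Pixel ratio between the reported BF4 resolutions: 1280x720 on Xbox One
# versus 1600x900 on PS4.
x1_px  = 1280 * 720    #   921,600
ps4_px = 1600 * 900    # 1,440,000

print(f"PS4 / X1: {ps4_px / x1_px:.2f}x the pixels "
      f"(~{(ps4_px / x1_px - 1) * 100:.0f}% more)")   # 1.56x, ~56% more
```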
 
Further researching...

Anvil Next (some Ubi games). It uses deferred lighting. Frostbite 3 (lots of EA games)... deferred lighting. UE4... deferred. Cryengine 3... deferred. Crystal engine (Tomb Raider, Deus Ex: Human Revolution)... deferred.

We won't have to wait long to find out for sure if resolution issues are framebuffer related or launch related.

Everyone is going deferred. Which is why it probably killed the engineers to only have 32mb of ESRAM on xbone but I doubt the suits cared.

Well sounds like this won't be a good gen for MS multiplats then

ESram seems very limiting
 

twobear

sputum-flecked apoplexy
Both consoles are using very standard parts so it really comes down to who is able to absorb loss and how fast do parts scale.

What's the most expensive part of each system? I have to believe the PS4 actually scales better but I don't know how Kinect factors in. Does ESRAM come down in cost over time? Cause GDDR5 will get very cheap.

Wasn't the entire point of their using eSRAM in order to save money?
 

Y2Kev

TLG Fan Caretaker Est. 2009
It was to save money vs. using EDRAM. They opted not to use GDDR5 because they didn't think it would be available in quantity by fall 2013.

I think people are forgetting one important fact

xbone has more ram than ps4 which will allow for more effects like svogi and dogus
 
Well sounds like this won't be a good gen for MS multiplats then

ESram seems very limiting

Microsoft obviously think 720p is good enough. If they didn't, Killer Instinct and that golf game would be running at higher resolutions. Neither *needs* real time lighting. We know they don't prioritize 1080p, because most of their exclusive launch games aren't 1080p (and all of Sony's launch exclusives are).

So why should we expect them to make sure the hardware can do 1080p with advanced rendering techniques?

I honestly think they believe 720p is fine like almost every video game editorial writer and youtube commenter.

I wonder if KI is using deferred lighting as well. Would explain 720p.

I literally haven't heard one other explanation for it. I've asked people who believe COD shouldn't be 720p a bunch of times. I don't mean to say their explanations weren't plausible; I mean I wasn't ever given one.
 
Wasn't the entire point of their using eSRAM in order to save money?

Yes, but the general memory situation for the XB1 relative to the PS4 has far less room for future price drops.

I.e. while the ESRAM will likely get cheaper over time as they shrink the APU (pretty sure they'll accomplish this, only time will tell), the majority of the memory, the DDR3, is already at rock-bottom prices.

The PS4 however has GDDR5 memory, which will drop in price quite a lot over the gen.

CPUs are the same; not sure what the price disparity is between the GPUs though.

Microsoft obviously think 720p is good enough. If they didn't, Killer Instinct and that golf game would be running at higher resolutions. Neither *needs* real time lighting. We know they don't prioritize 1080p, because most of their exclusive launch games aren't 1080p (and all of Sony's exclusives are).

So why should we expect them to make sure the hardware can do 1080p with advanced rendering techniques?

I honestly think they believe 720p is fine like almost every video game editorial writer and youtube commenter.

Yeah, I'm sure that's the case. It's too bad I prefer full HD, I guess.

Do you think we'll see sub-HD games on the XB1 near the end of the gen when devs start trying to really push games? That would be pretty bad.
 

nib95

Banned
It was to save money vs. using EDRAM. They opted not to use GDDR5 because they didn't think it would be available in quantity by fall 2013.

I think people are forgetting one important fact

xbone has more ram than ps4 which will allow for more effects like svogi and dogus

More RAM? You mean 32MB more? According to GAF's insiders, the OS reserve is 2GB on the PS4, which actually gives it 1GB more RAM for games compared to the Xbox One.
 

Crisco

Banned
Yeah 32MB broken into even smaller 4x8MB chunks. Not much you can do with 8MB with the next-gen engines that are coming down the pipes.

Likely, just like the 360 era, this eSRAM will be pretty much used for post-processing type stuff and not much else. So when it comes to the main graphics engine, you will have to rely solely on the 67GB/s of bandwidth, which is just... impossible. Impossible for launch games, more than impossible for true next-gen "built from the ground up with final hardware in mind on final devkits" engines.

I don't honestly know what Microsoft can do.

From a video game perspective? Not a damn thing. That's why it's better just to not think of the Xbox One as a video game console. It's an all-in-one computing and entertainment device that just happens to play games. Like a smartphone or a tablet, but centered around your living room display instead of a tiny touch screen. It's essentially MS's end game, the final stage of their decade-long strategy to wrestle control of the living room from the likes of Nintendo, Sony, and cable/set-top boxes.

Honestly I think it's far more likely that MS releases a SKU without a controller, than one without Kinect.
 
Yes, but the general memory situation for the XB1 relative to the PS4 has far less room for future price drops.

I.e. while the ESRAM will likely get cheaper over time as they shrink the APU (pretty sure they'll accomplish this, only time will tell), the majority of the memory, the DDR3, is already at rock-bottom prices.

The PS4 however has GDDR5 memory, which will drop in price quite a lot over the gen.

CPUs are the same; not sure what the price disparity is between the GPUs though.

The X1 equivalent on the market is $100-120 (7770).

The PS4 equivalent on the market is about $200-220 (7850-7870; I call it a 7860, right in the middle of both).
 

Marlenus

Member
Yeah 32MB broken into even smaller 4x8MB chunks. Not much you can do with 8MB with the next-gen engines that are coming down the pipes.

Likely, just like the 360 era, this eSRAM will be pretty much used for post-processing type stuff and not much else. So when it comes to the main graphics engine, you will have to rely solely on the 67GB/s of bandwidth, which is just... impossible. Impossible for launch games, more than impossible for true next-gen "built from the ground up with final hardware in mind on final devkits" engines.

I don't honestly know what Microsoft can do.

According to AMD, "the Radeon HD 7770 offers up 1.28 TFLOPS of compute performance, with a texture fillrate of 40GT/s, a pixel fillrate of 16 GP/s, and peak memory bandwidth of 72GB/s."

[benchmark charts]


Do we know the X1's fillrates?

The X1 has fillrates similar to the 7770. The pixel fillrate will depend on memory bandwidth. I did some scaling calculations based on the 7770, 7790, 7850 and 7970 (where the 7970 was ROP bound rather than bandwidth bound like the other cards) and they came out quite close to Anandtech's numbers. I will edit the post in a minute when I find them.
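
Not the exact calculation being described, but a sketch of the general idea: take the ROP-limited peak and cap it by what the memory bus can sustain. The 8 bytes per pixel figure assumes a blended read-modify-write and no depth traffic, which is a big simplification:

```python
# Sketch of a bandwidth-capped fillrate estimate (the general idea only,
# not the exact method described above). Effective fillrate is the lower
# of the ROP limit and what the memory bus can feed; 8 bytes/pixel
# assumes a blended read-modify-write per pixel.

def effective_fillrate(rops, clock_mhz, bw_gb_s, bytes_per_pixel=8):
    rop_limit = rops * clock_mhz * 1e6 / 1e9   # Gpixels/s
    bw_limit = bw_gb_s / bytes_per_pixel       # Gpixels/s the bus can sustain
    return min(rop_limit, bw_limit), rop_limit <= bw_limit

cards = {                 # (ROPs, core clock MHz, bandwidth GB/s)
    "HD 7770": (16, 1000,  72),
    "HD 7790": (16, 1000,  96),
    "HD 7850": (32,  860, 154),
    "HD 7970": (32,  925, 264),
}
for name, spec in cards.items():
    rate, rop_bound = effective_fillrate(*spec)
    print(f"{name}: ~{rate:.1f} GP/s ({'ROP bound' if rop_bound else 'bandwidth bound'})")
```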
 

Skeff

Member
The costs for the PS4 should drop faster than the costs for the XB1 due to GDDR5: the 4Gbit chips are right at the cutting edge of the market, and their cost will come down massively, not just across the generation but even within the first year of the generation.

The main cost reduction for MS will be the APU, the PS4 will also benefit from reduced APU costs but also will benefit from vast cost reductions in Memory.

One of the potential yield issues associated with ESRAM is electrical leakage; it is possible this means the XB1 APU may not be able to move to the next process node as quickly as the PS4's. (This isn't fact, but it's definitely a possibility.)
 
Oh right, yeah.

Yeezus, is Xbone the most cynically designed console of all time? It's pretty much completely unlikeable.

maybe. I wouldn't be surprised if someone crunched numbers to figure out if it was worth including more ESRAM in order to facilitate higher resolutions.

for a company that wanted to try to argue it had designed its console 'for the future', it's certainly not as forward-looking a piece of hardware as the Xbox 360 was.
 