
New Xbox One Rumors: PS4 PSN > XBL, snap crashes games, ESRAM a lifelong bottleneck

Status
Not open for further replies.
Nope mort you were on your own with that

Not a single other source weighing in

http://www.neogaf.com/forum/showthread.php?p=87786946#post87786946

cboat

http://www.neogaf.com/forum/showthread.php?t=705667

Yep absolutely no other sources at all

Seriously though I hope MS gets it fixed for launch but doubt it by the sound of it

At least cboat sounded somewhat optimistic shrug
I've been meaning to say it but your summaries are super awesome and convenient. Thank you, honestly.


I second wholeheartedly...
 
Gaming company? Are you sure that description fits Microsoft, or even the Xbox's division, at the moment?

I don't want them to "fail", but if there's any justice in the world, they'll lose a lot of market share to Sony, and have to focus, lose the arrogance, and stop thinking their customers are brainless addicts. Or at least stop thinking that all of their customers are.

Any justice in the world? Fuck me, that's an absolute classic. I'd forgotten that Sony was the knight in shining armour riding up on their gallant steed to save the gaming community from the evil Xbox tyranny.
 

This...this is just sad and utterly cringeworthy. Imagine the mindset of someone who actually believes that their demographic would buy this as non-PR talk...
 
Let me rephrase here. Saying DDR3 feeds ESRAM might not be the best choice of words to describe why you can't really combine the two bandwidths. The GPU can access both ESRAM and DDR3 at the same time, but ESRAM's function is not the same as DDR3's. ESRAM holds a single frame; the frame is the result of the GPU's calculations. It is not necessary to redraw this frame every time, only to modify the pixels that changed while retaining as much of the original frame as possible. Things like alpha blending, depth/stencil, and MSAA are considered "free" because they are not heavy on the GPU, but heavy on memory fetches. Hence why ESRAM has crazy bandwidth.
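The "light on the GPU, heavy on memory fetches" point can be made concrete with a quick back-of-envelope. All figures below (resolution, format, layer count) are illustrative assumptions, not measured numbers:

```python
# Why blending is "light on the GPU, heavy on memory": every blended
# pixel is a read-modify-write of the render target. All figures here
# are illustrative assumptions, not measured numbers.

width, height   = 1920, 1080   # 1080p render target
bytes_per_pixel = 4            # RGBA8
fps             = 60
layers          = 4            # assume 4 blended transparency layers per pixel

# Opaque: one write per pixel per frame.
opaque_traffic  = width * height * bytes_per_pixel * fps
# Blended: one read + one write per layer, per pixel, per frame.
blended_traffic = width * height * bytes_per_pixel * fps * layers * 2

print(f"opaque:  {opaque_traffic / 1e9:.2f} GB/s")   # ~0.50 GB/s
print(f"blended: {blended_traffic / 1e9:.2f} GB/s")  # ~3.98 GB/s
```

The arithmetic is trivial, but it shows the shape of the problem: the math per pixel barely changes, while the memory traffic multiplies, which is exactly the kind of load you want to land on a high-bandwidth scratchpad.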
Exactly. But usually the biggest consumers of bandwidth are the ROPs, which is why you can get away with slower RAM + some type of on-chip RAM. Adding bandwidth makes sense not because the Xbone will have games that use 270GB/s of asset data, but because if needed the ROPs will have plenty of bandwidth to play with, while keeping the DDR3 free for assets and the CPU.
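A rough sketch of why the ROPs alone can outrun the DDR3 bus, using the commonly reported Xbox One GPU clock and ROP count (treat both, and the RGBA8 assumption, as assumptions for illustration):

```python
# Back-of-envelope ceiling on ROP bandwidth demand.
# Clock and ROP count are the commonly reported Xbox One figures;
# treat them as assumptions here.

rops            = 16
clock_hz        = 853e6   # 853 MHz GPU clock
bytes_per_pixel = 4       # RGBA8 color writes

# Peak color-write traffic if every ROP retires one pixel per clock:
write_bw = rops * clock_hz * bytes_per_pixel   # bytes/s
# With alpha blending each pixel also reads the destination first:
blend_bw = write_bw * 2

print(f"color writes:  {write_bw / 1e9:.1f} GB/s")   # ~54.6 GB/s
print(f"with blending: {blend_bw / 1e9:.1f} GB/s")   # ~109.2 GB/s
```

Even this crude ceiling lands near (writes) or well past (blending) the 68GB/s the DDR3 offers, which is the whole argument for pointing the ROPs at on-chip RAM and leaving the DDR3 for assets and the CPU.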

Since the esram is fully addressable by the gpu this time, we might see it being used as a buffer for some interesting stuff like procedural synthesis... This could also be done without even touching the ddr3...

The 68GB/s DDR3 might sound terribly slow compared to 176, but compare the ratios of 360/PS3 and Xbone/PS4... You will see that the Xbone's DDR3 is actually not much slower than the 360's GDDR3 relative to the PS3's RAM setup... But that is something we should see soon enough: if the DDR3 is indeed slow we will probably see some games having reduced asset quality on the Xbone.
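One way to put numbers on that ratio comparison, using the commonly cited peak bandwidth figures for each machine (all of them assumptions for this sketch; it also ignores ESRAM/eDRAM entirely):

```python
# Generational main-memory bandwidth comparison using commonly cited
# peak figures, in GB/s. All numbers are assumptions; ESRAM/eDRAM are
# deliberately excluded to compare only the big external pools.

x360_gddr3 = 22.4    # Xbox 360 unified GDDR3
ps3_xdr    = 25.6    # PS3 XDR main memory
ps3_gddr3  = 22.4    # PS3 GDDR3 video memory
xb1_ddr3   = 68.0    # Xbox One DDR3
ps4_gddr5  = 176.0   # PS4 unified GDDR5

# How much did each vendor's external bandwidth grow between gens?
print(f"360 -> XB1: {xb1_ddr3 / x360_gddr3:.2f}x")              # ~3.04x
print(f"PS3 -> PS4: {ps4_gddr5 / (ps3_xdr + ps3_gddr3):.2f}x")  # ~3.67x
```

Read that way, both machines scaled their external bandwidth by a similar factor over their predecessors, which is roughly the poster's point; the absolute gap between 68 and 176 is what the ESRAM is meant to paper over.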
 

Snubbers

Member
Heh, it sounds one sided because the result is one sided as we see in the BF4 comparisons. I am just trying to put some reasoning behind the rumors. I don't think ESRAM is the bottleneck.

Maybe I should post a disclaimer that I am not invested or interested in either system. PC master race represent.

No problems, but you are reaching somewhat, making statements like "You cannot add the two"... Now, we both know the PS4 is a better design, but making up rationale is kind of not a good thing to do...
 

astraycat

Member
Exactly. But usually the biggest consumers of bandwidth are the ROPs, which is why you can get away with slower RAM + some type of on-chip RAM. Adding bandwidth makes sense not because the Xbone will have games that use 270GB/s of asset data, but because if needed the ROPs will have plenty of bandwidth to play with, while keeping the DDR3 free for assets and the CPU.

Since the esram is fully addressable by the gpu this time, we might see it being used as a buffer for some interesting stuff like procedural synthesis... This could also be done without even touching the ddr3...

The 68GB/s DDR3 might sound terribly slow compared to 176, but compare the ratios of 360/PS3 and Xbone/PS4... You will see that the Xbone's DDR3 is actually not much slower than the 360's GDDR3 relative to the PS3's RAM setup... But that is something we should see soon enough: if the DDR3 is indeed slow we will probably see some games having reduced asset quality on the Xbone.

Really the biggest problem with ESRAM is just that it's tiny. Let's say I have several render passes - a few shadow map passes for the cascades maybe, a simple deferred pass, a resolve pass.

On the PC or PS4 I'd just fill in the command buffers and go. I'd have 2 places for sync points: after the shadow passes/the deferred render, and after the resolve. Once I've hit the second sync point, I just present.

On the XB1, however, I need exclusive use of the ESRAM for my deferred pass, and I'd also like to have my shadow depth buffers in ESRAM during their passes, so I serialize them. First I do all my shadow passes, sync, then move them out and sync again.

Then I do my deferred pass and sync. At this point, I have to decide where I want to put all my buffers for post-processing/resolve pass. I want my final buffer to be there at least, and all the gbuffers I can fit after that. So I swap a buffer out for another, requiring a few more sync points, then do my resolve pass.

After the resolve's done my buffer is still in ESRAM. I don't want it locked there for a whole VSync interval, so I have to copy it out again before presentation. Already it's far more complicated than what I'd have to do on the PC/PS4, and each additional sync is potential time for CUs to sit idle and twiddle their thumbs. Imagine adding several post-processing passes to that.

And this scenario doesn't even take into account the idea of beginning to build the command buffer for the next frame while the GPU is still churning on this one. Just thinking about managing ESRAM contention in that scenario gives me a headache.
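The frame walked through above can be sketched as a toy model that just counts sync points on each path. The pass list and where the syncs land are illustrative assumptions lifted from the description, not any real API:

```python
# Toy model of the frame described above: count GPU sync points on a
# unified-memory path (PC/PS4) vs. the ESRAM-juggling path (XB1).
# Pass names and sync placement are illustrative assumptions.

unified = [
    "shadow passes", "deferred pass", "SYNC",   # one sync after shadows/deferred
    "resolve", "SYNC",                          # one sync after resolve
    "present",
]

esram = [
    "shadow passes", "SYNC",                        # finish shadow passes
    "copy shadow maps out of ESRAM", "SYNC",        # free ESRAM for deferred
    "deferred pass", "SYNC",                        # deferred done
    "swap gbuffers / final target in ESRAM", "SYNC",# reshuffle for post
    "resolve", "copy final buffer out", "SYNC",     # evacuate before present
    "present",
]

print("unified syncs:", unified.count("SYNC"))  # 2
print("esram syncs:  ", esram.count("SYNC"))    # 5
```

Each extra "SYNC" is a potential bubble where CUs sit idle, so even this crude count shows how the small scratchpad turns a two-barrier frame into a serialized shuffle, before any post-processing or cross-frame overlap is added.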
 

Kysen

Member
All that balanced talk from MS PR in the DF articles has led to a console running games at 720p with a fluctuating framerate. Wow.
 
Any justice in the world? Fuck me, that's an absolute classic. I'd forgotten that Sony was the knight in shining armour riding up on their gallant steed to save the gaming community from the evil Xbox tyranny.

Grow up. Sony's playing their cards right at the moment because they learned from their PS3 screw ups - stemming from arrogance and greed - while they played catch up this gen. Sony currently has the correct stance - that pleasing customers is good business. They lost a lot of market share, and a lot of money, and rightly so, but they've come around...at least for now.
 
Grow up. Sony's playing their cards right at the moment because they learned from their PS3 screw ups - stemming from arrogance and greed - while they played catch up this gen. Sony currently has the correct stance - that pleasing customers is good business. They lost a lot of market share, and a lot of money, and rightly so, but they've come around...at least for now.

Me grow up? You're the one claiming that one company's console beating another's is "justice"; it's fanboyism at its worst. They're both very good consoles, with pros and cons. Which one is the best is not down to you, it's down to the individual purchasing it, so don't give me this justice talk.
 
I can't believe they would implement eSRAM knowing it would be a system life bottleneck. I just can't.

Maybe they thought it would work in the same way eDRAM works in the 360?

:shrugs:

What I do know is the division desperately needs J.Allard back to help stop the ship from capsizing...probably too late though.
 

Copenap

Member
I don't believe they knew it would be a bottleneck, it was probably beneficial in testing/on paper.

I believe the common opinion is that they 100% needed 8 gigs of RAM for running the three OSes and the whole multimedia function. Since they probably thought GDDR5 was too expensive they had to go with DDR3, and basically needed to implement ESRAM because 8 gigs of DDR3 alone would never be enough for gaming.
 
8GB GDDR5 wasn't an option until just before E3.
Sony was really lucky because they went for GDDR5 early and found a chip price cut that allowed them to move from 4 to 8.
Without that lucky strike it would've been Xbox One (8GB DDR3) vs PlayStation 4 (4GB GDDR5) and we'd be seeing a more balanced brawl.
 

twobear

sputum-flecked apoplexy
I can't believe they would implement eSRAM knowing it would be a system life bottleneck. I just can't.

The point is, it's a compromise. They could have put in a sensible amount of eDRAM with a decent bandwidth, but they didn't want to because of cost. They knew it was a bottleneck all along, they just didn't care because it saves them money. Didn't one of the early Xbone prototypes have dual GPUs? There were beastly systems on the table at some point.
 
Maybe they thought it would work in the same way eDRAM works in the 360?

:shrugs:

It will do eventually, I speculate, once the drivers are sorted out. The advantage of the current ESRAM is that it is open for developers to handle as they please. The disadvantage at the moment is not having the option to have it handled for you like before. I imagine the optimal solution will be having it handled for you, but with the option to manually control certain aspects; I imagine the driver will allow that with time.
 

Jibbed

Member
I've been watching this thread from afar and things seem to be calming down a bit now.

So, all things considered... is this actually the ass-storm CBOAT was alluding to and Bish is hiding from, or is there more to come? Any comment thus far from either of them?
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
I can't believe they would implement eSRAM knowing it would be a system life bottleneck. I just can't.

There is no other alternative if cost projections of GDDR5 or eDRAM on package do not meet your requirements during the design phase. Given those circumstances, the XBO's architecture is reasonable. 8GB for running Windows 8 in parallel to games was more important to them than game performance. And we can be justly mad at the people who made this a core requirement. It's not about providing a better "experience" for games, it's about boosting Microsoft's own apps ecosystem across all platforms in order to compete with iOS and Android.
 

Dahaka

Member
900p on the PS4 is disappointing, but considering it's cross-gen and DICE supposedly even thought about delays, it's kinda ok.
 
AA is overrated at 1080p anyway, especially on a TV.

I kind of agree, but only if there's some blur/dof or similar effects and having AA would drop us down to 900p

at 60fps the jaggies should integrate fairly well over time

Haven't tried it on my TV yet but on my PC it looks fine to me
 

avaya

Member
Why they have ESRAM is simple

They wanted 8GB guaranteed; for that you need a small scratchpad to meet bandwidth needs. Naturally you'd choose eDRAM, however they couldn't get that fabbed @28nm at a variety of places. So they had to go for the shitty solution. Good enough. They thought Sony would come in with 4GB max.

At the time they felt confident. That was before Cerny dropped 8GB GDDR5 and made their solution look redundant.
 

Verendus

Banned
Another megathread? Ain't no stopping this train, is there? Rumour needs to begin on Xbox One hardware failure now. That must be the next course of action.
 

Verendus

Banned
Breaking: insider suggests Xbox One hardware prone to higher failure rate than Xbox 360.
I'm not suggesting it. I'm just saying it seems like the next course of action.

We've had two threads where the OS is bonkers or whatever. Stands to reason that Commissioner Gordon's law of escalation will come into effect, and there will be murmurs of Xbox One hardware failures now, and potential self-destruction sequences at launch.

Would fit in with all the Al-Qaeda rumours floating around.
 