
CBOAT: ESRAM handicap for now, but will get better

I think it's because the Xbone is using eSRAM instead of eDRAM, but I might be wrong.

ESRAM - slower, takes more space for equivalent capacity vs. EDRAM. Can be integrated into an APU

EDRAM - faster, takes less silicon so you can have more storage or more space available for GPU. Cannot currently be integrated on an APU, requires a daughter die (I think the process node for EDRAM is stuck at a larger size than the current APU?)

If they'd gone for an EDRAM daughter die with maybe 64MB, similar system to 360, the bandwidth would have been much higher, more capacity for 1080p support with deferred rendering and they'd have had more space to put a much more powerful GPU on. And still keep the DDR3 solution
 
I feel for the Xbox One users that will have to experience something as bad as initial PS3 Skyrim. Don't worry, the patches will come.
 
I feel for the Xbox One users that will have to experience something as bad as initial PS3 Skyrim. Don't worry, the patches will come.

Wasn't that because there wasn't enough system RAM and they were writing stuff to the HDD?

Not sure how a weak GPU will cause such issues.

[edit] Such comments do make me wonder how much of the 'interest' around the power difference is PS3-only owners hoping that Xbone users are punished the same way they were for Sony's poor decisions.
 
If Xbone engineers were real gamers, they would have immediately refused to design the X1 as it is now. MS should've implemented GDDR5 inside the APU or opted for a more conventional PowerPC + powerful GPU approach... but they cared about shoving in media apps and advertising more than anything else.
Engineers are constrained by certain limitations; they aren't free to do whatever they want. Also, GDDR5 inside the APU? You know that isn't possible, right?
I feel for the Xbox One users that will have to experience something as bad as initial PS3 Skyrim. Don't worry, the patches will come.
That was due to the split memory pool inside the PS3.
 
What a silly thing to say. Engineers have to design within constraints, they're not free to do whatever they want.

True, but engineers can also force a design change in a project, especially if there is an obvious flaw in the current design.

It takes courage to do so, but believe me, it can be done: higher management can get a good scare when engineers start explaining the pros and cons of each design.

MS is not short on cash (I know that doesn't mean they can spend unlimited money), but a good engineer would have known that 1080p in 2013 is a must, and that implementing eSRAM would be a huge bottleneck to that requirement.
 
I feel for the Xbox One users that will have to experience something as bad as initial PS3 Skyrim. Don't worry, the patches will come.

Hopefully it will not come down to that mess.
Maybe graphics will be a bit worse at the start, but hopefully games will play solid.
It's not like it's the Cell + SPE + split-memory thing.
 
If MS bosses were real gamers, they wouldn't even have thought of asking engineers to design the X1 as it is now. MS should've implemented GDDR5 inside the APU or opted for a more conventional PowerPC + powerful GPU approach... but they cared about shoving in media apps and advertising more than anything else.

Fixed.
 
mj-laughing.gif

You can laugh all you want while stuck with sub-1080p games for another 8 years.
 
True, but engineers can also force a design change in a project, especially if there is an obvious flaw in the current design.

It takes courage to do so, but believe me, it can be done: higher management can get a good scare when engineers start explaining the pros and cons of each design.

MS is not short on cash (I know that doesn't mean they can spend unlimited money), but a good engineer would have known that 1080p in 2013 is a must, and that implementing eSRAM would be a huge bottleneck to that requirement.

MS's engineers probably aren't gamers, though. They're engineers. Sony's engineers probably aren't gamers either. The problem is with the suits, not the engineers.
 
True, but engineers can also force a design change in a project, especially if there is an obvious flaw in the current design.

It takes courage to do so, but believe me, it can be done: higher management can get a good scare when engineers start explaining the pros and cons of each design.

MS is not short on cash (I know that doesn't mean they can spend unlimited money), but a good engineer would have known that 1080p in 2013 is a must, and that implementing eSRAM would be a huge bottleneck to that requirement.
That implies it is a design flaw.

If Cboat is correct that XBO was meant to launch next year, it could simply be that the tools for the systems are just not where they need to be. That's not something an engineer could have possibly foreseen.
 
If Xbone engineers were real gamers, they would have immediately refused to design the X1 as it is now. MS should've implemented GDDR5 inside the APU or opted for a more conventional PowerPC + powerful GPU approach... but they cared about shoving in media apps and advertising more than anything else.

Yeah sounds about right.
 
Out of technical curiosity, what's the difference? I'm guessing EDRAM must be more limited than ESRAM, to have such substantial benefits without being a direct replacement.

eSRAM is static and eDRAM is dynamic; essentially the main differences are:

eDRAM is higher density - in the same amount of die space, you can get around 3x the memory size.
eSRAM can use less power.
eSRAM can be fabricated in more places (very important for XB1), as eDRAM requires extra stages during fabrication.

Essentially the main benefit eSRAM provides over eDRAM would be $$$$$
 
eSRAM is static and eDRAM is dynamic; essentially the main differences are:

eDRAM is higher density - in the same amount of die space, you can get around 3x the memory size.
eSRAM can use less power.
eSRAM can be fabricated in more places (very important for XB1), as eDRAM requires extra stages during fabrication.

Essentially the main benefit eSRAM provides over eDRAM would be $$$$$

I might be completely mistaken here but I thought that only a few companies are allowed to produce x86 processors, so why is the fabrication thing important?

Or is the license only for the design?
 
That implies it is a design flaw.

If Cboat is correct that XBO was meant to launch next year, it could simply be that the tools for the systems are just not where they need to be. That's not something an engineer could have possibly foreseen.

I'm not the most technical person on GAF, not even close. But even I would have seen that 32 MB of eSRAM would be somewhat of a bottleneck for a next-generation console... it's no magic, just simple calculations, and you have the data.
 
If they'd gone for an EDRAM daughter die with maybe 64MB, similar system to 360, the bandwidth would have been much higher, more capacity for 1080p support with deferred rendering and they'd have had more space to put a much more powerful GPU on. And still keep the DDR3 solution

This makes me wonder why they went with eSRAM. Cost?
 
I might be completely mistaken here but I thought that only a few companies are allowed to produce x86 processors, so why is the fabrication thing important?

Or is the license only for the design?

I'm not sure on the specifics of this, but AMD is apparently only allowed to license x86 if they fabricate it themselves. The fabrication of eDRAM reportedly could not be done at any of those facilities, so Microsoft would have had to have the eDRAM fabricated by a separate company on a daughter die; the problem, of course, is that they would not have got as large a discount if they did this.

Or at least that's my take on it.
 
I think we'll see variable frame buffers a lot more this time around. Absolute resolution will be much harder to quantify as a measure of overall quality.
Clearly MS has made a hard decision on the value proposition of targeting performance above, say, 1 to 1.5 Mpixels, given their target audience.

The Xbone has the ability to easily mix several display planes of different dimensions, so this will likely mean that the HUD or OSD in a game is rendered at 1080p, keeping important info sharper, while the main framebuffer is variable depending on system load.
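To illustrate the display-plane idea, here's a quick sketch (the function and plane names are mine for illustration, not the actual XB1 API): the 3D scene renders at whatever resolution the frame budget allows and gets scaled up at scan-out, while the HUD plane stays at native 1080p.

```python
# Sketch of hardware display planes: the scene plane renders at a
# variable resolution and is scaled to the output, while the HUD
# plane stays at native 1080p so text and icons remain sharp.
# All names and numbers here are illustrative assumptions.
NATIVE = (1920, 1080)

def compose_frame(scene_res, hud_res=NATIVE):
    """Describe the final scanned-out frame for given plane resolutions."""
    sx = NATIVE[0] / scene_res[0]
    sy = NATIVE[1] / scene_res[1]
    return {
        "scene": {"rendered": scene_res, "scaled_by": (round(sx, 2), round(sy, 2))},
        "hud": {"rendered": hud_res, "scaled_by": (1.0, 1.0)},
        "output": NATIVE,
    }

# Heavy frame: the scene drops to 900p, the HUD stays crisp at 1080p.
print(compose_frame((1600, 900)))
```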
 
I'm not the most technical person on GAF, not even close. But even I would have seen that 32 MB of eSRAM would be somewhat of a bottleneck for a next-generation console... it's no magic, just simple calculations, and you have the data.

Why is it a bottleneck? We've already heard that 32 MB is a large enough buffer/scratchpad for up to 1080p.
 
Every time a game is revealed on Xbone, the first question on everyone's lips will be: at what resolution? It doesn't matter if it's noticeable or not; it's another negative connotation associated with the brand...

Agreed... but only by the hardcore. I was in Tesco just the other day talking to a guy who works in the tech area, and he had bought the PS4 to play FIFA and CoD (I know, I shook my head too). When I said well, at least you have the better console with more power/higher res etc., his answer was: "I thought they were the same until the Xbox got upgraded on its processor(s)? And the PS4 was cheaper!!!"

This is the market that matters, and outside of us on GAF and hardcore gamers, the majority of buyers will not know, notice or care.
 
eSRAM is static and eDRAM is dynamic; essentially the main differences are:

eDRAM is higher density - in the same amount of die space, you can get around 3x the memory size.
eSRAM can use less power.
eSRAM can be fabricated in more places (very important for XB1), as eDRAM requires extra stages during fabrication.

Essentially the main benefit eSRAM provides over eDRAM would be $$$$$

Interesting stuff, thanks!
 
Entrecôte;87276841 said:
Why is it a bottleneck? We've already heard that 32 MB is a large enough buffer/scratchpad for up to 1080p.

No it isn't; a 1080p framebuffer can be as large as 150 MB.

At the very minimum, using old rendering techniques, it is ~24 MB without AA.

With 2x MSAA or FSAA, a 1080p framebuffer is a minimum of 40 MB.

Battlefield 3's G-buffer at 1080p with 4x MSAA is ~150 MB.
KZ:SF's G-buffer is ~47 MB without AA.
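For anyone who wants to sanity-check those figures, the arithmetic is simple. The bytes-per-pixel values below are my own illustrative assumptions, not confirmed engine data:

```python
# Rough framebuffer size estimates at 1080p.
# Bytes-per-pixel per render target are assumptions for illustration.
WIDTH, HEIGHT = 1920, 1080
PIXELS = WIDTH * HEIGHT

def mb(total_bytes):
    """Convert a byte count to mebibytes."""
    return total_bytes / (1024 * 1024)

# A single 32-bit (4 bytes/pixel) colour target:
single_target = mb(PIXELS * 4)           # ~7.9 MB

# A simple forward-rendered setup -- colour + depth/stencil + back
# buffer (3 targets x 4 bytes/pixel) -- lands in the ~24 MB region:
forward = mb(PIXELS * 4 * 3)             # ~23.7 MB

# A deferred G-buffer, e.g. 4 colour targets at 4 bytes/pixel plus an
# 8 byte/pixel depth+extras target, grows quickly; MSAA multiplies
# every sample-frequency target again:
gbuffer = mb(PIXELS * (4 * 4 + 8))       # ~47.5 MB

print(f"single target: {single_target:.1f} MB")
print(f"forward (3 targets): {forward:.1f} MB")
print(f"deferred G-buffer: {gbuffer:.1f} MB")
```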
 
Agreed... but only by the hardcore. I was in Tesco just the other day talking to a guy who works in the tech area, and he had bought the PS4 to play FIFA and CoD (I know, I shook my head too). When I said well, at least you have the better console with more power/higher res etc., his answer was: "I thought they were the same until the Xbox got upgraded on its processor(s)? And the PS4 was cheaper!!!"

This is the market that matters, and outside of us on GAF and hardcore gamers, the majority of buyers will not know, notice or care.

Perception spreads from the core. We are the tastemakers. Our decisions and preferences do matter. Not for every purchase, but enough to get momentum started.

This is how the 360 gained marketshare, and it's how the PS4 will regain some of that marketshare.
 
Entrecôte;87276841 said:
Why is it a bottleneck? We've already heard that 32 MB is a large enough buffer/scratchpad for up to 1080p.

No, you heard that 16 ROPs are enough for 1080p, which is true. No one in their right mind can say that 32 MB of eSRAM is enough to handle next-gen textures going back and forth to feed the GPU.

eSRAM might be enough for current-generation assets and textures, but not next gen. That's why Xbox One games immediately drop the resolution to get more effects.
 
Perception spreads from the core. We are the tastemakers. Our decisions and preferences do matter. Not for every purchase, but enough to get momentum started.

This is how the 360 gained marketshare, and it's how the PS4 will regain some of that marketshare.

You mean just like the Wii and PS Vita?

Let's not kid ourselves here: the 'power' of the core gamer to influence sales is definitely hit and miss, and I'm inclined to call it more correlation than causation. I think the price tag will have more to do with the Xbone's impending floppage than core gamers leaving it for dead.
 
That implies it is a design flaw.

If Cboat is correct that XBO was meant to launch next year, it could simply be that the tools for the systems are just not where they need to be. That's not something an engineer could have possibly foreseen.
If that had meant the Xbox One would've received a power boost, instead of what we are getting now, I would've preferred that. It would have given Microsoft more time to prepare their PR, and it would possibly have prevented this entire clusterfuck we are currently having.
 
ps4-dual-architecture.jpg


OP reminded me of that. Hopefully devs will get a handle on the new Xbox soon. It took a long while with the PS3 and in a lot of cases 3rd party support never fully recovered. At least the X1 doesn't seem as confusing as the PS3 was to program for.

It also has way less power compared to its competitor. We are not having a 360/PS3 situation here at all, where one console was accessible and powerful and the other one was complicated but potentially more powerful. The new situation: one console is inaccessible and weak, the other one is accessible and (well, for a $400 console...) strong. The outcome will be devastating for the X1, and that so many people are still planning on buying it anyway just shows how little people care about technology when they buy new technology. I know it sounds weird, but it's true.
 
Dat Next Gen feel of 720p.

At this point, Xbox One would need to come out with some stellar first party titles in order to bring me over. It seems insane to me that they are charging $500 for a machine that will mostly do 720-900p for third party titles when third party titles make up 90% of the releases in a given year. PS4 has a major advantage for many because it just makes sense to own the system that is going to have the best quality for the majority of the releases in a year.

I really hope MS can come up with some exciting IPs in the next 4 years. Gears and Halo don't cut it for me anymore. In terms of how this impacts sales, I'm not convinced it will make a big dent. Not because people don't care, but because they just don't know.
 
No, you heard that 16 ROPs are enough for 1080p, which is true. No one in their right mind can say that 32 MB of eSRAM is enough to handle next-gen textures going back and forth to feed the GPU.

eSRAM might be enough for current-generation assets and textures, but not next gen. That's why Xbox One games immediately drop the resolution to get more effects.


No. I was not referring to that.

http://www.anandtech.com/show/6972/xbox-one-hardware-compared-to-playstation-4/3

At 32MB the ESRAM is more than enough for frame buffer storage, indicating that Microsoft expects developers to use it to offload requests from the system memory bus. Game console makers (Microsoft included) have often used large high speed memories to get around memory bandwidth limitations, so this is no different. Although 32MB doesn’t sound like much, if it is indeed used as a cache (with the frame buffer kept in main memory) it’s actually enough to have a substantial hit rate in current workloads (although there’s not much room for growth).
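The cache framing is the key point: how much the eSRAM actually helps depends on its hit rate. A toy model (the bandwidth figures are the commonly quoted XB1 specs; the hit rates are made-up examples, and a real memory system is far more complicated than a weighted average):

```python
# Toy model: effective memory bandwidth when the 32 MB eSRAM acts as
# a cache in front of DDR3. Bandwidth figures are the widely quoted
# XB1 numbers; hit rates are illustrative guesses only.
ESRAM_BW = 102.0   # GB/s, often-cited eSRAM figure
DDR3_BW = 68.0     # GB/s, often-cited DDR3 figure

def effective_bandwidth(hit_rate):
    """Weighted average: hits served by eSRAM, misses by DDR3."""
    return hit_rate * ESRAM_BW + (1 - hit_rate) * DDR3_BW

for hit in (0.0, 0.5, 0.9):
    print(f"hit rate {hit:.0%}: ~{effective_bandwidth(hit):.0f} GB/s")
```

The takeaway matches the AnandTech quote: with a substantial hit rate the eSRAM meaningfully offloads the DDR3 bus, but there is little headroom as working sets grow.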
 
It also has way less power compared to its competitor. We are not having a 360/PS3 situation here at all, where one console was accessible and powerful and the other one was complicated but potentially more powerful. The new situation: one console is inaccessible and weak, the other one is accessible and (well, for a $400 console...) strong. The outcome will be devastating for the X1, and that so many people are still planning on buying it anyway just shows how little people care about technology when they buy new technology. I know it sounds weird, but it's true.
Indeed. Everyone keeps on ignoring reality. It's silly to expect the gap to ever be closed between the two consoles. One simply is much more powerful than the other. It's as simple as that.
 
You mean just like the Wii and PS Vita?
Let's not kid ourselves here: the 'power' of the core gamer to influence sales is definitely hit and miss, and I'm inclined to call it more correlation than causation. I think the price tag will have more to do with the Xbone's impending floppage than core gamers leaving it for dead.

In fairness now, handhelds are not as popular as they once were.

On the other hand, though, apparently "gamers" chose the Dreamcast, and look how that turned out...

Why did I post this?
 
You mean just like the Wii and PS Vita?

Let's not kid ourselves here: the 'power' of the core gamer to influence sales is definitely hit and miss, and I'm inclined to call it more correlation than causation. I think the price tag will have more to do with the Xbone's impending floppage than core gamers leaving it for dead.

Price is the biggest factor, no doubt. I'm just saying you can't discount the tastemakers completely. There is an impact.
 
The Xbone has the ability to easily mix several display planes of different dimensions, so this will likely mean that the HUD or OSD in a game is rendered at 1080p, keeping important info sharper, while the main framebuffer is variable depending on system load.

This is not an ability exclusive to the Xbone; the PS4 can do the same thing.
 
Agreed... but only by the hardcore. I was in Tesco just the other day talking to a guy who works in the tech area, and he had bought the PS4 to play FIFA and CoD (I know, I shook my head too). When I said well, at least you have the better console with more power/higher res etc., his answer was: "I thought they were the same until the Xbox got upgraded on its processor(s)? And the PS4 was cheaper!!!"

This is the market that matters, and outside of us on GAF and hardcore gamers, the majority of buyers will not know, notice or care.

I've heard similar stuff too, know a couple of people who think the Xbone is significantly more powerful than the PS4 because it's so much more expensive.

I find it crazy how willing some people are to drop £400 on a console without even taking five minutes to do a quick Google search.
 
You mean just like the Wii and PS Vita?

Let's not kid ourselves here: the 'power' of the core gamer to influence sales is definitely hit and miss, and I'm inclined to call it more correlation than causation. I think the price tag will have more to do with the Xbone's floppage than core gamers leaving it for dead.

I think the Wii and Vita were very different cases; they were not up against direct competition, IMO. Sure, the 3DS was against the Vita, but I think they appeal to different people; same for the Wii vs. the HD twins.

I do not think of Nintendo as competition against Sony/Microsoft, as they're just so different. I see the Wii U as a supplementary console; e.g. I will get a PS4 and a Wii U because they offer very different things. I hope that doesn't sound condescending.

We, as the hardcore, do not decide the console generations, but we do help. We know which is the more powerful system and what these jiggaflops mean, and we tell people; they don't know what we're talking about, but they know which is the stronger hardware. Does that affect their purchasing decisions? Sometimes, but not always.

When deciding which console they wanted, power wasn't a concern for the Wii; it was different, you could actually play tennis.

But when deciding between Xbox and PS, well, they both have half-hearted attempts at playing tennis, but one is cheaper and more powerful.

TL;DR: People listen to us, but sometimes they buy something other than graphics.
 