
CBOAT: ESRAM handicap for now, but will get better

Really?! So anyone who likes playing at 1080p is just showing off his e-penis?

You don't have a problem playing at 720-900p? Good for you, but don't dismiss the opinions of people who like playing at resolutions higher than that, and more importantly don't make it out like it's wrong to play at a higher resolution.

I'm talking especially about XBoners waving the XBone e-penis.

If you want 1080p that much get a PC or PS4. Choice, it's a good thing.
 
Wisest to wait for a Kinect-less SKU with a price below 349€. 499€ is insane for such a resolution- and performance-crippled console.
 
So it's what a shitload of other people have been saying, but NOW it's a fact because he says it?
It always was about the games, it always will be about the games.
Seems like a Sony advantage to me. Their first-party studios were much stronger this generation and, if multiplats are now better on their platform as well, it seems like a huge loss for MS.
 
Man I wonder if this is what John felt like when he wrote the Book of Revelation.

Okay, this got me. I'm imagining CBoat in an office somewhere in Redmond just stewing or something. So, I'm assuming Thuway was an insider, but he got banned because he had only one source? FamousMortimer is the guy that spearheaded the NoPS4DRM thing, right? CBoat is the professional Deep Throat, though. In CBoat we trust.

Just be careful, man. With the launches this close, you've got to be doubly careful. We all love you, but don't fly too close to the flame. You are a saint, but I don't want to see you get martyred.

Actually, I'm sure you are fine. I'm just paranoid because I'm imagining some Zero Dark Thirty scene in Redmond where Ballmer is torturing people to find out who CBoat is.
 
I've got a question, if 720p is truly what Titanfall will run at as well.

By the time Titanfall comes out, will we be able to build a PC that can run it at the same graphics at 720p for $500?
 
I'm still left with a few unanswered questions:

1. Was the eDRAM in the 360 hard-programmed to limited functionality, or was it like the eSRAM in that developers had to decide themselves how to manage a higher-speed memory pool? If the latter, why is eSRAM development so difficult? I realize it's more difficult than the PS4, but my assumption is it would be comparable to developing for the 360, not developing for Cell.

2. Which design choice drove the other? Was the relatively low-TF GPU chosen because 32MB of eSRAM limits the benefits of more flops as a bandwidth bottleneck, or was 32MB of eSRAM chosen because they decided on a GPU that is shader-unit limited? Would a 48MB eSRAM pool take up too much APU space? I would love to know what other eSRAM pool vs. GPU configurations were considered. I ask this because the size of the eSRAM pool, the number of shaders, and the scaler all point to a designed "balance" of 720p or 900p that is upscaled.

I'm not too technically inclined but I'll try to help. For 1) I believe that the eDRAM was indeed limited to a function set, so there wasn't much to 'decide' on how to use it. The eSRAM, if I understand, is much more flexible. This is a blessing and a curse. It can be used for far more functions, but it doesn't come with the benefit of being made for a specific function. Therefore, there's more to 'learn' in handling the eSRAM as opposed to the eDRAM.

I'm not knowledgeable enough to even address the second part without making some stupid errors. Sorry.
 
I'm getting an XB1, but I won't be buying any 720p games next gen. I've been playing at 1080p@60fps since forever on my PC. I could live without the 60fps, but going back to 720p is unacceptable, especially when gaming on a 60-inch TV.
I expect a res bump with each gen; being OK with 720p next gen is like being OK with 480p games on the PS360.
 
Well, Microsoft could dish out an Xbox One+ (or something along those lines) with more GPU cores and GDDR5, and it could be Xbox One backward compatible.
Even "One+" games could run with low resolution/details on the One "classic".

Point is: are they ready to spend A LOT of money to continue their console business?

I don't think the average consumer is so upset with 900p that this system will be some big failure because of its rendering resolution. People are buying this for the games and also the additional entertainment value of the system. In the grand scheme of things, most buyers don't care whether the game is 900p upscaled or 1080p as long as it plays fine and looks fine.

It will cost any company a lot of money to continue their console business; in the grand scheme of things, to both MS and Sony the game divisions are much smaller than other divisions, and both have plenty of money in reserve. Both will continue on. Look at MS, who had billions invested in the original Xbox that took a long time to recover, and Sony, who was hemorrhaging money with the PS3 for years (only to have the PS3 become the dominant console). Potentially the ESRAM may become a huge bonus for the One once more developers understand how to use it.

Their last system was out 8 years before the next one was introduced. We won't see a new Xbox for a very long time after the One comes out. If they did, it would kill their credibility, as people would feel like their console purchases were not safe (since the last system wouldn't have been supported very long). The XB1 won't turn out to be the failure so many people on the internet think it will be.
 
So basically a slight improvement over time as the dev tools get better, and the speed of development may improve over time...
I think it mostly just reaffirms the nasty position of the X1; not a lot of positivity there.

Compared to the X1, we know the PS4 should be much, much faster to develop for as its tools ripen, and the quality can only multiply over time.

720p-900p as the norm sucks. Had to be the ESRAM.
 
The reason the resolution is lower is that frame buffers just straight up take more space at higher resolutions, and when you only have 32 MB, you have to make sacrifices if you want to load more stuff into the eSRAM.
hmm

it will be interesting to see how this will play out in the future
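
To put rough, illustrative numbers on the framebuffer point above (my own back-of-the-envelope arithmetic, not figures from any dev): even a single 32-bit color target plus a 32-bit depth/stencil buffer grows quickly with resolution.

```c
/* Back-of-the-envelope framebuffer sizes against a 32 MB ESRAM budget.
   Illustrative only: assumes one 32-bit color target plus one 32-bit
   depth/stencil target (8 bytes per pixel total). */
#include <stdio.h>

int main(void) {
    const struct { const char *name; int w, h; } res[] = {
        { "720p",  1280,  720 },
        { "900p",  1600,  900 },
        { "1080p", 1920, 1080 },
    };
    for (int i = 0; i < 3; i++) {
        double pixels = (double)res[i].w * res[i].h;
        double mb = pixels * 8.0 / (1024.0 * 1024.0);  /* 4 B color + 4 B depth */
        printf("%-6s %9.0f px  ~%4.1f MB of 32 MB ESRAM\n",
               res[i].name, pixels, mb);
    }
    return 0;
}
```

A lone color+depth pair fits at any of these resolutions, but a typical deferred G-buffer of four 32-bit targets plus depth (20 bytes per pixel) needs roughly 40 MB at 1080p versus about 27 MB at 900p, which is exactly the kind of sacrifice being described.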
 
Ah, GAF... damn you! This damn thread has kept me reading for over 30 minutes so far, lol. Exciting times indeed, and fun while we wait for next-gen day one to drop!
 
Was the eDRAM in the 360 hard-programmed to limited functionality, or was it like the eSRAM in that developers had to decide themselves how to manage a higher-speed memory pool? If the latter, why is eSRAM development so difficult? I realize it's more difficult than the PS4, but my assumption is it would be comparable to developing for the 360, not developing for Cell.

The eDRAM on the 360 had additional fixed-function features, but it was mainly a dedicated, fast piece of memory to hold render targets. This is basically what the XB1's ESRAM is there for. In contrast to the 360's eDRAM, the XB1's ESRAM shares a common memory address space with main memory from the GPU's point of view, but that doesn't really change its most reasonable use case, which is still holding render targets.
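
In code terms, the decision developers face might look something like this hypothetical sketch (made-up types and function, not the actual XDK API): since the ESRAM is just a small, fast region of the same address space, "using" it mostly means choosing which allocations land in the 32MB pool.

```c
/* Hypothetical sketch of render-target placement -- not the real XDK
   API. The ESRAM is modeled as a small fast pool inside the GPU's
   normal address space; anything that doesn't fit falls back to DDR3. */
#include <stddef.h>
#include <stdbool.h>

typedef struct {
    size_t capacity;   /* 32MB on the XB1 */
    size_t used;
} FastPool;

/* Returns true if the target was placed in ESRAM, false if it has to
   fall back to main (DDR3) memory instead. */
bool place_render_target(FastPool *esram, size_t bytes, bool bandwidth_hungry)
{
    /* Prioritize bandwidth-hungry targets (color, depth) for the fast
       pool; less-trafficked buffers can live in DDR3. */
    if (bandwidth_hungry && esram->used + bytes <= esram->capacity) {
        esram->used += bytes;
        return true;
    }
    return false;
}
```

The juggling act is deciding which targets deserve the fast pool each frame, which is work the 360's more fixed-function eDRAM never really asked of developers.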

Which design choice drove the other? Was the relatively low-TF GPU chosen because 32MB of eSRAM limits the benefits of more flops as a bandwidth bottleneck, or was 32MB of eSRAM chosen because they decided on a GPU that is shader-unit limited?

Die size was most probably chosen with costs in mind. You design everything around that restriction, and balancing (*g*) the number of CUs and the amount of ESRAM is surely something that you do at the same time. It seems pretty obvious to me, though, that 16MB of ESRAM would have been way too small. 40MB-48MB would definitely have been an advantage, but that apparently would have cost too much die space.

For comparison, Intel uses a L4 cache of 128MB of eDRAM for its Iris Pro 5200 iGPU on the same package. There was a quote from them on Anandtech that 32MB would have also been "ok", but then again you don't target 1080p when you design an iGPU mainly for notebooks.
 
Well the discussion seems to have gone under the assumption that while X1 devtools will improve over time, PS4 will remain the same through and through and that just can't be right, can it? I mean, even Killzone devs have said they've just "scratched the surface" of what the machine can do.
 
not entirely true, smart use of ESRAM and cache can increase pipeline efficiency quite a bit

but 50% over any meaningful period of time is stretching it
Even 15% is stretching it.

The ESRAM will be used mostly for framebuffers, which will fill practically every MB there is.
 
What happens when PC lead engines get more complex and even PS4 has to drop its resolution to cope? Can there be a situation where Xbox can't even sustain 720p, or would they drop details first? It just feels like if they're struggling already, they have no headroom to deal with further advances in game engines.
 
Which design choice drove the other?
The way I understand it, from the tidbits people like Matt have leaked:

Media apps/Snap/OS functions etc. needed lots of RAM...
=>8GB relatively slow DDR3
=>32MB ESRAM
=>less die space
=>14 CUs with two disabled for yield

Essentially, they all come as a consequence of the desire to "conquer the living room."

You can throw Kinect-in-every-box into the mix as a BoM opportunity cost as well, competing with the silicon budget.

And there was likely a directive from on high that the box must be profitable at launch, restricting the overall BoM.
 
Well the discussion seems to have gone under the assumption that while X1 devtools will improve over time, PS4 will remain the same through and through and that just can't be right, can it? I mean, even Killzone devs have said they've just "scratched the surface" of what the machine can do.

I've pointed this out earlier (possibly in another topic): the basic architecture for the two systems is the same, an x64 CPU and a GCN GPU running on pooled memory. Most improvements the devs can squeeze out of the Bone will also help to get extra mileage out of the PS4. The PS4 will always be a moving target out of reach. The real question is whether the Bone will be able to deliver the baseline of what the consumer expects, and how those expectations will change when they see the PS4.
 
People paid $600 for the PS3 and weren't even getting 720p on a lot of games.

Sony paid $850 to bring that console to us at $600. Blu-ray players were $1k at the time, and the PS3 had HDMI, Wi-Fi, and played 3 generations of PlayStation games using internal components of those systems.

There's no way MS is taking anything close to a loss like that. I'm not sure they're taking a loss at all. So no, it's not the same thing.
 
Well the discussion seems to have gone under the assumption that while X1 devtools will improve over time, PS4 will remain the same through and through and that just can't be right, can it? I mean, even Killzone devs have said they've just "scratched the surface" of what the machine can do.

Not sure how you managed to take that from this thread.
 
I know people can do it with actual handwriting, but can you be identified by typing style too?

I wouldn't doubt it. Words you use, common mistakes, etc.

And with the kind of info he's given out at times I'd be stunned if they haven't tried to find him.
 
Sony paid $850 to bring that console to us at $600. Blu-ray players were $1k at the time, and the PS3 had HDMI, Wi-Fi, and played 3 generations of PlayStation games using internal components of those systems.

There's no way MS is taking anything close to a loss like that. I'm not sure they're taking a loss at all. So no, it's not the same thing.

good post.
 
Sony paid $850 to bring that console to us at $600. Blu-ray players were $1k at the time, and the PS3 had HDMI, Wi-Fi, and played 3 generations of PlayStation games using internal components of those systems.

There's no way MS is taking anything close to a loss like that. I'm not sure they're taking a loss at all. So no, it's not the same thing.

As someone who bought a PS3, none of that made paying 200 dollars more for a console with inferior versions of games any better, especially when some ports were gimped to the point of missing essential features like voice chat.
 
Sony paid $850 to bring that console to us at $600. Blu-ray players were $1k at the time, and the PS3 had HDMI, Wi-Fi, and played 3 generations of PlayStation games using internal components of those systems.

There's no way MS is taking anything close to a loss like that. I'm not sure they're taking a loss at all. So no, it's not the same thing.

All this made no difference to people who wanted to play the new-gen games as best they could, so they decided to wait on Blu-ray, thought Wi-Fi wasn't all that hot, saw they could play their 2 generations of old PS games on 2 generations of old consoles, saved $200, and bought an Xbox 360.

Do you see a pattern?
 
Very true. However, side by side, you WOULD be able to tell the difference. Still almost 350k pixels difference.

Why would you be playing two copies of the same game side by side though?

Really this all just feeds into our insecurities. Knowing that someone out there is playing this game in a higher resolution? How dare they! 900p will be fine. Heck, 720p with nice IQ will be fine too. Some PS4 games will probably be 900p too (or an equivalent like 1280x1080)
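
For reference, the pixel counts behind that "1280x1080 is roughly 900p-equivalent" aside (simple arithmetic, nothing more):

```c
/* Pixel counts for the resolutions discussed in this thread. */
#include <stdio.h>

int main(void) {
    printf("720p  (1280x720):   %d px\n", 1280 * 720);   /*   921,600 */
    printf("900p  (1600x900):   %d px\n", 1600 * 900);   /* 1,440,000 */
    printf("1280x1080:          %d px\n", 1280 * 1080);  /* 1,382,400 */
    printf("1080p (1920x1080):  %d px\n", 1920 * 1080);  /* 2,073,600 */
    return 0;
}
```

1280x1080 carries about 96% of 900p's pixels, hence "equivalent".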
 
I'm not too technically inclined but I'll try to help. For 1) I believe that the eDRAM was indeed limited to a function set, so there wasn't much to 'decide' on how to use it. The eSRAM, if I understand, is much more flexible. This is a blessing and a curse. It can be used for far more functions, but it doesn't come with the benefit of being made for a specific function. Therefore, there's more to 'learn' in handling the eSRAM as opposed to the eDRAM.

I'm not knowledgeable enough to even address the second part without making some stupid errors. Sorry.

I'm pretty sure Microsoft said in that DF article that the ESRAM could be managed automatically, like before, if the developer didn't want to optimise it themselves. I can't understand why it's so complicated for the developers, unless of course their drivers are very immature.
 
Why would you be playing two copies of the same game side by side though?

Really this all just feeds into our insecurities. Knowing that someone out there is playing this game in a higher resolution? How dare they! 900p will be fine. Heck, 720p with nice IQ will be fine too. Some PS4 games will probably be 900p too (or an equivalent like 1280x1080)

This makes no sense; by that logic people wouldn't have minded buying a car with worse mileage/performance/comfort for the same money either. It doesn't work that way. You don't have to see them side by side, you just need to know that one is better than the other. It doesn't matter whether you learned that through personal observation, or through media, the internet, word of mouth, etc. Consumer goods like gaming consoles are highly substitutable products; if you know one of them offers a better value proposition, you buy it. The whole "So what if the other version is 1080p, 720p will still look good" argument holds no water when you factor in rational consumer behavior. You will hear that irrational sentence only from people who are already invested in the system, because that's what consumers do: become irrationally attached once they have become wrongly invested.

By your logic, why are people not still buying 360s and 720p HDTVs? Are they not good enough? The only people who argued 720p was 'good enough' were people who bought 720p HDTVs early in the adoption cycle, i.e. idiots like myself. Then everyone flocked to 1080p (rationally) once they were widely available.

We are at the beginning of a generation. The inferior and superior products are launching together. Consumer sentiment is growing, and people are getting informed. When, not if, people know which product is inferior, they will steer away.
 
Well the discussion seems to have gone under the assumption that while X1 devtools will improve over time, PS4 will remain the same through and through and that just can't be right, can it? I mean, even Killzone devs have said they've just "scratched the surface" of what the machine can do.

I'm already curious to see what Naughty Dog will come up with and what second-generation games will look like (probably like The Order?)
 
So if the Xbone fits somewhere between 720p and 900p considering hardware and development challenges, where does this leave the PS4? Could the stronger specs and easier development lead to more games being 900p-1080p?
 
Why would you be playing two copies of the same game side by side though?

Really this all just feeds into our insecurities. Knowing that someone out there is playing this game in a higher resolution? How dare they! 900p will be fine. Heck, 720p with nice IQ will be fine too. Some PS4 games will probably be 900p too (or an equivalent like 1280x1080)

I don't mind someone playing a game with 4 times the pixel count as me, so long as they didn't pay $100 less to do it.
 
 
It's complicated... and dangerous to answer unless you're an insider. For right now, suffice it to say that the PS4 is ahead in development and is a stronger box. How much stronger remains in flux, depending on the developer and the maturity of the development tools.

Take away the RAM and ESRAM, which are clouding the issue. Let's assume for a moment that the Xbox has amazing, mature tools that get the most from the ESRAM and can max out the GPU.

You still have a lower-powered GPU than the PS4.

So right now the ESRAM appears to be a bottleneck. But when they get past that, you get to the real bottleneck (in terms of any kind of parity with the PS4). No matter how efficient the RAM, you will simply not be able to do as much with the Xbox One.

900p is a really smart compromise. Most people won't notice, especially if the Xbox is their only machine, and it is enough of a drop in pixels to help bridge that gap in performance.
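
Rough arithmetic supports that last point. Using the publicly quoted GPU configurations (12 active CUs at 853MHz for the XB1, 18 CUs at 800MHz for the PS4) and treating shader FLOPS as a crude proxy for performance, the 1080p-to-900p pixel ratio almost exactly matches the compute gap:

```c
/* Rough arithmetic behind "900p bridges the gap": publicly quoted GPU
   throughput vs. the 1080p/900p pixel ratio. Shader FLOPS is a crude
   proxy for real performance; this is illustrative, not a benchmark. */
#include <stdio.h>

int main(void) {
    /* CUs x 64 lanes x 2 ops per clock (FMA) x clock speed */
    double xb1_tflops = 12 * 64 * 2 * 0.853e9 / 1e12;  /* ~1.31 TF */
    double ps4_tflops = 18 * 64 * 2 * 0.800e9 / 1e12;  /* ~1.84 TF */

    double px_1080p = 1920.0 * 1080.0;
    double px_900p  = 1600.0 * 900.0;

    printf("FLOPS ratio (PS4/XB1):    %.2f\n", ps4_tflops / xb1_tflops); /* ~1.41 */
    printf("Pixel ratio (1080p/900p): %.2f\n", px_1080p / px_900p);      /* ~1.44 */
    return 0;
}
```

A ~1.41x compute gap against a ~1.44x pixel gap is why a 900p game on one box can plausibly match a 1080p game on the other in everything else.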
 
This makes no sense; by that logic people wouldn't have minded buying a car with worse mileage/performance/comfort for the same money either. It doesn't work that way. You don't have to see them side by side, you just need to know that one is better than the other. It doesn't matter whether you learned that through personal observation, or through media, the internet, word of mouth, etc. Consumer goods like gaming consoles are highly substitutable products; if you know one of them offers a better value proposition, you buy it. The whole "So what if the other version is 1080p, 720p will still look good" argument holds no water when you factor in rational consumer behavior. You will hear that irrational sentence only from people who are already invested in the system, because that's what consumers do: become irrationally attached once they have become wrongly invested.

By your logic, why are people not still buying 360s and 720p HDTVs? Are they not good enough? The only people who argued 720p was 'good enough' were people who bought 720p HDTVs early in the adoption cycle, i.e. idiots like myself. Then everyone flocked to 1080p (rationally) once they were widely available.

We are at the beginning of a generation. The inferior and superior products are launching together. Consumer sentiment is growing, and people are getting informed. When, not if, people know which product is inferior, they will steer away.

People buy cars with worse mileage all the time, because they like the badge, or their friend has one, or it has comfortable seats. People 'make do' all the time, and once they've purchased, they don't even know or care.
 
Take away the RAM and ESRAM, which are clouding the issue. Let's assume for a moment that the Xbox has amazing, mature tools that get the most from the ESRAM and can max out the GPU.

You still have a lower-powered GPU than the PS4.

So right now the ESRAM appears to be a bottleneck. But when they get past that, you get to the real bottleneck (in terms of any kind of parity with the PS4). No matter how efficient the RAM, you will simply not be able to do as much with the Xbox One.

900p is a really smart compromise. Most people won't notice, especially if the Xbox is their only machine, and it is enough of a drop in pixels to help bridge that gap in performance.

Quite. If you said to people, "Xbox One plays Ryse, and PS4 plays KZ:SF," people aren't going to feel conned, not in the slightest. They both look out of this world and on a similar footing, and only 0.001% of gamers can actually tell the resolution difference with the naked eye.
 
Well, Microsoft could dish out an Xbox One+ (or something along those lines) with more GPU cores and GDDR5, and it could be Xbox One backward compatible.
Even "One+" games could run with low resolution/details on the One "classic".

Point is: are they ready to spend A LOT of money to continue their console business?

Xbox One "classic", you say? oh brother!! First the system needs to "earn" that title of "Classic" and that simply does not look that will happen. The entire idea is outlandish bro..it'll NEVER happen. Props for keeping positive, though

On a side note: MS will do just fine as long as they deliver the games. Most people will not really care about the 900p thing. 720p being the norm, though? Hmmm... I think that is a short-term problem that will be fixed later rather than sooner.
 
I'm pretty sure Microsoft said in that DF article that the ESRAM could be managed automatically, like before, if the developer didn't want to optimise it themselves. I can't understand why it's so complicated for the developers, unless of course their drivers are very immature.

I read the DF article and I don't remember MS saying that.

EDIT: I'm reading that article now and I still can't find it.
 
People buy cars with worse mileage all the time, because they like the badge, or their friend has one, or it has comfortable seats. People 'make do' all the time, and once they've purchased, they don't even know or care.

Do not try to generalize from exceptional cases. Consumers are largely rational; money does not grow on trees. People do not buy cars with worse mileage all the time. They buy them because they have to make sacrifices in their product decisions; that may be because they don't have enough money for the better option, it may be because the inferior product offers a benefit that the superior product does not (better speed or comfort, better aftersales, etc.), or lastly because they are uninformed. But they don't buy the worse offering because they enjoy buying crap.

As for your points: first of all, while brand loyalty is a factor, the amount of differentiating software between the two consoles is at an all-time low. Secondly, making do has nothing to do with the situation at hand, considering the inferior product is also the more expensive one, so you are not relegated to a poor choice due to inadequate purchasing power. Lastly, 'once they've purchased' is completely irrelevant because nobody has invested in the consoles at this point.
 