
DF: Xbone Specs/Tech Analysis: GPU 33% less powerful than PS4

Oh, PS3 had flop potential, let me tell you! ;)
I can't shake a feeling of deja vu.

Objectively looking at the specs, I understand that the PS4 is more powerful than the XBONE.

Subjectively however, this feels exactly like the conversations we were having around E3 2006. PS3's tech superiority, storage of BluRay, power of the CELL, flops, rops, orders of magnitude, etc. To my eyes, that speculated power-gap never materialized in any meaningful way, and the 360 walked away with the lion's share of better multiplats, which conventional wisdom wouldn't have predicted.

I'm excited to see all this new tech in action, and to see how all of the plans / power/ promises pan out. We all win either way.
 
I can't shake a feeling of deja vu.

Objectively looking at the specs, I understand that the PS4 is more powerful than the XBONE.

Subjectively however, this feels exactly like the conversations we were having around E3 2006. PS3's tech superiority, storage of BluRay, power of the CELL, flops, rops, orders of magnitude, etc. To my eyes, that speculated power-gap never materialized in any meaningful way, and the 360 walked away with the lion's share of better multiplats, which conventional wisdom wouldn't have predicted.

I'm excited to see all this new tech in action, and to see how all of the plans / power/ promises pan out. We all win either way.

Not trying to insult you, but this is like the 20 millionth time someone has brought up PS3/360 completely forgetting/not understanding/not knowing that:

1. PS3's GPU is weaker than 360
2. PS3 was much harder to program for than 360
3. 360 shared much with PC development

Compared to this generation in which

1. Both consoles share the same architecture*
2. Both consoles share the same generation of GPU (and featureset derived from said GPU generation)
3. Both consoles share near identical development environment
4. PS4 at this point unarguably has the more powerful GPU, and both consoles share the same CPU
 
I can't shake a feeling of deja vu.

Objectively looking at the specs, I understand that the PS4 is more powerful than the XBONE.

Subjectively however, this feels exactly like the conversations we were having around E3 2006. PS3's tech superiority, storage of BluRay, power of the CELL, flops, rops, orders of magnitude, etc. To my eyes, that speculated power-gap never materialized in any meaningful way, and the 360 walked away with the lion's share of better multiplats, which conventional wisdom wouldn't have predicted.

I'm excited to see all this new tech in action, and to see how all of the plans / power/ promises pan out. We all win either way.

Biggest difference is that it's an even more direct comparison than PS3 vs 360.

CPU and GPU are both provided by AMD, and the PS4 is easy to program for from the get-go: you don't have a highly specialized Cell CPU to contend with, or a split memory architecture versus a unified one (360), or even the different GPU architectures of Nvidia versus AMD (dedicated versus unified shaders). It's a much more direct approach: the exact same CPU, but a faster GCN GPU design, period.
 

TheD

The Detective
I can't shake a feeling of deja vu.

Objectively looking at the specs, I understand that the PS4 is more powerful than the XBONE.

Subjectively however, this feels exactly like the conversations we were having around E3 2006. PS3's tech superiority, storage of BluRay, power of the CELL, flops, rops, orders of magnitude, etc. To my eyes, that speculated power-gap never materialized in any meaningful way, and the 360 walked away with the lion's share of better multiplats, which conventional wisdom wouldn't have predicted.

I'm excited to see all this new tech in action, and to see how all of the plans / power/ promises pan out. We all win either way.

No, it is nothing like the PS3 vs 360!

All of the extra power the PS3 had was tied up in the SPEs on the CELL. The RSX had just about no advantages vs the Xenos (processing power was about the same, number of ROPs is the same, etc.).

This time, on the other hand, the extra power of the PS4 is not in something odd like the SPEs; it is in a GPU that shares about the same architecture as the Xbone's GPU.
 

CLEEK

Member
I can't shake a feeling of deja vu.

Objectively looking at the specs, I understand that the PS4 is more powerful than the XBONE.

Subjectively however, this feels exactly like the conversations we were having around E3 2006. PS3's tech superiority, storage of BluRay, power of the CELL, flops, rops, orders of magnitude, etc. To my eyes, that speculated power-gap never materialized in any meaningful way, and the 360 walked away with the lion's share of better multiplats, which conventional wisdom wouldn't have predicted.

I'm excited to see all this new tech in action, and to see how all of the plans / power/ promises pan out. We all win either way.

The difference this gen is that the Xbox and Playstation are built on the exact same core architecture. So for the first time ever, the FLOPS of both are true, meaningful figures. The ~600GFLOPS advantage the PS4 has isn't smoke and mirrors.

It's not like this gen, where one had the better GPU, the other the better CPU. The only difference (other than GPU power) between the Xbone and PS4 is the memory.
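For reference, here's a back-of-the-envelope sketch of where that ~600GFLOPS figure comes from, assuming the commonly reported GCN configurations (18 CUs for the PS4, 12 for the Xbone, both at 800MHz; the exact clocks were still rumored at this point):

```cpp
#include <cstdio>

// Peak single-precision throughput for a GCN GPU:
// CUs x 64 lanes per CU x 2 ops per fused multiply-add x clock.
double peak_gflops(int compute_units, double clock_mhz) {
    return compute_units * 64 * 2 * clock_mhz / 1000.0;
}

int main() {
    double ps4   = peak_gflops(18, 800); // ~1843 GFLOPS
    double xbone = peak_gflops(12, 800); // ~1229 GFLOPS
    printf("gap: %.0f GFLOPS (Xbone is %.0f%% of PS4)\n",
           ps4 - xbone, 100.0 * xbone / ps4);
    return 0;
}
```

That also lines up with the thread title: 1229/1843 is roughly 67%, i.e. about 33% less.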
 

mrklaw

MrArseFace
The difference this gen is that the Xbox and Playstation are built on the exact same core architecture. So for the first time ever, the FLOPS of both are true, meaningful figures. The ~600GFLOPS advantage the PS4 has isn't smoke and mirrors.

It's not like this gen, where one had the better GPU, the other the better CPU. The only difference (other than GPU power) between the Xbone and PS4 is the memory.

No, it is nothing like the PS3 vs 360!

All of the extra power the PS3 had was tied up in the SPEs on the CELL. The RSX had just about no advantages vs the Xenos (processing power was about the same, number of ROPs is the same, etc.).

This time, on the other hand, the extra power of the PS4 is not in something odd like the SPEs; it is in a GPU that shares about the same architecture as the Xbone's GPU.

Biggest difference is that it's an even more direct comparison than PS3 vs 360.

CPU and GPU are both provided by AMD, and the PS4 is easy to program for from the get-go: you don't have a highly specialized Cell CPU to contend with, or a split memory architecture versus a unified one (360), or even the different GPU architectures of Nvidia versus AMD (dedicated versus unified shaders). It's a much more direct approach: the exact same CPU, but a faster GCN GPU design, period.

Not trying to insult you, but this is like the 20 millionth time someone has brought up PS3/360 completely forgetting/not understanding/not knowing that:

1. PS3's GPU is weaker than 360
2. PS3 was much harder to program for than 360
3. 360 shared much with PC development

Compared to this generation in which

1. Both consoles share the same architecture*
2. Both consoles share the same generation of GPU (and featureset derived from said GPU generation)
3. Both consoles share near identical development environment
4. PS4 at this point unarguably has the more powerful GPU, and both consoles share the same CPU


Now *I'm* getting a feeling of deja vu

:)
 

Durante

Member
L1 and L2 are both likely lower than that mark. At least L1 certainly has to be. GPUs are designed to deal with latency, but latency is still very much a high priority deeper down in the memory architecture. It's the GDDR5 that's allowed to be higher latency, but the L1 data cache, the L2 cache, the shared L1 between every 4 compute units, they are all very low latency.
Actually, on a GTX 460, you get ~50 nanoseconds effective latency in L1. I really think you underestimate the general magnitude of latency on memory accesses from GPUs compared to CPUs.
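For anyone wondering where figures like that ~50ns number come from, the standard measurement trick is pointer chasing: make every load depend on the previous one so the hardware can't hide the latency behind parallelism. A minimal CPU-side sketch of the methodology (GPU versions do the same thing from a shader or CUDA kernel):

```cpp
#include <chrono>
#include <cstdio>
#include <numeric>
#include <random>
#include <utility>
#include <vector>

int main() {
    // ~128MB working set, so we measure memory rather than cache.
    const size_t N = 1 << 24;
    std::vector<size_t> next(N);
    std::iota(next.begin(), next.end(), size_t{0});

    // Sattolo's algorithm builds a single-cycle permutation, so the
    // chase visits every slot instead of getting stuck in a short loop.
    std::mt19937_64 rng(42);
    for (size_t i = N - 1; i > 0; --i) {
        size_t j = rng() % i; // j in [0, i-1]
        std::swap(next[i], next[j]);
    }

    // Each load depends on the last, so time per iteration approximates
    // the effective latency at this working-set size.
    size_t idx = 0;
    const size_t iters = 20'000'000;
    auto t0 = std::chrono::steady_clock::now();
    for (size_t i = 0; i < iters; ++i) idx = next[idx];
    auto t1 = std::chrono::steady_clock::now();

    double ns = std::chrono::duration<double, std::nano>(t1 - t0).count() / iters;
    printf("~%.1f ns per dependent load (checksum %zu)\n", ns, idx);
    return 0;
}
```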
 
Sony seems to have hit all the right bases with the hardware design this generation.

In fact, one could argue it is EASILY the best designed console they've ever had.
 
I think in the grand scheme of things, based on what has been said, the ESRAM will provide a very helpful boost to performance for developers, enough to be considered very significant to Xbox One game development. Then when you consider very useful GCN features such as PRT (Partial Resident Textures), where you can load part of a texture into memory as opposed to the entire thing, suddenly that 32MB starts to seem a lot bigger and more useful than some might've originally expected.

http://www.anandtech.com/show/4455/amds-graphics-core-next-preview-amd-architects-for-compute/5
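As a rough illustration of the PRT idea, this is how the "commit only the tiles you need" pattern looks through OpenGL's ARB_sparse_texture extension, which exposes the same GCN feature on PC. A minimal sketch: it assumes a GL 4.x context with the extension present, and tilePixels is a hypothetical buffer of tile data prepared elsewhere.

```cpp
// Sketch only: assumes a GL context with ARB_sparse_texture loaded
// (e.g. via GLEW) and that tilePixels points at pageW x pageH of RGBA8 data.
#include <GL/glew.h>

void commitOneTile(const void* tilePixels) {
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);

    // Mark the texture sparse BEFORE allocating storage: this reserves a
    // 16K x 16K virtual layout (1GB of RGBA8) without committing memory.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SPARSE_ARB, GL_TRUE);
    glTexStorage2D(GL_TEXTURE_2D, 1, GL_RGBA8, 16384, 16384);

    // Ask the driver for the hardware page (tile) size for this format.
    GLint pageW = 0, pageH = 0;
    glGetInternalformativ(GL_TEXTURE_2D, GL_RGBA8,
                          GL_VIRTUAL_PAGE_SIZE_X_ARB, 1, &pageW);
    glGetInternalformativ(GL_TEXTURE_2D, GL_RGBA8,
                          GL_VIRTUAL_PAGE_SIZE_Y_ARB, 1, &pageH);

    // Commit physical memory for a single tile, then upload just that tile.
    glTexPageCommitmentARB(GL_TEXTURE_2D, 0, 0, 0, 0, pageW, pageH, 1, GL_TRUE);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, pageW, pageH,
                    GL_RGBA, GL_UNSIGNED_BYTE, tilePixels);
    // Everything outside the committed pages costs no memory until needed,
    // which is why a small fast pool can go further than its raw size suggests.
}
```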



I can't possibly quantify how much it will help performance. All I do know for certain is that it absolutely will. Keep in mind I'm not saying this to compare it to the PS4. The PS4 is simply stronger. I'm just saying that the One will be quite a bit more capable in the overall graphics performance department than it's currently being given credit for. Virtual texturing in game engines in general will make that 32MB seem almost like the Pacific Ocean, which is the primary reason why I think those dismissing it as 'just 32MB' aren't looking at the full picture.

Drawing examples from how significant EDRAM was to 360 game development is important because it should give pause to anyone thinking the ESRAM can't be vital to overall performance. It's easy for it to be overshadowed by all the GDDR5, 1.8-teraflop talk, but all Microsoft consoles have been quite capable in the performance department. This time will be no different, because those engineers really know their shit.



This misses the larger point. Who cares what their primary reason was for putting it in there? The point is that it will have implications beyond simply providing more bandwidth, and experienced programmers have already clearly said as much. Some would like people to believe that the ESRAM can't be extremely useful both for its bandwidth benefits and for its latency. It isn't one or the other: there are bandwidth benefits, latency benefits, and benefits to existing development techniques.

There are performance benefits to the bandwidth, naturally, but there are also performance benefits to the ESRAM being low latency. In situations where there are cache misses and the required data isn't in the L1 or L2 caches for the One's GPU, developers will be happy that a trip to memory (for data that happens to be in the ESRAM) won't be anywhere near as expensive as it would otherwise have been with GDDR5, or even with the DDR3.

Would it have been simpler to have things the way Sony does with the PS4? Absolutely, and the Xbox One isn't as powerful as the PS4. But even so, there are some upsides to the architectural design of the Xbox One, something even a Sony developer acknowledges.

PRT lets you use parts from a larger texture by allowing you to load only those parts into memory, but what do you do when you're actually using more than 32MB of parts at a time? 32MB is still 32MB, and DDR3 is still going to be slow.
 
Sony seems to have hit all the right bases with the hardware design this generation.

In fact, one could argue it is EASILY the best designed console they've ever had.

That's because they learned from their past mistakes. We should also consider the reserved resources for the Xbox One: 3GB of RAM and 2 cores. I hope PS4 reserves no more than 512MB and all cores are available.
 
Bros, thanks for the spec blasts. As I said, objectively I get it.

My point wasn't to downplay a tech advantage. My point was that I'm intrigued by how similar the conventional wisdom is right now compared to the last time we approached a generational transition--before we had almost a decade's worth of hindsight on what the machines' actual performance and bottlenecks were (free AA on the 360, lol).

I know there's a power imbalance, I'm excited to see how it'll manifest in the games I play.

Why do people think this is going to be a repeat of history, when console history has never repeated itself?
If that's directed at me, I don't think it'll be a repeat at all. I think things like DRM / required online / price / and most importantly games will determine what happens.
 
I wonder why Sony didn't go for non-proprietary hardware before this.

Could you imagine how much more dominant they would've been this generation if they DIDN'T go with the current PS3 design?
 
That's because they learned from their past mistakes. We should also consider the reserved resources for the Xbox One: 3GB of RAM and 2 cores. I hope PS4 reserves no more than 512MB and all cores are available.
The 3GB is likely a safety net; they can always reduce it in the future (like Sony did this generation). I'm still not sure I understand why it's such a huge amount though.

As for PS4 reserving no cores and only 512MB, that seems a little short-sighted in the opposite direction. It then completely limits the background tasks Sony can implement on PS4 in future. Better to reserve early, and free up cores and RAM throughout the cycle. If you don't reserve early on then there's no going back.
 
I think people need to realize that the 360 and PS3 games are looking awesome right now, and now they are at a minimum adding a 10x increase in performance... it's gonna be a sweet next-gen upgrade.
 

LCGeek

formerly sane
I think people need to realize that the 360 and PS3 games are looking awesome right now, and now they are at a minimum adding a 10x increase in performance... it's gonna be a sweet next-gen upgrade.

I like your point, but the features of the CPUs and GPUs alone are a huge jump for gaming. Same for the consoles forcing most of the gaming ecosystem into 64-bit games. This generation is gonna be sweet for anyone on these systems or PC.
 
I think people need to realize that the 360 and PS3 games are looking awesome right now, and now they are at a minimum adding a 10x increase in performance... it's gonna be a sweet next-gen upgrade.


This. Look at what Naughty Dog and Quantic Dream were able to achieve in TLoU, Uncharted 3, and Beyond: Two Souls.
 
In terms of flops, the Xbone is, I believe, equidistant from the PS4 and Wii U.

So

Sony >> Xbone >> Wii U

Where I believe each > is 300 GFLOPS.

I believe that's accurate.

Though I do think Wii U is slightly closer to PS360 at around 500 GFLOPS (320 + its unknown GPU hardware).
 

mrklaw

MrArseFace
The 3gb is likely a safety net, they can always reduce it in the future (like Sony did this generation). I'm still not sure I understand why it's such a huge amount though.

As for PS4 not reserving any cores and only reserving 512mb, that seems a little short-sighted to the opposite. It then completely limits the background tasks Sony can implement on PS4 in future. Better to reserve early, and free up cores and RAM throughout the cycle. If you don't reserve early on then there's no going back.

Sure, but for OS reservation remember that Sony were perhaps only counting on 4GB (with a hope to stretch to 8). So the OS was probably designed around 512MB (1GB would be a huge amount if they were still at 4GB).

perhaps they've upped it to 1GB now to give themselves some breathing space but they might not need it.
 
The 3gb is likely a safety net, they can always reduce it in the future (like Sony did this generation). I'm still not sure I understand why it's such a huge amount though.

As for PS4 not reserving any cores and only reserving 512mb, that seems a little short-sighted to the opposite. It then completely limits the background tasks Sony can implement on PS4 in future. Better to reserve early, and free up cores and RAM throughout the cycle. If you don't reserve early on then there's no going back.

They already have specialized hardware for that: a dedicated chip for video encoding, one for audio, and I think also an ARM chip for background tasks. It makes much more sense than wasting entire cores that could be used for games. Let's say PS4 games use all 8 cores; it would make a lot of difference, as it is a lot of additional power.
 
Sure, but for OS reservation remember that Sony were perhaps only counting on 4GB (with a hope to stretch to 8). So the OS was probably designed around 512MB (1GB would be a huge amount if they were still at 4GB).

perhaps they've upped it to 1GB now to give themselves some breathing space but they might not need it.

And if they designed the system with 512MB reserved for the OS, it means the actual OS used a lot less than that, probably around 200-250MB. You always reserve more than what you need.
 

LCGeek

formerly sane
They already have specialized hardware for that: a dedicated chip for video encoding, one for audio, and I think also an ARM chip for background tasks. It makes much more sense than wasting entire cores that could be used for games. Let's say PS4 games use all 8 cores; it would make a lot of difference, as it is a lot of additional power.

Agreed.

I just expected a gaming machine, but them going out of their way to do this and do it right says a lot. Even on my PC it's still a pain to set up recording, and it has a nasty performance hit.
 
PRT lets you use parts from a larger texture by allowing you to load only those parts into memory, but what do you do when you're actually using more than 32MB of parts at a time? 32MB is still 32MB, and DDR3 is still going to be slow.

Nothing prevents data from temporarily being stored inside the DDR3 before it's moved from DDR3 to ESRAM using the Xbox One's move engines. The move engines play a crucial role here in properly utilizing the ESRAM.

They also use notably fewer resources than if you were to use a shader to make the copy, which is good for saving the bandwidth for other things. You will always be using more than 32MB at any time in a Durango game, but that's the point of being able to use both pools simultaneously, or being able to make effective use of the move engines. Use ESRAM for specific tasks and use the DDR3 for others, and when you need new data in the ESRAM, let the move engines take care of that in parallel with other operations. Under this scenario, the ESRAM's bandwidth will come in handy to help the DDR3, which is obviously slower. At the end of the day, I think you have the makings of a very good and efficient design on Xbox One that will produce some incredible-looking games.
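To make the overlap concrete, here's a rough double-buffering sketch. The MoveEngine type below is made up for illustration (the actual Durango SDK interface isn't public); the point is the pattern of letting DMA copies run in parallel with GPU work so the 32MB acts as a sliding window over DDR3:

```cpp
#include <cstddef>
#include <cstdint>
#include <cstring>

// Hypothetical stand-ins, NOT real SDK names. The stub MoveEngine just
// memcpy's synchronously so the sketch compiles; the real DMA hardware
// would perform the copy in parallel with the GPU work issued below.
struct MoveEngine {
    void copyAsync(void* dst, const void* src, size_t n) { std::memcpy(dst, src, n); }
    void wait() {} // block until the queued copy has finished
};

static void renderFromEsram(uint8_t* /*tile*/, size_t /*bytes*/) {
    // placeholder: GPU consumes the tile resident in the fast pool
}

void processTiles(MoveEngine& dma, uint8_t* ddr3, uint8_t* esram,
                  size_t tileBytes, int tileCount) {
    // Two slots in ESRAM: one being rendered from, one being filled.
    uint8_t* slot[2] = { esram, esram + tileBytes };

    dma.copyAsync(slot[0], ddr3, tileBytes); // prefetch tile 0
    for (int i = 0; i < tileCount; ++i) {
        dma.wait(); // tile i is now resident in ESRAM
        if (i + 1 < tileCount) // queue the next copy before rendering
            dma.copyAsync(slot[(i + 1) & 1],
                          ddr3 + size_t(i + 1) * tileBytes, tileBytes);
        renderFromEsram(slot[i & 1], tileBytes); // overlaps with the DMA copy
    }
}
```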

It is never about how one console's specs match up against another's; it's more important to ask what the hardware will allow developers to do.

DDR3 is slow in comparison to GDDR5.
Split pools are less easy to work with compared to a completely unified 8GB.
It's better to have a more powerful 1.8-teraflop GPU instead of a weaker 1.2-teraflop GPU.
It's better to have 7GB of usable RAM instead of 5GB.

However, as it concerns Xbox One developers, none of these things should matter. It's their job to make games, not comparisons. They will simply try to make the best game that they possibly can on the available hardware. As long as they do that to the best of their ability, it won't matter how the two consoles stack up. The Xbox One is weaker, but it's a nice design that I think is really well put together. Left out of the equation is the very powerful dedicated audio chip on the Xbox One, which will cut down on the amount of CPU resources required for audio compared to the Xbox 360, where audio sometimes took as much as 3 threads across two cores.
 

Mitchings

Banned
I think in terms of real world differences...

Most multiplatform titles on the PS4 will likely have somewhat tighter performance & image quality; like most third-party titles this gen, only with the roles reversed and the gap a little more prominent.

But I have little doubt that there will be a considerable difference in the technical quality of exclusive, first/second-party content, with PS4 pulling away considerably in that respect once the second generation of titles hit.
 

grumble

Member
I think in terms of real world differences...

Most multiplatform titles on the PS4 will likely have somewhat tighter performance & image quality; like most third-party titles this gen, only with the roles reversed and the gap a little more prominent.

But I have little doubt that there will be a considerable difference in the technical quality of exclusive, first/second-party content, with PS4 pulling away considerably in that respect once the second generation of titles hit.

If Sony is smart, they'll assist a bit with third-party titles on the PS4 to ensure that the differences are as noticeable as possible, especially early on.

Should also be a bigger difference than last gen and more consistent.
 
If Sony is smart, they'll assist a bit with third-party titles on the PS4 to ensure that the differences are as noticeable as possible, especially early on.

Should also be a bigger difference than last gen and more consistent.

They shouldn't have to, not with the architectures being what they are. I'm in the camp that doesn't think the differences will be that drastic, but I guess we'll have to see. We won't know for sure till we see more games.
 

Mitchings

Banned
If Sony is smart, they'll assist a bit with third-party titles on the PS4 to ensure that the differences are as noticeable as possible, especially early on.

Should also be a bigger difference than last gen and more consistent.

Definitely.

This time that extra potential isn't 'exotic' in nature, so there are no real excuses other than those of time/finance.
 

Mitchings

Banned
Does anyone know if the Xbox One's DDR3 UMA will act as a unified address space in the same manner as the PS4's GDDR5 UMA... with the same piece of data being accessible by the CPU, GPU, etc., without any need for 'walling off' and subsequently having to duplicate said data?
 

Rumba

Banned
Nothing prevents data from temporarily being stored inside the DDR3 before it's moved from DDR3 to ESRAM using the Xbox One's move engines. The move engines play a crucial role here in properly utilizing the ESRAM.

They also use notably fewer resources than if you were to use a shader to make the copy, which is good for saving the bandwidth for other things. You will always be using more than 32MB at any time in a Durango game, but that's the point of being able to use both pools simultaneously, or being able to make effective use of the move engines. Use ESRAM for specific tasks and use the DDR3 for others, and when you need new data in the ESRAM, let the move engines take care of that in parallel with other operations. Under this scenario, the ESRAM's bandwidth will come in handy to help the DDR3, which is obviously slower. At the end of the day, I think you have the makings of a very good and efficient design on Xbox One that will produce some incredible-looking games.

It is never about how one console's specs match up against another's; it's more important to ask what the hardware will allow developers to do.

DDR3 is slow in comparison to GDDR5.
Split pools are less easy to work with compared to a completely unified 8GB.
It's better to have a more powerful 1.8-teraflop GPU instead of a weaker 1.2-teraflop GPU.
It's better to have 7GB of usable RAM instead of 5GB.

However, as it concerns Xbox One developers, none of these things should matter. It's their job to make games, not comparisons. They will simply try to make the best game that they possibly can on the available hardware. As long as they do that to the best of their ability, it won't matter how the two consoles stack up. The Xbox One is weaker, but it's a nice design that I think is really well put together. Left out of the equation is the very powerful dedicated audio chip on the Xbox One, which will cut down on the amount of CPU resources required for audio compared to the Xbox 360, where audio sometimes took as much as 3 threads across two cores.

PS4 has an extra bus that can be used to bypass its GPU's L1/L2 cache and directly access memory, and Cerny said it can do 20 GB/s.

Also, PS4 has dedicated hardware for decoding audio streams as well.
 

Biker19

Banned
I can't shake a feeling of deja vu.

Objectively looking at the specs, I understand that the PS4 is more powerful than the XBONE.

Subjectively however, this feels exactly like the conversations we were having around E3 2006. PS3's tech superiority, storage of BluRay, power of the CELL, flops, rops, orders of magnitude, etc. To my eyes, that speculated power-gap never materialized in any meaningful way, and the 360 walked away with the lion's share of better multiplats, which conventional wisdom wouldn't have predicted.

Like others said, there's a big difference between PS3 vs. Xbox 360 and Xbox One vs. PS4. The reason the Xbox 360 had mostly better multiplats is that the 360 was very easy for 3rd parties to develop for, unlike the PS3 (PS3 had very rough multiplats until like 2009, when 3rd party developers were getting used to the hardware). The great power definitely showed in 1st/2nd party games like Uncharted 2, which blows away most 1st/2nd party games & multiplats graphically on the Xbox 360.

PS4, however, is much different, in that they're ditching the Cell processor & going with the x86-64 architecture from PC for easier development.
 

RoboPlato

I'd be in the dick
PS4 has an extra bus that can be used to bypass its GPU's L1/L2 cache and directly access memory, and Cerny said it can do 20 GB/s.

Also, PS4 has dedicated hardware for decoding audio streams as well.

Yeah, the bus customizations on PS4 are pretty extensive to help with latency, and we don't know what the actual latency of the PS4's GDDR5 is. It can vary quite a bit and is fairly customizable if you have the voltage to spare. I've seen several people on B3D say that latency is a complete non-issue in comparing the two systems.
 
Nothing prevents data from temporarily being stored inside the DDR3 before it's moved from DDR3 to ESRAM using the Xbox One's move engines. The move engines play a crucial role here in properly utilizing the ESRAM.

Sure, except you use 3 times as much bandwidth moving data from DDR3 to be read from ESRAM than it would take to just read it from DDR3 (a DDR3 read, plus an ESRAM write, plus an ESRAM read, versus a single DDR3 read). It only makes sense in cases where you are going to read that data many times; otherwise it is both faster and more efficient to simply read from DDR3.
 
So we know the PS4 GPU probably has about 50-60% more raw power, depending on how much the OS reserves. What kind of performance difference is that going to make? I'm familiar with PC cards, where the same architecture with a 50% spec bump usually gives about 25-30% more FPS.

With that in mind I could see a PS4 multiplatform game running at 1080p 40 FPS while running at 30 FPS on Xbox One.
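That 25-30% figure falls out of a simple Amdahl-style frame-time model: only part of the frame scales with GPU throughput. A toy calculation (the 70% GPU-bound share is an assumed, illustrative number, not a measurement):

```cpp
#include <cstdio>

// If a fraction f of frame time scales with GPU throughput and the rest
// (CPU, bandwidth, sync, etc.) doesn't, speeding the GPU up by s gives:
double scaled_fps(double base_fps, double s, double f) {
    double base_ms = 1000.0 / base_fps;
    return 1000.0 / (base_ms * ((1.0 - f) + f / s));
}

int main() {
    printf("fully GPU-bound: %.1f FPS\n", scaled_fps(30.0, 1.5, 1.0)); // 45.0
    printf("70%% GPU-bound:   %.1f FPS\n", scaled_fps(30.0, 1.5, 0.7)); // ~39.1
    return 0;
}
```

Which is roughly the 30-vs-40 FPS scenario described above.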
 
So we know the PS4 GPU probably has about 50-60% more raw power, depending on how much the OS reserves. What kind of performance difference is that going to make? I'm familiar with PC cards, where the same architecture with a 50% spec bump usually gives about 25-30% more FPS.

With that in mind I could see a PS4 multiplatform game running at 1080p 40 FPS while running at 30 FPS on Xbox One.

FPS difference is likely going to be 50%-100%, depending on the game, from GPU comparisons that were posted here earlier.

Advantage is that PS4 also enjoys substantially larger bandwidth and better memory architecture in general.
 

mkotechno

Banned
Wow, with that power gap, MS is sure to release this console at $299!!!
And Sony will release theirs no higher than $399!!!
It's everyone's wish come true!
Too bad the real pricing for XBOX One will be $399.99 and PS4 will be $449.99.

All analysts suggest PS4 will be cheaper. BOOM!
 
The 128MB acts as a cache for the CPU and GPU!
What do you think it is connected to the GPU for? Because the main DDR3 RAM does not have enough bandwidth for the GPU!

It only has value for Haswell due to 1) it acting as a cache for the CPU and 2) it acting as a cache for the GPU to try and offset the lack of bandwidth.

The way you say this makes it sound like the ESRAM can't have any value for the Xbox One if it isn't being used as a cache, which flies in the face of the fact that EDRAM, which helped the Xbox 360 so tremendously throughout its life, also wasn't used as a cache either.

The Haswell 128MB of EDRAM acting as a cache for the CPU as well as the GPU is a side benefit of the way Intel designed it, but historically in consoles EDRAM need not ever be a cache, much less a cache that works for both the CPU and the GPU in order to still provide meaningful benefits to performance. The PS2 had EDRAM, the Xbox 360 had EDRAM, I believe the Nintendo consoles for years have had a combination of EDRAM and 1T-SRAM (not true SRAM like what the Xbox One has, but an EDRAM variant instead), and in all cases -- especially stressing the PS2's EDRAM and the Xbox 360's EDRAM implementation -- the performance benefits have been real and meaningful.

The Xbox One's ESRAM has some pretty nice benefits over the EDRAM on the 360. It has none of the primary drawbacks of the 360's EDRAM, and on top of that it has even lower latency. Not needing to be refreshed helps with regard to its latency.

http://en.wikipedia.org/wiki/Memory_refresh

Computer memory that does not require refreshing is available, called static random access memory (SRAM).[2] SRAM circuits take up more room on the semiconductor chip, because each SRAM memory cell requires 4 - 6 transistors, compared to a single transistor and a capacitor for DRAM. For this reason the storage capacity of SRAM chips is much less than DRAM, so SRAM memory is more costly per bit. Therefore DRAM is used for the main memory in computers, video games, and most other large uses of semiconductor memory. The need for extra circuitry to perform memory refresh makes DRAM circuits and their timing significantly more complicated than SRAM circuits, but the great advantages of DRAM in density and cost justify this complexity.

A post made on this forum earlier about the possible benefits of low latency ESRAM.

http://www.neogaf.com/forum/showpost.php?p=50467425&postcount=495

I actually like Durango's architecture from the rumors we see, it sounds like a PS2 on steroids somewhat. Very low latency ESRAM might make for some algorithms which will not work well on PC's and more PC like architectures (like GS's very fast bandwidth and ultra low overhead for some operations which are expensive on just about any other GPU out there thanks to its design), but the system will need more care from programmers than a system providing greater bandwidth with a single pool of memory. Both systems look fun to program for from the outside.

I think developers will find interesting ways to use the One's hardware. I for one am dying to see what Rare, 343i, Turn 10, and Remedy do with the machine. I should probably toss Lionhead in there, too, as I think they may return to form on the Xbox One. I don't think they were particularly at their best on the 360.

All analysts suggest PS4 will be cheaper. BOOM!

I fucking wish :D
 

onQ123

Member
The way you say this makes it sound like the ESRAM can't have any value for the Xbox One if it isn't being used as a cache, which flies in the face of the fact that EDRAM, which helped the Xbox 360 so tremendously throughout its life, also wasn't used as a cache either.

The Haswell 128MB of EDRAM acting as a cache for the CPU as well as the GPU is a side benefit of the way Intel designed it, but historically in consoles EDRAM need not ever be a cache, much less a cache that works for both the CPU and the GPU in order to still provide meaningful benefits to performance. The PS2 had EDRAM, the Xbox 360 had EDRAM, I believe the Nintendo consoles for years have had a combination of EDRAM and 1T-SRAM (not true SRAM like what the Xbox One has, but an EDRAM variant instead), and in all cases -- especially stressing the PS2's EDRAM and the Xbox 360's EDRAM implementation -- the performance benefits have been real and meaningful.

The Xbox One's ESRAM has some pretty nice benefits over the EDRAM on the 360. It has none of the primary drawbacks of the 360's EDRAM, and on top of that it has even lower latency. Not needing to be refreshed helps with regard to its latency.

http://en.wikipedia.org/wiki/Memory_refresh



A post made on this forum earlier about the possible benefits of low latency ESRAM.

http://www.neogaf.com/forum/showpost.php?p=50467425&postcount=495



I think developers will find interesting ways to use the One's hardware. I for one am dying to see what Rare, 343i, Turn 10, and Remedy do with the machine. I should probably toss Lionhead in there, too, as I think they may return to form on the Xbox One. I don't think they were particularly at their best on the 360.



I fucking wish :D

About the PS2 on steroids comment: I don't think so. PS2's embedded VRAM was 1/8 the size of its main memory & had 15x the bandwidth.

If it was like a PS2 on steroids, we would be looking at a console with 1GB of embedded VRAM at 1.02TB/s bandwidth, matched up with the 8GB of 68GB/s main memory.
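Running the numbers on that (PS2 paired 4MB of eDRAM at ~48GB/s with 32MB of main RAM at ~3.2GB/s, hence the 1/8 and 15x ratios):

```cpp
#include <cstdio>

int main() {
    // PS2 ratios: eDRAM was 1/8 the size of main RAM and ~15x its bandwidth.
    const double size_ratio = 1.0 / 8.0, bw_ratio = 15.0;
    const double main_gb = 8.0, main_gbps = 68.0; // Durango main memory
    printf("PS2-scaled embedded pool: %.0f GB at %.2f TB/s\n",
           main_gb * size_ratio, main_gbps * bw_ratio / 1000.0); // 1 GB, 1.02 TB/s
    // versus the actual 32MB of ESRAM, i.e. 1/256 of main memory.
    return 0;
}
```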
 

TheD

The Detective
The way you say this makes it sound like the ESRAM can't have any value for the Xbox One if it isn't being used as a cache, which flies in the face of the fact that EDRAM, which helped the Xbox 360 so tremendously throughout its life, also wasn't used as a cache either.

The Haswell 128MB of EDRAM acting as a cache for the CPU as well as the GPU is a side benefit of the way Intel designed it, but historically in consoles EDRAM need not ever be a cache, much less a cache that works for both the CPU and the GPU in order to still provide meaningful benefits to performance. The PS2 had EDRAM, the Xbox 360 had EDRAM, I believe the Nintendo consoles for years have had a combination of EDRAM and 1T-SRAM (not true SRAM like what the Xbox One has, but an EDRAM variant instead), and in all cases -- especially stressing the PS2's EDRAM and the Xbox 360's EDRAM implementation -- the performance benefits have been real and meaningful.

The Xbox One's ESRAM has some pretty nice benefits over the EDRAM on the 360. It has none of the primary drawbacks of the 360's EDRAM, and on top of that it has even lower latency. Not needing to be refreshed helps with regard to its latency.

http://en.wikipedia.org/wiki/Memory_refresh



A post made on this forum earlier about the possible benefits of low latency ESRAM.

http://www.neogaf.com/forum/showpost.php?p=50467425&postcount=495



I think developers will find interesting ways to use the One's hardware. I for one am dying to see what Rare, 343i, Turn 10, and Remedy do with the machine. I should probably toss Lionhead in there, too, as I think they may return to form on the Xbox One. I don't think they were particularly at their best on the 360.



I fucking wish :D

I did not say that. I said that the reason some versions of Haswell have it is that the main RAM bandwidth is too low for a GPU like the one in some versions of Haswell, and that it also acts as a cache for the CPU.

If Haswell could instead use a higher-bandwidth memory bus without needing more DIMMs or GDDR5, then I very much doubt Intel would have given it the EDRAM.
 

kitch9

Banned
I can't shake a feeling of deja vu.

Objectively looking at the specs, I understand that the PS4 is more powerful than the XBONE.

Subjectively however, this feels exactly like the conversations we were having around E3 2006. PS3's tech superiority, storage of BluRay, power of the CELL, flops, rops, orders of magnitude, etc. To my eyes, that speculated power-gap never materialized in any meaningful way, and the 360 walked away with the lion's share of better multiplats, which conventional wisdom wouldn't have predicted.

I'm excited to see all this new tech in action, and to see how all of the plans / power/ promises pan out. We all win either way.


The PS3 was a pain in the ass to code for, the tools were crap, and it had significantly less usable RAM, which was split sub-optimally.

Conventional wisdom should have suggested it was always going to struggle, and it did.
 

mrklaw

MrArseFace
A post made on this forum earlier about the possible benefits of low latency ESRAM.

http://www.neogaf.com/forum/showpost.php?p=50467425&postcount=495



I think developers will find interesting ways to use the One's hardware. I for one am dying to see what Rare, 343i, Turn 10, and Remedy do with the machine. I should probably toss Lionhead in there, too, as I think they may return to form on the Xbox One. I don't think they were particularly at their best on the 360.

Panajev appreciates it from an interesting architectural PoV, but also points out that it would take effort and care to get the most from it. That didn't work so well for PS2/PS3 (PS2 got away with it through market dominance but it wasn't easy to work with)

A lower powered machine that is the 'odd one out' in terms of architecture has the risk of not having optimised ports - especially if tools are less mature and PC/PS4 are really straightforward to work on. Basically an almost exact reverse of 360/PS3
 
The PS3 was a pain in the ass to code for, the tools were crap, and it had significantly less usable RAM, which was split sub-optimally.

Conventional wisdom should have suggested it was always going to struggle, and it did.

Maybe I'm misremembering, but I think quite a few people thought the PS3's design was going to be its own worst enemy. They were just kinda drowned out by an equal number that optimistically (read as: naively) thought devs were going to get excited about learning the secret ways to unlock all that power.

360 won the multiplatform war by being straightforward. That the PS4 looks to be straightforward AND a powerhouse is pretty exciting to me.
 
LOL @ people thinking 50% is nothing to worry about.

Look at the difference in performance between generations of PC graphics cards: cards in the same line are usually 10-20% more powerful, e.g. GTX 480 > 580 > 680 > 780.

50% is a big deal, almost 2 generations or more ahead in terms of sheer power.
 