Oh, PS3 had flop potential, let me tell you!
HO HO!
I can't shake a feeling of deja vu.
Objectively looking at the specs, I understand that the PS4 is more powerful than the XBONE.
Subjectively, however, this feels exactly like the conversations we were having around E3 2006: PS3's tech superiority, the storage of Blu-ray, the power of the Cell, flops, ROPs, orders of magnitude, etc. To my eyes, that speculated power gap never materialized in any meaningful way, and the 360 walked away with the lion's share of better multiplats, which conventional wisdom wouldn't have predicted.
I'm excited to see all this new tech in action, and to see how all of the plans / power / promises pan out. We all win either way.
The difference this gen is that the Xbox and Playstation are built on the exact same core architecture. So for the first time ever, the FLOPS of both are true, meaningful figures. The ~600GFLOPS advantage the PS4 has isn't smoke and mirrors.
It's not like this gen, where one had the better GPU and the other the better CPU. The only difference (other than GPU power) between the Xbone and PS4 is memory.
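As a sanity check on that "~600GFLOPS" figure: peak FP32 throughput for a GCN part is just shader count × clock × 2 ops per clock. A quick sketch using the widely reported (unofficial at the time) shader counts and clocks:

```python
# Peak FP32 throughput for a GCN GPU: shaders * clock * 2 (one FMA
# counts as two floating-point ops). The shader counts and clocks below
# are the widely reported mid-2013 figures, not official specs.
def gflops(shaders, clock_mhz, ops_per_clock=2):
    return shaders * clock_mhz * ops_per_clock / 1000.0

ps4 = gflops(1152, 800)    # ~1843 GFLOPS (~1.84 TFLOPS)
xbone = gflops(768, 800)   # ~1229 GFLOPS (~1.23 TFLOPS)
print(f"gap: {ps4 - xbone:.0f} GFLOPS")  # ~614, i.e. the "~600" figure
```

Because both GPUs are the same architecture, this arithmetic actually compares like with like, which was never true of RSX vs. Xenos.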
No, it is nothing like the PS3 vs 360!
All of the extra power the PS3 had was tied up in the SPEs on the Cell. The RSX had just about no advantages over Xenos (processing power was about the same, the number of ROPs is the same, etc.).
This time, on the other hand, the extra power of the PS4 isn't in something odd like the SPEs; it's in a GPU that shares roughly the same architecture as the Xbone's GPU.
Biggest difference is that it's an even more direct comparison than PS3 vs. 360.
The CPU and GPU are both provided by AMD, and the PS4 is easy to program for from the get-go: you don't have a highly specialized Cell CPU to contend with, a split memory architecture versus a unified one (360), or even different GPU architectures of Nvidia versus AMD (dedicated versus unified shaders). It's a much more direct comparison: the exact same CPU, just a faster GCN GPU design, period.
Not trying to insult you, but this is like the 20-millionth time someone has brought up PS3/360 while completely forgetting, not understanding, or not knowing that:
1. The PS3's GPU is weaker than the 360's
2. The PS3 was much harder to program for than the 360
3. The 360 shared much with PC development
Compared to this generation in which
1. Both consoles share the same architecture*
2. Both consoles share the same generation of GPU (and featureset derived from said GPU generation)
3. Both consoles share near identical development environment
4. PS4 at this point unarguably has the more powerful GPU, and both consoles share the same CPU
Actually, on a GTX 460, you get ~50 nanoseconds effective latency in L1. I really think you underestimate the general magnitude of latency on memory accesses from GPUs compared to CPUs. L1 and L2 are both likely lower than that mark; at least L1 certainly has to be. GPUs are designed to deal with latency, but latency is still very much a high priority deeper down in the memory architecture. It's the GDDR5 that's allowed to be higher latency; the L1 data cache, the L2 cache, and the shared L1 between every 4 compute units are all very low latency.
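For reference, that ~50ns figure converts to surprisingly few GPU cycles at the ~800MHz these console GPUs are reported to run at (the clock is an assumption, not an official spec):

```python
# Convert an effective latency in nanoseconds to GPU clock cycles.
# The 800 MHz clock is the commonly reported figure for these consoles.
def ns_to_cycles(ns, clock_mhz):
    return ns * clock_mhz / 1000.0

print(ns_to_cycles(50, 800))  # 40.0 cycles
```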
I think in the grand scheme of things, based on what has been said, it will provide a very helpful boost to performance for developers, enough to be considered very significant to Xbox One game development. Then when you consider very useful GCN features such as PRT (Partial Resident Textures), where you can load part of a texture into memory as opposed to the entire thing, suddenly that 32MB starts to seem a lot bigger and more useful than some might've originally expected.
http://www.anandtech.com/show/4455/amds-graphics-core-next-preview-amd-architects-for-compute/5
I can't possibly quantify how much it will help performance. All I do know for certain is that it absolutely will. Keep in mind, when I say this, I'm not saying it to compare the One to the PS4. The PS4 is simply stronger. I'm just saying that the One will be quite a bit more capable in the overall graphics performance department than it's currently being given credit for. Virtual texturing in game engines in general will make that 32MB seem almost like the Pacific Ocean, which is the primary reason why I think those dismissing it as 'just 32MB' aren't looking at the full picture.
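A sketch of the partial-residency idea: only the tiles a frame actually samples get loaded, so resident memory is a fraction of the full texture. The 64KB tile size matches common PRT descriptions, but the bookkeeping here is a simplified illustration, not AMD's actual implementation:

```python
# Illustrative partial texture residency: track which tiles a frame
# touches and load only those. 64 KiB tiles per common PRT descriptions;
# the tracking scheme itself is invented for this sketch.
TILE_BYTES = 64 * 1024

def resident_bytes(needed_tiles):
    """Memory consumed when only the distinct needed tiles are resident."""
    return len(set(needed_tiles)) * TILE_BYTES

# A 4096x4096 RGBA8 texture is 64 MiB fully loaded...
full_texture = 4096 * 4096 * 4
# ...but if a frame only samples, say, 40 distinct tiles of it:
partial = resident_bytes(range(40))
print(full_texture // 2**20, "MiB full vs", partial / 2**20, "MiB resident")
```

With dozens of large textures in flight, the difference between full residency and tile residency is what makes a small fast pool like the ESRAM look much bigger than its raw capacity.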
Drawing examples from how significant EDRAM was to 360 game development is important because it should give pause to anyone thinking the ESRAM can't be vital to overall performance. It's easy for it to be overshadowed in all the GDDR5, 1.8 teraflops talk, but all Microsoft consoles have been quite capable in the performance department. This time will be no different, because those engineers really know their shit.
This misses the larger point. Who cares what their primary reason was for putting it in there? The point is that it will have implications beyond simply providing more bandwidth, and experienced programmers have already said as much. Some would like people to believe that the ESRAM can't be extremely useful both for its bandwidth and for its latency, but it isn't one or the other: there are bandwidth benefits, latency benefits, and benefits to existing development techniques. In situations where there's a cache miss and the required data isn't in the L1 or L2 caches of the One's GPU, developers will be happy that a trip out to memory (for data that happens to be in the ESRAM) won't be anywhere near as expensive as it would otherwise have been with GDDR5, or even with the DDR3. Would it have been simpler to have things the way Sony does with the PS4? Absolutely, but the engineers made their decisions, and the Xbox One isn't as powerful as the PS4. Even so, there are some upsides to the architectural design of the Xbox One, something even a Sony developer acknowledges.
Sony seems to have hit all the right bases with the hardware design this generation.
In fact, one could argue it is EASILY the best designed console they've ever had.
Why do people think this is going to be a repeat of history, when console history has never repeated itself?
If that's directed at me, I don't think it'll be a repeat at all. I think things like DRM, required online, price, and most importantly games will determine what happens.
The 3GB is likely a safety net; they can always reduce it in the future (like Sony did this generation). I'm still not sure I understand why it's such a huge amount, though.
That's because they learned from their past mistakes. We should also consider the reserved resources for the Xbox One: 3GB of RAM and 2 cores. I hope the PS4 reserves no more than 512MB and all cores are available.
I think people need to realize that the 360 and PS3 games are looking awesome right now, and now they are at a minimum adding a 10x increase in performance... its gonna be a sweet next gen upgrade.
In terms of flops, the Xbone is, I believe, equidistant from the PS4 and Wii U.
So
Sony >> Xbone >> Wii U
Where I believe each > is 300 GFLOPS.
The 3gb is likely a safety net, they can always reduce it in the future (like Sony did this generation). I'm still not sure I understand why it's such a huge amount though.
As for PS4 not reserving any cores and only reserving 512mb, that seems a little short-sighted to the opposite. It then completely limits the background tasks Sony can implement on PS4 in future. Better to reserve early, and free up cores and RAM throughout the cycle. If you don't reserve early on then there's no going back.
Sure, but for the OS reservation, remember that Sony was perhaps only counting on 4GB (with a hope of stretching to 8). So the OS was probably designed around 512MB (1GB would be a huge amount if they were still at 4GB).
Perhaps they've upped it to 1GB now to give themselves some breathing space, but they might not need it.
They already have specialized hardware for that: a dedicated chip for video encoding, one for audio, and I think also an ARM chip for background tasks. That makes much more sense than wasting entire cores that could be used for games. If the PS4 can use all 8 cores, it would make a lot of difference, as that is a lot of additional power.
PRT lets you use parts of a larger texture by loading only those parts into memory, but what do you do when you're actually using more than 32MB of parts at a time? 32MB is still 32MB, and DDR3 is still going to be slow.
I think in terms of real world differences...
Most multiplatform titles on the PS4 will likely have somewhat tighter performance and image quality, like most third-party titles this gen, only with the roles reversed and the gap a little more prominent.
But I have little doubt that there will be a considerable difference in the technical quality of exclusive, first/second-party content, with PS4 pulling away considerably in that respect once the second generation of titles hit.
If Sony is smart, they'll assist a bit with third party titles on the ps4 to ensure that the differences are as noticeable as possible, especially early on
Should also be a bigger difference than last gen and more consistent.
Nothing prevents data from temporarily being stored inside the DDR3 before it's moved from DDR3 to ESRAM using the Xbox One's move engines. The move engines play a crucial role here in properly utilizing the ESRAM.
They also use notably fewer resources than if you were to use a shader to make the copy, which is good for saving the bandwidth for other things. You will always be using more than 32MB at any time in a Durango game, but that's the point of being able to use both pools simultaneously, and of making effective use of the move engines: use the ESRAM for specific tasks and the DDR3 for others, and when you need new data in the ESRAM, let the move engines take care of that in parallel with other operations. Under this scenario, the ESRAM's bandwidth will come in handy to help the DDR3, which is obviously slower. At the end of the day, I think you have the makings of a very good and efficient design on Xbox One that will produce some incredible looking games.
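The "let the move engines fill ESRAM in parallel" idea is essentially double-buffered streaming. A conceptual sketch (all names here are invented for illustration; nothing below is Microsoft's actual SDK):

```python
# Conceptual double-buffered streaming into a small fast pool (ESRAM)
# from a large slow pool (DDR3). The point is that the copy of chunk
# N overlaps the GPU's work on chunk N-1, hiding the transfer cost.
ESRAM_BYTES = 32 * 2**20

def schedule(chunks, slot_bytes=ESRAM_BYTES // 2):
    """Split the fast pool into two slots: while the GPU consumes one,
    a move engine fills the other in parallel."""
    timeline = []
    for i, size in enumerate(chunks):
        assert size <= slot_bytes, "each chunk must fit in half of ESRAM"
        busy = f"GPU works on chunk {i - 1}" if i else "GPU idle (priming)"
        timeline.append(f"move engine fills slot {i % 2} with chunk {i}; {busy}")
    return timeline

for step in schedule([8 * 2**20] * 3):
    print(step)
```

Real engines juggle more than two regions and irregular sizes, but the ping-pong pattern is the core of why a 32MB pool can service a working set much larger than 32MB.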
It is never about how one console's specs match up against another's, it's more important to ask what will the hardware allow developers to do.
DDR3 is slow in comparison to GDDR5.
Split pools is less easy to work with compared to completely unified 8GB.
It's better to have a more powerful 1.8 teraflop GPU instead of a weaker 1.2 teraflop GPU.
It's better to have 7GB of usable ram instead of 5GB.
However, as far as Xbox One developers are concerned, none of these things should matter. It's their job to make games, not comparisons. They will simply try to make the best game that they possibly can on the available hardware, and as long as they do that to the best of their ability, it won't matter how the two consoles stack up. The Xbox One is weaker, but it's a nice design that I think is really well put together. Left out of the equation is the very powerful dedicated audio chip on the Xbox One, which will cut down on the amount of CPU resources required for audio compared to the Xbox 360, where audio sometimes took as much as 3 threads across two cores.
PS4 has an extra bus that can be used to bypass its GPU's L1/L2 cache and directly access memory, and Cerny said it can do 20 GB/s.
Also, PS4 has dedicated hardware for decoding audio streams as well.
Guessing it's needed for all the voice command stuff.
So we know the PS4's GPU probably has about 50-60% more raw power, depending on how much the OS reserves. What kind of performance difference is that going to make? I'm familiar with PC cards, where the same architecture with a 50% spec bump usually gives about 25-30% more FPS.
With that in mind I could see a PS4 multiplatform game running at 1080p 40 FPS while running at 30 FPS on Xbox One.
FPS difference is likely going to be 50%-100%, depending on the game, from GPU comparisons that were posted here earlier.
Advantage is that PS4 also enjoys substantially larger bandwidth and better memory architecture in general.
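The rough scaling above can be made explicit as arithmetic. The ~0.55 efficiency factor below just encodes the "50% spec bump → ~25-30% more FPS" rule of thumb from the post; it is an assumption, not a measured constant:

```python
# Crude FPS projection from a raw-throughput ratio. The efficiency
# factor (0.55) encodes the forum's "50% more specs -> ~25-30% more
# FPS" rule of thumb and is an assumption, not measured data.
def projected_fps(base_fps, power_ratio, efficiency=0.55):
    return base_fps * (1 + (power_ratio - 1) * efficiency)

# Xbox One title at 30 FPS, PS4 GPU with ~50% more raw throughput:
print(round(projected_fps(30, 1.5)))  # -> 38, near the "40 FPS" guess
```

In practice developers usually spend the surplus on resolution or effects rather than frame rate, which is why "1080p vs. 900p at the same 30 FPS" outcomes are just as plausible.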
Wow, with that power gap, MS is sure to release this console at $299!!!
And Sony will release theirs no higher than $399!!!
It's everyone's wish come true!
Too bad the real pricing for XBOX One will be $399.99 and PS4 will be $449.99.
The 128MB acts as a cache for the CPU and GPU!
What do you think it is connected to the GPU for? Because the main DDR3 RAM does not have enough bandwidth for the GPU!
It only has value for Haswell because 1. it acts as a cache for the CPU, and 2. it acts as a cache for the GPU to try to offset the lack of bandwidth.
Computer memory that does not require refreshing is available, called static random access memory (SRAM).[2] SRAM circuits take up more room on the semiconductor chip, because each SRAM memory cell requires 4-6 transistors, compared to a single transistor and a capacitor for DRAM. For this reason the storage capacity of SRAM chips is much less than DRAM, so SRAM memory is more costly per bit. Therefore, DRAM is used for the main memory in computers, video games, and most other large uses of semiconductor memory. The need for extra circuitry to perform memory refresh makes DRAM circuits and their timing significantly more complicated than SRAM circuits, but the great advantages of DRAM in density and cost justify this complexity.
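A quick back-of-the-envelope illustration of what the quote's transistors-per-cell figures imply for 32MB of on-die SRAM (a rough estimate; real arrays also need decoders, sense amps, and redundancy):

```python
# What 4T and 6T cells imply for a 32 MB on-die SRAM array.
# Rough estimate only: ignores decoders, sense amps, and redundancy.
BITS = 32 * 2**20 * 8  # 32 MiB expressed in bits
for t_per_cell in (4, 6):
    total = BITS * t_per_cell
    print(f"{t_per_cell}T cells: ~{total / 1e9:.2f} billion transistors")
# Even the 4T estimate is over a billion transistors for the array
# alone, which is why 32 MB of on-die SRAM is an expensive choice.
```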
I actually like Durango's architecture from the rumors we see; it sounds somewhat like a PS2 on steroids. Very low latency ESRAM might enable some algorithms that won't work well on PCs and more PC-like architectures (like the GS's very fast bandwidth and ultra-low overhead for some operations that are expensive on just about any other GPU out there, thanks to its design), but the system will need more care from programmers than a system providing greater bandwidth with a single pool of memory. Both systems look fun to program for from the outside.
All analysts suggest PS4 will be cheaper. BOOM!
The way you say this makes it sound like the ESRAM can't have any value for the Xbox One if it isn't being used as a cache, which flies in the face of the fact that EDRAM, which helped the Xbox 360 so tremendously throughout its life, also wasn't used as a cache either.
The Haswell 128MB of EDRAM acting as a cache for the CPU as well as the GPU is a side benefit of the way Intel designed it, but historically in consoles EDRAM need not ever be a cache, much less a cache that works for both the CPU and the GPU in order to still provide meaningful benefits to performance. The PS2 had EDRAM, the Xbox 360 had EDRAM, I believe the Nintendo consoles for years have had a combination of EDRAM and 1T-SRAM (not true SRAM like what the Xbox One has, but an EDRAM variant instead), and in all cases -- especially stressing the PS2's EDRAM and the Xbox 360's EDRAM implementation -- the performance benefits have been real and meaningful.
The Xbox One's ESRAM has some pretty nice benefits over EDRAM on the 360. It has none of the primary drawbacks of EDRAM on the 360. And then on top of that fact it has even lower latency. It not needing to be refreshed helps with regards to its latency.
http://en.wikipedia.org/wiki/Memory_refresh
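To put the 32MB figure in context, here's a rough sizing sketch of a typical 1080p render-target set (the formats are illustrative; real G-buffer layouts vary per engine):

```python
# Rough sizing of a 1080p render-target set against the 32 MB ESRAM.
# The formats chosen here are illustrative, not any engine's actual layout.
def rt_bytes(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel

W, H = 1920, 1080
targets = {
    "color (RGBA8)":   rt_bytes(W, H, 4),
    "depth (D24S8)":   rt_bytes(W, H, 4),
    "normals (RGBA8)": rt_bytes(W, H, 4),
}
total = sum(targets.values())
print(f"{total / 2**20:.1f} MiB of 32 MiB used")  # ~23.7 MiB: tight but workable
```

This is the same arithmetic that shaped 360 development, where the 10MB of EDRAM forced tiling at 1080p; the One's 32MB gives far more headroom, but fat G-buffers can still overflow it.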
A post made on this forum earlier about the possible benefits of low latency ESRAM.
http://www.neogaf.com/forum/showpost.php?p=50467425&postcount=495
I think developers will find interesting ways to use the One's hardware. I for one am dying to see what Rare, 343i, Turn 10, and Remedy do with the machine. I should probably toss Lionhead in there, too, as I think they may return to form on the Xbox One. I don't think they were particularly at their best on the 360.
I fucking wish!
The PS3 was a pain in the ass to code for, the tools were crap, and it had significantly less usable RAM, which was split sub-optimally.
Conventional wisdom should have suggested it was always going to struggle and it did.