Originally Posted by lightchris:
"The statement completely ignores the fact that this high bandwidth is not only more difficult to achieve but also only applies to the tiny 32 MB space. The Xbox One might have an advantage in some special cases, but for the most part the PS4's solution is without doubt the faster one (and that's regardless of how much optimization you do). Also, I wouldn't talk about 'dev laziness'. It's all a cost-benefit equation."

I expected it to be a silly idea. It sounds like he is saying Xbox exclusives will be better, but for multiplats it's dependent on the developer to max out and use the "better" hardware.
"A friend posted this on Facebook, copied from an IGN forum, and I thought it might be a bit nonsensical, but I lack the technical knowledge to debunk it. Would ya guys mind? I'm even a bit confused at what's being said: faster and better but harder to code for?"

It's all meaningless drivel. The Bone is worse than the PS4 in terms of computational power by a third. No amount of eSRAM will make up for that kind of gap.
"You have a little more work to do with the X1. It is not about difficulty of development. It is about time and laziness. Can you trust developers to not be lazy on the X1 when PS4 doesn't require as much effort? It's up in the air at this point."

I hate it when people talk about "lazy developers". Dudes practically live at work for nine months at a time to make your video games, and you sit on your couch and call them lazy?
This is old news here already. eSRAM is a patch for lacklustre bandwidth, and that 204 GB/s figure is bullshit to begin with. They basically added the read and write peaks together to get 204 GB/s, and in reality there is no chance that reads or writes alone will exceed about 100 GB/s. At best we are looking at roughly 100 GB/s one way plus 70 GB/s the other. That would put it at GDDR5 speed, but remember this applies only to the 32 MB of eSRAM, not the whole 8 GB. There is no real scenario in which the Xbone memory configuration comes out better than the PS4's, and that's before even considering ease of use or hUMA. We've heard it from devs, and we see it in the games.
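Rough arithmetic behind that claim; a sketch using the publicly discussed figures, so treat the numbers as assumptions rather than measurements:

Code:
    # Back-of-the-envelope check of the "204 GB/s" eSRAM claim.
    # All figures are the publicly discussed ones, not measurements.
    esram_peak_one_way = 109             # GB/s, theoretical, each direction
    claimed_combined   = 204             # MS's headline read+write number

    # The headline only holds when reads and writes overlap near-perfectly.
    # The poster's more realistic guess: one direction near peak, the
    # other well below it.
    read_est, write_est = 100, 70        # GB/s, poster's estimate
    print(claimed_combined, read_est + write_est)  # 204 claimed vs 170 at best
    # ...and that applies only to the 32 MB of eSRAM, not the 8 GB of DDR3.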
This bad code/good code stuff is BS. eSRAM can only reach its peak performance (and I'm not talking about the theoretical number) when it reads and writes at the same time. The best they could do under "not really real game world" conditions was about 130 GB/s, and they can't sustain that all the time.
And you can't overlook the fact that everything that has to go through DDR3 is capped at 68 GB/s; the eSRAM won't magically extract data from there at 133 GB/s.
The eSRAM is so tiny that it will be used for an extremely limited number of operations while everything else goes through the slow DDR3.
Meanwhile on PS4, every single byte of memory runs at 176 GB/s, and you never have to move data from one pool to another.
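For reference, those headline DDR3/GDDR5 numbers fall straight out of bus width times transfer rate; a minimal sketch, assuming the commonly cited memory specs:

Code:
    # Peak bandwidth (GB/s) = transfer rate (MT/s) * bus width (bytes) / 1000
    def peak_gbs(mtps, bus_bits):
        return mtps * (bus_bits // 8) / 1000

    print(peak_gbs(2133, 256))  # Xbox One DDR3-2133, 256-bit   -> ~68.3 GB/s
    print(peak_gbs(5500, 256))  # PS4 GDDR5 at 5.5 GT/s, 256-bit -> 176.0 GB/s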
Originally Posted by zomgbbqftw:
"It's all meaningless drivel. The Bone is worse than the PS4 in terms of computational power by a third. No amount of eSRAM will make up for that kind of gap."

Also, doesn't it have only 16 ROPs versus 32 on the PS4? That means it has just a little over half the latter's fillrate.
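Both the "by a third" compute figure and the fillrate gap check out against the commonly cited specs; a quick sketch, assuming the standard GCN FLOPS formula:

Code:
    # GCN compute: CUs * 64 lanes * 2 ops per cycle (FMA) * clock (GHz)
    def tflops(cus, ghz):
        return cus * 64 * 2 * ghz / 1000

    ps4, xbo = tflops(18, 0.800), tflops(12, 0.853)
    print(ps4, xbo, 1 - xbo / ps4)   # ~1.84 vs ~1.31 TFLOPS, ~29% lower

    # Pixel fillrate: ROPs * clock (GHz) = Gpixels/s
    print(32 * 0.800, 16 * 0.853)    # 25.6 vs ~13.6 -> a little over half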
"Are there still no proper comparisons for Xbone/PS4 multiplats? WTF is going on?"

Any update on that now that the game is out in the wild?
Looks like this round of multiplats is a wash anyway. For the most part, devs built their engines around the larger-userbase last-gen systems, then recoded for x86, raised the resolution a bit, and added a few post-processing effects. Some teams did a little more than others. The main difference this round seems to be resolution: 1080p for PS4 and 720p for X1, which is in line with what you would expect in the PC world when pitting a low-end 16-ROP, 1.3 TFLOP card against a mid-range 32-ROP, 1.8 TFLOP card. The results track to a T, really. On X1 they kept texture fidelity the same as the other versions and dropped the resolution to keep the frame rate in line; the PS4 version is basically a direct port.
First off, it's not as simple as "adding" the speeds together. The eSRAM inputs/outputs directly onto the same bus, so it's not like you can use it as another pipe to DDR3 (and doing so would cause some insane coherency issues).
Second, the best way to use the XBO's eSRAM implementation is to treat it much like the eDRAM on the 360, where you can essentially get "free" framebuffer operations thanks to its speed but are limited by its small size.
Third, a system's speed is never determined by the speed of its fastest component, but by the speed of its most limiting component. Think bottlenecks. It doesn't matter how fast the eSRAM is if the rest of the system is slow as molasses.
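To make that size limit concrete, here's a rough budget sketch; the render-target formats are illustrative assumptions, not any actual XBO title's setup:

Code:
    # How fast 32 MB of eSRAM fills up with typical 1080p render targets.
    # Formats and bytes-per-pixel below are illustrative assumptions.
    def mib(w, h, bytes_per_pixel):
        return w * h * bytes_per_pixel / (1024 * 1024)

    w, h = 1920, 1080
    color = mib(w, h, 4)          # RGBA8 back buffer       ~7.9 MB
    depth = mib(w, h, 4)          # 32-bit depth/stencil    ~7.9 MB
    gbuf  = mib(w, h, 8)          # two more RGBA8 targets ~15.8 MB
    print(color + depth + gbuf)   # ~31.6 MB: 1080p barely fits; 720p easily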
I was thinking about this difference the other day, and I had a very interesting discussion with my friend about this very topic. My argument was that PS4 vs X1 is as big of a gap as OG Xbox vs PS2. I had some observations to support my argument:

Back in the PS2 era, SDTVs were the norm. Most SDTVs output 576i/50 Hz (PAL) or 480i/60 Hz (NTSC)*; of course, frame rates could drop below the 50/60 fps maximum.

During that era, the PS2, GC, and OG Xbox HAD to target only ONE resolution (depending on PAL vs NTSC). Each console first had to hit that resolution, then use whatever power was left to improve graphics and texture quality. That's why Splinter Cell on the OG Xbox looked almost like a different game from its PS2 counterpart. The PS2 didn't have the luxury of rendering at a lower resolution and stretching the image, because there was no lower resolution to fall back on.

These days we have the PS4 with a massive power advantage over the Xbone, but we also have HDTVs and a range of common render resolutions: 720p, 900p, and 1080p. So every time the Xbone struggles to keep up with the PS4, developers just lower the resolution to the next step down. That usually frees up a considerable amount of power (the higher resolution pushes 56-125% more pixels per frame, depending on which of the 1080p/900p/720p steps you compare), and devs can use it to maintain the same texture quality between the versions. I think this trend will continue throughout this generation, or until the PS4 starts tapping into GPGPU compute; at that point the Xbone will seriously need more than a resolution drop to keep multiplatform games looking similar (by similar I mean a game with all effects, bells, and whistles aside from res + fps). Let's assume this hypothetical situation where the PS4 dedicates six of its CUs to compute:
PS4: 12 CUs rendering + 6 CUs compute.
In order to achieve similar compute performance:
Xbone: 6 CUs rendering + 6 CUs compute.
Notice that in this case the Xbone is left with only half the PS4's rendering CUs! So I can see huge graphical performance disparities coming later this gen between the two, much greater than those during the PS2/Xbox era.
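A minimal sketch of that hypothetical budget (the 18/12 CU counts and clocks are the commonly cited specs; the 6-CU compute split is this post's assumption):

Code:
    # Hypothetical: both consoles dedicate 6 CUs to GPGPU compute.
    ps4_cus, xbo_cus, compute = 18, 12, 6
    ps4_render = ps4_cus - compute      # 12 CUs left for rendering
    xbo_render = xbo_cus - compute      #  6 CUs left for rendering
    print(ps4_render / xbo_render)      # 2.0 -> PS4 keeps twice the CUs
    # Clock speeds (0.853 vs 0.800 GHz) narrow this only slightly:
    print((ps4_render * 0.800) / (xbo_render * 0.853))  # ~1.88x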
Also, let's assume another hypothetical situation where there is a single resolution target (either 720p or 1080p). In this case the PS4 would easily pull ahead of the Xbone, using the remaining power to enhance textures or IQ, or even step up the frame rate.
A small example, just to explain to the average Joe the power required to run a game at 1080p vs 720p (using COD on PS4 vs X1):

PS4: 1080p @ 60 fps
X1: 720p @ 60 fps

Imagine an imaginary scenario where you have three 720p TVs hooked up to a single HDMI output. The more power the console outputs, the more TVs show a picture.

X1: one TV outputs the game at 720p; the other two are black (turned off due to lack of power).
PS4: two TVs show the exact same 720p picture as the X1, and the third TV is on with a 480i picture!

So basically the PS4 is rendering about 2.25 times the pixels while maintaining the same frame rate, and it might have better IQ too!
FOR DOUBTERS: do you see how big a difference that is, POWER-WISE? (The arithmetic behind the analogy is sketched below.)
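A quick sanity check of the three-TV analogy's pixel budget; this is pure arithmetic, nothing console-specific:

Code:
    # Does one 1080p frame ~= two 720p frames plus a little extra?
    full_hd = 1920 * 1080            # 2,073,600 pixels
    hd      = 1280 * 720             #   921,600 pixels
    print(full_hd / hd)              # 2.25x the pixels per frame
    print(full_hd - 2 * hd)          # 230,400 left over: roughly a small,
                                     # 480-line picture's worth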
My conclusion is that the idea of an Xbox-vs-PS2-scale power difference this gen is very real; it just happens to be less "visible" due to today's different output technologies.

* I'm fully aware that some later SDTVs could accept 720p, but that resolution was rarely used by games back then.
What does GAF think?
Forgive my weak grammar.
Originally Posted by Salvor.Hardin:
"^ When I see a Halo CE/Killzone disparity, I'll agree that the gulf is as big as the Xbox/PS2 gap."

Believe me, if the X1/PS4 were forced to output a single resolution, the difference would be even bigger than Xbox/PS2. When devs start utilizing GPGPU, the X1 will struggle to achieve similar computational power without sacrificing graphical rendering resources. The X1 is very resource-scarce.
"With the games out now, it looks like that isn't true at all. It seems the Xbox One isn't balanced in a way that brings it closer to the PS4. So far, nearly every PS4 version has been far superior to the Xbone one."

PS4 versions of games being better doesn't imply that bandwidth isn't sometimes a bottleneck for the GPU's I/O units. (If it is a bottleneck on occasion, I really wouldn't be surprised; that's not exactly unheard of for consoles without embedded memory pools.)
But I don't think the Xbox One would get the best version just because of a PS4 bottleneck. Remember, every piece of hardware has bottlenecks; that's just its capacity limit.
Originally Posted by Wishmaster92:
"I would say that the only thing Microsoft can do is upclock the CPU and GPU again. I think Harrison said they could do it again. If they don't, then yeah, MS is in trouble. Just look at all the multiplats having a notable performance/resolution increase on the PS4."

Upclocks won't do much in terms of performance. There are other, more fundamental limits on the XBO versus the PS4.
Yes, but I'm not seeing how this is relevant. Nobody is questioning whether or not the hardware "has bottlenecks" (lol), or claiming that the Xbox One versions are better; the question is whether bandwidth is a bottleneck for GPU I/O.
Um, consoles did run at lower resolutions in the PS2/GameCube/Xbox era.
Risky move. The X1 APU is huge, and upclocks increase heat, shortening the lifespan of the chip.
What console? What resolution?

EDIT: I really don't remember any games during the PS2 era where the resolution was lower than 480i and the PS2 had to scale the image back up. Even if that was the case for some games, it was never a common thing to do; devs targeted the standard resolution of the time (480i) and then used the remaining resources to improve graphics.
Well, this thread is all about the gap between PS4 and Xbox One performance; that's why I brought it up. Even if there is a bandwidth bottleneck, the gap is still about what we expected: a resolution difference and minor graphical differences. Maybe we should wait much longer to see if the gap gets smaller once devs get used to these systems, like the PS3 catching up.
The PS3/360 gap example is not applicable to PS4/X1 at all. Totally different situations. I'm not gonna explain why for the millionth time.
Maybe. There are two sides to that argument. Will the weird architecture allow the Xbox One to grow more as developers figure out "the right compromises" on the platform, or does the greater versatility of the PS4 mean that there's another axis to the decision-making that will allow the PS4 to grow in more interesting ways over time?
I'm more with the latter, because:

The PS4 has more hardware to grow into, and future game design will rely heavily on GPGPU, which the PS4 excels at. The Xbone, on the other hand, adds a bottleneck (eSRAM) to an already weaker system: even fully utilized eSRAM (~160 GB/s real-world, and only a 32 MB cache) still lags behind the ~176 GB/s the PS4's GPU gets across roughly 6 GB of GDDR5, and that's not counting the hardware only the PS4 has (extra CUs, more ROPs, ALUs, and texture units).
Yeah, half the ROPs is already a big handicap for the Xbone. I guess late in the gen many multiplats will be 720p on the Xbone with toned-down shader effects.