TechnicPuppet
Nothing! I said nothing!
I don't think the gap will be this large. But we will see.
I don't either. More like 5 to 10 percent.
Enough of a performance difference to force MS to hype the shit out of superior online, system versatility, and motion control integration.
MS already lost the power battle, they better take advantage of their usability advantages...
After reading the old post I noticed that the X1 camp went from "MS will go all out this gen, Sony can't compete financially" to "40% is negligible, we have AUDIO" lol
EPIC TRANSFORMATION
Based on the hardware, what makes you think that? What does "5 to 10 percent" even mean?
PC GPUs with the same performance gap have shown pretty big real-world difference.
25 fps can feel very sluggish compared to a locked 30 fps, so 5 fps can actually make a big difference.
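A quick frame-time sketch (plain arithmetic, nothing console-specific) shows why the same 5 fps matters far more at the low end:

```python
# Frame time is the delay the player actually feels each frame.
def frame_time_ms(fps):
    return 1000.0 / fps

# 25 -> 30 fps is 20% more frames and shaves ~6.7 ms off every frame...
low_gap = frame_time_ms(25) - frame_time_ms(30)
# ...while 55 -> 60 fps is ~9% more frames and shaves only ~1.5 ms.
high_gap = frame_time_ms(55) - frame_time_ms(60)

print(f"25->30 fps: {low_gap:.1f} ms saved per frame")   # 6.7 ms
print(f"55->60 fps: {high_gap:.1f} ms saved per frame")  # 1.5 ms
```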
But if you think all that extra raw GPU power will only result in slightly sharper textures and a small FPS bump, you're mistaken. It will likely result in better image quality (resolution, AA) and better/more transparencies (more foliage, etc.). Third party devs can take advantage of this extra power without a lot of effort.
Agreed, but the difference between 55 fps and 60 fps won't be as noticeable. Both GPUs will be capable of decent frame rates.
Again I agree with you. I just don't think the differences you are talking about will amount to anything but fuel for the fanboy war. I know people always get excited about little differences in draw distance, foliage density, etc., but it doesn't change the gameplay experience. We aren't talking about PS2 to XBOX differences here.
My point was don't worry about hardware specs; they don't matter that much in the end. Worry about the games you want to play. If you want to play Killzone and Infamous, buy a PS4; if you would prefer Halo and Forza, buy an XBOX ONE. The multiplats will be similar enough to not be a determining factor in your console choice.
If only I was a GAF member at that time.
Also what's the story with this dual APU/GPU?
Complete and utter fantasy land bullshit propagated by a combination of three things... A lack of hardware-related knowledge, a lack of common sense, and A LOT of denial.
I think it's more important for people who own both systems. It's not really great for MS if every multiplatform game is better on PS4, even if just a little. Because that's the version people will buy.
I'm buying third party games on PC anyway so it doesn't matter to me.
this should be added to the op
No one can say anything is certain at this point until we can compare multiplat titles side by side. But even then we have to consider which game it is. BF4 would be the best measuring stick, since it'll probably be the most intensive on the hardware.
My prediction? 900p+ on PS4, 720p on Xbone with less AA and lower texture quality.
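For scale, here are the pixel counts behind that prediction. This is a back-of-envelope sketch; it assumes, simplistically, that achievable resolution scales linearly with raw GPU throughput:

```python
# Pixels per frame at the commonly discussed resolutions.
resolutions = {
    "720p":  1280 * 720,
    "900p":  1600 * 900,
    "1080p": 1920 * 1080,
}
ratio = resolutions["900p"] / resolutions["720p"]
# 900p pushes ~56% more pixels than 720p, slightly more than the
# ~40-50% raw-GPU gap the thread keeps citing, so the prediction is
# roughly in line with a linear pixels-per-FLOP assumption.
print(f"900p / 720p = {ratio:.2f}x the pixels")  # 1.56x
```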
Why?
Although they share a similar base architecture, both machines still have some peculiarities. In the end, it depends how much the unique features in the X1 can make up for the more straightforward approach combined with the raw power advantage of the PS4. And all things considered, with my basic understanding, the gap isn't as large as many think. So 5-10% seems about right and could mean a higher resolution or a slightly better frame rate on the PS4, or just simple things like better shadow resolution.
I simply don't understand how you and some others can think the real gap between the two will be smaller than in theory.
For the first time in console history we have two very similar architectures, with the most powerful one (most gaming code these days pushes the GPU; the CPU counts relatively little) also being the one that's easier to develop for and, more importantly, the one with the simpler architecture. How you can think that the weaker and more exotic console will be more efficient than the other, when logic suggests it is much more prone to bottlenecks somewhere, is beyond me. I'd expect the real-world gap to be even wider than the theoretical 30-40%, not the opposite.
Because you are simply looking at only a few numbers in your equation. The eSRAM is probably more powerful than some of you assume. Just look at what the eDRAM has done for the 360 - you can get several forms of post processing and AA basically for zero bandwidth cost.
Why zero? Everything that you read/write into the eSRAM should consume eSRAM bandwidth. The 360 had the benefit of having additional pixel processors attached to it that performed some tasks like AA "for free", but that was not an immediate effect of the eDRAM itself.
On Xbox 360, the EDRAM helps a lot with backbuffer bandwidth. For example in our last Xbox 360 game we had a 2 MRT g-buffer (deferred rendering, depth + 2x8888 buffers, same bit depth as in CryEngine 3). The g-buffer writes require 12 bytes of bandwidth per pixel, and all that bandwidth is fully provided by EDRAM. For each rendered pixel we sample three textures. Textures are block compressed (2xDXT5+1xDXN), so they take a total of 3 bytes per sampled texel. Assuming a coherent access pattern and trilinear filtering, we multiply that cost by 1.25 (25% extra memory touched by trilinear), and we get a texture bandwidth requirement of 3.75 bytes per rendered pixel. Without EDRAM the external memory bandwidth requirement is 12+3.75 bytes = 15.75 bytes per pixel. With EDRAM it is only 3.75 bytes. That is a 76% saving (over 4x external memory bandwidth cost without EDRAM). Deferred rendering is a widely used technique in high end AAA games. It is often criticized for being bandwidth inefficient, but developers still love to use it because it has lots of benefits. On Xbox 360, the EDRAM enables efficient usage of deferred rendering.
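The arithmetic in that post checks out; here it is re-run as a sketch, using only the numbers quoted above:

```python
# G-buffer: depth + 2x 8888 MRTs = 12 bytes written per pixel.
gbuffer_bpp = 12.0
# Textures: 2x DXT5 + 1x DXN = 3 bytes per texel, times ~25%
# extra memory touched by trilinear filtering.
texture_bpp = 3.0 * 1.25  # 3.75 bytes per rendered pixel

without_edram = gbuffer_bpp + texture_bpp  # 15.75 bytes/pixel external
with_edram = texture_bpp                   # g-buffer writes stay on-chip

saving = 1.0 - with_edram / without_edram
print(f"{without_edram} -> {with_edram} bytes/pixel external "
      f"({saving:.0%} saved, {without_edram / with_edram:.1f}x less)")
```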
Also a fast read/write on chip memory scratchpad (or a big cache) would help a lot with image post processing. Most of the image post process algorithms need no (or just a little) extra memory in addition to the processed backbuffer. With large enough on chip memory (or cache), most post processing algorithms become completely free of external memory bandwidth. Examples: HDR bloom, lens flares/streaks, bokeh/DOF, motion blur (per pixel motion vectors), SSAO/SSDO, post AA filters, color correction, etc, etc. The screen space local reflection (SSLR) algorithm (in Killzone Shadow Fall) would benefit the most from fast on chip local memory, since tracing those secondary rays from the min/max quadtree acceleration structure has quite an incoherent memory access pattern. Incoherent accesses are latency sensitive (lots of cache misses) and the on chip memories tend to have smaller latencies (of course it's implementation specific, but that is usually true, since the memory is closer to the execution units, for example Haswell's 128 MB L4 should be lower latency than the external memory). I would expect to see a lot more post process effects in the future as developers are targeting cinematic rendering with their new engines. Fast on chip memory scratchpad (or a big cache) would reduce bandwidth requirement a lot.
Not enough information about the ways the two vastly differ from each other to be certain about anything just yet. I mean, until we see second-generation software that has had the time, resources, and experience to actually operate according to what the hardware is capable of and how best to achieve it, I think it's way too early to get comfortable with any sense of in-game differences equal to the difference in dry FLOP counts.
I'm in your boat.
You'd have to wait a few years really to see what the difference is.
As I said - I'm not an expert - I just read that from an actual developer:
http://beyond3d.com/showpost.php?p=1738762&postcount=3325
You're misreading him. When he talks about using a scratchpad 'reducing bandwidth consumption' or 'costing no bandwidth', he's talking about 'external bandwidth' - to DDR3 in this case. These things still all consume bandwidth to/from eSRAM.
Why would a platform with a 40% better GPU, GDDR5, and easier development suffer from identical frame drops and tearing? The differences will be there day one, and that's assuming the laziest of lazy devs.
Yes, but while the GPU is using that eSRAM for post-processing work, the DDR3 bandwidth is totally freed up for the CPU. If so, that's quite efficient.
At the same time as you are asking why some think the differences will be less than the numbers indicate in theory, you are suggesting the gap will actually be bigger than this.
Because you are simply looking at only a few numbers in your equation. The eSRAM is probably more powerful than some of you assume. Just look at what the eDRAM has done for the 360 - you can get several forms of post processing and AA basically for zero bandwidth cost. That's a clear advantage. And as Albert Penello hinted, there seem to be even more variables involved that are unknown to date. This is the basis for my assumption that multiplat differences will be smaller than expected. I can also be totally wrong.
I think the X1 could have up to 30% more CPU power than the PS4. Here's why.
X1 has dedicated servers freeing up to 10% CPU power for multiplayer and heavily integrated online games that the PS4 CPU has to deal with. Add on the 5-10% CPU savings from X1 having dedicated audio processors, then the 10% higher clock than PS4, and you have a significant CPU advantage on X1 over PS4.
I also believe the X1's use of DirectX gives it an advantage simply because it's the industry standard and such a mature toolset.
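Taking those percentages at face value (they are the poster's estimates, not measurements; the only hard numbers are the widely reported 1.75 GHz vs 1.6 GHz CPU clocks), here is how they would actually combine. Note that the effects multiply rather than simply add:

```python
# All figures below are the poster's claims taken at face value,
# except the clock speeds, which are the widely reported ones.
offload_network = 0.10        # claimed CPU freed by dedicated servers
offload_audio   = 0.075       # midpoint of the claimed 5-10% audio saving
clock_ratio     = 1.75 / 1.6  # X1 vs PS4 CPU clock, about +9.4%

# Effective game-available CPU, normalized so PS4 at full clock = 1.0:
ps4_effective = 1.0 - offload_network - offload_audio  # 0.825
x1_effective  = clock_ratio                            # ~1.094

advantage = x1_effective / ps4_effective - 1
print(f"Implied X1 CPU advantage: {advantage:.0%}")  # ~33%
```

Even granting every assumption, whether PS4 games actually spend 10% of CPU on networking or 5-10% on audio is pure speculation; the snippet only shows the claimed numbers are internally consistent with "up to 30%".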
I mean anyone who's not a fanboy can see that Killzone, Knack and Driveclub look mediocre, and on top of that run at 30fps. The more I see of X1 the more balanced it looks.
I've preordered both PS4 and X1 so I'll be enjoying all the goodness. From the initial specs I thought the PS4 was gonna blow the X1 out of the water, but that doesn't seem to be true. Also can I just say that dedicated servers on every game is a game changer. No more host advantage, the reason I fell out with online last gen. Yay.
You don't have to be a fanboy to think Killzone looks mediocre? OK. That's also 60fps in multiplayer.
You don't factor in the very low latency of eSRAM.
Having a unified memory pool with the combined bandwidth of two DDR3/eSRAM pools is not less efficient, though. In fact, it's more flexible. You don't run into problems like not being able to fit your render targets into a small eSRAM pool. The render targets of KZ:SF are already ~40MB in size.
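For context on that ~40MB figure, here is a rough sketch of how a 1080p deferred g-buffer adds up. The layout below is hypothetical, not Killzone Shadow Fall's actual format, but it shows why a 32MB eSRAM pool gets tight:

```python
# Hypothetical 1080p deferred-rendering g-buffer (NOT the real KZ:SF layout).
width, height = 1920, 1080
render_targets_bytes = [
    4,  # albedo, RGBA8
    4,  # normals, e.g. 10-10-10-2
    4,  # material/specular params, RGBA8
    4,  # motion vectors + misc, RGBA8
    4,  # depth/stencil, D24S8
]
total_mb = width * height * sum(render_targets_bytes) / (1024 ** 2)
print(f"~{total_mb:.1f} MB of render targets")  # ~39.6 MB vs 32 MB of eSRAM
```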
I think the X1 could have up to 30% more CPU power than ps4. Here's why.
X1 has dedicated servers freeing up to 10% CPU power for multiplayer and heavily integrated online games that the ps4 CPU has to deal with. Add on the 5-10% CPU savings from x1 having dedicated audio processors, Then the 10% higher clock than ps4 and you have a significant CPU advantage on x1 over ps4.
I also believe the x1's use of directx gives it an advantage simply because its the industry standard and such a mature toolset.
I mean anyone who's not a fanboy can see that killzone, knack and driveclub look mediocre, and on top of that run at 30fps. The more I see of x1 the more balanced it looks.
I've preordered both ps4 and x1 so ill be enjoying all the goodness. From the initial specs I thought the ps4 was gonna blow the x1 out the water, but that doesn't seem to be true. Also can I just say that dedicated servers on every game is a gamechanger. No more host advantage, the reason I fell out with online on past gen. Yay.