They're essentially trying to turn some wishful thinking into a rumor even though it's this late in the process.

I assume that this post has popped up in this thread but I haven't been keeping up:
http://forum.beyond3d.com/showpost.php?p=1763732&postcount=4689
Basically a RAM upgrade that makes zero sense as far as logic goes.
An overclock to the CPU (seems possible...?)
Some other crap.
I can definitely see them trying to get the clock speed of the CPU up, but the fact that this is in the same rumor as something as absurd as redesigning the entire motherboard to hold more RAM (unless there is a DDR3 size/config I've never heard of...?) makes me doubt the entire thing. That and the people on that forum are as crazy as we are about Platinum games on Nintendo systems or Half Life 3.
To achieve 12GB, wouldn't they have to use 24 4-Gbit chips instead of the 16 4-Gbit chips they currently use? The board they showed at the reveal didn't look like it had room for an additional 8 chips.
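For what it's worth, the chip math is easy to sanity-check (this assumes 4 Gbit DDR3 chips throughout, as in the current board and the rumor):

```python
# Back-of-the-envelope: DDR3 chip count vs total capacity.
# Assumes 4 Gbit (0.5 GB) chips, as on the current board.
GB_PER_CHIP = 4 / 8  # 4 Gbit per chip -> 0.5 GB

print(16 * GB_PER_CHIP)  # 8.0  -> the announced 8GB config
print(24 * GB_PER_CHIP)  # 12.0 -> 12GB needs 8 more chips (or denser chips)
```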
Why are they just now addressing the large system reservation if it is such a problem? This seems like something that would have been known well in advance, with more than enough time to address the problem before the reveal or E3.
It would also seem like something Xbox fans would hope not to be true. What happens if it is true that developers are complaining about the system reservation, and MS decides against (or is unable to manage) an upgrade to 12GB of RAM?
Lost Planet, Assassin's Creed, GTA 4, Red Dead Redemption, Mass Effect 3, Transformers and Skyrim on the PS3 say hello.
They are all garbage ports and are so clearly better on the Xbox 360 it isn't even a contest.
Man when will this sort of nonsensical wishful thinking end?
Uh, so you're agreeing with him? Just in an oddly phrased manner? Hope so anyway.
With HMC, the bandwidth feeding mobile SoCs will far outstrip what GDDR5 delivers; 2013 is really bad timing for next-gen. We are on the cusp of a paradigm shift on the semiconductor side, where performance is not going to increment but explode across the entire range of TDPs. Intel finally getting serious about mobile will aid that ramp.
I think this will have to be a 5yr cycle this time around since mobile will start to give comparable performance within 3-4yrs.
It is even less complicated than this: PS4 will get better fps and/or better resolutions in the same multiplatform titles. A developer doesn't have to do jack shit for that to happen, it simply comes with the faster GPU and memory bandwidth. Then there's the supposed additional 2 GB of RAM available to games on PS4, which can lead to better textures and less pop-in, and this is very simple to implement too. And after that come the more complex shaders and effects possible with a faster GPU.

I think people need to keep in mind that third party developers on PS4 are going to be competing with Sony's first party, and if there is a huge disparity then third parties will get shit from the hardcore until they bring their titles up to scratch, which we know isn't going to be hard given the ease of development on PS4. While the same could be said for Xbone, it will be easier for third parties to dismiss it as being difficult to work with because of the eSRAM and DMAs complicating development.
The eSRAM is by default more complex than the PS4 setup. The key is how automated it is to gain its benefits. If it works like a cache then it is no effort to use, but then you also don't have much control over it. Alternatively, if you want more control to get better results, you need to put more effort in. And by your own comments, multiplatform devs will go for the easy route where possible. It's perfectly possible that if Xbox One's memory system is in any way complex to use, some devs simply won't bother, because the publishers won't pay for the time needed.
I'm pretty sure that the difference will be quite visible in basically every multiplatform game in one way or another.
I actually really enjoyed reading this post, because I feel we've all done something similar at some point. Sometimes you just want something to be true so badly that you lie to yourself.

That was my big reality check on wishful thinking. I see so many people construct these insane conspiracy theories either in favor of their console of choice or against the ones they don't like... or even fear, like that MS is going to pay developers to make shitty PS4 ports - and I laugh. I laugh really hard. But once upon a time I was those dudes and weezer crushed my soul.
This is barely any more complex than the PS4's setup as far as I know. I don't know why people are making out like it's some vast gap in complexity and difficulty. Honestly a lot of it reads like 'it's not good enough that I should succeed, others must fail'. They're both going to be very easy for developers to work with.
Because the PS4 has a UMA, after you load things into RAM there is no other work needed to use that data. The gap is as vast as you can get.
After the eDRAM in the X360 and the generally split memory systems of last gen, this isn't an issue for many devs, but the PS4's is still a way better solution than split RAM.
I don't think both situations are comparable. Xbox 360 had ~4% more RAM for games than PS3 at the same speed AND additional eDRAM. Xbox One has ~29% less RAM for games at ~39% of the speed of the PS4's RAM, plus additional eSRAM. Having an eDRAM/eSRAM configuration is not a problem when you already have more RAM at the same speed than the competition, it's a bonus. But it probably becomes a small annoyance when you have less RAM at a lower speed than the competitor.
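For anyone wondering where those percentages come from, here's the rough math. The absolute figures are the ones being passed around right now (5 GB vs 7 GB usable by games, 68.3 GB/s DDR3 vs 176 GB/s GDDR5), so treat them as assumptions rather than confirmed specs:

```python
# Where the quoted percentages likely come from. The absolute numbers are
# assumptions based on commonly reported figures, not from the post itself.
xbone_game_ram, ps4_game_ram = 5.0, 7.0   # GB reportedly available to games
xbone_bw, ps4_bw = 68.3, 176.0            # GB/s main memory bandwidth

print(round((1 - xbone_game_ram / ps4_game_ram) * 100))  # 29 -> ~29% less RAM for games
print(round(xbone_bw / ps4_bw * 100))                    # 39 -> ~39% of the PS4's bandwidth
```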
360 was always described as having a UMA too, because the eDRAM was only ever used to store the framebuffer (as I recall, the ROPs had to write the framebuffer to the eDRAM); both the CPU and GPU had access to the same 512MB of RAM. For all intents and purposes the 360 was a UMA, and unless there's some major reason why it's different now, Xbone is too.
Split memory (a la PS3) was a problem because you were severely constrained by what you could put in RAM: your GPU could never access more than 256MB of data and your CPU could not have access to more than 256MB of data either, without passing through the other. That is not a problem with Xbone: both have access to the same 5GB pool.
My point with UMA was that PS4 has only one pool of memory shared between CPU and GPU, no other eDRAM or anything. After you load things into it, you don't have to do anything else to use them. That is a way better solution than a pool of memory + eDRAM/eSRAM. Not only is the memory config faster, it is also easier to use.
In the last "Road to PS4" talk, Cerny said they had two choices: the 256-bit GDDR5 memory that is currently in the PS4, or GDDR5 on a 128-bit bus (88GB/s) with a small eDRAM pool @ 1000GB/s.
They chose not to go with the second because it added complexity.
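The 88GB/s figure falls straight out of halving the bus at the same data rate. A quick sketch, with the per-pin rate inferred from the 176GB/s number (an inference on my part, not something Cerny quoted):

```python
# Bandwidth = bus width * per-pin data rate. 5.5 Gbps per pin is inferred
# from 176 GB/s over 256 bits; it is an assumption, not a quoted spec.
def gddr5_bandwidth_gb_s(bus_bits, gbps_per_pin=5.5):
    return bus_bits * gbps_per_pin / 8  # bits -> bytes

print(gddr5_bandwidth_gb_s(256))  # 176.0 -> the option they shipped
print(gddr5_bandwidth_gb_s(128))  # 88.0  -> the rejected option, paired with fast eDRAM
```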
Well, it is more difficult to work with two RAM pools with different bandwidths and sizes. The ESRAM of the XBO is the only reasonable pool to contain the frame buffer, but at the same time 32 MB is not that much for 1080p with deferred shading and all. So you'll probably have to tile the frame, which adds complexity to the renderer and increases timeframes.
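To put the 32 MB in perspective, here's a rough G-buffer estimate. The layout below is a generic assumption (four RGBA8 targets plus depth/stencil), not any particular engine's setup:

```python
# Rough 1080p G-buffer footprint. Layout is a generic assumption:
# four RGBA8 render targets plus a D24S8 depth/stencil buffer.
pixels = 1920 * 1080
bytes_per_pixel = 4 * 4 + 4          # 4 targets @ 4 bytes each + depth/stencil
gbuffer_mb = pixels * bytes_per_pixel / (1024 ** 2)

print(round(gbuffer_mb, 1))          # ~39.6 MB -- already over the 32 MB of ESRAM,
                                     # before MSAA or extra buffers, hence the tiling talk
```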
As interesting as the Cerny videos are, they are PR. They might just as well have not done it because, as we have seen from Xbone, EDRAM lowers yields and pushes up cost. I would prefer to hear from someone with experience whether EDRAM causes serious headaches instead of the chief architect of the console's main competitor.
PS4 is going to be worse AT GPU Boolean comparison operations while other systems can be multithreaded to aid Logic. This may hurt the PS4's use of Tessellation since boolean operations will be slower than Wii U and Xbox 1. It also has a weak CPU. It may not be able to handle many hardware lights ranging anywhere from Wii U to Last Gen. It just uses shader lights which don't have Alpha Channel correction. When a light shines on a ambient reflective object it only reflects back certain colors hiding the rest. PS4's lack of Edram means that the Char persision storage of shader information in RAM will be equivalent to Xbox One.
Booleans increase the number of header files to store their memory address and PS4 has huge memory banks think the location of one fish in a large ocean. PS4 has to store this address on the system to be read by memory controllers. This burden is much less for Embedded RAM which Wii U and Xbox one have which is much smaller in size and can be cycled for large storage. That means boolean dependent GPU operations will be worse. With all the boolean comparisons in tessellation to determine whether the structure is ABA or ABB or BAA the PS4 won't be able to handle as long tessellated strings. At 8bits Char persision Header files will reduce the Ram speed to Xbox One levels. PS4 has yet to show many hardware lights in Graphically intensive games it only has shader lights probably due to limitations which look unnatural.
It's not a myth that DDR5 is worse than DDR3 at the same speed.
I suppose this fits here since I can't find a dedicated thread for talking about PS4's hardware, but what do you make of what I found from one poster on another site concerning PS4's RAM?
This one poster on another site eh?
I'll give you a hint: The name of the site starts with a G & has 3 capital letters & an apostrophe after that.
This poster clearly has no idea what he's talking about. Nothing he said remotely made sense.
I assume that when the poster mentioned booleans, he was referring to branch prediction on the CPU.
Sony got lucky with GDDR5; they were planning on only 4GB.
The prices fell, and devs asked for more, so they bumped the specs.
With the original plan everyone would be talking about how MS would crush Sony.
I hope both consoles do very well. Competition breeds excellence.
Honestly, I am an Xbox consumer right now. Will the difference in power yield the "wild advantages" people are already proclaiming? We'll have to see, but I hope it's not a massacre.
Yet Timothy Lottes (FXAA creator) did a blog post on why he still favoured PS4 with its 4GB setup. Clearly you don't know shit.
But it doesn't matter what a lot of people say. Technical threads aren't supposed to be popularity contests.

I never claimed to know shit, so calm down.
A lot of people would just say 4 < 8.
It reads like complete bullshit, he might as well have tried to convince you that he rides a fucking unicorn.
Yeah I don't know what the fuck he is talking about. Boolean comparisons? Booleans are fucking true/false values.
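For anyone who skimmed that wall of text, this really is all a "boolean comparison" is:

```python
# A boolean comparison is just an expression that evaluates to True or False.
# No header files, no memory addresses, nothing bandwidth-related about it.
x = 42
print(x > 10)               # True
print(x % 2 == 0)           # True
print(x > 10 and x < 100)   # True
```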
The fan-drivel is one of the highlights of GAF, especially when new hardware is on the horizon.