3 GB reserved for the OS is dumber than thinking Alison Brie is common-looking. Which is why I think it's total bullshit.
Except it's been confirmed by MS.
Even if it was 2 GB (coz 2 and a bit OSes, hurr durr), the DDR3 will still be a handicap. I also doubt developers would get much use out of 32 MB of embedded RAM.
link?
Nah not really.
And yeah, they will, especially if it's a hardware-controlled framebuffer.
Found it.
Fairly unambiguous: "8GB DDR3 (5 GB available to games)."
One thing I'm curious about is how much bandwidth the background operating systems take on the XBone. As they said in the presentation, these tasks will run in tandem with the game, so if you have stuff going on in the background, how much bandwidth do they reserve for it?
Gemüsepizza said: "Only 8GB DDR3 RAM and 32MB eSRAM. No GDDR5."
I don't think people really understand what "confirmed" means.
Is everyone forgetting that certain calculations can be offloaded to the cloud, or did I watch the wrong stream?
The GPU is so gimped.
Then again, GDDR5 is not ideal for non-graphics memory usage because of latency, so the PS4 could suffer in other areas of performance: AI routines and the like.
Yes, keep moving that goal post when it's still firmly stuck in the ground.
What a wasted effort on my end.
Way to use an overused term that doesn't even apply.
Divert efforts towards looking up "confirmed" ?
Just show me the games. Neither has shown me anything I really must have yet.
All that's on my radar right now is Destiny and Watch Dogs. Both multiplats.
I need to see exclusives I actually care about to make a decision.
And I really hope it's new IP exclusives. I'm bored with this never-ending gen and the sequel farms.
You mean those same AI routines that are going to the cloud, which has way more latency than GDDR5?
Does anyone ACTUALLY think that this will happen? 95% of AI in games is a joke. These companies have been talking a big game about using horsepower in the form of extra CPU cores to revolutionize AI since 2005. If their "labor" has borne fruit, I haven't seen it. It's all basic pathfinding and target acquisition, and in the case of Cauladooty we don't even get that.
Uh what? We don't know any specifics of that yet, but I seriously doubt any games will offload their computations online. SimCity didn't even do that to any meaningful degree.
Well this is next gen. I sure hope AI gets better along with everything else. Regardless, my point stands for other types of computation as well. Pretty much anything non-graphics related can be done more efficiently with DDR3. This is because the data used in those calculations usually doesn't require the bandwidth that GDDR5 provides, and there is a tradeoff with getting that bandwidth.
No it won't. You still don't get it, but you'll see soon.
First party? Maybe, yeah.
Third party, not so much. It'll be smaller differences.
Honestly, based on this news I hope that Sony reserves around 2 GB for the PS4 OS so that they can put more into features. It could be pretty valuable so that they don't pigeonhole themselves in terms of services.
The latency argument is rubbish. DDR2 has less latency than DDR3, and DDR3 less than DDR4; likewise with GDDR iterations. But you know what, it makes no practical difference, because the bandwidth/speed gains more than make up for the latency differences.
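The latency point above can be sanity-checked with simple arithmetic: absolute CAS latency in nanoseconds is just cycles divided by the command clock. The timings below are typical datasheet-style figures assumed for illustration, not the consoles' actual (undisclosed) memory timings:

```python
# Rough CAS-latency sanity check: latency_ns = CL cycles / clock (MHz) * 1000.
# The CL and clock values below are illustrative assumptions, not confirmed
# console specs.

def cas_latency_ns(cl_cycles, clock_mhz):
    """Absolute column-access latency in nanoseconds."""
    return cl_cycles / clock_mhz * 1e3

# DDR3-2133 at CL14: command clock is half the transfer rate, ~1066 MHz.
ddr3 = cas_latency_ns(14, 1066)
# GDDR5 at 5500 MT/s (quad-pumped, ~1375 MHz command clock), CAS assumed ~15.
gddr5 = cas_latency_ns(15, 1375)

print(f"DDR3  ~{ddr3:.1f} ns")   # ~13.1 ns
print(f"GDDR5 ~{gddr5:.1f} ns")  # ~10.9 ns
```

Under these assumptions both come out in the same ~10–15 ns ballpark, which is the poster's point: higher CAS *cycle* counts on faster memory don't translate into meaningfully worse absolute latency.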
Unlike the Xbone, PS4 will not be running 3 separate but cohesive OSs. It'll only have one (afaik) and so expecting it to go beyond 1GB is highly unrealistic.
I don't understand separate but cohesive OSes. Do they have 3 separate kernels with separate schedulers running? I guess you have 1 primary kernel that manages the two other kernels?
Like, it doesn't make any sense to me whatsoever.
If this is true and the Xbox One releases with noticeably less power than the PS4, has there ever been a generation where the 3 consoles had such a significant gap between them? The Wii U is around the PS3 and 360's league, and the Xbox One is like a halfway mark between the Wii U and PS4. This is very bizarre.
How many cores is the PS4 using for OS?
The very definition of moving goal posts right here.
Straight from the horse's mouth and still looking for "confirmation".
The point is that non-graphics memory doesn't need the high bandwidth. So those operations would inherently be more efficient with memory that has lower latency.
Yes, PR talk with no confirmation, talking about the "experience" the One will provide. Clearly confirmation from the horse's mouth.
He would like an official MS letter issued directly to dIEHARD of NeoGAF. But then who will verify that the letter has indeed been issued by Microsoft and not by Kaz's fake twitter account?

Did you watch that video? That GI guy mentioned only 5 GB of RAM available to games 2 or 3 times, and no correction or denial was made! Then at the architecture panel after the reveal, a different architect mentioned that 5 GB of RAM was more than enough, and that even 4 GB would have been enough, but they decided to go with 8 GB so they could deliver this All in One expurrrrience! They are NEVER going to say straight up that 3 GB is reserved for the OS, because it doesn't benefit them from a PR perspective to do so.
So, 5 GB DDR3 vs 7 GB GDDR5.
God damn.
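The gap being reacted to here can be reproduced from the publicly reported specs: both consoles use a 256-bit memory bus, with DDR3 at 2133 MT/s on the XBone and GDDR5 at 5500 MT/s on the PS4 (this sketch ignores the XBone's 32 MB eSRAM, which adds a separate fast pool):

```python
# Peak theoretical bandwidth = bus width (bytes) * transfer rate (GT/s).
# Bus widths and transfer rates are the publicly reported console specs;
# the eSRAM's extra bandwidth on the XBone side is deliberately left out.

def peak_bandwidth_gbs(bus_bits, transfer_mts):
    """Peak bandwidth in GB/s for a given bus width and transfer rate."""
    return bus_bits / 8 * transfer_mts / 1000

xbone_ddr3 = peak_bandwidth_gbs(256, 2133)
ps4_gddr5 = peak_bandwidth_gbs(256, 5500)

print(f"XBone DDR3: {xbone_ddr3:.1f} GB/s")  # ~68.3 GB/s
print(f"PS4 GDDR5:  {ps4_gddr5:.1f} GB/s")   # 176.0 GB/s
```

So on main memory alone the PS4 has roughly 2.5x the peak bandwidth, which is why the thread treats the eSRAM as the XBone's only way to close the gap.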
Oh, so basically you are... reading into things not being said or denied?
Sounds like a confirmation to me.
OK, so where did GI get this idea of 5 GB for games only, then? They put it in a specs chart and mentioned it in that interview WITH A LEAD ARCHITECT! Yet no correction has been made. Why?