
How inFAMOUS: Second Son Used the PS4's 8 (4.5) GB of RAM + more details

mechphree

Member
Right now it doesn't even do as much as the PS3 OS, which got by on much less RAM.

You sure? I was never able to watch Netflix and resume a game, or jump into the PS Store with a game running in the background, on my PS3.
I expect that extra reserved RAM is a safety measure, if anything. Pretty soon the PS4 will allow suspend/resume for games, along with taking over someone else's game. I expect when that arrives they will start cutting back the OS RAM reservation.
 

Triple U

Banned
First of all, check my first post in this thread. Second, if Crytek renders the same target at lower precision without losing quality, that means they spend less bandwidth for the same effect, so their solution is more efficient by definition.
And finally, The Order is using 4xRGBA8; I'm not sure about one of the targets, as we have only one inconclusive slide about it [in the material properties section].
I don't know about Killzone SF, because in the post-mortem demo slides they were still working out the best precision setup, but they used 5 buffers + a 32-bit depth/stencil, and even that unoptimized setup is smaller than Infamous's.

I'm using Crytek's data because they gave the most insight into development on both Crysis 3 and Ryse, and they have actually researched g-buffer packing methods and their efficiency and quality in the past.




Eh, I did. And I came away with the same conclusion. You, who can't recall one piece of the game's source code and know nothing about why they made certain design decisions, can only make flawed comparisons to other games.
 

vpance

Member
You're assuming all of the data accessed in a given frame is unique, but it typically isn't.

The point is, even if games were designed to access all the RAM at once in any given frame, and even if you could attain the theoretical maximum bandwidth, you'd still only get through about 5.87 GB per frame before stalling on bandwidth. That's only a few hundred MB off what's available now. Since the most demanding games are designed to stream in new assets all the time and just keep the commonly used stuff cached, it's not an issue. Virtual texturing is the future anyway.
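For anyone who wants to check that 5.87 GB figure, here's a minimal sketch, assuming the PS4's commonly quoted ~176 GB/s theoretical peak GDDR5 bandwidth and a 30 fps target (the peak is never reached in practice):

```python
# Back-of-the-envelope: how much unique data can one frame even touch?
PEAK_BANDWIDTH_GB_S = 176.0  # PS4 GDDR5 theoretical peak (assumed figure)
FPS = 30

per_frame_gb = PEAK_BANDWIDTH_GB_S / FPS
print(f"Max data touchable per frame: {per_frame_gb:.2f} GB")  # -> 5.87 GB
```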
 
I wish that for one game, Naughty Dog or Sucker Punch would have unlimited power to use. Just imagine the quality of that game.

Ryse uses a 900p buffer... 1080p buffer almost doubles the size of a 900p buffer
2,073,600 / 1,440,000 ~ 2.1 / 1.4 = 1.5, i.e. 50% more. Unless something's off on my end.
 

sono

Member
Very interesting and impressive read, OP, thank you.

By the way, the first two slides in your post are identical (titled "what to do with 8GB").

(I agree with others that the implied 3.5GB for the OS is hard to understand; perhaps we can get a breakdown of that from Sony some day.)
 

slapnuts

Junior Member
I am just so happy we are finally into a new generation, and games like Infamous, Killzone, etc. so early in the lifecycle of the PS4 are already pretty stunning. I haven't had a whole lot of time with Infamous, being pretty busy lately, but I did have some more time to finish off Killzone's single player, and boy, that game is so freaking beautiful. The fact that it was a launch title, clearly rushed just to make it out for launch, blows my mind... same with Infamous coming a mere four or so months after the launch of the PS4; again, what we are seeing blows my mind.

I've seen many people doubt the abilities of these new consoles, especially the Xbox One, but in my opinion we never truly saw these GPUs fully taken advantage of on the PC side... and now we are seeing it on the console side, and it's spectacular. I can only imagine what some PC games will look like now that we are in this new generation, so it is great for those of us who are unbiased gamers hyped for both console and PC gaming.

I gotta say it again... owning a nice gaming PC and a PS4 truly is the ultimate gaming combination. Having my gaming PC connected to my 55-inch Panny plasma alongside my PS4 is freaking badass. My gaming PC does everything I need for a media center (one reason why I passed on the Xbox One), but the PS4's exclusives are going to blow minds... sheesh, and MLB The Show is one month away as well! My goodness... gaming bliss, baby!
 

furious

Banned
I wish that for one game, Naughty Dog or Sucker Punch would have unlimited power to use. Just imagine the quality of that game.


2,073,600 / 1,440,000 ~ 2.1 / 1.4 = 1.5, i.e. 50% more. Unless something's off on my end.

((2073600/1440000) - 1) * 100 = 44
1080p has 44% more pixels than 900p.

100 - ((1440000/2073600) * 100) ≈ 30.56
900p has about 30.56% fewer pixels than 1080p.

It's not hard math, why fuzz the numbers?

But you are right, "double" is significantly wrong.
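Spelled out as a quick sketch (nothing assumed beyond the two resolutions):

```python
# Pixel counts behind the percentages above.
px_1080p = 1920 * 1080  # 2,073,600
px_900p = 1600 * 900    # 1,440,000

print(f"1080p vs 900p: +{(px_1080p / px_900p - 1) * 100:.0f}% pixels")   # -> +44%
print(f"900p vs 1080p: -{(1 - px_900p / px_1080p) * 100:.2f}% pixels")  # -> -30.56%
```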
 
I have a 5-year-old FiOS DVR that constantly buffers 20 minutes of live HD footage and can record two separate HD streams at the same time, all while running the standard FiOS service OS, TV guides, etc. As OSes go it's nothing special, but the basic HD video recording has been nothing but rock solid. The DVR has a total of 256 MB of RAM.

And does it run a game like Second Son in real time alongside? Didn't think so.

And you have no idea how either of the two "DVR" solutions work on an OS level. Stop trying to make bizarre comparisons that have nothing to do with each other to build this ridiculous narrative that Sony dropped the ball with memory allocation based on nothing but pure conjecture and armchair analysis.

"Bu-bu-bu- some DVR kinda does something like this other OS does! They're like the same thing or something!" Give me a break.
 
I hope they give more of that RAM back to developers soon.

There is nothing in the OS, even running a game and a video service at the same time, that takes anywhere near 3.5GB. The original PS4 OS had a maximum RAM allocation of only 1GB (back when there was only 4GB of RAM total), and even then there were plans to reduce that footprint.
 
I am still amazed at what they did with God of War: Ascension and The Last of Us with only 512 MB of RAM. People who think 4.5 GB is not enough need to look back.
 

Filipus

Banned
I'm confused: they said the CPU was being used 100% of the time? Is that poor resource management (or rather, pretty good resource management that could be better), or am I lost?
 
Personally, I also think 3.5 GB is super bloated for an OS. But then, I don't really care about any of the other functionality. I hope they get it down.

The PS3 had the same thing, but its memory footprint was reduced over its lifetime. I guess it's fair to expect the same from the PS4? Also, 3.5 gigs of RAM are dedicated to the OS, yet there are times when the OS is laggy and icons take time to load. Does the RAM have anything to do with these issues?
 

astraycat

Member
I'm confused: they said the CPU was being used 100% of the time? Is that poor resource management (or rather, pretty good resource management that could be better), or am I lost?

Using 100% of the resources you are given is GOOD resource management, not bad. Game developers drool when they look at their performance captures and see no gaps in work. Any time there's an idle core (on the CPU or GPU), that's wasted potential work!

Of course, this is all from a game developer's POV. Application developers generally don't want their app eating up all available resources, since that causes multitasking systems to grind to a halt. But games are generally implicitly given all they can take, and they'd better use all that they're given!
 

system11

Member
How is a game using up all the RAM a good thing in any way? It's not a fantastically beautiful game, it's not an incredibly big or detailed world, the character models are good but not great, and there is no AI to speak of, so wouldn't that mean it's just poorly optimized?
Why brag about being bad at your job?

Depends why it's using all the RAM, really. As a silly example, I'm working on a server right now with 16 GB that is all in use, but that's optimal, since most of it is just being used as cache.
 
The PS3 had the same thing, but its memory footprint was reduced over its lifetime. I guess it's fair to expect the same from the PS4? Also, 3.5 gigs of RAM are dedicated to the OS, yet there are times when the OS is laggy and icons take time to load. Does the RAM have anything to do with these issues?

Yeah, I am hoping for that. If devs need more, they can have it.
 

Triple U

Banned
I hope they give more of that RAM back to developers soon.

There is nothing in the OS, even running a game and a video service at the same time, that takes anywhere near 3.5GB. The original PS4 OS had a maximum RAM allocation of only 1GB (back when there was only 4GB of RAM total), and even then there were plans to reduce that footprint.

It doesn't really matter what's in the OS now; they obviously have a roadmap on the table for the future that includes more RAM usage.

I really don't see why forum-goers and the like, who will probably never write a line of code for the PS4, get so invested in this. There hasn't been one developer, AFAIK, who's had anything negative to say about the PS4 memory situation; in fact, it's been the opposite. Please point out the developer who's pining for the marginal increase that a couple hundred megs of RAM would give.
 

cripterion

Member
A bit OT, but is anyone else's PS4 getting loud when playing this game? I thought the PS4 was quiet until this. The game is digital, btw.
 

Squozen

Member
I disagree. It's not a desktop operating system. They should be able to do a lot with 2 gigs or less. It's a games console and the games should be the focus, not the OS. Taking away available game RAM for OS stuff is not a good idea. Android doesn't need that and does more than the PS4's OS. I'll be honest and say that Sony is terrible at OS development when it comes to efficiency compared to MS and other companies.

Wait, you're comparing a mobile operating system that's had years of work to a console OS that's been out for four months?
 

geordiemp

Member
How is a game using up all the RAM a good thing in any way? It's not a fantastically beautiful game, it's not an incredibly big or detailed world, the character models are good but not great, and there is no AI to speak of, so wouldn't that mean it's just poorly optimized?
Why brag about being bad at your job?

Too right, we need more 720p and 900p games to show us that upscaling goodness on console, LOL...

Second Son is nice, looks good, and is technically impressive for a console game.

I agree in a way; it would be nice for the PS4 to put as much as possible toward gaming... the OS and other stuff is nice to have, but low in the pecking order.
 

aY227

Member
Eh, I did. And I came away with the same conclusion. You, who can't recall one piece of the game's source code and know nothing about why they made certain design decisions, can only make flawed comparisons to other games.

Not games, but the engine. CryEngine is a general-purpose, multiplatform engine [it's even used in the movie industry], so their solution must work for every case, not a selected one, so it must be the most optimal.
Now find me any source claiming that you need buffers that big for anything, and that they increase performance or precision to a noticeable degree.
Also, explain why both KZ:SF and The Order have smaller g-buffer setups.

Have you even read those slides about g-buffer packing I posted? Or are you just arguing semantics, because hey, it's a different dev and they must know best?
And sure, better to have the same pointless discussion about RAM allocation for the 30th time than an actual tech discussion here.
 

Triple U

Banned
Not games, but the engine. CryEngine is a general-purpose, multiplatform engine [it's even used in the movie industry], so their solution must work for every case, not a selected one, so it must be the most optimal.
Now find me any source claiming that you need buffers that big for anything, and that they increase performance or precision to a noticeable degree.
Also, explain why both KZ:SF and The Order have smaller g-buffer setups.

Have you even read those slides about g-buffer packing I posted? Or are you just arguing semantics, because hey, it's a different dev and they must know best?
And sure, better to have the same pointless discussion about RAM allocation for the 30th time than an actual tech discussion here.

I'd wager that an SCEI-backed studio has a firmer grip on PS4 optimization than Crytek. I don't need to find you anything, or read anything from whichever random developer you post. Unless Sucker Punch details their design choices, we won't know why they chose the paths they did; all we can do is make the same baseless conjectures and critiques that you are making now. Which I'm not interested in.
 

Oni Jazar

Member
Lots of people are saying that the OS is bloated or inefficient for using 3.5GB of RAM. It's not bloated; a lot of it is reserved. This system will be around longer than most cell phones and other devices, so it needs to be ready for anything. If the system were using all 3.5GB now, it would be difficult to add future features or give memory back to devs.
 
I'd wager that an SCEI-backed studio has a firmer grip on PS4 optimization than Crytek. I don't need to find you anything, or read anything from whichever random developer you post. Unless Sucker Punch details their design choices, we won't know why they chose the paths they did; all we can do is make the same baseless conjectures and critiques that you are making now. Which I'm not interested in.

Their G-buffer seems large for what it does, too. Pretty sure anyone can see that without calling their competence into question.
 

aY227

Member
Because of some arbitrary framebuffer numbers from random games?

Yeah, your theory is better: "it's big because it must be".

Give one reason why it should be, one with any scientific basis. If you don't have one, just stop arguing semantics.
 
How is a game using up all the RAM a good thing in any way? It's not a fantastically beautiful game, it's not an incredibly big or detailed world, the character models are good but not great, and there is no AI to speak of, so wouldn't that mean it's just poorly optimized?
Why brag about being bad at your job?

On the internet, nobody cares about your bullshit.

Would seriously love to see people like this put in a room with people they declare are bad at their jobs, and see how well their armchair programming actually holds up.

This is not opinion, it's just some drive-by hate :/
 

Triple U

Banned
Yeah, your theory is better: "it's big because it must be".

Give one reason why it should be, one with any scientific basis. If you don't have one, just stop arguing semantics.
My theory is that I don't have access to anything internal to Sucker Punch, so making arbitrary inferences based on random other titles and the decisions they made is flawed and will lead to fallacies.

The fact that you keep questioning me on why they did what they did shows just how much this simple theory is beyond you. It's a dead cause at this point; believe what you want, I don't care enough to keep wasting energy typing responses to you.
 

Nvzman

Member
I think that having 3.5 GB of RAM reserved is great.
It means that once Sony has the OS well in hand, developers can do even more with the hardware in the future.
I like to see games get better and better, not stay pretty much the same.
 

kaching

"GAF's biggest wanker"
And does it run a game like Second Son in real time alongside? Didn't think so.

And you have no idea how either of the two "DVR" solutions work on an OS level. Stop trying to make bizarre comparisons that have nothing to do with each other to build this ridiculous narrative that Sony dropped the ball with memory allocation based on nothing but pure conjecture and armchair analysis.

"Bu-bu-bu- some DVR kinda does something like this other OS does! They're like the same thing or something!" Give me a break.
Why are you getting so incensed about this? I wasn't trying to make an absolute 1:1 comparison, just providing perspective to define a reasonable upper bound on the RAM requirements for what is very well-established tech at this point. Besides, this may well be a moot point if the game DVR really is implemented entirely on the low-power ARM chip with its own embedded RAM, as others have speculated.
 
Very interesting and impressive read, OP, thank you.

By the way, the first two slides in your post are identical (titled "what to do with 8GB").

(I agree with others that the implied 3.5GB for the OS is hard to understand; perhaps we can get a breakdown of that from Sony some day.)
A breakdown wouldn't be very interesting. The OS was originally designed to run on 512MB, and it probably still could. Right now, that extra memory is mostly being used to improve performance. Think of it like upgrading the RAM in your computer. You can do everything you do now with 1GB, but everything just runs nicer with 4GB; you don't need to wait through a bunch of paging when you try to switch apps, you don't need to reload every time you hit Back in the browser, etc. Your workflow hasn't really changed; everything just goes more smoothly now. You may have access to a few apps you couldn't run before, but the main advantage is being able to run the new stuff alongside the old stuff without hampering overall system performance.
 

aY227

Member
My theory is that I don't have access to anything internal to Sucker Punch, so making arbitrary inferences based on random other titles and the decisions they made is flawed and will lead to fallacies.

The fact that you keep questioning me on why they did what they did shows just how much this simple theory is beyond you. It's a dead cause at this point; believe what you want, I don't care enough to keep wasting energy typing responses to you.

I question you because you assume there is a higher purpose to spending all that BW and memory on those buffers. It does not give them any meaningfully higher precision or performance that we know of.
The thing is that SP probably didn't need smaller ones and just didn't care, or they didn't have time to research it.

But hey, you can believe what you want. You can believe they're using a hidden dGPU for all I care.
 
I question you because you assume there is a higher purpose to spending all that BW and memory on those buffers. It does not give them any meaningfully higher precision or performance that we know of.
The thing is that SP probably didn't need smaller ones and just didn't care, or they didn't have time to research it.

But hey, you can believe what you want. You can believe they're using a hidden dGPU for all I care.
Even if you're right and they could have shrunk them without affecting performance, wouldn't skipping that effort just leave them more time to optimize elsewhere? Meaning, there is an advantage to the larger targets, right? Plus, it means there are still gains to be made later.
 

MXRider

Member
Hope no one posted this.

[image: exp.jpg]
 

astraycat

Member
I question you because you assume there is a higher purpose to spending all that BW and memory on those buffers. It does not give them any meaningfully higher precision or performance that we know of.
The thing is that SP probably didn't need smaller ones and just didn't care, or they didn't have time to research it.

But hey, you can believe what you want. You can believe they're using a hidden dGPU for all I care.

You know, it could just be that they found the performance hit not to be all that high when exporting to that many render targets. For one thing, I think GCN cards export pixels at full rate at up to 64 bits at a time on the console variants (I think the big variants like the 290X can actually export 128 bits at full rate), so if they can get away with exporting a 64-bit buffer (for their normals, for example) without a performance hit, then why not?

You also seem to be ignoring one of the golden rules of optimization: don't optimize things that don't need to be optimized. It could be that Crysis 3, since it was also targeting the PS3/360, had to really squeeze down in order to work well on those platforms. SP, on the other hand, was only targeting the PS4, found that the performance penalty of using so many render targets was negligible, and chose to spend their efforts optimizing elsewhere. I'm sure they have tools to visualize things like GPU wavefronts and how long they run, and would have been able to see counters indicating that their exports, or the loads in their deferred pass, were stalling the system.

The guys at Crytek are extremely good at what they do, but that doesn't mean that their way is the only way or the best way. It's just that their way was the best way for them.
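To put rough numbers on "negligible", here's a sketch comparing per-frame G-buffer write traffic for a slim versus a fat MRT layout at 1080p. The bytes-per-pixel figures are hypothetical layouts chosen for illustration, not Sucker Punch's or Crytek's actual formats:

```python
# Hypothetical G-buffer layouts; bytes per pixel are illustrative only.
PX_1080P = 1920 * 1080
PEAK_BW_GB_S = 176.0  # PS4 theoretical peak (assumed figure)
FPS = 30

def write_traffic_gb_s(bytes_per_pixel: int) -> float:
    """G-buffer fill traffic in GB/s at 30 fps (writes only, no overdraw)."""
    return PX_1080P * bytes_per_pixel * FPS / 1e9

slim = write_traffic_gb_s(12)  # e.g. 3x RGBA8 targets
fat = write_traffic_gb_s(24)   # e.g. 4x RGBA8 + a 64-bit target for normals

print(f"slim: {slim:.2f} GB/s, fat: {fat:.2f} GB/s")
print(f"extra cost: {(fat - slim) / PEAK_BW_GB_S:.1%} of peak bandwidth")  # ~0.4%
```

Under those assumptions, even doubling the G-buffer's write footprint costs well under 1% of peak bandwidth, which is consistent with a team deciding their optimization time was better spent elsewhere.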
 

Astral Dog

Member
I'm sure Sony did a good job with the design of the PS4, making it more future-proof. The amount of memory reserved for the OS is a little concerning, but games are already using 10x the RAM of the PS3, and Infamous looks amazing; there is plenty of memory for now, with the possibility of expanding it in the future.
Developers aren't complaining, and most people wouldn't have noticed whether it was using 6GB or 5GB. Besides, making the most of the available resources is very important, or you could end up with games that are unoptimized shit.
 

HTupolev

Member
You also seem to be ignoring one of the golden rules of optimization: don't optimize things that don't need to be optimized. It could be that Crysis 3, since it was also targeting the PS3/360, had to really squeeze down in order to work well on those platforms.
That's probably a huge part of it. You usually don't want to be totally rethinking your buffering scheme per platform, and using an extremely fat G-buffer format on PS360 is more in the realm of cartoonish supervillainy than game development. So the newer platforms also get slim G-buffers.

As a blatant example, Bungie has basically flat-out said that this is the reason Destiny uses a 96bpp G-buffer scheme. Their deferred model is a good compromise that lets the game render reasonably efficiently on any platform, while allowing near-720p results on 360 without resorting to tiling.
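The EDRAM arithmetic makes the "near-720p" part concrete. A minimal sketch, with the caveats that the exact render width is my assumption (a slightly sub-1280 width like 1152 is often cited for Destiny on 360) and that in practice depth also competes for EDRAM:

```python
# Does a 96bpp (12 bytes/pixel) G-buffer fit in the 360's 10 MB EDRAM?
EDRAM_MIB = 10

def gbuffer_mib(width: int, height: int, bytes_per_px: int = 12) -> float:
    return width * height * bytes_per_px / (1024 * 1024)

print(f"1280x720: {gbuffer_mib(1280, 720):.2f} MiB")  # ~10.55 MiB -> over budget
print(f"1152x720: {gbuffer_mib(1152, 720):.2f} MiB")  # ~9.49 MiB  -> fits (assumed width)
```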

I suppose The Order is a PS4 exclusive like ISS, but it's still hard to make a direct comparison because the overall visual goals and rendering model differ significantly.
 

aY227

Member
You know, it could just be that they found the performance hit not to be all that high when exporting to that many render targets. For one thing, I think GCN cards export pixels at full rate at up to 64 bits at a time on the console variants (I think the big variants like the 290X can actually export 128 bits at full rate), so if they can get away with exporting a 64-bit buffer (for their normals, for example) without a performance hit, then why not?
It's more about wasting BW.

You also seem to be ignoring one of the golden rules of optimization: don't optimize things that don't need to be optimized.
I did not ignore that; I included it in one of my posts.
http://www.neogaf.com/forum/showpost.php?p=106821170&postcount=309
And I agree that they probably just didn't need to, but that doesn't change the fact that in the future they will probably move to a more tightly packed G-buffer scheme. It's not like BW is plentiful on these consoles.
This whole discussion of mine started with a reply to a guy who read that Infamous uses 290MB for render targets and thought the Xbone would fall behind because of it. But the truth is that at 900p, with good G-buffer packing, it will fit into ESRAM, and the remaining render targets can be rotated through ESRAM or just used directly from DDR3.
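As a rough check of that claim, here is a sketch with a hypothetical packed layout (4x RGBA8 color targets plus a 32-bit depth/stencil, i.e. 20 bytes per pixel; not any shipping game's actual format):

```python
# Would a packed 900p G-buffer fit in the Xbox One's 32 MB ESRAM?
ESRAM_MB = 32
PX_900P = 1600 * 900

bytes_per_px = 4 * 4 + 4  # 4x RGBA8 + 32-bit depth/stencil (hypothetical layout)
gbuffer_mb = PX_900P * bytes_per_px / (1024 * 1024)
print(f"{gbuffer_mb:.1f} MB of {ESRAM_MB} MB ESRAM")  # ~27.5 MB -> it fits
```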

Even if you're right and they could have shrunk them without affecting performance, wouldn't skipping that effort just leave them more time to optimize elsewhere? Meaning, there is an advantage to the larger targets, right? Plus, it means there are still gains to be made later.

I've already talked about it in previous posts :)
 

Lord Error

Insane For Sony
Not games, but the engine. CryEngine is a general-purpose, multiplatform engine [it's even used in the movie industry], so their solution must work for every case, not a selected one, so it must be the most optimal.
While I can't find a good reason to think you're wrong in this buffer-size assessment, the position you're taking on CryEngine is one of blind faith. You can see this just by reading about the SMAA implementation in AC4, and how much extra work and deviation from Crytek's suggested solution it required. And that was work that went just into AA, something someone else had seemingly already figured out and documented well; despite that, it turned out to have a number of problems specific to what their game was doing. Or even by reading Crytek's own Ryse presentation, where it's clear they were thinking up solutions to problems as they went along, inventing things that weren't already available (or done well) in their engine, just because the new game had special requirements.

Also, to state the obvious: even if some of their solutions were made to work for every case, that only means those solutions are the most generalized, not that they're the most optimal. Strange choice of words there. That's like saying DirectX is the most optimal/fastest possible API just because it works in every scenario and with every GPU.
 