
F'DUPTON 3: Back in the Tub with 5.0/5.5/6/7/several Inches of RAM-Flavoured Water

BruceLeeRoy and Kagari backed up Thuway.

Hmmm...I had seen when BL did it, but I think I missed Kagari. If it's not a ton of trouble, do you have a link to his post?






Oh shit, sorry Kagari.

And thanks to all three of you :)
 

cchum

Member

 

Codeblew

Member
I am assuming

5GB + 512MB (virtual) + 512MB (physical) + 2GB OS

Virtual memory isn't real memory, so there is a missing 512MB of physical. It should total 8.5GB if 0.5GB is virtual.

Edit: I am assuming the virtual memory is disk, but as astraycat points out, that is not necessarily so.
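
Just to put numbers on that, here's a quick sanity check of the rumoured split being discussed here. Every figure below is thread speculation, not a confirmed spec:

Code:
#include <cstdio>

int main() {
    // Figures are the rumoured split from this thread, not confirmed specs.
    const double os_gb         = 2.0;   // rumoured OS reservation
    const double guaranteed_gb = 5.0;   // rumoured guaranteed game memory
    const double flexible_gb   = 0.5;   // rumoured "flexible"/virtual slice
    const double extra_phys_gb = 0.5;   // the extra physical slice Codeblew infers
    const double total_gb      = 8.0;   // physical GDDR5 in the console

    double sum = os_gb + guaranteed_gb + flexible_gb + extra_phys_gb;
    std::printf("accounted for: %.1f GB of %.1f GB physical\n", sum, total_gb);
    // If the 0.5GB "virtual" slice had no physical backing, the physical
    // pieces would only cover 7.5GB -- that's the missing 512MB above.
    return 0;
}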
 

astraycat

Member
Virtual memory isn't real memory, so there is a missing 512MB of physical. It should total 8.5GB if 0.5GB is virtual.

For virtual memory swappable to disk, sure, you can have more of it allocated than actually fits into physical memory.

But virtual memory encompasses much more than swappable memory, so it's not necessarily the case.
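
A minimal POSIX-style sketch of that distinction (illustrative only - this is generic Linux, not anything from the actual console SDK): a process can hold a big chunk of virtual address space with no swap and no physical pages behind it until it actually commits them.

Code:
#include <sys/mman.h>
#include <cstdio>

int main() {
    const size_t reserve = 4ull << 30;  // reserve 4GB of virtual address space

    // PROT_NONE + MAP_NORESERVE: no physical pages and no swap are committed,
    // yet this range is "virtual memory" in every meaningful sense.
    void* base = mmap(nullptr, reserve, PROT_NONE,
                      MAP_PRIVATE | MAP_ANONYMOUS | MAP_NORESERVE, -1, 0);
    if (base == MAP_FAILED) { std::perror("mmap"); return 1; }

    // Later, back just the pieces you actually need with real pages:
    mprotect(base, 1 << 20, PROT_READ | PROT_WRITE);  // commit 1MB on demand

    munmap(base, reserve);
    return 0;
}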
 

Xyber

Member
If these menus are laggy on either console now, they've messed up. They better have one hell of a smooth user experience on these consoles.

Can't even imagine how bad the situation would've been if we only got 4GB (and people thought that was too much) if the OS is gonna use that much.
 

Vestal

Gold Member
So we're officially back to 6 GB?

That was just my guess from day one. :)

Nothing exactly official.

It's the working assumption of one side of the coin, with the other being that of the DF article.


Either way, we probably won't get the exact details for sure, and at the end of the day it won't matter squat really. It's all just for internet console E-pen waving.
 

Vestal

Gold Member
If these menus are laggy on either console now, they've messed up. They better have one hell of a smooth user experience on these consoles.

Can't even imagine how bad the situation would've been if we only got 4GB (and people thought that was too much) if the OS is gonna use that much.

No way that either console OS is laggy. It would be a catastrophe of massive proportions.. bear+salmon*infinity.
 

FINALBOSS

Banned
Nothing exactly official.

It's the working assumption of one side of the coin, with the other being that of the DF article.


Either way, we probably won't get the exact details for sure, and at the end of the day it won't matter squat really. It's all just for internet console E-pen waving.

That's the most important part.

I don't care how my games will look...as long as they look better than yours!
 

Fafalada

Fafracer forever
astraycat said:
They can put non-critical latency-tolerant stuff in paged memory, and put everything else that you can't gracefully handle a page fault on in the 4.5GB of guaranteed-resident memory.
Virtually nothing in realtime applications can "gracefully" handle a page fault. Or more specifically - if you want it to be graceful, it requires exact/predictable latency, which is the antithesis of general-purpose virtual-memory swap mechanisms.
PC games are the worst example of this - they end up full of stuttering, popping and hiccups whenever the OS decides to do something with game-controlled pages.

Sure - virtually mapping (near infinite) disk storage into your memory space is a neat thing on paper, but for data that actually matters in a realtime app it's a very difficult problem to solve without degrading the player experience considerably, which is why most streaming schemes in games are very limited in how they work, or when they are more ambitious (e.g. Rage) they come under fire for their fail-cases. OS-managed virtual memory that has no context of game state can only worsen those problems.
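
For what it's worth, the usual mitigation on the game side is to pin the latency-critical working set so the OS can never page it out mid-frame. A rough POSIX-style sketch (the consoles expose their own, different APIs for this, so treat it as illustration only):

Code:
#include <sys/mman.h>
#include <cstdio>
#include <cstring>

int main() {
    const size_t critical_bytes = 64ull << 20;  // e.g. 64MB of frame-critical data
    void* buf = mmap(nullptr, critical_bytes, PROT_READ | PROT_WRITE,
                     MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (buf == MAP_FAILED) { std::perror("mmap"); return 1; }

    // mlock keeps these pages resident, so no disk-backed fault mid-frame.
    if (mlock(buf, critical_bytes) != 0) {
        std::perror("mlock");  // can fail under RLIMIT_MEMLOCK without privileges
    }
    std::memset(buf, 0, critical_bytes);  // fault the working set in up front

    // ... per-frame work that cannot tolerate unpredictable paging latency ...

    munlock(buf, critical_bytes);
    munmap(buf, critical_bytes);
    return 0;
}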
 
Nothing exactly official.

It's the working assumption of one side of the coin, with the other being that of the DF article.


Either way, we probably won't get the exact details for sure, and at the end of the day it won't matter squat really. It's all just for internet console E-pen waving.

But I thought the DF article was officially wrong... The title just screams skepticism.

If multiple developers are saying they are using 6GB, shouldn't that be enough to prove things wrong? I guess 5GB to 6GB isn't so bad... It just doesn't make sense... unless this OS is an un-optimized mess. That worries me more than the RAM allocation itself. :-(
 
Nothing exactly official.

It's the working assumption of one side of the coin, with the other being that of the DF article.


Either way, we probably won't get the exact details for sure, and at the end of the day it won't matter squat really. It's all just for internet console E-pen waving.

Rational in a sea of chaos.

E-fistbump.
 

astraycat

Member
Virtually nothing in realtime applications can "gracefully" handle a page fault. Or more specifically - if you want it to be graceful, it requires exact/predictable latency, which is the antithesis of general-purpose virtual-memory swap mechanisms.
PC games are the worst example of this - they end up full of stuttering, popping and hiccups whenever the OS decides to do something with game-controlled pages.

Sure - virtually mapping (near infinite) disk storage into your memory space is a neat thing on paper, but for data that actually matters in a realtime app it's a very difficult problem to solve without degrading the player experience considerably, which is why most streaming schemes in games are very limited in how they work, or when they are more ambitious (e.g. Rage) they come under fire for their fail-cases. OS-managed virtual memory that has no context of game state can only worsen those problems.

I was thinking more along the lines of asynchronously doing some sort of calculation in another thread on some large dataset in swappable memory, not needed for several frames or even seconds.

Certainly nothing you'd need at anything near real-time rates.
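
Something like this toy sketch, presumably - fire off work over a large, pageable dataset on another thread and only collect the result many frames later (names and sizes here are made up for illustration):

Code:
#include <future>
#include <numeric>
#include <vector>
#include <cstdio>

int main() {
    // Pretend this lives in the pageable/"flexible" portion of memory.
    std::vector<float> huge_dataset(100000000, 1.0f);  // roughly 400MB

    // Launched asynchronously: a page fault here only stalls this worker,
    // never the render thread.
    auto result = std::async(std::launch::async, [&] {
        return std::accumulate(huge_dataset.begin(), huge_dataset.end(), 0.0);
    });

    // ... main loop keeps rendering frames, nobody blocks on this yet ...

    double sum = result.get();  // collected seconds later, whenever it's done
    std::printf("sum = %f\n", sum);
    return 0;
}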
 

Wynnebeck

Banned
All I can do is laugh at this thread. Naughty Dog was able to make The Last of Us with 512 MB of RAM. Ponder that for a second.

Sony's first party is going to melt faces this next gen if these launch titles are any indication. I really want to see more info about The Order as well.
 
Sony's first party is going to melt faces this next gen if these launch titles are any indication. I really want to see more info about The Order as well.

It's one reason why this RAM fiasco is a total non-issue. Even if it's only using 5 GB (and every indication points to them getting more), they are still using GDDR5, and their devs are still coming off of a console that was a pain in the ass to develop for and had limited memory. So yes, the games are going to melt faces. Naughty Dog's games are among the best looking in console gaming, and look at what they had to work with.
 
Sony's first party is going to melt faces this next gen if these launch titles are any indication. I really want to see more info about The Order as well.

I heard you can Super Size your Order for just a buck.

Ugh, I'm tired. Bedtime for me.

Goodnight GAF, you magnificent bastards!
 
I was thinking more along the lines of asynchronously doing some sort of calculation in another thread on some large dataset in swappable memory, not needed for several frames or even seconds.

Certainly nothing you'd need at anything near real-time rates.

Any info on whether there is a memory bandwidth cap for devs? i.e., X is the max amount of bandwidth they can count on at any given time.
 

Lemnisc8

Member
'tis truly a fantastic day to join Gaf.

Now I need to break myself in and go and find some tub gifs.

Whilst I'm here, I'll add that I have no problem with these memory allocations - bring on the games!

∞
 
I'm just speculating unfortunately :(

Fair enough, but Sony is still going to have to reserve some of that bandwidth for real-time OS services. Just wondering if it will put a dent in the actual usable memory bandwidth for the devs. 2-3GB of data doesn't just magically appear in RAM without crossing the bus.
 

astraycat

Member
Fair enough, but Sony is still going to have to reserve some of that bandwidth for real-time OS services. Just wondering if it will put a dent in the actual usable memory bandwidth for the devs. 2-3GB of data doesn't just magically appear in RAM without crossing the bus.

My guess would be that the OS services are mostly idle during the actual execution of a game, and so borrow very little of the available bandwidth. Game devs would cry bloody murder otherwise.
 
My guess would be that the OS services are mostly idle during the actual execution of a game, and so borrow very little of the available bandwidth. Game devs would cry bloody murder otherwise.

They would still have to reserve it since many of the OS services are available during gameplay, i.e. remote play, 15-min playback, background downloads, crossplay chat, etc. Otherwise fps would drop every time a service was used.
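
Back-of-the-envelope version of that worry (the 176GB/s number is the commonly cited GDDR5 peak; the per-service costs below are pure placeholders, not measurements):

Code:
#include <cstdio>

int main() {
    const double peak_gbps         = 176.0;  // commonly cited GDDR5 peak bandwidth
    const double video_encode_gbps = 1.0;    // placeholder: 15-min gameplay recorder
    const double remote_play_gbps  = 0.5;    // placeholder: remote play encode/stream
    const double downloads_gbps    = 0.1;    // placeholder: background downloads

    double reserved = video_encode_gbps + remote_play_gbps + downloads_gbps;
    std::printf("OS services: %.1f GB/s (%.1f%% of peak), leaving %.1f GB/s\n",
                reserved, 100.0 * reserved / peak_gbps, peak_gbps - reserved);
    return 0;
}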
 

Fafalada

Fafracer forever
astraycat said:
I was thinking more along the lines of asynchronously doing some sort of calculation in another thread on some large dataset in swappable memory, not needed for several frames or even seconds.
Now you only need things like that in the context of games. Basically you're talking about a solution looking for problems to solve. And PC has had decades of looking for those without terribly noteworthy things to point out.

CoG said:
Downloading...
Step 1/5: Installing Microsoft Direct X for Windows (even though it's been installed 100x already)
Step 2/5: Installing Microsoft Visual C++ runtime libraries
Step 3/5: Installing XNA
Step 4/5: Some other PC shit
Step 5/5: Some other PC shit
Launch game.
Change video settings.
Play game for 2 mins
Tweak video settings.
It's fine for people like you and me, but it's never going to be as easy as the consoles for normal people. Not gonna happen.
The sad part here is that removing steps 1-5 is actually a solved problem (on a tech level); whether Valve (or MS) will ever decide it's worth giving this to the players is another matter though.
 
Gyskface is my buddy. He's legit. I know his secrets. ;)

I wonder what he means by the virtual memory being decreased from 1GB to 0.5GB...does that mean 0.5GB more is allocated to physical memory (not flexible)?

Wish he would elaborate on his statement; all it really confirms is that Eurogamer's info may be out of date.
 

MikeDown

Banned
Couldn't even at least skim the thread, huh?
Nope, just wasn't feeling it. And I think, to be fair, it is a lot to skim read.



When people rail against Vista, they don't understand why it was bad. Vista was awful because:

- at the time most computers had like half a GB of RAM.
- Vista had huge driver issues, so shit didn't work

It felt bloated because there wasn't enough RAM in most systems, so performance was shit. But if you use Vista on a modern computer with 4GB of RAM, it runs fine.
Agree 100%.
 