
The Witness using 5GB RAM so far

Status
Not open for further replies.

kitch9

Banned
There is the idea that using RAM which you might do without is bad, even if it is a cache you can relinquish on a whim. See the way people read Windows' Task Manager memory usage vs what they want to see in their Linux-based distro setup (the "oh, it is only using 1 GB out of 12 GB, 11 GB free!!!!" thing).

People who think like that could be defined as clueless, wouldn't you agree?
 

WolvenOne

Member
Next next gen probably won't be out by the end of the decade.

So, Nintendo's going to stick with the Wii U until the '20s? I doubt that, especially if they can't get the hardware to sell better.

I'd wager Nintendo ends up replacing their console around 2016 or 2017. I doubt they'd make a console that was immensely more powerful than the PS4, but once Nintendo releases new hardware, it's only a matter of time till Sony and Microsoft respond.
 

Chev

Member
Sounds unoptimized

He made a whole presentation on the subject a few years ago: you can choose between building a game or building a gigantic behemoth of optimization, as an indie team you rarely have the resources to do both. It's as simple as that.
 

JawzPause

Member
We will see games within the next 10-20 years needing 1 TB of RAM, and some
developers (me? ;)) will come around saying it's not enough, perhaps two or
three would be better.

One of the problems is that memory consumption isn't a linear function of the
problem size. For images it's n² growth (width × height). For volumes it's
already n³. In one of those expected futures we will see some games where
almost all the objects or surroundings are (physical) volumes. Just doubling
their resolution, (2n)³, yields 8n³. So one would need an 8x increase in
memory just for doubling the resolution. Hence, in the future we will see
much faster memory consumption than we have today. Techniques like adaptive
refinement (within a 3D material volume) will become key algorithms to
counteract this rapid increase in memory consumption.
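The doubling arithmetic above can be sanity-checked in a few lines of Python. The 256/512 resolutions and 4 bytes per voxel are illustrative assumptions, not figures from any actual game:

```python
# Memory for a dense voxel volume grows with the cube of its resolution,
# so doubling the resolution in each axis multiplies memory use by 8.

def volume_bytes(n, bytes_per_voxel=4):
    """Bytes needed for a dense n x n x n voxel volume."""
    return n ** 3 * bytes_per_voxel

base = volume_bytes(256)      # 256^3 voxels -> 67108864 bytes (~64 MB)
doubled = volume_bytes(512)   # doubled resolution -> 536870912 bytes (~512 MB)
print(doubled // base)        # 8
```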

So with each next-gen, the memory available is eaten away faster, since the
problem size doesn't stay the same. That's perhaps the observation made by
Mr. Blow.

Na, we'll be using Quantum computers! (hopefully)
 

Krilekk

Banned
[Images: screenshots of Jonathan Blow's tweets]


https://twitter.com/Jonathan_Blow

The message is: If my game with only one square mile of content already eats up 5 GB, imagine what a next gen GTA will need. PS4 exclusive confirmed.

Another interpretation: Witness only uses 5 GB of the 8 GB PS4 RAM. X1 port incoming.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
I hope you are kidding....

You are kidding aren't you?

If devs want to push the graphics on next gen devices as much as possible then they are going to need to use streaming technology.

With the speeds of modern hard drives and the memory available on these next-gen machines, there is no reason why we shouldn't have high-fidelity huge worlds with full object persistence (bullet holes, destruction, etc.) and zero load times after the initial load.

That's what I expect from next gen games.

So no. I'm not kidding. Are you?
 
Isn't he also bringing this game to iOS? I understand the concept of "if you're guaranteed to have it, use it", but if you're going to have to accommodate weaker tech as well then surely that's your bottleneck when it comes to design.
 
I wish them luck getting it running on iOS. I'm pretty sure they won't have 5 gigs to fill up.
But then again, they could just dump the graphics down and use 512x512 textures instead of 2k x 2k.
That should cut texture memory by a factor of 16.
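The factor-of-16 claim checks out: raw texture memory scales with the square of the per-axis resolution, so 2048 to 512 per axis is a (2048/512)² = 16x reduction. A quick sketch, assuming 4 bytes per pixel and ignoring mipmaps and compression:

```python
def texture_bytes(width, height, bytes_per_pixel=4):
    """Raw, uncompressed memory for one texture."""
    return width * height * bytes_per_pixel

hi = texture_bytes(2048, 2048)   # a "2k x 2k" texture
lo = texture_bytes(512, 512)
print(hi // lo)                  # 16
```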
 

TEH-CJ

Banned
I don't understand what his problem is. The PS3 has 512 MB of RAM and it produced some of the most incredible looking and playing games this gen.

Surely 8 GB of RAM is more than adequate to make a substantially better game. It just has to be..
 

SappYoda

Member
I think there's a general misconception of what "optimized" means.

I'm not going to go into details, so I'll try to simplify it with an example: you can trade memory for speed, using more memory to make the game run faster. So those who are worried about the game performing badly because it uses more RAM can relax.

The confusion comes from the PC context, where maximizing the number of apps a system can run at the same time is important. But on a system like a console, where all the resources are dedicated to a single application, using all the resources is actually a good thing and should be taken as being optimized.
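The classic concrete form of that memory-for-speed trade is a precomputed lookup table: spend RAM once so each query is a cheap array read instead of a recomputation. A minimal sketch, where `math.sin` stands in for any expensive per-frame function and the table size is arbitrary:

```python
import math

TABLE_SIZE = 4096
# Pay the memory cost up front...
SIN_TABLE = [math.sin(2 * math.pi * i / TABLE_SIZE) for i in range(TABLE_SIZE)]

def fast_sin(angle):
    """Approximate sin(angle) with one table read instead of recomputing."""
    i = int(angle / (2 * math.pi) * TABLE_SIZE) % TABLE_SIZE
    return SIN_TABLE[i]
```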
 

JaseC

gave away the keys to the kingdom.
I don't understand what his problem is. The PS3 has 512 MB of RAM and it produced some of the most incredible looking and playing games this gen.

Surely 8 GB of RAM is more than adequate to make a substantially better game. It just has to be..

There isn't a problem. The point is that the amount of RAM the next-gen consoles are offering makes his job a lot easier.
 

IvorB

Member
I'm not sure people understand what optimisation involves. It's tweaking the code to allow it to run optimally on a system. If there is loads of RAM on the system then just use it. If you have the RAM, you might as well use it rather than spending your time doing "optimisation" to reduce RAM footprint. In fact, if you cut down RAM consumption by introducing streaming or whatnot, I would argue that's unoptimised, because you are not making best use of system resources.
 
There is more than one type of optimization. People seem to assume it means optimizing for RAM usage, which is sort of understandable as that was a major concern this gen.
But a more important optimization for games is to optimize for SPEED. In this case, you just go mad with what RAM you have to make the game run as well as possible.
So you can do crazy stuff like putting a gig's worth of pre-calculated tables in there, or preloading everything to RAM just to avoid some HDD or BD access.

"Sounds unoptimized" is a ridiculous reaction!
 

v1oz

Member
He's basically using all the RAM just because it's there. Which is the lazy route, you just end up with inefficient code that's a memory hog. There will always be better coders out there who can make sure every single meg counts, resulting in technically superior games with less resource utilisation and better performance.

That's why I've always preferred coders with a hacker demo scene background. Old school guys like the people from Factor 5 who really pushed hardware to do things you never thought it was capable of. They'd do amazing things with just 2MB of RAM on the Amiga - 5GB would just lend them near infinite creative possibilities.
 
There is more than one type of optimization. People seem to assume it means optimizing for RAM usage, which is sort of understandable as that was a major concern this gen.
But a more important optimization for games is to optimize for SPEED. In this case, you just go mad with what RAM you have to make the game run as well as possible.
So you can do crazy stuff like putting a gig's worth of pre-calculated tables in there, or preloading everything to RAM just to avoid some HDD or BD access.

"Sounds unoptimized" is a ridiculous reaction!

You know, it's hard to say without benchmark numbers.
Given how GDDR5 does have some more latency, maybe the solution to the problem can be calculated within 20 cycles instead of waiting, let's say, 30 cycles before you get it from memory.
 
I'm actually really happy to hear this; the game probably won't have a single loading screen and will be a very Journey-like experience, I hope.
 

plainr_

Member
That's his point really.

Hardware can save development time by allowing you to side step optimisation, if it's powerful enough / has more capacity / whatever.

It's how more powerful hardware can actually make development cheaper.

That's exactly how I read it. I honestly expected a much more cheerful thread.
 
I just realized I made a mistake in that post: being GDDR wouldn't help loading times, since that's bound by the disk. Loading in 8 GB all at once would be rough, but he'll probably have several 'first-time' loading screens when switching between areas that won't appear again when switching areas in the future. Or something like that.

I'll actually edit this in my other post. Ah, nvm, it still makes sense in another way.
Surely he could just load the area the player was in when he saved first, then load the rest of the island in the background, no?
(...)
Now you added 'small'. But I think even small teams can fill up 4, 5, 6 GB
of RAM. There are many small indie teams out there, all filled up with kind
of professional guys and gals. Indie != incapable. ;)
But being a small team does mean that it takes a long time to produce high-quality assets, meaning that you most likely don't have the time/budget to make enough of them to fill up too much space.
 

PaulLFC

Member
Surely the only way 8GB "isn't much" is if you haven't optimised your game. What games on PC require over 8GB to function? None?
 

Chev

Member
He's basically using all the RAM just because it's there. Which is the lazy route, you just end up with inefficient code that's a memory hog. There will always be better coders out there who can make sure every single meg counts, resulting in technically superior games with less resource utilisation and better performance.

Again, he made a presentation on the subject, and the rationale is: if it'll run fine unoptimized, optimizing it is a waste of dev time that could be spent on making the game better. The hacker type of developer will do it because they like optimizing crazy things, but that's just escalating your costs unless they don't count their overtime (and as it happens, hacker types usually don't) and/or you're ready to let your budget swell (not an option for indies who want to survive).

That's a good practice that's been known in software development outside the games industry for years: never optimize something unless you've got a metric saying it's necessary, based on your minimum specs. If he's OK with targeting systems that have 5+ GB, it's perfectly fine to have the game gobble up all that. That's why developers have been asking for more ram, so they can stop inane contortions to fit games in it.

Surely the only way 8GB "isn't much" is if you haven't optimised your game. What games on PC require over 8GB to function? None?
"We'll never need more than 640K of ram!"
 

eot

Banned
He's basically using all the RAM just because it's there. Which is the lazy route, you just end up with inefficient code that's a memory hog. There will always be better coders out there who can make sure every single meg counts, resulting in technically superior games with less resource utilisation and better performance.

That's why I've always preferred coders with a hacker demo scene background. Old school guys like the people from Factor 5 who really pushed hardware to do things you never thought it was capable of. They'd do amazing things with just 2MB of RAM on the Amiga - 5GB would just lend them near infinite creative possibilities.

Yeah, he should totally place narrow corridors everywhere so the player has to walk more slowly to accommodate streaming of the next level, even though he can actually fit it all in RAM at once. That's optimization!

Also what Chev said, optimizing when you don't need to is a huge waste of time.
 
Or he could just load the area the player was in when he saved first, then load the rest of the island in the background, no?

That is probably what he is doing anyway.
He probably has to have a streaming engine in the game because it's on iOS. And I don't think iOS devices will let you use 5 GB; more like 50-100 MB, if I'm not mistaken, 10% of device memory.
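The "load the save area first, stream the rest in the background" idea from the last few posts can be sketched with a worker thread. The area names and the fake `load_area` are illustrative; `time.sleep` stands in for disk I/O:

```python
import threading
import time

loaded = set()

def load_area(name):
    time.sleep(0.01)      # stand-in for disk I/O
    loaded.add(name)

def start_game(save_area, other_areas):
    load_area(save_area)  # block only on the area the player saved in
    streamer = threading.Thread(
        target=lambda: [load_area(a) for a in other_areas],
        daemon=True,
    )
    streamer.start()      # the rest of the island loads behind the scenes
    return streamer
```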
 

kitch9

Banned
If devs want to push the graphics on next gen devices as much as possible then they are going to need to use streaming technology.

With the speeds of modern hard drives and the memory available on these next-gen machines, there is no reason why we shouldn't have high-fidelity huge worlds with full object persistence (bullet holes, destruction, etc.) and zero load times after the initial load.

That's what I expect from next gen games.

So no. I'm not kidding. Are you?

So why does this guy need to use streaming again?

You can use all the RAM (which, as a storage pool, is hundreds of times faster than an HDD, so that would be a smart idea) and streaming. Saying stuff like "Not using streaming, so unoptimised LOL" is clueless.
 

TheD

The Detective
If devs want to push the graphics on next gen devices as much as possible then they are going to need to use streaming technology.

With the speeds of modern hard drives and the memory available on these next-gen machines, there is no reason why we shouldn't have high-fidelity huge worlds with full object persistence (bullet holes, destruction, etc.) and zero load times after the initial load.

That's what I expect from next gen games.

So no. I'm not kidding. Are you?

Hard drives have not gotten much faster over the last few years, and storing the data in memory (RAM) is what The Witness is doing!
RAM is the last level of memory before the caches; it is the working memory. You do not stream data from it to anything else, other than when the CPU and GPU read it to calculate a frame!
You know, it's hard to say without benchmark numbers.
Given how GDDR5 does have some more latency, maybe the solution to the problem can be calculated within 20 cycles instead of waiting, let's say, 30 cycles before you get it from memory.

No, it is not hard to say!

It is going to be stored in RAM or loaded off a disk drive: up to 172 billion bytes per second vs 20 to 100 (ish) million bytes per second (which then has to pass via the RAM anyway!), and nanosecond-scale latency vs tens of milliseconds of latency!
Really fucking clear cut!
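Back-of-the-envelope arithmetic for that comparison, using the bandwidth figures quoted in the post (not measured benchmarks) and a 5 GB working set:

```python
GAME_BYTES = 5 * 1024**3      # ~5 GB of game data

RAM_BW = 172e9                # ~172 GB/s, the quoted GDDR5 figure
HDD_BW = 100e6                # ~100 MB/s, an optimistic HDD figure

print(GAME_BYTES / RAM_BW)    # ~0.03 s: touching it all in RAM is nearly free
print(GAME_BYTES / HDD_BW)    # ~54 s: reading it all from disk is not
```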
 

Jac_Solar

Member
That just means they are too lazy / not skilled enough to optimize it.

What's the point, as long as it doesn't interfere with the framerate or hit the cap? Wouldn't you say it's preferable to just load everything in at once when it works perfectly well, instead of complicating things for no reason? Why would he try to optimize when he's only at 5 GB and the system has 8? Just for the sake of it? That would be very inefficient.

This also means that there's no pop-in, very little loading, etc.
 

Vinci

Danish
I think the problem with many people here is that some are technologically savvy just enough to overthink his comments but not savvy enough to understand him. Meanwhile, I'm technologically dumb enough to get what he's saying immediately, as his message is one that is fairly true throughout many tasks and/or jobs...

"Use what you got, especially if it makes your life a bit easier and the job a bit less grueling." It's not about optimization/efficiency on the hardware side, it's about optimization/efficiency on the human side. He just seems thankful that this trade-off can be made without leading to problems with the game.
 
Optimization takes time and brings bugs. Sometimes there is no room for optimization; sometimes it's not worth it. Why must devs optimize if they have the resources? This is not PC, where the range of hardware is endless.
 
I fully understand all the hate around the Xbox One and I share a lot of it, but I honestly feel that he makes these comments to stir up the fanboys and keep his game in the light.

Yeah, it does sound like he makes some of these comments to emphasise how much he hates MS. Bitching just for the sake of bitching won't win him any fans.

Although if he wants to make use of all the memory available at once, who cares how he does that? Surely it's inefficient development to spend time optimising if you don't need to?!
 

nasos_333

Member
5 GB for the graphics of The Witness?

Must be a misunderstanding, I am sure.

Maybe 5 Gbits?

I think the problem with many people here is that some are technologically savvy just enough to overthink his comments but not savvy enough to understand him. Meanwhile, I'm technologically dumb enough to get what he's saying immediately, as his message is one that is fairly true throughout many tasks and/or jobs...

"Use what you got, especially if it makes your life a bit easier and the job a bit less grueling." It's not about optimization/efficiency on the hardware side, it's about optimization/efficiency on the human side. He just seems thankful that this trade-off can be made without leading to problems with the game.

Yet it is silly to load 5 GB into your RAM when you could easily load a lot less.
 
It is going to be stored in RAM or loaded off a disk drive, upto 172 Billion bytes per a second vs 20 to 100 (ish) million bytes per a second (that then have to pass via the RAM anyway!) and nanosecond latency vs 10's of milliseconds of latency!
Really fucking clear cut!

Seriously?
More like 100s of ns of main-memory latency vs 100,000s of ns for an SSD read.
I was way too generous saying 30 cycles; it's more like 150+ cycles, which is quite a lot of cycles you could spend on computation vs waiting on memory to use a lookup table.

Of course, I took caches out of the equation to simplify a bit, and it depends on what kind of algorithm is being used. That is why benchmarking is important.
 