
Digital Foundry at GDC: Inside the PlayStation 4

Now that most of the main hardware specifications have been described in fair detail, I only need two remaining confirmations: the CPU clock speed, and how much of the 8GB of RAM is actually available to developers for games. From all their previous interviews and statements, Sony seems to suggest ALL 8GB of RAM is available to devs for their games, so could that mean that all OS-related functions (operated by supplemental processors, like the ARM processor previously mentioned) would use a separate pool of memory, like flash memory?
 
The only time I have ever used the analogue feature on face buttons was for changing the speed of binocular zooming... in Metal Gear Solid 2.

Good to see that we'll benefit from getting rid of it.
 
What parts specifically? Guerrilla said it's a locked 30fps, and Digital Foundry's analysis said the same.

Maybe it's my MacBook Pro or YouTube. I hope so. I'm expecting great things from this game.
 
I don't know. There is something wrong here. I suspect the video is being encoded using the internal 'share' feature, which uses the onboard PS4 video encoders. The actual gameplay might be sub-30fps, but it will still get encoded at 30fps, and a simple framerate analysis would report a rock-solid 30fps. But as you can see, there seems to be some minor framerate fluctuation and stutters.
That's not how it works.
 
This is all well and good but will games on the PS4 support Deep Color?
And why in the hell hasn't Sony forced the BDA to accept xvYCC and Deep Color?
 
Don't know how accurate these are, but it never dips below 30fps. http://www.youtube.com/watch?v=Hv2ofBB9O9E

Look around 4:40 when watching the 720p feed. I can definitely see the video jump as the guy turns around to the left. Can anyone else see it?

Actually, as an example, there are quite a few moments after that point where the framerate appears to stutter and slow down, but the analysis says it's a solid 30fps.
 
I'm doubting that chunk loading is going to be pulled off as smoothly as Sony would like to make it sound. As the DF article points out, game demos by themselves already clock in at multiple GBs in many cases. It seems like the ability to seamlessly download and install in the background would really require rather significant compromises, at least in how the initial game experience plays out (see the sketch below).

Or maybe every PS4 game is just going to come standard with a playable Galaga load screen.
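For what it's worth, here's a toy sketch of the kind of scheme they might mean. Every chunk name, size, and the bandwidth figure is invented; none of this comes from Sony:

```python
# Hypothetical play-as-you-download: the game ships as ordered chunks,
# play can start once a small "boot" chunk is down, and the rest keeps
# streaming in the background. All names and numbers are made up.

def download(chunks, bandwidth_mb_s=10):
    done, t = [], 0.0
    for name, size_mb in chunks:
        t += size_mb / bandwidth_mb_s       # seconds spent on this chunk
        done.append(name)
        if name == "boot":
            print(f"playable after {t:.0f}s with {done}")
    print(f"fully installed after {t:.0f}s")

download([("boot", 800), ("campaign", 12_000), ("multiplayer", 6_000)])
# -> playable after 80s, fully installed after 1880s
```

The catch is exactly the one above: whatever fits in that first chunk defines the whole initial experience.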
 
The OP implies my wallet may cry... then I reminded it that we won't be getting the PS4 at launch, and it thanked me profusely.
 
I wasn't really talking about the encoder. It may very well output at 30fps, or whatever. I'm talking about DF's reader; it actually counts individual frames.

Which is my point. The GPU might have rendered frames at sub-30fps, but the video encoder will simply grab the nearest frame and encode it at 30fps. It's possible that the encoder could encode a single frame from the GPU as two frames in the 30fps stream, or even miss a frame.

The DF analyser will just count the frames over a period of time and report a solid 30fps, even if the underlying framerate of the original pre-encoded stream was less than 30fps.

I hope that makes sense.
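If it helps, here's a toy simulation of what I mean (made-up timings, obviously not the actual PS4 encoder): the GPU finishes frames at its own rate, and a fixed 30fps encoder just grabs the most recent finished frame at every tick.

```python
# Toy model: a sub-30fps GPU feeding a fixed-rate 30fps encoder that
# always grabs the newest finished frame, duplicating when it has to.

def simulate_encode(render_times, duration, encoder_fps=30):
    """render_times: timestamps (s) at which the GPU finished a frame."""
    encoded = []
    for tick in range(int(duration * encoder_fps)):
        t = tick / encoder_fps
        # newest frame finished at or before this encoder tick
        latest = max((i for i, rt in enumerate(render_times) if rt <= t),
                     default=0)
        encoded.append(latest)
    return encoded

# the GPU only finishes 25 frames over one second
render_times = [i / 25 for i in range(25)]
stream = simulate_encode(render_times, duration=1.0)
print("frames in encoded stream:", len(stream))      # 30 -> "solid 30fps"
print("unique source frames:   ", len(set(stream)))  # 25 -> the real rate
```

The container always carries 30 frames per second, so counting container frames alone tells you nothing about the real render rate.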
 
Quotes sound interesting so far, can't wait to read the full article tonight.

"If you're coming from PS3 you're used to the split memory architecture, you can't quite use all of it, the speeds are really wacky on some of it. We don't have that. It's eight gigs, it's there, it's easy."

Am I reading that right, and are they insinuating that devs have access to all 8GB? I assumed the OS would take up a chunk of the memory; is this not the case?
 
Which is my point. The GPU might have rendered frames at sub-30fps, but the video encoder will simply grab the nearest frame and encode it at 30fps. It's possible that the encoder could encode a single frame from the GPU as two frames in the 30fps stream, or even miss a frame.

The DF analyser will just count the frames over a period of time and report a solid 30fps, even if the underlying framerate of the original pre-encoded stream was less than 30fps.

I hope that makes sense.

Then frame analysis of literally any video would be pointless.
 
Which is my point. The GPU might have rendered frames at sub-30fps, but the video encoder will simply grab the nearest frame and encode it at 30fps. It's possible that the encoder could encode a single frame from the GPU as two frames in the 30fps stream, or even miss a frame.

The DF analyser will just count the frames over a period of time and report a solid 30fps, even if the underlying framerate of the original pre-encoded stream was less than 30fps.

I hope that makes sense.

And that's how it works: when you have doubled/tripled frames, you have framerate drops, and that's exactly what they count, i.e. how many unique frames there are per second.
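Conceptually it's something like this (a minimal sketch, not DF's actual tool): compare each decoded frame to the previous one and only count it when the pixels actually changed.

```python
# Minimal sketch of unique-frame counting (not DF's real analyser):
# a frame identical to its predecessor is treated as a duplicate.

def unique_fps(frames, container_fps=30):
    """frames: decoded frame buffers (anything comparable, e.g. bytes)."""
    unique = 1  # the first frame always counts
    for prev, cur in zip(frames, frames[1:]):
        if cur != prev:          # pixels changed -> a genuinely new frame
            unique += 1
    return unique / (len(frames) / container_fps)

# one second of 30fps video containing only 25 distinct rendered frames
frames = [b"frame%d" % (i * 25 // 30) for i in range(30)]
print(unique_fps(frames))  # 25.0, despite the 30fps container
```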
 
Quotes sound interesting so far, can't wait to read the full article tonight.



Am I reading that right, and are they insinuating that devs have access to all 8GB? I assumed the OS would take up a chunk of the memory; is this not the case?

I'm sure the OS footprint is relatively small... I wouldn't rule out dedicated storage/memory for the OS either.
 
Then frame analysis of literally any video would be pointless.

No, because normally they will use the direct output of the console, not the encoded version used in this case.

Ignore the analysis. Take a critical look at the YouTube video at 720p or 1080p. You can visually see frames skipping even though the video frame rate is smooth.
 
Quotes sound interesting so far, can't wait to read the full article tonight.



Am I reading that right, and are they insinuating that devs have access to all 8GB? I assumed the OS would take up a chunk of the memory; is this not the case?

My take: you may have 150MB of game code and 260MB of textures and buffers, and on a split architecture you cannot take from the CPU side to lend to the GPU side; with a UMA you don't have this restriction. He was ignoring the OS footprint for simplicity, I would guess.
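A toy illustration of the difference (numbers invented): on split memory an allocation can fail even though the other pool has plenty free, while a unified pool only cares about the total.

```python
# Split pools vs. UMA, with made-up sizes.

def split_alloc(cpu_free_mb, gpu_free_mb, size_mb, side):
    """PS3-style split memory: each pool is checked on its own."""
    pool = cpu_free_mb if side == "cpu" else gpu_free_mb
    return size_mb <= pool

def unified_alloc(total_free_mb, size_mb):
    """UMA: only the total matters."""
    return size_mb <= total_free_mb

# 260MB of textures when the GPU pool has only 200MB left but the CPU
# side has 400MB sitting idle:
print(split_alloc(400, 200, 260, side="gpu"))  # False -> fails on split memory
print(unified_alloc(400 + 200, 260))           # True  -> fine on a UMA
```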
 
Compute running simultaneously with rendering on the GPU is fucking awesome, and it sounds like they're doing seamless installs. Excellent. I adore the design of this system so far. It just seems smart and well thought through all around.
 
I'm sure the OS footprint is relatively small... I wouldn't rule out dedicated storage/memory for the OS either.

I'm sure it's small for what it does, but I wouldn't be surprised if it uses up 512MB of memory or more. I guess it's hard to gauge how much memory it may use without knowing all the features of the OS.

I'd also be really surprised if they had another pool for the OS alone. They already have that system stuffed with memory, adding a second pool just doesn't sound necessary or likely.

My take: you may have 150MB of game code and 260MB of textures and buffers, and on a split architecture you cannot take from the CPU side to lend to the GPU side; with a UMA you don't have this restriction. He was ignoring the OS footprint for simplicity, I would guess.

Yeah, I understand the benefits (and drawbacks) of UMA, but you have a point that they could be ignoring the OS for the sake of simplicity.

Compute running simultaneously with rendering on the GPU is fucking awesome, and it sounds like they're doing seamless installs. Excellent. I adore the design of this system so far. It just seems smart and well thought through all around.

I have to rewatch the PS4 reveal; I thought they announced the install-as-you-play feature at the event. Unless you mean something different by seamless installs.
 
Quotes sound interesting so far, can't wait to read the full article tonight.



Am I reading that right, and are they insinuating that devs have access to all 8GB? I assumed the OS would take up a chunk of the memory; is this not the case?

Surprisingly, that stuck out to me as well. I can't imagine there not being an always-available OS. Perhaps the in-game OS is a different (lite) version that takes up an amount of space that is tiny compared to the rest of the memory, or they have a way of offloading most of the OS somewhere else (maybe that rumoured 16GB flash chip could provide answers). Even 512MB, which would be a massive jump from existing consoles' ~50MB, would still be a fraction (barely 6%) of the overall system RAM.

Compute running simultaneously with rendering on the GPU is fucking awesome, and it sounds like they're doing seamless installs. Excellent. I adore the design of this system so far. It just seems smart and well thought through all around.

And this still confuses me. The guy says 1.84TFLOPS is available for graphics, but if even a single CU is taken at random for whatever amount of time to run non-gfx-related tasks, then the gfx-related tasks really don't have 1.84TF accessible to them during those times. To me it comes off as 1.84TF for gfx + extra for compute, whereas it should be more of an "on demand" substitute mechanism, i.e. 1.84TF for gfx & GPGPU combined.
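To put numbers on it (the 18 CU and 800MHz figures are from the published specs; the four-CUs-on-compute split is just an example I made up):

```python
# Where 1.84TF comes from: it's the peak of the *whole* GPU.
cus, alus_per_cu, ops_per_clock, clock_hz = 18, 64, 2, 800e6
peak = cus * alus_per_cu * ops_per_clock * clock_hz / 1e12
print(f"full GPU peak: {peak:.2f} TFLOPS")  # 1.84

# If, say, 4 CUs are busy with GPGPU work, graphics only sees the rest.
gfx = (cus - 4) * alus_per_cu * ops_per_clock * clock_hz / 1e12
print(f"graphics share with 4 CUs on compute: {gfx:.2f} TFLOPS")  # 1.43
```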
 
The only time I have ever used the analogue feature on face buttons was for changing the speed of binocular zooming... in Metal Gear Solid 2.

Good to see that we'll benefit from getting rid of it.

I had to use a PS1 pad to play Wipeout Fusion on the PS2, because with a DualShock 2 you would begin to lose thrust if you didn't keep the button jammed all the way down. I never liked the analogue buttons.
 
Yes, now read the link and it'll explain to you their process for compressed video.

I understand the article and the principles behind the analysis. But I can still see frame rate issues in the trailer even though Eurogamer's tools report it as a solid 30fps.

Do you notice any hitching or framerate issues?

http://www.youtube.com/watch?v=Hv2ofBB9O9E

Watch it in 720p and look around the 4:40 mark and a little after it. Do you notice anything?
 
The question worth asking is how much effort this is going to get, rather than trying to draw silly battle lines.

Now if only that question was easily answerable.

Indeed. I suppose it all rests on whether or not it's a pack-in. If it's a pack-in accessory and mandatory, I could see a future where Kinect-only titles are ported to the PS4.

Will it be able to replicate the experience? That's the great unknown, but I do like their suggestion of using props. Kinect would be much better if a lot of its games allowed for controller use alongside motion controls/speech recognition.
 
I really do not like tight sticks. That is the big thing I hate about the 360's controller.

Same, but maybe they'll loosen up once they get broken in. I picked up a new DS3 the other day and the sticks feel stiff (buttons too) compared to my 4+ year old DS3 pads.
 
Surprisingly, that stuck out to me as well. I can't imagine there not being an always-available OS. Perhaps the in-game OS is a different (lite) version that takes up an amount of space that is tiny compared to the rest of the memory, or they have a way of offloading most of the OS somewhere else (maybe that rumoured 16GB flash chip could provide answers). Even 512MB, which would be a massive jump from existing consoles' ~50MB, would still be a fraction (barely 6%) of the overall system RAM.



And this still confuses me. The guy says 1.84TFLOPS is available for graphics, but if even a single CU is taken at random for whatever amount of time to run non-gfx-related tasks, then the gfx-related tasks really don't have 1.84TF accessible to them during those times. To me it comes off as 1.84TF for gfx + extra for compute, whereas it should be more of an "on demand" substitute mechanism, i.e. 1.84TF for gfx & GPGPU combined.

There are several points in a normal game render process where the GPU is not being used at 100%, like when rendering shadow maps. These open up opportunities to have some compute happening in parallel without slowing down the rendering.
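As a crude picture (utilisation numbers invented; real GPU scheduling is far more involved): each pass leaves some fraction of the CUs idle, and async compute can soak up that slack without extending the frame.

```python
# Toy frame timeline: (pass name, duration in ms, fraction of CUs used).
passes = [
    ("shadow maps", 3.0, 0.45),  # bandwidth-bound, ALUs largely idle
    ("g-buffer",    6.0, 0.85),
    ("lighting",    5.0, 0.95),
]

free_cu_ms = 0.0
for name, ms, used in passes:
    slack = (1.0 - used) * ms    # CU-milliseconds this pass leaves idle
    free_cu_ms += slack
    print(f"{name:12s}: {slack:.2f} CU-ms free for async compute")

print(f"compute slotted in 'for free' this frame: {free_cu_ms:.2f} CU-ms")
```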
 
Yoshida likes it :P

[image]
 
I'm excited! Seriously, there is no bad news about the PS4 at this point. I'm there day 1 unless a nasty DRM bomb goes off.

I don't think you have to worry about that one very nasty DRM that's going around recently.

OPM: You talked a lot about service, tech and ideas; how are you going to present PS4 to a more casual market?

Michael Denny: So I think two of the other pillars we talked about in terms of design were simplicity and immediacy. Even taking back a step from here, PlayStation 4 can still be enjoyed old school without an Internet connection at all. So it depends what level you want to use these feature sets at. So with 'simple' and 'immediacy' we want it so that everything is one button click away, for example. And 'immediacy' takes down these barriers that can be frustrating to gamers between the player getting access to the content.

http://www.officialplaystationmagazine.co.uk/2013/03/20/michael-denny/

I think charging for online play is the only potential upcoming news that could be true.
 
I understand the article and the principles behind the analysis. But I can still see frame rate issues in the trailer even though Eurogamer's tools report it as a solid 30fps.

Do you notice any hitching or framerate issues?

http://www.youtube.com/watch?v=Hv2ofBB9O9E

Watch it in 720p and look around the 4:40 mark and a little after it. Do you notice anything?
Yeah, it microstutters, which doesn't really affect the measured frame rate. Remember, it's an average for each second.
 