
VGLeaks Rumor: Durango Memory System Overview & Example

BlazinAm

Junior Member
To me, the Unreal Engine 4 demo wasn't properly optimized in time for the PlayStation Meeting, and on top of that it was running on prototype hardware.
Assumptions with no data...

Anyway, we would need direct footage from the PS4 and whatever specced-out PC Epic rolls out to compare.
 

El_Chino

Member
Assumptions with no data...

Anyway, we would need direct footage from the PS4 and whatever specced-out PC Epic rolls out to compare.

Do you have data yourself to prove otherwise? It just makes sense, given that all the tech demos were rumored to be working with 1.5GB of VRAM and most companies (possibly Epic themselves) were not even aware that Sony had bumped the RAM.
 

stryke

Member
Didn't someone from Epic confirm that UE4 was only using about 27% of the PS4's power and that tech demo was thrown together in like two weeks?

I don't remember anything like that, but Jonathan Blow only had a couple of weeks' notice to get a trailer and speech together. You could extrapolate from that and get an idea of how much time Epic had for their tech demo.
 

onQ123

Member
Didn't someone from Epic confirm that UE4 was only using about 27% of the PS4's power and that tech demo was thrown together in like two weeks?

That was that MikeR guy, but I don't think he said anything about working at Epic.

http://forum.beyond3d.com/showpost.php?p=1712718&postcount=88


Originally Posted by almighty
And?

The original spec PS4 had 4GB, so that would be the same as a PC with a 2GB GTX680 and 2GB of system RAM.

I seriously, seriously doubt that this demo uses over 2GB of main RAM, especially as the demo might not even be 64-bit.

The GTX680 is just a much, much faster GPU than what's inside the PS4, and that's the problem, not lack of memory :rolleyes:



Originally Posted by MikeR
Honestly you have no clue on what you are talking about. This demo was created/running on approximation hardware - not final dev/PS4 silicon. Also, the demo was created within a constrained timeframe for showing. Just for sh**s and giggles, the demo only used 27-29% of the AH resources - unoptimized. Before you ask, there is no link, I am the link.
 
I don't believe this to be true, for a few reasons:

  • The upgrade from Xbox 360 to the Xbox 3 is around 990 GFLOPS, so a difference of 610 GFLOPS between two consoles in the same generation is pretty big.


Can you show the maths behind this?

If you have this right, essentially a 1,000 GFLOPS increase is considered a "generational leap" but a 600 GFLOPS increase is a "tiny, incremental difference".
 

iceatcs

Junior Member
I don't remember anything like that, but Jonathan Blow only had a couple of weeks' notice to get a trailer and speech together. You could extrapolate from that and get an idea of how much time Epic had for their tech demo.

The leaks said the new PS4 devkit with customised silicon, rather than a PC, only just arrived this past January.

[edit] Ah, they haven't put it on a real PS4 devkit yet. Only the old PC-based PS4 devkit.
 

onQ123

Member
Can you show the maths behind this?

If you have this right, essentially a 1,000 GFLOPS increase is considered a "generational leap" but a 600 GFLOPS increase is a "tiny, incremental difference".


PS4 GPU 1,840 GFLOPS - Xbox 3 GPU 1,230 GFLOPS = 610 GFLOPS

Xbox 3 GPU 1,230 GFLOPS - Xbox 360 GPU 240 GFLOPS = 990 GFLOPS
 

onQ123

Member
What does the Voxel version of GI allow that the "regular" GI doesn't allow?

http://www.geforce.com/whats-new/ar...s-next-gen-gtx-680-powered-real-time-graphics

What is Global Illumination and why is it so important to game realism?


Global Illumination refers to the calculation of light bouncing around a scene. GI is responsible for many of the subtle shading effects and ambience we see in real-world environments, as well as glossy and metallic reflections. Introducing real-time Global Illumination into Unreal Engine 4 is the biggest breakthrough in lighting since Unreal Engine 1 introduced real-time Direct Illumination in 1995.

Please give us an overview of how the algorithm works from generating the octree, to cone tracing, to the gathering pass.

The technique is known as SVOGI – Sparse Voxel Octree Global Illumination, and was developed by Andrew Scheidecker at Epic. UE4 maintains a real-time octree data structure encoding a multi-resolution record of all of the visible direct light emitters in the scene, which are represented as directionally-colored voxels. That octree is maintained by voxelizing any parts of the scene that change, and using a traditional set of Direct Lighting techniques, such as shadow buffers, to capture first-bounce lighting.

Performing a cone-trace through this octree data structure (given a starting point, direction, and angle) yields an approximation of the light incident along that path.

The trick is to make cone-tracing fast enough, via GPU acceleration, that we can do it once or more per-pixel in real-time. Performing six wide cone-traces per pixel (one for each cardinal direction) yields an approximation of second-bounce indirect lighting. Performing a narrower cone-trace in the direction of specular reflection enables metallic reflections, in which the entire scene is reflected off each glossy surface.

[Editor's Note: If the above sequence seems alien to you, it's because it is. Global Illumination requires a totally new lighting pipeline. In a traditional game, all indirect lighting (light that is bounced from a surface) is calculated in advance and stored in textures called lightmaps. Lightmaps give game levels a GI-look but since they are pre-computed, they only work on static objects.

In Unreal Engine 4, there are no pre-computed lightmaps. Instead, all lighting, direct and indirect, is computed in real-time for each frame. Instead of being stored in a 2D texture, they are stored in voxels. A voxel is a pixel in three dimensions. It has volume, hence the term "voxel."

The voxels are organized in a tree structure to make them efficient to locate. When a pixel is rendered, it effectively asks the voxel tree "which voxels are visible to me?" Based on this information it determines the amount of indirect light (Global Illumination) it receives.

The simple takeaway is this: UE4 completely eliminates pre-computed lighting. In its place, it uses voxels stored in a tree structure. This tree is updated per frame and all pixels use it to gather lighting information.]
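
To make that gather step a bit more concrete, here's a minimal Python sketch of the per-pixel cone-trace gather. This is not Epic's actual implementation: a dense mip pyramid of RGBA voxels (A = occupancy) stands in for the sparse octree, and all function names and parameters here are illustrative.

[CODE]
import numpy as np

def build_mips(voxels, levels=4):
    """Stand-in for the octree: average 2x2x2 voxel blocks into parents."""
    mips = [voxels]
    for _ in range(levels - 1):
        v = mips[-1]
        if v.shape[0] < 2:
            break
        n = v.shape[0] // 2
        coarse = np.zeros((n, n, n, 4))
        for dx in range(2):
            for dy in range(2):
                for dz in range(2):
                    coarse += v[dx:2*n:2, dy:2*n:2, dz:2*n:2]
        mips.append(coarse / 8.0)
    return mips

def sample(mips, level, p):
    """Nearest-voxel lookup at a mip level (the real thing interpolates)."""
    v = mips[min(level, len(mips) - 1)]
    i = np.clip((p * v.shape[0]).astype(int), 0, v.shape[0] - 1)
    return v[i[0], i[1], i[2]]

def cone_trace(mips, origin, direction, half_angle, steps=32):
    """March a cone; as its footprint widens, read coarser mip levels."""
    voxel = 1.0 / mips[0].shape[0]
    light, occlusion, t = np.zeros(3), 0.0, voxel
    for _ in range(steps):
        radius = max(t * np.tan(half_angle), voxel)
        level = int(np.log2(radius / voxel))      # wider cone -> coarser mip
        rgba = sample(mips, level, origin + t * direction)
        light += (1.0 - occlusion) * rgba[3] * rgba[:3]  # front-to-back
        occlusion += (1.0 - occlusion) * rgba[3]
        if occlusion > 0.95 or t > 1.0:
            break
        t += radius
    return light

def gather_indirect(mips, p):
    """Six wide cones, one per cardinal direction = one indirect bounce."""
    dirs = [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]
    return sum(cone_trace(mips, p, np.array(d, float), np.radians(30))
               for d in dirs) / 6.0
[/CODE]

The specular reflections mentioned above would just be one more cone_trace call with a small half_angle along the reflection vector.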
 

mrklaw

MrArseFace
So 50% more power in the PS4 won't make any difference (1.23 -> 1.84), but a 50% increase from that to e.g. a 7950 (2.8) on PC will 'blow away' the PS4?
 

ekim

Member
So 50% more power in the PS4 won't make any difference (1.23 -> 1.84), but a 50% increase from that to e.g. a 7950 (2.8) on PC will 'blow away' the PS4?

That's why I refrained from partaking in these threads. It's getting way too ridiculous - I remember a time last year when we all consented not to compare FLOPS when it comes to the real-life gaming performance of a system. I'm pretty sure both consoles will wow us with decent-looking games throughout the whole generation. Hell, we don't even know the final xB3 specs.
 

pswii60

Member
Didn't someone from Epic confirm that UE4 was only using about 27% of the PS4's power and that tech demo was thrown together in like two weeks?

Anyone who knows anything will question someone who claims something to be using "27%" of a console's power. What does that even mean? Prime95 uses 100% of a CPU's power; that doesn't mean anything.
 

onQ123

Member
Anyone who knows anything will question someone who claims something to be using "27%" of a console's power. What does that even mean? Prime95 uses 100% of a CPU's power; that doesn't mean anything.

Debug mode will show the % of the CPU & GPU used.
 
Agreed, but it can make the difference between gaming at 1080p vs 720p, or 60 vs 30 fps, for mid-to-late-gen games, depending on whether Durango shows its limitations first.

ROP count won't make a difference at the resolutions the next-gen consoles will be working at.

Just gonna quote an older post of mine.

Let me see if I'm right.

Fillrate = ROPs × clock
16 × 800 MHz = 12.8 GPixels/s
32 × 800 MHz = 25.6 GPixels/s

1920 × 1080 = 2,073,600 pixels per frame.
2,073,600 / 1,000,000,000 = 0.0020736 GPixels per frame.

For 60 fps using only one render target of 1920×1080:
0.0020736 × 60 (as in fps) = 0.124416 GPixels/s

12.8 / 0.124416 ≈ 102, so 16 ROPs can fill about 102 render targets of 1920×1080 at 60 fps.
So yeah, 32 ROPs is probably overkill for 1080p gaming.
Edit: But 16 ROPs for Durango seems to fit well with 24 fps 4K movies.
And 32 ROPs is useful for Sony if they are releasing 48 fps 4K or 3D 4K TV sets.
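
For anyone who wants to check those numbers, here's the same arithmetic in a few lines of Python (the 800 MHz clock and the ROP counts are the rumored figures, not confirmed specs):

[CODE]
# Fillrate = ROPs x clock, per the rumored 800 MHz GPU clock.
def fillrate_gpixels(rops, clock_mhz):
    return rops * clock_mhz * 1e6 / 1e9        # pixels/s -> GPixels/s

frame = 1920 * 1080                            # 2,073,600 pixels
gpix_per_frame = frame / 1e9                   # 0.0020736 GPixels
budget_1080p60 = gpix_per_frame * 60           # 0.124416 GPixels/s

for rops in (16, 32):
    fill = fillrate_gpixels(rops, 800)
    print(rops, "ROPs:", fill, "GPixels/s =",
          int(fill / budget_1080p60), "x 1080p60 render targets")
# 16 ROPs: 12.8 GPixels/s = 102 x 1080p60 render targets
# 32 ROPs: 25.6 GPixels/s = 205 x 1080p60 render targets
[/CODE]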
 
PS4 GPU 1,840 GFLOPS - Xbox 3 GPU 1,230 GFLOPS = 610 GFLOPS

Xbox 3 GPU 1,230 GFLOPS - Xbox 360 GPU 240 GFLOPS = 990 GFLOPS

Ratio. Durango is ~500% of the 360; PS4 (if the leaks are accurate) is ~150% of Durango. There's a big difference between those two things.
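
The point is easier to see as ratios rather than differences (GFLOPS figures per the leaks):

[CODE]
xbox360, durango, ps4 = 240, 1230, 1840      # leaked/rumored GFLOPS

print(durango - xbox360, ps4 - durango)      # 990 610     (absolute gaps)
print(durango / xbox360, ps4 / durango)      # ~5.1x ~1.5x (the ratios)
[/CODE]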
 

Wellscha

Member
Weren't the PS3 specs leaps and bounds better than the 360's on paper? And we all know how that turned out. What makes the scenario different this time?
 
Weren't the PS3 specs leaps and bounds better than the 360's on paper? And we all know how that turned out. What makes the scenario different this time?

Not if you knew what you were looking at.

Cell would be the reason people would say that, but having the greatest CPU in the world doesn't matter that much to the graphics on screen. Although Cell was almost part GPU, so it ended up managing to help a lot for a CPU.
 

stryke

Member
Weren't the PS3 specs leaps and bounds better than the 360's on paper? And we all know how that turned out. What makes the scenario different this time?

CPU/GPU type and vendor are the same across the board. Sony can't afford to be arrogant. No Ken. They've released a FLOPS figure similar to their alleged PS3 performance - that in itself is an admission that they (or Nvidia, whoever you want to blame) fudged the numbers last time in their farting contest with MS.
 

owasog

Member
Ratio. Durango is ~500% of 360, PS4 (if leaks accurate) is ~150% of Durango. There's a big difference between those two things.
What if Durango has 67% of PS4's pixel-processing power?

At 1080p a game draws 2,073,600 pixels per frame. 67% of that is roughly 1,389,000 pixels.

Of course there's more to performance than pixel count, but roughly speaking, all Durango has to do to achieve the same framerate is use a resolution with fewer than ~1,389,000 pixels. That's somewhere between:

720p = 1280×720 = 921,600 pixels.
900p = 1600×900 = 1,440,000 pixels.

Closer to 900p than 720p. Not too bad in my opinion.
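
A quick sketch of that scaling arithmetic, assuming (very roughly) that framerate scales linearly with pixel count; the 1230/1840 ratio is taken from the leaked GFLOPS figures:

[CODE]
# Resolution at which Durango would match PS4's 1080p framerate,
# under pure pixel-count scaling (a rough approximation at best).
ps4_pixels = 1920 * 1080                     # 2,073,600
budget = ps4_pixels * 1230 / 1840            # ~1,386,000 pixels

for name, w, h in [("720p", 1280, 720), ("900p", 1600, 900)]:
    print(name, w * h, "fits" if w * h <= budget else "too big")
# 720p 921600 fits
# 900p 1440000 too big  -> the budget lands between 720p and 900p
[/CODE]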
 

Thorgal

Member
ROP count won't make a difference at the resolutions the next-gen consoles will be working at.

Just gonna quote an older post of mine.


So what will those extra ROPs do then?

They are obviously not just there for show, I'd wager, and we know they won't be used for 4K gaming either.

Having nearly double of them compared to the Xbox 3 has to show a difference visually, right?

Better fillrate?
 

Reiko

Banned
So what will those extra ROPs do then?

They are obviously not just there for show, I'd wager, and we know they won't be used for 4K gaming either.

Having nearly double of them compared to the Xbox 3 has to show a difference visually, right?

Better fillrate?

Sony said 4K gaming wasn't in the cards.

Durango is made to support 4K movies though.
 

iceatcs

Junior Member
So what will those extra ROPs do then?

They are obviously not just there for show, I'd wager, and we know they won't be used for 4K gaming either.

Having nearly double of them compared to the Xbox 3 has to show a difference visually, right?

Better fillrate?

Multi-screens, Remote Play, VR goggles, etc., I think.
 
So what will those extra ROPs do then?

They are obviously not just there for show, I'd wager, and we know they won't be used for 4K gaming either.

Having nearly double of them compared to the Xbox 3 has to show a difference visually, right?

Better fillrate?

I don't know, but I heard something on B3D that there is no ROP count between 16 and 32.
So you go straight from 16 ROPs to 32 ROPs. Maybe Sony is preparing the PS4 for HFR 4K movies.
 

RoboPlato

I'd be in the dick
I don't know, but I heard something on B3D that there is no ROP count between 16 and 32.
So you go straight from 16 ROPs to 32 ROPs. Maybe Sony is preparing the PS4 for HFR 4K movies.
This makes sense to me. It's also probably more effort than it would be worth to lower the ROP count from the base GPU.
 

Thorgal

Member
Multi-screens, Remote Play, VR goggles, etc., I think.

* puts hype hat on.

So in theory the PS4 could easily support Oculus Rift or an adaptation of it?

What are the chances that Sony has an Oculus Rift support agreement and is currently keeping it in the deepest pits of secrecy until E3?

*removes hype hat.
 

iceatcs

Junior Member
Gemüsepizza;50463051 said:
PS3 has 256MB GDDR3 RAM (like Xbox 360) and 256MB XDR RAM. You make it sound a little bit like one of them is slow.
I didn't say anything was slow. Just your goggles said so.
I only said Nvidia put half (256MB) on their GPU (the GDDR3 half) and Sony put the other half on system RAM to make up 512MB total.

* puts hype hat on.

So in theory the PS4 could easily support Oculus Rift or an adaptation of it?

What are the chances that Sony has an Oculus Rift support agreement and is currently keeping it in the deepest pits of secrecy until E3?

*removes hype hat.

Very low, I think. If Sony wants VR support, they will make their own.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
Rops count wont make a difference at the resolution the next gen console will be working.

Just gonna quote a older post of mine.

Why would ROPs be used in video playback? Conspiracy theory stuff and goofy math aside, all AMD GPU parts have either 16 or 32 ROPs, depending on which side of the performance line they are on. Sony probably has plans for 3D games and 32 ROPs was more balanced to match the other specs.
 

RoboPlato

I'd be in the dick
Do we know if voxel lighting is confirmed to be removed from Unreal 4, and therefore won't be used for PS4 games?

Voxel lighting is the biggest addition to Unreal 4, isn't it? Kinda lame if it's taken out. What else is Unreal 4 bringing to the table?

Well, it still has dynamic lighting and GI, which is a huge jump from UE3. I'm also hoping that the weird look of textures from UE3 is gone.
 

Panajev2001a

GAF's Pleasant Genius
There is no documentation that it's easy either. Just the rumoured architecture, which looks considerably more complex than the PS4's: tiling/decompression units, multiple DMA move engines, all presumably there to get around the external RAM bandwidth limitations. If those things are relatively automated, then great. But if not, and developers have to manually schedule when to move small chunks of memory around to get the most out of the eSRAM, then it could take a long time for that to be properly tapped, and many developers may never fully do so. Sounds a lot like the PS3 and efficient SPE usage.

I actually like Durango's architecture from the rumors we've seen; it sounds somewhat like a PS2 on steroids. Very low-latency eSRAM might make for some algorithms which will not work well on PCs and more PC-like architectures (like the GS's very fast bandwidth and ultra-low overhead for some operations which are expensive on just about any other GPU out there, thanks to its design), but the system will need more care from programmers than a system providing greater bandwidth with a single pool of memory. Both systems look fun to program for from the outside.
 
Here is a demo of 'Voxel Cone Traced Lighting in Unity':

http://youtu.be/H1wkX3zffbU
I know what it does, I just wish people explained how it worked without the mumbo jumbo. I mean... I can sift my way through some heavy programming-related documents, but I just could not follow that description you posted above lol.

Weird how that works huh?

& all while also ignoring the higher bandwidth memory.
It's all about the latencies bruh.

ROP count won't make a difference at the resolutions the next-gen consoles will be working at.

Just gonna quote an older post of mine.
Out of curiosity, do more ROPs allow higher FPS? I mean... they allow greater fillrate, which can affect resolution, but wouldn't they also allow something like a constant 60 fps?

I actually like Durango's architecture from the rumors we've seen; it sounds somewhat like a PS2 on steroids. Very low-latency eSRAM might make for some algorithms which will not work well on PCs and more PC-like architectures (like the GS's very fast bandwidth and ultra-low overhead for some operations which are expensive on just about any other GPU out there, thanks to its design), but the system will need more care from programmers than a system providing greater bandwidth with a single pool of memory. Both systems look fun to program for from the outside.
But there are much better solutions... The PS2 blew away the competition with that eDRAM: 48 GB/s. The Xbox and GC couldn't match it... neither could the PS3 or 360. But you look at Durango (102 GB/s from the eSRAM) and PS4 (172 GB/s) and the PS4 is on top easily... without the need for eSRAM.

So the voxel lighting is removed, though? I'd be interested in reading the interview/article where this was said.
Yeah, I'm interested in this too.
 