
VGLeaks Rumor: Durango Memory System Overview & Example

So 50% more power in PS4 won't make any difference (1.23 -> 1.84), but a 50% increase from that to e.g. a 7950 (2.8) on PC will 'blow away' PS4?
That's one of those things that goes both ways... I've seen people claiming what you said, but I have seen people making the exact same statements only backwards...
 
I know what it does, I just wish people explained how it worked without the mumbo jumbo. I mean.. I can sift my way well through some heavy programming related documents, but I just could not follow that description you posted above lol.


It's all about the latencies bruh.


Out of curiosity, do more ROPs allow higher FPS? I mean... they allow greater fill rate, which can affect resolution, but wouldn't they also allow something like a constant 60fps?


But there are much better solutions... PS2 blew away the competition with that eDRAM. 48GB/s. The Xbox and GC couldn't match it... neither could the PS3 or 360. But you look at Durango (102 GB/s from the eSRAM) and PS4 (172 GB/s) and the PS4 is on top easily... without the need for having eSRAM.


Yeah, I'm interested in this too.

I found this PDF from SIGGRAPH 2012 on voxel lighting. It seems to me that the entire Elemental demo, which the PS4 ran, is using voxel lighting. They also refer to voxel tracing as "ray-tracing into a simplified scene." Maybe you can make more sense of these slides than what he posted (or at least some of it lol).

http://www.unrealengine.com/files/misc/The_Technology_Behind_the_Elemental_Demo_16x9_(2).pdf
 

onQ123

Member
There are PC GPU's way above that. Titan over 7 teraflops.

Ratio. Durango is ~500% of 360, PS4 (if leaks accurate) is ~150% of Durango. There's a big difference between those two things.

No it's 4.5 teraflops.


& no, the ratio doesn't mean more than the actual number of GFLOPS, since it's still making games at 720P & 1080P.

Why is the Xbox 3's 990 extra GFLOPS more special than the PS4's 610 extra GFLOPS?
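The ratio arithmetic works out like this (a rough sketch; the ~240 GFLOPS figure for the 360's Xenos GPU is my assumption, the rest is from the leaks):

```python
# Quick check of the "ratio vs. absolute GFLOPS" point.
XENOS_TFLOPS = 0.24    # Xbox 360 GPU, rough figure (assumption)
DURANGO_TFLOPS = 1.23  # per the leaks
PS4_TFLOPS = 1.84      # per the leaks

print(DURANGO_TFLOPS / XENOS_TFLOPS)  # ~5.1x, the "~500% of 360"
print(PS4_TFLOPS / DURANGO_TFLOPS)    # ~1.5x, the "~150% of Durango"
print(PS4_TFLOPS - DURANGO_TFLOPS)    # ~0.61 TFLOPS absolute gap
```

Both readings are true at once: the ratio shrinks each generation even while the absolute gap (610 GFLOPS) stays large.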
 
Out of curiosity, do more ROPs allow higher FPS? I mean... they allow greater fill rate, which can affect resolution, but wouldn't they also allow something like a constant 60fps?

If you can fill frames faster then yes.
But at the resolutions and framerates next gen is going to operate at, 32 ROPs will probably be overkill for the gaming side of things.
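A rough fill-rate sketch backs that up (the clock and overdraw figures are assumptions for illustration, not leaked specs):

```python
# Back-of-envelope: can 32 ROPs sustain 1080p60?
ROPS = 32
GPU_CLOCK_HZ = 800e6  # assumed ~800 MHz GPU clock

peak_fill = ROPS * GPU_CLOCK_HZ        # peak pixels written per second
target = 1920 * 1080 * 60              # pixels per second at 1080p60
overdraw = 10                          # generous transparency/overdraw allowance

print(peak_fill / (target * overdraw)) # ~20x headroom even then
```

Which is why ROP count alone is unlikely to be the thing gating 60 FPS.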
 
It seems the interview that onQ123 posted is in response to all those slides posted at SIGGRAPH 2012. I can't find anything that suggests voxel cone lighting is removed from UE4; in fact, it seems like it's the basis of their entire engine. I don't see how anything like that could be removed. I think this was just a baseless rumor stated on GAF without a source, and everyone continued to believe it through word of mouth. Their technique is actually called Sparse Voxel Octree Global Illumination (SVOGI).
 
It seems the interview that onQ123 posted is in response to all those slides posted at SIGGRAPH 2012. I can't find anything that suggests voxel cone lighting is removed from UE4; in fact, it seems like it's the basis of their entire engine. I don't see how anything like that could be removed. I think this was just a baseless rumor stated on GAF without a source, and everyone continued to believe it through word of mouth. Their technique is actually called Sparse Voxel Octree Global Illumination (SVOGI).

I heard the rumor on B3D, and with the PS4 reveal you can see that the quality was degraded.
 

Durante

Member
My question is more of "can we expect a 60fps standard if we have that many ROPs?"
You can expect the ROPs to be very unlikely to be a barrier to 60 FPS. However, there are tons of other things that could prevent 60 FPS -- CPU performance, basic GPU FLOPs, you name it.
 
My question is more of "can we expect a 60fps standard if we have that many ROPs?"

If I'm not mistaken, ROPs do nothing more than read/write and blend pixel data at the very end of the graphics pipeline. So I would say ROP count is not a factor in determining framerates with the number of ROPs in Durango and the PS4.
 
You can expect the ROPs to be very unlikely to be a barrier to 60 FPS. However, there are tons of other things that could prevent 60 FPS -- CPU performance, basic GPU FLOPs, you name it.

If I'm not mistaken, ROPs do nothing more than read/write and blend pixel data at the very end of the graphics pipeline. So I would say ROP count is not a factor in determining framerates with the number of ROPs in Durango and the PS4.

Thanks.
 
My question is more of "can we expect a 60fps standard if we have that many ROPs?"

60 FPS "standard" is a ridiculous thing to say though. There is no standard for FPS. That's up to the developer and it all depends on what they plan on bringing to the table.

There's a finite amount of power in a system. It's all about how the devs use that power budget. The PS4 has a more than decent power budget compared to last gen, and the hardware is much more efficient now, as well as there being fewer bottlenecks. This boosts the resource budget, but it will always rely on the developer for FPS.
 

onQ123

Member
For the People saying that the 610 extra GFLOPS isn't going to make much difference I have a question:



What about the Unreal Engine 4 demo where they say most of the GPU GFLOPS are going to computing? So say a game pushes the Xbox 3 to the limit of its 1.23 TFLOPS without using any of it for compute functions, & the same game is on the PS4 & they use the extra 610 GFLOPS for Sparse Voxel Octree Global Illumination while the Xbox 3 has only baked lighting?


or they both use Sparse Voxel Octree Global Illumination but the Xbox 3's graphics have to be cut back a lot to get the computing power out of the 1.23 TFLOPS?


say that it needs about 500 GFLOPS of computing power, so on the Xbox 3 they use 730 GFLOPS for normal GPU functions so that they can pull off the cone-traced lighting, but on the PS4 they are able to use 1.34 TFLOPS for normal GPU functions & still have the 500 GFLOPS needed for the compute functions.


Do you still think that the 610GFLOPS isn't going to mean much?


(in a game where the devs don't want to lower the resolution from 1080P or if the game is already 720P on both consoles to pull off the graphics that they are going for )
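The budget arithmetic in that post, spelled out (the 500 GFLOPS compute cost is the post's hypothetical, not a measured number):

```python
# Hypothetical compute-budget split from the post above.
XBOX3_GFLOPS = 1230  # 1.23 TFLOPS per the leaks
PS4_GFLOPS = 1840    # 1.84 TFLOPS per the leaks
COMPUTE_COST = 500   # hypothetical GFLOPS needed for SVOGI-style compute

xbox3_graphics = XBOX3_GFLOPS - COMPUTE_COST  # 730 GFLOPS left for rendering
ps4_graphics = PS4_GFLOPS - COMPUTE_COST      # 1340 GFLOPS left for rendering

print(ps4_graphics / xbox3_graphics)  # ~1.84x the rendering budget
```

i.e., a fixed compute cost makes the relative rendering gap wider than the raw 1.5x flops ratio.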
 

Espada

Member
60 FPS "standard" is a ridiculous thing to say though. There is no standard for FPS. That's up to the developer and it all depends on what they plan on bringing to the table.

There's a finite amount of power in a system. It's all about how the devs use that power budget. The PS4 has a more than decent power budget compared to last gen, and the hardware is much more efficient now, as well as there being fewer bottlenecks. This boosts the resource budget, but it will always rely on the developer for FPS.

Exactly. It's up to them whether they want ALL THE FIXIN's vs FLUID MOTION. With the fixed hardware in consoles you can't really have both. Well, you can but the framerate will be all over the place depending on how many effects are on-screen (which is a nightmare).

I think most devs will opt for 30fps unless their game is either relatively simple or they're willing to operate at sub-1080p resolutions (like CoD).
 
I think the thing that most people are forgetting when they bring up these PC GPU comparisons as a metric for how things will turn out between PS4 and Durango is platform differences.

All the charts posted of the 7850 vs 7770 are understandable but honestly not too valid in trying to get the point across, because those benchmarks are run at the exact same settings. There is no doubt that there is a difference of power between the two that is made slightly worse by developers not being able to target those two cards directly. With that stated, the same type of situation wouldn't happen in a console environment regardless. I'm not talking "coding for the metal" or anything of that nature (actually I am a bit); rather, I'm speaking more on tuning each engine to run well on the given platform while still maintaining some semblance of parity. An example would be: say my company releases Shooter Dudes 2, and we're pushing some great effects on one platform. In order for us to have "visual parity" and keep the framerates as close as possible, we might scale back effects on one platform versus the other. We might be using FP32 on PS4 and drop it down to FP16 on Durango; we might scale back some particle effects, running them at 1/2 resolution on PS4 and 1/4 on Durango; there might be a removal of a couple of decals in a level, or even the dreaded slight resolution drop. It's a visual difference that the vast majority of gamers won't notice, but it frees up processing power and keeps near parity.

I fully expect there to be some framerate difference later on in the development cycle, but I don't honestly expect a full 60fps vs 30fps or 1080p vs 720p that many are expecting and extrapolating from desktop GPU benchmarks. The differences might manifest themselves in GPGPU/Physics type scenarios more than anything else. My two cents on the entire thing.

/shrug
 

mrklaw

MrArseFace
That's a good point. PC benchmarks are always about how many fps you get at a set resolution and detail setting.

Even for PC, I'd like a set of benchmarks normalised around framerate. E.g., for Battlefield 3 to hit 60fps, what detail settings do these different GPUs need to be set to? That'd actually be a useful tool for me when choosing a new graphics card, rather than trying to extrapolate what 43fps vs 47fps is likely to look like.
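Something like this would do it (a sketch; the card names and fps numbers are made up for illustration):

```python
# "Normalise around framerate": report the highest detail preset that
# still holds a target fps, per card. All data below is invented.

def max_preset_at(target_fps, results):
    """results maps preset -> measured fps, ordered low -> ultra."""
    best = None
    for preset, fps in results.items():   # dicts keep insertion order (3.7+)
        if fps >= target_fps:
            best = preset                 # fps falls as presets rise
    return best

bench = {
    "card_a": {"low": 118, "medium": 84, "high": 61, "ultra": 43},
    "card_b": {"low": 95, "medium": 66, "high": 48, "ultra": 31},
}

for card, results in bench.items():
    print(card, "->", max_preset_at(60, results))
# card_a -> high, card_b -> medium
```

That answers "what settings get me 60fps?" directly instead of leaving you to eyeball 43fps vs 47fps bars.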
 

DEADEVIL

Member
I think the thing that most people are forgetting when they bring up these PC GPU comparisons as a metric for how things will turn out between PS4 and Durango is platform differences.

All the charts posted of the 7850 vs 7770 are understandable but honestly not too valid in trying to get the point across, because those benchmarks are run at the exact same settings. There is no doubt that there is a difference of power between the two that is made slightly worse by developers not being able to target those two cards directly. With that stated, the same type of situation wouldn't happen in a console environment regardless. I'm not talking "coding for the metal" or anything of that nature (actually I am a bit); rather, I'm speaking more on tuning each engine to run well on the given platform while still maintaining some semblance of parity. An example would be: say my company releases Shooter Dudes 2, and we're pushing some great effects on one platform. In order for us to have "visual parity" and keep the framerates as close as possible, we might scale back effects on one platform versus the other. We might be using FP32 on PS4 and drop it down to FP16 on Durango; we might scale back some particle effects, running them at 1/2 resolution on PS4 and 1/4 on Durango; there might be a removal of a couple of decals in a level, or even the dreaded slight resolution drop. It's a visual difference that the vast majority of gamers won't notice, but it frees up processing power and keeps near parity.

I fully expect there to be some framerate difference later on in the development cycle, but I don't honestly expect a full 60fps vs 30fps or 1080p vs 720p that many are expecting and extrapolating from desktop GPU benchmarks. The differences might manifest themselves in GPGPU/Physics type scenarios more than anything else. My two cents on the entire thing.

/shrug

Interesting post on the differences.
 
For the People saying that the 610 extra GFLOPS isn't going to make much difference I have a question:



What about the Unreal Engine 4 demo where they say most of the GPU GFLOPS are going to computing? So say a game pushes the Xbox 3 to the limit of its 1.23 TFLOPS without using any of it for compute functions, & the same game is on the PS4 & they use the extra 610 GFLOPS for Sparse Voxel Octree Global Illumination while the Xbox 3 has only baked lighting?


or they both use Sparse Voxel Octree Global Illumination but the Xbox 3's graphics have to be cut back a lot to get the computing power out of the 1.23 TFLOPS?


say that it needs about 500 GFLOPS of computing power, so on the Xbox 3 they use 730 GFLOPS for normal GPU functions so that they can pull off the cone-traced lighting, but on the PS4 they are able to use 1.34 TFLOPS for normal GPU functions & still have the 500 GFLOPS needed for the compute functions.


Do you still think that the 610GFLOPS isn't going to mean much?


(in a game where the devs don't want to lower the resolution from 1080P or if the game is already 720P on both consoles to pull off the graphics that they are going for )
I believe a game/demo with lots of compute jobs is exactly the kind of game where the flops advantage does not necessarily mean higher performance... Even high-end GPUs are inefficient when compute jobs are mixed with graphics... Case in point: this very UE4 demo. Even a 680 runs this demo, which has a single character, at only 30fps, hitting 1080p only 90% of the time.

This may very well be the case where the esram can greatly overcome the flops difference.

As for being pre-baked, the games that are considered the best looking of this generation pretty much all pre-bake a big portion of the lighting pipeline. The ones that have completely dynamic lighting and support day/night cycles and destructive scenarios get laughed at when called graphical powerhouses...
 
My question is more of "can we expect a 60fps standard if we have that many ROPs?"
Only if the game is severely held back by transparencies, like Bayonetta... Most games probably won't be, but being ROP-limited can make your framerate crawl when big explosions or smoke occupy a large portion of the screen.
 

Panajev2001a

GAF's Pleasant Genius
But there are much better solutions... PS2 blew away the competition with that eDRAM. 48GB/s. The Xbox and GC couldn't match it... neither could the PS3 or 360. But you look at Durango (102 GB/s from the eSRAM) and PS4 (172 GB/s) and the PS4 is on top easily... without the need for having eSRAM.
We agree that a UMA solution with very high bandwidth is easier to tame and to work with. Even combining main RAM and ESRAM, the bandwidth is lower according to the latest leaks, and that fits with the other rumors which report a lower ROP count and a lower number of CUs. They know they have less bandwidth and they are not trying to design an unbalanced system, even though they might end up with a system slightly tougher to exploit fully. What I was trying to say is that if they can set aside a chunk of those 32 MB for CPU and GPU joint processing, and each of them has low-latency access to it, then some people might find some nice uses for it.
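The combined-bandwidth point, in numbers (the 68 GB/s DDR3 figure comes from other leaked Durango specs, not this thread; treat it as an assumption):

```python
# Back-of-envelope on the leaked bandwidth figures.
DURANGO_DDR3 = 68    # GB/s main RAM (assumed, from other leaks)
DURANGO_ESRAM = 102  # GB/s, from the thread
PS4_GDDR5 = 172      # GB/s, from the thread

combined = DURANGO_DDR3 + DURANGO_ESRAM
print(combined, combined < PS4_GDDR5)  # 170 GB/s, still under PS4's UMA figure
```

And the ESRAM number only helps for working sets that actually fit in 32 MB.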
 

onQ123

Member
I believe a game/demo with lots of compute jobs are exactly the kind of games where the flops advantage does not necessarily means higher performance... Even high end gpus are inefficient when compute jobs are mixed with graphics... Case in point: This very UE4 demo. Even a 680 runs this demo, that has a single character at only 30fps and hitting 1080p at 90% of he time.

This may very well be the case where the esram can greatly overcome the flops difference.

As for being pre-baked, the games that are considered the best looking of this generation pretty much all pre bake a big portion of the lightning pipeline. The ones who have completely dynamic lighting and support day/night cycles and destructive scenarios get laughed at he notion of being graphical powerhouses...

You do understand that this is the reason why the PS4 has 8 compute pipelines, so compute jobs can be done more efficiently without the GPU taking such a big hit?

http://www.vgleaks.com/orbis-gpu-compute-queues-and-pipelines/

 
2 Weeks. I'd love to go there, but tickets are ~900+ dollars.

I'm going to PAX East though... hopefully they have some stuff there.

I'll be going to PAX prime. I'm sure it'll be huge this year with all the next gen stuff. I went the past two years cause I lived in Seattle.

As far as GDC goes, that's happening before MS announces Durango! Wtf are they thinking? That's a big event for PS4 to get even more momentum and market awareness, with devs still not allowed to mention Durango. MS seems to really be stumbling out of the gate this time around. Seems like the situations have switched from last gen. The only explanation I can think of is that they were completely caught off guard by Sony's announcement.
 

Reiko

Banned
I'll be going to PAX prime. I'm sure it'll be huge this year with all the next gen stuff. I went the past two years cause I lived in Seattle.

As far as GDC goes, that's happening before MS announces Durango! Wtf are they thinking? That's a big event for PS4 to get even more momentum and market awareness, with devs still not allowed to mention Durango. MS seems to really be stumbling out of the gate this time around. Seems like the situations have switched from last gen. The only explanation I can think of is that they were completely caught off guard by Sony's announcement.

Sounds like business as usual for Microsoft.

The only thing that surprised them was 8GB GDDR5.
 
I'll be going to PAX prime. I'm sure it'll be huge this year with all the next gen stuff. I went the past two years cause I lived in Seattle.

As far as GDC goes, that's happening before MS announces Durango! Wtf are they thinking? That's a big event for PS4 to get even more momentum and market awareness, with devs still not allowed to mention Durango. MS seems to really be stumbling out of the gate this time around. Seems like the situations have switched from last gen. The only explanation I can think of is that they were completely caught off guard by Sony's announcement.

Keep in mind MS is no longer going to GDC. So they'd have to do their own event.
 

SPDIF

Member
As far as GDC goes, that's happening before MS announces Durango! Wtf are they thinking? That's a big event for PS4 to get even more momentum and market awareness, with devs still not allowed to mention Durango. MS seems to really be stumbling out of the gate this time around. Seems like the situations have switched from last gen. The only explanation I can think of is that they were completely caught off guard by Sony's announcement.

Nah. I think like CrunchinJelly said, they were always going to do their own thing, regardless of what Sony were doing.

And I wouldn't worry about lack of awareness of the new Xbox. Knowing MS, they'd have no problem spending 500 million just on advertising the reveal, never mind the actual product launch.
 

CLEEK

Member
Nah. I think like CrunchinJelly said, they were always going to do their own thing, regardless of what Sony were doing.

And I wouldn't worry about lack of awareness of the new Xbox. Knowing MS, they'd have no problem spending 500 million just on advertising the reveal, never mind the actual product launch.

I doubt the PS4 reveal would have changed MS plans or strategy at all.

Even without the next Xbox being officially announced, if studios talk about their next gen plans at GDC, even though they can't say the Xbox by name, everyone will know they mean it anyway. Unless explicitly stated as a PS4 exclusive, "Coming soon on next gen consoles" just means it's a PS4 and Xbox 3 game.

I have seen Surface RT adverts on TV loads of times. Windows 8 ads saturated TV, websites, billboards and print for weeks. I have never, ever seen a TV ad for the Vita. Whatever the specs of the two consoles are, advertising is one area where MS consistently outclasses Sony.
 

Ding-Ding

Member
Keep in mind MS is no longer going to GDC. So they'd have to do their own event.

Are you talking about the "Xbox" division, or MS as a company?

Last I heard from developers' Twitter feeds, MS was taking up 10 'talk' slots for Windows 8 (and the devs were not happy about that, as they consider the Windows 8 API a turkey and a waste of slots).
 
Are you talking about the "Xbox" division, or MS as a company?

Last I heard from developers' Twitter feeds, MS was taking up 10 'talk' slots for Windows 8 (and the devs were not happy about that, as they consider the Windows 8 API a turkey and a waste of slots).

I thought it was MS as a whole. I don't think it would stop the developers from going to GDC to give talks, though.
 
I think they have a plan and are sticking to it. Just like it seems they have a plan for the hardware.

We heard a few MS insiders on this board claiming we'd be hearing about Durango within weeks of the new year, and certainly before GDC. Yet, here we are - nothing.

Sure, they have a hardware plan in mind that can't be changed, but I think Sony's announcement delayed their unveil.
 

Fafalada

Fafracer forever
Angelus Errare said:
It's a visual difference that vast majority of gamers won't notice, but it frees up processing power and keeps near parity.
Those standards get applied to framerate too - all the other customization might get skipped if two SKUs perform within 80-90% of one another (but obviously that difference wouldn't be ignored by the likes of DF).
I recall titles that implemented an actual DirectX API interface for a platform not supporting it natively, recompiled, got 80% of the lead platform's performance, and called it a day.

Or if you prefer - if the lead platform is 60fps-ish, the 20% might mean the other will be locked to 30 (we have seen that happen this gen too).
 
You do understand that this is the reason why the PS4 has 8 compute pipelines, so compute jobs can be done more efficiently without the GPU taking such a big hit?

http://www.vgleaks.com/orbis-gpu-compute-queues-and-pipelines/

Yeah I do, but we don't know how much better it will be because of that, or for Durango for that matter; that's why I'm talking hypothetically.

But having more threads in flight to switch to alleviates the problem; it does not solve it. The problem is that you have latency-sensitive data to process, and it takes many cycles to get it. In the event of enough threads having a cache miss (which I think becomes increasingly likely as you have more threads sharing the same cache) you'd still take a hit... On the other hand, 32 MB of "cache" that is actually almost as fast as an L2 cache to fetch data from apparently does solve the problem...
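A toy version of that latency-hiding argument (every cycle count here is an illustrative guess, not a real spec):

```python
# How many threads does it take to hide a memory stall?
MEM_LATENCY = 400   # assumed cycles to service a miss from main RAM
ESRAM_LATENCY = 50  # assumed far-lower latency for on-die memory
ALU_PER_FETCH = 40  # assumed cycles of independent math between fetches

# Roughly, you need enough other threads to cover the stalled one:
print(MEM_LATENCY / ALU_PER_FETCH + 1)    # ~11 threads in flight
print(ESRAM_LATENCY / ALU_PER_FETCH + 1)  # ~2 threads in flight
```

More threads sharing the same cache also means more capacity misses, which is the vicious circle the post describes; low-latency local memory sidesteps it instead.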
 
I doubt the PS4 reveal would have changed MS plans or strategy at all.

Even without the next Xbox being officially announced, if studios talk about their next gen plans at GDC, even though they can't say the Xbox by name, everyone will know they mean it anyway. Unless explicitly stated as a PS4 exclusive, "Coming soon on next gen consoles" just means it's a PS4 and Xbox 3 game.

I have seen Surface RT adverts on TV loads of times. Windows 8 ads saturated TV, websites, billboards and print for weeks. I have never, ever seen a TV ad for the Vita. Whatever the specs of the two consoles are, advertising is one area where MS consistently outclasses Sony.

I hope that with the marketing agency switch things will get better for Sony in that regard.
 

Sky Chief

Member
We heard a few MS insiders on this board claiming we'd be hearing about Durango within weeks of the new year, and certainly before GDC. Yet, here we are - nothing.

Sure, they have a hardware plan in mind that can't be changed, but I think Sony's announcement delayed their unveil.

Or maybe the insiders didn't really know
 

Karak

Member
Kind of too late for that, all eyes will never be on them now since everyone is already aware of PS4. They'll all have PS4 in the back of their minds as MS reveals their product.

Negative. "All eyes" does not, nor has it ever, meant solitary mind control due to a lack of competition. It means all eyes on them, exactly as it is written.
 
I doubt the PS4 reveal would have changed MS plans or strategy at all.

Even without the next Xbox being officially announced, if studios talk about their next gen plans at GDC, even though they can't say the Xbox by name, everyone will know they mean it anyway. Unless explicitly stated as a PS4 exclusive, "Coming soon on next gen consoles" just means it's a PS4 and Xbox 3 game.

I have seen Surface RT adverts on TV loads of times. Windows 8 ads saturated TV, websites, billboards and print for weeks. I have never, ever seen a TV ad for the Vita. Whatever the specs of the two consoles are, advertising is one area where MS consistently outclasses Sony.

Are you seriously comparing the advertising for Windows 8 and Surface to that for the Vita?

Win8 is MS' bread and butter. Surface is something that MS is using to target the mobile OS space, and they are desperately behind in mobile, which can potentially threaten their core cash cow.
 

CLEEK

Member
Are you seriously comparing the advertising for Windows 8 and Surface to that for the Vita?

I was comparing recent product launches from both companies. In MS's case, they successfully advertised the fuck out of the products. In Sony's case, they didn't. That has been pretty much par for the course with MS and Sony marketing in general over the years.

Sony really need to up their game with the PS4. Better specs and being gamer-focused will generate buzz in places like this, but that's just preaching to the converted.
 

SPDIF

Member
Are you seriously comparing the advertising for Windows 8 and Surface to that for the Vita?

Win8 is MS' bread and butter. Surface is something that MS is using to target the mobile OS space, and they are desperately behind in mobile, which can potentially threaten their core cash cow.

Well, compare Vita or PS3 to the Xbox then. I think his main point still stands.
 