
New Cerny interview with Nikkei: Talks about eDRAM, "Supercharged" parts

Another great find by danhese007. This guy needs thread making privileges!

danhese007 said:
So Nikkei did an interview with Mark Cerny, and apparently they considered using eDRAM in the PS4 but dropped it, even though:
"For example, if we use eDRAM (on-chip DRAM) for the main memory in addition to the external memory, the memory bandwidth will be several terabytes per second. It will be a big advance in terms of performance."

"However, in that case, we will make developers solve a puzzle, 'To realize the fastest operation, what data should be stored in which of the memories, the low-capacity eDRAM or the high-capacity external memory?'" he said. "We wanted to avoid such a situation. We put the highest priority on allowing developers to spend their time creating values for their games."

He then describes four of the many bespoke "supercharged" parts of the PS4's architecture.

As for the "supercharged" parts, or the parts that SCE extended, he said, "There are many, but four of them are representative." They are (1) a structure that realizes high-speed data transmission between the CPU and GPU, (2) a structure that reduces the number of times that data is written back from the cache memory in the GPU, (3) a structure that makes it possible to set priorities in multiple layers for arithmetic and graphics processing, and (4) a function that lets the CPU take over pre-processing that would otherwise be conducted by the GPU.
http://techon.nikkeibp.co.jp/english/NEWS_EN/20130401/274313/?P=2

EDIT: I'll try to tackle what's what...
1) The GNB (Graphics North Bridge), which allows fast, direct access between the CPU and GPU over the "Onion", "Garlic", and "Onion+" buses (rough toy sketch of the idea below).
2) ...maybe still part of the GNB?
3) Those extra Asynchronous Compute Engines.
4) No idea.
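To make point (1) a bit more concrete, here's a deliberately toy C sketch of the difference between two split pools and one shared pool. The buffer names and the malloc stand-ins are made up for illustration; none of this is actual PS4 or AMD API, it's just the idea of skipping the staging copy:

[code]
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Toy sketch, not real console code: with split pools you pay an extra memcpy
 * to stage CPU-built data into GPU-visible memory; with one unified pool both
 * processors just use the same allocation. Names are made up for illustration. */

static void build_vertices(float *dst, size_t count)
{
    for (size_t i = 0; i < count; i++)
        dst[i] = (float)i * 0.5f;      /* stand-in for real per-vertex work */
}

int main(void)
{
    const size_t count = 1u << 20;
    const size_t bytes = count * sizeof(float);

    /* Split pools: build in "system RAM", then copy into "video RAM". */
    float *sys_ram   = malloc(bytes);
    float *video_ram = malloc(bytes);
    build_vertices(sys_ram, count);
    memcpy(video_ram, sys_ram, bytes); /* the staging copy a unified pool avoids */

    /* Unified pool: one allocation, both sides use the same address. */
    float *shared = malloc(bytes);
    build_vertices(shared, count);
    const float *gpu_view = shared;    /* no copy, just a shared pointer */

    printf("split path copied %zu bytes; unified path copied 0 (gpu_view=%p)\n",
           bytes, (const void *)gpu_view);

    free(sys_ram);
    free(video_ram);
    free(shared);
    return 0;
}
[/code]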
 

JaggedSac

Member
Interesting that he says that there is a big advance in performance with eDRAM, just that it takes more work to utilize. Cool stuff that they are adding dedicated hardware to solve tasks.
 

Kaako

Felium Defensor
Interesting to say the least. I hope their effort with their overall design philosophy pays off.
 

Rolf NB

Member
Interesting that he says that there is a big advance in performance with eDRAM, just that it takes more work to utilize. Cool stuff that they are adding dedicated hardware to solve tasks.
Cell SPE local stores gave SCE a lot of hands-on experience in that field.
 

The Jason

Member
So the addition of ESRAM would make it more like a non-unified pool, so they went with just high-speed unified memory.

Also, sounds like they are trying to make the PS4 as efficient as possible. Can't wait to see how far devs will be able to push this thing.
 

sTeLioSco

Banned
So the addition of ESRAM makes it more like a non-unified pool, so they just went with high-speed unified memory.

Also, sounds like they are trying to make the PS4 as efficient as possible. Can't wait to see how far devs will be able to push this thing.

Definitely more than a PC with seemingly "similar" specs...

sorry nvidia about your b.s.
 

BlazinAm

Junior Member
It's odd how he said "terabytes per second."
The speed would be almost one-to-one with the CPU/GPU clock speed, akin to L1 or L2 cache. So he isn't wrong, it's just not a lot of memory to work with. Plus, if I understand correctly, the eDRAM in the next Xbox only accepts particular data, according to the leaks.
Cell SPE local stores gave SCE a lot of hands-on experience in that field.

Something like that. Those were 32-bit word length, I think.
 
I wonder how many devs will take advantage of this stuff. Most ports I played on PS3 were rather piss poor. I guess only the first party studios will take advantage of it.
 

Ryoku

Member
So the addition of ESRAM would make it more like a non-unified pool, so they went with just high-speed unified memory. Several TBs per second is a bit superfluous.

Also, sounds like they are trying to make the PS4 as efficient as possible. Can't wait to see how far devs will be able to push this thing.

No. But Sony learned from their mistake with the PS3, the split memory pool. eDRAM requires more work due to its split nature with system memory (as seen in Wii U, and possibly Xbox3), but offers potentially much higher bandwidth than GDDR5 also while sacrificing capacity. They preferred high capacity, high bandwidth over low capacity, potentially crazy bandwidth. eDRAM also has much lower latency than GDDR5, but I haven't looked enough to see how/if Sony combated that aspect.

We'll be seeing a TB/s or more bandwidth in GPUs a couple of years from now, actually, assuming Nvidia will be on track with their roadmap.
 

JaggedSac

Member
I wonder how many devs will take advantage of this stuff. Most ports I played on PS3 were rather piss poor. I guess only the first party studios will take advantage of it.

It'll probably be the PS4 having the better ports this go-round.

No. But Sony learned from their mistake with the PS3, the split memory pool. eDRAM requires more work due to its split nature with system memory (as seen in Wii U, and possibly Xbox3), but offers potentially much higher bandwidth than GDDR5 also while sacrificing capacity. They preferred high capacity, high bandwidth over low capacity, potentially crazy bandwidth.

Not the same as having two large split memory pools.
 

artist

Banned
I wonder how many devs will take advantage of this stuff. Most ports I played on PS3 were rather piss poor. I guess only the first party studios will take advantage of it.
None. Zero. Zilch.

And don't forget that the PS3 had a 66.7% !!!1 advantage over the 360, yet the 360 had better games.
 

sTeLioSco

Banned
I wonder how many devs will take advantage of this stuff. Most ports I played on PS3 were rather piss poor. I guess only the first party studios will take advantage of it.

You're comparing the PS3 and PS4 in terms of accessibility?

Developers needed time to get familiar with the PS3. With the PS4, instead of learning how to make their games work at all, they can spend that time optimizing instead...
 

BlazinAm

Junior Member
I wonder how many devs will take advantage of this stuff. Most ports I played on PS3 were rather piss poor. I guess only the first party studios will take advantage of it.

Well, these machines will ideally be around for a while, and because they are x86-64 based it's a lot easier for programmers to jump right in and do heavy low-level engineering.

Side Note: I fucking hate the keyboard lag my laptop is giving me right now.
 

i-Lo

Member
No. But Sony learned from their mistake with the PS3, the split memory pool. eDRAM requires more work due to its split nature with system memory (as seen in Wii U, and possibly Xbox3), but offers potentially much higher bandwidth than GDDR5 also while sacrificing capacity. They preferred high capacity, high bandwidth over low capacity, potentially crazy bandwidth. eDRAM also has much lower latency than GDDR5, but I haven't looked enough to see how/if Sony combated that aspect.

We'll be seeing a TB/s or more bandwidth in GPUs a couple of years from now, actually, assuming Nvidia will be on track with their roadmap.

OP, you're on a roll with these threads. Thanks. It looks like the third point is all about general and graphics-related compute functions being fed in as layers for near-simultaneous processing.
 

Espada

Member
I wonder how many devs will take advantage of this stuff. Most ports I played on PS3 were rather piss poor. I guess only the first party studios will take advantage of it.

Correct. Only 1st party developers will be interested in squeezing the most out of the hardware. This is why many people are eager to see what Naughty Dog, Polyphony Digital, SSM, Guerilla Games, etc... do with the console. For multiplatform titles it's pointless and costly to do this, and 3rd party exclusives will be extremely rare next gen.
 

BlazinAm

Junior Member
Correct. Only 1st party developers will be interested in squeezing the most out of the hardware. This is why many people are eager to see what Naughty Dog, Polyphony Digital, SSM, Guerilla Games, etc... do with the console. For multiplatform titles it's pointless and costly to do this, and 3rd party exclusives will be extremely rare next gen.

What Cerny is stating in the interview are the aspects of the PS4 that programmers at third party studios want, at least going by twitter responses.
 

Tripolygon

Banned
They preferred high capacity, high bandwidth over low capacity, potentially crazy bandwidth. eDRAM also has much lower latency than GDDR5, but I haven't looked enough to see how/if Sony combated that aspect.

from phosphor's other thread
TL;DR: (SOMEONE CORRECT ME IF I'M WRONG)
This provides a super fast and efficient way of caching data without having to do redundant work. Cerny mentioned the CPU and GPU not having to copy redundant info from the cache in order to use it. It allows data to go straight from the CPU to the GPU. All of this is integrated into the CPU, GPU and North Bridge (memory controller). All of this will significantly reduce latency, beyond just providing a large L2 cache, because a lot of unnecessary work is cut out and more "shortcuts" are provided.


Even more TL;DR: Worried about GDDR5 latencies? Don't be. A large cache, a very capable bus, shortcuts, and clever data transfer between the CPU and GPU make that a non-issue.
 
Correct. Only 1st party developers will be interested in squeezing the most out of the hardware. This is why many people are eager to see what Naughty Dog, Polyphony Digital, SSM, Guerilla Games, etc... do with the console. For multiplatform titles it's pointless and costly to do this, and 3rd party exclusives will be extremely rare next gen.

It seems like Quantic Dream is cooking up something nice as well.

How many times have you been asked about PS4 by journalists?

About 100.

So, question 101, what is your perspective on how PS4 can colour the ideas you have?

Generally it will give us more subtlety, more nuance, more detail.

Could we have done Beyond on the PlayStation 2? Yeah we could have, but of course there would have been certain trade-offs.

So you're saying that you have an idea and fit the console around it?

Yes of course but also that new hardware provides new technologies that allow you to be more subtle. And that in itself does change the developer's incentives. I probably wouldn't have attempted Beyond on PS2 because I wouldn't have thought I could get there.

I know that some people are surprised by what we can do on PS3 with Beyond, and we have heard many people even question whether this is a current-gen game. But trust me, if I showed you what we're doing with the PS4 you would be amazed. It's really surprising. It's another world.
 

Lord Error

Insane For Sony
So the addition of ESRAM would make it more like a non-unified pool, so they went with just high-speed unified memory.
EDRAM, not ESRAM. The difference, from what I've learned here, is that EDRAM is a lot faster, but ESRAM is more flexible. Someone questioned why MS allegedly went with ESRAM instead of faster EDRAM, and this flexibility was offered as a possible answer.
 
Does high-bandwidth RAM surpass "normal" RAM for OS uses etc.? Still wondering if they bumped the amount reserved for the OS from 0.5 GB.
 

Ryoku

Member
So what would happen if we had GDDR5+eDRAM? Would the human race survive?

Lagspike_exe summed it up pretty well :p

But really, the console would be a shit-ton more expensive, and honestly, that much allocated bandwidth for a system of that power would be, I dunno..... overkill? Not pointless, since it'd eliminate certain bottlenecks, but the PS4 is right where it needs to be in terms of balance, not to mention cost.

from phosphor's other thread

Thanks for that.
 

Ryoku

Member
Does high-bandwidth RAM surpass "normal" RAM for OS uses etc.? Still wondering if they bumped the amount reserved for the OS from 0.5 GB.

Eh. I'm not the most well-informed person here regarding tech, but general OS functions don't require high-bandwidth memory. However, the amount of memory an OS uses can be very little or a lot depending on the features available. And considering the features and background tasks the PS4 will have, I'm going to assume more than 0.5GB of that RAM will be allocated for OS functions.
 

wilflare

Member
Sony really struck gold with Cerny...
I think we need more Cerny gifs now.. rather than Kaz...

And their approach is sensible: make the console as accessible/easy to develop for as possible = multiplatform titles potentially end up looking/performing better on the console.

but still reserving enough "sauce" for first party developers to come up with amazing stuff
 

Ryoku

Member
Did he just say that the 720 and even possibly the Wii U would actually have great bandwidth, albeit harder to work with? :)

It's possible depending on the density used. Can't think of the right term at the moment. Bus width is what I was thinking of. Could be anywhere from 75GB/s to, as he said, 1TB/s or more. Surprised we still don't know the Wii U's eDRAM bandwidth... :(

EDIT: Take a look at this post

When someone decaps the GPU. But it's possible to make educated guesses:

Renesas UX8 eDRAM comes in three configurations: 1MB macro on a 128bit or 256bit bus, or 8MB on a 256bit bus. Wii U has 32MB eDRAM, so the bus should be either 1024, 4096 or 8192bit wide. The eDRAM should be clocked at either 486 or 729MHz, which leaves us with six possible configurations:

1024bit, 486MHz: 57.9GB/s
1024bit, 729MHz: 86.9GB/s
4096bit, 486MHz: 231.7GB/s
4096bit, 729MHz: 347.6GB/s
8192bit, 486MHz: 463.5GB/s
8192bit, 729MHz: 695.2GB/s
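If anyone wants to sanity-check those figures, the arithmetic is just (bus width in bytes) × clock, reported in binary gigabytes per second. Quick sketch below; the bus widths and clocks are still the quoted post's guesses, not confirmed Wii U specs:

[code]
#include <stdio.h>

/* Reproduces the six figures quoted above:
 * bandwidth = (bus_bits / 8) bytes per clock * clock rate,
 * reported in binary GB (2^30 bytes), e.g. 1024-bit @ 486 MHz -> 57.9 GB/s.
 * Bus widths and clocks are guesses from the quoted post, not confirmed specs. */
int main(void)
{
    const int    bus_bits[]  = { 1024, 4096, 8192 };
    const double clock_mhz[] = { 486.0, 729.0 };

    for (int i = 0; i < 3; i++) {
        for (int j = 0; j < 2; j++) {
            double bytes_per_s = (bus_bits[i] / 8.0) * clock_mhz[j] * 1e6;
            double gb_per_s    = bytes_per_s / (1024.0 * 1024.0 * 1024.0);
            printf("%4dbit, %.0fMHz: %.1fGB/s\n",
                   bus_bits[i], clock_mhz[j], gb_per_s);
        }
    }
    return 0;
}
[/code]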
 

USC-fan

Banned
Did he just say that the 720 and even possibly the Wii U would actually have great bandwidth, albeit harder to work with? :)

No, he is talking about TB/s. This is more like what you will see with Nvidia's Volta GPU, which will be launching in 2017-2019.
 
It's possible depending on the density used. Can't think of the right term at the moment. Could be anywhere from 75GB/s to, as he said, 1TB/s or more. Surprised we still don't know the Wii U's eDRAM bandwidth... :(

I remember some people thinking Sony wouldn't give us any specs talk this time round, heh.
 

Espada

Member
What Cerny is stating in the interview are the aspects of the PS4 that programmers at third party studios want, at least going by twitter responses.

Yeah, but the reality is that the ones who will make the most of these features are first party developers. There's really no incentive for multiplatform developers to do so.
 

Espada

Member
Eh. I'm not the most well-informed person here regarding tech, but general OS functions don't require high-bandwidth memory. However, the amount of memory an OS uses can be very little or a lot depending on the features available. And considering the features and background tasks the PS4 will have, I'm going to assume more than 0.5GB of that RAM will be allocated for OS functions.

From what we've heard about the PS4's OS (BSD), 512MB sounds plausible. I don't think it's safe to go by the Wii U's 1GB OS, when even Windows 7 uses ~300MB.
 