
phosphor112
Banned
(04-04-2013, 12:12 AM)
Another great find by danhese007. This guy needs thread making privileges!

Originally Posted by danhese007

So Nikkei did an interview with Mark Cerny, and apparently they considered using eDRAM in the PS4 but dropped it because, while:

"For example, if we use eDRAM (on-chip DRAM) for the main memory in addition to the external memory, the memory bandwidth will be several terabytes per second. It will be a big advance in terms of performance."

"However, in that case, we will make developers solve a puzzle, 'To realize the fastest operation, what data should be stored in which of the memories, the low-capacity eDRAM or the high-capacity external memory?'" he said. "We wanted to avoid such a situation. We put the highest priority on allowing developers to spend their time creating values for their games."

He then describes four of the many bespoke "supercharged" parts of the PS4's architecture.

As for the "supercharged" parts, or the parts that SCE extended, he said, "There are many, but four of them are representative." They are (1) a structure that realizes high-speed data transmission between the CPU and GPU, (2) a structure that reduces the number of times that data is written back from the cache memory in the GPU, (3) a structure that enables setting priorities in multiple layers in regard to arithmetic and graphics processing and (4) a function to make the CPU take over the pre-processing to be conducted by the GPU.

http://techon.nikkeibp.co.jp/english...01/274313/?P=2

EDIT: I'll try to tackle what's what...
1) The GNB (Graphics North Bridge), which allows for fast direct access via the "Onion", "Garlic", and "Onion+" buses.
2) ...maybe still part of the GNB?
3) Those extra Asynchronous Compute Engines.
4) No idea.
Last edited by phosphor112; 04-04-2013 at 12:19 AM.
Respawn
Member
(04-04-2013, 12:14 AM)
Bless the chips based Cerny
Reiko
Banned
(04-04-2013, 12:16 AM)
Good stuff OP. Another excellent thread.
JaggedSac
Member
(04-04-2013, 12:17 AM)
Interesting that he says that there is a big advance in performance with eDRAM, just that it takes more work to utilize. Cool stuff that they are adding dedicated hardware to solve tasks.
phosphor112
Banned
(04-04-2013, 12:20 AM)

Originally Posted by JaggedSac

Interesting that he says that there is a big advance in performance with eDRAM, just that it takes more work to utilize. Cool stuff that they are adding dedicated hardware to solve tasks.

It's odd how he said "terabytes per second."
Kaako
Felium Defensor
(04-04-2013, 12:20 AM)
Interesting to say the least. I hope their effort with their overall design philosophy pays off.
Jigolo
Member
(04-04-2013, 12:20 AM)
Dat Cerny
DMPrince
Member
(04-04-2013, 12:20 AM)
Thank you based Cerny. I will be by your side just in case ps4 doesn't take off <3
Rolf NB
Member
(04-04-2013, 12:22 AM)

Originally Posted by JaggedSac

Interesting that he says that there is a big advance in performance with eDRAM, just that it takes more work to utilize. Cool stuff that they are adding dedicated hardware to solve tasks.

Cell SPE local stores gave SCE a lot of hands-on experience in that field.
The Jason
Member
(04-04-2013, 12:22 AM)
So the addition of ESRAM would make it more like a non-unified pool, which is why they went with just high-speed unified memory.

Also, sounds like they are trying to make the PS4 as efficient as possible. Can't wait to see how far devs will be able to push this thing.
Last edited by The Jason; 04-04-2013 at 12:29 AM.
AgentP
Banned
(04-04-2013, 12:24 AM)

Originally Posted by Rolf NB

Cell SPE local stores gave SCE a lot of hands-on experience in that field.

PS2 had 4MB eDRAM. They know the pros and cons.
sTeLioSco
Banned
(04-04-2013, 12:25 AM)

Originally Posted by The Jason

So the addition of ESRAM makes it more like a non-unified pool, so they just went with high speed unified memory.

Also, sounds like they are trying to make the PS4 as efficient as possible. Cant wait to see how far devs will be able to push this thing

Definitely more than a PC with seemingly "similar" specs...

Sorry, Nvidia, about your b.s.
SniperHunter
Member
(04-04-2013, 12:25 AM)
I wish I could understand this stuff... it's so interesting.
danhese007
Member
(04-04-2013, 12:26 AM)

Originally Posted by phosphor112

It's odd how he said "terabytes per second."

I'm thinking he meant Tb/s. Something like 2 Tb/s converted to bytes is 256 GB/s, more than the PS4's current 176 GB/s memory bandwidth.
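For the conversion itself, here's a quick sketch in Python (the 2 Tb/s figure is just danhese007's example, not something Cerny confirmed):

# Convert the example figure from terabits to gigabytes per second (binary prefixes).
terabits_per_second = 2
gigabits_per_second = terabits_per_second * 1024      # 2048 Gb/s
gigabytes_per_second = gigabits_per_second / 8         # 8 bits per byte -> 256 GB/s
print(gigabytes_per_second, "GB/s, versus the PS4's 176 GB/s")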
BlazinAm
Junior Member
(04-04-2013, 12:27 AM)

Originally Posted by phosphor112

It's odd how he said "terabytes per second."

The speed would be almost one-to-one with the CPU/GPU's clock speed, akin to L1 or L2 cache. So he isn't wrong; it's just not a lot of memory to work with. Plus, if I understand the leaks correctly, the eDRAM in the next Xbox only accepts particular data.

Originally Posted by Rolf NB

Cell SPE local stores gave SCE a lot of hands-on experience in that field.

Something like that. Those used a 32-bit word length, I think.
SniperHunter
Member
(04-04-2013, 12:28 AM)
I wonder how many devs will take advantage of this stuff. Most ports I played on PS3 were rather piss poor. I guess only the first party studios will take advantage of it.
DMPrince
Member
(04-04-2013, 12:28 AM)

Originally Posted by SniperHunter

I wonder how many devs will take advantage of this stuff. Most ports I played on PS3 were rather piss poor. I guess only the first party studios will take advantage of it.

maybe capcom..
Ryoku
Member
(04-04-2013, 12:29 AM)

Originally Posted by The Jason

So the addition of ESRAM would make it more like a non-unified pool, so they went with just high speed unified memory. Several TBs per second is a bit superfluous.

Also, sounds like they are trying to make the PS4 as efficient as possible. Can't wait to see how far devs will be able to push this thing

No. But Sony learned from their mistake with the PS3: the split memory pool. eDRAM requires more work due to its split nature with system memory (as seen in the Wii U, and possibly Xbox 3), but offers potentially much higher bandwidth than GDDR5 while also sacrificing capacity. They preferred high capacity and high bandwidth over low capacity and potentially crazy bandwidth. eDRAM also has much lower latency than GDDR5, but I haven't looked into how/if Sony compensated for that on the PS4.

We'll be seeing a TB/s or more of bandwidth in GPUs a couple of years from now, actually, assuming Nvidia stays on track with their roadmap.
JaggedSac
Member
(04-04-2013, 12:29 AM)

Originally Posted by SniperHunter

I wonder how many devs will take advantage of this stuff. Most ports I played on PS3 were rather piss poor. I guess only the first party studios will take advantage of it.

Probably will be the PS4 having the better ports this go-round.

Originally Posted by Ryoku

No. But Sony learned from their mistake with the PS3: the split memory pool. eDRAM requires more work due to its split nature with system memory (as seen in the Wii U, and possibly Xbox 3), but offers potentially much higher bandwidth than GDDR5 while also sacrificing capacity. They preferred high capacity and high bandwidth over low capacity and potentially crazy bandwidth.

Not the same as having two large split memory pools.
USC-fan
aka Kbsmoker
(04-04-2013, 12:30 AM)

176 Gbytes per second was realized by using 16 4-Gbit GDDR5 memory chips

Confirms the memory setup is like we thought.
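For reference, a rough sketch of how that figure breaks down (the x16 clamshell mode and the 5.5 Gbps effective per-pin rate below are assumptions consistent with the quoted 176 GB/s, not numbers from the article):

# 16 chips x 4 Gbit = 8 GB of GDDR5; with each chip on a 16-bit interface the
# bus is 256 bits wide, and at 5.5 Gbps per pin that works out to 176 GB/s.
chips = 16
chip_density_gbit = 4
capacity_gb = chips * chip_density_gbit / 8         # 8 GB total
bus_width_bits = chips * 16                         # 256-bit bus (x16 clamshell, assumed)
per_pin_gbps = 5.5                                  # effective data rate per pin (assumed)
bandwidth_gb_s = bus_width_bits * per_pin_gbps / 8  # 176 GB/s
print(capacity_gb, "GB,", bandwidth_gb_s, "GB/s")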
artist
Banned
(04-04-2013, 12:30 AM)

Originally Posted by SniperHunter

I wonder how many devs will take advantage of this stuff. Most ports I played on PS3 were rather piss poor. I guess only the first party studios will take advantage of it.

None. Zero. Zilch.

And don't forget that the PS3 had a 66.7%!!!1 advantage over the 360, yet the 360 had better games.
sTeLioSco
Banned
(04-04-2013, 12:31 AM)

Originally Posted by SniperHunter

I wonder how many devs will take advantage of this stuff. Most ports I played on PS3 were rather piss poor. I guess only the first party studios will take advantage of it.

You're comparing the PS3 and PS4 in terms of accessibility?

Developers needed time to get more familiar with the PS3. With the PS4, instead of learning how to make their games work at all, they can spend that time optimizing instead...
BlazinAm
Junior Member
(04-04-2013, 12:31 AM)

Originally Posted by SniperHunter

I wonder how many devs will take advantage of this stuff. Most ports I played on PS3 were rather piss poor. I guess only the first party studios will take advantage of it.

Well, these machines will ideally be around for a while, and because they are x86-64 based it is a lot easier for programmers to jump right in and do heavy low-level engineering.

Side Note: I fucking hate the keyboard lag my laptop is giving me right now.
danhese007
Member
(04-04-2013, 12:33 AM)

Originally Posted by AgentP

PS2 had 4MB eDRAM. They know the pros and cons.

This is true
i-Lo
Member
(04-04-2013, 12:33 AM)

Originally Posted by Ryoku

No. But Sony learned from their mistake with the PS3, the split memory pool. eDRAM requires more work due to its split nature with system memory (as seen in Wii U, and possibly Xbox3), but offers potentially much higher bandwidth than GDDR5 also while sacrificing capacity. They preferred high capacity, high bandwidth over low capacity, potentially crazy bandwidth. eDRAM also has much lower latency than GDDR5, but I haven't looked enough to see how/if Sony combated that aspect.

We'll be seeing a TB/s or more bandwidth in GPUs a couple of years from now, actually, assuming Nvidia will be on track with their roadmap.

OP, you're on a roll with these threads. Thanks. It looks like the third point is all about general and graphics-related compute functions being fed in as layers for near-simultaneous processing.
Ryoku
Member
(04-04-2013, 12:35 AM)

Originally Posted by JaggedSac

Not the same as having two large split memory pools.

You're right. I feel as though Sony wanted to avoid a split memory pool this time around at all costs, considering they thought about eDRAM already.
Espada
Member
(04-04-2013, 12:35 AM)

Originally Posted by SniperHunter

I wonder how many devs will take advantage of this stuff. Most ports I played on PS3 were rather piss poor. I guess only the first party studios will take advantage of it.

Correct. Only 1st party developers will be interested in squeezing the most out of the hardware. This is why many people are eager to see what Naughty Dog, Polyphony Digital, SSM, Guerilla Games, etc... do with the console. For multiplatform titles it's pointless and costly to do this, and 3rd party exclusives will be extremely rare next gen.
Napalm_Frank
Member
(04-04-2013, 12:39 AM)

Originally Posted by Ryoku

You're right. I feel as though Sony wanted to avoid a split memory pool this time around at all costs, considering they thought about eDRAM already.

So what would happen if we had GDDR5+eDRAM? Would the human race survive?
Lagspike_exe
Member
(04-04-2013, 12:40 AM)

Originally Posted by Napalm_Frank

So what would happen if we had GDDR5+eDRAM? Would the human race survive?

Human race would, but Sony probably wouldn't.
Violater
Member
(04-04-2013, 12:40 AM)
In my veins.
BloodMoney
Member
(04-04-2013, 12:42 AM)
I still can't get over Cerny walking out at the start of the Feb 20 unveiling... blew my mind.
BlazinAm
Junior Member
(04-04-2013, 12:43 AM)

Originally Posted by Espada

Correct. Only 1st party developers will be interested in squeezing the most out of the hardware. This is why many people are eager to see what Naughty Dog, Polyphony Digital, SSM, Guerilla Games, etc... do with the console. For multiplatform titles it's pointless and costly to do this, and 3rd party exclusives will be extremely rare next gen.

What Cerny is describing in the interview are the aspects of the PS4 that programmers at third-party studios want, at least going by Twitter responses.
danhese007
Member
(04-04-2013, 12:45 AM)

Originally Posted by Ryoku

They preferred high capacity, high bandwidth over low capacity, potentially crazy bandwidth. eDRAM also has much lower latency than GDDR5, but I haven't looked enough to see how/if Sony combated that aspect.

from phosphor's other thread

Originally Posted by phosphor112

TL;DR: (SOMEONE CORRECT ME IF I'M WRONG)
This provides a super fast and efficient way of caching data without having to do redundant work. Cerny mentioned the CPU and GPU not having to copy redundant info from the cache in order to use it. It allows data to be transferred straight from the CPU to the GPU. All of this is integrated into the CPU, GPU, and North Bridge (memory controller). All of this will significantly reduce latency beyond just providing a large L2 cache, because a lot of unnecessary work is cut out and more "shortcuts" are provided.


Even more TL;DR: Worried about GDDR5 latencies? Don't be. A large cache, a very capable bus, shortcuts, and clever data transfers between the CPU and GPU make that a non-issue.

SolidSnakex
Member
(04-04-2013, 12:47 AM)

Originally Posted by Espada

Correct. Only 1st party developers will be interested in squeezing the most out of the hardware. This is why many people are eager to see what Naughty Dog, Polyphony Digital, SSM, Guerilla Games, etc... do with the console. For multiplatform titles it's pointless and costly to do this, and 3rd party exclusives will be extremely rare next gen.

It seems like Quantic Dream is cooking up something nice as well

How many times have you been asked about PS4 by journalists?

About 100.

So, question 101, what is your perspective on how PS4 can colour the ideas you have?

Generally it will give us more subtlety, more nuance, more detail.

Could we have done Beyond on the PlayStation 2? Yeah we could have, but of course there would have been certain trade-offs.

So you're saying that you have an idea and fit the console around it?

Yes of course but also that new hardware provides new technologies that allow you to be more subtle. And that in itself does change the developer's incentives. I probably wouldn't have attempted Beyond on PS2 because I wouldn't have thought I could get there.

I know that some people are surprised by what we can do on PS3 with Beyond, and we have heard many people even question whether this is a current-gen game. But trust me, if I showed you what we're doing with the PS4 you would be amazed. It's really surprising. It's another world.

Certinty
Member
(04-04-2013, 12:48 AM)
Every time I read what Cerny says I shake my head in disbelief.
Lord Error
Insane For Sony
(04-04-2013, 12:49 AM)

Originally Posted by The Jason

So the addition of ESRAM would make it more like a non-unified pool, so they went with just high speed unified memory.

eDRAM, not eSRAM. The difference, from what I've learned here, is that eDRAM is a lot faster, but eSRAM is more flexible. Someone questioned why MS allegedly went with eSRAM instead of faster eDRAM, so this flexibility was offered as a possible answer.
Napalm_Frank
Member
(04-04-2013, 12:49 AM)
Does high-bandwidth RAM surpass "normal" RAM for OS uses, etc.? Still wondering if they bumped the amount reserved for the OS from 0.5 gigs.
Ryoku
Member
(04-04-2013, 12:50 AM)

Originally Posted by Napalm_Frank

So what would happen if we had GDDR5+eDRAM? Would the human race survive?

Lagspike_exe summed it up pretty well :P

But really, the console would be a shit-ton more expensive, and honestly, that much allocated bandwidth for a system of that power would be, I dunno..... overkill? Not pointless, since it'd eliminate certain bottlenecks, but the PS4 is right where it needs to be in terms of balance, not to mention cost.

Originally Posted by danhese007

from phosphor's other thread

Thanks for that.
i-Lo
Member
(04-04-2013, 12:52 AM)

Originally Posted by Lagspike_exe

Human race would, but Sony probably wouldn't.

Slowclap.gif
Ryoku
Member
(04-04-2013, 12:54 AM)

Originally Posted by Napalm_Frank

Does high bandwidth RAM surpass ''normal'' RAM in OS uses ect.? Still wondering if they bumped the amount reserved for OS from 0,5 gigs.

Eh. I'm not the most well-informed person here regarding tech, but general OS functions don't require high-bandwidth memory. However, the amount of memory an OS uses can be very little or a lot depending on the features available. And considering the features and background tasks the PS4 will have, I'm going to assume more than 0.5GB of that RAM will be allocated for OS functions.
wilflare
Member
(04-04-2013, 12:57 AM)
Sony really struck gold with Cerny...
I think we need more Cerny gifs now... rather than Kaz...

And their approach is sensible: make the console as accessible and as easy to develop for as possible, so multiplatform titles potentially end up looking/performing better on the console,

while still reserving enough "sauce" for first-party developers to come up with amazing stuff.
AzaK
Member
(04-04-2013, 12:59 AM)

Originally Posted by JaggedSac

Interesting that he says that there is a big advance in performance with eDRAM, just that it takes more work to utilize. Cool stuff that they are adding dedicated hardware to solve tasks.

Originally Posted by phosphor112

It's odd how he said "terabytes per second."

Did he just say that the 720 and even possibly the Wii U would actually have great bandwidth, albeit harder to work with? :)
Ryoku
Member
(04-04-2013, 01:01 AM)

Originally Posted by AzaK

Did he just say that the 720 and even possibly the Wii U would actually have great bandwidth, albeit harder to work with? :)

It's possible depending on the density used. Can't think of the right term at the moment. Bus width is what I was thinking of. Could be anywhere from 75GB/s to, as he said, 1TB/s or more. Surprised we still don't know the Wii U's eDRAM bandwidth... :(

EDIT: Take a look at this post

Originally Posted by wsippel

When someone decaps the GPU. But it's possible to make educated guesses:

Renesas UX8 eDRAM comes in three configurations: 1MB macro on a 128bit or 256bit bus, or 8MB on a 256bit bus. Wii U has 32MB eDRAM, so the bus should be either 1024, 4096 or 8192bit wide. The eDRAM should be clocked at either 486 or 729MHz, which leaves us with six possible configurations:

1024bit, 486MHz: 57.9GB/s
1024bit, 729MHz: 86.9GB/s
4096bit, 486MHz: 231.7GB/s
4096bit, 729MHz: 347.6GB/s
8192bit, 486MHz: 463.5GB/s
8192bit, 729MHz: 695.2GB/s
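The arithmetic behind those figures is just bus width times clock; a quick Python sketch that reproduces the six quoted values (reported in binary gigabytes, i.e. 1024^3 bytes):

# bandwidth = (bus width in bytes) * clock, printed in binary GB/s
for bus_bits in (1024, 4096, 8192):
    for clock_mhz in (486, 729):
        gib_per_s = (bus_bits / 8) * clock_mhz * 1e6 / 2**30
        print(f"{bus_bits}bit, {clock_mhz}MHz: {gib_per_s:.1f}GB/s")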

Last edited by Ryoku; 04-04-2013 at 01:08 AM.
USC-fan
aka Kbsmoker
(04-04-2013, 01:03 AM)

Originally Posted by AzaK

Did he just say that the 720 and even possibly the Wii U would actually have great bandwidth, albeit harder to work with? :)

No, he is talking about TB/s. This is more like what you will see with Nvidia's Volta GPU, which will be launching in 2017-2019.
Napalm_Frank
Member
(04-04-2013, 01:04 AM)

Originally Posted by Ryoku

It's possible depending on the density used. Can't think of the right term at the moment. Could be anywhere from 75GB/s to, as he said, 1TB/s or more. Surprised we still don't know the Wii U's eDRAM bandwidth... :(

I remember some people thinking Sony wouldn't give us any specs talk this time round, heh.
Zabka
Member
(04-04-2013, 01:04 AM)
Good interview but I'm still not convinced he isn't a Dana Carvey character.
ricen-beans
Member
(04-04-2013, 01:04 AM)

Originally Posted by Certinty

Every time I read what Cerny says I shake my head in disbelief.

Why? From amazement or disappointment?
Espada
Member
(04-04-2013, 01:10 AM)

Originally Posted by BlazinAm

What Cerny is stating in the interview are the aspects of the PS4 that programmers at third party studios want, at least going by twitter responses.

Yeah, but the reality is that the ones who will make the most of these features are first party developers. There's really no incentive for multiplatform developers to do so.
Last edited by Espada; 04-04-2013 at 01:17 AM.
Cidd
Member
(04-04-2013, 01:12 AM)

Originally Posted by SolidSnakex

It seems like Quantic Dream is cooking up something nice as well

Whoa, can't wait to see this. It should be amazing.
Espada
Member
(04-04-2013, 01:15 AM)

Originally Posted by Ryoku

Eh. I'm not the most well-informed person here regarding tech, but general OS functions don't require high-bandwidth memory. However, the amount of memory an OS uses can be very little or a lot depending on the features available. And considering the features and background tasks the PS4 will have, I'm going to assume more than 0.5GB of that RAM will be allocated for OS functions.

From what we've heard about the PS4's OS (BSD), 512MB sounds plausible. I don't think it's safe to go by the Wii U's 1GB OS reservation, when even Windows 7 uses ~300MB.
