
New Cerny interview with Nikkei: Talks about eDRAM, "Supercharged" parts

Tripolygon

Banned
The folks at PC Watch Japan made some diagrams of the PS4 architecture. It's more of a simplified overview of the PS4 based on what is known.

Overview

GIZpuAu.jpg


CPU Cores

aSjsYgR.jpg


Graphics

Xf4SJtp.jpg


Compute on GPU

UuT2b7D.jpg


Graphics API

fRFiTqn.jpg
 

USC-fan

Banned
hmmm 1.8 ghz +

That would be a nice surprise.

I'd love to hear more about PSSL. That's the "coding to the metal". ;)

They did a great job on those diagrams.
 
None.Zero.Zilch.

And don't forget that the PS3 had a 66.7%!!!1 advantage over the 360, yet the 360 had better games.

Except that the advantage the 360 enjoyed came from its unified RAM, which allowed assets a greater allotment of space than the PS3 could provide without a penalty.
 

RoboPlato

I'd be in the dick
The folks at PC Watch Japan made some diagrams of the PS4 architecture. It's more of a simplified overview of the PS4 based on what is known.

Overview

GIZpuAu.jpg


CPU Cores

aSjsYgR.jpg


Graphics

Xf4SJtp.jpg


Compute on GPU

UuT2b7D.jpg


Graphics API

fRFiTqn.jpg

Thanks for finding this. It helps me figure things out a lot better. Interesting info about the CPU clock speed too. I'm guessing that's still in flux, since we haven't heard anything on that front yet. I was originally thinking they were targeting matched clocks, but it sounds like that's completely unnecessary now, and it seems like the GPU is set in stone.

The feature set of the GPU sounds super impressive.
 

RoboPlato

I'd be in the dick
They should increase the CPU clock to 2 GHz and the GPU core speed to 1 GHz ^-^

In a perfect world...

If they were going to do matched clocks the highest I could see is 1.8/900 but they've been pretty open about 800MHz on the GPU since February so I doubt that'll happen. No confirmation on the CPU clock speed leads me to believe that it'll be more than 1.6 and they won't be going for matched clocks.

How much performance gain would this lead to? 20%?
25%

GPU would wind up being around 2.3 TFLOPS
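For anyone who wants to check that math, here's a quick back-of-the-envelope sketch in Python. The 18 CU / 1152 ALU count is the rumoured spec floating around this thread, not anything Sony has confirmed:

```python
# Rough shader throughput for the rumoured PS4 GPU.
# 18 compute units x 64 ALUs each = 1152 shader ALUs (rumoured, unconfirmed).
ALUS = 18 * 64
OPS_PER_CLOCK = 2  # a fused multiply-add counts as two floating-point ops

def tflops(clock_ghz):
    """Peak single-precision TFLOPS at a given GPU clock in GHz."""
    return ALUS * OPS_PER_CLOCK * clock_ghz / 1000.0

print(tflops(0.8))  # ~1.84 TFLOPS at the rumoured 800 MHz
print(tflops(1.0))  # ~2.30 TFLOPS at a hypothetical 1 GHz, the ~2.3TF above
```

So the jump from 800 MHz to 1 GHz is where the 25% figure comes from: peak FLOPS scale linearly with clock.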
 
Said it before, will say it again: this guy sounds like the best thing to happen to Sony in a long time. I generally agree with his view, but maybe that little puzzle that developers might have to solve isn't as complex as he implies. Coming from 360 development, with the 360's eDRAM, that puzzle is most likely already solved by devs. It requires a tad more care, but I don't foresee this being incredibly difficult for devs to do.

Also, Microsoft has historically been pretty good at giving developers the tools and resources they need to get the most out of their consoles. They might not have to solve a puzzle if Microsoft already does it for them.
 

Kleegamefan

K. LEE GAIDEN
Yes, Microsoft consoles have always been easy to develop for and I am sure Durango will continue this legacy....

We haven't seen an easy to develop for Sony console since the PSOne, so it is big news for the PS4.

I don't know what it is and I don't know why, but I just get a vibe from the PS4 that it will be a return to the PS2 glory years......at least it will be for gamers....
 
Yes, Microsoft consoles have always been easy to develop for and I am sure Durango will continue this legacy....

We haven't seen an easy to develop for Sony console since the PSOne, so it is big news for the PS4.

I don't know what it is and I don't know why, but I just get a vibe from the PS4 that it will be a return to the PS2 glory years......at least it will be for gamers....

I get that same sense, too, but they aren't facing the same weakened Xbox, so it'll be interesting to see what consumers do. I technically don't count, since I'm getting all the new consoles.
 

Iacobellis

Junior Member
I get that same sense, too, but they aren't facing the same weakened Xbox, so it'll be interesting to see what consumers do. I technically don't count, since I'm getting all the new consoles.

Well, they don't have to worry about Nintendo this time like they did with the PS3. But, Microsoft is the juggernaut this time. When the Xbox came out, Microsoft didn't have a good handle on how the industry worked. The system was powerful, but they didn't secure the extensive partnerships that Sony did.
 
Well, they don't have to worry about Nintendo this time like they did with the PS3. But, Microsoft is the juggernaut this time. When the Xbox came out, Microsoft didn't have a good handle on how the industry worked. The system was powerful, but they didn't secure the extensive partnerships that Sony did.

I still have faith the Wii-U will produce some incredible games, but I do feel they are in a bad position.
 

i-Lo

Member
Thanks for posting the new set of screens, danhese007. I do have two queries pertaining to information on the "overview (estimate)" image:

  • Why does it say "32 or 16 ROPs"? I thought 32 was pretty much a given, especially since cutting it to 16 effectively halves the performance from 25.6 Gpixels/sec to 12.8 Gpixels/sec, the latter being less than a 3x increase over RSX's 4.4 Gpixels/sec.
  • What does 5.5GT/sec for the GDDR5 RAM mean?
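On that first bullet, the fillrate figures are just ROP count times GPU clock. A quick sketch of the arithmetic (the 800 MHz PS4 clock is the rumoured figure; RSX's 8 ROPs at 550 MHz included for comparison):

```python
# Peak pixel fillrate = ROPs x GPU clock (MHz), in Gpixels/sec.
def fillrate_gpixels(rops, clock_mhz):
    return rops * clock_mhz / 1000.0

print(fillrate_gpixels(32, 800))  # 25.6 Gpixels/sec
print(fillrate_gpixels(16, 800))  # 12.8 Gpixels/sec -- exactly half
print(fillrate_gpixels(8, 550))   # 4.4 Gpixels/sec  -- RSX, for comparison
```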
 
Thanks for posting the new set of screens, danhese007. I do have two queries pertaining to information on the "overview (estimate)" image:

  • Why does it say "32 or 16 ROPs"? I thought 32 was pretty much a given, especially since cutting it to 16 effectively halves the performance from 25.6 Gpixels/sec to 12.8 Gpixels/sec, the latter being less than a 3x increase over RSX's 4.4 Gpixels/sec.
  • What does 5.5GT/sec for the GDDR5 RAM mean?

Yeah, the 32 ROPs is a given.

GT means gigatransfers.

poodpick for the save.
 

poodpick

Member
Thanks for posting the new set of screens, danhese007. I do have two queries pertaining to information on the "overview (estimate)" image:

  • Why does it say "32 or 16 ROPs"? I thought 32 was pretty much a given, especially since cutting it to 16 effectively halves the performance from 25.6 Gpixels/sec to 12.8 Gpixels/sec, the latter being less than a 3x increase over RSX's 4.4 Gpixels/sec.
  • What does 5.5GT/sec for the GDDR5 RAM mean?

gigatransfers
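To spell it out: 5.5 GT/sec means 5.5 billion transfers per second on each pin, so bandwidth is the transfer rate times the bus width. A quick sketch (the 256-bit bus is the rumoured figure from the leaks, not confirmed):

```python
# Memory bandwidth from GDDR5 transfer rate and bus width.
def bandwidth_gb_s(gt_per_s, bus_bits):
    # Each transfer moves one bit per pin; 8 bits per byte.
    return gt_per_s * bus_bits / 8

print(bandwidth_gb_s(5.5, 256))  # 176.0 GB/s
```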
 

JJD

Member
I get that same sense, too, but they aren't facing the same weakened Xbox, so it'll be interesting to see what consumers do. I technically don't count, since I'm getting all the new consoles.

Well, to be frank, from the look of things MS is not going to face a hard-to-code, late, and overly expensive PS3-like console either.

I never saw the original Xbox as a weak console, quite the contrary actually; it was a great console IMO, but when it launched the PS2 already had the whole market on lockdown. That's why MS had no choice but to kill it early to get the 360 out of the gate first, and that worked wonders for them.

Seems like this time Sony will not only launch first or close to the next Xbox, but they're also making sure to avoid the mistakes of the past.
 

onQ123

Member
hmmm 1.8 ghz +

That would be a nice surprise.

I'd love to hear more about PSSL. That's the "coding to the metal". ;)

They did a great job on those diagrams.


Believe in the mysterious MikeR



http://forum.beyond3d.com/showpost.php?p=1714745&postcount=815

*cough* *cough*
Jaguar Vanilla
* 1.8GHz LC Clocks (can be under-clocked for specific low-powered battery device needs - tablets, etc...).
* 2MB shared L2 cache per CU
* 1-4 CUs can be outfitted per chip. (i.e. 4-16 logical cores)
* 5-25 watts depending on the device/product. (45 watts is achievable under proper conditions)

PS4 Jaguar with chocolate syrup.
* 2GHz is correct as of now.
* 4MB of total L2 cache (2MB L2 x 2 CUs)
* 2 CUs (8 Logical cores).
* idles around 7 watts during non-gaming operations and around 12 watts during Blu-ray movie operations. Gaming is a mixed bag...

Damn, B3D mods banned him & scared him away when he was giving us good information.
 

Tripolygon

Banned
  • Why does it say "32 or 16 ROPs"? I thought 32 was pretty much a given, especially since cutting it to 16 effectively halves the performance from 25.6 Gpixels/sec to 12.8 Gpixels/sec, the latter being less than a 3x increase over RSX's 4.4 Gpixels/sec.

I think it was an error on their part.
 

mrklaw

MrArseFace
In a perfect world...

If they were going to do matched clocks the highest I could see is 1.8/900 but they've been pretty open about 800MHz on the GPU since February so I doubt that'll happen. No confirmation on the CPU clock speed leads me to believe that it'll be more than 1.6 and they won't be going for matched clocks.


25%

GPU would wind up being around 2.3 TFLOPS


AMD has quite a few overclocked options for their GPUs, including 'GHz editions', so they seem fairly flexible on that front (probably binned, so it's a cost thing). Wonder what the difference would be heat-wise?
 

benny_a

extra source of jiggaflops
Damn BY3D Mods banned him & scared him away when he was giving us good information.
Isn't it just [FORUM RUMORS], or did he leak anything that was confirmed by a third party, like when VGLeaks leaked stuff and Kotaku had the same things, as did EDGE?
 
So the CPU can read the GPU cache and vice versa?

So?
4MB L2 cache for the CPU
512KB L2 cache for the GPU

Isn't it just [FORUM RUMORS] or did he leak anything that was confirmed by a third party like when VGLeaks leaked stuff and Kotaku had the same things, as did EDGE?

He said something about Durango having an IBM CPU, if I'm not mistaken.
 

ekim

Member
So the CPU can read the GPU cache and vice versa?

So?
4MB L2 cache for the CPU
512KB L2 cache for the GPU



He said something about Durango having an IBM CPU, if I'm not mistaken.

C-C-C-Crosspost:
http://www.neogaf.com/forum/showpost.php?p=52614993&postcount=630

I don't know if this is worth it or not, but I'm gonna post it anyway.

Chen Guo
Staff/Advisory Engineer at IBM

Chen Guo's Experience

• Oban – migration of XBOX PowerPC chip to 32nm

LinkedIn

Maybe this is for the 360 or the next Xbox...
 

Master_JO

Banned
So the CPU can read the GPU cache and vice versa?

So?
4MB L2 cache for the CPU
512KB L2 cache for the GPU



He said something about Durango having an IBM CPU, if I'm not mistaken.

Most likely this one:

Quote:
Originally Posted by Albofighter
IBM is doing some kind of stacking and charlie has hinted the relation to durango, take that as you will. I expect something new compared to vgleaks specs. Gotta wait til April at the latest.

MikeR
Most likely an MCM design similar to the Wii U's Espresso, maybe a smaller silicon package, but beefier innards.
 
One thing I still don't really understand is the change from a 4-core CPU with higher clocks to 8 cores with lower clocks. Isn't this going against their new philosophy of making development as easy as they possibly can? Fewer cores are easier for devs, AFAIK. The only reasons for the change, given their new goals, that I can think of are making it as identical to the Xbox as possible just to make sure ports won't suffer, or maybe they just got it at a much better price.
 

gofreak

GAF's Bob Woodward
One thing I still don't really understand is the change from a 4-core CPU with higher clocks to 8 cores with lower clocks. Isn't this going against their new philosophy of making development as easy as they possibly can? Fewer cores are easier for devs, AFAIK. The only reasons for the change, given their new goals, that I can think of are making it as identical to the Xbox as possible just to make sure ports won't suffer, or maybe they just got it at a much better price.

Back when rumours first emerged about the change, it was said that Steamroller (the fatter, higher power core that would have come in a group of 4) simply wouldn't be ready in time for launch, so they had to switch to Jaguar.
 

Tratorn

Member
One thing I still don't really understand is the change from a 4-core CPU with higher clocks to 8 cores with lower clocks. Isn't this going against their new philosophy of making development as easy as they possibly can? Fewer cores are easier for devs, AFAIK. The only reasons for the change, given their new goals, that I can think of are making it as identical to the Xbox as possible just to make sure ports won't suffer, or maybe they just got it at a much better price.

Maybe one of the reasons was also that they need/want to reserve one core for the system?
If they had reserved one of the four Steamroller cores, the loss of power for games would have been higher than it is now.
 

Perkel

Banned
One thing I still don't really understand is the change from a 4-core CPU with higher clocks to 8 cores with lower clocks. Isn't this going against their new philosophy of making development as easy as they possibly can? Fewer cores are easier for devs, AFAIK. The only reasons for the change, given their new goals, that I can think of are making it as identical to the Xbox as possible just to make sure ports won't suffer, or maybe they just got it at a much better price.

Heat issues and yields.


Going for higher clocks = lower yields and more costly cooling = increased price.
 