Good interview but I'm still not convinced he isn't a Dana Carvey character.
Quote: Good interview but I'm still not convinced he isn't a Dana Carvey character.
The Church Lady
Confirms the memory is like we thought.
The folks at PC Watch Japan made some diagrams of the PS4 architecture. It's more like a simplified overview of the PS4 from what is known.
Overview
CPU Cores
Graphics
Compute on GPU
Graphics API
None. Zero. Zilch.
And don't forget that the PS3 had a 66.7% !!!1 advantage over the 360, yet the 360 had better games.
Except for the fact that the advantage the 360 enjoyed was its unified RAM, which allowed assets to take a greater allotment of space than the PS3 could provide without a penalty.
They should increase the CPU clock to 2 GHz and the GPU core speed to 1 GHz ^-^
Quote: How much performance gain would this lead to? 20%?
25%
Quote: 25%
Sounds worthwhile
GPU would wind up being around 2.3Tflops
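If anyone's wondering where those figures come from, it's just shader count times clock. A rough sketch, assuming the rumored 18 GCN compute units (18 x 64 = 1152 ALUs) stay fixed and only the clock moves:

Code:
# GPU throughput: 1152 shader ALUs (rumored 18 CUs x 64 lanes),
# each doing 2 FLOPs per cycle via fused multiply-add.
SHADERS = 18 * 64
OPS_PER_CYCLE = 2

def tflops(clock_ghz):
    return SHADERS * OPS_PER_CYCLE * clock_ghz / 1000

print(tflops(0.8))                    # 1.8432 at the rumored 800MHz
print(tflops(1.0))                    # 2.304  -> the "2.3Tflops" above
print(tflops(1.0) / tflops(0.8) - 1)  # 0.25   -> the 25% gain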
In one of those it says GCN2. Does that mean anything good?
Quote: Sounds worthwhile
It would also add a ton of heat and a lot more power consumption.
Quote: In one of those it says GCN2. Does that mean anything good?
Likely means GCN+
Quote: It would also add a ton of heat and a lot more power consumption.
Sony should make a bigger box, add another fan...
16 RAM chips?
Hmm, the PS4 has the potential to be larger than the PS3 was at launch...
Yes, Microsoft consoles have always been easy to develop for and I am sure Durango will continue this legacy....
We haven't seen an easy-to-develop-for Sony console since the PSOne, so it is big news for the PS4.
I don't know what it is and I don't know why, but I just get a vibe from the PS4 that it will be a return to the PS2 glory years... at least it will be for gamers...
I get that same sense, too, but they aren't facing the same weakened Xbox, so it'll be interesting to see what consumers do. I technically don't count, since I'm getting all the new consoles.
Well, they don't have to worry about Nintendo this time like they did with the PS3. But, Microsoft is the juggernaut this time. When the Xbox came out, Microsoft didn't have a good handle on how the industry worked. The system was powerful, but they didn't secure the extensive partnerships that Sony did.
In one of the slides, it says "32 or 16 ROPs". I remember seeing here at GAF that it is indeed 32 ROPs.
What does 5.5GT/sec for the GDDR5 RAM mean?
Thanks for posting the new set of screens, danhese007. I do have two queries pertaining to information on the "overview (estimate)" image:
- Why does it say "32 or 16 ROPs"? I thought 32 was pretty much a given, especially when cutting it to 16 effectively halves the performance from 25.6Gpixels/sec to 12.8Gpixels/sec, the latter being a less than 3 times increase over RSX's 4.4Gpixels/sec.
- What does 5.5GT/sec for the GDDR5 RAM mean?
Memory transfer rate. Here's a slide from the same site, for example:
http://pc.watch.impress.co.jp/video/pcw/docs/414/696/07.pdf
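So GT/sec is giga-transfers per second per pin, one bit per transfer. Multiply by the bus width for the headline bandwidth. A quick sketch, assuming the rumored 256-bit bus (not something that slide confirms):

Code:
# GDDR5 bandwidth = transfer rate x bus width, converted to bytes.
TRANSFER_RATE_GT = 5.5   # GT/s, from the slide
BUS_WIDTH_BITS = 256     # rumored 256-bit interface (assumption)

print(TRANSFER_RATE_GT * BUS_WIDTH_BITS / 8)  # 176.0 GB/s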
Quote: Why does it say "32 or 16 ROPs"? ... What does 5.5GT/sec for the GDDR5 RAM mean?
Yeah, the 32 ROPs is a given.
GT means gigatexels.
Quote: gigatransfers
poodick for the save.
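As for the ROP half of the question, the fill rates quoted above are just ROP count times clock, with each ROP writing one pixel per cycle. A quick sketch, assuming the rumored 800MHz GPU clock:

Code:
# Pixel fill rate: one pixel per ROP per clock.
GPU_CLOCK_GHZ = 0.8   # rumored 800MHz (assumption)

def fill_rate(rops):
    return rops * GPU_CLOCK_GHZ  # Gpixels/sec

print(fill_rate(32))  # 25.6 Gpixels/sec
print(fill_rate(16))  # 12.8 Gpixels/sec, under 3x RSX's 4.4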
Hmmm, 1.8 GHz+
That would be a nice surprise.
Love to hear more about PSSL. That is the "coding to the metal" part.
They did a great job on those diagrams.
*cough* *cough*
Jaguar Vanilla
* 1.8GHz LC Clocks (can be under-clocked for specific low-powered battery device needs - tablets, etc...).
* 2MB shared L2 cache per CU
* 1-4 CUs can be outfitted per chip. (i.e. 4-16 logical cores)
* 5-25 watts depending on the device/product. (45 watts is achievable under proper conditions)
PS4 Jaguar with chocolate syrup.
* 2GHz is correct as of now.
* 4MB of total L2 cache (2MB L2 x 2 CUs)
* 2 CUs (8 logical cores).
* idles around 7 watts during non-gaming operations and around 12 watts during Blu-ray movie operations. Gaming is a mixed bag...
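In case the CU terminology trips anyone up, a Jaguar "CU" here is a 4-core cluster sharing one L2, not a GPU compute unit. A quick sketch of how those totals compose:

Code:
# Jaguar compute unit (CU) = 4 cores sharing a 2MB L2.
CORES_PER_CU = 4
L2_MB_PER_CU = 2

def config(cus):
    return cus * CORES_PER_CU, cus * L2_MB_PER_CU

print(config(2))  # (8, 4)  -> PS4: 8 logical cores, 4MB total L2
print(config(4))  # (16, 8) -> maxed-out vanilla part: 16 cores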
Quote: Why does it say "32 or 16 ROPs"? I thought 32 was pretty much a given...
Believe in the mysterious MikeR
http://forum.beyond3d.com/showpost.php?p=1714745&postcount=815
Damn B3D mods banned him & scared him away when he was giving us good information.
In a perfect world...
If they were going to do matched clocks, the highest I could see is 1.8/900, but they've been pretty open about 800MHz on the GPU since February, so I doubt that'll happen. No confirmation on the CPU clock speed leads me to believe that it'll be more than 1.6 and that they won't be going for matched clocks.
Quote: Damn B3D mods banned him & scared him away when he was giving us good information.
Isn't it just [FORUM RUMORS], or did he leak anything that was confirmed by a third party, like when VGLeaks leaked stuff and Kotaku had the same things, as did EDGE?
Quote: Isn't it just [FORUM RUMORS], or did he leak anything that was confirmed by a third party...?
You didn't see the chart above?
So the CPU can read the GPU cache and vice versa?
So?
4MB L2 cache for the CPU
512KB L2 cache for the GPU
He said something about Durango having an IBM CPU, if I'm not mistaken.
I don't know if this is worth it or not, but I'm gonna post it anyway.
Chen Guo
Staff/Advisory Engineer at IBM
Chen Guo's Experience
• Oban – migration of XBOX PowerPC chip to 32nm
Maybe this is for the 360 or the next Xbox...
Quote:
Originally Posted by Albofighter
IBM is doing some kind of stacking, and Charlie has hinted at the relation to Durango, take that as you will. I expect something new compared to the VGLeaks specs. Gotta wait 'til April at the latest.
MikeR
Most likely an MCM design similar to the Wii U's Espresso, maybe a smaller silicon package, but beefier innards.
One thing I still don't really understand is the change from a 4-core CPU with higher clocks to 8 cores with lower clocks. Isn't this going against their new philosophy of making development as easy as they possibly can? Fewer cores is easier for devs, AFAIK. The only reason for the change, given their new goals, that I can think of is to make it as identical to the Xbox as possible just to make sure ports will not suffer. Or maybe they just got it at a much better price?
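Whether 8 slow cores beat 4 fast ones comes down to how parallel the workload is. A rough Amdahl's-law sketch, with a purely hypothetical 2.4GHz for the earlier 4-core option against the rumored 1.6GHz:

Code:
# Amdahl's law: throughput when a fraction p of the work runs
# in parallel on n cores and the rest runs serially.
def throughput(cores, clock_ghz, p):
    return clock_ghz / ((1 - p) + p / cores)

for p in (0.5, 0.8, 0.95):
    fast4 = throughput(4, 2.4, p)   # hypothetical 4 cores @ 2.4GHz
    slow8 = throughput(8, 1.6, p)   # rumored 8 cores @ 1.6GHz
    print(p, round(fast4, 2), round(slow8, 2))
# p=0.50: 3.84 vs 2.84 -- 4 fast cores win on lightly threaded code
# p=0.95: 8.35 vs 9.48 -- 8 slow cores only win when nearly everything scales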
Quote: You didn't see the chart above?
Which chart? The one pertaining to GHz just says "1.8GHz?"
Quote: Which chart? The one pertaining to GHz just says "1.8GHz?"
I should've said diagram, but I'm pretty sure it says >1.8 GHz