
vg247-PS4: new kits shipping now, AMD A10 used as base, final version next summer

dr_rus

Member
I'm looking at the long game. Expensive for the first year and then cheaper for the next 9.

Look at pricing of DDR2/DDR3 now and the same will likely happen with DDR3/DDR4.

GDDR5 is good but it uses lots of power and isn't it overdue to be replaced?

We've just gotten to the point where GPU memory controllers are able to drive GDDR5 to the maximum of its frequency capacity and you want to simply scrap all of this and go with a new unproven non-graphical DDR4 standard? It does make sense with regards to further cost cutting down the line but it's precisely the type of risk that SCE should avoid after all the Cell/XDR/Blu-ray PS3 fiasco.

If cost cutting is the main target it would make more sense to go with two RAM pools - one smaller GDDR5 for GPU frame buffer operations and one larger DDR3 for OS / game CPU code / cache storage - instead of going with DDR4.
 

DieH@rd

Banned
I expect that Sony will most likely make a brute-force attempt in the first motherboard version with 8-16 DDR4 chips of 512 MB capacity. Later, they will move them to stacks on the mobo, and eventually move those stacks onto a 2.5D interposer beside the APU. This will drastically simplify the PS4 motherboard.

Hynix is preparing 4Gbit [512 MB] GDDR5 chips for manufacture in early 2013, but those will not be cheap. Micron/Nanya/Samsung/Hynix have produced test samples of 4Gbit DDR4 and are expecting "quick implementation in 2013" now that JEDEC has adopted the official DDR4 spec. Micron is planning to really focus on 4Gbit chips in the beginning by making 4/8/16 GB DIMM sticks for PC use. :D

Analyst report from June 2012:
[...first they talk about the current state of DDR4, and note that if Intel does not adopt DDR4 in Haswell servers, all timetables will become fucked]

There's no rush, really, as Garber notes that the price for DDR4 is (*currently*) outrageous. A 1Gbit DDR3 chip is $1.40, while a 2Gbit chip is $1.70, and a 4Gbit DDR4 chip is $30. By December, when volume production kicks in, that will drop to $5 and by this time next year, it will be $2.50, she said.

Garber predicts DDR4 will replace DDR3 as fast as DDR3 replaced DDR2, for one reason: we have no choice. Memory makers will move to DDR4 and motherboard makers won't have a choice but to support it.

"We're down to essentially four vendors in the marketplace. When they turn their production on, they stay state of the art. When they move to DDR4, they will force that price down and force their volume customers into converting as quickly as possible. And they will all convert to DDR4 and try to get as much of the high ASPs as possible from customers before the price comes down," she said.

If this timetable goes as planned, when mass production of the PS4 starts in Aug-Oct, Sony will pay between 20 and 40 bucks for 4-8GB of DDR4 memory [let's add another $5-10 just to be safe], which is not terribly expensive. Sure, the mobo will be a monster, but they will eat that price just to get to the market before MS.
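
For what it's worth, that $20-40 range falls straight out of the chip count. A quick back-of-the-envelope sketch in Python, using the 4Gbit (512 MB) chip size and the ~$2.50 projected late-2013 price quoted above (illustrative numbers from the analyst quote, not a real BOM):

# Rough DDR4 pool cost using the analyst's projected per-chip price
CHIP_CAPACITY_GB = 0.5            # a 4Gbit chip holds 512 MB
PRICE_PER_CHIP = 2.50             # projected price "by this time next year"

def ddr4_pool_cost(total_gb, price_per_chip=PRICE_PER_CHIP):
    chips = int(total_gb / CHIP_CAPACITY_GB)
    return chips, chips * price_per_chip

for gb in (4, 8):
    chips, cost = ddr4_pool_cost(gb)
    print(f"{gb} GB = {chips} chips -> ${cost:.2f}")
# 4 GB = 8 chips -> $20.00
# 8 GB = 16 chips -> $40.00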
 

yurinka

Member
Can we expect DDR4 for the Orbis ?
If Sony as rumored is going to release the console late 2013 / early 2014, no.
Same goes for all the craziness in this thread regarding tech scheduled for 2014.
Guys, in 2014 the console needs to be made and in the stores, and the devs of the launch games need its tech in their devkits several months before its release.
And no, it won't have the tech of a $3000 PC because it's too expensive.
 

DieH@rd

Banned
If Sony wants devkits working in 2013 and to sell the console at a reasonable price, no.
Same goes for all the craziness in this thread regarding tech scheduled for 2014.
Guys, the console should be released late 2013 or early 2014, and the developers will need this tech running in their devkits several months before that.

The original rumor in the #1 post of this thread mentions that "A final version [*of the devkit*] will be delivered to developers “next summer”". By next summer, DDR4 will be in large-scale manufacturing and will not be expensive. Also, Sony can order 50 million DDR4 chips for the next X years and get a killer discount.
 

yurinka

Member
The original rumor in the #1 post of this thread mentions that "A final version [*of the devkit*] will be delivered to developers “next summer”". By next summer, DDR4 will be in large-scale manufacturing and will not be expensive. Also, Sony can order 50 million DDR4 chips for the next X years and get a killer discount.
It makes sense to have the final kit in summer if the release is in fall/winter. But AAA launch games need way more than 6 months of development. During the last 6 months, with the game basically already developed, they will mostly be bugfixing, trying to fit whatever they developed for other specs in there, tweaking this or that, trying to make it work with the latest FW version of the console and the latest SDK version, etc. Which tends to be painful.

The more time they have, the better.
 
If Sony as rumored is going to release the console late 2013 / early 2014, no.
Same goes for all the craziness in this thread regarding tech scheduled for 2014.
Guys, in 2014 the console needs to be made and in the stores, and the devs of the launch games need its tech in their devkits several months before its release.
And no, it won't have the tech of a $3000 PC because it's too expensive.

you act like tech corps aren't in talks with other tech corps in regards to technology that hasn't seen the light of day in the public.
 

StevieP

Banned
you act like tech corps aren't in talks with other tech corps in regards to technology that hasn't seen the light of day in the public.

His thinking is solid, though. Most of the time you need to launch a mass-market product with parts that are available in mass-manufacturing volumes.

As an example, current fabs aren't even running 28nm at as high a volume percentage as the more mature nodes. As of next year, they might be... but 22nm will probably represent less than 5% of their production, and you can't do consoles like that.
 

Elios83

Member
you act like tech corps aren't in talks with other tech corps in regards to technology that hasn't seen the light of day in the public.

He's right.
Consoles are never manufactured with state-of-the-art technologies; they don't use the latest manufacturing process and so on.
These things are designed to be manufactured at low cost in millions of units, so the yields must be well proven.
Sony tried to break the rule with Blu-ray, which at that time was state of the art, and they ended up with delays, manufacturing costs out of control, a high retail price and huge losses. They did it with Blu-ray because they had a lot of investments tied to the technology, but no one is going to do that for the sake of good graphics like a few delusional people are thinking :D
Also, the design process started in 2010 with a target release at the end of 2013. At that time they could only decide to use the technologies they knew would be a standard for the intended release period.
So, just to keep it short, these consoles will be made with the well-proven 28nm manufacturing process for the main chips, and the RAM will be either DDR3 or GDDR5 or a combination of the two if there is a discrete GPU and they decide to go with two different memory pools as in the PS3.
 

THE:MILKMAN

Member
We've just gotten to the point where GPU memory controllers are able to drive GDDR5 to the maximum of its frequency capacity and you want to simply scrap all of this and go with a new unproven non-graphical DDR4 standard? It does make sense with regards to further cost cutting down the line but it's precisely the type of risk that SCE should avoid after all the Cell/XDR/Blu-ray PS3 fiasco.

If cost cutting is the main target it would make more sense to go with two RAM pools - one smaller GDDR5 for GPU frame buffer operations and one larger DDR3 for OS / game CPU code / cache storage - instead of going with DDR4.

I just feel that they all have pros and cons and IMO, DDR4 wins by a nose.

According to VG247, a third near-final Orbis devkit will be delivered in January, so hopefully we find out once and for all what memory will be in it.
 
I just feel that they all have pros and cons and IMO, DDR4 wins by a nose.

According to VG247, a third near-final Orbis devkit will be delivered in January, so hopefully we find out once and for all what memory will be in it.

DDR4's success is dependent on AMD and Intel adopting the technology. I remember how long it took Intel to put native USB 3.0 support on their chipsets, and I fear AMD needs the success more than Intel does. Just because DDR4 prices *will* drop at some point in the future, I don't see the big relevance for Sony.

Going from XDR/GDDR3 (that split pool was a mistake) to a unified pool of GDDR5 is much more logical. It is established, available and predictable technology which just took the next step to 2GB modules - and I am certain there will be a lot of consumer GPUs with GDDR5 RAM in the next few years. Whereas DDR4 was just finalized and right now is neither proven nor cheaper compared to GDDR5. If the consoles launched in 2015-16 the risk with DDR4 would be a lot smaller, but for a 2013/14 launch I just don't see it.
 

i-Lo

Member
There are 2 schools of thought on this one. If you want fast memory (such as GDDR5) you will have less of it. The current "cap" of common sense for that would be 2GB, with any more becoming cost-prohibitive and complicating the design, which complicates future cost reduction.

FYI, Microsoft currently has 12GB of DDR3 in their dev kits (of which the final console will have 8GB - some will be reserved). Feasibly they could replace this with DDR4 in the final console, since DDR4 is basically similar to DDR3 but running a bit faster and a bit cooler, but DDR4 is not out yet.

Sony's original "target spec" sheet indicated a 4-core steamroller CPU and 2GB of GDDR5. There have been many rumours that say the Steamrollers have changed to Jaguars, and I would bet that Sony's taken a look at Microsoft's larger amount of slower memory and said "well that could fuck us for ports... how bout we do the same thing? DDR3 for now, maybe DDR4 later"

The recent rumour about two dev kits being sent out with 8 and 16GB respectively would suggest that the original 2GB is now out of the picture, primarily because of, as you aptly put it, ports. This leaves us with three possibilities: 4, 6 and 8GB. Once again, as aforementioned, if the memory is split and GDDR5 is used then 2GB is the logical amount. Otherwise it could be DDR3 or 4. The pertinent question is whether the memory bus would be the same constricted 128-bit or something larger, and whether the SoC design has an impact on it.

I just feel that they all have pros and cons and IMO, DDR4 wins by a nose.

According to VG247, a third near-final Orbis devkit will be delivered in January, so hopefully we find out once and for all what memory will be in it.

IIRC, the final dev kits will be out Summer 2013. That means even the one in January will be a close approximation. If Uncharted 1 vs 3 or Halo 3 vs 4 is anything to go by, the next projects that have already gone into development are at a disadvantage because of the lack of final hardware specs. Of course, on the business side of things, it's a decision that'll make the early birds money.

At least, if rumours are to be believed, the good news is that there is a power increase with each new iteration of the dev kit. But considering the rumour that the earliest PS4 spec had it pegged with a 7790 (never released but specs matched those of rumoured PS4's GPU) or an equivalent (could have been even the 7770 or 7750 as StevieP's conservative estimates were) for a GPU and 2GB of RAM, it doesn't tell us much.
 

RaijinFY

Member
DDR4's success is dependent on AMD and Intel adopting the technology. I remember how long it took Intel to put native USB 3.0 support on their chipsets, and I fear AMD needs the success more than Intel does. Just because DDR4 prices *will* drop at some point in the future, I don't see the big relevance for Sony.

Going from XDR/GDDR3 (that split pool was a mistake) to a unified pool of GDDR5 is much more logical. It is established, available and predictable technology which just took the next step to 2GB modules - and I am certain there will be a lot of consumer GPUs with GDDR5 RAM in the next few years. Whereas DDR4 was just finalized and right now is neither proven nor cheaper compared to GDDR5. If the consoles launched in 2015-16 the risk with DDR4 would be a lot smaller, but for a 2013/14 launch I just don't see it.

That would actually be the main point!!! Going with a product that lowers its price over time allows a lower BOM and enables price cuts. DDR4 is going to be adopted as an industry standard over the next few years, unlike XDR, which is a very boutique component (hence expensive). When you build a console you have to put the emphasis on how the cost will go down over time, not just what the cost will be at launch. That requires planning... basically everything that wasn't done with the PS3, for weird reasons.

Also, the XDR/GDDR3 split wasn't a deliberate choice but more a forced compromise. At first they wanted a Toshiba-made GPU but they switched very late to an nVidia solution. I suspect the Toshiba-made GPU would have allowed XDR, but nVidia (like everyone else) doesn't like Rambus, so Sony had to stick with GDDR3.
 

Elios83

Member
That would actually be the main point!!! Going with a product that lowers its price over time allows a lower BOM and enables price cuts. DDR4 is going to be adopted as an industry standard over the next few years, unlike XDR, which is a very boutique component (hence expensive). When you build a console you have to put the emphasis on how the cost will go down over time, not just what the cost will be at launch. That requires planning... basically everything that wasn't done with the PS3, for weird reasons.

Also, the XDR/GDDR3 split wasn't a deliberate choice but more a forced compromise. At first they wanted a Toshiba-made GPU but they switched very late to an nVidia solution. I suspect the Toshiba-made GPU would have allowed XDR, but nVidia (like everyone else) doesn't like Rambus, so Sony had to stick with GDDR3.

Both PS2 and PS3 were designed to use niche Rambus memories just to get a slightly higher bandwidth compared to the standard PC memories of the time, which would have cost much less.
The issue of long-term memory price is not something which has traditionally been given much importance, for two reasons. Number one, these companies are not PC enthusiasts looking for an obsolete memory module for an old PC and finding that prices are higher than those of modern modules; they make mass-production orders, more than 10 million a year, and with such huge quantities the cost discrepancy ends up being irrelevant.
The second reason is that these consoles are designed for a long life cycle, 10 years on the market before being discontinued. In such circumstances it's impossible to select components that will not become obsolete anyway; they'll always end up manufacturing things just for that particular product.
With DDR4 there are key problems: initially it will be more expensive than DDR3, it's still not ready, and unproven manufacturing can lead to component shortages which can affect the release schedule. The risks are too high, so the chances of it being used in a console launching in 2013 are really slim imo.
 

RaijinFY

Member
Both PS2 and PS3 were designed to use niche Rambus memories just to get a slightly higher bandwidth compared to the standard PC memories of the time, which would have cost much less.
The issue of long-term memory price is not something which has traditionally been given much importance, for two reasons. Number one, these companies are not PC enthusiasts looking for an obsolete memory module for an old PC and finding that prices are higher than those of modern modules; they make mass-production orders, more than 10 million a year, and with such huge quantities the cost discrepancy ends up being irrelevant.
The second reason is that these consoles are designed for a long life cycle, 10 years on the market before being discontinued. In such circumstances it's impossible to select components that will not become obsolete anyway; they'll always end up manufacturing things just for that particular product.
With DDR4 there are key problems: initially it will be more expensive than DDR3, it's still not ready, and unproven manufacturing can lead to component shortages which can affect the release schedule. The risks are too high, so the chances of it being used in a console launching in 2013 are really slim imo.

It's going to be more expensive than DDR3, but have you seen the price of DDR3? It is so cheap nowadays that even if DDR4 is more expensive, it's still cheap.
 

RaijinFY

Member
The recent rumour about two dev kits being sent out with 8 and 16GB respectively would suggest that the original 2GB is now out of the picture, primarily because of, as you aptly put it, ports. This leaves us with three possibilities: 4, 6 and 8GB. Once again, as aforementioned, if the memory is split and GDDR5 is used then 2GB is the logical amount. Otherwise it could be DDR3 or 4. The pertinent question is whether the memory bus would be the same constricted 128-bit or something larger, and whether the SoC design has an impact on it.



IIRC, the final dev kits will be out Summer 2013. That means even the one in January will be a close approximation. If Uncharted 1 vs 3 or Halo 3 vs 4 is anything to go by, the next projects that have already gone into development are at a disadvantage because of the lack of final hardware specs. Of course, on the business side of things, it's a decision that'll make the early birds money.

At least, if rumours are to be believed, the good news is that there is a power increase with each new iteration of the dev kit. But considering the rumour that the earliest PS4 spec had it pegged with a 7790 (never released but specs matched those of rumoured PS4's GPU) or an equivalent (could have been even the 7770 or 7750 as StevieP's conservative estimates were) for a GPU and 2GB of RAM, it doesn't tell us much.

That doesn't exist.
 
I'm a bit conflicted. Extending this gen means no games like BF4, Watch Dogs, Crysis 3 and whatnot at ultra-setting graphics for the consoles. But I currently have a lot of fun with the PS3 and I will surely pick up Rising, DmC, Ni No Kuni, Tomb Raider, and that's just for Q1.

When the PS4 is reality you can be very certain that PS3 support will die quickly. The PS2 is still around, but it actually didn't get many relevant games once 2006 was done. FFXII and GoW2 were its swan songs, I think.
Persona 4
 
I don't think memory chip production is as complex as graphics chip production, it seems like they only had to wait for the JEDEC spec adoption to start mass production. I really think there is a good chance we will see DDR4 in the PS4.
 
If not DDR4, then what? I thought production for GDDR5 is getting significantly cut because Samsung is stopping their factories for those? Do you think it'll be DDR3?
 

jaosobno

Member
If not DDR4, then what? I thought production for GDDR5 is getting significantly cut because Samsung is stopping their factories for those? Do you think it'll be DDR3?

DDR4 or GDDR5 unified pool.

DDR3+GDDR5 split pool. I dread this combination because I believe that it will once again lead to "RAM shortage" late in next gen, aka Skyrim situation on PS3. I hope Sony learned their lesson regarding split pool from PS3.
 
Reflowing it (baking it lol) is a temporary fix, so be warned. The problem is Sony and MS use really cheap lead-free solder on the BGAs. After continuous heating and cooling, the solder can crack and separate from the motherboard when the board warps under heat. Reflowing actually weakens the solder further. The only way to fix it is to have the system re-balled. That is, replace the BGA solder balls with higher quality leaded solder.

Did you break the clamp off of the ribbon cable? That can be fixed.
See PM
 
Anyway, a bit on topic, if AMD is going to design something that requires new manufacturing technology, I'd think they'd test it on high-end video cards first. They're high-margin, low-volume products that are a lot more forgiving of delays and shortages than a console launch would be. We don't have long before AMD would have to launch something like that in order to learn lessons in time for them to be applied to the console manufacturing.

[Image: AMD_Interposer_SemiAccurate.jpg]

[Image: 17a.jpg]


Other than testing with sample quantities of DDRX wide-IO stacked DRAM, as in the above pictures, there is nothing they can do, because the memory is not available in quantity yet. The current designs use GDDR5, which AMD helped develop because there was no memory being produced for the PC market that was fast enough for GPUs.

A 2014 APU or GPU design could use wide IO stacked DRAM but a 2013 design can't as it's not available yet. Game console volumes (BOTH PS4 and Xbox3) are driving the industry to create custom DDR4 stacked wide IO memory in advance of it being ready for PCs in 2015-2016.

2.5D memory with the wide connections described is speculated by a semiconductor research firm [rumor] to be used in the GPU of the next generation of the PS3 (the PS4). This Sony lecture is used as the basis of the ultra-wide memory speculation.

http://www.i-micronews.com/upload/Rapports/3D_Silicon_&Glass_Interposers_sample_2012.pdf

[Image: 95dd2b6d.jpg]


Micron stockholder meeting, August 2011, discussing custom memory for game consoles:

Graphics and consumer. Fair to say, a little bit of a slowdown here, specifically in the DTV segment. I'll speak more about what's happening in game consoles as well. A pretty good push for more memory coming up in the Game Console segment as a level of redesigns. We'll start to hit it over the next couple of years.

And talking about consumer again here. I thought it'd be beneficial to show you across a couple of key applications how this looks in terms of megabyte per system. On the left, what we have are game consoles. This is a space that's been pretty flat for a number of years in terms of the average shipped density per system. That's going to be changing here pretty quickly. I think everyone realizes that these systems are somewhat clumpy in their development. The next generation of system is under development now and that because of 3D and some of the bandwidth requirements, drives the megabyte per console up fairly quickly. So we're anticipating some good growth here.

We've worked with a number of these vendors specifically on both custom and semi-custom solutions in that space.
I can't stress this enough: the PS4 and Xbox3 will be using 2014 memory and designs built to use that memory. 2014 is the AMD target date for their 3D stacking, third-party SoCs (game console SoCs), 20nm and full HSA designs. 2014 is when Micron will have stacked wide-IO DDR4 custom memory ready for game consoles. A 2014 design tapes out in 2013!

I feel like slapping you guys on the head NCIS-style: everyone in the industry, outside the internal AMD-Sony-Microsoft development, is assuming 3D stacked DDR4 ultra-wide-IO memory. DDR4 was designed to be stacked! TSVs enable wide-IO memory, and interposers allow 2.5D connection of logic to memory with wide IO paths. All of this is being used, and if one part is missing then DDR4 is not viable, as DDR3 is cheaper and faster; i.e. Samsung, Micron and others would not be making DDR4.

3D wafer-stacked memory will be ready for game consoles in 2013-2014 and provides even more efficiencies when inside the SoC.

Edit: the arguments in the posts above mine are valid for 28nm being used, not 20nm, and this is why I and others are only giving a slightly better than even chance of 20nm being used for a game console SoC.
 
Samsung never said they will exit the GDDR5 market; this was, as so often, an SA rumor based on the "DDR4 memory cube future". Hynix, as someone already posted, has 4Gbit modules ready, offering an easier way to get more RAM on a GPU without too many connections/too wide a bus. So far we can't predict the DDR4 market - maybe AMD delays their plans for some time, consumers don't want to change, Intel is not ready, Nvidia is betting on GDDR5...

Maybe Sony and Hynix, Micron or even Samsung already have a deal going for GDDR5 + future "designs"; we don't know that either. So either choice, DDR4 or GDDR5, is a valid guess for now.
 
Samsung never said they will exit the GDDR5 market; this was, as so often, an SA rumor based on the "DDR4 memory cube future". Hynix, as someone already posted, has 4Gbit modules ready, offering an easier way to get more RAM on a GPU without too many connections/too wide a bus. So far we can't predict the DDR4 market - maybe AMD delays their plans for some time, consumers don't want to change, Intel is not ready, Nvidia is betting on GDDR5...

Maybe Sony and Hynix, Micron or even Samsung already have a deal going for GDDR5 + future "designs"; we don't know that either. So either choice, DDR4 or GDDR5, is a valid guess for now.
Yes, Samsung never said they will exit the GDDR5 market; what was said is that Samsung is not putting any new R&D money into GDDR5. Reading between the lines, this means they don't think it has a future even with new, improved versions.

Yes, stacked wide-IO DDR4 or GDDR5 are both valid choices if you consider ONLY memory bandwidth. But there are other considerations: long-term price (at least until the next refresh), power consumption, drive power and motherboard complexity. DDR4 was designed to address ALL of these design constraints and will be used with Intel Haswell-EX next year.

Power consumption, for instance: the PS4 and Xbox3 will always be on, with parts of the SoC sleeping, but some parts, like memory, always need power. Two different memory pools or a lower memory clock are possibilities for this. GDDR5, even at a lower clock, wastes more power than DDR4. Standby power consumption will fall under mandated and optional power standards during the early years of the PS4, and we STRONGLY suspect the Sony and Microsoft game consoles will be tasked as media and cloud game servers for other CE platforms in and out of the home.

http://forum.beyond3d.com/showpost.php?p=1682016&postcount=15680

http://forum.beyond3d.com/showpost.php?p=1682044&postcount=15686

http://forum.beyond3d.com/showpost.php?p=1682189&postcount=15696

http://forum.beyond3d.com/showpost.php?p=1682208&postcount=15700

Edit: I'm hosting a Nielsen Ratings rack at my business. The rack of 9 PCs, VPN tunnel, cable modem and switch costs me about $110.00 a month in electricity. This can be replaced by netbooks and HomeRun tuners now, with a power savings of $80.00/month, or in 2015 with PCs made by AMD. That's a 3-year payoff at today's electricity costs and, now that Obama won, when carbon taxes ILLEGALLY go into executive order by the EPA, 2 years.

Replacing PS3s that burn 61 watts average at a navigation screen with a PS4 that burns less than 35 watts at the navigation screen (optional now but mandated in California within 1+ years) will result in multi-megawatt savings per year for the US.
 
Yes, Samsung never said they will exit the GDDR5 market; what was said is that Samsung is not putting any new R&D money into GDDR5. Reading between the lines, this means they don't think it has a future even with new, improved versions.

Yes, stacked wide-IO DDR4 or GDDR5 are both valid choices if you consider ONLY memory bandwidth. But there are other considerations: long-term price (at least until the next refresh), power consumption, drive power and motherboard complexity. DDR4 was designed to address ALL of these design constraints and will be used with Intel Haswell-EX next year.

Power consumption, for instance: the PS4 and Xbox3 will always be on, with parts of the SoC sleeping, but some parts, like memory, always need power. Two different memory pools or a lower memory clock are possibilities for this. GDDR5, even at a lower clock, wastes more power than DDR4. Standby power consumption will fall under mandated and optional power standards during the early years of the PS4, and we STRONGLY suspect the Sony and Microsoft game consoles will be tasked as media and cloud game servers for other CE platforms in and out of the home.

http://forum.beyond3d.com/showpost.php?p=1682016&postcount=15680

http://forum.beyond3d.com/showpost.php?p=1682044&postcount=15686

http://forum.beyond3d.com/showpost.php?p=1682189&postcount=15696

http://forum.beyond3d.com/showpost.php?p=1682208&postcount=15700

Edit: I'm hosting a Nielsen Ratings rack at my business. The rack of 9 PCs, VPN tunnel, cable modem and switch costs me about $110.00 a month in electricity. This can be replaced by netbooks and HomeRun tuners now, with a power savings of $80.00/month, or in 2015 with PCs made by AMD. That's a 3-year payoff at today's electricity costs and, now that Obama won, when carbon taxes ILLEGALLY go into executive order by the EPA, 2 years.

Replacing PS3s that burn 61 watts average at a navigation screen with a PS4 that burns less than 35 watts at the navigation screen (optional now but mandated in California within 1+ years) will result in multi-megawatt savings per year for the US.


I doubt that complexity will go down with DDR4, at least not right from the start. Furthermore, to achieve comparable speeds you need a bigger bus, which adds complexity as well. Concerning the power issues, I don't know enough, except that yes, DDR4 uses less voltage than GDDR5, but how this stacks up in real life I can't tell, and I don't know how much power it would need. Not sure, but can't you just reserve e.g. 256MB of GDDR5 during standby and downclock it? That might not be as good as DDR4 but would satisfy current laws. DDR4 will only be cheaper in the long run if the adoption rate is high enough - I am sure Sony will calculate whether an eventual DDR4 refresh down the road is worth sacrificing bandwidth and trusted GDDR5 technology. After all, who knows if GDDR7 won't be the major GPU standard in 2016 ;-)
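
To put rough numbers on the "bigger bus" point: peak bandwidth is just per-pin data rate times bus width, so matching GDDR5 with slower DDR4 means widening the interface (or stacking with wide IO). A quick sketch with ballpark per-pin rates of my own choosing, not leaked specs:

# peak bandwidth (GB/s) = data rate per pin (Gbit/s) * bus width (bits) / 8
def peak_bw_gbs(rate_gbps_per_pin, bus_width_bits):
    return rate_gbps_per_pin * bus_width_bits / 8

print(peak_bw_gbs(5.5, 256))    # GDDR5 at 5.5 Gbps on a 256-bit bus  -> 176.0 GB/s
print(peak_bw_gbs(2.133, 256))  # DDR4-2133 on the same 256-bit bus   -> ~68.3 GB/s
print(peak_bw_gbs(2.133, 512))  # DDR4-2133 needs ~512 bits (or wide-IO stacking) for ~136.5 GB/s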
 
I doubt that complexity will go down with DDR4, at least not right from the start. Furthermore, to achieve comparable speeds you need a bigger bus, which adds complexity as well. Not sure, but can't you just reserve e.g. 256MB of GDDR5 during standby and downclock it? That might not be as good as DDR4 but would satisfy current laws. DDR4 will only be cheaper in the long run if the adoption rate is high enough - I am sure Sony will calculate whether an eventual DDR4 refresh down the road is worth sacrificing bandwidth and trusted GDDR5 technology. After all, who knows if GDDR7 won't be the major GPU standard in 2016 ;-)
Yes, DDR4 ultra-wide IO would require an interposer, but I expect an interposer would be cheaper than the motherboard complexity and heat sinking that GDDR5 would require. And if there is a second GPU, it's likely to sit on the same interposer as the APU and memory, so an interposer is probably needed in any case.

Concerning the power issues, I don't know enough, except that yes, DDR4 uses less voltage than GDDR5, but how this stacks up in real life I can't tell, and I don't know how much power it would need.
GDDR5 would need a heat sink and can't be stacked more than 2 high, while DDR4 can be stacked 8 high and with a wider interface. DDR4 ultra-wide IO uses MUCH MUCH less power because it's designed with a lower drive voltage and can be clocked slower (in an ultra-wide IO configuration) yet still achieve the same bandwidth.

Power use is going to be a big issue in the near future; look at all the power-saving features in AMD chipsets. Handhelds are leading the way with techniques to be more efficient and save battery life. These wide-IO standards give DDRX memory 2Tb/sec bandwidth at maximum clock/power, but at lower clock speeds they provide power efficiency at around 200 GB/sec, which is where we need to be to support the Sony CTO statements of 10X GPU and 300FPS video streams.
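
A rough way to see why a wide, slow interface wins on power: dynamic I/O power scales roughly with V^2 * f, while bandwidth scales with width * f, so halving the clock and doubling the width keeps bandwidth while the lower DDR4 signalling voltage cuts power further. A toy calculation (the 1.5 V / 1.2 V figures are the nominal GDDR5 and DDR4 I/O voltages; everything else is illustrative):

# relative dynamic power ~ V^2 * f (capacitance folded into the constant)
def rel_power(voltage, rel_clock):
    return voltage ** 2 * rel_clock

gddr5     = rel_power(1.5, 1.0)   # GDDR5 at full clock on a narrow bus
ddr4_wide = rel_power(1.2, 0.5)   # DDR4 at half clock on a 2x-wide (wide-IO) bus, same bandwidth
print(ddr4_wide / gddr5)          # ~0.32 -> roughly a third of the I/O power for equal bandwidth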

All the pictures and charts in my post above dealing with stacked memory, except one, are AMD-generated, yet we are still discussing GDDR5, which can't be stacked. You and others are assuming OLD designs... you could be correct but I hope not. There is no slide or drawing speculating GDDR5 for next-generation game consoles, but several for ultra-wide IO DRAM, which is assumed to be DDR4 but could be DDR3.

1) PS4 to have 2.5D stacked ultra wide IO memory
2) Game consoles to have stacked ultra IO memory

Both of the above are informed speculation by the INDUSTRY. Try to find anything for GDDR5 from the industry - plenty on NeoGAF and B3D from supposed leaks, but nothing from the industry. Developer platforms are being used to project a PS4 design without thinking about the endgame requirements like mandated power rules for game consoles and always-on server features. We have a fair idea of what the PS4 and Xbox 3 are going to support, as well as target minimum specs, from Sony technology officers.

The following applies to the PS3 and Xbox 360 but next generation is assumed to be more powerful and consume more power so it needs even more power management to meet future mandated specs.

http://www.neogaf.com/forum/showpost.php?p=41486908&postcount=6 said:
Tier 3 Energy Star requirements for game consoles:
Auto power off
Standby power: 0.5 W
Active navigation menu: 35 W
Active streaming media: 45 W

The PS3 3000 chassis consumes:
Standby: 0.5 W
Active navigation menu: 61 W
Playing a game: 72-79 W

Xbox 360 S (Valhalla):
Standby
Dashboard: 67 W
Gaming: 80 W


The last 4 PS3 refreshes had the following power savings: 20, 10, 5 and 4 watts. The small numbers for the last few are because it's only been either the RSX or the Cell that had power savings, not both at the same time. But do you see the trend? Power savings are getting smaller with die-size reductions; it's not half or some fixed ratio. My impression is that just going from 45nm to 32nm is not going to take the Xbox, for example, from 67 to 35 watts. The figures I used above are for the XMB menu, where most of the power savings is in idle current. For full-on game use the savings from die-size reductions are (supposed to be) less (20, 11, 5, 1). The last was the PS3 3000 chassis, where only the manufacturing technique for the Cell was changed.
Going to 28nm is just about the same as 32nm, so a move from 40nm to 28nm is not going to save 30 watts at the navigation menu (XMB) and comply with the current optional specs, which I think will be mandated in a little over a year. Certainly GDDR5 is going to exacerbate this, while ultra-wide IO DDR4 will use much less power than the PS3's memory.
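
Putting that trend argument into simple numbers (the per-refresh savings and the 61 W / 35 W figures are the ones quoted above; the projection is just naive extrapolation, not a measurement):

# Power saved at the XMB/navigation menu by each recent PS3 refresh, in watts (per the post above)
savings_per_refresh = [20, 10, 5, 4]
ps3_xmb_now = 61      # watts, PS3 3000 chassis at the navigation menu
tier3_target = 35     # watts, Tier 3 Energy Star navigation-menu figure

# Even if the next shrink matched the best recent refresh (~5 W saved), the gap stays large:
print(ps3_xmb_now - 5)                  # 56 W, still far above the 35 W target
print(ps3_xmb_now - tier3_target)       # 26 W gap that a shrink alone, on this trend, won't close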

Any refresh of the Xbox 360 or PS3 from this point on, and the PS4 and Xbox 720, will have to be designed to meet mandatory power requirements for game consoles. Judging from the power-savings trend above, 20nm for the PS3 by itself is not going to meet the mandated standard either. It's going to take smart management of individual components (modern hardware) and ultra-wide IO DDR4. So either the PS3 is dead next year or it's getting a total redesign to use modern hardware. The recent news of Sony selling the PS3 in China seems to indicate a longer life, which implies a total redesign at 20nm.

I've said this before: I speculated that modern hardware would be used with the PS3 4K chassis and I was wrong. It didn't happen, and the power savings for the 4K chassis is only about 4 watts on my meter, which is within the margin of error. The PS3 4K chassis and Xbox 360/S chassis cannot be sold in California in a little over a year - what does this mean? Either they are to be discontinued soon after the next generation is released, or they are getting massive redesigns using modern hardware. If BC is provided in the PS4 and Xbox 720, then a cut-down version of the newer console SoCs could work. It's possible that PS4 SoCs binned as defective might work as PS3 SoCs, if they have BC.
 

THE:MILKMAN

Member
Jeff_Rigby said:
The PS3 4K chassis and Xbox 360/S chassis cannot be sold in California in a little over a year - what does this mean? Either they are to be discontinued soon after the next generation is released...

Surely that isn't the case? Wouldn't the new rules be for newly launched hardware after the date they come into "force"?

Silly if not.
 
Surely that isn't the case? Wouldn't the new rules be for newly launched hardware after the date they come into "force"?

Silly if not.
No grandfathering.

StevieP said:
I don't think they're going to be shrinking the Cell more, Jeff.
Then how is the PS3 to be sold in the US past the California law, and how can the price and power requirements drop so it can be sold in India, China and South America?

Either it's dead or it's going through a major redesign refresh. No other choices, right? One possibility is BC in the PS4, if it's cheap enough to fill all price points.
 

i-Lo

Member
Pertaining to power consumption, one noticeable thing (thankfully) is that there are no restrictions being placed on how much power the system can draw when actually playing games.

Was this an impetus for proceeding with an APU?

One would assume that the discrete GPU will be inactive until a game is played.
 
This is the same Seronx who speculated previously on the PS4 and Xbox 720. He is not an insider, and these are not leaks, just speculation by someone who follows the rumors and is very well informed about AMD products.

http://forums.anandtech.com/showpost.php?p=34282049&postcount=191 said:
Playstation 4 is using Thebe-J which hasn't finished yet nor is it related to the Jaguar or Trinity or Kaveri architectures. The only one that is showing any signs of finalization is Xbox's Kryptos which is a >450 mm² chip. To get back on Thebe-J it was delayed from Trinity side-by-side development to Kaveri side-by-side development.

I assume if they are going to use Jaguar it is going to be in a big.LITTLE formation, which will have them in a configuration where the Jaguar portion controls all of the system, OS, etc. stuff that generally isn't compute-intensive, while the Thebe portion controls all of the gaming, HPC, etc. stuff that is generally compute-intensive. Since the performance part of the PlayStation Orbis was upgraded each year, it is safe to assume that they are going for an APU with those specs.


First:
A8-3850 + HD 7670
400 VLIW5 + 480 VLIW5 => 880 VLIW5 -> VLIW4 => 704

Second:
A10-5700 + HD-7670
384 VLIW4 + 480 VLIW5 => 864 VLIW4/5 -> VLIW4 => 768

I have heard that the third generation of the test Orbis uses an APU with GCN2.
Unknown APU + HD8770
384 GCN2 + 768 GCN2 -> 1152 GCN2

It is assumed that the APU only has four cores because AMD doesn't plan to increase the core count other than the GPU cores from now on.
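
(Reading the shader arithmetic in that quote: the VLIW5 counts are simply scaled by 4/5 when expressed as VLIW4 equivalents, and the GCN2 counts are just summed. A quick check, using only the numbers quoted above:)

# sanity-checking the quoted shader-count conversions (4 VLIW4 lanes ~ 5 VLIW5 lanes)
def vliw5_to_vliw4(count):
    return count * 4 // 5

print(vliw5_to_vliw4(400 + 480))   # first kit:  880 VLIW5             -> 704 VLIW4-equivalent
print(384 + vliw5_to_vliw4(480))   # second kit: 384 VLIW4 + 480 VLIW5 -> 768 VLIW4-equivalent
print(384 + 768)                   # third kit:  1152 GCN2 ALUs (APU + HD8770), no conversion needed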
If I'm reading this correctly, then it's 1 Jaguar CPU package (4 x86 Jaguar CPUs) and 3 higher-performance CPUs; the "side by side development with Kaveri" might mean the same CPUs as in Kaveri. The Jaguar (lower-power, more efficient) CPUs control the OS (little), and the Kaveri-type CPUs (big) would be used where needed. 2014 AMD designs choose the best CPU for a task; the example given by AMD is choosing to use the GPU or CPU for compute tasks based on which would do it better. This can be extended to using more power-efficient CPUs for OS tasks (little) and desktop CPUs for games (big). Power usage is going to be a big issue due to mandated game console power usage laws.

Side by side development: I assume test chips on a few wafers at a time have been made every 2-3 months or so for more than a year. These wafers probably contain multiple designs at the same time as long as they are compatible with the same process. For example Thebe-J chips would be included with Kaveri and other AMD projects like discrete GPUs that all use the same process.

I don't know how accurate Seronx might be but his speculation does allow for a wider view on next generation designs.

The above is speculation and I can see where he is getting this: Thebe-Jaguar = big.little

I don't think he has any information other than speculation based on the hyphen between Thebe and Jaguar. It disagrees with Sweetvar26, who said 2 Jaguar packages with cache, which leaves room for 2 more CPUs; I earlier speculated those might be two 1PPU+3SPU CPU packages, which, if they are more modern versions, might be better at game code than Kaveri CPUs. Kaveri CPUs & Jaguar CPUs is the other possibility, a mix of low-power and high-power x86 CPUs for different jobs.

Edit: Are these "x86" CPUs to use ONLY AMD's -64 extension to x86? Is there any need to support older Intel-copyrighted 16-bit x86?

HSA IL would require x86 CPUs for AMD libraries for the virtual engine, but game code is more to the metal. The die sizes of all these CPUs & CPU packages are similar, i.e. 4 Jaguar CPUs in a CPU package = 1 Kaveri x86 CPU = a 1PPU+3SPU MSA CPU package.

Rumors of Sony not having the ARM A5 for TrustZone would support Sony still using SPEs with an encryption key buried in hardware, similar to the PS3. Remember developers commenting on the PS4 being harder to develop for than the Xbox 720.
 
This is the same Seronx who speculated previously on the PS4 and Xbox 720. He is not an insider, and these are not leaks, just speculation by someone who follows the rumors and is very well informed about AMD products.

If I'm reading this correctly, then it's 1 Jaguar CPU package (4 x86 Jaguar CPUs) and 3 higher-performance CPUs; the "side by side development with Kaveri" might mean the same CPUs as in Kaveri. The Jaguar (lower-power, more efficient) CPUs control the OS (little), and the Kaveri-type CPUs (big) would be used where needed. 2014 AMD designs choose the best CPU for a task; the example given by AMD is choosing to use the GPU or CPU for compute tasks based on which would do it better. This can be extended to using more power-efficient CPUs for OS tasks (little) and desktop CPUs for games (big).

Side by side development: I assume test chips on a few wafers at a time have been made every 2-3 months or so for more than a year. These wafers probably contain multiple designs at the same time as long as they are compatible with the same process. For example Thebe-J chips would be included with Kaveri and other AMD projects like discrete GPUs that all use the same process.

I don't know how accurate Seronx might be but his speculation does allow for a wider view on next generation designs.

That sounds good, but I somehow doubt that we will see 4 Jaguar cores just for the OS, x high-performance cores, and a dedicated GPU (Sea Islands according to the posting) for games. I could imagine an ARM chip for OS/standby reasons, but that constellation's die is going to be huge if true. After all, you are a "fan" of stacking, so we have 4 Jaguar cores, the rumoured 3 high-end cores, a Sea Islands GPU, DDR4 and probably eDRAM to stack on a die :)
 
That sounds good, but I somehow doubt that we will see 4 Jaguar cores just for the OS, x high-performance cores, and a dedicated GPU (Sea Islands according to the posting) for games. I could imagine an ARM chip for OS/standby reasons, but that constellation's die is going to be huge if true. After all, you are a "fan" of stacking, so we have 4 Jaguar cores, the rumoured 3 high-end cores, a Sea Islands GPU, DDR4 and probably eDRAM to stack on a die :)
Confusion: it's 1 Jaguar CPU package, which has 4 Jaguar CPUs, plus three higher-performance x86 CPUs, probably similar to Kaveri CPUs. The total is still 4, which fits on a typical Xbar switch with 4 slots to plug CPUs into. Maybe no eDRAM, if the 2.5D-attached stacked DDR4 is wide enough and fast enough.
 
Confusion: it's 1 Jaguar CPU package, which has 4 Jaguar CPUs, plus three higher-performance x86 CPUs, probably similar to Kaveri CPUs. The total is still 4, which fits on a typical Xbar switch with 4 slots to plug CPUs into. Maybe no eDRAM, if the 2.5D-attached stacked DDR4 is wide enough and fast enough.

The above is speculation and I can see where he is getting this: Thebe-Jaguar = big.little

I don't think he has any information other than speculation based on the hyphen between Thebe and Jaguar. It disagrees with Sweetvar26, who said 2 Jaguar packages with cache, which leaves room for 2 more CPUs; I earlier speculated those might be 1PPU+3SPU packages, which, if they are more modern versions, might be better at game code than Kaveri CPUs. Kaveri CPUs & Jaguar CPUs is the other possibility, a mix of low-power and high-power x86 CPUs for different jobs.

Yes, I am confused :)

So it will be a mixed Kabini/Kaveri APU with 1 Jaguar core (OS, etc.) and 3 "Steamroller" (or more/less powerful) cores (games, video rendering, etc.) and a Sea Islands GPU, according to that rumour?
 

Vol5

Member
No grandfathering.

Either it's dead or it's going through a major redesign refresh. No other choices, right? One possibility is BC in the PS4, if it's cheap enough to fill all price points.

With the amount of effort and finance placed into Cell? It has to be part of the PS4 if only for BC alone.

The PS2 had the PSone MIPS core.
The PS3 (originally) had the GS+EE in a single chip.
The PS4 will have Cell or a derivative.

I think they will go with BC & non-BC SKUs.
 
I also think he doesn't understand that if they put an 8770 with the APU, it will give better performance than an 8850.

No way. The main GPU will be used for pixel/vertex processing, while the ALUs inside the APU will take over Cell's labours (physics, and post-processing in deferred engines). Having a polygon and pixel crusher like a 7870 or 7850 is not the same as having a 7750 or 7770 equivalent. To be clear, the APU is an attempt to make something similar to Cell but programming- and port-friendly...

Another thing: just like an ancient GeForce 7800 programmed to the metal can give you games like Uncharted 3, an HD 8770 programmed to the metal can nowadays give you more performance than a GeForce GTX 680. That being said, this console must last many years and you are already gimping it by putting a laptop GPU inside. Both MS and Sony will fight mainly for the so-called "hardcore" market. If they fail on specs, many people will go definitively down the PC route.
 

Jinko

Member
With the amount of effort and finance placed into Cell? It has to be part of the PS4 if only for BC alone.

I don't see why. They removed the PS2 components from the PS3, so why would they initially put in expensive components only to make the PS4 more expensive?

Backward compatibility is overrated; this discussion comes up at the start of every new generation.

Add to that the fact that Cell is only part of the issue: switching from Nvidia to AMD will create a similar situation to what Xbox > 360 had.

Both MS and Sony will fight mainly for the so-called "hardcore" market.

I wouldn't like to bet on that.
 

i-Lo

Member
I also think he doesn't understand that if they put an 8770 with the APU, it will give better performance than an 8850.

Outright performance or performance per watt?

If the latter, then that makes sense; otherwise, no. They could go for an underclocked 8850, unless the performance of an 8770 is targeted to be very similar to a 7850.

In the end, it's all rumour and speculation atm.
 
Outright performance or performance per watt?

If the latter, then that makes sense; otherwise, no. They could go for an underclocked 8850, unless the performance of an 8770 is targeted to be very similar to a 7850.

In the end, it's all rumour and speculation atm.

I was talking about outright performance. For performance per watt you can't match a Cell plus a GPU.
 

i-Lo

Member
I was talking about outright performance. For performance per watt you can't match a Cell plus a GPU.

I was asking thuway about his claim. I agree with you that an 8850 (underclocked) makes more sense in the long run, especially given how they are touting their aim of 1080p/60fps. Of course, if the current dev kits have, say, a 7850, and the 8770 Sea Islands part has very similar performance with much better power efficiency, then it makes more sense.
 

Grim1ock

Banned
A lot of assumptions and a lot of guessing. It will be quite amusing if Sony decides to go with the same design approach as the PS3:

A more powerful, faster version of the Cell.

A more powerful, faster version of the RSX.

And a beefed-up XDR.
 

Jinko

Member
A lot of assumptions and a lot of guessing. It will be quite amusing if Sony decides to go with the same design approach as the PS3:

A more powerful, faster version of the Cell.

A more powerful, faster version of the RSX.

And a beefed-up XDR.

Amusing for whom? :S

Can't see anyone with a smile on their face if that happens.
 