
IGN rumour: PS4 to have '2 GPUs' - one APU based + one discrete

androvsky

Member
it is if it doesn't have the same PPU & GPU
Sigh. Yes, those other things that I wasn't talking about would still have to be emulated. SPU code isn't mixed with anything else, so it should be pretty easy to run native SPU code on the real thing, separate from everything else.
 

onQ123

Member
Sigh. Yes, those other things that I wasn't talking about would still have to be emulated. SPU code isn't mixed with anything else, so it should be pretty easy to run native SPU code on the real thing, separate from everything else.

Yes, because this whole talk was about what you were talking about & had nothing to do with me saying


"I have faith that the PS4 will be able to Emulate The Cell with the GPU in the APU with a few added parts from SONY to play PS3 games."
 
Consoles have changed so much this gen that I wouldn't mind still having one around in my entertainment center. I mean, now consoles do so much (Netflix, ESPN, etc.). Why get rid of it? I could understand wanting BC in previous generations, because if you weren't using it, it was pretty much wasted space.

And honestly, with so much downloaded content, I really don't want to have to go through all that downloading on my new console anyway. If Sony can offer more for less by leaving out BC next gen, I'm all for it.
 

onQ123

Member
Consoles have changed so much this gen that I wouldn't mind still having one around in my entertainment center. I mean, now consoles do so much (Netflix, ESPN, etc.). Why get rid of it? I could understand wanting BC in previous generations, because if you weren't using it, it was pretty much wasted space.

And honestly, with so much downloaded content, I really don't want to have to go through all that downloading on my new console anyway. If Sony can offer more for less by leaving out BC next gen, I'm all for it.

To the people who don't have PS3s, or whose PS3 is broken, BC is more for less, because it gives them the PS1, PS2, PS3 & PS4 in one package.
 

Truespeed

Member
I don't think there's going to be any emulation for the PS4. That's the problem when you continually change architectures - you keep backing yourself into a corner when it comes to backwards compatibility. It just becomes too expensive to retrofit compatibility hardware into new consoles, or too time-consuming to develop reliable and accurate emulation software. The news that the PS4 might be x86-based gives me hope that the PS5 may not need to worry about backwards compatibility whatsoever.
 
To the people who don't have PS3s, or whose PS3 is broken, BC is more for less, because it gives them the PS1, PS2, PS3 & PS4 in one package.

Not if it makes the console $600 again. And you only get PS2 if you have a 60GB console anyway. Personally, it's a non-issue for me. I don't play any PS2/PS1 games on my 60GB now, and I can count on one hand the ones that I have played. Sony made plenty of mistakes this gen, and it may not be until the generation after next gen that they can emulate PS3 without any extra cost to consumers. And I am perfectly fine with that.

My prediction is that PS4 will play PS1, PS2, and some downloadable PS3 games, but no PS3 Blu-ray games.

Or they can offer a very limited PS4 that is BC with everything, but I see that option leading to consumer confusion and causing more negative fervor from the gaming community than there would have been from simply not releasing such a version.
 
You say that, yet until July 2007 the PSU of the 360 was rated at 203W, and the power it drew on average while playing games was around 185W. That works out to around 90% of the limit.

I think Sony can design a next console with a max power draw of somewhere between 225 to 250W.

I am just wondering: how close to THIS can we get with the next-gen hardware (minus the IQ, i.e. anti-aliasing etc.)?

As my video shows, ~220W was not uncommon for the launch PS3 models.


*shrug* Maybe they aren't as concerned with efficiency. I'm just talking about PC PSUs; you want one rated for double the total amount your system will use. Maybe console power supplies are different, maybe they're labeled differently. *shrug* Someone with more knowledge will have to answer that.
 
I don't think there's going to be any emulation for the PS4. That's the problem when you continually change architectures - you keep backing yourself into a corner when it comes to backwards compatibility. It just becomes too expensive to retrofit compatibility hardware into new consoles, or too time-consuming to develop reliable and accurate emulation software. The news that the PS4 might be x86-based gives me hope that the PS5 may not need to worry about backwards compatibility whatsoever.

IF we go with OpenCL, which is apparently highly portable high-level code, then it shouldn't matter much. One minute we can use the Cell, the next x86, the next something different, as long as they work with OpenCL. The problem with next gen in particular is emulating the high-speed SPUs on hardware that is not quite as fast. From there on, as long as OpenCL is a viable solution for coding, we SHOULD be good. Of course, it's ideal not to change shit over and over though...
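To make that concrete, here's a minimal sketch of what that portability buys you, written against the standard OpenCL C host API: one kernel source string, compiled at runtime for whatever device happens to be present. The vector-add kernel and the single-device setup are just my illustration (nothing from the leaks), and error checking is omitted for brevity.

```c
/* Minimal sketch: the same kernel source builds for whatever OpenCL
 * device exists - a CPU, a GPU, or (hypothetically) a Cell-style
 * accelerator - without the host code changing. Build with -lOpenCL. */
#include <stdio.h>
#include <CL/cl.h>

static const char *src =
    "__kernel void vadd(__global const float *a,\n"
    "                   __global const float *b,\n"
    "                   __global float *c) {\n"
    "    size_t i = get_global_id(0);\n"
    "    c[i] = a[i] + b[i];\n"
    "}\n";

int main(void)
{
    cl_platform_id plat;
    cl_device_id dev;

    clGetPlatformIDs(1, &plat, NULL);
    /* Ask for the platform's default device: the kernel source above is
     * identical whether this resolves to a CPU or a GPU. */
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_DEFAULT, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL); /* runtime compile */

    char name[128];
    clGetDeviceInfo(dev, CL_DEVICE_NAME, sizeof name, name, NULL);
    printf("same kernel source, built for: %s\n", name);

    clReleaseProgram(prog);
    clReleaseContext(ctx);
    return 0;
}
```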
 

Fafalada

Fafracer forever
Durante said:
There is no way to meaningfully parallelize the realtime emulation of a single core.
True for GP processing, but not necessarily so for the type of workloads SPUs typically perform. Streams can be parallelized on input, and task distribution will scale (to a point) with the number of units/cores available.

It obviously wouldn't satisfy general emulation requirements - but it only needs to work for existing real-world software. The question is whether it's enough to balance out edge cases, and whether there are special cases in real-world PS3 software where this would just fail altogether (e.g. an application that is bound by non-streamed single-SPU performance).

Of course this is all assuming we have no other resources that could reasonably assist emulation - any guesses on speeds for the actual CPU cores in PS4?
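As a toy illustration of the input-parallel case (plain C with pthreads, invented numbers, nothing like a real emulator): if an SPU job is a pure stream transform, the input can be carved into slices across however many host cores exist, so aggregate throughput matters more than single-core speed.

```c
/* Toy sketch only: a pure stream transform partitioned on input across
 * host threads. NUM_WORKERS and the transform are made up for the
 * example. Build with -lpthread. */
#include <pthread.h>
#include <stdio.h>

#define N 1024
#define NUM_WORKERS 4

static float in[N], out[N];

struct slice { int begin, end; };

static void *worker(void *arg)
{
    struct slice *s = arg;
    for (int i = s->begin; i < s->end; i++)
        out[i] = in[i] * 2.0f;   /* stand-in for the SPU kernel */
    return NULL;
}

int main(void)
{
    pthread_t tid[NUM_WORKERS];
    struct slice sl[NUM_WORKERS];

    for (int i = 0; i < N; i++) in[i] = (float)i;

    /* Partition the stream on input: each worker gets a contiguous slice. */
    for (int w = 0; w < NUM_WORKERS; w++) {
        sl[w].begin = w * (N / NUM_WORKERS);
        sl[w].end   = (w + 1) * (N / NUM_WORKERS);
        pthread_create(&tid[w], NULL, worker, &sl[w]);
    }
    for (int w = 0; w < NUM_WORKERS; w++)
        pthread_join(tid[w], NULL);

    printf("out[N-1] = %f\n", out[N - 1]);
    return 0;
}
```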
 
True for GP processing, but not necessarily so for the type of workloads SPUs typically perform. Streams can be parallelized on input, and task distribution will scale (to a point) with the number of units/cores available.

It obviously wouldn't satisfy general emulation requirements - but it only needs to work for existing real-world software. The question is whether it's enough to balance out edge cases, and whether there are special cases in real-world PS3 software where this would just fail altogether (e.g. an application that is bound by non-streamed single-SPU performance).

Of course this is all assuming we have no other resources that could reasonably assist emulation - any guesses on speeds for the actual CPU cores in PS4?
My guess:

1) The SOC will have an x86 CPU, an OpenCL-capable GPU and an FPGA.
2) Each has features that define its best use case; that's the reason for heterogeneous computing.
3) OpenCL provides an easy-to-use language for all of them.
4) The SOC will have temp sensors to drop clock speed on overheat, because it's expected that this might happen, so running the x86 at 100% duty cycle is probably not possible (no heavy SPU emulation).
5) The best use case for the x86 CPU is easy-to-set-up, few-clock-tick operations and scripts that need branch prediction; some would benefit from higher clock speeds, probably higher than the PS3's clock speed.

Sony could add 4-6 SPUs to the SOC; their use would fit between CPU and GPU, as they are easier/faster to set up like a CPU but able to run faster without generating as much heat as an x86 CPU. An x86 with branch prediction will run hotter than an SPU. Using only 2 x86 CPUs and adding 4 SPUs would make the SOC more efficient, but it would increase complexity in coding, which OpenCL takes care of. The idea in heterogeneous computing is that CPUs of different designs can be combined to be more efficient. We already know (from a Sony CTO interview) about CPU, GPU, DSP and FPGA all being easily usable with OpenCL code; there is no reason not to include SPUs as well if they have a use that justifies their inclusion.
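As a rough sketch of why OpenCL "takes care of" that coding complexity: the host side just enumerates whatever heterogeneous devices the platform exposes and talks to all of them through one API - a DSP, FPGA or hypothetical SPU block would simply show up as an accelerator-type device. This is the standard OpenCL host API, not any confirmed PS4 detail; error checking omitted.

```c
/* Sketch: list every device the platform exposes, CPU/GPU/accelerator
 * alike, through the one OpenCL API. Build with -lOpenCL. */
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id plat;
    cl_device_id devs[8];
    cl_uint n = 0;

    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_ALL, 8, devs, &n);

    for (cl_uint i = 0; i < n; i++) {
        char name[128];
        cl_device_type type;
        clGetDeviceInfo(devs[i], CL_DEVICE_NAME, sizeof name, name, NULL);
        clGetDeviceInfo(devs[i], CL_DEVICE_TYPE, sizeof type, &type, NULL);
        /* A DSP, FPGA or SPU-style unit would report as an accelerator. */
        printf("device %u: %s (%s)\n", i, name,
               type & CL_DEVICE_TYPE_GPU ? "GPU" :
               type & CL_DEVICE_TYPE_CPU ? "CPU" : "accelerator");
    }
    return 0;
}
```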

Even with SPUs in the PS4 it's going to be very difficult to emulate a PS3, as unpredictable use of SPUs might still cause some games to break simulation/emulation... is it worth it to provide PS3 emulation? Xbox Durango might have backwards compatibility, and if it does, Sony will feel pressure to provide the same, or they must have developers recompile a large number of PS3 games for PS4, or have a large number of very attractive PS4 games available at launch. PS1, PS2, PSP and PS Suite games using emulators/engines already developed can have those emulators/engines ported to PS4 and then run on PS4.

As Durante alluded, both AMD with their HSA and the PS4 SOC are going to be cutting-edge heterogeneous computing, with new techniques needed and OS tuning required, and all that and more would be needed to emulate a PS3. My only hope is the rumor of the Barcelona Supercomputing Center working on a PS4 Cell 2 in October 2011, which I think is after the rumored move by Sony from 24 SPUs to AMD (a rumor I didn't believe at the time, because an x86 is not as good a choice for a game machine all things being equal - but that was before I knew about OpenCL and GPUs). From Google searches, BSC is contracted by just about everyone (Intel, IBM, Nvidia and more) to do research on next-generation computing. BSC developed the Cell simulator (which runs on x86) for IBM. In 2005 the Cell simulator ran at 1/27th of the Cell's true speed on x86 processor(s) without modern 400+ core GPGPUs. I believe early PS3 developers got a PC with PS3 simulation that ran much slower than a PS3.

Durante is 100% accurate, but it's like a trick puzzle. 100% real-time Cell emulation is not possible using a slower, many-core GPU parallel processor, yet that slower-clocked GPU can perform some of the same functions an SPU is being used for more than 10 times faster. He needed to explain in more detail (which he and others have done) to the less informed among us (that includes me) why that is (too much low-level SPU code). He has outlined the issues and stated it might be possible with a combination of emulation and simulation, but it would require something like BSC to research new coding techniques to make it possible. So we are left with: it's not happening unless Sony feels it's worth the investment, and that would require someone like BSC to determine if it's even possible. So even if the rumor is true that Sony contracted BSC, they might have come back with "not possible", or "not possible unless 2-4 SPUs were included" - and again, is it worth it? Could SPUs add to the performance as well as help with emulation?

Back to my guess on two models for game developers:

1) Unreal Engine 3: a traditional extension of last generation, with OpenGL and a more GPU-bound model.
2) Unreal Engine 4: limited ray tracing (CPU-bound) and more CPU use - "tons of CPUs".

Case 1 can develop games or engines now without knowing about SPUs in the PS4; case 2 would have to know. Most game engines early on will be case 1, with games developed for multiple platforms and resources scaled to the GPUs. Case 2 can't be scaled down to platforms with less CPU power or without excess unused GPU OpenCL ability, so it would be limited to higher-end PCs, the PS4 and possibly Xbox Durango.

So it's possible that SPUs are to be included and backwards compatibility is planned for the PS4, and most PS4 developers wouldn't know about it except for someone like Epic developing Unreal Engine 4.

A very fast common memory pool and controller is a MUST, as is the use of OpenCL. They make it possible to attach multiple different CPUs to the memory bus. The MMU code is going to be fun... Developers might notice a lot of reserved addresses that aren't explained.
 
Job listing for voice recognition from Sony.

http://www.thesixthaxis.com/2012/04/16/rumour-ps4-getting-speech-recognition/

They were supposed to do that for the PS Eye I believe.

EDIT: DAMNIT lost a lot of my post... =/

Ok... let me type this again.

What if the current AMD setup is temporary, a stand-in for a proposed 24 SPU Cell? This isn't an outlandish thing we are talking about. With OpenCL we can get efficient forward compatibility with the Cell on other types of hardware while keeping full BC. As Jeff pointed out, BSC was working on a Cell simulator in 2005 before they went with the Cell; why couldn't they be doing the same now, before full production of the new Cell?

Sure, some people will say "why not just bypass a powerful CPU that supplements the GPU and go with just a powerful GPU?" Well, what if they can get more for less with the Cell? GPUs these days consume a lot of power, while the Cell can take on some of the GPU's tasks at a much lower power consumption and even get better results. Of course, there IS the memory issue that the Cell poses... the requirement of fast memory access. Well, while 3D-stacked CPU/GPU tech is a little farther out than expected, isn't 3D-stacked memory just a stone's throw away?... early 2013 at the absolute latest? That would remove that issue completely. Also, as mentioned above, an extremely fast memory bus & controller is required for OpenCL. Besides, we've already established that the problem with scaling down the Cell wasn't the Cell design itself, but scaling down the memory controller to something manageable.

Overall, going with the setup above, they can go with lower power consumption, meaning lower PSU requirements and much less cooling, making for an overall cheaper and more powerful product that has no compatibility issues.

Without being an absolute dick, I would like people to poke holes in my argument as to why that won't happen.
 

Durante

Member
True for GP processing, but not necessarily so for the type of workloads SPUs typically perform. Streams can be parallelized on input, and task distribution will scale (to a point) with the number of units/cores available.
But that's the same thing I already covered. To make use of that property, the emulator would have to have a high-level semantic understanding of the code that's being run, and translate that to a different architecture in realtime. I don't see that happening. I'm not familiar with how PS3 games use Cell, but remembering some slides from e.g. the Killzone or Uncharted devs it didn't look to me like it was just a few pre-canned patterns that you could conceivably provide high-level emulation for. And when you start special casing each high-end game it's no longer really emulation in my opinion.

IF we go with OpenCL, which is apparently highly portable high-level code, then it shouldn't matter much. One minute we can use the Cell, the next x86, the next something different, as long as they work with OpenCL.
OpenCL is (somewhat) portable in terms of semantics. Optimized OpenCL code is not at all portable in terms of performance.

The problem we are having here is not one of just getting code to run on a different architecture. It's about getting time-critical code to run on a different architecture.
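A small sketch of that distinction (standard OpenCL host API, dummy kernel, error checking omitted): the same kernel builds on any device, but the work-group numbers that make it fast have to be queried and re-tuned per device - which is exactly what you can't afford to get wrong when the code has a frame deadline.

```c
/* Sketch: the kernel is semantically valid anywhere, but the launch
 * geometry that makes it fast is device-specific. Build with -lOpenCL. */
#include <stdio.h>
#include <CL/cl.h>

static const char *src =
    "__kernel void dummy(__global float *x) {\n"
    "    x[get_global_id(0)] += 1.0f;\n"
    "}\n";

int main(void)
{
    cl_platform_id plat;
    cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_DEFAULT, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "dummy", NULL);

    /* These numbers differ wildly between, say, an x86 CPU driver and a
     * GPU: a work-group size hand-tuned for one is usually wrong for the
     * other, which is why optimized OpenCL is not performance-portable. */
    size_t wg = 0, mult = 0;
    clGetKernelWorkGroupInfo(k, dev, CL_KERNEL_WORK_GROUP_SIZE,
                             sizeof wg, &wg, NULL);
    clGetKernelWorkGroupInfo(k, dev, CL_KERNEL_PREFERRED_WORK_GROUP_SIZE_MULTIPLE,
                             sizeof mult, &mult, NULL);
    printf("max work-group %zu, preferred multiple %zu\n", wg, mult);

    clReleaseKernel(k);
    clReleaseProgram(prog);
    clReleaseContext(ctx);
    return 0;
}
```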
 
OpenCL is (somewhat) portable in terms of semantics. Optimized OpenCL code is not at all portable in terms of performance.

The problem we are having here is not one of just getting code to run on a different architecture. It's about getting time-critical code to run on a different architecture.

Though it still is more portable than any other option out there... correct? As it stands right now, everything has to be done damn near from scratch, whereas with OpenCL it's at least more manageable. You can plan ahead for something like that, can't you?

EDIT: By the way, Durante, I like your posts. You know a lot, yet you're not a dick about it. ;]
 

StevieP

Banned
What if the current AMD setup is temporary, a stand-in for a proposed 24 SPU Cell?

Without being an absolute dick, I would like people to poke holes in my argument as to why that won't happen.

Without sounding "dickish" as you say, it's pretty much a lock that the final PS4 will be entirely AMD-based. I'm not sure how Sony will approach the conundrum of BC (old patents seem to have an external PCIe dongle of some kind like the one in their Vaios) but brain_stew seems to think they'll just cut and move on.
 
Without sounding "dickish" as you say, it's pretty much a lock that the final PS4 will be entirely AMD-based. I'm not sure how Sony will approach the conundrum of BC (old patents seem to have an external PCIe dongle of some kind like the one in their Vaios) but brain_stew seems to think they'll just cut and move on.

I know some of you guys have insider info, and I know you guys can't say "well they ARE doing this..."...but are they moving on? lol.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
But that's the same thing I already covered. To make use of that property, the emulator would have to have a high-level semantic understanding of the code that's being run, and translate that to a different architecture in realtime. I don't see that happening. I'm not familiar with how PS3 games use Cell, but remembering some slides from e.g. the Killzone or Uncharted devs it didn't look to me like it was just a few pre-canned patterns that you could conceivably provide high-level emulation for. And when you start special casing each high-end game it's no longer really emulation in my opinion.

But it is. You can have fully generic emulation, game-specific emulation, or a mix of the two. The Xbox emulation on the 360 was like this: they had to come up with game-specific profiles to make the emulation work. This meant games trickling out one by one and many never working. So it may be that Sony gets many SPU-lite games working with little effort (think PSN games and MP games) and then does hand-tuning for big first-party games.
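In code terms, that profile approach looks something like the toy sketch below: a fully generic default configuration plus a per-title override table for the games that need hand-tuning. The title IDs and tuning knobs are invented for illustration - this is not how any real console emulator is known to be structured.

```c
/* Toy sketch of per-title emulation profiles: generic defaults plus
 * hand-tuned overrides. All IDs and flags are invented. */
#include <stdio.h>
#include <string.h>

struct profile {
    const char *title_id;   /* invented IDs, not real product codes */
    int spu_threads;        /* host threads to give SPU jobs */
    int needs_timing_hack;  /* per-game workaround, hand-tuned */
};

static const struct profile overrides[] = {
    { "DEMO-00001", 6, 1 }, /* big first-party title, hand-tuned */
    { "DEMO-00002", 2, 0 }, /* SPU-lite PSN game */
};

static struct profile lookup(const char *title_id)
{
    for (size_t i = 0; i < sizeof overrides / sizeof overrides[0]; i++)
        if (strcmp(overrides[i].title_id, title_id) == 0)
            return overrides[i];
    /* Fall back to the fully generic configuration. */
    return (struct profile){ title_id, 4, 0 };
}

int main(void)
{
    struct profile p = lookup("DEMO-00001");
    printf("%s: %d SPU threads, timing hack %s\n",
           p.title_id, p.spu_threads, p.needs_timing_hack ? "on" : "off");
    return 0;
}
```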
 
Without sounding "dickish" as you say, it's pretty much a lock that the final PS4 will be entirely AMD-based. I'm not sure how Sony will approach the conundrum of BC (old patents seem to have an external PCIe dongle of some kind like the one in their Vaios) but brain_stew seems to think they'll just cut and move on.
That's possible if they can get a large portfolio of games ready for launch. My favorite game, Burnout, could be ported from the PC to PS4, and if sold at under $30 I'd buy it. But I would be seriously miffed if Burnout and other titles were not available for the PS4 because the developer went out of business.
 

THE:MILKMAN

Member

Shake my head at that...

There is also the possibility that the rumoured specs are based on the builds for the PS4 development kits which might have gone out. The final PS4 itself might contain more up-to-date iterations of that hardware further down the line. It should be possible to code with that current spec knowing that things are set to be x times faster by the time the final machines are released.


Why does OPM even entertain the idea that devkit chips = retail PS4? Don't they remember the PS3/360 devkits?
 

thuway

Member
With OPM even calling the PS4 underpowered, I have to admit that it sounds an awful lot like the leaked specs are nothing but bull.
 
I am just wondering: how close to THIS can we get with the next-gen hardware (minus the IQ, i.e. anti-aliasing etc.)?

Thanks for that link! Now that is what I hope next gen looks like! I know that's a lot to ask, but maybe once devs get more and more familiar with the hardware, something close to this will be possible later in the console's life cycle.
 

pestul

Member
Considering what has been achieved in The Witcher 2 on 360, I really don't think people should worry about a PS4 with these specs.
 
StevieP said:
Without sounding "dickish" as you say, it's pretty much a lock that the final PS4 will be entirely AMD-based. I'm not sure how Sony will approach the conundrum of BC (old patents seem to have an external PCIe dongle of some kind like the one in their Vaios) but brain_stew seems to think they'll just cut and move on.
Everyone knows about this, as it was posted on NeoGAF back in Feb 2012. I dismissed it when I first read it, since the patent date is 2010 and the decision to go with AMD didn't occur (I think) until Oct 2011, so it couldn't apply to the PS4. But on second look I noticed the publish date and re-read it.

http://appft.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PG01&p=1&u=%2Fnetahtml%2FPTO%2Fsrchnum.html&r=1&f=G&l=50&s1=%2220120040762%22.PGNR.&OS=DN/20120040762&RS=DN/20120040762 said:
A compatibility adapter configured for connection to an external connection terminal of a new-generation game device is provided. An old-generation processor unit is a processor unit compatible with the processing function of an old-generation game device. When it is determined that the type of a recording medium storing application software is for the old-generation game device, a hub receives data, which has been input to the new-generation game device, from the new-generation game device via the external connection interface. Data extracted from a received packet is supplied to the old-generation processor unit and processed accordingly. The hub supplies the processed data to the new-generation game device via the external connection interface.

Inventors: Shinjo; Sadaaki; (Kanagawa, JP) ; Sugawara; Akihiko; (Kanagawa, JP) ; Hakamatani; Tadayasu; (Tokyo, JP)
Assignee: Sony Computer Entertainment INC.
TOKYO
JP

Serial No.: 201563
Series Code: 13
Filed: February 12, 2010
PCT Filed: February 12, 2010
PCT NO: PCT/JP2010/000850
371 Date: October 10, 2011
1) There is no port on the PS3 fast enough to support an external PS2 game device, unless it was a complete PS2 with only video and control being sent through the external adaptor. Too expensive.

2) The publish date fits with the decision to go with AMD rather than the rumored 24 SPU processor - Oct 2011. That's too late to be economically practical for a PS2 emulator.

Should we take the publishing of the patent as a clue to how Sony might provide BC for the PS4? I think the publish date cinches it for me. With the correct adaptor, only a portion of a PS3 might need to be supplied, and we might have inexpensive BC. Edit: even with a partial PS3 chipset, how will it be cooled? At 28nm maybe it won't need a fan? Or placed next to an air duct in the PS4, designed to force air across anything plugged into the adaptor?

At the same time the above confirms BC is important to Sony and the AMD rumors are accurate.
 
More information on TSVs and Global Foundries:

GlobalFoundries is installing equipment to make through-silicon vias in its Fab 8 in New York. If all goes well, the company hopes to take production orders in the second half of 2013 for 3-D chip stacks using 20 and 28 nm process technology.

The systems should be in place and qualified by the end of July, with about half of them installed today, McCann said. The company aims to run its first 20 nm test wafers with TSVs in October and have data on packaged chips from its partners by the end of the year.

GlobalFoundries’ schedule calls for having reliability data in hand early next year. The data will be used to update the company’s process design kits so its customers can start their qualification tests in the first half of the year.

If all goes well, first commercial product runs of 20 and 28 nm wafers with TSVs can start in the second half of 2013 and ramp into full production in 2014, McCann said.

So far, three types of chip designs want to use TSVs. High-end mobile application processors will use TSVs to link to memories, high-end graphics and CPUs will use it to link to DRAMs and memory stacks that may or may not include any logic also will use TSVs, McCann said. All three classes could be in production in 2014, he said.

"high-end graphics and CPUs will use it to link to DRAMs and memory stacks" "20 and 28 nm wafers with TSVs can start in the second half of 2013 and ramp into full production in 2014". Description and dates fit with what we expected.

The Common Platform group that includes GlobalFoundries, IBM and Samsung has so far not collaborated on defining a 3-D chip stack process. The group’s focus to date has been on process technologies, not packaging issues.

GlobalFoundries recently announced it has shipped a quarter million wafers using high-K metal gate technology used for its 32 and 28 nm processes. Its 20 nm process also will be based on HKMG technology. GlobalFoundries said it will not use 3-D transistors, also known as FinFETs, until its 14nm generation.
 

KageMaru

Member
More information on TSVs and Global Foundries:



"high-end graphics and CPUs will use it to link to DRAMs and memory stacks" "20 and 28 nm wafers with TSVs can start in the second half of 2013 and ramp into full production in 2014". Description and dates fit with what we expected.

It's been mentioned before, but do you have any idea how expensive this would be in late 2013? I can't imagine yields turning out very good, which would eliminate any cost saving benefit to begin with. Not to mention with the comment of full production in 2014, it's also a question whether or not they could produce enough hardware for a console launch in 2013 to begin with.

Not meaning to sound "dickish", but I'm pretty sure all this has been explained or mentioned before to you.
 
It's been mentioned before, but do you have any idea how expensive this would be in late 2013? I can't imagine yields turning out very good, which would eliminate any cost saving benefit to begin with. Not to mention with the comment of full production in 2014, it's also a question whether or not they could produce enough hardware for a console launch in 2013 to begin with.

Not meaning to sound "dickish", but I'm pretty sure all this has been explained or mentioned before to you.
Yes it has... but unless someone has insider information to rule this out, your arguments are addressed in the cite.

1) The equipment is in place now

2) "The systems should be in place and qualified by the end of July, with about half of them installed today, McCann said. The company aims to run its first 20 nm test wafers with TSVs in October and have data on packaged chips from its partners by the end of the year 2012.

GlobalFoundries’ schedule calls for having reliability data in hand early next year 2013. The data will be used to update the company’s process design kits so its customers can start their qualification tests in the first half of the year.

If all goes well, first commercial product runs of 20 and 28 nm wafers with TSVs can start in the second half of 2013 and ramp into full production in 2014, McCann said."

3) 3D stacking will be LESS expensive, not more expensive (we are in phase three of 3D wafer stacking, with the major benefit being cost). Even if first runs are more expensive, over the next year yields will go up and costs will come down. Why do you think 3D stacking was invented? It's cheaper, because the individual wafers can be quality-checked before assembly.
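Here is the back-of-the-envelope version of that argument (classic Poisson yield model, with invented defect density and die areas - purely illustrative):

```c
/* Back-of-the-envelope known-good-die argument. The yield model is the
 * classic Poisson approximation Y = exp(-D0 * A); all numbers are
 * assumed for illustration. Build with: cc yield.c -lm */
#include <stdio.h>
#include <math.h>

int main(void)
{
    double d0 = 0.5;                  /* defects per cm^2 (assumed) */
    double a_cpu = 1.0, a_gpu = 2.0;  /* die areas in cm^2 (assumed) */

    double y_mono = exp(-d0 * (a_cpu + a_gpu)); /* one big monolithic die */
    double y_cpu  = exp(-d0 * a_cpu);
    double y_gpu  = exp(-d0 * a_gpu);

    /* Silicon consumed per *good* unit: a monolithic die is scrapped
     * whole if either half is bad, while pre-tested dies only waste
     * their own area (assume the bonding step itself yields ~100%). */
    double cost_mono  = (a_cpu + a_gpu) / y_mono;
    double cost_stack = a_cpu / y_cpu + a_gpu / y_gpu;

    printf("monolithic: %.1f%% yield, %.2f cm^2 per good unit\n",
           100.0 * y_mono, cost_mono);
    printf("pre-tested stack: %.2f cm^2 per good unit\n", cost_stack);
    return 0;
}
```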

Total up the costs of two tape-outs/designs against the single one your view would require, along with a larger power supply, a CPU and GPU clocked lower, a larger cooling system, etc.

The dates for the TSV process fall into line with the next-generation game console rumors. Doesn't that seem convenient? Sony and Microsoft are NOT announcing a new console this year at E3... why? They will wait until they get data from GlobalFoundries on the reliability, cost, etc. The earliest they could have an idea of a launch date would be Oct of this year. It could be 2014, not late 2013, if all does not go well.

Oh as to demand exceeding capacity:
If there is enough demand for TSVs, GlobalFoundries also will bring up the technology in its Fab 1 in Dresden. A fab in Singapore will be used for additional capacity for 2.5-D chips using silicon interposers if demand for the process exceeds what the New York fab can handle. GlobalFoundries is also exploring use of TSVs for MEMS and other products.

HSA efficiencies go through the roof if you include stacked memory in the same chip with the CPU and GPU, as you can eliminate the L3 cache and possibly the L2 cache, reducing data movement in memory.
 
I was under the assumption that 3D stacking was cheaper once all that R&D and initial setup is said and done... As Jeff already mentioned, each wafer can be individually tested.
 

Proelite

Member
3D stacking is great for mobile devices; I am not sure if it'll have any use for consoles, where lateral space isn't at a premium and cooling can be problematic. The silicon-interposer and TSV tech for next-gen 3D stacking would be great to have for ultra-fast bandwidth, and you can potentially get around the 2GB GDDR5 ceiling.

Tri-gate tech, if they can make it into next-generation consoles, will have a bigger impact, because it can lower power consumption by 50%.

The ideal next generation console would be a CPU + GPU with stacked memory on SI with tri-gate tech.
 
3D stacking is great for mobile devices; I am not sure if it'll have any use for consoles, where lateral space isn't at a premium and cooling can be problematic. The silicon-interposer and TSV tech for next-gen 3D stacking would be great to have for ultra-fast bandwidth, and you can potentially get around the 2GB GDDR5 ceiling.

Tri-gate tech, if they can make it into next-generation consoles, will have a bigger impact, because it can lower power consumption by 50%.

The ideal next generation console would be a CPU + GPU with stacked memory on SI with tri-gate tech.
Silicon interposer TSV is the backup, as you get the benefits of 3D stacking but not some of the cost reductions that 3D wafer stacking can provide.

This has been a 10-year process in three phases, and we are now in the third phase, according to the PDF in my previous post, with cost savings seen in manufacturing using pre-tested wafer stacking. AMD can and already has made APUs with CPU and GPU on the same silicon, and they could take the same design and stack, using an interposer, a 3D-stacked memory on top of it, which would be what you are saying. But for increased yield efficiency and reduced price, they might make the CPU and GPU on different wafers, pre-test them and then 3D wafer stack. This allows them a higher yield and lower cost because they aren't throwing away chips that might have a good CPU but a bad GPU, or the reverse.

We don't know which process they will use, but it appears that 3D wafer stacking will be ready before the rumored launch date. In order to speculate on this we need to understand the process, what can be done, and why IBM, GlobalFoundries and Samsung have joined together to support 3D wafer stacking with interconnect standards.

A separate AMD CPU wafer and GPU wafer could be used by Sony to build a custom PS4 stack, or by AMD to build custom stacks for their different GPU and APU lines. Instead of needing 10 different designs, they have one that can stack as many GPU wafer elements as needed. Flexibility and economy of scale at the same time! Because there are interconnect standards, memory wafers from a third company can be included in the stack.

If AMD with GlobalFoundries planned for a transition to 3D stacking, which will better support their HSA efficiencies and seems likely, then Sony is only giving AMD a specification which GlobalFoundries and AMD are filling in the most economical manner. It seems likely that Microsoft will do the same.

AMD with GlobalFoundries, and IBM using AMD GPUs, are the only companies with CPUs & GPUs that can be combined for HSA efficiencies, that can support "Next Generation" game console performance within a power envelope that works for a game console, and that have 3D wafer stacking which, at game console volumes, will allow them to make the chipsets at a price game consoles need. It's also interesting that AMD, GlobalFoundries and IBM have been cooperating since 2008. Even without leaks this is obvious, and it probably resulted in the speculation we heard earlier (this is all hindsight for me). Given this, the possibilities were exactly what we heard: for PS4, 1) Cell & AMD GPU or 2) AMD CPU-GPU; and for Durango, 1) IBM PPC & AMD GPU or 2) AMD CPU-GPU. All would have an AMD GPU, which is what we first heard.
 
jeff_rigby said:
We don't know which process they will use, but it appears that 3D wafer stacking will be ready before the rumored launch date. In order to speculate on this we need to understand the process, what can be done, and why IBM, GlobalFoundries and Samsung have joined together to support 3D wafer stacking with interconnect standards.

A separate AMD CPU wafer and GPU wafer could be used by Sony to build a custom PS4 stack, or by AMD to build custom stacks for their different GPU and APU lines. Instead of needing 10 different designs, they have one that can stack as many GPU wafer elements as needed. Flexibility and economy of scale at the same time! Because there are interconnect standards, memory wafers from a third company can be included in the stack.

If AMD with GlobalFoundries planned for a transition to 3D stacking, which will better support their HSA efficiencies and seems likely, then Sony is only giving AMD a specification which GlobalFoundries and AMD are filling in the most economical manner. It seems likely that Microsoft will do the same.

AMD with GlobalFoundries, and IBM using AMD GPUs, are the only companies with CPUs & GPUs that can be combined for HSA efficiencies, that can support "Next Generation" game console performance within a power envelope that works for a game console, and that have 3D wafer stacking which, at game console volumes, will allow them to make the chipsets at a price game consoles need. It's also interesting that AMD, GlobalFoundries and IBM have been cooperating since 2008. Even without leaks this is obvious, and it probably resulted in the speculation we heard earlier (this is all hindsight for me). Given this, the possibilities were exactly what we heard: for PS4, 1) Cell & AMD GPU or 2) AMD CPU-GPU; and for Durango, 1) IBM PPC & AMD GPU or 2) AMD CPU-GPU. All would have an AMD GPU, which is what we first heard.

Stacking? I don't really understand the tech or concept, but it sounds crazy so.. Shitjustgotreal.gif
 

TONX

Distinguished Air Superiority
There's no way the PS4 will be near $599. If only for Kaz Hirai. He was humiliated when he revealed the PS3 7 years ago at that price, and now that he is CEO, that's doubly not going to happen. People dreaming of BC with these specs might have to temper their expectations.
 

Melchiah

Member
There's no way the PS4 will be near $599. If only for Kaz Hirai. He was humiliated when he revealed the PS3 7 years ago at that price, and now that he is CEO, that's doubly not going to happen. People dreaming of BC with these specs might have to temper their expectations.

Unless there's a premium model that's backwards compatible, or an add-on with some PS3 parts. One can dream.
 
Unless there's a premium model that's backwards compatible, or an add-on with some PS3 parts. One can dream.
The patent I posted several messages ago seems to indicate a plug-in adaptor. For it to work, it needs to generate less heat than the current Cell. If an adaptor is coming for BC, then it would probably use a slimmer-than-Slim refreshed Cell/RSX and most likely at least 256MB of memory as fast as XDR. A new refreshed PS3 Slim is probably coming soon, and its chip, similar to the Xbox 360 S, will have both Cell and RSX in the same package.

The new PS3 Slim could reduce costs for everything but the drives, allowing for a $150 PS3 and a $99 PS4 BC adaptor (maybe less). If it's going to use any of the memory in the PS4, it would need shorter bus lines, which would require an expansion compartment inside the case. There is also the FCC to consider, and RF radiation shielding is needed, doubly so for an internal expansion compartment.

Sony could also allow a PS3 on the same network as a PS4 to serve the PS4 as the BC game machine... the network can handle a video feed, the controller and Blu-ray drive access.
 

KageMaru

Member
I was under the assumption that 3D stacking was cheaper once all that R&D and initial setup is said and done... As Jeff already mentioned, each wafer can be individually tested.

Doesn't matter how much a process theoretically saves on cost if the yields are poor.
 

3rdman

Member
Job listing for voice recognition from Sony.

http://www.thesixthaxis.com/2012/04/16/rumour-ps4-getting-speech-recognition/

They were supposed to do that for the PS Eye I believe.

EDIT: DAMNIT lost a lot of my post... =/

Ok... let me type this again.

What if the current AMD setup is temporary, a stand-in for a proposed 24 SPU Cell? This isn't an outlandish thing we are talking about. With OpenCL we can get efficient forward compatibility with the Cell on other types of hardware while keeping full BC. As Jeff pointed out, BSC was working on a Cell simulator in 2005 before they went with the Cell; why couldn't they be doing the same now, before full production of the new Cell?

Dude...let go of the dream of another Cell-powered device. It was a complete and total failure...why in the hell would they keep dumping money into R&D on a CPU that nobody wants to use, is more expensive, and doesn't have a noticeable bump in graphics compared to their competitors?

As far as BC is concerned, I think your best hope is that they go through the trouble of porting the PS Store titles to avoid problems, but disc-based games will likely not be supported.

All IMO, of course...
 
Doesn't matter how much a process theoretically saves on cost if the yields are poor.
KageMaru is correct; if yields are poor we will probably be bumped to late 2014. This is the reason for the early start, and for yield tests on final product designs using 3D wafer stacking this year.

There is speculation that the TESTING done on wafers before assembly could damage them, and this is one of the areas of concern. 3D stacking has been done for more than 10 years; 3D wafer stacking is new and has been R&D tested for only about (correct me if wrong) 5 years, with the new interconnect standards and pre-testing for 3D wafer stacking around for 2 years. The interconnect standards and pre-assembly testing are new.
 