
IBM: Cell continues as integrated part of Power roadmap; working on next consoles

gofreak

GAF's Bob Woodward
So I guess this is something of a clarification of an earlier statement about the status of Cell. He also talks a little about future work.

http://news.yahoo.com/s/pcworld/20101008/tc_pcworld/cellprocessordevelopmenthasntstalledibmctosays

Development around the original Cell processor hasn't stalled and IBM will continue to develop chips and supply hardware for future gaming consoles, a company executive said.

IBM is working with gaming machine vendors including Nintendo and Sony, said Jai Menon, CTO of IBM's Systems and Technology Group, during an interview Thursday. "We want to stay in the business, we intend to stay in the business," he said.

Sounds like they may still be in a bidding stage rather than having secured contracts.

"I think you'll see [Cell] integrated into our future Power road map. That's the way to think about it as opposed to a separate line -- it'll just get integrated into the next line of things that we do," Menon said. "But certainly, we're working with all of the game folks to provide our capabilities into those next-generation machines."
 

DonMigs85

Member
If they care at all about BC they'll have to continue using IBM parts. However the Wii and 360's respective CPUs shouldn't be that tough to emulate on a decent Intel/AMD chip but the Cell will be.
 

Nirolak

Mrgrgr
°°ToMmY°° said:
strange he doesn't mention microsoft. going with an amd GPU and CPU?
That's what I'm thinking.

They can probably get a nice bargain doing that, and AMD has been working really hard to meet every DirectX specification Microsoft puts out as soon as it happens.
 

gofreak

GAF's Bob Woodward
°°ToMmY°° said:
strange he doesn't mention microsoft. going with an amd GPU and CPU?

The article's author mentions specific companies, so maybe they just omitted MS for brevity. His later quote says they're working with 'all the game folks', but they're probably still at a pre-contract stage with most or all of them.
 

DonMigs85

Member
Even if Microsoft's next machine has a mere Radeon 5770-level GPU and a triple core Phenom it'll still be far more powerful than Xenos/Xenon.
 

IrishNinja

Member
DonMigs85 said:
If they care at all about BC they'll have to continue using IBM parts. However the Wii and 360's respective CPUs shouldn't be that tough to emulate on a decent Intel/AMD chip but the Cell will be.

was thinking about this the other day; 360 shouldn't be too hard to emulate down the lines, right? would PS3 be the next Saturn then, as far as difficulty to nail down for years?
 

DonMigs85

Member
IrishNinja said:
was thinking about this the other day; 360 shouldn't be too hard to emulate down the lines, right? would PS3 be the next Saturn then, as far as difficulty to nail down for years?
The PS3 RSX GPU is fairly simple but the SPEs in Cell will be the major hurdle.
 

Man

Member
Sony patented that external BC unit (so they can sidestep using Cell in the future?). With Sony teaming up with Intel for Google TVs (and obviously Vaios), that could be a future console alliance.
 

Lazy8s

The ghost of Dreamcast past
Microsoft could be designing their own CPU.

A smart company would've picked up a license for Meta and Series 6.
 

gofreak

GAF's Bob Woodward
makingmusic476 said:
Yeah, Cell is definitely gonna be around for awhile.

Aren't Toshiba using some form of Cell in their upcoming 3DTVs?

Yes, but I think they aren't going to be using it as broadly as they once planned. Less costly alternatives have come along since then that can handle many of the things they want to do, even if they don't offer as much power.


Recalling Hiroshige Goto's report last December, I wonder if Sony will go with several 'large' next-gen, Cell-influenced Power cores that have compatibility with the SPU instruction set. That could get them the greater ease of development they seem to be looking for while allowing for BC. I wonder how much more power it would offer, though - but perhaps they will lean on a relatively larger GPGPU to handle 'extra' heavier floating point work alongside rendering.
 

mrklaw

MrArseFace
A lot depends on bang for buck though. In theory a 'super Cell' makes sense as they've covered dev costs, it provides stability for toolsets, etc. Key would be whether they can get enough of a boost in performance from that architecture for the price they are willing to pay.

With the use of Cell to augment the GPU in PS3, I'm also curious what balance they'll go for in PS4: a relatively weak but therefore cheap GPU with more spent on a bigger Cell, or a beefed-up GPU to aid multiplatform ports, etc.?
 

gofreak

GAF's Bob Woodward
Superficially at least I think a large GPU makes sense. For a lot of developers I think it is relatively a lot harder to idle a GPU than a CPU. If you have a GPGPU and you're not using it a lot for stuff like physics or whatever, you can relatively easily put the balance of the power to work on better graphics. But if devs - for example - aren't using SPUs, they just go idle. At least until more recently, perhaps, when Sony made libs available for doing visual enhancement work in your spare SPU time.

Sony will look carefully at how the SPUs have and have not been used when determining how big the CPU and GPU in the next system should be relative to one another. How much more CPU they need. But I wouldn't be surprised at a 'small CPU, large GPU' approach. There's the question of how much CPU you'll really need in a world of very general GPUs. What would be nice would be a framework for writing jobs that could run on either chip, so that devs to a certain degree could treat both chips as one blob of processing power.
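Just to make that 'one blob' idea a bit more concrete, here's a totally hypothetical sketch of what such a framework could look like (none of this is a real Sony or IBM API, all the names are mine): jobs get written once against a small interface and tagged with where they're allowed to run, and a scheduler decides placement. The GPU path here is just a stub.

```cpp
// Purely hypothetical sketch of a CPU/GPU-agnostic job framework.
// Nothing here is a real console API; it just shows the shape of the idea.
#include <cstdio>
#include <functional>
#include <queue>
#include <utility>

enum class Target { CpuOnly, GpuOnly, Either };

struct Job {
    Target target;                  // where this job is allowed to run
    std::function<void()> kernel;   // the work itself, written once
};

struct Scheduler {
    std::queue<Job> pending;

    void submit(Job j) { pending.push(std::move(j)); }

    void run() {
        while (!pending.empty()) {
            Job j = std::move(pending.front());
            pending.pop();
            // A real scheduler would load-balance 'Either' jobs onto
            // whichever processor has spare cycles; this sketch just
            // runs everything on the CPU and flags the GPU-only case.
            if (j.target == Target::GpuOnly)
                printf("(pretend this got dispatched to the GPU)\n");
            j.kernel();
        }
    }
};

int main() {
    Scheduler s;
    s.submit({Target::Either,  [] { printf("physics batch\n"); }});
    s.submit({Target::GpuOnly, [] { printf("post-process pass\n"); }});
    s.run();
    return 0;
}
```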

I'm thinking you could probably do an 8-12 'fat' Power core chip at a similar die size to Cell's by 2013 or so. The current Power7 8-core is around 500mm^2 at 45nm, which is significantly larger than the 200-odd mm^2 of the first Cell die, but the cores in Sony's next wouldn't necessarily need to be AS fat as that, and there should be smaller processes available in a few years. On paper that wouldn't be a huge bump in raw processing power, but it would be much easier to leverage. And like I say, with a large and very general GPU - and with the right tools living on top of that - I'm not sure much more raw CPU power would be particularly necessary.
 
Next generation consoles won't come out until 2012 at the earliest; by then we should be at 22nm. That theoretically allows 16 times more transistors than the 90nm process we were at when the PS3 launched. An SPU takes only about 15 million transistors, so at 22nm it would take only about 1 square millimeter of chip. Basically at that point SPUs cost almost nothing and still offer a lot of power. A 32 SPU + 4 PPU Cell would take about 1 billion transistors; at 22nm the chip should be less than a third of the size Cell was at PS3 launch.

There were rumors a while ago about IBM peddling Power7 for PS4. I thought it's an interesting idea: Power7 should be compatible with the PPU cores with little effort and offers a shitload of conventional CPU power, the lack of which has caused many complaints during this generation. An 8-core Power7 takes about 2 billion transistors, so replacing the PPU cores with Power7 cores should produce a 4 Power7 + 32 SPU processor at 1.5 to 2 billion transistors. At 22nm that should be about half the size Cell was at launch and also half the cost. Great choice for a cost-efficient, easy-to-program, high-performance console with full backwards compatibility.
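For what it's worth, here's that arithmetic spelled out, using only the ballpark figures above plus rough public numbers for the launch Cell (~234M transistors on a ~230mm^2 die at 90nm) and assuming ideal 16x density scaling:

```cpp
// Back-of-the-envelope check of the die size claims above.
// All inputs are rough/ballpark figures, not verified die data.
#include <cstdio>

int main() {
    const double cell_transistors  = 234e6;  // launch Cell, approx.
    const double cell_die_mm2_90nm = 230.0;  // launch Cell die, approx.
    const double spu_transistors   = 15e6;   // per SPU, figure quoted above
    const double density_gain      = 16.0;   // ideal 90nm -> 22nm scaling

    const double density_90nm = cell_transistors / cell_die_mm2_90nm; // ~1M/mm^2
    const double density_22nm = density_90nm * density_gain;          // ~16M/mm^2

    // One SPU, and the hypothetical 32 SPU + 4 PPU (~1B transistor) chip, at 22nm:
    printf("one SPU at 22nm     : ~%.1f mm^2\n", spu_transistors / density_22nm);
    printf("32 SPU + 4 PPU chip : ~%.0f mm^2 at 22nm (launch Cell was ~%.0f mm^2)\n",
           1e9 / density_22nm, cell_die_mm2_90nm);
    return 0;
}
// Comes out to roughly 0.9 mm^2 per SPU and ~60 mm^2 for the big chip,
// i.e. well under a third of the launch Cell, so the claim holds up.
```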
 
DonMigs85 said:
If they care at all about BC they'll have to continue using IBM parts. However the Wii and 360's respective CPUs shouldn't be that tough to emulate on a decent Intel/AMD chip but the Cell will be.

Isn't it always tough to emulate PPC on x86?
 

szaromir

Banned
When do you think the next batch of consoles will come out? My bet is 2013. At that point the jump between current and next gen could be huge even at reasonable prices.
 

GCX

Member
szaromir said:
When do you think the next batch of consoles will come out? My bet is 2013. At that point the jump between current and next gen could be huge even at reasonable prices.
I say Nintendo will be the first in 2012 followed by Sony and MS in 2013.
 

Man

Member
GCX said:
I say Nintendo will be the first in 2012 followed by Sony and MS in 2013.

I think everyone will end up in 2012 as they are trying to sneak in.
Nintendo and Sony are getting their handhelds 'out of the way' in 2011.

I speculated that MS might go for as early as 2011 (a sneak launch with Gears 3 on super settings) but Kinect defeats that thought. It's a prototype of the Kinect 2, which will be there from launch, and 2011 is too early.
 

Tiduz

Eurogaime
prob a good idea for sony since all the devs had to learn cell anyways.

(shame not all of them could pull it off)
 
I can't really see this being all that valuable to anybody else but SONY.

Wasn't most (or at least a large share) of the CELL's horsepower dedicated to media and less so gaming?

This is what I've heard; somebody please correct me if this is untrue.

fortified_concept said:
Good. Cell is the gift that keeps on giving as far as graphics are concerned and one of the few smart decisions Sony made for PS3.

CPU's hardly have anything to do with the graphics.

H_Prestige said:
It was by far the dumbest decision they made.

No, that was them using a storage medium that at launch cost them as much as, if not more than, the console itself should have, and that ended up giving practically no advantages to the games. I know people would argue that the PlayStation 3 helped Blu-ray topple ye olde HD-DVD, but those billions Sony lost could have easily gone toward giving the format a bigger marketing push.
 
fortified_concept said:
Good. Cell is the gift that keeps on giving as far as graphics are concerned and one of the few smart decisions Sony made for PS3.

It was by far the dumbest decision they made. How many billions did they waste on that chip? Every game the ps3 runs the 360 runs just as well if not better, with much cheaper hardware.

I think ps4 will be x86 based and ps3 BC will either be hardware based built in, or via the external add on that was patented.

No, that was them using a storage medium that at launch cost them as much as, if not more than, the console itself should have, and that ended up giving practically no advantages to the games. I know people would argue that the PlayStation 3 helped Blu-ray topple ye olde HD-DVD, but those billions Sony lost could have easily gone toward giving the format a bigger marketing push.

No, Blu-ray is one of those features that hurts at first but helps later on as it gets cheaper. The same cannot be said for Cell. It was a handicap right from the start and it still is today.
 

Man

Member
Flying_Phoenix said:
CPU's hardly have anything to do with the graphics.
SCE studios and a few 3rd parties are offloading a lot of graphics related work to the Cell.
The few 3rd parties part is the problem though.

Their 1st party offerings are among the most impressive outings on consoles today, but 3rd party games often come out slightly better on the competing hardware (there are exceptions though, like Elder Scrolls, Castlevania, FF13, Burnout, Fallout, Dragon Age, etc.).
 

gofreak

GAF's Bob Woodward
Flying_Phoenix said:
CPU's hardly have anything to do with the graphics.

Not so in PS3. At least in the games using Cell for render work. Among some of the better looking PS3 games you'll see a surprising amount of graphics work being done on SPUs.

A more general next-gen GPU might leave less potential application for a processor like Cell with render work though. Cell is nice in the context of PS3 because it gives a lot of power in a more general way than its GPU, letting devs explore certain algorithms, or certain levels of quality, that wouldn't be possible on the system's GPU. But in a context with a really general GPU in the future, that kind of algorithm exploration could be done there.

H_Prestige said:
No, blu ray is one of those features that hurts at first but helps later on as it gets cheaper. The same cannot be said for Cell. It was a handicap right from the start and it still is today.

I'm not sure I understand this. Processors 'get cheaper' too. And in terms of impact on games' technical quality, the processors in PS3 have probably had more influence than the disc drive. In terms of getting more and more out of the system, and going well beyond what an RSX should rightfully be able to handle, I would certainly credit Cell above anything else in the system.
 

DonMigs85

Member
Blu-Ray was a good decision indeed, and Cell was also kinda needed for its video-decoding capabilities at least initially, but Sony really shouldn't have cheaped out on the GPU.
And don't forget that the 360's Xenon CPU also came about as a by-product of Cell R&D.
 

spwolf

Member
fortified_concept said:
Good. Cell is the gift that keeps on giving as far as graphics are concerned and one of the few smart decisions Sony made for PS3.

Cell, standard HDD and BD: all good decisions now, terrible back then due to high costs :).
 

gofreak

GAF's Bob Woodward
DonMigs85 said:
Blu-Ray was a good decision indeed, and Cell was also kinda needed for its video-decoding capabilities at least initially

Cell was complete overkill for 1080p blu-ray decoding.

Contrary to the belief of some, the extent of Cell's capability was targeted at games - specifically at heavy simulation and physics components of game processing, and also as a potential helper for RSX. The proposition was that next-gen games would require heavy numerical computation as a much larger proportion of their overall computation than before, and so that's what they optimised around. If they just wanted a good video/media decoder they could have popped one or two SPUs on a chip and called it a day. Or not gone with this design at all - a couple of PPUs would have sufficed too.
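To put a picture on 'heavy numerical computation': this is the kind of data-parallel kernel the SPUs were built around - flat arrays, a tight branch-free loop, lots of streaming float math. It's a made-up toy example, not anyone's actual engine code:

```cpp
// Toy example of the data-parallel, simulation-style work described above.
// Structure-of-arrays layout and a tight loop; purely illustrative.
#include <cstdio>
#include <vector>

struct Particles {
    std::vector<float> x, y, z;     // positions
    std::vector<float> vx, vy, vz;  // velocities
};

void integrate(Particles& p, float dt) {
    const std::size_t n = p.x.size();
    for (std::size_t i = 0; i < n; ++i) {   // easily SIMD-vectorised, no branches
        p.x[i] += p.vx[i] * dt;
        p.y[i] += p.vy[i] * dt;
        p.z[i] += p.vz[i] * dt;
    }
}

int main() {
    Particles p;
    p.x = {0.0f}; p.y = {0.0f}; p.z = {0.0f};
    p.vx = {1.0f}; p.vy = {2.0f}; p.vz = {3.0f};
    integrate(p, 1.0f / 60.0f);
    printf("%.4f %.4f %.4f\n", p.x[0], p.y[0], p.z[0]);
    return 0;
}
```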
 

spwolf

Member
gofreak said:
Cell was complete overkill for 1080p blu-ray decoding.

Contrary to the belief of some, the extent of Cell's capability was targeted at games - specifically at heavy simulation and physics components of game processing, and also as a potential helper for RSX. The proposition was that next-gen games would require heavy numerical computation as a much larger proportion of their overall computation than before, and so that's what they optimised around. If they just wanted a good video/media decoder they could have popped one or two SPUs on a chip and called it a day. Or not gone with this design at all - a couple of PPUs would have sufficed too.

The main problem was that they didn't develop SDKs that showed devs what to do with Cell until much later on... no wonder crazy Ken got the boot.
 
Specifically mentioning Nintendo and Sony but not mentioning Microsoft is interesting.


So a beefed-up Cell + mystery GPU (probably from Nvidia to ease BC) from Sony.

Multicore PowerPC (probably Xenon-alike, but I'd much rather a Cell-based design now that great tools are out there and developers are doing fantastic things with it) + AMD GPU for Nintendo. Possibly all integrated on a single chip like Valhalla/Llano. I kinda like the idea of Nintendo sneaking in at the last minute and reaping the rewards of all the millions of dollars and man-hours put into Cell development without spending a single penny! :lol Since both Toshiba and IBM are free to license the design, there's seemingly little that Sony can do about it either; the more I think about it, the more sense a move like that actually makes. CELL has actually proven itself a pretty damn decent console CPU in the end, and with a few careful tweaks (starting by beefing up the PPE) it could be a fantastic chip.

A customised future fusion design from AMD for Microsoft?

It probably means nothing of course, but it's nice to explore the consequences of that statement if you take it at face value.
 
°°ToMmY°° said:
strange he doesn't mention microsoft. going with an amd GPU and CPU?

If they're going with AMD it's because they want a single chip design. Although IBM eventually delivered this for them on the 360, they may not be happy with the results and may want better guarantees that integration can happen at launch. From 2011 onwards, the majority of chips sold by AMD will be integrated CPU/GPU designs, so if an integrated CPU/GPU is what you want, then AMD are your best bet.

Microsoft may not be confident that IBM can deliver a multi-billion-transistor CPU/GPU chip (Valhalla was a much more modest project) without their own GPU tech, which is why they may be going with AMD for theirs, whereas Nintendo may still pick one up from a joint IBM/AMD partnership. This could be for all manner of reasons, one simply being that AMD don't want IBM to get their hands on (and become deeply familiar with) their latest and greatest GPU architecture. Since Nintendo will probably be happy to go with an older GPU design from AMD (something similar to what's being shipped in current generation Fusion products, for instance), AMD may be more willing to pass it over to IBM.

I'm not expecting discrete CPU and GPU chips next generation, it'll only make some of the potential workloads infinitely more difficult and drive up costs. If we do see two chips then they'll probably be two separate CPU/GPU hybrids, not two specific parts each with a tightly defined function.


DonMigs85 said:
If they care at all about BC they'll have to continue using IBM parts. However the Wii and 360's respective CPUs shouldn't be that tough to emulate on a decent Intel/AMD chip but the Cell will be.

Sony will, yes, but Microsoft and Nintendo? Nah; so long as they go with a graphics solution from AMD (and in Nintendo's case even that isn't a prerequisite), it shouldn't be too difficult to gain compatibility with those low-end PowerPC cores.
 
gofreak said:
Yes, but I think they aren't going to be using it as broadly as they once planned. Less costly alternatives have come along since then that can handle many of the things they want to do, even if they don't offer as much power.


Recalling Hiroshige Goto's report last December, I wonder if Sony will go with several 'large' next-gen, Cell-influenced Power cores that have compatibility with the SPU instruction set. That could get them the greater ease of development they seem to be looking for while allowing for BC. I wonder how much more power it would offer, though - but perhaps they will lean on a relatively larger GPGPU to handle 'extra' heavier floating point work alongside rendering.

That sounds quite similar to Power7 in principle; I'm sure "tweaking" those cores so that they ditch AltiVec support for SPU support is within IBM's capabilities. The eDRAM-as-L3-cache tech included in Power7 also seems like a nice fit for a console to me, especially if you can get some sort of GPU integrated on the same die. It doesn't have to be a shader monster either, since the beefed-up SPEs can handle a lot of that work (as they do in the PS3, just less efficiently since they're off-die); just make sure it has decent texturing hardware, etc.
 

Mr_Brit

Banned
So do you guys think MS and Sony will go for eDRAM in their next consoles? The 360 got a huge boost from that meager 10MB. Just imagine what a ~48MB buffer could do: it would mean most games would run at least at 1080p with 2xMSAA, which is just below the point of diminishing returns where any more res or AA would be hard to notice.
 
gofreak said:
I'm thinking you could probably do an 8-12 'fat' Power core chip at a similar die size to Cell's by 2013 or so. The current Power7 8-core is around 500mm^2 at 45nm, which is significantly larger than the 200-odd mm^2 of the first Cell die, but the cores in Sony's next wouldn't necessarily need to be AS fat as that.

It's worth noting that the majority of the die space in the 8-core (32-thread) Power7 is actually taken up by the huge 32MB of L3 cache (eDRAM). Each individual core is actually a relatively small/simple IO design compared to the modern monstrosities from Intel/AMD.
 
I don't think I can survive more of those Xbox/PS threads and endless articles of devs complaining about the different architectures, and that showing in their games.
 

DonMigs85

Member
Mr_Brit said:
So do you guys think MS and Sony will go for eDRAM in their next consoles? The 360 got a huge boost from that meager 10MB. Just imagine what a ~48MB buffer could do: it would mean most games would run at least at 1080p with 2xMSAA, which is just below the point of diminishing returns where any more res or AA would be hard to notice.
How much eDRAM would you need just to contain a 1920x1080 framebuffer with at least 2x MSAA?
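Rough napkin math while we're at it, assuming a 32-bit colour target plus a 32-bit depth/stencil buffer, both stored per sample (the byte sizes are assumptions, not anything confirmed about future hardware):

```cpp
// Napkin math: eDRAM needed for a single 1080p colour+depth pair with MSAA.
// Assumes 32-bit RGBA8 colour and 32-bit depth/stencil, both stored per sample.
#include <cstdio>

int main() {
    const long long width = 1920, height = 1080;
    const int bytes_per_sample = 4 /*colour*/ + 4 /*depth+stencil*/;

    for (int samples : {1, 2, 4}) {
        long long bytes = width * height * samples * bytes_per_sample;
        printf("1080p, %dxMSAA: %.1f MB\n", samples, bytes / (1024.0 * 1024.0));
    }
    return 0;
}
// ~15.8 MB with no AA, ~31.6 MB at 2x, ~63.3 MB at 4x (before any tiling tricks).
```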
 
Mr_Brit said:
So do you guys think MS and Sony will go for eDRAM in their next consoles? The 360 got a huge boost from that meager 10MB. Just imagine what a ~48MB buffer could do: it would mean most games would run at least at 1080p with 2xMSAA, which is just below the point of diminishing returns where any more res or AA would be hard to notice.

As I understand it, the near-wholesale move towards multiple render targets makes the inclusion of a huge-ass chunk of eDRAM like that unlikely; at least that was the consensus I picked up at B3D from some of the more knowledgeable members. No matter how big it is, you're never going to be able to fit all your render targets in there, so it's pointless going with a monstrous amount like that.

If any eDRAM tech is used, I believe it'll take the form of the eDRAM-as-L3-cache tech in IBM's Power7 chips. Being able to quickly and easily share workloads between the CPU and GPU (without huge amounts of latency) could be crucial to performance if you've got something akin to both SPUs and a traditional GPU on that one die. Developers would still have the option to simply use it as somewhere to store their framebuffer in a traditional forward renderer, but the clever ones will be able to put it to much better use as well. It'd simply be much more flexible and incredibly useful for many different rendering techniques, not just the ones we had going into this generation. 15-20MB ought to cover most usage cases.
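As a rough illustration of why (my numbers, not B3D's): take a hypothetical deferred setup with four 32-bit render targets plus depth at 1080p and you're already pushing 40MB before MSAA or any intermediate buffers:

```cpp
// Rough illustration: a hypothetical deferred G-buffer at 1080p.
// Four 32-bit colour targets (e.g. albedo, normals, material, lighting) plus
// a 32-bit depth/stencil buffer; the layout is made up for the example.
#include <cstdio>

int main() {
    const long long pixels = 1920LL * 1080LL;
    const int colour_targets = 4;   // 4 bytes each
    const int depth_bytes    = 4;

    long long bytes = pixels * (colour_targets * 4 + depth_bytes);
    printf("1080p G-buffer, no MSAA: %.1f MB\n", bytes / (1024.0 * 1024.0));
    return 0;
}
// ~39.6 MB already, and real engines pile more targets and intermediates on top.
```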
 

DonMigs85

Member
brain_stew said:
As I understand it, the near-wholesale move towards multiple render targets makes the inclusion of a huge-ass chunk of eDRAM like that unlikely; at least that was the consensus I picked up at B3D from some of the more knowledgeable members. No matter how big it is, you're never going to be able to fit all your render targets in there, so it's pointless going with a monstrous amount like that.

If any eDRAM tech is used, I believe it'll take the form of the eDRAM-as-L3-cache tech in IBM's Power7 chips. Being able to quickly and easily share workloads between the CPU and GPU (without huge amounts of latency) could be crucial to performance if you've got something akin to both SPUs and a traditional GPU on that one die. Developers would still have the option to simply use it as somewhere to store their framebuffer in a traditional forward renderer, but the clever ones will be able to put it to much better use as well. It'd simply be much more flexible and incredibly useful for many different rendering techniques, not just the ones we had going into this generation. 15-20MB ought to cover most usage cases.
This is true, especially with the push for true stereoscopic 3D nowadays.
 
H_Prestige said:
Isn't it always tough to emulate PPC on x86?

Apple managed just fine, and they had to emulate cores significantly more complex than the stripped-down (Pentium 1 era) cores you find in the Xbox 360. FP performance is one area where x86 has always lagged far behind PowerPC, but with the introduction of AVX that's going to change.
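For anyone who hasn't looked at AVX, the point is simply the width: one 256-bit instruction operates on eight floats at once. A trivial illustration (compile with -mavx); it obviously says nothing about real-world throughput on its own:

```cpp
// Minimal AVX illustration: eight single-precision multiplies per instruction.
// Build with -mavx; the numbers are meaningless, only the width matters here.
#include <cstdio>
#include <immintrin.h>

int main() {
    alignas(32) float a[8]   = {1, 2, 3, 4, 5, 6, 7, 8};
    alignas(32) float b[8]   = {8, 7, 6, 5, 4, 3, 2, 1};
    alignas(32) float out[8];

    __m256 va = _mm256_load_ps(a);
    __m256 vb = _mm256_load_ps(b);
    __m256 vc = _mm256_mul_ps(va, vb);   // eight multiplies in one instruction
    _mm256_store_ps(out, vc);

    for (float f : out) printf("%.0f ", f);
    printf("\n");
    return 0;
}
```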
 

thcsquad

Member
H_Prestige said:
It was by far the dumbest decision they made. How many billions did they waste on that chip? Every game the ps3 runs the 360 runs just as well if not better, with much cheaper hardware.

I think ps4 will be x86 based and ps3 BC will either be hardware based built in, or via the external add on that was patented.

The GPU in the PS3 is the bottleneck, among other things, none of which are the Cell.
 
thcsquad said:
The GPU in the PS3 is the bottleneck, among other things, none of which are the Cell.

Not strictly true. The anaemic general purpose single threaded performance is also a bottleneck I'd like to see rectified in a new design. The current PPE is just too slow. We shouldn't have something akin to a highly clocked Atom controlling all those super fast SPEs but that's the current reality. If you're going to add even more SPEs, then the situation is going to become even more critical.

There's more than one way to solve that problem of course: gofreak's suggestion of beefing up the SPEs themselves is one, but simply replacing the PPE itself with something that isn't straight out of the stone ages and tied to a puny 512KB of L2 cache would be another.

Edit: The suggestion of replacing the PPE with a couple of Power7 cores is a good one imo. Sure, those Power7 cores are still an IO design, but at least they aren't completely useless like the current PPE is. 2 Power7s + ~20 SPEs + ~15MB of L3 cache (via IBM's eDRAM tech) is starting to look tasty imo, especially if you can fit a traditional GPU on that die as well (you can save some transistors by nixing some of its shader units, since the SPEs can do a lot of that work; you only really want it for its fixed-function hardware). Something like that may be doable with ~2 billion transistors.
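Very rough budget check on that, reusing the ~15M-transistors-per-SPU figure from earlier in the thread and assuming eDRAM at roughly one transistor per bit; everything else is just whatever is left of the ~2 billion target:

```cpp
// Very rough budget check. The SPE figure (~15M transistors) comes from
// earlier in the thread; eDRAM is assumed at ~1 transistor per bit; the
// rest is simply "whatever remains" of the ~2 billion transistor target.
#include <cstdio>

int main() {
    const double budget    = 2e9;
    const double spes      = 20 * 15e6;               // ~20 SPEs
    const double edram_l3  = 15.0 * 8 * 1024 * 1024;  // 15MB at ~1T per bit
    const double remaining = budget - spes - edram_l3;

    printf("SPEs          : ~%.0f M transistors\n", spes / 1e6);
    printf("15MB eDRAM L3 : ~%.0f M transistors\n", edram_l3 / 1e6);
    printf("left for 2x Power7 cores, GPU, interconnect: ~%.1f B\n",
           remaining / 1e9);
    return 0;
}
// Roughly 300M + 126M, leaving ~1.6B for everything else, so the ~2B
// figure doesn't look obviously crazy under these assumptions.
```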
 

TheSeks

Blinded by the luminous glory that is David Bowie's physical manifestation.
Man said:
Sony patented that external BC unit.

Wait, what? Please tell me that's not a $150 option just to play PSX/2/3 games via USB or whatever to the PS4. :|
 

McHuj

Member
Lazy8s said:
Microsoft could be designing their own CPU.

A smart company would've picked up a license for Meta and Series 6.

I know they acquired some people from IBM who worked on the Xbox CPU and moved them to Washington. How many and for what exact purpose I don't know, but that was back in the 2006-2007 time frame.
 

Man

Member
TheSeks said:
Wait, what? Please tell me that's not a $150 option just to play PSX/2/3 games via USB or whatever to the PS4. :|
It's the chipset of old hardware in an external box basically.
So yes (that's the potential).

If an eight-core Power7 can be backwards compatible with the PS3's Cell, then that is probably the most viable choice for Sony today, as it should be easy to develop for and provide some good horsepower.
 

Averon

Member
Not surprised by this news. Sony using a Cell 2.0 would be far cheaper than contracting IBM/Intel/AMD to develop a brand new chip. In addition to that, it would make the transition from PS3 to PS4 much easier development-wise. Stringer and Hirai will make sure the PS4 won't be a repeat of the PS3.
 
Lazy8s said:
Microsoft could be designing their own CPU..

Well, they do have an ARM architectural license now. They obviously picked that up for something, and while it's probably for server chips, it's not impossible that they're also planning to use it to design their own console CPU. Their console division is pretty much the only part of the company with any CPU design experience after working closely with IBM on the 360's design, die shrinks and chip integration.
 

gofreak

GAF's Bob Woodward
brain_stew said:
Not strictly true. The anaemic general purpose single threaded performance is also a bottleneck I'd like to see rectified in a new design. The current PPE is just too slow. We shouldn't have something akin to a highly clocked Atom controlling all those super fast SPEs but that's the current reality. If you're going to add even more SPEs, then the situation is going to become even more critical.

I may be wrong, but I wouldn't characterise the PPE as holding the SPEs back, depending on the kind of task set-up used at least.

Not that a beefier main core, or cores, wouldn't be welcome. And I don't foresee them coupling a tonne of SPEs to one Power core; they would probably have multiple cores coupled to a larger number of SPEs if that was the route they were taking.

In terms of downsides that have manifested in a very material way in end results, I think Cell's are probably less prominent than others. I'd hazard a guess that most developers wouldn't tell you, today at least, that Cell is where their performance bottleneck was in PS3 games... far from it, probably.
 

DonMigs85

Member
Is RSX's 128-bit memory bus a direct consequence of only giving it 8 ROPs compared to 16 on a 7800 GTX, or would it have been possible to retain a 256-bit bus with only 8 ROPs?
 
Flying_Phoenix said:
CPU's hardly have anything to do with the graphics.

Your average x86 chip might not, but CELL sure as hell does. The PS3 wouldn't be anywhere close to competing with the 360 graphically at this point if not for CELL, nevermind comprehensively surpassing it in titles that really understand how to utilise its capabilities. CELL is doing all kinds of traditional graphics work in even your average run of the mill multiplatform title, nevermind your premier first party exclusives.
 