
IBM kills Cell chip line - Ninthing kills Sony then wakes up sticky, confused

gcubed

Member
knitoe said:
I see you're taking my high-end estimate. It's plausible in 3 years. Look at the 360 with 3 cores and DX10 compared to what was on the market at the time.

of course i did, it helped my point :D
 

Stink

Member
cRIPticon said:
Of course! And that's why tools, middleware, libraries and developers all need to keep building towards greater and greater numbers of cores. A practical example of this in games is virtual processors (or actual ones, with more cores) that fire up to handle complex interactions within a scene (i.e. lighting, reflections, physics, pathfinding AI, etc.) for a limited time and then spin down or are reassigned, resulting in smoother framerates while handling more interactions within the scene.

While a lot of games are still developed as mono-threaded apps, this will be less and less the case as time goes by.

Most of this already takes place on the GPU though, which has been multicore for a long, long time. The CPU isn't relevant here.
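
As a rough illustration of the quoted idea above, where workers are fired up for a scene's heavy jobs (physics, pathfinding, lighting prep) and then spin down, here is a minimal CPU-side sketch in C++; the subsystem and type names are made up and this is not taken from any real engine.

// Minimal sketch (illustrative only): per-frame subsystems are fanned out as
// tasks onto whatever cores are available, then joined before the frame is
// submitted for rendering.
#include <cstdio>
#include <functional>
#include <future>

struct FrameData {
    int id = 0;
    bool physics_done = false, ai_done = false, lighting_done = false;
};

// Stand-ins for real subsystem updates.
void simulate_physics(FrameData& f)   { f.physics_done = true; }
void update_pathfinding(FrameData& f) { f.ai_done = true; }
void prepare_lighting(FrameData& f)   { f.lighting_done = true; }

void run_frame(FrameData& frame) {
    // Each subsystem becomes a task; the runtime schedules them across cores.
    auto physics  = std::async(std::launch::async, simulate_physics,   std::ref(frame));
    auto ai       = std::async(std::launch::async, update_pathfinding, std::ref(frame));
    auto lighting = std::async(std::launch::async, prepare_lighting,   std::ref(frame));

    // "Spin down": each worker is released as soon as its task completes.
    physics.get();
    ai.get();
    lighting.get();
}

int main() {
    FrameData frame;
    for (int i = 0; i < 3; ++i) {
        frame.id = i;
        run_frame(frame);
        std::printf("frame %d simulated\n", frame.id);
    }
    return 0;
}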
 

cRIPticon

Member
Stink said:
Most of this already takes place on the GPU though, which has been multicore for a long, long time. The CPU isn't relevant here.

Sorry, not true. More and more of this is happening in multi-core CPUs as well.
 
Stink said:
only if you include the Cell making up for the weak RSX.


RSX is weak today. For when the PS3 was due to launch, it was an adequate, practical solution. Everyone has their Cell+8800 wet dreams, but that was not possible back in 2006 unless you're OK with a $799 launch price.
 

Rubezh

Member
Falafelkid said:
IBM is discontinuing the Cell chip line of products.
 

cRIPticon

Member
Stink said:
only if you include the Cell making up for the weak RSX.

Read Tim Sweeney's presentation "The End of the GPU Roadmap": http://graphics.cs.williams.edu/archive/SweeneyHPG2009/TimHPG2009.pdf

While we have seen this cycle of "specialized hardware/generalized hardware/repeat" happen for several turns in the general computing field, it is fun to see Tim advocate it here. More cores on the CPU will matter, along with more threads per core, better OSes, better tools, and a better understanding of how to develop for these beasts.

One thing is for certain: multi-core systems are the way forward, and there is no going back when competing on the edge of computing.
 

Gorgon

Member
cRIPticon said:
You are completely wrong. What Tom's Hardware is testing is OSes and current game applications running on multicore systems. This is why I stated that it is not just a hardware issue; it is an OS issue as well as a compiler issue. There are plenty of large-scale systems that produce almost linear scaling across hundreds of cores (e.g. Solaris vs. Windows or Linux).

Those large systems don't work within a gaming environment. There's a difference between crunching molecular/astrophysical/whatever data and processing game code. Plus, the kind of extremely scalable OS that would make such a difference in the entertainment environment you envision for future consoles ain't coming anytime soon.


cRIPticon said:
To be sure, it is not a trivial task in the slightest, but clock speeds are not increasing at the pace they once were, so multi-core becomes more important. Watch MS and Sony invest in compiler and OS development to take advantage of this moving forward. Developers are going to have to learn multi-threaded app development, or the tools will have to provide this with a limited learning curve. Point is, multi-core is the future from here on out and the industry needs to get better, and they are, in fact, at developing for these architectures.

There's a difference between it being the future and having 20 cores in a next console that has to sell at around 299 USD and actually expecting those 20 cores to pay off in investment terms. Unless there's some kind of unexpected revolution in coding, you won't be using anywhere near 20 cores in 3 years. It simply doesn't pay off.
 

Struct09

Member
However many cores they choose for their next CPU, they should name it the HARDCORE. You can have that one for free, Sony.
 

cRIPticon

Member
Gorgon said:
Those large systems don't work within a gaming environment. There's a difference between crunching molecular/astrophysical/whatever data and processing game code. Plus, the kind of extremely scalable OS that would make such a difference in the entertainment environment you envision for future consoles ain't coming anytime soon.




There's a difference between it being the future and having 20 cores in a next console that has to sell at around 299 USD and actually expecting those 20 cores to pay off in investment terms. Unless there's some kind of unexpected revolution in coding, you won't be using anywhere near 20 cores in 3 years. It simply doesn't pay off.

A 12-SPU Cell processor and a quad-core GPU already put you at 16 cores. Not doable in 3 years? Again, please read the Tim Sweeney presentation I linked to in an above post.

Struct09 said:
However many cores they choose for their next CPU, they should name it the HARDCORE. You can have that one for free, Sony.

THAT made me laugh! Brilliant!
 

cartman414

cRIPticon said:
Wrong and wrong. Revolutionaries at Sony. Go and find it. Read it.

It was documented in EGM2 back in late '93 or early '94 or so: Sony at one point had bailed out on the project in its second iteration, leaving Nintendo to work on it themselves.

Caio said:
I get your point, but don't think you're smarter than Sony, 'cause you're not, no offence.
If Sony conceived the Cell + RSX there is a reason, and the result is very evident when you see games like KZ2 and Uncharted 2. And you've seen nothing yet; the best has yet to come in the 4th and 5th gen of games. The CPU is very important in tasks such as heavy physics, animations, AI, collision systems, etc., and the only hypothetical PS3 stronger than the actual PS3 would be the same Cell + a G80 derivative for its GPU, but it would be too expensive, and Sony cares about these things more than we do, believe me.
Let Sony do their job, they don't need our suggestions.

:lol :lol :lol :lol :lol

For someone purporting to know about CPUs and all that, you sure don't seem to display a lick of insight. And it takes just a cursory bit of knowledge to know what Sony did wrong with the PS3, namely make an expensive, overly complicated piece of hardware that is unfriendly to many developers.

Caio said:
Wow, yes, wow: Uncharted, R&C, MGS4, GTA4, KZ2, LBP, Uncharted 2, R&C: ACIT, COD MW2, AC2, GT5P, RE5, Mirror's Edge, Infamous, Resistance, R2, Wipeout HD, Burnout Paradise, etc etc etc etc = best console ever owned, waiting for God of War 3, GT5, FFXIII, Versus, Star Ocean 4, Heavy Rain, MAG, The Last Guardian, etc, etc, anything else?

Including multiplatform games, and what is essentially a demo (GT5: Prologue)? Cheater.
 

cRIPticon

Member
cartman414 said:
It was documented in EGM2 back in late '93 or early '94 or so: Sony at one point had bailed out on the project in its second iteration, leaving Nintendo to work on it themselves.

And EGM2 got it wrong. It was the Sony/Philips group that bailed on the second go-around, not Ken's organization (PlayStation). It was the same internal group at Sony that sabotaged Ken's organization (which, at the time, was under Sony Music, I believe) and got Nintendo to drop the original PlayStation for the SNES.

So, TECHNICALLY, you are correct as it was a Sony partnership that bailed out, but Ken's group only made one run at it. It was Nintendo's disrespect that Ken used to get the Sony management fired up and angry.
 

cartman414

cRIPticon said:
And EGM2 got it wrong. It was the Sony/Philips group that bailed on the second go-around, not Ken's organization (PlayStation). It was the same internal group at Sony that sabotaged Ken's organization (which, at the time, was under Sony Music, I believe) and got Nintendo to drop the original PlayStation for the SNES.

So, TECHNICALLY, you are correct as it was a Sony partnership that bailed out, but Ken's group only made one run at it. It was Nintendo's disrespect that Ken used to get the Sony management fired up and angry.

Actually that's what I meant, more or less: Sony + Philips.

The last sentence, though, would make Ken seem like even more of a manipulative bastard than initially thought, since it had to do with the original SNES sound chip contract that forced all CD-ROM royalties into Sony's hands. (Not to say that Nintendo was innocent of all wrongdoing there.)
 

Gorgon

Member
cRIPticon said:
A 12-SPU Cell processor and a quad-core GPU already put you at 16 cores. Not doable in 3 years? Again, please read the Tim Sweeney presentation I linked to in an above post.

You know as well as I do that we are talking about general-purpose cores, not PPEs + SPEs. Next thing you're going to tell me is that since a modern GPU already has hundreds of processors we can expect the same for CPUs next gen...
 

ZeoVGM

Banned
This Moron's Blog said:
If Warner would side with HD-DVD, then the format is gathering strength at a remarkable pace. The studio withdrawing Blu-ray support would be another huge blow, and most likely the fatal one, to Sony and the PlayStation3.

With Warner selling the most high-definition discs, retailers are sure to side with the studio and follow their decision. Then it would only be a matter of time before Fox and Disney jump ship and Blu-ray, as a home video format, can be pronounced dead.

:lol :lol :lol :lol :lol
 

Polari

Member
cRIPticon said:
Hmmm....as the PS3 already has 8 cores (not including the PPC and the RSX), they would have to do a lot better than that. :lol

Seriously, next-gen platforms are going to have to have an order of magnitude more cores + RAM + a multi-core GPU. While developing this hardware is not easy, it is more difficult to get developers who are used to programming a specific way to understand multi-threaded/multi-core code. Not many dev shops fully understand this and even fewer do it well. That's not specific to the games industry, BTW.

The quality of the development tools provided is going to be even more important next generation than it has been in this one. I'm curious: could a technology like Apple's Grand Central Dispatch be in any way useful for game programming? Would something like that, highly optimised for specific use cases within a game engine, potentially provide an advantage for developers working on consoles with lots of cores? It seems to simplify the problem of multithreaded programming, but would it perhaps not work at a low enough level to be useful in gaming?
 

jax (old)

Banned
omg rite said:
If Warner would side with HD-DVD, then the format is gathering strength at a remarkable pace. The studio withdrawing Blu-ray support would be another huge blow, and most likely the fatal one, to Sony and the PlayStation3.

With Warner selling the most high-definition discs, retailers are sure to side with the studio and follow their decision. Then it would only be a matter of time before Fox and Disney jump ship and Blu-ray, as a home video format, can be pronounced dead.

:lol :lol :lol :lol :lol


:lol
 

Boss Man

Member
Oh God, I wish he hadn't been banned. This guy's post history is golden.

Falafelkid said:
So it isn't just me who prefers shooters with the Wiimote...

Falafelkid said:
Same here. I recently played Gears of War 2 and while it is a great game, I can never go back to the old regular controller setup or even the keyboard and mouse. Especially FPSs have to be played with the Wiimote Nunchuk combo.

Falafelkid said:
Hang on, stop and think. Would you play an FPS... with your hand? I wouldn't. Both systems have their advantages for different types of gameplay.

Sony is completely fucked now, though.
 

Durante

Member
Polari said:
The quality of the development tools provided is going to be even more important next generation than it has been in this one. I'm curious: could a technology like Apple's Grand Central Dispatch be in any way useful for game programming? Would something like that, highly optimised for specific use cases within a game engine, potentially provide an advantage for developers working on consoles with lots of cores? It seems to simplify the problem of multithreaded programming, but would it perhaps not work at a low enough level to be useful in gaming?
"Grand Central Dispatch" is just another case of Apple taking well established technology, slapping a fancy name on it and using it in marketing. It's task based parallelism based on a threadpool. It's what MIT did in 1994 with Cilk.
That said, of course approaches like that are useful for game development as well. For example I know some PS3 developers at least use SPUlets, which are small tasks runnable on the SPU that get scheduled automatically.
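
As a rough, generic illustration of that task-pool idea, a queue of small jobs fed to a fixed set of worker threads, here is a minimal C++ sketch; it is not GCD, Cilk, or Sony's SPU scheduler, and all the names in it are made up.

#include <condition_variable>
#include <cstdio>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// A fixed pool of worker threads draining a shared queue of tasks.
class TaskPool {
public:
    explicit TaskPool(unsigned workers) {
        for (unsigned i = 0; i < workers; ++i)
            threads_.emplace_back([this] { worker_loop(); });
    }

    ~TaskPool() {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            done_ = true;
        }
        cv_.notify_all();
        for (auto& t : threads_) t.join();  // workers exit once the queue is drained
    }

    // Submit a task; whichever worker is free picks it up and runs it.
    void submit(std::function<void()> task) {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            tasks_.push(std::move(task));
        }
        cv_.notify_one();
    }

private:
    void worker_loop() {
        for (;;) {
            std::function<void()> task;
            {
                std::unique_lock<std::mutex> lock(mutex_);
                cv_.wait(lock, [this] { return done_ || !tasks_.empty(); });
                if (done_ && tasks_.empty()) return;
                task = std::move(tasks_.front());
                tasks_.pop();
            }
            task();  // run outside the lock so other workers can keep going
        }
    }

    std::vector<std::thread> threads_;
    std::queue<std::function<void()>> tasks_;
    std::mutex mutex_;
    std::condition_variable cv_;
    bool done_ = false;
};

int main() {
    unsigned cores = std::thread::hardware_concurrency();
    TaskPool pool(cores ? cores : 4);
    for (int i = 0; i < 8; ++i)
        pool.submit([i] { std::printf("task %d ran\n", i); });
    // The pool's destructor waits for the queue to drain and joins the workers.
    return 0;
}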
 

Polari

Member
Durante said:
"Grand Central Dispatch" is just another case of Apple taking well established technology, slapping a fancy name on it and using it in marketing. It's task based parallelism based on a threadpool. It's what MIT did in 1994 with Cilk.
That said, of course approaches like that are useful for game development as well. For example I know some PS3 developers at least use SPUlets, which are small tasks runnable on the SPU that get scheduled automatically.

Oh yeah, I realise there are other implementations of the concept; I just wondered whether it's used in game development or not.
 

gblues

Banned
Puck said:

I'll see you and raise you!



I actually found a vector version of the ESRB logo, matched the font/rotation, kerned "FAIL" consistently with the other logos, and everything.

Yes, I have too much time on my hands...
 
gofreak said:
I'm sure Sony seriously considers all their options, but I doubt they'd make a decision for a 2013 machine based on the pecking order of these different companies today. A lot can change so their decision would have to consider many more variables than that. That said though, I agree there's nothing to prevent AMD getting into PS4.

Obviously, and if anything that's why RSX wasn't quite up to snuff. It was best in class with the software of 2004, but the architecture hasn't fared as well with the software of today; look at how ATI's corresponding part from the same time period now outperforms G70-based GPUs 2:1 in modern games, despite the two being neck and neck back in the day. It's not easy, but you've got to try and choose a design that works best with the software of 2015, and as history shows, performance in the here and now isn't necessarily the best indicator of that.

Where I do believe AMD meshes well is with their recent focus on smarter but smaller chips. They're really quite far ahead of Nvidia in terms of performance per mm2, and that's through a conscious choice to move away from the monolithic dies which Nvidia still rely upon. It's a philosophy that meshes well with a console, imo; something like Fermi is a long way off from being viable as a console GPU, for example, but AMD already have high-performance DX11 GPUs that are prime candidates for the job.

Having said all that, something like Larrabee really does seem to mesh quite well with the vision laid out by Cell, and it is something that should perform well with engines that don't rely on straight rasterisation; with the likes of Epic and id talking about going down such roads, it starts to make a lot more sense.


Stink said:
tasks that lend themselves to multiple threads, sure. That's not videogames though, surely you should realise this.

We already have developers that have managed to spread a game's workload over 8 cores with great results; a jump to 16 cores in 3 years really shouldn't be that much of a stretch, given that. It's not easy, but it's absolutely possible, and all those developers that have cut their teeth on Cell should fare very well even if Cell itself is dropped; a lot of that knowledge is going to remain relevant. The jump from 8 to 16 threads is not nearly as large as the one from 1 to 8 was.
 
Kobun Heat said:
Of course, that's not what the original article in question was saying -- it was saying that IBM is not developing a next-gen Cell, not that they were shutting down all production of the current chip.

Didn't the article linked in the OP say something about this news being a catalyst for Sony to leave the console business?
 

Geek

Ninny Prancer
Kobun Heat said:
Of course, that's not what the original article in question was saying -- it was saying that IBM is not developing a next-gen Cell, not that they were shutting down all production of the current chip.

Which is why it's probably important to either read the full post (LOL) or pull these portions as well.

As for the fate of the Cell processor technology? Well, that will live on as well, says Turek, as "the core technology of the Cell processor will continue to proliferate throughout the IBM product line."

Turek wouldn't comment on upcoming product announcements regarding the future of the Cell.
 

Zen

Banned
So that means Sony will be going with another CPU (whilst maintaining Cell compatibility, I'd suppose), or might they attempt a 'next gen' Cell on their own? Sort of a shame, and it probably throws a wrench in Sony's plans, but all companies involved had to know this was happening well in advance of any public announcement. If the Cell roadmap wasn't up to snuff then so be it, I guess.
I'd love to hear the reasoning behind the decision not to move forward with the roadmap.
 

AndyD

aka andydumi
Zen said:
So that means Sony will be going with another CPU (whilst maintaining Cell compatibility, I'd suppose), or might they attempt a 'next gen' Cell on their own? Sort of a shame, and it probably throws a wrench in Sony's plans, but all companies involved had to know this was happening well in advance of any public announcement. If the Cell roadmap wasn't up to snuff then so be it, I guess.
I'd love to hear the reasoning behind the decision not to move forward with the roadmap.

Read a few posts above yours. One development line was ended while other Cell lines are just fine and will continue as scheduled in their product lines. From IBM themselves.
 