
IBM: Cell continues as integrated part of Power roadmap; working on next consoles

DonMigs85

Member
Raistlin said:
Unless the new offering is significantly better than Cell at similar costs and power budgets, it would be stupid to not stay with the current family of processors.

There needs to be a very good reason to throw away BC, libraries, engine designs, and dev experience. A small jump in power is certainly not a good reason.
I have a feeling its general-purpose performance could be considerably better than Cell as we know it, unless Sony/IBM make significant advancements or additions.
Plus Sony also developed tons of libraries for the PS2 hardware, and those were all chucked out once the PS3 took over.
 

1-D_FTW

Member
Yes. And if you're going to believe this, then you have to think Tom's Hardware mockup is pretty close too:

new-xbox-720-console-concept_t.jpg
 

mrklaw

MrArseFace
DonMigs85 said:
I have a feeling its general-purpose performance could be considerably better than Cell as we know it, unless Sony/IBM make significant advancements or additions.
Plus Sony also developed tons of libraries for the PS2 hardware, and those were all chucked out once the PS3 took over.

Who cares about general purpose performance? You want game performance. Cell has shown it's good for both graphics-related work and physics, with enough general-purpose capability for AI etc.

Unless they have such a fantastic GPU that they don't think it'll need any support from the CPU. (I hope they learned their lesson from underspeccing the GPU in the PS3 and overcompensate with a monster GPU for the PS4.)
 

Raistlin

Post Count: 9999
DonMigs85 said:
I have a feeling its general-purpose performance could be considerably better than Cell as we know it, unless Sony/IBM make significant advancements or additions.
Plus Sony also developed tons of libraries for the PS2 hardware, and those were all chucked out once the PS3 took over.

Is general purpose computing power that much of a concern on a game console? I know they've been becoming more and more of a convergence device, but that hasn't seemed to be an issue even with the current Cell.

As to the PS2 library reference, the point is Sony's attempting to learn from their mistakes. Plus, the magnitude of dev costs/investment between PS2 and PS3 generations is quite substantial.


The reality is Cell is actually the safer move - as surprising as that may seem to some. The only way it wouldn't be is if both Nintendo and MS use what basically amounts to the same CPU. Then maybe it would make sense to drop Cell.
 

camineet

Banned
mrklaw said:
Who cares about general purpose performance? You want game performance. Cell has shown it's good for both graphics-related work and physics, with enough general-purpose capability for AI etc.

Unless they have such a fantastic GPU that they don't think it'll need any support from the CPU. (I hope they learned their lesson from underspeccing the GPU in the PS3 and overcompensate with a monster GPU for the PS4.)


I hope so too.
 

big_z

Member
I would love to see Nintendo use the same GameCube chip yet again, but solely as a dedicated physics chip. The thing has got to be cheap enough to make by now. It would take motion controls to another level, since it would allow for a new level of environmental interaction.

Of course, I'm expecting Nintendo to gut a Speak & Spell and call it a day.
 
If Sony doesn't design a system that is 99% compatible with PS3 software, then they are utterly foolish. You can bet both MS and Nintendo will have full BC.
 
Absolute Bastard said:
Next generation consoles won't come out until 2012 at the earliest....


IMO even 2012 is too early. Hardware launches are announced at E3 of the prior year; can you picture a hardware announcement at the next E3? I can see Sony and MS showing off a ton of motion-controlled games, since it'll be the first E3 with those peripherals on store shelves. Nintendo has always been last to enter a new generation and I cannot picture them launching ahead of Sony and MS; Nintendo's focus will likely be the 3DS.

2012 will be the year when new generation plans are revealed, console spec rumours will run wild, and next-gen projects will be unveiled for the systems launching in 2013. *Rubs crystal ball*


Hint Hint:

"The company has a history of releasing chips every three years, which points to a Power8 chip release for 2013."
 

RoboPlato

I'd be in the dick
mrklaw said:
Unless they have such a fantastic GPU that they don't think it'll need any support from the CPU. (I hope they learned their lesson from underspeccing the GPU in the PS3 and overcompensate with a monster GPU for the PS4.)
I doubt it'll be a monster compared to at least Microsoft's next console, but I bet it'll at least be better with alpha effects and some of the other things they cut corners on. I bet it will be more in line with whatever the next Xbox has.
 

camineet

Banned
big_z said:
I would love to see Nintendo use the same GameCube chip yet again, but solely as a dedicated physics chip. The thing has got to be cheap enough to make by now. It would take motion controls to another level, since it would allow for a new level of environmental interaction.

Flipper / Hollywood would not make a good physics chip.
 

camineet

Banned
LittleJohnny said:
IMO even 2012 is too early. Hardware launches are announced at E3 of the prior year; can you picture a hardware announcement at the next E3? I can see Sony and MS showing off a ton of motion-controlled games, since it'll be the first E3 with those peripherals on store shelves. Nintendo has always been last to enter a new generation and I cannot picture them launching ahead of Sony and MS; Nintendo's focus will likely be the 3DS.

2012 will be the year when new generation plans are revealed, console spec rumours will run wild, and next-gen projects will be unveiled for the systems launching in 2013. *Rubs crystal ball*


Hint Hint:

"The company has a history of releasing chips every three years, which points to a Power8 chip release for 2013."


Indeed, 2013 could be the year for the new generation of Xbox and PlayStation.

I do think Nintendo will strike first this time. Everything about past generations where Nintendo did not launch first is out the window. We're in a new, never-seen-before paradigm with Nintendo. They did launch the Famicom first in 1983, before the SMS or Atari 7800. So nobody can say they won't launch first next time.

I don't think a 2013 release of IBM Power8 will affect next-gen consoles. I think whatever IBM already has out now will be used as the basis for next-gen console CPUs. I just think IBM will go more parallel with its CPUs, e.g. a 16-20 core Xenon successor and a quad-core for Nintendo.
 

DonMigs85

Member
camineet said:
Indeed, 2013 could be the year for the new generation of Xbox and PlayStation.

I do think Nintendo will strike first this time. Everything about past generations where Nintendo did not launch first is out the window. We're in a new, never-seen-before paradigm with Nintendo. They did launch the Famicom first in 1983, before the SMS or Atari 7800. So nobody can say they won't launch first next time.
I agree, chances are Wii software releases will have slowed down quite a bit by 2012 so they'll need a refresh.
I do wonder though if they'll retain pretty much the same control setup, except maybe replacing the sensor bar with a camera similar to the PS Eye or something.
 
DonMigs85 said:
I agree, chances are Wii software releases will have slowed down quite a bit by 2012 so they'll need a refresh.
I do wonder though if they'll retain pretty much the same control setup, except maybe replacing the sensor bar with a camera similar to the PS Eye or something.

Well they better make all their peripherals wireless at the very least. And get rid of GC controller ports and memory slots.
 
Who's been editing the Wikipedia article on Fusion?

A heterogeneous multicore microprocessor architecture, combining a general purpose processing core(s) and basic graphics core(s) into one processor package, with different clocks for the graphics core and the central processing core.

Is that you Chittagong?
 

godhandiscen

There are millions of whiny 5-year olds on Earth, and I AM THEIR KING.

HyperionX

Member
godhandiscen said:
Ehh... I dunno. How powerful is Fusion II expected to be?

Fusion II sounds totally unimpressive. According to Wikipedia (yes, I know...), that's just a few Bobcat cores + a low-end GPU on one chip. Not much of a leap over current consoles.

Compare this to a POWER7:

power7_ars.jpg


http://arstechnica.com/business/news/2010/02/two-billion-transistor-beasts-power7-and-niagara-3.ars

That's 8 OoO cores with 32MB of eDRAM L3 cache. The L3 cache could practically double as a frame buffer for the GPU!

Get rid of most of the server stuff and shrink it to 28nm (from 45nm now). Put a dedicated GPU on the same module (see MCM), or wait till 22nm and put it all on one die. Now that will be a console worth buying.
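As a quick sanity check of the frame-buffer idea (my own back-of-the-envelope arithmetic, not from the Ars article), an uncompressed 1080p buffer is small enough that colour plus depth would fit in 32MB with room to spare:

```python
# Sketch: would POWER7's 32 MB of eDRAM L3 hold a 1080p frame buffer?
# Back-of-the-envelope only; real GPUs add MSAA, compression, extra render targets.

def framebuffer_mb(width: int, height: int, bytes_per_pixel: int) -> float:
    """Size of one uncompressed buffer, in megabytes."""
    return width * height * bytes_per_pixel / (1024 * 1024)

color_1080p = framebuffer_mb(1920, 1080, 4)  # 32-bit RGBA colour buffer
depth_1080p = framebuffer_mb(1920, 1080, 4)  # 32-bit depth/stencil buffer

print(f"colour: {color_1080p:.1f} MB, colour+depth: {color_1080p + depth_1080p:.1f} MB")
# colour: 7.9 MB, colour+depth: 15.8 MB -- well under 32 MB. The 360's 10 MB
# of eDRAM, for comparison, can't hold a 1080p colour buffer plus depth.
```

With 4x MSAA those figures roughly quadruple, which is why even 32MB wouldn't cover every configuration.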
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
HyperionX said:
That's 8 OoO cores with 32MB of eDRAM L3 cache. The L3 cache could practically double as a frame buffer for the GPU!
That would render that eDRAM useless as L3 cache for the CPU cores, and ultimately hurt CPU performance.
 

HyperionX

Member
blu said:
That would render that eDRAM useless as L3 cache for the CPU cores, and ultimately hurt CPU performance.

We're not exactly talking servers here. They can afford a few shortcomings in CPU performance.

For gaming though, having 32MB of general purpose eDRAM would be a huge benefit over current consoles.
 

szaromir

Banned
Wouldn't embedded dynamic RAM be enough for the GPU? It's much cheaper, and I don't think response time is as crucial as it is for the CPU. I understand that TMUs, shader units etc. each already have a small (static?) memory pool to work with, but I mean something like the embedded RAM in the Wii or 360.
 

camineet

Banned
Rumour - Microsoft Have Chosen AMD Fusion II for Next Xbox Hardware

By Matt Williams - Fri Dec 3, 2010 3:11pm -
Xbox 720?

Tech-site KitGuru has dissected the history of the Xbox CPU and made a pretty big prediction. Analysing the history of Microsoft's consoles and the business relationships around them, the website is claiming that the next Xbox will feature an AMD Fusion II CPU.

"Based on comments by people like Chekib Akrout, of all the likely designs to be targeted at the XBox 720 product, we think there’s a good chance that it will be the AMD Krishna product. This will be produced on Global Foundries’ 28nm ‘high-k gate first’ process."

Should the rumour be true it would potentially serve to quell two of the longest ongoing complaints targeted at the Xbox 360: Heat and Noise.

"The only real technical challenge for the first XBox 360 consoles was heat/noise, for which the AMD Krishna product could be the answer."

Be aware that all of this is only rumoured, but for all we know the next Xbox could be just around the corner. Microsoft remain tight-lipped.

http://games.on.net/article/10946/R...e_Chosen_AMD_Fusion_II_for_Next_Xbox_Hardware
 

camineet

Banned
HyperionX said:
Fusion II sounds totally unimpressive. According to Wikipedia (yes, I know...), that's just a few Bobcat cores + a low-end GPU on one chip. Not much of a leap over current consoles.

Compare this to a POWER7:

power7_ars.jpg


http://arstechnica.com/business/news/2010/02/two-billion-transistor-beasts-power7-and-niagara-3.ars

That's 8 OoO cores with 32MB of eDRAM L3 cache. The L3 cache could practically double as a frame buffer for the GPU!

Get rid of most of the server stuff and shrink it to 28nm (from 45nm now). Put a dedicated GPU on the same module (see MCM), or wait till 22nm and put it all on one die. Now that will be a console worth buying.


That would be amazing. I love Power7 :D
 

navanman

Crown Prince of Custom Firmware
Do the experts here believe the next generation of consoles will drop legacy AV ports? Will they be HDMI output only, dropping component and S-Video for good?
I presume we can all but guarantee the next MS & Sony consoles will focus even more on the 3DTV gaming and TV experience, and HDMI 1.4a ports are required for this.

The computer industry announced today that it is dropping VGA connections and LVDS for LCD monitors in favour of HDMI and DisplayPort going forward.
 

Durante

Member
HyperionX said:
Fusion II sounds totally unimpressive. According to Wikipedia (yes, I know...), that's just a few Bobcat cores + a low-end GPU on one chip. Not much of a leap over current consoles.

Compare this to a POWER7:
Have you looked at the prices of POWER7? We have one here and it's an amazing chip, but let's stay realistic here.
 
navanman said:
Do the experts here believe the next generation of consoles will drop legacy AV ports? Will they be HDMI output only, dropping component and S-Video for good?
I presume we can all but guarantee the next MS & Sony consoles will focus even more on the 3DTV gaming and TV experience, and HDMI 1.4a ports are required for this.

The computer industry announced today that it is dropping VGA connections and LVDS for LCD monitors in favour of HDMI and DisplayPort going forward.

No way. Tons of people using PS3s and 360s still have SD TVs. Sure, the number will be lower come next generation, but I don't expect any console manufacturer to willingly shrink their potential audience. I don't think the losses would offset the ones caused by having to include an A/V port. I expect HDMI cables to be bundled with all consoles, though.
 

okenny

Banned
H_Prestige said:
8GB of RAM? What on earth would a console need that much for?

Honestly, I'm not sure it would even need more than 2GB.

Speaking of RAM and PS3 BC, what are the chances that Sony uses XDR RAM again? Would PS3 BC require it?
:lol
 

McHuj

Member
Trunchisholm said:
No way. Tons of people using PS3s and 360s still have SD TVs. Sure, the number will be lower come next generation, but I don't expect any console manufacturer to willingly shrink their potential audience. I don't think the losses would offset the ones caused by having to include an A/V port. I expect HDMI cables to be bundled with all consoles, though.

They can just sell an adapter at a very high margin. You can pretty much adapt HDMI to anything.

I don't know for sure how much adding the hardware for other displays costs, but even if it's as little as $0.25, that's a good chunk of change for such a high-volume product, and the possibility of reducing the cost should be explored.
 
navanman said:
Do the experts here believe the next generation of consoles will drop legacy AV ports? Will they be HDMI output only, dropping component and S-Video for good?

Definitely not next generation. Probably the generation after that. As a point of comparison, this is the first generation where you can't buy official RF Adapters for the current consoles. :lol

McHuj said:
I don't know for sure how much adding the hardware for other displays costs, but even if it's as little as $0.25, that's a good chunk of change for such a high-volume product, and the possibility of reducing the cost should be explored.

Here, I'm about to explore it.

Will everyone who wants to buy our systems have a TV that accepts HDMI in 2013? Oh wait no they won't.

Alright. Option explored!
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
charlequin said:
Definitely not next generation. Probably the generation after that. As a point of comparison, this is the first generation where you can't buy official RF Adapters for the current consoles. :lol



Here, I'm about to explore it.

Will everyone who wants to buy our systems have a TV that accepts HDMI in 2013? Oh wait no they won't.

Alright. Option explored!
Particularly when component does the job more than adequately when it comes to plain picture output.
 

McHuj

Member
charlequin said:
Definitely not next generation. Probably the generation after that. As a point of comparison, this is the first generation where you can't buy official RF Adapters for the current consoles. :lol



Here, I'm about to explore it.

Will everyone who wants to buy our systems have a TV that accepts HDMI in 2013? Oh wait no they won't.

Alright. Option explored!

That's not the option I was talking about exploring. I meant the cost analysis of reducing manufacturing costs by only including one display connection.

And if not everyone has HDMI? So what? Put a sticker on the retail box that says: if your TV doesn't have an HDMI connector, buy this MS/Sony-branded adapter for $19.99 (or whatever ridiculous markup it is).
 

Oni Jazar

Member
McHuj said:
They can just sell an adapter at a very high margin. You can pretty much adapt HDMI to anything.

Actually, you can't. HDMI has a handshaking protocol that can't be cheaply adapted to other signals.
 
McHuj said:
That's not the option I was talking about exploring. I meant the cost analysis of reducing manufacturing costs by only including one display connection.

I am pointing out to you that dropping pre-HDMI connections is such a foolish idea that it will not take more than five seconds to realize that the lost sales will outstrip the 10 cents or whatever you'll save per-unit.
 

McHuj

Member
Oni Jazar said:
Actually, you can't. HDMI has a handshaking protocol that can't be cheaply adapted to other signals.

My only argument is one of economics for a company like Sony or MS. If you can lower the price of your base console and shift that cost to "optional" accessories, it may be economically beneficial to you.

If I can lower my BOM by even 10-20 cents, and I end up selling 50 million units over the console's lifetime, that saves me 5-10 million dollars over the life cycle.
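That arithmetic, spelled out (the 10-20 cent BOM reduction and 50 million unit figures are the assumptions above, not real Sony/MS data):

```python
# Per-unit BOM savings scaled over a console's lifetime shipments.
def lifetime_savings(bom_reduction_usd: float, units_sold: int) -> float:
    """Total saved over the product's life for a given per-unit BOM cut."""
    return bom_reduction_usd * units_sold

units = 50_000_000
for cents in (10, 20):
    saved = lifetime_savings(cents / 100, units)
    print(f"{cents} cents/unit x {units:,} units = ${saved:,.0f}")
# 10 cents/unit x 50,000,000 units = $5,000,000
# 20 cents/unit x 50,000,000 units = $10,000,000
```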

You can get HDMI-to-component converters for $50-100. No, that's not cheap, but that's also a low-volume product. MS/Sony could easily drive down those costs and make a profit selling them.

MS has proven that you can make a lot of money off accessories like wireless adapters and hard drives (what many consider essential).

A lot of decisions aren't made based on which is the better technical solution, but on which solution makes the most money for the business.


charlequin said:
I am pointing out to you that dropping pre-HDMI connections is such a foolish idea that it will not take more than five seconds to realize that the lost sales will outstrip the 10 cents or whatever you'll save per-unit.

And I don't believe there would be lost sales in 2013/2014 when the next gen comes, especially if buyers are given a solution to the problem (albeit not included in the box).
 

Ashes

Banned
When HDMI crosses even 75% of the market, then we can begin to talk about stuff like this as an outside shot. Console makers do look forward, so it stands to reason that at some point in the future HDMI adoption will be 90%+.
 
What exactly is the benefit of dropping the old AV connector? You don't have to use it, the system and games are already optimized for HD output, and the five cents Sony saves isn't going to be passed on to you anyway. Seems like a dumb thing to put on a PS4 wish list.
 
McHuj said:
And I don't believe there would be lost sales.

Let's say the PS4 is sold at a $50 loss at launch and the average purchaser buys thirteen games/accessories over the system's life, at $10 net revenue to Sony for each.

That means that each lost customer would be worth $80. That means that if you save $5,000,000 in total by removing the part, you only have to lose 62,500 out of your 50 million total customers (one-eighth of one percent) to wipe out that savings.
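Spelling that out (the $50 hardware loss, 13-item attach rate, and $10 net-per-item are the assumed figures above, not real data):

```python
# Break-even: how many lost customers wipe out a $5M BOM saving?
hardware_loss = 50   # assumed $ subsidy per console at launch
attach_rate = 13     # assumed games/accessories bought per customer
net_per_item = 10    # assumed $ net revenue to the platform holder per item

value_per_customer = -hardware_loss + attach_rate * net_per_item  # $80 each
bom_savings = 5_000_000
break_even_customers = bom_savings / value_per_customer   # customers to lose
share_of_base = break_even_customers / 50_000_000         # fraction of 50M base

print(value_per_customer, int(break_even_customers), f"{share_of_base:.3%}")
# 80 62500 0.125%
```

So losing just one-eighth of one percent of the installed base erases the entire savings.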

HDTV penetration is high and will continue to get higher, but it's not over 99.875% yet. :lol More importantly, you can't look at whether people own one HDTV to determine when this is viable; you have to look at whether all the TVs people will want to hook consoles up to (that is, dorm room TVs, secondary TVs in kids' rooms or bedrooms, the older LCD that got moved down to the den, etc.) are HDTVs with HDMI. That number's going to trend quite a bit behind the overall HDTV penetration figure until there have been a good solid four or five years of bargain-priced small HDMI TVs being sold at BJs, Wal-Mart, Target etc. -- again, should be fine for PS5 but not a console launching in 2012 or even 2013.
 
H_Prestige said:
What exactly is the benefit of dropping the old AV connector? You don't have to use it, the system and games are already optimized for HD output, and the five cents Sony saves isn't going to be passed on to you anyway. Seems like a dumb thing to put on a PS4 wish list.

Keep the connector, lose the cable.
If they threw in a cheap HDMI cable as standard, they could make a packet by selling overpriced proprietary old AV cables to the people who still need one.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
charlequin said:
I am pointing out to you that dropping pre-HDMI connections is such a foolish idea that it will not take more than five seconds to realize that the lost sales will outstrip the 10 cents or whatever you'll save per-unit.
I sometimes wonder what the chances are that manufacturers re-invent a GameCube-style digital AV port solution - a simple data bus output on the console port, while the actual video signal transmitter sits in the cable. That would allow for an 'anti-Nintendo circa '04' scenario where the console replaces its multiple AV ports with one proprietary port.
 

StevieP

Banned
blu said:
I sometimes wonder what the chances are that manufacturers re-invent a GameCube-style digital AV port solution - a simple data bus output on the console port, while the actual video signal transmitter sits in the cable. That would allow for an 'anti-Nintendo circa '04' scenario where the console replaces its multiple AV ports with one proprietary port.

HDMI spec would probably never allow something like this. DisplayPort is an idea, though.

You know what would be better on future consoles? CPUs that weren't so damn crippled compared to even the low-end of the Intel offerings or the "real" PowerPC lineage. You know, out-of-order execution. Memory controllers. The whole deal.
 
Graphics Horse said:
Keep the connector, lose the cable.
If they threw in a cheap HDMI cable as standard, they could make a packet by selling overpriced proprietary old AV cables to the people who still need one.

I like how this is just the flipside of the current situation where they pack in composite cables so the stores can sell people overpriced HDMI cables. :lol
 
charlequin said:
I like how this is just the flipside of the current situation where they pack in composite cables so the stores can sell people overpriced HDMI cables. :lol

The difference is they'd have a monopoly on cable sales thanks to the proprietary connector. It's genius :D
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
StevieP said:
HDMI spec would probably never allow something like this. DisplayPort is an idea, though.
Well, one could argue that with the majority of current embedded designs, where the HDMI transmitter is rarely on-chip and some generic 'display interface' bus feeds the transmitter, things are not so different DRM-wise. Of course, such a bus does not conveniently exit to the back of the PCB like the digital AV port did, so you could argue it'd be a bit more difficult to tap that bus for digital content theft purposes, but still..

You know what would be better on future consoles? CPUs that weren't so damn crippled compared to even the low-end of the Intel offerings or the "real" PowerPC lineage. You know, out-of-order execution. Memory controllers. The whole deal.
I think that with the advent of CPUs like the Cortex-A9, embedded devices will increasingly see CPU designs with a full package of 'convenience' features. That said, consoles often tackle a given computational/bandwidth problem smartly, rather than brute-forcing it like desktop/server chips. You know, we're still far from the moment when large quantities of transistors will be free ; )
 