
PS4 Rumors: APU code-named 'Liverpool', Radeon HD 7970 GPU, Steamroller CPU, 16GB Flash


THE:MILKMAN

Member
All I'm suggesting is that they can use a modified version of existing technology. For example, last year the power consumption of a 6970 was around 200W, but the 7870 is about 115W. By the end of this year or early next year, an 8850/8870 with power consumption in the low 100s is a possibility.

I don't believe either MS or Sony will/can go with a GPU sucking up low 100s of watts.

I believe the launch PS3's RSX was around 70-80W. That will be the upper limit for the PS4's GPU, IMO.
 
A Cell 2.0 won't exist because of IBM, though it is possible these designs could be used elsewhere.

According to Jeff_rigby, it sounds like a 1PPU/4SPU design might be seen alongside the AMD x86 Fusion design in the PS4.

I wonder what it would be used for? Just BC? Maybe audio and video decoding as well? Are the PPU and SPUs in it improved over what is in the Cell? I would hope they would have larger caches/memory.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
According to Jeff_rigby, it sounds like a 1PPU/4SPU design might be seen alongside the AMD x86 Fusion design in the PS4.

I wonder what it would be used for? Just BC? Maybe audio and video decoding as well? Are the PPU and SPUs in it improved over what is in the Cell? I would hope they would have larger caches/memory.

Uhh... according to my magic 8-ball it will have a hamster and a wheel.
 
He isn't wrong. Cell, namely the SPEs, is still a beast when it comes to 3D maths, and probably still one of the best when it comes to decoding video. As you said, in regards to FP, Cell smokes any consumer or even enthusiast-level CPU. A simple Linpack run puts the PS3's Cell at about 73 GFLOPS, compared to something like 40 GFLOPS for an i7.

So yeah, like he said, some conventional CPUs can't keep pace with Cell at certain things.
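For context, a rough back-of-the-envelope from the usual published figures (my own arithmetic, so treat it as ballpark): each SPE can issue a 4-wide single-precision fused multiply-add every cycle, so

8 SPEs x 4 floats x 2 ops (FMA) x 3.2GHz = 204.8 GFLOPS peak SP

meaning a 73 GFLOPS Linpack score is only around a third of peak - and that's single precision; Cell's double-precision peak is far lower.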

That's just not true.

My early-2007 C2Q Yorkfield @ 3.6GHz does 47 GFLOPS.
An i5 2500K @ 5GHz does 70.13 GFLOPS.
A six-core i7 980 @ 5GHz does 106.65 GFLOPS.

Real-world performance, not the peak theoretical numbers quoted for CELL.

That is 8 SPUs in Cell vs. 4 FPUs in a C2Q/i5, or 6 in a six-core i7.

CELL is an outdated CPU, not by today's standards but because of the unified-shader releases from ATI and NVIDIA. There is no need for strong FPUs in the CPU anymore.

You can argue about what CELL can do to help that weak RSX, but that doesn't make CELL any better. Not performance-wise, not efficiency-wise, not dollar/watt/performance-wise either. It's not about what CELL can do, but what programmers HAVE to do because of the PS3's system flaws.

CELL was a great idea in 2001, under the PlayStation 2 paradigm, but its research pipeline was too long and it was already outdated as a concept by release time.

The CELL fanbase has to move on. Any modern CPU+GPU combo will be better than CELL at any given die budget, just as any CPU+GPU was better in 2006. That's why the cheaper Xbox 360 outperforms the PS3 with a lower transistor count, lower TDP and a lower price: wrong architecture.

That said, any Steamroller-like CPU for a console would be another fail: an overkill number of unneeded threads with low IPC, a high transistor count and a high TDP.

It's difficult to take advantage of that many threads in gaming, as PS360-to-PC ports prove, maxing out at 2 cores of usage.

A low-core-count IPC monster like a dual-core i3 plus a stronger GPU would be better. But I think Intel will not license their CPUs to any third party. Or not cheaply, at least.

In consoles it's all about die budget. The more efficient system wins.

You miss the point by talking about a specific high-end PC GPU model. You will not have a 7970 inside your machine. You COULD have a Southern Islands-derived GPU because of its better efficiency at lower bandwidth and cluster counts compared to Nvidia. That's why it's meaningless to talk about cores. RSX has as many vertex/pixel shaders as a 7800 GTX, but it has HALF the bandwidth. So we are not talking about a 7800 GTX anymore, but a customized, dumbed-down G70 architecture.

Stop the hype about 375W high-end GPUs. Stop that stale CELL hype. CELL can't deliver, end of story. Uncharted 2 looks good because of 25GB of streamable art assets and scripting. Not because of CELL, but despite CELL.

CELL, KEEP OUT of the PS4. Sony, have a look at the PSOne. That's the way to go.

TL;DR:

CELL: BAD
STEAMROLLER: BAD
IPC: GOOD
DIE Size/Performance: GOOD
Sonic fanart: BAD
My english: WORSE
 

RoboPlato

I'd be in the dick
Quick question: it's pretty likely that the system will have an updated shader model similar to the DX11.1 spec that's starting this year, right?
 

StevieP

Banned
Quick question: it's pretty likely that the system will have an updated shader model similar to the DX11.1 spec that's starting this year, right?

Yes. GCN supports 11.1, so the PS4 will support that same shader model. Obviously they will not be using DX, but that's a given.
 

RoboPlato

I'd be in the dick
Yes. GCN supports 11.1, so the PS4 will support that same shader model. Obviously they will not be using DX, but that's a given.

Cool, that's what I figured. I knew they wouldn't be using DX but I wasn't sure how to word the question and get my point across.
 

Lord Error

Insane For Sony
That's just not true.

My early-2007 C2Q Yorkfield @ 3.6GHz does 47 GFLOPS.
An i5 2500K @ 5GHz does 70.13 GFLOPS.
A six-core i7 980 @ 5GHz does 106.65 GFLOPS.

Real-world performance, not the peak theoretical numbers quoted for CELL.
He was talking Linpack, where the fastest i7 scores 109 GFLOPS @ 5GHz.

If Cell on Linpack really puts out 73 GFLOPS @ 3GHz, that's not theoretical vs. real-life performance. That's Linpack vs. Linpack, and it would actually be a win for Cell per clock. That aside, Cell was well suited to the PS3 because it could effortlessly handle the video decoding needed for BR playback.
*edit* Also, your argument for why games look good is so ridiculous that I can't even bring myself to address it - and it makes me feel a bit bad that I tried to reply with anything serious in the first place.
 

Rolf NB

Member
He was talking Linpack, where the fastest i7 scores 109 GFLOPS @ 5GHz.

If Cell on Linpack really puts out 73 GFLOPS @ 3GHz, that's not theoretical vs. real-life performance. That's Linpack vs. Linpack, and it would actually be a win for Cell per clock. That aside, Cell was well suited to the PS3 because it could effortlessly handle the video decoding needed for BR playback.
*edit* Also, your argument for why games look good is so ridiculous that I can't even bring myself to address it.
Also Cell did that in 2006 on 90nm. It took Intel 4 whole process nodes to pull even.
 
That's just not true.

My early-2007 C2Q Yorkfield @ 3.6GHz does 47 GFLOPS.
An i5 2500K @ 5GHz does 70.13 GFLOPS.
A six-core i7 980 @ 5GHz does 106.65 GFLOPS.

Real-world performance, not the peak theoretical numbers quoted for CELL.

That is 8 SPUs in Cell vs. 4 FPUs in a C2Q/i5, or 6 in a six-core i7.

CELL is an outdated CPU, not by today's standards but because of the unified-shader releases from ATI and NVIDIA. There is no need for strong FPUs in the CPU anymore.

You can argue about what CELL can do to help that weak RSX, but that doesn't make CELL any better. Not performance-wise, not efficiency-wise, not dollar/watt/performance-wise either. It's not about what CELL can do, but what programmers HAVE to do because of the PS3's system flaws.

CELL was a great idea in 2001, under the PlayStation 2 paradigm, but its research pipeline was too long and it was already outdated as a concept by release time.

The CELL fanbase has to move on. Any modern CPU+GPU combo will be better than CELL at any given die budget, just as any CPU+GPU was better in 2006. That's why the cheaper Xbox 360 outperforms the PS3 with a lower transistor count, lower TDP and a lower price: wrong architecture.

That said, any Steamroller-like CPU for a console would be another fail: an overkill number of unneeded threads with low IPC, a high transistor count and a high TDP.

It's difficult to take advantage of that many threads in gaming, as PS360-to-PC ports prove, maxing out at 2 cores of usage.

A low-core-count IPC monster like a dual-core i3 plus a stronger GPU would be better. But I think Intel will not license their CPUs to any third party. Or not cheaply, at least.

In consoles it's all about die budget. The more efficient system wins.

You miss the point by talking about a specific high-end PC GPU model. You will not have a 7970 inside your machine. You COULD have a Southern Islands-derived GPU because of its better efficiency at lower bandwidth and cluster counts compared to Nvidia. That's why it's meaningless to talk about cores. RSX has as many vertex/pixel shaders as a 7800 GTX, but it has HALF the bandwidth. So we are not talking about a 7800 GTX anymore, but a customized, dumbed-down G70 architecture.

Stop the hype about 375W high-end GPUs. Stop that stale CELL hype. CELL can't deliver, end of story. Uncharted 2 looks good because of 25GB of streamable art assets and scripting. Not because of CELL, but despite CELL.

CELL, KEEP OUT of the PS4. Sony, have a look at the PSOne. That's the way to go.

TL;DR:

CELL: BAD
STEAMROLLER: BAD
IPC: GOOD
DIE Size/Performance: GOOD
Sonic fanart: BAD
My english: WORSE

So what the hell should Sony use in their PS4s then?
 

hodgy100

Member
The CELL fanbase has to move on. Any modern CPU+GPU combo will be better than CELL at any given die budget, just as any CPU+GPU was better in 2006. That's why the cheaper Xbox 360 outperforms the PS3 with a lower transistor count, lower TDP and a lower price: wrong architecture.

Wow. The Cell is not powerful compared to current CPUs, but it still packs a punch; there are a lot of clever graphical tricks you can pull off on the Cell. That's why Sony originally considered no GPU but two Cells for the PS3. This effectively gives the PS3 a sub-GPU which, when used correctly, can absolutely smoke the 360 performance-wise.

That said, any Steamroller-like CPU for a console would be another fail: an overkill number of unneeded threads with low IPC, a high transistor count and a high TDP.

It's difficult to take advantage of that many threads in gaming, as PS360-to-PC ports prove, maxing out at 2 cores of usage.

Core usage on PCs does not apply to consoles. Until recently, multi-threading on PCs was done exactly that way, in threads. On the PS3 they would have been using the "job" method of using multiple cores, which is much more efficient.
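To make the "job" idea concrete, here's a minimal sketch in plain C with pthreads (nothing PS3-specific, and the names are made up for illustration): instead of one long-lived thread per subsystem, you chop the frame's work into small self-contained jobs and let a pool of workers drain a shared queue.

Code:

/* Minimal job-model sketch (illustrative only): a pool of workers
   pulls small self-contained jobs from one shared queue. */
#include <pthread.h>

#define NJOBS    100
#define NWORKERS 4

static int next_job = 0;
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void run_job(int id) {            /* stand-in for real work */
    volatile int x = id * id; (void)x;
}

static void *worker(void *arg) {
    (void)arg;
    for (;;) {
        int id;
        pthread_mutex_lock(&lock);
        id = (next_job < NJOBS) ? next_job++ : -1;
        pthread_mutex_unlock(&lock);
        if (id < 0) return NULL;         /* queue drained, worker exits */
        run_job(id);                     /* work happens outside the lock */
    }
}

int main(void) {
    pthread_t t[NWORKERS];
    for (int i = 0; i < NWORKERS; i++) pthread_create(&t[i], NULL, worker, NULL);
    for (int i = 0; i < NWORKERS; i++) pthread_join(t[i], NULL);
    return 0;
}

The point is that jobs, unlike dedicated threads, scale to however many cores (or SPUs) you happen to have.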

Stop the hype about 375W high-end GPUs. Stop that stale CELL hype. CELL can't deliver, end of story. Uncharted 2 looks good because of 25GB of streamable art assets and scripting. Not because of CELL, but despite CELL.

Uncharted 2/3 is a technical marvel; your opinion is invalid.
 

AB12

Member
I don't believe either MS or Sony will/can go with a GPU sucking up low 100s of watts.

I believe the launch PS3's RSX was around 70-80W. That will be the upper limit for the PS4's GPU, IMO.

The 7870's current peak power consumption is 115W; by next year it could be lowered with a console design.
 

ekim

Member
German gaming sites are fucking stupid. One guy just discovered the Xbox 720 PowerPoint from 2010 and wrote that the presentation took place one week ago. Now they all simply copy the text. And yes: every notable site is involved.
 

THE:MILKMAN

Member
The 7870's current peak power consumption is 115W; by next year it could be lowered with a console design.

Sure could be. My point is that the GPU can be made from magic fairy dust and contain 12 virgins, but it won't exceed 80W TDP on its own.

Unless/until Sony or a legit Gaf insider confirms otherwise, this is what I believe.
 
Which he said was up to art design, asset variety, and the overall quality of Naughty Dog's work.

Your post affirms his stance, does nothing to harm it.

Yeah, art has something to do with it, but so do specs.
Saying that UC2 looks good because of 25GB of streamable art assets and scripts without talking about hardware is stupid.
If anything, the PS3 only gets games that good because of Cell, since it had to help out RSX a whole bunch because it sucks.

The PS4 looks to be the opposite of the PS3 hardware-wise, from the leaked stuff we know.
 
He was talking Linpack, where the fastest i7 scores 109 GFLOPS @ 5GHz.

If Cell on Linpack really puts out 73 GFLOPS @ 3GHz, that's not theoretical vs. real-life performance. That's Linpack vs. Linpack, and it would actually be a win for Cell per clock. That aside, Cell was well suited to the PS3 because it could effortlessly handle the video decoding needed for BR playback.

Take a look at the dies:

Cell:

[image: Cell die shot]


Nehalem:

[image: Nehalem die shot]


The SPEs fill most of the total die. Linpack is a measure of raw floating-point horsepower, where Cell excels. And even then, it just fails to deliver in actual software scenarios:

[image: Folding@Home performance chart]


Cell @ 3.2GHz, Q6600 @ 2.4GHz.

That's due to Cell's poor performance at double-precision floating-point. The FPU is just a tiny part of an x86 processor, and even then it does better than Cell.
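(For reference, the commonly published figures, so take them as ballpark: standard Linpack/HPL runs in double precision, and the original Cell's SPEs peak at roughly 14.6 GFLOPS DP versus about 205 GFLOPS SP, which is exactly why its DP results look so poor. The later PowerXCell 8i variant largely fixed the DP side.)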

The Blu-ray argument is also pretty nonsensical. One Cell for BR decoding is just overkill; it could handle a lot of streams simultaneously. You can't feed that much to a retail PS3 with a low-profile HDD and a single BR drive behind a poor-performance southbridge.


*edit* Also, your argument for why games look good is so ridiculous that I can't even bring myself to address it - and it makes me feel a bit bad that I tried to reply with anything serious in the first place.

Games look good because of insane budgets. That's why the PS3 is unprofitable. Bad architecture design. High costs at the hardware level. High costs at the software level.

Not only that, there are no modern high-budget engines for the Xbox 360. Sony invested a ton researching custom engines for the PS3. Most high-budget games for the 360 use multiplatform engines. And most of the better-looking games on PS3 are scripted as hell.

Uncharted 2 looking marvelous proves Naughty Dog is a top studio. Nothing more.

Also Cell did that in 2006 on 90nm. It took Intel 4 whole process nodes to pull even.

The Q6600 is from those dates too. To say Cell can outperform a C2Q due to its high raw single-precision floating-point numbers is like saying the PlayStation 2 has a better GPU than the PS3 because of the Graphics Synthesizer's bandwidth.

It was silly to compare Cell with a C2Q back then; it's just insane to do it now with Sandy Bridge out there.

I don't know if Sony launched the PS3 with Cell because of too much investment in it or just Japanese pride, but it was a very wrong move. Everyone can see that now. It's way beyond my understanding why there are people who defend Cell nowadays.
 

Just Lazy

Banned
If these rumoured specs are right, could the PS4 theoretically run a game like PlanetSide 2 easily? Or would that be towards the upper limit of what's achievable?
 

SiteSeer

Member
So what the hell should Sony use in their PS4s then?

Use off-the-shelf components, ramp up scale of production to reduce costs, transfer the hardware R&D budget to software development and especially PSN/SEN support. Beat MS's Live. I could live with that (even with no backwards compatibility).
 
Take a look at the dies:

Games look good because of insane budgets. That's why the PS3 is unprofitable. Bad architecture design. High costs at the hardware level. High costs at the software level.

Not only that, there are no modern high-budget engines for the Xbox 360. Sony invested a ton researching custom engines for the PS3. Most high-budget games for the 360 use multiplatform engines. And most of the better-looking games on PS3 are scripted as hell.

Uncharted 2 looking marvelous proves Naughty Dog is a top studio. Nothing more.

You really have no idea what you are talking about, do you?
Also, saying that there are no high-budget engines for the Xbox 360 is such a joke. You think that just because Unreal Engine 3 is multiplatform it is not modern or high-end.
The amount of work that gets put into Unreal Engine 3 is crazy, even compared to Sony's custom engines.

Use off-the-shelf components, ramp up scale of production to reduce costs, transfer the hardware R&D budget to software development and especially PSN/SEN support. Beat MS's Live. I could live with that (even with no backwards compatibility).

They're going to use off-the-shelf components, but they're still going to have to customize the parts.
 

Lord Error

Insane For Sony
The Blu-ray argument is also pretty nonsensical. One Cell for BR decoding is just overkill; it could handle a lot of streams simultaneously.
This is false, and for all that needs to be decoded on a high-bitrate BR with a 3D movie encode, the PS3 as-is is hardly enough. I know that with high-bitrate 3D movie encodes, some corners needed to be cut with audio decoding on PS3.

The Q6600 is from those dates too. To say Cell can outperform a C2Q due to its high raw single-precision floating-point numbers is like saying the PlayStation 2 has a better GPU than the PS3 because of the Graphics Synthesizer's bandwidth.
Not really true, as single precision FP is hugely useful for practically anything outside of scientific purposes (such as F@H). The fact that C2D is better at double precision FP is what's mostly meaningless in a game development or video decoding scenario, not the other way around.
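To make the SP vs. DP point concrete, here's a minimal SSE sketch (nothing Cell-specific; just an illustration of vector width): a 128-bit register holds 4 floats but only 2 doubles, so single-precision vector throughput starts at double the rate.

Code:

#include <xmmintrin.h>  /* SSE:  4 floats per 128-bit register */
#include <emmintrin.h>  /* SSE2: 2 doubles per 128-bit register */

/* Four float additions in one instruction. */
void add4f(const float *a, const float *b, float *out) {
    _mm_storeu_ps(out, _mm_add_ps(_mm_loadu_ps(a), _mm_loadu_ps(b)));
}

/* Only two double additions in one instruction. */
void add2d(const double *a, const double *b, double *out) {
    _mm_storeu_pd(out, _mm_add_pd(_mm_loadu_pd(a), _mm_loadu_pd(b)));
}

Game and video code lives almost entirely in the top function's world, which is why DP strength buys you little there.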

Again, not touching your game arguments, because they're too ridiculous to bother responding to.
 

patsu

Member

You're comparing apples and oranges.

e.g., In the Folding@Home example, the GeForce 280GTX GPU alone required a 550W power supply *at least*. At launch, the entire PS3 (including RSX, HDD, Blu-ray drive and fan) used 205W max.

Those performance numbers will need to be normalized. Cell is power efficient. If you throw an equivalent amount of power at Cell, you should be able to scale the performance of a Cell optimized application up almost linearly. This may not be true for other architectures.

For a traditional GPU, the scaling is also usually linear only because the problems are embarrassingly parallel. But you need to add the CPU overhead.



The DMA memory model helps Cell to hide memory latency. If the app is optimized for Cell, it is like running the entire data and program in L1 cache. The GPU also uses similar techniques hiding a far longer pipeline (Hence higher FLOP count). However it depends on how the CPU is integrated with the GPU, and how memory is arranged. Cell is just one compact CPU doing both CPU and GPU-like jobs. For a "traditional GPU", you will always need a CPU, hence upping the power consumption even higher.
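For anyone curious what "hiding memory latency with DMA" looks like in practice, here's a rough double-buffering sketch using the Cell SDK's spu_mfcio.h intrinsics (simplified, and process() stands in for the real compute kernel): while the SPU crunches one buffer, the MFC streams the next one in behind it.

Code:

#include <spu_mfcio.h>

#define CHUNK 4096   /* floats per buffer: 16KB, one max-size DMA transfer */
static volatile float buf[2][CHUNK] __attribute__((aligned(128)));

void process(float *p, int n);   /* the actual compute kernel, defined elsewhere */

void stream(unsigned long long ea, int nchunks) {
    int cur = 0;
    mfc_get(buf[cur], ea, sizeof(buf[cur]), cur, 0, 0);      /* prime buffer 0 */
    for (int i = 0; i < nchunks; i++) {
        int nxt = cur ^ 1;
        if (i + 1 < nchunks)                                 /* kick off next transfer */
            mfc_get(buf[nxt], ea + (unsigned long long)(i + 1) * sizeof(buf[0]),
                    sizeof(buf[0]), nxt, 0, 0);
        mfc_write_tag_mask(1 << cur);                        /* wait only for cur... */
        mfc_read_tag_status_all();
        process((float *)buf[cur], CHUNK);                   /* ...so compute overlaps DMA */
        cur = nxt;
    }
}

Done right, the SPU never stalls on main memory, which is what people mean by "it's like running everything out of L1".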



At the end of the day, it depends on your apps. Given the same power consumption, there are apps where Cell can beat a GPU easily and vice versa.

In Folding@Home, the Intel CPU is the most general. It can be used to solve all the Folding@Home jobs albeit slower. The GPU is the most specialized. It is only used to solve the highly parallelized problems even though its performance is the highest. The Cell is somewhere in-between. It can accelerate jobs that a traditional GPU or CPU can't handle for different reasons.


High performance computing uses GPGPU these days because (1) It costs a lot of money designing supercomputer CPU. Most organizations don't want to spend that kind of money anymore. So most adopt the GPGPU model since there is already a market to sustain high performance graphics processors. (2) In addition, these applications run in a cold room drawing huge (but efficient) power. At this high power range, the GPGPU architecture is a natural fit, and can outperform Cell because of its long pipeline and many cores.

In a home console, we may not have such luxury, especially not in 2005.

For consoles, it is not impossible to see a vector engine attached to the CPU, to accompany the GPU (e.g., See Vita, Xbox 360 and PS3). The important thing is how they work together.

If someone were to design a modern Cell today, ease of programming would probably be top of the list. So it may end up looking more like other architectures. Similarly, if someone were to design a modern CPU today, they may also steal ideas from Cell (as IBM has said). I remember when Cell was first introduced, Carmack and others were bitching about heterogeneous CPU cores. Looking at today's world, heterogeneous cores may be here to stay, especially if they want to use the CPU and GPU together with less overhead, and cheaply.

EDIT:
The segregated memory (Local Store) in Cell is also handy for running a completely separate security kernel. It helps to keep the PS3 locked down. A regular CPU and GPU architecture can't do that (their memory is often shared and globally addressable).
 

coldfoot

Banned
The Q6600 is from those dates too. To say Cell can outperform a C2Q due to its high raw single-precision floating-point numbers is like saying the PlayStation 2 has a better GPU than the PS3 because of the Graphics Synthesizer's bandwidth.

It was silly to compare Cell with a C2Q back then; it's just insane to do it now with Sandy Bridge out there.

I don't know if Sony launched the PS3 with Cell because of too much investment in it or just Japanese pride, but it was a very wrong move. Everyone can see that now. It's way beyond my understanding why there are people who defend Cell nowadays.

You realize you're comparing a tiny chip with ones that are 2x-10x larger, more power-hungry, and more expensive, right?

Cell is only good at certain tasks, but its advantage at those tasks is incredible over any x86 chip with similar area and power consumption. Since this high performance is due to the SPEs, it might be meaningful to keep them for the PS4 and pair them with a general-purpose CPU.
 
Wow. The Cell is not powerful compared to current CPUs, but it still packs a punch; there are a lot of clever graphical tricks you can pull off on the Cell. That's why Sony originally considered no GPU but two Cells for the PS3. This effectively gives the PS3 a sub-GPU which, when used correctly, can absolutely smoke the 360 performance-wise.

Saying Sony planned to use two Cells instead of a GPU is like saying Saddam thought of using PS2s to launch ballistic missiles: nonsense. Just a viral story or a myth.

You CAN'T use a second Cell as a GPU without in-depth logic changes. How would it draw fast enough without ROPs? How would it handle textures without TMUs? It has no hardware offloads for graphics capabilities. How could it act as a GPU, then?

The PS3 will never be able to 'smoke the 360 performance-wise', even in your wildest dreams. Xenos is way ahead of RSX tech-wise. Xenon is stronger as a CPU than Cell, like 3 to 4 times.

For every 'trick' Cell can do to help RSX, a programmer just has to ask Xenos to do it, and it will do it better and more easily. There are no advantages to Cell, not a single one. You have to understand this.

And once you have done that, you have to think about what you can do with the current hardware. That's what Naughty Dog or DICE did: figure out how to get something out of messy hardware.

It's not about what Cell can do, but what we have to do with Cell because of all of this. And once the PS3 reaches its end of life, move on, because Cell is a dead end.



Core usage on PCs does not apply to consoles. Until recently, multi-threading on PCs was done exactly that way, in threads. On the PS3 they would have been using the "job" method of using multiple cores, which is much more efficient.

SMT IS the way to go in multithreading. A shared cache pool, not isolated processors with private caches doing their jobs and releasing them afterwards. Not DMA freezing the bus. Not high latencies and the need to over-profile your code before feeding it to the CPU. At this point, I have my doubts about your working knowledge.

Everyone in the industry is moving in one direction, not only Intel and AMD, but Nvidia with its GPGPU. Just have a look at the Fermi or Kepler architectures, or even Southern Islands. Look at how they handle multithreading.

And finally, look at IBM's POWER7 symmetric multiprocessor architecture.

Cell was a great idea in 2001, but someone did better and everyone followed.

Uncharted 2/3 is a technical marvel; your opinion is invalid.

It is indeed, but I'm not talking about opinions. This is a fact.

So what the hell should Sony use in their PS4s then?

Take a modern GPU, get rid of any unnecessary GPGPU capabilities to reduce transistor count, as Nvidia did with the GTX 680, but go even further. Focus on dedicated hardware offloads for gaming scenarios. Magically you have a high-performance but cheap chip to power a gaming device.

That's what Sony and MS will do. My bet, at least.

Also, saying that there are no high-budget engines for the Xbox 360 is such a joke. You think that just because Unreal Engine 3 is multiplatform it is not modern or high-end.
The amount of work that gets put into Unreal Engine 3 is crazy, even compared to Sony's custom engines.

Unreal Engine is made to work on a wide array of machines. It's not a to-the-metal engine designed to push every single peculiarity of the Xbox 360 hardware. Naughty Dog codes directly to the metal; the guys at Epic just ask a layer to draw a marine. That is a HUGE difference performance-wise. There is not a single game in the entire 360 catalogue that doesn't use DirectX.

This is false, and for all that needs to be decoded on a high-bitrate BR with a 3D movie encode, the PS3 as-is is hardly enough. I know that with high-bitrate 3D movie encodes, some corners needed to be cut with audio decoding on PS3.

http://www.youtube.com/watch?v=TlkEU_l02qg

It's one thing to stream 8 HD signals, and another to decode 8 .mkv files encoded with x264. There is no such offload engine in Cell to do that.

Not really true, as single precision FP is hugely useful for practically anything outside of scientific purposes (such as F@H). The fact that C2D is better at double precision FP is what's mostly meaningless in a game development or video decoding scenario, not the other way around.

CPU FP is what's mostly meaningless when there's a GPU at the end of the bus.
 

patsu

Member
SMT IS the way to go in multithreading. A shared cache pool, not isolated processors with private caches doing their jobs and releasing them afterwards. Not DMA freezing the bus. Not high latencies and the need to over-profile your code before feeding it to the CPU. At this point, I have my doubts about your working knowledge.

Everyone in the industry is moving in one direction, not only Intel and AMD, but Nvidia with its GPGPU. Just have a look at the Fermi or Kepler architectures, or even Southern Islands. Look at how they handle multithreading.

And finally, look at IBM's POWER7 symmetric multiprocessor architecture.

Cell was a great idea in 2001, but someone did better and everyone followed.

You're confusing the SPUs with Cell. Cell can also support SMT; e.g., IBM has a dual-Cell workstation in an SMT arrangement. Even on PS3, PS3 Linux supports pthreads. You can always throw more PPUs into a Cell setup.

But it also has the SPUs acting as standalone/autonomous vector engines where needed.


EDIT: And nope, Xenon is definitely not a stronger CPU than Cell if you program to their strengths.
 
You're comparing apples and oranges.

e.g., In the Folding@Home example, the GeForce 280GTX GPU alone required a 550W power supply at least. At launch, the entire PS3 (including RSX, HDD, Blu-ray drive and fan) used 205W max.

Lol. You can power an overclocked GTX 280 and a heavily overclocked six-core CPU with a 550W power supply. You can even power 2 GTX 280s in SLI with a good 550W power supply.

Come on!

[image: PCGH power-consumption chart]


Those performance numbers will need to be normalized. Cell is power efficient. If you throw an equivalent amount of power at Cell, you should be able to scale the performance of a Cell optimized application up almost linearly. This may not be true for other architectures.

Power efficient compared with what?

The same game will run better on Xbox with less power consumption.
Data analysis will finish earlier on a PC, so it will draw less power over time.

Current x86 is way more efficient than the POWER architecture used in Cell and Xenon, as Core 2 and K10 were.

For a traditional GPU, the scaling is also usually linear only because the problems are embarrassingly parallel. But you need to add the CPU overhead.

Wut? Can you expand on this? It sounds too vague.



The DMA memory model helps Cell to hide memory latency. If the app is optimized for Cell, it is like running the entire data and program in L1 cache. The GPU also uses similar techniques hiding a far longer pipeline (Hence higher FLOP count). However it depends on how the CPU is integrated with the GPU, and how memory is arranged. Cell is just one compact CPU doing both CPU and GPU-like jobs. For a "traditional GPU", you will always need a CPU, hence upping the power consumption even higher.

The DMA model is an outdated memory-transfer model used in the SEGA Genesis or Amiga 500. Modern CPUs use point-to-point technologies.



At the end of the day, it depends on your apps. Given the same power consumption, there are apps where Cell can beat a GPU easily and vice versa.

At the end of the day, you need to hand-tune every byte of your code to get acceptable performance out of Cell. You can throw your code as-is at a modern CPU; its dispatcher will figure out by itself how to execute it optimally.

Not only that, a modern CPU will be able to run several programs at once, each one coded by a different programmer using different languages, just like your average desktop usage. Cell would die trying to do that.

In Folding@Home, the Intel CPU is the most general. It can be used to solve all the Folding@Home jobs albeit slower. The GPU is the most specialized. It is only used to solve the highly parallelized problems even though its performance is the highest. The Cell is somewhere in-between. It can accelerate jobs that a traditional GPU or CPU can't handle for different reasons.

True, SPUs are just a bit more flexible at running some code than the first generation of unified-shader architectures. But they can't excel where a CPU or GPU works. The PPE is awfully slow. SPUs can't compete with unified shaders. There is no place for a piece of hardware like Cell with the current shader model.


High performance computing uses GPGPU these days because (1) It costs a lot of money designing supercomputer CPU. Most organizations don't want to spend that kind of money anymore. So most adopt the GPGPU model since there is already a market to sustain high performance graphics processors. (2) In addition, these applications run in a cold room drawing huge (but efficient) power. At this high power range, the GPGPU architecture is a natural fit, and can outperform Cell because of its long pipeline and many cores.

And everything at that level works in racks, easily upgradable or fixable. You could also use racks of Cells in a cold room. They use modern GPGPU systems because they are better.

In a home console, we may not have such luxury, especially not in 2005.

For consoles, it is not impossible to see a vector engine attached to the CPU, to accompany the GPU (e.g., See Vita, Xbox 360 and PS3). The important thing is how they work together.

That's not the problem. The problem was what a game-changer Xenos was with its unified-shader architecture.

If someone were to design a modern Cell today, ease of programming would probably be top of the list. So it may end up looking more like other architectures. Similarly, if someone were to design a modern CPU today, they may also steal ideas from Cell (as IBM has said). I remember when Cell was first introduced, Carmack and others were bitching about heterogeneous CPU cores. Looking at today's world, heterogeneous cores may be here to stay, especially if they want to use the CPU and GPU together with less overhead, and cheaply.

http://en.wikipedia.org/wiki/POWER7

EDIT:
The segregated memory (Local Store) in Cell is also handy for running a completely separate security kernel. It helps to keep the PS3 locked down. A regular CPU and GPU architecture can't do that (their memory is often shared and globally addressable).

The shared cache is handy in... Everything else.

Again, there is no point in explaining what Cell can do. You have to look at the bigger picture. Any low-cost AMD Fusion processor will do better in a console than a bunch of Cells. A PS3 powered by a Core 2 Duo and an 8800GT would have swept the floor with the Xbox 360, and would have been cheaper to manufacture and program for, too.

The Xbox 360 makes the PS3 a bad system. Sony's first parties make the PS3 a good console. It's as simple as that.
 

coldfoot

Banned
A PS3 powered by a Core 2 Duo and an 8800GT would have swept the floor with the Xbox 360, and would have been cheaper to manufacture and program for, too.
ROFL, you should get a tag for the bolded part. Just ask MS about how using an Intel CPU worked out for them for the first Xbox, especially regarding cost reductions.
 
You're confusing the SPUs with Cell. Cell can also support SMT; e.g., IBM has a dual-Cell workstation in an SMT arrangement. Even on PS3, PS3 Linux supports pthreads. You can always throw more PPUs into a Cell setup.

But it also has the SPUs acting as standalone/autonomous vector engines where needed.

You can also beef up the FPU in a Sandy Bridge CPU; it will be called Haswell. What will you call a multicore Cell? Because a multicore Cell or a dual-socket Cell is not what the current Cell is.


EDIT: And nope, Xenon is definitely not a stronger CPU than Cell if you program to their strengths.

3 symmetric multipurpose cores with 1MB of L2 cache vs. a single one with 512KB of L2 is what I call a stronger central processing unit, given that both have the very same core architecture. Integer-wise, Xenon is MUCH stronger than Cell, as much as Cell is floating-point-wise.

Everyone forgets about that.
 
LOL. It's so simple: they should have used a C2D and an 8800GT in the PS3 and it would have been easier to develop for and cheaper to manufacture!!!

ROFL, you should get a tag for the bolded part. Just ask MS about how using an Intel CPU worked out for them for the first Xbox, especially regarding cost reductions.

Call Sony and ask about their profit margins.

Now take a look at how profitable the G92 and its shrinks were for Nvidia. Or how profitable that tiny C2D was for Intel. Intel may be pretty draconian about licensing its tech, but a stone is more profitable than this:

[image: PS3 motherboard]
 

patsu

Member
Lol. You can power an overclocked GTX 280 and a heavily overclocked six-core CPU with a 550W power supply.

...

Power efficient compared with what?

Google the web for GTX280 power consumption. ^_^

The same game will run better on Xbox with less power consumption.

At launch? Early Xbox 360s overheated. MS had to redesign the entire thing and shrink it aggressively. It's on a smaller process now, hence drawing less power.

Data analysis will finish earlier on a PC, so it will draw less power over time.

We'll have to look at the actual numbers. In some supercomputing applications, a Cell system actually outran a large scale PC distributed system. Cell's power efficiency is published in papers. You can look them up if you're interested.

Current x86 is way more efficient than the POWER architecture used in Cell and Xenon, as Core 2 and K10 were.

Cell was designed in the early 2000s. If we still couldn't compete with Cell today, then something would be very wrong with the tech world. That doesn't mean Cell is not power-efficient, though. ^_^

Wut? Can you expand on this? It sounds too vague.

What's vague? The traditional GPU architecture has always been designed for embarrassingly parallel applications. Modern GPGPU architectures generalize it, but they can't run away from their roots.

The DMA model is an outdated memory-transfer model used in the SEGA Genesis or Amiga 500. Modern CPUs use point-to-point technologies.

... and the GPU's pipeline model dates back to the '80s. All modern technologies are based on old and sound principles. The trick is to use them correctly.

At the end of the day, you need to hand-tune every byte of your code to get acceptable performance out of Cell. You can throw your code as-is at a modern CPU; its dispatcher will figure out by itself how to execute it optimally.

Not only that, a modern CPU will be able to run several programs at once, each one coded by a different programmer using different languages, just like your average desktop usage. Cell would die trying to do that.

Nonsense. I don't think Uncharted is written by just one guy.

True, SPUs are just a bit more flexible at running some code than the first generation of unified-shader architectures. But they can't excel where a CPU or GPU works. The PPE is awfully slow. SPUs can't compete with unified shaders. There is no place for a piece of hardware like Cell with the current shader model.

And everything at that level works in racks, easily upgradable or fixable. You could also use racks of Cells in a cold room. They use modern GPGPU systems because they are better.

That's not the problem. The problem was what a game-changer Xenos was with its unified-shader architecture.

Cell is used to do vertex and lighting work together with RSX. It also decodes Blu-ray streams and runs Java applets. It performs network tasks like Remote Play and DLNA too. Underneath all these tasks, it also handles the hypervisor. Computing-model-wise, it is far, far more flexible than unified shaders.

As I mentioned, they use GPGPU first and foremost because of economics. And yes, GPGPU is suitable for supercomputing, but the world's fastest supercomputer (updated last month) is not GPGPU-based. It is BlueGene/Q: http://www.top500.org/lists/2012/06/press-release

It's PowerPC based. Its predecessor, BlueGene/L, is Cell + PowerPC-based.

Again, there is no point in explaining what Cell can do. You have to look at the bigger picture. Any low-cost AMD Fusion processor will do better in a console than a bunch of Cells. A PS3 powered by a Core 2 Duo and an 8800GT would have swept the floor with the Xbox 360, and would have been cheaper to manufacture and program for, too.

In late 2005, a "Core2Duo and a 8800GT" packed in a small box would probably have burned down your house. You can keep that picture in mind.

The Xbox 360 makes the PS3 a bad system. Sony's first parties make the PS3 a good console. It's as simple as that.

Yes, bad or under-staffed developers may write bad code on 360 too. Applications that are optimized for one architecture may be hard to port elsewhere too.

3 symmetric multipurpose cores with 1MB of L2 cache vs. a single one with 512KB of L2 is what I call a stronger central processing unit, given that both have the very same core architecture. Integer-wise, Xenon is MUCH stronger than Cell, as much as Cell is floating-point-wise.

Everyone forgets about that.

It doesn't matter what you call it. It is not more powerful. Period.
 
Saying Sony planned to use two Cells instead of a GPU is like saying Saddam thought of using PS2s to launch ballistic missiles: nonsense. Just a viral story or a myth.

You CAN'T use a second Cell as a GPU without in-depth logic changes. How would it draw fast enough without ROPs? How would it handle textures without TMUs? It has no hardware offloads for graphics capabilities. How could it act as a GPU, then?

The PS3 will never be able to 'smoke the 360 performance-wise', even in your wildest dreams. Xenos is way ahead of RSX tech-wise. Xenon is stronger as a CPU than Cell, like 3 to 4 times.

For every 'trick' Cell can do to help RSX, a programmer just has to ask Xenos to do it, and it will do it better and more easily. There are no advantages to Cell, not a single one. You have to understand this.

To the bolded: that's simply not true, especially not 3 to 4 times. It's a pretty much known fact that Cell > Xenon. Why are PS3-exclusive games like Uncharted 2/3, Killzone 2, and GoW3 still the best-looking console games? Clearly the PS3 has some advantages over the 360 with Cell. The post-mortems on these games show how they were developed around Cell and were able to achieve what they did because of it.

Also, BF3 runs better on PS3 because it's extensively built around the SPUs and Cell. There's a PDF on the internet from GDC that goes into depth on how they took advantage of the Cell with their Frostbite engine for BF3. It also clearly states in bullet points that Cell is very well suited for tasks normally done by the GPU, and that you can greatly increase performance by using Cell this way. You can find this slideshow PDF on the internet pretty easily through a Google search. Why would DICE mislead other devs about developing on Cell and its strengths?

To stay on topic: it does seem like there will be certain tasks that a PS4 with a Steamroller-based CPU might not be as efficient at compared to Cell, and it will just have to do things differently to play to its strengths. I still think it's a shame that the rumored CPU in the PS4 might not be an "upgrade" on every level compared to the CPU in the PS3.
 

OrangeOak

Member
Why are PS3-exclusive games like Uncharted 2/3, Killzone 2, and GoW3 still the best-looking console games?

Are these games technically more advanced than some multiplatform games and X360 exclusives, or are you talking about art, in which case it's a matter of opinion and doesn't prove anything?
I would really love to know where this whole knowledge of what all these games are doing technically comes from, because in most cases it's not that well documented.
In most cases, arguments end with a "because it looks better" or "it has better lighting", and all of this is really subjective unless you support it with concrete technical details.
There are so many sacrifices in all these current-gen games at this point that these discussions about the "best looking" game are pointless anyway.
Sorry for the off-topic, but it just baffles me that so many people state their subjective observations as fact.
 
Are these games technically more advanced than some multiplatform games and X360 exclusives, or are you talking about art, in which case it's a matter of opinion and doesn't prove anything?
I would really love to know where this whole knowledge of what all these games are doing technically comes from, because in most cases it's not that well documented.
In most cases, arguments end with a "because it looks better" or "it has better lighting", and all of this is really subjective unless you support it with concrete technical details.
There are so many sacrifices in all these current-gen games at this point that these discussions about the "best looking" game are pointless anyway.
Sorry for the off-topic, but it just baffles me that so many people state their subjective observations as fact.

The technology in these games has been well documented in post-mortems that the devs showed at GDC; I already stated that in my previous post. The art is only 50% of it. The tech is just as important as the art. Without the tech, the artists wouldn't be able to fully realize their vision. In the same vein, without the great artists, you wouldn't be able to utilize the tech to its full potential. So yes, these games are more technically impressive than most multiplatform and 360 games; the art is only half of it. I think it's completely naive to say "blank game" only looks so good because of the art or the tech. They're equally important and are equally the reason why the game looks so good.

edit: Also, Digital Foundry has posted some interesting tech articles on UC3 and Killzone 3. They also did a tech interview with one of the programmers on the KZ3 team that went into detail on how they further optimized the KZ3 engine vs. KZ2 by taking further advantage of the Cell.

OK, Cell is not more powerful than Xenon either. Period.

Umm, yes it is.
 

coldfoot

Banned
Now take a look at how profitable the G92 and its shrinks were for Nvidia. Or how profitable that tiny C2D was for Intel. Intel may be pretty draconian about licensing its tech, but a stone is more profitable than this:
So your idea of cost-reducing that motherboard is to put an even more expensive CPU and GPU on there? With Intel, Sony would still be paying 90nm prices for 45nm CPUs, and an 8800 is much bigger than the RSX, not to mention it has a wider and more expensive memory interface.

Nothing you say makes sense or should be respected at this point, so I'll do exactly that.
 
Google the web for GTX280 power consumption. ^_^

Been there, done that:


DERP!

http://www.pcgameshardware.com/aid,...on-of-graphics-cards-compared/Reviews/?page=2

Peak power consumption from a full system (GTX280, AMD X2, 1GB RAM, etc...): 274W.

Peak.

At launch? Early Xbox 360s overheated. MS had to redesign the entire thing and shrink it aggressively. It's on a smaller process now, hence drawing less power.

The Xbox 360 never overheated. RROD came from engineering a motherboard to pre-2006 standards and then manufacturing it lead-free under 2006 RoHS. And even then, it has lower power consumption than the PS3. Google it.

No interest in Xbox vs. PS3, tbh. This is about some guy arguing that the PS4 should not have Cell by any means.



We'll have to look at the actual numbers. In some supercomputing applications, a Cell system actually outran a large scale PC distributed system. Cell's power efficiency is published in papers. You can look them up if you're interested.

What makes you think I haven't read a LOT about Cell?

Cell was designed in the early 2000s. If we still couldn't compete with Cell today, then something would be very wrong with the tech world. That doesn't mean Cell is not power-efficient, though. ^_^

It was released in almost 2007 and it's manufactured on current lithography.
It wasn't designed in early 2000; the project STARTED in early 2000. Come on, man. Give me a break.



What's vague? The traditional GPU architecture has always been designed for embarrassingly parallel applications. Modern GPGPU architectures generalize it, but they can't run away from their roots.

Still makes no sense. Let's talk about black boxes. Inside one, just a Cell. In the other one, a CPU+GPU+motherboard+RAM+HDD+power supply. The first one will not work at all.

... and the GPU's pipeline model dates back to the '80s. All modern technologies are based on old and sound principles. The trick is to use them correctly.

DMA is outdated as hell. Think about it: a fully cache-coherent Cell with a shared pool for all SPEs. That sounds powerful; DMA and local store sound... outdated.

Nonsense. I don't think Uncharted is written by just one guy.

Uncharted is a single program compiled into a single executable with exclusive access to all PS3 resources. A single instance, aware only of the XMB.

Cell is used to do vertex and lighting work together with RSX. It also decodes Blu-ray streams and runs Java applets. Computing-model-wise, it is more flexible than unified shaders.

My old 8800GT can run Left 4 Dead, do PhysX, encode a video, simulate fluids...

http://www.nvidia.com/content/cuda/cuda-toolkit.html
http://www.nvidia.com/object/cuda-apps-flash-new.html#

You don't want to play this game.

As I mentioned, they use GPGPU first and foremost because of economics. And yes, GPGPU is suitable for supercomputing, but the world's fastest supercomputer (updated last month) is not GPGPU-based. It is based on BlueGene/Q: http://www.top500.org/lists/2012/06/press-release

It's PowerPC based. Its predecessor, BlueGene/L, is Cell + PowerPC-based.

How many Cells does it have?


In late 2005, a "Core2Duo and a 8800GT" packed in a small box would probably have burned down your house.

A fully clocked 8800GT has the very same power consumption as a 7900 GTX. Did your PS3 burn down your house?

Yes, bad or under-staffed developers may write bad code on 360 too.

360 will run bad code better than PS3 too.


It doesn't matter what you call it. It is not more powerful. Period.

OK, Cell is not more powerful than Xenon either. Period.
 

Lord Error

Insane For Sony
http://www.youtube.com/watch?v=TlkEU_l02qg

It's one thing to stream 8 HD signals, and another to decode 8 .mkv files encoded with x264. There is no such offload engine in Cell to do that.
Whatever Toshiba has in that TV, or whatever they're doing there (probably decoding each stream at 1/16 of the original resolution, which is a far less intensive task), it's an invalid comparison with the PS3. What I told you about 3D BRs is true on PS3: corners had to be cut with audio to decode everything fast enough. It's also very difficult to encode 1080p 60FPS H.264 video so that it plays smoothly on PS3. The CPU is very good for the task, and probably nothing reasonably priced in 2005 or '06 would come close, but to say that it was overkill for this task is blatantly false.


No interest in Xbox vs. PS3, tbh.
That's too bad, you were doing just great with that statement about Xenon being 3-4x more powerful than Cell.
 
It's a pretty much known fact that Cell > Xenon.

Tell me why and how.


I read all about that episode: how much DICE invested in the PS3 version and how little in the 360 one. Like many others, I was expecting a much better version on PS3.

Guess what?

http://www.eurogamer.net/articles/digitalfoundry-face-off-battlefield-3

After MUCH more research and investment, the PS3 version doesn't perform better than the 360 one. More transistors, more TDP, more power consumption, a more expensive console, more software cost time- and money-wise, NOT better performance. How can that be possible if 'Cell > Xenon'?

How can that be possible with an engine which was built around Cell+RSX and ignored Xenon+Xenos strongholds? Which can't make use of several Xenos hardware offloads because of its renderer?

I will tell you: because all the tricks and marvelous things Cell does are necessary just to stay on par with non-360-focused software running on a 360. Sad.

Your 'arguments' are like saying VHS is better than Betamax because there are more movies on the former.

To stay on topic: it does seem like there will be certain tasks that a PS4 with a Steamroller-based CPU might not be as efficient at compared to Cell, and it will just have to do things differently to play to its strengths. I still think it's a shame that the rumored CPU in the PS4 might not be an "upgrade" on every level compared to the CPU in the PS3.

And here is where it all leads: some guy on the internet believing Cell is better than a future PS4 CPU. Hilarious.
 
Yeah, art has something to do with it, but so do specs.
Saying that UC2 looks good because of 25GB of streamable art assets and scripts without talking about hardware is stupid.
If anything, the PS3 only gets games that good because of Cell, since it had to help out RSX a whole bunch because it sucks.

The PS4 looks to be the opposite of the PS3 hardware-wise, from the leaked stuff we know.
RSX doesn't suck. It's just nowhere near as versatile as Xenos.

Some of his other arguments have been insane at best, but he wasn't wrong to say that Uncharted 1, 2, and 3 look as good as they do because of the artistic ability of Naughty Dog. It helps that they had 3 times the storage space, a great team of programmers, etc.

But it isn't because of the PS3 that the game looks as good as it does. They could have done something of similar fidelity on the 360, and something infinitely prettier on the PC.
 
Whatever Toshiba has in that TV, or whatever they're doing there (probably decoding each stream at 1/16 of the original resolution, which is a far less intensive task), it's an invalid comparison with the PS3. What I told you about 3D BRs is true on PS3: corners had to be cut with audio to decode everything fast enough. It's also very difficult to encode 1080p 60FPS H.264 video so that it plays smoothly on PS3. The CPU is very good for the task, and probably nothing reasonably priced in 2005 or '06 would come close, but to say that it was overkill for this task is blatantly false.

Again, Cell can't offload all codecs. AFAIK, it can decode 48 MPEG-2 streams.

http://techon.nikkeibp.co.jp/article/HONSHI/20061122/124279/

That crippled Cell in that Toshiba TV is streaming 8 HD signals. That means Cell is decoding them in hardware. If its hardware can't offload (accelerate) a certain codec, it will need to decode it in software. Did I mention already that Cell is a very weak CPU?



That's too bad, you were doing just great with that statement about Xenon being 3-4x more powerful than Cell.

I started out talking about the CPU+GPU paradigm vs. Cell for the PS4. As soon as someone felt Cell was under attack, he went into damage control, RROD, and so on. Not interested in that.
 

Lord Error

Insane For Sony
Again, Cell can't offload all codecs. AFAIK, it can decode 48 MPEG-2 streams.

http://techon.nikkeibp.co.jp/article/HONSHI/20061122/124279/

That crippled Cell in that Toshiba TV is streaming 8 HD signals. That means Cell is decoding them in hardware. If its hardware can't offload (accelerate) a certain codec, it will need to decode it in software. Did I mention already that Cell is a very weak CPU?
It's decoded in software either way. MPEG-2 is far easier to decode than H.264. There's still the fact that the PS3 can play a correctly encoded 60FPS 1080p H.264 (single, of course) video feed. I really doubt something affordable made in 2005 or '06 would be able to do that. However, that's about where the limit of its video decoding is, and it ended up being a good fit for BR movies, and far from overkill like you suggested. More like the only reasonable option they had.

I read all about that episode: how much DICE invested in the PS3 version and how little in the 360 one. Like many others, I was expecting a much better version on PS3.

Guess what?

http://www.eurogamer.net/articles/digitalfoundry-face-off-battlefield-3

After MUCH more research and investment, the PS3 version doesn't perform better than the 360 one. More transistors, more TDP, more power consumption, a more expensive console, more software cost time- and money-wise, NOT better performance. How can that be possible if 'Cell > Xenon'? How can that be possible with an engine which was built around Cell+RSX and ignored Xenon+Xenos strongholds? Which can't make use of several Xenos hardware offloads because of its renderer?
This is, in all honesty, slimy and disrespectful of the tech work and ingenuity that went into the X360 version of that game, a game that used an engine designed solely around the PC/DX11 architecture and tile-based deferred shading. This was something neither console was ever meant to do, btw, and the fact that emulating this DX11 shading was a slightly more natural fit for the PS3 architecture doesn't do anything to help your argument that Cell was always good for nothing, nor does it do justice to the effort that was needed to make it work on either console.
 

onQ123

Member
Wait, wait, wait!

Let me get this right.


Cell is a weak CPU.

Blu-ray wasn't needed.

RSX is weaker than the Xbox 360 GPU.

Split RAM was a big mistake & they should have used unified RAM like the Xbox 360.


All this, yet the PS3 has the best-looking games on a console this generation. How is that even possible?


I guess Sony is feeding their devs magic beans or something, because the PS3 was just all fucked up according to the internet know-it-alls.
 