Ah, minimum 550W is the power supply rating. Since your article is from 2008, you should use the PS3 Slim for reference, which uses about 97W max.
You STATED that the GTX 280 needs a 550W PSU alone as a bare minimum:
the GeForce 280GTX GPU alone required a 550W power supply *at least*.
That's *NOT* true. I showed you reviews that demonstrate a PEAK total PC usage of 275 watts. And the PS3 Slim was released in September 2009, so the fat model applies here.
Let's do some maths on efficiency:
GTX 280 at Folding: 6530 points / 275 W = 23.75 points per watt (not even counting CPU scores, just the GPU).
PS3 at Folding: 900 points / 135 W for the non-backwards-compatible fat model = 6.67 points per watt. (RSX can't run Folding.)
A GTX 280 based PC is about 3.5 times more efficient at this job than a Cell-based PS3.
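To make the arithmetic explicit, the comparison above can be sketched as a few lines of Python. The PPD and wattage figures are the ones quoted in this post; treat them as this thread's assumptions, not independent measurements:

```python
# Folding@home efficiency comparison, using the figures quoted above.
# ppd = points per day; watts = peak system power draw.
gtx280_ppd, gtx280_watts = 6530, 275   # GTX 280 PC, peak total system power
ps3_ppd, ps3_watts = 900, 135          # fat (non-backwards-compatible) PS3

gtx280_eff = gtx280_ppd / gtx280_watts  # ~23.75 points per watt
ps3_eff = ps3_ppd / ps3_watts           # ~6.67 points per watt

print(f"GTX 280 PC: {gtx280_eff:.2f} points/W")
print(f"PS3 (fat):  {ps3_eff:.2f} points/W")
print(f"Ratio:      {gtx280_eff / ps3_eff:.2f}x")  # ~3.56x
```

The ratio works out to roughly 3.56, which is where the "about 3.5 times more efficient" claim comes from.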
Enough of this Cell godlike shit.
Xbox 360 did overheat. As for blaming it on RoHS, here's what can be said:
http://en.wikipedia.org/wiki/Xbox_360_technical_problems
The Xbox 360 DOES NOT overheat. It has nothing to do with that at all. RROD was due to poor-quality solder. Quality lead-free solder is more expensive. MS was greedy enough to use low-quality solder, just as they used low-cost capacitors. The fat 360 is far better engineered than the fat PS3. The fat PS3 is far better manufactured than the fat 360.
Regarding power consumption, the PS3 Slim now uses 72W (120 GB) to 83W (320 GB) playing FFXIII according to
wiki. The 360 (Valhalla) uses about 80W playing games according to
here
However this is system wide usage (Blu-ray vs DVD, HDD size difference, etc.). It will vary depending on the application used.
This is the most hilarious thing I have read in weeks. You are simply unable to read a table. That 11 W difference based on hard drive size alone was too funny to be believable, since we are not talking about fat models with different chip revisions. A 2.5" hard drive, like the ones in the PS3, peaks at about 2.5 W. Yet you actually believe there is an 11-watt difference between hard drives based on their capacity. I can't stop laughing.
I won't even get into comparing power consumption figures taken from different sources.
When talking about CPU power dissipation, they typically normalize the benchmark (e.g., frequency/FLOP per watt).
I noticed a while ago that you have no idea at all, so let me teach you something. TDP is not power consumption, and it has nothing to do with benchmarks. TDP is the amount of dissipated heat a cooling system has to deal with in a worst-case scenario. More on this later.
Yes, yes, my poor Cell evangelist.
It was a 5 year project. Technical investigation and initial design concepts were explored, investigated and revised continuously over those years.
So you agree Cell is not a chip designed in 2000. I hope you never say that again, then.
Yes, your example makes no sense. Why would you want to compare these 2 things ?
Because they are systems. A PC with a CPU+GPU is a system. A PS3 with Cell+RSX is a system. Cell alone does nothing, like any other CPU.
The GPU architecture is optimized for embarrassingly parallel problems. It doesn't matter what other components are used. A GPU is used mainly for running graphics and physics jobs, but not for running Java or a web browser.
You know what? A GPU can render a browser. And you know what? Cell can't feed an image to your TV, Cell doesn't have a DAC to feed audio to your speakers, and Cell can't hold data because it has no room for it. Not only that, my cat can't bark. Just more nonsense about which device does which duty in a system.
Pipelining and DMA are both ancient concepts and remain in active use today. DMA is critical in I/O. What you probably meant is cache management (cache coherency). Yes, Cell lacks cache management. A modern implementation will have to revisit this topic to enable more flexible LocalStore usage.
You use DMA for external devices like hard drives, not for a high-performance processor. End of story. Any modern multicore CPU needs a shared pool of cache. The more, the better.
Doesn't mean anything. You can say the same for other programs, "Halo is a single program compiled in a single executable with exclusive access to all 360 resources. A single instance aware of just Xbox environment."
My point is that Cell, as a single-core CPU, will lose tons of performance running more than one program. Any multicore processor will do better.
You were wrong claiming that you can't have multiple people working on the same Cell project.
FWIW, a Cell program consists of multiple programs because the SPU and PPU load different binaries. It's just packed into one bundle for easy management. There is also the underlying hypervisor like 360's kernel.
All of them PROFILED to work together, with working knowledge of the available resources. It's not like opening YouTube in Chrome while playing some MP3s at the same time.
Can it run Java and JavaScript ? How about DLNA network protocol ? Can it read from a disk directly ?
My 10 MHz phone can run JavaScript. My 300 MHz MIPS router CPU can run a DLNA media server. There is absolutely no point to this. Can you hook a speaker up to Cell so it can play some MP3s? Even better: can your almighty Cell read from a disk directly, without a southbridge? Genius.
Their marketing says based on BlueGene architecture. So perhaps they retained some Cell design philosophy inside ? It's PowerPC. Definitely not GPGPU though. The GPGPU architecture is still too specialized compared to a CPU, *if* they can get the CPU performance high enough. It would be interesting to see how vendors combine both together.
Cell is based on PowerPC, not the other way around. The future is the SoC. CPUs tend toward GPUs; GPUs tend toward CPUs. Cell is just a weak CPU with a massive die budget dedicated to floating point.
Nope, why would it burn my house when the 8800GT is in your house ? Not unless you can time travel back to 2005 and put your fully clocked 8800GT inside a PS3. ^_^
In *2005*, before Cell launched, 8800GT was not ready for small boxes yet. It would be too hot. Now 7-8 years later, of course you can put a shrunken version in small boxes today.
1. The PS3 has a dumbed-down 7900 GTX.
2. The 8800 GT and 7900 GTX have the same TDP and similar power consumption.
3. ???
4. Profit!
G80 was released in 2006. The PS3 was released in late 2006. Sony decided to invest more in the CPU than the GPU; that's why they released the PS3 with a G71 derivative instead of a G80. Budget. Just like Microsoft trading 256 MB of RAM for an HDD.
This may be true. Then again, the reverse is also true, if the code is optimized for Cell, it will fly like no other.
Yup, and Xenon will run Xenon-optimized code like no other, too. More nonsense.
Show me a Xenon that can decode 3D Blu-ray and run Java at the same time.
Show me a BD drive you can hook up to an Xbox 360. In the meantime, you can play with the HD DVD unit. Any software-level codec will run on the PPE. Xenon has a PPE just like Cell's, plus two more alongside it with double the cache. Guess which one will run PPC code better?
I'm done with this childish talk. I'm done with guys posting a screenshot to 'prove' one CPU is better than another. I'm done with people calling others fanboys without any working knowledge of microarchitectures. You want an in-depth Xbox 360 vs PS3 topic? Open a new one. You want a new Cell religion? Open a church.
I only wanted to explain why I don't want a Cell, a Steamroller, or a shitty 8-core ARM CPU in next-gen consoles. I wanted to say that a Southern Islands GPU is not an HD 7970 like those PCIe cards, just a derived chip. I've said it already and have nothing more to add.
Fafalada said:
You do realize that such hardware never existed in a console right?
The Xbox 360 at its release date was exactly that.
systemfehler said:
I would take XDR2 aswell but I don't want to end up with a console without RAM but instead a voucher for it in 2013/14...
Please NO! Don't touch that memory made out of thin air by Rambus, that shitty patent troll.
On the GDDR5 vs DDR3 debate, you are forgetting to talk about bus width. GDDR5 on a narrow bus will be cheaper, but would perform like DDR3 on a wider bus. If it comes down to speed vs capacity, DDR3 on a 256-bit bus is more than enough for GPU work at 1080p, so 4 GB DDR3 over 2 GB GDDR5, no doubt.
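A quick sketch of the bus-width point: peak theoretical bandwidth is roughly the effective transfer rate per pin times the bus width. The transfer rates and bus widths below are illustrative assumptions, not the spec of any actual console:

```python
def bandwidth_gbs(transfer_rate_gtps: float, bus_width_bits: int) -> float:
    """Peak theoretical bandwidth in GB/s.

    transfer_rate_gtps: effective transfers per second per pin, in GT/s.
    bus_width_bits: width of the memory bus in bits.
    """
    return transfer_rate_gtps * bus_width_bits / 8  # bits -> bytes

# Illustrative numbers only (assumptions, not any console's actual spec):
gddr5_narrow = bandwidth_gbs(4.0, 128)  # 4 GT/s GDDR5 on a 128-bit bus
ddr3_wide    = bandwidth_gbs(2.0, 256)  # 2 GT/s DDR3 on a 256-bit bus

print(gddr5_narrow, ddr3_wide)  # both come out to 64.0 GB/s
```

In other words, a faster memory type on a narrow bus can land on the same peak bandwidth as a slower type on a wide bus, which is why quoting the memory type without the bus width says nothing.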