
PS4 Rumors: APU code-named 'Liverpool', Radeon HD 7970 GPU, Steamroller CPU, 16GB Flash

Status
Not open for further replies.
Retro-question:
Did PS3 make a profit off of the h/w? How about with h/w and s/w together?

Anyone know the cost/profit margins on the PS3? I feel like Sony is still losing money on it, but I'm not sure. I think that plays a factor in not only the PS4's launch timing, but also every facet of the system.

They've been making a profit on PS3 since the launch of the slim in 2009 AFAIK.

edit: or was that when they started breaking even, and then in 2010 they start making a profit? I forget exactly.
 

RiverBed

Banned
So they actually made up for the PS3's expenses? Wow. Good for them. I thought they would never make it, since they were billions(?) in the hole and always in the red whenever I checked in the past.
 
There are no 7970s that I am aware of that have 18 CUs and put out 1.8tf.

The console will be based on GCN architecture, but you're not getting the equivalent of a Tahiti model.

Well, the CU and FLOP numbers from the rumour might be wrong; the 7970M has 20 CUs, so a version with 18 doesn't seem so far-fetched. However, I seriously doubt that a 7970(M) with a 100W TDP is suitable for a console with a standard cooling solution.
 
So they actually made up for the PS3's expenses? Wow. Good for them. I thought they would never make it, since they were billions(?) in the hole and always in the red whenever I checked in the past.

No, the PS3 had some profitable quarters, but it didn't make up for all the losses, it's not even close.
 

StevieP

Banned
Well, the CU and FLOP numbers from the rumour might be wrong; the 7970M has 20 CUs, so a version with 18 doesn't seem so far-fetched. However, I seriously doubt that a 7970(M) with a 100W TDP is suitable for a console with a standard cooling solution.

You also generally don't use mobile variants for consoles, as they're usually quite a bit more expensive. The CU/Flop numbers are also their target specs. The 7970m is a rough equivalent to a desktop Pitcairn.

In late 2013, why not?

Because that's not what they're shooting for?

No, the PS3 had some profitable quarters, but it didn't make up for all the losses, it's not even close.

Right. They'll never make up for the losses they took on the PS3.
 

ekim

Member
Can't we just consolidate all the next-gen and new PS3 model threads, given the digression in every single one of them?
 
You also generally don't use mobile variants for consoles, as they're usually quite a bit more expensive. The CU/Flop numbers are also their target specs. The 7970m is a rough equivalent to a desktop Pitcairn.

Well, I agree, but heat and power have increased quite a bit since 2006, and for a small case I only see a mobile GPU or a seriously downgraded "full" GPU as feasible. I think it is worth looking at the mobile GPUs as well if Sony wants to stay within a certain thermal boundary, although I admit I don't have any price figures, so this is little better than a wild guess.

I am sure AMD already has a mobile version of Tahiti specced out; depending on the PS4 launch, this might be a good alternative to a dumbed-down retail 7970.
 

StevieP

Banned
The "7970M" is not a "Tahiti"; it's fairly equivalent to a desktop Pitcairn (around 2 TFLOPs).

Why not just take a similar approach and customize, downclock, etc. a rough equivalent?

(which, by every indication in rumours, is what's happening)
 
It doesn't have to be a 7970M; more likely the next (first) mobile iteration of Tahiti. With the really consumer-unfriendly naming scheme established by Nvidia and AMD, it could be an 8xxx(M) version. With an APU + GPU setup, power gating is really needed (especially if you want your console to work as a media box as well), and so far I have read mixed information on whether the mobile versions of Pitcairn (7970M) can do that.

I don't know how much custom work Sony wants to put into their next GPU, but they have a lot of options, and I am just pointing out a few.
 

Nachtmaer

Member
Isn't the 7970M just a higher (or differently?) binned Pitcairn that runs at a lower voltage and clock speed? To me, using the mobile version actually makes more sense, if they're going to use an off-the-shelf GPU (so to speak), because of power and heat constraints. Then again, I'm sure they're going for a more custom-designed chip with its own performance and TDP targets and features. I can see this 18 CU, GCN-based GPU happening. Correct me if I'm wrong though.
 

RoboPlato

I'd be in the dick
It doesn't have to be a 7970M; more likely the next (first) mobile iteration of Tahiti. With the really consumer-unfriendly naming scheme established by Nvidia and AMD, it could be an 8xxx(M) version. With an APU + GPU setup, power gating is really needed (especially if you want your console to work as a media box as well), and so far I have read mixed information on whether the mobile versions of Pitcairn (7970M) can do that.

I don't know how much custom work Sony wants to put into their next GPU, but they have a lot of options, and I am just pointing out a few.

This is what I'm expecting. Some version of a 2013, high end mobile GPU with changes to clock speed and features in order to meet cost and feature set goals dictated by Sony and their partners.
 
Not only do we talk about other consoles in the WiiU spec thread, I was just posting about a shirt I had that ended up, by chance, inspiring the ThunderMonkey character.

What up?!

As long as there are secondary or tertiary connections to the thread at hand our conversations are generally allowed to be free-flowing. Besides we've already got a better idea of what is in both Sony and MS consoles than we do WiiU. We've at least got their general floppage range.

All we know about WiiU is generalities. 2-3x the power derp.

We know through dev commentary that the CPU is not a modern analogue of the PS3/360 CPUs. It seems to forego brute force for ease.

We know even less about the GPU. I mean, the range given by those in the know is everything from barely more powerful than current gen to potentially a lot more. We've got a pretty good idea of the RAM total, though.

Yeah I still remember defending Sony's financial situation in one of the old Wii U threads. XD

And it is something that we are learning more about the other two consoles than Wii U.
 

AB12

Member
There are no 7970s that I am aware of that have 18 CUs and put out 1.8tf.

The console will be based on GCN architecture, but you're not getting the equivalent of a Tahiti model.
LOL, I remember when you used to say we'd be lucky to get a 6850, and got into arguments with people who even suggested the idea of a 6970. For a console coming out in 2013/2014, Tahiti or some equivalent would have a good chance of being used.
 

BanGy.nz

Banned
32 pages is a bit much to wade through and I know this is all just speculation based on educated (?) guesses, but what are the chances of this thing being backwards compatible?
 

StevieP

Banned
LOL, I remember when you used to say we'd be lucky to get a 6850, and got into arguments with people who even suggested the idea of a 6970. For a console coming out in 2013/2014, Tahiti or some equivalent would have a good chance of being used.

A 6850 draws more power and creates more heat than an equivalent Pitcairn. A 6970 was and still is a ludicrous proposition in a console, and a full blown Tahiti still draws too much for a typical-sized console.

This console is most likely coming Q4 2013. The 8xxx series is a GCN refresh, and the (legitimate) target specs indicate a 1.8 TFLOP GCN-based GPU with 18 CUs.
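As a sanity check, those target numbers are internally consistent: GCN's peak arithmetic rate falls straight out of CU count and clock, since each CU has 64 shader lanes doing a fused multiply-add (2 FLOPs) per cycle. A quick sketch, where the 800 MHz clock is an assumption chosen to match the rumoured figure:

```python
# Back-of-the-envelope peak-FLOPs estimate for a GCN-based GPU.
# GCN: 64 shader lanes per CU, 2 FLOPs per lane per cycle (fused multiply-add).

def peak_tflops(compute_units: int, clock_ghz: float,
                lanes_per_cu: int = 64, flops_per_lane: int = 2) -> float:
    """Theoretical single-precision peak in TFLOPs."""
    return compute_units * lanes_per_cu * flops_per_lane * clock_ghz / 1000.0

print(peak_tflops(18, 0.8))   # rumoured PS4 target: 1.8432 TFLOPs
print(peak_tflops(20, 0.85))  # 7970M (20 CUs @ 850 MHz): 2.176 TFLOPs
```

Eighteen CUs at around 800 MHz lands almost exactly on the rumoured 1.8 TFLOPs, which is why the two numbers keep appearing together.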
 

i-Lo

Member
A 6850 draws more power and creates more heat than an equivalent Pitcairn. A 6970 was and still is a ludicrous proposition in a console, and a full blown Tahiti still draws too much for a typical-sized console.

This console is most likely coming Q4 2013. The 8xxx series is a GCN refresh, and the (legitimate) target specs indicate a 1.8 TFLOP GCN-based GPU with 18 CUs.

Why are we so hung up about the 1.8tf and 18CU as being the final product?
 
Target specs are generally actually more indicative of a final product than dev kits and usually end up being fairly accurate.

So you're saying the leak mentioning ~1.8 TFLOPs/18 CUs is true, whereas the notion of Tahiti in that leak is false? I'm sorry, but has it really been established what is true, false, believable, or wishful thinking? I would like to read that, because if those leaked specifications are true it means Steamroller and not Bulldozer, because of [...]Target specs are generally actually more indicative of a final product than dev kits and usually end up being fairly accurate.[...]
 
There are no 7970s that I am aware of that have 18 CUs and put out 1.8tf.

The console will be based on GCN architecture, but you're not getting the equivalent of a Tahiti model.

Isn't the RSX in the PS3 a modified version of the high end 7800gtx back in the day? If so, a 79xx or 88xx variant wouldn't surprise me.
 

Triple U

Banned
Target specs are generally actually more indicative of a final product than dev kits and usually end up being fairly accurate.


I'm pretty sure dev kits use the actual hardware; how are target specs more indicative than that? Unless you meant the alpha kits or something.
 

Ryoku

Member
I'm pretty sure dev kits use the actual hardware; how are target specs more indicative than that? Unless you meant the alpha kits or something.

Because target specs are just that: target specs. Early dev kits contain hardware that lets developers get accustomed to the general performance level until the final hardware based on the target specs (or slightly modified) is finished.
 

Triple U

Banned
Because target specs are just that: target specs. Early dev kits contain hardware that lets developers get accustomed to the general performance level until the final hardware based on the target specs (or slightly modified) is finished.

Well, he didn't really say early kits. The way he worded it, it sounds like he means dev kits in general. That's why I asked whether he meant something like alpha kits or not.
 
Target specs are generally actually more indicative of a final product than dev kits and usually end up being fairly accurate.

I thought the rumored specs were based on what was in the dev kits? Where did we get legitimate info on what Sony's final target specs are? Why would the target specs exactly match the 2nd iteration of a dev kit that came out in Feb. 2012 which is almost 2 years from the rumored target release date?
 

TheD

The Detective
That's strictly not true. The cell can still do things conventional CPUs cannot achieve.

HAHAHAHA........... No.

The only thing Cell has going for it is high floating-point output, but not much code is even close to floating-point bound. Everything else in Cell is very old and would be crushed by a modern processor.

Anyone saying that GPGPU can make up for a weak CPU in the WiiU does not understand much about GPGPUs or CPUs. GPUs are only good at highly parallel tasks that are insensitive to latency; they cannot help the CPU with any other tasks.
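The latency point can be made concrete with Amdahl's law: no matter how many GPU lanes you throw at the parallel portion of a workload, the serial fraction bounds the overall speedup. A minimal illustration (the fractions below are made up for the example, not measured from any game):

```python
# Amdahl's law: overall speedup when only a fraction p of the work
# parallelises, with the remaining (1 - p) stuck on serial hardware.

def amdahl_speedup(parallel_fraction: float, n_workers: int) -> float:
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_workers)

# Even with effectively unlimited GPU lanes, a task that is 30% serial
# can never run much more than ~3.3x faster overall:
print(amdahl_speedup(0.7, 1_000_000))
```

This is why offloading to the GPU only helps code that is overwhelmingly parallel; the serial, latency-sensitive remainder still needs a strong CPU.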
 
HAHAHAHA........... No.

The only thing Cell has going for it is high floating-point output, but not much code is even close to floating-point bound. Everything else in Cell is very old and would be crushed by a modern processor.

Anyone saying that GPGPU can make up for a weak CPU in the WiiU does not understand much about GPGPUs or CPUs. GPUs are only good at highly parallel tasks that are insensitive to latency; they cannot help the CPU with any other tasks.

Wasn't Cell good at offloading certain tasks usually done by the GPU, freeing up precious GPU resources for other things? I thought this was pretty established fact, and was the main strength of the Cell and its SPUs, and the main reason PS3 exclusives like GoW3, UC3, and Killzone 2/3 are still the best-looking console games out there.

Are the modern x86 CPUs (the Steamroller one rumored in this topic) going to be better than Cell at doing these types of tasks to help the GPU?
 

missile

Member
I could follow most of this but man do you go low. My inferior high-level brain can't take much more ...
Haven't started. I just thought of bringing up an example of how such a PDE for a fully destructible environment is discretized, resulting in a sparse linear system of thousands of equations to be solved by a sparse linear solver, showing the issue of non-contiguous memory updates stemming from the non-linear data structure inherent to such solvers, which depend on efficient DMA'ing to reach any high GFLOP rate. But I saved that one, knowing it would go off-topic way too much.
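To give a flavour of the kind of system being described, here is a toy Jacobi iteration on a 1-D Poisson-style tridiagonal problem. It is purely illustrative (real destructible-environment solvers are vastly larger and more irregular), but even here each update reads scattered neighbour values, the non-contiguous access pattern that makes efficient DMA'ing on the SPEs hard:

```python
# Minimal Jacobi iteration for the 1-D Poisson-style tridiagonal system
#   2*x[i] - x[i-1] - x[i+1] = b[i]
# Illustrative only: real solvers handle millions of irregular unknowns.

def jacobi_poisson_1d(b, iterations=5000):
    n = len(b)
    x = [0.0] * n
    for _ in range(iterations):
        new = [0.0] * n
        for i in range(n):
            # Neighbour reads: the scattered accesses a sparse solver forces.
            left = x[i - 1] if i > 0 else 0.0
            right = x[i + 1] if i < n - 1 else 0.0
            new[i] = (b[i] + left + right) / 2.0  # diagonal entry is 2
        x = new
    return x

# Tiny system whose exact solution is x = [1, 1, 1, 1]:
x = jacobi_poisson_1d([1.0, 0.0, 0.0, 1.0])
```

On an SPE, the `x[i-1]`/`x[i+1]` reads have to be staged into local store via DMA, which is exactly where the engineering effort goes.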

... I'm still in the student stage (almost done though!!), but overall there is more of a focus on making managed languages like C# the standard, where a lot of the work is done for you, instead of teaching all of the solid fundamentals we should learn. The walls we are running into now are very much like you say: less and less concern is being placed on data, in favor of a more uniform code-based approach. I'm probably in no place to criticize, but from the outside looking in, I don't think developers had a strong grasp of their code and of where it could be broken down into a model that isn't the Wintel thread-centric norm. I think this is what stalled a lot of advancement and choked performance.
Indeed.

These days, 9 out of 10 masters of computer science can't write a 'hello world' in assembly. Aren't they supposed to know how the stuff works? It's quite interesting how some of those people discuss the latest buzzwords without having a clue how things work underneath, yet claim to know which system is better. Go to a computer graphics course at most universities and you will find that 9 out of 10 graduates can't raster a line on the screen. But they are eager to build a new 3D engine and the latest of whatever planet earth hasn't surfaced yet. Let's finish this paragraph with a saying of Mrs. Verda Spell of Beaumont from the '80s: "He who cannot in 64k program cannot in 512k."
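For the record, rastering a line needs nothing more than integer adds and compares; a sketch of the classic Bresenham algorithm:

```python
# Bresenham's line algorithm: rasters a line between two integer points
# using only integer addition, subtraction, and comparison.

def bresenham(x0, y0, x1, y1):
    points = []
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy  # running error term deciding when to step in y (or x)
    while True:
        points.append((x0, y0))
        if x0 == x1 and y0 == y1:
            break
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy
    return points

print(bresenham(0, 0, 4, 2))  # [(0, 0), (1, 1), (2, 1), (3, 2), (4, 2)]
```

The whole point of the error-term formulation is that it avoids the floating-point slope arithmetic a naive implementation would use.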


... I have a question of my own: why is Sony dumping the Cell after spending so much time and money on it? Wouldn't boosting the clocks of the CPU and adding the 2-4GB of RAM along with the 1.8 TFLOP GPU make it just as powerful, rather than starting again with an AMD CPU? ...
In 2007 the Cell chip already ran at 6GHz on 65nm CMOS SOI technology. Scale that down to 32nm or 22nm and you'd get a pretty efficient, low-heat/low-wattage processor. So that's not the issue. If you trace the history of the Cell processor's creation, you may come to realize that there were many issues at IBM. A new revision of the Cell processor seems to be impossible, since many of IBM's lead members of the STI Cell team left the company after dumping the chip into silicon. The issue was that the IBMers on the STI team were put under the additional pressure of creating the Xbox 360 processor as well. This led to many conflicts. The executive manager of the STI team, Dr. Akrout, a leading circuit designer who had worked for IBM since 1982 and led the PowerPC teams for the Apple Macintosh, Nintendo's GameCube, etc., was a star at IBM and a visionary. However, he had had enough of the harsh rule of IBM's directors and all the shifting politics of the company leadership during that time. So he left. After Cell was shipped to Sony, the PowerPC core lead, the SPE lead, and many others on the STI team left the company.

Btw; Did you know that Dr. Akrout went to AMD?

So the enhancements the Cell processor was designed for will probably never be realized. And Sony is not able to do it on their own; it would be a complete restart to gather a team again to crank up the Cell processor. What could be done, however, is stacking the Cell processor, just like with the Xbox 360 PowerPC processor. Four Cells could do the job.

However, IBM might not be a reliable partner for Sony for the time being. Perhaps Sony found a good partner in AMD for the next generation, one offering them quite a lot of things. So Sony might have scrapped the Cell chip altogether in favor of the new offerings. I don't know, I am just speculating. Personally, I would like to see an enhanced Cell, well, four of them, within the PS4, combined with AMD for graphics.
 

Triple U

Banned
HAHAHAHA........... No.

The only thing Cell has going for it is high floating-point output, but not much code is even close to floating-point bound. Everything else in Cell is very old and would be crushed by a modern processor.

Anyone saying that GPGPU can make up for a weak CPU in the WiiU does not understand much about GPGPUs or CPUs. GPUs are only good at highly parallel tasks that are insensitive to latency; they cannot help the CPU with any other tasks.

He isn't wrong. Cell, namely the SPEs, are still beasts when it comes to 3D maths, and probably still among the best when it comes to decoding video. As you said, in regard to FP, Cell smokes any consumer or even enthusiast-level CPU. A simple Linpack run puts the PS3's Cell at about 73 GFLOPS, compared to roughly 40 GFLOPS for an i7.

So yeah, like he said, some conventional CPUs can't keep pace with Cell at certain things.
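For context on those Linpack figures, the Cell's theoretical single-precision peak is easy to derive: each SPE can issue a 4-wide fused multiply-add (8 FLOPs) per cycle. A quick sketch, assuming the PS3's 3.2 GHz clock and noting that games only get 6 of the 8 SPEs; measured Linpack results, as always, land well under the theoretical peak:

```python
# Theoretical single-precision peak for the Cell's SPEs:
# 4 SIMD lanes * 2 FLOPs (fused multiply-add) per lane per cycle.

def spe_peak_gflops(n_spes: int, clock_ghz: float = 3.2) -> float:
    return n_spes * 4 * 2 * clock_ghz

print(spe_peak_gflops(6))  # 6 SPEs available to PS3 games: 153.6 GFLOPS
print(spe_peak_gflops(8))  # full 8-SPE Cell: 204.8 GFLOPS
```

The quoted ~73 GFLOPS Linpack number is thus roughly half the 153.6 GFLOPS peak of the six game-accessible SPEs, which is a respectable efficiency for that benchmark.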
 

missile

Member
He isn't wrong. Cell, namely the SPEs, are still beasts when it comes to 3D maths, and probably still among the best when it comes to decoding video. As you said, in regard to FP, Cell smokes any consumer or even enthusiast-level CPU. A simple Linpack run puts the PS3's Cell at about 73 GFLOPS, compared to roughly 40 GFLOPS for an i7.

So yeah, like he said, some conventional CPUs can't keep pace with Cell at certain things.
And btw; has any other processor beaten Cell's 10 FO4 design, yet?
 

McHuj

Member
Are the modern x86 CPUs (the Steamroller one rumored in this topic) going to be better than Cell at doing these types of tasks to help the GPU?

To me that's a backwards approach. Just put in a powerful enough GPU that you don't need to offload stuff to the CPU and let the CPU handle all the other non-GPU related processing.
 

coldfoot

Banned
To me that's a backwards approach. Just put in a powerful enough GPU that you don't need to offload stuff to the CPU and let the CPU handle all the other non-GPU related processing.
What about physics processing, which is suited to Cell and GPUs but not x86 CPUs?
 

McHuj

Member
What about physics processing, which is suited to Cell and GPUs but not x86 CPUs?

Run them on the GPU. Provide a sufficiently powerful GPU to handle the work. There's only so much silicon budget available in a console, and I think you're better off spending it on the GPU.
 

coldfoot

Banned
Run them on the GPU. Provide a sufficiently powerful GPU to handle the work. There's only so much silicon budget available in a console, and I think you're better off spending it on the GPU.
Not only is the GPU not a perfect fit for those tasks, you'd inevitably degrade graphics: running them on the GPU would tie up an inordinately high number of transistors compared to running them on the CPU, since a GPU's units are all interconnected and you wouldn't be able to use the units that aren't involved in the physics calculations anyway.

The SPEs are small enough to be included as floating-point processors in a next-gen console. Certainly a better use of silicon area than x86 decoding hardware in a console.
 
And btw; has any other processor beaten Cell's 10 FO4 design, yet?

In the (9/2010) Sony 1PPU4SPU patent (Sony filed a patent for a "Method and apparatus for achieving multiple processing configurations using a multi-processor system architecture") is this: "the PPU is a new ground up implementation of core with extended pipelines to achieve a low FO4 to match the SPUs." IBM and Sony must have been working on improving the PPU to work with SPUs in 2010.

That's an awful lot of work to just drop, and if they were dropping it, then why publish the patent in Dec 2010? We know 4000-chassis (my guess) and next-generation work started 9/2010. I'm not disagreeing with the PS4 being primarily an x86 Fusion part, but the PS3 4000 chassis and the PS4 may contain 1PPU4SPU modules instead of Cell. Lots of uses for SPUs.
 

Triple U

Banned
I do remember there being talk of a Cell 2 with a much better PPU. I also know Toshiba did work on reengineering the SPE.

I think in the end, most parties involved just decided to go their own directions with their work instead of simply continuing the Cell line.

Edit: Interestingly enough, GHS just released a full software suite for Cell a few months ago.

http://ghs.com/news/20120207_IBM_cell.html
 
I do remember there being talk of a Cell 2 with a much better PPU. I also know Toshiba did work on reengineering the SPE.

I think in the end, most parties involved just decided to go their own directions with their work instead of simply continuing the Cell line.

Edit: Interestingly enough, GHS just released a full software suite for Cell a few months ago.

http://ghs.com/news/20120207_IBM_cell.html

So, is there any chance we'll be seeing a Cell 2 with more PPU and SPEs?
 

Nightbringer

Don´t hit me for my bad english plase
Is it possible to stack CPU, GPU and CBEA in the same package? I am asking because I believe Durango and Orbis have the same configuration, with the exception of an extra layer for BC.
 

Triple U

Banned
So, is there any chance we'll be seeing a Cell 2 with more PPU and SPEs?

I doubt it, honestly. I don't even think STI is together anymore. I believe Cell and the respective companies' contributions to the project will simply be integrated into their future projects. You see Toshiba with SPURS, IBM said they were gonna integrate some things into POWER, etc.
 
Anyone saying that GPGPU can make up for a weak CPU in the WiiU does not understand much about GPGPUs or CPUs. GPUs are only good at highly parallel tasks that are insensitive to latency; they cannot help the CPU with any other tasks.

I'm trying to not continually divert the discussion in this thread, but this comment is too general to be properly applicable IMO.

So, is there any chance we'll be seeing a Cell 2 with more PPU and SPEs?

No.
 
Btw; Did you know that Dr. Akrout went to AMD?

So the enhancements the Cell processor was designed for will probably never be realized. And Sony is not able to do it on their own; it would be a complete restart to gather a team again to crank up the Cell processor. What could be done, however, is stacking the Cell processor, just like with the Xbox 360 PowerPC processor. Four Cells could do the job.

Is it possible that a "Cell" spiritual successor was brought over with Dr. Akrout? I mean, as someone stated, there were many problems with the Cell, but performance wasn't one of them. Given that, along with power consumption and HSA, AMD should be a fantastic company to undertake this.

Run them on the GPU. Provide a sufficiently powerful GPU to handle the work. There's only so much silicon budget available in a console, and I think you're better off spending it on the GPU.

GPUs run a lot hotter than a Cell ever will. Not only that, but they are extremely power-hungry in comparison and will never run at the same efficiency as the Cell would.
 

deadlast

Member
Is it possible that a "cell" spiritual successor was brought over with Dr. Akrout?
I think he was out of IBM before there were any plans for a spiritual successor to the CELL. However, this is really interesting in his bio from the AMD site:


He later became vice president of Entertainment and Embedded Processor Development and was responsible for the Cell Project developed through a Sony/Toshiba/IBM partnership as well as Xbox 360 processor development for Microsoft and embedded PowerPC processors cores for SoC.

I have a feeling the next box and ps4 are going to be very similar.
 
I think he was out of IBM before there were any plans for a spiritual successor to the CELL. However, this is really interesting in his bio from the AMD site:




I have a feeling the next box and ps4 are going to be very similar.

The processor in the Xbox 360 was actually similar in many ways to the CELL design. This has been known for years now. Read the Xbox 360 wiki.
 

AB12

Member
A 6850 draws more power and creates more heat than an equivalent Pitcairn. A 6970 was and still is a ludicrous proposition in a console, and a full blown Tahiti still draws too much for a typical-sized console.

This console is most likely coming Q4 2013. The 8xxx series is a GCN refresh, and the (legitimate) target specs indicate a 1.8 TFLOP GCN-based GPU with 18 CUs.

All I am suggesting is that they can use modified versions of currently existing technology. For example, last year the power consumption of a 6970 was around 200W, but the 7870 is about 115W. By the end of this year or early next year, an 8850/8870 with power consumption in the low 100s is a possibility.
 

SonnyBoy

Member
Wow, I came in here to ask if there would be a CELL successor in the next PS. So basically, Sony wasted their money on CELL? Yikes!
 