
vg247-PS4: new kits shipping now, AMD A10 used as base, final version next summer

iceatcs

Junior Member
Well, I won't follow the leakers too much. Remember, there's no such thing as a free insider; anyone can simply type whatever they want to say. Following the docs, PR and patents from MS, Sony and AMD is the best way to get an idea of what the hardware will have.
 
I should say intelligent "assumptions" based on the VG Leaks document and a few rumblings here and there.

I hope that if Sony has learned anything from MS this gen, it's not to launch with a graphics architecture that is already two years old when enhancements arrive in the same year as the launch (RSX was NV40-class silicon while G80, which roughly doubled performance thanks to unified shaders, was already on the street at the PS3's launch).
 
It's worth noting, it seems to me, that PS4 is shaping up as the more expensive product.

*If* PS4 = 4 Steamroller cores vs Durango's 8 Jaguar cores, the former is quite a lot more die area (too lazy to look it up right this second).

If PS4 is a Pitcairn and Durango is a Cape Verde, the former is 230mm² and the latter 130mm². The former is almost twice as big.

Durango may feature SRAM (but 32MB should be pretty small) and the supposed "special sauce".

GDDR5 in PS4 should be handily more expensive, as would a 256-bit bus on the GPU if that's the case.

If MS can get performance on par with PS4 in a product they can sell for $50 or $100 less at street price, that's a win too. But that's all just basic conjecture at this point.

It's also possible PS4 could be more expensive, and just better.

We don't know what's getting packed in with these things. Any cost advantage they might have on silicon could evaporate if they're packing in Kinect 2.0 or a touch screen controller.

Until we know final specifications and measurements it would be difficult to gauge relative cost anyway. For example, Sony could end up saving lots of money in the long run if they can get a high bandwidth stacked DDR4 on interposer memory design going. More expensive at the start, but potentially much cheaper over time than DDR3 on a 256 bit bus on the motherboard and the die budget for eDRAM. DDR3 is nearing end of life, DDR4 prices will plummet as production ramps up.
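Just to put rough numbers on the bandwidth side of that trade-off (back-of-the-envelope only; the bus widths and transfer rates below are my own assumptions for illustration, not leaked figures):

Code:
# peak theoretical bandwidth in GB/s = (bus width in bits / 8) * effective transfer rate in GT/s
def peak_bandwidth_gbs(bus_width_bits, transfer_rate_gts):
    return bus_width_bits / 8 * transfer_rate_gts

print(peak_bandwidth_gbs(256, 2.133))  # 256-bit DDR3-2133 on the motherboard: ~68 GB/s
print(peak_bandwidth_gbs(256, 5.5))    # 256-bit GDDR5 at 5.5 GT/s: ~176 GB/s
print(peak_bandwidth_gbs(1024, 0.8))   # 1024-bit stacked wide-IO at 0.8 GT/s: ~102 GB/s

Either route can reach respectable bandwidth; the question is which one is cheaper to build and to shrink over the life of the console.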
 

gofreak

GAF's Bob Woodward
Hmmm, so there's no performance increase? Well then it's more sensible for Sony to have gone with GCN as opposed to GCN2. It can likely be had cheaper, in greater quantity, and it's a tried and tested product. The only bragging rights to be had by having GCN2 in your console is saying that you have a newer iteration of what is essentially the same product.

Going by the floating point numbers presented there's no major rejigging on the ALU side. But refreshes like this can see changes in cache, maybe changes in ROPs (although in the latter case if MS is using a daughter-die eDRAM config they may well be using different ROPs than in Sea Islands anyway). So you might get a certain degree of pound-for-pound performance improvement, or improvement in certain types of task or operation. Or some new things that make certain things easier. Enough, on the performance side, to offset a 50% CU difference in the general case? I would guess not.
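To put that 50% CU difference in crude numbers (a sketch only: GCN counts 64 ALUs per CU at 2 FLOPs per clock, the 12 vs 18 CU split is the rumoured one, and the 800MHz clock is just my assumption):

Code:
# peak single-precision throughput for a GCN-style part
def peak_gflops(cus, clock_ghz, alus_per_cu=64, flops_per_alu=2):
    return cus * alus_per_cu * flops_per_alu * clock_ghz

print(peak_gflops(12, 0.8))  # rumoured Durango-like config: ~1229 GFLOPS
print(peak_gflops(18, 0.8))  # rumoured Orbis-like config: ~1843 GFLOPS

Per-CU tweaks in a refresh would have to be worth roughly that entire gap to cancel it out, hence my guess above.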

But there's a big factor in these kinds of revisions that you haven't touched on - power consumption. Heat. Performance per watt. GCN2 might be more about improving power efficiency so that AMD can pack more onto the same power draw in their PC GPUs.

Now by the sounds of it, MS isn't exploiting this to pack more CUs onto their chip. But a benefit for a console would potentially be a less power-hungry and cooler (cheaper) chip than the same type of chip under a GCN architecture. It may not be as glamorous a goal as performance but for a console designer it is one that weighs on them.
 

RaijinFY

Member
It's worth noting, it seems to me, that PS4 is shaping up as the more expensive product.

*If* PS4 = 4 Steamroller cores vs Durango's 8 Jaguar cores, the former is quite a lot more die area (too lazy to look it up right this second).

If PS4 is a Pitcairn and Durango is a Cape Verde, the former is 230mm² and the latter 130mm². The former is almost twice as big.

Durango may feature SRAM (but 32MB should be pretty small) and the supposed "special sauce".

GDDR5 in PS4 should be handily more expensive, as would a 256-bit bus on the GPU if that's the case.

If MS can get performance on par with PS4 in a product they can sell for $50 or $100 less at street price, that's a win too. But that's all just basic conjecture at this point.

It's also possible PS4 could be more expensive, and just better.

It's 212mm² for Pitcairn.
 

PaulLFC

Member
On B3D he said that Orbis is using the 8830/8850 with 50% more CUs - those are Hainan GCN2 cards and not some rebranded OEM cards. Furthermore, he added that Durango will use 8750/8770 cards, which are Bonaire GCN2 models.

But I guess nobody cares, since Proelite just posts various things so nobody catches onto his leaking and insider information...
I remember reading those model numbers in a post too, so it seems like that was definitely a rumour at one point (still is?).

Can't really keep up with everything, there are so many differing rumours at the moment - part of waiting for next-gen consoles though, I guess!
 

Ashes

Banned
I remember reading those model numbers in a post too, so it seems like that was definitely a rumour at one point (still is?).

Can't really keep up with everything, there are so many differing rumours at the moment - part of waiting for next-gen consoles though, I guess!

Yeah, I'm probably going to unsubscribe from the Chinese rumour thread. No reason for me to bump it when I don't really buy it...
 
Sweetvar26's info was also old; in fact, I think it was older than the VGLeaks article, from what I remember.
We also know that Sony has beefed up the GPU and added more RAM since then.
HWinfo.com's Oct 8 2012 update listing Thebe and Kryptos pretty much locks in the hardware design. Both Jaguar, both @ 1.6 GHz with a turbo mode of 2.4 GHz.

Also, for those who think we might see memory clocks above 2 GHz: LPM silicon gets hot at clock speeds over 2 GHz. Efficiency is best for everything below 2 GHz - memory controller, CPUs, GNB (southbridge/northbridge, integrated controller) - unless it's using a different process or node size. Pennar was ported to GF from TSMC, and its GNB was 20nm and likely had 2 128-bit wide memory channels.

Sweetvar26 said they moved from Steamroller to Jaguar considering the 10 year life. 3rd party AMD roadmaps seem to indicate Jaguar and Jaguar's replacement will be used for performance and mobile designs in the near future (once everyone stops writing single threaded code).

Next year 128-bit wide memory will be an industry standard, then memory will move to stacked ultra-wide IO in mobile, and PCs will follow shortly after. Current AMD designs reflect the memory that is available in 2012, but game consoles have the volume to get custom designs.
 

Ashes

Banned
HWinfo.com's Oct 8 2012 update listing Thebe and Kryptos pretty much locks in the hardware design. Both Jaguar, both @ 1.6 GHz with a turbo mode of 2.4 GHz.

Also, for those who think we might see memory clocks above 2 GHz: LPM silicon gets hot at clock speeds over 2 GHz. Efficiency is best for everything below 2 GHz - memory controller, CPUs, GNB (southbridge/northbridge, integrated controller) - unless it's using a different process or node size. Pennar was ported to GF from TSMC, and its GNB was 20nm and likely had 2 128-bit wide memory channels.

Sweetvar26 said they moved from Steamroller to Jaguar considering the 10 year life. 3rd party AMD roadmaps seem to indicate Jaguar and Jaguar's replacement will be used for performance and mobile designs in the near future (once everyone stops writing single threaded code).

Next year 128-bit wide memory will be an industry standard, then memory will move to stacked ultra-wide IO in mobile, and PCs will follow shortly after. Current AMD designs reflect the memory that is available in 2012, but game consoles have the volume to get custom designs.

Can't find which thread I posted it in now, but I think Microsoft had to go with the 8-core design [I reckon something with two Jaguar quad-cores or similar].

As such, I don't think Sony will go for a quad-core Jaguar. The performance isn't there. Not to mention that the power levels, like you say, are designed for 25W and lower.

If Sony are buying something close to off-the-shelf parts, it won't be an overclocked Jaguar; it'd just be cheaper to go Steamroller.

edit: not that it holds any clout, but here's my reasoning. Tear it up if you will.
 
There was also that Japanese report that suggested Sony was carrying forward 2 different APU designs, one with Jaguar cores, and another with Steamroller cores. The choice between them would supposedly come down to whether they could get on the 28nm fabrication schedule at Global Foundries in time for a launch in 2013, in which case they'd use the Steamroller design, otherwise they'd have TSMC fabricate the Jaguar version for a launch in late 2013. That's an interesting possibility that would explain some of the back and forth rumors about the Orbis CPU.

I do think it would be 8 Jaguar cores, just like Durango if they go that direction, or 4 Steamroller cores if not.
 

Proelite

Member
There was also that Japanese report that suggested Sony was carrying forward 2 different APU designs, one with Jaguar cores, and another with Steamroller cores. The choice between them would supposedly come down to whether they could get on the 28nm fabrication schedule at Global Foundries in time for a launch in 2013, in which case they'd use the Steamroller design, otherwise they'd have TSMC fabricate the Jaguar version for a launch in late 2013. That's an interesting possibility that would explain some of the back and forth rumors about the Orbis CPU.

I do think it would be 8 Jaguar cores, just like Durango if they go that direction, or 4 Steamroller cores if not.

This is the first time I've heard of this report, and it makes a lot of sense given what we know so far.

I don't think Steamroller will be ready by the time PS4 launches.
 
My understanding is the APU + GPU combo is only in dev kits. The final will supposedly be one System on Chip with the combined power of both.
Creating the confusion are two issues:

1) To support low-power modes, CPU packages can be turned off, and a smaller-GPU-plus-larger-GPU design lets the larger one be turned off to conserve power. This is part of AMD's mobile designs.

2) Graphics pre-emption: switching from compute to graphics mode and back again takes time flushing and saving registers, and a GPU takes much longer to do this than a CPU. Graphics pre-emption is both a software and hardware feature that speeds up this process and is needed if there is only one GPU. Graphics pre-emption is a 2014 HSA feature and will be part of VI 9000 series GPUs. The PS4 and Xbox 3 will have 8000 series GPUs.

The most cost-effective design would be a single SoC on an interposer with common UMA stacked memory, but AMD recommends APU + GPU until the 2014 designs that contain graphics pre-emption and context switching.

Best guess is that CPU and GPU designs @ 28nm are being used, as that is a known and stable process, but 2014 designs allowing full HSA are being ported from 20nm to 28nm. So an 8000 series GPU will have features found in a 9000 series GPU. Cost trumps.
 

jaosobno

Member
Could someone refresh my memory regarding Steamroller-->Jaguar switch?

If they went Jaguar, is it still 4 cores? And if it's 4 cores, is it 8 threads?

Or is it 8 cores, 16 threads like Durango?
 

leadbelly

Banned
Microsoft has done a lot to alienate me from the Xbox brand this gen. It's funny, I was going to say I am far more interested in the PS4 this time around, but then looking back on it, I was far more interested in the PS3 as well. The reason I bought a 360 was simply to scratch my next-gen itch. It was the first console out of course.
 

gofreak

GAF's Bob Woodward
Could someone refresh my memory regarding Steamroller-->Jaguar switch?

If they went Jaguar, is it still 4 cores? And if it's 4 cores, is it 8 threads?

Or is it 8 cores, 16 threads like Durango?

Both Jaguar and Steamroller are single-threaded cores, AFAIK.

I don't think anyone has reported/leaked how many cores Orbis is using since the reports about a switch from Steamroller->Jaguar emerged.
 

jaosobno

Member
Both Jaguar and Steamroller are single-threaded cores, AFAIK.

I don't think anyone has reported/leaked how many cores Orbis is using since the reports about a switch from Steamroller->Jaguar emerged.

Thx, but wasn't it mentioned that Durango had 16 threads and is using Jaguar cores? On the other hand, IIRC, it was mentioned in certain circles that Jaguar's pipeline was too short to go SMT. So that would mean that Orbis and Durango have 4 or 8 cores/threads (unless there was some unknown modification done to pipelines that would enable SMT for Jaguar).
 

Nachtmaer

Member
Could someone refresh my memory regarding Steamroller-->Jaguar switch?

If they went Jaguar, is it still 4 cores? And if it's 4 cores, is it 8 threads?

Or is it 8 cores, 16 threads like Durango?

Steamroller uses what AMD calls modules. Each module contains two cores that share certain parts (like the FPU). So they aren't really full-fledged cores like, say, Phenom's, but it's nothing like SMT (Hyper-Threading) either. According to AMD, one module reaches about 70-80% of the performance of a real dual core; Intel's Hyper-Threading gains go up to about 40%.
[image: AMD Steamroller CPU design diagram]


Jaguar just uses regular cores and goes up to four in one Compute Unit.
[image: AMD Jaguar compute unit diagram]


So if either MS and/or Sony are going for 8 Jaguar cores, that means they're using two CUs. If Sony plans on going with Steamroller, which supposedly means four cores, it'd be using two modules.

I probably made it more confusing now.
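If the arithmetic helps more than the words, here's the same idea using the ballpark figures above (everything relative to one full core of the same design):

Code:
two_full_cores = 2 * 1.0    # two independent cores: 2.0
one_module     = 2 * 0.8    # one module = two "cores" sharing an FPU, ~80% each: ~1.6
one_core_smt   = 1.0 + 0.4  # one core plus a Hyper-Threading thread worth ~40%: ~1.4
print(two_full_cores, one_module, one_core_smt)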
 
Could someone refresh my memory regarding Steamroller-->Jaguar switch?

If they went Jaguar, is it still 4 cores? And if it's 4 cores, is it 8 threads?

Or is it 8 cores, 16 threads like Durango?
AMD uses a crossbar (Xbar) switch for the CPU packages. There are slots for 4 CPU packages. A Jaguar CPU package contains 4 Jaguar CPUs and 2MB of shared L2 cache. A Steamroller CPU package contains 2 Steamroller CPUs. All AMD CPUs are single-threaded.

Rumors and Sweetvar26 say 2 Jaguar CPU packages (8 Jaguar CPUs), which leaves 2 slots on the Xbar switch free... Jaguar is small and cheap, so why aren't they using 4 CPU packages or 16 Jaguar CPUs?

1) They don't need more than 2 Jaguar packages
2) Redundancy; there may be 3 or 4 CPU packages, but defective CPU packages will be turned off
3) Some other CPU packages will be included for BC
4) Some other CPU IP will be included
 

DBT85

Member
Microsoft has done a lot to alienate me from the Xbox brand this gen. It's funny, I was going to say I am far more interested in the PS4 this time around, but then looking back on it, I was far more interested in the PS3 as well. The reason I bought a 360 was simply to scratch my next-gen itch. It was the first console out of course.

This is one of the many things that will be interesting to see when the PS4/XB3 come out.

Are people that initially bought a 360 first going to buy the XB3 first? Are people that waited for the PS3 going to buy a PS4 first? Are people on both sides simply going to buy whatever console comes out first? How important are existing online accounts going to be to the next generation?

There is lots of speculation about what exactly caused Sony to lose so much ground to Microsoft in the US and, to a degree, in the UK. The price, the release date, the launch games, the online, the multiplat performance - all were in Microsoft's favour last time out. I just wonder if Sony would be stupid enough to let them have so many bonus points for the next generation.

To see what happens in the next round is going to be really interesting. Will Microsoft continue and improve their domination of the US and UK? Will they drop the ball somehow and give Sony a chance? Will Sony match them on price and hardware and regain parity or even gain the advantage? Will Sony cock something up again and lose even more ground to Microsoft? Is the USA going to become for Sony what Japan is for Microsoft?
 

jaosobno

Member
Great, thx for the clarification. I'm familiar with AMD's module architecture; however, I've only studied Jaguar's architecture a little, so I'm not overly familiar with it.
 

PaulLFC

Member
This took far too long to format properly; in retrospect I should have just taken a screenshot of Wikipedia.


Currently known specs of the four rumoured cards for next gen - either the 8750 or 8770 is rumoured for Durango, and either the 8830 or 8850 for Orbis.

Code:
Model                   Radeon HD 8750   Radeon HD 8770   Radeon HD 8830   Radeon HD 8850
Launch                  2013             2013             2013             2013
Codename                Bonaire Pro      Bonaire XT       Hainan LE        Hainan Pro
Fab (nm)                28               28               28               28
Transistors (M)         1700             2000             3200             3400
Die size (mm²)          135              160              230              270
Bus interface           PCIe 3.0 ×16     PCIe 3.0 ×16     PCIe 3.0 ×16     PCIe 3.0 ×16
Memory (MiB)            1024             2048?            1024?            2048
Memory type             GDDR5            GDDR5            GDDR5            GDDR5
Bus width (bit)         192              192              256              256
Core clock (MHz)        Unknown          Unknown          Unknown          925-975
Memory clock (MHz)      Unknown          Unknown          Unknown          Unknown
Core config             640:40:16        768:48:16        1024:64:32       1536:96:32
Fillrate (GP/s / GT/s)  Unknown          Unknown          Unknown          31.2 / 93.6
Bandwidth (GB/s)        Unknown          Unknown          Unknown          192
GFLOPS (SP)             Unknown          Unknown          Unknown          2990
GFLOPS (DP)             Yes (rate n/a)   Yes (rate n/a)   Yes (rate n/a)   187.2
TDP (W, idle/max)       Unknown          Unknown          Unknown          10 / 130
GFLOPS/W                Unknown          Unknown          Unknown          23
Release price (USD)     Unknown          Unknown          Unknown          $199

- 87xx-89xx are based on GCN2 (Graphics Core Next 2) architecture.
- HD 85xx-89xx models include DirectX 11.1, OpenGL 4.2 and OpenCL 1.2
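As a quick sanity check on the only fully-filled row (just multiplying the shader count from the config string by 2 FLOPs per clock and the listed top clock; the 900MHz in the second line is made up, since the 8830's clock is listed as Unknown):

Code:
def gflops_from_config(config, clock_mhz):
    shaders = int(config.split(":")[0])   # "1536:96:32" -> 1536 shaders
    return shaders * 2 * clock_mhz / 1000.0

print(gflops_from_config("1536:96:32", 975))  # ~2995, close to the 2990 GFLOPS listed for the 8850
print(gflops_from_config("1024:64:32", 900))  # a hypothetical 900MHz 8830 would land around 1843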
 

PaulLFC

Member
How can the 8830 be 230mm² and the 8850 270mm² when they both use the same chip?
Don't blame me, blame Wikipedia


It was just a copy/paste job; I don't have that level of knowledge about GPUs. If anyone knows more reliable figures for any of the stats, feel free to correct them and I'll update the table.
 

Ashes

Banned
Don't blame me, blame Wikipedia


It was just a copy/paste job; I don't have that level of knowledge about GPUs. If anyone knows more reliable figures for any of the stats, feel free to correct them and I'll update the table.

That's from the rumoured table that got shot down as fake.

FYI, this is probably why using wiki as a source is not always a good idea.
 

gofreak

GAF's Bob Woodward
This took far too long to format properly; in retrospect I should have just taken a screenshot of Wikipedia.


Currently known specs of the four rumoured cards for next gen - either the 8750 or 8770 is rumoured for Durango, and either the 8830 or 8850 for Orbis.

I think this is just based on Proelite's comment.

Firstly he didn't mean these cards would go into next-gen consoles...custom chips based on the same or preceding lines will.

Second, Proelite has simultaneously said that Orbis is based on GCN, the preceding line, so he's sort of contradicting himself.

However, I guess he means more to give a rough idea of what the chips would be like cast in 8xxx terms. 12 CUs would be like the 8770 in terms of configuration. 18 CUs would be like something north of the 8830 in terms of core config.

edit - and if those configs aren't right per the above comments, and Proelite was using them to make a comparison, you should probably throw them out anyway :)
 
How can the 8830 be 230mm² and the 8850 270mm² when they both use the same chip?

That whole chart looks bogus. The Pro and XT Bonaire chips should be the same size as well. We know AMD takes the same chip and salvages parts to create different models, so the transistor and die size counts would not be arbitrarily different like they are listed there.
 

Nachtmaer

Member
That's from the rumoured table that got shot down as fake.

FYI, this is probably why using wiki as a source is not always a good idea.

Exactly. I still don't get why some people keep referencing that one table. A chip that is 20% bigger which is clocked higher, uses less power and costs less than the current 7800s? What a joke.

The only reasonable thing about all this is that I can see the 8800's die being about 260-270mm², or perhaps even less.
 

Ashes

Banned
AMD uses a crossbar (Xbar) switch for the CPU packages. There are slots for 4 CPU packages. A Jaguar CPU package contains 4 Jaguar CPUs and 2MB of shared L2 cache. A Steamroller CPU package contains 2 Steamroller CPUs. All AMD CPUs are single-threaded.

Rumors and Sweetvar26 say 2 Jaguar CPU packages (8 Jaguar CPUs), which leaves 2 slots on the Xbar switch free... Jaguar is small and cheap, so why aren't they using 4 CPU packages or 16 Jaguar CPUs?

1) They don't need more than 2 Jaguar packages
2) Redundancy; there may be 3 or 4 CPU packages, but defective CPU packages will be turned off
3) Some other CPU packages will be included for BC

For the context, yes. But I think the FX processors (even with the Piledriver architecture) are multithreaded, such as the FX-4300.
 
They are only multithreaded if you want to call each Bulldozer/Piledriver/Steamroller module a single core. AMD's designs are not comparable to the SMT IBM uses or Intel's hyperthreading. The way AMD counts cores, each one only supports a single thread.
 

Ashes

Banned
They are only multithreaded if you want to call each Bulldozer/Piledriver/Steamroller module a single core. AMD's designs are not comparable to the SMT IBM uses or Intel's hyperthreading. The way AMD counts cores, each one only supports a single thread.

I guess that makes sense, yes. Funnily enough, I've been looking at transistor counts today. What the hell happened with Bulldozer? It just ballooned into the stratosphere. Brazos, on the other hand, is so tiny.
 
Bulldozer on PC has tons of L2 AND L3 cache (16MB of SRAM total), which makes up a ton of the transistor count. The OoOE scheduling hardware is really complicated as well, compared to Jaguar.
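Rough arithmetic on that, assuming standard 6-transistor SRAM cells and ignoring tag/ECC overhead:

Code:
cache_bytes = 16 * 1024 * 1024   # 8MB L2 + 8MB L3 on the desktop FX parts
cache_bits  = cache_bytes * 8
transistors = cache_bits * 6     # 6T SRAM cell
print(transistors / 1e6)         # ~805 million transistors just in the cache arrays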
 

mrklaw

MrArseFace
Bulldozer on PC has tons of L2 AND L3 cache (16MB of SRAM total), which makes up a ton of the transistor count. The OoOE scheduling hardware is really complicated as well, compared to Jaguar.

Is there a practical reason you'd go for 4 Bulldozers instead of 8 Jaguars? From a gaming context.
 
Bulldozer could be faster in situations where you are limited by a single thread's performance. Some kinds of code could just flat out be faster clock for clock on Bulldozer.
 

Ashes

Banned
Is there a practical reason you'd go for 4 Bulldozers instead of 8 Jaguars? From a gaming context.

Do you mean Steamroller? And do you mean clock for clock?

For the sake of simplicity, a quad-core Steamroller vs an eight-core Jaguar would be an interesting fight where they are matched clock for clock.
 

Nachtmaer

Member
Is there a practical reason you'd go for 4 Bulldozers instead of 8 Jaguars? From a gaming context.

I think the main reason would be that even though games are becoming more multithreaded, not everything can be split up. This is why Sony might want to opt for a CPU that has better single-threaded performance. Having 4 Jaguar cores might not be powerful enough, and 8 of them could be overkill for games. Also, it causes fewer headaches for developers who like to utilize everything.

MS probably likes having more (smaller) cores, since the Nextbox is meant to support more services and a beefier OS.

I could be totally off but this is just my $0.02.
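The "not everything can be split up" part is basically Amdahl's law. A quick sketch of how the trade-off moves with the parallel fraction of the workload (the 1.5x per-core advantage for the bigger cores is purely my assumption for illustration):

Code:
# speedup over one baseline core, given a fraction p of the work that parallelises
def speedup(cores, per_core_speed, p):
    serial_time   = (1 - p) / per_core_speed
    parallel_time = p / (cores * per_core_speed)
    return 1 / (serial_time + parallel_time)

for p in (0.5, 0.8, 0.95):
    print(p, speedup(8, 1.0, p), speedup(4, 1.5, p))  # 8 small cores vs 4 cores that are 1.5x faster

At 50% parallel the four faster cores win; by 95% the eight smaller ones pull ahead, which is roughly the bet each side would be making.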
 
You really shouldn't make posts like these. There are too many unknown variables. In the current configurations, Proelite mentioned that either system could be up to 50% more powerful. I am not a programmer, so I have no idea how these things work. However, you need to take some facts into account:

1. Orbis is using GCN, Durango is using GCN2
2. Durango was designed to work around bandwidth issues and Orbis brute-forces those issues.
3. The effect of 4 GB versus 8 GB of RAM. Fast versus slow RAM. What type of engine, etc.

Both should be a massive jump :).

Why not? It's a topic about next gen speculation, so everybody is assuming things. We don't know much.

Anyways, after the last batch of Proelite posts, I'm more confident in my assumptions.

Tell us more, you know more stuff. We won't tell Sony or MS, we promise!
 

tipoo

Banned
I swear I have read this exact post somewhere before. The article you link sounds like the dev talking about the PS3 at the beginning, and yes, it was said to be terrible.

A much more logical way of trying to predict whether they have improved in this area would be to look at their latest device. That is the Vita, and I read that it is a big improvement.


I recycled my own post because someone wanted the same thing explained.

I have not heard of the Vita having a GPU debugger or anything specific like that; the fact that it is easier to program for could just be because it uses a fairly straightforward architecture, right? Or maybe they did get around to making better dev tools. I hope that is the case.
 

antic604

Banned
I think the main reason would be that even though games are becoming more multithreaded, not everything can be split up.

Sure they're working on that already: LINK

Also, in one of the next-gen threads it was suggested that the 1.6GHz Jaguar cores have a 2.4GHz turbo mode, so 8 of them in turbo mode using such a compiler could easily outperform the Orbis. Obviously we don't know whether all 8 will be available for games, as e.g. 2 might be locked out for background system stuff, etc.
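For what it's worth, here's what that would mean for raw CPU throughput (a sketch; 8 single-precision FLOPs per cycle per Jaguar core is the figure usually assumed for its 128-bit FPU, and the reserved-core counts are hypothetical):

Code:
def cpu_gflops(cores, clock_ghz, flops_per_cycle=8):
    return cores * clock_ghz * flops_per_cycle

print(cpu_gflops(8, 1.6))  # all 8 cores at 1.6GHz: ~102 GFLOPS
print(cpu_gflops(6, 1.6))  # 2 cores locked out for the OS: ~77 GFLOPS
print(cpu_gflops(8, 2.4))  # all 8 cores at the rumoured 2.4GHz turbo: ~154 GFLOPS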
 

Nachtmaer

Member
Sure they're working on that already: LINK

Also, in one of the next-gen threads it was suggested that the 1.6GHz Jaguar cores have a 2.4GHz turbo mode, so 8 of them in turbo mode using such a compiler could easily outperform the Orbis. Obviously we don't know whether all 8 will be available for games, as e.g. 2 might be locked out for background system stuff, etc.

Well I sure hope we can see some progress when it comes to multithreading.

That turbo mode doesn't necessarily mean anything though. MS/AMD will probably tweak the architecture's power profile to their own needs. At least the 2.4GHz indicates what frequency Jaguar is capable of running at, considering its short pipeline (compared to Bulldozer and the likes).
 

i-Lo

Member
So Proelite says PS4 will once again be receiving gimped ports. I have had it with that shit. It takes away a major incentive for me when it comes to buying a PS4 first. More than anything, after all the struggles, all the mistakes Sony made with the PS3, I had hoped that even when neck and neck, I'd never hear about getting the short end of the stick when it comes to multiplat titles, which form the bulk of games released in any given year.

People, including myself, have clung to thuway and Proelite for a few legitimate (or so we'd hope) leaks amongst the obfuscation. So I am now pretty worried, and I wonder whether this is another reason why Aegeis alluded to third parties not having much confidence in Sony.
 