
vg247-PS4: new kits shipping now, AMD A10 used as base, final version next summer

Razgreez

Member
Agreed. But what hardware does Microsoft make? And secondly, if they already have a solution for their bandwidth woes, why are they in that consortium?

To be honest, I don't know what they're doing there. Just rolling the dice.

Being part of a consortium does not mean you need to actually make the hardware. It's more that you have a say in its design. They do, after all, produce the Xbox and have fingers in Windows phones, TVs, tablets, etc.

edit - if we're arguing about whether Sony will ever use stacking for PS4, I think they will, for sure... I'm just talking about whether the first version will or not.

Realistically, wouldn't the design differences, and therefore the performance differences, be too large for that sort of phased production? Unless you're saying developers would just be restricted to making sure their software runs on the GDDR5 hardware even if the stacked DDR4 APU provides better performance - which seems like a waste, but isn't impossible, since you'd still gain the benefit of better efficiency.
 

Ashes

Banned
You seem woefully optimistic.

How so? Woefully optimistic implies that I think they will make it by q4 this year. I don't.
It's possible but not plausible.

I'd be more than happy for you to show me where we're going wrong - completely or otherwise.

Being part of a consortium does not mean you need to actually make the hardware. It's more that you have a say in its design. They do, after all, produce the Xbox and have fingers in Windows phones, TVs, tablets, etc.

Agreed. Well, close enough. I think you become part of a consortium if you have a vested interest. Now this isn't necessarily Xbox. It could be PC. Or tablets. They may need that performance to scale up so people upgrade their PCs or maybe buy performance tablets. So as to leverage DirectX, maybe [are most PC game developers still on DirectX?]. But yeah, all of those companies have a vested interest, directly or indirectly. I'm just thinking it's the most obvious one. :p


Honestly speaking, I only posted that PR release because we were talking about bandwidth again, and I wanted to show that Microsoft is like every other company that deals with computing, GPUs and CPUs, and thinks along the same lines. And between talking about it in this thread and thinking about the other thread, I posted it in both lol.

edit: Oh and I encourage cynicism. I try to be fair... but with a healthy dose of scepticism. Even if I don't often say as much.
 
How so? Woefully optimistic implies that I think they will make it by q4 this year. I don't.
It's possible but not plausible.

I'd be more than happy for you to show me where we're going wrong - completely or otherwise.

eh.

Why wouldn't it be plausible for them to launch in 2013?
 

antic604

Banned
I'm trying to follow all the rumors, incl. GDDR5 vs. DDR3/4 stacked, etc., but I'm missing one thing - it was said numerous times in this thread that DDR3/4 will become even cheaper in the future, while this won't be the case for GDDR5.

Why is that? Is GDDR5 made from unobtanium, or something? ;) Or is it because stacked DDR3/4 will provide bandwidth thus far reserved for GDDR5, hence the latter becomes obsolete and will only be produced in small quantities, hence no effect-of-scale benefits?
 

Ashes

Banned
eh.

Why wouldn't it be plausible for them to launch in 2013?

Do you mean with GDDR5? Then yeah. Heck, there's no huge difficulty in getting to market with a barebones PC with a customised APU chipset. And AMD already had Kabini/Temash chips at CES. Everything is set. Minus the controller, that is.

The context I was replying in was 2.5D TSV. Sure, as above there's testing, improving yields, balancing the hardware, testing again, and more testing, then sending final silicon off to devs... All of that, I guess, could be done if they started last summer, ramping up production until a launch in the fall, etc. Or so I understand from what I can glean from reading up on it. But with 2.5D TSV being a never-before-done process, you'd almost double the time it takes to bring to market. If ever. Even if it did look like they had resolved whatever issues they had. Going off my gut, I'd say Q1 2014 with all this stacking. But it's an amateur guess.

I know Intel is going stack crazy this year [or so they say], but Intel is so far ahead of the game, it's not even funny.
 
I'm trying to follow all the rumors, incl. GDDR5 vs. DDR3/4 stacked, etc., but I'm missing one thing - it was said numerous times in this thread that DDR3/4 will become even cheaper in the future, while this won't be the case for GDDR5.

Why is that? Is GDDR5 made from unobtanium, or something? ;) Or is it because stacked DDR3/4 will provide bandwidth thus far reserved for GDDR5, hence the latter becomes obsolete and will only be produced in small quantities, hence no effect-of-scale benefits?

Companies like Samsung have already announced that they will cut their GDDR5 production - other companies might follow depending on the acceptance rate and success of DDR4 (stacked, speed, price, ...). So today every medium- to high-performance GPU uses GDDR5, but in 3-4 years that could easily change and GDDR5 might become a niche product. However, as always, this is not set in stone - it's just likely.
 

Ashes

Banned
I wonder if we can calculate a very rough performance ballpark for the Jaguar-core CPU...

1. AMD's Jaguar-Based APU Runs at 2 GHz and Beats Brazos by 260%
2. Fastest Brazos: AMD E2-1800 APU (dual-core) @ 1.7 GHz
3. CPU score: 849
4. Quad-core Jaguar: 2207.4
5. Eight-core: ~4414.8
6. Very, very rough benchmark: A10-5800K = 4,777
A10-5700K = 4,496

Will edit out lol if the maths is wrong. But I think the A10 in the OP is aiming for a quad-core Steamroller, because the article presumes a 2 GHz Jaguar quad-core... and with efficiencies lost etc...
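For transparency, here's the arithmetic above written out as a tiny script - a rough sketch only, where the 849 score, the 2.6x uplift, and perfectly linear 4-to-8-core scaling are all assumptions from this post, not confirmed numbers:

```python
# Very rough ballpark only; every input here is an assumption from the post above.
brazos_e2_1800_score = 849                  # dual-core Brazos (E2-1800) CPU benchmark score
jaguar_quad = brazos_e2_1800_score * 2.6    # "beats Brazos by 260%" read as a 2.6x multiplier -> 2207.4
jaguar_octo = jaguar_quad * 2               # assume perfectly linear 4 -> 8 core scaling -> 4414.8

print(jaguar_quad, jaguar_octo)             # 2207.4 4414.8
print("A10-5800K:", 4777, "A10-5700K:", 4496)  # desktop Trinity scores quoted for comparison
```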
 

antic604

Banned
Companies like Samsung have already announced that they will cut their GDDR5 production - other companies might follow depending on the acceptance rate and success of DDR4 (stacked, speed, price, ...). So today every medium- to high-performance GPU uses GDDR5, but in 3-4 years that could easily change and GDDR5 might become a niche product. However, as always, this is not set in stone - it's just likely.

Ok, thanks - that explains it.

Another question, though - is it technically possible that Sony would release the Orbis/PS4 this year with GDDR5, just to not give MS too big of a head start, but switch to stacked DDR3/4 once the tech becomes available in 2014? Would it be possible to make the change transparent to already existing software, provided the bandwidth of 192GB/s remains intact (and make 2014 games work on 2013 consoles as well)? If yes, then that would make a lot of sense considering the strong PS3 lineup for 2013 (my current backlog is already huge and will get much bigger this year still), so the first batch of PS4s wouldn't really have to be big - just enough to satisfy the 'hardcore' crowd - with many more sales expected in 2014 and beyond, when the BoM would be cheaper.
 

Sid

Member
I think after what Sony did with the architecture of their last/current home console, seeing is believing. Like Batman once said to Superman about Joker, "Expect the unexpected".
But if we expect the unexpected then the unexpected becomes the expected and then we have nothing to expect except expecting the expected and unexpected.

I'm looking forward to the official announcement.
 
I'm not disputing that it gets hot; memory is generally one of the hottest parts on a card. I'm disputing the "very" part. I'm willing to bet it's pretty much in line with the XDR + GDDR3 (which was a lot worse) setup. Would cost about the same too.

XDR has very low power consumption, comparatively speaking. On the PS3, neither the GDDR3 nor the XDR needed contact cooling solutions.
 

i-Lo

Member
But if we expect the unexpected then the unexpected becomes the expected and then we have nothing to expect except expecting the expected and unexpected.

I'm looking forward to the official announcement.

Accepting the unexpected with the expected excepts the prime condition to only be able to accept the expectation of the unexpected.

GDC should provide a more official outlet for further relevant leaks. However, given Kaz Hirai's statement, it is prudent to expect better information about the XB3 rather than the PS4.
 
Ok, thanks - that explains it.

Another question, though - is it technically possible that Sony would release the Orbis/PS4 this year with GDDR5, just to not give MS too big of a head start, but switch to stacked DDR3/4 once the tech becomes available in 2014? Would it be possible to make the change transparent to already existing software, provided the bandwidth of 192GB/s remains intact (and make 2014 games work on 2013 consoles as well)? If yes, then that would make a lot of sense considering the strong PS3 lineup for 2013 (my current backlog is already huge and will get much bigger this year still), so the first batch of PS4s wouldn't really have to be big - just enough to satisfy the 'hardcore' crowd - with many more sales expected in 2014 and beyond, when the BoM would be cheaper.

Possible? Yes it is ...

... feasible? Hell no :)

You not only need the bandwidth at the same speed but also the same latency, throughput and timings, and, what is more important, the bus system would need to be heavily modified to go from 256-bit GDDR5 to a wide-IO, HMC, stacked version. Furthermore, you need to change the cooling solution, the board layout, etc.

I personally doubt that this will happen - the better way would be to wait ~6 months, depending on the schedule, and launch with what you want. So far GDDR5 has been in the dev kits, and we will have to wait until summer for the final iteration, which might show us if GDDR5 is still the solution or if Sony managed to get DDR3/4 stacking working (basically a supply problem).
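For a sense of scale, here is a minimal bandwidth sketch, assuming the rumoured 192GB/s target and the 256-bit GDDR5 bus above; the per-pin rates and the wide-IO width are illustrative guesses, not leaked specs:

```python
# Peak theoretical bandwidth = bus width (bytes) x per-pin data rate.
def peak_bandwidth_gbs(bus_width_bits, data_rate_gtps):
    return bus_width_bits / 8 * data_rate_gtps

# 256-bit GDDR5 at 6 Gbps per pin (a common 2012/13 speed grade) hits the rumoured figure:
print(peak_bandwidth_gbs(256, 6.0))   # 192.0 GB/s

# Matching that with stacked DDR3/4 would need a much wider, slower interface,
# e.g. a hypothetical 512-bit wide-IO style bus at 3 GT/s per pin:
print(peak_bandwidth_gbs(512, 3.0))   # 192.0 GB/s
```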
 

Mario007

Member
EDIT: Do you mean "will NOT"?

Yeah, we know they want a high bandwidth solution. They complained about the low BW compared to PS2, and in the Stereo3D presentation they said very high BW was needed.
Wasn't high bandwidth the core of their design philosophy for the PS2 as well, and didn't that allow it to compete with consoles that came after it and were more powerful?
 

i-Lo

Member
Yeah, we know they want a high bandwidth solution. They complained about the low BW compared to PS2, and in the Stereo3D presentation they said very high BW was needed.

Can sufficiently high bandwidth remedy graphical ailments like LoD and object pop-in, Z-fighting, aliasing, etc.?
 
Wasn't high bandwidth the core of their design philosophy for the PS2 as well, and didn't that allow it to compete with consoles that came after it and were more powerful?

Can sufficiently high bandwidth remedy graphical ailments like LoD and object pop-in, Z-fighting, aliasing, etc.?

Yes. For both.

EDIT: I'll go into more detail. Higher BW allows faster data streaming, which reduces pop-in. More RAM lets more data be loaded into the system, but that's useless if the BW can't accommodate the fillrate and the actual rendering of the scene. Aliasing is also a bandwidth issue: it can be mitigated by a higher internal renderer resolution or more anti-aliasing, both of which can be afforded by increasing BW.
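To put very rough numbers on that, here's a sketch of how framebuffer write traffic scales with resolution and MSAA; the bytes-per-pixel and overdraw figures are illustrative assumptions, not anything from a leak:

```python
# Crude framebuffer write-traffic estimate (ignores texture reads, compression, caches).
def framebuffer_gbs(width, height, fps, bytes_per_pixel=8, overdraw=3.0, msaa=1):
    # bytes_per_pixel ~ 4 colour + 4 depth; overdraw and MSAA multiply the writes
    return width * height * fps * overdraw * msaa * bytes_per_pixel / 1e9

print(framebuffer_gbs(1280, 720, 30))           # ~0.66 GB/s at 720p30, no MSAA
print(framebuffer_gbs(1920, 1080, 60))          # ~3.0 GB/s at 1080p60, no MSAA
print(framebuffer_gbs(1920, 1080, 60, msaa=4))  # ~12 GB/s at 1080p60 with 4x MSAA
```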
 
Ok, thanks - that explains it.

Another question, though - is it technically possible that Sony would release the Orbis/PS4 this year with GDDR5, just to not give MS too big of a head start, but switch to stacked DDR3/4 once the tech becomes available in 2014? Would it be possible to make the change transparent to already existing software, provided the bandwidth of 192GB/s remains intact (and make 2014 games work on 2013 consoles as well)? If yes, then that would make a lot of sense considering the strong PS3 lineup for 2013 (my current backlog is already huge and will get much bigger this year still), so the first batch of PS4s wouldn't really have to be big - just enough to satisfy the 'hardcore' crowd - with many more sales expected in 2014 and beyond, when the BoM would be cheaper.

They (Sony and Microsoft) can't use GDDR5 or DDR3 at memory speeds high enough to properly support a game console because they are using LPM silicon. They both need Wide IO DRAM. A special-sauce larger cache is not going to support even a 6X Xbox 360 GPU; you need a minimum memory bandwidth that a slightly-less-than-2 GHz memory clock can't support (the assumption is Kabini with two 64-bit DDR3 controllers). GDDR5 with a clock speed of 2 GHz comes close, as there are two memory operations for every one of DDR3.
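To see roughly why, here is a minimal sketch under this post's own Kabini assumption of two 64-bit DDR3 controllers; the DDR3 speed grade and the "6x Xbox 360" bandwidth proxy are illustrative, not leaked figures:

```python
# Peak bandwidth = bus width (bytes) x transfers per second (GT/s).
def peak_gbs(bus_bits, gtps):
    return bus_bits / 8 * gtps

xbox360_gbs = peak_gbs(128, 1.4)        # Xbox 360: 128-bit GDDR3 at 700 MHz (1.4 GT/s) = 22.4 GB/s
target_gbs = 6 * xbox360_gbs            # naive "6x Xbox 360" bandwidth proxy: ~134 GB/s

kabini_ddr3_gbs = peak_gbs(128, 2.133)  # two 64-bit DDR3-2133 controllers: ~34 GB/s
print(target_gbs, kabini_ddr3_gbs)      # the DDR3 setup falls far short of the proxy target
```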

systemfehler said:
You not only need the bandwidth at the same speed but also the same latency, throughput and timings, and, what is more important, the bus system would need to be heavily modified to go from 256-bit GDDR5 to a wide-IO, HMC HBM (High Bandwidth Memory, not Hybrid Memory Cube, which is HBM with a bottom logic layer that converts to multiple serial lines), stacked version.
Although, take the Micron stacked HMC memory (2 and 4 gig versions now supported), don't put the bottom logic layer on it, and it could be used in a game console. 2 & 4 gig HMC has no other application, as HMC is designed for servers and 4 gig is too small for that application.

The strong PS3 lineup vs. Microsoft's is, I think, because Sony feels the PS3 is going to sell big this year due to ATSC 2.0 and XTV/RVU, since according to Microsoft the Xbox 360 cannot support 1080P or 1080P S3D while the PS3 can. Some of the features in the bottom-right box can be supported by the PS3, while all of the features in that box are what Microsoft wanted supported by the Xbox 361; Sony, if it refreshes rather than shrinks the PS3, should support them too.

[image: Slide4.jpg]
 

Ashes

Banned
If I'm roughly in the ballpark, then we have the A10-5700K APU's CPU.
edit:
I wonder if we can calculate a very rough performance ballpark for the Jaguar-core CPU...

1. AMD's Jaguar-Based APU Runs at 2 GHz and Beats Brazos by 260%
2. Fastest Brazos: AMD E2-1800 APU (dual-core) @ 1.7 GHz
3. CPU score: 849
4. Quad-core Jaguar: 2207.4
5. Eight-core: ~4414.8
6. Very, very rough benchmark: A10-5800K = 4,777
A10-5700K = 4,496

Will edit out lol if the maths is wrong. But I think the A10 in the OP is aiming for a quad-core Steamroller, because the article presumes a 2 GHz Jaguar quad-core... and with efficiencies lost etc...

Well at least it confirms that A10 is close to final specs...

edit: If this is close enough, then the Xbox had a Xeon E5540 @ 2.53GHz.

Guess upon guess makes it even more prone to error, but that Xeon is a quad-core with 8 logical threads rather than an 8-core with sixteen threads. CPU performance-wise, it's a wash.
 
The strong PS3 lineup vs. Microsoft's is, I think, because Sony feels the PS3 is going to sell big this year due to ATSC 2.0 and XTV/RVU, since according to Microsoft the Xbox 360 cannot support 1080P or 1080P S3D while the PS3 can. Some of the features in the bottom-right box can be supported by the PS3, while all of the features in that box are what Microsoft wanted supported by the Xbox 361; Sony, if it refreshes rather than shrinks the PS3, should support them too.

Are you saying Sony will update the PS3 later this year for it to support ATSC 2.0 and XTV/RVU? You've been saying this for a while. Wouldn't it have happened already if Sony ever intended to do so? I mean, last fall was the perfect time, considering the super-slim launched.
 
XDR has very low power consumption, comparatively speaking. On the PS3, neither the GDDR3 nor the XDR needed contact cooling solutions.

That's kind of my point.

The Rambus XDR used in the PS3 runs at 1.8V +/- 0.09V.
The GDDR3 at 2.0V +/- 0.1V.

The GDDR5 on a 7970, for example, comes in at 1.5V stock. And there are three power modes available.

As nibs said, you can even use a passive aluminum sink directly on the chip to cool it. It's not the hardest thing in the world, like you're trying to make it out to be.
 

Ashes

Banned
Want to know something funny?

The 7970M is a decent graphics card [above middle], right? And if you look at value, say when you look at DirectCompute [price/performance], it shoots very close to the top:

http://www.videocardbenchmark.net/directCompute.html

It's funny because it says n/a in the price column.

edit: My mistake. It's not sorting by value, just raw performance. Groan, there's a GTX 690 there. Must be a driver issue with the GTX 690. Or it really is as I initially thought, and it's about value? :p

edit: Explanation found. It's modded [it says so]. Beats me what a modded 7970M is doing in those charts, but there you go. Google suggests a driver mod?
 
Are you saying Sony will update the PS3 later this year for it to support ATSC 2.0 and XTV/RVU? You've been saying this for a while. Wouldn't it have happened already if Sony ever intended to do so? I mean, last fall was the perfect time, considering the super-slim launched.
ATSC 2.0 started being implemented last fall with Non-Real-Time Transmission and Mobile TV, but features like XTV and 1080P/1080P S3D will reach candidate-for-use status this March. RVU, according to news articles from last year's CES and from DirecTV, Verizon and Comcast, was supposed to be implemented in March, and I can only assume NOW that they meant March 2013, not 2012 like I first assumed. I'm hearing from multiple sources that Verizon and Comcast are training to implement it soon.

The Xbox 361 was supposed to launch for 2013, and I speculated incorrectly that the PS3 4K chassis would contain the same HDMI pass-through. The PS3 and Xbox 360 can use RVU, but it makes the DVR redundant and adds 61 watts of always-on PS3, or 67 watts of always-on Xbox 360, when the TV is on. The PS4 should be less than 15 watts.
 
ATSC 2.0 started being implemented last fall with Non-Real-Time Transmission and Mobile TV, but features like XTV and 1080P/1080P S3D will reach candidate-for-use status this March. RVU, according to news articles from last year's CES and from DirecTV, Verizon and Comcast, was supposed to be implemented in March, and I can only assume NOW that they meant March 2013, not 2012 like I first assumed. I'm hearing from multiple sources that Verizon and Comcast are training to implement it soon.

The Xbox 361 was supposed to launch for 2013, and I speculated incorrectly that the PS3 4K chassis would contain the same HDMI pass-through. The PS3 and Xbox 360 can use RVU, but it makes the DVR redundant and adds 61 watts of always-on PS3, or 67 watts of always-on Xbox 360, when the TV is on. The PS4 should be less than 15 watts.

So, are we to suppose that yet another PS3 model will be announced in the future? This seems crazy.
 

Razgreez

Member
Want to know something funny?

The 7970M is a decent graphics card [above middle], right? And if you look at value, say when you look at DirectCompute [price/performance], it shoots very close to the top:

http://www.videocardbenchmark.net/directCompute.html

It's funny because it says n/a in the price column.

edit: My mistake. It's not sorting by value, just raw performance. Groan, there's a GTX 690 there. Must be a driver issue with the GTX 690. Or it really is as I initially thought, and it's about value? :p

edit: Explanation found. It's modded [it says so]. Beats me what a modded 7970M is doing in those charts, but there you go. Google suggests a driver mod?

A 7970M is a downclocked 7870. "Mod" likely means they've increased the cooling and overclocked it. It's just above the 7870, which is exactly what it is.
 

Ashes

Banned
A 7970M is a downclocked 7870. "Mod" likely means they've increased the cooling and overclocked it. It's just above the 7870, which is exactly what it is.

Makes sense. *nods in agreement*
Still a little funny that it's the only modded one, and the only one with one sample. And I mean you couldn't build a better conspiracy. :p

Except Sony wouldn't overclock it, this being a power-limited console and all. Shame. Powerful little bugger.


Edit:

Don't know what the hell we're talking about when we refer to 2.5D, 3D, SoC, stacking, etc.? Fear not, here's one of the best 101s on the subject:

http://www.eetimes.com/design/programmable-logic/4370596/2D-vs--2-5D-vs--3D-ICs-101
 

Razgreez

Member
Makes sense. *nods in agreement*
Still a little funny that it's the only modded one, and the only one with one sample. And I mean you couldn't build a better conspiracy. :p

You could, actually. We already have a Tahiti 7870. What if this 7970M is a mobile version of that chip? In essence, a GCN 7970M...
 

Auto_aim1

MeisaMcCaffrey
So now the Kotaku rumor is back on a Bulldozer CPU again?
And 8GB of RAM PLUS 2.2GB of VRAM lol
They are reporting the dev kit specs, which is pretty pointless. The controller thing was already covered by CVG, so the only thing that's new is the fancy user accounts thing.
 
They specifically say in one of the first paragraphs, "And before we go any further, know that these are current specs for a PS4 development kit, not the final retail console itself. "
 

Arkham

The Amiga Brotherhood
They specifically say in one of the first paragraphs, "And before we go any further, know that these are current specs for a PS4 development kit, not the final retail console itself. "

Good point. They also specifically say in the page header "Kotaku", so that makes sense too.
 

ghst

thanks for the laugh
kotaku apparently drink at the same "oh boy you should see the devkit on this baby" bar as aegies.
 