vg247-PS4: new kits shipping now, AMD A10 used as base, final version next summer

AkIRA_22

Member
Sony just spent hundreds of millions of dollars releasing a handheld in a post-iPhone reality. They released Move. They continue to put out first party titles that don't sell particularly well. They hitched their wagon to Blu-ray and it cost them dearly.

Do I think Sony is incapable of making good decisions? No. Do I think there is a frequent crisis of vision there? Yes.

With regards to Microsoft and Durango, I've been saying all year that I've worried that internal politics at Microsoft will make Durango a muddled mess. I also called bullshit on the 350 watt monster that nowgamer insisted was the next Xbox. I've also said here, I believe, that I was most concerned about memory bandwidth on the system, because I was worried they were going with something slow.

I'm no longer concerned about that. But the console they wanted to release before the delay would have been underwhelming; it would have been the piece of hardware that leaked last year. It's only in the last 5 months that I've been slowly convinced that Microsoft isn't going to sabotage itself, and honestly? I'm still a little concerned. There's going to be a big play to tie Durango into the Windows 8 and Windows Phone 8 ecosystems, and that could pay off for them, or it could blow up in their face.

I'm not concerned about either system. I think they'll be about even for at least the first few years, from a performance perspective. If you want to ignore my input and opinions, more power to you. But when I see people operating from a premise that they want to believe, i.e., that Sony will be the king of hardware, and that they're going to blow away the competition, I have to call foul. I don't think that's the way it's going to shake out. And if you're walking into the next generation thinking that, you're going to be disappointed. Just like people spent the first two years of the PS3's life being disappointed.



Fair enough! I don't think Microsoft or Sony would ever consider their system costing anywhere near that. But fair enough.

Peeps, Arthur is on the ball. Though I've got to say, memory will be less of an issue since they will be pushing 2D games (3D is dead, as CES proved) at up to 1080p. So massive amounts of memory aren't as much of a concern as the ability to do things quickly. That said, it's not going to be a gap of particularly significant proportions, not like the differences in the PS3 and 360 ecosystem. Developers will use the dev kits to fine-tune their games to the benefits and limitations of each system. To us, the users, it will probably end up being imperceptible. 4 GB or 8, that's a lot of memory for 1080p in the console environment, which has the benefit of fine-tuning and squeezing every last drop out of the hardware; it just takes time.

I think initially it will benefit the 360: devs will want sheer brute force, then the nuances of the PS4 will start being worked out. Again, these differences aren't as drastic as this gen, where everything that touches the CPU demands an entirely different approach. If Sony are smart they will instruct devs in how to optimize for faster RAM.
 

mrklaw

MrArseFace
if you want to do 1080p, bandwidth helps.

if bandwidth is an issue on durango, it has the potential to drop to 720p before PS4 does.
 
Well, I probably shouldn't stir up this hornets' nest here, but there's a lot of evidence they are, or at least can be, fairly incompetent.

A lot of the decisions in PS3 just were not good ones imo.

There's no way you should give your opponent a one-year head start when your console doesn't handily trash theirs, imo. And PS3 did not handily trash 360.

If Sony designed Orbis 8 years ago maybe that would matter. Crazy Ken is gone, there is no new format to mortgage the business to push and MS will have a 1 year head start only in their dreams this time.

Vita is a great design, but it shows Sony doesn't understand the handheld market. People searching for good graphics don't seek out handhelds in the first place.

Not only that, but Vita games look substantially worse than 360 ones. Such is the problem with building a handheld around graphics: the very best you can do is still unimpressive.

Sony built a state-of-the-art handheld; the problem is they don't understand that nobody wants that.

To be fair, I was driving the "PSP is gonna crush DS!!!" bandwagon back in the day. I was dead wrong. That's when I figured it out.

PSP continued to sell strongly in Japan up until the Vita's introduction. It's not that crazy to think a follow-up to a 70+ million seller is worth a shot. In any case, we're arguing about hardware design decisions in relation to developer needs. In those matters Vita is a pretty shining example of how to do that right, delivering a great, dev-friendly machine at a good price. Seems like we should have a lot of confidence in the Orbis design based on recent history. Unless you're arguing that the home console market is dead or dying, something that will equally impact Durango, these comparisons to the Vita business model don't tell us anything.
 

jaosobno

Member
The problem is we don't know if ND, for example, had the choice between small/fast RAM or big/slow RAM at all, or if they were simply able to push from 2 to 4 GB and were limited by Sony and their money/power constraints.

In my opinion, the ICE Team (damn, that sounds cool; no pun intended) was heavily involved in PS4 development.

It really makes no sense for Sony not to involve their elite developers when creating a gaming console, and considering that Naughty Dog houses ICE, it's logical to presume that they had the choice and that their input mattered a lot.

Money played a big part in shaping PS4, but I do believe that it's not in Sony's best interest to cut PS4's power (in contrast to X720) just to reduce expenses.

This kind of approach (money first, everything else later), combined with the popular trend of hating Sony, would immediately backfire on them.

The gaming press (especially the likes of IGN, Kotaku, and obviously NowGamer) would start a smear campaign on day 1, with phrases like "PS4 inferior to X720" and "PS4 has games but no power" becoming new catchphrases to fuel the next gen console wars.
 

Reiko

Banned
if you want to do 1080p, bandwidth helps.

if bandwidth is an issue on durango, it has the potential to drop to 720p before PS4 does.

Here's another thing I wonder.

Is the GDDR5 in PS4 a reaction to the eSRAM in 720, or is it the other way around?

I mean is the eSRAM really 32MB?
 

Tripolygon

Banned
Here's another thing I wonder.

Is the GDDR5 in PS4 a reaction to the eSRAM in 720, or is it the other way around?

I mean is the eSRAM really 32MB?

It's neither. With PS2 Sony went for high bandwidth, hence the eDRAM; with PS3 they weren't as concerned about bandwidth; and now they are reverting to high bandwidth for PS4 because rendering performance on high-end GPUs relies on it.

Reading this would give you some insight into why they went for GDDR5.
 

gofreak

GAF's Bob Woodward
Here's another thing I wonder.

Is the GDDR5 in PS4 a reaction to the eSRAM in 720, or is it the other way around?

I mean is the eSRAM really 32MB?

On bandwidth, I think both were looking at what their systems needed to keep humming along rather than simply trying to match each other's numbers.

Orbis needs more bandwidth into the top of the pipeline if its GPU is more powerful. An eDRAM-based system that balances bandwidth toward the bottom of the pipeline would be risky in terms of creating a bottleneck.

Durango doesn't need as much front-end bandwidth to keep its GPU humming along if it's not as powerful, but it needs more bandwidth for the bottom end of the pipeline than DDR3 would provide, hence the eDRAM. Microsoft could have chosen higher-bandwidth main memory instead, but that would have cost significantly more if they also wanted 8GB of RAM. So they would probably have had to choose between constraining their OS plans, going with an eDRAM+DDR3 setup and keeping them, or bumping the cost per unit quite significantly. The middle option looks like the best balance for them.
 
I do like Microsoft, and especially their development tools, but on the other side they are a big corporation focused on total living-room domination, not gaming alone. The last E3 showed us a glimpse of what they are up to, and if you want to push the struggling business of your own tablets, phones and Windows 8 into the living room, you might have to sacrifice parts of the gaming division.

Hate to break it to you, but Sony and Microsoft are exactly alike in that regard. Domination in the broader entertainment space has always been Sony's goal. Even if we ignore "it only does everything" for a moment, remember how their 2005 E3 conference started? Some choice quotes:

Sony said:
Ken Kutaragi and his team at Sony set out to invent a different future. They imagined a rich 3D experience fusing gaming, music, movies, communication.

Sony said:
PlayStation 2 redefined computer entertainment with unparalleled power, speed and graphics, converging movies, games and music.

Sony said:
PlayStation Portable - the next revolution. Suddenly handhelds aren't just for kids... Or just for games. From a minor market into a major cultural force.

I think the last line exemplifies that perfectly: gaming is just a means to an end. Sony is not any less likely to focus on broader features than Microsoft is. If they've been cunningly supporting the notion that gaming is their primary concern in recent times, that's only because they realized that they could benefit from that angle. Gamers are easily manipulated, it seems.
 

Reiko

Banned
It's neither. With PS2 Sony went for high bandwidth, hence the eDRAM; with PS3 they weren't as concerned about bandwidth; and now they are reverting to high bandwidth for PS4.

On bandwidth, I think both were looking at what their systems needed to keep humming along rather than simply trying to match each other's numbers.

Orbis needs more bandwidth into the top of the pipeline if its GPU is more powerful. An eDRAM-based system that balances bandwidth toward the bottom of the pipeline would be risky in terms of creating a bottleneck.

Durango doesn't need as much front-end bandwidth to keep its GPU humming along if it's not as powerful, but it needs more bandwidth for the bottom end of the pipeline than DDR3 would provide, hence the eDRAM. Microsoft could have chosen higher-bandwidth main memory instead, but that would have cost significantly more if they also wanted 8GB of RAM. So they would probably have had to choose between constraining their OS plans, going with an eDRAM+DDR3 setup and keeping them, or bumping the cost per unit quite significantly. The middle option looks like the best balance for them.

Ah. I see.
 
Here's another thing I wonder.

Is the GDDR5 in PS4 a reaction to the eSRAM in 720, or is it the other way around?

I mean is the eSRAM really 32MB?

No, I think both went with their separate approaches. I do believe that Sony decided to go with 4GB (if that is true) because of the 720's bigger memory setup. Everything else was planned from the start.

32MB is a decent amount for gaming without too much sacrifice (money, space).

In my opinion, the ICE Team (damn, that sounds cool; no pun intended) was heavily involved in PS4 development.

It really makes no sense for Sony not to involve their elite developers when creating a gaming console, and considering that Naughty Dog houses ICE, it's logical to presume that they had the choice and that their input mattered a lot.

Money played a big part in shaping PS4, but I do believe that it's not in Sony's best interest to cut PS4's power (in contrast to X720) just to reduce expenses.

This kind of approach (money first, everything else later), combined with the popular trend of hating Sony, would immediately backfire on them.

The gaming press (especially the likes of IGN, Kotaku, and obviously NowGamer) would start a smear campaign on day 1, with phrases like "PS4 inferior to X720" and "PS4 has games but no power" becoming new catchphrases to fuel the next gen console wars.


First, yes, it is a cool name, and second, we should not forget that a Japanese company might work differently from a US one. Kaz Hirai needs to turn around a sinking ship, and I am sure Naughty Dog was heavily involved in PS4 development, but in the end the final decisions come from Japan and are bound by money to a great degree. Of course, going the cheapest route would be the same mistake as going all in and taking a great loss, so taking a huge risk is out of the question if you ask me. Sony won't side with the Wii U this generation, but they also won't kill the 720 when it comes to power.

Hate to break it to you, but Sony and Microsoft are exactly alike in that regard. Domination in the broader entertainment space has always been Sony's goal. Even if we ignore "it only does everything" for a moment, remember how their 2005 E3 conference started? Some choice quotes:

I think the last line exemplifies that perfectly: gaming is just a means to an end. Sony is not any less likely to focus on broader features than Microsoft is. If they've been cunningly supporting the notion that gaming is their primary concern in recent times, that's only because they realized that they could benefit from that angle. Gamers are easily manipulated, it seems.

I agree with you, but unlike Sony, Microsoft actually has the resources and will to make that happen. Sony should be happy if the PlayStation 4 makes money and helps push some more TVs. They can't spend the rest of the gold they have on a "battle for everything". Until Sony drops the dead weight and gets the remaining divisions on track, not much will happen.
 

Tripolygon

Banned
Gemüsepizza;46586571 said:
I'm curious how the PS4 would perform if they added eDRAM to the GPU and would use 4 GB GDDR5. How viable would this be?

Waste of money. They might as well use that money to add extra RAM for the OS, as GDDR5 is already fast.

Edit: gofreak explains it better.
 

gofreak

GAF's Bob Woodward
Gemüsepizza;46586571 said:
I'm curious how the PS4 would perform if they added eDRAM to the GPU and would use 4 GB GDDR5. How viable would this be?

It'd be ideal in terms of doing as much as possible to avoid any bandwidth-related bottlenecks, but probably overkill for the GPU that's there. To make the most of that kind of bandwidth setup you'd probably want a more powerful GPU, which means more cost again on top of your added eDRAM cost.

If the sky is the limit, overkill is fine. But the sky isn't the limit. You're going to want to make the best tradeoff against cost to have just enough to balance your components.
 

mrklaw

MrArseFace
It's neither. With PS2 Sony went for high bandwidth, hence the eDRAM; with PS3 they weren't as concerned about bandwidth; and now they are reverting to high bandwidth for PS4 because rendering performance on high-end GPUs relies on it.

Reading this would give you some insight into why they went for GDDR5.

Good point. Both machines seem like logical extensions of the architectures in their previous machines.
 

GopherD

Member
As I've pointed out in some other posts, it would seem to me the cost for Sony's approach looks higher. Steamroller vs Jaguar (if that's so), Pitcairn vs Cape Verde, and GDDR5 vs DDR3 all present significant cost discrepancies in favor of the latter (while a paltry 32MB of ESRAM should be cheap). It's way too early to say much, so that's just an early bird's-eye look at it.

That's why I keep making the point that I think MS could be on to a cheaper system that performs on par with a more expensive one, which is good engineering. And I've always admired MS engineers the most (the 360 competing with PS3 despite a full year's head start, for example).

But as I also say, it could just be that PS4 is more expensive and also better. Just to be even handed about it.

As for MS's RAM decisions, my view has always been that they probably had more to do with everything else they want to do and less with gaming. They wanted "gobs" of RAM for Kinect, the OS, and all the other crazy stuff they want, and this is how they got there, imo. DDR3 was the only way to do it cheaply enough. But having said that, it could still work out well for the gaming side.

MS has a lot of priorities to fit into a silicon budget similar to the one Sony is concentrating on gaming. You cannot have a solution that is strong in games, strong with Kinect, strong with TV functions, and strong with Surface streams, that is more highly customised, that costs the same or less, and that matches the power of one dedicated purely to gaming functions. It is a fallacy.
 

mrklaw

MrArseFace
MS has a lot of priorities to fit into a silicon budget similar to the one Sony is concentrating on gaming. You cannot have a solution that is strong in games, strong with Kinect, strong with TV functions, and strong with Surface streams, that is more highly customised, that costs the same or less, and that matches the power of one dedicated purely to gaming functions. It is a fallacy.

I'm curious why you need to devote resources beyond a bit of RAM to those other things, though.

If you have a gaming-focused machine, it will be great for video playback and all that stuff. PS3 already has DVR functionality that works in the background while playing, in both Japan and Europe.

And I hope PS4 has hardware to support streaming to Vita without impacting native games, so that remote play is available for all games. Plus possibly to Android phones/tablets too.
 

jaosobno

Member
First, yes, it is a cool name, and second, we should not forget that a Japanese company might work differently from a US one. Kaz Hirai needs to turn around a sinking ship, and I am sure Naughty Dog was heavily involved in PS4 development, but in the end the final decisions come from Japan and are bound by money to a great degree. Of course, going the cheapest route would be the same mistake as going all in and taking a great loss, so taking a huge risk is out of the question if you ask me. Sony won't side with the Wii U this generation, but they also won't kill the 720 when it comes to power.

I agree with everything you said there. I strongly believe that both next gen consoles will be equally matched and that both will have specific strengths and weaknesses (this time the weaknesses will not include a "Skyrim incident" or inferior ports on PS3).

I also have strong faith in Hirai. The man obviously knows what he's doing, since Sony's losses have already been cut by 50% (somebody correct me if I'm wrong; I believe that number was mentioned in the reports), and Sony actually feels like a much more dynamic company rather than the sleeping giant they were these past few years (of course there is still a way to go until I can consider them fully fired up).

Hirai is the best of both worlds, combining Japanese and American business styles in a single persona.

It just occurred to me that we should probably start referring to him as "The Messiah".
 

thuway

Member
MS has a lot of priorities to fit into a silicon budget similar to the one Sony is concentrating on gaming. You cannot have a solution that is strong in games, strong with Kinect, strong with TV functions, and strong with Surface streams, that is more highly customised, that costs the same or less, and that matches the power of one dedicated purely to gaming functions. It is a fallacy.

Agreed. I think people are discounting how Microsoft is positioning the next Xbox. This is Microsoft's take on gaming in a connected sense.
 
Agreed. I think people are discounting how Microsoft is positioning the next Xbox. This is Microsoft's take on gaming in a connected sense.

I remember a quote from the days of the first Xbox saying that it was MS's Trojan horse to win the living room in the long term. They are starting to make this clearer: Kinect, streaming to tablets, buying smart-home tech companies, putting W8 on the console, etc. And from a gamer's point of view, if they start focusing less and less on games, I hope they lose next gen by a mile.
 
Santa Monica and ND are hubs for so much more than their respective franchises. I'd argue they have more say than Polyphony. I'd put third-party aggregate recommendations over any individual first-party studio.

There were rumors last year about third parties telling Sony that the RAM wasn't enough, and that they upped it. Sony does talk to third parties. There's no way that Sony only talked to first party devs.
 

Ashes

Banned
The good thing is, we'll have a better idea of which setup is the way to go.

Of course this is just walking in the dark, but who cares really? Let's speculate!

1. 8 smaller cores or 4 beefier cores.
2. Lots of ram or faster ram.
3. Simpler architecture or hidden complexity.
4. Closer to off-the-shelf or lots of customisation.

...

My thoughts:

1. 8>4; but we're going to be stuck with devs who are only now getting workloads onto four logical cores [Tom's Hardware just changed its editorial guidance from dual core to quad core], and if that settles in, then maybe Orbis has the design win. [draw]
2. Faster RAM. I reckon Orbis is functionally a graphics card with a CPU attached, which is the way modern games work. Extra RAM is more a Microsoft division thing than an Xbox division thing. But devs will make that 8 gigs count, and we may be in for some ride, especially if we're aiming for 720p as the baseline. [Orbis on paper, but I'm not at all confident, so draw]
3. If Durango is 1.3 TF, then it cannot achieve anything higher than that, so Orbis wins. Except something about this isn't right. I'd personally go for the simpler architecture that is, simply put, better [like Intel cores rather than AMD modules, or something else entirely]. [I expect this to be closer than the supposedly lower Durango specs imply, because devs haven't complained. This is highly suspect; something funny is going on.] [Orbis on paper. And basically, I personally think devs (non first party) can get a lot more out of the simpler platform than the more complicated one.]
4. Draw. Because, honestly, AMD isn't doing so well [Intel reigns supreme, lots of engineers have lost their jobs, and Bulldozer didn't quite match expectations (Bobcat and Jaguar seem to be different, though)], and as a company they seem to be shifting toward the low-power segment, so either the customisations or the off-the-shelf parts will be good value performance-wise rather than set the world on fire. Also, Microsoft and Sony [the non-console divisions] have been in a bit of a rut the last couple of years, and are just about getting out of it.
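The "1.3 TF" figure in point 3 falls straight out of the standard peak-FLOPS formula: shader count times two operations per clock (a fused multiply-add) times clock speed. A minimal sketch, with the caveat that the 768/1152 shader counts and 800 MHz clock are the rumored specs of the era, not confirmed numbers:

```python
# Theoretical peak GPU throughput from rumored specs. The shader counts
# and clock speed below are assumptions taken from the rumor mill of the
# time, not confirmed hardware figures.

def gpu_tflops(shaders: int, clock_ghz: float) -> float:
    """Peak single-precision TFLOPS: shaders * 2 FLOPs/cycle (FMA) * clock."""
    return shaders * 2 * clock_ghz / 1000.0

print(f"Durango (rumored 768 shaders @ 0.8 GHz): {gpu_tflops(768, 0.8):.2f} TF")
print(f"Orbis (rumored 1152 shaders @ 0.8 GHz): {gpu_tflops(1152, 0.8):.2f} TF")
```

With those inputs the formula gives roughly 1.23 TF and 1.84 TF, which is where the numbers being argued over in this thread come from.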
 

GopherD

Member
I'm curious why you need to devote resources beyond a bit of RAM to those other things, though.

If you have a gaming-focused machine, it will be great for video playback and all that stuff. PS3 already has DVR functionality that works in the background while playing, in both Japan and Europe.

And I hope PS4 has hardware to support streaming to Vita without impacting native games, so that remote play is available for all games. Plus possibly to Android phones/tablets too.

The storage of data, and the ability to change its state quickly in reads and writes, is only part of the computational silicon needed for all of these functions.

If you build a system with more silicon dedicated to peripheral services like XTV, internal DVR, secure Surface streams, etc., you reduce its capacity for the graphical pipeline or you raise its cost. MS will not raise its cost, so it reduces its graphics silicon.

MS is creating a "jack of all trades" system focused internally on many different areas connecting media, TV, PC and Surface. Sony is creating an internal gaming ecosystem based around graphical capability, cloud and remote streams, with TV and media functions as subsystems. To Sony it's gaming first, others second. To MS, it's all things first.

As far as third parties go, everyone is happy, the games will look identical because third parties have demanded it be so.

Do I need to end with an IMO?
 
1. 8>4; but we're gonna be stuck with devs who are just now getting workloads on four logical cores [Tom's hardware just now changed their editorial guidance from dual core to quadcore], and if this settles in, then maybe Orbis has the design win. [draw]

I also think that, having talented multi-threaded CPU programmers like Sony's first parties (above all after having done a postgrad with Cell), it makes more sense to go with 8 less powerful cores than with 4, and that's one of the clearer answers they would have got if ND, Santa Monica, Guerrilla, etc. were asked.

Besides, trying to mimic the fantastic engines these studios built for PS3 would be a little easier with more cores (well, in fact with more AVX units, or whatever vector units these cores have attached, to get the configuration most similar to the PS3's 6 SPUs).
 
I guess 1-2 cores of the Xbox CPU will also be used for Kinect / OS / apps etc. Imo I would rate the PS4 CPU as potentially more powerful, but this is of course only based on rumors.
 
I also think that, having talented multi-threaded CPU programmers like Sony's first parties (above all after having done a postgrad with Cell), it makes more sense to go with 8 less powerful cores than with 4, and that's one of the clearer answers they would have got if ND, Santa Monica, Guerrilla, etc. were asked.

Besides, trying to mimic the fantastic engines these studios built for PS3 would be a little easier with more cores...

Well, with Cell and SPUs they had to work heavily multi-threaded and of course gained a lot of experience, but I somehow doubt they really prefer 8 cores over 4. There is only so much you can parallelize, and it adds complexity and often needs workarounds or special algorithms to really gain performance.
 

Reiko

Banned
I also think that, having talented multi-threaded CPU programmers like Sony's first parties (above all after having done a postgrad with Cell), it makes more sense to go with 8 less powerful cores than with 4, and that's one of the clearer answers they would have got if ND, Santa Monica, Guerrilla, etc. were asked.

But then there's the unseen variable of 343i, which has members who were behind the creation of DirectX 11. Who knows what kind of things they will be experimenting with on an 8-core design.

We really can't tell until many years into next gen. Overall... it should be fun on the software end.
 
But then there's the unseen variable of 343i, which has members who were behind the creation of DirectX 11. Who knows what kind of things they will be experimenting with on an 8-core design.

We really can't tell until many years into next gen. Overall... it should be fun on the software end.

Yes, 343i is like MS's graphics dream team (it was about time they founded something like that). They promise a lot, judging by what they did with Halo 4.
 
They also used a lot of those SPUs to make up for the weak GPU. They won't have to do this anymore, because a modern GPU can handle those tasks much more efficiently than CPU cores. To me it seems like Sony chose a strong CPU with 4 cores with gaming as the primary focus, and MS chose a weaker CPU with 8 cores for multitasking, with their rumored multimedia features in mind.
 
Gemüsepizza;46587237 said:
I guess 1-2 cores of the Xbox CPU will also be used for Kinect / OS / apps etc. Imo I would rate the PS4 CPU as potentially more powerful, but this is of course only based on rumors.

In the beginning the rumor was 2 cores and 3 GB for the OS. The latest chirp is 2 GB and 1-2 cores.
 

davious88

Banned
The only problems with Vita are the overpriced memory and low support from Sony themselves.

 
The good thing is, we'll have a better idea of which setup is the way to go.

Of course this is just walking in the dark, but who cares really? Let's speculate!

1. 8 smaller cores or 4 beefier cores.
2. Lots of ram or faster ram.
3. Simpler architecture or hidden complexity.
4. Closer to off-the-shelf or lots of customisation.

...

My thoughts:

1. 8>4; but we're going to be stuck with devs who are only now getting workloads onto four logical cores [Tom's Hardware just changed its editorial guidance from dual core to quad core], and if that settles in, then maybe Orbis has the design win. [draw]
2. Faster RAM. I reckon Orbis is functionally a graphics card with a CPU attached, which is the way modern games work. Extra RAM is more a Microsoft division thing than an Xbox division thing. But devs will make that 8 gigs count, and we may be in for some ride, especially if we're aiming for 720p as the baseline. [Orbis on paper, but I'm not at all confident, so draw]
3. If Durango is 1.3 TF, then it cannot achieve anything higher than that, so Orbis wins. Except something about this isn't right. I'd personally go for the simpler architecture that is, simply put, better [like Intel cores rather than AMD modules, or something else entirely]. [I expect this to be closer than the supposedly lower Durango specs imply, because devs haven't complained. This is highly suspect; something funny is going on.] [Orbis on paper. And basically, I personally think devs (non first party) can get a lot more out of the simpler platform than the more complicated one.]
4. Draw. Because, honestly, AMD isn't doing so well [Intel reigns supreme, lots of engineers have lost their jobs, and Bulldozer didn't quite match expectations (Bobcat and Jaguar seem to be different, though)], and as a company they seem to be shifting toward the low-power segment, so either the customisations or the off-the-shelf parts will be good value performance-wise rather than set the world on fire. Also, Microsoft and Sony [the non-console divisions] have been in a bit of a rut the last couple of years, and are just about getting out of it.

The number of cores really doesn't mean a whole lot without context. They're not the same architecture at all, so looking at it from a purely numerical standpoint is pointless. Developers won't be stuck with anything, because Steamroller should be able to execute many more instructions per clock than Jaguar.
 
Please don't take this as a personal attack but the more I read your posts the more I question your objectivity.

On one side you claim that developers are much more confident in Microsoft, that 8 GB of DDR3 with abysmal bandwidth is better than 4 GB of GDDR5, and that certain developers question Sony's approach to next gen hardware, and yet judging from your posts you know quite a lot about Durango and next to nothing about Orbis.

Considering that consoles are heavy data streamers, I do believe that bandwidth > size. If Sony consulted the likes of Epic and EA (they did), they would have been the first to say "give us speed" (with, of course, a reasonable amount of RAM).

For example, Unreal Engine 3 is a heavy streamer. On a low bandwidth system you will (again) have ugly texture pop-in in UE3 (and probably UE4).

Of course Sony couldn't just use 2 GB of GDDR5 where 50% would be eaten by the OS, but 4 GB of GDDR5, where at least 3 GB is usable initially, is more than enough for next gen. In the end, once the OS is optimized and they manage to shrink it, devs will probably have 3.5 GB or more at their disposal (they reduced the PS3's OS from 120 to 50 MB, which goes to show you can reduce an OS footprint considerably if you know how).

Of course, when you have devs like Bethesda that like to load every single possible asset into RAM and keep it there for an unnecessary amount of time before resetting it (who gives a s*it on what XY coordinate I dropped some random book from some random shelf in some random dungeon), you can never have enough RAM.

I think it's the other way around. Texture pop-in exists because the streaming speed is nowhere close to the memory speed. If you have more RAM you can buffer far more in advance, so you won't get texture pop-in, or at least less than if you had less RAM and had to stream just the more immediate stuff.

Having more memory bandwidth would allow them to use a larger pool of memory at once, which means they can push bigger assets on screen... The difference in bandwidth (assuming both machines use the same compressed data formats and whatnot) means that even at 60 fps Orbis would be able to access a larger pool of memory per frame than Durango at 30 fps. At 30 fps Orbis can access more memory per frame than its entire 4 GB, so more RAM wouldn't make a difference there... This could mean we see more 60 fps titles on Orbis, or at least that 60 fps titles will look better on Orbis and show no drop in asset fidelity compared to 30 fps titles. Or that even at 30 fps, some titles will have higher asset quality on Orbis.
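To put rough numbers on the "memory accessible per frame" argument, here's a minimal sketch using the bandwidth figures rumored in this thread (176 GB/s GDDR5 for Orbis, 68 GB/s DDR3 for Durango); both numbers are unconfirmed, and real workloads never hit theoretical peak:

```python
# Per-frame bandwidth budget at theoretical peak. The bandwidth figures
# are the RUMORED ones from this thread, not confirmed specs.

def bytes_per_frame(bandwidth_gb_s, fps):
    """GB of memory that could be touched once per frame at peak bandwidth."""
    return bandwidth_gb_s / fps

for fps in (30, 60):
    orbis = bytes_per_frame(176, fps)    # rumored GDDR5 peak
    durango = bytes_per_frame(68, fps)   # rumored DDR3 peak
    print(f"{fps} fps: Orbis ~{orbis:.1f} GB/frame, Durango ~{durango:.1f} GB/frame")
```

At 30 fps Orbis's peak sweep (~5.9 GB) already exceeds its rumored 4 GB pool, so capacity rather than bandwidth becomes the limit, which is the point being made above.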

If you want to do 1080p, bandwidth helps.

If bandwidth is an issue on Durango, it has the potential to drop to 720p before the PS4 does.

There's no way they would put the framebuffer on a paltry 60 GB/s of DDR3, so I guess it's safe to assume they're going with a similar design again, with all framebuffer operations happening in the ESRAM.

Even if the same tiling issues persist (and most sub-HD titles went sub-HD to avoid tiling costs), the increase in memory is bigger than the increase in resolution. Deferred rendering aside, 32 MB is more than enough to hold a 1080p buffer, especially if they go with post-process AA, so it'll probably fare better relative to the 360 in this regard.
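A quick sanity check on "32 MB is more than enough for a 1080p buffer": a sketch assuming a simple forward renderer with one 32-bit color target and a 32-bit depth/stencil buffer (a deferred G-buffer would need far more, as noted):

```python
# Size of 1080p render targets vs. a 32 MB ESRAM budget.

def buffer_mb(width, height, bytes_per_pixel):
    """Size of a single render target in MiB."""
    return width * height * bytes_per_pixel / (1024 * 1024)

color = buffer_mb(1920, 1080, 4)   # RGBA8 color target, ~7.9 MB
depth = buffer_mb(1920, 1080, 4)   # D24S8 depth/stencil, ~7.9 MB
total = color + depth
print(f"color {color:.1f} MB + depth {depth:.1f} MB = {total:.1f} MB of 32 MB")
```

That's roughly 15.8 MB, comfortably under 32 MB with room left over for post-process targets, whereas the 360's 10 MB couldn't even hold a 720p frame with 4x MSAA without tiling.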

At the very least, I think we'll see fewer first-party titles underperforming resolution-wise.
 
The storage of data and its ability to quickly change state in read/write is only part of the computational silicon needed for all of these functions.

If you build a system with more silicon dedicated to peripheral services like XTV, internal DVR, secure Surface streams, etc., you reduce its capacity for the graphics pipeline or you raise its cost. MS will not raise the cost, so it reduces the graphics silicon.

MS is creating a "jack of all trades" system focused internally on many different areas connecting media, TV, PC and Surface. Sony is creating an internal gaming ecosystem based around graphical capability, cloud and remote streams, with TV and media functions as subsystems. To Sony it's gaming first, others second. To MS, it's all things first.


As far as third parties go, everyone is happy, the games will look identical because third parties have demanded it be so.

Do I need to end with an IMO?
No, you are correct as far as my opinion is concerned. The PS3 was "it only does everything," and this time it will be the Xbox3. Sony this time will use Nasne and blu-ray players or rely on the home DVR, allowing the PS4 to be less expensive, while Microsoft will go for an all-in-one that is more expensive.

I'll copy from the other thread:

Rumor: the Xbox3 won't support Blu-ray.

The source of the rumour:
http://www.thenextxbox.com/2013/01/...gainst-blu-ray-movie-playback-with-xbox-next/

It's not really a rumour though, just speculation.

Jesus Christ.
This just follows Microsoft's OS choices, too. No codecs for DVD or Blu-ray in MS operating systems, since each one costs licensing money. There was also a judgment that they can't bundle Java with the OS (you have to install it yourself). Java and H.264 are necessary for a Blu-ray player.

In any case, MPEG-2, H.264 and soon H.265 will be needed for IPTV streaming, so if you support those you can support DVD, Blu-ray and 4K Blu-ray. Hardware decode for MPEG-2 and H.264 is built into AMD APUs and non-mobile GPUs.

Both Microsoft and Sony want their next game console to be an indispensable part of the living room. So: DVD, Blu-ray, 4K Blu-ray, IPTV streaming, Skype, voice and gesture recognition with a common standard, a common set of apps that also interface with Skype (contact list, calendar, journal), an XTV standard common to the industry (which means RVU support at a minimum), and likely HDMI pass-through on both. Both will comply with international standards for power modes across everything they support.

This means a large chunk of the PS4 and Xbox3 will be identical hardware with some identical software, with WebGL and OpenGL support rather than DirectX or DirectCompute being the biggest obvious difference. XTV apps were XHTML (Java + HTML 4.01) with OpenVG supporting XML. This will change to HTML5 (WebGL & WebCL), no Java required.

XTV will also be supported by ATSC 2.0, which uses the Blu-ray codec. After March 2013 there will be digital channels that only the latest smart TVs, or a PS3, PS4 or Xbox3, can access. These channels will support 1080p and 3D as well as extended features via a web browser. So in the short term you will need a Google TV box, a smart TV or a game console to access the new content. This, along with the must-have new features in game consoles, will make owning one a must, and I expect a price point that allows impulse buying.

A large chunk of the PS4 and Xbox3 being identical hardware = one Jaguar CPU package turned on and the 8280 2-CU GPU on, but the second GPU turned off. This makes always-on XTV and IPTV streaming a 10 W or less feature. This part of the PS4 may be called Thebe and hidden in Kryptos. Thebe should also contain the WideIO memory and transposer common to both game consoles. This is why Thebe needed to be completed first, and why Oban, named for a large blank Japanese gold coin with bumps, needed to be made and tested before Thebe. You cannot support a low-power mode with GDDR5 memory that needs 40 watts running a game and 10+ watts even at a slower clock, on top of the less than 10 watts the rest of Thebe needs (5 watts are burned by the second GPU even when it's off). Stacked DDR4 memory in low-power mode will burn less than a watt. 256-bit-wide memory would need to run twice as fast as WideIO memory in low-power mode and would require more power: 2 watts instead of 1 watt or less.

There is speculation in the above but it's informed speculation taking into account the big picture which is sadly lacking in professional articles.

Oban is not a Greek name and is not an AMD-Microsoft or Sony-AMD project name (those are Kryptos and Thebe). AMD is using names and figures from Greek mythology, and while Kryptos is Greek for hidden, Oban is not Greek for anything. Oban is likely named after a large blank Japanese gold coin with bumps, which is nearly descriptive of a transposer: it costs like gold and has bumps to connect (MCM) chips to each other. So hidden in Kryptos is Thebe, and Kryptos is also the name of the statue outside CIA Langley which, decrypted, is the hidden Carter account of the King Tut tomb discovery at Thebes in Egypt, with Thebe being a figure in Greek mythology.
 

Ashes

Banned
The number of cores really doesn't mean much without context. They're not the same architecture at all, so comparing them on a purely numerical basis is pointless. Developers won't be stuck with anything, because Steamroller should be able to execute many more instructions per clock than Jaguar.

That's the interesting part. Clock for clock they are getting closer. Brazos is surprisingly efficient, even though, clock for clock, Bulldozer was better. Jaguar is a much more efficient processor than the current APUs, and with double the cores [not exactly, I know], I maintain that it will be interesting.

On that note, last I checked, the A10-5800K could on paper do about 700 SP GFLOPS [with the GPU on board], theoretical peak of course, but in practice people have managed only about a third of that. That should improve with GCN though; current A10 APUs run the older VLIW4 (or was it 5?) architecture, I can't remember.
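That ~700 GFLOPS figure checks out as a back-of-envelope sum of the A10-5800K's GPU and CPU peaks. A sketch using public spec-sheet numbers (384 shaders at 800 MHz, 4 cores at 3.8 GHz); the FLOPs-per-cycle assumptions are mine:

```python
# Theoretical single-precision peak of the A10-5800K, CPU + GPU combined.

def gpu_gflops(shaders, clock_ghz, flops_per_clock=2):
    # Each shader ALU can do a fused multiply-add = 2 FLOPs per clock.
    return shaders * clock_ghz * flops_per_clock

def cpu_gflops(cores, clock_ghz, flops_per_cycle=8):
    # Assuming 8 SP FLOPs/cycle/core via the shared 128-bit FMACs.
    return cores * clock_ghz * flops_per_cycle

gpu = gpu_gflops(384, 0.8)   # HD 7660D: 384 shaders @ 800 MHz -> ~614
cpu = cpu_gflops(4, 3.8)     # 4 Piledriver cores @ 3.8 GHz -> ~122
print(f"GPU {gpu:.0f} + CPU {cpu:.0f} = {gpu + cpu:.0f} GFLOPS peak")
```

That lands at roughly 736 GFLOPS theoretical, i.e. the "about 700" above; managing only a third of that in real workloads is exactly what you'd expect of peak numbers.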
 
That's the interesting part. Clock for clock they are getting closer. Brazos is surprisingly efficient, even though, clock for clock, Bulldozer was better. Jaguar is a much more efficient processor than the current APUs, and with double the cores [not exactly, I know], I maintain that it will be interesting.

On that note, last I checked, the A10-5800K could on paper do about 700 SP GFLOPS [with the GPU on board], theoretical peak of course, but in practice people have managed only about a third of that. That should improve with GCN cores though.

The changes they're bringing to SR should see a pretty sizable increase in performance (early estimates peg it as a 20-25% increase over PD, which was itself a 10-15% increase over BD). I haven't seen a single CPU-focused benchmark that shows Bobcat-based parts anywhere near Bulldozer or Piledriver CPUs. Heck, I haven't seen one that puts them past Phenom Is. But then again I'm not expecting to; Bobcat was made to beat Intel's Atom line.

Flops don't really mean anything on their own, though. APUs won't really take off until we get more OpenCL/GPGPU programs out and about and HSA is put in place. Right now APUs are still very much seen as separate CPU/GPU entities and are treated as such.
 
Here's another thing I wonder.

Is the GDDR5 in PS4 a reaction to the eSRAM in 720, or is it the other way around?

I mean is the eSRAM really 32MB?

Neither is likely a "reaction" to the other. Lead times on console and silicon designs run into years.

AFAIK, neither Sony nor MS likely cared about the other's plans. At least that seems to be what I get from Microsoft sources.

I don't know if the ESRAM is really 32MB, but that seems to be the accepted number now, so I think so. It's a perfect number imo. Not so much that it's overly expensive, but enough to be a good deal more roomy per pixel at 1080p than the 360's setup.

If anything I think EDRAM makes a good deal more sense this gen. It's likely to be something like half as expensive. I'm not sure if they have 28nm ESRAM/EDRAM, but if they did, compared to the 360 at launch, for a given budget it's roughly:

10MB@90nm = 20MB@65nm = 40MB@45nm = 80MB@28nm

So you can do 80MB now for the same cost as 10MB then. Except you're only using 32MB now. The die area (and thus cost) should be less than half, while still giving you more EDRAM per pixel.
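The 10MB = 20MB = 40MB = 80MB equivalence above follows from one assumption: each full node shrink roughly halves area per bit, so capacity per unit cost doubles. A minimal sketch of that scaling (idealized; real SRAM/eDRAM scaling is messier than this):

```python
# Embedded memory capacity at constant cost across successive full-node
# shrinks, assuming an idealized 2x density gain per shrink.

NODES = [90, 65, 45, 28]  # nm, treated as successive full-node steps

def capacity_at_node(base_mb, steps):
    """Capacity affordable at the same cost after `steps` full shrinks."""
    return base_mb * 2 ** steps

for i, node in enumerate(NODES):
    print(f"{capacity_at_node(10, i)} MB @ {node} nm")
```

On that assumption, a 32 MB array at 28 nm should indeed cost well under half what the 360's 10 MB daughter die cost at 90 nm, while giving more room per pixel even at 1080p.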

It's possible these systems come out pretty close despite being designed in vacuums, for perfectly good reasons. I remember Mattrick, I think, in an interview years ago about 360 vs PS3 saying something like "we've got the same sand they do, it doesn't matter, I traveled the world looking at sand and they don't have different sand than us." Sand meaning silicon.
 

Ashes

Banned
The changes they're bringing to SR should see a pretty sizable increase in performance (early estimates peg it as a 20-25% increase over PD, which was itself a 10-15% increase over BD). I haven't seen a single CPU-focused benchmark that shows Bobcat-based parts anywhere near Bulldozer or Piledriver CPUs. Heck, I haven't seen one that puts them past Phenom Is. But then again I'm not expecting to; Bobcat was made to beat Intel's Atom line.

Flops don't really mean anything on their own, though. APUs won't really take off until we get more OpenCL/GPGPU programs out and about and HSA is put in place. Right now APUs are still very much seen as separate CPU/GPU entities and are treated as such.

*nods in agreement*
 
That's the interesting part. Clock for clock they are getting closer. Brazos is surprisingly efficient, even though, clock for clock, Bulldozer was better. Jaguar is a much more efficient processor than the current APUs, and with double the cores [not exactly, I know], I maintain that it will be interesting.

Yeah, it's an interesting comparison. Twice as many cores versus twice the clock speed. The advantage will come down to the nature of the workload more than anything else.
 

sTeLioSco

Banned
We heard developers asking for 8GB at GDC. Clearly developers don't just want super fast RAM, as we're not going to see video cards with anywhere near 4GB of video memory any time soon. I think there's an obvious compromise in the decisions that both platform holders made.

Also, maybe I know things you don't. Or think I know, anyway.

What does that have to do with video cards??
The fast RAM in the PS4 is not only video memory... it's unified.

If it had 6GB of fast RAM, would you say "we're not going to see video cards with anywhere near 6GB of video memory any time soon"???

Yes, because you are a fanboi.
 

Reiko

Banned
Neither is likely a "reaction" to the other. Lead times on console and silicon designs run into years.

AFAIK, neither Sony nor MS likely cared about the other's plans. At least that seems to be what I get from Microsoft sources.

I don't know if the ESRAM is really 32MB, but that seems to be the accepted number now, so I think so. It's a perfect number imo. Not so much that it's overly expensive, but enough to be a good deal more roomy per pixel at 1080p than the 360's setup.

It's possible these systems come out pretty close despite being designed in vacuums, for perfectly good reasons. I remember Mattrick, I think, in an interview years ago about 360 vs PS3 saying something like "we've got the same sand they do, it doesn't matter, I traveled the world looking at sand and they don't have different sand than us." Sand meaning silicon.

Another good point. Should be interesting come release for both of them.
 
I wish someone would compile a list of all the possible specs Jeff has mentioned in the last months...
They are here: http://www.neogaf.com/forum/showthread.php?t=473780. Speculation is circling back to where most of us started in early 2012. It was disrupted by some who legitimately did not think TSVs and stacking would be used this cycle, because they felt the tech was not ready, despite the Sony CTO article. Every professional article has been predicting 3D stacked memory on interposer for game consoles, and I have cites for all of them.

Speculation has also been thrown off by the SemiAccurate Oban article, which bkilian says is bunk. If bkilian is a former Microsoft employee who worked on DSP audio, then he likely would know something about the coming Xbox3.

I have mentioned no specs except stacked memory on transposer and two Jaguar CPU packages, and I echoed the Sony senior VP of Technology Platform's 5X memory bandwidth and 10X GPU = 100 GB/sec and 1.8 TFLOPS. Yole's interpretation of the Sony presentation was 512-bit-wide stacked memory on interposer. I found the leaked Xbox 720 presentation and stated, as shown in that presentation, that both Microsoft and Sony have the same plans and would need the same hardware to support accessories.
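Those 5X/10X multipliers can be sanity-checked against the PS3's commonly quoted figures. A sketch where the baselines (XDR at ~25.6 GB/s, RSX at ~0.18 TFLOPS programmable) are my assumptions, not numbers from the presentation:

```python
# Checking the "5X memory bandwidth, 10X GPU" claim against assumed
# PS3 baselines (both baselines are commonly cited figures, not specs
# taken from the Sony presentation itself).

ps3_bandwidth_gb_s = 25.6   # XDR main memory, assumed baseline
ps3_gpu_tflops = 0.18       # RSX programmable FLOPS, assumed baseline

print(f"5x bandwidth  ~= {5 * ps3_bandwidth_gb_s:.0f} GB/s")
print(f"10x GPU power ~= {10 * ps3_gpu_tflops:.1f} TFLOPS")
```

That gives ~128 GB/s and 1.8 TFLOPS, in the ballpark of the quoted 100 GB/sec and 1.8 TFLOPS figures, with the bandwidth multiple depending on which PS3 pool you take as the baseline.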

I went off on a tangent because 2014 HSA features were not available until 20nm, and found a port of Pennar/Samara at 20nm from TSMC to GlobalFoundries, with Samara being released in 2013. Samara, as a Jaguar mobile APU at 20nm, must have the 2014 HSA features needed by the PS4. I tried to find out whether 20nm could be supported in 2013, and it seems LPM silicon can have yields high enough to support a 2013 Samara release. I came to the conclusion that it's more likely critical parts of the Samara design were ported to 28nm and used in the PS4.

I've also said that an 8000-series GPU would be used in the PS4 and that an 8000M-series GPU would be the second GPU (APU + GPU is needed because of low-power modes, and AMD recommends APU + GPU until graphics pre-emption is supported in 2014). Laptops and PCs with embedded GPUs can share the same memory pool, as this reduces costs, and I expect the PS4 design has the second GPU sharing the same WideIO memory on the transposer.
 

Nachtmaer

Member
I think they're just having trouble putting everything together in a reliable fabrication. It happens a lot.

From what I've read, I think the main reason for Steamroller's delay was that GlobalFoundries' 28nm process wasn't ready yet. Jaguar/Kabini will be fabbed on TSMC's 28nm process, Steamroller/Kaveri on GloFo's 28nm process.
 

wizzbang

Banned
Durango is supposed to have some blocks of more or less fixed function hardware that would do specific things really well, apparently. It's a bit of a strange approach in my opinion, but it's an interesting way to differentiate.

I distinctly recall being 6 or 7 years younger, fascinated with technology and more up to date with it. When I learnt what the 10MB of EDRAM (!!!!) could do on the 360 to 'permanently eliminate jaggies forever with free AA per clock cycle!!', I was intrigued.

Look how that specific piece of cache worked out on the die. Do we still see jaggies and non-AA'd screens? Yerp.
 

wizzbang

Banned
Will the PS4 feature a discrete GPU in addition to the APU? I'm really lost when it comes to the technicalities of these things. I thought there was talk of the PS4 featuring both something akin to the Trinity APU and an 8xxx-series GPU? Or was that all just crap? I want the machine to be beastly while maintaining affordability.


I'm rusty on tech compared to how I used to be, but I don't think I'd be too out of line in describing the A10's GPU, compared to say a 7970 (or even something significantly lesser), as a complete and utter piece of shit.
The A10 rumour I've heard multiple times now, but I find it hard to believe, to be honest; even tweaked, it's just pitiful compared to a 'proper' GPU.

This is giving me PS3 deja vu. Sony were convinced the Cell could do EVERYTHING and were very reluctant to put the Nvidia chip on the board, which they eventually had to do quite close to the end because the CPU just couldn't do what they expected of it.

If the PS4 ships with an A10, I'd be very surprised if it isn't monstrously tweaked in some way, so much so that even calling it an A10 wouldn't really be a fair description.
 

Ashes

Banned
I'm rusty on tech compared to how I used to be, but I don't think I'd be too out of line in describing the A10's GPU, compared to say a 7970 (or even something significantly lesser), as a complete and utter piece of shit.
The A10 rumour I've heard multiple times now, but I find it hard to believe, to be honest; even tweaked, it's just pitiful compared to a 'proper' GPU.

This is giving me PS3 deja vu. Sony were convinced the Cell could do EVERYTHING and were very reluctant to put the Nvidia chip on the board, which they eventually had to do quite close to the end because the CPU just couldn't do what they expected of it.

If the PS4 ships with an A10, I'd be very surprised if it isn't monstrously tweaked in some way, so much so that even calling it an A10 wouldn't really be a fair description.

Oh, it won't be an A10 for sure, what with the rumoured 1.8 TFLOPS. The A10-6800K is rumoured to be coming pretty soon, but I don't think even that is anywhere near that figure.

Last I heard, Kaveri [Steamroller + GCN] was topping out at 1 teraflop, and that's a generation ahead of the Richland and Piledriver parts mentioned above.
 