
PS4 Rumors: APU code-named 'Liverpool', Radeon HD 7970 GPU, Steamroller CPU, 16GB Flash


Karak

Member
But weren't those architectural complexities, plus the usage of XDR, what made CELL better suited for graphics work than a conventional CPU? Even if development is more straightforward, it still takes time and money to do a ton of coding specific to one SKU. And when you couple that with the rumors of the 720 using a lot of slow main RAM, I'm not sure a conventional CPU, even a really high-powered one, is a great place for a ton of graphics work.

I am just talking about the comparisons, and that the Cell is just pretty damn complex compared to pretty much anything MS is going to put into their systems, at least going by the rumors so far. As for everything else, no one knows. It's really out there, and again, that's why so many people question it. Others have pointed to VG247 and their history of breaking a couple of big stories; their specs and leaks indicated a very powerful system all around, and they pointed to it being a firmer source than even their two earlier leaks that proved true. They also pointed out that their leak confirmed dual smaller GPUs running in Crossfire/SLI. Which, when accounting for the jump to a new process node, a 6670 does make more sense if it's a dual-card or dual-chip system.

One thing is certain: devs will always figure out a way. A slower CPU they will work around; a slower GPU they will work around. We will all just live with the sad consequences. Or possible benefits.
 

Raistlin

Post Count: 9999
Is it TRULY next generation if they can't at the very least get everything running at 1080p 60FPS? Not in my opinion.
By that measure, next generation doesn't even exist on PC for the most part with top level games.

Buying a 1080p TV for GAMING-only purposes this generation was just about worthless.
lol and who would have done that? The smallest amount of due diligence would tell someone not to use gaming as a motivation for 1080p.

That said, unless someone bought a piece of crap, a quality TV should be around for the next gen and not be feature-crippled.






I am just talking about the comparisons, and that the Cell is just pretty damn complex compared to pretty much anything MS is going to put into their systems, at least going by the rumors so far.
I just think blaming complexity on why CELL wasn't utilized to its full extent in multiplatform titles is a gross oversimplification of the situation.

As for everything else, no one knows. It's really out there, and again, that's why so many people question it. Others have pointed to VG247 and their history of breaking a couple of big stories; their specs and leaks indicated a very powerful system all around, and they pointed to it being a firmer source than even their two earlier leaks that proved true.
Oh I'm not stating it definitely won't happen, just trying to figure out the rationale behind it.

More to the point, I was specifically questioning the merits of a dramatically more powerful CPU as it pertains to games in particular. I don't see an obvious benefit there. Certainly there could be plenty of other uses for it, but the gaming implications have been brought up a few times, and I question how useful it will really be there.

One thing is certain: devs will always figure out a way. A slower CPU they will work around; a slower GPU they will work around. We will all just live with the sad consequences. Or possible benefits.
Certainly. More power is never bad, all things being equal.
 
By that measure, next generation doesn't even exist on PC for the most part with top level games.


lol and who would have done that? The smallest amount of due diligence would tell someone not to use gaming as a motivation for 1080p.

That said, unless someone bought a piece of crap, a quality TV should be around for the next gen and not be feature-crippled.

What do you mean? They all run at NATIVE 1080p. Performance will vary based on how much money one is willing to spend on their hardware.
 

jmdajr

Member
Strictly gaming, how will PS4 compare to a 500 dollar PC in 2014? I think it will be better, no doubt. A 1000 dollar PC? Tougher.
 

Raistlin

Post Count: 9999
yes it does
Note I said 'for the most part'.



You can quite easily drop below 60fps on an overclocked 680 with current high-end games, depending on the resolution and effects. The salient point here being that for most devs, 60fps is never going to be a highly weighted target.

How can that even be argued? We've seen generation after generation prove that point.
 

Karak

Member
I just think blaming complexity on why CELL wasn't utilized to its full extent in multiplatform titles is a gross oversimplification of the situation.

I wasn't, and never would, blame it all on that, so I really can't speak to why it wasn't used to its fullest. But from devs who worked on it, and a couple that spoke about it last year, it was the major factor, or at least the largest, when it came to difficulty. As for the superfast CPU thing, I have no clue. Things change, some stay the same. Just no telling.

I don't like a huge disparity either. A small one, sure, but one console with massive RAM and a massive CPU and the other with a great video card? At one time or another that is going to cause an issue that will be pretty noticeable. That's why I hope the VG247 leak is closer to the truth, and they sure do have a better record than IGN on this, ha. Then it will be more equal.
 
Note I said 'for the most part'.



You can quite easily drop below 60fps on an overclocked 680 with current high-end games, depending on the resolution and effects. The salient point here being that for most devs, 60fps is never going to be a highly weighted target.

How can that even be argued? We've seen generation after generation prove that point.

It can't be, and you're right about that. I was simply saying that 60FPS at 1080p or higher is very common on PC these days.
 

gatti-man

Member
Note I said 'for the most part'.



You can quite easily drop below 60fps on an overclocked 680 with current high-end games, depending on the resolution and effects. The salient point here being that for most devs, 60fps is never going to be a highly weighted target.

How can that even be argued? We've seen generation after generation prove that point.
You said at 1080p. PC next gen is SLI/Crossfire anyway. At the moment 60fps is easily attainable in any game with the right tech.
 

Raistlin

Post Count: 9999
I wasn't, and never would, blame it all on that, so I really can't speak to why it wasn't used to its fullest. But from devs who worked on it, and a couple that spoke about it last year, it was the major factor, or at least the largest, when it came to difficulty.
I suspect it is a combination of factors. But overall, I question whether the 720 will really be in a notably better position, given the current HW rumors, the constraints of multiplatform development, and MS's trend away from exclusive content.

I don't like a huge disparity either. A small one, sure, but one console with massive RAM and a massive CPU and the other with a great video card? At one time or another that is going to cause an issue that will be pretty noticeable.
Yep.

The good news is PCs have demonstrated that graphics are pretty scalable, as long as there's a reasonable amount of graphics RAM and the GPUs are running shader models that are at least close. With the upcoming gen, it looks like the big boys will have enough gRAM and will be running cards based on modern shader models. With that in mind, I think we'll at least see better overall performance (framerates) in multiplatform titles. And if the architectures are relatively easy to develop for, we'll see some graphics optimizations where applicable, instead of 'lazy ports'.


What will be most interesting in my mind is the 'other' things these consoles offer. An argument that has only increased in discussion is whether non-gaming features matter. The old question: should a console be for games only, or is that just part of the puzzle? Historically, power hasn't been the be-all, end-all differentiator in sales, so if these consoles show dramatic differentiation in terms of non-gaming features ... it's going to be quite interesting seeing the results in adoption. This could be the first time we really get to see that discussion tested in the real world.
 

Raistlin

Post Count: 9999
You said at 1080p. PC next gen is SLI/Crossfire anyway. At the moment 60fps is easily attainable in any game with the right tech.
Sorry if I wasn't clear, but I meant 60fps. PC resolution is relative to monitor tech of the time. It's like arguing 360 has no problems running games at 480p60.

SLI/Crossfire has been around for quite a while. I would argue it is not 'PC next gen' ... it's PC bleeding edge. PC next gen is whatever the best available single-GPU card is.

Certainly, yes, in general you can keep throwing HW at a PC to up performance. But devs keep upping the requirements of their games due to HW availability. My buddy has a 690. At 1080p he can run pretty much any game at 60fps (maybe the occasional dip) at top settings. There are some exceptions, and if he ups the res it no longer holds true in many more cases. Do you think upcoming games will not be more demanding?

The point here is that even with PC gaming, at almost any time you can find a situation where 60fps isn't achievable. That's because it is not the goal in many games. The devs are purposely trying to eke out the best graphics, with framerate not being their highest priority.
 

Karak

Member
What will be most interesting in my mind is the 'other' things these consoles offer. An argument that has only increased in discussion is whether non-gaming features matter. The old question: should a console be for games only, or is that just part of the puzzle? Historically, power hasn't been the be-all, end-all differentiator in sales, so if these consoles show dramatic differentiation in terms of non-gaming features ... it's going to be quite interesting seeing the results in adoption. This could be the first time we really get to see that discussion tested in the real world.

The other stuff I don't have much of an opinion on but this last bit I really do.

I love that I can use my consoles for almost everything. Movies, music, some apps, all that stuff. For example, the UFC stuff on the 360 was simply ace. Some of the coolest stuff ever, in addition to them making a couple of UFC events free. That's just a bit of what I liked. So many amazing things on consoles.

If either doesn't offer, at the bare minimum, the exact same things as the offerings now, I won't get it right away. They are mainly gaming machines, but speaking for me alone, the 360 has become a huge draw in my family. I have parents and nieces and nephews who come over and use the Kinect, watch movies and so forth; I play my games, have my party chats and my multiplayer stuff, and the wife watches Netflix. It's a total hub for me, and it crept up on me without notice until I found that I was getting rid of device after device.

PS3 to a somewhat lesser extent is the same where I work (we have one in our employee game room; it's my old PS3 I gave them).

So if either doesn't offer the same services next gen... man, that will rankle me.
 

coldfoot

Banned
Given the lack of BC in the PS4, a very PC-like architecture with x86 and a powerful GPU, and the fact that most games are multiplatform including PC, I predict 1080p/60fps "full HD" remakes of games like Just Cause 2, BF3, and Call of Duty would sell very well at about $25-30 apiece; they would be very easy to port from the PC version and very profitable.
 

mrklaw

MrArseFace
Stop talking about mobile parts and start talking about what they really meant: underclocked Pitcairn :p

Regardless, why would any console manufacturer want to go the dual-GPU route?

A discrete GPU for graphics, plus a relatively low-power integrated GPU in the APU for offloading GPGPU-suitable work from the CPU.
 

mrklaw

MrArseFace
I said in the other 1TF UE4 thread that the bar they announced must be a threshold for one of the consoles. Likely Durango, if the 1-1.5TF leak is correct, with Wii U falling between 400-800 GFLOPS depending on your source.

8GB with 1TF for Durango seems absurd. So I think it will be 1TF with 4-6GB, or its GPU will be 1.5TF with 6-8GB.

If Durango is only 2-2.5x Wii U, with all the criticism Wii U is getting regarding tech, that would be disappointing. It would also help Nintendo, as they would be more likely to be able to get ports.

RAM doesn't have any particular connection with GPU power. If MS wants Windows 8 sitting in the background, always on, perhaps recording your TV shows and doing other things while you play, that could eat a lot of RAM. So 8GB doesn't mean all of it for games. Same for the CPU. Perhaps they are going with a stronger CPU because they'll be locking more cores out for other uses?
 
"So 8GB doesn't mean all of it for games. "

No, but PS4 could get some messed-up ports if it only has 2GB and Durango has like 6GB, even if a couple are dedicated to the OS and other non-gaming duties. The developers will get used to working with that much and will have to cram things to fit into 2GB. 2GB isn't much: conservatively 100MB for the OS, and they could adjust the memory allocated for video within 1-1.5GB, with the rest to run the game. It would always be a struggle. It could be a weakness that would be a thorn in its side for years, even if it's got a better GPU. They'd better figure out how to include 4GB.
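
Rough numbers, just to make the squeeze concrete (a back-of-envelope sketch in Python; the 100MB OS and ~1.5GB video figures are the guesses above, not confirmed specs):

    # Hypothetical 2GB console RAM budget; every figure here is a guess from the
    # discussion above, not a confirmed spec.
    TOTAL_MB = 2048
    os_mb = 100        # conservative OS reservation
    video_mb = 1536    # ~1.5GB carved out for video/graphics data
    game_mb = TOTAL_MB - os_mb - video_mb
    print(f"Left for game code and data: {game_mb} MB")                # 412 MB
    print(f"Same split with 6GB total: {6144 - os_mb - video_mb} MB")  # 4508 MB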
 
If Durango is only 2-2.5x Wii U, with all the criticism Wii U is getting regarding tech, that would be disappointing. It would also help Nintendo, as they would be more likely to be able to get ports.

RAM doesn't have any particular connection with GPU power. If MS wants Windows 8 sitting in the background, always on, perhaps recording your TV shows and doing other things while you play, that could eat a lot of RAM. So 8GB doesn't mean all of it for games. Same for the CPU. Perhaps they are going with a stronger CPU because they'll be locking more cores out for other uses?

Yes it does; don't be ridiculous. The RAM is going to help games look better than Wii U/Orbis, sorry! It isn't going to be for "recording TV with Windows 8" lol. WTF?

Also, people acting like Wii U even belongs in this discussion are being silly, at least until they show something better than E3, which basically looked worse than PS360 (Killer U). I cannot believe in 400-800 GFLOPS or 1.5-2GB RAM until they show games that don't look worse than 360, period.

It's looking more likely PS4 is going to be 2GB, judging by the discussion on B3D (there are pretty serious technical roadblocks to anything above 2GB of GDDR5 in a console), and in that case, if it's 8GB RAM + 1.5TF GPU on Durango vs 2GB RAM + 1.8TF GPU on Orbis, I'll take that all day. Even if it's 8GB + 1TF vs 2GB + 1.8TF, that's probably a tie at worst. But we know the GPU on Durango isn't finished, so I'm expecting it will end up pretty nice and at least close to what's in Orbis. MS isn't dumb.
 

z0m3le

Banned
"So 8GB doesn't mean all of it for games. "

No, but PS4 could get some messed-up ports if it only has 2GB and Durango has like 6GB, even if a couple are dedicated to the OS and other non-gaming duties. The developers will get used to working with that much and will have to cram things to fit into 2GB. 2GB isn't much: conservatively 100MB for the OS, and they could adjust the memory allocated for video within 1-1.5GB, with the rest to run the game. It would always be a struggle. It could be a weakness that would be a thorn in its side for years, even if it's got a better GPU. They'd better figure out how to include 4GB.

Is the whole 8GB thing not just the memory total in the dev kits? Because Wii U's dev kits had about 4GB of memory, since debugging takes up about as much memory as what you are actually running...

It wouldn't surprise me if someone was told that an XB3 dev kit had 8GB and ran the story that XB3 has 8GB, which the dev kit figure doesn't necessarily point to.

Yes it does; don't be ridiculous. The RAM is going to help games look better than Wii U/Orbis, sorry! It isn't going to be for "recording TV with Windows 8" lol. WTF?

Also, people acting like Wii U even belongs in this discussion are being silly, at least until they show something better than E3, which basically looked worse than PS360 (Killer U). I cannot believe in 400-800 GFLOPS or 1.5-2GB RAM until they show games that don't look worse than 360, period.

It's looking more likely PS4 is going to be 2GB, judging by the discussion on B3D (there are pretty serious technical roadblocks to anything above 2GB of GDDR5 in a console), and in that case, if it's 8GB RAM + 1.5TF GPU on Durango vs 2GB RAM + 1.8TF GPU on Orbis, I'll take that all day. Even if it's 8GB + 1TF vs 2GB + 1.8TF, that's probably a tie at worst. But we know the GPU on Durango isn't finished, so I'm expecting it will end up pretty nice and at least close to what's in Orbis. MS isn't dumb.

Some of those things you aren't believing were basically confirmed by the leaked Wii U specs thread last week. It being the weakest, by a multiple of 2 or 3, is clear based on the specs in this thread, but it's fairly plain that Wii U is stronger than a current-gen console. You might be right not to call it next gen, though; a midway point between last gen and next could in fact describe it better in terms of power.

But if the intelligent posters at Beyond3D are to be believed, the Wii U's CPU is likely based on 470s cores with 6 threads; insiders have said it has 2GB of RAM and a graphics card that produces more than 650 GFLOPs. Antonz has confirmed the GPU's feature set is up to date (2012). Pretty plain to see all this if we think about it...

Current-gen consoles are 2005 consoles with a 2006 design; Wii U is a 2009 console (GPU: R700, CPU: 470s) with 2012 features, and PS4/XB3 are 2013 consoles.

Again, I think XB3 having 4GB is more likely, especially if you take the "slow" RAM into account, as you can get to 4GB of GDDR3 with only 8 memory chips (GDDR5 just matched this density). I think it's safe to assume that if Sony goes with these specs, XB3 could build either box, but I am figuring they might go with a more middle-of-the-road approach and look to a spec between 1 and 1.5TFLOPs to keep the price down. All I do know is devs believe it is the simplest to design for, and I assume that is because of PS4 only having 4 available threads, with the OS taking up one of them.
 
Is the whole 8GB thing not just the memory total in the dev kits? Because Wii U's dev kits had about 4GB of memory, since debugging takes up about as much memory as what you are actually running...

It wouldn't surprise me if someone was told that an XB3 dev kit had 8GB and ran the story that XB3 has 8GB, which the dev kit figure doesn't necessarily point to.

No, the dev kits actually have 12GB. Sick, I know.

And there's 1-2GB of VRAM, I guess, on top of all this!
 
Again, I think XB3 having 4GB is more likely, especially if you take the "slow" RAM into account, as you can get to 4GB of GDDR3 with only 8 memory chips (GDDR5 just matched this density). I think it's safe to assume that if Sony goes with these specs, XB3 could build either box, but I am figuring they might go with a more middle-of-the-road approach and look to a spec between 1 and 1.5TFLOPs to keep the price down. All I do know is devs believe it is the simplest to design for, and I assume that is because of PS4 only having 4 available threads, with the OS taking up one of them.

There is also the eDRAM to be taken into account. Will the Xbox 720 GPU have eDRAM? Most likely it will, and this is gonna be an additional cost. Will the Xbox 720 have Blu-ray? I dunno, but if it will, that's gonna be an additional cost.
 
MS must have gone crazy at a Newegg sale. The only explanation.

lol, DDR3 is actually dirt cheap. DDR3 is what people are talking about when they look up Newegg prices and everybody says "that's DDR3, not GDDR5!" The problem is it's not fast.

I dunno what they did to get around this. eDRAM? A system RAM/VRAM setup like a PC?

I also have no idea how effective this RAM will actually be in games. With a 256-bit bus it can get around 97 GB/s of bandwidth max, I believe, vs. around 200 GB/s for the leaked Orbis specs.
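
For anyone who wants to sanity-check those numbers, peak bandwidth is just bus width times transfer rate. A quick sketch (the transfer rates here are back-solved from the figures above, not leaked clocks):

    # Peak bandwidth in GB/s = (bus width in bytes) * (transfer rate in GT/s).
    # Transfer rates below are inferred to match the figures above, not leaks.
    def peak_gb_s(bus_bits, gt_per_s):
        return bus_bits / 8 * gt_per_s

    print(peak_gb_s(256, 3.0))  # 96.0  -> ~97 GB/s implies ~3 GT/s DDR3 on a 256-bit bus
    print(peak_gb_s(256, 6.0))  # 192.0 -> GDDR5 at ~6 GT/s lands near the Orbis figure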

But I don't know what the effect of this is. It was commented that with that little bandwidth you can forget 4xAA, for example. Well, personally I'm fine with that; use FXAA or something, or skimp a little on AA. It seems worth it. Also, if we go to 1080p next gen, aliasing will be less of a problem by nature, since the higher the resolution, the less of an issue aliasing is.

Maybe they're just using it almost like a big cache. Instead of an SSD you use cheap DDR3, which is also much faster than an SSD.

Regardless, I like it.
 

z0m3le

Banned
No, the dev kits actually have 12GB. Sick, I know.

And there's 1-2GB of VRAM, I guess, on top of all this!

Hmm, interesting, but don't early dev kits usually have separate VRAM thanks to using actual desktop hardware? It's not like you can rip out the VRAM. It makes sense, if they are using 6GB of DDR3, to have 1-2GB of GDDR5 VRAM, but how that works out in a future dev kit would change things.


There is also the eDRAM to be taken into account. Will the Xbox 720 GPU have eDRAM? Most likely it will, and this is gonna be an additional cost. Will the Xbox 720 have Blu-ray? I dunno, but if it will, that's gonna be an additional cost.

For backwards compatibility, I think it's safe to assume eDRAM (32-64MB would be enough). As for Blu-ray, I think they will support the discs themselves, but not Blu-ray playback, at least not out of the box; you might be able to download it, though. Microsoft doesn't support it in Windows 7, for instance, and earlier versions of Windows didn't support DVD playback; even Bill Gates pointed it out.

Both are also features on the Wii U, so I can't imagine them being that expensive.

lol, DDR3 is actually dirt cheap. DDR3 is what people are talking about when they look up Newegg prices and everybody says "that's DDR3, not GDDR5!" The problem is it's not fast.

I dunno what they did to get around this. eDRAM? A system RAM/VRAM setup like a PC?

I also have no idea how effective this RAM will actually be in games. With a 256-bit bus it can get around 97 GB/s of bandwidth max, I believe, vs. around 200 GB/s for the leaked Orbis specs.

But I don't know what the effect of this is. It was commented that with that little bandwidth you can forget 4xAA, for example. Well, personally I'm fine with that; use FXAA or something, or skimp a little on AA. It seems worth it. Also, if we go to 1080p next gen, aliasing will be less of a problem by nature, since the higher the resolution, the less of an issue aliasing is.

Maybe they're just using it almost like a big cache. Instead of an SSD you use cheap DDR3, which is also much faster than an SSD.

Regardless, I like it.

Honestly, going with 4GB of GDDR3, or even 6GB of it (though this becomes a complex board past 4), would be more important than going with DDR3. Basically, writing and reading to GDDR3 RAM can be done at the same time, from my understanding, while normal DDR3 memory has to do one or the other each cycle. Having one large pool of unified memory is also better than split memory, as the 360 has already shown, and having GDDR3 would allow 360 games to run flawlessly if Microsoft went with an IBM processor again. It would also give them 8 threads on just 4 cores, which is something I'm worried about in terms of PS4's AMD CPU.

Not knowing the specs of XB3 at all makes it pretty interesting to talk about, but if they come out with a 4-core IBM design with 6 threads, 4GB of GDDR3, and a 1.5TF GPU, it would be super easy to develop for and cheaper to build than the leaked PS4 setup we are discussing here. It's also fairly evident that PS4's GPU will have to do things that a CPU like I'm talking about could handle with relative ease.
 
Hmm, interesting, but don't early dev kits usually have separate VRAM thanks to using actual desktop hardware? It's not like you can rip out the VRAM. It makes sense, if they are using 6GB of DDR3, to have 1-2GB of GDDR5 VRAM, but how that works out in a future dev kit would change things.




For backwards compatibility, I think it's safe to assume eDRAM (32-64MB would be enough). As for Blu-ray, I think they will support the discs themselves, but not Blu-ray playback, at least not out of the box; you might be able to download it, though. Microsoft doesn't support it in Windows 7, for instance, and earlier versions of Windows didn't support DVD playback; even Bill Gates pointed it out.

Both are also features on the Wii U, so I can't imagine them being that expensive.



Honestly, going with 4GB of GDDR3, or even 6GB of it (though this becomes a complex board past 4), would be more important than going with DDR3. Basically, writing and reading to GDDR3 RAM can be done at the same time, from my understanding, while normal DDR3 memory has to do one or the other each cycle. Having one large pool of unified memory is also better than split memory, as the 360 has already shown, and having GDDR3 would allow 360 games to run flawlessly if Microsoft went with an IBM processor again. It would also give them 8 threads on just 4 cores, which is something I'm worried about in terms of PS4's AMD CPU.

Not knowing the specs of XB3 at all makes it pretty interesting to talk about, but if they come out with a 4-core IBM design with 6 threads, 4GB of GDDR3, and a 1.5TF GPU, it would be super easy to develop for and cheaper to build than the leaked PS4 setup we are discussing here. It's also fairly evident that PS4's GPU will have to do things that a CPU like I'm talking about could handle with relative ease.

Many seem to forget that eDRAM is the key; having 64MB of eDRAM would mean 1080p with free AA, without tiling. However, I dunno if MS can afford to put such a big amount of eDRAM in the next Xbox. If the Wii U really does have 32MB of eDRAM, then 64MB may be doable in the next Xbox.
 

z0m3le

Banned
Many seem to forget that eDRAM is the key; having 64MB of eDRAM would mean 1080p with free AA, without tiling. However, I dunno if MS can afford to put such a big amount of eDRAM in the next Xbox. If the Wii U really does have 32MB of eDRAM, then 64MB may be doable in the next Xbox.

Yep, and even 32MB would give you 720p with 4xAA or one 1080p pass. It's a reasonable amount of RAM; the 360 used 10MB with Xenos in 2005 at a much larger process node, so I think 32MB is a minimum for XB3.
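
The framebuffer math roughly backs that up, assuming 32-bit color plus 32-bit depth/stencil per sample (just a sketch; real render targets vary):

    # Framebuffer footprint: width * height * bytes per sample * MSAA samples.
    # Assumes 4 bytes color + 4 bytes depth/stencil per sample; real setups vary.
    def fb_mb(w, h, samples=1, bytes_per_sample=8):
        return w * h * bytes_per_sample * samples / 1024**2

    print(round(fb_mb(1280, 720, samples=4), 1))  # 28.1 -> 720p with 4xAA squeezes into 32MB
    print(round(fb_mb(1920, 1080), 1))            # 15.8 -> one 1080p pass fits easily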
 

CLEEK

Member
2GB isn't much: conservatively 100MB for the OS, and they could adjust the memory allocated for video within 1-1.5GB, with the rest to run the game. It would always be a struggle. It could be a weakness that would be a thorn in its side for years, even if it's got a better GPU. They'd better figure out how to include 4GB.

100MB? Bear in mind that the Vita has 512MB of RAM in total, with 256MB of that allocated to the OS. I have no idea of the memory footprint the 360 or PS3 OS uses, but it wouldn't surprise me if the next-gen OSes need significantly more. At least as much as the Vita, more likely more.
 
Many seem to forget that eDRAM is the key; having 64MB of eDRAM would mean 1080p with free AA, without tiling. However, I dunno if MS can afford to put such a big amount of eDRAM in the next Xbox. If the Wii U really does have 32MB of eDRAM, then 64MB may be doable in the next Xbox.

It's theoretically affordable; just go like this (~50% shrink per node):

10MB eDRAM, 360 @ 90nm

= 20MB @ 65nm

= 40MB @ 45nm

= 80MB @ 32nm

Whether eDRAM is even possible at 32nm yet, I wouldn't know. Just that it often lags nodes.
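
That list is just the doubling rule in code form, if it helps (a rough sketch of the ~50% shrink assumption above; real eDRAM scaling lags and varies by foundry):

    # Same-area eDRAM capacity, doubling per full node shrink (rule of thumb only;
    # actual eDRAM availability per node is messier).
    capacity_mb = 10  # Xbox 360's daughter die: 10MB at 90nm
    for node in (65, 45, 32):
        capacity_mb *= 2
        print(f"~{capacity_mb}MB at {node}nm")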

Personally I hate eDRAM and hope they don't do it. It's not as simple as "X eDRAM gets you this". Nowadays rendering has many advanced techniques that eDRAM isn't good for, such as deferred rendering. It actually robs performance.

Not even mentioning the real problem: it's taking budget from the rest of your silicon. If you have X silicon that can be budgeted to your GPU, it's now X minus the eDRAM silicon for the actual GPU. Say you have 200mm² for the GPU and eDRAM is 60mm²; now you only have 140mm² left.

Everything else being equal, you definitely do not want eDRAM.
 
Buying a 1080p TV for GAMING-only purposes this generation was just about worthless.
My 720p (and 1080p/24fps for movies) projector is still very up to date after many years. And I think it will even be fine next gen.
If not, I'll eventually buy a 1080p one, but this was money well spent imo :p.

Of course, I DID buy a 1080p plasma...
Oh well...
 
100MB? Bear in mind that the Vita has 512MB of RAM in total, with 256MB of that allocated to the OS. I have no idea of the memory footprint the 360 or PS3 OS uses, but it wouldn't surprise me if the next-gen OSes need significantly more. At least as much as the Vita, more likely more.
I recall reading that the Xbox 360 OS only uses 32MB of system memory. The PS3 apparently uses twice that; originally thrice that.
 
It's looking more likely PS4 is going to be 2GB, judging by the discussion on B3D (there are pretty serious technical roadblocks to anything above 2GB of GDDR5 in a console), and in that case, if it's 8GB RAM + 1.5TF GPU on Durango vs 2GB RAM + 1.8TF GPU on Orbis, I'll take that all day. Even if it's 8GB + 1TF vs 2GB + 1.8TF, that's probably a tie at worst. But we know the GPU on Durango isn't finished, so I'm expecting it will end up pretty nice and at least close to what's in Orbis. MS isn't dumb.
I respect the professional knowledge shown on B3D, but 2013 is a dividing line where TSVs become economically practical in memory and SoCs for consumer platforms besides handhelds. This changes the game, and you can no longer make the same assumptions. How about posting this on B3D so we can intelligently discuss the memory that's coming:

http://seekingalpha.com/article/291012-micron-technology-inc-shareholder-analyst-call said:
A pretty good push for more memory coming up in the Game Console segment as a level of redesigns. We'll start to hit it over the next couple of years.

game consoles. This is a space that's been pretty flat for a number of years in terms of the average shipped density per system. That's going to be changing here pretty quickly. The next generation of system is under development now and that because of 3D and some of the bandwidth requirements, drives the megabyte per console up fairly quickly. So we're anticipating some good growth here.

We've worked with a number of these vendors specifically on both custom and semi-custom solutions in that space.

http://eda360insider.wordpress.com/2011/08/22/want-to-know-more-about-the-micron-hybrid-memory-cube-hmc-how-about-its-terabitsec-data-rate/ said:
The Micron HMC project illustrates why memory is a killer 3D app. With the bandwidths made possible by employing the large number of I/Os made possible through TSV interconnect, much more of the potential bandwidth available from all of those DRAM banks that have been on memory die for decades can finally be brought out and used to achieve more system performance. This is especially important in a world that increasingly makes use of multicore processor designs. Multiple core processor chips have insatiable appetites for memory bandwidth and, as the Micron HMC demonstrates, 3D assembly is one way to achieve the required memory bandwidth.

Micron is making the HMC and providing custom memory for game consoles, and we are still discussing an old, LIMITED GDDR5 memory interface that, as Coldfoot pointed out, creates motherboard complexity and cost.

You can do away with the interface and traces entirely by putting the ultra-wide I/O memory inside a SoC, or find some other way to connect ultra-wide I/O memory to the SoC. Custom memory would seem to me to indicate one or the other... who knows.
 

z0m3le

Banned
I respect the professional knowledge shown on B3D, but 2013 is a dividing line where TSVs become economically practical in memory and SoCs for consumer platforms besides handhelds. This changes the game, and you can no longer make the same assumptions. How about posting this on B3D so we can intelligently discuss the memory that's coming:





Micron is making the HMC and providing custom memory for game consoles, and we are still discussing an old, LIMITED GDDR5 memory interface that, as Coldfoot pointed out, creates motherboard complexity and cost.

You can do away with the interface and traces entirely by putting the ultra-wide I/O memory inside a SoC, or find some other way to connect ultra-wide I/O memory to the SoC. Custom memory would seem to me to indicate one or the other... who knows.

Well, I think we'd see it in GPUs first, and it will cost much more even in 2013 because you have to pay for the process; it's not something that can be done anywhere. But what you really have to explain is why your speculation is any more important than the actual target specs we have here. 3D transistors also didn't help Ivy Bridge in the heat department, even though the die shrunk to 22nm.

This new technology has some common new-technology drawbacks, but it also has some unexpected ones as well.
 

i-Lo

Member
Yes it does; don't be ridiculous. The RAM is going to help games look better than Wii U/Orbis, sorry! It isn't going to be for "recording TV with Windows 8" lol. WTF?

Also, people acting like Wii U even belongs in this discussion are being silly, at least until they show something better than E3, which basically looked worse than PS360 (Killer U). I cannot believe in 400-800 GFLOPS or 1.5-2GB RAM until they show games that don't look worse than 360, period.

It's looking more likely PS4 is going to be 2GB, judging by the discussion on B3D (there are pretty serious technical roadblocks to anything above 2GB of GDDR5 in a console), and in that case, if it's 8GB RAM + 1.5TF GPU on Durango vs 2GB RAM + 1.8TF GPU on Orbis, I'll take that all day. Even if it's 8GB + 1TF vs 2GB + 1.8TF, that's probably a tie at worst. But we know the GPU on Durango isn't finished, so I'm expecting it will end up pretty nice and at least close to what's in Orbis. MS isn't dumb.

If these RAM figures end up being correct, then PS4 will not be a viable option for third-party titles, which, as it so happens, will take precedence over first-party titles for many. It'll be the same story next gen, where porting becomes a nightmare. If Sony can sell PS4 on the strength of its first parties' products alone, then great for them, but such a difference may have a far more pronounced effect on the attach rate of software down the line.

By the sound of it, if Wii U is Sony's intended competitor, then they have already missed the release window, and if it is indeed MS, then they have lost out on vying for third-party titles.

Looks like I'll be getting one after several price drops (which will be inevitable).
 
Strictly gaming, how will PS4 compare to a 500 dollar PC in 2014? I think it will be better, no doubt. A 1000 dollar PC? Tougher.

This is a silly comparison. You'll save much more than $500 in the long run, just from the prices of games and accessories, by buying the $1000 PC.
 

KageMaru

Member
"So 8GB doesn't mean all of it for games. "

No, but PS4 could get some messed-up ports if it only has 2GB and Durango has like 6GB, even if a couple are dedicated to the OS and other non-gaming duties. The developers will get used to working with that much and will have to cram things to fit into 2GB. 2GB isn't much: conservatively 100MB for the OS, and they could adjust the memory allocated for video within 1-1.5GB, with the rest to run the game. It would always be a struggle. It could be a weakness that would be a thorn in its side for years, even if it's got a better GPU. They'd better figure out how to include 4GB.

I have a hard time believing the RAM difference will be 2GB vs 8GB. However IF it was, there's nothing stopping you from getting an Xbox too if it concerns you that much.

This is why company loyalty is silly IMO, you restrict your options.

No, the dev kits actually have 12GB. Sick, I know.

And there's 1-2GB of VRAM, I guess, on top of all this!

There is no way any system will have an extra 2GB of memory on top of an existing 8GB.

I'm starting to wonder what half of you guys are smoking.

http://www.rambus.com/us/technology/solutions/xdr2/index.html

It's all a lie? A fabrication? (excuse the pun or don't)

Show me one article of it ever being in mass production or in use with a consumer product.

Existing on paper is not the same as the component actually existing.
 

z0m3le

Banned
If these RAM figures end up being correct, then PS4 will not be a viable option for third-party titles, which, as it so happens, will take precedence over first-party titles for many. It'll be the same story next gen, where porting becomes a nightmare. If Sony can sell PS4 on the strength of its first parties' products alone, then great for them, but such a difference may have a far more pronounced effect on the attach rate of software down the line.

By the sound of it, if Wii U is Sony's intended competitor, then they have already missed the release window, and if it is indeed MS, then they have lost out on vying for third-party titles.

Looks like I'll be getting one after several price drops (which will be inevitable).

I don't know; Sony would definitely be portable either way. In fact, just looking at PCs and UE4's scaling ability points to Wii U being inside the realm of ports... Right now even PC-to-console porting works; the days of different architectures won't apply to next gen.

Everything is using unified shaders; heck, even compute units are in all 3 next-gen consoles, so I really don't see a barrier across any of these next-gen consoles that would keep them from getting ports. It's really just about showing off whose hardware is better at this point.
 
Also, people acting like Wii U even belongs in this discussion are being silly, at least until they show something better than E3, which basically looked worse than PS360 (Killer U). I cannot believe in 400-800 GFLOPS or 1.5-2GB RAM until they show games that don't look worse than 360, period.

I was trying to avoid Wii U discussion in this thread, but if that rumor of Wii U having a GPGPU is true, then it does belong, as that makes Wii U a PS4/Xbox 3-lite instead of a souped-up Xbox 360.
 

z0m3le

Banned
I was trying to avoid Wii U discussion in this thread, but if that rumor of Wii U having a GPGPU is true, then it does belong, as that makes Wii U a PS4/Xbox 3-lite instead of a souped-up Xbox 360.

Wasn't that "rumor" basically confirmed with the dev kit pictures of the actual Wii U just a couple days after the post? Obviously they are getting information from people who actually have the dev kits, and they listed compute shaders in the spec sheet.

Even the R700 has compute shaders, so not having them would have meant going out of their way to remove them; it was unlikely they would not have them from the start.
 
Wasn't that "rumor" basically confirmed with the dev kit pictures of the actual Wii U just a couple days after the post? Obviously they are getting information from people who actually have the dev kits, and they listed compute shaders in the spec sheet.

Even the R700 has compute shaders, so not having them would have meant going out of their way to remove them; it was unlikely they would not have them from the start.

No, I'm talking about the post from just a couple days ago on B3D. At the same time, there's listing compute shaders, and then there's designing your GPU to have GPGPU capabilities to the point of actually trying to get by with a lesser CPU.
 
Well, I think we'd see it in GPUs first, and it will cost much more even in 2013 because you have to pay for the process; it's not something that can be done anywhere. But what you really have to explain is why your speculation is any more important than the actual target specs we have here. 3D transistors also didn't help Ivy Bridge in the heat department, even though the die shrunk to 22nm.

This new technology has some common new-technology drawbacks, but it also has some unexpected ones as well.
How would you describe target specs given that no one has seen the new memory? Use the fastest available memory at the time? Custom or semi-custom memory with a GDDR5 interface does not make sense!

As to drawbacks: it's faster, requires less power, and produces less heat. Manufacturing memory with TSVs and stacking like-on-like has been done for 4 years, so it's not new.

As to the SoC and MCM that both game consoles will be using, that's not a debatable point, but it is also being overlooked on B3D for the most part. It's process-optimized building blocks of differing die sizes and processes, TSV-connected to a SoC substrate, with overflow being assembled by TSMC using TSVs and a 2.5D interposer (MCM).

According to the data gleaned from presentations by Samsung, Toshiba, AMD, and others, 3D IC assembly gives you the equivalent performance boost of 2 IC generations (assuming Dennard scaling wasn't dead). Garrou then quoted AMD's CTO Bryan Black, who spoke at the Global Interposer Technology 2011 Workshop last month. AMD has been working on 3D IC assembly for more than five years but has intentionally not been talking about it. AMD's 22nm Southbridge chips will probably be the last ones to be "impacted by scaling," said Black. AMD's future belongs to partitioning of functions among chips that are process-optimized for the function (CPU, Cache, DRAM, GPU, analog, SSD) and then assembled as 3D or 2.5D stacks.
IBM and AMD have been working together on this since 2008. Micron has been working on the memory solutions to feed HSA HPC SoCs.

2013 is the target date a number of industries have been working toward, and game console volumes are kick-starting the technology and making it economical.

I'm getting tired of repeating myself, and I'm sure some of you are getting tired of reading my posts. I've provided the cites and created a thread with a summary of what makes a next-generation game console possible; I suggest you read through the cites on the first page.

Your comment about 3D transistors, for instance, which won't happen till 14nm, is an example of your misunderstanding of the technology. I've made similar gaffes on NeoGAF <grin>.

Edit: For example, I incorrectly insisted Gstreamer was in the PS3 since firmware 3.0, and Massa (below) corrected me (LGPL) and forced me to rethink my position. Turns out my first guess on B3D, which was Adobe's open-source AVM+, was correct; I dismissed it later, as SNAP and GTKWebKit indicated Gstreamer was the future, which may still be true, but I couldn't believe Sony would invest in AVM+ and then rewrite/replace most of the XMB routines later this year. Should have known, as Netflix had to port a complete QtWebKit plus Gstreamer to the PS3. I also insisted the XMB would be rewritten to support CairoSVG when it's already OpenVG SVG, as I didn't know about EGL, which provides support for OpenVG + OpenGL ES 2; Cairo can be edited to use OpenVG for the 2D parts of Cairo and OpenGL ES 2 for the 3D parts of CairoGL (Cairo-EGL).

I'm really surprised that, given AMD has admitted their process-optimized building blocks include "(CPU, Cache, DRAM, GPU, analog, SSD)", and given the popularity of SSDs for very fast loading and solid-state hard disks, as well as DRAM inside a SoC, no one is speculating on this. The rumors of 16 gigs of Flash = SSD sorta support this.
 

Massa

Member
I hope they use at least 512MB for the OS next gen. Having a web browser and Netflix available without having to close the game would be nice.

Also really hoping the consoles offer suspend functionality, so I can turn it on and immediately load up a game as I do on the Vita. No waiting for warnings and loading.
 

KageMaru

Member
I hope they use at least 512MB for the OS next gen. Having a web browser and Netflix available without having to close the game would be nice.

Also really hoping the consoles offer suspend functionality, so I can turn it on and immediately load up a game as I do on the Vita. No waiting for warnings and loading.

No offense, but I really hope you don't get what you want. I can't think of anything that would be more of a waste of resources than to have Netflix running in the background while I'm playing my game. I'd much rather the developers have every bit of memory they can to benefit the games I'm playing than for me to have quick access to something that loads up quickly anyway.

Edit:

Yup, all your reading and looking for Sony patents is giving you a heads up over many here on this board <grin>. You are one of about 10 or so who I have noticed as having contributed a lot to NeoGAF. Thumbs up.

Am I one of those special 10 as well? I mean I don't look at patents and jump to conclusions, but I have a pretty good grasp on some things.


You know, I'm surprised you don't post on Beyond 3D. You would be surrounded by well-informed people such as yourself.

Not serious, your post just made me laugh is all
 

joshwaan

Member
No, the dev kits actually have 12GB. Sick, I know.

And there's 1-2GB of VRAM, I guess, on top of all this!

Wow, so one would expect about 6GB in the console. Is it always the case that devs get double the RAM and retail units ship with half, or would I be off with that assumption?

SP, do you think Sony could go with 3 or 3.5GB of memory to keep costs down, or would 4GB be the better option price-wise?

I'm personally looking forward to the PS4 more than the Xbox 3; don't get me wrong, I love my Xbox 360 and games on it are fantastic. I just get the feeling Microsoft is going to slowly move away from the core-gamer console. I bought a lot of games on the Xbox because games ran better and looked a lot nicer, and Xbox Live is quite good. But PSN deals are fantastic; Sony got this right. Microsoft GOD prices, ROFL, don't get me started :p

Anyway, this is a great discussion. Keep it going, guys; it's very interesting reading a lot of your posts :)
 
Am I one of those special 10 as well? I mean I don't look at patents and jump to conclusions, but I have a pretty good grasp on some things. >=p

You know, I'm surprised you don't post on Beyond 3D. You would be surrounded by well-informed people such as yourself.

I used to read stuff on B3D a lot and sometimes post, but just like everywhere else, they've got their crazy people.
Also, half of the time they are wrong about stuff, just like everywhere else.

If MS does go with 8GB of RAM, most of it DDR3, and Sony goes with 4GB of GDDR5, it's going to make things very interesting.
I just hope Sony is talking with their first-party devs about the specs, since they have some of the best in the world.
 

onQ123

Member
Yup, all your reading and looking for Sony patents is giving you a heads up over many here on this board <grin>. You are one of about 10 or so who I have noticed as having contributed a lot to NeoGAF. Thumbs up.


If you read the thread, someone actually brought up the idea of using an APU in the new consoles.
 