
PS4 Rumors: APU code-named 'Liverpool', Radeon HD 7970 GPU, Steamroller CPU, 16GB Flash

Status
Not open for further replies.

RoboPlato

I'd be in the dick
Now that I've seen the clarifications that the 7970 is an M, this rumor sounds like it could be pretty accurate. It seems like a refresh of the previously rumored specs, which were probably early dev kits. If they wind up bumping the RAM to 4GB, which is probably likely given how many devs want more RAM and the likelihood of the competition going for more, this system will be quite solid. This setup also seems like it'll be far easier to develop for and port to than the PS3, which is a godsend.

I just hope the OS will be a good deal faster than the XMB on the PS3 (faster trophy notifications, background downloads in all games and while watching Blu-rays, etc.) and isn't too much of a resource hog.
 

szaromir

Banned
This post does not really make any sense.
Crytek is not saying they need 8GB to run current Cryengine, they are saying to make a generational leap from Crysis 3, they will need that. Sounds about right to me.

Do you think a generational leap from Crysis 3 would be easy on 2GB?
I think Crysis1/2/3 on consoles are better points of comparison for current-gen/next-gen expectations than those games on any arbitrary PC.

The day we can buy 8GB single-core GPUs, I'll believe it. Visually I've seen nothing from Crytek that would require that sort of RAM; even CryEngine 3.4 doesn't carry that sort of requirement, and I think we all know that if Microsoft and Sony were planning on 8GB, Epic and Crytek would be asking for 16. However, if they are talking about 1080p, you could do 720p for roughly half that, which is likely why 4GB makes perfect sense, and 2GB with optimization and compression would work fine.
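For what it's worth, the "roughly half" figure is easy to sanity-check with some back-of-the-envelope framebuffer arithmetic. This is only a rough sketch; the buffer count and bytes-per-pixel here are my own assumptions, not figures from the thread:

```python
def framebuffer_mb(width, height, buffers=3, bytes_per_pixel=4):
    """Approximate memory for the raw render targets alone, in MB.

    Assumes 32-bit color and three buffers (front, back, depth);
    real engines add many more render targets on top of this.
    """
    return width * height * bytes_per_pixel * buffers / (1024 * 1024)

ratio = framebuffer_mb(1920, 1080) / framebuffer_mb(1280, 720)
print(ratio)  # 2.25 -- dropping from 1080p to 720p cuts raw buffer memory by more than half
```

Textures and game data don't scale with resolution the same way, which is why the overall saving in practice lands closer to "roughly half" than to 2.25x.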
Obviously, after Crysis 1 Crytek realized that making games only for high-end/future hardware is financial suicide, and their ambition was restrained with Crysis 2 (for the benefit of the majority of gamers; e.g. I can run C2 on my laptop in DX9 mode at 30fps and it looks gorgeous). If console manufacturers put very modest hardware in the upcoming consoles, Crytek will scale their ambitions back (so will Epic), but so far Crytek has only commented on what they would like to see in future hardware to make a real, tangible difference in user experience.
 

thuway

Member
This post does not really make any sense.


Crytek is not saying they need 8GB to run current Cryengine, they are saying to make a generational leap from Crysis 3, they will need that. Sounds about right to me.




Do you think a generational leap from Crysis 3 would be easy on 2GB?

Totally agree. It's a pathetic reality where GDDR5 RAM densities are caught in purgatory.

Even if densities improved, 3D stacking was used, and a motherboard could be laid out in a workable way, the cost, the power, and the enormous complexity would be a nightmare to deal with. I want 8 GB just as badly as the next technophile, but unless something miraculous happens, or strong shortcuts are taken, or pricing is set above $399, it is highly unlikely.

Which is sad since RAM is usually the bottleneck in these situations.
 

Avtomat

Member
So when you guys say 7970M you mean a downclocked 7870? The rumour makes far more sense now; I thought it would be pretty stupid to base it on the desktop 7970 architecture. I wonder if they will wait for 22nm?
 

RoboPlato

I'd be in the dick
Don't count on 22nm happening any time soon. If either Sony or MS were to wait for 22nm though, you'd see a doubling of performance from the box. We would be talking about 4-6 teraflops instead of 2-3.
Holy shit. Yeah, I don't see them waiting for that despite how awesome it would be. I just thought he was referring to a die shrink for energy/heat reasons. I wasn't sure if that was on the roadmap for the next year or not.
 
It's an underclocked 7970M running at 800MHz with 1.8 teraflops, a discrete 6850 with 1.5 teraflops, and 2GB of GDDR5. I really hope they up those specs to hit 3-4 teraflops. This could be the last traditional console cycle as we know it.

I would be surprised if they are targeting a discrete GPU. gofreak's theory made a lot of sense; I think the APU + discrete GPU was for the dev kits, and the target hardware is just the APU alone. It wouldn't make any sense for Sony to go for over 3 TFLOPS if the 720 is going to end up with only half of that.
 

z0m3le

Banned
This post does not really make any sense.


Crytek is not saying they need 8GB to run current Cryengine, they are saying to make a generational leap from Crysis 3, they will need that. Sounds about right to me.





Do you think a generational leap from Crysis 3 would be easy on 2GB?

If it's a leap from what the game ends up looking like on the PS3/360... YES.
 

RoboPlato

I'd be in the dick
I would be surprised if they are targeting a discrete GPU. gofreak's theory made a lot of sense; I think the APU + discrete GPU was for the dev kits, and the target hardware is just the APU alone. It wouldn't make any sense for Sony to go for over 3 TFLOPS if the 720 is going to end up with only half of that.
Why would a dev kit have that much more power unless it was to emulate a part that isn't in production yet? Usually RAM is the only thing dev kits have over the real specs, and I don't think there's going to be an APU powerful enough that it would take a 7970M and a 6850 to simulate it.
 

Yoshiya

Member
Don't count on 22nm happening any time soon. If either Sony or MS were to wait for 22nm though, you'd see a doubling of performance from the box. We would be talking about 4-6 teraflops instead of 2-3.

If that were the case, one or both of the console manufacturers would be offering themselves to Intel right now, whatever its exorbitant price.
 

thuway

Member
Would this be able to run Agni's Philosophy?

I read Agni's Philosophy was running at 720p on high-end PC hardware (multiple GTX 680s, theoretically), which is two to three times as powerful as the machines we are talking about.

It'll stay a tech demo :(, but I truly wonder what Naughty Dog, Guerrilla, Santa Monica, 343, and Polyphony can achieve with such hardware.
 

gofreak

GAF's Bob Woodward
Why would a dev kit have that much more power unless it was to emulate a part that isn't in production yet?

The dev kit IGN rumoured was substantially less powerful than the rumoured target in the OP.

If the final target is an APU with 18 compute units, there is no APU today that powerful.

They would need to couple an APU to a discrete GPU in the dev kits in order to make it appear to the OS that the APU was more powerful. That is why they had that configuration in early kits.

Final kit will either have one discrete CPU + one discrete GPU or one APU with no additional discrete components, IMO.
 
Why would a dev kit have that much more power unless it was to emulate a part that isn't in production yet? Usually RAM is the only thing dev kits have over the real specs, and I don't think there's going to be an APU powerful enough that it would take a 7970M and a 6850 to simulate it.

Because the APU alone wouldn't be enough for development, so they tossed in another GPU to bring up the performance.

But I'm just going off the IGN rumor, since it's the only one that talked about the APU + discrete GPU. The rumored APU, an A8-3850, is about 480 GFLOPS, so it makes sense to put in another GPU to help out development.
 

thuway

Member
The dev kit IGN rumoured was substantially less powerful than the rumoured target in the OP.

If the final target is an APU with 18 compute units - there is no such APU today that powerful.

They would need to couple an APU to a discrete GPU in the dev kits in order to make it appear to the OS that the APU was that powerful. That is why they had that configuration in early kits.

Final kit will either have one discrete CPU + one discrete GPU or one APU with no additional discrete components, IMO.

Is it really unfeasible to have an APU with a 7970M and a discrete GPU?
 

KageMaru

Member
I read Agni's Philosophy was running at 720p on high-end PC hardware (multiple GTX 680s, theoretically), which is two to three times as powerful as the machines we are talking about.

It'll stay a tech demo :(, but I truly wonder what Naughty Dog, Guerrilla, Santa Monica, 343, and Polyphony can achieve with such hardware.

It was a single high-end GPU, probably a 680, since they use Nvidia tech for the hair in the demo.
 

StevieP

Banned
Is it really unfeasible to have an APU with a 7970M and a discrete GPU?

Stop talking about mobile parts and start talking about what they really meant: underclocked Pitcairn :p

Regardless, why would any console manufacturer want to go the dual-GPU route?
 

Mashing

Member
A little OT, but I'm going to laugh my ass off if/when consoles become too powerful for air cooling and instead require water cooling (or other advanced cooling techniques). Or at the very least huge consoles with big air-cooled heatsinks/fans installed in them. Noise could be an issue too; I hate loud consoles/PCs.
 

gofreak

GAF's Bob Woodward
Is it really unfeasible to have an APU with a 7970M and a discrete GPU?

I don't know that it wouldn't be feasible, but it would be expensive. And besides, the rumour in the OP and the VGLeaks rumour don't mention an additional discrete GPU.

The only report to mention an APU + GPU was IGN's, and given the kit they were talking about, I think it makes perfect sense as an early dev kit emulating a more powerful single APU such as the one in the supposed target specs.
 

Avtomat

Member
Don't count on 22nm happening any time soon. If either Sony or MS were to wait for 22nm though, you'd see a doubling of performance from the box. We would be talking about 4-6 teraflops instead of 2-3.

Both the 7870 and the proposed CPU/APU combo weigh in at over 200 mm². After all the complaints about consoles being too expensive to produce, too hot, prone to failures, etc. this time around, I'm sure no one wants to go for two chips each larger than 200 mm². In fact, I was wondering if Sony would be waiting for 22nm in order to shrink both, not for performance reasons but for cost and heat reasons.

If they waited for 22nm, then we're really looking at a late 2014 launch.
 

Avtomat

Member
I don't know that it wouldn't be feasible, but it would be expensive. And besides, the rumour in the OP and the VGLeaks rumour don't mention an additional discrete GPU.

The only report to mention an APU + GPU was IGN's, and given the kit they were talking about, I think it makes perfect sense as an early dev kit emulating a more powerful single APU such as the one in the supposed target specs.

Actually gofreak, I've been meaning to ask someone a bit more technically minded: wouldn't it make sense for the APU to be based on Tahiti (suitably cut down, obviously) and the GPU on Pitcairn, if the proposed APU/GPU combo comes to fruition, with Tahiti being more biased toward GPU compute?
 

gofreak

GAF's Bob Woodward
Actually gofreak, I've been meaning to ask someone a bit more technically minded: wouldn't it make sense for the APU to be based on Tahiti (suitably cut down, obviously) and the GPU on Pitcairn, if the proposed APU/GPU combo comes to fruition, with Tahiti being more biased toward GPU compute?

I thought Tahiti and Pitcairn were the same architecture?

Anyway, the reason this will probably be reasonable at 28nm is because there won't be two chips :) One APU at 28nm. It might be a large-ish chip but there'll only be the one.
 

Elios83

Member
So everything works like SLI? Or something similar?

It would be asymmetric CrossFire.
But as gofreak suggested, that is probably just the configuration of the dev kit: they have a current low-spec APU coupled with a discrete GPU.
The final hardware will have an APU with 4 Jaguar cores and a GPU with 1.8-2 teraflops, all integrated in the same SoC.
The only problem is how they can upgrade the memory to 4GB.
They would need 16 GDDR5 chips, two per memory controller on a shared channel, or they would have to wait for expensive 4Gbit chips.
There are no alternatives.

PS: What are you doing here? Have you risen from the ashes in time for next gen? AHAHA :D
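To put numbers on the two memory options described above, here is a quick sketch; the eight-controller layout is inferred from the rumored 256-bit bus and is my assumption, not something stated in the thread:

```python
GBIT_IN_GB = 1 / 8  # one gigabit expressed in gigabytes

channels = 256 // 32  # eight 32-bit GDDR5 memory controllers (assumed)

# Option A: two 2Gbit chips per channel, sharing the channel clamshell-style
option_a_gb = channels * 2 * (2 * GBIT_IN_GB)

# Option B: wait for 4Gbit chips, one per channel
option_b_gb = channels * 1 * (4 * GBIT_IN_GB)

print(channels, option_a_gb, option_b_gb)  # 8 4.0 4.0 -- both routes land on 4GB
```

Either way the total comes out to 4GB; the trade-off is sixteen cheap chips and PCB complexity versus eight chips of a denser, pricier part.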
 

luca_29_bg

Member
It would be asymmetric CrossFire.
But as gofreak suggested, that is probably just the configuration of the dev kit: they have a current low-spec APU coupled with a discrete GPU.
The final hardware will have an APU with 4 Jaguar cores and a GPU with 1.8-2 teraflops, all integrated in the same SoC.
The only problem is how they can upgrade the memory to 4GB.
They would need 16 GDDR5 chips, two per memory controller on a shared channel, or they would have to wait for expensive 4Gbit chips.
There are no alternatives.

PS: What are you doing here? Have you risen from the ashes in time for next gen? AHAHA :D

Thanks for the explanation. Yes, I've risen again because I want to understand whether the PS4 will be crap or not. Hence my interest, since the GPU situation wasn't clear to me. The final APU will have an integrated GPU, if I've understood correctly?
 
jeff_rigby
The only problem is how they can upgrade the memory to 4GB.
They would need 16 GDDR5 chips, two per memory controller on a shared channel, or they would have to wait for expensive 4Gbit chips.
There are no alternatives.

PS: What are you doing here? Have you risen from the ashes in time for next gen? AHAHA :D
Micron is producing custom and semi-custom memory for next-generation game consoles. We don't know the form this will take; it may not even be GDDR5 but a new standard designed for memory-hungry multi-CPU fusion designs. HMC with 6-8 serial links for 360GB/sec... who knows. Micron makes the Hybrid Memory Cube. It depends on availability and cost, with IBM now making the logic layer and ramping up to full production in 2013.

The initial offering for the HMC is 2 and 4 gig single-package designs, and rumors have the PS4 and Durango with either 2 or 4 gigs. 4 gigs is too small for most of the other applications it's supposed to be targeting.
 

luca_29_bg

Member
Micron is producing custom and semi-custom memory for next-generation game consoles. We don't know the form this will take; it may not even be GDDR5 but a new standard designed for memory-hungry multi-CPU fusion designs. HMC with 6-8 serial links for 360GB/sec... who knows. Micron makes the Hybrid Memory Cube. It depends on availability and cost, with IBM now making the logic layer and ramping up to full production in 2013.

The initial offering for the HMC is 2 and 4 gig single-package designs, and rumors have the PS4 and Durango with either 2 or 4 gigs. 4 gigs is too small for most of the other applications it's supposed to be targeting.

I really hope that this will bring 4GB for the PS4, because 2GB is really too... low.
 

thuway

Member
Micron is producing custom and semi-custom memory for next-generation game consoles. We don't know the form this will take; it may not even be GDDR5 but a new standard designed for memory-hungry multi-CPU fusion designs. HMC with 6-8 serial links for 360GB/sec... who knows. Micron makes the Hybrid Memory Cube. It depends on availability and cost, with IBM now making the logic layer and ramping up to full production in 2013.

The initial offering for the HMC is 2 and 4 gig single-package designs, and rumors have the PS4 and Durango with either 2 or 4 gigs. 4 gigs is too small for most of the other applications it's supposed to be targeting.

Is it possible to put two of those 4 gig packages together into an 8 gig amalgam?
 
I don't know that it wouldn't be feasible, but it would be expensive. And besides, the rumour in the OP and the VGLeaks rumour don't mention an additional discrete GPU.

The only report to mention an APU + GPU was IGN's, and given the kit they were talking about, I think it makes perfect sense as an early dev kit emulating a more powerful single APU such as the one in the supposed target specs.

If it works out that way with only an APU, we are looking at 1.5 TFLOPS at the most.
That's a really weak system, but then again I really don't have my hopes up for much.
 

TronLight

Everybody is Mikkelsexual
Thanks for the explanation. Yes, I've risen again because I want to understand whether the PS4 will be crap or not. Hence my interest, since the GPU situation wasn't clear to me. The final APU will have an integrated GPU, if I've understood correctly?

I'd like to understand that too. lol
 

Mindlog

Member
A little OT, but I'm going to laugh my ass off if/when consoles become too powerful for air cooling and instead require water cooling (or other advanced cooling techniques). Or at the very least huge consoles with big air-cooled heatsinks/fans installed in them. Noise could be an issue too; I hate loud consoles/PCs.
We have external power supplies already. Next step: external radiators.
 
I read Agni's Philosophy was running at 720p on high-end PC hardware (multiple GTX 680s, theoretically), which is two to three times as powerful as the machines we are talking about.

It'll stay a tech demo :(, but I truly wonder what Naughty Dog, Guerrilla, Santa Monica, 343, and Polyphony can achieve with such hardware.

SE is up there with the others you listed when it comes to visuals... I think we will have to see Versus in its current form to truly see what they are capable of this gen... it's probably a couple of years before the PS4 is released. I honestly think we will get a Final Fantasy with visuals like that, or very close. SE has a hell of a long time to optimize the engine.
 
Is it possible to put two of those 4 gig packages together into an 8 gig amalgam?
If it's based on HMC, or actually is HMC, then the choice in 2013 will be either 2 or 4 gigs. Plans call for 16 gig and larger down the road, with optical connections at 1Tb/sec.

The point is that this 2013 run is only for 2 or 4 gig Hybrid Memory Cubes; HPC platforms and servers usually use more RAM and would use a faster interface like optical. It seems to me that the target with 2 and 4 gigs could be game consoles. Again, cost and availability will determine its use.

You see, there’s been a DRAM-bandwidth bottleneck building for a long, long time. DRAMs have gotten far bigger with huge parallel arrays of DRAM cells on chip, but DRAMs are essentially still limited to the bandwidth supported by their package. That’s why we’ve gone from asynchronous DRAMs with RAS and CAS signals to more structured DRAMs with synchronous interfaces that support ever-increasing memory bandwidths.

With synchronous DRAM interfaces, we've gone from DDR to DDR2, DDR3, and now DDR4. Early SDRAM modules were limited to about 1 Gbyte/sec of peak bandwidth. DDR2-667 modules jumped to more than 5 Gbytes/sec; DDR3-1333 modules bumped the bandwidth bar to 10.66 Gbytes/sec; and DDR4-2667 modules—at some time in the future—will achieve 21.34 Gbytes/sec.
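Those module figures all fall out of one formula, peak bandwidth = transfer rate × bus width. A quick sketch, assuming the standard 64-bit DIMM data bus:

```python
def dimm_peak_gbs(mega_transfers_per_s, bus_bits=64):
    """Peak bandwidth in GB/s for a single memory module (decimal gigabytes)."""
    return mega_transfers_per_s * (bus_bits // 8) / 1000

# Reproduces the figures quoted in the excerpt above
for name, mts in [("DDR2-667", 667), ("DDR3-1333", 1333), ("DDR4-2667", 2667)]:
    print(name, round(dimm_peak_gbs(mts), 2))  # 5.34, 10.66, 21.34 GB/s
```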

Micron expects its HMC module to achieve and exceed 128 Gbytes/sec. That’s at least 6x what’s expected of DDR4. The only way to do that is through parallelism. The first step in exposing DRAM parallelism through the HMC is to configure each DRAM die as a 16-slice device with sixteen independent I/O ports on each die.

According to Micron, an HMC with four links can achieve a peak bandwidth of 160 Gbytes/sec and one with eight links can achieve a peak bandwidth of 320 Gbytes/sec. Keep in mind that this is one HMC module, with the footprint that’s essentially no bigger than the logic die at the bottom of the HMC 3D stack.
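The link figures scale linearly, which implies a fixed per-link rate (my inference from the two data points quoted above, not a Micron spec):

```python
# Each HMC serial link appears to carry the same peak bandwidth in both configs
per_link_gbs = 160 / 4             # 40 GB/s per link, from the 4-link figure
eight_link_gbs = per_link_gbs * 8  # scales to the quoted 8-link figure
print(per_link_gbs, eight_link_gbs)  # 40.0 320.0
```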

The Micron HMC project illustrates why memory is a killer 3D app. With the bandwidths made possible by employing the large number of I/Os made possible through TSV interconnect, much more of the potential bandwidth available from all of those DRAM banks that have been on memory die for decades can finally be brought out and used to achieve more system performance. This is especially important in a world that increasingly makes use of multicore processor designs. Multiple core processor chips have insatiable appetites for memory bandwidth and, as the Micron HMC demonstrates, 3D assembly is one way to achieve the required memory bandwidth.
StevieP said:
Do you really see something that new and exotic going into a console?
It might be cheaper than multiple chips and a 256- or 512-bit bus on the motherboard. Do you really see something as new and exotic as an AMD Fusion SoC going into a game console? It's going to be state of the art when it's released and very exotic, requiring a new method to access memory.
 

Elios83

Member
Thanks for the explanation. Yes, I've risen again because I want to understand whether the PS4 will be crap or not. Hence my interest, since the GPU situation wasn't clear to me. The final APU will have an integrated GPU, if I've understood correctly?

LOL, it will probably be a system with around 2 teraflops of cumulative power.
It will be really powerful compared to the PS3 (which is what matters), although far from high-end PC hardware at the end of 2013. But this is a system which won't be priced higher than $399 and will also include extra tech for the control system (a touch pad, a camera, whatever, but they definitely won't use the standard DualShock again).
The only problem with the suggested architecture is how to achieve 4GB of cumulative memory.
2GB is not enough; maybe it would be enough for first-party developers, but third parties would struggle with it.

And yes, basically it will be a quad-core CPU and a GPU integrated in a single 28nm chip, connected externally through a 256-bit bus to a unified pool of memory. It's very efficient and cost-effective, and also easy to develop for.

jeff_rigby said:
Micron is producing custom and semi-custom memory for next-generation game consoles. We don't know the form this will take; it may not even be GDDR5 but a new standard designed for memory-hungry multi-CPU fusion designs. HMC with 6-8 serial links for 360GB/sec... who knows. Micron makes the Hybrid Memory Cube. It depends on availability and cost, with IBM now making the logic layer and ramping up to full production in 2013.

The initial offering for the HMC is 2 and 4 gig single-package designs, and rumors have the PS4 and Durango with either 2 or 4 gigs. 4 gigs is too small for most of the other applications it's supposed to be targeting.

Exotic things with unproven manufacturing technology won't be used.
Imo the easiest way to do this is to accept the extra cost on the PCB and put in 16 chips of memory, two per memory controller channel. It will still be better than using expensive 4Gbit modules, whose production could lead to shortages, and they won't need to significantly redesign the APU.
 
Exotic things with unproven manufacturing technology won't be used.
Imo the easiest way to do this is to accept the extra cost on the PCB and put in 16 chips of memory, two per memory controller channel. It will still be better than using expensive 4Gbit modules, whose production could lead to shortages, and they won't need to significantly redesign the APU.
Geez, the 2014 AMD Fusion SoC is going to be new, exotic and unproven, and Micron said that custom and semi-custom chips will be used by game consoles, yet you still insist on older standard memory configurations. Micron SAID custom chips will be used in game consoles! Think about this before responding! We don't know the form this will take, but it will NOT be an older standard!

Is no one reading the cites? StevieP may be correct and it may not be an HMC; there are a large number of ways to custom-configure memory. One way or another, a wider bus providing higher bandwidth is going to have to be available for the AMD SoC to use. The HMC does this internally with TSVs, and the logic layer transforms a 512-bit bus into 4 or 8 serial links.

Remember, this is a game console and it does not have plug-in memory that has to follow a standard; as such it could be the first consumer device to use an HMC.

StevieP, do you know of any exotic hardware that would use a 2 gig HMC? Any other game consoles besides the PS4, Xbox 8 and Wii U (which does not push the envelope and may not need exotic memory) that Micron could be talking about?
 

z0m3le

Banned
If it's based on HMC, or actually is HMC, then the choice in 2013 will be either 2 or 4 gigs. Plans call for 16 gig and larger down the road, with optical connections at 1Tb/sec.

The point is that this 2013 run is only for 2 or 4 gig Hybrid Memory Cubes; HPC platforms and servers usually use more RAM and would use a faster interface like optical. It seems to me that the target with 2 and 4 gigs could be game consoles. Again, cost and availability will determine its use.

It might be cheaper than multiple chips and a 256-bit bus on the motherboard. Do you really see something as new and exotic as an AMD Fusion SoC going into a game console? It's going to be state of the art when it's released and very exotic, requiring a new method to access memory.

Umm, APUs aren't exotic; even the 360 has a basic Frankenstein APU.

Also, for everyone worried about the memory only being 2GB... that is 4x the memory of the current HD systems. Just look back at the Xbox:
Xbox: 64MB to display 345,600 pixels (480p in the majority of games) = 5,400 pixels per MB
360: 512MB to display 921,600 pixels (720p in the majority of games) = 1,800 pixels per MB
PS4: 2048MB to display 921,600 pixels (720p expected in the majority of games) = 450 pixels per MB

Xbox to 360 tripled the RAM available per pixel; 360 to PS4 will quadruple it. It's quite a big jump no matter how you look at it; 2GB will be a bigger RAM improvement than the last generational jump, thanks to resolutions staying where they are now.
There is NO reason 2GB wouldn't be enough for a next-gen leap; Crytek just expects the world.
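The pixels-per-MB figures in the post above do check out; the resolutions assumed here (720×480 and 1280×720) are mine, chosen to match the quoted pixel counts:

```python
# (memory in MB, displayed pixels) for each system, as given in the post
systems = {
    "Xbox": (64, 720 * 480),     # 345,600 pixels at 480p
    "360":  (512, 1280 * 720),   # 921,600 pixels at 720p
    "PS4":  (2048, 1280 * 720),  # 720p assumed for most games
}
px_per_mb = {name: px / mb for name, (mb, px) in systems.items()}
print(px_per_mb)  # {'Xbox': 5400.0, '360': 1800.0, 'PS4': 450.0}

# Generational jumps in RAM-per-pixel terms
print(px_per_mb["Xbox"] / px_per_mb["360"])  # 3.0
print(px_per_mb["360"] / px_per_mb["PS4"])   # 4.0
```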
 

luca_29_bg

Member
LOL, it will probably be a system with around 2 teraflops of cumulative power.
It will be really powerful compared to the PS3 (which is what matters), although far from high-end PC hardware at the end of 2013. But this is a system which won't be priced higher than $399 and will also include extra tech for the control system (a touch pad, a camera, whatever, but they definitely won't use the standard DualShock again).
The only problem with the suggested architecture is how to achieve 4GB of cumulative memory.
2GB is not enough; maybe it would be enough for first-party developers, but third parties would struggle with it.

And yes, basically it will be a quad-core CPU and a GPU integrated in a single 28nm chip, connected externally through a 256-bit bus to a unified pool of memory. It's very efficient and cost-effective, and also easy to develop for.




So a CPU and a GPU integrated in a single chip, right? The chip doesn't have another 6850 GPU alongside it, right? And a CPU from AMD? O_____O

If only Sony would use a hypothetical Cell 2... why, Sony, why... :/
 