
Possible hint at AMD's next-gen APU (codename: Gonzalo) - 8 cores, 3.2GHz clock, Navi 10-based GPU

Shin

Banned
I thought it was Ariel, now it's Gonzalo?
Someone was running with Ariel from the string; it could be the GPU codename, with Gonzalo being the project/the whole APU itself, IDK.
Ariel+Gonzalo sounds more MS than Sony either way, don't you think?
The tweet from the article got deleted, or at least one of them did, so that doesn't help much either
https://twitter.com/KOMACHI_ENSAKA/status/1086194765278789634
 

Fake

Member
Zen+ maybe? AMD may just use the base hardware, but these chips are custom. I prefer to believe it's a custom CPU based on Zen 2.
 

Shin

Banned
Zen+ maybe? AMD may just use the base hardware, but these chips are custom. I prefer to believe it's a custom CPU based on Zen 2.
It could be Zen 2, as Mark Papermaster (or whatever his name is) or Lisa Su said a while ago that Zen 1-3 are being developed simultaneously.
If semi-custom has become more of a bread-and-butter business for them, then a Zen 2 APU wouldn't have to wait for the desktop and mobile variants to officially debut.
Plus, by the time these consoles are out, Zen 2 will be available for desktop, with the mobile version the year after, so Zen 2 in the APU is possible.
Until we get confirmation from AMD after the system is announced it's anyone's guess; I'm just going by the roadmaps, conference calls and personal research.
 

Shin

Banned
3.2 is high for a console, period. Especially when you have 8 cores.
The old Vega APU leak had a 4C/8T part running at 3GHz - 3.3GHz, on an older node, as 7nm wasn't a possibility at the time.
I wouldn't say it's impossible: bla bla node shrink means less heat, bla bla CPU architecture gains, bla bla something.

 

Fake

Member
It could be Zen 2, as Mark Papermaster (or whatever his name is) or Lisa Su said a while ago that Zen 1-3 are being developed simultaneously.
If semi-custom has become more of a bread-and-butter business for them, then a Zen 2 APU wouldn't have to wait for the desktop and mobile variants to officially debut.
Plus, by the time these consoles are out, Zen 2 will be available for desktop, with the mobile version the year after, so Zen 2 in the APU is possible.
Until we get confirmation from AMD after the system is announced it's anyone's guess; I'm just going by the roadmaps, conference calls and personal research.
Probably it's a custom mobile part, because the PS4 already has one and the TDP needs to be low for energy efficiency. I think Sony is, like the rumor says, 'waiting' for manufacturing costs to come down over time. But I'm really, really hoping for Zen 2 too.
The funny part is that as we get close to release, many AAA studios are going to receive more dev hardware.
 

onQ123

Member
onQ123 my gosh, if the PS5 were to have 880GB/s, that would leave either HBM2 or GDDR6 on a 512-bit bus. Both of which I suspect are too expensive unless Sony wants to charge a premium. I dunno.

Perhaps navi will have a yet to be seen memory type.

But my conservative estimate of the PS5's bandwidth would be 500-600GB/s. Also, dude, ignore that fuckwit.

18 1GB GDDR5X chips could reach 880GB/s on a 576-bit bus
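For reference, the arithmetic behind that figure (peak bandwidth = bus width ÷ 8 × per-pin data rate), sketched in Python; the 12.22 Gbps per-pin rate is just what the 880GB/s claim implies, not a published GDDR5X speed grade:

```python
def bandwidth_gb_s(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bits / 8) * per-pin rate in Gbps."""
    return bus_width_bits / 8 * pin_rate_gbps

bus = 18 * 32                       # 18 chips with a 32-bit interface each -> 576-bit bus
print(bus)                          # 576
print(round(880 * 8 / bus, 2))      # 12.22 -> per-pin Gbps needed for 880 GB/s
print(bandwidth_gb_s(bus, 11))      # 792.0 GB/s at a more common 11 Gbps GDDR5X rate
```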
 

McHuj

Member
18 1GB GDDR5X chips could reach 880GB/s on a 576-bit bus

No one will put that many GDDR5X chips in a console, nor a bus that big.

5X will not have the economies of scale. It's a dead technology. Its power efficiency is worse than GDDR6, at least at the necessary bandwidths.

A bus that big is uneconomical; you would probably need a chip in the ballpark of 500mm² to even think about it. It ain't happening on 7nm.

I’m not even convinced a 384-bit will be possible in the near term on 7nm.
 

Redneckerz

Those long posts don't cover that red neck boy
Goodness, it is starting to sound like a bunch of petty children in here. Let It Go.
Pie-in-the-sky speculation that has little to do with this rumor will do that for you.

Is there reason to believe this is a console SoC, or are we assuming it is one because it's an unknown piece of silicon?

No I wasn't
Compelling argument.

Also I was completely right about PS4 Pro being 8.4TF fp16
Nobody cares if you were completely right, because you tend to always say this using your own posts as some kind of source. It's almost as if you want to self-confirm how right you were all those times, yet you only cite your own posts, which contain nothing but vague speculation, as the basis for your ground truth.

And yeah, on the off chance, you will get something right eventually. The point is that you act like you are right all the time.

onQ123 my gosh, if the PS5 were to have 880GB/s, that would leave either HBM2 or GDDR6 on a 512-bit bus. Both of which I suspect are too expensive unless Sony wants to charge a premium. I dunno.

Perhaps navi will have a yet to be seen memory type.

But my conservative estimate of the PS5's bandwidth would be 500-600GB/s. Also, dude, ignore that fuckwit.
Who are you calling a fuckwit? I know it's not me, but this reads like it was written in an intentionally vague way.

Is GDDR5X and a 576-bit bus realistic, though? I don't believe so.
It would be a very unusual configuration, to say the least. Also, 18 chips. You may as well go with 36 512MB GDDR5X chips.
 

Dontero

Banned
It would be a very unusual configuration, to say the least. Also, 18 chips. You may as well go with 36 512MB GDDR5X chips.

18 memory dies are practically impossible when you take PCB wiring into account,
unless you are willing to make a high-end PCB with many layers, which would be very costly.

Nvidia didn't even want to do that for their 2080 Ti. I mean, look at this mess of wiring, and that is just for 11 chips on a 352-bit bus, not 512:

 

StreetsofBeige

Gold Member
If better CPUs are coming out, do the consoles really need 24+ GB of RAM like some people want? Most PCs don't even have that much RAM.

Maybe the sweet spot is a good console version of this cpu + 16 gb ram?
 
Last edited:
If better CPUs are coming out, do the consoles really need 24+ GB of RAM like some people want? Most PCs don't even have that much RAM.

Maybe the sweet spot is a good console version of this cpu + 16 gb ram?
More RAM is always better.


The Xbox One X already has 12GB, and some of it is reserved for the system, so games get even less.

PCs are in another ballpark altogether. My system is getting old and it still has 32GB of system RAM and 12GB on the GPU alone (at much higher clocks and bandwidth).
 
Last edited:

Evilms

Banned
Personally I can see the PS5 either with:

24 GB of GDDR6 @ 12 Gbps on a 384-bit interface, with a bandwidth of 576 GB/s (worst-case scenario)
or
16 GB of GDDR6 @ 14 Gbps on a 512-bit interface, with a bandwidth of 896 GB/s (best-case scenario)
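Both scenarios follow from the same bus-width × per-pin-rate formula; a quick Python check:

```python
def bandwidth_gb_s(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bits / 8) * per-pin rate in Gbps."""
    return bus_width_bits / 8 * pin_rate_gbps

print(bandwidth_gb_s(384, 12))  # 576.0 GB/s (worst-case scenario)
print(bandwidth_gb_s(512, 14))  # 896.0 GB/s (best-case scenario)
```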
 

onQ123

Member
Is GDDR5X and a 576-bit bus realistic though? I don't believe so.


Custom hardware, so it's hard to say what's realistic, but I'm sure that if they can find a way to get 880GB/s out of cheaper memory, they will do it. Logic says they would just use HBM or GDDR6, but I have a feeling that we are dealing with madmen.


Maybe the specs osirisblack saw were for a devkit, but 18GB would mean using either a 144-bit, 288-bit, 576-bit, 1152-bit or 2304-bit bus, and 144 & 288 would be too slow.
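Those widths are just 18 chips multiplied by a per-chip interface width; a quick Python sketch (32-bit is the standard GDDR5/GDDR6 chip interface, the other widths are hypothetical for the sake of the list above):

```python
chips = 18
# Per-chip interface widths in bits; 32 is standard for GDDR5/GDDR6,
# the narrower/wider options are hypothetical.
per_chip_widths = (8, 16, 32, 64, 128)
bus_options = [chips * w for w in per_chip_widths]
print(bus_options)  # [144, 288, 576, 1152, 2304]
```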
 

Shin

Banned
I believe 18GB means a 16GB GDDR6 main pool + 2GB of DDR3 for other CPU stuff.
I think this is the memory module, given the potential jump, bandwidth, product status, pricing, and the setup and layout on the PCB.
https://www.samsung.com/semiconductor/dram/gddr6/K4ZAF325BM-HC14/

And 4-8Gb of https://www.samsung.com/semiconductor/dram/lpddr4x/ for UI/OS/streaming, etc etc.
This is the module from the original PS4 (in clamshell mode): https://www.samsung.com/semiconductor/dram/gddr5/K4G80325FB-HC28/
 

LordOfChaos

Member
Remind me, what makes us sure it's not for another bizarro box like this again? Surprisingly, AMD did make a custom chip for a seemingly niche configuration, though I don't know how big that market is in China.

 
Last edited:

THE:MILKMAN

Member
Custom hardware, so it's hard to say what's realistic, but I'm sure that if they can find a way to get 880GB/s out of cheaper memory, they will do it. Logic says they would just use HBM or GDDR6, but I have a feeling that we are dealing with madmen.


Maybe the specs osirisblack saw were for a devkit, but 18GB would mean using either a 144-bit, 288-bit, 576-bit, 1152-bit or 2304-bit bus, and 144 & 288 would be too slow.

It is always easy to overcomplicate things in spec threads. Keeping things simple usually results in being closer to reality. Matt's hints from way back are slowly becoming reality as we get closer to reveals.

As for OsirisBlack... he suddenly stopped posting, for unknown reasons, after saying he was going to drop more info. I'm not sure it helps to keep referring back to him until/unless he does.
 
No one will put that many GDDR5X chips in a console, nor a bus that big.

5X will not have the economies of scale. It's a dead technology. Its power efficiency is worse than GDDR6, at least at the necessary bandwidths.

A bus that big is uneconomical; you would probably need a chip in the ballpark of 500mm² to even think about it. It ain't happening on 7nm.

I’m not even convinced a 384-bit will be possible in the near term on 7nm.


GDDR6 on a 384-bit bus is actually a likely possibility. For one, the X1X is already 384-bit, and where 7th gen was 128-bit, the PS4 is 256-bit... 384 would follow that progression. And when you say impossible: GDDR6 on a 384-bit bus is what current Nvidia chips have.
 
Last edited:

Fake

Member
Remind me, what makes us sure it's not for another bizarro box like this again? Surprisingly, AMD did make a custom chip for a seemingly niche configuration, though I don't know how big that market is in China.


Because those consoles never get much success in China. I believe even Digital Foundry has one; Richard made a video about it. It really gives us an idea of what could be next gen, at least in CPU configuration.



edit: See the video at 1:12
 
Last edited:
Well, I'll eat my hat if it's not 7nm. And if it's 7nm, it's pretty much a given that it will be Zen 2 or a derivative thereof. That's just common sense.
So first you said there's no way it's Zen 2, now it's a given? What.


Or perhaps by Zen 2 you meant Zen+, the current 12nm desktop chips? Zen 2 is not out yet.
 
Last edited:

THE:MILKMAN

Member
GDDR6 on a 384-bit bus is actually a likely possibility. For one, the X1X is already 384-bit, and where 7th gen was 128-bit, the PS4 is 256-bit... 384 would follow that progression. And when you say impossible: GDDR6 on a 384-bit bus is what current Nvidia chips have.

The 2080 Ti uses a 352-bit bus, but the cards below it use a 256-bit or 192-bit bus.

Edited to fix quote.
 
Last edited:
I think you misread; what he means is that the clock speed is not too high for Zen 2.
Now you're talking about something completely different. He said 30 watts is too much for a console, which is what it would be on the actual Zen 2. By Zen 2 he means Zen+.

Damn forums lol. Anyway, yes, they'll either shrink Zen+ to 7nm or just use the full-fat 7nm Zen 2 cores. That's if Zen 2 isn't just a clock-speed boost and actually has IPC gains over Zen+. Either way we're getting 7nm, yes.
 
Last edited:

DeepEnigma

Gold Member
Now you're talking about something completely different. He said 30 watts is too much for a console, which is what it would be on the actual Zen 2. By Zen 2 he means Zen+.

Damn forums lol. Anyway, yes, they'll either shrink Zen+ to 7nm or just use the full-fat 7nm Zen 2 cores. That's if Zen 2 isn't just a clock-speed boost and actually has IPC gains over Zen+. Either way we're getting 7nm, yes.

Read what he quoted.

The guy said that 3.2GHz was too much; he said not for Zen 2, and that it should be about 30 watts at full load, which really isn't that bad.

But he can come in here and clear it up, because that's how I understood it, at least.
 

THE:MILKMAN

Member
Only because they chose to be cheap and use 11GB instead of 12. THE:MILKMAN

Their Quadro cards are 384-bit, but the point is that 384-bit GDDR6 is current tech, right now.

Go cheap!? Have you seen their prices?

I wouldn't label that bus 'current spec tech'. It is an option, and I guess if Microsoft really wants the specs crown (higher BW, for example) then they may well go for it again. Given my guess for the PS5's specs, I think a 256-bit bus would suffice.
 
Last edited:
Go cheap!? Have you seen their prices?

I wouldn't label that bus 'current spec'. It is an option, and I guess if Microsoft really wants the specs crown (higher BW, for example) then they may well go for it again. Given my guess for the PS5's specs, I think a 256-bit bus would suffice.

Their prices are because they're greedy. They're cheap, greedy fucks lol. I'll bet you anytime that the RTX Titan will be 384-bit GDDR6. Nvidia has been holding back their best tech ever since the GTX 680.

256-bit is not good enough. It puts it right where the PS4 is in its power-to-bandwidth ratio. One of the most obvious consequences for the PS4 is the lack of AF in many titles. Actually, 256-bit GDDR6 may be worse than what the PS4 got, really. That'd be 5x+ the GPU power over the PS4 with only a 3x or smaller bandwidth jump.
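The ratio argument in rough numbers, as a Python sketch; the 5x compute figure is this post's assumption, not a confirmed spec:

```python
# PS4: ~1.84 TFLOPS fed by 176 GB/s of bandwidth shared with the CPU
ps4_ratio = 176 / 1.84
print(round(ps4_ratio, 1))                # 95.7 GB/s per TFLOP

# Hypothetical next-gen: 5x PS4 compute, 256-bit GDDR6 @ 14 Gbps
nextgen_bw = 256 / 8 * 14                 # 448.0 GB/s
print(round(nextgen_bw / (5 * 1.84), 1))  # 48.7 GB/s per TFLOP, about half the PS4's ratio
```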
 
Last edited:

THE:MILKMAN

Member
Their prices are because they're greedy. They're cheap, greedy fucks lol. I'll bet you anytime that the RTX Titan will be 384-bit GDDR6. Nvidia has been holding back their best tech ever since the GTX 680.

256-bit is not good enough. It puts it right where the PS4 is in its power-to-bandwidth ratio. One of the most obvious consequences for the PS4 is the lack of AF in many titles. Actually, 256-bit GDDR6 may be worse than what the PS4 got, really. That'd be 5x+ the GPU power over the PS4 with only a 3x or smaller bandwidth jump.

No need to bet about the Titan RTX, as we already know it sports a 384-bit bus. We're talking about the consoles here, though. I think Microsoft might go with a 384-bit bus, but it wouldn't surprise me at all if Sony does a 256-bit bus.

I'm really not sure that bus size and bandwidth were the cause of the PS4's often-poor AF. The Xbox One had a 256-bit bus too, and less power, and often had better AF. Maybe some of the tech gurus here can explain what that issue was really about?

It is a shame there are no insiders here anymore to give hints on the general make-up/specs of next gen.
 

DeepEnigma

Gold Member
No need to bet about the Titan RTX, as we already know it sports a 384-bit bus. We're talking about the consoles here, though. I think Microsoft might go with a 384-bit bus, but it wouldn't surprise me at all if Sony does a 256-bit bus.

I'm really not sure that bus size and bandwidth were the cause of the PS4's often-poor AF. The Xbox One had a 256-bit bus too, and less power, and often had better AF. Maybe some of the tech gurus here can explain what that issue was really about?

It is a shame there are no insiders here anymore to give hints on the general make-up/specs of next gen.

AF definitely wasn't a bandwidth issue on the PS4. I think it probably just wasn't on by default in the SDK, and some developers didn't bother messing with it. There are plenty of demanding games that use it no problem when they do enable it.
 

THE:MILKMAN

Member
AF definitely wasn't a bandwidth issue on the PS4. I think it probably just wasn't on by default in the SDK, and some developers didn't bother messing with it. There are plenty of demanding games that use it no problem when they do enable it.

Cheers. I remember now there was some "argument" as to whether it was the devs' fault for not turning it on, or Sony's for making it difficult to enable/use in the SDK?
 

DeepEnigma

Gold Member
Cheers. I remember now there was some "argument" as to whether it was the devs' fault for not turning it on, or Sony's for making it difficult to enable/use in the SDK?

It is all speculation, just as I am as well.

It hasn't been an issue in quite some time, that I am aware of. Also, I don't know why it's so hard to get that answer out of any developer, when they seem to share anything and everything about the tech of these systems a lot of the time.

Maybe Hellpoint Dev and GamingBolt can shed some light on this. ;)
 
Last edited:
AF definitely wasn't a bandwidth issue on the PS4. I think it probably just wasn't on by default in the SDK, and some developers didn't bother messing with it. There are plenty of demanding games that use it no problem when they do enable it.
AF needs bandwidth. The PS4 had as much bandwidth as the 7850, but the latter doesn't have to share it with a CPU. And graphics cards rarely have an abundance of bandwidth; even the GTX 1080 is bandwidth-limited.

I'm not talking about some indie game or remaster lacking AF; even Sony's AAA games have 4x AF at best. I'm 100% sure the PS4 needed more bandwidth. It's not as big a bottleneck as the original Xbox's or the N64's memory, for example, but still. I miss Sony's insane PS2 bandwidth solution.
 
Last edited:

Redneckerz

Those long posts don't cover that red neck boy
Why did it have to be APU? Old consoles had discrete GPU and they were fine.
For integration purposes, and it's more efficient. Having everything integrated reduces latency compared with separating things.

AMD has shown it can make APUs with relatively powerful graphics hardware, so it's not like they are compromising that much by going the APU route.
18 memory dies are practically impossible when you take PCB wiring into account,
unless you are willing to make a high-end PCB with many layers, which would be very costly.

Nvidia didn't even want to do that for their 2080 Ti. I mean, look at this mess of wiring, and that is just for 11 chips on a 352-bit bus, not 512:
Agreed. All this shows is that onQ123 is just talking words.
Maybe the specs osirisblack seen was for a devkit but 18GB would be using either 144-bit , 288-bit , 576-bit , 1152 -bit or 2304-bit but 144 & 288 would be too slow.
1152-bit and 2304-bit memory, yes, sure... we are approaching HBM levels here.

Xbox One , Xbox One S & PS4 all used 16 RAM chips
Yeah, and at how many bits was that? Certainly not some oddball 576-bit. You left that out of your post so you could make the point that they used 16 RAM chips.

Yeah, they do. At a 256-bit memory width, which is a common size.
 

DeepEnigma

Gold Member
AF needs bandwidth. The PS4 had as much bandwidth as the 7850, but the latter doesn't have to share it with a CPU. And graphics cards rarely have an abundance of bandwidth; even the GTX 1080 is bandwidth-limited.

I'm not talking about some indie game or remaster lacking AF; even Sony's AAA games have 4x AF at best.

Yet the weaker Xbox One had no issue. AF doesn't cost that much bandwidth on these modern systems. Like I said, there are more graphically intensive games now compared to when this was a thing, that have it enabled no problem.
 
It is all speculation, just as I am as well.

It hasn't been an issue in quite some time that I am aware of. Also, I don't know why it's so hard to get that answer out of any developer, when they seem to share anything and everything about tech a lot of the times with these systems.

Maybe Hellpoint Dev and GamingBolt can shed some light on this. ;)
Yet the weaker Xbox One had no issue. AF doesn't cost that much bandwidth on these modern systems. Like I said, there are more graphically intensive games now compared to when this was a thing, that have it enabled no problem.

No, the Xbox One rarely has more than 4x AF either. Again, not talking about indie games. Look at Halo 5: it uses trilinear or bilinear filtering.
^ can't edit this mess on ps4 browser
 
Last edited: