Wait, the guy who wrote that was kidding, yes?
I'll give you something even better:
I went ahead and made a "clearer" version of the diagram posted on the first page.
Hope it helps.
Bethesda will fu(k up PS4 game development confirmed...
I disagree with Cerny. I would call the PS4 AMD Station A and the Xbox One AMD Station B with some custom stuff.
It's one of the major modifications by the hardware team:
The three "major modifications" Sony did to the architecture to support this vision are as follows, in Cerny's words:
"First, we added another bus to the GPU that allows it to read directly from system memory or write directly to system memory, bypassing its own L1 and L2 caches. As a result, if the data that's being passed back and forth between CPU and GPU is small, you don't have issues with synchronization between them anymore. And by small, I just mean small in next-gen terms. We can pass almost 20 gigabytes a second down that bus. That's not very small in today’s terms -- it’s larger than the PCIe on most PCs!"
"I can confidently say that there are multiple buses named after vegetables. Whether this is good or bad depends on how much you like onions/garlic/chives."
Actually, AMD now calls them FCL (Fusion Compute Link) and RMB (Radeon Memory Bus).
Asynchronous compute is one of the areas where the PS4 has a very significant advantage over the Xbone. The PS4 has extra CUs (18 vs. 12 for the Xbone) that can be applied to asynchronous compute. The PS4 also has custom-modified compute queues: 64 vs. the standard 2 on AMD GCN parts. It's great that PS4 ports are already looking at taking advantage of asynchronous compute this early in the lifecycle.
Maybe there was some influence from Sony, but the ACEs are not different from the ones AMD is using in their products nowadays.
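For anyone wondering what a "compute queue" actually is: conceptually it's just a ring buffer of command packets that the GPU front-end drains. Here's a toy sketch in C; the Packet fields and names are made up for illustration and are not the real GCN packet format:

```c
/* Toy model of a compute queue: a ring buffer of command packets.
   Having 64 of these (vs. the standard 2) means many independent
   producers can feed the GPU without serializing on a single queue.
   Purely illustrative -- not the actual GCN hardware format. */
#include <stdint.h>
#include <stdio.h>

#define QUEUE_DEPTH 256

typedef struct {
    uint64_t shader_addr;  /* which compute kernel to run   */
    uint32_t group_count;  /* how many workgroups to launch */
} Packet;

typedef struct {
    Packet   ring[QUEUE_DEPTH];
    uint32_t write_idx;    /* advanced by the producer (CPU side)      */
    uint32_t read_idx;     /* advanced by the consumer (GPU front-end) */
} ComputeQueue;

/* Returns 1 on success, 0 if the queue is full. */
int submit(ComputeQueue *q, Packet p) {
    if (q->write_idx - q->read_idx >= QUEUE_DEPTH)
        return 0;                              /* back-pressure */
    q->ring[q->write_idx % QUEUE_DEPTH] = p;
    q->write_idx++;   /* real hardware needs a write barrier before this */
    return 1;
}

int main(void) {
    ComputeQueue q = {0};
    Packet job = { 0x1000, 64 };
    printf("submitted: %d\n", submit(&q, job));
    return 0;
}
```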
The thing that puzzles me is the developers' mention of the need to allocate data correctly. When I first read about AMD's HSA, I assumed that everything would be unified and could be customized to the developers' needs as required. I thought there wouldn't be any need for data separation at all; to me, with my limited knowledge, it seemed that CPU and GPU data would be fed through a common bus and allocated dynamically as needed. Now I'm confused.
Stop taking away our flops yo.
EDIT: Did you know that PS2 had about 6GFlops of computational power? Every Gflop counts, least of all dat xtra 23.
PS4 games will give you bad breath, and be delicious. Confirmed.
So if the unified memory doesn't negate the need to separate data in the way you described, what is its main benefit?
Not in the slightest... and sadly there are more posters like him - the only difference being they don't make their bs as obvious and sometimes hide it in walls of text...
So is this good or bad, I am untechnical as hell. How does this impair/improve performance?
Something told me we were not going to go into next gen without hearing about the unlocked power these machines have. So all we need now is a hard number for the percentage of the machines' potential the first-gen games are going to be using: 50, 70, 90, or 110%.
I had a feeling the PS4 would be stealth downgraded before launch
It's good. Total 196GB bw. Though that bw can't compete against x1's infinite power of the cloud.
Go away with this adding of various bandwidths. There is no reason to get less precise.
I believe, as someone who is untechnical, all you need to know is that John Carmack thinks that Sony "made wise engineering choices."
You're taking this way too seriously, benny.
It will just be used in fanboy wars about "See, the Sony people are advertising bullshit memory bandwidths too! LOL hypocritical!"
PS4 secret sauce!
Garlic.
I read a lot of these threads, and I can already see the future derails provided by some well-known posters who will allude to posts like the ones you make.
the real question is if MS special sauce can be as tasty
Depends on which IP level MS was aiming at.
That's what he's saying. I doubt he's talking about the PC.
Correct me if I'm wrong, but this doesn't mean that the total BW is 196GB/s.
The memory itself has a maximum bandwidth of 176GB/s.
The GPU has access to the full available bandwidth.
The CPU has access to the same pool of memory, but through a slower bus with a maximum of 20GB/s.
Right or wrong?
It is really just about whether an individual address in main memory should be mapped to CPU-L1/L2 cache (Onion) or not (Garlic). CPU-L1/L2 is (a) of limited size and (b) highly relevant to the CPU but at the same time irrelevant to the GPU. Hence, you issue access commands to CPU-relevant data through Onion, and access to CPU-irrelevant data through Garlic. As a result the GPU does not bully the CPU.
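If it helps, that decision rule boils down to something like the sketch below. The enum and function names are hypothetical, invented for illustration; this is not the real PS4 SDK API:

```c
/* Hypothetical illustration of the Onion/Garlic rule described above.
   These names are made up -- not the actual PS4 SDK. */
#include <stdio.h>

typedef enum {
    BUS_ONION,  /* coherent with CPU L1/L2: use for data the CPU touches */
    BUS_GARLIC  /* non-coherent, full GDDR5 bandwidth: GPU-only data     */
} BusHint;

/* CPU-relevant data goes through Onion so the caches stay coherent;
   data the CPU never touches goes through Garlic so GPU traffic
   doesn't thrash the CPU's caches ("the GPU does not bully the CPU"). */
BusHint pick_bus(int cpu_touches_it) {
    return cpu_touches_it ? BUS_ONION : BUS_GARLIC;
}

int main(void) {
    printf("render target         -> %s\n",
           pick_bus(0) == BUS_GARLIC ? "Garlic" : "Onion");
    printf("CPU-written constants -> %s\n",
           pick_bus(1) == BUS_ONION ? "Onion" : "Garlic");
    return 0;
}
```

So per-frame constants the CPU writes would go through Onion, while render targets and static textures the CPU never reads would go through Garlic.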
What? Why? It's about games tech, why shouldn't we be allowed to discuss it? It's a great opportunity for all of us to gain some knowledge on how the internals of a console work.
Boo. I understand the reasoning, but it still doesn't sit well with me when people don't go all out and take advantage of the available resources ><
Really nice article, tons of details on PS4 architecture and SDK tools. Sony seems to be in a very good position with the PS4 deployment.
The article also confirmed two things: 2 CPU cores are dedicated to the PS4 OS, and the PSN download speed limit is [was] 12Mbps.
You are correct. It's also worth noting that GCN GPUs are rated to handle only 153GB/s, so the division between the CPU and GPU is pretty much perfect for the bandwidth.
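Quick sanity check on that, using only the figures quoted in this thread (not official specs):

```c
/* GPU practical ceiling + CPU Onion bus vs. the GDDR5 pool's total.
   All figures are the ones posted in this thread. */
#include <stdio.h>

int main(void) {
    double gpu_max = 153.0;  /* rated GCN GPU bandwidth ceiling */
    double cpu_bus =  20.0;  /* CPU's Onion bus                 */
    double gddr5   = 176.0;  /* total GDDR5 bandwidth           */

    printf("GPU + CPU = %.0f of %.0f GB/s (%.0f%% used)\n",
           gpu_max + cpu_bus, gddr5,
           100.0 * (gpu_max + cpu_bus) / gddr5);
    return 0;
}
```

153 + 20 = 173GB/s, about 98% of the 176GB/s the memory can deliver, which is why the split looks so deliberate.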
Nope, it did not.
Their game logic is created for two cores, with the rest of the cores doing some other stuff. There is no confirmation in the text of what you say.
Owing to confidentiality agreements, Reflections couldn't go into too much depth on the relationship of the Onion and Garlic buses with the rest of the PS4's processor, but we suspect that ExtremeTech's block diagram is pretty close to the mark. Note that the PS4 has two Jaguar CPU clusters for eight cores in total, two of which are reserved by the operating system.
The CPU has to go through the GPU to access RAM. So it's actually 176 minus 20 = 156, I believe.
Aww... I miss specialguy.
It did. There is a diagram captioned
Maybe you missed it?
OH GOD. LOOKS SO GOOD.
And I'm fasting. =[
Not sure if that infinite power of the cloud is sarcasm or not (I suspect it is) but I remember Cerny calling bullshit on cloud computing being able to boost graphical fidelity?
http://www.youtube.com/watch?feature=player_detailpage&v=rCVplpC1YXY#t=157s
That explains the lack of 1080p/60fps titles...
You are right. Cerny has called out the cloud, saying it cannot increase graphical fidelity.
I can understand xbone needing two cores for their 2 and a half OS setup but I'm perplexed as to why PS4 would need two as well.
Why do they need to reserve 2 CPU cores for the damn OS?