iceatcs
Junior Member
Only posted 4 times since 2009...wtf
pretty obviously.
Well the RAM thing is the only thing that's been "wrong" so far, and that was a last-minute change. VGleaks obviously have a lot of detailed docs for PS4 (a 95-page PDF, IIRC?). Why they choose to react to rumours I don't know. I personally would just post everything...

The 14+4 CU thing is now assumed to be wrong as well, I think. This new GPU leak seems to supersede the old leak that outlined the 14+4 split, as there's now no indication of it; instead, the new information is the 8 ACE setup.
I'm not sure what kind of docs VGLeaks have, but it seems like there is some sort of interpretation involved. It also seems like English is not their first language(?).
100% efficiency.
Near 100% efficiency.
Can you expand on it? It doesn't seem to be universally accepted as having any inherent or specific meaning.
Because I have a social life.
It is a good quick photomontage. The concern is that you all look more at the penis than the GDDR5.
Apologies if I offended anyone by tweaking the work of Michelangelo.
Did I miss something? Why isn't anyone questioning the legitimacy of this?
Given I am so totally lost, I am going to ask a few questions.
Firstly, an open-ended one:
What does it all mean for the PS4 GPU when compared to a traditional HD 7850?
Closed questions:
Does this mean that with little effort developers can get to its theoretical 1.84 TF?
Is the GPU customization better or worse than previously assumed?
This is a rib at the folks who earlier had a PDF from the Durango Summit of 2012. In that PDF, there is a chart which says the original Xbox 360 GPU was 60% efficient and the Durango GPU is near 100% efficient.
For a long time, people were under the impression the PS4 would have a vastly less efficient GPU, but this throws some of that caution to the wind. That is all.
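To put rough numbers on what those chart figures imply, here is a back-of-the-envelope sketch. The 60% and near-100% figures come from the leaked chart; the peak-FLOPS numbers below are just illustrative ballpark values, not documented specs:

```python
def effective_gflops(peak_gflops, efficiency):
    """Usable throughput if only a fraction of peak ALU work is achieved."""
    return peak_gflops * efficiency

# Xenos (Xbox 360) peaks at roughly 240 GFLOPS; at the chart's 60%
# efficiency that is ~144 GFLOPS of useful work.
print(effective_gflops(240, 0.60))   # 144.0

# A "near 100% efficient" GPU keeps almost all of its 1.84 TFLOPS peak.
print(effective_gflops(1840, 0.95))  # 1748.0
```

The point being: an efficiency percentage only matters relative to the peak it is scaling.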
Deanos [Deano Calver, Ninja Theory dev (ex?)] just commented "GCN 2.0" on B3D when he saw the VGLeaks article.
http://forum.beyond3d.com/showpost.php?p=1713696&postcount=630
I thought it was great
"What does it all mean for the PS4 GPU when compared to a traditional HD 7850?"
As it is, the Orbis GPU is already faster than the 7850. With this info, the gap increases when it comes to compute-related tasks.

"Does this mean that with little effort developers can get to its theoretical 1.84 TF?"
Most likely.

"Is the GPU customization better or worse than previously assumed?"
It is obviously a more customized design. IMO it is for the better.
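For reference, the 1.84 TF figure is just the standard GCN peak-FLOPS arithmetic. A quick sketch, assuming the widely rumoured 18 CUs at 800 MHz (leak numbers, not confirmed specs):

```python
cus = 18             # rumoured CU count for the Orbis GPU
alus_per_cu = 64     # GCN: four 16-wide SIMD units per CU
flops_per_clock = 2  # a fused multiply-add counts as two ops
clock_ghz = 0.8      # 800 MHz

peak_tflops = cus * alus_per_cu * flops_per_clock * clock_ghz / 1000
print(peak_tflops)  # ~1.8432 -- the "1.84 TF" everyone quotes
```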
GCN is AMD's current GPU architecture, and 2.0 is obviously its successor. Since we have no idea what exactly this entails, throwing around the "GCN 2.0" moniker in console tech discussions is pretty meaningless IMHO.
Well, we can agree that this change is rather big for overall performance, and I don't think AMD will throw it in the trashbin.
BTW, did you notice that in that graph the CUs are divided? Those pipelines are essentially divided, with each part going to a different pool of CUs. Or am I reading something wrong?
So GCN 2.0 and 8GB GDDR5? Sony you beauties! Lol, people on this forum said both of these things were basically never going to happen! In a short few months the PS4 went from being half decent hardware-wise to actually being quite potent. I know GCN 2.0 isn't the biggest leap forward, but I'll take anything on the GPU at the mo, since AMD GPU tech right now isn't the most formidable, especially against some of Nvidia's top offerings.
Isn't this akin to what ATI did with the Xenos and unified shaders?
We do not know if it is GCN 2.0; it may just be a custom GPU for PS4 (I doubt it).
GCN + custom improvements or GCN 2.0, either one will do nicely.
It's still labelled as a "unified" array. AMD likes to divide their CUs into two blocks in their diagrams.
GCN+/2.0 is not some magical lamp that will transform the system. That diagram only shows that the GPU will be more efficient at compute.
On a side note, if they've gone this far, I wonder what else they changed.
If you mean that they put in a custom design, then yes (it will probably be their new standard in the future).
This also means that we can finally put that 14+4 split theory to rest.
This isn't Durango, so no.
Considering what they did for the GameCube and Xbox 360, I think PS4 will have a very capable GPU. It's obvious Sony is putting a lot of care into this and aiming high. I expect good things.

I get this feeling that Sony is very happy to have gone with AMD for the GPU design and will probably stick with them in the future (hopefully).
Yes, I admit I was wrong. Now I would like VGLeaks to explain themselves. It's ironic that they released something that confirms they were wrong about something else.

Yeah.
All this means is that they are giving developers more options on the GPU at one time. If they want to make an area that is heavy on CPU work, they can do that and run code in parallel without the GPU and CPU running into each other. It's a "smart" design, essentially.
IMO it looks like it is more flexible for GPGPU (non-rendering) work. More compute rings, so it can kick off processing in parallel and not risk leaving CUs idle.
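A toy software sketch of that idea: several independent compute "rings" feeding a shared pool of workers, so no worker sits idle while any ring still has jobs. This is only an illustration of the queueing concept, not how the hardware actually behaves:

```python
from concurrent.futures import ThreadPoolExecutor

NUM_RINGS = 8  # the leak describes an 8-ring compute setup via the ACEs

def compute_kernel(job):
    ring_id, x = job
    return ring_id, x * x  # stand-in for a GPGPU workload

# Each "ring" is an independent stream of jobs; with only a single
# queue, compute work would have to wait in line behind everything else.
rings = [[(r, x) for x in range(4)] for r in range(NUM_RINGS)]

# A shared pool of CU-like workers drains all the rings concurrently.
all_jobs = [job for ring in rings for job in ring]
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(compute_kernel, all_jobs))

print(len(results))                     # 32 jobs completed
print(sorted({r for r, _ in results}))  # work came from all 8 rings
```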
It doesn't matter if it's called GCN2 or WTF4; that architecture is damn impressive. I doubt any dev will be able to use all the potential power in the PS4 within the next 1-2 years.

I would like to understand this in real-world gaming terms.
"Also, unlike last time, the task switch can happen on any one or more CUs and is dependent upon the workload. This would also mean that there is no hardware divide like the earlier rumour suggested at 14 (fixed function) + 4 (GPGPU as well as GPU-related tasks); rather, much like the UMA RAM pool, the ratio between CUs performing GPU vs GPGPU tasks is flexible and dynamic(?). Am I close to being correct?"
That is what I always felt was the right approach; I had a lengthy discussion with JohnnySasaki86 on this.

"Also, is this what you call 'context switching'?"
In a way.

"Quick, someone start a GCN2.0 meme."
I'm afraid I already did. Sorry.
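The difference between a hardware 14+4 divide and the flexible arrangement described above can be sketched like this. A toy illustration only; the 18-CU count comes from the leaks and the share values are made up:

```python
def split_cus(total_cus, compute_share):
    """Flexible split: any fraction of the CU array can do GPGPU this frame."""
    compute = round(total_cus * compute_share)
    return total_cus - compute, compute

# A fixed 14+4 design would always return (14, 4). A dynamic design
# can shift the ratio per workload -- example shares:
print(split_cus(18, 0.1))  # (16, 2): graphics-heavy frame
print(split_cus(18, 0.5))  # (9, 9): GPGPU-heavy frame
```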
I would like to understand this in real world gaming terms.
Here is my understanding based on reading the OP of the rumor thread for this:
They said their 2 ACE setup can already saturate the x16 PCI-E bus with bandwidth.
Now, because of the APU in the PS4 (instead of a traditional separate CPU and GPU as on a PC), you'd need to increase the ways this bandwidth can be utilized to take advantage of the 176 GB/s on PS4 for more than just GPU tasks. Right?
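Rough numbers for that comparison. The 176 GB/s figure is from the leak; the PCIe figures are standard per-direction throughput for a x16 link:

```python
# GDDR5 bandwidth = bus width (bits) x data rate (GT/s) / 8 bits per byte.
# The leaked figure implies a 256-bit bus at 5.5 GT/s:
gddr5_gb_s = 256 * 5.5 / 8
print(gddr5_gb_s)  # 176.0 -- the quoted Orbis bandwidth

# A PCIe x16 slot, per direction, for comparison:
pcie2_x16 = 16 * 0.5    # Gen 2: ~0.5 GB/s per lane after 8b/10b encoding
pcie3_x16 = 16 * 0.985  # Gen 3: ~0.985 GB/s per lane after 128b/130b
print(pcie2_x16, pcie3_x16)  # 8.0 and ~15.8 -- far below 176 GB/s
```

So an on-package link between CPU and GPU sidesteps a bottleneck that a discrete PC setup would hit long before saturating that memory bandwidth.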
Alright, good GPU? Check. Lots of RAM with good bandwidth? Check. The only thing left to worry about is the CPU and the GDDR5 latency problem. If Sony solved these issues then they have quite the capable system. I see the era of the PS2 coming back :)

It definitely seems like a capable and wisely designed system.