
Rumor: PS4 GPU based on AMD's GCN 2.0 architecture?

Kleegamefan

K. LEE GAIDEN
100% efficiency is a goal that will NEVER be reached, no matter what piece of paper it is written on.

XB3 games won't have it and ps4 games won't have it.

Note, I am not doubting you read that in confidential material, thuway....... I am just saying it is an unrealistic goal.
 

McHuj

Member
It's not a rumor now. I am confirming it. You need a link? There is no link. I am the link :). The point is, MS is going for 100% GPU efficiency.

How? There's got to be some technical reason/justification for these claims that MS is making?
 
D

Deleted member 80556

Unconfirmed Member
If this is true, Sony made huge modifications last minute.

Define "last minute". Could it have been done 4 months ago? Can MS do these modifications too?

Or was this not known to devs and thus not leaked like the RAM?

Also of interest: PhysX by Nvidia is being licensed for the PS4. Yes, PhysX can run on non-Nvidia GPUs; that was just a licensing thing.

http://physxinfo.com/news/10531/nvi...tner-for-physics-middleware-on-playstation-4/

Well, if they had restricted it to their own GPUs, they would've missed a huge share of games, not to mention that since devs will be so acquainted with AMD's architecture, most wouldn't use PhysX. Smart decision; they'll probably do the same with Durango.

It's not a rumor now. I am confirming it. You need a link? There is no link. I am the link :). The point is, MS is going for 100% GPU efficiency.
HE'S THE LINK.
Everything is connected. This is all Watch_Dogs viral campaign.

What you are aiming at is a standard feature of the GCN architecture. Happy to disclose this stuff via pm.

Did not know this :-o. Architecture sure has changed in the last few years :-O.

That's an interesting turn of events.
 

Reiko

Banned
I know what you have. Please don't leak it. Trying to act cool on the internet is not worth having someone lose their job. I can back you up though-

Specs for Durango haven't changed. The GPU is nearly 100% efficient though at 1.23 TF, so that's something most people didn't know.

Correct thuway:/

That was the part I was sensitive about. Did not know how GAF would react.

I'll PM you the info of the last page. To show you my info checks out.
 
I know what you have. Please don't leak it. Trying to act cool on the internet is not worth having someone lose their job. I can back you up though-

Specs for Durango haven't changed. The GPU is nearly 100% efficient though at 1.23 TF, so that's something most people didn't know.

Does it deny BG's claims about the processor? Also, do you think the PS4 GPU is nearly 100% efficient as well, now that we have this new info released today from vgleaks?
 

CLEEK

Member
What is 8 ACEs?
[image: ACE_Issue_8.jpg]
 

KidBeta

Junior Member
I know what you have. Please don't leak it. Trying to act cool on the internet is not worth having someone lose their job. I can back you up though-

Specs for Durango haven't changed. The GPU is nearly 100% efficient though at 1.23 TF, so that's something most people didn't know.

Would the 100% efficiency you're talking about perchance have to do with 5-lane SIMD versus 1-lane SIMD? :D. If so, both consoles should have it (and it's good for both too; idle parts are bad).

Also, afaik the normal GCN and Durango only have 2 Queues.

Could these 8 Queues be what the original vgleaks leak was talking about when they said improved GPGPU performance? Maybe.
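As a toy sketch of why extra compute queues could matter for utilisation (this is a made-up scheduling model, not GCN's actual hardware scheduler, and the 70% busy figure is invented for the example):

```python
# Toy model: each "cycle" the graphics workload either occupies the shader
# array or leaves it idle. With async compute queues available, the idle
# cycles can be filled with compute work instead of being wasted.

def utilisation(gfx_busy_pattern, have_compute_queues):
    """Fraction of cycles the shader array does useful work."""
    busy = 0
    for gfx_busy in gfx_busy_pattern:
        if gfx_busy or have_compute_queues:  # compute fills the gaps
            busy += 1
    return busy / len(gfx_busy_pattern)

# A pattern where graphics keeps the array busy only 70% of the time:
pattern = [True] * 7 + [False] * 3

print(utilisation(pattern, have_compute_queues=False))  # 0.7
print(utilisation(pattern, have_compute_queues=True))   # 1.0
```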
 
No such thing as 100% efficiency; it's like saying eSRAM will allow Durango to go past its theoretical limit. <insert eyeroll here> People are going full-blown delusional at this point.
The entire point of GCN is increased efficiency, but no matter what MS or any engineer does, you will never have a 100% efficient machine. It's like saying Honda made a 100% efficient engine... wut?
 

Reiko

Banned
No such thing as 100% efficiency; it's like saying eSRAM will allow Durango to go past its theoretical limit. <insert eyeroll here> People are going full-blown delusional at this point.
The entire point of GCN is increased efficiency, but no matter what MS or any engineer does, you will never have a 100% efficient machine. It's like saying Honda made a 100% efficient engine... wut?

Well, that's egg on the Durango engineers' faces, don't you think?

That's what they were beating their chest to on the slide.
 

KidBeta

Junior Member
I don't remember the Durango GPU being at 100% efficiency.

The only part I heard about efficiency on GAF were the Data Move Engines.

It is if you _think_ about architecture: compared to Xenos, the 1-lane SIMD is maybe more efficient than the 5-lane SIMD because it can be fully utilised much more of the time (all the time? I dunno) :).
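The lane-utilisation point can be sketched with made-up numbers (an illustration of the general VLIW-vs-scalar issue, not real Xenos/GCN figures):

```python
# Toy model: a VLIW-style 5-lane unit issues bundles of up to 5 *independent*
# ops per cycle; a scalar (1-lane) unit issues 1 op per cycle. If the compiler
# can only find `ilp` independent ops on average, the extra VLIW slots idle.

def vliw5_utilisation(ilp):
    """Fraction of the 5 issue slots actually filled each cycle."""
    return min(ilp, 5) / 5

def scalar_utilisation(ilp):
    """A 1-wide issue slot is full whenever any work exists at all."""
    return 1.0 if ilp >= 1 else 0.0

# With ~3 independent ops available per cycle, the 5-lane unit runs at 60%
# while the 1-lane unit stays fully utilised:
print(vliw5_utilisation(3))   # 0.6
print(scalar_utilisation(3))  # 1.0
```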
 

Reiko

Banned
It is if you _think_ about architecture: compared to Xenos, the 1-lane SIMD is maybe more efficient than the 5-lane SIMD because it can be fully utilised much more of the time (all the time? I dunno) :).


I really hope so. Would be a waste of R&D if they didn't accomplish their goals.
 
You are missing out on the megaton -_-. The GPU is near 100% efficient. The 360 GPU was 60% efficient. If Reiko wants to back me up, he should have a chart that compares them.

I remember some stats in this regard from some Gamefest/GDC presentations last year... 360 was really "inefficient"; games at most utilized about 20ish GFLOPS on the CPU and 140ish on the GPU, IIRC...

But even if you use the same 60% metric on current AMD GPUs, they would still be far ahead of Durango's 1.3 TFLOPs, so it doesn't sound too much like a megaton XD. Unless something other than flops is also highly inefficient in current GPU designs and MS is aiming to improve it all.

Edit: Then again, 100% GPU efficiency sounds really improbable by just targeting the GPU itself. A good portion of the frame time the GPU sits idle waiting for the CPU, for example; no matter how efficient you are, if someone makes you wait you are wasting resources.
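Writing that back-of-the-envelope comparison out (the desktop-GPU peak is an assumed example figure for a 2012 high-end AMD card; the 60%/100% efficiency numbers are just the claims from this thread, not verified):

```python
# Effective throughput = peak FLOPS x claimed efficiency.
durango_peak_tf = 1.23   # TF figure quoted in this thread
desktop_peak_tf = 3.79   # assumed example: a 2012 high-end AMD desktop card

durango_effective = durango_peak_tf * 1.00  # the "near 100% efficient" claim
desktop_effective = desktop_peak_tf * 0.60  # same 60% metric applied

print(durango_effective)  # 1.23
print(desktop_effective)  # ~2.27 TF effective -- still well ahead
```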

Also of interest: PhysX by Nvidia is being licensed for the PS4. Yes, PhysX can run on non-Nvidia GPUs; that was just a licensing thing.

http://physxinfo.com/news/10531/nvi...tner-for-physics-middleware-on-playstation-4/

PhysX is licensed even for PS3/360/WiiU. The part that makes PhysX interesting on Nvidia cards is the fluid/cloth simulation that is usually locked to Nvidia GPUs.

That doesn't mean they are not licensing it for PS4 too, but PhysX by itself is no different than Havok.
 
Well, that's egg on the Durango engineers' faces, don't you think?

That's what they were beating their chest to on the slide.

Not egg on their face, because they've never claimed such; it's a bunch of people on the internet who've made this hilarious statement. Increased efficiency != 100% efficiency. You can create a GPU that can sustain better performance on average instead of a GPU that peaks but also has extremely low efficiency at times due to <insert reason for GPU performance to plummet>. If you can raise the floor on how far performance drops under most situations, then you've created something worth talking about (again, this is the point of GCN). I have no doubt Durango's GPU will be much more efficient than 360's, but people running around screaming 100% efficiency are really no different than the people running around screaming "8 GEE BEES of DDR5!" (and yes, I'm well aware it's GDDR5)

100% efficiency is a pipe dream, because in 15 years when new software and hardware engineering techniques come around, they will look back to Durango's GPU and be like "Oh, we could have done this to increase efficiency" or "Oh, we should have done this".

What people are hilariously suggesting is that in 15 years when engineers go back and look at this hardware they're going to essentially be like "whelp, there was nothing we could have improved with this design, it was perfect".

No...just no.
 

Reiko

Banned
Well, considering they bought the entire GPU (practically) off AMD, it's not really their money that would be wasted if GCN was crap :p.

Right:/


Not egg on their face, because they've never claimed such; it's a bunch of people on the internet who've made this hilarious statement. Increased efficiency != 100% efficiency. You can create a GPU that can sustain better performance on average instead of a GPU that peaks but also has extremely low efficiency at times due to <insert reason for GPU performance to plummet>. If you can raise the floor on how far performance drops under most situations, then you've created something worth talking about (again, this is the point of GCN).

100% efficiency is a pipe dream, because in 10 years when new engineering techniques come around, they will look back to Durango's GPU and be like "Oh, we could have done this to increase efficiency" or "Oh, we should have done this".

What people are hilariously suggesting is that in 10 years when engineers go back and look at this hardware they're going to essentially be like "whelp, there was nothing we could have improved with this design, it was perfect".

No...just no.

The bunch of people in this case is Microsoft. This info isn't coming from forum warriors.

I have my doubts, but this is what they're trying to sell to developers and investors.
 

Kaako

Felium Defensor
Not egg on their face, because they've never claimed such; it's a bunch of people on the internet who've made this hilarious statement. Increased efficiency != 100% efficiency. You can create a GPU that can sustain better performance on average instead of a GPU that peaks but also has extremely low efficiency at times due to <insert reason for GPU performance to plummet>. If you can raise the floor on how far performance drops under most situations, then you've created something worth talking about (again, this is the point of GCN). I have no doubt Durango's GPU will be much more efficient than 360's, but people running around screaming 100% efficiency are really no different than the people running around screaming "8 GEE BEES of DDR5!" (and yes, I'm well aware it's GDDR5)

100% efficiency is a pipe dream, because in 15 years when new software and hardware engineering techniques come around, they will look back to Durango's GPU and be like "Oh, we could have done this to increase efficiency" or "Oh, we should have done this".

What people are hilariously suggesting is that in 15 years when engineers go back and look at this hardware they're going to essentially be like "whelp, there was nothing we could have improved with this design, it was perfect".

No...just no.
Well said.
 
What exactly is the Durango GPU supposed to be "100% efficient" at doing exactly?

And why wouldn't AMD apply this magic they've used to perfect GPUs for all eternity into their other products?
 

KidBeta

Junior Member
What exactly is the Durango GPU supposed to be "100% efficient" at doing exactly?

And why wouldn't AMD apply this magic they've used to perfect GPUs for all eternity into their other products?

They have, it's called GCN.

The Durango GPU gets 100% utilisation with its 1-lane SIMD, compared to the 5-lane SIMD of Xenos, which got much worse utilisation (near half that) depending on what you were doing.
 
What exactly is the Durango GPU supposed to be "100% efficient" at doing exactly?

And why wouldn't AMD apply this magic they've used to perfect GPUs for all eternity into their other products?

Apparently thuway saw some Microsoft slide that compared Xenos to the new GPU, saying that Xenos was 60% efficient and the new GPU would be near 100% efficient...

It's all hype, there's obviously been a significant amount of improvement between a 2005 GPU and a 2012 GPU in terms of advancement in efficiencies, so I think he's completely taking it out of context and has no basis for claiming it's much more efficient than PS4's GPU. Sure, it's a big improvement over a GPU from 2005...that tells us NOTHING about how it compares to Playstation 4's GPU. He was WAYYYYY off base with all of his statements about PS4 being a "brute force" approach and 720 introducing some revolutionary technological efficiency that is somehow lacking from the highest GPU cards on the market.

In fact, I doubt there's anything substantially different at all between the two basic architectures other than the quantity of CUs.

The ESRAM and other hardware functions are there to patch up a less than ideal DDR3 memory config.

Sorry, there is no magic jizzard sauce.
 

Reiko

Banned
Apparently thuway saw some Microsoft slide that compared Xenos to the new GPU, saying that Xenos was 60% efficient and the new GPU would be near 100% efficient...

It's all hype, there's obviously been a significant amount of improvement between a 2005 GPU and a 2012 GPU in terms of advancement in efficiencies, so I think he's completely taking it out of context and has no basis for claiming it's much more efficient than PS4's GPU.

In fact, I doubt there's anything substantially different at all between the two basic architectures other than the quantity of CUs.

The ESRAM and other hardware functions are there to patch up a less than ideal DDR3 memory config.

Sorry, there is no magic jizzard sauce.

I don't think MS's intention of better GPU utilization has anything to do with PS4. It's about making a better designed console than the Xbox 360.
 
I don't think MS's intention of better GPU utilization has anything to do with PS4. It's about making a better designed console than the Xbox 360.

Oh, I don't disagree.

All I'm saying is -- thuway took a comparison of Durango's GPU compared to Xenos and somehow tried to make a comparison of Durango vs. Orbis on the basis of efficiencies. Sorry, it doesn't work like that.
 
Apparently thuway saw some Microsoft slide that compared Xenos to the new GPU, saying that Xenos was 60% efficient and the new GPU would be near 100% efficient...

It's all hype, there's obviously been a significant amount of improvement between a 2005 GPU and a 2012 GPU in terms of advancement in efficiencies, so I think he's completely taking it out of context and has no basis for claiming it's much more efficient than PS4's GPU.

In fact, I doubt there's anything substantially different at all between the two basic architectures other than the quantity of CUs.

The ESRAM and other hardware functions are there to patch up a less than ideal DDR3 memory config.

Sorry, there is no magic jizzard sauce.

I doubt they'd have spent all that money and half the GPU budget on a patch... The whole system was probably designed that way, to achieve a certain performance target within cost/power restrictions...
 
Right:/




The bunch of people in this case is Microsoft. This info isn't coming from forum warriors.

I have my doubts, but this is what they're trying to sell to developers and investors.

I haven't seen a single hardware engineer from MS actively involved in and working on Durango's GPU suggest that it will be 100% efficient (and what exactly is it efficient in? color reproduction? how fast it can add 3+3?), and I doubt I will see an actual legit MS engineer state anything about the GPU, because they'd be under lock and key. An AMD engineer won't say that, because AMD is already working on GCN 2.0 and beyond. If this design were 100% efficient (again, efficient in what?), then AMD wouldn't be working on successors. This ranks up there with secret sauce as "bullshit that gets the average forum goer hyped." I mean no offense to Thuway, but he isn't someone I would go to for in-depth technical discussion.

This is exactly the same as the whole unified shader business back on 360 "Omg 100% shader efficiencyz!"
 
I doubt they'd have spent all that money and half the GPU budget on a patch... The whole system was probably designed that way, to achieve a certain performance target within cost/power restrictions...

Microsoft wanted 8 GB of RAM and a fairly unified setup, and GDDR5 was looking unlikely to deliver 8 GB cheaply or in substantial quantities.

Their system is based upon these issues, analyzing what was known two years ago.

I think Sony just lucked out on the ability of 8GB of GDDR5 to be feasible in 2013 through various vendors, although it certainly comes at a higher cost I'm sure (though ESRAM doesn't come cheap either).
 

Reiko

Banned
I haven't seen a single hardware engineer from MS actively involved in and working on Durango's GPU suggest that it will be 100% efficient (and what exactly is it efficient in? color reproduction? how fast it can add 3+3?), and I doubt I will see an actual legit MS engineer state anything about the GPU, because they'd be under lock and key. An AMD engineer won't say that, because AMD is already working on GCN 2.0 and beyond. If this design were 100% efficient (again, efficient in what?), then AMD wouldn't be working on successors. This ranks up there with secret sauce as "bullshit that gets the average forum goer hyped."

This is exactly the same as the whole unified shader business back on 360 "Omg 100% shader efficiencyz!"

I understand how you feel. But me, thuway, and KidBeta are really being serious with what we are telling you. That's what MS is saying, not us.

Could be hyperbole, but why lie to investors? It's counter-productive.
 

Pistolero

Member
I understand how you feel. But me, thuway, and KidBeta are really being serious with what we are telling you. That's what MS is saying, not us.

Could be hyperbole, but why lie to investors? It's counter-productive.

What were they comparing their GPU to?
 
I understand how you feel. But me, thuway, and KidBeta are really being serious with what we are telling you. That's what MS is saying, not us.

Could be hyperbole, but why lie to investors? It's counter-productive.

It's not what you're looking at that we doubt, it's the ill-informed manner in which you attempt to extrapolate those figures into a comparison with Orbis. Everyone is aiming for 100% efficiency. That means nothing. And it means less than nothing when you try to use it as ammunition against an Orbis design we have every reason to believe is in many ways MORE efficient than Durango.
 

mavs

Member
I understand how you feel. But me, thuway, and KidBeta are really being serious with what we are telling you. That's what MS is saying, not us.

Could be hyperbole, but why lie to investors? It's counter-productive.

This was directed at investors? Question, if investors were shown anything other than a PR slide would they know what they were looking at? It sounds like there is an actual explanation that got lost in translation. I could take a guess at what it was, but if I'm right it doesn't really matter anyway.
 

Reiko

Banned
It's not what you're looking at that we doubt, it's the ill-informed manner in which you attempt to extrapolate those figures into a comparison with Orbis.

I'm not comparing it to Orbis.

360 only. How that measures to PS4 specs is anybody's guess.

This was directed at investors? Question, if they gave investors anything other than a PR slide would they know what they were looking at? It sounds like there is an actual explanation that got lost in translation. I could take a guess at what it was, but if I'm right it doesn't really matter anyway.

Correct. There are bullet points in the slide targeting how it's better than the Xbox 360, I guess that would be where investors will gain interest. But a slide is only part of the presentation, and you're right that it could lead to nothing without context.
 
Xenos, for the 5th goddamn time in this thread.

The 100% efficiency has to do with 1-lane SIMD (GCN) versus 5-lane SIMD (Xenos).

Which basically means that it's just comparing a normal modern-day AMD GPU with one from 2005.

There's nothing special Durango has for its GPU architecture over PS4

Microsoft slides taken completely out of context
 

Reiko

Banned
Which basically means that it's just comparing a normal modern-day AMD GPU with one from 2005.

There's nothing special Durango has for its GPU architecture over PS4

Microsoft slides taken completely out of context

Why would MS make a slide comparing to PS4 when we know good and well that its parts are not final?

That would have been completely stupid on MS's part. lol

No, you're just mounting a dogged defense of an out of context Durango figure in a thread about the PS4 GPU for no apparent reason.

My opinion just changed when seeing the truth for myself. I'm giving the edge to Sony.
 

Striek

Member
Well, a few days ago I did get an email from a Gaffer who has posted a lot in these topics.

It's a PDF. I don't know how much of it is new or old. (It's a little bit different from the VGleaks document)
That is kind of weird.

Allegedly an insider on GAF has MS document(s), and decided to share them with one of the most staunch Durango fans, knowing he would evangelise further with said information? There's nothing wrong per se, I guess, but it feels like it straddles a line.
 
Xenos, for the 5th goddamn time in this thread.

The 100% efficiency has to do with 1-lane SIMD (GCN) versus 5-lane SIMD (Xenos).

Which makes much more sense than

"herp derp, Durango GPU is 100% efficient!"

It's amazing how a 100% increase in efficiency over <insert 2005 object> turns into "This GPU will be 100% efficient!"

This is no different than Nvidia claiming their new GPU architecture is a X% increase in efficiency over old architecture.
 