
Rumor: PS4 GPU based on AMD's GCN 2.0 architecture?

If this is true, Sony made huge last-minute modifications.
I'd been led to believe that such a thing wasn't possible?
So there were some changes made? Everyone keeps saying things are set in stone, that only memory is easy to change (when it was changed), that there's no way MS can change anything, that things are finalized, etc.

Because no previous rumour indicated GCN 2.0? Every rumour was about a 7xxx series derivative.
Pretty much.
 

artist

Banned
So there were some changes made? Everyone keeps saying things are set in stone, that only memory is easy to change (when it was changed), that there's no way MS can change anything, that things are finalized, etc.
See;
Gemüsepizza;48255046 said:
What? Where do you get the impression that this was a last minute change? Do you know how long GCN2 has been in development? Afaik it is even already finished. This "last-minute-change" seems like some conspiracy stuff. It is much more plausible that AMD shared their developments on GCN2 from the beginning with both of their very important partners, Sony and Microsoft, and that this technology influenced development of the PS4/Xbox3 from the beginning.
 
Gemüsepizza;48255046 said:
What? Where do you get the impression that this was a last minute change? Do you know how long GCN2 has been in development? Afaik it is even already finished. This "last-minute-change" seems like some conspiracy stuff. It is much more plausible that AMD shared their developments on GCN2 from the beginning with both of their very important partners, Sony and Microsoft, and that this technology influenced development of the PS4/Xbox3 from the beginning.

Because no previous rumour indicated GCN 2.0? Every rumour was about a 7xxx series derivative.
 

aeolist

Banned
Their 8000 series are going to be GCN based and some of them are going to be rebadged 7000 chips

Besides which, doubling the ACEs without increasing the core count doesn't make much sense, considering that GCN was already a compute monster
 
Wouldn't any advantage the new Xbox CPU may have be partially nullified by using some of the extra CUs in the PS4's GPU? If a multiplatform game is more CPU intensive, then surely the devs could use some of the CUs in the GPU to offset the load on the CPU?
 

onQ123

Member
So 8 Gigs of GDDR5 + yet unreleased graphics processor = $???

Now I'm starting to wonder how this won't be $599 USD

Because 1.8 TFLOP GPUs are not the top-of-the-line model of AMD GPUs. Even if it's a GPU that's coming out in 2015, it's still going to be on the cheaper side, and maybe even cheaper than the 2012 model of a 1.8 TFLOP AMD GPU.
 
So GAF, and others have told me that the GPU in the PS4 is similar to a 7850.

Then I see:

I'm hearing that the PS4's GCN has 8 ACEs, each capable of running 8 CLs.

7970/7870 have two ACEs.

Again, the acronyms are quite foreign to me.

Is it correct to say that the performance of the PS4 GPU should be greater than that of the 7970? I can't put it all together, I don't have the knowledge.
 

Nachtmaer

Member
Because no previous rumour indicated GCN 2.0? Every rumour was about a 7xxx series derivative.

Who knows, maybe the rumours that indicated both GPUs would be based on the 7000 series came from the devkits, which just simulated the final silicon's performance. I'm not denying that AMD could have started from these IPs and worked them into the PS4/720 SoCs, but I also wouldn't be surprised if they threw in some future improvements they were sitting on either.

Perhaps we will know more after GDC, since that's supposedly when AMD will be talking about those chips.
 

gofreak

GAF's Bob Woodward
Their 8000 series are going to be GCN based and some of them are going to be rebadged 7000 chips

Besides which, doubling the ACEs without increasing the core count doesn't make much sense, considering that GCN was already a compute monster

It's more like 16x on the ACE side... :)

But it's not about an increase in core count; it's about those awfully over-used words 'efficiency' and 'utilisation' in mixed-task scenarios.
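To put rough numbers on that, here's a minimal sketch: the rumoured figure is 8 ACEs with 8 compute queues each; the 7970's per-ACE queue count below is my assumption, picked so the ratio lines up with the "16x" above. More queues don't add FLOPS, they just give the hardware more independent streams of compute work to fill gaps with, which is the utilisation point.

```python
# Rough comparison of front-end compute queues, using the rumoured PS4 figure
# (8 ACEs x 8 queues each). The 7970 figure (2 ACEs x 2 queues) is an
# assumption for illustration, picked so the ratio matches the "16x" above.
ps4_queues = 8 * 8      # 64 concurrent compute queues (rumoured)
hd7970_queues = 2 * 2   # 4 queues (assumed)

print(ps4_queues, hd7970_queues, ps4_queues // hd7970_queues)  # 64 4 16
```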
 
Maybe, but given the number of posts flat out stating such a thing wasn't possible, the claim that they made huge last-minute modifications doesn't add up.

Are we to assume it's possible for Sony to make last-minute modifications but no one else? Or are we to assume that this info simply wasn't divulged earlier?

Sony didn't make a last-minute change, really. There was a story that they sent out dev kits with 16GB RAM in October/November, alluding to a bump to 8GB in the system proper. Then EDGE said Sony were aiming for 8GB (again) just last month, but GAF's experts decided not to believe it. Sony made the change late, but it wasn't 2 weeks ago; it was likely kept as a possibility months ago.

Now that's not to say we know everything about Durango, but one would assume any plans they had (including the maximum amount of power they were willing to have in their system) will have been put in place months ago.
 

artist

Banned
Their 8000 series are going to be GCN based and some of them are going to be rebadged 7000 chips
That has nothing to do with this.

Besides which, doubling the ACEs without increasing the core count doesn't make much sense, considering that GCN was already a compute monster
While this is true to a certain extent, the 7970 still falls behind the GTX580 in some cases;

http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/25

While utilization may not be the only reason for the above, it points to some areas for improvement.
 

iceatcs

Junior Member
So 8 Gigs of GDDR5 + yet unreleased graphics processor = $???

Now I'm starting to wonder how this won't be $599 USD

I think mid-range and high-end PC GPU cards are quite expensive because they don't produce upwards of 10 million of the same card per year.
 

Globox_82

Banned
Sony clearly states under that image that "specs are subject to change". It's a game of bluffs. The RAM thing proved it once again, no matter how hard GAF members try to sell us the nonsense that "it's set in stone". I am not saying they completely change everything, but they can always (9 months away from launch) make changes. Sony showed it with the RAM; not even first-party devs seemed to have a clue about it.
I expect MS and Sony to still make some changes, but once E3 comes all cards will be on the table; THEN it is going to be too late (if they want a 2013 release).
 

artist

Banned
It does because it means they don't have a new architecture on deck
You don't know what you're talking about ..

Anandtech said:
As we discussed yesterday with AMD’s latest round of GPU rebadges, both AMD and NVIDIA are locked into playing the OEM rebadge game in order to fulfill their OEM partner’s calendar driven schedules. OEMs want to do yearly updates (regardless of where the technical product cycle really is), so when the calendar doesn’t line up with the technology this is achieved through rebadges of existing products. In turn these OEMs put pressure on component suppliers to rebadge too, so that when consumers compare the specs of this year’s “new” model to last year’s model the former look newer. The end result is that both AMD and NVIDIA need to play this game or find themselves locked out of the OEM market.

Because the rebadge cycle is OEM driven, rebadging is typically focused exclusively on OEM parts, and this year is no exception.
http://www.anandtech.com/show/6579/...730m-and-geforce-710m-partial-specs-published
http://www.anandtech.com/show/6570/amds-annual-gpu-rebadge-radeon-hd-8000-series-for-oems
 
Wouldn't any advantage the new Xbox CPU may have be partially nullified by using some of the extra CUs in the PS4's GPU? If a multiplatform game is more CPU intensive, then surely the devs could use some of the CUs in the GPU to offset the load on the CPU?

Again, you do realise this Durango CPU 'advantage' is based off of one guy's post on these forums, right?

It's crazy how this one guy's post seems to have spread around the forum and now it's seen as a credible rumour.

Nowhere does it say the CPUs are significantly different. I've searched around for some kind of confirmation, but there isn't any, and it all comes back to that poster on here.

Madness how easily people are influenced in times like these :)
 

Kleegamefan

K. LEE GAIDEN
So
Durango CPU > Orbis CPU
Orbis GPU >> Durango GPU

something like that?

Here is the comparison as I understand it:



XB3/Durango


Customized AMD 8 core Jaguar CPU
Around 200 gigaflops of computing power

Customized AMD Graphics Core Next 2.0 GPU
12 Compute Units (CUs)
16 Raster Operations Processors (ROPs)
48 Texture Mapping Units (TMUs)

Around 1200 gigaflops of computing power (1.2TF)

0.2 TF CPU power + 1.2 TF GPU power = 1.4 teraflops performance (plus whatever custom chips and DSPs will provide)



PS4/Orbis


(slightly less) Customized AMD 8 core Jaguar CPU
Around 100 gigaflops of computing power


Customized AMD Graphics Core Next 2.0 GPU
18 Compute Units (CUs)
32 Raster Operations Processors (ROPs)
72 Texture Mapping Units (TMUs )

Around 1840 gigaflops of computing power (1.84TF)

0.1 TF CPU power + 1.84 TF GPU power = 1.94 teraflops performance (plus whatever custom chips and DSPs will provide)
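For what it's worth, those GPU teraflop figures fall straight out of the usual GCN arithmetic (64 ALUs per CU, 2 FLOPs per ALU per clock via FMA). A minimal sketch, assuming an 800 MHz clock, which is the number that makes the rumoured figures work out:

```python
# Theoretical peak single-precision throughput for a GCN GPU:
#   GFLOPS = CUs * 64 ALUs * 2 FLOPs per clock (FMA) * clock in GHz
# The 0.8 GHz clock is an assumption chosen so the rumoured figures work out.
def gcn_gflops(cus, clock_ghz=0.8):
    return cus * 64 * 2 * clock_ghz

print(gcn_gflops(12))  # 1228.8 -> ~1.2 TF (Durango GPU, as rumoured above)
print(gcn_gflops(18))  # 1843.2 -> ~1.84 TF (Orbis GPU, as rumoured above)
```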
 

Globox_82

Banned
Again, you do realise this Durango CPU 'advantage' is based off of one guy's post on these forums, right?

It's crazy how this one guy's post seems to have spread around the forum and now it's seen as a credible rumour.

Nowhere does it say the CPUs are significantly different. I've searched around for some kind of confirmation, but there isn't any, and it all comes back to that poster on here.

Madness how easily people are influenced in times like these :)

You are new to GAF. Don't worry, you will learn. Just let it go for now; it's good for your health.
 

NeOak

Member
You are new to GAF. Don't worry, you will learn. Just let it go for now; it's good for your health.

Is that how it works here?

You get posters appearing out of nowhere posting inside information and then threads spring up about this information?

Crazy.

(I'm joking, I've been a long-time lurker here)
 

aeolist

Banned
http://www.anandtech.com/show/6520/amd-announces-their-first-8000m-gpus

Their mobile GPU update is partly a whole new set of chips with the same GCN architecture

I really doubt they're significantly changing things just for one customer unless Sony is paying out the nose
 

artist

Banned
Again, you do realise this Durango CPU 'advantage' is based off of one guy's post on these forums, right?

It's crazy how this one guy's post seems to have spread around the forum and now it's seen as a credible rumour.

Nowhere does it say the CPUs are significantly different. I've searched around for some kind of confirmation, but there isn't any, and it all comes back to that poster on here.

Madness how easily people are influenced in times like these :)
It's not just limited to that ..

 

artist

Banned
http://www.anandtech.com/show/6520/amd-announces-their-first-8000m-gpus

Their mobile GPU update is partly a whole new set of chips with the same GCN architecture

I really doubt they're significantly changing things just for one customer unless Sony is paying out the nose
You didn't even read it .. :/

The rebrand of 7xxx to 8xxx is to satisfy their OEMs. Nvidia has done the same thing.

Besides that, both AMD and Nvidia have already sunk quite a bit of their R&D budgets into future architectures. If either of them doesn't have a new architecture, then they're .... already dead.
 

aeolist

Banned
You didn't even read it .. :/

The rebrand of 7xxx to 8xxx is to satisfy their OEMs. Nvidia has done the same thing.

Besides that, both AMD and Nvidia have already sunk quite a bit of their R&D budgets into future architectures. If either of them doesn't have a new architecture, then they're .... already dead.
Well sure they both have new architectures

But they won't be ready until Q3 next year
 

televator

Member
Because 1.8 TFLOP GPUs are not the top-of-the-line model of AMD GPUs. Even if it's a GPU that's coming out in 2015, it's still going to be on the cheaper side, and maybe even cheaper than the 2012 model of a 1.8 TFLOP AMD GPU.

I guess I'm just going by how the PS3 released with what I recall being an already outdated GPU at the time... Then again, Sony hasn't invested in anything crazy like Cell + Blu-ray this time around.
 

Reiko

Banned
It's not just limited to that ..

That's all speculation on my part.

Nothing to do with what I read. And the reality is that it's probably not happening unless MS is doing something that devs with specs right now don't even know about.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
I kinda hope the Durango CPU rumor is true so Reiko can feed a few people some crow :).

He has been repeating, cherry-picking, and wishing for so much that he could get lucky and get a rumor correct. Even a broken clock is correct twice a day.
 

Globox_82

Banned
In the official PR released last week Sony says 1.84 TFlops

http://www.scei.co.jp/corporate/release/pdf/130221a_e.pdf

read what it says below

[image: screenshot of the PS4 spec sheet]

Since you won't let me help you: "SPECIFICATIONS ARE SUBJECT TO CHANGE WITHOUT NOTICE"

Official PR means nothing; there will be more changes. Nothing massive, but there will be some.
 
Yep, there will be more changes. Even Yoshida said the PS4 isn't ready yet, that's why we haven't seen it, and it's still being finalized in terms of specs and functionality.
 
Yep, there will be more changes. Even Yoshida said the PS4 isn't ready yet, that's why we haven't seen it, and it's still being finalized in terms of specs and functionality.

Not in the next Xbox though; those specifications are locked and won't see any major changes.

I find it staggering that people are happily talking about the PS4 possibly undergoing changes, but refuse to even consider the mere thought of the next Xbox undergoing some changes as well, because it's impossible, etc.
 
Not in the next Xbox though; those specifications are locked and won't see any major changes.

I find it staggering that people are happily talking about the PS4 possibly undergoing changes, but refuse to even consider the mere thought of the next Xbox undergoing some changes as well, because it's impossible, etc.

I don't get it. You're being sarcastic?
 

joeblow

Member
One interesting fact about the official spec sheet is that it does not list the frequency of the individual cores. Tech-wise, it is possible for them to bump up the rumored 1.6 GHz CPU core speed before production without much hassle if they chose to, right?

I'm not saying they should, since they do have 8 cores, processing help from the GPU, and no doubt want to keep heat down and yields up, but isn't it possible at this point?
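For context, a clock bump would scale the rumoured ~100 gigaflop CPU figure from earlier in the thread linearly. A minimal sketch, assuming 8 single-precision FLOPs per Jaguar core per cycle (my assumption, not a confirmed spec):

```python
# Rough peak single-precision throughput for an 8-core Jaguar CPU.
# The 8 FLOPs per core per cycle figure is an assumption for illustration.
def jaguar_gflops(cores=8, clock_ghz=1.6, flops_per_cycle=8):
    return cores * clock_ghz * flops_per_cycle

print(jaguar_gflops())               # 102.4 GFLOPS at the rumoured 1.6 GHz
print(jaguar_gflops(clock_ghz=2.0))  # 128.0 GFLOPS with a hypothetical bump
```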
 