
VGLeaks Durango specs: x64 8-core CPU @1.6GHz, 8GB DDR3 + 32MB ESRAM, 50GB 6x BD...

Durante

Member
Imagine if what the data movers did was take output from the GPU and feed it quickly into the CPU, performing some quick transformation along the way that lets the CPU process that data very quickly, and then shoot it back to the GPU, where another data mover performs another transformation so that the GPU can then do its part very quickly.
That sounds rather likely. An evolution of MEMEXPORT.
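
If it is something like that, the win would be less about the copy itself and more about the copies overlapping the compute instead of serialising with it. Here's a minimal sketch of that kind of overlap in plain Python; every name is hypothetical and none of it comes from the leak, it's just the double-buffering pattern a dedicated mover enables:

Code:
# Toy model of a 3-stage pipeline: a move engine streams tile N+1 in from main
# RAM and tile N-1 back out while the GPU works on tile N in the fast scratch
# memory. All names and stages are illustrative, not Durango's actual design.
def simulate_pipeline(num_tiles):
    dma_in = compute = dma_out = None
    for tick in range(num_tiles + 2):  # a 3-stage pipeline drains after num_tiles + 2 ticks
        dma_out, compute = compute, dma_in            # tiles advance one stage per tick
        dma_in = tick if tick < num_tiles else None   # fetch the next tile, if any remain
        yield tick, dma_in, compute, dma_out

for tick, d_in, comp, d_out in simulate_pipeline(5):
    print(f"tick {tick}: stream-in tile {d_in}, GPU on tile {comp}, stream-out tile {d_out}")

The point being that once the mover keeps the scratch fed, the GPU never sits idle waiting on the slow pool.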

The more we learn about Durango the more I like it. It seems to have a lot of really smart silicon. I think the writing's on the wall: it's going to be able to compete head to head with Orbis, between the DMA engines, the ESRAM looking better and better, and the RAM quantity advantage.
To me, the ESRAM really looks worse now than it did before the leak. I expected more bandwidth.
 

gofreak

GAF's Bob Woodward
That's pretty much my line of thought as well.

I guess the issues are that GDDR5 is more expensive, and that MS really wanted 8 GB for their set-top box / OS level ambitions.

Well fack, they must REALLY want that capacity at that cost.

Because from a bandwidth-juggling point of view this seems really, really messy, and it won't even bring them up to where they could have gotten with an alternative setup.

I am actually pretty shocked. I figured they would optimise for the bottom of the pipe again so there would be at least some cases where it could excel. Bandwidth demand on the top end of the pipeline must be way higher than I thought in typical cases.
 

EvB

Member
I think Sony should not include stuff like HDMI in right now, which will only increase costs. Keep the price low and see how the market reacts. If there is demand for this, they can always add that functionality in future units, right?


More likely they will just announce that they have 2x HDMI in, so that all the media outlets publish it and your local game shops have little photocopied spec comparisons taped to the front desk.

Then once they've baited all the people who care about these stats, they can quietly strip them out later, just like with the PS3/Vita.
 

Karak

Member
Nah. It's way out of my comfort zone. I see some of this info and I'm all

mal-what.gif

Mine too. I am just totally confused by them. Obviously MS didn't pay for random shit to be added. But their continual pushing of how much it adds seems odd. Need to hear more devs talk about it to understand I guess.
 
That's pretty much my line of thought as well.

I guess the issues are that GDDR5 is more expensive, and that MS really wanted 8 GB for their set-top box / OS level ambitions.
Once you're stuck with cheaper memory to fulfill that size goal, your only options are embedded RAM or a second separate bus, which is REALLY expensive and complicates development.

I agree with this.

I also think ESRAM makes so much more sense financially this gen than last. Last gen you had to dedicate a lot of silicon to it; this gen you don't. You can get 32 MB for less than half the cost of 10 MB last gen, and then it lets you load up on cheap DDR to boot.

I have a table of GPU BOMs: DDR3 is about 4 dollars a GB, GDDR5 is about 15. So 32 dollars vs. 60 dollars for the Durango vs. Orbis RAM cost. MS got more RAM and they got it for roughly half the price. All they had to do was add a dollop of now-cheap ESRAM.

The more I learn about Durango the more I like it. MS could have come up with a cheap powerhouse.
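
If you want the back-of-the-envelope version of that, here it is; the $/GB figures are the rough ones from my table, not official BOM numbers:

Code:
# Rough RAM cost comparison using the ballpark $/GB figures quoted above.
# These are forum estimates, not official BOM data.
DDR3_PER_GB = 4.0    # dollars per GB
GDDR5_PER_GB = 15.0  # dollars per GB

durango_ram = 8 * DDR3_PER_GB   # 8 GB DDR3  -> $32
orbis_ram = 4 * GDDR5_PER_GB    # 4 GB GDDR5 -> $60

print(f"Durango: ${durango_ram:.0f} for 8 GB (plus the 32 MB ESRAM on-die)")
print(f"Orbis:   ${orbis_ram:.0f} for 4 GB")
print(f"Durango pays {durango_ram / orbis_ram:.0%} of the Orbis RAM cost for twice the capacity")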
 
gofreak said:
But surely it would just be way simpler, way more flexible, and faster too, to use a chunk of GDDR5?

I can only think that maybe they had another case of the 'Epic: we need 512 MB of RAM for Gears' moment, and Crytek (or Epic again) have informed the design to push RAM quantity at the expense of bandwidth.

There was no way we were going to get 8GB of GDDR5, to be honest I'm still quite surprised that we are even getting 4GB GDDR5 in PS4.

Of course, if the OS really is reserving 3 GB in Durango (I have my doubts), then the argument that Epic or Crytek (or internal studios like 343 and Corrine Yu) drove the design holds less weight, IMO. YMMV.
 

StevieP

Banned
Finally, StevieP is being called out! He was obnoxious enough around the Wii U with his fake sources and promises.

09/21/2012:
http://www.neogaf.com/forum/showpost.php?p=42380331&postcount=158
StevieP said:
Lower-power/watt CPUs will be the norm next gen. There will just be a lot more of them on other consoles lol (of a more modern lineage of "netbook-oriented" cores rather than PPC)

07/29/2012:
http://www.neogaf.com/forum/showpost.php?p=40407685&postcount=591
StevieP said:
Laugh all you want. Let me ask you a question that you can google the answer to: How many Intel CPUs is Global Foundries producing?
For reference, btw, I have been saying "AMD all the way" for the PS4 since early 2011. For the Xbox I have been generally less specific until more recently - the reasoning above should spell out why I say recently.

The final retail box will have 8GB of DDRx. Dev kits have 12gb (4 for debug).

Same thread:
StevieP said:
lostinblue - the final console will have 8gb of ram, however it won't be of the faster/higher bandwidth variety most expect in consoles (i.e. the gddrs of the world) but a more consumer-PC type of memory.


Edit: in regards to your "Atom" speculation, while it's true it's not a very beefy core at all, there may be some point of reference there in regards to why that was speculated.

Steamroller was in the original target specs for the PS4. There have been some recent rumours of a "Jaguar" switch, and Jaguar is essentially AMD's answer to Atom (and Bobcat's successor) and very much meant to compete with the likes of Atom/ARM cores/IBM Bluegene type stuff, etc.

Same thread:

StevieP said:
The reason why "2GB of GDDR5 in a unified architecture" with the provision that "we may have 4GB if densities increase in time" was mentioned in target specs is because it's using a faster type of memory that only comes in lower densities, unlike Microsoft's console.

There have been rumours that Sony's internal development teams are pushing Sony to bite the bullet, density increase or not, but those are the rumours. The console is launching late 2013, so whatever is available to them reasonably will be where they go if I were to guess. lostinblue is correct in that adding a lot of complexity to your motherboards impacts future cost reduction greatly and after this generation Sony (and Microsoft, and Nintendo) are very sensitive to that. Die shrinks will be much less of a factor than it was in previous generations as well, so initial price matters.

07/23/2012:
http://www.neogaf.com/forum/showthread.php?p=40160674&highlight=#post40160674
StevieP said:
There are no 7970s that I am aware of that have 18 CUs and put out 1.8tf.

The console will be based on GCN architecture, but you're not getting the equivalent of a Tahiti model.

07/06/12:
http://www.neogaf.com/forum/showthread.php?p=39576346&highlight=#post39576346
StevieP said:
The architecture will be much closer to that of modern PCs than previous consoles were (hence my statement there, making for an easier analogous statement), but the improvements over this generation will be great. The increase in memory alone should produce better results in every respect (not just visually). Higher end PCs will also benefit from a greater baseline with respect to multiplatform titles. There are some folks expecting too much, that's all.

06/27/12
http://www.neogaf.com/forum/showpost.php?p=39304637&postcount=1670
StevieP said:
Either Steamroller or Jaguar, depending on which direction the system is going now.

06/18/12
http://www.neogaf.com/forum/showpost.php?p=39020036&postcount=2957
StevieP said:
Sony is using a high-bandwidth unified memory in their current spec sheet. MS is not.

06/15/12
http://www.neogaf.com/forum/showthread.php?p=38912937&highlight=#post38912937
StevieP said:
I was told some time last year that AMD would take over the console landscape (outside of the Wii U's CPU). I didn't believe this (usually reliable) person until more recently. For what it's worth.

That first pastebin post that we all got a glimpse of on GAF was... well way more accurate than it was given credit for, minus the memory on the MS console (and obviously the processor types were guessed on). I'd say it was based on (accurate) info that was in the documentation of that time.

06/11/2012
http://www.neogaf.com/forum/showpost.php?p=38781672&postcount=240
StevieP in an edited-back-and-forth said:
thuway said:
When a 7970m is 2 teraflops by itself, you honestly believe that in late 2013/ early 2014 these consoles won't cross that barrier? What are you talking about mang....

An x51 by Alienware can easily get 2.56 teraflops in 2012. As far as I am concerned these companies could easily hit 3 teraflops by next year. Early 2014 might net them 3.5.

StevieP said:
As far as teraflops go? No, if I were you I'd temper my expectations in regards to the next gen consoles reaching >2.5TF.

thuway said:
I think you're underestimating next gen performance. However, like I said earlier: when a 7970m nets you 2+ teraflops of performance and you combine that with an APU, you can only imagine the possibilities of a box in early 2014.

StevieP said:
Or, mid/late 2013. Laptop components should also be discarded in the majority of cases due to their binned nature.

And lots of people are asking for that. You've basically begged and pleaded in the last 6 months on multiple occasions for Sony to delay their console to late 2014 so that there could be what constitutes a "full" generational leap on multipliers alone instead of a more modest 1.2-1.8tf 2013 machine (i.e. what the 2 console makers are looking at currently) that's less of a razor blade proposition and good for the company and the majority of the mass market.

You'd basically be dooming Sony to hand a giant chunk of userbase to the competition for your own personal gain. Why would anyone want their "console maker of choice" to make such a boneheaded move?

I wish post history went back to 2011! You're welcome, though, Horse Armour :p

In regards to dev kits often showing more powerful components than their retail consoles, have a look at this post:

http://www.neogaf.com/forum/showthread.php?p=39521583&highlight=#post39521583
 

Jadedx

Banned
So now we're once again heading to Xb3>Orbis.

This is like perennial oscillation. Some people here will outright sublimate during E3.

Which is what proelite, karak, and thuway have been saying for the last week. But all in all a wash.
 

gofreak

GAF's Bob Woodward
That sounds rather likely. An evolution of MEMEXPORT.

To me, the ESRAM really looks worse now than it did before the leak. I expected more bandwidth.

Agreed. Before, wrt bandwidth, I said 'sometimes Durango will be better, sometimes Orbis'.

Durango looks like it's compromised on what it would have excelled at in favour of something more balanced... with its 'something more balanced' being difficult to juggle AND less performant.

Although I guess latency on reads/writes is better from eSRAM. But if we're introducing compression/decompression beyond the inbuilt kinds on the GPU that might be a bit of a tradeoff.

And yes, the gain is capacity at lower cost. But again - fack!
 
Gemüsepizza said:
And CUs in cutting edge GPUs aren't "maximized for graphics"?

Your problem is thinking that there's no room for a new approach, and that there's no way MS and AMD could come up with something based on existing tech that is essentially new.

It's completely wrong to think that.
 

DieH@rd

Banned
I feel like a lot of people see that 1.2 teraflop figure and then immediately underestimate the Durango.

It'll take a 2.5 teraflop or better GPU on the PC to match this highly customized GPU, which has customizations that will probably never be ported to PCs for the reason that PCs don't need them.

With the arrival of the Radeon 8000 series in Q2, that kind of PC GPU power will cost us less than 200€...
 
To me, the ESRAM really looks worse now than it did before the leak. I expected more bandwidth.

I'm going to look genuinely stupid now... but oh well.

Why is bandwidth so important given the relatively small size of the ESRAM? Surely you'd be limited by the size of the memory before you'd be limited by your ability to ship data to/from it.

I can appreciate that bandwidth is massively important when filling gigabytes of RAM, but I'm not seeing why it would be on 32 MB (I appreciate that I'm coming at this from a standpoint of severe ignorance).

I don't go anywhere near hardware when I'm coding. Abstraction for the win!
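
Edit: answering my own question a bit after reading back. The 32 MB doesn't sit still: the render targets living in it get written and re-read many times every frame (blending, overdraw, post-processing passes), so the traffic through the pool per second is hundreds of times its capacity. A rough illustration, with made-up but plausible numbers:

Code:
# Why bandwidth matters even for a tiny 32 MB pool: the same buffer gets
# rewritten and re-read constantly. All the numbers below are illustrative
# guesses, not leaked figures.
BYTES_PER_PIXEL = 4 + 4        # 32-bit colour + 32-bit depth
PIXELS = 1920 * 1080
FPS = 60
TOUCHES_PER_FRAME = 8          # overdraw, blending, post passes (a guess)

footprint_mb = PIXELS * BYTES_PER_PIXEL / 1e6
traffic_gbs = PIXELS * BYTES_PER_PIXEL * TOUCHES_PER_FRAME * FPS / 1e9

print(f"Framebuffer footprint: {footprint_mb:.1f} MB")
print(f"Traffic through it:    {traffic_gbs:.1f} GB/s at {FPS} fps")

And that's a deliberately tame case: a fat deferred G-buffer, MSAA and particle-heavy overdraw multiply it further, which is why the GB/s figure matters more than the 32 MB one.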
 

nib95

Banned
So now we're once again heading to Xb3>Orbis.

This is like perennial oscillation. Some people here will outright sublimate during E3.

I don't think anyone has switched to Durango > Orbis as far as I can tell. A few people said Durango > Orbis early on, but this was with earlier details, and largely from Microsoft employees or those connected lol (sorry guys!). So I think we should just wait a bit before jumping the gun.

As of now, the only question mark over why it isn't simply Orbis > Durango is this Data Move Engine, and other factors not related to face-value specs.
 

Karak

Member
Which is what proelite, karak, and thuway have been saying for the last week. But all in all a wash.

Well, I have been more confused since Friday, as you can see in the other thread. :)
But I wanted to be right and didn't care about being late to the news. I just could not seem to understand the custom bits. I get the efficiencies, but the move engine is an unknown to me. It could be a number of useful things.
 
Which is what proelite, karak, and thuway have been saying for the last week. But all in all a wash.

Ultimately it looks like it isn't going to come down to shades of grey about power, but instead whose console it will be easier to make money on. I guess it's always about that in the end, which would explain the embarrassing lack of Wii U support from third parties.
 
If those things exist, they are more like compensations for the compromises that have been made. (if)

That is consistent with their language in materials aimed at developers, but I really don't know. I do know that HSA is mentioned repeatedly.

My post was designed to determine how much I know or guessed vs. how much you know or guessed. :)

It seems a lot of people now have a certain level of information and many of the really juicy pieces are only being speculated on.
 

Eideka

Banned
It'll take a 2.5 teraflop or better GPU on the PC to match this highly customized GPU, which has customizations that will probably never be ported to PCs for the reason that PCs don't need them.

Good thing current PC hardware is already up to the task!
 

Thraktor

Member
Why would AMD put technology in Sony's stuff that MS paid to be developed? That would 100% have been in a contract.

Why would AMD enter into a contract where they develop a considerably more efficient GPU architecture, but aren't allowed to use it themselves? It's entirely possible that MS and AMD entered into a research agreement that involved joint development of components which are specific to what MS want to do with Durango, but as I say they're going to be bells and whistles rather than a drastic overhaul of the entire GPU architecture.
 
So now we're once again heading to Xb3>Orbis.

This is like perennial oscillation. Some people here will outright sublimate during E3.
No matter what the special hardware does, it won't make up the difference in bandwidth and GPU power. In fact, a lot of this special hardware is there to make sure computational resources aren't taken up by Kinect.

Orbis is simply much better than Durango in specs.
 

Proelite

Member
Once again, there was a big gulf on paper between the alpha kits and the beta kits for Durango. The alpha kits were meant to emulate the final performance of the combined silicon.

The alpha kits were:

8-core, 16-thread Intel CPU, probably running at 1.6 GHz
12 GB of RAM
High-end AMD 7000 series GPU, >2.5 teraflops
 
I am not worried in the least about the supposedly low ESRAM bandwidth.

I'm 100% sure MS didn't do anything stupid. We're not dealing with Nintendo engineering here.

Pretty sure people on B3D think the 102 GB/s is external; the internal figure can/should be much greater. Whatever the explanation, I'm not worried. MS won't design something to be crippled (again: Nintendo + Wii U, cough).
 

Reiko

Banned
No matter what the special hardware does, it won't make up the difference in bandwidth and GPU power. In fact, a lot of this special hardware is there to make sure computational resources aren't taken up by Kinect.

Orbis is simply much better than Durango in specs.

Nope. Not especially for Kinect. Dream a little bigger darling.
 

Jadedx

Banned
Anyone know anything about BC? Is it hardware or software? Either way will we get enhancements? Better AA? Better AF? Frame rate?

I am not worried in the least about the supposedly low ESRAM bandwidth.

I'm 100% sure MS didn't do anything stupid. We're not dealing with Nintendo engineering here.

Pretty sure people on B3D think the 102 GB/s is external; the internal figure can/should be much greater. Whatever the explanation, I'm not worried. MS won't design something to be crippled (again: Nintendo + Wii U, cough).

Would internal bandwidth matter in this case, since they moved the ROPs?
 
Nope. Not especially for Kinect. Dream a little bigger darling.
One of them is specifically called Kinect Multi-Channel Echo Cancellation. The other is hardware codec support, which is typical. The move engines won't make up the difference, since the ESRAM will be used as a framebuffer.
 

Elios83

Member
Read the thread; this is not correct, even though the PS4 will have an advantage.


The Wii U will be slower than both HD systems: probably a bigger difference than Xbox vs. PS2, though probably not as bad as the HD twins vs. the Wii.

It's one thing to have access to a whole 4 GB pool at 192 GB/s; it's another to access a 32 MB 'buffer' at ~100 GB/s and, separately, a slow main pool at 68 GB/s. And that's not even considering that the total still favours the GDDR5 approach, which is surprising in a negative way because it raises the question of why not use GDDR5 in the first place. The answer could just be to be as cheap as possible.
Also, talking about efficiency and other optimizations: there is no magic, and whatever solution is available to one company has been available to, and discussed with, the other as well. The know-how is the same; the choices have been slightly different depending on specific goals.
Microsoft is just using relatively cheap base hardware integrated into a big system on chip, trying to work around the most obvious limits with ad hoc solutions. There won't be miracles, and we have already seen what kind of graphics this class of hardware can produce with games like Star Wars 1313 and Watch Dogs.
These considerations apply to Sony as well; the two are going to be similar, although it seems Sony is going to have more brute force, which is going to help a lot in certain games and especially in the hands of first-party studios.
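
To put rough numbers on the 'juggling' point, treating all the leaked bandwidth figures as provisional: even in the best case where the traffic splits perfectly across the two buses, the split setup tops out below a single fast pool, and it only gets near its peak when the hot data actually fits in 32 MB. A quick sketch:

Code:
# Crude model of the two memory setups as described in this thread.
# Bandwidth figures are the leaked/rumoured ones, so treat them as provisional.
ORBIS_UNIFIED_BW = 192.0   # GB/s, one pool, any data
DURANGO_DDR3_BW = 68.0     # GB/s, 8 GB main pool
DURANGO_ESRAM_BW = 102.0   # GB/s, 32 MB scratch

def durango_peak(frac_esram):
    """Total traffic rate sustainable if a fraction of all reads/writes hits the
    ESRAM and the rest hits DDR3, with both buses busy in parallel (a best case
    that ignores contention, latency and copy overhead)."""
    if frac_esram == 0.0:
        return DURANGO_DDR3_BW
    if frac_esram == 1.0:
        return DURANGO_ESRAM_BW
    return min(DURANGO_ESRAM_BW / frac_esram, DURANGO_DDR3_BW / (1.0 - frac_esram))

for f in (0.0, 0.25, 0.6, 1.0):
    print(f"{f:.0%} of traffic in ESRAM -> up to {durango_peak(f):.0f} GB/s "
          f"(vs {ORBIS_UNIFIED_BW:.0f} GB/s unified)")

Even at the sweet spot it's 170 GB/s versus 192 GB/s, and only for workloads whose hot set actually fits in the 32 MB.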
 

WinFonda

Member
So, the people on GAF who are privy to next-gen specs and insider info: have you heard more about Durango or Orbis? Because it seems like the former, but I'm curious.
 
True of a vocal bloc at NeoGAF. In all honesty, I think Nintendo under-specced the Wii U and should be reasonably criticized for it. Not that a console is all about polyflops and AA shader cores... I think the Wii U's gamepad is a brilliant idea.

Reasonably criticized, I fully agree. Unfortunately, in many cases it reads like: "Oh oh oh, my precious next gen console seems to lack (insert tech talk here), I don't really know what it means but it must be vital, things are heating up, uhm... LOL, look at that crappy Wii U thing, at least we will not end up like this, hahahaha!"

That is tiring, not funny, and is not leading anywhere satisfying. Well, that's not entirely true: at least some funny gifs are produced. Thanks to the people with a good sense of humor out there.
 

ghst

thanks for the laugh
I am not worried in the least about the supposedly low ESRAM bandwidth.

I'm 100% sure MS didn't do anything stupid. We're not dealing with Nintendo engineering here.

Pretty sure people on B3D think the 102 GB/s is external; the internal figure can/should be much greater. Whatever the explanation, I'm not worried. MS won't design something to be crippled (again: Nintendo + Wii U, cough).

I should hope it's external, but even with Z it still speaks of very conservative aspirations for throughput. aegis seems to believe that there is the potential for this thing to shape up against a 3 TFLOP monster, but how could that be possible with such limited bandwidth?

There are either big empty promises being made, or something doesn't add up.
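
One way to frame it is bytes per FLOP rather than raw GB/s. A crude sanity check, using the leaked Durango figures from this thread and a hypothetical 3 TFLOP desktop card with Tahiti-like 264 GB/s of GDDR5 for comparison (all of it back-of-the-envelope):

Code:
# Bytes-per-FLOP as a crude balance check. Durango figures are the leaked ones
# from this thread; the desktop card is an illustrative Tahiti-class comparison.
def bytes_per_flop(bw_gbs, tflops):
    return bw_gbs * 1e9 / (tflops * 1e12)

configs = {
    "Durango, DDR3 only":    (68.0, 1.2),
    "Durango, DDR3 + ESRAM": (68.0 + 102.0, 1.2),
    "~3 TFLOP desktop GPU":  (264.0, 3.0),   # e.g. a 384-bit GDDR5 card
}
for name, (bw, tf) in configs.items():
    print(f"{name:24s} {bytes_per_flop(bw, tf):.3f} bytes per FLOP")

So per FLOP the combined number is actually decent; the catch is that the 102 GB/s half of it only applies to whatever you can keep resident in 32 MB, which is where the 'something doesn't add up' feeling comes from.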
 