
VGLeaks Durango specs: x64 8-core CPU @1.6GHz, 8GB DDR3 + 32MB ESRAM, 50GB 6x BD...

Bsigg12

Member
Do we have any idea how current these specs are? Are these week-old, month-old, or year-plus-old specifications for the proposed Durango chipset?

Also, this strongly supports the idea of Kinect 2.0 shipping with the system. Microsoft is making it an integral part of the console, with dedicated hardware for it. It will continue to be a camera you set up somewhere in your play space. Just because it says "Kinect In" doesn't mean it's a peripheral per se; it just shows that the console won't have the Kinect hardware (i.e. cameras and audio equipment) baked in, exactly like the current Kinect.
 

StevieP

Banned
So you're saying Epic will throw away safe money and jump into rocky waters? F2P is very unstable. People only look at a few success stories and ignore the ton of F2P games that fail. Epic would be so dumb to do this.

Well doesn't it suck then that Crytek, Epic, and many others are moving toward that model very heavily?

Horse Armor said:
Finally, StevieP is being called out! He was obnoxious enough around the Wii U with his fake sources and promises.

Fake sources and promises? Well, you could look through my post history. I've always maintained that I am not a game developer and am just relaying things.

specialguy said:
I just think it was a truly pathetic effort on Nintendo's part. Worthy of scorn really.

Even as an olive branch I try to say, "Nintendo fans deserved better". You guys defend that company and hope it gives you something good, and they gave you, that...

They gave me a way to play Nintendo games. The same way my PlayStation hardware allows me to play Sony games. And my PC hardware allows me to play all multiplatform titles. There's nothing to defend. You play software, not hardware.
 

sangreal

Member
I think it actually means that it's not built into the system, and is separate. Probably bundled instead of being something that comes with every SKU.

Which would eliminate some of the "cheap cuz of kinect built in" theories.

"High-fidelity Natural User Interface (NUI) sensor is always present" would indicate that it is bundled
 
No, but he's always hated it lol.


I just think it was a truly pathetic hardware effort on Nintendo's part. Worthy of scorn really.

Even as an olive branch I try to say, "Nintendo fans deserved better". You guys defend that company and hope it gives you something good, and they gave you, that...
 

onQ123

Member
[Image: durango_arq1.jpg]


doesnt "kinect in" indicate that it IS NOT included with the system and again would be sold separately?

No, it indicates that it will have a port specially made for it.
 

op_ivy

Fallen Xbot (cannot continue gaining levels in this class)
Why would you think that? Even if Kinect comes bundled, I'm sure it wouldn't actually be embedded in the console, nor would it already be attached... It makes more sense if they are just using a dedicated port designed to provide lower latency/higher throughput than even USB 3.0 could provide.

Mainly because of this, from the first page, which just hit me and made me look critically at the info provided and feel I must have missed something. Also, a brain fart regarding how Kinect would work if actually "built in".

So Kinect in every box?

Edit: ^^

That's what the reserved RAM will be taken up by. Kinect will be constantly on, capturing high-quality 3D images that have to be processed.

Gemüsepizza said:
So this probably means Kinect will be included too.



For data in the ESRAM, the Xbox seems capable of achieving 102 GB/s, and simultaneously 68 GB/s from DDR3. But those 102 GB/s are IMO still only for the 32 MB, and the 8 GB of RAM only has 68 GB/s of bandwidth. IMO this isn't nearly as good as the rumored PS4 solution.
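Rough back-of-envelope in Python for where numbers like those could come from; the bus widths and clocks below are just assumptions pulled from the rumour mill, nothing the leak itself confirms:

# Peak bandwidth = bus width in bytes * transfer rate; all inputs here are assumed values

def bandwidth_gb_s(bus_width_bits, transfers_per_s):
    # peak theoretical bandwidth in GB/s for a simple bus
    return (bus_width_bits / 8) * transfers_per_s / 1e9

ddr3 = bandwidth_gb_s(256, 2.133e9)    # assumed 256-bit DDR3 at 2133 MT/s -> ~68.3 GB/s
esram = bandwidth_gb_s(1024, 0.8e9)    # assumed 1024-bit ESRAM at 800 MHz -> ~102.4 GB/s
print(ddr3, esram)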
 

charsace

Member
Let's not forget that efficiency, like every other performance metric short of actual games running side by side with their other versions, is not in itself a predictor of real-world performance. It's even more vague than frequency.

For a theoretical 1 TFLOPS chip to be faster than a 2 TFLOPS one, the former must operate at 100% efficiency and the latter at below 50%. And, speaking as someone who knows a bit about how these things work, 100% efficiency can never be achieved. Ever. In the end, everything is math. Whoever does the math quickest wins. Smarter math can help, but if the raw performance differential is too big it may not be enough. This is super simplified, of course. More of a theoretical exercise than a technical analysis.

What if the Xbox GPU runs at around 80% efficiency? That would get the Xbox 3 to around 1 TFLOPS in real-world performance.
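Just to make that arithmetic concrete (the efficiency values here are invented purely for illustration):

# "effective" throughput = peak TFLOPS * efficiency; the efficiency numbers are made up

def effective_tflops(peak_tflops, efficiency):
    return peak_tflops * efficiency

print(effective_tflops(1.2, 0.80))    # hypothetical 1.2 TF chip at 80% -> 0.96 TF
print(effective_tflops(1.8, 0.60))    # hypothetical 1.8 TF chip at 60% -> 1.08 TF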
 

PaulLFC

Member
I think it actually means that it's not built into the system, and is separate. Probably bundled instead of being something that comes with every SKU.

Which would eliminate some of the "cheap cuz of kinect built in" theories.
I don't see why having a port means it won't be included with every console? I presume the alternative to having a port for it is to build it into the system, but that to me is a poor solution as it limits where you can put the console. Having the camera as a separate, connectable component like the sensor bar was for Wii makes much more sense IMO, unless I'm missing a reason why having it built in is a lot better.

But if you can unplug it, it obviously wouldn't be a required accessory.
Why not? It's like saying the PS2/Xbox controller isn't a required accessory because you could unplug it from the console. Sure, there was the odd game that controlled via EyeToy/lightgun, but still.
 

davious88

Banned
Royalties aren't as high as you seem to think. Everyone and their mother is using BD-ROM these days. Microsoft can afford it, especially if they want their console to be the entertainment center.

It's basically BD data vs. BD data + video, hence Nintendo choosing the former.
 

SunSunich

Member
PS4 == Nissan 350Z
Xbox == Toyota Corolla
Wii U == Fiat 500
Going by their potential popularity.
I really hope these specs are someone's joke, and that stock units will be more powerful on the graphics side.
 
I just think it was a truly pathetic hardware effort on Nintendo's part. Worthy of scorn really.

Even as an olive branch I try to say, "Nintendo fans deserved better". You guys defend that company and hope it gives you something good, and they gave you, that...

You are way too emotionally invested in video games bro.
 

MCD

Junior Member
Let it go. Rare, as it was, is gone. I don't think anyone would want the present incarnation of Rare to touch any real games.

They made PDZ, Kameo, two Piñata games and Banjo. They can still make it but SUDDENLY

KINECT

Yeah yeah, Kinect Sports sells and all but I WAS WAITING FOR THIS GEN TO GET OVER JUST FOR RARE TO COME BACK

WILL NEVER LET GO
 

Thraktor

Member
The use of embedded SRAM specifically is probably the most interesting thing to me here. I'm guessing they're actually referring to some form of pseudo-SRAM, such as T-RAM (which AMD have been linked to), as they achieve a much higher density than true SRAM, and the main trade-off is an increase in latency, which is largely trivial for the framebuffer/render target use case of the 32MB of embedded memory in this case.

In any case, I think their approach to memory is a good one, as the bulk of bandwidth requirements are going to be the relatively small framebuffer and so forth, so their approach allows a large total memory pool without either being bandwidth starved or costing an absurd amount of money. In fact, my guess is that if Sony had originally intended for 4GB+ of RAM, they would have gone the eRAM + DDR3 route as well. The 2GB of RAM they initially intended was about the limit of what would make sense with a single GDDR5 pool, and my guess is that by the time they heard about MS's RAM intentions it was too late to make such a big architectural change, and they had no choice but to just hike up to 4GB of GDDR5, despite the fact that 32-64MB of eDRAM at ~500GB/s and 8GB of DDR3 at ~70GB/s would most likely have been both cheaper and offered better performance at the same time.
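For a sense of scale, here's a quick sketch of why 32MB is in the right ballpark for render targets; the resolution and formats are my own assumptions, just for illustration:

# render target size in MiB = width * height * bytes per pixel * samples

def target_mib(width, height, bytes_per_pixel, samples=1):
    return width * height * bytes_per_pixel * samples / (1024 ** 2)

color = target_mib(1920, 1080, 4)      # 32-bit colour buffer: ~7.9 MiB
depth = target_mib(1920, 1080, 4)      # 32-bit depth/stencil: ~7.9 MiB
print(color + depth)                   # ~15.8 MiB, fits comfortably in 32 MB
print(target_mib(1920, 1080, 4, 4))    # 4x MSAA colour alone: ~31.6 MiB, already at the limit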
 

Into

Member


Good read:

The leak also offers confirmation of last week's story that the new PlayStation Orbis graphics core appears - at face value - to be significantly more powerful than the GPU in Durango. Our sources suggest that the new PlayStation offers up 18 Radeon GCN compute units at 800MHz. The leak matches older rumours suggesting that Durango features only 12, running at the same clock speed. Bearing in mind the stated peak performance metrics in the leak, there is a clear deficit between the 1.23 teraflops offered by Durango, and the 1.84TF found in Orbis.

There is still much we do not know and you should not take this as confirmation of anything really.

I do not mind if the 720 is slightly less powerful; it will still be a beastly system and an upgrade over what we have now. And it's good that Sony first-party developers, especially Polyphony Digital, Santa Monica and Naughty Gods, get beefy hardware; you know they will produce top-notch-looking games.
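For what it's worth, the quoted teraflop figures fall straight out of the usual GCN math; the 64 ALUs per compute unit and 2 FLOPs per ALU per clock are standard assumptions, not something the leak spells out:

# peak TFLOPS = CUs * ALUs per CU * FLOPs per ALU per clock * clock in GHz / 1000

def gcn_tflops(compute_units, clock_ghz, alus_per_cu=64, flops_per_alu=2):
    return compute_units * alus_per_cu * flops_per_alu * clock_ghz / 1000

print(gcn_tflops(12, 0.8))    # rumoured Durango: ~1.23 TF
print(gcn_tflops(18, 0.8))    # rumoured Orbis:   ~1.84 TF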
 
The problem I have with this idea is that it seems so... regressive in terms of GPU architecture.

The last decade or so of GPU development has been all about taking dedicated circuitry and replacing it with more flexible, programmable hardware. This seems to be the exact opposite approach.

That's true, but that also comes with cost and power usage that consoles apparently can no longer keep up with...

Having said that, maybe they will still be somewhat flexible, to at least support the different implementations we have today?

Let's not forget that efficiency, like every other performance metric short of actual games running side by side with their other versions, is not in itself a predictor of real-world performance. It's even more vague than frequency.

For a theoretical 1 TFLOPS chip to be faster than a 2 TFLOPS one, the former must operate at 100% efficiency and the latter at below 50%. And, speaking as someone who knows a bit about how these things work, 100% efficiency can never be achieved. Ever. In the end, everything is math. Whoever does the math quickest wins. Smarter math can help, but if the raw performance differential is too big it may not be enough. This is super simplified, of course. More of a theoretical exercise than a technical analysis.

That's assuming flops are the only metric controlling performance, which they are not. If we assume both MS and Sony had the same budget for the GPU (in size, cost, etc.), it means each of them decided to prioritize different aspects of the pipeline that could lead to performance advantages in different scenarios, but may or may not matter much in the general scheme... If you extrapolate the same design philosophies to the entire systems, it's not a much different scenario from the 360 and PS3, where the peculiarities of each architecture allowed it to excel over the other in one area, but since the budgets were the same the end results ended up being similar...
 
That's what I'm talking about.

All of these dudes are coming out and basically saying these companies don't know how to plan, like it's fact.

Forget StevieP for a second.
Do you not remember the UE3 tech demo from 2004? Or the early Gears of War demos?

Self-shadowing never made it into Gears of War or other UE3 games until their UE '3.5', which also contained the baked GI update, was released; neither did the soft shadows shown in the 2004 demo. (Can't remember the displacement mapping shown being half as good in the final games either, but not sure on that one.)

The 2004 tech demo was leagues above what we got in games (shitting a bunch of post-process effects and bloom on top of the games to hide the flaws does not count).

Also, I believe Mass Effect 1 released with self-shadowing on PC (at some crummy resolution) and they had to flat-out remove it in a patch because it was broken and shit, while the UE3 tech demo example looked glorious.
Also, the tech demo had great image quality; not a PEEP about the shitty image quality and lack of proper MSAA support back then.

So yes, these companies don't know how to plan and are prone to showing features that aren't actually ready or can't be done on the hardware we end up with.
 

JaggedSac

Member

and while we haven't confirmed overall bandwidth, the leak's 170GB/s throughput certainly seems plausible

The proof presented by the source suggests that the data is at most nine months old: factoring in how long it takes to create a console, the chances are that not many changes will have been implemented since then.

Several interesting things.
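That 170GB/s looks like it is just the two peak figures added together, which assumes both buses are being saturated at the same time. Quick check, reusing the assumed numbers from the earlier bandwidth sketch:

# sum of the two pools' assumed peak rates
esram_gb_s = 102.4
ddr3_gb_s = 68.3
print(esram_gb_s + ddr3_gb_s)    # ~170.7 GB/s combined, only if both peaks are hit at once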
 
If the PS4 has a Pitcairn-class GPU close to a 7850/70 and the Durango doesn't, the battle for power should already be over. Dat Kinect pack-in.

Anyway, would have been weird to have both next gen consoles sporting the same GPU, so I guess this makes some sense. I wonder how those negotiations at AMD went down...
 

op_ivy

Fallen Xbot (cannot continue gaining levels in this class)
Forget StevieP for a second.
Do you not remember the UE3 tech demo from 2004? Or the early Gears of War demos?

Self-shadowing never made it into Gears of War or other UE3 games until their UE '3.5', which also contained the baked GI update, was released; neither did the soft shadows shown in the 2004 demo. (Can't remember the displacement mapping shown being half as good in the final games either, but not sure on that one.)

The 2004 tech demo was leagues above what we got in games (shitting a bunch of post-process effects and bloom on top of the games to hide the flaws does not count).

Also, I believe Mass Effect 1 released with self-shadowing on PC (at some crummy resolution) and they had to flat-out remove it in a patch because it was broken and shit, while the UE3 tech demo example looked glorious.
Also, the tech demo had great image quality; not a PEEP about the shitty image quality and lack of proper MSAA support back then.

So yes, these companies don't know how to plan and are prone to showing features that aren't actually ready or can't be done on the hardware we end up with.

The Gears of War downgrade got me my tag :)
 

Jadedx

Banned
This thread reminds me of GAF in early 2005-2006. Everyone was convinced that the PS3 was like 2-3x more powerful than the 360 because the PS3 had more raw power, and that there was no way the 360 would be as efficient with its RAM and its shaders as they said it would be.
 

i-Lo

Member
You know, I am surprised at how civil things have been despite the perception that Orbis may have slightly better specs overall. In most other forums, the defence squad would mount up for all-out warfare.

Personally, if there is more than an iota of extra power with PS4, I honestly don't see it being used by third parties. It's all about the first parties then. And when it comes to first parties, I expect 343 to amaze us when they go beyond what can be expected from XB3.
 