
VGLeaks Durango specs: x64 8-core CPU @1.6GHz, 8GB DDR3 + 32MB ESRAM, 50GB 6x BD...

I'm gonna guess and say you're wrong. UE3 was designed with PC in mind. Epic's engine is the most popular one for big budget multiplatform games. Why would they design another engine with just PC in mind?

Why are you still talking about the gaming industry as if we're still in 2005? Many things have changed since then, Epic could easily come up with a f2p Unreal Tournament 2014 that blows everything else out of the water graphically, consoles are not the only way to make money anymore.
 

i-Lo

Member
Because we were all talking about faster memory (such as GDDR5).
Consoles usually had faster memory than what we were all using as general purpose memory in our PCs.

Up until a few months back, Sony still had 2GB GDDR5.

Yes yes, provisionally maintaining that if densities increase or whatever they'll do more.
There was a lot of internal outcry at that I'm sure, but if you look at the motherboard complexity that 4GB of GDDR5 would introduce, you'll see why that was the case.

That, and the bean counters don't like it much.

As far as MS goes, they're using 8GB of DDR3, which really isn't anything special in and of itself. They have 32MB of ESRAM on the GPU to mitigate the fact that the memory bandwidth of DDR3 is generally a lot lower than GDDR5's.

Hope that helps explain why there used to be a "lol 8gb" train more than a year ago.

Except, and this is a conjecture based off of the report of 4 Gbit GDDR5 chips, the design of the motherboard for 2GB GDDR5 would be identical to 4GB GDDR5.
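Napkin math on that conjecture (the bus width and chip I/O width here are assumptions for illustration, not leaked specs): board complexity tracks chip count, and doubling per-chip density doubles capacity with the same count.

[CODE]
# Sketch: board complexity is driven by chip count, not total capacity.
# Bus width and chip I/O width are assumed values for illustration.
GDDR5_BUS_WIDTH_BITS = 256  # assumed total memory bus width
CHIP_IO_WIDTH_BITS = 32     # each GDDR5 chip exposes a 32-bit interface

chips = GDDR5_BUS_WIDTH_BITS // CHIP_IO_WIDTH_BITS  # = 8 chips either way

def capacity_gb(chip_density_gbit, chip_count=chips):
    """Total capacity in GB for a given per-chip density in Gbit."""
    return chip_density_gbit * chip_count / 8  # 8 bits per byte

print(capacity_gb(2))  # 2.0 GB with 2 Gbit chips, 8 chips
print(capacity_gb(4))  # 4.0 GB with 4 Gbit chips: same 8 chips, same traces
[/CODE]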

Reiko said Gianna was Orbis....
Now Gianna is Durango...


Which console will be the "gianna" of next gen?

Imagine if Gianna was a member of GAF....
 
Because we were all talking about faster memory (such as GDDR5).
Consoles usually had faster memory than what we were all using as general purpose memory in our PCs.

Up until not too long ago, Sony still had 2GB GDDR5 in their target spec sheets.

Yes yes, provisionally maintaining that if densities increase or whatever they'll do more.
There was a lot of internal outcry at that I'm sure, but if you look at the motherboard complexity that 4GB of GDDR5 would introduce, you'll see why that was the case.

That, and the bean counters don't like it much.

As far as MS goes, they're using 8GB of DDR3, which really isn't anything special in and of itself. They have 32MB of ESRAM on the GPU to mitigate the fact that the memory bandwidth of DDR3 is generally a lot lower than GDDR5's.

Hope that helps explain why there used to be a "lol 8gb" train more than a year ago.

Crytek didn't say anything about GDDR-type memory. Everyone was laughing at the prospect of 8GB of DDR3 RAM.

http://www.neogaf.com/forum/showthread.php?t=427598&highlight=
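For scale, the bandwidth gap being argued about works out roughly like this (bus widths and data rates are illustrative assumptions, not confirmed console specs):

[CODE]
# Peak theoretical bandwidth = bus width in bytes * per-pin data rate.
# All figures below are illustrative assumptions, not confirmed specs.
def bandwidth_gb_s(bus_width_bits, data_rate_gt_s):
    """Peak bandwidth in GB/s given bus width (bits) and data rate (GT/s)."""
    return (bus_width_bits / 8) * data_rate_gt_s

print(bandwidth_gb_s(256, 2.133))  # ~68 GB/s: DDR3-2133 on a 256-bit bus
print(bandwidth_gb_s(256, 5.5))    # ~176 GB/s: GDDR5 at 5.5 GT/s, 256-bit bus
# A small, fast on-die pool like the 32MB ESRAM adds a second high-bandwidth
# path, which is how a DDR3 design narrows the gap for hot working sets.
[/CODE]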
 

op_ivy

Fallen Xbot (cannot continue gaining levels in this class)
[Image: durango_arq1.jpg]

doesn't "kinect in" indicate that it IS NOT included with the system and again would be sold separately?
 

Durante

Member
True. I'm just looking at the block diagram and how much silicon Microsoft is dedicating to that and the memory movers (and the audio DSP), all of which is custom to the system and expensive, and wondering why that, instead of more space dedicated to more GPU resources. They're telling developers it brings a number of advantages and frees them up to do "things," which I don't understand in an appreciable way. They spend a lot of time in their documentation talking about them.
Stupid question, but are you sure your block diagram is representative of the relative die size of the individual components?
 

McHuj

Member
True. I'm just looking at the block diagram and how much silicon Microsoft is dedicating to that and the memory movers (and the audio DSP), all of which is custom to the system and expensive, and wondering why that, instead of more space dedicated to more GPU resources. They're telling developers it brings a number of advantages and frees them up to do "things," which I don't understand in an appreciable way. They spend a lot of time in their documentation talking about them.

Wait you know the area numbers for the pieces in the block diagram?
 
Huh? RAM is far from the only important metric; GPU FLOPS and CPU power are equally important, and both are fairly disappointing (I'm including PS4 here as well).


I wonder if Epic knew all along and, despite their best efforts, couldn't get more muscle put into them. Don't give us new hw until it's a massive leap, etc. They seemed to hint 2.5TF would be their ideal. And I remember, about a year ago, them hinting that the UE4 engine gets interesting at 1TF of GPU compute. I think they may have based that comment on their knowledge of the new Xbox specs.
 

xtop

Member
[Image: http://www.vgleaks.com/wp-content/uploads/2013/01/durango_arq1.jpg]

doesn't "kinect in" indicate that it IS NOT included with the system and again would be sold separately?

doesn't it simply indicate it uses a special input and not usb? don't think it really suggests it being included or not included either way.
 

charsace

Member
Why are you still talking about the gaming industry as if we're still in 2005? Many things have changed since then, Epic could easily come up with a f2p Unreal Tournament 2014 that blows everything else out of the water graphically, consoles are not the only way to make money anymore.

So you're saying Epic will throw away safe money and jump into rocky waters? F2P is very unstable. People only look at a few success stories and ignore the ton of F2P games that fail. Epic would be dumb to do this.
 

Margalis

Banned
An audio DSP can't be that expensive, and it's really good at doing what an audio DSP needs to do.

A system needs to process audio and dedicated audio hardware gives you more price/performance than using a generic CPU. I'm sure the same is true of video codec processing and other heavily math-based stream transforms.

Using specialized hardware that is good at specific tasks is exactly the theory behind GPUs as well.
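The napkin math behind that point, assuming a made-up but plausible mixing workload (the voice count and per-sample cost are assumptions, not from any spec):

[CODE]
# Rough cost of mixing game audio in software; all numbers are assumptions.
SAMPLE_RATE_HZ = 48_000  # standard game audio sample rate
VOICES = 512             # assumed concurrent sound sources
OPS_PER_SAMPLE = 10      # assumed per-voice work: gain, pan, filter, sends

ops_per_second = SAMPLE_RATE_HZ * VOICES * OPS_PER_SAMPLE
print(f"{ops_per_second / 1e6:.0f} MFLOPS")  # ~246 MFLOPS, sustained
# Trivial for a fixed-function DSP; a permanent tax if it has to be stolen
# from game code running on a modest 1.6GHz CPU core.
[/CODE]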
 
Yeah, the one thing this thread says is that NeoGAF hates the Wii U. lol

If the next Xbox or PS launch with hardware capabilities barely above 7 year old tech with a bunch of overpriced ports, they'll receive even more hate than the Wii U is getting. I don't think this is a case of people shitting on Nintendo simply because they're Nintendo.
 

op_ivy

Fallen Xbot (cannot continue gaining levels in this class)
Could just be the hardware is separate. You really think you'll mount your XBOX720 above your TV or on top of your media shelves?

heh, good point. i remember reading in the thread early on that people were saying that this story confirms kinect would ship with the system - and i thought i had missed something. nothing to say either way at this point, other than the system is designed for it better this time around.
 

Nirolak

Mrgrgr
Did you not see UE4? These systems will be able to handle GI. These engines aren't being made to do things that can't be implemented on the next-gen systems. Why waste all that money on something you won't be able to use? The tech demos we've seen so far are an indicator of what the next systems will do.
Well... UE4, unfortunately... How much do you like baked lighting? :p

You'll be held to this one, take note.
 

Reiko

Banned
Except, and this is a conjecture based off of the report of 4 Gbit GDDR5 chips, the design of the motherboard for 2GB GDDR5 would be identical to 4GB GDDR5.



Imagine if Gianna was a member of GAF....

She wouldn't last long...:/
 
[Image: durango_arq1.jpg]

doesn't "kinect in" indicate that it IS NOT included with the system and again would be sold separately?

Why would you think that? Even if Kinect comes bundled, I'm sure it wouldn't actually be embedded in the console, nor would it come already attached... It makes more sense if they're just using a dedicated port designed to provide lower latency / higher throughput than even USB 3.0 could provide.
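Back-of-envelope numbers on that (the stream resolutions and rates are assumptions, not known Kinect 2.0 specs):

[CODE]
# Raw bandwidth of a Kinect-class sensor stream; all figures are assumptions.
def stream_mb_s(width, height, bytes_per_px, fps):
    """Uncompressed stream bandwidth in MB/s."""
    return width * height * bytes_per_px * fps / 1e6

color = stream_mb_s(1920, 1080, 3, 30)  # ~187 MB/s for uncompressed 1080p RGB
depth = stream_mb_s(512, 424, 2, 30)    # ~13 MB/s for a depth map
print(color + depth)                    # ~200 MB/s, i.e. roughly 1.6 Gbit/s

# USB 3.0's theoretical 5 Gbit/s could carry that, so a dedicated port would
# be more about guaranteed latency and isochronous delivery than raw rate.
[/CODE]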
 

StevieP

Banned
True. I'm just looking at the block diagram and how much silicon Microsoft is dedicating to that and the memory movers (and the audio DSP), all of which is custom to the system and expensive, and wondering why that, instead of more space dedicated to more GPU resources. They're telling developers it brings a number of advantages and frees them up to do "things," which I don't understand in an appreciable way. They spend a lot of time in their documentation talking about them.

All of the consoles will have some type of dedicated silicon designed to take workload off other things. Hell, the Wii U has this type of thing as well.

Except, and this is a conjecture based off of the report of 4 Gbit GDDR5 chips, the design of the motherboard for 2GB GDDR5 would be identical to 4GB GDDR5.

Not at the densities people were talking about in late 2011/early 2012. 4GB would certainly have made the board more complex back then.

Crytek didn't say anything about GDDR-type memory. Everyone was laughing at the prospect of 8GB of DDR3 RAM.

http://www.neogaf.com/forum/showthread.php?t=427598&highlight=

Your OP was wrong. MS uses GDDR3 in the Xbox 360.

Uh...not really? Did the Wii U kill your family?

No, but he's always hated it lol.
The Wii U has a CPU/GPU design that has a lot more in common with Durango than with the 360, though, with overall power that's only a bit above previous-generation consoles when you take everything into account. As I've always said, physics never lie.
 
heh, good point. i remember reading in the thread early on that people were saying that this story confirms kinect would ship with the system - and i thought i had missed something. nothing to say either way at this point, other than the system is designed for it better this time around.


I think they'll ship with Kinect 2.0. I suspect GAF way overestimates Kinect's production costs. Kinect is a central strategy for MS.
 
[Image: durango_arq1.jpg]

doesn't "kinect in" indicate that it IS NOT included with the system and again would be sold separately?

I think it actually means that it's not built into the system, and is separate. Probably bundled instead of being something that comes with every SKU.

Which would eliminate some of the "cheap cuz of kinect built in" theories.
 

pixlexic

Banned
If the next Xbox or PS launch with hardware capabilities barely above 7 year old tech with a bunch of overpriced ports, they'll receive even more hate than the Wii U is getting. I don't think this is a case of people shitting on Nintendo simply because they're Nintendo.

Nah, the Wii U was belittled because everyone looked at its hardware like it was the same as the 360's/PS3's. Remember the "OMG 1.2GHz CPU!"
 

xenist

Member
Let's not forget that efficiency, like every other performance metric save actual games running alongside other versions, is not in itself a predictor of real world performance. It's even more vague than frequency.

For a theoretical 1 TFLOPS chip to be faster than a 2 TFLOPS one, the former must operate at 100% efficiency and the latter at below 50%. And as someone who knows a bit about how these things work, I can tell you 100% efficiency can never be achieved. Ever. In the end everything is math. Whoever does the math quickest wins. Smarter math can help, but if the raw performance differential is too big it may not be enough. This is super simplified of course. More of a theoretical exercise than a technical analysis.
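The arithmetic behind that, as a minimal sketch (the efficiency figures are made up for illustration):

[CODE]
# Effective throughput = peak FLOPS * achieved efficiency.
# Efficiency values below are illustrative; no chip ever reaches 100%.
def effective_tflops(peak_tflops, efficiency):
    assert 0 < efficiency < 1, "100% efficiency is never achieved"
    return peak_tflops * efficiency

print(effective_tflops(1.0, 0.80))  # 0.8 TFLOPS: well-utilized smaller chip
print(effective_tflops(2.0, 0.50))  # 1.0 TFLOPS: poorly-utilized bigger chip
# Even granting the small chip a generous 80% vs a weak 50%, the 2 TFLOPS
# part still does more math per second; whoever does the math quickest wins.
[/CODE]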
 