
VGLeaks Durango specs: x64 8-core CPU @1.6GHz, 8GB DDR3 + 32MB ESRAM, 50GB 6x BD...

Good point, but I was feeling kind.

It should be:
PC: SSAA
(there is nothing in between equivalent to MSAA)
Orbis: SMAA
Durango: edge detect or no AA
Wii U: SweetFX FXAA.

Since edge detect or no AA is significantly better than having FXAA lay a steaming blur turd on your entire screen. And SweetFX (edit: typo, was FXAA) does the same (steaming turd) for your colors.
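For what it's worth, the blur complaint comes down to how FXAA works: it's a pure post-process that looks for high-contrast luma edges and blends neighbouring pixels across them, so fine texture detail gets softened along with actual geometry edges. A minimal toy sketch of the principle (an illustration only, not the real FXAA 3.11 shader):

```python
import numpy as np

def toy_fxaa(rgb, contrast_threshold=0.0312):
    """rgb: float array of shape (h, w, 3) with values in [0, 1]."""
    # Luma approximation, as used by FXAA-style filters
    luma = rgb @ np.array([0.299, 0.587, 0.114])
    out = rgb.copy()
    h, w = luma.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = luma[y-1:y+2, x-1:x+2]
            # Anything with enough local luma contrast counts as an "edge"...
            if window.max() - window.min() > contrast_threshold:
                # ...and the "fix" is a blend with the neighbourhood, which is
                # exactly why sub-pixel texture detail gets smeared too.
                out[y, x] = rgb[y-1:y+2, x-1:x+2].mean(axis=(0, 1))
    return out
```

High-frequency texture trips the same contrast test as a jaggy, which is why it can read as a full-screen blur.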
 
Does this supposed GPU efficiency come from MS having a better development environment than Sony, or from the Xbox having more efficient hardware? You'd expect them to be based on very similar architectures hardware-wise, so I don't see where MS would get efficiency that Sony can't.

Until we know what these GPUs are, what architecture they're based on, and how heavily they're modified, we won't have an answer to that.

I'm sure both MS and Sony will shoot for efficiency too, but it's hard to rate each attempt right now.
 

Eideka

Banned
It should be:
PC: SSAA
(there is nothing in between equivalent to MSAA)
Orbis: SMAA
Durango: edge detect or no AA
Wii U: SweetFX FXAA.

Since edge detect or no AA is significantly better than having FXAA lay a steaming blur turd on your entire screen. And FXAA does the same (steaming turd) for your colors.

Thanks, I didn't know this. :D
 
Lmao.

I remember when everyone here was laughing at Crytek when they said they wanted 8GB of RAM, and saying we probably wouldn't get more than 2GB for next-gen consoles, and now that we're getting fucking 4-8GB, everyone is moaning.

Dafuq people.
 

i-Lo

Member
Totally. But when you talk about GPU efficiency, you're talking about how easy or hard the hardware makes it to attain efficiency in a variety of scenarios.

Also, almost nothing in the real world "hovers around" 100% efficiency. Not even close.

Thank you.

So, do you buy this whole thing about MS designing their GPU to gain parity with a Pitcairn equivalent? Would it be idiotic of Sony not to maximise efficiency for their GPU as well, given it's a closed-box environment that has to last quite a long while? I mean, why invest more to get less?
 
Don't worry man, Durango will be a cable box/TV tuner through and through, you won't be disappointed.

I imagine that when MS unveil Durango, they'll announce all the partnerships they have with cable providers and television stations.

Yes, keep shitting this in every single thread over and over. You'll last long.
 

MCD

Junior Member
Lmao.

I remember when everyone here was laughing at Crytek when they said they wanted 8GB of RAM, and saying we probably wouldn't get more than 2GB for next-gen consoles, and now that we're getting fucking 4-8GB, everyone is moaning.

Dafuq people.

Now they need 8 gigs of GDDR5 or no sale.
 

Pagusas

Elden Member
All I want to really contribute to this thread is: I'm decently happy with the specs and VERY glad MS didn't go the Wii/Wii U route and get super cheap/Xbox 1.5 on us. At the end of the day we can all wish for more, but we could have gotten a lot worse, too. I think it looks pretty good compared to what we all feared would happen thanks to the Wii's success.

Thank you.

So, do you buy this whole thing about MS designing their GPU to gain parity with a Pitcairn equivalent? Would it be idiotic of Sony not to maximise efficiency for their GPU as well, given it's a closed-box environment that has to last quite a long while? I mean, why invest more to get less?


Seriously, FXAA has come a long way and it's looking better and better. I wish we'd get TXAA though (not happening); AC3 looks sooooo good with it on.
 

aegies

Member
One more question for you, aegies: does the secret sauce add any more FLOPs?

At least part of the custom hardware is designed to take specific GPU-associated tasks off the shoulders of that element, freeing up more raw GPU resources for stuff it's better at. I believe, anyway. I could be wrong. Again, wish I was a software engineer.
 

Durante

Member
So, do you buy this whole thing about MS designing their GPU to gain parity with a Pitcairn equivalent? Would it be idiotic of Sony not to maximise efficiency for their GPU as well, given it's a closed-box environment that has to last quite a long while? I mean, why invest more to get less?
I don't think it's likely that Durango's 2013 AMD GPU FLOPs are somehow 66% more efficient than Orbis' 2013 AMD GPU FLOPs, no.
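For a rough sense of scale, here's the back-of-the-envelope math (using the rumored 12 CU vs. 18 CU @ 800 MHz figures from the leaks; none of this is confirmed):

```python
# Theoretical peak for a GCN-style GPU:
# ALUs x 2 ops per clock (fused multiply-add) x clock speed
def peak_tflops(compute_units, clock_ghz, alus_per_cu=64):
    return compute_units * alus_per_cu * 2 * clock_ghz / 1000.0

durango = peak_tflops(12, 0.8)  # rumored 12 CUs @ 800 MHz -> ~1.23 TFLOPS
orbis   = peak_tflops(18, 0.8)  # rumored 18 CUs @ 800 MHz -> ~1.84 TFLOPS

# Per-FLOP efficiency edge Durango would need just to reach parity
print(f"Durango ~{durango:.2f} TFLOPS, Orbis ~{orbis:.2f} TFLOPS")
print(f"Required efficiency advantage: {orbis / durango - 1:.0%}")
```

With those figures the gap works out to roughly 50% (the 66% presumably comes from a different set of rumored numbers); either way, expecting that kind of per-FLOP advantage between two same-year, same-vendor parts is a stretch.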
 
Eh, this doesn't sound that impressive to me. I'll just get a PS4. Sony never lets me down as a gamer. I never really cared for Halo or Gears of War, so MS doesn't have anything to offer me, especially when they are trying to force Kinect on everyone. I can see where their interests will be next go around.
 

Mik_Pad

Banned
So basically,

Orbis: Uncharted 2
Durango: Uncharted 3
Wii U: Uncharted: Fortune for Fortune

Or

Orbis: God of War 3
Durango: God of War 2
Wii U: God of War Betrayal
 
Lmao.

I remember when everyone here was laughing at Crytek when they said they wanted 8GB of RAM, and saying we probably wouldn't get more than 2GB for next-gen consoles, and now that we're getting fucking 4-8GB, everyone is moaning.

Dafuq people.

The funny thing is everyone told those people it wouldn't happen because console RAM isn't the same as cheap PC RAM. Except that turned out not to be the case....
 

Doffen

Member
Orbis: Ferrari 458
Durango: Porsche 911 Carrera 4S
Wii U: Nissan Versa

(image: Ferrari 458)
 
Lmao.

I remember when everyone here was laughing at Crytek when they said they wanted 8GB of RAM, and saying we probably wouldn't get more than 2GB for next-gen consoles, and now that we're getting fucking 4-8GB, everyone is moaning.

Dafuq people.

Actually this is what people were expecting.

Everyone pooh-poohed the idea of 6-8GB of VRAM (GDDR) because of cost. Apparently, so did the console makers. 2-4GB of VRAM was always the expectation. The 720's hybrid shit is out of left field and we'll have to see what their DDR3 direction means for the games.
 

Durante

Member
At least part of the custom hardware is designed to take specific GPU-associated tasks off the shoulders of that element, freeing up more raw GPU resources for stuff it's better at. I believe, anyway. I could be wrong. Again, wish I was a software engineer.
The problem I have with this idea is that it seems so ... regressive in terms of GPU architecture.

The last decade or so of GPU development has been all about taking dedicated circuitry and replacing it with more flexible, programmable hardware. This seems to be the exact opposite approach.
 

davious88

Banned
Microsoft is truly trying to create the all-in-one media box of the future. It's a big gamble, but they'll make a compelling case IMO.

-Surface/iPad/iPhone/etc. SmartGlass Integration
-Blu-ray Movie playback
-DVR/Cable Integration (seamless) w/ Twitter/Facebook/Skype overlay
-IPTV capability (seamless)
-Video/Music Marketplace Integration
-Kinect integration (voice/gesture control everything + physical universal remote)
-Bing search built-in for internet and content/game searches and launching

 

Ashes

Banned
I don't think it's likely that Durango's 2013 AMD GPU FLOPs are somehow 66% more efficient than Orbis' 2013 AMD GPU FLOPs, no.

The only way they can do so is by having major architectural advances. So unless MS have some 2014 'secret sauce', say from the 9000 series, while Sony are on 7000-series GPUs branded as 8000 parts, they can't bridge the gap, can they?

edit: this thread is so hard to read now. lol.
 

StevieP

Banned
Lmao.

I remember when everyone here was laughing at Crytek when they said they wanted 8GB of RAM, and saying we probably wouldn't get more than 2GB for next-gen consoles, and now that we're getting fucking 4-8GB, everyone is moaning.

Dafuq people.

Because we were all talking about faster memory (such as GDDR5).
Consoles usually had faster memory than what we were all using as general-purpose memory in our PCs.

Up until not too long ago, Sony still had 2GB GDDR5 in their target spec sheets.

Yes yes, provisionally maintaining that they'd do more if densities increased or whatever.
There was a lot of internal outcry at that, I'm sure, but if you look at the motherboard complexity that 4GB of GDDR5 would introduce, you'll see why that was the case.

That, and the bean counters don't like it much.

As far as MS goes, they're using 8GB of DDR3, which really isn't anything special in and of itself. They have 32MB of ESRAM on the GPU to mitigate the fact that DDR3's bandwidth is generally a lot lower than GDDR5's.

Hope that helps explain why there used to be a "lol 8GB" train more than a year ago.
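To put rough numbers on that difference (a sketch using the commonly rumored bus widths and data rates; none of it confirmed):

```python
# Peak bandwidth = bus width in bytes x effective data rate
def bandwidth_gbs(bus_width_bits, data_rate_gtps):
    return bus_width_bits / 8 * data_rate_gtps

ddr3  = bandwidth_gbs(256, 2.133)  # rumored 256-bit DDR3-2133 -> ~68 GB/s
gddr5 = bandwidth_gbs(256, 5.5)    # rumored 256-bit GDDR5 @ 5.5 GT/s -> ~176 GB/s

print(f"DDR3 main memory:  ~{ddr3:.0f} GB/s")
print(f"GDDR5 main memory: ~{gddr5:.0f} GB/s")
# The 32MB of ESRAM exists to make up the difference for bandwidth-hungry
# render targets, much like the 360's EDRAM did.
```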
 

pestul

Member
The Wii U will be underpowered compared to these, but nothing like the orders of magnitude between Wii and PS360. Maybe a 3-4x difference, and a generally similar usable feature set. The memory gap is likely going to be the most glaring.
 
Lmao.

I remember when everyone here was laughing at Crytek when they said they wanted 8GB of RAM, and saying we probably wouldn't get more than 2GB for next-gen consoles, and now that we're getting fucking 4-8GB, everyone is moaning.

Dafuq people.
Huh? RAM is far from the only important metric; GPU FLOPS and CPU power are equally important, and they're both fairly disappointing (I'm including PS4 here as well). Fair enough, if we were just rating the systems on their RAM they would both be impressive, but taken as a whole they fall far short of where they should be, considering the tech PS3 and 360 had.
 

charsace

Member
At least part of the custom hardware is designed to take specific GPU-associated tasks off the shoulders of that element, freeing up more raw GPU resources for stuff it's better at. I believe, anyway. I could be wrong. Again, wish I was a software engineer.

Sounds like the Amiga in design.
 

aegies

Member
The problem I have with this idea is that it seems so ... regressive in terms of GPU architecture.

The last decade or so of GPU development has been all about taking dedicated circuitry and replacing it with more flexible, programmable hardware. This seems to be the exact opposite approach.

True. I'm just looking at the block diagram and how much silicon Microsoft is dedicating to that and the memory movers (and the audio DSP), all of which is custom to the system and expensive, and wondering why they went that route instead of dedicating that space to more GPU resources. They're telling developers it brings a number of advantages and frees them up to do "things," which I don't understand in an appreciable way. They spend a lot of time in their documentation talking about them.
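If the "memory movers" are dedicated DMA-style units, which is what the leak suggests, the presumed win is overlapping bulk copies (e.g. shuffling render targets between DDR3 and ESRAM) with shader work instead of spending GPU time on them. A toy sketch of that idea in spirit (pure speculation about the design intent; obviously not Durango's actual API):

```python
import threading, time

def move_engine(src, dst):
    # Stand-in for a DMA unit copying a buffer between memory pools
    # while the GPU keeps working on something else.
    dst[:] = src

def shader_work():
    # Stand-in for compute that doesn't depend on the copy.
    time.sleep(0.01)

src = bytearray(32 * 1024 * 1024)  # e.g. a render target headed for ESRAM
dst = bytearray(32 * 1024 * 1024)

copier = threading.Thread(target=move_engine, args=(src, dst))
copier.start()   # the copy proceeds alongside the "shader" work
shader_work()
copier.join()    # synchronize before consuming the moved data
```

Durante's counterpoint still stands, though: that's fixed-function silicon spent on a task a more flexible, programmable GPU could also handle.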
 