Wouldn't Orbis be Late Registration, since at least that's somewhat close to College Dropout, while Graduation begins the slide downward?
Good point, but I felt kind.
Does this supposed efficiency of the GPU come from MS having a better development environment than Sony, or from the Xbox having more efficient hardware? You'd expect the two to be based on very similar architectures, so I don't see where MS would get efficiency that Sony can't.
It should be:
PC: SSAA
(there is nothing in between equivalent to MSAA)
Orbis: SMAA
Durango: edge detect or no AA
Wii U: SweetFX FXAA.
Edge detect or even no AA is significantly better than having FXAA lay a steaming blur turd on your entire screen. And FXAA does the same (steaming turd) to your colors.
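For anyone wondering why FXAA keeps getting called a blur filter: it's a pure post-process that only ever sees the final image, so it can't tell an aliased polygon edge from crisp texture detail. Here's a rough Python sketch of the core idea (heavily simplified for illustration, not Nvidia's actual algorithm):

```python
# Minimal sketch of FXAA-style post-process AA (greatly simplified,
# not the real FXAA). It operates on the finished image only, which
# is exactly why it smears texture detail along with geometry edges.

def luma(px):
    # Perceptual luma from an (r, g, b) tuple in the 0..1 range.
    r, g, b = px
    return 0.299 * r + 0.587 * g + 0.114 * b

def fxaa_ish(image):
    """image: 2D list of (r, g, b) tuples. Returns a filtered copy."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lumas = [luma(image[y][x]), luma(image[y - 1][x]),
                     luma(image[y + 1][x]), luma(image[y][x - 1]),
                     luma(image[y][x + 1])]
            contrast = max(lumas) - min(lumas)
            if contrast < 0.1:   # below threshold: leave the pixel alone
                continue
            # "Edge" found: blend the pixel with its neighbours.
            # This is the blur: any high-contrast detail gets averaged,
            # whether it's an aliased edge or crisp texture/text.
            neighbours = [image[y][x], image[y - 1][x], image[y + 1][x],
                          image[y][x - 1], image[y][x + 1]]
            out[y][x] = tuple(sum(c[i] for c in neighbours) / 5
                              for i in range(3))
    return out

# Tiny demo: a hard vertical black/white edge gets softened.
img = [[(0.0, 0.0, 0.0)] * 4 + [(1.0, 1.0, 1.0)] * 4 for _ in range(8)]
smoothed = fxaa_ish(img)
```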
Totally. But when you talk about GPU efficiency, you are talking about how easy/hard the hardware makes attaining efficiency in a variety of scenarios.
Also, almost nothing in the real world "hovers around" 100% efficiency. Not even close.
Don't worry man, Durango will be a cable box/tv tuner through and through, you won't be disappointed.
I imagine that when MS unveils Durango, they'll also announce all the partnerships they have with cable providers and television stations.
Lmao.
I remember when everyone here was laughing at Crytek when they said they wanted 8GB of RAM, and saying we probably wouldn't get more than 2GB of RAM in next-gen consoles, and now that we're getting fucking 4-8GB, everyone is moaning.
Dafuq people.
Thank you.
So, do you buy this whole thing about MS designing their GPU to gain parity with a Pitcairn equivalent? Would it be idiotic of Sony not to maximise efficiency for their GPU as well, given it's a closed-box environment that has to last quite a long while? I mean, why invest more to get less?
One more question for you, Aegies: does the secret sauce add any more FLOPs?
I don't think it's likely that Durango's 2013 AMD GPU FLOPs are somehow 66% more efficient than Orbis' 2013 AMD GPU FLOPs, no.
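For reference, the napkin math behind numbers like that is straightforward: theoretical peak for a GCN-style GPU is CUs × 64 lanes × 2 ops per clock (fused multiply-add) × clock speed. A quick sketch using the rumored specs floating around this thread (18 CUs vs 12 CUs, both at 800 MHz; none of it confirmed):

```python
# Back-of-envelope peak-FLOPs math for a GCN-style GPU.
# CU counts and clocks below are the rumored specs from this
# thread, not confirmed numbers.

def peak_tflops(cus, clock_ghz, lanes=64, ops_per_clock=2):
    """Theoretical peak single-precision TFLOPs."""
    return cus * lanes * ops_per_clock * clock_ghz / 1000.0

orbis   = peak_tflops(cus=18, clock_ghz=0.8)   # rumored: 18 CUs @ 800 MHz
durango = peak_tflops(cus=12, clock_ghz=0.8)   # rumored: 12 CUs @ 800 MHz

print(f"Orbis:   {orbis:.2f} TFLOPs")      # ~1.84
print(f"Durango: {durango:.2f} TFLOPs")    # ~1.23
print(f"Ratio:   {orbis / durango:.2f}x")  # 1.5x on paper

# For each Durango FLOP to "count" as much as each Orbis FLOP, you'd
# need roughly 50% better real-world utilization -- exactly the kind
# of gap people upthread are (rightly) skeptical about.
```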
PC - Brain
Orbis - Vagina
Durango - Penis
Wii U - Anus
808s & Heartbreak was okay, guys.
WiiU is also okay.
Orbis: Ferrari 458
Durango: Porsche 911 Carrera 4S
WiiU: Nissan Versa
So basically,
Orbis: Uncharted 2
Durango: Uncharted 3
Wii U: Uncharted: Fight for Fortune
Or
Orbis: God of War 3
Durango: God of War 2
Wii U: God of War: Betrayal
I want a comparison in terms of MMA fighters
Microsoft is truly trying to create the all-in one media box of the future. It's a big gamble, but they'll make a compelling case IMO.
-Surface/iPad/iPhone/etc. SmartGlass integration
-Blu-ray Movie playback
-DVR/Cable Integration (seamless) w/ Twitter/Facebook/Skype overlay
-IPTV capability (seamless)
-Video/Music Marketplace Integration
-Kinect integration (voice/gesture control everything + physical universal remote)
-Bing search built-in for internet and content/game searches and launching
GDDR5 and a more powerful GPU produce more heat.
Lmao.
I remember when everyone here was laughing at Crytek when they said they wanted 8GB of RAM, and saying we probably wouldn't get more than 2GB of RAM in next-gen consoles, and now that we're getting fucking 4-8GB, everyone is moaning.
Dafuq people.
Yes, keep shitting this in every single thread over and over. You'll last long.
Orbis: Asa Akira
Durango: Gianna
Wii U: Ron Jeremy
I hope we get TXAA 4x or equivalent as standard next gen
Orbis - Walter White
Durango - Jesse Pinkman
WiiU - Gale Boetticher
PC: Bugatti Veyron
Orbis: Ferrari 458
Durango: Porsche 911 Carrera 4S
WiiU: Nissan Versa
Lol, GoW: Betrayal. I hope everyone knows that's a cell phone game.
Huh? RAM is far from the only important metric; GPU FLOPs and CPU power are both equally important, and they're both fairly disappointing (I'm including PS4 here as well). Fair enough, if we were just rating the systems on their RAM they would both be impressive, but taken as a whole they fall far short of where they should be considering the tech PS3 and 360 had.
It seems there's one thing Durango and Orbis fanboys agree on: let's be mean to Wii U.
But let's be honest, Wii U deserves it.
At least part of the custom hardware is designed to take specific GPU-associated tasks off the shoulders of that element, freeing up more raw GPU resources for stuff it's better at. I believe, anyway. I could be wrong. Again, wish I was a software engineer.
The problem I have with this idea is that it seems so ... regressive in terms of GPU architecture.
The last decade or so of GPU development has been all about taking dedicated circuitry and replacing it with more flexible, programmable hardware. This seems to be the exact opposite approach.
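To make the tradeoff concrete, here's a toy cost model (all numbers invented purely for illustration) of the two design philosophies: let a dedicated unit run a helper task in parallel with shading, versus spending programmable GPU time on it. The dedicated unit wins when that exact task shows up every frame, but it's dead silicon when it doesn't, which is the regression being argued about:

```python
# Toy cost model: fixed-function offload vs. doing everything on the
# programmable GPU. All numbers are made up purely for illustration.

FRAME_BUDGET_MS = 16.7  # 60 fps frame budget

def frame_time(shading_ms, helper_ms, has_dedicated_unit):
    """GPU time to finish one frame's work.

    shading_ms: time the programmable shader cores need.
    helper_ms:  time for an offloadable task (e.g. bulk memory moves).
    With a dedicated unit the helper task overlaps shading; without
    one, the GPU has to do both itself.
    """
    if has_dedicated_unit:
        return max(shading_ms, helper_ms)   # runs in parallel
    return shading_ms + helper_ms           # serialized on the GPU

# A frame that only fits the budget with the dedicated unit:
print(frame_time(15.0, 3.0, has_dedicated_unit=True))   # 15.0 ms -> 60 fps
print(frame_time(15.0, 3.0, has_dedicated_unit=False))  # 18.0 ms -> missed frame

# The flip side: with little helper work, the dedicated unit sits
# idle, while extra programmable CUs would have sped up *every*
# workload. That's the flexibility you give up.
print(frame_time(15.0, 0.5, has_dedicated_unit=True))   # 15.0 ms, unit mostly idle
```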