
DF: Orbis vs Durango Spec Analysis

Didn't see a thread after I searched so please lock if old.

Nothing really new in this one but kind of puts both in perspective.

Interesting bit about "secret sauce"

There's an argument that suggests that comparing Durango and Orbis on these terms is not realistic; that the platform holders have far more control over the design of the silicon than the raw specs suggest; that they can be adapted with manufacturer-specific 'secret sauce' customisations.

The raw teraflop measurements being mooted - 1.23TF for Durango and 1.84TF for Orbis - have been dismissed as meaningless, and to a certain extent that is true. However, check out AMD's specs page for all of its various GCN hardware and you'll find similar metrics based on a very simple formula derived from clock speed and CU count. It's not the be-all-and-end-all of processing power of course, but these are accurate measurements used by AMD itself in giving a broad assessment of the raw computational power of the parts it creates. You'll find that the next-gen console parts slot in quite nicely with their PC equivalents - in short, the teraflop metrics aren't much use in isolation but they are effective for comparison purposes in terms of base hardware capabilities.
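For reference, the formula in question is simple enough to check by hand. A minimal sketch, assuming the leaked CU counts (12 for Durango, 18 for Orbis) and the rumoured 800MHz clock:

```python
# GCN throughput: 64 shader lanes per CU, each retiring a fused
# multiply-add (2 FLOPs) per cycle. The CU counts and 800MHz clock
# below are the leaked figures, not official specs.
def gcn_teraflops(cu_count, clock_ghz):
    return cu_count * 64 * 2 * clock_ghz / 1000.0

print(gcn_teraflops(12, 0.8))  # Durango: 1.2288 -> the quoted ~1.23TF
print(gcn_teraflops(18, 0.8))  # Orbis:   1.8432 -> the quoted ~1.84TF
```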


Edit sorry http://www.eurogamer.net/articles/df-hardware-spec-analysis-durango-vs-orbis

Secret Sauce
most of the custom hardware works to ease the CPU burden, not to improve GPU performance, so those hoping for 'secret sauce' to overcome Orbis's theoretical graphics advantages are probably going to be disappointed.
eSRAM and memory bandwidth
So what does this mean for game devs in real terms? Well, whichever way Microsoft tries to finesse it, the 32MB of ESRAM is a bit of a sticking plaster solution that is nowhere near as fast or efficient as the single unified pool of RAM available to Orbis. However, while the disadvantages are obvious, this is not to say that the situation is anything like a complete disaster for Durango development. Speaking to game makers, the impression we come away with is that not every feature in a game actually requires ultra-fast memory. Systems will be developed on the DDR3, and if memory throughput becomes an issue, those features will be ported over to the ESRAM where there's enough bandwidth to provide the raw performance if needed.

Also mitigating the difference to a certain extent is the fact that Durango operates under an enhanced version of DirectX 11 - dubbed internally DirectX 11.x. It's highly likely that crucial rendering functions will automatically be optimised by Microsoft for use with the ESRAM.
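As a rough sketch of the workflow described above (the names and numbers here are made up for illustration; this is not a real Durango API): build everything against the big DDR3 pool first, then greedily promote the most bandwidth-hungry resources into the 32MB of ESRAM:

```python
# Hypothetical sketch of the "develop on DDR3, port hot resources to
# ESRAM" workflow -- the names and numbers are made up; this is not
# a real Durango API.
ESRAM_BUDGET_MB = 32

def place_resources(resources):
    """Greedily move the most bandwidth-hungry resources into the
    32MB ESRAM; everything else stays in the big DDR3 pool."""
    esram_used = 0
    placement = {}
    for r in sorted(resources, key=lambda r: r["bandwidth_gbps"], reverse=True):
        if esram_used + r["size_mb"] <= ESRAM_BUDGET_MB:
            placement[r["name"]] = "ESRAM"
            esram_used += r["size_mb"]
        else:
            placement[r["name"]] = "DDR3"
    return placement

# e.g. a G-buffer is bandwidth-bound and small; streaming textures are
# huge but tolerate lower throughput
print(place_resources([
    {"name": "gbuffer",  "size_mb": 24,  "bandwidth_gbps": 80},
    {"name": "textures", "size_mb": 512, "bandwidth_gbps": 10},
]))
```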

Based on the specs I think they should have stuck with 720p as that would allow them to provide a proper next gen leap with the added bonus of native 1080p on PC. I doubt the current specs will be enough to provide a proper leap at 1080p.
 

deanos

Banned
Again with the secret sauce -___- .
Every time Orbis seems to be ahead of Durango, someone (Aegis, Proelite, etc) comes out with some unspecific blah blah.
Orbis has better GPU? No problem, Durango has GPU secret sauce.
Orbis has better RAM? No problem, Durango has RAM secret sauce.
Orbis has 4 CUs for computing? No problem, Durango has CPU secret sauce.

Secret-Sauce.jpg
 
Horrible. They barely even sell 720p displays anymore. 1080p or bust, IMO.
Why? People who care about IQ are already gaming on PCs and no matter how powerful PCs are in relation to consoles, they'll always be shackled by them. Aiming for 720p gives you that next gen leap and if you care about image quality then you'd move to PC and get those same graphics but at several times the resolution.
 

Valnen

Member
Why? People who care about IQ are already gaming on PCs and no matter how powerful PCs are in relation to consoles, they'll always be shackled by them. Aiming for 720p gives you that next gen leap and if you care about image quality then you'd move to PC and get those same graphics but at several times the resolution.

I can't play Sony exclusives on my PC, and I want my Sony exclusives to be at 1080p. Going 720p next gen would be a huge mistake. 720p would be flat out unacceptable for what is supposed to be a huge leap graphically.
 

Eideka

Banned
I can't play Sony exclusives on my PC, and I want my Sony exclusives to be at 1080p. Going 720p next gen would be a huge mistake. 720p would be flat out unacceptable for what is supposed to be a huge leap graphically.

With 5GB of GDDR5 that should not be a problem. I'm talking about real full HD, 1920x1080.
 

StuBurns

Banned
Saying the teraflops are moot, then saying the secret sauce won't combat the Orbis advantage seems contradictory to me.
 
I'm still not convinced that a pool of fast RAM is necessarily the right choice. What happens if a game is optimised to utilise 8GB?
 
Why? People who care about IQ are already gaming on PCs and no matter how powerful PCs are in relation to consoles, they'll always be shackled by them. Aiming for 720p gives you that next gen leap and if you care about image quality then you'd move to PC and get those same graphics but at several times the resolution.

Why don't console games start giving the player the choice of running the game at 720p or 1080p, trading framerate against IQ? The average Joe need not care; it would just sit under "Advanced options" for the gamer who does have a preference.
 
This little cache of memory can run in parallel with the DDR3, and combined bandwidth then rises back up to around 170GB/s - a number close to the throughput of the GDDR5 in Orbis.

I don't get this.
How does adding speeds result in increased speeds?
Regardless of the speed of the ESRAM, the DDR3 memory still only runs at DDR3 speeds, right?
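For what it's worth, the arithmetic only holds if the GPU is reading both pools at once: neither pool gets any faster, but the total bytes moved per second can approach the sum. A quick sketch using the leaked Durango figures (~68GB/s DDR3, ~102GB/s ESRAM):

```python
# Neither pool gets faster; a GPU reading both pools at once can, at
# best, move the sum of their bandwidths per second. Numbers are the
# leaked Durango figures.
DDR3_GBPS  = 68.0   # main memory
ESRAM_GBPS = 102.0  # 32MB scratchpad

# Best case: traffic is split so both buses stay saturated
combined_best  = DDR3_GBPS + ESRAM_GBPS       # ~170 GB/s
# Worst case: all traffic hits one pool while the other idles
combined_worst = max(DDR3_GBPS, ESRAM_GBPS)   # 102 GB/s

print(combined_best, combined_worst)
```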
 

foogles

Member
Horrible. They barely even sell 720p displays anymore. 1080p or bust, IMO.
I agree, but it's important to point out that most gamers, at the distance they sit, wouldn't be able to see a difference between 720p and 1080p if everything else (frame rate, HUD real estate, etc) is held equal. Even if both MS and Sony do commit to making 1080p the standard, I can guarantee they at least have both considered allowing developers to render at 720p.

And I'm thinking that will be the case: that 720p will be allowed. Also, 30fps (as opposed to 60fps) will almost certainly be allowed for developers to use, too.
 
One thing people should note with regard to "compute functions" and the 4 CUs PS4 has supposedly reserved for them: compute is really an umbrella term that could mean anything. As the article mentions, JC's water physics are based on compute code, a somewhat minor task. At the same time, BF3 uses a great amount of DX11 compute code in its deferred renderer, which touches pretty much the entire game.

I would think that we should see the same kind of variability and range in the consoles next gen.
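To put the rumoured "4 CUs for compute" in scale, here's the same per-CU arithmetic as the teraflop formula earlier, assuming the leaked 18-CU Orbis part at 800MHz:

```python
# Scale of the rumoured "4 CUs for compute", assuming the leaked
# 18-CU Orbis GPU at 800MHz (same per-CU formula as earlier).
def gcn_teraflops(cu_count, clock_ghz):
    return cu_count * 64 * 2 * clock_ghz / 1000.0

print(gcn_teraflops(4, 0.8))   # ~0.41 TF reserved for compute
print(gcn_teraflops(14, 0.8))  # ~1.43 TF left for rendering
```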
 
I don't think Sony is using LibGCM this gen, but instead native OpenGL.

Also, he "added" the two memory BW's for Durango. -__-
 

gofreak

GAF's Bob Woodward
Hmm. Not a whole lot new but a good sum-up I guess.

The bit about APIs was interesting though, and kind of jibes with what Edge reported (Sony giving very 1:1 low-level access through its libraries, MS 'locking' people to DirectX).
 
Saying the teraflops are moot, then saying the secret sauce won't combat the Orbis advantage seems contradictory to me.
He said that they are moot to a certain extent, in isolation. The problem here is that the PS4 GPU seemingly has the advantage all around.
 

Valnen

Member
I agree, but it's important to point out that most gamers, at the distance they sit, wouldn't be able to see a difference between 720p and 1080p if everything else (frame rate, HUD real estate, etc) is held equal.

As someone who games on a PC monitor exclusively that makes me sad =(.

I'm expecting most games to be 1080p. I can't imagine consumers being willing to pay $400 for more sub-HD games.
 
DigitalFoundry said:
You only need to look at games like God of War and Uncharted to see what Sony's approach to exploiting its hardware can produce: these remain state-of-the-art video games to this day, despite utilising a GPU directly derived from now-obsolete, vintage-2005 Nvidia hardware. Of course, with PS3 in particular, the GPU is only one part of the overall hardware offering, but the fact remains that developers are extracting performance from RSX that could only have been dreamed of when the console was designed.

TBH I am fucking SALIVATING to see what GG, Naughty Dog and Santa Monica can produce on next gen hardware.

God of War: Ascension looks completely frigging mind-blowing in terms of IQ and scale as it is.
 

Eideka

Banned
Hmm. Not a whole lot new but a good sum-up I guess.

The bit about APIs was interesting though, and kind of jibes with what Edge reported (Sony giving very 1:1 low-level access through its libraries, MS 'locking' people to DirectX).

That would be absolutely terrible, what the hell is MS doing?
Regarding the amount of RAM dedicated to the OS, has the 3GB been confirmed or corroborated somehow? Again, that's an insane amount.

I'm expecting most games to be 1080p. I can't imagine consumers being willing to pay $400 for more sub HD games.
Unfortunately those consoles are not only targeting core gamers; the mainstream consumer is the key target, and they won't notice the difference between 720p and 1080p.
 
And if it's not?

I'm sure for certain types of games, such as large open-world Skyrim and GTA type environments, the more memory that's available the better.

But only in the sense that you could have more of a level held in memory at any one time. I'd rather have a more powerful GPU, which the PS4 seems to have, delivering higher fidelity visuals and smoother frame rates, at the expense of some additional loading from the hard drive.
 

gofreak

GAF's Bob Woodward
That would be absolutely terrible, what the hell is MS doing?

Well, it is a custom version of DX and will be optimised for Durango and will expose custom features as mentioned, but the suggestion later on in the article is that it may not be quite as 1:1 with the hardware as Sony's.

They may be thinking about future hardware that may not necessarily use the same components or may be thinking of other kinds of compatibility.
 

test_account

XP-39C²
I wonder if Sony will talk much about specs regarding PS4. They did this with the PS3, I remember. Although they hardly mentioned anything regarding Vita, so I guess they won't mention much in the way of hard specs for the PS4.
 
As someone who games on a PC monitor exclusively that makes me sad =(.

I'm expecting most games to be 1080p. I can't imagine consumers being willing to pay $400 for more sub-HD games.
I don't think people really care. Nobody I know offline cares that the COD games are sub-HD, or complains about them looking low-res. It even came up in a conversation with a couple of core gamers I know (not GAF-tier core, but not casuals either) and they had no idea.
 

Eideka

Banned
Well, it is a custom version of DX and will be optimised for Durango and will expose custom features as mentioned, but the suggestion later on in the article is that it may not be quite as 1:1 with the hardware as Sony's.

The future compatibility argument makes sense for them, but aren't developers going to be seriously pissed about that?
On paper the 720 is behind Sony's offering; if the API makes the gap even larger, I don't know if MS will have the multiplatform advantage it had this gen.
 

gofreak

GAF's Bob Woodward
I wonder if Sony will talk much about specs regarding PS4. They did this with the PS3, I remember. Although they hardly mentioned anything regarding Vita, so I guess they won't mention much in the way of hard specs for the PS4.

Yeah, that's what I'm thinking. Everything changed after Kutaragi left. Kutaragi was talking about PS3 tech so long in advance of PS3 even being announced. And then they went into lots of detail with their spec sheet and presentations.

Vita had a much more moderate level of spec exposure. They initially didn't even see fit to publish the memory amounts.

So that kind of sucks, if Sony and/or MS are clamming up about technical detail. The word that Sony will be focussing less on spec and more on play style suggests that PR-wise they may not go overboard with spec talk.

On the other hand, though, the leaks so far, if accurate, have given us a lot more low-level detail about these systems than we've actually had officially out of the platform holders in the past.

For Sony's part, if they feel there's a general perception out there that their system is more powerful, elegant and simple, they might well be happy not to talk too much about spec and leave things out there as they are. Talking about spec too much might provoke messy PR arguments with MS.
 

Bojanglez

The Amiga Brotherhood
Well, it is a custom version of DX and will be optimised for Durango and will expose custom features as mentioned, but the suggestion later on in the article is that it may not be quite as 1:1 with the hardware as Sony's.

They may be thinking about future hardware that may not necessarily use the same components or may be thinking of other kinds of compatibility.

Annual Xbox releases with improved hardware?
 
Hmm. Not a whole lot new but a good sum-up I guess.

The bit about APIs was interesting though, and kind of jibes with what Edge reported (Sony giving very 1:1 low-level access through its libraries, MS 'locking' people to DirectX).

From what I can gather at B3D, "coding to the metal" means using libGCM, while DirectX on the 360 was like libGCM, or as close to it as you could get.

From what we've heard, it should be a seamless transition for developers (especially those who've worked with DX11 on prior PC projects) and while some appear to be worried that being locked to the Microsoft API will be an issue, the fact is that there are specific DX11 functions available to devs tied into the custom Durango hardware. There's also a level of flexibility in how DirectX is used that is equivalent to the almost legendary concept of "coding to the metal". For example, on Xbox 360, Microsoft allow developers to load shader constant data into the GPU in its native form. Devs point the hardware to the data and it loads it - the challenge is to ensure it's in the right place, in the right format before the GPU gets to it. Strictly speaking, it's still working within the DirectX API, but effectively, developers are writing to the hardware directly.

Almost the same thing; only the gaming press is still confused about what that means.
On consoles you can cut levels of abstraction that you just can't on PC, because there the API needs to talk to many different hardware devices.

I could be wrong....
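As a loose analogy for the "native form" trick in the quote (purely illustrative, nothing here is the actual Xbox API): the developer pre-packs constant data into exactly the byte layout the GPU expects, so the runtime just hands the hardware a pointer rather than translating anything:

```python
import struct

# Loose analogy only -- not the actual 360/Durango API. The developer
# packs a float4 shader constant into exactly the byte layout the GPU
# reads, so the runtime can hand the hardware a pointer to it with no
# translation step in between.
def pack_float4_constant(x, y, z, w):
    # '<4f' = four little-endian 32-bit floats, the GPU-native layout
    return struct.pack("<4f", x, y, z, w)

constant = pack_float4_constant(1.0, 0.5, 0.25, 1.0)
print(len(constant), "bytes, ready for the GPU to consume as-is")
```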
 