
VGleaks: Orbis Unveiled! [Updated]

iceatcs

Junior Member
"About 14 + 4 balance:

- 4 additional CUs (410 Gflops) “extra” ALU as resource for compute

- Minor boost if used for rendering"

If it's minor it implies it is largely reserved/optimized for compute behavior, since I wouldn't consider a 28% increase minor.

So all 8 CPU cores are free? Hope MS is doing the same. 2 reserved cores is way too much. I don't need it encoding while gaming.
 

i-Lo

Member
Honest question: Given that most PC graphics cards use between 1 and 2GB at most for graphical information at 1080p and less for current games, why would the PS4 all of a sudden be starved with 3.5GB, even with general RAM requirements?

Last time I checked, the Radeon HD 7770 has 1GB and the GTX 670 has 2GB, and they run most of today's games very well (especially the GTX 670) at 1200p on high. Do people really believe that all of a sudden 5-7GB will be used purely for graphical data? Also, have people forgotten about streaming?

And has anyone ever measured pure "system" RAM utilization while playing games at resolutions between 800p and 1200p (16:10 aspect ratio)?
 
It's not 1.4TF though, because the additional 4 CUs can be used in just the same way to get to 1.8TF.
A 28% increase in GPU resources only providing a minor boost in rendering power suggests that those 4 extra CUs cannot be used in the same way as the 14 normal CUs when it comes to graphics rendering.
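
For reference, the leaked numbers do line up with plain GCN arithmetic. This is a back-of-envelope sketch assuming 64 ALUs per CU, 2 ops per clock, and the oft-rumored 800 MHz clock, none of which is confirmed:

```python
# GCN back-of-envelope: GFLOPS = CUs x 64 ALUs x 2 ops/clock (FMA) x clock in GHz
# Assumption: 64 lanes per CU and an 800 MHz clock (rumored, not confirmed).
def gflops(cus, clock_ghz=0.8, lanes=64, ops_per_clock=2):
    return cus * lanes * ops_per_clock * clock_ghz

print(gflops(18))    # 1843.2 -> the "1.8TF" figure (all 18 CUs)
print(gflops(14))    # 1433.6 -> the "1.4TF" figure (14 CUs)
print(gflops(4))     # 409.6  -> the leak's "410 Gflops" for the extra 4
print(4 / 14 * 100)  # ~28.6  -> the "28% increase" being discussed
```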
 
Honest question: Given that most PC graphics cards use between 1 and 2GB at most for graphical information at 1080p and less for current games, why would the PS4 all of a sudden be starved with 3.5GB, even with general RAM requirements?

Last time I checked, the Radeon HD 7770 has 1GB and the GTX 670 has 2GB, and they run most of today's games very well (especially the GTX 670) at 1200p on high. Do people really believe that all of a sudden 5-7GB will be used purely for graphical data? Also, have people forgotten about streaming?

And has anyone ever measured pure "system" RAM utilization while playing games at resolutions between 800p and 1200p (16:10 aspect ratio)?

Because 8xMSAA and 32xAF.

You know. The standard stuff.
 

gofreak

GAF's Bob Woodward
"About 14 + 4 balance:

- 4 additional CUs (410 Gflops) “extra” ALU as resource for compute

- Minor boost if used for rendering"

If it's minor it implies it is largely reserved/optimized for compute behavior, since I wouldn't consider a 28% increase minor.

It wouldn't be a simple 'boost' like that. The 4 CUs would have to be working on distinct stuff from the 14.

But compared to the boost other 'co-processors' have offered in the past in some graphics tasks (e.g. SPUs) it might be characterised as minor. Particularly if you're trying to encourage developers to use GPGPU for other things, as I think they clearly are. They wouldn't slice 4 CUs out of the hardware scheduler's hands if they didn't expect reasonably substantial GPGPU usage, and they'll want to see that decision justified, so I'm sure they're really trying to get devs to think along those lines and to not turn around and just use those CUs for graphics.

If devs did as a matter of course turn around and use those CUs more or less for graphics work it would make the decision to split them out a bad one...
 

Nirolak

Mrgrgr
A 28% increase in GPU resources only providing a minor boost in rendering power suggests that those 4 extra CUs cannot be used in the same way as the 14 normal CUs when it comes to graphics rendering.

Right, presumably they're hooked out of the main rendering pipeline and somehow set up to better focus on other compute tasks like animation, physics, and other such floating point work.
 

DopeyFish

Not bitter, just unsweetened
Honest question: Given that most PC graphics cards use between 1 and 2GB at most for graphical information at 1080p and less for current games, why would the PS4 all of a sudden be starved with 3.5GB, even with general RAM requirements?

Last time I checked, the Radeon HD 7770 has 1GB and the GTX 670 has 2GB, and they run most of today's games very well (especially the GTX 670) at 1200p on high. Do people really believe that all of a sudden 5-7GB will be used purely for graphical data? Also, have people forgotten about streaming?

And has anyone ever measured pure "system" RAM utilization while playing games at resolutions between 800p and 1200p (16:10 aspect ratio)?

it won't be starved

the only negative between orbis<->durango is pool size. That makes things potentially problematic for porting down from durango. I stress POTENTIALLY.

3.5 GB GDDR5 on its lonesome is great.

Because 8xMSAA and 32xAF.

You know. The standard stuff.

lol that would be a little overkill
 
They're not being reserved.

They just have to be used more explicitly in your code. If I'm writing my rendering code, I make library calls, and the library and hardware take care of the rest. The GPU splits the work up per vertex or per group of fragments, or whatever, and schedules the threads and executes them.

These 4 CUs won't operate under that regime. They won't mix in 'automagically' to the work you do through the graphics libraries. You'll have to set up the work explicitly, separately, for those 4 CUs.

The work could be whatever you want that a CU can do - graphics or GPGPU. So they're not being 'reserved' per se, they're not being prescribed for any one particular use. However the motivation for splitting the CUs out of a hardware managed regime is so that if you want to do GPGPU, you don't have to leave the scheduling and context switching up to a hardware balancer that might not do things optimally when mixing graphics and compute tasks.

"I think"

Basically how I figure it as well. It's a smart move, really.
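
To make that concrete, here's a rough sketch of the two regimes. Every name in it is made up (this is not any real console API), it just illustrates 'automagic' library-scheduled graphics versus explicitly dispatched work:

```python
# Hypothetical sketch: none of these classes are a real console API.
# They only illustrate hardware-scheduled vs explicitly dispatched work.

class GraphicsContext:
    """Stands in for the graphics library: you issue draw calls and the
    GPU's hardware scheduler spreads the work across the 14 CUs for you."""
    def draw(self, mesh):
        print(f"draw({mesh}): split up and scheduled automatically")

class ComputeQueue:
    """Stands in for the 4 'extra' CUs: no automatic balancing, the
    programmer builds and dispatches each job explicitly."""
    def dispatch(self, kernel, groups):
        print(f"dispatch({kernel}, groups={groups}): runs on the 4 CUs")

gfx = GraphicsContext()
gfx.draw("character_mesh")               # the 'automagic' path via the library

compute = ComputeQueue()
compute.dispatch("cloth_sim", 512)       # explicit path: GPGPU...
compute.dispatch("extra_geometry", 256)  # ...or even graphics work, but
                                         # only if you set it up yourself
```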
 
AMD bet the company on their new philosophy and products, and neither Durango nor the PS4 ends up with anything "new" the way unified shaders were.

HSA? GPGPU? Kaveri? Steamroller?

If reserving 4 CUs + a downclocked GPU + Jaguar is the best AMD can do, I really wonder if firing their designers was such a good idea. They seem far from ready for anything.
 
So before either console has shown any service, game or peripheral, we already have clowns cross-posting brags about which system is going to be the most powerful and waiting for "bitter tears"?

Going to be a crazy year. Enjoying the actual tech discussion... well, what I can actually understand of it. :P
 

Valnen

Member
it won't be starved

the only negative between orbis<->durango is pool size. That makes things potentially problematic for porting down from durango. I stress POTENTIALLY.

3.5 GB GDDR5 on its lonesome is great.
"Porting down"

Ugh. Implying it's inferior...
 

Reiko

Banned
Last I heard this is a Kinect game for Durango, do you have more recent information?

It was supposed to be a Kinect hybrid game.

But then it was moved to next gen. It could have turned into a full-on core game.

For all intents and purposes, Ryse should be a showcase title. I mean, it's a new IP from Crytek.
 

artist

Banned
So before either console has shown any service, game or peripheral, we already have clowns cross-posting brags about which system is going to be the most powerful and waiting for "bitter tears"?
Yup, look at these premature victory laps:
Orbis sounds worse and worse. GPU is not 1.8tflops, but 1.4 vs 1.2 of Durango. Like I predicted, MS will come out on top with the more efficient machine and more ram
Yup. It looks like MS engineers will have done it again.
 
Honest question: Given that most PC graphics cards use between 1 and 2GB at most for graphical information at 1080p and less for current games, why would the PS4 all of a sudden be starved with 3.5GB, even with general RAM requirements?

Last time I checked, the Radeon HD 7770 has 1GB and the GTX 670 has 2GB, and they run most of today's games very well (especially the GTX 670) at 1200p on high. Do people really believe that all of a sudden 5-7GB will be used purely for graphical data? Also, have people forgotten about streaming?

And has anyone ever measured pure "system" RAM utilization while playing games at resolutions between 800p and 1200p (16:10 aspect ratio)?

PCs also have a large amount of system RAM as cache.
You don't need that 4GB for graphics rendering, but you will also need to use a decent amount of it for caching, or else you just end up with painfully long loading times and a bunch of nasty texture pop-in (again).
If you are limited by how much you can stream in from the disc at any given time, you are also more limited in the size of the world, and things like teleporting/fast travel might not be possible (since they require all-new data that you could normally have at the ready in RAM, but that you can't possibly fetch off a slow HDD/disc fast enough).

Access times for HDD vs RAM differ by orders of magnitude (thousands of times slower), and bandwidth is also 50-100x worse.
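
Some ballpark numbers behind that claim (typical 2013-era figures, assumed rather than measured on either console):

```python
# Back-of-envelope: why you cache in RAM instead of relying on the HDD/disc.
# All figures are generic ballpark values, not console measurements.
hdd_seek_s   = 10e-3    # ~10 ms random access on a consumer HDD
ram_access_s = 100e-9   # ~100 ns DRAM access
hdd_mb_s     = 100.0    # ~100 MB/s sustained HDD read
ddr3_mb_s    = 12800.0  # ~12.8 GB/s single-channel DDR3-1600

print(hdd_seek_s / ram_access_s)  # 100000.0 -> latency gap: orders of magnitude
print(ddr3_mb_s / hdd_mb_s)       # 128.0    -> bandwidth gap, in the 50-100x+ range
print(3.5e9 / (hdd_mb_s * 1e6))   # 35.0     -> seconds to refill 3.5 GB off the HDD
```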

Sony could have gone with 2GB, but I guess that then games would be fucked for everything listed above (AGAIN). MS seems to have gone for the other extreme and could just be bottlenecking their GPU with a huge amount of too-slow RAM (or, more likely, decided not to go with a GPU fast enough to be bottlenecked by slow RAM, since there'd be no point then).

Sony went for the more expensive middle ground by being dead set on using unified RAM. (I wonder why you can't have unified RAM, i.e. accessible by both CPU and GPU, AND also connect it through a bus to a cache of cheap DDR that literally only acts as a ramdisk; you'd get both your bandwidth and the ability to cache all the data you want, without the cost of 4GB of GDDR5.)

edit: to explain, just look at what happens in Skyrim or GTA4.

GTA4: you can freeroam around the islands at will, but when you want to fast travel there is a loading screen, because the game can't stream everything into the PS3/Xbox RAM fast enough.
"Reality" around the player is just a tiny little bubble of traffic, pedestrians and dynamic stuff; everything further out is LOD'ed away (fly up in a heli a bit and cars become sprites, light posts become sprite lights, pedestrians disappear, etc; fly back down and the old cars are gone, replaced by new ones). Turn around and the cars behind you will disappear, etc.
Why? Because both consoles have no cache to keep all of this data in when the GPU needs to flush its memory to make room for the new data it's streaming in.

Skyrim is an even better example: you can roam around freely, but any fast travel is met by a loading screen, and every house, every dungeon and almost every set piece is instanced, because the game has to slowly load all this stuff in from the disc/HDD every time.
The entire game is designed around it (and seriously gimped because of it).

Gran Turismo: leave a race, go to the main menu and start the exact same race again? FUCK you, loading screen time.


Consoles needed shitloads more RAM so games no longer need to be mechanically/design-wise crippled and limited like this.

Again, I think Durango took it too far by just throwing bandwidth under the bus, but we'll see how that turns out.

If nothing else, Durango has the potential for somewhat shorter loading times, larger levels, more 'persistence', less 'tiny bubble of reality surrounded by cardboard smoke and mirrors' and more interesting design and gameplay.
That is assuming they tone the graphics down enough and don't just go for really pretty corridors again, of course...

And still, the 4GB of GDDR5 is a relief for the Sony console (as disappointingly low-end as the GPU seems to have turned out), since that is at least enough to allow bigger and more persistent maps, less constant memory flushing and shorter loading times. 2-3GB would have just resulted in another PS3/Xbox360/PS2/PSX situation, loading-times/world-size wise.
 

Lord Error

Insane For Sony
Based on what? Double the RAM is not that crazy a thing if the bandwidth is more restricted... & Durango has 32 MB eSRAM, remember. Tiling is not a problem on Orbis where on Durango it will be. Why do people continue to think more RAM is important? It's absolutely false.
Extra RAM could be interesting if it can be used to precompute tons of stuff at runtime. What kind of stuff, I have no idea. For normal game stuff though, I think that with today's engines that stream everything, RAM amount is not really an issue anymore.
 

Eideka

Banned
It was supposed to be a Kinect hybrid game.

But then it was dropped to next gen. Could have turned into a full on core game.

This is speculation on your part then; I don't see how moving it to next-gen implies that Kinect will be brushed aside. That's weird, you seemed so adamant.

The setting is very interesting though, I'm keeping an eye on this.

The "14 + 4 balance" is a game changer and closes the gpu gap significantly. That's a disappointment.
Both machines are modest, and far from a "beast" or "monster".
I'm disappointed myself but let's not get carried away : the visual jump will be spectacular nonetheless.

Those machines seem tweaked for maximum efficiency and that will show in games.
 

Razgreez

Member
AMD bet the company on their new philosophy and products, and neither Durango nor the PS4 ends up with anything "new" the way unified shaders were.

HSA? GPGPU? Kaveri? Steamroller?

If reserving 4 CUs + a downclocked GPU + Jaguar is the best AMD can do, I really wonder if firing their designers was such a good idea. They seem far from ready for anything.

It is disappointing, yet I'm pretty sure both these companies are focussing on much shorter console life-cycles. The Wii U, PS-next and NextBox are all far more focussed on efficiency than ever before; the global economy dictates as much.
 

acksman

Member
I don't know why you think this. Sony already said they'd let Microsoft make the first move. I don't think Sony wants to be in the position of having subpar 3rd party games again, and with the similar architectures this time around I don't see them having that issue. You also need to consider the possibility of the next Xbox having Kinect in every box. If that happens, what makes you think they can sell a more powerful system at a reasonable price without taking a big loss?

Think subscription-based pricing. That is how Microsoft will keep it affordable and probably beat Sony's pricing this round. MS already tested it this past holiday season with the 360 and a two-year sub to Live. Expect something similar. Welcome to cell phone pricing.
 
Last I heard this is a Kinect game for Durango, do you have more recent information?

Crytek was asked about it and they changed their tune, saying "Kinect will be a part of the game", leading many to believe it's most likely one of those "better with Kinect" games.
 

thuway

Member
Right, presumably they're hooked out of the main rendering pipeline and somehow set up to better focus on other compute tasks like animation, physics, and other such floating point work.

Correct. It's a balancing act. Instead of wasting GPU resources trying to make the machine juggle expensive compute tasks, a portion of the GPU is shelved off to make these tasks relatively cheap. It is a better solution. Hence the 1.4 TF left over will be very efficient at what it does.
 

guek

Banned
it's funny that what annoys me most about the wiiu = krillin/yamcha/farmer comments is how it displays such a shallow understanding of DBZ. c'mon!! you can draw up better metaphors than that without using hyperbole.
for example:

360/ps3 = androids 18 & 17 respectively. cuz you know...they're HD twins?

WiiU = android 16. A better model but doesn't show off at all.

Orbis = Ultra Trunks
Durango = Super Vegeta

or conversely, WiiU = imperfect cell. Thinks it's a big boy but is really not in the same league as the others.

ps2/gc/xbox were frieza/ssj1trunks/ssj1goku

wii was ssj1 vegeta (got demolished by android 18)

it's not that hard, people! put some effort into your dbz metaphors!

/nerdrant
 

spwolf

Member
a bunch of nasty texture pop-in (again).
If you are limited by how much you can stream in from the disc at any given time, you are also more limited in the size of the world, and things like teleporting/fast travel might not be possible (since they require all-new data that you could normally have at the ready in RAM, but that you can't possibly fetch off a slow HDD/disc fast enough).


lol.
 

nib95

Banned
Being slow at graphics rendering won't affect those 4 CUs' general FLOPS count.

That makes sense. I wonder why Sony went in this direction. Sounds unusual.

Anyone else want to provide a follow-up on gofreak's post?

gofreak said:
They're not being reserved.

They just have to be used more explicitly in your code. If I'm writing my rendering code, I make library calls, and the library and hardware take care of the rest. The GPU splits the work up per vertex or per group of fragments, or whatever, and schedules the threads and executes them.

These 4 CUs won't operate under that regime. They won't mix in 'automagically' to the work you do through the graphics libraries. You'll have to set up the work explicitly, separately, for those 4 CUs.

The work could be whatever you want that a CU can do - graphics or GPGPU. So they're not being 'reserved' per se, they're not being prescribed for any one particular use. However the motivation for splitting the CUs out of a hardware managed regime is so that if you want to do GPGPU, you don't have to leave the scheduling and context switching up to a hardware balancer that might not do things optimally when mixing graphics and compute tasks.

"I think"
 

Reiko

Banned
"Porting down"

Ugh. Implying it's inferior...

It might come down to RAM targets for multiplats. Everything else is hot air.

Some will target PS4 RAM amounts, while others will target 720/Durango RAM amounts.

Whichever RAM amount and type suits their engine, one of these consoles will get ported down.
 
people please don't reveal your true colors just yet! if you want to come out of the closet, do it properly, with a meltdown at E3!!!

some of these posts celebrating and heralding winners, my god.
 
A 28% increase in GPU resources only providing a minor boost in rendering power suggests that those 4 extra CUs cannot be used in the same way as the 14 normal CUs when it comes to graphics rendering.

Or, like people have been saying all along: the real performance difference from floating point resources is a lot smaller than it would seem :p
 
People seem to be freaking out about the 14+4 compute units. Unless they gimped the segregated 4, or artificially prevent them from rendering, those extra 4 should be just as adept at rendering graphics as the other 14. They're just separated in order to make it easier for a programmer to load-balance GPGPU work and general graphics work. They might also have their own local cache in order to improve GPGPU performance. Basically, gofreak summed it up:

They're not being reserved.

They just have to be used more explicitly in your code. If I'm writing my rendering code, I make library calls, and the library and hardware take care of the rest. The GPU splits the work up per vertex or per group of fragments, or whatever, and schedules the threads and executes them.

These 4 CUs won't operate under that regime. They won't mix in 'automagically' to the work you do through the graphics libraries. You'll have to set up the work explicitly, separately, for those 4 CUs.

The work could be whatever you want that a CU can do - graphics or GPGPU. So they're not being 'reserved' per se, they're not being prescribed for any one particular use. However the motivation for splitting the CUs out of a hardware managed regime is so that if you want to do GPGPU, you don't have to leave the scheduling and context switching up to a hardware balancer that might not do things optimally when mixing graphics and compute tasks.

"I think"

It's a long long time since we judged general 'pure graphics' ability on triangle setup rates... :)

Exactly. If a game were absolutely optimized to perfection (not even remotely possible), you would only need to render 1 polygon per pixel - a little over two million polygons per frame at 1920x1080, or 125 million polygons per second at 60fps. That's theoretically achievable on current HD consoles. Both upcoming consoles likely won't suffer from a lack of polygon-pushing ability, especially since smart and efficient use of tessellation for LOD can greatly aid in putting more advanced scenery on screen.
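
The arithmetic, for anyone who wants to check it:

```python
# "One polygon per pixel" ceiling at 1080p, 60fps:
pixels_per_frame = 1920 * 1080   # 2,073,600 -> "a little over two million"
print(pixels_per_frame)
print(pixels_per_frame * 60)     # 124,416,000 -> the ~125 million polygons/sec figure
```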
 

deanos

Banned
This 14 + 4 CU thing is bogus and this is why:
- 4 additional CUs (410 Gflops) “extra” ALU as resource for compute

since when do you need to 'reserve' CUs in order to do that? GPGPU says hi.
btw you can bookmark this.
 
it won't be starved

the only negative between orbis<->durango is pool size. That makes things potentially problematic for porting down from durango. I stress POTENTIALLY.

3.5 GB GDDR5 on its lonesome is great.



lol that would be a little overkill

Nope. They will use the 3.5GB pool from Orbis and the lesser bandwidth from Durango. Both will have 4xMSAA, Orbis using the extra RAM bandwidth and Durango using the ESRAM.

Seriously, third parties are going lowest common denominator. Make no mistake. It will be first parties that make full use of the systems. I can't wait to see what Naughty Dog and 343i do on next-gen hardware.
 

artist

Banned
The 14+4 split is something I hadn't heard before; it makes as much sense as this:

VGLeaks on Durango said:
- 8 gigabytes (GB) of DDR3 RAM (68 GB/s)

- 32 MB of fast embedded SRAM (ESRAM) (102 GB/s)

- from the GPU’s perspective the bandwidths of system memory and ESRAM are parallel, providing a combined peak bandwidth of 170 GB/sec.
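
(For what it's worth, the 170 GB/sec is just the two pools summed, a best case that assumes both are being read at once:)

```python
ddr3_gbs  = 68    # GB/s, the main 8 GB DDR3 pool
esram_gbs = 102   # GB/s, the 32 MB ESRAM pool
print(ddr3_gbs + esram_gbs)  # 170 -> the "combined peak" is simply additive
```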
 

Eideka

Banned
Crytek was asked about it and they changed their tune, saying "Kinect will be a part of the game", leading many to believe it's most likely one of those "better with Kinect" games.

This pleases me, I have no interest whatsoever in motion controllers. Yes, I'm part of "those" people who favor traditional gaming over what I see as gimmicks.
 

Reiko

Banned
people please don't reveal your true colors just yet! if you want to come out of the closet, do it properly, with a meltdown at E3!!!

some of these posts celebrating and heralding winners, my god.

Didn't you make that Don Mattrick post earlier? lol

This pleases me, I have no interest whatsoever in motion controllers. Yes, I'm part of "those" people who favor traditional gaming over what I see as gimmicks.

I agree. Just seeing CryEngine 3 being used in a new setting that's not "Fun in the jungle redux" interests me greatly.
 

Lord Error

Insane For Sony
"About 14 + 4 balance:

- 4 additional CUs (410 Gflops) “extra” ALU as resource for compute

- Minor boost if used for rendering"

If it's minor it implies it is largely reserved/optimized for compute behavior, since I wouldn't consider a 28% increase minor.
Perhaps minor compared to the overall boost provided when they're used to offload compute tasks from the CPU.

Good news is there's confirmation that those 4 CUs can in fact be used for rendering if a dev wants to.
 