
VGleaks: Orbis Unveiled! [Updated]

Gorillaz

Member
So here is the ultimate question:
Which console will be the Gianna Michaels of next gen?
Reiko give me some feed back or something
 

Orayn

Member
That doesn't mean devs will aim for good tech, in general. We will still see games run in the TEENS in FPS (Thanks, DICE for BF3 on consoles) just to put more dust/blood/sun flares/rubber trees on the screen.

We will see parity, yes, but that doesn't mean devs will change their shitty ways.

Personally, I'd rather have a hit to visuals for 1080p native at 60 but we won't be seeing that this gen as a norm.

I can still see there being a 1080p60 honeymoon for the first year or so when devs are using most of the same engines with a lot more power than they've been used to.
 
Too much RAM??

You can never have too much RAM. If anything, the RAM on Orbis is frankly a complete letdown.

Within 2 years the PS4 will be bandwidth starved and devs will be back to making compromises again.

Anything less than 8GB is frankly a disappointment
 

Verendus

Banned
Folks need to stop posting that pussy Gohan.

What I'm really curious about regarding these specs is how far destructibility and physics can go. I really hope these kinds of things are on display in the first batch of games they choose to show. With these specs, they should be able to do some really nifty stuff. I genuinely can't wait to see GT6 and ND/SSM's next projects. They're going to produce some eye-melting stuff.

Thank you. All I care is that this will be a big jump.
It will be a significant enough jump. I don't think you'll be disappointed.
 

mrklaw

MrArseFace
But if that is the case (and that's the only reason, with no further hardware customization), then why would you fix the split to 14:4? I can't imagine that it wouldn't be equally possible to simply allow any desired split from the software side.

compromization
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
I have a pretty good feeling we're looking at $400 - $450. I don't think we're going to see $500

To me, that's way too high in a shrinking industry. We'll see what happens, though.
 
I'm well aware of that.

Just saying that in terms of graphics rendering performance the difference between Durango and Orbis is going to be a lot less than the 50% that was talked about before.

But is Durango then going to do all the non-graphics rendering stuff for free somehow?
 

Jack_AG

Banned
I'm well aware of that.

Just saying that in terms of graphics rendering performance the difference between Durango and Orbis is going to be a lot less than the 50% that was talked about before.

Maybe not. According to what we know - physics, AI, etc. will have to be gouged from the 1.2TF performance of Durango's GPU, limiting what can be done with graffox - whereas the Orbis has its own dedicated unit, so the entire 1.4TF can be dedicated to graphocks.

There will be a difference. Of course, what can change in the future is the MS "special sauce" which, I contend, is nothing more than a continuation of the "jump in" PR campaign.

I'll say it again: PR > Specs
 

McHuj

Member
But if that is the case (and that's the only reason, with no further hardware customization), then why would you fix the split to 14:4? I can't imagine that it wouldn't be equally possible to simply allow any desired split from the software side.

I've wondered if perhaps you could give those CUs a fast path/interface to the CPU's L2 cache, so that communication between CPU and CU could be super low latency and very high bandwidth.

Or, with regards to the thread scheduler, CPU access always takes priority over GPU threads. That way the CPU gets guaranteed access to 4 CUs no matter what, and likewise the GPU will never see fewer than 14 CUs. I'm not saying that's necessary, but forcing those rules may ease programming and make behavior predictable.
 

Spongebob

Banned
But is Durango then going to do all the non-graphics rendering stuff for free somehow?

Neither of them will be doing anything for free.

Maybe not. According to what we know - physics, AI, etc. will have to be gouged from the 1.2TF performance of Durango's GPU, limiting what can be done with graffox - whereas the Orbis has its own dedicated unit, so the entire 1.4TF can be dedicated to graphocks.

There will be a difference. Of course, what can change in the future is the MS "special sauce" which, I contend, is nothing more than a continuation of the "jump in" PR campaign.

I'll say it again: PR > Specs

I'm well aware of this, I'm just comparing graphics rendering performance.
 
I'm well aware of that.

Just saying that in terms of graphics rendering performance the difference between Durango and Orbis is going to be a lot less than the 50% that was talked about before.
I really don't get where you are trying to draw this conclusion from. Do you even know what an ALU's role in rendering is?
 

kitch9

Banned
Too much RAM??

You can never have too much RAM. If anything, the RAM on Orbis is frankly a complete letdown.

Within 2 years the PS4 will be bandwidth starved and devs will be back to making compromises again.

Anything less than 8GB is frankly a disappointment

You need to learn computers, and stop looking at big numbers.
 
I have a feeling that for many people in today's economy where food prices are skyrocketing, $499 for a video game console isn't going to be on their priority list. If they don't launch this at $399 we're going to be seeing yet another PS3 situation early on in its life.

I'd easily buy this at $399, though. Looks amazing.

$499 for a dedicated console is going to crash out of the gate and then possibly slowly recover as more games come out and price cuts occur. A $499 box that is seen as much more than just a gaming platform will most likely sell better. I'm interested to see how Sony intends to market it.
 
Regarding the 14 + 4 CUs

(not 100% sure)

VLIW is rather inefficient when it comes to anything other than pure GPU tasks. So a GPU with a VLIW architecture wouldn't perform that well on tasks like physics, etc. So what if the 14 CUs are indeed based on a VLIW architecture (Southern Islands is not - so maybe the Liverpool GPU is older and does not feature GCN)?

To make up for that somehow, AMD designed or put together 4 extra CUs for those tasks in a SIMD architecture. I just wonder why they didn't just take a pure Southern Islands GPU, which is all SIMD, and leave it at that.

So basically I guess the GPU could be "old" and the extra CUs a newer architecture, to have better GPGPU possibilities.
 
But if that is the case (and that's the only reason, with no further hardware customization), then why would you fix the split to 14:4? I can't imagine that it wouldn't be equally possible to simply allow any desired split from the software side.

The hardware being balanced for 14 CUs implies they balanced memory amount, bandwidth, ROPs, texture units, etc. as if they had a 14 CU GPU for graphics...

Probably, to really benefit from the extra 4 CUs, they would have to scale everything up... It should be easy to see if that's the case: a game with simpler graphics/heavy simulations could use even more resources for computation.
 

artist

Banned
Uhh, people taking pre-emptive victory laps and now insisting that the BOM is well below $400.

Do you even know the die area of the SoC?
 
I really don't see any way that the BOM of this system could reach anything even close to $500. How are you arriving at that number?

Im expecting move to be included.

About $250 for CPU, GPU, RAM and all the extra little things in the SoC, $150 for motherboard and I/O etc. (+HDD + BR), $50 for controller with Move integration and $50 for Move.
 

Lord Error

Insane For Sony
i think the prospect of donating part of what's already a less than stellar GPU to prop up the console's weak CPU isn't exactly setting people's hair on fire.
I don't think this stuff is going to be used to prop up the CPU. GPU compute is meant to be used for tasks that are practically impossible to do in realtime on any of today's CPUs.
 

Frodo

Member
Since I gave up understanding what most of these things mean, I'll wait until they show us some games. Really interested to see what Santa Monica and Naughty Dog will make on the new hardware...
 

Krabardaf

Member
Hold on a minute there, console warrior. If you want things like destructible environments, heavy physics, better animation blending, less clipping, and better set pieces, then you'd better be good at compute.

This was a conscious decision, and it is GOOD news. If Durango doesn't have this, you can damn well bet that a significant portion of the GPU will be crippled. GPGPU is the wave of the future, don't be daft.

I don't understand the need for reserved CUs. GPGPU is perfectly doable on a classic GPU, without the need for dedicated units.
In the end even the Orbis GPU might be used for computational stuff, and so might the Durango one.

I think what will really matter is the number of CUs. Having a part of them arbitrarily assigned to a task may come off as a constraint more than anything else.
 

Jack_AG

Banned
I really don't get where you are trying to draw this conclusion from. Do you even know what an ALU's role in rendering is?

It's been explained a dozen infinity times already. I'm just tossing him on /ignore. There's a difference between someone who doesn't understand and someone who is ignorant.
 
Could the 4 CUs be needed in some way for backwards compatibility?

From the other Orbis thread:

That's what I'm trying to figure. I don't know if they are going to do BC through that (and use it to aid PS4 compute processes) or if it's even possible to match the SPE's speed using those CU's. Maybe they're just using those 4 CU's until they get those PE's manufactured? I have no clue.

I just think it's odd that they have 4 cores x 2 modules, when they have another 2 spots open for modules...
 
i think the prospect of donating part of what's already a less than stellar GPU to prop up the console's weak CPU isn't exactly setting people's hair on fire.

That is exactly how I feel.

Combined with the fact that it took 8 years, and that a large jump is required just to display current gen games at 1080p and 30 fps locked (instead of barely hitting 30) and to have the LOD scaling and distance be at... well... something that is less of a caricature than what we get right now, this is really disappointing to me.

If it's disappointing straight out of the gate, then I'm afraid we'll be right back at 720p/25fps/silly LOD before the console cycle even gets halfway.

I don't see how I have some sort of obligation to get excited about anything and everything regardless of the context or perceived value.
 

lantus

Member
I don't know, I think we could end up being surprised at the final price once it's announced. A lot of us here, including myself, were somewhat taken by surprise when they announced the Vita's price point.
 

iamvin22

Industry Verified
PC really has nothing like this. Don't count out both of these new consoles just because they are placed in the 1.3-2TF range "on paper". Their hardware will be able to do crazy things in a closed environment.

THIS is the problem. People are trying to compare these specs to PCs. These consoles will do amazing things because they are dedicated and don't have to share resources to support Windows, Word, the internet and so on.
 

Razgreez

Member
A console is not a computer. Computers can be updated within months.

consoles have to last an entire generation.

You can't take a console gpu out and stick a new one in. Totally different territories.

To quote my old computer science lecturer: "in the end they're all toasters, they just sit there and get hot"
 
I don't know, I think we could end up being surprised at the final price once it's announced. A lot of us here, including myself, were somewhat taken by surprise when they announced the Vita's price point.

We were pleasantly surprised, and then they said you couldn't even play the majority of games unless you had an expensive proprietary mem stick inside the device. Then we weren't so pleasantly surprised.
 

Elios83

Member
I'm well aware of that.

Just saying that in terms of graphics rendering performance the difference between Durango and Orbis is going to be a lot less than the 50% that was talked about before.

It just means that instead of having a clear edge in pure rendering performance, Orbis will have a significant edge for 'little things' like physics, animations, destructibility, fluid simulations.
The disadvantage has not disappeared, it has just shifted elsewhere: now Durango has just the Jaguar CPU (probably with more cores reserved than Orbis) for non-rendering tasks and that's it, while Orbis will have more sauce (LOL) for these tasks.
Btw, just keep in mind that these systems will be equivalent and treated as equivalent in most cases; first party developers dedicated to making exclusives will exploit the strengths in both cases and make the difference.
 
I don't really understand why people are dissing the GPU here if it is a downclocked 7870... Sure, there are much more powerful GPUs on the market, but they draw ridiculous power... in terms of the performance/efficiency balance they are not feasible at all... a 680 was never even an option. To beat a dead horse, considering how good games still look on consoles, next gen is definitely going to offer games much better looking than stuff like Battlefield 3, which was still probably held back by console focusing.
 

Spongebob

Banned
Am I missing something here?

VGleaks mentions that only a minor gain in rendering performance comes from the extra CUs, despite their representing a 29% increase in the number of CUs dedicated to rendering. From that it's reasonable to assume that these other 4 CUs aren't as capable at rendering, right?
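For what it's worth, the "29%" figure checks out if you assume the leaked 14 + 4 split. A quick sketch with the rumored numbers (nothing official):

```python
# Sanity check of the "29% increase" figure, assuming the rumored
# 14 rendering CUs + 4 extra CUs from the leak (not confirmed specs).
rendering_cus = 14
extra_cus = 4

# Relative increase in CU count if all 18 CUs rendered.
increase = (rendering_cus + extra_cus) / rendering_cus - 1
print(f"{increase:.1%}")  # 28.6%, which rounds to the quoted 29%
```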
 

Jack_AG

Banned
Neither of them will be doing anything for free.



I'm well aware of this, I'm just comparing graphics rendering performance.

Wow... people are explaining it and you're not getting it. If you want to PLAY games - you need more than just "graphics". 1.2TF for Durango is used for graphics and "else". With Orbis - 1.4TF can be used for JUST graphics (if chosen) and the extra CUs for "else".

A portion of that 1.2TF will have to be used for "else". So let's say 400GF are used for "else" in a single video game. That means Durango will use 800GF for "graphics" and Orbis will use 1.4TF for "graphics". What about this simple math do you not get?

Is any of this sinking in? I hope so, because you are now being ignored, junior.
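The budget math above can be written out explicitly. A tiny sketch using the post's numbers (the 400GF "else" figure is the poster's hypothetical, and all the TF counts are rumored, not confirmed specs):

```python
# Hypothetical GPU compute budgets, using the post's illustrative numbers.
# All figures are rumored/hypothetical, not confirmed hardware specs.
durango_total_tf = 1.2   # rumored Durango GPU throughput (graphics + "else")
orbis_gfx_tf = 1.4       # rumored Orbis throughput usable for graphics alone
else_tf = 0.4            # the poster's hypothetical physics/AI/etc. budget

durango_gfx_tf = durango_total_tf - else_tf
print(f"Durango graphics budget: {durango_gfx_tf:.1f} TF")       # 0.8 TF
print(f"Orbis advantage in this scenario: {orbis_gfx_tf / durango_gfx_tf:.2f}x")  # 1.75x
```

The gap only looks this large because the sketch assumes Orbis offloads all of its "else" work onto the extra CUs while Durango pays for it out of the same 1.2TF pool, which is exactly the post's premise.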
 
Regarding the 14 + 4 CUs

(not 100% sure)

VLIW is rather inefficient when it comes to anything other than pure GPU tasks. So a GPU with a VLIW architecture wouldn't perform that well on tasks like physics, etc. So what if the 14 CUs are indeed based on a VLIW architecture (Southern Islands is not - so maybe the Liverpool GPU is older and does not feature GCN)?

To make up for that somehow, AMD designed or put together 4 extra CUs for those tasks in a SIMD architecture. I just wonder why they didn't just take a pure Southern Islands GPU, which is all SIMD, and leave it at that.

So basically I guess the GPU could be "old" and the extra CUs a newer architecture, to have better GPGPU possibilities.
It wouldn't be DX 11.1 class if it was VLIW
 

Derrick01

Banned
Thank you. All I care is that this will be a big jump.

They both will be. Devs are targeting games like Watch Dogs and Star Wars 1313 to be launch games so just wait until people have a year or two of experience with them under their belts.

I don't think people understand just how big of a deal it is going from well under 1GB of RAM to 3.5GB of really fast RAM or 5+GB of normal RAM, let alone all of the other upgraded parts. It's a fucking huge deal.
 

Alex

Member
Im expecting move to be included.

About $250 for CPU, GPU, RAM and all the extra little things in the SoC, $100 for motherboard and I/O etc., $50 for BR, $50 for controller with Move integration and $50 for Move.

Those would be consumer level prices, for this stuff, Sony doesn't go down to Fry's when it's time to build a new box.
 
Can anyone tell if Orbis or Durango will be inherently easier to develop for?

From a hardware perspective, I think Orbis' memory is more straightforward with fewer potential bottlenecks.

From a tools perspective, I think MS will probably be the leader.

At any rate, it's much different than last generation, that had one easy to develop for system and one challenging to develop for system. Neither should be "difficult" this time around.
 