
VGleaks: Orbis Unveiled! [Updated]

What is important is that the latest news brings an end to the PS2-Xbox type power gap hyperbole.

Well, we still don't have all the details on the Durango GPU, OS, etc., so it could still be early days.

It isn't surprising that they seem to be pretty close, though... I mean, they will likely be targeting the same retail price.
 

thuway

Member
Dude.

It says that when the 4 CUs are used for rendering there is only a MINOR boost. This seems to suggest that these 4 CUs aren't as capable as the other 14 CUs in graphics rendering.

I'm not talking about the other benefits of the 4 CUs, just rendering performance.

dolan-duck-smile.jpg


-_- Spongepls. The 4 CUs are optimized for compute. If you make things like destructibility, complex calculations, animations, lighting, etc. RELATIVELY CHEAP, then the remainder of your GPU can do the heavy lifting. This is GOOD news.

EDIT: you are right, nothing is free.
 

Jack_AG

Banned
The opposite. These things are so damn similar in terms of architecture that botched multiplats should be nothing but an unpleasant last-gen memory. I hope.

That doesn't mean devs will aim for good tech, in general. We will still see games run in the TEENS in FPS (thanks, DICE, for BF3 on consoles) just to put more dust/blood/sun flares/rubber trees on the screen.

We will see parity, yes, but that doesn't mean devs will change their shitty ways.

Personally, I'd rather take a hit to visuals for 1080p native at 60, but we won't be seeing that as the norm this gen.
 

Durante

Member
There was a suggestion earlier in the thread that context and thread switching between compute and graphics tasks on GCN isn't as optimal as it could be, or eventually will be in GPUs down the line.

So having a slice of CUs work outside of hardware thread control might make sense if that's true.

Jeffrigby mentioned that AMD currently recommends, for some systems, an APU + discrete GPU, so the APU can be used for compute and the GPU for graphics... that it's best to have the two work on distinct schedulers. Apparently post-2013/2014 GPUs will have better context switching.

They may well tweak them further for compute vs graphics also, but the above, if true, may be motivation enough for the split.
But if that is the case (and that's the only reason, with no further hardware customization), then why would you fix the split to 14:4? I can't imagine that it wouldn't be equally possible to simply allow any desired split from the software side.
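To make that concrete: here's a toy sketch in Python (purely illustrative, with made-up work units and a max() cost model; this is not how GCN actually schedules anything) of why a software-side split should never lose to a hardware-fixed 14:4 one. The fixed split is just one of the candidates the flexible scheme can pick.

```python
# Toy model: a frame is bound by whichever partition finishes last.
def frame_time(gfx_cus, cmp_cus, gfx_work, cmp_work):
    return max(gfx_work / gfx_cus, cmp_work / cmp_cus)

gfx_work, cmp_work = 90.0, 10.0  # arbitrary per-frame work units

fixed = frame_time(14, 4, gfx_work, cmp_work)
flexible = min(frame_time(18 - c, c, gfx_work, cmp_work) for c in range(1, 18))
print(fixed, flexible)  # the flexible split can only match or beat the fixed one
```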
 
Could anyone kindly compile a list of posters who actually know what they are talking about? My bullshit-o-meter doesn't work in this area, so I'm having a hard time believing, and I want to believe!
 

iamvin22

Industry Verified
We've said this over and over and over again. Orbis and Durango will be good at different things. Don't be foolish enough to think one is better than the other in all frames of work. However, this news about the CUs is something that makes Orbis EVEN better. People are morons. You lot are trying to spin something positive into a negative.

This is akin to the Cell processor, and that actually is worth hyping. What the fuck is wrong with you, GAF?

The problem with GAF is that they are comparing it piece for piece to computer parts. They don't understand closed console development.

you are 100% right on the positive.
 

Gorillaz

Member
The tablet isn't even all that great: no multi-touch, built-in battery, etc. Considering also that there are rumors of touch capability in the PS4 controller, and "dual cameras" as well, I'm sure the Nintendo GamePad is a non-factor. I see Orbis and Durango releasing at $449 - $599.

That's suicidal in this economy.
 
It will be $450 at launch. All three console manufacturers have positioned themselves to be relatively successful. Sony and MS will take a couple of years, but will have long legs. Nintendo was smart to release a less powerful rig last year and will have a substantial catalog and profit (on first party games) by Christmas.
 

Lord Error

Insane For Sony
Terrible comparison. 1 AMD GCN FLOP is worth around 0.6 Nvidia Fermi FLOPs. Even then it's still a bad comparison. You should only compare these machines to the 7000 and the eventual 8000 AMD series. The closest comparison to both GPUs is the HD 7770.
Isn't this more like the 7850? With much more VRAM at that (rough peak numbers below). This is of course assuming either that:
- You would want to use compute in your games at all (which I think every next-gen game will, and heavily)
- Or the 4 CUs can still be used for rendering, through some less automated/manual scheduling, which seems to be possible.
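For rough context, the peak numbers, as a quick sketch; the desktop figures are AMD's published specs, and the 800 MHz Orbis clock is still only a rumor:

```python
# Peak single-precision GFLOPS = ALUs x 2 FLOPs/clock (fused multiply-add) x GHz.
def gflops(alus, mhz):
    return alus * 2 * mhz / 1000.0

print(gflops(640, 1000))  # HD 7770 GHz Edition (10 CUs): ~1280 GFLOPS
print(gflops(1024, 860))  # HD 7850 (16 CUs):             ~1761 GFLOPS
print(gflops(1152, 800))  # rumored Orbis (18 CUs):       ~1843 GFLOPS
```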
 
We've said this over and over and over again. Orbis and Durango will be good at different things. Don't be foolish enough to think one is better than the other in all frames of work. However, this news about the CUs is something that makes Orbis EVEN better. People are morons. You lot are trying to spin something positive into a negative.

This is akin to the Cell processor, and that actually is worth hyping. What the fuck is wrong with you, GAF?


GAF only cares about pretty pixels and vertices.

Seriously though... it's tough not to be disappointed... I wanted to see what those extra 6 CUs over Durango could have done graphics-wise.
 

GodofWine

Member
Question from a non-techie (there should be a thread for stuff like the following):

1. The occasional next-gen detractor's comments regarding "weak GPUs etc." make me want to know how much more powerful Orbis's GPU is than the PS3's. I'm sure it's by a lot, but what's a lot?

2. RAM is way higher and way faster... hypothetically, what would KZ2 or 3 look like if the PS3 had only the RAM of Orbis? Basically, how does RAM affect stuff?

3. Same question... but swap GPU for RAM...

4. What do KZ4 / BF4 / GT6 look like compared to KZ3, etc.?
 
Dude.

It says that when the 4 CUs are used for rendering there is only a MINOR boost. This seems to suggest that these 4 CUs aren't as capable as the other 14 CUs in graphics rendering.

I'm not talking about the other benefits of the 4 CUs, just rendering performance.
You're reading into their words too much. They said themselves that it's 400 GFLOPS for the array.
 
How are people coming in at $500-600? Unless Sony is looking to make a substantial profit from day one, unlike any of their previous hardware, that's never going to happen.
 

Elios83

Member
Dude.

It says that when the 4 CUs are used for rendering there is only a MINOR boost. This seems to suggest that these 4 CUs aren't as capable as the other 14 CUs in graphics rendering.

I'm not talking about the other benefits of the 4 CUs, just rendering performance.

They don't have to be used for rendering; it's a GPGPU module for advanced simulations and compute tasks. Pretty graphics are just a part of the picture. Things like physics, collisions, and animations have to significantly improve as well, and having 400 GFLOPS reserved for those when your CPU is 100 GFLOPS is a really significant thing.
Using those CUs for rendering, even if possible, goes against the reason they didn't leave them in the GPU for rendering tasks in the first place.
Of course it would have been better if the GPGPU module was something extra and not carved out of the GPU.
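Those two numbers are easy to sanity-check against the rumored clocks (a sketch assuming 800 MHz for the GPU and a 1.6 GHz eight-core Jaguar, both of which are still rumors):

```python
# GCN: 64 ALUs per CU, 2 FLOPs per ALU per clock (fused multiply-add).
def cu_gflops(cus, clock_ghz=0.8):
    return cus * 64 * 2 * clock_ghz

print(cu_gflops(4))   # ~409.6 GFLOPS: the 4-CU compute slice
print(cu_gflops(14))  # ~1433.6 GFLOPS: what's left for rendering
print(cu_gflops(18))  # ~1843.2 GFLOPS: the full GPU

# Jaguar CPU: 8 cores x 8 single-precision FLOPs per cycle x 1.6 GHz.
print(8 * 8 * 1.6)    # ~102.4 GFLOPS: why people keep saying "100 GFLOPS"
```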
 
You can't compare these machines on paper. Durango better have something to assist with compute, or else it just got much weaker in comparison.

With all that talk about efficiency, I think Durango is not going to have any extra units to assist with compute tasks, but rather architectural changes which allow the same units to switch more efficiently between them.

Somewhat akin to dedicated pixel + vertex shaders vs. unified shaders.

Dude.

It says that when the 4 CUs are used for rendering there is only a MINOR boost. This seems to suggest that these 4 CUs aren't as capable as the other 14 CUs in graphics rendering.

I'm not talking about the other benefits of the 4 CUs, just rendering performance.

For me that suggests that, everything being equal, 18 CUs provide a minor improvement in performance compared to 14.
 

Globox_82

Banned
Hold on a minute there, console warrior. If you want things like destructible environments, heavy physics, better animation blending, less clipping, and better set pieces, then you better be good at compute.

This was a conscious decision, and it is GOOD news. If Durango doesn't have this, you can damn well bet that a significant portion of the GPU will be crippled. GPGPU is the wave of the future, don't be daft.

tumblr_lagnxv7cLz1qa15tx1.gif
.
 

QaaQer

Member
Correct. It's a balancing act. Instead of wasting GPU resources trying to make the machine juggle expensive compute tasks, a portion of the GPU is shelved off to make these tasks relatively cheap. It is a better solution. Hence the 1.4 TF left over will be very efficient at what it does.

Could the 4 CUs be needed in some way for backwards compatibility?

From the other Orbis thread:

phosphor112 said:
In the DF article it mentions the following:

"Paired up with the eight AMD cores, we find a bespoke GPU-like "Compute" module, designed to ease the burden on certain operations - physics calculations are a good example of traditional CPU work that are often hived off to GPU cores."

If it really is "paired up" to the CPU, then it confirms a theory that Jeff_Rigby has had for several months.

Reading a 2010 patent by Sony (http://www.google.com/patents/US20100312969) shows a chip that resembles Toshiba's "SpursEngine." In this patent, they detail a "Processing Element" (PE) that contains 1 PPU and 4 SPEs. In short, half of a Cell.

The interesting thing about this patent is that as many of these PEs can be hooked together as one pleases. Hooking two together creates a "Cell equivalent." Now, one might ask, how much would that cost? If integrated into the Jaguar APU, very little.

How? AMD Crossbar Switch.

The AMD solution currently rumored in the Orbis would have 4 "slots" on the crossbar to integrate their chips. As of now, Orbis has 8 Jaguar cores, and there are 4 cores per Jaguar module. That means the CPU takes up 2 slots on the crossbar, leaving 2 more open. Just enough for two PEs: 8 Jaguar cores total, with 2 PPUs and 8 SPUs.

Another possible configuration that Jeff mentions is 4 Steamroller cores (as 2 come in each module), with the additional 2 PEs attached, but I digress.

In the patent there is this quote.

"The local PE bus can have, e.g., a conventional architecture or can be implemented as a packet-switched network."

This fits in line perfectly with AMD's solution.

To further solidify this theory, we take a look at this quote:

"The PE is closely associated with a shared (main) memory through a high bandwidth memory connection. Although the memory preferably is a dynamic random access memory (DRAM), the memory could be implemented using other means, e.g. as a static random access memory (SRAM), a magnetic random access memory (MRAM), an optical memory, or a holographic memory, etc."

This allows any implementation of memory as the engineers see fit. Sony is no longer tied down to the use of XDR RAM for these PEs. The GDDR5 bandwidth would satisfy the needs of the SPEs and make sure they aren't data-starved.

What does this all mean?

This means several things:

Backwards compatibility is within reach: adding the two PEs would create an environment where they can emulate the Cell. The RSX can be emulated by the GPU, and the GDDR5 bandwidth is sufficient.

PS4 functions: As mentioned in the article, it will take on "GPU-like" functions. Why use the SPEs over conventional GPU cores? SPEs are much faster; they tackle GPU tasks in a CPU manner (low core count, high speed). These functions include DSP (a feature that Sony has yet to address in the Orbis), physics tasks, and video processing (encoding/decoding). They can add all these features to the Orbis without having the GPU take a hit and without sacrificing GPU tasks.
 

Snakeyes

Member
its funny that what annoys me most about the wiiu = krillin/yamcha/farmer comments is how it displays such a shallow understanding of dbz. c'mon!! you can draw up better metaphors than that without using hyperbole.
for example:

360/ps3 = androids 18 & 17 respectively. cuz you know...they're HD twins?

WiiU = android 16. A better model but doesn't show off at all.

Orbis = Ultra Trunks
Durango = Super Vegeta

or conversely, WiiU = imperfect cell. Thinks it's a big boy but is really not in the same league as the others.

ps2/gc/xbox were frieza/ssj1trunks/ssj1goku

wii was ssj1 vegeta (got demolished by android 18)

it's not that hard, people! put some effort into your dbz metaphors!

/nerdrant
Post of the thread.

Cell: iOS/Android gaming - Feeds on average people, offers cheap derivatives of existing game series, goes through frequent incremental upgrades.

And here's the Wii U successor after Nintendo realizes the fickleness of the casual audience:

tumblr_m4sn5lsn7P1r72ht7o11_r1_500.gif
 

gofreak

GAF's Bob Woodward
But if that is the case (and that's the only reason, with no further hardware customization), then why would you fix the split to 14:4? I can't imagine that it wouldn't be equally possible to simply allow any desired split from the software side.

Well, I'm not sure about that. There is a point where things get handed over to the GPU and it balances and schedules the rest in its own hardware, no? I think. You lose sight, after a certain point, of what's going to run where. Even in a closed box it may not be possible to make that programmable.

They probably are taking the opportunity to tweak these CUs though, yeah.
 

Perkel

Banned
I don't understand this "14+4" idea yet. Unless there are some hardware customizations for those 4 CUs, I don't see the point in fixing the distribution to some specific scheme, since it would be more efficient to let each game make its own decisions.

So I'd assume that there are in fact some hardware customizations for those 4 CUs. Then the question centers on whether they are "upgrades", "downgrades" or simply "sidegrades" compared to normal, graphics-focused CUs.


I think it is pretty clear what they want.

"Here is 14CU for bells and whistles, resolution etc. and here is 4 CU for physic, AI, other non GPU stuff. and also here is audio hardware and decompression hardware to help you"


And still, 1.4 is above the 1.2 rumored for Durango.
Games should look better and will have better AI and physics at the same time.
It is still 1.8 TF; just 400 GFLOPS of that is reserved to help the CPU with the hard stuff.

Imo this is better for gamers. I always choose things like cloth physics over other things.


Still, with these specs I wonder about resolution and IQ for new games. I personally hope that the APU which Sony/MS provide will get the most out of this hardware.

Exciting stuff in my opinion.
 

DopeyFish

Not bitter, just unsweetened
This is precisely why I think more RAM is needed for the PS4.

Well, it's not entirely needed; they went up 8x like expected... meshes, textures, music, sounds... they don't need to inflate by 8x...

But the more RAM you have, the less loading you have, the quicker you can transition from scene to scene, and you can create bigger game worlds like The Elder Scrolls... You move something, go across the world and come back, and it's still there. Stuff like that.

That's why it can be a toss-up.
 

Chronos24

Member
Judging by the read, it looks like quite the jump from PS3. Looking forward to seeing this machine in action! Anything on backwards compatibility? Sorry if the question has been answered; I just don't feel like sifting through all the pages...
 

jsnepo

Member
I know FLOP is some technical jargon that I have no interest in knowing, but I can't be the only one who finds it funny, especially reading it in this thread.
 
Could anyone kindly compile a list of posters who actually know what they are talking about? My bullshit-o-meter doesn't work in this area, so I'm having a hard time believing, and I want to believe!

Look to Thuway's coming at first light on the second day, and only Thuway.
Kagari is a good one too.
Proelite for anything Durango.
 

LastNac

Member
No, it isn't. I'd argue that with the shifts in the market, it's harder to increase the base price now than it was then, meaning a $500+ starting point would shrink the market further.

No way in hell it will surpass $500, but I largely think that is the target mark.
 

Reiko

Banned
Post of the thread.

Cell: iOS/Android gaming - Feeds on average people, offers cheap derivatives of existing gaming series, goes through frequent incremental upgrades.

And here's the Wii U successor after Nintendo realizes the fickleness of the casual audience:

tumblr_m4sn5lsn7P1r72ht7o11_r1_500.gif

And then there's Dreamcast 2.

boo_and_gohana1ukh.jpg
 

Razgreez

Member
Question: if this level of programming versatility for the CUs is a good thing when compared to an 18+0 arrangement, what would have been the negative of going with 0+18 instead of the rumored 14+4?

Normal AMD CUs have scalar ALU units. These stop "inefficient" code from reaching the SIMDs and execute it separately. I assume the scalar ALUs on these CUs are disabled, meaning programmers can run whatever code they want on the CUs in whichever order they wish. A potentially powerful scenario in the right hands.

Once again, I'm just speculating.
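For anyone wondering why branchy code is "inefficient" on the SIMDs in the first place, here's a toy illustration in plain Python of how a wavefront predicates both sides of a branch (the 8-lane width is made up for readability; real GCN wavefronts are 64 lanes wide):

```python
# Every lane computes BOTH sides of the branch; a per-lane mask picks the result,
# so divergent code pays for both paths.
lanes = list(range(8))                      # pretend 8-lane wavefront
mask = [v % 2 == 0 for v in lanes]          # per-lane branch condition
then_side = [v * 2 for v in lanes]          # computed even by masked-off lanes
else_side = [v + 100 for v in lanes]        # ...and likewise for the else side
result = [t if m else e for m, t, e in zip(mask, then_side, else_side)]
print(result)  # [0, 101, 4, 103, 8, 105, 12, 107]
```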
 

Spongebob

Banned
They don't have to be used for rendering; it's a GPGPU module for advanced simulations and compute tasks. Pretty graphics are just a part of the picture. Things like physics, collisions, and animations have to significantly improve as well, and having 400 GFLOPS reserved for those when your CPU is 100 GFLOPS is a really significant thing.
Using those CUs for rendering, even if possible, goes against the reason they didn't leave them in the GPU for rendering tasks in the first place.
Of course it would have been better if the GPGPU module was something extra and not carved out of the GPU.

I'm well aware of that.

Just saying that in terms of graphics rendering performance, the difference between Durango and Orbis is going to be a lot less than the 50% that was talked about before.
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
Also, $499 is not expensive at all. This is a system that will last you 6 years minimum, most likely more than that. There aren't many tech products out there that can guarantee you that. The iPhone is 6 years old now, but no one uses the 1st gen model because it's too slow.

It's why I'll be buying a next-gen system regardless of price. I'd rather buy it now than wait 2 years and save what, $100? That's nothing.

I have a feeling that for many people in today's economy, where food prices are skyrocketing, $499 for a video game console isn't going to be on their priority list. If they don't launch this at $399, we're going to be seeing yet another PS3 situation early on in its life.

I'd easily buy this at $399, though. Looks amazing.
 

ghst

thanks for the laugh
-_- Spongepls. The 4 CUs are optimized for compute. If you make things like destructibility, complex calculations, animations, lighting, etc. RELATIVELY CHEAP, then the remainder of your GPU can do the heavy lifting. This is GOOD news.

EDIT: you are right, nothing is free.

i think the prospect of donating part of what's already a less than stellar GPU to prop up the console's weak CPU isn't exactly setting people's hair on fire.
 
I have a feeling that for many people in today's economy, where food prices are skyrocketing, $499 for a video game console isn't going to be on their priority list. If they don't launch this at $399, we're going to be seeing yet another PS3 situation early on in its life.

and then it becomes like the Vita shortly after D:
 

Durante

Member
Question from a non-techie (there should be a thread for stuff like the following):

1. The occasional next-gen detractor's comments regarding "weak GPUs etc." make me want to know how much more powerful Orbis's GPU is than the PS3's. I'm sure it's by a lot, but what's a lot?
The PS3 is harder to compare, so let's use the 360. In that case, on paper, it's about 7 times faster, but GPUs have become more efficient since then, so I'd say a nice round ~10x is fair. That's very similar to a traditional generational jump. The reason people might call it weak is that this jump took far longer to achieve this time around than in previous generations.
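If you want to see where the "about 7 times" comes from, here's the paper math, using the commonly cited Xenos figures and the rumored Orbis clock:

```python
# Xbox 360 Xenos: 48 unified ALUs, each vec4+scalar (5 lanes), 2 FLOPs/op, 500 MHz.
xenos = 48 * (4 + 1) * 2 * 0.5   # = 240 GFLOPS
# Rumored Orbis: 18 GCN CUs x 64 ALUs x 2 FLOPs x 800 MHz.
orbis = 18 * 64 * 2 * 0.8        # = 1843.2 GFLOPS
print(orbis / xenos)             # ~7.7x on paper
```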

Your other questions are almost impossible to answer in a general way.
 
I have a feeling that for many people in today's economy, where food prices are skyrocketing, $499 for a video game console isn't going to be on their priority list. If they don't launch this at $399, we're going to be seeing yet another PS3 situation early on in its life.

I have a pretty good feeling we're looking at $400 - $450. I don't think we're going to see $500.
 

mrklaw

MrArseFace
Don't fixate on the "minor"; it might be a simple adjective put in there because 4 < 14. I bet some of the more experienced PS3 developers will really make those CUs sing, as they are used to pulling interesting tasks away from the CPU/GPU and running them on the SPEs.
 
These specs just make it all the stranger that Sony (and I'm assuming MS too) have supposedly settled on 1080p as the standard resolution for games rather than 720p. 720p would allow them to get much closer to that mythical next-gen jump while still retaining the weak hardware they're so desperate to keep, while providing PC gamers with the same games at 1080p. It would be a win-win for everyone: people who don't care about FPS or IQ get their next-gen jump from consoles, PC gamers get the same jump at 1080p/60 FPS, and Sony and MS get to keep their console costs cheap.
 