
Inside the Scorpio Engine: the processor architecture deep dive

timlot

Banned
Why are we milking the FP16 thing so hard? Is it a game changer?

Doesn't seem like it. It's something that Scorpio doesn't have, at least in double-rate form. Therefore Sony must have gotten one over on Microsoft and the first-year computer science major rejects they have over there in Redmond running their silicon department. Turn off the grunge music and put down the pot, Microsoft.
 

godhandiscen

There are millions of whiny 5-year olds on Earth, and I AM THEIR KING.
The quote just shows that, aside from the improvements from GCN 1.0 to GCN 4.0, everything is exactly like the XB1 and XB1S.

And the promise about 900p to 4K is just that... a promise... let's wait and see a real example.

I will be the first to give the proper congrats if they show Halo 5 or Gears 4 at native 4K.


Not in the gaming space, to be fair.

FP16 is a game changer for computational tasks... that is why the Tesla P100 has been doing it at 2x rate for years.

Dude, FP16 has been around for a while. I used it in college back in 2007. In fact, I think Microsoft and Nvidia came up with it. It isn't a breakthrough of this generation; it is an option. The reason it got skipped previously is that there were very few uses for it. 16-bit floating-point precision is hard to deal with, and no current engine uses it AFAIK. Like I mentioned, you have to design your shader to take advantage of it.

It is a forward-thinking feature, and if it becomes common, perhaps some engines will start using it, but I doubt anybody but Sony will make an effort this gen.
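(A quick aside on the "hard to deal with" point: half precision has only 10 mantissa bits and a max value of 65504, so it breaks in ways FP32 doesn't. A minimal sketch using numpy's float16, which follows the same binary16 format; none of this code is from the article.)

```python
import numpy as np

# float16: 1 sign bit, 5 exponent bits, 10 mantissa bits.
# Roughly 3 decimal digits of precision, max value 65504.
a = np.float16(2048.0)
b = np.float16(1.0)
print(a + b)                      # 2048.0 -- the 1.0 vanishes; representable
                                  # values are 2.0 apart at this magnitude
print(np.float16(70000.0))        # inf -- overflows half precision entirely
print(np.finfo(np.float16).eps)   # 0.000977, vs ~1.19e-07 for float32
```

Which is why shaders have to be designed around it: fine for colors and normals, risky for positions and long-running accumulations, which usually stay in FP32.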
 

KageMaru

Member
Here:


The 900p games scaling to 4K is a promise seen only in their profiler.

Scorpio doesn't have the TF to brute-force 900p games up to 4K, but they have achieved it thanks to the additional customizations they added to the hardware as a whole.

I don't think this is anything unique that MS did. They are probably referring to the improvements made to Polaris in general and stating how it benefits Scorpio.
 

tuxfool

Banned
No, no, we're blitting our shader microcode to DCC-encoded RTs so that it's losslessly encoded and we get major bandwidth savings on the fetches, reducing I$ stalls, which is how we get to 12 TFLOPS.

smirk

Exactly... FP16 uses less memory and bandwidth than FP32.

If you can do it at twice the rate (which GCN 5.0, the P100 and mobile GPUs do), then it is the best hardware scenario available today.

...I'm facepalming.
You realise you're agreeing with nonsense designed to catch fools.
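(The memory/bandwidth half of that claim is easy to verify, at least; here's a tiny sketch with a numpy array standing in for a GPU buffer.)

```python
import numpy as np

# The same million values at half vs. single precision:
half   = np.zeros(1_000_000, dtype=np.float16)
single = np.zeros(1_000_000, dtype=np.float32)
print(half.nbytes, single.nbytes)   # 2000000 4000000 -- half the footprint,
                                    # and half the bandwidth to move it
```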
 

godhandiscen

There are millions of whiny 5-year olds on Earth, and I AM THEIR KING.
I don't think this is anything unique that MS did. They are probably referring to the improvements made to Polaris in general and stating how it benefits Scorpio.
He is talking specifically about their custom hardware programmability. Otherwise they would have mentioned the benefits of jumping to a new GPU architecture, which they did not.
 

wachie

Member
Another big exclusive ultra-mega "hardcore" technical article from DF. Bravo! So much technical jargon OMG!

Digital Foundry said:
It's at this point in the presentation where it's clear that the hopes of the most hardcore users weren't going to be realised - there is no Ryzen technology in Project Scorpio.
Still believe an R7 1800X is hiding in the second layer. You betcha!

Digital Foundry said:
However, looking at the layout of the Scorpio Engine, the proportion of the space occupied by the GPU dwarfs the CPU area. The ratio is much, much larger than both Xbox One and PlayStation 4.
Of course. It's a much bigger GPU than the PS4 and PS4 Pro, paired with the same weak Jaguar cores, albeit running at a higher frequency. The imbalance is even more obvious.

Digital Foundry said:
The end result is that not only does the CPU run faster, it also runs more efficiently, meaning more power for you at the end."
Expected. It was widely expected that Jaguar running at higher speeds would consume exponentially more power; it seems we can expect the CPU power usage to be similar to that of the PS4 Pro.

Digital Foundry said:
According to Goossen, some performance optimisations from the upcoming AMD Vega architecture factor into the Scorpio Engine's design, but other features that made it into PS4 Pro - for example, double-rate FP16 processing - do not.
Such as...?
Digital Foundry said:
However, customisation was extensive elsewhere.
Microsoft:Trollface:

Digital Foundry said:
We were 853MHz in Xbox One, we dialed it up to 1.172 GHz (1172MHz). That's a 37 per cent increase in clock, more than our CPU clock relatively. The next big one: we have 40 CUs. When you take 1172 multiplied by 40 multiplied by 64 for ops multiplied by 2 FLOPS per op, you get exactly 6.0TF.
Digital Foundry said:
On Scorpio we are using a 384-bit GDDR5 interface - that is 12 channels. Each channel is 32 bits, and then 6.8GHz on the signalling so you multiply those up and you get the 326GB/s."
Holy shit. This is technical overload. Those disclaimers at the beginning of the article were not enough!

All in all, nothing technically new was learned from this DF article aside from the usual buzzwords: "custom", "profiling", "extensive", "to the METAL", "engines". I would term it a "behind the scenes" interview and nothing more.
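(For anyone who found the arithmetic in those two quotes heavy going, it does check out and fits in a few lines of Python; the input numbers are taken straight from the quotes above.)

```python
# Compute: clock (Hz) x CUs x 64 shader lanes per CU x 2 FLOPs per
# lane per clock (a fused multiply-add counts as two operations).
clock_hz = 1172e6
cus = 40
flops = clock_hz * cus * 64 * 2
print(f"{flops / 1e12:.2f} TFLOPS")    # 6.00

# Bandwidth: 12 GDDR5 channels x 32 bits each = a 384-bit bus,
# at 6.8 Gbps effective per pin.
bus_bits = 12 * 32
bytes_per_sec = bus_bits / 8 * 6.8e9
print(f"{bytes_per_sec / 1e9:.1f} GB/s")   # 326.4
```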
 
I didn't understand a word of that, but I'm pretty sure it was tongue-in-cheek because of that 12TF figure.

Yes, we are satirizing some of the insanity in this thread, although all of those terms are part of the GCN architecture and make some relative level of sense. (Semi-related fun fact about abusing the rasterization hardware for other means: the RSX used to do fragment program patching using draws.)

So what are your thoughts on the FP16 debate?

I discussed some of this earlier in the Switch threads.
 

godhandiscen

There are millions of whiny 5-year olds on Earth, and I AM THEIR KING.
Another big exclusive ultra-mega "hardcore" technical article from DF. Bravo! So much technical jargon OMG!

[...]

All in all, nothing technically new was learned from this DF article aside from the usual buzzwords: "custom", "profiling", "extensive", "to the METAL", "engines". I would term it a "behind the scenes" interview and nothing more.

How things change. They were "experts" while Sony had the technological advantage.
 

godhandiscen

There are millions of whiny 5-year olds on Earth, and I AM THEIR KING.
Please point out when I claimed DF were experts. Otherwise...
Their understanding has always been shallow, just more in-depth than IGN and the like. The main difference is that the tables have turned against the biggest fan base, and against people who already made a purchase; therefore their technical credibility is coming into question.
 

wachie

Member
While none of this may go over your head, I know plenty of mainstream gamers who would be lost on some of this.
I'm pretty sure the DF part of Eurogamer is mainly for the non-casuals, for people who tend to venture beyond "which is more powerful". My point was that this was a big load of buzzword-palooza and they didn't actually give out anything new. In fact, they claimed something (features from Vega being used in Scorpio) but then purposely deflected by not saying what they were.

Their understanding has always been shallow, just more in-depth than IGN and the like. The main difference is that the tables have turned against the biggest fan base, and against people who already made a purchase; therefore their technical credibility is coming into question.
You're welcome to call them out and not quote me.
 

KageMaru

Member
I'm pretty sure the DF part of Eurogamer is mainly for the non-casuals, for people who tend to venture beyond "which is more powerful". My point was that this was a big load of buzzword-palooza and they didn't actually give out anything new. In fact, they claimed something (features from Vega being used in Scorpio) but then purposely deflected by not saying what they were.

While you're right that DF is usually for the non-casual, you don't have to be casual to have this stuff go over your head. I have friends who are no less gamers than anyone on GAF who read these articles and ask me to translate some of it. Seeing how their earlier reveal was trending, I imagine these early reveals are bringing more attention to their site than usual. Just the math on how they reached 6TF can be a bit much for some people.

As for referencing things but not going into detail, I'm disappointed too. Unless this is the end of their coverage/reveals, I bet they are just drip-feeding us information, and hopefully we'll eventually learn about these 60 customizations and the Vega features in the GPU.
 

ethomaz

Banned
Dude, FP16 has been around for a while. I used it in college back in 2007. In fact, I think Microsoft and Nvidia came up with it. It isn't a breakthrough of this generation; it is an option. The reason it got skipped previously is that there were very few uses for it. 16-bit floating-point precision is hard to deal with, and no current engine uses it AFAIK. Like I mentioned, you have to design your shader to take advantage of it.

It is a forward-thinking feature, and if it becomes common, perhaps some engines will start using it, but I doubt anybody but Sony will make an effort this gen.
FP16 has existed since computers were created (edit: to be accurate, since 1982).

GPUs in the past worked only with FP16... the Radeon 9000 came with FP24, and later nVidia and ATI moved to FP32, which became the standard.

That is nothing new.

FP16 running twice as fast as FP32 hasn't been seen since the GeForce FX 5200, when those cards were crushed due to low FP32 performance.

That is the new feature found in Vega and the Pro... they can run 2x FP16 instructions instead of 1x FP32, delivering twice the FLOPS... you are confusing GPUs that support FP16 (all of them do) with GPUs that run FP16 faster than FP32 (only the upcoming Vega, the Pro, mobile GPUs and Tesla).

That doesn't mean FP16 is better for games than FP32... it is twice as fast with half the memory/bandwidth use, but in the end few cases can take advantage of FP16 without loss of quality.

BTW, it was never skipped... games in the past were coded in FP16 only... tech evolves, and it was replaced by a better-quality option (FP32), which will be replaced in the distant future by FP64.
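(A rough illustration of what that 2x actually means: two FP16 values fit in one 32-bit register, so a double-rate GPU can issue a single instruction that operates on both halves at once. The numpy sketch below only emulates the idea lane-wise; on real hardware it's a packed ALU instruction, not software.)

```python
import numpy as np

# Two FP16 values occupy the same 32 bits as one FP32 value.
pair = np.array([1.5, -2.25], dtype=np.float16)
packed = pair.view(np.uint32)[0]    # both halves in a single 32-bit word
print(f"0x{packed:08x}")

# A double-rate ALU applies one op to both halves per clock --
# same register width, twice the elements per instruction.
result = pair * np.float16(2.0)
print(result)                       # [ 3.  -4.5]
```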
 

jaypah

Member
Their understanding has always been shallow, just more in-depth than IGN and the like. The main difference is that the tables have turned against the biggest fan base, and against people who already made a purchase; therefore their technical credibility is coming into question.

The people calling out DF now could have very well been busy when DF were discussing the Pro. You never know, life comes at you fast.
 

Space_nut

Member
While you're right that DF is usually for the non-casual, you don't have to be casual to have this stuff go over your head. I have friends who are no less gamers than anyone on GAF who read these articles and ask me to translate some of it. Seeing how their earlier reveal was trending, I imagine these early reveals are bringing more attention to their site than usual. Just the math on how they reached 6TF can be a bit much for some people.

As for referencing things but not going into detail, I'm disappointed too. Unless this is the end of their coverage/reveals, I bet they are just drip-feeding us information, and hopefully we'll eventually learn about these 60 customizations and the Vega features in the GPU.

I'm sure Build 2017 and GDC 2018 will help expand on this more, too.
 
Yes, we are satirizing some of the insanity in this thread, although all of those terms are part of the GCN architecture and make some relative level of sense. (Semi-related fun fact about abusing the rasterization hardware for other means: the RSX used to do fragment program patching using draws.)



I discussed some of this earlier in the Switch threads.

What are your thoughts on the ID buffer?
 

onQ123

Member
FP16 has existed since computers were created.

GPUs in the past worked only with FP16... the Radeon 9000 came with FP24, and later nVidia and ATI moved to FP32, which became the standard.

That is nothing new.

FP16 running twice as fast as FP32 hasn't been seen since the GeForce FX 5200, when those cards were crushed due to low FP32 performance.

That is the new feature found in Vega and the Pro... they can run 2x FP16 instructions instead of 1x FP32, delivering twice the FLOPS.

That doesn't mean FP16 is better for games than FP32... it is twice as fast with half the memory/bandwidth use, but in the end few cases can take advantage of FP16 without loss of quality.

BTW, it was never skipped... games in the past were coded in FP16 only... tech evolves, and it was replaced by a better-quality option (FP32), which will be replaced in the distant future by FP64.

No it didn't
 

Colbert

Banned
So here is what we know so far.

Scorpio has a 6 inch penis with decent girth.

PS4 Pro has a 4 inch penis with not as much girth.

However, during certain scenarios PS4 Pro can double its length to 8 inches.

ice cold

and the p shrinks down to its normal size
 

ethomaz

Banned
Are you sure that you're talking about fp16 & not some other 16-bit format?
FP16 is a binary floating-point format that occupies 16 bits... F is float, P is point, and 16 is 16 bits.

You can call it binary16, float16, half-float, half-precision, FP16, etc.
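(For reference, the binary16 layout behind all those names is 1 sign bit, 5 exponent bits and 10 mantissa bits, with an exponent bias of 15. A small sketch that pulls the fields apart; decode_fp16 is just an illustrative helper, and it ignores the special cases for subnormals and infinities.)

```python
import numpy as np

def decode_fp16(x: float) -> str:
    """Split a half-float into its sign / exponent / mantissa fields."""
    bits = int(np.array(x, dtype=np.float16).view(np.uint16))
    sign     = (bits >> 15) & 0x1
    exponent = (bits >> 10) & 0x1F    # 5 bits, bias 15 (normal numbers)
    mantissa = bits & 0x3FF           # 10 bits
    return f"sign={sign} exponent={exponent - 15:+d} mantissa=0x{mantissa:03x}"

print(decode_fp16(1.5))     # sign=0 exponent=+0 mantissa=0x200
print(decode_fp16(-2.25))   # sign=1 exponent=+1 mantissa=0x080
```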
 

longdi

Banned
While you're right that DF is usually for the non-casual, you don't have to be casual to have this stuff go over your head. I have friends who are no less gamers than anyone on GAF read these articles and ask me to translate some of it. Seeing how their earlier reveal was trending, I imagine these early reveals are bringing more attention to their site than usual. Just the math on how they reached 6TF can be a bit much for some people.

As for referencing things but not going into detail, I'm disappointed too. Unless this is the end of their coverage/reveals, I bet they are just drip feeding us information and hopefully we'll eventually learn about these 60 customizations and Vega features in the GPU.

I would blame both MS and DF, imo.

Just like their earlier Scorpio articles, all the techie parts read like a PR doc from MS. I hope people see the marketing effort for what it is. There has been no 'deep dive' in any of these reveal articles.

At least when Cerny spoke about the PS4 Pro (which was still PR), he gave insights into their hardware choices.
 

quest

Not Banned from OT
I would blame both MS and DF, imo.

Just like their earlier Scorpio articles, all the techie parts read like a PR doc from MS. I hope people see the marketing effort for what it is. There has been no 'deep dive' in any of these reveal articles.

At least when Cerny spoke about the PS4 Pro (which was still PR), he gave insights into their hardware choices.

I agree both are to blame. DF should have pushed on the specific Vega features and the "60 customizations" PR. MS is way too defensive at this point. Just be straightforward even if it is not awesome. Stand up, answer the question, and explain the design choices. There's no shame in the GPU being a standard Polaris with a bigger memory bus. This is the reason MS has such a bad reputation.
 

wachie

Member
I'm glad I was not the only person calling out this shit, from the first page.
Not much of a deep dive when they don't even talk about what customization they did to the CPU or GPU.
Yet they don't seem to actually articulate what all these customizations are beyond the superficial.
This is literally the same article as before. There is nothing in this article that is new.
This was a pretty long article that read super choppy with all the quotes, in which we actually learned very little we didn't already know.
 

onQ123

Member
FP16 is a binary floating-point format that occupies 16 bits... F is float, P is point, and 16 is 16 bits.

You can call it binary16, float16, half-float, half-precision, FP16, etc.

I know but you said it's been around since computers existed
 

ethomaz

Banned
I know but you said it's been around since computers existed
Yeah, that was inaccurate, but I meant more like the 1980s, when processors started to get some kind of FP16 support, which lines up with when consumer computers started to appear.

In any case it was a reply about FP16 being nothing new (it is not; even older GPUs supported FP16 at most), and what is changing now is that gaming GPUs are getting FP16 units that are twice as fast, which makes it an option over FP32, instead of the current scenario where AMD runs it at the same speed (Polaris) and nVidia at a lower speed (Pascal GeForce).
 

Colbert

Banned
FP16 is a binary floating-point format that occupies 16 bits... F is float, P is point, and 16 is 16 bits.

You can call it binary16, float16, half-float, half-precision, FP16, etc.

I know but you said it's been around since computers existed

Single- and double-precision floating point numbers were introduced by IEEE 754 in 1985 as an industry standard, which was revised in 2008 with the addition of a half-precision floating-point format (FP16).

FP16 (half-precision) did not exist as an industry standard before 2008.
 

Locuza

Member
I'm glad I was not the only person calling out this shit, from the first page.
We learned that there is no double-rate FP16 on board, which is quite a significant piece of information.
And since I and others are "enjoying" AMD's marketing strategy, I'm quite used to getting one piece after another after another, every goddamn month or two.
 

Colbert

Banned
Yeah, that was inaccurate, but I meant more like the 1980s, when processors started to get some kind of FP16 support, which lines up with when consumer computers started to appear.

In any case it was a reply about FP16 being nothing new (it is not; even older GPUs supported FP16 at most), and what is changing now is that gaming GPUs are getting FP16 units that are twice as fast, which makes it an option over FP32, instead of the current scenario where AMD runs it at the same speed (Polaris) and nVidia at a lower speed (Pascal GeForce).

Single- and double-precision floating point numbers were introduced by IEEE 754 in 1985 as an industry standard, which was revised in 2008 with the addition of a half-precision floating-point format (FP16).

FP16 (half-precision) did not exist as an industry standard before 2008.
 

wachie

Member
We learned that there is no double-rate FP16 on board, which is quite a significant piece of information.
And since I and others are "enjoying" AMD's marketing strategy, I'm quite used to getting one piece after another after another, every goddamn month or two.
We also learned that it has no Ryzen, which we also knew. Again, they keep dangling this carrot that there are "Vega customizations" but fail to say what they are. It's just a regurgitation of what we all knew (& suspected), plus confirmation that Jaguar + Polaris are the base architectures.

Tbh, the only surprising bit of reveal so far was the FreeSync support.
 

ethomaz

Banned
Single- and double-precision floating point numbers were introduced by IEEE 754 in 1985 as an industry standard, which was revised in 2008 with the addition of a half-precision floating-point format (FP16).

FP16 (half-precision) did not exist as an industry standard before 2008.
GPUs before that worked with FP16.

3DFX Voodoo
nVidia GeForce
ATI Radeon

Games were coded in FP16 way before the IEEE standard, which means nothing at all here... computers in the 1980s used FP16.

To be more specific, FP32 was included in the GeForce FX 5000 series to replace FP16, but it was so slow that games ran better in FP16 mode... the Radeon 9000 introduced FP24 to replace FP16 and had success... that happened in 2001-2002.

Half-Life 2 can run in either FP32 or FP16 mode (the only way to get good performance with a GeForce FX was using FP16 mode)... that is a 2004 game.

Pretty sure FP16 existed and was used way before 2008 lol
 

Locuza

Member
[...]
Tbh, the only surprising bit of reveal so far was the FreeSync support.
I think differently; the fact that Rapid Packed Math isn't included is quite a surprise to me.
I didn't expect that, given that the PS4 Pro got it and it's coming with Vega.
 

Dynomutt

Member
So Sony cheaped out by using tech from the 80's? Wow... Probably the same guys who didn't put in a 4K Blu-ray drive.
 
I think differently; the fact that Rapid Packed Math isn't included is quite a surprise to me.
I didn't expect that, given that the PS4 Pro got it and it's coming with Vega.

Explain to me why you think this is a big deal. We've just had several pages of back and forth from what I'm hoping are intelligent people where the conclusion is that it isn't. From what I can tell anyway.
 

quest

Not Banned from OT
I think differently; the fact that Rapid Packed Math isn't included is quite a surprise to me.
I didn't expect that, given that the PS4 Pro got it and it's coming with Vega.

Why? It has such limited use that it's not worth the die space for them. It would be the least desirable Vega feature, imo. What shocked me is that there are zero major Vega features in Scorpio. It looks like the extra year was spent making the development environment great. It could have easily launched against the Pro, hardware-wise.
 

Colbert

Banned
GPUs before that worked with FP16.

3DFX Voodoo
nVidia GeForce
ATI Radeon

Games were coded in FP16 way before the IEEE standard, which means nothing at all here... computers in the 1980s used FP16.

To be more specific, FP32 was included in the GeForce FX 5000 series to replace FP16, but it was so slow that games ran better in FP16 mode... the Radeon 9000 introduced FP24 to replace FP16 and had success... that happened in 2001-2002.

Half-Life 2 can run in either FP32 or FP16 mode (the only way to get good performance with a GeForce FX was using FP16 mode)... that is a 2004 game.

Pretty sure FP16 existed and was used way before 2008 lol

The Voodoo was introduced in the late 90's. Are you sure you're not confusing the 80's with the 90's? If you are as old as I am, you may remember that in the late 80's we were talking about VGA/EGA/Hercules cards (8-bit, most of them). 16-bit processors came a little later, and even then without any floating point. You needed a co-processor. Remember that?

Floating point had to be implemented entirely in software, which required an industry standard if products from different manufacturers were to talk to and understand each other correctly!

If FP16 was used before 2008, it was completely proprietary to a manufacturer's product line.

Imagine a voice-over like in Babylon 5: "I was there when it all happened."
 
Yeah, MS should apologize for having the more powerful console this time, because it's making some people really unhappy, and they need to type a bunch of words just to deny the truth.
 

ethomaz

Banned
The Voodoo was introduced in the late 90's. Are you sure you're not confusing the 80's with the 90's? If you are as old as I am, you may remember that in the late 80's we were talking about VGA/EGA/Hercules cards (8-bit, most of them). 16-bit processors came a little later, and even then without any floating point. You needed a co-processor. Remember that?
Yes... in the GPU space, but there were processors (and compilers for them) with FP16 support in the 80's.

The point is that FP16 is not new... what's different here is it running natively at twice the speed of FP32 in gaming GPUs.

That of course doesn't change the fact that Scorpio has more raw power than the Pro.
 
Who says they'll still go with AMD?

Because AMD is still the only player in town that can offer an x86 SOC with a powerful GPU?

Unless magically Nvidia (which also means MS working with Nvidia again, and that means not having nearly as much control over GPU fabrication as they do now) gets the x86 license (they won't) and can make a CPU as powerful as Zen/Intel that uses little power (they can't). And Intel's GPU solutions are lol
 

ethomaz

Banned
Because AMD is still the only player in town that can offer an x86 SOC with a powerful GPU?

Unless magically Nvidia (which also means MS working with Nvidia again, and that means not having nearly as much control over GPU fabrication as they do now) gets the x86 license (they won't) and can make a CPU as powerful as Zen/Intel that uses little power (they can't). And Intel's GPU solutions are lol
That is a good question for the next generation, because there is a lot of talk about moving to ARM.

ARM + a strong GPU could become a thing for next gen over x86 + a strong GPU... it was an option before this gen started, too.
 

Colbert

Banned
Yes... in the GPU space, but there were processors (and compilers for them) with FP16 support in the 80's.

The point is that FP16 is not new... what's different here is it running natively at twice the speed of FP32 in gaming GPUs.

That of course doesn't change the fact that Scorpio has more raw power than the Pro.

I edited my post; you may want to look at it again. Besides that, of course the Scorpio is more powerful than the PS4 Pro, and nothing will close this gap, just as nothing closed the gap the PS4 had over the Xbox One. It is very interesting to see how the discussion has turned 180 degrees, with roughly the same arguments about "secret sauce".

Happy Easter!
 

wachie

Member
I edited my post; you may want to look at it again. Besides that, of course the Scorpio is more powerful than the PS4 Pro, and nothing will close this gap, just as nothing closed the gap the PS4 had over the Xbox One. It is very interesting to see how the discussion has turned 180 degrees, with roughly the same arguments about "secret sauce".

Happy Easter!
Don't forget that these "customizations" we don't know about etc. also fall into the same sauce category.
 