Xbox Series X’s Advantage Could Lie in Its Machine Learning-Powered Shader Cores, Says Quantic Dream

rnlval

Member
Where would you run more wavefronts off-chip? 😂 Again, double counting things... you have the same L1 cache feeding 7 DCUs on one side and 5 DCUs on the other... period.

You want more TFLOPS (more wavefronts/threads/operations) without additional Shader Engines (which have a bigger fixed cost HW-wise)... sure, but the tradeoff is that the L1 cache is shared across more DCUs.
Compared to the RX 5700 XT, the XSX GPU has the following:
30% increase in DCUs
25% increase in memory bandwidth, i.e. 448 GB/s to 560 GB/s
25% increase in L2 cache, i.e. 4 MB to 5 MB




For a quick Gears 5 port, the XSX rivals the RTX 2080, which is about 20% ahead of the RX 5700 XT's results. Being memory bandwidth bound is real.
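To put rough numbers on the "bandwidth bound" claim, here's a quick back-of-the-envelope script (the TFLOPS and bandwidth figures are the public peak specs; the script itself is just arithmetic, not a benchmark):

```python
# Back-of-the-envelope bytes-per-FLOP comparison using public peak specs.
specs = {
    "RX 5700 XT": {"tflops": 9.75,  "bandwidth_gbs": 448},
    "XSX GPU":    {"tflops": 12.15, "bandwidth_gbs": 560},
}

for name, s in specs.items():
    # GB/s divided by GFLOP/s gives bytes of DRAM bandwidth available per FLOP.
    bytes_per_flop = s["bandwidth_gbs"] / (s["tflops"] * 1000)
    print(f"{name}: {bytes_per_flop:.3f} bytes/FLOP")

# Both come out around ~0.046 bytes/FLOP: the bandwidth grew roughly in step with
# the ALUs, so a shader that was bandwidth bound on the 5700 XT stays bandwidth bound on XSX.
```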
 

Panajev2001a

GAF's Pleasant Genius
Compared to the RX 5700 XT, the XSX GPU has the following:
30% increase in DCUs
25% increase in memory bandwidth, i.e. 448 GB/s to 560 GB/s
25% increase in L2 cache, i.e. 4 MB to 5 MB



For a quick Gears 5 port, the XSX rivals the RTX 2080, which is about 20% ahead of the RX 5700 XT's results. Being memory bandwidth bound is real.

Keep double counting things (higher bandwidth —> more MCs —> more L2), adding charts, etc... beyond agreeing that it has more memory bandwidth and also a higher TFLOPS rating to feed, I'm not sure what's left; we can exchange platitudes and talk over each other all day. You know what is also a thing? Memory contention... one system is feeding more units off the same L1 cache than the other.

You can of course fall back on the bigger, shared L2, and if you can keep enough threads running on the chip you can hide the higher latency that the increased L1 cache misses will cause... but you are still giving a latency advantage to the GPU feeding a smaller number of CUs from the same L1 pool.
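To make the latency-hiding point concrete, here's a toy occupancy sketch (the cycle counts are invented for illustration; they are not RDNA measurements):

```python
# Toy latency-hiding model: how many wavefronts per SIMD are needed to cover a miss.
# All numbers below are invented for illustration, not measured RDNA figures.
def waves_needed(miss_latency_cycles: float, alu_cycles_between_loads: float) -> float:
    return miss_latency_cycles / alu_cycles_between_loads

# Hypothetical baseline: a miss costs 300 cycles, each wave has 20 cycles of ALU work per load.
print(waves_needed(300, 20))   # -> 15 waves in flight just to stay busy

# If more DCUs share one L1 and the miss rate rises, the *average* latency per load grows,
# so you need even more resident waves (i.e. more register/LDS headroom) to hide it.
print(waves_needed(450, 20))   # -> 22.5 waves needed at the higher effective latency
```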
 
Last edited:

Caio

Member
I also heard about machine unlearning, the technique 343i used for Halo Infinite, shown on Jul 23rd. OK, kidding aside, why haven't we seen any impressive XSX showcase yet? The hardware is there, easy to develop for, everything is ready in the hardware. Why didn't MS think about showing something really impressive on the XSX? It's a legitimate question, to say the least.
 
You need dedicated hardware to support such an enhancement. You take a bunch of tech stuff heard here and there, and you come to an overoptimistic conclusion, mixing it all in the bag without rationality. It doesn't work like that.
Wrong and extremely wrong, sir. It states in 7 books, across 13 books in totality, in the Computer Science Curriculum that AI utilization will take this generation of hardware and increase its performance 135%.

As a computer scientist/someone that has finished all the curriculum (across 13 books of coursework) I find it pertinent to inform everyone here that software improvements bolstered by AI alone (machine learning) are due to deliver upwards of 305% performance gains - with hardware this generation if utilized properly - with or without dedicated ML hardware (solely utilizing ML-optimized software).

And anyone that has read and understands the curriculum can easily vouch for that here.

We are essentially in for a "Machine Learning" Revolution... nay... a quadrillion-fold quantum leap in compute... and multiple new paradigm shifts for computing bolstered solely by machine-learning-infused software.

As anyone who has read the curriculum/passed the coursework will attest. It is blatantly plastered across 5 of 7 books. That paradigm begins with these consoles/GPUs/CPUs being released.

Particularly beginning in 2021.

Given that Microsoft has clearly stated it can dedicate a portion of its hardware to DirectML, a solution it created and hardcoded into DX12 Ultimate - I expect no less than an 86% performance increase when this feature is utilized properly.

With that said, I am also an avid Trans-humanist/Singularitarian - also something Microsoft has nothing to do with.
 
Last edited:

assurdum

Banned
I also heard about machine unlearning, the technique 343i used for Halo Infinite, shown on Jul 23rd. OK, kidding aside, why haven't we seen any impressive XSX showcase yet? The hardware is there, easy to develop for, everything is ready in the hardware. Why didn't MS think about showing something really impressive on the XSX? It's a legitimate question, to say the least.
They will. They were far behind on the SDK schedule. Honestly, I expect something in line with what we have seen or will see on PS5, maybe some refinement in some tech stuff.
 

assurdum

Banned
Wrong and extremely wrong, sir. It states in 7 books, across 13 books in totality, in the Computer Science Curriculum that AI utilization will take this generation of hardware and increase its performance 135%.

As a computer scientist/someone that has finished all the curriculum (across 13 books of coursework) I find it pertinent to inform everyone here that software improvements bolstered by AI alone (machine learning) are due to deliver upwards of 305% performance gains - with hardware this generation if utilized properly - with or without dedicated ML hardware (solely utilizing ML-optimized software).

And anyone that has read and understands the curriculum can easily vouch for that here.

We are essentially in for a "Machine Learning" Revolution... nay... a quadrillion-fold quantum leap in compute... and multiple new paradigm shifts for computing bolstered solely by machine-learning-infused software.

As anyone who has read the curriculum/passed the coursework will attest. It is blatantly plastered across 5 of 7 books. That paradigm begins with these consoles/GPUs/CPUs being released.

Particularly beginning in 2021.

Given that Microsoft has clearly stated it can dedicate a portion of its hardware to DirectML, a solution it created and hardcoded into DX12 Ultimate - I expect no less than an 86% performance increase when this feature is utilized properly.

With that said, I am also an avid Trans-humanist/Singularitarian - also something Microsoft has nothing to do with.
Can you stop saying absurdities, please? You take what MS said as the holy bible, and you mix a bunch of notions with a lot of confusion, if I may say. That's embarrassing. We would be very lucky to get a boost of 20-30% in hardware performance from software optimization, but an increase beyond that... how? It has its physical limits.
 
Last edited:

Neo_game

Member
Each DCU has a Local Data Share which scales with DCU count. XSX's 26 DCU LDS / PS5's 18 DCU LDS = ~44% advantage for XSX. LOL



More DCUs have the following:
1. more wave32 processing on the chip instead of outside the chip
2. more Texture Filter Units
3. more texture load/store units
4. more L1 cache
5. more branch & message units
6. more scalar units
7. more RT function blocks (RDNA 2)

Trips to the external memory bus have a higher cost.

A 44% advantage, yes, but since the PS5 is going to run ~22% faster than the SX it comes back to ~18% overall 🤷‍♂️
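For anyone who wants the 44% vs 22% arithmetic spelled out (using the announced clocks, 2.23 GHz peak for the PS5 and 1.825 GHz for the XSX):

```python
# The 44% -> ~18% arithmetic, using the announced CU counts and clocks.
xsx_dcus, ps5_dcus = 26, 18        # dual compute units (52 vs 36 CUs)
xsx_clk, ps5_clk = 1.825, 2.23     # GHz (PS5 figure is its variable-clock peak)

unit_advantage  = xsx_dcus / ps5_dcus        # ~1.44, i.e. +44% units for XSX
clock_advantage = ps5_clk / xsx_clk          # ~1.22, i.e. +22% clock for PS5
net = unit_advantage / clock_advantage       # ~1.18, i.e. ~+18% net throughput for XSX

print(f"+{unit_advantage - 1:.0%} units, +{clock_advantage - 1:.0%} clock, net ~+{net - 1:.0%}")
```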
 
Can you stop saying absurdities, please? You take what MS said as the holy bible. That's embarrassing.
Again, nothing I've stated in my previous post has anything to do with what 'Microsoft' taught its fans, this is Computer Science Curriculum - sir.

These software optimizations will provide these performance benefits - eventually for both consoles, and as I stated previously, anyone that has read the coursework will attest to an average performance increase of 105% due to MACHINE LEARNING. This is not science fiction and, as with all things computer science, will be proved science fact by 2021 and beyond.
 
Last edited:

assurdum

Banned
Again, nothing I've stated in my previous post has anything to do with what 'Microsoft' taught its fans, this is Computer Science Curriculum - sir.

These software optimizations will provide these performance benefits - eventually for both consoles, and as I stated previously, anyone that has read the coursework will attest to an average performance increase of 105% due to MACHINE LEARNING. This is not science fiction and, as with all things computer science, will be proved science fact by 2021 and beyond.
Based on what, exactly? You read a bunch of books, so you think any hardware with a hint of machine learning can be pushed 80% or beyond just because it uses machine learning? That's crazy stuff.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
Yes, only on NeoGaf - would Computer Science, and it's in fact 13 books of coursework if you also elect to take the philosophical courses - be scoffed at.

You can read 14 books, but nobody is seeing the results of those books in the posts you have made. You are not even quoting things without context, you are just hinting at things you could quote out of context.
 
Based on what, exactly?

AGAIN -

Computer Science.


But you know what, I'm sure others here will eventually chime in and be more than happy to politely attest to the fact that it states machine learning will clearly bolster hardware performance over 105%, across various Computer Science Curriculums.

And why wouldn't A.I. be able to perform this achievement? That is the real question. Why is this concept of A.I. bolstering hardware performance through software optimization so hard to grasp?

Because you haven't heard of it?

Because SCI-FI does not already illustrate massive advancement due to AI - without hammering it in that all of that has been borrowed verbatim from Computer Science Coursework?

That, to me - is almost as crazy as insinuating (factually stating, in this case) that machine learning will bolster future hardware over 300% through software optimization. Which it will.
 

Lysandros

Member
Keep double counting things (higher bandwidth —> more MCs —> more L2), adding charts, etc... beyond agreeing that it has more memory bandwidth and also a higher TFLOPS rating to feed, I'm not sure what's left; we can exchange platitudes and talk over each other all day. You know what is also a thing? Memory contention... one system is feeding more units off the same L1 cache than the other.

You can of course fall back on the bigger, shared L2, and if you can keep enough threads running on the chip you can hide the higher latency that the increased L1 cache misses will cause... but you are still giving a latency advantage to the GPU feeding a smaller number of CUs from the same L1 pool.
Additionally, there is the small matter of the XSX's RAM pool not being 560 GB/s in its entirety. There is the 336 GB/s part sharing the same address space to consider. Not every game can stay within the 10 GB (CPU usage included).
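A crude way to picture what the split pool does to effective bandwidth (the traffic-split fractions are made up for illustration, not profiling data):

```python
# Crude model of XSX's asymmetric memory: 10 GB at 560 GB/s, the rest at 336 GB/s.
# The traffic-split fractions are illustrative guesses, not measured data.
FAST_BW, SLOW_BW = 560.0, 336.0  # GB/s

def effective_bandwidth(fast_fraction: float) -> float:
    """Blend by time-per-byte (harmonic style), since slow-pool accesses occupy the same bus."""
    time_per_gb = fast_fraction / FAST_BW + (1.0 - fast_fraction) / SLOW_BW
    return 1.0 / time_per_gb

for frac in (1.0, 0.9, 0.75):
    print(f"{frac:.0%} of GPU traffic in the fast 10 GB -> ~{effective_bandwidth(frac):.0f} GB/s effective")
```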
 

Thirty7ven

Banned
Computer scientists who spend their time misrepresenting data and theorizing about insane results out of small improvements.

What will your degree say when multi plats are neck and neck? Wait for 2021? 2022? 2023? 2024?

Of course, somehow on one side of the argument the future is always tomorrow.
 

assurdum

Banned
AGAIN -

Computer Science.


But you know what, I'm sure others here will eventually chime in and be more than happy to politely attest to the fact that it states machine learning will clearly bolster hardware performance over 105%, across various Computer Science Curriculums.

And why wouldn't A.I. be able to perform this achievement? That is the real question. Why is this concept of A.I. bolstering hardware performance through software optimization so hard to grasp?

Because you haven't heard of it?

Because SCI-FI does not already illustrate massive advancement due to AI - without hammering it in that all of that has been borrowed verbatim from Computer Science Coursework?

That, to me - is almost as crazy as insinuating (factually stating, in this case) that machine learning will bolster future hardware over 300% through software optimization. Which it will.
It's not computer science here. You mix a lot of notions without context; machine learning on the Series X (as on the PS5) is very limited, and the hardware that handles it has to share duty with other work, it isn't completely dedicated to that purpose. That's the double-edged sword. When a console has "true" dedicated machine learning hardware, we will see how it ends up.
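For reference, the "ML hardware" on both consoles is really the existing shader ALUs running lower-precision math; a rough sketch of the peak numbers Microsoft has quoted for the Series X (peaks, not sustained rates):

```python
# The Series X "ML support" is int8/int4 dot products running on the regular shader ALUs,
# so the peak rates are just multiples of the FP32 figure (quoted peaks, not sustained rates).
fp32_tflops = 12.15
fp16_tflops = fp32_tflops * 2   # ~24.3 TFLOPS with packed FP16
int8_tops   = fp32_tflops * 4   # ~48.6 TOPS
int4_tops   = fp32_tflops * 8   # ~97.2 TOPS

print(f"FP16 ~{fp16_tflops:.1f} TFLOPS, INT8 ~{int8_tops:.1f} TOPS, INT4 ~{int4_tops:.1f} TOPS")
# Key point: it's the same silicon the game already uses for shading, so any time spent on
# ML inference comes straight out of the frame's rendering budget.
```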
 
You can read 14 books, but nobody is seeing the results of those books in the posts you have made. You are not even quoting things without context, you are just hinting at things you could quote out of context.
The results of "these books" are evident in every single piece of hardware you have ever owned since the pre-90's.

Look, I've told you all verbatim what this coursework teaches, offhand, as I am in fact a computer scientist - I don't need to reference literature I've slaved over already to produce correct responses here, but I've told you it is blatantly plastered across 5 of 7 books on the subject, a subject extending to 13 books in totality. If you're that interested, look into the computer science curriculum and get set to be amazed at the future we are approaching. With that said, I promise others here will probably vouch that this data is correct - eventually, and I'd rather look to that response from other users to resolve your misgivings than needlessly try to prove I'm not lying.

And all of this performance will be gained by machine learning specifically optimizing software, so that dedicated ML hardware is not specifically needed.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
The results of "these books" are evident in every single piece of hardware you have ever owned since the pre-90's.

Look, I've told you all verbatim what this coursework teaches, offhand, as I am in fact a computer scientist - I don't need to reference literature I've slaved over already to produce correct responses here, but I've told you it is blatantly plastered across 5 of 7 books on the subject, a subject extending to 13 books in totality. If you're that interested, look into the computer science curriculum and get set to be amazed at the future we are approaching. With that said, I promise others here will probably vouch that this data is correct - eventually, and I'd rather look to that response from other users to resolve your misgivings than needlessly try to prove I'm not lying.

5 out of 7? Darn my coursework that let me graduate with only 2... now I know why we disagree, it was the latter two.

Seriously though, I have not accused you of lying, just taking things in a way that looks all out of context and mixing it all in a blender in a way that is not conveying much, not as much as you think.
 

geordiemp

Member
Again, nothing I've stated in my previous post has anything to do with what 'Microsoft' taught its fans, this is Computer Science Curriculum - sir.

These software optimizations will provide these performance benefits - eventually for both consoles, and as I stated previously, anyone that has read the coursework will attest to an average performance increase of 105% due to MACHINE LEARNING. This is not science fiction and, as with all things computer science, will be proved science fact by 2021 and beyond.

So which of these functions will be improved by machine learning? You can reference your books if you wish.


[attached image: table of GPU functions]


No, these are not facial recognition :messenger_beaming:
 
Last edited:
p-3dconv, particularly. But that isn't the only instance, though that one's important: a 135% performance uplift due to machine learning.

edit: should I have waited a bit here, so you all think I hit Google for that? I solved it in under a minute, which I hope clears up any misconceptions.
 
Last edited:

Dolomite

Member
Overall, I think that the pure analysis of the hardware shows an advantage for Microsoft, but experience tells us that hardware is only part of the equation: Sony showed in the past that their consoles could deliver the best-looking games because their architecture and software were usually very consistent and efficient.

And this will repeat in the new gen.
Or what?😂😂
 

geordiemp

Member
p-3dconv, particularly. But that isn't the only instance, though that one's important: a 135% performance uplift due to machine learning.

edit: should I have waited a bit here, so you all think I hit Google for that? I solved it in under a minute, which I hope clears up any misconceptions.

Yeah, and I added my facial recognition humour just after...
 

Old Empire.

Member
I also heard about machine unlearning, the technique 343i used for Halo Infinite, shown on Jul 23rd. OK, kidding aside, why haven't we seen any impressive XSX showcase yet? The hardware is there, easy to develop for, everything is ready in the hardware. Why didn't MS think about showing something really impressive on the XSX? It's a legitimate question, to say the least.

Start of the next gen. There's a pandemic occurring, if you have not noticed. Many devs are working from home in 2020. I expect MS first party to shine in 2022. You swear the PS5 has multiple exclusives at launch that show off their hardware?

This is the third-party year. You're buying new consoles to play those games with better graphics.

Either way, PS5 first-party devs don't worry about the Series X power difference. It only matters to third-party devs.
 

Dnice123

Member
I also heard about machine unlearning, the technique 343i used for Halo Infinite, shown on Jul 23rd. OK, kidding aside, why haven't we seen any impressive XSX showcase yet? The hardware is there, easy to develop for, everything is ready in the hardware. Why didn't MS think about showing something really impressive on the XSX? It's a legitimate question, to say the least.
Well, MS did say they waited for all the RDNA 2 features to be completed by AMD, and their GDK dev kit wasn't complete until June.
 
Last edited:

DeeDogg_

Banned
David Cage, CEO and founder of Quantic Dream, highlighted the Xbox Series X's shader cores as more suitable for machine learning tasks, which could allow the console to perform a DLSS-like performance-enhancing image reconstruction technique.









Bbbut they’re on the M$ payroll!!
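For anyone wondering what a "DLSS-like image reconstruction" actually means in code terms, here's a bare-bones toy sketch (random placeholder weights standing in for a trained network; nothing to do with any shipping implementation):

```python
import numpy as np

# Toy sketch of ML image reconstruction: render at low res, upscale cheaply, then let a
# small "network" predict a residual correction. The 3x3 kernel is random, standing in
# for weights that would come from training; this is a conceptual sketch only.

def bilinear_upscale_2x(img: np.ndarray) -> np.ndarray:
    h, w = img.shape
    up = np.zeros((h * 2, w * 2), dtype=img.dtype)
    up[::2, ::2] = img
    up[1::2, ::2] = (img + np.roll(img, -1, axis=0)) / 2
    up[:, 1::2] = (up[:, ::2] + np.roll(up[:, ::2], -1, axis=1)) / 2
    return up

def conv3x3(img: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    padded = np.pad(img, 1, mode="edge")
    out = np.zeros_like(img)
    for dy in range(3):
        for dx in range(3):
            out += kernel[dy, dx] * padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

rng = np.random.default_rng(0)
low_res = rng.random((540, 960)).astype(np.float32)            # stand-in for a 960x540 luma buffer
kernel = (rng.standard_normal((3, 3)) * 0.1).astype(np.float32)  # placeholder for trained weights

upscaled = bilinear_upscale_2x(low_res)               # cheap spatial upscale to 1080p
reconstructed = upscaled + conv3x3(upscaled, kernel)  # "network" adds detail as a residual
print(reconstructed.shape)                            # (1080, 1920)
```

The real thing would run a trained network (which on the Series X could use the int8/int4 rates that DirectML can target), but the structure of a cheap upscale plus a learned correction is the general idea.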
 

quest

Not Banned from OT
You need dedicated hardware to support such an enhancement. You take a bunch of tech stuff heard here and there, and you come to an overoptimistic conclusion, mixing it all in the bag without rationality. It doesn't work like that.
Considering they will have ML at launch, you're being disingenuous. It probably won't ever be upscaling like Nvidia's, but there will be other uses. If anyone can find a way to make ML upscaling more efficient, it will be Microsoft. I see it as a cool new feature that will get more use as the generation goes on. It is sad that it is being downplayed because Sony is not talking it up like SSD SSD, Tempest Tempest, controller controller.
 

assurdum

Banned
Start of the next gen. There's a pandemic occurring, if you have not noticed. Many devs are working from home in 2020. I expect MS first party to shine in 2022. You swear the PS5 has multiple exclusives at launch that show off their hardware?

This is the third-party year. You're buying new consoles to play those games with better graphics.

Either way, PS5 first-party devs don't worry about the Series X power difference. It only matters to third-party devs.
I never heard that third parties are worried about the power difference between the two consoles.
 
Last edited:

alstrike

Member
I know, right? If I were a dev whose last 3 projects were Sony exclusives, I'd lie for MS too. It's not like I know more about system architecture than a GAF user 🙃

I'll take "I don't give a flying fuck what David Cage says or does" for $200, Alex.
 

assurdum

Banned
Considering they will have ML at launch, you're being disingenuous. It probably won't ever be upscaling like Nvidia's, but there will be other uses. If anyone can find a way to make ML upscaling more efficient, it will be Microsoft. I see it as a cool new feature that will get more use as the generation goes on. It is sad that it is being downplayed because Sony is not talking it up like SSD SSD, Tempest Tempest, controller controller.
Machine learning is on the PS5 too. Cerny mentioned it in the Road to PS5 talk. But I doubt you ever cared to know, since all you heard about the PS5 was SSD, Tempest, or the controller... neither machine will offer something significant in that sense. There is no exclusive dedicated hardware for it as on Nvidia.
 
Last edited:

alstrike

Member
I mean, you are here..... in a David Cage thread, reacting to his quotes.... giving away your fucks like they're going out of style.
Truth is truth, fucks be damned



I'll take "someone reputable said something I don't like, therefore I act like it is not reliable info" for $1000, Alex!

What's there to like or not to like?

Edit: Oh snap, looking at your post history you should be in my Hall of Fame. Congratulations, you just made it! Better late than never...
 
Last edited:

Dolomite

Member




What's there to like or not to like?

Edit: Oh snap, looking at your post history you should be in my Hall of Fame. Congratulations, you just made it! Better late than never...
Lmao, now who's being obtuse 🙃
First you say "BS for $100", admitting your doubt-driven denial. Then you go for the lazy ad hom: "who gives AF what David Cage has to say", which really means "he's right but I'll dismiss his truth because I don't like him". Pick a struggle, man.
 

quest

Not Banned from OT
Machine learning is on the PS5 too. Cerny mentioned it in the Road to PS5 talk. But I doubt you ever cared to know, since all you heard about the PS5 was SSD, Tempest, or the controller... neither machine will offer something significant in that sense. There is no exclusive dedicated hardware for it as on Nvidia.
Exactly, it is a feature that will be used because both sides have it. That is why there should be no downplaying of it. Sony should be front and center with it to push it.
 

assurdum

Banned
Exactly, it is a feature that will be used because both sides have it. That is why there should be no downplaying of it. Sony should be front and center with it to push it.
In what way is it downplayed? We are talking about it. But let's not pretend we're talking about something miraculous just because MS sells it that way. It's MS, if I'm not wrong, that started this meme of an incredible and unprecedented achievement with ML and the new DirectX, but it's not even fully in hardware; there are a lot of ifs around its benefit.
 
Last edited:

quest

Not Banned from OT
In what way is it downplayed? We are talking about it. But let's not pretend we're talking about something miraculous just because MS sells it that way.
Miracle? Lol, they are selling it as a cool feature, see auto HDR. You get too easily triggered by a company that will have 25-30% market share. Sony will still win with ease, no reason to be triggered.
 

assurdum

Banned
Miracle? Lol, they are selling it as a cool feature, see auto HDR. You get too easily triggered by a company that will have 25-30% market share. Sony will still win with ease, no reason to be triggered.
Sure, convincing people they will get an 80% boost in performance is the right way to keep the userbase's attitude cool and genuine; I've seen the result. They are not new to such practices, like selling TF as the only factor for valuing console specs.
 
Last edited:

01011001

Banned




What's there to like or not to like?

Edit: Oh snap, looking at your post history you should be in my Hall of Fame. Congratulations, you just made it! Better late than never...

My post history of countering bullshit? Yeah, I know I should.
 
Yeah, and I added my facial recognition humour just after...
Yes, P3MM is also scheduled for massive gains solely due to machine learning, but while everyone expects caching will see large performance boosts, not everyone believes ML software optimizations will yield large performance gains.

I'll leave the other 2 I spotted here a mystery in case other users want a go.
 
Last edited: