
The PlayStation 5 GPU Will Be Supported By Better Hardware Solutions, In Depth Analysis Suggests

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
Against "the Xbox SX either has only 7.5 GB of interleaved memory operating at 560 GB/s for game utilisation", argument.

From https://www.eurogamer.net/articles/digitalfoundry-2020-inside-xbox-series-x-full-specs




[Image from the Digital Foundry article: Xbox Series X memory configuration]


When a game is running full screen, the OS front-end rendering is suspended.

James Prendergast fails to understand that the CPU is not pretending to be a half-baked GPU like the PS3's CELL SPUs, i.e. CPU and GPU memory bandwidth intensities are different; e.g. the computation intensity ratio between the XSX's CPU and GPU is roughly 1 to 13, hence Prendergast's averaging argument is flawed.

Prendergast's argument is also flawed when the CPU is the bottleneck, i.e. it is similar to the argument for why the XBO GPU's 1.32 TFLOPS with 200+ GB/s (ESRAM) memory bandwidth was nearly useless against the PS4 GPU's 1.84 TFLOPS with 176 GB/s memory bandwidth.


Targeting a 60 Hz game loop, the programmer slices the machine's bandwidth potential into per-frame budgets:

PS5:

448 GB/s turns into 7.467 GB of potential budget per 16.7 ms frame



XSX:

336 GB/s turns into 5.60 GB of potential budget per 16.7 ms frame. 3.5 GB of game-usable memory storage in this pool

560 GB/s turns into 9.33 GB of potential budget per 16.7 ms frame. 10 GB memory storage in this pool

2.86:1 memory storage ratio between the 10 GB and 3.5 GB pools

As long as the XSX doesn't split frame time equally between the 5.6 GB and 9.33 GB pools, the XSX has the advantage.
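To make the conversion explicit, here is a minimal Python sketch of the arithmetic above (the bandwidth figures are the ones quoted in this post; the 60 Hz frame time is the stated assumption):

FRAME_TIME_S = 1.0 / 60.0  # ~16.7 ms per frame at 60 Hz

def per_frame_budget(bandwidth_gb_per_s):
    # Sustained bandwidth sliced into a single 60 Hz frame
    return bandwidth_gb_per_s * FRAME_TIME_S

print(per_frame_budget(448.0))  # PS5:            ~7.467 GB per frame
print(per_frame_budget(560.0))  # XSX 10 GB pool: ~9.333 GB per frame
print(per_frame_budget(336.0))  # XSX 6 GB pool:  ~5.600 GB per frame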



Scenario 1

15% of 16.7 ms for the 5.6 GB pool = 0.84 GB

85% of 16.7 ms for the 9.33 GB pool = 7.93 GB

Frame total bandwidth budget: 8.77 GB

XSX has 17.5% per frame BW advantage over PS5





Scenario 2

10% of 16.7 ms for the 5.6 GB pool = 0.56 GB

90% of 16.7 ms for the 9.33 GB pool = 8.397 GB

Frame total bandwidth budget: 8.957 GB

XSX has 20% per frame BW advantage over PS5



Scenario 3

5% of 16.7 ms for the 5.6 GB pool = 0.28 GB

95% of 16.7 ms for the 9.33 GB pool = 8.8635 GB

Frame total bandwidth budget: 9.1435 GB

XSX has 22.5% per frame advantage over PS5



Scenario 4

2% of 16.7 ms for the 5.6 GB pool = 0.112 GB

98% of 16.7 ms for the 9.33 GB pool = 9.1434 GB

Frame total bandwidth budget: 9.2554 GB

XSX has 24% per frame advantage over PS5
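The four scenarios above are just different time splits between the two pools; a short Python sketch along the same lines (same figures as above, nothing new assumed) reproduces them:

PS5_FRAME_GB = 448.0 / 60.0   # ~7.467 GB per frame
XSX_SLOW_GB  = 336.0 / 60.0   # ~5.600 GB per frame (slower pool)
XSX_FAST_GB  = 560.0 / 60.0   # ~9.333 GB per frame (10 GB pool)

for slow_share in (0.15, 0.10, 0.05, 0.02):  # Scenarios 1-4
    frame_total = slow_share * XSX_SLOW_GB + (1.0 - slow_share) * XSX_FAST_GB
    advantage = (frame_total / PS5_FRAME_GB - 1.0) * 100.0
    print(f"{slow_share:.0%} in slow pool: {frame_total:.2f} GB/frame, "
          f"~{advantage:.1f}% over PS5")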





Assigning CPU and GPU memory bandwidth per frame

Scenario A


Let the CPU consume 0.85 GB per 16.7 ms frame, similar to a 16.7 ms frame slice of a 51 GB/s 128-bit DDR4-3200 PC config

PS5: 6.617 GB available to the GPU per frame

XSX: 7.92 GB available to the GPU per frame, from the 8.77 GB total (Scenario 1)

XSX GPU has 19.7% memory bandwidth advantage per frame over PS5 GPU



Scenario B

Let the CPU consume 0.85 GB per 16.7 ms frame, similar to a 16.7 ms frame slice of a 51 GB/s 128-bit DDR4-3200 PC config

PS5: 6.617 GB available to the GPU per frame

XSX: 8.107 GB available to the GPU per frame, from the 8.957 GB total (Scenario 2)

XSX GPU has 22.5% memory bandwidth advantage per frame over PS5 GPU



Scenario C

Let the CPU consume 0.85 GB per 16.7 ms frame, similar to a 16.7 ms frame slice of a 51 GB/s 128-bit DDR4-3200 PC config

PS5: 6.617 GB available to the GPU per frame

XSX: 8.296 GB available to the GPU per frame, from the 9.1435 GB total (Scenario 3)

XSX GPU has 25.3% memory bandwidth advantage per frame over PS5 GPU
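And the same sketch with the CPU's slice taken off the top first, using the 0.85 GB per-frame CPU reservation assumed in Scenarios A-C:

CPU_FRAME_GB = 0.85                          # assumed CPU share of one ~16.7 ms frame
PS5_GPU_GB   = 448.0 / 60.0 - CPU_FRAME_GB   # ~6.617 GB left for the PS5 GPU

for xsx_frame_gb in (8.77, 8.957, 9.1435):   # frame totals from Scenarios 1-3
    xsx_gpu_gb = xsx_frame_gb - CPU_FRAME_GB
    advantage = (xsx_gpu_gb / PS5_GPU_GB - 1.0) * 100.0
    print(f"XSX GPU {xsx_gpu_gb:.3f} GB vs PS5 GPU {PS5_GPU_GB:.3f} GB: "
          f"~{advantage:.1f}% per-frame advantage")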



Tile-based compute methods working within both the CPU's and the GPU's multi-MB caches (together with the TMUs/ROPs) can conserve external memory I/O accesses, as sketched below.
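For illustration only, a hypothetical Python/NumPy sketch of that tiling idea: two passes fused over cache-sized tiles so the intermediate buffer never round-trips through external memory (the tile size, the toy filter, and the NumPy framing are all assumptions, and tile-edge halos are ignored for brevity):

import numpy as np

TILE_ELEMS = (2 * 1024 * 1024) // 4   # ~2 MB of float32 per tile (cache-sized, assumed)

def naive(frame):
    # Pass 1 writes a full-size intermediate to memory, pass 2 reads it all back.
    blurred = 0.5 * frame + 0.5 * np.roll(frame, 1)
    return np.sqrt(blurred)

def tiled(frame):
    # Both passes run on one tile at a time, so the intermediate stays small
    # enough to remain cache-resident instead of spilling to external memory.
    out = np.empty_like(frame)
    for start in range(0, frame.size, TILE_ELEMS):
        tile = frame[start:start + TILE_ELEMS]
        blurred = 0.5 * tile + 0.5 * np.roll(tile, 1)
        out[start:start + TILE_ELEMS] = np.sqrt(blurred)
    return out

frame = np.random.rand(8 * TILE_ELEMS).astype(np.float32)  # non-negative "pixels"
out = tiled(frame)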





--------------------
The reason for the GTX 970's slow 0.5 GB segment

[Image: GTX 970 memory subsystem block diagram]


The 0.5 GB DRAM segment is bottlenecked because it lacks its own L2 cache and a dedicated I/O link into the crossbar.

So the memory bandwidth difference is about the same as the TF difference? Makes sense to me. We all said it's a 20 or something percent difference for the most part.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
That "18%" (minimum btw) translates to a PlayStation 4 and a half of GPU compute on top of the PlayStation 5's GPU. That's something just throwing out percentages doesn't really convey very well. Another thing being grossly overlooked is the uptick in RT hardware, there's 44% more of it on the Series X die and it also has a considerably wider bus. It will undoubtedly have higher pixel and texel fill rates, more TMU's, and a higher ROP count.

It's not just teraflops, Microsoft's GPU goes places which the PlayStation 5's cannot follow.

You're being daft. Percentages matter more than any nominal number if you are discussing TF differences. Using your logic wouldn't the PS4's TF difference translate to the PS4 having 3 PS3s worth of extra power in it over the Xbox One? Now think about all 3rd party games and ask yourself if that makes any sense.
 
Well, it was a performance test by the company responsible for creating the chips, and everything in the leak was spot-on except for not mentioning the boost frequency. Now what do you want to insinuate? That the Github leak was just a lucky guess that got everything right except for the frequency boost on the PS5? That it was a nasty attack by AMD against Sony, deliberately misrepresenting the performance of the hardware? That Sony kept it a secret from AMD that they had devised a secret plan for how they want to handle frequencies?

Isn't it pretty obvious that the only reasonable explanation is that Sony learned, in the meantime, that the XSX was stronger than they anticipated and reacted by using the wiggle room the architecture gives to achieve performance closer to their direct competitor?
Spot-on?

Didn't it say there was no dedicated HW ray tracing and that it was using the RDNA1 architecture?
 

Yoshi

Headmaster of Console Warrior Jugendstrafanstalt

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
The GPU is not even in the same class as the XSX; I say this due to its weak internal system bus of 256 bits. CPU draw calls to the GPU will be bottlenecked, and even if the SSD is used it's even slower. That's just one thing; collision detection, particle generation, and many other tasks will be affected by this bus.

This isn't true.
 

LordKasual

Banned
Well, it was a performance test by the company responsible for creating the chips, and everything in the leak was spot-on except for not mentioning the boost frequency. Now what do you want to insinuate? That the Github leak was just a lucky guess that got everything right except for the frequency boost on the PS5? That it was a nasty attack by AMD against Sony, deliberately misrepresenting the performance of the hardware? That Sony kept it a secret from AMD that they had devised a secret plan for how they want to handle frequencies?

Isn't it pretty obvious that the only reasonable explanation is that Sony learned, in the meantime, that the XSX was stronger than they anticipated and reacted by using the wiggle room the architecture gives to achieve performance closer to their direct competitor?

Except that isn't a "reasonable" explanation at all; it's beyond extreme for the extent they've gone to actually go the route they did. Everyone has known the chips these consoles would be using for a very long time; it's not difficult to guess because there are only so many possibilities if you take into consideration that NEITHER company is going to go beyond a price point threshold. So the projected Tflops can only be so low (based on the known chipset) and can only be so high (based on the price point of the console).

Besides, from your own quoted article:

My understanding is that this data was first stored on GitHub around six to seven months ago - and looking back over noted leakers' timelines on Twitter, the source seems to have been picked up on as early as August. While this may suggest that the testing data doesn't reflect current next-gen console specs, it's important to remember that developing a microprocessor of the complexity we're talking about here tends to be a multi-year effort. Testing and validating a chip to ensure that it meets performance targets and that it passes debugging is in itself a lengthy process - and making changes to the architecture of the chip at this point is unlikely....

most importantly:
Tweaks to clock speeds or accompanying memory are a possibility but the timeline we have suggests that Sony already took the decision to push GPU clock speeds higher by the time the leaked testing took place.

This kind of thing is not a "panic change", they didn't just throw in extra power to overclock the chips, they completely changed the whole system.

It's such a silly narrative.

So the memory bandwidth difference is about the same as the TF difference? Makes sense to me. We all said it's a 20 or something percent difference for the most part.

If this is accurate, then that makes sense, I guess. The XSX has a lot more CUs anyway.
 

Yoshi

Headmaster of Console Warrior Jugendstrafanstalt
This kind of thing is not a "panic change", they didn't just throw in extra power to overclock the chips, they completely changed the whole system.
Good thing I specifically said it is not panic then.

Except that isn't a "reasonable" explanation at all; it's beyond extreme for the extent they've gone to actually go the route they did. Everyone has known the chips these consoles would be using for a very long time; it's not difficult to guess because there are only so many possibilities if you take into consideration that NEITHER company is going to go beyond a price point threshold. So the projected Tflops can only be so low (based on the known chipset) and can only be so high (based on the price point of the console).
Yeah, it was very easy to correctly guess the CUs and the amount and speed of RAM. Which is why everyone guessed the same thing.
 

slade

Member
So I refer you back to the post of mine you originally quoted. Try reading it again and try to comprehend it this time.

No one is disputing that the XSX GPU is 18% stronger than the PS5 GPU (and stop with the downclock stuff, you're embarrassing yourself), but I clearly don't care and have explained why. And on the ROPs, just no. 80 ROPs would mean 5 raster engines, which doesn't match a 56-shader-cluster configuration or any other shader cluster count. 64 ROPs, on both consoles, is consistent with AMD's architecture.

But please stop engaging me on this. I JUST DON'T CARE.

Only thing I want to see now is GAMES
You're being daft. Percentages matter more than any nominal number if you are discussing TF differences. Using your logic wouldn't the PS4's TF difference translate to the PS4 having 3 PS3s worth of extra power in it over the Xbox One? Now think about all 3rd party games and ask yourself if that makes any sense.

Guys, leave him alone. He built his own computer once in a cave with a box of scraps. Cerny's got nothing on our very own Tony Stank here.
 

Yoshi

Headmaster of Console Warrior Jugendstrafanstalt
Because you can’t and won’t admit you are wrong.
You could create the very first wiki article since none currently exists.

You should have clicked the link:
In software engineering, a bottleneck occurs when the capacity of an application or a computer system is limited by a single component, like the neck of a bottle slowing down the overall water flow. The bottleneck has lowest throughput of all parts of the transaction path.

As such, system designers will try to avoid bottlenecks and direct effort towards locating and tuning existing bottlenecks. Some examples of possible engineering bottlenecks are: a processor, a communication link, disk IO, etc. Any system or application will hit a bottleneck if the work arrives at a sufficiently fast pace.[1]
Of course a bottleneck is always relative to the software using the hardware, but when you consider a system with very uniform software (e.g. a console, where all software are games), then a common bottleneck among the software will be regarded as a hardware issue. The dependence on the software in determining a bottleneck is precisely why Cerny, or whoever made the hardware decisions at Microsoft, can only guess and hope they created hardware where no egregious bottlenecks will pop up. They cannot know, we cannot know. In all likelihood there will be a clear bottleneck on either hardware though, and it is even likely that the bottleneck will be the same thing on both consoles.
 

bitbydeath

Gold Member
You should have clicked the link:

Of course a bottleneck is always relative to the software using the hardware, but when you consider a system with very uniform software (e.g. a console, where all software are games), then a common bottleneck among the software will be regarded as a hardware issue. The dependence on the software in determining a bottleneck is precisely why Cerny, or whoever made the hardware decisions at Microsoft, can only guess and hope they created hardware where no egregious bottlenecks will pop up. They cannot know, we cannot know. In all likelihood there will be a clear bottleneck on either hardware though, and it is even likely that the bottleneck will be the same thing on both consoles.

Yes, but this whole thing is about overcoming bottlenecks that exist in hardware, not by buying new hardware; that is what Mark Cerny has done with his custom I/O setup.
 

Ascend

Member
Remember how the Xbox One improved clocks just before the console was even out? So why couldn't the PS5 clocks be increased before even anything of the console has been truly shown?
 

S0ULZB0URNE

Member
Remember how the Xbox One improved clocks just before the console was even out? So why couldn't the PS5 clocks be increased before even anything of the console has been truly shown?
They aren't going faster than 2.3 GHz.
If COVID-19 causes a delay, maybe they could increase other things...
 

Sosokrates

Report me if I continue to console war
Only 10 GB (minus 2.5 GB for the OS) of faster RAM versus the PS5's 16 GB (minus whatever the OS uses)

When people keep posting these types of things it shows that they haven't taken all sources/breakdowns into account.

Did you know the PS5's CPU bandwidth to the RAM won't be 448 GB/s, because the CPU's memory access is determined by its caches?
I think we will just have to agree to disagree; you clearly believe the PS5 has the better RAM setup.
 

Leyasu

Banned
lol, fuck me this thread is amazing!

What a difference a new console can make. In 2013, the sony horde on here were triumphantly swaggering around, with full mod approval shitting up every thread, beating their chests, talking about having man crushes on Cerny and saying that it would be difficult seeing people in MP games at less than 1080p. They let rip on microsoft when they tried damage control on eurogamer talking about a balanced system. Chased Albert Penello off of the forum, and collectively ejaculated over their screens with every new ps4 article.

Fast forward to now, microsoft pulled out the big guns specs wise, now the sony horde are running around in full damage control, complete with the bullshit that the ps5 is better balanced and better engineered to outperform microsoft's leviathan.

This is straight up MrXmedia territory.
 

Leyasu

Banned
Only 10 GB (minus 2.5 GB for the OS) of faster RAM versus the PS5's 16 GB (minus whatever the OS uses)

When people keep posting these types of things it shows that they haven't taken all sources/breakdowns into account.
A quick question: if it is only 10 GB of RAM with 2.5 GB for the OS, what is the other 6 GB being used for?
 

Ascend

Member
They aren't going faster than 2.3 GHz.
If COVID-19 causes a delay, maybe they could increase other things...
I was mainly referring to them (possibly) increasing the specs at the last minute before the spec reveal that just happened.
 

SlimySnake

Flashless at the Golden Globes
lol, fuck me this thread is amazing!

What a difference a new console can make. In 2013, the sony horde on here were triumphantly swaggering around, with full mod approval shitting up every thread, beating their chests, talking about having man crushes on Cerny and saying that it would be difficult seeing people in MP games at less than 1080p. They let rip on microsoft when they tried damage control on eurogamer talking about a balanced system. Chased Albert Penello off of the forum, and collectively ejaculated over their screens with every new ps4 article.

Fast forward to now, microsoft pulled out the big guns specs wise, now the sony horde are running around in full damage control, complete with the bullshit that the ps5 is better balanced and better engineered to outperform microsoft's leviathan.

This is straight up MrXmedia territory.
the difference between sony and ms next gen compared to last gen is that the xbox one was weaker in every way and the difference was at least 50% on the gpu, and actually more because 10% of the gpu was saved for the kinect up until launch. the ram situation was even worse at around 70%.

the ps5 is only trailing by 18% and has a massive leg up on SSD and i/o by over 120%. it might not outperform the xbox series x, but it's definitely not the same situation as last gen.
 

Leyasu

Banned
the difference between sony and ms next gen compared to last gen is that the xbox one was weaker in every way and the difference was at least 50% on the gpu, and actually more because 10% of the gpu was saved for the kinect up until launch. the ram situation was even worse at around 70%.

the ps5 is only trailing by 18% and has a massive leg up on SSD and i/o by over 120%. it might not outperform the xbox series x, but it's definitely not the same situation as last gen.
I am aware of the percentage difference, but the SSD and i/o are going to equate to faster loading and a few gimmicks in a few 1st party games. Everyone knows this deep down.

The only reason why Cerny and sony have made such a fuss of it is because they have been trumped everywhere else. Again, everyone knows this.
 

FranXico

Member
the difference between sony and ms next gen compared to last gen is that the xbox one was weaker in every way and the difference was at least 50% on the gpu, and actually more because 10% of the gpu was saved for the kinect up until launch. the ram situation was even worse at around 70%.

the ps5 is only trailing by 18% and has a massive leg up on SSD and i/o by over 120%. it might not outperform the xbox series x, but it's definitely not the same situation as last gen.
Not to mention DRM and TVTVTV.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
lol, fuck me this thread is amazing!

What a difference a new console can make. In 2013, the sony horde on here were triumphantly swaggering around, with full mod approval shitting up every thread, beating their chests, talking about having man crushes on Cerny and saying that it would be difficult seeing people in MP games at less than 1080p. They let rip on microsoft when they tried damage control on eurogamer talking about a balanced system. Chased Albert Penello off of the forum, and collectively ejaculated over their screens with every new ps4 article.

Fast forward to now, microsoft pulled out the big guns specs wise, now the sony horde are running around in full damage control, complete with the bullshit that the ps5 is better balanced and better engineered to outperform microsoft's leviathan.

This is straight up MrXmedia territory.

I wish you knew what you were talking about. It's amazing that some people want to create this new reality where the only thing gamers were excited about (hardware wise) were Teraflops for all these generations.

During the PS3 generation I was excited about....

- The Cell processor
- The Blu-ray drive
- FINALLY a built-in hard drive!
- Finally having wireless controllers by default
- 6 axis gyro in the controller
- And The Xbox 360's unified RAM solution which sounded AMAZING compared to the PS3's RAM solution.

Notice how NONE of that stuff has anything to do with teraflops. Back then I didn't know what a teraflop was. I think it was around the end of the PS3 generation that I found out what a gigaflop was and how it was calculated.


During the PS4 generation I was excited about........

- The 8 GB GDDR5 RAM the PS4 had
- A controller with a headphone jack, speaker, and share button
- FINALLY a console with an easy to develop CPU and GPU (an APU that I didn't know was possible to create) unlike the PS3.
- A little later PSVR, the 4K drive in the Xbox One, and mid-gen refreshes.

Again, most of that stuff wasn't about Teraflops. Did I like the 1.8 TFs that could make most of my games 1080p? YES! But it wasn't the end-all, be-all. It was part of the bigger package that made me excited. The same exists this generation. So, why is it NOW (all of a sudden) we are "ONLY" supposed to be excited about the Teraflop number and nothing else?
 

Leyasu

Banned
A long time ago The Beatles famously sang that "all you need is love". The reality is different. All you need, in fact, is an SSD.

I'm calling it now, the PS6 will be a super-super-fast SSD, with a 3ds graphics card and processor soldered on... You won't need anything else, because Cerny will say so.
 
A long time ago The Beatles famously sang that "all you need is love". The reality is different. All you need, in fact, is an SSD.

I'm calling it now, the PS6 will be a super-super-fast SSD, with a 3ds graphics card and processor soldered on... You won't need anything else, because Cerny will say so.
So what has Microsoft done to address memory mapping, file I/O, and check-in?
 

Sosokrates

Report me if I continue to console war
Remember how the Xbox One improved clocks just before the console was even out? So why couldn't the PS5 clocks be increased before even anything of the console has been truly shown?

Because they are already pushing the envelope on clock speed, so much so that they cannot maintain locked clocks.
 

martino

Member
I wish you knew what you were talking about. It's amazing that some people want to create this new reality where the only thing gamers were excited about (hardware wise) were Teraflops for all these generations.

During the PS3 generation I was excited about....

- The Cell processor
- The Blu-ray drive
- FINALLY a built-in hard drive!
- Finally having wireless controllers by default
- 6 axis gyro in the controller
- And The Xbox 360's unified RAM solution which sounded AMAZING compared to the PS3's RAM solution.

Notice how NONE of that stuff has anything to do with teraflops. Back then I didn't know what a teraflop was. I think it was around the end of the PS3 generation that I found out what a gigaflop was and how it was calculated.


During the PS4 generation I was excited about........

- The 8 GB GDDR5 RAM the PS4 had
- A controller with a headphone jack, speaker, and share button
- FINALLY a console with an easy to develop CPU and GPU (an APU that I didn't know was possible to create) unlike the PS3.
- A little later PSVR, the 4K drive in the Xbox One, and mid-gen refreshes.

Again, most of that stuff wasn't about Teraflops. Did I like the 1.8 TFs that could make most of my games 1080p? YES! But it wasn't the end-all, be-all. It was part of the bigger package that made me excited. The same exists this generation. So, why is it NOW (all of a sudden) we are "ONLY" supposed to be excited about the Teraflop number and nothing else?

The problem is more about turning excitement over other things into "secret sauce" power.
 
Why does it seem that the dominant narrative on this (and other) sites is that the PS5 having less power than the XSX doesn't matter because HURR DURR GAMEZ DOOD?
 

Leyasu

Banned
I wish you knew what you were talking about. It's amazing that some people want to create this new reality where the only thing gamers were excited about (hardware wise) were Teraflops for all these generations.

During the PS3 generation I was excited about....

- The Cell processor
- The Blu-ray drive
- FINALLY a built-in hard drive!
- Finally having wireless controllers by default
- 6 axis gyro in the controller
- And The Xbox 360's unified RAM solution which sounded AMAZING compared to the PS3's RAM solution.

Notice how NONE of that stuff has anything to do with teraflops. Back then I didn't know what a teraflop was. I think it was around the end of the PS3 generation that I found out what a gigaflop was and how it was calculated.


During the PS4 generation I was excited about........

- The 8 GB GDDR5 RAM the PS4 had
- A controller with a headphone jack, speaker, and share button
- FINALLY a console with an easy to develop CPU and GPU (an APU that I didn't know was possible to create) unlike the PS3.
- A little later PSVR, the 4K drive in the Xbox One, and mid-gen refreshes.

Again, most of that stuff wasn't about Teraflops. Did I like the 1.8 TFs that could make most of my games 1080p? YES! But it wasn't the end-all, be-all. It was part of the bigger package that made me excited. The same exists this generation. So, why is it NOW (all of a sudden) we are "ONLY" supposed to be excited about the Teraflop number and nothing else?

Reading doesn't seem to be one of your strong points, does it? Could you please highlight where I wrote the word teraflop in my post that you quoted?

It is good that you are/were excited for all those things. I am not trying to take that away or say that you are wrong. I am merely poking fun at the sony fanboys who have suddenly lost their swagger, and are now bending themselves into human pretzels to follow any narrative that closes the gap.
 
I wish you knew what you were talking about. It's amazing that some people want to create this new reality where the only thing gamers were excited about (hardware wise) were Teraflops for all these generations.

During the PS3 generation I was excited about....

- The Cell processor
- The Blu-ray drive
- FINALLY a built-in hard drive!
- Finally having wireless controllers by default
- 6 axis gyro in the controller
- And The Xbox 360's unified RAM solution which sounded AMAZING compared to the PS3's RAM solution.

Notice how NONE of that stuff has anything to do with teraflops. Back then I didn't know what a teraflop was. I think it was around the end of the PS3 generation that I found out what a gigaflop was and how it was calculated.


During the PS4 generation I was excited about........

- The 8 GB GDDR5 RAM the PS4 had
- A controller with a headphone jack, speaker, and share button
- FINALLY a console with an easy to develop CPU and GPU (an APU that I didn't know was possible to create) unlike the PS3.
- A little later PSVR, the 4K drive in the Xbox One, and mid-gen refreshes.

Again, most of that stuff wasn't about Teraflops. Did I like the 1.8 TFs that could make most of my games 1080p? YES! But it wasn't the end-all, be-all. It was part of the bigger package that made me excited. The same exists this generation. So, why is it NOW (all of a sudden) we are "ONLY" supposed to be excited about the Teraflop number and nothing else?
People were always extremely excited about compute, it's been a thing literally forever but it merely went by different names over the years. Don't remember 16, 32, 64 and 128 bits? Yeah, same shit.

What's taking place with people like you is trying to discourage others from leaning into what has always been vastly important: compute. Sony's compute got dwarfed for next-gen, there are no ifs, ands, or buts about it, which brings out individuals like yourself.

You want to detract from its time tested and proven advantages, and excitement because it's no longer a relevant metric for you, because you got forced out of it.

Why would you be excited about compute? You got the inferior offering. Take the L, and take a walk.
 
He hasn't really watched the video.

Twice, actually.

Cerny is pretty awesome. Love to hear him speak.

Unfortunately, console warriors don't understand what he says and misinterpret it as universal partisan support for their retarded console war fuckwittery.

Case in point, bitbydeath, who doesn't understand a fucking word in Cerny's pretty reasonable presentation.
 
Only 10 GB (minus 2.5 GB for the OS) of faster RAM versus the PS5's 16 GB (minus whatever the OS uses)

When people keep posting these types of things it shows that they haven't taken all sources/breakdowns into account.

If you hadn't already realised, the fact that mckmas8808 liked your post means that you're wrong.

The 2.5 GB for the OS doesn't come out of the XSX's 10 GB "GPU optimal" RAM. That's literally stated by MS in their introduction. It is an explicit statement about location in physical memory.

You can't even get basic shit right.

Given that you're so wrong, and such a console warrior goober, who the hell are you to talk about taking "all sources/breakdowns into account"?

You're literally fucking up ultra basic, confirmed and explained stuff.
 

bitbydeath

Gold Member
Twice, actually.

Cerny is pretty awesome. Love to hear him speak.

Unfortunately, console warriors don't understand what he says and misinterpret it as universal partisan support for their retarded console war fuckwittery.

Case in point, bitbydeath, who doesn't understand a fucking word in Cerny's pretty reasonable presentation.

No need to get upset.
I'd gladly admit if I was wrong but you have proven nothing.
 
So basically you don't know the answer to his question.
It's ironic how those who laugh at others for "SSD, SSD, SSD!" end up becoming the very people they're mocking, i.e. they keep talking about storage and decompression while there are 4 other steps to the storage-to-memory process: coherency, memory mapping, file I/O, and check-in/load management.
 