
Next-Gen PS5 & XSX |OT| Console tEch threaD


3liteDragon

Member
E3 2019

There are going to be areas where we should compete. And those areas… can lead to better outcomes when we compete… but it's still a world where in order for us to do better, you don't have to do worse… Gaming is a better space when there are brands that people trust, where brands that have been around for decades continue to be around… I want Xbox obviously to be one of those… We're not building Scarlett to not be the best console. I want to be number one. I'm as competitive as anybody. But that doesn't mean that I need somebody else to… suffer from a business standpoint… I think raw power is very important… I don't have a PS5 development kit, so I don't know what they're building… I will say that being a leader in consoles is what the team is committed to doing… We aren't building this program to try to aim for second place. We're building it aiming for first place and that's what I want to hit.


Xbox Series X is our most powerful console


[Image: 656kEcW.png]
He says he WANTS Xbox Series X to be the #1 console, not that it is going to be the most powerful. He literally says right after that their MC team don't have PS5 devkits, so they don't know what Sony's building (whether he has insider knowledge, idk, you wouldn't expect him to make that public and risk getting sued). And your last sentence already contradicts the picture you posted.
 

Gavin Stevens

Formerly 'o'dium'
Obviously, since the next-gen consoles will be held back by PCs with their ancient HDD technology ;)

Actually no, it's dependent on the engine used and the asset quality. With high-fidelity assets being commonplace next generation, let's look at a 4096x4096 texture on a surface, which would require at the bare minimum a diffuse map, a normal map (with possible height) and a specular map, the specular map being not only colour but also containing PBR data in its alpha mask. That's a basic surface, but more than likely other stages will be drawn too. If you think about the size a single targa image takes at 4096x4096 with no alpha, then at least triple it (at least, but likely more): that's a single surface of data.

Now for a targa-based set (uncompressed) you are looking at:

Current gen average level area texture, size 1024x1024:
Diffuse 3 MB
Normal 3 MB
Spec 4 MB (with PBR data)
Total: 10 MB per set

Next generation average level area texture, size 4096x4096:
Diffuse 48 MB
Normal 48 MB
Spec 64 MB
Total: 160 MB

Now this is an average level asset. Keep in mind that you will have HUNDREDS of texture "sets" per level, and that characters will likely be using multiple 4096x4096 textures, if not higher.

So doing the math, let's take it nice and easy here and say 50 texture sets in a level. So 50 x 160 MB = 8000 MB. That's just on a very basic level, without any additional form of asset. This doesn't include level geometry or character/weapon models (which will also go up exponentially next gen), effects, music, sound and so on. It's quite possible then that a basic load could result in 12-16 GB of data up front, if you're not streaming.

But using this basic math, on the assumption of no compression, we can already prove next gen will not load "instantly", due to the way things are called and loaded into memory.

(This example is uncompressed because I'm going on the assumption that's what we are aiming for. Compressed assets are much lower; for example, with a basic compression with no artefacting using DXT1 in my own engine, I can get a 49,153mb asset down to 10,923mb.)
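For anyone who wants to check the arithmetic above, here is a minimal Python sketch of the same back-of-envelope math. The channel counts (24-bit RGB diffuse/normal, 32-bit RGBA spec) and the DXT1/BC1 rate of 4 bits per pixel are assumptions made for illustration; the post's own compression figures will differ depending on the formats actually used.

```python
# Back-of-envelope texture memory math from the post above.
# Assumptions (not the poster's exact pipeline): uncompressed TGA,
# 24-bit RGB for diffuse and normal, 32-bit RGBA for spec (+ PBR data
# in the alpha), and DXT1/BC1 at 4 bits per pixel as a rough comparison.

MB = 1024 * 1024

def tga_bytes(width, height, channels):
    """Uncompressed size of a width x height image at 8 bits per channel."""
    return width * height * channels

def texture_set_mb(res):
    """Diffuse (RGB) + normal (RGB) + spec (RGBA) at a square resolution."""
    diffuse = tga_bytes(res, res, 3)
    normal = tga_bytes(res, res, 3)
    spec = tga_bytes(res, res, 4)
    return (diffuse + normal + spec) / MB

def dxt1_mb(res, maps=3):
    """Very rough DXT1/BC1 estimate: 4 bits (0.5 bytes) per pixel per map."""
    return maps * (res * res * 0.5) / MB

print(texture_set_mb(1024))       # 10.0  -> ~10 MB per set, as in the post
print(texture_set_mb(4096))       # 160.0 -> ~160 MB per set, as in the post
print(50 * texture_set_mb(4096))  # 8000.0 -> ~8 GB for 50 sets in a level
print(dxt1_mb(4096))              # 24.0  -> block compression shrinks it a lot
```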
 

ZywyPL

Banned
Actually no, it's dependent on the engine used and the asset quality.
(Snipped the rest)

I know, I was just messing around a bit, you're taking it all way too seriously, calm down ;) I genuinely expect loading screens to exist in every single game; otherwise there will be a blank screen before all the assets get loaded into the RAM and processed before giving us the final image, so whether it's 10s or 2min, the loading screen simply has to be there no matter what. And let's assume a huge performance difference between all available platforms, let's say PS5 > XBX > PCIe 4 NVMe > PCIe 3 NVMe > SATA3 > SSHD > HDD; then a single loading screen completely solves the compatibility issues, which is a no-brainer for any developer out there.

Unless the loadings take less than 2-3s, in which case some sort of transition animation can be added between the main menu/OS and the actual gameplay, but if so I can't imagine it being anything other than just a few 1st party titles that aren't open world games.
 

Neo Blaster

Member
Actually no, it's dependent on the engine used and the asset quality.
(Snipped the rest)
Yep, stuff will not load instantly on next gen consoles, even with SSDs, but won't things be much worse on PCs using mechanical drives? Unless you make SSDs mandatory, of course, and not just any kind of SSD.
 

FeiRR

Banned
Actually no, it's dependent on the engine used and the asset quality.
(Snipped the rest)
This is very insightful! Quite a refresher after dozens of pages of TF babble.

We're going to get incredible IQ next gen. However, I see an issue with BD capacity. Is it possible that the readers will be triple-layer (100 GB)? I've heard of just one game on more than one BD (RDR2), but next gen it might be a problem. Not to mention people with data caps downloading games and patches.

Also SSDs aren't cheap. 1 TB will be how many, 3-4 games? I wonder how they're planning to solve that.
 

Disco_

Member
SSDs are an exciting thing to have for consoles, definitely. I know there are some benefits even on the old PS4. On the PS4 Pro, Final Fantasy XIV benefits greatly from it, imo.

Has anybody tried The Division 2 on PS4/Pro with an SSD? Any difference?
Quicker fast travel, just like Division 1. The problem with both TD1 and TD2 is that the PS4 versions were somewhat neglected.
 

Gavin Stevens

Formerly 'o'dium'
Yes, both consoles will use the new higher-capacity 100 GB discs, that's common knowledge I believe.

The thing about game size is that it's another tricky area to talk about. Games this generation use a lot of duplication of assets so that files are closer to each other, which results in reduced mechanical disk drive seeking. In other words, a shit load of stuff in your RDR2 install is actually duplicate data.

SSD doesn't need this at all. It can read anything it needs at any time, with no "seek time" as such.

However, the flip side of the coin is that while you no longer need duplicate data, the data itself will be far higher in file size.

So it's a tough one to say for sure haha.

Don't get me wrong though, 2 TB of SSD is too small even for me, regardless.
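A rough sketch of why that duplication exists, for readers who want numbers: on a mechanical drive every scattered asset costs a seek, while an SSD barely cares about layout. The seek times, read speeds and chunk counts below are generic ballpark figures assumed purely for illustration, not console specs.

```python
# Why current-gen installs duplicate assets: on an HDD, every non-contiguous
# read pays a seek penalty, so packing copies of assets next to the data that
# needs them trades disk space for seek time. Figures are illustrative only.

def load_time_s(num_chunks, total_mb, seek_ms, read_mb_s):
    """Total time = one seek per separate chunk + sequential read of the data."""
    return num_chunks * (seek_ms / 1000.0) + total_mb / read_mb_s

# Loading ~4 GB of level data:
hdd_scattered = load_time_s(1000, 4000, seek_ms=12, read_mb_s=100)   # assets scattered on HDD
hdd_packed    = load_time_s(10,   4000, seek_ms=12, read_mb_s=100)   # duplicated/packed into big chunks
ssd_scattered = load_time_s(1000, 4000, seek_ms=0.1, read_mb_s=2000) # fast SSD, layout barely matters

print(round(hdd_scattered, 1), round(hdd_packed, 1), round(ssd_scattered, 1))
# -> 52.0 40.1 2.1  (seconds)
```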
 

Neo Blaster

Member
Yes, both consoles will use the new higher-capacity 100 GB discs, that's common knowledge I believe.
(Snipped the rest)
Less space taken, but bigger assets than last gen games. So, tit-for-tat?
 

Gavin Stevens

Formerly 'o'dium'
Less space taken, but bigger assets than last gen games. So, tit-for-tat?

Essentially, yeah. It's a ball ache; you can't really give a ballpark figure on this either, as it depends entirely on the game itself. But yes, most definitely: take with one hand and give with the other.
 

geordiemp

Member
Don't get me wrong though, 2 TB of SSD is too small even for me, regardless.

It would be interesting if either Sony or MS have a dual-technology approach, say a 1 TB SSD plus the ability to plug in a normal 6 TB hard drive, and you transfer the 3 or 4 games you are playing now to the SSD.

It would be very expensive to have to add 4 TB of SSD to consoles for most people, IMO.
 

DeepEnigma

Gold Member
No, it's tricky, let me explain. 13 GB isn't enough, so disregard that. It's enough for Lockhart, and I would assume that it could be using that, but not for a main console, not with just how large assets get and how much resolution has increased. 13 GB isn't enough.

Now, I'm of the personal opinion that 16 GB isn't even enough; it's the bare MINIMUM, but it's not what I would use. However, memory costs money, which increases BOM cost. The issue is that IF the PS5 has slightly higher clocked memory, for the SX to have less would be a double punch; it wouldn't work. To have less and slower? You would be swapping in more and drawing it slower, meaning even if your console was a few TF faster, it would be for nothing. So, while PS5 can have faster clocked memory, it's essential for the SX to, at the very least, match it in size.

But as to what that size is? Last time memory was mentioned to me, multiple people were talking 16. However, as I said, when I spoke about anything extra for the OS (for example, let's say an additional separate 4 GB DDR4 chip, just for the OS) I was told that's one way to do it, but there is something special going on. The way it was worded to me made it seem like the OS is running in a virtual memory type deal, because the SSDs are fast, and so you actually don't need anything extra. But that's my interpretation; it's not something I can prove.

As a baseline, I would go with 16 GB for both systems, but I also wouldn't rule out either both systems or one (and IMO, the SX would be the likely candidate here) having more memory. The issue is, it's not really a case of popping one off and another on like on a PC. If you have more memory, you need to feed it, and that costs valuable CPU cycles when pushed. So, it's a bit of a chicken and egg type deal... it's all about balance.

Edit: keep in mind the above is for memory, not for SSD speed, which last I heard was about 1-1.5 GB/sec faster in favour of Sony, but with that, the actual speed increase won't provide anything substantial for gameplay; it will essentially be a wash with a few fringe case exceptions.

I always thought they should have at least 24 GB for future-proofing, but we also know RAM is back to being pricey again. It's the most expensive component in these boxes. My bet was always 16 minimum, but I wanted 24.
 

BGs

Industry Professional
Ok, thanks mate. Was just wondering if you could provide something, since the context of why he was PMing you was right there in his post. But anyway, thanks for your reply.


Hey BGs, my friend šŸ˜…
You can PM me too, whenever you feel the need to share all your knowledgeable first person insights... i'm a tomb.
Spanish and Portuguese people ā¤ hermanos para siempre!

Everything in this life has a cost.



Just kidding.

I have not sent any PS5 information to SlimySnake. Only company information, for clarification.
 

Neo Blaster

Member
It would be interesting if either Sony or MS have a dual-technology approach, say a 1 TB SSD plus the ability to plug in a normal 6 TB hard drive, and you transfer the 3 or 4 games you are playing now to the SSD.

It would be very expensive to have to add 4 TB of SSD to consoles for most people, IMO.
I still think these consoles could be using SSD chips soldered to the main board as main storage, plus a regular replaceable HDD and external drives for cold storage. No way they can have only 1 TB total, nor will they put in more than that due to high costs. I just don't know how data management across all of these would work. Copy games to the SSD when installing? Or on demand?
 

LED Guy?

Banned
It's been a long weekend. What have I missed? We up to 14 tfs with RDNA 3 for PS5 yet?! All for 399!!
It was a good weekend, you know... the one where all the Xbox fanboys who were pushing the GitHub leak data saw that data fall flat on its face after the confirmation of RDNA 2 for PS5... and then Xbox fanboys were deleting their past tweets and videos on YouTube... yeah, that weekend.
 

DaGwaphics

Member
It would be interesting if either Sony or MS have a dual-technology approach, say a 1 TB SSD plus the ability to plug in a normal 6 TB hard drive, and you transfer the 3 or 4 games you are playing now to the SSD.

It would be very expensive to have to add 4 TB of SSD to consoles for most people, IMO.

That seems reasonable, especially with the back-compat with PS4 & Xbone; you could probably continue playing those games straight from the HDD as you do now. If they created some kind of archive state for the next-gen games, you could even keep them updated/patched while they are on the HDD. (Says the cheap bastard looking to save coin :messenger_tears_of_joy:).
 

alex_m

Neo Member
After the AMD conference I was sure that PS5 would be 12 TF, the same as Xbox. But now I'm not so sure... after reading everything out there on the internet, plus GAF and Era, and seeing every video on YouTube plus Twitter etc., I'm back to thinking it's either 9.2 or 10 TF. Why?

75% of the GitHub leak was right about Xbox, but it was wrong about RDNA 2. Everybody was thinking it was RDNA 1 because everyone assumed RDNA 2 was 7nm+, but after the AMD press conference they made clear it was 7nm, the same as RDNA 1, just more watt-efficient. Plus we don't know literally shit about PS5, and come on, they read those tweets, they read those posts, and I do have a feeling that Sony wouldn't let MS flex like that if they were more powerful. I mean, come on!! Remember the Sony from 2013? They clapped back at every chance they got... and now you're telling me they are silent because they're not quite ready to reveal info!! And how come MS is ready before them, when everything we heard at first was that MS was behind schedule with next gen... and now they're first to show the console, first to show specs, first to show confidence in their product... sssh, something smells fishy. And BTW I hope I'm wrong, but I doubt it.

You forget that Microsoft owns GitHub and that Microsoft is known for its Fear, Uncertainty and Doubt tactics.

That said, everybody with a working brain cell in his skull would know that Sony will not be using a GCN-based GPU in the new PS5.

That said, did you know that Microsoft is desperate because Sony sold over 2.5 times more consoles this generation? It also seems that Microsoft shareholders are not happy about this. I could very well imagine that once Sony releases more details about its console (>13 TF most probably), Xbox will be done.
 
That's not what I meant; BC with current gen is a given, no doubt. What I meant is that MS will have a tough time convincing all those 100 million people to abandon libraries worth hundreds to thousands of dollars and switch camps.

It wasn't much of an issue back in the day, because in order to play, let's say, PS2 games you had to keep the PS2 around (excluding the early PS3 models), so nothing was stopping you from buying an X360 instead of a PS3. But now, I really don't see anyone wanting to repurchase Fortnite or Minecraft just to continue playing it. Countless hours in GTA Online, gone. CoD, BF, Destiny etc., all gone. Assassin's Creed collections, worthless.

I could go on and on with the examples, but I think you get the idea: people invested not only a huge amount of money but also a huge amount of time into current-gen titles (that's exactly what GaaS is meant for), and now they will be given the opportunity to keep all of it intact if they simply choose PS5 as their next-gen system, while MS, at least as of now, has literally nothing to offer to counter that. Because why would anyone heavily attached to all the 3rd party titles want to get the XBX and lose everything, needing to either re-buy everything they want to keep or go back to, or keep the PS4? People who really want to play Halo/Gears/Forza are already there on Xbox, and they will most likely stay in that ecosystem for the exact same reason.

Actually, as I think of it now, BC might single-handedly make the upcoming generation the least exciting, because the vast majority of people will just migrate to the newer version of the system they already own. It's like switching from PC to Mac: why would you want to lose all your library when you can just upgrade your current PC or get a new one, and keep everything you've got?

It actually won't be nearly that hard. For starters, only a small fraction of all PS4 owners have actually invested hundreds to thousands of dollars into their software library. There's also the fact that a lot of titles in libraries these days come as perks with subscription services, meaning the user didn't actually "pay" for them the usual way, so those games weren't really a financial investment on their part.

The majority of PS4 owners are casuals who generally pick consoles up for a very small selection of mega-games, usually 3rd-party, and generally franchise games. So stuff like Madden, NBA 2K, FIFA, and these days games like Fortnite and GTA5, Destiny, Minecraft etc. are what the majority of PS4 owners (especially those who buy their systems around Black Friday and the Christmas holidays) buy the system for. They might also happen to buy one that's bundled with a 1st-party exclusive, or may happen to pick up a few other games down the line, but for those types of mass-mainstream purchasers those amounts are nothing significant.

I mean, you can look at the typical sales range of more core-orientated games to tell what the market size on systems like PS4 is for the hardcore/core gamer you're describing. I'd probably peg it at around 15-20 or so million, because a lot of those people buy a lot of the same games: the mega-hyped releases like FFVII Remake, RE2 Remake, GOW4, Tekken 7, SFV, Horizon etc. I'm not saying the hardcore/core are the only ones who buy those games; again, there are the more casual and mainstream types who might happen to pick the games up, or get them conveniently in bundles or as freebies with their subscriptions. But the casuals and mainstream aren't buying PS4s specifically for those types of games! And the same thing applies with Xbox: nowhere near all those 40+ million owners purchased one for Halo, or Titanfall, or Sunset Overdrive etc.

Also, with a lot of the franchise games the majority of system owners purchase, they really don't tend to play the older installments when the newer ones come out. This is something they do with sports games especially! So to them, they wouldn't be losing much of anything since they have no attachment to the previous, older versions. For other games like Fortnite and Minecraft, IIRC progress, save data, DLCs etc. in those games are platform-agnostic. This means you can easily transfer data (and even in-game currency earned in-game like with Fortnite) from one device to another. Most of these games do this with universal game accounts and let the players merge accounts between two or more different devices, that way they don't have to redo or re-purchase everything when going from one account to the next, or one platform to the next, either.

GaaS does not inherently mean the software is tied to any given fixed console or platform. Quite the opposite, actually: it's meant to free players from being locked to a given platform, so that they can have a continuous experience regardless of what device they're accessing the game from. So it would be counter-intuitive if these big GaaS titles locked content earned and purchased on one device to simply that device (it would also hurt active engagement in that game's ecosystem by the end user). I can see exclusive DLC made for a given platform staying tied to it in terms of access, but that never accounts for any significant portion of a game's DLC content, just a few skins or such, and that's about it.

So essentially, I think the only potential next-gen console purchasers with a significant financial and social investment in an ecosystem (and who would use that as one of their primary factors in picking a console) are the hardcore early adopters, the ones who tend to buy systems within the first year or two. And even among them, that factor isn't too high on their list, because some of them value other things well over it, such as power or software library. For everyone else, such reasoning basically assumes a sunk-cost fallacy, but the actual people it's projected onto don't see it the same way. They buy PS and Xbox for the major, newest AAA mainstream 3rd-party games like GTA6, or the franchise sports games like Madden/FIFA/NBA 2K, or the big GaaS titles like Fortnite, Minecraft etc. And they don't tend to spend much financially outside of those titles, those titles either being largely platform-agnostic in how data can be accessed/transferred, or annual games where the previous release is immediately rendered outdated once the updated version drops. So even if they DID drop a lot of cash into the previous version, it doesn't matter, because they won't touch it again, and they can probably transfer save data and DLC from the older version to the new one anyway (I'm assuming).

I think the bigger issue with people switching platforms next gen has nothing to do with financial investments into game libraries, but social connections made on a given ecosystem. Unless/until cross-platform chat and messaging (and cross-platform sharing of games and game data/DLC with 3rd-party platform-agnostic games) becomes a standard, PS people are still mostly confined to their PS friends for best means of communication on that platform, and the same goes with Xbox people on theirs. THAT is what's going to likely matter more in terms of affecting how many people might consider jumping between platforms, but I can actually see this benefiting both systems rather than hurting one. I.e, if one platform (say PlayStation) is the one a more casual/mainstream player has a social network investment in, but the other platform (say, Xbox) has a 1st-party game or two they want to play that, for them, is easiest to do on the Xbox platform (because let's be honest, casuals and mainstream gamers aren't typically the ones with thousand-dollar + rigs and super high-end laptops capable of playing next-gen games at console-quality settings, let alone higher ones), then they might pick up a new Xbox as well, and probably play on that system, too, without abandoning their friends on PlayStation.

Most gamers, whether hardcore, core, or casual/mainstream... they honestly aren't AS tied to brand loyalty or a given platform ecosystem just for the sake of it. The hardcore will go where the best mix of power/games/services/price is at, even if it means a lot of them double up on both systems. The casuals and mass market will eventually go where the hardcore/core are at, and since they jump in a lot later, they get the systems at heavily reduced cost and can therefore have more leniency to double up if they so choose. IMO this is what makes MS's approach pretty interesting, because they really do seem to understand that you get people in through the ecosystem, but that ecosystem doesn't need to be tethered to a singular device. And that means a lot of the social communication aspects of gaming that we know to be pretty platform-dependent for now, they're going to have to make platform-agnostic out of necessity. And as I was just saying before, it's the social investment that's particularly valuable to the end user; just look at social network platforms like Twitch, Facebook, Twitter etc. for proof of this. If Microsoft can get gamers invested into their ecosystem socially ahead of competitors, it becomes much more difficult for that person to consider an alternative ecosystem.

Sony understands this, too, which is why I keep telling some of the diehards who keep insisting that what MS is doing is "dumb": they might as well get ready to say that about Sony a year from now, too. Do you honestly think Sony would put themselves at a disadvantage that could completely cripple their PlayStation brand in the years ahead, by letting competitors saturate their ecosystem across multiple devices while they sit back and do nothing on that front? Yes, exclusives do matter (especially for Nintendo) and can have a strong effect on us if they're of quality, but here's the thing: we might put exclusives on that type of pedestal, but the vast majority of gamers as a whole absolutely do not.

That isn't to say gamers don't like exclusives or don't care about them, but exclusives do not shape the purchase timing or habits of the mainstream/casual gamers who make up the majority of console purchasers. And keep in mind that some exclusives from the likes of Sony and Microsoft actually reach whatever numbers they do by being bundled into discounted SKUs aimed at holiday shoppers, and those shoppers aren't necessarily buying the systems for the bundled games!

Lastly I will say this: I do think an underlying reason a lot of people vehemently want Sony to not focus on these emerging areas is because they fear Sony will leave behind their bread-and-butter in doing so. But in reality, it's not an either/or game. It IS possible to do both; MS seems very intent on doing so (they've been investing in a lot more 1st party content the past few years) and if they can do it, Sony can as well (partnering with MS to use Azure servers, for example). A lot of great things can come from this approach and it's a bit shortsighted and irrational to hope it fails for no good reason, though I can at least understand what might be causing people to think that way.

You forget that Microsoft owns GitHub and that Microsoft is known for its Fear, Uncertainty and Doubt tactics.

That said, everybody with a working brain cell in his skull would know that Sony will not be using a GCN-based GPU in the new PS5.

That said, did you know that Microsoft is desperate because Sony sold over 2.5 times more consoles this generation? It also seems that Microsoft shareholders are not happy about this. I could very well imagine that once Sony releases more details about its console (>13 TF most probably), Xbox will be done.

I guess I should find out what media outlets Sony has investments in, so I can irrationally fearmonger about them utilizing those for propaganda, even if that's punishable by law :pie_thinking:
 

Shmunter

Member
Interesting discussions on SSD streaming next gen. Agreed, loading won't be instant; filling up many gigs of RAM will still only be as fast as the storage solution, which will always be slower than RAM.

But as far as streaming assets during traversal of the map/world goes, a super-fast SSD solution should be leagues more capable than a mechanical hard drive.

As illustrated by this simple matrix GIF example, only the last line gets pushed out of RAM as a new line enters from storage. Picture the matrix GIF having the capability to move much faster with the SSD, mirroring the speed of new assets moving into RAM.

Any game designed specifically around this speed capability will require compromise on slower systems, or simply won't exist on them. Again, this ignites the potential for lower base systems holding back the faster one if the target platform is HDD-oriented and advantage is not taken of the SSD (see the rough numbers sketched below).

[Attached GIF: matrix_animation_spangled.gif]
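To put hypothetical numbers on that streaming budget: the amount of unique data a drive can feed per metre of player travel is roughly its sustained read speed divided by the traversal speed. The speeds below are assumptions picked purely for illustration, not figures from either console.

```python
# Streaming-during-traversal budget: how much new, unique asset data the
# drive can deliver for every metre the player moves. All numbers here are
# illustrative assumptions, not real PS5/XSX figures.

def mb_per_metre(drive_mb_s, traversal_m_s):
    """Sustained drive throughput divided by how fast the player crosses the world."""
    return drive_mb_s / traversal_m_s

hdd_budget = mb_per_metre(drive_mb_s=100, traversal_m_s=20)   # ~5 MB of fresh data per metre
ssd_budget = mb_per_metre(drive_mb_s=5000, traversal_m_s=20)  # ~250 MB of fresh data per metre

print(hdd_budget, ssd_budget)
# A world authored around the SSD budget can't be streamed from the HDD at the
# same traversal speed, which is the "holding back" concern described above.
```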
 

FERN

Member
You forget that Microsoft owns GitHub and that Microsoft is known for its Fear, Uncertainty and Doubt tactics.
(Snipped the rest)
Nice Avatar

Chances on a Quake 3 with RT re-release?? šŸ™ƒ
 

Fun Fanboy

Banned
It was a good weekend, you know... the one where all the Xbox fanboys who were pushing the GitHub leak data saw that data fall flat on its face after the confirmation of RDNA 2 for PS5... and then Xbox fanboys were deleting their past tweets and videos on YouTube... yeah, that weekend.
I love you PlayStation Twitter warriors!!!

My weekend was great too. The Lakers beat the Bucks and the Clippers. I won a chicken dinner in PUBG. Still no official PS5 news. It was bliss!
 

alex_m

Neo Member
Nice Avatar

Chances on a Quake 3 with RT re-release?? šŸ™ƒ

Thanks. I think that ship has sailed. It's called Quake Champions now. Tried it, but I think I'm too old for it.

I played lots of Q3 and Q3 Team Arena online. Made it to 13th place on the ranked servers once. Knew the levels by heart and was able to jump backwards towards our flag while carrying the opponent's flag, shooting everyone who wanted to get it back ;) At so-called LAN parties I regularly won alone against three opponents.
 

CyberPanda

Banned
Thanks. I think that ship has sailed. It's called Quake Champions now. Tried it, but I think I'm too old for it.

I played lots of Q3 and Q3 Team Arena online. Made it to 13th place on the ranked servers once. Knew the levels by heart and was able to jump backwards towards our flag while carrying the opponent's flag, shooting everyone who wanted to get it back ;) At so-called LAN parties I regularly won alone against three opponents.
Quake 3! Brings back so many memories!
 
He didn't mention anything about RT. He just confirms that PS5 dev kits used to be RDNA 1 and that the 3rd kit uses RDNA 2. That is a major change in the hardware, and that change was motivated by the emerging information of XSX using the latest RDNA.

If that's the case, and the latest devkit is the one insiders were talking about a month or so ago that got out in early January, then that devkit is probably using Oberon E0, which was dated December 2019, and it further confirms Oberon is an RDNA2 chip and that the Navi 10 listing in Rogame's tweets was for the Ariel iGPU profile test list, because Ariel came before Oberon and was an RDNA1 chip. It also fits with a change that big being a revision of the chip, so it means there's a chance the first Oberon revision or two were RDNA1 chips, but a different revision was the RDNA2 chip.

However, two caveats here: I don't think Sony went RDNA2 in reaction to info about XSX using RDNA2. They'd have planned for its use well ahead of time; they have access to AMD's roadmaps early, just like MS does. Secondly, I think there's a sneaking assumption with some people (not indicating you in particular or anything) that RDNA2 is only for big chips. But I see no reason why AMD would hold back RDNA2 from their future mobile APUs, which will have CU clusters much smaller than 40.

This applies to Oberon as well: it pretty much is definitely an RDNA2 chip, but that doesn't necessarily mean it's larger than 40 CUs. I think we're all wishing it's at least a bit larger, but there's no guarantee that it is. While 40 CUs on 7nm would be smaller in die size than the 40 CUs of a PS4 Pro, these systems also need dedicated silicon for their RT solutions, as well as other IP blocks for handling other system tasks, caches etc.

I always thought they should have at least 24 GB for future-proofing, but we also know RAM is back to being pricey again. It's the most expensive component in these boxes. My bet was always 16 minimum, but I wanted 24.

If the memory buses we've speculated so far check out (256-bit PS5, 320-bit XSX), then at the very least we're looking at 16GB for PS5 (plus some DDR4 for background OS tasks, maybe 4GB worth) and 20GB for XSX.

However, they could both do clamshell mode and put in anywhere from 28-32 GB (PS5) to 34-40 GB (XSX). But I don't think that's very likely. For one, clamshell would mean the data bit rate is cut in half for the chips, so each chip on its own brings half the usual bandwidth. The upside to that is the active framebuffer size increases, since you have double the amount of chips on the bus (and you can potentially double the maximum amount of memory on the controller if the chip densities are all the same); it'd just mean the bandwidth per chip is cut in half.

I guess you can picture it as its own way of going "narrow and fast" versus "wide and slow", even though those terms are usually used for discussing memory buses and chip speeds more generally. By and large, though, I don't expect either system to do clamshell mode this time around; it's already going to be a significant part of the BOM to get the 16 GB/20 GB capacities as-is.
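For reference, here's the quick bus math behind those capacity and bandwidth figures as a Python sketch. The bus widths come from the speculation in the post; the 14 Gbps GDDR6 data rate and 2 GB (16 Gb) chip density are assumptions, purely to make the arithmetic concrete.

```python
# GDDR6 bus math for the speculated 256-bit (PS5) and 320-bit (XSX) buses.
# The 14 Gbps per-pin rate and 2 GB chip density are assumed, not confirmed.

def peak_bandwidth_gb_s(bus_bits, gbps_per_pin=14):
    """Peak bandwidth = bus width (bits) x per-pin data rate / 8 bits per byte."""
    return bus_bits * gbps_per_pin / 8

def capacity_gb(bus_bits, gb_per_chip=2, clamshell=False):
    """One 32-bit chip per channel normally; clamshell hangs two chips per channel."""
    chips = (bus_bits // 32) * (2 if clamshell else 1)
    return chips * gb_per_chip

print(peak_bandwidth_gb_s(256), capacity_gb(256))                  # 448.0 GB/s, 16 GB
print(peak_bandwidth_gb_s(320), capacity_gb(320))                  # 560.0 GB/s, 20 GB
print(peak_bandwidth_gb_s(256), capacity_gb(256, clamshell=True))  # same 448 GB/s, 32 GB
# Clamshell doubles capacity but not bandwidth: each chip runs a half-width
# interface, so per-chip bandwidth is halved, as the post says.
```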
 

01011001

Banned
If that's the case and the latest devkit is the one insiders were talking about a month or so ago that got out in early January, then that devkit is probably using Oberon E0, which was dated December 2019.
(Snipped the rest)

I agree that there's no way Oberon switched to RDNA2 as a reaction to Microsoft; they simply weren't in a hurry to switch their dev kits over because it wasn't really an urgent matter or anything.

The RDNA2 revision will only actually be important in the final retail version, to reduce heat; dev kits are chunky bois and don't need to be super compact.
 

webber

Member
I dreamed I was in Jack Tretton's house but it was actually houses from my childhood neighborhood and we were talking about gaming/PS4. When I left I saw none other than Mark Cerny leaving his just parked car.
All of a sudden the hype feeling of something going on came up and I started walking (with a crowd) down the main street towards the events building.
Neil Druckmann was there too.

#Soon
#Ineedhelp
 