Music to my ears.

Jason said: Yes, exactly. The number everyone's looking at is teraflops, which is essentially the maximum speed that a graphics card can run at. I believe it stands for floating point operations per second. So everybody's now seeing this spec sheet and they see PS5, 10.2 teraflops, and Xbox Series X, 12 teraflops. And it's like, oh my god, the Xbox is more powerful than the PlayStation. But meanwhile, the people I've been talking to over the past few months and the past couple of years who are actually working on the PlayStation have pretty much unanimously all said: This thing is a beast. This thing is one of the coolest pieces of hardware that we've ever seen or ever used before. There are so many things here that are revolutionary, so many behind-the-scenes tools and features, APIs, and all sorts of other stuff that is way beyond my scope of comprehension. This is why I'm a reporter, and not an engineer.
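For what it's worth, the headline teraflops number is just arithmetic on the public specs. A rough sketch, assuming the usual 64 shaders per CU and 2 FMA ops per shader per clock:

```python
# Peak TFLOPS = CUs * 64 shaders/CU * 2 ops/clock * clock (GHz) / 1000
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

ps5 = tflops(36, 2.23)    # ~10.28 TFLOPS
xsx = tflops(52, 1.825)   # ~12.15 TFLOPS
```

These are theoretical peaks only; they say nothing about sustained clocks or how well the rest of the machine keeps those shaders fed.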
I don't understand. What do you want Sony to say? "WE HAVE SECRET SAUCE"?

I prefer to believe in the "developers received Series X tools later" narrative because it's much more plausible.
And also because Digital Foundry knows their stuff when it comes to tech things.
And really, if there was a secret sauce, Sony would be shouting about it.
What's the benefit of keeping this secret sauce a secret? This is just dumb.
This is coming from someone who will only buy a PS5, btw.
No Vulkan derivatives. It's a GNM derivative, Sony's own low-level API, which apparently gives devs even more control over the hardware than Vulkan does.

The one area where PS5 definitely beats the XSX is ROP/fillrate throughput, thanks to the higher clock speed (both consoles have 64 ROPs, though PS5 has fewer TMUs)
I'm assuming they also use some sort of Vulkan derivative which may be beating DX12
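The ROP/fillrate point above is easy to put numbers on. A rough sketch, assuming 64 ROPs on each console and 1 pixel per ROP per clock:

```python
# Peak pixel fillrate = ROPs * clock (GHz) -> Gpixels/s
def fillrate_gpix(rops: int, clock_ghz: float) -> float:
    return rops * clock_ghz

ps5 = fillrate_gpix(64, 2.23)   # ~142.7 Gpix/s
xsx = fillrate_gpix(64, 1.825)  # ~116.8 Gpix/s
```

So on this one metric the PS5's clock advantage wins out, even with the XSX's wider GPU; whether that matters depends on whether a game is fillrate-bound at all.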
True. It's a bug, with the XSX running at reduced scene complexity (foliage, geometry, textures, lighting...) as well as at a lower FPS than the PS5. What will happen to the FPS with increased scene complexity after they fix the bug?
Source?

This is a bug. And the dev confirmed it.
Nah he's just a man who is right sometimes and wrong other times. Let's not demonize people like we are on era.
Jason gets mocked and ridiculed for anything he says on this forum. I've been following gaming for many years now, even before the dawn of ResetEra, and there is one thing I learned: Jason, Matt, and Shinobi are as reliable as they come when it comes to insider news.

Music to my ears.
He was right all along.
Didn't DF say that to reach 120 fps the Series S res went down to, at worst, 576p (or something like that)?

Jason gets mocked and ridiculed for anything he says on this forum. I've been following gaming for many years now, even before the dawn of ResetEra, and there is one thing I learned: Jason, Matt, and Shinobi are as reliable as they come when it comes to insider news.
I've always waited for these guys to say something and as far as I know they are always right.
They have good sources in the industry which is why I never doubted Jason when he said PS5 has other advantages not communicated to the general public yet.
We have had developers state that the numbers don't tell the full story, yet you had armchair devs on here shouting 12TF > 9TF because overclocked GPU, etc. etc.
The chest pumping after we learned that PS5 is sitting at 10TF is rather amusing right now considering the '9TF' console is smashing it in multiplats right out of the gate.
Xbox already has a launch problem in that they don't have anything resembling an exclusive to show off the console capabilities. Now the Series X is being trumped and the fucking Series S is being brought down to 720p?! Armchair devs on here told me it's going to play the same games but in 1440p and 120FPS.
We are only into the first month of the machines on the market and already games are going as low as 720p.
FUCK ME.
Xbox have problems and the last thing you want is to stumble getting out of the gates against PlayStation because once PS gets that momentum it's farewell and goodbye.
I thought this was a serious post until I read this.

We have only a handful of crossplat titles where comparisons can be made, and the differences are not that jarring. Sony has a lead, as its SDK is reportedly more robust and more intuitive at the moment, but Microsoft will eventually catch up: since Nadella became CEO, all of their dev tools have been world class.
In 2022, after half a cycle, we can once again compare notes on how/whether the PS5 is "superior in ways that have not been communicated". This is the same garbage we hear every console cycle: "Blast Processing", "the Reality Synthesizer", "the Emotion Engine", "the Cell, a supercomputer on a die", etc.
We actually do know where the PS5 is superior: faster I/O speeds, haptic feedback (a game changer), VR backwards compatibility. As for raw power, the XSX is the superior machine, and in the coming years, as more tools come out, documentation is streamlined, etc., that will become apparent. The same thing happened two generations ago: the Cell was a better processor than the Xenon, but that only really became clear towards the end of that generation, once the Cell's complexity was fully understood by all major developers.
This is not rocket science: both consoles have RDNA2 Radeon GPUs, and the SX GPU is more powerful. It is what it is, folks. Whether that makes the Series X "better" (a childish notion, but console warring seems to be reaching embarrassing levels) or "worse", well, that's for you to decide.
FIVE HUNDRED AND SEVENTY SIX PEE !!

Didn't DF say that to reach 120 fps the Series S res went down to, at worst, 576p (or something like that)?
Remember what that guy from Ready at Dawn said:

Music to my ears.
He was right all along.
FIVE HUNDRED AND SEVENTY SIX PEE !!
Yes you read it right. On a next gen console in 2020.
A very good post friend!
By the time MS releases their games, Sony will be on their next wave, when the PS5 will be starting to really show its capabilities. Sony worked with devs, especially Epic Games and their internal studios, regarding the features of PS5.
CoD, for example, outputs at 4K 120fps HDR 4:2:2 12-bit.
Also, Jason Schreier is just some guy with an opinion; he should not be someone whose words you take with any authority. He was wrong about both consoles being above 10.7 TFLOPS, and wrong about both consoles being like an RTX 2080 power-wise.
Or you didn't distribute the workload correctly and now use all the CUs, so the framerate stays the same. Who knows.

Well logically, when scene complexity and geometry increase, framerate takes a hit. But I'm no expert or anything.
Sorry, I don't get it: why do people think Microsoft put in 6GB of slower speed RAM?

I mean... I called it as soon as Xbox said they were giving the Series X 16GB of VRAM instead of 20GB, because they would have to split the speeds for that to work (and they did). I'm going to call this now: the Series X will run all 16GB at the lower speed, because it's the only reasonable way for developers to use its special-boy dual speeds in a multiplatform world without coding the game from the ground up for it. Or Microsoft are going to have to allow developers to switch the 6GB off and use just the 10 at the higher speed.
That decision has greatly hamstrung the console; it's the Cell processor issue all over again.
Games-wise, I believe it's called under-promising and over-delivering, and also letting the games do the talking, not their marketing execs.

Oooh, so mysterious. Our console is better in so many ways, you just can't tell and we're not revealing. Like our HDMI 2.1 connection, which may or may not be at HDMI 2.1 performance levels, so mysterious, right??
I know you're probably enjoying yourself too much right now about this, but in that specific case we're still talking about running a game at a frame rate that was a pipe dream on consoles until a week ago, and that no one in their right mind would even expect to be in the Series S version of the game, at all.

FIVE HUNDRED AND SEVENTY SIX PEE !!
Yes you read it right. On a next gen console in 2020.
For once? Lmfao

Damn. Jason was right for once.
It's not secret sauce. A better development kit and a more balanced system are nothing like secret sauce, and they can beat more powerful but less balanced hardware. I'm not saying it's always the case, but the first wave of multiplats showed Sony wasn't wrong at all.

I prefer to believe in the "developers received Series X tools later" narrative because it's much more plausible
And also because Digital Foundry knows their stuff when it comes to tech things.
And really, if there was a secret sauce, Sony would be shouting about it.
What's the benefit of keeping this secret sauce a secret? This is just dumb.
This is coming from someone who will only buy a PS5, btw.
Sorry, I don't get it: why do people think Microsoft put in 6GB of slower speed RAM?
Yeah, I know it's a negative; I was wondering why they did it. I thought it was a cost thing but wasn't sure.

Err, because they have... 10GB of the RAM is at 560GB/s while 6GB is at 336GB/s. As you might be able to tell, 560GB/s and 336GB/s are different speeds, hence why people "think" there are different speeds, although it's less a "think" and more an actual honest-to-god fact.
Now if you want to know why this is bad, it's fairly complicated, but the main reason is that very, VERY few products would ever do this, and the ONLY benefit is cost; all the negatives are in how it complicates everything involving its use. It is not normal for hardware to have two different speeds of RAM, so developers and tools are not experienced with this constraint. There is also the issue that you have 10GB of one and 6GB of the other; have you ever heard of people putting two different-sized RAM sticks in a system before? Yeah, that further complicates the issue.
While these are not physically different sticks, they are separate parts of the pool with different speeds. It would make it a lot easier to develop around if they were, but sadly this is not an option with 16GB, as it would need to be 20GB to allow for this.
Outputs at 4K 120Hz HDR 4:2:2 12-bit all the time on 120Hz TVs.

Which CoD? Pretty sure CW drops to 1080p for 120Hz.
Maybe so, but they were eventually correct. I was a solid 360 boy until around 2011 when the quality of PS3 exclusives compelled me to switch. And I haven’t seen anything to make me switch back.
It is probably cost.
They wanted a 320-bit bus for better bandwidth, but a 320-bit bus needs a multiple of 10 RAM chips of the same size to work... that means 10GB or 20GB total memory.
10GB was too low.
20GB too expensive.
So they used different memory chip sizes to reach 16GB across 10 chips, which ended up creating the access issue.
I believe the idea is to put critical render data in the 10GB with the faster speed and non-critical data (audio, for example) in the 6GB with the slower speed.
The key is, I believe, that devs need to choose what to allocate in the fast and slow parts of the RAM... there is no magical tool that can decide what data goes on the slow part and what goes on the fast part if devs don't say where data will be allocated.
In theory, games that use less than 10GB of RAM can be fully allocated in the faster part... the added work only exists in games that use more than 10GB.
BTW, we don't have evidence that this is causing the Series X's lower performance compared to PS5, because I believe these games are not using 10GB of RAM.
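Those bandwidth figures fall straight out of the bus math. A sketch, assuming 14 Gbps GDDR6 pins, 32-bit chips, and that the slow 6GB region is interleaved only across the six 2GB chips:

```python
# GDDR6 bandwidth = bus width (bits) * per-pin rate (Gbps) / 8 -> GB/s
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float = 14.0) -> float:
    return bus_bits * gbps_per_pin / 8

fast = bandwidth_gbs(10 * 32)  # all ten chips in parallel: 560.0 GB/s
slow = bandwidth_gbs(6 * 32)   # only the six 2GB chips:    336.0 GB/s
```

That is why the split exists at all: the "slow" 6GB lives in the upper half of the bigger chips, so reads there can only stripe across six chips instead of ten.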
Bad RAM setup is being discussed on Beyond3D. Speculation is that 3rd parties will suffer forever, as no other platform has the limitations, so ports will take a hit if not reworked.

I don't think there is any evidence, apart from some complaints on Twitter, of the XBSX memory pool being an issue for games. It's probably more cumbersome to develop for, but I don't think it is anywhere near the XBO or PS3 situation.
I do have an interest in computer hardware from an enthusiast perspective and do some 3D rendering from time to time, but my knowledge is limited on this matter, so I may be wrong. I do think that the issue, if there is any, with the XSX GPU may be that if, as some speculate here, 14 CUs per Shader Array is not the optimal configuration, I wouldn't be surprised if the command processor is having issues handing out tasks efficiently to the CUs. If the front end is not able to efficiently distribute the work, then MS has issues, but they may be solvable.
Mesh Shaders, as far as I'm aware, try to simplify as much as possible the process between the input assembler and the rasterization stage, given that most of the hardware there is programmable, which is where the CUs come into play. So Mesh Shaders could bring additional efficiency to how that data is fed into the CUs.
However, there are still some hardware limitations whose performance impact I don't know. For example, the L1 cache: there is 1MB of L1 per SA, so the XBSX has less cache bandwidth, as a) it is running 22% slower and b) the cache has to be shared among more CUs, which means way more contention, reducing the effective bandwidth even more.
Then there are other units that run faster on PS5, like the ROPs or the Rasterizers, which are fixed per SA. So who knows; I think there is still a lot to learn about both consoles.
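To put rough numbers on that contention point: CU counts, shader-array counts, and clocks below are from the public specs, but the "share" metric itself is purely illustrative, not a real bandwidth figure.

```python
# Active CUs per shader array (both GPUs have 4 shader arrays)
ps5_cus_per_sa = 36 / 4   # 9 CUs share one SA's L1 cache
xsx_cus_per_sa = 52 / 4   # 13 CUs share one SA's L1 cache

# Toy metric: per-CU share of per-SA cache throughput, scaled by clock (GHz).
# Real contention depends on access patterns; this only shows the direction.
ps5_share = 2.23 / ps5_cus_per_sa    # ~0.25
xsx_share = 1.825 / xsx_cus_per_sa   # ~0.14
```

So per active CU, the PS5 has both a faster cache (higher clock) and fewer neighbours competing for it, which is the shape of the argument above even if the exact impact is unknown.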
You realize you will see every game in 1080p as your output, right?

FIVE HUNDRED AND SEVENTY SIX PEE !!
Yes you read it right. On a next gen console in 2020.
Dude, that is not correct. It's one unified pool, just some of it runs slower, designed for components that don't use a lot of bandwidth. It's not the same as eDRAM, or ESRAM, or even PS3.

MS had two gens of two memory pools, eDRAM & ESRAM, then copied the PS4 with the XB1X. Then went back to effectively two pools. Strange.
One thing that hit me: the same feature that enables Quick Resume might be hurting performance. They are using virtual machines, right? That abstraction has overhead.
The secret sauce might be Infinity Cache: full RDNA2 with hints of RDNA3.

It's not secret sauce. A better development kit and a more balanced system are nothing like secret sauce, and they can beat more powerful but less balanced hardware. I'm not saying it's always the case, but the first wave of multiplats showed Sony wasn't wrong at all.
Infinity Cache can't be on PS5; the SoC is too small. But I suspect Infinity Cache was inspired by the PS5 cache system, so it could work in a very similar way.

The secret sauce might be Infinity Cache: full RDNA2 with hints of RDNA3.
Just out of curiosity, since the PS5 drops to 900p in that 120Hz mode, what resolution would you expect from the Series S, which is not even half as powerful?
We haven't seen die shots; the PS5 SoC is nearly as big as the Series X SoC. It has very high bandwidth, and it might have a smaller but still relatively big cache capable of Infinity-like functions.

Infinity Cache can't be on PS5; the SoC is too small. But I suspect Infinity Cache was inspired by the PS5 cache system, so it could work in a very similar way.
PS5 is an Xbox Series X with similar but 20% weaker hardware. There is no secret sauce or secret features.

A very good post friend!
By the time MS release their games Sony will be on their next wave when the PS5 will be starting to really show its capabilities. Sony worked with devs especially Epic games and their internal studios regarding the features of PS5