
Ali Salehi, a rendering engineer at Crytek, contrasts the next-gen consoles in an interview (Update: Tweets/article removed)

FranXico

Member
When was the last time MS moneyhatted a game, Rise of the Tomb Raider? I'm sorry, but it's Sony that has been "bribing" devs, and I don't expect them to stop next gen.
Still waiting for Cuphead to come out on PS4...

Look, they both do it, I never said otherwise. The only difference is MS held off for a while this gen for PR reasons, to play the "good guys", and I see people keep falling for it.

Next gen, they are going in full force, spending as much money as possible to keep games away from the competition. Of course Sony will fight back and keep trying to do the same exact shit. But this time, MS has consumers' short memory in their favor, and a whole lot more resources.
 
When was the last time MS moneyhatted a game, Rise of the Tomb Raider? I'm sorry, but it's Sony that has been "bribing" devs, and I don't expect them to stop next gen.

PUBG and PSO2 come to mind. You think MS didn't pay for PSO2 on a console that's dead everywhere outside America?
 

John254

Banned
Still waiting for Cuphead to come out on PS4...

Look, they both do it, I never said otherwise. The only difference is MS held off for a while this gen for PR reasons, to play the "good guys", and I see people keep falling for it.

Next gen, they are going in full force, spending as much money as possible to keep games away from the competition. Of course Sony will fight back and keep trying to do the same exact shit. But this time, MS has consumers' short memory in their favor, and a whole lot more resources.
😂 😂

You are seriously talking about Cuphead? I think there is a biiiig difference between funding the development of a game (Cuphead) and bribing Activision to postpone the release of the MW2 remaster on other platforms.

Because by your logic, Sony bribed Insomniac to release Spider-Man only on PS4, right?
 

FranXico

Member
Because by your logic, Sony bribed Insomniac to release Spider-Man only on PS4, right?
Insomniac and Marvel, yeah. Especially Marvel.

I have no idea how they talked Price into selling out though. Guess he had a price LOL.

There definitely was a deal and the bottom line mattered. Bribes, as I call them, are part of business. Why are people acting as if this is controversial? It sucks for consumers every time, but that's business.
 
Still waiting for Cuphead to come out on PS4...

Look, they both do it, I never said otherwise. The only difference is MS held off for a while this gen for PR reasons, to play the "good guys", and I see people keep falling for it.

Next gen, they are going in full force, spending as much money as possible to keep games away from the competition. Of course Sony will fight back and keep trying to do the same exact shit. But this time, MS has consumers' short memory in their favor, and a whole lot more resources.

Microsoft helped fund Cuphead, which turned it from a boss-rush game into a platformer, and it did make its way to a competitor platform. I just don't see Microsoft moneyhatting big games anymore; third-party devs just can't ignore the PS install base.
 
I don't have a 2080 Ti, but I have seen many gameplay videos, and at higher resolutions GPU usage was almost always at 99% (besides CPU-limited scenarios), so what you wrote is simply a lie.



99% GPU usage almost the entire time!


Also, although TechPowerUp suggests the 2080 Ti is 20% faster than both the 2070 and the 1080 Ti, in reality it's very hard to find benchmarks that show only a 20% difference at 4K.

[Chart: wolfenstein-2-3840-2160.png, Wolfenstein II benchmark at 3840×2160]


Here the 2080 Ti has 39% better performance compared to the 2070 Super.

[Chart: hellblade_3840-2160.png, Hellblade benchmark at 3840×2160]


Here, for example, the 2080 Ti is 51% faster than the standard 2070 and the 1080 Ti. The bigger and slower chip (2080 Ti) is still better in the end, and it will be exactly the same on consoles, meaning a 52-CU 12 TF GPU will easily beat a 36-CU 10 TF one.


The 2080Ti is 51% faster than a 2070, sure. But the 2080Ti has 68 SMs (4352 CUDA cores) and the 2070 has 36 SMs (2304 CUDA cores). So nearly 89% more shaders for just 51% extra performance.
The regular 2080 has 48 SMs (3072 CUDA cores), which is 33% more shaders, yet it only manages 15% more performance.

Hmmm...

Yeah. The Series X will be faster than the PS5; it has 44% more shaders. But that doesn't mean it'll be 44% faster. Not even close.
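A quick back-of-the-envelope check of that scaling argument, as a minimal Python sketch. It uses only the SM counts and 4K benchmark deltas quoted above; the efficiency figures are illustrative, not measured:

```python
# Rough check of shader-count vs. performance scaling, using the SM counts
# and 4K benchmark deltas quoted above (illustrative, not new measurements).
cards = [
    # (name, SMs, performance relative to an RTX 2070 = 1.00)
    ("RTX 2080", 48, 1.15),     # SM count as quoted; corrected to 46 later in the thread
    ("RTX 2080 Ti", 68, 1.51),
]
BASE_SMS = 36  # RTX 2070

for name, sms, perf in cards:
    shader_gain = sms / BASE_SMS - 1.0
    perf_gain = perf - 1.0
    print(f"{name}: +{shader_gain:.0%} shaders -> +{perf_gain:.0%} perf, "
          f"scaling efficiency ~{perf_gain / shader_gain:.0%}")

# Applying the same imperfect scaling to the consoles' 52 vs. 36 CUs (+44%):
# at ~55-60% efficiency that is roughly +25%, not +44%.
```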
 

rnlval

Member
Basically what the Crytek dev said: they will perform similarly. One has a lot of bottlenecks that the other does not have, while the other machine has an API that bogs down performance where its competitor has a more performant software API.
PS5 has its own bottlenecks, such as its 448 GB/s memory bandwidth, which is recycled from the RX 5700/5700 XT but now has extra consumers: a desktop-class CPU, RT cores, file I/O, and the audio DSP.
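To make that "extra consumers" point concrete, here is a minimal sketch in Python. The non-GPU bandwidth shares are hypothetical placeholders for illustration, not measured figures:

```python
# Hypothetical illustration of shared-bandwidth accounting on a unified-memory
# console. The non-GPU shares below are made-up placeholders, not real data.
TOTAL_BW_GBPS = 448.0  # PS5 / RX 5700 XT-class GDDR6 bandwidth, as cited above

hypothetical_consumers_gbps = {
    "CPU": 30.0,       # assumption: desktop-class CPU traffic
    "file I/O": 10.0,  # assumption: SSD/decompression traffic
    "audio DSP": 5.0,  # assumption
}

gpu_share = TOTAL_BW_GBPS - sum(hypothetical_consumers_gbps.values())
print(f"Bandwidth left for the GPU: {gpu_share:.0f} GB/s of {TOTAL_BW_GBPS:.0f} GB/s "
      f"({gpu_share / TOTAL_BW_GBPS:.0%})")
```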


[Chart: usmOBhI.jpg, EA DICE API overhead comparison]



From EA DICE's API comparison.
 

John254

Banned
Insomniac and Marvel, yeah. Especially Marvel.

I have no idea how they talked Price into selling out though. Guess he had a price LOL.

There definitely was a deal and the bottom line mattered. Bribes, as I call them, are part of business. Why are people acting as if this is controversial? It sucks for consumers every time, but that's business.
Yup. Money talks.

But you are comparing situations which aren't comparable. Without Sony's funding, we wouldn't have Marvel's Spider-Man today, and without Microsoft's money, we probably wouldn't have Cuphead today. This is the kind of "bribing" I'm okay with, because it's really just funding development.

I'm not okay with actual bribery, like Destiny, which always had one-year exclusivity on maps/strikes/guns for PlayStation, or MW2 Remastered, which is clearly just paying to postpone the release elsewhere.
 

rnlval

Member
The 2080Ti is 51% faster than a 2070, sure. But the 2080Ti has 68 SMs (4352 CUDA cores) and the 2070 has 36 SMs (2304 CUDA cores). So nearly 89% more shaders for just 51% extra performance.
The regular 2080 has 48 SMs (3072 CUDA cores), which is 33% more shaders, yet it only manages 15% more performance.

Hmmm...

Yeah. The Series X will be faster than the PS5; it has 44% more shaders. But that doesn't mean it'll be 44% faster. Not even close.
FYI, the regular RTX 2080 has 46 SMs (2944 CUDA cores). The RTX 2080 Super has 48 SMs (3072).

The RTX 2080 is gimped by its 448 GB/s memory bandwidth, which is the same as the RTX 2070's 448 GB/s.

RTX 2080 Super has 496 GB/s memory bandwidth.
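A minimal sketch of why equal memory bandwidth can cap a wider GPU, in Python, with the SM counts and bandwidths from the posts above (the per-SM ratio is just an illustration, not a benchmark):

```python
# Bandwidth available per SM: on the same 448 GB/s bus, a wider GPU gets
# less bandwidth per unit of compute, which can cap its scaling.
gpus = [
    # (name, SMs, memory bandwidth in GB/s), figures as stated above
    ("RTX 2070", 36, 448.0),
    ("RTX 2080", 46, 448.0),
    ("RTX 2080 Super", 48, 496.0),
]

for name, sms, bw in gpus:
    print(f"{name}: {bw / sms:.1f} GB/s per SM")
```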
 
PS5 has its own bottlenecks, such as its 448 GB/s memory bandwidth, which is recycled from the RX 5700/5700 XT but now has extra consumers: a desktop-class CPU, RT cores, file I/O, and the audio DSP.


[Chart: usmOBhI.jpg, EA DICE API overhead comparison]



From EA DICE's API comparison.

Oh, so PS4 has a less complex API than XB1? Is that what you're saying? I don't know why you're telling me that.

The Crytek dev is saying the XSeX API is not up to par with the one in the PS5. That chart does not negate it.
 

rnlval

Member
Oh, so PS4 has a less complex API than XB1? Is that what you're saying? I don't know why you're telling me that.

The Crytek dev is saying the XSeX API is not up to par with the one in the PS5. That chart does not negate it.
That was against your "the other machine has an API that bogs down the performance" statement.

The Crytek dev's statement has been withdrawn.
 

rnlval

Member
You don't need credentials to call out that kind of BS, what a joke.

The TC's Crytek dev's statement wouldn't matter much when Unreal Engine 4.x dominates the licensed 3D game engine market. What matters more is Unreal Engine 4's runtime performance, e.g. Gears 5.
 

yewles1

Member
Yeah, I expect they'll go all out with this. People were crying foul because the Modern Warfare 2 remaster came out on PS4 a month earlier, but I expect they'll forget that when it suits.

For MS it just makes business sense to try and fill the void until their internal studios get up to pace. We'll see how Phil can spin this; he's been pretty cool since he started in his role, promoting things like cross-platform play, but being perceived as taking games away from other platforms may alter how people see him.

I think both platforms will do it to an extent. I'll buy both consoles at launch but will probably not buy any third-party exclusives at launch, in order to discourage the practice.

... but never say never 😆
I'm suddenly reminded of Review Tech's stances on Rise of the Tomb Raider vs. Street Fighter V...
 

BluRayHiDef

Banned
PlayStation 5:
GPU: 144 TMUs (texture mapping units)
GPU frequency: 2,230 MHz
Fill rate: 144 TMUs × 2,230 MHz = 321,120 Mtexels/s ≈ 321.1 Gtexels/s

Xbox Series X:
GPU: 208 TMUs (texture mapping units)
GPU frequency: 1,825 MHz
Fill rate: 208 TMUs × 1,825 MHz = 379,600 Mtexels/s ≈ 379.6 Gtexels/s

-------------
Calculation of the ratio: 321,120 / 379,600 = 0.8459 ≈ 84.6%

The PlayStation 5's fill rate is 84.6% of the Xbox Series X's, which is a negligible difference since its fill rate is fast enough to render games in 4K at 60 frames per second.

Also, the PS5's higher frequency will enable it to perform rasterization and render output (ROPs) more quickly. So, the PS5's GPU is a bit weaker in one regard but more powerful in two other regards.
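For anyone who wants to re-run the arithmetic, a minimal Python sketch with the TMU counts and clocks listed above:

```python
# Texture fill rate = TMUs x core clock. TMUs x MHz gives Mtexels/s.
def fill_rate_gtexels(tmus: int, clock_mhz: float) -> float:
    return tmus * clock_mhz / 1000.0  # Mtexels/s -> Gtexels/s

ps5 = fill_rate_gtexels(144, 2230)  # ~321.1 Gtexels/s
xsx = fill_rate_gtexels(208, 1825)  # ~379.6 Gtexels/s

print(f"PS5: {ps5:.1f} Gtexels/s, XSX: {xsx:.1f} Gtexels/s")
print(f"PS5 is {ps5 / xsx:.1%} of XSX")  # ~84.6%
```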
 

Dory16

Banned
PlayStation 5:
GPU: 144 TMUs (texture mapping units)
GPU frequency: 2,230 MHz
Fill rate: 144 TMUs × 2,230 MHz = 321,120 Mtexels/s ≈ 321.1 Gtexels/s

Xbox Series X:
GPU: 208 TMUs (texture mapping units)
GPU frequency: 1,825 MHz
Fill rate: 208 TMUs × 1,825 MHz = 379,600 Mtexels/s ≈ 379.6 Gtexels/s

-------------
Calculation of the ratio: 321,120 / 379,600 = 0.8459 ≈ 84.6%

The PlayStation 5's fill rate is 84.6% of the Xbox Series X's, which is a negligible difference since its fill rate is fast enough to render games in 4K at 60 frames per second.

Also, the PS5's higher frequency will enable it to perform rasterization and render output (ROPs) more quickly. So, the PS5's GPU is a bit weaker in one regard but more powerful in two other regards.
How's that, armchair Mr. Developer?
 

Gavin Stevens

Formerly 'o'dium'
PlayStation 5:
GPU: 144 TMUs (texture mapping units)
GPU frequency: 2,230 MHz
Fill rate: 144 TMUs × 2,230 MHz = 321,120 Mtexels/s ≈ 321.1 Gtexels/s

Xbox Series X:
GPU: 208 TMUs (texture mapping units)
GPU frequency: 1,825 MHz
Fill rate: 208 TMUs × 1,825 MHz = 379,600 Mtexels/s ≈ 379.6 Gtexels/s

-------------
Calculation of the ratio: 321,120 / 379,600 = 0.8459 ≈ 84.6%

The PlayStation 5's fill rate is 84.6% of the Xbox Series X's, which is a negligible difference since its fill rate is fast enough to render games in 4K at 60 frames per second.

Also, the PS5's higher frequency will enable it to perform rasterization and render output (ROPs) more quickly. So, the PS5's GPU is a bit weaker in one regard but more powerful in two other regards.

Fillrate isn't a static figure per resolution; it's heavily dependent on what's being drawn in the given frame, so you can't just say "it's fast enough", because this isn't a simple equation. It's fast enough, but is it fast enough to render particles at full res? Half res? Quarter res? What about the amount? What about complexity, such as lighting interaction, or shader quality? What about how many are rendered at what distance? And all that is just particles; what about foliage, alpha billboards, alpha-blended and alpha-masked effects...? You're essentially doing what everybody else is doing: saying this is number X, this is number Y, X is bigger, but let me show you why Y is better. You're not actually putting it into real-world situations.

I'm sure I don't need to tell you that the higher in resolution you go, the bigger the drain fillrate-bound work puts on overall performance, and that rendering at 4K quickly eats through your memory budget even in simple frames. That's why games like RE3 and the like perform so badly at full 4K, compared to just dropping the resolution slightly on hardware with 2 TF less. The higher you push your resolution, the more that fillrate will kick you in the arse. It's why there is so much work being put into *other* solutions to fake 4K. It's so damn expensive.
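To put numbers on how quickly per-frame pixel cost grows with resolution, a minimal Python sketch (standard resolutions; fillrate-bound cost scales at least linearly with pixel count, and this ignores overdraw entirely):

```python
# Pixels per frame at common resolutions. Fillrate-bound work scales at
# least linearly with this, and gets worse once overdraw and alpha stack up.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "1800p": (3200, 1800),
    "4K":    (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.2f} Mpx ({px / base:.2f}x 1080p)")

# Dropping from 4K to 1800p alone cuts raw pixel work by ~31%.
```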

This is a single area that you've got wrong because you're not actually thinking of the bigger picture. Yet there are people here trying to think of everything and cocking it up, majorly. You don't have to work for Crytek making Crysis 2: Electric Boogaloo to know this shit.
 

thelastword

Banned
Wow, the article was removed. MS ninjas are not fooling around.
I heard Phil was on that Discord; you can be sure this Crytek dev was reported ASAP... "He's talking positives about the PS5, how do we quell it, Sir Phil?"
Wait, I thought the PS5 was efficient, not powerful? Are we shifting the narrative again? It's both now? Hard to keep up.
As it stands, PS5 has little to no bottlenecks... You realize this is not a Pro vs. One X comparison in terms of design... It's different... Xbox has more raw TFLOPS and a higher CU count; that will translate to some benefits... PS5 has a higher GPU clock, less stress on the CPU and GPU because of lots of custom silicon (like the sound unit, geometry engine and cache scrubbers), and a faster SSD; that will translate to more benefits in real time overall...

In reality, more horsepower does not automatically translate to better performance in real time; it really depends on the subsystems. Going through a maze faster beats trying to knock down the maze walls through pure brute force. All the time you spend trying to break down the walls is time the speedster spends going through the maze unhindered... So consistent framerates with no large frame spikes are what you can expect... A very well designed system...

There are a lot more tweets and retweets that only make reference to Playstation.

I mean, it's pretty obvious he has a bias toward the brand.



Funny, they created the game for all platforms, yet he shows the PS4 version.
Maybe the Pro version had a better framerate...
 

Panajev2001a

GAF's Pleasant Genius
That's not what I think, that's just the reality: games take way longer to develop than ever before, despite the TTT (time to triangle) being just a month or two, just like on PSX. Now it will supposedly be even less than a month, but Cerny even said during his presentation that this doesn't mean developers won't take as much time as they want/need to materialize their visions for their games.

We saw Death Stranding being pulled off within what, two years? But that was on a well-established and polished engine and the well-known PS4 platform. If the current generation lasted another 3-5 years, I'm sure we would see such development times decrease across all the gaming studios out there. But it's a new generation: everything resets, engines and tools have to be adjusted and enhanced, and devs will have to learn new technologies and techniques and get to know what the consoles are truly capable of. It will take years again until devs get familiar with the new systems; I reckon another 3-4 years (2023-24) before we start seeing the so-called "true" next-gen games.

We are not disagreeing on big AAA(A) games taking longer as the ceiling grows higher; what the TtT KPI tries to optimise is minimising the effort required to get the right level of sustained performance, close to peak levels.

This is super important for smaller-budget games from big publishers and indies alike, but it is also important for very big projects: if getting good performance out of PS5 when you transition your engine over took almost a year, like on PS3, you either spend a lot of money and effort with your core engineering group (limited resources, and a cost that small devs may not easily be able to afford), or you target a much lower level of performance and aim a lot lower for the platform, which may ultimately hold the game back.
 

martino

Member
Oh, so PS4 has a less complex API than XB1? Is that what you're saying? I don't know why you're telling me that.

The Crytek dev is saying the XSeX API is not up to par with the one in the PS5. That chart does not negate it.
That's right, and what does the y-axis tell you?
 

Panajev2001a

GAF's Pleasant Genius
Oh don't worry, Microsoft has also been busy doing what they do best, bribing third parties left and right to keep games away from PS5. Just wait ;). And don't expect the usual suspects to cry foul about exclusives being "anti-consumer" anymore... ;)

Of course, and if we get games without crossplay enabled to protect Live customers, there won't be a peep about it either :LOL:.
 

BluRayHiDef

Banned
Fillrate isn't a static figure per resolution; it's heavily dependent on what's being drawn in the given frame, so you can't just say "it's fast enough", because this isn't a simple equation. It's fast enough, but is it fast enough to render particles at full res? Half res? Quarter res? What about the amount? What about complexity, such as lighting interaction, or shader quality? What about how many are rendered at what distance? And all that is just particles; what about foliage, alpha billboards, alpha-blended and alpha-masked effects...? You're essentially doing what everybody else is doing: saying this is number X, this is number Y, X is bigger, but let me show you why Y is better. You're not actually putting it into real-world situations.

I'm sure I don't need to tell you that the higher in resolution you go, the bigger the drain fillrate-bound work puts on overall performance, and that rendering at 4K quickly eats through your memory budget even in simple frames. That's why games like RE3 and the like perform so badly at full 4K, compared to just dropping the resolution slightly on hardware with 2 TF less. The higher you push your resolution, the more that fillrate will kick you in the arse. It's why there is so much work being put into *other* solutions to fake 4K. It's so damn expensive.

This is a single area that you've got wrong because you're not actually thinking of the bigger picture. Yet there are people here trying to think of everything and cocking it up, majorly. You don't have to work for Crytek making Crysis 2: Electric Boogaloo to know this shit.

Fill rate isn't a static figure per resolution, but I've defined it for the PS5 and the XSX in terms of the relation between the two systems, which is static. So, the comparison is valid.
 

The_Mike

I cry about SonyGaf from my chair in Redmond, WA
When this generation launched, PlayStation fans were all over the place with how much better games looked on the PS4, with 30 percent more pixels.

When the One X launched, PS fans said that graphics don't matter.

Now we have new threads daily with PS fans' meltdowns, spinning how hardware slower than the competition is actually faster.

Seriously, I can no longer keep track of which statement is true anymore. All these meltdowns seem deadlier than the coronavirus at the moment.

I presume pussy cat meant this hype train:







Yep, PS5 Hypetrain. Not a single mention of flops.


It would also be weird when you can't use it in your fanboy agenda. That's like hitting yourself.
 

Handy Fake

Member
When this generation launched, PlayStation fans were all over the place with how much better games looked on the PS4, with 30 percent more pixels.

When the One X launched, PS fans said that graphics don't matter.

Now we have new threads daily with PS fans' meltdowns, spinning how hardware slower than the competition is actually faster.

Seriously, I can no longer keep track of which statement is true anymore. All these meltdowns seem deadlier than the coronavirus at the moment.




It would also be weird when you can't use it in your fanboy agenda. That's like hitting yourself.
Seems to me the PS fans are perfectly happy with their console.
Not sure why others have to piss on it to make themselves feel better, but each to their own I suppose.
 

FranXico

Member
Seems to me the PS fans are perfectly happy with their console.
Not sure why others have to piss on it to make themselves feel better, but each to their own I suppose.
Just how much their pride was hurt this gen, to the point of desiring so deeply to "pay back" and contemplate "reversed roles", says a lot about them.
 

Gavin Stevens

Formerly 'o'dium'
Fill rate isn't a static figure per resolution, but I've defined it for the PS5 and the XSX in terms of the relation between the two systems, which is static. So, the comparison is valid.

Erm... no. That's just daft. You're not understanding how stuff works here at all; you're just doing basic math without putting it into real-world situations.
 

SirTerry-T

Member
I feel sorry for the guy if he lost his job, but NDAs are there for a purpose. If that's what his dismissal was all about.

I also hope the guy who instigated the interview, the sole purpose of which appeared to be fanning the flames of petty console warring and getting his 15 minutes in the Twitter and forum spotlight, has karma bite him on the arse at some point.

It's all pretty pathetic really... these are machines for playing games on.
 

Ashoca

Banned
First of all, the guy who was interviewed had his Instagram and Twitter feeds full of Sony fanboy stuff. Nothing about Xbox.

And then let's have a look at his interview:

The PlayStation 3 had a hard time running multi-platform games compared to the Xbox 360. Red Dead Redemption and GTA IV, for example, ran at 720p on the Microsoft console, but the PlayStation 3 had a poorer output and eventually upscaled the resolution to 720p. But Sony's own studios have been able to offer more detailed games such as The Last of Us and Uncharted 2 and 3 due to their greater familiarity with the console and the development of special software capabilities.

That is why it is not a good idea to base our opinions only on numbers. But if all the parts in the Xbox Series X can work optimally and the GPU works at its peak, which is not possible in practice, we can achieve 12 TFlops. In addition to all this, we also have the software side. An example is the advent of Vulkan and DirectX 12: the hardware did not change, but due to the change in the software architecture, the hardware could be put to better use.

Sorry, but NO. PS3? That was a completely different scenario. You should not bring the PS3 into this discussion.
The PS3 was a completely unique scenario because of the huge clusterfuck called CELL.
The PS5 is NOT like the PS3, and the XSX is NOT like the PS3. It's ridiculous to compare the PS3 to either of them.
This is just bullshit, come on! WHY compare PS3 to XSX? SERIOUSLY?!

And XSX is using DX12 ULTIMATE:
Microsoft’s DirectX 12 Ultimate unifies graphics tech for PC gaming and Xbox Series X
Bringing a suite of software advancements in gaming graphics onto a single platform



WHY would it be easier to develop a game for a single platform (PS5) vs PC+XSX?

Sony runs the PlayStation 5 on its own operating system, but Microsoft has put a customized version of Windows on the Xbox Series X. The two are very different. Because Sony has developed exclusive software for the PlayStation 5, it will definitely give developers many more capabilities than Microsoft, which has almost the same DirectX for PC and for its consoles.

What the fuck is he talking about? What is this fanboy crap? The XSX is not running the Windows 10 OS. What's next? Windows 10 malware will destroy your XSX? Lol

Developers say that the PlayStation 5 is the easiest console they've ever coded for, so they can reach the console's peak performance. In terms of software, coding on the PlayStation 5 is extremely simple and has many features which leave a lot of options for developers. All in all, the PlayStation 5 is a better console.

OK, this is absolute bullshit. First-party devs saying this marketing crap, so what? Extremely simple? Really? That's why the PS5 is using variable frequencies, and devs even have to rewrite their engines to optimise for the PS5, according to Cerny:

In short, the idea is that developers may learn to optimise in a different way, by achieving identical results from the GPU but doing it faster via increased clocks delivered by optimising for power consumption.

Super easy! Just create a new engine just for the unique PS5 architecture. Simple, right?

Again, what has this to do with anything? And just because first-party devs say that it's "THE BETTER CONSOLE", really? This is 100% fanboy crap! Come on!

A good example of this is the Xbox Series X hardware. Microsoft has two separate pools of RAM. The same mistake that they made with the Xbox One.

Again, bullshit. The XSX is still using a unified memory system; it's NOT even remotely like the Xbox One, and the XSX does not use "ESRAM". This is just bullshit! It's the same as any PC.

This is how it works:


Raising the clock speed on the PlayStation 5 seems to me to have a number of benefits, such as memory management, rasterization, and other elements of the GPU whose performance is related to the frequency, not the CU count. So in some scenarios the PlayStation 5's GPU works faster than the Series X's. That's what lets the console's GPU work at its announced peak of 10.28 teraflops more often.
But the Series X, because the rest of its elements are slower, will probably not reach its 12 teraflops most of the time, and will only reach 12 teraflops in highly ideal conditions.

Again, bullshit. DF actually tested similar graphics cards on PC, one overclocked GPU vs. a non-overclocked one, and showed that 10 TF from 36 compute units leads to less performance than 10 TF from 40 compute units. The Xbox has 12 TF from 52 compute units.
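For reference, here is where those teraflop numbers come from, as a minimal Python sketch using the standard peak-FLOPS formula for these GPUs and the CU counts and clocks discussed in the thread:

```python
# Peak FP32 TFLOPS = CUs x 64 shaders/CU x 2 ops/clock (FMA) x clock in GHz / 1000
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

ps5 = tflops(36, 2.23)   # ~10.3 TF at the PS5's maximum boost clock
xsx = tflops(52, 1.825)  # ~12.1 TF at the XSX's fixed clock

print(f"PS5: {ps5:.2f} TF, XSX: {xsx:.2f} TF, gap: {xsx / ps5 - 1:.0%}")
```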

And it's actually the other way around: the XSX has FIXED clocks, so it can sustain 100% CPU/GPU all the time, under every condition:

Once again, Microsoft stresses the point that frequencies are consistent on all machines, in all environments. There are no boost clocks with Xbox Series X.

Source: https://www.eurogamer.net/articles/digitalfoundry-2020-inside-xbox-series-x-full-specs

whereas PS5 can NOT:

Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core.

Source: https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive

So again, what is this fanboy crap? Seriously! How is the PS5's way of doing this better?

Doesn't this difference decline at the end of the generation, when developers become more familiar with the Series X hardware?

No, because the PlayStation API generally gives devs more freedom, and usually at the end of each generation, Sony consoles produce more detailed games. For example, early in the seventh generation, even multi-platform games performed poorly on the PlayStation 3, but late in the generation Uncharted 3 and The Last of Us came out on the console. I think the next generation will be the same. But generally speaking, the XSX should have less trouble pushing more pixels.

What? Now the Xbox is more powerful? lol
Again, why bring the PS3 in here? The XSX is NOT the PS3. The XSX does not have CELL.

There is another important issue to consider, as Mark Cerny put it: CUs or even teraflops are not necessarily the same between architectures. That is, teraflops cannot be compared between devices to decide which one is actually numerically superior. So you can't trust these numbers and call it a day.

What? Both consoles have a very similar architecture: both are based on RDNA 2.0, and both the CPU and GPU are from AMD. How can we NOT compare? We absolutely CAN compare!
It's not like Xbox 360 vs. PS3, where we had two different architectures; this time we have practically the same one!

Sony has always had better software because Microsoft has to use Windows. So that's right.
What? Again, the XSX is not running the Windows 10 OS. What is he talking about? And Sony has better software? How so? What kind of software is he talking about? The operating system? Or what?


What Sony has done is much more logical, because the console decides whether the GPU or the CPU gets the higher frequency at any given time, depending on the processing load. For example, on a loading screen, only the CPU is needed and the GPU is not used. Or in a close-up scene of a character's face, the GPU gets involved and the CPU plays a very small role. On the other hand, it's good that the Series X has good cooling and guarantees a constant frequency with no throttling, but the practical freedom that Sony has given is really a big deal.


This is some next-gen level spinning... FREEDOM?!? Practical FREEDOM?!?

„Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core."

GREAT freedom! Devs HAVE TO throttle back the CPU to sustain the GPU! Fantastic freedom!

On Xbox Series X devs don't have to worry; they get the max from both, easy:

Once again, Microsoft stresses the point that frequencies are consistent on all machines, in all environments. There are no boost clocks with Xbox Series X.

But this is a BAD thing? What the actual fuck?

There is much more of this crap in the interview, but I think this is enough. No wonder they pulled it; this is embarrassing.
 

Shmunter

Member
Seems to me the PS fans are perfectly happy with their console.
Not sure why others have to piss on it to make themselves feel better, but each to their own I suppose.
If I were so upset, I'd just join the winning team and live in bliss. But being annoyed seems to be the default position for some reason. Chillax, the gen hasn't even begun, and who knows when it will with the state of the world as it is.
 

BluRayHiDef

Banned
Erm... no. That's just daft. You're not understanding how stuff works here at all; you're just doing basic math without putting it into real-world situations.

No, you're in denial. When both systems are given the same rendering targets, the ratio between their performance stays constant even though the absolute values change.
 

Journey

Banned
Nice
[Image: drSXFHY.png]

Seems to reflect what other devs said: both consoles are very close. So much unnecessary bickering over a meager 17% difference in peak performance; these are literally the closest consoles have ever been.


Xbox One also played Call of Duty: Ghosts, Assassin's Creed IV, etc., every bit as amazingly as the PS4 version, so what was all the hoopla about back then? 🤷‍♂️


Fake edit:

Oh yeah, they were graphically superior on PS4 lol.
 

SonGoku

Member
Xbox One also played Call of Duty: Ghosts, Assassin's Creed IV, etc., every bit as amazingly as the PS4 version, so what was all the hoopla about back then? 🤷‍♂️


Fake edit:

Oh yeah, they were graphically superior on PS4 lol.
To be fair, that was a 40% gap on a 1080p display; also, Ghosts was particularly bad because it wasn't optimized for the ESRAM setup, halving resolution to 720p and creating an artificial ~2x gap.
You won't see anywhere close to that with a 17-21% gap, let alone at higher, ~4K resolutions.
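A minimal sketch of what those percentage gaps mean in resolution terms (Python; it assumes performance scales roughly linearly with pixel count, which is a simplification):

```python
import math

# If performance scales ~linearly with pixel count, a performance gap maps
# to a resolution gap. Simplified model, for illustration only.
def weaker_resolution(width: int, height: int, perf_gap: float) -> tuple[int, int]:
    scale = math.sqrt(1.0 / (1.0 + perf_gap))  # per-axis scale factor
    return round(width * scale), round(height * scale)

print(weaker_resolution(1920, 1080, 0.40))  # last gen's 40% gap: ~(1623, 913)
print(weaker_resolution(3840, 2160, 0.20))  # a 20% gap at 4K: ~(3505, 1972)

# A ~20% gap at 4K is roughly 2160p vs ~1970p, far less visible than 1080p vs 720p.
```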
 