
Sony PS5 Vs. Xbox Series X Technical Analysis: Why The PS5’s 10.3 TFLOPs Figure Is Misleading

Status
Not open for further replies.

Romulus

Member
Well, you have to remember Halo’s development began five years ago. It’s doubtful 343 would have known the XSX’s final specs, since the XSX’s development began four years ago.

Launch games never max out a new console for this obvious reason, so all this chatter seems like mountain-out-of-a-molehill stuff.

I’d wager the PS5’s games would run just fine on PS4, but that Sony simply won’t permit it for marketing purposes, and that even then there will be cross-gen 1st- and 3rd-party PS5 games.

This is mostly a perception issue.

I think launch games get a lot further along with the architecture now than in the past. Super-custom PS3-type systems took much longer; not anymore. So they might not max it out, but that was never an expectation on my end anyway. Even going back to something like Halo CE, it looked amazing to see those massive worlds at that fidelity on a console. Some of that was because the Xbox was a beast and easier to develop for, and there were no constraints from other systems. One of the best-looking launch games of all time, in my opinion, for those reasons. So I don't think it's perception at all. A brand-new CPU with 4-5x the power, a 6x GPU, more RAM, and a customized SSD is a league above what we have. And I don't believe it's enough of a marketing point to matter; the vast majority of gamers aren't even thinking that way. I just think gamers will see PS5 exclusives and see a difference, not notice that it's not on PS4. That won't even cross their minds.
 

welsay01

Neo Member
Yep, I get the utilization, but I didn't watch the whole Cerny presentation; it was too boring for me. Anyway, someone said the PS5 will mostly run at 10.28 TFLOPs, and that is where this confusion started. I already know that is its peak performance. My point is that if the frequency is based on utilization, as you say, then you cannot say definitively that it will mostly run at 10.28. So the assumption that it will run at 10.28 TFLOPs most of the time is wrong.

Yes, the variable frequency is based on utilization. But I see now that your argument is about the semantics of "10.28 most of the time"; well, of course that can't be proven, since no one can predict the future, right?

I wasn't the one who said it, but the person who did was most likely citing Cerny:

Mark Cerny - Road to PS5 @ 37:40 said:
"36CUs at 2.23Ghz is 10.3 teraflops and we expect the GPU to spend most of its time at or close to that frequency and performance"

I can see where the confusion comes from. From that statement alone, it's not clear whether he's speculating that developers are going to max out the GPU or saying that it can't sustain that clock. I understood it as him saying he expects developers to optimize for full saturation most of the time, and that the rest of the time is when the GPU doesn't need to be maxed. I was able to infer that because I did watch the full presentation, and thus understood how the variable clocks were explained to work.

However, people are trying to argue that since he only said "most" of the time, it must be less than 10.28 TFLOPs, and that's incorrect.

So just to clarify: since you understand that TFLOPs are based on peak, you agree that the PS5 is in fact a 10.28 TFLOPs console, yeah?
 

Sosokrates

Report me if I continue to console war
Great OP.

What I think will happen a lot is that devs will just find clock speeds that can run locked all the time, so they get consistent performance, and they'll lower the graphics so the game runs at these lower but sustained clock speeds.
 
Yes, the variable frequency is based on utilization. But I see now that your argument is about the semantics of "10.28 most of the time"; well, of course that can't be proven, since no one can predict the future, right?

I wasn't the one who said it, but the person who did was most likely citing Cerny:



I can see where the confusion comes from. From that statement alone, it's not clear whether he's speculating that developers are going to max out the GPU or saying that it can't sustain that clock. I understood it as him saying he expects developers to optimize for full saturation most of the time, and that the rest of the time is when the GPU doesn't need to be maxed. I was able to infer that because I did watch the full presentation, and thus understood how the variable clocks were explained to work.

However, people are trying to argue that since he only said "most" of the time, it must be less than 10.28 TFLOPs, and that's incorrect.

So just to clarify: since you understand that TFLOPs are based on peak, you agree that the PS5 is in fact a 10.28 TFLOPs console, yeah?

Yeah, that is what I was arguing. I agree that it is a 10.3 TFLOPs machine, but I was contesting that that is what it would run at most of the time. I was thinking you cannot definitively say that, unless I don't understand how the throttling works. So based on what you have told me, I stand by my original comment: you cannot say it will run at 10.3 most of the time.
 

welsay01

Neo Member
Yeah, that is what I was arguing. I agree that it is a 10.3 TFLOPs machine, but I was contesting that that is what it would run at most of the time. I was thinking you cannot definitively say that, unless I don't understand how the throttling works. So based on what you have told me, I stand by my original comment: you cannot say it will run at 10.3 most of the time.

Sure, just as much as you can't say that it won't run at 10.3 most of the time, because no one can predict how people are going to use it.
 

Elenchus

Banned
I think launch games get a lot further along with the architecture now than in the past. Super-custom PS3-type systems took much longer; not anymore. So they might not max it out, but that was never an expectation on my end anyway. Even going back to something like Halo CE, it looked amazing to see those massive worlds at that fidelity on a console. Some of that was because the Xbox was a beast and easier to develop for, and there were no constraints from other systems. One of the best-looking launch games of all time, in my opinion, for those reasons. So I don't think it's perception at all. A brand-new CPU with 4-5x the power, a 6x GPU, more RAM, and a customized SSD is a league above what we have. And I don't believe it's enough of a marketing point to matter; the vast majority of gamers aren't even thinking that way. I just think gamers will see PS5 exclusives and see a difference, not notice that it's not on PS4. That won't even cross their minds.

But you missed my central point. Halo Infinite’s development began at least a year before the development of the XSX. 343 could not have set the scope of their game to the specs of the XSX early in their development cycle, because they had no way of knowing where those specs would land. That’s like tossing a dart at a dartboard blindfolded.
 

Romulus

Member
But you missed my central point. Halo Infinite’s development began at least a year before the development of the XSX. 343 could not have set the scope of their game to the specs of the XSX early in their development cycle, because they had no way of knowing where those specs would land. That’s like tossing a dart at a dartboard blindfolded.

I think it being developed on the Xbox One S is more of an anchor than not knowing the exact specs. It likely forced them to build around the old console and rely on a scaler that lets them go up with settings, etc.
 
You guys are very desperate. The PS5 is a 3.5 GHz, 10.3 TF console. It's ok.

I can't believe there's a thread on this. The only thing misleading is this entire article.



MS puts a pretty name on things that are already available. You can't make up in software what raw performance can do in hardware. XB fans love telling that to PS5 vs SX, yet now bring up this Velocity stuff as if it can make up a 3x difference in SSD speed.
We can infer from the claims both companies make what raw data speeds their compression algorithms will give each console:
The SX would allow up to 6 GB/s (up from 2.5).
The PS5 would allow up to 9 GB/s, but their chip can handle up to 21 for some reason, maybe depending on the data type (up from 5.5). Maybe I should re-watch the stream, but I'll wait for some written overview.

So it looks like MS has a better compression algorithm (they claim better compression, and for all I know it's true), but in the end that aspect of their console is still slower.
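Since both vendors quote a raw speed and a post-decompression speed, the implied average compression ratio is just simple division. A quick sketch of that arithmetic, using only the round numbers claimed above (vendor claims, not measurements):

```python
# Implied average compression ratio = claimed effective speed / raw speed.
# All GB/s figures are the vendors' own claims, not measurements.

def implied_ratio(raw_gbps: float, effective_gbps: float) -> float:
    """How much the compression multiplies effective read speed."""
    return effective_gbps / raw_gbps

xsx_ratio = implied_ratio(2.5, 6.0)   # Series X: 2.5 -> 6 GB/s claimed
ps5_ratio = implied_ratio(5.5, 9.0)   # PS5: 5.5 -> 9 GB/s claimed

print(f"XSX implied ratio ~{xsx_ratio:.2f}x, PS5 ~{ps5_ratio:.2f}x")
```

Which matches the point above: the XSX figures imply a stronger compression ratio, but the PS5's higher raw speed still leaves it ahead in absolute GB/s.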

For the TF numbers, Cerny went out of his way to explain that a game like Horizon Zero Dawn would somehow stress the PS4 Pro on the map screen, when it shouldn't. So my assumption is that the lower clock is there for that kind of scenario. The way he explained it really wasn't great, but logically I assume they did not create a "boost mode" to boost map-screen performance; rather, the downclock kicks in when there are explosions or other compute-heavy stuff going on, which is unlikely to be most of the time, and they made it predictable so that developers can design around it.

Otherwise it just makes no sense: it's possible to boost when you are under low load, but why would you?
 
Funny, I'm more of a real PlayStation fan than some of you warriors here. I support all platforms, but keep saying whatever makes you feel better. I can tell some recent news has you guys extra triggered; I wonder what it could be. You guys need to relax, take a chill pill, and just decompress for a bit. It's okay.

You say one thing that doesn't fit with what these crybabies want to hear, and it's immediately "release the kraken!" Holy shit, lol.

Nope, you aren't. Yeah, talking about warriors, look who's talking.
 

Goliathy

Banned
Great OP.

What I think will happen a lot is that devs will just find clock speeds that can run locked all the time, so they get consistent performance, and they'll lower the graphics so the game runs at these lower but sustained clock speeds.

Bingo. Maybe 1st-party devs will optimize for that, but 3rd party? No way.
That's also why MS is going with a LOCKED approach on everything: to make it easier for devs.
MS could easily use variable frequencies too, but not many devs would use them anyway, and it makes the console very hot and loud.
 

oldergamer

Member
Bingo. Maybe 1st-party devs will optimize for that, but 3rd party? No way.
That's also why MS is going with a LOCKED approach on everything: to make it easier for devs.
MS could easily use variable frequencies too, but not many devs would use them anyway, and it makes the console very hot and loud.
If boost mode were a good thing, Sony would have made the clocks variable on the PS4 Pro.
 

Goliathy

Banned
Gaming Bolt is saying it does not expect Sony’s overclocking to scale linearly, that the XSX’s 15% GPU advantage on paper may actually be 25-30% in real-world performance, and that it expects the PS5 to often run at 1800p.

It’s at 9:15-11:25 in the video below. Just one view but interesting analysis.

This is the problem with Sony’s approach of tell but don’t show. Not everyone is going to buy Cerny’s powerpoint slides.



This further proves the point that the PS5 is a 9 TFLOPS machine, and that they did this whole overclocking thing to further close the gap ON PAPER.

It's also why it took them so long to give details and tell us the TFLOPs, and why we haven't even seen the console yet.
 
I've seen that before and highly doubt it; even then, it's on PC and will have a "minimum system specs" userbase that they can't leave out. Even making the minimum relatively high, like a 1060ti, is nowhere near the XSX.
Do you even realize how scaling works? It seems like you have a chip on your shoulder regarding the most powerful hardware: PC. It's funny to see you try to reason without having the slightest clue that games and engines scale better than consoles do. Someone with a high-end GPU (even a few years old) will get better results than any next-gen console, while someone with a medium or low-end GPU can play the game with decreased visuals and framerate.

It's called common sense, and people like you who don't know how things work shouldn't speak out without doing the slightest research, spewing FUD.
 

Tumle

Member
What I’m getting from these threads about unlocked and locked TF is that I’m getting a PS5!
How bad are the engineers at Microsoft that they can’t manage to throttle down power consumption when I’m just idling on the home screen?
I’m not going to take out a small loan to keep my console turned on.
 

Goliathy

Banned
What I’m getting from these threads about unlocked and locked TF is that I’m getting a PS5!
How bad are the engineers at Microsoft that they can’t manage to throttle down power consumption when I’m just idling on the home screen?
I’m not going to take out a small loan to keep my console turned on.

What? Microsoft could easily do this too, but they don’t, because having a locked frequency is better in many ways.
 

Tenka Musou

Banned
Didn't Cerny say that if the GPU/CPU downclocks, it'd only be by 2%? So the GPU would still be in the 10 TF range.

Why are we still talking about 9.2 TF?
 

UnNamed

Banned
I'm not a tech expert, but I wonder if the full TF figure on PS5 will only be reached in games with low CPU load.
For example, CPU-bound games probably won't benefit from the GPU overclock, since all the power would be directed to CPU tasks: heavy load -> heavy watt consumption -> fewer watts for the GPU -> "underclocked" GPU.
Imagine a next-gen Assassin's Creed Unity, with all that game's logic, NPCs, simulation and physics, draw calls, and the many other things you find in an open-world game; I don't think it would benefit from the GPU overclock.
Other games, such as Mortal Kombat, could benefit from the boost, since the game logic is relatively light.
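That chain (heavy CPU load -> fewer watts available -> lower GPU clock) can be pictured with a toy model. This is not Sony's actual algorithm, and every number here is invented for illustration:

```python
# A toy model of shared-power-budget clocking. NOT Sony's actual
# algorithm; the budget and wattage numbers below are invented.

TOTAL_BUDGET_W = 200.0   # hypothetical shared APU power budget
GPU_MAX_CLOCK = 2.23     # GHz, the PS5's quoted GPU cap
GPU_MAX_POWER_W = 140.0  # hypothetical power needed to hold that cap

def gpu_clock(cpu_draw_w: float) -> float:
    """GPU clock (GHz) after the CPU takes its share of the budget."""
    available = TOTAL_BUDGET_W - cpu_draw_w
    # Pretend the clock scales linearly with available power, capped at max.
    return min(GPU_MAX_CLOCK, GPU_MAX_CLOCK * available / GPU_MAX_POWER_W)

print(gpu_clock(40.0))  # light CPU load: GPU holds the full 2.23 GHz
print(gpu_clock(80.0))  # heavy CPU load: GPU drops below its cap
```

The point of the sketch is only that a GPU-heavy game with a light CPU load keeps the full clock, while a CPU-heavy open-world game eats into the same budget.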
 
I'm more worried about the variable performance (boost) of the GPU than of the CPU.
The Zen 2 CPU has high enough IPC for 4K games, and with all the offloading the custom SSD controller and audio chip should do for the CPU, I don't see CPU underclocking as a big problem.
Nobody runs idiotic 720p FPS tests on consoles.

I'll wait for the games to speak for themselves; it won't stop me from getting a PS5 day 1.

I don't believe in any secret sauce.
 

Akuji

Member
Love these threads. :D As someone who's buying a PS5 for the exclusives and playing the rest on PC, it's hilarious.
Xbox fans are spitting everywhere; well, the pre-built-PC fanbase is going to be disappointed soon, when the PS5, which actually is a console, plays to its strengths.

But whatever, Xbox raw performance FTW, right? :D

Please keep fighting those tech fights with that little knowledge of how things actually work, for all eternity; it lightens my mood.
 

FranXico

Member
1. Developers will not need to take GPU frequency drops into account when optimizing graphics for the PS5. Power is redirected based on workload: if they push more graphics, the GPU will keep running at 10 TF, and the CPU will be assigned less power if it was not using it. However, the system can still be stressed on both CPU and GPU simultaneously, in which case the cooling system will need to kick in as temperature rises. How often that will happen with actual games remains to be seen.

Having said that, 3rd-party games will noticeably run better on the XSX; there is no way around that.

2. If the XSX could safely be upclocked to reach 14 TF, it would have already been done. The extra power required, and heat produced, to run at a higher frequency does not scale linearly, and even more heat would be generated with more CUs. It is designed as a tower with a big fan for a reason.
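The nonlinear scaling in point 2 follows from dynamic CMOS power being roughly proportional to voltage squared times frequency; since higher clocks also demand higher voltage, power grows much faster than the clock does. A toy model (the linear voltage-tracks-frequency assumption is an illustration, not AMD's real V/F curve):

```python
# Why upclocking doesn't scale linearly: dynamic power is roughly
# P ~ C * V^2 * f, and a higher frequency also needs a higher voltage.
# The V-tracks-f assumption below is a crude illustration only.

def relative_power(freq_scale: float) -> float:
    """Power relative to baseline, assuming voltage rises with clock."""
    voltage_scale = freq_scale          # crude assumption: V tracks f
    return voltage_scale ** 2 * freq_scale

print(relative_power(1.15))  # +15% clock -> ~1.52x power in this model
```

So under this rough model, a 15% upclock costs roughly 50% more power, which is why "just clock the XSX to 14 TF" was never free.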

3. The same website that regurgitated the misleading 12 TF + "13 TF from RT" = 25 TF false-advertising rubbish at face value now feels very comfortable calling out Cerny. Hardly unbiased, hardly reliable.

I see plenty of damage control over the PS5's inferior GPU specs to go around, but way more FUD out there. Predictably, plenty of the hypocrites who attacked NXGamer, calling him a Sony fanboy, are now willing to buy whatever bridge these actual Xbox fanboys are selling. The double standards on display here are in a league of their own.
 

longdi

Banned
Damn, the other thread got locked. Anyway, here are my expert views; I have 20 years of gaming experience.

IIRC, the '2% loss with a 10% power drop' figures are just guesses from our asses. Cerny did not commit to any official loss numbers. It could be 2%, it could be 10%. Sneaky!

I still can't wrap my head around what Sony is doing differently or better here.

It just sounds similar to what the Series X does. Both have the same APU tech from AMD...
Boost clocking has come a long way since Jaguar & GCN.
Power shifting has come a long way too.
It makes sense for consoles to use deterministic values.
It makes sense that load management between CPU and GPU in the APU improves. If not, FU.
Both Sony and MS have to lock a vcore along the V/F curve; as such,
a tiny 36-CU RX 6700 probably runs at a higher frequency than a big, powerful, sexy, full-value 52-CU RX 6800 XT.
I expect AMD to market the first 2 GHz GPU next month!
Sony can apply a fixed 10% max boost on the RX 6700 in the PS5 with a fixed vcore.
It loses the power efficiency of 'stock boost', but the PS5 is not portable, so no big deal.
The PS5 will run at the same or a higher TDP than the Series X because of this loss.
My 1080 Ti runs at >1.9 GHz in all games at 300 W. The stock boost is 1.6 GHz at 250 W. That's without extra vcore, just with watercooling.
In the worst case, the PS5 drops back to 'stock' and loses the 10% overclock.
Cerny went roundabout. His main point really is: the PS5 will use a good cooling solution. :messenger_spock:
And perhaps that 10.3 TF > 9.2 TF in PR.
But he purposefully left fans lost in translation.

Quote this wall of text. I'm as confident as MS is with the Series X. This is all there is to it.
 
Yes, the variable frequency is based on utilization. But I see now that your argument is about the semantics of "10.28 most of the time"; well, of course that can't be proven, since no one can predict the future, right?

I wasn't the one who said it, but the person who did was most likely citing Cerny:



I can see where the confusion comes from. From that statement alone, it's not clear whether he's speculating that developers are going to max out the GPU or saying that it can't sustain that clock. I understood it as him saying he expects developers to optimize for full saturation most of the time, and that the rest of the time is when the GPU doesn't need to be maxed. I was able to infer that because I did watch the full presentation, and thus understood how the variable clocks were explained to work.

However, people are trying to argue that since he only said "most" of the time, it must be less than 10.28 TFLOPs, and that's incorrect.

So just to clarify: since you understand that TFLOPs are based on peak, you agree that the PS5 is in fact a 10.28 TFLOPs console, yeah?

I think "most" was actually the easiest part to understand. "Most" is 51%+, so when he says the CPU runs at its max clock most of the time, I get that. I am uncertain about the GPU clocks, because he says he expects them to run close to 2.23 GHz. I don't know what "close" means, and I am also uncertain why he "expects"... I feel like if anyone should know what the system runs at, it should be Mark Cerny.
 

Clear

CliffyB's Cock Holster
Damn, the other thread got locked. Anyway, here are my expert views; I have 20 years of gaming experience.

IIRC, the '2% loss with a 10% power drop' figures are just guesses from our asses. Cerny did not commit to any official loss numbers. It could be 2%, it could be 10%. Sneaky!

I still can't wrap my head around what Sony is doing differently or better here.

It just sounds similar to what the Series X does. Both have the same APU tech from AMD...
Boost clocking has come a long way since Jaguar & GCN.
Power shifting has come a long way too.
It makes sense for consoles to use deterministic values.
It makes sense that load management between CPU and GPU in the APU improves. If not, FU.
Both Sony and MS have to lock a vcore along the V/F curve; as such,
a tiny 36-CU RX 6700 probably runs at a higher frequency than a big, powerful, sexy, full-value 52-CU RX 6800 XT.
I expect AMD to market the first 2 GHz GPU next month!
Sony can apply a fixed 10% max boost on the RX 6700 in the PS5 with a fixed vcore.
It loses the power efficiency of 'stock boost', but the PS5 is not portable, so no big deal.
The PS5 will run at the same or a higher TDP than the Series X because of this loss.
My 1080 Ti runs at >1.9 GHz in all games at 300 W. The stock boost is 1.6 GHz at 250 W. That's without extra vcore, just with watercooling.
In the worst case, the PS5 drops back to 'stock' and loses the 10% overclock.
Cerny went roundabout. His main point really is: the PS5 will use a good cooling solution. :messenger_spock:
And perhaps that 10.3 TF > 9.2 TF in PR.
But he purposefully left fans lost in translation.

Quote this wall of text. I'm as confident as MS is with the Series X. This is all there is to it.

No offence, but having benchmarked a few PC cards gives you minimal credibility in my eyes. As I've said before, it's basically min-maxing stats in an RPG, and as such barely qualifies as any sort of technical insight.

It was a GDC talk aimed at industry people, many of whom are way, way more knowledgeable than you, and as such would see through any deception, if not immediately, then as soon as they get their hands on a dev kit and proprietary documentation.

It was absolutely not the same sort of marketing spiel that MS has been spamming to court the enthusiast or power user.
 

welsay01

Neo Member
I think "most" was actually the easiest part to understand. "Most" is 51%+, so when he says the CPU runs at its max clock most of the time, I get that. I am uncertain about the GPU clocks, because he says he expects them to run close to 2.23 GHz. I don't know what "close" means, and I am also uncertain why he "expects"... I feel like if anyone should know what the system runs at, it should be Mark Cerny.

Getting into what 'most' or 'close' means is semantics.

He's just speculating about how it will be used. He can't say it will spend 100% of the time there, because there will be instances when it won't need to be at 100%, since the clock variance is based on workload/utilization and not heat. Saying 'most' or 'close' is just optimism that developers will utilize the system to its potential.

The system is capable of running sustained max GPU, sustained max CPU, or both sustained with reduced clocks, all with no overheating issues, because it was designed around, and specifically capped at, those clocks to fit within the cooling solution's capability.

Reading into words that are ultimately semantics is just looking for fire where there is no smoke.

Obligatory: the XSX is more powerful than the PS5. Locked clocks are better than variable clocks. Sony went with a smaller APU with variable clocks to allocate more of their budget to the custom I/O, flash controller, audio, and SSD.
 

quest

Not Banned from OT
No offence, but having benchmarked a few PC cards gives you minimal credibility in my eyes. As I've said before, it's basically min-maxing stats in an RPG, and as such barely qualifies as any sort of technical insight.

It was a GDC talk aimed at industry people, many of whom are way, way more knowledgeable than you, and as such would see through any deception, if not immediately, then as soon as they get their hands on a dev kit and proprietary documentation.

It was absolutely not the same sort of marketing spiel that MS has been spamming to court the enthusiast or power user.
It was all PR. Why else were his guys on Twitter repeating his lines and astroturfing on forums? It was pure PR: only SSD and clock speed matter. He did not commit to any hard numbers; 'most of the time' could be 51%. He said 'a couple', not 2%, and there was no chart. If it really was only 2%, they would have just left it there, locked. Jesus, people, think. You think the suits would spend tens of millions on cooling for 2% over 7 years? He did not want to show a chart dropping below 10 TF, period.
 

scalman

Member
Why are you all misleading? The X1X is now the most powerful console, with the best resolutions and best FPS in games, better than the PS4 Pro in all specs. So did everyone jump to the X1X because it has better specs? The numbers say no. So nothing will change next gen if Xbox doesn't give more amazing exclusive games... I mean, more PC/Xbox games that everyone could play on PC. Umm... so you just need a PC anyway.
 

Romulus

Member
Do you even realize how scaling works? It seems like you have a chip on your shoulder regarding the most powerful hardware: PC. It's funny to see you try to reason without having the slightest clue that games and engines scale better than consoles do. Someone with a high-end GPU (even a few years old) will get better results than any next-gen console, while someone with a medium or low-end GPU can play the game with decreased visuals and framerate.

It's called common sense, and people like you who don't know how things work shouldn't speak out without doing the slightest research, spewing FUD.

Huh? You get so upset the second someone brings up PC, it's hilarious, and you jump to conclusions about posts.
Meanwhile, I require a PC for gaming. It probably blows your mind that I can't be as biased as you are. The funny part is, you don't even understand what you defend. Consoles have their advantages. System requirements are a thing on PC for a reason. Sure, there are scalers, and you can run potato settings on many PCs. That's not the point at all. But once again, everything flies over your head, because you got so angry at the mention of PC that you scrambled to post. You're blocked again.

I mean, look at your post history regarding PC vs console. It's just constant warring and triggered comments. Endless anger and aggressiveness. lmao
 
I am 99% convinced that the PS5 will come in 2021, not 2020. The question I am asking: will the PS5 in 2021 be able to play AAA games at 4K 60fps? If it doesn't, Sony did a bad job and will lose a lot of market share to Microsoft.
 

StreetsofBeige

Gold Member
I am 99% convinced that the PS5 will come in 2021, not 2020. The question I am asking: will the PS5 in 2021 be able to play AAA games at 4K 60fps? If it doesn't, Sony did a bad job and will lose a lot of market share to Microsoft.
Considering how bad that show was and how weak the PS5 is (aside from the SSD), I wouldn't be surprised if they scrapped it entirely, rode out the PS4/Pro for another two years, and released the PS5 in 2022. They surely have other R&D designs they could green-light with more power. Every tech company has multiple blueprints floating around its product development teams.

At one electronics company I worked at, R&D already had a road map of products to come out every year for the next six years, but most people in the office were only told about products one year out (at most).

Coronavirus is a perfect excuse for any company to change products.
 

Windows-PC

Banned
I find it frustrating that Cerny wasted a portion of his GDC talk obfuscating the PS5's boost and justifying the PS5's low TFLOPs spec.
Boost is boost is variable clocks. TFLOPs are TFLOPs. APU SmartShift is SmartShift.
He may have fooled a few, and not others. Look at us going in circles trying to make sense of it... like, wtf is 'reverse boost'? :messenger_astonished:

For a developers' conference, I'd rather he had spent that time talking about any new additions he made to the CPU or GPU, like he did for the PS4 and PS4 Pro, where he covered such additions over the equivalent AMD PC GPUs.

Such behaviour just leads to suspicions that Mark/Sony fucked things up and panicked at the power of the X.

I also wanted to add that I lost a lot of respect for Mark Cerny because of how hard he tried to downplay the importance of TFLOPs, just because the competition clearly has the advantage.

The PS5 will be absolutely fine with 10.3 TF and will also have great games. He should have shown some games to demonstrate what you can achieve with 10.3 TF, instead of downplaying the importance of TFLOPs.

And if TFLOPs are no longer important to Mark Cerny, then I don't understand why he overclocked the GPU so horrendously to reach 10.3 TF and match the competition, instead of keeping it at 8-9.2 TF!
 
Huh? You get so upset the second someone brings up PC, it's hilarious, and you jump to conclusions about posts.
Meanwhile, I require a PC for gaming. It probably blows your mind that I can't be as biased as you are. The funny part is, you don't even understand what you defend. Consoles have their advantages. System requirements are a thing on PC for a reason. Sure, there are scalers, and you can run potato settings on many PCs. That's not the point at all. But once again, everything flies over your head, because you got so angry at the mention of PC that you scrambled to post. You're blocked again.
Who wants other people to NOT get a game? Imagine being that dense, especially among those who spend the money to get the best hardware. Of course I, and others, find that offensive as gamers. It's childish to think that way, and you need to grow out of that mentality. It's toxic to the gaming community as a whole. If I were a PS5 owner, I couldn't care less if a game went to Nintendo or Xbox, and vice versa.

In other words, more games are coming to PC, no matter your childish opinion. And who blocks and unblocks people? Lol. What a funny person you are.
 

quest

Not Banned from OT
Are you telling me that having a system always running at max output is to be desired?
Then I welcome back the red ring, part 2. ;)
It downclocks under load, unlike every other home console released. It really sucks at the end of the generation, when developers won't be able to push the system to the max, because doing so will just slow it down or give them unknown performance from these variable clocks. It was a bad idea to overclock to the moon to get over 10 TF "most of the time" (tm Sony).
 

welsay01

Neo Member
It downclocks under load, unlike every other home console released. It really sucks at the end of the generation, when developers won't be able to push the system to the max, because doing so will just slow it down or give them unknown performance from these variable clocks. It was a bad idea to overclock to the moon to get over 10 TF "most of the time" (tm Sony).

Why wouldn't it be able to be pushed to its max? Yes, its max is not as high as the XSX's, and yes, it downclocks under the worst-case scenario of max GPU and max CPU, but if that's its max, why wouldn't they be able to push to it?

Obviously you'd want them to optimize so things aren't constantly in that worst-case scenario, but it's not like the variable clocks need to be manually handled by the developer, since the APU is the one deciding the frequency. There is no dealing with 'unknown performance'. You throw code at it; it decides whether it's CPU-bound, GPU-bound, or both, and clocks accordingly. So it's repeatable: it's not going to look at the same piece of code and decide it's worth 2.23 GHz one day and only 2.0 GHz some other day.
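The repeatability point can be pictured as the clock being a pure function of the workload: same input, same clocks. A deliberately toy sketch (the thresholds and numbers are invented, not Sony's logic):

```python
# Toy illustration of deterministic clocking: the clock decision is a
# pure function of workload, so the same code always gets the same
# frequency. Thresholds and GHz values are invented, not Sony's.

def apu_clocks(cpu_activity: float, gpu_activity: float) -> tuple:
    """Return (cpu_ghz, gpu_ghz) for activity levels in [0, 1]."""
    if cpu_activity > 0.9 and gpu_activity > 0.9:
        return (3.3, 2.1)    # both maxed: shave clocks to stay in budget
    if gpu_activity > cpu_activity:
        return (3.2, 2.23)   # GPU-bound: GPU holds its cap
    return (3.5, 2.23)       # CPU-bound or light load

# Deterministic: the same workload always yields the same clocks.
assert apu_clocks(0.5, 0.95) == apu_clocks(0.5, 0.95)
```

The contrast with thermal throttling is the point: a heat-driven clock depends on ambient temperature and chip lottery, while a workload-driven clock like this one is repeatable across machines and runs.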


Are you telling me that having a system always running at max output is to be desired?

Then I welcome back the red ring, part 2. ;)

Cheeky, but MS learned their lesson from the RROD better than anyone. All iterations of the Xbox One have had great build quality; I'd even say they have the best build of any console ever, and I have no doubt that the XSX will continue that.
 

Elenchus

Banned
1. Developers will not need to take GPU frequency drops into account when optimizing graphics for the PS5. Power is redirected based on workload: if they push more graphics, the GPU will keep running at 10 TF, and the CPU will be assigned less power if it was not using it. However, the system can still be stressed on both CPU and GPU simultaneously, in which case the cooling system will need to kick in as temperature rises. How often that will happen with actual games remains to be seen.

Having said that, 3rd-party games will noticeably run better on the XSX; there is no way around that.

2. If the XSX could safely be upclocked to reach 14 TF, it would have already been done. The extra power required, and heat produced, to run at a higher frequency does not scale linearly, and even more heat would be generated with more CUs. It is designed as a tower with a big fan for a reason.

3. The same website that regurgitated the misleading 12 TF + "13 TF from RT" = 25 TF false-advertising rubbish at face value now feels very comfortable calling out Cerny. Hardly unbiased, hardly reliable.

I see plenty of damage control over the PS5's inferior GPU specs to go around, but way more FUD out there. Predictably, plenty of the hypocrites who attacked NXGamer, calling him a Sony fanboy, are now willing to buy whatever bridge these actual Xbox fanboys are selling. The double standards on display here are in a league of their own.

DF disagrees.


 
But you missed my central point. Halo Infinite’s development began at least a year before the development of the XSX. 343 could not have set the scope of their game to the specs of the XSX early in their development cycle, because they had no way of knowing where those specs would land. That’s like tossing a dart at a dartboard blindfolded.
I think it being developed on the Xbox One S is more of an anchor than not knowing the exact specs. It likely forced them to build around the old console and rely on a scaler that lets them go up with settings, etc.
They probably started development with a Vega 56/64 when they dropped the first trailer.

Can't wait for the PS5 Pro with 20 TFLOPs.
:messenger_neutral: I'm not sure about that.
 

SynTha1

Member
I just wanted to ask if anybody in here has any hands-on knowledge of both dev kits, because I keep hearing "well, this is how it's always been, so this is how I think it's supposed to happen". I get that we are just speculating at this point, but I see a lot of people spitting out their speculation as fact, like they already have both consoles and have developed a game on both. It's really getting tiring.
 

sendit

Member
Considering how bad that show was and how weak the PS5 is (aside from the SSD), I wouldn't be surprised if they scrapped it entirely, rode out the PS4/Pro for another two years, and released the PS5 in 2022. They surely have other R&D designs they could green-light with more power. Every tech company has multiple blueprints floating around its product development teams.

At one electronics company I worked at, R&D already had a road map of products to come out every year for the next six years, but most people in the office were only told about products one year out (at most).

Coronavirus is a perfect excuse for any company to change products.

Agreed. Microsoft should also delay, considering they won't have a single next-gen exclusive at launch.
 
