
Next-Gen PS5 & XSX |OT| Console tEch threaD


kyliethicc

Member
So Moore's Law Is Dead was right, hmm? He said he heard Cory Barlog was just a producer on the new God of War game.
I think he will have a writing credit. He’s said they’ve written out basically a trilogy already. He laid the groundwork for new God of War, so the next one might not need him to direct it all. It will still have the no cut camera etc.

If I were Sony, and I wanted to green light a new AAA IP, I’d let the guy who just made GOTY do it, if he wanted to. And I think he wanted to. I think Cory’s new game is space related.

“Cory Barlog, the Creative Director of 2018’s God of War recently updated his Twitter profile, posting cryptic messages. Barlog changed his Twitter profile to that of the Voyager 2 space satellite, his cover photo to a picture of stars in space, and tweeted the cryptic message “—- <END TRANSMISSION> … .. .””

“British film Director Duncan Jones met with Barlog back in 2018, and tweeted “if you think that God of War is his magnum opus, just you wait!” Jones is well known for having directed many sci-fi and space themed films.”

 


Need to sleep for now.:lollipop_thescream:

 

Three Jackdaws

Unconfirmed Member
I think he will have a writing credit. He’s said they’ve written out basically a trilogy already. He laid the groundwork for new God of War, so the next one might not need him to direct it all. It will still have the no cut camera etc.

If I were Sony, and I wanted to green light a new AAA IP, I’d let the guy who just made GOTY do it, if he wanted to. And I think he wanted to. I think Cory’s new game is space related.

“Cory Barlog, the Creative Director of 2018’s God of War recently updated his Twitter profile, posting cryptic messages. Barlog changed his Twitter profile to that of the Voyager 2 space satellite, his cover photo to a picture of stars in space, and tweeted the cryptic message “—- <END TRANSMISSION> … .. .””

“British film Director Duncan Jones met with Barlog back in 2018, and tweeted “if you think that God of War is his magnum opus, just you wait!” Jones is well known for having directed many sci-fi and space themed films.”

Wow. You know, I was kind of scared reading this at first; God of War is one of my favourite games and Cory Barlog was at the heart of it all. I wouldn't trust the development of the sequels to anyone else, but I guess if Cory trusts them, then so do I.

That being said, the sci-fi game seems like it will be Cory's passion project and it will also release on the PS5. I'm really looking forward to it.

Also really curious to see the graphics of the God of War sequel. The first game looks ridiculously good, especially the details in the close-up shots, and the art direction rivals Horizon Zero Dawn at times. I know MLiD claimed he had heard from his source that the sequel will have "jaw-dropping graphics approaching photorealism"; I really, really hope this is true. Fingers crossed.
 
as a playstation proponent, I find it funny that you talk like that.
since we already know that the CUs are where raytracing calculations will take place,
and the ps5 has around 30% fewer of them than the xbox (36 vs 52), here is a simple example for you to ponder:
let's say a game - a third party game - uses 28 CUs on ps5 for everything else, and keeps 8 for ray tracing effects.
the xbox can easily use 36 CUs for everything else, over-compensating for the maximum possible Hz difference, and still have double the available CUs for raytracing effects.
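
Here's that budget written out as a quick sketch (the CU totals are the official specs; the 28/8 split is just the made-up example above, and clock speed is ignored):

```python
# Back-of-the-envelope only: official CU totals, but the raster/RT split
# is the made-up example from the post above, and clock speed is ignored.
PS5_CUS, XSX_CUS = 36, 52

ps5_raster = 28                      # hypothetical "everything else" budget on PS5
ps5_rt = PS5_CUS - ps5_raster        # 8 CUs left for ray tracing

xsx_raster = 36                      # give XSX the same raster budget as the whole PS5 GPU
xsx_rt = XSX_CUS - xsx_raster        # 16 CUs left for ray tracing

print(f"PS5: {ps5_raster} raster + {ps5_rt} RT CUs")
print(f"XSX: {xsx_raster} raster + {xsx_rt} RT CUs ({xsx_rt / ps5_rt:.0f}x the RT budget)")
```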

should I wish you good luck?
 

roops67

Member
That's a lie


Not XSX but a revision of an XSX devkit most likely. This demo is from the early days, so I would guess the early devkit would have been more of a PC approximation of an XSX (again guessing, but I don't think they would have had XSX APUs available to put into devkits yet)

This and the Minecraft ray-tracing demo have been stated by Microsoft to be running on XSX hardware so early (no mention of a devkit). Does anybody find that questionable or suspicious considering they're unable to show anything now?

It's more likely Microsoft's version of the truth: XSX is whatever hardware they had at the time to approximate it (?)
 

roops67

Member
There was this blog post covering the event: a little extra info on top of the slides, plus some Q&A at the end.


one could say that their adoption of AMD's variable clocks thingy was indeed a reaction to what MS did.
I mean, over the years we've seen how Sony is able to have an answer in the loudest way, even the next day, IF that is to their benefit. Remember the "how to share games" video?
One could say this was proven to be pure FUD. Using this lie as a foundation cancels any argument built on it.

One example of Sony having a dig at Microsoft... just one fun poke which Microsoft left themselves wide open for (you can't say they didn't deserve it?), and now Sony are the loudmouths of the industry?! What the fuck, the hypocrisy here is beyond words!!!

Edit: I really wish the mods would add a facepalm 🤦‍♂️ trigger
 
Last edited:

Hustler

Member
Buy Best Buy gift cards online now for the dollar amount. Do it soon so they can be delivered soon. If you end up pre-ordering while waiting on the gift cards, choose in store pickup so you can pay with the gift cards at pickup since you can’t change payment type for online orders.

Just so you know, that's not an option. It states during checkout that rewards cannot be used towards gift card purchases.
 

roops67

Member
AMD's variable clocks thingy?

My understanding from R2PS5 is that the variable clocks are something Sony came up with, which AMD helped them with, and that Sony then added AMD's SmartShift "while we're at it". SmartShift only shifts electrical power between the CPU and GPU; it has nothing to do with the variable clocks.

And even if the variable clocks are an AMD thing, just take a look at this graph posted on Beyond3D, which shows PC GPUs already have constantly variable clocks:

[Beyond3D graph: GPU clock speed varying under load]

RGT(?) and MooresLawIsDead have heard that XSX has removed the variable frequency circuitry from its GPU to save space. Wonder how they're gonna get around all these variables like excessive loads etc... :pie_thinking: 'cos as shown above, GPU workloads ain't all plain sailing!

Edit: LoL the graph is for THICC II Ultra ...
[image: the THICC II Ultra card]

... now that's an excessive load!
 
Last edited:

SSfox

Member
My bank statement after I've bought the RTX 3080 (or 3070 if NVIDIA goes crazy with the price),
a PS5, a decent 4K TV and the XSX this year, and I still need to buy gifts for my family this Christmas.

I know people that play on PC and the amount of money they've spent is insane, and they don't even play that much. They're not rich or anything, but they just got baited into the mindset of "I NEED TO HAVE THE BEST GRAPHICS CARD, I NEED TO RUN MY GAMES AT MAXIMUM" etc. That was scary to see lol
 

FeiRR

Banned
What’s Gamestop?

/s
The dirtiest and messiest game store I've been to. Probably there are people who think it's as global as McDonald's, lol.

Man this looks absolutely mind blowing.



Can't imagine what games that strive for realism will look like nextgen.

It does, but not from close up; check the bad screenshot thread. So it's not going to help game devs a lot in designing environments. Even if it's a good start, hundreds of hours have to be put into making it look good. They didn't have to do that for this particular project because of its nature.

Not XSX but a revision of an XSX devkit most likely. This demo is from the early days, so I would guess the early devkit would have been more of a PC approximation of an XSX (again guessing, but I don't think they would have had XSX APUs available to put into devkits yet)

This and the Minecraft ray-tracing demo have been stated by Microsoft to be running on XSX hardware so early (no mention of a devkit). Does anybody find that questionable or suspicious considering they're unable to show anything now?

It's more likely Microsoft's version of the truth: XSX is whatever hardware they had at the time to approximate it (?)
I agree with your opinion but I'm known to be paranoid about MS and hardly anybody believes me ;).
 

vivftp

Member
Teraflops still remain a huge mystery for me. Companies have weaponized them for an easy to use stat for the general gamer populace to use when comparing products. It seems to be a decent enough value to try and gauge relative performance, at least in some benchmarks I've seen (I haven't done in-depth research). Then again I've seen many, many developers and industry folk chime in to say teraflops are not a useful metric to use (even Cerny!), they're just a peak figure that's just one part of a much larger system. Then we have the folks who say teraflops between different architectures is a pointless comparison, which I'd say makes sense, and they'd say teraflops between the same architecture is fair game, which I'm still curious about.

I'm mostly thinking about the PS5 and its use of variable clocks. Now you could compare it against what the XSX is doing, or just against the way other consoles in the past have worked, since those are more traditional designs. I'm more wondering about the actual methodology and what sort of benefits each approach yields. For simplicity's sake I suppose the PS5 and XSX are the best ways to compare.

From what I've read it's pretty rare that either machine (or any GPU for that matter) will ever actually be doing work at its absolute peak TF figure; they'll generally spend most of their time somewhere below that (where, I have no clue). Hitting their peak TF figures is more like a blip here, a blip there rather than having the game completely max out the peak TF figure for minutes or hours on end. I've seen it argued that while both the XSX and the PS5 have their stated peak TF figures, the design of the variable clocks will let the PS5 punch closer to its peak on average more often than the XSX will be able to punch towards its peak TF figure. There was a post by an Era user, I believe it was Liabe Brave, that gets posted here every once in a while that seemed to describe things pretty well. I don't have it immediately handy, but it seems to indicate that pound for pound the PS5 should be able to punch above the weight class that's calculated on paper with just the TF figure.
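
For reference, the paper TF numbers everyone throws around are just CU count times 64 shader lanes times 2 ops per clock times frequency. A quick sketch of that math (the 2% clock drop at the end is only an illustrative assumption, going off Cerny's "couple of percent" comment):

```python
# Where the headline "paper" figures come from; the 2% drop is illustrative only.
def peak_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000  # CUs * shader lanes * FP32 ops/clock

print(f"XSX peak: {peak_tflops(52, 1.825):.2f} TF (fixed clock)")
print(f"PS5 peak: {peak_tflops(36, 2.23):.2f} TF (capped clock)")
print(f"PS5 at a ~2% lower clock: {peak_tflops(36, 2.23 * 0.98):.2f} TF")
```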

Is this true? I have no frickin' idea. It's still something I wonder about when I see people discussing power. I'm by no means going to argue the PS5's GPU will outperform the XSX's GPU, I have no basis for such a claim and lack the ability to properly compare them anyways. What I am curious about is whether the TF figure really tells us definitively about the "power gap" between these two machines or if the PS5's ability to "punch above its weight class" narrows that gap at all.

I'm also curious about the points Cerny raised regarding higher clocks and fewer CU's and whether they will yield any tangible, measurable benefits. He mentioned with the higher clocks that rasterization and processing the command buffer are that much faster and the L2 and other caches have that much more bandwidth compared to going with lower clocks. He claimed the only downside was that system memory was further away in terms of cycles. What does this mean in terms of benefits to performance and what devs can do with this hardware? Again, no frickin' clue.

Dunno if that makes much sense but meh, it's been kicking around in my head and figured I'd post this stream of consciousness on the off chance someone smarter than me could tell me whether it's nonsense or not.

As always the best way to tell will be actual real world performance comparisons so I'm curious to see how things will play out once these machines are released and we have folks like DF or NX Gamer doing comparisons.
 

Nickolaidas

Banned
Teraflops still remain a huge mystery for me. Companies have weaponized them for an easy to use stat for the general gamer populace to use when comparing products. It seems to be a decent enough value to try and gauge relative performance, at least in some benchmarks I've seen (I haven't done in-depth research). Then again I've seen many, many developers and industry folk chime in to say teraflops are not a useful metric to use (even Cerny!), they're just a peak figure that's just one part of a much larger system.

Okay, let's be a little less naive here. Cerny saying teraflops don't really matter is like Spencer saying that exclusives are anti-consumer. Both are trying to downplay/handwave the areas where their hardware is left wanting. There is some truth to what Cerny is saying (that talent, tech and other stuff can help mitigate the difference between PS5 and XSX specs), but saying that teraflops don't matter is clearly a PR stunt.

Basically: if the PS5 was 13 TF and the XSX was 9 TF, there's no way in hell that Cerny would've said, 'Teraflops don't matter'.
 

vivftp

Member
Okay, let's be a little less naive here. Cerny saying teraflops don't really matter is like Spencer saying that exclusives are anti-consumer. Both are trying to downplay/handwave the areas where their hardware is left wanting. There is some truth to what Cerny is saying (that talent, tech and other stuff can help mitigate the difference between PS5 and XSX specs), but saying that teraflops don't matter is clearly a PR stunt.

Basically: if the PS5 was 13 TF and the XSX was 9 TF, there's no way in hell that Cerny would've said, 'Teraflops don't matter'.

That's the thing that I'm still puzzled about. Cerny's not the only developer or industry person who's dismissed teraflops as not being a relevant metric, not by a long shot. I am curious what metrics they would consider to be useful, but off the top of my head I don't recall anyone elaborating on that.

EDIT.
And just in case you want examples of other developers, well I don't keep a record of all the times I've seen it mentioned but I can at least cite one recent instance:

[attached screenshot: TF.png]

 
Last edited:

Sinthor

Gold Member


Need to sleep for now.:lollipop_thescream:

Sleep! Sleep is for the weak! The strong have been sapped by the whimpering and sleepiness of the weak. Men, strong men, have been denied their rest! You Bo, have been saved from that fate. Redemption and alertness, is within your grasp.

:messenger_winking_tongue:

Guess what movie I saw on TV this evening?

Then imitate the action of the tiger,
stiffen the sinews, conjure up the blood,
disguise fair nature with hard favored rage,
and lend thine eye a terrible aspect....

 

FeiRR

Banned
Okay, let's be a little less naive here. Cerny saying teraflops don't really matter is like Spencer saying that exclusives are anti-consumer. Both are trying to downplay/handwave the areas where their hardware is left wanting. There is some truth to what Cerny is saying (that talent, tech and other stuff can help mitigate the difference between PS5 and XSX specs), but saying that teraflops don't matter is clearly a PR stunt.

Basically: if the PS5 was 13 TF and the XSX was 9 TF, there's no way in hell that Cerny would've said, 'Teraflops don't matter'.
I was curious so I looked into the Road to PS5 transcript:
This continuous improvement in AMD technology means it's dangerous to rely on teraflops as an absolute indicator of performance.
And CU count should be avoided as well.
In the case of CPUs we all understand this: the PlayStation 4 and PlayStation 5 each have eight CPU cores, but we never think that means their capabilities and performance are equal.
It's the same for CUs. For one thing, they've been getting much larger over time; adding new features means adding lots of transistors.
In fact, the transistor count for a PlayStation 5 CU is 62% larger than the transistor count for a PlayStation 4 CU.
It's far from saying they don't matter at all. It was also used in context to explain the difference between GCN and RDNA2 architectures. Console wars aren't good optics for measuring serious people like Mr Cerny doing their job.
 

Nickolaidas

Banned
I was curious so I looked into the Road to PS5 transcript:

It's far from saying they don't matter at all. It was also used in context to explain the difference between GCN and RDNA2 architectures. Console wars aren't good optics for measuring serious people like Mr Cerny doing their job.
"This continuous improvement in AMD technology means it's dangerous to rely on teraflops as an absolute indicator of performance.
And CU count should be avoided as well. "

I'm sure that both TF and CU count being handwaved and 'avoided as an absolute indicator of performance' has nothing to do with the fact that they're both at a higher number in the XSX.

PR stunt. Accept it for what it is.

It isn't any different than Spencer saying they're not interested in selling more XSXs than Sony will sell PS5s because he knows the PS5 will utterly destroy the XSX in sales and tries to downplay that fact.
 

FeiRR

Banned
That's the thing that I'm still puzzled about. Cerny's not the only developer or industry person who's dismissed teraflops as not being a relevant metric, not by a long shot. I am curious what metrics they would consider to be useful, but off the top of my head I don't recall anyone elaborating on that.
That's because TF is a single variable and no real science can be done based on that. Devs work with engine profiler tools which allow them to measure and improve the performance of what goes into their engine. Here's an example of Guerrilla's profiler tool. You can see how complex it is and how many variables it displays.
 

vivftp

Member
I was curious so I looked into the Road to PS5 transcript:

It's far from saying they don't matter at all. It was also used in context to explain the difference between GCN and RDNA2 architectures. Console wars aren't good optics for measuring serious people like Mr Cerny doing their job.

Yeah, I may have oversimplified that point in my original post. I just tacked on the mention of Cerny after the fact because I was referencing his bit about the benefits of faster clocks. Here's what he said on that:

"If you just calculate teraflops you get the same number, but actually the performance is noticeably different because teraflops is defined as the computational capability of the vector ALU.

That's just one part of the GPU there are a lot of other units and those other units all run faster when the GPU frequency is higher at 33% higher frequency rasterization goes 33% faster processing the command buffer goes that much faster the 2 and other caches have that much higher bandwidth and so on.

About the only downside is that system memory is 33% further away in terms of cycles. But the large number of benefits more than counterbalanced that.

As a friend of mine says a rising tide lifts all boats.

Also it's easier to fully use 36CUs in parallel than it is to fully use 48CUs when triangles are small it's much harder to fill although CUs with useful work.

So there's a lot to be said for faster assuming you can handle the resulting power and heat issues which frankly we haven't always done the best.



I'm curious whether these points he raises will yield noticeable real world improvements, bringing us back to the whole "punching above its weight class" talking point in terms of overall real world performance. The TF gap is always brought up, but I'm curious to know if that tells us the whole story.
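
To put a number on the "33% further away in terms of cycles" part, here's a toy example; the latency figure is completely made up, and the two clocks are just picked to be roughly 33% apart:

```python
# Toy numbers only: a fixed wall-clock memory latency costs more GPU cycles
# at a higher frequency, which is the "further away in cycles" point.
LATENCY_NS = 300                     # invented figure, not a real PS5 spec

def cycles_waited(latency_ns: float, clock_ghz: float) -> float:
    return latency_ns * clock_ghz    # ns * cycles-per-ns

low, high = 1.675, 2.23              # roughly 33% apart, matching the quote
print(f"{cycles_waited(LATENCY_NS, low):.0f} cycles at {low} GHz")
print(f"{cycles_waited(LATENCY_NS, high):.0f} cycles at {high} GHz "
      f"(~{high / low - 1:.0%} more cycles for the same wait)")
```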
 

FeiRR

Banned
I'm curious whether these points he raises will yield noticeable real world improvements, bringing us back to the whole "punching above its weight class" talking point in terms of overall real world performance. The TF gap is always brought up, but I'm curious to know if that tells us the whole story.
This has to wait until some major third party publisher has a game for both platforms that can be analysed. And still, it won't be a 1:1 comparison because devs can optimize their engines for each platform to take advantage of particular architecture features.

I'd wait for a DICE game (probably Battlefield 6) or a Rockstar game (probably GTA6), because those devs are good at what they do. I'm sure a lot of early cross-gen games will be badly optimized, taking into consideration the current pandemic situation. Don't count on AC: Valhalla or Cyberpunk shining on next gen at launch.
 

TJC

Member
Okay, let's be a little less naive here. Cerny saying teraflops don't really matter is like Spencer saying that exclusives are anti-consumer. Both are trying to downplay/handwave the areas where their hardware is left wanting. There is some truth to what Cerny is saying (that talent, tech and other stuff can help mitigate the difference between PS5 and XSX specs), but saying that teraflops don't matter is clearly a PR stunt.

Basically: if the PS5 was 13 TF and the XSX was 9 TF, there's no way in hell that Cerny would've said, 'Teraflops don't matter'.
Except Cerny is an engineer who is only there to break down the system. It's more than plausible that he is right about what he wants from the system. Phil, however, flip-flops on everything; he's not an engineer, he's a salesman. I will admit he does a good job at hyping the console, but when it all goes wrong he seems to look like a fool as well. An MS engineer has recently been quoted as saying "Teraflops don't matter". It's so easy to market "POWER", but when the games shown look like turd, it's a problem
 

FeiRR

Banned
Except Cerny is an engineer who is only there to break down the system. It's more than plausible that he is right about what he wants from the system. Phil, however, flip-flops on everything; he's not an engineer, he's a salesman. I will admit he does a good job at hyping the console, but when it all goes wrong he seems to look like a fool as well. An MS engineer has recently been quoted as saying "Teraflops don't matter". It's so easy to market "POWER", but when the games shown look like turd, it's a problem
Another thing is, the PS5 is 10 TF because Cerny chose so. He had a certain budget to spend (which includes money, thermals and power) and he made some choices. Of course he's going to justify them, because he thinks they were correct (I agree with him, but that's a different thing). He made those choices after talking to devs, not marketers. Forgive my bias, but I believe engineers and programmers know much more about making games than a salesman in a Craig tee.
 

vivftp

Member
That's because TF is a single variable and no real science can be done based on that. Devs work with engine profiler tools which allow them to measure and improve the performance of what goes into their engine. Here's an example of Guerrilla's profiler tool. You can see how complex it is and how many variables it displays.


Indeed, TF does seem to be an important metric from what I can gather. I think what I'm mostly curious about is everything else that goes into giving us the game we experience that isn't part of the TF calculation, and how big a factor that is in comparison.

Also still trying to get a decent grasp on the whole variable clock thing. There were a couple of times where I thought I might have decent understanding behind the philosophy based on the way others explained it, but I'm never quite sure about my own comprehension on those matters.

Ah well... best to wait for real world comparisons to sort all of this out anyways :)
 

husomc

Member
I think it’s safe to say Kaz knows what a race car should sound like. He’s been making GT for 25 years and drives real race cars personally. He’s even raced in the 24 hour races at the Nurburgring. I trust he can get it right.




I can't see any raytraced reflections in this video. This shit must be fake. Where are the cars reflecting on the bodies of other cars ? :messenger_grinning_squinting: :messenger_grinning_squinting: :messenger_tears_of_joy:
 

FeiRR

Banned
Indeed, TF does seem to be an important metric from what I can gather. I think what I'm mostly curious about is everything else that goes into giving us the game we experience that isn't part of the TF calculation, and how big a factor that is in comparison.

Also still trying to get a decent grasp on the whole variable clock thing. There were a couple of times where I thought I might have decent understanding behind the philosophy based on the way others explained it, but I'm never quite sure about my own comprehension on those matters.

Ah well... best to wait for real world comparisons to sort all of this out anyways :)
Not really. TF is a theoretical capacity of silicon and that's why it has no real application. If you look closely at the video I attached, you see metrics such as time to frame with average/median calculations. Those are used a lot by devs because they are very practical.

So let's say the same game engine is running on both next-gen consoles, with exactly the same setup of assets, all effects on. If you look at TTF at that point, you'll have half the answer as to which console does it better. Why only half? Because in another setup/frame the variable might be in favour of the other console. That's why they measure averages.

And still, it's not all. You also have to take into consideration things like controller lag/response, OS overhead, TCP/IP stack, etc., which aren't much talked about. Much, much too many variables for simple console warriors ;)
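
For anyone curious what "time to frame with average/median calculations" looks like in practice, here's a minimal sketch with invented numbers:

```python
# Minimal sketch of frame-time metrics; the frame times are invented.
from statistics import mean, median

frame_times_ms = [16.2, 16.8, 15.9, 17.1, 33.4, 16.5, 16.0, 16.3]  # one hitch

print(f"average: {mean(frame_times_ms):.1f} ms")   # dragged up by the 33 ms spike
print(f"median:  {median(frame_times_ms):.1f} ms") # stays near the typical frame
# Reporting both is why profilers show more than a single headline number.
```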
 
Last edited:

vivftp

Member
Not really. TF is a theoretical capacity of silicon and that's why it has no real application. If you look closely at the video I attached, you see metrics such as time to frame with average/median calculations. Those are used a lot by devs because they are very practical.

So let's say the same game engine is running on both next-gen consoles, with exactly the same setup of assets, all effects on. If you look at TTF at that point, you'll have half the answer as to which console does it better. Why only half? Because in another setup/frame the variable might be in favour of the other console. That's why they measure averages.

And still, it's not all. You also have to take into consideration things like controller lag, which isn't much talked about. Much, much too many variables for simple console warriors ;)

Nifty, thanks for that response :)
 

ToadMan

Member
Teraflops still remain a huge mystery for me. Companies have weaponized them for an easy to use stat for the general gamer populace to use when comparing products. It seems to be a decent enough value to try and gauge relative performance, at least in some benchmarks I've seen (I haven't done in-depth research). Then again I've seen many, many developers and industry folk chime in to say teraflops are not a useful metric to use (even Cerny!), they're just a peak figure that's just one part of a much larger system. Then we have the folks who say teraflops between different architectures is a pointless comparison, which I'd say makes sense, and they'd say teraflops between the same architecture is fair game, which I'm still curious about.

I'm mostly thinking about the PS5 and its use of variable clocks. Now you could compare it against what the XSX is doing, or just against the way other consoles in the past have worked, since those are more traditional designs. I'm more wondering about the actual methodology and what sort of benefits each approach yields. For simplicity's sake I suppose the PS5 and XSX are the best ways to compare.

From what I've read it's pretty rare that either machine (or any GPU for that matter) will ever actually be doing work at its absolute peak TF figure; they'll generally spend most of their time somewhere below that (where, I have no clue). Hitting their peak TF figures is more like a blip here, a blip there rather than having the game completely max out the peak TF figure for minutes or hours on end. I've seen it argued that while both the XSX and the PS5 have their stated peak TF figures, the design of the variable clocks will let the PS5 punch closer to its peak on average more often than the XSX will be able to punch towards its peak TF figure. There was a post by an Era user, I believe it was Liabe Brave, that gets posted here every once in a while that seemed to describe things pretty well. I don't have it immediately handy, but it seems to indicate that pound for pound the PS5 should be able to punch above the weight class that's calculated on paper with just the TF figure.

Is this true? I have no frickin' idea. It's still something I wonder about when I see people discussing power. I'm by no means going to argue the PS5's GPU will outperform the XSX's GPU, I have no basis for such a claim and lack the ability to properly compare them anyways. What I am curious about is whether the TF figure really tells us definitively about the "power gap" between these two machines or if the PS5's ability to "punch above its weight class" narrows that gap at all.

I'm also curious about the points Cerny raised regarding higher clocks and fewer CU's and whether they will yield any tangible, measurable benefits. He mentioned with the higher clocks that rasterization and processing the command buffer are that much faster and the L2 and other caches have that much more bandwidth compared to going with lower clocks. He claimed the only downside was that system memory was further away in terms of cycles. What does this mean in terms of benefits to performance and what devs can do with this hardware? Again, no frickin' clue.

Dunno if that makes much sense but meh, it's been kicking around in my head and figured I'd post this stream of consciousness on the off chance someone smarter than me could tell me whether it's nonsense or not.

As always the best way to tell will be actual real world performance comparisons so I'm curious to see how things will play out once these machines are released and we have folks like DF or NX Gamer doing comparisons.

In the case of PS5 vs Xsex the difference in Tflops is so small it is basically irrelevant. The difference would be irrelevant even if we were comparing PC GPUs with these specs.

When it comes to consoles, the tflop delta could easily be swallowed up by other performance differences in other system components.

Ultimately there won’t be a difference in frames/res when it comes to multiplats on these systems so that tells us that Tflops are not as critical an indicator of system performance as some would try to claim.

Given that the TF difference isn't observable in practice, it is irrelevant, and that's why TFs aren't considered a critical predictor of performance.
 
Last edited:

vivftp

Member
In the case of PS5 vs Xsex the difference in Tflops is so small it is basically irrelevant. The difference would be irrelevant even if we were comparing PC GPUs with these specs.

When it comes to consoles, the tflop delta could easily be swallowed up by other performance differences in other system components.

Ultimately there won’t be a difference in frames/res when it comes to multiplats on these systems so that tells us that Tflops are not as critical an indicator of system performance as some would try to claim.

Given that the TF difference isn't observable in practice, it is irrelevant, and that's why TFs aren't considered a critical predictor of performance.

Agreed, from what I've gathered the ~18% TF difference is minor when you compare it against the TF differences of generations past. Even if we do come across a situation where the XSX runs a game at a slightly higher resolution than the PS5, not many people are going to be able to tell the difference without some form of detailed side-by-side comparison.

Still, while we wait for more news and for the console launches I suppose it is something to chat about :)
 

bitbydeath

Member
Drunk post which may get deleted so putting in quotes for posterity. LOL.

Edit: And it’s gone!
Hi.

I'm an Artist who is working on the lastest Call of Duty.

Why am I leaking? Activison are assholes who leave friends without a job and homeless while the boss gets a couple extra million to his bank account. Its dehumanizing and im sick of it.

Sadly I can't go too into the game. The whole project has been quite tight lipped and very messy due to the shorter time the team have had on it. My group wasn't even aware of the full name until a week before the Doritio leak.

Sony has approched us to make a new Theme for the Ps5 as an order bouns for the game so I will talk about the UI and what to expect.

General UI:

  • It is very similar to the playstation 4 UI in a lot of ways
  • Biggest difference is the select a game/app panel. It is now on the bottom.
  • They "peek" out from the bottom, half cut off until you hover over it.
  • When hovered over, the central part of the screen turns into a full image (chosen by the devs) of the game. If clicked, you'll zoom into the image like your jumping into a portal to your game. It gives me PS2 start up screen vibes.
  • Hovering over a game will also reveal friends playing and any special events at the moment.
  • Game panels can now be animated and updated much more frequently. I have already seen panels made for 2xp weekends.
  • The News tab is now hidden from the home screen, you have to press down on a panel to see them.
  • Top row is almost the same as the Ps4
  • There is: PSplus, PSNow, "Alerts" (replaces notifcations I believe) Friends, Party, Forum, Profile, ???, ???, settings and Power Off
  • The question marks are things Sony told us to leave blank at the moment but we will need to make art for later
Theme:

  • Sony is offering "4D" Themes
  • These themes change the Music, UI art, Background image, we can play sound through your controller, make it rumble at times and change the color of the lights
  • Background art of your theme will be used when there isn't another game or app selected
  • They have mentioned our theme overlapping UI elements but our team decided not to do that
General:

  • We've been told to get some promotional material ready and finalised this week. If it's similar to other projects "get ready" to "release" time, it'll start being used around the 28th
  • For this Call of Duty, Activison has worked VERY closely with Playstation compared to other projects. Sony has had a bit more influence.
  • I've seen up to 5 designs of the UI before the UI I mentioned above was "finalized". More like Sony gave us multiple so we didn't know which was real.
I hope this helps to fuck over Activison in some way. We just wanna make a game that's fun and not filled with all the Black Ops branding. It is quite late and im just a drunk tired dude who is tired of coorparate bullshit. I'll try to reply to comments but if I don't its most likely because I forgot the password to the throwaway.

 
Last edited:

HAL-01

Member
That's the thing that I'm still puzzled about. Cerny's not the only developer or industry person who's dismissed teraflops as not being a relevant metric, not by a long shot. I am curious what metrics they would consider to be useful, but off the top of my head I don't recall anyone elaborating on that.
Teraflops are not a good indicator of performance because they're only a theoretical max. It is the kind of performance you would get with unlimited power availability, perfect cooling, and 100% utilization of every CU every frame. But none of these ideal parameters are achievable in the real world, and so no machine ever reaches this number during use. The "better" machine is one that's able to get closer to its theoretical peak over a sustained period without failing. Cooling and power efficiency play a great part in this, as well as how easy it is for a programmer to keep the compute units busy (hence cerny's 36 CUs vs 52 CUs comment).

Only way to know how the PS5 and XsX stack up to each other will be by comparing multiplatform performance.
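
If it helps, the "closer to its theoretical peak" idea can be sketched like this; the utilization percentages are completely hypothetical, nobody outside the platform holders has real numbers:

```python
# Hypothetical utilization figures only, to show how the comparison works.
XSX_PEAK, PS5_PEAK = 12.15, 10.28    # paper TF

def effective_tf(peak: float, avg_utilization: float) -> float:
    return peak * avg_utilization    # average work extracted, not a benchmark

# Scenario A: both average the same utilization -> the paper gap holds.
print(f"A: XSX {effective_tf(XSX_PEAK, 0.70):.1f} vs PS5 {effective_tf(PS5_PEAK, 0.70):.1f} TF")
# Scenario B: PS5 sustains a bit more of its peak -> the effective gap narrows.
print(f"B: XSX {effective_tf(XSX_PEAK, 0.70):.1f} vs PS5 {effective_tf(PS5_PEAK, 0.78):.1f} TF")
```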
 

vivftp

Member
Teraflops are not a good indicator of performance because they're only a theoretical max. It is the kind of performance you would get with unlimited power availability, perfect cooling, and 100% utilization of every CU every frame. But none of these ideal parameters are achievable in the real world, and so no machine ever reaches this number during use. The "better" machine is one that's able to get closer to its theoretical peak over a sustained period without failing. Cooling and power efficiency play a great part in this, as well as how easy it is for a programmer to keep the compute units busy (hence cerny's 36 CUs vs 52 CUs comment).

Only way to know how the PS5 and XsX stack up to each other will be by comparing multiplatform performance.

Yup, that goes back to my earlier question about whether the PS5 will be able to hit closer to its max TF figure on average more often than the XSX will be able to hit its max TF figure, due to the variable clocks. If that is the case then that brings these two machines even closer to parity than the already close TF counts indicate.

Wait and see is all we can do :)
 

HAL-01

Member
Yup, that goes back to my earlier question about whether the PS5 will be able to hit closer to its max TF figure on average more often than the XSX will be able to hit its max TF figure, due to the variable clocks. If that is the case then that brings these two machines even closer to parity than the already close TF counts indicate.

Wait and see is all we can do :)
The PS5's robust power management system, along with the liquid metal cooling patent and the console's large volume (which was said to be explicitly for cooling purposes), all point to what you're suggesting, and that's probably what devs mean about it "punching above its weight". The machines will be close, likely more so than the 20% figure going around.
 

Vae_Victis

Banned
Well, GB aren't free, though
The size difference is not huge (less than 20%), and Sony did a lot of customization on their SSD. Whatever could potentially get saved with having 150-ish GB less can easily be made up for (and more) with something else Sony put in there, like the fact they split the memory over more modules or some custom piece they had manufactured specifically to manage the abnormal speed.

The PS5's SSD is a huge question mark in the cost factor; it could be relatively cheap, or it could be what ends up driving the price up by an additional $50 over a more standard solution.
 

FeiRR

Banned
The PS5's SSD is a huge question mark in the cost factor; it could be relatively cheap, or it could be what ends up driving the price up by an additional $50 over a more standard solution.
They've been ordering similar chips for their prosumer camera market for some time. Shouldn't be a major problem. Doesn't mean a stock 1 TB SSD like the one in XSX is more expensive.
 
Yup, that goes back to my earlier question about whether the PS5 will be able to hit closer to its max TF figure on average more often than the XSX will be able to hit its max TF figure, due to the variable clocks. If that is the case then that brings these two machines even closer to parity than the already close TF counts indicate.

Wait and see is all we can do :)

That's the whole point of SmartShift and running the clocks this high. You are assuring a more constant, high utilization of your resources over an extended period of time. Basically this machine should work very differently from how things were done in the past, and that should mostly show in exclusives and next-gen-only games.
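
A purely conceptual toy of that "shift the budget to whoever needs it" idea (this is not how SmartShift or the PS5 firmware actually decides anything; the numbers and the rule are invented):

```python
# Conceptual toy only: split a fixed SoC power budget in proportion to demand.
TOTAL_POWER_W = 200                  # invented budget, not a real spec

def split_budget(cpu_demand: float, gpu_demand: float) -> tuple[float, float]:
    total = cpu_demand + gpu_demand
    cpu_w = TOTAL_POWER_W * cpu_demand / total
    return cpu_w, TOTAL_POWER_W - cpu_w

# GPU-heavy scene: unused CPU headroom flows to the GPU, letting it hold
# a higher clock inside the same total budget.
print(split_budget(cpu_demand=0.3, gpu_demand=1.0))   # ~(46 W, 154 W)
print(split_budget(cpu_demand=0.8, gpu_demand=1.0))   # ~(89 W, 111 W)
```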

The fact we're getting this close to launch and there's so much to discover is frankly quite exciting :D
 