
THE COALITION MOVING TO NEXT-GEN DEVELOPMENT, UNREAL ENGINE 5

DonJuanSchlong

Spice Spice Baby
Jul 15, 2020
3,415
8,655
745
Source?

Id Software engine programmers disagree and say otherwise.

Xbox has the weakest link going into this generation, yet again.
Source? History.


Development usually has PC as the lead platform, but other times, they use the next strongest performer. The 360 was generally the lead platform over the PS3. The PS4 was generally the lead platform over the XB1. It just makes sense to go with the stronger platform and downgrade settings until the weaker platform reaches performance parity. It wouldn't make sense to develop on PS5, only to bump up the resolution or fidelity for XSX.
 
Last edited:
  • LOL
Reactions: Darius87

MonarchJT

Member
Sep 25, 2020
2,258
3,299
375
Source?

Id Software engine programmers disagree and say otherwise.

Xbox has the weakest link going into this generation for multiplatform game development, yet again with the weakest GPU, tiny RAM, and everything.
It will not stop the XSX from performing better over the long run.
 
  • Like
Reactions: VFXVeteran

Shmunter

Member
Aug 25, 2018
9,878
22,727
775
Source? History.


Development usually has PC as the lead platform, but other times, they use the next strongest performer. The 360 was generally the lead platform over the PS3. The PS4 was generally the lead platform over the XB1. It just makes sense to go with the stronger platform and downgrade settings until the weaker platform reaches performance parity. It wouldn't make sense to develop on PS5, only to bump up the resolution or fidelity for XSX.
Actually, consoles define the tech baseline; it's why we saw practically no physics simulation this whole gen, thanks to shit console CPUs.

Second, the biggest install base within the class defines the target platform, as that version becomes the priority product.

Granted, this gen, with the PS5 memory subsystem so far ahead, the situation becomes more nuanced, as replicating designs around that level of streaming is not easily solved elsewhere.
 
Last edited:

Mister Wolf

Member
Sep 21, 2014
7,257
9,629
810
Actually, consoles define the tech baseline; it's why we saw practically no physics simulation this whole gen, thanks to shit console CPUs.

Second, the biggest install base within the class defines the target platform, as that version becomes the priority product.

Granted, this gen, with the PS5 memory subsystem so far ahead, the situation becomes more nuanced, as replicating designs around that level of streaming is not easily solved elsewhere.

It just won't be used by anyone except Sony.
 

Md Ray

Member
Nov 12, 2016
3,134
10,206
735
India
Development usually has PC as the lead platform, but other times, they use the next strongest performer. The 360 was generally the lead platform over the PS3. The PS4 was generally the lead platform over the XB1.
These are good.

But I asked you to provide source specifically for this statement 👇
XSX will always be the lead platform for development over the PS5.
If you can't, then this is just your assumption based on history, and there's no truth to it.

But in reality, according to the GDC 2021 survey, the PS5 is the most preferred console among game devs to develop for, next to PC. And we've seen many times that the PS5 can outperform or be on par with the XSX because games aren't always compute-intensive. They will flip-flop depending on the game engines/scenes. The PS5 GPU is more powerful than the XSX GPU in some aspects, just like the XSX is in computational power.
 
Last edited:
  • LOL
Reactions: Riky

Mister Wolf

Member
Sep 21, 2014
7,257
9,629
810
I fear you may be right. Would be poor PR to roll out games with a sizeable chasm, not to mention more work

Pretty much. At the end of the day, this is a business, so if Sony aren't willing to pay for exclusive rights, they won't truly optimize for it. Loading the game a couple of seconds faster is nothing to write home about. PC gaming is more popular now than ever. A lot of money to be made there. Even Sega realizes this now. At most, we will see multiplatform games designed around SATA SSD speeds to accommodate lower-tier PC gamers, which is still a huge leap over HDD.
 
Last edited:
  • Like
Reactions: Shmunter

DonJuanSchlong

Spice Spice Baby
Jul 15, 2020
3,415
8,655
745
These are good.

But I asked you to provide source specifically for this statement 👇

If you can't, then this is just your assumption based on history, and there's no truth to it.

But in reality, according to the GDC 2021 survey, the PS5 is the most preferred console among game devs to develop for, next to PC. And we've seen many times that the PS5 can outperform or be on par with the XSX because games aren't always compute-intensive. They will flip-flop. The PS5 GPU is more powerful than the XSX GPU in some aspects, just like the XSX is in computational power.
Think about it like this: would you develop for the platform with the faster SSD, or the better GPU? If you rely on the fastest SSD, how can you easily port the game to a device that can't take advantage of that speed? Or would you just develop the game on the stronger GPU, and then move the sliders down for graphics quality and resolution? It's a no-brainer. I don't have any proof that this is definite, but logically it makes sense.


If you have any proof of ps5 being lead platform, I'd gladly take a look into it though!
 
Last edited:
  • LOL
Reactions: Darius87

MonarchJT

Member
Sep 25, 2020
2,258
3,299
375
The thing is, and some people have to open their eyes about it: by releasing day one on PC, Xbox has already won the userbase war, and until Sony decides to do the same, there is very little they can do about it. No matter how well the PS5 is selling, it will never surpass Xbox + PC + xCloud.
 
Last edited:

Md Ray

Member
Nov 12, 2016
3,134
10,206
735
India
Think about it like this: would you develop for the platform with the faster SSD, or the better GPU? If you rely on the fastest SSD, how can you easily port the game to a device that can't take advantage of that speed? Or would you just develop the game on the stronger GPU, and then move the sliders down for graphics quality and resolution? It's a no-brainer. I don't have any proof that this is definite, but logically it makes sense.


If you have any proof of ps5 being lead platform, I'd gladly take a look into it though!
As I said, the PS5 GPU can be better too, depending on the game engine/scenes. Its GPU is powerful in some ways, and the XSX isn't always faster. So I look at them as mostly equal, all things considered. But I hear the PS5 is the easiest console to develop for, so I'd go for that. Based on GDC 2021, it looks like the PS5 is probably the lead dev platform, and it also obviously has a larger install base than the XSX.
 
  • Like
Reactions: Mr. PlayStation

MonarchJT

Member
Sep 25, 2020
2,258
3,299
375
As I said, the PS5 GPU can be better too, depending on the game engine/scenes. Its GPU is powerful in some ways, and the XSX isn't always faster. So I look at them as mostly equal, all things considered. But I hear the PS5 is the easiest console to develop for, so I'd go for that. Based on GDC 2021, it looks like the PS5 is probably the lead dev platform, and it also obviously has a larger install base than the XSX.
When they move to using mesh shaders and all the other new features, I very much doubt that the PS5 will perform better in any game... GDC has always seen PlayStation as the console of choice, even when the One X was beating the PS4 Pro hard in practically every comparison.
 
Last edited:
  • Thoughtful
Reactions: DonJuanSchlong

Md Ray

Member
Nov 12, 2016
3,134
10,206
735
India
When they move to using mesh shaders and all the other new features, I very much doubt that the PS5 will perform better in any game
Do we have any definitive proof from an actual dev that this will be the case? Or is this another case of "lol it’s math bro, minimum of 50fps on GPU Alone" which fell flat as soon as these comparisons came out?
GDC have seen always the PlayStation as a console of choice
Because PlayStation was still the lead development platform even if One X had better specs.
 
Last edited:

MonarchJT

Member
Sep 25, 2020
2,258
3,299
375
Do we have any definitive proof from an actual dev that this will be the case? Or is this another case of "lol it's math, minimum of 40 to 50 frames per second on GPU Alone" which fell flat as soon as these comparisons came out?
We're starting to see a pattern where most multiplat games perform a little bit better on SX, and still most of its hw peculiarities aren't even used. When the consoles are exploited properly, reaching (or at least getting closer to) their maximum performance ceiling, as happens every generation, it is reasonable to think that the gap will widen. No one has a crystal ball to predict the future... as I said, it is reasonable to think that it will happen, given the specs.


Because PlayStation was still the lead development platform even if One X had better specs.
Exactly, it could simply be that the PS5 devkit tools are simpler or more complete, and devs prefer that. It means little... especially a few months after launch... during COVID.
 
Last edited:

Md Ray

Member
Nov 12, 2016
3,134
10,206
735
India
When the consoles are exploited properly, reaching (or at least getting closer to) their maximum performance ceiling, as happens every generation, it is reasonable to think that the gap will widen
Even Sony's devs are hit with COVID and are facing the same challenges, so it is reasonable to think that the PS5 development environment too will get better and mature with time; it won't remain stagnant while only the Xbox's is exploited. As Insomniac said, they're only scratching the surface.
 

MonarchJT

Member
Sep 25, 2020
2,258
3,299
375
Even Sony's devs are hit with COVID and are facing the same challenges, so it is reasonable to think that the PS5 development environment too will get better and mature with time; it won't remain stagnant while only the Xbox's is exploited. As Insomniac said, they're only scratching the surface.
Yes, it will... but both consoles have a ceiling on their performance, and one is higher than the other. It would seem that PS, with its devkits and tools (probably more mature), was easier for devs to use initially.
 
Last edited:
  • Thoughtful
Reactions: DonJuanSchlong

Md Ray

Member
Nov 12, 2016
3,134
10,206
735
India
One is higher than the other.
Higher? Both consoles' GPUs have their strengths and weaknesses.

The XSX GPU isn't definitively the stronger one. Looking at TFLOPs alone is misleading. Anyway, I'm done with this convo.
 
Last edited:
  • LOL
Reactions: Riky

Tschumi

Gold Member
Jul 4, 2020
3,990
4,547
715
>company who makes xbox games says they're using unreal engine 5<
nekminit
"the best performing UE5 game confirmed xbox exclusive"

simmer down homies
 

DonJuanSchlong

Spice Spice Baby
Jul 15, 2020
3,415
8,655
745
Higher? Both consoles' GPUs have their strengths and weaknesses.

The XSX GPU isn't definitively the stronger one. Looking at TFLOPs alone is misleading. Anyway, I'm done with this convo.
Clock speed only goes so far. RDNA 2 benchmarks clearly show this. Hitting a remarkable 3 GHz doesn't linearly give better performance. You're just creating more heat at that point, and things will begin to throttle. Having more CUs is the way to go.

GPU manufacturers aren't making five SKUs with the same number of CUs, just operating at different clock speeds. They have more CUs the higher up the GPU tier you go. This has always been the case for AMD, Nvidia, Snapdragon, Apple chips, etc. You can't expect higher clock speeds on fewer CUs to perform the same as a marginally lower clock speed with several more CUs. Not sure if you'll reply to this, but look it up if you don't believe me. Look at the 3070 or 3080 vs the 3090. The 3090 has lower clocks but better performance.
 
  • Like
  • LOL
Reactions: Riky and Darius87

Md Ray

Member
Nov 12, 2016
3,134
10,206
735
India
Clock speed only goes so far. RDNA 2 benchmarks clearly show this. Hitting a remarkable 3 GHz doesn't linearly give better performance. You're just creating more heat at that point, and things will begin to throttle.
They only go so far because the clock speeds in RDNA 2 GPUs, for instance the 6800 and 6800 XT, are already well over 2250 MHz; they hit diminishing returns when pushed past their factory clock speeds.

In the console space, the XSX operates at just over 1800 MHz, which is not "marginally" less; that's far lower by RDNA 2's standards and almost 500 MHz slower than some of the desktop parts. Anyway, the PS5 GPU operates at 2230 MHz. This is a massive 400+ MHz (22%) difference, and we aren't seeing this kind of clock speed difference between RDNA 2 tiers on PC (only around 80 MHz or so between the 6800 and 6800 XT at stock, for example).

The best way I can explain it: the situation between the PS5/XSX GPUs is akin to someone manually downclocking a higher-CU RDNA 2 GPU (e.g. 6800 XT = XSX) while keeping the lower-CU RDNA 2 GPU (6800 non-XT = PS5) at stock clock speeds. When you do something like this, some games will perform better on the lower-CU GPU, some will be on par/close/similar (as it has been with PS5/XSX in many games now), and some that really favor CUs will perform better on the higher-CU part. That's the kind of situation the PS5 and XSX are in.

Saying that no game favors higher clock speed would also be false, because we've seen the PS5 outperform the XSX, or have similar perf, in many games. RE8 is the most recent example of this. As I've said in my previous posts, perf will flip-flop depending on scenes.

Having more CU's is the way to go.

If higher CUs were the only way to go and we couldn't expect a higher clock speed on fewer CUs to deliver similar perf, then this shouldn't be happening (same graphics settings/4K CB/no dynamic res):
When benchmarked, the avg. fps difference between PS5 and XSX is in the 0-2% range, did you know?

Notice the similarity in the graph lines. This isn't some fluke. It just points to how smartly the PS5 is designed: it's achieving similar perf to the 12 TF console with less silicon real estate.

Gpu manufacturers aren't making 5 sku's with the same amount of CU's, just operating at different clockspeeds. They have more CU's the higher up in tier of the GPU. This has always been the case for AMD, Nvidia, Snapdragon, Apple chips, etc. You can't expect higher clockspeeds on less CU's to perform the same as a marginally less clockspeed with several more CU's. Not sure if you'll reply to this,
This isn't really a good comparison because GPU manufacturers carefully tweak and tune their GPU lineup so that lower-tier GPU doesn't outperform higher-tier GPU even when OC'd. You can't take this logic and apply it to MS and Sony's console GPUs.
but look it up if you don't believe me. Look at the 3070 or 3080 vs the 3090. 3090 has lower clocks but has better performance.
I know what you mean and I've done extensive research on these architectures (Turing/Ampere/RDNA 2...), their lineup and down to each subsequent GPU's configs: SM/CU/ROPs/TMUs, primitive units, etc. and have come to know what kind of rasterization, fillrate throughput each of these and console GPUs have at their clock speeds.

Let's take your 3090 vs 3080 example.

According to Gamers Nexus:
RTX 3090 plots: 1890-1905 MHz
RTX 3080 plots: 1920-1935 MHz

The 3080 indeed has a higher clock speed. But did you care to look deeper? I'll take the 3090's min clock speed and the 3080's max clock speed for the comparison below:

3080 = 1935 MHz +2%
Pixel fillrate: 186 Gpix/sec -12%
FP32 TFLOPS: 34 TFLOPS -15%
Texture fillrate: 526 Gtex/sec -15%

3090 = 1890 MHz -2%
Pixel fillrate: 212 Gpix/sec +14%
FP32 TFLOPS: 40 TFLOPS +18%
Texture fillrate: 620 Gtex/sec +18%

You can see every aspect of the 3090 GPU is faster than the 3080, despite the 3080 having a slightly higher clock speed. This is what I was talking about when I said GPU manufacturers carefully tweak and tune their lower- and higher-tier GPUs so they perform as intended. Those 14-18% gains across the board clearly translate to better gaming perf for the 3090, because that is intentional.

Now look at PS5 vs XSX GPU:

PS5 =
2230 MHz +22%
Pixel fillrate: 143 Gpix/sec +22%
FP32 TFLOPS: 10.3 TF -15%
Texture fillrate: 321 Gtex/sec -15%
Rasterization rate: 9 Gtri/sec +22%

XSX = 1825 MHz -18%
Pixel fillrate: 116.8 Gpix/sec -18%
FP32 TFLOPS: 12.15 TF +18%
Texture fillrate: 380 Gtex/sec +18%
Rasterization rate: 7.3 Gtri/sec -18%

Here, unlike with the 3090, not every aspect of the XSX's GPU is faster, as you can see. That 22% increase in clock speed over the XSX is so substantial that some parts of the PS5's GPU get that much higher throughput than the XSX's, leading to similar perf (proof above) or, in some cases, better perf for the PS5. You're just looking at CUs and TF and claiming that's all that matters when it's not. If it were, the XSX would come out on top in each and every comparison consistently, not have similar perf. Let's appreciate both designs for a second and enjoy the games.
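The per-GPU figures above can be rederived from clock speed and unit counts alone. A minimal sketch, assuming standard RDNA 2 per-clock rates (2 FP32 ops per ALU, 64 ALUs and 4 TMUs per CU, 64 ROPs, and 4 triangles per clock; these rates are my assumptions, not stated in the post):

```python
# Sketch: rederive the quoted PS5/XSX throughput numbers from clocks and
# unit counts. Per-clock rates are assumed RDNA 2 figures: 2 FP32 ops/ALU,
# 64 ALUs per CU, 4 TMUs per CU, 64 ROPs, 4 triangles per clock.

def gpu_throughput(cus: int, ghz: float, rops: int = 64, tris_per_clk: int = 4):
    return {
        "tflops": cus * 64 * 2 * ghz / 1000,  # FP32 TFLOPS
        "gpix_s": rops * ghz,                 # pixel fillrate (Gpix/s)
        "gtex_s": cus * 4 * ghz,              # texture fillrate (Gtex/s)
        "gtri_s": tris_per_clk * ghz,         # rasterization rate (Gtri/s)
    }

ps5 = gpu_throughput(cus=36, ghz=2.23)    # 36 CUs @ 2230 MHz
xsx = gpu_throughput(cus=52, ghz=1.825)   # 52 CUs @ 1825 MHz

print(f"PS5: {ps5['tflops']:.2f} TF, {ps5['gpix_s']:.1f} Gpix/s, "
      f"{ps5['gtex_s']:.0f} Gtex/s, {ps5['gtri_s']:.1f} Gtri/s")
print(f"XSX: {xsx['tflops']:.2f} TF, {xsx['gpix_s']:.1f} Gpix/s, "
      f"{xsx['gtex_s']:.0f} Gtex/s, {xsx['gtri_s']:.1f} Gtri/s")

# Clock-bound metrics (pixels, triangles) track the ~22% clock advantage;
# ALU/TMU-bound metrics (TFLOPS, texels) track the CU advantage (~18% net).
print(f"clock delta: +{(2.23 / 1.825 - 1) * 100:.0f}%")
print(f"compute delta: +{(xsx['tflops'] / ps5['tflops'] - 1) * 100:.0f}%")
```

The split falls out directly: anything scaling per clock favors the PS5 by ~22%, anything scaling per CU favors the XSX by ~18%, which is exactly the pattern in the lists above.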
 
Last edited:

DonJuanSchlong

Spice Spice Baby
Jul 15, 2020
3,415
8,655
745
They only go so far because the clock speeds in RDNA 2 GPUs for instance in the 6800 and 6800 XT are already well over in the 2250+ MHz region [...] Let's appreciate both the designs for a second and enjoy the games.
Nice write-up, not gonna lie, that's pretty thorough. It all looks good, but at the end of the day, even with devkits arriving late, the XSX still has better performance. The DJ Khaled memes stopped in December from a certain group of people, IIRC. You didn't show the bandwidth, which is what really matters at the end of the day. Ray tracing and running at a higher resolution require just that.
 
  • Like
  • LOL
Reactions: Darius87 and Riky

hoplie

Neo Member
Dec 8, 2020
48
106
200
448 GB/s ÷ 10.28 TF ≈ 43.6
560 GB/s ÷ 12.15 TF ≈ 46.1

The bandwidth advantage for the Xbox Series X isn't that big. But yeah, CU count should be important for ray tracing, so in that case there is a big advantage.
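That division is bandwidth per teraflop (GB/s of memory bandwidth per TF of compute), a rough measure of how well-fed each GPU is. A quick sketch, using the consoles' published bandwidth figures and the 12.15 TF number quoted earlier in the thread (for the XSX, this takes the fast 10 GB partition's 560 GB/s, ignoring the slower 6 GB pool):

```python
# Bandwidth per TFLOP: memory bandwidth divided by FP32 compute, as a
# rough balance metric between the two consoles.
def bw_per_tf(bandwidth_gbs: float, tflops: float) -> float:
    return bandwidth_gbs / tflops

ps5 = bw_per_tf(448, 10.28)   # PS5: single unified 448 GB/s pool
xsx = bw_per_tf(560, 12.15)   # XSX: 560 GB/s fast 10 GB partition only

print(f"PS5: {ps5:.1f} GB/s per TF")   # ~43.6
print(f"XSX: {xsx:.1f} GB/s per TF")   # ~46.1
```

On this metric the two designs land within a few percent of each other, which is the point being made: the raw bandwidth gap roughly tracks the compute gap.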
 
Last edited: