
Tom Warren - Sony’s PS5 Pro is real and Sony is asking developers to get games ready over the summer, with a push for ray-tracing support

Gaelyon

Gold Member
I think there's "next gen" graphics and next gen gameplay. NG graphics can usually be scaled down to uglier/less defined and/or a lower framerate. Framerate can degrade gameplay, but otherwise the game is the same, just uglier.
NG gameplay is much rarer, as in games whose gameplay could not be replicated, or only in a heavily downgraded form, on the previous gen. Long ago the 2D->3D transition was an obvious one, but today? Size of the map, number of enemies, complex AI. Those types of games are nearly nonexistent because consoles aren't using cutting-edge CPUs and publishers still need to sell their games to all kinds of hardware.
 

ChiefDada

Gold Member
Am I reading too much into this, or does it seem like a given that Sony's cross-gen titles, specifically God of War Ragnarok, Horizon FW/BS, and TLOU, will include significant ray tracing enhancements? I can't imagine Sony pushing 3rd party devs for the new PS5 Pro graphics mode and not effectively mandating it for 1st party. Ragnarok or Horizon with RT is an easy sell. Imagine Ragnarok with RTGI, or Horizon with RT shadows. It could easily catapult these games to GOAT visuals status.
 

SlimySnake

Flashless at the Golden Globes
Never mind that Rift Apart looks like a PS3 rendition of the game on the Deck.
Nah, Ratchet is one next-gen game that runs fine on the Steam Deck, even at medium settings. And it looks very comparable to the PS5's performance mode.

It's the other next-gen games like Alan Wake, Avatar, and the latest UE5 titles that look like a pixelated mess, even worse than some PS3 games. That's if they can even hit 30 fps, which they can't.

[embedded video comparisons]

Insomniac simply didn't push the PS5 GPU hard enough for this game. The SSD is what sets this game apart from PS4-era games, and the Steam Deck has an SSD. But other games with fancier graphics and a 1440p target on the PS5 just completely fall apart on the Steam Deck: they have to use the lowest presets and push resolutions down to 240p to get a stable framerate, and in some games like Avatar they don't even get there.

Again, Rift Apart runs at native 4K 50 fps on the base PS5. It is not pushing the PS5 at all. They could've added RTGI, RT shadows, way better volumetrics, character models, and other lighting effects to make it look like a CG movie, but they chose instead to simply push for native 4K and then had over 50% more GPU available to get to 45-50 fps on a consistent basis. It is the poster child for a game that is not tapping out the PS5.

They did the same with Spider-Man 2. Downgraded the crap out of the game from its initial trailer to hit native 4K. But for Wolverine they seem to be going all out and will be adding real-time GI, destruction, MetaHuman-quality facial animations, cloth and muscle simulations, motion matching, weather and fluid simulations, etc. Basically everything we saw in the 1943 demo is listed here. That is the Insomniac game that will tap out the PS5. Not Ratchet.

[image: leaked Wolverine feature list]
 
Last edited:

64bitmodels

Reverse groomer.
This was 25 years ago. The SNES didn't even have a Z-buffer, so of course, it couldn't run 3D games.

Now, though? Besides ray tracing, which can be turned off, there is almost nothing that the PS5 can do that a Steam Deck cannot. The PS5 simply does everything much better. Hardware improvements have been iterative since the dawn of the PS4. You don't have complete changes like PS2 to PS3 or SNES to N64. It was the same way on PC back then, when a GPU without pixel shader 4.0 support couldn't run games that required pixel shader 4.0 and above. This doesn't happen anymore, but that doesn't mean Cyberpunk 2077 isn't taking full advantage of a modern PC just because it can also run on a GTX 960.

You guys keep using examples from 18-25 years ago as if things didn't change.
Ok fair. Rift Apart was a good port anyways.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
Nah, Ratchet is one next-gen game that runs fine on the Steam Deck, even at medium settings. And it looks very comparable to the PS5's performance mode.
The game needs to run at 900p, medium/low settings, and use FSR2 to hit 30fps. How is that comparable to the Performance Mode on PS5?
It's the other next-gen games like Alan Wake, Avatar, and the latest UE5 titles that look like a pixelated mess, even worse than some PS3 games. That's if they can even hit 30 fps, which they can't.
And Alan Wake 2 until last month could barely run on Pascal cards. Now, it runs well on GTX 1070s and above.
Insomniac simply didn't push the PS5 GPU hard enough for this game. The SSD is what sets this game apart from PS4-era games, and the Steam Deck has an SSD. But other games with fancier graphics and a 1440p target on the PS5 just completely fall apart on the Steam Deck: they have to use the lowest presets and push resolutions down to 240p to get a stable framerate, and in some games like Avatar they don't even get there.

Again, Rift Apart runs at native 4K 50 fps on the base PS5. It is not pushing the PS5 at all. They could've added RTGI, RT shadows, way better volumetrics, character models, and other lighting effects to make it look like a CG movie, but they chose instead to simply push for native 4K and then had over 50% more GPU available to get to 45-50 fps on a consistent basis. It is the poster child for a game that is not tapping out the PS5.
Rift Apart runs at 40-45fps with dips to the mid-30s during heavy action sequences on the PS5, not 50fps. Additionally, they chose to allocate those resources to resolution rather than RT and fancier effects. It's a zero-sum game. A game that runs at 4K30 is just as demanding as a game that runs at 1440p75. The GPU has a finite budget and the developer decides how to use those resources. That they didn't decide to use them for RT and higher fps on Rift Apart doesn't mean that the GPU isn't being utilized.
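For a rough sense of that zero-sum point, here is a back-of-the-envelope comparison assuming GPU cost scales roughly with pixels shaded per second (a simplification; real frame cost doesn't scale perfectly linearly with resolution):

```python
# Pixels shaded per second for the two example targets (illustrative arithmetic only)
modes = {
    "4K @ 30 fps": 3840 * 2160 * 30,
    "1440p @ 75 fps": 2560 * 1440 * 75,
}
for name, px_per_s in modes.items():
    print(f"{name}: ~{px_per_s / 1e6:.0f}M pixels/s")
# 4K @ 30 fps:    ~249M pixels/s
# 1440p @ 75 fps: ~276M pixels/s  -> roughly the same ballpark of GPU work
```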
They did the same with Spider-Man 2. Downgraded the crap out of the game from its initial trailer to hit native 4K. But for Wolverine they seem to be going all out and will be adding real-time GI, destruction, MetaHuman-quality facial animations, cloth and muscle simulations, motion matching, weather and fluid simulations, etc. Basically everything we saw in the 1943 demo is listed here. That is the Insomniac game that will tap out the PS5. Not Ratchet.
And we will see what the performance in Wolverine is. If it runs at 4K30 instead of 4K40-45, then it simply means that the rendering budget was used for better graphical effects rather than a higher resolution.

My point is that Rift Apart and Spider-Man 2 do take full advantage of the PS5's hardware. Insomniac didn't leave a bunch of power untapped, with resources idling in the background while the game is firing on all cylinders. It's not Dragon's Dogma 2, which both runs and looks like shit. The game runs great and looks great. You could have a game that looks better but runs worse. Doesn't change the fact that both games use the resources available. They just choose to use them differently.

And you were saying that Rift Apart is a shit port while I was arguing otherwise.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
The game needs to run at 900p, medium/low settings, and use FSR2 to hit 30fps. How is that comparable to the Performance Mode on PS5?

And Alan Wake 2 until last month could barely run on Pascal cards. Now, it runs well on GTX 1070s and above.

Rift Apart runs at 40-45fps with dips to the mid-30s during heavy action sequences on the PS5, not 50fps. Additionally, they chose to allocate those resources to resolution rather than RT and fancier effects. It's a zero-sum game. A game that runs at 4K30 is just as demanding as a game that runs at 1440p75. The GPU has a finite budget and the developer decides how to use those resources. That they didn't decide to use them for RT and higher fps on Rift Apart doesn't mean that the GPU isn't being utilized.

And we will see what the performance in Wolverine is. If it runs at 4K30 instead of 4K40-45, then it simply means that the rendering budget was used for better graphical effects rather than a higher resolution.

My point is that Rift Apart and Spider-Man 2 do take full advantage of the PS5's hardware. Insomniac didn't leave a bunch of power untapped, with resources idling in the background while the game is firing on all cylinders. It's not Dragon's Dogma 2, which both runs and looks like shit. The game runs great and looks great. You could have a game that looks better but runs worse. Doesn't change the fact that both games use the resources available. They just choose to use them differently.

And you were saying that Rift Apart is a shit port while I was arguing otherwise.
It passes the eye test. You can see just how good Ratchet looks in comparison to the other games I posted. I said it looks comparable, not that it is comparable. Again, just look at the videos I posted and the visual differences should be apparent immediately.

You have cross-gen games that run at native 4K 30 fps on console. Ratchet runs comfortably above that. If you seriously believe that it is tapping out the PS5 just because it's running at higher resolutions and framerates, then you must also believe that GoW Ragnarok is tapping out the PS5.

I've seen Uncharted 4 run at native 4K 40-50 fps. TLOU2 as well. Same exact performance profile as Ratchet. I guess they have also tapped out the PS5.

Lastly, it doesn't matter if the GPU is hitting 100% or sitting idle. We are not talking about the GPU not being utilized. We are talking about devs utilizing the power available to them to push more than just resolutions and framerates. That's exactly what the leaked Wolverine slides are showing. New features. Not resolutions.
 

Gaiff

SBI’s Resident Gaslighter
Lastly, it doesn't matter if the GPU is hitting 100% or sitting idle. We are not talking about the GPU not being utilized. We are talking about devs utilizing the power available to them to push more than just resolutions and framerates. That's exactly what the leaked Wolverine slides are showing. New features. Not resolutions.
Huh, yes, it does. The point is that Ratchet & Clank uses what the PS5 has like almost no other game. We were discussing just yesterday how it can thrash the PCIe bus on PC and cause stutters even on an RTX 3080; you accused it of being a bad port and I argued that it isn't. It takes advantage of the PS5's unified memory architecture. On the one hand, you claim that it runs on the little Steam Deck and looks comparable, but then you turn around and say it's a shit port because it isn't twice the performance of the PS5 on your 3080. Which is it? You can't have it both ways.

Alan Wake 2 dies on the Steam Deck and, until last month, tanked on Pascal cards (presumably because there was no vertex shader fallback for its mesh shaders). I didn't see you accusing it of being a shit port.

And last but not least, "tap out" means absolutely nothing in this context and isn't what we're talking about. What we're talking about is whether or not a game takes full advantage of the PS5's hardware, and to do so, it has to use everything the PS5 has to offer to a lesser or greater degree. That X dev decides to run the game at 4K30 while Y dev decides to do 1440p75 with lesser graphics doesn't mean X dev uses the hardware any more than the other one. They simply have different priorities and allocate their resources differently. However, if Y dev has ray tracing, fast asset streaming, and pushes the I/O subsystem while X dev doesn't, then I'd argue they take better advantage of the PS5's hardware despite not having fancier particle effects.

Rift Apart uses practically every new tool the PS5 has to offer so how can anyone claim it doesn't come close to taking full advantage of the PS5?
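To put a rough number on the PCIe point: on PC, a burst of new assets has to be copied across the PCIe bus into VRAM, a copy the PS5's unified memory never pays. A small sketch with illustrative, assumed figures (the burst size and link speeds are not measured values):

```python
# Hypothetical asset burst (e.g. a portal transition) copied over PCIe into VRAM
burst_mb = 500
links = {"PCIe 3.0 x16": 13.0, "PCIe 4.0 x16": 25.0}  # rough practical GB/s

for name, gb_per_s in links.items():
    ms = burst_mb / 1024 / gb_per_s * 1000          # copy time in milliseconds
    frames_at_60 = ms / (1000 / 60)                 # how many 60 fps frames that costs
    print(f"{name}: ~{ms:.0f} ms for {burst_mb} MB (~{frames_at_60:.1f} frames at 60 fps)")
```

A multi-frame copy like that shows up as exactly the kind of hitch being described, even on a GPU as fast as a 3080.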
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Huh, yes, it does. The point is that Ratchet & Clank uses what the PS5 has like almost no other game. We were discussing just yesterday how it can thrash the PCIe bus on PC and cause stutters even on an RTX 3080; you accused it of being a bad port and I argued that it isn't. It takes advantage of the PS5's unified memory architecture. On the one hand, you claim that it runs on the little Steam Deck and looks comparable, but then you turn around and say it's a shit port because it isn't twice the performance of the PS5 on your 3080. Which is it? You can't have it both ways.

Alan Wake 2 dies on the Steam Deck and, until last month, tanked on Pascal cards (presumably because there was no vertex shader fallback for its mesh shaders). I didn't see you accusing it of being a shit port.

And last but not least, "tap out" means absolutely nothing in this context and isn't what we're talking about. What we're talking about is whether or not a game takes full advantage of the PS5's hardware, and to do so, it has to use everything the PS5 has to offer to a lesser or greater degree. That X dev decides to run the game at 4K30 while Y dev decides to do 1440p75 with lesser graphics doesn't mean X dev uses the hardware any more than the other one. They simply have different priorities and allocate their resources differently. However, if Y dev has ray tracing, fast asset streaming, and pushes the I/O subsystem while X dev doesn't, then I'd argue they take better advantage of the PS5's hardware despite not having fancier particle effects.

Rift Apart uses practically every new tool the PS5 has to offer so how can anyone claim it doesn't come close to taking full advantage of the PS5?
So Uncharted 4 is a next-gen masterpiece confirmed. Got it.
 

RoadHazard

Gold Member
IDK, it's extremely hard to run on PC. I have to use DLSS at 1440p on a 3090 Ti to hit around 30 fps. A 4080 and a 3090 Ti are pretty close in performance, so I would be shocked if they are able to hit that without using an FSR setting that basically destroys the image. Maybe if they target 1080p.

PS5 Pro games won't use FSR but the (probably) much superior PSSR (which should be closer to what DLSS can do). That's half the point of this machine. Plus a decent increase in GPU performance and a larger increase (relatively speaking) in RT performance.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
IDK, it's extremely hard to run on PC. I have to use DLSS at 1440p on a 3090 Ti to hit around 30 fps. A 4080 and a 3090 Ti are pretty close in performance, so I would be shocked if they are able to hit that without using an FSR setting that basically destroys the image. Maybe if they target 1080p.

So how much more powerful is the 4090 over your 3090ti?
 

Gaiff

SBI’s Resident Gaslighter
So Uncharted 4 is a next-gen masterpiece confirmed. Got it.
Does Uncharted 4 take advantage of most if not all the new PS5 hardware features? No, it doesn't.

If you're just going to be disingenuous, don't bother replying.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Does Uncharted 4 take advantage of most if not all the new PS5 hardware features? No, it doesn't.

If you're just going to be disingenuous, don't bother replying.
Does Ratchet?

Does it support primitive shaders?
Does it support RTGI? Or even software-based real-time GI like Starfield, Forza, and the software Lumen UE5 games?
Does it have the AI upscaling they intend to add in Wolverine?
Does it tax the CPU in any meaningful way? Like they are set to do in the Wolverine feature list? Destruction, fluid simulations, and support for motion-matching animations are all CPU-related tasks that we did not see in Ratchet. How can it be tapping out the PS5?

Its main use of the PS5 GPU is RT reflections, something several cross-gen games had at launch. Does that mean Watch Dogs tapped out the PS5? DMCV? Miles?

Its biggest selling point is the SSD usage in creating the portal segments. But DF found that even the slowest SSDs could run these on PS5. Hell, I saw videos of the game running on an HDD if you paired it with a fast enough CPU. Alan Wake 2, on the other hand, was pushing 2+ GB/s. That's higher than Ratchet ever gets to.

Avatar's flying is way faster than the flying level in Ratchet. They specifically said they needed the SSD for faster traversal on ikrans. Ratchet did not even tap out the PS5's SSD, let alone the GPU.

I just find it hilarious that we have now seen GTA6 and 1943 looking a generation ahead of Spider-Man 2 and Ratchet, and you are arguing that Insomniac has tapped out the PS5 just because the games can hit native 4K and 45-50 fps.
 

Gaiff

SBI’s Resident Gaslighter
Does Ratchet?
Yes.
Does it support primitive shaders?
No idea.
Does it support RTGI? Or even software-based real-time GI like Starfield, Forza, and the software Lumen UE5 games?
In other words, does it support ray tracing? Yes.
Does it have the AI upscaling they intend to add in Wolverine?
The PS5 doesn't do AI upscaling.
Does it tax the CPU in any meaningful way?
Yes.
Like they are set to do in the Wolverine feature list? Destruction, fluid simulations, and support for motion-matching animations are all CPU-related tasks that we did not see in Ratchet. How can it be tapping out the PS5?
Again with your "tapping out". There are dozens of different parts in the system that can be more or less taxed depending on the workload. The point is, does it take advantage of the PS5's features? The answer is a resounding yes.
Its main use of the PS5 GPU is RT reflections, something several cross-gen games had at launch. Does that mean Watch Dogs tapped out the PS5? DMCV? Miles?
What did I say about being disingenuous?
Its biggest selling point is the SSD usage in creating the portal segments. But DF found that even the slowest SSDs could run these on PS5. Hell, I saw videos of the game running on an HDD if you paired it with a fast enough CPU.
It will stutter like hell with an HDD, no matter what CPU you pair with it, unless you run the sequence and reload, at which point it is cached and will run better. Not that it matters, because if your point is, "Well, with a souped-up PC you can break through the hardware limitations," then no shit. That's why the PS5 has those cheaper solutions instead of a 7800X3D-class CPU. The fact that you need a modern, powerful CPU to replicate what the PS5 does with an HDD (and still won't come close) means that yes, it does use those features effectively.
Alan Wake 2, on the other hand, was pushing 2+ GB/s. That's higher than Ratchet ever gets to.
And runs at what? 900p upscaled to 1440p and frequently dips below 60? I forgot the exact number. Once again, different ways to allocate those resources but according to you, there's just one way to do that. Never mind the fact that Alan Wake 2 now runs on GPUs without mesh shader support pretty well.
Avatar's flying is way faster than the flying level in Ratchet. They specifically said they needed the SSD for faster traversal on ikrans. Ratchet did not even tap out the PS5's SSD, let alone the GPU.
Avatar only supports mesh shaders on consoles, not on PC. Alan Wake 2 now presumably has a vertex fallback to run properly on older GPUs. What good are those if you can do the exact same things with older techniques?
I just find it hilarious that we have now seen GTA6 and 1943 looking a generation ahead of Spider-Man 2 and Ratchet, and you are arguing that Insomniac has tapped out the PS5 just because the games can hit native 4K and 45-50 fps.
Lol, no one is arguing. Learn to read. A poster claimed that Rift Apart "doesn't come close to fully taking advantage of the PS5's hardware." I argued otherwise, and we can demonstrably show that it takes advantage of most hardware features on the PS5.

Your argument is basically, "Rift Apart doesn't tax the PS5 because instead of using the GPU power for fancy effects, it reallocates it to higher resolution and frame rates," which makes no sense. Rendering budget is a zero-sum game. Everything you do has a cost, and how the developer chooses to use these resources isn't very relevant to this discussion so long as the developer takes advantage of most modern features of the machine, which Rift Apart does.

If Wolverine runs at 4K and 40-45 fps while doing all of this, then we can 100% say it takes much better advantage of the PS5's hardware than Rift Apart and is far more effective at using it. But we know it won't.
 
Last edited:

Bojji

Member
Nah, Ratchet is one next-gen game that runs fine on the Steam Deck, even at medium settings. And it looks very comparable to the PS5's performance mode.

It's the other next-gen games like Alan Wake, Avatar, and the latest UE5 titles that look like a pixelated mess, even worse than some PS3 games. That's if they can even hit 30 fps, which they can't.

[embedded video comparisons]

Insomniac simply didn't push the PS5 GPU hard enough for this game. The SSD is what sets this game apart from PS4-era games, and the Steam Deck has an SSD. But other games with fancier graphics and a 1440p target on the PS5 just completely fall apart on the Steam Deck: they have to use the lowest presets and push resolutions down to 240p to get a stable framerate, and in some games like Avatar they don't even get there.

Again, Rift Apart runs at native 4K 50 fps on the base PS5. It is not pushing the PS5 at all. They could've added RTGI, RT shadows, way better volumetrics, character models, and other lighting effects to make it look like a CG movie, but they chose instead to simply push for native 4K and then had over 50% more GPU available to get to 45-50 fps on a consistent basis. It is the poster child for a game that is not tapping out the PS5.

They did the same with Spider-Man 2. Downgraded the crap out of the game from its initial trailer to hit native 4K. But for Wolverine they seem to be going all out and will be adding real-time GI, destruction, MetaHuman-quality facial animations, cloth and muscle simulations, motion matching, weather and fluid simulations, etc. Basically everything we saw in the 1943 demo is listed here. That is the Insomniac game that will tap out the PS5. Not Ratchet.


Spider-Man 2 is far from native 4K:



I'm pretty sure Ratchet also uses dynamic res all the time.
 

SlimySnake

Flashless at the Golden Globes
Spider-Man 2 is far from native 4K:



I'm pretty sure Ratchet also uses dynamic res all the time.

Nah, that's the debug mode, which has its own performance hit. Several YouTubers, including DF themselves, found that the 30 fps mode ran at mostly native 4K while the 40 fps mode mostly ran at 1440p.
 

Bojji

Member
Nah, that's the debug mode, which has its own performance hit. Several YouTubers, including DF themselves, found that the 30 fps mode ran at mostly native 4K while the 40 fps mode mostly ran at 1440p.

Was this performance hit confirmed? Maybe they dropped the resolution since the launch version.

This was at launch, still not native 4K:

[screenshot]

I don't know where you have seen Ratchet running at native 4K at 50fps. I think it's also dynamic.
 

SlimySnake

Flashless at the Golden Globes
Was this performance hit confirmed? Maybe they dropped the resolution since the launch version.

This was at launch, still not native 4K:

[screenshot]

I don't know where you have seen Ratchet running at native 4K at 50fps. I think it's also dynamic.
I don't know if it's confirmed, but it's pretty obvious that running the game in debug mode would have a performance hit. I don't know where that screenshot is from, but you can watch any YouTuber (NX Gamer, DF, ElAnalistaDeBits, or VG Tech) and they all say the same thing about the 30 fps mode averaging or hitting 2160p the vast majority of the time and only dropping in rare instances.

1440p is for the 40 fps mode.

I have a TV that shows fps, and Ratchet was hitting 50 fps in the quality RT mode in some levels. You can look up various YouTube videos that covered the VRR patch; the quality mode was running well above the 40 fps limit. They left a lot of headroom for these games.

Whether or not it drops down to a lower resolution to maintain 30 fps in hectic scenes is irrelevant, because they designed the game around native 4K 30 fps. Just like how GT7 was developed around native 4K 60 fps, and when the VRR patch came out, we saw that it was hitting 100 fps. No one in their right mind would say GT7 has tapped out the PS5.
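For context on what "dynamic res" means here, this is roughly how a generic dynamic resolution controller behaves (an illustrative sketch with made-up numbers, not Insomniac's actual implementation): the engine measures GPU frame time and nudges the internal render scale so it stays within the frame budget, which is why a game designed around 4K30 can still dip below 2160p in hectic scenes while sitting at native 4K the rest of the time.

```python
TARGET_MS = 33.3                    # 30 fps frame budget
MIN_SCALE, MAX_SCALE = 0.7, 1.0     # e.g. ~1512p up to 2160p on a 4K output

def update_render_scale(scale, last_gpu_ms):
    # Simple proportional controller: headroom pushes resolution up, overruns push it down
    error = (TARGET_MS - last_gpu_ms) / TARGET_MS
    return max(MIN_SCALE, min(MAX_SCALE, scale + 0.05 * error))

scale = 1.0
for gpu_ms in [30.0, 38.0, 36.0, 31.0, 28.0, 27.0]:   # made-up GPU frame times
    scale = update_render_scale(scale, gpu_ms)
    print(f"gpu {gpu_ms:4.1f} ms -> render scale {scale:.2f} (~{int(2160 * scale)}p)")
```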
 

64bitmodels

Reverse groomer.
And they’re right. Consoles are essentially just little budget PCs that AMD produce; there’s no secret sauce within them that makes them “tick”. Off the shelf parts put into a stylish plastic housing, and there’s your PS5, or Xbox S|X respectively.
There is no secret sauce, true, but UE5 is still underutilized, unoptimized, and bloated, leaving a ton of graphical fidelity and performance on the table, and it's the engine most games use.
 
And they’re right. Consoles are essentially just little budget PCs that AMD produce; there’s no secret sauce within them that makes them “tick”. Off the shelf parts put into a stylish plastic housing, and there’s your PS5, or Xbox S|X respectively.
I don't think a "little budget PC" can perform as well as a console.

And consoles still have their "quirks and features". It's on devs (especially first party) to take advantage of those.
But as always, games need to show the 'power' of the hardware, and PC constantly feels like a scam in this regard
 

DanielG165

Member
There is no secret sauce, true, but UE5 is still underutilized, unoptimized, and bloated, leaving a ton of graphical fidelity and performance on the table, and it's the engine most games use.
There's definitely some performance potential left in the "base" machines that hasn't been fully tapped into. It'll be interesting to see how a standard PS5 runs the likes of Death Stranding 2 and GT6 versus a PS5 Pro, for instance.
 
Last edited:

DanielG165

Member
I don't think a "little budget PC" can perform as well as a console.
"Budget" in that consoles are typically cheaper than your average PC; that wasn't a slight against them. They are, however, essentially enclosed PCs nowadays, which isn't a bad thing. It makes development on them easier, I'm sure.
 

64bitmodels

Reverse groomer.
But as always, games need to show the 'power' of the hardware, and PC constantly feels like a scam in this regard
How so?

There's definitely some performance potential left in the "base" machines that hasn't been fully tapped into. It'll be interesting to see how a standard PS5 runs the likes of Death Stranding 2 and GT6 versus a PS5 Pro, for instance.
I think it's a bit more than just some, but yeah.

When you look at the visuals of the games coming out today and compare them to what we had last gen, realistically performance modes shouldn't be dropping to the levels they do. Games don't look that much better but seem to be much more costly to run.
 
"Budget" in that consoles are typically cheaper than your average PC; that wasn't a slight against them. They are, however, essentially enclosed PCs nowadays, which isn't a bad thing. It makes development on them easier, I'm sure.
I think you missed the point.

This:
"Consoles are essentially just little budget PCs that AMD produce"

and this:

"there's no secret sauce within them that makes them "tick". Off the shelf parts put into a stylish plastic housing"

are not true.

And this:
"They are, however, essentially enclosed PCs"

Well, essentially anything that computes and makes "graphics" is a PC.

Both companies have a close relationship with AMD and collaborate to create "bespoke" silicon or whatever, and Sony and MS put their own "secret sauce" in too.

The issue here is that software (engines/middleware/tools/APIs, etc.) enables devs to make things "just work" (good enough) on a "generic PC", which is good in a general sense. But taking advantage of consoles (or any specific piece of hardware/software) requires a lot more work and specific knowledge.

It is clear that the vast majority of devs are not doing that. So, based on the current state of the industry, only Sony has the leverage to show that its console hasn't been fully utilized.
 

OverHeat

« generous god »
I think you missed the point.

This:
"Consoles are essentially just little budget PCs that AMD produce"

and this:

"there's no secret sauce within them that makes them "tick". Off the shelf parts put into a stylish plastic housing"

are not true.

And this:
"They are, however, essentially enclosed PCs"

Well, essentially anything that computes and makes "graphics" is a PC.

Both companies have a close relationship with AMD and collaborate to create "bespoke" silicon or whatever, and Sony and MS put their own "secret sauce" in too.

The issue here is that software (engines/middleware/tools/APIs, etc.) enables devs to make things "just work" (good enough) on a "generic PC", which is good in a general sense. But taking advantage of consoles (or any specific piece of hardware/software) requires a lot more work and specific knowledge.

It is clear that the vast majority of devs are not doing that. So, based on the current state of the industry, only Sony has the leverage to show that its console hasn't been fully utilized.
The same could be said about GPUs in the PC space; a lot of features are not even utilized…
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
Related to my previous comment.

On the high end:
Games are not designed around a specific PC configuration; you are just brute-forcing performance/features.

In the middle range (which is where consoles exist):
Consoles are cheaper and perform better at that price point.
Complete nonsense as usual.
 

Gaiff

SBI’s Resident Gaslighter
Games are not designed around a specific PC configuration; you are just brute-forcing performance/features.

This is wholly incorrect. Games scale up and down a wide range of configurations. High-end PCs aren’t brute-forcing things over mid-tier PCs any more than the XSX/PS5 are brute-forcing things over the XSS.
 
Games are not designed around a specific PC configuration; you are just brute-forcing performance/features.

This is wholly incorrect. Games scale up and down a wide range of configurations.
designed ≠ scaling up/down

High-end PCs aren’t brute-forcing things over mid-tier PCs any more than the XSX/PS5 are brute-forcing things over the XSS.
That's the point. Why are you buying a high-end gaming PC instead of a console? ... Just a wild guess:

To have better performance, right?
 

Gaiff

SBI’s Resident Gaslighter
designed ≠ scaling up/down
Irrelevant semantics. You said they were "just brute-forcing", which is incorrect. Graphical features scale up and down. I sure as shit don't believe they designed games to run at 648p while looking like a mess in motion like Jedi Survivor on consoles.
 
Irrelevant semantics. You said they were "just brute-forcing", which is incorrect. Graphical features scale up and down. I sure as shit don't believe they designed games to run at 648p while looking like a mess in motion like Jedi Survivor on consoles.
No, not irrelevant at all. Words have meaning. You are equating 'to be designed' with 'having better or worse performance,' but they are not the same. I'm not talking about scalability; I'm talking about game design.

Does playing Starfield on a high-end PC make it a fundamentally different game? Can you seamlessly fly to planets, for example? That's the point.

In other words, you're "just brute forcing" performance without truly leveraging that hardware to experience a fundamentally different game beyond improved graphics and how the game runs.

So yeah, I think my observations are quite on point.

To tie the whole conversation together (beyond your "nonsense" reply):

Consoles indeed haven't been fully utilized by the devs. Why?

As I said before:

It's not because they are just "little budget PCs" with "off-the-shelf" components (implying that they don't have any more to offer). It is because of a few aspects:

1. Old engines that are not able to take full advantage of the new hardware (Starfield)
2. General/generic engines like Unity/Unreal that are "too heavy" (too inefficient) to fully utilize specific hardware (Redfall)
3. The above creates the "good enough to run on generic PCs" mentality / lack of optimization
4. Maximizing the potential of the hardware, both in design and performance, requires extensive effort and a team of highly skilled and knowledgeable developers/engineers (usually first party)

On top of that, you add COVID, and it feels like this midpoint of the gen has been an extended cross-gen period (some games came out one or even two years after they were supposed to be released).

Now, regarding the PS5 Pro, it is a machine designed to give you better performance, nothing more. (Sound familiar?) Yet Sony has the incentive to show what the PS5 is able to produce from a design and performance perspective. To put it bluntly: "the next-gen experience". Timing-wise, and based on Sony's track record, they will announce between 2 and 4 of these "next-gen" games in the next 18 months.
 

bitbydeath

Member
Irrelevant semantics. You said they were "just brute-forcing", which is incorrect. Graphical features scale up and down. I sure as shit don't believe they designed games to run at 648p while looking like a mess in motion like Jedi Survivor on consoles.
Not the greatest example.

 

Deerock71

Member
I hate to break it to you guys, but 98% of the population can't even tell what ray tracing is. Graphics whores are chasing the leprechaun riding the unicorn.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
No, not irrelevant at all. Words have meaning. You are equating 'to be designed' with 'having better or worse performance,' but they are not the same. I'm not talking about scalability; I'm talking about game design.
I don't equate anything to anything. Your "designed around X or Y" is completely irrelevant to the topic at hand.
Does playing Starfield on a high-end PC make it a fundamentally different game? Can you seamlessly fly to planets, for example? That's the point.
Graphics settings have never changed how games play in the history of gaming so what the hell are you talking about?
In other words, you're "just brute forcing" performance without truly leveraging that hardware to experience a fundamentally different game beyond improved graphics and how the game runs.
This didn't happen 30 years ago and it still doesn't happen now. Graphics settings are there to make games look better/worse and run better/worse. They're not here to fundamentally change the gaming experience.
So yeah, I think my observations are quite on point.
No, they aren't and they're completely wrong. Having better graphics isn't brute-forcing anything. That's how it's always been designed. Game design today hasn't changed from 15 years ago and most games, even current-gen ones, can be scaled down to run on a PS4. Nothing games do these days from a design perspective couldn't have been done 15 years ago.
To tie the whole conversation together (beyond your "nonsense" reply):

Consoles indeed haven't been fully utilized by the devs. Why?

As I said before:

It's not because they are just "little budget PCs" with "off-the-shelf" components (implying that they don't have any more to offer). It is because of a few aspects:

1. Old engines that are not able to take full advantage of the new hardware (Starfield)
2. General/generic engines like Unity/Unreal that are "too heavy" (too inefficient) to fully utilize specific hardware (Redfall)
3. The above creates the "good enough to run on generic PCs" mentality / lack of optimization
4. Maximizing the potential of the hardware, both in design and performance, requires extensive effort and a team of highly skilled and knowledgeable developers/engineers (usually first party)

On top of that, you add COVID, and it feels like this midpoint of the gen has been an extended cross-gen period (some games came out one or even two years after they were supposed to be released).

Now, regarding the PS5 Pro, it is a machine designed to give you better performance, nothing more. (Sound familiar?) Yet Sony has the incentive to show what the PS5 is able to produce from a design and performance perspective. To put it bluntly: "the next-gen experience". Timing-wise, and based on Sony's track record, they will announce between 2 and 4 of these "next-gen" games in the next 18 months.
Don't care.
 
so what the hell are you talking about?
I'm talking about game design

Bard:
Video game design is the technical art of crafting the core mechanics, rules, and systems that define a video game's interactivity. It bridges the gap between the initial creative vision and the final playable product

Chat GPT:
Video game design is the creation and refinement of gameplay elements, mechanics, narrative, visuals, and audio to craft engaging player experiences. It involves designing rules, levels, characters, and interfaces, with a focus on immersion, challenge, and enjoyment. Designers collaborate to balance gameplay, tell compelling stories, and create visually and aurally appealing worlds. Iterative playtesting helps refine designs for optimal player satisfaction.


Starfield doesn't introduce anything new when you play it on a high-end PC.
Burning Shores introduces new mechanics and technology (only on PS5).

So, go away "nonsense" guy.
 

Gaiff

SBI’s Resident Gaslighter
Bard:
Video game design is the technical art of crafting the core mechanics, rules, and systems that define a video game's interactivity. It bridges the gap between the initial creative vision and the final playable product

Chat GPT:
Video game design is the creation and refinement of gameplay elements, mechanics, narrative, visuals, and audio to craft engaging player experiences. It involves designing rules, levels, characters, and interfaces, with a focus on immersion, challenge, and enjoyment. Designers collaborate to balance gameplay, tell compelling stories, and create visually and aurally appealing worlds. Iterative playtesting helps refine designs for optimal player satisfaction.
This was a rhetorical question.
Starfield doesn't introduce anything new when you play it on a high-end PC.
Neither do...99.99% of games on consoles or PC?
Burning Shores introduces new mechanics and technology (only on PS5).
Burning Shores isn't on PS4 and sure as hell doesn't introduce anything that changes the fundamental experience lol. And clouds change the experience? lmao.
So, go away "nonsense" guy.
You've yet to make a single point that is sensible. Newsflash: almost no game released today has mechanics that haven't been seen 15 years ago. Your argument's basic premise is utterly flawed. What limits game design these days isn't technology, it's creativity and the willingness to take risks. This is why you see more innovation in the indie scene than in the AAA space, which is content to release the same trash over and over again. Hell, the most groundbreaking gameplay design of the past decade is probably Shadow of Mordor and the Nemesis system back in 2014.
 
Last edited:
This was a rhetorical question.

Neither do...99.99% of games on consoles or PC?

Burning Shores isn't on PS4 and sure as hell doesn't introduce anything that changes the fundamental experience lol.

You've yet to make a single point that is sensible. Newsflash: almost no game released today has mechanics that haven't been seen 15 years ago. Your argument's basic premise is utterly flawed. What limits game design these days isn't technology, it's creativity and the willingness to take risks. This is why you see more innovation in the indie scene than in the AAA space, which is content to release the same trash over and over again.
Don't care.
 

onQ123

Member
So you are saying that if someone actually makes a game that truly pushes the PS5 to its limits, there is no way it could run on anything with weaker hardware, no matter the downgrades?
As crazy as it seems, yes. It's possible to make a game that, when built for PS5 around a large dataset that cannot fit into RAM but streams in as you look around, would not run on lesser hardware, because changing the settings will not reduce the size of the dataset, so the lower hardware will just choke.

It would have to be a pretty big game to do this, though, and the fact that no one is ready for 500GB+ games right now will keep devs from going this route.
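A minimal sketch of that idea, with hypothetical numbers: the unique data that has to stream in per second depends on how fast you move through the world and how dense the assets are, not on graphics settings, so storage that can't sustain the rate chokes no matter how low everything is set.

```python
def required_stream_rate_mb_s(traversal_speed_m_s, unique_data_mb_per_m):
    # MB of *new* unique assets entering view per second; lowering settings doesn't shrink this
    return traversal_speed_m_s * unique_data_mb_per_m

# Hypothetical: moving through a dense world at 100 m/s, ~10 MB of unique data per metre
needed = required_stream_rate_mb_s(100, 10)   # 1000 MB/s sustained

for name, drive_mb_s in [("PS5-class SSD", 5500), ("SATA SSD", 550), ("Laptop HDD", 150)]:
    status = "keeps up" if drive_mb_s >= needed else "chokes"
    print(f"{name}: {drive_mb_s} MB/s vs {needed} MB/s needed -> {status}")
```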
 