
PS5 Pro Specs Leak is Real, Releasing Holiday 2024 (Insider Gaming)

Gaiff

SBI’s Resident Gaslighter
read the article. they are referring to "future consoles" for that number

Whilst these are not the targets for the PS5 Pro due to hardware limitations, it is the internal goal for PSSR in future console interactions. The PlayStation 5 Pro PSSR currently supports 3840×2160 and is currently aiming for 4K 60 FPS and 8K 30FPS, but it’s unclear if those internal milestones can be passed.
At least these are more realistic.

The PlayStation 5 Pro PSSR currently supports 3840×2160 and is currently aiming for 4K 60 FPS and 8K 30FPS, but it’s unclear if those internal milestones can be passed.

Still don't think we're getting there unless they use something equivalent to DLSS Performance and run the internal res at 1080p and upscale it to 4K.
 
I know, but this is also why I'm scratching my head over how this thing will perform close to a 4070 in real-world games... that is close to 80 percent faster than the PS5, and that's assuming the RT capabilities are on par.

I don't think it will. It will be closer to a 3070 than a 4070. I think people are just getting a bit carried away by the hype. It reminds me of when the PS5 was about to launch and everyone was getting excited about 4K/60 becoming the new standard.

It will be a good upgrade on the base PS5, but I think those expecting 4070ti/3090 levels of performance are going to be disappointed. Somewhere around 7700xt/3070ti is probably more realistic.
 
Interested to see how well PlayStation's own upscaling tech performs

On one hand, it wouldn't make sense for it to be anywhere near as good as DLSS or FSR.

On the other hand, if it wasn't as good as FSR, you would think they would just use FSR.

Sony is really all over the place when it comes to custom solutions, but their track record isn't great. So the big question is why. When it came down to Atmos vs their own custom 3d audio, their solution was worse, but either Atmos was exclusive to Microsoft at the time or Sony didn't want to pay for it and they later came to terms.

You wouldn't think FSR would cost them anything extra.

Possible that AMD just wasn't ready to use AI upscaling with FSR and Sony was.

Will be really interesting.
 

HeisenbergFX4

Gold Member
At least these are more realistic.

The PlayStation 5 Pro PSSR currently supports 3840×2160 and is currently aiming for 4K 60 FPS and 8K 30FPS, but it’s unclear if those internal milestones can be passed.

Still don't think we're getting there unless they use something equivalent to DLSS Performance and run the internal res at 1080p and upscale it to 4K.
Yeah the article is a little clickbaity imo
 

Mr.Phoenix

Member
For the people defending this Zen2, 8 Core, 3.85GHz, 4MB Cache CPU.

Never forget this.


If you all are happy with 25/55fps becoming a stable 30/60fps, that's fine by me.

But for something called the "Pro", I expected the Pro to have a performance fidelity RT mode.

Guess I'll agree with HeisenbergFX4 and wait and see.
While I respect your perspective, I feel you are being somewhat biased here. Let's look at this from a purely numbers perspective.

If you have a game that can do 25/55fps in quality and performance modes respectively on the current PS5, then that means the CPU can already handle game logic at up to 55fps. And the PS5pro, with its higher CPU clock, would get that to your stable 30/60fps, as you put it.

But this is where I feel people are being disingenuous.

A 1.45x render improvement (and this is not taking into account any benefits from AI, RT, or the more advanced RDNA3+ architecture) means that 30/60fps is more akin to 43/87fps in quality and performance modes respectively. Or, at the very least, a locked 40fps quality mode and a 60fps performance mode with significantly better IQ than what the current 60fps performance modes manage. And that, to me, is worthy of a $100 price bump over the current PS5, and is what the Pro is designed to do.
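The frame-rate arithmetic in the post above can be sketched as a quick check (the 1.45x figure and the linear scaling are the post's own assumptions, not measured data):

```python
# Sketch of the post's arithmetic: if the CPU already sustains the game
# logic, a 1.45x GPU render uplift scales a GPU-bound frame rate roughly
# linearly (ignoring extra gains from AI, RT, or the newer architecture).
RENDER_UPLIFT = 1.45  # claimed PS5 Pro raster uplift over the base PS5

def scaled_fps(base_fps: float, uplift: float = RENDER_UPLIFT) -> float:
    """Idealized GPU-bound frame rate after a raster uplift."""
    return base_fps * uplift

print(scaled_fps(30))  # ~43.5 -> the "43fps quality mode" figure
print(scaled_fps(60))  # ~87.0 -> the "87fps performance mode" figure
```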

And this is looking at that 45% better render vs the PS5 in a vacuum, which we shouldn't do to begin with.

I don't understand why people are expecting the PS5pro to have GPU performance on par with $600-$750 GPUs.
 
PlayStation’s new Spectral Super Resolution (PSSR), which will first be integrated into the PlayStation 5 Pro, is internally aiming for 4K 120 FPS and 8K 60 FPS console gaming, Insider Gaming has learned.

The news comes following this week’s leaks of the PS5 Pro, which revealed that the PlayStation 5 Pro (codenamed Trinity) will be incorporating PSSR to upscale to higher resolutions. Currently, PSSR works on SDK 9.00 in the PlayStation 5 Pro to bring 4K resolutions.

Insider Gaming has also revealed more specifications on the upcoming PlayStation 5 Pro, which you can read here.

Outlined in documents provided to Insider Gaming under the condition that they are not made public, PlayStation’s ambitions with PSSR is to achieve 4K 120FPS and 8K 60FPS. Whilst these are not the targets for the PS5 Pro due to hardware limitations, it is the internal goal for PSSR in future console interactions. The PlayStation 5 Pro PSSR currently supports 3840×2160 and is currently aiming for 4K 60 FPS and 8K 30FPS, but it’s unclear if those internal milestones can be passed.

The PSSR memory requirement is roughly 250MB: 180MB from the PSML Library and 64MB from the game.

Two Case Studies for two unnamed first-party games include:

Game 1​

Target – image quality close to Fidelity Mode (1800p) with Performance Mode FPS (60 FPS)

Standard PlayStation 5 –


  • Performance Mode – 1080p at 60FPS
  • Fidelity Mode – 1800p at 30FPS
PlayStation 5 Pro –

  • 1440p at 60FPS (PSSR used)

Game 2​

Target – Add Raytracing to gameplay

Standard PlayStation 5 achieved 60FPS without raytracing, and PlayStation 5 Pro achieved 60FPS with Raytracing.

For those who don't want to click

I would wonder if Game 1 would have a 4K30 mode or 4K40 with VRR.

What's funny is I can't think of any PS5 first party games that run at 1800p30...
 

Loxus

Member
Yeah the article is a little clickbaity imo
I think he's trying not to get in trouble for leaking too much info.

Should it be future consoles "iteration" and not "interaction"?

And what happened to the performance 8k mode?
 

Mr.Phoenix

Member
On one hand, it wouldn't make sense for it to be anywhere near as good as DLSS or FSR.

On the other hand, if it wasn't as good as FSR, you would think they would just use FSR.

Sony is really all over the place when it comes to custom solutions, but their track record isn't great. So the big question is why. When it came down to Atmos vs their own custom 3d audio, their solution was worse, but either Atmos was exclusive to Microsoft at the time or Sony didn't want to pay for it and they later came to terms.

You wouldn't think FSR would cost them anything extra.

Possible that AMD just wasn't ready to use AI upscaling with FSR and Sony was.

Will be really interesting.
Sony did CBR on the PS4pro, and that was great. They did the whole IO complex thing with the PS5 SSD, and that has been great. If Sony is using an AI-accelerated reconstruction tech with the PS5pro, it will be better than anything FSR has been able to do thus far, and should easily put it in the XeSS and DLSS conversation. AI-based reconstruction is not some sort of secret super sauce; it's a by-product of having machine learning train a reconstruction pipeline using the same data inputs across the board.

If it wasn't going to be better than FSR, then as you said, why bother at all with dedicated AI hardware when that die space could just be used for something else? And if the PS5pro does have AI reconstruction, I will be willing to bet my left nut that the next RDNA iteration (RDNA4?) will have it too.

You are right about it being interesting though. This is my two cents on this, at least.
 
Well, native 1440p upscaled to 4K using PSSR, if I'm understanding correctly.

That's what I was thinking, but it's not clear

Is it upscaled FROM 1440p or TO 1440p?

The second case doesn't make a lot of sense as 99% of people already output standard PS5 at 2160p (as they have 4K TVs)
 
Sony did CBR on the PS4pro, and that was great. They did the whole IO complex thing with the PS5 SSD, and that has been great. If Sony is using an AI-accelerated reconstruction tech with the PS5pro, it will be better than anything FSR has been able to do thus far, and should easily put it in the XeSS and DLSS conversation. AI-based reconstruction is not some sort of secret super sauce; it's a by-product of having machine learning train a reconstruction pipeline using the same data inputs across the board.

If it wasn't going to be better than FSR, then as you said, why bother at all with dedicated AI hardware when that die space could just be used for something else? And if the PS5pro does have AI reconstruction, I will be willing to bet my left nut that the next RDNA iteration (RDNA4?) will have it too.

You are right about it being interesting though. This is my two cents on this, at least.

Well, one thing mentioned in the leak was that this won't require assets on a title-by-title basis, which is extremely interesting.

Will be nice to see Cerny break it down.
 

Mr.Phoenix

Member
That's what I was thinking, but it's not clear

Is it upscaled FROM 1440p or TO 1440p?

The second case doesn't make a lot of sense as 99% of people already output standard PS5 at 2160p (as they have 4K TVs)
lol. you are confusing yourself.

Currently, all PS5 games output at 4K. But internally, they run at all sorts of different resolutions, then use some sort of scaling solution to get to the final output rez. Even the games we say are native 4K@30fps are using some sort of dynamic rez scaling and can average anywhere from 1800p-2160p internally. Some even scale as low as 1658p/1440p, then reconstruct up to 4K using whatever scaling solution the devs want.

The PS5pro's PSSR is basically going to do, or attempt to do, what DLSS does. It would allow devs to run their games internally at 1440p and use the saved GPU headroom to push for higher framerates, then use PSSR to scale that up to 4K. Look at the example given: it's taking a game that runs internally at 1800p-2160p at 30fps (quality mode) and allows devs to run that at less than half the internal rez (1440p), then use PSSR to take that to 4K. The GPU load saved can instead go towards higher framerates.
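The pixel counts behind that argument can be sketched quickly (standard 16:9 resolutions; this just illustrates the post's point, not any leaked figures):

```python
# Pixel counts for the internal resolutions mentioned above, showing how
# much GPU work rendering at 1440p saves versus native 4K output.
RES = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "1800p": (3200, 1800),
    "2160p": (3840, 2160),
}

def pixels(name: str) -> int:
    """Total pixels per frame at a named 16:9 resolution."""
    w, h = RES[name]
    return w * h

for name in RES:
    print(f"{name}: {pixels(name):,} px")

# 1440p is only ~44% of the pixels of native 2160p; that gap is the
# headroom a PSSR-style reconstruction hands back to the framerate.
print(pixels("1440p") / pixels("2160p"))  # ~0.444
```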

Well, one thing mentioned in the leak was that this won't require assets on a title-by-title basis, which is extremely interesting.

Will be nice to see Cerny break it down.
I am really curious about this. Surely there has to be some sort of patch pushed out for games to support it. I would want to assume that it's something like how DLSS/FSR/XeSS can be plugged into anything that uses TAA, since they all use the same data sets.

I can understand telling devs: you don't have to do any extra work; if you already have the motion vectors and temporal data for TAA, then PSSR will just work. But I don't think it's like saying: play any game on the PS5pro and PSSR will be enabled, like some sort of PSSR boost mode. That makes no sense to me.
 

SlimySnake

Flashless at the Golden Globes
Game 1
Target – image quality close to Fidelity Mode (1800p) with Performance Mode FPS (60 FPS)

Standard PlayStation 5 –


  • Performance Mode – 1080p at 60FPS
  • Fidelity Mode – 1800p at 30FPS
PlayStation 5 Pro –

  • 1440p at 60FPS (PSSR used)

Wow, so not even 4K DLSS Performance? The goal is 1800p? Why not 4K? What is this?

And they are using PSSR to get to 1440p? Is this equivalent to 1440p DLSS Quality? Eh. 1440p DLSS Quality is not the same as 4K DLSS Performance, both of which used 1080p as a base internal resolution. I would have thought that a premier console would be targeting 4K.

Game 2​

Target – Add Raytracing to gameplay

Standard PlayStation 5 achieved 60FPS without raytracing, and PlayStation 5 Pro achieved 60FPS with Raytracing.
This is good, but it's a first-party game. We get like one from Sony every year. Other devs absolutely suck at getting more out of the PS5. The true test would be UE5, which is single-threaded as fuck.
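For reference, the preset math being compared above can be checked against the per-axis scale factors NVIDIA publishes for DLSS (Quality 1.5x, Balanced ~1.72x, Performance 2x; treat this as a sketch of public figures, not anything from the leak):

```python
# Internal render resolution for a given DLSS output and preset, using
# the commonly documented per-axis scale factors.
SCALE = {
    "Quality": 1.5,
    "Balanced": 1.724,      # approximate published value
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_res(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Base resolution a DLSS preset renders at for a given output."""
    s = SCALE[preset]
    return round(out_w / s), round(out_h / s)

print(internal_res(3840, 2160, "Performance"))  # (1920, 1080): 4K Performance
print(internal_res(2560, 1440, "Quality"))      # (1707, 960): 1440p Quality
```

Note that by these factors, 4K Performance starts from 1080p while 1440p Quality actually starts from 960p, so the two presets are not quite working from the same base.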
 

David B

An Idiot
The PS5 Pro will do real 4K at 30 FPS. Most publishers and developers will do 1440p to 1600p to 1800p at 60 FPS to 120 FPS. Some games will likely be 1440p 120 FPS, like COD and Battlefield games, with 4K upscaling, or render every other pixel at 1440p and every other at 4K, called checkerboarding, which Sony already admitted they do. So basically we will have more realistic real 4K games with low frame rates, and then we will have games with 1800p max and 60 FPS to 120 FPS.
 

yamaci17

Member
spiderman is just doing rt reflections and already drops below 1080p. those other games are doing a lot more rt effects which is why they cause a bottleneck on the CPU. it makes zero sense to increase rt performance by 2-4x and then cheap out on the CPU when the CPU is the major bottleneck.
spiderman is just well optimized, but it is an exception to the norm. no need to dismiss the game regardless; it just proves a BVH structure at 60+ fps cpu bound is possible. but it was only possible because base spiderman already targeted a perfectly frame-paced 30 fps on 1.6 ghz jaguar cores.

the amount, type, or quality of ray tracing effects does not have a big impact on CPU performance. you need the very same BVH structure no matter what you're going to do with ray tracing. it is a fixed cost you pay, no matter what you run. you can build the ray tracing BVH structure with no ray tracing effects present and you will still be hit with the CPU cost

please stop talking about things you don't even understand


just reflections: 67 fps cpu bound

just shadows: 68 fps cpu bound

path tracing: 63 fps cpu bound (a mere 6% drop)

reflections + shadows: 68 fps cpu bound

reflections + shadows + gi: 67 fps cpu bound

no ray tracing: 87 fps cpu bound (a 1.29x cpu-bound performance hit from ray tracing)


all values are within margin except path tracing (which is 6% more cpu heavy compared to shadows+gi+reflections)



actually spiderman miles morales has a bigger hit on CPU with ray tracing on PC: 115 fps down to 80 fps (a 1.43x increase)

the reason spiderman is able to hit 60+ fps with ray tracing is that it is less cpu bound at its baseline than cyberpunk. and the funny thing is, spiderman's BVH cpu cost is much heavier than cyberpunk's, which defeats your logic entirely
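The fixed-cost argument in this post can be restated in frame times, using the cpu-bound fps figures quoted above (the numbers are the poster's measurements, not mine):

```python
# Convert the cpu-bound fps figures into frame times: if the BVH really is
# a fixed per-frame cost, every RT configuration should add a similar
# number of ms over the no-RT baseline, whatever effects are enabled.
measurements = {  # cpu-bound fps from the screenshots in the post
    "no ray tracing": 87,
    "reflections": 67,
    "shadows": 68,
    "reflections + shadows": 68,
    "reflections + shadows + gi": 67,
    "path tracing": 63,
}

def frametime_ms(fps: float) -> float:
    """Frame time in milliseconds for a given frame rate."""
    return 1000.0 / fps

baseline = frametime_ms(measurements["no ray tracing"])  # ~11.5 ms
for name, fps in measurements.items():
    extra = frametime_ms(fps) - baseline
    print(f"{name}: +{extra:.1f} ms over the no-RT baseline")
# Every RT config lands in a narrow ~3.2-4.4 ms band: roughly a fixed cost.
```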
 

Gaiff

SBI’s Resident Gaslighter
The comments for that tweet are hilarious.

Dudes getting all bent out of shape that Sony may have a decent competitor to FSR and DLSS.
They misunderstood the tweet, which has a baity headline, and thought the Pro was aiming for 4K120 or 8K60, which is utterly ridiculous. The article actually specifies that it's for future consoles. I also thought this was about the Pro and was rolling my eyes.
 

SlimySnake

Flashless at the Golden Globes
No it's not.




Here, let me help you buddy. Even with a higher-class CPU, 12400 vs 11400, the 3090 clocks the same FPS at optimized/High RT and at max/Very High RT.
Dude, you have absolutely no idea what you are talking about. Just go back to discussing graphics fidelity. The 11400 and 12400 are not that far apart, and those are two different benchmarks. You can't be comparing things like this.

Look at the screenshot I posted. You have two different CPUs in the same scene showing a 45 fps delta. That is all the proof you need. There's no need to argue against facts. It is annoying. No one likes it. You are not a woman; let's act like men and accept basic facts. Please.

These CPUs clock at 4.70-5.0 GHz, and they are still bottlenecking the GPU, utilizing only half of it. If you are unable to understand this, then it's ok, but don't continue to argue. Trust me when I say this: you are making a fool of yourself. You are out of your element here, and every single post makes you look worse. Just stop.
 
lol. you are confusing yourself.

Currently, all PS5 games output at 4K. But internally, they run at all sorts of different resolutions, then use some sort of scaling solution to get to the final output rez. Even the games we say are native 4K@30fps are using some sort of dynamic rez scaling and can average anywhere from 1800p-2160p internally. Some even scale as low as 1658p/1440p, then reconstruct up to 4K using whatever scaling solution the devs want.

The PS5pro's PSSR is basically going to do, or attempt to do, what DLSS does. It would allow devs to run their games internally at 1440p and use the saved GPU headroom to push for higher framerates, then use PSSR to scale that up to 4K. Look at the example given: it's taking a game that runs internally at 1800p-2160p at 30fps (quality mode) and allows devs to run that at less than half the internal rez (1440p), then use PSSR to take that to 4K. The GPU load saved can instead go towards higher framerates.

Yeah, so they start from internal 1440p->PSSR->2160p

Is there also some kind of "frame-gen" technology?
 

twilo99

Member
I don't think it will. It will be closer to a 3070 than a 4070. I think people are just getting a bit carried away by the hype. It reminds me of when the PS5 was about to launch and everyone was getting excited about 4K/60 becoming the new standard.

It will be a good upgrade on the base PS5, but I think those expecting 4070ti/3090 levels of performance are going to be disappointed. Somewhere around 7700xt/3070ti is probably more realistic.

Is that based on a 7700xt/3070ti build that uses a weak zen2 CPU? That’s really the only way to say that it has a certain amount of GPU performance with the corresponding bottleneck.
 
spiderman is just well optimized, but it is an exception to the norm. no need to dismiss the game regardless; it just proves a BVH structure at 60+ fps cpu bound is possible. but it was only possible because base spiderman already targeted a perfectly frame-paced 30 fps on 1.6 ghz jaguar cores.

the amount, type, or quality of ray tracing effects does not have a big impact on CPU performance. you need the very same BVH structure no matter what you're going to do with ray tracing. it is a fixed cost you pay, no matter what you run. you can build the ray tracing BVH structure with no ray tracing effects present and you will still be hit with the CPU cost
You have a developer in this thread, fafalada, saying RT is CPU-heavy only in PC games, not (or much less so) on consoles, due to different APIs.
 

Gaiff

SBI’s Resident Gaslighter
ChiefDada You should really consider getting a PC, even a relatively cheap gaming rig. You love graphics and tech, but consoles don't let you do anything. You can't peep inside files or settings, or run profiling tools to know what's going on. You can just look and try to guess, or maybe run an fps counter. That kinda limits what you can do and how far you can take your knowledge.
 

SlimySnake

Flashless at the Golden Globes
Well, native 1440p upscaled to 4K using PSSR, if I'm understanding correctly.
No. It says 1440p using PSSR to get image quality close to the native 1800p in the fidelity mode.
Game 1
Target – image quality close to Fidelity Mode (1800p) with Performance Mode FPS (60 FPS)
Standard PlayStation 5 –


  • Performance Mode – 1080p at 60FPS
  • Fidelity Mode – 1800p at 30FPS
PlayStation 5 Pro –

  • 1440p at 60FPS (PSSR used)

I told you guys that if you have just a 45% increase in raw GPU performance, you can't expect massive pixel increases, because these upscaling solutions have a cost themselves. FSR2, for example, has a 30% cost on the GPU in some games. The PS5 Pro upgrade is only 45%, so if 30% is going into upscaling, then there is not much else they can do with the remaining 15%.

This is why a bigger GPU jump was needed.
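The budget argument above can be put into back-of-envelope math (the 45% uplift and the 30% upscaler cost are the post's figures, treated here as composing multiplicatively):

```python
# If the upscaler eats a fixed fraction of each frame, the effective
# uplift left over for resolution or framerate shrinks multiplicatively.
GPU_UPLIFT = 1.45      # Pro raster throughput vs base PS5 (post's figure)
UPSCALER_COST = 0.30   # fraction of the frame an FSR2-class upscaler can cost

effective = GPU_UPLIFT * (1 - UPSCALER_COST)
print(f"effective uplift: {effective:.3f}x")  # ~1.015x, i.e. almost nothing
```

A dedicated ML block would presumably make PSSR far cheaper per frame than that 30% FSR2 figure, which is arguably the whole bet behind the Pro's hardware.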
 

yamaci17

Member
You have a developer in this thread, fafalada, saying RT is CPU-heavy only in PC games, not (or much less so) on consoles, due to different APIs.
it must be special sauce for 1st party games. jedi survivor clearly suffered from an extreme cpu bottleneck with ray tracing. it went from being an unstable 30 fps game to a rock solid 60 fps with the removal of ray tracing. dragon's dogma 2 seems to be the same.

some 1st party developers probably use special solutions to reduce the cpu boundness of ray tracing, or maybe remove it altogether. but multiplatform games will adhere to different norms instead of special norms set by the playstation 5

the result is spiderman, where they needed enormous effort to port ray tracing to PC. actually, they had to redo the whole thing from the ground up, almost. regular 3rd party games and their ray tracing implementations most likely still rely on the CPU on PS5. in that case, you can't expect developers to create special builds for the playstation 5. some of these developers make games for multiple platforms. they cannot specifically tune their ray tracing implementation for PS5

also, my response was about the cpu boundness of ray tracing being dependent on effects. that person claims jedi survivor is more cpu bottlenecked than spiderman because it runs more advanced ray tracing, but the reality is that the relative CPU performance hit is similar no matter the effect or complexity
 

Mr.Phoenix

Member
Wow, so not even 4K DLSS Performance? The goal is 1800p? Why not 4K? What is this?

And they are using PSSR to get to 1440p? Is this equivalent to 1440p DLSS Quality? Eh. 1440p DLSS Quality is not the same as 4K DLSS Performance, both of which used 1080p as a base internal resolution. I would have thought that a premier console would be targeting 4K.

This is good, but it's a first-party game. We get like one from Sony every year. Other devs absolutely suck at getting more out of the PS5. The true test would be UE5, which is single-threaded as fuck.
Sigh... come on man, you are better than this. It frustrates me more when this comes from you because I KNOW you know better.

Let's look at it this way.

Take the og PS5. In quality mode, it's running a game at 30fps with an internal DRS rez of 1800p-2160p. It's using 10TF of GPU power to do this. Remember this part: 1800p-2160p.

Then comes the PS5pro. It's going to do what DLSS has been doing, and been praised for doing, for the last 4 years. The exact same thing. First, remember the PS5pro has 16TF of GPU raster power.

Then this is where it gets interesting. The PS5pro will allow devs to take that og PS5 game and, instead of rendering 1800p-2160p in quality mode, HALVE that internal resolution. So now they can run the game internally at 1440p on the 16TF PS5pro. They would then use PSSR to reconstruct that to 4K. Which, if DLSS is any indication, will give you IQ similar to, on par with, or better than native 4K. And all the GPU overhead saved by halving the render rez on the more powerful hardware can now go towards higher frames.

That is Cerny's "console" approach to this conundrum.
 
No. It says 1440p using PSSR to get image quality close to the native 1800p in the fidelity mode.


I told you guys that if you have just a 45% increase in raw GPU performance, you can't expect massive pixel increases, because these upscaling solutions have a cost themselves. FSR2, for example, has a 30% cost on the GPU in some games. The PS5 Pro upgrade is only 45%, so if 30% is going into upscaling, then there is not much else they can do with the remaining 15%.

This is why a bigger GPU jump was needed.

So it's not me that is confused; you basically wrote the opposite of the previous posters.

Is 1440p the starting point or the end point???
 

Schmendrick

Member
AI-based reconstruction is not some sort of secret super sauce,
At runtime, at the framerates we see in video games, with good to very good quality and nearly no performance hit, it absolutely is. Under those circumstances it is not easy at all.

Whatever Sony has cooking has quite a few years of R&D and billions in investment to overcome if it wants to be mentioned in the same sentence as DLSS.
 

Mr.Phoenix

Member
No. It says 1440p using PSSR to get image quality close to the native 1800p in the fidelity mode.


I told you guys that if you have just a 45% increase in raw GPU performance, you can't expect massive pixel increases, because these upscaling solutions have a cost themselves. FSR2, for example, has a 30% cost on the GPU in some games. The PS5 Pro upgrade is only 45%, so if 30% is going into upscaling, then there is not much else they can do with the remaining 15%.

This is why a bigger GPU jump was needed.
You are misconstruing the information.

It says the target is to get the IQ of fidelity mode while getting the framerate of performance mode. That is the goal of PSSR on the PS5pro.

And how they accomplish this is by dropping the internal render rez of quality mode to 1440p, then using PSSR to get it to 4K. They did not say they are using PSSR to get to 1440p, as that would not give you the IQ of 1800p/2160p. What that document is showing are the internal rezes: fidelity mode on the og PS5 (1800p), performance mode on the og PS5 (1080p), and native rez on the PS5pro (1440p), then PSSR to 4K.

This is basically the DLSS/XeSS quality preset. Why are you expecting it to somehow be different here, or to work in a way that makes no sense?
So it's not me that is confused; you basically wrote the opposite of the previous posters.

Is 1440p the starting point or the end point???
Don't listen to him, listen to me. lol.
 

SlimySnake

Flashless at the Golden Globes
Sigh... come on man, you are better than this. It frustrates me more when this comes from you because I KNOW you know better.

Let's look at it this way.

Take the og PS5. In quality mode, it's running a game at 30fps with an internal DRS rez of 1800p-2160p. It's using 10TF of GPU power to do this. Remember this part: 1800p-2160p.

Then comes the PS5pro. It's going to do what DLSS has been doing, and been praised for doing, for the last 4 years. The exact same thing. First, remember the PS5pro has 16TF of GPU raster power.

Then this is where it gets interesting. The PS5pro will allow devs to take that og PS5 game and, instead of rendering 1800p-2160p in quality mode, HALVE that internal resolution. So now they can run the game internally at 1440p on the 16TF PS5pro. They would then use PSSR to reconstruct that to 4K. Which, if DLSS is any indication, will give you IQ similar to, on par with, or better than native 4K. And all the GPU overhead saved by halving the render rez on the more powerful hardware can now go towards higher frames.

That is Cerny's "console" approach to this conundrum.
You are misreading what they are saying.

They specifically state that the target is the image quality of the 1800p fidelity mode. If 1440p were the internal resolution and PSSR were used to upscale, it would be upscaled to 4K, which would give image quality better than native 4K, just like 4K DLSS Quality. It would NOT be equivalent to the image quality of 1800p.

I am not going to discuss your hypothetical scenario of 1800p to 2160p when I have a real-life scenario of 1800p from an actual PS5 Sony first-party game. Why discuss hypotheticals when we have real figures from Sony?

You guys did this earlier in the thread, when Sony themselves stated a 45% boost in GPU power, saying dual issue, console optimization, and Cerny secret sauce would have it act like a 6800 XT. Turns out, it was what I had predicted: they can't even take a 1080p game and push it to 1440p internal resolution, because the GPU power boost is only 45% and the pixel difference is 75%. Even if the pixel upgrade were 60%, like the max tflops suggest, they would still be below the 1440p target.
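The pixel arithmetic in that last paragraph roughly checks out (exact counts for standard 16:9 resolutions; the post rounds the ~78% figure down to 75%):

```python
# 1080p -> 1440p is a ~78% jump in pixels per frame, larger than either
# the 45% or 60% GPU uplift figures being discussed in the thread.
px_1080 = 1920 * 1080   # 2,073,600 pixels
px_1440 = 2560 * 1440   # 3,686,400 pixels

increase = px_1440 / px_1080 - 1
print(f"{increase:.0%} more pixels")  # 78% more pixels
```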
 

Mr.Phoenix

Member
You are misreading what they are saying.

They specifically state that the target is the image quality of the 1800p fidelity mode. If 1440p were the internal resolution and PSSR were used to upscale, it would be upscaled to 4K, which would give image quality better than native 4K, just like 4K DLSS Quality. It would NOT be equivalent to the image quality of 1800p.
No no no no and no.

Why make this complicated? They can take 1440p and PSSR that to 1800p if they want. There is nothing stopping them from doing that. It does not have to be PSSR to 4K.

They said the target is to match the IQ of quality mode, which in this example just happens to be 1800p in the game being used.

How you match that with an internal rez of 1440p is to then use PSSR to reconstruct it back up to 1800p. Get it?

They also say to match the framerate of performance mode, which runs internally at 1080p. Dropping the rez of fidelity mode from 1800p to 1440p will give them the GPU headroom to run at a higher framerate than what the og fidelity mode could manage. And then using PSSR, which is AI-accelerated on the Pro, allows them to reconstruct that up with the lowest possible frametime cost. It would at least cost significantly less than FSR.
I am not going to discuss your hypothetical scenario of 1800p to 2160p when I have a real-life scenario of 1800p from an actual PS5 Sony first-party game. Why discuss hypotheticals when we have real figures from Sony?

You guys did this earlier in the thread, when Sony themselves stated a 45% boost in GPU power, saying dual issue, console optimization, and Cerny secret sauce would have it act like a 6800 XT. Turns out, it was what I had predicted: they can't even take a 1080p game and push it to 1440p internal resolution, because the GPU power boost is only 45% and the pixel difference is 75%. Even if the pixel upgrade were 60%, like the max tflops suggest, they would still be below the 1440p target.
Ah well... I have tried. We will see how this plays out, then. How you can't see that what you are suggesting makes absolutely no sense is beyond me, though.
 
it must be special sauce for 1st party games. jedi survivor clearly suffered from an extreme cpu bottleneck with ray tracing. it went from being an unstable 30 fps game to a rock solid 60 fps with the removal of ray tracing. dragon's dogma 2 seems to be the same.

some 1st party developers probably use special solutions to reduce the cpu boundness of ray tracing, or maybe remove it altogether. but multiplatform games will adhere to different norms instead of special norms set by the playstation 5

the result is spiderman, where they needed enormous effort to port ray tracing to PC. actually, they had to redo the whole thing from the ground up, almost. regular 3rd party games and their ray tracing implementations most likely still rely on the CPU on PS5. in that case, you can't expect developers to create special builds for the playstation 5. some of these developers make games for multiple platforms. they cannot specifically tune their ray tracing implementation for PS5

also, my response was about the cpu boundness of ray tracing being dependent on effects. that person claims jedi survivor is more cpu bottlenecked than spiderman because it runs more advanced ray tracing, but the reality is that the relative CPU performance hit is similar no matter the effect or complexity
Maybe. But RT in some multiplatform games also seems quite efficient, like in Cyberpunk or Hogwarts Legacy (on PS5 at least, running 27% better than on XSX according to NXGamer). From memory, hardware RT seems to perform quite well in some PS5 ports using UE. It could depend on what API they use on PS5.
 

ChiefDada

Gold Member
Dude, you have absolutely no idea what you are talking about. Just go back to discussing graphics fidelity. The 11400 and 12400 are not that far apart, and those are two different benchmarks. You can't be comparing things like this.
Far apart enough that you should see a decent difference if really CPU limited, as shown below.

bwvbbnu.jpg
0lDWwun.jpg


Look at the screenshot I posted. You have two different CPUs in the same scene showing a 45 FPS delta. That is all the proof you need. Don't argue against facts. It is annoying. No one likes it. You are not a woman; let's act like men and accept basic facts. Please.

No, you don't. It's the same CPU. That's the mistake by DF that I'm referring to.

e0aumKp.jpg


ChiefDada, you should really consider getting a PC, even a relatively cheap gaming rig. You love graphics and tech, but consoles don't let you do anything. You can't peep inside files or settings, or run profiling tools to know what's going on. You can just look and try to guess, or maybe run an FPS counter. That kinda limits what you can do and how far you can take your knowledge.

What did I say that was wrong? I'm not unnecessarily bashing DF, but the method they used in the Spider-Man PC video slimy referenced makes it near impossible to determine the CPU vs GPU share of the bottleneck.
 

Gaiff

SBI’s Resident Gaslighter
What did I say that was wrong? I'm not unnecessarily bashing DF, but the method they used in the Spider-Man PC video slimy referenced makes it near impossible to determine the CPU vs GPU share of the bottleneck.
Oh, I'm not really following what you two are discussing. I'm just saying it'd be much easier and more flexible to gauge performance yourself and run tests if you had a PC. I don't see how you can be such an enthusiast yet have no desire to poke around in files, settings, debug menus, and whatnot. Hell, UE5 alone has free dev tools that let you see a bunch of stuff you'd never be able to see on consoles.
 

David B

An Idiot
I say, who cares. PS5 uses Sony's own custom UNIX-based OS and the Vulkan graphics API. I'm not a technical guy at all; all I know is that it's somewhat similar to DirectX. Vulkan is available on Windows, Linux, macOS, and UNIX derivatives. It's practically universal. Even some games on Xbox use Vulkan.
 

yamaci17

Member
Oh, I'm not really following what you two are discussing. I'm just saying it'd be much easier and more flexible to gauge performance yourself and run tests if you had a PC. I don't see how you can be such an enthusiast yet have no desire to poke around in files, settings, debug menus, and whatnot. Hell, UE5 alone has free dev tools that let you see a bunch of stuff you'd never be able to see on consoles.
what will it matter anyway? some people here just deny facts no matter what you prove to them. you can verifiably prove with PresentMon that a game is extremely GPU limited at a low resolution, and some people will still say you're CPU bottlenecked and deny the facts

I could personally get a CPU upgrade just to prove that a 3070 is still GPU bottlenecked at 1080p around 60 FPS in Starfield, but knowing how slimy some people are, and how they will find a new excuse and move the goalposts, I don't bother. If I had any belief they would just admit being wrong, I would do it, by the way. But I don't have any such belief at this point. if you don't trust PresentMon, chances are you'd deny water being wet anyway
 
Last edited:

yamaci17

Member
Hopefully for GTA VI too.
GTA VI will be extremely, extremely CPU bound. This console addresses none of the CPU concerns if it is only getting a mild bandwidth upgrade and no CPU architectural update.

either both the PS5 and the Pro will hit 60 FPS or neither will. there's no winning with Zen 2.

and even then, I'm sure they will barely, barely hit that 30 FPS target on Zen 2, which means you need 2x the CPU performance to hit a similarly reliable 60 FPS lock. that is impossible within console budgets at this point; even a 5800X3D cannot double the performance of a 3700X on desktop

GTA 6 will push the limits of console CPUs with its draw calls, NPC density, and draw distance.
 
Last edited:
GTA VI will be extremely, extremely CPU bound. This console addresses none of the CPU concerns if it is only getting a mild bandwidth upgrade and no CPU architectural update.

either both the PS5 and the Pro will hit 60 FPS or neither will. there's no winning with Zen 2.

and even then, I'm sure they will barely, barely hit that 30 FPS target on Zen 2, which means you need 2x the CPU performance to hit a similarly reliable 60 FPS lock. that is impossible within console budgets at this point; even a 5800X3D cannot double the performance of a 3700X on desktop

GTA 6 will push the limits of console CPUs with its draw calls, NPC density, and draw distance.

I love how the narrative has shifted to CPU-bound games somehow not being able to take advantage of everything else the PS5 Pro has on offer.

Talk about a completely disingenuous argument.
 
Last edited:

yamaci17

Member
I love how the narrative has shifted to CPU-bound games somehow not being able to take advantage of everything else the PS5 Pro has on offer.

Talk about a completely disingenuous argument.
yeah, I'm being disingenuous for wanting a better, more balanced product for console folks. enjoy your new glorified Xbox One X then, if you really want me to be disingenuous.

people get mad when someone pairs a midrange CPU with a high-end GPU. here, the PS5 Pro is pairing a super low-end CPU with a decent upper-midrange GPU. you just can't defend that. this mentality is the reason consoles keep being held to 30 FPS time and time again
 
Last edited:

jroc74

Phone reception is more important to me than human rights
The comments for that tweet are hilarious.

Dudes are getting all bent out of shape because Sony may have a decent competitor to FSR and DLSS.
I didn't realize how bad it was until I started looking at forums and social media.

Damn, some people are really feeling some type of way about Sony trying to improve tech for PlayStation.

It's not that serious.
 