
PS5 Pro devkits arrive at third-party studios, Sony expects Pro specs to leak

Gaiff

SBI’s Resident Gaslighter
The PS5 matches the 3080 in Rift Apart, though tbf it's the only game I've seen; usually the 3070 is the best case.

1. It doesn't. In Fidelity Mode, the 3080 averages 60fps and the PS5 48fps, that's a 25% performance advantage to the 3080. In Performance Mode, the 3080 is 17% faster.

2. NxGamer argues that these differences aren't due to the GPU but to the PS5's better IO which prevents the massive temporary drops of the 3080. It has much higher highs but also much lower lows. Furthermore, Nixxes also issued several patches to address the streaming issues.

3. The DRS in this game doesn't work the same on PC vs console. That's coming straight from Nixxes:

DRS on PS5 functions differently than it does on PC. DRS on PS5 works from a list of preconfigured resolutions to choose from, with limits on the top and bottom res, with coarse adjustments (aka 1440p down to 1296p). PC DRS is free-floating and fine-grained. If you turn on IGTI with DRS set to 60, it will max your GPU essentially at the highest and most fine-grained res possible.
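
To picture the difference Nixxes is describing, here's a minimal sketch of the two schemes (Python; the step list, thresholds, and load signal are all invented for illustration, not Insomniac's actual values):

```python
# Hypothetical illustration of quantized console DRS vs free-floating PC DRS.
PS5_STEPS = [2160, 1800, 1620, 1440, 1296]  # coarse preset heights, capped top and bottom

def drs_console(gpu_load: float, current: int) -> int:
    """Quantized: jump between a fixed list of preconfigured heights."""
    i = PS5_STEPS.index(current)
    if gpu_load > 0.95 and i < len(PS5_STEPS) - 1:
        return PS5_STEPS[i + 1]  # step down one coarse notch
    if gpu_load < 0.80 and i > 0:
        return PS5_STEPS[i - 1]  # step back up
    return current

def drs_pc(gpu_load: float, current: int, lo: int = 1296, hi: int = 2160) -> int:
    """Free-floating: nudge the height continuously toward ~98% GPU usage."""
    target = current * (0.98 / max(gpu_load, 1e-6)) ** 0.5  # pixel cost ~ height^2
    return int(min(hi, max(lo, target)))
```

The console variant can only land on a handful of preset heights, while the PC variant converges on whatever fractional resolution saturates the GPU, which is why the PC image can settle above any of the console's steps.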

And we can actually see that in NxGamer's own footage. Not only is the image obviously blurrier on PS5, but the plants are a dead giveaway that PC operates at a higher resolution.

[image: M8JlGT4.png]

I also never said the PS5 performed like a 4080. Literally, where did I say that?
Here?
Based on how The Last of Us, for example, runs worse on a 4080 than a PS5, is it really that unrealistic for the Pro to outperform the 4080 there? Same with Ratchet in pure raster.

I said there are several games, like The Last of Us, Rift Apart, and some Ubisoft games, where it already clearly outperforms the 4060 and matches a 4070, so why would the Pro matching or, dare I say, edging out the 4080 be some crazy idea? This isn't for every game, for reference.
The 3080 is something like 5% faster than the 4070. There isn't a single game where the PS5 performs like the 4070 and you haven't shown one. The best it can do is be in the league of a 2080 Ti/3070. 3080/4070 are way out of reach.

I think the relative GPU power of the PS5 is even less than a 2080 Ti, as it gives very similar native 4K performance to 2070S benchmarks when you look around the web. The 2080 Ti benchmarks mostly had it over 60fps at 4K, at least in 2020. Since we know the PS5 can do at least 40fps (mode) at 4K but can't do 60 at this resolution, it probably hovers in the 45fps range tops. If the PS5 Pro is a 2x increase, it will be comparable to a 4070 Super (tbd), I suppose.

I'm talking about PS5 exclusives that run particularly well on PS5. Generally, the 2080 Ti is quite a bit faster than the PS5. And yeah, agreed with the 4070/4070S-ish tier of performance.

Base PS5 has no DLSS equivalent.
It has FSR, which usually performs within 5% of DLSS; their framerates are typically near-identical. Same for XeSS on Intel hardware. No upscaling solution is much more performant than the others if they're using the same internal resolution.
With PS5 Pro, if you were to get it to produce the same resolution and framerate as PS5 Pro, how much LESS could you get away with in terms of TFLop utilization?
Without knowing what Sony has in store, it's impossible to tell. Based on how DLSS, FSR, and XeSS perform though, I would expect the same performance as those 3 given the same resolution but with IQ closer to DLSS and XeSS than FSR. Better image stability, better resolve of fine details, more temporal stability, etc.
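
As a rough model of why the upscalers land so close together in framerate (assumed numbers, not measurements; real scaling is flatter because some frame cost is resolution-independent): the internal resolution dominates the cost, and the upscaling pass itself adds a small, similar overhead whether it's DLSS, FSR, or XeSS.

```python
def est_fps(native_fps_4k: float, internal_height: int, upscale_ms: float = 1.0) -> float:
    """Crude model: shading cost scales with pixel count, plus a fixed upscaler cost."""
    native_ms = 1000.0 / native_fps_4k
    scale = (internal_height / 2160) ** 2  # fraction of native 4K pixels shaded
    return 1000.0 / (native_ms * scale + upscale_ms)

# Invented baseline: a GPU managing 40 fps at native 4K.
for mode, h in [("Quality (1440p)", 1440), ("Performance (1080p)", 1080)]:
    print(f"{mode}: ~{est_fps(40, h):.0f} fps")  # same output whichever upscaler runs
```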
 

ChiefDada

Gold Member
Uncharted 4 on PS5: locked 1440p with a mix of High/Ultra presets. Per DF, "Average framerate is 100fps."

[image: nMk7Q52.jpg]

Here is the 4070 running 1440p Ultra:
Overall, I would consider this roughly 4070 level. And again, I'm only choosing these ND titles because the native resolution is locked and they have uncapped modes, which is rare and allows a balanced comparison.
 

Gaiff

SBI’s Resident Gaslighter
Uncharted 4 on PS5: locked 1440p with a mix of High/Ultra presets. Per DF, "Average framerate is 100fps."

[image: nMk7Q52.jpg]

Here is the 4070 running 1440p Ultra:
He hasn't even run a benchmark with final results. He just looked at the meter and said that it's "around 100fps" in general. Could it be 92? Could it be 105? He also says that it can dip to the low 70s or even 60s in brief moments, and the settings aren't even the same.
Overall, I would consider this roughly 4070 level. And again, I'm only choosing these ND titles because the native resolution is locked and they have uncapped modes, which is rare and allows a balanced comparison.
And you'd be wrong. Your favorite youtuber has actually run the tests:



"The RX 6800 at 4K Ultra can outperform the PS5 by approximately 24%."

If the regular 6800 beats the PS5 by up to 24%, then it doesn't stand a chance against the RTX 4070. The PS5 in Uncharted 4 performs at around the same level as a 3070/2080 Ti... kinda like TLOU Part I. It'd actually be a bit worse, but meh. Still the same ballpark.

Once again, the 4070 and 3080 are generally around 50-80% faster than the PS5 in rasterization. Do you seriously think console optimization can cover such an enormous power gap? Because that's what you two are trying to argue.
 
Last edited:

James Sawyer Ford

Gold Member
Without knowing what Sony has in store, it's impossible to tell. Based on how DLSS, FSR, and XeSS perform though, I would expect the same performance as those 3 given the same resolution but with IQ closer to DLSS and XeSS than FSR. Better image stability, better resolve of fine details, more temporal stability, etc.

Yes, but doesn't FSR give far worse results than DLSS? So it's not quite equivalent.
 

Gaiff

SBI’s Resident Gaslighter
Yes, but doesn't FSR give far worse results than DLSS? So it's not quite equivalent.
At 4K Quality (base 1440p resolution), FSR isn't far worse. It's worse but still pretty good. At 1440p Quality (1080p base resolution), then yeah, the gap in quality is quite large but the performance remains the same. And anything below that just looks like shit with FSR.

PS5 Pro will be able to operate at much higher resolutions though so even if the solution isn't as good as DLSS, the mere fact that it will operate at say, 1440p instead of 900p, would make the reconstructed image much better compared to the base PS5.
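
For a sense of scale, the jump from a 900p to a 1440p internal resolution is easy to quantify: the reconstruction pass starts from over two and a half times as many real samples per output frame.

```python
# Pixel counts at the two internal resolutions mentioned above (16:9).
px_900p = 1600 * 900     # 1,440,000 pixels
px_1440p = 2560 * 1440   # 3,686,400 pixels
print(f"{px_1440p / px_900p:.2f}x")  # 2.56x more input data for the upscaler
```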
 

ChiefDada

Gold Member
He hasn't even run a benchmark with final results. He just looked at the meter and said that it's "around 100fps" in general. Could it be 92? Could it be 105? He also says that it can dip to the low 70s or even 60s in brief moments, and the settings aren't even the same.

And you'd be wrong. Your favorite youtuber has actually run the tests:



"The RX 6800 at 4K Ultra can outperform the PS5 by approximately 24%."

If the regular 6800 beats the PS5 by up to 24%, then it doesn't stand a chance against the RTX 4070. The PS5 in Uncharted 4 performs at around the same level as a 3070/2080 Ti... kinda like TLOU Part I. It'd actually be a bit worse, but meh. Still the same ballpark.

Once again, the 4070 and 3080 are generally around 50-80% faster than the PS5 in rasterization. Do you seriously think console optimization can cover such an enormous power gap? Because that's what you two are trying to argue.


Lol, very sneaky of you: at 4K Ultra we're introducing memory constraints. I thought you were more honest than that. It's true these games favor AMD cards.
 

Gaiff

SBI’s Resident Gaslighter
Lol, very sneaky of you: at 4K Ultra we're introducing memory constraints. It's true these games favor AMD cards.
NxGamer mentions CPU bottlenecks at 1440p. And what do you mean memory constraints? Bandwidth?

And the PS5 still easily gets beaten at 1440p by the 6800 anyway despite the CPU bottleneck. The 4070 is faster than the 6800. Yes, those games favor AMD cards by around 10%, which would still make the 4070 over 20% faster than the PS5 since it's itself 11% faster than the 6800 in general.
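
Spelling that arithmetic out, using only the 24%, 11%, and 10% figures quoted in this exchange (ratios chain by multiplication, not addition):

```python
r_6800_vs_ps5 = 1.24    # NxGamer: 6800 ~24% ahead of the PS5 in Uncharted 4
r_4070_vs_6800 = 1.11   # 4070 vs 6800 across a broad suite
amd_favor = 1.10        # these ports favor AMD by ~10%

r_4070_here = r_4070_vs_6800 / amd_favor     # 4070's edge shrinks to ~1% in this title
r_4070_vs_ps5 = r_6800_vs_ps5 * r_4070_here
print(f"4070 vs PS5: ~{(r_4070_vs_ps5 - 1) * 100:.0f}% faster")  # ~25%
```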

I thought you were more honest than that
You know I always try my best to be cordial with you but you always turn into a shithead. You're the one who took benchmarks from a busted game's early phase and tried to prove a point. When I debunked your flawed methodology, you started complaining I was accusing you of cherry-picking. Then you took Oliver's vague comments that weren't even a benchmark, picked a 4070 run from a completely different section, at higher settings, without a framerate analysis, constrained by a massive CPU bottleneck, and went, "Looks like they perform the same to me."

I went and found an NxGamer video where he compares the footage directly and gives hard numbers and you have the nerve to call out my honesty? That's rich coming from the dude who's been having his claims thoroughly debunked for several posts now.

The PS5 never matches a 4070 in any game. You were wrong. Just admit it.
 
Last edited:

ChiefDada

Gold Member
Yes, those games favor AMD cards by around 10%, which would still make the 4070 over 20% faster than the PS5 since it's itself 11% faster than the 6800 in general.


[gif: Kandi Burruss Bravo]


Edit: You're not exactly lying, but you're being VERY disingenuous, which is just as bad. Regardless, I like the gif.


The 4070 and 6800 are on par with each other in terms of raster performance. Quoting an average across vendor GPUs overinflates the value/importance of outliers.

 
Last edited:

Gaiff

SBI’s Resident Gaslighter
[gif: Kandi Burruss Bravo]


Edit: You're not exactly lying, but you're being VERY disingenuous, which is just as bad. Regardless, I like the gif.


The 4070 and 6800 are on par with each other in terms of raster performance. Quoting an average across vendor GPUs overinflates the value/importance of outliers.


No, they aren't.

[image: yLyW6JU.png]

[image: 50I62AT.png]

[image: NWelPnE.png]


That's the TechPowerUp database, which gathers hundreds of results. The 4070 is 10-15% faster than the 6800 in rasterization alone.

Even in the video you posted, whenever the cards aren't CPU or engine-limited (Elden Ring needs to be manually unlocked and scales VERY poorly above 60fps), the 4070 comes out ahead:

Hogwarts Legacy: 4070 is 7% faster
Jedi Survivor: 4070 is 16% faster
Resident Evil 4: 4070 is 5% faster
TLOU: 4070 is 5% faster
GOW: 4070 is 5% faster
Shadow of the Tomb Raider (CPU-limited): 4070 is 9% faster
COD: 6800 is 2% faster (surprising because in this game, AMD trounces NVIDIA by over 20%)
Atomic Heart: 4070 is 20% faster
The Witcher 3: 4070 is 13% faster
RDR2: 4070 is 10% faster
Elden Ring (Engine-limited): 4070 is 5% faster

You get the idea.
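
If you want one number out of that list, average the ratios with a geometric mean (averaging raw percentages overweights the big wins); a quick sketch using the figures above:

```python
from math import prod

# 4070-vs-6800 ratios from the list above; COD's 6800 win enters as 1/1.02.
ratios = [1.07, 1.16, 1.05, 1.05, 1.05, 1.09, 1 / 1.02, 1.20, 1.13, 1.10, 1.05]
geomean = prod(ratios) ** (1 / len(ratios))
print(f"4070 ~{(geomean - 1) * 100:.0f}% faster on average")  # ~8%
```

That ~8% still includes the CPU- and engine-limited entries, which is exactly why the GPU-limited margin lands higher, in the 10-15% range.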

[image: frcQT8c.png]


The 4070 is only marginally slower than the 3080. And you'd still be wrong even if they (6800 and 4070) were equal because the 6800 crushes the PS5 by 24% in Uncharted 4, so unless you think the 4070 would be that much slower in that game (it isn't), then it'd still be ahead of the PS5 by 20% or more. Hell, in TLOU which uses the same engine, the 4070 is still a tad faster.

And we also got Hardware Unboxed:

[image: 1440p-p.webp]

4070 is 14% faster in rasterization. That's more comprehensive and they're also a far more reputable outlet than your random video. The 4070 and 6800 aren't equal. It's like saying the 6800 and 6800 XT are equal. It's very common for the 4070 to outperform the 6800 by over 10% and not that rare to see margins over 15% in rasterization.

You're out of your depth. Stick to console talk because you clearly are completely ignorant about PCs and have been getting exposed and embarrassing yourself for several pages. Simply admit you underestimated the 4070, but you keep doubling down and getting shit wrong.
 
Last edited:

SABRE220

Member
Yes, I know what raster is. I fully expect it to outperform the 4080 there, at least in exclusive titles, solely because of the PC vs console environment.
You do know that the 4080 trades blows with, or is within inches of, the 7900 XTX in rasterization, right? The 4070 is a large downgrade compared to past gens. The Pro will have a significant advantage over RDNA 3 in RT performance, but in rasterization there's no way it's approaching the 7900 XTX.
 

Go_Ly_Dow

Member
So how likely is it that the PS5 Pro runs the FF7 Remake trilogy at 4K 60fps, or FF16?
FF7R part 1 on base PS5 is great and runs at 2160p 30fps or 1512p 60fps, so probably doable.
FF16 runs at around 1440p 30fps, or 1080p 60fps in battles only. So native 4K 60fps is a stretch here, but the Pro would help a lot to boost both of these.
FF7R part 2 we don't know how it will perform, but it will have both 30/60 modes on base PS5. My guess is a drop in resolution compared to part 1, as the environments are much bigger and more complex.
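
As a sanity check on those guesses, here's the naive pixel-count math; real scaling is kinder than this, since plenty of frame cost doesn't scale with resolution:

```python
def extra_gpu_needed(height: int, fps: int, target_h: int = 2160, target_fps: int = 60) -> float:
    """Throughput multiplier if frame cost scaled purely with pixels x framerate."""
    return (target_h / height) ** 2 * (target_fps / fps)

print(f"FF7R part 1, 1512p60 -> native 4K60: {extra_gpu_needed(1512, 60):.1f}x")  # ~2.0x
print(f"FF16, 1080p60 -> native 4K60: {extra_gpu_needed(1080, 60):.1f}x")         # 4.0x
```

A roughly 2x Pro covers FF7R part 1's gap on paper, while FF16's 4x gap is why native 4K 60fps looks like a stretch.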
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
FF7R part 1 on base PS5 is great and runs at 2160p 30fps or 1512p 60fps, so probably doable.
FF16 runs at around 1440p 30fps, or 1080p 60fps in battles only. So native 4K 60fps is a stretch here, but the Pro would help a lot to boost both of these.
FF7R part 2 we don't know how it will perform, but it will have both 30/60 modes on base PS5. My guess is a drop in resolution compared to part 1, as the environments are much bigger and more complex.
Also notice that the performance modes now cut effects: missing RT reflections on a whole swath of objects and/or simplified reflections in general when supported, reduced LOD in the distance, etc., not just resolution, so the jump is a tad bigger.
 

Dorfdad

Gold Member
1440p 60 is guaranteed, 4K 60 is a maybe.
$100 says they announce this as capable of 8K.
It's a selling point.

Having said that, I think they should've just marketed this entire generation as 1440p Ultra, but I get why they don't.

Really interested in their upscaling solution and whether it's the true secret sauce or just a test bed for next gen.

I also wonder why they're going with their own in-house solution instead of using AMD's, which is probably far more advanced currently than Sony's. It also concerns me that you would need special tools to adapt games to support their solution, which might hamper some developers.

I just wish they had a hardware upscaler built in, and not just a software solution.
 
Last edited:

Clear

CliffyB's Cock Holster
MS' main issue with moving to a new hardware tier is that they really cannot afford to market GamePass as being inferior to console output. It's just not possible.

Bottom line is that they'd have to upgrade their entire back-end first, and that's an expensive and time-consuming project in itself.
 

skit_data

Member
Bottom line is that they'd have to upgrade their entire back-end first, and that's an expensive and time-consuming project in itself.
Thought about this too. Afaik one purpose of the Series X design is to be used for xCloud. Seems like a huge waste to scrap all that already; it will be stuck in beta for yet another long period of time. There is simply no focus, which results in one strategy actively working against the other. Cloud streaming is better on PS5, for Pete's sake!
 

StereoVsn

Member
It can't be packing Zen 5 and RDNA 4. It would be too expensive if they plan to release it in 2025.
If they really want a big jump, it's not going to be possible even at $599.
If they release it later, it will be outdated and outperformed by the PS6.
MS might be willing to subsidize it by $200-$300 per unit, PS3-style. Then it could happen. I don't think it's likely, but it's a possibility.
 

buenoblue

Member
Uncharted 4 on PS5: locked 1440p with a mix of High/Ultra presets. Per DF, "Average framerate is 100fps."

[image: nMk7Q52.jpg]

Here is the 4070 running 1440p Ultra:
Overall, I would consider this roughly 4070 level. And again, I'm only choosing these ND titles because the native resolution is locked and they have uncapped modes, which is rare and allows a balanced comparison.

I don't know why I'm even bothering, as this guy is obviously trolling, but suggesting a 10 TF RDNA 2 AMD GPU can perform the same as a 30 TF 4070 Nvidia GPU is just plain wrong.

If you had played and seen games on a PS5 and a 4070, this would be blatantly obvious. There is no secret sauce; the tech is what it is. Yeah, the PS5 has extra chips and boards, but this mainly helps the CPU, which is typically clocked higher in a PC anyway.

Don't get me wrong, I love the PS5 and play probably 80% of my games on it, but let's not get it twisted: it ain't no 4070 and the Pro ain't gonna be no 4080 lol.
 

Gaiff

SBI’s Resident Gaslighter
I don't know why I'm even bothering, as this guy is obviously trolling, but suggesting a 10 TF RDNA 2 AMD GPU can perform the same as a 30 TF 4070 Nvidia GPU is just plain wrong.

If you had played and seen games on a PS5 and a 4070, this would be blatantly obvious. There is no secret sauce; the tech is what it is. Yeah, the PS5 has extra chips and boards, but this mainly helps the CPU, which is typically clocked higher in a PC anyway.

Don't get me wrong, I love the PS5 and play probably 80% of my games on it, but let's not get it twisted: it ain't no 4070 and the Pro ain't gonna be no 4080 lol.
I just spent the last few pages debunking his shit. The 4070 is hilariously stronger than the PS5's GPU from a hardware perspective, so I don't even know why he entertained this idea. The PS5 generally performs in line with a 2070S to 2080. It gets wiped by a 3070 in Alan Wake II, and in general really. Comparing it to a 4070 doesn't even make sense. The guy took the worst-performing game, used a benchmark with a huge CPU bottleneck at a resolution that favors weaker cards, and still came up massively short.

ChiefDada: Classic laughing emoji when you got no rebuttal. You always end up looking like a clown whenever you get into PC discussions. As I said, stick to consoles. You don't even follow the hardware market on PC, which shows with your utterly ignorant takes.

PS5 already competes with 4070 in many games

You meant 2 games and even in these two games, it doesn't.
 
Last edited:

rofif

Banned
1. It doesn't. In Fidelity Mode, the 3080 averages 60fps and the PS5 48fps, that's a 25% performance advantage to the 3080. In Performance Mode, the 3080 is 17% faster.

2. NxGamer argues that these differences aren't due to the GPU but to the PS5's better IO which prevents the massive temporary drops of the 3080. It has much higher highs but also much lower lows. Furthermore, Nixxes also issued several patches to address the streaming issues.

3. The DRS in this game doesn't work the same on PC vs console. That's coming straight from Nixxes:

DRS on PS5 functions differently than it does on PC. DRS on PS5 works from a list of preconfigured resolutions to choose from, with limits on the top and bottom res, with coarse adjustments (aka 1440p down to 1296p). PC DRS is free-floating and fine-grained. If you turn on IGTI with DRS set to 60, it will max your GPU essentially at the highest and most fine-grained res possible.

And we can actually see that in NxGamer's own footage. Not only is the image obviously blurrier on PS5, but the plants are a dead giveaway that PC operates at a higher resolution.

[image: M8JlGT4.png]


Here?



The 3080 is something like 5% faster than the 4070. There isn't a single game where the PS5 performs like the 4070 and you haven't shown one. The best it can do is be in the league of a 2080 Ti/3070. 3080/4070 are way out of reach.



I'm talking about PS5 exclusives that run particularly well on PS5. Generally, the 2080 Ti is quite a bit faster than the PS5. And yeah, agreed with the 4070/4070S-ish tier of performance.


It has FSR, which usually performs within 5% of DLSS; their framerates are typically near-identical. Same for XeSS on Intel hardware. No upscaling solution is much more performant than the others if they're using the same internal resolution.

Without knowing what Sony has in store, it's impossible to tell. Based on how DLSS, FSR, and XeSS perform though, I would expect the same performance as those 3 given the same resolution but with IQ closer to DLSS and XeSS than FSR. Better image stability, better resolve of fine details, more temporal stability, etc.
A 25% difference between the 3080 and PS5 is a really good result for the PS5.
Normally the 3080 gets up to double the framerate of the PS5.
 

welshrat

Member
I don't know why I'm even bothering, as this guy is obviously trolling, but suggesting a 10 TF RDNA 2 AMD GPU can perform the same as a 30 TF 4070 Nvidia GPU is just plain wrong.

If you had played and seen games on a PS5 and a 4070, this would be blatantly obvious. There is no secret sauce; the tech is what it is. Yeah, the PS5 has extra chips and boards, but this mainly helps the CPU, which is typically clocked higher in a PC anyway.

Don't get me wrong, I love the PS5 and play probably 80% of my games on it, but let's not get it twisted: it ain't no 4070 and the Pro ain't gonna be no 4080 lol.
Yeah, it's nonsense. I own a PS5 and also a PC with a 6800, and as much as I love the PS5, the thing does not come close in games I have double-dipped on: Cyberpunk, BF2042, etc. Looking forward to the PS5 Pro, but to pretend the PS5 is anything more powerful than a 6700 is rubbish.
 

Gaiff

SBI’s Resident Gaslighter
A 25% difference between the 3080 and PS5 is a really good result for the PS5.
Normally the 3080 gets up to double the framerate of the PS5.
Oh, it is a fantastic result, relatively speaking. But again, the GPU isn't even the bottleneck here. And a 25% gap isn't them being equal, which is what is being argued.
 
It doesn't. For one, you're not even doing a side-by-side so it's utterly pointless. The PS5 could be running at 61fps in this area for all we know. For two, this video is from 7 months ago when the game was still severely unoptimized on PC. Notice something in this screengrab I made?

[image: h3u2QLk.png]


Look at the GPU usage, a paltry 83% while the CPU, a 5600X, is at 90%. The game is CPU-limited and had severe threading issues that weren't fixed until months after.

[image: Yn6HA8w.png]


A whopping 35% improvement in a CPU-bound scene. The usage went from 96% all the way down to 60% on a 5600X. The best data point we got where shots are lined up is the DF video where Alex compared the PS5 to a 2070S running High settings.

[image: iqSKhW5.png]


Here, the PS5 is outperforming the 2070S by 35% once it settles down. During the explosion, there is a massive frametime spike on PC that causes the performance to tank, and at that moment the difference is 47% in favor of the PS5, but it only happens briefly. This would put the PS5's performance in line with a 2080 Ti/3070, nowhere near a 3080, let alone a 4080. Even using the 47% figure, you'd end up on the level of an RX 6800, again, short of a 3080.

So no, there aren't many games where the PS5 matches a 3080, in fact, there isn't a single one, which is expected since the 3080 is something like 60-80% faster in rasterization.

And I say 3080 because it and the 4070 are very close, similar to the 3070 and 2080 Ti.

No, the 4080 isn't on par with the PS5 on TLOU. This is moronic to even suggest.

Last thing, I wouldn't even use TLOU as an example because even among Sony ports, it's an outlier for how abysmal it was (and it's still lacking, even today). None of their other games exhibit anything close to that.

Also, the 4060 at times barely performs better than the 3060, which is a tad slower than the regular PS5 in general, so of course the PS5 Pro will mop the floor with a 4060. I'm expecting it to land somewhere around a 7800 XT in rasterization. The jury is still out on RT because I haven't got the faintest idea how these improvements will translate to the real world.
3070 performance is also about where the PS5 lands in Death Stranding.
1. It doesn't. In Fidelity Mode, the 3080 averages 60fps and the PS5 48fps, that's a 25% performance advantage to the 3080. In Performance Mode, the 3080 is 17% faster.

2. NxGamer argues that these differences aren't due to the GPU but to the PS5's better IO which prevents the massive temporary drops of the 3080. It has much higher highs but also much lower lows. Furthermore, Nixxes also issued several patches to address the streaming issues.

3. The DRS in this game doesn't work the same on PC vs console. That's coming straight from Nixxes:

DRS on PS5 functions differently than it does on PC. DRS on PS5 works from a list of preconfigured resolutions to choose from, with limits on the top and bottom res, with coarse adjustments (aka 1440p down to 1296p). PC DRS is free-floating and fine-grained. If you turn on IGTI with DRS set to 60, it will max your GPU essentially at the highest and most fine-grained res possible.

And we can actually see that in NxGamer's own footage. Not only is the image obviously blurrier on PS5, but the plants are a dead giveaway that PC operates at a higher resolution.

[image: M8JlGT4.png]


Here?



The 3080 is something like 5% faster than the 4070. There isn't a single game where the PS5 performs like the 4070 and you haven't shown one. The best it can do is be in the league of a 2080 Ti/3070. 3080/4070 are way out of reach.



I'm talking about PS5 exclusives that run particularly well on PS5. Generally, the 2080 Ti is quite a bit faster than the PS5. And yeah, agreed with the 4070/4070S-ish tier of performance.


It has FSR, which usually performs within 5% of DLSS; their framerates are typically near-identical. Same for XeSS on Intel hardware. No upscaling solution is much more performant than the others if they're using the same internal resolution.

Without knowing what Sony has in store, it's impossible to tell. Based on how DLSS, FSR, and XeSS perform though, I would expect the same performance as those 3 given the same resolution but with IQ closer to DLSS and XeSS than FSR. Better image stability, better resolve of fine details, more temporal stability, etc.
What about the frame-pacing / min framerate issues on PC (caused by I/O stutters)? I see here a 45ms frame (and then a sub-30fps drop) on the 3080... Jeez.
 

Gaiff

SBI’s Resident Gaslighter
3070 performance is also about where the PS5 lands in Death Stranding.
No, it's around a 2080S.
What about the frame-pacing / min framerate issues on PC (caused by I/O stutters)? I see here a 45ms frame (and then a sub-30fps drop) on the 3080... Jeez.
"Caused by I/O stutters". Didn't you just answer your own question? And there's been a bunch of updates that smooth out the streaming issues which aren't a GPU bottleneck.
 
Last edited:

ChiefDada

Gold Member
ChiefDada: Classic laughing emoji when you got no rebuttal. You always end up looking like a clown whenever you get into PC discussions. As I said, stick to consoles. You don't even follow the hardware market on PC, which shows with your utterly ignorant takes.

Why derail the thread when it's clear there's no end in sight? I have bookmarked your original prediction. Instead of name-calling, why don't you quantify your prediction below:

I'd be really surprised if the PS5 Pro matches the 4080 in any game. 4070? Sure. 4070 Ti? Perhaps in select cases. 4080 is very likely out of reach unless we get a big surprise.

How should we measure "perhaps in select cases" when we judge the PS5 Pro vs the 4070 Ti? If you're so much smarter than me on this and truly believe what you're saying, why not claim here and now that the 4070 and 4070 Ti will beat the PS5 Pro in the majority of raster benchmarks upon release?
 

Gaiff

SBI’s Resident Gaslighter
Why derail the thread when it's clear there's no end in sight? I have bookmarked your original prediction. Instead of name-calling, why don't you quantify your prediction below:
There's a clear end and has been for a long time now. You're just being stubborn. Your claims have been completely debunked and you haven't shown a single game where the PS5 matches a 4070. Why is it so hard to admit to being wrong on that front?
How should we measure "perhaps in select cases" when we judge the PS5 Pro vs the 4070 Ti? If you're so much smarter than me on this and truly believe what you're saying, why not claim here and now that the 4070 and 4070 Ti will beat the PS5 Pro in the majority of raster benchmarks upon release?
I've already given my prediction: 4070-tier to 4070S level of performance. The 4070 won't necessarily beat it "in the vast majority of raster benchmarks"; it should be fairly competitive. That's assuming those specs are anywhere close to reality to begin with. I was honestly thinking Sony would shoot for at least twice the raster performance, but I might have been too optimistic.
 
Last edited:

Baki

Member
I don't know why I'm even bothering as this guy obviously is trolling but suggesting a 10tf RDNA2 AMD GPU can perform the same as a 30tf 4070 Nvidia GPU is just plain wrong.

If you had played and seen games on a pS5 and a 4070 this would be blatantly obvious. There is no secret sauce, the tech is what it is. Yeah PS5 has extra chips and boards but this mainly helps CPU, which is typically clocked higher in a PC anyway.

Dont get me wrong I love ps5 and play probably 80% of my gaming on it but let's not get it twisted it ain't no 4070 and the pro ain't gonna be no 4080 lol.

Raw numbers don't tell the story, as the PS5 is a gaming-focused device with an SoC design, lower OS overhead, and lower-level access to the system. That allows the PS5 to punch way above its weight. The problem with PC gaming is that devs have to target hundreds of different specs and configurations, plus significant OS overhead, all of which leaves top-of-the-line graphics cards to brute-force their results. Now, does the 4070, combined with a powerful CPU and SSD, outperform a PS5? Yes. Does it outperform the PS5 by 3x? Nowhere near. It's a real shame we don't see Nvidia tech in consoles, as their RT and upscaling tech is a generation ahead. I can only imagine how those beasts could perform if devs could focus on those specs and get the lower-level access that a console provides.

MS' main issue with moving to a new hardware tier is that they really cannot afford to market GamePass as being inferior to console output. Its just not possible.

Bottom line is that they'd have to upgrade their entire back-end first, and that's an expensive and time-consuming project in itself.

People who stream games are likely more casual gamers. They won't care about IQ being below the current top-spec console.
 
Last edited:

ChiefDada

Gold Member
There's a clear end and has been for a long time now. You're just being stubborn. Your claims have been completely debunked and you haven't shown a single game where the PS5 matches a 4070. Why is it so hard to admit to being wrong on that front?

I've shown you two games with the closest matching presets and resolutions of any games I can think of, and you disagreed with my inferences, so there's no point. No problem.

I've already given my prediction. 4070-tier to 4070S level of performance.

OK, so you're on record saying the PS5 Pro won't match or exceed 4070 Ti performance, based on your prediction?
 

RoadHazard

Gold Member
$100 says they announce this as capable of 8K.
It's a selling point.

Having said that, I think they should've just marketed this entire generation as 1440p Ultra, but I get why they don't.

Really interested in their upscaling solution and whether it's the true secret sauce or just a test bed for next gen.

I also wonder why they're going with their own in-house solution instead of using AMD's, which is probably far more advanced currently than Sony's. It also concerns me that you would need special tools to adapt games to support their solution, which might hamper some developers.

I just wish they had a hardware upscaler built in, and not just a software solution.

They already market the standard PS5 as being capable of 8K, it's right on the box. They just haven't enabled it for anything AFAIK (and the only thing it reasonably could do at 8K is video streaming).
 

Gaiff

SBI’s Resident Gaslighter
OK, so you're on record saying the PS5 Pro won't match or exceed 4070 Ti performance, based on your prediction?
Assuming those specs are remotely true, but between you and me, what are the odds that they are? Who has the final spec sheet a year before the thing is even out? If Sony comes out and says AMD has found a way to cram in 96 CUs clocked at 5.5GHz with HBM3 and 1.5TB/s of bandwidth at 120W, then obviously my prediction will be completely wrong.

PS: Those specs are 99% likely to be bullshit anyway.
 
Last edited:

Celcius

°Temp. member
They already market the standard PS5 as being capable of 8K, it's right on the box. They just haven't enabled it for anything AFAIK (and the only thing it reasonably could do at 8K is video streaming).
There's a PS5 game called The Touryst that runs at native 8K, but they downsample to 4K because the PS5 doesn't support 8K output yet. Plus, older game remasters like an 8K Tales of Symphonia would be possible.
 
Last edited:

ChiefDada

Gold Member
Assuming those specs are remotely true, but between you and me, what are the odds that they are? Who has the final spec sheet a year before the thing is even out? If Sony comes out and says AMD has found a way to cram in 96 CUs clocked at 5.5GHz with HBM3 and 1.5TB/s of bandwidth at 120W, then obviously my prediction will be completely wrong.

PS: Those specs are 99% likely to be bullshit anyway.

Kepler already confirmed 60 active CUs this past week, same as the 7800 XT. Assuming even just a 10% upclock in frequency from the base PS5, you are already looking at a GPU that beats the 7800 XT on paper, before considering the architectural efficiency improvements/corrections of the RDNA 3 refresh and console optimization (more opportunities for dual-issue compute, possibly). Secondly, you should know the RDNA 3 cache subsystem regressed vs RDNA 2 with a gimped Infinity Cache.

The idea that a PS5 Pro couldn't easily surpass a 7800 XT, let alone match a 4070 Ti, is nonsense from those who assume technology is stagnant.
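
For what it's worth, the paper math behind that claim, taking the rumored 60 CUs and a hypothetical 10% clock bump at face value (both unconfirmed, and using single-issue FP32 counting rather than RDNA 3's doubled dual-issue figure):

```python
def tflops(cus: int, clock_ghz: float) -> float:
    """FP32 TFLOPS = CUs x 64 lanes x 2 ops per clock x clock speed."""
    return cus * 64 * 2 * clock_ghz / 1000

print(f"PS5:       {tflops(36, 2.23):.1f} TF")        # ~10.3 TF (known spec)
print(f"Pro rumor: {tflops(60, 2.23 * 1.10):.1f} TF") # ~18.8 TF (60 CUs, +10% clock)
print(f"7800 XT:   {tflops(60, 2.43):.1f} TF")        # ~18.7 TF at boost clock
```

On that counting, the rumored part lands right at 7800 XT throughput before any architectural changes; whether that fits a console power budget is the separate question.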
 

buenoblue

Member
This shit reminds me of a convo I had with a guy at work when the Steam Deck came out. He was insistent that the Steam Deck was super powerful and would outperform a PS5 and a proper gaming PC. He told me he heard it was the most powerful gaming device. 🤣

What he actually heard was that it was super powerful for its form factor and price. Specs are specs.
 

Gaiff

SBI’s Resident Gaslighter
Kepler already confirmed 60 active CUs this past week, same as the 7800 XT. Assuming even just a 10% upclock in frequency from the base PS5, you are already looking at a GPU that beats the 7800 XT on paper, before considering the architectural efficiency improvements/corrections of the RDNA 3 refresh and console optimization (more opportunities for dual-issue compute, possibly). Secondly, you should know the RDNA 3 cache subsystem regressed vs RDNA 2 with a gimped Infinity Cache.
Cute, but unless Kepler is Mark Cerny's Twitter handle, why should I believe anything he says? No one has the final spec sheet of the PS5 Pro, so I'm not sure what you're trying to say here. And a 10% uptick in clock frequency on top of its already huge TDP? Sure.

The L3 cache saw a small decrease but the link between the L2 and L3 cache was widened considerably, resulting in a much higher throughput for N31. Hardly gimped.
The idea that a PS5 Pro couldn't easily surpass a 7800 XT, let alone match a 4070 Ti, is nonsense from those who assume technology is stagnant.
The 4070 Ti is faster than the 7800 XT. The 7800 XT is barely an improvement over its predecessor and oftentimes comes within 5% of it, depending on the workload. In other cases, it can be a tad faster, like 10-15%, but you won't see it improve by 30%. It also has a massive TDP of 285W (the biggest limiting factor). The 7800 XT is just a poorly named 7700 XT.

The PS5 Pro's GPU needs roughly half the thermal output of the 7800 XT to fit in a console environment. It needs to be on a smaller node with enormous power-efficiency improvements, and unlike high-end AD102-104, its power draw hasn't been cranked far past its efficiency optimum, so there isn't much to claw back by downclocking.

Those specs aren't looking particularly great unless you believe there are some enormous as-of-yet-to-be-revealed improvements.
 
Last edited:

RoadHazard

Gold Member
There's a PS5 game called The Touryst that runs at native 8K, but they downsample to 4K because the PS5 doesn't support 8K output yet. Plus, older game remasters like an 8K Tales of Symphonia would be possible.

Yeah, I know about that one, but I guess they don't feel it's warranted for a single game.

Still, kinda shady to put it on the box and then never enable it.
 

sachos

Member
"Architecture is RDNA3, but it's taking ray tracing improvements from RDNA4. BVH traversal will be handled by dedicated RT hardware" and "XDNA2 NPU will be featured for the purpose of accelerating Sony's bespoke temporal machine learning upscaling technique." DAMN, i really hope thats true.
 

ChiefDada

Gold Member
Cute, but unless Kepler is Mark Cerny's Twitter handle, why should I believe anything he says? No one has the final spec sheet of the PS5 Pro, so I'm not sure what you're trying to say here. And a 10% uptick in clock frequency on top of its already huge TDP? Sure.

The L3 cache saw a small decrease but the link between the L2 and L3 cache was widened considerably, resulting in a much higher throughput for N31. Hardly gimped.

The 4070 Ti is faster than the 7800 XT. The 7800 XT is barely an improvement over its predecessor and oftentimes comes within 5% of it, depending on the workload. In other cases, it can be a tad faster, like 10-15%, but you won't see it improve by 30%. It also has a massive TDP of 285W (the biggest limiting factor). The 7800 XT is just a poorly named 7700 XT.

The PS5 Pro's GPU needs roughly half the thermal output of the 7800 XT to fit in a console environment. It needs to be on a smaller node with enormous power-efficiency improvements, and unlike high-end AD102-104, its power draw hasn't been cranked far past its efficiency optimum, so there isn't much to claw back by downclocking.

Those specs aren't looking particularly great unless you believe there are some enormous as-of-yet-to-be-revealed improvements.

Lol, Jesus Christ, get up to speed! It's no wonder you've been talking this way. So where are you getting your PS5 Pro performance prediction from if you don't believe Kepler or the Tom Henderson reports? Or do you just automatically assume it will be less performant since it's a console and "PC always da best"? Lol.

The 60 CU 7800 XT saw its Infinity Cache halved vs its prior-gen 60 CU counterpart, the 6800. The architectural benefits AMD assumed it would achieve with RDNA 3 by going chiplet didn't pan out, and the PS5 Pro will expose this. The PS5 Pro is monolithic, so it will be more power efficient.
 

Dorfdad

Gold Member
Why everyone here keeps saying things are not possible is beyond me. We are all guessing. WE HAVE ZERO idea what Sony's special DLSS alternative is. We have no idea how it's going to benefit the GPU relative to a desktop GPU. We're all playing the PC vs console spec game, but we already know the PS5 GPU can punch way above its weight when tools and coding are designed for it. They can get way more leverage from their GPUs than a desktop PC can.

I'll wait and see what they announce. Why is everyone doubting what Sony can do? They have proven time and time again they are capable of wizardry!
 

Clear

CliffyB's Cock Holster
People who stream games are likely more casual gamers. They won't care about IQ being below the current top-spec console.

It's more about marketing and perception than anything else. I mean, MS obviously has no issue with PC versions of its games looking and performing better, but that's because it's not running on a box with their branding on it.

You will never hear Phil Spencer or any other MS exec saying that streaming is inferior to playing on actual hardware. They can't, or their whole business plan is shot. By supplying an Xbox-branded console that performs better than GP/xCloud, that's exactly what they would be demonstrating implicitly.

A high-end console is not going to reverse their fortunes versus Sony and Nintendo. However, continued growth of streaming makes the "console war" irrelevant, so at this point they'll need to commit ever harder to that side of their business.
 

SHA

Member
Here's an idea: stop releasing mid-generation upgrades and just make the best console you can.

This "ps5 pro" could easily have been made 3 years ago, and everyone who bought a ps5 is effectively left with an inferior product that has barely been utilised to the fullest.

I don't know, I'm all for choice I suppose, but there's just something about this whole mid-gen upgrade business model, which now seems ubiquitous, that rubs me the wrong way.

Does anyone really think we won't see a ps6 pro?

I wonder what Microsoft will do?

Will we see a Series X 2.0 or something?

Don't remember hearing any rumours, unlike the Sony ones which have been around for ages.
My One X looks stupid now, bad choice; you have a point.
 