
DF: The Touryst PS5 - The First 8K 60fps Console Game

Onironauta

Member
I have always felt that the Jaguar CPUs in the PS4 held the GPU back in many ways. The GPU had to do more heavy lifting in 60 FPS games than it would with a similar GPU on PCs with better CPUs. Those Uncharted 4 hacks that had the game running at 60 fps were only able to get there by reducing the resolution down to 560p: a 1/4 resolution drop instead of the 1/2 it would take on PCs to double the FPS.

The PS5 GPU also has 1.5x IPC gains compared to the GCN 1.0 PS4 GPU, but that still puts the PS5 at around 15 GCN 1.0 Tflops. I'd expect an 8x resolution boost. 16x is mind-boggling and is probably mostly due to the 8x more powerful CPU.
How can this game be CPU-bound if it runs on Switch which has a much weaker CPU than the Jaguar? I think the PS5 version is simply more optimized than the PS4 one.
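The resolution arithmetic in the quoted post can be sanity-checked with a quick sketch. The pure GPU-bound scaling model here is a simplification for illustration, not how any real engine behaves:

```python
# Rough GPU-bound model: frame time scales with pixel count.
# This ignores fixed per-frame CPU cost, which is exactly the
# factor the post argues was the real limiter on PS4.
def px(w, h):
    return w * h

p1080 = px(1920, 1080)  # Uncharted 4's shipped resolution
p540 = px(960, 540)     # roughly the ~560p the 60fps hack needed

# Pure GPU scaling predicts halving the pixels doubles the FPS,
# so 30 -> 60fps should cost only a 2x pixel drop.
print(p1080 / p540)  # 4.0: the hack needed a 4x drop instead
```

The gap between the predicted 2x and the observed 4x drop is the post's whole argument.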
 

ethomaz

Banned
I have always felt that the Jaguar CPUs in the PS4 held the GPU back in many ways. The GPU had to do more heavy lifting in 60 FPS games than it would with a similar GPU on PCs with better CPUs. Those Uncharted 4 hacks that had the game running at 60 fps were only able to get there by reducing the resolution down to 560p: a 1/4 resolution drop instead of the 1/2 it would take on PCs to double the FPS.

The PS5 GPU also has 1.5x IPC gains compared to the GCN 1.0 PS4 GPU, but that still puts the PS5 at around 15 GCN 1.0 Tflops. I'd expect an 8x resolution boost. 16x is mind-boggling and is probably mostly due to the 8x more powerful CPU.
From the last DF video, the Call of Duty 120Hz mode looks to be CPU-bound, and that shows in the 20-30fps drops.
This one definitely isn't.
 
Last edited:

sncvsrtoip

Member
?

6k to 8k is 2x more pixels... or 100% more pixels.
4k to 8k is 4x more pixels... or 300% more pixels.

I have no idea where you're getting 55.66%.


Well, that's when you compare the normal 8k with the normal 4k.

You need to use the 16:9 aspect ratio:

8k 16:9: 7680x4320
6k 16:9: 5568x3132 (it is not a standard... so maybe Series X uses another 6k 16:9 res)
Yes, I know, but I also found a value of 3240p in another article.
 
Pfft, only native 6K on Series X?


Clearly I'm joking and I shouldn't really need to explain that.
It's always good to see people who originated in the demo scene bring that demo-scene ethos over to commercial products. Super sampling is an incredible way to implement high-quality anti-aliasing, but it's obviously an incredibly expensive way to do it, so it's really not suitable for the vast majority of modern games. I'm really impressed that the team wanted to take advantage of each particular platform's strengths just to see what they could achieve.
I'm quite surprised at how different the 6K and 8K super sampling looked on the two consoles, to be honest, even on YouTube.
It's a beautiful game regardless of the platform, and it's supposed to be a lot of fun too, so I'll definitely get around to playing it at some point.
 

Heisenberg007

Gold Journalism
If you decrease the 8k pixels and increase the 6k pixels, I guess.

8k = 7680x4320 = 33,177,600 (standard 16:9 res called 8K UHD)
6k = 5760x3240 = 18,662,400 (not a standard 16:9 res, but the midpoint between the 8k and 4k standards)
5k = 5120x2880 = 14,745,600 (standard 16:9 res called 5K)
4k = 3840x2160 = 8,294,400 (standard 16:9 res called 4K UHD)

BTW 8k = 2x6k = 4x4k = 16x2k.
Thanks for the correction!
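The pixel counts above can be verified with a few lines, using the same non-standard 5760x3240 "6k" from the post:

```python
# Pixel counts for the 16:9 resolutions listed above.
resolutions = {
    "8k": (7680, 4320),
    "6k": (5760, 3240),  # non-standard, from the post
    "5k": (5120, 2880),
    "4k": (3840, 2160),
    "2k": (1920, 1080),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, p in pixels.items():
    print(f"{name}: {p:,} pixels, 8k is {pixels['8k'] / p:.2f}x")

# Note: 8k is exactly 4x 4k and 16x 1080p, but only 16/9 (~1.78x)
# of this 5760x3240 "6k", not a clean 2x.
```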
 

Reallink

Member
An RDNA2 teraflop is not the same thing as an old GCN / RX 480 architecture teraflop. You get a lot more out of a lot less with RDNA2... especially in certain types of workload. These systems are much more powerful than comparing teraflops between generations would indicate.

Wait until mesh shaders and other tricks are utilized to their full potential.

Architectural gains would be in the +30-50% range in best-case scenarios, not +1000%.
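For a rough sense of the numbers being traded here, a back-of-envelope sketch. The TFLOP figures are the consoles' public specs, but the per-TF uplift values are the thread's own assumptions (the 1.5x IPC figure comes from an earlier post), not benchmarks:

```python
# Hypothetical GCN-vs-RDNA2 effective-throughput comparison.
# Uplift factors are assumed, not measured.
ps4_tf = 1.84    # PS4 GPU, GCN
ps5_tf = 10.28   # PS5 GPU, RDNA2 (peak, variable clock)

for uplift in (1.3, 1.5):  # assumed RDNA2 gain per TF over GCN
    effective = ps5_tf * uplift
    print(f"{uplift:.1f}x uplift -> ~{effective:.1f} 'GCN TF', "
          f"or {effective / ps4_tf:.1f}x a PS4")
```

At 1.5x this lands on the ~15 "GCN TF" and roughly 8x-a-PS4 figures quoted above; nowhere near +1000%.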
 

ethomaz

Banned
There is more in the article.


"Shin'en tells us that in the case of its engine, the increase to clock frequencies and the difference in memory set-up makes the difference."

So it is not just the higher clock but the memory setup too.
 
Devs told DF that higher clocks allowed them to hit 8K vs 6K on Series X.

They actually said that?

I can't watch the video right now so I wanted to make sure it wasn't a mistake.

Edit: NVM I just saw the article. Very interesting though.
 
Last edited:

Md Ray

Member
schitts creek suck it GIF by CBC


happy told you so GIF
 

Fake

Member
How can this game be CPU-bound if it runs on Switch which has a much weaker CPU than the Jaguar? I think the PS5 version is simply more optimized than the PS4 one.

I remember John mentioning that only a native PS5 app can take full advantage of RDNA 2.0; otherwise it would just run in an overclocked PS4 Pro legacy mode.
 

skneogaf

Member
Impressive. My PC chokes even on 8k video... and Dark Souls 3 runs at 12fps at 8k. I have a 3080.

I tried Dark Souls 3 and I get around 35 to 40 fps at 8k.

6k gives me 60fps, which cleans up the chain mail.

I still use 4k though, as the latency goes down to less than 7ms.
 
Very interesting.

PS5 can render at 8k thanks to the higher GPU clock speed, vs 6k on the Series X, according to the developer. A bigger difference than Hitman 3, I assume?

Cerny redeemed?

Because the visuals are simple, what matters is which machine can push more pixels, I think.
So this is one example where the higher pixel rate of the PS5 makes a concrete difference.

Lol, imagine if this video had come out last week when we were discussing the advantages the PS5 might have with its higher clock speeds.

I am guessing one of these things is helping this particular game hit higher resolutions. Though I still won't declare victory yet. Not every game seems to be benefiting from higher clocks like this. At the end of the day, tflops are still the best metric for in-game performance.



Always bet on Cerny, you fools!


But it must be said... the SeX wasn't running this game completely native like the PS5 is, right?
This must make things a bit more difficult.
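The pixel-rate argument above can be put in numbers. A hedged sketch: the clocks are the consoles' public specs, but the 64-ROP count for both GPUs is the commonly cited figure, not something confirmed in the article:

```python
# Peak pixel fill rate = ROPs x clock. ROP counts are assumed.
rops = 64            # commonly cited for both PS5 and Series X
ps5_clock = 2.23     # GHz (up to; variable frequency)
xsx_clock = 1.825    # GHz (fixed)

ps5_fill = rops * ps5_clock   # Gpixels/s
xsx_fill = rops * xsx_clock

print(f"PS5 {ps5_fill:.1f} vs XSX {xsx_fill:.1f} Gpix/s "
      f"({ps5_fill / xsx_fill:.2f}x)")

# The 8k-vs-6k pixel gap (7680*4320 vs 5760*3240, ~1.78x) is larger
# than the ~1.22x fill-rate gap, so raw clock alone doesn't cover
# it; that fits the devs also crediting the memory setup.
```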
 
Last edited:

skneogaf

Member
John does say this is native PS5, so is the Xbox version the Xbox One version?

If so, then it shows how much better PS5-native games are compared to PS4 games running on the PS5.
 

ethomaz

Banned
John does say this is native PS5, so is the Xbox version the Xbox One version?

If so, then it shows how much better PS5-native games are compared to PS4 games running on the PS5.
Can non-native Series X games reach 6k resolution? I'm inclined to say this game is using the GDK (a native Series game).
 
Last edited:

rofif

Can’t Git Gud
I tried Dark Souls 3 and I get around 35 to 40 fps at 8k.

6k gives me 60fps, which cleans up the chain mail.

I still use 4k though, as the latency goes down to less than 7ms.
Maybe it was 12fps on a 2070... and closer to 30 on a 3080? I must be messing something up.
 

NeoIkaruGAF

Gold Member
Incredible how a 2yo indie game is suddenly relevant...

Serious question: why go for resolutions that almost all screens can’t even display, and not aim for very high fps instead?

Who am I kidding? I know the answer.
 

assurdum

Banned
Because the visuals are simple, what matters is which machine can push more pixels, I think.
So this is one example where the higher pixel rate of the PS5 makes a concrete difference.





Always bet on Cerny, you fools!


But it must be said... the SeX wasn't running this game completely native like the PS5 is, right?
This must make things a bit more difficult.
Forgive me, but this is pure nonsense. What about the XSX hardware specs would enable more "complex" rendering? I don't follow you. A faster GPU is faster at everything; the XSX has more CUs, and outside of that its GPU has no other advantage.
 
Last edited:
Watching the video, they say that the shadows were improved, and this looks true, but not at all times? When they show that back-and-forth comparison inside a store, isn't the SeX the one with better shadows? Look at the objects in and around the shelves: they are properly shadowed, but in the PS5 version we can see that the tops of the objects are not. Isn't this worse?
 

Arioco

Member
They actually said that?

I can't watch the video right now so I wanted to make sure it wasn't a mistake.

Edit: NVM I just saw the article. Very interesting though.


Yep, that's what the devs told DF according to John.

Not that it's that unexpected, after all. When the One S launched, I remember Microsoft engineers explaining to DF that they could get more juice from a 7% overclock than from a 17% increase in the number of CUs.

And MS has done it again this gen when they allowed devs to choose between running the Series X CPU at 3.6 GHz with 16 threads or at 3.8 GHz with only 8 threads. Why would they do that if more cores/threads performed better than higher frequencies in every circumstance? I guess it all depends on the engine and what could be bottlenecking the system in that specific scenario. On the other hand, having more compute power/Tflops has its own benefits.
 

01011001

Banned
Yes, of course it's an edge case and not applicable across the board, but I still can't shake the feeling it can be considered relevant, since most games exhibit nigh-on identical res and performance on the two consoles despite the TFLOPs difference.

True, it could be that they basically balance each other out somewhat in most games.
 

Arioco

Member
Incredible how a 2yo indie game is suddenly relevant...

Serious question: why go for resolutions that almost all screens can’t even display, and not aim for very high fps instead?

Who am I kidding? I know the answer.
It has a 4K@120fps mode. What did you expect? A 1080p@480fps mode? I'm sure a lot of screens out there can display that, right. 🙄
 
Forgive me, but this is pure nonsense. What about the XSX hardware specs would enable more "complex" rendering? I don't follow you. A faster GPU is faster at everything; the XSX has more CUs, and outside of that its GPU has no other advantage.

Read what I wrote immediately after the comma.
 

Md Ray

Member
From the last DF video, the Call of Duty 120Hz mode looks to be CPU-bound, and that shows in the 20-30fps drops.
This one definitely isn't.
No, it's GPU-bound. If last-gen PS4/XB1 could do a near-60fps average, then PS5/XSX, XSS included, have a ton of headroom for 120+fps on the CPU side.
 

skneogaf

Member
I remember saying that I thought the Xbox Series X needed to try to increase the clock rate, as it looked low when I first saw it.

The Xbox Series X runs at 1.825 GHz, which definitely seems to have harmed some games, especially the ones that had the PS5 as the lead console for development.

Maybe 2 GHz with a few fewer CUs would have helped with the PS5-lead games.

It's always really interesting when games like this show the differences.
 

arvfab

Banned
Already beat the game on the Switch. Really enjoyed it, but not enough to double-dip on PS.

Amazing achievement by the devs.

And the PS5 proves once again to be the best value in gaming, providing 8k gaming for just 400€/$!
 
Last edited:

iamvin22

Industry Verified
That's just internal rendering atm, right?
PS5 output is currently limited to 4k.
So basically we're getting high-quality downsampling.
Sony needs to upgrade the PS5 video output with 8k and VRR. The VRR update for their TVs seems to be rolling out right now.
I was told VRR will roll out around December once it is certified.
 

skit_data

Member
True, it could be that they basically balance each other out somewhat in most games.
Yeah exactly, there will always be the TFLOPs differentiator between the two, making exact CU/clock-speed comparisons inaccurate. I didn't expect the difference between the two to be this small pre-launch, though.
 

Fake

Member
I was told VRR will roll out around December once it is certified.

Rich said in the video that VRR was a higher priority for him than 8k from Sony, but isn't it crazy to think 8k output support could arrive close to, if not at the same time as, VRR?
 