
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

onesvenus

Member
Could you post me this too? Thanks a lot.

Yup, credit goes to Dubs from beyond3d forums as before
lNTNUrC.jpg
 

HeisenbergFX4

Gold Member
Gears 5 on XSX:

  • SP: dynamic res, 1080p to 2160p, with an average of 1720p
  • MP: 1080p at 120FPS
  • Small dips in FPS but mostly due to bugs which will be fixed with an update
  • Series S will run at dynamic res 720p to 1440p
  • SSGI, several upgrades to shadows, lighting, textures and shaders
  • VRS Tier 2 gives 5 to 12% improvement in performance (as per Devs)
  • Greatly reduced input latency
  • Load times reduced from 42s on One X to 8s on XSX

Not mind blowing by any means but for a free update that includes a 120 fps option in multiplayer?

Yes please
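
As an aside, dynamic resolution like this is usually just a feedback loop on GPU frame time: the render scale drops when the last frame ran over budget and creeps back up when there's headroom. A minimal sketch of the idea, with made-up thresholds and frame costs rather than anything from The Coalition:

```cpp
// Toy dynamic-resolution controller: shrink the render scale when the last
// frame ran over budget, grow it when there is headroom. All numbers here
// are invented for illustration.
#include <algorithm>
#include <cstdio>

int main() {
    const double budgetMs = 16.7;   // 60 FPS target
    double scale = 1.0;             // 1.0 == full 2160p
    // Hypothetical per-frame GPU cost if rendered at full 2160p:
    const double gpuMsAt2160p[] = {20.0, 19.0, 17.5, 15.0, 13.0, 16.0};

    for (double fullResCost : gpuMsAt2160p) {
        // GPU cost scales roughly with pixel count, i.e. with scale^2.
        double frameMs = fullResCost * scale * scale;
        if (frameMs > budgetMs * 0.95)      scale -= 0.05; // over budget: shrink
        else if (frameMs < budgetMs * 0.80) scale += 0.05; // headroom: grow
        scale = std::clamp(scale, 0.5, 1.0);               // 1080p..2160p range
        std::printf("frame cost %.1f ms -> render at %.0f%% of 2160p (%.0fp)\n",
                    frameMs, scale * 100, 2160 * scale);
    }
}
```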
 

IntentionalPun

Ask me about my wife's perfect butthole
You're kinda wrong, that's called 'race to idle'. The PS5 won't boost frequencies meaninglessly like that. It will boost clocks only when the CPU or GPU has meaningful, productive work to do. The CPU will boost clocks when the game needs intensive processing, then throttle back when it's not needed. Same goes for the GPU, but the GPU will spend more time at max frequency because it has more of the power budget and games need rendering.
Nah you are completely wrong here... Cerny never said the PS5 won't have these "meaningless" boosts, he simply said you shouldn't take those into account when calculating how often the processor is at max.
 

geordiemp

Member
Gears 5 on XSX:

  • SP: dynamic res, 1080p to 2160p, with an average of 1720p
  • MP: 1080p at 120FPS
  • Small dips in FPS but mostly due to bugs which will be fixed with an update
  • Series S will run at dynamic res 720p to 1440p
  • SSGI, several upgrades to shadows, lighting, textures and shaders
  • VRS Tier 2 gives 5 to 12% improvement in performance (as per Devs)
  • Greatly reduced input latency
  • Load times reduced from 42s on One X to 8s on XSX

It was also interesting how much of the visible screen was at a lower actual shader resolution, with VRS used extensively for that 5-12% saving. Below, the green is half shader resolution


xJnwp6L.png


The benefit vs cost and added-artefact discussion will be interesting.
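
For anyone wondering what "half shader resolution" means mechanically: with Tier 2 VRS the game supplies a screen-space image that sets a shading rate per tile (RSSetShadingRate / RSSetShadingRateImage in D3D12). Below is a toy software model of the concept rather than real API usage; the tile size, the coarse region and the stand-in shader are all made up:

```cpp
// Toy model of Tier 2 VRS: a per-tile shading-rate image decides whether a
// tile is shaded per pixel (1x1) or once per 2x2 quad ("half shader
// resolution" on each axis, a quarter of the shading work).
#include <cstdio>
#include <vector>

constexpr int W = 64, H = 64, TILE = 8;      // image and tile size (made up)

enum class Rate { Full1x1, Coarse2x2 };      // the two rates in this toy

// Stand-in for an expensive pixel shader.
float shade(int x, int y) { return float((x ^ y) & 0xFF) / 255.0f; }

int main() {
    // Shading-rate image: one rate per 8x8 tile. Mark the outer tiles
    // coarse (like the green regions in the screenshot) and keep the
    // centre at full rate.
    std::vector<Rate> rateImage((W / TILE) * (H / TILE), Rate::Coarse2x2);
    for (int ty = 2; ty < 6; ++ty)
        for (int tx = 2; tx < 6; ++tx)
            rateImage[ty * (W / TILE) + tx] = Rate::Full1x1;

    std::vector<float> frame(W * H);
    long long invocations = 0;               // count shader executions
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            Rate r = rateImage[(y / TILE) * (W / TILE) + (x / TILE)];
            if (r == Rate::Coarse2x2) {
                if ((x & 1) == 0 && (y & 1) == 0) {
                    frame[y * W + x] = shade(x, y);  // one shade per 2x2 quad
                    ++invocations;
                } else {
                    int qx = x & ~1, qy = y & ~1;    // broadcast quad anchor
                    frame[y * W + x] = frame[qy * W + qx];
                }
            } else {
                frame[y * W + x] = shade(x, y);      // full-rate shading
                ++invocations;
            }
        }

    std::printf("shader invocations: %lld of %d pixels (%.0f%% saved)\n",
                invocations, W * H,
                100.0 * (1.0 - double(invocations) / (W * H)));
}
```

The printout shows the flavour of saving involved: shading once per 2x2 quad in the marked tiles cuts the shader invocations for those regions to a quarter.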
 

Lysandros

Member
w98Ojgk.jpg




You can't be serious?

It's a 4TF console, with less RAM, slower memory, etc. and yet you believe it will start pushing the PS5???

I thought TF was the measurement that counted and now you're saying it does not and it's in all the marvelous features the GPU has that can make up the difference with a GPU that's 2.5 times more powerful?

I, I don't have words for this...
Furthermore, PS5 is 'more' than 2.5 times as powerful; XSS has only around 30% of the triangle/rasterization throughput and pixel fill rate of PS5.
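
Back-of-envelope on the fill-rate claim, assuming the commonly cited specs (64 ROPs at 2233 MHz for PS5, 32 ROPs at 1565 MHz for Series S; treat both as assumptions):

```cpp
// Peak pixel fill rate = ROPs x clock, using commonly cited (assumed) specs.
#include <cstdio>

int main() {
    double ps5 = 64 * 2233e6;   // pixels per second
    double xss = 32 * 1565e6;
    std::printf("PS5 %.1f Gpix/s, XSS %.1f Gpix/s -> XSS is %.0f%% of PS5\n",
                ps5 / 1e9, xss / 1e9, 100.0 * xss / ps5);
}
```

That lands at roughly a third, in the same ballpark as the ~30% figure above.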
 

kyliethicc

Member
Gears 5 on XSX:

  • SP: dynamic res, 1080p to 2160p, with an average of 1720p
  • MP: 1080p at 120FPS
  • Small dips in FPS but mostly due to bugs which will be fixed with an update
  • Series S will run at dynamic res 720p to 1440p
  • SSGI, several upgrades to shadows, lighting, textures and shaders
  • VRS Tier 2 gives 5 to 12% improvement in performance (as per Devs)
  • Greatly reduced input latency
  • Load times reduced from 42s on One X to 8s on XSX
4K @ 120 FPS tho? Phil promised. 12 TFlops... fixed frequency... full powa of game pass cloud x?


So their 4K console can't even average 1800p, and their 1440p console is a 720p console. LOL
 

saintjules

Member
It can still be a removable M.2 NVMe PCIe 4.0 SSD with a custom software stack for increased I/O efficiency (think of something like Mantle).

They can always set some minimum requirements for replacing the SSD, just like Sony did for the PS4:

Getting ready for the PS5 Ultimate FAQ!

Your destiny.

Yes
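
Purely to illustrate what "minimum requirements for replacing the SSD" could look like, here's a hypothetical firmware-style check; every field name and threshold is invented for the example, not Sony's actual spec:

```cpp
// Hypothetical sketch of a firmware check against minimum specs for a
// replacement drive. Fields and thresholds are made up for illustration.
#include <cstdio>

struct DriveInfo {
    const char* model;
    double seqReadMBs;   // sequential read, MB/s
    bool   isPcie4x4;    // PCIe 4.0 x4 link
    double lengthMm;     // physical length incl. heatsink clearance
};

bool meetsMinimumRequirements(const DriveInfo& d) {
    return d.isPcie4x4 && d.seqReadMBs >= 5500.0 && d.lengthMm <= 110.0;
}

int main() {
    DriveInfo candidate{"ExampleDrive X1", 7000.0, true, 80.0};
    std::printf("%s: %s\n", candidate.model,
                meetsMinimumRequirements(candidate) ? "OK" : "rejected");
}
```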
 

onesvenus

Member
It was also interesting how much of the visible screen was at a lower actual shader resolution, with VRS used extensively for that 5-12% saving. Below, the green is half shader resolution


xJnwp6L.png


The benefit vs cost and added-artefact discussion will be interesting.
For me the biggest point was when he said: "impressively, it's not something you'll notice during normal play". It seems Tier 2 VRS is really worth it.
 

ZywyPL

Banned
It's an average of 1720p. Better than ULTRA settings on PC, with SSGI and several upgrades to shadows, lighting, textures and shaders.

Ultra settings are already a waste on PC, let alone on consoles with their limited power budgets, so it's dumb to even try to "outperform" the PC settings. This game could easily have been 4K60 with a few options toned down to Medium-High, without any noticeable difference in the visuals.
 

IntentionalPun

Ask me about my wife's perfect butthole

I mean sure.. but what upscaling technique is being used?

Like we are just dismissing games with sub 4k rendering and then throwing around discussion of DLSS but there's no indication any such technique is being used lol

(not that I have an issue with regular dynamic resolution, I'm sure the game looks great at 1800p, but why again is this guy bringing up DLSS?)
 

yewles1

Member

geordiemp

Member
For me the biggest point was when he said: "impressively, it's not something you'll notice during normal play". It seems Tier 2 VRS is really worth it.

Most stuff above 1440p is not noticed during normal gameplay on a 4K screen on a console with modern temporal AA, if you're moving and playing. If you stop and analyse details, then what's the difference between 1600p and 1700p, or between that and VRS with a lot of the screen at half shader resolution? That's my point.

At the end of the day, with my eyes :messenger_beaming: anything over 1440p with 60 FPS is fine for either console lol, just asking a valid question.
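
The raw pixel counts make that point concrete (16:9 frames, named by vertical resolution):

```cpp
// Pixel counts for the resolutions being argued about, as a share of 4K.
#include <cstdio>

int main() {
    const int heights[] = {1440, 1600, 1700, 2160};
    const double ref = 3840.0 * 2160.0;          // native 4K pixel count
    for (int h : heights) {
        double px = (h * 16.0 / 9.0) * h;        // 16:9 width times height
        std::printf("%4dp: %4.1f Mpix (%3.0f%% of 4K)\n",
                    h, px / 1e6, 100.0 * px / ref);
    }
}
```

1600p to 1700p is only about 13% more pixels, which is hard to spot in motion on a 4K screen.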
 

HeisenbergFX4

Gold Member
I mean sure.. but what upscaling technique is being used?

Like we are just dismissing games with sub 4k rendering and then throwing around discussion of DLSS but there's no indication any such technique is being used lol

(not that I have an issue with regular dynamic resolution, I'm sure the game looks great at 1800p, but why again is this guy bringing up DLSS?)

What's coming down the pipe with DLSS is amazing.
 

jose4gg

Member
I mean sure.. but what upscaling technique is being used?

Like we are just dismissing games with sub 4k rendering and then throwing around discussion of DLSS but there's no indication any such technique is being used lol

(not that I have an issue with regular dynamic resolution, I'm sure the game looks great at 1800p, but why again is this guy bringing up DLSS?)

Damage control. As far as I can tell (I haven't watched the DF video yet), they aren't claiming to use any "temporal injection" or ML upscaling, nor to have a checkerboarding chip integrated into the console...

Someone correct me if I'm wrong.
 

Andodalf

Banned
Larger jump than from Morrowind to Oblivion? Shouldn't that be a given considering Oblivion will have been 2 gens ago by the time this arrives?

What?

He's saying Fallout 76 -> Starfield is a bigger jump in terms of their tech. Morrowind to Oblivion was one gen; 76 to Starfield is one gen. Not sure what the confusion is.
 
We'll see, and with the RDNA2 features I would think within 12 months Series S could be pushing PS5 pretty hard. I wouldn't worry about Series X if I were you, it'll be out of sight by then.

I wonder if you're serious, because you don't understand what you're talking about, or if you're just trolling. Anybody who understands these things would not think that "RDNA2 features magically make a 4 TFLOP machine push a 10.2 TFLOP machine that is also based on RDNA2, but has more custom silicon so it is not technically full RDNA2".

44NuNT6.png

Going as low as 720P on a next gen console? What the hell??

I (and many others) said it months ago:

if a game doesn't run at forced native 4K on Series X, it's possible it runs sub-1080p or even 720p on Series S :messenger_savoring:

Feels good to be right, especially after months of the fucking "full RDNA2" spam from people who, given their lack of basic understanding, shouldn't even have the right to write about tech.

But this is good for XSX owners; at least some devs don't force native 4K, to have mercy on Series S. Dynamic res is a much better option with limited power
 

HeisenbergFX4

Gold Member
Translate?

BTW Sony already confirmed the bundled cable is HDMI 2.1.

The FUD didn't last long this time.

Their second post is what I actually meant to include

but it says:

At this point we want to correct the spread of misinformation regarding the HDMI 2.1 cable of the PS5. As Sony confirmed to us, it is an HDMI 2.1 cable. We apologise for the ambiguities. Complete correction

Edit - I wasn't trying to add any FUD btw, I had already told several people here via PMs who asked that it was 2.1
 

IntentionalPun

Ask me about my wife's perfect butthole
What's coming down the pipe with DLSS is amazing.
For sure.. and for now it's on PC only.

These console games aren't using the technique and we don't know when/if they will. I feel like if I were a twatter personality I'd frame these posts way differently... if you're going to bring up DLSS in relation to a system that currently has no games using the technique, perhaps that should be noted, or it should be framed as a when/if scenario.
 

MistBreeze

Member
For sure.. and for now it's on PC only.

These console games aren't using the technique and we don't know when/if they will. I feel like if I were a twatter personality I'd frame these posts way differently... if you're going to bring up DLSS in relation to a system that currently has no games using the technique, perhaps that should be noted, or it should be framed as a when/if scenario.
Isn't AMD working on a DLSS-like feature that can be used on Xbox Series X and PS5?

I'm sure I read it somewhere....

If that's the case it will help these consoles immensely down the road
 

Rea

Member
Nah you are completely wrong here... Cerny never said the PS5 won't have these "meaningless" boosts, he simply said you shouldn't take those into account when calculating how often the processor is at max.
"There's another phenomenon here, which is called 'race to idle'. Let's imagine we are running at 30Hz, and we're using 28 milliseconds out of our 33 millisecond budget, so the GPU is idle for five milliseconds. The power control logic will detect that low power is being consumed - after all, the GPU is not doing much for that five milliseconds - and conclude that the frequency should be increased. But that's a pointless bump in frequency," explains Mark Cerny.
At this point, the clocks may be faster, but the GPU has no work to do. Any frequency bump is totally pointless. "The net result is that the GPU doesn't do any more work, instead it processes its assigned work more quickly and then is idle for longer, just waiting for v-sync or the like. We use 'race to idle' to describe this pointless increase in a GPU's frequency," explains Cerny.

Read again, nothing I said is wrong. Why would he literally say it's pointless to boost the clock, then have his PS5 keep boosting during 'race to idle'? LoL. That's a waste of power and resources. :messenger_tears_of_joy:

Edit: spelling
 

Zheph

Member
Isn't AMD working on a DLSS-like feature that can be used on Xbox Series X and PS5?

I'm sure I read it somewhere....

If that's the case it will help these consoles immensely down the road
There will be alternatives; checkerboard rendering is one, used on PS4 Pro, and not long ago Sony patented a solution much closer to DLSS
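
For reference, the core checkerboard trick is to shade only half the pixels each frame in an alternating pattern and reuse the other half from the previous frame; real CBR also uses motion vectors and ID buffers to decide what's safe to reuse. A toy sketch of just the halved shading cost:

```cpp
// Minimal checkerboard-rendering sketch: shade half the pixels each frame
// (alternating which half) and fill the holes from the previous frame.
#include <cstdio>
#include <vector>

constexpr int W = 8, H = 4;

// Stand-in for the expensive per-pixel shading work.
float shade(int x, int y, int frame) { return float(x + y + frame); }

int main() {
    std::vector<float> prev(W * H, 0.0f), cur(W * H);
    for (int frame = 0; frame < 2; ++frame) {
        int parity = frame & 1;          // which half of the board to shade
        int shaded = 0;
        for (int y = 0; y < H; ++y)
            for (int x = 0; x < W; ++x) {
                if (((x + y) & 1) == parity) {
                    cur[y * W + x] = shade(x, y, frame);  // fresh sample
                    ++shaded;
                } else {
                    cur[y * W + x] = prev[y * W + x];     // reuse last frame
                }
            }
        std::printf("frame %d: shaded %d of %d pixels\n", frame, shaded, W * H);
        prev = cur;
    }
}
```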

 

IntentionalPun

Ask me about my wife's perfect butthole
"There's another phenomenon here, which is called 'race to idle'. Let's imagine we are running at 30Hz, and we're using 28 milliseconds out of our 33 millisecond budget, so the GPU is idle for five milliseconds. The power control logic will detect that low power is being consumed - after all, the GPU is not doing much for that five milliseconds - and conclude that the frequency should be increased. But that's a pointless bump in frequency," explains Mark Cerny.
At this point, the clocks may be faster, but the GPU has no work to do. Any frequency bump is totally pointless. "The net result is that the GPU doesn't do any more work, instead it processes its assigned work more quickly and then is idle for longer, just waiting for v-sync or the like. We use 'race to idle' to describe this pointless increase in a GPU's frequency," explains Cerny.

Read again, nothing I said is wrong. Why would he literally say it's pointless to boost the clock, then have his PS5 keep boosting during 'race to idle'? LoL. That's a waste of power and resources. :messenger_tears_of_joy:

Edit: spelling
You need to keep reading that article as he never says this doesn't happen on PS5 dude.

"So, when I made the statement that the GPU will spend most of its time at or near its top frequency, that is with 'race to idle' taken out of the equation - we were looking at PlayStation 5 games in situations where the whole frame was being used productively. The same is true for the CPU, based on examination of situations where it has high utilisation throughout the frame, we have concluded that the CPU will spend most of its time at its peak frequency."

He's not saying the PS5 doesn't race to idle.. he's saying "when I made my statements about how often the GPU / CPU are at or near max, I was not talking about situations where it was meaningless to have them maxed."

There's literally also no reason NOT to give the CPU or GPU full power during these scenarios.. like at all.. even if you were running on a battery, because you save more power with race to idle than with not racing to idle (hence why it's a power saving technique). He's calling it meaningless from a performance standpoint, he's not declaring the PS5 as not having that power saving technique.
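
To make Cerny's numbers concrete, here's a toy model of that 28-ms-of-33-ms frame; the frame budget and workload come from the quote, the 10% boost figure is arbitrary:

```cpp
// Toy model of "race to idle": the frame has a fixed amount of work, so a
// higher clock only makes the GPU finish sooner and idle longer.
#include <cstdio>

int main() {
    const double frameMs   = 33.3;  // 30 Hz frame budget, per the quote
    const double workUnits = 28.0;  // work sized to take 28 ms at base clock
    const double clockScales[] = {1.0, 1.1};  // base vs an arbitrary 10% boost

    for (double s : clockScales) {
        double busyMs = workUnits / s;    // the fixed work finishes sooner...
        double idleMs = frameMs - busyMs; // ...so the GPU just idles longer
        std::printf("clock x%.1f: busy %.1f ms, idle %.1f ms, "
                    "work per frame unchanged\n", s, busyMs, idleMs);
    }
}
```

Either way the GPU does the same 28 ms worth of work per frame, which is exactly why a boost in that situation is "pointless" from a performance standpoint.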
 

MastaKiiLA

Member
I think Dualsense impressions are great and I’m excited and I already have an extra one pre-ordered. I don’t mean it in a bad way.
That's fair, and I didn't mean to imply that it was a negative opinion. It's just that I've seen so much of that stated in the DualSense-related threads that I think it's beating a dead horse at this point.
 
I'm not following the threads, so I have no idea if this was already discussed here.
More than a year ago Microsoft discussed their solution for Image Reconstruction Upscaling using DirectML.




Starts at minute 24.
It's interesting that while doing their research Microsoft was working with Nvidia, not with AMD.
Nvidia gave them their "secret sauce", but that's not so relevant anymore, because at the time it was the original version of DLSS, which was garbage. Since then Nvidia learned from AMD's RIS and changed everything to release DLSS 2.0.
Anyway, you can see that it's just a question of software. Just as you don't need dedicated hardware to do raytracing, the additional hardware for ML only speeds up the work, work that can be done with low-precision math. There Microsoft explains that their solution uses FP16, and I guess it can be done with INT8 and INT4 too. Guess what: RDNA1 supports double-rate FP16, quad-rate INT8 and octa-rate INT4. Microsoft talked about customization to improve this, and we suppose that means the work can be done across all the CUs, not limited the way it is on RDNA1.

So Microsoft has been working on this for a long time, and because they had to work together with AMD, it's safe to assume that AMD's solution will be very similar and use the same hardware optimizations as the XSX.
We can also expect this to come to RDNA1, and because it'll be open source, Sony will probably take advantage of this work too, if they haven't done their own to offer the same on the PS5.
All consoles should receive this feature via firmware update next year.
As expected, this isn't an Nvidia advantage anymore; it's a new industry standard.
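
To put those packed-math rates in perspective, a quick calculation from a hypothetical 10 TFLOPS FP32 baseline (an assumption for illustration, not either console's exact figure):

```cpp
// Effective throughput at the packed-math rates mentioned above, from an
// assumed FP32 baseline. Integer ops are counted as TOPS, not TFLOPS.
#include <cstdio>

int main() {
    const double fp32Tflops = 10.0;  // assumed baseline, not a real spec
    std::printf("FP32: %5.1f TFLOPS\n", fp32Tflops);
    std::printf("FP16: %5.1f TFLOPS (double rate)\n", fp32Tflops * 2);
    std::printf("INT8: %5.1f TOPS   (quad rate)\n",   fp32Tflops * 4);
    std::printf("INT4: %5.1f TOPS   (octa rate)\n",   fp32Tflops * 8);
}
```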
 

Sethbacca

Member
w98Ojgk.jpg




You can't be serious?

It's a 4TF console, with less RAM, slower memory, etc. and yet you believe it will start pushing the PS5???

I thought TF was the measurement that counted and now you're saying it does not and it's in all the marvelous features the GPU has that can make up the difference with a GPU that's 2.5 times more powerful?

I, I don't have words for this...

Their main argument seems to be that frames > every other consideration. A game could be running 300fps@480p on the S and 4K30 on the PS5 and they would consider it an S victory. Dumbest shit ever.
 