
Next-Gen PS5 & XSX |OT| Console tEch threaD


Handy Fake

Member
Someone here once mentioned that SmartShift can execute in about 2ms.

That's actually quite a lot. If you're targeting 60fps, you need to account for that 2ms drop, cutting your ~16.7ms frame budget down to roughly 14.7ms. If it's switching a lot, you're going to notice the hit.

But I'm not sure how it works. I'd guess the developers have much more control over it than the system trying to intelligently guess how to shuffle power. It would be too unpredictable otherwise?
Might be talking out of my well-toned sphincter, but as I understand it the tech actually anticipates the shift in advance.
 
Last edited:

geordiemp

Member
Someone here once mentioned that SmartShift can execute in about 2ms.

That's actually quite a lot. If you're targeting 60fps, you need to account for that 2ms drop, cutting your ~16.7ms frame budget down to roughly 14.7ms. If it's switching a lot, you're going to notice the hit.

But I'm not sure how it works. I'd guess the developers have much more control over it than the system trying to intelligently guess how to shuffle power. It would be too unpredictable otherwise?

No, that's not what it means. It means SmartShift is deciding every 2ms how to distribute power between the CPU and GPU.

So in a 16ms frame, it may be running the normal split for 14ms, then giving everything to the GPU for 2ms, and then going back again.

So within every 16ms frame, it is deciding every 2ms what the power distribution is - which makes sense: in any frame there are periods where the CPU is thrashing and the GPU is having a cigar, and periods where it's the other way around. GPUs only do what CPUs tell them.

It's not a 2ms delay :messenger_beaming: that would be silly; it's the cadence of the power shifts. The CPU and GPU are still working away during all of this, otherwise it would be a chocolate radiator.

The 2ms is from the laptop version; we don't know the shift time or the granularity of the PS5 version, as it also has the predictive downclock (which is not SmartShift, it's different). I would bet on both frequency and power shifts being the same controls on the die....

Might be talking out of my well-toned sphincter, but as I understand it the tech actually anticipates the shift in advance.

Predictive sounds like that, but if you see an AVX instruction, I doubt they could drop frequencies fast enough before it's already been processed. It's probably a set of things happening, and the wording used was "deterministic and predictable"... we need to learn more for sure.

It's probably simplistic loops of the GPU doing simple things like map screens... There are enough programs that stress GPU temperatures that it will be well known which groups of code cause edge-case heat that is not normal when playing a game.
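To picture it, here's a toy model of that idea (all numbers and the demand heuristic are made up by me; the real controller lives in firmware and AMD hasn't published how it decides): a fixed SoC power budget gets re-split between CPU and GPU on every 2ms slice of a ~16ms frame, so neither chip ever stalls, the watts just follow the demand.

```python
# Toy sketch of SmartShift-style power budgeting. Hypothetical numbers:
# the real controller is in firmware and its heuristics are not public.
TOTAL_WATTS = 200   # fixed shared SoC budget (assumed figure)
SLICE_MS = 2        # cadence of the shifts
FRAME_MS = 16       # roughly a 60 fps frame

def redistribute(cpu_load, gpu_load):
    """Split the fixed budget in proportion to current demand."""
    demand = cpu_load + gpu_load
    cpu_w = TOTAL_WATTS * cpu_load / demand
    return cpu_w, TOTAL_WATTS - cpu_w

# Simulated per-slice loads across one frame: CPU-heavy early on
# (game logic, draw-call submission), GPU-heavy later (rendering).
frame = [(0.9, 0.3), (0.8, 0.5), (0.5, 0.8), (0.4, 0.9),
         (0.3, 1.0), (0.3, 1.0), (0.4, 0.9), (0.6, 0.6)]
assert len(frame) * SLICE_MS == FRAME_MS

for i, (cpu, gpu) in enumerate(frame):
    cpu_w, gpu_w = redistribute(cpu, gpu)
    t = i * SLICE_MS
    print(f"{t:2d}-{t + SLICE_MS:2d} ms: CPU {cpu_w:5.1f} W | GPU {gpu_w:5.1f} W")
```

Same takeaway as above: neither chip stops for 2ms; the shared budget just gets re-split on a 2ms cadence.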
 
Last edited:

Dolomite

Member
Halo dev teasing no load times,
then reneging in the comments 😂😂

Too late, expectations are set. I feel like Slipspace should nerf loading in at least the XSX version of Infinite.
 
🤦🏾‍♂️🤦🏾‍♂️ I mean, technically, if they all adopt some form of global illumination, then maybe. But even that is faux ray tracing tbh

Hmm, spidey sense is tingling.

You can have software ray tracing as well as hardware. GI is ray tracing. If you just google "GI vs ray tracing", the text below turns up, from some university PDF.

"Global illumination," the more advanced form of ray tracing, adds to the local model by reflecting light from surrounding surfaces to the object. A global illumination model is more comprehensive, more physically correct, and it produces more realistic images.
 
Last edited:

Deleted member 775630

Unconfirmed Member
Apparently that UE5 demo is not GPU intensive, about the same as Fortnite at 60 FPS according to Epic.

Looks like it's all about latency, doing things fast, and I/O


But what about terrafloppies......?
That's great to hear. Can't wait to see what Ninja Theory is able to do with this and make it run at 4K native
 

Dolomite

Member
Hmm, spidey sense is tingling.

You can have software ray tracing as well as hardware. GI is ray tracing. If you just google "GI vs ray tracing", the text below turns up, from some university PDF.

"Global illumination," the more advanced form of ray tracing, adds to the local model by reflecting light from surrounding surfaces to the object. A global illumination model is more comprehensive, more physically correct, and it produces more realistic images.
I'm aware. I say faux because it achieves the same goals but with different results than traditional dedicated HW RT. (I also call prim shaders faux mesh shading for the same reason.) Software RT is amazing and often less taxing on the GPU, but I wouldn't count out HW RT completely. GI has been around for, iirc, almost a decade; it's not new, but it was never touted in the gaming world until the UE5 demo showed how damn clean it can look
 

geordiemp

Member
That's great to hear. Can't wait to see what Ninja Theory is able to do with this and make it run at 4K native

Do you think Hellblade will be using Nanite and Lumen, or standard UE5 upgraded from UE4?

They are not the same thing lol

The UE5 Nanite demo is not about TF; it needs low latency and a fast SSD. Fast, fast, and fast again. TF won't help much for Nanite, is what the message is here.
 
Last edited:

Dolomite

Member
Apparently that UE5 demo is not GPU intensive, about the same as Fortnite at 60 FPS according to Epic.

Looks like it's all about latency, doing things fast, and I/O


But what about terrafloppies......?
That's actually not surprising. It makes sense why Epic refused to release the demo's specs and minimum requirements for the entire month after its showing. I knew this demo was very achievable on comparable hardware.
 

Dolomite

Member
Do you think Hellblade will be using Nanite and Lumen, or standard UE5 upgraded from UE4?

They are not the same thing lol

The UE5 Nanite demo is not about TF; it needs low latency and a fast SSD. Fast, fast, and fast again. TF won't help much for Nanite, is what the message is here.
UE5 is literally UE4 with updated plugins. The engine is built on UE4. The entire magic behind Nanite (the technology, not the name of the land in the demo) builds on the same technology you find in Quixel Megascans. In fact, Quixel is free when you purchase UE as a student if you develop now, even on current-gen GDKs.
High LOD and GI are amazing, but they have been around as technology for nearly 10 yrs at this point. Now, there is much more to UE5 than GI and Nanite, but that's all you'll hear people parrot around GAF, because they haven't used it and the idea of GI was foreign here before the tech demo.

(This is all running on a 6-yr-old software engine using the technology behind Lumen)
 
Last edited:

Deleted member 775630

Unconfirmed Member
Do you think Hellblade will be using Nanite and Lumen, or standard UE5 upgraded from UE4?

They are not the same thing lol

The UE5 Nanite demo is not about TF; it needs low latency and a fast SSD. Fast, fast, and fast again. TF won't help much for Nanite, is what the message is here.
What do you mean, what's the message here? The message is that Ninja Theory is a very skillful team, now with the money of Microsoft behind them, who are planning to use UE5.

Do you believe Nanite is only possible on PS5? That Epic created an amazing feature in UE5 that's only possible on the PS5? That wouldn't make sense from a business perspective.
 

BluRayHiDef

Banned

Nowcry

Member
UE5 is literally UE4 with updated plugins. The engine is built on UE4. The entire magic behind Nanite (the technology, not the name of the land in the demo) builds on the same technology you find in Quixel Megascans. In fact, Quixel is free when you purchase UE as a student if you develop now, even on current-gen GDKs.
High LOD and GI are amazing, but they have been around as technology for nearly 10 yrs at this point. Now, there is much more to UE5 than GI and Nanite, but that's all you'll hear people parrot around GAF, because they haven't used it and the idea of GI was foreign here before the tech demo.

(This is all running on a 6-yr-old software engine using the technology behind Lumen)

I do not agree with that statement at all.

Put a demo Megascans asset into a UE4 scene, just one asset, and render it; you will see the fps drop to 18 on a 2080.

UE5 gets Nanite and Lumen. Lumen is based on GPU/CPU, but Nanite is based on I/O + latency + GPU in an unprecedented sync dance to be able to push micro-polygons to the screen and GDDR.

In order not to overload the GPU, a rasterizer is used when rendering, but you would be unable to fit such a scene in only 16 GB of GDDR. You need the division into micro-polygons: stream them in a few frames before they are needed and destroy them a few frames later, to be able to move all of that through the scene.

Nanite changes the way the GPU is used completely. There are no more loading-screen transfers; everything is done continuously. You must stream polygons in and destroy them as the camera rotates. The renderer must have a rasterizer to avoid being overloaded by so many polygons. It must calculate in advance which polygons will appear in the next frame and make the requests very quickly and accurately. Now the GPU carries all those tasks, keeping only the essential data resident. The change is so great that this module must have been written practically from scratch.


If it's not GPU intensive, then why was it running at 1440p rather than 4K?

Because the GPU is performing many more tasks than before, not just the traditional ones. The GPU is software rasterizing, possibly destroying and creating polygons, anticipating the textures of the next micro-polygons, and preparing them in caches. It is intensive use, but the pure graphics load on the GPU is not high; it is the other tasks that are loading the system.
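To make the streaming part concrete, here is a sketch of the kind of loop being described (cluster IDs, budgets, and the visibility stand-in are all invented; Nanite's real cluster/LOD system is far more sophisticated): fetch geometry a few frames before the camera needs it, and evict it once the camera has moved on.

```python
from collections import OrderedDict

# Toy camera-driven geometry streaming, in the spirit of the post above.
# Cluster IDs, sizes and budgets are invented for illustration.
PREFETCH_FRAMES = 3          # request clusters this many frames early
MEMORY_BUDGET   = 8          # max clusters resident in GDDR at once

resident = OrderedDict()     # cluster_id -> frame it was last needed

def clusters_visible_at(frame):
    """Stand-in for visibility prediction: which clusters the camera
    will need at a given frame. A real engine derives this from the
    camera path and per-cluster bounds."""
    return {frame // 2, frame // 2 + 1}

def stream(frame):
    # Prefetch: request everything the camera will see a few frames out
    # (this is where the SSD read would be issued if not yet resident).
    for cid in clusters_visible_at(frame + PREFETCH_FRAMES):
        resident[cid] = frame
        resident.move_to_end(cid)
    # Evict: destroy the least-recently-needed clusters over budget.
    while len(resident) > MEMORY_BUDGET:
        resident.popitem(last=False)

for f in range(12):
    stream(f)
    print(f"frame {f:2d}: resident clusters {sorted(resident)}")
```

The tighter the I/O latency, the smaller PREFETCH_FRAMES and MEMORY_BUDGET can be, which is exactly why the discussion keeps coming back to the SSD rather than to TF.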
 
Last edited:

Dolomite

Member
Try again. It's not about TF, it's about I/O and removing bottlenecks. How many times do we have to keep posting the same stuff?




Congratulations?
You've mentioned flops 3 separate times on this page of the thread alone. I've mentioned flops exactly zero times. Are you arguing with yourself? What can be found in my post that had anything in the world to do with the price of rice in Taiwan, let alone TFLOPs?
If the tech demo pulled the same amount of assets needed to run Fortnite at 60 fps, then it's even more impressive, because it looks amazing and doesn't tax the GPU... as expected. Cheaper, crisper textures (from 1440p to full 8K) are the name of the game. LOD is the name of the game. RT alternatives are the name of the game; more bang for less processing buck. The engine was designed with mobile devices in mind; why would they design it to be GPU heavy?
Also, it seems like every response you've posted again attempts to imply that the PS5 is the standard hardware needed to run the engine... it's not. Please get help. Fast I/O writes are the standard of the next gen.
...can you guess which consoles feature fast-I/O SSDs, 6x as fast as current-gen SATA SSDs? ... Every next-gen console! 🎉🎉🎉🥳🥳 (Lol, even the rumored Lockhart.)
No one is disputing what Epic pulled off; you should, though, at some point sit yourself down and have that hard, honest conversation with yourself: Sony does not own UE5 😔
 
Last edited:

Bo_Hazem

Banned
A game about fish WOOOOOOOOOOOOOOOOO I CAN DREAM!

And that bloody dolphin game is not a fish, if you're thinking about linking that.

 

geordiemp

Member
Congratulations?
You've mentioned flops 3 separate times on this page of the thread alone. I've mentioned flops exactly zero times. Are you arguing with yourself? What can be found in my post that had anything in the world to do with the price of rice in Taiwan, let alone TFLOPs?
If the tech demo pulled the same amount of assets needed to run Fortnite at 60 fps, then it's even more impressive, because it looks amazing and doesn't tax the GPU... as expected. Cheaper, crisper textures (from 1440p to full 8K) are the name of the game. LOD is the name of the game. RT alternatives are the name of the game; more bang for less processing buck. The engine was designed with mobile devices in mind; why would they design it to be GPU heavy?
Also, it seems like every response you've posted again attempts to imply that the PS5 is the standard hardware needed to run the engine... it's not. Please get help. Fast I/O writes are the standard of the next gen.
...can you guess which consoles feature fast-I/O SSDs, 6x as fast as current-gen SATA SSDs? ... Every next-gen console! 🎉🎉🎉🥳🥳 (Lol, even the rumored Lockhart.)
No one is disputing what Epic pulled off; you should, though, at some point sit yourself down and have that hard, honest conversation with yourself: Sony does not own UE5 😔

Less of the ad hominem and personal insults; grow up.
 

Thirty7ven

Banned

With the Nvidia RTX 2060 Super, meanwhile, you might expect Nvidia's proprietary DLSS standard to be your preferred option to get up to 4K resolution at 60fps. Yet astoundingly, AMD's FidelityFX CAS, which is platform agnostic, wins out against the DLSS "quality" setting.

Both of these systems generally require serious squinting to make out their rendering lapses, and both apply a welcome twist on standard temporal anti-aliasing (TAA) to the image, meaning they're not only adding more pixels to a lower base resolution but also smoothing them out in mostly organic ways. But FidelityFX CAS preserves a slight bit more detail in the game's particle and rain systems, which ranges from a shoulder-shrug of, "yeah, AMD is a little better" most of the time to a head-nod of, "okay, AMD wins this round" in rare moments. AMD's lead is most evident during cut scenes, when dramatic zooms on pained characters like Sam "Porter" Bridges are combined with dripping, watery effects. Mysterious, invisible hands leave prints on the sand with small puddles of black water in their wake, while mysterious entities appear with zany swarms of particles all over their frames.

A glimpse of AMD's upscaling future.
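For the curious, the "contrast adaptive" part boils down to something like this (a simplified grayscale sketch of AMD's publicly described idea, not the actual shader, which works per colour channel with carefully tuned weights): sharpen more where the neighbourhood has headroom, back off near already-extreme values to avoid ringing and clipping.

```python
import math

def cas_pixel(img, x, y, sharpness=0.5):
    """Contrast-adaptive sharpening for one pixel (grayscale toy).
    Loosely follows AMD's public description of FidelityFX CAS."""
    c = img[y][x]
    n, s = img[y - 1][x], img[y + 1][x]
    w, e = img[y][x - 1], img[y][x + 1]
    lo, hi = min(c, n, s, w, e), max(c, n, s, w, e)
    # Local headroom in 0..1: near-black or near-white neighbourhoods
    # give small values, which limits the sharpening amount.
    amount = math.sqrt(max(0.0, min(lo, 1.0 - hi)) / max(hi, 1e-6))
    # Negative-lobe weight for the 4 neighbours, scaled by user sharpness.
    k = -amount * (0.125 + 0.075 * sharpness)
    return max(0.0, min(1.0, (c + k * (n + s + w + e)) / (1.0 + 4.0 * k)))

# Tiny example: a soft vertical ramp gets steeper (a crisper edge),
# while a perfectly flat neighbourhood passes through unchanged.
row = [0.2, 0.2, 0.4, 0.8, 0.8]
img = [row, row, row]
print([round(cas_pixel(img, x, 1), 3) for x in range(1, 4)])
```

The cheapness is the point: a handful of taps per pixel versus DLSS running a trained network every frame, which is why CAS holding its own in that comparison is worth noting.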
 
Last edited:

BluRayHiDef

Banned
How could you tell it was 1440p if you weren't told?

That's not the point. The point is that it was indeed running at 1440p, which implies that it was GPU intensive. Otherwise, it would have been running at 4K or at least at 1440p but at 60 frames per second. Having said that, I'm not dissing the PS5, because I cannot wait to get one.
 

Nowcry

Member
Congratulations?
You've mentioned flops 3 separate times on this page of the thread alone. I've mentioned flops exactly zero times. Are you arguing with yourself? What can be found in my post that had anything in the world to do with the price of rice in Taiwan, let alone TFLOPs?
If the tech demo pulled the same amount of assets needed to run Fortnite at 60 fps, then it's even more impressive, because it looks amazing and doesn't tax the GPU... as expected. Cheaper, crisper textures (from 1440p to full 8K) are the name of the game. LOD is the name of the game. RT alternatives are the name of the game; more bang for less processing buck. The engine was designed with mobile devices in mind; why would they design it to be GPU heavy?
Also, it seems like every response you've posted again attempts to imply that the PS5 is the standard hardware needed to run the engine... it's not. Please get help. Fast I/O writes are the standard of the next gen.
...can you guess which consoles feature fast-I/O SSDs, 6x as fast as current-gen SATA SSDs? ... Every next-gen console! 🎉🎉🎉🥳🥳 (Lol, even the rumored Lockhart.)
No one is disputing what Epic pulled off; you should, though, at some point sit yourself down and have that hard, honest conversation with yourself: Sony does not own UE5 😔

This "If the tech demo pulled the same amount of assets needed to run Fortnite at 60 fps" is not this If it’s true that running the UE5 demo only took the “geometry rendering budget” of Fortnite at 60fps on console.

That’s one thing that makes UE5 so exciting. If it’s true that running the UE5 demo only took the “geometry rendering budget” of Fortnite at 60fps on console, we might be in store for a game engine that really gives lower-end hardware something to work with. Sure, geometry rendering isn’t the be-all and end-all of GPU gruntwork, and the GPU isn’t the only component relevant to gaming performance, but geometry rendering certainly takes up a fair chunk of the graphics pipeline, and the GPU is the single most important component when it comes to gaming performance.

"Geometry rendering budget" is NOT EQUAL to "the same amount of assets needed to run Fortnite at 60 fps". It is a different concept.

Greetings
 

geordiemp

Member
That's not the point. The point is that it was indeed running at 1440p, which implies that it was GPU intensive. Otherwise, it would have been running at 4K or at least at 1440p but at 60 frames per second. Having said that, I'm not dissing the PS5, because I cannot wait to get one.

And here is the caveat: the TIME spent rendering on PS5 was the same as Fortnite on current-gen consoles, lolol :messenger_beaming:



Remember, the PS5 GPU is fast at 2.23 GHz; it will render very fast....

So funny, back to the drawing board lololol
 
Last edited:

Dolomite

Member
This "If the tech demo pulled the same amount of assets needed to run Fortnite at 60 fps" is not this If it’s true that running the UE5 demo only took the “geometry rendering budget” of Fortnite at 60fps on console.

That’s one thing that makes UE5 so exciting. If it’s true that running the UE5 demo only took the “geometry rendering budget” of Fortnite at 60fps on console, we might be in store for a game engine that really gives lower-end hardware something to work with. Sure, geometry rendering isn’t the be-all and end-all of GPU gruntwork, and the GPU isn’t the only component relevant to gaming performance, but geometry rendering certainly takes up a fair chunk of the graphics pipeline, and the GPU is the single most important component when it comes to gaming performance.

"Geometry rendering budget" is NOT EQUAL to "the same amount of assets needed to run Fortnite at 60 fps". It is a different concept.

Greetings

Correct. I speed-skimmed that post because much of it was buried in FUD (the original post I was responding to). But you're correct, and either way we slice it, that's impressive. We need more engines to follow the template of low cost / high detail. My post reiterates that this is, first, a multiplat engine, and we'll all benefit from everything Nanite and Lumen have to offer regardless of platform, period. If someone wants to argue against that, then they can't be helped imo
 
Last edited:

Elog

Member
That's not the point. The point is that it was indeed running at 1440p, which implies that it was GPU intensive. Otherwise, it would have been running at 4K or at least at 1440p but at 60 frames per second. Having said that, I'm not dissing the PS5, because I cannot wait to get one.

This is the point. It is obviously bottlenecked somewhere, like you state. However, Epic states it is not GPU limited, which so far implies that it is I/O limited. We will see, but that is the fairly obvious conclusion that can be drawn from the breadcrumbs so far.
 
Last edited:

Nowcry

Member
Correct. I speed-skimmed that post because much of it was buried in FUD (the original post I was responding to). But you're correct, and either way we slice it, that's impressive. We need more engines to follow the template of low cost / high detail. My post reiterates that this is, first, a multiplat engine, and we'll all benefit from everything Nanite and Lumen have to offer regardless of platform, period. If someone wants to argue against that, then they can't be helped imo

Totally agree, but we must always be aware of what parts of the hardware are being used and the limitations of each system in each part.

That way you make sure you have the right idea about what to expect.
 

Bo_Hazem

Banned
I missed you the other day......That was the other day.




 
Last edited:

sircaw

Banned




He's still got a gun; don't make me show you what 4 TFLOPs of power can do.
I have heard from legitimate people that it shits all over the PS5.
Thanks, Dealer.
 
DF pixel counting I guess

It was not. They say that at the time the reconstruction technique was good enough that they got 4K from the image of the demo, so even if you are able to see artifacts, DF says they need to improve their tools in order to measure the pixel counts.

It was the people at Epic who told them it was 1440p.
UE5 is literally UE4 with updated plugins. The engine is built on UE4. The entire magic behind Nanite (the technology, not the name of the land in the demo) builds on the same technology you find in Quixel Megascans. In fact, Quixel is free when you purchase UE as a student if you develop now, even on current-gen GDKs.
High LOD and GI are amazing, but they have been around as technology for nearly 10 yrs at this point. Now, there is much more to UE5 than GI and Nanite, but that's all you'll hear people parrot around GAF, because they haven't used it and the idea of GI was foreign here before the tech demo.

(This is all running on a 6-yr-old software engine using the technology behind Lumen)

Aside from the console war you have going now with geordiemp... did you just say UE5 is just UE4 with plugins? Really?

My question is whether I am talking with a student or with a real engineer/dev with experience, because if you are a student, let me tell you that Nanite, which actually uses REYES, is almost unbelievable to see in action in games; it is shocking.

Sorry, you cannot find Nanite in Quixel Megascans; that is not how this works. If you think it is, please say so in the Unreal Engine forum; I think everyone who, like me, uses UE wants to see REYES in action now.

The video you shared is basically what you can do if your bandwidth and memory are good enough, but it is still really expensive to use, and it is not REYES (Nanite). The main reason it looks so incredible is the work of these guys in building such an incredible library of Megascans, with so many LODs ready to be used. But even then the library has its limits, obviously, such as a limited number of assets, so if you want your game to have more unique assets you still need to make them yourself; but that is another topic.

If you are an engineer/dev with experience, please also tell me how I can use assets like trees of more than 100 MB in size at their max level without worrying about having all the normal maps in my files, and even without worrying about configuring the LODs for all my models, because, you know, LOD is not the same as REYES.

Regarding GI: yes, it has existed in recent versions of UE, but we are talking about Lumen, the new GI solution; don't confuse the subjects. The current solutions we use for GI (like Lightmass) are less effective than Lumen and probably harder to use (though I haven't used Lumen yet).

If you are not a student, dev, engineer, or somebody who plans to actually use an engine, please avoid starting a console war over technical topics.

And of course the focus here is Lumen and Nanite, because things like Niagara and the convolution reverb are already available.
 