
Next-Gen PS5 & XSX |OT| Console tEch threaD


Shmunter

Member
I also don't doubt Sony and Epic had a joint marketing deal to make this demonstration for PS5; it's the natural conclusion to the conversations they've had that led to PS5 and UE5 ending up where they have.

But that doesn't mean it's nothing but a marketing stunt, or that any old NVMe-equipped system could run it in its current state at the same level of detail. If Sony were smart, they'd have asked Epic to make something that really makes use of their specific IO, no matter how excessive and pointless it is to use that much bandwidth, purely to demonstrate what can be done.

In other words it can be both: it being marketing and it being genuinely a (pointlessly excessive?) test of PS5 IO aren't mutually exclusive, and it would make less sense for such a marketing deal to result in Epic making something that doesn't play to PS5's strengths and could run on pretty mid-range IO. That would be doing Sony a disservice.

My point is that a lot of the "facts" being bandied around to dismiss all this as smoke and mirrors actually raise more questions than they answer, and imply things that make less sense than before when you think about them for more than a moment.
Precisely
 

geordiemp

Member
They'll be using Sony APIs for accessing the storage. All developed and tested and functionally emulated on a PC, but profiled and benchmarked on the dev-kit itself.

Kind of like when making an Android or iOS app. You use Android/iOS APIs while programming it on your PC, even running it at a slower speed in an emulator before pushing it to the real hardware and profiling it.

This is an important point relating to the Epic China discussion about running the first part of the demo in the editor on a laptop. It does not mean the demo would run without a lot of API changes.
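To illustrate, here's a minimal C++ sketch of that kind of setup (all the names are hypothetical, not from any real SDK): the game code talks to an abstract storage interface, which is emulated on the development PC and only backed by the real console I/O path on the dev-kit.

Code:
// Minimal sketch (hypothetical names, not from any real SDK) of developing
// against a console storage API on a PC: code targets an abstract interface,
// is functionally emulated locally, and only profiled on the dev-kit.
#include <cstdint>
#include <cstdio>
#include <memory>

struct IStorageDevice {
    virtual ~IStorageDevice() = default;
    virtual void readAsync(uint64_t offset, uint64_t bytes) = 0;  // queue a read
};

// Dev-kit build: wraps the platform's native high-speed I/O calls.
struct ConsoleStorageDevice : IStorageDevice {
    void readAsync(uint64_t offset, uint64_t bytes) override {
        std::printf("console I/O: read %llu bytes @ %llu\n",
                    (unsigned long long)bytes, (unsigned long long)offset);
    }
};

// Editor/laptop build: same interface, emulated with ordinary file I/O, so
// the demo *runs* on a PC without telling you anything about console throughput.
struct EmulatedStorageDevice : IStorageDevice {
    void readAsync(uint64_t offset, uint64_t bytes) override {
        std::printf("emulated I/O: read %llu bytes @ %llu\n",
                    (unsigned long long)bytes, (unsigned long long)offset);
    }
};

std::unique_ptr<IStorageDevice> makeDevice(bool onDevkit) {
    if (onDevkit) return std::make_unique<ConsoleStorageDevice>();
    return std::make_unique<EmulatedStorageDevice>();
}

int main() {
    auto dev = makeDevice(/*onDevkit=*/false);
    dev->readAsync(0, 64 * 1024);  // the same game code runs either way
}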

Any idea when the detailed Epic UE5 talk is?

Could it be after the rumoured PS5 event, as the NDA may be relaxed somewhat by then?
 

Exodia

Banned
When I said Lumen I was using it as shorthand for the wordy demo name. I was not referring to the lighting technology. I think there's a bit of misunderstanding there, as I know nothing of the history of that and don't claim to know much about it at all, but it's interesting that you detailed it anyway.
When I said "last time", I meant it in the context of us getting an Unreal demo shown on unreleased and as of yet unseen PS4 hardware, and how long it took for us to get our hands on it in the PC space.
From memory I thought it was only Elemental that was shown pre-release of PS4, but it must have been Infiltrator I was recalling. I was WRONG in thinking the engine was released prior to the demo, but about right on the time we had to wait from seeing it at GDC to being able to run it on a PC. You're right that the demos could run from day one; it was importing assets from them into your own project that could be problematic without following a guide they provided. At least according to the press release that accompanied the demo being announced as available to try.
We should be getting a "preview release" of UE5 "next year", and a slated general release in "late 2021". In your experience, is there much time between the two? If they are around the same time, then the timeline between seeing Nanite and getting to play with it ourselves is about on par with the last Unreal Engine demo PS4 got before release, and we'll be waiting a long time either way.

I'm 100% certain that the demo release will follow the timeline of the Kite demo, which was unveiled at GDC in March 2015 and released in June 2015, less than three months later.

I fully believe we will get our tech talk deep dive at GDC in August, with a UE5 preview release around Feb 2021 including the release of the Lumen in the Land of Nanite project files.

On the subject of getting our hands on Lumen in the Land of Nanite as part of the UE5 SDK at the back end of 2021, it's still likely not going to tell us much if the developers are saying it's still early technology and the target is this visual quality at 60 FPS. If they can manage that over the next 18 months or so, it won't be comparable unless it can be run on a released PS5 again somehow?

Won't tell us anything? Lol, I love the goalpost shifting. It's the project files; it literally tells us everything.

You do realize the bottleneck is Lumen and not Nanite? This has already been confirmed by the Chinese senior Epic engineer. They will tell us the ms performance cost for each feature in their deep dive in August.

You're assuming they've already made a PC build of the PS5 demo and have run it on various configurations to benchmark it and get an idea of what is or would be required in future. What if, at this stage, it was built on/for a PS5 dev-kit using the PS5 SDK, ready for GDC 2020? This is all work in progress. Nanite, Lumen and UE5 are still well over a year away from being ready for public release. It makes no sense to start quoting required specs if they're going to be changing constantly over the next few months.
You're free to have your suspicions, but to effectively call anyone that doesn't agree with you an idiot or gullible is a bit silly in my opinion.

Do people realize that development is done on the PC and not on the devkit? Do people realize that these studios with 50, 100, 200, or 2,000 people still only get a couple of devkits? Some even just one or two? Engineers don't each have their own devkit! Therefore development still happens on the PC, and strictly ONLY the PC.

Secondly, people don't know that features show up in the rendering-dev repository about two years before they make it into a public release in production status, and about a year before they make it in as an experimental feature.

The only reason we didn't already have Lumen and Nanite last year is PR and marketing around the major version number (5).
 

Deto

Banned
Epic has claimed that an RTX 2070 Super (entry-level TU104) running the same UE5 demo gets "pretty good performance".

https://www.pcgamer.com/unreal-engine-5-tech-demo/


Epic Games chief technical officer Kim Libreri.

Regarding loading and streaming, though, Sweeney says that the PlayStation 5's SSD architecture is "god-tier" and "pretty far ahead of PCs," but that you should still get "awesome performance" with an NVMe SSD, which I'm using

I think you need to learn to interpret texts and a little bit of logic.
 

FeiRR

Banned
I don't doubt it was developed on a PC, that's how it's done, but if this demo is being made for and with Sony on a specific bit of hardware, the development PCs will be using that hardware's SDK for its engine features (they mention they used PS5's primitive shaders as another example).
Having this SDK installed allows you to emulate functionality before running it native on the dev-kit, but it isn't helpful in telling you what your performance would have been if you'd developed it with PC architecture and Windows GPU drivers in mind from the beginning.
That they're not ready to start quoting requirements because the engine and technology are still in their infancy was kind of my original point, and to me is exactly what I assumed when Sweeney said it's too early to say. It's literally too early to be saying that.
Or maybe the PC API (here DX12U) is not ready for their technology. Maybe that's why they aren't talking about that and they won't share Xbox/PC data.

Edit: I see the astrowarriors got triggered right now, and I think I might be hitting the spot :messenger_tears_of_joy:
 

Leyasu

Banned
Again it comes back to what is meant by "it". "It" as in the in-work game engine with the Nanite and Lumen technology that will have been in development for years before they got a PS5 dev-kit? Or "it" as in the playable GDC demo made with Sony using a PS5 dev-kit?

You're right that both will absolutely be programmed on a PC, but if Sony's playable GDC demo was a project entirely designed for PS5, then the PCs set up for developing that game would be using a software development kit, or UE5 engine extensions, to specifically target its hardware and talk through its APIs. They'll be using Sony APIs for programming the primitive shaders. They'll be using Sony APIs for accessing the storage. All developed and tested and functionally emulated on a PC, but profiled and benchmarked on the dev-kit itself.

Kind of like when making an Android or iOS app. You use Android/iOS APIs while programming it on your PC, even running it at a slower speed in an emulator before pushing it to the real hardware and profiling it.
If you've made Sony a playable demo in your generic engine using your new technology and Sony's APIs for accessing storage and GPU features, not to mention Sony's own graphics API, as they don't use DirectX or likely anything PC-related for rendering, that doesn't mean you'll know how well it would work on various configurations of PC without basically remaking Sony's playable GDC demo with those APIs ripped out and the game programmed for PC architecture using standard libraries and PC drivers.
If I've made a 3D game for iOS using a desktop computer and Apple's Metal graphics API, I'd have no idea how well it could run natively on a PC unless I rewrote large parts of it for PC and swapped out Metal for DirectX, OpenGL or Vulkan, etc., even though it runs functionally perfectly fine in the editor or through the emulator on the desktop I'm using to develop it.
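Roughly what I mean, as a toy C++ sketch (my own hypothetical names, not real engine code): the renderer sits on a thin RHI-style layer, and each target graphics API needs its own backend written and optimised before a native build tells you anything about performance.

Code:
// Toy sketch (hypothetical names) of why "runs in the editor" is not the
// same as "ported": the engine calls a render-hardware-interface, and each
// platform needs its own backend implemented behind it.
#include <cstdio>

struct Mesh { long long triangleCount; };

struct IRhi {
    virtual ~IRhi() = default;
    virtual void drawMesh(const Mesh& m) = 0;
};

struct MetalRhi : IRhi {      // iOS/macOS backend
    void drawMesh(const Mesh& m) override {
        // real code would encode into an MTLRenderCommandEncoder here
        std::printf("Metal draw: %lld tris\n", m.triangleCount);
    }
};

struct VulkanRhi : IRhi {     // PC backend: written and tuned separately
    void drawMesh(const Mesh& m) override {
        // real code would record vkCmdDrawIndexed into a command buffer here
        std::printf("Vulkan draw: %lld tris\n", m.triangleCount);
    }
};

int main() {
    Mesh m{20'000'000};
    MetalRhi metal;  metal.drawMesh(m);   // the build you actually shipped
    VulkanRhi vk;    vk.drawMesh(m);      // the rewrite a PC port needs
}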



But this was a demo made for/with Sony for PS5 on a PS5 dev-kit. Sony wouldn't even let Epic make the flying scene playable in case people were able to fly out of the world and break the game, or see that they hadn't bothered to make art assets for the views to the side. Why would something this specific and made for a specific event also have been written for PC? I'm not saying it wasn't, but I'm not as convinced that it's clear they would have. I'm sure 18 months from now when the engine and the new technology is more mature and optimised they'll bundle this demo with it, and have made a PC build of it. But this far away from release, for something made for a Sony GDC event? I just don't think you can absolutely say they would have.



Who knows, maybe it didn't even tickle PS5's IO, or wouldn't even have stressed XSX's IO either. I'm not really that bothered either way if I'm honest. But if that is the case, what the hell is Sony planning for with the IO they have provided? Is it miscalculated and completely overkill? Even for this specific PS5 demo that Sony had creative control over?
I'm not asking that as some kind of challenge at you by the way, I'm genuinely interested in what the suggestion that Lumen is no big deal from an IO perspective actually means if true.
You make some good points and explain them well.... Great post!

Even though there are still a few things that I don't necessarily agree with, here and in your follow-up post below, they're not worth running in circles over when we're not in possession of the facts and the information.

Thanks for your replies by the way. It is nice to see people thoughtfully speculate without being condescending or rude!

The wait for next gen is real.
 

Nevaeh

Member
Somehow this guy has me blocked even though I don’t know him? And I rarely even mention videogames on my twitter.
He either tracked me down on gaf or he regularly searches “Xbox” and saw my tweet about how the last event sucked
yep same here, blocked me even tho I very rarely even type a tweet lmao
 

PaintTinJr

Member
I wonder if this Nvidia paper (PRACTICAL REAL-TIME VOXEL-BASED GLOBAL ILLUMINATION FOR CURRENT GPUS) gives us a ballpark for extrapolating how the PC hardware measures up on Lumen/Nanite tech.


[Four slides from the Nvidia paper, referenced below as slides 1-4]

In the paper, we can see that the best-looking images (slide 1) need a GTX Titan with 2.5GB of VRAM, render at 1080p at a frame-rate just shy of 40fps (25.4ms per frame), trace a quarter or fewer of the UE5 demo's rays, and do so with models that have a fraction of the polygons (see slide 3) and texture detail (even at ultra settings).
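For anyone wanting to sanity-check those numbers, here's the quick arithmetic (a crude check, assuming cost scales linearly with pixel count, which real renderers don't quite do):

Code:
// Back-of-the-envelope check of the paper's figures vs the UE5 demo's output.
#include <cstdio>

int main() {
    // Paper: 25.4 ms per frame at 1080p on a GTX Titan.
    double frameMs = 25.4;
    std::printf("fps = %.1f\n", 1000.0 / frameMs);          // ~39.4 fps

    // Naive pixel-count scaling from 1080p to the demo's 1440p:
    double px1080 = 1920.0 * 1080.0;                        // 2,073,600 px
    double px1440 = 2560.0 * 1440.0;                        // 3,686,400 px
    std::printf("pixel ratio = %.2fx\n", px1440 / px1080);  // ~1.78x
}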

The thing that got me thinking is that UE5 doesn't use the PS5's HW RT or the fixed-path rasterization, IIRC. So even if the latest RTX 2060s/3080s can scale through those multi-factor deficits of the older GTX Titan to match the PS5 image, the PS5 will still have GPU headroom to do much more, presumably, if some (or all?) of this is being done on the PS5's CPU async cores (with acceleration help from its IO complex). The logical rationale is that a current PC's CPU can't replace the compute of even a GTX Titan, never mind something that exceeds its capabilities.

Maybe Nvidia has found major algorithm improvements since then and this is a poor reference point, but I thought it was quite interesting to extrapolate from, and to marvel again at the visuals of UE5 at 1440p30 with 20M polys on screen per frame.
 

rnlval

Member
Regarding loading and streaming, though, Sweeney says that the PlayStation 5's SSD architecture is "god-tier" and "pretty far ahead of PCs," but that you should still get "awesome performance" with an NVMe SSD, which I'm using

I think you need to learn to interpret texts and a little bit of logic.
Are you claiming the recent UE5 demo streams data at 22GB/s?

It's you who needs to learn to interpret texts and a little bit of logic.
 
Won't tell us anything? Lol, I love the goalpost shifting. It's the project files; it literally tells us everything.

But who's doing the goalpost shifting, though? Is it wrong to think that Epic might find performance improvements and optimisations in this kind of rendering and in UE5 between last month and the end of 2021? If so, doesn't that invalidate the demo's dynamic 1440p@30Hz "score" being used as a demonstration of PS5's capability at the end of 2021? Is the quoted Epic engineer's target of the same quality at 60FPS not true?
 
I wonder if this Nvidia paper (PRACTICAL REAL-TIME VOXEL-BASED GLOBAL ILLUMINATION FOR CURRENT GPUS) gives us a ballpark for extrapolating how the PC hardware measures up on Lumen/Nanite tech.


[Four slides from the Nvidia paper, referenced below as slides 1-4]

In the paper, we can see that the best-looking images (slide 1) need a GTX Titan with 2.5GB of VRAM, render at 1080p at a frame-rate just shy of 40fps (25.4ms per frame), trace a quarter or fewer of the UE5 demo's rays, and do so with models that have a fraction of the polygons (see slide 3) and texture detail (even at ultra settings).

The thing that got me thinking is that UE5 doesn't use the PS5's HW RT or the fixed-path rasterization, IIRC. So even if the latest RTX 2060s/3080s can scale through those multi-factor deficits of the older GTX Titan to match the PS5 image, the PS5 will still have GPU headroom to do much more, presumably, if some (or all?) of this is being done on the PS5's CPU async cores (with acceleration help from its IO complex). The logical rationale is that a current PC's CPU can't replace the compute of even a GTX Titan, never mind something that exceeds its capabilities.

Maybe Nvidia has found major algorithm improvements since then and this is a poor reference point, but I thought it was quite interesting to extrapolate from, and to marvel again at the visuals of UE5 at 1440p30 with 20M polys on screen per frame.

Is it known whether this Nvidia solution is all done within a single frame, or whether it is something that accumulates over several frames? UE5's Lumen lighting seems to spread the work over several frames and accumulate results, so it may not be comparable?
 

PaintTinJr

Member
Is it known whether this Nvidia solution is all done within a single frame, or whether it is something that accumulates over several frames? UE5's Lumen lighting seems to spread the work over several frames and accumulate results, so it may not be comparable?
AFAIK the clipmap is incrementally updated and only revoxelises as necessary when the camera moves (pages 17-22), so I think it is reasonably similar to the UE5 accumulation approach for comparison of processing demands - I could be completely wrong, though.
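Here's my rough reading of the incremental idea as a toy sketch (a 1D stand-in with my own names, not from the paper): when the camera moves, only the newly exposed slab of the clip region gets re-voxelised, rather than the whole volume.

Code:
// Toy 1D sketch of incremental clipmap updates (my own simplification of
// the idea, not the paper's code): shift the voxelised window with the
// camera and re-voxelise only the newly exposed region.
#include <cstdio>

struct Region { int x0, x1; };  // 1D stand-in for a 3D clip region

int main() {
    Region clip{0, 128};        // currently voxelised window
    int cameraDelta = 16;       // camera moved 16 voxels forward

    // The newly exposed slab is all that needs voxelising this frame.
    Region exposed{clip.x1, clip.x1 + cameraDelta};
    clip.x0 += cameraDelta;
    clip.x1 += cameraDelta;

    std::printf("re-voxelise %d of %d voxels (%.0f%%)\n",
                exposed.x1 - exposed.x0, clip.x1 - clip.x0,
                100.0 * (exposed.x1 - exposed.x0) / (clip.x1 - clip.x0));
}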
 

Corndog

Banned
Oh my goodness. People are not listening. Tim Sweeney said the UE5 engine will run on most platforms. That's their business after all: making an engine that can work on everything from a phone to a PC. But he also said that THIS particular demo, THIS ONE, was made specifically for the PS5 and took advantage of its strengths, particularly the super fast I/O that he said CANNOT be duplicated, even on the PC, at this time. Would that particular demo run on other platforms? If one has the code and the UE5 engine on their platform, SURE. Will there be differences? YES. This is simple logic, folks. I don't know why some are so threatened by the I/O and SSD speed issues that mean that particular demo at that particular quality can't be run elsewhere, at least right now (according to Tim Sweeney). Big deal! I'm sure they could have made a demo that took advantage of a PC being able to have 128GB of RAM or something else to show off whatever platform they chose. This demo had been meant to be playable on PS5 at GDC earlier this year. THAT IS ALL.
And I'm telling you it can most likely run on XSX and for sure on PC right now. If Epic would release more information we would know for sure, but they haven't.
 

Corndog

Banned
That's when we say XSX will use duplicates: if there are no duplicates on PS5, or more duplicates on XSX vs. fewer on PS5, you get 100GB on XSX vs. 30-70GB on PS5 for the same game, depending on how pacy the game is.
There is no reason to use duplicates on either machine. An SSD doesn't have a mechanical head like a regular hard drive, which is the reason to have duplicates.
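A quick back-of-the-envelope with ballpark latencies (rough assumptions; exact figures vary a lot by drive) shows why:

Code:
// Why duplication helps an HDD but buys an SSD almost nothing
// (ballpark latencies, assumed for illustration only).
#include <cstdio>

int main() {
    double hddSeekMs = 10.0;    // typical mechanical head seek
    double ssdReadMs = 0.1;     // rough NVMe random-read latency
    int scatteredReads = 100;   // assets scattered across the disk

    // Without duplication the HDD pays a seek per scattered asset,
    // which is why games duplicate data to keep reads contiguous:
    std::printf("HDD seek overhead: %.0f ms\n", scatteredReads * hddSeekMs); // 1000 ms
    // On an SSD the same scattered access pattern is nearly free:
    std::printf("SSD overhead: %.0f ms\n", scatteredReads * ssdReadMs);      // 10 ms
}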
 

Corndog

Banned
SFS is software. The hardware support is standard for RDNA2.
I.e. PS5 will have all the same features if they bother to write software for it.
But overall they will rely on game developers to use their own methods for texture streaming, which are more game-specific, and not the "one size fits all" MSFT approach. As usual.
I don't have the tweet, but I believe one of the Xbox engineers was asked this and said it was custom to Xbox, not part of RDNA. Will try to find it later.
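For what it's worth, the mechanism being argued over is simple enough to sketch (hypothetical helper types, not the actual D3D12 sampler feedback API): the GPU records which texture tiles were actually sampled last frame, and the streamer loads only the missing ones.

Code:
// Conceptual sketch of sampler-feedback-driven texture streaming
// (hypothetical types and data, not the real D3D12 API).
#include <cstdio>
#include <set>
#include <utility>

using Tile = std::pair<int, int>;  // (mip level, tile index)

int main() {
    // Pretend feedback recorded by the GPU during the last frame:
    std::set<Tile> sampledThisFrame = {{0, 42}, {0, 43}, {1, 10}};
    std::set<Tile> resident = {{1, 10}};  // tiles already in memory

    // Stream in only the tiles that were sampled but aren't resident.
    for (const Tile& t : sampledThisFrame)
        if (!resident.count(t)) {
            std::printf("stream in mip %d tile %d\n", t.first, t.second);
            resident.insert(t);
        }
}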
 

Sinthor

Gold Member

And I'm telling you it can most likely run on XSX and for sure on PC right now. If Epic would release more information we would know for sure, but they haven't.

I agree. Running and running "as is", though, are two different things. If he's saying this demo required I/O performance that even PCs cannot handle right now, it's very possible that the demo would have to be altered for those platforms. But, as you note, we don't have enough info to say for sure. Good times though!
 

Handy Fake

Member
There is no reason to use duplicates on either machine. An SSD doesn't have a mechanical head like a regular hard drive, which is the reason to have duplicates.
Just a quick question along those lines, and forgive me if I've gotten the wrong end of the stick...
But would that still mean that the XB would need to shift any files into the instantly accessible 100gb part of the SSD before (during) use?
Or have I misunderstood the "100gb" thing?
 

Corndog

Banned
MS patenting VRS is like me trying to patent my dick to fool women into thinking I've got the only penis on the planet.

"My patented form of Penis empowers me to efficiently utilize the full power my cock. Rather than wasting precious strokes thrusting in and out, my penis vibrates and expands once inside a female vagina. This allows for greater framerates and resolution than the PS5 which is using a narrow and fast pencil of a penis."
Sony patented a bottom-cooled heat sink. Guess it was just a waste of money, since you can already use a heat sink.
 
"Pretty good" is in quotation marks LOL

Question: Would this demo run on my PC with a RTX 2070 Super?

Answer: Yes, "pretty good" (Libreri).


Are you better than pcgamer.com?

It doesn’t say anything about IO requirements, though. That a 2070 Super GPU could render the detail on screen “pretty good” isn’t really controversial or in doubt, is it?
Did I dream it, or was there another interview where he said the demo could be run on even two-year-old Android devices if scaled down sufficiently, and that their target is for the engine to do all that automatically for the content creator?
 

ToadMan

Member
Just a quick question along those lines, and forgive me if I've gotten the wrong end of the stick...
But would that still mean that the XB would need to shift any files into the instantly accessible 100gb part of the SSD before (during) use?
Or have I misunderstood the "100gb" thing?

The 100GB virtual memory is configured/set per app/game - it's not a single shared part of the SSD.

What happens if a game is bigger than 100GB? I guess the extra data is accessed with a file-system lookup, so access is a bit slower.
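As a speculative sketch of that reading (hypothetical names and layout, not anything MS has published): assets inside the per-title 100GB window get the direct fast path, and anything beyond it falls back to a normal file-system lookup first.

Code:
// Speculative sketch (my assumption, hypothetical names) of a per-title
// 100GB directly addressable window with a slower file-system fallback.
#include <cstdint>
#include <cstdio>

constexpr uint64_t kWindowBytes = 100ull * 1024 * 1024 * 1024;  // 100GB

void readAsset(uint64_t packageOffset) {
    if (packageOffset < kWindowBytes)
        std::printf("fast path: direct virtual-memory-style access\n");
    else
        std::printf("slow path: file-system lookup first\n");
}

int main() {
    readAsset(4ull * 1024 * 1024 * 1024);    // inside the 100GB window
    readAsset(120ull * 1024 * 1024 * 1024);  // beyond it
}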
 

rnlval

Member
It doesn’t say anything about IO requirements, though. That a 2070 Super GPU could render the detail on screen “pretty good” isn’t really controversial or in doubt, is it?
Did I dream it, or was there another interview where he said the demo could be run on even two-year-old Android devices if scaled down sufficiently, and that their target is for the engine to do all that automatically for the content creator?
The net result: this demo runs "pretty good" on an RTX 2070 Super.
 