
Tim Sweeney on the Tech Demo: "Nanite and Lumen tech powering it will be fully supported on both PS5 and Xbox Series X and will be awesome on both."

Use Google to translate or visit the sites I mentioned earlier.

It's much easier to Google or do your own research than blindly rely on your comrades to post what you would prefer to hear. Check it out yourself, and if you can't dispute anything I said, there's a reason why. This is straight from the guys who made the demo possible. Hands-on with the engine, etc.
Cut the console warring BS ;)

I asked you for sources for your reply to my post. Don't have any?

Move on!
 

DeepEnigma

Gold Member
I'm just waiting for some dev to utter that magic word for their upcoming XsX/PS5 multiplatform title, and they will instantly be labelled a Sony / Microsoft shill for the rest of the generation.

P A R I T Y :messenger_weary: :messenger_ok:

Ubisoft has the title from current gen. “To avoid debates and stuff”.

That Streisand effect.
 
Cut the console warring BS ;)

I asked you for sources for your reply to my post. Don't have any?

Move on!
I just gave you the ultimate source, lol! From the horse's mouth. I doubt you watched an hour-and-a-half-long video between reading my post and responding. No console warring here, just replying to someone who thinks there's no such thing as a multiplatform engine running on different hardware. Watch the video. Quote me in an hour and a half if I'm lying. Trust me on this.
 
I just gave you the ultimate source, lol! From the horse's mouth. I doubt you watched an hour-and-a-half-long video between reading my post and responding. No console warring here, just replying to someone who thinks there's no such thing as a multiplatform engine running on different hardware. Watch the video. Quote me in an hour and a half if I'm lying. Trust me on this.
Your "source" is 2.18GB of a .rar file lol

Is that file the footage of the PS5 demo gameplay running on a Chinese laptop, as you have claimed?

If not, I am very aware of the BS Chinese translations that have been circulating since yesterday. I have read all of them, and I don't believe any of that BS pushed by the Era Discord group.

I am not saying you are lying, you can believe what you want to; I'm saying I don't believe that shit, or anything that is not official about this.

Tim Sweeney, who maybe knows a little more than you, has shut down all that crap. Sorry man, nothing personal against you.
 
Your "source" is 2.18GB of a .rar file lol

Is that file the footage of the PS5 demo gameplay running on a Chinese laptop, as you have claimed?

If not, I am very aware of the BS Chinese translations that have been circulating since yesterday. I have read all of them, and I don't believe any of that BS pushed by the Era Discord group.

I am not saying you are lying; I'm saying I don't believe that shit, or anything that is not official about this.

Tim Sweeney, who maybe knows a little more than you, has shut down all that crap. Sorry man, nothing personal against you.
You laugh at my source, which contains literally everything you are complaining about. I'm confused?


Use Google Translate, or go to the Chinese site I mentioned earlier, to get the real translations, and not some stupid made-up fanboy shit.


I promise you can debunk everything, right now, by simply doing what I said. You can speculate all you want, and deny whatever you want. What you are questioning is answered by the actual team that made the demo. They have no reason to lie. The only reason the video was pulled is that it broke the NDA by discussing the performance of other hardware. Only the PS5 could be mentioned, which is where they slipped up.

Feel free to quote me if you actually watch the video. If you can't do that, there's nothing more to be said about the topic.
 

Bogroll

Likes moldy games
Oh, I already figured it has nothing to prove to you. Not sure about the claim that it is the same for “most”, or why one has everything to prove and the other has nothing to prove beyond console partisanship, though.
I was implying Xbox has nothing to prove, as in the difference will be mainly resolution, and I think most will agree going off the specs.
It's the SSD this, SSD that, Tempest sound, TFs don't matter, and how is that boost mode going to work; but will it be a work of genius, providing us with a quiet, low-energy console? Yes, there have been Xbox threads, but nowhere near as many as PS5 ones.
See it as you will that being curious about how the PS5 is going to perform, when it comes down to it, is console preference. You might believe what you hear and read on here; I like to see it first.
And of course Xbox has everything to prove about games.
 

Panajev2001a

GAF's Pleasant Genius
I was implying Xbox has nothing to prove, as in the difference will be mainly resolution, and I think most will agree going off the specs.

That I can agree with, but both consoles have to prove the marketing PR can meet the specs, that the tradeoffs they made (think RAM setup) do not impact effective performance, and that there is a noticeable jump for that 18% performance delta between the consoles. Or is it an "as long as my marketing is claiming a bigger number than your marketing, I am happy" kind of scenario?!

Not sure why we should take SSD and audio claims by Sony as attention grabbing BS that does not work as well in practice and treat the specs thrown out for XSX as gospel instead of say “theoretical peaks that can only be achieved in particular synthetic benchmarks but never even close in most games”.

It's the SSD this, SSD that, Tempest sound, TFs don't matter, and how is that boost mode going to work; but will it be a work of genius, providing us with a quiet, low-energy console

So TFLOPS this, TFLOPS that, nothing else matters ;)? By the way, Sony never made a "TFLOPS do not matter" statement; they just arrived close enough to the other box's numbers using a completely different approach, which has some benefits and some tradeoffs.
Tempest Engine has everything to prove, but the laundry list of marketing PR names here ( https://www.google.co.uk/amp/s/www....lfoundry-2020-inside-xbox-series-x-full-specs ) and claims are taken for granted. DLI actual results, Machine Learning in the GPU, DirectStorage and SSD access (SSD this, SSD that on XSX too, eh?), SFS and its 2-3x physical memory expansion and I/O bandwidth improvement (as if it was a generic multiplier over the best PRT can get on RDNA1/2 GPUs), etc...

I understand taking both manufacturers' words with a pinch of salt, more or less, or calling BS on both, but one having everything to prove and the other not? Sorry, that is way too partisan for me, unless there are reasons in the consoles' history, or in what they have said, to make one reasonably believe that.

And of course Xbox has everything to prove about games,

I think both do to a point, but so far one has given me at the start, in the middle, and the end of the generation more of what I wanted and the platform needed. One also still believes in console generations... which is what I also believe in.
 

Bogroll

Likes moldy games
That I can agree with, but both consoles have to prove the marketing PR can meet the specs, that the tradeoffs they made (think RAM setup) do not impact effective performance, and that there is a noticeable jump for that 18% performance delta between the consoles. Or is it an "as long as my marketing is claiming a bigger number than your marketing, I am happy" kind of scenario?!

Not sure why we should take SSD and audio claims by Sony as attention grabbing BS that does not work as well in practice and treat the specs thrown out for XSX as gospel instead of say “theoretical peaks that can only be achieved in particular synthetic benchmarks but never even close in most games”.



So TFLOPS this, TFLOPS that, nothing else matters ;)? By the way, Sony never made a "TFLOPS do not matter" statement; they just arrived close enough to the other box's numbers using a completely different approach, which has some benefits and some tradeoffs.
Tempest Engine has everything to prove, but the laundry list of marketing PR names here ( https://www.google.co.uk/amp/s/www....lfoundry-2020-inside-xbox-series-x-full-specs ) and claims are taken for granted. DLI actual results, Machine Learning in the GPU, DirectStorage and SSD access (SSD this, SSD that on XSX too, eh?), SFS and its 2-3x physical memory expansion and I/O bandwidth improvement (as if it was a generic multiplier over the best PRT can get on RDNA1/2 GPUs), etc...

I understand taking both manufacturers' words with a pinch of salt, more or less, or calling BS on both, but one having everything to prove and the other not? Sorry, that is way too partisan for me, unless there are reasons in the consoles' history, or in what they have said, to make one reasonably believe that.



I think both do to a point, but so far one has given me at the start, in the middle, and the end of the generation more of what I wanted and the platform needed. One also still believes in console generations... which is what I also believe in.
Like I said, I'd like to see it to believe it, and am I expecting my TV to be transformed? No. I've bought plenty of TVs in the past with claims about sound. If it works and I'm wowed, great, and the same for all the other stuff. You swallow it up; I want to see and hear it.
 

Panajev2001a

GAF's Pleasant Genius
Like I said, I'd like to see it to believe it, and am I expecting my TV to be transformed? No. I've bought plenty of TVs in the past with claims about sound. If it works and I'm wowed, great, and the same for all the other stuff. You swallow it up; I want to see and hear it.

Lol, nice dig there mate. Whether you expect it or not (I have experience with my own setup before and after calibration, and positional audio was a great change), and even if you do not buy into the presence they say they can derive from audio processing (despite the way they clarified it: the maths around it for headphones, which they said is priority number one, and what they said progressively about TV speakers, then surround sound systems, and people's own ears), you still get the advantages of doing 3D audio with a separate, powerful chip that can also be used for general-purpose computations, worth around 0.1 TFLOPS or so (roughly 8 Jaguar PS4 cores).

You built up a straw man about the TV being easily transformed at launch, linked it to unrelated TV audio processing claims, and connected the two somehow, all to cast uncertainty and doubt on positional audio and presence, which they clearly made an honest case about; you can see the root of it in binaural audio processing, which works and was designed for headphones.
 

ToadMan

Member
GPU workloads are inherently parallelisable because you are working on millions of pixels at a time. That's why GPU manufacturers are able to scale performance by increasing the number of compute units.

Indeed, but not at the expense of clock.

Xsex has 40% more CUs running 20% slower than PS5.

There are examples in the GPU world of lower-TFLOP GPUs outmatching higher-TFLOP GPUs - sometimes because of more CUs and sometimes because of higher clocks.

PS4 had 50% more CUs than xb1, but they were only running 7% slower than xb1's. Even so, the PS4 didn't blow xb1 out of the water.

Xsex has 40% more CUs running 20% slower - less of an advantage than PS4 had over xb1. Hence my feeling that it is not clear the Xsex has an advantage in practical terms when it comes to multiplats.
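To put rough numbers on that CU/clock trade-off, here is a quick back-of-envelope sketch (peak FP32 TFLOPS for GCN/RDNA-style GPUs is CUs x 64 shader lanes x 2 ops per clock x frequency; the CU counts and clocks below are the publicly stated specs):

```python
# Peak FP32 throughput: CUs * 64 shader lanes * 2 ops/clock (FMA) * clock (GHz) / 1000
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

xb1 = tflops(12, 0.853)   # ~1.31 TF
ps4 = tflops(18, 0.800)   # ~1.84 TF
ps5 = tflops(36, 2.230)   # ~10.28 TF
xsx = tflops(52, 1.825)   # ~12.15 TF

# PS4 held a ~40% paper advantage over XB1; XSX holds ~18% over PS5
print(f"PS4 vs XB1: +{(ps4 / xb1 - 1) * 100:.0f}%")
print(f"XSX vs PS5: +{(xsx / ps5 - 1) * 100:.0f}%")
```

The CU deficit and the clock advantage partially cancel on paper, which is the poster's point: the XSX's paper advantage over the PS5 is noticeably smaller than the one the PS4 held over the XB1.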
 

Bogroll

Likes moldy games
Lol, nice dig there mate. Whether you expect it or not (I have experience with my own setup before and after calibration, and positional audio was a great change), and even if you do not buy into the presence they say they can derive from audio processing (despite the way they clarified it: the maths around it for headphones, which they said is priority number one, and what they said progressively about TV speakers, then surround sound systems, and people's own ears), you still get the advantages of doing 3D audio with a separate, powerful chip that can also be used for general-purpose computations, worth around 0.1 TFLOPS or so (roughly 8 Jaguar PS4 cores).

You built up a straw man about the TV being easily transformed at launch, linked it to unrelated TV audio processing claims, and connected the two somehow, all to cast uncertainty and doubt on positional audio and presence, which they clearly made an honest case about; you can see the root of it in binaural audio processing, which works and was designed for headphones.
No, Mark Cerny mentioned the TV sound in his presentation; you brought up the sound. I'm not digging at you. I hope I'm really surprised and impressed; by the sound of you, you're expecting it to be the bee's knees. You can only be disappointed if everything is not all it claims to be. Let's see.
 

Panajev2001a

GAF's Pleasant Genius
No, Mark Cerny mentioned the TV sound in his presentation; you brought up the sound. I'm not digging at you. I hope I'm really surprised and impressed; by the sound of you, you're expecting it to be the bee's knees. You can only be disappointed if everything is not all it claims to be. Let's see.

I remember what he said, and he specifically mentioned it is hard; possible, but hard. Harder than headphones, as those follow you, but easier than surround sound, as you can make some assumptions about the viewer/listener position. What he said does not sound unrealistic; even basic Audyssey calibration of your speaker setup, with the cheap microphone they give you with your AVR, can make a big difference to your movie listening experience (good ol' trusty Denon AVR-X2300W).

Do not forget Sony acquired Audiokinetic, the biggest name in audio middleware (makers of Wwise), early last year, and they had been researching positional audio for PSVR for many years too, so their bullish position on this is not a last-minute, marketing-driven PR move without any grounding in reality.

Positional audio / 3D audio has been the topic of a lot of research over the years (it is curious how a 1D source like sound can be interpreted by the brain, through your ears, as having positional information; that can only be inferred from the stream of samples you get and the magnitude of each, as each sample essentially lacks other data).
 
Assuming the Epic China panel all knew exactly what they were talking about, my takeaway from the removed video is that not only will UE5 be awesome when released, but the demo we've seen is but a taste of things to come.

The loudest dude in the panel said and implied a few things:

1. They are aiming for 60fps with visual quality as shown in that demo on next gen consoles and UE5 isn't out now because they haven't reached that point yet.

2. He strongly implies that progress is being made and the above target of 60fps is not merely wishful thinking; says they can browse the scene in the cave (where Lumen was first shown off) at 40fps on their notebook, with uncooked assets (I suppose that means unoptimised assets).

3. In his estimation, two triangles per pixel at 2K resolution is "not really that much" and Nanite doesn't need the high speed SSD solution by Sony to pull that flying scene off. In other words, a similar scene could probably be made to be even more complex in future.

Some points mentioned by others on the panel:

1. Niagara is expected to be very useful in the film and animation industry, especially for visual FX. One of the guys said he was quite impressed at how Niagara was used to make the puddle, the keyword is "quite". I think he knows it's not the best looking (and behaving) water...yet. He also said some other cool things about Niagara and its potential application in games but I couldn't really understand the examples he gave.

2. Lumen is expected to be applied in conjunction with other lighting techniques. One of them said that unlike GPU-based RT, you can selectively apply Lumen to specific aspects. If I'm not misunderstanding what was said, the explanation was that GPU-RT can either be fully on or off, and sometimes that results in the scene not being able to run at a minimum of 30fps. Lumen was described as being very flexible.

Personally, this means Fortnite will look and (probably) run better, so I'm hyped :D

Edit: Changed "image quality" to "visual quality".

Edit 2: Made some additions and changes to point 2 in the upper section.
 

ethomaz

Banned
Assuming the Epic China panel all knew exactly what they were talking about, my takeaway from the removed video is that not only will UE5 be awesome when released, but the demo we've seen is but a taste of things to come.

The loudest dude in the panel said and implied a few things:

1. They are aiming for 60fps with visual quality as shown in that demo on next gen consoles and UE5 isn't out now because they haven't reached that point yet.

2. He strongly implies that progress is being made and the above target is not merely wishful thinking; ~40fps on their notebook, with uncooked assets (I suppose that means unoptimised assets).

3. In his estimation, two triangles per pixel at 2K resolution is "not really that much" and Nanite doesn't need the high speed SSD solution by Sony to pull that flying scene off. In other words, a similar scene could probably be made to be even more complex in future.

Some points mentioned by others on the panel:

1. Niagara is expected to be very useful in the film and animation industry, especially for visual FX. One of the guys said he was quite impressed at how Niagara was used to make the puddle, the keyword is "quite". I think he knows it's not the best looking (and behaving) water...yet. He also said some other cool things about Niagara and its potential application in games but I couldn't really understand the examples he gave.

2. Lumen is expected to be applied in conjunction with other lighting techniques. One of them said that unlike GPU-based RT, you can selectively apply Lumen to specific aspects. If I'm not misunderstanding what was said, the explanation was that GPU-RT can either be fully on or off, and sometimes that results in the scene not being able to run at a minimum of 30fps. Lumen was described as being very flexible.

Personally, this means Fortnite will look and (probably) run better, so I'm hyped :D

Edit: Changed "image quality" to "visual quality".
1. Yeap. The creator of Lumen said the target is 60fps at the PS5 demo's quality on next-gen consoles... he said they could have run the demo on the PS5 at 60fps if they dropped the quality.

2. He said he ran it on his notebook with an RTX 2070-level GPU (a much weaker GPU than the PS5's or the new Xbox's), but never said at which settings (resolution, triangles per pixel, etc.), and said he didn't need an SSD like the PS5's, just a 970 EVO Pro. Tim said on Twitter the PS5 demo ran at a Vsync-capped 30fps, so that means it is already running above 30fps uncapped.

3. It was an example: if you drop to 2K resolution, 1-2 triangles per pixel (the PS5 demo seems to be using avg. 5-6 triangles per pixel) and some vertex compression, you don't need high-speed SSDs like the PS5's... you can even run it on SATA SSDs at a few hundred MB/s (SATA III's 6Gb/s works out to roughly 600MB/s in practice).

What is Niagara? Do you mean Nanite?

Lumen is the more GPU-intensive of the two techs, and it is a hybrid approach (it uses software ray-tracing for some tasks)... Epic said they intend to use more ray-tracing and add hardware-acceleration support to Lumen.
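For scale, the triangle-density figures being thrown around translate into per-frame counts like this (a back-of-envelope sketch; the per-pixel densities are the ones quoted above, and "2K" is taken to mean 2560x1440):

```python
# Visible-triangle counts implied by the quoted per-pixel densities
pixels_1440p = 2560 * 1440            # "2K" / 1440p pixel count

demo_low  = pixels_1440p * 2          # ~2 triangles per pixel
demo_high = pixels_1440p * 6          # upper end of the quoted PS5 estimate

print(f"2 tris/pixel: {demo_low  / 1e6:.1f}M triangles on screen")
print(f"6 tris/pixel: {demo_high / 1e6:.1f}M triangles on screen")
```

That is roughly 7.4M versus 22M on-screen triangles per frame, which gives a sense of why "2 triangles per pixel at 2K" could be called "not really that much" relative to the demo.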
 
2. He said he ran it on his notebook with an RTX 2070-level GPU (a much weaker GPU than the PS5's or the new Xbox's), but never said at which settings (resolution, triangles per pixel, etc.), and said he didn't need an SSD like the PS5's, just a 970 EVO Pro. Tim said on Twitter the PS5 demo ran at a Vsync-capped 30fps, so that means it is already running above 30fps uncapped.
Did he actually give the notebook specs? Do you have a source for that? All I've heard from Chinese speakers is that he didn't give notebook specs.

There were rumors of a 2080 and a 970 EVO from some random forum post by someone claiming to have called the engineer.
 

killatopak

Member
1. Yeap. The creator of Lumen said the target is 60fps at the PS5 demo's quality on next-gen consoles... he said they could have run the demo on the PS5 at 60fps if they dropped the quality.

2. He said he ran it on his notebook with an RTX 2070-level GPU (a much weaker GPU than the PS5's or the new Xbox's), but never said at which settings (resolution, triangles per pixel, etc.), and said he didn't need an SSD like the PS5's, just a 970 EVO Pro. Tim said on Twitter the PS5 demo ran at a Vsync-capped 30fps, so that means it is already running above 30fps uncapped.

3. It was an example: if you drop to 2K resolution, 1-2 triangles per pixel (the PS5 demo seems to be using avg. 5-6 triangles per pixel) and some vertex compression, you don't need high-speed SSDs like the PS5's... you can even run it on SATA SSDs at a few hundred MB/s (SATA III's 6Gb/s works out to roughly 600MB/s in practice).

What is Niagara? Do you mean Nanite?

Lumen is the more GPU-intensive of the two techs, and it is a hybrid approach (it uses software ray-tracing for some tasks)... Epic said they intend to use more ray-tracing and add hardware-acceleration support to Lumen.
Didn’t Tim Sweeney dispute that already and said the laptop was running a video and not the actual demo?
 

ethomaz

Banned
Didn’t Tim Sweeney dispute that already and said the laptop was running a video and not the actual demo?
Yeap.
But the guy didn't say it was running the demo at the live stream.
He was talking about when he ran it on his notebook.
Tim also said he can't understand what they are saying (language barrier).

Another hint: he said he ran just the opening part of the demo on his notebook... the video of the demo is shown in full more than once.
 

ethomaz

Banned
Did he actually give the notebook specs? you have a source for that? All I've heard from chinese speakers is that he didn't give notebook specs.

There were rumors of a 2080 and 970 evo from some random forum post of someone claiming to have called the engineer.
It is a 2080 Max-Q; that is why I said 2070 level.
The Max-Q is a mobile version with only 6.5TFs... the 2070 has 7.5TFs.

Somebody asked him in the forum.
He didn't give specs at the live stream.
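The "6.5 vs 7.5 TF" comparison can be reproduced from Nvidia's published core counts and boost clocks (peak FP32 = CUDA cores x 2 ops per clock x boost frequency; the clocks below are the nominal boost specs, and actual laptop clocks vary):

```python
# Peak FP32 TFLOPS = CUDA cores * 2 ops/clock (FMA) * boost clock (GHz) / 1000
def tflops(cuda_cores: int, boost_ghz: float) -> float:
    return cuda_cores * 2 * boost_ghz / 1000

rtx_2080_maxq = tflops(2944, 1.095)   # ~6.4 TF at the nominal Max-Q boost clock
rtx_2070      = tflops(2304, 1.620)   # ~7.5 TF

print(f"RTX 2080 Max-Q: {rtx_2080_maxq:.1f} TF")
print(f"RTX 2070:       {rtx_2070:.1f} TF")
```

So despite the "2080" name, the Max-Q variant's lower clocks put its paper throughput below a desktop 2070, which is the poster's point.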
 
1. Yeap. The creator of Lumen said the target is 60fps at the PS5 demo's quality on next-gen consoles... he said they could have run the demo on the PS5 at 60fps if they dropped the quality.

2. He said he ran it on his notebook with an RTX 2070-level GPU (a much weaker GPU than the PS5's or the new Xbox's), but never said at which settings (resolution, triangles per pixel, etc.), and said he didn't need an SSD like the PS5's, just a 970 EVO Pro. Tim said on Twitter the PS5 demo ran at a Vsync-capped 30fps, so that means it is already running above 30fps uncapped.

3. It was an example: if you drop to 2K resolution, 1-2 triangles per pixel (the PS5 demo seems to be using avg. 5-6 triangles per pixel) and some vertex compression, you don't need high-speed SSDs like the PS5's... you can even run it on SATA SSDs at a few hundred MB/s (SATA III's 6Gb/s works out to roughly 600MB/s in practice).

What is Niagara? Do you mean Nanite?

Lumen is the more GPU-intensive of the two techs, and it is a hybrid approach (it uses software ray-tracing for some tasks)... Epic said they intend to use more ray-tracing and add hardware-acceleration support to Lumen.
Yea, I don't doubt they could go for 60fps by cutting back in some other areas but their ideal target is 60fps at the visual quality shown in the demo, so the demo can be considered WIP and UE5 should perform even better when they release it.

Niagara is their new system governing particles and how they interact with one another. The water, the bats and the bugs were all powered by Niagara.

Yup, in that video one of the dudes said Lumen is more resource-hungry than Nanite. I don't know much else about Lumen, but the guys seemed to be pretty excited about it; in the scene with all the statues, the guy was talking about the lighting and mentioned how there were translucencies, diffuse and specular lighting. Perhaps it's because of how that lighting interacts with all the triangles in the scene.
 

EverydayBeast

thinks Halo Infinite is a new graphical benchmark
Deep down now every Steam owner knows Steam isn't in Epic's league, gamers from the early 2000s wouldn't want Steam.
 
It is a 2080 Max-Q; that is why I said 2070 level.
The Max-Q is a mobile version with only 6.5TFs... the 2070 has 7.5TFs.

Somebody asked him in the forum.
He didn’t give specs at the live stream.
The forum quote is unreliable; we don't know for sure whether the member actually contacted him or made it up. (What I heard is not that someone asked him in the forum, but that someone in a forum claimed to have called him.) And I think the forum post only said 2080. There is a 2080 Mobile and a 2080 Max-Q; the 2080 Mobile is nearly as powerful as the desktop 2080.
 

DeepEnigma

Gold Member
But won't you not need DRAM if it (the controller) is sandwiched with the SoC (let's say on the other side of it on the mainboard), communicating directly with the GDDR6?

Some are speculating that is the design, which fits the cooling patent of a heatsink on both sides of the mainboard sandwiching it all together. Almost 3D-stacked, in a way. And not unlike the CoC SiD design in the Vita, for which Cerny was the lead architect as well.

It does have on-chip SRAM, which is faster than DRAM and doesn't need to be constantly refreshed:
[Image: block diagram from "Understanding the PS5's SSD: deep dive into next-gen storage tech"]

And of course you ignore the post. ;)
 
I remember what he said, and he specifically mentioned it is hard; possible, but hard. Harder than headphones, as those follow you, but easier than surround sound, as you can make some assumptions about the viewer/listener position. What he said does not sound unrealistic; even basic Audyssey calibration of your speaker setup, with the cheap microphone they give you with your AVR, can make a big difference to your movie listening experience (good ol' trusty Denon AVR-X2300W).

Do not forget Sony acquired Audiokinetic, the biggest name in audio middleware (makers of Wwise), early last year, and they had been researching positional audio for PSVR for many years too, so their bullish position on this is not a last-minute, marketing-driven PR move without any grounding in reality.

Positional audio / 3D audio has been the topic of a lot of research over the years (it is curious how a 1D source like sound can be interpreted by the brain, through your ears, as having positional information; that can only be inferred from the stream of samples you get and the magnitude of each, as each sample essentially lacks other data).
It's hard because most consumers probably don't have an acoustically treated room for audio. Typical rooms have hard surfaces, which create more unwanted reflections in the sound, and that messes with our brain's perception of the sounds.

Headphones bypass that more or less, though it seems Sony wants to take it a step further by taking the flaps of our ears into account as well.

Also, I suppose sound is 1D if you have only one working ear. Because both ears receive a sound in the real world at different times, the brain is able to compute that difference to perceive its location, kinda like how two eyes are needed for depth perception.

Ultimately, it's all lots of money and research being put into figuring out how to trick the brain into perceiving what they want us to hear. If you're interested, the study of such things is called psychoacoustics.
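The time-difference cue mentioned above is tiny but measurable; a common textbook sketch is Woodworth's spherical-head model of interaural time difference (the head radius and the formula itself are simplifying assumptions from the psychoacoustics literature):

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at ~20 C
HEAD_RADIUS = 0.0875     # metres, an assumed average head radius

def itd_us(azimuth_deg: float) -> float:
    """Woodworth ITD in microseconds: r * (theta + sin(theta)) / c."""
    theta = math.radians(azimuth_deg)
    return HEAD_RADIUS * (theta + math.sin(theta)) / SPEED_OF_SOUND * 1e6

for deg in (0, 30, 60, 90):
    print(f"source at {deg:2d} deg -> ITD of about {itd_us(deg):3.0f} microseconds")
```

A source directly to one side arrives at the far ear only about 650 microseconds later, yet the brain resolves differences far smaller than that, which is why headphone-based binaural rendering can place sounds so convincingly.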
 

ethomaz

Banned
The forum quote is unreliable; we don't know for sure whether the member actually contacted him or made it up. (What I heard is not that someone asked him in the forum, but that someone in a forum claimed to have called him.) And I think the forum post only said 2080. There is a 2080 Mobile and a 2080 Max-Q; the 2080 Mobile is nearly as powerful as the desktop 2080.
Yeap, the only reference to the notebook specs is that forum post:


Nobody talked about it on the stream.

The laptop ran the video of the PS5. At another moment the engineer said it ran on his laptop at 40fps in his editor; not sure if he actually ran it in the stream, or before, while he was at work.
I don't believe he ran it at the live stream... there is no demo shown except the video of the PS5 demo... now, if he did show it off live on stream, that is another story.
I believe it is more probable he ran it on his notebook at home/work... the notebook shown in the live stream was probably not his notebook, but the stream owner's, used to show stuff on the background screen.
 
Yeap, the only reference to the notebook specs is that forum post:

[Translated title:] "With this UE5 demo, what stands out most is that the limit on polygon usage has practically been lifted" - 游戏业界综合讨论区 - TGFC Lifestyle
Nobody talked about it on the stream.
What's interesting is that the vast majority of 2080 notebooks are only 1080p. I mean, even $6,000+ 2080 laptops are only 1080p.

There appear to be no 1440p models, and only two or three capable of 4K, as far as I could find.

To me this suggests a very good likelihood that the laptop ran it at 1080p.

edit:
If it truly was 1080p, that bodes extremely well for the PS5: assuming it wasn't a Max-Q, the PS5 ran frame-capped and managed 1440p. If the uncapped rate was similar and it is pushing about 1.78x the pixels (1440p is roughly 1.78x the pixels of 1080p, not quite double), and if this was a 2080 and not a Max-Q, the PS5 would have nearly twice the performance.

Keep in mind, IIRC, the desktop 2080 is about 10% faster than the mobile 2080, and the 2080 Ti is about 30% faster than a 2080 ("Under 4K resolution, the RTX 2080 Ti outperformed the standard model by 29 percent" - Digital Trends).

But if the PS5 really is close to 100% faster than a 2080 here, then it is even faster than a 2080 Ti in this demo. Significantly faster.
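A quick check on the pixel counts behind that comparison (common display resolutions only; note 1440p is about 1.78x the pixels of 1080p, not quite double):

```python
# Pixel counts for the resolutions discussed above
res = {
    "1080p": 1920 * 1080,
    "1440p": 2560 * 1440,
    "4K":    3840 * 2160,
}

print(f"1440p / 1080p: {res['1440p'] / res['1080p']:.2f}x")  # 1.78x
print(f"4K    / 1080p: {res['4K']    / res['1080p']:.2f}x")  # 4.00x
```

So a like-for-like frame-rate at 1440p versus 1080p implies roughly 1.8x the pixel throughput, all else being equal, which is a very rough proxy and ignores settings differences.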
 

Panajev2001a

GAF's Pleasant Genius
It's hard because most consumers probably don't have an acoustically treated room for audio. Typical rooms have hard surfaces, which create more unwanted reflections in the sound, and that messes with our brain's perception of the sounds.

Headphones bypass that more or less, though it seems Sony wants to take it a step further by taking the flaps of our ears into account as well.

Also, I suppose sound is 1D if you have only one working ear. Because both ears receive a sound in the real world at different times, the brain is able to compute that difference to perceive its location, kinda like how two eyes are needed for depth perception.

Ultimately, it's all lots of money and research being put into figuring out how to trick the brain into perceiving what they want us to hear. If you're interested, the study of such things is called psychoacoustics.

Thanks for the informative post; good point, which I completely missed, about the controlled sound environment headphones provide (in terms of reflections and brain perception).

About the sound being 1D: I was just saying that having two ears, and/or being able to re-orientate your sensors (think dogs or cats), does not change that the signal is still essentially one-dimensional. Having two ears means sampling from two points; using the known distance between the ears and the different times the sound hits each ear must help your brain infer more positional data (and variation of pitch, as the sound source may move, plays into this as well).
The way I am oversimplifying it for myself is akin to how stereoscopy uses two different camera images, aligned in a certain known way, to infer depth data: each sampled image is still 2D.

The brain is a quite funny thing indeed (not an expert here, of course), as it takes some cues to infer part of what we understand as depth that are not based on stereoscopy (so even someone with a single eye does not have zero depth perception), and small jittering of the eye contributes to us perceiving images at an effectively higher level of detail/resolution than a single sample provides (trying to look up a reference for that, sorry, going by memory).
 
Last edited:
Thanks for the informative post, good point I completely missed about the controlled sound environment the headphones provide (in terms of reflections and brain perception).

About the sound being 1D, I was just saying that having two ears and/or being able to re-orient your sensors (think dogs or cats) does not change that the signal is still essentially one dimensional.

Having two ears means sampling from two points; the known distance between the ears and the different times the sound hits each ear must help your brain infer more positional data (variation in pitch as the sound source moves may play into this as well).
The way I oversimplify it for myself is akin to stereoscopy: two camera images aligned in a certain known way are used to infer depth data of the surface, while each sampled image is still 2D.

The brain is indeed a quite funny thing (not an expert here, of course): it takes some cues to infer part of what we understand as depth that are not based on stereoscopy (so even someone with a single eye does not drop to zero depth perception), and small jittering of the eye contributes to us perceiving images at an effectively higher level of detail/resolution than a single sample provides (trying to look up a reference for that, sorry, going by memory).
You're giving my post too much credit; your anecdote about the Audyssey calibration is probably (I'm unsure, because it's my first time hearing of this product) alluding to that: taking the reflections of the room you're in and compensating for their influence on the sound your speakers will produce.

In all honesty I'm not fully grasping the idea of sound being 1D, but yea I like the stereoscopy comparison. Some neat tricks people came up with haha!

Not an expert on the brain here either, and considering the Dunning-Kruger effect, I think those whose jobs revolve around studying it would say there's so much more for them to discover.

I have a suspicion that perceived depth even with one eye is a result of past experience as well as having slight movement IRL.

In complete stillness, I think depth becomes exceedingly difficult to perceive with just one eye. Also, if it's, say, a natural landscape, then the way light filters through the air also gives some depth information (colours of things far away are more muted, and I'm sure modern lighting techniques replicate this real-world phenomenon too, though I'm not savvy enough to know the industry name for it).
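For what it's worth, the real-world effect described here is usually called aerial (or atmospheric) perspective, and real-time renderers commonly approximate it with distance fog. A minimal sketch with made-up parameter values, not any particular engine's API:

```python
import math

def distance_fog(color, fog_color, distance, density=0.05):
    """Exponential distance fog: blend a surface colour toward a muted
    atmosphere colour as distance grows, so far objects look washed
    out, reproducing the depth cue described above."""
    f = math.exp(-density * distance)  # 1.0 at the camera, -> 0.0 far away
    return tuple(f * c + (1.0 - f) * fc for c, fc in zip(color, fog_color))
```

Nearby surfaces keep their own colour, while distant ones converge on the fog colour, which is exactly the "muted far-away colours" observation.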

I seem to remember reading a bit about the eye jitter; it's some absurdly high vibration rate, isn't it? The human body sure is high tech. And if you remember the link, just tag me in an edit :D
 
What's interesting is that the vast majority of 2080 notebooks are only 1080p. I mean, even $6,000+ 2080 laptops are only 1080p.

There appear to be no 1440p models, and only 2 or 3 capable of 4K as far as I could find.

To me this suggests that there is a very good likelihood that the laptop ran it at 1080p.

edit:
If it truly was 1080p, that bodes extremely well for the PS5: assuming it wasn't a Max-Q, the PS5 ran at a capped rate and managed 1440p. If the uncapped rate was similar while pushing roughly 1.78x the pixels (1440p vs 1080p), and if this was a full 2080 rather than a Max-Q, the PS5 would have roughly 1.78x the performance.


Keep in mind, IIRC the desktop 2080 is about 10% faster than the mobile 2080, and a 2080 Ti is about 30% faster than a 2080.


But if the PS5 is truly ~1.78x faster than a 2080 at this demo, then it is even faster than a 2080 Ti here. Significantly faster.
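The pixel arithmetic behind these comparisons, as a quick check (resolution names only; this makes no claims about the actual hardware in question):

```python
# Pixel counts for the resolutions discussed in the thread.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def pixel_ratio(a, b):
    """How many times more pixels resolution `a` pushes than `b`."""
    wa, ha = RESOLUTIONS[a]
    wb, hb = RESOLUTIONS[b]
    return (wa * ha) / (wb * hb)
```

So 1440p is about 1.78x the pixels of 1080p, not 2x; a "double the performance" framing overstates the gap even under these assumptions.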
It's definitely 2K/1440p. Multiple translations have stated so.

The mention of 1080p was used as an example of how to get better performance, like running from an HDD, for instance.

I could sell my 2080 Ti by itself, buy both next-gen consoles, and possibly have money left over. There is no way the weaker of the two consoles would have double the performance of a GPU that costs more than double the price of a single next-gen console. Logically, this would never happen.

There was supposed to be a follow-up video to the demo shown, where the devs would go more in depth with the engine, but that has been delayed. We should hear more info soon.

where's the video?

Video
 
Last edited:

ethomaz

Banned
It's definitely 2K/1440p. Multiple translations have stated so.
FUD.
You know the guy you replied to watched the video lol
You don't fool anyone here anymore.

BTW, you never showed these multiple translations because they don't exist... the guy never talked about the resolution he ran the demo at.
 
Last edited:
FUD.
You know the guy you replied to watched the video lol
You don't fool anyone here anymore.
cookiemuffin asked for the video, so I posted a link. What's wrong with that?

You know, no one has been able to link anything concrete tying 1080p to the demo running on the laptop. There's a reason for that: it's not true. The dev said it was at 1440p himself. If you have proof of the dev saying the laptop was running at 1080p, please present it. Others and I have asked for proof several times. I even posted the video almost 24 hours ago, and yet no one can dispute it. Not even you can.

Here's your chance to redeem yourself. If you can't, there will be no need to spread any more lies on this topic. Thanks in advance!
 

ethomaz

Banned
cookiemuffin asked for the video, so I posted a link. What's wrong with that?

You know, no one has been able to link anything concrete tying 1080p to the demo running on the laptop. There's a reason for that: it's not true. The dev said it was at 1440p himself. If you have proof of the dev saying the laptop was running at 1080p, please present it. Others and I have asked for proof several times. I even posted the video almost 24 hours ago, and yet no one can dispute it. Not even you can.

Here's your chance to redeem yourself. If you can't, there will be no need to spread any more lies on this topic. Thanks in advance!
The dev never said 1440p lol
Just cut the bullshit.

All the translations... not a single one ever mentioned 1440p.
Watch your own video... 1440p was never stated lol

P.S. What does my comment have to do with somebody asking for a video? lol
 
Last edited:
It's definitely 2K/1440p. Multiple translations have stated so.

The mention of 1080p was used as an example of how to get better performance, like running from an HDD, for instance.

I could sell my 2080 Ti by itself, buy both next-gen consoles, and possibly have money left over. There is no way the weaker of the two consoles would have double the performance of a GPU that costs more than double the price of a single next-gen console. Logically, this would never happen.

There was supposed to be a follow-up video to the demo shown, where the devs would go more in depth with the engine, but that has been delayed. We should hear more info soon.



Video
hmmm... ultragpu seems to be able to speak Chinese, and he claims it was not said.
OK, sure thing. But nowhere in that Chinese podcast did they mention the laptop running at a fixed 1440p res; for all we know it could be running with DRS on, maybe not at 1440p at all. The nature of that laptop's performance was very vaguely presented, so I don't think we can rely too much on it for now. More so given that the PS5 demo was running on an early devkit and they were aiming for 60fps; who knows if the final optimized build could put it on the same level as a 2070S. But I admit I jumped to conclusions too fast myself before; we should wait for more data. -ultragpu, beyond3d
Yes, I forgot about this part when he said it. The sentiment he's conveying is almost as if next-gen consoles can easily reach 60fps after optimization, when even his laptop can do it at 40fps. Also, yes, he didn't specify at any point that the demo was 1440p on his laptop.
A deep teardown of UE5 couldn't happen soon enough. -ultragpu, beyond3d
 
The Nanite and Lumen tech will be on both and awesome on both... some people are missing that he specified those two things, even though there were other very impressive things the demo accomplished. Even then, does "awesome on both" mean equivalent on both? Just be ready for another gen where DF and NX Gamer breakdowns become key to knowing where to play a game (and whose side of the dumb console war has the upper hand that week).
 
So, no evidence of 1080p? Dodging questions as usual. If you find any solid evidence, feel free to let me know. Otherwise, it's safe to say it's 1440p, running the same assets, resolution, scale, etc.

There do not appear to be any 1440p laptops; perhaps there are some.

I went searching for 2080 laptops to see what was for sale, and practically all of them were 1080p. We don't know if the laptop had a 2080; that is just what a guy on a forum, who claimed he called the engineer, said, which could be bogus.

There are like 2 or 3 4K 2080 laptops I found. But most, even $5,000+ and $6,000+ laptops, were 1080p.

Even non-2080 laptops are mostly 1080p.

We know it is in all likelihood a 1080p laptop, as practically all gaming laptops are, even the most expensive ones.



That is a $6,500 2080 laptop, and even it uses a 1080p display.

What we don't know is what card the laptop had. A 2060? A 2070? A 2080? A 2080 Super?

But as I said, if the card was a mobile 2080 or higher, the PS5's performance at running this demo is effectively 3080 Ti or higher, without the use of ray tracing and ignoring potential ray-tracing acceleration.
 
There do not appear to be any 1440p laptops; perhaps there are some.

I went searching for 2080 laptops to see what was for sale, and practically all of them were 1080p. We don't know if the laptop had a 2080; that is just what a guy on a forum, who claimed he called the engineer, said, which could be bogus.

There are like 2 or 3 4K 2080 laptops I found. But most, even $5,000+ and $6,000+ laptops, were 1080p.

Even non-2080 laptops are mostly 1080p.

We know it is in all likelihood a 1080p laptop, as practically all gaming laptops are, even the most expensive ones.



That is a $6,500 2080 laptop, and even it uses a 1080p display.

What we don't know is what card the laptop had. A 2060? A 2070? A 2080? A 2080 Super?

But as I said, if the card was a mobile 2080 or higher, the PS5's performance at running this demo is effectively 3080 Ti or higher, without the use of ray tracing and ignoring potential ray-tracing acceleration.
When working on the engine in the lab, I can guarantee you he's using external monitors, and more than one at that. No developers, animators, programmers, etc., use the laptop display in their office, especially when writing or testing an engine.

He said he had the demo running in the lab at 1440p, 40fps, on a laptop with an RTX 2080 and a Samsung 970.

Jensen from Nvidia said laptops with the RTX 2080 would be more powerful than next-gen consoles. (Not hard to imagine, as you won't find this GPU at console prices.) The Chinese dev who helped create the engine is also saying the demo runs better than on next-gen consoles.

To further reiterate what the devs and Jensen said: if the demo were running at 40fps at 1080p, it would perform worse at 1440p.
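As a sanity check on that claim, here's a naive estimate that assumes frame time scales linearly with pixel count (a purely pixel-bound workload; real engines rarely scale this cleanly, so treat it as a rough bound, not a measurement):

```python
# Pixel counts for the two resolutions in question.
PIXELS = {"1080p": 1920 * 1080, "1440p": 2560 * 1440}

def scaled_fps(fps, from_res, to_res):
    """Estimate the frame rate at `to_res` from the frame rate at
    `from_res`, assuming rendering cost proportional to pixels drawn."""
    return fps * PIXELS[from_res] / PIXELS[to_res]
```

Under that assumption, 40fps at 1080p would drop to about 22.5fps at 1440p, and conversely 40fps at 1440p would be roughly 71fps at 1080p.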

It only makes sense to trust the people who worked on this specific engine and the leader in graphics technology, over some random console warriors who also said:

First, the engine could only run on PS5. False.

Then, the engine would need to be scaled down to run on Xbox and PC. False.

Then, there was no PC version, and it was just a video playing during the presentation. False.

Then, it was running on a 1080p panel. False.

With that said, I'ma step back and watch this thread. No point in repeating the same facts over and over to those who are fighting for their company of choice.
 
When working on the engine in the lab, I can guarantee you he's using external monitors, and more than one at that. No developers, animators, programmers, etc., use the laptop display in their office, especially when writing or testing an engine.
When working in the lab, I believe he likely uses a desktop workstation.
He said he had the demo running in the lab at 1440p, 40fps, on a laptop with an RTX 2080 and a Samsung 970.
Someone in a forum claimed they called him, and that's how the specs appeared; we don't know if it's bogus.

From my understanding, all he said was something to the effect of "even my laptop can run it at 40fps in the editor, so consoles will likely be able to reach 60fps." That would be a foolish thing to say if he were claiming his laptop was noticeably more powerful. It seems he's implying that if even his laptop can run it, the consoles will obviously reach 60fps easily; basically, his comment implies the consoles are stronger than his laptop.
Yes, I forgot about this part when he said it. The sentiment he's conveying is almost as if next-gen consoles can easily reach 60fps after optimization, when even his laptop can do it at 40fps. Also, yes, he didn't specify at any point that the demo was 1440p on his laptop.
A deep teardown of UE5 couldn't happen soon enough. -ultragpu, beyond3d


He didn't mention resolution, according to some who have seen the video and seem to speak Chinese. Did he claim to work on it on his laptop? We'd need to ask someone who speaks Chinese.

edit:
BTW, I should add that if his laptop is the one that appears in the podcast, from first impressions it looks notably thick; probably a full mobile GPU rather than the Max-Q version, which would likely be slimmer.
 
Last edited:

Truespeed

Member
So apparently the demo was only running at 1440p 30 FPS on the PS5. I wonder what it would have been on the Xbox Series X.
 