
Next-Gen PS5 & XSX |OT| Console tEch threaD


Neo Blaster

Member
Awkward times for Dictator and the rest who have been pushing this PS5/Demo/Laptop BS 👇:messenger_tears_of_joy:

[screenshots]

Facts line up, idiot? :)
I think I haven't stressed enough how the UE5 demo is still delivering.

A video, Jesus Christ, a video...
 

Bo_Hazem

Banned
Well, if they knew any better, it's not TFLOPS that counts, it's the level of detail and streaming billions of triangles, which the PS5 SSD clearly does a great job of. Even I know better than you. 🙄🙄

Really? Have you ever heard of workstations with terabytes of memory? Those could easily run that with some code that makes the game temporarily stage its data in DDR4 and serve it from there.
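In other words, something like this (a purely hypothetical sketch in Python; the directory and file names are made up for illustration):

Code:
from pathlib import Path

# Preload asset files into RAM once, then serve subsequent reads from memory
# at DRAM speed instead of streaming them off the SSD on demand.
ram_cache = {}
assets_dir = Path("assets")  # placeholder directory
if assets_dir.is_dir():
    for f in assets_dir.glob("*.bin"):  # placeholder extension
        ram_cache[f.name] = f.read_bytes()

def load_asset(name: str) -> bytes:
    # Fall back to disk only on a cache miss.
    return ram_cache.get(name) or (assets_dir / name).read_bytes()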


Go get one, I'll stop wasting my time here.
 

tmlDan

Member
He didn't deny they have a marketing deal though; his answer isn't really definitive.


And? What's your point? All you have is this quote.

And a little fun fact for you, as a Marketing Manager myself: not all partnerships are give and take, and they don't all involve money.

I'll break it down for you: Epic makes money off of software sales. They get a small percentage of the revenue the dev earns when they use UE for their game (apparently it's 5% for UE4 games).

Since the PS4 is, and the PS5 likely will be, the best-selling console, that also likely means more software sold on PlayStation.

It's not that complicated, guys.
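For a rough sense of the scale involved (a toy calculation; the revenue figure is made up, and the 5% is just the UE4 royalty rate mentioned above):

Code:
# Illustrative royalty math, not Epic's actual licensing terms.
gross_revenue = 1_000_000  # hypothetical game revenue in USD
royalty_rate = 0.05        # the 5% UE4 figure cited above
print(f"Epic's cut: ${gross_revenue * royalty_rate:,.0f}")  # Epic's cut: $50,000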
 
Yes, the MIGHTY RTX 8000 was struggling yesterday with its latest tech demo: stutter, resolution drops, shitty VRS, below 30fps most of the time, and an aspect ratio downgraded from 16:9 to 2.39:1 for semi-4K. Brute-forcing can't solve everything.

The brute force of the PS5's SSD I/O throughput, without bottlenecks, solved a lot of problems, habibi; enough to discredit the need for TFLOPS.
 
I will be very surprised if the XsX is a fixed 12 TF all the time. Is that the consensus? I mean, I don't think any GPU out there runs at a constant max frequency or TF? I'm just trying to understand it: like, when I'm on the home screen, will it be running at 12TF?

Naw bro. The XsX goes all the way down to 4 TFLOPS with a slower SSD. Luckily, the PS5 is 10 TFLOPS with brute-force, bottleneck-free SSD throughput, where you can make a CGI movie using your phone app.
 

HAL-01

Member
That's the equivalent of saying: the closest grocery store is 100 miles away, and I couldn't simply walk the distance, but the powerful engine of the Lamborghini allowed me to drive to the store and back to buy everything I needed before sundown.
That doesn't mean a Ferrari couldn't do the same thing. Hell, maybe even a Toyota could.
The only way this analogy works is if the other car has half the top speed of the Lambo but better torque. We don't know how far away the store is, whether the road is straight, or whether it's steep. All we know is that you already did it with the Lambo, and your wife was happy at how quickly you came back with some milk.
 

Andodalf

Banned
I will be very surprised if the XsX is a fixed 12 TF all the time. Is that the consensus? I mean, I don't think any GPU out there runs at a constant max frequency or TF? I'm just trying to understand it: like, when I'm on the home screen, will it be running at 12TF?

MS went out of their way to make sure everyone knew it was 12TF sustained, well before Sony even mentioned variable performance. At the time people said it was dumb of them to even mention it, but here we are.

From the initial DF article:

But up until now at least, the focus has been on the GPU, where Microsoft has delivered 12 teraflops of compute performance via 3328 shaders allocated to 52 compute units (from 56 in total on silicon, four disabled to increase production yield) running at a sustained, locked 1825MHz. Once again, Microsoft stresses the point that frequencies are consistent on all machines, in all environments. There are no boost clocks with Xbox Series X.
 

RookX22

Member
MS went out of their way to make sure everyone knew it was 12TF sustained, well before Sony even mentioned variable performance. At the time people said it was dumb of them to even mention it, but here we are.

From the initial DF article:
I hope someone can hack it and load some performance-metric tools, and if it is in fact a locked, constant 12TF at all times in all instances, I really hope these RDNA2 GPUs can achieve this. I also hope they are hella energy efficient; I would hate to imagine my Vega 56 pegged at 1600MHz even when I was browsing the internet.
 

HAL-01

Member
I will be very surprised if the XsX is a fixed 12 TF all the time. Is that the consensus? I mean, I don't think any GPU out there runs at a constant max frequency or TF? I'm just trying to understand it: like, when I'm on the home screen, will it be running at 12TF?
Teraflops are not a number like the clock speed. A GPU does not run "at" 12TF. It is a measurement of the theoretical peak number of calculations it could do per second, similar to how a car may have a 200mph top speed but may rarely or never reach it during normal use, for any number of mechanical or external reasons.
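To make that concrete, here's the arithmetic behind the 12TF figure (a sketch using the Series X numbers from the DF article quoted above; peak FP32 = shaders × clock × 2, since a fused multiply-add counts as two floating-point operations):

Code:
# Theoretical peak FP32 throughput from the published Series X figures.
shaders = 3328          # 52 CUs x 64 shader ALUs
clock_hz = 1825e6       # sustained 1825MHz
ops_per_cycle = 2       # one fused multiply-add = 2 FLOPs
peak_tflops = shaders * clock_hz * ops_per_cycle / 1e12
print(peak_tflops)      # ~12.15 TFLOPS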
 

RookX22

Member
Teraflops are not a number like the clock speed. A GPU does not run "at" 12TF. It is a measurement of the theoretical peak number of calculations it could do per second, similar to how a car may have a 200mph top speed but may rarely or never reach it during normal use, for any number of mechanical or external reasons.
I know this, but the DF article linked said it was going to run at a locked 1825MHz at all times.
 
The brute force of the PS5's SSD I/O throughput, without bottlenecks, solved a lot of problems, habibi; enough to discredit the need for TFLOPS.

That demo relied heavily on GPU compute as well as super I/O. But it seems people have a tendency to oversimplify how these systems work. Now suppose a somewhat weaker machine is significantly more efficient, so it is able to sustain performance closer to peak performance, thereby closing the gap with the (on paper) more powerful system. This was already mentioned in the Road to PS5 presentation, and the example given relates to the use of a higher GPU clock speed. There are plenty of other unknown details that could further improve system efficiency.

Back in the PS3 & XB360 days, one machine had a Blu-ray drive that allowed the delivery of 5x as many assets. A lot of that extra space tended to be used for cinematics, which could augment storytelling in games, a major benefit. Also, the design of the Cell processor allowed very high sustained throughput, and I recall some games pushing its utilization to around 80% or more, effectively extending the performance gap between the two systems’ CPUs. So, while the 360’s GPU was more powerful, in the end it was not the whole story.

For the PS4, the use of asynchronous compute can improve system utilization, effectively “filling bubbles” in the GPU’s workload. It would be interesting to know if TLOU2 or GoT managed more than 75%. As I understand it, contention between the CPU & GPU in AMD’s architecture limits efficiency here.
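As a toy illustration of that efficiency argument (the utilization percentages below are made-up assumptions, not measured numbers):

Code:
# "Effective" throughput = theoretical peak x sustained utilization (illustrative only).
def effective_tflops(peak_tf: float, utilization: float) -> float:
    return peak_tf * utilization

print(effective_tflops(12.15, 0.75))  # ~9.1 TF for a wider, lower-clocked GPU at 75%
print(effective_tflops(10.28, 0.90))  # ~9.3 TF for a narrower, faster-clocked GPU at 90%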
 

Bo_Hazem

Banned
I hope someone can hack it and load some performance-metric tools, and if it is in fact a locked, constant 12TF at all times in all instances, I really hope these RDNA2 GPUs can achieve this. I also hope they are hella energy efficient; I would hate to imagine my Vega 56 pegged at 1600MHz even when I was browsing the internet.

Here's my Radeon VII barely working: 0TF out of 13.8TF, with most of its parts nearly idle:

[screenshot: GPU monitoring readout showing near-idle clocks and utilization]


No such thing as fixed.
 
Interesting that people have put so much effort into downplaying the PS5 and its SSD, yet the PS5 itself is still generating so much buzz and excitement; the UE5 tech demo has already got like 10 million views, which is insane.
10 million, that's insane. For comparison, the Series X gameplay reveal hasn't even reached 1 million. Which is weird, because Xbox has around 3 million subs on their channel while Epic has around 500k.
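The per-subscriber ratio is what makes it striking (a quick check using the rough figures quoted above):

Code:
# Views per channel subscriber, approximate figures from the posts above.
ue5_views, epic_subs = 10_000_000, 500_000
xsx_views, xbox_subs = 1_000_000, 3_000_000
print(ue5_views / epic_subs)   # 20.0 views per subscriber
print(xsx_views / xbox_subs)   # ~0.33 views per subscriber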
 

Sinthor

Gold Member
Yes, NVIDIA, AMD, Microsoft and even Apple agree with what you have just said. Wait until something comparable comes along that matches that tech demo at 1440p/30fps.
:pie_roffles::pie_roffles::pie_roffles::pie_tears_joy::pie_tears_joy::pie_tears_joy::pie_tears_joy:

You apparently have no conception of what it takes to stream in 8K assets as this demo was doing. You couldn't do that on the PC you have today. Forget about the resolution and frame rate; we'll have equal comparisons soon enough. I'd temper my expectations if I were you. The way some are carrying on, they expect the Xbox Series X to be running everything at 4K@120fps while the "poor PS5" can only manage 1440p and 30fps. It's not gonna be that kind of difference between the two. Not LIKELY, let's say, anyway. We'll see soon enough.
 

HAL-01

Member
I know this, but the DF article linked said it was going to run at a locked 1825MHz at all times.
Yes, the idea is that the console should run reliably at that fixed clock regardless of whether the game requires it or not, contrary to the PS5, which will lower clocks depending on the game. I don't know if it runs the same on the main UI; perhaps then only one of the cores is active.

Edit: My bad, the PS5's power consumption remains constant regardless of the clock speed.
 

RookX22

Member
Here's my Radeon VII barely working: 0TF out of 13.8TF, with most of its parts nearly idle:

[screenshot: GPU monitoring readout showing near-idle clocks and utilization]


No such thing as fixed.
Yeah, I figured as much from my experience building PCs and watching benchmarking and overclocking videos from LTT, GN and the like. Even under heavy cooling, at a certain point clocks drop from thermal throttling. That's why I'm curious about the "fixed" claim: either they have some kind of crazy cooling, or the chip they are using is efficient/underutilized enough that it will not thermally throttle. And it feels wasteful for a system to run at sustained clocks capable of 12TF of graphical performance even if you are just watching YouTube. Just my 2 cents.
 

Bo_Hazem

Banned
10 million, that's insane. For comparison, the Series X gameplay reveal hasn't even reached 1 million. Which is weird, because Xbox has around 3 million subs on their channel while Epic has around 500k.

It's not just subscriptions. They do contribute, but overall the PS5 gets much more attention; it's not like people are watching it because it's Epic Games. Plus all the Xbox fans and other PC gamers and casuals watched it: it's the first true next-gen gameplay, so everybody wants to check it out. Even professionals and amateurs using UE for general purposes are watching it.
 

RookX22

Member
Yes, the idea is that the console should run reliably at that fixed clock regardless of whether the game requires it or not, contrary to the PS5, which will lower clocks and power consumption depending on the game. I don't know if it runs the same on the main UI; perhaps then only one of the cores is active.
It's just that, in my experience, you can overclock a CPU to, say, 5.0GHz and it will run at that frequency, but it might be only single-digit-percent utilized during typical use on a PC, and thus not producing a lot of heat or power draw. A GPU is different: you can overclock them, but they typically don't hold a fixed clock when not utilized; only when needed will they clock up to the set overclock, and that's if thermals and stability allow for it. My problem is the claim of a constant GPU frequency.
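For what it's worth, a fixed frequency doesn't have to mean fixed power draw: textbook CMOS dynamic power scales with switching activity as well as voltage and frequency. A rough sketch (all constants below are placeholders, not real silicon numbers; only the ratio is meaningful):

Code:
# P_dynamic ~ activity * effective_capacitance * V^2 * f (standard CMOS relation).
# At a locked 1825MHz, power still collapses when utilization (activity) collapses.
def dynamic_power_w(activity: float, cap_f: float, volts: float, freq_hz: float) -> float:
    return activity * cap_f * volts**2 * freq_hz

game_load = dynamic_power_w(0.90, 1e-9, 1.0, 1.825e9)  # heavy rendering workload
dashboard = dynamic_power_w(0.05, 1e-9, 1.0, 1.825e9)  # UI / video playback
print(game_load / dashboard)  # ~18x more dynamic power at the same clock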
 

quest

Not Banned from OT
Yeah, I figured as much from my experience building PCs and watching benchmarking and overclocking videos from LTT, GN and the like. Even under heavy cooling, at a certain point clocks drop from thermal throttling. That's why I'm curious about the "fixed" claim: either they have some kind of crazy cooling, or the chip they are using is efficient/underutilized enough that it will not thermally throttle. And it feels wasteful for a system to run at sustained clocks capable of 12TF of graphical performance even if you are just watching YouTube. Just my 2 cents.
It's during games that it sustains clocks, ffs.
 

Sinthor

Gold Member
Are you trying to compare Hollywood studios with gaming PCs? @BGs told us that their company has two massive PCs, one 75TF and I think one 45TF. How about you pay $50,000 USD or more to beat a $500 USD console? Sounds smart.

Plus, he's missing the point of the UE5 demo. They were streaming in assets that were higher resolution (8K), which would make it possible to do exactly what he was snarking about! Whatever... must be tough to be soooo emotionally invested in a brand and a particular piece of plastic and silicon!
 
It's not just subscriptions. They do contribute, but overall the PS5 gets much more attention; it's not like people are watching it because it's Epic Games. Plus all the Xbox fans and other PC gamers and casuals watched it: it's the first true next-gen gameplay, so everybody wants to check it out. Even professionals and amateurs using UE for general purposes are watching it.
Makes sense. I think when Sony holds its own reveal event, reactions are going to go nuclear.
 

Bo_Hazem

Banned
That demo relied heavily on GPU compute as well as super I/O. But it seems people have a tendency to oversimplify how these systems work. Now suppose a somewhat weaker machine is significantly more efficient, so it is able to sustain performance closer to peak performance, thereby closing the gap with the (on paper) more powerful system. This was already mentioned in the Road to PS5 presentation, and the example given relates to the use of a higher GPU clock speed. There are plenty of other unknown details that could further improve system efficiency.

Back in the PS3 & XB360 days, one machine had a Blu-ray drive that allowed the delivery of 5x as many assets. A lot of that extra space tended to be used for cinematics, which could augment storytelling in games, a major benefit. Also, the design of the Cell processor allowed very high sustained throughput, and I recall some games pushing its utilization to around 80% or more, effectively extending the performance gap between the two systems’ CPUs. So, while the 360’s GPU was more powerful, in the end it was not the whole story.

For the PS4, the use of asynchronous compute can improve system utilization, effectively “filling bubbles” in the GPU’s workload. It would be interesting to know if TLOU2 or GoT managed more than 75%. As I understand it, contention between the CPU & GPU in AMD’s architecture limits efficiency here.

What a wonderful input. That's why I'm addicted to this thread: I'm learning every day. I might not fully comprehend everything, but it's like visiting a perfume shop; you'll smell good from testing even if you end up not buying anything.

The Cell tech is still too confusing to me, because it had so many varied jobs and calculations, going by the Road to PS4 video and other videos and posts here.

You apparently have no conception of what it takes to stream in 8K assets as this demo was doing. You couldn't do that on the PC you have today. Forget about the resolution and frame rate; we'll have equal comparisons soon enough. I'd temper my expectations if I were you. The way some are carrying on, they expect the Xbox Series X to be running everything at 4K@120fps while the "poor PS5" can only manage 1440p and 30fps. It's not gonna be that kind of difference between the two. Not LIKELY, let's say, anyway. We'll see soon enough.

Let's not mention AC: Valhalla, the first reality check for next gen. Although AC games are notoriously taxing anyway.

Who the fuck are you calling 'habibi'? I know you're not being THAT kind of fucktard now, are you?

Don't worry :messenger_tears_of_joy: I stopped replying to him after that workstation post. Better to enjoy our time here with the rest, most of whom are genuine and trying to bring something new to the table.

And in case someone doesn't understand what "habibi" means: it means "my love" in Arabic.
 

Three Jackdaws

Unconfirmed Member
What a wonderful input. That's why I'm addicted to this thread: I'm learning every day. I might not fully comprehend everything, but it's like visiting a perfume shop; you'll smell good from testing even if you end up not buying anything.

The Cell tech is still too confusing to me, because it had so many varied jobs and calculations, going by the Road to PS4 video and other videos and posts here.



Let's not mention AC: Valhalla, the first reality check for next gen. Although AC games are notoriously taxing anyway.



Don't worry :messenger_tears_of_joy: I stopped replying to him after that workstation post. Better to enjoy our time here with the rest, most of whom are genuine and trying to bring something new to the table.

And in case someone doesn't understand what "habibi" means: it means "my love" in Arabic.
Yeah, so true. I've learnt so much about gaming components and software technologies: RAM, SSDs, bandwidth, triangles/polygons, illumination/ray-tracing! I could go on. A lot of these discussions rely on more technically minded people explaining things simply, which I guess makes it easier for us to learn.
 

Sinthor

Gold Member
What a wonderful input. That's why I'm addicted to this thread: I'm learning every day. I might not fully comprehend everything, but it's like visiting a perfume shop; you'll smell good from testing even if you end up not buying anything.

The Cell tech is still too confusing to me, because it had so many varied jobs and calculations, going by the Road to PS4 video and other videos and posts here.



Let's not mention AC: Valhalla, the first reality check for next gen. Although AC games are notoriously taxing anyway.



Don't worry :messenger_tears_of_joy: I stopped replying to him after that workstation post. Better to enjoy our time here with the rest, most of whom are genuine and trying to bring something new to the table.

And in case someone doesn't understand what "habibi" means: it means "my love" in Arabic.


Ah, I see. He was expressing his....errr....'ADMIRATION' for you as such a fine technical arbiter of information about the mighty PS5! My mistake!
:messenger_winking_tongue:
 

Bo_Hazem

Banned
Yeah, so true. I've learnt so much about gaming components and software technologies: RAM, SSDs, bandwidth, triangles/polygons, illumination/ray-tracing! I could go on. A lot of these discussions rely on more technically minded people explaining things simply, which I guess makes it easier for us to learn.

Indeed, I barely understood a thing 3 months before I signed up here. So many tech-savvy people to learn from and engage with, and it's entertaining at the same time :messenger_tears_of_joy:

Ah, I see. He was expressing his....errr....'ADMIRATION' for you as such a fine technical arbiter of information about the mighty PS5! My mistake!
:messenger_winking_tongue:

Well, he was using it in a sarcastic way. It's like when I debate with you and things heat up a little and I start calling you "honey": it's a trick to make you go nuts, a troll method.

But hey, let's not worry and resume :messenger_winking_tongue:
 

Sinthor

Gold Member
Indeed, I barely understood a thing 3 months before I signed up here. So many tech-savvy people to learn from and engage with, and it's entertaining at the same time :messenger_tears_of_joy:



Well, he was using it in a sarcastic way. It's like when I debate with you and things heat up a little and I start calling you "honey": it's a trick to make you go nuts, a troll method.

But hey, let's not worry and resume :messenger_winking_tongue:
Roger that!

On the same track... anyone have any guesses as to how much of a load it would take off that demo to use 4K assets instead of 8K? Any estimates on what that might do to frame rate and/or resolution? I know it's super early tech and we can't REALLY know the answers until we know and see a LOT more... just looking for educated guesses.
 

Bo_Hazem

Banned
Roger that!

On the same track... anyone have any guesses as to how much of a load it would take off that demo to use 4K assets instead of 8K? Any estimates on what that might do to frame rate and/or resolution? I know it's super early tech and we can't REALLY know the answers until we know and see a LOT more... just looking for educated guesses.

I would say on the PS5 you could just cap the output at around 10 million polygons per frame instead of 20 million, and you'd easily jump to 60fps, or to native 4K@30fps. As explained by Tim, it's actually running at ~40fps but capped at a solid 30fps; that's why you don't feel frame drops.
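The frame-time arithmetic behind that "runs at ~40fps, capped at 30" claim (illustrative):

Code:
# Frame budget in milliseconds at common targets.
for fps in (30, 40, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# Rendering in ~25 ms (~40fps) against a 33.3 ms (30fps) cap leaves ~8 ms
# of headroom every frame, which is why no drops are ever felt.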

Even if software-based ray tracing isn't too taxing and is used mildly, you could offload that work to the intersection engines and benefit from either better ray-tracing utilization or higher resolutions/framerates. We shall wait for the full version with HW RT support on PS5. Sony's first-party studios and the Decima engine are already insanely smart and have done the impossible this gen, and I'm more than sure Decima will grow into the main shared engine across Sony's studios, or even get shared for PC games and other 3D design uses. The 4th of June is gonna be flaming with next-gen games!

[embedded video, timestamped]

 