
Next-Gen PS5 & XSX |OT| Console tEch threaD


ethomaz

Banned
N0r8wQ8.png
Where did that come from? It looks to have the most accurate width.
 
A minor problem is that it's not giving me 7.1/5.1 surround sound, only 2.0-channel, even though I'm using a PS4-certified Astro A40 with the mixer over optical audio.

I’ve had the same issue when using LPCM on my PS4 with ARC. Try switching to DTS or Dolby in the sound settings and it should fix the surround issues.
 

ToadMan

Member
I wasn't aware that MS had ever announced a showcase in June. Could you point me in the direction of that tweet?

Not a tweet - it’s mentioned here:

“Starting with the May 7 episode of Inside Xbox, we will be showcasing what happens next in the world of Xbox, every month, which is why we’re calling it “Xbox 20/20.” These monthly moments will take place throughout the rest of the year and...”

https://news.xbox.com/en-us/2020/05/05/xbox-2020-the-future-of-xbox/

My guess is they saw the blowback from their failed May event, also saw that Sony was producing actual gameplay and exclusives to show, and realised that a set of trailers in June running on “representative hardware” would go down like a lead balloon.

So they try and get some games running on Xsex at high enough quality to show in June/July... maybe.

June is almost over and we haven't heard about a July date yet either ...
 
I think you're missing something here. First, PC games have NOT gone all digital. You can still buy disc packages in stores and online. Amazon sells online codes AND the package with a disc for Jedi: Fallen Order, for example.

Also, yes, you pay full price whether disk or digital, but keep in mind that lots of people DEPEND on using trade ins to help them afford buying MORE used or new games. If there wasn't a big market for the used games and trade-ins, GameStop would already be done and gone.
I mean, I guess that's true for PC on a technicality, in the same sense you can still buy music on vinyl. It exists, but it's not even close to being significant enough to be considered meaningful. And the other thing is, the digital discounts are so deep and so frequent I'm convinced the scales will keep tipping towards digital. Btw, I'm not going to shed any tears for GameStop once they go the way of Blockbuster.
 

Corndog

Banned
Well, if that’s accurate then that would explain why Lockhart hasn’t been mentioned, even though the release is supposed to be soon.

I assume here that MS would’ve gotten wind of the DE model at least a month before the public.

To be honest the Lockhart strategy never made any sense to me - it would just confuse purchasers and water down the impact of Xsex even more than the cross-platform game line-up already does.

Maybe if MS are worried by the DE they’ll just trash Lockhart or remodel it as an X1X slim next year.




Running yes. What hadn’t been confirmed until now was that games would get improved performance without additional dev work.
So you're saying it's running some type of boost mode?
 

ToadMan

Member
No, gzip and Kraken are compression methods that further compress BCn files. Oodle Texture and BCPack (come on man, BC means Block Compression) arrange the texture data in a way that allows it to be compressed better. It can be lossless, which results in less compression by gzip and Kraken, or it can be lossy, which will allow for better compression at the expense of texture quality.

Stanard describes BCPack as a “compression codec”.

To me that means it’s part of the target OS/system APIs for decompression at or near runtime, same as Kraken - not a tool like Oodle Texture for use during development.

But there’s not much detail about BCPack - it could incorporate development-time compression optimisation as part of the toolset, or perhaps it uses a third-party texture preprocessor as a base itself.
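To make the lossy-preprocessing idea concrete, here's a toy sketch. It is not Oodle Texture or BCPack (those are proprietary); zlib just stands in for the Kraken/gzip stage, and a crude quantisation step stands in for the rate-distortion tweaks a lossy texture encoder is allowed to make. The only point is that a small, controlled loss makes the same data much friendlier to the lossless codec that follows.

```python
# Toy illustration only: a lossy "quantise the block data" step (standing in for
# what a rate-distortion texture encoder may do) followed by a lossless entropy
# coder (zlib standing in for Kraken/gzip). Not RAD's or Microsoft's actual code.
import random
import zlib

random.seed(0)

# Fake "BC block endpoints": a repeating base pattern plus small per-block noise,
# the kind of low-amplitude variation a lossy encoder is allowed to smooth away.
base = [16, 64, 128, 200] * 1024
noisy = bytes((v + random.randint(-3, 3)) & 0xFF for v in base)

# Lossy step: snap each value to the nearest multiple of 8 (tiny visual error,
# far more redundancy for the entropy coder to exploit).
quantised = bytes(((v + 4) // 8 * 8) & 0xFF for v in noisy)

for label, data in (("lossless path", noisy), ("lossy-preprocessed", quantised)):
    packed = zlib.compress(data, 9)
    print(f"{label}: {len(data)} -> {len(packed)} bytes "
          f"({len(packed) / len(data):.1%} of original)")
```

Run it and the quantised copy compresses far smaller, which is the whole trade-off being described: spend a little texture quality to gain a lot of compression ratio.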
 
Yeah, because he's the only one that thought of or foresaw an eventually digital future lol 😂 😂 😂 Netflix must have spoken to him and Sony when they made the PSTV 😏

Not sure we're seeing the same angle. Not a soul had a shit-fit when Netflix released. Blockbuster didn't release a “this is how you trade your movies” video.
 

J_Gamer.exe

Member
You’ve repeated this claim about six times now—even linked to it—but have completely failed to understand the context yourself.

With fixed clocks and varying power consumption you have to make assumptions about how much power a game will need. That means also making assumptions on power hungry instructions like AVX. In this case assuming (and hoping) they’re not used much so you can set your fixed clock a little higher.
This is how things worked on PS4 and something like XSX.

But what if AVX instructions aren’t used sparingly, or if a game design needs them? Then you’d need to reduce your fixed clock to have enough headroom on power.

That’s the “warning against using them”.

When targeting fixed power and allowing the frequency to vary, you don’t have to assume anything about AVX usage, and there are no longer any unknowns on power usage.

Actual calculation and work done costs watts, regardless of clock frequency. Targeting fixed power consumption over fixed frequency means a fixed amount of work being done.

Identifying unproductive transient spikes in power usage (uncapped frame rates in simple map screens) and reducing clocks to keep that work at maximum power means that efficient CU occupancy in real-world game code can have its frequency ramped up beyond what you could reach if you had to set a fixed clock and assume unproductive transients and unpredictably high AVX usage.

By catering to these unproductive transients in a fixed frequency domain it wouldn’t be possible to go this high on clock speed, and real game code would need to be run significantly slower to cater for these spikes.

Targeting fixed power is targeting fixed performance as far as actual numbers crunched goes. It’s leaving nothing on the table.
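To put rough numbers on that "leaving nothing on the table" point, here's a toy comparison of the two strategies. The workload power figures and the P ∝ f³ scaling are my own illustrative assumptions, not anything Sony or Microsoft have published; it only shows the shape of the trade-off.

```python
# Toy comparison of the two strategies described above. The workload power
# figures and the P ∝ f**3 scaling are illustrative assumptions, nothing more.

POWER_BUDGET = 1.00   # normalised power the PSU/cooling is built around
F_CEILING = 2.23      # GHz, nominal maximum GPU clock

# Relative power each workload would draw if it ran at F_CEILING.
workloads = {
    "typical busy game frame": 0.80,
    "power-hungry transient":  1.20,
    "uncapped map screen":     1.30,
}

def max_clock(power_at_ceiling: float) -> float:
    """Highest clock that keeps a workload within POWER_BUDGET (P ∝ f**3)."""
    return min(F_CEILING, F_CEILING * (POWER_BUDGET / power_at_ceiling) ** (1 / 3))

# Strategy A: fixed clock, provisioned so even the worst case fits the budget.
fixed_clock = min(max_clock(p) for p in workloads.values())
print(f"fixed clock (sized for worst case): {fixed_clock:.2f} GHz for everything")

# Strategy B: fixed power, the clock only dips for the workloads that need it.
for name, power in workloads.items():
    print(f"fixed power: {name:<24} -> {max_clock(power):.2f} GHz")
```

With those made-up numbers, the fixed clock spends every frame at roughly 2.04 GHz to be safe, while the fixed-power approach keeps typical frames pinned at the ceiling and only dips for the transients.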

A fixed clock does not mean fixed performance. Nor does it mean uncapped performance. Power draw varies with workload, which is why you could have a 5GHz clock that is stable all day in Windows, gets hot but is stable in games, and crashes in a synthetic stress test like Linpack.

Modern GPU overclocking is done by providing enough cooling so that your performance is limited by maximum allowed TDP, with no thermal throttling.

It’s targeting fixed power consumption. It’s the ideal thing to be limited by, rather than temperature.

PS5 does not throttle based on temperature. It cannot vary clocks based on temperature.
It doesn’t boost up to a temperature threshold; it varies clocks to stay at maximum power consumption.
PS5 is a big lad in a big boy chassis, it is not in a laptop chassis with limited cooling. It is not in the PC and mobile domain of boosting based on ambient temperature and die temperature, and backing off as they are hit.

AMD SmartShift only varies the maximum TDP allowed to each component of the same shared APU, and is only part of the variable frequency.
In your laptop with limited cooling and which boosts up based on die temperature, SmartShift extends the duration of maximum boosts. That’s what it’s designed to do based on AMD’s own literature on the subject.
In a PS5 it augments the fixed power target system so that each component has an even higher TDP to play with individually.
They have balanced the GPU and CPU frequencies so that they aren’t compromising the other and have similar thermal density.



Cerny’s quote about a “couple of percent” drop in clock rate yielding a 10% drop in power consumption (which the article you quote flips around) is a quote to show how little you need to drop clocks to reduce power consumption.

It’s highlighting the relationship between the two variables. It isn’t in either case saying how low GPU clocks will fall.
Cerny chose dropping power by 10% as his starting point for that relationship. The article you quote chose 10% drop in clock for the starting point for that relationship.

If you’d been paying attention you’d see that PS5 targets a fixed power draw.
Fixed power draw.
Not an alternating 100% or 90% power draw (using Cerny’s power/clock relationship example figures).
The GPU doesn’t run into some power hungry instructions and decide to drop the clocks enough to now start running at 90% power until they’re finished.
It drops the clocks just enough to stay at 100% power because it targets fixed power usage by varying clocks. It doesn’t vary clocks and power.

You’ve not only taken numbers that highlight the non-linear relationship between power consumption and clock-speed to assume what a low clock speed might be, but you’ve chosen something that explains the relationship by arbitrarily picking a 10% reduction in clock-speed as its starting point, before kindly doing the math to tell us that equals 2GHz, as if that somehow says this is what PS5 GPU clocks fall to.

Why not use Cerny’s own arbitrarily chosen 10% reduction in power consumption?

What would that yield as the lowest the clocks go?

How about if I say a 15% reduction in clock speed reduces power consumption by 75%?
Have I now just proved that PS5 clocks go as low as 1.9GHz?

You arbitrarily chose 2GHz because it's the more dramatic-sounding of the two relationships quoted, because you are an ill-informed troll.
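If anyone wants to play with that relationship themselves, here's a back-of-envelope sketch. The P ∝ f^k model and the exponents are illustrative assumptions (real silicon has its own curve, and Cerny's "couple of percent for 10%" implies a locally steep one), so the outputs only show how small the clock movements are, not what PS5 actually does.

```python
# Back-of-envelope only: assume dynamic power scales as P ∝ f**k for some
# exponent k (k ≈ 3 is the usual DVFS rule of thumb, since voltage tracks
# frequency). The exponent is a free parameter here, not a published PS5 figure.

F_MAX = 2.23  # GHz, PS5's advertised GPU ceiling

def clock_drop_for_power_drop(power_drop: float, k: float) -> float:
    """Fractional clock reduction needed to cut power by `power_drop` (0-1)."""
    return 1.0 - (1.0 - power_drop) ** (1.0 / k)

for k in (2.0, 3.0, 5.0):
    drop = clock_drop_for_power_drop(0.10, k)
    print(f"k={k}: a 10% power cut needs a {drop:.1%} clock cut "
          f"({F_MAX:.2f} GHz -> {F_MAX * (1 - drop):.2f} GHz)")
```

Under any of those assumed exponents a 10% power saving costs only a few percent of clock, which is exactly the point of the quote: the relationship is steep, and the starting percentage is arbitrary.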

Cerny has repeatedly said both clocks will run at maximum frequency most of the time.

In the same Eurogamer interview he even further explains to the interviewer who is also looking at the situation from a thermally throttling PC point-of-view that there is no “base clock”, and that even when the GPU spent an entire 33ms frame budget in work it sat at maximum clocks, without relying on a race to idle condition to artificially keep the clock-speed high.

When pushed further by the interviewer to figure out what the “base clock” is Cerny talks about what a synthetic benchmark that flips all transistors every tick would do, which would likely cause PS5 to reduce clocks more significantly and cause a fixed frequency system like PS4 to overheat and crash.

If you’re using your APU at 100% TDP at all times during a game with some kind of instruction snooping system, then you’re boosting useful game code while taming useless transients without having to cater to them.
Typically efficient game code only has around 30-40% CU occupancy. Which is why peak figures are meaningless. Especially if one system can recognise the difference between efficient game code and unproductive transients that would otherwise spike power on fixed clocks.
And before you misunderstand again, “efficient” in this case means above average CU occupancy. It means code written to really stretch the hardware in a useful way.
Synthetic stress tests are full of loops with no calculated result just for the sake of consuming watts/generating heat.

Fixed power variable frequency is like normalising an audio wave form to flatten out spikes so that you can amplify the rest of the audio without those spikes clipping.
Spikes aren’t efficient useful game code but typically oversights like uncapped low triangle scenes. Not extremely busy scenes with lots going on.

GPU work done is watts consumed, not clock speed.

wtfl;dr you misunderstood Cerny’s context for bringing up power hungry instructions to the extent it makes the opposite point you think it does. You’re taking an arbitrarily chosen number used to highlight a relationship to infer what PS5 clocks are while forgetting it’s fixed power variable clock, not variable power and variable clock

swtfl;dr trolling and derailing doesn’t go down well here
No modern chip will even come close to that though. There are power and thermal protections that would stop execution or severely down-clock depending on how it’s configured.
On PS4 it would cause a crash according to Cerny.

The point is with a fixed clock variable power solution you have to estimate what your real world power consumption is likely to be with game code—including unproductive transients—and set the frequency bar low enough to cover that.

The second point is graphically intense scenes at fixed frame-rates are full of data being passed around the GPU and things waiting on other things and aren’t as power intensive as unproductive transients like a low triangle scene with an uncapped frame-rate. So by setting your fixed clocks based on dealing with those transients your graphically intense scenes are running at a lower clock-rate than they otherwise could be.

Games are nothing at all like synthetic unproductive benchmarks/burn-tests in how they use power. The simple cooling used in dedicated gaming machines like PS4 and even XSX would run into power/heat protection and crash if running synthetic burn-tests that try to use every flop every clock-tick.

A misconception around PS5 comes from a naive understanding that clock-speed relates to performance, when it is actually power consumption that measures work done. A fixed clock can be drawing next to no power, a lot, or even too much all depending on workload.
Fixed clocks are chosen based on worst-case transient scenarios, like the God of War map screen. They are not chosen to maximise performance during busy and complex rendering.
This makes choosing clock-speed, power supply and cooling solution (educated) guess work that errs on the side of caution.

Another misconception is that PS5 variable frequency is the same as found in mobiles and PCs, where there is some kind of “base clock” that will be boosted from until the chip reaches some kind of thermally saturated threshold and returns to the base clock. Taking advantage of thermal mass to sneak in some extra performance in small tasks.

PS5—at least as described by the lead system architect—isn’t varying clock speed based on temperature of the die, but on power consumption to keep it at maximum when under load. PS5 isn’t some slim laptop chassis or mobile device.
It isn’t a general purpose computer that can benefit from transient boosts in clock speed during intermittently demanding tasks like opening an application, or running a calculation etc.
As described it is deterministic and identical regardless of expected working environment.

As further described by the lead system architect, both the GPU and CPU are expected to stay at their maximum boost frequencies most of the time, without needing to pick and choose, and without the GPU needing to be in a race-to-idle scenario - it spends the entire frame drawing.

There’s a reason Cerny mentioned the map screen as being power hungry, and not a graphically complex scene.

The PS5 is a games machine tuned to be at its best running game code on its cores and CUs. It isn’t tuned to run benchmarks or load an application quickly. Its variable frequency system isn’t related to mobile and PC turbo clocks.

Neither XSX nor PS5 will be running game code at 12 or 10 teraflops. Comparing theoretical maximums might be useful for crunching prime numbers or running fury cube, or running crazy efficient OpenCL calculations, or even just looping something aimlessly to generate heat, but it’s just not applicable to 30-40% occupancy CU work.

If Cerny is right when he said both clocks will stay at or near maximum, and that it only takes a 2% drop in clock-speed to reduce power consumption by 10%, the clocks will spend most of their time pinned, varying away just enough to keep power at 100% during the kinds of map/menu screens that make my PS4 Pro huff a bit more.

And because it’s worth repeating, the numbers above are arbitrary and describe a relationship between two variables.
The point being how little you need to drop clock-speed to reduce power requirement.
The power doesn’t need to be reduced by 10%, the entire point is that it’s fixed and stays pegged. 10% is an arbitrary number, he could have picked 5% and a smaller clock-speed decrement, and vice versa.

Even Leadbetter couldn’t wrap his head around it and was trying to understand it in terms of mobile/PC turbo boost clocks and base clocks. I think it confused Cerny which is why he brought up the synthetic code example.

PS5 is as much “10TF“ as XSX is “12TF”. The differences in APIs will make more of a difference than comparing theoretical maximums as far as games go. I’m also interested to see how much of a difference the Coherency Engines and GPU cache scrubbers make, as in theory that could significantly increase CU occupancy, making a direct comparison between the two less useful.
They are both RDNA2 based, but they are not entirely equal if the CUs in one have to do (significantly/slightly?) less trips to system memory.

Thanks for these explanations. I believe I mostly understood it before, but it's always helpful to have another way of explaining it.

So would I be correct in saying that this variable method is most likely the superior method, all things being equal? If they had the same GPU, the variable one would perform higher more often because it's not limited by that worst-case scenario?

Effectively the Xbox could get higher clocks if it had this, as its clock is limited by guesstimating and testing for that worst-case scenario of high workloads?

However, most of the time it's not an excessive workload, so it could run higher if it were allowed to, maybe at something like 2GHz for example, and only then need to drop once those higher workloads arrive for a small amount of time?

So because these highest workloads are only fleeting/rare, most of the time the clocks actually could have run higher?

So bragging about having fixed clocks vs these clever variable clocks and constant power is actually a bit silly?
 
Hey guys, how do you think PC will work in this generation? Without comparable I/O, PC will have several problems, like data accessibility, speed and bottlenecks.

Remember when Tim Sweeney said that the PS5 SSD would probably make AMD, Intel and Nvidia work together to create a similar system for PC?
So for now it's the "death of ultra mode on PC"; if I'm not wrong, ultra mode is only an improvement in geometry and polygons, and that's not a problem for any next-gen console now.
I am bored, so let's make some guesses without any basis :lollipop_squint_tongue:

I think there will be phases for the minimum/recommended requirements.

First phase, where games are more similar to the current gen than to a new gen (first two years):

-Minimum: some games with zero changes will stay as they are now, like AC Valhalla I think, but many games will need to raise their minimum requirements to something similar in brute force (without RT) to an XSS: maybe an RX 590, 8 GB of RAM, a normal HDD and a Ryzen 3 1200.
-Recommended: other games that start to use the SSD as an essential part of their engines will require at least a SATA SSD, a CPU similar to a Ryzen 5 3600 (since most games use the CPU in a less aggressive way, aside from the AAA first-party-style titles of companies like CD Projekt Red) and 16 GB of RAM.
-PC not-quite-master-race experience: RTX 2070 Super, 24/32 GB of RAM, Ryzen 7 3700X, fast NVMe (more than 2 GB/s).

Second phase (next 2-3 years), where console games become the quality standard for AA and AAA (the indie market is by nature unpredictable):

-Minimum: specs similar to a possible XSS; something like an RX 590 but now with RT for the games that use it, 12 GB of RAM, SATA SSD, Ryzen 5 3600 (bye, four cores).
-Recommended: RTX 2070 Super (RT necessary), Ryzen 7 3800X, 2 GB/s NVMe (some games even requesting PCIe 4.0), 24/32 GB RAM.
-PC master race experience: something above a 2080 Ti, Ryzen 9 3900X, 32 GB of RAM (DDR5 should already be out there), PCIe 4.0 SSD above 5.5 GB/s.

Third phase (last years of the gen), where the PC once again destroys any console, even with the mid-range segment:

-Minimum: a combination of XSX and PS5 specs: RTX 2070 Super (RT necessary), Ryzen 7 3800X, 2 GB/s NVMe, 24 GB RAM.
-Recommended: well above a 2080 Ti, Ryzen 9 3900X, 32 GB of RAM (DDR5), PCIe 4.0 SSD above 5.5 GB/s.
-PC master race experience: whatever NVIDIA's xx80 series is by then, Ryzen 9 3950X or even above, PCIe 5.0 SSD above 9 GB/s.

-Nintendo announces its first console with RT in this era, but weaker than an XSS :lollipop_fearful:

I'm supposing, among many things, that DirectX 12 Ultimate will help reduce the workload needed to stream so much data from the SSD, because if not... oh boy, we have problems for SATA SSDs, 16 GB of RAM and CPUs with fewer than 8 cores, and that Ryzen 7 3800X in recommended will probably become a 3900X.

Basically my idea is that the different console components will be the base for many requirements, but as time passes this will shift from high-end PC territory to "budget". I also expect a ton of users to say this gen is the worst, as many of the games will be really hard to push past 60 fps at 4K, at least in the first two phases.
giphy.gif
 

Neo Blaster

Member
Not a tweet - it’s mentioned here:

“Starting with the May 7 episode of Inside Xbox, we will be showcasing what happens next in the world of Xbox, every month, which is why we’re calling it “Xbox 20/20.” These monthly moments will take place throughout the rest of the year and...”

https://news.xbox.com/en-us/2020/05/05/xbox-2020-the-future-of-xbox/

My guess is they saw the blowback from their failed May event, also saw that Sony was producing actual gameplay and exclusives to show, and realised that a set of trailers in June running on “representative hardware” would go down like a lead balloon.

So they try and get some games running on Xsex at high enough quality to show in June/July... maybe.

June is almost over and we haven't heard about a July date yet either ...
Well, they were the ones who talked about a first-party presentation in July, so the ball is in their court.
 
Well looking at that CP2077 delay message - it says it’ll be playable on next gen consoles using BC mode and will get some improvements while we wait for the “real” next gen patch.

That’s basically confirming PS5 has the capability to improve PS4 games. The only question now is whether that’ll be for the whole 4000 odd game library or just for games built more recently.

I don’t think it said that? I think it said the PS4 game would work on PS5, and any improvements to leverage PS5 hardware would come later.

So will devs on both sides decide to go with the greater compression for a (very?) small visual difference?

Or will some want to make assets as clean as possible...

I've looked at the pics on RAD's site; I couldn't see much difference, but they were small pics and maybe it might show up more on a bigger screen?

I suppose either way now the option is there for devs.

It’s RAD’s job to make tools and provide options. I’m not sure what would call for a lossy BC7 texture to be losslessly compressed, but work has been put in to make it available.
Maybe for things with a lot of small text or menu backgrounds or something. I’m not sure.
 

Bo_Hazem

Banned
Just stop, okay? He clearly says "going forward"; he won't miss such an iconic game just because Xbox is the only platform not getting such games. It might change. Stop with that trolly wink and blow-kiss emoji too.

Yup, playing every game day one; when I go for PS5 it'll be extremely hard for me to play PS4 games on PS5 if I've already finished them. Maybe TLOU2 MP on PS5? I loved Uncharted 4 MP so much, that might make me play it with a PS5 upgrade if I get the time for it, but I believe I'll be busy with those new games announced.

There is nothing wrong with someone liking BC, it's just not for me. And about Captain Tsubasa: it's a very iconic anime globally, except in the US and some countries, and a multiplat that somehow Xbox is the only platform not getting, which makes customers interested in the title question their strategy, especially with Japanese games.
 

Tiago07

Member
I am bored, so let's make some guesses without any basis :lollipop_squint_tongue:

I think there will be phases for the minimum/recommended requirements.

First phase, where games are more similar to the current gen than to a new gen (first two years):

-Minimum: some games with zero changes will stay as they are now, like AC Valhalla I think, but many games will need to raise their minimum requirements to something similar in brute force (without RT) to an XSS: maybe an RX 590, 8 GB of RAM, a normal HDD and a Ryzen 3 1200.
-Recommended: other games that start to use the SSD as an essential part of their engines will require at least a SATA SSD, a CPU similar to a Ryzen 5 3600 (since most games use the CPU in a less aggressive way, aside from the AAA first-party-style titles of companies like CD Projekt Red) and 16 GB of RAM.
-PC not-quite-master-race experience: RTX 2070 Super, 24/32 GB of RAM, Ryzen 7 3700X, fast NVMe (more than 2 GB/s).

Second phase (next 2-3 years), where console games become the quality standard for AA and AAA (the indie market is by nature unpredictable):

-Minimum: specs similar to a possible XSS; something like an RX 590 but now with RT for the games that use it, 12 GB of RAM, SATA SSD, Ryzen 5 3600 (bye, four cores).
-Recommended: RTX 2070 Super (RT necessary), Ryzen 7 3800X, 2 GB/s NVMe (some games even requesting PCIe 4.0), 24/32 GB RAM.
-PC master race experience: something above a 2080 Ti, Ryzen 9 3900X, 32 GB of RAM (DDR5 should already be out there), PCIe 4.0 SSD above 5.5 GB/s.

Third phase (last years of the gen), where the PC once again destroys any console, even with the mid-range segment:

-Minimum: a combination of XSX and PS5 specs: RTX 2070 Super (RT necessary), Ryzen 7 3800X, 2 GB/s NVMe, 24 GB RAM.
-Recommended: well above a 2080 Ti, Ryzen 9 3900X, 32 GB of RAM (DDR5), PCIe 4.0 SSD above 5.5 GB/s.
-PC master race experience: whatever NVIDIA's xx80 series is by then, Ryzen 9 3950X or even above, PCIe 5.0 SSD above 9 GB/s.

-Nintendo announces its first console with RT in this era, but weaker than an XSS :lollipop_fearful:

I'm supposing, among many things, that DirectX 12 Ultimate will help reduce the workload needed to stream so much data from the SSD, because if not... oh boy, we have problems for SATA SSDs, 16 GB of RAM and CPUs with fewer than 8 cores, and that Ryzen 7 3800X in recommended will probably become a 3900X.

Basically my idea is that the different console components will be the base for many requirements, but as time passes this will shift from high-end PC territory to "budget". I also expect a ton of users to say this gen is the worst, as many of the games will be really hard to push past 60 fps at 4K, at least in the first two phases.
giphy.gif
Hmm, I understand your point, and I think we have to wait and see, but like I said, for now the streaming of higher-quality assets, the streaming of general data, the ultra mode (which increases geometry and polygons if I'm not wrong, which is not a problem for XSX and PS5 - remember PS5's billions of polygons and triangles) and loading are a problem for PC right now.

The only thing I can see is PC with more FPS and better RT, but I still have some doubts about that too. It's not about brute force but what you can do with data, which is PC's largest weakness. And in that Linus video, he said PC could become the lowest common denominator, so now I'm curious.
 

ethomaz

Banned
I think what I did looks close but there's so many different comparisons out there that I'm still not sure what the actual size is.
Yeah, I searched on Twitter and came up with so many results lol
But one thing is true: the pics with the FAT width are all fakes... it looks tall and thin.
 

HAL-01

Member
Yeah, I searched on Twitter and came up with so many results lol
But one thing is true: the pics with the FAT width are all fakes... it looks tall and thin.
Yup, what doesn't really show in the front and side renders is that the top is more than twice as thick as the bottom, while in the back the top is similarly much thinner than the bottom. From a front view it all looks sort of Xbox 360-shaped, but it's actually a much slimmer, more dynamic shape.

The horizontal render makes it easier to see

tBMIUuL.png
 