
Next-Gen PS5 & XSX |OT| Console tEch threaD


geordiemp

Member
Smartshift makes the budget shared and fixed. It's not fixed in every APU. That it's shared is not relevant to anything I said. That it's fixed makes the two processors interdependent. We still don't know how long the PS5 can keep both at their max frequency. Cerny says most of the time. The SmartShift website says never (for both to be at max frequency at the same time, each would need more than half of the power budget).
It's always possible that Sony has modified SmartShift, but like the GPU features, it's still rather a mystery.
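To put the fixed, shared budget point in toy terms (a Python sketch; the budget and per-block numbers are invented, Sony hasn't published the real ones):

# Toy model of a fixed, shared SoC power budget. All numbers are invented.
TOTAL_BUDGET_W = 200     # assumed combined CPU+GPU budget
cpu_demand_w = 60        # what the CPU would draw at max clock this frame (assumed)
gpu_demand_w = 150       # what the GPU would draw at max clock this frame (assumed)

if cpu_demand_w + gpu_demand_w <= TOTAL_BUDGET_W:
    print("both blocks can sit at max frequency")
else:
    # SmartShift-style outcome: budget shifts toward the busier block,
    # the other one clocks down until the total fits.
    leftover_for_gpu = TOTAL_BUDGET_W - cpu_demand_w
    print(f"GPU limited to {leftover_for_gpu} W and clocks down a bit")

If both blocks need more than half of the budget at the same time, the sum can't fit, which is the SmartShift-site reading quoted above.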



[GIF]
 

onesvenus

Member
Could someone with an approved Beyond3D account grab this photo and post it on this forum?

Thank you
Here you go. Credit goes to Dubs at the Beyond3D forum.

[image: oMdDm0G.png]

I think he is overestimating the width of the chip due to the perspective used in the unboxing video.

I'm still in the "no Infinity Cache on either console" camp.
 

raul3d

Member
[..]
Smart Shift is good. Actually, it's quite amazing. It saves throttling for hardware that can't support the heat output and/or has power issues.

However, if your console can handle the thermals and power of said hardware, it won't need to worry about this, and it would in fact be a step backward.

I don't want to get into this too much, as I'm sure some people wouldn't be able to take it in; not many people understood the Hot Chips talk. So for that, I will reply within 8 days.
Your description of SmartShift sounds really off.

For given hardware there is no single "thermals and power" figure. Power consumption (and heat generation) varies a lot with the workload (and obviously with frequency). Now, when you design your console, do you plan your cooling/power solution for a common game workload? Do you factor in AVX? What utilization do you assume? With fixed clocks, Microsoft planned for one scenario, and I would assume it is nothing too heavy. With variable clocks, Sony planned for both the common and the worst-case workload.

I am curious what you want to explain from the Hot Chips conference. I am pretty sure I understood what was presented there, and I would say that there is nothing "hidden" that needs explaining; it was a pretty straightforward hardware talk.
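To make the "there is no single thermals and power figure" point concrete: dynamic power goes roughly as activity × capacitance × V² × f, so the same chip at the same clock draws very different amounts depending on the workload. A rough Python sketch with invented constants (none of these are real console figures):

# Rough dynamic-power sketch: P ≈ activity * C * V^2 * f
def dynamic_power_w(activity, switched_cap_nf, voltage_v, freq_ghz):
    # nF * GHz cancels to F * Hz, so the result comes out in watts
    return activity * switched_cap_nf * voltage_v ** 2 * freq_ghz

common_game = dynamic_power_w(activity=0.60, switched_cap_nf=50, voltage_v=1.0, freq_ghz=1.8)
avx_heavy   = dynamic_power_w(activity=0.95, switched_cap_nf=50, voltage_v=1.0, freq_ghz=1.8)
print(common_game, avx_heavy)   # 54.0 vs 85.5 W at the exact same clock

Which of those two you size the cooling and PSU for is exactly the fixed-clock vs variable-clock design question.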
 
A topic on Variable Frequency & Power:

Bear with me here, long post incoming. It will be worth a read, and hopefully it'll clear up some misunderstanding regarding the PS5's variable frequency. I'll do my best to explain with the limited info we have available. And pardon my English, as it's not my native language.

Instead of asking "Why didn't MS implement variable frequency?"

A better question is: why did Sony choose NOT to implement a fixed frequency?

It's simply because the fixed frequency strategy didn't allow Sony to reach where they wanted to be in terms of frequency. What does that mean? Their goal from the get-go for PS5 was to set the GPU frequency as high as possible.

As Cerny mentions:

But with the fixed frequency strategy, he couldn't go as high as he wanted to; more on this below.


So what's this "old fixed frequency strategy"?
It is to run the GPU at a constant frequency at all times and let power vary based on the GPU workload. Btw, GPU power draw varies A LOT from game to game and even scene to scene in a game. Anyway, this is the exact same strategy that PS4, PS4 Pro, and Xbox consoles from XB1 to Series X/S use.

And why was 2 GHz unreachable?
Cerny's "2 GHz was looking unreachable" comment above doesn't mean the chip was having thermal issues, nor that it was incapable of reaching 2 GHz. In fact, the chip was capable and had the potential to go comfortably beyond 2 GHz. It was simply that, with this old strategy of fixed clocks and variable power, they were unable to achieve their goal of crossing 2 GHz; in other words, the power that could be supplied to the chip was likely insufficient. Now, with the variable frequency strategy, "a completely different paradigm" as Cerny calls it, they're able to fully tap into the GPU's potential in terms of frequency. More on this below.
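A toy calculation of why that is (all figures invented, and pretending power scales linearly with clock just to keep it simple): with fixed clocks, the clock you can promise is set by the hungriest scene, not the average one.

# Why worst-case power caps a fixed clock. Invented figures only.
POWER_LIMIT_W = 180            # what the PSU/cooler is sized for (assumed)
worst_case_w_at_2ghz = 210     # hungriest scene's draw at 2.0 GHz (assumed)

guaranteed_clock_ghz = 2.0 * POWER_LIMIT_W / worst_case_w_at_2ghz
print(f"{guaranteed_clock_ghz:.2f} GHz")   # ~1.71 GHz, i.e. "2 GHz was looking unreachable"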



So what is this different paradigm and how are they achieving higher frequency with this variable frequency strategy?

To this Cerny answers:


So what was previously an unreachable 2 GHz is now well over 2 GHz.

So the goal of 2.2+ GHz is now achieved, and it was done, in Cerny's words, by "supplying a generous amount of electrical power", i.e. constant power, and then capping the frequency at 2.23 GHz to ensure the logic operates properly. SmartShift is used as a supplement to this strategy, letting the SoC send unused CPU power to the GPU to squeeze out a bit more performance. SmartShift isn't itself the variable frequency solution, btw.
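In the same toy terms as before, the new paradigm flips it: hold power constant, let the clock follow the scene, and cap it at 2.23 GHz (again, all numbers invented for illustration):

# Constant power budget, variable clock, capped at 2.23 GHz. Invented numbers.
POWER_BUDGET_W = 180
FREQ_CAP_GHZ = 2.23

def gpu_clock_ghz(scene_power_at_cap_w):
    # Clock this scene can run at within the fixed budget
    # (still pretending power scales linearly with clock).
    return min(FREQ_CAP_GHZ, FREQ_CAP_GHZ * POWER_BUDGET_W / scene_power_at_cap_w)

print(gpu_clock_ghz(160))   # light scene -> 2.23 GHz, pinned at the cap
print(gpu_clock_ghz(195))   # heavy scene -> ~2.06 GHz, a few percent down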



Why does Cerny like running the GPU at a higher frequency?
Suppose Sony had gone with the same 56 CUs as MS with 4 disabled (52 CUs active) for the PS5, with the goal of 10.3 TF. They would have to set the frequency at 1544 MHz to achieve that. They would surely reach their goal of 10.3 TF, but the performance would be noticeably different between 36 CUs at 2230 MHz and 52 CUs at 1544 MHz. Cerny even gave this example, but with 36 vs 48 CUs.
[image: Cerny's 36 CU vs 48 CU comparison slide]
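The arithmetic behind that 1544 MHz figure, assuming the usual 64 shader ALUs per CU and 2 FLOPs per ALU per clock:

# TFLOPS = CUs * 64 ALUs/CU * 2 FLOPs per ALU per clock * clock in GHz / 1000
def tflops(cus, ghz):
    return cus * 64 * 2 * ghz / 1000

print(round(tflops(36, 2.230), 2))   # 10.28 TF, the actual PS5 config
print(round(tflops(52, 1.544), 2))   # 10.28 TF, the hypothetical wide-and-slow config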



At Hot Chips, MS showed their "GPU Evolution" slide for the Series X, comparing its GPU all the way back to the original Xbox One GPU.

[image: Series X "GPU Evolution" slide from Hot Chips]


The GPU evolution slide focuses on 4 metrics:
  1. Computational power
  2. Memory bandwidth
  3. Rasterization rate
  4. Pixel fillrate
Here's what a notional PS5 GPU with a 52 CU @ 1544 MHz config would look like:
10.3 TFLOPS, 448 GB/sec, 6.18 Gtri/sec, 98.8 Gpix/sec

Here's what the actual PS5 GPU looks like with the current 36 CU @ 2230 MHz config:
10.3 TFLOPS, 448 GB/sec, 8.92 Gtri/sec, 142.7 Gpix/sec

You reach your TF goal, but look at rasterization and pixel fillrate: in the 52 CU configuration they take a massive hit. With 36 CUs at the higher clock, those units run about 44% faster than they would at 1544 MHz. Not only does rasterization go up, pixel fillrate goes up too, along with processing of the command buffer, and the L2 and other caches get 44% more bandwidth. Not to mention it would cost more money to go with a larger GPU. Please note I'm not downplaying MS here, just pointing out the strategy Sony has taken based on the information that's out there. Whatever strategy MS followed works best for their machine, and I'm not downplaying that one bit.
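For anyone who wants to check those figures: rasterization and pixel fillrate are per-clock numbers (assuming RDNA2's 4 triangles per clock and 64 ROPs at 1 pixel per clock each, which is what the figures above imply), so at equal TFLOPS they scale directly with the clock, and 2230/1544 is where the ~44% comes from:

# Front-end / back-end throughput scales with clock, not CU count.
def raster_gtri_s(ghz):
    return 4 * ghz      # assumed 4 triangles per clock

def fillrate_gpix_s(ghz):
    return 64 * ghz     # assumed 64 ROPs, 1 pixel per ROP per clock

for label, ghz in [("52 CU @ 1.544 GHz", 1.544), ("36 CU @ 2.230 GHz", 2.230)]:
    print(label, round(raster_gtri_s(ghz), 2), "Gtri/s", round(fillrate_gpix_s(ghz), 1), "Gpix/s")

print("clock ratio:", round(2.230 / 1.544, 2))   # 1.44 -> the "44% higher" figure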

Hope this post was able to explain Cerny's variable frequency approach a bit better and why they went with it. There is obviously more to talk about, and there is still not much detail or info, like what the power consumption actually looks like, etc.

You sir, did an amazing job explaining everything and have officially won the internet for today!!! The elders of the internet would be proud!

I have attached your coupon below!

[image: coupon]
 

MastaKiiLA

Member
I can say the same about vibration in controllers as used in the majority of games. It is a gimmick. That doesn't automatically mean bad. But it is a gimmick.
That's a completely subjective statement. What's a gimmick to you won't be to others. So there's little to no value in repeatedly sharing this opinion. You don't think much of it? Good for you. I think we've heard this stated enough already. I'm not sure anyone's opinion will be changed for or against it.
 

Nowcry

Member
That's what it looks like it will do, but I'm not sure.
Starfield and TES:VI day one on gamepass!

I prefer to pay for these, since they are long games, in case they get added and then pulled from the service before I've finished them. They're also worth replaying a second time for the mods.

Also, all Bethesda games are better on PC, since that's where you get full mod support.

It has been a good acquisition for selling Windows, but quite a poor one for selling consoles.
 

Thirty7ven

Banned
Oh, a feature supported by most gaming titles for the past 25 years is now a gimmick.

We are really redefining the meaning of everything right now.

You see, a gimmick is apparently a feature that isn't needed for a game to work. So games are mostly gimmicks plus a couple of core features...

Expect the next year to be filled with bipolar arguments about the merits of the DualSense by fans of other plastic boxes.
 

Godfavor

Member
The PS5 adjusts CPU or GPU speeds when a load would push the system above its power limit, i.e. beyond what the PSU is designed to deliver.

Because it is clocked so high, the engineers had to put limits in place based on cooling, wattage and EU regulations.

This means that the PS5 will sit at max, steady clocks most of the time, and will only downclock when a scene has a lot of effects in play.

Because the XSX has lower clocks (the wide-and-slow approach), its PSU/cooling is more than enough to keep both the CPU and GPU at max clocks all the time.
 

hemo memo

Gold Member
That's a completely subjective statement. What's a gimmick to you won't be to others. So there's little to no value in repeatedly sharing this opinion. You don't think much of it? Good for you. I think we've heard this stated enough already. I'm not sure anyone's opinion will be changed for or against it.

I think the DualSense impressions are great, I'm excited, and I already have an extra one pre-ordered. I don't mean it in a bad way.
 
Getting a Series X is getting less and less exciting by the day. You want a game experience only possible because of the hardware. That's why PS5 exclusives are exciting.
Is this a joke? I had to read it multiple times. If you are serious, what changed your mind? I recall you were in the green camp.
 

AeneaGames

Member
We'll see. Also, with the RDNA2 features, I would think that within 12 months the Series S could be pushing the PS5 pretty hard. I wouldn't worry about the Series X if I were you; it'll be out of sight by then.



[image]




You can't be serious?

It's a 4 TF console with less RAM, slower memory, etc., and yet you believe it will start pushing the PS5???

I thought TF was the measurement that counted, and now you're saying it isn't, and that all the marvelous GPU features can make up the difference against a GPU that's 2.5 times more powerful?

I, I don't have words for this...
 

hemo memo

Gold Member


Edit: Added reddit link


- Creation Engine has been overhauled

Yeah, bullshit. They are still using this awful, old, glitch-fest of a broken engine, and they're trying to excuse the eventual glitch festival on release day with the "it is a MASSIVE world" bullshit again.

No, Bethesda. Not doing this bullshit again, and Microsoft should honestly step in and burn that fucking engine to the ground.
 

DrDamn

Member
It has been a good acquisition for selling Windows, but quite a poor one for selling consoles.

For me I think ...
  • It's a great acquisition to sell GamePass.
  • It's a fantastic acquisition to sell GamePass in the PC space.
  • It's a very good acquisition to sell consoles.
The first two points are most important for MS, but the set of games and developers they have acquired (or are acquiring) is great for them for a whole host of reasons.
 

Gudji

Member

XSX: 1080-2160p
XSS: 720-1440p

I know people talk about res and frame rates for console warz but at the end of the day all next-gen consoles will use dynamic resolutions.
 

Reficul

Member
Remember guys and girls,

If you have not made a donation or just want to leave some kind words, this thread is still open. I will only keep bumping it a few more times.

Thanks all

Thanks for the reminder, sircaw.
Made a small donation for AeneaGames; not much, but as a good friend of mine told me:
A rising tide lifts all boats.
Or something like that.

By the way: I'm back in this thread after a rough couple of weeks (thread ban).
Thanks to the mod staff for letting me back in.
 

azertydu91

Hard to Kill
Smartshift doesn’t adjust clocks.

In the first four words you’ve demonstrated you haven’t paid attention.
Dynamic 4K always changes framerates and varying clocks changes power.
When I turn on the oven in my kitchen, the shower starts pouring.
Putting gas in my car refills the brake fluid too.
Now let me tell you why all of these things that I don't understand are a bad thing.
 

Nowcry

Member
For me I think ...
  • It's a great acquisition to sell GamePass.
  • It's a fantastic acquisition to sell GamePass in the PC space.
  • It's a very good acquisition to sell consoles.
The first two points are most important for MS, but the set of games and developers they have acquired (or are acquiring) is great for them for a whole host of reasons.

Don't read too much into it, it's just a personal impression. I like Bethesda games on PC. That's all.
 

Insane Metal

Gold Member
Gears 5 on XSX:

  • SP: dynamic res, 1080p to 2160p, with an average of 1720p
  • MP: 1080p at 120FPS
  • Small dips in FPS but mostly due to bugs which will be fixed with an update
  • Series S will run at dynamic res 720p to 1440p
  • SSGI, several upgrades to shadows, lighting, textures and shaders
  • VRS Tier 2 gives 5 to 12% improvement in performance (as per Devs)
  • Greatly reduced input latency
  • Load times reduced from 42s on One X to 8s on XSX
 
[image: 44NuNT6.png]

Going as low as 720P on a next gen console? What the hell??
It averages 1720p, and it's better than ULTRA settings on PC, with SSGI and several upgrades to shadows, lighting, textures and shaders.

There are small and very rare dips to 1080p, but most of the time it's 1720p or higher.
Why do you act like it's at 1080p 99% of the time?

The same applies to the XSS too: very rare dips to 720p.

Just because it dips to 720p on very rare occasions doesn't mean it's the end of the world. Stop reaching.
 