
Next-Gen PS5 & XSX |OT| Console tEch threaD


vpance

Member
Some interesting undervolting and downclocking tests on the 5700 XT over on Ree. A 15% downclock cut average power draw by 85W.

10-11TF minimum should be no problem, especially with EUV.
 

SonGoku

Member
Some interesting undervolting and downclocking tests on the 5700 XT over on Ree. A 15% downclock cut average power draw by 85W.

10-11TF minimum should be no problem, especially with EUV.
Yeah, that's what I've been preaching to CrustyBritches about lower clocks and voltage:
5700 XT @1686MHz = 105W average, 120W peak

A 54CU APU could comfortably hit 10.5-11TF on DUV at 200W or less
54CU @1520MHz = 10.5TF
54CU @1592MHz = 11TF
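For reference, a quick sketch of the arithmetic behind those figures, using the standard GCN/RDNA FP32 formula (CUs × 64 stream processors × 2 ops per clock × clock speed):

```python
# FP32 throughput for a GCN/RDNA-style GPU:
# TFLOPS = CUs * 64 stream processors per CU * 2 ops per clock (FMA) * clock in MHz / 1e6
def tflops(cus, mhz):
    return cus * 64 * 2 * mhz / 1e6

print(round(tflops(54, 1520), 2))  # ~10.51 TF
print(round(tflops(54, 1592), 2))  # ~11.00 TF
print(round(tflops(40, 1755), 2))  # ~8.99 TF -- 5700 XT at its "game clock", for comparison
```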
 
Last edited:

CrustyBritches

Gold Member
Yeah, that's what I've been preaching to CrustyBritches about lower clocks and voltage:
5700 XT @1686MHz = 105W average, 120W peak
Nah. He's using software monitoring and a game that doesn't push the GPU (this game runs 4K/30fps on X1X and averages around 1100MHz core clock on an RX 480). I was using software to monitor my GPU consumption and it reported 105W in DOOM; then I got a Kill A Watt and it's easily over 150W. The 5700 Pro comes undervolted to 0.98V and runs around 1.7GHz while pulling ~166W, while the XT is at 1.2V. What voltage is he running?
 

SonGoku

Member
Nah. He's using software monitoring and a game that doesn't push the GPU (this game runs 4K/30fps on X1X and averages around 1100MHz core clock on an RX 480). I was using software to monitor my GPU consumption and it reported 105W in DOOM; then I got a Kill A Watt and it's easily over 150W. The 5700 Pro comes undervolted to 0.98V and runs around 1.7GHz while pulling ~166W, while the XT is at 1.2V. What voltage is he running?
That doesn't explain the drop in power consumption compared to stock values (214W peak, 190W average). It's still an impressive drop.
 

CrustyBritches

Gold Member
That doesn't explain the drop in power consumption compared to stock values (214W peak, 190W average). It's still an impressive drop.
It explains it just fine. You can watch a video where AnandTech takes a 5700 Pro, which comes undervolted at 0.98V and runs 1.6-1.7GHz, pairs it with a Zen 2 8-core undervolted and underclocked to 3.2GHz at ~32W, and the system draw on the Kill A Watt was well over 200W, likely ~250W under load. That's 36CU.

I told you how I underclocked and undervolted an RX 480 and gave you FH4 benchmark results months ago. Don't try.
 

SonGoku

Member
It explains it just fine. You can watch a video where AnandTech takes a 5700 Pro, which comes undervolted at 0.98V and runs 1.6-1.7GHz, pairs it with a Zen 2 8-core undervolted and underclocked to 3.2GHz at ~32W, and the system draw on the Kill A Watt was well over 200W, likely ~250W under load. That's 36CU.

I told you how I underclocked and undervolted an RX 480 and gave you FH4 benchmark results months ago. Don't try.
It paints a positive picture for the 11TF dream on DUV; even if the readings are higher than reported, that applies to both stock and undervolted figures.
If an undervolt/downclock can shave almost 100W off the peak (120W vs 214W) on a young 7nm process at ~1686MHz, then a 54CU console APU at 1592MHz could potentially hit the 200W sweet spot on a more mature process or a refinement (7nm Pro), with the hobbit method as a bonus (quick sketch below).

On 7nm EUV, 11TF becomes the minimum baseline.
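As a very rough sketch of that 54CU scaling, assuming GPU core power scales roughly linearly with CU count and clock at a fixed voltage (an optimistic simplification; memory, I/O and the CPU side of the APU are not included):

```python
# Scale the undervolted 5700 XT figure (40 CU, ~1686 MHz, ~120 W peak) to a
# hypothetical 54 CU part at 1592 MHz. Assumes core power scales ~linearly with
# CU count and clock at fixed voltage -- optimistic, and it ignores memory, I/O
# and the CPU portion of an APU.
base_cus, base_mhz, base_watts = 40, 1686, 120
target_cus, target_mhz = 54, 1592

est_gpu_watts = base_watts * (target_cus / base_cus) * (target_mhz / base_mhz)
print(round(est_gpu_watts))  # ~153 W for the GPU portion alone
```

That would leave some, though not a lot of, headroom for the CPU and memory inside a ~200W system budget.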
 
Last edited:

CrustyBritches

Gold Member
It paints a positive picture for the 11TF dream on DUV
If an undervolt/downclock can shave almost 100W off the peak (120W vs 214W) on a young 7nm process at ~1686MHz, then a 54CU console APU at 1592MHz could potentially hit the 200W sweet spot on a more mature process or a refinement (7nm Pro), with the hobbit method as a bonus.

On 7nm EUV, 11TF becomes the minimum baseline.
Nah. I told you this months ago and gave you results, and you were dead against the idea that the X1X had only ~150W consumption... May 19th.
You can get the X1X's 4K/30fps FH4 result with an RX 480 at 1305MHz (core)/2150MHz (mem), a -12% power limit, and a slight undervolt. Average core speed is 1125MHz with the memory clock balls to the wall. 156W peak draw, average much lower. 150-160W peak draw is probably a best case for next-gen GPUs.

"No offense I'm skeptical" was your response.
 

SonGoku

Member
"No offense I'm skeptical" was your response.
How about you quote my full response?
No offense but I'm skeptical; this is the first time I've heard of the X GPU underperforming an RX 480. Most (including DF) put it up there with the 580
The X is limited by its CPU, so fps counts won't tell the whole story without diving into every effect used and its impact on performance
My skepticism was towards your claim that an RX 480 matched the X GPU.
 

CrustyBritches

Gold Member
How about you quote my full response?

My skepticism was towards your claim that an RX 480 matched the X GPU.
The context of the whole discussion was that I was telling you the X1X has ~150W peak GPU consumption and you were adamant it was higher. So I gave you power-limited (underclocked)/undervolted FH4 results months ago explaining why that is. Nice flip-flop. I'll wait for more comprehensive tests, thanks.
 

SonGoku

Member
The context of the whole discussion was that I was telling you the X1X has ~150W peak GPU consumption and you were adamant it was higher. So I gave you power-limited (underclocked)/undervolted FH4 results months ago explaining why that is. Nice flip-flop. I'll wait for more comprehensive tests, thanks.
I never claimed the X GPU pulled more; my argument was that the X GPU is on par with 200W+ GPUs.
My thesis for next-gen consoles has been consistent throughout this thread:
By going with a wider/slower design, console GPUs can outperform 200W GPUs (the 5700 XT in this case) while consuming less.
 
Last edited:

CrustyBritches

Gold Member
I never claimed the X GPU pulled more; my argument was that the X GPU is on par with 200W+ GPUs.
You claimed X1X performance reflected an RX 580 (200W average consumption). I showed why it reflects an underclocked/undervolted RX 480 in the same FH4 test: 1120MHz average, ~150W, 4K/30fps at near ultra. If I end up at ~150W at 1120MHz on a GPU that starts at 166W at 1266MHz, then there is a problem with your math. That's all. The 5700 Pro is your underclocked/undervolted 5700 XT, just with 4 fewer CUs, and it still pulls ~166W average, just like the RX 480. I bet good tuning would yield around ~150W, just like my results. AnandTech's results match this as well.
 
Last edited:

SonGoku

Member
You claimed X1X performance reflected an RX 580 (200W average consumption)
Please quote the post in question if you are so confident.
My argument has always been the same: consoles can outperform 200W cards by going with a wider/slower design (more CUs, lower clocks/voltage).
In the case of the X, it could match the RX 580 by going with a wider/slower design.

The problem with your RX 480 example is the chip lottery; similarly, a Radeon VII can be undervolted to hit 160W at stock clocks.
 

CrustyBritches

Gold Member
Please quote the post in question if you are so confident.
My argument has always been the same: consoles can outperform 200W cards by going with a wider/slower design
I have a bog-standard AIB RX 480 that matches 90% of results. Silicon-lottery claims based on rando software readings, in FH4, with no listed voltage don't cut it. I showed you underclocked/undervolted FH4 results months ago. I'll stand by my results and what I've seen from AnandTech.
 

SonGoku

Member
I have a bog-standard AIB RX 480 that matches 90% of results. Silicon-lottery claims based on rando software readings, in FH4, with no listed voltage don't cut it. I showed you underclocked/undervolted FH4 results months ago. I'll stand by my results and what I've seen from AnandTech.
If your 6TF RX 480 were the norm, the stock RX 580 would consume a lot less. Likewise, if that 160W Radeon VII were the norm, AMD would have shipped the Radeon VII at lower voltages.
What are we even arguing here? The RX 480 and 580 are the same rebranded card anyway...

Anyway, I expect a 54CU APU at 1520-1592MHz, plus the hobbit method and a more mature process (or 7nm Pro), to hit the 200W sweet spot.
 
Last edited:
Hi guys, I have not been following the last few pages.
Can anyone advise where we are now? Have there been any new leaks that seem credible, or any good educated guesses about the parts in the upcoming next-gen consoles?
Anything new?
Thanks, guys and gals.
 

xool

Member
3-3.3GHz appears to be the sweet spot for consoles


Isn't that graph actually showing the CPU starting to outpace (bottleneck) the memory at 3.3GHz? So its position is actually memory (speed) dependent, hence the diminishing (perf.) returns with power (watts).

We really need to know the voltages used for this graph (or whether voltage was fixed).

(Otherwise performance would be linear with frequency, but power would go with the square of the voltage needed to reach higher clocks.)
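Roughly, dynamic power goes as P ≈ C·V²·f, so any frequency bump that also requires a voltage bump costs disproportionately. A toy illustration (the voltage/frequency pairs below are made up for illustration, not taken from the graph):

```python
# Toy illustration of dynamic power scaling: P ~ C * V^2 * f.
# The voltage/frequency pairs are illustrative only, not measured values.
def relative_power(volts, ghz, base_volts=1.0, base_ghz=3.0):
    return (volts / base_volts) ** 2 * (ghz / base_ghz)

print(round(relative_power(1.00, 3.0), 2))  # 1.00 -- baseline at 3.0 GHz
print(round(relative_power(1.05, 3.3), 2))  # 1.21 -- +10% clock with a small voltage bump
print(round(relative_power(1.25, 3.8), 2))  # 1.98 -- +27% clock needing much more voltage ~ 2x power
```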
 
Last edited:

SonGoku

Member
Isn't that graph actually showing the CPU starting to outpace (bottleneck) the memory at 3.3GHz? So its position is actually memory (speed) dependent, hence the diminishing (perf.) returns with power (watts).
You think a bandwidth bottleneck is the reason for diminishing returns beyond 3.3GHz?
 

xool

Member
GDDR6 base clock varies between 1.75GHz (14Gbps) and 2GHz (16Gbps).

Compare that to a base clock of 1.6GHz for the DDR4-3200 in the example above.

I'd expect this difference in memory architecture/frequency to affect the sweet spot for CPU/GPU frequencies.
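For a sense of scale, peak theoretical bandwidth is the per-pin data rate times the bus width divided by eight; the bus widths below are illustrative assumptions, not confirmed console specs:

```python
# Peak theoretical bandwidth in GB/s = data rate per pin (Gbps) * bus width (bits) / 8.
# Bus widths here are assumptions for illustration, not confirmed console specs.
def bandwidth_gbs(gbps_per_pin, bus_bits):
    return gbps_per_pin * bus_bits / 8

print(bandwidth_gbs(14, 256))   # GDDR6 14 Gbps, 256-bit bus -> 448.0 GB/s
print(bandwidth_gbs(16, 256))   # GDDR6 16 Gbps, 256-bit bus -> 512.0 GB/s
print(bandwidth_gbs(3.2, 128))  # DDR4-3200, dual channel (128-bit) -> 51.2 GB/s
```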
 

xool

Member
You think a bandwidth bottleneck is the reason for diminishing returns beyond 3.3GHz?

[edit] It might not be a conventional bottleneck, but more to do with the CPU frequency exceeding the "synced" 1:1 frequency point.

Well, the memory was 3.2GHz (DDR4-3200)...

Like I mentioned, I'd need to see what was going on with the voltages... but I think so, yes.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
Well, the memory was 3.2GHz (DDR4-3200)...

Like I mentioned, I'd need to see what was going on with the voltages... but I think so, yes.

The graph seems consistent with a design where, to increase frequency, they had to increase voltage, hence the rapid spike from 3.3GHz upwards.

Of course, the lower the clock speed of the memory, the more difficult it is to raise the clock speed of the CPU (sometimes making wider bursts helps, but you need to be able to work around it on the CPU side... more data per transfer, but each transfer taking say 4-6 CPU clocks means you need to find work to cover the extra latency).
 
Last edited:

CrustyBritches

Gold Member
This vid has a 3700X, downclocked and undervolted, running at 3.2GHz with ~34W consumption. It's hard to read, but it looks like 0.9V. He's running 16GB of DDR4-3400. He seemed to have some issues in Exodus, perhaps related to too low a voltage, but everything else was fine. I think it's reasonable to say that consoles will use something like that.
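As a rough sanity check of that ~34W reading, you can scale an assumed ~65W stock operating point by (V/V0)² · (f/f0); the stock voltage and clock used below are ballpark assumptions, not measurements from that chip:

```python
# Rough sanity check of a downclocked/undervolted 8-core Zen 2:
# scale an assumed ~65 W stock operating point by (V/V0)^2 * (f/f0).
# The stock voltage/clock below are ballpark assumptions, not measurements.
stock_watts, stock_ghz, stock_volts = 65, 4.0, 1.25
target_ghz, target_volts = 3.2, 0.90

estimate = stock_watts * (target_volts / stock_volts) ** 2 * (target_ghz / stock_ghz)
print(round(estimate))  # ~27 W -- same ballpark as the ~34 W shown in the video
```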


SonGoku
[gif: RightHomelyAmurratsnake]
 
Last edited:

SonGoku

Member
[edit] It might not be a conventional bottleneck, but more to do with the CPU frequency exceeding the "synced" 1:1 frequency point.

Well, the memory was 3.2GHz (DDR4-3200)...

Like I mentioned, I'd need to see what was going on with the voltages... but I think so, yes.
Well, in that case GDDR6 should provide more bandwidth than DDR4, but it depends on the CPU bus (the PS4's was 20GB/s).
Regardless, the sweet spot in frequency per watt seems to be somewhere between 3 and 3.3GHz.
 
This vid has a 3700X, downclocked and undervolted, running at 3.2GHz with ~34W consumption. It's hard to read, but it looks like 0.9V. He's running 16GB of DDR4-3400. He seemed to have some issues in Exodus, perhaps related to too low a voltage, but everything else was fine. I think it's reasonable to say that consoles will use something like that.



That gif hahahaha
 

SonGoku

Member
definitely from 14TF to 10TF, you can't miss it; maybe at some point you went to 16/15, can't remember.
I've been set on 11-13TF pretty consistently; now, with news of Navi matching Turing in gaming performance per FLOP, I've made separate DUV and EUV estimates:
10-11TF (DUV), 11-14TF (EUV)

For EUV, I'd bet somewhere between 12TF and 13TF as the sweet spot. That said, even though I don't find it very likely, 14TF is technically possible on a console chip designed for 7nm EUV.
lol, I just saw this. Is it related to the video?
 
Last edited:

xool

Member
Next gen also needs to fix its patch sizes, or I may just find another hobby (not joking).

Here's what the Days Gone 1.25 patch does:

Weekly DLC Challenges
  • “Drifter’s Run” is our first challenge involving our bike! Gather all the bandages in the harsh volcanic area of Crater Lake
  • Avoid burned down houses, Freakers, and the Horde to gather bandages as fast as you can and make it to the finish line!
  • Use ramps to score extra points and utilize your Nitrous Boost as much as you can
A reminder that all the challenges we release are planned to stay unlocked indefinitely

Progression Issues

  • Players should be able to progress properly in “It’s A Rifle, Not a Gun”
General Fixes
  • General awareness of the NPCs has been adjusted for all modes
  • If you were killed right after burning down a nest, the nest should repopulate properly
  • Picking up a group of items near an ammo box will now also grab the ammo box
  • Adjustments to scoring on the DLC challenge “Drifter’s Run”
  • Correct number of bolts crafted is shown in the player’s menu
  • Fixed issues involving Survival Vision and enemy outlines during Survival Mode
  • The “Dead Don’t Ride” custom accent should appear in the Mechanic’s menu in this patch

Tiny bug fixes, probably 100 bytes of actual differences, and a short script to add a timed collection challenge...

Patch size: 16GB.
 

xool

Member

Well, 4K textures should be 4x the size of 1080p textures, so a quadrupling of memory requirements isn't unrealistic (8K should be 16x the memory compared to 1080p)... I expect maps to grow too (or the smallest ground poly unit to halve, which is 4x again).
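The pixel-count arithmetic behind that, as a quick sketch; sizes below are uncompressed RGBA8, and real assets are block-compressed, but the ratios between resolutions are the same:

```python
# Texture memory scales with pixel count: 4K has 4x the pixels of 1080p, 8K has 16x.
# Sizes are uncompressed RGBA8 (4 bytes per pixel); real assets use block compression,
# but the ratios between resolutions hold.
def texture_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2**20

print(round(texture_mib(1920, 1080), 1))  # ~7.9 MiB
print(round(texture_mib(3840, 2160), 1))  # ~31.6 MiB (4x)
print(round(texture_mib(7680, 4320), 1))  # ~126.6 MiB (16x)
```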
 
Last edited:
Correct me if I'm wrong, but multi-GB patches don't just update the executable (10-20MB?); they also update all the shader files (which are pre-compiled on consoles, unlike on PC) to solve optimization issues.
 

nowhat

Member
Patch size: 16GB.
Uhh, are you sure that's just for that particular patch and not compared to a vanilla release? The patches are cumulative, so if you want to get to version X, you also have to install the ones that came before.
 
Well, 4K textures should be 4x the size of 1080p textures, so a quadrupling of memory requirements isn't unrealistic (8K should be 16x the memory compared to 1080p)... I expect maps to grow too (or the smallest ground poly unit to halve, which is 4x again).
They should give us an option to stick to 1080p assets, no matter what. 4K is already ridiculous size-wise and 8K is gonna be even worse.

The XB1X does not give 1080p users the option to stick to 1080p assets; only the XB1 Slim allows you to do that. It's a simple software/OS fix.
 

xool

Member
Correct me if I'm wrong, but multi-GB patches don't just update the executable (10-20MB?); they also update all the shader files (which are pre-compiled on consoles, unlike on PC) to solve optimization issues.
I think so. As I understand it, the files are often huge blobs with multiple types of assets inside.

...but that isn't the issue. I refuse to believe there are multiple GB of actual changes; diff files (delta compression) exist...
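The idea behind delta patching, as a minimal sketch: split the old build into fixed-size blocks, hash them, and only ship the blocks that actually changed. Real patch systems (bsdiff, rsync-style rolling hashes) are far more sophisticated than this.

```python
# Minimal sketch of block-level delta patching: hash fixed-size blocks of the old
# build and only ship blocks whose content changed. Real patchers (bsdiff,
# rolling-hash/rsync-style schemes) do much better than this.
import hashlib

BLOCK = 64 * 1024  # 64 KiB blocks

def blocks(data):
    return [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]

def make_delta(old, new):
    old_index = {hashlib.sha256(b).digest(): i for i, b in enumerate(blocks(old))}
    delta = []
    for b in blocks(new):
        h = hashlib.sha256(b).digest()
        if h in old_index:
            delta.append(("copy", old_index[h]))  # block unchanged: already on disc
        else:
            delta.append(("data", b))             # only changed blocks get shipped
    return delta

def apply_delta(old, delta):
    old_blocks = blocks(old)
    return b"".join(old_blocks[x] if op == "copy" else x for op, x in delta)
```

With something like that, a handful of changed files plus one new script should never balloon into 16GB unless whole archives get repacked.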
 
Uhh, are you sure that's just for that particular patch and not compared to a vanilla release? The patches are cumulative, so if you want to get to version X, you also have to install the ones that came before.
Nope, PS4 supports delta (differential) updates.

If you go from v1.12 to v1.13, you only download the difference between those two, not everything from v1.01 to v1.12 (PS3 did that and it was a horrible experience in games like GT5).
 

xool

Member
Uhh, are you sure that's just for that particular patch and not compared to a vanilla release? The patches are cumulative, so if you want to get to version X, you also have to install the ones that came before.

Again, something else they might need to fix: if I have version 1.23, don't patch me from version 1.00 but from version 1.22... but that doesn't actually make sense, because version 1.00 will be long gone from my disc anyway, right?
 

nowhat

Member
Nope, PS4 supports delta (differential) updates.

If you go from v1.12 to v1.13, you only download the difference between those two, not everything from v1.01 to v1.12 (PS3 did that and it was a horrible experience in games like GT5).
Oh sure, I'm aware of the delta updates (which are a terrific thing). It's just that, unless I've missed it, I haven't had a 16GB update for Days Gone, like, ever. So for such a huge patch to arrive, I'd assume you're updating from 1.00 (or whatever the gold version was). I may have worded it wrong, but that's what I was getting at.
 
Last edited: