
Next-Gen PS5 & Next Xbox |OT| Speculation/Analysis/Leaks Thread

Next Gen Consoles Power Level


  • Total voters
    237

llien

Gold Member
Feb 1, 2017
5,724
2,822
720
So in theory RDNA is a 1.25x gain, but in practice it can be as much as a 1.4x gain? Maybe that's where I got the 1.4x figure from yesterday. A 1.4x gain would make a 9.2 TF RDNA GPU similar to a 12.88 TF GPU of today.
IPC is the gain per clock. But Navi cards also clock higher; e.g. Vega 64 hovered at 1.6 GHz, while the 5700 XT can cross 2 GHz.
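As a quick sanity check on the arithmetic in this exchange (both gain factors are the thread's estimates, not AMD-published numbers), a minimal sketch:

```python
# Forum-estimate arithmetic: convert a raw RDNA TFLOP figure into a
# "GCN-equivalent" figure using the claimed overall gain factor.
rdna_tf = 9.2        # rumored PS5 GPU, raw FP32 TFLOPs
ipc_gain = 1.25      # claimed RDNA-vs-GCN gain per clock
overall_gain = 1.40  # claimed gain once higher clocks are included

gcn_equivalent_tf = rdna_tf * overall_gain
print(f"{rdna_tf} TF RDNA ~ {gcn_equivalent_tf:.2f} TF GCN")  # 12.88
```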
 
  • Like
Reactions: mckmas8808

JaguarCROW

Neo Member
Jun 5, 2013
49
3
310
Sunnyvale CA
Sorry, but is there some special method for being able to vote in the poll above? I can only select each category and get the breakdown of users, but can't seem to cast a vote myself. Thanks, and sorry for the stupid question.
 

llien

Gold Member
Feb 1, 2017
5,724
2,822
720
Oberon is also a Shakespeare character, like Ariel, Gonzalo and Flute.

But the biggest indication that Oberon is the PS5 GPU is the three clock speeds of 800 MHz, 911 MHz and 2.0 GHz, the first two being the exact clock speeds of the base PS4 and the Pro.

Even if Gonzalo and Flute weren't the PS5, this thing definitely is.
But given that RDNA IPC is higher, shouldn't said clock speeds also be reduced, rather than match the base PS4 and Pro?
 
  • Thoughtful
Reactions: Negotiator

SonGoku

Gold Member
Aug 16, 2018
3,849
3,651
560
So RDR2 is not a graphics-intensive game compared to what...?
Did I say it wasn't? That doesn't change the fact that most graphics-intensive games dip below 4K.
Just like most PS4 Pro games dip below 1800p, even though GoW is
I'm really curious what that secret sauce in next-gen visuals is that you expect, such that you need 11-13 TF of Navi? And without it, 4K is somehow impossible?
2x for a next-gen leap is anemic
I expect next-gen games, not current-gen games on ultra...
Read what Phil said recently...
Read what Cerny said recently:
A true generational shift tends to include a few foundational adjustments. A console's CPU and GPU become more powerful, able to deliver previously unattainable graphical fidelity and visual effects; system memory increases in size and speed; and game files grow to match, necessitating larger downloads or higher-capacity physical media like discs.
They achieved 4K with the XBX, and now they will target other areas.
So 11 TF+ is a given then for XBOX2
Sorry, but is there some special method for being able to vote in the poll above? I can only select each category and get the breakdown of users, but can't seem to cast a vote myself. Thanks, and sorry for the stupid question.
I think you need member status to be able to vote.
 
Last edited:
  • Like
Reactions: pawel86ck

joe_zazen

Member
May 2, 2017
1,429
1,012
385
RDR2 was 4K on the X1X because it is a game built for the 1.3 TF X1 running on the X1X. The X1X isn't some magic box that runs everything at 4K because Phil built it that way. It can run most X1 games that way because the X1 limits how much can be packed into a game. Remove the X1 limiting factor, and the X1X will struggle.
 
Last edited:

squarealex

Member
May 11, 2014
228
15
340
France
Also, the One X has 12 GB of memory, so most games can push 4K easily compared to the PS4 Pro. More VRAM is needed when you push the resolution. And the Pro has, like the base PS4, only 8 GB (5 GB for gaming).
 
Last edited:

JaguarCROW

Neo Member
Jun 5, 2013
49
3
310
Sunnyvale CA
I think you need member status to be able to vote
That would be nice if it were indicated in the UI... Still frustrated that my account was closed years ago because of a server mix-up on their end and I was forced to open a new one.
I mostly just like to read threads and not post much, so I'm not sure if I will ever become a "member". This account is certainly not new. Thanks for the clarification.
 

Panajev2001a

GAF's Pleasant Genius
Jun 7, 2004
13,590
1,773
1,955
Also, the One X has 12 GB of memory, so most games can push 4K easily compared to the PS4 Pro. More VRAM is needed when you push the resolution. And the Pro has, like the base PS4, only 8 GB (5 GB for gaming).
Technically the Pro has 0.5 GB more RAM dedicated to games than the base PS4, which is what they estimated display buffers would need in order to grow to 4K (1 GB extra is freed up by a separate 1 GB DDR4 block added to store backgrounded media/OS apps when you go back to a game; the other 0.5 GB is used by the OS to bring all of its UI to native 4K).

Having an extra 4 GB of RAM does allow greater bandwidth (more DRAM chips, extra channels, etc.) and space for higher-resolution textures. Sure, some of that is likely used to hold higher-resolution buffers for screen-space effects, off-screen render targets, and the like, but I think the first two items take far greater advantage of the extra RAM.

Xbox One X has higher floating-point performance, higher bandwidth, and more RAM. Thankful for that: launching one year later and $100 more expensive than the Pro would have been a disaster if it were not considerably more powerful.
 
  • Like
Reactions: Deto

Panajev2001a

GAF's Pleasant Genius
Jun 7, 2004
13,590
1,773
1,955
But given that RDNA IPC is higher, shouldn't said clock speeds also be reduced, rather than match the base PS4 and Pro?
I think it would help their compatibility mode if they could get clock speeds to match as closely as possible (part of their BC patents was to allow HW components like the GPU, buses, etc. to clock as close to the original system as possible, although they also had patents about allowing per-title BC boost modes to improve quality and performance).
 

ANIMAL1975

Member
Aug 27, 2018
431
473
270
Portugal 🇵🇹
That would be nice if it were indicated in the UI... Still frustrated that my account was closed years ago because of a server mix-up on their end and I was forced to open a new one.
I mostly just like to read threads and not post much, so I'm not sure if I will ever become a "member". This account is certainly not new. Thanks for the clarification.

Talk with @EviLore , maybe he can help you.
 

it_wasn't_me

Member
May 22, 2017
520
299
295
Also, the One X has 12 GB of memory, so most games can push 4K easily compared to the PS4 Pro. More VRAM is needed when you push the resolution. And the Pro has, like the base PS4, only 8 GB (5 GB for gaming).
Memory footprint shrinks over a console's lifespan. The 3 GB OS figure is six years old (the Shadow Fall presentation).
 
  • Like
Reactions: Panajev2001a

MadAnon

Member
Sep 11, 2018
186
122
230
Did I say it wasn't? That doesn't change the fact that most graphics-intensive games dip below 4K.
Just like most PS4 Pro games dip below 1800p, even though GoW is
Which are those most graphics-intensive games? By the way, Metro: Exodus also runs at native 4K on the XBX, and it's probably one of the most graphics-intensive games out there.
The PS4 Pro is not only quite a bit weaker than the XBX but basically 3x weaker than a 5700 XT equivalent.

2x for a next-gen leap is anemic
I expect next-gen games, not current-gen games on ultra...
2x and 3x the respective mid-gen refreshes is not anemic. Way to twist logic to fit your narrative. It's much more than the previous gen: it would be roughly 8x over the PS4 in the GPU department, 4x in CPU.

Did you see the Halo teaser? Where were your super-duper next-gen-looking ultra visuals? Honestly, it looked no better than, or even worse than, some of the best-looking games from this gen. I think you overestimate the graphics leap. Too many tech demos in your imagination.

So what exactly did he say that doesn't align with what Phil said? 4K was possible last gen, and now they will push for higher, stable framerates + better visuals.

So 11 TF+ is a given then for XBOX2
How is it a given? Keep going with your flawed logic to fit your narrative.
 
Last edited:
Sep 25, 2016
1,707
14
240
This is a more realistic spec; I salute you.
I'm more of a PC player nowadays, but people forget the shit they pulled out of 1.3/1.8 TFLOP boxes with Jaguar CPUs. Going from a base of 1.3 TF to somewhere in the region of 10 TFLOPs, along with a ~400% increase in CPU compute, is going to yield insane visuals and crazy improvements to physics simulations, NPC numbers, etc., with hopefully 1080p/60fps options for almost all games.
 
Last edited:

INCUBASE

Member
Jan 8, 2018
324
215
255
If they can guarantee 120fps at 1440p/1080p for all games, I'll pay £500+

A console over £600 for just the base unit won't go down well with the public, too high

£499 is the magic number
 
  • Like
Reactions: Gamerasi

MadAnon

Member
Sep 11, 2018
186
122
230
RDR2 was 4K on the X1X because it is a game built for the 1.3 TF X1 running on the X1X. The X1X isn't some magic box that runs everything at 4K because Phil built it that way. It can run most X1 games that way because the X1 limits how much can be packed into a game. Remove the X1 limiting factor, and the X1X will struggle.
Let's go with the multiplat Metro: Exodus then. It runs at native 4K/30fps on an RX 580 equivalent and a Jaguar CPU. I bet they could push quite a few more frames with a better CPU alone. Now imagine if you had the potential of another RX 580 and three more Jaguars untapped.
 
Last edited:

xGreir

Member
Apr 1, 2019
139
117
220
I am not willing to pay for a 2 GHz machine that can't push more than 10 TF, you know, making me forget that I've purchased a jet engine.

I had enough with the Yellow Surprise on PS3.
 
  • LOL
Reactions: Negotiator

sonomamashine

Member
Jun 29, 2019
503
240
265
Seems a bit ahead of the actual news - they have the clock (supposedly) but how do they know how many CUs to get the performance?
They didn't mention the CU count in the article; I don't know if I understood your question (correct me if I'm wrong). If it's 9.7 TF, then 38 CUs @ 2 GHz ≈ 9.7.
Are you asking for the formula?
I can't understand how, with all the knowledgeable people out there, this BS gets published.

They are hungry for visits
Gotta pay those bills 🤣.
 

xool

Member
May 29, 2018
841
723
335
If it's 9.7 TF, then 38 CUs @ 2 GHz ≈ 9.7.
Are you asking for the formula?
Wondering where they (or anyone) got the 38 CU figure (or whatever it's supposed to be) from, which AFAIK they need to estimate TFLOPs (the article says 9.2 TF RDNA).

AFAIK we haven't had any leaks giving CU counts yet, even dodgy ones.

[edit: it's 128 GFLOPs per CU per GHz, right?]
[edit 2: oh, at 9.2 TF that's 36 CUs, so they must be assuming it's an "overclocked 5700"?]
 
Last edited:

sonomamashine

Member
Jun 29, 2019
503
240
265
Wondering where they (or anyone) got the 38 CU figure from, which AFAIK they need to estimate TFLOPs (the article says 9.2 TF RDNA).
AMD TF count:
The number of compute units × 64 (which gives you the number of Radeon shaders) × 2 (instructions that can theoretically be processed per clock) × clock speed.
You just do the math to figure out the CU count.
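That formula can be written out as a small helper; the 38 CU / 2 GHz and 9.2 TF numbers below are just the rumored figures from this thread:

```python
def amd_tflops(cus: int, clock_ghz: float) -> float:
    """FP32 TFLOPs = CUs x 64 shaders x 2 ops/clock (FMA) x clock in GHz."""
    return cus * 64 * 2 * clock_ghz / 1000  # GFLOPs -> TFLOPs

def cu_count(tflops: float, clock_ghz: float) -> float:
    """Invert the formula to back out the CU count from a TF figure."""
    return tflops * 1000 / (64 * 2 * clock_ghz)

print(amd_tflops(38, 2.0))  # 9.728, i.e. the article's ~9.7 TF
print(cu_count(9.2, 2.0))   # 35.9375, i.e. ~36 CUs
```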
 

MadAnon

Member
Sep 11, 2018
186
122
230
Wondering where they (or anyone) got the 38 CU figure (or whatever it's supposed to be) from, which AFAIK they need to estimate TFLOPs (the article says 9.2 TF RDNA).

AFAIK we haven't had any leaks giving CU counts yet, even dodgy ones.

[edit: it's 128 GFLOPs per CU per GHz, right?]
[edit 2: oh, at 9.2 TF that's 36 CUs, so they must be assuming it's an "overclocked 5700"?]
The CU count is only an educated guess from the available info: the Gonzalo benchmark and leaked frequencies in comparison to desktop equivalents. 36 CUs seems plausible, especially given that potential backwards-compatibility approach.
 
Last edited:
Dec 12, 2006
2,802
374
1,080
Seems a bit ahead of the actual news - they have the clock (supposedly) but how do they know how many CUs to get the performance?
18.4 TF

72 active CUs @ 2000 MHz

7nm EUV moves the power curve to the right.

So a 1550 clock gaining ~10% (per the 7nm EUV perf gain) takes the clock to around 1700.

But if a console manufacturer was determined to go all out with the launch machine, the old power curve didn't really start accelerating until 1800. So 1800 plus ~10% puts us around 2000.

Previously I was only thinking of the density increase that 7nm EUV would bring to squeeze in 80 CUs. Doh!

This would give an incredible 10x performance increase over the base PS4. Truly a generational leap. I hope the engineers can pull it off.

Poll options for an aggressive 16+ TF and an ultra 18+ TF may be necessary for dreamers like me ;)

*tips hat to the excellent work of @SonGoku @Negotiator and @DemonCleaner on this thread*
 

sonomamashine

Member
Jun 29, 2019
503
240
265
18.4 TF

72 active CUs @ 2000 MHz

7nm EUV moves the power curve to the right.

So a 1550 clock gaining ~10% (per the 7nm EUV perf gain) takes the clock to around 1700.

But if a console manufacturer was determined to go all out with the launch machine, the old power curve didn't really start accelerating until 1800. So 1800 plus ~10% puts us around 2000.

Previously I was only thinking of the density increase that 7nm EUV would bring to squeeze in 80 CUs. Doh!

This would give an incredible 10x performance increase over the base PS4. Truly a generational leap. I hope the engineers can pull it off.

Poll options for an aggressive 16+ TF and an ultra 18+ TF may be necessary for dreamers like me ;)

*tips hat to the excellent work of @SonGoku @Negotiator and @DemonCleaner on this thread*
There is a difference between dreaming and tripping; guess which side you're on.
 
Dec 12, 2006
2,802
374
1,080
There is a difference between dreaming and tripping; guess which side you're on.
I'm not sure either!

I'm on holiday at the moment and typing this on mobile. If I were at home, I'd be all over that 2000 clock leak to verify it. Here my Google-fu skills are weak, unfortunately.

As I see it currently:
  • 12 TF - rock bottom target on 7nm EUV
  • 14.2 TF - mid-low target 72@1550 (my current baseline)
  • 16.1 TF - mid-high target 72@1750
  • 18.4 TF - ultra target 72@2000
I've no idea how this beast will be cooled/powered. It seems crazy that these kinds of figures are even on the table, but here we are.

Scarlett specs would get a bump too as it will also be 7nm EUV.
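Plugging those speculative 72-CU configurations into the standard AMD formula (CUs × 64 shaders × 2 ops/clock × clock) reproduces the list above:

```python
def tflops(cus: int, clock_mhz: int) -> float:
    # FP32 TFLOPs = CUs x 64 shaders x 2 ops/clock x clock in MHz / 1e6
    return cus * 64 * 2 * clock_mhz / 1e6

for clock in (1550, 1750, 2000):
    print(f"72 CUs @ {clock} MHz -> {tflops(72, clock):.2f} TF")
```

The 1750 and 2000 MHz entries match (16.13 and 18.43 TF); 1550 MHz actually works out to ~14.28 TF, so the 14.2 figure above looks truncated rather than rounded.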
 

MadAnon

Member
Sep 11, 2018
186
122
230
With liquid cooling, people are pushing the 5700 XT comfortably over 2 GHz. It can actually be pushed that high even with the reference cooler cranked up to max, aka jet mode.
N7P supposedly offers a 10% reduction in power consumption at the same clocks compared to N7 (7nm DUV). With a good non-liquid cooling solution and N7P, 36/40 CUs at 2 GHz is very doable.
 

SlimySnake

Member
Feb 5, 2013
4,032
35
460
But given that RDNA IPC is higher, shouldn't said clock speeds also be reduced, rather than match the base PS4 and Pro?
Technically the Polaris GPU in the Pro also had IPC improvements over the first-gen GPU in the PS4, and yet Sony thought it wise to disable 18 CUs to make sure unpatched PS4 games ran on the same 18 CUs they were used to running on in the base PS4.

I see what you are saying, but whatever BC approach Sony is going with seems to work well when running games at the same clocks they were designed for.
 
  • Like
Reactions: llien

Negotiator

Member
Jun 28, 2011
4,026
656
695
But given that RDNA IPC is higher, shouldn't said clock speeds also be reduced, rather than match the base PS4 and Pro?
Correct.

Even Zen 2 running at 1.6 GHz will not properly replicate Jaguar at 1.6 GHz; there's roughly a 2x IPC difference.

Maybe Zen 2 running at 800 MHz would be more accurate, but I don't think that's possible (only Intel CPUs can downclock to 800 MHz).
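The clock arithmetic here, assuming performance scales as IPC × clock and taking the 2x IPC ratio at face value (it is the poster's estimate, not a measured figure):

```python
# If Zen 2 retires ~2x the instructions per clock of Jaguar (assumed),
# matching Jaguar at 1.6 GHz means running at half the clock.
jaguar_clock_ghz = 1.6
zen2_ipc_ratio = 2.0  # assumed Zen 2 : Jaguar IPC

matching_zen2_clock = jaguar_clock_ghz / zen2_ipc_ratio
print(f"Zen 2 @ {matching_zen2_clock:.1f} GHz ~ Jaguar @ 1.6 GHz")  # 0.8 GHz
```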

Memory footprint shrinks over a console's lifespan. The 3 GB OS figure is six years old (the Shadow Fall presentation).
How much is it these days?

I remember the PS3 OS went from 120 MB to 50 MB, but for the PS4 OS we don't have more current info than the 3 GB figure.
 
Last edited:

Negotiator

Member
Jun 28, 2011
4,026
656
695
Technically the Polaris GPU in the Pro also had IPC improvements over the first-gen GPU in the PS4, and yet Sony thought it wise to disable 18 CUs to make sure unpatched PS4 games ran on the same 18 CUs they were used to running on in the base PS4.

I see what you are saying, but whatever BC approach Sony is going with seems to work well when running games at the same clocks they were designed for.
Are we sure it's Polaris and not some semi-custom hybrid with Polaris features?


CLRX* Version: GCN 1.1

* https://github.com/CLRX/CLRX-mirror/wiki/GcnIsa
 

Marlenus

Member
Jul 29, 2013
1,143
77
445
UK
No, it is not "retarded", because I was comparing performance vs clock; power usage didn't even come into it.

Seriously, don't go throwing around the phrase "retarded".

The 1080 is full fat; the 5700 is 36 of a max 40 CUs.

Again, lay off the attacks.
I am calling a spade a spade.

Perf/clock does not really make sense for graphics cards. Better to compare perf/TFLOP, perf/transistor, perf/mm², or perf/watt.

Computerbase.de did perf/FLOP comparisons between Navi and Turing, Pascal, Polaris, and Vega. Vs. Turing it was neck and neck; vs. Pascal, RDNA was around 13% ahead on average.

From that you can say that RDNA has lower perf/transistor than Pascal when clock speeds are fixed, but RDNA naturally clocks slightly higher (about 5%) than Pascal if you look at the average clock speeds of the 1080 vs. the 5700 XT, which closes the gap somewhat. AnandTech has really good average clocks per game on their power-consumption pages.
 

Marlenus

Member
Jul 29, 2013
1,143
77
445
UK