
Next-Gen PS5 & XSX |OT| Console tEch threaD


MadAnon

Member
Because most graphics-intensive games are dynamic. Regardless, assuming native 4K, 11TF+ is even more necessary.
For an 11TF machine I expect the target resolution to be the same as the X: 4K dynamic, sometimes static.
So RDR2 is not a graphics-intensive game compared to what...? I'm really curious: what is that secret sauce in next-gen visuals you expect that needs 11-13TF of Navi? And without it, 4K is somehow impossible?

Read what Phil said recently...

I think the area that we really want to focus on next generation is frame rate and playability of the games.
This generation, we’ve really focused on 4K visuals and how we bring both movies through 4K Blu-ray and video streaming, and with Xbox One X allowing games to run at 4K visuals, we’ll make really strong visual enhancements next generation.

They achieved 4K with the XBX and now they will target other areas. But keep cornering yourself.
 

llien

Member
So in theory RDNA is a 1.25x gain, but in practice it can be as much as a 1.4x gain? Maybe that's where I got the 1.4x figure from yesterday. A 1.4x gain for a 9.2TF RDNA GPU would make it similar to a 12.88TF GPU of today.
IPC is the gain per clock. But Navi cards also clock higher, e.g. Vega 64 hovered at 1.6GHz while the 5700 XT can cross 2GHz.
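As a rough sketch of how those two factors combine (the 1.25x IPC figure is AMD's own RDNA-vs-GCN claim; the clocks are ballpark figures, so treat the output as illustrative only):

```python
# Per-CU throughput scaling combines the per-clock (IPC) gain with the
# clock-speed gain. Note: TFLOPS figures already bake in the clock, so
# don't apply the clock factor twice when comparing TF numbers directly.
ipc_gain = 1.25                 # AMD's claimed RDNA perf-per-clock gain
clock_gain = 2.0 / 1.6          # ~1.25x (5700 XT ~2.0 GHz vs Vega 64 ~1.6 GHz)
print(f"combined per-CU gain: {ipc_gain * clock_gain:.2f}x")  # ~1.56x
```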
 

JaguarCROW

Member
Sorry, but is there some special method for being able to vote in the poll above? I can only select each category and get the breakdown of users, but I can't seem to cast a vote myself. Thanks, and sorry for the stupid question.
 

llien

Member
Oberon is also a Shakespeare character, like Ariel, Gonzalo and Flute.

But the biggest indication that Oberon is the PS5 GPU is the three clockspeeds of 800 MHz, 911 MHz and 2.0 GHz, the first two being the exact GPU clockspeeds of the base PS4 and the Pro.

Even if Gonzalo and Flute weren't the PS5, this thing definitely is.
But given that RDNA IPC is higher, shouldn't said clockspeeds also be reduced and not match base PS4 and Pro?
 

SonGoku

Member
So RDR2 is not a graphics-intensive game compared to what...?
Did I say it wasn't? That doesn't change the fact that most graphics-intensive games dip below 4K.
Just like most PS4 Pro games dip below 1800p even though GoW is 1800p.
I'm really curious: what is that secret sauce in next-gen visuals you expect that needs 11-13TF of Navi? And without it, 4K is somehow impossible?
2x for a next-gen leap is anemic.
I expect next-gen games, not current-gen games on ultra...
Read what Phil said recently...
Read what Cerny said recently:
A true generational shift tends to include a few foundational adjustments. A console’s CPU and GPU become more powerful, able to deliver previously unattainable graphical fidelity and visual effects; system memory increases in size and speed; and game files grow to match, necessitating larger downloads or higher-capacity physical media like discs.
They achieved 4k with XBX and now they will target other areas.
So 11TF+ is a given then for XBOX2
Sorry, but is there some special method for being able to vote in the poll above? I can only select each category and get the breakdown of users, but I can't seem to cast a vote myself. Thanks, and sorry for the stupid question.
I think you need member status to be able to vote
 

joe_zazen

Member
RDR2 was 4K on X1X because it is a game built for the 1.3TF X1 running on the X1X. The X1X isn't some magic box that runs everything at 4K because Phil built it that way. It can run most X1 games that way because the X1 limits how much can be packed into a game. Remove the X1 limiting factor, and the X1X will struggle.
 

squarealex

Member
Also, the One X has 12 GB of memory, so most games can push 4K more easily compared to the PS4 Pro. More VRAM is needed when you push the resolution, and the Pro has, like the base PS4, only 8 GB (5 GB for games).
 

JaguarCROW

Member
I think you need member status to be able to vote
That would be nice if it were indicated in the UI... Still frustrated that my account was closed years ago because of a server mixup on their end and I was forced to open a new one.
I mostly just like to read threads and not post much, so I'm not sure if I will ever become a "member". This account is certainly not new. Thanks for the clarification.
 

Panajev2001a

GAF's Pleasant Genius
Also, the One X has 12 GB of memory, so most games can push 4K more easily compared to the PS4 Pro. More VRAM is needed when you push the resolution, and the Pro has, like the base PS4, only 8 GB (5 GB for games).

Technically the Pro has 0.5 GB more RAM dedicated to games than the base PS4, as that was what they estimated display buffers would need in order to grow to 4K (1 GB extra is given back because they added a separate 1 GB DDR4 block to store backgrounded media/OS apps for when you go back to a game; the other 0.5 GB is used by the OS to bring all of its UI to native 4K).

Having an extra 4 GB of RAM does allow greater bandwidth (more DRAM chips, extra channels required, etc.) and space for higher-resolution textures. Sure, some of it is likely used to hold higher-resolution buffers for screen-space effects, off-screen render targets, and the like, but the first two items are, I think, taking far more advantage of the extra RAM.

Xbox One X has higher floating-point performance, higher bandwidth, and more RAM. Thankfully so, because launching one year later and $100 more expensive than the Pro would have been a disaster if it were not considerably more powerful.
 

Panajev2001a

GAF's Pleasant Genius
But given that RDNA IPC is higher, shouldn't said clockspeeds also be reduced and not match base PS4 and Pro?

I think it would help their compatibility mode if they could get clockspeeds to match as closely as possible (part of their BC patents was to allow HW components like the GPU, busses, etc. to clock as close to the original system as possible, although they also had patents allowing per-title BC boost modes to improve quality and performance).
 

ANIMAL1975

Member
That would be nice if it were indicated in the UI... Still frustrated that my account was closed years ago because of a server mixup on their end and I was forced to open a new one.
I mostly just like to read threads and not post much, so I'm not sure if I will ever become a "member". This account is certainly not new. Thanks for the clarification.


Talk with EviLore, maybe he can help you.
 
Also, the One X has 12 GB of memory, so most games can push 4K more easily compared to the PS4 Pro. More VRAM is needed when you push the resolution, and the Pro has, like the base PS4, only 8 GB (5 GB for games).

The memory footprint shrinks over a console's lifespan. The 3 GB OS figure was from 6 years ago (the Shadow Fall presentation).
 

MadAnon

Member
Did I say it wasn't? That doesn't change the fact that most graphics-intensive games dip below 4K.
Just like most PS4 Pro games dip below 1800p even though GoW is 1800p.

Which are those most graphics-intensive games? By the way, Metro: Exodus also runs at native 4K on the XBX, and it's probably one of the most graphics-intensive games out there.
The PS4 Pro is not only quite a bit weaker than the XBX but basically 3x weaker than a 5700 XT equivalent.

2x for a next-gen leap is anemic.
I expect next-gen games, not current-gen games on ultra...

2x and 3x the respective mid-gen refreshes is not anemic. Way to twist logic to fit your narrative. It's much more than the previous gen: it would roughly be 8x over the PS4 in the GPU department and 4x in CPU.
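Ballpark math behind those multipliers, using the commonly cited paper-TFLOPS figures and the ~1.4x RDNA-vs-GCN perf/TF factor discussed earlier in the thread (all assumptions, not confirmed specs):

```python
# GCN paper TFLOPS of the current machines (commonly cited figures).
ps4, ps4_pro, one_x = 1.84, 4.2, 6.0
# Hypothetical 9.2TF RDNA next-gen GPU, scaled by an assumed ~1.4x perf/TF.
gcn_equiv = 9.2 * 1.4            # ~12.9 "GCN-equivalent" TF
print(f"vs One X: {gcn_equiv / one_x:.1f}x")    # ~2.1x
print(f"vs Pro:   {gcn_equiv / ps4_pro:.1f}x")  # ~3.1x
print(f"vs PS4:   {gcn_equiv / ps4:.1f}x")      # ~7.0x (the "x8" is generous)
```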

Did you see the Halo teaser? Where were your super-duper next-gen-looking ultra visuals? Honestly, it looked no better, or even worse, than some of the best-looking games from this gen. I think you overestimate the graphics leap. Too many tech demos in your imagination.


So what exactly did he say that doesn't align with what Phil said? 4K was possible last gen, and now they will push for higher, stable framerates plus better visuals.

So 11TF+ is a given then for XBOX2

How is it a given? Keep going with your flawed logic to fit your narrative.
 

Polygonal_Sprite

Gold Member
This is a more realistic spec, I salute you.

I'm more of a PC player nowadays, but people forget the shit they pulled out of 1.3/1.8 TFLOP boxes with Jaguar CPUs. Going from a base of 1.3TF to somewhere in the region of 10TF, along with a ~400% increase in CPU compute, is going to yield insane visuals and crazy improvements to physics simulations, NPC counts, etc., with hopefully 1080p/60fps options for almost all games.
 

INC

Member
If they can guarantee 120fps at 1440p/1080p for all games, I'll pay £500+.

A console over £600 for just the base unit won't go down well with the public; too high.

£499 is the magic number.
 

MadAnon

Member
RDR2 was 4K on X1X because it is a game built for the 1.3TF X1 running on the X1X. The X1X isn't some magic box that runs everything at 4K because Phil built it that way. It can run most X1 games that way because the X1 limits how much can be packed into a game. Remove the X1 limiting factor, and the X1X will struggle.
Let's go with the multiplat Metro: Exodus then. It runs at native 4K/30fps on an RX 580 equivalent and a Jaguar CPU. I bet they could push quite a few more frames with a better CPU alone. Now imagine having untapped potential equal to another RX 580 and three more Jaguar CPUs on top.
 

xGreir

Member
I am not willing to pay for a 2 GHz machine that can't push at least 10+ TF, you know, making me forget that I've purchased a jet engine.

I had enough with the Yellow Surprise on the PS3.
 
Seems a bit ahead of the actual news - they have the clock (supposedly) but how do they know how many CUs to get the performance?
They didn't mention the CU count in the article, so I don't know if I understood your question (correct me if I'm wrong), but if it's 9.7TF: 38 CUs @ 2GHz = 9.7.
Are you asking for the formula?
I can't understand how, with all the knowledgeable people out there, this BS gets published.

They are hungry for visits.
Gotta pay those bills 🤣.
 

xool

Member
if it's 9.7TF: 38 CUs @ 2GHz = 9.7.
Are you asking for the formula?

Wondering where they (or anyone) got the 38CU (or whatever it's supposed to be) from - which afaik they need to estimate TFlops (the article says 9.2TF RDNA)

afaik we haven't had any leaks giving CUs yet, even dodgy ones

[edit = it's 128 GFlops per CU/GHz right ?]
[edit2 - oh at 9.2TF that's 36 CUs - so they must be assuming it's an "overclocked 5700" ? ]
 
Wondering where they (or anyone) got the 38CU figure from - which afaik they need to estimate TFlops (the article says 9.2TF RDNA)
AMD TF count:
The number of compute units × 64, which gives you the number of Radeon shaders, × 2 for the instructions that can theoretically be processed per clock, × clock speed.
You just do the math to figure out the CU count.
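The same formula as code (a minimal sketch; 64 shaders per CU and 2 ops per clock per shader are the standard GCN/RDNA figures):

```python
def tflops(cus: int, clock_ghz: float) -> float:
    """AMD paper TFLOPS: CUs x 64 shaders x 2 ops (FMA) x clock (GHz)."""
    return cus * 64 * 2 * clock_ghz / 1000.0

def cus_for(tf: float, clock_ghz: float) -> float:
    """Back out the CU count implied by a TF figure at a given clock."""
    return tf * 1000.0 / (64 * 2 * clock_ghz)

print(tflops(38, 2.0))    # 9.728 -> the "38 CUs @ 2GHz = 9.7" above
print(cus_for(9.2, 2.0))  # ~35.9 -> i.e. 36 CUs for the article's 9.2TF
```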
 

MadAnon

Member
Wondering where they (or anyone) got the 38CU (or whatever it's supposed to be) from - which afaik they need to estimate TFlops (the article says 9.2TF RDNA)

afaik we haven't had any leaks giving CUs yet, even dodgy ones

[edit = it's 128 GFlops per CU/GHz right ?]
[edit2 - oh at 9.2TF that's 36 CUs - so they must be assuming it's an "overclocked 5700" ? ]
The CU count is only an educated guess from the available info: the Gonzalo benchmark and leaked frequencies compared to desktop equivalents. 36 CUs seems plausible, especially given the potential backwards-compatibility approach.
 
Seems a bit ahead of the actual news - they have the clock (supposedly) but how do they know how many CUs to get the performance?

18.4 TF

72 active CUs @2000

7nm EUV moves the power curve to the right.

So a 1550 clock gaining ~10% (per 7nm EUV perf gain) takes the clock to around 1700.

But if a console manufacturer was determined to go all out with the launch machine, the old power curve didn't really start accelerating until 1800. So 1800 plus ~10% puts us around 2000.

Previously I was only thinking of the density increase that 7nm EUV would bring to squeeze in 80 CUs. Doh!

This would give an incredible 10x performance increase over the base PS4. Truly a generational leap. I hope the engineers can pull it off.

Poll options for an aggressive 16+TF and ultra 18+TF may be necessary for dreamers like me ;)

*tips hat to the excellent work of SonGoku, Negotiator and DemonCleaner on this thread*
 
18.4 TF

72 active CUs @2000

[...]

Poll options for an aggressive 16+TF and ultra 18+TF may be necessary for dreamers like me ;)
There is a difference between dreaming and trippin'; guess which side you're on.
 
There is a difference between dreaming and trippin'; guess which side you're on.

I'm not sure either!

I'm on holiday at the moment and typing this on mobile. If I were at home, I'd be all over that 2000 clock leak to verify it. Here my google-fu skills are weak, unfortunately.

As I see it currently:
  • 12 TF - rock bottom target on 7nm EUV
  • 14.2 TF - mid-low target 72@1550 (my current baseline)
  • 16.1 TF - mid-high target 72@1750
  • 18.4 TF - ultra target 72@2000
I've no idea how this beast will be cooled/powered. It seems crazy that these types of figures are even on the table, but here we are.

Scarlett specs would get a bump too as it will also be 7nm EUV.
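For what it's worth, running those targets through the standard AMD formula (72 active CUs assumed throughout, as above):

```python
def tflops(cus: int, clock_mhz: int) -> float:
    # CUs x 64 shaders x 2 ops x clock; MHz in, TFLOPS out.
    return cus * 64 * 2 * clock_mhz / 1e6

for mhz in (1550, 1750, 2000):
    print(f"72 CUs @ {mhz} MHz -> {tflops(72, mhz):.1f} TF")
# -> 14.3, 16.1 and 18.4 TF (the 14.2 in the list above just rounds down)
```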
 

xool

Member
I'm not sure either!

call me at 52
 

MadAnon

Member
With liquid cooling, people are pushing the 5700 XT comfortably over 2GHz. It can actually be pushed that high with the reference cooler cranked up to max, aka jet mode.
N7P supposedly offers a 10% reduction in power consumption at the same clocks compared to N7 (7nm DUV). With a good non-liquid cooling solution and N7P, 36/40 CUs at 2GHz is very doable.
 

SlimySnake

Flashless at the Golden Globes
But given that RDNA IPC is higher, shouldn't said clockspeeds also be reduced and not match base PS4 and Pro?
Technically, the Polaris GPU in the Pro also had IPC improvements over the first-gen GCN GPU in the PS4, and yet Sony thought it was wise to disable 18 CUs to make sure unpatched PS4 games ran on the same 18 CUs they were used to running on in the base PS4.

I see what you are saying, but whatever BC approach Sony is going with seems to work well when running games at the same clocks they were designed for.
 
But given that RDNA IPC is higher, shouldn't said clockspeeds also be reduced and not match base PS4 and Pro?
Correct.

Even Zen 2 running at 1.6 GHz will not properly replicate Jaguar's behavior at 1.6 GHz; there's a ~2x IPC difference.

Maybe Zen 2 running at 800 MHz would be more accurate, but I don't think that's possible (only Intel CPUs can downclock to 800 MHz).
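The equivalence being suggested, as quick math (the 2x IPC ratio is an assumption from the post above, not a measured figure):

```python
# If per-clock throughput roughly doubles, matching the legacy CPU's
# timing behavior needs roughly half the clock.
jaguar_clock_ghz = 1.6
assumed_ipc_ratio = 2.0          # Zen 2 vs Jaguar, per the post above
print(jaguar_clock_ghz / assumed_ipc_ratio)  # 0.8 -> the 800 MHz mentioned
```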

The memory footprint shrinks over a console's lifespan. The 3 GB OS figure was from 6 years ago (the Shadow Fall presentation).
How much is it these days?

I remember the PS3 OS footprint went from 120 MB down to 50 MB, but for the PS4 OS we don't have any more current info than the 3 GB figure.
 
Technically, the Polaris GPU in the Pro also had IPC improvements over the first-gen GCN GPU in the PS4, and yet Sony thought it was wise to disable 18 CUs to make sure unpatched PS4 games ran on the same 18 CUs they were used to running on in the base PS4.

I see what you are saying, but whatever BC approach Sony is going with seems to work well when running games at the same clocks they were designed for.
Are we sure it's Polaris and not some semi-custom hybrid with Polaris features?


CLRX* Version: GCN 1.1

* https://github.com/CLRX/CLRX-mirror/wiki/GcnIsa
 

Marlenus

Member
No, it is not "retarded", because I was comparing performance vs clock; power usage didn't even come into it.

Seriously, don't go throwing around the phrase "retarded".



1080 is full fat, 5700 is 36 of a max 40 CU.

Again lay off the attacks.

I am calling a spade a spade.

Perf/clock does not really make sense for graphics cards. It's better to compare perf/TFLOP, perf/transistor, perf/mm², or perf/watt.

Computerbase.de did perf/FLOP comparisons between Navi and Turing, Pascal, Polaris and Vega. Vs Turing it was neck and neck; vs Pascal, RDNA was around 13% ahead on average.

From that you can say that RDNA has lower perf/transistor than Pascal when clockspeeds are fixed, but RDNA naturally clocks slightly higher (about 5%) than Pascal if you look at the average clock speed of the 1080 vs the 5700 XT, which closes the gap somewhat. Anandtech has really good average clocks per game in their power-consumption pages.
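A sketch of that perf-per-TFLOP metric with purely hypothetical inputs (a real comparison would use Computerbase's measured fps indices and Anandtech's average in-game clocks):

```python
# perf/TFLOP = performance index / paper TFLOPS, where the TFLOPS are
# computed from measured average in-game clocks, not boost-spec clocks.
def paper_tflops(shaders: int, avg_clock_ghz: float) -> float:
    return shaders * 2 * avg_clock_ghz / 1000.0

cards = {
    # name: (shader count, avg in-game clock GHz, fps index) - made-up numbers
    "Pascal-like": (2560, 1.73, 100.0),
    "RDNA-like":   (2560, 1.88, 112.0),
}
for name, (shaders, clk, fps) in cards.items():
    tf = paper_tflops(shaders, clk)
    print(f"{name}: {tf:.2f} TF, perf/TF = {fps / tf:.2f}")
```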
 