
Next-Gen PS5 & XSX |OT| Console tEch threaD


R600

Banned
Also, not only is the resolution increase for next gen expected to be 4x (twice the increase of last gen), people also expect games to run at 60fps, while this gen it was 30fps.

So just based on resolution/framerate expectations, next gen would have to deliver 4x by DEFAULT, plus the usual 8-10x increase in performance every gen. Really, really unrealistic expectations.

A 5700 XT and an 8-core Zen 2 will have no problems at 4K and 30fps. But but... people expect 60fps at 4K, even though last gen we were happy with 30fps and 1080p.
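The raw arithmetic behind that expectation is easy to sanity-check. A minimal Python sketch, assuming the nominal 1080p30 -> 4K60 targets being thrown around rather than numbers from any actual game:

```python
# Back-of-the-envelope check of the "4x resolution + 2x framerate" expectation.

def pixel_rate(width, height, fps):
    """Pixels that must be rendered per second at a given resolution and framerate."""
    return width * height * fps

current_gen = pixel_rate(1920, 1080, 30)   # nominal 1080p30 baseline
next_gen = pixel_rate(3840, 2160, 60)      # nominal 4K60 expectation

print(f"Raw pixel-rate multiplier: {next_gen / current_gen:.1f}x")  # -> 8.0x
```

That 8x is before any increase in per-pixel work, which is where the usual generational leap actually goes.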
 

SonGoku

Member
Is the CPU from the userbenchmark leak enough to push ANY game at 60fps? YES
Current gen games are designed for Jaguar CPUs
Do you expect next gen to just be current gen at 60fps lol?
It would be an incredibly stupid tradeoff.
Remember, extra die space is ALWAYS better spent on the GPU side in gaming consoles. Without exception.
If we were talking about a 370mm² vs 400mm² die I would agree with you, but for a 300 vs 330 die it's an incredibly shortsighted downgrade that would bring minimal short-term savings and no long-term savings while crippling CPU performance.
A 5700 XT and an 8-core Zen 2 will have no problems at 4K and 30fps
It barely runs current gen games at 4K; next gen games will be much more demanding.
Meanwhile, the 7850 was eating last gen games alive at 1080p: high settings and 60fps+, even 100+.
 
Last edited:

R600

Banned
I wouldn't say the 7850 was destroying games at 1080p back in 2013 (I am talking about modern games at the time, such as Crysis 2, Metro 2033 and BF3). In fact, at 1080p the average for these three was less than 50fps.

And obviously, another point is that the increase for next gen, just in pixel pushing, is TWICE as big as the increase last time. Compare it to 1440p, which would actually, based on pixel increase, be much more suitable.
 

SonGoku

Member
I wouldn't say the 7850 was destroying games at 1080p back in 2013 (I am talking about modern games at the time, such as Crysis 2, Metro 2033 and BF3). In fact, at 1080p the average for these three was less than 50fps.
That's mid to high 50s on high settings, similar performance to the RTX 2080 at 4K
Compare it to 1440p, which would actually, based on pixel increase, be much more suitable.
Precisely! I've been saying this all along: for a 1440p target the 5700 XT offers excellent performance.
The bad news is the next gen target is 4K, and the Gonzalo FS score points towards less than 5700 XT performance.
 
Last edited:

Darius87

Member
9TF RDNA is roughly comparable to Vega 64, which is why I brought it up
PC comparisons give an estimate of the performance gap between cards, that's the point
I don't know why you're even bringing that up. The point I'm making is that these cards would perform far better in consoles. You say that Vega 64 isn't even 2x the perf of the Xbox One X, but Vega 64 is around the same as the 5700 XT, which is true. Vega 64 isn't in the Xbox One X though, so in a console a Vega 64 would be 2x the perf, and so would the 5700 XT. That's why you don't compare PC cards with consoles.

Not really lol
2x the XBONEX will basically give you current gen games at 4K/60 and slightly prettier current gen games at 4K/30
Can you imagine a 12 TFLOP GCN console (9x > Xbox One, 6.5x > PS4, 2.8x > Pro) with only slightly better gfx? I'm not buying it, that would be a fully capable next-gen system. Even 8 TFLOPs of GCN would provide 4K with increased details over this gen, so the 4 TFLOPs left over would do wonders.
 

R600

Banned
Sony was officially aiming for 2x 1080p at E3 with the PS3, they hyped it to death, yet there were more sub-HD games than 720p ones, let alone 1080p.
 

SonGoku

Member
I don't know why you're even bringing that up. The point I'm making is that these cards would perform far better in consoles. You say that Vega 64 isn't even 2x the perf of the Xbox One X,
Because the equivalent of an RX 580 is inside the X. So 2x the X
or 6.5x the PS4 for 4x the resolution
Can you imagine a 12 TFLOP GCN console (9x > Xbox One, 6.5x > PS4, 2.8x > Pro) with only slightly better gfx? I'm not buying it, that would be a fully capable next-gen system.
For 9TF I expect devs to target 1440p-1800p to provide a next gen leap
Even 8 TFLOPs of GCN would provide 4K with increased details over this gen, so the 4 TFLOPs left over would do wonders.
It will look better for sure, it just wouldn't make for a next gen leap, which is why I think with that performance devs would target dynamic resolution, 1440p, 1800p etc. to provide a next gen leap.
Sony was officially aiming for 2x 1080p at E3 with the PS3, they hyped it to death, yet there were more sub-HD games than 720p ones, let alone 1080p.
Tbh I would be super happy with a 1440p (downsampled to a 1080p TV) PS5 with a 5700 XT level GPU.
The problem is both MS and Sony dug themselves into the 4K hole with the Pro/X.
 
Last edited:

R600

Banned
I don't agree they dug themselves into a hole. The Xbox One X is a 4K console at most 30-40% of the time, the Pro is the best console for 1080p gaming, and you can count native 4K games on your fingers, so that tells you everything you need to know.

So it's PR as usual IMO.
 
Last edited:

SonGoku

Member
I don't agree they dug themselves into a hole. The Xbox One X is a 4K console at most 30-40% of the time, the Pro is the best console for 1080p gaming, and you can count native 4K games on your fingers.

So it's PR as usual IMO.
But they set the expectation for 4K, just like 1080p was the expectation for this gen.
Customers expect next gen to be better than the mid-gen revisions.

Even if they don't reach full 4K they'll want to approximate it using dynamic res, 1800p, etc. Solutions that'll take more resources than 1440p.
 

Darius87

Member
Because the equivalent of an RX 580 is inside the X. So 2x the X
or 6.5x the PS4 for 4x the resolution
But the X doesn't need a 2x increase to run modern games at 4K, it nearly does it as it is; neither does the PS4 need 6.5x to run 4K.

For 9TF I expect devs to target 1440p-1800p to provide a next gen leap
With GCN no (but 4K current gen), with RDNA yes (4K next-gen).

It will look better for sure, it just wouldn't make for a next gen leap, which is why I think with that performance devs would target dynamic resolution, 1440p, 1800p etc. to provide a next gen leap.
It's really hard to understand what you mean by a next-gen leap. People have different expectations; if you could provide some example video of what you call a next-gen leap, that would be great.
And overall, the real next-gen leap was PS1 to PS2 or PS2 to PS3; nowadays I wouldn't call it a next-gen leap, more like a smooth bump in gfx.
 

R600

Banned
1080p was the expectation this gen and 70% of Xbone titles were below it.

Additionally, 99% of titles actually running at native 1080p were 30fps.

Is 4K on Navi XT at 30fps impossible? No, of course not, but it seems to me people expect 4K + 60fps, which is VERY different from what we got last gen, which is what you're comparing the leap to.

I bet 70+% of titles would be able to run at native 4K in a console environment. Not that I expect it; I think CB techniques have gotten so good, especially with RDNA (+ sharpening), that results could look almost 1:1 with 4K while running comfortably below it.
 

SonGoku

Member
But the X doesn't need a 2x increase to run modern games at 4K, it nearly does it as it is; neither does the PS4 need 6.5x to run 4K.
I meant to run next gen games at 4K; a 2x jump is small
neither does the PS4 need 6.5x to run 4K.
4K consumes like 4x
It's really hard to understand what you mean by a next-gen leap. People have different expectations; if you could provide some example video of what you call a next-gen leap, that would be great.
Every generation jump has had a bigger jump in GPU performance and a smaller increase in resolution
For next gen, 9TF would be a smaller than usual jump for a bigger than usual resolution jump

By next gen leap I mean a night and day difference you can instantly tell, not just prettier current gen games.
Additionally, 99% of titles actually running at native 1080p were 30fps.
The PS4 runs 99% of games at 1080p
The XBONE was stuck at 900p because MS fucked up

I don't expect 60fps btw
Is 4K on Navi XT at 30fps impossible?
For cross-gen games sure, for a next gen leap in visuals no. They'll have to use similar resolution targets as the PS4 Pro: 1440p-1800p (best case scenario)
 
Last edited:

Darius87

Member
I meant to run next gen games at 4K; a 2x jump is small
It would be more or less the same jump as PS3 to PS4.

By next gen leap I mean a night and day difference you can instantly tell, not just prettier current gen games.
That's not gonna happen, prepare for a letdown. I'm not saying next-gen consoles won't be capable of that jump, but even at the end of next gen I would be surprised if it were more than the PS3 to PS4 jump was; the problem is diminishing returns.
 

Darius87

Member
PS3 to PS4 is an 8x jump (without even taking into account the arch efficiency multiplier) for 2x the resolution
Much bigger jump
What's the arch efficiency from PS3 to PS4? I haven't even seen any figures except the 8x; it's basically the Nvidia arch that the RSX was vs Polaris.

PS3 to PS4 games are a night and day difference for me, it's instantly obvious.
I would call that PS2 to PS3; PS3 to PS4 is like half of that.
 
The reason for that is the ISA for GPUs changed several times in the last decade compared to the more static x86-64
He's right. nVidia has used the same RISC ISA since G80 in 2006. CUDA offers BC.

The big difference between CPUs and GPUs is that CPUs offer BC via elaborate microcode, while GPUs handle BC in the driver layer.

How do I search for Nvidia's GPU ISAs?
Tesla, Fermi and Maxwell?
AMD TeraScale, GCN

Still, you gotta admit the ISA is much more dynamic for GPUs
Tesla, Fermi and Maxwell are microarchitectures, not entirely new ISAs.

TeraScale is a VLIW ISA, while GCN is a RISC ISA. If you ever see assembly output from those 2, you'll see that it's entirely different.

There's probably a reason Sony was able to offer PS4 BC on the PS5: it's still GCN at its core, with some major enhancements to improve rasterization efficiency.
 

xool

Member
It would be more or less the same jump as PS3 to PS4.

More like an 8x increase - the CPU+GPU transistor budget increased from ~500 million to 4,000-5,000 million

That's not gonna happen, prepare for a letdown. I'm not saying next-gen consoles won't be capable of that jump, but even at the end of next gen I would be surprised if it were more than the PS3 to PS4 jump was; the problem is diminishing returns.
Could well be right - we aren't going to get 32-40 billion transistor APUs next gen. 4K will also steal some of the other gains made.

[Tbh 32 billion transistors would get performance to where this forum's expectations would be satisfied .. me included - difficult multi-patterning at 7nm stole a lot of "Moore's Law" gains, specifically affordability]
PS3 to PS4 games are a night and day difference for me, it's instantly obvious.
Rose-tinted memories distort everything - my memory of Oblivion (360, near launch) is similar to The Witcher 3 (near PS4 launch) - but the actual side-by-side screenshot reality is completely different
 
Last edited:

SonGoku

Member
What's the arch efficiency from PS3 to PS4?
The RSX wasn't even unified shaders, it had a fixed vertex/pixel pipeline split. The arch gap is huge, much bigger than GCN -> RDNA
XENOS was based on a very early version of TeraScale; the jump TS -> GCN is also huge
I haven't even seen any figures except the 8x
The 8x is for raw numbers only, not taking into account arch efficiency
PS3 to PS4 is like half of that
You asked what I meant by a next gen leap at 4K; that's my answer.
[Tbh 32 billion transistors would get performance to where this forum's expectations would be satisfied ..]
With <25 billion they can fit 80 CUs + a Zen 2 CCD.
 
Last edited:

xool

Member
What's the arch efficiency from PS3 to PS4?

Efficiency (i.e. polys per second per million transistors or something) doesn't increase as shaders become more programmable - it actually decreases - I have no way of putting a number on it though
 
Last edited:
Lol, I don't know how you guys do it, keeping this thread alive saying the same shit in different ways 😂😂😂😂😂. You guys should work in PR lol 👀🤣

I like it. :messenger_grinning_smiling:

Console generation transitions are fun. I won't be jumping onboard until the middle of next-gen but I like that other gamers are excited about the next console tech jump. This is one of my favourite GAF threads to be subscribed to :messenger_beermugs:.
 

TeamGhobad

Banned
I don't agree they dug themselves into a hole. The Xbox One X is a 4K console at most 30-40% of the time, the Pro is the best console for 1080p gaming, and you can count native 4K games on your fingers, so that tells you everything you need to know.

So it's PR as usual IMO.

Doing 4K on next-gen consoles is not going to be that hard. Expect beasts this time around. Both understand the value of having the most powerful system. 12-13 TFLOP RDNA will be plenty for 4K, at a reasonable price.
 

pawel86ck

Banned
52 Navi CUs at 1600MHz should be around 10.5-11TF? Also interesting details regarding RT and the insane up-to-64GB SSD VRAM cache.

  1. SOC: Anubis ( 393mm2 )
  2. CPU: Custom Zen 2, 8 Cores 16 Threads @ 3.4GHz Built-in with DirectX 12.X, DXR, DirectML and Havok instructions into the chip.
  3. GPU: Custom Navi 21, 52 Compute Units @ 1623MHz
  4. RT: 1 RTC per Compute Unit
  5. RAM: 24GB GDDR6 @ 560GB/s ( Samsung 12x2GB ), 18GB GDDR6 + 2GB GDDR6 Cache + 32GB~64GB SSD vRAM ( Up-to 84GB for Game ) and Dedicated 4GB GDDR6 for OS ( Native 4K60/120FPS Dashboard )
  6. STORAGE: 1TB SSD NVMe PCIe 4.0 @ 4GB/s, Flexible Dedicated vRAM starts with 32GB ( Up-to 64GB )
  7. AUDIO: Custom Tensilica HIFI DSP
  8. I/O: 2x HDMI 2.1, 2x USB-C ( Thunderbolt ), 2x USB-A, 1x RJ-45, 1x S-PDIF, 1x IR-OUT, Bluetooth 5.0, WIFI Direct, WIFI IEEE 802.11ax
  9. API: DirectX 12.X, DXR, DirectML
  10. PSU: 275W
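Quick sanity check on that 10.5-11TF figure. A small Python sketch, assuming the standard GCN/RDNA layout of 64 shader ALUs per CU and 2 FLOPs per ALU per clock (FMA); the 52 CU and 1623MHz numbers come straight from the rumored spec above:

```python
# FP32 throughput = CUs * ALUs per CU * FLOPs per clock * clock
def tflops(compute_units, clock_mhz, alus_per_cu=64, flops_per_clock=2):
    return compute_units * alus_per_cu * flops_per_clock * clock_mhz * 1e6 / 1e12

print(f"{tflops(52, 1623):.2f} TF")  # ~10.80 TF at the listed 1623MHz
print(f"{tflops(52, 1600):.2f} TF")  # ~10.65 TF at a round 1600MHz
```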
 
Last edited:

pawel86ck

Banned


Man, I'm looking at these benchmarks and even 2080 Ti results in 4K aren't always good. So the best currently available GPU can't max out current games at 60fps, yet people expect they will see 4K 60fps and much improved graphics fidelity on next gen consoles. IMO next gen consoles should aim at 1440p or dynamic 4K plus checkerboarding like in Gears of War 5. Native 4K is an extreme resource hog.
 

R600

Banned
The benchmark posted yesterday with 16GB was with downclocked 18Gbps chips on a 256-bit bus, resulting in 528 GB/s. So more bandwidth than the 2080 Super.

This is why I wonder how it can be anything other than a console? It's clearly an AMD APU with an 8-core Zen 2 and half the cache.

It also packs 16GB of the absolute fastest RAM that can be found today (not even in mass production yet) and a Navi GPU that matches the Ariel GPU.

There is not a single product I can think of that would pack something like this. A workstation laptop? Yeah, but with DDR4 RAM, because GDDR6 as system RAM is useless in anything other than a console. It's actually completely counterproductive.

Too much heat and cost for actually lower performance than DDR4. That changes if you need very fast, unified RAM, but you would need that only in... well, a console.
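The bandwidth math behind that 528 GB/s claim is simple. A minimal sketch, assuming the 256-bit bus being discussed; the 16.5 Gbps value is the implied downclock, and the 2080 Super line is just for comparison:

```python
# GDDR6 bandwidth (GB/s) = bus width in bits * per-pin data rate in Gbps / 8
def gddr6_bandwidth(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8

print(gddr6_bandwidth(256, 16.5))  # 528.0 GB/s -> 18Gbps chips downclocked to 16.5Gbps
print(gddr6_bandwidth(256, 18.0))  # 576.0 GB/s -> the same chips at full speed
print(gddr6_bandwidth(256, 15.5))  # 496.0 GB/s -> RTX 2080 Super, for comparison
```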
 
Last edited:

xool

Member
It's clearly an AMD APU with an 8-core Zen 2 and half the cache.

I think the other explanation makes more sense:
  • Has a 4MB L3 cache because it's Zen+ not Zen 2
  • Float scores seem relatively low because it's Zen+ not Zen 2
  • Is a pre-production final sample today because it's Zen+ and not Zen 2
[I agree it's a gaming-based APU] My guess is it's one of those unwieldy portable gaming PCs you see once at EGS and never hear from again .. some sort of gaming tablet with side sticks and a 2kg battery glued to the back. Something Razer might make.
 
Last edited:

Gamernyc78

Banned
I like it. :messenger_grinning_smiling:

Console generation transitions are fun. I won't be jumping onboard until the middle of next-gen but I like that other gamers are excited about the next console tech jump. This is one of my favourite GAF threads to be subscribed to :messenger_beermugs:.

OH trust me I'm excited. Nothing like that high feeling of opening a new console on day one and trying all its features lol. First day of PS4 all I did was stream, watch amateur porn on PS Live lol (and also participate lol) and use Share Play etc...
 

More dynamic in a sense. Intel/AMD never dropped x86 support, so it's about 40 years old.
But we're talking about x86-64, which is 16 years old. :)

Consoles don't have to provide 16-bit/IBM PC compatibility, they don't even have a traditional BIOS. Even 32-bit support is useless (with the exception of OG XBOX BC, which requires i386 ISA support).

PCs might do the same next year:


GCN is only 8 years old, so it's younger in comparison.

I wouldn't say x86 is "static". They keep adding new instructions all the time (e.g. AVX-512) that didn't exist 40 years ago.

The frontend offers BC via microcode, but the backend of a modern x86 processor is totally different compared to Intel 8088.

Efficiency (i.e. polys per second per million transistors or something) doesn't increase as shaders become more programmable - it actually decreases - I have no way of putting a number on it though
This is kinda true in a sense, but could you imagine a fixed-function T&L GPU these days? Or a VLIW uarch (compiler issues)? Or separate pixel/vertex shader pipelines?

These technologies don't make much sense for multi-billion transistor chips and that's why they were abandoned.

Unified shader efficiency comes from using all the available pipelines:

[image: unifiedshader.jpg]
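A toy model of that point (not how any real GPU schedules work): with a hard vertex/pixel split the frame is gated by whichever pool is the bottleneck, while a unified pool spreads all the work over every ALU. The workload numbers below are made up purely for illustration:

```python
vertex_work, pixel_work = 20, 80   # arbitrary units of work in one frame
total_units = 100                  # total shader units on the chip

# Pre-unified era: units hard-partitioned, e.g. 30 vertex / 70 pixel.
split_time = max(vertex_work / 30, pixel_work / 70)

# Unified shaders: any unit can run either kind of work.
unified_time = (vertex_work + pixel_work) / total_units

print(f"fixed split : {split_time:.2f} frame-time units")   # 1.14
print(f"unified pool: {unified_time:.2f} frame-time units")  # 1.00
```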
 

R600

Banned
I think the other explanation makes more sense:
  • Has a 4MB L3 cache because it's Zen+ not Zen 2
  • Float scores seem relatively low because it's Zen+ not Zen 2
  • Is a pre-production final sample today because it's Zen+ and not Zen 2
[I agree it's a gaming-based APU] My guess is it's one of those unwieldy portable gaming PCs you see once at EGS and never hear from again .. some sort of gaming tablet with side sticks and a 2kg battery glued to the back. Something Razer might make.
AFAIK a Zen+/Navi APU does not exist, only Zen 2 and Navi (end of the year release).

Also, I struggle, I really do, to find any hardware that could pack 18Gbps GDDR6 chips as SYSTEM memory. It would be unprecedented for a laptop or PC. For a laptop, because it's completely counterproductive (much higher TDP/cost and actually worse performance than 16GB of DDR4), and for a PC, because an APU with 16GB of GDDR6 at 18Gbps would be... completely puzzling? You would provide APU-based PCs only for the low end; what's the reason for incredibly high-powered APUs in a PC when you can go discrete?

The weird thing is, these are the absolute fastest chips Samsung makes and they are yet to be found in any product. Even high-performance GPUs such as the 2080 Super use 16Gbps chips. These speeds almost sound like too much even for a console, but not for a console with a "narrow" bus. For a 256-bit bus, slightly downclocked 18Gbps chips would bring 528GB/s of bandwidth. That would leave 440GB/s for the GPU alone...
 

xool

Member
AFAIK a Zen+/Navi APU does not exist, only Zen 2 and Navi (end of the year release).

Also, I struggle, I really do, to find any hardware that could pack 18Gbps GDDR6 chips as SYSTEM memory. It would be unprecedented for a laptop or PC. For a laptop, because it's completely counterproductive (much higher TDP/cost and actually worse performance than 16GB of DDR4), and for a PC, because an APU with 16GB of GDDR6 at 18Gbps would be... completely puzzling? You would provide APU-based PCs only for the low end; what's the reason for incredibly high-powered APUs in a PC when you can go discrete?

The weird thing is, these are the absolute fastest chips Samsung makes and they are yet to be found in any product. Even high-performance GPUs such as the 2080 Super use 16Gbps chips. These speeds almost sound like too much even for a console, but not for a console with a "narrow" bus. For a 256-bit bus, slightly downclocked 18Gbps chips would bring 528GB/s of bandwidth. That would leave 440GB/s for the GPU alone...

AFAIK we don't know it's Navi - and yes, I would expect a Zen+/Vega based APU

Also, I didn't remember anything about the speed of the GDDR6 - the latencies show that it's not DDR4 - but where is the 18Gbps from?

So much is odd about this:
  • Why a new codename "Flute" if it's PS5 (Gonzalo)?
  • Why did the sample ID format change (if it's Gonzalo)?
I try to be nearly realistic on next gen specs, but that 4MB [edit typo - 8MB] L3 cache has me triggered - it just seems a little too lowball, as does the 1.6/3.2GHz clock.

btw the bench is now MIA https://www.userbenchmark.com/UserRun/18618484
 
Last edited:

NickFire

Member
OH trust me I'm excited. Nothing like that high feeling of opening a new console on day one and trying all its features lol. First day of PS4 all I did was stream, watch amateur porn on PS Live lol (and also participate lol) and use Share Play etc...
I basically do the same thing with new consoles. It takes me a couple hours to actually play a game.
 

R600

Banned
AFAIK we don't know it's Navi - and yes, I would expect a Zen+/Vega based APU

Also, I didn't remember anything about the speed of the GDDR6 - the latencies show that it's not DDR4 - but where is the 18Gbps from?

So much is odd about this:
  • Why a new codename "Flute" if it's PS5 (Gonzalo)?
  • Why did the sample ID format change (if it's Gonzalo)?
I try to be nearly realistic on next gen specs, but that 4MB [edit typo - 8MB] L3 cache has me triggered - it just seems a little too lowball, as does the 1.6/3.2GHz clock.

btw the bench is now MIA https://www.userbenchmark.com/UserRun/18618484
Look at the code of the APU - 13F9 (Navi 10). It cannot be Vega. 13E9 was the Gonzalo GPU part, which matched Ariel's (codename) GPU. According to Komachi, this could be a later revision, hence the different ID as well.

16GB due to the fact that we know there are 16 chips and the SC for each is 33.1, which would indicate more than 66GB/s per 32-bit channel (so a downclocked 18Gbps chip is the only possibility).

Where are you seeing 4MB of L3? It's 16MB for the entire CPU, halved from the Zen 2 desktop part.

OK, went to look at the benchmark and it has been removed.🤔🤔
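Roughly the arithmetic behind that deduction, assuming the 256-bit clamshell layout being discussed (16 chips means two chips share each 32-bit channel); the 33.1 GB/s per-chip score is the number quoted from the leak:

```python
per_chip_gbs = 33.1                    # per-chip score from the leak
channel_gbs = per_chip_gbs * 2         # two chips share each 32-bit channel in clamshell
implied_gbps = channel_gbs * 8 / 32    # back out the per-pin data rate

print(f"per 32-bit channel: {channel_gbs:.1f} GB/s")   # 66.2 GB/s
print(f"implied per-pin   : {implied_gbps:.2f} Gbps")  # ~16.6 Gbps, above the 16Gbps bin,
                                                       # so 18Gbps parts running downclocked
```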
 
Last edited:
Look at the code of the APU - 13F9 (Navi 10). It cannot be Vega. 13E9 was the Gonzalo GPU part, which matched Ariel's (codename) GPU. According to Komachi, this could be a later revision, hence the different ID as well.

16GB due to the fact that we know there are 16 chips and the SC for each is 33.1, which would indicate more than 66GB/s per 32-bit channel (so a downclocked 18Gbps chip is the only possibility).

Where are you seeing 4MB of L3? It's 16MB for the entire CPU, halved from the Zen 2 desktop part.

OK, went to look at the benchmark and it has been removed.🤔🤔
Makes sense since consoles typically get the lower powered, mobile variant.
 

MadAnon

Member
Another thing to consider is the 1.6GHz base clock, which is the same as the PS4. Isn't that important for backwards compatibility?

Such clocks would make no sense for some random Chinese console.
 
Last edited:

R600

Banned
Another thing to consider is the 1.6GHz base clock, which is the same as the PS4. Isn't that important for backwards compatibility?

Such clocks would make no sense for some random Chinese console.
It's actually Sony's patented PS BC method. Clocks have to match.

The PS4 Pro lowers its CPU clocks to 1.6GHz in BC mode and halves its 36 CUs down to the PS4's 18.

I think the interesting thing about this entire leak is that it was removed from UB.
 

Lunatic_Gamer

Gold Member
Speculation, conjecture. More speculation, more conjecture. That’s all this and similar threads have become. Remember VGLeaks? They did such a good job this gen, leaking info on a regular basis. What happened to them? Anyhow. I know we are a year and a half away minimum from a possible launch but still. Like most folks here I’m starving for new info on the next gen machines. Not Pastebin, Reddit or FakeEra crap, real stuff. Hopefully we shall get something soon. 👊🏻😎
 
Last edited:

Fafalada

Fafracer forever
PS3 to PS4 is an 8x jump (without even taking into account the arch efficiency multiplier) for 2x the resolution
That comparison is apples to oranges; at a base level it pretends the PS3 had no access to more than half of its programmable compute. Also, the resolution jump on average from 360/PS3 -> 1080p is at least 3x.

Single numbers across disparate architectures don't mean much, you need context. PS2 -> PS3 was the smallest fillrate jump of any Sony gen to date (arguably just about on par with PS4 -> PS4 Pro), but like the above, that's just one number.
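A quick look at the pixel counts behind that "at least 3x" figure. The sub-HD resolutions below are common last-gen examples picked for illustration, not a measured library-wide average:

```python
def pixels(w, h):
    return w * h

target = pixels(1920, 1080)  # 1080p
last_gen = {"720p": (1280, 720), "1152x640 (sub-HD)": (1152, 640), "1024x600 (sub-HD)": (1024, 600)}

for name, (w, h) in last_gen.items():
    print(f"{name:<18} -> 1080p: {target / pixels(w, h):.2f}x")
# 720p: 2.25x, 1152x640: 2.81x, 1024x600: 3.38x
```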
 
Last edited: