
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.
CPUs and GPUs are entirely different. Scaling threads across CPU cores is far more challenging than loading up all the shaders in a GPU.

If it weren't, the 2080 Ti would not be faster than the 2080S.

52 CUs at a more reasonable clock speed is far more efficient than ramping up the voltage to hit the 2.23 GHz the PS5 is doing.

No, you're completely wrong. Using your example, the 2080 Ti should be more efficient than the 2080S then, right? It's got lower clocks but more transistors.

power-gaming-average.png


So which is it?

Also, again, you're being selective. If 52 CUs at lower clocks is more efficient, then Series X is 'ramping up the voltage' to hit 3.8 GHz on the CPU.

To clarify - you think an APU with a 52 CU GPU @ 1800 MHz and a CPU @ 3.8 GHz is more efficient than an APU with a 36 CU GPU @ 2200 MHz and a CPU @ 3.5 GHz?
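For context, the headline TF figures being argued about are just unit count times clock; a quick back-of-the-envelope sketch, assuming the usual 64 shaders per CU and 2 FLOPs per shader per clock:

```python
# FP32 TFLOPS = CUs * 64 shaders per CU * 2 FLOPs per shader per clock * clock
def tflops(cus: int, clock_mhz: float) -> float:
    return cus * 64 * 2 * clock_mhz / 1e6  # MHz -> TFLOPS

series_x = tflops(52, 1825)  # ~12.15 TF
ps5_max = tflops(36, 2230)   # ~10.28 TF at the PS5's peak clock
print(f"Series X: {series_x:.2f} TF, PS5 at max clock: {ps5_max:.2f} TF")
```

The efficiency argument is about how much of that nominal number each design can actually use, which the arithmetic alone can't settle.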
 

PocoJoe

Banned


The Xbox defence force have gone nuts. Anything about PS5 is "damage control" and anything PS5 does, Xbox does better.

While there isn't even any damage to control, PS5 is on the way to success as always, and a few Tflops here and there won't truly matter.

The only failed Sony console is the Vita, so it is highly unlikely that PS5 would "lose" or fail, yet people continue to bash it, and half the population have become engineers who read Cerny's mind and can tell how he panicked and overclocked the system while drunk, etc. Madness.
 

Gamernyc78

Banned
Whatever your damage control tells you, you gave me a CCN link where most of the comments are making fun of this article.

If you had to build a PC, nobody would ever choose SSD speed over CPU, GPU, and RAM speed. Pure logic.
I gave you facts, with sources; you give me smileys. But... are you laughing or crying? It's incredible how fanboys cannot see through it. Or are some of you having a bad time with confinement?

I'm crying 😂 over this stupidity, can't you tell? 🤦‍♂️🤔
 
This is false. As has been explained, the PS5 has no boost clocks; it has variable frequencies. You can read more about it here if interested.

 

I'm ever more convinced Sony shouldn't have made such a technical show the first true presentation of the PS5. Or at the very least they should have changed the presentation to have fewer Willy Wonka terms like "boost mode" or shit like that.
 

M-V2

Member
A lot of people talk about performance, but several games run better on PS4 Pro, the weakest console, compared to the Xbox One X, which is more powerful by 40%. You want a small bump in resolution? Sure, take it, idc, but I want performance. At the end of the day it depends on the devs. For example, Doom Eternal runs better on PS4 Pro according to Digital Foundry, and the RE3 remake demo runs way better on PS4 Pro according to Digital Foundry. If this is what the most powerful console guarantees, then keep that small bump in resolution to yourself, I don't want it; give me the performance.

Note: you can watch the Digital Foundry analysis before attacking me lol

CGrPBhE.jpg
0bHyYKZ.jpg
 
Last edited:
That frame time is fucking wild, makes Bloodborne look like perfection.
 

Felessan

Member
All that compute advantage of Series X will translate to real-world performance too, no matter how much customization Sony has done on their end. Will the end result be obvious to the naked eye? Probably not. In fact, I wrote a while back that a ~15-18% advantage basically means Series X renders a game at 2160p native while PS5 does it at ~1950p. Or both run at 4K native but PS5 drops resolution more often. But that still means Series X is performing better than PS5.
Not necessarily. You can be ROP bound, you can be asset-loading-speed bound - and your performance or image quality might drop even below the PS5's level. XSX has some parts of the pipeline that are better than the PS5's, and other parts that are the same or even worse, which means that in reality doing better will require quite some effort.
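As an aside, the resolution-gap arithmetic in the quoted post follows from pixel-count scaling: if render cost is roughly proportional to pixel count, a compute ratio shrinks each axis by its square root. A minimal sketch, with an illustrative ~18% gap (the exact ratio is an assumption here):

```python
import math

# If render cost scales with pixel count, a compute deficit of ratio r
# shrinks each axis by sqrt(r), not by r itself.
def scaled_height(base_height: int, compute_ratio: float) -> int:
    return round(base_height / math.sqrt(compute_ratio))

print(scaled_height(2160, 1.18))  # an ~18% compute gap lands near 1988p
```

That comes out a touch above the ~1950p quoted, but it's the same ballpark; the point is that a double-digit compute gap maps to a fairly subtle resolution gap.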
 

oxman2k

Member
Nice to be back after the ban. I hope the XSX doesn't go down the higher-res, ray-tracing-at-the-cost-of-performance road. If a multiplat game has higher res and ray tracing on the XSX but a lower frame rate than the PS5 version, which has lower res and less or no ray tracing but runs at a locked or much higher frame rate, it's the PS5 version every time for me. Frame rate is king every time over higher res, ray tracing, etc.
 

geordiemp

Member
The thing is, people don't get the upclock/downclock at all. Look at the Spider-Man activity in one frame below, posted in another place: things get busy for FRACTIONS OF A FRAME, nanoseconds. At 1 GHz a clock cycle is 1 nanosecond; it's that damn fast.

So many posters think the PS5 GPU is going to dip under 10 TF for a whole game or even a few seconds. Like what? It's there to deal with the odd spike FFS, and it will just borrow power from the CPU.

BUT what if both happen together, "oh I am really concerned" lol - look at how these things work, for god's sake. I think posters either know and are just trolling, or don't want to know and are being THICK.


x6Q3gRr.png
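To put the timescale point in concrete numbers (the 60 fps target here is just for illustration):

```python
# At 1 GHz a clock cycle lasts exactly 1 ns; at the PS5's 2.23 GHz peak
# it's ~0.45 ns. A single 60 fps frame therefore spans tens of millions
# of GPU cycles, so a load spike lasting a fraction of a frame is
# invisible at whole-frame granularity.
def cycle_time_ns(clock_ghz: float) -> float:
    return 1.0 / clock_ghz

def cycles_per_frame(clock_ghz: float, fps: float) -> float:
    return clock_ghz * 1e9 / fps

print(cycle_time_ns(1.0))            # 1.0 ns per cycle at 1 GHz
print(cycles_per_frame(2.23, 60.0))  # ~37 million cycles per 60 fps frame
```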
 
Last edited:
The CPU and GPU can get overworked anyway; that's why frame drops happen on any system, fixed clocks or not.
The PS5 is more customizable, that's it: devs can choose how to manage the clocks and they will manage the game's performance accordingly. They could very well set the GPU clocks lower and then throw RT at it, making the game unplayable, just as they could overdo RT on the XSX and end up with an unplayable game as well.
There is no flip-flopping of TFs by 10%, and no frames dropping because the sun is out and it's hot. For the last fucking time: variable clocks doesn't mean they go up and down like amusement-park rides; it means devs can change them, unlike on the XSX.
 
Last edited:

chilichote

Member
Raw-Power-Xbox-Series-X-vs-Playstation-5.png



From the Article (via Google translator):
1. With nominally the same computing power, a system with fewer hardware units at a higher clock rate is more efficient than a system with more hardware units at a lower clock rate. The latter is wider and therefore always has to struggle to utilize all of its hardware units - otherwise the nominal computing power cannot be reached in practice. A higher clock rate, on the other hand, scales perfectly in terms of computing power, since there are no such load-balancing problems. How much this matters in practice is difficult to estimate - in the case of PS5 & XBSX (36 vs. 52 CUs) this efficiency difference is certainly not really big, but it should still amount to a few percentage points. Of course, this does not increase the nominal computing power, only the practically usable computing power of the system with fewer hardware units (PS5).

2. For all hardware units that are present in the same number on both console SoCs, the system with the higher clock rate automatically gains a not-inconsiderable advantage - so it can, under certain circumstances, offset its deficit in pure computing power with advantages in other raw-performance categories. In the current case this could apply to rasterizer and ROP performance, since both console SoCs apparently run with 4 raster engines - at this point the PlayStation 5 can fully exploit its clock-rate advantage and could thus (nominally) offer up to +22% more rasterizer performance and up to +22% more ROP performance than the Xbox Series X.

Overall, a very interesting and balanced article.
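The article's +22% figure is pure clock arithmetic, taking the two consoles' stated GPU clocks:

```python
# With the same number of raster engines and ROPs on both chips,
# front-end and back-end throughput scale linearly with clock rate.
ps5_clock_mhz = 2230
xsx_clock_mhz = 1825
advantage = ps5_clock_mhz / xsx_clock_mhz - 1.0
print(f"PS5 rasterizer/ROP clock advantage: {advantage:.1%}")  # ~22.2%
```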
 
But 6 TF is better than 4.2 TF

Same thing for Ace Combat and Call of Duty (MW/BO4); RE2 is sharper on Pro and runs very well, and KH3 has a 1080p mode for better performance.
Many games have this issue. Maybe Microsoft wants the best resolution despite performance.
All this big-penis resolution drama really needs to stop.
We went from sub-HD to HD, HD to Full HD, Full HD to 2K/checkerboard and dynamic resolutions, and now 4K and even 8K are starting to catch on. This is silly; there isn't ANY NEED for such resolutions at the level of textures, effects and geometry we are going to see - you are not boosting a PS2 game to 1080p. At this point, res over performance is completely crazy.
Use checkerboarding or dynamic resolution, it's fucking fine. I just hope we drop this race to "BEHOLD THE NATIVE 4K" so we can finally target proper fluidity easily.
 
Last edited:

geordiemp

Member
The CPU and GPU can get overworked anyway; that's why frame drops happen on any system, fixed clocks or not.
The PS5 is more customizable, that's it: devs can choose how to manage the clocks and they will manage the game's performance accordingly. They could very well set the GPU clocks lower and then throw RT at it, making the game unplayable, just as they could overdo RT on the XSX and end up with an unplayable game as well.
There is no flip-flopping of TFs by 10%, and no frames dropping because the sun is out and it's hot. For the last fucking time: variable clocks doesn't mean they go up and down like amusement-park rides; it means devs can change them, unlike on the XSX.

I was trying to get people to think of clocks in the time domain, that is all. I agree that both will sit at max clocks - the GPU probably all the time, the CPU dipping a few % in frequency when necessary, which will make no difference in my mind with a 16-thread CPU.

If you want to try blending the clock discussion with what the CPU and GPU are doing... you're welcome, I could not be bothered lol.
 
Last edited:

Marlenus

Member
No, you're completely wrong. Using your example, the 2080 Ti should be more efficient than the 2080S then, right? It's got lower clocks but more transistors.

power-gaming-average.png


So which is it?

Also, again, you're being selective. If 52 CUs at lower clocks is more efficient, then Series X is 'ramping up the voltage' to hit 3.8 GHz on the CPU.

To clarify - you think an APU with a 52 CU GPU @ 1800 MHz and a CPU @ 3.8 GHz is more efficient than an APU with a 36 CU GPU @ 2200 MHz and a CPU @ 3.5 GHz?

The 2080 Ti has more perf/watt at 1440p and 4K vs. the 2080S. At 1080p they are the same when using the TPU bench suite. ComputerBase has the perf/watt of the 2080 Ti higher across the board; that is how you measure efficiency.

In the case of the 2080 Ti, it averages around 1800 MHz in games, but the 2080S manages 1925 MHz on average.

With the Xbox and PlayStation we are comparing 1825 MHz vs. 2230 MHz, and power consumption goes up drastically at high clock rates.

So my estimate is that the Xbox will draw less power from the wall when gaming than the PS5 will, if the PS5 is running at max clocks. I also expect the Xbox to have smoother frame rates or slightly better visuals at the same time.
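The efficiency metric being invoked here is simply frames per watt; a minimal sketch with made-up numbers (these are not measured figures for any of the cards discussed):

```python
# perf/watt: higher is more efficient, regardless of clocks or unit counts.
def perf_per_watt(avg_fps: float, board_power_w: float) -> float:
    return avg_fps / board_power_w

# Hypothetical illustration: a wide, lower-clocked card vs. a narrow,
# higher-clocked one. The fps and wattage values are invented.
wide = perf_per_watt(100.0, 270.0)   # ~0.370 fps/W
narrow = perf_per_watt(80.0, 250.0)  # ~0.320 fps/W
print(wide > narrow)  # the wide card wins per watt in this example
```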
 

Marlenus

Member
Raw-Power-Xbox-Series-X-vs-Playstation-5.png



The number of ROPs for either system has not been disclosed, AFAIK.

Xbox One has a 100%+ rasterisation advantage over PS4 because it has 2 rasterisers vs. the 1 in PS4, and it didn't help it.
 

DaGwaphics

Member
This is on the CPU. The same applies, but even more so, to the GPU. So

Because we all know all those instances when a new, wider GPU hits the market but can't beat old cards because the workload doesn't make use of the extra width. Oh, right - that's never happened, because GPU workloads naturally lend themselves to parallel work. AMD and Nvidia will release new GPUs in the fall that are much wider than the chip in the XSX, and neither will struggle with utilization in the slightest. LOL

Big no on that assumption.
 

Marlenus

Member
The contrary: XB1 has 16 ROPs, PS4 has 32.

Rasterisers are at the front end, ROPs are at the back end; they are different units.

Xbox One could set up 2 triangles per clock; PS4 could only set up one. It was one of the changes between GCN 1 (Pitcairn) and GCN 1.1 (Bonaire).

EDIT: I do recall that Xbox One had double the primitive setup rate compared to PS4, but I am probably misremembering why. It was 7-8 years ago.
 
Last edited:

draliko

Member
Right now the only thing I want is more new trailers; let us see those plastic pieces in action... Damn, without conferences we don't have a real calendar to follow. When should the next gathering have been?
 
Rasterisers are at the front end, ROPs are at the back end; they are different units.

Xbox One could set up 2 triangles per clock; PS4 could only set up one. It was one of the changes between GCN 1 (Pitcairn) and GCN 1.1 (Bonaire).

EDIT: I do recall that Xbox One had double the primitive setup rate compared to PS4, but I am probably misremembering why. It was 7-8 years ago.
XB1 has around 7% more vertex processing than PS4 because of clocks.
 

nosseman

Member
It's not. Source??



Funny note - Nvidia's fastest GPU (always the fastest in game benchmarks), the 2080 Ti, has quite a bit lower clocks than the 2070 Super and the other cards, but more CUs/cores/ROPs.

2080 TI:

Base Clock 1350 MHz
Boost Clock 1545 MHz

Shading Units 4352
TMUs 272
ROPs 88
SM Count 68

2070 Super:

Base Clock 1605 MHz
Boost Clock 1770 MHz

Shading Units 2560
TMUs 160
ROPs 64
SM Count 40
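Plugging those spec-sheet numbers into the usual FP32 formula (shading units x 2 FLOPs per clock x boost clock) shows why the lower-clocked, wider card still wins on raw throughput:

```python
# FP32 throughput = shading units * 2 FLOPs per clock * boost clock
def fp32_tflops(shading_units: int, boost_mhz: int) -> float:
    return shading_units * 2 * boost_mhz / 1e6

ti = fp32_tflops(4352, 1545)      # ~13.45 TF despite the lower clock
super_ = fp32_tflops(2560, 1770)  # ~9.06 TF despite the higher clock
print(f"2080 Ti: {ti:.2f} TF, 2070 Super: {super_:.2f} TF")
```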
 


Wtf? That's not a confirmation XSX is 80 ROPs. TechPowerUp's source is based on rumors lol. Just write "my wishful thinking" for now, that would suffice. 🤣🤣

It literally says the data on this page is not final and can change in the near future. Lol
 
Last edited:

ethomaz

Banned
Nope. The SSDs in both consoles are internal storage, like the Switch's. They also have expansion ports - not replacements. The PS5 and Switch have ports for M.2 SSDs and microSD cards respectively. The Xbox Series X has the slot on the back for its proprietary Expansion Card. None of this added storage replaces the internal storage.

Switch 32 GB internal + Micro SD card expansion slot
PS5 825 GB internal + M.2 SSD expansion bay
XSX 1 TB internal + proprietary Expansion Card slot

Unlike the PS3 and PS4, which had a user-replaceable 2.5" SATA hard drive, the PS5 uses 12 individual flash chips over a 12-channel PCIe 4.0 interface to interface basically directly with the SoC through its custom I/O unit. It's basically part of the SoC (the APU and RAM, etc.). This is not how the PS4 was; it just had a regular hard drive that used a standard SATA connector.

It's much better for the user, because now we can use an external hard drive for PS4 games on PS5 - say a 2 TB one - use the internal 825 GB SSD for PS5 games, and then add a second M.2 SSD in the expansion bay for, say, another 512 GB or 1 TB of SSD storage for more PS5 games. This is great because we can add without having to subtract.
The PS5 SSD is replaceable.
There is no additional expansion slot... the M.2 NVMe slot is where the drive it ships with sits.


There are two PS5 expanded storage options for the PS5:

  • Replacing the internal drive within the PS5 with a Sony-certified, off-the-shelf SSD
  • Plugging in an external hard drive
 
Last edited:

icerock

Member
Not necessarily. You can be ROP bound, you can be asset-loading-speed bound - and your performance or image quality might drop even below the PS5's level. XSX has some parts of the pipeline that are better than the PS5's, and other parts that are the same or even worse, which means that in reality doing better will require quite some effort.

I would agree with you if the PS5, on top of its higher clocks, also had faster memory and more bandwidth to play with. 448 GB/s is what you'll find in the 5700 XT and the RTX 2080. Keep in mind that, unlike those two GPUs, the PS5 shares this bandwidth with the CPU and the rest of the hardware. Plus RT, which hogs bandwidth pretty fast. So the faster the GPU does all its work, the faster it's going to drain that bandwidth; eventually, there's only so much you can do before that VRAM is saturated. From the GitHub leaks we saw them testing the iGPU with underclocked 18 Gbps modules; price prohibited them from putting those in the final hardware. Memory bandwidth is the biggest bottleneck I see in both next-gen consoles. Sony should've taken a bigger loss and gone with faster memory; this system and their design philosophy are tailor-made for it.

PS5 SSD is replaceable.
There is no additional expansion slot.

This is not true; there's a dedicated bay where you can slot in an extra SSD.
 
Last edited:

nosseman

Member
Wtf? That's not a confirmation XSX is 80 ROPs. TechPowerUp's source is based on rumors lol. Just write "my wishful thinking" for now, that would suffice. 🤣🤣

It literally says the data on this page is not final and can change in the near future. Lol

Ok, show me a source that says the Xbox Series X only has 64 ROPs?

Shading units are pretty much known, and shading units/CUs/TMUs/ROPs pretty much track together number-wise.

The PS5's render config is very similar to the 5600 XT's, but with higher clocks:

5600 XT:

Shading Units 2304
TMUs 144
ROPs 64
Compute Units 36

XBX, on the other hand, is a step above the 5700 XT render-configuration-wise:

XBX:

Shading Units 3328
TMUs 208
ROPs 80
Compute Units 52

5700 XT:

Shading Units 2560
TMUs 160
ROPs 64
Compute Units 40
 

PaintTinJr

Member
....

With the Xbox and PlayStation we are comparing 1825 MHz vs. 2230 MHz, and power consumption goes up drastically at high clock rates.

So my estimate is that the Xbox will draw less power from the wall when gaming than the PS5 will, if the PS5 is running at max clocks. I also expect the Xbox to have smoother frame rates or slightly better visuals at the same time.

Maybe that has only been true because non-switching transistors normally act like tiny capacitors when they aren't conducting. The PS5's solution - driving clock rates up to funnel the excess through the conducting transistors, and lowering clocks when most are conducting - may produce a completely different scenario. It certainly sounded like the high-fixed-clocks, high-power-draw approach they said they tried didn't work, and this was a new approach that does.
 

ethomaz

Banned
This is not true, there's a dedicated bay where you can slot in an extra SSD.

From my understanding the PS5 SSD is on the motherboard; you can expand the storage by adding an M.2 NVMe SSD in addition to that 825 GB, but that one is probably soldered onto the mainboard.
That is how Cerny explained it.
Eurogamer has an article about it too.
There is no SSD soldered to the motherboard... the SSD is in a PCIe 4.0 NVMe M.2 slot.


There are two PS5 expanded storage options for the PS5:

  • Replacing the internal drive within the PS5 with a Sony-certified, off-the-shelf SSD
  • Plugging in an external hard drive
 
Last edited:
The crazy thing is that it might actually be 7 GB/s uncompressed under ideal conditions and 22 GB/s compressed, if Kraken works the way it is supposed to. We are approaching ReRAM speeds.

I want to see this thing in action. Sony HAS to release a demo this week.
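Those throughput figures are just raw link speed times compression ratio. A sketch using Cerny's stated 5.5 GB/s raw figure (the 7 GB/s "ideal conditions" number above is speculation, and the ratios below are illustrative, not guaranteed):

```python
# Effective read speed = raw SSD bandwidth * average compression ratio
# achieved by the hardware decompressor (Kraken, in the PS5's case).
def effective_read_gbs(raw_gbs: float, compression_ratio: float) -> float:
    return raw_gbs * compression_ratio

print(effective_read_gbs(5.5, 4.0))  # 22.0 GB/s at a best-case ~4:1 ratio
print(effective_read_gbs(5.5, 1.6))  # 8.8 GB/s at a more typical ~1.6:1
```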

How about the rumored GTA6 tomorrow.

I hope they showcase some games already.
 

ethomaz

Banned
Ok, show me a source that says the Xbox Series X only has 64 ROPs?

Shading units are pretty much known, and shading units/CUs/TMUs/ROPs pretty much track together number-wise.

The PS5's render config is very similar to the 5600 XT's, but with higher clocks:

5600 XT:

Shading Units 2304
TMUs 144
ROPs 64
Compute Units 36

XBX, on the other hand, is a step above the 5700 XT render-configuration-wise:

XBX:

Shading Units 3328
TMUs 208
ROPs 80
Compute Units 52

5700 XT:

Shading Units 2560
TMUs 160
ROPs 64
Compute Units 40
ROPs are not related to the number of CUs, because they sit outside the WGPs/CUs.
TMUs are inside the WGP/CU, so their count increases the more CUs you have.

If the Xbox has 2 SEs then it probably has 64 ROPs, just like other RDNA GPUs with 2 SEs.
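If both chips really did land on 64 ROPs, pixel fill rate would come down to clocks alone. A quick sketch under that assumption (the XSX ROP count is, as noted above, unconfirmed):

```python
# Pixel fill rate (Gpixels/s) = ROPs * clock (MHz) / 1000
def fill_rate_gpix(rops: int, clock_mhz: float) -> float:
    return rops * clock_mhz / 1000.0

ps5 = fill_rate_gpix(64, 2230)  # ~142.7 Gpix/s at the PS5's peak clock
xsx = fill_rate_gpix(64, 1825)  # ~116.8 Gpix/s
print(f"PS5: {ps5:.1f} Gpix/s, XSX: {xsx:.1f} Gpix/s")
```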
 
Last edited: