
Next-Gen PS5 & XSX |OT| Console tEch threaD


Mahavastu

Member
I don't know how many transistors the PS5 has, but one thing I do know: we can guess the size of the GPU portion of the PS5.

This site gives us the size of the PS4 GPU: https://www.extremetech.com/extreme...u-reveals-the-consoles-real-cpu-and-gpu-specs
It is 88 mm².
Now according to Mark Cerny, the size of each CU in the PS5 GPU is roughly 62% larger than a PS4 CU. Thus, there is the equivalent of 58 PS4 CUs in the PS5.
You can't really use the values from that article directly, because the OG PS4 used a 28 nm process while the PS5 uses a 7 nm process. Each CU takes far less area at 7 nm, so your numbers come out much too high for the PS5.

For example, the GPU of the XSX is 47% of the 360 mm² XSX SoC, which is about 170 mm². That is much smaller than the size you calculated for the PS5 GPU, and the PS5 GPU should be smaller than the XSX GPU.
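
As a rough sanity check, here's that arithmetic as a quick Python sketch; the density factor is an outright guess and none of these are confirmed Sony figures:

```python
# Back-of-the-envelope PS5 GPU area estimate, following the reasoning above.
# Every input is either a public figure or an outright guess, not a confirmed
# Sony spec.
PS4_GPU_AREA_MM2 = 88.0   # PS4 GPU block at 28 nm (per the ExtremeTech die shot)
PS4_PHYSICAL_CUS = 20     # CUs on the PS4 die (18 active + 2 spare)
PS5_ACTIVE_CUS   = 36     # active CUs in the PS5 GPU
CU_SIZE_FACTOR   = 1.62   # Cerny: each PS5 CU is roughly 62% larger

# Rough logic-density gain going 28 nm -> 7 nm. The real factor depends on the
# cell libraries used; ~3x is a ballpark, treat it as an assumption.
DENSITY_GAIN_28NM_TO_7NM = 3.0

area_per_ps4_cu = PS4_GPU_AREA_MM2 / PS4_PHYSICAL_CUS   # ~4.4 mm^2
ps4_cu_equivalents = PS5_ACTIVE_CUS * CU_SIZE_FACTOR    # ~58, as stated above
area_at_28nm = ps4_cu_equivalents * area_per_ps4_cu     # ~257 mm^2
area_at_7nm = area_at_28nm / DENSITY_GAIN_28NM_TO_7NM   # ~86 mm^2

print(f"Estimated PS5 GPU block: ~{area_at_7nm:.0f} mm^2 at 7 nm")
# Comfortably below the ~170 mm^2 GPU slice of the 360 mm^2 XSX SoC.
```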
 

kyliethicc

Member
I guess so. I mean, I didn't notice a difference. Still probably won't see it for myself until next year at this rate.


Maybe. I'm not sure if BluePoint has said what the two modes differ in exactly.
Bluepoint said the 2 modes are for performance or fidelity, same as their Shadow of the Colossus remake.

That game, on PS4 Pro, had 2 modes, 1440p @ 30 Hz or 1080p @ 60 Hz.

So for PS5, they will just up the resolution mode to 4K @ 30 Hz, while the 60 Hz mode will likely be around 1440p. And pixel counters have said that's what the footage they've shown indicates.

No idea if the 60 Hz mode will have the same amount of ray tracing, maybe not. Idk. Both look great so far.
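
The mode pairing makes sense if you look at the per-second pixel budgets; rough arithmetic assuming full native res in each mode:

```python
# Pixels shaded per second for each rumored mode (assumes native resolution,
# no checkerboarding or temporal upsampling).
modes = {
    "4K @ 30 Hz":    (3840, 2160, 30),
    "1440p @ 60 Hz": (2560, 1440, 60),
}
for name, (w, h, fps) in modes.items():
    print(f"{name}: {w * h * fps / 1e6:.0f} Mpixels/s")
# 4K @ 30 Hz:    ~249 Mpixels/s
# 1440p @ 60 Hz: ~221 Mpixels/s -> similar budgets, so the pairing is natural
```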
 

jose4gg

Member
... So for PS5, they will just up the resolution mode to 4K @ 30 Hz, while the 60 Hz mode will likely be around 1440p ...

I need confirmation on the pixel count for the uploaded trailer that runs at 60 FPS...
 

Lethal01

Member
... So for PS5, they will just up the resolution mode to 4K @ 30 Hz, while the 60 Hz mode will likely be around 1440p ...

I like this. I don't want anything but resolution being sacrificed to get 60 fps.
 

kyliethicc

Member
I need confirmation on the pixel count for the uploaded trailer that runs at 60 FPS...
Ehh, who cares really. It'll be less than full native 4K (cuz that's the 30 Hz mode) but higher than just 1080p. Good enough for a solid 60 FPS.

If a user picks that mode, they care about performance, not resolution. And for the people who need native 4K and max visuals, there's that option too, just at 30 FPS. Great to have choices.
 

Rea

Member
I personally do not like what they have done with Spider-Man. I think it's an arsehole move by Sony; regardless of whether it's upgraded with all those rays, textures, bla bla, I think they should let people have it for free. Put an extra 5 bucks on the MM game if they want. When you buy a game it should be yours, and upgrades should just be patched onto the game.

I remember buying Skyrim a long time ago, then they came out with another version, expected you to buy that at full price, then released it somewhere else, bla bla bla. I liked Skyrim at one stage, but practices like that are bullshit.

As for not releasing more console features: it's coming, you just need to be patient. The YouTubers of the future are going to give you everything you want to know, also in Japanese.
Dude, why do you think you're entitled to a free upgrade? You already got what you paid for, which is PS4 Spider-Man; it's already sold. But this Spider-Man remaster has new features like ray tracing, rescanned facial animation, new assets, 3D audio, etc. Developers have to redo a lot of things; it's not a simple one-line patch. Also, lots of people don't understand that ray tracing is not cheap, it's very expensive: developers have to do a lot of extra computation, and it also needs better hardware. So yeah, I believe they deserve the $20. I will be getting the Ultimate version day one, if I can snatch a PS5 of course; still haven't managed to secure a pre-order yet.
 

thelastword

Banned
I know that, but AMD's TFs usually deliver less performance than Nvidia's, that's all. That might change with RDNA 2, who knows! At least I hope so, but...
That talk derived from Vega, a compute-focused architecture, vs Pascal, a gaming architecture engineered primarily to run games...

So Pascal had compressed textures and lower color fidelity in many games, but at 1080p and below you could not see it clearly. Nvidia traded image quality for lower TFLOPS and lower power draw. AMD did not compress anything, so although its Vega and Polaris cards had higher TF, they sucked in more power because the IQ and colors were better...

Vega was compute-focused, yet too many games were not, so in the majority of games Nvidia won, notwithstanding certain features which kinda crippled AMD performance, like tessellation and GameWorks. Polaris was AMD's first step toward the pure gaming route; at 5.1 and 6.1 TF the RX 570 and 580 were great cards against the 1050 Ti and 1060, yet if AMD had scaled up Polaris, the power draw would have skyrocketed at 14 nm... So they went with Vega and HBM for the high end, but that was not a gaming architecture; much of the time 50% of Vega's raw compute power sat idle, so the cards had lots of power that was never utilized, due to the non-gaming architecture, or rather devs not prioritizing compute-focused games...

Which brings us to RDNA 1: an improvement on Polaris and a partial departure from GCN at a lower node. RDNA 1 was customized as a gaming GPU and it's ultra fast. On IPC and performance per watt, RDNA 1 beat Turing easily. In essence, if AMD had developed a larger chip than the 251 mm² 5700 XT, it could well have been as fast as or faster than the 2080 Ti. A 5700 XT at 251 mm² beats a 2070 and is on par with a 545 mm² 2070 Super. This suggests that at only 400 mm², a chip packed on RDNA 1 could beat the 2080 Ti, yet the 2080 Ti is a 754 mm² chip...
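
To put that die-size argument in numbers, a quick sketch; the relative-performance column is the post's claim plus one assumed 2080 Ti uplift, not benchmark data:

```python
# The die-size argument in rough numbers: performance per unit area.
cards = {
    # name:        (die area mm^2, perf relative to the 5700 XT)
    "5700 XT":     (251, 1.00),
    "RTX 2070S":   (545, 1.00),  # "on par" per the post
    "RTX 2080 Ti": (754, 1.30),  # assumed ballpark uplift over a 2070S at 4K
}
for name, (area_mm2, rel_perf) in cards.items():
    print(f"{name:11s} -> {rel_perf / area_mm2 * 1000:.1f} perf per 1000 mm^2")
# Navi 10 lands around 2x Turing's performance density, though the comparison
# is skewed: TSMC 7 nm vs 12 nm, and no dedicated RT/tensor silicon on Navi 10.
```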

In short, RDNA has better IPC and performance per watt than Turing by a large margin. Now the team on RDNA 2 took all of this to a new level: performance per watt is said to be up around 60% from RDNA 1, the IPC gains are about 20%, the clock speeds are insanely high, the CUs have improved massively in instructions per CU, and the node is super efficient, hence the high clock speeds... And people forget AMD has not stopped there: they implemented their own form of HBCC straight on the board, with 128 MB of cache. AMD is not effing around; they are trying to push things this gen and go for a complete kill with MGPUs on RDNA 3 in the future...

Now, people are wary, they can't be patient. AMD is not going to allow Sony to announce the architecture's feature set yet, at least not the parts where both companies benefitted from their collaboration, fueled by Sony for the PS5. Yet that's how collaborations work: I scratch your back, you scratch mine. AMD has to sell GPUs too, and they have always said their GPUs would hit shelves before the console... After the 28th of October I'm sure all the breakdowns people want will be a go... Sometimes it's good to keep something in the oven longer, keep quiet and work out your kinks. Cerny said there were some logic issues with clocking the PS5 over 2.23 GHz; I'm almost certain that has been resolved, that's what time gives you... So you will see AMD GPUs clock pretty high, and over that of the PS5. Not sure if it would make sense for Sony to boost the PS5 clock even more, I guess it depends on how formidable their cooling solution is, but it's a possibility. I still think there are some nice perks we have not heard about the PS5, hardware related and software related (the OS especially)... There is still time to announce all of that, more than enough time... I told folk AMD would hit it out of the park; they are coming for both the CPU and GPU markets this holiday... The best kit you will buy will be AMD CPU + AMD GPU, the most forward-moving pieces of architecture in both realms later this year... People who were wishing for an Intel + Nvidia combo for consoles have no idea how power hungry, expensive and limited that would have been...
 

StreetsofBeige

Gold Member
... I still think there are some nice perks we have not heard about the PS5, hardware related and software related (the OS especially) ... There is still time to announce all of that ...
So what you're saying is at some point, PS5 has secret sauce that will be announced to the world.

Got it.
 

thelastword

Banned
So what you're saying is at some point, PS5 has secret sauce that will be announced to the world.

Got it.
PS5 has already announced its secret sauce... That's its heavily customized and efficient hardware with nigh on no bottlenecks, its blazing-fast SSD and its next-gen sound chip and controller... The extra bits on how they work and some extra hardware and software details will round out how impressive the console really is... Yet the console has already sold itself on its pure speed, its ease of development and its performant ray tracing ability...

In essence, people are already sold, that's why many can't get a pre-order... Whatever else Sony announces is just gravy at this point...
 

edotlee

Member
[embedded inXile dev tweet]
Anyone else feel that Microsoft should've just released their new console in 2022-2023? They could've released an even more powerful console by that time (25, 30 TF?) with a stellar launch line-up. Feels like they would make a bigger impact that way than with what they're doing now. Instead, they have to pray a quality version of Halo comes out in 2021 to carry them to 2022-2023.
 

duhmetree

Member
... the clock speeds are insanely high, the CUs have improved massively in instructions per CU, and the node is super efficient, hence the high clock speeds ...
How much does a GPU gain from these unheard-of clock speeds? Does it see benefits similar to a CPU with increased clock speeds?

I'm an idiot when it comes to these things.
 

kyliethicc

Member
... they have to pray a quality version of Halo comes out in 2021 ...

343 Industries

 
... PS5 has already announced its secret sauce ... Whatever else Sony announces is just gravy at this point ...

I can’t believe all he took from that great post of yours was that PS5 has secret sauce. What a horrible fanboy.

Nice post!
 

duhmetree

Member
... PS5 has already announced its secret sauce ... Whatever else Sony announces is just gravy at this point ...
Will they have a DLSS equivalent? That'd be huge.

They will lead in that category, considering most developers would use AMD's tech.
 
Will they have a DLSS equivalent? That'd be huge.

They will lead in that category, considering most developers would use AMD's tech.

What's all this DLSS hype? PS4 Pro was doing supersampling before Nvidia even brought it to the PC space (2 years before, IIRC). People are hyping up old tech that's been in consoles for years. How do you think PS4 games look so good?

Sony does image quality in their sleep (camera division). For the record, PS5 already showcased the upgraded supersampling in the UE5 demo. It's just that Sony ain't got time to be naming everything they do. What's the point if it's just a basic tool? Can't be naming everything.

Is this a US thing, where everything has to have a fancy name, badge, logo, marketing PR spin, for people to be sold on how good something is or works?
 

By-mission

Member
... So you will see AMD GPUs clock pretty high, and over that of the PS5 ...
About this?

 

Three Jackdaws

Unconfirmed Member
Will they have a DLSS equivalent? That'd be huge.

They will lead in that category, considering most developers would use AMD's tech.
Supersampling and AI upscaling will play a big role next-gen IMO, especially with 4K becoming a standard in gaming. As powerful as these GPUs are, developers don't want to waste the GPU budget on ultra-high resolutions, so upscaling techniques will be important, especially for 60 FPS games.
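
For a sense of how much frame budget internal-resolution rendering frees up, a quick sketch assuming shading cost scales roughly linearly with pixel count:

```python
# Fraction of native-4K shading work at common internal resolutions,
# assuming cost scales linearly with pixel count (a simplification).
native_4k = 3840 * 2160

for label, (w, h) in {
    "1800p": (3200, 1800),
    "1440p": (2560, 1440),
    "1080p": (1920, 1080),
}.items():
    frac = w * h / native_4k
    print(f"{label} -> {frac:.0%} of native-4K shading work")
# 1800p -> 69%, 1440p -> 44%, 1080p -> 25%
```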

Also #LakeShow baby! if you know you know!
 

HAL-01

Member
What's all this DLSS hype? PS4 Pro was doing supersampling before Nvidia even brought it to the PC space ...
Please refrain from chiming into tech discussion if you won't bother to educate yourself and would rather dismiss it with blatant misinformation.

Supersampling is the opposite of what you're talking about. What you mean is upsampling/upscaling. The UE5 demo showcased Unreal's own screen-percentage temporal upsampling (TAAU). And different technologies tend to have different names because they all do different things.
DLSS 2 currently sports superior image quality to checkerboard rendering, and is slightly above temporal upsampling, being near indistinguishable from a native 4K output.

I'm pretty sure Sony will have a DLSS competitor relatively soon, considering gaming and imaging are among their brand-new AI division's flagship projects.
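
For comparison, the idealized shaded-sample counts per 4K output frame under each scheme, ignoring the (non-zero) cost of the reconstruction itself:

```python
# Shaded samples per 4K output frame under different reconstruction schemes.
# Idealized: real costs differ because reconstruction work isn't free.
out_w, out_h = 3840, 2160
native = out_w * out_h        # shade every output pixel
checkerboard = native // 2    # CBR shades half the samples each frame
taau_1440p = 2560 * 1440      # TAAU shades a 1440p frame, then upsamples

for name, samples in [("native 4K", native),
                      ("checkerboard 4K", checkerboard),
                      ("TAAU from 1440p", taau_1440p)]:
    print(f"{name:16s} {samples / 1e6:5.2f} Msamples/frame")
# native 4K        8.29
# checkerboard 4K  4.15
# TAAU from 1440p  3.69
```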
 

thelastword

Banned
How much does a GPU gain from these unheard-of clock speeds? Does it see benefits similar to a CPU with increased clock speeds?

I'm an idiot when it comes to these things.
In a GPU pipeline, high clock speeds are extremely important, even more so now since AMD is running its ray tracing in part via GPGPU, through simultaneous CU operations... Faster clock speeds will accelerate rasterization and RT performance just the same...
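
For reference, the headline FP32 TFLOPS figure is just CUs × 64 lanes × 2 ops × clock, so the clock multiplies everything; plugging in the public console numbers:

```python
# FP32 TFLOPS = CUs x 64 lanes x 2 ops (FMA) x clock. Clock also scales the
# rest of the GPU (rasterizers, RT, caches), not just the ALUs.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(f"PS5: {tflops(36, 2.23):.2f} TF")   # ~10.28 TF
print(f"XSX: {tflops(52, 1.825):.2f} TF")  # ~12.15 TF
```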

Will they have a DLSS equivalent? That'd be huge.

They will lead in that category, considering most developers would use AMD's tech.
I imagine there will be many solutions like DLSS. People speak about DLSS, but it's not a standard for upscaling games yet; only a handful of games use it... More games have used checkerboarding, and though CB is not AI-based, other solutions will surface, some hardware-based and some AI-based, which will rival or exceed DLSS. DirectML is one; perhaps CB 2.0, or an AI- or hardware-based GPU solution via FidelityFX...

AMD's software features have seen the steadiest growth I've seen in the industry. Radeon Chill, RIS, HBCC and their GUI are second to none... Improvements on the geometry engine and samples from PS4 Pro's ID buffer are all areas they could research and build from to improve IQ and sub-native resolutions for next gen...

About this?


I know the performance will be there, but these TDP ratios are absolutely bonkers, and that's what impresses most here... If these clock speeds are base/gaming clocks, then with some extra juice these cards can push even more. These cards on LN2, with a powerful high-phase VRM board and little to no restrictions, should be an overclocker's dream... Yet let's not get ahead of ourselves. These kinds of clock speeds are absolutely game-changing at such low wattage... Efficiency is the name of the game in RDNA 2... I think the engineers' goal was to ensure what happened with Vega didn't happen here: the core of RDNA 2 is to utilize every compute and rasterization capability of the GPU at all times. No power or cycles should be left idle and underutilized, so they are aiming for 99-100% utilization on every frame... It's why they have pushed SmartShift and variable clock speeds on PS5 too. Whatever they can do to use idle power to boost framerate or fidelity, they will do; maximum utilization is the key.

So yes, this is the first true architecture built on efficiency, and I'm not simply talking about the likes of lowering IQ or using VRS or DLSS-type solutions to boost frames, but concurrent CU work that boosts the entire workload without compromising fidelity in any way... Of course the technologies I just mentioned (VRS, DLSS, etc.) will help boost framerates even more, but what AMD did within its architecture is essential: you need a really speedy and efficient architecture if you are going to pitch ray tracing, because that is very costly... DLSS and VRS should not be the only way to boost RT performance; the architecture must be built innately for next gen's ray tracing demands... That is why I side with AMD and their technology; they are really moving every aspect of computing forward in a new way... It's great to see, because honestly they always did, just that this time everything is aligning and it seems they will have the success they deserve...
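
A toy illustration of the SmartShift idea, purely illustrative; the budget, loads and policy here are invented, not Sony's actual algorithm:

```python
# Toy model of SmartShift-style power sharing: a fixed SoC budget, with
# headroom from the less-loaded unit handed to the busier one.
# The budget, loads and policy below are all invented for illustration.
SOC_BUDGET_W = 200.0

def split_budget(cpu_load: float, gpu_load: float) -> tuple[float, float]:
    """Split the fixed budget in proportion to instantaneous load."""
    total = cpu_load + gpu_load
    cpu_w = SOC_BUDGET_W * cpu_load / total
    return cpu_w, SOC_BUDGET_W - cpu_w

for cpu, gpu in [(0.9, 0.9), (0.3, 0.9), (0.9, 0.3)]:
    cpu_w, gpu_w = split_budget(cpu, gpu)
    print(f"CPU load {cpu:.0%}, GPU load {gpu:.0%} -> "
          f"CPU {cpu_w:.0f} W, GPU {gpu_w:.0f} W")
# A GPU-heavy scene pulls watts (and therefore clocks) away from an idle CPU,
# instead of leaving that headroom unused.
```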
 
Aye, we get back to that old adage of brute force vs better design, in terms of all those custom hardware improvements that Sony put in.

It's basically ultimate power vs custom efficiency. Not sure if efficiency is the correct word atm, looking for a better one maybe.

Either way, the Assassin's Creed demo and Dirt demo already state both are 4K60, with 120 fps for Dirt.

It's already neck and neck.

Besides a few tree branches here, the occasional sheep walking about, no idea if anyone can tell the difference.

It is going to be very interesting. At the end of the day I think both will look smashing.

Hey, I pay good money for that extra sheep shit on the lawn, you sexy, juicy, juicy bounty of beans.
 
Well, Navi 22 with 40 CUs has a boost clock of 2.5 GHz.
Cerny redeemed: 2.23 GHz was not a last-minute overclock. So time to put those tales to rest, along with the yield-problem ones.
Probably like Navi 21/22, which will have 128 MB of cache, PS5 will also have some sort of it, probably the I/O's SRAM. That's why 448 GB/s to main RAM will fly...
Suddenly XSX doesn't seem so RDNA 2...
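
The "448 GB/s will fly" intuition, using the usual effective-bandwidth blend; the hit rate and cache bandwidth are assumptions, nothing disclosed:

```python
# Effective bandwidth with a large on-die cache, using the usual blend:
#   eff = hit_rate * cache_bw + (1 - hit_rate) * dram_bw
# The hit rate and cache bandwidth below are assumptions, not disclosed specs.
def effective_bw(dram_gbs: float, cache_gbs: float, hit_rate: float) -> float:
    return hit_rate * cache_gbs + (1 - hit_rate) * dram_gbs

print(f"{effective_bw(448, 1600, 0.55):.0f} GB/s effective")
# With an assumed 55% hit rate into a ~1.6 TB/s cache, 448 GB/s of GDDR6
# behaves like ~1082 GB/s -- the general idea behind a big on-die cache.
```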
 
Last edited:

Navi 21 would be 22.5 TF. Pretty decent, but far from the 29.8 TF of the 3080. If priced correctly, AMD might have a damn good product, but once again be unable to compete at the high end.

Edit: we still have no idea exactly how AMD cards will run RT, so performance there will probably take a bigger hit than with Nvidia. Also, nothing close to DLSS unless they announce something.

The TF count on the 3000 series should be enough to prove this is a meaningless metric. The performance gains over Turing are around 30%, with a 100% TFLOP increase.

Not to mention the energy consumption increase as well. Nvidia was all smoke and mirrors again :messenger_confused:
 

Gediminas

Banned
... Cerny redeemed: 2.23 GHz was not a last-minute overclock. So time to put those tales to rest, along with the yield-problem ones ...
but but but :messenger_crying: PS5 is actually a 5 TF console and only on rare occasions runs in boost mode, aka 2.23 GHz...
 

Three Jackdaws

Unconfirmed Member
Well, Navi 22 with 40 CUs has a boost clock of 2.5 GHz ...
Honestly, I saw this coming a long time ago; Coreteks, RGT and other PC-centric channels all predicted/leaked that RDNA 2 GPUs were going to hit some serious clock speeds, based off what was revealed about the PS5's GPU.

RGT in almost all his recent videos about RDNA 2 mentioned that the RDNA 2 discrete GPUs would hit "console level clock speeds"; this was the quote he used from his "sources", and it was obvious they were referring to PS5. However, he also mentioned that AMD was having issues with running the clocks "too high", not because of overheating but because there was a "logic breakdown", and this was a drawback of the RDNA architecture. I don't know exactly what that means, but Mark Cerny said the same thing about why the PS5's GPU wasn't clocked higher than 2.23 GHz. Maybe AMD has resolved the issue? Only time will tell.
 
This is no longer true with RDNA 1.0 and Turing TFLOPS. The 2070 is 9.1 TFLOPS at average game clocks and the 5700 XT is 9.3 TFLOPS at average game clocks. Both offer nearly identical performance.

Nvidia is no longer underreporting their TFLOPS numbers. But their TFLOPS are not offering 1:1 performance anymore. The 2080 is 11 TFLOPS but the 3080 is 30 TFLOPS and offers only 2x more performance. Adding more shader cores is clearly not giving linear performance increases.
The 3080 is not double the performance of the 2080, and the biggest gap is at 4K (getting to a 70% or so increase); if you're playing at 1440p the difference is closer to 30% or so. So yeah, can we stop looking at "TFLOPS" already?

Oh, and the 2080 runs at 220 W vs the 320 W of the 3080 as well, which lowers the perf-per-watt difference of the two cards even further. Very disappointing actually. I thoroughly believe AMD has them this time.
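
Running the Ampere numbers makes the point; the TFLOPS are spec-sheet figures, while the performance column is the ballpark uplift discussed above, not a benchmark:

```python
# Spec-sheet FP32 vs rough relative 4K game performance.
# TFLOPS are official boost-clock figures; the perf column is the ballpark
# uplift discussed above, not a measured benchmark.
cards = [
    # (name,      TFLOPS, approx. 4K perf relative to the 2080)
    ("RTX 2080",  10.1,   1.0),
    ("RTX 3080",  29.8,   1.7),
]
base_tf = cards[0][1]
for name, tf, perf in cards:
    print(f"{name}: {tf / base_tf:.1f}x the TFLOPS, ~{perf:.1f}x the performance")
# ~3x the paper TFLOPS for ~1.7x the frames: Ampere's dual-FP32 SMs doubled
# the TFLOPS figure without doubling real throughput, so cross-architecture
# TFLOPS comparisons stopped meaning much.
```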
 
... Coreteks, RGT and other PC-centric channels all predicted/leaked that RDNA 2 GPUs were going to hit some serious clock speeds ...
Seems like you were ri... I mean, Paul was right on that one :)

(sorry, almost dropped the mask there for a second) :messenger_grinning_squinting:
 