
Next-Gen PS5 & Next Xbox |OT| Speculation/Analysis/Leaks Thread

Next Gen Consoles Power Level

  • 5.8TF (24CU) 7nm DUV ≥ 1660Ti tier

    Votes: 0 0.0%

  • Total voters
    66

R600

Member
Jun 17, 2019
319
203
195
Seems Nvidia will be using TSMC along with Samsung for their next gen GPUs.


Updated, 7/5/19, 9:23am PT: Nvidia executive vice president of operations Debora Shoquist said in a statement that “Recent reports are incorrect – NVIDIA’s next-generation GPU will continue to be produced at TSMC. NVIDIA already uses both TSMC and Samsung for manufacturing, and we plan to use both foundries for our next-generation GPU products.”
 

LordOfChaos

Member
Mar 31, 2014
9,078
1,121
710
Seems Nvidia will be using TSMC along with Samsung for their next gen GPUs.


Updated, 7/5/19, 9:23am PT: Nvidia executive vice president of operations Debora Shoquist said in a statement that “Recent reports are incorrect – NVIDIA’s next-generation GPU will continue to be produced at TSMC. NVIDIA already uses both TSMC and Samsung for manufacturing, and we plan to use both foundries for our next-generation GPU products.”

Samsung for the lower-cost tiers, TSMC for the mid-to-upper end, is my guess? N7 (and N7+) is still the node to beat, while the rumor was that Samsung was aggressively undercutting on pricing to get any business, offering a full mask set for under the price of one multi-layer mask from TSMC.
 
Last edited:

SonGoku

Member
Aug 16, 2018
3,500
3,309
550
I think we are finding out that it's possible a 9 TF NAVI GPU may be similar to an RTX 2080 tier GPU
Not even close LOOOOL. If that were the case AMD would be singing it from the rooftops, and Navi perf per flop would be higher than Turing's (which isn't the case).
The 2070, 1080 and Vega 64 are all on the same tier and make for a very underwhelming next gen leap.
If they had never come out, an 8 TFLOP console would have been huge for us.
It wouldn't; Vega 64/2070 performance is not enough for a next gen leap at 4K
TSMC was promising 7nm EUV mass production starting March this year.
7nm EUV this year is for mobile chips
If it means the 9TF GPU with RDNA2 performs better than the 10.7TF GPU on GCN, then I don't think they'll care.
Vega56/64 is not the benchmark for next gen
So 7nm+ must be EUV?
Yes
The nexbox SoC was confirmed to be 390mm²; surely more than 36 CUs can be packed into that even with 7nm DUV.
That's 60 CUs on 7nm DUV and close to 80 on EUV
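For anyone who wants to sanity-check CU counts like that, here's a rough plug-your-own-numbers sketch in Python. The 251mm² / 40 CU Navi 10 figure is real; the CPU+I/O area, the EUV scaling factor and the lower per-CU estimates are illustrative guesses, since the true marginal cost of a CU on a console SoC isn't public:

```python
# Rough calculator: how many CUs fit in a ~390 mm^2 console SoC?
# Navi 10 is ~251 mm^2 for 40 CUs, but that die-average figure also amortises
# display/media/memory blocks that don't grow with CU count, so the real
# marginal area per CU is somewhere below 251/40. All other inputs are guesses.

SOC_AREA_MM2 = 390.0      # reported next Xbox SoC size
CPU_IO_MM2 = 110.0        # guess: 8 Zen 2 cores + memory controllers + I/O
EUV_SCALING = 0.85        # guess: modest logic-area saving from 7nm+ EUV layers

gpu_budget = SOC_AREA_MM2 - CPU_IO_MM2

for mm2_per_cu in (6.3, 5.0, 4.0):   # die-average vs. more "marginal" estimates
    duv = gpu_budget / mm2_per_cu
    euv = gpu_budget / (mm2_per_cu * EUV_SCALING)
    print(f"{mm2_per_cu:.1f} mm^2/CU -> ~{duv:.0f} CUs on DUV, ~{euv:.0f} CUs on 7nm+")
```

Depending on which per-CU figure you believe, that lands anywhere from ~45 to ~80 CUs, which brackets both the conservative and the optimistic takes in this thread.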
 
Last edited:

SonGoku

Member
Aug 16, 2018
3,500
3,309
550
How come 1.8TF GCN1.0 with mobile CPU made it to 2020?
Performance-wise, going with a 5700 or 5700 XT next gen would be comparable to having used a GTX 280 or GTX 460 respectively this gen, and both of those are way below the GPU used in the PS4
Jaguar was trash but it was a massive leap over the PPE core nevertheless
 
Last edited:

SonGoku

Member
Aug 16, 2018
3,500
3,309
550
@ArabianPrynce
There is talk of 7nm+, but historically consoles use a manufacturing node that was released a year or more earlier, so 7nm+ is unlikely. After all, you don't want your console plans to hinge on a node that may not come to fruition, may have really bad yields, or may be unable to reach mass production.
7nm+ is not a new node, it's 7nm with some EUV layers (4 non-critical layers), and yields are already on par with 7nm
TL;DR: based on Navi benchmarks for the RX 5700, PS5 GPU will be around the RTX 2080 / 2080 Ti assuming the rumors hold true.
This is what I expect too: in between the 2080 and 2080 Ti (closer to the 2080)
Power draw is a concern. As such, take the above with a giant grain of salt.
A full length vapor chamber can handle 250W no problem
 
Last edited:
  • Like
Reactions: ArabianPrynce

llien

Gold Member
Feb 1, 2017
5,396
2,504
680
...NVIDIA already uses both TSMC and Samsung for manufacturing...
Is this true though? Which cards are produced at "and Samsung"?
That comment could simply be "let's not make TSMC nervous".

Vega56/64 is not the benchmark for next gen
Perhaps it would make sense to switch to % of a Vega 64 or other cards, instead of TF figures, which bring nothing but confusion.
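Something like this, for example. This is only a rough sketch: the Vega 64 TFLOPS number is the card's boost-clock figure, and the ~1.28x per-flop factor is an assumption taken from the clock-for-clock comparison posted further down the thread, so treat the outputs as ballpark percentages, not benchmarks:

```python
# Convert a speculated Navi TF figure into "% of a Vega 64" instead of raw TF.
# VEGA64_TF is the boost-clock FP32 figure; the per-flop factor is an assumed
# RDNA advantage based on the ~+28% clock-for-clock result vs. Vega quoted later.

VEGA64_TF = 12.7                  # 4096 shaders x 2 x ~1.55 GHz
RDNA_PER_FLOP_VS_VEGA = 1.28      # assumption, not an official number

def navi_as_percent_of_vega64(navi_tf: float) -> float:
    return 100.0 * (navi_tf / VEGA64_TF) * RDNA_PER_FLOP_VS_VEGA

for tf in (8, 9, 10.5, 12):
    print(f"{tf:>4} TF Navi ~= {navi_as_percent_of_vega64(tf):.0f}% of a Vega 64")
```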
 

Marlenus

Member
Jul 29, 2013
1,135
74
385
UK
Not even close LOOOOL. If that were the case AMD would be singing it from the rooftops, and Navi perf per flop would be higher than Turing's (which isn't the case).
The 2070, 1080 and Vega 64 are all on the same tier and make for a very underwhelming next gen leap.
Computerbase.de did some RDNA vs Turing tests at the same shader count, same clockspeed and same memory bandwidth, and the average was a 1% win for RDNA.

It means perf/flop and perf/transistor for RDNA is roughly equal to Turing, which is a pretty impressive leap from where GCN was.
 

ANIMAL1975

Member
Aug 27, 2018
413
417
270
Portugal 🇵🇹
What happened in 2005 was that graphics tech was completely different than it is today. Really, hardly comparable.

You could put out a 200W console that was 2x as fast as the best PC at the time, but nowadays you need 400W+ to match a high end PC (while Nvidia is on an older process node).

What happened in the early 2000s will never happen again. Tech is just hitting the wall; there are no big node advances and no new ways to skin a cat. GCN lasted for 7 years, while back in the day ATI and Nvidia changed architectures every year, so console makers could benefit a lot more.
You didn't understand what I meant. After a whole generation of losses on hardware (and no paid multiplayer on Sony's side), both companies had to plan the new consoles on a tight budget; PS Plus became obligatory for online, and taking losses per machine was out of the question! This time around there's a whole new picture.
 

mckmas8808

Gold Member
May 24, 2005
41,274
4,151
1,630
Not even close LOOOOL. If that were the case AMD would be singing it from the rooftops, and Navi perf per flop would be higher than Turing's (which isn't the case).
The 2070, 1080 and Vega 64 are all on the same tier and make for a very underwhelming next gen leap.
I thought AMD's 5700XT was 7.5 or 8 TFs of GPU power. Is that wrong?
 

mckmas8808

Gold Member
May 24, 2005
41,274
4,151
1,630
That's 9TF. The 5700 is 7.5.
Oh okay, so that's the confusion I had. So an 11TF Navi on 7nm+ EUV would be like having an RTX 2080 Ti then. So we probably won't be getting that. The best we can hope for is a custom Navi that's slightly better than the 5700 XT (example: 10TF with some extra Sony-added features to boot).
 

SonGoku

Member
Aug 16, 2018
3,500
3,309
550
Computerbase.de did some RDNA vs Turing tests at the same shader count, same clockspeed and same memory bandwidth, and the average was a 1% win for RDNA.

It means perf/flop and perf/transistor for RDNA is roughly equal to Turing, which is a pretty impressive leap from where GCN was.
No it doesn't, Turing is about 1TF ahead of RDNA
I thought AMD's 5700XT was 7.5 or 8 TFs of GPU power. Is that wrong?
9-9.7TF depending on boost clock (quick math at the end of this post)
Since when have game consoles been better than mid-range PC's? This would be mind blowing and will never happen.
2080 will be midrange by the time consoles launch
Next-gen will have 12 Navi TF like PS3 had "2TF"!
You joke, but 11-12TF Navi is the minimum for a proper next gen leap at 4K
Adjusted for resolution, 8-9TF of Navi for next gen is the equivalent of having used a GTX 280 or GTX 460 for current gen consoles.
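For reference, the 9-9.7TF range above is just the standard FP32 formula (shaders x 2 ops per clock x clock speed) applied to AMD's published game and boost clocks for the 5700 XT. A quick sketch:

```python
# FP32 TFLOPS = shader count x 2 ops/clock (FMA) x clock speed.
# 1755/1905 MHz are AMD's published game/boost clocks for the RX 5700 XT.

def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000.0

RX_5700_XT_SHADERS = 2560   # 40 CUs x 64 stream processors

for label, clock_ghz in (("game clock", 1.755), ("boost clock", 1.905)):
    print(f"5700 XT @ {label} ({clock_ghz} GHz): "
          f"{fp32_tflops(RX_5700_XT_SHADERS, clock_ghz):.2f} TF")
```

That lands at roughly 9.0TF at the game clock and 9.75TF at the boost clock.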
 
Last edited:

Sycomunkee

Member
Jan 21, 2019
1,239
1,058
360
2080 will be midrange by the time consoles launch
Maybe, but everything needs to exist today so they can build games with final devkits and consoles. The GPU of tomorrow does not exist today, therefore we are not getting next year's GPU in a console that releases next year.

I've said this before, but we are not getting wine-tier GPUs for beer-tier pricing. It's simply not going to happen.
 
Last edited:
  • Like
Reactions: LSWilson

SonGoku

Member
Aug 16, 2018
3,500
3,309
550
So an 11TF Navi on 7nm+ EUV would be like having an RTX 2080 Ti then
Not at all
11TF Navi would be 2080 tier, maybe close to a 2080S, but nowhere close to a Ti
Maybe, but everything needs to exist today so they can build games with final devkits and consoles. The GPU of tomorrow does not exist today, therefore we are not getting next year's GPU in a console that releases next year.

I've said this before, but we are not getting wine-tier GPUs for beer-tier pricing. It's simply not going to happen.
Not really, console launches have proven they have their own roadmaps and engineering samples independent of the discrete cards available
That's the benefit of custom and semi-custom designs.

Big Navi discrete cards will easily pull 16-18TF, and Nvidia's Ampere midrange card will be 2080S performance or above
 

Ovech-King

Member
Feb 27, 2017
135
71
225
11-12TF Navi is the minimum for a proper next gen leap at 4K
Adjusted for resolution, 8-9TF of Navi for next gen is the equivalent of having used a GTX 280 or GTX 460 for current gen consoles.
Totally; my 10.5 TFLOPS RTX 2080, even with a manual overclock, runs maybe 50% of games at 4K 60fps, and I have a higher CPU frequency (4.4GHz) than the Ryzen will have
 
  • Like
Reactions: SonGoku

THE:MILKMAN

Member
Mar 31, 2006
4,420
291
1,215
Midlands, UK
You joke, but 11-12TF Navi is the minimum for a proper next gen leap at 4K
Adjusted for resolution, 8-9TF of Navi for next gen is the equivalent of having used a GTX 280 or GTX 460 for current gen consoles.
I'm really not joking. I'm just being realistic. If Sony/Microsoft have some TF rabbit in their hats that will give us 11-12 Navi TF then great. I'm not going all in on expecting it though, as it just leads to disappointment.

For the last 2 years I've expected Vega 56-Vega 64 performance for next-gen and even the 5700 meets/exceeds that. Add custom bits/RT/SSD/16GB+ RAM and I can't wait to see the closed box results.
 

SonGoku

Member
Aug 16, 2018
3,500
3,309
550
If Sony/Microsoft have some TF rabbit in their hats that will give us 11-12 Navi TF then great
That rabbit's name is EUV :)
But even on standard DUV, 10-11TF is doable in a console
I'm not going all in on expecting it though, as it just leads to disappointment.
I don't get the need to lowball to be satisfied, it's just consoles. But fine
For the last 2 years I've expected Vega 56-Vega 64 performance for next-gen and even the 5700 meets/exceeds that.
The thing is, Vega 64 + 15% is nowhere near enough for a next gen leap at 4K
If the target resolution were 1080p to 1440p it would be fine
Totally; my 10.5 TFLOPS RTX 2080, even with a manual overclock, runs maybe 50% of games at 4K 60fps, and I have a higher CPU frequency (4.4GHz) than the Ryzen will have
And that's for current gen games; next gen will be much more demanding
 
Last edited:

THE:MILKMAN

Member
Mar 31, 2006
4,420
291
1,215
Midlands, UK
@SonGoku

The single biggest barrier/kryptonite for Sony and MS is power consumption of Navi. The same happened with PS4 to be fair. Sony took a full 7870 chip and disabled 2CUs but then clocked it lower than the 7850.

I could see them do the same for PS5. Take a full Navi 10 chip and disable 4CUs but this time clock it as high as possible and put in better cooling in a bigger case.

Another reason for Sony/MS to hold back on specs is possible mid-gen upgrades. Gotta leave room to make the upgrade worth it.
 

SonGoku

Member
Aug 16, 2018
3,500
3,309
550
@SonGoku
The single biggest barrier/kryptonite for Sony and MS is power consumption of Navi. The same happened with PS4 to be fair. Sony took a full 7870 chip and disabled 2CUs but then clocked it lower than the 7850.
I could see them do the same for PS5. Take a full Navi 10 chip and disable 4CUs but this time clock it as high as possible and put in better cooling in a bigger case.
The workaround is going with a wider design clocked lower (1500-1600MHz), undervolting the chips and applying the Hovis method
7nm Pro and 7nm+ have power reductions as well
All things considered, a 250W console would be enough for the higher end of expectations (11TF DUV & 13-14TF EUV)
Another reason for Sony/MS to hold back on specs is possible mid-gen upgrades. Gotta leave room to make the upgrade worth it.
That could be possible if they tell devs to target 1440p instead of 4K and launch at $400
The 5700 XT is just not enough for a next gen leap at 4K; a disabled version would do much worse
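The wider-and-slower argument is just the usual dynamic power relation, P ≈ C·f·V². A minimal sketch below; the voltages are illustrative guesses rather than measured Navi numbers, the point is the shape of the trade-off:

```python
# Dynamic power scales roughly with (active units) x frequency x voltage^2.
# Voltages below are illustrative guesses, not measured Navi figures.

def navi_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

def relative_gpu_power(cus: int, clock_ghz: float, volts: float) -> float:
    # arbitrary units: per-CU capacitance and leakage are ignored
    return cus * clock_ghz * volts ** 2

configs = {
    "narrow/fast (5700 XT-like)": (40, 1.9, 1.20),
    "wide/slow (console-style)":  (54, 1.5, 1.00),
}

for name, (cus, clock, volts) in configs.items():
    print(f"{name}: {navi_tflops(cus, clock):.1f} TF, "
          f"relative power ~{relative_gpu_power(cus, clock, volts):.0f}")
```

With those made-up numbers the wider chip delivers slightly more TF for roughly 25% less GPU power, which is the whole argument for lower console clocks.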
 
Last edited:
  • Like
Reactions: SpinningBirdKick

LordOfChaos

Member
Mar 31, 2014
9,078
1,121
710
Edit: Nevermind me, guess I was late to this party


Navi (RDNA) vs. | Average gain
Turing | +1%
Pascal | +13%
Vega | +28%
Polaris | +39%



Granted they're a full node ahead of Nvidia, but they did all but catch up in performance per clock/ALU (in other words, IPC).
 
Last edited:

SonGoku

Member
Aug 16, 2018
3,500
3,309
550
Granted they're a full node ahead of Nvidia, but they did all but catch up in performance per clock/ALU (in other words, IPC).
Does that mean an 11TF Navi would match a 2080S?
I can see that being the sweet spot next gen consoles target
 

THE:MILKMAN

Member
Mar 31, 2006
4,420
291
1,215
Midlands, UK
The workaround is going with a wider design clocked lower (1500-1600MHz), undervolting the chips and applying the Hovis method
7nm Pro and 7nm+ have power reductions as well
All things considered, a 250W console would be enough for the higher end of expectations (11TF DUV & 13-14TF EUV)
Where it all comes crumbling down is saying 250W would be enough. No way is any console even approaching 250W. That is higher than the high-end PC test systems the tech sites use, running a 5GHz i9, 32GB RAM and a 5700 (Anandtech's used 231 watts running the 5700).
 

SonGoku

Member
Aug 16, 2018
3,500
3,309
550
Where it all comes crumbling down is saying 250W would be enough. No way is any console even approaching 250W. That is higher than the high-end PC test systems the tech sites use, running a 5GHz i9, 32GB RAM and a 5700 (Anandtech's used 231 watts running the 5700).
Why not? That's 250W total system power consumption, btw
The X already breaks 200W in some tests

And again, that's for the higher end of expectations; the more conservative ones (10.3TF DUV & 12TF EUV) would comfortably fit a 200W total system consumption
 
Last edited:

vpance

Member
Jun 20, 2005
7,044
274
1,380
@SonGoku

The single biggest barrier/kryptonite for Sony and MS is power consumption of Navi. The same happened with PS4 to be fair. Sony took a full 7870 chip and disabled 2CUs but then clocked it lower than the 7850.

I could see them do the same for PS5. Take a full Navi 10 chip and disable 4CUs but this time clock it as high as possible and put in better cooling in a bigger case.

Another reason for Sony/MS to hold back on specs is possible mid-gen upgrades. Gotta leave room to make the upgrade worth it.
I'm not so sure a mid gen upgrade will be worth it for them this time. They were able to release the Pro with a bit more than 2x the power at a fair price. Next gen they could do maybe 40% more power on 5nm? Even the Pro was already not really worth it if you didn't own a 4K TV.

That's why they should go as powerful as they can at launch, at $499.
 
Last edited:

THE:MILKMAN

Member
Mar 31, 2006
4,420
291
1,215
Midlands, UK
Why not? That's 250W total system power consumption, btw
The X already breaks 200W in some tests

And again, that's for the higher end of expectations; the more conservative ones (10.3TF DUV & 12TF EUV) would comfortably fit a 200W total system consumption
Because it just doesn't make sense and would be costly. Bigger case, bigger power supply, etc. I expect ±175W, and possibly Microsoft pushing 200W if they feel the need to take the performance crown.

One X was different as it was a niche model and sold at the higher $500 price point. It was pushed to the limit on thermals and watts, and I can't see Scarlett doing the same if they hope to compete in sales this time round (1 million a month).

I really do think Sony and Microsoft knew from the outset they would have to get creative for these consoles to feel like a proper leap. The SSD being part of that.
 

SonGoku

Member
Aug 16, 2018
3,500
3,309
550
Because it just doesn't make sense and would be costly. Bigger case, bigger power supply, etc. I expect ±175W, and possibly Microsoft pushing 200W if they feel the need to take the performance crown.

One X was different as it was a niche model and sold at the higher $500 price point. It was pushed to the limit on thermals and watts, and I can't see Scarlett doing the same if they hope to compete in sales this time round (1 million a month).

I really do think Sony and Microsoft knew from the outset they would have to get creative for these consoles to feel like a proper leap. The SSD being part of that.
It's the same scenario plus more incentive (main system vs upgrades); next gen consoles will target $500, so a vapor chamber is a given
The X broke 200W depending on the model tested, and that was a very slim console made for profit; nothing stops launch consoles from being fatter/bigger, as they traditionally are, and taking a small loss.

I think Sony/MS knew less than 10TF wouldn't cut it for next gen, 11TF being the sweet spot; that's why they're targeting late 2020.
 
Last edited:
Dec 12, 2006
2,727
191
1,060
Next-gen will have 12 Navi TF like PS3 had "2TF"!
Come on son, have some ambition...

Next gen will have at least one 14 TF beast.

Sony will double down with powerful hardware, and MS needs a Hail Mary pass to get them out of the jam they're in.

Neither will be giving you less than 12 TF. If they were, they wouldn't have waited for 7nm EUV in 2020 to release their next consoles.
 
  • Like
Reactions: ANIMAL1975

Marlenus

Member
Jul 29, 2013
1,135
74
385
UK
No it doesn't, Turing is about 1TF ahead of RDNA
The 5700 has 2304 shaders and 7GHz GDDR6. The 2070 also has 2304 shaders and 7GHz GDDR6. Computerbase.de set both cards to 1.5GHz (around 6.9 TFLOPS) and the performance was the same on average across a range of games.

That means performance per flop is roughly equal; maybe RDNA gets bottlenecked at higher clocks more readily than Turing. I don't know, and I have only seen one data point, so it is not totally conclusive.

Using Anandtech's review, as they give the average clock in each game, the 2070 has a 15% performance advantage over the 5700 in GTA V while having a 12% clockspeed advantage. GTA V also seems to very slightly favour Turing, as the 5700 is in a dead heat with the 2060. When I get a chance I will compare more games, but it does suggest that with equal TFLOPS RDNA can pretty much match Turing.
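A quick way to read that GTA V data point: divide the performance gap by the clock gap and you get a rough per-flop comparison, since both cards have the same shader count. A small sketch using the numbers above:

```python
# Clock-normalised comparison for GTA V (both cards have 2304 shaders, so the
# per-clock ratio is effectively a per-flop ratio). Figures from the post above.

perf_advantage_2070 = 1.15    # 2070 is ~15% faster in GTA V
clock_advantage_2070 = 1.12   # ...while running ~12% higher average clocks

per_flop_ratio = perf_advantage_2070 / clock_advantage_2070
print(f"Turing vs RDNA per-flop ratio in GTA V: ~{per_flop_ratio:.2f}x")
```

So about 1.03x, i.e. within a few percent of parity once clocks are accounted for, consistent with the Computerbase result.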
 
  • Like
Reactions: xool

SonGoku

Member
Aug 16, 2018
3,500
3,309
550
The 5700 has 2304 shaders and 7GHz GDDR6. The 2070 also has 2304 shaders and 7GHz GDDR6. Computerbase.de set both cards to 1.5GHz (around 6.9 TFLOPS) and the performance was the same on average across a range of games.
Yeah, I've seen the benchmark @LordOfChaos posted. Great news, assuming it applies to all gaming scenarios
That means that with a conservative 12TF (on EUV) they can surpass the 2080S, and 10.5TF (on DUV) can match the 2080 (quick arithmetic check after the list):
  • 54 CUs @ 1520MHz = 10.5TF (2080 tier)
  • 60 CUs @ 1600MHz = 12.28TF (2080S+ tier)
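Same FP32 arithmetic as earlier in the thread, applied to those two speculative configurations (the CU counts and clocks are the speculation above, not known specs):

```python
# TFLOPS for the two speculated console configurations above.
# FP32 TFLOPS = CUs x 64 shaders x 2 ops/clock x clock.

def navi_tflops(cus: int, clock_mhz: int) -> float:
    return cus * 64 * 2 * clock_mhz / 1e6

for cus, clock_mhz in ((54, 1520), (60, 1600)):
    print(f"{cus} CUs @ {clock_mhz} MHz -> {navi_tflops(cus, clock_mhz):.2f} TF")
```

Which comes out to 10.51TF and 12.29TF, matching the bullets.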
 
  • Like
Reactions: joshwaan
Jul 29, 2013
1,251
1,301
615
Oregon, US
Even if you had a 20% reduction in average power consumption for the 5700 XT, you're still pulling 175W on just the graphics card alone. That's as much as the "at the wall" consumption for the Xbox One X. Add 8-16GB GDDR6, so +16-32W, an uc/uv Zen 2 8c at 32W, plus the rest of the system = ~240W average gaming consumption... with a highly hopeful 20% reduction in power consumption.

Sony took 2016's RX 480 (~166W) and downclocked it by ~30% for ~155W total system consumption in the same year. MS waited a year and put the equivalent of a slightly OC'd RX 480 with more/faster RAM in the Xbox One X for ~175W total system consumption in 2017. Working back from a hypothetical 200W system: subtract ~48W for CPU + 8GB GDDR6 = 152W. Account for the rest of the system (SSD, BD drive, USB, etc.) and you have maybe 140W for the GPU TBP (this includes 8GB GDDR6, rolling the mobo in as part of TBP). Personally, 200W sounds high for a console.
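Working that same 200W budget backwards in code, using the line items from the post above (CPU ~32W, 8GB GDDR6 ~16W) plus a rough guess for the rest of the system:

```python
# Power budget worked backwards from a hypothetical 200 W console.
# CPU and GDDR6 figures are the estimates from the post above; "other" is a
# rough guess for SSD, BD drive, USB, fan and PSU losses.

SYSTEM_BUDGET_W = 200
cpu_w = 32          # underclocked/undervolted Zen 2 8-core
gddr6_8gb_w = 16    # ~2 W per GDDR6 module x 8
other_w = 12        # SSD, BD drive, USB, fan, PSU losses

gpu_tbp_w = SYSTEM_BUDGET_W - cpu_w - gddr6_8gb_w - other_w
print(f"GPU board power left in a {SYSTEM_BUDGET_W} W system: ~{gpu_tbp_w} W")
```

Which lands at the same ~140W GPU figure as above.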
 

Marlenus

Member
Jul 29, 2013
1,135
74
385
UK
Even if you had a 20% reduction in average power consumption for the 5700 XT, you're still pulling 175W on just the graphics card alone. That's as much as the "at the wall" consumption for the Xbox One X. Add 8-16GB GDDR6, so +16-32W, an uc/uv Zen 2 8c at 32W, plus the rest of the system = ~240W average gaming consumption... with a highly hopeful 20% reduction in power consumption.

Sony took 2016's RX 480 (~166W) and downclocked it by ~30% for ~155W total system consumption in the same year. MS waited a year and put the equivalent of a slightly OC'd RX 480 with more/faster RAM in the Xbox One X for ~175W total system consumption in 2017. Working back from a hypothetical 200W system: subtract ~48W for CPU + 8GB GDDR6 = 152W. Account for the rest of the system (SSD, BD drive, USB, etc.) and you have maybe 140W for the GPU TBP (this includes 8GB GDDR6, rolling the mobo in as part of TBP). Personally, 200W sounds high for a console.
Assuming the 5700 XT is towards the upper end of the clockspeed/power curve, a card with more shaders but lower clockspeeds will use less power for the same performance. The trade-off is die area, but if EUV is cheaper that may not matter as much.
 

SonGoku

Member
Aug 16, 2018
3,500
3,309
550
Even if you had a 20% reduction in average power consumption for the 5700 XT, you're still pulling 175W on just the graphics card alone.
That's not how it works... the 5700 XT is pushing 1900MHz clocks with the voltage needed to sustain it.
At 1520MHz (DUV) or 1600MHz (EUV) the console chips would run at lower voltage with much better perf/watt. That's before even taking into account any 7nm+/7nm Pro power savings
Personally, 200W sounds high for a console.
Certain X models (Hovis method) broke 200W. The X was also a very slim console
I don't see why a phat launch console with a large vapor chamber couldn't handle 250W comfortably
The trade-off is die area, but if EUV is cheaper that may not
Right, EUV is less complex so cheaper to produce, and 66 CUs with a 384-bit bus would fit in 350-360mm²
 
Last edited:
Jul 29, 2013
1,251
1,301
615
Oregon, US
That's not how it works... the 5700 XT is pushing 1900MHz clocks with the voltage needed to sustain it.
Demonstrate that with the 5700 XT: UC/UV it while UV'ing a 5700 Pro and compare the perf/watt gains, then we'll go from there.

Your hopes and dreams are beyond best-case. Reconcile that with recent historical precedent (PS4/PS4 Pro, Xbox X) and the known consumption for Navi 10.

You know I'm on the conservative side. I just wanted to pop back in and say Anandtech's "Building the PS5" build with a UC/UV Zen 2 8c and 5700 Pro was pulling ~250W during the 120fps test. So go on and ask them their thoughts on your theory; I've explained my position, maybe we will learn something new from them?

Austin Evans Twitter
 
Last edited:

xool

Member
May 29, 2018
592
456
310
When I get a chance I will compare more games, but it does suggest that with equal TFLOPS RDNA can pretty much match Turing.
This seems like an important takeaway.

So new (RDNA) AMD FLOPS ≈ Nvidia FLOPS (for the same real-game performance).

Do other people agree with this rough equivalence?
 
Last edited:
  • Like
Reactions: mckmas8808

TeamGhobad

Member
Oct 15, 2018
2,133
1,988
435
It's time to watercool, that's my opinion. I don't want a mid-cycle refresh. Watercool the sucker, put in a big fan like the OG Xbox One had, and you're good to go.
 
  • Like
Reactions: ANIMAL1975

SonGoku

Member
Aug 16, 2018
3,500
3,309
550
Demonstrate that with the 5700 XT: UC/UV it while UV'ing a 5700 Pro and compare the perf/watt gains, then we'll go from there.
Power hungry Vega:
Reconcile that with recent historical precedent (PS4/PS4 Pro, Xbox X) and the known consumption for Navi 10.
Launch consoles this gen targeted $400 with a profit; next gen, $500 with a small loss will enable 200-250W consoles by employing vapor chambers or similar state-of-the-art cooling solutions
"Building the PS5" build with a UC/UV Zen 2 8c and 5700 Pro was pulling ~250W during the 120fps test.
They are doing it wrong... even if the target for next gen were a measly 8TF, console manufacturers would go with a wider/slower design. 1.7-1.8GHz is just too much for console clocks on standard 7nm
It's time to watercool, that's my opinion. I don't want a mid-cycle refresh. Watercool the sucker, put in a big fan like the OG Xbox One had, and you're good to go.
No need; a large vapor chamber would be enough to cool a 250W console, which covers the highest of expectations (11TF DUV & 13-14TF EUV)
My baseline expectations are now 10.5TF (DUV) & 12TF (EUV)
 
Last edited:
  • Like
Reactions: TeamGhobad

Marlenus

Member
Jul 29, 2013
1,135
74
385
UK
This seems like an important takeaway.

So new (RDNA) AMD FLOPS ≈ Nvidia FLOPS (for the same real-game performance).

Do other people agree with this rough equivalence?
I am comparing the Anandtech numbers for the 5700 XT and the 2070 Super at the moment, and it is interesting as both have the same number of shaders, the same ROPs and the same memory bandwidth.

At 1440p the 5700 XT does really well.

Tomb Raider perf is the same with the 5700 XT running at 1780MHz and the 2070S running at 1875MHz
In F1 2019 the 5700 XT is a couple of % ahead, with the XT at 1800MHz and the 2070S at 1875MHz

It seems the games where the 5700 XT loses (at 1440p) are the ones where it does not clock that high, whereas the 2070S seems to clock up at 1850+MHz on a pretty consistent basis.

At 4K and 1080p though it seems to fall behind, which seems odd.

It makes me think that with some driver tweaks and AIB coolers that can keep it in check, it might match the 2070S at 1440p.

EDIT: I got fed up of comparing on my phone, but if you read the 5700 review for the FPS numbers/5700 clockspeeds and the 2070S review for the 2070S clockspeeds, the takeaway is that the clockspeeds on the XT are far more variable vs the 2070S.
 
Last edited:

ArabianPrynce

Member
Jun 4, 2019
153
125
195
I don't know what you're smoking, but it's piqued my interest. I hope you're right, but I have serious doubts.
When I think about it, the console manufacturers dug themselves into a hole with the Pro variants of their consoles, because it skewed everyone's expectations. If they had just stuck with the PS4 and Xbox One for the whole life cycle, the announcement of an 8 or 9 TFLOP console with an SSD and Ryzen would have been fucking killer.

I think the Pro variant consoles put them in a position where they have no choice but to raise the specs. Sony said they are targeting the hardcore and see PlayStation as a "niche market". I am sure most hardcore gamers are able to see the minimal jump between the Pro and PS5. Yes, to us there is more going on in the GPU than just TFLOPS, but most people just see a 2x jump in power, which leaves very little incentive to buy the PS5.
@SonGoku
 
Last edited:
  • Like
Reactions: LSWilson