SonGoku
Member
That's what I'm saying though: a 3080 XT (going by AdoredTV's latest video) with clocks fine-tuned to hit a performance-per-watt sweet spot can reach ~12.5TF on 150W real. Other factors that influence power consumption are yields and manufacturing refinements, both of which could improve by the time the PS5 enters mass production, bringing further power reductions.

PS4 Pro released the same year as Polaris 10 and had ~155W average total system consumption for $399. MS released the Xbox One X a year later, and it got them another ~20W of TDP budget. In the end they still had a card that's basically just AMD's 2016 $250/150W mid-range part with higher memory bandwidth and more cache.
I just wanted to tell you that if you're going by the AdoredTV leaks, the RX 3080 isn't a 150W part any longer; it's 175W and supposedly worse. Something from the 160W tier with power capping and voltage tuning, or from the 130W tier, would be more appropriate. AdoredTV could be bullshit, but it at least gives us a basis to talk next-gen console TDP and PC pricing tiers.
P.S. My max prediction (48 CUs at a 1.8GHz peak core clock) isn't very far off your 12ish TF prediction. Anywhere near GTX 1080 power in a console will be awesome.
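For anyone wanting to check these TF numbers: they fall out of the usual GCN/Navi peak-FP32 formula (CUs × 64 shaders per CU × 2 FLOPs per clock × clock speed). A quick sketch; the 1.8GHz and 48 CU figures are just the prediction above, not confirmed specs:

```python
def tflops(cus: int, ghz: float) -> float:
    # Peak FP32 throughput: CUs * 64 shaders/CU * 2 FLOPs/clock * clock (GHz)
    return cus * 64 * 2 * ghz / 1000

# The 48 CU @ 1.8GHz prediction:
print(round(tflops(48, 1.8), 2))  # → 11.06
```

So ~11TF, which is indeed in the same ballpark as a 12.5TF part, and well past the GTX 1080's roughly 8.9TF peak.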
I'm talking about the 190W 3080 XT, btw.
So realistically we can get ~12.5TF in a 200W box for $500. The console would have to be slightly bigger than the X, but that console is almost as small as the slim anyway, so no issue there.
A 130W GPU (~10-11TF) would be more appropriate if they were trying to hit $399.
PS: I think a 56 CU part (60 CUs with 4 disabled) would hit a better performance-per-watt sweet spot.
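The wider-and-slower argument in numbers: a 56 CU part matches the throughput of 48 CUs at 1.8GHz while clocking noticeably lower, and lower clocks mean lower voltage, which is where the perf-per-watt gain comes from (dynamic power scales roughly with V²·f). A sketch using the same CUs × 64 × 2 × clock assumption, with the CU counts taken from the posts above:

```python
def tflops(cus: int, ghz: float) -> float:
    # Peak FP32 throughput: CUs * 64 shaders/CU * 2 FLOPs/clock * clock (GHz)
    return cus * 64 * 2 * ghz / 1000

target = tflops(48, 1.8)                # ~11.06 TF from the narrower config
ghz_needed = target / tflops(56, 1.0)   # clock a 56 CU part needs for the same TF
print(round(ghz_needed, 3))             # → 1.543
```

So the 56 CU part hits the same ~11TF at roughly 1.54GHz instead of 1.8GHz, a ~14% clock reduction that should let it sit meaningfully lower on the voltage/frequency curve.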