
Latest speculation from "known sources" points to Navi (the GPU in next-gen consoles) having issues.

johntown

Banned
That's terrible news for everyone. I can't believe there are still PC gamers out there who don't get that the fate of (state-of-the-art) PC games is directly linked to what your console generation's base spec is. I wish it wasn't that way, because then awesome things like real-time GI would have been standard in games the whole last generation. But that's how it is.
What?? What do, say, base PS4/Xbox specs have to do with PC gaming at all? It's not like PC versions of those games don't get a ton of extra features and look 10x better.

Please explain this more as it makes zero sense to me. Thx
 

joe_zazen

Member
What?? What do, say, base PS4/Xbox specs have to do with PC gaming at all? It's not like PC versions of those games don't get a ton of extra features and look 10x better.

Please explain this more as it makes zero sense to me. Thx

Expensive games are built for consoles, so console tech determines what game devs can and cannot do. Aside from visuals and framerate, PC gamers with $1500 cards and $1000 CPUs get nothing extra. So things like AI, physics, etc. ... basically gameplay is the same. Want better gameplay? Hope for better consoles. But it looks like the base will be a cut-down, cheap 4TF Lockhart, so it's a small jump coming gen.
 

johntown

Banned
Expensive games are built for consoles, so console tech determines what game devs can and cannot do. Aside from visuals and framerate, PC gamers with $1500 cards and $1000 CPUs get nothing extra. So things like AI, physics, etc. ... basically gameplay is the same. Want better gameplay? Hope for better consoles. But it looks like the base will be a cut-down, cheap 4TF Lockhart, so it's a small jump coming gen.
Gotcha... that makes sense.
 

johntown

Banned
For what? Do you really think that's a healthy attitude for PC gaming in general?
No it is not. I have just never liked AMD, as they seem to always have driver issues and "seem" to be behind the pace of NVIDIA. Competition is always good, but I'm just at a point right now where it does not matter much to me. I look at it like this: some people like to drive a Lexus while others are content with a Toyota.
 

Ar¢tos

Member
I honestly don't understand how the company selling a 6TF console wants to sell a 4TF "next gen" console. Even if the CPU/GPU is more advanced, selling this difference to casual players is a challenge.
 

llien

Member
I'm waiting for Navi to upgrade; if it sucks I'll just get an RTX 2060 along with a Zen 2 CPU.
Vega 56 is a good 70-80 euros cheaper where I live, and if you care much about power consumption, underclocking does wonders for Vegas.

AMD as they seem to always have driver issues
[image]
 

CrustyBritches

Gold Member
It's 2019 and yes AMD still has poor drivers consistently.

EDIT: Here is a recent article for your enjoyment as well.

Aside from being a random Reddit post, he says, "I've looked around and found people on Nvidia with similar problems and solutions but nothing for AMD..."

I don't have issues with either brand's drivers. Most multiplats play pretty well on AMD hardware just because they're made for PS4 as the primary target, RE2 and Div2 being recent examples. I've had a 1060 6GB and an RX 480 8GB, and they both have great performance and easy driver management. Nvidia has lower power consumption with higher overclocks, while AMD has more memory with higher power consumption. Can't go wrong either way.
 

Ascend

Member
No it is not. I have just never liked AMD, as they seem to always have driver issues and "seem" to be behind the pace of NVIDIA. Competition is always good, but I'm just at a point right now where it does not matter much to me. I look at it like this: some people like to drive a Lexus while others are content with a Toyota.
If you're aware that it's an unhealthy attitude, why not change it?

Did you ever use an AMD card to encounter these 'driver issues'? Leaving this here as a bonus...


Even when AMD is ahead, they are seen as being behind because... Reasons. AMD has had good products, and still has. I challenge you to find a better deal in their price range right now that is superior to the RX 570 or the Vega 56 (referring to US prices here...). People hate AMD when they can't keep up. Sure, they like AMD when they do well, but they like it because nVidia is forced to lower prices, and then they go out and buy nVidia anyway... And that is exactly what propels the gaming industry to what it is today: the exploitation of people that will pay more and more for less and less. People trashed the Radeon VII for its price, but the only reason that card exists in the first place is because nVidia's RTX prices allow it. But no one blames nVidia for it. Sure, they trash RTX, but they don't say that it's their fault that the Radeon VII has that price. But it definitely is AMD's fault that the RTX 2080 Ti is $1200+ 😵

While I understand your analogy... Lexus is not charging more and more for every new car generation simply because Toyota cannot reach 0-60 mph in only 3.6 seconds. In fact, for the majority of people, 0-60 mph in 4.6 seconds is still overkill. Not that that is helping Lexus sell any more cars though, because the majority of car buyers are sensible enough to buy something for their actual use, unlike gamers, who are more worried about the brand name than about what is actually best for them, both in the short and the long term.

Just remember that your money is a vote for how you want things in the world to be. People have voted that slower cards for more money is fine for multiple generations, and so, here we are.

I do not have hope anymore that this will change. That includes Navi. Because if the RX 570 can't change anything right now, what makes one think that anything else that AMD will put out, can? At this point they need a $500 RTX 2080 Ti level of performance card, and even then, I have my doubts that people will buy it.


It's 2019 and yes AMD still has poor drivers consistently.

EDIT: Here is a recent article for your enjoyment as well.

Oh... What about this? Example comments:

"1080 Ti here using 419.35..running at 1440p 144hz monitor with gsync on..game does hit 60 but randomly drops down to 30 or lower when fighting enemies. Kinda hard to deflect when the whole game stutters lol. "
"Same, 2080ti with 6core cpu still getting 20-30fps, unplayable even in the lowest possible setting. Tried every single thing on the web and none of them works. Trash. "

https://steamcommunity.com/app/814380/discussions/0/1850323802572206287/

In other words... Anecdotal evidence does not prove that somehow AMD has inferior drivers to nVidia.
 

Ivellios

Member
Vega 56 is a good 70-80 euros cheaper where I live, and if you care much about power consumption, underclocking does wonders for Vegas.


[image]

Vega where I live is way more expensive than the RTX 2060, plus I've always read that it's really hot and power-hungry, and I prefer not to mess with undervolting or anything like that.

However, if it was considerably cheaper with similar performance, I could stand a card with those problems. Which is why I'm really hoping Navi doesn't flop on price/performance, at least.
 

Ascend

Member
Updated "leak" summary:

[image: leak summary chart]
Now that I look at it more closely... It doesn't seem to add up...

The RX 3060 & RX 3070 are supposed to be like the RX 580 and the Vega 56. There is a 30%-ish performance gap between those two. Seems like a huge hole to be left open. It would be like inviting nVidia in to make both those cards redundant.

Then there is:
RX 3080 = Vega 64 + 10%
RX 3080 XT = RTX 2070
Except the RTX 2070 is practically already the Vega 64 +10%. So... That doesn't add up either.
 
Hopefully it's bullshit or something that gets fixed, since both consoles are using(?) this architecture. I want next-gen systems to be as powerful as possible, at least as much power as $500 can buy.
 

Ascend

Member
+25% CUs and likely higher clocks too are quite in the "30%-ish" area.
AMD's compute units never had linear scaling.

Which adds up perfectly on the chart, with 2070 listed above V64+10%.
You missed the point, which is that the table indicates the RX 3080 and RX 3080 XT being equally fast.

Which made you miss another point: why would they leave a 30% gap at the lower end with a $60 difference, and put two basically equally fast graphics cards at a $50 difference in the mid tier?

Sense make it does not.
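
(For anyone who wants to check the CU math being argued here, below is a rough back-of-envelope sketch in Python. The 64-shaders-per-CU, 2-FLOPs-per-clock formula matches existing GCN parts and the RX 580 / Vega 56 boost clocks are public specs, but the "+25% CUs plus a clock bump" example uses made-up illustrative numbers, not anything from the leak.)

```python
# Back-of-envelope paper FP32 throughput for a GCN-style GPU:
# CUs * 64 shaders per CU * 2 FLOPs per clock (FMA) * clock.
# Clocks below are the public boost clocks of existing cards.

def paper_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

rx580 = paper_tflops(36, 1.34)    # ~6.2 TF on paper
vega56 = paper_tflops(56, 1.47)   # ~10.5 TF on paper

print(f"RX 580 ~{rx580:.1f} TF vs Vega 56 ~{vega56:.1f} TF: "
      f"{vega56 / rx580 - 1:.0%} apart on paper (closer to ~30% in games, per the posts above)")

# "+25% CUs and likely higher clocks" -- the 40 -> 50 CU example is made up:
base, bumped = paper_tflops(40, 1.6), paper_tflops(50, 1.7)
print(f"+25% CUs plus a 1.6 -> 1.7 GHz clock bump: {bumped / base - 1:.0%} more paper TFLOPS")
```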
 

llien

Member
AMD's compute units never had linear scaling.
That's not true; besides, it only gets really bad on higher-end cards with lots of CU power (this also applies to nVidia).

You missed the point, which is that the table indicates the RX 3080 and RX 3080 XT being equally fast.
I read it as the 3080 XT being 5-10% faster than the 3080:

[chart: relative performance at 3840x2160]


why would they leave a 30% gap at the lower end with a $60 difference, and put two basically equally fast graphics cards at a $50 difference in the mid tier?

That's an interesting question, but note how nVidia's xx50 series is doing: even the Turing x50 cannot beat the similarly priced, 3-year-old 570.
 

johntown

Banned
If you're aware that it's an unhealthy attitude, why not change it?

Did you ever use an AMD card to encounter these 'driver issues'? Leaving this here as a bonus...


Even when AMD is ahead, they are seen as being behind because... Reasons. AMD has had good products, and still has. I challenge you to find a better deal in their price range right now that is superior to the RX 570 or the Vega 56 (referring to US prices here...). People hate AMD when they can't keep up. Sure, they like AMD when they do well, but they like it because nVidia is forced to lower prices, and then they go out and buy nVidia anyway... And that is exactly what propels the gaming industry to what it is today: the exploitation of people that will pay more and more for less and less. People trashed the Radeon VII for its price, but the only reason that card exists in the first place is because nVidia's RTX prices allow it. But no one blames nVidia for it. Sure, they trash RTX, but they don't say that it's their fault that the Radeon VII has that price. But it definitely is AMD's fault that the RTX 2080 Ti is $1200+ 😵

While I understand your analogy... Lexus is not charging more and more for every new car generation simply because Toyota cannot reach 0-60 mph in only 3.6 seconds. In fact, for the majority of people, 0-60 mph in 4.6 seconds is still overkill. Not that that is helping Lexus sell any more cars though, because the majority of car buyers are sensible enough to buy something for their actual use, unlike gamers, who are more worried about the brand name than about what is actually best for them, both in the short and the long term.

Just remember that your money is a vote for how you want things in the world to be. People have voted that slower cards for more money is fine for multiple generations, and so, here we are.

I do not have hope anymore that this will change. That includes Navi. Because if the RX 570 can't change anything right now, what makes one think that anything else that AMD will put out, can? At this point they need a $500 RTX 2080 Ti level of performance card, and even then, I have my doubts that people will buy it.



Oh... What about this? Example comments:

"1080 Ti here using 419.35..running at 1440p 144hz monitor with gsync on..game does hit 60 but randomly drops down to 30 or lower when fighting enemies. Kinda hard to deflect when the whole game stutters lol. "
"Same, 2080ti with 6core cpu still getting 20-30fps, unplayable even in the lowest possible setting. Tried every single thing on the web and none of them works. Trash. "

https://steamcommunity.com/app/814380/discussions/0/1850323802572206287/

In other words... Anecdotal evidence does not prove that somehow AMD has inferior drivers to nVidia.

My post was to troll the person trying to troll my previous post. Of course we can both find posts about each manufacturer having driver issues. Probably most of them, on both sides, are people who don't really know what they are doing when it comes to PC configuration.

It really comes down to personal preference for the most part. I like NVIDIA cards, the price does not matter to me, I like a lot of the extra features they add to PC gaming (exclusive features) and I have yet to have any major problems with them. I also work in IT so I know what I am doing with PCs in general.

I prefer premium products and I want my PC gaming experience to be the best it can be. IMO that means NVIDIA and Intel.
 

llien

Member
the bad news that we already had was that it was still GCN.
Exactly. As soon as I heard it was STILL GCN-based, I was a little upset.

It is the GCN instruction set (that's what the name refers to; you cannot possibly know what the physical architecture of Navi is).
GCN is 7 years old.
CUDA is 11 years old.
x86 is 40 years old


"Zen 2 is still x86", chuckle... :D
 

Ascend

Member
It really comes down to personal preference for the most part. I like NVIDIA cards, the price does not matter to me, I like a lot of the extra features they add to PC gaming (exclusive features) and I have yet to have any major problems with them. I also work in IT so I know what I am doing with PCs in general.

I prefer premium products and I want my PC gaming experience to be the best it can be. IMO that means NVIDIA and Intel.
To each his own... I disagree, and the reason I disagree is that no single company or brand is infallible. Toyota, which is often considered the most reliable car brand, has been required to recall cars multiple times. One should not look (solely) at the brand, but at what the specific products have to offer. You might not care about price, but sooner or later you will, unless something changes, and changes soon.

Not addressing you directly, but since you mentioned you work in IT: I have found that people who work in IT can be just as biased as (if not more than) a hobbyist. I've met multiple who didn't even know what an R9 Fury was back when the nVidia 980 Ti was current. Unless PCs are also their hobby, they are generally not going to know the ins and outs of the market at any given time. I bet many can't tell where the Vega 56 and the Radeon VII fall in the performance bracket, if they even know what these cards are. One cannot make informed decisions if one only knows half the market.

Often the mentality is that since the companies IT people work for go for Intel and nVidia, and so do the likes of Dell and HP, those must be better, all the while forgetting that other big companies, like Microsoft and Sony, have chosen AMD over both Intel and nVidia for years. Maybe not everyone knows that, though...

More importantly, we now know Google Stadia will be using AMD graphics, and it was just announced that the world's fastest exascale supercomputer will be built in collaboration with AMD... So yeah. If AMD was not 'premium', these things wouldn't happen.

Although no one can deny that AMD has not really been at their best in the graphics department since at least the 290X, the general stigma around their brand makes their cards seem worse than they really are, which is exactly why some people see extremely good value in them.

Now let's try to bring things back to Navi...
 

SonGoku

Member
After watching the video with the "bad news", it doesn't really change anything for PS5 target TDP.


I think 12.5TF, or somewhere in close vicinity, is quite likely for PS5. The RX 3080's on-paper performance is in between the GTX 1080 and 1080 Ti (RTX 2070 / Vega 64 +15%) for 150W. Well within the power consumption limits of a console GPU on an APU.
In a closed box, with games designed to take advantage of the arch's strengths, it would be closer to a 1080 Ti, and if specialized RT hardware is present, comparable to an RTX 2070-2080.
 

Schnozberry

Member
The leaks have been so all over the place on Zen 2 and Navi that I'm firmly in the camp of waiting for the silicon to show up in a state where people can review it before I'll believe anything.

I wish the best for AMD. It's better for the industry when we have active and healthy competition in the CPU and GPU markets. Very good performance at an affordable price is the spot AMD should be trying to land on. It'd be great if the Zen 2 and Navi mid-range chips allowed for 1080p60 and pushed the cost barrier of breaking into a PC closer to the $500 mark. Nvidia and Intel have been fleecing people for years while consistently lowering expectations with each subsequent product generation.
 

CrustyBritches

Gold Member
After watching the video with the "bad news", it doesn't really change anything for PS5 target TDP.

I think 12.5TF, or somewhere in close vicinity, is quite likely for PS5. The RX 3080's on-paper performance is in between the GTX 1080 and 1080 Ti (RTX 2070 / Vega 64 +15%) for 150W.
You gotta watch all the way through to the end, where the meat of the vid is. I don't know how YouTube works, in that a creator would want to pad vids for ad views or something, but the whole vid could be summed up in 30 secs. This is the updated chart he lists at the end:
[chart: updated planned Navi lineup]

That's a 25W boost over the original numbers. However, he says his source tells him Navi is even more power hungry and even these numbers are outdated. He says nothing in the update looked as bad as some rumors were indicating, but then his source gave him newer info saying:
[image]

"efficiency is worse than I thought"..."60CUs of Navi can't match 60CUs of Vega 20".

The updated chart puts the 2070 equivalent at 190W and Vega 64 at 160W. He says his source said to expect even more power consumption. My own guess-timate, based on the scaling we saw going from 28nm to 14nm, was around 167W for just-under-Vega 64 performance on 7nm.
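
(To put rough numbers on the "efficiency is worse than I thought" line, here is a quick perf-per-watt sketch. The 160W and 190W figures are the leaked tiers described above; the 295W Vega 64 and 175W RTX 2070 board powers are the public reference specs, not part of the leak, so the output is illustration only.)

```python
# Quick perf-per-watt comparison using only the numbers in this thread
# plus public reference board powers (295 W Vega 64, 175 W RTX 2070).
# Leaked Navi tiers per the chart above: Vega 64 class at ~160 W,
# RTX 2070 class at ~190 W. Treat this as rough illustration only.

EXISTING_W = {"Vega 64": 295, "RTX 2070": 175}
LEAKED_NAVI_W = {"Vega 64": 160, "RTX 2070": 190}

for tier, existing_w in EXISTING_W.items():
    navi_w = LEAKED_NAVI_W[tier]
    ratio = existing_w / navi_w   # >1.0 means the same performance on less power
    print(f"{tier}-class: {existing_w} W today vs ~{navi_w} W leaked "
          f"-> {ratio:.2f}x perf/W relative to the existing card")
# -> ~1.84x vs Vega 64, but only ~0.92x vs the RTX 2070
```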
 

Elios83

Member
These rumors are all over the place. We're at a stage where there is simply no factual info.
I see that in these discussions, each time, different rumors singing different tunes are used as ammo by people with different 'preferences'.
When AMD announces the Navi architecture some things will be clearer. But even then, considering that both Sony and Microsoft have asked AMD to make specific customizations, it will only be part of the story. Next-gen consoles are late-2020 products; the feature set will probably include things from the generation beyond 2019 Navi.
About PS5, it will definitely be interesting to look at their ray tracing implementation. I also wonder how much Microsoft is willing to disclose next E3. Logic would advise against disclosing too much so long before launch, but it also depends on their strategy (e.g. the importance given to hardware as the platform vs. services, their own target launch price vs. the expected launch price from the competition, which directly translates into an expected hardware power difference).
 

SonGoku

Member
CrustyBritches
I did, that's why I said for the PS5 TDP target it doesn't change all that much (assuming the bad news is real).
The PS5 target spec moved from 48 CUs at 150W to 56 CUs at 190W. That isn't much of an increase (40W), granted it would require a more expensive cooling solution, similar to the X.

We should also take into consideration that these are launch TDP numbers. It's not uncommon for TDP and thermals to improve with time (better yields, manufacturing refinements, etc.) while on the same process node. So by the time PS5 enters mass production, things should look better. Overall, taking the bad news into account, I think the 12.5TF target or thereabouts is doable on a console launching late 2020.


Having said all that, if the info about Navi being broken and engineers wanting to ditch it ASAP is true, it makes me question Sony's decision to co-develop the arch with AMD for PS5.
So either the PS5 chip was specifically designed to hit a performance-per-watt sweet spot, or there's something wrong with the Navi horror stories.
 

CrustyBritches

Gold Member
CrustyBritches
I did, that's why I said for the PS5 TDP target it doesn't change all that much (assuming the bad news is real).
The PS5 target spec moved from 48 CUs at 150W to 56 CUs at 190W. That isn't much of an increase (40W), granted it would require a more expensive cooling solution, similar to the X.

So either the PS5 chip was specifically designed to hit a performance-per-watt sweet spot, or there's something wrong with the Navi horror stories.
We're all assuming $499 (admittedly, I switched over about a week ago), but it could be $399. With a $399 budget, in the year of Polaris 10's debut, the PS4 Pro had an average total system consumption of ~155W at 4K and about 30% less performance than an RX 480. A year later the X1X had roughly RX 480 performance and average power consumption of ~172W at $499.

It's all guesswork, of course, but I can see ~170-175W total system power for a $499 system. Going by Polaris 10 and last-gen systems, that gives you an "AMD 150W TDP" GPU (that's really 166W in-game). If adoredtv's leak is true, that would allow for something in between the RX 3070 and the 3070 XT. I'm guessing they go for 48 CUs. They'll say 1.7-1.8GHz core and 10-11TF, but power cap and undervolt so the actual consumption and performance will be slightly lower than paper specs.
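
(A quick sanity check on the "48 CU, 1.7-1.8GHz, 10-11TF" pairing above, using the usual paper-FLOPS arithmetic; the CU count and clocks are just the guesses from the post, not confirmed specs.)

```python
# Sanity check: paper FP32 TFLOPS for a guessed 48 CU console GPU,
# using 64 shaders per CU and 2 FLOPs per clock as on current GCN parts.

def paper_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

for clock_ghz in (1.7, 1.8):
    print(f"48 CUs @ {clock_ghz} GHz -> {paper_tflops(48, clock_ghz):.1f} TF")
# 48 CUs @ 1.7 GHz -> 10.4 TF
# 48 CUs @ 1.8 GHz -> 11.1 TF
```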

Conspiracy theory time: PS5 is Gonzalo (aka Navi 10 Lite) and the Navi evaluation board is Navi 10/RX 3080 or 3070 XT, but basically the PS5 GPU, occupying the PC retail price tier of the RX 480/PS4 Pro GPU. The tip-of-the-spear Navi 10 midrange GPU. $280-300. :messenger_winking_tongue:


P.S.- Somebody on Beyond3d had a good post that explained better than I can. It was interesting and I'll try to find it again since we are bored and thirsty.
 

PSXMGSplayer23

Neo Member
I would be content with a GTX 1080 level of GPU in there. It's still a very strong-performing GPU, and still expensive. Just imagine what developers can do with a GPU like that.
 

SonGoku

Member
We're all assuming $499 (admittedly, I switched over about a week ago), but it could be $399. With a $399 budget, in the year of Polaris 10's debut, the PS4 Pro had an average total system consumption of ~155W at 4K and about 30% less performance than an RX 480. A year later the X1X had roughly RX 480 performance and average power consumption of ~172W at $499.

It's all guesswork, of course, but I can see ~170-175W total system power for a $499 system. Going by Polaris 10 and last-gen systems, that gives you an "AMD 150W TDP" GPU (that's really 166W in-game). If adoredtv's leak is true, that would allow for something in between the RX 3070 and the 3070 XT. I'm guessing they go for 48 CUs. They'll say 1.7-1.8GHz core and 10-11TF, but power cap and undervolt so the actual consumption and performance will be slightly lower than paper specs.

Conspiracy theory time: PS5 is Gonzalo (aka Navi 10 Lite) and the Navi evaluation board is Navi 10/RX 3080 or 3070 XT, but basically the PS5 GPU, occupying the PC retail price tier of the RX 480/PS4 Pro GPU. The tip-of-the-spear Navi 10 midrange GPU. $280-300. :messenger_winking_tongue:


P.S.- Somebody on Beyond3d had a good post that explained better than I can. It was interesting and I'll try to find it again since we are bored and thirsty.
The 3080 from the source, which runs at 190W and is matched with a 2070, is likely hitting low-to-mid 13TFs. A hypothetical 56CU PS5 chip (60 CUs with 4 disabled) would have fine-tuned clocks to hit a performance-per-watt sweet spot; coupled with improved yields and manufacturing, that can make 12.5TF (give or take) possible at 150W. That would mean close to 200W total power consumption for the APU. Cooling for that TDP would be doable on a $500 budget. This is all assuming the worst-case scenario for Navi.
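
(A small sketch of what that hypothetical 56CU part would need clock-wise to reach ~12.5TF, and what the post's assumed 150W GPU budget implies in watts per TF; none of these figures are confirmed specs.)

```python
# What clock would a 56 CU part (60 CUs with 4 disabled) need for ~12.5 TF,
# and what does the post's 150 W GPU figure imply in watts per TF?

TARGET_TF = 12.5
CUS = 56
GPU_WATTS = 150   # figure assumed in the post above, not a confirmed spec

clock_ghz = TARGET_TF * 1e12 / (CUS * 64 * 2) / 1e9
print(f"{CUS} CUs need ~{clock_ghz:.2f} GHz for {TARGET_TF} TF")   # ~1.74 GHz
print(f"At {GPU_WATTS} W that is ~{GPU_WATTS / TARGET_TF:.0f} W per TF for the GPU alone")
```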

BTW I'm curious, what's stopping consoles from going with 300W monsters? I've seen console-sized micro towers with 500W+ power supplies; what's the unspoken barrier here?
I would be content with a GTX 1080 level of GPU in there. It's still a very strong-performing GPU, and still expensive. Just imagine what developers can do with a GPU like that.
A GTX 1080 would mean 12TF, so yeah, very good.
 

Jigsaah

Gold Member
Question: If I go from an Intel CPU to an AMD one...is it expected that I need to change my motherboard too?

Relevant comment: Yea I read this as...fuckin shit Navi sucks so consoles suck and PC players who were holding out now either have to go with Nvidia for the foreseeable future.
 

shark sandwich

tenuously links anime, pedophile and incels
The 3080 from the source, which runs at 190W and is matched with a 2070, is likely hitting low-to-mid 13TFs. A hypothetical 56CU PS5 chip (60 CUs with 4 disabled) would have fine-tuned clocks to hit a performance-per-watt sweet spot; coupled with improved yields and manufacturing, that can make 12.5TF (give or take) possible at 150W. That would mean close to 200W total power consumption for the APU. Cooling for that TDP would be doable on a $500 budget. This is all assuming the worst-case scenario for Navi.

BTW I'm curious, what's stopping consoles from going with 300W? I've seen console-sized micro towers with 500W+ power supplies; what's the unspoken barrier here?

A GTX 1080 would mean 12TF, so yeah, very good.
If you’re talking about an APU that consumes 300W, that would take one hell of a cooling solution to manage the heat.

Just look at Radeon VII. It takes a freaking humongous cooler + 3 fans to remove 300W from that package and it sounds like a 747 taking off when under load.

If they went with discrete GPU it might be a different story, but a 300W APU would be nuts.
 

SonGoku

Member
If you’re talking about an APU that consumes 300W, that would take one hell of a cooling solution to manage the heat.

Just look at Radeon VII. It takes a freaking humongous cooler + 3 fans to remove 300W from that package and it sounds like a 747 taking off when under load.

If they went with discrete GPU it might be a different story, but a 300W APU would be nuts.
Yep, I meant a 300W APU.
But then again, the Radeon VII is a tiny card compared to a full console. Considering the size of a console, it's not really that big of a cooler.
Would such a cooler not be possible at $500 or something?
 

shark sandwich

tenuously links anime, pedophile and incels
Yep, I meant a 300W APU.
But then again, the Radeon VII is a tiny card compared to a full console. Considering the size of a console, it's not really that big of a cooler.
Would such a cooler not be possible at $500 or something?
I think it might be technically possible to do a 300W console the size of a PS3. But I can’t really imagine Sony going with the bigass premium console design again.

Even their premium PS4 Pro model was only 170-something watts. And the noise level of its cooling system was at the upper limit of what I’d consider acceptable for a console.

So who knows. Might happen but IMO it’s unlikely.
 

SonGoku

Member
Even their premium PS4 Pro model was only 170-something watts. And the noise level of its cooling system was at the upper limit of what I’d consider acceptable for a console.

So who knows. Might happen but IMO it’s unlikely.
I think it was even less than that, but the Pro was $400 and selling for a profit.
A 200W APU (or thereabouts) wouldn't be too crazy for $500, right?

Right now I'm thinking:
$400: 10-11TF
$500: 12-12.9 TF
Question: If I go from an Intel CPU to an AMD one...is it expected that I need to change my motherboard too?
100% Yes
 

Ascend

Member
Question: If I go from an Intel CPU to an AMD one...is it expected that I need to change my motherboard too?
Definitely yes. They use different sockets. In other words, an Intel CPU will not fit in an AMD motherboard, and the opposite is also true. It's even recommended to do a fresh install of Windows to avoid issues.
 

Leonidas

Member
Relevant comment: Yea I read this as...fuckin shit Navi sucks so consoles suck

Next-gen consoles will still be a massive jump over current gen in many ways, though graphically it might end up being less than 2x better than the X...
I wouldn't say Navi sucks either; it is disappointing that it's coming a year after Turing and may only match mid-range Turing GPUs.
The Navi 20 situation is even worse, since that's expected in 2020 and it might not even match Nvidia's 2018 high-end GPU.

So it appears they're still 1-2 years behind Nvidia. And Nvidia hasn't even made it to 7nm yet...

PC players who were holding out now either have to go with Nvidia for the foreseeable future.

Nvidia has had the most powerful, most efficient high end GPUs for years so it's kind of just more of the same.
 

SonGoku

Member
Next-gen consoles will still be a massive jump over current gen in many ways, though graphically it might end up being less than 2x better than the X...
I think that even assuming the latest info on Navi is real, 12TF+ is doable at $500.
Nvidia has had the most powerful, most efficient high end GPUs for years so it's kind of just more of the same.
Pricing is the worst it's ever been though; the hope was/is that AMD can bring much-needed competition.
 

CrustyBritches

Gold Member
The 3080 from the source, which runs at 190W and is matched with a 2070, is likely hitting low-to-mid 13TFs. A hypothetical 56CU PS5 chip (60 CUs with 4 disabled) would have fine-tuned clocks to hit a performance-per-watt sweet spot; coupled with improved yields and manufacturing, that can make 12.5TF (give or take) possible at 150W. That would mean close to 200W total power consumption for the APU. Cooling for that TDP would be doable on a $500 budget. This is all assuming the worst-case scenario for Navi.
The PS4 Pro released the same year as Polaris 10 and had ~155W average total system consumption for $399. MS released the Xbox One X a year later, and it got them another ~20W on the TDP budget. In the end they still have a card that's basically just AMD's 2016 $250/150W mid-range part with higher memory bandwidth and more cache.

I just wanted to tell you that if you're going by the adoredtv leaks, then the RX 3080 isn't a 150W card any longer; it's 175W and supposedly worse. Something from the 160W tier with power capping and voltage tuning, or from the 130W tier, would be more appropriate. Adoredtv could be bullshit, but of course it allows us to talk next-gen console TDP and PC pricing tiers.

P.S. - My max prediction (48 CUs at 1.8GHz peak core) isn't very far off your 12-ish TF prediction. Anywhere near GTX 1080 power in a console will be awesome.
 

Jigsaah

Gold Member
Next-gen consoles will still be a massive jump over current gen in many ways, though graphically it might end up being less than 2x better than the X...
I wouldn't say Navi sucks either; it is disappointing that it's coming a year after Turing and may only match mid-range Turing GPUs.
The Navi 20 situation is even worse, since that's expected in 2020 and it might not even match Nvidia's 2018 high-end GPU.

So it appears they're still 1-2 years behind Nvidia. And Nvidia hasn't even made it to 7nm yet...



Nvidia has had the most powerful, most efficient high end GPUs for years so it's kind of just more of the same.

Don't alter my words. It's not a quote if you change the words.
 

Jigsaah

Gold Member
Definitely yes. They use different sockets. In other words, an Intel CPU will not fit in an AMD motherboard, and the opposite is also true. It's even recommended to do a fresh install of Windows to avoid issues.
Thanks. I was considering it after reading about Ryzen 3xxx
 