
The latest speculation from "known sources" points to Navi (the GPU in the next-gen consoles) having issues.

shark sandwich

tenuously links anime, pedophile and incels
No he's wrong.

It's 2019; over the last three years we've had a huge graphical jump in chipsets that have also dropped in price. The next-generation consoles are going to have recent CPUs and recent GPUs that are closer to the top.
The GTX 1080 Ti released in early 2017, and it's looking more and more like we'll get power slightly below that in next-gen consoles.

So yeah... like an enthusiast gaming PC from 3.5 years prior. The only wild card is ray tracing.
 

stetiger

Member
The point I'm making is that if you want to sell a console for $400-500, an APU is the way to go, and right now AMD is the only company that can offer an APU with an adequate iGPU.
Well, no. The Switch in mobile mode has greater constraints than a box like the PS4/Xbone, and that is working out fine at $259 or whatever. I think the APU benefits are not as pronounced as you would like to believe. Also, you can make APUs with different vendors; Intel and AMD have a shared APU, go figure.
 

Evilms

Banned
[image: chart of the leaked Navi GPU lineup]
 
The GTX 1080 Ti released in early 2017, and it's looking more and more like we'll get power slightly below that in next-gen consoles.

So yeah... like an enthusiast gaming PC from 3.5 years prior. The only wild card is ray tracing.

More powerful graphics hardware is now at the price the Xbox One X's GPU was at when they teased it in 2016 (a year before it launched); things are not what you think.
 

CyberPanda

Banned
The GTX 1080 Ti released in early 2017, and it's looking more and more like we'll get power slightly below that in next-gen consoles.

So yeah... like an enthusiast gaming PC from 3.5 years prior. The only wild card is ray tracing.
Also, Nvidia and AMD flops are different. It also depends on what products you look at, and the difference can be quite huge. The RTX 2060, for example, is at 6.5 teraflops and can keep up with 12-teraflop AMD Vega cards in DX11 games.
 

shark sandwich

tenuously links anime, pedophile and incels
More powerful graphics hardware is now at the price the Xbox One X's GPU was at when they teased it in 2016 (a year before it launched); things are not what you think.
Power consumption of AMD’s high-end GPUs is NOWHERE NEAR low enough for a console.

Their latest and greatest 7nm Radeon VII consumes freaking 300+ watts and delivers 1080 Ti-level gaming performance.

Navi is going to have to perform a miracle in order to get even 1080 Ti-like performance while staying within the power budget of a console.

Also, Nvidia and AMD flops are different. It also depends on what products you look at, and the difference can be quite huge. The RTX 2060, for example, is at 6.5 teraflops and can keep up with 12-teraflop AMD Vega cards in DX11 games.
Yeah, I'm not trying to directly compare flops at all. The 1080 Ti trades blows with the Radeon VII in gaming benchmarks, so my assumption is that the PS5's GPU would need to perform equivalently to a Radeon VII if we're expecting something on par with a 1080 Ti.
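To put rough numbers on that worry, here's the back-of-the-envelope arithmetic. Both figures are assumptions: ~300 W for Radeon VII board power (from the post above), and something like 130 W of a console's total budget left for the GPU after CPU, memory, and the rest of the system:

```python
# Back-of-the-envelope version of the power-budget worry. Both numbers are
# assumptions: ~300 W Radeon VII board power, ~130 W of a console budget
# available to the GPU once CPU, memory, and the rest are accounted for.
radeon_vii_watts = 300.0
console_gpu_watts = 130.0

required_perf_per_watt_gain = radeon_vii_watts / console_gpu_watts
print(f"Navi would need ~{required_perf_per_watt_gain:.1f}x Radeon VII's "
      f"performance per watt for 1080 Ti-class speed in a console")  # ~2.3x
```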
 
Last edited:
Uh yeah it is terrible news even if you hate AMD and exclusively buy Nvidia products.

Nvidia is behaving exactly as you’d expect a company with no competition to behave. The entire reason 2080 Ti is $1200+ and 2080 is $700+ is because they have no serious competition at the high end.

It’s even more depressing when you realize that performance per $ has essentially remained stagnant for the last few years.

This is not the normal state of things where we used to get roughly a doubling of performance/dollar every 2-3 years.

I'm pretty sure Nvidia's performance per watt will improve when they move to 7nm cards.

Also, Turing is around 10-15% more power efficient than Pascal, which is okay given the minimal process-node improvement from 16nm to pseudo-12nm and all the extra stuff they packed in.

No he's wrong.

It's 2019; over the last three years we've had a huge graphical jump in chipsets that have also dropped in price. The next-generation consoles are going to have recent CPUs and recent GPUs that are closer to the top.

He isn't wrong. Since Nvidia's teraflops translate into game performance differently, even the rumoured 13TF PS5 would fall somewhere between a 1080 and a 1080 Ti.
 

Ellery

Member
I bet it is the RX 3090.
Great wattage, and it's said to support ray tracing, which Mark Cerny already confirmed.

The chart there doesn't even make sense: a card of the same architecture with more compute units consumes less power than the card below it. It must be a typo.

Also, the RX 3090 (if the leak is somewhat true) is way too overpowered, hot, and power-hungry for the PS5. I would think the RX 3070 is much more likely.

The RX 3090 alone consumes more power than the entire PS4, everything else included.
 

bitbydeath

Gold Member
The chart there doesn't even make sense: a card of the same architecture with more compute units consumes less power than the card below it. It must be a typo.

Also, the RX 3090 (if the leak is somewhat true) is way too overpowered, hot, and power-hungry for the PS5. I would think the RX 3070 is much more likely.

The RX 3090 alone consumes more power than the entire PS4, everything else included.

What do you mean?
The PS4 Pro is 310 watts.
 

Redneckerz

Those long posts don't cover that red neck boy
I love everyone's excitement for ray tracing, as if Nvidia cards can do it well. The lowering of FPS is not worth the effect.
I take it you have yet to see Metro Exodus with RT GI enabled. It is, literally, a generational difference and IMO the only title so far that shows off the potential the right way. Not even 3DMark Port Royal, which is a reflection demo, shows it better.

Wait? Next gen is only going to meet the power of a GPU I bought years ago?

Fuck next-gen consoles if true. I'll teach my wife to use a mouse and keyboard.

I don't understand why APUs have to be used. Just use dedicated parts and not worry about power draw. I have two gaming PCs with large power supplies and it adds very little per month to the power bill.
Dedicated to optimization.

As for your second comment: size, integration, reliability. Also, your bill is not everyone's.
 
I take it you have yet to see Metro Exodus with RT GI enabled. It is, literally, a generational difference and IMO the only title so far that shows off the potential the right way. Not even 3DMark Port Royal, which is a reflection demo, shows it better.


Dedicated to optimization.

As for your second comment: size, integration, reliability. Also, your bill is not everyone's.
The way it looks will be great, but I don't want to play games with frame-rate issues.
 
Both of those solutions are power-anemic, and seem more expensive in cost per performance by comparison.
When you give an ARM core a desktop-class TDP, it's pretty damn powerful; they use those for servers.
Nvidia of course has the better GPUs.
If they wanted to use Nvidia for home consoles, they could, with a power-scaled ARM core and an Nvidia GPU. But they seem to prefer x86 because development is much easier, so AMD is the only option there.
 

pawel86ck

Banned
If the PS5 uses a 12.9TF GPU, then I think it should match Radeon VII performance and probably surpass it. Yes, the Radeon VII has more TFLOPS (13.8TF), BUT:
- the PS5 GPU will use the new Navi architecture
- the architecture will also be custom-made for the PS5, just to squeeze out even more performance
- developers will use the PS5 hardware to the fullest; games are coded for a specific hardware target, and consoles have a more efficient shader compiler

With HW RT on top of that, the PS5 GPU should be a beast, and I think the jump from PS4 to PS5 will be bigger than many people expect 🙂.
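For what it's worth, a 12.9 TF figure falls straight out of the standard peak-FLOPS formula for a GCN/Navi-style GPU. The 56 CUs at 1.8 GHz below are a purely hypothetical configuration chosen to reproduce the rumour, not a confirmed spec:

```python
# Peak FP32 throughput for a GCN/Navi-style GPU:
# CUs x 64 shaders x 2 FLOPs per clock (one fused multiply-add) x clock.
def peak_tflops(compute_units: int, clock_ghz: float,
                shaders_per_cu: int = 64) -> float:
    return compute_units * shaders_per_cu * 2 * clock_ghz / 1000.0

# Hypothetical config reproducing the rumoured figure (not a confirmed spec):
print(f"PS5 rumour: {peak_tflops(56, 1.8):.1f} TF")   # 56 CUs @ 1.8 GHz -> 12.9 TF
# Radeon VII for comparison: 60 CUs at its ~1.8 GHz peak clock:
print(f"Radeon VII: {peak_tflops(60, 1.8):.1f} TF")   # -> 13.8 TF
```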
 
Last edited:

PocoJoe

Banned
Well, no. The Switch in mobile mode has greater constraints than a box like the PS4/Xbone, and that is working out fine at $259 or whatever. I think the APU benefits are not as pronounced as you would like to believe. Also, you can make APUs with different vendors; Intel and AMD have a shared APU, go figure.

No, they don't have an APU, if you're talking about that i7-8705G with an Intel CPU + AMD GPU. That's just two separate chips on the same package, so it is not technically an APU; an APU is a CPU and GPU combined on the same silicon die.

Also, many talk about Nvidia+Intel in a console. Not going to happen, because they ask much more for their chips than AMD does; even Nintendo just used a stock Tegra X1, and a customized one might have been even more expensive.

Also, many compare PC Nvidia vs. AMD flops, which isn't directly comparable to consoles: consoles are closed systems with much better optimization, while on PC there are drivers, deals with devs, games sponsored to run better, the OS, and other things that affect TFLOPS vs. actual performance.

AMD is the best option for home consoles if you're looking at price/performance.

Someone could make a console with a 500€ CPU and a 1000€ GPU with water cooling. Would it be fast? Yes. Would it cost too much? Yes.

Those who want more power than consoles should just stay in the PC world; 999€ consoles with high-end parts won't happen anymore.
 

shark sandwich

tenuously links anime, pedophile and incels
So what we can realistically expect is Vega 56 perf.
Yeah, pretty much, unless Navi is a total slam dunk (and everything so far points to exactly the opposite):
- leaked Navi board has 2x 8-pin power connectors (aka 250+ watts)
- the AdoredTV leak says they couldn’t hit their power target
- still based on GCN instead of a brand new architecture
- AMD already has a 7nm GPU and the power consumption is atrocious


All signs are pointing to an underwhelming system. I would love to be wrong about this but I’m trying to stay realistic.
 

CrustyBritches

Gold Member
That chart is listing peak draw, which might occur once or twice during a test. Average consumption for the PS4 Pro is more like ~155W. DF says as much, and the EU's in-depth voluntary energy-compliance tests indicate the same:
[image: Digital Foundry PSU peak-draw measurements]

[image: 2017 EU voluntary energy-compliance test figures]


I've become more curious about the consistency of listed "peak clock speed" versus the effect of power capping on these systems. If the PS4 Pro were pulling over 170W consistently, it would be getting much better performance than it does. I've been messing with RX 480 undervolting, overclocking, and power limiting; you can test FH4 and get 4K/30fps at X1X settings with ~160W peak GPU power draw. The actual readout from the benchmark doesn't show a consistent peak core clock at all, but does for the memory clock, and its average is probably more like 140-150W. If anything, the card is starved for memory bandwidth and suffers from lower minimums than the X1X (during puddle splashes).

You can have "whatever TFLOPS" all you want, but the memory subsystem is where they can really raise the baseline without pushing out of the sweet spot for core clocks and power consumption.
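A rough sketch of why undervolting buys so much: dynamic switching power scales roughly as C·V²·f, so at a fixed clock, draw falls with the square of the core voltage. The wattage and voltages below are illustrative stand-ins, not measurements from the RX 480 experiment above:

```python
# Dynamic switching power scales roughly as P ~ C * V^2 * f: at a fixed
# clock, board power falls with the square of the core voltage.
def scaled_power(base_watts: float, base_v: float, new_v: float,
                 base_mhz: float, new_mhz: float) -> float:
    """Estimate power after a voltage/clock change via the V^2 * f rule."""
    return base_watts * (new_v / base_v) ** 2 * (new_mhz / base_mhz)

# Illustrative numbers only: a 180 W card at 1.15 V, undervolted to 1.00 V
# while holding its 1266 MHz clock.
print(f"{scaled_power(180, 1.15, 1.00, 1266, 1266):.0f} W")  # ~136 W, ~24% lower
```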
 
Last edited:

somerset

Member
Warning- watching "adoredTV" videos is known to kill braincells.

The history of this guy is hilarious. It is important to know he has *zero* tech understanding of software or hardware. He began life producing moderately reasonable fanboy promotions of AMD to counter the vast (at the time) fanboy productions shilling for Intel and Nvidia. Then he got really popular.

Another story. Once upon a time, id's Quake was about to release, and every tech site had id PR pointing out this was the first game to rely on the modern FPU power of the Pentium (1)- a beautiful refinement of Intel's 486, and Intel's last CISC design. In short, Michael Abrash (the *real* tech wizard behind Quake, not Carmack) had cracked the general-purpose perspective-correct pixel rendering problem with an approximation that used the floating-point *division* power of the Pentium.

Yeah, I know- technical mumbo-jumbo well above the 'pay grade' of most of you gamers here- but stay with me. Then a multi-page 'technical' document appeared on a Usenet group (back when Usenet forums were used for discussion) that 'proved', with code examples and deep CPU analysis, that this was a *lie*. Because fools cannot analyse a paper for the accuracy of its proclaimed 'technical' detail, this paper was proclaimed a "wonder", and its nonsensical lies spread far and wide. To be honest, it was written by a person *exactly* like adoredTV.

I debunked it immediately- but very few others did, teaching me the level of basic tech knowledge held by those that follow tech. Then the game came out, Carmack talked about the methods used (the FP division), and the nonsense paper *every* tech outlet at the time had quoted as "brilliant analysis" was forgotten. Abrash- a *non-owner* at id who was doing all the real work- saw he was wasting his time and took his big brain to MS, where at least he would be well paid. And id began its long slow decline.

Back to Navi. In its more advanced console form (Navi+ with elements from AMD's post-Navi architecture), it is already a smashing success, as gamers will discover when the new consoles appear. So what gives?

A lot of people work at AMD. Many of them are low-talent grunts. A lot of them hear a lot of things they barely understand. And with a few beers, they leak. And people who make a living selling such leaks as *their own* tech awareness spin these leaks.

Add to this early engineering work in the labs, where every temporary setback has us raving and ranting (yes, I'm a low-level coder *and* circuit designer). It's a way of handling tension- but to a know-nothing overhearing it and hoping to pass on gossip, well, this can easily be made to seem 'serious'.

But what actually matters is the progress from prototypes to release version. And remember, the *halo* product tends to be over-clocked, over-volted, and not a happy chappie. Nvidia had a ton of Turing failures at first release down to issues like this. But now Turing is solid.

What really matters is *yield*, die size, and the performance of the slightly cut-down version of the halo product- the Vega 56, *not* the Vega 64. It is easy to rant about the crapness of the Vega 64, but the Vega 56, by not being overstressed, is a killer card.

Peeps who wanna whale on AMD will focus on the issues with the overstressed Navi "special edition". Initially, AMD will hope that with enough power and cooling this part will beat some Nvidia target- and when it falls short, there'll be a lot of unhappy memos for adoredTV to quote.

Yet in the *real* world, people want the 120-dollar 570-performance Navi and the 280-dollar >2070 Navi. For the clicks, AdoredTV will call these Navi parts "failures"- all because the most expensive Navi, the one that doesn't rival the 2080 Ti, runs hungry and hot- which every informed person already knew.

But it gets worse- TSMC is not GF. Even if version one of Navi has to go with 'lower' clocks, AMD can expect to fix this rapidly. And Navi shouldn't even be long for this world, since AMD has years of in-development changes to the macro-architecture of its GPUs. So all Navi has to do is give us cheaper and faster cards than the current Polaris family: a high-end card that kills it at 1440p, and a very cheap card that kills it at 1080p (which the 570 does today).

I'll be sad if the best current Navi, *without* extreme cooling, doesn't come close to the 1080 Ti. If it lands just above a 1080/2070, that'll be sad, but at the right price it will still be fantastic value against Nvidia.

But be warned- AMD wants the *best* AMD gaming experience to be on the PS5 and Xbox Next until sometime after these consoles launch. AMD's true high end comes after the consoles, as part of a very deliberate strategy. As a PC gamer this hurts- but since AMD entered the console biz, this has always been its chosen strategy: console partner *before* the PC gamer.

AMD is all about decent value for the majority; Nvidia, about performance, and damn the price.
 
But be warned- AMD wants the *best* AMD gaming experience to be on the PS5 and Xbox Next until sometime after these consoles launch. AMD's true high end comes after the consoles, as part of a very deliberate strategy. As a PC gamer this hurts- but since AMD entered the console biz, this has always been its chosen strategy: console partner *before* the PC gamer.

It sounds like you're saying the PS5 and NextBox will contain a more powerful AMD GPU than can be found in the AMD desktop space at the time of launch. Not going to happen. But I guess it all depends on exactly what you mean by "best."

They will likely have a handful of features from a future PC GPU, but that's about it. Will that handful of future features be enough to claim the best AMD gaming experience? Debatable, but likely not.

The consoles will contain some version of Navi with a cut-down number of CUs to increase yields and reduce cost. It will be clocked lower for heat and power constraints and for further gains in yields. Plus the all-important "secret sauce", of course, but that's really all there is to it. In what way will that be the "best AMD gaming experience"?
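A toy defect model shows why shipping a cut-down part raises yield so dramatically. The defect probability and CU counts below are made up purely for illustration:

```python
from math import comb

# Toy model: each CU independently comes out defective with probability p.
# A part that only needs 36 of 40 CUs working tolerates up to 4 bad CUs,
# so far more dice qualify. All numbers are purely illustrative.
def yield_with_spares(total_cus: int, needed_cus: int, p_defect: float) -> float:
    """P(at least `needed_cus` of `total_cus` CUs are defect-free)."""
    return sum(
        comb(total_cus, k) * p_defect**k * (1 - p_defect)**(total_cus - k)
        for k in range(total_cus - needed_cus + 1)
    )

print(f"full 40-CU die usable: {yield_with_spares(40, 40, 0.03):.0%}")  # ~30%
print(f"36-of-40 die usable:   {yield_with_spares(40, 36, 0.03):.0%}")  # ~99%
```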
 

TLZ

Banned
Warning- watching "adoredTV" videos is known to kill braincells.
...
AMD is all about decent value for the majority; Nvidia, about performance, and damn the price.
Thanks for this insightful and thoughtful post. At least it's not the usual rollercoaster we have here.
 
Warning- watching "adoredTV" videos is known to kill braincells.
...
AMD is all about decent value for the majority; Nvidia, about performance, and damn the price.
Great post, and what you said about AMD is in line with what Coreteks are claiming in their latest video. Or should I say HIS latest video.
 

Leonidas

Member
Warning- watching "adoredTV" videos is known to kill braincells.

Agreed, which is why I did not link to his video, which is probably at least 30 minutes of fanboy drivel (I only posted the info from his sources, who have been correct in the past). And now this thread has been fouled by his name (12x already on this page, stemming from only one post...)
 

stetiger

Member
No, they don't have an APU, if you're talking about that i7-8705G with an Intel CPU + AMD GPU. That's just two separate chips on the same package, so it is not technically an APU; an APU is a CPU and GPU combined on the same silicon die.
...
Those who want more power than consoles should just stay in the PC world; 999€ consoles with high-end parts won't happen anymore.
Consoles have never had high-end parts. They have had cutting-edge tech, but never high-end parts for the most part, at least not the successful ones. In addition, all console generations up to this one have had discrete chips, and none have cost 999€ as far as I know. If APU yields were so amazing, I'm just saying, I think other industries would have caught on at this point. Nvidia has much more efficient GPUs than AMD, while Intel has more efficient CPUs than AMD (and so does ARM, btw), which is why Nintendo chose Nvidia and ARM to power the Switch. They must certainly have assessed that even the mighty APU was not power-efficient enough for them, and to me that is telling for a $249 box that is not being sold at a loss.

If you like cheap consoles, buy a Switch. For the rest of us who love the medium and want to see it advance, $500 is great if the performance is there to match it. I am okay with a $500 PS5 with an 8-core Zen 2 chip, a 12 TFLOP GPU with custom ray tracing rivaling an RTX 2070, and a super-fast SSD. That is a better console to last 7 years than a $400 box that can't do any of the above well and keeps us locked in with little innovation in AI, animation, and physics for another 7 years of gaming. I think this medium needs those breakthroughs to survive.
 

xool

Member
There are a couple of people doing this- but stop with the "APU is/isn't a great thing / hasn't caught on": APU is just AMD's marketing name for a CPU+GPU SoC (system-on-a-chip).

Everyone has CPU+GPU SoCs- they're the predominant main silicon in consumer devices. All phones and tablets have these (yes, they use ARM). Even Intel has them- their low-end tablet/netbook chips (e.g. the Z3735F or Z8350) have the same combination of CPU+GPU+IO (also smart TVs, etc.).

The Switch's Tegra X1 is a CPU+GPU SoC too- the X1 has a similar feature set to AMD's APUs, including assisted compute (via CUDA).

These things aren't small either- e.g. the Apple A12 is 7nm (2.49 GHz) with 6.9 billion transistors.

Apart from the Intel desktop and power-user niche, every consumer device runs on an "APU" these days.
 
Last edited:

Zannegan

Member
If you like cheap consoles, buy a Switch. For the rest of us who love the medium and want to see it advance, $500 is great if the performance is there to match it. I am okay with a $500 PS5 with an 8-core Zen 2 chip, a 12 TFLOP GPU with custom ray tracing rivaling an RTX 2070, and a super-fast SSD. That is a better console to last 7 years than a $400 box that can't do any of the above well and keeps us locked in with little innovation in AI, animation, and physics for another 7 years of gaming. I think this medium needs those breakthroughs to survive.

If anything, the medium is moving backwards in terms of AI and physics complexity in spite of the increases in power since Half-Life 2, Crysis, and F.E.A.R. Better hardware opens up new possibilities, but game makers have clearly decided that it's not worth the investment to go down those routes.

I'd love to be wrong, but next gen I expect larger, prettier open worlds, and MAYBE a little more object interaction in those sandboxes (though still lower fidelity than BotW's cartoon physics), no matter what the hardware costs.
 

CuNi

Member
It hurts me to see that people still think that Nvidia and AMD count flops differently.

Hint: they don't. You can directly compare them.

What people do forget is that FLOPS count only floating-point operations, and the FLOPS number doesn't really matter for consumers; it matters for scientific workloads, which usually run almost purely on floating-point math. Games don't run purely on floating-point operations- that's why a card with a lower FLOPS rating can beat a higher-rated one, and that's probably where the misconception that "they count flops differently" comes from.
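Concretely, both vendors publish the same theoretical number: shader count × 2 FLOPs per clock (one fused multiply-add) × boost clock. A quick sketch plugging in public specs reproduces the figures quoted earlier in the thread:

```python
# Both vendors' headline figure is the same arithmetic: shader count
# x 2 FLOPs per clock (one fused multiply-add) x boost clock.
def tflops(shaders: int, boost_ghz: float) -> float:
    return shaders * 2 * boost_ghz / 1000.0

print(f"RTX 2060: {tflops(1920, 1.680):.1f} TF")  # ~6.5 TF
print(f"Vega 64:  {tflops(4096, 1.546):.1f} TF")  # ~12.7 TF
# Identical arithmetic for both vendors; the gap in games comes from how
# much of that theoretical throughput each architecture actually sustains.
```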
 

Ascend

Member
- AMD already has a 7nm GPU and the power consumption is atrocious
People keep repeating that mantra, but it really is not THAT bad;


Warning- watching "adoredTV" videos is known to kill braincells.
...
I don't get why the guy gets so much hatred. His videos are often insightful and you can actually learn things from them. Sure, he loves to speculate, but he clearly indicates when he does, and more importantly, he freely admits when he was wrong. This latest Navi video is no exception. The fact that he has sources that leak reliable information to him says a lot about his content; if it really was so bad, no one would leak anything to him. And yes, they are reliable. That has been proven multiple times over the years.


Since people are apparently not willing to link him- which is borderline censorship over a disagreement- I'll do it myself, so people can, you know, judge for themselves:

The positive Navi view:



The Navi disaster, basically:



And one more thing... how the discussion regarding consoles got started is beyond me, really. Console chips will most likely be APUs, and they will operate within their most power-efficient range. Desktop parts generally do not, especially when it comes to AMD; those are pushed beyond their optimal power-efficiency range for more performance, because desktops can handle it. Consoles cannot.
The only way the two can be equated is if Navi is indeed a chiplet, in which case the consoles would get a Ryzen chiplet with a Navi chiplet, forming an APU. But even then, that chiplet is most likely going to be a lot more specialized than the desktop version. Chips for consoles are built to the customer's wishes; they are custom-designed most of the time because the platform holders don't want their hardware to be easy to emulate. They want the hardware to be proprietary, basically. Even if Navi is a failure on the desktop, it doesn't mean it will somehow be a disaster for the consoles. Remember that the PS4 Pro and the Xbox One X are still using adapted Jaguar cores, basically.
 

johntown

Banned
Uh yeah it is terrible news even if you hate AMD and exclusively buy Nvidia products.
...
This is not the normal state of things where we used to get roughly a doubling of performance/dollar every 2-3 years.
Eh, it still does not bother me, and yeah, I'm pretty much on the hate-AMD wagon.
 