
Next-Gen PS5 & XSX |OT| Console tEch threaD


xool

Member
TSMC's data is theoretical... AMD reached around a 40% average increase in real benchmarks with Vega 20 going from 16nm to 7nm.
There is obviously some architectural improvement in Navi, but it is way lower than the marketing tries to show... a 10% improvement in perf/watt from a new arch is good, it is just not Maxwell-level.
Yes. TSMC figures will be optimistic/best case - so that 55% increase drops to ~45%?

(the table you posted above) - I guess the comparison is Vega VII 16GB vs Vega 64 8GB .. but here's the thing - there's also a doubling of HBM2 memory in that comparison, which adds further power consumption to Vega VII, making the gains smaller by an (unknown) amount ..

I hope that +50% is purely architectural, otherwise the non-shrink gains seem to be approaching negligible levels ..
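A quick sketch of that arithmetic in Python, since it's easy to get wrong (the node-gain figures below are illustrative assumptions, not measured numbers): efficiency gains compound multiplicatively rather than add, so the architectural share is whatever remains after dividing out the shrink.

```python
# Back-of-the-envelope: how much perf/W gain is left for architecture
# once the process-node contribution is divided out.
# Gains compound multiplicatively: (1 + combined) = (1 + node) * (1 + arch)

def arch_gain(combined: float, node: float) -> float:
    """Architectural perf/W gain implied by a combined gain and a node gain."""
    return (1 + combined) / (1 + node) - 1

# Illustrative inputs: AMD's +50% combined claim vs. an assumed node gain.
print(f"{arch_gain(0.50, 0.40):.1%}")  # ~7.1% left if the shrink alone gave 40%
print(f"{arch_gain(0.50, 0.25):.1%}")  # ~20.0% left if the shrink only gave 25%
```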
 

CrustyBritches

Gold Member
There are pics of cards coming out of Computex... all of them show two SKUs:

225W TBP
180W TBP

Basically the Sapphire leaks.
Yep Navi 10 XT and Navi 10 Pro. Curious to see actual peak gaming consumption vs listed tdp. Probably be like Polaris 10 where it was ~15W over TDP. Basically a 160-165W Vega 56 and 190-200W Vega 64. At least that's what the 2x8-pin pci-e connector would have me believe.
 

shark sandwich

tenuously links anime, pedophile and incels
Yep Navi 10 XT and Navi 10 Pro. Curious to see actual peak gaming consumption vs listed tdp. Probably be like Polaris 10 where it was ~15W over TDP. Basically a 160-165W Vega 56 and 190-200W Vega 64. At least that's what the 2x8-pin pci-e connector would have me believe.
Once again indicating that ~Vega 56-level performance is about the best we can hope for in next gen consoles.
 

ethomaz

Banned
Yep Navi 10 XT and Navi 10 Pro. Curious to see actual peak gaming consumption vs listed tdp. Probably be like Polaris 10 where it was ~15W over TDP. Basically a 160-165W Vega 56 and 190-200W Vega 64. At least that's what the 2x8-pin pci-e connector would have me believe.
It is because AMD uses TBP... Typical Board Power.
That's different from Intel's and nVidia's TDP... Thermal Design Power.
 

llien

Member
People, stop pushing "dedicated" too far.
Dedicated HW for RT is too risky and situational at this point.
Especially given this demo, which ran on a Vega 56 with generic shaders:




Are they using Vega 20 or Vega 10 in the comparison?
14nm Vega.
 

CrustyBritches

Gold Member
Once again indicating that ~Vega 56-level performance is about the best we can hope for in next gen consoles.
Taking into consideration an 8c/16t CPU, 16-24GB of GDDR6, and a fast SSD... yeah, I'd guess somewhere in between Vega 56 and 64. Navi 10 Lite sounds about right.

The ASRock cards have nice cooling, and the leaked PCB we saw had a heavy-duty VRM setup. I doubt consoles are getting cooling like that.
It is because AMD uses TBP... Typical Board Power.
That's different from Intel's and nVidia's TDP... Thermal Design Power.
*edit* ok, I see you were comparing to Intel/Nvidia. AMD listed the RX 480 at 150W TDP and in reality it pulled more like 165W in average gaming. The Sapphire 580 Nitro has a 185W listed TDP with ~230W average gaming consumption on the boost BIOS. The 580 Nitro has a 6-phase VRM and 1x6-pin + 1x8-pin PCIe power connectors. Navi 10 has an 8+1-phase VRM going by the leaked PCB pics, plus 2x8-pin PCIe connectors. A similar gap between listed TDP and real-world average gaming consumption wouldn't surprise me.

As sexy as those cards look, I don't think I can afford a $499 XT. $399 for the Pro is even pushing it. Here's hoping the Sapphire price rumors are off, or for more premium models.
 
I think that's a bit of a reach. She mentions Cerny wanting to revolutionize gaming for the next decade, not that his input was revolutionary. In the end only 42 seconds of the keynote was devoted to their partnership with Sony; compare that to a spokesperson from MS on stage with Lisa speaking about their relationship/involvement, which is much more telling (totalling roughly 5 minutes).

And let's not forget this glorious moment.

kh7BLB.gif
You do not bite the hand that feeds. Microsoft will have something good, but so will Sony. We are in for a hell of a generation.
 

ethomaz

Banned
*edit* ok, I see you were comparing to Intel/Nvidia. AMD listed the RX 480 at 150W TDP and in reality it pulled more like 165W in average gaming. The Sapphire 580 Nitro has a 185W listed TDP with ~230W average gaming consumption on the boost BIOS. The 580 Nitro has a 6-phase VRM and 1x6-pin + 1x8-pin PCIe power connectors. Navi 10 has an 8+1-phase VRM going by the leaked PCB pics, plus 2x8-pin PCIe connectors. A similar gap between listed TDP and real-world average gaming consumption wouldn't surprise me.

As sexy as those cards look, I don't think I can afford a $499 XT. $399 for the Pro is even pushing it. Here's hoping the Sapphire price rumors are off, or for more premium models.
I'm not comparing anything, it's just that AMD doesn't use a TDP measurement.
They use TBP.

Typical Board Power (TBP) is the average power used by the board, and that is what AMD shows in its specs... so actual draw can be lower or higher depending on the workload.

Intel and nVidia use Thermal Design Power (TDP), which means the maximum thermal power the card was designed for.
* nVidia recently changed from showing only the GPU's TDP to the whole board's TDP.

These are two different types of measurement and they are not comparable at all.
I pointed that out because people say AMD cards pull more than what the TDP shows... that is wrong, because it is not TDP but TBP, and going over the TBP is normal and expected.
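As a rough illustration of the distinction (the wattage samples below are invented, and real vendor ratings are defined more formally than this):

```python
# TBP (AMD): a *typical/average* board draw over some reference workload.
# TDP (Intel/Nvidia): the *maximum* thermal load the cooler is designed for.
# Hypothetical per-second board-power samples from a gaming run:
samples_watts = [138, 152, 171, 149, 160, 144]

tbp_style = sum(samples_watts) / len(samples_watts)  # average -> "typical"
tdp_style = max(samples_watts)                       # design-for-worst-case

print(f"TBP-style rating: {tbp_style:.0f} W")  # ~152 W; peaks above it are expected
print(f"TDP-style rating: {tdp_style} W")      # 171 W; draw should stay at or below this
```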

I guess that tweet is relevant:



PS. I saw my mistake... I typed TDP in a hurry when AMD cards are rated in TBP... fixing.

Edit - There is nothing to fix here... I have been posting TBP since the beginning lol
 

Ellery

Member
Yes. TSMC figures will be optimistic/best case - so that 55% increase drops to ~45%?

(the table you posted above) - I guess the comparison is Vega VII 16GB vs Vega 64 8GB .. but here's the thing - there's also a doubling of HBM2 memory in that comparison, which adds further power consumption to Vega VII, making the gains smaller by an (unknown) amount ..

I hope that +50% is purely architectural, otherwise the non-shrink gains seem to be approaching negligible levels ..

Don't get your hopes up. I bet it is compared to 14nm Vega and not 7nm Vega.
 

ethomaz

Banned
Lisa Su round table...


They are basically dodging the ray-tracing-in-Navi questions by putting the roadmap up front...

But PS5...

Reporter: Given that Sony has already announced ray tracing as part of the PlayStation 5, can you tell us if that’s a Sony optimization, or part of RDNA?

Su: So we certainly have done very specific optimizations for Sony. They are a deep customer for us on semi-custom products. There are optimizations there. However, we view ray tracing as a very important element across the portfolio. So we’ll have ray tracing a number of other places... Look at that, you got me to say more about ray tracing!

David Wang, senior vice president of engineering at AMD: We started our RDNA development before the Sony engagement. RDNA is a revolutionary architecture; it’s also very flexible. So it can be optimized [inaudible].

Reporter: So it’s like an FPGA.

Su: I wouldn’t exactly say that.
 

CrustyBritches

Gold Member
I'm not comparing anything, it's just that AMD doesn't use a TDP measurement.
They use TBP.

Typical Board Power (TBP) is the average power used by the board, and that is what AMD shows in its specs... so actual draw can be lower or higher depending on the workload.
I see what you meant now, looking at AMD's official page. I've been using 150W TDP for the RX 480, but they list 150W TBP.

Going by that, actual gaming power consumption should be more like 230W+.
 

LordOfChaos

Member
Don't get your hopes up. I bet it is compared to 14nm Vega and not 7nm Vega.

Lisa said so (it was both architecture and the node shrink), but people want to keep their hopes up. If it were architecture alone I'd be thrilled, but she flat out said it's a combination of both; AnandTech was even wondering the same thing and went back to clarify. 50% gains from architecture alone are a rare breed indeed.
 

ethomaz

Banned
I see what you meant now, looking at AMD's official page. I've been using 150W TDP for the RX 480, but they list 150W TBP.

Going by that, actual gaming power consumption should be more like 230W+.
It's a useless metric for comparison :(

How do you measure Typical Board Power? Run some benchmarks and take an average? It is a really vague spec that can differ vastly from the actual power draw of the card.
 

ethomaz

Banned
I laughed at that one.

Reporter: How important is the halo spot in the GPU market for AMD?

Su: You should ask David that.

Wang: Very important. We should be able to compete very well in the higher-end space.
 

Ellery

Member
Lisa said so (it was both architecture and the node shrink), but people want to keep their hopes up. If it were architecture alone I'd be thrilled, but she flat out said it's a combination of both; AnandTech was even wondering the same thing and went back to clarify. 50% gains from architecture alone are a rare breed indeed.

Ah ok, I didn't catch that part. Thanks for sharing. I genuinely don't expect much of Navi, but it all comes down to pricing, and if AMD screws that up then I will just buy an RTX 2080 or so.
 

demigod

Member
It really makes no sense that they are revealing more of Navi at E3 unless Sony or MS is revealing something as well.
 

ethomaz

Banned
It really makes no sense that they are revealing more of Navi at E3 unless Sony or MS is revealing something as well.
Navi is focused first on the gaming pipeline, so what is the best place to share more details? Computex is out of place for that.
It is a different arch from GCN, which focuses on compute.
 

LordOfChaos

Member
Lisa Su round table...


They are basically dodging the ray-tracing-in-Navi questions by putting the roadmap up front...

But PS5...


Yeah, it sounds a lot like ray tracing is a Sony customization, while it's coming to standalone Radeon next gen.
 

demigod

Member
Navi is focused first on the gaming pipeline, so what is the best place to share more details? Computex is out of place for that.
It is a different arch from GCN, which focuses on compute.

Polaris and Vega weren't revealed at E3; they had their own thing.

Edit: I'm looking back and it appears that Polaris was in fact revealed at Computex, with pricing.
 

ethomaz

Banned
Polaris and Vega weren't revealed at E3; they had their own thing.
GCN focused on computing...

AMD is splitting their portfolio now (something they should have done years ago)... GCN for compute... RDNA for gaming... so it is better to match the events to show that.

Computex is a more GCN-focused event.
E3 is a more RDNA-focused event.

It is easy to understand AMD's marketing choices.
They want more exposure for Radeon in gaming... there is no better place today than E3... even more so with MS and Sony pushing their next-gen consoles with AMD Radeon tech (I believe that carried a lot of weight in AMD's decision to split their GPU lineup).
 

vpance

Member
Sony probably has some marketing deal with AMD for the first reveal of RT components in Navi. Is the secret June event still on?
 

demigod

Member
GCN focused on computing...

AMD is splitting their portfolio now (something they should have done years ago)... GCN for compute... RDNA for gaming... so it is better to match the events to show that.

Computex is a more GCN-focused event.
E3 is a more RDNA-focused event.

It is easy to understand AMD's marketing choices.

No it's not, because they just revealed Navi/RDNA at Computex.
 

LordOfChaos

Member
They talk about PCIe 4.0 helping Sony, but say that their solution for the SSD is very proprietary.

I swear to Cerny, if they make an SSG-like solution where the whole storage is already an extended framebuffer that the GPU can read from directly and cache into VRAM, leading to no loading, I'm going to go to Sony HQ with a picket sign demanding Other OS support lol
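For what it's worth, here's a CPU-side analogy of that idea using mmap (purely illustrative, nothing here is Sony's actual design, and the file name is made up): instead of loading an asset blob into RAM up front, the file is mapped as an address range and only the bytes you touch get read.

```python
import mmap

# Hypothetical asset file; assume it exists and is non-empty.
with open("huge_texture_atlas.bin", "rb") as f:
    # Map the whole file as a readable address range; nothing is copied yet.
    view = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    # Touch only the 4 KiB "tile" we need; the OS pages in just that region.
    tile = view[4096:8192]
    print(len(tile))  # 4096
    view.close()
```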
 

ethomaz

Banned
Sony probably has some marketing deal with AMD for the first reveal of RT components in Navi. Is the secret June event still on?
It looks from the replies like Navi will do RT via programmable shaders... not dedicated hardware.

"We started our RDNA development before the Sony engagement. RDNA is a revolutionary architecture; it’s also very flexible. So it can be optimized [inaudible]."

While Sony's and probably MS's solutions will be hardware-based.
 

ethomaz

Banned
No it's not, because they just revealed Navi/RDNA at Computex.
They just revealed the name and said you need to come to E3 for them to share everything about the new architecture lol

Computex only teased RDNA/Navi... they showed absolutely nothing, but Lisa Su clearly said their engineers will share all the aspects of RDNA at E3.

E3 is their main reveal event.
 

llien

Member
How realistic is it for Sony (and possibly Microsoft) to just slap on "hardware RT" at the last minute?
This is how AnandTech describes what the RT core is doing:

Unlike Tensor Cores, which are better seen as an FMA array alongside the FP and INT cores, the RT Cores are more like a classic offloading IP block. Treated very similar to texture units by the sub-cores, instructions bound for RT Cores are routed out of sub-cores, which is later notified on completion. Upon receiving a ray probe from the SM, the RT Core proceeds to autonomously traverse the BVH and perform ray-intersection tests. This type of ‘traversal and intersection’ fixed function raytracing accelerator is a well-known concept and has had quite a few implementations over the years, as traversal and intersection testing are two of the most computationally intensive tasks involved. In comparison, traversing the BVH in shaders would require thousands of instruction slots per ray cast, all for testing against bounding box intersections in the BVH.

Returning to the RT Core, it will then return any hits and let the shaders implement the result. The RT Core also handles some grouping and scheduling of memory operations for maximizing memory throughput across multiple rays. And given the workload, presumably some amount of memory and/or ray buffer within the SIP block as well. Like in many other workloads, memory bandwidth is a common bottleneck in raytracing, and has been the focus of several NVIDIA Research papers. And in general, raytracing workloads result in very irregular and random memory accesses, mainly due to incoherent rays, that prove especially problematic for how GPUs typically utilize their memory.
anandtech

If this is easy to moderately difficult to implement, AMD would have no problem adding it to their own cards.
But if it is not, how could Sony/Microsoft do it so quickly?


Microsoft adopting DXR so quickly (Vulkan is still waiting and watching) adds some spice to it.
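For a feel of the per-node work being described, here's a toy Python version of the ray/AABB "slab test" that BVH traversal repeats at every node visited. It's a sketch of the intersection math only, not real traversal, which adds node fetching, stack management, and triangle tests on top; multiply this by dozens of nodes per ray and millions of rays per frame and the "thousands of instruction slots" figure makes sense.

```python
# Toy ray/AABB "slab test": the core check run at every BVH node.
# inv_dir holds 1/direction per axis (a large stand-in value is used
# below for axes where the direction component is zero).

def ray_hits_aabb(origin, inv_dir, box_min, box_max) -> bool:
    tmin, tmax = 0.0, float("inf")
    for axis in range(3):  # x, y, z slabs
        t1 = (box_min[axis] - origin[axis]) * inv_dir[axis]
        t2 = (box_max[axis] - origin[axis]) * inv_dir[axis]
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax  # the ray overlaps all three slabs simultaneously

# Ray from the origin along +z vs. a unit box sitting at z = 2..3:
print(ray_hits_aabb((0, 0, 0), (1e9, 1e9, 1.0), (-1, -1, 2), (1, 1, 3)))  # True
```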
 

xool

Member
Think I said it before somewhere - but forget RT this gen to avoid disappointment - it's the this-gen/next-gen promise

They promise X and struggle to deliver, except in a nerfed way with massive performance hits .. but next gen they deliver

The PS4 era delivered the "HD era" we were promised on PS3/360, where we only got 720p
The PS4 era promised a "turbocharged custom PC design" - what we got was laptop chips with reasonable GPU power
The PS5 era looks to absolutely deliver a "turbocharged custom PC design", with an 8-core 3GHz+ CPU and a 10+TF GPU

Happens every gen as far back as I can remember
 

scalman

Member
The new consoles will have that RT along with much more new tech that nobody knows about yet. I mean, what gets used in the new engines of new games will be amazing, no matter what it ends up being called. Lots of amazing stuff will come, can't wait... but speculation and talk about new AMD chips doesn't mean much here.
 

LordOfChaos

Member
How realistic is it for Sony (and possibly Microsoft) to just slap on "hardware RT" at the last minute?
This is how AnandTech describes what the RT core is doing:

Why would it be last minute? Nvidia was first out of the gate with it, but that doesn't mean it wasn't on anyone's radar until they had it; it would have been on API roadmaps and likely somewhere in AMD's. In the 4+ year dev cycle of the PS5, they probably saw an interesting feature near enough on the roadmap to pull it in.

Lisa also said they'd be talking about it in the coming weeks, so we don't yet know when it'll be an AMD feature, or whether it's not already upon us.
 

Gamernyc78

Banned
I think that's a bit of a reach. She mentions Cerny wanting to revolutionize gaming for the next decade, not that his input was revolutionary. In the end only 42 seconds of the keynote was devoted to their partnership with Sony; compare that to a spokesperson from MS on stage with Lisa speaking about their relationship/involvement, which is much more telling (totalling roughly 5 minutes).

And let's not forget this glorious moment.

kh7BLB.gif

Everyone is reaching, and of course that was part of the presentation; they are heavily invested in Azure, they are part of the Azure compute family lol. That's money, money, money. In the end that means nothing, as all this giddiness towards Sony and Xbox was present right before the full reveals of the PS4/Xbox One, and we know how that turned out.

I got my money on Cerny and his secret sauce though :) we shall see.
 

pawel86ck

Banned
People, stop pushing "dedicated" too far.
Dedicated HW for RT is too risky and situational at this point.
Especially given this demo, which ran on a Vega 56 with generic shaders:





14nm Vega.

That's a tech demo running at 1080p/30fps. Real games like Minecraft need a 1070 just for 720p. Do you really think Sony would build the PS5 to run games at just 720p LOL😂😉👍. No, they will run games at 1440p and above, and they absolutely need HW RT for that.
 
Think I said it before somewhere - but forget RT this gen to avoid disappointment - it's the this-gen/next-gen promise

They promise X and struggle to deliver, except in a nerfed way with massive performance hits .. but next gen they deliver

The PS4 era delivered the "HD era" we were promised on PS3/360, where we only got 720p
The PS4 era promised a "turbocharged custom PC design" - what we got was laptop chips with reasonable GPU power
The PS5 era looks to absolutely deliver a "turbocharged custom PC design", with an 8-core 3GHz+ CPU and a 10+TF GPU

Happens every gen as far back as I can remember
720p is HD. 1080p is FHD. 1440p is QHD. 4K is UHD.
The PS4 still looks fairly amazing today regardless of its gimped processor.
The PS5 will look amazing compared to today's standards.
 

Stuart360

Member
720p is HD. 1080p is FHD. 1440p is QHD. 4K is UHD.
The PS4 still looks fairly amazing today regardless of its gimped processor.
The PS5 will look amazing compared to today's standards.
PS5 games will look like PS4 games at 4K, especially if they all have resource-hogging ray tracing as well.
 

DeepEnigma

Gold Member
PS5 games will look like PS4 games at 4K, especially if they all have resource-hogging ray tracing as well.

I am going to bookmark this post. It is possible, but 1.84TF had no business making games look and run as well as GoW, HZD, etc., and at the time of the 2013 announcement, even with those games shown off years later in initial trailers, people still called BS on what they were seeing or were told they would eventually see.
 

Fake

Member
PS5 games will look like PS4 games at 4K, especially if they all have resource-hogging ray tracing as well.
How can you be sure? PS4 games don't look like PS3 games.
Maybe in the PS5's early days, but mid/late-gen games will really look next-gen, at least from the first parties.
 

Stuart360

Member
I am going to bookmark this post. It is possible, but 1.84TF had no business making games look and run as well as GoW, HZD, etc., and at the time of the 2013 announcement, even with those games shown off years later in initial trailers, people still called BS on what they were seeing or were told they would eventually see.
I'm not saying EVERY game will look like PS4 games at 4K, but there will be plenty. If the consoles have dedicated, separate hardware for ray tracing, that will help make some games look beyond PS4-at-4K, but if it's software ray tracing? Well, those teraflops will be gobbled up like nobody's business.
Look at the X: 6TF, yet it can't always run every XB1 game at 4K; it does most, but not all. You're looking at 8TF to run PS4 games at 4K (as Cerny said himself), which will leave what, about 4TF to play with (assuming the consoles are around 12TF)? Software ray tracing will eat those remaining 4TF without batting an eyelid. I mean, look at the performance hit the RTX cards take with ray tracing compared to without, and they have dedicated RT hardware!
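Spelling that budget out in a trivial sketch, using the post's own assumed figures (a ~12TF console and ~8TF for PS4-class games at 4K; none of these are confirmed specs):

```python
# Headroom left after rendering "PS4 games at 4K", per the assumptions above.
total_tf = 12.0      # assumed next-gen GPU throughput
ps4_at_4k_tf = 8.0   # assumed cost of PS4-class visuals at 4K
headroom_tf = total_tf - ps4_at_4k_tf

print(f"Headroom: {headroom_tf:.1f} TF")  # 4.0 TF
# Software RT that consumes that headroom leaves ~nothing for better visuals;
# dedicated RT hardware would leave the 4 TF free for shading instead.
```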
 

Stuart360

Member
How can you be sure? PS4 games don't look like PS3 games.
Maybe in the PS5's early days, but mid/late-gen games will really look next-gen, at least from the first parties.
That was 720p to 1080p. PS4 to PS5 will be 1080p to 4K, plus the PS4 didn't have ray tracing to deal with over the PS3.
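The raw pixel arithmetic behind those jumps, for reference:

```python
# Pixel counts for the resolution jumps being compared.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080), "4K": (3840, 2160)}
px = {name: w * h for name, (w, h) in resolutions.items()}

print(px["1080p"] / px["720p"])  # 2.25x  (the PS3 -> PS4 era jump)
print(px["4K"] / px["1080p"])    # 4.0x   (the 1080p -> 4K jump argued here)
```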
 