
RX 5500 XT 4GB and 8GB see a notable improvement under PCIe 4.0 over PCIe 3.0

thelastword

Banned
[Benchmark charts: RX 5500 XT 4GB/8GB results, PCIe 3.0 vs PCIe 4.0]



A nice improvement. Now, so many folks told me that PCIe 4.0 would mean nothing for gaming, but I told them it would mean a lot for advanced graphics; more bandwidth is always good. Just look at the Wolfenstein results to see what I mean. Wolfenstein streams lots of high-quality textures on the fly, and it also uses a high-quality particle system driven by the GPU. All of those effects are GPU-side and high-res, so they need lots of bandwidth, and you can see the improvement under PCIe 4.0.

What does this mean for the future? I think it means a lot for next-gen GPUs and consoles. I remember AMD showed a demo of the raw bandwidth of PCIe 4.0.



What people need to know is how important this is for next gen. Next gen is about better lighting, 4K-8K textures, full-res particle systems on the GPU, 60fps, and larger worlds where fidelity stays at a high level. You need bandwidth for that. More to the point, raytracing will be a mainstay for next-gen GPUs and consoles, and the bandwidth you need to do RT not just for lighting but for shadows and reflections in game at high resolution and quality is tremendous. It's also why an HBCC-style solution is needed on console to stream these assets from a very fast SSD over a PCIe 4.0 bus; you can't have anything bottlenecking the pipeline. The GPUs will be fast, the bus will be fast, and the memory will be fast, which is why I won't be surprised if HBM3 is a thing for the PS5, or even HBM2 if HBM3 is not quite ready. The HBM2 spec was actually upgraded months ago: it now supports 2.4 Gb/s per pin and raised the per-layer density definition from 8Gb to 16Gb for higher-capacity stacks. Personally, I'd love to see HBM3 on PS5 for its efficiency and raw bandwidth, but perhaps that's a talk for PCIe 5.0 + HBM3 in the future.
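The updated HBM2 figures mentioned above are easy to sanity-check. A quick sketch of the arithmetic, assuming the standard 1024-bit interface and 8 layers per stack (those two assumptions are mine, from the published HBM2 spec, not from this post):

```python
# Back-of-the-envelope math for the updated HBM2 spec figures.
# Assumes a 1024-bit bus per stack and 8 DRAM layers per stack.

def hbm2_stack_bandwidth_gbs(pin_rate_gbps: float, bus_width_bits: int = 1024) -> float:
    """Peak bandwidth per stack in GB/s: per-pin rate (Gb/s) x pins / 8 bits per byte."""
    return pin_rate_gbps * bus_width_bits / 8

def hbm2_stack_capacity_gb(layer_density_gbit: float, layers: int = 8) -> float:
    """Capacity per stack in GB, from per-layer density in Gbit."""
    return layer_density_gbit * layers / 8

print(hbm2_stack_bandwidth_gbs(2.4))  # ~307 GB/s per stack at 2.4 Gb/s per pin
print(hbm2_stack_capacity_gb(16))     # 16 GB per stack with 16Gb layers
```

So even a single stack at the upgraded pin rate lands around 307 GB/s; two stacks would already exceed a 256-bit GDDR6 setup.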


In any case, more is always better when it comes to bandwidth, and it's only logical that the next step in GPUs and consoles requires high-speed buses to push the envelope on high-end graphics with raytracing in tow. So when I read the next-gen hardware thread and people try to downplay console hardware to 8-9TF or GTX 1080-like performance, I have to sigh, because if we truly want a new-gen experience, we have to do way better than a GTX 1080. There's no way you are getting 4K 30/60fps with RT on such tepid GTX-class solutions. Next-gen consoles will easily outclass the 5700 XT with RT on top.
 
Last edited by a moderator:
This is only the case because that card only uses 8 lanes instead of the usual 16. For cards that use all 16, like the 5700 XT, there is almost zero difference between PCIe 3.0 and 4.0. AMD more or less crippled the card to create a scenario where it benefits disproportionately from PCIe 4.0, which is kind of a scummy thing to do since it's a budget card aimed at people who are unlikely to have X570 boards.
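The lane count explains the whole effect. A rough peak-throughput sketch using the public per-lane rates (8 GT/s for PCIe 3.0, 16 GT/s for 4.0, both with 128b/130b encoding; these numbers are from the spec, not measured):

```python
# Why an x8 card cares about the PCIe generation while an x16 card mostly doesn't.
# PCIe 3.0 runs 8 GT/s per lane, PCIe 4.0 runs 16 GT/s; both use 128b/130b encoding.

def pcie_gbs(gt_per_s: float, lanes: int) -> float:
    """Usable peak bandwidth in GB/s after 128b/130b line-code overhead."""
    return gt_per_s * lanes * (128 / 130) / 8

print(round(pcie_gbs(8, 8), 2))    # PCIe 3.0 x8  -> ~7.88 GB/s
print(round(pcie_gbs(16, 8), 2))   # PCIe 4.0 x8  -> ~15.75 GB/s
print(round(pcie_gbs(8, 16), 2))   # PCIe 3.0 x16 -> ~15.75 GB/s, same as 4.0 x8
```

An x8 card on PCIe 3.0 gets roughly half the link bandwidth of the same card on 4.0, which is exactly the gap the 5500 XT benchmarks expose.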
 

Armorian

Banned
This is only the case because that card only uses 8 lanes instead of the usual 16. For cards that use all 16, like the 5700 XT, there is almost zero difference between PCIe 3.0 and 4.0. AMD more or less crippled the card to create a scenario where it benefits disproportionately from PCIe 4.0, which is kind of a scummy thing to do since it's a budget card aimed at people who are unlikely to have X570 boards.

Yep, dick move from AMD.

So far, even PCIe 2.0 x16 is enough for most GPUs.
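That checks out on paper, too. A quick comparison of full x16 slots across generations (note the encoding overhead differs: PCIe 1.x/2.0 use 8b/10b, 3.0 onward use 128b/130b; the rates are from the published specs):

```python
# Peak x16 slot bandwidth by PCIe generation, accounting for line-code overhead.

def pcie_x16_gbs(gt_per_s: float, efficiency: float) -> float:
    """Usable peak bandwidth of an x16 link in GB/s."""
    return gt_per_s * 16 * efficiency / 8

print(round(pcie_x16_gbs(5, 8 / 10), 2))       # PCIe 2.0 x16 -> ~8 GB/s
print(round(pcie_x16_gbs(8, 128 / 130), 2))    # PCIe 3.0 x16 -> ~15.75 GB/s
print(round(pcie_x16_gbs(16, 128 / 130), 2))   # PCIe 4.0 x16 -> ~31.51 GB/s
```

Most GPUs rarely saturate even the ~8 GB/s of a 2.0 x16 link outside of texture-streaming spikes, which is why x16 cards show almost no generation-to-generation difference.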
 

Ivellios

Member
From what I read, this only happened with this particular card, since even the 2080 Ti doesn't need PCIe 4.0.
 

PhoenixTank

Member
This is only the case because that card only uses 8 lanes instead of the usual 16. For cards that use all 16, like the 5700 XT, there is almost zero difference between PCIe 3.0 and 4.0.
I agree with this part of your post and it needs to be the tl;dr for this thread.

As for the rest, it seems to be a cost-saving measure. But if it affects performance, why wouldn't they instead opt for 16 lanes at PCIe 3.0? Nobody would bat an eye.
Definitely a weird card out of the bunch.

Edit: HWUnboxed have done some testing too.
 

MrRenegade

Report me if I continue to troll
This is only the case because that card only uses 8 lanes instead of the usual 16. For cards that use all 16, like the 5700 XT, there is almost zero difference between PCIe 3.0 and 4.0. AMD more or less crippled the card to create a scenario where it benefits disproportionately from PCIe 4.0, which is kind of a scummy thing to do since it's a budget card aimed at people who are unlikely to have X570 boards.
They learned a lot from NVIDIA.
 

JohnnyFootball

GerAlt-Right. Ciriously.
AMD botched this card badly and deserves a ton of heat for it. The pricing sucks ass, and now this.

It can't be overstated how bad a design oversight it is to have the card lose performance unless it's connected to a high-end motherboard with PCIe 4.0.

It's extremely unlikely that anyone buying this GPU is going to be putting it in an X570 motherboard.
 
They learned a lot from NVIDIA.
It was kind of a dumb thing to do, but I don't think they did it intentionally; they have nothing to gain, really. Nobody is going to buy an X570 board just so their budget GPU gets a slight boost; they'd just buy Nvidia instead.
 

thelastword

Banned
AMD botched this card badly and deserves a ton of heat for it. The pricing sucks ass, and now this.

It can't be overstated how bad a design oversight it is to have the card lose performance unless it's connected to a high-end motherboard with PCIe 4.0.

It's extremely unlikely that anyone buying this GPU is going to be putting it in an X570 motherboard.
Perhaps they can update it to support 16 lanes. The production issues at TSMC are probably why AMD launched these cards at this price, and also to cash in on holiday sales. As it stands, if you were to purchase a mid-range card right now, it makes sense to go with the 5500 XT over the 580 and 590 for the lower power draw, as those cards are being phased out.

So 5500 XT over 580, and 5600 XT over 590, 1660 Ti, Vega 56, 1070 Ti, etc.

I think the prices of the 5500 XT will go down after the holidays, and some of its issues will be fixed.
 

PhoenixTank

Member
Perhaps they can update it to support 16 lanes. The production issues at TSMC are probably why AMD launched these cards at this price, and also to cash in on holiday sales.
There was a rumour that these 5500 chips are on Samsung 7nm. I haven't seen it debunked or confirmed yet, but I'm a bit time-limited at the moment.
 
This is only the case because that card only uses 8 lanes instead of the usual 16. For cards that use 16 like the 5700 XT , there is almost zero difference between PCIe 3.0 and 4.0. AMD more or less crippled the card to create a scenario where it will benefit disproportionately from PCIe 4.0, which is kind of a scummy thing to do since it's a budget card aimed at people who are unlikely to have X570 boards.
Wow. How could anyone be dumb enough to do this and cripple their own product? Oh wait....AMD GPU division. Flaming trainwreck continues to smolder.

AMD's CPU division is amazing these days, too bad their GPU division is still....this. Yeah.........

No one is going to buy into X570 and get PCIe 4.0 and then only buy a 5500 series GPU btw.
 

LordOfChaos

Member
Perhaps they can update it to support 16 lanes. The production issues at TSMC are probably why AMD launched these cards at this price, and also to cash in on holiday sales. As it stands, if you were to purchase a mid-range card right now, it makes sense to go with the 5500 XT over the 580 and 590 for the lower power draw, as those cards are being phased out.

I don't recall this happening before. Isn't this a matter of physical wiring from the slot pins to the PCB and chip? If the extra lanes simply aren't wired up somewhere, there's no updating it.
 

thelastword

Banned
There was a rumour that these 5500 chips are on Samsung 7nm. I haven't seen it debunked or confirmed yet, but I'm a bit time-limited at the moment.
Wasn't it Nvidia going the Samsung route because of TSMC's large commitment to fulfilling its current orders? According to AMD they are OK with production, but there is a huge slate of 7nm products hitting the market all over in 2020.

I'm still thinking AMD has not released a 5800 XT yet because of the huge demand for 7nm capacity.

I don't recall this happening before. Isn't this a matter of physical wiring from the slot pins to the PCB and chip? If the extra lanes simply aren't wired up somewhere, there's no updating it.
Didn't Nvidia release a firmware update some time ago to add compatibility with DisplayPort 1.3 and 1.4? In any case, we shall see how AMD approaches this.
 

PhoenixTank

Member
Wasn't it Nvidia going the Samsung route because of TSMC's large commitment to fulfilling its current orders? According to AMD they are OK with production, but there is a huge slate of 7nm products hitting the market all over in 2020.
Yes, that was another rumour. I'd want to use as many TSMC wafers as possible for Zen, if it were me.

I believe Jensen recently said at a conference/investor meeting that Nvidia will split between Samsung and TSMC. Interestingly I think he mentioned ~80% would be on TSMC.

There is room for translation error there too.
 

Ascend

Member
They probably thought the card was too slow for the reduction to 8 lanes to make a significant difference. In some games the difference is bigger than in others, but honestly, I don't see the performance gap as a huge deal.
 

LordOfChaos

Member
Didn't Nvidia release a firmware update some time ago to add compatibility with DisplayPort 1.3 and 1.4? In any case, we shall see how AMD approaches this.

The hardware would have been capable of that from the outset, then. But most cards that don't use the full 16 lanes simply don't carry all the wiring for the data pins, as far as I know.

Also, a DisplayPort version doesn't kneecap a card's own launch reviews the way this did.

I wonder if gamer jesus might know
 

JohnnyFootball

GerAlt-Right. Ciriously.
Wow. How could anyone be dumb enough to do this and cripple their own product? Oh wait....AMD GPU division. Flaming trainwreck continues to smolder.

AMD's CPU division is amazing these days, too bad their GPU division is still....this. Yeah.........

No one is going to buy into X570 and get PCIe 4.0 and then only buy a 5500 series GPU btw.
The 5700 and 5700 XT would like a word with you. They're excellent cards, and priced pretty well for how they perform.

They've also had some nice sales, with some AIB 5700 XTs selling for around $370, while Nvidia GPUs pretty much never go on sale.
 