
Navi21 XT/6800XT(?) allegedly scores more than 10,000 in Fire Strike Ultra

Ascend

Member
There seems to be some confusion on whether the 6800XT has 80 CUs or 72 CUs... It should be 72 CUs while the 6900XT would be 80 CUs. But we don't know for sure... I'm just saying, so people keep their expectations in check.

Best of the best for GAMING, which the Titans are not geared towards, but I'm sure you already knew that. The gotcha ain't gonna happen to me, sorry. But I'll leave you to it.
Obviously it can't happen if you keep moving the goalposts after the fact 🤷‍♂️
 
Last edited:

Pagusas

Elden Member
Very, very excited for this reveal. Fingers crossed AMD has the DLSS and RT performance to go along with the raw numbers. This could be a huge come-from-behind story.
 
There seems to be some confusion on whether the 6800XT has 80 CUs or 72 CUs... It should be 72 CUs while the 6900XT would be 80 CUs. But we don't know for sure... I'm just saying, so people keep their expectations in check.


Obviously it can't happen if you keep moving the goalposts after the fact 🤷‍♂️
You don't know how to let things go lol. We're talking about gaming, in a gaming forum, and here you are moving the goalposts from gaming performance, gaming benchmarks, and gaming in general to GPUs aimed at content creators? I even threw in the 3090, as it's marketed as a gaming GPU. Go figure. 🤷‍♂️ 🤷‍♀️ I'll only respond to actual on-topic discussions from now on.
 
Last edited:

RoboFu

One of the green rats
Very, very excited for this reveal. Fingers crossed AMD has the DLSS and RT performance to go along with the raw numbers. This could be a huge come-from-behind story.

They won't have DLSS, and it won't matter if these cards do better than a 3080, but I can see it becoming a moving goalpost for fanboys. (I just chuckled out loud at the thought of GPU fanboys! People will pick teams over anything.)
 

Pagusas

Elden Member
They won't have DLSS, and it won't matter if these cards do better than a 3080, but I can see it becoming a moving goalpost for fanboys. (I just chuckled out loud at the thought of GPU fanboys! People will pick teams over anything.)

I'm not a fanboy; I just want good competition in the space to push down prices and move the market forward faster.
 

CrustyBritches

Gold Member
It's a good thing if AMD brings the competition. Lord knows that people are spending more time inside lately and the thirst for next-gen upgrades is intense. If I'm lucky I'll end up with a new card in the $500-600 range and still be able to sell my 2060S for a decent chunk of cash.

Look, fellas, there's never gonna be a perfect baby. Pick the card with the features you value most and be happy we're all PCMR at the end of the day.
 

00_Zer0

Member
For those worrying about drivers for future AMD cards, I wouldn't. Right now I have a 5700 XT Nitro+ from Sapphire, and after many updates I have no complaints about AMD's drivers. I had a black screen once, and another time I had to roll back drivers to get VR working, but aside from those two mishaps the drivers have been great lately.

For those worrying about a DLSS 2.0 solution from AMD, you should look into purchasing a Sapphire 6800 XT card when they are available, because Sapphire offers its TriXX Boost software on top of AMD's image sharpening. Most of the time the combination of the two looks great in my games. I am sure they will include this software with Big Navi as well, and if they do, it's almost guaranteed that I'll purchase a Sapphire-branded Big Navi card.

Sapphire TriXX Boost

You can choose to scale down your game's resolution by a certain percentage, and the game will recognize this lower resolution in its settings. I have used TriXX Boost in Jedi: Fallen Order with AMD image sharpening enabled, and it looks and runs great. I have it scaled down to 85% of 1440p, with HDR and FreeSync enabled on my LG CX OLED. I also selected all Ultra settings and capped the framerate to 90 fps, and it all looks and runs beautifully with nary a hitch.
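To give a rough idea of what that 85% setting actually works out to, here's some quick back-of-the-envelope math (just my own illustration of the resolution scaling, not Sapphire's actual implementation):

```python
# Rough illustration of an 85% resolution scale at 1440p.
# This is just the math, not Sapphire's actual TriXX Boost code.

def scaled_resolution(width, height, scale):
    """Return the render resolution after applying a linear scale factor."""
    return round(width * scale), round(height * scale)

native = (2560, 1440)                      # native 1440p
render = scaled_resolution(*native, 0.85)  # -> (2176, 1224)

pixels_saved = 1 - (render[0] * render[1]) / (native[0] * native[1])
print(render)                                         # (2176, 1224)
print(f"~{pixels_saved:.0%} fewer pixels to shade")   # ~28% fewer
```

That roughly 28% reduction in shaded pixels is where the extra framerate headroom comes from, and the sharpening pass is what keeps the result looking close to native.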
 
Last edited:

Ascend

Member
I agree with you, Pagusas and CrustyBritches... nVidia and Intel have had their reign for too long. AMD has already brought proper competition in CPUs. Now, hopefully, they can deliver on the GPU front so that these prices can be kept in check. nVidia's prices are still high, even though people perceive them as having come down. I've never bought a GPU for more than $300, but this might be the first time I get one in the $500-600 range. The value is getting too good to pass up, even at the higher prices. The same applies to the 5950X...

It's a great time to be a PC hardware enthusiast. I can't exactly call myself a gamer specifically, because I do so much more with my PCs, and gaming has taken a back seat lately... But in terms of both price and performance, this is shaping up to be the best time to buy hardware in quite a while.

For those worrying about drivers for future AMD cards, I wouldn't. Right now I have a 5700 XT Nitro+ from Sapphire, and after many updates I have no complaints about AMD's drivers. I had a black screen once, and another time I had to roll back drivers to get VR working, but aside from those two mishaps the drivers have been great lately.

For those worrying about a DLSS 2.0 solution from AMD, you should look into purchasing a Sapphire 6800 XT card when they are available, because Sapphire offers its TriXX Boost software on top of AMD's image sharpening. Most of the time the combination of the two looks great in my games. I am sure they will include this software with Big Navi as well, and if they do, it's almost guaranteed that I'll purchase a Sapphire-branded Big Navi card.

Sapphire TriXX Boost

You can choose to scale down your game's resolution by a certain percentage, and the game will recognize this lower resolution in its settings. I have used TriXX Boost in Jedi: Fallen Order with AMD image sharpening enabled, and it looks and runs great. I have it scaled down to 85% of 1440p, with HDR and FreeSync enabled on my LG CX OLED. I also selected all Ultra settings and capped the framerate to 90 fps, and it all looks and runs beautifully with nary a hitch.
That's exactly what I think AMD should implement directly in their drivers: make up some fancy name and add a simple toggle for automatic downscaling with sharpening. People won't care how it works, just that it improves performance with no or minimal image quality loss.
AMD has quite a few innovative features, but unfortunately, they don't get the attention they deserve. Part of it is their lack of marketing, part of it is nVidia's mindshare clouding people's judgment.

You don't know how to let things go lol.


We're talking about gaming, in a gaming forum, and here you are moving the goalposts from gaming performance, gaming benchmarks, and gaming in general to GPUs aimed at content creators? I even threw in the 3090, as it's marketed as a gaming GPU. Go figure. 🤷‍♂️ 🤷‍♀️ I'll only respond to actual on-topic discussions from now on.
 
Last edited:

llien

Member
I have a 2080 and I don't use DLSS when available; it just always has some weird look to it. I do think it will continue to get better.

Oh, but DF said that when you look at your screen from far away, it looks the same as or better than "native 4K".
At least if the 4K image is TAA'd to look blurrier.
 

Senua

Member
Oh, but DF said that when you look at your screen from far away, it looks the same as or better than "native 4K".
At least if the 4K image is TAA'd to look blurrier.
I don't expect it to be better than native 4K with a decent AA solution, but for the performance it saves and how good it looks, it's a winner in my book. Native 4K is a bitch to run.

And besides, how often do games not use shitty, soft-arse TAA these days?
 
Last edited:

Kenpachii

Member
Remember when people were thinking it was just going to match a 3070? Rofl.

Imagine if AMD launched that 6900 XT for 500 bucks with 16 GB of VRAM to save some money. It's the 2080 Ti price drop all over again, this time for the 3090. Holy shit.
 
Last edited:

smbu2000

Member
Wow, that would be a great score. Hopefully it will also translate into real-world performance.
Competition is always good, as it forces both companies to do their best and keeps prices in check. That way you don't have companies arbitrarily increasing prices for minimal performance gains (RTX 2000 series).

I'm currently using an Nvidia card (2080Ti), but would happily switch back to AMD in the future with the right card.
 

pullcounter

Member
Looks like the 6900 XT is gonna be released in limited quantities, and once the stock is bought up there won't be any more made, leaving the 72 CU 6800 XT as the top card, which seems to be roughly a 3080 minus the RT performance and DLSS, but with more VRAM.

Also, there are rumors floating around of these two cards hitting 2.6 GHz on air. If true, wow.
 
Last edited:

Krappadizzle

Gold Member
Looks like the 6900 XT is gonna be released in limited quantities, and once the stock is bought up there won't be any more made, leaving the 72 CU 6800 XT as the top card, which seems to be roughly a 3080 minus the RT performance and DLSS, but with more VRAM.

Also, there are rumors floating around of these two cards hitting 2.6 GHz on air. If true, wow.
I think it's gonna be really important to a lot of people to see what its ray-tracing capabilities and its answer to DLSS are. I'm super excited. It's such an exciting time to be a tech enthusiast and gamer.
 

pullcounter

Member
I think it's gonna be really important to a lot of people to see what its ray-tracing capabilities and its answer to DLSS are. I'm super excited. It's such an exciting time to be a tech enthusiast and gamer.

Yeah, for sure. I believe AMD also stated there won't be any shortages. Though there aren't really shortages for the Ampere cards either; it's all intentional by Nvidia.
They're withholding memory chips so AIBs can't put more cards out, inflating demand until the AMD cards are announced.

Nvidia has done this numerous times. AMD really fucked up by not having their cards ready for an October launch. They're gonna have to really compete on price.
 

DeepEnigma

Gold Member
Really...? Let's see;


That alone is enough to discount your claim. I would keep going, but I have better things to do with my time. It's obvious that you're simply touting the latest, newest, shiniest stuff that nVidia propagates, because that's what you like. That's fine by me, but the fact that you don't freely admit it when called out is a problem. But whatever.

In any case...




Holy shit if true!
 
Yeah, for sure. I believe AMD also stated there won't be any shortages. Though there aren't really shortages for the Ampere cards either; it's all intentional by Nvidia.
They're withholding memory chips so AIBs can't put more cards out, inflating demand until the AMD cards are announced.

Nvidia has done this numerous times. AMD really fucked up by not having their cards ready for an October launch. They're gonna have to really compete on price.


What is this nonsense, man? Nvidia withholding chips so partners can't put more cards out? Jesus Christ. It's been said numerous times, by countless sources, that the supply is greater than the 2000 series launch and on par with the 1000 series launch. The demand, though, is by EVGA's own words the greatest GPU demand in the last 15 years, the greatest they have ever seen. Newegg said there was more traffic for this card than they had on Black Friday. 3DMark shows a far higher number of results than numerous other launches. Why do people keep ignoring hard data and pushing conspiracy theories?
 
Last edited:

Krappadizzle

Gold Member


Ray-Traced Diffuse Illumination
Ray-Traced Reflections
Ray-Traced Ambient Occlusion
Ray-Traced Shadows

This game will look like a different game on a 3080 compared to anywhere else.
This is the one I'm most excited for.
 

pullcounter

Member
What is this nonsense, man? Nvidia withholding chips so partners can't put more cards out? Jesus Christ. It's been said numerous times, by countless sources, that the supply is greater than the 2000 series launch and on par with the 1000 series launch. The demand, though, is by EVGA's own words the greatest GPU demand in the last 15 years, the greatest they have ever seen. Newegg said there was more traffic for this card than they had on Black Friday. 3DMark shows a far higher number of results than numerous other launches. Why do people keep ignoring hard data and pushing conspiracy theories?

Taking the word of a corporation as fact is just as naive as what you're claiming I'm doing.
When Nvidia drops a shitload of cards a day or two before/after the AMD cards are reviewed, it will pretty much prove what sources are saying: that Nvidia is sitting on enough memory for 300,000 cards to drop, along with about 30,000 3090s.

 

Nydus

Gold Member
If my B9 could do FreeSync I would consider going AMD. But I won't throw money away just to sidegrade to AMD :/
 

pullcounter

Member
I had a 3080 reserved, but the shipping date was November 11th (I ordered it September 17th lol), and their price was about 20% higher than MSRP (and they only price match up to 30 days from the date of INVOICE), so I cancelled the order. If their price match were 30 days from product delivery, I would've kept it and tried to find it somewhere else cheaper.
 

Ascend

Member
Some additional food for thought... I'm gonna focus on Port Royal performance for a second. We have these results;
[Chart: Port Royal results expressed as percentages relative to the RTX 2080 Ti]



I find it odd that these results are at 4K. The default for that benchmark is 1440p, and even then an RTX 2080 Ti averages around 43 fps... And that's an upper-tier one with a blazingly fast CPU, not an FE version. You can easily get 33-ish fps with a 2700X or a lower-tier 2080 Ti.

These results are in percentages... So that means that, with the RTX 2080 Ti's 43 fps as a basis, the 6800 XT would score around 45 fps and the 3080 around 55 fps. If that's what it looks like at 1440p, what would it look like at 4K? Running that benchmark at 4K is extremely demanding. Right now we would have a difference between 45 fps and 55 fps. For some people that is significant in games; for others, it's not... If we assume linear scaling (which is rarely the case, especially across different architectures), that means that if the 3080 can manage 110 fps with RT, the 6800 XT would do 90 fps. Good enough in my book, even though that's a ballpark figure, in a very, very large park.

At 4K, you can expect around 23 fps for the 2080 Ti, which means you'd be looking at 24 fps for the 6800 XT and 29 fps for the 3080. That's poor performance across the board. The percentages are meaningless in this case, because the framerates are too low to be of any value. The error margins start to play too big a role to rely on percentages. It would be a similar situation to one card running at 5 fps and the other at 10 fps, and then claiming that one is 100% faster. Not exactly reliable. We need playable framerates to determine how the performance scales.
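For anyone who wants to check my arithmetic, here's the rough calculation I'm doing (the baselines and relative percentages are ballpark figures read off the leaked chart, not measured results):

```python
# Back-of-the-envelope conversion of the leaked Port Royal percentages into fps.
# Baselines and relative percentages are rough figures from the discussion above,
# and linear scaling is a simplifying assumption, not a measurement.

def estimate_fps(baseline_fps, relative_percent):
    """Estimate fps for a card scored as a percentage of the RTX 2080 Ti."""
    return baseline_fps * relative_percent / 100

baseline_1440p = 43   # upper-tier RTX 2080 Ti, Port Royal default 1440p preset
baseline_4k = 23      # same card at 4K

for card, pct in [("6800 XT", 105), ("RTX 3080", 128)]:
    print(card,
          round(estimate_fps(baseline_1440p, pct)), "fps @ 1440p,",
          round(estimate_fps(baseline_4k, pct)), "fps @ 4K")
# 6800 XT  45 fps @ 1440p, 24 fps @ 4K
# RTX 3080 55 fps @ 1440p, 29 fps @ 4K

# With the (shaky) linear-scaling assumption, a game where the 3080 hits
# 110 fps with RT would put the 6800 XT at roughly 110 * (105 / 128) ≈ 90 fps.
```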

Additionally, the RT performance difference in Port Royal can be a lot larger or smaller than in games if you compare Ampere to Turing... The 3080 is about 25% faster than the 2080 Ti in Port Royal, but in Wolfenstein: Youngblood it's a mere 10% difference. Then there's stuff like Fortnite, where a 3080 is a whopping 45% faster than a 2080 Ti...

So... Even among nVidia cards, there is a huge difference in RT performance across games. That means that these numbers don't say much about AMD's RT performance... We're technically still blind... The only confirmation we have is that the potential to match the 2000 series is there.
 
Last edited:

Rickyiez

Member
Igor's Lab's TimeSpy Extreme score does not reflect the correct one for the 3080, as it should consistently score around 8.4k.

Harukaze's graph is more accurate; that said, it's still looking very good for Radeon rasterization performance.

 
Last edited:

regawdless

Banned
Igor's Lab's TimeSpy Extreme score does not reflect the correct one for the 3080, as it should consistently score around 8.4k.

Harukaze's graph is more accurate; that said, it's still looking very good for Radeon rasterization performance.


Curious why AMD is so incredibly fast on Fire Strike, even beating the 3090, while being a little bit behind the 3080 on TimeSpy.
 

Barakov

Gold Member


Ray-Traced Diffuse Illumination
Ray-Traced Reflections
Ray-Traced Ambient Occlusion
Ray-Traced Shadows

This game will look like a different game on a 3080 compared to anywhere else.
 

llien

Member
Oh, boy, rumors of a new NV card between the 3070 and 3080.
It is said to be based on... GA102, motherfucker.

 

BluRayHiDef

Banned
Those of you who have managed to get an RTX 30 Series card, if RDNA2 winds up being faster than the RTX 30 Series, will you regret your purchase?

Personally, I won't regret my decision, because I'm pretty sure that the difference will be negligible, relevant only in traditional rasterization, and in Nvidia's favor when ray tracing and DLSS are available.
 
Those of you who have managed to get an RTX 30 Series card, if RDNA2 winds up being faster than the RTX 30 Series, will you regret your purchase?

Personally, I won't regret my decision, because I'm pretty sure that the difference will be negligible, relevant only in traditional rasterization, and in Nvidia's favor when ray tracing and DLSS are available.


"I have a unique perspective on the recent nVidia launch, as someone who spends 40 hours a week immersed in Artificial Intelligence research with accelerated computing, and what I’m seeing is that it hasn’t yet dawned on technology reporters just how much the situation is fundamentally changing with the introduction of the GeForce 30 series cards (and their AMD counterparts debuting in next-gen consoles).

Reviewers are taking a business-as-usual approach with their benchmarks and analyses, but the truth is, there is nothing they can currently include in their test suites which will demonstrate the power of the RTX 3080 relative to previous-gen GPU’s.

nVidia has been accused of over-stating the performance improvements, by cherry picking results with RTX or DLSS turned on, but these metrics are the most representative of what’s going to happen with next-gen games. In fact, I would say nVidia is understating the potential performance delta.

Don’t get me wrong, most of the benchmarks being reported are valid data, but these cards were not designed with the current generation of game engines foremost in mind. nVidia understands what is coming with next-gen game engines, and they’ve taken a very forward-thinking approach with the Ampere architecture. If I saw a competing card released tomorrow which heavily outperformed the GeForce 3080 in current-gen games, it would actually set my alarm bells ringing, because it would mean that the competing GPU’s silicon has been over-allocated to serving yesterday’s needs. I have a feeling that if nVidia wasn’t so concerned with prizing 1080 Ti’s out of our cold, dead hands, then they would have bothered even less with competing head-to-head with older cards in rasterization performance. "
 
Last edited:

BluRayHiDef

Banned

"I have a unique perspective on the recent nVidia launch, as someone who spends 40 hours a week immersed in Artificial Intelligence research with accelerated computing, and what I’m seeing is that it hasn’t yet dawned on technology reporters just how much the situation is fundamentally changing with the introduction of the GeForce 30 series cards (and their AMD counterparts debuting in next-gen consoles).

Reviewers are taking a business-as-usual approach with their benchmarks and analyses, but the truth is, there is nothing they can currently include in their test suites which will demonstrate the power of the RTX 3080 relative to previous-gen GPU’s.

nVidia has been accused of over-stating the performance improvements, by cherry picking results with RTX or DLSS turned on, but these metrics are the most representative of what’s going to happen with next-gen games. In fact, I would say nVidia is understating the potential performance delta.

Don’t get me wrong, most of the benchmarks being reported are valid data, but these cards were not designed with the current generation of game engines foremost in mind. nVidia understands what is coming with next-gen game engines, and they’ve taken a very forward-thinking approach with the Ampere architecture. If I saw a competing card released tomorrow which heavily outperformed the GeForce 3080 in current-gen games, it would actually set my alarm bells ringing, because it would mean that the competing GPU’s silicon has been over-allocated to serving yesterday’s needs. I have a feeling that if nVidia wasn’t so concerned with prizing 1080 Ti’s out of our cold, dead hands, then they would have bothered even less with competing head-to-head with older cards in rasterization performance. "

Sounds interesting, but a lot of games are going to be backwards compatible with the PS4 (Pro) and the Xbox One (X) for the next two years, which means that it will be at least two years until the release of games that will run on completely revamped engines. By then, there will be another generation of cards by both AMD and Nvidia.
 
Last edited:
Sounds interesting, but a lot of games are going to be backwards compatible with the PS4 (Pro) and the Xbox One (X) for the next two years, which means that it will be at least two years until the release of games that will run on completely revamped engines. By then, there will be another generation of cards by both AMD and Nvidia.


It doesn't matter that there'll be cross-gen. A lot of games are gonna start including RTX and DLSS. We have an article listing 12 more games with RTX and DLSS in just these remaining two months. AMD has a worse ray-tracing implementation that takes away from rasterization performance. That's why you see much worse performance than a 3080 in the leaked benchmarks. AMD seems to have pumped up its ability to run 2013-tech games, while Nvidia will let you run tomorrow's games.

Even if AMD's cards are slightly better in rasterization, the 3080 is still a Goliath of raster performance. And we will be able to play the good stuff, Watch Dogs, Cyberpunk, Cold War and beyond, with better effects and image quality than on AMD. Time will tell whether AMD fucked up here or Nvidia did. I think it's AMD.
 
Last edited:

evanft

Member
Taking the word of a corporation as fact is just as naive as what you're claiming I'm doing.
When Nvidia drops a shitload of cards a day or two before/after the AMD cards are reviewed, it will pretty much prove what sources are saying: that Nvidia is sitting on enough memory for 300,000 cards to drop, along with about 30,000 3090s.



Imagine linking to an MLID video. Jesus Fucking Christ.
 