Navi21 XT/6800XT(?) allegedly scores more than 10,000 in Fire Strike Ultra

llien

Member
Tensor cores performing int8 inference
There is no info on what kind of values the NN layer of the TAA upscaling used in DLSS 2.0 works with.
GPUs are inherently well suited to NN inference, as NVIDIA itself loudly proclaimed in the past, well before Tensor Cores.

TCs are needed to explain why something is disabled on older cards, because "I just want you to buy my newer product" wouldn't fly.
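To make the point concrete, here is a minimal sketch (mine, purely illustrative, not anything from DLSS) of what int8 inference boils down to: integer multiply-accumulate plus a rescale, which any GPU's ordinary ALUs can execute; Tensor Cores only make it faster.

    # Illustrative only: symmetric int8 quantization of one linear layer.
    import numpy as np

    def quantize(x, scale):
        # Map floats onto an int8 grid; "scale" sets the step size.
        return np.clip(np.round(x / scale), -128, 127).astype(np.int8)

    rng = np.random.default_rng(0)
    x, w = rng.standard_normal((1, 64)), rng.standard_normal((64, 64))
    sx, sw = np.abs(x).max() / 127, np.abs(w).max() / 127

    # int8 multiply, int32 accumulate, float rescale -- plain integer math.
    acc = quantize(x, sx).astype(np.int32) @ quantize(w, sw).astype(np.int32)
    print(np.abs(acc * (sx * sw) - x @ w).max())  # small quantization error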

I'm just warning everyone who's worshipping at the altar of Lisa Su that they don't care (much) more about you than Nvidia does
FidelityFX is cross platform.
Freesync is cross platform.
Mantle turned into Vulkan is... you've guessed it.

Meanwhile Huang managed to kill OpenGL by pushing proprietary crap. (Microsoft was amused, I guess)

So, Lisa over Jensen any time, thank you very much.


This "but they are commercial companies, so they would do anything to make more money" is getting old.
Yes, they are commercial companies, and one of them is used to all kind of shit, while another plays classy.
If Huang tells you something is two times faster, it is highly unlikely it is.
Whereas when Dr Su tells you, something is X% faster, you could be sure it really is.
 

spyshagg

Should not be allowed to breed
I honestly don't know how AMD does it. First they crush Intel and now Nvidia. Absolute giant slayers; Lisa's balls are bigger than Putin's.

The past 8 years aren't the entire history of these companies.

They (AMD/ATI) have been winning battles with Intel/Nvidia since the '90s.

The current slump began in 2003-2007, when AMD, despite having the best CPUs on the market, was being bankrupted by Intel's malpractice, leaving AMD with no money to continue R&D. That won Intel the next 10 years of the CPU war.

Unfortunately, ATI was bought by AMD at the worst possible time (because of Intel), and suddenly had no money to compete with Nvidia. The last "world's fastest" GPU from AMD was the 290X in 2013, only 7 years ago.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Nvidia is shook.
New GA102-based card incoming.

The RTX 3075 or RTX 3070 Ti looks like it will easily beat RDNA2 in RT workloads.

What's strange is that the RTX 3070 hits ~8,900-9,000 in Fire Strike Ultra, so this new card should hit 9,500-10,000.
[Chart: NVIDIA RTX 3070 3DMark performance]

This card could be priced exactly at the 6800XT's level, or Nvidia brings everything down a price point.
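Rough back-of-the-envelope for that score guess; the GA102 cut-down core count and the scaling efficiencies below are my assumptions, not leaks:

    # Naive estimate: scale the RTX 3070's leaked Fire Strike Ultra score by
    # core count, derated because Ampere doesn't scale linearly with cores.
    score_3070, cores_3070 = 8950, 5888      # leaked midpoint; RTX 3070 cores
    cores_new = 7424                         # hypothetical GA102 cut-down
    for eff in (0.80, 0.85, 0.90):           # assumed scaling efficiency
        est = score_3070 * (cores_new / cores_3070) * eff
        print(f"eff={eff:.2f} -> ~{est:.0f}")  # ~9,000-10,200 range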

 
lol I love the confidence that Radeon fanboys exude every single time they're about to launch a product. It's always so funny when the reality sets in 😂

I'm not a dickhead, so I'll never pay 3080+ prices for a video card.

And I'm not a twat, so I'll never buy a gash product like the 2060, where the main feature you paid all that extra money for was borderline unusable, or pay top whack for a mid-tier 3070 with its gimped memory arrangement. If that card had more VRAM I'd be all over it.

I just want a good card for a good, fair, reasonable price. If AMD can offer that, cool. Honestly, I don't think Nvidia wants my money anymore, which is fair enough.

If neither can, then I'll look at the consoles instead and wait a couple of years.

Look, I've been buying cards since the Voodoo days. I know full well how all these companies work, and how their 'super special' feature sets pan out. I simply refuse to get caught up in the zeitgeist anymore and pay a fortune to early-adopt absolute crap.
 
No, it didn't have a small bump over Turing. It's 50% faster than a 2080 Ti and 100% faster than a regular 2080. Nvidia wasn't lying when they said twice as fast as the 2080. It literally is that fast.

You're feeding into Nvidia's marketing tricks. They also said Ampere enjoys a 1.9x perf-per-watt improvement over Turing. We all know that is total BS under normal circumstances.

So let's take your claim: if there is a 50% improvement in the RT cores, show me a 50% improvement in framerates with RT on, because that's all that matters at the end of the day, not Nvidia's theoretical marketing numbers. In the games that have been tested, HU/Techspot showed a 10% improvement from the 2080 Ti to the 3080's RT acceleration:

Techspot said:
Our tests also show that ray tracing acceleration in the RTX 3080 isn’t overly better than the 2080 Ti's, with less than a 10% speed up to RT acceleration separating the two. In a best-case scenario like Wolfenstein: Youngblood at 4K which received an Ampere-specific patch to improve performance, the RTX 3080 is 25% better at ray tracing acceleration. Ideally, we'd need to see a 50% or even 100% improvement to RT acceleration before the RTX on and off gap feels more acceptable with today’s effects.

Because Ampere is only a minor improvement over Turing for acceleration, there’s still question marks over whether this card will be sufficient for the next few years of ray tracing. Most of today’s games use only one or two effects, and use them sparingly.

 

FireFly

Member
No, it didn't have a small bump over Turing. It's 50% faster than a 2080 Ti and 100% faster than a regular 2080. Nvidia wasn't lying when they said twice as fast as the 2080. It literally is that fast.
It depends how you look at it. The 2080 Ti has as many SMs as the 3080, so to scale raytracing performance to match the doubled FP32 throughput per SM, they had to increase the capabilities of the RT cores.

But just adding more SMs to Turing would have achieved the same thing, so this by itself doesn't prove Ampere is much more efficient.

The 3070 apparently performs around the same as the 2080 Ti in raytracing, while having around 6% fewer transistors. So in that case we are talking about a modest improvement.
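The "~6% fewer" figure checks out against the public die specs (TU102 in the 2080 Ti vs GA104 in the 3070):

    # 2080 Ti (TU102) vs 3070 (GA104) transistor counts, per public specs.
    tu102, ga104 = 18.6e9, 17.4e9
    print(f"{(tu102 - ga104) / tu102:.1%} fewer transistors")  # -> 6.5%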

Edit:

You're feeding into Nvidia's marketing tricks. They also said Ampere enjoys a 1.9x perf-per-watt improvement over Turing. We all know that is total BS under normal circumstances.

So let's take your claim: if there is a 50% improvement in the RT cores, show me a 50% improvement in framerates with RT on, because that's all that matters at the end of the day, not Nvidia's theoretical marketing numbers. In the games that have been tested, HU/Techspot showed a 10% improvement from the 2080 Ti to the 3080's RT acceleration:



I think you're comparing different things. He's talking about the absolute performance difference, while you seem to be talking about the performance hit for enabling raytracing. Just scaling up Turing yields big gains in raytracing performance without changing the fundamental performance tradeoff, which is what Techspot were talking about.
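Toy numbers (entirely made up) to show how both claims can be true at once: absolute RT framerate rises 50% while the cost of enabling RT stays identical:

    # Hypothetical fps: "Ampere" here is just "Turing" scaled up by 1.5x.
    turing = {"rt_off": 100, "rt_on": 60}
    ampere = {"rt_off": 150, "rt_on": 90}
    for name, g in (("Turing", turing), ("Ampere", ampere)):
        hit = 1 - g["rt_on"] / g["rt_off"]
        print(f"{name}: {g['rt_on']} fps with RT, {hit:.0%} hit")  # 40% both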
 
You're feeding into Nvidia's marketing tricks. They also said Ampere enjoys a 1.9x perf-per-watt improvement over Turing. We all know that is total BS under normal circumstances.

So let's take your claim: if there is a 50% improvement in the RT cores, show me a 50% improvement in framerates with RT on, because that's all that matters at the end of the day, not Nvidia's theoretical marketing numbers.


Certainly:

[Four benchmark charts: 2080 Ti vs 3080 framerates with raytracing enabled]

The performance uplift over a 2080 Ti is 40% on the low end and 50% on the high end, which is significant. Basically, the performance is enough for a locked 60 with a 3080 in all titles, while a 2080 Ti can't do it in any of them.
 
Certainly:

[Four benchmark charts: 2080 Ti vs 3080 framerates with raytracing enabled]

The performance uplift over a 2080 Ti is 40% on the low end and 50% on the high end, which is significant. Basically, the performance is enough for a locked 60 with a 3080 in all titles, while a 2080 Ti can't do it in any of them.
That's depressing. Not in a bad way, but everything is sold out and my 2080 Ti could be sold right now for more than a 3080's MSRP... but then I'd be GPU-less for a while. Maybe AMD will surprise us all and come out punching above its weight, but I won't hold my breath...
 

notseqi

Member
Who told you the prices would be very good? Got a link?
I don't have prices. I'm asking you: will AMD price themselves out of the game the second they have a product that can match their competitor's mainstream card?

I phrased my post wrong, though; I meant to refer to the cards we have gotten so far. No leaders, but okay for the price.
 
I think AMD will almost definitely be cheaper than the equivalent Nvidia card, but I think a lot of people expecting bargain-basement prices are in for a rude awakening.

I would predict they will still be close to Nvidia in pricing: $100 less at most, possibly $50 less at minimum. I suppose there is a Hail Mary chance of around $150 cheaper? But that seems unlikely.
 

notseqi

Member
I think AMD will almost definitely be cheaper than the equivalent Nvidia card, but I think a lot of people expecting bargain-basement prices are in for a rude awakening.

I would predict they will still be close to Nvidia in pricing: $100 less at most, possibly $50 less at minimum. I suppose there is a Hail Mary chance of around $150 cheaper? But that seems unlikely.
I went for a $650 prediction on the 6800XT somewhere around here.
 
I went for a $650 prediction on the 6800XT somewhere around here.

That sounds about right for my estimation too.

Seems like a pretty good value proposition when you think about it.

1. Equal to or slightly more powerful than the 3080 at 4K
2. Much more capable than the 3080 at 1440p
3. 6GB more VRAM
4. $50 cheaper
5. Probably lower power draw
6. Much better OC headroom/potential

I guess the main disadvantages would be slightly worse RT performance and no DLSS (although RedGamingTech did mention they have some kind of competitor, so we will see what happens). Not bad at all, especially given AMD's position in the GPU market over the last few years.
 
I hope AMD is preparing a viable DLSS alternative. The current 3080 stock situation is great for an upset comeback by AMD, but I think DLSS will be too important in the near future for me to invest in a card without it.
 
The internet had me almost convinced that AMD couldn't compete with Nvidia; it's pretty exciting to have competition in the sector. How much do y'all think the 6900XTX or whatever will be?

Yeah, there was a ton of "hurr durr AMD's top card will only be as powerful as a 3070 lololol" concern trolling going on from people — let's be honest, 90+% of them have no intention of buying an AMD graphics card regardless of performance, features, or price. Glad we can finally put that narrative to rest once and for all.

As for the price of the 6900XTX? Hard to say; it will depend on the performance gain over the 6800XT. If it matches or exceeds the 3090, then expect maybe $899-$1,000, which would still make it around $500 cheaper than a 3090 while having less RAM.

But maybe it will instead be a special edition of the 6800XT, so maybe +5% performance over the 6800XT? That would put it between the 3080 and the 3090.

If that is the case then maybe I could see $799? Hard to say exactly.
 

Kenpachii

Member
'Very good' being $50 less (*at best, according to Red Gaming Tech)?

That's quite a pill to swallow if you care about the upcoming AAA releases that will utilize both DLSS and ray tracing to a great degree. Given that it's an AMD-picked set of 4K titles, I'd say 5 wins for team red with 2 ties and 3 losses is probably optimistic. I play on a 43-inch 4K 120Hz monitor, so 4K, DLSS, and to a lesser degree ray tracing performance are all very important to me.

As a 1440p card, it might have a compelling argument to make. Wait for reviews. A lot of people on here might have a serious decision to make.

But for my use case, there's no way I choose what is likely going to be a wash in 4K rasterization performance to save MAYBE $50, when I can instead have a tech that preserves image quality at almost native-4K level while granting 30% performance gains.

It's all fun and dandy until you hit the 10GB VRAM limit and your card can't play the game in any decent way at 4K, while that 6800XT has zero issues playing the game, and at higher settings. This is something a lot of people don't realize. And if you game at 4K, you are pretty much the biggest VRAM consumer there is. I would avoid that 10GB model like the plague unless you are fine with current-gen titles and will buy another GPU anyway two years down the line.

That's how Nvidia works: Nvidia knows the bigger juggernaut next-gen titles will most likely appear about 1.5 years from now, perfectly in time for the higher-VRAM versions of the 4000 series. That 3080 will age like a wet noodle.

Nvidia is shook.
New GA102-based card incoming.

The RTX 3075 or RTX 3070 Ti looks like it will easily beat RDNA2 in RT workloads. [...]

This card could be priced exactly at the 6800XT's level, or Nvidia brings everything down a price point.


GPU performance isn't what makes the 6800XT interesting; it's the VRAM. Unless they double that VRAM (and kind of make the 3080 a joke in the meanwhile), nobody will care about it. Also, unless that 3075 is based on the 3080 die, it most likely won't come close to the 6800XT, assuming the 6800XT competes with a 3080. That card is aimed squarely at competing against a lower-tier card.
 

Armorian

Banned
Nvidia is shook.
New GA102-based card incoming.

The RTX 3075 or RTX 3070 Ti looks like it will easily beat RDNA2 in RT workloads. [...]

This card could be priced exactly at the 6800XT's level, or Nvidia brings everything down a price point.


Ampere doesn't scale that well with GPU cores, so this will be very close to the 3080.
 

ZywyPL

Banned
I hope AMD is preparing a viable DLSS alternative. The current 3080 stock situation is great for an upset comeback by AMD, but I think DLSS will be too important in the near future for me to invest in a card without it.

There is DirectML on the XSX, so technically they can run AI computations on their GPUs, so who knows.
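For reference, one way DirectML inference is exposed on Windows today is through ONNX Runtime's DirectML execution provider; a minimal sketch (the model file is hypothetical):

    # Sketch, assuming the onnxruntime-directml package is installed and a
    # hypothetical "upscaler.onnx" model exists on disk.
    import numpy as np
    import onnxruntime as ort

    # DmlExecutionProvider routes the graph to DirectML on the GPU.
    sess = ort.InferenceSession("upscaler.onnx",
                                providers=["DmlExecutionProvider"])
    frame = np.zeros((1, 3, 540, 960), dtype=np.float32)   # dummy input frame
    out = sess.run(None, {sess.get_inputs()[0].name: frame})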
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
GPU performance isn't what makes the 6800XT interesting; it's the VRAM. Unless they double that VRAM (and kind of make the 3080 a joke in the meanwhile), nobody will care about it. Also, unless that 3075 is based on the 3080 die, it most likely won't come close to the 6800XT, assuming the 6800XT competes with a 3080. That card is aimed squarely at competing against a lower-tier card.

Yeah, it's GA102-based because GA104 was pretty much tapped out.

It'll make the RTX 3070 a lot less attractive unless it too gets a price cut.
 

Kenpachii

Member
Yeah, it's GA102-based because GA104 was pretty much tapped out.

It'll make the RTX 3070 a lot less attractive unless it too gets a price cut.

Honestly, their greed has become their own enemy at this point. They should have just released the 3080 with 16GB of memory, and honestly nobody would have cared about those AMD GPUs. Now they've left a huge open crater for AMD to market themselves in. And there's nothing they can do about it, because fixing it would enrage the people who bought into the current 3080 one way or the other.
 

psorcerer

Banned
DLSS allows the imposing 3080 to render the full suite of raytracing effects at playable, high framerates while keeping crystal-clear image quality.

Developers allow that.

AMD and consoles are following in nvidia's steps.

That's why Ampere went back to being heavily compute-skewed, just like Vega; it took NV quite a long time to catch up with AMD...
 

SantaC

Member
Nvidia is shook.
New GA102-based card incoming.

The RTX 3075 or RTX 3070 Ti looks like it will easily beat RDNA2 in RT workloads. [...]

This card could be priced exactly at the 6800XT's level, or Nvidia brings everything down a price point.

Nvidia, the scumbag company
 

Malakhov

Banned
The biggest joke for me is the 3070 having just 8GB of memory. This is 2080 Ti performance level, and that was/is a 4K card. Yet a card at the same performance level has 3GB less memory.

The 'reviewers guide' from Nvidia preps the media for the VRAM issue: https://videocardz.com/newz/nvidia-geforce-rtx-3070-reviewers-guide-leaked
Yeah, but it's a 2080 Ti at $500. I don't like what Nvidia did with this launch, don't get me wrong, but the 3070 isn't part of that. I think it's a decent budget option so far, without considering AMD yet, since we will know in 3 days and I'm sure they'll cover the best budget option as well.
 
Yeah, but it's a 2080 Ti at $500. I don't like what Nvidia did with this launch, don't get me wrong, but the 3070 isn't part of that. I think it's a decent budget option so far, without considering AMD yet, since we will know in 3 days and I'm sure they'll cover the best budget option as well.

To be fair, Turing was so ridiculously priced that '$500' (will we actually be able to reasonably buy one at this price?) seems like a great deal. I mean, $1,200 for a graphics card will always be fucking obscene!
 

dEvAnGeL

Member
DLSS will be a definitive buying factor once it's a default option in all games. Very interested in what AMD is showing this coming week.
 

RedVIper

Banned
My store has 3080s unlike any other stores in the world.

But I can't say which store it is.

😂😂😂😂😂😂😂

How does the name of the store help anybody?

It's a local store that doesn't sell outside the country.

I'm not doxxing myself just so you guys can open the store's site and look at cards you can't buy anyway
 

thelastword

Banned
With all this talk of DLSS... the constant "AMD has to offer an alternative to DLSS or else"... it seems like déjà vu. They said the same thing about Navi 1.0 when the Super cards launched. People were hyped the same way about DLSS 1.0 as they are now about DLSS 2.0... and how many DLSS games do we have even after 2.0? Not many. Initially, Nvidia said DLSS 1.0 was its own form of AA thanks to AI; they said no TAA for us, no time for that blurry shit. Then when the comparisons came, the IQ on DLSS 1.0 was awful, with jaggies and shimmering everywhere, so Nvidia simply adopted TAA for DLSS 2.0 and then sharpened the snot out of the image, whilst using some AI to clean artefacts in the foreground. But there is still missing detail in backgrounds, as approximations are just that, approximations. Yet people call it the holy grail, even though a proper comparison has not been done on it yet. When more DLSS games are released, when some DirectML games are released, when CB 2.0/FidelityFX 2.0 is released, we will sit at this table and analyze properly. There are not enough games, there is not enough content to justify any hype or a proper breakdown of the techniques, as even AMD has not evolved their tech from CB 1.0; at least let them release their software first and do their reveal.

DLSS 1.0 was hyped even more than 2.0, and still only a few games... Raytracing on Turing was hyped just the same. People are only going to take stock of RT now, since AMD has made it viable on consoles and it's also a feature on their PC GPUs. Hype does not make anything stick in this industry. The reality is this: RT is going to take off now because of AMD, and when AMD announces its upscaling and reconstruction tech, that is what is going to be used predominantly by the industry, because it's the consoles that determine how popular specific gaming features get, by normalizing them. We have two consoles built from the ground up on AMD tech. Doing a faster reconstruction technique without using AI makes more sense for consoles; it would mean the technology would just work without the need for any third party or supercomputer or server farm, it would be innate to the console hardware and an easier implementation for devs, and of course easier adoption in games. Nvidia's proprietary nonsense has never really gone too far, and with AMD dominating CPUs and now clutching at the GPU teat with a fierce bite judging from the leaks so far, it's inevitable that their technologies and architecture will be prioritized over Nvidia's proprietary tech. Console tech is pretty much AMD tech, and that's 200 million gamers, so devs will use what's best for said architecture.

So don't worry about preliminary AMD RT performance; devs will learn how to utilize the strengths of how AMD cards do RT and get it to run even faster, because it's what's in the consoles too. The Infinity Cache is great for rasterization, and if it's being used a bit more when RT comes into the picture, devs will use that better too. AMD's approach to RT is still new and devs are getting their heads around it now; console devs are already doing great with it, even in cross-gen titles like Miles Morales, which happens to have a very impressive implementation of RT, and it will only get better. Devs will seek to get more RT performance on AMD hardware as the consoles launch, RDNA2 PC GPUs launch, and beyond. This is what devs will prioritize, not Ampere performance or DLSS 2.0, and you will see an uptick in the performance and quality of RT on consoles and RDNA2 GPUs as we go along.

AMD GPUs have more VRAM, the Infinity Cache, better GPGPU performance; the writing is on the wall. DLSS 2.0 is not Nvidia's savior, because there will be many solutions to combat it, from AMD and even from MS. Whatever becomes the most adopted go-to reconstruction technique will have very little to do with Nvidia. Stop hanging your hats on DLSS 2.0 as soon as you hear rumors that AMD has taken the rasterization performance crown.

Competition is a good thing and you guys should be glad, but we keep seeing threads like "AMD will not even match the 2080ti/3070" and tweets like "There's just no way RDNA2 gets close to RTX 3000 performance, if these numbers are real" (which Herkelman answered with an emoji), shortly after Jensen gave his spiel about 30TF cards without the caveat and spoke of an 8K 3090, while in no hurry to emphasize that this was with 1440p DLSS upscaling to 8K. So yes, stop hiding behind marketing speak; cards with that much TF power should ideally be doing 8K and 4K 120Hz, imo. You can't talk about 33TF and on the flip side use a cost-cutting DLSS to sell your higher res. I mean, if you are going to defend something, defend the company that is pushing the technology forward with higher clocks, Infinity Cache, and 7nm, which nets more performance per watt; a company that has given consoles some damn good CPU+GPU power at 300-350 watts in a small form factor; with no BS like the 4-core i7s we would still be stuck on had it not been for Ryzen. A company that's pushing open standards with Radeon Rays and FreeSync, whilst all the other company does is proprietary stuff to kill the competition and the industry, hog performance, and ask insanely high prices, because people are high on the Fellatio 1-0-Huang.

The industry is changing, and it's for the good. In a few years, Infinity Fabric will be in everything; chiplets will be in GPUs and CPUs. What were we expecting, pushing insane clocks on ever-smaller nodes forever? I guess if our intention was to limit core counts and maximize profit on 4-core i7s for another decade, that would be a sound plan... but I'm glad the industry is moving forward. Next gen, I'm all but ready to embrace multiple GPUs and CPUs on one die; I'm ready to embrace even more revolutions in this industry. Yet in the here and now, I can't deny who has got us here and provided this much-needed injection into the veins of this industry. We were stagnant for far too long, almost like we were pushing up daisies, but better days are here, that's for sure.
 
I kind of wonder sometimes if some of you guys have ever owned an Nvidia card. I completely understand price-to-performance from AMD. I get it, but what's the point of having fast rasterization but possibly worse raytracing performance and a worse DLSS alternative, for possible savings of only $50 USD? That's the big thing I don't understand: as you're already spending a fair bit of money, you might as well secure the best performance overall. At least that makes sense to me; just my opinion.


The performance king this gen could very well be AMD this time around, but let's not pretend that's how it is currently or how it has been over the past couple of years, so hating Nvidia like some of you guys do makes no sense. (And some of y'all I NEVER see in PC threads to begin with, go figure...)
 

Ascend

Member
I kind of wonder sometimes if some of you guys have ever owned an Nvidia card. I completely understand price-to-performance from AMD. I get it, but what's the point of having fast rasterization but possibly worse raytracing performance and a worse DLSS alternative, for possible savings of only $50 USD? That's the big thing I don't understand: as you're already spending a fair bit of money, you might as well secure the best performance overall. At least that makes sense to me; just my opinion.


The performance king this gen could very well be AMD this time around, but let's not pretend that's how it is currently or how it has been over the past couple of years, so hating Nvidia like some of you guys do makes no sense. (And some of y'all I NEVER see in PC threads to begin with, go figure...)
You want to know why I dislike nVidia? Ok...

My first nVidia card ever was an MX440. If you're old enough, I think that already might give you some indication as to why I was at least not happy with that particular card... Then there's this...;


And this


In the interest of keeping things at least a bit light-hearted... You know damn well what this refers to;


That should be more than reason enough... But....


(it only no longer applies for the ones with short memories)







But when people only care about their shiny new toy, I guess better cannot be expected. It's the same mentality as the people who are still buying shoes from certain companies despite it being well known that they use child labor and even human trafficking. But hey, as long as I get the best of the best... right? Saving $50 for a similar product by another company? Nah...

I have more against the people who blindly follow companies than against the companies themselves. Companies do what they have to do to maximize profits. It's the sheep following them that allow all the weird shenanigans. And if AMD is not kept in check, in a few years the same might apply to them. But at this point Intel is still arrogant, and so is nVidia. So let them have it. I said it before and I'll say it again: it's the ones blindly buying everything that enable nVidia to rip them off constantly.
 
With all this talk of DLSS... the constant "AMD has to offer an alternative to DLSS or else"... it seems like déjà vu. [...] We were stagnant for far too long, almost like we were pushing up daisies, but better days are here, that's for sure.

Did this guy get banned for a while? He's back spewing the same old nonsense. Just curious whether the hiatus was voluntary or imposed.
 

FireFly

Member
I kind of wonder sometimes if some of you guys have ever owned an Nvidia card? I completely understand price to performance from AMD. I get it, but what's the point of having fast rasterization and possibly worse performance in raytracing and DLSS alternative for a possible savings of only $50 USD? That's the big thing that I don't understand, as your already spending a bit of money, you might as well secure best performance over all. At least that makes sense to me, so my opinion.
There are a lot more dimensions to compare than just those two:

- Actual street pricing (e.g., in my country the cheapest 3080 you can pre-order is $911)
- Availability
- Power consumption
- Heat/noise
- Memory
- Resolution

For example, I'm building in a mini-ITX case that only fits two models of 3080, and people using those models still report high temps and fan speeds. I will be gaming on a 1440p monitor, so that's where performance advantages are most critical, but I will also need good rasterisation performance to power a Reverb G2 headset.
 
You want to know why I dislike nVidia? Ok...

My first nVidia card ever was an MX440. [...] It's the ones blindly buying everything that enable nVidia to rip them off constantly.

So AMD GPUs are made in America or Europe, where child labor is illegal? Do you own an iPhone or an Android? Wouldn't getting one of those devices be a sheep mentality as well? We can fault every company by that logic, and people will still buy from them.

My first car was a piece of shit, so I'm never going to buy from that company again? You do realize companies can improve, right? Nvidia has had disasters, just like AMD has. But I wouldn't go so far as to let a decades-old GPU be a huge determining factor today. That's a silly mentality to have, especially as they are the performance kings right now. If you want the best performance, what other option do you have as an enthusiast right now? A 5700XT won't cut it, and Intel isn't even in the discussion here.

There are a lot more dimensions to compare than just those two: [...]

For example, I'm building in a mini-ITX case that only fits two models of 3080, and people using those models still report high temps and fan speeds. [...]
Do we know the dimensions of the new 6XXX boards yet? I'm not a fan of any of the 3-slot cards out currently and hope this isn't an ongoing trend.
 

Malakhov

Banned
To be fair, Turing was so ridiculously priced that '$500' (will we actually be able to reasonably buy one at this price?) seems like a great deal. I mean, $1,200 for a graphics card will always be fucking obscene!

Only the few who manage to snatch one on day one; then it's scalping time for us peasants, but I refuse to give them my money.

How does the name of the store help anybody?

It's a local store that doesn't sell outside the country.

I'm not doxxing myself just so you guys can open the store's site and look at cards you can't buy anyway
Yeah, by mentioning a local store you will be doxxed. God forbid another local GAFer could go there as well. Total BS; there are no cards lying on the shelves in any store, scalpers make sure of that.
 

Ascend

Member
So AMD GPUs are made in America or Europe, where child labor is illegal? [...] If you want the best performance, what other option do you have as an enthusiast right now? [...]


Do we know the dimensions of the new 6XXX boards yet? [...]
I gave you my reasons. You don't have to agree. They are MY reasons, and you're free to have yours.
 

RedVIper

Banned
Yeah, by mentioning a local store you will be doxxed. God forbid another local GAFer could go there as well. Total BS; there are no cards lying on the shelves in any store, scalpers make sure of that.

If I said the store, everybody would know what city I live in.

If another GAFer lives here, they can find the store with a Google search. Fuck off.
 

evanft

Member
With all this talk of DLSS... the constant "AMD has to offer an alternative to DLSS or else"... it seems like déjà vu. [...] We were stagnant for far too long, almost like we were pushing up daisies, but better days are here, that's for sure.

Hope she sees this bro.
 