
Review Nvidia GeForce RTX 2060/2070 Super Review Thread. New $399/$499 GPU King.

JohnnyFootball

The Last of Us may be third person, but it is hardly third person.
Jan 20, 2014
Let customers decide what they want to buy when the reviews go up. RTX on these cards is a joke; ray tracing isn't ready for primetime and is at least 1.5-2 years away from being done properly.

I was thinking about AMD's slides yesterday and felt I should do some extrapolations, but HardwareCanucks beat me to it. In any case, the main point I've made in prior Super threads is that people should hold out for actual benchmarks. AMD used the best API for the RTX cards and the best API for the Navi cards: if they tested Metro in DX12 and found that Navi performed better in DX12 while RTX performed better in DX11, they used whichever API gave each card its best average. Since Navi won most of the benches over RTX, some by more than 20%, it could mean that under a common API, RTX would fall even further behind in those titles.

It only makes sense to compare API to API to see the strengths of Navi's new RDNA architecture: how it performs in older DX11 titles where Vega did worse, like Fortnite and Overwatch, and how much better the new CU architecture performs in titles Radeon already excelled in, like Dirt 2, Strange Brigade, the Battlefields, Kingdom Come, COD, the Forzas, the DMCs, the Resident Evils, etc. People who think Super has dusted the 5700 are out of their minds and just FUDding.

Also, people want to compare the $380 and $450 Navi cards against the $400 and $500 Super cards, and that's fine, but they're forgetting that the $500 5700 XT Anniversary Edition exists, with higher clock speeds. So it would make sense to compare the Anniversary 5700 XT to the RTX 2070 Super as well.

API to API, it should be an interesting comparison come the 7th.
Correct. The 2060 and 2070 can at best provide a preview of where titles could go with RTX, but they're not remotely close to giving anyone a good gaming experience; the performance hit isn't worth what you have to give up with RTX enabled.
I'm sorry, but I simply don't believe most PC gamers would choose RTX-enabled 1080p/30-60 fps when in a non-RTX scenario they could get 1440p/60 fps. And don't even get me started on the utter failure that DLSS has been.
Even the 2080 Ti, which I own, struggles with RTX enabled, although most RTX-enabled games are playable at 1440p.
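The API-selection point in the post above can be sketched with toy numbers. These fps values are purely hypothetical (not real benchmark results); the point is only to show how letting each card run its best API changes the averages versus locking both to a common API:

```python
# Toy illustration of "best API per card" vs "common API" averaging.
# All fps numbers below are made up for demonstration purposes.

bench = {
    # title: {card: {api: fps}}
    "Metro": {"Navi": {"DX11": 70, "DX12": 82}, "RTX": {"DX11": 78, "DX12": 74}},
    "Forza": {"Navi": {"DX11": 95, "DX12": 110}, "RTX": {"DX11": 104, "DX12": 98}},
}

def avg_best_api(card):
    """Average fps when each card picks whichever API runs fastest for it."""
    return sum(max(apis[card].values()) for apis in bench.values()) / len(bench)

def avg_common_api(card, api):
    """Average fps when every card is locked to the same API."""
    return sum(apis[card][api] for apis in bench.values()) / len(bench)

print(avg_best_api("Navi"), avg_best_api("RTX"))      # each card at its own best API
print(avg_common_api("Navi", "DX12"), avg_common_api("RTX", "DX12"))  # both on DX12
```

With these toy numbers, the best-API method shows a 96 vs 91 average, while a common DX12 run shows 96 vs 86: the gap between the cards widens, which is exactly the kind of shift the post is speculating about.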
 

JohnnyFootball

The Last of Us may be third person, but it is hardly third person.
Jan 20, 2014
Absolutely false. DLSS is the laughing stock of recent graphical features; it's a meme at this point. There are countless videos showing what a farce DLSS is, and it's the reason nobody mentions it but Nvidia, to skew comparisons against Pascal. You get a worse, more aliased, blurrier image with DLSS. Metro used sharpening to mask the blur, but it's still awful. It's no comparison to native resolution, and besides, it has been shown that ordinary upscaling resolves much more detail and is sharper than DLSS.

Checkerboard rendering is several classes above DLSS, and so is FidelityFX, if you happened to watch AMD's Unity Engine presentation.
No shit. DLSS has been an epic fail. Simply dropping the resolution looked better and even performed better than enabling DLSS at a higher resolution.

It's a shame, too. I remember a lot of gamers were far more excited about the potential of DLSS than RTX.
 

Pimpbaa

Member
Jun 8, 2004
Checkerboarding is light years more impressive than DLSS has been.
Considering the silicon spent on DLSS, it should have been the reverse. I don't know how they fumbled it so badly. The temporal injection used by Insomniac looks more impressive than DLSS too.
 

thelastword

Gold Member
Apr 7, 2006
8,547
2,656
1,505
No shit. DLSS has been an epic fail. Simply dropping the resolution looked better and even performed better than enabling DLSS at a higher resolution.

It's a shame, too. I remember a lot of gamers were far more excited about the potential of DLSS than RTX.
I remember all too well. The ramp-up on DLSS started when NV fans realized that the RTX cards had launched with no RTX games, so they tried to save face by pinning their hopes on DLSS, because they thought it would be much easier to implement (which makes sense) and that many of their upcoming games could have it. That wasn't the case, though. Nvidia showed a long list of compatible DLSS games at their CES presentation, so when people asked "where are the RTX games?", NV fans argued that DLSS is an RTX feature: even with no ray-traced games yet, the DLSS spring of milk and honey would arrive fast and furious in their games. Didn't you see the list? Ha, clearly it didn't pan out. We waited and waited, and when the first DLSS game finally arrived to much anticipation, it was awful; it was crucified by the same media that had showered it with hype before release. Even in Metro, after a few DLSS games had already come and gone, it was the same awful story, so the Metro devs responded to the negative feedback by sharpening the hell out of DLSS in that game. Yet with all the lost detail now more exposed, its shortcomings are even more visible. In essence, native and traditional upscaling look miles better. There's no way Leadbetter, even without his spectacles, can't tell the difference.

Now I get it: when I saw the horrendous results with hybrid ray tracing on the RTX 2060, 2070, 2080, and in some cases even the 2080 Ti, I said that if DLSS were worth its salt, perhaps these RTX gamers could use it to lower the rendering cost and still have sharp, excellent IQ while increasing frames. I had my own reservations about such a solution, though, since these RTX cards cost an arm and a leg at checkout. My take was that such expensive cards should have the grunt to push the resolution and frames with RTX, after everything Jensen promised with hardware ray tracing, holding up supercomputers that cost lottery prizes and comparing them to retail Turing. In retrospect, it is clear that Jensen was only hyping Turing against much more expensive parts to sell a half-baked, not-ready-for-primetime feature for $1,200 at the high end. He had to pitch it, innit?

Still, I was looking to see how effective a solution DLSS was, and boy, the world was not ready for the absolute mess that DLSS is. Jensen talked a huge game on both features, and when buyers didn't get their RTX games, or when their RTX games absolutely massacred frames and resolution, they latched onto DLSS as the savior that would take them out of Babylon...

Only to find out that Jensen Huang had just sold them a used Camry at a Ferrari dealership.

BAIT and SWITCH.


 

Chunk Loves Sloth

Formerly 'JareBear'
Nov 5, 2016

These guys do a good job with pre-built systems.
I've definitely had positive experiences with DigitalStorm. Highly recommended for people who don't build it themselves.
 

JohnnyFootball

The Last of Us may be third person, but it is hardly third person.
Jan 20, 2014
I've definitely had positive experiences with DigitalStorm. Highly recommended for people who don't build it themselves.
I highly recommend everyone build it themselves. A gaming PC is 10,000,000,000,000X more fun when you build it yourself.

Seriously, I almost buy new hardware just for the thrill of PC building.

 

Kenpachii

Member
Mar 23, 2018
I remember all too well. The ramp-up on DLSS started when NV fans realized that the RTX cards had launched with no RTX games...


That's why you never invest in first-gen tech; wait a generation.
 

thelastword

Gold Member
Apr 7, 2006
That's why you never invest in first-gen tech; wait a generation.
That's fine, but tell that to someone who just bought a 2080 for $700 and now sees similar performance from a 2070 Super for $200 less. Nvidia just does not care about customers.

I'm pretty sure there's a reason AMD has not launched a 5800 or 5900 yet: they just launched the Radeon VII, and even though it could be argued that the Radeon VII has its own niche in the market for content creation plus gaming, they still felt it best not to eclipse its value so soon after launch.
 

Cheezeus

Member
Oct 24, 2017
All I care about is having the best card for 4K/60 fps. Price is irrelevant (within reason). As long as Nvidia is the one providing that, I'll keep going Nvidia. If AMD wants my money, they need to compete better at the highest end.
 

CurryPanda

Will restrain his sexual perversions
Mar 4, 2019
All I care about is having the best card for 4K/60 fps. Price is irrelevant (within reason). As long as Nvidia is the one providing that, I'll keep going Nvidia. If AMD wants my money, they need to compete better at the highest end.
Intel is getting into the GPU market next year. Things are heating up, and new GPU competition is always welcome.
 

Remij

Member
Apr 23, 2009
I love how nothing is relevant until AMD finally gets around to supporting some knock-off of it 1.5-2 years later. :messenger_grinning_squinting:

SLI sucks! (YAY Crossfire!)
Hairworks sucks! (YAY TressFX!)
Gsync sucks! (YAY Freesync!)
Shadowplay sucks! (YAY Relive!)
Ray Tracing sucks! (YAY Hybrid Ray Tracing!)
Freestyle filters suck! (YAY AMD FidelityFX!)

:messenger_tears_of_joy:

What AMD fanboys don't understand is that AMD doesn't need GOOD GPUs, they need GREAT GPUs, and these 5700-series cards aren't it. They aren't impressive at all. These 7nm cards should be smoking Nvidia's 12nm cards, but they aren't. What an accomplishment: comparing a 2019 architecture to a 2018 architecture that still has more features and better performance. It's embarrassing. Their main damage control so far is that AMD "could potentially cut prices by up to $100." As if Nvidia couldn't? We can all laugh at that, because we know Nvidia won't, because they don't have to. People will buy these over the AMD parts regardless.

OK AMD, go ahead and drop the price by $100 and make less money while Nvidia responds by dropping the price of their cards JUST enough to take the wind out of AMD's sails again. Then, by the time AMD's "Fine Wine™ technology" begins to work its magic, Nvidia hits them with new 7nm EUV GPUs and embarrasses them again.

The only good thing about AMD GPUs is that they have the potential to bring Nvidia prices down, or increase Nvidia's performance per dollar. That's what gets most people interested in AMD GPUs.
 

Nydus

Member
May 6, 2014
No shit. DLSS has been an epic fail. Simply dropping the resolution looked better and even performed better than enabling DLSS at a higher resolution.

It's a shame, too. I remember a lot of gamers were far more excited about the potential of DLSS than RTX.
I liked DLSS in Metro and Anthem. I didn't see much difference, but performance with ray tracing was much nicer. But I played those games at 1440p for high fps, not on my TV, so maybe it's much worse at 4K.
 

CurryPanda

Will restrain his sexual perversions
Mar 4, 2019
I love how nothing is relevant until AMD finally gets around to supporting some knock-off of it 1.5-2 years later...
That's why I'm glad Intel is going to enter the race. They'll put some fire under Nvidia's butt.
 

JohnnyFootball

The Last of Us may be third person, but it is hardly third person.
Jan 20, 2014
I liked DLSS in Metro and Anthem. I didn't see much difference, but performance with ray tracing was much nicer. But I played those games at 1440p for high fps, not on my TV, so maybe it's much worse at 4K.
You are far, far better off just dropping the resolution and disabling DLSS.

DLSS basically looks like Vaseline was smeared on the screen.
 

JohnnyFootball

The Last of Us may be third person, but it is hardly third person.
Jan 20, 2014
That's why I'm glad Intel is going to enter the race. They'll put some fire under Nvidia's butt.
I wouldn't get your hopes up, since it's being led by Raja Koduri. His track record at AMD was average at best.
 

xPikYx

Member
Jun 23, 2019
I liked dlss in metro and anthem. Didn't see much difference but performance with raytracing was much nicer. But I played those games at 1440p for high Fps and not on my tv. So maybe it's much worse at 4k.
The thing is, people don't realize 4K is useless for most of them unless they play on monitors/screens 50 inches and above. I personally purchased a 2080 with a custom waterblock only to play 1080p ray-tracing titles; for me, ray tracing improves the IQ of a 3D scene far more than a crazy-high resolution like 4K. 1080p is more than sufficient on a 27-inch screen.

 

JohnnyFootball

The Last of Us may be third person, but it is hardly third person.
Jan 20, 2014
The thing is, people don't realize 4K is useless for most of them unless they play on monitors 50 inches and above...

I've seen that video plenty of times, and it's one of the few where I disagree with Linus. I have a 27-inch monitor and I can certainly see the difference between 4K and 1440p. With AA applied, though, 1440p looks close to 4K at that screen size.
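The pixel-density arithmetic behind this 27-inch debate is easy to check with the standard PPI formula (pixel diagonal divided by physical diagonal); the screen size is taken from the post above:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a given resolution and panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f'{name}: {ppi(w, h, 27):.0f} PPI on a 27" panel')
```

This works out to roughly 82, 109, and 163 PPI respectively, so 4K doubles the pixel density of 1080p on the same panel; whether the jump from 109 to 163 PPI is visible depends on viewing distance and eyesight, which is exactly where the disagreement above comes from.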
 

Kenpachii

Member
Mar 23, 2018
That's fine, but tell that to someone who just bought a 2080 for $700 and now sees similar performance from a 2070 Super for $200 less...
The 2000 series was always overpriced; everybody knew this, and everybody knew the performance was a joke. Tech site after tech site reported on it, and anybody who asked was told so. As for the stubborn people who bought a 2080 for 700 bucks: they will play games through all of next gen without much issue, and at high-end framerates, without problems. That extra x% of performance from a Super isn't going to change that.

Nvidia only cares about your money; anybody who has read anything over the last few years about their anti-consumer practices knows this. The company is a garbage one.

Sadly, they do tend to make good products that work, and they have no real competition in their own space.

Sorry, but sympathizing with people who buy first-gen tech, spend a metric ton on GPUs that were simply laughed at on launch by everybody on Earth, and basically helped Nvidia set higher prices in stone for the future? That earns zero sympathy from me.

You can thank all those clowns for a 2060 Super at a price point of 400 bucks.
 

xPikYx

Member
Jun 23, 2019
This, unfortunately, is what happens when you don't have competitors: you can do whatever you want.
 

xPikYx

Member
Jun 23, 2019
I've seen that video plenty of times, and it's one of the few where I disagree with Linus...
Well, the thing is, everybody is different, simple as that; what's worth it for me is not for you, etc. From my point of view, I do agree with Linus. Indeed, I bought a Pascal 1080 to play games at 1080p and upgraded to a 2080 only because of an issue I had with my PC and GPU, so I was constrained, even though I don't regret it. If I hadn't had that issue, I would have waited for the Ampere GPUs; normally I consider buying a new GPU every two generations.
 

xPikYx

Member
Jun 23, 2019
Intel will come next year and bring some heat. Should be interesting imo.
I really hope so. The leaks have been quite interesting so far, though not as many as the consoles', and that's a shame. Next year will be one of the most interesting: new consoles, brand-new Intel discrete GPUs, new Nvidia Ampere GPUs, and high-end Navi cards with ray-tracing acceleration.
 

CurryPanda

Will restrain his sexual perversions
Mar 4, 2019
I really hope so. The leaks have been quite interesting so far...
Yep, Ampere on 7nm should be very interesting, and the Intel GPU should be too.
 

JohnnyFootball

The Last of Us may be third person, but it is hardly third person.
Jan 20, 2014
Well, the thing is, everybody is different, simple as that; what's worth it for me is not for you...
Bump it up to 1440p; you will appreciate the difference. So much potential is being wasted by limiting yourself to 1080p with a 2080.

If your monitor won't support 1440p, use supersampling.
 

xPikYx

Member
Jun 23, 2019
I purposely chose a 1080p/144Hz monitor with G-Sync for that purpose; I use supersampling where applicable, but I haven't needed it so far.
 
Feb 17, 2005
Still in no hurry to upgrade my 970, as performance is good enough for me at 1080p for what I play, and I'm not fussed about ray tracing.

I'll consider upgrading once I can't play some next-gen stuff at 1080p at a stable 30fps with decent settings (i.e. mostly high at least). I'd probably look at whatever 2x70 series is out versus a year-old one at that time and find a price/performance sweet spot.
I feel the same. I do want Ray Tracing eventually, but for now I'll be just fine with my stock 1070.
 

JohnnyFootball

The Last of Us may be third person, but it is hardly third person.
Jan 20, 2014
I feel the same. I do want Ray Tracing eventually, but for now I'll be just fine with my stock 1070.
A 1070 is still a solid card and can do most current games at medium-to-high settings at 1440p. I have a feeling that once Ampere comes out (or if AMD just shocks the world), I'll hate having spent the mega $$ on a 2080 Ti.
 

Pagusas

Member
Jun 9, 2006
I'm just going to quietly wait for Cyberpunk and see how my 1080 Ti handles it. Hopefully AMD or Nvidia will have better cards out by then that can handle Cyberpunk's ray-tracing settings at 80%+ scaled 4K, but I truly doubt they will.
 

BusierDonkey

Member
Sep 21, 2018
The thing is, people don't realize 4K is useless for most of them unless they play on monitors 50 inches and above...

This is why I find the hype building for 8K so funny. I have a 40" 4K monitor sitting about 3 feet in front of my face, and I'm in a rare situation where 1440 and 2160 are visibly discernible from each other. 1440 is considerably softer in detail when viewed from that close, and 1080 is rough. That said, when I'm faced with the choice between 4K or a locked 60, I'll bump the resolution down every time. People on a 21"-27" monitor should be aiming for 1080 at a high FPS, not resolution. I went the route I did because I like the real estate when I'm not running a game: I can have a video open, be reading GAF :messenger_sunglasses: , and have Krita or PS running on half the screen, and every quarter of it is as big as the 20" monitor I ran years ago. The downside is that I'm limited to 60Hz. If I ran a second monitor just for gaming, or didn't spend a lot of time reading, drawing, and watching stuff, I would have gone with something I could OC that focused entirely on FPS.
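The claim above (that 1440 vs 2160 is visible on a 40" panel from about 3 feet) lines up with a back-of-envelope angular-resolution check. This sketch assumes a 16:9 panel, a 36-inch viewing distance, and the commonly quoted figure of roughly 1 arcminute for 20/20 visual acuity:

```python
import math

def arcmin_per_pixel(width_px, height_px, diagonal_in, distance_in):
    """Angle subtended by one pixel, in arcminutes, at a given viewing distance."""
    diag_px = math.hypot(width_px, height_px)   # screen diagonal measured in pixels
    pixel_pitch = diagonal_in / diag_px         # physical size of one pixel, inches
    return math.degrees(math.atan2(pixel_pitch, distance_in)) * 60

# 40" panel viewed from about 3 feet (36 inches)
for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: {arcmin_per_pixel(w, h, 40, 36):.2f} arcmin per pixel")
```

At that distance, 4K pixels come out just under 1 arcminute while 1440p pixels are well above it, so a viewer with good eyesight plausibly can tell them apart, while on a smaller or more distant screen the difference shrinks below the acuity threshold.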
 

Krappadizzle

Member
Oct 4, 2011
I'm just going to quietly wait for Cyberpunk and see how my 1080 Ti handles it...
This is my plan too. I'm hopeful there will be a new GPU out by then that can give me great ray-tracing performance in Cyberpunk. I was going to upgrade if they had intentions of releasing a 2080 Ti "Super" or whatever, but since they don't, there's no reason to upgrade the 1080 Ti.
 

cyber69

Member
Sep 21, 2018
Nvidia is just being reactionary and could well release something that would absolutely curb-stomp AMD.
 

Nydus

Member
May 6, 2014
You are far, far better off just dropping the resolution and disabling DLSS.

DLSS basically looks like Vaseline was smeared on the screen.
I looked really hard, but the differences were almost nonexistent during gameplay. Maybe in screenshots, but in motion it was fine. (Not perfect, but OK.)

The thing is, people don't realize 4K is useless for most of them unless they play on monitors 50 inches and above...

I only use 4K on my 65-inch TV. My 1440p monitor is 32 inches; 1440p is still good enough for me there, but at that size I can easily see the difference from 4K when a game uses bad AA.
 

sinnergy

Member
Jun 16, 2007
I watched a video that explained that Nvidia does hidden optimizations in compression that developers have no control over, and the pixel quality is actually lower than on AMD cards; that's why they are almost always faster...

Actually pretty smart, as general customers will think faster is always better.
 

thelastword

Gold Member
Apr 7, 2006
I watched a video that explained that Nvidia does hidden optimizations in compression that developers have no control over, and the pixel quality is actually lower than on AMD cards; that's why they are almost always faster...

Actually pretty smart, as general customers will think faster is always better.
Yes, I did a thread on that some time ago. The thing is, Nvidia uses a lot of compression in its drivers; they compress textures quite a bit. As it relates to image quality, they reduce the image so much that at times you can get an uplift of 25% in performance. Of course, if you've never owned an AMD card you won't notice it, but if you have, you will notice that games generally have better colors, less texture compression, and more resolved detail on AMD cards.


So when people say an Nvidia TF is better, it's not; no TF is "better", a teraflop is a teraflop. Nvidia just uses more tricks under the hood that trade image quality for performance. If AMD did the same with their higher-TF cards, they would have much more performance over Nvidia in DX11 games, and in the games Radeon already wins, the gap would stretch even further.
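For what a "TF" actually measures: theoretical FP32 throughput is just shader count times clock times 2, since a fused multiply-add counts as two floating-point operations per cycle. A quick sketch with the published boost clocks (quoted from memory, so treat the spec numbers as approximate):

```python
def fp32_tflops(shaders, boost_mhz):
    """Theoretical FP32 peak: shaders x clock x 2 ops (FMA) per cycle."""
    return shaders * boost_mhz * 1e6 * 2 / 1e12

# Approximate published specs (boost clocks; sustained game clocks are lower)
cards = {
    "RX 5700 XT":     (2560, 1905),
    "RTX 2070 Super": (2560, 1770),
}
for name, (shaders, mhz) in cards.items():
    print(f"{name}: {fp32_tflops(shaders, mhz):.2f} TFLOPS")
```

This is why raw TFLOPS comparisons across vendors say little on their own: the figure is a theoretical ceiling, and how much of it turns into frames depends on the architecture, the drivers, and exactly the kinds of under-the-hood trade-offs being argued about above.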
 

Evilms

Member
Dec 16, 2018
So when people say an Nvidia TF is better, it's not; no TF is "better"...
Or simply, Nvidia's tech and architectures are better than AMD's.

Besides, it's not for nothing that on PC it's been like this for years:

 

Ascend

Member
Jul 23, 2018
I love how nothing is relevant until AMD finally gets around to supporting some knock-off of it 1.5-2 years later...
Although the second part of your post is true, the first part is pure nonsense. In reality it's the exact opposite: everything nVidia does is praised, and everything ATi/AMD does is neglected.

The latest example?
AMD refreshes cards: "Ugh, another refresh/rebrand. AMD doesn't have anything."
nVidia refreshes RTX cards to Super: "Wow, these are much better!"

nVidia has a lot more mindshare, and even when some AMD fanboys make ridiculous claims, it has little impact compared to nVidia's, simply because everyone has green glasses on.
 
Jul 29, 2013
I'm most curious to see an overall performance summary for an overclocked 2060 Super versus an overclocked 5700 Pro, as these cards are at the top end of what I'm willing to pay. Not interested in your blowers, AMD.
 

DrCheese

Member
Jul 15, 2013
Although the second part of your post is true, the first part is pure nonsense. In reality it's the exact opposite. Everything nVidia does is praised, and everything ATi/AMD does is neglected.

Latest example?
AMD refreshes cards: Ugh another refresh/rebrand. AMD doesn't have anything.
nVidia refreshes RTX cards to Super: Wow these are much better!

nVidia has a lot more mind share, and even if some AMD fanboys make ridiculous claims, it has little impact compared to nVidia, simply because everyone has green glasses on.
But they don't have anything - Whatever they release is worse than whatever Nvidia already has out on the market & costs the same/more - No one is going to get excited that AMD's 2019 cards are as good as Nvidia's 2016 cards. Who cares? We've had that performance for years now.

Nvidia's new releases are better because they are simply faster than anything else at that price point.
 

Ascend

Member
Jul 23, 2018
- No one is going to get excited that AMD's 2019 cards are as good as Nvidia's 2016 cards. Who cares? We've had that performance for years now.
The 2060 Super and 2070 Super also are as good as nVidia's 2016 cards. Why do they get a pass? Just because they're nVidia? Yeah... Green glasses.

Nvidia's new releases are better because they are simply faster than anything else at that price point.
Tell that to the GTX 1650 vs the RX 570.
 

Remij

Member
Apr 23, 2009
Although the second part of your post is true, the first part is pure nonsense. In reality it's the exact opposite. Everything nVidia does is praised, and everything ATi/AMD does is neglected.

Latest example?
AMD refreshes cards: Ugh another refresh/rebrand. AMD doesn't have anything.
nVidia refreshes RTX cards to Super: Wow these are much better!

nVidia has a lot more mind share, and even if some AMD fanboys make ridiculous claims, it has little impact compared to nVidia, simply because everyone has green glasses on.
No. The first part is also true. Nvidia constantly gets shit for everything, AMD gets praised for coming up with some open source solution 1-2 years later. It doesn't take much more than being cheap to win the minds of those people. Nothing is important until they can afford it.. that's what it seems like.

AMD refreshes cards out of desperation. Nvidia refreshes cards to stick it to their competition. That's the difference. And since people want Nvidia cards, the new cards are a better value than the old cards they replace... so the fans who waited are happy and buy them. Meanwhile AMD fans try to convince themselves they are happy and that FineWine exists and will be coming for them soon. lol

I understand the mindshare thing.. you're absolutely right. Nvidia has a lot more. But nobody that complains about AMD fanboys thinks that their comments really affect Nvidia... because it's quite clear that despite talking a lot of shit, tons more people buy Nvidia GPUs... because they want the best shit. But that doesn't give AMD fanboys a free pass to make completely stupid claims... so I'll call it out when I see it.
 

Pagusas

Member
Jun 9, 2006
Prosper, Tx
Nvidia is just playing reactionary and could possibly release something that would absolutely curb stomp AMD.
Why would Nvidia bother? They want AMD in the world, just like Intel wants them in the world. They need AMD to exist to keep regulators off their backs.

AMD is like that team that always plays the Globetrotters. This is basically the Nvidia/AMD relationship:
 

Ascend

Member
Jul 23, 2018
No. The first part is also true. Nvidia constantly gets shit for everything, AMD gets praised for coming up with some open source solution 1-2 years later. It doesn't take much more than being cheap to win the minds of those people. Nothing is important until they can afford it.. that's what it seems like.
Really?

When the RX 480 had a (non-)issue with power consumption through the PCI-E slot, it was a huge deal that made the card so-called dangerous and unreliable.
When the GTX 970 deliberately deceived nVidia customers regarding its usable memory size, it was still a great card to buy anyway.

When the GTX 970 had 3.5GB of usable memory and was a competitor of the R9 390 with 8GB, the memory on the GTX 970 was fine.
When the R9 Fury X with 4GB was a competitor to the 980 Ti with 6GB, the Fury X had too little VRAM.
(in before "Bbbbut 1440p!!!!")

When AMD had the superior product in terms of power consumption in the late 2000s, only the speed mattered.
When AMD had the best speed with the R9 290X release, their power consumption mattered.

When AMD had driver faults, those were a deal breaker.
When nVidia had drivers that killed their cards, they're still good enough to buy.

When AMD brought FreeSync, they were just copying nVidia and having an inferior version of G-sync.
When nVidia started supporting FreeSync, nVidia is great for doing so.

When nVidia brings out overpriced cards, they are justified in doing so because features, or speed.
When AMD brings out equivalently performing & priced cards (5700, 5700XT), they are overpriced because AMD can never be cheap enough.

When nVidia brought out RTX, everyone saw it as an overpriced gimmick.
When AMD releases cards less than a year later without ray tracing, suddenly everyone wants RTX.

And the list goes on and on. The sad thing is that the majority are not even aware that they are doing it. And when reporters, reviewers and tech enthusiasts fall into this category, it's a truly sad time for gamers. People see nVidia as the default without realizing it. At this point, AMD can bring out a $400 card performing like a 2080 Ti, and people will find some excuse not to buy it. Just like the RX 570 is now cheaper than a 1050 Ti and gets you two additional games, and yet, Steam is littered with 1050 Ti cards and barely any RX 570s to be seen. Just like the Vega 56 has been $300 for a long while and no one recommends it despite being the best value card in its price range. [sarcasm]All of this is rational consumers picking out the best options, I presume?[/sarcasm]

The goal posts shift every time in favor of nVidia. It's nice and all what you said about nVidia being trashed, and that might be true, but nVidia never feels it in their wallets... AMD feels it constantly, even when they don't deserve it... THAT is the big difference. RTX was the first time nVidia felt anything in terms of sales within the last decade.

AMD refreshes cards out of desperation. Nvidia refreshes cards to stick it to their competition. That's the difference.
Kind of hard to do R&D when everyone buys the competition no matter what you do. I mean, look at right now. Everyone is dissing the 5700(XT) cards without reviews being out, and nVidia, the company that jacked up prices through the roof, is once again being praised and getting a free pass for releasing something at the price they should have released it at in the first place. How does that qualify as "nVidia getting shit for everything"? And this is yet another example of nVidia getting a free pass. A refresh is a refresh. But I guess we can't see it that way when we're biased to one side.

And since people want Nvidia cards, the new cards are a better value than the old cards they replace... so the fans who waited are happy and buy them.
That was not what happened with RTX, but apparently that has already been forgotten. People have REALLY short memories, it seems... It's like the price being $500 today, a card releasing tomorrow at $1000, and then dropping to $750. Guess what. You're still being screwed, because it's still $250 more expensive.

Meanwhile AMD fans try to convince themselves they are happy and that FineWine exists and will be coming for them soon. lol
There's a great chance Navi will have something similar to the HD7000 series in terms of longevity, simply because it's a new architecture. No one in their right mind is really happy with AMD right now regarding anything above the Vega 56, although that might change with the 5700. We'll see. Anyone that buys under $300 would be crazy not to consider the Vega 56 or the RX 570.

I understand the mindshare thing.. you're absolutely right. Nvidia has a lot more. But nobody that complains about AMD fanboys thinks that their comments really affect Nvidia... because it's quite clear that despite talking a lot of shit, tons more people buy Nvidia GPUs... because they want the best shit. But that doesn't give AMD fanboys a free pass to make completely stupid claims... so I'll call it out when I see it.
Neither does it give nVidia fanboys a free pass, simply because everyone wants to buy nVidia at a certain point in time. And the reverse is sadly not true... The nVidia fanboy talk DOES hurt AMD. To this day, there are people complaining about why AMD always has bad drivers, despite it being shown that AMD drivers are no less stable... And they have this reputation even though it's no longer true and hasn't been for years.

Lastly, leaving this here...

 