
NVIDIA Next Generation GPUs With Up To 7552 Cores Benchmarked 40% Faster Than TITAN RTX

Kenpachii

Member
LOL you guys are morons!!

Control's models are butt ugly; you clearly didn't play the game then. Honestly it's even a downgrade from their last game, as they use the same piss-poor engine with ray tracing slapped on top to sell it. It runs just as badly.

I tried to take a picture of his face, but it blurs out with the tool I use, so it's hard, but all the detail is in his face, and no, this isn't a cutscene face. AC Odyssey's models are incredibly detailed and shit all over Control's, with and without ray tracing.

Control is straight-up ugly without ray tracing because they massively nuked the visuals of the non-ray-traced path. When I bought the game on launch day and booted it up, I genuinely thought the game was broken.



3d2518043edb9e399f6a57ce4ea6a694.jpg




Here are ultra settings with ray tracing, by the way. Even the floor's lighting is way off without ray tracing; everything looks grainy and totally dog shit, on purpose. The game is butt ugly because it was meant to sell ray tracing cards, as it was heavily sponsored by NVIDIA. Sadly nobody cared, which is why keys are already selling for a third of the price.
 
Last edited:

pawel86ck

Banned
Control's models are butt ugly; you clearly didn't play the game then. Honestly it's even a downgrade from their last game, as they use the same piss-poor engine with ray tracing slapped on top to sell it. It runs just as badly.

I tried to take a picture of his face, but it blurs out with the tool I use, so it's hard, but all the detail is in his face, and no, this isn't a cutscene face. AC Odyssey's models are incredibly detailed and shit all over Control's, with and without ray tracing.

Control is straight-up ugly without ray tracing because they massively nuked the visuals of the non-ray-traced path. When I bought the game on launch day and booted it up, I genuinely thought the game was broken.



3d2518043edb9e399f6a57ce4ea6a694.jpg



Here are ultra settings with ray tracing, by the way.

We aren't talking about character face details here, but lighting (RT). Uncharted 4's models are also very detailed during normal gameplay, but the lighting on the entire face looks different, and that's the problem. Control has cutscene-quality shadows even during gameplay, and that's what makes it so special. The character model in your AC screenshot is also detailed, but there's no cutscene-quality lighting there.
 
Control's models are butt ugly; you clearly didn't play the game then. Honestly it's even a downgrade from their last game, as they use the same piss-poor engine with ray tracing slapped on top to sell it. It runs just as badly.

I tried to take a picture of his face, but it blurs out with the tool I use, so it's hard, but all the detail is in his face, and no, this isn't a cutscene face. AC Odyssey's models are incredibly detailed and shit all over Control's, with and without ray tracing.

Control is straight-up ugly without ray tracing because they massively nuked the visuals of the non-ray-traced path. When I bought the game on launch day and booted it up, I genuinely thought the game was broken.



3d2518043edb9e399f6a57ce4ea6a694.jpg




Here are ultra settings with ray tracing, by the way. Even the floor's lighting is way off without ray tracing; everything looks grainy and totally dog shit, on purpose. The game is butt ugly because it was meant to sell ray tracing cards, as it was heavily sponsored by NVIDIA. Sadly nobody cared, which is why keys are already selling for a third of the price.

I thought something was off in that video. That's with RT off, as the 1080 Ti didn't support RTX. Both were great games in their own right, though. Both look better than Uncharted as well.
 
Last edited:

Kenpachii

Member
I thought something was off in that video. That's with RT off, as the 1080 Ti didn't support RTX. Both were great games in their own right, though. Both look better than Uncharted as well.

Every card with DX12 support can ray trace. I can play Shadow of the Tomb Raider at medium ray tracing perfectly fine on a 1080 Ti with 12 ms input lag and an average of ~90 fps (from what I can remember; it could be lower).

Ray tracing on the 2000 series just gets accelerated, as it has dedicated cores for it. Even Radeon cards can ray trace.

However, some games don't work well with ray tracing at all without those cores. Control, for example, has piss-poor frame times (ms) without ray tracing cores, which makes sense, as it was built around them through heavy NVIDIA sponsorship.
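For what it's worth, a minimal D3D12 sketch of the capability check behind this claim. The runtime only reports whether DXR is exposed and at what tier, not whether dedicated RT cores back it; the function name here is just an illustration:

[code]
// Minimal D3D12 sketch: query whether DXR is exposed at all, and at what tier.
// Whether rays then run on dedicated RT cores (Turing) or on the regular
// shader cores (e.g. Pascal via NVIDIA's DXR fallback driver) is up to the
// hardware and driver, not this check.
#include <d3d12.h>

bool DxrExposed(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;

    // D3D12_RAYTRACING_TIER_NOT_SUPPORTED means no DXR path at all.
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
[/code]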

Here's a 5700 XT ray tracing.



Here's Shadow of the Tomb Raider on a 1070.

 
Last edited:

pawel86ck

Banned
I thought something was off in that video. That's with RT off, as the 1080 Ti didn't support RTX. Both were great games in their own right, though. Both look better than Uncharted as well.
ACO looks more detailed than I thought. I'd say it's on par with, if not better than, Uncharted 4 detail-wise (during gameplay, of course, because the cutscenes in Uncharted 4 are a different story).


VRSR05q.jpg


But the same scene with RT lighting and shadows would look much better.

The difference would be similar to this:
7CfVZU0.png
 

Turk1993

GAFs #1 source for car graphic comparisons
We aren't talking about character face details here, but lighting (RT). Uncharted 4's models are also very detailed during normal gameplay, but the lighting on the entire face looks different, and that's the problem. Control has cutscene-quality shadows even during gameplay, and that's what makes it so special. The character model in your AC screenshot is also detailed, but there's no cutscene-quality lighting there.
I agree with you about ray tracing, but Ryse really had amazing lighting and shadows on Marius' face even in gameplay, mostly because the gameplay was 50% like the cutscenes, with close-up finishing moves and animations with a bokeh effect.
z3AsQY1.jpg

lpY2j7M.jpg
 
Every card with DX12 support can ray trace. I can play Shadow of the Tomb Raider at medium ray tracing perfectly fine on a 1080 Ti with 12 ms input lag and an average of ~90 fps (from what I can remember; it could be lower).

Ray tracing on the 2000 series just gets accelerated, as it has dedicated cores for it. Even Radeon cards can ray trace.

However, some games don't work well with ray tracing at all without those cores. Control, for example, has piss-poor frame times (ms) without ray tracing cores, which makes sense, as it was built around them through heavy NVIDIA sponsorship.

Here's a 5700 XT ray tracing.



Here's Shadow of the Tomb Raider on a 1070.


My bad, I literally watched the first 10 seconds of it and closed it because it looked like something was off. Just watched the video again, and the guy enabled ray tracing after like 15 seconds. Shit, I just realized I'm very impatient sometimes lol.
 

Leonidas

Member
Try to read this part slowly, maybe twice: Navi beats Turing at perf/transistor.

Navi RDNA1 has 0 transistors dedicated to ray-tracing. Navi RDNA1 has 0 transisters dedicated to deep learning.

Your comparison is bogus. Even Pascal has better perf/transistor in legacy games than Turing, because it has no next-gen features, just like Navi.

Navi RDNA1 is garbage for next-gen games...
 
Last edited:

pawel86ck

Banned
Try to read this part slowly, maybe twice: Navi beats Turing at perf/transistor.

Currently released Navi cards are on plain 7 nm DUV; the better 7 nm EUV node is yet to come.
How can you say Navi beats Turing performance- and transistor-density-wise when AMD is using the superior process node, and despite that it's still slower and still uses far fewer transistors than Turing?

2080 Ti: 18,600 million transistors
5700 XT: 10,300 million transistors

Maybe an 80 CU RDNA2 Navi will finally beat the 2080 Ti (a two-year-old card), but NV will launch the 3080 Ti soon 😃👌.

AMD: 80 CUs (5120 shading units) @ 2 GHz = ~20 TF
NV: 7552 shading units @ 2 GHz = ~30 TF
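(For reference, those TF figures fall straight out of the usual FP32 formula, TFLOPS = shading units × 2 ops per clock (FMA) × clock; a quick sketch:)

[code]
// Back-of-the-envelope check of the TF figures above.
// FP32 TFLOPS = shading units * 2 ops/clock (FMA) * clock in GHz / 1000.
#include <cstdio>

static double tflops(int shadingUnits, double clockGHz)
{
    return shadingUnits * 2.0 * clockGHz / 1000.0;
}

int main()
{
    std::printf("AMD 80 CUs (5120 SUs) @ 2 GHz: %.1f TF\n", tflops(5120, 2.0)); // ~20.5
    std::printf("NV 7552 SUs @ 2 GHz: %.1f TF\n", tflops(7552, 2.0));           // ~30.2
    return 0;
}
[/code]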
 
Last edited:

llien

Member
transisters dedicated to deep learning.
:D

"Stop mistyping our name" (c) Transistors

RDNA1 has 0 transistors dedicated to ray-tracing
Yeah, while Turing has an estimated 8% dedicated to it. No big deal.

performance- and transistor-density-wise
No, simply transistor-count-wise.

RDNA2 80 CU Navi and e-peen competition?
I don't think it matters. It certainly doesn't matter to me; I cannot imagine myself wasting north of $1k on a GPU.
Tech/architecture parity does matter, though, as it affects the competitiveness of the entire product line.

AMD is possibly lagging on the power-efficiency front, but that's it.

Now that the company isn't R&D-budget-starved, I expect it to build on the success of the Navi cards, which keep grabbing market share.
 

llien

Member
It is curious that these cards even show up (an obviously controlled "leak").
For starters, remind me: when was the last time NV debuted a fab-node switch with a supersized chip?

An 80 CU Navi, if real, would have no problem beating the 2080 Ti. Perhaps NV doesn't want AMD to have a halo product, as it would dispel the reality-distortion field that leads to "the Switch will be faster than the PS4" expectations.
 
Last edited:

Xyphie

Member
The best comparison of "per-transistor perf" between current architectures would be Navi 14 vs TU116, as they have the most similar feature set and the same shader count. Navi 10 versus TU106 would be wonky because of the lower shader count and higher feature set of TU106.

Navi 14 = 6,400 million transistors
TU116 = 6,600 million transistors

3% more transistors. AMD doesn't offer a version with the full shader count available, so let's use the RX 5500 XT and the 1660 Super, as both have 90% of the shader count of the full chips.

The 1660 Super looks to be a good 20% faster, in large part because nVidia is able to fit a 192-bit memory controller into the same transistor budget versus AMD's 128-bit controller.
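Putting rough numbers on that (a sketch; the 20% performance gap is an approximation from reviews):

[code]
// Perf-per-transistor arithmetic from the figures above.
#include <cstdio>

int main()
{
    const double navi14Transistors = 6400.0; // millions
    const double tu116Transistors  = 6600.0; // millions
    const double perfRatio         = 1.20;   // 1660 Super vs RX 5500 XT, approx.

    // TU116 performance per transistor relative to Navi 14.
    const double ppt = perfRatio / (tu116Transistors / navi14Transistors);
    std::printf("TU116 perf/transistor advantage: ~%.0f%%\n", (ppt - 1.0) * 100.0); // ~16%
    return 0;
}
[/code]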

I don't know where this "AMD has better per-transistor perf" claim comes from, but it's llien being a bad-faith actor as per usual.
 
Last edited:

pawel86ck

Banned
No, simply transistor count wise.


I don't think it matters. It certainly doesn't matter to me, I cannot imagine myself wasting north of $1k on a GPU.
Tech/architecture parity does matter, as it affects competitiveness of entire product line.


If AMD makes a good-value card, NV will respond too. Right now people can buy a 5700 XT or a 2060S, and the NV card is the more forward-looking purchase (HW RT, DLSS, VRS, mesh shading). The 2060S can run Wolfenstein: Youngblood with RTX and DLSS with very good results, so 5700 XT owners already have to play with inferior graphics.
 

Leonidas

Member
The best comparison of "per-transistor perf" between current architectures would be Navi 14 vs TU116, as they have the most similar feature set and the same shader count. Navi 10 versus TU106 would be wonky because of the lower shader count and higher feature set of TU106.

Navi 14 = 6,400 million transistors
TU116 = 6,600 million transistors

3% more transistors. AMD doesn't offer a version with the full shader count available, so let's use the RX 5500 XT and the 1660 Super, as both have 90% of the shader count of the full chips.

The 1660 Super looks to be a good 20% faster, in large part because nVidia is able to fit a 192-bit memory controller into the same transistor budget versus AMD's 128-bit controller.

I don't know where this "AMD has better per-transistor perf" claim comes from, but it's llien being a bad-faith actor as per usual.

Ouch, llien's bogus comparison thrown right out the window :messenger_tears_of_joy:
 

pawel86ck

Banned
The best AMD gaming GPUs currently (the Radeon VII and 5700 XT) can't compete with the best NV has to offer in transistor count and performance. Comparisons with low-end cards are pointless if someone wants to prove AMD's superiority; it's not NV's fault AMD can't compete in the high-end segment.
 

llien

Member
If AMD makes a good-value card, NV will respond too
Remind me how that worked with the 57xx series.
NV's response was a 5% faster card that is 20% more expensive.
The 2060 Super simply didn't make sense.

The best AMD gaming GPUs currently (the Radeon VII and 5700 XT) can't compete with the best NV has to offer in transistor count and performance
The best AMD gaming GPU at the moment is a 251 mm² piece of silicon that spoils the sales of much larger chips (even taking the process difference into account).

The fact that the 5700 XT "cannot compete with the 2080 Ti" is, sorry, irrelevant.
 
Last edited:

Xyphie

Member
48 vs 32 ROPs
There's no magic.

This really has nothing to do with the original incorrect point made (i.e. that Navi uses fewer transistors than nVidia for equal raster performance). If you chopped off a third of the memory controller on a TU116 die, it would still have fewer transistors than Navi 14, be a bit faster, and have more features in hardware (VRS etc.).
 

pawel86ck

Banned
Remind me how that worked with the 57xx series.
NV's response was a 5% faster card that is 20% more expensive.
The 2060 Super simply didn't make sense.


The best AMD gaming GPU at the moment is a 251 mm² piece of silicon that spoils the sales of much larger chips (even taking the process difference into account).

The fact that the 5700 XT "cannot compete with the 2080 Ti" is, sorry, irrelevant.
The 2060S is only 5% faster when Turing features aren't used.

2060S features:
- DLSS: 35% performance improvement in Wolfenstein: Youngblood, and the picture arguably looks even sharper than native 4K with TAA.
- VRS: the Wolfenstein implementation isn't the best, but there's huge potential in VRS; 3DMark's VRS Tier 2 feature test shows a massive 75% performance improvement (rough API sketch below).
- RT: around 4-6x faster.
- Mesh shading: offers a massive increase in polygon counts.
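As for the VRS bullet above, a minimal hedged sketch of what per-draw (Tier 1) VRS looks like at the D3D12 level; the Draw calls are placeholders, not a real engine:

[code]
// Per-draw (Tier 1) VRS in D3D12: shade once per 2x2 pixel block for the
// expensive pass, then restore full rate for detail-critical rendering.
#include <d3d12.h>

void RecordWithVrs(ID3D12GraphicsCommandList5* cmd)
{
    // Quarter the pixel-shader invocations for the main scene pass.
    cmd->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);
    // DrawScene(cmd); // placeholder

    // Full rate again for text/UI, where coarse shading is visible.
    cmd->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr);
    // DrawUI(cmd); // placeholder
}
[/code]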

These are all very important features that will make a big difference when used correctly. In Wolfenstein: Youngblood people can already see what Turing GPUs can do; the 5700 simply can't provide a similar experience in this game. Soon developers will start porting PS5/XSX games to PC, and don't be surprised if certain games won't even run without HW RT. IMO the Radeon 5700 has no future compared to the 2060S.
 
Last edited:

psorcerer

Banned
This really has nothing to do with the original incorrect point made (i.e. that Navi uses fewer transistors than nVidia for equal raster performance). If you chopped off a third of the memory controller on a TU116 die, it would still have fewer transistors than Navi 14, be a bit faster, and have more features in hardware (VRS etc.).

Minor things.
It's the same within a margin of error.
No magic exists in hardware.
 

llien

Member
5% 6-7% faster than the 2060S (and closer to the 2070S, which is 20% more expensive) when Turing's anyway-irrelevant features aren't used

FTFY.

RT: around 4-6x faster
Nobody gives a fuck.

DLSS: 35% performance improvement
Useless feature that degrades visuals, thanks.

Surely it went along the lines of "hey, we have heavy denoising for our so-called RT, can't we apply it anywhere else?"
Nope, at least not here.

These are all very important features
Give me a break. It is a totally moot point in this context.

It's amazing how short a memory people have.
The "OMG, teh gap" narrative is from Vega VII times: 7 nm, expensive RAM, good if it breaks even when sold at a competitive price.
Well, guess what, folks: times have changed. Navi is a major leap forward that has narrowed the gap to almost nothing.

Can NV go bazinga and roll out a monster-sized chip, just for e-peen purposes?
Sure as hell.
Can NV roll out something that would bring back the Pascal-vs-Polaris days, when a starved AMD focused all its resources on Zen development?
You've got to be kidding even asking that question.
 

diffusionx

Gold Member
I'm not even sure what the hell you people are arguing about, but bringing up BFV and year-old DLSS is not fair. Here is an article from the same place talking about what Nvidia has done since; it's very solid now.


I wouldn't buy an AMD GPU. They need to actually release good, competitive products, not stuff that is a more-or-less wash with old GPUs running on older manufacturing tech.
 
Last edited:

pawel86ck

Banned
FTFY.


Nobody gives a fuck.


Useless feature that degrades visuals, thanks.

Surely it went along the lines of "hey, we have heavy denoising for our so-called RT, can't we apply it anywhere else?"
Nope, at least not here.


Give me a break. It is a totally moot point in this context.

It's amazing how short a memory people have.
The "OMG, teh gap" narrative is from Vega VII times: 7 nm, expensive RAM, good if it breaks even when sold at a competitive price.
Well, guess what, folks: times have changed. Navi is a major leap forward that has narrowed the gap to almost nothing.

Can NV go bazinga and roll out a monster-sized chip, just for e-peen purposes?
Sure as hell.
Can NV roll out something that would bring back the Pascal-vs-Polaris days, when a starved AMD focused all its resources on Zen development?
You've got to be kidding even asking that question.
Next-gen consoles will use similar features, so when developers port their games to PC, more and more games will use RT whether you like it or not.

The DLSS 1 implementation was far from good, but DLSS 2 has changed everything. Now picture quality is comparable with native 4K + TAA, if not sharper, and the performance boost is huge.



Without DLSS you would have to drop the resolution much lower to get similar performance, and even then picture quality would still be far inferior to native 4K.
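To put a number on why the boost is so large, a quick sketch assuming the common quality-mode setup (1440p internal, 4K output):

[code]
// Pixel-count arithmetic behind DLSS-style upscaling gains.
#include <cstdio>

int main()
{
    const double native4K = 3840.0 * 2160.0; // 8,294,400 px per frame
    const double internal = 2560.0 * 1440.0; // 3,686,400 px per frame
    std::printf("Shading work ratio: %.2fx\n", native4K / internal); // 2.25x
    // The GPU shades 2.25x fewer pixels; the upscale pass returns most of
    // that as frame rate, minus the cost of running the reconstruction.
    return 0;
}
[/code]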
 
Last edited:
Now, I have nothing against AMD; their CPUs are on a whole different level. But to vouch for them in the GPU department is fucking hilarious. It's like some people are eating up every bit of PR about "next-gen" consoles and truly believing it. If I had a choice between a card that has next-gen features, uses less power, and has better performance, especially in the mid-to-high-end market... why would AMD even be an option? Can I get a show of hands from anyone in here who has witnessed the performance of the hyped-up RDNA2 GPUs? Anyone??

I would gladly spend 50 bucks more to get better performance and future-proofing, as none of the RX 5xxx cards have next-gen features. They are obsolete out of the box already.

1080p

BmwuIih.png


1440p

Nbw38qF.png
 
Last edited:

diffusionx

Gold Member
I have a Ryzen 3700X, which replaced a Ryzen 1600, so I was on board with AMD CPUs immediately. But their GPUs have been a consistent disappointment for a long time now. The newest ones are solid responses to Nvidia's mid-to-high end, but unfortunately for them, they launched a year later, and Nvidia just had to do a small price cut and speed boost to get back into it. I know people are excited about the new Xbox and PS5, but when it comes to AMD graphics, it should be full "believe it when I see it" mode.
 

Kenpachii

Member
Now, I have nothing against AMD; their CPUs are on a whole different level. But to vouch for them in the GPU department is fucking hilarious. It's like some people are eating up every bit of PR about "next-gen" consoles and truly believing it. If I had a choice between a card that has next-gen features, uses less power, and has better performance, especially in the mid-to-high-end market... why would AMD even be an option? Can I get a show of hands from anyone in here who has witnessed the performance of the hyped-up RDNA2 GPUs? Anyone??

I would gladly spend 50 bucks more to get better performance and future-proofing, as none of the RX 5xxx cards have next-gen features. They are obsolete out of the box already.

1080p

BmwuIih.png


1440p

Nbw38qF.png

Because their logic is: if AMD releases new hardware that's 5% faster and 5% cheaper, it's now time for everybody to upgrade.

While in reality, people who wanted the hardware already have it and have no reason to even remotely buy that GPU to start with, because AMD is too late to the party.

The entire 5000 series can't even beat Nvidia's 1000 series, which already says enough. The only reason Nvidia released the Super series was to counter AMD's pricing without pissing off their loyal customers by dropping prices on their super-overpriced 2000 series cards, which they got away with because AMD isn't delivering.

It's like console people saying "build me a PC that does the same as my Xbox One X for 400 bucks!!!!" You can't.

While in reality, everybody's PC has already been doing 60 fps in The Witcher 3 for a couple of years now without effort. But ask for a console that can do 60 fps in The Witcher 3 and it's "error".
 
Because their logic is: if AMD releases new hardware that's 5% faster and 5% cheaper, it's now time for everybody to upgrade.

While in reality, people who wanted the hardware already have it and have no reason to even remotely buy that GPU to start with, because AMD is too late to the party.

The entire 5000 series can't even beat Nvidia's 1000 series, which already says enough. The only reason Nvidia released the Super series was to counter AMD's pricing without pissing off their loyal customers by dropping prices on their super-overpriced 2000 series cards, which they got away with because AMD isn't delivering.

It's like console people saying "build me a PC that does the same as my Xbox One X for 400 bucks!!!!" You can't.

While in reality, everybody's PC has already been doing 60 fps in The Witcher 3 for a couple of years now without effort. But ask for a console that can do 60 fps in The Witcher 3 and it's "error".
I honestly thought that was obvious to the general public... But I guess you can't assume everyone has basic common sense.

There was a guy on here last night who said he spent 200 euros to build a PC and can play recent games at 60 fps with all settings on high. He was truly amazed.

Even 60 fps now seems like what 30 fps used to be, as monitors and GPUs are pushing for high frame rates more than ever. 100+ fps is going to be the standard in the PC world soon, while consoles may even have some 30 fps games, and some may struggle to hold a constant 60 fps, as devs will prioritize 4K and ray tracing over fluid gameplay.

I'm interested in seeing how this whole hype train goes as we near the console releases. Will the same people who bent over backwards for AMD still be in the same position? What happens when Digital Foundry debunks what many users claimed and wished for? And then... there will be the Nvidia 3xxx series release! This will be a very entertaining year, to say the least.
 
Soon developers will start porting PS5/XSX games to PC, and don't be surprised if certain games won't even run without HW RT. IMO the Radeon 5700 has no future compared to the 2060S.

The PS5 and XSX are likely to have at least 16 GB of RAM, so that may be an issue as regards future-proofing for current cards.
Useless feature that degrades visuals, thanks.

Surely it went along the lines of "hey, we have heavy denoising for our so-called RT, can't we apply it anywhere else?"
Nope, at least not here.
DLSS 2.0 is actually quite good.
 

llien

Member
Screamer-RSA
These screenshots look terrible next to what Sony is doing on a 7870-level GPU.

whether you like it or not.
Sure thing, John.
It's me "not liking it" that causes less than 1% of games to support it and even 2080 owners to disable it in settings.

but DLSS 2
Yeah, imagine what we'll get with DLSS 3. I bet it will be photorealism+, with stuff looking even more realistic than real stuff, chuckle.
 
Last edited: