
RTX SUPER Not So SUPER After All???

thelastword

Banned





"We have also confirmed the GPU variants that will launch with SUPER series.

The GeForce RTX 2080 SUPER will feature Turing TU104-450 GPU. This alone points towards 8GB GDDR6 memory, which for this very SKU will be faster (16 Gbps vs 14 Gbps) than the rest of the RTX stack. At this point, there is no doubt that this model will have all CUDA cores enabled (3072 to be exact).

Meanwhile, the RTX 2070, which was used by AMD during Radeon RX 5700 XT presentation, will receive a faster variant with an additional 256 CUDA cores. This card will be powered by TU104-410 GPU.

The RTX 2060 Super will basically be a cut-down version of the RTX 2070 non-Super, featuring the TU106-410 GPU with a full 256-bit memory bus. This means 8GB GDDR6 memory, which is 2GB more than the RTX 2060 non-Super."





https://videocardz.com/81046/nvidia-geforce-rtx-20-super-series-to-launch-mid-july
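For context on what those memory speeds mean: peak bandwidth is just the bus width in bytes times the per-pin data rate. A minimal sketch in Python, using only the figures from the quote above (256-bit bus for both 2080 variants):

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes times per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Figures from the quoted article:
print(peak_bandwidth_gbs(256, 14))  # RTX 2080 at 14 Gbps        -> 448.0 GB/s
print(peak_bandwidth_gbs(256, 16))  # RTX 2080 SUPER at 16 Gbps  -> 512.0 GB/s
```

So the 16 Gbps chips alone would be worth roughly 14% more bandwidth on the same bus.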
 

thelastword

Banned
So if it wasn't for AMD, we would not be getting 8GB on a 2060... So they added a few more CUDA cores on the 2060, 2070 and 2080, yet recent rumors suggested that all RTX cards would be moving up to 16Gbps memory, and that is not so... The 2080 is the only one getting faster memory... They also said the RTX 2080 Super would have 11GB of VRAM, and that is not so according to Igor either...

The real question is what price these SUPER cards will launch at... When Turing launched, NVIDIA declared how big the die was and how expensive it was to produce because of RTX, which was a lie, because the RTX 2080 is nowhere close to being as expensive to produce as the Radeon VII, yet the FE cost more... Nvidia has been so stingy: skimping on VRAM, no VRR over HDMI, and so many other features... So this same Nvidia is now going to give us these SUPER cards at competitive prices...? I see...
 

Ivellios

Member
They are also adding more Tensor and RT cores, this thread is misleading.


I hope this chart is true, although I still believe they won't reduce vanilla card prices and will just make these Super versions slightly more expensive.
 

gspat

Member
So realistically, what's the increase from 20xx series card to 20xx series super? 5%?

Doesn't look like a heck of a lot?
 

Ivellios

Member
So realistically, what's the increase from 20xx series card to 20xx series super? 5%?

Doesn't look like a heck of a lot?

Plus some more RT and tensor cores for better ray tracing.

I'm still more interested in a price cut on the vanilla RTX cards tbh
 

Celcius

°Temp. member
So realistically, what's the increase from 20xx series card to 20xx series super? 5%?

Doesn't look like a heck of a lot?
I would assume 10%, or else it wouldn't be worth the trouble.
I can get 10% extra performance just from overclocking my video card.
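A rough ceiling for that is easy to estimate from the core counts alone, assuming performance scales at best linearly with CUDA cores at equal clocks (an optimistic assumption; real uplift is usually lower). The Super counts below are the rumored figures from this leak, not confirmed specs:

```python
# (base cores, rumored Super cores) per SKU; base counts are the known
# Turing specs, Super counts are from the leaks discussed in this thread.
cards = {
    "RTX 2060 -> 2060 Super": (1920, 2176),
    "RTX 2070 -> 2070 Super": (2304, 2560),
    "RTX 2080 -> 2080 Super": (2944, 3072),
}

for name, (base, super_cores) in cards.items():
    uplift = (super_cores / base - 1) * 100
    print(f"{name}: +{uplift:.1f}% cores")
# -> +13.3%, +11.1%, +4.3% (the 2080 Super also gets the 16 Gbps memory)
```

So more like 5-13% depending on the SKU, before clocks and memory speed are factored in.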
 

gspat

Member
Nvidia 12nm outperforming 7nm? Sounds pretty impressive to me :goog_smile_face_eyes:
Intel is just about managing the same with 14nm+++++++++++++++++...

What concerns me is that Nvidia, with their new arch, can barely manage to improve on their previous cards...

Forget their competition - on perf/$ per upgraded card, Turing seems just lousy.
 

SonGoku

Member
The best news in this is the availability of 16Gbps GDDR6 chips a year before launch; that bodes well for next-gen consoles.
Nvidia 12nm outperforming 7nm? Sounds pretty impressive to me :goog_smile_face_eyes:
Yeah, tbf Nvidia doesn't need to release even more expensive cards; the current RTX lineup is solid. What they need to do is drop prices to make it more appealing.
 

thelastword

Banned
Nvidia 12nm outperforming 7nm? Sounds pretty impressive to me :goog_smile_face_eyes:
Not until they refreshed cards that launched just a few months ago, to compete... So what? More power consumption on their end; watch how this becomes a non-factor when these Turing cards run hotter... And how much performance do you expect a few more CUDA cores to deliver, exactly?

Bear in mind, AMD still has the upper hand here: they carry no RTX hardware (which is a farce on Turing anyway); watch how nobody is talking about DLSS anymore, that was a crapshoot too... The advantage AMD has is that they can be more flexible on price than Nvidia against these RTX cards, which no one is using for RTX anyway... Right now Turing Super is going to be more expensive to produce than Navi, and increasing clocks and features will only increase power draw, which NV fans hate, right? AMD is no longer hamstrung by expensive HBM for their gaming GPUs; they have the Vega line for high bandwidth, targeting a different market atm... AMD can drop prices further, but if people think Nvidia is going to sell Super for peanuts, they must not know Nvidia...

As it stands, the 5700 XT beats the RTX 2070, and we don't know how well it overclocks yet, or if these cards will reach over 2000MHz under AIBs or even reference... Sapphire may have something to say... The 5700 decimates the 2060, which I felt was Turing's best offering on a price-to-perf ratio, so there's no way NVIDIA would not respond there, notwithstanding finally conceding that they had to get with the times and put 8GB of VRAM on a $379 card, for crying out loud... (in the 2060 Super). So yes, the 5700 is the star here; it destroys the 2060, 1660 Ti and 1660... The Polaris 570 destroys the 1650... This is where most gamers are, not at 2080 Ti or 2080 Super spec...

They are also adding more Tensor and RT cores, this thread is misleading.


The OP is asking the same question based on Joker's video... People were talking about RTX Super decimating the competition yesterday, leaving it in the dust... No one is talking about RTX, because it's a non-factor... Annnnd... do you really think 2-4 more RT cores are going to make these Super cards smoking fast in ray-traced games? Do you somehow believe a 2070 Super or 2060 Super is going to run Quake 2 at 900p 60fps now?
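Whatever the specs end up being, the whole fight comes down to price-to-performance. A hedged illustration of the comparison buyers will actually make; the performance numbers and prices here are placeholder assumptions for the sake of the example, not announced figures:

```python
# Hypothetical inputs for illustration only: perf is normalized to the
# RTX 2070 = 1.00, and prices are assumed placeholders, not real MSRPs.
cards = {
    "RX 5700 XT": (1.05, 449),
    "RX 5700":    (0.95, 379),
    "RTX 2070":   (1.00, 499),
    "RTX 2060":   (0.85, 349),
}

for name, (perf, price) in cards.items():
    print(f"{name}: {perf / price * 100:.3f} perf per $100")
```

Until Nvidia actually announces Super pricing, any perf/$ claim in either direction is guesswork.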
 

Ascend

Member
The fact that nVidia had to immediately reply to downplay AMD's features shows that they are actually in damage-control mode... The funny one is definitely their response to Radeon Anti-Lag. Their response is basically "We don't know what they're doing exactly, BUT we already did it!". Hilarious. It's already well known that Polaris has lower input lag than its Pascal and Turing counterparts in the same performance bracket, so... Yeah...

 

PhoenixTank

Member
I can't say I was anticipating anything more than the videocardz image, tbh. Where have these (previously?) heightened expectations come from?
Edit: To answer my own question
 
It's not a new gen of GPUs or anything but there doesn't seem to be anything wrong with this new lineup.

Believe me, I would LOVE to see AMD take it to Nvidia like they are doing to Intel... but so far they are far from that goal.

If you want to buy an AMD GPU, there are many OK options. But there is very little that straight up beats what Nvidia offers.

As far as I can tell the ONLY AMD GPU that really makes sense to buy is the 570 (that's 570, not 5700).

Best not to buy one of these Super cards if you can wait, as there will be a significant jump in performance when Nvidia moves to 7nm.
 

sendit

Member
Depends on the performance. If it gets me Cyberpunk at max settings, 4K, then yes. :)

I'm sure even an RTX 2060 can get you to max settings at 4K. The real question would be: what's an acceptable frame rate for you to play at?

For me, if it's not hitting over 100 FPS, I will adjust graphics settings until it does.
 

JohnnyFootball

GerAlt-Right. Ciriously.
I'm sure even an RTX 2060 can get you to max settings at 4K. The real question would be: what's an acceptable frame rate for you to play at?

For me, if it's not hitting over 100 FPS, I will adjust graphics settings until it does.
Considering that a 2060 gives you around 1070 Ti performance in non-ray-traced games, I can assure you you're not going to get 4K in Cyberpunk at a framerate most PC players would consider acceptable, if Witcher 3 is anything to go by: that gives around 41 fps at 4K.
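A quick back-of-envelope for this kind of estimate: assume fps scales roughly inversely with pixel count (real scaling is usually a bit gentler, since not everything is resolution-bound) and project from a known data point. A sketch under that assumption, starting from the ~41 fps at 4K figure above:

```python
def estimate_fps(base_fps: float, base_res: tuple, target_res: tuple) -> float:
    """Naive projection: fps scales inversely with pixel count.
    Real-world scaling is usually gentler, so treat results as rough."""
    base_px = base_res[0] * base_res[1]
    target_px = target_res[0] * target_res[1]
    return base_fps * base_px / target_px

print(estimate_fps(41, (3840, 2160), (2560, 1440)))  # ~92 fps at 1440p
print(estimate_fps(41, (3840, 2160), (1920, 1080)))  # ~164 fps at 1080p
```

Which is why a 100+ fps target on that class of card realistically means 1440p or lower, not 4K max settings.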

 

Agent_4Seven

Tears of Nintendo
What a rip-off. Nvidia gonna Nvidia no matter what:messenger_grinning_sweat:
C'-fuckin'-mon, AMD! 😩 Do something about it!:messenger_angry: 5900XT for $400! Let's go!:messenger_horns:
 
I have an idea. Why doesn't...hear me out.....listen closely....check this out....why doesn't...bear with me....why doesn't AMD...check this out....listen up.....simply stick two 5700's together on one board and sell it for a reduced price. :messenger_beaming:

Multi-gpu setups need to make a comeback. We have the motherboard bandwidth and game engines capable of making efficient use of multi-gpu.
 

Armorian

Banned
I have an idea. Why doesn't...hear me out.....listen closely....check this out....why doesn't...bear with me....why doesn't AMD...check this out....listen up.....simply stick two 5700's together on one board and sell it for a reduced price. :messenger_beaming:

Multi-gpu setups need to make a comeback. We have the motherboard bandwidth and game engines capable of making efficient use of multi-gpu.

Lol
 
I'm sure even an RTX 2060 can get you to max settings at 4K. The real question would be: what's an acceptable frame rate for you to play at?

For me, if it's not hitting over 100 FPS, I will adjust graphics settings until it does.

I'm a 60fps sort of guy. Never had a monitor with higher refresh rates, so I'm content in my ignorance. :p
 

Deleted member 752119

Unconfirmed Member
I hope this chart is true, although I still believe they won't reduce vanilla card prices and will just make these Super versions slightly more expensive.

That's my thought as well. I'll probably just see if my GPU runs Gears 5 well this fall (it runs Gears 4 great) and then next fall decide between a PC upgrade or Scarlett. PS5 is a must as Sony first-party stuff is my favorite (along with Nintendo), but I want the Xbox exclusives and Game Pass too.

My gaming PC has gotten a lot less use than I’d hoped outside of Xbox stuff, so with GPU prices what they are it probably just makes more sense for me to go back to owning all three consoles and just keep the PC for indie exclusives and emulators.
 