
NVIDIA may announce new RTX 2060, 2070 and 2080 “Super” GPUs at E3 2019

Zannegan

Member
WCCFTECH is as low as it gets. Pure clickbait. Pure hype. Pure speculation. They do a "news" bit for everything. I could mail them that a Navi 64 CU card is coming out in August, faster than an RTX Titan, for $899, and they would publish it 5 minutes later.
Ah, thank you. Glad to see my BS meter is still working.
 

Spukc

always chasing the next thrill
RTX 2080TI SUPER ?

glad i waited before getting a new build for around the end of the year 🤣

I wanted the normal 2080ti and 9900k.
 

dorkimoe

Member
Still waiting for the 2080ti to drop in price so i can upgrade my 1080. Havent had a need yet since it still runs everything great.
 

888

Member
Regardless of the source's speculation, all these refreshed cards will be closer to the model above. I'm more interested in the price changes, as the vanilla RTX cards were not at a competitive price point. Regardless of whether the VRAM is changed, I'll probably end up grabbing a 2080 Super if the price-to-performance is right.
 

Zannegan

Member
I am not saying they are 100% wrong on this one. Something is definitely coming, and "refreshes" or better-binned RTX 20 series chips are the most logical thing, but to what extent, I have no idea.
Of course. I don't discount the info out of hand just because the source is... somewhat less than objective. NVIDIA tweeted about something super a couple of days ago, no? Or maybe I'm misremembering. And it's fun to talk about either way.

I'm sure something is coming. If even I've heard about it, then there's too much smoke for there not to be a fire. I just wanted to know if I should expect the second coming or to throw up my hyperbole shield at full strength.
 

GHG

Gold Member
If they drop the price and increase the performance over the original 2 series cards then I'll likely grab one. My 980ti is getting long in the tooth for VR and I'm getting an ultrawide monitor soon so will need more power.
 

888

Member
Of course. I don't discount the info out of hand just because the source is... somewhat less than objective. NVIDIA tweeted about something super a couple of days ago, no? Or maybe I'm misremembering. And it's fun to talk about either way.

I'm sure something is coming. If even I've heard about it, then there's too much smoke for there not to be a fire. I just wanted to know if I should expect the second coming or to throw up my hyperbole shield at full strength.

Nvidia sent out a tweet with a video teasing the word "super". There are always leaks happening, and now we are seeing some filings as well. It's not uncommon for them to refresh, but now that AMD is finally applying pressure, this is when Nvidia gets interesting. I've always been Team Green, but I like them way more when AMD is on fire. Nvidia alone without competition is never a good thing. But their drivers and features are way more fleshed out.
 
I was planning to buy a second 2080 Ti, but I guess I can hold off until the situation is clear. Great to see the competition between Nvidia and AMD kicking into high gear.
 

somerset

Member
*Warning*- the price war is about to start- whatever you do, do *not* buy any GPU in the next few weeks. Please wait for AMD and Nvidia to knock seven hells out of one another. (I'm assuming the WCCFTech leak is true- and of late their leaks have been on the nose)

AMD's price announcement of the 5700 and 5700XT was one of the biggest mistakes in its corporate history- I called the prices "obscene" in an earlier post when brain dead AMD fans were telling you all the obscene price was just "fine"- be very careful who you believe (and remember I love AMD stuff- but that doesn't make me an AMD lick-spittle).

AMD is currently busy licking the backside of Sony and MS (where AMD makes "cost plus"- in other words so little profit Nvidia and Intel long ago gave up bothering with consoles- tho now only AMD has the tech anyway).

But whatever- AMD isn't focused on giving PC gamers the best they can *yet* (and who likes to wait- I sure the hell do not).

Dribblers have been telling you for *years* why such and such performance *has* to be a such an obscene price- and they were always *wrong*. The prices were *artificial*- simply a function of what Nvidia thought it could get away with, and Nvidia, with Turing, has milked the cow far too hard.

In other words, Nvidia saturated its customer base, and killed the idea of casual and semi-casual GPU upgrades. I rock a 470 because I have never gone into this hobby prepared to pay thru the nose, and always look for best value per unit spend. But VR means I wanna upgrade- to around 1080 Ti performance, but eff the current prices for that level. They are off by a factor of more than *two* in historic terms, thanks to weak competition from AMD (now ending).

So Turing isn't popular, isn't selling, and AMD is coming, but not yet. What does Nvidia do? The only thing it can do- cannibalise its future 7nm by giving us now what AMD isn't on its 7nm.

AMD has another chip to announce, but delayed cos its Polaris parts are already killer in that segment (570 down at 120 dollars and killing it at 1080p).

Worse, AMD has the vile Vega sitting at 250 dollars with the Vega 56 (a good price here, but too low a performance for me, and utterly dreadful tech to boot- especially the dreadful stacked memory). The Navi 5700 (infinitely cheaper to make than the sh-t HBM Vega garbage, and great tech to boot) was priced *above* Vega- and you cannot get dumber than that. So Nvidia seems to have thought "eff it, we're going for the long term kill- ruin AMD 7nm before it even gets traction", knowing that AMD is kowtowing to Sony and MS and intends to screw over its PC gaming customers til the new consoles hit.

Look guys, a company like AMD cannot serve two masters. It's either serving PC gamers, or serving the two console giants. So AMD just handed over the win to Zen2 + Nvidia 'super'.

PS yes AMD will, humiliatingly, collapse the price of the 5700 *before* launch if Nvidia super marketing hits in the meantime.

PPS *stop* trying to justify the high GPU prices, you fanboy dribblers. You are like the morons who justified Intel's effing awful 4-core i7 prices before Zen(1) hit.
 

Makariel

Member
I guess AMD is focussing its GPU development on whatever comes into the next batch of consoleboxes? My 1080 will still last me a good while, so I'll sit this GPU generation out and hope that market forces eventually force teams red and green to sort their tech out and deliver at a reasonable price.
 

Se_7_eN

Member
I am about to start a new build for 2077 (Waiting on the new ZEN lineup), but now it appears I will be waiting on these... Hopefully they are readily available on release, they were pretty good about that with the 2000 series.
 

Ellery

Member
Do they actually drop prices? I've always seen NVIDIA keep their prices steady.

yup. They also dropped the 780 when the R9 290(X) released.

And those cards weren't as overpriced as the RTX cards so they have a lot of room to drop prices and they kind of need to, because right now every single RTX card is just too expensive for the performance in 2019 and for the very low amount of VRAM those cards have.
It feels like Pascal was a nice jump in 2016 with the 10 series, but after that we didn't get anything good going for us as consumers. Just the same price/perf, with the addition of RTX for less than 1% of games.
 

Xyphie

Member
A new die for 2070 Ti seems strange. You have TU104 (2080) at 545mm^2 and TU106 (2070) at 445mm^2. Is it really meaningful to make another chip at ~500mm^2 when you'll also sell a binned version of TU104 as 2070 SUPER?
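For a rough sense of why another ~500mm² die is hard to justify, here's a back-of-the-envelope dies-per-wafer sketch. It uses the common edge-loss approximation for a 300mm wafer and no defect/yield model, so the numbers are purely illustrative:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Rough gross dies per wafer: wafer area over die area, minus a
    correction for partial dies lost around the circular edge."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

for name, area in [("TU104 (2080), 545mm^2", 545),
                   ("TU106 (2070), 445mm^2", 445),
                   ("hypothetical ~500mm^2 die", 500)]:
    print(f"{name}: ~{dies_per_wafer(area)} candidate dies per 300mm wafer")
```

The gap between a 445mm² and a 500mm² die is only a handful of dies per wafer, which is why harvesting/binning an existing die usually beats taping out a new one in between.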
 

MadAnon

Member
yup. They also dropped the 780 when the R9 290(X) released.

And those cards weren't as overpriced as the RTX cards so they have a lot of room to drop prices and they kind of need to, because right now every single RTX card is just too expensive for the performance in 2019 and for the very low amount of VRAM those cards have.
It feels like Pascal was a nice jump in 2016 with the 10 series, but after that we didn't get anything good going for us as consumers. Just the same price/perf, with the addition of RTX for less than 1% of games.
What is this whining about low VRAM? These cards are targeted at specific gaming resolutions. If you buy an RTX 2060 for 4K gaming, then you need serious tech advice. I don't see any other reason for higher VRAM.
 

Ellery

Member
So it's just baseless whining?

You are derailing the thread. I am not the first and not the last to voice concern about the lack of additional VRAM on the RTX 20 series cards coming from the 10 series Pascal cards which had the same amount.

You are happy with the amount of VRAM on the RTX 20 series cards. Good for you. I am not. I think 8GB VRAM on an $800 card is stingy
 

MadAnon

Member
You are derailing the thread. I am not the first and not the last to voice concern about the lack of additional VRAM on the RTX 20 series cards coming from the 10 series Pascal cards which had the same amount.

You are happy with the amount of VRAM on the RTX 20 series cards. Good for you. I am not. I think 8GB VRAM on an $800 card is stingy
So you still didn't answer the question. It's just about being happy or unhappy? LoL ok!
 

Ellery

Member
So you still didn't answer the question. It's just about being happy or unhappy? LoL ok!

Your question was "What is this whining about low VRAM? ".
Do you really think I am going to answer a childish, petty, immature question like that.

Maybe you are having a bad day or are just generally the charming nice sweet person you are today, but next time try something like "Why do you think the current configuration of the RTX cards are not enough and would you be so kind as to say what games and resolution you think are going to struggle with it?"
Then I would have answered with joy.
 

MadAnon

Member
Your question was "What is this whining about low VRAM? ".
Do you really think I am going to answer a childish, petty, immature question like that.

Maybe you are having a bad day or are just generally the charming nice sweet person you are today, but next time try something like "Why do you think the current configuration of the RTX cards are not enough and would you be so kind as to say what games and resolution you think are going to struggle with it?"
Then I would have answered with joy.
Why do you think the current configuration of the RTX cards are not enough and would you be so kind as to say what games and resolution you think are going to struggle with it?
 

Krappadizzle

Gold Member
Yeah, no reason. 2060 is at least 60% faster than 1060. In some cases (eg Witcher 3) you get almost double framerate.
I have a 1080 Ti. Still smokes every game. Even at 1080p, a 1060 will give you 60fps on just about every game. 20xx series cards are a waste for anyone with a 10xx series card.
 

base

Banned
I have a 1080 Ti. Still smokes every game. Even at 1080p, a 1060 will give you 60fps on just about every game. 20xx series cards are a waste for anyone with a 10xx series card.
Not necessarily. If you want to play in QuadHD then a 2060 is a must. Playing at 1080p on a 27" is a joke.
 

Ellery

Member
Why do you think the current configuration of the RTX cards are not enough and would you be so kind as to say what games and resolution you think are going to struggle with it?

Because games like Shadow of the Tomb Raider are already hitting 7.5+ GB at 1440p (the resolution I am playing at), and since I don't know what future games might need, what games are going to release within the lifespan of me using a GPU, or what amount of VRAM they can work well with, I can only look at the past and how it was then.

In 2012 I thought the 2GB GTX 680 would be great and have absolutely no problems at all. Stupid me; a couple of years later the 2GB was a huge bottleneck.

And if I spend $800 on a graphics card I am not going to buy a new one 1-2 years later. I know Nvidia (or AMD) would love that and that is why they are doing that.

Right now I can probably play everything on 1440p Ultra with an RTX 2080, but what about Cyberpunk 2077 or games that come after the new consoles come out?

I can't afford a new expensive graphics card then, and I think Nvidia (and also the AMD RX 5700 cards to an extent, though they have other problems) are just too stingy with 8GB of VRAM.

It is the same as the GTX 1070 and even cards like the R9 390X. Imho the way it should have been is 12GB for the RTX 2080 and 16GB for the RTX 2080 Ti.

Nothing I can do about it other than voting with my wallet and not buying those products. I can't afford an RTX 2080 Ti
 

Arkage

Banned
*Warning*- the price war is about to start- whatever you do, do *not* buy any GPU in the next few weeks. Please wait for AMD and Nvidia to knock seven hells out of one another. (I'm assuming the WCCFTech leak is true- and of late their leaks have been on the nose)

AMD's price announcement of the 5700 and 5700XT was one of the biggest mistakes in its corporate history- I called the prices "obscene" in an earlier post when brain dead AMD fans were telling you all the obscene price was just "fine"- be very careful who you believe (and remember I love AMD stuff- but that doesn't make me an AMD lick-spittle).

AMD is currently busy licking the backside of Sony and MS (where AMD makes "cost plus"- in other words so little profit Nvidia and Intel long ago gave up bothering with consoles- tho now only AMD has the tech anyway).

But whatever- AMD isn't focused on giving PC gamers the best they can *yet* (and who likes to wait- I sure the hell do not).

Dribblers have been telling you for *years* why such and such performance *has* to be a such an obscene price- and they were always *wrong*. The prices were *artificial*- simply a function of what Nvidia thought it could get away with, and Nvidia, with Turing, has milked the cow far too hard.

In other words, Nvidia saturated its customer base, and killed the idea of casual and semi-casual GPU upgrades. I rock a 470 because I have never gone into this hobby prepared to pay thru the nose, and always look for best value per unit spend. But VR means I wanna upgrade- to around 1080 Ti performance, but eff the current prices for that level. They are off by a factor of more than *two* in historic terms, thanks to weak competition from AMD (now ending).

So Turing isn't popular, isn't selling, and AMD is coming, but not yet. What does Nvidia do? The only thing it can do- cannibalise its future 7nm by giving us now what AMD isn't on its 7nm.

AMD has another chip to announce, but delayed cos its Polaris parts are already killer in that segment (570 down at 120 dollars and killing it at 1080p).

Worse, AMD has the vile Vega sitting at 250 dollars with the Vega 56 (a good price here, but too low a performance for me, and utterly dreadful tech to boot- especially the dreadful stacked memory). The Navi 5700 (infinitely cheaper to make than the sh-t HBM Vega garbage, and great tech to boot) was priced *above* Vega- and you cannot get dumber than that. So Nvidia seems to have thought "eff it, we're going for the long term kill- ruin AMD 7nm before it even gets traction", knowing that AMD is kowtowing to Sony and MS and intends to screw over its PC gaming customers til the new consoles hit.

Look guys, a company like AMD cannot serve two masters. It's either serving PC gamers, or serving the two console giants. So AMD just handed over the win to Zen2 + Nvidia 'super'.

PS yes AMD will, humiliatingly, collapse the price of the 5700 *before* launch if Nvidia super marketing hits in the meantime.

PPS *stop* trying to justify the high GPU prices, you fanboy dribblers. You are like the morons who justified Intel's effing awful 4-core i7 prices before Zen(1) hit.

The GeForce GTX 470 launched in 2010 at $349, which is about $410 in today's money. The two cards being talked about cost $449 and $379. Your perception of what constitutes a "high GPU price" is pretty bizarre.
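That inflation adjustment is just a CPI ratio; a quick sketch (the CPI-U index values here are approximate, used only to illustrate the arithmetic):

```python
def adjust_for_inflation(price: float, cpi_then: float, cpi_now: float) -> float:
    """Scale a historical price by the ratio of CPI index values."""
    return price * cpi_now / cpi_then

# Approximate CPI-U index values: 2010 ~ 218, mid-2019 ~ 256
print(round(adjust_for_inflation(349, 218, 256)))  # ~410
```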
 

MadAnon

Member
Because games like Shadow of the Tomb Raider are already hitting 7.5+ GB at 1440p (the resolution I am playing at), and since I don't know what future games might need, what games are going to release within the lifespan of me using a GPU, or what amount of VRAM they can work well with, I can only look at the past and how it was then.

In 2012 I thought the 2GB GTX 680 would be great and have absolutely no problems at all. Stupid me; a couple of years later the 2GB was a huge bottleneck.

And if I spend $800 on a graphics card I am not going to buy a new one 1-2 years later. I know Nvidia (or AMD) would love that and that is why they are doing that.

Right now I can probably play everything on 1440p Ultra with an RTX 2080, but what about Cyberpunk 2077 or games that come after the new consoles come out?

I can't afford a new expensive graphics card then, and I think Nvidia (and also the AMD RX 5700 cards to an extent, though they have other problems) are just too stingy with 8GB of VRAM.

It is the same as the GTX 1070 and even cards like the R9 390X. Imho the way it should have been is 12GB for the RTX 2080 and 16GB for the RTX 2080 Ti.

Nothing I can do about it other than voting with my wallet and not buying those products. I can't afford an RTX 2080 Ti
That's because Tomb Raider allocates free VRAM; it doesn't mean it uses/needs that much. It would show almost 8GB of VRAM usage even at 1080p if that much VRAM were available. 8GB is MORE than fine for 1440p. It's even a very good 4K GPU. I could see it starting to struggle with 4K after the next-gen launch and its titles, but not 1440p any time soon.
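Worth noting that render targets themselves are only a tiny slice of VRAM. A quick sketch of raw color-buffer cost (illustrative only: it ignores depth buffers, MSAA, and compression; the bulk of a real game's footprint is textures, which engines cache opportunistically, which is why "allocated" VRAM overstates what's actually needed):

```python
def framebuffer_mb(width: int, height: int,
                   bytes_per_pixel: int = 4, buffers: int = 3) -> float:
    """Raw color-buffer cost in MiB for a triple-buffered swap chain
    at 32 bits per pixel (depth/MSAA/compression not modeled)."""
    return width * height * bytes_per_pixel * buffers / 2**20

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: ~{framebuffer_mb(w, h):.0f} MiB of color targets")
```

Even at 4K the swap chain is well under 100 MiB, so resolution alone doesn't explain multi-gigabyte VRAM readings.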
 

thelastword

Banned
Probably waiting on AMD to flesh out the details of Navi so they can take a dump on their announcement
Navi details have already been fleshed out... the 5700 XT beats the 2070 and the 5700 beats the 2060...

So what if the Super GPUs are just similar to the memory upgrade the GTX 1060 received? That would be something. People are expecting super-cheap Nvidia GPUs with SUPER, aren't they?
 

Krappadizzle

Gold Member
Not necessarily. If you want to play in QuadHD then a 2060 is a must. Playing at 1080p on a 27" is a joke.
Lol. If you found a way to justify the 20xx series, by all means, I'm glad for you. But for the general consumer (and I play at 3440x1440) the 10xx series is fine. The fact that the cards aren't flying off the shelves is testament to that.

Even then, my 1080ti gives me substantially better performance at ultrawide than a 2060/2070/2080, so it's even less incentive for someone in my position to upgrade.

If I was building a new PC I'd suggest the 20xx series. But for upgrading? Lol, absolutely not. Not a good enough return.
 

Ellery

Member
That's because Tomb Raider allocates free VRAM; it doesn't mean it uses/needs that much. It would show almost 8GB of VRAM usage even at 1080p if that much VRAM were available. 8GB is MORE than fine for 1440p. It's even a very good 4K GPU. I could see it starting to struggle with 4K after the next-gen launch and its titles, but not 1440p any time soon.

That is true and I probably wouldn't mind if the RTX 2080 was a bit cheaper. I hope it does get a price cut with this SUPER announcement.
 

888

Member
Lol. If you found a way to justify the 20xx series, by all means, I'm glad for you. But for the general consumer (and I play at 3440x1440) the 10xx series is fine. The fact that the cards aren't flying off the shelves is testament to that.

Even then, my 1080ti gives me substantially better performance at ultrawide than a 2060/2070/2080, so it's even less incentive for someone in my position to upgrade.

If I was building a new PC I'd suggest the 20xx series. But for upgrading? Lol, absolutely not. Not a good enough return.

I agree that for most applications a 10xx is good enough. But there's a big gap between my 1070 and your 1080 Ti. I play at 1440p/144Hz when possible. I am only looking to get a Super card so I can hand down an old 780 in one of my rigs to my nephew. This refresh is coming at the right time in my case.
 
What is this whining about low VRAM? These cards are targeted at specific gaming resolutions. If you buy an RTX 2060 for 4K gaming, then you need serious tech advice. I don't see any other reason for higher VRAM.


This is a decade old argument and it's always been lost by the fact that a VRAM limitation is the single biggest killer of performance. A few dropped frames here or there from too high a setting is nothing compared to the grinding fuckfest of a memory cap. You hit that cap, that's it... War is over. Press F to pay respects.

8gig isn't nearly enough, and this is from someone with an 8gig card. If you're paying top dollar TODAY for an 8gig card then you're a mug. A fucking mug.

There's a reason Nvidia segments its market with VRAM. Their engineers can tape out miracle silicon til the cows come home, but it's crippled by bullshit memory limits imposed by the bean counters and the SKU stack. You'll never have god-tier upper-low-end to mid-range GPUs ever again.
 
Nvidia segments with VRAM because VRAM is fucking expensive. I mean, duh. The only reason AMD gives ridiculous VRAM counts is that they depend on the low power usage of HBM to meet any sort of sane power envelope, due to GCN being such a power hog, and HBM is literally not available in lower capacities like GDDR5/6 are.
 

MadAnon

Member
This is a decade old argument and it's always been lost by the fact that a VRAM limitation is the single biggest killer of performance. A few dropped frames here or there from too high a setting is nothing compared to the grinding fuckfest of a memory cap. You hit that cap, that's it... War is over. Press F to pay respects.

8gig isn't nearly enough, and this is from someone with an 8gig card. If you're paying top dollar TODAY for an 8gig card then you're a mug. A fucking mug.

There's a reason Nvidia segments its market with VRAM. Their engineers can tape out miracle silicon til the cows come home, but it's crippled by bullshit memory limits imposed by the bean counters and the SKU stack. You'll never have god-tier upper-low-end to mid-range GPUs ever again.

Can you name the title you struggle with at 1440p? Because I don't believe for a second it struggles with anything at 1440p/60fps, ultra settings. 4K depends on how high you want the settings. It's definitely not a card for 4K/60fps with ultra textures on every title out there. That's what the 2080 Ti is for.
 

Zerotex

Member
I will wait another gen, because I'm still getting good fps with my 1080 Ti, and I think the cards need more power for proper ray-traced gaming in 4K.
 

888

Member
Can you name the title you struggle with at 1440p? Because I don't believe for a second it struggles with anything at 1440p/60fps, ultra settings. 4K depends on how high you want the settings. It's definitely not a card for 4K/60fps with ultra textures on every title out there. That's what the 2080 Ti is for.

Assassins Creed Odyssey and Origins.
 

Ellery

Member
I will wait another gen, because I'm still getting good fps with my 1080 Ti, and I think the cards need more power for proper ray-traced gaming in 4K.

Well yeah the 1080Ti is a truly great card. I wish I had bought one when it came out 2 years ago
 

MadAnon

Member
Assassins Creed Odyssey and Origins.
Except it doesn't. I've seen these games benchmarked live myself with an RTX 2080, and it averages above 60fps at 1440p ultra. And those games never use even close to 8GB of VRAM at 1440p ultra. Mostly around 5.5GB.
 
Last edited:

888

Member
Except it doesn't. I've seen these games benchmarked myself live with RTX 2080 and it averages above 60fps at 1440p ultra. And those games never use even close to 8GB of Vram at 1440p ultra. Won't fool me.

Won't fool you? Dude you are delusional, no one is out to get you lol.

You asked what title struggles to run at 1440p/60fps at ultra. I tossed in a suggestion of my own: games that are hard to run at ultra. Using a 1070, those games don't run at 1440p/60 ultra.

Nice edit.
 
Last edited:

manfestival

Member
Navi details have already been fleshed out... the 5700 XT beats the 2070 and the 5700 beats the 2060...

So what if the Super GPUs are just similar to the memory upgrade the GTX 1060 received? That would be something. People are expecting super-cheap Nvidia GPUs with SUPER, aren't they?
Vague details about its actual performance aren't "fleshed out".
 