
Rumor: NVIDIA allegedly cancels GeForce RTX 3080 20GB and RTX 3070 16GB

Elcid

Banned
I'm not even bothering to upgrade until something like the 3090 is more affordable. A 3080 (4080?) 20GB sounds like a sweet spot though.
 

Spukc

Member
Never knew Jensen had a NeoGAF account.
So I should buy the 3080 now?
 

Malakhov

Member
It's 2020, don't be surprised if AMD also does a paper launch and everybody has to wait for Q1-Q2 2021 to easily get cards from either company.
2020 should be erased from history...
Yeah, but at least it gives me another option to sign up for Discord alerts 😂
 

rofif

Member
Never knew Jensen had a NeoGAF account.
So I should buy the 3080 now?
Sure, if you can.
The 3080 is a damn beast here and now. Not sure if it's worth "future-proofing", but how is this not at least a little bit future-proof? It's the best GPU for the price right now.
I think by the time games require more than 10GB, the 3080 will be running those games at 12fps anyway, and that's years away.
 

Jigsaah

Gold Member
https://videocardz.com/newz/nvidia-allegedly-cancels-geforce-rtx-3080-20gb-and-rtx-3070-16gb


um...

LIEEEEEEEEEEEEEEEEEEEEEEES! They are waiting. This is 3d chess played at the highest level.
 

Sparhavoc

Member
Depends on your resolution and the games you play.
I game on my 4K TV and am perfectly happy with 1440p. Currently have a Ryzen 5 3600, 16GB RAM and an RX 5700 XT. Wondering if there's any point upgrading for next-gen games or if my specs can hold out for sales.
 
They can't even ship the current versions. Why would they bother making second versions at this point? Makes no sense.

When AMD's GPUs hit the market we'll see how it goes; nothing is set in stone at this point.
 

Dave_at_Home

Member
I think some people are gonna have buyer's remorse when NVIDIA releases the Super 3000 series on 7nm. Cheaper and probably with more VRAM.
 

KungFucius

Member
I still don't get the 20GB. The 24GB 3090 only has about 10% better performance than the 3080. The 20GB would benchmark horribly on cost compared to the cheaper 10GB model.
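Taking the post's own figures at face value, the value argument is simple arithmetic. A minimal sketch; the launch MSRPs, the ~10% gap, and the $899 price for a hypothetical 20GB model are assumptions from the thread, not benchmark results:

```python
# Rough price-to-performance comparison using the thread's figures:
# launch MSRPs and a ~10% performance gap (assumed, not measured here).

def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Relative performance divided by price; higher means better value."""
    return relative_perf / price

rtx_3080_value = perf_per_dollar(1.00, 699)    # baseline card
rtx_3090_value = perf_per_dollar(1.10, 1499)   # ~10% faster, over twice the price

# A hypothetical 20GB 3080 at, say, $899 with identical performance
# would land between the two on value:
rtx_3080_20gb_value = perf_per_dollar(1.00, 899)

print(f"3080 10GB: {rtx_3080_value:.5f} perf/$")
print(f"3080 20GB: {rtx_3080_20gb_value:.5f} perf/$")
print(f"3090 24GB: {rtx_3090_value:.5f} perf/$")
```

On those assumptions the 10GB card wins on value and the 3090 loses badly, which is the "benchmark horribly" point.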
 

Malakhov

Member
I game on my 4K TV and am perfectly happy with 1440p. Currently have a Ryzen 5 3600, 16GB RAM and an RX 5700 XT. Wondering if there's any point upgrading for next-gen games or if my specs can hold out for sales.
If you're satisfied with 1440p you can wait; this shitstorm of a fake launch will end, and you can upgrade later when you want to game at 4K. You'll have both AMD and NVIDIA as options, and maybe even multiple models to pick from.
 
I still don't get the 20GB. The 24GB 3090 only has about 10% better performance than the 3080. The 20GB would benchmark horribly on cost compared to the cheaper 10GB model.

People buy the 20GB version to be future-proof. Even if it did 5% worse than the original 3080, I would still get it over the 3080 at a 50 or 100 buck premium.
 

grfunkulus

Member
People buy the 20GB version to be future-proof. Even if it did 5% worse than the original 3080, I would still get it over the 3080 at a 50 or 100 buck premium.

What? No. Some people would, MAYBE. (Edit: sorry, misread and thought you said 'people would'.)

If excess VRAM sold GPUs, then AMD would be leading the market. I think a fair few people buying $700+ cards are the kind that upgrade every 2-3 years max. Most people I know with 5-year-old cards are not shopping the high end, even remotely.

How would NVIDIA market that anyway, lol? There's already the 3090 for creators who like to game. A 3080 that's inferior in performance would just be a big wet fart. Edit: again, only if it's inferior. If it's the same or 2-5% better it still sells, I bet, just not as well as the base variant.
 
No buy until they drastically bump up the VRAM in these things. My 1070 from 2016 has 8GB; I'm not getting suckered into this beginning-of-next-gen trap.
 

ShirAhava

Plays with kids toys, in the adult gaming world
I bought a 2060 Super earlier in the year because, with everything going on in 2020, I did not think these cards were coming out before Q1 2021.

A 3080/3090 paper launch is better than I expected.

With the lack of VRAM, and the fact that pretty much all the new RTX features are supported on the 2060 and up, I'm in no rush.
 

Hudo

Member
Why would it cost twice as much as a 3090 with less RAM?
Because NVIDIA has gone crazy with their pricing over the last decade. Why does a graphics card (the RTX 3090) cost €1,500? It's beyond ridiculous. No consumer-grade graphics card is worth that much. NVIDIA has been overpricing GPUs for quite a long time.
 

Malakhov

Member
There most likely won't be a 3080 Ti; the 3090 is the 3080 Ti.
And if you compare the 3090, which is really the best Ampere GPU (there's no Titan this time), to the old gen's flagship, the 2080 Ti, this generation's jump is one of the shittiest ever.

Grats to NVIDIA for all these mind games.

 

rofif

Member
People buy the 20GB version to be future-proof. Even if it did 5% worse than the original 3080, I would still get it over the 3080 at a 50 or 100 buck premium.
The excess VRAM isn't going to do shit in 5 years when the card is just too slow. There is no future-proofing. That's called a console.
 
The card won't be too slow in 5 years, come on. It's massively powerful and twice as fast as the next-gen consoles; it won't become a piece of shit in 5 years, and it'll still be twice as fast as a PS5 even 7 years from now. We'll just need to lower a few options to free some VRAM up at 4K as time goes on, I suspect. But we'll really need to see how all this evolves over the next 2 years.
 

Tschumi

Banned
I guess it has something to do with third party manufacturers skimping on parts and tarnishing the release?
 
For sure it was canceled in December to be able to produce more RTX 3080 10GB cards to meet demand... I'm sure these cards will come out later; they're just delayed...
 

Malakhov

Member
I guess it has something to do with third party manufacturers skimping on parts and tarnishing the release?
No one tarnished the release except NVIDIA themselves.

They decided against TSMC's more costly 7nm process and went with Samsung's 8nm to save money and also decided to release cards early, when stock hadn't been built up and drivers hadn't been fully tested, causing both stock & launch issues.

Fuck them
 

nemiroff

Gold Member
People buy the 20GB version to be future-proof. Even if it did 5% worse than the original 3080, I would still get it over the 3080 at a 50 or 100 buck premium.

The "future-proofing" argument seems built on a flimsy foundation to me. As it looks right now, there's no demonstrable evidence (benchmarks from anyone but fringe developers) showing 10GB GDDR6X cards won't last at least this GPU gen and the next. With that said, since I'm a facts-driven person, I'd like nothing more than to be proven wrong, so to speak.
 

Arun1910

Member
Not surprised; it sounded ridiculous as a concept.

How would it be priced between the 3080 and the 3090? Why would they even drop it 3 months after the 3080's launch? What would be the point of a 3090 if this were true?
 

Malakhov

Member
Not surprised; it sounded ridiculous as a concept.

How would it be priced between the 3080 and the 3090? Why would they even drop it 3 months after the 3080's launch? What would be the point of a 3090 if this were true?
The 2080 Ti launched one week after the 2080, then the 2070 one month later, and the 2060 four months later. I don't see it happening either; the 3090 is basically the 2080 Ti equivalent of the 3000 series.
 

prinz_valium

Member
There most likely won't be a 3080 Ti; the 3090 is the 3080 Ti.
And if you compare the 3090, which is really the best Ampere GPU (there's no Titan this time), to the old gen's flagship, the 2080 Ti, this generation's jump is one of the shittiest ever.

Grats to NVIDIA for all these mind games.

Feels soooo good that I only bought the GTX 470 and 1070
 

AncientOrigin

Member
AMD will have the better cards this time. I think AMD will win on performance; NVIDIA will probably have better ray tracing. When RDNA3 gets released, AMD will start trashing NVIDIA like they did arrogant Intel. Even if the RDNA2 cards are a bit behind in performance, they will have way more RAM than NVIDIA's cards, which is a great plus.
 

CuNi

Member
AMD will have the better cards this time. I think AMD will win on performance; NVIDIA will probably have better ray tracing. When RDNA3 gets released, AMD will start trashing NVIDIA like they did arrogant Intel. Even if the RDNA2 cards are a bit behind in performance, they will have way more RAM than NVIDIA's cards, which is a great plus.

This "more RAM" fallacy needs to stop. More VRAM does not make a better card.
You could slap 400GB of VRAM on a 3080 and it would still perform the same. You need balance. If you give the card all the VRAM in the world to render worlds at 20k by 20k with 40k textures, it will do so at 1 FPS, because it's simply too much for the card to compute. Before the 3080's 10GB of VRAM becomes bottlenecked enough to really cripple the card rather than just dip 1-2 FPS, the card won't be powerful enough to render those games in high fidelity anyway.
This card will easily hold up for 4 years, after which it will still deliver great performance, just not on ultra anymore; but that was always the case with GPUs.

Also, Intel lost against AMD because Intel fucked up its node R&D; they got stuck in a corner they are still somewhat unable to come out of and swing back at full force. NVIDIA, on the other hand, never pushed itself into a corner and keeps advancing on all fronts. They introduced RTX hardware support into mainstream GPUs, they introduced DLSS into mainstream GPUs, and they keep pushing those techs heavily. Yes, AMD has done a lot to catch up and is in a far better position than ever before, but saying "they will trash NVIDIA" is so naive I cannot think of where to begin dismantling that illusion. The best case we can hope for is that they will be close to each other, with only single-digit performance differences; the most likely scenario is that AMD will be a competitor in the mainstream market while NVIDIA stays the high-end leader with the "best card money can buy", but at a price tag only for the lucky 0.1%.
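The capacity-vs-compute balance point above can be put in back-of-the-envelope numbers. A minimal sketch with illustrative buffer counts and pixel formats (no real engine allocates exactly like this): render targets are a small slice of a 10GB budget, and it's the per-frame compute, not raw capacity, that scales with resolution.

```python
# Back-of-the-envelope VRAM estimate for render targets at a given
# resolution. Buffer count and bytes-per-pixel are illustrative
# assumptions, not any engine's real allocation.

def render_target_bytes(width: int, height: int,
                        targets: int = 6, bytes_per_pixel: int = 8) -> int:
    """Approximate VRAM used by framebuffer / G-buffer attachments."""
    return width * height * targets * bytes_per_pixel

def gib(n_bytes: int) -> float:
    """Convert bytes to GiB."""
    return n_bytes / 2**30

# Even a fat deferred G-buffer at 4K is well under half a GiB:
rt_4k = render_target_bytes(3840, 2160)
print(f"4K render targets: {gib(rt_4k):.2f} GiB")  # ~0.37 GiB

# Doubling the resolution on each axis quadruples target memory,
# but it also (roughly) quadruples the shading work per frame:
rt_8k = render_target_bytes(7680, 4320)
print(f"8K render targets: {gib(rt_8k):.2f} GiB")
```

The remainder of the budget goes to textures, geometry, and streaming pools, which scale with settings; that is why extra capacity alone doesn't buy frames.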
 

Phrixotrichus

Banned
The excess VRAM isn't going to do shit in 5 years when the card is just too slow. There is no future-proofing. That's called a console.
Nonsense... a PC GPU doesn't get slower, just like a console doesn't get faster. A GPU that outperforms a console now will still outperform it in 5 years.
 
What? No. Some people would, MAYBE. (Edit: sorry, misread and thought you said 'people would'.)

If excess VRAM sold GPUs, then AMD would be leading the market. I think a fair few people buying $700+ cards are the kind that upgrade every 2-3 years max. Most people I know with 5-year-old cards are not shopping the high end, even remotely.

How would NVIDIA market that anyway, lol? There's already the 3090 for creators who like to game. A 3080 that's inferior in performance would just be a big wet fart. Edit: again, only if it's inferior. If it's the same or 2-5% better it still sells, I bet, just not as well as the base variant.

Double-VRAM models released at premium prices aren't something new, mate. AMD ended up doing this with the 200 series and carried it into the 300 series; NVIDIA used it in the 700, 600 and 500 series of their GPUs, and even the most-used card, the 1060, follows this concept. I've probably forgotten a bunch more.

Seems like you are unfamiliar with the concept.

There's also a group of people you don't seem familiar with: they upgrade whenever a new GPU gets released, simply to always be up to date. For example, if you bought a 2080 (which is basically a 2070 Super plus 5% performance), you could have sold it second-hand for 500 bucks before the 3000 series was out, add 200 now, and have a hot new GPU at top performance again. Do the same in 2 years and you're at the top again. I know people who do exactly this, and even I did it back in the day. So yes, there is a market for that.

A 3080 indeed will not be bought by casuals, so no clue why you even bother with that group. If NVIDIA cared about casuals with the 3000 series, or even the 2000 series, they would have released a 300- or 200-buck GPU by now. They don't. The whole RTX lineup isn't meant for casuals; they have the 16xx series for that, as of now.

The excess VRAM isn't going to do shit in 5 years when the card is just too slow. There is no future-proofing. That's called a console.

Let me give you an example so it's easier to understand.

Last-gen game, PS3 era: AC Black Flag (2013) on PC.

Minimum:
RAM: 2 GB
Video card: NVIDIA GeForce GTX 260 or AMD Radeon HD 4870 (512MB VRAM with Shader Model 4.0 or higher), see supported list*

AC Unity, a next-gen game a year later:

Minimum:
Graphics: AMD Radeon HD 7970 or NVIDIA GeForce GTX 680, VRAM: 2GB (3GB recommended)
System memory: 6 GB RAM

A 2012 GPU, the 680, went from ultra-high-end material to absolute minimum spec simply because of VRAM. That's why lots of people bought the 4GB model and waited on it, and didn't even see the 780 as an upgrade, for the simple fact of its 3GB of VRAM. Lots of them then waited for rumored 6GB models that never arrived, which made them upgrade to a 4GB 970, which ended up in lawsuits because people got scammed out of their VRAM performance. The diehards moved to 980 Tis or to the 1000 series with 8GB of memory and never looked back.

AC Valhalla, for example, is straight-up Black Flag in its requirements. We will see a sharp rise in VRAM consumption, to the point where a 3080 could very well be seen as minimum-spec material a year or so from now, simply because of its VRAM pool and nothing else.

Will it? We'll have to see. However, it doesn't look good for the card, and that's also the reason NVIDIA did it: to make you upgrade again in 2 years. They've been known to play the VRAM game for ages now; the 1000 series was a fluke on this point.

Future-proofing is a thing on PC, up to the point where it makes sense: system RAM, SSD capacity, VRAM, core counts, etc.

The "future-proofing" argument seems built on a flimsy foundation to me. As it looks right now, there's no demonstrable evidence (benchmarks from anyone but fringe developers) showing 10GB GDDR6X cards won't last at least this GPU gen and the next. With that said, since I'm a facts-driven person, I'd like nothing more than to be proven wrong, so to speak.

My argument is based on the Xbox Series X having 10GB of VRAM allocated for games. Sony will most likely use a similar amount for their games. Now, the Series S exists, which can keep the VRAM budget down (if devs don't decide to universally drop support for that box and ignore it, which could also happen; see the Wii U debacle). But higher settings on PC will consume more than what consoles have, so either way VRAM will be a thing no matter what.

To what extent the 10GB 3080 will age well, who knows; it's all guesswork at this point. However, with the experience I have with NVIDIA, with PC generational upgrades, and with VRAM, the 3080 reminds me of the GTX 680, for the reasons stated above. I'd rather wait on this generation's equivalent of the 680 4GB model or the 700-series higher-VRAM model.
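The console-transition jump in the quoted minimum specs boils down to one number. A minimal sketch; the VRAM figures come straight from the requirements quoted above, and whether the new transition repeats that ratio is, as the post says, guesswork:

```python
# Minimum-spec VRAM across the PS3 -> PS4/XB1 transition, per the
# requirements quoted above (Black Flag 2013 vs Unity 2014).

black_flag_min_vram_mb = 512   # GTX 260 / HD 4870 class, last-gen title
unity_min_vram_mb = 2048       # GTX 680 / HD 7970 class, one year later
unity_rec_vram_mb = 3072       # the 3GB "recommended" figure

min_growth = unity_min_vram_mb / black_flag_min_vram_mb
rec_growth = unity_rec_vram_mb / black_flag_min_vram_mb

print(f"Minimum-spec VRAM grew {min_growth:.0f}x in one console transition")
print(f"Recommended VRAM grew {rec_growth:.0f}x over the old minimum")
```

A 4x minimum-spec jump in a single year is the pattern the post is projecting onto the 10GB 3080; nothing here proves it will repeat.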
 

llien

Member
They can't even ship the current versions. Why would they bother making second versions at this point? Makes no sense.
They can't ship the current versions because they had to OC them into "not-OCable" territory and sell a card 20-30% faster than the last $1,200 card for $699, because...
I have yet to hear a reason not related to pressure from AMD.

A 10GB GA102 and a 16GB GA104 make no sense; it's a forced move.

I don't think Samsung 8nm Ampere is going anywhere.
 

nemiroff

Gold Member

The earlier barriers were partly because of the resolution bumps. I still don't see any hard evidence that 10GB is not enough to last in all but a few obscure cases. But again, all it really takes is a few benchmarks showing the 3090 performing considerably better than the usual 5-10% over the 3080 and I'd be convinced. All I want is to escape the meme-worthy notions and get to something tangible. My budget for a 2020 GPU was/is about $2,000 (but with a "common sense clause", so no 3090 for me, yet), so I'd flip in a heartbeat. Until then I'll be patiently "waiting" with a 10GB 3080.
 

grfunkulus

Member

I understand all of this, mate; nothing revolutionary about what they'd be doing. I simply don't believe you're right that we'll see anything but a few fringe cases pushing the limits for at least a few years. I bought a 1060 3GB, which was the recommended value option for 1080p gaming at the time because it was substantially cheaper, and it held up in all but a handful of games for the 2-3 or more years I was 'promised' it would. I go with the conventional wisdom, not what a few alarmists on a tech forum say MIGHT happen if I don't future-proof.

And I am exactly that 1-year upgrader you're talking about. Bought a 2080 at Fry's for $580 brand new a short while after launch. Sold it for $450 right after the 3080 announcement (should've sold before, but hey, can't be perfect). Bought a TUF Gaming OC for $730 on Newegg with a coupon. Feel pretty good about that performance increase for the money, and the additional 2GB of much faster VRAM. No one has really tested whether short periods of running out of VRAM still tank the framerate on this card now that NVIDIA's new I/O is a thing. The times of 0.1% lows going to absolute shit are possibly (likely) a thing of the past.

All I'm saying is the issue is far overblown.
 