
Specs for Nvidia Ampere Supposedly Leaked

DeaDPo0L84

Member
So am I right in thinking these will be the 3000 series cards:

3090
3080ti
3080
3070
3060

I was all set on buying the 3080 Ti, but according to the most recent leaks/rumors it seems it may not be an actual card that exists.
 

Rikkori

Member
So... how fucked is my 7700k with a new medium specc 30XX. Should I instantly plan the CPU upgrade? (playing 1440p)
Are you playing above 60 Hz? Otherwise I wouldn't worry, unless you like playing on medium/low. Games are going to grind even a higher-end card into dust once you turn on all those pretty graphical features.

The bitcoin hype made them believe this.
Funny thing is, that's starting to come back too. The AMD cards are going first again, but they'll come for the Nvidia ones too if it keeps up.

So am I right in thinking these will be the 3000 series cards:

3090
3080ti
3080
3070
3060

I was all set on buying the 3080 Ti, but according to the most recent leaks/rumors it seems it may not be an actual card that exists.
Remove 3080 ti, add 3070 Ti
 

martino

Member
Granted, but keep in mind the FE MSRP was $1199 too, and in fact that was the general price anyway. The $999 2080 Ti existed only on paper, never in reality. Supposedly there are a lot more structural changes to the chips themselves, so who knows how they'll end up looking. I think in the end Nvidia is simply going to pocket the higher margins from having gone with Samsung instead of TSMC.


Probably, but it seems my hope that the node reduction would offset the high price coming with this tech choice was misplaced.
Let's hope Intel shakes things up a little (even if I don't believe that for one second).

Because Nvidia isn't selling those high-end cards like console makers sell consoles.
The bitcoin hype made them believe this.

The most used GPU on Steam was the GTX 1060, a console-priced card. That's where the money is. What Nvidia needs to do is offer PS5/XSX-level graphics for around $400-500.

Only silly tits like me spend idiot cash on high-end cards. Heck, the 2070S was about my limit without feeling like a complete jackass.

Where is this going? If the next refresh adds more CU / tensor / RT cores and there's no node reduction (or even with it), how much will it cost?
I'm a Ti owner (1080). I skipped the 2xxx gen because I thought this one would be a matured version of the tech, coming in cheaper (thanks to the node reduction too).
If this is confirmed, it's also confirmation that prices will never go back to what I consider a non-absurd level (and I bought at a near-absurd one; I got my 1080 Ti for 720 euros), and they will lose me in the process.
 

DeaDPo0L84

Member
Remove 3080 ti, add 3070 Ti

I'm so confused; this whole time, up until yesterday, the 3080 Ti was a given. So assuming this is now the order of things, would a 3070 Ti theoretically be better than a 3080 Ti? I'm mainly buying a new card because I want to play CP2077 the best way possible with ray tracing.
 

Spukc

always chasing the next thrill
Probably, but it seems my hope that the node reduction would offset the high price coming with this tech choice was misplaced.
Let's hope Intel shakes things up a little (even if I don't believe that for one second).



Where is this going? If the next refresh adds more CU / tensor / RT cores and there's no node reduction (or even with it), how much will it cost?
I'm a Ti owner (1080). I skipped the 2xxx gen because I thought this one would be a matured version of the tech, coming in cheaper (thanks to the node reduction too).
If this is confirmed, it's also confirmation that prices will never go back to what I consider a non-absurd level (and I bought at a near-absurd one; I got my 1080 Ti for 720 euros), and they will lose me in the process.
Well dude, people are crazy enough to spend $1500 on a 2-year-cycle product like a mobile phone. I think Nvidia is expecting us to pay $1K for slightly better console gfx like it's nothing.
 

Rikkori

Member
I'm so confused; this whole time, up until yesterday, the 3080 Ti was a given. So assuming this is now the order of things, would a 3070 Ti theoretically be better than a 3080 Ti? I'm mainly buying a new card because I want to play CP2077 the best way possible with ray tracing.
It was between the 3090 and the 3080 Ti, but now we know it's the 3090. Or well, as much as we can know until Nvidia themselves confirm it. When I say add a 3070 Ti and remove the 3080 Ti, I mean simply as models; of course a 3070 Ti would not be faster than a 3080 Ti (well, except if you count one existing and one not, in which case the existing one is obviously faster than the nonexistent one).

 

martino

Member
Well dude, people are crazy enough to spend $1500 on a 2-year-cycle product like a mobile phone. I think Nvidia is expecting us to pay $1K for slightly better console gfx like it's nothing.

In my case they found my limit (it was around 800€ for high end).
 

Spukc

always chasing the next thrill
No, I skipped 2xxx because I hoped node reduction + matured tensor/RT cores would be cheaper.
I have a 1080 Ti.
Ah oki oki, yeah, a 1080 Ti gotten in a deal was not bad.
Your best bet might be old stock 2080 Ti?
I couldn't even max out BL3 at 3440x1440 at a solid 60 with that card, hence I returned it.
 

Airbus Jr

Banned
Micron suggests Nvidia's RTX 3090 comes with GDDR6X memory at up to 21Gbps


A tech brief from memory manufacturer Micron suggests Nvidia will be adopting GDDR6X memory for at least the RTX 3090, if not other high-end Ampere graphics cards. The news comes from a PDF from Micron (uncovered by Videocardz), which clearly states the next-generation graphics card will be equipped with yet-unannounced GDDR6X rated to 19-21Gbps.


That's seriously speedy memory, and according to the same document [PDF warning] hosted on the Micron website, the company expects GDDR6X cards to come equipped with an average of around 12GB of the stuff. Both speed and capacity are increasingly important metrics for gaming at high resolution and fidelity; if you crave 4K60, it's not simply a question of moar cores.


For comparison, the RTX 2080 Ti Founders Edition came with 11GB of memory rated to 14Gbps.


This is also the first official word we've received of the GDDR6X memory configuration's existence. Nvidia pulled a similar move for its high-end Pascal graphics cards, which came with shiny and new GDDR5X memory. GDDR5X offered a fairly considerable bump in speed over the GDDR5 standard (10Gbps versus 7Gbps, later increased to 11Gbps), and the jump from GDDR6 to GDDR6X looks to be no different. It may also run with slightly lower voltage demands (and may also be more expensive).



With a price premium likely attached to the faster memory, it's unlikely we'll see GDDR6X proliferate down the entire GPU stack.

This document also serves as further proof that Nvidia will be opting for the RTX 30-series branding and naming schema, and even a lofty RTX 3090. Yet Micron doesn't put the cards together; it only sells the chips, so there may be some guesswork involved on its part. After taking a second look at the included table, it appears the exact RTX 3090 graphics card specification cannot be determined from what's listed.

With Nvidia expected to announce the Ampere generation on September 1, 2020, during its 'GeForce Special Event', it's likely we're hearing the name that will stick for this GPU generation. The proximity to the event also adds weight to the veracity of the recent specs bumble over at Micron, and I can guarantee that Jen-Hsun Huang won't be happy if someone has spoiled his launch party.
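As a rough sanity check on those numbers: peak memory bandwidth is just the per-pin data rate times the bus width. A minimal Python sketch, assuming a 384-bit bus for the rumored GDDR6X card (the bus width is not stated in the Micron document, so treat that figure as a guess):

```python
def peak_bandwidth_gb_s(pin_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak theoretical memory bandwidth in GB/s:
    per-pin data rate (Gb/s) * bus width (bits) / 8 bits per byte."""
    return pin_rate_gbps * bus_width_bits / 8

# RTX 2080 Ti Founders Edition: 14 Gb/s GDDR6 on a 352-bit bus.
print(peak_bandwidth_gb_s(14, 352))  # 616.0 GB/s

# Hypothetical GDDR6X card at 19-21 Gb/s on an assumed 384-bit bus.
print(peak_bandwidth_gb_s(19, 384))  # 912.0 GB/s
print(peak_bandwidth_gb_s(21, 384))  # 1008.0 GB/s
```

Even at the low end of Micron's 19-21Gbps range, that would be roughly a 50% jump over the 2080 Ti's bandwidth.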
 

martino

Member
Ah oki oki, yeah, a 1080 Ti gotten in a deal was not bad.
Your best bet might be old stock 2080 Ti?
I couldn't even max out BL3 at 3440x1440 at a solid 60 with that card, hence I returned it.

I will quote what Yoda said in a similar context:
[image: Yoda meme]
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
Well...

Only the price for the 3080 would be insane, right, since it would be about $150 more expensive than the 2080 was when it was released? As for the 3090, assuming that it's a successor to the RTX Titan, wouldn't it be a bargain at $1300 to $1500 since the RTX Titan costs $2500?

Also, why are people surprised that the 3090 would have 24GB of VRAM, considering that the RTX Titan also has 24GB of VRAM?

To be fair, I didn't know the RTX Titan cost $2500. That's just crazy man. Are these the GPUs in PCs that people are comparing to current day consoles?
 

BluRayHiDef

Banned
To be fair, I didn't know the RTX Titan cost $2500. That's just crazy man. Are these the GPUs in PCs that people are comparing to current day consoles?

Yep, it's ridiculously expensive. I didn't know until looking it up while reading this thread.
I'm sure most people aren't really aware of the Titan RTX because it's such a niche card.

 
I'm wondering how do you guys set a budget for your GPU? I mean even if you're made of money, only very few people are dumb enough to just waste cash on a card that won't last them that long. Some people hold onto say an RTX 2080Ti for more than 6 years and are fine with it. Others dump it after 4 years and so on. At what point does a GPU become not worth the investment to you? How much per year or month does a GPU have to cost you in order for it to be considered reasonably priced? I still haven't figured out this for myself.
 
I'm wondering how do you guys set a budget for your GPU? I mean even if you're made of money, only very few people are dumb enough to just waste cash on a card that won't last them that long. Some people hold onto say an RTX 2080Ti for more than 6 years and are fine with it. Others dump it after 4 years and so on. At what point does a GPU become not worth the investment to you? How much per year or month does a GPU have to cost you in order for it to be considered reasonably priced? I still haven't figured out this for myself.

I bought a 1070 4 years ago and it is still perfectly fine in every game I care about. If I bought 1080TI I would still upgrade to 3000 series because it will be a noticeable improvement.
 
I bought a 1070 4 years ago and it is still perfectly fine in every game I care about. If I bought 1080TI I would still upgrade to 3000 series because it will be a noticeable improvement.

Where I live, a GTX 1070 was around 600 Euros at launch (third party). 4 years of usage makes it 12,50 Euros / month which seems very reasonable to me. Good choice. What would you say is a sensible limit here?
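The per-month math above generalizes to a one-liner; a small sketch using the figures from this post (600-euro price, 4 years of ownership):

```python
def cost_per_month(price_eur: float, months_owned: int) -> float:
    """Amortised monthly cost of a GPU over its ownership period."""
    return price_eur / months_owned

# GTX 1070 at 600 euros, kept for 4 years (48 months).
print(cost_per_month(600, 48))  # 12.5 euros/month
```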
 
Where I live, a GTX 1070 was around 600 Euros at launch (third party). 4 years of usage makes it 12,50 Euros / month which seems very reasonable to me. Good choice. What would you say is a sensible limit here?

It all depends on your definition of sensible and your personal circumstances, right?

For me personally, I could afford to drop $1200 on a 3090 or whatever and not be destitute for it. But I'm also a US citizen living in a major city with a relatively high-paying career. This is obviously not the case for everyone.

Even then, I'm not going to just buy the most expensive card and call it a day. After researching, if I find that a $600 3070 will do everything I need from a GPU, then that will probably be the one I go with. I'm more interested in getting very high frames at 1080p or 1440p in MP games than I am in 4K/60FPS "AAA" stuff.
 

waylo

Banned
I can't wait to be super pumped for the 3000 series, and then have my excitement get immediately deflated once I see the price.


"Fuck yeeeeaaa oh..."
This is the boat I'm in. I'm totally hyped for the new cards. This whole time I've been somewhat optimistic about spending $800-$900 on a 3080. I know they're going to announce it, though, and the inevitable $1500 price tag is going to bum me out, and I'll just wind up settling on a 3070.
 

Madflavor

Member
This is the boat I'm in. I'm totally hyped for the new cards. This whole time I've been somewhat optimistic about spending $800-$900 on a 3080. I know they're going to announce it, though, and the inevitable $1500 price tag is going to bum me out, and I'll just wind up settling on a 3070.

If the 3070 or hell, even the 3060 is enough to comfortably play CP77 on ultra, I'm fine with that.
 

martino

Member
I'm wondering how do you guys set a budget for your GPU? I mean even if you're made of money, only very few people are dumb enough to just waste cash on a card that won't last them that long. Some people hold onto say an RTX 2080Ti for more than 6 years and are fine with it. Others dump it after 4 years and so on. At what point does a GPU become not worth the investment to you? How much per year or month does a GPU have to cost you in order for it to be considered reasonably priced? I still haven't figured out this for myself.
I bought a 980 Ti day one (700 euros), selling my previous GPUs (a CrossFire of 7870 XTs), so I paid 450 euros net in 2015.
For my 1080 Ti I spent 494 euros (and sold my old 980 Ti for 250 euros) in September 2017.
So I'm consistent at around 14 euros/month over the last few years (outside the initial GPU buys long ago that I have no concrete track record of; I know I had a CrossFire of 6870s and a 9800 GX2 I sold before the CrossFire of 7870 XTs).
Dunno how much I could get for my current 1080 Ti.
But this way of doing it will not buy a high-end card anymore (or a high level of performance, back when I was fool enough to believe in multi-GPU).
 

BluRayHiDef

Banned
If the 3070 or hell, even the 3060 is enough to comfortably play CP77 on ultra, I'm fine with that.

Cyberpunk 2077 is a cross-generation game, however. Hence, it shouldn't be used as a basis for determining how viable a graphics card will be for games that are designed for the PS5 and XSX and then ported to PC.
 

BluRayHiDef

Banned
I'm wondering how do you guys set a budget for your GPU? I mean even if you're made of money, only very few people are dumb enough to just waste cash on a card that won't last them that long. Some people hold onto say an RTX 2080Ti for more than 6 years and are fine with it. Others dump it after 4 years and so on. At what point does a GPU become not worth the investment to you? How much per year or month does a GPU have to cost you in order for it to be considered reasonably priced? I still haven't figured out this for myself.

Well, for my current card, I used Microcenter's warranty for graphics cards to upgrade. With the warranty, I upgraded from a 980Ti to a 1080Ti and simply paid the difference, which was two hundred dollars or so. Unfortunately, however, Microcenter had suspended its graphics-card warranties by the time that I upgraded to the 1080Ti, because cryptocurrency mining had increased the rate at which people were using the warranties, which was causing Microcenter to lose money. Hence, I'm going to have to pay completely out of pocket for a 3000 Series card, but I don't intend to upgrade until around Christmas or January.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
lol no. current consoles are like Rx 480 Polaris level tech, even a GeForce 1060 is better than that.

LOL! That's not what I was trying to ask. What I was trying to ask is: are these the type of GPUs in the PCs of the gamers who love to compare their gaming experience to that of the average console player on a PS4 Pro or Xbox One X?

I see PC gamers say they've been playing games at 4K60 with high-quality textures. But if these are the types of GPUs needed to play those games that way, then...
 
It was between the 3090 and the 3080 Ti, but now we know it's the 3090. Or well, as much as we can know until Nvidia themselves confirm it. When I say add a 3070 Ti and remove the 3080 Ti, I mean simply as models; of course a 3070 Ti would not be faster than a 3080 Ti (well, except if you count one existing and one not, in which case the existing one is obviously faster than the nonexistent one).


Motherfuckers.

Remember when the 80-tier of cards could be had here for £399 (the GTX 980)? Three generations on, the same tier of card looks set to cost nearly three times as much; if we're lucky in Blighty, the 3080 will go for £1150.

(I know these are not confirmed prices, but it'll be close.)
 
LOL! That's not what I was trying to ask. What I was trying to ask is: are these the type of GPUs in the PCs of the gamers who love to compare their gaming experience to that of the average console player on a PS4 Pro or Xbox One X?

I see PC gamers say they've been playing games at 4K60 with high-quality textures. But if these are the types of GPUs needed to play those games that way, then...

The Titan is a niche card that is more for development/creative work, AFAIK. Currently the 2080/2080 Ti are the only cards getting 60FPS at 4K with ultra settings, and those cards sell well.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
The Titan is a niche card that is more for development/creative work, AFAIK. Currently the 2080/2080 Ti are the only cards getting 60FPS at 4K with ultra settings, and those cards sell well.

How much do these cards cost today vs. what they cost when they first came out?

Yep. I'm on the 3090 bandwagon and will preorder immediately on Sept 1st.

I better not catch anybody trying to compare next-gen consoles to these powerhouse GPUs.

Why should we, when they cost triple the money of the next-gen consoles?
 

kiphalfton

Member
Man, you can buy a PS5 + a 4K HDR OLED TV + 15 games for that. Or a Nintendo Switch + 2 Wii U games remade as Switch games that sell for full price now. Or 341 years of Xbox Game Pass.

Lol, you're probably not wrong about being able to get an OLED and PS5 for the price of a 3090/Ampere Titan A.

...Jesus that really puts things in perspective.
 

DeaDPo0L84

Member
Yep. I'm on the 3090 bandwagon and will preorder immediately on Sept 1st.

I better not catch anybody trying to compare next-gen consoles to these powerhouse GPUs.

It's already happening in a lot of console threads.

I'm stuck between getting a 3080 or a 3090. If the 3080 is max $1,200 I'll go that route. But if the performance difference between the two is drastic, I might be swayed toward the 3090...
 
Lol, you're probably not wrong about being able to get an OLED and PS5 for the price of a 3090/Ampere Titan A.

...Jesus that really puts things in perspective.

But that doesn't mean anything if you don't want to play PS5 games. I want to play games like Escape from Tarkov and Hunt: Showdown at 240Hz+ with a mouse and keyboard. Sure, it's expensive, but I have no interest in games like Spider-Man or Horizon, so I'd be wasting my money anyway.
 

Madflavor

Member
Cyberpunk 2077 is a cross-generation game, however. Hence, it shouldn't be used as a basis for determining how viable a graphics card will be for games that are designed for the PS5 and XSX and then ported to PC.

That's a fair point. For someone like me, though, the reason I'm upgrading my PC is specifically Cyberpunk 2077.
 

kiphalfton

Member
I'm wondering how do you guys set a budget for your GPU? I mean even if you're made of money, only very few people are dumb enough to just waste cash on a card that won't last them that long. Some people hold onto say an RTX 2080Ti for more than 6 years and are fine with it. Others dump it after 4 years and so on. At what point does a GPU become not worth the investment to you? How much per year or month does a GPU have to cost you in order for it to be considered reasonably priced? I still haven't figured out this for myself.

Sell the old model and use said money towards the new model (this applies to smartphones, video game consoles, computer parts, etc.).
 

BluRayHiDef

Banned
That's a fair point. For someone like me, though, the reason I'm upgrading my PC is specifically Cyberpunk 2077.

I don't mean to come across as rude, but that's ridiculous. You're upgrading for one 50-hour-or-more experience. What about after that? What if your card isn't good enough for games designed primarily with the next generation of consoles in mind and then ported to PC?
 

GamingArena

Member
I'm wondering how do you guys set a budget for your GPU? I mean even if you're made of money, only very few people are dumb enough to just waste cash on a card that won't last them that long. Some people hold onto say an RTX 2080Ti for more than 6 years and are fine with it. Others dump it after 4 years and so on. At what point does a GPU become not worth the investment to you? How much per year or month does a GPU have to cost you in order for it to be considered reasonably priced? I still haven't figured out this for myself.

Being on the top end is not as expensive as you think. Whoever got a 2080 Ti day one at $1200 USD had 2 years of top performance and could still sell it for $1000-1100 on the used market even this week, as we speak. That's being at the top of the GPU chain for no more than $100-200 over 2 years; it does not get better than that.

That's what I've been doing forever. The initial investment might be high, but keeping it going after that costs at most $200-300 every 1-2 year upgrade to stay on top. I was moving from Titan to Titan and never lost more than $200 on the upgrade to the next one.
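The upgrade-chain math in this post can be sketched the same way, net of resale, using the poster's ballpark figures (the $1100 resale value is their estimate, not a quoted price):

```python
def net_upgrade_cost(buy_price: float, resale_of_old: float) -> float:
    """Out-of-pocket cost of holding a top-end card for one cycle,
    after selling it on the used market."""
    return buy_price - resale_of_old

# 2080 Ti bought day one at $1200, resold for ~$1100 two years later.
print(net_upgrade_cost(1200, 1100))  # 100.0 -> roughly $50/year at the top of the stack
```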
 
What about after that? What if your card isn't good enough for games designed primarily with the next generation of consoles in mind and then ported to PC?

Zero chance that would happen. A 3070 purchase would hold you throughout the entire console generation and then some. Maybe even a 3060.
 