
NVIDIA GeForce RTX 3080, RTX 3070 leaked specs: up to 20GB GDDR6 RAM

Kenpachii

Member
If the 3070 and 3080 specs are supposedly leaked, it makes me wonder if nvidia will go back to launching those first and then save the 3080 Ti and Titan for later. If so then I’ll be keeping my 11gb 1080 Ti until the big card launches.

It will happen unless AMD forces their hand, really. They will do exactly what they did with the 2000 series drip feed. I expect the 3080 Ti to launch half a year later, unless AMD pushes them right out of the gate.
 
Last edited:

Ascend

Member
So based on the specs, the RTX 3070 is basically going to be an RTX 2080 Super+, and the RTX 3080 will most likely be slightly slower than the 2080Ti.

Let's see what they do with prices. I'm expecting another price hike, i.e. the RTX 3070 at RTX 2080 prices rather than RTX 2070 prices.
 
Last edited:

pawel86ck

Banned
20GB VRAM? I'm not surprised, because next gen consoles will launch in Q4 2020, so VRAM requirements will go through the roof for sure.
 

Kenpachii

Member
20GB VRAM? I'm not surprised, because next gen consoles will launch in Q4 2020, so VRAM requirements will go through the roof for sure.

Well, half the console's memory as VRAM on a GPU was comfortably enough for this entire generation, so 8GB is probably going to be it for next generation. With consoles pushing 4K, I could see PC having a lot less need for more VRAM compared to this gen.

I would not be shocked if the 3070 ships with 8GB again, the 3080 with 10GB, the 3080 Ti with 16GB, and the Titan with 24GB.
 
Last edited:

skneogaf

Member
The only information I want is whether it will have HDMI 2.1, as my 2080 Ti is serving me well at the moment but my LG C9 craves 4K@120.
 

JordanN

Banned
Well, half the console's memory as VRAM on a GPU was comfortably enough for this entire generation, so 8GB is probably going to be it for next generation. With consoles pushing 4K, I could see PC having a lot less need for more VRAM compared to this gen.

I would not be shocked if the 3070 ships with 8GB again, the 3080 with 10GB, the 3080 Ti with 16GB, and the Titan with 24GB.
4K on PC is going to be old news. The new benchmark is/should be 8K resolution or more.

Edit: I'm correct. There were 8K monitors shipping in 2017.
 
Last edited:

Kenpachii

Member
4K on PC is going to be old news. The new benchmark is/should be 8K resolution or more.

Edit: I'm correct. There were 8K monitors shipping in 2017.

What I mean is, with consoles pushing for 4K and PC staying at 1080p/1440p, the VRAM allocation on consoles will rise and PC's will be relatively lower, compared to this generation where the PS4 targeted the same resolution as PC.

Also, GPU performance will need to be drastically higher than PC's at those higher resolutions.
 

proandrad

Member

Ya just wait for the 5080, it will be way better than the 3080.
 

SlimySnake

Flashless at the Golden Globes
That's not really that big, tbh. At 2.0 GHz we are looking at 12 and 16 TFLOP GPUs, so the 3070 will now have 2080 power +10% and the 3080 will be a little under 2080 Ti power.

I expected more. I wonder if they can go above 2.0 GHz clock speeds, because that's when we will see a generational difference.
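For anyone who wants to sanity-check those figures: FP32 TFLOPs is just CUDA cores × clock × 2 ops per clock. A rough sketch, where the core counts are my assumption based on the rumored configs, not confirmed specs:

```python
# Back-of-the-envelope FP32 throughput: 2 ops (one FMA) per CUDA core per clock.
# The core counts below are rumored/assumed figures, not confirmed specs.
def fp32_tflops(cuda_cores: int, clock_ghz: float) -> float:
    return 2 * cuda_cores * clock_ghz / 1000.0  # cores * 2 ops * GHz -> GFLOPs -> TFLOPs

for name, cores in [("rumored 3070", 3072), ("rumored 3080", 4096)]:
    print(f"{name}: {fp32_tflops(cores, 2.0):.1f} TFLOPs at 2.0 GHz")
# -> rumored 3070: 12.3 TFLOPs, rumored 3080: 16.4 TFLOPs
```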
 

Siri

Banned
I already know that the 3080ti will replace my 2080ti - but what I really want to know is if the 3080ti will have an HDMI 2.1 connector to power my LG C9 at 4K 120hz.
 

pawel86ck

Banned
Well, half the console's memory as VRAM on a GPU was comfortably enough for this entire generation, so 8GB is probably going to be it for next generation. With consoles pushing 4K, I could see PC having a lot less need for more VRAM compared to this gen.

I would not be shocked if the 3070 ships with 8GB again, the 3080 with 10GB, the 3080 Ti with 16GB, and the Titan with 24GB.
4GB was enough for the vast majority of games, but there are also PS4 ports that can use 10GB (Final Fantasy, for example) or close to it (RE2 Remake) at 4K.

At 8K, VRAM requirements are even higher, and just look what happens when the 2080 Ti runs out of VRAM:



It's a slide show.



Nearly 20GB VRAM usage in certain games and these are just PS4 ports.

Next gen games will be built with SSD streaming in mind, so developers will be able to stream an insane amount of data even if next gen consoles launch with just 16GB (13GB for games and 3GB for the OS).

On the PC market you can buy really fast SSDs, but these aren't built with games in mind, so data decompression speed will bottleneck SSD throughput drastically (XSX and PS5 will use hardware decompression). Without an equally good SSD, developers will be forced to increase memory requirements on PC. Mark my words, PS5/XSX ports will use 20GB (if not more).
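To illustrate the decompression point, here's a rough sketch of how an effective streaming rate falls out of raw read speed, compression ratio, and decompression throughput; every number in it is a made-up placeholder, not an actual console or PC spec:

```python
# Rough sketch: the effective (uncompressed) delivery rate is limited either by
# raw SSD reads * compression ratio OR by decompression throughput, whichever is smaller.
# All figures below are illustrative assumptions, not real console or PC specs.
def effective_stream_gbps(raw_read_gbps, decompress_out_gbps, compression_ratio):
    return min(raw_read_gbps * compression_ratio, decompress_out_gbps)

with_hw_block = effective_stream_gbps(raw_read_gbps=5.0, decompress_out_gbps=10.0, compression_ratio=2.0)
cpu_bound_pc  = effective_stream_gbps(raw_read_gbps=5.0, decompress_out_gbps=2.0,  compression_ratio=2.0)
print(f"dedicated decompression block: ~{with_hw_block} GB/s of game data")
print(f"CPU-bound decompression on PC: ~{cpu_bound_pc} GB/s of game data")
```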
 
Last edited:

VFXVeteran

Banned
Every single game is memory bandwidth limited. 20GB isn't even enough, tbh. When I first bought my 2080 Ti, I wrote a short renderer that would load in TIFs from a film character, all uncompressed, for each of the different lighting variables in a typical shader: diffuse, specular, emission, etc. The total was about 75 TIF files for one character. I quickly page-faulted the GPU memory (11GB).

If you guys ever want to see film quality graphics in videogames, you'll know it's close when we see 64GB VRAM cards.

3080Ti will be mine or whichever one gives me the most VRAM.
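To put rough numbers on how fast uncompressed film textures eat VRAM (the 8K resolution and 16-bit RGBA format below are my own illustrative assumptions, not the actual assets described above):

```python
# Uncompressed texture footprint = width * height * channels * bytes_per_channel.
# 8K maps at 16-bit RGBA are assumed for illustration only.
def texture_mib(width, height, channels=4, bytes_per_channel=2):
    return width * height * channels * bytes_per_channel / 2**20

per_map_mib = texture_mib(8192, 8192)    # 512 MiB per uncompressed 8K RGBA16 map
character_gib = 75 * per_map_mib / 1024  # ~37.5 GiB for a 75-map character
print(f"{per_map_mib:.0f} MiB per map, {character_gib:.1f} GiB total vs. 11 GiB of VRAM")
```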
 

VFXVeteran

Banned
4GB was enough for the vast majority of games, but there are also PS4 ports that can use 10GB (Final Fantasy, for example) or close to it (RE2 Remake) at 4K.

At 8K, VRAM requirements are even higher, and just look what happens when the 2080 Ti runs out of VRAM:



It's a slide show.



Nearly 20GB VRAM usage in certain games and these are just PS4 ports.

Next gen games will be built with SSD streaming in mind, so developers will be able to stream an insane amount of data even if next gen consoles launch with just 16GB (13GB for games and 3GB for the OS).

On the PC market you can buy really fast SSDs, but these aren't built with games in mind, so data decompression speed will bottleneck SSD throughput drastically (XSX and PS5 will use hardware decompression). Without an equally good SSD, developers will be forced to increase memory requirements on PC. Mark my words, PS5/XSX ports will use 20GB (if not more).


The SSD isn't going to solve the memory bandwidth problem. Like, at all.

If you look at the entire memory pipeline from CPU RAM (resident) to GPU registers (in, say, a Volta architecture), here are the rough bandwidth ranges at each stage:

CPU RAM ~ 150GB/s
PCI-E (SSD controller bus) ~ 16GB/s -- OR -- using NVlink (~50GB/s)
GPU RAM (fetch/store) ~ 900GB/s
L2 cache (fetch/store) ~ 2000GB/s
L1 cache (fetch/store) ~ 2000GB/s
GPU registers/shared memory pool ~ 20000GB/s


So the bottom line is to still try to get as much data into RAM (either CPU or GPU) as possible. I'd rather have significantly more VRAM and a regular SSD than less VRAM and a faster SSD. Next-gen consoles will be starved for VRAM/RAM since they use a shared memory model.
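One way to see why the PCI-E hop is the problem: here's how long it takes just to move a working set across each of those stages, using the rough bandwidths above (the 10GB working set is an arbitrary example of mine):

```python
# Time to move a working set across each stage of the hierarchy, using the
# approximate bandwidths listed above. The 10 GB working set is an arbitrary example.
working_set_gb = 10
stages_gb_per_s = {
    "CPU RAM": 150,
    "PCIe (SSD controller bus)": 16,
    "NVLink": 50,
    "GPU RAM": 900,
    "L2 cache": 2000,
    "L1 cache": 2000,
    "registers/shared memory": 20000,
}
for stage, bw in stages_gb_per_s.items():
    print(f"{stage:28s} {working_set_gb / bw * 1000:8.1f} ms")
# The PCIe hop is slower by an order of magnitude, which is why you still want
# the data resident in VRAM rather than streamed from storage mid-frame.
```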
 
Last edited:

VFXVeteran

Banned
What I mean is, with consoles pushing for 4K and PC staying at 1080p/1440p, the VRAM allocation on consoles will rise and PC's will be relatively lower, compared to this generation where the PS4 targeted the same resolution as PC.

Also, GPU performance will need to be drastically higher than PC's at those higher resolutions.

You don't make any sense here.
 

pawel86ck

Banned
The SSD isn't going to solve the memory bandwidth problem. Like, at all.

If you look at the entire memory pipeline from CPU RAM (resident) to GPU registers (in, say, a Volta architecture), here are the rough bandwidth ranges at each stage:

CPU RAM ~ 150GB/s
PCI-E (SSD controller bus) ~ 16GB/s -- OR -- using NVlink (~50GB/s)
GPU RAM (fetch/store) ~ 900GB/s
L2 cache (fetch/store) ~ 2000GB/s
L1 cache (fetch/store) ~ 2000GB/s
GPU registers/shared memory pool ~ 20000GB/s


So the bottom line is to still try to get as much data into RAM (either CPU or GPU) as possible. I'd rather have significantly more VRAM and a regular SSD than less VRAM and a faster SSD. Next-gen consoles will be starved for VRAM/RAM since they use a shared memory model.
I'm aware even an ULTRA fast SSD will not match RAM speed, however streaming speed will skyrocket, and therefore developers will be able to save RAM and load much bigger textures anyway.



As you can see, an SSD used as virtual VRAM obviously makes a difference. When a normal GPU runs out of VRAM there's a slide show, as the Digital Foundry video shows, but the AMD GPU with an onboard SSD can run the same test with a smooth framerate.

Having an ultra fast SSD and an insane amount of memory would be the best scenario, but let's be realistic, these consoles will be cheap, therefore people's expectations should be realistic. And BTW, RAM is one thing, but game size is another. Do we really need 32GB of RAM in consoles when the vast majority of games are around 50GB?
 
Last edited:
D

Deleted member 17706

Unconfirmed Member
Definitely looking to upgrade my GTX 1080 with either a 3080 or 3080 Ti depending on price/performance gains. These are supposed to be out around summer time, right?
 

VFXVeteran

Banned
I'm aware even an ULTRA fast SSD will not match RAM speed, however streaming speed will skyrocket, and therefore developers will be able to save RAM and load much bigger textures anyway.

...

As you can see, an SSD used as virtual VRAM obviously makes a difference. When a normal GPU runs out of VRAM there's a slide show, as the Digital Foundry video shows, but the AMD GPU with an onboard SSD can run the same test with a smooth framerate.

With streaming games, yes, it would help with framerate. But I don't think it's as drastic as you think it is. IOW, this same game could run on a platform with an arbitrary storage device (i.e. a regular SSD or HDD) and the CPU could be good enough when loading the level into RAM. Honestly, it all depends on the type of game. Corridor shooters can have the entire level in RAM upon load. You won't see a dramatic change in gameplay whereby you get props and terrain at vast distances from the camera vs. a PC equipped with a regular HDD. I don't believe it will be like that.

Having an ultra fast SSD and an insane amount of memory would be the best scenario, but let's be realistic, these consoles will be cheap, therefore people's expectations should be realistic. And BTW, RAM is one thing, but game size is another. Do we really need 32GB of RAM in consoles when an entire game's size is somewhere around 50GB?

Sure, why not? The game size might say 50GB but I'm sure all the assets are still compressed and therefore have to be decompressed on-the-fly.
 

lukilladog

Member
You guys seem too excited. I'm confident Nvidia will deliver the minimum upgrade they can and milk 7nm for a while. That should be able to handle the first wave of next gen console ports, barely, choke on a few, and use that frustration to sell you better cards later. PC video cards have become so boring, too much greed from Nvidia.
 
Last edited:

VFXVeteran

Banned
You guys seem too excited. I'm confident Nvidia will deliver the minimum upgrade they can and milk 7nm for a while. That should be able to handle the first wave of next gen console ports, barely, choke on a few, and use that frustration to sell you better cards later. PC video cards have become so boring, too much greed from Nvidia.

Is this a joke post?
 

Kenpachii

Member
4GB was enough for the vast majority of games, but there are also PS4 ports that can use 10GB (Final Fantasy, for example) or close to it (RE2 Remake) at 4K.

At 8K, VRAM requirements are even higher, and just look what happens when the 2080 Ti runs out of VRAM:



It's a slide show.



Nearly 18GB in certain games and these are PS4 ports.

Next gen games will be built with SSD streaming in mind, so developers will be able to stream an insane amount of data even if next gen consoles launch with just 16GB (13GB for games and 3GB for the OS).

On the PC market you can buy really fast SSDs, but these aren't built with games in mind, so data decompression speed will bottleneck SSD throughput drastically (XSX and PS5 will use hardware decompression). Without an equally good SSD, developers will be forced to increase memory requirements on PC. Mark my words, PS5/XSX ports will use 20GB (if not more).


I don't see your point. If consoles focus on 4K, PC will have it easier on VRAM compared to last generation. If your story about SSDs worked, then even 16GB of VRAM would be useless anyway, which isn't helping as an argument against what I said.

You don't make any sense here.

I make all the sense, because higher resolutions consume more VRAM. AC Odyssey consumes 4.6GB of VRAM on my 1080 Ti at 1080p and 6.5GB at 4K.

I'm aware even an ULTRA fast SSD will not match RAM speed, however streaming speed will skyrocket, and therefore developers will be able to save RAM and load much bigger textures anyway.



As you can see, an SSD used as virtual VRAM obviously makes a difference. When a normal GPU runs out of VRAM there's a slide show, as the Digital Foundry video shows, but the AMD GPU with an onboard SSD can run the same test with a smooth framerate.

Having an ultra fast SSD and an insane amount of memory would be the best scenario, but let's be realistic, these consoles will be cheap, therefore people's expectations should be realistic. And BTW, RAM is one thing, but game size is another. Do we really need 32GB of RAM in consoles when the vast majority of games are around 50GB?


This sounds more like what Carmack did back in the day, where they started to address hard drives for megatextures. However, all of those examples use a fixed camera view that barely moves. What if they yank the camera to the other side of the room? You probably get the same issue that nothing gets loaded. While the SSD idea is great, I don't see it working or being practical in any game scenario other than fixed areas. I could be wrong here, but I can't see SSDs replacing RAM even remotely. For loading 8K textures and bigger files faster, the same way traditional hard drives are used, sure. But again, PC will barely sit at 4K or even 8K, so it's not really going to matter anyway.
 
Last edited:

VFXVeteran

Banned
I make all the sense, because higher resolutions consume more VRAM. AC Odyssey consumes 4.6GB of VRAM on my 1080 Ti at 1080p and 6.5GB at 4K.

I'm questioning you saying that consoles will focus on 4k and the PC will focus on a lower resolution. The PC has been powerful enough for 4k gaming for years now. Why would that suddenly be not the case? Why would the PC only be able to handle games at 1080p/1440p? A high-end PC will always have more VRAM available than a console (even next-gen).
 
Last edited:

carsar

Member
Nvidia is not really the kind of company that gives you a lot of VRAM. They won't go from 8 to 20GB with the XX80 cards.

I expect 12. If we are lucky we are getting 16GB, but I think 12 is much more likely.

If you give people too much VRAM, customers have one less reason to hold off on upgrading.

The main two reasons for giving people low amounts of VRAM (like the 2080 with 8GB or the 2080 Ti with 11GB) are:

1) it's cheaper for Nvidia
2) you have to buy a new graphics card sooner rather than later.

20GB for the RTX 3080 is not going to happen, unless it is the flagship model or they are really afraid of AMD/next gen consoles.
256-bit ~ 8/16GB
320-bit ~ 10/20GB
384-bit ~ 12/24GB
I doubt Nvidia would release the 3080 with a 384-bit bus.
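Those capacity options follow directly from the bus width: one 32-bit GDDR6 chip per channel, at either 1GB or 2GB per chip. A minimal sketch of that arithmetic:

```python
# GDDR6 capacity follows from bus width: one 32-bit chip per channel, and current
# GDDR6 chips come in 1 GB (8 Gb) or 2 GB (16 Gb) densities.
def gddr6_capacity_gb(bus_width_bits, gb_per_chip):
    return (bus_width_bits // 32) * gb_per_chip

for bus in (256, 320, 384):
    print(f"{bus}-bit bus: {gddr6_capacity_gb(bus, 1)} GB or {gddr6_capacity_gb(bus, 2)} GB")
# -> 256-bit: 8/16 GB, 320-bit: 10/20 GB, 384-bit: 12/24 GB
```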
 

pawel86ck

Banned
I don't see your point. If consoles focus on 4K, PC will have it easier on VRAM compared to last generation. If your story about SSDs worked, then even 16GB of VRAM would be useless anyway, which isn't helping as an argument against what I said.



I make all the sense, because higher resolutions consume more VRAM. AC Odyssey consumes 4.6GB of VRAM on my 1080 Ti at 1080p and 6.5GB at 4K.



This sounds more like what Carmack did back in the day, where they started to address hard drives for megatextures. However, all of those examples use a fixed camera view that barely moves. What if they yank the camera to the other side of the room? You probably get the same issue that nothing gets loaded. While the SSD idea is great, I don't see it working or being practical in any game scenario other than fixed areas. I could be wrong here, but I can't see SSDs replacing RAM even remotely. For loading 8K textures and bigger files faster, the same way traditional hard drives are used, sure. But again, PC will barely sit at 4K or even 8K, so it's not really going to matter anyway.

I don't see it working or being practical in any game scenario
MS and Sony engineers probably know what they are doing, and the Digital Foundry analysis already proves an SSD will make a difference.

Man, just buy the RTX 3080 20GB version instead of the 10GB version and thank me later :messenger_grinning_sweat:.
 

Thaedolus

Member
I’ll be on board for a 3080 or 3080ti, depending on how stuff like HL Alyx runs on my Index. The real question is do I already think about retiring a 3 year old i7 7700k? It feels like I just got the damn thing compared to my 6 year stint with an i7 2600k
 

VFXVeteran

Banned
Why should it be? Look at the GPU upgrade between the Xbox One and Xbox Series X, nine times as fast. That's not gonna sit well on the PC side if all we get are 20-30% upgrades.

The GPU upgrade is not why I asked if it was a joke post. The current gen 20-series is more than enough to handle console ports.
 

Celcius

°Temp. member
I’ll be on board for a 3080 or 3080ti, depending on how stuff like HL Alyx runs on my Index. The real question is do I already think about retiring a 3 year old i7 7700k? It feels like I just got the damn thing compared to my 6 year stint with an i7 2600k
I’m in the exact same boat as you. I had a 2600K for years and then upgraded to the 7700K as soon as it came out 3 years ago. It still runs all of my games flawlessly (overclocked to 4.8GHz), but I think I’m going to upgrade when the 10th gen Intel mainstream chips come out and then reassign the 7700K to HTPC duty.
 

lukilladog

Member
The GPU upgrade is not why I asked if it was a joke post. The current gen 20-series is more than enough to handle console ports.

The question is what kind of upgrade is needed over the 20 series to handle with ease ports of console games designed with 12 TFLOP GPUs in mind instead of 1.3 TFLOP GPUs. Nvidia's cooked-up 30% ain't gonna cut it.
 
Last edited:

Xdrive05

Member
Any word on RTX improvements yet? Wondering if this 30XX series beefs up the RTX performance linearly as well, i.e. the 3070 doing 2080 RTX performance too. Or if the RTX hardware is even more improved relative to the first gen RTX silicon.
 

GamerEDM

Banned
How much is this thing going to freaking cost? No thanks, I'll stick to my RTX 2080-level performance for $500 on PS5.
 
Last edited:

VFXVeteran

Banned
The question is what kind of upgrade is needed over the 20 series to handle with ease ports of console games designed with 12 TFLOP GPUs in mind instead of 1.3 TFLOP GPUs. Nvidia's cooked-up 30% ain't gonna cut it.

Any 2070 or equivalent and higher. You are assuming that it takes a PC 2-3x more power than a console to implement a game's graphics features. That's just not the case. A console with 1080 Ti performance will still struggle with true 4K res games. And depending on the type of game and who developed it, 60FPS is off the table @ 4K.
 
Last edited:

jono51

Banned
Maybe this is why Sony are releasing Horizon ZD on PC. They can't fit Horizon 2 on the 12gb PS5. It will only fit on a 3080.
 

pawel86ck

Banned
Any 2070 or equivalent and higher. You are assuming that it takes a PC 2-3x more power than a console to implement a game's graphics features. That's just not the case. A console with 1080 Ti performance will still struggle with true 4K res games. And depending on the type of game and who developed it, 60FPS is off the table @ 4K.
I know what you are saying is true, however developers will use tricks like VRS (75% fps boost in 3DMark) or checkerboard rendering, so we should see many 4K 60fps games... just not native, because even VRS reduces sharpness of details. Personally I would be fine with 1800p upscaled to 4K, because on an HDTV I can barely tell the difference between 1800p and 4K, while the performance difference is really big. For example, my 2GHz Strix 1080 Ti runs Rise of the Tomb Raider at 4K at 45fps, but at 1800p the same game runs at a locked 60fps.
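The 1800p figures line up with simple pixel math, assuming performance scales roughly with pixel count (a simplification that ignores fixed per-frame costs):

```python
# Why 1800p is so much cheaper than native 4K: shading cost scales roughly with
# pixel count. This ignores fixed per-frame costs, so it's only a rough model.
pixels_4k    = 3840 * 2160   # ~8.3M pixels
pixels_1800p = 3200 * 1800   # ~5.8M pixels
ratio = pixels_1800p / pixels_4k
print(f"1800p shades {ratio:.0%} of the pixels of native 4K")
print(f"45 fps at 4K -> ~{45 / ratio:.0f} fps at 1800p if performance scaled purely with pixels")
```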
 
Last edited:

Ruben43cb

Member
What card is going to give 20 TFLOPs+?! I expected Nvidia's next big boys to do it, but the fact they are so close to the 2080 etc. is crappy.
 