
Nvidia RTX 3060 announced: 13TF, 12GB GDDR6 - $329, (un)available late February

Korranator

Member
Lol, way to screw over the early adopters. We're better off just waiting for the Super and Ti versions with this nonsense.
 

DeaconOfTheDank

Gold Member
The way I see it, Nvidia is investing in two different approaches to rendering:
  1. Heavy reliance on AI upscaling to get to native 4K/8K, with more developers targeting lower resolutions (e.g. 1440p) such that VRAM isn't a problem. Cyberpunk can be seen as an example of this because, frankly, running it at native 4K without DLSS is a fool's errand. Many techniques also scale down and are less expensive at lower resolutions, reducing the number of shader teraflops needed and instead placing emphasis on tensor teraflops for upscaling. Additionally, by allowing for very realistic real-time lighting, less lighting is baked into materials, which again brings down the amount of VRAM needed. This comes at the cost of placing emphasis on RT teraflops for intersection tests and on the CPU for building BVHs. Nvidia can try to steer developers in this direction by artificially enforcing a VRAM limit (e.g. 10 GB) at which native 4K is very difficult to achieve while adopting modern rendering techniques...
  2. Targeting native resolution, which requires large amounts of VRAM, shader teraflops, and RT teraflops. Going down this path will probably see the emergence of many temporal techniques that build up data and amortize render cost over many frames. The Unreal Engine 5 demo was a good example of this for expensive techniques like global illumination. By making the cards that support this approach very expensive (MSRP $1,499 for the RTX 3090), more developers are inclined to look at cheaper cards and, hopefully, head in the direction that Nvidia is best prepared for.
Finally, with mainstream adoption of SSDs and reliance on storage as an extra pool of pseudo-VRAM primarily for streaming, developers can limit GPU VRAM usage to assets that are directly in the player's view (e.g. the next 5 seconds of gameplay) rather than holding all assets the player might interact with in the near future (e.g. the next 30 seconds of gameplay); see the sketch below.

Sounds like they're just covering their bases.
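To make that streaming idea concrete, here's a minimal toy sketch (my own illustration, not any real engine's or DirectStorage's API; the asset names, sizes, and the 5-second window are invented):

```python
# Toy model: keep only assets needed within a short lookahead window resident
# in VRAM, and "stream" everything else from SSD on demand. Numbers made up.
LOOKAHEAD_SECONDS = 5.0  # a slower-storage design might need ~30s instead

def update_residency(assets, player_time, vram):
    """assets: list of (name, size_mb, needed_at_seconds); vram: dict name -> size."""
    window_end = player_time + LOOKAHEAD_SECONDS
    for name, size_mb, needed_at in assets:
        if player_time <= needed_at <= window_end:
            vram.setdefault(name, size_mb)  # stream in from SSD
        else:
            vram.pop(name, None)            # evict: SSD is fast enough to re-fetch

vram = {}
assets = [("rock_4k", 48, 2.0), ("castle_8k", 512, 4.5), ("dragon", 256, 40.0)]
update_residency(assets, player_time=0.0, vram=vram)
print(vram)  # {'rock_4k': 48, 'castle_8k': 512} -- 'dragon' stays on disk
```

The faster the storage, the shorter that window can be, and the smaller the resident set VRAM has to hold.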
 
I'm considering getting the 3060. I have an RTX 2080 Shadow cloud PC, but the image quality is ridiculously bad for the $30-a-month subscription Shadow charges, not to mention the stuttering in games. Unfortunately, I'm locked into a one-year contract.

I bought into the cloud gaming PC hype and it's not worth it.
 

Md Ray

Member
I'm sure Nvidia will address that issue with Super variants of the 3070 and 3080, which is why I'm waiting for those cards to hit, and I'm sure they're not too far off.
They'd better not be too far off, because my GTX 970 has died and I'm desperately in need of a new GPU. :(
 

GenericUser

Member
The 3080 20GB version will be mine next Christmas. 20 GB of VRAM with the raw power of the 3080 seems to be the sweet spot for playing all "next gen" games at 1440p/60. That will hopefully last until the end of this console gen.
 

adamosmaki

Member
What a deal: two years for a GPU barely faster than last gen's €350 GPUs (if the 3060 Ti is anything to go by, it will be no more than 15% faster than something like the RX 5700), and probably unavailable anywhere close to MSRP.
 

mxbison

Member
I would've liked some more VRAM, but I'm just glad I got a 3070 FE at MSRP and don't have to deal with this circus anymore.

Should be fine at 1440p until the 40xx series arrives, and it'll probably still be worth a decent amount then.
 

TriSuit666

Banned
More like a 3050... amirite?
(Seriously.) These were the alleged specs for a 3050 Ti.

The proposed bog-standard 3060 was supposed to be a little beefier, but with only 6GB of VRAM.

Y'all fixating on the 12GB are missing the point: you're still probably looking at 1660S levels of performance, with probably really crappy RTX performance based on the shader count and bus width.
 
This is so outrageous... I bought a 3080 FE like a week after release and now the 3060 has 2GB more VRAM?!
lol, fuck PC gaming. Officially... At least I paid $699 for it and sold the free Watch Dogs code for $30, so fair enough.
Good luck buying a 12GB 3060 for $670. :p

And you can still sell your 3080 for $500+ in 2 years when the 4080 Ti comes out.
It'll be significantly faster than a 1660S, ballpark RTX 2070 / 1080 Ti performance.
Did you mean to put 2070S and 1080 Ti? Because 2070 to 1080 Ti is a bit of a range.
 
What a deal: two years for a GPU barely faster than last gen's €350 GPUs (if the 3060 Ti is anything to go by, it will be no more than 15% faster than something like the RX 5700), and probably unavailable anywhere close to MSRP.

So spend a tad more and get the 3060 Ti, which will be the best-value card in the Nvidia lineup this gen...

Or keep waiting on an AMD miracle. Oh wait...
 

Xyphie

Member
Did you mean to put 2070S and 1080 Ti? Because 2070 to 1080 Ti is a bit of a range.

It'll definitely be slower than a 2070S.

If you index performance against the 2070 (= 100), it'll look roughly like this:

1660S: ~70
2070: ~100
3060: ~100-105
1080Ti: ~110
2070S: ~115
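To put that index in concrete terms, here's a quick back-of-the-envelope conversion; the 60 fps baseline is an assumption for illustration, not a benchmark:

```python
# Convert the relative index above into hypothetical frame rates by scaling
# a known FPS on the reference card by the index ratio. Illustrative only.
index = {"1660S": 70, "2070": 100, "3060": 102.5, "1080Ti": 110, "2070S": 115}

def expected_fps(card, ref="2070", ref_fps=60.0):
    return ref_fps * index[card] / index[ref]

for card, score in index.items():
    print(f"{card}: index {score} -> ~{expected_fps(card):.0f} fps")
# If a 2070 averages 60 fps, a 1660S lands around 42 fps and a 2070S around 69.
```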
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
It'll definitely be slower than a 2070S.

If you index performance against the 2070 (= 100), it'll look roughly like this:

1660S: ~70
2070: ~100
3060: ~100-105
1080Ti: ~110
2070S: ~115
The 1080 Ti and 2070 Super have pretty much the same performance in classic workloads.

And the 3060 won't be at that level.
It'll likely be at 2060 Super level.

Which makes the card all the more WTF.
You won't be VRAM-constrained when the card can't even power its way to filling its VRAM and will perform similarly to the card it's replacing, the 2060 Super.

I guess it's a "decent" card for low-tier workstations, where not needing to go out of core will help.
But for a gaming build it's barely worth the price of entry.

It needed to be $300 or less, especially knowing that we will never actually get cards at the $329 MSRP.

If you don't care too much about RT/DLSS, the 5700 XT is already a better card.
And if you do, you could likely find a 2070 Super for less than the scalper price you'll likely have to pay.
 

TriSuit666

Banned
It'll be significantly faster than a 1660S, ballpark RTX 2070 / 1080 Ti performance.
Nope, this is Super territory spec-wise: bumped memory speed, but a narrower bus. Significantly fewer cores, though.

But when you've got MLID saying good things about it, then perhaps it does bear looking at the reviews.

What do I care? I got my 3070 OC model at less than MSRP, so I'm happy with its performance.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
If it can do 1440p upscaled to 4K decently, then I may consider it.

Look at 2060 Super benchmarks and you'll have an idea of what this card can do at 1440p.
Sub-60 at max settings in modern games... but with some optimized settings you should be able to get to 60, easy.
 
Look at 2060 Super benchmarks and you'll have an idea of what this card can do at 1440p.
Sub-60 at max settings in modern games... but with some optimized settings you should be able to get to 60, easy.

Wow. The 2060 Super was actually a decent card. I looked at Guru3D's benchmark, which ran The Witcher 3 at max details at 1440p with an average of 91 fps. That is decent. So the 3060 would be pushing towards 100? I may have to consider upgrading from an RX 5500 XT.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Wow. The 2060 Super was actually a decent card. I looked at Guru3D's benchmark, which ran The Witcher 3 at max details at 1440p with an average of 91 fps. That is decent. So the 3060 would be pushing towards 100? I may have to consider upgrading from an RX 5500 XT.

Yeah, the 2060 Super was a hell of a card.
At $400, no less... and stock was actually available.
It's just weird to replace the 2060S with the 3060... no real gen-on-gen improvement isn't a good look.
Nvidia should have just kept the 3050 name for this card.


If you are playing older titles, this card should be more than sufficient for 1440p.
But if you are playing more modern titles, DLSS 4K might be a tall order.
Wait for stock to normalize and opt for the 3060 Ti... it performs at or beyond a 2080 Super, which will be comfortable at 1440p for at least a little while.
 

TriSuit666

Banned
Yeah, the 2060 Super was a hell of a card.
At $400, no less... and stock was actually available.
It's just weird to replace the 2060S with the 3060... no real gen-on-gen improvement isn't a good look.
Nvidia should have just kept the 3050 name for this card.


If you are playing older titles, this card should be more than sufficient for 1440p.
But if you are playing more modern titles, DLSS 4K might be a tall order.
Wait for stock to normalize and opt for the 3060 Ti... it performs at or beyond a 2080 Super, which will be comfortable at 1440p for at least a little while.
Yep, especially when it seems the laptop 3060 part has better specs (but less VRAM).
 

KungFucius

King Snowflake
(un)available lol indeed.

The fact that it has 12GB of VRAM surely means there will be 3070/3080 Ti/Super cards, and that they will have even more VRAM?

That's going to really shit on the people who just paid $1,000 for a 3080, if true.
The new prices have the current 3080 at or near $1,000 for some models. Regardless, the 2080 Ti was still selling for $1,200 or so right past the 3080 launch. They don't lower prices; they just add products that surpass the old ones. The 3080 Ti/Super will undoubtedly cost more than the 3080 because the RAM will cost more. I fail to understand how introducing a more expensive card several months later shits on those who bought the first cards and have enjoyed them for those several months, even if they bypassed the frustrating and time-consuming process of trying to acquire one at MSRP and got one from a scalper.
 

TriSuit666

Banned
The new prices have the current 3080 at or near $1,000 for some models. Regardless, the 2080 Ti was still selling for $1,200 or so right past the 3080 launch. They don't lower prices; they just add products that surpass the old ones. The 3080 Ti/Super will undoubtedly cost more than the 3080 because the RAM will cost more. I fail to understand how introducing a more expensive card several months later shits on those who bought the first cards and have enjoyed them for those several months, even if they bypassed the frustrating and time-consuming process of trying to acquire one at MSRP and got one from a scalper.
Well, it doesn't, does it?

I, for one, will be sitting back smugly, chuckling softly to myself at the lamentations of those driven before us who tried to order this card, bearing in mind scalpers should find it an easier pick now that pre-orders won't be a thing.
 

adamosmaki

Member
So spend a tad more and get the 3060 Ti, which will be the best-value card in the Nvidia lineup this gen...

Or keep waiting on an AMD miracle. Oh wait...
Who said anything about AMD? Also, the 3060 Ti is not that much better: 11% faster than a 5700 XT, with a similar launch price, and currently nowhere to be found for less than €600. The same thing applies to AMD this gen, although I expect to see their offerings in the €300-400 range.
 

Buggy Loop

Member
Lower bandwidth = less effective RTX IO/DirectStorage = more idle data has to sit in VRAM = more VRAM allocation.

Some of you seem surprised? 6GB would have been way too low for that bandwidth.

And guys, stop thinking of VRAM requirements as the linear progression of the past two decades. Microsoft, on Xbox and PC with their API, and Sony, with their IO module, have just made the biggest ripple in IO management since the creation of GPUs. The way forward is having VRAM act like a buffer, with barely any idle data. That requires a lot of bandwidth. Maybe some doofus dev will make a shit port on PC with 8K textures and disable all the IO improvements found on consoles just to push past a 10GB VRAM pool at 4K, but that will be a minority, especially with graphics engines shifting to this way of working.
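As a rough illustration of that VRAM-as-buffer idea (a toy LRU model of my own, not how RTX IO/DirectStorage actually works; the pool size and chunk sizes are invented):

```python
from collections import OrderedDict

class VramBuffer:
    """Toy model: VRAM as an LRU cache over hot data, with SSD as backing store."""
    def __init__(self, capacity_mb):
        self.capacity = capacity_mb
        self.used = 0
        self.resident = OrderedDict()  # asset -> size_mb, least recently used first

    def request(self, asset, size_mb):
        if asset in self.resident:      # hit: the data was not idle
            self.resident.move_to_end(asset)
            return "hit"
        # miss: evict idle data until the new asset fits, then stream it in
        while self.used + size_mb > self.capacity and self.resident:
            _, evicted = self.resident.popitem(last=False)
            self.used -= evicted
        self.resident[asset] = size_mb
        self.used += size_mb
        return "streamed"

buf = VramBuffer(capacity_mb=10_000)         # e.g. a 10GB pool
for chunk in ("chunk_1", "chunk_2", "chunk_3", "chunk_1"):
    print(chunk, buf.request(chunk, 4_000))  # chunk_3 evicts chunk_1
```

The point: with fast storage, a miss is cheap, so the buffer (VRAM) doesn't have to hold everything "just in case".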
 

llien

Member
when the weaker version has more vram

fucking wat

 

RavenSan

Off-Site Inflammatory Member
So is Nvidia just releasing new cards every 6 months now?

I am cry that I bought a 2070 at launch.
 

Deleted member 17706

Unconfirmed Member
The 3080 20GB version will be mine next Christmas. 20 GB of VRAM with the raw power of the 3080 seems to be the sweet spot for playing all "next gen" games at 1440p/60. That will hopefully last until the end of this console gen.

I doubt there will be a single game in the next 3 years that needs 10GB VRAM at 1440p.
 

Buggy Loop

Member
I doubt there will be a single game in the next 3 years that needs 10GB VRAM at 1440p.

He will choke on rasterization/RT long before that (just look at Cyberpunk 2077) and have to drop effects or use DLSS anyway, ending up using less than half his memory.

This is a hundreds-of-dollars markup for insecure people, haha.

But he could install 500+ Skyrim mods... 🤔
 
Who said anything about AMD? Also, the 3060 Ti is not that much better: 11% faster than a 5700 XT, with a similar launch price, and currently nowhere to be found for less than €600. The same thing applies to AMD this gen, although I expect to see their offerings in the €300-400 range.
LOL, like the 6800 XT and how it was supposed to be cheaper than the 3080? They price gouged even before the recent tariffs. No way the 6700/6700 XT come in as good a value as the 5700/5700 XT are now. Those cards were priced just barely low enough at launch, and only because they got jebaited into lowering by $50, to $400 and $450 respectively. They've only become outstanding values as a result of supply and demand. AMD has (incorrectly) surmised that because they're now able to charge market price for their CPUs, the same would extend to their GPUs.

If you're talking about a lower-end AMD card to compete in the 3060/3060 Ti space, well, of course they should come in under Nvidia, because they're objectively worse. Terrible ray tracing performance, no guarantee of a competent DLSS 2.0 competitor, and likely on-par rasterization performance for the same price? No thank you. The 6600 would need to be sub-$300 to be considered a win. AMD fucked up this generation in a big way in terms of gaining back GPU market share; that's a fact that will be borne out more and more as supply constraints dissipate. They're more focused on the consoles and it shows. That could be the right business decision, of course, but enthusiasts are going to lament it for another generation to come.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I find it funny that Nvidia can just screw people over so badly with their VRAM, and when they put out cards with more only months later, their fans just call it "fixing the issue".

You're gonna run out of horsepower before the benefit of 12GB of VRAM shows up on the 3060.
It has a lot less power than the 3060 Ti... Nvidia just couldn't release a 6GB card in this day and age, so the choices were limited.

This card will be outperformed every single step of the way by its older brothers; the VRAM advantage is practically moot because the chip itself is weak.
It's barely beyond the card it's replacing, the RTX 2060 Super.
This should have been called the 3050.
 

Bo_Hazem

Banned
I can see a scenario where the RTX 3070 will get choked to death when using max-quality textures in a future game from a studio like id Software, whereas the RTX 3060 will have no such issues.

Here's an extreme example at 4K.

Performance gets dragged down when you're VRAM-bound. Look at this!

[benchmark chart: the RTX 3060 Ti matching the RTX 3070 once VRAM becomes the bottleneck]


RTX 3060 Ti = RTX 3070!

Hold off on getting your RTX 3060 Ti and 3070, people.
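For what it's worth, here's a crude back-of-the-envelope model (my own numbers and assumptions: memory traffic as a simple weighted split between VRAM and PCIe) of why spilling past VRAM craters performance like that:

```python
# Weighted harmonic mean of bandwidths: even a small fraction of memory
# traffic falling back to PCIe drags effective bandwidth toward the PCIe rate.
VRAM_GBPS = 448.0  # ballpark GDDR6 bandwidth on a 3070-class card
PCIE_GBPS = 16.0   # roughly PCIe 4.0 x16

def effective_gbps(spill_fraction):
    return 1.0 / ((1.0 - spill_fraction) / VRAM_GBPS + spill_fraction / PCIE_GBPS)

for spill in (0.0, 0.05, 0.20):
    print(f"{spill:.0%} spilled -> ~{effective_gbps(spill):.0f} GB/s")
# 0% -> 448 GB/s, 5% -> ~191 GB/s, 20% -> ~70 GB/s: a small spill, a big cliff.
```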

EDIT:
But rants aside...

My thoughts on the RTX 3060 are positive. I can see it being toe to toe with the XSX/PS5 in rasterization, or slightly below (depending on the game), and it should be able to handle higher-quality textures than the consoles in next-gen games. And you get the usual DLSS, faster RT perf, full DX Ultimate support, and now Resizable BAR support too, across all 20- and 30-series GPUs.

And that's not even native 4K, upscaled by DLSS.
 

Bo_Hazem

Banned
Can there be one card, just one card, that doesn't look garish and tacky like the above examples?
What happened to sophistication when it comes to design?

Good lord, all these cards look like baby Transformers.

Mate, the main problem is that we're getting too old for these toys. :lollipop_tears_of_joy:
 

TriSuit666

Banned
You're gonna run out of horsepower before the benefit of 12GB of VRAM shows up on the 3060.
It has a lot less power than the 3060 Ti... Nvidia just couldn't release a 6GB card in this day and age, so the choices were limited.

This card will be outperformed every single step of the way by its older brothers; the VRAM advantage is practically moot because the chip itself is weak.
It's barely beyond the card it's replacing, the RTX 2060 Super.
This should have been called the 3050.
Wat this they/them/he/her/she/him said...

This is a rebranded 3050 Ti. Shimples.
 

supernova8

Banned
Considering I'm not bothered about super-cool gaming performance, I might check back on the prices of something like the 2060 Super once stock stabilises.
 

Ascend

Member
And that's not even native 4K, upscaled by DLSS.
Remember the X360/PS3 days? Upscaling during those times was considered abysmal and pretty much an abomination. The tech was trashed, ridiculed, etc. Even now, with the PS5 and XSX, the consoles are basically shamed for being too weak to run things natively and for using things like checkerboarding or dynamic resolution with upscaling.

In comes DLSS, and suddenly upscaling is the greatest thing ever.
And right now we're gonna get a bunch of people telling me that this is different because AI blah blah blah.
 

Bo_Hazem

Banned
Remember the X360/PS3 days? Upscaling during those times was considered abysmal and pretty much an abomination. The tech was trashed, ridiculed, etc. Even now, with the PS5 and XSX, the consoles are basically shamed for being too weak to run things natively and for using things like checkerboarding or dynamic resolution with upscaling.

In comes DLSS, and suddenly upscaling is the greatest thing ever.
And right now we're gonna get a bunch of people telling me that this is different because AI blah blah blah.

It's pretty strange, actually. On top of all that, it needs special implementation and extra work from the devs to get it working, and you may have to deal with artifacts and shit in motion, or extreme sharpening beyond native 4K. It's a great tech, but it's not automatic. Most other options so far, even if you can notice they're not native 4K, are flawless.

Suddenly it's 4K with DLSS. Take Demon's Souls, for example: if the developers hadn't told us, its performance mode would look "more" native 4K than most other "native" 4K games, yet it's using a 1440p source. It should be automatic and available in every game. Actually, TVs are now doing that extra work, at least with the new XR upscaling in the latest Sony TVs. If it works perfectly, it boosts the results of the game even more.

They already upscale from 1080p to 4K impressively on much older TVs as well. Timestamped:

[embedded video]
The only thing I think Nvidia has the edge in is ray tracing, according to the latest comparisons. Could it get closer with newer games that take full advantage of AMD's implementation? New drivers? I don't know. But AMD will have their own super sampling that will work automatically with any game.
 

DeaconOfTheDank

Gold Member
Remember the X360/PS3 days? Upscaling during those times was considered abysmal and pretty much an abomination. The tech was trashed, ridiculed, etc. Even now, with the PS5 and XSX, the consoles are basically shamed for being too weak to run things natively and for using things like checkerboarding or dynamic resolution with upscaling.

In comes DLSS, and suddenly upscaling is the greatest thing ever.
And right now we're gonna get a bunch of people telling me that this is different because AI blah blah blah.
Upscaling as a concept has never been a bad thing. The implementations in the past were simply not up to par, because the sacrifice in visual quality was too great. It wasn't until the PS4 Pro and the introduction of checkerboard rendering that upscaling became good enough for AAA adoption (also, see Spider-Man and Demon's Souls for temporal injection). DLSS is just another implementation that improves on upscaling.

Anybody who argues against upscaling as a concept is letting stupid fanboyism blind them to legitimate benefits and advances in graphics technology.
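As a toy illustration of the checkerboard idea (heavily simplified, my own sketch; real implementations add motion vectors, and DLSS swaps the reuse heuristic for a neural network):

```python
import numpy as np

def checkerboard_frame(scene, prev, frame):
    """Shade half the pixels this frame; reuse the other half from last frame."""
    out = prev.copy()
    mask = (np.indices(scene.shape).sum(axis=0) + frame) % 2 == 0
    out[mask] = scene[mask]  # only these pixels pay full shading cost
    return out

scene = np.random.rand(4, 4)    # stand-in for the fully shaded "ground truth"
img = np.zeros_like(scene)
for frame in range(2):          # alternate checkerboard phases
    img = checkerboard_frame(scene, img, frame)

assert np.allclose(img, scene)  # static scene: converges to native quality
# Motion is the hard part: deciding which old samples are still valid is where
# motion vectors, heuristics, and AI earn their keep.
```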
 