
RTX 4070 Review Thread

StereoVsn

Member
Folks in general would indeed be better off with, say, a 6950 XT, which is about the same price, give or take a few dollars.

You won't get good frame rates on a 4070 with RT anyway, DLSS or not.
 
7 years ago is a long time ago. You don't see console gamers whining because consoles (with a disc drive) are now $500 instead of the $300 current gen consoles were going for back in 2016.

Well, in 2013 the Xbox One launched at $500 and the PS4 at $400; in 2020 both launched with models at the same price points (though you did lose the disc drive on the $400 PS5). You can't really compare current-gen consoles to GPUs in that way, as both did an amazing job holding the line on price while upping the base tech quite a bit.
 
If I were spending $799 on a GPU I'd get the 4070 Ti, since it has better RT than the 7900 XT.

At 33% more money I'd hope the 7900 XT is better than the 4070, but sadly in RT the 7900 XT only wins by about 10%. And FSR is still useless at 1440p compared to DLSS.

I'd never consider such a card. I'll keep buying the xx70 every time and getting a nice bump as long as prices remain reasonable.


But at what price? Even if it's $599 and 15% faster in raster, it's still probably going to lose to the 4070 in RT and probably use more power, while still having abysmal FSR2 1440p image quality (unless FSR2 improves soon).


FSR2 image quality is abysmal at 1440p. 40 FPS is an abysmal frame rate. I use a high-refresh monitor. I like running at around 100 FPS at least and would prefer maxing out my monitor at approaching 200 FPS.


I usually buy a GPU every year. The 3070 at nearly 2 years is the longest I've ever held on to a GPU, I'm overdue an upgrade.

I'll probably have a 5070 in 2025. VRAM limits are the last thing on my mind. Nvidia will give the x70 the VRAM it needs when it matters.
Maybe I'm too old, but in the past I only upgraded my PCs when games were running below 30 fps, or at low settings. My GTX 1080 can run games with comparable quality to the PS5 despite its age, and that's good enough for me... or should I say it was good enough :p, because now my GTX 1080 is VRAM-limited in more and more games, and that's indeed a problem for me, because low texture quality looks like garbage.

Yes, locked 40 fps is playable to me. In fact, 40 fps on my 170 Hz nano-IPS VRR monitor is sharper and more responsive during motion than 60 fps was on my old 60 Hz AS-IPS monitor (I no longer need to use laggy VSYNC, and because each 40 fps frame is refreshed multiple times on my VRR display, the sharpness improves).

Of course I still prefer to game at 60 fps, or even higher (especially in fast-paced first-person shooters), but I can only tell the difference in latency up to 90 fps; above that it's placebo territory for me. Only some extremely laggy games (games differ in input lag) will benefit from 100 fps and more (because higher fps reduces input lag).



"what is the optimal temporal frequency of an object that you can detect?”

And studies have found that the answer is between 7 and 13 Hz. After that, our sensitivity to movement drops significantly. “When you want to do visual search, or multiple visual tracking or just interpret motion direction, your brain will take only 13 images out of a second of continuous flow, so you will average the other images that are in between into one image.”

Please read this quote if you really think you can see 200Hz :p.

To be fair, higher fps on a sample-and-hold display will improve sharpness during motion, but you need EXTREMELY high fps (240 fps) to even match the motion clarity of a plasma running the same game at 60 fps (in order to match a CRT you would need 1000 fps). On a sample-and-hold display it's better to use BFI, because then even 60 fps will look reasonably sharp thanks to BFI (way sharper than even 120 fps without BFI). The only downside is much lower brightness, but some displays are bright enough to compensate for this.

Even my plasma has a sharper image at 60 fps than my nano-IPS sample-and-hold display at 170 fps. I would have to buy a 240 Hz monitor (and run games at 240 fps on top of that) to match the motion quality of my 42VT30 plasma in a 60 fps game.
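To put rough numbers on the sample-and-hold point: perceived motion blur scales with how long each frame is held on screen. Here is a minimal sketch of that persistence math (the ~4 ms plasma/BFI and ~1 ms CRT figures are my ballpark assumptions chosen to illustrate the comparison, not measurements from this thread):

```python
# Sketch of sample-and-hold motion blur: an object moving across the screen
# smears by roughly (speed * frame persistence) pixels, because each frame
# is held until the next one arrives.
def smear_px(speed_px_per_s: float, persistence_ms: float) -> float:
    return speed_px_per_s * persistence_ms / 1000.0

SPEED = 2000.0  # example pan speed in pixels per second

cases = [
    ("60 fps sample-and-hold", 1000 / 60),    # ~16.7 ms persistence
    ("170 fps sample-and-hold", 1000 / 170),  # ~5.9 ms
    ("240 fps sample-and-hold", 1000 / 240),  # ~4.2 ms
    ("60 fps at ~4 ms persistence (plasma/BFI ballpark)", 4.0),
    ("~1 ms persistence (CRT ballpark)", 1.0),
]
for label, persistence in cases:
    print(f"{label}: ~{smear_px(SPEED, persistence):.0f} px of smear")
```

In this model, 240 fps sample-and-hold only lands in the same ballpark as a low-persistence 60 fps image, and CRT-like ~1 ms persistence would need on the order of 1000 fps, which is roughly the comparison made above.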
 

Buggy Loop

Gold Member
Folks in general would indeed be better off with, say, a 6950 XT, which is about the same price, give or take a few dollars.

You won't get good frame rates on a 4070 with RT anyway, DLSS or not.

(image)


?
 

Haint

Member
There’s such a huge gap between 4090 and 4080, it’s mind blowing.

The 4080 should have been the 4070, with the 4080 sitting somewhere in that gap range at ~12K cores.

Really holding on until 5000 series, hopefully we return to a sense of normalcy (ha ha ha… ain’t happening, right?)

Nvidia's stock price is through the roof and they've been holding the line on price for 6 months, suggesting demand is mostly matching supply, so no, it ain't happening. The PCMR has spoken, and they've overwhelmingly supported $1,300 mid-range GPUs, just like they support $1,400 27" monitors and $300 mice and keyboards.
 

Xcell Miguel

Gold Member
DLSS3 has a very small input lag penalty (at least compared to TVs' motion interpolation), but you have to run your games with unlocked fps. If you want to lock your fps (for example with RivaTuner) when DLSS3 is enabled, you get an additional 100 ms penalty. Personally, I always play with an fps lock, because even on my VRR monitor I can feel when the fps is fluctuating.
For a few months now, DLSS3 FG has been able to run with a locked FPS on VRR displays. Just enable Vsync in the Nvidia Control Panel and, with FG enabled, the game will be locked at 116 FPS if your screen is 120 Hz, for example. It's a feature added a few months ago to make DLSS3 FG work better with VRR displays, locking it under the max refresh rate to keep latency low.
Also, if a game offers a frame-limit setting, you can set it to half the value you want, as DLSS3 FG will double it; that's how I did it in A Plague Tale: Requiem before this driver update.
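The arithmetic behind the half-rate trick is trivial, but for anyone scripting per-game settings, here is a minimal sketch (the helper name is mine; the 116-at-120 Hz figure is the driver-side cap described above):

```python
# DLSS3 frame generation roughly doubles the presented frame rate, so an
# in-game frame limiter should be set to half the output you actually want.
def in_game_cap(target_output_fps: int) -> int:
    return target_output_fps // 2

# Aiming for the ~116 FPS the driver targets on a 120 Hz VRR display:
print(in_game_cap(116))  # -> 58, the value to enter in the game's own limiter
```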
 
Ehhh not quite enough for me to upgrade from a 2080 yet. I'm 1440 for life basically and there are not enough games I want to play out there yet that really make it sweat.
I'm willing to spend $600 - $800 when I do upgrade, so maybe I'll wait for the 5070 to get a massive jump and mayyybe start playing around with 4K.
Starfield might make me bite the bullet... if it ends up being something truly amazing. Those AMD cards with all that VRAM are pretty enticing, though.
 

Crayon

Member
AMD cards aren't a great price at release, but once they've had some cuts they start looking good. Those 16 GB 6800s going for under $500 are looking good.
 

Senua

Gold Member
Not really a fair comparison since the x80 used to actually be the mid-tier (104) chip from 900 series all the way up to the 20-series.

GP104 (1070) was 29% faster than full fat GM204 (980)
TU106 (2070) was 16% faster than full fat GP104 (1080)
GA104 (3070) was 26% faster than close to full TU104 (2080)
AD104 (4070) is 26% faster than close to full GA104 (3070)

Looking at it this way nothing has changed.
Happy Cracking Up GIF by MOODMAN


That's a good shill, are you the white cat on Jensen's lap getting your hairs on his nice new jacket? You better lick them off you little shit.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
That's a good shill, are you the white cat on Jensen's lap getting your hairs on his nice new jacket? You better lick them off you little shit.
Instead of name calling, tell me where I lied?

Oh, that's right you can't, so you resort to childish name calling :messenger_tears_of_joy:
 

hinch7

Member
Happy Cracking Up GIF by MOODMAN


That's a good shill, are you the white cat on Jensen's lap getting your hairs on his nice new jacket? You better lick them off you little shit.
Not sure if the guy has stock in Nvidia, because that's some questionable logic. Performance-wise it's one of the worst jumps we've seen from a 70-class card.

1070 > 980Ti
2070 =1080Ti
3070 > 2080Ti
4070 = 3080

And this is the first time Nvidia has released a 70-series card with essentially the same specs (cores, TMUs, ROPs, etc.) as last gen's 70 tier. It's a heavily gimped die with a 192-bit bus, despite being an AD104, relying on the node advantage for higher clocks, plus GDDR6X and a modest boost in cache. Really, this is a 4060 Ti tier card at best.
 
I can't lie: after holding off due to the pricing, I finally caved in and bought a 4080 Founders Edition card a few weeks ago for £1,199 from NVIDIA (via Scan) as an upgrade to my VRAM-starved 3080, which was struggling with many recent games at 1440p with RT, and I am absolutely delighted with it. However, part of me regrets giving in and paying that amount for an 80-series card when the 3080 was 'only' £749 and I only bought it in June 2022. The pricing of the 40-series cards is utterly obnoxious in my opinion, but my only upgrade option was the 4080. I don't think the new AMD cards are that impressive, and their feature set (FSR etc.) is inferior to NVIDIA's. Also, DLSS3 frame generation really is a lovely addition to the NVIDIA GPU feature set.

The 4070, though, like all the other cards, is overpriced, and the VRAM is likely to become an issue even at 1440p; based on my own experience with my 4080, many games hit 11 GB of usage at 1440p even without RT.
 

DenchDeckard

Moderated wildly
This gen pretty much forced anyone who is serious about this stuff to fork out 1500 for a 4090.

It's an amazing card, but there's no doubt Nvidia are absolutely awful and yet manage to get away with whatever they want. A prime example of what happens to businesses with too much market share.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
2070 =1080Ti
2070 was a bit weaker than 1080 Ti.
Not sure if the guy has stock in Nvidia, because that's some questionable logic. Performance-wise it's one of the worst jumps we've seen from a 70-class card.

1070 > 980Ti
2070 =1080Ti
3070 > 2080Ti
4070 = 3080

And this is the first time Nvidia has released a 70-series card with essentially the same specs (cores, TMUs, ROPs, etc.) as last gen's 70 tier. It's a heavily gimped die with a 192-bit bus, despite being an AD104, relying on the node advantage for higher clocks, plus GDDR6X and a modest boost in cache. Really, this is a 4060 Ti tier card at best.
How is it questionable logic?

GP104 (1070) was 29% faster than full fat GM204 (980)
TU106 (2070) was 16% faster than full fat GP104 (1080)
GA104 (3070) was 26% faster than close to full TU104 (2080)

AD104 (4070) is 26% faster than close to full GA104 (3070)

The bolded was in the picture I quoted.

The RTX 4070 continues the same trend of beating the previous gen's x104 GPU by a similar margin as the previous 3 x70 GPUs did. The only difference is that the 30-series x104 happened to be the 3070 and not the x80, as the x104 had been for the prior 3 generations.

It's a fact, but you people will still continue crying about pricing for some reason.

I find it funny that PC gamers are so outraged by pricing this gen, it's been quite the entertainment :messenger_tears_of_joy:
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
Nothing I say will change anything as the hive mind on the internet will continue believing whatever it wants to believe. Luckily I don't follow the hive mind.

Nvidia's mistake IMO was calling the 3070 a 3070. Usually the 104 is an 80-class GPU (980, 1080, 2080). Nvidia got lambasted trying to call AD104 a 4080 12 GB. In 3 of the 4 prior generations (the 900 series through the 20 series) the full fat x104 was an 80-class GPU.

Now Nvidia is stuck with hive minded individuals calling the 4070 a 4060.
 

FireFly

Member
People seem to be shitting on the 4070 and that it's only close to 3080 performance for 100 dollars less but are excluding:
- 3080 only has 8GB VRAM, which is too low and was at the time and excluded me from being interested coming from a 1080Ti.
- it has better raytracing performance vs 3080
The 3080 has 10 GB of memory. It looks like the 4070 has about 2% better performance in ray tracing at 1440p.
2070 was a bit weaker than 1080 Ti.

How is it questionable logic?

GP104 (1070) was 29% faster than full fat GM204 (980)
TU106 (2070) was 16% faster than full fat GP104 (1080)
GA104 (3070) was 26% faster than close to full TU104 (2080)

AD104 (4070) is 26% faster than close to full GA104 (3070)

The bolded was in the picture I quoted.

The RTX 4070 continues the same trend of beating the previous gen's x104 GPU by a similar margin as the previous 3 x70 GPUs did. The only difference is that the 30-series x104 happened to be the 3070 and not the x80, as the x104 had been for the prior 3 generations.

It's a fact, but you people will still continue crying about pricing for some reason.

I find it funny that PC gamers are so outraged by pricing this gen, it's been quite the entertainment :messenger_tears_of_joy:
The 4070 is 26% faster than the 3070, but is 20% more expensive at MSRP. The other parts you compared launched at the same or a cheaper price than their predecessors. (Well if you compare the 2070 to the 1070 the picture is less favourable)

I feel like 15% more performance per dollar is a good expectation since this should be achievable with clock speed boosts + architectural enhancements.
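To make that concrete with the figures above (26% faster, roughly 20% more expensive, using the 3070's $499 and the 4070's $599 launch MSRPs), a quick back-of-the-envelope check:

```python
# Performance per dollar, 4070 vs 3070, using the numbers quoted in the thread.
perf_ratio = 1.26         # 4070 vs 3070 relative performance
price_ratio = 599 / 499   # launch MSRPs, roughly the 20% quoted above

gain = perf_ratio / price_ratio - 1
print(f"~{gain * 100:.0f}% more performance per dollar")  # ~5%, well short of 15%
```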
 

Marlenus

Member
Not great.

Historically the 1060 matched the 980 and the 2060 matched the 1080.

Given this matches the 3080, it would seem that this should be a 4060, and at a reasonable x60 price of $350 to $400 it would be an outstanding product. At $600, though, it is the best of a bad bunch. Especially with 12 GB of VRAM, I would not be surprised if in a couple of years it ends up being relegated to a 1080p card, and $600 for that is really poor IMO.
 
Nothing I say will change anything as the hive mind on the internet will continue believing whatever it wants to believe. Luckily I don't follow the hive mind.

Nvidia's mistake IMO was calling the 3070 a 3070. Usually the 104 is an 80-class GPU (980, 1080, 2080). Nvidia got lambasted trying to call AD104 a 4080 12 GB. In 3 of the 4 prior generations (the 900 series through the 20 series) the full fat x104 was an 80-class GPU.

Now Nvidia is stuck with hive minded individuals calling the 4070 a 4060.
Accusing others of being "hive minded" while deepthroating corporate dick...the irony.
 

Senua

Gold Member
Instead of name calling, tell me where I lied?

Oh, that's right you can't, so you resort to childish name calling :messenger_tears_of_joy:
The "little shit" was facetious, as I was only calling you that if you were Jensen's cat spreading your white hairs all over the place; they ruin dark clothing, you know. But the "shill" was definitely for you, seeing as you constantly twist any negative situation for Nvidia into a positive; the mental gymnastics are pretty insane, dude. We know what Nvidia has done with the tiers this time around, so how does that excuse them? It shows them as even more greedy and shady, but in your mind we ignore this, compare to the 3070 instead, and ignore how the gain over last gen's 80 card is embarrassing, and a lot of the time even worse? This is unprecedented and is only defensible by shills or Jensen's cat, so you are one of the two.
 
My Steam Deck plays the vast majority of my games at playable framerates. FSR2 image quality is abysmal at 1440p. 40 FPS is an abysmal frame rate for desktop gaming. I use a high-refresh monitor. I like running at around 100 FPS at least and would prefer maxing out my monitor at approaching 200 FPS.


I usually buy a GPU every year. The 3070 at nearly 2 years is the longest I've ever held on to a GPU, I'm overdue an upgrade.

I'll probably have a 5070 in 2025. VRAM limits are the last thing on my mind. Nvidia will give the x70 the VRAM it needs when it matters.
FSR2 looks reasonably sharp with FidelityFX sharpening. I can still see the difference compared to the native image, but only from up close. I can also play games on my 42VT30 Full HD plasma without any upscaling.

Why do you upgrade GPUs every year when it takes 2 years to release a new generation? Also, good luck trying to sell a 3070 at a reasonable price right now, because not many people will want to pay a premium for an 8 GB GPU these days (the mining craze is over). If the 3070 had 12 GB (like the 3060), it would be MUCH better value.
 

StereoVsn

Member
This is running at 1440p with balanced upscaling and getting 50 fps.

One would be much better off with RT disabled, quality upscaling and higher FPS in this game. A 6950 would do much better at that.

Again, RT can be good, but you are sacrificing significant image quality and frame rates for minor lighting improvements. You turned on frame gen here too, which introduces artifacts and isn't available in most games.

The 4070 is not the card for RT, and Cyberpunk is a 3-year-old game to boot. It's actually fairly flexible in config settings and scalability now. Hell, I run it on the Steam Deck.

All that said, to each their own preferences.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Yeah after the 20 series the 30 series was a nice upgrade and hope the 50s get back into the proper groove
Thats the only hope I have.
When RTX20s were widely paned Nvidia "fixed" it with the RTX30s......they got greedy again with the RTX40s and hopefully they "fix" things with the RTX50s.

The xx70 should absolutely match a 4090, then even someone upgrading from a 4080 will get a decent upgrade going to a 5070.
 

Buggy Loop

Gold Member
This is running at 1440p with balanced upscaling and getting 50 fps.

One would be much better off with RT disabled, quality upscaling and higher FPS in this game. A 6950 would do much better at that.

Again, RT can be good, but you are sacrificing significant image quality and frame rates for minor lighting improvements. You turned on frame gen here too, which introduces artifacts and isn't available in most games.

The 4070 is not the card for RT, and Cyberpunk is a 3-year-old game to boot. It's actually fairly flexible in config settings and scalability now. Hell, I run it on the Steam Deck.

All that said, to each their own preferences.

Since when is 50 fps (without frame gen) on a VRR monitor even bad? Dude, seriously, I get this performance on my 3080 Ti at ultrawide 1440p and it blows me away in Cyberpunk. People who don't find the difference worth it are clearly not on cards capable of handling it. Of course at sub-30 fps it's a big no.

What’s the alternative here? Buying a more expensive 6950? These older gen cards consume >100W more, even more so with the 6950. You pay for that, you know, bills, if you have those.

Most of the frame gen artifacts from the day one release have been fixed.

I mean, I just keep hearing the arguments from the day-one reviews of the tech, such as artifacts and latency. All of that is mostly behind us now; the latency claims were debunked, and the misinformation came from techtubers who have no fucking clue.

I'm all for saying that this gen sucks and only the 4090 makes sense (a 7% price increase for a whopping 50% performance increase is almost unheard of), but no, I would not recommend older-gen cards to anyone jumping in right now, unless it's a bottom-range deal like a 6700 XT to hold on and SKIP this gen. Or until AMD someday releases their mid-range, if VRAM is really the best thing in the world and we ignore RT/ML.

There was a huge pushback on 8 GB of VRAM in recent weeks, with AMD marketing saying VRAM matters, for a bunch of shit ports sponsored by AMD where even the potato quality mode takes 6 GB at 1080p and looks worse texture-wise than Crysis did on 256 MB cards. Now 12 GB is a problem? Is that the next push for bullshit sponsored games with memory leaks? :messenger_tears_of_joy:
 
The "little shit" was facetious, as I was only calling you that if you were Jensen's cat spreading your white hairs all over the place; they ruin dark clothing, you know. But the "shill" was definitely for you, seeing as you constantly twist any negative situation for Nvidia into a positive; the mental gymnastics are pretty insane, dude. We know what Nvidia has done with the tiers this time around, so how does that excuse them? It shows them as even more greedy and shady, but in your mind we ignore this, compare to the 3070 instead, and ignore how the gain over last gen's 80 card is embarrassing, and a lot of the time even worse? This is unprecedented and is only defensible by shills or Jensen's cat, so you are one of the two.
We as consumers do not benefit from planned obsolescence. IDK why he defends Nv so much, but maybe you are right and he does work for Nv.

People were expecting 3090-like performance from the 4070, but it seems Nvidia has changed their strategy, because now you have to go with the more expensive 4070 Ti in order to match the top dog (3090) of the last generation.

IMO 3080-like performance is still good enough at 1440p (especially for people like me, who still use a Pascal GPU), but 12 GB is not good enough for sure. I watch YT comparisons and even now some games can max out 12 GB of VRAM (Godfall, for example). I hope that NV will also offer the 4070 with 16 GB of VRAM, because 12 GB is just not enough if you plan to play on the same GPU for more than 2 years.




I can't lie: after holding off due to the pricing, I finally caved in and bought a 4080 Founders Edition card a few weeks ago for £1,199 from NVIDIA (via Scan) as an upgrade to my VRAM-starved 3080, which was struggling with many recent games at 1440p with RT, and I am absolutely delighted with it. However, part of me regrets giving in and paying that amount for an 80-series card when the 3080 was 'only' £749 and I only bought it in June 2022. The pricing of the 40-series cards is utterly obnoxious in my opinion, but my only upgrade option was the 4080. I don't think the new AMD cards are that impressive, and their feature set (FSR etc.) is inferior to NVIDIA's. Also, DLSS3 frame generation really is a lovely addition to the NVIDIA GPU feature set.

The 4070, though, like all the other cards, is overpriced, and the VRAM is likely to become an issue even at 1440p; based on my own experience with my 4080, many games hit 11 GB of usage at 1440p even without RT.
This is exactly what I'm talking about: planned obsolescence. You paid a premium for a 3080 not so long ago, and now you are forced to upgrade not because your GPU is too slow, but because games require more VRAM.
 

FingerBang

Member
PC gaming is awful this gen, price-wise. Nvidia's strategy is to ass-fuck their user base and expect them to say thank you. Their value proposition is horrible unless you go for the top-end, which is weirdly priced reasonably.

To me, it's ridiculous how they raised the price while offering less in each of the tiers, again with the exception of the 4090. The 4070 is priced reasonably compared to the 3070, but it's a smaller chip with the same number of cores. The trend of the xx70 card matching the previous top-end is dead. The 4070 Ti did, though, but guess what? It's $800 now.

And in all this, where the fuck is AMD? Seriously, this is their moment. They could step in and take the market, but nope. N32 and N33 are nowhere to be seen. All we have is rumors, and after overpromising and underdelivering with N31 (good cards that should have been called 7800 XT and 7900 XT and been priced probably $200 less), they're letting Nvidia decide the market and set the price for performance. They don't want to be aggressive on price-to-performance; they are happy to play Nvidia's game. You're the one getting fucked, as always.

Stop defending this. They are making PC gaming a luxury. Get a console instead of waiting for reasonable prices. You get the same, slightly blurrier games for a fourth of the price. Or buy used.
 
Just going to get a 4090 and 7800X3D. Yes, the 4090 is a gross price, but it's such a huge jump over everything else, unlike normal gens when it's a few percent better at the top end.

I have like £10k in my PC fund after skipping the 30-series COVID scalpers, and while Nvidia just pre-scalps everything now, PC gaming is still a very cheap hobby compared to some of the shit people do.
 

Marlenus

Member
We as consumers do not benefit from planned obsolescence. IDK why he defends Nv so much, but maybe you are right and he does work for Nv.

People were expecting 3090-like performance from the 4070, but it seems Nvidia has changed their strategy, because now you have to go with the more expensive 4070 Ti in order to match the top dog (3090) of the last generation.

IMO 3080-like performance is still good enough at 1440p (especially for people like me, who still use a Pascal GPU), but 12 GB is not good enough for sure. I watch YT comparisons and even now some games can max out 12 GB of VRAM (Godfall, for example). I hope that NV will also offer the 4070 with 16 GB of VRAM, because 12 GB is just not enough if you plan to play on the same GPU for more than 2 years.





This is exactly what I'm talking about: planned obsolescence. You paid a premium for a 3080 not so long ago, and now you are forced to upgrade not because your GPU is too slow, but because games require more VRAM.


I don't think it is planned obsolescence. NV have a history of giving you just enough for right now. They did it with the 700 and 900 series parts, because those launched before the next-gen consoles of the time had exited the cross-gen period. Then, with post-cross-gen games hitting VRAM hard, they released the 1000 series and that lasted really well. Ampere was the equivalent of the 700 series. Ada is the equivalent of the 900 series, but a proper uplift in VRAM will occur with the 5000 series IMO. That will be the 1000 series equivalent and should last until next gen comes out of the cross-gen period.
 
I don't think it is planned obsolescence. NV have a history of giving you just enough for right now. They did it with the 700 and 900 series parts, because those launched before the next-gen consoles of the time had exited the cross-gen period. Then, with post-cross-gen games hitting VRAM hard, they released the 1000 series and that lasted really well. Ampere was the equivalent of the 700 series. Ada is the equivalent of the 900 series, but a proper uplift in VRAM will occur with the 5000 series IMO. That will be the 1000 series equivalent and should last until next gen comes out of the cross-gen period.
Dude, you must be joking. In 2012 I bought a GTX 680 2 GB and two years later this GPU was extremely VRAM-limited. I was forced to lower texture settings in pretty much every new game. In some games, like COD Ghosts, I had PS2-like textures no matter which texture setting I used.

(screenshot)


People learn from their mistakes, so in 2016 I bought a GTX 1080 with an insane amount of VRAM (back then 8 GB of VRAM was a lot). On the GTX 1080 all textures in COD Ghosts load correctly and the game uses up to 4 GB of VRAM, so no wonder I had problems on my 2 GB GTX 680.

The base GTX 780 3 GB model also had insufficient VRAM, but people who bought the 6 GB model were probably set for the whole generation.

Nvidia can keep selling the 4070 12 GB, because for now 12 GB is still good enough, but they should also offer a 16 GB model for people like me, who don't want to replace GPUs every 2 years. My old GTX 1080 can still run games only because it had more than enough VRAM. The RTX 4080 has plenty of VRAM and I'm sure it can last the whole generation, but it's just too expensive.
 

ToTTenTranz

Banned
The naming is sort of useless in the grand scheme of things. Most people just have a budget and will buy whatever's best within that budget.
One good way to measure how much of the fab-process and architectural gains Nvidia has been keeping for itself rather than passing on to customers is to compare performance ratios at launch MSRPs, for successive products at a similar launch price.
Let's do it on a 2-3 year cadence, which is the usual cadence for new architectures.

I'll use a 3.8% average inflation rate through 2021 (see the sketch after the list below).


Just the sub-$400 bracket is a great indicator of what's been happening:

GTX 660 Ti 2012 ($300) -> GTX 970 2014 ($330): +82% (+70% price and inflation adjusted)

GTX 970 2014 ($330) -> GTX 1070 2016 ($379): +61% (+57% price and inflation adj.)

GTX 1070 2016 ($379) -> RTX 2060 2019 ($350): +18% (+37% price and inflation adj.)

RTX 2060 2019 ($350) -> RTX 3050 2022 ($380): -11% (-12% price and inflation adj.) -> mining craze though
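For reference, here is a minimal sketch of how such an adjustment can be computed, normalising both MSRPs to 2021 dollars at the 3.8%/year rate mentioned above. The exact method behind the percentages in the list isn't shown, so this won't reproduce them to the percentage point, but the mechanics are the same:

```python
# Compare performance per dollar across generations with launch MSRPs
# expressed in a common year's dollars (flat 3.8%/year, as assumed above).
INFLATION = 0.038
BASE_YEAR = 2021

def in_base_year_dollars(msrp: float, launch_year: int) -> float:
    return msrp * (1 + INFLATION) ** (BASE_YEAR - launch_year)

def perf_per_dollar_gain(perf_ratio, old_msrp, old_year, new_msrp, new_year):
    price_ratio = (in_base_year_dollars(new_msrp, new_year)
                   / in_base_year_dollars(old_msrp, old_year))
    return perf_ratio / price_ratio - 1

# GTX 660 Ti (2012, $300) -> GTX 970 (2014, $330), +82% raw performance:
print(f"{perf_per_dollar_gain(1.82, 300, 2012, 330, 2014) * 100:+.0f}% per dollar")
# Prints roughly +78%; the list above quotes +70%, so the original figures
# were presumably derived with actual CPI data rather than a flat rate.
```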


I don't know what card Nvidia is releasing under $400. It might be the RTX4050 but from the looks of it they might launch that card over $400.

Regardless, Nvidia's hoarding of the value gained from architectural and fabrication improvements has increased steadily, generation after generation.
We've reached a point where the RTX 4070 has almost the same price/performance as the RTX 3080 from late 2020. The added value for the consumer between 2020 and 2023 is approaching zero.


The only people who are happy with this are Nvidia executives, Nvidia shareholders, paid astroturfers and the idiots.



Accusing others of being "hive minded" while deepthroating corporate dick...the irony.
This is the best ever and most adequate use of the word deepthroating I've seen on the internet.
Thank you.
 
Nvidia can keep selling the 4070 12 GB, because for now 12 GB is still good enough, but they should also offer a 16 GB model for people like me, who don't want to replace GPUs every 2 years. My old GTX 1080 can still run games only because it had more than enough VRAM. The RTX 4080 has plenty of VRAM and I'm sure it can last the whole generation, but it's just too expensive.

A 16 GB model on this chip is not realistic; they would have to clamshell the 192-bit bus and go with 24 GB here, which would be a lot for a 70-series card.

If they had kept a 256-bit bus, like the 7800 XT will have, then they would be at 16 GB of memory.
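For anyone wondering where those capacities come from, it's just bus-width arithmetic. A minimal sketch, assuming the 2 GB (16 Gb) memory chips on 32-bit channels that current GDDR6/GDDR6X cards use:

```python
# VRAM capacity follows from the memory bus width: one 32-bit channel per
# chip, 2 GB per chip, doubled if the chips are clamshelled (two per channel).
CHIP_GB = 2
CHANNEL_BITS = 32

def vram_gb(bus_bits: int, clamshell: bool = False) -> int:
    chips = bus_bits // CHANNEL_BITS
    return chips * CHIP_GB * (2 if clamshell else 1)

print(vram_gb(192))                  # 12 GB -> the 4070 as shipped (192-bit AD104)
print(vram_gb(192, clamshell=True))  # 24 GB -> the only larger option on a 192-bit bus
print(vram_gb(256))                  # 16 GB -> what a 256-bit card gets
```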

The 7800XT is going to be a lot more interesting than I thought now, because it is likely going to have the solid, across-the-board lead in raster performance that the 7900 series doesn't have vs. the Nvidia counterparts.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
The 4070 is 26% faster than the 3070, but is 20% more expensive at MSRP. The other parts you compared launched at the same or a cheaper price than their predecessors. (Well if you compare the 2070 to the 1070 the picture is less favourable)

I feel like 15% more performance per dollar is a good expectation since this should be achievable with clock speed boosts + architectural enhancements.
I simply compared them to the prior gen's top (or nearly top) 104 chip, the same way the picture I was responding to compared them.
FSR2 looks reasonably sharp with FidelityFX sharpening. I can still see the difference compared to the native image, but only from up close. I can also play games on my 42VT30 Full HD plasma without any upscaling.

Why do you upgrade GPUs every year when it takes 2 years to release a new generation? Also, good luck trying to sell a 3070 at a reasonable price right now, because not many people will want to pay a premium for an 8 GB GPU these days (the mining craze is over). If the 3070 had 12 GB (like the 3060), it would be MUCH better value.
FSR2 looks bad to me at 1440p. It's one of the reasons why I don't buy AMD-sponsored games: they lack DLSS2, which looks good to me at 1440p.
Why do you upgrade GPUs every year when it takes 2 years to release a new generation? Also, good luck trying to sell a 3070 at a reasonable price right now, because not many people will want to pay a premium for an 8 GB GPU these days (the mining craze is over). If the 3070 had 12 GB (like the 3060), it would be MUCH better value.
I went from 2070 to 2070 Super to 3070. All at no additional cost because I used to sell the GPUs like a month or two before the new ones came out at the price I initially paid.

You're right, I'll lose a bit of value in selling the 3070 but I don't really care at this point as I got away with upgrading at no additional cost for my last two upgrades.

I don't know what card Nvidia is releasing under $400. It might be the RTX4050 but from the looks of it they might launch that card over $400.
Another hive minder complaining about GPU prices :messenger_tears_of_joy:

Even going as far to suggest that 4050 could launch at over $400 :messenger_tears_of_joy:
 

ToTTenTranz

Banned
The 7800XT is going to be a lot more interesting than I thought now, because it is likely going to have the solid, across-the-board lead in raster performance that the 7900 series doesn't have vs. the Nvidia counterparts.
The 7800XT might be in a tough spot against the previous (and perhaps cheaper to make) Navi 21 cards, namely the 6950XT.

We'll be looking at 60 CUs in the new card vs. 80 CUs in the older one. Either Navi 31 clocks 33% higher than the average 2400 MHz on the 6950 XT (a whopping >3100 MHz) or AMD had better release a compiler that makes proper use of the new dual-pumped ALUs. Otherwise the new cards won't perform much better than the 6800 non-XT.
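That clock figure is just raw-throughput arithmetic (a quick check that ignores the dual-issue ALUs, bandwidth and cache differences the post mentions):

```python
# 60 CUs need roughly 80/60 of the clock of 80 CUs to match raw shader throughput.
cus_old, clock_old_mhz = 80, 2400   # 6950 XT CU count and the average clock cited above
cus_new = 60                        # CU count expected for the new card
print(cus_old / cus_new * clock_old_mhz)  # -> 3200.0 MHz, hence the ">3100 MHz" figure
```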
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
The 7800XT is going to be a lot more interesting than I thought now, because it is likely going to have the solid, across-the-board lead in raster performance that the 7900 series doesn't have vs. the Nvidia counterparts.
7800 XT vs. 4070 is probably going to be a repeat of 7900 XT vs. 4070 Ti

+10% raster
-loses in RT
-loses in efficiency
-loses in 1440p upscaling technology

Nvidia’s just keeping prices high because they know they can sucker their followers to drop $600 on an 8GB GPU.
4070 has 12 GB at $600.
 
I simply compared them to the prior gen's top (or nearly top) 104 chip, the same way the picture I was responding to compared them.

FSR2 looks bad to me at 1440p. It's one of the reasons why I don't buy AMD-sponsored games: they lack DLSS2, which looks good to me at 1440p.

I went from 2070 to 2070 Super to 3070. All at no additional cost because I used to sell the GPUs like a month or two before the new ones came out at the price I initially paid.

You're right, I'll lose a bit of value in selling the 3070 but I don't really care at this point as I got away with upgrading at no additional cost for my last two upgrades.


Another hive minder complaining about GPU prices :messenger_tears_of_joy:

Even going as far to suggest that 4050 could launch at over $400 :messenger_tears_of_joy:
You hate FSR2, but DLSS2 looks good to you? To be fair, I have not seen DLSS2 with my own eyes, but on the screenshots they both look about the same to me.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
The 7800XT is going to be a lot more interesting than I thought now, because it is likely going to have the solid, across-the-board lead in raster performance that the 7900 series doesn't have vs. the Nvidia counterparts.
The 7800 XT is gonna be trading blows with the currently ~$600 6950 XT.
I'm guessing part of the reason for the heavy price cut on the 6950 XT is so its stock disappears and the 7800 XT can slot into its spot, easy work.

It will have to be sub-$700 though to even make a mark.
With the new, better RT performance it'll not only be a true competitor for the RTX 4070; realistically it should outdo the RTX 4070 in every single way.
CUDA/OptiX notwithstanding.




RTX 3080s were such a good deal either way.
Upgraded to the RTX 3080 because the mining craze made people go crazy and they were swapping 3070 FHRs for 3080 LHRs... which, ironically, shortly after I did my swap, LHR was basically defeated completely.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
You hate FSR2, but DLSS2 looks good to you? To be fair, I have not seen DLSS2 with my own eyes, but on the screenshots they both look about the same to me.
I don't hate FSR2, it's okay at 4K, it just sucks at 1440p, my resolution.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
7800 XT vs. 4070 is probably going to be a repeat of 7900 XT vs. 4070 Ti

+10% raster
-loses in RT
-loses in efficiency
-loses in 1440p upscaling technology


4070 has 12 GB at $600.
That would make the 7800XT worse than a 6950XT?

At the very least I expect the 7800 XT to match the 6950 XT in raster and beat it in ray tracing.
Which in theory would actually make the 7800 XT better than the RTX 4070 in pretty much every way.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
That would make the 7800XT worse than a 6950XT?
Roughly matching the 6950 XT in raster.

At the very least I expect the 7800 XT to match the 6950 XT in raster and beat it in ray tracing.
Which in theory would actually make the 7800 XT better than the RTX 4070 in pretty much every way.
But 7800 XT would lose to 4070 in RT and 1440p upscaling and efficiency, making the 4070 better than 7800 XT to me in pretty much every way.

I need more RT performance more than I need 10% more raster.
 

FireFly

Member
I simply compared them to the prior gen's top (or nearly top) 104 chip, the same way the picture I was responding to compared them.
You said "Looking at it this way nothing has changed". And you're right, if we ignore pricing, nothing has changed.

So I guess you are addressing the section of the audience that gets their GPUs for free (from Nvidia?).
 

Dr.D00p

Member
Just confirms that this generation was never intended by Nvidia (except the 4090) as any kind of serious or sensible upgrade for 30xx series owners.

To anyone with a 20xx series or older, they make sense... if you can stomach Nvidia's price gouging.
 

SmokedMeat

Gamer™
7800 XT vs. 4070 is probably going to be a repeat of 7900 XT vs. 4070 Ti

+10% raster
-loses in RT
-loses in efficiency
-loses in 1440p upscaling technology


4070 has 12 GB at $600.

12GB is the new 8GB. Barely scraping by, and will be forced to drop textures in another year or two.
My old 1070 traded blows with a 980 Ti. That was back when Nvidia delivered value.
The 4070 barely beats a 3080, and loses on average to AMD's last-gen 6800 XT.

This is why Nvidia will continue to dominate. People are too stubborn to open their eyes to even consider alternatives. Jensen has you by the balls.

Saying AMD loses out in RT is amusing. Like it’s some enormous loss, when in reality it’s like 3090 level. FSR difference isn’t even noticeable during gameplay.

I’ll gladly take two very minor losses in upscaling tech, over having to turn textures down to medium - and having worse rasterization. I want a card that delivers power without the need for the upscaling crutch.

But that's just me. I'm loyal to neither company, but I will absolutely go to bat for whoever I feel is doing things better, and I will call out whoever I feel isn't. Realistically I'm hoping Intel wrecks both in the future and lights a fire under their asses.
In the meantime, the 4070 is mediocrity for $600. It's the missionary position in GPU form. No way would I buy that card in 2023.
 