
Is the 3070 enough for next gen gaming at 2k 144 ultra settings with DLSS quality and no ray tracing?

reinking

Gold Member
Simple answer.

How much do you need the $650? If you need it, make the trade. If you don't, don't. There's no reason to give up a card that offers you better performance for what you want and more longevity.
 

Rikkori

Member
No, it isn't, not even now. For the games you like to play, especially COD, it's the 8 GB of VRAM that will severely gimp you in the future. Granted, for a 3070 + $650 it's not a bad trade, because that's essentially $900+ you can use to upgrade next year when the big boys drop (aka RDNA 3 & Lovelace/Hopper).
 
Can you please give me some backup for this? I'm not trying to argue; I want to verify that what you're saying is real, because if it is, I'm dropping the $3000.00 on the Sapphire Toxic, as well as getting a 5900X or maybe a 5950X.
I'd still look at benches, as that graph is kind of old. It's pretty common knowledge that Nvidia is still the king. AMD has caught up drastically in rasterization, but it's still only roughly neck-and-neck with Nvidia, and without the benefit of access to RTX or DLSS. Even then it's a case-by-case, game-dependent comparison; neither one is the clear winner across the board. If you are going to spend that much, buy Nvidia; if you are trying to cut down the price and don't care about DLSS/RTX, AMD is a great choice. AMD cards have really been knocking it out of the park in performance per dollar, even if their latest releases are priced a little too closely to Nvidia's for them to be the better choice, in my opinion.
 

Rikkori

Member
Except, cough, the MSRP of the 6900 XT is $1000 and the street price is more like $1700.
Yeah, but a 3070 is something like $1200 or so now (didn't check the US, that's local pricing in the EU), and then it's a lot more hoops to jump through. And because it's not LHR it can still mine as well as a 6900 XT, but you also pocket $650 (and will probably sell the 3070 for $300-ish at least in '22). So it's not a bad deal by any means; it depends on what he values more and how much risk he wants to take on how the market will play out for the next release, etc.
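If it helps, here's the back-of-the-envelope math on that trade. These are just the rough figures thrown around in this thread; the resale value is a guess, not market data:

```python
# Back-of-the-envelope on the "6900 XT for a 3070 + cash" trade.
# All figures are rough guesses from this thread, not market data.
cash_on_top = 650          # cash offered alongside the 3070
resale_3070_2022 = 300     # guessed resale value of the 3070 in '22

# What you'd have in hand for a next-gen upgrade.
upgrade_fund = cash_on_top + resale_3070_2022
print(f"Upgrade fund for next-gen cards: ~${upgrade_fund}")  # ~$950
```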
 

SLB1904

Banned
Next gen at 2k ...


R O F L


Heavyweight Master Race member, self-exposed



Damn that's a bad take lol
 

SLB1904

Banned
There are tons of people on GAF with 1070s, 1080s or 1080 Tis.
Yeah, there are loads of 1080 Tis around. Even with my modest RTX 2070, I can't find a reason to upgrade right now. The only game I found demanding was CP77; I locked it at 60 fps with DLSS at 1440p and it's brilliant. I feel like upgrading right now would be a mistake.
 

//DEVIL//

Member
Simple answer.

How much do you need the $650? If you need it, make the trade. If you don't, don't. There's no reason to give up a card that offers you better performance for what you want and more longevity.

It's not really about needing it or not. No, I don't need it at all; money isn't really a problem here. It's more about the guilt of spending this much on a video card just for video games.

I love video games and I play them every day, but I just feel the 6900 XT / 3080 Ti / 3090 are too much money, especially here in Canada where they're in the $2000 CAD range. So I feel like getting the 3070 would make me feel better about the purchase? I could be wrong.

From my understanding, the 6900 XT is in the range of a 3070 + DLSS, just at native resolution. DLSS is not so bad compared to native, so that plus $650... my mind tells me it's not so bad?

I really can't make up my mind lol.


Also, thanks to everyone making a remark or comment in this thread. I read every opinion and piece of feedback posted.
 
It's not really about needing it or not. No, I don't need it at all; money isn't really a problem here. It's more about the guilt of spending this much on a video card just for video games.

I love video games and I play them every day.


This is all you need, then. This is your life's number one hobby, which you enjoy and take pleasure in every day. What is worth spending money on, if not that?
 

SNG32

Member
Honestly you should be fine for the majority of this gen, since cross-play with older gens is involved. The most common GPU on Steam is a 1060, and that will probably be the minimum for a lot of games this gen. 144 fps is a stretch, but you should get 60 and higher, especially with DLSS. Toward the end of the gen you will probably have to knock some settings down a bit, though.
 

reinking

Gold Member
It's not really about needing it or not. No, I don't need it at all; money isn't really a problem here. It's more about the guilt of spending this much on a video card just for video games.
My guilt is nowhere near the same level as yours. I was looking to get a 3060/3070 and ended up getting a 3080. I'm looking at it as a minor bump in future-proofing. I don't typically swap GPUs every generation or two, so I'll get more use out of it. If you swap GPUs more often, it's probably worth making the trade. It gets you more bang for the buck now and could pay off in the next year or two, as newer cards might be more efficient. This being a local trade with someone you know makes it much more compelling.
 

FireFly

Member
It's not, actually. The reason it appears 0.7% faster in those results is that A LOT of outlets were using Valhalla and Dirt 5, which net around 20-30% more performance in AMD's favour. That type of discrepancy is enough to make one card appear faster. Dirt 5 was using their beta RT branch that wasn't even available to the public, but at public release that 30% AMD advantage evaporated and the game is now faster on Nvidia. I have no idea why the press used a beta branch that was borked on Nvidia; after every card had released and gotten reviews, Dirt 5 got patched and behaves entirely differently now.
I mean, I don't see why we should exclude games just because they perform worse on our favourite vendor's hardware (beta versions notwithstanding). Your entire point was that it *didn't matter* which games and resolution we picked, Nvidia was simply ahead. We don't even know if removing Dirt 5 and Valhalla would be sufficient, since if you look at TechPowerup's 6700 XT review for example, the 6800 XT is still ahead in the majority of other titles at 1440p.

Right now Nvidia has caught up in Dirt 5, but AMD is ahead in RE Village, and now has caught up in Cyberpunk. So it goes in cycles. But I am happy to revisit this when the 3080 Ti benchmark summary comes out!
 
I mean, I don't see why we should exclude games just because they perform worse on our favourite vendor's hardware (beta versions notwithstanding). Your entire point was that it *didn't matter* which games and resolution we picked, Nvidia was simply ahead. We don't even know if removing Dirt 5 and Valhalla would be sufficient, since if you look at TechPowerup's 6700 XT review for example, the 6800 XT is still ahead in the majority of other titles at 1440p.

Right now Nvidia has caught up in Dirt 5, but AMD is ahead in RE Village, and now has caught up in Cyberpunk. So it goes in cycles. But I am happy to revisit this when the 3080 Ti benchmark summary comes out!


Dirt 5 and Valhalla are not a case of performing better on one vendor or another; they're defective on one vendor versus the other. Both are AMD partner games. If you have two cards in the same ballpark of performance and two games from one camp are performing to the tune of 30% faster on the partnered vendor, I would leave them out of the testing suite, since they don't represent reality. They massively skew the results and offer a distorted image of relative performance, when in every other game on the market the cards don't perform like that.

RE Village is another AMD partnered game, so it's only natural it performs better on AMD. They seem to do this in all of their partnered games this gen, where the games perform better in an unbalanced way compared to Nvidia.
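The skew argument above is easy to illustrate with made-up numbers. None of these ratios are real benchmark data; they just show how two ~25-30% outlier titles can flip a suite-wide average:

```python
from statistics import geometric_mean

# Hypothetical per-game fps ratios (AMD / Nvidia), invented to
# illustrate the point: most titles sit near parity, while two
# partner titles run ~25-30% faster on one vendor.
suite = {
    "Game A": 0.98, "Game B": 1.00, "Game C": 0.97,
    "Game D": 0.99, "Game E": 1.01, "Game F": 0.98,
    "Dirt 5": 1.30, "Valhalla": 1.25,   # the two outliers
}

with_outliers = geometric_mean(suite.values())
without_outliers = geometric_mean(
    v for k, v in suite.items() if k not in ("Dirt 5", "Valhalla")
)

# The two outliers alone flip the overall average from
# "slightly behind" (<1.0) to "clearly ahead" (>1.0).
print(f"with outliers:    {with_outliers:.3f}")
print(f"without outliers: {without_outliers:.3f}")
```

A geometric mean is used since it's the standard way review sites aggregate relative performance across games.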
 

yamaci17

Member
Here you go then, an Nvidia title




Please don't bring RT into the discussion; a 3080 practically needs 960p internal rendering to hit 60 frames in this title at 1440p, and it looks horrible. DLSS Quality only really makes sense at 2160p output, since there it uses 1440p as the base resolution, and at that config the card simply can't push enough frames. 960p or bust for the 3080.



So yeah, you're welcome.
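For anyone wondering where those internal numbers come from, here's the base-resolution arithmetic, using the commonly cited per-axis DLSS 2.x scale factors (approximate; individual games can deviate, so treat this as a sketch):

```python
# Commonly cited DLSS 2.x per-axis render-scale factors
# (approximate; individual games can deviate from these).
SCALES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_res(out_w, out_h, mode):
    """Internal resolution DLSS renders at before upscaling to output."""
    s = SCALES[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(2560, 1440, "Quality"))  # (1707, 960)  -> "960p" internal
print(internal_res(3840, 2160, "Quality"))  # (2560, 1440) -> "1440p" internal
```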

OP, as you can see, even the 6800 XT pushes 45% more frames at 1440p in Cyberpunk 2077. This is essentially what DLSS gives you, but at much worse image quality. I would say stick with your 6900 XT; it will be perfect for non-RT 1440p gaming for the foreseeable future. Ampere can't even keep up with RDNA2 in Nvidia-branded games. It's clear that games love RDNA2 and its healthy 2.5+ GHz clocks more than the age-old ~1.9 GHz of Pascal/Turing/Ampere.

You can bet Nvidia itself will push a different architecture where they abandon the Turing/Ampere design and revert to a higher-frequency architecture instead. Once that happens, they will abandon writing special driver code paths for their compute-heavy cards, and all Ampere/Turing GPUs will see regressions in relative performance, just like Kepler/Fermi did against Maxwell/Pascal.

1080p/1440p high refresh gaming = RDNA2 :)
 

spyshagg

Should not be allowed to breed
People really need to stop saying AMD is 1st in raster. It never was; the 3090 is ahead, and the 3080 Ti is ahead as well. In every situation, at every resolution, ray tracing or no ray tracing. In EVERY case. DLSS doesn't need to work with every game, just the demanding ones, which at this point it does. With the inclusion of Red Dead 2, we've got pretty much all the heavy titles covered with DLSS, with the exception of Horizon and the AMD-partnered Valhalla. DLSS is a pretty safe bet. Games are coming out monthly now, and we've reached a point where it's not even announced that they use DLSS; it's just there, in the menu. Necromunda: Hired Gun came out a few days ago, and people found out it has DLSS from reviews, because reviewers saw it in the menu. It's starting to become natural to have DLSS in new games.




He is considering trading for the 3070. And for any benchmark you find where the 3090 is ahead of the 6900 XT, you'll find another where it isn't. I can't claim it; you can't claim it.
 

FireFly

Member
Dirt 5 and Valhalla are not a case of performing better on one vendor or another; they're defective on one vendor versus the other. Both are AMD partner games. If you have two cards in the same ballpark of performance and two games from one camp are performing to the tune of 30% faster on the partnered vendor, I would leave them out of the testing suite, since they don't represent reality. They massively skew the results and offer a distorted image of relative performance, when in every other game on the market the cards don't perform like that.

RE Village is another AMD partnered game, so it's only natural it performs better on AMD. They seem to do this in all of their partnered games this gen, where the games perform better in an unbalanced way compared to Nvidia.
So should we also exclude Nvidia-partnered games where they perform significantly better on Nvidia hardware? Do we need *any* evidence that a game's performance is "defective", or do we just toss it out of the pile as soon as we see the AMD sticker and Nvidia is losing? For example, Star Wars: Squadrons is a recent game where AMD enjoyed an 18% lead at 1440p according to the ComputerBase benchmarks. That's similar to the lead enjoyed in RE Village. Yet as far as I can see, Squadrons is not AMD sponsored. The same goes for Serious Sam 4, which shows a 20% lead.

Presumably the people who play these games care about which card will run them faster, no matter what the reason is. I thought that was the motivation behind the whole "Nvidia wins in every situation" type of claim, since that would be a bit different from "Nvidia wins in a specific selection of curated games where AMD was not involved in the development".
 

Buggy Loop

Member
It's not, actually. The reason it appears 0.7% faster in those results is that A LOT of outlets were using Valhalla and Dirt 5, which net around 20-30% more performance in AMD's favour. That type of discrepancy is enough to make one card appear faster. Dirt 5 was using their beta RT branch that wasn't even available to the public, but at public release that 30% AMD advantage evaporated and the game is now faster on Nvidia. I have no idea why the press used a beta branch that was borked on Nvidia; after every card had released and gotten reviews, Dirt 5 got patched and behaves entirely differently now.

I don't know about Valhalla, but Dirt 5 used FidelityFX VRS, which is literally tuned for RDNA 2 and borks performance on Ampere: a ~30% performance difference in rasterization at 1440p and 16% at 4K, an outlier compared to pretty much all other games tested. Then, when they enabled RT, reviewers were "pleasantly surprised" at the performance, which had nothing to do with RT capabilities; the base game is skewed before you even turn RT on.

nOtHinG to dIG furThr herE, GG AMD, nVIdia in TrOOBle


The game basically disappeared from radars after reviews due to its shit quality, but the damage is done. Just like the game's shit implementation on Xbox at launch.
 

//DEVIL//

Member
My guilt is nowhere near the same level as yours. I was looking to get a 3060/3070 and ended up getting a 3080. I'm looking at it as a minor bump in future-proofing. I don't typically swap GPUs every generation or two, so I'll get more use out of it. If you swap GPUs more often, it's probably worth making the trade. It gets you more bang for the buck now and could pay off in the next year or two, as newer cards might be more efficient. This being a local trade with someone you know makes it much more compelling.
I do swap cards every year, really. When the 4000 series or the next AMD cards drop, I surely will.
But in theory, I want a card that lasts at least till mid/end of 2022.
 
I'm a 3070 owner with a 3440×1440 monitor, and I also always want 144 fps with graphics set to high/ultra. Long story short: when a game supports DLSS, I'm happy with my 3070; without DLSS, it just isn't the card that can do everything I want at this resolution. I think it would be perfect for 2560×1440, though.

I am going to get a 3080 as soon as I have the opportunity to do so.
 