
Is the 3070 enough for next gen gaming at 2k 144 ultra settings with DLSS quality and no ray tracing?

//DEVIL//

Member
I have a friend who wants to trade with me: he'd take my 6900 XT and give me his Suprim X model, brand new and sealed (not the low-hash-rate version), plus a bit more than $650 US.

Now I'm looking for something that runs all new and upcoming games at 2K, ~140 fps, with the DLSS Quality option.

I checked fast-paced games like Cold War, and yeah, with DLSS it can even reach 180 fps.

But there are some games like Cyberpunk where it's around 90 fps with DLSS on Quality.



Now, for open-world or single-player games, I'm OK with 80-90 fps if everything is on ultra settings. DLSS most of the time produces an even better-quality image than native to my eyes (or let's say they're so close I can't tell the difference).

But in games like COD? I need a locked 144 fps to match my monitor.

To be fair, the 6900 XT gets about the same results, but native (in COD it sometimes drops below 144 on the 6900 XT).

The 6900 XT I have is a really nice one too: a water-cooled ASUS card. But I'm worried about the closed AIO; it will die at some point, and then it'll have to be RMA'd, etc. (I can't even swap the AIO for something else due to the weird shape of the pump, otherwise this wouldn't be a concern.)

So I'm wondering if the 3070 is fine as a 2K high-framerate card, or if I should forget about the money and keep the 6900 XT.

The other reason, to be honest, is that I feel these cards are expensive (the 6900 XT, that is). So at the 3070's price, I'd feel almost no regret about spending this little compared to a super-expensive card. Not sure if that makes sense to some of you lol
 

ZywyPL

Banned
Depends on the games you play. If they support DLSS, you'll be good with the 3070, but no one knows what the future will bring, or which/how many titles will support it. So for no RT I'd take the 6900 XT; it's basically the most efficient rasterization GPU on the planet.
 

Mister Wolf

Gold Member
Yes. Easily. You will want a card that has access to DLSS, as it's becoming more and more prevalent, with even developers of smaller titles implementing it and older games like RDR2 opting to patch it in.
 

Boss Mog

Member
You probably won't hit 144fps in the most demanding games but 60fps should be attainable everywhere with DLSS and no RT.
 

//DEVIL//

Member
I would keep the 6900XT, AMD FSR is coming as well... maybe it's decent.
It will be decent, but sadly it won't come close to DLSS, because it lacks dedicated hardware.
Honestly, DLSS is a godsend. In terms of image quality, it's sometimes the same as native, or even enhances it.
I love AMD, but we all know it's not close. At least this year.
 

Klik

Member
I think it's really hard to know until "next gen" games come out (the games that aren't made with PS4/X1 in mind). In 2013, even a card 40% faster than the PS4's GPU had a hard time playing games by 2018.
 

Relique

Member
Like you said, it will depend on the game. I played Cyberpunk on DLSS Quality and locked it to 60. I'd expect about the same from other open-world games.
For competitive shooters it's enough for some of the older stuff that is out now. I wouldn't expect a completely flawless locked 144 from every game though, many games will have drops even if the average is 144+. Destiny 2 won't hold 144 FPS flawlessly for me at native resolution.
No one can tell you what will happen with future games and DLSS support. For now it seems to be on the edge of just being enough for most DLSS titles.

I say hold on to your 6900xt. You will probably miss the superior rasterization performance in older titles without DLSS. If you want to play the odd ray-tracing game here and there, your card should still manage a locked 60 even without tensor cores.
The MSI cards of the 30xx series are also among the least desired, I think. If this question were about the 3080, it would be a much easier decision to make the swap. Even though I love my 3070 and think it's a perfect 1440p card, I'm not too confident it will hold a flawless 144 fps in every multiplayer game going forward.
 

Solrac

Member
No, all games right now are cross-gen, so it's kinda easy for a 3070 to run them at high frame rates and resolutions. Wait till "real" next gen hits.
 
Depends on the games you play, if they support DLSS you'll be good with 3070, but bo one knows what the future will bring, which/how many titles will suport it. So for no RT I'd take the 6900xt, it's basically the most efficient GPU in rasterization on the planet.


Pretty much every big game, and even smaller ones, has had DLSS since last year. There are close to 60 games with it now or so. Again, NVIDIA has the raster crown and every other crown you desire. The 3080 Ti and 3090 are faster in raster than the 6900 XT, which is closer to the 3080 in raster.
 

Md Ray

Member
Based on this launch review data, it gets close at High settings, 1440p. But you're asking for Ultra settings in next-gen games... 144fps at those settings and resolution in upcoming games might be a bit out of reach. DLSS isn't guaranteed in every game, so there's that.

I'd go for a 3080 at least, or stay with the 6900 XT, if I were targeting 144fps at 1440p. I actually have a 3070 and a 144Hz display, and let me tell you: achieving 144fps at 1440p on Ultra settings is hard in demanding titles. If the games you play have DOOM Eternal levels of optimization and a similar perf profile, then you'd be fine.

[Chart: average FPS at 1440p from launch reviews]
 

GymWolf

Member
Not for long, and not every game has DLSS.

Scale down some settings from ultra to very high or high to survive a bit longer.
 

jigglet

Banned
My guess is mostly.

The 2070 could pretty much run every last-gen game at 144Hz @ 1440p. I imagine the 3070 should be enough for most, with the occasional poorly optimised game running circa 100fps.
 

NoviDon

Member
Ultra settings and 144Hz? If you're playing next-gen Pong, MAYBE. The top AAA games in a few years are going to be doing crazy stuff with scale, destruction, NPC counts, and textures that look movie quality. With crazy GPU-intensive games, expect high settings and a locked 60. If you want bells and whistles, you should hold on to your current card and save your money. Wait for the RTX 4070/4080 or RDNA 3 if you want to future-proof your rig.
 
With no RT? You already have the best one, the 6900xt. Miles ahead of the 3070. Only consider exchanging it for a 3080 Ti, nothing below it.

Don't judge FSR without seeing reviews first. And don't assume DLSS/FSR will work with every game; you might be left out in the dark.

People really need to stop saying AMD is first in raster. It never was; the 3090 is ahead, and the 3080 Ti is ahead as well. In every situation, at every resolution, ray tracing or no ray tracing. In EVERY case. DLSS doesn't need to work with every game, just the demanding ones, which at this point it does. With the inclusion of Red Dead 2, we have pretty much all the heavy titles covered with DLSS, with the exceptions of Horizon and the AMD-partnered Valhalla. DLSS is a pretty safe bet. Games are coming out monthly now, and we've reached a point where they aren't even announced as having DLSS; it's just there, in the menu. Necromunda: Hired Gun came out a few days ago, and people found out it has DLSS from reviews, because they saw it in the menu. It's starting to become natural for new games to have DLSS.





 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
6900XT for a 3070?

The 6900XT does natively what the 3070 does with DLSS.
So why would you want to go DLSS when you could play at Native?
Hell, once Super Resolution hits, the 6900XT is going to be super god-tier.
Since you aren't hunting ray tracing, it really is a no-brainer.

And if you still have your card when the AIO dies... after 8-10 years it has done its time; let it go.
 
You as you wrote this:
War Battle GIF by ImproVivencia


Show me one person on GAF who said a "GTX 1080 was more than enough to beat consoles." I'll wait.
it's the typical mister racer mantra: "oh, my 4 yo gpu beats consoles just fine. It's future-proof"

I honestly don't know any pc guy with a 4 yo card. They're always upgrading, it's like a ritual in their cult or something...
 
The OP said he will give his 6900XT and receive the 3070 sealed PLUS $650 to $700 extra. That's worth considering. You get a nice GPU that plays everything well, plus roughly MSRP for a 3080 in cash.
 

JimboJones

Member
it's the typical mister racer mantra: "oh, my 4 yo gpu beats consoles just fine. It's future-proof"

I honestly don't know any pc guy with a 4 yo card. They're always upgrading, it's like a ritual in their cult or something...
Got a 5-year-old RX 480 in mine.
 

baphomet

Member
No, you're not going to get 144fps for the entire generation with a 3070, even with DLSS.

Also, trading a 6900xt for a 3070 and cash is a terrible deal on your end.
 

carsar

Member
The 3070 would run true next-gen games (UE5) at 1440p 30-40fps, or 1080p 50-60fps.
With DLSS or TSR, the frame rate can be doubled.
 

Rickyiez

Member
Wait, are you guys missing something here? I read this as a 6900XT traded for a 3070 + $650. Isn't this a no-brainer? The 6900xt isn't ridiculously faster than the 3070, and by the time something newer comes out, you can sell the 3070 and add the $650 to buy another high-end card easily.
 
No, you're not going to get 144fps for the entire generation with a 3070, even with DLSS.

Also, trading a 6900xt for a 3070 and cash is a terrible deal on your end.


How so? 3070s are selling for $1,500 on eBay. He's getting that card and another $650 in cash. Why do you see it as a bad deal?
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
You as you wrote this:
War Battle GIF by ImproVivencia


Show me one person on GAF who said a "GTX 1080 was more than enough to beat consoles." I'll wait.

it's the typical mister racer mantra: "oh, my 4 yo gpu beats consoles just fine. It's future-proof"

I honestly don't know any pc guy with a 4 yo card. They're always upgrading, it's like a ritual in their cult or something...
In other words, you're talking shit?

Last gen, yes, the XB1 and PS4 GPUs weren't really up to snuff.
This gen we were expecting the GPUs to be at ~RTX 2080 levels... basically, at announcement the consoles were equivalent to the second-best GPU you could get.
Yeah, I know the RTX 2080S was technically the second-best GPU you could get.
Hell, if we count current-gen GPUs, the current-gen consoles can match an RTX 3060 Ti in the right conditions; with ray tracing they'd still be at or about RTX 2060 level.
So they aren't as weak as in prior generations.

No one was thinking a GTX 1080 would hold out when we were already expecting the consoles to match the GTX 1080's successor.
 

RoboFu

One of the green rats
DLSS isn't something you should strive for if you're spending lots of money. It has trouble with higher-res textures that have patterns; I see it all the time. I only ever really used it consistently in Cyberpunk for ray tracing. It helps a lot with performance there, but I still see the artifacts.
 

GC_DALBEN

Member
If you have no interest in RT, keep the 6900xt.

I have a similar card (3090) and only used DLSS in one game (Cyberpunk), and I didn't like how it looked, even in Quality mode.

I play at 4K.
 

Shifty1897

Member
144fps is super hard to hit in some games at Ultra settings, even with DLSS and ray tracing off; for example, Flight Simulator or Assassin's Creed Odyssey. As we get into the mid-to-late gen, developers will start future-proofing their PC versions, and a 3070 won't cut it at the level of fidelity you're looking for.
 

supernova8

Banned
I bet you'll need that extra VRAM further down the line. It always seems to take much more PC horsepower (i.e., bang for buck) to produce what consoles can do with games tailored to that specific hardware.

Plus, it seems like practically all the big games going forward will end up having native FSR and DXR (as opposed to RTX) support, since that's what's in the consoles. Why would any developers/publishers care about DLSS/RTX in the future unless (a) NVIDIA pays them to care or (b) NVIDIA makes it so effortless that it's not worth leaving them out?
 
Last gen, yes, the XB1 and PS4 GPUs weren't really up to snuff.
This gen we were expecting the GPUs to be at ~RTX 2080 levels... basically, at announcement the consoles were equivalent to the second-best GPU you could get.
Yeah, I know the RTX 2080S was technically the second-best GPU you could get.
Hell, if we count current-gen GPUs, the current-gen consoles can match an RTX 3060 Ti in the right conditions; with ray tracing they'd still be at or about RTX 2060 level.
So they aren't as weak as in prior generations.

No one was thinking a GTX 1080 would hold out when we were already expecting the consoles to match the GTX 1080's successor.


When you think about it, the 1080 comparison isn't that far off the mark. The new consoles were never at the level of a 2080, never mind a 2080 Super; I don't know where that came from. The paper specs always placed them at 2070/5700 XT levels, and Digital Foundry even switched to the 2060 Super right before launch.

As far as I know, the only current games where the PS5 performs above a 2070 are Valhalla, which is broken on NVIDIA, so not a valid comparison, and Call of Duty, which favors AMD's architecture and has special console settings that don't go low enough on PC, so you can't match settings for an accurate comparison. Everything else is between a 2060 Super and a 2070. A 3060 Ti is pretty far from that.

And when you think about it, Turing was pretty much a sidegrade in rasterization, except for the 2080 Ti. A 2080 was the same as a 1080 Ti. We can say the new consoles launched at the level of the top three cards at the time, but that second-fastest card in 2018 actually had March 2017 performance. A PS5 is at a 2016 level of GPU performance, between a 1080 from May 2016 and a 1080 Ti from March 2017.
 

llien

Member
Trading a 6900XT that beats the 3090 in the newest games for a roflcopter like the 2070, to play with a glorified TAA derivative aka "DLSS 2".

Blow Your Mind Wow GIF by Product Hunt


Note, the graph below covers ALL the games TPU normally tests with, not just the newest ones (among the sites I visit, only ComputerBase does "7 latest games" style tests).

[TPU relative-performance chart, 1920×1080]


TPU
 

Buggy Loop

Member
Ultra settings are overrated, and most devs just tune them stupidly, with no visual return (or you have to DF-zoom 800%) for too much of a performance penalty.

Always use DF optimized settings if available.
 

HeisenbergFX4

Gold Member
To pick up DLSS? Not worth it.

To make a little cash? Depends on your financial situation.

Worried about it possibly breaking down? The thing might run for years, or it might break next week.

Overall, totally not worth it; the VRAM alone is what would make me keep the 6900xt.

Just a few cents, though.
 

VFXVeteran

Banned
Yes, I would trade the 6900xt for the 3070, but only because of the RT. DLSS is icing on the cake. But if you don't care about RT, then I'd keep the 6900xt.
 