
Is the 3070 enough for next-gen gaming at 2K, 144 fps, ultra settings, with DLSS Quality and no ray tracing?

Max_Po

Banned
Next gen at 2k ...


R O F L


Master Race peasant self exposed



 

SubZer0

Banned
I have a friend who wants to trade with me: he takes my 6900 XT and gives me his Suprim X model, brand new and sealed (not the low hash rate version), plus a bit more than $650 US.

Now I am looking for something that runs all new and upcoming games at 2K, 140 fps, with the DLSS Quality option.

I checked fast-paced games like Cold War, and yeah, with DLSS it can even reach 180 fps.

But in some games, like Cyberpunk, it's around 90 fps with DLSS on Quality.

Now, for open-world or single-player games, I am OK with 80/90 fps if everything is on ultra settings. DLSS most of the time produces an even better-quality image than native to my eyes (or let's say they are so close I can't tell the difference).

But in games like COD? I need a locked 144 fps to match my monitor.

To be fair, the 6900 XT delivers about the same results, but at native resolution (in COD it sometimes drops below 144 on the 6900 XT).

The 6900 XT I have is a really nice one too: a water-cooled Asus card. But I am worried about the closed AIO; it will die at some point and then it will have to be RMA'd, etc. (I can't even swap the AIO for something else due to the weird shape of the pump, otherwise it wouldn't be a concern.)

So I am wondering if the 3070 is fine as a 2K high-framerate card, or should I forget about the money and keep the 6900 XT?

The other reason, to be honest, is that I feel these cards (the 6900 XT, that is) are expensive, so at the 3070's price I feel almost no regret spending this little compared to a super expensive card. Not sure if what I am saying makes sense to some of you, lol.
More than enough
 

Sentenza

Member
Overkill for Ultra settings in next-gen games at 1440p? So you mean to say it'll do 144+fps?
I can still do 1440p and 80+ fps in most games with a 1070, so I'd say yes.

"Ultra settings" is generic jargon and it doesn't mean shit. If you do stupid, inefficient shit you can have a single setting, say "SSAA 8X", completely destroy your performances even on a 3090.
 

FireFly

Member
People really need to stop saying AMD is 1st in raster. It never was, the 3090 is ahead. The 3080TI is ahead as well. In every situation, in every resolution, ray tracing, no ray tracing. In EVERY case.
Why are the 1080p and 1440p ComputerBase benchmarks missing from your post?
 

Md Ray

Member
I can still do 1440p and 80+ fps in most games with a 1070, so I'd say yes.

"Ultra settings" is generic jargon and it doesn't mean shit. If you do stupid, inefficient shit you can have a single setting, say "SSAA 8X", completely destroy your performances even on a 3090.
I agree Ultra settings are overrated and I personally never shoot for them unless there's meaningful visual gain. But OP specifically mentioned "Ultra settings", "144" and "next-gen gaming" in the title; a 3070 simply won't be able to deliver 144 fps on ultra settings in next-gen games.
I know because I have one.

Not unless those games happen to have DOOM Eternal levels of performance. That's the only recent game I've tested on my 3070 that locks to 144 fps, with some minor dips, on Ultra settings at 1440p.
 
Why are the 1080p and 1440p ComputerBase benchmarks missing from your post?


They've only tested those models at 4K. You know, how thousand-dollar-plus 4K cards run at CPU-bound 1080p isn't really of interest, since they're, you know, 4K cards. If you want every resolution, the other site has them.


Showing Nvidia ahead at every resolution, even with their selected suite of titles where outliers massively favouring AMD skew the results. Without Valhalla the gap would be even bigger. Are we still pushing Hardware Unboxed's claims from last year that RDNA2 is better at 1440p? Even their more recent tests show differently now.
 
I got a 3090, and I lucked out and impulse-bought the 6900XT one day because a Discord I was in posted a stock update, lol. The 6900XT is a really nice card, but for what you're looking for, the 3070 is a great choice.
 
I have a 2080 and play at 1440p 144Hz. Most games run perfectly fine at that level of performance. The only games I need to cap are Cyberpunk (60 fps) and Flight Sim (30 fps).

A 3070 will be a great card for another few years, but once games start making use of console hardware I don't know how well it'll hold up. If you have a really good CPU and a fast NVMe SSD you'll probably be alright. Also, the 8GB of VRAM on the 3070 should be fine for 1440p 144Hz, even if it's a pathetic amount for 4K.
 

yamaci17

Member
Let's see:

The 6700 XT destroys the 3070 (up to a 30-50% performance difference) at 1080p/1440p in RDNA2-optimized games such as Valhalla.
The 6700 XT benefits more from Resizable BAR, with up to 15-20% performance gains, whereas Ampere is weak in that department, getting 4-7% in best-case scenarios.
The 6700 XT is more synergistic with Ryzen CPUs due to having similar caches, and this synergy will be explored even further in the future, whereas the Nvidia driver drags down every CPU's performance by 20% due to its scheduler overhead.
The 6700 XT uses 20% less CPU than the 3070 with the same CPU, so you will always have 20% more CPU headroom to hit 144 fps (for example, if a game becomes CPU-bound at 120 fps with a 3070, it will be CPU-bound at around 140-150 fps with a 6700 XT; see the quick arithmetic sketch right after this list). This has been proven by Hardware Unboxed. The 6700 XT will be more future-proof with every CPU out there, unless you upgrade every 2 years.
The 6700 XT has 4 GB more VRAM, which will be able to hold next-gen textures without compromises at 1440p, or even at 1080p, which the 3070 will fail to do.
The 6700 XT has the guarantee of getting RDNA2-specific optimizations due to the consoles being RDNA2; see how GCN cards performed SUPERIOR to the abandoned, crap Pascal/Maxwell architectures in certain console ports. The same will happen between Ampere and RDNA2. RDNA2 will start to destroy Ampere 2-3 years later, and Nvidia will move on to their next "special" architecture while AMD keeps upgrading and improving RDNA, and all RDNA cards ranging from the 5500 XT to RDNA3 will benefit from it.
6700 XT...
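A rough back-of-the-envelope check on that CPU-headroom point (purely illustrative; the 20% saving and the 120 fps baseline are the figures claimed above, not measured constants):

```python
# Hypothetical sketch: how ~20% less driver CPU time could move a CPU-bound fps ceiling.
# The 20% figure and the 120 fps baseline are assumptions taken from the claim above.

def cpu_bound_ceiling(baseline_fps: float, driver_cpu_saving: float) -> float:
    """If the CPU is the limit, freeing up driver overhead raises the fps ceiling roughly linearly."""
    return baseline_fps / (1.0 - driver_cpu_saving)

baseline = 120.0  # fps where the game becomes CPU-bound with the heavier driver
saving = 0.20     # assumed 20% of per-frame CPU time saved with the lighter driver

print(cpu_bound_ceiling(baseline, saving))  # -> 150.0, roughly the claimed 140-150 fps range
```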

The 3070 fails to hit 60 at native 1080p in Cyberpunk with mediocre settings and mediocre RT settings. It needs DLSS Quality at 1080p to hit 60+ fps, and that usually comes back as hideous image quality. At 1440p with DLSS Quality it again sits at 45-50 fps with 960p reflections, which defeats the whole purpose of RT by introducing garbage-looking reflections all around, just like the consoles. I know you're not interested in RT, but just a reminder.

The 6700 XT is a superior card to the 3070.

Oh, you're talking about the 6900 XT. LOL.

Bruh, hold on to that card. That beast will see you through the entire generation. The 3070's useful life is only 2-3 years.

Final note: I'm a huge Nvidia hater, not going to deny that. I would prefer a Series X/PS5 over a crappy 3070/3080. The SX/PS5 can RELIABLY LOCK to 60 FPS in WD: Legion with the latest performance-mode patch, whereas on PC you need a Ryzen 5800X to get 60 fps, and even then it will still DROP frames below 60. And people will blame Ubisoft, but the Nvidia driver is using 20-30% extra CPU cycles; you will always be BOTTLENECKED in CPU-bound regions in games. Go with RDNA2. Or fork over big bucks to feed the Nvidia GPU but get bottlenecked below 60 FPS nonetheless. Your choice!

 

tusharngf

Member
My GTX 970 ran for 5 years and I was able to play games on high settings till I replaced it with a 1070, and later I got a 3070. I think 1080p 60fps ultra should work till the end of this gen.
 

Bo_Hazem

Banned
I have a friend who wants to trade with me: he takes my 6900 XT and gives me his Suprim X model, brand new and sealed (not the low hash rate version), plus a bit more than $650 US.

Now I am looking for something that runs all new and upcoming games at 2K, 140 fps, with the DLSS Quality option.

I checked fast-paced games like Cold War, and yeah, with DLSS it can even reach 180 fps.

But in some games, like Cyberpunk, it's around 90 fps with DLSS on Quality.

Now, for open-world or single-player games, I am OK with 80/90 fps if everything is on ultra settings. DLSS most of the time produces an even better-quality image than native to my eyes (or let's say they are so close I can't tell the difference).

But in games like COD? I need a locked 144 fps to match my monitor.

To be fair, the 6900 XT delivers about the same results, but at native resolution (in COD it sometimes drops below 144 on the 6900 XT).

The 6900 XT I have is a really nice one too: a water-cooled Asus card. But I am worried about the closed AIO; it will die at some point and then it will have to be RMA'd, etc. (I can't even swap the AIO for something else due to the weird shape of the pump, otherwise it wouldn't be a concern.)

So I am wondering if the 3070 is fine as a 2K high-framerate card, or should I forget about the money and keep the 6900 XT?

The other reason, to be honest, is that I feel these cards (the 6900 XT, that is) are expensive, so at the 3070's price I feel almost no regret spending this little compared to a super expensive card. Not sure if what I am saying makes sense to some of you, lol.

I think that's a massive downgrade. Keep that 6900XT, and FSR is coming. Not to mention going from 16GB of VRAM to 8GB.
 

FireFly

Member
They've only tested those models at 4K. You know, how thousand-dollar-plus 4K cards run at CPU-bound 1080p isn't really of interest, since they're, you know, 4K cards. If you want every resolution, the other site has them.


Showing Nvidia ahead at every resolution, even with their selected suite of titles where outliers massively favouring AMD skew the results. Without Valhalla the gap would be even bigger. Are we still pushing Hardware Unboxed's claims from last year that RDNA2 is better at 1440p? Even their more recent tests show differently now.
So when you said the 3080 Ti was ahead in every resolution, you were just referring to the 3 slot air/liquid cooled factory OC versions?

The 6800 XT *is* ahead of the 3080 at 1440p, but it remains to be seen if the 6900 XT is ahead of the (stock) 3080 Ti.

 
So when you said the 3080 Ti was ahead in every resolution, you were just referring to the 3 slot air/liquid cooled factory OC versions?

The 6800 XT *is* ahead of the 3080 at 1440p, but it remains to be seen if the 6900 XT is ahead of the (stock) 3080 Ti.

It's not, actually. The reason it appears 0.7% faster in those results is because A LOT of outlets were using Valhalla and Dirt 5, which net around 20-30% more performance in AMD's favour. That type of discrepancy is enough to make one card appear faster. Dirt 5 was using their beta RT branch that wasn't even available to the public, but at the public release that 30% AMD advantage evaporated and the game is now faster on Nvidia. I have no idea why the press used a beta branch that was borked on Nvidia; after every card had released and got its reviews, Dirt 5 got patched and behaves entirely differently now.
 

Bo_Hazem

Banned
Without a doubt it's a downgrade; it's just a question of how big a hit he's willing to take.

I too think it's a no-brainer keeping the 6900.

If I was building a new PC, it'd be either a 3090 or a 6900XT: the 3090 for more VRAM and better RT performance. DLSS has very low support, while FSR will have an insane number of games supporting it going forward due to being GPU/API agnostic; even an Nvidia card can use it.

GTX-1060-FSR.jpg


fsron.jpg


And the image should be more stable than DLSS in motion.
 

Buggy Loop

Member
People really need to stop saying AMD is 1st in raster. It never was, the 3090 is ahead. The 3080TI is ahead as well. In every situation, in every resolution, ray tracing, no ray tracing. In EVERY case. DLSS doesn't need to work with every game, just the demanding ones, which at this point it does. With the inclusion of Red Dead 2, we've got pretty much all the heavy titles covered with DLSS, with the exception of Horizon and the AMD-partnered Valhalla. DLSS is a pretty safe bet. Games are coming out monthly now, and we've reached a point where they're not even announced as using DLSS; it's just there, in the menu. Necromunda: Hired Gun came out a few days ago and people found out it has DLSS from reviewers, because they saw it in the menu. It's starting to become natural to have DLSS in new games.






And then add VR performance, where it's still a head-scratcher how AMD managed not to follow the same curve of improvement as their rasterization...
 

Kenpachii

Member
Terrible trade; the 3070 has 8GB of VRAM.

Would not do it. The card is going to age like a wet noodle when next-gen games come out. If you are interested in ultra settings, your only options are a 6800 XT / 3090, or waiting for the next generation of cards.
 

HeisenbergFX4

Gold Member
If I was building a new PC, it'd be either a 3090 or a 6900XT: the 3090 for more VRAM and better RT performance. DLSS has very low support, while FSR will have an insane number of games supporting it going forward due to being GPU/API agnostic; even an Nvidia card can use it.

GTX-1060-FSR.jpg


fsron.jpg


And the image should be more stable than DLSS in motion.

Since FSR is in its infancy I expect it to improve a lot over time, because some of the screens they showed us didn't look good for the fps boost it gave.

I do think this type of tech is a lot more exciting than ray tracing though.
 
DLSS has very low support, while FSR will have an insane number of games supporting it going forward due to being GPU/API agnostic; even an Nvidia card can use it.


DLSS is in nearly every big game released since last year. It's the exact opposite of very low support. It just gets added into more and more games. UE4 has it as a toggle and now Unity has it as well. If FSR is worse than any other proprietary solution, and even the UE5 one, nobody is gonna use it.
 

Larxia

Member
Well, officially 2K is 2048x1080, which is close to 1920x1080 (1080p); it's the equivalent of 4K being 4096x2160 while Ultra HD is 3840x2160.
It's much closer to 1080p than to 2560x1440.

I don't know why people started talking about "2K"; I guess it's for convenience, but it's technically wrong.
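For what it's worth, the raw pixel counts bear this out (illustrative arithmetic only):

```python
# Pixel counts for the resolutions commonly conflated as "2K".
resolutions = {
    "DCI 2K":       (2048, 1080),
    "1080p (FHD)":  (1920, 1080),
    "1440p (QHD)":  (2560, 1440),
    "UHD '4K'":     (3840, 2160),
    "DCI 4K":       (4096, 2160),
}

for name, (w, h) in resolutions.items():
    print(f"{name:12s} {w}x{h} = {w * h / 1e6:.2f} MP")

# DCI 2K (~2.21 MP) is only ~7% more pixels than 1080p (~2.07 MP),
# while 1440p (~3.69 MP) has ~78% more pixels than 1080p.
```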
 

Bo_Hazem

Banned
DLSS is in nearly every big game released since last year. It's the exact opposite of very low support. It just gets added into more and more games. UE4 has it as a toggle and now Unity has it as well. If FSR is worse than any other proprietary solution, and even the UE5 one, nobody is gonna use it.

As said before: GPU/API agnostic. It can work on any console, and can work on old Nvidia cards. It could work on the Switch as well. So why wouldn't devs take advantage of that?
 

HeisenbergFX4

Gold Member
Well, officially 2K is 2048x1080, which is close to 1920x1080 (1080p); it's the equivalent of 4K being 4096x2160 while Ultra HD is 3840x2160.
It's much closer to 1080p than to 2560x1440.

I don't know why people started talking about "2K"; I guess it's for convenience, but it's technically wrong.

Off track here, but even my local Fry's Electronics (RIP), on their huge wall of monitors, had massive letters separating the displays into HD (1080p), 2K (1440p) and 4K, so even they went with the moniker.
 
As said before: GPU/API agnostic. It can work on any console, and can work on old Nvidia cards. It could work on the Switch as well. So why wouldn't devs take advantage of that?


Because their own solution is better, as I've said. It remains to be seen how it is in practice, but what AMD showed in that comparison pic is horrendous. If every dev has a better-looking and better-performing solution, why would they use AMD's? The Switch will use DLSS.
 

Bo_Hazem

Banned
Because their own solution is better, as I've said. It remains to be seen how it is in practice, but what AMD showed in that comparison pic is horrendous. If every dev has a better-looking and better-performing solution, why would they use AMD's? The Switch will use DLSS.

The photo itself is horrendous as it's pretty compressed. Also, the UE5 solution could be the same, as it's a collaborative effort between Epic Games and AMD, so no issue if UE5's is better or on par. For AMD, they need to fill that gap of not having a DLSS competitor, and DLSS has some weird artifacts in motion in some games, even though it looks perfect statically.

DLSS is HW-based; the Switch WILL NOT use DLSS.

About that PC master race thing: the most popular GPU in the Steam hardware survey (as flawed as that survey is) is the 1060.

Both the PS5 and XSeX laugh at that.

And those with a 1060 will be happy with FidelityFX FSR, as Nvidia left them in the dark.
 

Bo_Hazem

Banned
More on FSR:



I made these two 4K screenshots; I couldn't get a clean one for the 1060 demonstration. Native 4K still looks slightly sharper, but the FPS sacrifice is pretty minimal:
 

hlm666

Member
As said before: GPU/API agnostic. It can work on any console, and can work on old Nvidia cards. It could work on the Switch as well. So why wouldn't devs take advantage of that?
You know this has to be put into the game by the devs, like DLSS, right? It's not a one-driver-toggle, all-games-work type of deal. I don't see it gaining much traction in the console space; they have been doing it better than what AMD has shown for years already.
 

Razvedka

Banned
Title question:

Is the 3070 enough for next-gen gaming at 2K, 144 fps, ultra settings, with DLSS Quality and no ray tracing


Absolutely not, I'm amazed at anyone here saying yes.

Your requirements are:

all new and upcoming games at 2K, 140 fps, with the DLSS Quality option

...

Now, for open-world or single-player games, I am OK with 80/90 fps if everything is on ultra settings.

The only way I could say 'yes' is if you limit 'new and upcoming games' to the very near horizon.

Edit: Expanded thoughts

The XSX and PS5 are (on paper numbers) in the 2080/2070S GPU perf ballpark respectively; a rough sketch of that arithmetic is at the end of this post. I am excluding RT performance in this analysis. You're looking at DLSS as a way to inflate 3070 perf, but the reality here is that these consoles are already doing clever shenanigans to render under 4K while presenting nearly-4K picture quality*. And then you have to add in the various console optimizations which let them punch above their equivalent PC weight (for example, you're using the Windows OS).

From your PC, you're asking for markedly better net performance than these machines. I'm assuming you have the CPU, SSD, and memory to keep up, by the way.

So no, I think your hopes are outlandish and I would genuinely seek to understand the people here giving you thumbs up.

*See: Ratchet and Clank, Spiderman, etc.
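For the curious, a rough sketch of those "paper numbers" (peak FP32 throughput only; the Turing boost clocks below are approximate reference figures, and paper TFLOPS ignore architectural differences):

```python
# Peak FP32 TFLOPS = shader ALUs x 2 ops per clock (fused multiply-add) x clock (GHz) / 1000.
def tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000.0

gpus = {
    "Xbox Series X":  (52 * 64, 1.825),  # 52 RDNA2 CUs, fixed clock
    "PS5":            (36 * 64, 2.23),   # 36 RDNA2 CUs, variable clock (peak)
    "RTX 2080":       (2944, 1.71),      # approximate reference boost clock
    "RTX 2070 Super": (2560, 1.77),      # approximate reference boost clock
}

for name, (shaders, clock) in gpus.items():
    print(f"{name:15s} ~{tflops(shaders, clock):.1f} TFLOPS")
# ~12.1, ~10.3, ~10.1 and ~9.1 TFLOPS respectively: hence the 2080/2070S ballpark.
```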
 
Look at it this way: all multiplats will have to run at 30 fps, half 4K at the lowest, on consoles, and those are at ~3060 Ti level, so a 3070 WITH DLSS will easily double the framerate.
 

pratyush

Member
Yes, without ray tracing you might even get 4K 60 with a few settings tweaked.


Although I'm not sure you can hit 140 fps, 70-80 should be doable at ultra settings most of the time.
 

Bo_Hazem

Banned
You know this has to be put into the game by the devs, like DLSS, right? It's not a one-driver-toggle, all-games-work type of deal. I don't see it gaining much traction in the console space; they have been doing it better than what AMD has shown for years already.

So far I've only seen better implementations from Bluepoint on Demon's Souls and Insomniac's temporal injection for the Performance and Performance RT modes (the Fidelity 4K mode is native 4K reconstructed to even higher than 4K and downsampled, to get that unique, clean look with zero aliasing).

I'm more than sure FSR will be used much more often than DLSS going forward.
 
More on FSR:



I made these two 4K screenshots; I couldn't get a clean one for the 1060 demonstration. Native 4K still looks slightly sharper, but the FPS sacrifice is pretty minimal:



These are wallpapers where they wrote the different tiers of FSR on them; they're not pictures showing the actual FSR at work :)
 

Bo_Hazem

Banned
These are wallpapers where they wrote the different tiers of FSR on them; they're not pictures showing the actual FSR at work :)

Mate, I made those from that highly compressed video. Open them and see for yourself. The 1060 one looks like shit, though, but the AMD ones are pretty good.

We'll wait for the final build ;)
 
the 3070 will match the 6900xt with DLSS and eventually beat it when the next iteration of DLSS comes out. Plus you get $650 on top? I'd say it's worth it for sure.

You could even take the $650 and put it towards a 3080 Ti if you can manage it, and then sell the 3070 for $1,400.

Everyone knows AMD's version of DLSS will be garbage out of the gate.
 

llien

Member
the 3070 will match the 3090 with DLSS
FTFY
Don't forget about this expert-level move.

A 3070 at 1080p will beat a 3090 at 4K; that is a fact that must be taken into account too.

And when DLSS 5 comes out, although it won't be supported on the 3070 (because fuck you, peasants, that's why), the 3070 with DLSS 5 at 1080p might easily beat a 5090 Ti Ultra WTF BBQ at 8K.
 

hlm666

Member
So far I've only seen better implementations from Bluepoint on Demon's Souls and Insomniac's temporal injection for the Performance and Performance RT modes (the Fidelity 4K mode is native 4K reconstructed to even higher than 4K and downsampled, to get that unique, clean look with zero aliasing).

I'm more than sure FSR will be used much more often than DLSS going forward.
Yeah, we have had supersampling for ages. You can do it many ways, like setting the in-game resolution slider above 100%, making a custom resolution, or using the AMD/Nvidia driver options (DSR/VSR), and it works with DLSS if you so choose. So yay for AMD adding something they already had anyway (just use VSR with FSR is what I'm implying).
 

longdi

Banned
I may trade a 6900xt for a 3080 and above.
Tell your friend to cough up the real deal.

Just reread: 6900xt for a 3070 + an extra $650. Not that bad then. Try to scalp your friend for another 100 bucks, make it $750.
 

tkscz

Member
Angry Season 4 GIF by The Office


Firstly, while it probably won't be as good as DLSS, given how long DLSS has been around in comparison, AMD is still bringing FFX, which will eventually give you the same results.

Secondly, the 6900XT, outside of ray tracing results (which games running on UE5 might not even use), is a much better card than the 3070. So if ray tracing is what you're after, go ahead with the trade.

Thirdly, the entire purpose of DLSS and FFX is to get 4K levels of fidelity while running the cards at 1080p. If you're trying to do 1440p, then you should be more than good there. Hell, depending on the monitor you're using, 1440p is far more than enough.

Essentially, that's a very bad trade you're making if you don't care for ray tracing and are trying to get 1440p results.

Edit: Didn't see the $650 included. In that case go ahead. Then sell the 3070 for $2,000 on eBay and get yourself a 3080 Ti or 3090.
 

longdi

Banned
OP is getting a 3070 and an extra $650, if I read it right.

The 3070 is also a non-crypto-nerfed version.

Seems like a good deal if we can bump the extra cash to $750. :messenger_beaming:
 

Velius

Banned
You as you wrote this:
War Battle GIF by ImproVivencia


Show me one person on GAF who said a "GTX 1080 was more than enough to beat consoles." I'll wait.
He's just being a tool.

Having said that, I do have a GTX 1080, and OC'd, it performs better than a PS4 Pro/XB1X. So if someone had said this last year, it would have been true.

If I can grab an Asus ROG Strix OC RTX 3090 I'll be upgrading. If not, I'm waiting until the 4000 series, because I think that will be the best time to buy; one of those, with one of the upcoming new Zen AMD CPUs, will beat the shit out of anything that could come out of the PS5/XBSX.
 

Velius

Banned
Trading a 6900XT that beats the 3090 in the newest games for a roflcopter like the 2070, to play with a glorified TAA derivative aka "DLSS 2".

Blow Your Mind Wow GIF by Product Hunt


Note: the graph below refers to ALL the games TPU normally tests with, not just the newest ones (only ComputerBase does "7 latest" kinds of tests among the sites I visit).

relative-performance_1920-1080.png


TPU
Can you please give me some backup for this? I'm not trying to argue; I want to verify that what you're saying is real, because if it is, I'm dropping the $3,000.00 on the Sapphire Toxic, as well as getting a 5900X or maybe a 5950X.
 

llien

Member
Can you please give me some backup for this? I'm not trying to argue; I want to verify that what you're saying is real, because if it is, I'm dropping the $3,000.00 on the Sapphire Toxic, as well as getting a 5900X or maybe a 5950X.
You are commenting on a post with two links backing two statements.
Exactly what is "this" that you are missing?
 