
Nvidia RTX 30XX |OT|

Dampf

Member
If that's the case, neither can RDNA2. Well at least Ampere has DLSS, imagine having none
Yeah, sadly. The worst part is Nvidia forces ridiculously high settings with RT on, so every card below a 3070/2080Ti will straight up not run the game with all RT effects at an acceptable framerate. And AMD will suffer.

Transparent RT reflections run at 100% screen resolution and a 2 km draw distance, for example, and it is impossible to change that. That will eat performance alive. Unnecessarily so; the game would still look great at a reduced resolution and draw distance.

This reeks of shady business practices...
 

Rickyiez

Member
I'm pretty sure that with a few optimized settings it will run great. Quite certain that both the Radeon and NV camps will come out with game-optimized drivers too, which will make it even more playable. That's the beauty of PC gaming.

I still remember playing Crysis back then on an 8800GT; you couldn't get much better than that at the time, and I settled for 30-45 FPS.
 

BluRayHiDef

Banned
Seems like Ampere is already dead for 4K60. Reaching 4K60 is not possible even with DLSS at max settings and ray tracing on a 3090 in Cyberpunk.

1440p DLSS is needed for 60 FPS.

What makes you say this? What have I missed? I would imagine that the RTX 3090 can run Cyberpunk 2077 at 4K/60Hz via DLSS Performance Mode (i.e. 1080p upscaled to 4K).

As long as the image has a high level of anisotropic filtering and anti-aliasing, and consequently looks clean and sharp, it'll look great. That's the case with Control, which looks fantastic at 4K via DLSS Performance Mode and consistently runs at over 100 frames per second.
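For reference, here's roughly how the internal render resolutions shake out at a 4K output. This is just a sketch; the per-axis scale factors are the commonly quoted DLSS 2.x values (Quality ~2/3, Balanced ~0.58, Performance 0.5, Ultra Performance ~1/3), so treat them as assumptions rather than gospel:

```python
# Rough sketch of DLSS internal render resolutions at a 4K output.
# Scale factors are the commonly quoted per-axis DLSS 2.x values (assumed).
OUTPUT_W, OUTPUT_H = 3840, 2160

DLSS_MODES = {
    "Quality": 2 / 3,            # ~2560x1440 internal
    "Balanced": 0.58,            # ~2227x1253 internal
    "Performance": 0.5,          # 1920x1080 internal
    "Ultra Performance": 1 / 3,  # ~1280x720 internal
}

for mode, scale in DLSS_MODES.items():
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    pixel_share = (w * h) / (OUTPUT_W * OUTPUT_H)
    print(f"{mode:18s} -> {w}x{h} ({pixel_share:.0%} of native 4K pixels)")
```

Performance mode is only shading about a quarter of the native 4K pixels, which is why the frame-rate headroom is so big.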
 

diffusionx

Gold Member
Seems like Ampere is already dead for 4K60. Reaching 4K60 is not possible even with DLSS at max settings and ray tracing on a 3090 in Cyberpunk.

1440p DLSS is needed for 60 FPS.

Well, what are the max settings? 4K/60fps/ray tracing is demanding by itself for anything, even a pedestrian PS4 port.

Max settings have, for a long time, not been meant for high resolution and frame rate on a current-gen GPU, but more for future-proofing.
 
Yeah, sadly. The worst part is Nvidia forces ridiculously high settings with RT on, so every card below a 3070/2080Ti will straight up not run the game with all RT effects at an acceptable framerate. And AMD will suffer.

Transparent RT reflections run at 100% screen resolution and a 2 km draw distance, for example, and it is impossible to change that. That will eat performance alive. Unnecessarily so; the game would still look great at a reduced resolution and draw distance.

This reeks of shady business practices...

It's gonna be no different than Witcher 3 and all of the Nvidia plug-ins. HairWorks in Witcher 3 would tank the frame rate until you went in, edited the .ini, and lowered the settings manually.

It's gonna be fine.

No different than Witcher 2 and "Ubersampling" too.


Krappadizzle

Shitty angle and pic quality, but here she be. :)



There are a few more shots, including the case with the side panel off, in an Imgur album.
Love the all black. Well done!

What makes you say this? What have I missed? I would imagine that the RTX 3090 can run Cyberpunk 2077 at 4K/60Hz via DLSS Performance Mode (i.e. 1080p upscaled to 4K).

As long as the image has a high level of anisotropic filtering and anti-aliasing, and consequently looks clean and sharp, it'll look great. That's the case with Control, which looks fantastic at 4K via DLSS Performance Mode and consistently runs at over 100 frames per second.

Considering we've seen CP2077 running at 4K60 via DLSS with everything maxed on a 2080Ti, I wouldn't worry. The guy is spreading FUD. I have no issue calling a spade a spade, as most of y'all know, but until we have more information there's no reason to go around spreading FUD like Dampf.
 

Dampf

Member
What makes you say this? What have I missed? I would imagine that the RTX 3090 can run Cyberpunk 2077 at 4K/60Hz via DLSS Performance Mode (i.e. 1080p upscaled to 4K).

As long as the image has a high level of anisotropic filtering and anti-aliasing, and consequently looks clean and sharp, it'll look great. That's the case with Control, which looks fantastic at 4K via DLSS Performance Mode and consistently runs at over 100 frames per second.
Because I contacted a German reviewer who was playing the game and he told me this.

Well, maybe it can on DLSS Performance at 4K, but certainly not on Quality, which it should. But that is not my concern; if a 3090 needs DLSS to play at 1440p, cards below it will have a very hard time getting good performance.
Considering we've seen CP2077 running at 4K60 via DLSS with everything maxed on a 2080Ti, I wouldn't worry. The guy is spreading FUD. I have no issue calling a spade a spade, as most of y'all know, but until we have more information there's no reason to go around spreading FUD like Dampf.

Well, that is certainly FUD. We have never seen CP2077 running at 4K60, even without RTX. The trailers all run at 30 FPS, and now we know why. If the 3090 can only do 1440p60 without DLSS and ray tracing, that is 4K30 on max settings.
 

Armorian

Banned
This Palit RTX 3070 is quite amazing. My undervolt testing is concluded: at stock clocks the GPU is stable at just 0.900V while KEEPING a 1,965 MHz clock - without the undervolt it was dropping into the ~18xx range.

Power draw drops from ~250W to ~185W...
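If anyone wants to sanity-check their own undervolt the same way, here's a minimal sketch that just reads power draw and the graphics clock back via NVML (assumes the pynvml / nvidia-ml-py package is installed; the undervolt itself still has to be done in Afterburner or similar - this only verifies the result):

```python
# Minimal sketch: poll GPU power draw and graphics clock via NVML
# to verify an undervolt (requires the pynvml / nvidia-ml-py package).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    for _ in range(10):
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # mW -> W
        core_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        print(f"power: {power_w:6.1f} W | core clock: {core_mhz} MHz")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```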

[Screenshots: Control, Crysis Remastered, and other captures]




Memory OC is next :messenger_tears_of_joy:
 

diffusionx

Gold Member
I signed up for Telegram and their notification bot channels, got the GPU I wanted in my cart on BB, but it sold out before I could check out. :(

Still, more promising than any attempt so far.
 
Well, that is certainly FUD. We have never seen CP2077 running at 4K60, even without RTX. The trailers all run at 30 FPS, and now we know why. If the 3090 can only do 1440p60 without DLSS and ray tracing, that is 4K30 on max settings.
Here ya go bud, took 5 seconds to find.




There's also probably at least another 40 minutes or so of footage from "influencers" that was played on a PC with a 2080Ti with DLSS at 4K. Do you need me to YouTube that for you too? So... yeah, you are spreading FUD.
 

Dampf

Member
Here ya go bud, took 5 seconds to find.




There's also probably at least another 40 minutes or so of footage from "influencers" that was played on a PC with a 2080Ti with DLSS at 4K. Do you need me to YouTube that for you too? So... yeah, you are spreading FUD.

Buddy, that video is in 1080p. And it is over a year old, so it's not really relevant anymore, since so much can change day to day in development.
 
Buddy, that video is in 1080p. And it is over a year old, so it's not really relevant anymore, since so much can change day to day in development.




They're putting the game out on 10 platforms dude, how much time do you want them to give to the smallest user base? I'm a huge PC gamer and even I don't expect that kind of treatment.
 

Rikkori

Member
Here ya go bud, took 5 seconds to find.




There's also probably at least another 40 minutes or so of footage from "influencers" that was played on a PC with a 2080Ti with DLSS at 4K. Do you need me to YouTube that for you too? So... yeah, you are spreading FUD.

In that video I think at best only diffuse illumination is ray traced; we certainly know for a fact that they didn't have reflections working at the time. I also specifically remember them not saying "with ray tracing" when the trailer launched, because it was one of the ones I kept examining. Still, like I said before - 1080p.

Btw, the footage you saw from those influencers wasn't with DLSS (nor RT); that was the stock footage released by CDPR - the influencers weren't allowed to record and share their own. DF examines it here:

 
In that video I think at best only diffuse illumination is ray traced; we certainly know for a fact that they didn't have reflections working at the time. I also specifically remember them not saying "with ray tracing" when the trailer launched, because it was one of the ones I kept examining. Still, like I said before - 1080p.

Btw, the footage you saw from those influencers wasn't with DLSS (nor RT); that was the stock footage released by CDPR - the influencers weren't allowed to record and share their own. DF examines it here:


Didn't those same influencers also say they played the PC version maxed out with DLSS on a 2080Ti? I remember a few different ones saying that.
 


My buddy is waiting for me to get a 3080 so he can have my old 1080Ti. He killed his 980Ti a few months back, has been borrowing my old 780, and literally texted me today asking "what's he gonna do?" for CP2077. I sent him this and told him to lower his settings. Got a good sensible chuckle.

-----------------------------

I wanna say I heard CDPR say they are gonna add RT for AMD later on down the road. It's gonna be interesting to see how it stacks up.
 

Bboy AJ

My dog was murdered by a 3.5mm audio port and I will not rest until the standard is dead
Looks like I'll be able to run RT Ultra at 2160p. Great.

Also, interesting how the 3080 beats the 6800XT at 4K gaming, yet everyone was fear-mongering about the 3080's "lack of" VRAM. Even though it's GDDR6X. Even though there's no evidence of it being an issue. Even though it's a top-tier card that will dictate the high-end gaming segment.
 
Really pleased with those CP2077 RT requirements. Wonder what fps they're targeting...? I'm guessing 30+, since that seems to be what devs/pubs usually aim for with their recommended requirements, and they'll mention 60fps if that's the intent. Also, is that 4K native? I assume so, but you never know.
 
Really pleased with those CP2077 RT requirements. Wonder what fps they're targeting...? I'm guessing 30+, since that seems to be what devs/pubs usually aim for with their recommended requirements, and they'll mention 60fps if that's the intent. Also, is that 4K native? I assume so, but you never know.
No way. Gotta be 60, yeah? I have no clue, but I mean... they know they have a PC audience. They started as a PC dev with their first two Witcher games, so... I dunno. Good question, really.
 

Kenpachii

Member
Looks like I'll be able to run RT Ultra at 2160p. Great.

Also, interesting how the 3080 beats the 6800XT at 4K gaming, yet everyone was fear-mongering about the 3080's "lack of" VRAM. Even though it's GDDR6X. Even though there's no evidence of it being an issue. Even though it's a top-tier card that will dictate the high-end gaming segment.

The game runs on a potato and already requires 10GB of VRAM, which, by the way, only a 3080 has. It's specifically optimized for the 3080 and still uses 10GB of VRAM. Now imagine when the 3080 isn't Nvidia's focus anymore and next-gen titles are a thing; yeah, good luck with that logic. It's Fermi all over again.
 
No way. Gotta be 60, yeah? I have no clue, but I mean... they know they have a PC audience. They started as a PC dev with their first two Witcher games, so... I dunno. Good question, really.
Seems like 'recommended' is whatever will drive 'ultra' settings at the specified resolution at a playable framerate. That last part always seems to vary by dev, unfortunately. =/
 

Bboy AJ

My dog was murdered by a 3.5mm audio port and I will not rest until the standard is dead
The game runs on a potato and already requires 10GB of VRAM, which, by the way, only a 3080 has. It's specifically optimized for the 3080 and still uses 10GB of VRAM. Now imagine when the 3080 isn't Nvidia's focus anymore and next-gen titles are a thing; yeah, good luck with that logic. It's Fermi all over again.
What game? 2077? I wasn't referencing that. I mean benchmarks of the 6800XT against the 3080 generally. The 6800XT runs like a champ at 1440p but then falls on its face at 4K compared to the 3080. And 4K is where all the VRAM complaining comes from - no substantiation behind it yet.

If my 3080 tanks in a few years... that would suck. I'm definitely not opposed to swapping it for a 3080Ti with more VRAM but even if the 3080Ti never comes to pass, I feel fine. Definitely not getting a 6800XT on the basis of VRAM alone.

Tbh, I'm kind of bummed about not having a xx80Ti card. This was the first time I could go all in on a card and I've always wanted an x80Ti. Just sounds cool to someone who has always wanted one, heh.
 
I've reached that point where I find myself loading games up not to play them, but to max out their settings and see how high I can push the resolution before I can no longer maintain 60fps. Lol. Reached 6K resolution with all settings maxed in Mad Max a while ago, and it was quite the treat racing around the desert, taking in the scenery, details so sharp you could cut yourself on them. Tried 8K, all max, but I hit 100% GPU load and couldn't rise above 50fps. /weep
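The pixel counts explain the 8K wall pretty neatly. Quick sketch below - I'm assuming "6K" here means 5760x3240 (3x 1080p per axis), which may not match the exact DSR factor used:

```python
# Relative pixel counts vs. native 4K (assumes "6K" = 5760x3240).
resolutions = {
    "4K": (3840, 2160),
    "6K": (5760, 3240),
    "8K": (7680, 4320),
}

base = resolutions["4K"][0] * resolutions["4K"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:5.1f} MPix ({pixels / base:.2f}x the work of 4K)")
```

8K is four times the pixel work of native 4K and nearly twice that of 6K, so dropping below 60fps at 100% GPU load isn't surprising.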
 

Rickyiez

Member
The game runs on a potato and already requires 10GB of VRAM, which, by the way, only a 3080 has. It's specifically optimized for the 3080 and still uses 10GB of VRAM. Now imagine when the 3080 isn't Nvidia's focus anymore and next-gen titles are a thing; yeah, good luck with that logic. It's Fermi all over again.

Neither the 3080 nor the 6800XT has an ideal VRAM configuration. No amount of extra VRAM will help if you're bandwidth starved, hence the 3080 coming out ahead in most if not all 4K benchmarks. 16GB of GDDR6 on a 256-bit bus is just not fast enough at 4K.
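The raw bandwidth gap is easy to put numbers on. Rough sketch below, using the published memory specs (19 Gbps GDDR6X on a 320-bit bus for the 3080, 16 Gbps GDDR6 on a 256-bit bus for the 6800 XT); Infinity Cache obviously muddies the effective figure, so treat this as raw bandwidth only:

```python
# Raw memory bandwidth from published specs (ignores Infinity Cache effects).
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Bandwidth in GB/s = per-pin data rate (Gbps) * bus width (bits) / 8."""
    return data_rate_gbps * bus_width_bits / 8

cards = {
    "RTX 3080 (GDDR6X)": (19.0, 320),   # 19 Gbps, 320-bit
    "RX 6800 XT (GDDR6)": (16.0, 256),  # 16 Gbps, 256-bit
}

for name, (rate, bus) in cards.items():
    print(f"{name}: {bandwidth_gb_s(rate, bus):.0f} GB/s")
# ~760 GB/s vs ~512 GB/s - roughly a 48% raw bandwidth advantage for the 3080.
```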

If my 3080 tanks in a few years... that would suck. I'm definitely not opposed to swapping it for a 3080Ti with more VRAM but even if the 3080Ti never comes to pass, I feel fine. Definitely not getting a 6800XT on the basis of VRAM alone.

Tbh, I'm kind of bummed about not having a xx80Ti card. This was the first time I could go all in on a card and I've always wanted an x80Ti. Just sounds cool to someone who has always wanted one, heh.

Pretty sure I'll be upgrading mine when the RTX 40 series or the 78xx is around.
 
Are Zotac cards any good? I've been looking to get an ASUS card, but saw that a Micro Center an hour away from me might have a Zotac one available. Wondering if I should drive down there and try my luck.
 

Kenpachii

Member
Neither the 3080 nor the 6800XT has an ideal VRAM configuration. No amount of extra VRAM will help if you're bandwidth starved, hence the 3080 coming out ahead in most if not all 4K benchmarks. 16GB of GDDR6 on a 256-bit bus is just not fast enough at 4K.



Pretty sure I'll be upgrading mine when the RTX 40 series or the 78xx is around.

I'll take slower RAM and lower 4K performance over less VRAM and faster 4K performance any day.
 
Something seems off. I get a higher score at stock with much lower clock speeds.
I think yours is about right. You look to have the better computer overall and, as much as this is a graphics card test, the rest of the system does factor into it. Just remember the old saying: "Behind every great GPU is a great CPU (and memory)."
 
I'll take slower RAM and lower 4K performance over less VRAM and faster 4K performance any day.


Just so you can see how demented this fanboyism is: you just said you would take a slower card instead of a faster one. That's it, pure and simple. So perverse is the condition of the fanboy that it makes people actually say this, just so they can cheerlead for their favorite corporation: "I would rather take a slower card than a faster one, because I'm a fanboy of corporation X." That's what you just said.
 

Patrick S.

Banned
3070 default:


With undervolt (0.900V) and memory OC from 1750 to 2000 MHz (8000 effective):


~6% increase


Man, my CPU must REALLY be holding me back :/

https://www.3dmark.com/spy/15460168?

P.S. How are you inserting those 3Dmark links with the info boxes?
 

Kenpachii

Member
Just so you can see how demented this fanboyism is: you just said you would take a slower card instead of a faster one. That's it, pure and simple. So perverse is the condition of the fanboy that it makes people actually say this, just so they can cheerlead for their favorite corporation: "I would rather take a slower card than a faster one, because I'm a fanboy of corporation X." That's what you just said.

As you clearly don't understand how hardware works, I will teach you something.

I'd rather take a 5% or even 10% hit on memory performance right now - and that's only at 4K, which all of these cards will struggle with anyway once next-gen titles hit and minimum specs go up massively - than take a 100% performance hit when games start to move past 10GB of VRAM consumption (which will happen), or have to scale down settings drastically.

You don't understand this because you have no clue how hardware works. I do understand it because I bought a GTX 580 1.5GB model and then next gen happened: every trash-tier GPU with 2GB of VRAM had no issues running next-gen games, while my 580 couldn't play them at any setting. It goes straight from 60 FPS to 0 every few seconds as it swaps. Why? VRAM bottleneck.

Example:



I bet most people here would prefer 16GB or 20GB 3080 models, but that's simply not what Nvidia offers (artificially limiting VRAM is a metagame they often play - see the 500/700/900 series), and frankly I can't blame people for upgrading, because there hasn't been a good GPU to upgrade towards in the last three years, and the 3080 will probably work fine for the next two years since next-gen titles probably won't release until 2022. So yeah, there's that.
 
Looks like I'll be able to run RT Ultra at 2160p. Great.

Also, interesting how the 3080 beats the 6800XT at 4K gaming, yet everyone was fear-mongering about the 3080's "lack of" VRAM. Even though it's GDDR6X. Even though there's no evidence of it being an issue. Even though it's a top-tier card that will dictate the high-end gaming segment.
The reason Ampere beats Big Navi at 4K is that AMD doesn't have access to GDDR6X and they've finally given up on HBM. They simply don't have enough memory bandwidth, and even with Not That Infinite Cache there's not a lot you can do. You either have enough bandwidth or you don't. Big Navi's cache design is enough for 1440p, which is why it's able to edge out Ampere there, but they ran into unknown limitations of die size or yields that prevented them from making the cache large enough for 4K.

Even with the infamous Microsoft Flight Simulator, 10 GB VRAM on 3080 vs. 16 GB VRAM on 6800XT doesn't seem to be making any real difference.
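If you want to check actual VRAM use on your own machine rather than argue from spec sheets, a quick NVML query does it. Minimal sketch (same pynvml / nvidia-ml-py assumption as earlier in the thread; note that this reports allocation, which is usually higher than what a game truly needs):

```python
# Quick check of VRAM in use vs. total via NVML (pynvml / nvidia-ml-py).
# Note: this reports allocation, which typically overstates the true requirement.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"VRAM used: {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```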

The game runs on a potato and already requires 10GB of VRAM, which, by the way, only a 3080 has. It's specifically optimized for the 3080 and still uses 10GB of VRAM. Now imagine when the 3080 isn't Nvidia's focus anymore and next-gen titles are a thing; yeah, good luck with that logic. It's Fermi all over again.
Ah, you're living in that imaginary world where the vendor with the 80% market share (Nvidia) stops being the focus and the one with the 20% market share (AMD) becomes the focus. Something which literally hasn't ever happened nor will ever happen because game developers and publishers aren't retarded.
 

Rbk_3

Member
Man, my CPU must REALLY be holding me back :/

https://www.3dmark.com/spy/15460168?

P.S. How are you inserting those 3Dmark links with the info boxes?

Even for a 6700k your graphics score should be way higher, pushing 17000. Here are all the 3080/6700k results



Set Windows Power Mode to High Performance.
Disable Intel Speed Shift in your BIOS if that's an option. Possibly update your BIOS.
Make sure all G-Sync/V-Sync is off.
In the Nvidia Control Panel, make sure you are set to high performance.
Reinstall drivers after using DDU to uninstall the old ones.
Make sure all unnecessary background applications are closed.
Make sure XMP is enabled.

My 2080S had almost the same graphics score as you
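Related to the driver and power-mode checks above: before re-running the benchmark it can help to dump the driver version and current power/clock state in one go. A small sketch using nvidia-smi's query interface (the field names are standard query fields, but double-check them against `nvidia-smi --help-query-gpu` for your driver):

```python
# Dump driver version and current power/clock/temperature state via nvidia-smi
# before re-running the benchmark (field names per --help-query-gpu).
import subprocess

fields = "driver_version,power.limit,power.draw,clocks.gr,clocks.mem,temperature.gpu"
result = subprocess.run(
    ["nvidia-smi", f"--query-gpu={fields}", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())
```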

 

BluRayHiDef

Banned
Just so you can see how demented this fanboyism is: you just said you would take a slower card instead of a faster one. That's it, pure and simple. So perverse is the condition of the fanboy that it makes people actually say this, just so they can cheerlead for their favorite corporation: "I would rather take a slower card than a faster one, because I'm a fanboy of corporation X." That's what you just said.

LMAO. That post did convey a degree of madness.
 