
AMD Radeon RX6800/RX6800XT Reviews/Benchmarks Thread |OT|

Define "handling" RT. I would argue that the nVidia cards don't handle RT today either. Let's take the biggest most important game that has been used to push RT. None of the RTX cards can handle Cyberpunk max settings with RT.

To get playable framerates at 4K, you have to run DLSS in Ultra Performance mode, which literally renders at 720p and upscales to 4K (which inevitably doesn't look that great), just to push the average above 60 fps, and even then the minimums are below 60. Nobody can dispute that fact.
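For context, a small sketch of the internal render resolutions the DLSS 2.x presets imply; the per-axis scale factors here are the commonly cited ones (roughly 2/3, 0.58, 1/2 and 1/3), so treat the exact numbers as approximate and game-dependent:

[CODE]
# Rough sketch: internal render resolution implied by each DLSS 2.x preset.
# The per-axis scale factors are the commonly cited ones; individual games may differ.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"4K output, DLSS {mode}: renders at ~{w}x{h}")
# Ultra Performance at 4K comes out to ~1280x720, i.e. the "720p upscaled to 4K" above,
# while Quality at 4K comes out to ~2560x1440.
[/CODE]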

[Image: Cyberpunk 2077 early benchmark chart]



"In 4K/Ultra/RT with DLSS Quality, the RTX3090 was able to only offer a 30fps experience."
If you think this performance is somehow acceptable, especially for a $1500 graphics card, or you classify it as "handling RT" or being "top of the line", then the performance in the majority of other games with RT is perfectly acceptable for the 6800 cards as well. The same applies to lowering RT settings to achieve playable framerates.

The best use of DLSS is still without RT, because it enables something like an RTX 2060 to deliver playable framerates at 4K. RT is currently a liability, not an asset.


That's why Nvidia released RT together with DLSS. You can run RT at Ultra at 1440p with DLSS Balanced: 60 frames, though you will dip into the 50s on occasion. And this is with the most extensive RT implementation in existence to date. You can tailor the experience by being selective about which RT features you keep enabled, or you can use DLSS Auto, which will keep it in Quality or Balanced most of the time. Benchmarking the game without DLSS is of academic interest at most, because no owner of an RT card will play the game without it. The technicalities of what DLSS does, that it's not "true" 4K or 1440p, don't matter. The end result goes head to head with native and enables 60-frame RT gameplay.
 

Ascend

Member
That's why Nvidia released RT together with DLSS. You can run RT at Ultra at 1440p with DLSS Balanced: 60 frames, though you will dip into the 50s on occasion. And this is with the most extensive RT implementation in existence to date. You can tailor the experience by being selective about which RT features you keep enabled, or you can use DLSS Auto, which will keep it in Quality or Balanced most of the time.
Might as well disable everything to get actually good framerates. But, thanks for accidentally confirming that these cards are still not good enough for RT, even with upscaling. Because if they were, you wouldn't have to selectively disable anything.

And really... If you're going to buy a $1500 card, are you really going to run the game at upscaled 1440p 60fps...?

Benchmarking the game without DLSS is of academic interest at most, because no owner of an RT card will play the game without it. The technicalities of what DLSS does, that it's not "true" 4K or 1440p, don't matter. The end result goes head to head with native and enables 60-frame RT gameplay.
All the figures I posted in my last post were WITH DLSS:
"In 4K/Ultra/RT with DLSS Quality, the RTX3090 was able to only offer a 30fps experience."
 
The most ignored issue is that it is a big gamble to bet that games will start being made fully with RT even on PC, because the market is regulated by what the consoles can do. The PS4 launched in 2013 with 8GB of RAM; 7 years later, 8GB is still all you need on PC GPUs. Consoles dictate the rules, and now they have 16GB and simple RT capabilities. Owners of the 3080 will be crying foul play for years to come, with its anemic 10GB of RAM. My 1080 Ti has 11GB.


It's not regulated at all by what consoles can do, seeing as we have had RT games since 2018 exclusively on PC, and we have reached several dozen by now. Going forward, as evidenced by Watch Dogs Legion, consoles will have a very cut-back implementation while PC goes way ahead in quality/extra RT options. 10 gigs is more than enough at this point. I don't believe 3080 owners will cry jack shit because they won't need to. If the 5080 is out by the time 10 gigs isn't enough, it means 10 gigs was a good call. We'll just have to wait and see when that game comes out.

Might as well disable everything to get actually good framerates. But, thanks for accidentally confirming that these cards are still not good enough for RT, even with upscaling. Because if they were, you wouldn't have to selectively disable anything.

And really... If you're going to buy a $1500 card, are you really going to run the game at upscaled 1440p 60fps...?

All the figures I posted in my last post were WITH DLSS:


I didn't confirm anything of the sort. It's your extreme bias seeing what you want to see, morphing reality into what you would like it to be, not what it actually is. The reality is that you can play Cyberpunk at 1440p, ultra everything, at 60 frames over 90% of the time.

Debating pricing, DLSS or other bullshit won't change that.
 

llien

Member
The review on AMD RT performance has been studied and published on all corners of the internet

This is a highly misleading statement.

The elephant in the room is that despite all the "super bright" future promises, all the hyping and outright paying devs to add support for it, it's tech that brings dubious advantages to the table, its most recognizable feature is "it brings down FPS" and it is still available in just a handful of games.

With the pool of games being so small, most of them green-sponsored, with clear outliers like WD:L and Dirt 5, and dubious "DXR 1.1 could kinda run on old hardware" statements, the actual situation is far from clear, no matter how many no-name dudes on the internet "review" it.

With even a $2000, 350+ watt card being brought to its knees with it enabled, and the way things are going (UE5, consoles), the prospects of this "wonderful" tech are far from "guaranteed to flourish".

playable framerates at 4K, you have to run DLSS
It is rather "running it at 1440p, and pretending it is 4k"
 

Ascend

Member
I didn't confirm anything of the sort. It's your extreme bias seeing what you want to see, morphing reality into what you would like it to be, not what it actually is.
Yes you did. You said you have to lower RT settings to get playable framerates. That equates to not being fast enough.
But obviously you won't admit it because you have to shove this nVidia stuff down everyone's throat. I wouldn't be surprised if you're on their payroll. It would be extremely sad if you weren't.

The reality is that you can play Cyberpunk at 1440p, ultra everything, at 60 frames over 90% of the time.
With a card that costs over $1,000.

Debating pricing, DLSS or other bullshit won't change that.
And the 6800 can also do RT at whatever random resolution reaches 60 fps. Does that somehow justify the 6800 cards being good at RT now? Or are you now suddenly going to talk about pricing and DLSS?
 

Ascend

Member
Seems there's no point in talking with you, as you simply choose to ignore reality and accuse me or others of shilling for merely stating facts. You even lie and misrepresent things that I just posted. You will go on ignore alongside llien.
I'm the one ignoring reality? You're a funny one, aren't you? Tell me which fact I provided was wrong. The one where the RTX 3090 only offers a 4K/30fps experience with RT in Cyberpunk while having DLSS enabled?
Whoops. You're so triggered and mired in cognitive dissonance that you had to put me on ignore so that you can live with yourself.

I did not misrepresent anything. You said something without wanting to say it, and now you want to take it back but can't. Live with it.

It's better for me for you to put me on ignore. Then I can shoot your propaganda down without having to deal with the vomit in return.
 

spyshagg

Should not be allowed to breed
It's not regulated at all by what consoles can do, seeing as we have had RT games since 2018 exclusively on PC, and we have reached several dozen by now. Going forward, as evidenced by Watch Dogs Legion, consoles will have a very cut-back implementation while PC goes way ahead in quality/extra RT options. 10 gigs is more than enough at this point. I don't believe 3080 owners will cry jack shit because they won't need to. If the 5080 is out by the time 10 gigs isn't enough, it means 10 gigs was a good call. We'll just have to wait and see when that game comes out.

I don't agree.

Consoles do dictate the baseline for all games except PC exclusives. The PC market isn't driven by the same type of game you see on consoles, and unfortunately the kind of market PC money comes from isn't known for driving technology; in fact, those games are tweaked to be able to run on iGPUs. You won't see fully RT multiplatform games being made. We will see RT "options" that are toggleable in the PC menus, except for the aforementioned PC exclusives.

This is a highly misleading statement.

The elephant in the room is that despite all the "super bright" future promises, all the hyping and outright paying devs to add support for it, it's tech that brings dubious advantages to the table, its most recognizable feature is "it brings down FPS" and it is still available in just a handful of games.

With the pool of games being so small, most of them green-sponsored, with clear outliers like WD:L and Dirt 5, and dubious "DXR 1.1 could kinda run on old hardware" statements, the actual situation is far from clear, no matter how many no-name dudes on the internet "review" it.

With even a $2000, 350+ watt card being brought to its knees with it enabled, and the way things are going (UE5, consoles), the prospects of this "wonderful" tech are far from "guaranteed to flourish".


It is rather "running it at 1440p, and pretending it is 4k"

You didn't read the rest of the post you quoted where we come to the same conclusion. I was merely pointing out that Dirt5 RT isn't representative.
 

Sun Blaze

Banned
For everyone thinking RT is so important, feel free to participate in this RT blind test;


Why not let people judge for themselves in-game rather than a biased test? RT doesn't look the same in every scene and every game.

In Cyberpunk, for instance, it absolutely does make a pretty significant difference in some areas, but in others you can only tell by watching your frame rate tank.

RT does make a difference; the question is whether the frame rate drop is worth it. In most cases, I would say no. Going from 90 fps to the mid-50s with RT on isn't worth a couple of reflections and nicer lights for me.
 

Sun Blaze

Banned
Although I agree with your conclusion, why is this test somehow biased...?
Because his position is obviously that RT isn't noticeable, so he'd use a video where you don't notice it. If I wanted to showcase RT, I could pick out a scene where you'd have to be blind not to see it.

For Cyberpunk, he uses a fast-moving car chase where you'd miss even reflections (something which is otherwise easy to spot).

Not that I disagree with the general conclusion. Outside of reflections and sometimes GI, RT is rather subtle. Shadows are just a waste 99% of the time.

Artists benefit more from it than we do because it cuts down some of their work, but not for now, because they still need to author both raster and RT rather than just RT.
 

mitchman

Gold Member
Might as well disable everything to get actually good framerates. But, thanks for accidentally confirming that these cards are still not good enough for RT, even with upscaling. Because if they were, you wouldn't have to selectively disable anything.

And really... If you're going to buy a $1500 card, are you really going to run the game at upscaled 1440p 60fps...?

All the figures I posted in my last post were WITH DLSS:
They are good enough for RT, but not with maxed-out settings. Cards that can do that will be coming in 1-2 years' time; CP2077 and other RT titles are the Crysis of today, meaning they scale both up and down as technology progresses. For most practical purposes, reducing some settings to get decent performance does not lower the quality in any visibly meaningful way for most eyes.
If I'm ever able to get my pre-order of the 3080 delivered, I will double dip and play through CP2077 again on PC, but for now it's XSX (PS5 is impossible to get hold of).
 

Rikkori

Member
So, having re-subbed to ubi+ again after having run through Cyberpunk twice, I've decided to test Legion with RT. Strangely I'm seeing the same sort of underutilisation of the GPU as used to be common with Ghost Recon games. By which I mean the gpu usage is at 99% but the power draw is noticeably down. Normally I'd be pushing the card to 220-230w but here, even with RT on and at 4K (but with all the CPU options down so as to not bottleneck it there) I'm lucky if it stays at 200w. Even if I disable RT it doesn't seem to go much past that still. Clearly there are aspects of the card that are not being put to good use here, as used to be more common for GCN cards and certain engines.
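As a rough sanity check of that gap, something like the following over a hardware-monitor log would quantify it; the CSV column names here are placeholders for whatever Afterburner/HWiNFO actually writes in your setup:

[CODE]
# Minimal sketch: how far below its usual board power the card sits while "busy",
# estimated from a CSV hardware-monitor log. Column names are placeholders.
import csv

TYPICAL_BOARD_POWER_W = 225.0   # what the card normally sustains under full load (~220-230 W)

def average_busy_power(log_path, usage_col="GPU usage [%]", power_col="Board power [W]"):
    samples = []
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            if float(row[usage_col]) >= 95.0:   # only count samples where the GPU reports full load
                samples.append(float(row[power_col]))
    return sum(samples) / len(samples)

avg = average_busy_power("legion_4k_rt_log.csv")   # hypothetical log file
deficit = 1.0 - avg / TYPICAL_BOARD_POWER_W
print(f"Average draw at 95%+ usage: {avg:.0f} W ({deficit:.0%} below the usual ~225 W)")
[/CODE]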

I guess all the talk about the game underutilising the GPU was true after all; I'm 100% sure they could eke out another 20% performance if they put the work in. Probably unlikely, as they've moved on to the next-gen games, but eh, who knows.

The funny thing is I finished it with an RX 480 the first time (and had a good time, and it performed well), and I've mostly kept the same settings besides upping the resolution now with an RX 6800, or turning on RT if I feel like it. But honestly the SSR does a really good job overall in this game, and I don't feel the RT does enough outside of specific instances (like flying through glass arches) to warrant a third of the performance all the time. Sadly, turning up other settings is also mostly a no-go, because they bring even optimised 5950X setups to their knees (maxing shadows & extra details, etc.).

Though on the plus side, having played CP2077 and come back to this game, I now see it as so much worse, so I'm unlikely to do another playthrough regardless of RT or anything else. It's funny: as flawed as CP2077 is, there's so much love and care put into it, and it's so much fun, that going back to most other open-world games feels like going back a decade. It's just too bad I've had my fill.
 

regawdless

Banned
I guess all the talk about the game underutilising the GPU was true after all; I'm 100% sure they could eke out another 20% performance if they put the work in. Probably unlikely, as they've moved on to the next-gen games, but eh, who knows.

A 20% increase seems kinda outlandish to me. What do you base this high estimate on, if you're 100% sure?
 
llien

You can finally stop throwing around that false Watch Dogs Legion raytracing benchmark. An update has fixed the low quality issue and it now provides normal raytracing.



At 1440p RT high
6800XT with SAM - 48 fps
3080 - 66 fps
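Put in relative terms (just the ratio of the two numbers above, nothing fancier assumed):

[CODE]
# Relative gap between the two results quoted above.
amd_fps, nvidia_fps = 48, 66
print(f"3080 lead over 6800XT: {nvidia_fps / amd_fps - 1:.0%}")   # ~38%
print(f"6800XT relative to 3080: {amd_fps / nvidia_fps:.0%}")     # ~73%
[/CODE]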


Yeah it seems those initial comparison charts are flawed/not representative of the "real" performance due to the issues the game had with RT on at launch on AMD cards. Thankfully the issue has now been fixed as most reasonable people knew it would be.

Seeing as we are calling out llien on this, as he does overuse those charts a bit, are we also going to call out the multiple posters in this thread who purposely posted the RT comparison of this game with the missing reflections etc., with big red circles pointing them out, despite knowing for a fact that there was an issue with RT confirmed by AMD in their press packets as a "known issue"? This comparison picture was posted multiple times in this thread by multiple posters, even after it was pointed out that this was a known issue likely to be fixed shortly with an update.

These people took it one step further in their FUD campaign trying to claim that either the developers purposely left out RT effects or rendered them at lower quality on AMD because the AMD RT solution was just too weak, or that AMD themselves somehow disabled these effects in their drivers to save performance. They even went one step further trying to foster a narrative that all games running with RT effects turned on with AMD cards were somehow purposely rendering less/lower quality effects compared to Nvidia and lying about it in some way.

It turned out in the end that the issue was actually on Ubisoft's end: they had not officially enabled RT for AMD GPUs at launch, which makes the glitched RT far more understandable. They released this update, which officially enables RT on AMD cards. Anyway, I guess my point is that there is a lot of crow to be served regarding RT performance in this game, regardless of "camps", "teams" or brand preference.
 

regawdless

Banned
Yeah it seems those initial comparison charts are flawed/not representative of the "real" performance due to the issues the game had with RT on at launch on AMD cards. Thankfully the issue has now been fixed as most reasonable people knew it would be.

Seeing as we are calling out llien on this, as he does overuse those charts a bit, are we also going to call out the multiple posters in this thread who purposely posted the RT comparison of this game with the missing reflections etc., with big red circles pointing them out, despite knowing for a fact that there was an issue with RT confirmed by AMD in their press packets as a "known issue"? This comparison picture was posted multiple times in this thread by multiple posters, even after it was pointed out that this was a known issue likely to be fixed shortly with an update.

These people took it one step further in their FUD campaign trying to claim that either the developers purposely left out RT effects or rendered them at lower quality on AMD because the AMD RT solution was just too weak, or that AMD themselves somehow disabled these effects in their drivers to save performance. They even went one step further trying to foster a narrative that all games running with RT effects turned on with AMD cards were somehow purposely rendering less/lower quality effects compared to Nvidia and lying about it in some way.

It turned out in the end that the issue was actually on Ubisoft's end: they had not officially enabled RT for AMD GPUs at launch, which makes the glitched RT far more understandable. They released this update, which officially enables RT on AMD cards. Anyway, I guess my point is that there is a lot of crow to be served regarding RT performance in this game, regardless of "camps", "teams" or brand preference.

I have no horse in this race and don't care about brands at all. I posted it because I've quoted and corrected llien five times on it while he just continued to spread it.

Wasn't involved in the other FUD, but it should be called out as well, of course. No one needs that biased crap.
 

BluRayHiDef

Banned
All the figures I posted in my last post were WITH DLSS:

Ascend said:
"In 4K/Ultra/RT with DLSS Quality, the RTX3090 was able to only offer a 30fps experience."

This is absolute bollocks. I'm playing Cyberpunk 2077 in 4K with the "Ultra" graphics preset (including ray tracing) via DLSS Quality Mode on an RTX 3090; my frame rate ranges from ~36 to ~45 frames per second.

When you consider that this is a huge open-world game with lots of NPCs and vehicles, it's understandable that even an RTX 3090 cannot hit 60 frames per second under the "Ultra" graphics preset, especially with ray tracing enabled. Recall how many generations of cards it took for Grand Theft Auto V to become playable at 4K-60FPS at decent settings.

Having said that, because Cyberpunk 2077 is not a fast-reaction game, it doesn't need to be rendered at 60 frames per second to feel smooth; as long as the frame rate is a few frames above 30, the game feels smooth (I've locked it at 30 to see for myself and it feels sluggish).

 

Rikkori

Member
20% increase seems kinda outlandish to me. What do you base this high estimate on if you're 100% sure?

Just raw power left on tap & from differences I saw with Wildlands vs Breakpoint + the before/after post-Vulkan. And lots of other things that I'm not bored enough to go into.
 

Ascend

Member
This is absolute bollocks. I'm playing Cyberpunk 2077 in 4K with the "Ultra" graphics preset (including ray tracing) via DLSS Quality Mode on an RTX 3090; my frame rate ranges from ~36 to ~45 frames per second.

When you consider that this is a huge open-world game with lots of NPCs and vehicles, it's understandable that even an RTX 3090 cannot hit 60 frames per second under the "Ultra" graphics preset, especially with ray tracing enabled. Recall how many generations of cards it took for Grand Theft Auto V to become playable at 4K-60FPS at decent settings.

Having said that, because Cyberpunk 2077 is not a fast-reaction game, it doesn't need to be rendered at 60 frames per second to feel smooth; as long as the frame rate is a few frames above 30, the game feels smooth (I've locked it at 30 to see for myself and it feels sluggish).


They are not my numbers. They are Tom's Hardware's numbers. Maybe performance improved after updates. The game was/is buggy after all.

But at this point I'm in a place where I don't give a shit. The truth doesn't matter to people so I'm not gonna bother anymore. Let people believe whatever they want to believe.
 

regawdless

Banned
They are not my numbers. They are Tom's Hardware's numbers. Maybe performance improved after updates. The game was/is buggy after all.

But at this point I'm in a place where I don't give a shit. The truth doesn't matter to people so I'm not gonna bother anymore. Let people believe whatever they want to believe.

I'm kinda interested how you would summarize the "truth" about the situation regarding the 6800xt vs 3080 for example.

Either way, I think we can all have a pretty clear picture about the performance of the available cards. We literally have hundreds of benchmarks with tons of data.

Normally no need for brand wars.
 

regawdless

Banned
Just raw power left on tap & from differences I saw with Wildlands vs Breakpoint + the before/after post-Vulkan. And lots of other things that I'm not bored enough to go into.

Bummer. You got me hyped. Seeing how the 6800XT does very well in Watch Dogs Legion, I would be very surprised to see a 20% increase, and I see no logical explanation for how that'd be possible.
 

Ascend

Member
I'm kinda interested how you would summarize the "truth" about the situation regarding the 6800xt vs 3080 for example.
Yeah... You're really not. Considering your past actions, you have already shown that you're not interested in the truth, so I'm not gonna bother with long stories.
I'll give you a short version, which ultimately you will disregard anyway, but whatever: a car with a top speed of 30 mph is better than one with a top speed of 15 mph, but neither is viable for highway driving. And at this point, you get what you can find, if you're willing to fork over the money, which I don't recommend.

Either way, I think we can all have a pretty clear picture about the performance of the available cards. We literally have hundreds of benchmarks with tons of data.

Normally no need for brand wars.
Then why are so many constantly talking about RTX cards in a 6800 series thread?

summarize the "truth" about the situation regarding the 6800xt vs 3080 for example
🤷‍♂️
 

regawdless

Banned
Yeah... You're really not. Considering your past actions, you have already shown that you're not interested in the truth, so I'm not gonna bother with long stories.
I'll give you a short version, which ultimately you will disregard anyway, but whatever: a car with a top speed of 30 mph is better than one with a top speed of 15 mph, but neither is viable for highway driving. And at this point, you get what you can find, if you're willing to fork over the money, which I don't recommend.


Then why are so many constantly talking about RTX cards in a 6800 series thread?


🤷‍♂️

Show me where I wasn't interested in the truth and not open to being corrected. I'm always glad if people prove me wrong, so I can adjust my view.

Sorry if it came across the wrong way; I'm honestly interested in your assessment because we often disagree. That's more valuable input than hearing from people who have the same opinion as me.

I didn't think that asking about a comparison of two cards was equal to starting a brand war. I don't want this to go toxic; it's fine if you don't answer.
 

Rickyiez

Member
Come on guys, go out and enjoy some games; it's the Steam sale again. Stop stressing out over brands or those delusional people. When it's too obvious that some are blinded by brand loyalty, there's always the ignore feature.
 

Chiggs

Member
Come on guys, go out and enjoy some games; it's the Steam sale again. Stop stressing out over brands or those delusional people. When it's too obvious that some are blinded by brand loyalty, there's always the ignore feature.

I think some of these people you're referring to need sex more than games, quite frankly. Here's hoping everyone gets some this Holiday Season.
 

Rikkori

Member
Bummer. You got me hyped. Seeing how the 6800XT does very well in Watch Dogs Legion, I would be very surprised to see a 20% increase, and I see no logical explanation for how that'd be possible.

Bruh, did you even read the first post you replied to?

Strangely I'm seeing the same sort of underutilisation of the GPU as used to be common with Ghost Recon games. By which I mean the gpu usage is at 99% but the power draw is noticeably down. Normally I'd be pushing the card to 220-230w but here, even with RT on and at 4K (but with all the CPU options down so as to not bottleneck it there) I'm lucky if it stays at 200w. Even if I disable RT it doesn't seem to go much past that still. Clearly there are aspects of the card that are not being put to good use here, as used to be more common for GCN cards and certain engines.

If you wanna argue it's 10-15% instead of 20% - cool, whatever. Maybe it's 13.37% exactly. But to say "no LOGICAL reason"? C'mon.
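For the record, the back-of-the-envelope version of that number, assuming (and it's a big assumption) that sustained performance scales roughly linearly with sustained board power:

[CODE]
# Back-of-the-envelope headroom estimate. Assumes performance scales roughly
# linearly with sustained board power, which is only a first-order approximation.
observed_w = 200.0   # what the card actually draws in Legion at 4K
typical_w  = 230.0   # what the same card sustains in other heavy games

print(f"Implied headroom: {typical_w / observed_w - 1.0:.0%}")   # -> 15%
[/CODE]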
 

regawdless

Banned
Bruh, did you even read the first post you replied to?



If you wanna argue it's 10-15% instead of 20% - cool, whatever. Maybe it's 13.37% exactly. But to say "no LOGICAL reason"? C'mon.

Why so emotional.

You are referring to the power draw being too low and concluding that the GPU is underutilized, saying that you're 100% sure it could be 20% faster. That would put the 6800 on par with or even above the 6800XT, which I would find very unlikely, because higher power draw doesn't automatically equal significantly more compute power. That's why I'm asking why you're 100% sure about that 20%.

Because again, I can't imagine where this increase could come from. Is that such a crazy question from my side?

No need to be offended. "Bruh".
 

Ascend

Member
Show me where I wasn't interested in the truth and not open to being corrected. I'm always glad if people prove me wrong, so I can adjust my view.
I'll take your word for it.

Sorry if it came across the wrong way; I'm honestly interested in your assessment because we often disagree. That's more valuable input than hearing from people who have the same opinion as me.
Ok. I have a post history if you're so interested. I appreciate the gesture of trying to keep things peaceful and attempting to have a conversation. But I honestly cannot be bothered anymore. If there is any news or something similar regarding the 6000 series, I will post it, and that's that.

I didn't think that asking about a comparison of two cards was equal to starting a brand war. I don't want this to go toxic; it's fine if you don't answer.
Normally it isn't. But with the trolls constantly lurking to dictate which opinions are allowed and which aren't, throwing out backhanded comments every chance they get (there were just two again), and then bullying other users, at this point I will simply avoid any discussion of anything that is not about the 6800 series itself.
 

regawdless

Banned
I'll take your word for it.

Ok. I have a post history if you're so interested. I appreciate the gesture of trying to keep things peaceful and attempting to have a conversation. But I honestly cannot be bothered anymore. If there is any news or something similar regarding the 6000 series, I will post it, and that's that.

Normally it isn't. But with the trolls constantly lurking to dictate which opinions are allowed and which aren't, throwing out backhanded comments every chance they get (there were just two again), and then bullying other users, at this point I will simply avoid any discussion of anything that is not about the 6800 series itself.

Understandable. I mean, I engaged you on several occasions to counter your anti-Nvidia stance in other threads, and I will continue to challenge people in the pursuit of "truth", being glad to be challenged myself, because it's fun :messenger_grinning_squinting:

Nonetheless, you're an interesting "sparring partner" and my question came from an honest place.
Don't take it too seriously when I'm throwing some shade; this is an online forum and we are supposed to have fun, after all.

In that sense, Merry Christmas :messenger_grinning_smiling:
 

Rikkori

Member
Why so emotional.

You are referring to the power draw being too low and concluding that the GPU is underutilized, saying that you're 100% sure it could be 20% faster. That would put the 6800 on par with or even above the 6800XT, which I would find very unlikely, because higher power draw doesn't automatically equal significantly more compute power. That's why I'm asking why you're 100% sure about that 20%.

Because again, I can't imagine where this increase could come from. Is that such a crazy question from my side?

No need to be offended. "Bruh".
OK cpt literal. :rolleyes:
 

psorcerer

Banned
I don't know what you're talking about, but my frametime never goes above ~27ms. The game is smooth.
In Afterburner it's called the "1% low"; you can get it in the benchmark mode.

BTW, "my frametime" should really be "my 1000 ms average frametime", because that's what you see on screen.
 

BluRayHiDef

Banned
In Afterburner it's called the "1% low"; you can get it in the benchmark mode.

BTW, "my frametime" should really be "my 1000 ms average frametime", because that's what you see on screen.
You're spewing conjecture because you don't want to accept that some people are enjoying this game with all of the bells and whistles, including ray tracing, and are experiencing great performance. I don't care about rare 1% lows; the game feels smooth and looks great. That's all that matters.
 

Rikkori

Member
For people FOMO'ing on RTX, don't be (RTX on vs off; notice the blurring of reflections: DLSS can't upscale them, so you get the low-res version, as if playing at that resolution):

[Images: five pairs of RTX on/off comparison screenshots]

If you remember this was pointed out with WD:L too:

[Image: WD:L comparison screenshot]
 
^ You don't even know what those pics are. They're RT on and off, not DLSS. Especially the one in daytime: those shadows aren't different because of DLSS, it's because one has RT on and the other one doesn't. Normal shadows have that abrupt cutoff at the edges; they're very sharp, regardless of whether that's accurate or not. RT shadows get more diffuse the further away the object casting them is. Many reflections in this game are diffuse and unclear because of the surfaces that cast them. You're not gonna get mirror-like reflections from road asphalt; what you see there isn't because of DLSS.

DLSS does indeed apply a small layer of blurriness, but it's completely manageable when you factor in that it's the difference between RT being playable or not.





So now, after we have concrete facts that the AMD lineup is slower at every resolution in raster, lacks every next-gen feature, is slower in production apps and streaming, slower and worse in everything, we're switching to trying to downplay the competition's advantages? With a very low-effort post, at that.
 

regawdless

Banned
For people FOMO'ing on RTX, don't be (RTX on vs off; notice the blurring of reflections: DLSS can't upscale them, so you get the low-res version, as if playing at that resolution):

[Images: five pairs of RTX on/off comparison screenshots]

If you remember this was pointed out with WD:L too:

[Image: WD:L comparison screenshot]

Later tonight (in about 13 hrs) I'll post some screenshots with DLSS on and off; then we'll be able to judge. I have only switched back and forth a couple of times when I started the game and haven't seen a significant difference. But it'll be interesting to see the real differences.

It was very bad in WD:L, to the point that I did not use DLSS. It seems to be better in Cyberpunk; let's see when I'm at my PC again.
 

psorcerer

Banned
You're spewing conjecture because you don't want to accept that some people are enjoying this game with all of the bells and whistles, including ray tracing, and are experiencing great performance. I don't care about rare 1% lows; the game feels smooth and looks great. That's all that matters.
Ehm, people enjoy 30fps games on consoles all the time. But last time I checked, that was a big no-no in the PCMR world...
This game dropping as low as 17 fps in the 1% lows while maintaining 40 fps average (that's what people with Ampere reported) is kind of worse than any console game I know (besides Bloodborne).
 

BluRayHiDef

Banned
Ehm, people enjoy 30fps games on consoles all the time. But last time I checked, that was a big no-no in the PCMR world...
This game dropping as low as 17 fps in the 1% lows while maintaining 40 fps average (that's what people with Ampere reported) is kind of worse than any console game I know (besides Bloodborne).

I have never experienced any drops to the teens on my RTX 3090. I'm playing the game at 4K via DLSS Quality Mode with the "Ultra" graphics preset and my frame rate has never dropped below the 30s. What you've heard is bollocks.
 

regawdless

Banned
Rikkori

Just checked out the DLSS. As others pointed out, you're wrong. I don't see any difference in the reflections.

First you claim some imaginary, magical 20% performance boost for the 6800. Now you post totally wrong comparison pictures trying to trash DLSS.

You don't seem like a happy new GPU owner that is confident in his purchase.
 