
Tech Press Accidentally Prove RTX Ray Tracing ISN'T Ready!

Allandor

Member
It looks like some people just don't remember how GPU technology has progressed. I still remember the Riva TNT days (the performance impact of 32-bit color mode was HUGE) and shaders on the GeForce 3 (90 fps in DX7 mode dropping to 30 fps in DX8 mode in certain games). Yet the technology was adopted, and these days all games use 32-bit color and shaders. But I have to say I don't remember people complaining that much about 32-bit mode or shader performance, or saying things like 32-bit color or shaders aren't ready.
I really disagree on that point about 32-bit color and shaders at that time. 32-bit was a huge performance hit in those days. Voodoo cards weren't able to produce a 32-bit picture until the Voodoo 4/5. There was a small trick with a so-called "22-bit" color mode that enhanced the colors at a minor performance cost. With the TNT2, 32-bit was much better, but 16-bit was still preferable because of the performance hit.
Same goes for the GeForce 3 shaders. There were some examples, but the tech was not ready. By the time shaders were really needed, the GeForce 3's shader version was too old or the card was too slow.
And T&L... as soon as CPUs passed 1 GHz, the CPU was faster at processing T&L than the GPU. That was a really bad first entry into the market.
But new technologies are normally not that good in their first attempt (...does anyone remember the NV1? ^^).

With RT it is the same. The hardware has the feature, but it is too slow to put it to really good use. E.g. look at the Minecraft RT demo video: the light needs multiple seconds to reach a wall. That happens because they accumulate the result over time, because computing it fully each frame would be too much for the hardware.
My point is, don't buy RTX cards because you want to play future RT games with them. They are not ready for future games. Buy them because Nvidia has the fastest cards right now, if you have the money to spare.

I really doubt we will see that many RT games in the future that completely replace the old techniques with RT, simply because after the 7 nm process there won't be many steps left to make chips smaller and more efficient. So at least at high resolutions we will always need hybrid solutions. E.g. RT really sucks (performance-wise) if you can see really far in a game (which is why Battlefield does not have that many reflections in the distance).
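For a rough sense of what "accumulated over time" means in that Minecraft example, here is a minimal C++ sketch of temporal accumulation: each frame only blends a small fraction of the newly traced lighting into a history buffer, so a light that switches on takes a couple of seconds to fully "arrive". The 5% blend factor and single-pixel setup are illustrative assumptions, not Minecraft RTX's actual values; real denoisers also reproject and clamp the history.

```cpp
#include <cstdio>

int main() {
    const float blend  = 0.05f;  // weight given to each new frame; history keeps 95%
    const float target = 1.0f;   // new ray-traced radiance once the light turns on
    float history = 0.0f;        // accumulated radiance for one pixel (light was off)

    // At 60 fps, watch how long the accumulated value takes to approach the target.
    for (int frame = 1; frame <= 180; ++frame) {
        history += blend * (target - history);
        if (frame % 60 == 0)
            std::printf("t = %.0fs  accumulated radiance = %.3f\n", frame / 60.0f, history);
    }
    return 0;
}
```

With these numbers the pixel only reaches ~95% of the new lighting after a full second and ~99.8% after two, which is exactly the slow "light creeping across the wall" effect described above.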
 
Last edited:

thelastword

Banned
Couldn't devs mix ray-tracing with rasterization and cube maps and whatever other techniques currently in use to create much of the same effects with a lesser hit to performance? Does it really have to be one or the other? I mean, at least until RT tech is truly up to the task without tanking performance so much?

On a somewhat similar note:

I seem to remember that on PS5 they said ray tracing was going to be used mostly for sound (however that's supposed to work... echoes and surround-sound effects, I guess) and would come at a much lower performance cost than if it were used for visuals.
That's what AMD is going to do: reduce the footprint considerably... What people don't understand is that if the footprint goes down considerably with a more researched and smarter approach, the quality of ray tracing will also go up...

Also, Sony is not going to use ray tracing just for sound; it's just another area where they will utilize RT on their console for better immersion in games...
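On the "ray tracing for sound" idea above, a toy sketch of how such systems tend to work: cast a handful of rays from the sound source toward the listener and use the blocked fraction to drive occlusion/muffling. The scene (one spherical blocker standing in for a wall), the ray count, and all values are made up for illustration; the point is that a few rays per source go a long way, which is why the performance cost is so much lower than for visuals.

```cpp
#include <algorithm>
#include <cstdio>
#include <random>

struct Vec3 { float x, y, z; };
static Vec3  operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3  operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3  operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Does the segment from 'from' to 'to' pass through a sphere (a stand-in occluder)?
bool segmentBlocked(Vec3 from, Vec3 to, Vec3 center, float radius) {
    Vec3 d = to - from;
    float t = std::clamp(dot(center - from, d) / dot(d, d), 0.0f, 1.0f);
    Vec3 diff = (from + d * t) - center;            // closest point on segment to center
    return dot(diff, diff) < radius * radius;
}

int main() {
    const Vec3 source{0.0f, 0.0f, 0.0f}, listener{10.0f, 0.0f, 0.0f};
    const Vec3 wall{5.0f, 0.5f, 0.0f};              // occluder sits near the direct path
    const int  rays = 16;

    std::mt19937 rng(42);
    std::uniform_real_distribution<float> jitter(-1.0f, 1.0f);

    int blocked = 0;
    for (int i = 0; i < rays; ++i) {
        // Jitter the target around the listener to approximate head size / diffraction.
        Vec3 ear = listener + Vec3{0.0f, jitter(rng), jitter(rng)} * 0.3f;
        if (segmentBlocked(source, ear, wall, 0.5f)) ++blocked;
    }
    float occlusion = float(blocked) / rays;
    std::printf("occlusion = %.2f -> pass %.0f%% of the sound / add low-pass\n",
                occlusion, 100.0f * (1.0f - occlusion));
    return 0;
}
```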
 

pawel86ck

Banned
I really disagree on that point about 32-bit color and shaders at that time. 32-bit was a huge performance hit in those days. Voodoo cards weren't able to produce a 32-bit picture until the Voodoo 4/5. There was a small trick with a so-called "22-bit" color mode that enhanced the colors at a minor performance cost. With the TNT2, 32-bit was much better, but 16-bit was still preferable because of the performance hit.
Same goes for the GeForce 3 shaders. There were some examples, but the tech was not ready. By the time shaders were really needed, the GeForce 3's shader version was too old or the card was too slow.
And T&L... as soon as CPUs passed 1 GHz, the CPU was faster at processing T&L than the GPU. That was a really bad first entry into the market.
But new technologies are normally not that good in their first attempt (...does anyone remember the NV1? ^^).

With RT it is the same. The hardware has the feature, but it is too slow to put it to really good use. E.g. look at the Minecraft RT demo video: the light needs multiple seconds to reach a wall. That happens because they accumulate the result over time, because computing it fully each frame would be too much for the hardware.
My point is, don't buy RTX cards because you want to play future RT games with them. They are not ready for future games. Buy them because Nvidia has the fastest cards right now, if you have the money to spare.

I really doubt we will see that many RT games in the future that completely replace the old techniques with RT, simply because after the 7 nm process there won't be many steps left to make chips smaller and more efficient. So at least at high resolutions we will always need hybrid solutions. E.g. RT really sucks (performance-wise) if you can see really far in a game (which is why Battlefield does not have that many reflections in the distance).
Most original Xbox games (a GeForce 3/4 mix) used shaders, and that was the first generation of shaders. I have also played games like Splinter Cell 1 and Enclave, and the shaders looked amazing. If people don't like some demanding graphics effects like RTX (because they prefer performance, for example), they can always turn them off.
 

Ivellios

Member
Because FreeSync vs GSync.
Because DirectX shaders vs. "how nVidia destroyed OpenGL with greed that turned into stupidity".
Because NV's market milking makes Intel look like a saint, to the point where, in the fight of greed vs. common sense, the former wins.


For starters, RTX is just an implementation of a rather common ray tracing concept.
There is Microsoft's API to back that up.
When even owners of $1,300 cards switch it off because of the performance impact, it is already obsolete.
Current gen RTX cards are simply too slow.

As far as I know, G-Sync is slightly better than FreeSync but terribly expensive in comparison, so I agree with you there.

I know Nvidia as a company is greedy as hell and engages in a lot of shitty practices, but at the end of the day they managed to implement AMD features like RIS on their cards even better through patches (according to Hardware Unboxed).


Some users turning RTX off even on the 2080 Ti is not evidence that Nvidia RTX is obsolete or that AMD ray tracing will be considerably better.

Other users with even entry-level RTX cards are saying that the RTX effects are so good that they are worth the performance cost in games like Metro Exodus and Control.
 
Last edited:

Dontero

Banned
Couldn't devs mix ray-tracing with rasterization and cube maps and whatever other techniques currently in use to create much of the same effects with a lesser hit to performance? Does it really have to be one or the other?

Rasterization is all about emulating ray tracing. Rasterization is a simulation of 3D with a 2D image, just like Doom and Wolfenstein 3D are not actually 3D games despite looking like ones. It is all about smoke and mirrors.

Ray tracing's problem is that rasterization got so good that there are very few effects ray tracing can do better, and the effects that are left are usually very hard problems, which you should read as very costly.

The main benefit of ray tracing, though, is ease of development. Rasterization is fast because of tricks, and those tricks required decades of R&D and are still harder to implement than just switching on ray tracing, which would take care of all lighting.
If we could move from rasterization to ray tracing completely, then engine development would become much easier.

Rasterization is just too good these days for ray tracing to make a huge difference. The biggest thing ray tracing offered, i.e. proper material properties, was absorbed by rasterized engines last generation, which effectively leaves only reflections as the big feature, and those can be simulated very effectively by a combination of cubemaps and screen-space reflections.
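A minimal sketch of that hybrid idea, with hypothetical inputs rather than a real renderer's: take screen-space reflections where they have valid data, fall back to a prefiltered cubemap elsewhere, and spend actual rays only on the surfaces where the approximations visibly break down.

```cpp
#include <cstdio>

enum class ReflectionSource { ScreenSpace, Cubemap, RayTraced };

struct SSRSample {
    bool  hit;         // did the screen-space ray march land on on-screen geometry?
    float confidence;  // fades to zero near screen edges and disoccluded areas
};

ReflectionSource pickSource(const SSRSample& ssr, bool surfaceIsHeroMirror) {
    if (ssr.hit && ssr.confidence > 0.5f)
        return ReflectionSource::ScreenSpace;  // cheapest path, covers most pixels
    if (!surfaceIsHeroMirror)
        return ReflectionSource::Cubemap;      // static approximation is good enough
    return ReflectionSource::RayTraced;        // spend rays only where it pays off
}

int main() {
    const char* names[] = {"screen-space", "cubemap", "ray traced"};
    std::printf("rough floor, SSR miss -> %s\n", names[(int)pickSource({false, 0.0f}, false)]);
    std::printf("hero mirror, SSR miss -> %s\n", names[(int)pickSource({false, 0.0f}, true)]);
    std::printf("puddle, SSR hit       -> %s\n", names[(int)pickSource({true, 0.9f}, false)]);
    return 0;
}
```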
 

DeepEnigma

Gold Member
Rasterization is all about emulating ray tracing. Rasterization is a simulation of 3D with a 2D image, just like Doom and Wolfenstein 3D are not actually 3D games despite looking like ones. It is all about smoke and mirrors.

Ray tracing's problem is that rasterization got so good that there are very few effects ray tracing can do better, and the effects that are left are usually very hard problems, which you should read as very costly.

The main benefit of ray tracing, though, is ease of development. Rasterization is fast because of tricks, and those tricks required decades of R&D and are still harder to implement than just switching on ray tracing, which would take care of all lighting.
If we could move from rasterization to ray tracing completely, then engine development would become much easier.

Rasterization is just too good these days for ray tracing to make a huge difference. The biggest thing ray tracing offered, i.e. proper material properties, was absorbed by rasterized engines last generation, which effectively leaves only reflections as the big feature, and those can be simulated very effectively by a combination of cubemaps and screen-space reflections.

If we had never seen this game before RTX and were told this scene was "ray traced", your average gamer would believe it.

 

llien

Member
As far as I know, G-Sync is slightly better than FreeSync
It's vice versa, measurably:



I know Nvidia as a company is greedy as hell and engages in a lot of shitty practices, but at the end of the day they managed to implement AMD features like RIS on their cards even better through patches (according to Hardware Unboxed).
AMD literally said "hey, we've had the eDP feature that lets the GPU drive the refresh rate for years", so, hm, not sure what they implemented, let alone "even better". The main reason to introduce the G-Sync chip was to make it an exclusive feature. They even outright ruled out SELLING the rights to use it to other parties.

Other users with even entry-level RTX cards are saying that the RTX effects are so good that they are worth the performance cost in games like Metro Exodus and Control.
There are no statistics on that. All the anecdotes I've read so far were mostly "2070 owner, not using it", "2060 owner, lol, you gotta be kidding me", and apparently most users own those cards. The hammer came when 2080 and even 2080 Ti owners said they have it off most of the time.

If people are really using it, oh well, great then.
 

Chromata

Member
What? Control benchmarks show RTX high + high settings at 1440p 60fps for 2080 and Ti with DLSS. Seems pretty ready to me. You don't even need DLSS to play BFV on high everything at 1440p 60fps.

Also, Cyberpunk at native 4K 30fps with ray tracing.
 
Last edited:
OP accidentally proves he's biased

Control maxed with RT maxed on my PC

4K Native


4K DLSS



Oh the HORROR :rolleyes:
I don't think this destroys OP's argument as much as you think it does. All you had to do to get it running at 4K60 was... not run it at 4K at all. In fact, judging by the framerate, I'd say that's running internally at 1440p tops. It provides a performance increase for a reason. DLSS is not native. It does not look like native. In fact, it almost always looks worse than simply using old-fashioned upscaling from the internal resolution. As for the native 4K... yeah, good luck holding 30 FPS when you're sitting at 32 FPS in a scene in which absolutely nothing is happening.
 

bryo4321

Member
This thread, lol. No matter how good screen-space reflections look, as soon as you turn the camera, anything off-screen drops out of the reflection. With ray tracing, objects like tiny little fire extinguishers reflect everything in the room. It's in the details. You're talking smoke and mirrors to give the illusion of reality vs. a fairly accurate simulation of reality.

It's an extremely demanding feature if you understand how it works, and trying to run it on non-RTX cards shows how badly it would perform without dedicated hardware. That said, I played Metro with RTX, and Control and even BF5 run well enough to hold 60 fps at 1440p native if you have a 2070 or better. This is the first gen of ray tracing; technology takes iteration and advances in both hardware architecture AND software, so OF COURSE it will get better with time. Both AMD and Nvidia are competing to give us new ways to push graphics and performance with the likes of DLSS and AMD's CAS algorithm.

Another thing to point out is that when people compare Radeon's sharpening filter to DLSS, they're comparing apples to oranges. AMD's image sharpening (and whatever Nvidia is calling the version they just added) is simply a contrast-adaptive sharpening (CAS) algorithm, whereas DLSS uses machine learning and thus should be able to get better and more accurate as the model is trained on Nvidia's supercomputers. (I assume improvements are pushed through driver updates, but I don't know.)
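For what "simply a CAS algorithm" means in practice, here is a heavily simplified 1D sketch of the contrast-adaptive sharpening idea (not AMD's actual FidelityFX CAS shader): each pixel's sharpening weight is scaled by how much contrast headroom its neighbourhood still has, so already high-contrast edges are pushed less and ringing stays limited. The image values and strength here are illustrative only.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

int main() {
    const float img[8] = {0.2f, 0.2f, 0.2f, 0.8f, 0.8f, 0.3f, 0.3f, 0.3f};
    float out[8];
    const float sharpness = 0.5f;  // user-tunable strength, kept in [0, 0.5] for this toy

    out[0] = img[0];
    out[7] = img[7];
    for (int i = 1; i < 7; ++i) {
        float l = img[i - 1], c = img[i], r = img[i + 1];
        float mn = std::min({l, c, r});
        float mx = std::max({l, c, r});
        // Headroom toward pure black/white in this neighbourhood: large where the
        // signal is mid-range, small where it already spans most of the range.
        float headroom = std::min(mn, 1.0f - mx);
        float w = -sharpness * std::sqrt(std::max(headroom, 0.0f));  // negative lobe
        // 3-tap unsharp filter with normalised weights, clamped to the valid range.
        out[i] = std::clamp((w * l + c + w * r) / (2.0f * w + 1.0f), 0.0f, 1.0f);
    }

    for (int i = 0; i < 8; ++i) std::printf("%.2f ", out[i]);
    std::printf("\n");
    return 0;
}
```

Note how flat runs stay untouched while the 0.2/0.8 step gets pushed apart; that is the whole trick, and it is cheap enough to run as a single post-process pass.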

A lot of the complaints I have seen are just people not understanding how the technology works. As rendering gets better, the diminishing returns on graphics mean moving toward more resource-heavy but somewhat less dramatic visual improvements.
 
Last edited:

Knightime_X

Member
Didn't watch yet but are we talking 1080p 20fps or 4k 20fps?
One is far more worrisome than the other.
 
Last edited:
I take offense to that.

My threads are based on facts, not some random click-bait YouTuber's opinions...

Well, technically, yes. But it's clear you're enjoying triggering thelastword and the others a bit too much.

You get too involved with these things, though.

Oh, I just bought a 9900k. I thought you might like the news. xD
 
Last edited:

Chromata

Member
I don't think this destroys OP's argument as much as you think it does. All you had to do to get it running at 4K60 was... not run it at 4K at all. In fact, judging by the framerate, I'd say that's running internally at 1440p tops. It provides a performance increase for a reason. DLSS is not native. It does not look like native. In fact, it almost always looks worse than simply using old-fashioned upscaling from the internal resolution. As for the native 4K... yeah, good luck holding 30 FPS when you're sitting at 32 FPS in a scene in which absolutely nothing is happening.

DLSS + the new sharpening tool looks very close to native. Obviously not as good, but it's a small visual sacrifice for a big fps boost.

The tech works great, guys; it's just not really worth the big asking price, because you need at least a 2080 to reliably get it performing well in every game.
 

Tygeezy

Member
As far as I know, G-Sync is slightly better than FreeSync but terribly expensive in comparison, so I agree with you there.

I know Nvidia as a company is greedy as hell and engages in a lot of shitty practices, but at the end of the day they managed to implement AMD features like RIS on their cards even better through patches (according to Hardware Unboxed).


Some users turning RTX off even on the 2080 Ti is not evidence that Nvidia RTX is obsolete or that AMD ray tracing will be considerably better.

Other users with even entry-level RTX cards are saying that the RTX effects are so good that they are worth the performance cost in games like Metro Exodus and Control.
FreeSync is on par with G-Sync these days. FreeSync originally didn't support low framerate compensation, which it now does.
 
DLSS + the new sharpening tool looks very close to native. Obviously not as good, but it's a small visual sacrifice for a big fps boost.

The tech works great, guys; it's just not really worth the big asking price, because you need at least a 2080 to reliably get it performing well in every game.
You know what else looks very close to native? 1620p~1800p with post-sharpening applied, at the same kind of performance as DLSS. Both Hardware Unboxed and Gamers Nexus have done testing on this. DLSS is objectively useless.
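The rough pixel-count arithmetic behind that comparison, treating cost as roughly proportional to pixels shaded (which it only approximately is):

```cpp
#include <cstdio>

int main() {
    struct Res { const char* name; int w, h; };
    const Res res[] = {
        {"2160p (native 4K)", 3840, 2160},
        {"1800p",             3200, 1800},
        {"1620p",             2880, 1620},
        {"1440p",             2560, 1440},
    };
    const double fourK = 3840.0 * 2160.0;
    for (const Res& r : res) {
        double px = double(r.w) * r.h;   // pixels shaded per frame at this resolution
        std::printf("%-18s %10.0f px  (%5.1f%% of 4K)\n", r.name, px, 100.0 * px / fourK);
    }
    return 0;
}
```

1800p shades about 69% of the pixels of native 4K and 1620p about 56%, which is the same ballpark of savings DLSS gets from its reduced internal resolution; hence the "same kind of performance" claim.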
 

01011001

Banned
mimimimimimi RTX sucks because I can't play at 4k60 at max settings!

mimimimimimi you need an RTX 2080 to run games well....


what a bunch of bullshit xD even a 2060 can run Battlefield 5 with RTX effects and well-balanced settings at good framerates... and that's the budget card!
 

jonnyp

Member
This is why Cerny was talking about 3D audio really being the first viable application for this and didn't really talk about RT graphics in that Wired interview.
 

Chromata

Member
You know what else looks very close to native? 1620p~1800p with post-sharpening applied, at the same kind of performance as DLSS. Both Hardware Unboxed and Gamers Nexus have done testing on this. DLSS is objectively useless.
Yeah, I've heard about that too. Thing is, DLSS is improving with each implementation. I'm no expert by any means, but this has been documented and I've seen it first-hand. I'm remaining hopeful, but regardless, I like that both are good options.

It doesn't change the fact that RTX features are ready and fully playable when on.
 
Last edited:

bryo4321

Member
You know what else looks very close to native? 1620p~1800p with post-sharpening applied, at the same kind of performance as DLSS. Both Hardware Unboxed and Gamers Nexus have done testing on this. DLSS is objectively useless.
As stated, it's not useless; it works completely differently, and theoretically DLSS can continually improve through training, whereas what you see with CAS sharpening is basically what you will get.
 
As stated, it's not useless; it works completely differently, and theoretically DLSS can continually improve through training, whereas what you see with CAS sharpening is basically what you will get.
Okay... but where are these improvements? When they "improved" Metro Exodus, all they actually did was slap a sharpening shader on top, with rather obvious haloing from oversharpening for good measure. BFV is still a blurry mess with DLSS on. When are they going to show up? Six months? A year? Two years? Don't get me wrong, deep learning AI upscaling *can* achieve great results... but it takes seconds, if not minutes, depending on size and complexity, to process a single image. Unfortunately, DLSS doesn't have minutes... or seconds... it has milliseconds... and that just isn't long enough to achieve an appreciably better image than traditional bicubic upscaling. I wouldn't say that RTX is a marketing gimmick... but I absolutely would say that DLSS is.
 
Last edited:

bryo4321

Member
Okay... but where are these improvements? When they "improved" Metro Exodus, all they actually did was slap a sharpening shader on top, with rather obvious haloing from oversharpening for good measure. BFV is still a blurry mess with DLSS on. When are they going to show up? Six months? A year? Two years? Don't get me wrong, deep learning AI upscaling *can* achieve great results... but it takes seconds, if not minutes, depending on size and complexity, to process a single image. Unfortunately, DLSS doesn't have minutes... or seconds... it has milliseconds... and that just isn't long enough to achieve an appreciably better image than traditional bicubic upscaling. I wouldn't say that RTX is a marketing gimmick... but I absolutely would say that DLSS is.
It's totally possible that their algorithm just isn't good and won't achieve the results they hoped for, no matter how much they train it. Machine learning applications like this are still pretty new, so I don't doubt there will be further improvements after more iteration.

That said, I was actually quite pleased with how it looked in Control, but I have not used it in other games.
 
Last edited:

thelastword

Banned
It's totally possible that their algorithm just isn't good and won't achieve the results they hoped for, no matter how much they train it. Machine learning applications like this are still pretty new, so I don't doubt there will be further improvements after more iteration.

That said, I was actually quite pleased with how it looked in Control, but I have not used it in other games.
They're probably using it in conjunction with their new sharpening tool. When I get to a desk, I'll delve more into it, but I think that's what's happening here. Still, native resolves much more detail. There are a few comparisons online already.
 

bryo4321

Member
They're probably using it in conjunction with their new sharpening tool. When I get to a desk, I'll delve more into it, but I think that's what's happening here. Still, native resolves much more detail. There are a few comparisons online already.
Yeah, native definitely still looks better, but I thought the performance trade-off was worth it for those who don't mind, or who are determined to max out the effects at the cost of a little IQ. There were a couple of edge cases where some strange artifacts would show, but they weren't too distracting. It's cool tech and I look forward to seeing where it's taken in the future. I'm a firm believer that eventually AI upscaling could give some pretty amazing results that render more static methods obsolete. However, I am not claiming that day is today.
 

Croatoan

They/Them A-10 Warthog
Ray tracing is a generational leap in rendering. The problem is that current GPUs can't handle it, and we are still probably 2 to 3 GPU generations away from 4K 60fps performance with "RTX On".

I have a 2080 Ti and I play Control at 1080p 60fps with all RTX features on a 55" 4K TV, and I have loved it. Would I love 4K 60fps? Sure, but that isn't realistic right now.
 
Last edited:
I don't think this destroys OP's argument as much as you think it does. All you had to do to get it running at 4K60 was... not run it at 4K at all. In fact, judging by the framerate, I'd say that's running internally at 1440p tops. It provides a performance increase for a reason. DLSS is not native. It does not look like native. In fact, it almost always looks worse than simply using old-fashioned upscaling from the internal resolution. As for the native 4K... yeah, good luck holding 30 FPS when you're sitting at 32 FPS in a scene in which absolutely nothing is happening.
You missed the entire point of the post.

Forget resolution numbers for a minute... If you feel the difference in quality between those two shots is SO drastic as to call it unplayable... then I don't know what else to say to you. Not to mention that AMD fanboys, himself included, have JUST RECENTLY been going on about how AMD's RIS is better and looks closer to 4K than any DLSS implementation, and guess what? That's not 4K at all either...

1440p 60fps with RTX is possible in today's games. 4K 30 is also possible. That's RT maxed, btw... RT on Medium gives a huge performance boost. If he's got a problem with those figures, then he must have a problem with the 5700 XT... oh wait... nope... he loves that shit.

So the point is... it's not nearly as "horribly unplayable" as he, and apparently you, would like to say it is. Having played Control with RTX, I wouldn't go back to playing it without. And FYI... no... the FPS stays right around 30...

And get real... there's barely ANY difference in those shots. You wouldn't notice any difference unless you saw them side by side, and you legitimately can't see a difference on a 4K TV sitting at a distance. But let me guess... people like you can't see any difference between RT on and off... but you can see a HUGE world of difference in those two screens I posted... mmhmm... :rolleyes:
 
Last edited:

Pimpbaa

Member
what a bunch of bullshit xD even a 2060 can run Battlefield 5 with RTX effects and well-balanced settings at good framerates... and that's the budget card!

The 2060 isn't a budget card. It may be the lowest-priced RTX card, but it's far from budget-level video card prices (like the GTX 1650).
 

thelastword

Banned
This seems like the best place for this;


I was going to post this since yesterday; absolute MONSTER of a video... It's very similar to my take on RTX and how everything has shaken out and where things are heading... Kudos to "Moore's Law is Dead"...
 

Croatoan

They/Them A-10 Warthog
This seems like the best place for this;


Is this guy an AMD fanboy?

Not that I disagree, but he says that Nvidia cards will never get better at ray tracing. What's to say the 3080 Ti isn't incredible at it or something?

Maybe I just misunderstand this idea of future-proofing. You cannot future-proof a PC. You either buy a ~$1000 GPU every 2-3 years or you get left behind.
 
Last edited:

Ascend

Member
Is this guy an AMD fanboy?
Not really. He has contacts in the industry and is basically a leaker. His leaks are more reliable than the likes of AdoredTV most of the time.

Not that I disagree, but he says that Nvidia cards will never get better at ray tracing. What's to say the 3080 Ti isn't incredible at it or something?
He means that Turing and its RT cores will not get better at ray tracing. Obviously, if they change their architecture again, all rules are out the window. I'm not sure if he's right, though, although most things do point in that direction. nVidia is pushing these cards pretty hard, and despite that, they're not getting the traction they hoped/thought they would get.

Maybe I just misunderstand this idea of future-proofing. You cannot future-proof a PC. You either buy a ~$1000 GPU every 2-3 years or you get left behind.
Depends on what you want... If you use your PC for office work and browsing the internet, a 10-year-old PC will still do fine nowadays. Gaming is a more complicated story. But with GPU prices going through the roof lately, some sort of future-proofing is really necessary and will inevitably become more relevant. I think nVidia is trying to bank on that by marketing their cards as the best future-proofing option, since only they have ray tracing. That game will definitely be up when the consoles have it too.
Technically, one can still easily use a GTX 980 Ti for gaming nowadays, provided you didn't raise your gaming standards that much. In other words, if you still do 1080p 60 fps gaming, that card is fine. If you want to do 4K or 120+ FPS or whatever, older cards will not cut it. But video cards can generally have a long lifetime if you're conservative in your upgrading and don't fall into the hype of needing the newest, shiniest thing right now at the highest possible settings.

I'm still using my R9 Fury and am perfectly happy with it. I only have a 75 Hz ultrawide monitor. It does have FreeSync to help with lower framerates... As technology advances, the need to keep upgrading every year or two diminishes, as features dampen the impact of slower hardware. To me, variable refresh rate monitors, anti-lag technology and sharpening techniques are the best things to come out in the last few years, simply because you can easily keep using your older hardware and still keep up. People are too gullible regarding marketing and hardware hype. Emotion is really good at draining your wallet, and some companies are really good at tugging those strings, particularly nVidia in this case.
 
Last edited:

thelastword

Banned
PC gamers sure are scared of new tech.
Ray tracing has been around for decades; any GPU can do ray tracing... Nvidia did not invent ray tracing. They're trying to commercialize it, but failing because the delivery and adoption are not there... Ray tracing, tbh, is inevitable, but only when the hardware is capable, only when it is well researched... Ideally, we should not have to sacrifice our res and framerate at such alarming rates; it should be the salad on top of our high res and high framerates, not the desecrator of our engines...

FYI, way before RTX debuted we had cheap cell phone hardware that did 6 gigarays per second. What Nvidia is offering is not anywhere near capable of delivering what they say, and anybody future-proofing by buying RTX is going to be left in the dust because, as I always said, Vega 56/64 and Radeon VII will be very good at ray tracing because of their CUs, and so will RDNA 1... When RDNA 2 drops, all devs will use that as the ray tracing standard. RDNA 2 GPUs will get an even further boost because of their dedicated ray tracing hardware, but the older AMD cards will perform better than those RTX cards with their so-called "RT hardware"... because of their CUs and the Radeon Rays software.

Remember, Minecraft RT runs very well on AMD cards... I have no doubt that the Quake 2 demo would run just as well (or badly) on AMD cards, as it isn't even using the tensor cores on the RTX cards...





Hardly maxing out the 5700 XT at 1500+ MHz; now imagine if these were optimized for AMD CUs or were developed using AMD's optimized software or its Radeon Rays suite...
 
I imagine the acceleration structures and polygon intersection tests would be vastly cheaper for a blocktastic game like Minecraft than for something like Control or Metro.
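A minimal sketch of why a voxel world is a friendly case: the classic slab test for ray vs. axis-aligned box costs only a few multiplies and compares per node, whereas arbitrary meshes like Control's need ray/triangle tests against much deeper acceleration structures built over millions of triangles. The scene and values below are illustrative.

```cpp
#include <algorithm>
#include <cstdio>

struct Vec3 { float x, y, z; };

// True if the ray (origin o, direction d) hits the box [bmin, bmax].
// Axis-parallel rays work out via IEEE infinities from the 1/0 divisions.
bool rayHitsAABB(const Vec3& o, const Vec3& d, const Vec3& bmin, const Vec3& bmax) {
    const float orig[3] = {o.x, o.y, o.z}, dir[3] = {d.x, d.y, d.z};
    const float lo[3] = {bmin.x, bmin.y, bmin.z}, hi[3] = {bmax.x, bmax.y, bmax.z};
    float tmin = 0.0f, tmax = 1e30f;
    for (int axis = 0; axis < 3; ++axis) {
        float inv = 1.0f / dir[axis];
        float t0 = (lo[axis] - orig[axis]) * inv;
        float t1 = (hi[axis] - orig[axis]) * inv;
        if (t0 > t1) std::swap(t0, t1);
        tmin = std::max(tmin, t0);      // latest entry across the three slabs
        tmax = std::min(tmax, t1);      // earliest exit across the three slabs
        if (tmax < tmin) return false;  // the slabs never overlap: the ray misses
    }
    return true;
}

int main() {
    const Vec3 origin{0.0f, 0.0f, -5.0f}, dir{0.0f, 0.0f, 1.0f};
    std::printf("unit cube hit: %d\n", rayHitsAABB(origin, dir, {-1, -1, -1}, {1, 1, 1}));
    return 0;
}
```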

The key is using ray tracing for the best bang for buck within the limits of the hardware. That will probably take a couple of generations of games to work out.

Nvidia are after the most impressive footage possible to sell their cards with. Faster but more sparingly used implementations will be the way forward. And they'll be worth it. But it'll be an evolutionary approach rather than an overnight revolution.
 

Ivellios

Member
Is this guy an AMD fanboy?

Not that I disagree, but he says that Nvidia cards will never get better at ray tracing. What's to say the 3080 Ti isn't incredible at it or something?

Maybe I just misunderstand this idea of future-proofing. You cannot future-proof a PC. You either buy a ~$1000 GPU every 2-3 years or you get left behind.

I looked through his recent playlist and his videos are completely biased towards AMD being good and destroying Nvidia in the future, while calling everything from Nvidia trash and all that.

So personally I don't trust this guy's opinion on this.
 

Shai-Tan

Banned
I played half of Control with RTX medium and DLSS (2080, not Ti), then switched both off, played the rest of the game at plain 1440p, and had a much better experience. Their RTX implementation introduced some streaming glitches and had some temporal issues, apparent at cutscene cuts and during sudden camera movements. In any case, the improvements, while obvious in some scenes, were not worth the massive performance drop.

Every time I test RTX in any game I feel the same way. It's nice, perhaps, but it is not worth the cost. I'd rather play the rasterized version at a higher resolution and framerate.

At current performance levels I wish there was a single-button toggle for either RTX or DLSS, because I like the look of Metro, Tomb Raider, etc. with RTX ultra, but it does get in the way to the point where I'll pause mid-level and go into the menu to make the performance bearable.

edit: and part of the problem is that there are huge performance drops depending on what you're looking at when using RT, way more than is typical of games
 
Last edited:

Ascend

Member
I looked through his recent playlist and his videos are completely biased towards AMD being good and destroying Nvidia in the future, while calling everything from Nvidia trash and all that.

So personally I don't trust this guy's opinion on this.
Yes, he is... just look at his videos.

To be fair, don't look at his videos... it's a waste.
I guess the verdict is in... Anyone that thinks that AMD is in a good position right now must be a fanboy. Because there is no other way that anyone can support what AMD is doing, right? Only nVidia and Intel are ever viable! Anyone that has any interest in AMD or thinks AMD has any sort of merit MUST be a fanboy...

And yeah... This video of the same guy doesn't exist... He MUST be a fanboy;

 

Sosokrates

Report me if I continue to console war
It's like any new graphics tech or demanding game; still to this day, not many computers can max out Crysis at 4K 60fps.
 

Ivellios

Member
I guess the verdict is in... Anyone that thinks that AMD is in a good position right now must be a fanboy. Because there is no other way that anyone can support what AMD is doing, right? Only nVidia and Intel are ever viable! Anyone that has any interest in AMD or thinks AMD has any sort of merit MUST be a fanboy...

And yeah... This video of the same guy doesn't exist... He MUST be a fanboy;



Not even close; I just don't trust obviously biased authors who only want to push an agenda for a specific company. And yes, he is clearly a fanboy in my opinion.

Hardware Unboxed, for example, pretty much recommend AMD CPUs and GPUs for the majority of buyers and say that RTX is not a worthwhile investment for now. But I trust them because they also point out when Nvidia or Intel do something good, so they are reliable for unbiased information.
 

Jaywill314

Neo Member
Works fine with my 2070.

They do.
I.e. Unreal 4.23 added support for using cubemaps on reflective objects that are visible in reflections, so reflected surfaces appear to have proper materials. (Without an additional RT bounce or a cubemap, those metals would be black.)


Turing has some awesome new rasterization features; sadly, many of them need proper support from developers. (IMHO, mesh shaders are the biggest new feature.)

Tessellation is a great idea; sadly, the hardware implementation in GPUs and DX is crap.
Currently the decent way to do it is in compute, and the best way is with mesh shaders.

The good old times of rasterization, when the 'GPU-accelerated' version was slower.

Yup, things will become faster in the future; there is a lot to improve and many ways of doing it still to be found.
Getting great performance from RT is not easy, even though it is a better way to sample the scene compared to rasterization. (As decades of research on CPU-based RT clearly show.)
What exactly makes him a fanboy? Is it his criticisms of Nvidia?
 

Leonidas

Member
Hardware Unboxed, for example, pretty much recommend AMD CPUs and GPUs for the majority of buyers and say that RTX is not a worthwhile investment for now. But I trust them because they also point out when Nvidia or Intel do something good, so they are reliable for unbiased information.

I won't accuse Hardware Unboxed of being fanboys, but they are biased too.

They always recommend based only on performance per dollar, which is an area AMD has to compete on because their products aren't feature-complete, aren't as efficient in gaming, and aren't as fast in gaming as Nvidia and Intel.
 

Jaywill314

Neo Member
I won't accuse Hardware Unboxed of being fanboys, but they are biased too.

They always recommend based only on performance per dollar, which is an area AMD has to compete on because their products aren't feature-complete, aren't as efficient in gaming, and aren't as fast in gaming as Nvidia and Intel.
They simply tell you what's the best bang for your buck. Any smart consumer doesn't just blow money to blow it; they weigh the value of the products at hand before purchasing. If RTX is a huge thing for you, then buy an RTX card; if not, then you have more options to choose from. RTX in general is a proprietary implementation of Microsoft's DXR; heading into the future, AMD's and Intel's solutions may be more dev-friendly, especially since they both seem to be going the open-source route. There's a reason RTX cards aren't selling nearly as well as the previous 10-series cards.
 

Siri

Banned
[comparison screenshot]


$1,200 to get the image on the right running at an acceptable frame rate at 1080p. It's ludicrous.

The RTX 2080 Ti is the first consumer-level 4K card on the market. I'm at 3440x1440 and my 2080 Ti just blows away my old GTX 1080.

You make it sound like the only reason for buying a 2080 Ti is ray tracing. Ray tracing is a cool new feature - and one day it will be huge - but most folks who buy this card are buying it to improve their frame rates, not to see ray tracing in action.
 