
Tech Press Accidentally Prove RTX Ray Tracing ISN'T Ready!

Salvatron

Member
WTF. So he's claiming RTX isn't as big a deal as it is... because of one example of its use? "The game world is a little bit shinier. That's really about it. And things are reflecting all over the place. And as I've argued, I don't think they're reflecting realistically either." Does this guy not understand the difference between path tracing and ray tracing and how much it affects how light works within a game? Feels like a very weak argument to write it off when the capability and its adoption are still relatively new.
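For what it's worth, the distinction being glossed over looks roughly like this: classic (Whitted-style) ray tracing shoots one deterministic mirror ray per bounce, while path tracing scatters rays randomly and averages many samples per pixel, which is what actually captures diffuse light transport. A minimal sketch, with a stubbed-out `intersect` standing in for a real scene query (none of this is from any shipping engine):

```cpp
#include <random>

// Minimal vector math, just enough for the sketch.
struct Vec3 {
    float x = 0, y = 0, z = 0;
    Vec3 operator+(Vec3 b) const { return {x + b.x, y + b.y, z + b.z}; }
    Vec3 operator*(Vec3 b) const { return {x * b.x, y * b.y, z * b.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
};
static float dot3(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 reflect(Vec3 d, Vec3 n) { return d + n * (-2 * dot3(d, n)); }

struct Ray { Vec3 origin, dir; };
struct Hit { bool ok = false; Vec3 pos, normal, albedo, emission; };

// Stand-in for a real BVH/scene query -- always misses in this sketch.
static Hit intersect(const Ray&) { return {}; }

// Crude random direction in the hemisphere around a normal.
static Vec3 scatter(Vec3 n, std::mt19937& rng) {
    std::uniform_real_distribution<float> u(-1.f, 1.f);
    for (;;) {
        Vec3 d{u(rng), u(rng), u(rng)};
        float len2 = dot3(d, d);
        if (len2 > 0.01f && len2 <= 1.f)
            return dot3(d, n) >= 0 ? d : d * -1.f;
    }
}

// Ray tracing (Whitted-style): one deterministic mirror bounce per hit.
// Sharp reflections, but blind to diffuse light bouncing between surfaces.
Vec3 traceWhitted(Ray r, int depth) {
    Hit h = intersect(r);
    if (!h.ok || depth == 0) return {};
    Ray mirror{h.pos, reflect(r.dir, h.normal)};
    return h.emission + h.albedo * traceWhitted(mirror, depth - 1);
}

// Path tracing: random scatter at every hit, averaged over many samples per
// pixel. Captures full global illumination, at a far higher ray budget.
Vec3 tracePath(Ray r, int depth, std::mt19937& rng) {
    Hit h = intersect(r);
    if (!h.ok || depth == 0) return {};
    Ray bounce{h.pos, scatter(h.normal, rng)};
    return h.emission + h.albedo * tracePath(bounce, depth - 1, rng);
}
```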
 

01011001

Banned
People in here obviously weren't around back when 3D was new... if you were able to run Quake 2 at 30fps when it came out, you considered yourself lucky.

New tech means a performance hit compared to old tech at first... raytracing is great, and it's already had a better start than 3D accelerator cards had in the 90s.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
No, what I am saying is that some will no longer put in the effort we once had for those without RTX cards. A lot of the basic cube-map reflections are seemingly gone, and there's less effort in AO/lighting that would have been there otherwise.

We still have games with no RTX support that shit all over non-RTX Control when it comes to those factors. It feels more toned down than necessary.

This may be true, I agree. But I'm assuming that creating all those cube maps probably costs a lot of money and time for each scene.
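To put rough numbers on that authoring cost (purely illustrative figures, not from any shipped game): every reflection probe an artist places is a six-face cube render that has to be baked, stored, and re-baked whenever the scene changes.

```cpp
#include <cstdio>

int main() {
    // Illustrative numbers, not from any shipped game.
    const long long probes        = 80;   // hand-placed probes per level
    const long long facesPerProbe = 6;    // a cube map = six scene renders
    const long long faceRes       = 256;  // texels per face edge
    const long long bytesPerTexel = 8;    // e.g. half-float RGBA

    long long bakes  = probes * facesPerProbe;
    long long memory = bakes * faceRes * faceRes * bytesPerTexel;
    printf("%lld offline scene renders, ~%.0f MB of cube map data\n",
           bakes, memory / (1024.0 * 1024.0));
    // ...and every probe must be re-baked whenever lighting or geometry
    // changes, which is exactly the authoring cost RT reflections remove.
}
```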
 

Vol5

Member
I wonder

How on earth did DICE accomplish this before RTX... I guess we never saw reflections through glass before RTX, or on any reflective surface for that matter, because in Control it's just a dark void behind glass... How about proper shadows from an extinguisher handle... I guess all these things are just not possible without RTX...

I wonder how long it took them to create that scene, instead of just creating the surfaces and models and adding one light source.
 

Thaedolus

Gold Member
"DX12 is shit and performance is worse than with DX11!",
"Tesselation sucks, no one needs this... rounder fingertips with a 40% performance hit on my Kepler card!"
"Ambient Occlusion costs too much performance on my old card! Disable that shit!"


Yeah, it's always the same.

It's almost like technology evolves and generally begins in the enthusiast/high-end market before reaching mass-market pricing years later... and it's almost like the tech press are tech enthusiasts covering the latest tech...

WTF point is this guy trying to make?
 
I've been buying GPUs since the Voodoo days; I know how these feature wars work out.

Once Nvidia gets bored of beating the consumer over the head with it, it'll end up in a handful of games, like tessellation, and only for very specific instances. Those shitty screen-space reflections will disappear, for example. They always tank performance and they always look like shit.
 

thelastword

Banned
"DX12 is shit and performance is worse than with DX11!",
"Tesselation sucks, no one needs this... rounder fingertips with a 40% performance hit on my Kepler card!"
"Ambient Occlusion costs too much performance on my old card! Disable that shit!"


Yeah, it's always the same.
A new API will take a little time to gain traction, but even in the very early days of DX12 the difference between a DX11 version and a DX12 version was never 4K plummeting down to 1080p, and the framerate differences were not so egregious... DX11 vs DX12 at its worst was the same resolution with maybe a 10-15 fps difference in the worst-case scenario...

Let's be clear here: for everything there is precedent... Anything proprietary from Nvidia carries a huge performance hit, yet the results are not the most aesthetically pleasing or accurate... HairWorks is one of those features... Look at Nvidia tessellation: it's a significant hit even on NV cards, and even more so on the competition... HBAO+ takes a hit too; is it the most accurate? Is the hit worth it over competing effects? How many games did smokeworks ever appear in, much less at reasonable framerates? Or some of NV's overdone physics technology, with trash and paper gaining new life on floors, streets, etc.... How practical or believable are those in a real-time setting, or at playable framerates?

It's clear: anything proprietary no longer gains traction in PC circles; people are tired of the monopolistic hold on technological effects in the GPU space... If you want something to move forward, everybody must have it, or at least be privy to it (from the high end to the low end)... Stop dangling what's not ready yet on current cards at a high price; people will eventually get the trick, because it's been used many times before... All of Nvidia's tech is pricey to the customer and tanks performance a millionfold; it's why all their ideas and features are now taking a backseat... Their ideas are not the future; their idea is not to advance technology but to sell more expensive cards on the promise of something premium...

Raytracing will be ready and utilized properly, as I said, only when it's viable on all classes of GPUs; till then we are just paying out the ass for tech demos today on $1200 GPUs, when low-end cards will eventually muster much more impressive results at a fraction of the price, without the demolition of your framerates and res...

I wonder how long it took them to create that scene, instead of just creating the surfaces and models and adding one light source.
Well, don't be fooled... Without the effects, Control is a very bland-looking world and environment... Besides, so many devs who promised RTX day 1 could not produce it till much later after launch... So it's not as "it just works" as you think it is... It's very much a hybrid method with rasterization still in tow... I'll say this: many of the rasterization techniques I speak of do not need to be newly developed; in many cases they're already right there in most engines, as their prior games would indicate... You played BF4 at launch, the campaign? I did. I saw reflections on walls, there were light sources everywhere, shadows were great. Same for Hardline... How about reflections in Quantum Break prior to Control? Or shadows? All of these things were already in the engine in many games; it was only a matter of improving the effects even more... Yet what's the point of improving rasterization in newer games, or implementing stellar rasterization work at all, if you're not going to sell RTX whilst doing so, minimizing its impact or making RTX look even sillier at the price and performance hit it incurs?
 
It's OK for people to hate on Nvidia and shout the same platitudes about gimmicks that they always have. Nvidia is used to that, ever since the days of the original GeForce 256 and hardware T&L. But Nvidia didn't get where they are today by standing around doing nothing.

One hopes that AMD has an actual plan for hardware RT of their own besides "let's hope our hardcore fans can shout it down", because RT will not go away. It's going to be the standard for how games are rendered, replacing traditional rasterization, going forward. The same goes for Intel and their still largely secret dGPU project.
 
"DX12 is shit and performance is worse than with DX11!",
"Tesselation sucks, no one needs this... rounder fingertips with a 40% performance hit on my Kepler card!"
"Ambient Occlusion costs too much performance on my old card! Disable that shit!"


Yeah, it's always the same.

Early days, ppl. Of course it's not ready yet; these are baby days and baby steps. It will be the main factor in improving graphics next gen, imo. Rockstar did an incredible job with the lighting in RDR2 without it. Imagine what they'll do with it! Give it 5 years and many different efficiency improvements, and we're going to start to see some really incredible stuff.

I often wonder if this gen will feel dated in comparison to next gen, like the PS3 does now. It will certainly be interesting to see.
 
"DX12 is shit and performance is worse than with DX11!",
"Tesselation sucks, no one needs this... rounder fingertips with a 40% performance hit on my Kepler card!"
"Ambient Occlusion costs too much performance on my old card! Disable that shit!"


Yeah, it's always the same.
Even before then, it was "Hardware T&L doesn't do anything and is useless" and "What do you need programmable shaders for?" and stuff like that. Nvidia was always out there introducing new tech, and the same usual people were trying to shout it down because it wasn't their favorite company doing it.

This isn't even strictly an Nvidia thing either; people used to shit on tessellation when ATI first introduced it, and let's face it, early efforts at tessellation looked goofy as hell and ran badly. Kind of where RT is at today, now that I think about it. A polarity shift happened when a different group of people started hating it once Nvidia got better at it; remember when people falsely blamed tessellation for ATI's performance problems in Crysis 2? It's all just an excuse for human tribalism, and it's hilarious in the context of trying to bring real innovation to real-time 3D graphics rendering.
 

Real

Member
Even before then, it was "Hardware T&L doesn't do anything and is useless" and "What do you need programmable shaders for?" and stuff like that. Nvidia was always out there introducing new tech, and the same usual people were trying to shout it down because it wasn't their favorite company doing it.

This isn't even strictly an Nvidia thing either; people used to shit on tessellation when ATI first introduced it, and let's face it, early efforts at tessellation looked goofy as hell and ran badly. Kind of where RT is at today, now that I think about it. A polarity shift happened when a different group of people started hating it once Nvidia got better at it; remember when people falsely blamed tessellation for ATI's performance problems in Crysis 2? It's all just an excuse for human tribalism, and it's hilarious in the context of trying to bring real innovation to real-time 3D graphics rendering.

I'll agree with everything you said except for the Crysis 2 comment.


There were many questionable things that were overly tessellated.
 
Tessellation was not a good idea. Remember, the plan is to push the pixels into the correct place; more triangles were not the solution to that.
It's like saying foveated rendering should render 8K for the iris; it kills the benefits.
 
You can turn ray tracing off, right?

I guess I'm not understanding the issue if I can just turn it off when I don't like a particular game's performance with it.
 

thelastword

Banned
You can turn ray tracing off, right?

I guess I'm not understanding the issue if I can just turn it off when I don't like a particular game's performance with it.
You can of course, but the justification for expensive Nvidia cards is indeed RTX... Why buy a $1200 card if you're not using its most touted feature at 4K 60fps?


Just like in the video, using a 2080 Ti on a 1080p monitor... nobody advertises RTX cards like this, especially the 2080 Ti. It's simply RTX this, RTX that, you can't live without RTX...


So nobody says:


Announcing the $1200 RTX 2080 Ti, the most powerful GPU ever imagined; it will deliver a revolutionary 1080p 60fps with RTX on.

or

Announcing the $400 RTX 2060S, a powerful mid-tier card, to deliver you a whopping 60fps at 720p with RTX on...


I think that would have added a different tone to the marketing, and yet it's the truth... it's what you get with the feature on...
 
OP accidentally proves he's biased
You can of course, but the justification for expensive Nvidia cards is indeed RTX... Why buy a $1200 card if you're not using its most touted feature at 4K 60fps?


Just like in the video, using a 2080 Ti on a 1080p monitor... nobody advertises RTX cards like this, especially the 2080 Ti. It's simply RTX this, RTX that, you can't live without RTX...


So nobody says:

Announcing the $1200 RTX 2080 Ti, the most powerful GPU ever imagined; it will deliver a revolutionary 1080p 60fps with RTX on.

or

Announcing the $400 RTX 2060S, a powerful mid-tier card, to deliver you a whopping 60fps at 720p with RTX on...


I think that would have added a different tone to the marketing, and yet it's the truth... it's what you get with the feature on...
Control maxed with RT maxed on my PC

4K Native
48663286047_578f245ac8_o.png


4K DLSS
48663138241_a9011fe429_o.png



Oh the HORROR :rolleyes:
 

Ivellios

Member
I really enjoy The Good Old Gamer's videos, but lately he seems to be on a crusade against Nvidia.

Control is just one game, and from what I read it's poorly optimized on both PC and consoles. If you have an RTX card you can choose not to use RTX, or play with DLSS on, or at 30 FPS with DLSS off.

The real question is whether it's worth paying a premium price for this feature instead of getting a cheaper Navi card.

Fail at what, dude?

AMD commands only about one third of the discrete GPU market, but 100% of the beefy console market.
This means that ultimately it will be AMD who controls the pace of RT adoption, which flavor of it becomes the most popular, and what the baseline "gigarays" figure will be.
And as we know, AMD will push RT in a major way next year.

Calling out BS (current RTX cards, even the very expensive ones, struggle even with current-gen games; what will happen later on???) is just that.


When even the 2080 Ti, a 750mm² monster card sold for $1.3k, has it off most of the time, when does it "work", please?

Why do you and some others believe AMD will somehow make RT far better than what Nvidia is doing at the moment? I don't see them having some miraculous technology that makes Nvidia's RTX obsolete.

I've played on consoles forever; I think ray-traced games on next-gen consoles will either be a very basic implementation or will force games to run at 30 FPS.
 

thelastword

Banned
Of course the media is pushing it. How else are those game sites going to get their free gpu to test?
Tbh, Tom's Hardware is not even embarrassing themselves anymore with that first article on raytracing, the "would you really go on with your life without RTX" fellatio piece...

But there are one or two publications still cleaning the RTX knob to a spit shine, as if their lives depended on it... JayzTwoCents is an Nvidia guy and even he says RTX is not ready for primetime in its current iteration, nor does he go head over heels recommending it... Then the people who actually research tech in depth, like GamersNexus, are clear on RTX, and of course the benchmark king, whose channel also gave us the lowdown on DLSS, is pretty clear on RTX and his recommendations in the here and now...

Of course, having said this, no one is crucifying anyone who buys RTX cards, you do you and your wallet, but that's notwithstanding that these cards are bad value with the touted feature, which comes with major cuts to resolution and frames... Actually, the res and frames gamers have been used to for years... A GTX 1070 user has been playing his games for years at 1440p... Then, in 2019, you ask him to upgrade to a card with dedicated hardware for RTX, so he expects much more performance over his 1070 and that the RTX hardware will do its thing whilst keeping his go-to res and frames intact... So he buys a 2060S or 2070, only to be gaming at 720p or 900p with RTX maxed on... That's a huge turn of events, not for the better and not progressive either... RTX will be ready when my xx70 card can do 1440p 60fps with RTX maxed and even play some RTX games at 4K, just as you do now with your xx70 cards in rasterized games... It just shows the dedicated RTX hardware is not what it's touted to be; it's not potent enough for the task or the asking price...
 

Vol5

Member
Well, don't be fooled... Without the effects, Control is a very bland-looking world and environment... Besides, so many devs who promised RTX day 1 could not produce it till much later after launch... So it's not as "it just works" as you think it is... It's very much a hybrid method with rasterization still in tow... I'll say this: many of the rasterization techniques I speak of do not need to be newly developed; in many cases they're already right there in most engines, as their prior games would indicate... You played BF4 at launch, the campaign? I did. I saw reflections on walls, there were light sources everywhere, shadows were great. Same for Hardline... How about reflections in Quantum Break prior to Control? Or shadows? All of these things were already in the engine in many games; it was only a matter of improving the effects even more... Yet what's the point of improving rasterization in newer games, or implementing stellar rasterization work at all, if you're not going to sell RTX whilst doing so, minimizing its impact or making RTX look even sillier at the price and performance hit it incurs?

You're talking like there's no downside to current rasterization. I turn on RT in Control and I don't get any screen-space reflection rubbish, for starters (don't know what it's called when the reflection just fucks off when you pan the camera). Peter-panning... a huge bugbear of mine. This is solved with RT. Colours are reflected correctly onto surfaces instead of being mapped manually. There are plenty of downsides to current techniques which are solved or overcome with RT. How you can deny calculating light bounces realistically in a scene, versus faking it, is beyond me.
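The artifact being described (reflections vanishing as you pan) has a mundane cause: screen-space reflections can only reuse pixels already on screen. A toy march over a fake depth buffer, just to show where the technique gives up (all buffers are stand-ins, not any engine's code):

```cpp
#include <cstdio>

// Toy screen-space reflection march in UV space. The depth/color "buffers"
// are fakes; the interesting part is where the technique gives up.
struct Vec2 { float x, y; };

static float depthAt(Vec2) { return 1.0f; }     // stand-in depth buffer
static int   colorAt(Vec2) { return 0x808080; } // stand-in color buffer

// Returns false when the reflected ray leaves the screen -- exactly the
// case where the reflection vanishes as you pan the camera.
bool marchSSR(Vec2 uv, Vec2 stepUV, float rayDepth, float stepDepth,
              int* outColor) {
    for (int i = 0; i < 64; ++i) {
        uv.x += stepUV.x; uv.y += stepUV.y;
        rayDepth += stepDepth;
        if (uv.x < 0 || uv.x > 1 || uv.y < 0 || uv.y > 1)
            return false;               // off-screen: no data to reflect
        if (rayDepth >= depthAt(uv)) {  // ray passed behind a visible surface
            *outColor = colorAt(uv);
            return true;
        }
    }
    return false;                       // marched out of budget
}

int main() {
    int c = 0;
    // A ray heading toward the screen edge misses almost immediately.
    bool hit = marchSSR({0.95f, 0.5f}, {0.02f, 0.0f}, 0.0f, 0.001f, &c);
    printf("SSR %s\n", hit ? "hit" : "missed -- needs a fallback (or RT)");
}
```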
 

StreetsofBeige

Gold Member
Tbh, Tom's Hardware is not even embarrassing themselves anymore with that first article on raytracing, the "would you really go on with your life without RTX" fellatio piece...

But there are one or two publications still cleaning the RTX knob to a spit shine, as if their lives depended on it... JayzTwoCents is an Nvidia guy and even he says RTX is not ready for primetime in its current iteration, nor does he go head over heels recommending it... Then the people who actually research tech in depth, like GamersNexus, are clear on RTX, and of course the benchmark king, whose channel also gave us the lowdown on DLSS, is pretty clear on RTX and his recommendations in the here and now...

Of course, having said this, no one is crucifying anyone who buys RTX cards, you do you and your wallet, but that's notwithstanding that these cards are bad value with the touted feature, which comes with major cuts to resolution and frames... Actually, the res and frames gamers have been used to for years... A GTX 1070 user has been playing his games for years at 1440p... Then, in 2019, you ask him to upgrade to a card with dedicated hardware for RTX, so he expects much more performance over his 1070 and that the RTX hardware will do its thing whilst keeping his go-to res and frames intact... So he buys a 2060S or 2070, only to be gaming at 720p or 900p with RTX maxed on... That's a huge turn of events, not for the better and not progressive either... RTX will be ready when my xx70 card can do 1440p 60fps with RTX maxed and even play some RTX games at 4K, just as you do now with your xx70 cards in rasterized games... It just shows the dedicated RTX hardware is not what it's touted to be; it's not potent enough for the task or the asking price...
Yup. Like with all gadgets, wait a few years for things to improve and the kinks to get ironed out. Let the whales dive in first.

It's like HD TVs. Whoever out there spent $5,000 on a 32" 720p LCD with 24ms lag, ghosting, and only 1 HDMI port, hey, that's great. I'll wait a few years.

It's not like console gaming, where between a year-1 console and a year-3-4 console you save maybe $150 and it's basically the same console, except maybe a free pack-in game and a slimmed model. With CPUs and GPUs you can literally save $500-1,000 and the performance gets much better later.
 

Woo-Fu

Banned
The hardware industry always pushes stuff like this because it is the only segment with good margins, and the only way to sell product into the "good enough already" segment.

Those of us who are content with 1080p don't need high-end cards to run any game smoothly nowadays. Lower power consumption, lower noise, and lower heat are things that might eventually get somebody to upgrade, but they aren't nearly as sexy from a marketing standpoint as something like RTX.
 
What I don't get is why it tanks frame rates so hard when it's using different hardware - a dedicated chip that without RTX just sits there doing nothing.
Raytracing is very expensive in terms of computation; we need much better hardware for it to be beneficial at gaming-level performance.

It's probably nice to have access to the tech for development, 3D rendering, and bragging rights for gamers... It's not as if new features were ever ready when first introduced: bump mapping, anti-aliasing, pixel shaders, etc. all incurred a serious performance penalty at first.
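Some back-of-envelope numbers (illustrative, not measured) show why even a dedicated chip struggles:

```cpp
#include <cstdio>

int main() {
    // Back-of-envelope ray budget; the numbers are illustrative.
    const long long width = 1920, height = 1080, fps = 60;
    const long long raysPerPixel = 3;  // e.g. reflection + shadow + GI ray

    long long raysPerSec = width * height * fps * raysPerPixel;
    printf("1080p60 at 3 rays/pixel needs %.2f billion rays/s\n",
           raysPerSec / 1e9);  // ~0.37 -- before any extra bounces

    // Turing's headline figure was ~10 "gigarays"/s under ideal conditions;
    // incoherent rays, shading, and denoising eat most of that in practice.
}
```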
 

thelastword

Banned
You're talking like there's no downside to current rasterization. I turn on RT in Control and I don't get any screen-space reflection rubbish, for starters (don't know what it's called when the reflection just fucks off when you pan the camera). Peter-panning... a huge bugbear of mine. This is solved with RT. Colours are reflected correctly onto surfaces instead of being mapped manually. There are plenty of downsides to current techniques which are solved or overcome with RT. How you can deny calculating light bounces realistically in a scene, versus faking it, is beyond me.
I have no problem with raytracing or advanced technology, just do it when it's viable... Or rather, do enough research into it that the best solutions are engineered for the end user...

Current RTX vs rasterization is not some generational leap; it's pretty much like a Super Ultra setting vs Ultra... It's nothing like 2D to 3D, or Halo 3 to Crysis, in terms of rendering... You also can't sing the praises of RTX as something you can't live without when the devs are not even giving us the best rasterized picture they can... You think those glass panes would be without reflections in Control if the game had no RTX at all? So whilst you are singing the praises of RTX and thinking it's a one-button transformation, you are simply not aware that rasterization is still present in current RTX games, since it's hybrid... So they're not done with rasterization yet; it's just that they dumb down the rasterization techniques to make RTX look better. And even then it's not a big difference; yet even with the dumbed-down rasterization, at least you get the chance to play at 1440p-4K 60fps with a sharp IQ.

All I ask is that devs continue to give us good rasterization implementations till raytracing becomes a thing across low end to high end, consoles and all, in late 2020 or early 2021...



Yup. Like with all gadgets, wait a few years for things to improve and the kinks to get ironed out. Let the whales dive in first.

It's like HD TVs. Whoever out there spent $5,000 on a 32" 720p LCD with 24ms lag, ghosting, and only 1 HDMI port, hey, that's great. I'll wait a few years.

It's not like console gaming, where between a year-1 console and a year-3-4 console you save maybe $150 and it's basically the same console, except maybe a free pack-in game and a slimmed model. With CPUs and GPUs you can literally save $500-1,000 and the performance gets much better later.

Even in hindsight this stings, but at least I'm thinking those 32" 720p Sharp AQUOS guys had more content to play on their TVs back then than a handful of RTX games...
 

Darak

Member
I played half of Control with RTX medium and DLSS (2080, not Ti). Then I switched both off, played the rest of the game in normal 1440p, and had a much better experience. Their RTX implementation introduced some streaming glitches and had some temporal issues, apparent in cutscene cuts and sudden camera movements. In any case, the improvements, while obvious in some scenes, were not worth the massive performance drop.

Every time I test RTX in any game I feel the same way. It's nice, perhaps, but it is not worth the cost. I'd rather play the rasterized version at a higher resolution and framerate.
 

Rayderism

Member
Couldn't devs mix ray tracing with rasterization, cube maps, and whatever other techniques are currently in use, to create much the same effects with a lesser hit to performance? Does it really have to be one or the other? I mean, at least until RT tech is truly up to the task without tanking performance so much?

On a somewhat similar note:

I seem to remember that on PS5 they said ray tracing was going to be used mostly for sound (however that's supposed to work... echoes and surround-sound effects, I guess) and would come at a much lower performance cost than if it were used for visuals.
 

oagboghi2

Member
You can of course, but the justification for expensive Nvidia cards is indeed RTX... Why buy a $1200 card if you're not using its most touted feature at 4K 60fps?


Just like in the video, using a 2080 Ti on a 1080p monitor... nobody advertises RTX cards like this, especially the 2080 Ti. It's simply RTX this, RTX that, you can't live without RTX...


So nobody says:

Announcing the $1200 RTX 2080 Ti, the most powerful GPU ever imagined; it will deliver a revolutionary 1080p 60fps with RTX on.

or

Announcing the $400 RTX 2060S, a powerful mid-tier card, to deliver you a whopping 60fps at 720p with RTX on...


I think that would have added a different tone to the marketing, and yet it's the truth... it's what you get with the feature on...
1. Stop worrying about other people's money.
2. Stop pretending the 2080 Ti is the only worthwhile RTX card out there.
 

llien

Member
Why do you and some others believe AMD will somehow make RT far better than what Nvidia is doing at the moment?
Because FreeSync vs G-Sync.
Because DirectX shaders vs "how Nvidia destroyed OpenGL with its greed that turned into stupidity".
Because NV's market milking makes Intel look like a saint, to the point where, in the fight between greed and common sense, the former wins.

I don't see them having some miraculous technology that makes Nvidia's RTX obsolete.
For starters, RTX is just an implementation of a rather common ray tracing concept.
There is Microsoft's DXR API to back that up.
When even owners of $1300 cards switch it off because of the performance impact, it is already obsolete.
Current-gen RTX cards are simply too slow.
 

sertopico

Member
OP accidentally proves he's biased

Control maxed with RT maxed on my PC

4K Native
48663286047_578f245ac8_o.png


4K DLSS
48663138241_a9011fe429_o.png



Oh the HORROR :rolleyes:
This. I was also skeptical, and I don't own an RTX card, but I've still been able to enable the effects on my 1080 Ti. At first you don't see much going on, but then you realize it is a day/night difference. The way light behaves with ray tracing drastically changes the final IQ. It's a step closer to photorealism. DLSS works decently here, I must say, so it is possible to achieve good framerates with good IQ on current-gen cards.
 

pawel86ck

Banned
"DX12 is shit and performance is worse than with DX11!",
"Tesselation sucks, no one needs this... rounder fingertips with a 40% performance hit on my Kepler card!"
"Ambient Occlusion costs too much performance on my old card! Disable that shit!"


Yeah, it's always the same.
It looks like some people just don't remember how GPU technology has progressed. I still remember Riva TNT times (the performance impact in 32-bit color mode was HUGE) and shaders on the GeForce 3 (90fps in DX7 mode dropping to 30fps in DX8 mode in certain games). Yet the technology was adopted, and these days all games use 32-bit color and shaders. But I have to say I don't remember people complaining that much about 32-bit mode or shader performance, or saying things like 32-bit color or shaders "aren't ready". These days, however, people complain to the extreme when a new, important, and demanding graphics feature is used in games.




IMO RT is ready, but only at 1080p, or in selected games with low/medium RTX settings at 1440p. A 63fps average on a gamepad is a very good result. In comparison, some console games like Control run at 15-30fps and people play them at framerates like that, and yet the same people say 63fps with RT is unplayable. That's just absurd🙄.

Personally, there's no way I will buy my next GPU without HW RT. Maybe RTX performance still isn't perfect, but at least you have a choice and can experiment with RTX features. If you have a GPU without HW RT you have no choice at all, and I'm sure when PS5/Xbox Scarlett multiplatform games launch, nobody will want to play them with downgraded graphics. In fact, some games will probably not even run without HW RT.
 
Nvidia sold 3 cards with basically the same rasterizer performance as a 2-year-old 1080 Ti, 2 of them at the same price. Progress!!! "But that's okay, I can get 30fps @ 4K in an empty room." "And almost 60 if I smear Vaseline over it."
 

Whitecrow

Banned
Ray tracing is here not to make everything prettier, but more accurate to real life; those things do not always go hand in hand.
But it's not a magic wand. The fact that it can better simulate the behaviour of light doesn't mean artists and programmers will suddenly nail the textures and material light properties and suddenly create a real-life scene, but it definitely helps.

If you don't care about that, you can leave it off and play.
 

Solomeena

Banned
More clueless people like the OP who don't understand the importance of ray tracing; this is just sad. DURR OLD GAMES DID THE LIGHTING LIKE THIS GAME WITH RTX. Old games use baked lighting; it cannot be changed, it is literally baked into the scene. Ray tracing is real-time lighting and shadows, not baked into the scene. Not sure why or how you people cannot understand this simple explanation. My guess is that it's mostly clueless console gamers mixed with AMD fanboys trying to spread FUD about Nvidia because their precious AMD can't get their shit together and doesn't have ray tracing in their own lousy hardware.
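Minus the shouting, the baked-versus-dynamic distinction being made fits in about twenty lines. A minimal sketch (all types are stand-ins, not any engine's API):

```cpp
#include <cmath>

struct Vec3 {
    float x = 0, y = 0, z = 0;
    Vec3 operator-(Vec3 b) const { return {x - b.x, y - b.y, z - b.z}; }
    Vec3 operator*(Vec3 b) const { return {x * b.x, y * b.y, z * b.z}; }
};
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Surface { Vec3 position, normal, albedo; float u = 0, v = 0; };
struct Light   { Vec3 position, color; };

// Baked: the expensive light transport ran offline and was stored in a
// texture. Cheap to sample at runtime, but frozen -- move a wall (or the
// light) and the stored result is simply wrong.
struct Lightmap { Vec3 sample(float, float) const { return {1, 1, 1}; } };

Vec3 shadeBaked(const Surface& s, const Lightmap& lm) {
    return s.albedo * lm.sample(s.u, s.v);
}

// Dynamic RT: re-ask "can this point see the light?" every frame.
// 'occluded' stands in for a real ray-vs-scene (BVH) query.
static bool occluded(Vec3, Vec3) { return false; }

Vec3 shadeRayTraced(const Surface& s, const Light& l) {
    if (occluded(s.position, l.position)) return {};  // in shadow right now
    Vec3 d = l.position - s.position;
    float invLen = 1.0f / std::sqrt(dot(d, d));
    float ndotl = dot(s.normal, {d.x * invLen, d.y * invLen, d.z * invLen});
    if (ndotl < 0) ndotl = 0;
    return s.albedo * l.color * Vec3{ndotl, ndotl, ndotl};
}
```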
 

Arun1910

Member
I actually think Control looks amazing with RTX on, at least from what I played (which was the whole thing). It's the only RTX game where I left it on, because it looked stunning. There are so many windows and reflective surfaces in the building you play in that the ray tracing is constantly in your face. It's not like BFV, for example, where you need to go up to a puddle to see a reflection.
 

Darklor01

Might need to stop sniffing glue
As one of those clueless console gamers... I understand why it is important. I think what people are upset about is that something they can buy is out there which doesn't yet fully deliver an optimal experience for what they'd be willing to pay. They are bothered by this, rather than by the tech being put out for purchase before it's ready. The problem with that is, buying these cards proves to the company that R&D funds should keep going to ray tracing. Buy or don't buy... up to the person buying. Either way, quit bitching. It'll get there. The result will be worth it. Imagine photogrammetry with proper ray tracing and top performance. The future looks good (at some point).
 
Couldn't devs mix ray tracing with rasterization, cube maps, and whatever other techniques are currently in use, to create much the same effects with a lesser hit to performance?
As far as I know, not a single game uses pure raytracing to render its image; hybrid rendering is not new either, it has been used in offline rendering for years.
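For anyone picturing "pure ray tracing": the hybrid frame in current RTX titles is shaped roughly like this (every function is an empty stand-in; only the ordering is the point):

```cpp
// Rough shape of the hybrid frame that current "RTX" titles actually ship:
// rasterize as usual, then spend rays only where rasterization fakes badly.
static void rasterizeGBuffer()    {}  // geometry + materials: plain raster
static void rayTraceReflections() {}  // replaces SSR/cube maps where budgeted
static void rayTraceShadows()     {}  // replaces shadow maps for some lights
static void denoiseRTResults()    {}  // few rays/pixel => heavy denoising
static void shadeAndComposite()   {}  // combine raster and RT buffers
static void upscale()             {}  // e.g. DLSS to claw back resolution

void renderHybridFrame() {
    rasterizeGBuffer();
    rayTraceReflections();
    rayTraceShadows();
    denoiseRTResults();
    shadeAndComposite();
    upscale();
}
```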
 

carsar

Member
We live in the era of temporal antialiasing, so 4K is the only way to get rid of the insane blur and rough image quality.
4K is not showing off; it is a necessity for current-gen games. I'm not saying we need 4K for past-gen games, because they don't look so blurry even at 1080p.
What's more, the hyper-ultra settings you like on PC are not as important as 4K is in current-gen games.
If you have a 1080p monitor and a 2080 Ti, it would be much better to force 200% resolution for perfect render quality and ignore RTX.
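The arithmetic behind that suggestion: 200% render scale doubles each axis, so a 1080p panel ends up paying exactly a native-4K pixel load — a like-for-like trade against RTX's cost.

```cpp
#include <cstdio>

int main() {
    // 200% render scale doubles each axis, so 1080p supersampled costs
    // exactly as many pixels as native 4K.
    const long long w = 1920, h = 1080, scale = 2;
    long long ss = w * scale * h * scale;
    printf("1080p @ 200%%: %lld x %lld = %lld pixels (%.0fx native 1080p)\n",
           w * scale, h * scale, ss, (double)ss / (w * h));
}
```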
 

pottuvoi

Banned
HNIANh6.jpg


$1200 to get the image on the right running at an acceptable frame rate @ 1080p. It's ludicrous.
Works fine with my 2070.
Couldn't devs mix ray tracing with rasterization, cube maps, and whatever other techniques are currently in use, to create much the same effects with a lesser hit to performance? Does it really have to be one or the other? I mean, at least until RT tech is truly up to the task without tanking performance so much?
They do.
E.g., Unreal 4.23 added support for using cube maps on reflective objects that are visible in reflections, so reflected surfaces look like they have a proper material. (Without an additional RT bounce or a cube map, those metals would be black.)
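A sketch of that fallback idea (not Unreal's actual code; types and stubs are stand-ins): trace one real reflection bounce, and if the surface it hits is itself mirror-like, finish with a cheap prefiltered cube map lookup instead of recursing.

```cpp
struct Vec3 { float x = 0, y = 0, z = 0; };
struct Ray  { Vec3 origin, dir; };
struct Hit  { bool ok = false; bool mirrorLike = false;
              Vec3 pos, reflectedDir, color; };

static Hit  traceScene(const Ray&)    { return {}; }  // stand-in BVH query
static Vec3 sampleSky(const Ray&)     { return {}; }
static Vec3 sampleCubemap(Vec3, Vec3) { return {}; }  // baked probe lookup
static Vec3 shadeDirect(const Hit&)   { return {}; }

Vec3 shadeReflection(const Ray& reflectedRay) {
    Hit h = traceScene(reflectedRay);      // the one genuinely traced bounce
    if (!h.ok) return sampleSky(reflectedRay);
    if (h.mirrorLike)                      // shiny thing seen in a reflection:
        return sampleCubemap(h.pos, h.reflectedDir); // cube map, no 2nd bounce
    return shadeDirect(h);                 // ordinary material shading
}
```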
Nvidia sold 3 cards with basically the same rasterizer performance as a 2-year-old 1080 Ti, 2 of them at the same price. Progress!!! "But that's okay, I can get 30fps @ 4K in an empty room." "And almost 60 if I smear Vaseline over it."

Turing has some awesome new rasterization features too; sadly, many of them need proper support from developers. (IMHO, the mesh shader is the biggest new feature.)
Tessellation was not a good idea. Remember, the plan is to push the pixels into the correct place; more triangles were not the solution to that.
It's like saying foveated rendering should render 8K for the iris; it kills the benefits.
Tessellation is a great idea; sadly, the hardware implementation in GPUs and DX is crap.
Currently the decent way to do it is in compute, and the best way is with mesh shaders.
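The compute-style approach meant here, boiled down to its core idea: subdivide where triangles are big on screen, stop where they're small. A toy CPU version (real implementations run this in a compute or mesh shader, not on the CPU):

```cpp
#include <cmath>
#include <vector>

struct V2  { float x, y; };   // already-projected screen position
struct Tri { V2 a, b, c; };

static float len(V2 p, V2 q) { return std::hypot(p.x - q.x, p.y - q.y); }
static V2    mid(V2 p, V2 q) { return {(p.x + q.x) / 2, (p.y + q.y) / 2}; }

// Split a triangle 1-to-4 until its longest edge is below a screen-space
// threshold. Density adapts to on-screen size, so distant geometry never
// pays for triangles smaller than a pixel.
void tessellate(Tri t, float maxEdgePx, std::vector<Tri>& out, int depth = 0) {
    float longest = std::fmax(len(t.a, t.b),
                    std::fmax(len(t.b, t.c), len(t.c, t.a)));
    if (longest <= maxEdgePx || depth > 8) { out.push_back(t); return; }
    V2 ab = mid(t.a, t.b), bc = mid(t.b, t.c), ca = mid(t.c, t.a);
    tessellate({t.a, ab, ca}, maxEdgePx, out, depth + 1);
    tessellate({ab, t.b, bc}, maxEdgePx, out, depth + 1);
    tessellate({ca, bc, t.c}, maxEdgePx, out, depth + 1);
    tessellate({ab, bc, ca}, maxEdgePx, out, depth + 1);
}
```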
People in here obviously weren't around back when 3D was new... if you were able to run Quake 2 at 30fps when it came out, you considered yourself lucky.

New tech means a performance hit compared to old tech at first... raytracing is great, and it's already had a better start than 3D accelerator cards had in the 90s.
The good old times of rasterization, when the "GPU-accelerated" version was slower.

Yup, things will get faster in the future; there is a lot to improve and plenty of ways to do it still to be found.
Getting great performance from RT is not easy, even though it is a better way to sample the scene compared to rasterization. (As decades of research on CPU-based RT clearly show.)
 