
[Digital Foundry] Battlefield 5 vs RTX 2060: Is 1080p60 Ray Tracing Really Possible?

Kenpachii

Member
Solid card

Negatives:

1) Price too high; $299 would have been better, with maybe a $199/250 version with RAM changes.
2) VRAM-limited for the future; 8GB would have been better, with a 6GB model for the cheaper one.

Other than that, it seems decent for now. However, if you already have a 1060 or 970, I don't see much reason to upgrade, unless (for the 970) you really need the VRAM.
 

Redneckerz

Those long posts don't cover that red neck boy
I'm talking about DLSS, which is the post you quoted, not RTX.
It's literally a new feature. It's like saying pixel shader support sucked when the GeForce 3 came out in 2001.

Obviously it sucks right now. Games just don't automagically gain support for a feature like that.
 

Redneckerz

Those long posts don't cover that red neck boy
I've already read about sebbi's method, it's great.
Then you would know the difference.

I call going from 4k to 1080p, with less consistent framerate for some puddles, junk technology.
I call it early-day code, and the fact that BFV is not the best example for RT, a more likely explanation. I know it's tempting to write off new tech while support is still ongoing, but if everyone held that thought, we would still be stuck with T&L-based games without shaders.

We've had planar reflections and clever tricks for reflections that look great for years.
Sure, and for most situations it will suffice. But it breaks realism under different conditions, and you don't get real-time window reflections, so you can't tell who is shooting at you from behind.

There are so many situations where RT will actually impact gameplay; even if BFV is not the best use case, people arguing against it really ought to think a little more about this (not specifically you).

This method seen in Battlefield is the brute force method and there's nothing clever about it.
How exactly is it brute force?
 

ethomaz

Banned
6GB of VRAM probably matches a 12GB+ console... remember, the Scorpio has 9GB for games.

With the rumored consoles at 16GB total, I can see them having around 12GB for games, which equals what that RTX 2060 has if you include system memory.

So a next-gen console with 12GB shared will be similar to a 6GB GPU card plus system RAM.

Of course, the new consoles could have 24GB or even 32GB total... that could push the sweet spot to 8GB and 12GB of VRAM respectively.
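The comparison above is just simple arithmetic; here's a sketch of it. The 4GB OS reserve and the 6GB slice of host RAM are my own assumptions for illustration, and the console totals are the thread's rumors, not specs:

```python
# Back-of-the-envelope memory-budget comparison (rumored figures, not specs).
# A console shares one memory pool between CPU-side and GPU-side data, so a
# discrete GPU's VRAM plus a slice of host RAM is the rough point of comparison.

def console_game_budget(total_gb, os_reserve_gb=4):
    """Game-usable memory in a shared console pool, after an assumed OS reserve."""
    return total_gb - os_reserve_gb

def pc_effective_budget(vram_gb, host_game_ram_gb=6):
    """Discrete-GPU VRAM plus an assumed slice of host RAM the game can use."""
    return vram_gb + host_game_ram_gb

print(console_game_budget(16))  # 12 -> a rumored 16 GB console leaves ~12 GB for games
print(pc_effective_budget(6))   # 12 -> a 6 GB card plus ~6 GB of host RAM lands in the same ballpark
```

Under those assumptions the two budgets come out equal, which is the poster's point; change the OS reserve or host-RAM slice and the equivalence shifts.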
 
Redneckerz I think I know the difference, again. Sebbi's a clever programmer and Nvidia RT is a bloated waste.

Planar reflections don't break. It's an older technique, superior to SSR. Playing FF15, I remember how horrible the SSR looked, because it disappeared when you panned the camera. Think about that: Half-Life 2 has superior techniques compared to SSR. GameCube games, FFS, could have awesome reflections.
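For context on the technique being argued about: planar reflections work by re-rendering the scene through a camera mirrored about the reflection plane. A minimal sketch of the mirroring step (my own toy example, not any engine's actual code):

```python
import numpy as np

def reflect_point(p, n, d):
    """Reflect point p across the plane n.x + d = 0 (n must be unit length).
    Applying this to the camera position (and flipping winding order) gives
    the mirrored camera used to render a planar reflection."""
    p = np.asarray(p, dtype=float)
    n = np.asarray(n, dtype=float)
    return p - 2.0 * (np.dot(n, p) + d) * n

# Mirror a camera across a floor at y = 0: plane normal points up, offset 0.
up = [0.0, 1.0, 0.0]
cam = [3.0, 2.0, -1.0]
print(reflect_point(cam, up, 0.0))  # [ 3. -2. -1.]
```

The cost argument in the thread follows directly from this: the mirrored camera re-renders the whole scene once per reflective plane, which is why it's pricier than SSR but still far cheaper than tracing rays.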

Like I said, bud: universal, real ray tracing is a good step in real-time rendering - when our hardware actually has the fucking resources for it! Even this piecemeal shit like in BF5 or Metro, where only one part of the rendering is ray traced, is still cutting resolution and framerates to ribbons!

Stay with me, bud. FOR NOW it is a waste, and we'd be much better served if devs used the superior old techniques in clever and conservative ways. That's all I have to say.
 

Redneckerz

Those long posts don't cover that red neck boy
Redneckerz I think I know the difference, again.
I asked how exactly RT Cores, which accelerate a part of the process, are brute forcing the tech. If you mean brute forcing it in terms of pushing it into a game that clearly was not meant for it, then yeah, perhaps.

Sebbi's a clever programmer and Nvidia RT is a bloated waste.
I am not denying that, I am just asking if you know what's going on.

Planar reflections don't break. It's an older technique superior to SSR.
Alright, so why don't more games use it instead of SSR?

Like I said bud, Universal, real ray tracing is a good step in real time rendering - when our hardware actually has the fucking resources for it!
Guess what, they do! In a mixed fashion, but they do!

Even this piecemeal shit like in BF5 or metro where only one part of the rendering is ray traced... Is still cutting resolution and framerates to ribbons!
What you want is an instant jump to full-scene ray tracing - but everyone knows progression comes in intermediate steps. Hybrid rendering is one such step.

Stay with me, bud. FOR NOW it is a waste, and we'd be much better served if devs used the superior old techniques in clever and conservative ways. That's all I have to say.
I am not your "bud".
Any more old techniques you can think of?
 

ethomaz

Banned
1. “Old” tech is vastly INFERIOR to ray-tracing tech.
2. Ray tracing is old tech too.
3. You need to bring “new” tech to refine it and make it better... Nvidia is doing exactly that.

Like I said, I would never buy a GPU today without ray tracing.

You should be happy the tech is finally moving on from rasterization... it took decades for that.
 
Redneckerz

Ok Sunshine -

What exactly is your point? My main point is this tech is simply not worth the current cost. That's it. If you disagree, ok great. But let's stay on the same page.

There are a number of possible reasons why planar reflections may not be used: one, it's more expensive than SSR (though nowhere near as expensive as RT, so there's no "gotcha!" here), and two, devs might not necessarily know about it. Believe it or not, not every dev has the time to look at every possible solution that has ever existed; they're busy making games.

One reason SSR is popular is that it's featured in current middleware and is extremely easy to implement.

It's also a trend. Kind of like chromatic aberration, eye adaptation, lens flares, or anything else that's common in current games; that doesn't mean it's necessarily the best solution.

What I "want" is to not sacrifice resolution, textures, or anything else for an extremely computationally expensive solution that doesn't necessarily look a whole lot better than previously used techniques.

To repeat myself, since you're on a warpath with selective reading: yes, this hybrid solution may be an important first step, but that doesn't mean it is a good or worthwhile solution at the moment.
 

SonGoku

Member
It's literally a new feature. It's like saying pixel shader support sucked when the GeForce 3 came out in 2001.

Obviously it sucks right now. Games just don't automagically gain support for a feature like that.
But it's supposed to be a simple-to-implement feature, is it not? By your own admission, RTX, a more complex and harder-to-implement feature, is seeing better support than DLSS.
Perhaps it's not suited for all games, or it's not as easy as they touted. It gives me PS4 Pro CB vibes, where a supposedly easy-to-implement feature got ignored by most 3rd parties.
 

Redneckerz

Those long posts don't cover that red neck boy
Redneckerz

Ok Sunshine -

What exactly is your point? My main point is this tech is simply not worth the current cost. That's it.
And I disagree, especially for the RTX 2060. Having RTX on at 1080p on a midrange card and nearing 60fps is quite impressive, especially in these early days. For the record: this card likely would have hit 30fps or less with BFV's initial RTX release, which should be telling as to how many strides have been made.

I am not your sunshine, by the way, so you don't have to patronize me simply because I disagree with you.

There are a number of possible reasons why planar reflections may not be used: one, it's more expensive than SSR (though nowhere near as expensive as RT, so there's no "gotcha!" here), and two, devs might not necessarily know about it. Believe it or not, not every dev has the time to look at every possible solution that has ever existed; they're busy making games.
By your own account, it's an older technique. So what makes you think people went with SSR instead? Perhaps SSR was easier to set up in their renderers?

We can't know for sure, of course. Just as we can't know for sure why planar reflections aren't more broadly supported. One renderer is not like the other.

It's also a trend. Kind of like chromatic aberration, eye adaptation, lens flares, or anything else that's common in current games; that doesn't mean it's necessarily the best solution.
You have to start somewhere, which is the point. Your opinion is that this starting point should be discarded from the beginning.

What I "want" is to not sacrifice resolution, textures, or anything else for an extremely computationally expensive solution that doesn't necessarily look a whole lot better than previously used techniques.
Those window reflections and door reflections look a lot more natural than what one could achieve with SSR, especially from an angle. Windows in particular.

To repeat myself, since you're on a warpath with selective reading,
Such a useless accusation to make, Chozo, especially when I cited most of your post.

yes, this hybrid solution may be an important first step, but that doesn't mean it is a good or worthwhile solution at the moment.
Initial release, I would have agreed. Now it's actually a viable solution, at full HD, which is what most gamers have. So it's totally worthwhile. And again, BFV is not the best example of the tech.

But it's supposed to be a simple-to-implement feature, is it not? By your own admission, RTX, a more complex and harder-to-implement feature, is seeing better support than DLSS.
DLSS requires adapting the game to an AI profile, which requires user-end data and game-end data, which may or may not be what every dev wants.

It's a worthwhile addition though, considering the gains. It's in a way similar to the neural-network HD texture packs we have seen.
 

SonGoku

Member
DLSS requires adapting the game to an AI profile, which requires user-end data and game-end data, which may or may not be what every dev wants.

It's a worthwhile addition though, considering the gains. It's in a way similar to the neural-network HD texture packs we have seen.
So this might slow down its adoption, and it might not be as widespread as some claim; more of a niche feature.

eh.. in reality by numbers 4k is a meme almost just as much as rtx.
lol, funniest post I've read this week, honestly :messenger_tears_of_joy:
People spend $700+ on GPUs alone just to meme? It's just a meme, bro.
 

Jigsaah

Gold Member
I mean, he dropped the textures to High... which is basically what the X is running. Although the X is at 4K 60fps, with no ray tracing. I dunno if I'm willing to spend 350 bucks to upgrade to the 2060 to do 1080p with ray tracing. I'd like to see how it does with ray tracing off, because not all games are gonna have that. If it's got a significant boost over the 1080 Ti, then I think it's worth it.
 

Redneckerz

Those long posts don't cover that red neck boy
So this might slow down its adoption, and it might not be as widespread as some claim; more of a niche feature.
I dunno. I mean, the performance gains are obvious, and if game devs love anything, it's obvious performance gains, so it would be interesting if this keeps being left out of the loop.

Nvidia themselves could also point out the obvious perf advantages of this versus the obvious perf disadvantages of doing RT. Fortunately, DLSS mitigates that, so they should point it out more.
 

Mahadev

Member
Paying all that money to barely get 1080p and 60fps, just for ray tracing? Wow. It's fucking preposterous if you ask me; ray tracing right now is nothing but a useless marketing gimmick.
 
Redneckerz I already said what I think stops PR from being in current games. Like I said - read better.

Luckily, sunshine, I don't feel this tech needs to be discarded in the first place, because it's relegated to the PC and those gamers are the beta testers. I consider the PC a testing platform anyway, so I suppose all is right, really.

I don't mean that condescendingly either; everyone knows that's how brand-new tech works. I think it's stupid to buy, but you're not worse than me if you so choose.

Next gen consoles will keep their games high res regardless of RT on PC, so I'm not exactly disappointed.
 

Redneckerz

Those long posts don't cover that red neck boy
Redneckerz I already said what I think stops PR from being in current games. Like I said - read better.
And I replied to that accordingly. Like you say: read better.

Luckily, sunshine, I don't feel this tech needs to be discarded in the first place, because it's relegated to the PC and those gamers are the beta testers. I consider the PC a testing platform anyway, so I suppose all is right, really.
I agree. Except for the "sunshine" part, which you are just doing to provoke. Childish, really.

I don't mean that condescendingly either; everyone knows that's how brand-new tech works. I think it's stupid to buy, but you're not worse than me if you so choose.
What I read is you dismissing the tech for a variety of reasons that may not hold true.
 

SonGoku

Member
I dunno. I mean, the performance gains are obvious, and if game devs love anything, it's obvious performance gains, so it would be interesting if this keeps being left out of the loop.

Nvidia themselves could also point out the obvious perf advantages of this versus the obvious perf disadvantages of doing RT. Fortunately, DLSS mitigates that, so they should point it out more.
That's why I said it gives me Pro CB vibes: a "simple"-to-implement feature with obvious performance and quality benefits goes ignored by most 3rd parties in favor of more rudimentary solutions.
 
That's why I said it gives me Pro CB vibes: a "simple"-to-implement feature with obvious performance and quality benefits goes ignored by most 3rd parties in favor of more rudimentary solutions.
Checkerboarding is easy enough to implement, but there are various implementations with varied results. At times it can look worse than no upscaling at all.

Actually, I would say at all times, but some methods are particularly bad, like Dragon Quest or Red Dead.

Temporal injection is the gold standard of upscaling techniques and needs no special hardware. It's hilarious to see anyone touting DLSS as something new/superior.
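For readers unfamiliar with the checkerboarding being debated here, a toy sketch of the basic idea: render half the pixels each frame in a checker pattern and fill the gaps from the previous frame. Real implementations add motion vectors and ID buffers, which is exactly where the varied results people mention come from; this sketch and all its names are mine:

```python
import numpy as np

def checker_mask(h, w, phase):
    """Boolean mask selecting half the pixels in a checkerboard pattern."""
    ys, xs = np.indices((h, w))
    return (ys + xs) % 2 == phase

def cb_merge(current, previous, phase):
    """Naive checkerboard merge: keep this frame's freshly rendered pixels,
    fill the gaps with last frame's pixels (no motion compensation, so any
    movement between frames produces exactly the artifacts discussed above)."""
    out = previous.copy()
    mask = checker_mask(*current.shape, phase)
    out[mask] = current[mask]
    return out

prev_frame = np.full((4, 4), 1.0)  # stand-in for frame N-1
cur_frame = np.full((4, 4), 2.0)   # stand-in for frame N
recon = cb_merge(cur_frame, prev_frame, phase=0)
print(recon.mean())  # 1.5 -> half the pixels are fresh, half are stale
```

Alternating `phase` each frame covers the other half of the grid, so every pixel is at most one frame stale; the quality differences between games come from how cleverly the stale half is reprojected.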
 

Redneckerz

Those long posts don't cover that red neck boy
That's why I said it gives me Pro CB vibes: a "simple"-to-implement feature with obvious performance and quality benefits goes ignored by most 3rd parties in favor of more rudimentary solutions.
Yeah, but CB, although it looks close enough to a 4K image, isn't 4K. It also doesn't enhance perf that much.

DLSS is closer to native 4K (I think???) and enhances perf. Together with variable rate shading, where different parts of the screen render at different resolutions, we could see a lot of gains.

Though I do think there is something to be said against AI profiles for games. I am more into the neural-network HD texture pack thing, which uses similar tech.

Imagine that games could have something like DLSS but more on-the-fly and with shaders.
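The variable rate shading mentioned above can be illustrated with a toy: shade one region of the screen per pixel and another once per 2x2 block. This is only a sketch of the idea, not how GPU VRS actually works internally, and every name in it is made up:

```python
import numpy as np

def shade(h, w):
    """Stand-in for an expensive per-pixel shader (deterministic pattern)."""
    ys, xs = np.indices((h, w))
    return (ys * w + xs).astype(float)

def shade_coarse(h, w, rate):
    """Shade once per rate x rate block, then replicate the result,
    i.e. coarse-rate shading for less important screen regions."""
    coarse = shade(h // rate, w // rate)
    return np.kron(coarse, np.ones((rate, rate)))

h, w = 8, 8
fine_half = shade(h, w)[:, : w // 2]            # left half: full shading rate
coarse_half = shade_coarse(h, w, 2)[:, w // 2:]  # right half: 2x2 shading rate
frame = np.hstack([fine_half, coarse_half])
# The right half invoked the "shader" a quarter as often as the left half.
```

The perf gain is exactly that ratio: the coarse region does 1/rate² of the shading work, at the cost of blockier results where the viewer hopefully won't notice.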
 

SonGoku

Member
Checkerboarding is easy enough to implement, but there are various implementations with varied results. At times it can look worse than no upscaling at all.

Actually, I would say at all times, but some methods are particularly bad, like Dragon Quest or Red Dead.

Temporal injection is the gold standard of upscaling techniques and needs no special hardware. It's hilarious to see anyone touting DLSS as something new/superior.
The PS4 Pro has a hardware-based CB feature that goes ignored by most 3rd parties. Sony's 1st parties have the best image quality on that console, and they use it most of the time; games like Horizon and GoW showcase it.
I'm just saying DLSS sounds a hell of a lot like Pro CB: great on paper, but it barely receives support from anyone other than the hardware manufacturer.
Yeah, but CB, although it looks close enough to a 4K image, isn't 4K. It also doesn't enhance perf that much.

DLSS is closer to native 4K (I think???) and enhances perf. Together with variable rate shading, where different parts of the screen render at different resolutions, we could see a lot of gains.

Though I do think there is something to be said against AI profiles for games. I am more into the neural-network HD texture pack thing, which uses similar tech.

Imagine that games could have something like DLSS but more on-the-fly and with shaders.
Oh no, I'm not saying they are the same tech-wise.
I'm just seeing similarities in support (or lack thereof). I'm sure DLSS is much better quality, but it's also more complex to implement, which makes it even less likely to receive widespread support.
 
The PS4 Pro has a hardware-based CB feature that goes ignored by most 3rd parties. Sony's 1st parties have the best image quality on that console, and they use it most of the time; games like Horizon and GoW showcase it.
I'm just saying DLSS sounds a hell of a lot like Pro CB: great on paper, but it barely receives support from anyone other than the hardware manufacturer.

Oh no, I'm not saying they are the same tech-wise.
I'm just seeing similarities in support (or lack thereof). I'm sure DLSS is much better quality, but it's also more complex to implement, which makes it even less likely to receive widespread support.
I think it's ignored because developers don't like the look of it and how it affects their lighting and particle systems.

Another consideration is that, yes, the Pro has built-in hardware for it, but the X1X doesn't. Maybe CB isn't used in some cases on the Pro just for the simplicity of multiplatform development.

Maybe you don't notice, but in the checkerboarded games I've played, the artifacts make it look worse than 1080p to me. Like Nex Machina from Housemarque. Good Sony devs.
 

Redneckerz

Those long posts don't cover that red neck boy
Oh no, I'm not saying they are the same tech-wise.
I'm just seeing similarities in support (or lack thereof). I'm sure DLSS is much better quality, but it's also more complex to implement, which makes it even less likely to receive widespread support.
I mean, this much performance improvement is too much to just keep getting ignored, no? Unless Nvidia has a whole list of restrictions before DLSS can be implemented, it seems like an obvious choice?
 

SonGoku

Member
I think it's ignored because developers don't like the look of it and how it affects their lighting and particle systems.
So how come Sony 1st-party image quality exceeds any 3rd-party effort on the console?
Perhaps we are back to the "it's more difficult and complex to implement properly" dilemma?

RDR2, for example: I don't know what they used, but it looks like ass, worse than 1080p.
I mean, this much performance improvement is too much to just keep getting ignored, no? Unless Nvidia has a whole list of restrictions before DLSS can be implemented, it seems like an obvious choice?
But as we've seen with the Pro, there's always the cheap upscaling route. idk, personally I hope DLSS or a better version of it catches on in the future.
I'm just not very confident it will ever be more than an Nvidia PhysX-type feature.

Ray tracing, however, will absolutely become the standard in 10+ years.
 

thelastword

Banned
The ray-traced implementation in BFV reminds me so much of the inception of bump mapping and shaders... we had shiny walls and surfaces everywhere. Then there was the custom UE3 look that was prevalent across many games, or when the industry just went crazy implementing bloom lighting with a heavy yellow tint in every game, or when they went berserk with chromatic aberration and lens flares...

I look at BFV and this game is so aesthetically unpleasing to me; it's just downright noisy and unrealistic... reflections everywhere the world is not... and it looks worse with low-poly gun models with jagged, unsmooth surfaces...

The fact that Leadbetter is trying so hard to make a feature work decently with a $350.00 card says a lot, because it does not even look that good...
 
So how come Sony 1st-party image quality exceeds any 3rd-party effort on the console?
Perhaps we are back to the "it's more difficult and complex to implement properly" dilemma?

RDR2, for example: I don't know what they used, but it looks like ass, worse than 1080p.

I don't agree that Horizon or God of War looks better than a straight-up 1440p image. Sure, it's crisper when standing still, but in motion it can fall apart around alpha effects and lower-resolution elements like lighting. And the edges have uneven patterns. Personally, I prefer 1080p.

I mean, who knows how good Sony's tools are for implementing it. Yeah, it might not be as easy as flipping a switch.

DF says Red Dead uses checkerboarding, just not as good an implementation as Horizon's.
 

SLB1904

Banned
6GB is more than enough for 2K, but not for 4K. The Resident Evil 2 Remake demo on PC uses up to 12GB of VRAM at 4K/max settings.
The RE2 demo is not a good example, because it's a demo.
12GB of VRAM is too much for what that game is doing.
But again, I'm not an expert; let's wait for the final version.
 

Spukc

always chasing the next thrill
DP, but this post demonstrates exactly why render engines and game engines should not be compared, and also how silly it is to look down on this from a game-engine perspective.

It's very much ray tracing, just a part of it. Hence why it is mixed rendering. All those render engines aren't laughing at RTX, because they know it's not in the same bracket. RTX, to them, is like their little cousin getting accustomed to walking/ray tracing.
Mind you, I never stated that what I use is the same.
Just saying that
"Im clueless but Im sure I'll get some atenttion with this"
EDGY
D
G
Y
 

Meh3D

Member
I find people's attitudes toward DLSS on this forum quite interesting. I feel the quote "what's old is new again" is appropriate. DLSS is pretty much deep learning applied to upscaling. One generation ago (on this forum), it didn't matter what the final output was, only what quality the internal rendering was done at, at least when it came to graphics-comparison arguments here. Now I'm seeing people mention this as a feature. Reading about the excitement over something rendering below 1080p and then upscaled via DLSS on a $350 card, on a forum that had heated arguments about internal rendering vs upscaled output, is pretty funny.

Personally, I feel the card is too expensive by about $50. On one hand, I do like the idea of smart upscaling on entry and mid-range cards. On the other, I'm wondering if anyone at Nvidia tried to offer this to console manufacturers as a selling point for their GPUs. While this is not the thread for it, I think selling this as a middleware software package to console game developers would be a great play. (Aka PhysX-style console middleware.)
 

Redneckerz

Those long posts don't cover that red neck boy
Mind you, I never stated that what I use is the same.
Just saying that

EDGY
D
G
Y
But what Nvidia is doing isn't so much bullshit as it is tracing a subset of the entire ray-tracing paradigm. And it's doing that still without Metropolis Light Transport.

As soon as we can get real-time MLT going, we will really be approaching cinematic rendering. MLT has only been around since 1997, so you can imagine how new this tech still is in graphics terms.
 