
Battlefield V with RTX Initial Tests: Performance Halved (1080p)

SonGoku

Member
https://www.techpowerup.com/249557/battlefield-v-with-rtx-initial-tests-performance-halved
We began testing with a GeForce RTX 2080 Ti graphics card with GeForce 416.94 WHQL drivers on Windows 10 1809. Our initial test results are shocking. With RTX enabled in the "ultra" setting, frame-rates dropped by close to 50% at 1080p.


So it seems RTX is nowhere near ready for prime time (but hey, gotta start from the bottom). ATM, AMD is better suited making cards focused on raster performance; they might even surpass Nvidia in that department since they'll have more transistors to play with.
 

lefty1117

Gold Member
I thought I read that Nvidia said BF5 has to be patched to leverage the new RTX drivers they released. Read that this morning.
 

SonGoku

Member
Ray tracing in games is newborn. Give it time to mature.
The hw is newborn; I'd say we are 5 arch releases away from 4K 60fps. I don't think software optimization will get much more performance out of the current hw.

Still, 1080p at 60fps for ray tracing is mighty good.
 
Dropped by close to 50%? Am I blind? I swear I see a bigger than 50% drop there.

Edit: everyone knows the drop will be huge; what matters here is whether that tradeoff is worthwhile.
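Napkin math behind my "5 arch releases" guess, for anyone curious. All inputs here are rough assumptions for illustration, not benchmarks:

```cpp
// Back-of-envelope: how many GPU generations until 4K60 ray tracing,
// assuming RT cost scales with pixel count? All numbers are guesses.
#include <cmath>
#include <cstdio>

int main() {
    const double fps_now      = 65.0; // ballpark 1080p Ultra-RTX result from the article
    const double target_fps   = 60.0; // desired frame rate
    const double px_scale     = 4.0;  // 4K (3840x2160) has 4x the pixels of 1080p
    const double gain_per_gen = 1.4;  // assumed per-generation RT throughput gain

    // Required speedup: 4x the rays at roughly the same frame rate.
    const double needed = px_scale * (target_fps / fps_now);
    const double gens   = std::log(needed) / std::log(gain_per_gen);
    std::printf("Need ~%.1fx throughput -> ~%.1f generations at %.1fx/gen\n",
                needed, gens, gain_per_gen);
    return 0;
}
```

With a more optimistic 1.5x per gen it lands closer to 3 generations; with 1.3x, closer to 5. That's the whole spread of the argument.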
 

Whitecrow

Banned
The hw is newborn; I'd say we are 5 arch releases away from 4K 60fps. I don't think software optimization will get much more performance out of the current hw.

Still, 1080p at 60fps for ray tracing is mighty good.
Ray tracing at 1080p 60 fps is something I would choose instead of 4K.
Or RT at 30 fps instead of rasterization at 60 fps.

Options are good.
 

Pagusas

Elden Member
Anyone buying this is a beta tester; this is gen 1 hardware, folks, we all know it. If you want it now, you'll deal with the early tech restrictions. I'm just glad we are finally concentrating on this.
 

Shai-Tan

Banned
Ray tracing at 1080p 60 fps is something I would choose instead of 4K.
Or RT at 30 fps instead of rasterization at 60 fps.

Options are good.

I've tried both, and considering "ray tracing" here is just reflections, I don't find it particularly impressive. It's kind of like how the physics simulations in PhysX were all very simple, with limited bodies interacting, while games with faked physics like Red Faction Guerrilla were much more interesting to look at. Given the state of physical simulation in graphics research, I can see a nice hybrid scheme where there is some interaction, but it requires a lot of diligent work, and the same follows for RTX: it won't "just work", because it runs into the same problem as PhysX of being inadequate for a more complex simulation. So if we get something good, it will be years down the line, with a lot of hacks, and it will require much more powerful hardware. My opinion, then, is that RTX right now is a toy. I'm hoping the lighting in Metro Exodus makes it worth it, but it will have to come at the opportunity cost of resolution, which in my opinion makes a bigger difference to the visual clarity and quality of a scene, considering the limited nature of current ray tracing technology.
 

Vlaphor

Member
RTX seems cool, but these cards always seemed like early-adopter cards. Barely any improvement over the previous series (I have a 1080 Ti, and nothing I've seen in terms of performance makes me want to upgrade), and a feature barely anything makes use of tells me that I should wait for the next gen before jumping in.
 

Redneckerz

Those long posts don't cover that red neck boy
https://www.techpowerup.com/249557/battlefield-v-with-rtx-initial-tests-performance-halved



So it seems RTX is nowhere near ready for prime time (but hey, gotta start from the bottom). ATM, AMD is better suited making cards focused on raster performance; they might even surpass Nvidia in that department since they'll have more transistors to play with.
I think it's actually fairly decent, as it highlights really well that yes, RTX carries a heavy performance penalty. There are a few things to keep in mind though:
  • This is early code.
  • This is on Frostbite, so on other implementations, the penalty may vary.
  • Ray tracing support may differ per engine. Atomic Heart makes far heavier use of ray tracing but is also a late-2019 game. I think that game is going to be the true test for these cards.
  • DLSS was not employed, it seems. So hypothetically, performance can rise at higher resolutions and higher RTX settings if DLSS is employed (and supported).
Even though it's early days, it's actually fairly decent, especially for a generation-zero (no pun intended) product. Remember that the only thing coming close to hybrid ray tracing silicon prior to this was Imagination's Wizard, and that was just a few years ago.

It can't be overstated how impressive this genuinely is, even with a 50-65% performance deficit (meaning multiple ms are spent doing RT) and at 1080p resolution.
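To put that deficit in frame-time terms, here's a quick sketch. The 140 and 65 fps figures are round numbers in the ballpark of the charts, purely for illustration:

```cpp
// Translating an fps drop into milliseconds of per-frame RT cost.
#include <cstdio>

int main() {
    const double fps_off = 140.0; // assumed RTX-off frame rate at 1080p
    const double fps_on  = 65.0;  // assumed RTX Ultra frame rate at 1080p
    const double ms_off  = 1000.0 / fps_off; // ~7.1 ms per frame
    const double ms_on   = 1000.0 / fps_on;  // ~15.4 ms per frame
    std::printf("RT + denoise adds roughly %.1f ms per frame\n", ms_on - ms_off);
    return 0;
}
```

Roughly 8 ms of every frame going to tracing and denoising reflections is a huge slice of a 16.7 ms (60 fps) budget.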

I mean, it's over 60fps, which is good... but that's at 1080p. They need to do this test with 1440p and 4K monitors.
Why should they? AFAIK BFV does not employ DLSS, so higher resolutions will (naturally) see it drop below 60 on anything above RTX Low. 1440p at 60 fps could be achievable with RTX Low, but 4K60 is an absolute pipe dream.

It's overkill to expect that out of gen 0 hardware, imo.

The hw is newborn; I'd say we are 5 arch releases away from 4K 60fps. I don't think software optimization will get much more performance out of the current hw.

Still, 1080p at 60fps for ray tracing is mighty good.
See the quote to DeepEnigma below, but I think RTX Low and Medium are the first steps to tackle. RTX Low is a heavy penalty but does keep the game over 80 fps, so there is potential for that setting at higher resolutions, and that is to be accepted imo. Medium, High, and Ultra all show similar performance in BFV, which seems to be an issue, but Medium is the first one to tackle.

Honestly, DF should take a look at what the differences between the various settings are: What is the sample count? How many rays are traced? And what is performance with a regular GTX 1080 Ti on the DXR fallback layer (if the game supports it)?

All kinds of questions we need answered to establish a metric of potential for RTX, imo. :)
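On the "how many rays" question, a crude upper-bound sketch. One reflection ray per pixel is an assumption on my part; real per-setting sample counts are exactly what we don't know:

```cpp
// Crude ray-budget estimate: how many reflection rays per second does
// 1080p60 imply at 1 ray per pixel? (Assumed sample count; reality varies.)
#include <cstdio>

int main() {
    const double pixels         = 1920.0 * 1080.0; // ~2.07M pixels
    const double rays_per_pixel = 1.0;             // assumed reflection samples
    const double fps            = 60.0;
    const double rays_per_sec   = pixels * rays_per_pixel * fps; // ~124M rays/s
    // NVIDIA quotes ~10 Gigarays/s for the 2080 Ti, but that is a best-case
    // marketing figure; shading, divergence, and denoising dominate real cost.
    std::printf("~%.0fM rays/s at 1080p60, 1 ray per pixel\n", rays_per_sec / 1e6);
    return 0;
}
```

The gap between ~124M rays/s needed and the quoted peak shows how much of the frame cost is everything around the rays, not the rays themselves.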

Why does medium have a lower frame-rate than Ultra?
Early code. It's already strange that Medium, High, and Ultra deliver similar framerates, which is either a bug or there is hardly any performance difference between the three (probably the former). Low gives you a 20-30 fps increase compared to Medium, so that one is actually interesting: still a big performance loss, but one that could do well on lower-spec RTX cards and might be interesting for higher resolutions as well. 1440p Medium RTX and 4K Low RTX are the first boundaries we can overtake within a few generations of cards, so to speak. I'd say those are the most realistic settings and resolutions for RTX for the next 1 to 2 years.

RTX seems cool, but these cards always seemed like early-adopter cards. Barely any improvement over the previous series (I have a 1080 Ti, and nothing I've seen in terms of performance makes me want to upgrade), and a feature barely anything makes use of tells me that I should wait for the next gen before jumping in.
It is silly to consider an upgrade over a GTX 1080 Ti; even DF says that, for the moment, it's a useless upgrade. But when it comes to future-proofing on tech, the RTX cards are an interesting prospect.
 

demigod

Member
Anyone buying this is a beta tester; this is gen 1 hardware, folks, we all know it. If you want it now, you'll deal with the early tech restrictions. I'm just glad we are finally concentrating on this.

I bailed out after I heard 1080p with ray tracing at the $1200 price tag. I can wait until the 3000 series, and until monitor companies get their act together with 4K HDR without a $2000 price tag.
 

Imtjnotu

Member
I can't see the big push for 4K lately ending with us going back to 1080p for better lighting and reflections. I'll hold off until performance picks up.
 

Leonidas

Member
1440p performance seems reasonable given this is the first piece of software to implement the features.

Seems odd to include 1080p results and skip over the 1440p ones.
 

Redneckerz

Those long posts don't cover that red neck boy
1440p performance seems reasonable given this is the first piece of software to implement the features.

Seems odd to include 1080p results and skip over the 1440p ones.
Because it's a $1200 GPU that Nvidia has been hyping the shit out of.
I mean, I am on the same page as Leonidas here.

It's Nvidia. Of course they will hype it. What would be far more interesting is to see if performance holds up (relatively) with the RTX 2070. If it does, it also proves that 4K testing is not needed, and 1440p likely only holds 60 with RTX Low.
 

Filben

Member
Too much performance loss for a quality improvement that is only slightly noticeable in motion and action. I can wait for the technology to become more mainstream and affordable for the average consumer. Gladly, it's not a big jump that blows me away, so I'm not even thinking about getting a new card to replace my GTX 1070.
 

Lort

Banned
Xbox One X players be like lol at PC master race... worse performance (with better reflections) in a game at 1080p than an Xbox One X at 4K.

What a waste of money this technology is right now.
 

Allandor

Member
Did they not know DX12 tanks the framerate in most games? Turn that jive Off.
??? There is no DXR without DX12. And btw, the performance loss in DX12 has nothing to do with DX12 being bad; it has to do with optimizations that the developer now needs to do. They just don't seem to have their engine code really DX12/Vulkan-ready. But you can see how well you can still optimize for DX11.
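To make the point concrete: detecting DXR support is literally a D3D12 feature query. A minimal sketch (assumes a Windows 10 1809-era SDK or later; error handling trimmed):

```cpp
// DXR is exposed through Direct3D 12, so ray tracing support is a
// plain D3D12 feature check on the device.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    // Create a default D3D12 device (no DXR without this).
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device available.");
        return 1;
    }
    // OPTIONS5 carries the ray tracing tier.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5)))) {
        if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
            std::puts("DXR tier 1.0+ supported on this device/driver.");
        else
            std::puts("No DXR support on this device/driver.");
    }
    return 0;
}
```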

Well, the Hardware Unboxed video just shows that the 2070 can't really be an RTX card and that there won't be an RTX 2060. It just doesn't make any sense to put an even lower number of those RT "cores" on a card.
 

mrMUR_96

Member
https://www.techpowerup.com/249557/battlefield-v-with-rtx-initial-tests-performance-halved

So it seems RTX is nowhere near ready for prime time (but hey, gotta start from the bottom). ATM, AMD is better suited making cards focused on raster performance; they might even surpass Nvidia in that department since they'll have more transistors to play with.

Ray tracing isn't rasterized in this case. RTX cards run it on custom "tensor" cores that are optimised for ray tracing. Running it straight up on standard cores will have awful performance.
 

kraspkibble

Permabanned.
It's disappointing, sure, but it will take a few generations to mature and work its way down across all models of GPU.
 

pawel86ck

Banned

20-30 fps with shaders, 80-130 fps without shaders, so GeForce 3 shading performance was very poor, yet every game today uses shaders. It was the same with HDR on SM3, tessellation, etc., and in the near future RTX will also become a new standard. I'm guessing the next RTX cards from Nvidia will double ray tracing performance (40 fps at 1440p should be 80 fps then), and thanks to DLSS, 1440p will be upscaled to 4K with very good results.
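Rough math on what that doubling plus DLSS would buy (assumed numbers for illustration, not benchmarks):

```cpp
// Back-of-envelope: a 2x RT-core uplift plus DLSS-style upscaling.
#include <cstdio>

int main() {
    const double fps_1440p_now = 40.0; // quoted BFV-style figure
    const double rt_uplift     = 2.0;  // assumed per-gen RT doubling
    // Pixel counts: DLSS renders near 1440p and upscales to 4K, so the
    // ray-traced work stays roughly at 1440p cost.
    const double px_1440p = 2560.0 * 1440.0; // ~3.7 MPix
    const double px_4k    = 3840.0 * 2160.0; // ~8.3 MPix
    std::printf("4K has %.2fx the pixels of 1440p\n", px_4k / px_1440p); // ~2.25x
    std::printf("Hypothetical next-gen 1440p fps: %.0f\n",
                fps_1440p_now * rt_uplift);
    return 0;
}
```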
 

Allandor

Member

20-30 fps with shaders, 80-130 fps without shaders, so GeForce 3 shading performance was very poor, yet every game today uses shaders. It was the same with HDR on SM3, tessellation, etc., and in the near future RTX will also become a new standard. I'm guessing the next RTX cards from Nvidia will double ray tracing performance (40 fps at 1440p should be 80 fps then), and thanks to DLSS, 1440p will be upscaled to 4K with very good results.

The problem is that the fabrication steps are nearing a physical limit. There won't be enough shrinks to get it running at 4K, because RT is more or less always brute force. And we only see a few better reflections; no better RT illumination or anything like that.
Doubling the performance of the RT cores won't get you enough performance for 4K if it is already struggling at 1080p.

Also, the first shaders made the world more dynamic. You could really see the visual difference while moving through the world. In BFV there are just better reflections... well, nothing more.
 
I'm running an RTX 2080 Ti, a 9900K, and 16GB of 4000MHz DDR4. It's a bummer I won't hit a 60 fps minimum at 3440x1440 with DXR on. The only hopeful message I've seen is from NVIDIA's official FAQ:

https://forums.geforce.com/default/topic/1080010/geforce-rtx-20-series/battlefield-v-dxr-faq/

Performance Settings in Battlefield V

We recommend in this first release of DXR that “DXR Raytraced Reflections Quality” be set to “Low” due to Battlefield V’s Known Issues when using “Medium”, “High”, and “Ultra” settings. EA, DICE, and NVIDIA will also continue to optimize this implementation and deliver regular updates.

This is also from EA's known issues post.

https://forums.battlefield.com/en-us/discussion/161023/battlefield-vs-known-issues-list

DXR
  • Direct X Raytracing - The BFV 11-14-2018 update, which introduces the first release of DXR ray tracing, shipped with a couple of known bugs. We recommend in this first release of DXR ray tracing that “DXR Raytraced Reflections Quality” be set to “Low” for the best experience. We will continue to work with Nvidia to further enhance your ray tracing experience.
  • “Medium” DXR quality preset setting not applied correctly
      • May cause performance and quality in the “Medium” preset to vary depending on which preset you selected last
      • Status: Fix coming in an upcoming update.
  • DXR performance degraded in maps which feature a lot of foliage
      • This particularly affects War Stories “Liberte” and “Tirailleur”
      • Status: Currently investigating
  • DXR is not automatically enabled for users who can use the tech
      • This is known. We've provided a brief guide on enabling it, which includes downloading the latest BFV update, Windows update, and GPU drivers.
      • Status: Workaround provided here.
 

pawel86ck

Banned
The problem is that the fabrication steps are nearing a physical limit. There won't be enough shrinks to get it running at 4K, because RT is more or less always brute force. And we only see a few better reflections; no better RT illumination or anything like that.
Doubling the performance of the RT cores won't get you enough performance for 4K if it is already struggling at 1080p.

Also, the first shaders made the world more dynamic. You could really see the visual difference while moving through the world. In BFV there are just better reflections... well, nothing more.
In Battlefield, RTX is only used for reflections, but Nvidia's tech demos look like a new generation of graphics.





1440p at around 50 fps on average, and on top of that people suggest RTX performance in BF5 is bugged (for some reason, leaves on the ground hammer ray tracing performance), so people can expect performance improvements. That's not bad for the first game supporting REAL TIME RAY TRACING.
 

SonGoku

Member
Ray tracing isn't rasterized in this case. RTX cards run it on custom "tensor" cores that are optimised for ray tracing. Running it straight up on standard cores will have awful performance.
That goes without saying; no one is claiming otherwise.
 

CuNi

Member
Yes, ray tracing will be huge in the future, and from now on it will only get more and more impressive.

Yes, it's quite a feat that we can run ray tracing next to raster in real time and at 60fps!

No, it should not be included in consumer GPUs.

Clearly it's still very early-adopter tech. It should be in workstation GPUs for developers to experiment with, develop workflows, and implement these features into coming engines, but IMHO it should not be a selling point for consumer GPUs. It just uses up die space that could've been used for more raster power, which right now would probably have had a bigger visual impact than ray tracing has rn. Also, it just unnecessarily blows the price into outer space.

They should have done it like AMD does: wait till you can offer the feature on all cards at normal prices. There's no use for this tech if barely anyone has cards that support it. Small game devs won't support it since it's wasted effort, and tbh currently the biggest share of interesting games comes from indies or AA games, not from the known triple-A developers.
 
It's been quite a while since I followed graphics hardware, and especially ray tracing, but won't ray tracing always require more hardware than rasterization to achieve comparable levels of fidelity?
 

Redneckerz

Those long posts don't cover that red neck boy
Ray tracing isn't rasterized in this case. RTX cards run it on custom "tensor" cores that are optimised for ray tracing. Running it straight up on standard cores will have awful performance.
Almost correct. The Tensor Cores are for AI and deep learning (and enable DLSS). The RT Cores are the specialized ASICs that accelerate traversal of the ray tracing hierarchy (the BVH).

That goes without saying; no one is claiming otherwise.
I am claiming otherwise. ;)

So the RTX 2070 is worthless for ray tracing.
At the moment, yes, but RTX isn't a concept of "instant visual boost now!"; it is a concept of progressive evolution as time goes by. This is a completely new rendering paradigm for consumer cards: having this performance at 1080p is quite acceptable, to say the least. As for the RTX 2070: the buggy results with DXR Low/Medium/High/Ultra should be addressed first.

I think the real stress test will be Atomic Heart. If the RTX 2070 can run that, reflections and all, then RTX is redeemed.

It's been quite a while since I followed graphics hardware, and especially ray tracing, but won't ray tracing always require more hardware than rasterization to achieve comparable levels of fidelity?
Turing does a mixture of both, and as such a lot of silicon is spent on the denoising and ray tracing tech. Turing is a hybrid ray tracing concept where rasterization is intermixed with ray-traced effects throughout the scene; it is not full-scene ray tracing. The fidelity levels are thus still the same, as Turing achieves these visuals with a big portion of the chip dedicated to RT.
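Conceptually, a hybrid frame looks something like this sketch. The function names are hypothetical stand-ins to show the ordering, not a real API:

```cpp
// Structural sketch of one hybrid-rendered frame: most of the image is
// rasterized, and only specific effects (here, reflections) are ray traced.
#include <cstdio>

struct GBuffer    {}; // depth/normals/materials from the raster pass
struct RayResults {}; // per-pixel reflection samples
struct Image      {};

GBuffer    rasterizeGBuffer()                              { return {}; } // classic raster pass
RayResults traceReflectionRays(const GBuffer&)             { return {}; } // DXR-style ray pass
RayResults denoise(const RayResults&)                      { return {}; } // compute/tensor denoise
Image      shadeAndComposite(const GBuffer&, const RayResults&) { return {}; }

int main() {
    GBuffer    gbuf  = rasterizeGBuffer();
    RayResults rays  = traceReflectionRays(gbuf); // few rays/pixel, hence noisy
    RayResults clean = denoise(rays);             // reconstruction fills the gaps
    Image      frame = shadeAndComposite(gbuf, clean);
    (void)frame;
    std::puts("frame composited (raster + ray-traced reflections)");
    return 0;
}
```

The key design point is that the ray pass only has a budget of a few rays per pixel, which is why the denoise step matters as much as the tracing itself.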
 

SonGoku

Member
I am claiming otherwise. ;)
Sorry, my bad; it went right over my head that he was talking about tensor cores, I grouped it all together as specialized hw. I'm aware RT runs on the RT cores and the tensor cores are used for denoising.

What I meant is that no one is taking away from the advantage of specialized hw running specific code, just that it's very much a 1st-gen implementation and we still have a ways to go before RT is commonplace (hybrid or otherwise). I expect RT will be ready for the PS6, perhaps with some earlier implementation for mid-gen refreshes.
 

ShirAhava

Plays with kids toys, in the adult gaming world
Where's the HDMI 2.1 support? I would've paid a premium for that! Ray tracing doesn't impress me at all.
 

Redneckerz

Those long posts don't cover that red neck boy
Sorry, my bad; it went right over my head that he was talking about tensor cores, I grouped it all together as specialized hw. I'm aware RT runs on the RT cores and the tensor cores are used for denoising.
You aren't entirely incorrect, though: with Volta hardware, when the Star Wars demo was shown, it was run through CUDA and Tensor Cores. I am not sure, but I reckon the latter were used as semi-RT cores of some sort.

What I meant is that no one is taking away from the advantage of specialized hw running specific code, just that it's very much a 1st-gen implementation and we still have a ways to go before RT is commonplace (hybrid or otherwise). I expect RT will be ready for the PS6, perhaps with some earlier implementation for mid-gen refreshes.
Totally. We are just halfway now to full-scene ray tracing, where it would effectively replace rasterization. The hybrid approach is the one Nvidia is focusing on now, and who knows, perhaps they'll attain consistent real-life imagery through it.
 

godhandiscen

There are millions of whiny 5-year olds on Earth, and I AM THEIR KING.
This is why I moved away from the bleeding edge on PC. PC gaming is great when it is stable, but the bleeding edge seldom guarantees a better experience.
 