
AMD David Wang: We Won't Implement DirectX Raytracing (DXR) In Games Until It's Offered In All (Gaming) Product Ranges.

Hoping this also means it's included with ~2020 consoles. Maybe adoption will take a while, but an entire 5-7 year console generation missing out on any form of it while it was taking off on PCs would be a big miss.

I think a lack of raytracing and such would be a good baseline console. Let people who want to pay for that do so in the mid-gen refresh, when the tech isn't so bleeding edge.
 

SonGoku

Member
are not pursuing a hybrid raytracing pipeline, or at least announcing anything, is worrying.
I'd argue this is the perfect opportunity for AMD not to do that.
By focusing on raster performance they can catch up to and even surpass Nvidia in that department, since they don't have to carry dead weight around (AI/RTX cores). That's what Navi is for.

Since consoles dictate AAA development, raytracing adoption will be slow and reserved as a niche Nvidia GameWorks feature.
I think a lack of raytracing and such would be a good baseline console. Let people who want to pay for that do so in the mid-gen refresh, when the tech isn't so bleeding edge.
Exactly, there's no need to sacrifice the raster performance of base consoles for very rudimentary raytracing; a mid-gen refresh is better suited for that.
 
Last edited:
I'd argue this is the perfect opportunity for AMD not to do that.
By focusing on raster performance they can catch up to and even surpass Nvidia in that department, since they don't have to carry dead weight around (AI/RTX cores). That's what Navi is for.

Since consoles dictate AAA development, raytracing adoption will be slow and reserved as a niche Nvidia GameWorks feature.

Exactly, there's no need to sacrifice the raster performance of base consoles for very rudimentary raytracing; a mid-gen refresh is better suited for that.

Well, I take it this way: for video games, hybrid RT offers very poor results for the overhead needed. As a 3D designer, I absolutely need hybrid RT pipelines for optimisation and shortcuts, and that part of the crowd also leads the market.

But for gaming you're right: it's not a priority and it doesn't work well enough. And since almost anything from Nvidia is mostly vaporware that ends up being unused...
 

LordOfChaos

Member
I think a lack of raytracing and such would be a good baseline console. Let people who want to pay for that do so in the mid-gen refresh, when the tech isn't so bleeding edge.

The problem with only supporting it in a mid-gen refresh is how many titles that extra effort actually gets applied to. It would be more effort than the PS4/PS4 Pro divide, which is largely the same hardware with more power.

I'm not suggesting consoles at console prices by 2020 will even have RTX 2080 levels of ray-tracing and inferencing performance, but supporting the feature in hardware to any degree would still allow more effects using it from the outset than we already have in games (several games already ray trace some minor things on non-accelerated hardware), while it could be more easily scaled up with a mid-gen refresh.


Budget is not some magical currency that transforms into ideas, conceptions, planning, research and engineering the more you pour in. I think AMD did a way better job with less budget for years, until they completely abandoned the competition two years ago, which would have been okay if they had been waiting for RTX in order to react...

The fact that they still don't have a card with 1080 Ti performance and are not pursuing a hybrid raytracing pipeline, or at least announcing anything, is worrying.

It's not magic, but it can certainly attract top engineering talent.

See: Apple's multi-year lead in personal assistants evaporated by Google outspending it on deep learning and grabbing all the top talent, Tesla (briefly) snagging LLVM creator Chris Lattner for their deep learning project, etc.

In AMD's case it seems like they can only really be competitive in CPUs or GPUs at any one time, never both in alignment. Ryzen 2 on 7nm looks like it could really take Core head to head rather than just being almost as good and cheaper, while at the same time their GPU side faltered. Meanwhile, their GPU side carried them through the Bulldozer years.
 
Last edited:
The problem with only supporting it in a mid-gen refresh is how many titles that extra effort actually gets applied to. It would be more effort than the PS4/PS4 Pro divide, which is largely the same hardware with more power.

I'm not suggesting consoles at console prices by 2020 will even have RTX 2080 levels of ray-tracing and inferencing performance, but supporting the feature in hardware to any degree would still allow more effects using it from the outset than we already have in games (several games already ray trace some minor things on non-accelerated hardware), while it could be more easily scaled up with a mid-gen refresh.




It's not magic, but it can certainly attract top engineering talent.

See: Apple's multi-year lead in personal assistants evaporated by Google outspending it on deep learning and grabbing all the top talent, Tesla (briefly) snagging LLVM creator Chris Lattner for their deep learning project, etc.

In AMD's case it seems like they can only really be competitive in CPUs or GPUs at any one time, never both in alignment. Ryzen 2 on 7nm looks like it could really take Core head to head rather than just being almost as good and cheaper, while at the same time their GPU side faltered. Meanwhile, their GPU side carried them through the Bulldozer years.

The problem with that is when they come out. If it's late 2020, raytracing might not be a terrible idea.

If it's next year OTOH...
 

jonnyp

Member
If there is one constant in life, as sure as the Sun rising every day, it's people making excuses for AMD and its incompetence with graphics cards. Whether you like it or not, ray tracing is the future. It's pathetic that AMD is using an excuse to cover up its failure to keep up with Nvidia.

It's at least two GPU generations away from cards being powerful enough to use it properly in real time in games. And RT is only for games going for realistic graphics; I don't see how ray tracing helps in heavily stylized or cel-shaded games, for instance. So no, AMD is on the ball here.
 

LordOfChaos

Member
It's at least two GPU generations away from cards being powerful enough to use it properly in real time in games. And RT is only for games going for realistic graphics; I don't see how ray tracing helps in heavily stylized or cel-shaded games, for instance. So no, AMD is on the ball here.

Fortnite uses ray tracing to make shadows look better. It's certainly not only for realistic-looking games. Mild applications of it can help fix up the flaws in raster graphics in a lot of ways, which, as I said, is why I'm hoping for at least some acceleration of it in the next-gen consoles; even if it's way short of the 2080 by then, some acceleration > none.




 
Last edited:

SonGoku

Member
The problem with only supporting it in a mid-gen refresh is how many titles that extra effort actually gets applied to. It would be more effort than the PS4/PS4 Pro divide, which is largely the same hardware with more power.
Depends on whether all the major engines support it; it could be just a toggle to turn on for the mid-gen refreshes, and could potentially be simpler to implement than CB.
Fortnite uses ray tracing to make shadows look better. It's certainly not only for realistic-looking games. Mild applications of it can help fix up the flaws in raster graphics in a lot of ways, which, as I said, is why I'm hoping for at least some acceleration of it in the next-gen consoles; even if it's way short of the 2080 by then, some acceleration > none.





The problem with specialized hardware in base consoles is that it will eat into the silicon budget, sacrificing raster performance.

If the choice is a 10 TF hybrid system vs. a 14 TF pure-raster one, I'd take the latter any day; it's just not worth the performance trade-off. Base consoles should be as strong as possible, and mid-gen refreshes are better suited for RTX, not to mention it will be more mature by then.
 
Last edited:

jonnyp

Member
Fortnite uses ray tracing to make shadows look better. It's certainly not only for realistic-looking games. Mild applications of it can help fix up the flaws in raster graphics in a lot of ways, which, as I said, is why I'm hoping for at least some acceleration of it in the next-gen consoles; even if it's way short of the 2080 by then, some acceleration > none.






Ok, but are you gonna waste valuable die space to make shadows look better in games? I know it can also make reflections look better, but I feel that's a huge waste personally. Go all out on rasterization until GPUs can do that Star Wars demo in real-time gameplay at 4K; that's my view of the situation.
 
I member when ATI/AMD used to be the first to release a new feature, and the market pretended it didn't exist until Intel/Nvidia followed suit.

Eat some of your own medicine, Nvidia. Eat it hard.
 

RoboFu

One of the green rats
I think this is misleading. Any GPU with compute can take advantage of DirectX Raytracing. Maybe he means dedicated hardware?
 

LordOfChaos

Member
Depends on whether all the major engines support it; it could be just a toggle to turn on for the mid-gen refreshes, and could potentially be simpler to implement than CB.

The problem with specialized hardware in base consoles is that it will eat into the silicon budget, sacrificing raster performance.

If the choice is a 10 TF hybrid system vs. a 14 TF pure-raster one, I'd take the latter any day; it's just not worth the performance trade-off. Base consoles should be as strong as possible, and mid-gen refreshes are better suited for RTX, not to mention it will be more mature by then.


That's a fair argument, and I would agree if the impact were that large. But with dedicated hardware and inferencing being so much faster than generalized hardware, my question would be how little die area can be spent while still getting a significant speedup over no hardware acceleration for ray tracing.

Given that some games already use it to a degree on non-ray-tracing hardware, what could they do with 10x more for "free", for a small change in die area, etc.?

I'm thinking of something like the early tessellation units: they were kind of crappy and couldn't do world-scale tessellation, but you could still do something with them, and the 7th gen didn't miss out on the feature for the years it was blooming on PC, even if it had to be scaled way back and tailored to more localized zones rather than the whole visible world.
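To put that "how little die area is enough" question in rough numbers, here's a minimal Amdahl's-law-style sketch; the 40% RT share of the frame and the per-unit speedups are made-up assumptions for illustration, not measurements from any real GPU:

```python
# Rough Amdahl's-law-style estimate: only the ray-tracing share of a hybrid
# frame gets faster when dedicated RT units are added. All numbers here are
# illustrative assumptions, not benchmarks.

def frame_speedup(rt_fraction: float, rt_hw_speedup: float) -> float:
    """Whole-frame speedup when only the RT portion is accelerated."""
    return 1.0 / ((1.0 - rt_fraction) + rt_fraction / rt_hw_speedup)

# Assume RT work takes 40% of a hybrid frame when run on generic shaders.
for hw_speedup in (2, 5, 10):
    print(f"{hw_speedup:>2}x faster RT units -> "
          f"{frame_speedup(0.4, hw_speedup):.2f}x whole frame")
```

Under those assumptions even a small, relatively slow RT block gives most of the practical win (1.25x at 2x, 1.56x at 10x), since past a point the raster part of the frame dominates anyway.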
 

Dontero

Banned
Hoping this also means it's included with ~2020 consoles. Maybe adoption will take a while, but an entire 5-7 year console generation missing out on any form of it while it was taking off on PCs would be a big miss.

Here is the answer for you:
Not going to happen. In BF5 it cuts fps from 160 to barely 60 at 1080p, and those are averages, not minimums.

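For context, converting those quoted averages into frame times (simple arithmetic on the figures above, nothing measured here):

```python
# Convert the quoted BF5 averages into the per-frame cost of the hybrid RT pass.
fps_rt_off = 160   # quoted 1080p average with DXR off
fps_rt_on = 60     # quoted 1080p average with DXR on

ms_off = 1000 / fps_rt_off   # ~6.3 ms per frame
ms_on = 1000 / fps_rt_on     # ~16.7 ms per frame

print(f"DXR off: {ms_off:.1f} ms/frame, DXR on: {ms_on:.1f} ms/frame")
print(f"Hybrid RT pass costs roughly {ms_on - ms_off:.1f} ms per frame")
```

That works out to roughly 10 ms of extra work per frame, more than an entire 144 Hz frame budget (~6.9 ms) spent on the RT effects alone.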
 
Last edited:

Woo-Fu

Banned
Sad to see AMD continue to fall farther behind in GPU technology.

They still haven't matched the 1080 Ti (which is a previous-gen card now).

I feel just the opposite. AMD needs to focus on the low to mid-range and move volume in markets Nvidia isn't addressing and/or is pricing itself out of with its predatory pricing. I'm not going to pay Nvidia $500 for a mid-range GPU, and thanks to AMD I won't have to.
 
Last edited:

LordOfChaos

Member
Here is the answer for you:
Not going to happen. In BF5 it cuts fps from 160 to barely 60 at 1080p, and those are averages, not minimums.



Yikes. Fair enough.

I wonder if this is going to be like NetBurst, with Nvidia focusing on the wrong thing and allowing AMD some breathing room.
 

Whitecrow

Banned
If some mad people hadn't done weird experiments in the past, the life we know right now wouldn't exist.

Nvidia's bet on ray tracing had to happen at some point in time, and I think we will be grateful some years from now.
Meanwhile, if AMD sleeps and stays 100% full on rasterization, it's bread for today, hunger for tomorrow.
 
Last edited:

SonGoku

Member
That's a fair argument, and I would agree if the impact were that large. But with dedicated hardware and inferencing being so much faster than generalized hardware, my question would be how little die area can be spent while still getting a significant speedup over no hardware acceleration for ray tracing.

Given that some games already use it to a degree on non-ray-tracing hardware, what could they do with 10x more for "free", for a small change in die area, etc.?

I'm thinking of something like the early tessellation units: they were kind of crappy and couldn't do world-scale tessellation, but you could still do something with them, and the 7th gen didn't miss out on the feature for the years it was blooming on PC, even if it had to be scaled way back and tailored to more localized zones rather than the whole visible world.
The question then is how much die space and how much of a speedup, all things considered, and is this applicable to most games or only to certain design decisions?

If the die area spent on specialized hardware sacrifices any more than 1 TF of raster performance, it won't be worth it to get what amounts to better-looking shadows and reflections. That's another issue with current hybrid raytracing: its applications are very limited.
 
Last edited:

SonGoku

Member
If some mad people hadn't done weird experiments in the past, the life we know right now wouldn't exist.

Nvidia's bet on ray tracing had to happen at some point in time, and I think we will be grateful some years from now.
Meanwhile, if AMD sleeps and stays 100% full on rasterization, it's bread for today, hunger for tomorrow.
Of course we have to start from the bottom up. I think the main takeaway is that raytracing is not ready for consoles, and AMD is better off focusing its limited R&D budget on releasing pure raster cards while it works on developing raytracing over the next decade; this might even give them a chance to surpass Nvidia's raster performance.
 
Last edited:

JohnnyFootball

GerAlt-Right. Ciriously.
Focus on rasterization. If Navi offers a nice performance increase over the 2000 series in non-RT apps, it will be a resounding success.
 

jonnyp

Member


Good video explaining the current problem with ray tracing. Would you trade that kind of performance drop for better reflections or shadows? Imagine how much faster the 2080 series would've been with a pure rasterization design and the same die size.
 


Good video explaining the current problem with ray tracing. Would you trade that kind of performance drop for better reflections or shadows? Imagine how much faster the 2080 series would've been with a pure rasterization design and the same die size.


Year 2018: Nvidia provides a $1200 card for 1080p@60fps gaming. I guess the R in RTX is for Revolutionary. And no, RTX does not make the picture lifelike at all. Just look at that car in the snapshot above: I do not see a war there, I see a car museum with somebody smashing a Molotov next to a lacquered exhibit. People, just look at the US Marine helmet-camera footage below; there is no Nvidia RTX in real warfare, so stop calling this gimmick lifelike.
(safe, no blood)
 

Pagusas

Elden Member
Year 2018: Nvidia provides a $1200 card for 1080p@60fps gaming. I guess the R in RTX is for Revolutionary. And no, RTX does not make the picture lifelike at all. Just look at that car in the snapshot above: I do not see a war there, I see a car museum with somebody smashing a Molotov next to a lacquered exhibit. People, just look at the US Marine helmet-camera footage below; there is no Nvidia RTX in real warfare, so stop calling this gimmick lifelike.
(safe, no blood)



You are not this dense. It's an early use of the technology; it's a first-generation release of hardware and software. Get off your idiotic soapbox. Everything has to start somewhere; the tools and hardware will be refined and we will get some incredible things from it, it just takes time. If you want even a taste of this tech right now, you'll be paying for it.
 

llien

Member
Meanwhile, if AMD sleeps and stays 100% full on rasterization, it's bread for today, hunger for tomorrow.
Radeon Rays - it's quite a bit older than RTX.

I doubt AMD would sit and do nothing.

Cementing its monopoly by carving out an RTX market and building on it is the most likely plan Nvidia has, and it should be quite clear to AMD too.
 
Last edited:
Get off your idiotic soapbox. Everything has to start somewhere; the tools and hardware will be refined and we will get some incredible things from it, it just takes time. If you want even a taste of this tech right now, you'll be paying for it.
Nvidia won't hire you as their PR/marketing manager.

I'll give you an example of a real lifelike tech inserted into gaming: the rag-doll physics presented in Hitman (the very first one). And it did not require a new $1200 card. Reflections and dynamic shadows were introduced, I don't know for sure, I guess in the early 2000s. This RTX is a marketing gimmick to justify an enormous price tag over the 10xx gen, an evolution of their GameWorks poop which did not run well even on the latest Ti and Titan cards. As llien said above, Radeon Rays is older than RTX, but AMD's marketing department is not crazy enough to make a $1200 card for full-HD gaming over it.
 

Mahadev

Member
Good; raytracing is just a very expensive gimmick. It should only be widely implemented when it comes at little cost, which won't happen any time soon.
 

Allandor

Member
Well, it's just reflections so far (and not really realistic ones … it's too reflective. Never saw such reflective mud). Illumination via RT is still missing. So at 7nm the RTX cards finally get to a … well, better size; maybe the RT cores get bigger and 1080p might become an option with all the hybrid RT "effects", but not 4K. At 5nm, well, not much more will happen.
RT at this state of fabrication is just too late for gaming, or the chips will have to get really, really big (and even more expensive) in the future.
 
Maybe somebody should implement different rendering resolutions for the main picture and for the rays used for reflections and shadows/ambient occlusion. Render the picture at 4K, then apply rays at 2K. Reflections and shadows do not have to have the same sharpness. Just look at the reflections and shadows around you: they are never sharp unless it's a mirror (and a well-polished one) or hard artificial lighting. Reflections in puddles and wet dirt after rain also do not look like they're made of Nvidia mercury with lacquer sprayed on top.

If somebody is going to implement it, I guess it will be AMD (call it Radeon Rays 2.0 or something). And they can utilize unused cores from Ryzen CPUs, e.g. a combination of a 16-core Threadripper, which is $900, and a Vega "128", which will hopefully be around $599. AMD did the Mantle renderer, which led to DX12 and Vulkan. What did Nvidia do? An expensive video card. AMD did FreeSync, which is now even part of the HDMI 2.1 standard. What did Nvidia do? An expensive monitor expansion board (G-Sync). Well, AMD did competitive 8/16/32/64-core CPUs. What did Nvidia do? A Nintendo Switch SoC. Alright, that one wasn't too bad.
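A quick sketch of what the half-resolution-rays idea above amounts to in raw ray counts; the resolutions and the one-ray-per-pixel-per-effect assumption are purely illustrative, not taken from any actual renderer:

```python
# Ray-count math for tracing reflections/shadows at a lower resolution than
# the main 4K image, per the suggestion above. Assumes one ray per pixel per
# effect, which is a simplification.

def rays_per_frame(width: int, height: int, rays_per_pixel: int = 1) -> int:
    return width * height * rays_per_pixel

full_res = rays_per_frame(3840, 2160)   # trace at native 4K
half_res = rays_per_frame(1920, 1080)   # trace at 1080p, then upscale/filter

print(f"4K rays per effect:    {full_res:,}")
print(f"1080p rays per effect: {half_res:,}")
print(f"Reduction: {full_res // half_res}x fewer rays")
```

Tracing at a quarter of the pixel count cuts the per-effect ray budget by 4x, before whatever the upsampling and denoising of the result costs.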
 
Last edited:

dirthead

Banned
The true problem with ray tracing is that it's so trivial to fake effects that look like shadows and reflections that the average person will simply not give a shit or care. Like in those Battlefield demos, if you just added some extra fake environment maps it would basically look the same to most people.

We really have hit diminishing returns with graphics. The only way you can sell this is if you design an entire game around a mechanic that's reliant on real-time reflections or something, which wouldn't even necessarily be fun to play.
 

Dontero

Banned
Yeah, that is the current argument. Rasterization got so good that most of the effects of raytracing are more or less emulated already. They are not emulated in a correct way, but the layman, aka 99.99999% of gamers, will not notice the difference.

When the gaming industry moved to PBR lighting, it basically killed the biggest raytracing weapon: correctly showing material structure via different bounces off those different surfaces. The other big one was the introduction of screen-space reflections, which are buggy, but the layman will hardly find problems with them.

Obviously raytracing is easier and faster than faking stuff, because you don't need to bake and create those sets, but unless it gets to great FPS AND provides a noticeable difference, real-time raytracing is pointless.

Raytracing right now is where proper GI is. It is easier and cheaper to use proper GI and raytracing, but we got so good at faking that stuff that neither proper GI nor raytracing has enough oomph to justify the FPS downgrade.

What raytracing could do, though, is push us toward the softness of movies in games. We try to emulate that with chromatic aberration, motion blur, etc., but those fundamentally can't reproduce the effect of light pollution/distortion on materials; they instead try to emulate movie lens problems rather than the real-life human eye.
 

Shai-Tan

Banned
Well, I think real-time shadows/lighting/reflections do make games look significantly better, but there is a huge performance gulf standing in the way between what you might call fake/baked effects and more simulated effects. The same thing happened with PhysX: it can only simulate a very limited number of bodies, and it looks terrible compared to faked or offline physical simulations. Ray-traced shadows/lighting/reflections in a hybrid renderer are more computationally tenable than (even simplified) physics simulations, but getting the performance up is going to require a lot of hacks and workarounds, and it's still not going to be photorealistic in real time.
 