> Probably means the end of rt cores and tensor cores in the next series.

Seems highly unlikely.
> Seems highly unlikely.

If I wanted the opinion of a corporate slave i'd ask you hon.
man i fucking hate team green, such fucking bullshit this
hairworks
gsync
ray tracing
and any other tech they keep in their pocket.
i know why, but in the end us gamers lose.
oh well
> tbh they deserve to keep gsync in their pocket, it's a miracle technology that sufficiently obliterates screen tearing without any added stutter
> although there are alternatives now, scan sync via RTSS is in many ways superior to gsync / free / fast

agreed, but let's not add the + price to f over consumers
> true

if only we could send the message as gamers.
> Looks like leather jacket at NVHQ has heard grumblings of the coming consoles doing RT-RT?

my problem is that it is mostly salty gamers that cry their (new) card got old.
The entire RTX range does have the stench of poorly binned enterprise parts being flogged to gamers.
> Probably means the end of rt cores and tensor cores in the next series.

It will probably strengthen the case for it actually.
Probably means the end of rt cores and tensor cores in the next series.
Can't wait to see the benchmarks!
Currently 1080 Ti runs Metro RTX at 18 FPS at 1440p
Maybe Big Pascal can reach 30 FPS at 1080p in this game.
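That 1080p hope is at least arithmetically plausible. A back-of-the-envelope check (this assumes frame rate scales linearly with pixel count, which real ray-traced workloads only roughly follow):

```python
# Back-of-the-envelope only: assumes FPS scales linearly with pixel
# count, which real ray-traced workloads only roughly follow.
px_1440p = 2560 * 1440   # 3,686,400 pixels
px_1080p = 1920 * 1080   # 2,073,600 pixels
fps_1440p = 18           # reported Metro RTX result on a 1080 Ti

fps_1080p = fps_1440p * px_1440p / px_1080p
print(round(fps_1080p))  # 32
```

so ~32 FPS in the ideal case, before accounting for costs that don't shrink with resolution, like BVH traversal.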
tbh they deserve to keep gsync in their pocket, it's a miracle technology that sufficiently obliterates screen tearing without any added stutter
although there are alternatives now, scan sync via RTSS is in many ways superior to gsync / free / fast
> G-Sync... ...it's a miracle technology...

That was using exactly the same connection as FreeSync when run on notebooks.
> If I wanted the opinion of a corporate slave i'd ask you hon.

Just use the ignore function. I haven't seen corporate shill threads in quite a while.
> The power of that CryEngine RT demo running on Vega 56, hum?

Keep in mind that there are a lot of limitations in that CryEngine demo. There are plenty of objects which do not reflect at all, and some of the reflections exhibit artefacts. It also only seems to showcase mirror reflections rather than glossy reflections. It's not as robust or complete as BFV's implementation, basically.
Just when NVidia thought they'd made everyone think RTX 2080Ti GIGARAYS was the only way to get ray-traced games.
Nice try; at least they managed to push it, and I love the few applications we got so far.
Ray tracing is going to get cheaper and cheaper, it seems.
I vote for next-gen consoles having it.
> Pascal remains by far the best generation of graphics card produced by Nvidia.

have fun with your raytracing then. you think it's bad on RTX cards? just wait until you play on a pascal card.
> Wasn't RT the only unique thing about the RTX cards? What's the difference between a 1080Ti and RTX 2080 if both support RT?

no.
> Keep in mind that there are a lot of limitations in that CryEngine demo. There are plenty of objects which do not reflect at all, and some of the reflections exhibit artefacts. It also only seems to showcase mirror reflections rather than glossy reflections. It's not as robust or complete as BFV's implementation, basically.

It's true and I agree. However, I think it's 'good enough' and fast enough to work well as a solution. It would be a much better and more viable solution than something like SSR right now, perhaps.
> The Crytek RT video from the other day gives me hope that cards without dedicated hardware can have good results.

nVidia did a good job convincing people that tensor cores are somehow dedicated hardware for ray tracing.
> nVidia did a good job convincing people that tensor cores are somehow dedicated hardware for ray tracing.

The tensor core isn't for ray tracing, but the RT core is absolutely dedicated hardware for ray tracing. What the heck else would it be?
RT cores are dedicated hardware for ray-tracing. Cards without RT cores (or similar) are using software for ray-tracing. People who were convinced you needed tensor or RT cores for ray-tracing weren't paying attention.
> The tensor core isn't for ray tracing, but the RT core is absolutely dedicated hardware for ray tracing. What the heck else would it be?

The RT cores are ASICs on the Tensor core. But to somehow classify RTX as the hardware solution, and the rest as 'software', is borderline dishonest, especially when nVidia's own earlier GPUs offer hardware acceleration for ray tracing, including in the professional space.
> The RT cores are ASICs on the Tensor core.

They are unrelated.
> But to somehow classify RTX as the hardware solution, and the rest as 'software', is borderline dishonest, especially when nVidia's own earlier GPUs offer hardware acceleration for ray tracing, including in the professional space.

RTX is hardware-accelerated raytracing. Clearly not everything is carried in hw -- shaders, for instance, will always remain as 'general-purpose' code on the cuda cores.
> The lines are being blurred here. Because generally, the whole definition of hardware acceleration is: "the use of computer hardware specially made to perform some functions more efficiently than is possible in software running on a general-purpose CPU."
> https://en.wikipedia.org/wiki/Hardware_acceleration

Replace 'CPU' with 'GPU' and this is very much what RTX does -- it hw-accelerates the parts of the raytracing pipeline which are carried as general-purpose code on other GPUs.
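To make the hardware-vs-software distinction concrete: the core operation RT cores fix in silicon is ray/triangle (and ray/BVH) intersection, which GPUs without them run as ordinary general-purpose code. A minimal plain-Python sketch of the software version, using the standard Möller–Trumbore test (illustrative only, not any vendor's actual code):

```python
# Ray/triangle intersection (Möller–Trumbore) -- the kind of inner loop
# that RT cores implement in fixed-function hardware and that other
# GPUs execute as general-purpose shader code.

def sub(a, b):   return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def dot(a, b):   return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
def cross(a, b): return (a[1] * b[2] - a[2] * b[1],
                         a[2] * b[0] - a[0] * b[2],
                         a[0] * b[1] - a[1] * b[0])

def intersect(orig, direction, v0, v1, v2, eps=1e-8):
    """Return distance t along the ray to the triangle, or None on miss."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < eps:                 # ray parallel to triangle plane
        return None
    inv_det = 1.0 / det
    tvec = sub(orig, v0)
    u = dot(tvec, p) * inv_det         # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(tvec, e1)
    v = dot(direction, q) * inv_det    # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv_det           # hit distance along the ray
    return t if t > eps else None

# A ray from (0,0,-1) along +z hits this triangle in the z=0 plane at t=1.
print(intersect((0, 0, -1), (0, 0, 1), (-1, -1, 0), (1, -1, 0), (0, 1, 0)))
```

An RT core does essentially this (plus BVH traversal) per ray in dedicated units; a "software" implementation issues the same arithmetic as regular shader instructions.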
> But now we're at the point where if a dedicated ASIC is not present, we start seeing it as a software implementation...?

That's the very definition of 'software implementation'.
> Wasn't RT the only unique thing about the RTX cards? What's the difference between a 1080Ti and RTX 2080 if both support RT?

For me the rasterization differences were at least as interesting as the additions of RT/Tensor cores.
Can you elaborate what scan sync is? Never heard of it.
> scanline sync makes tearing disappear while keeping frame times consistent, and input lag is minimal
> it's probably my favorite driver level function ever, every pc owner should be using it

I tried reading the tutorial but it's not easy to understand. Does it replace gsync/freesync? Is there an easier guide than the one on blurbusters?
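short version: it doesn't change the monitor like gsync/freesync do, it just times the buffer flip so the tear line sits at a fixed spot (ideally off-screen in the blanking interval). A conceptual sketch of the timing math, with made-up signal numbers -- this is not RTSS's actual implementation:

```python
# Conceptual sketch of scanline-sync timing -- NOT RTSS's real code.
# Assumed signal: 60 Hz with 1125 total lines per frame (1080 visible
# plus blanking), so each scanline takes 1 / (60 * 1125) seconds.
REFRESH_HZ = 60
TOTAL_LINES = 1125
LINE_TIME = 1.0 / (REFRESH_HZ * TOTAL_LINES)

def flip_delay(target_line, current_line):
    """Seconds to hold the buffer flip so any tear lands on target_line."""
    lines_to_go = (target_line - current_line) % TOTAL_LINES
    return lines_to_go * LINE_TIME

# e.g. raster currently at line 100, tear parked near the blanking
# region at line 1100:
print(f"{flip_delay(1100, 100) * 1000:.2f} ms")
```

because the tear always lands in the same place every frame, it stops being visible, without the variable-refresh hardware gsync/freesync need.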