So an Nvidia card is better prepared for the future...
> So an Nvidia card is better prepared for the future...

With 10GB? Nope.
> Basically, the higher the raytracing settings, the lower the difference between the two, but if you want low raytracing then the 3080 is your card.

The other way around. The simpler the RT, the closer AMD is to competitive performance. As complexity increases, RTX pulls ahead.
lol that’s a weird sell. The king of crappy raytracing!!
> With 10GB? Nope.

How long until the dreaded time when 10 GB will not be enough? We're almost 4 months into Ampere's life. No issues so far. Nothing on the horizon this year that should pose any trouble. When should we expect those troubles to come?
Basically if you care about RT, go with NVIDIA and if you don’t care about RT, go with AMD.
Hoping to see some major improvements in RDNA 3 though.
> Nothing on the horizon this year that should pose any troubles.

Can I have your crystal ball?
> The other way around. The simpler the RT, the closer AMD is to competitive performance. As complexity increases, RTX pulls ahead.

Not quite. After a certain point the difference starts shrinking, which is the opposite of what we'd expect. Watch the video again.
> DF totally-not-shills "deep dive", no thanks.
> Wake me up if AnandTech bothers.

Cry moar?
> Basically if you care about RT, go with NVIDIA, and if you don't care about RT, go with AMD.
> Hoping to see some major improvements in RDNA 3 though.

Weird, because the nVidia card is cheaper and has better rasterization for those who don't care about RT.
Basically if you care about RT, go with NVIDIA and if you don’t care about RT, go with AMD.
Hoping to see some major improvements in RDNA 3 though.
> Here I am with my 6800 XT Nitro. I had an opportunity to get a new card and don't regret it.

Heard that's a really good card too.
> It does say in the image it uses the CUs. The CUs may have "ray accelerators", but it still means that in the overall scene, RT performance is cannibalized from shader performance by having the CUs do raytracing instead of shader work.

This is "Zen3 is RISC" level of BS.
> Why is it BS? It is written in your AMD screenshot that "traversal of the BVH and shading of ray results is handled by shader code running on the Compute Units". Am I wrong?

Because I doubt you, or most users hyping the shit out of NV tech, understand what it means.
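The contention point being argued above can be sketched with a toy throughput model (all numbers here are made-up assumptions for illustration, not measured figures): if BVH traversal runs as shader code on the Compute Units, ray work and shading compete for the same hardware; with fully dedicated RT units the two could overlap.

```python
# Toy model (hypothetical milliseconds, not benchmarks) of the
# "RT cannibalizes shader performance" argument.

def frame_time_shared(shade_ms: float, rt_ms: float) -> float:
    # CUs handle both jobs, so ray work adds directly to frame time.
    return shade_ms + rt_ms

def frame_time_dedicated(shade_ms: float, rt_ms: float) -> float:
    # Idealized dedicated RT hardware overlaps with shading;
    # the longer of the two workloads dominates.
    return max(shade_ms, rt_ms)

if __name__ == "__main__":
    shade, rt = 10.0, 6.0  # assumed per-frame workloads
    print(frame_time_shared(shade, rt))     # 16.0
    print(frame_time_dedicated(shade, rt))  # 10.0
```

This is deliberately oversimplified (real GPUs overlap work in both designs), but it shows why sharing the CUs puts an upper bound on combined RT + shading throughput.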
> AMD manages to hide from us for 2 months

Welp, NV managed to hide effects unseen before DXR for more than 2 years, so I'd give AMD some time too.
> Weird because the nVidia card is cheaper and has better rasterization

This is true. NVIDIA's offerings have historically been cheaper than AMD offerings in my country as well.
> With 10GB? Nope.

Why not?
> AMD, on the other side, wasted their additional VRAM capacity.

Be careful: a 3060 Ti with 12GB VRAM is inbound, while the 3080 is MIA.
> While we're on the topic of Nvidia cards: 3060 Ti or 3070?
> For 1080p, 1440p, and occasionally 4K (if the game is really well optimized, like Doom Eternal. Fuck me, this game can run at 4K/60 on a goddamn toaster).

If your aim is to get 2x the perf (e.g. 30fps -> 60fps) of the next-gen base console (i.e. Series S), then I suspect even a 3060 Ti would be enough. An RTX 3070 will allow you to enable some higher-quality non-RT effects/increase draw distance on top while also delivering 2x the perf.
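For context on the "2x the perf" target above, frame rate translates directly into a per-frame time budget, so doubling fps halves the time the GPU has for each frame:

```python
# Per-frame time budget at a given frame rate.

def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

print(round(frame_budget_ms(30), 1))   # 33.3 ms per frame
print(round(frame_budget_ms(60), 1))   # 16.7 ms per frame
print(round(frame_budget_ms(120), 1))  # 8.3 ms per frame
```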
> How long until the dreaded time when 10 gigs will not be enough? We're almost 4 months into Ampere's life. No issues so far. Nothing on the horizon this year that should pose any troubles. When should we expect those troubles to come?

They also overlooked the fact that the 3080 has huge bandwidth at 760 GB/s and RTX IO support. In fact it will age better than AMD cards, I think.
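The 760 GB/s figure follows directly from the 3080's memory specs (320-bit GDDR6X bus at 19 Gbps per pin), and the 6800 XT's lower figure comes the same way (256-bit GDDR6 at 16 Gbps):

```python
# Peak memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps,
# giving GB/s.

def peak_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits / 8 * gbps_per_pin

print(peak_bandwidth_gbs(320, 19.0))  # 760.0 GB/s (RTX 3080, GDDR6X)
print(peak_bandwidth_gbs(256, 16.0))  # 512.0 GB/s (RX 6800 XT, GDDR6)
```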
> They also overlooked the fact that the 3080 has huge bandwidth at 760 GB/s and RTX IO support.

Oh boy, see, is it just a coincidence that most green tech is surrounded by smoke and mirrors?
> They also overlooked the fact that the 3080 has huge bandwidth at 760 GB/s and RTX IO support. In fact it will age better than AMD cards, I think.

Well... I don't think AMD needs that much bandwidth. Many bandwidth-intensive things are "caught" by the big cache. It will be really interesting to see in future benchmarks. But it is like with any GPU generation: don't buy GPUs for the future, buy them for the games you want to play currently.
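The "big cache" argument can be put into numbers with a simple blended-bandwidth model: cache hits are served at on-die speed, misses fall back to VRAM. The cache bandwidth and hit rate used below are illustrative assumptions, not official figures, though AMD's RDNA2 marketing used a similar "effective bandwidth" calculation for Infinity Cache.

```python
# Blended effective bandwidth for a GPU with a large on-die cache:
# hits served at cache bandwidth, misses at VRAM bandwidth.

def effective_bandwidth(vram_gbs: float, cache_gbs: float, hit_rate: float) -> float:
    return hit_rate * cache_gbs + (1.0 - hit_rate) * vram_gbs

# 6800 XT-style VRAM figure (512 GB/s GDDR6); the ~2000 GB/s cache
# bandwidth and 58% hit rate are assumed values for illustration.
print(effective_bandwidth(512.0, 2000.0, 0.58))  # ~1375 GB/s blended
```

Under these assumptions, the blended figure lands well above the 3080's 760 GB/s raw bandwidth, which is the core of the "AMD doesn't need that much bandwidth" claim; real-world behavior depends entirely on the actual hit rate per game and resolution.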
> Oh boy, see, is it just a coincidence that most green tech is surrounded by smoke and mirrors?

What does AMD call 'resizable BAR' again? Being a corporation, they can call it whatever they want as long as they implement it.
The DirectStorage API (you can call it RTX IO, DLSS BigBoobs, or Susan, if it makes you feel better) is part of DX12 Ultimate and is supported by RDNA2.
Not even mentioning the "direct from SSD" hype coming from the PS5 world, which, last time I checked, was powered by an AMD APU.
> What does AMD call 'resizable BAR' again? Being a corporation, they can call it whatever they want as long as they implement it.

It doesn't require it. There's DirectML, for example.
DLSS is another story, and it requires additional transistors and proprietary IP.
In 10 years we likely will all be buying cards with RT, and comparing RT performance at that point is warranted.
Right now, basing your graphics card buying decision primarily on RT performance is like basing your current car purchase primarily on how good its self-driving / autonomous driving capability is.
> It doesn't require it. There's DirectML, for example.

AMD's Navi cards just lack the hardware for a DLSS-like feature. If AMD could pull this off, I would applaud them. Right now I have put my money into a Ryzen 2 CPU and an RTX 3080, and I'm satisfied with this purchase.
It's the way nVidia chose to implement it, like nVidia does with all their tech. Look at PhysX, G-Sync and GameWorks.
> AMD's Navi cards just lack the hardware for a DLSS-like feature. If AMD could pull this off, I would applaud them. Right now I have put my money into a Ryzen 2 CPU and an RTX 3080, and I'm satisfied with this purchase.

Maybe. Maybe not. In hindsight, one could also have argued that monitors lacked the hardware for variable refresh rate like G-Sync, and look where we are now. FreeSync pretty much pushed G-Sync out of the market, and now HDMI 2.1 is making both irrelevant.
> But buying an RTX card today is feasible and makes sense. The current gen with the 3080 is the target HW for raytracing; the devs scale and adjust the raytracing to work reasonably well on these cards. 10 years in the future is an absurdly long time in GPU terms.

I am not interested in paying over $700 to lower graphics settings for playable framerates.
> Ignoring that feature seems pretty ignorant, especially because it's the main differentiator between these two cards. Sure, you can also put up the VRAM argument: slower but significantly more for AMD, less but significantly faster for Nvidia. That will maybe be a factor in 3+ years, while raytracing is already a factor in some key games and will only gain importance.

I never said to ignore it, but to not put a high priority on it.
> Am I the only one that couldn't give 2 shits about ray tracing? That will never be worth the performance hit to me.

Well, I wouldn't say never. There was a point where MSAA, AO, and tessellation were each too demanding. Over time RT will become more playable. But with tech like this, it's generally not a good idea to be an early adopter.
DLSS is a stronger argument to go for RTX over AMD cards than RT is, in my view. DLSS works best for mid-range GPUs running 4K, or if you want high-framerate gaming. But the idea was to make DLSS work for all games without developer intervention. Until that is the case, I really would not base my purchase primarily on it either. It is indeed being adopted. But ultimately, every nVidia tech seems to end up either obsolete, once a non-proprietary solution that works for everyone is implemented, or dead.
> Glad I can play Miles Morales full RTX on my 3080.

8h later you're done with that game...
Am I the only one that couldn't give 2 shits about Ray Tracing? That will never be worth the performance hit to me
I also think DLSS is pretty overrated. Even in Cold War on the Quality setting, it isn't great. You can definitely tell it is not native resolution.
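For context on why Quality mode can still look soft: DLSS renders internally at a reduced resolution and upscales, using the commonly cited scale factors of roughly 2/3 per axis for Quality and 1/2 for Performance, so even Quality mode feeds the upscaler far fewer pixels than native:

```python
# Internal render resolution for DLSS-style upscaling modes,
# given the output resolution and a per-axis scale factor.

def internal_res(width: int, height: int, scale: float) -> tuple[int, int]:
    return round(width * scale), round(height * scale)

print(internal_res(3840, 2160, 2 / 3))  # Quality at 4K -> (2560, 1440)
print(internal_res(3840, 2160, 0.5))    # Performance at 4K -> (1920, 1080)
```

At 4K, Quality mode works from roughly 44% of the native pixel count, which is why a discerning eye can sometimes tell it apart from native rendering.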
> 8h later you're done with that game...

You played Miles Morales in 8 hours on a 3080? Tell us more about your experience!