Freelance Yakuza
Member
Great
I will just wait to see what Nvidia/AMD will launch in the next 2 years and upgrade my pc
Sorry, I meant the 2070S, which I have. Easy? Every benchmark I saw had the 2070 running from 45 to 55 fps at 4K ultra, depending on the weather...
A £450 console doesn't have 2080 performance, a card that costs almost twice as much? Well, I'm surprised.
People are just super high on console launches at the moment. Sony fans are loving the underperformance from Ubi on Xbox and trying to say it's a meaningful measurement and proof of things to come, while they argue over milliseconds of torn frames... It's bizarro land for sure, but some people will come back down to reality as the new-car smell wears off. As it stands, I just try to let them have their fun; it's not gonna last forever.

OP really does not know how console launches work, acting like the XSX is not more powerful than the 2060 Super lol.
This is a lazy, low-effort port; it's not really optimized for the XSX, and it's a launch game. Every console-generation launch we get the same lazy ports. This is nothing new.
Is this people's first console launch? Or what is going on here? I thought people on this board actually knew stuff, but looking at OP, it seems to be his first console launch ever. Cute.
Consoles have basic RT performance, but even Miles Morales has far better ray tracing than I'd ever have imagined the PS5 to have.
It's fun to poke fun at, but I think it probably has more to do with Ubisoft than the hardware available in the XSX.
So this should mean both Xbox and PS5 are equivalent to a 2060 Super when it comes to ray tracing.
Not a good first effort by AMD, in my opinion; if their 12 TFLOPS GPU is offering worse performance in ray-traced games than a 7 TFLOPS Nvidia GPU, then we are going to have a pretty lame gen.
This is questionable already. The PS5 has 36 RT cores, the XSX has 52.
Take clock speeds into account and the XSX should be around 15% faster than the PS5 at RT. What should that difference look like? It's about the difference between the 2070S and the 2060S (40 RT cores vs 34 RT cores).
So either the XSX is like a 2060S and the PS5 is like a 2060 non-S but can still put out stuff like Miles Morales' RT, or you are jumping to conclusions.
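As a sanity check on that 15% figure: assuming ray throughput scales roughly with RT cores × clock (a big simplification, since architecture, BVH handling, and the engine all matter), and using the commonly cited GPU clocks (XSX ≈ 1.825 GHz, PS5 up to ≈ 2.23 GHz), a quick sketch:

```python
# Back-of-the-envelope RT throughput: RT cores * GPU clock.
# This ignores architectural differences, so treat it as a rough ballpark only.
def rt_throughput(rt_cores, clock_ghz):
    return rt_cores * clock_ghz

xsx = rt_throughput(52, 1.825)  # Series X: 52 RT cores at ~1.825 GHz
ps5 = rt_throughput(36, 2.23)   # PS5: 36 RT cores at up to ~2.23 GHz

print(f"XSX/PS5, cores * clock: {xsx / ps5:.2f}")  # ~1.18, i.e. roughly 15-20%
print(f"XSX/PS5, cores only:    {52 / 36:.2f}")    # ~1.44 if clocks are ignored
```

With clocks factored in, the gap lands near the 15-20% ballpark; counting cores alone would wrongly suggest ~45%.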
Remember that nVidia and AMD cards both have their strengths and weaknesses. Whatever is done with RT is not done through rasterization, and this has to be taken into account. Say nVidia is better at ambient occlusion with rasterization than AMD, but AMD is better with reflections. If you have a game that uses RT for its reflections, AMD's relative performance hit will be higher, even if the RT performance is the same for both. Likewise, if you have a game that uses RT for ambient occlusion, nVidia's relative performance hit will be higher, because it replaced what is done faster on their GPU with a taxing RT implementation.
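The relative-hit argument can be illustrated with made-up frame times (all numbers here are hypothetical, chosen only to show the effect; they are not benchmarks):

```python
# Two hypothetical GPUs with IDENTICAL RT speed, but GPU A renders the
# raster reflections being replaced faster than GPU B does.
base = 10.0          # ms/frame for everything except reflections (same on both)
raster_refl_a = 1.0  # GPU A: fast raster reflections
raster_refl_b = 2.0  # GPU B: slower at the raster technique
rt_refl = 5.0        # RT reflections cost the same 5 ms on both GPUs

hit_a = (base + rt_refl) / (base + raster_refl_a) - 1
hit_b = (base + rt_refl) / (base + raster_refl_b) - 1

print(f"GPU A relative hit: {hit_a:.0%}")  # 36% -- bigger hit despite equal RT speed
print(f"GPU B relative hit: {hit_b:.0%}")  # 25%
```

GPU A loses more frame time relative to its own baseline precisely because RT replaced something it was doing quickly, which is the point being made here.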
Comparing one game is just that; a sample of one. It is not enough to be representative, and there are too many variables.
TFLOPS is useless here. And considering the architectures are different, counting RT cores is only so useful. The RTX2060S has 34 RT cores, while the XSX has 52 RT cores. But they are implemented differently and can behave differently depending on the engine and the game as mentioned above. And even within the same architecture, it doesn't scale linearly. Is the 2080Ti twice as fast at RT compared to the RTX 2060S? It's 68 vs 34 RT cores... No it's not (Source).
I am not saying AMD's RT implementation will be awesome. At this point, RT acceleration in GPUs is still mediocre, performance-wise, for gaming. The consoles can only do so much. They already have to try to run 4K, which is actually beyond the optimal resolution for their class of GPUs, and on top of that they are expected to do RT... It's kind of too much to ask of these machines, and, well, they look good with what they can offer.
AMD's implementation will be a failure if the 6800 XT with 72 RT cores performs the same as a 2060S with 34 RT cores. I doubt that's going to happen. I also doubt that it will match nVidia's Ampere GPUs (it's likely going to be slower), but in practice, I doubt it really matters, considering how poorly RT performs. Unless you want to pay $700 to game at 1080p with RT, it's not going to matter much.
In before "BUT DLSS"
I just hope this low-end ray tracing doesn't hold PC gaming back over the next few years, because the tech is going to grow by leaps and bounds and the GPUs are going to get much more capable.
Judging by the difference between Spider-Man and Watch Dogs, I would say how a developer implements ray tracing is equally important; Watch Dogs seems like it's just a bit shit.
I mean, it’s kinda hard to make any legit claims when your only sample is a Ubisoft crossgen game that’s not fully optimized for next gen consoles specifically
Why do PC gamers *always* forget the benefits of developing for a closed system.
Show me a PC Game on a 1.8tf GPU that looks as good as TLOU2...
The same will happen here: PC gamers will scoff, then the exclusives will roll out (especially on PS5), and then they will go quiet again. This includes RT, which is in its infancy on these machines.
Would it not make sense if ray-traced reflections activated only when a character is moving at a walking or slow running pace?
Then, when in a car or swinging or traversing quickly, the ray tracing could be limited, apart from on the car...
Not sure if that would work; it's just an idea.
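A minimal sketch of that idea (every name and threshold here is hypothetical, just to make the suggestion concrete):

```python
# Hypothetical speed-gated RT quality selector; thresholds are invented.
WALK_SPEED = 2.0    # m/s: full RT reflections at or below this speed
SPRINT_SPEED = 7.0  # m/s: above this, RT is limited or disabled

def rt_reflection_mode(speed_mps, in_vehicle=False):
    if speed_mps <= WALK_SPEED:
        return "full"         # walking: everything gets ray-traced reflections
    if speed_mps <= SPRINT_SPEED:
        return "reduced"      # running: fewer rays / lower reflection resolution
    # fast traversal: keep RT only on the vehicle itself, as suggested above
    return "vehicle_only" if in_vehicle else "off"

print(rt_reflection_mode(1.5))         # full
print(rt_reflection_mode(5.0))         # reduced
print(rt_reflection_mode(20.0, True))  # vehicle_only
```

In practice an engine would also hysteresis-smooth the transitions to avoid visible popping, but the gist is just a speed check per frame.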
You forgot to take into account that BVH traversal will be significantly faster on the PS5 over the XSX due to higher clock speeds, but the XSX should be able to launch more rays. Maybe that was implied, though. What the end result will be is hard to say.
Oh baby, give me that 3090 Ti and I'll never ask for anything else, thanks.
Yeah, that 2060 does look a hell of a lot better than the Xbox version.
Then again, Spider-Man looks incredible, so surely the Xbox can improve it.
Another reason is maybe that Insomniac Games are just a tier above Ubisoft in that department.
Not wanting to throw shade on Ubisoft, but I don't rate them anywhere near the Naughty Dogs or Sucker Punches of this world; this is probably just that. A good studio, not a masterful one. Or maybe I am being too harsh.
I don't think you actually understood what I wrote. Everything you said is NOT how things work when the game engine is rendering.
There are no special performance chips that know what a particular algorithm is for a given hardware set. You won't have "this card does RT reflections better than the other card" for a specific effect. The algorithm doesn't care what the hardware does with the technique. There is no special sauce for different algorithms in the chipsets.
No, I did not forget to take that into account. If I had, the figure would be closer to 45% faster for the XSX, rather than the mentioned 15%.
Where are your receipts for this claim? I'm curious. WD:L is using the GPU pretty heavily, and its RT implementation is pretty thorough, especially with what the reflections are doing. You might want to watch the DF video on it to get a better understanding of how high-quality the reflections implementation is and how many GPU cycles it costs.
I'm not saying that the ray tracing in Watch Dogs isn't great. I'm just saying (admittedly this is an assumption) that they probably were developing with Nvidia RTX as the target rather than AMD, so I don't know if we can fully state what AMD's RT is capable of based on this game alone.
Yes, RT in WD is quite great:
Wood and metal reflects (probably not on consoles)
Lamp reflected on picture glass
Even the NPCs are of better quality. This game overall looks way better than MM. The PC version maxed out just has many 3D features that are expensive. Look at the SSS of the skin on your character. You can't find that quality of SSS on any console during gameplay (not cinematics).
The NPCs look like crap on both games. Stop your little hate boner that you have. It’s embarrassing having to come into a thread and see you acting like a child yet again.
They do not look like crap. Show me an NPC that looks the best of all games while in gameplay and I'll match it with an NPC in WD:L. You go first. I already have my screenshots ready to roll.

They don't look good, period. I don't have any game in mind, because NPCs are basic. All games look that way. Throw up your pictures. You have crazies in this thread saying the Xbox is like a 3080. Quit your crap and act like an adult.
And you would be wrong, and your comment comes off as arrogant and ignorant at the same time. Ubisoft has multiple platforms to think about when making their graphics engine; a 1st party doesn't have to worry about that. Ubi's graphics engine is more robust and definitely contains more tech in it.
Spider-Man MM's reflections are complete garbage compared to WD:L's reflections. They don't look better by any stretch of the imagination. Just avoiding the original shader calls in MM should make you see the difference right there.
Ubi is huge, dude. Just like all the other large studios that have way more money and resources because they have to support multiple platforms. It does NOT mean their graphics engine is automatically unoptimized just because it's a 3rd-party company.
I guess you work for Ubisoft or something, hence your shitty tone and what you deem arrogant and ignorant.
Spider-Man's reflections look fucking amazing; proper game developers have been saying so. Not sure why you think you're a better authority than them, well, you're not.
We get it, you think you're the bee's knees around here, but you're not; you're just an average joe at the end of a PC.
Yea, an average joe that gets paid to do this stuff on a daily basis. Sorry, but I know exactly how to write an algorithm for Spider-Man's reflection shader code. It's very easy to write. I also know that WD:L has way more computations and fires more rays to get better reflections. I have written those shaders for years.

So, when do we get your YouTube channel explaining all this? Will you still refer to yourself in the third person then?
Wait, does it mean that they are both completely equal to 2060-level performance for ray tracing, or does it mean that the developers of Watch Dogs have a profile for the 2060 that they just decided to use for the consoles to make their lives easier while making their mediocre game?
You are delusional if you think that consoles are running those games at native 4K60 ultra. None of the titles tested so far are native; it's all reconstruction and dynamic resolution, or 4K30. And don't get me wrong, I'm OK with it, I think the games look great, but it's naive to believe they are attempting the same IQ as the PC. If you lower some settings like the consoles do, you would be surprised at how well the RTX 2060 runs games at 4K. And when the card needs to use a lower resolution, most of the time it's still higher than the consoles'. I still think they are like a 2080 in rasterization and like the 2060S in RT, but we still need to see it in games. So far, it's 2060S performance.
MM is about the expertise and big resources Sony's first parties can throw at the problems.
AFAIK, MM's RT is but a mix of cube maps and half-baked RT.
Technically, it is not even close to WDL.
Technique-lly, it is a smart way to work around the consoles' weaker RT hardware.
That video made me not even want to get a Series X. I can't stand aliasing. I thought for sure that crap would be gone this gen.

It's not that bad. You won't play with 800% zoom.
I think that what they do on consoles is quite impressive. It's obvious to PC gamers that these games don't run at the same level of fidelity, but because of that, they can pull off very impressive gains in performance compared to the PC space, where most games don't even feature dynamic resolution. And of course most enthusiasts want the game running on ultra. That's why a lot of people dismiss the RTX 2060S as a weak card; there are so many better options in the PC space, and nobody optimizes settings for the RTX when comparing performance. It's obvious it will tank.
I think the ray tracing capabilities of these consoles are probably going to be subpar, but come on, OP, this is a shit way to prove your point. A multi-platform game from a developer that's known to fart stuff out. Yeah...

The engine that Ubi uses on AC is amazing and heavy; I don't think it is unoptimized at all. Try any modern Assassin's Creed game on PC with ultra settings. It is pretty much next-gen.
I think saying a first-gen, cross-gen game is where ray tracing will be in 5 or 6 years on both consoles is a bit premature. Both consoles will get better as the gen progresses, APIs will get better, and the coding of the games will far exceed what we have now.
The best-looking game so far, IMO, is Spider-Man MM, and that will look pretty poor by the time we get to the end of this gen. When you look back at the first games on PS4/XB1 compared to what we have now, there is a huge difference.
The same will apply to ray tracing techniques as developers find their feet with both consoles.
Hmm. I know that part is at the very beginning. Going to check it out on the 1X.
Mhm, this may be true for old generations. But since they moved to an easy-to-program architecture last gen, the gains aren't that huge. Infamous Second Son is still one of the better-looking PS4 games, if you ask me.

The thing developers are going to get better at this gen, which is new in both the console and PC space, is direct access to fast SSDs for next-gen streaming capabilities. But I agree with your post.