That is probably not just with RTX, but highest settings on a 3090 vs PS4.
PS4:
RTX:
Crysis Remastered uses VBGI on the Xbox One X at 1080p without dedicated hardware, and ray-traced reflections as well. Considering the more-than-double increase in performance with the XSX/PS5, plus dedicated RT hardware, I don't see why Crysis Remastered can't run far better on the new consoles.
I was referring to the previous generation of consoles. If the Xbox One X can run VBGI at 1080p without dedicated RT hardware, then the new consoles, with GPUs more than twice as powerful as the One X and dedicated RT hardware, can certainly run it much better than that.

Of course, if we want to really use RT with good quality, then we'll have to take the resolution all the way down to 1080p/30FPS. Not exactly respecting gameplay if it takes us back to the PS3/Xbox 360 era.
When you're swinging past a building at 60 mph, you probably don't have time to count how many leaves each individual tree has. Good optimisation means putting your resources where they deliver the most visual benefit.

The reflection optimization for Spider-Man MM has me worried that these consoles just don't have enough in them to produce good RT.
Leaving this here (timestamped at about 25 minutes):

In terms of performance impact vs visual quality it is.

LOL @ these videos. Why can't y'all have a YouTube channel of an actual graphics developer who can answer these questions? A dev would know all the ins and outs of how RT and its limits will affect gaming now and for the foreseeable future. RT is nowhere near the same as AA on a polygon.
My PC:
Intel 9900K @ 4.8GHz
32GB @ 3200MHz
Gigabyte 2080 Ti
Samsung 1440p/144Hz monitor
SSD for games.
Should I be OK for the best RTX experience at 1440p?

I feel this is a subtle/stealth brag post.
Hopefully I can run it at 60 fps with my 6600k/2060 at 1080p with RTX.
Are you saying that it's the same when you TALK about them in a different light? If so, then I take that back. But if you are comparing them as equivalent? No way.

You're deliberately misrepresenting their position. No one said it is the same in terms of coding, nor in terms of function. But it is the same in terms of taxing performance and improving visuals. It was an analogy, not something to be taken literally.

Dude, how can you tell me it is when you haven't studied how RT actually impacts performance in general rendering compared to AA on a polygon? RT can have infinite bounces with recursion that will literally make a computer run out of memory; it's an exponential algorithm. Computing a filter kernel for a polygon edge is nowhere near as compute-intensive or bandwidth-hungry. The two are not even in the same field of study. Visual quality is on another level again: one uses a Monte Carlo algorithm with pseudo-random number generators for each of the x, y, z components of a vector in world space to approximate light bouncing from a light source (or sources); the other is all about smoothing out a rendering artifact that comes from sampling a scene at too low a rate, given the discrete nature of pixels in our display devices (i.e., analog to digital doesn't translate well).

That's what kills me about you guys. You speak like you actually KNOW, but you haven't written a single line of graphics code in your life. And then you talk to me as if I haven't written a single line of code either. And you wonder why I beat my chest a lot? Because you assume I know nothing, and I have to keep reminding you that I've actually coded the stuff you're all talking about.
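To put a toy illustration behind that argument, here is a minimal Python sketch (invented numbers, not any engine's actual code) contrasting the two: a naively branching ray tracer whose ray count grows geometrically with bounce depth, Monte Carlo sampling of a bounce direction, and an AA resolve that is a fixed blend per pixel:

```python
import math
import random

def rays_traced(samples_per_pixel: int, branching: int, max_depth: int) -> int:
    """Worst-case ray count per pixel for naively branching ray tracing:
    every hit spawns `branching` secondary rays (reflection, refraction,
    shadow, ...), so the total grows geometrically with bounce depth."""
    return samples_per_pixel * sum(branching ** d for d in range(max_depth + 1))

def sample_hemisphere() -> tuple[float, float, float]:
    """Monte Carlo bounce direction: pseudo-random draws for each of the
    x, y, z components, rejected until they land inside the unit sphere,
    then flipped into the upper hemisphere and normalized."""
    while True:
        x, y, z = (random.uniform(-1.0, 1.0) for _ in range(3))
        r2 = x * x + y * y + z * z
        if 0.0 < r2 <= 1.0:
            r = math.sqrt(r2)
            return (x / r, y / r, abs(z) / r)

def resolve_aa(subsamples: list[float]) -> float:
    """AA by contrast: a fixed blend of N subsamples per pixel. No recursion,
    no growth -- the cost is known up front."""
    return sum(subsamples) / len(subsamples)

print(rays_traced(1, branching=2, max_depth=8))   # 511 rays per pixel
print(rays_traced(1, branching=2, max_depth=16))  # 131071 -- the blow-up
print(resolve_aa([0.2, 0.4, 0.1, 0.3]))           # always exactly 4 samples
```

Production path tracers sidestep the worst case by tracing a single path per sample and terminating it probabilistically (Russian roulette), but the basic contrast with a fixed-cost edge filter holds.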
They focus on recommendations for consumers. RT might have a bunch of long-term benefits for developers, but the hardware still isn't capable enough, despite all of you trying to squeeze a bunch of milk out of the carrot for it. By the time RT is prevalent enough, all these cards will be too slow to run RT properly anyway.
Atomview can allow use of film assets with prebaked, film-quality ray-traced lighting in real time.

So of course the GPUs aren't going to be anywhere close to getting those kinds of visuals in real time.
Btw, if Lumen can run at 1440p/60FPS on PS5 with a bit of optimization, it should be possible at 4K/30FPS.
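For reference, the back-of-envelope arithmetic behind that claim (a naive model assuming per-pixel cost scales linearly with resolution, which real renderers only approximate):

```python
# Naive pixel-budget model: assumes rendering cost scales linearly with
# pixel count, which real renderers only approximate.
px_1440p = 2560 * 1440   # 3,686,400 pixels
px_4k    = 3840 * 2160   # 8,294,400 pixels

pixel_ratio  = px_4k / px_1440p   # 2.25x the pixels at 4K
budget_ratio = 60 / 30            # dropping to 30fps doubles the frame budget

print(f"4K has {pixel_ratio:.2f}x the pixels; 30fps buys {budget_ratio:.1f}x the time")
# 2.25x the work vs 2.0x the budget: under this model 4K/30 is actually a
# bit *more* demanding than 1440p/60, so "should be possible" is optimistic.
```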
What proof do you have that Lumen can run at 1440p/60FPS? That sure isn't what Epic said, and the UE5 demo was sub-1440p.
The PS5 devkit was able to steadily maintain 1440p 30FPS "most of the time" with cinematic-quality 8K textures, global illumination, and other new advanced techniques.
Read more: https://www.tweaktown.com/news/7251...an-at-1440p-30fps-with-dynamic-res/index.html
UE5 Lumen Aiming For 60 FPS On PS5 & Xbox Series X
Unreal Engine 5 Lighting Tech Is Targeting 60 FPS On PS5 & Xbox Series X - PlayStation Universe: "Epic Games' Lumen tech is aiming for 60FPS on PS5 and Xbox Series X, the Fortnite developer has revealed." (www.psu.com)
They've said it ran above 30FPS on PS5 but was locked to 30FPS; they expect to be able to reach 60FPS after optimizations.
Most of the time running at 1440p and higher than 30fps. Optimizations expected to allow 60fps sustained.

What they showed in the demo used a dynamic resolution scaler and averaged sub-1440p, so how was it a locked 1440p/30FPS? If they had locked the resolution, it would have dropped below 30FPS; that's the reason they used dynamic resolution.
Nanite runs easily at 60+fps. Lumen needs some optimization. They've already said they expect it to easily reach 60fps. If you have a problem with Epic's aimed framerate, you should ask them why they think it's possible.
Epic sheds light on the data streaming requirements of the Unreal Engine 5 demo (www.neogaf.com): "It's going to be a shocker for some, and for others like me not so much, because I've been saying this for months now. I hate to break it to some of you, but that demo's data streaming could be handled by a 5-year-old SATA SSD. 768MB is the in-view streaming requirement on the hardware to..."

It averaged sub-1440p. You do understand what dynamic resolution scalers are for, right? What do you think would have happened to the framerate if they had locked the resolution to 1440p?
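For anyone unfamiliar with the term: a dynamic resolution scaler drops the render resolution whenever a frame misses its time budget, which is exactly why a sub-1440p average implies native 1440p wasn't sustainable. A minimal Python sketch of the idea (hypothetical constants, not Epic's implementation):

```python
TARGET_MS = 1000.0 / 30.0        # 30fps frame budget (~33.3 ms)
MIN_SCALE, MAX_SCALE = 0.6, 1.0  # fraction of native 1440p height

def update_resolution_scale(scale: float, last_frame_ms: float) -> float:
    """If the last frame blew its budget, render fewer pixels next frame;
    if there was headroom, creep back toward native resolution."""
    headroom = TARGET_MS / last_frame_ms   # > 1.0 means time to spare
    scale *= headroom ** 0.5               # damped so the image doesn't pop
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# A 40 ms frame at native res drops the next frame to roughly 91% height:
scale = update_resolution_scale(1.0, 40.0)
print(f"next frame renders at about {scale * 1440:.0f}p")  # ~1315p
```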
I'm considering what they expect to reach by next year.

I don't have a problem with what they're aiming for. My problem was with your initial statement that it could run at 1440p/60FPS, when Epic's own released data on the demo they showcased doesn't back that up.
My 2070S at 1440p is already outdated if I wanna use ray tracing.
We'll see what happens with the final UE5 release, but they probably expect games to be able to run at 60FPS.

Dude, just stop it. This is a demo, not even a game.