
Cyberpunk 2077 Official Ray Tracing PC Requirements

PS4: [image: PS4 requirements chart]

RTX: [image: ray tracing PC requirements chart]
That is probably not just with RTX, but the highest settings on a 3090 versus the PS4.
 

VFXVeteran

Banned
Crysis Remastered uses VBGI on the Xbox One X at 1080p without dedicated hardware, and ray-traced reflections as well. Considering the more-than-double increase in performance with the XSX/PS5, plus dedicated RT hardware, I don't see why Crysis Remastered can't run way better on the new consoles.

Of course, if we want to really use RT with good quality, then we'll have to take the resolution all the way down to 1080p/30 FPS. I'm not trying to bring back an experience that takes us back to the PS3/Xbox 360 era.
 

Zathalus

Member
Of course, if we want to really use RT with good quality, then we'll have to take the resolution all the way down to 1080p/30 FPS. I'm not trying to bring back an experience that takes us back to the PS3/Xbox 360 era.
I was referring to the previous generation of consoles. If the Xbox One X can run VBGI at 1080p without dedicated RT hardware, then the new consoles, with GPUs more than twice as powerful as the One X and dedicated RT hardware, can certainly run it much better than that.
 

VFXVeteran

Banned
I was referring to the previous generation of consoles. If the Xbox One X can run VBGI at 1080p without dedicated RT hardware, then the new consoles, with GPUs more than twice as powerful as the One X and dedicated RT hardware, can certainly run it much better than that.

It depends on the resolution, but you are right insofar as they WILL run it for sure (ignoring any downgrades to the quality). I think when I made that statement I was thinking about it running at 4K at High or Ultra quality settings like the PC. The reflection optimization for Spider-Man: Miles Morales has me worried that these consoles just don't have enough in them to produce good RT. The resolution limit (all upsampled to 4K/60 FPS) is also very discouraging, whether people want to admit it or not.
 
Last edited:

FireFly

Member
The reflection optimization for Spider-Man: Miles Morales has me worried that these consoles just don't have enough in them to produce good RT.
When you're swinging past a building at 60 mph, you probably don't have time to count how many leaves each individual tree has. Good optimisation means putting your resources where they deliver the most visual benefit.
 

VFXVeteran

Banned
When you're swinging past a building at 60 mph, you probably don't have time to count how many leaves each individual tree has. Good optimisation means putting your resources where they deliver the most visual benefit.

I know that. I've optimized before ya know.
 

VFXVeteran

Banned
Leaving this here (timestamped at about 25 minutes):



LOL @ these videos. Why can't y'all have a YouTube channel from an actual graphics developer who can answer these questions? A dev will know all the ins and outs of how RT and its limits will affect gaming now and for the foreseeable future. RT is nowhere near the same as AA on a polygon.
 

Ascend

Member
LOL @ these videos. Why can't y'all have a YouTube channel from an actual graphics developer who can answer these questions? A dev will know all the ins and outs of how RT and its limits will affect gaming now and for the foreseeable future. RT is nowhere near the same as AA on a polygon.
In terms of performance impact vs. visual quality, it is.
 
My PC:

Intel 9900K @ 4.8 GHz
32 GB @ 3200 MHz
Gigabyte 2080 Ti
Samsung monitor, 1440p/144 Hz
SSD for games.

Should I be OK for the best RTX experience at 1440p?
 

VFXVeteran

Banned
In terms of performance impact vs. visual quality, it is.

Dude, how can you tell me it is when you haven't studied how RT actually impacts performance in general rendering compared to AA on a polygon? RT can have infinite bounces with recursion that will literally make a computer run out of memory. It's an exponential algorithm. Computing a filter kernel for a polygon edge is nowhere near as compute-intensive or bandwidth-hungry. The two are not even in the same field of study. Visual quality is on another level again. One is a Monte Carlo algorithm using pseudo-random number generators for each of the x, y, z components of a vector in world space to approximate light bouncing from a light source (or sources). The other is all about smoothing out a rendering artifact caused by sampling a scene at too low a rate, because of the discrete nature of pixels in our display devices (i.e. analog-to-digital doesn't translate well).

That's what kills me about you guys. You speak like you actually KNOW. But you haven't written a single line of code in your life concerning graphics. And then you talk to me as if I haven't written a single line of code either. And you wonder why I beat my chest a lot. Because you assume I know nothing and I have to keep reminding you that I've actually coded this stuff that you are talking about in practice.
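To put rough numbers on that asymmetry, here is a toy sketch, purely illustrative and not taken from any engine or from Cyberpunk 2077: a pseudo-random bounce direction in the Monte Carlo style described above, plus how worst-case ray counts grow with recursion depth versus the fixed per-pixel cost of an AA filter kernel. The per-bounce ray count and kernel size are made-up assumptions.

[code]
// Purely illustrative sketch -- not from any engine. Contrasts the two things
// being compared above; all counts below are made-up assumptions.
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <random>

// Monte Carlo-style direction sampling: pseudo-random x, y, z components,
// rejection-sampled and normalized to a unit vector (an approximated bounce).
struct Vec3 { double x, y, z; };

Vec3 randomUnitVector(std::mt19937& rng) {
    std::uniform_real_distribution<double> dist(-1.0, 1.0);
    while (true) {
        Vec3 v{dist(rng), dist(rng), dist(rng)};
        double len2 = v.x * v.x + v.y * v.y + v.z * v.z;
        if (len2 > 1e-6 && len2 <= 1.0) {
            double len = std::sqrt(len2);
            return {v.x / len, v.y / len, v.z / len};
        }
    }
}

int main() {
    std::mt19937 rng(42);
    Vec3 d = randomUnitVector(rng);
    std::printf("example bounce direction: (%.2f, %.2f, %.2f)\n", d.x, d.y, d.z);

    // Recursive ray tracing: if each bounce spawns several secondary rays,
    // worst-case work grows geometrically with recursion depth.
    const int raysPerBounce = 4;  // hypothetical
    std::uint64_t rays = 1;
    for (int depth = 1; depth <= 6; ++depth) {
        rays *= raysPerBounce;
        std::printf("bounce depth %d: up to %llu rays per pixel\n",
                    depth, static_cast<unsigned long long>(rays));
    }

    // Edge AA by comparison: a filter kernel touches a fixed number of samples
    // per pixel, regardless of scene or lighting complexity.
    const int aaKernelTaps = 9;  // hypothetical 3x3 kernel
    std::printf("AA filter: %d taps per pixel, every frame\n", aaKernelTaps);
    return 0;
}
[/code]

The takeaway: RT cost scales with bounce depth and lighting complexity, while the AA kernel cost stays flat per pixel.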
 
Last edited:

Madflavor

Member
Hopefully I can run it at 60 fps with my 6600K/2060 at 1080p with RTX.

You will if you keep RTX to its minimum settings and your GFX settings to medium. You will probably need DLSS enabled too.

I'm almost in the same boat as you, except I have a 1440p monitor, so I might be fucked when it comes to ray tracing, since my RTX 2060 is the bare minimum.
 

Ascend

Member
Dude, how can you tell me it is when you haven't studied how RT actually impacts performance in general rendering compared to AA on a polygon? RT can have infinite bounces with recursion that will literally make a computer run out of memory. It's an exponential algorithm. Computing a filter kernel for a polygon edge is nowhere near as compute-intensive or bandwidth-hungry. The two are not even in the same field of study. Visual quality is on another level again. One is a Monte Carlo algorithm using pseudo-random number generators for each of the x, y, z components of a vector in world space to approximate light bouncing from a light source (or sources). The other is all about smoothing out a rendering artifact caused by sampling a scene at too low a rate, because of the discrete nature of pixels in our display devices (i.e. analog-to-digital doesn't translate well).

That's what kills me about you guys. You speak like you actually KNOW. But you haven't written a single line of code in your life concerning graphics. And then you talk to me as if I haven't written a single line of code either. And you wonder why I beat my chest a lot. Because you assume I know nothing and I have to keep reminding you that I've actually coded this stuff that you are talking about in practice.
You're deliberately misrepresenting their position. No one said it is the same in terms of coding, nor in terms of function. But it is the same in terms of being taxing on performance while improving visuals. It was an analogy, not something to be taken literally.
They focus on recommendations for consumers. RT might have a bunch of long-term benefits for developers, but the hardware still is not capable enough, despite all of you milking it for everything it's worth. By the time RT is prevalent enough, all these cards will be too slow to properly run RT anyway.
 

VFXVeteran

Banned
You're deliberately misrepresenting their position. No one said it is the same in terms of coding, nor in terms of function. But it is the same in terms of being taxing on performance while improving visuals. It was an analogy, not something to be taken literally.
They focus on recommendations for consumers. RT might have a bunch of long-term benefits for developers, but the hardware still is not capable enough, despite all of you milking it for everything it's worth. By the time RT is prevalent enough, all these cards will be too slow to properly run RT anyway.
Are you saying that it's the same when you TALK about them in a different light? If so, then I take that back. But if you are saying they are equivalent? No way.

I've never advocated that full RT is feasible on the PC, or even what they've presented. But I have advocated that RT does show a marked difference between rasterization lighting/shading techniques and RT techniques. The results are there for everyone to see. They just want to downplay it as not significant enough, even if it is a world of difference. RT can go so far as to tax even the film studios, so of course the GPUs aren't going to be anywhere close to getting those kinds of visuals in real time. But that's not the point. The point is that it is feasible to run NOW and makes a big difference NOW compared to the old conventional rasterization techniques.
 
Last edited:
What proof do you have that Lumen can run at 1440p/60 fps? That sure isn't what Epic said, and the UE5 demo was sub-1440p.
The PS5 devkit was able to steadily maintain 1440p 30FPS "most of the time" with cinematic-quality 8K textures, global illumination, and other new advanced techniques.

Read more: https://www.tweaktown.com/news/7251...an-at-1440p-30fps-with-dynamic-res/index.html

UE5 Lumen Aiming For 60 FPS On PS5 & Xbox Series X

They've said it ran above 30 fps on PS5 but was locked to 30 fps; they expect to be able to reach 60 fps after optimizations.
 

Mister Wolf

Member

They've said it ran above 30 fps on PS5 but was locked to 30 fps; they expect to be able to reach 60 fps after optimizations.

What they showed in the demo was using a dynamic resolution scaler and averaged sub-1440p, so how was it a locked 1440p/30 fps? If they had locked the resolution, it would have been dropping below 30 fps; that's the reason they used dynamic resolution.
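For anyone fuzzy on the mechanism: a dynamic resolution scaler is essentially a feedback loop on frame time. Below is a minimal sketch of the idea, not Epic's actual implementation; the frame-time budget, step size, and resolution floor are made-up assumptions.

[code]
// Minimal sketch of a dynamic resolution scaler -- not Epic's implementation.
// The frame-time budget, step size, and resolution floor are made-up values.
#include <algorithm>
#include <cstdio>

struct DynamicResScaler {
    double budgetMs;     // e.g. 33.3 ms for a 30 fps target
    double scale = 1.0;  // fraction of the native (1440p) target on each axis

    void update(double lastFrameMs) {
        if (lastFrameMs > budgetMs)
            scale = std::max(0.6, scale - 0.05);  // missed budget: drop resolution
        else if (lastFrameMs < budgetMs * 0.9)
            scale = std::min(1.0, scale + 0.05);  // clear headroom: climb back up
    }
};

int main() {
    DynamicResScaler scaler{33.3};  // 30 fps frame-time budget
    const double frameTimesMs[] = {30.0, 35.0, 37.0, 34.0, 31.0, 28.0, 27.0};
    for (double ms : frameTimesMs) {
        scaler.update(ms);
        std::printf("frame took %.1f ms -> next frame at %.0f%% resolution\n",
                    ms, scaler.scale * 100.0);
    }
    return 0;
}
[/code]

Pinning the scale at 1.0 in this loop is exactly the "what if they locked it" question: frames that blow the budget would simply stay over it.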
 
What they showed in the demo was using a dynamic resolution scaler and averaged sub-1440p, so how was it a locked 1440p/30 fps? If they had locked the resolution, it would have been dropping below 30 fps; that's the reason they used dynamic resolution.
Most of the time it ran at 1440p and higher than 30 fps. Optimizations are expected to allow a sustained 60 fps.
 

Mister Wolf

Member
Most of the time it ran at 1440p and higher than 30 fps. Optimizations are expected to allow a sustained 60 fps.


[image: Epic's released resolution data for the UE5 demo]


It averaged sub-1440p. You do understand what dynamic resolution scalers are for, right? What do you think would have happened to the framerate if they had locked the resolution to 1440p?
 
Last edited:

[image: Epic's released resolution data for the UE5 demo]


It averaged sub-1440p. You do understand what dynamic resolution scalers are for, right? What do you think would have happened to the framerate if they had locked the resolution to 1440p?
Nanite runs easily at 60+ fps. Lumen needs some optimization. They've already said they expect it to easily reach 60 fps. If you have a problem with Epic's target framerate, you should ask them why they think it's possible.
 

Mister Wolf

Member
Nanite runs easily at 60+ fps. Lumen needs some optimization. They've already said they expect it to easily reach 60 fps. If you have a problem with Epic's target framerate, you should ask them why they think it's possible.

I don't have a problem with what they're aiming for. My problem was with your initial statement that it could run at 1440p/60 fps when Epic's own released data on the demo they showcased doesn't back that up.
 
Last edited:

Madflavor

Member
My 2070s at 1440p is already outdated if I wanna use ray tracing

I don't believe so. My opinion of course, but if you don't go hog wild with the GFX and RTX settings, and set them to something reasonable with DLSS enabled, you'll be just fine with a 2070 at 1440p. I would enable DLSS (duh), play with your graphics settings and set them to med/high (mostly medium), and if you enable RTX, turn it to its low/mid settings. I can't imagine your FPS chugging as a result of that.
 
Last edited:

Katajx

Gold Member
Considering a 3080 with a 3700x. Cyberpunk is the only game with ray tracing support that interests me at the moment.

I feel like I’m more interested in the tech than possibly the game itself. Like if this is the best use of ray tracing yet, I want to see it.

Every damn thing else seems to be getting delayed into next year. Got a PS5, and all the next-gen patches and features getting delayed sucks.
 
I expect the equivalent of the One X version running at 60fps on next gen consoles. If not, they are clearly sucking Nvidia’s nuts and/or gunning for double dippers.
 

MadYarpen

Member
Was anyone able to establish if those requirements are for 30 or 60 FPS?

With the GPU supply issues, many people, I think, will have to settle for a 2060 / 5700 XT because that's the only thing you can buy now for a more or less reasonable price. So I'm wondering if 60 fps at 3440x1440 is what you could expect in this scenario.
 