
Quantum Break PC performance thread

Sydle

Member
Oh sure, I don't blame Remedy; they're more than capable of producing great PC versions of their games. So it is most likely a case of limited time/budget provided by the publisher.

According to Sam Lake, they were the ones who told MS they could make a PC version launch at the same time as the previously set Xbox One release. Prior to that, MS had said there was no PC version in the works.

Sorry, it was Remedy.
 
yea looks pretty good. honestly there isn't too much of a difference between upscaling on and off




i think whatever they are doing just isn't worth the performance cost. the game looks good, but isn't mind blowing IMO
Small team: running as much in real time as possible saved a lot of time and money in development, since their art team didn't have to go in and prebake everything. It works fine on hardware that can handle the game's heavy use of asynchronous compute.
 

ZSeba

Member
Small team: running as much in real time as possible saved a lot of time and money in development, since their art team didn't have to go in and prebake everything. It works fine on hardware that can handle the game's heavy use of asynchronous compute.

All excuses, didn't stop other studios from producing better looking and better running games.
 

SimplexPL

Member
If you disable upscaling, you can forget about 60fps at native 1080p. Even an overclocked 1080 struggles with that.
With upscaling enabled, there is a chance of getting 60fps at 1080p (upscaled from 720p) with some options dialed down (especially volumetric lighting, if I am not mistaken).
 

dr_rus

Member
Responses in bold.
- Hyperthreading is the ability to run several threads on the same execution pipeline at once. That's exactly what all GPUs have done since the GF256, or maybe even earlier. You seem to think that GCN's async compute is something different from the ability to run threads of a compute context alongside threads of a graphics context. What happens when there is no async compute? Several threads of the graphics context run on the same CU at once. It's still "hyperthreading" essentially, and always was. You can even have compute inside the graphics context threads, since graphics is the "fattest" context type, which can basically do everything. That's probably also how NV runs async compute on Kepler and Maxwell at the moment: by packing it into the graphics context, which can obviously be run alongside other graphics.

- Instructions can be issued before a task is finished. Pascal has 20 SMs which can switch dynamically between contexts at pixel-level granularity. Two Pascal "cores" (SMs, in fact) will complete a task which requires running two contexts simultaneously faster than one GCN "core" (CU) will complete both. AMD has actually implemented the same option of splitting CUs between contexts in GCN4 (and GCN3 via microcode update, probably), as it can actually be more beneficial in practice than mixing contexts on one CU, because of the higher data locality of such an approach. Most AMD fans kinda missed that part of the Polaris announcement.

- Asynchronous compute doesn't need any "integration" since it's an API feature which all DX12 GPUs support by default in their drivers. There is no "proper way" of executing this feature in h/w, since different h/w can have different optimal ways of executing it.
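To make "API feature" concrete, here is a minimal sketch (my own illustration, not anything from QB's engine) of what async compute looks like from the D3D12 side: you create a second command queue of type COMPUTE next to the usual DIRECT queue, and whether the two actually overlap on the GPU is entirely up to the hardware and driver:

#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// One DIRECT (graphics) queue plus one COMPUTE queue. The API only expresses
// the *opportunity* for overlap; whether compute really runs concurrently
// with graphics is decided by the hardware and driver.
void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& gfxQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;  // graphics + compute + copy
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&gfxQueue));

    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE; // compute + copy only
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue));

    // Cross-queue ordering is expressed explicitly with an ID3D12Fence:
    // Signal() on one queue, Wait() on the other.
}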

- It doesn't make things slower, and we're having this conversation on the false premise of you declaring that QB's performance advantage on GCN h/w is because of its async compute capabilities, while all the data we have on async gains on GCN on PC (that's an important part) point to ~5-10% performance gain at most. With such a gain, a Fury X would be able to reach the 980Ti's performance level, while in practice it's ~10-15% faster than a 980Ti in QB. This advantage is too big to attribute solely to async compute usage.

- No, I don't agree with that at all. NV decided to release an architecture which requires neither DX12 nor async compute to perform at its peak capabilities, while keeping the same performance level in DX12 GPU-limited situations as in all other APIs. To me this is a better architecture, one which doesn't force a developer into using a new "console-like" API to achieve peak utilization. And from the market situation it's pretty clear that most people agree with me. Fact is, there's like five DX12 games on the market, and for each new DX12 release there are 10+ DX11 releases. An architecture which can't handle these 10+ releases in an optimal way is a bad one at this moment in time.

It could even be argued, considering the performance boosts older GCN cards get, that Nvidia's choice not to adopt the tech years ago has been holding the industry back.

There is no indication that NV "adopting the tech" would provide NV h/w any kind of performance gain. There are indications that it would actually be a net performance loss for them. I fail to see how an average performance loss would help the industry. It'll be interesting to see how they'll handle this in Volta.
 

CHC

Member
If you disable upscaling, you can forget about 60fps in native 1080p. Even overclocked 1080 struggles with that.
With upscaling enabled, there is a chance of getting 60fps at 1080p (upscaled from 720p) with some options dialed down (especially volumetric lightning, if I am not mistaken).

That's just not true at all. I played on a 1070 at 60 FPS. 1080p, upscaling off, all medium effects with ultra shadows and textures. A couple of dips, perhaps, but it was very stable.

I could also get 60 by just going all ultra with upscaling on, which makes it like 720p.
 
- Hyperthreading is the ability to run several threads on the same execution pipeline at once. That's exactly what all GPUs have done since the GF256, or maybe even earlier. You seem to think that GCN's async compute is something different from the ability to run threads of a compute context alongside threads of a graphics context. What happens when there is no async compute? Several threads of the graphics context run on the same CU at once. It's still "hyperthreading" essentially, and always was. You can even have compute inside the graphics context threads, since graphics is the "fattest" context type, which can basically do everything. That's probably also how NV runs async compute on Kepler and Maxwell at the moment: by packing it into the graphics context, which can obviously be run alongside other graphics.
It allows one core to do the job of two by running two different types of tasks concurrently; Pascal still can't do that.
- Instructions can be issued before a task is finished. Pascal has 20 SMs which can switch dynamically between contexts at pixel-level granularity. Two Pascal "cores" (SMs, in fact) will complete a task which requires running two contexts simultaneously faster than one GCN "core" (CU) will complete both. AMD has actually implemented the same option of splitting CUs between contexts in GCN4 (and GCN3 via microcode update, probably), as it can actually be more beneficial in practice than mixing contexts on one CU, because of the higher data locality of such an approach. Most AMD fans kinda missed that part of the Polaris announcement.
Pascal can do this, but sadly it's only half of the async compute pie. GCN can do this while also running both graphics and compute concurrently.


http://www.eteknix.com/pascal-gtx-1080-async-compute-explored/

Even with all of these additions, Pascal still won’t quite match GCN. GCN is able to run async compute at the SM/CU level, meaning each SM/CU can work on both graphics and compute at the same time, allowing even better efficiency.

- Asynchronous compute doesn't need any "integration" since it's an API feature which all DX12 GPUs support by default in their drivers. There is no "proper way" of executing this feature in h/w, since different h/w can have different optimal ways of executing it.
It does if you want your GPU to run it efficiently; otherwise all cards would be DX12 compatible, even the old obsolete ones. Pascal only makes it halfway to compatibility.
- It doesn't make things slower, and we're having this conversation on the false premise of you declaring that QB's performance advantage on GCN h/w is because of its async compute capabilities, while all the data we have on async gains on GCN on PC (that's an important part) point to ~5-10% performance gain at most. With such a gain, a Fury X would be able to reach the 980Ti's performance level, while in practice it's ~10-15% faster than a 980Ti in QB. This advantage is too big to attribute solely to async compute usage.
It's a sliding scale. Fury gets a performance boost while the 980Ti's performance drops. They move in opposite directions, which creates a larger gap than just comparing Fury with async vs. Fury without.
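To put toy numbers on that (made up for illustration, not benchmarks): if async lifts a Fury X from 100 to 108 fps while the same workload knocks a 980 Ti from 100 to 95 fps, the observed gap is 108/95 ≈ 14%, even though the Fury itself only gained 8%. Two small shifts in opposite directions read as one big lead.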
- No, I don't agree with that at all. NV decided to release an architecture which requires neither DX12 nor async compute to perform at its peak capabilities, while keeping the same performance level in DX12 GPU-limited situations as in all other APIs. To me this is a better architecture, one which doesn't force a developer into using a new "console-like" API to achieve peak utilization. And from the market situation it's pretty clear that most people agree with me. Fact is, there's like five DX12 games on the market, and for each new DX12 release there are 10+ DX11 releases. An architecture which can't handle these 10+ releases in an optimal way is a bad one at this moment in time.
And if devs choose to hold off from using the new tech because Nvidia doesn't fully support it, then that is by definition an example of Nvidia's choices slowing down or holding back the industry.


There is no indication that NV "adopting the tech" would provide NV h/w any kind of performance gain. There are indications that it would actually be a net performance loss for them. I fail to see how an average performance loss would help the industry. It'll be interesting to see how they'll handle this in Volta.

Then they should have developed a more forward-thinking design.
Responses in bold.


Also, we are having this conversation because people were blaming Remedy for a bad port, when it's much more likely that Nvidia cards just can't handle the number of compute and graphics tasks being thrown at them by Remedy's engine in order to do as much as it does in real time.
 
Be honest with me, GAF. What kind of performance can I expect on a GTX 980 (heavy overclock) with a 3970X (4.6GHz) and 32GB RAM at 1080p? Will I get a solid 60? Solid 30? What would I have to dial down to attain it?

Xbone settings would get you a solid 60 or close to it, I think.
 

CHC

Member
Textures don't really influence performance; it's just a VRAM issue. I hardly noticed a performance difference between medium and ultra, but the visual improvement is huge.
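Rough numbers on why that is (back-of-envelope, not profiled): a 2048x2048 BC7-compressed texture is one byte per texel, so about 4MB, plus roughly a third more for the mip chain. Sampling it costs nearly the same at any quality setting; the real risk is that dozens of ultra textures overflow VRAM and force streaming over the PCIe bus, at which point performance falls off a cliff.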
 

Daingurse

Member
Try turning the textures and effects down. That 3.5 GB limit could be biting you in the ass with the 970.

Yup, you've got to make a lot of concessions with that card. My 970 definitely feels the hurt from the ~3.5GB "wall" in QB, forcing me to turn down a lot of shit if I want decent performance with upscaling off. I really want a card with more VRAM now, sigh . . .
 

SimplexPL

Member
That's just not true at all. I played on a 1070 at 60 FPS. 1080p, upscaling off, all medium effects with ultra shadows and textures. A couple of dips, perhaps, but it was very stable.

I could also get 60 by just going all ultra with upscaling on, which makes it like 720p.
Well, if you nerf everything to medium then yeah, it works, but you are playing at Xbox One quality, or similar.
 
So it's been almost 6 weeks since this game received its last (relatively minor) patch. Is it safe to say that M$/Remedy are essentially saying "fuck off" to PC gamers and we aren't going to see this game ever reach a properly playable state?
 
So it's been almost 6 weeks since this game received its last (relatively minor) patch. Is it safe to say that M$/Remedy are essentially saying "fuck off" to PC gamers and we aren't going to see this game ever reach a properly playable state?

Considering the performance issues are largely due to how the engine works, if by "playable" you mean "good framerate on lower-end hardware with better image quality", then yeah, it is a bust. I just tested it on my newly arrived 1080: with upscaling disabled at 1080p, everything else maxed out gives me 60fps with dips. IQ is great, though, but the forced 4x MSAA is definitely way too much for anything below higher-end hardware.
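For a sense of why forced 4x MSAA stings (again back-of-envelope, not profiled numbers): a 1920x1080 RGBA8 render target is 1920 × 1080 × 4 bytes ≈ 8.3MB, and at 4x MSAA it stores four samples per pixel, so ~33MB before the resolve pass, with the depth buffer getting the same multiplier. Raster writes along triangle edges touch up to four samples each, so bandwidth costs scale similarly.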
 

shandy706

Member
So it's been almost 6 weeks since this game received its last (relatively minor) patch. Is it safe to say that M$/Remedy are essentially saying "fuck off" to PC gamers and we aren't going to see this game ever reach a properly playable state?

Game runs just fine on my 980 Ti. I think people underestimate everything going on in this game tech-wise. It took a miracle of tricks, lowered settings, and wizardry to run it on the X1. I think it runs exactly as expected with everything turned to 11.

Lower-end to mid-level hardware just isn't going to run it with everything cranked.
 

CHC

Member
So it's been almost 6 weeks since this game received its last (relatively minor) patch. Is it safe to say that M$/Remedy are essentially saying "fuck off" to PC gamers and we aren't going to see this game ever reach a properly playable state?

It's a demanding game, and it looks the part. There's never going to be some magic patch that allows you to max it out at 60 FPS on an average PC. It's also completely playable, without major bugs or crashes, so no, they're not saying "fuck off." It isn't the world's greatest PC port, but it's certainly at least average by now.
 

SimplexPL

Member
There's never going to be some magic patch that allows you to max it out at 60 FPS on an average PC.
You also can't max it out at 60 FPS on a high-end PC (1080 @ 2GHz, 6700K @ 4.5GHz) at 1080p, which makes it one of the most unoptimized ports in recent history (together with the episodic Hitman). For example, Rise of the Tomb Raider is also a great-looking game, and I am able to play at 1440p with everything maxed and get a stable 60fps.
 

Daingurse

Member
Quantum Break is definitely one of those games where I just said fuck it and capped the game at 30fps. I find that to be pretty playable with a 360 controller. You've got to be willing to make a lot of concessions to play this game. I can't blame anyone who can't put up with that. Game is too expensive to expect people to deal with it.
 

pa22word

Member
You also can't max it out at 60 FPS on a high-end PC (1080 @ 2GHz, 6700K @ 4.5GHz) at 1080p, which makes it one of the most unoptimized ports in recent history (together with the episodic Hitman). For example, Rise of the Tomb Raider is also a great-looking game, and I am able to play at 1440p with everything maxed and get a stable 60fps.
In the history of PC gaming there have been tons of games that couldn't run at high framerates on current hardware due to utilizing new tech.

This entitlement of newer PC gamers towards their framerate and resolution of choice is frankly idiotic, and holds the platform back more than anything, because devs lock out options or lie to idiots in order to make them feel better about "running max settings brah!!11"
 
In the history of PC gaming there have been tons of games that couldn't run at high framerates on current hardware due to utilizing new tech.

This entitlement of newer PC gamers towards their framerate and resolution of choice is frankly idiotic, and holds the platform back more than anything, because devs lock out options or lie to idiots in order to make them feel better about "running max settings brah!!11"

Eh, but in this case we are talking about the absolute most powerful hardware on the market, which is ahead of everything else by a significant margin. When you can't even brute-force something with sheer power, it is probably wrong at the root, like how Unreal Engine 3 was stretched way too far in Arkham Knight, so that dips happen regardless of hardware.

The last time this happened was the original Crysis, which was far, far ahead of its time, with some really poor (non-existent, even) optimization choices; there was no hardware that could run it maxed out at even acceptable framerates.
 

pa22word

Member
Eh, but in this case we are talking about the absolute most powerful hardware on the market, which is ahead of everything else by a significant margin. When you can't even brute-force something with sheer power, it is probably wrong at the root, like how Unreal Engine 3 was stretched way too far in Arkham Knight, so that dips happen regardless of hardware.

The last time this happened was the original Crysis, which was far, far ahead of its time, with some really poor (non-existent, even) optimization choices; there was no hardware that could run it maxed out at even acceptable framerates.

The thing is, though, nothing out there looks like QB does. I've seen people post about Tomb Raider, when most of that is all baked effects. The game wowed me with its lighting in a way no game has since STALKER, and if the price for that is lower framerates than in the much less impressive Tomb Raider, then that's a trade-off I'm pretty happy to pay.
 

CHC

Member
You also can't max it out at 60 FPS on a high-end PC (1080 @ 2GHz, 6700K @ 4.5GHz) at 1080p, which makes it one of the most unoptimized ports in recent history (together with the episodic Hitman). For example, Rise of the Tomb Raider is also a great-looking game, and I am able to play at 1440p with everything maxed and get a stable 60fps.

I don't know what to tell you, man; you seem intent on insisting that the game is in some unplayable state. "Maxing out" isn't some absolute; it's a relative term specific to each individual game. Quantum Break has a lot of demanding tech going on, and it still looks fantastic on non-maxed settings.
 
The thing is, though, nothing out there looks like QB does. I've seen people post about Tomb Raider, when most of that is all baked effects. The game wowed me with its lighting in a way no game has since STALKER, and if the price for that is lower framerates than in the much less impressive Tomb Raider, then that's a trade-off I'm pretty happy to pay.

Yeah, I can see what you mean. But the problem with QB is that its rendering pipeline wasn't built with PC in mind: the reconstruction is unnecessary on PC, and the 4x MSAA is consequently unwarranted. If the non-upscaling option had been there by default and the AA solution simplified, I think QB would be a far better performing title, with image quality still superior to the Xbox One version.
 
People saying the performance is justified are wrong when it comes to Nvidia. I mean, a 980 Ti is close to 5x faster than the Xbone GPU and only offers slightly over double the performance.
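The raw numbers roughly back that up (paper specs, so take with salt): the Xbox One GPU is ~1.31 TFLOPS, while a 980 Ti at a ~1GHz boost is 2816 cores × 2 ops × 1GHz ≈ 5.6 TFLOPS, call it 4-5x depending on clocks. Getting only ~2x the Xbox One's performance out of that is a poor return.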
 

SimplexPL

Member
I don't know what to tell you, man; you seem intent on insisting that the game is in some unplayable state. "Maxing out" isn't some absolute; it's a relative term specific to each individual game. Quantum Break has a lot of demanding tech going on, and it still looks fantastic on non-maxed settings.
When did I ever state the game is unplayable? If I did, quote me on that. I'm just stating an objectively verifiable fact: it's not possible to max out this game at 1080p on an OC'ed 1080 with an OC'ed 6700K and achieve a stable 60fps.
I am aware of the folly of "maxing out" - http://www.neogaf.com/forum/showthread.php?t=885444
 

drotahorror

Member
They should have made the built-in AA an option; there's no way to disable it (the native AA, not the AA you checkmark in the options or w/e it's listed as). I'm sure 2x or 4x MSAA eats plenty of power, since the game is supposedly doing things that no other game is doing.

I didn't try, but I wonder: if you were to lower all settings to the minimum but turn upscaling off, could you still hold a constant 60fps? No way I'll install the game again just to check, but I'm curious. And I mean running at least a framerate counter and playing an intensive level, like towards the end of the game where you're battling plenty of fools in the parking lot, inside and outside specifically.
 
Game runs just fine on my 980 Ti. I think people underestimate everything going on in this game tech-wise. It took a miracle of tricks, lowered settings, and wizardry to run it on the X1. I think it runs exactly as expected with everything turned to 11.

Lower-end to mid-level hardware just isn't going to run it with everything cranked.

It's a demanding game, and it looks the part. There's never going to be some magic patch that allows you to max it out at 60 FPS on an average PC. It's also completely playable, without major bugs or crashes, so no, they're not saying "fuck off." It isn't the world's greatest PC port, but it's certainly at least average by now.

Come on, are people really trying to convince themselves that this port is acceptable now?

I have a 980 Ti heavily overclocked, a 4790k heavily overclocked, 16GB of RAM, and a Samsung EVO 840 1TB SSD.

This is the only game I've ever played where I can't maintain 60fps at 1440p. Hell, this game can't even maintain 60fps at 1080p with my spec! And there's literally a dozen+ games out there that look better than this one.

It's the worst big-budget PC port released in nearly a decade. It makes even Arkham Knight look perfect in comparison.
 
This is the only game I've ever played where I can't maintain 60fps at 1440p. Hell, this game can't even maintain 60fps at 1080p with my spec!
It can.
Just not at Ultra settings.

Also this:
In the history of PC gaming there have been tons of games that couldn't run at high framerates on current hardware due to utilizing new tech.

This entitlement of newer PC gamers towards their framerate and resolution of choice is frankly idiotic, and holds the platform back more than anything, because devs lock out options or lie to idiots in order to make them feel better about "running max settings brah!!11"
 
It can.
Just not at Ultra settings.

Also this:
In the history of PC gaming there have been tons of games that couldn't run at high framerates on current hardware due to utilizing new tech.

This entitlement of newer PC gamers towards their framerate and resolution of choice is frankly idiotic, and holds the platform back more than anything, because devs lock out options or lie to idiots in order to make them feel better about "running max settings brah!!11"

pa22word's quote has absolutely nothing to do with this game. If this game were actually pushing the envelope, looking better than anything out there, that would be one thing. But it's not pushing the envelope, and it's certainly not looking better than anything else out there. I can list a dozen games off the top of my head that look better than this game. This is not a Crysis-type scenario. Stop trying to pretend that it is. It's a shitty, half-assed port that had no business ever being released in the state that it's in.
 

tioslash

Member
Playable on XB1 settings on a 970 yet?

It has been playable on XB1 settings for a very long time. In fact, I finished the game using a 970 at pretty much a locked 60fps (except in one or two sections where it dropped to the low 50s) with zero stutters on a mix of Medium/High settings.

edit: That's at 1080p.
 
It has been playable on XB1 settings for a very long time. In fact, I finished the game using a 970 at pretty much a locked 60fps (except in one or two sections where it dropped to the low 50s) with zero stutters on a mix of Medium/High settings.

edit: That's at 1080p.

Mind listing your graphics options? I am about 4 hours in, but it kept dropping like mad and was really unstable with driver crashes, on the latest Nvidia drivers. Haven't played in about 2 months, though.
 

horkrux

Member
pa22word's quote has absolutely nothing to do with this game. If this game were actually pushing the envelope, looking better than anything out there, that would be one thing. But it's not pushing the envelope, and it's certainly not looking better than anything else out there. I can list a dozen games off the top of my head that look better than this game. This is not a Crysis-type scenario. Stop trying to pretend that it is. It's a shitty, half-assed port that had no business ever being released in the state that it's in.

But it is in certain aspects. The global illumination is taking its toll. You might say that it wasn't worth it (and I'd want to agree), but the game doesn't run like shit for no reason.
 
But it is in certain aspects. The global illumination is taking its toll. You might say that it wasn't worth it (and I'd want to agree), but the game doesn't run like shit for no reason.

It runs like shit because it's a shitty, rushed port. A 980 Ti is about 5x faster than the hardware in the XB1, yet it's not even getting 2x the performance. It's a terrible port. There's no excuse for it.
 

horkrux

Member
It runs like shit because it's a shitty, rushed port. A 980 Ti is about 5x faster than the hardware in the XB1, yet it's not even getting 2x the performance. It's a terrible port. There's no excuse for it.

Runs better on AMD, though. It might not be optimized for Nvidia at all, but it's not like it was impossible to get reasonable performance.
 
It runs like shit because it's a shitty, rushed port. A 980 Ti is about 5x faster than the hardware in the XB1, yet it's not even getting 2x the performance. It's a terrible port. There's no excuse for it.
It's not just about flops. The game's GI uses a voxel-based representation of the scene. You increase the resolution and the cost increases non-linearly (because it's a volume and not an area).

Running at Xbone settings, current hardware eats the Xbone alive in performance, like it should; increasing the graphics settings makes the game a lot heavier.

And looks can be subjective, but the GI technique, the quality of the assets, and the particle effects are not. This game is top notch in all these aspects.
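A toy calculation of that volume-vs-area point (my own numbers, and it assumes, purely for illustration, that the voxel grid resolution tracks the linear screen resolution, which may not match Remedy's actual setup):

#include <cstdio>

int main() {
    const double s = 1.5; // e.g. 720p -> 1080p is a 1.5x linear increase
    std::printf("screen-space (area) cost: x%.2f\n", s * s);     // 2.25x
    std::printf("voxel (volume) cost:      x%.2f\n", s * s * s); // 3.38x
    // Per-pixel work scales with the square of the linear factor,
    // but anything stored or traced per voxel scales with the cube.
    return 0;
}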
 

Vuze

Member
Did anyone happen to play it from start to end on a 1080 at 1440p60+?
I just don't want to sacrifice more visual fidelity than required to make this work, but I feel like the beginning is a really bad place to configure settings due to the lack of action.
 

tioslash

Member
Mind listing your graphics options? I am about 4 hours in, but it kept dropping like mad and was really unstable with driver crashes, on the latest Nvidia drivers. Haven't played in about 2 months, though.

Sure.

Drivers I'm using are 365.19 and I didn't have a single crash using these during my gameplay.

The in-game settings are:

. Resolution - 1920x1080
. Fullscreen Mode - ON
. Vsync - ON
. Lock to 30FPS - OFF
. Volumetric Lighting - MEDIUM
. Shadow Resolution - MEDIUM
. Shadow Filtering - MEDIUM
. Texture Resolution - ULTRA
. Geometry level of detail - HIGH
. Screen Space Ambient Occlusion - ON
. Screen Space Reflections - MEDIUM
. Effects Quality - HIGH
. Global Illumination Quality - MEDIUM
. Anti-Aliasing - OFF
. Upscaling - ON
 
Did anyone happen to play it from start to end on a 1080 at 1440p60+?
I just don't want to sacrifice more visual fidelity than required to make this work, but I feel like the beginning is a really bad place to configure settings due to the lack of action.

I don't think it's possible; maybe select sections will run at 60fps.
 

SimplexPL

Member
You can't even play it at 1080p60 at max settings on a 1080, let alone 1440p.
If you are willing to leave scaling/reconstruction enabled and dial down some settings (especially volumetric lighting, which is murder on Nvidia cards), then you may be able to achieve a stable 60fps at 1080p (reconstructed from 720p) or at some resolution between 1080p and 1440p.
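Simple pixel math on why reconstruction buys so much headroom: 1280x720 is ~0.92 million pixels against 1080p's ~2.07 million, so with upscaling on the game shades roughly 44% of the pixels each frame and reconstructs the rest.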

Sure.

Drivers I'm using are 365.19 and I didn't have a single crash using these during my gameplay.

The in-game settings are:

. Resolution - 1920x1080
. Fullscreen Mode - ON
. Vsync - ON
. Lock to 30FPS - OFF
. Volumetric Lighting - MEDIUM
. Shadow Resolution - MEDIUM
. Shadow Filtering - MEDIUM
. Texture Resolution - ULTRA
. Geometry level of detail - HIGH
. Screen Space Ambient Occlusion - ON
. Screen Space Reflections - MEDIUM
. Effects Quality - HIGH
. Global Illumination Quality - MEDIUM
. Anti-Aliasing - OFF
. Upscaling - ON

So you are running almost everything on medium and at a resolution reconstructed from 720p. No wonder it runs acceptably: that's 720p at medium.

This is the only game I've ever played where I can't maintain 60fps at 1440p. Hell, this game can't even maintain 60fps at 1080p with my spec! And there's literally a dozen+ games out there that look better than this one.

Max out The Division at 1080p and start running in front of your base. I guarantee you will not maintain 60fps. I don't on an OC'ed 1080 (~2GHz) and an OC'ed 6700K (4.5GHz).
 

tioslash

Member
So you are running almost everything on medium and at a resolution reconstructed from 720p. No wonder it runs acceptably: that's 720p at medium.

Yes, the person asked if the game was playable on XB1 settings using a GTX 970. It is. At 60fps, which is fine by me.
 
You can't even play it at 1080p60 at max settings on a 1080, let alone 1440p.
If you are willing to leave scaling/reconstruction enabled and dial down some settings (especially volumetric lighting, which is murder on Nvidia cards), then you may be able to achieve a stable 60fps at 1080p (reconstructed from 720p) or at some resolution between 1080p and 1440p.



So you are running almost everything on medium and at a resolution reconstructed from 720p. No wonder it runs acceptably: that's 720p at medium.



Max out The Division at 1080p and start running in front of your base. I guarantee you will not maintain 60fps. I don't on an OC'ed 1080 (~2GHz) and an OC'ed 6700K (4.5GHz).

I didn't say maxed out. There are definitely a few games I can't run maxed out at a locked 60fps/1440p with my 980 Ti. What I'm saying is I can't get Quantum Break to run at a locked 60fps at 1440p regardless of what settings I change. I can set everything to low and it doesn't stay locked. I'm not talking about the reconstructed bullshit 1440p, either. Native 1440p.
 

SimplexPL

Member
The game was never optimized for disabling reconstruction; it was never planned, and they only added the option to placate the community that demanded it. Maybe in a few years it will be possible to play at high resolution and framerate by brute-forcing it.

Yes, the person asked if the game was playable on XB1 settings using a GTX 970. It is. At 60fps, which is fine by me.
So a 970 is roughly twice as fast as the XBO GPU.
 
The game was never optimized for disabling reconstruction; it was never planned, and they only added the option to placate the community that demanded it. Maybe in a few years it will be possible to play at high resolution and framerate by brute-forcing it.


So a 970 is roughly twice as fast as the XBO GPU.

A 970 is actually over 3x as fast as the Xbone GPU tho. It also can't hold 60fps at Xbone settings.
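Paper specs for reference (rough, and boost clocks vary): the 970 is 1664 cores × 2 ops × ~1.05-1.18GHz ≈ 3.5-3.9 TFLOPS against the Xbone's 1.31 TFLOPS, so roughly 2.7-3x on raw compute, before the 970's 3.5GB memory segmentation enters the picture.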
 