I suck at this, might need to restart on easy. Can't get past the first area, after Mia cuts your hand off and attacks with a chainsaw in the attic.
Anyone here having issues with streaming the game to another PC or a Steam Link? Every other game in my library works, but RE7 goes all black when launching.
I played the game from beginning to end on my steam link with no issues. I'm wired though.
I'm also wired, but it seems the game launches in windowed mode and then goes to fullscreen a few seconds later; this might be messing with the stream itself. Does yours do this too?
Very weird
Could you share your host and client settings? Most games work flawlessly for me on the Steam Link, but this one stutters like crazy on the TV, even though it's fine on the host PC's screen.
Why did they chart the 970 performance on the 1080p benchmarks yet neglect it for the 1440p+ scores? Even the 960 2GB gets a spot in all of them.
Those with SLI can get better performance using these bits in Nvidia Inspector.
Guru3D still doesn't have the scores up for the updated drivers on their main benchmark page. I have no idea what they're doing.
These might just be the default settings, I'm not sure if I ever bothered to change anything.
The following are checked under advanced host options, everything else is unchecked:
-Adjust resolution to improve performance
-Enable hardware encoding
-Enable hardware encoding on Nvidia GPU
Number of software encoding threads set to automatic.
Client options: Beautiful
Advanced client options
-Limit bandwidth set to automatic
-Limit resolution set to display
-Speaker config set to automatic
-Hardware encoding checked
Running at 4K HDR on a Samsung KS8000
Specs
i7 7700K
GTX 1080
16GB
Evo 960 SSD
Getting as high as 60 FPS with these settings, but seeing dips as low as 30 FPS in some areas. Is there anything in these settings that can improve performance? If I enable Motion Blur or Depth of Field I see a massive performance hit.
Screen Resolution - 4K
Refresh Rate - 59
Display Mode - Full Screen
Field of View - Default
Frame Rate - Variable
V-Sync - ON
Rendering - Normal
Resolution Scaling - 1.0
Texture Quality - Very High
Texture Filtering - Very High
Mesh Quality - Very High
Anti-Aliasing - FXAA+TAA
Motion Blur - OFF
Effects Rendering - High
Depth of Field - OFF
Shadow Quality - High
Dynamic Shadow - ON
Shadow Cache - OFF
Ambient Occlusion - SSAO
Assuming that you are not hitting a performance target (e.g. a 60 FPS frame limiter / V-Sync cap), having your GPU under-utilized - especially to the point where the GPU is downclocking - is typically a sign of a big CPU bottleneck.
With a 2500K at 4.5GHz I had issues with that happening in nearly every big game released in 2016.
I upgraded from a 960 to a 1070 and the only thing which changed in those sections of games is that the GPU usage dropped further while performance remained the same.
I can't wait for Ryzen to be released so that I can either upgrade to it or a 7700K, depending on the price/performance offered.
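The under-utilization test described above can be sketched as a simple heuristic. The function name and the 80% threshold here are illustrative assumptions, not anything the game or driver exposes; the readings would come from a monitoring tool such as GPU-Z or nvidia-smi:

```python
def looks_cpu_bound(gpu_util_pct: float, fps: float, target_fps: float) -> bool:
    """Rough heuristic: missing the FPS target while the GPU sits well
    below full utilization usually points at a CPU (or render-thread)
    bottleneck rather than a GPU limit. The 80% cutoff is arbitrary."""
    return fps < target_fps and gpu_util_pct < 80.0

# GPU half idle at 40 FPS with a 60 FPS target: likely CPU-bound.
print(looks_cpu_bound(gpu_util_pct=55.0, fps=40.0, target_fps=60.0))  # True
# GPU pegged at 98% at the same framerate: the GPU itself is the limit.
print(looks_cpu_bound(gpu_util_pct=98.0, fps=40.0, target_fps=60.0))  # False
```

If the GPU is also downclocking while idle, as described above, the utilization number alone can be misleading, so checking the core clock alongside it is worthwhile.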
Patch #1 Release Notes (Jan. 27th)
Community Announcements - WBacon [capcom]
The following issues have been addressed in today's update:
- Fixed an issue where saving game progress was no longer possible if the player deleted the local save file while Steam Cloud has been turned off.
- Fixed an issue where HDR mode is turned back to ON upon app launch if the player quits the prior game session in full screen mode and with HDR set to OFF.
What's your reflections setting? Dropping it to variable made a difference for me. Plus, it looks better, as full reflections cause all kinds of horrible artefacts.
I'm on a 3770K at 4.2GHz, so I think you're right that I'm bottlenecked in a number of recent games. But how does that explain the GPU running at 900MHz while I get 40ish FPS, then forcing Prefer Max Perf and getting a locked 60 FPS afterwards? (I know Prefer Max Perf doesn't always force the core/memory clocks to the EVGA boost values - 1329MHz in my case - but that's my experience here.) My CPU is still at 4.2GHz; what's changed?

If that's what is happening on your system - where forcing it into a higher power state improves performance, and that significantly - it sounds like something has gone wrong and is preventing the GPU from boosting properly.
Not seeing any artifacts; it just drops occasionally. It's set to High, I believe.

The way the reflections are sampled is very "noisy". If you look at the floorboards in the lower-right of this video, it's pretty noticeable.
Looks like it still happens in the full game. (gfy link)
I should have recorded this in 720p because their compression hides a lot of the problems with SMAA.
Is no-one else seeing this?
As I said before, it gets worse the higher the framerate is. (120 FPS in the video again)
I can't believe they didn't include a sharpening filter for the TAA option in the full game either.
---
For the people complaining about black crush and banding in HDR: it's not just HDR which is affected.
The black crush almost seems to be an artistic decision to intentionally reduce what you can see.
Here's the game running in SDR at the default brightness in sRGB:
This isn't the issue I'm talking about. The banding in the white image is extremely subtle compared to what I see. On my screen, the banding is more like this when using HDR (it's less pronounced in SDR, but still not quite as good as your shot). The point is, in HDR it should be even more subtle than in your shot, not more pronounced.

I didn't mean to suggest that it was the same, just that banding is still a problem for the game in SDR too, so it's a general problem for the game/engine and not limited to HDR.
I assume that the game only allows you to enable HDR in Fullscreen mode, but if it still gives you the option to use it in Borderless Windowed Mode, you should avoid that. Unless HDR is treated differently, Windowed Mode is limited to 8-bit color while Fullscreen Mode is capable of outputting > 8-bit color.
I've been posting my SLI findings in here, curious if you guys have done any testing.
In particular I'd like to find a solution for the banding/flickering problem that occasionally showed up, and better options for AA. SMAA wasn't cutting it but I couldn't get any kind of forced AA to work. TAA goes all bonkers with SLI.
Switching between RGB and YCC shouldn't have a noticeable effect on color reproduction.

It only allows it in Fullscreen. I still don't believe that it's actually outputting 10-bit, though. I can't even select 10-bit in the Nvidia control panel unless I'm using 4:2:2 or 4:2:0, both of which screw with the colors and are incorrect. RGB can't use more than 8-bit either, and I have to have that selected. So unless the game, when it switches to HDR, ignores whatever bit depth the video card is set to and overrides it, it's only outputting 8-bit for me.
Wider color gamuts can also make banding more noticeable, as the difference between two steps in color is larger. Bit depth also needs to increase with gamut.

I think the trouble I'm having wrapping my head around banding with HDR is that I'm not just thinking of the highlights in brighter and darker areas, but also of the wider color gamut that should be along for the ride. But since it's possible to use HDR without the wider color gamut, I shouldn't assume that it's in use. With the wider color gamut, there should be less banding.
Is anyone getting slower transitions alt-tabbing in and out of games after installing the newest Nvidia driver?
I have a 980Ti.
Can someone explain this to me: why is the framerate halved with double buffering and V-Sync if the FPS drops below 60, but not with triple buffering? I'm trying to picture it in my mind, but I don't get it. Should I always use triple buffering?
borderless windowed?
Double buffering always syncs the framerate to divisors of your refresh rate.

Can someone explain this to me: why will the framerate be halved with double buffering and V-Sync if the FPS drops below 60, but not with triple buffering?
Switching between RGB and YCC shouldn't have a noticeable effect on color reproduction.
The only change I might expect would be the levels switching from full to limited.
Dropping from 4:4:4 to 4:2:2 or 4:2:0 shouldn't affect color reproduction either, only chroma resolution.
If your video card is in a mode which doesn't support >8-bit color, then I would expect that the game outputs 10-bit or whatever it uses and the GPU converts that down to 8-bit. I wouldn't recommend that.
You should be using a mode that supports at least 10-bit color for HDR.
Wider color gamuts can also make banding more noticeable, as the difference between two steps in color is larger. Bit-depth also needs to increase with gamut.
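The relationship between gamut width and bit depth can be sketched with some arithmetic. The relative gamut widths below are illustrative numbers I picked for the example, not measured values for any real color space:

```python
def step_size(gamut_width: float, bits: int) -> float:
    """Size of one quantization step along a channel: the same bit
    depth spread over a wider range gives larger, more visible steps,
    which is why wider gamuts can worsen banding."""
    return gamut_width / (2 ** bits - 1)

narrow  = step_size(1.0, 8)   # a narrow gamut (sRGB-like) at 8-bit
wide_8  = step_size(1.7, 8)   # a ~1.7x wider gamut at the same 8-bit
wide_10 = step_size(1.7, 10)  # the wider gamut at 10-bit

print(wide_8 > narrow)   # True: wider gamut, same bits -> bigger steps
print(wide_10 < narrow)  # True: the extra two bits more than compensate
```

This is also why HDR signals are specified at 10-bit or more: the extra code values absorb both the wider gamut and the larger luminance range.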
Hmm, I see. Thanks for the info! Should I then still cap it to 30, or leave it on "Variable" in-game, when I choose Adaptive (Half Refresh Rate) V-Sync in the Nvidia settings?

Double buffering always syncs the framerate to divisors of your refresh rate.
So if you have a 60Hz screen that is 60/30/20/15 etc.
If you have a 144Hz screen that is 144/72/48/36 etc.
With triple buffering, the framerate isn't locked to a divisor when it drops below the refresh rate. Frame presentation is still synced to the refresh, though, so there isn't any tearing.
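The divisor behaviour can be sketched as follows. This is a simplified model that ignores frame-time variance, and the function name is mine:

```python
import math

def double_buffered_fps(render_fps: float, refresh_hz: float) -> float:
    """With double-buffered V-Sync, a frame that misses a refresh must
    wait for the next one, so the effective framerate snaps down to
    refresh_hz / n for the smallest whole number of refreshes n that
    each frame fits into."""
    n = math.ceil(refresh_hz / min(render_fps, refresh_hz))
    return refresh_hz / n

print(double_buffered_fps(55, 60))   # 30.0 - just missing 60 halves you
print(double_buffered_fps(50, 144))  # 48.0 - next divisor down from 72
print(double_buffered_fps(70, 60))   # 60.0 - capped at the refresh rate
```

This is why barely missing 60 FPS on a 60Hz screen is so punishing with double buffering: there is no step between 60 and 30.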
If you have an NVIDIA GPU and you want to lock a game to 30 FPS, I would suggest setting V-Sync to Adaptive (Half-Refresh Rate) in the NVIDIA Control Panel.
That uses double buffered V-Sync (1 frame less lag than triple buffering) and allows it to tear if the framerate drops below 30.
Setting the maximum pre-rendered frames to 1 further reduces lag with V-Sync on.
Never set these on the global profile, set them on the game profile.
This assumes that the game is at 30 FPS the majority of the time though. If it's constantly dropping below 30 I guess triple buffering might be the best option.
The game is borderline unplayable on my buddy's GTX 970 PC.
Game should run like a dream on max settings on a 970. He shouldn't have to lower anything to medium.

- Turn off Shadow Cache
- Set textures to medium or high
- Set reflections to variable
- Dynamic Lighting is also a culprit, but when turned off you lose some of the atmosphere
- Set AO to SSAO variable

That kinda fixed it for me!