
NeoGAF PC Gaming General Performance Tweaking Thread

Great thread, thanks for the input everyone.

I've just swapped from a 6950 to a 580 GTX and installed Nvidia Inspector. Are there any standard settings I need to change in the Nvidia Control Panel or NI? Or is every setting game specific?

For shooters, set max render ahead (maximum pre-rendered frames) to 0.
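To put a number on why that matters: each queued frame adds roughly one frame-time of input lag. A back-of-the-envelope sketch (the 60 fps figure and the function name are illustrative assumptions, not anything from the driver):

```python
# Each pre-rendered (render-ahead) frame adds about one frame-time of input lag.
# Illustrative only: assumes a steady frame rate and no other pipeline delays.

def added_latency_ms(pre_rendered_frames, fps=60.0):
    frame_time_ms = 1000.0 / fps
    return pre_rendered_frames * frame_time_ms

for queue in (0, 1, 3):
    print(f"{queue} queued frames: about +{added_latency_ms(queue):.1f} ms of input lag")
```

At 60 fps the common driver default of 3 works out to roughly 50 ms of extra lag, which is why shooters feel snappier at 0.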

Will doing this cause any conflict when Nvidia eventually releases the official drivers for non-600-series cards?

Nope.
 
Awesome, thanks.

And is there a general guide for Nvidia Inspector? I'm still pretty new to PC gaming and tweaking and reading through TweakGuide's Nvidia Control Panel guide was very helpful. Should I be using Inspector instead of the Control Panel?

I believe there are guides, I can try to find you one. And yes, use Inspector instead of Control Panel, it has more settings and is easier to use!
 
If you found me a guide that would be great. I tried looking and failed.

And Control Panel is conveniently in my taskbar whereas I have to open Inspector every time. Is there a workaround for this besides just pinning it as a quicklaunch icon? If I use Inspector is it recommended that I close Control Panel and not use it at all?
 

Just Quickpin Inspector. That's what I do. Nah, leave Control Panel running.

Says that the graphics driver could not find compatible hardware. Could you elaborate on what I do and where to get the inf file? I've been dying to try out adaptive v sync.

Here's the INF: http://dl.dropbox.com/u/22834557/nv_disp.inf (I dropboxed it for you)

Now what you do is go to C:\NVIDIA\DisplayDriver\301.10\WinVista_Win7_64\English\Display.Driver

Drop that inf in there. Then click setup.exe in C:\NVIDIA\DisplayDriver\301.10\WinVista_Win7_64\English

And boom, it'll install.
 

Jtrizzy

Member




Thanks, but that link seems to be taking me to the text of the file for some reason, instead of downloading the inf.
 

scitek

Member
Didn't work.

Two things to try. First, try overriding the application settings via the drop-down menu in RadeonPro. If THAT doesn't work, try setting it to Use Application Settings, then changing the AA mode to Supersample. You'd then adjust between 2x, 4x, 8x, etc. from inside the game itself.
 

Smokey

Member
NFS Hot Pursuit really does look like a completely different game through Inspector.

8x MSAA + 8x SGSSAA @ 60 fps with not a damn jagged edge in sight is awesome
 

Jtrizzy

Member
Adaptive V-Sync seems to help a lot with Skyrim. Pretty good with AC Revelations too, but not as good. It was good in NBA 2k12 also.

So if I'm using it, should I disable d3doverider and enable triple buffering through inspector?

Also, since I force the AF via inspector, should I be disabling that in games?
 
Is there much in the way of difference between forcing Triple Buffering+Vsync in the Nvidia Control Panel/Inspector vs D3DOverrider?
 
Adaptive V-Sync seems to help a lot with Skyrim. Pretty good with AC Revelations too, but not as good. It was good in NBA 2k12 also.

So if I'm using it, should I disable d3doverider and enable triple buffering through inspector?

Also, since I force the AF via inspector, should I be disabling that in games?

You don't need triple buffering at all with adaptive v sync - that's the point. No, I'd always force AF through Inspector. Some games simply have no AF control.
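For what it's worth, here's my understanding of the per-frame idea behind adaptive vsync (a sketch of the concept, not Nvidia's actual driver logic): sync while you can hold the refresh rate, tear instead of stalling when you can't.

```python
# Sketch of the adaptive vsync idea (not Nvidia's actual implementation):
# keep vsync on while the game can hold the refresh rate, and drop it
# (accepting some tearing) when fps falls below, instead of stalling.

def vsync_this_frame(current_fps, refresh_hz=60.0):
    return current_fps >= refresh_hz

print(vsync_this_frame(75.0))  # True  -> synced, no tearing
print(vsync_this_frame(48.0))  # False -> unsynced, tears but stays smoother
```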


Is there much in the way of difference between forcing Triple Buffering+Vsync in the Nvidia Control Panel/Inspector vs D3DOverrider?

Triple buffering in Nvidia Control Panel/Nvidia Inspector doesn't work for Direct3D games - it only applies to OpenGL, so it'll do nothing. You'll HAVE to use D3DOverrider to force it if you want triple buffering and the game doesn't support it.

Vsync depends on the game. Some games have good Vsync implementation, some don't.
 
Interesting. Good to know.

If I'm going to force Triple Buffering+Vsync via D3DOverrider should I also disable Vsync in the control panel and in game?
 

No... leave Vsync as 'application controlled' in Inspector/Nvidia Control Panel. I suggest just turning Vsync on in game and forcing Triple Buffering only in D3DOverrider, with Medium Application Detection.
 
BP..

Got any SGSSAA settings for BF3 that I can try via inspector?

I wish. I've been looking for the same thing. The deferred rendering fucks everything up. Can't even go past 4x MSAA.

What I suggest doing is downsampling from a higher resolution to your native res. So if your native res is 1600p, try doing 3040x1900 (that's 16:10 ratio) and running in that res to downsample down to 1600p.

Here's an aspect ratio calculator: http://andrew.hedges.name/experiments/aspect_ratio/

Input the resolution you want to base the ratio on in the left-hand fields (for example, I put in 1920x1080) and the calculator works out matching values on the right. So I set 1920x1080 as the 'input', then mess around with the right-hand fields to figure out custom 16:9 resolutions.
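If you'd rather skip the website, the same arithmetic is trivial to script. A sketch (the function name and the round-to-even choice are my own assumptions):

```python
# Scale a native resolution by some factor while keeping the aspect ratio,
# for use as a custom downsampling resolution. Rounds to even numbers since
# drivers tend to reject odd dimensions.

def scaled_resolution(width, height, factor):
    w = int(round(width * factor / 2) * 2)
    h = int(round(height * factor / 2) * 2)
    return w, h

# The 2560x1600 -> 3040x1900 example from above is a 1.1875x scale:
print(scaled_resolution(2560, 1600, 1.1875))  # (3040, 1900)
```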
 
So I'm having a little trouble with D3DOverrider.

Vsync doesn't seem to work at all when I force it. I'm testing this on Bioshock and Borderlands right now. When I launch Bioshock I get the Windows noise which I guess means that triple buffering is enabled, but I'm still getting screen tearing, so no vsync. I know I can turn it on in the in game menu but clearly this ain't right.

Now Borderlands doesn't even give me the Windows noise prompt and I'm getting screen tearing so vsync isn't working either. I've tried high, medium and low detection levels for the global settings. I've also tried adding the .exe to the application list and it does nothing. Borderlands doesn't have in game vsync so it's either force it here, through the Nvidia control panel, or in the .ini.

Anyone know what the problem is? These are also Steam versions, maybe that has something to do with it?
 

Force it through the ini. Like I said, use D3DOverrider to set Triple Buffering, then just use in-game/forced Vsync.
 
Alright, thanks. I guess I'm just pretty anal and like to have everything working properly even if I choose not to use it.

What about not getting the sound prompt when launching Borderlands? Wouldn't that mean that D3DOverrider isn't forcing triple buffering?
 

See the little icon in the bottom right corner of D3DOverrider? It's a speaker. When you have a profile selected, you can turn the speaker on or off. Make sure you didn't mute it.
 
It's not muted. I've checked all the settings in D3DOverrider. I get the sound prompt when I launch Bioshock but not Borderlands.
 
Enable Borderlands Vsync through Inspector or Ini. See if it works.
Nope, still no sound prompt. And still not able to force vsync via D3DOverrider. Yes, I'm able to get vsync to work in the control panel and ini, but still. :(

I'd check if triple buffering was working in the Event History Panel but I can't seem to find this damn thing in Rivatuner.

Edit: The plot thickens. I just tested D3DOverrider on Mass Effect 1 and not only do I get the sound prompt on startup, I can also force vsync via D3DOverrider no problem. So it works as it should. What the heck is going on?

Bioshock - Sound prompt for triple buffering, but can't force vsync via D3DOverrider
Borderlands - No sound prompt, and can't force vsync via D3DOverrider
Mass Effect - Sound prompt, CAN force vsync via D3DOverrider, perfect

This is all using the latest Nvidia drivers (301.10 with custom inf) on a 560TI.
 
Sorry for the double post, but I figured out what the problem was. D3DOverrider doesn't work properly with SMAA Inject 1.2. My Mass Effect install was clean so it didn't have any of those files; once I deleted them from the Bioshock and Borderlands directories, D3DOverrider operated perfectly.

However, I didn't notice much of a difference in Borderlands with triple buffering on. When using just vsync via the Nvidia control panel before, the game would dip to around 40-50fps in certain areas (I don't know if you're familiar with the game areas, Treacher's Landing and Headstone Mine). I've messed with all the settings previously and came to the conclusion that dynamic shadows caused the dips in those areas. With triple buffering and vsync forced on via D3DOverrider I got pretty much exactly the same performance as before, but now without SMAA.

Bioshock was always at 60fps before I tried triple buffering so I noticed no difference. But AA doesn't work with Bioshock in DX10 so that's what SMAA was for. Now I have to either use FXAA or run it in DX9. This PC is new so I'm just testing with the few games I already have installed.

So here's where I am confused. From what I've read and researched, I thought your fps was supposed to drop to like 30 if you're using vsync and your fps goes below 60? So if triple buffering from the nvidia control panel was doing nothing before, and I was getting framerate dips from areas with heavy dynamic shadows, how come my fps never went down to around 30? The whole time I've been playing it's been 60 at almost all times, drops down to around 50-55 in a couple of heavily shadowed areas and I've isolated one area where it can go to around 40-45. Am I misunderstanding vsync and the benefits of triple buffering?

And can anybody clarify what exactly the "Maximum pre-rendered frames" option in Nvidia control panel does?
 

dr_rus

Member
But AA doesn't work with Bioshock in DX10
Should work (override via cpl), it worked last time I played Bioshock.

From what I've read and researched, I thought your fps was supposed to drop to like 30 if you're using vsync and your fps goes below 60?
That's wrong. If you're using vsync and your fps drops below your display's refresh rate, you'll get lower fps than you would without vsync - for example, 45 fps instead of 55, or 35 instead of 42. That happens because with vsync on, your video card waits for the display's refresh timing before switching to the next frame. That waiting is what lowers fps when it's below the refresh rate.
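A toy simulation makes this concrete. In this simplified model of double-buffered vsync (with made-up render times), each frame's display time is its render time rounded up to the next vblank interval, and the average lands between the classic 60/30 divisors rather than snapping to 30:

```python
import math

# Toy model of double-buffered vsync at 60 Hz: each frame is held until the
# next vblank, so its display time is its render time rounded UP to a
# multiple of the refresh interval. Render times below are made up.

REFRESH_MS = 1000.0 / 60.0  # one vblank interval, ~16.7 ms

def avg_fps(render_times_ms, vsync=True):
    if vsync:
        times = [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in render_times_ms]
    else:
        times = list(render_times_ms)
    return 1000.0 * len(times) / sum(times)

frames = [14.0, 20.0] * 50  # render times wobbling around the 16.7 ms budget
print(f"no vsync: {avg_fps(frames, vsync=False):.0f} fps")  # ~59 fps
print(f"vsync:    {avg_fps(frames):.0f} fps")               # ~40 fps
```

The vsynced result sits at 40, not 30, because only the slow frames eat a second vblank.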

Am I misunderstanding vsync and the benefits of triple buffering?
The benefit of triple buffering is that your video card has not one but two back buffers. So when vsynced, it doesn't have to wait for the display refresh timing to start rendering a new frame, because the back buffer waiting to be shown at the next refresh and the back buffer being rendered into are two different buffers.

And can anybody clarify what exactly the "Maximum pre-rendered frames" option in Nvidia control panel does?
Limits the number of frames your system will pre-render ahead of display if there are spare resources (memory, CPU time). This limit is in addition to the screen buffers, so you should count them like this:
1. front buffer
2. back buffer
3. back buffer (2) - if you're using triple buffering
4. a number of pre-rendered frames
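Counting buffers that way also gives a rough worst-case latency figure. A sketch under stated assumptions (fixed 60 Hz refresh, one frame-time per slot; real drivers and pipelines are messier):

```python
# Rough worst-case display latency from the buffer chain described above:
# front buffer + back buffer(s) + pre-rendered frame queue, one frame each.
# Assumes a fixed 60 Hz refresh; real pipelines are messier than this.

def worst_case_latency_ms(pre_rendered, triple_buffered, refresh_hz=60.0):
    frame_ms = 1000.0 / refresh_hz
    back_buffers = 2 if triple_buffered else 1
    frames_in_flight = 1 + back_buffers + pre_rendered
    return frames_in_flight * frame_ms

print(f"double buffer, 3 pre-rendered: {worst_case_latency_ms(3, False):.0f} ms")
print(f"triple buffer, 1 pre-rendered: {worst_case_latency_ms(1, True):.0f} ms")
```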
 
I'll have to double check on the AA in Bioshock, but thanks for the info!

Now that I've figured out how to use triple buffering properly, I need to decide between FXAA and triple buffering, or SMAA and adaptive vsync - at least for games that don't get along with other forms of AA, like Borderlands.
 

Dice

Pokémon Parentage Conspiracy Theorist
Two things to try. First, try overriding the application settings via the drop-down menu in RadeonPro. If THAT doesn't work, try setting it to Use Application Settings, then changing the AA mode to Supersample. You'd then adjust between 2x, 4x, 8x, etc. from inside the game itself.
Still didn't work.
 
a question: my performance with PC Skyrim isn't so hot. is this due to the need for more patches, or is my system to blame? or both?

quick specs:
Core 2 Quad Q6600 2.4GHz
4GB DDR2 system RAM
Nvidia GTX 570 797 MHz, 1,280MB RAM

would going to 8GB system RAM help, or is it just time to upgrade the whole MOBO/CPU/RAM?
 

Dice

Pokémon Parentage Conspiracy Theorist
Already tried that, lol. I guess it'll just be shitty for no reason like Shogun 2.

ATI YEEEAAAAAHHH!!
 

dr_rus

Member
a question: my performance with PC Skyrim isn't so hot. is this due to the need for more patches, or is my system to blame? or both?

quick specs:
Core 2 Quad Q6600 2.4GHz
4GB DDR2 system RAM
Nvidia GTX 570 797 MHz, 1,280MB RAM

would going to 8GB system RAM help, or is it just time to upgrade the whole MOBO/CPU/RAM?
Core 2 Quad Q6600 2.4GHz -- not enough for Skyrim without some kind of SSAA on (which shifts the bottleneck to the GPU). Skyrim is very CPU intensive outdoors. That would be my first candidate to upgrade for better Skyrim performance.
Nvidia GTX 570 797 MHz -- may lag indoors at higher resolutions because the Skyrim engine is crap.
1,280MB RAM -- may not be enough at higher resolutions with the high-res texture pack installed.

would going to 8GB system RAM help
No. Since Skyrim is 32-bit it can't use more than 4 GB of RAM, and as it is it uses a lot less than that -- closer to 2 GB.
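The 4 GB ceiling is just 32-bit pointer arithmetic, and the practical limit is usually lower because Windows splits the address space. A quick illustration (the 2 GB default split is standard Windows behavior; large-address-aware executables on a 64-bit OS can get closer to the full 4 GB):

```python
# Why a 32-bit game tops out around 4 GB: a 32-bit pointer can only address
# 2^32 bytes. By default Windows keeps the upper half of that range for the
# kernel, leaving about 2 GB for the process itself.

GIB = 2 ** 30
address_space = 2 ** 32                  # bytes addressable by a 32-bit pointer
print(address_space // GIB)              # 4 GiB total

default_user_space = address_space // 2  # Windows keeps the upper half
print(default_user_space // GIB)         # 2 GiB usable by default
```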
 
GeForce 301.24 Driver (Windows 7 64-bit / 32-bit)

Bringing 600-series features to 500, 400, 9x00, and 8x00 series cards:

  • NVIDIA FXAA Technology – shader-based anti-aliasing technology available from the NVIDIA Control Panel that enables ultra-fast anti-aliasing in hundreds of PC games. FXAA delivers similar quality to 4x multi-sample anti-aliasing (MSAA) but is up to 60% faster, enabling substantially higher performance in games. FXAA is supported on all GeForce 8-series and later GPUs. Note: This feature is disabled for games that already have built-in support for FXAA.
  • NVIDIA Adaptive Vertical Sync – dynamically enables vertical sync based on your current frame rates for the smoothest gaming experience. Adaptive VSync is supported on all GeForce 8-series and later GPUs.
  • NVIDIA Frame Rate Target – dynamically adjusts frame rate to a user-specified target. Support for this feature is enabled via third-party applications using NVAPI.

Probably worth at least a test, imo.
 
Core 2 Quad Q6600 2.4GHz -- not enough for Skyrim without some kind of SSAA on. Skyrim is very CPU intensive in outdoors. That would be my first candidade to upgrade to get better Skyrim performance.

all right, thanks for the info. looks like my aging platform is (over)due for an upgrade.

might wait until Ivy Bridge hits to pull the trigger. lord knows I've got a large enough backlog on portables to keep me busy until then!
 

Jtrizzy

Member
The new drivers are doing some weird stuff with Arkham City DX11, I have to alt+tab out and back in to get it past 30 fps.

Any chance of getting this game running closer to 60 in DX11 on a 580? I have everything maxed except for tessellation on Normal and FXAA on Low, but it's not good enough.

How are 680s doing with AC?
 

dr_rus

Member
The 300-series drivers seem to have a problem of sorts in BAC. Last time I tried it on a GTX 680 with 301.10, the game was almost unplayable and ran worse than my previous playtime on a GTX 470 with some 29x.xx driver. I haven't checked 301.24 yet, but from what you're describing it sounds like the problem is still there.
 

kamspy

Member
Little help here.

I've been having problems since installing 290.10

My 580 stays stuck in either 3D clocks, or the one in between 3D and 2D (400mhz I think?)

I've driver swept, rolled back, rolled forward. It works the first time, but the first time I reboot, no matter what driver I'm using now, clocks are stuck.

Help?
 

Switch drivers? Use MSI Afterburner/Nvidia Inspector to control your clocks.
 

kamspy

Member

I've gone from 285-301 and every beta variant in between. Driver sweeping every time to no avail.

I guess I could manually downclock my stuff when I'm not gaming, I just can't figure out why it's not doing it automatically anymore.
 

Kareha

Member

Try this guide for removing your Nvidia drivers, it's what I use all the time and it's never failed me yet:

http://www.overclock.net/t/1150443/how-to-remove-your-nvidia-gpu-drivers#post_15432476
 

dr_rus

Member
Try the solutions from this thread. It looks like, with regard to power management on non-Kepler hardware, the R300 drivers are a bit too beta at the moment.
 