After roughly 12 hours of testing I've settled on what I believe to be the optimal settings for my system. My goal was to push IQ as far as my hardware allowed while holding a locked 30fps. Dips of any kind were not tolerated, regardless of what was happening on screen. This is a rock-the-fuck solid config on my system.
Hopefully this helps someone!
My Hardware:
i5-4570 (3.2 GHz)
GTX 770 (2GB)
8GB DDR3 1600
7200RPM HDD
Panasonic ST30 (50" plasma)
Xbox One controller
Display Settings (in-game):
Fullscreen Mode: Fullscreen
Resolution: 1920x1080 59.94Hz
Vertical Sync: Off
Graphics Settings (in-game):
Resolution Scale: 100
Graphics Quality: Custom
Mesh Quality: Ultra
Tessellation Quality: Medium
Texture Quality: High
Shadow Quality: Ultra
Terrain Quality: High
Vegetation Quality: Ultra
Water Quality: Medium
Post-Process Quality: High
Ambient Occlusion: SSAO
Effects Quality: High
Post-Process Antialiasing: Off
Multisample Antialiasing: 2x MSAA
Nvidia Inspector Settings:
Antialiasing - Mode: Enhance the application setting
Antialiasing - Setting: 2x [2x Multisampling]
Antialiasing - Transparency Multisampling: Disabled
Antialiasing - Transparency Supersampling: 2x Supersampling
NVIDIA Predefined FXAA Usage: Allowed
Toggle FXAA on or off: On
Anisotropic filtering mode: User-defined / Off
Anisotropic filtering setting: 12x
Texture filtering - Quality: Quality
Texture filtering - Trilinear optimization: On
Vertical Sync Tear Control: Adaptive
Vertical Sync: 1/2 Refresh Rate
(all other options set to Nvidia defaults)
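Just to put numbers on the vsync choice above: "1/2 Refresh Rate" on a 59.94Hz display targets one new frame every two scans. A quick back-of-the-envelope sketch (nothing game-specific, just the arithmetic):

```python
# Frame-timing math behind the Adaptive + 1/2 Refresh Rate vsync settings.
# At 59.94 Hz, half-refresh vsync presents one frame per two scans,
# i.e. 29.97 fps, which leaves roughly a 33.4 ms budget per frame.

refresh_hz = 59.94           # display refresh rate from the in-game settings
target_fps = refresh_hz / 2  # half-refresh vsync
frame_budget_ms = 1000 / target_fps

print(f"target: {target_fps:.2f} fps, budget: {frame_budget_ms:.2f} ms/frame")
```

That ~33.4 ms budget is what every setting in this config has to fit inside, which is why the heavier options (4x MSAA, Ultra terrain) didn't make the cut.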
Shortcut Addendum:
"...\DragonAgeInquisition.exe" -GameTime.MaxSimFps 30 -GameTime.ForceSimRate 30+ -RenderDevice.RenderAheadLimit 2
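For anyone curious what RenderAheadLimit 2 costs you: assuming it caps the number of queued/pre-rendered frames (as the equivalent setting does in other Frostbite titles), the worst-case added input lag is easy to estimate:

```python
# Rough worst-case input-lag contribution of the render-ahead queue.
# Assumption: -RenderDevice.RenderAheadLimit 2 caps pre-rendered frames at 2,
# so up to 2 extra frames can sit between simulation and display.

sim_fps = 30               # from -GameTime.MaxSimFps 30
frame_ms = 1000 / sim_fps  # ~33.3 ms per simulated frame
render_ahead = 2           # -RenderDevice.RenderAheadLimit 2
worst_case_queue_ms = render_ahead * frame_ms

print(f"up to {worst_case_queue_ms:.1f} ms of potential added latency")
```

In practice ~67 ms worst case felt fine on a controller from the couch; drop the limit to 1 if you're more latency-sensitive, at the risk of less consistent frame pacing.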
Notes:
The redundant 2x MSAA in Inspector may not be doing anything, but it didn't affect performance so I left it in.
My goal with Inspector's AA settings was to get the 2x Transparency Supersampling working. I cannot say with certainty that it is, but again, the performance hit was negligible.
FXAA gets a bad rap due to the low quality of the injectors that first used it. Nvidia's FXAA implementation is much better. Blur is negligible, about the same as SMAA. This is true when playing on a big screen from 2-3m away, at least.
More importantly, FXAA provides better IQ with a lower performance hit than the in-game "post-process antialiasing" setting.
Anisotropic filtering at 12x/Quality is perfectly fine for the living room environment. And by "perfectly fine" I really mean "you'd think it was 16x/High Quality if I didn't tell you."
I'm still not 100% content with the antialiasing in this config, but aside from downsampling from 3840x2160 there doesn't seem to be a better option. The performance hit for the in-game 4x MSAA is far too high for only marginally improved IQ.
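For reference, here's why downsampling is off the table on a 2GB GTX 770; the pixel count alone tells the story:

```python
# Downsampling from 3840x2160 to a 1080p display means rendering
# exactly 4x the pixels of native 1920x1080 every frame.

native_pixels = 1920 * 1080
downsampled_pixels = 3840 * 2160

print(downsampled_pixels / native_pixels)  # 4.0
```

Quadrupling the per-frame pixel load (plus the VRAM hit) is never going to fit in a 33 ms budget on this card.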