Well, see, here I don't really think it introduced additional banding (maybe it did; I'm far from an expert on this). The distinct bands that were already present in the lower part of a grayscale gradient I was looking at moved upwards along it as I lowered the gamma, and I don't think any new ones appeared. In many situations the banding was, quite frankly, atrocious before I changed anything, so this issue unquestionably lies with the monitor itself. I suppose its needing gamma correction on top of that only exacerbates the problem.
Edit: perhaps I'm not being clear enough. The bands I'm talking about looked more like a solid block or actual rectangle of color in the gradient. Typical banding could well have been introduced by changing the gamma, and I doubt I would even have noticed it next to the hard-edged sections of the monitor's own banding that were being moved around.
It's only a rough approximation, since I didn't want to set up my calibration hardware just for this, but here's a comparison between gamma adjusted in the NVIDIA Control Panel and gamma adjusted on the display itself by selecting a different profile on my PG348Q:
Adjusting the GPU LUT crushes adjacent levels together, while the monitor's internal adjustment keeps all 256 steps visible.
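To illustrate why this happens (a rough sketch, not how the driver actually implements it, and the gamma value is just an example): when a gamma curve is baked into a 256-entry LUT and the result is quantized back to 8 bits, the rounding collapses several nearby input levels onto the same output value, so some of the 256 steps simply disappear:

```python
# Hypothetical sketch of an 8-bit gamma LUT. The exponent and gamma
# value are illustrative, not what the NVIDIA driver actually uses.
gamma = 0.8  # example adjustment, similar in spirit to the control panel slider

# Build a 256-entry LUT and quantize it back to 8 bits, like an 8-bit output path.
lut = [round(((i / 255) ** (1 / gamma)) * 255) for i in range(256)]

distinct_outputs = len(set(lut))
# Fewer than 256 distinct outputs survive: collapsed levels show up as banding.
print(f"{distinct_outputs} of 256 levels survive")
```

A monitor that applies the same correction internally (often at higher precision, before the panel) can remap the curve without throwing away input steps, which is presumably why the on-display profile looks cleaner here.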
I do wish that G-Sync monitors had better calibration controls, but I suppose that's what you get with a latency-optimized gaming display.
As I said: I believe that banding may be an issue with the Dell monitor, but GPU LUT adjustments are likely to make the problem worse.
You should never touch the GPU LUT with an NVIDIA card connected to an 8-bit display, G-Sync or not, at least until they fix this.