Your numbers are right, but I'd argue with how they're presented. By giving different values for 1800c and "what it's checkerboarded to", you're falling into the semantic trap of treating some rendered pixels as "more real" than others.
I think it's more accurate to say that 1800c has exactly the same number of pixels as 1800p. (That's why it's called that!) The difference is that half the 1800c pixels have the potential to differ from the ideal render. (In practice, not all of them will; and for most, the difference will be imperceptible.)
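To put rough numbers on that (assuming, as on PS4 Pro, that 1800c/1800p means a 3200x1800 target; the variable names are just mine for illustration):

```python
# Pixel counts for a 3200x1800 (1800p / 1800c) target.
total = 3200 * 1800              # pixels in the final frame, identical for 1800p and 1800c
newly_rendered = total // 2      # checkerboarding renders half of them fresh each frame
reconstructed = total - newly_rendered  # the other half *may* deviate from an ideal render

print(total)           # 5760000
print(newly_rendered)  # 2880000
```

Same total pixel count either way; the only difference is how many of them had the potential to deviate.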
Well, while you're fighting the holy semantic crusade (and I'm with you, don't get me wrong) over whether or not to call it "scaling", there are still people who believe 1800p checkerboarded means it takes an 1800p image and 'checkerboards' it up to full 4K.
That's the far more common misconception; the semantics debate is the more technical one.
"In any specific mode, the PS4 renders (via checkerboarding) one, and only one, internal resolution."
While this isn't, strictly speaking, wrong, the problem with those numbers is that absolute pixel counts don't translate into actual graphics-compute costs when checkerboarding is used.
Which isn't an issue in and of itself, but it promotes a (sadly false) narrative that CB exactly halves the cost of the native resolution you're targeting. One of the reasons quality is (or can be) much better than upscaling is that a fair portion of the pipeline operates at the native target resolution, after or during reconstruction - anti-aliasing, for instance.
Well, I didn't make any implications about "GPU costs to render a frame". The poster I quoted simply wanted to know the pixel count, and I thought one of the most meaningful pieces of information would be the fact that EACH of the 'checkerboarded' frames (1600x1800 pixels) already contains more 'unique' pixels, and thereby image information, than a full frame of 1920x1080.
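For anyone who wants to check that comparison themselves (assuming each checkerboard field is 1600x1800, i.e. half of a 3200x1800 frame):

```python
cb_field = 1600 * 1800    # unique pixels rendered in one checkerboard frame
full_1080p = 1920 * 1080  # pixels in a complete 1080p frame

print(cb_field)               # 2880000
print(full_1080p)             # 2073600
print(cb_field > full_1080p)  # True: one CB field already beats a full 1080p frame
```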