That's not how it works. The quote says "consoles run 2x or so better than equal PC hardware." You can't test that claim by benchmarking a system that's 2x (actually more than 2x) as powerful. It's irrelevant.
Just think about it: on some PCs those games will be completely unplayable, so an 8800GT PC will be infinitely better than a PC with a lesser GPU, even though the 8800GT itself is not infinitely better than that GPU. Have you just proved that PCs are infinitely better than PCs? No; the method was flawed. For example, say the 8800GT is three times as powerful as the 7900GS: would that mean you always get three times the performance at equivalent settings? Or could the gap be much larger?
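To make the flaw concrete, here's a toy sketch in Python (hypothetical numbers, not benchmarks): the framerate ratio at a fixed setting only tracks the raw hardware ratio while both cards stay above their limits. Once one card falls off a cliff, the measured ratio tells you nothing about the hardware.

```python
# Illustrative toy model (assumed numbers, nothing measured): why a
# framerate ratio at fixed settings doesn't equal the raw hardware ratio.

def fps_at_settings(raw_throughput, workload, within_limits=True):
    """Toy model: fps scales with raw throughput until the card hits a
    hard limit (e.g. running out of VRAM), then drops to effectively zero."""
    if not within_limits:
        return 0.0  # unplayable slideshow
    return raw_throughput / workload

# Suppose GPU A has 3x the raw throughput of GPU B...
a = fps_at_settings(raw_throughput=300, workload=5)  # 60 fps
b = fps_at_settings(raw_throughput=100, workload=5)  # 20 fps
print(a / b)  # 3.0 -- matches the 3x hardware ratio

# ...but at settings that exceed GPU B's limits:
b_choked = fps_at_settings(raw_throughput=100, workload=5,
                           within_limits=False)  # 0 fps
# a / b_choked is division by zero: "infinitely better" at these
# settings, even though the hardware itself is still only 3x faster.
```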
What? Processors function via math. Relative to each other they can be compared in percentages or multiples, generalizing from the average behavior of the hardware in a given application. Yes, different parts of a processor may be stressed more or less by rendering certain things, but absent an external bottleneck they still tend to perform roughly the same relative to one another in similar programs. 2x faster means 2x faster, and if it doesn't in Carmack's quote, then it is the quote that is irrelevant or badly taken out of context, not the processor; the 2x figure is completely unsupported.
Processors function by math, not the wishful conjecture of people who want to believe in magic. If the Carmack quote were taken literally, he would be saying that at least half the cycles of every processor in a PC (CPU and GPU alike), half of all memory capacity, and half of all throughput are simply wasted. That is laughably overgeneralized BS that cannot possibly be true: no results confirm anything close to it, and no engineer would overlook whatever caused such a ridiculous flaw. And if the quote is not taken literally, then it's just a loose estimate of how well developers can selectively limit and hide things in a game designed for one platform to maximize perceived value; that isn't mathematical, factual, or quantifiable, and is therefore irrelevant.
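For what it's worth, the arithmetic of the literal reading is one line (normalized, assumed numbers only):

```python
# Toy arithmetic (assumed numbers): what the quote means if taken literally.
# "Consoles run 2x better than equal PC hardware" implies the same silicon
# delivers half the output on a PC, i.e. 50% of every resource
# (cycles, memory, bandwidth) would be wasted across the board.

console_perf = 1.0            # normalized output on the console
pc_perf = console_perf / 2    # the literal "2x" claim
print(f"Implied PC utilization: {pc_perf / console_perf:.0%}")  # 50%
```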
I'd like to see some hard numbers again on the consoles' GPUs versus the 8800GT. That said, the 750 Ti is, at least in practice, roughly in line with the new consoles, not way ahead, and I'd think the real aim of a thread trying to defend the 750 Ti would be to find a budget, roughly console-equal GPU from 2007, not the midrange one people paid about $200-$300+ for.
Except you're using a midrange GPU as a stand-in for a low-end GPU. Now that I look at prices, the 760 is probably a better comparison to the 8800GT than the 970 is. But there's still that underlying message that the 750 Ti should be able to keep competing in the long run, when if anything the 8800GT, an out-and-out console stomper in its day, isn't THAT far ahead in a later-gen multiplat game, whereas the 750 Ti is more of a console matcher and looks like it could have serious problems continuing to match later in the generation.
The 360's GPU was a custom design (the first ever with unified shaders, not an insignificant detail) somewhere between an X1800 and an X1900, if I recall correctly, with its own unique improvements. The PS3's was a cut-down 7800 chip that struggled immensely relative to the 360's GPU, and whose performance is ultimately obscured by the impact of the PS3's Cell processor. It's not so easy to compare directly, given how unusual both setups were relative to PCs at the time, but the 8600GT is more of a match if you're looking for something on the same level. Tomb Raider no doubt runs at 30 FPS or less on both consoles with minimal settings, FXAA at best, and probably little to no AF. In any case, the OP's intent regarding the 750 Ti might be off-base, but what I care about is the 8800GT's sustained potency after all these years as a demonstration of the OP's title, not equating it to a 750 Ti.