What would you rather measure CPU performance in, giga-ops? By any metric it's only 6% faster.
A CPU crunches numbers just like a GPU; it's just better suited to a different problem set because of its more complicated front end.
Good question. I don't really know. I don't think there's any real metric for CPU performance outside real-life benchmarks, unless the chips share the exact same architecture.
GPUs can be roughly measured in FLOPS, since branching is generally discouraged in shader code. But for the kinds of programs that run on CPUs, which are full of branches, it's not easy to judge performance from a theoretical ops-per-second number. Not to mention that different CPUs have different numbers of registers and handle caching differently, which matters far more on CPUs than on GPUs because memory accesses are much more random.
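For what it's worth, that "theoretical number of ops per second" is usually just cores x clock x FLOPs-per-cycle, which is exactly why it says so little about branchy CPU workloads. A minimal sketch; all the chip figures below are made up for illustration, they don't describe any real part:

```python
def peak_gflops(cores, clock_ghz, flops_per_cycle):
    # Theoretical peak: every core issuing its maximum
    # floating-point work every single cycle, no stalls,
    # no branches, no cache misses. Real code rarely gets close.
    return cores * clock_ghz * flops_per_cycle

# Hypothetical 8-core CPU at 3 GHz, 16 FLOPs/cycle per core
# (e.g. two 8-wide FMA units counted as one op each):
cpu = peak_gflops(8, 3.0, 16)      # 384 GFLOPS

# Hypothetical GPU with 2048 shader "cores" at 1.5 GHz,
# 2 FLOPs/cycle each (one fused multiply-add):
gpu = peak_gflops(2048, 1.5, 2)    # 6144 GFLOPS
```

The GPU number looks enormous, but it assumes the no-branch, streaming-access workload shaders are built for; feed either chip branch-heavy, cache-unfriendly code and the peak figure stops meaning anything.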
I know from your posts that you are very knowledgeable about computers. Would you seriously add CPU FLOPS to GPU FLOPS?
(btw, the PS3's Cell was a different story, since from the way it looks, the Cell chip was basically an APU with a single-core CPU and six other GPU-like cores)
Edit: Also, the vast majority of GPU operations are floating-point operations. On CPUs, that is not the case.