I would like to know this as well. I find FLOPS to be a very unreliable way to measure performance. I have 2x HD 6850s in my PC, and combined they produce 2976 GFLOPS (Source), but apparently the new PS4 is the greatest thing ever and it only produces about 2000 GFLOPS. So there must be a lot more to it than these numbers, and I would love for someone in the know to give an explanation.
FLOPS don't take fixed-function hardware into account, I believe, and it's also a measurement of only one kind of operation. If someone finds a more efficient way to, for example, draw a 3D object, the performance of a 3D engine can increase even though the FLOPS are unaffected.
It's basically a measurement of the brute force of the main computational unit in a piece of hardware.
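To show where those headline numbers come from, here's a rough back-of-the-envelope sketch. Peak FLOPS is just cores x clock x ops-per-cycle; the factor of 2 assumes one fused multiply-add per core per clock, which is how vendors usually count it (`peak_gflops` is an illustrative helper, not anyone's real API):

```python
def peak_gflops(cores, clock_mhz, ops_per_cycle=2):
    """Theoretical peak GFLOPS = cores x clock (MHz) x ops per cycle / 1000.
    ops_per_cycle=2 assumes one fused multiply-add per core per clock."""
    return cores * clock_mhz * ops_per_cycle / 1000.0

# Published specs: HD 6850 has 960 stream processors at 775 MHz;
# the PS4 GPU has 1152 shaders at 800 MHz.
hd6850_pair = 2 * peak_gflops(960, 775)   # -> 2976.0 GFLOPS for the CrossFire pair
ps4 = peak_gflops(1152, 800)              # -> 1843.2 GFLOPS (the "~2000" figure)

print(hd6850_pair, ps4)
```

Note that this is purely theoretical throughput: it says nothing about memory bandwidth, fixed-function units, or how much of that peak real workloads actually reach, which is exactly why the two machines don't compare cleanly.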