not psycho:
I just call it logic and common sense when one gauge consistently gives higher readings than the other, both during the halftime measurements and in the testing done by Exponent.
That doesn't work, because we don't have a shred of recorded evidence from anyone, not the teams, not the officials: no measurements, no record of which gauge the official used at which time. We have the official's memory, which agrees with the Patriots. To refute that memory, all we have is a general statement about 12.5 and 13. The halftime measurements themselves are evidence against the Colts balls being at 13 pre-game. And even if the Colts balls averaged around 13 or 13.1 pre-game, only 4 of them were measured at halftime.
Also, even a single gauge has its own variance. We were never talking about exact numbers. The Colts measured 11.45, 11.35, and 11.75 psi on 3 repeated tests of the intercepted ball (even though testing it at all was a violation of NFL rules).
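To put a number on that single-gauge variance, here is a quick sketch using the three repeated readings quoted above (the figures are from this thread; the statistics computed on them are just illustrative):

```python
# Spread in the Colts' three repeated readings of the same intercepted ball,
# illustrating that even a single gauge on a single ball has noticeable variance.
import statistics

readings = [11.45, 11.35, 11.75]  # psi, same ball measured 3 times

mean = statistics.mean(readings)
spread = max(readings) - min(readings)
stdev = statistics.stdev(readings)

print(f"mean {mean:.2f} psi, range {spread:.2f} psi, sample stdev {stdev:.2f} psi")
# Three readings of one ball already span 0.40 psi.
```

A 0.40 psi range across three readings of one ball is the same order of magnitude as the discrepancies the whole argument is about.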
If I were the commissioner, I would have given the Colts a small fine for breaking NFL rules and then fixed the testing procedure. The case against the Patriots, regardless of whether you think they cheated, should have been thrown out immediately based on the Colts' rule-breaking and the faulty procedure.
We know they measured the Pats balls and then re-inflated them before they measured the Colts balls. It's entirely possible they accidentally switched the gauges somewhere in that process, and given that none of the officials mentions switching them on purpose, an accidental switch is the most likely scenario.
As the report says, the 12.95 psi reading is an anomaly and likely the result of human error.
So while running out of time they only managed to measure 4 of the Colts balls, switched gauges at least once along the way, and misread 1 of the 4 results. Yet we are to assume they were perfectly careful never to use one gauge more than the other and never to misread any other result.
Getting a reading of 12.95 for a ball that supposedly started at 13 was just bad luck; they probably misread a 6 for a 9. Maybe they had the gauge upside down. Don't read anything into it. Like... maybe it didn't start at 13.
Still, they didn't decrease the temperature to 67°F randomly to somehow "make the Colts balls fit"
Yes they did, and they are very clear about it. This is the strongest piece of evidence that they had a preconceived notion:
However, the pre-game temperature was set at 67°F because this was the only temperature that allowed the Colts balls to subsequently reach their average pressure during the simulated Locker Room Period. Any pre-game temperature that was higher than 67°F resulted in the Colts balls reaching the Game Day halftime average pressure later than 13.5 minutes into the Locker Room Period.
The pre-game temperature was set at 67°F because that was the temperature that allowed the Colts balls to fit. They explicitly set up their test to force the Colts balls to fit.
Set it at 71°F and the Patriots balls fit, but the Colts balls come out too high. And again, there are plenty of explanations for why the Colts balls would read too high: a pre-game gauge switch, pre-game pressures above 13, using the wrong gauge at halftime, misreading results at halftime. These possibilities are supported by the known gauge switch at halftime and the 12.95 reading.
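The sensitivity to that pre-game temperature choice can be sketched with the gas law at constant volume (Gay-Lussac's law). The 48°F game temperature and 14.7 psi atmospheric pressure below are illustrative assumptions for the sketch, not Exponent's exact figures:

```python
# Gay-Lussac's law sketch: how the assumed pre-game temperature shifts the
# predicted halftime pressure of a football (constant volume assumed).
# The 48°F game temperature and 14.7 psi atmospheric pressure are assumed
# illustrative values, not the exact figures used in the Exponent report.

ATM_PSI = 14.7  # assumed sea-level atmospheric pressure

def rankine(temp_f):
    """Convert Fahrenheit to the absolute Rankine scale."""
    return temp_f + 459.67

def halftime_gauge_psi(pregame_gauge_psi, pregame_temp_f, game_temp_f):
    """Predict gauge pressure after the ball equilibrates to the game temperature."""
    p1_abs = pregame_gauge_psi + ATM_PSI
    p2_abs = p1_abs * rankine(game_temp_f) / rankine(pregame_temp_f)
    return p2_abs - ATM_PSI

# A ball set to 13.0 psi: the prediction depends on the assumed pre-game temperature.
for pregame_f in (67, 71):
    predicted = halftime_gauge_psi(13.0, pregame_f, 48)
    print(f"pre-game {pregame_f}F -> predicted halftime {predicted:.2f} psi")
```

Under these assumptions, moving the assumed pre-game temperature by just 4°F shifts the predicted halftime pressure by about 0.2 psi, which is why the choice of 67°F vs 71°F matters so much to whether the Colts balls "fit."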
For that matter, did they exclude the 12.95 reading when deciding to lower the temperature? I'm not going to reread the whole report, but I don't remember them saying so; only in the later analysis do they mention discarding it. So they may well have reasoned, "the temperature must have been only 67°F to produce that 12.95, let's lower it to force a fit." And yet you are arguing that the 12.95 was an erroneous measurement.