I think your math is wrong. I'm too lazy to do diagonal length math, so let's just use logic.
Suppose 1 player @ 1080p compared to 4 players @ 1080p. The 4 players each get 1/4 the screen area AND 1/4 the pixels. So pixels/area == (pixels/4) / (area/4). Same PPI or whatever metric you want to use. There's no way you can cut the pixels in half and expect the same or more PPI.
I compared the PPI of one player at 1080p (singleplayer) to the PPI of one player at 720p (4-player multiplayer).
So by that logic the first case would be: pixels / area
the second: (pixels / 2) / (area / 4). (Strictly, 720p is 1280x720 = 921,600 pixels, which is 4/9 of 1080p's 2,073,600, so "pixels / 2" is a slight overestimate, but close enough.)
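Quick sanity check of both readings in Python. Assumptions: screen area is normalized to 1.0 (only relative density matters), and the second case is read as one player's viewport rendered at full 720p but displayed on a quarter of the screen, which is what (pixels/2)/(area/4) implies.

```python
# Pixel counts for the two resolutions
P_1080 = 1920 * 1080   # 2,073,600 pixels
P_720 = 1280 * 720     # 921,600 pixels (exactly 4/9 of 1080p, roughly "pixels / 2")

AREA = 1.0             # normalized screen area

# Case 1: one player, 1080p, full screen
density_single = P_1080 / AREA

# First comment's scenario: four players share one 1080p screen;
# each gets 1/4 the pixels on 1/4 the area, so density is unchanged
density_quad_1080 = (P_1080 / 4) / (AREA / 4)

# Reply's scenario: one player's view at 720p on 1/4 of the screen
density_quad_720 = P_720 / (AREA / 4)

print(density_quad_1080 / density_single)  # 1.0 -> same PPI, as the first comment argues
print(density_quad_720 / density_single)   # ~1.78 (= 16/9) -> higher density under the reply's reading
```

So both claims are consistent with their own assumptions; the disagreement is really about whether the 720p pixels all land in one player's quadrant or are spread over the whole screen.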
I didn't think about the CPU load like garett hawke suggested, so that could be the reason why they went with 1080p/30.