This thread has morphed into a religious issue.
As already mentioned, 1080i is better in terms of absolute resolution.
1080i = 1920*1080 = 2,073,600 pixels each 1/30 sec.
720p = 1280*720 = 921,600 pixels each 1/60 sec.
If a still picture is shown, 1080i is clearly sharper. However, with heavy motion the 1080i picture will show interlacing artifacts, because each 1/60-sec field carries only 540 of the 1,080 lines. In extreme cases you could effectively halve the 1080i resolution to 1,036,800 pixels. That's still higher than 720p, but the motion effect is noticeable and hurts perceived resolution.
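For anyone who wants to sanity-check the numbers, here's a rough back-of-the-envelope calculation (assuming 60 fields/sec for 1080i and 60 frames/sec for 720p, and ignoring compression, chroma subsampling, overscan, etc.):

```python
# Rough pixel-rate comparison between 1080i and 720p.
# Assumes 60 fields/sec for 1080i and 60 frames/sec for 720p.

frame_1080 = 1920 * 1080        # 2,073,600 px per full interlaced frame (1/30 sec)
frame_720 = 1280 * 720          # 921,600 px per progressive frame (1/60 sec)

px_per_sec_1080i = frame_1080 * 30   # two 540-line fields per frame -> 62,208,000 px/sec
px_per_sec_720p = frame_720 * 60     # 55,296,000 px/sec

# On fast motion the two 1080i fields no longer line up, so the usable
# detail drops toward a single 540-line field:
motion_1080i = frame_1080 // 2       # 1,036,800 px

print(f"1080i: {frame_1080:,} px/frame, {px_per_sec_1080i:,} px/sec")
print(f"720p : {frame_720:,} px/frame, {px_per_sec_720p:,} px/sec")
print(f"1080i under heavy motion ~ {motion_1080i:,} px of usable detail")
```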
I think blind tests would show that the theoretical resolution difference gets lost in a bunch of real-life factors.
First, all providers compress the datastream to various degrees, which hurts 1080i's actual resolution more than 720p's.
Second, the native resolution of the TV will have a big effect. 720p will look better on a 720p set, and 1080i will look better on a 1080i/p set.
Third, there is the quality of the conversion algorithms, deinterlacers, etc.
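To give a sense of why the deinterlacer matters: even the simplest approach ("bob", i.e. take one field and line-double it) throws away half the vertical detail on every frame. A minimal sketch with NumPy (the array sizes and names are just placeholders, not any particular set's implementation):

```python
import numpy as np

def bob_deinterlace(frame, keep_even=True):
    """Naive 'bob' deinterlace: keep one field and line-double it.

    frame: 2-D array of luma values, e.g. 1080 x 1920.
    Returns a full-height frame built from only half the lines,
    which is why cheap deinterlacing costs vertical resolution.
    """
    field = frame[0::2] if keep_even else frame[1::2]   # 540 of the 1080 lines
    return np.repeat(field, 2, axis=0)                  # duplicate each line back to 1080

# Example: a fake 1080-line frame of random luma values.
interlaced = np.random.randint(0, 256, size=(1080, 1920), dtype=np.uint8)
progressive = bob_deinterlace(interlaced)
print(progressive.shape)   # (1080, 1920), but only 540 distinct lines of detail
```

Better deinterlacers blend or motion-compensate between fields instead, which is exactly where the quality differences between sets show up.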
Bottom line is that this is kind of a stupid argument, especially since there isn't much we can do to affect or improve what we get.