What's better: 1080p or 1080i?

1080i (60Hz or 50Hz) is the high-definition standard for broadcast. 1080p (60Hz or 50Hz) is a format that is currently available only from local sources such as Blu-ray players, games consoles and computers.
The difference between them is the way the image is refreshed. 1080p 60Hz means that the image is completely refreshed 60 times every second. 1080i 60Hz means that every other line is refreshed 60 times each second, so a full refresh of all the image lines happens 30 times each second. Theoretically, 1080p 60Hz should handle fast movement better than 1080i 60Hz. In practice, there is little to choose between them.
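To make the refresh arithmetic concrete, here is a small sketch (illustrative only: it counts raw image lines and ignores blanking intervals and other signal overhead):

```python
# Lines delivered per second for progressive (p) vs interlaced (i) video.
LINES_PER_FRAME = 1080

def lines_per_second(refresh_hz, interlaced):
    """Image lines delivered each second at a given refresh rate."""
    # Interlaced video refreshes only every other line per pass (a "field").
    lines_per_refresh = LINES_PER_FRAME // 2 if interlaced else LINES_PER_FRAME
    return refresh_hz * lines_per_refresh

p60 = lines_per_second(60, interlaced=False)  # 1080p 60Hz
i60 = lines_per_second(60, interlaced=True)   # 1080i 60Hz

# 1080i 60Hz needs two refreshes for a complete image, hence 30 full images/s.
full_images_per_second_i60 = 60 / 2

print(p60, i60, full_images_per_second_i60)
```

So 1080p 60Hz delivers twice as many lines per second as 1080i 60Hz, which is why the interlaced version completes a full image only 30 times a second.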

There are also different versions of 1080p. 1080p 24Hz or 25Hz is sometimes used for a transfer from film. These frame rates match the frame rates used for film. 1080p 24Hz actually has a lower data rate than the usual 1080i 50Hz or 60Hz broadcast rates.
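As a rough check on that data-rate claim, here is a sketch comparing raw pixel throughput (uncompressed, ignoring blanking and chroma subsampling, so the absolute numbers are illustrative rather than true broadcast figures):

```python
WIDTH, HEIGHT = 1920, 1080

def raw_pixels_per_second(refresh_hz, interlaced=False):
    """Raw pixels delivered per second, ignoring blanking and subsampling."""
    # Each interlaced refresh carries only half the image lines.
    pixels_per_refresh = WIDTH * (HEIGHT // 2 if interlaced else HEIGHT)
    return refresh_hz * pixels_per_refresh

p24 = raw_pixels_per_second(24)                   # 1080p 24Hz film transfer
i60 = raw_pixels_per_second(60, interlaced=True)  # 1080i 60Hz broadcast

print(p24 < i60)  # True: 1080p 24Hz carries less raw data than 1080i 60Hz
```

1080p 24Hz works out to about 49.8 million raw pixels per second versus about 62.2 million for 1080i 60Hz, so the progressive film-rate format is indeed the lighter of the two.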

Finally, almost all video content is compressed to reduce the amount of data that must be stored or transmitted. The amount of compression can vary widely, and the level of compression has an impact on image quality. Compression often has a greater impact on image quality than the original display format does.

A note about frame rates: In North America, the standard frame rate for television is 29.97Hz (normally referred to as 30Hz), and frame rates of 30Hz and its multiples are used. In Europe, the standard frame rate is 25Hz and its multiples. Therefore, 30Hz and 25Hz suggest North American and European content respectively.
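The odd-looking 29.97Hz figure comes from the NTSC colour standard, which defined the rate as the fraction 30000/1001 rather than exactly 30Hz:

```python
# The North American "30Hz" television rate is the NTSC fractional rate.
ntsc_rate = 30000 / 1001

print(round(ntsc_rate, 2))  # 29.97
```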