What is the difference between 1080p and 1080i HDTV sets?

720p, 1080i, and 1080p are three types of broadcast signals that may reach your HDTV set as an input, with 1080p being the highest quality and, as of mid-2008, only available on Blu-ray discs and in high-end computer games. People often confuse these broadcast signals with the native resolution of the TV sets themselves. All flat-panel HDTVs (i.e., plasma and LCD), as opposed to CRT tube sets, are inherently progressive in nature.

For marketing reasons, however, some manufacturers promote 720p (p for progressive) HDTVs as 1080i (i for interlaced), mainly, I suppose, to signal that the set accepts a 1080i input and to improve sales. These so-called '1080i HDTVs' take a 1080i signal, de-interlace it, and display it in progressive scan mode, but downconverted to 720p resolution. So a '1080i' TV set is in reality a 720p set; many manufacturers simply relabeled their 720p sets that way once 1080p sets came along.

1080p sets, on the other hand, take a 1080i cable or satellite signal and only de-interlace it, creating a progressive-scan picture: the image is painted from the top line to the bottom line (there are 1080 such horizontal lines) in a single pass, and this process repeats 60 times per second (in the US). A rough sketch of the difference between the two kinds of sets is given below.

Finally, according to a CNET reviewer (see the link below), the extra sharpness afforded by 1080p compared to 720p televisions is not noticeable when watching 1080i sources on 50-inch or smaller sets from a distance of at least 8 feet. Last but not least, according to the Imaging Science Foundation, the most important aspect of picture quality is contrast ratio, the second most important is color saturation, the third is color accuracy, and only the fourth is resolution, despite its being easily the most talked-about.
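To make the de-interlacing and downconversion concrete, here is a minimal Python sketch. It assumes a simple "weave" de-interlacer and a nearest-neighbor vertical scaler purely for illustration; real sets use more sophisticated processing, and none of the function names here come from any actual manufacturer's pipeline.

```python
# Illustrative sketch (assumptions, not any real TV's processing):
# a 1080i broadcast arrives as two 540-line fields per frame, 1920 pixels wide.

def weave_deinterlace(top_field, bottom_field):
    """Weave two 540-line interlaced fields into one 1080-line progressive frame."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)      # odd (top-field) scan line
        frame.append(bottom_line)   # even (bottom-field) scan line
    return frame

def downscale_lines(frame, target_lines):
    """Nearest-neighbor vertical downscale, e.g. 1080 lines -> 720 lines on a 720p panel."""
    src_lines = len(frame)
    return [frame[i * src_lines // target_lines] for i in range(target_lines)]

top_field    = [f"line {2 * i}"     for i in range(540)]
bottom_field = [f"line {2 * i + 1}" for i in range(540)]

full_frame = weave_deinterlace(top_field, bottom_field)
print(len(full_frame))                        # 1080: a true 1080p set displays this directly
print(len(downscale_lines(full_frame, 720)))  # 720: a 720p-native "1080i" set downconverts first

# Pixel counts behind the marketing labels:
print(1280 * 720)    # 921,600 pixels per 720p frame
print(1920 * 1080)   # 2,073,600 pixels per 1080p frame (about 2.25x as many)
```

In other words, both kinds of sets end up displaying a progressive picture; the difference is only whether the de-interlaced 1080-line frame is shown at full resolution or scaled down to the panel's native 720 lines.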