Originally Posted by sfhub
I do not see why it would be filtered for a plasma or LCD display if it is applying IVTC.
How could a broadcaster or a BD/HD-DVD player know what kind of display is being used?
I have tried to find out whether every 1080i display sold in the US has such a filter built in.
Nobody seems to know.
If there is, then filtering the broadcast signal isn't mandatory, because the filtering can be done inside the television when needed.
If there isn't, then the broadcast signal must be filtered before transmission.
The question isn't all that essential, though, because Vern already noted that the 1080i signal gets filtered anyway to achieve more efficient compression.
Anyway, this stuffing of progressive 24fps material into 1080i60 seems a bit weird. Why all the effort?
Why can't BD/HD-DVD players just output plain 1080p24?
Why couldn't all the (new) displays accept 1080p24/25/30/50/60?
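For anyone wondering what that "stuffing" actually involves: it's the usual 3:2 pulldown, where each 24p frame is split into fields and every other frame is held for an extra field, so four film frames become ten 60i fields; IVTC on the other end detects that cadence and re-pairs the fields. Here's a rough Python sketch of the idea (field labels and the simple duplicate-dropping IVTC are just illustrative, not how a real deinterlacer works):

```python
# 3:2 pulldown sketch: spread 24p frames across 1080i60 fields.
# 4 frames * (10 fields / 4 frames) = 10 fields, i.e. 4/24 s == 10/60 s.
def telecine(frames):
    fields = []
    for i, frame in enumerate(frames):
        # Alternate holding each frame for 2 fields, then 3 fields.
        repeat = 2 if i % 2 == 0 else 3
        for _ in range(repeat):
            parity = 't' if len(fields) % 2 == 0 else 'b'  # top/bottom field
            fields.append((frame, parity))
    return fields

def ivtc(fields):
    # Toy inverse telecine: collapse runs of fields from the same
    # source frame back into one progressive frame each.
    frames = []
    for frame, _ in fields:
        if not frames or frames[-1] != frame:
            frames.append(frame)
    return frames

src = list(range(4))      # four 24p frames, numbered 0..3
fields = telecine(src)    # ten interlaced fields
assert len(fields) == 10
assert ivtc(fields) == src  # the cadence is fully reversible
```

The point being: as long as the cadence survives the chain untouched, 1080i60 can carry 1080p24 losslessly, which is exactly why it seems silly not to just send 1080p24 directly.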
All the HDMI chipsets that have come to market this year already support 1080p.
Why do (new) HD-DVD players and displays use old chipsets that don't support 1080p? To save $10 on a $500 player or a $2500 display?
This is simply crazy...
I'm not buying anything that doesn't support 1080p, regardless of what kind of tweaks can be made to 1080i to make it more or less equal to 1080p.
We all want 1080p, so why desperately cling to 1080i?