Do high-end European "100hz" TVs show real, honest-to-god 100hz progressive output? Or are companies like Sony and Philips up to just as much mischief over there as they are over here -- taking huge liberties with terms like "progressive" and "100hz" to market displays that knowledgeable computer users would sneer at if they saw the real specs laid out in computer terms?
Offhand, I can think of a few ways a marketing department could define 100hz PAL:
* deinterlace broadcast 576i50 to 576p50, then render each frame in 1/100th of a second two times in a row -- creating beautiful, rock-solid flicker-free 100hz progressive output. And then we all wake up, and remember that TV manufacturers are scum who'll do anything to shave $1.37 off the manufacturing cost of a TV intended to sell for $3,000+.
* deinterlace broadcast 576i50 to 576p50, then use each newly-created pseudo-progressive frame to create two 100hz interlaced fields. Under this scheme, broadcast field #1 (odd) would correspond to rendered field #1 (odd), broadcast field #2 (even) would correspond to rendered field #4 (even), and rendered fields #2 (even) and #3 (odd) would be figments of Faroudja's imagination (*grin*). Technically, the source WOULD be deinterlaced, and the scanrate would technically be 100hz, but I'd personally beat the living crap out of any salesperson who tried to convince me such a display were 100hz progressive.
* some god-awful scheme even worse than the second... dispensing with Faroudja entirely, and just buffering each pair of fields for display twice. This evil scheme would render PAL field #1 in 1/100th of a second as the display's odd scanlines, PAL field #2 in 1/100th of a second as the display's even scanlines, then repeat. On one hand, it wouldn't flicker... but I suspect the output would stutter badly, since it would constantly be taking "2 steps forward, 1 step back" as it updated the display.
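For what it's worth, the three candidate schemes boil down to different field orderings at the 100hz refresh slots. Here's a toy Python sketch of the timelines as I understand them -- all labels and function names are mine, purely illustrative, not any manufacturer's actual pipeline:

```python
# Which source field lands in each 1/100 s display slot under the three
# hypothetical "100hz" schemes. Labels like "1o" mean broadcast field 1
# (odd lines); "interp-*" marks a field invented by the deinterlacer.

def scheme_progressive(fields):
    """Scheme 1: deinterlace each field to a full frame, show it twice."""
    out = []
    for f in fields:
        frame = f"deint({f})"
        out += [frame, frame]
    return out

def scheme_interlaced_100(fields):
    """Scheme 2: stay interlaced at 100hz; fields 2 and 3 of each group
    of four are interpolated, with the real fields at slots 1 and 4."""
    out = []
    for a, b in zip(fields[0::2], fields[1::2]):
        out += [a, "interp-e", "interp-o", b]
    return out

def scheme_aabb(fields):
    """Scheme 3: buffer a field pair and scan it out twice (ABAB CDCD...),
    stepping back in time on every third slot -- hence the stutter."""
    out = []
    for a, b in zip(fields[0::2], fields[1::2]):
        out += [a, b, a, b]
    return out

src = ["1o", "2e", "3o", "4e"]  # four broadcast fields = 2/25 s of PAL
print(scheme_progressive(src))
print(scheme_interlaced_100(src))
print(scheme_aabb(src))
```

Running it makes the third scheme's "2 steps forward, 1 step back" motion error obvious: field 1 reappears after field 2 has already been shown.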
So... am I close with any of them, or do 100hz European TVs do something else entirely?