Originally Posted by alluringreality
I'm not sure if I posted the link, but somewhere online I read that the monoscope pattern is 1080i. Neither the HD Guru nor the CNET post appears to confirm that the pattern is 1080i, but it seems reasonable that could be the video type. Assuming the pattern is interlaced, it would primarily apply to things like sports from CBS or NBC.
It would apply to any live television with motion (not static images) that is broadcast in 1080i, which is quite common in many cable and over-the-air US markets. The motion doesn't need to be "sports," or very fast at all, for the quality loss to be visible, as demonstrated by the rather leisurely panning of the test patterns I've seen used:
[Note: This YouTube video is too low quality to make any distinctions between the two TVs shown, but I've included it to show the speed of the moving monoscope pattern typically used. From other documents I believe it is 5 seconds per screen change.]
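A quick back-of-envelope calculation shows why even that "leisurely" pan stresses a deinterlacer. Assuming a 1920-wide 1080i frame, the standard US field rate of 59.94 Hz, and one full screen width traversed in the roughly 5 seconds cited above (all of these figures are my assumptions, not from the test documentation), each field is offset from the previous one by several pixels, so adjacent fields can't simply be woven together without combing:

```python
# Rough pan-speed estimate for a horizontally panning test pattern.
# Assumed values (not from the original post):
FRAME_WIDTH_PX = 1920       # horizontal resolution of a 1080i frame
FIELD_RATE_HZ = 59.94       # fields per second in US 1080i broadcast
SECONDS_PER_SCREEN = 5.0    # pan speed cited in the note above

fields_per_screen = FIELD_RATE_HZ * SECONDS_PER_SCREEN
px_per_field = FRAME_WIDTH_PX / fields_per_screen

print(f"~{px_per_field:.1f} px of horizontal motion per field")
```

At several pixels of displacement per field, a TV that merely weaves or bobs fields (rather than doing proper motion-adaptive deinterlacing) will visibly lose resolution on this pattern, which is the point of the test.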
In A/V reproduction accuracy, there is no concept of "accounting for taste." We don't "pick" the level of bass, etc., any more than we pick the ending of a play. High fidelity means an unmodified, neutral, exact copy (or "reproduction") of the original artist's tonal balance, timing, dynamics, and so on.