Originally Posted by Schoman
..."Pixellation occurs when the bandwidth cannot keep up with the amount of data it is expected to carry. This is most frequently a signal compression issue rather than a television problem. You may have noticed it most severely in that fashion show because of all the flash photography. Most scenes, even those containing fairly rapid motion, have large areas that don't change as the picture goes from frame to frame. The areas that keep the same color and luminosity are amenable to high compression ratios. When you get a lot of flashbulbs popping, however, you need to rapidly refresh EVERY SINGLE PIXEL on the whole screen, and this requires much more data to be transmitted, more than the system can handle. So they deal with this by making the "pixels" bigger, and thus the pixellation.
This describes what I'm seeing, so the culprit again seems to be my incoming WoW signal. But is it a crappy, overly compressed signal or the TV's inability to respond quickly enough to the rapid changes?...
The explanation you have quoted is correct up to the last line, which I have emphasized. It begins as a solid, scientific explanation, then veers sharply into the land of fairy tales. First of all, pixels do not change size; if they could, that would be an impressive (and impossible) magic trick, however useless. But there is an explanation.
What is typically referred to as pixellation is more correctly described by another colloquial term, which is "macroblocking". In MPEG-2, the picture is divided into blocks 16 pixels wide by 16 pixels high, called macroblocks (each made up of smaller 8x8 blocks for the DCT). In newer codecs the sizes are more flexible: MPEG-4 AVC (H.264) can partition its 16x16 macroblocks down to blocks as small as 4x4, and HEVC allows coding units as large as 64x64, depending on the content and motion. These blocks of pixels are processed as individual units, chiefly for motion compensation and coding efficiency.
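As a rough illustration, here is how a luma frame divides into a grid of 16x16 macroblocks. This is a minimal sketch in Python/NumPy with hypothetical names; a real decoder works on the compressed bitstream, not on raw pixel arrays like this:

```python
import numpy as np

MB = 16  # MPEG-2 luma macroblock size in pixels

def split_into_macroblocks(frame, mb=MB):
    """Split an (H, W) luma frame into a grid of mb x mb macroblocks.

    Assumes H and W are multiples of mb; real coded frames are padded
    so that this holds."""
    h, w = frame.shape
    # reshape/swap so axes 0-1 index the block grid and axes 2-3 the pixels
    return frame.reshape(h // mb, mb, w // mb, mb).swapaxes(1, 2)

frame = np.zeros((720, 1280), dtype=np.uint8)  # a 720p luma plane
blocks = split_into_macroblocks(frame)
print(blocks.shape)  # (45, 80, 16, 16): a 45 x 80 grid of 16x16 blocks
```

Each of those 3,600 blocks is then transformed, quantized, and motion-compensated as its own little unit.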
The above description indicates that when there is lots of activity, the bitstream can't deliver enough data to reconstruct all new pixels for each frame, and for consumer delivery that is correct. That is why temporal compression (the use of P and B frames) is employed: it allows a smaller bitstream by representing not each pixel but the difference between pixels from frame to frame. Compression ratios reach as much as 100:1 or 200:1, which means that 99 to 99.5% of the original data is discarded in compression; the fact that this can still normally result in good HD pictures is nothing short of remarkable.
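To see why frame-to-frame differences compress so much better than full frames, here is a toy sketch. It assumes Python with NumPy, and uses zlib as a stand-in entropy coder; real MPEG encoders use DCT, quantization, and motion compensation, not zlib, but the principle is the same:

```python
import numpy as np
import zlib

rng = np.random.default_rng(0)

# Two consecutive "frames": a detailed picture where only a 64x64 region moves.
prev = rng.integers(0, 256, (720, 1280), dtype=np.uint8)
curr = prev.copy()
curr[100:164, 200:264] = rng.integers(0, 256, (64, 64), dtype=np.uint8)

# Intra coding: compress the whole new frame from scratch.
intra = len(zlib.compress(curr.tobytes()))

# Temporal coding: compress only the frame-to-frame difference,
# which is zero everywhere except the small moving region.
delta = curr.astype(np.int16) - prev.astype(np.int16)
inter = len(zlib.compress(delta.tobytes()))

print(inter < intra)  # True: the difference stream is far smaller
```

In a real encoder the unchanged macroblocks are signalled as "skipped" or coded as tiny motion-compensated residuals, which is where those 100:1-class ratios come from.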
So here is what is really going on:
When the MPEG decoder in your TV or STB is able to decode the streaming data at the display rate or above, everything looks great; each macroblock updates properly on each scan of the display.
When the data is partially missing or decoded too slowly (buffer underflow), some of the macroblocks do not update from frame to frame but remain frozen on the screen, awaiting new data. You can think of them as "waiting for the bus": when the "bus" does not arrive in time (no good data representing the new macroblock is available), they have to remain there frozen, waiting for the next "bus" (the next frame with good data for that macroblock) to come along and refresh what that macroblock is supposed to be displaying, sometimes for many frames in a row. It is then a selective "freeze-frame" on a macroblock-by-macroblock basis: some blocks are refreshed on time; some aren't and remain frozen. When a newly refreshed macroblock sits adjacent to a stale one (which is common) and there is motion, it creates a mosaic effect that delineates the hard edges of the macroblocks, edges which would otherwise be invisibly and seamlessly stitched together.
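The "waiting for the bus" behaviour can be sketched like this. It is a hypothetical model in Python/NumPy, not how any real decoder is written; the `arrived` grid stands in for the decoder knowing which macroblocks received good data this frame:

```python
import numpy as np

MB = 16  # macroblock size in pixels

def refresh_display(display, new_frame, arrived):
    """Update only the macroblocks whose data arrived in time.

    `display` is what is currently on screen, `new_frame` is the newly
    decoded frame, and `arrived` is a boolean grid with one flag per
    macroblock. Blocks flagged False stay frozen, which is what produces
    the mosaic effect at their hard edges."""
    out = display.copy()
    rows, cols = arrived.shape
    for r in range(rows):
        for c in range(cols):
            if arrived[r, c]:
                out[r*MB:(r+1)*MB, c*MB:(c+1)*MB] = \
                    new_frame[r*MB:(r+1)*MB, c*MB:(c+1)*MB]
    return out

display = np.zeros((64, 64), dtype=np.uint8)        # stale picture on screen
new_frame = np.full((64, 64), 200, dtype=np.uint8)  # fresh decoded picture
arrived = np.ones((4, 4), dtype=bool)
arrived[1, 2] = False                               # one block missed its "bus"

shown = refresh_display(display, new_frame, arrived)
print(shown[16:32, 32:48].max())  # 0: that block still shows the old frame
print(shown[0, 0])                # 200: its neighbours updated normally
```

One stale block next to fifteen fresh ones is exactly the hard-edged mosaic described above.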
That is the pixellation effect you hear described, and it is likely what you are seeing. It can be accompanied by another DCT compression artifact called "mosquito noise", which makes hard edges look as if they are surrounded by a cloud of dust, rather like Pig-Pen in the "Peanuts" comic strip. If decoding slows enough, parts of the screen, or all of it, are muted to black momentarily or permanently, on a macroblock-by-macroblock basis, creating an effect reminiscent of the old video game "Brickbreaker".
If you have poor reception, macroblocking will typically be regular and unrelated to the amount of movement on the screen. If you have good reception and the macroblocking appears only during high motion, the bit starvation very likely occurred somewhere other than in your local MPEG decoder; more likely in compression for delivery, or even in the original shooting and production.
Short answer: it is highly unlikely that this is a problem with your set, and very likely that you are simply becoming accustomed to the sorts of new artifacts that digital delivery can bring, artifacts which we did not have with analog delivery (although analog's artifacts were usually a lot worse and degraded the picture quality constantly to some degree).