The discussion so far centers on the final delivery link, but you need to look at how the content was transmitted all the way from the content provider out to the "edge network" (cable, OTA, satellite).
For the most part, the video stream is packetized and shoved into the "network pipes" so as not to waste any time slots in the TDM transmission stream. Satellite broadcasters were the first to use "statistical multiplexing" to get close to 100% utilization of the uplink bandwidth, and cable companies have been adding "stat-mux" to their hybrid fiber-coax distribution systems.
This is a technique that "looks ahead" at the data waiting to be packed into a TDM stream (a lambda in fiber-optic systems). The goal is to never waste a time slot on "no data" padding. The stat-mux decides whose packet gets to go to the head of the queue and get sent.
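To make the idea concrete, here's a toy sketch of that lookahead/queue decision in Python. This is NOT any provider's actual algorithm (real stat-muxes weigh encoder rate and picture complexity, not just queue depth); the channel names and the "drain the deepest backlog first" rule are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Packet:
    channel: str
    size: int  # payload bytes, e.g. a 188-byte MPEG transport packet

class StatMux:
    """Toy statistical multiplexer: fills every slot of a fixed-size
    TDM frame from whatever is queued, so no slot goes out as padding."""

    def __init__(self):
        self.queues = {}  # channel name -> list of waiting Packets

    def enqueue(self, pkt: Packet) -> None:
        self.queues.setdefault(pkt.channel, []).append(pkt)

    def fill_frame(self, slots: int) -> list:
        """Return up to `slots` packets for one TDM frame, always
        draining the most backlogged channel first -- a stand-in for
        the rate/complexity heuristics a real stat-mux would use."""
        frame = []
        for _ in range(slots):
            if not self.queues:
                break  # nothing left anywhere; remaining slots stay empty
            # pick the channel whose backlog is deepest right now
            busiest = max(self.queues, key=lambda c: len(self.queues[c]))
            frame.append(self.queues[busiest].pop(0))
            if not self.queues[busiest]:
                del self.queues[busiest]  # drop empty queues from contention
        return frame
```

Usage: queue three packets for one channel and one for another, then ask for a five-slot frame; you get back only the four real packets, with the backlogged channel served first, rather than a frame padded with empty slots.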
This scheduling can create high latency, out-of-order packets, and other network conditions that show up as macroblocking in your TV image.
If you were one of the first 1,000 DirecTV customers to "beta" their 2 HD channels, you were a guinea pig for testing new stat-mux algorithms for HD. It's come a long way in 15 years, and it's still improving.
Bottom line: bit rate is not the only factor in video quality, since a content stream does not get guaranteed bandwidth end to end across the network(s).
All of this "magic" happens at Layer 2 of the network stack. Whether Layer 1, the physical layer, is cable, DSL/VDSL, fiber, or satellite RF matters less than Layer 2, where the packet stream is managed.
All of the CSPs are converging on the same set of signal-processing techniques at L2, so I expect to see smaller and smaller differences between providers.