I recently switched from Comcast to RCN in the Boston area.
I've been noticing more macroblocking artifacts on RCN than I'd seen
on Comcast, especially on SD channels. Has anyone else observed this?
It appears that RCN broadcasts many of their SD channels
at a higher bitrate than Comcast. For example, Comcast carries
the SciFi channel at around 2.7 Mbps in SD, while RCN ranges
from about 3 to 4.5 Mbps, i.e. up to roughly 70% higher
(4.5/2.7 is about 1.67).
The bitrate does seem to vary by channel on both providers, possibly
more so on RCN. Of that total, the audio is 192 kbps on Comcast vs.
256 kbps on RCN on the one show I checked (Stargate Atlantis on SciFi).
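In case anyone wants to sanity-check my numbers on their own
recordings, here's roughly how I'd estimate the average mux bitrate
from a captured clip in Python. The filename and duration are just
placeholders, and note this gives the total mux rate (video + audio +
overhead), not the video stream alone:

import os

def avg_bitrate_mbps(path, duration_seconds):
    # Average mux bitrate in Mbps: total bits divided by duration.
    bits = os.path.getsize(path) * 8
    return bits / duration_seconds / 1e6

# Example: a 60-second SD capture (placeholder name and length)
print("%.2f Mbps" % avg_bitrate_mbps("capture.ts", 60.0))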
See the attached images comparing the opening credits of Stargate
Atlantis on the SciFi channel as carried by Comcast and RCN.
As you can see, RCN's image has much more detail in high-frequency
areas (such as text and fine lines). However, the RCN image also has
much worse macroblocking in low-frequency areas such as flat surfaces
and fog.
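If anyone wants to go beyond eyeballing the screenshots, here's a
crude sketch of one way to quantify the blocking: compare pixel
differences across the 8-pixel block boundaries against differences
everywhere else; a ratio well above 1 means visible block edges. This
assumes the MPEG-2 8x8 grid is aligned with the frame (so it won't
work on scaled screenshots), and "frame.png" is just a placeholder:

import numpy as np
from PIL import Image

# Grayscale frame as a float array; "frame.png" is a placeholder.
img = np.asarray(Image.open("frame.png").convert("L"), dtype=np.float64)

# Horizontal differences between neighboring pixels.
diffs = np.abs(np.diff(img, axis=1))

# Differences at column indices 7, 15, 23, ... straddle an 8x8 block edge.
cols = np.arange(diffs.shape[1])
edge = (cols % 8) == 7

ratio = diffs[:, edge].mean() / diffs[:, ~edge].mean()
print("block-edge / interior difference ratio: %.2f" % ratio)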
I wonder if there is a difference in their encoder settings? It's
quite annoying on the RCN side, as the macroblocking can be fairly
visible during scenes in dark rooms with flat wall surfaces and
distracts from the center of the picture. (Or at least I notice this
on my Panasonic plasma... the danger of having good contrast!) Is
there anything RCN can do to improve this?
For HD, the bitrate seems to be more consistent between the two.
(For example, both broadcast Lost at around 17 Mbps and Terminator
at around 10 Mbps.)
I'm also looking forward to Comcast and RCN offering the SciFi channel
in HD in Boston. Does anyone know when this may happen?