Originally Posted by Bengbeng
I've been doing some image-quality comparison tests between my local cable provider here in Holland and a Scandinavian satellite provider which I receive by dish.
Both providers run the same channel, Discovery HD, at about the same bandwidth (~14 Mbit/s H.264/MPEG-4), so that's a nice one to compare.
Satellite wins. The image is a little sharper; it looks more "real".
A problem is that I can't blame the cable receiver, because I can't try out another one from a different brand. Only that specific receiver works on our cable network.
If both the cable and DBS sources, or outside hardware firms, can provide a means of recording the MPEG-4 data (DVRs), you could compare the average bit rates for the same show and see if one source has a slight edge bit-rate-wise. There's a current thread in the HD programming forum (by AVSer bfdtv) comparing two cable sources, all-fiber FiOS and hybrid Comcast (MPEG-2). But I believe there's some technical reason why MPEG-4 recordings can't be analyzed like this, and the same may apply in your case.
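The bit-rate comparison itself is just arithmetic: total bits in the recording divided by its run time. A minimal sketch, assuming you can read each file's size and duration from the DVR's info screen or a media tool (the file sizes below are made-up illustration values, not real measurements):

```python
# Rough average-bit-rate comparison for two recordings of the same show.
# Inputs are assumptions: file size in bytes and duration in seconds,
# taken from the DVR or a media-info tool.

def avg_mbps(size_bytes: int, duration_s: float) -> float:
    """Average bit rate in Mbit/s: total bits divided by run time."""
    return size_bytes * 8 / duration_s / 1_000_000

# Example: a one-hour show recorded on both sources (numbers invented)
cable = avg_mbps(size_bytes=6_100_000_000, duration_s=3600)
sat = avg_mbps(size_bytes=6_500_000_000, duration_s=3600)
print(f"cable: {cable:.2f} Mbit/s, satellite: {sat:.2f} Mbit/s")
```

Note this measures the whole multiplex the DVR stored; if the recording includes audio and other PIDs, the video-only rate will be somewhat lower.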
But spectrum analysis comparing the luma output of the STBs, if available, on various higher-resolution scenes might reveal quantitatively the PQ differences you see. Here's an analysis
by AVSer dr1394 of a crowd scene (next post) using a free/low-cost software program. Various firms, including one in Denmark, make plug-in PC cards (for analog luma or MPEG readings) for spectrum analysis, too. -- John