
TV Signal Compression In The Real World - Page 6

post #151 of 270
Quote:
Originally Posted by homogenic View Post

Would 4:0:0 look inferior to us?
Wouldn't that be black and white (greyscale) TV? :)

Did you mean 4:2:0?
Here's a link:
http://upload.wikimedia.org/wikipedia/en/0/06/Colorcomp.jpg
http://en.wikipedia.org/wiki/Chroma_subsampling

I think normal video shouldn't really look that inferior to 4:4:4 most of the time. The differences will be more obvious with CGI/test cards.

But if it doesn't require any increase in bitrate (might it even take less?) to have full 4:4:4 colour for compressed video, and it will always have the full quality, I think it's worth using.
Edited by Joe Bloggs - 8/29/13 at 11:45pm
post #152 of 270
Quote:
Originally Posted by andy sullivan View Post

Horribly compressed is the key phrase here I think. What is/can be done to rectify this problem? What would have to happen to allow for a totally uncompressed signal? What provider is delivering the best signal to its subs today?

I'm not sure if it's even possible for a cable or satellite provider to supply a high-quality HDTV signal these days. You have to keep in mind that the same over-compression is also happening on the main c-band uplink from HBO, SHO, etc. Maybe someone with a big c-band dish in their yard can confirm, but I'm pretty sure most channels are now distributed in low-bitrate (~6 Mbps) h.264. So even providers like FIOS, who don't over-compress, still have poor quality because that is what they receive. You know what they say about garbage in, garbage out. There are simply too many channels for the available bandwidth.
post #153 of 270
What would the world be like if there was no compression? What would the world be like with 50 high quality high resolution channels versus millions of channels of low-rez garbage?
post #154 of 270
Quote:
Originally Posted by Chronoptimist View Post

Colorspace (color gamut) and chroma sampling are two very different things.

Chroma sampling is the resolution of the color component of a TV signal. 4:2:0 means that it is 1/4 the resolution of the luma (detail) component (half in each dimension).
4:2:2 means it is half the resolution (horizontally only), and 4:4:4 means that it is the same resolution as the luma component.

Colorspace is the range of colors that the display is capable of showing. A wider color gamut (colorspace) means that the content can contain more vivid and more accurate colors. (because there are things which have color beyond the limits of the current BT.709 colorspace)
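
(Editor's illustration: a minimal Python sketch of those sampling ratios; the 1920x1080 frame size is just an example.)

Code:
# Chroma plane dimensions for a given luma resolution.
# 4:4:4 = full chroma, 4:2:2 = half horizontally, 4:2:0 = half in both axes.
def chroma_plane(width, height, sampling):
    factors = {"4:4:4": (1, 1), "4:2:2": (2, 1), "4:2:0": (2, 2)}
    fx, fy = factors[sampling]
    return width // fx, height // fy

for s in ("4:4:4", "4:2:2", "4:2:0"):
    w, h = chroma_plane(1920, 1080, s)
    print(f"{s}: each chroma plane is {w}x{h} ({w * h / (1920 * 1080):.0%} of luma)")
# 4:4:4: 1920x1080 (100%), 4:2:2: 960x1080 (50%), 4:2:0: 960x540 (25%)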

Yeah, I know. I misspoke when I said color space.

What I'm getting at is why you think 4:4:4 vs 4:2:0 is less important for video than some think. Sure, the eye is more sensitive to luma than chroma. And there are hardly any TVs available that can make use of anything better than 4:2:0. Is that what you mean?

But what about Rec.2020 (either 10 or 12 bit) on a UHD set capable of taking advantage of it? In that case, will the difference between 4:4:4 and 4:2:0 still be...hm, not very important?
post #155 of 270
Thread Starter 
It's obvious that there are several here who really have a handle on what the problems are as well as what's causing them. Assuming that you can't just make a bunch of channels disappear to free up more space, is there an affordable solution? Is it impossible or do they just think that enough of us really don't care? I'm guessing it's the latter.
post #156 of 270
Quote:
Originally Posted by andy sullivan View Post

It's obvious that there are several here who really have a handle on what the problems are as well as what's causing them. Assuming that you can't just make a bunch of channels disappear to free up more space, is there an affordable solution? Is it impossible or do they just think that enough of us really don't care? I'm guessing it's the latter.

 

Ironically, I just posted elsewhere that I do believe that what we'll see with 4K in the FIOS/Comcast/Dish/DirecTV/etc. world will be a weeding out of channels to make room.  We'll see some of the low volume lackluster channels take a back seat (and be removed).  Particularly a number of the SD channels I'm betting.

 

The reason for this is simple: it'll be similar to what it was like when HD showed up: each provider advertising their own number of 4K offerings and bragging that they have more of them than the other.

post #157 of 270
Quote:
Originally Posted by fritzi93 View Post

What I'm getting at is why you think 4:4:4 vs 4:2:0 is less important for video than some think. Sure, the eye is more sensitive to luma than chroma. And there are hardly any TVs available that can make use of anything better than 4:2:0. Is that what you mean?
But what about Rec.2020 (either 10 or 12 bit) on a UHD set capable of taking advantage of it? In that case, will the difference between 4:4:4 and 4:2:0 still be...hm, not very important?
Colorspace (BT.2020) doesn't really apply when it comes to chroma resolution. The argument for using 4:2:0 encoding is that the eye is less sensitive to color - but that is actually not my experience. It's obvious to me when a game is displayed at 4:2:2 or lower. (games are natively rendered in RGB, which is equal to 4:4:4)

Good upsampling can minimize the effects of chroma subsampling with video though, and that's not even worthwhile on a number of displays, as many only perform their processing in 4:2:2 anyway - so even if you had 4:4:4 native video, the television would be throwing half of that away.
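
(Editor's sketch of what "good upsampling" buys you, assuming numpy; a toy 2x nearest vs. linear chroma upscale, not any particular TV's processing.)

Code:
import numpy as np

# Toy chroma upsampling for 4:2:0 -> 4:4:4. Nearest-neighbour repeats
# each chroma sample 2x2; linear interpolation smooths between samples
# and hides the subsampling much better on gradients.
def upsample_nearest(c):
    return np.repeat(np.repeat(c, 2, axis=0), 2, axis=1)

def upsample_linear(c):
    h, w = c.shape
    ys = np.linspace(0, h - 1, 2 * h)
    xs = np.linspace(0, w - 1, 2 * w)
    rows = np.array([np.interp(xs, np.arange(w), row) for row in c])
    return np.array([np.interp(ys, np.arange(h), col) for col in rows.T]).T

cb = np.array([[100., 100.], [100., 200.]])   # a tiny chroma plane
print(upsample_nearest(cb))                    # blocky 4x4 result
print(upsample_linear(cb).round(1))            # smooth 4x4 result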

There are so many better uses for the additional bandwidth/disc space it would require, that would provide much clearer benefits than increased chroma resolution, that it's not worth thinking about yet.

Another thing to consider is that many of the cameras used to shoot video content use a Bayer matrix, so content does not necessarily have full chroma resolution to begin with.
post #158 of 270
Quote:
Originally Posted by Chronoptimist View Post
 
Quote:
Originally Posted by fritzi93 View Post

What I'm getting at is why you think 4:4:4 vs 4:2:0 is less important for video than some think. Sure, the eye is more sensitive to luma than chroma. And there are hardly any TVs available that can make use of anything better than 4:2:0. Is that what you mean?
But what about Rec.2020 (either 10 or 12 bit) on a UHD set capable of taking advantage of it? In that case, will the difference between 4:4:4 and 4:2:0 still be...hm, not very important?
Colorspace (BT.2020) doesn't really apply when it comes to chroma resolution. The argument for using 4:2:0 encoding is that the eye is less sensitive to color - but that is actually not my experience. It's obvious to me when a game is displayed at 4:2:2 or lower.

 

That just means that you can detect the differences, not that the eye isn't less sensitive to color than luminance, because it is.

post #159 of 270
Thread Starter 
Quote:
Originally Posted by tgm1024 View Post

Ironically, I just posted elsewhere that I do believe that what we'll see with 4K in the FIOS/Comcast/Dish/DirecTV/etc. world will be a weeding out of channels to make room.  We'll see some of the low volume lackluster channels take a back seat (and be removed).  Particularly a number of the SD channels I'm betting.

The reason for this is simple: it'll be similar to what it was like when HD showed up: each provider advertising their own number of 4K offerings and bragging that they have more of them than the other.
I hope you are correct, even though I'm sure that providing the sub with better PQ will have nothing to do with a move to 4K (they don't even care about providing 1080p now). Simple marketing strategy, i.e. profits, will understandably be the driver of this car, as it always has been. I can live with that.
post #160 of 270
Quote:
Originally Posted by Chronoptimist View Post

There are so many better uses for the additional bandwidth/disc space it would require, that would provide much clearer benefits than increased chroma resolution, that it's not worth thinking about yet.

That answers my question, thanks.
post #161 of 270
A FIOS user on another forum just posted this. I think it illustrates what I said above regarding bit starving at the c-band uplink source. Nothing any cable or satellite provider can do about it. This is just an example but other popular channels share a similar fate. A 6-8 Mbps real-time encoded feed is never going to compete with a 20-30 Mbps Blu-Ray. Just stick to buying or renting any content you care about on BR. This is the reason I'm always 1 season behind certain shows like Game of Thrones.

http://www.dslreports.com/forum/r28526842-MPEG-4-is-now-being-used-for-the-.TV-channels~time=1376095553
Quote:
The picture quality all depends on how the source provider sends the channels on C-Band to the distributor. The old saying Garbage in, Garbage Out applies here, just as it did with AMC packing four MPEG-2 HD channels per QAM slot last year.

The .TV channels originate on AMC 18 at 105.0 degrees West on Transponder 11. The six HD .TV channels share the transponder with three other HD channels and two other SD channels. This makes for a C-Band transponder with 9 HD channels and 2 SD channels encoded in MPEG-4 with 8PSK modulation and 5/6 forward error correction, yielding usable bandwidth of 75 Mb/s for the entire transponder, compared to only about 39 Mb/s for a QAM-256 RF channel. If the entire C-Band transponder were to be carried, two QAM-256 RF channels should be allocated for no loss in picture quality. The average bit rate for the .TV channels should be in the 6 to 8 Mb/s range. Therefore, the best configuration would be to load five MPEG-4 channels per QAM-256 slot, which is exactly the ratio being used for the MLB, NHL, NBA, and Spanish MPEG-4 HD channels. Even though the .TV channels have been transitioned to MPEG-4 for FiOS delivery, they remain loaded only 3 per QAM slot (this will soon change), so there is plenty of leftover bandwidth right now. This would imply that the quality of the C-Band source is substandard.
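
(Editor's back-of-the-envelope in Python, using the figures claimed in the quoted post - not independently verified.)

Code:
# Bandwidth arithmetic from the quoted post:
transponder_mbps = 75.0   # 8PSK, 5/6 FEC C-band transponder payload
qam256_mbps = 38.8        # usable payload of one QAM-256 RF channel (~39)
per_channel_mbps = 7.0    # midpoint of the quoted 6-8 Mb/s .TV feeds

print(f"QAM-256 slots for the whole transponder: {transponder_mbps / qam256_mbps:.2f}")
print(f"~{per_channel_mbps:.0f} Mb/s channels per QAM slot: {qam256_mbps // per_channel_mbps:.0f}")
# ~1.93 slots (hence "two QAM-256 RF channels"), and 5 channels per slot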
post #162 of 270
Thread Starter 
I bought the BR of season one of Game of Thrones and recorded season two. The PQ differences were very slight if any at all.
post #163 of 270
Quote:
Originally Posted by andy sullivan View Post

I bought the BR of season one of Game of Thrones and recorded season two. The PQ differences were very slight if any at all.

 

Yeah, except for color (especially saturation), I'm often surprised at how good the much maligned broadcast 1080i can be (from FIOS at least).

post #164 of 270
Quote:
Originally Posted by Joe Bloggs View Post

What about H.265 and chroma prediction blocks? Isn't that using the chroma portion for motion estimation?
You're thinking of intra prediction in H.264 and H.265. It's a spatial prediction from neighboring pixels in the same frame. Since the chroma can be spatially different from the luma, it's useful to be able to choose a different mode (direction) for the chroma blocks.
Quote:
Originally Posted by Joe Bloggs View Post

Isn't it going to help with compression if an object moves 1 pixel in any direction (without any other change) if, with 4:4:4 colour you already know each colour value but with 4:2:0 you won't?
I'm not sure what you're trying to say. A 4:2:0 encoder has no knowledge of the original 4:4:4 frame. It just codes the chroma blocks as though they were full size. In 4:2:0, each 16x16 macroblock has four 8x8 luma blocks, one 8x8 Cb chroma block and one 8x8 Cr chroma block. It's up to the decoder to up-sample back to 4:4:4.

Ron
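
(Editor's arithmetic for Ron's 4:2:0 macroblock layout, in Python.)

Code:
# Sample count per 16x16 macroblock in 4:2:0, per the post above:
# four 8x8 luma blocks + one 8x8 Cb block + one 8x8 Cr block.
luma = 4 * 8 * 8          # 256 luma samples
chroma = 2 * 8 * 8        # 128 chroma samples (Cb + Cr)
total = luma + chroma
print(f"{luma} luma + {chroma} chroma = {total} samples "
      f"({total / (16 * 16):.1f} samples per pixel)")
# 256 + 128 = 384 samples, i.e. 1.5 samples/pixel vs 3.0 for 4:4:4
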
post #165 of 270
Quote:
Originally Posted by tgm1024 View Post

Yeah, except for color (especially saturation), I'm often surprised at how good the much maligned broadcast 1080i can be (from FIOS at least).

The difference from Blu-ray is very obvious on my 110" projection screen, especially during motion. Maybe GOT was not the best example - try any AMC series. HBO/Max actually does a better than average job at squeezing the most out of their limited broadcast bandwidth. Most content is encoded at 24 fps with field repeat flags, thus saving ~20% data compared to encoding at 30 fps with hard telecine. But some of their less popular channels are crammed at up to 10 per transponder. (http://www.lyngsat.com/hd/galaxy14.html)
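
(A quick check of that ~20% figure - editor's sketch.)

Code:
# Why flagged 24 fps saves ~20% over hard-telecined 30 fps: the encoder
# codes 24 unique frames/s either way, but hard telecine stores 30
# pictures/s, so the repeated fields get re-encoded as new data.
film_fps, telecined_fps = 24, 30
savings = 1 - film_fps / telecined_fps
print(f"~{savings:.0%} fewer coded pictures")   # ~20%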

Another thing to consider is that most TVs now have internal noise reduction processing which masks many of the artifacts (at the cost of increased input lag). There used to be a thread somewhere here at AVS where people would post unprocessed comparison screenshots extracted directly from the recorded bitstream.
post #166 of 270
Quote:
Originally Posted by Joe Bloggs View Post

But if it doesn't require any increase in bitrate (might it even take less?) to have full 4:4:4 colour for compressed video, and it will always have the full quality, I think it's worth using.
The bit rate increase going from 4:2:0 to 4:4:4 would be small but the added decoder cost would be fairly large. Here is a link to a discussion about chroma subsampling that took place on the Doom9 forum. I like the idea of 4:4:4 but it would be better to increase the bit depth and frame rate.
post #167 of 270
Quote:
Originally Posted by Richard Paul View Post

The bit rate increase going from 4:2:0 to 4:4:4 would be small but the added decoder cost would be fairly large.
Do you have any links to tests where viewers indicated the 4:4:4 compressed video looked worse than the 4:2:0 at the same bitrate?
post #168 of 270
Quote:
Originally Posted by Joe Bloggs View Post

Do you have any links to tests where viewers indicated the 4:4:4 compressed video looked worse than the 4:2:0 at the same bitrate?

A few years ago, I did some testing between 4:2:0 and 4:2:2. As expected, the 4:2:2 bitstream required 33% more bitrate to match the PSNR of the 4:2:0 bitstream. From an objective point of view, there's no free lunch.

Ron
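
(Editor's note: the 33% lines up with the raw sample counts, as this Python sketch shows - a simplification, since real encoders don't scale bitrate exactly with sample count.)

Code:
# Samples per pixel: 1 luma + 2 chroma planes at the given fraction.
samples = {"4:2:0": 1 + 2 * 0.25, "4:2:2": 1 + 2 * 0.5, "4:4:4": 1 + 2 * 1.0}
base = samples["4:2:0"]
for fmt, s in samples.items():
    print(f"{fmt}: {s:.1f} samples/pixel, {s / base - 1:+.0%} vs 4:2:0")
# 4:2:2 carries +33% raw samples over 4:2:0; 4:4:4 carries +100%
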
post #169 of 270
Quote:
Originally Posted by dr1394 View Post

A few years ago, I did some testing between 4:2:0 and 4:2:2. As expected, the 4:2:2 bitstream required 33% more bitrate to match the PSNR of the 4:2:0 bitstream. From an objective point of view, there's no free lunch.

Ron
But that's not the same as asking which looks best at that bitrate from a viewer's perspective, or which, at that bitrate, has the colour/chroma values (as well as luma) closest to the original 4:2:2 or 4:4:4 version.

And if PSNR is based only on luma (which it normally is?), it's not the best metric for comparing the quality/accuracy of the colour. Surely we should be doing tests that measure the colour/chroma quality against the original 4:2:2 or 4:4:4 version, both with viewers and without (i.e. computing the differences).

I.e. if the original source is 4:2:2, which gives the more accurate (or best, according to viewers) compressed version of it at a particular bitrate - encoding at 4:2:2 or at 4:2:0 - and not just using luma-only PSNR for comparison?
Similarly for 4:4:4 sources - are they best compressed at 4:4:4, 4:2:2 or 4:2:0 for the end result at a particular bitrate (including correctness of colour, not just luma) to be best for the viewer?
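
(Editor's sketch of the kind of per-plane measurement being asked for, assuming numpy; the 6:1:1 Y:Cb:Cr weighting is one common convention, not the only option.)

Code:
import numpy as np

# PSNR for one plane (8-bit assumed via peak=255).
def psnr(ref, test, peak=255.0):
    mse = np.mean((ref.astype(float) - test.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(peak ** 2 / mse)

# PSNR for Y, Cb, Cr separately, plus a weighted combination,
# so chroma accuracy is measured instead of being ignored.
def yuv_psnr(ref_planes, test_planes, weights=(6, 1, 1)):
    per_plane = [psnr(r, t) for r, t in zip(ref_planes, test_planes)]
    combined = sum(w * p for w, p in zip(weights, per_plane)) / sum(weights)
    return per_plane, combined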

This is what Ben Waggoner says in the doom9 thread Richard linked above (the last post in that thread):
Quote:
Originally Posted by benwaggoner 
Yes, I would also anticipate that a well-tuned 4:4:4 encoder wouldn't require any more bitrate to match 4:2:0 quality. If anything it's likely to be slightly less.

Edited by Joe Bloggs - 9/1/13 at 12:50am
post #170 of 270
Thread Starter 
I read somewhere here on AVS that DirecTV offers a 1920 signal and Dish Network offers a 1440 signal. First question: is this true? Second question: is this a compression issue or something else?
post #171 of 270
Quote:
Originally Posted by andy sullivan View Post

I read somewhere here on AVS that DirecTV offers a 1920 signal and Dish Network offers a 1440 signal. First question: is this true? Second question: is this a compression issue or something else?
I don't know but the BBC in the UK used to use 1440x1080 instead of 1920x1080 for a long time (unless they were transmitting 3D). I'm sure they thought that it gave the best quality / least compression artefacts at the (lower) bitrates they were using with their encoders and that most of the cameras they were using for a lot of their content were recording 1440x1080 anyway.
Edited by Joe Bloggs - 9/3/13 at 5:28pm
post #172 of 270
Quote:
Originally Posted by andy sullivan View Post

I read somewhere here on AVS that DirecTV offers a 1920 signal and Dish Network offers a 1440 signal. First question: is this true? Second question: is this a compression issue or something else?

First question, no, it isn't.

Second question, Dish probably uses more aggressive compression than DirecTV on some channels, but it's not done by just chopping resolution. There is so much more to compression than merely asking, "what's the resolution? what's the chroma sampling?" There's a fundamental choice about how many bits you are going to allow and then the encoder is going to make a million tradeoffs to fit the content into that bit budget. It will choose to cut some or all things as needed to make it work out.
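
(Editor's illustration of that bit budget, in Python; the 8 Mb/s rate is just an example.)

Code:
# The whole game in one number: a fixed bit budget per frame.
# Real rate control varies this per GOP/scene, but the total is capped.
bitrate_mbps, fps = 8.0, 29.97
bits_per_frame = bitrate_mbps * 1e6 / fps
raw_bits = 1920 * 1080 * 1.5 * 8          # uncompressed 8-bit 4:2:0 frame
print(f"{bits_per_frame / 1e3:.0f} kbit/frame, "
      f"{raw_bits / bits_per_frame:.0f}:1 compression needed")
# ~267 kbit/frame, roughly 93:1 compression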
post #173 of 270
Quote:
Originally Posted by rogo View Post

First question, no, it isn't.

Second question, Dish probably uses more aggressive compression than DirecTV on some channels, but it's not done by just chopping resolution. There is so much more to compression than merely asking, "what's the resolution? what's the chroma sampling?" There's a fundamental choice about how many bits you are going to allow and then the encoder is going to make a million tradeoffs to fit the content into that bit budget. It will choose to cut some or all things as needed to make it work out.
Not really compressing - they are taking the feeds that are 1920x1080 and dropping them down to 1440x1080. Of course, DirecTV is known to broadcast programming at 1280x1080i. You have to love HD-Lite.
post #174 of 270
Quote:
Originally Posted by gregzoll View Post
 
Quote:
Originally Posted by rogo View Post

First question, no, it isn't.

Second question, Dish probably uses more aggressive compression than DirecTV on some channels, but it's not done by just chopping resolution. There is so much more to compression than merely asking, "what's the resolution? what's the chroma sampling?" There's a fundamental choice about how many bits you are going to allow and then the encoder is going to make a million tradeoffs to fit the content into that bit budget. It will choose to cut some or all things as needed to make it work out.
Not really compressing - they are taking the feeds that are 1920x1080 and dropping them down to 1440x1080. Of course, DirecTV is known to broadcast programming at 1280x1080i. You have to love HD-Lite.

 

Even if that were what they're doing, it absolutely would be compressing. That 1440 still has to display on a 1920 screen. Imagine dropping the resolution to 100 across and then expanding it back out to 1920 at the destination. That's horrifying compression.
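
(The arithmetic behind that, for what it's worth - editor's sketch in Python.)

Code:
# 1440-wide "HD-Lite" vs true 1920: horizontal detail discarded up front
# can't be restored by the set-top box's upscale.
full, lite = 1920, 1440
print(f"{lite / full:.0%} of horizontal samples survive; "
      f"{1 - lite / full:.0%} are thrown away before encoding")
# 75% survive; 25% thrown away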

 

2nd: If you're looking at the Wikipedia page for "HD-Lite", be aware that the page is a load of undocumented crap.  The references do not say anywhere what you're saying about Dish, and the talk page is a complete mess (I'm going to have to reformat that to get it to work with a table of contents) and looks like it was written by about 2 people.  There's one reference that even discusses 1440 as a VERTICAL effect of 720 being written twice.  And one reference is a mere pile of notes.  Further, a "citation needed" is still there with a date of 2007!  It reads like a usenet argument.

 

One more humorous thing.  If you google "Dish network" and "1440", 1440 primarily shows up as part of its address.

post #175 of 270

GOD almighty do I @#$%ing hate Wikipedia's hands-off policy notion (except for exceedingly rare "office actions").

post #176 of 270
That "1280x1080i" on Wikipedia for Directv has got to be wrong smile.gif. That resolution isn't part of the ATSC standards as far as I can see.
Edited by Joe Bloggs - 9/4/13 at 7:09am
post #177 of 270
Quote:
Originally Posted by Joe Bloggs View Post

That "1280x1080i" on Wikipedia for Directv has got to be wrong smile.gif. That resolution isn't part of the ATSC standards as far as I can see.

 

Wikipedia is an absolute disaster.  Regard this as axiomatic: the largest proponents of Wikipedia just do not read through the citations that miscreants put in there to lend credibility to their claims.  And, of course, if they cite a particular page of a particular book that's out of print, they can say absolutely anything.

post #178 of 270
Quote:
Originally Posted by Joe Bloggs View Post

That "1280x1080i" on Wikipedia for Directv has got to be wrong smile.gif. That resolution isn't part of the ATSC standards as far as I can see.

But they could still use it. The STB would just have to convert to an ATSC standard, which is well within its hardware capabilities.
post #179 of 270
Quote:
Originally Posted by Glimmie View Post

But they could still use it. The STB would just have to convert to an ATSC standard, which is well within its hardware capabilities.
Wouldn't the set top box be built to only support set resolutions - such as those that are part of the ATSC / DVB standards?
post #180 of 270
Quote:
Originally Posted by tgm1024 

Even if that were what they're doing, it absolutely would be compressing. That 1440 still has to display on a 1920 screen.  Imagine dropping the resolution to 100 across and then expanding it back out to 1920 at the destination.  That's horrifying compression.

2nd: If you're looking at the Wikipedia page for "HD-Lite", be aware that the page is a load of undocumented crap.  The references do not say anywhere what you're saying about Dish, and the talk page is a complete mess (I'm going to have to reformat that to get it to work with a table of contents) and looks like it was written by about 2 people.  There's one reference that even discusses 1440 as a VERTICAL effect of 720 being written twice.  And one reference is a mere pile of notes.  Further, a "citation needed" is still there with a date of 2007!  It reads like a usenet argument.

One more humorous thing.  If you google "Dish network" and "1440", 1440 primarily shows up as part of its address.

I'd never heard of HD Lite, but it's a nice description of what's going on:
HD Lite = insufficient screen resolution + insufficient transmission bitrates

http://hdcampaign.kk5.org/#/whats-the-problem/4545230766