AVS Forum

#1 · Discussion Starter · Registered · 62 Posts
Like the title says, what are the mechanisms behind higher-bitrate content looking better than lower-bitrate content? For example, it has been said that HBO Max streams 1080p at a high bitrate - watching The Flight Attendant last night on my 65” LG C9, I noticed it looked nearly as good as some 4K streams from other services. I know compression artifacts can play into it, but are there other things happening? HBO Max streams also seem to have richer color than other services' SDR streams, and while it doesn't compare to HDR, it feels like it gets close.
 

Registered · 13,485 Posts
Think of it in reverse: the low-bitrate version of a movie is produced by throwing away detail to hit the lower bitrate with the same codec.
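For example (just a toy Python sketch of the principle, not how any real codec works internally): snap the pixel values of a synthetic frame to coarser steps, and the "simpler" frame needs far fewer bits from a general-purpose compressor like zlib - at the cost of detail.

```python
import zlib
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1080p luma frame: a smooth gradient plus fine "detail" (noise).
x, y = np.linspace(0, 255, 1920), np.linspace(0, 255, 1080)
frame = (np.add.outer(y, x) / 2 + rng.normal(0, 10, (1080, 1920))).clip(0, 255).astype(np.uint8)

for step in (1, 8, 32):                 # 1 = keep everything, 32 = very coarse
    quantized = (frame // step) * step  # snap pixel values to coarser levels: fine detail is gone
    size_kib = len(zlib.compress(quantized.tobytes())) / 1024
    error = np.abs(frame.astype(int) - quantized.astype(int)).mean()
    print(f"step={step:>2}  compressed={size_kib:8.1f} KiB  mean error={error:5.2f}")
```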

 

Registered · 235 Posts
It has to do with how the lower bitrate is achieved. To get a lower bitrate for the same movie, heavier compression is used. Compression is achieved by grouping pixels: e.g. instead of saying that some group of 100 pixels are all different and specifying each of them, we take some average value and use that for all of them. The same can be done along the time axis, as often only the differences from the previous frame are stored: by shifting a pixel's colour a bit it may end up closer to its colour in the previous frame, so there are fewer differences and thus fewer bits needed.
I know I'm simplifying it a lot, but I hope it helps as an illustration. Consider a video call when the image sometimes gets blocky: that blockiness is the grouping of pixels made visible. :)
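Here is a rough numpy sketch of both ideas, with zlib standing in for a real entropy coder and a synthetic frame standing in for real footage (an actual codec uses DCT blocks, motion estimation, etc.; this only shows the principle):

```python
import zlib
import numpy as np

rng = np.random.default_rng(1)

def kib(arr):
    """Compressed size of an array, in KiB."""
    return len(zlib.compress(arr.tobytes())) / 1024

# Synthetic 1080p luma frame: smooth gradient plus fine "detail" (noise).
x, y = np.linspace(0, 255, 1920), np.linspace(0, 255, 1080)
frame = (np.add.outer(y, x) / 2 + rng.normal(0, 12, (1080, 1920))).clip(0, 255).astype(np.uint8)

# 1) Spatial grouping: replace every 8x8 block by its average value.
blocks = frame.reshape(135, 8, 240, 8)
averaged = blocks.mean(axis=(1, 3), keepdims=True)                 # one value per block
grouped = np.broadcast_to(averaged, blocks.shape).reshape(1080, 1920).astype(np.uint8)

# 2) Temporal differencing: the next frame barely changes, so store only the delta.
next_frame = (frame.astype(int) + rng.integers(-2, 3, frame.shape)).clip(0, 255).astype(np.uint8)
delta = (next_frame.astype(int) - frame.astype(int)).astype(np.int8)

print(f"raw frame        {kib(frame):8.1f} KiB")
print(f"8x8 averaged     {kib(grouped):8.1f} KiB   (detail lost, much smaller)")
print(f"next full frame  {kib(next_frame):8.1f} KiB")
print(f"frame-to-frame   {kib(delta):8.1f} KiB   (mostly tiny numbers)")
```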

Simply put: at lower bitrates there is less information to represent the same image, and this is achieved by making the images less "complicated" or subsequent frames less "different", i.e. by slightly altering the colours of pixels.
There are different compression algorithms, and some are better than others at keeping detail at a given bitrate; that is (part of) the story of which codec to use in which situation, and even of the parameters within a single codec.
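If you want to see the codec/parameter effect yourself, here is a sketch (it assumes ffmpeg built with libx264 and libx265 is installed; "input.mp4" and the output names are just placeholders):

```python
import subprocess

SOURCE = "input.mp4"       # placeholder source clip
TARGET_BITRATE = "3M"      # same bitrate budget for both encoders

for codec, out in (("libx264", "out_h264.mp4"), ("libx265", "out_hevc.mp4")):
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE,
         "-c:v", codec, "-b:v", TARGET_BITRATE,
         "-preset", "medium",   # encoder parameter: slower presets keep more detail per bit
         "-an", out],
        check=True,
    )
# Comparing out_h264.mp4 and out_hevc.mp4 side by side shows how much the codec
# (and settings like -preset or -crf) matters at a fixed bitrate.
```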

In addition: the signal format (e.g. HDR, 4K, ...) does not necessarily say anything about the quality. You can easily upscale a low-resolution image to 4K: it will still be a low-resolution image (one that compresses well compared to true 4K, as there will be many similar pixels). The receiver will get a 4K image, but not 4K quality.
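A quick sketch of that last point, with random noise standing in for "real detail" (real footage isn't noise, but the size comparison works the same way): nearest-neighbour upscaling produces 4K worth of pixels but no new information, and the repeated pixels compress far better than a genuinely detailed 4K frame.

```python
import zlib
import numpy as np

rng = np.random.default_rng(2)

low_res = rng.integers(0, 256, size=(540, 960), dtype=np.uint8)    # stand-in 960x540 frame
upscaled = np.repeat(np.repeat(low_res, 4, axis=0), 4, axis=1)      # 3840x2160, same content
true_4k = rng.integers(0, 256, size=(2160, 3840), dtype=np.uint8)   # stand-in for real 4K detail

for name, img in (("upscaled low-res", upscaled), ("true 4K detail", true_4k)):
    print(f"{name:17s} {img.shape}  compressed: {len(zlib.compress(img.tobytes()))/1024:9.1f} KiB")
```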
 