Originally Posted by NxNW
I'm kinda new to this 4K HDR UHD thing: I know my HDMI cable should be able to carry 18 giga (with a G) bits per second.
But you're saying Netflix only sends 15 mega (with an M) bits per second.
So there's like 1000:1 compression going on somewhere, right?
I know this is basic, but I'm sure an expert here can succinctly help me frame my thinking about this.
I'm no expert, but here is how I would explain it:
The "streaming bitrate" is the rate the data is being sent in compressed format from Netflix servers to your media playback device. Technically it is the rate of the entire file, but audio is such a small part of that bandwidth that we will just talk about video bitrates. To put it into perspective, Blu Ray's typically store a version of the video file with a bitrate around 15-25Mbps for 1080p video, depending on compression codec, content, and framerate, which is about as good as you will ever see 1080p SDR content (i.e. as close to "uncompressed" as you get in the digital age). The h265 codec is better with 4k video than h264 or any other codec used for 1080p, so while 4k has four times the video content of 1080p (along with more color and chroma data), the bitrate is only around double that, or about 30-60Mbps for 24hz content. Again, if your source is in this bitrate range, you are getting the best 4k video quality available, and you could consider it as close to "lossless" as you can get. So if Netflix is streaming 4k at 15Mbps, it is compressed 2-4 times as much as a UHD blu ray.
In reality, this extra compression beyond "Blu-ray quality" is what matters most, because at that point there isn't much headroom left before the quality loss becomes noticeable. But you asked about the relationship between your HDMI bitrate and the streaming bitrate.
HDMI bandwidth is the raw video data flowing from the device that decoded the video to the display: 100% uncompressed, pixel-by-pixel data along with everything that goes with it for color, chroma, etc. A 4K, 60 Hz signal with 12-bit color and 4:2:2 chroma subsampling needs roughly 18 Gbps of bandwidth to get from your decoder to your display. That's a LOT of data! More than typical copper ethernet cables can carry, and faster than most consumer storage devices can record or play back in real time. This is why we have codecs to compress the video signal into something manageable.
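For the curious, here is roughly where the ~18 Gbps figure comes from. This is a simplified sketch that assumes HDMI 2.0-style TMDS signaling (10 bits on the wire per 8 bits of data) and the standard 4K timing with blanking, so it's not a spec-accurate model:

```python
# Where the ~18 Gbps figure comes from (simplified; assumes HDMI 2.0-style
# 8b/10b TMDS encoding and standard 4K timing with blanking).

active_w, active_h = 3840, 2160
total_w, total_h   = 4400, 2250     # active area plus blanking intervals
fps = 60

pixel_clock = total_w * total_h * fps          # ~594 MHz
# HDMI carries 4:2:2 at up to 12 bits/component inside its normal 3-channel
# structure, so the wire still moves 3 channels x 10 bits per pixel clock.
link_rate = pixel_clock * 3 * 10

print(f"Pixel clock:    {pixel_clock / 1e6:.0f} MHz")        # 594 MHz
print(f"HDMI link rate: {link_rate / 1e9:.2f} Gbps")         # ~17.8 Gbps

# Just the picture data, with no blanking or line-coding overhead:
raw_video = active_w * active_h * fps * 24     # 12-bit 4:2:2 = 24 bits/pixel
print(f"Raw 4K60 12-bit 4:2:2: {raw_video / 1e9:.1f} Gbps")  # ~11.9 Gbps
```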
So technically, yes, there is a LOT of compression going on when raw video gets packed into a file that fits on a disc or travels over the internet. Part of that compression is lossless (entropy coding, and predicting one frame from the frames around it), but codecs also use lossy steps like quantization, and codecs have their limits. How visible that loss is for a particular frame depends on several factors: the codec used to compress the data, the content of the frame (for example, how many pixels of the same color and brightness sit in one section of the screen), and the frame rate of the video.
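A tiny toy demo of how much content matters to the lossless stages: here zlib stands in for a codec's entropy coding (a real video codec is far more sophisticated, so treat this as conceptual only). A flat frame squeezes enormously; a noisy, grainy frame barely at all:

```python
# Toy demo of why content matters to lossless compression: zlib stands in for
# a codec's entropy coding. A flat frame compresses hugely, a noisy one barely.
import os
import zlib

frame_size = 1920 * 1080              # one 8-bit luma plane, ~2 MB

flat_frame  = bytes(frame_size)       # all-black frame
noisy_frame = os.urandom(frame_size)  # heavy grain/noise, the worst case

for name, frame in (("flat", flat_frame), ("noisy", noisy_frame)):
    packed = zlib.compress(frame)
    print(f"{name:5s} frame: {len(frame) / len(packed):>8.1f}:1")
```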
So you can't just look at the two numbers and draw a direct line between streaming bitrate and HDMI bitrate without doing a frame-by-frame analysis and taking content into account, but you can judge the average quality of a video by the average amount of compression used. In my experience, with H.265 4K HDR video, once you drop below 20-25 Mbps you start to notice banding, tiling (blockiness), softness of the image, and other artifacts that are particularly noticeable on larger screens. Personally, I'd put 15 Mbps 4K video at about the same quality as 1080p on Blu-ray, which is still pretty darn good. Unfortunately audio takes a hit too, so that compression takes a toll in more ways than one.
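And to circle back to the original 1000:1 question, here is an apples-to-apples sketch. Assuming the stream is 24 fps, 10-bit, 4:2:0 (typical for streamed 4K HDR, though I'm guessing at the exact format), the fair comparison is against the raw size of that same content, not the 18 Gbps HDMI link rate:

```python
# Apples-to-apples: compare the stream to the raw size of the SAME content
# (assumed 4K, 24 fps, 10-bit, 4:2:0), not to the 18 Gbps HDMI link rate.

width, height, fps = 3840, 2160, 24
bits_per_pixel = 10 * 1.5           # 10-bit 4:2:0 -> 15 bits per pixel

raw_bps = width * height * fps * bits_per_pixel    # ~3 Gbps

for stream_mbps in (15, 25, 50):
    ratio = raw_bps / (stream_mbps * 1e6)
    print(f"{stream_mbps} Mbps stream -> ~{ratio:.0f}:1 compression")
# ~200:1 at 15 Mbps, not 1000:1 -- the 18 Gbps HDMI figure includes 60 Hz,
# blanking, and link encoding that the compressed file never contains.
```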