I thought higher bitrate = higher quality, and I know compression plays a big role. Does Netflix use some kind of super-compression? I know they often use film grain and dithering to improve image quality, but that alone wouldn't make videos look so good. How does Netflix manage such excellent image quality at only 6 Mbps?
I had another thought - maybe the resolution from which the image was downscaled plays a big role? For example, maybe content downscaled from 8K or even 16K resolution down to 1080p would look BETTER than content downscaled from 4K resolution down to 1080p EVEN though content downscaled from 8K/16K resolution would have a much lower bit-rate. Is that a valid theory?
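To put a rough number on that hunch, here's a toy numpy sketch I threw together -- pure illustration with a box-filter downscale and synthetic grain, nothing to do with Netflix's actual pipeline:

Code:
import numpy as np

rng = np.random.default_rng(0)

def box_downscale(img, factor):
    # average each factor x factor block of source pixels into one output pixel
    h, w = img.shape
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# synthetic "grain": flat mid-gray plus noise, mastered at 4K-ish and 8K-ish heights
for master in (2160, 4320):
    factor = master // 1080
    frame = 0.5 + 0.05 * rng.standard_normal((master, 64))
    out = box_downscale(frame, factor)
    print(f"{master} -> 1080: residual noise std = {out.std():.4f}")

The 8K-ish master averages 16 source pixels per output pixel instead of 4, so the residual noise is halved again, and a cleaner 1080p image is cheaper to encode at a given bitrate.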
- kelson h
The bitterness of poor quality lasts long after the sweetness of the low price is forgotten . . . life is too short to drink bad wine
Where low bit rate is exposed, in my experience, is in dark scenes on a projector.
John
Sony 55A1E, A9F / LG 55OLEDC8
Marantz 7012, Ohm Walsh Speakers
Klein K10-A, Jeti 1501, Murideo Six-G Gen2
Calman Ultimate, ISF Level III Certified
Blu-ray movies played on my BDP make Netflix's PQ look like crap. IMO NF @ 5.8 Mbps looks more like very good SD compared to Blu-ray.
I guess that's the difference between the highly compressed 5.8 Mbps from NF and the far less compressed 25-40 Mbps from my BDP.
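Just to put those numbers in perspective for a 2-hour movie (quick Python arithmetic, video track only):

Code:
# rough per-movie sizes for a 2-hour film at each video bitrate
seconds = 2 * 60 * 60
for label, mbps in [("Netflix 1080p", 5.8), ("Blu-ray low", 25.0), ("Blu-ray high", 40.0)]:
    gb = mbps * 1e6 * seconds / 8 / 1e9
    print(f"{label:13s} {mbps:4.1f} Mbps -> {gb:4.1f} GB")

That's roughly 5 GB for the stream versus 22-36 GB on the disc, so the disc has four to seven times the bits to spend.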

Last edited by greaser; 08-04-2015 at 09:41 AM.
I was talking about DOWNSCALING from 4K to 1080p and from 8K to 1080p. Maybe if the film were downscaled from 8K to 1080p instead of 4K to 1080p, then it would look better at a lower bit-rate than a film downscaled from 4K to 1080p at a higher bit-rate. Does that make sense?
I also think HEVC / H.265 compression can be used for any resolution, including 1080p and 720p, but the difference in video file size and quality would be negligible when compared against H.264 compression.
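A quick (and unscientific) way to test that would be encoding the same clip both ways at a similar quality target and comparing sizes. A sketch, assuming an ffmpeg build with libx264 and libx265; "clip.mkv" is just a placeholder, and note that CRF numbers are not directly comparable between the two encoders:

Code:
import os, subprocess

for codec, out in [("libx264", "out_h264.mkv"), ("libx265", "out_h265.mkv")]:
    subprocess.run(["ffmpeg", "-y", "-i", "clip.mkv",
                    "-c:v", codec, "-preset", "medium", "-crf", "23",
                    "-an", out],            # -an: drop audio to isolate the video codecs
                   check=True)
    print(codec, round(os.path.getsize(out) / 1e6, 1), "MB")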
Do I think they look better than or equal to Blu-ray? No, but they can be close at times. Netflix's image quality has improved steadily over the years and I've been impressed with their progress.
Can you give a specific title (or titles) you think is better than the Blu-ray version? I would be happy to compare them.
All I Really Need to Know I Learned in Kindergarten
There is a poster here, @DotJun , who is quite expert at H.264 encoding and understands how to tweak all those mystery parameters that the rest of us have no clue about and leave alone. He claims he can greatly reduce the file size of a BD rip to this size range with only minor PQ degradation -- something the rest of us can't even begin to approach using the canned profiles in Handbrake. I'm sure Netflix has their own cadre of experts doing their encoding with sophisticated tools.
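I have no idea what DotJun's actual recipe is, but just to show the kind of knobs in play beyond the canned profiles, here's an illustrative x264 invocation (placeholder filenames, my own parameter picks):

Code:
import subprocess

subprocess.run([
    "ffmpeg", "-y", "-i", "bd_rip.mkv",   # placeholder source
    "-c:v", "libx264",
    "-preset", "veryslow",                # slower preset buys compression efficiency
    "-tune", "film",                      # psy tuning that keeps grain from being smoothed away
    "-crf", "19",                         # constant-quality target; raise it to shrink the file
    "-c:a", "copy",
    "shrunk.mkv",
], check=True)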
- kelson h
The bitterness of poor quality lasts long after the sweetness of the low price is forgotten . . . life is too short to drink bad wine
AFAIK, Netflix is still using eyeIO (H.264) for their 1080p encodes.
Netflix 2160p (HoC S2 & S3 and a couple of sample videos) can, at times, approach BD as viewed via a Sony FMP-X10 and Sony VW600 projector.
Ian
The best way to succeed in life is to act on the advice you give to others
Last edited by mailiang; 08-05-2015 at 04:19 PM.
I actually prefer some Netflix versions because Netflix movies/shows are a lot less grainy than Blu-ray and have "almost" as much detail. I've been watching Netflix movies more lately and the quality really surprises me. Not to mention the good 4K TVs do a great job of upscaling 1080p content (as opposed to 720p/1080i).
I too am very surprised that, for such a low bit rate, it looks as good as it does.
82Q90R*75Q9FN(RIP)*55C8OLED*Galaxy Note10+*Ub820 fed into Oppo 203*XB1X*4k DenonX4200
MASTER LIST OF HDR CONTENT THREAD HERE, UPDATED OFTEN
. . . Sound quality is a different matter, as nobody streams in Dolby TrueHD or DTS-HD Master. Still, you generally get very respectable Dolby Digital.
When you bit-starve H.264 -- and make no mistake about it, all streaming video is bit-starved -- the PQ degradation is gradual and can be subtle -- lots of other things happen before obviously visible artifacts appear. The picture becomes progressively "softer". This may be less apparent to people with soft displays, but if you have a tack-sharp display like a good plasma and your viewing distance is reasonable, you will notice the softness. Grain that is part of the original presentation for that "film look" is smoothed out and progressively disappears. A surprisingly large number of people don't like grain and actually prefer it to be gone -- they consider it akin to noise and feel it makes the picture look less hi-def. As the bit-rate is lowered, fine detail in shadow areas is lost -- this is always a criterion reviewers use when comparing a reference Blu-ray encoding to a streamed offering. If you look for it, one of the most obvious effects of H.264 bit-starvation is that very dark scenes lose their smooth homogeneity and look grossly blotchy, because there are not enough bits available to provide a smooth gradation of blacks in dark areas. Once you know to look and you see it, you can never ignore it again. If you have a good panel with deep blacks (e.g. plasma), it can make you cringe [1].
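Here's a crude numpy illustration of the mechanism, with coarse quantization standing in for what starved rate control does (a real encoder is obviously more complicated):

Code:
import numpy as np

# a smooth near-black ramp in 8-bit video levels, like a dark scene's shadow gradient
ramp = np.linspace(16, 40, 1920)

for step in (1, 4, 8):   # roughly: plenty of bits -> starved
    levels = np.unique(np.round(ramp / step) * step)
    print(f"quant step {step}: {levels.size} distinct shadow levels survive")

At step 1 the gradient keeps 25 levels; at step 8 it collapses to 4, and each lost level shows up as a visible step or blotch on a panel with deep blacks.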
H.264 is a "more efficient" codec, but it is not a miracle codec -- you can't just drop 75% of the original bitrate and come out with something "just as good". H.264/AVC's hallmark is that when bit-starved it degrades the PQ in gradual, less visibly obvious, and less objectionable ways. What's objectionable is subjective to the viewer, his equipment, and his tolerance for something less [2].
Audio is in an analogous situation. Streaming HD audio would be the parallel of streaming native Blu-ray in terms of bandwidth usage. Lossy DTS core tracks are encoded at 1.5 Mbps, and I see DTS-HD MA on Blu-ray to be in the 3 Mbps range -- streaming either would be bandwidth prohibitive when your total A/V payload is in the 6 Mbps range. In contrast, DD/AC3 maxes out at a very bandwidth-friendly 640 kbps. Broadcast DD/5.1 content is generally around 384 kbps, and even DVD DD/5.1 tops out at 448 kbps. The only 640 kbps DD/5.1 audio tracks I have seen are on Blu-ray. I suspect regular DD/5.1 is streamed at 384 kbps or less. What I'm seeing at Amazon Prime is a lot of shows coming over with DD+, which is higher quality audio -- still not lossless, but with a higher bitrate ceiling than DD/5.1. Since Prime streams their A/V payload at 10 Mbps, they can afford to spend, say, 1 Mbps on the audio track.
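The budget math in plain numbers:

Code:
# rough share of a 6 Mbps total A/V payload for each audio format (kbps)
total = 6000
for label, audio in [("DTS-HD MA (lossless)", 3000), ("DTS core", 1500),
                     ("DD/AC3 max", 640), ("streaming DD/5.1", 384)]:
    print(f"{label:20s} {audio:4d} kbps = {100 * audio / total:4.1f}% of payload, "
          f"{total - audio} kbps left for video")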
[1] Ever since ABC cut the bitrate of their main channel to < 8 Mbps so they could have an HD sub-channel, I find their shows to be borderline unwatchable because of the blotchy dark scenes. This was particularly true with How To Get Away With Murder, which has a lot of dark scenes in each episode. The fact that broadcast OTA is MPEG-2 encoded just makes things even worse.
[2] I always find it amusing that people can talk with equal enthusiasm about the super resolution and clarity of 4K UHD displays and the viewing of bit-starved streaming content.
- kelson h
The bitterness of poor quality lasts long after the sweetness of the low price is forgotten . . . life is too short to drink bad wine
I would be curious to know what bit-rate the various services use for their audio tracks. When I compress a Blu-Ray, I generally use the highest quality available (640kbps DD). I have to keep the total bit-rate below 15Mbps for device compatibility reasons and generally target something in the 12Mbps range. Those rips actually look and sound better than most streamed content.
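For the curious, my budgeting is simple arithmetic (the headroom figure is just my own fudge factor for muxing overhead and bitrate peaks):

Code:
cap, audio, headroom = 15000, 640, 1500   # kbps: device ceiling, DD track, overhead
video = cap - audio - headroom
print(f"video target: {video} kbps")      # ~12.9 Mbps, hence my ~12 Mbps target

seconds = 2 * 60 * 60
gb = (video + audio) * 1000 * seconds / 8 / 1e9
print(f"2-hour movie: about {gb:.1f} GB")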
I don't care about apathy
Main System: Sony XBR75X940C * Marantz SR6010 * Marantz MM-9000 * MSI Cubi HTPC (OpenPHT/Aeon Nox) * Roku Ultra * Samsung Blu-Ray * Paradigm Studio/40 v.2 * Paradigm Studio/CC v.2 * Paradigm ADP-370 * SVS PB12-Plus/2 * Harmony Ultimate One * HDFury Linker (Zone 2)
There are quite a few factors that could be playing tricks here.
- What device are you using to watch Netflix? If a TV app or another device rather than a PC, are you sure your display calibration/picture settings affect Netflix's picture the same way they affect MPC-HC via HDMI?
- What exactly is madVR doing in your 1080p video chain with MPC-HC? It is a scaler/post-processing program from the DVD era; it shouldn't have anything to post-process with today's 1080p videos on a 1080p display.
- Are your video levels (0-255/16-235) and color decoding (YCbCr/RGB) correct in MPC-HC and/or in your video card's control panel (+ madVR)? The remap itself is just linear math; see the sketch after this list.
- Are you even comparing exactly the same titles and scenes, under exactly the same conditions, same lighting, same sobriety level? Many of Netflix's recent original series are shot on high-end 4K cameras, which yields distinctly superior image quality against oldies-but-not-goldies productions; for example MacGyver BD Collector's Edition, Knight Rider Ultimate Director's Cut BD, or an X-Files VHS rip.
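On the levels point, a minimal numpy sketch of what's at stake -- get this remap applied twice, or not at all, and you crush or wash out exactly the shadow detail being compared:

Code:
import numpy as np

def limited_to_full(y):
    # expand 8-bit limited-range video levels (16-235) to full range (0-255)
    return np.clip((y.astype(np.float32) - 16.0) * 255.0 / 219.0, 0.0, 255.0)

y = np.array([16, 126, 235], dtype=np.uint8)   # video black / mid gray / video white
print(limited_to_full(y))                      # -> [0., ~128.1, 255.]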
Last edited by sladi75; 08-06-2015 at 01:23 PM.
Not according to Dolby http://www.dolby.com/us/en/technolog...ital-plus.html, or am I missing something?
Ian
The best way to succeed in life is to act on the advice you give to others
Apple TV 3 still uses DD at 384 kbps, and I hear no audio difference between the two.
All I Really Need to Know I Learned in Kindergarten