
· Registered · 8,244 Posts

· Super Moderator · 17,585 Posts
JVC NZ9 | Sony 760ES | ST130 G4 135" | AVM-60 | MC303 &152 | 7.2.4: B&W 2x802D3/803D3/805D3| 4x15 IB
Yeah, who cares if the player is internal or requires installing something else and still appears internal?
 

· Registered · 1,485 Posts
Discussion Starter · #5
Who cares?!? You may not realize it, but being able to configure and launch external players successfully puts you in probably the top 3% of Kodi users. Not everyone has the skills that you and the rest of this community have. Most folks want something that just works right out of the box.
 

· Registered · 8,244 Posts
Who cares?!? You may not realize it, but being able to configure and launch external players successfully puts you in probably the top 3% of Kodi users. Not everyone has the skills that you and the rest of this community have. Most folks want something that just works right out of the box.


I get that. And I agree that external players can be a pain. But with regard to DSPlayer, it's more of a substitute internal player than launching an external one.

But the madVR part does add a layer of complexity.

I always feel like everyone else knows how to use Kodi well and I'm just figuring stuff out.
 

· Registered · 927 Posts
The problem with DSPlayer is that it's not getting updated anymore. Maybe it's not an issue right now, but there are features in v18 that you can't have if you are running v17/DSPlayer. So it is a big deal if Kodi can do 4K HDR internally.
 

· Registered · 8,244 Posts
The problem with DSPlayer is that it's not getting updated anymore. Maybe it's not an issue right now, but there are features in v18 that you can't have if you are running v17/DSPlayer. So it is a big deal if Kodi can do 4K HDR internally.
Yeah, that is a bummer, as a couple of add-ons I'd like to run require v18.

But for me the bigger benefit of DSPlayer is madVR. With tone mapping, HDR just looks better than on any other player or device I have tried.

So while HDR passthrough is cool, plenty of other apps and devices are doing that now (e.g., Kodi on the Shield).
 

· Registered · 1,715 Posts
This new Kodi fork is built using Kodi v.19, not v.18.
I see no one has mentioned that reports indicate it also renders full ISO menus, including UHD menus, something neither DSPlayer nor the MPC players are capable of. I've been using v.3/v.5 of the DVDFab Media Players externally for full menus. I may not need to anymore. Personally, menus are a major benefit to my home theatre experience.
Based on the snapshots the author provided, the picture VideoPlayer produces is on par with the Oppo.

I see at least one poster here prefers madVR tone mapping instead of passthrough on his $450 55" panel. I thought tone mapping was for 1080p SDR PJ's with huge screens that wanted to simulate HDR; most of us with UHD chains simply pass through. I wonder what benefits the tone mapping improvements vs. passing through provide on that 55" entry-level panel that would require madVR? Maybe I haven't taken something obvious into account?

Imo, madVR processing on a 55" UHD panel isn't even needed, nor will you notice any big improvements, now that there are other non-madVR HDR-switching players using private GPU APIs. In other words, comparing this new Kodi build side by side with any of the madVR-driven players or front-end builds is going to look very similar on smaller screens, imo. I use a small 65" older flagship model. I can't see much of any difference using 3 different madVR players at NGU high, 3 aftermarket players using the private APIs, and this new Kodi VideoPlayer build using the nVidia API and slated to include AMD's.

That said, I personally have not been able to get the full menus working using this new build, probably due to my archaic system and/or quite old GPU drivers. At some point I will diagnose and repair, as others have testified the menus work fine. I have sampled various playback and it looks very good to my eyes. One thing it may lack (I didn't test while it was installed) is 3D MVC frame-packed playback. That will probably continue to require another player (or players).
 

· Registered · 8,244 Posts
I see at least one poster here prefers madVR tone mapping instead of passthrough on his $450 55" panel. I thought tone mapping was for 1080p SDR PJ's with huge screens that wanted to simulate HDR; most of us with UHD chains simply pass through. I wonder what benefits the tone mapping improvements vs. passing through provide on that 55" entry-level panel that would require madVR? Maybe I haven't taken something obvious into account?
Not sure if that was meant to be a sly jab at my cheap TV?

But anyway, I thought the same thing, that it was only meant for projectors. On my TV, though, HDR always looked pretty dim; with madVR tone mapping it looks better than with any other player when playing HDR. It has been a nice improvement, as the picture is brighter now (it used to be pretty dim with HDR).

Regardless of that, I am happy to see Kodi soon having HDR working with the internal player. Your comment that it is on par with the Oppo is a big one; I hope that is the case. Screenshots don't always tell the whole story, but they are helpful for sure.
 

· Registered · 62 Posts
I thought tone mapping was for 1080p SDR PJ's with huge screens that wanted to simulate HDR; most of us with UHD chains simply pass through. I wonder what benefits the tone mapping improvements vs. passing through provide on that 55" entry-level panel that would require madVR? Maybe I haven't taken something obvious into account?

Tone mapping can be done on UHD video. Instead of using the base HDR10 "static" metadata, madVR does its own "dynamic" mapping, essentially more like Dolby Vision-style HDR, but it can do it live, on the fly. It takes a lot more GPU power than just basic madVR processing. It is also possible to take HDR measurements of individual movies and tweak the metadata for each movie to a person's liking. It can be subjectively better than HDR10. I don't know if it is better than Dolby Vision, but afaik you can't do Dolby Vision from an HTPC anyway (happy to hear if I'm wrong). I've played with tone mapping on UHD, but there are so many variables, and I haven't been able to become confident that I have the settings right, as it varies based on the display's capabilities. Plus there doesn't seem to be an easy way (that I am aware of) to compare madVR's dynamic HDR output to simple HDR10 passthrough.
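
If it helps anyone picture what "static" vs. "dynamic" means here, below is a tiny Python sketch I put together. It is purely illustrative (toy numbers and a crude curve, not madVR's actual algorithm or settings): the static approach compresses every scene against the title's single MaxCLL value, while the dynamic approach measures what each scene really contains and only compresses when it has to. That difference is where the "dim picture" complaint about one-size-fits-all mapping comes from.

```python
# Toy illustration only -- not madVR's actual algorithm or settings.
# All values are luminance in nits.

DISPLAY_PEAK = 650.0   # what the panel can actually show (made-up number)
MAXCLL = 4000.0        # static HDR10 metadata for the whole title (made-up number)

def static_source_peak(_scene_pixels):
    """HDR10-style: assume the same mastering peak for every scene."""
    return MAXCLL

def dynamic_source_peak(scene_pixels):
    """Dynamic idea: measure what this scene actually contains.
    (A real implementation would use a high percentile, not the raw max.)"""
    return max(scene_pixels)

def roll_off(nits, source_peak):
    """Crude Reinhard-style compression of source_peak into DISPLAY_PEAK."""
    if source_peak <= DISPLAY_PEAK:
        return nits                      # scene already fits, pass it through
    x = nits / source_peak
    return DISPLAY_PEAK * 2.0 * x / (1.0 + x)

# A dim scene whose real peak is only 200 nits:
scene = [0.05, 5.0, 40.0, 120.0, 200.0]
for px in scene:
    print(px,
          round(roll_off(px, static_source_peak(scene)), 1),   # dimmed hard
          round(roll_off(px, dynamic_source_peak(scene)), 1))  # left alone
```

With the static peak, that 120-nit pixel comes out at roughly 38 nits; with the measured peak, the scene fits the display and is passed through untouched.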
 

· Registered · 1,715 Posts
with madVR tone mapping it looks better than with any other player when playing HDR.
Your comment that it is on par with the Oppo is a big one; I hope that is the case. Screenshots don't always tell the whole story, but they are helpful for sure.
What other non-madVR UHD players have you used?
Have you visited the thread and looked at the screenshots and related comments?
 

· Registered · 1,715 Posts
Tone mapping can be done on UHD video. Instead of using the base HDR10 "static" metadata, madVR does its own "dynamic" mapping, essentially more like Dolby Vision-style HDR, but it can do it live, on the fly. It takes a lot more GPU power than just basic madVR processing. It is also possible to take HDR measurements of individual movies and tweak the metadata for each movie to a person's liking. It can be subjectively better than HDR10. I don't know if it is better than Dolby Vision, but afaik you can't do Dolby Vision from an HTPC anyway (happy to hear if I'm wrong). I've played with tone mapping on UHD, but there are so many variables, and I haven't been able to become confident that I have the settings right, as it varies based on the display's capabilities. Plus there doesn't seem to be an easy way (that I am aware of) to compare madVR's dynamic HDR output to simple HDR10 passthrough.
madVR tone map processing requires a pretty beefy, expensive video card. An entry-level and even a mid-level UHD card isn't going to work. The return on cost is further diminished on smaller displays. Of course, those with PJ's just have to bite the bullet. For those who want a nice no-cost front end and a way to play UHD files from their computer without dipping into their kids' college funds, this should fit the bill imo. I'd feel bad if it were dismissed because it doesn't incorporate madVR and therefore must be inferior, especially when backed up by the need for expensive tone mapping on small, UHD-passthrough-capable displays.
 

· Registered · 8,244 Posts
What other non-madVR UHD players have you used?
Have you visited the thread and looked at the screenshots and related comments?
I have used the Nvidia Shield (with various apps), Roku TV, and just HDR passthrough on a Windows PC. For my setup they all look dim. This is seen across two TVs, hence the Roku TV.

I have not been to the thread; do you have a link?

madVR tone map processing requires a pretty beefy, expensive video card. An entry-level and even a mid-level UHD card isn't going to work. The return on cost is further diminished on smaller displays. Of course, those with PJ's just have to bite the bullet. For those who want a nice no-cost front end and a way to play UHD files from their computer without dipping into their kids' college funds, this should fit the bill imo. I'd feel bad if it were dismissed because it doesn't incorporate madVR and therefore must be inferior, especially when backed up by the need for expensive tone mapping on small, UHD-passthrough-capable displays.
I'm doing it with an old GTX 1060; these can be had on eBay for around $125-150, probably not breaking the college fund.

I get what you mean, but let's be serious.
 

· Registered · 1,715 Posts
I have used the Nvidia Shield (with various apps), Roku TV, and just HDR passthrough on a Windows PC. For my setup they all look dim. This is seen across two TVs, hence the Roku TV.

I have not been to the thread; do you have a link?

I'm doing it with an old GTX 1060; these can be had on eBay for around $125-150, probably not breaking the college fund.

I get what you mean, but let's be serious.
To be serious, the 3GB models are abundant at that price, especially used ones. Not owning one, I'd assume 3GB is insufficient for tone mapping duties, but I could be wrong.

The more capable 6GB models are all over $200 and supposedly new. These are also older models.

The new 2060's are trending at $360.

I know all those prices are inexpensive to most, but for many others they are outrageous for a PC video card. When you get up into the 2070 or 2080 models, they're untouchable for many of us frugal enthusiasts who are savers, not borrowers, bucking the current world economic climate of influencer advice.
 

· Registered · 8,244 Posts
To be serious, the 3GB models are abundant at that price, especially used ones. Not owning one, I'd assume 3GB is insufficient for tone mapping duties, but I could be wrong.

The more capable 6GB models are all over $200 and supposedly new. These are also older models.

The new 2060's are trending at $360.

I know all those prices are inexpensive to most, but for many others they are outrageous for a PC video card. When you get up into the 2070 or 2080 models, they're untouchable for many of us frugal enthusiasts who are savers, not borrowers, bucking the current world economic climate of influencer advice.
If you go to eBay there are many Buy It Now options under $150 for the 6GB version. I understand the frugal buyer, trust me. No need to assume that others are just burying themselves in debt to get a better madVR experience. You seem to like to take underhanded jabs. I'm just going to move on.
 

· Registered · 62 Posts
madVR tone map processing requires a pretty beefy, expensive video card. An entry-level and even a mid-level UHD card isn't going to work. The return on cost is further diminished on smaller displays. Of course, those with PJ's just have to bite the bullet. For those who want a nice no-cost front end and a way to play UHD files from their computer without dipping into their kids' college funds, this should fit the bill imo. I'd feel bad if it were dismissed because it doesn't incorporate madVR and therefore must be inferior, especially when backed up by the need for expensive tone mapping on small, UHD-passthrough-capable displays.

I know it takes a more powerful card, which is why I mentioned it. I only gave my answer because you said:


"I thought tone mapping was for 1080p SDR PJ's with huge screens that wanted to simulate HDR"


and


"I wonder what benefits the tone mapping improvements"


So I tried to answer those questions. Cheers.
 

· Registered · 1,715 Posts
I did go to eBay before making my comments, to be sure. You wanted to be serious, so let's keep it real. No one is taking underhanded jabs. Link one brand-new 1060 6GB full-sized dual-fan / Buy It Now listing that isn't open box, refurbished, used, non-generic, for parts only, etc., for under $150 including shipping, and I'll believe your comments.

To stay on point though, much less expensive cards that don't need madVR tone mapping will work just fine with this new Kodi fork. I don't have any need for madVR tone mapping that would require a video card upgrade. I pass through HDR because my entire chain is UHD HDR, not faux-K and native 1080p. But since my card is not capable of tone mapping using madVR processing, I don't know for certain whether it would create any distinguishable real-world difference. My due diligence tells me most likely not, and perhaps worse, which is why I don't rush out and upgrade my 960 and instead continue enjoying things as I have for years.

Furthermore, the upscaling of sub-2160p resolutions looks very, very good with my cheap, archaic GTX 960 4GB handling the hardware decoding duties at NGU high/medium. I don't get the "dim" picture you report, even though I'm not using madVR tone mapping. I still don't believe that tone mapping instead of passing through is of any benefit on an HDR-capable display, especially at a 55" or 65" screen size. Maybe someone could put up screenshots, passthrough vs. tone mapped, on a 65" so some of us can see what we're missing... or not missing.

Until then, this new Kodi fork appears just fine on my display... without madVR. This isn't to say I'm not a madVR fan. It's a daily driver for me also.
 

· Registered · 2,039 Posts
Based on the snapshots the author provided, the picture VideoPlayer produces is on par with the Oppo.
That is factually incorrect. That single, zoomed-out screenshot does not qualify as a critical analysis of image quality between a high-end Oppo video player and Nvidia DXVA2 video rendering through VideoPlayer.

There will always be a visible difference between these players, but given the source is a native 4K UHD video played on a native 4K UHD display, that visual difference may be small or even unnoticeable to most viewers. Many video calibration experts would testify that the image quality of the Oppo and Kodi VideoPlayer is not entirely identical. This has been true in previous comparisons between the Oppo players and other lower-quality Blu-ray players, with which VideoPlayer would be associated, where the Oppo has been considered the better reference video player.

A better way to say it is that the visual difference between Nvidia DXVA2 video rendering and the Oppo player is not significant enough to ignore Windows and Kodi VideoPlayer as a viable way to watch and enjoy 4K UHD content on Windows.

I see at least one poster here prefers madVR tone mapping instead of passthrough on his $450 55" panel. I thought tone mapping was for 1080p SDR PJ's with huge screens that wanted to simulate HDR; most of us with UHD chains simply pass through. I wonder what benefits the tone mapping improvements vs. passing through provide on that 55" entry-level panel that would require madVR? Maybe I haven't taken something obvious into account?

Imo, madVR processing on a 55" UHD panel isn't even needed, nor will you notice any big improvements, now that there are other non-madVR HDR-switching players using private GPU APIs.
brazen, I think you are a fine contributor to the forums, but this is a horrible and uneducated statement about tone mapping and watching 4K HDR content on a TV.

Presenting HDR content on any display type has NOTHING to do with screen size or resolution. HDR video is about using a much larger range of brightness and contrast. In SDR video, blue peaks at only 8 nits. In HDR video, bright blue on a 1,000 nit display peaks at 80 nits, and HDR peak white can be as high as 10,000 nits. That is a massive increase in peak brightness that must be mapped from top to bottom by the display panel without clipping or dimming the image. The less peak brightness a display has, the less bright blue becomes and the more difficult it becomes to display that 10,000 nit highlight without clipping it or making the rest of the image too dim.
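
To put some numbers on that range: HDR10 encodes brightness with the SMPTE ST 2084 "PQ" curve, which maps the 0-1 signal range onto absolute luminance up to 10,000 nits. Here is a quick Python sketch of that transfer function (constants as I understand the published spec; treat it as a reference sketch, not production code), which shows how much of the signal range sits above what a 500-700 nit panel can actually reproduce:

```python
# Sketch of the ST 2084 (PQ) EOTF used by HDR10: signal value (0..1) -> nits.
# Constants as published in the spec, to the best of my knowledge.

M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_to_nits(signal: float) -> float:
    """Convert a PQ-encoded signal value (0.0-1.0) to luminance in nits."""
    p = signal ** (1.0 / M2)
    y = max(p - C1, 0.0) / (C2 - C3 * p)
    return 10000.0 * y ** (1.0 / M1)

# Roughly: 0.5 -> ~92 nits, 0.75 -> ~1,000 nits, 1.0 -> 10,000 nits.
# Everything above ~0.7 on the signal axis is brighter than a 650-nit panel
# can show natively; that top chunk is what tone mapping has to fold back in.
for s in (0.25, 0.5, 0.75, 0.9, 1.0):
    print(s, round(pq_to_nits(s), 1))
```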

On Whiteboy's display, the image tends to be mapped with less accurate EOTF tracking and clipped highlight detail because it lacks the necessary peak brightness (range) to accurately display HDR video. Using madVR's tone mapping can offer a significant improvement in image quality by improving the display's gamma tracking at the low end to make the whole image brighter and by doing a far better job not clipping the specular highlights at the top of the range. This is no small thing for less bright displays.

Simply watching a video from start to end and saying I saw no visual loss of quality is not proof of faithful and accurate rendition of HDR video. With the right demonstration content, most displays will show obvious deficiencies in tone mapping quality during difficult scenes. Your edge-lit display with 500 nits of peak brightness cannot accurately represent the bright HDR highlights that occupy less than 10% of the screen area (e.g., most HDR specular highlights in HDR videos) without having a huge number of local dimming zones positioned at the back of the display and an appreciable increase in peak brightness to go with it. It is simply not possible with edge-lit LED technology to balance the dark parts and bright parts of the image at the same time to produce proper HDR highlights. If you don't object to watching any HDR scenes on your TV, then your display is doing a very good job of tone mapping. But that is not the case for many high-quality HDR displays.

Even the best current HDR displays struggle with this task at times. Take this brand new, expensive Sony A9F OLED for example. This image was posted in the Spears & Munsil UHD HDR Benchmark thread:

https://www.avsforum.com/forum/139-display-calibration/3075780-spears-munsil-uhd-hdr-benchmark-disc-discussion-13.html#post58448678

This screenshot comparison comes from the Spears & Munsil UHD HDR Benchmark disc. The first image is presented in Dolby Vision and shows a faithful rendition of the scene with a good tone mapping curve. However, when the display is asked to do the tone mapping itself with an HDR10 input, severe clipping of the highlights occurs, even when the source is first tone mapped to 1,000 nits by a Panasonic Blu-ray player. The peak brightness of this scene can't be managed by the display when taking the 1,000 nit input from the Blu-ray player, which has already been tone mapped from 4,000 nits to 1,000 nits, and mapping it to the 650-700 nits that the display can actually produce. Even with the small amount of tone mapping required, the display's tone mapping still fails to preserve the majority of the highlights in the image. As a result, the whole image is blown out due to poor tone mapping.

Pixel shader tone mapping in madVR set to 650-700 nits could bring all of this highlight detail back into the image to look more like the Dolby Vision example. On a projector, even more tone mapping would be required to represent this scene, and converting the source to SDR would help balance the lower part of the image without dimming it too much while still leaving room to map the highlights and avoid blowing out the image. Again, for a projector owner, this type of tone mapping is no small thing.
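
To make the clipping point concrete with toy numbers (illustrative only; not what the Sony, the Panasonic, or madVR actually compute): if a 650-nit panel simply hard-clips a 1,000-nit input, a 700-nit, an 850-nit, and a 1,000-nit highlight all land on the same value and that detail is gone, whereas even a crude knee keeps them distinct, just compressed:

```python
# Toy comparison of hard clipping vs. a soft knee for a 650-nit display
# that is fed a 1,000-nit signal. Illustrative numbers only.

PEAK = 650.0    # display peak in nits (assumed)
KNEE = 450.0    # below this, leave the image alone (assumed)

def hard_clip(nits):
    """Everything above the panel's peak becomes the same value."""
    return min(nits, PEAK)

def soft_knee(nits, source_peak=1000.0):
    """Compress [KNEE, source_peak] into [KNEE, PEAK] so highlights stay distinct."""
    if nits <= KNEE:
        return nits
    t = (nits - KNEE) / (source_peak - KNEE)   # 0..1 across the compressed range
    return KNEE + (PEAK - KNEE) * t

for n in (400, 700, 850, 1000):
    print(n, hard_clip(n), round(soft_knee(n), 1))
# hard_clip: 400, 650, 650, 650    (the three highlights merge into one)
# soft_knee: 400, 540.9, 595.5, 650 (compressed but still separate)
```

A real tone curve uses a smooth roll-off rather than a straight segment, but the trade-off is the same one described above: spend some of the display's range keeping highlights separate instead of throwing them away.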

This is a video that explains how HDR highlights are mapped back into display range with good tone mapping on TVs with proper HDR brightness levels, but static and ineffective tone mapping curves. It is called the Panasonic HDR Optimizer and is similar to the option available in madVR when you choose pixel shaders with HDR output:

https://www.youtube.com/watch?v=oTw_Toh0PzA
This form of tone mapping can even benefit bright HDR TVs that use straight HDR passthrough because the shape of the tone curve used by madVR is better at preserving bright specular highlight detail and the tone mapping can be done dynamically for each scene.
 

· Registered · 1,715 Posts
That is factually incorrect. That single, zoomed-out screenshot does not qualify as a critical analysis of image quality between a high-end Oppo video player and Nvidia DXVA2 video rendering through VideoPlayer.

There will always be a visible difference between these players, but given the source is a native 4K UHD video played on a native 4K UHD display, that visual difference may be small or even unnoticeable to most viewers. Many video calibration experts would testify that the image quality of the Oppo and Kodi VideoPlayer is not entirely identical. This has been true in previous comparisons between the Oppo players and other lower-quality Blu-ray players, with which VideoPlayer would be associated, where the Oppo has been considered the better reference video player.

A better way to say it is that the visual difference between Nvidia DXVA2 video rendering and the Oppo player is not significant enough to ignore Windows and Kodi VideoPlayer as a viable way to watch and enjoy 4K UHD content on Windows.
Yes, the screenshots the author provided are not a critical analysis. They were more about a tone mapping comparison, since it was struggling prior to madshi's involvement, and they were his proof that it had been corrected. You are correct. What you wrote is a much better way of transferring my thoughts to words, something I struggle with at times.

I think VideoPlayer provides REASONABLE tone mapping, and it looks pretty damn good on my crappy display. I'm happy with how everything looks. I never watch something and tell myself it looks too dim, or that the specular highlights come at the cost of crushed blacks, or that I'm incapable of filling the gamut and my nit value is too low. I'm at a very happy medium without breaking the bank. I think many more users fall into this category than the high-end setup niche. Still, it's interesting to know what more money will buy.

I could spend a whole bunch of money and upgrade everything I have to further improve the picture, but the cost vs. return to me is negligible, to be perfectly honest. I need a bigger jump. Those improvements would be so small, imo, I can't possibly justify the money to myself or anyone else. I semi, very semi, splurged a few years ago and upgraded to 4K, knowing higher k's and newer standards will always be on the horizon. I will do the same when 8K approaches mainstream and explore HDMI 2.1 using DYNAMIC HDR on a frame-by-frame basis and, if nothing else, more display zones. OLED will not be in my future. I will wait until pricing is affordable for me and will not be an early adopter. I'm not well off enough or willing to go into debt and borrow to please myself watching TV. I spend what I have and can afford on other things too, but that's another story.


brazen, I think you are a fine contributor to the forums, but this is a horrible and uneducated statement about tone mapping and watching 4K HDR content on a TV.

Presenting HDR content on any display type has NOTHING to do with screen size or resolution.
I brought up screen size because PJ's are known to be dimmer than panels, and the emphasis of my other statement was on alternate ways to switch into HDR mode, also using APIs, without relying on madVR as the sole way to accomplish this. I could have explained that better, but some things I take for granted that others understand without my having to say it.

I'd like to thank you for taking the time to share what you did in the rest of your post below. I read a lot about tone mapping, but since I can't actually test it, it's hard to retain the theories. I have to go by testimonials, and I continue to respect yours as I have others'.

HDR video is about using a much larger range of brightness and contrast. In SDR video, blue peaks at only 8 nits. In HDR video, bright blue on a 1,000 nit display peaks at 80 nits, and HDR peak white can be as high as 10,000 nits. That is a massive increase in peak brightness that must be mapped from top to bottom by the display panel without clipping or dimming the image. The less peak brightness a display has, the less bright blue becomes and the more difficult it becomes to display that 10,000 nit highlight without clipping it or making the rest of the image too dim.

On Whiteboy's display, the image tends to be mapped with less accurate EOTF tracking and clipped highlight detail because it lacks the necessary peak brightness (range) to accurately display HDR video. Using madVR's tone mapping can offer a significant improvement in image quality by improving the display's gamma tracking at the low end to make the whole image brighter and by doing a far better job not clipping the specular highlights at the top of the range. This is no small thing for less bright displays.

Simply watching a video from start to end and saying I saw no visual loss of quality is not proof of faithful and accurate rendition of HDR video. With the right demonstration content, most displays will show obvious deficiencies in tone mapping quality during difficult scenes. Your edge-lit display with 500 nits of peak brightness cannot accurately represent the bright HDR highlights that occupy less than 10% of the screen area (e.g., most HDR specular highlights in HDR videos) without having a huge number of local dimming zones positioned at the back of the display and an appreciable increase in peak brightness to go with it. It is simply not possible with edge-lit LED technology to balance the dark parts and bright parts of the image at the same time to produce proper HDR highlights. If you don't object to watching any HDR scenes on your TV, then your display is doing a very good job of tone mapping. But that is not the case for many high-quality HDR displays.

Even the best current HDR displays struggle with this task at times. Take this brand new, expensive Sony A9F OLED for example. This image was posted in the Spears & Munsil UHD HDR Benchmark thread:

https://www.avsforum.com/forum/139-display-calibration/3075780-spears-munsil-uhd-hdr-benchmark-disc-discussion-13.html#post58448678

This screenshot comparison comes from the Spears & Munsil UHD HDR Benchmark disc. The first image is presented in Dolby Vision and shows a faithful rendition of the scene with a good tone mapping curve. However, when the display is asked to do the tone mapping itself with an HDR10 input, severe clipping of the highlights occurs, even when the source is first tone mapped to 1,000 nits by a Panasonic Blu-ray player. The peak brightness of this scene can't be managed by the display when taking the 1,000 nit input from the Blu-ray player, which has already been tone mapped from 4,000 nits to 1,000 nits, and mapping it to the 650-700 nits that the display can actually produce. Even with the small amount of tone mapping required, the display's tone mapping still fails to preserve the majority of the highlights in the image. As a result, the whole image is blown out due to poor tone mapping.

Pixel shader tone mapping in madVR set to 650-700 nits could bring all of this highlight detail back into the image to look more like the Dolby Vision example. On a projector, even more tone mapping would be required to represent this scene, and converting the source to SDR would help balance the lower part of the image without dimming it too much while still leaving room to map the highlights and avoid blowing out the image. Again, for a projector owner, this type of tone mapping is no small thing.

This is a video that explains how HDR highlights are mapped back into display range with good tone mapping on TVs with proper HDR brightness levels, but static and ineffective tone mapping curves. It is called the Panasonic HDR Optimizer and is similar to the option available in madVR when you choose pixel shaders with HDR output:

https://www.youtube.com/watch?v=oTw_Toh0PzA

This form of tone mapping can even benefit bright HDR TVs that use straight HDR passthrough because the shape of the tone curve used by madVR is better at preserving bright specular highlight detail and the tone mapping can be done dynamically for each scene.
I think my panel reaches 540 nits on normal content. HDR is even higher. This doesn't mean I can't display HDR content properly. As you stated, specular highlights will not be as bright as on newer high-end panels. I can live without the brighter highlights for the cost, tbh. PJ's I won't speculate on, because I don't know their nit values, but I suspect they are much lower than panels' and greatly improve using tone mapping via madVR.

Are you suggesting that my panel has such a low nit value that I would benefit from using madVR tone mapping instead of passing through HDR content?

That UB9000 has a pretty nice feature with that optimizer, taming the specular highlights without affecting darker images, and I understand you linked it as a visual tone mapping reference. I use a PC for all my playback though...

Thanks again for participating in this discussion.
 