Fair Warning: some people will want to skip this. It's mostly about problems you might encounter trying to mix regular consumer electronics devices with HTPCs.
Originally Posted by JonStatt
If we think back to non-HDMI days
PAL was 0 IRE, US NTSC was 7.5 IRE, and Japan NTSC was 0 IRE.
This was the reason for the mess.
I don't believe standard HDMI defines black as 7.5 IRE, does it? If you want to support blacker than black etc. with expanded HDMI then perhaps that changes. But just for a standard HDMI interchange, surely we didn't centre it around the US when the rest of the world was 0 IRE?
EDIT: I scoured a few forum posts and there seems to be a lot of confusion followed by a consensus to use 0 IRE for standard HDMI. For example, see http://www.avforums.com/forums/denon...ettings-3.html
My suggestion is this. Assuming the RS40 is configured for standard HDMI, try it with a test disc that outputs a 0 IRE field. If the black gets darker when changing from 7.5 IRE to 0 IRE, then 0 is the correct setting. If it doesn't change, then either the setting does nothing over HDMI (so it doesn't matter) or 7.5 is the right setting.
My dad has a favorite expression: "Don't get me started lying to you." That translates roughly to, "Don't ask me a question I'm not able to answer, because I might just end up confusing both of us even more." I've been unclear about this issue for a very long time, but...
My understanding is that there is no such thing as IRE for digital signals, but that 7.5 IRE is loosely comparable to digital 16. Standard/video HDMI is a signal in the range of 16-235. Enhanced/expanded/PC-level HDMI is a signal in the range of 0-255. Most consumer devices output standard HDMI, while in the past most computer video cards output the 0-255 range. Current-generation computer video cards (at least all the AMD Radeon cards) can be switched to output either.
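To keep the numbers straight in my own head, here's a little Python sketch of the two 8-bit ranges and the usual scaling between them. This is just my own illustration of the arithmetic, not something pulled from a spec:

    # Expand video/limited-range (16-235) to PC/full-range (0-255)
    def limited_to_full(y):
        return round((y - 16) * 255 / 219)

    # Compress PC/full-range (0-255) down to video/limited-range (16-235)
    def full_to_limited(y):
        return round(y * 219 / 255 + 16)

    print(limited_to_full(16))    # 0   - video black maps to PC black
    print(limited_to_full(235))   # 255 - video white maps to PC white
    print(full_to_limited(0))     # 16
    print(full_to_limited(255))   # 235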
Where it gets especially confusing is that (from what I've read) standard HDMI is capable of carrying blacker-than-black (BTB) and whiter-than-white (WTW) shades, while enhanced HDMI signals can't. That seems counterintuitive to me, but I verified it in the past by trying to see digital values lower than 16 on a computer monitor. 16 was the lowest shade I could see, no matter how much I cranked the brightness. Same for whites above digital 235 - no white shades above that level if the computer was outputting PC/enhanced-level video.
But here's where the issue has always lost me completely. From what I've read, the "enhanced" or "expanded" part of PC digital signals means that the 16-235 levels are expanded out to the full 0-255 range (0 then becomes the lowest visible shade you can see, but it was 16 before the expansion, so no BTB). Why that would be done is a mystery to me. I've asked the question a bunch of times over the years, and I've gotten tons of conflicting responses (or you could interpret it as my simply not understanding the responses). I'd appreciate any links to articles that explain the difference, or that contradict my understanding - anyone?
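The part of the expansion story that finally clicked for me is the plain arithmetic: once 16-235 gets stretched out to 0-255, anything that was sitting below 16 or above 235 lands outside the 8-bit range and has to be clipped. Here's the same idea as a quick sketch (again, just my own illustration):

    # Why the expansion kills BTB/WTW: anything outside 16-235 lands outside
    # 0-255 after scaling, and an 8-bit output has nowhere else to put it.
    def limited_to_full_clipped(y):
        v = round((y - 16) * 255 / 219)
        return min(255, max(0, v))

    print(limited_to_full_clipped(4))     # 0   - BTB (below 16) clips to plain black
    print(limited_to_full_clipped(16))    # 0   - reference black
    print(limited_to_full_clipped(235))   # 255 - reference white
    print(limited_to_full_clipped(240))   # 255 - WTW (above 235) clips to plain white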
Regardless of the explanation, in practical terms, normal HDMI signals allow us to see BTB and WTW; PC/enhanced levels don't. Even that terminology seems reversed at times, though, so sometimes it's just a matter of experimenting until you find out what's really being output. The problem comes when you try to mix different sources (for example, with an HDMI switch). If one of the devices on the switch uses standard HDMI and another uses enhanced, your display might show an image that looks fine for one source but washed out or too dark for the other. Auto detection of the HDMI signal type doesn't always work.
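For what it's worth, here are the rough numbers behind the "washed out" case - a source sending video levels (16-235) into a display expecting PC levels (0-255), so no expansion happens at all. Again, just my own back-of-the-envelope sketch:

    # If nothing expands the signal, video black and white land well inside
    # the panel's range instead of at its extremes.
    for name, code in (("video black", 16), ("video white", 235)):
        print(f"{name} (code {code}) displays at {code / 255 * 100:.0f}% of full output")
    # video black (code 16) displays at 6% of full output   -> grayish blacks
    # video white (code 235) displays at 92% of full output -> dull whites

The opposite mismatch (PC levels into a display expecting video levels) gives you crushed shadows and blown-out highlights instead.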
What I try to do is make sure all my devices are outputting the same type of signal. I have two HTPCs and two satellite receivers on one HDMI matrix switch. Both Dish receivers output standard HDMI, and on each of the HTPCs I have the video card (Radeons in both) set for standard HDMI as well. (AMD calls it "RGB 4:4:4 Pixel Format Studio (Limited RGB)".) There are actually four different settings in Catalyst Control Center, and I've experimented with all of them. Get it wrong, and setting black and white levels for a projector can become a nightmare as you switch back and forth between sources. On top of all this, there are inconsistencies in Windows as to what individual software programs output - Windows Media Player and Windows Media Center default to different HDMI signal types. It can be maddening.
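One thing that helps me sanity-check a chain is throwing up a gray-bar pattern with BTB and WTW steps in it, along the lines of JonStatt's test-disc suggestion. Something like this will generate one from an HTPC - a rough sketch, assuming the Pillow imaging library is installed, and the particular codes and resolution are just my picks:

    from PIL import Image

    codes = [4, 16, 20, 231, 235, 240]   # BTB, black, near-black, near-white, white, WTW
    width, height = 1920, 1080
    bar_w = width // len(codes)

    img = Image.new("RGB", (width, height))
    for i, code in enumerate(codes):
        bar = Image.new("RGB", (bar_w, height), (code, code, code))
        img.paste(bar, (i * bar_w, 0))

    img.save("levels_test_pattern.png")   # display this full screen from the HTPC

If the 4 and 240 bars look identical to the 16 and 235 bars by the time the image reaches the screen, something in the chain is clipping BTB/WTW.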
As always, I'm open to anyone with more technical expertise than I have correcting any of the above statements - I'm all for clarifying these issues. Every time I think I have a handle on them, something unexpected comes up to confuse me again. I wouldn't be without my HTPCs, but it's definitely a challenge dealing with the problems that come up. On the bright side, things seem to be improving; it's definitely easier than it was a few years ago to make everything work together.