Why does SXGA (via DVI/VGA) look so much better than 1080p (via HDMI)? - AVS Forum

12-29-2012, 10:55 AM - Thread Starter
KenIAm - Member
This is something I’ve wondered about for far too long, so I’d like to know if anyone has an answer:

I have my HTPC hooked up to my TV via an HDMI cable and set to 1080p (1920x1080 @ 60 Hz). But I've always noticed that the OTA feed direct to the TV looks much better than the one through the HTPC (using Win7 64-bit WMC and a Hauppauge 2250). Even though the TV settings are the same, the OTA picture is noticeably brighter, more contrasty, and more 3D-like than the HTPC picture.

The TV (a Toshiba 55SV670U) has a "PC IN" terminal (an analog VGA port). So I connected the PC via the DVI port on the graphics card (an ATI Radeon 5450 made by PowerColor). The highest resolution available with that connection is SXGA (1280x1024). The picture obviously doesn't fill the width of the TV, but it looks indistinguishable from the OTA quality (IOW, far better than the 1080p HDMI connection).

I’m using the latest ATI 5400 series drivers and CCC, and the HDMI cable is a 2 meter High Speed HDMI cable. Anyone have any explanations, thoughts, or ideas?

Thanks!

Ken

12-29-2012, 12:31 PM
Tong Chia - AVS Special Member
Quote:
Originally Posted by KenIAm View Post

VGA uses RGB encoding, so the color information is transmitted at full bandwidth. The black levels (brightness) are also at full resolution (0-255) prior to the D/A conversion.

HDMI, on the other hand, is a cheap consumer digital interface. In the interest of saving bandwidth (and pennies), it tries to economize on the transmission. The result is a transformed encoding in which RGB is converted to YCbCr; in the process the color resolution is halved (4:2:2 at 1080p), and to make the issue worse, the bandwidth of the intensity component Y is reduced by compressing its values to between 16 and 235. The official reason is to be able to represent "below black" accurately.

The reduction in color resolution (chrominance/chroma) and the compression of the intensity (luminance/luma) are responsible for the relative difference you are seeing.
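
To put rough numbers on it, here is a toy Python sketch of that conversion (my own illustration, assuming BT.709 coefficients and 8-bit values; no driver literally runs this):

Code:
# Illustration only: full-range 8-bit RGB -> limited-range YCbCr (BT.709).
def rgb_to_ycbcr_709_limited(r, g, b):
    r, g, b = r / 255.0, g / 255.0, b / 255.0     # normalize to 0..1
    y  = 0.2126 * r + 0.7152 * g + 0.0722 * b     # BT.709 luma weights
    cb = (b - y) / 1.8556                         # scaled color differences
    cr = (r - y) / 1.5748
    # Quantize: luma gets 16-235, chroma gets 16-240 centered on 128.
    return (round(16 + 219 * y), round(128 + 224 * cb), round(128 + 224 * cr))

# 4:2:2 keeps every luma sample but shares one Cb/Cr pair between each two
# horizontally adjacent pixels -- that is the halved color resolution.
def subsample_422(row_of_rgb):
    ycc = [rgb_to_ycbcr_709_limited(*p) for p in row_of_rgb]
    return [(y, ycc[i - i % 2][1], ycc[i - i % 2][2])
            for i, (y, _, _) in enumerate(ycc)]

print(rgb_to_ycbcr_709_limited(255, 255, 255))    # (235, 128, 128): white tops out at 235
print(rgb_to_ycbcr_709_limited(0, 0, 0))          # (16, 128, 128): black sits at 16
print(subsample_422([(255, 0, 0), (0, 0, 255)]))  # both pixels share red's chroma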

You can get the ATI card out of this silly mode by forcing it to use the following via the CCC control panel:
1) RGB and not YCbCr
2) 0-255 for the HDMI data

However, the HDMI input on your TV must be able to support this; check the specs in your TV manual. If you have a DVI interface on the TV (rare), use that instead.

Given that you can see the effect of chroma and luma degradation, I suggest investigating the madVR renderer; it is designed to deal with exactly these kinds of problems. Your existing 54xx is just about sufficient with the default settings; if you like what you see, then get something faster.

The actual counterpart of VGA is HD-SDI, not HDMI.

12-29-2012, 03:41 PM
Foxbat121 - AVS Addicted Member
WMC defaults to video-range output over HDMI. It sounds like you have a mismatched range setup: your TV seems to be set up for full-range RGB for the PC desktop, and you didn't change WMC to full range. Hence the washed-out picture you described.

So to recap: out of the box, your PC will output a full-range (0-255) video signal, and your video card can remap that into video range over the HDMI output or leave it alone, depending on your video card settings. WMC always outputs a limited-range (16-235) signal (and if you also have the video card set up to remap, who knows what it will end up with). Various video players on the PC may output full range (WMP) or limited range, depending on their settings. TVs can be configured to accept either full range or limited range, but not both at once.
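
To make the mismatch concrete, here is a toy sketch (my own numbers, assuming plain linear remaps with 8-bit clamping; real hardware differs in the details):

Code:
# What a range mismatch does, in toy form.
def to_limited(v):                    # source squeezes 0-255 into 16-235
    return round(16 + v * 219 / 255)

def shown_as_full(v):                 # TV in full-range mode: code shown as-is
    return v

def shown_as_limited(v):              # TV in video-range mode: stretches 16-235 back out
    return min(255, max(0, round((v - 16) * 255 / 219)))

# Limited-range source into a full-range TV -> washed out (gray blacks, dim whites):
print(shown_as_full(to_limited(0)), shown_as_full(to_limited(255)))      # 16 235
# Full-range source into a video-range TV -> crushed blacks and whites:
print(shown_as_limited(0), shown_as_limited(12), shown_as_limited(240))  # 0 0 255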

What you want is to configure things as follows:

1. Keep the video output of your PC at full range; you don't want the video card to remap it. Apply a registry key change to force WMC to output full range as well. This will ensure most of your PC applications get optimal color rendering.

2. Keep your TV set to full-range mode as you have it now. If your TV doesn't have a full-range setting (most Panasonic TVs don't), you will have to use your video card to remap the PC output to limited range.

All this can be avoided if you use a WMC extender instead of connecting the PC directly. :)

12-29-2012, 05:44 PM - Thread Starter
KenIAm - Member
Thank you both for your replies. I love avsforum posters! So helpful and willing to share. And I learn so much every time I ask a question.

So here's what I've found:

1. I set my 5450 to RGB full as Tong Chia recommended. That did make a visible difference. It's still darker than the live OTA picture, but the contrast is higher. But ...

2. I discovered the blacks-and-whites test videos in WMC. Guess what? The picture I was complaining about doesn't have crushed blacks or whites, but the one using RGB 0-255 from the video card does. So what does that mean? I prefer my blacks & whites crushed?

3. I found a registry key change that is supposed to do what Foxbat121 suggested (set WMC to full range); however, it did not make any difference that I could see. I kept it at 0-255 (value=1), though, just in case.

4. I changed my TV setting to 0-255 from "auto". That was what was responsible for the crushed blacks & whites! With the TV set to 0-255 and my 5450 set to RGB full (0-255), WMC's test videos still look OK. So, definitely some improvement there.

Even though it is an improvement, it still doesn't look as good as the OTA picture. But I think the difference now is mainly brightness (it's pretty difficult to tell, flipping between two live sources like that). I'll probably fool around with the TV brightness on the computer's HDMI input to see if I can get it to look more like the OTA picture (which I really like), but without calibration I'll just be guessing. Still, it's already improved from what I had, so that's great. I will probably also play around with the madVR renderer when I get the time (because it'll be fun). If I encounter anything that might be useful to someone else reading this in the future, I'll post an update.

Thanks again, guys!

12-30-2012, 10:07 AM
Postmoderndesign - AVS Special Member
I have a Panasonic plasma that will accept WXGA, but it has a native 1080p resolution (16:9) over HDMI. I am using an Nvidia GT440 graphics card and madVR. Before I spend a lot of time fooling around with WXGA: it is my understanding that I get more pixels with HDMI. With madVR I am not seeing noticeably better color rendition from the plasma's tuner than from the HDHomeRun playing live TV under 7MC.

Would you think I could get a better picture from WXGA?

12-30-2012, 11:19 AM
Foxbat121 - AVS Addicted Member
Quote:
Originally Posted by Postmoderndesign View Post

No. The OP just has a misconfigured color space.

12-30-2012, 11:22 AM
Postmoderndesign - AVS Special Member
No. I answered my own question: WXGA does not seem to give a better picture in my case. I agree with Foxbat121.

01-02-2013, 01:31 PM
Aluminum - Senior Member
Quote:
Originally Posted by Tong Chia View Post

Good answer, although DVI inputs on TVs might not be all that rare; sometimes they are just hidden "behind" an HDMI connector, since single-link DVI and HDMI use the same wires for video.

My Vizio 55" has 4 HDMI inputs and 1 side input; when you change input source on the remote, it lists the hidden port (#3) as HDMI/DVI while all the others are just HDMI.

If the onscreen input menu or manual doesn't tell you, one possible test is that these "real" ports are the most likely to work with the extremely old-school low-res BIOS/POST screens.


I can't wait to see what caveats come with HDMI on 4K displays, but I'm sure they will ignore DisplayPort, because a superior open standard without royalties would make too much sense.

Nearly dead silent HTPC ver 2.0: i3-4340 w/ Noctua NH-L9i on Z87E-itx inside CM130 elite, fanless PSU, SSD OS drive
SAN shares via 40GbE tunneled over 56Gb infiniband links
microcenter & ebay = severe risks to my wallet

01-02-2013, 06:13 PM
Foxbat121 - AVS Addicted Member
Quote:
Originally Posted by Aluminum View Post

No, that is actually an incorrect answer. HDMI can pass raw RGB as well as YCbCr. As for full-range 0-255 vs. limited 16-235, that is the video industry's choice, made to leave headroom for blacker-than-black and whiter-than-white information.

The video bits on DVD or BD are already encoded as YCbCr in the 16-235 limited range. Converting to full-range RGB will actually introduce conversion artifacts. So it is not a question of which one is better; you just need to make sure both ends match.
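
As a toy sketch of why extra range conversions cost you something at 8 bits (my own illustration, gray ramp only; real chains convert at higher internal precision and may dither):

Code:
# Pigeonhole: 256 full-range codes cannot fit into the 220 limited-range
# codes and all come back intact after a round trip.
def full_to_limited(v):
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    return min(255, max(0, round((v - 16) * 255 / 219)))

lost = [v for v in range(256) if limited_to_full(full_to_limited(v)) != v]
print(len(lost), "of 256 gray codes do not survive the round trip")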

01-02-2013, 06:26 PM
renethx - AVS Club Gold
Internally the data are converted to RGB first for video processing anyway; there is no reason to reconvert to YCbCr just for output...

01-03-2013, 03:06 AM
Tong Chia - AVS Special Member
Quote:
Originally Posted by Foxbat121 View Post

There is a vast difference between what the spec says and how the consumer electronics companies have chosen to implement it. They are driven by costs, and that is the reason they go YCbCr. Making matters worse, PC video cards automatically use YCbCr the moment they see an HDMI connection; they do this because some cheaper TV sets do not support RGB, and you end up with strange magenta colors if you force RGB.

The subtlety in the encoding on BD is that it is 4:2:0, and HDMI 1.x does not support this. The source must do some form of color space conversion to get it to 4:2:2, with the TV doing the rest, so you have two conversions: 4:2:0 -> 4:2:2 -> RGB (eventually).

RGB is the best, provided the TV's scaler electronics do not mess with the input and send it to the display panel untouched, as renethx pointed out.
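
Here is a toy sketch of those two steps (my own illustration; I use simple sample duplication where real players and TVs use filtered interpolation):

Code:
# BD chroma chain in miniature: 4:2:0 -> 4:2:2 (in the source) -> 4:4:4 (in the TV).
# 4:2:0 stores one Cb (and one Cr) sample per 2x2 block of luma samples.
cb_420 = [[100, 104],
          [ 96, 110]]                 # chroma plane for a 4x4-pixel patch

def up_420_to_422(c):                 # step 1: double the chroma rows
    return [row for row in c for _ in (0, 1)]

def up_422_to_444(c):                 # step 2: double the chroma columns
    return [[s for s in row for _ in (0, 1)] for row in c]

cb_422 = up_420_to_422(cb_420)        # 4 rows x 2 cols
cb_444 = up_422_to_444(cb_422)        # 4 rows x 4 cols: one sample per pixel
print(cb_444)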

01-03-2013, 07:05 AM
Foxbat121 - AVS Addicted Member
Quote:
Originally Posted by Tong Chia View Post

Your perception of HDMI is totally wrong. Support for the RGB format has been mandated from day one, with YCbCr being optional. The reason most video cards and TVs default to YCbCr is that it is easier on bandwidth; e.g., you can get away with a lower-quality HDMI cable than you can with RGB. Something that is important to projector users.

RGB is not the best; the two are equal. The reason consumer TVs support limited-range video is that it is the film industry standard, and the film industry considers the full-range RGB used on the PC desktop inferior.

01-03-2013, 10:52 AM
Tong Chia - AVS Special Member
Quote:
Originally Posted by Foxbat121 View Post

No, it is not. Although it is mandated by the standard, not all TV sets actually implement it; I got burnt by a few TV sets I bought.

The question was about video processing on the PC, not about video in general.

01-03-2013, 10:55 AM
Foxbat121 - AVS Addicted Member
OP's issue has been resolved. Stop spreading misinformation.

01-03-2013, 01:13 PM
Tong Chia - AVS Special Member
Quote:
Originally Posted by Foxbat121 View Post

The recent PC video renderers (VMR, EVR, madVR) work in the RGB domain, and madVR's documentation takes pains to encourage the use of RGB. This is not misinformation, but what it takes to get the best out of the platform.

As for HDMI, the problem is not the spec but the broken handling of RGB over HDMI in some TVs. I went through this exercise with Panasonic twice to get the firmware on my TVs fixed.
I have also returned TVs where the manufacturer did not step up. Before buying my most recent TV, I brought along a laptop with 0-255 RGB output over HDMI as a final check. My point is not to put 100% faith in the manufacturers implementing this correctly; I do not consider that misinformation.

01-03-2013, 02:13 PM
Foxbat121 - AVS Addicted Member
You have your priorities backwards. Consumer TVs only need to support the 16-235 limited range; there is no requirement to support 0-255 full range (nor is there any advantage to it). Panasonic TVs are one typical example. Using the 16-235 limited range does not mean inferior; clipping the BtB or WtW does. Since YCbCr automatically implies the 16-235 limited range, most video cards default to it for best compatibility.

In fact, I suspect that TVs that do support 0-255 full-range RGB may internally remap it into the 16-235 limited range in order to share the processing circuitry/modules with other video paths where 16-235 is the normal setting. But who knows; it doesn't really matter.

If you convert your 16-235 video into 0-255 full range, you automatically lose BtB and WtW, which is a big no-no to many video experts. You will never be able to calibrate your TV properly this way.
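
A quick sketch of that loss (my own toy math, a linear remap with 8-bit clamping):

Code:
# Expanding 16-235 video to 0-255 clamps away the BtB/WtW headroom.
def expand(v):
    return min(255, max(0, round((v - 16) * 255 / 219)))

print(expand(16), expand(235))   # 0 255: legal black and white map fine
print(expand(4),  expand(250))   # 0 255: BtB (<16) and WtW (>235) are clipped,
                                 # so test patterns that rely on them stop working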

01-03-2013, 09:11 PM
Tong Chia - AVS Special Member
Quote:
Originally Posted by Foxbat121 View Post

I look for TVs that correctly do both RGB 0-255 (for the PC) and YCbCr 16-235 (for everything else).

Limited range on PCs requires the video drivers to do the correct thing, and both Nvidia and AMD have a history of not keeping this functionality stable: Nvidia periodically clips BtB, and AMD compresses the levels in one release and clips them in the next.

This is one reason I switched to madVR; it handles this with no fuss and no regressions.

HTPCs require a display capable of RGB 0-255 for the video renderer to perform at its best; the renderer on Windows cannot output YCbCr even if it wanted to. madVR's author, Madshi, put it most succinctly:
Quote:
For us HTPC users it's even worse: The graphics cards do not offer any way for us developers to output untouched YCbCr data. Instead we have to use RGB. Ok, e.g. in ATI's control panel with some graphics cards and driver versions you can activate YCbCr output, *but* it's rather obvious that internally the data is converted to RGB first and then later back to YCbCr, which is a usually not a good idea if you care about max image quality.

http://forum.doom9.org/showthread.php?s=90000bf98d5f2a08101e68fe522d7231&p=1271418#post1271418

Displays that do not do true 0-255 RGB are an issue, and that is the reason for my laptop test. The laptop has both Datacolor's Spyder and ChromaPure on it.
If the TV fails to calibrate as an ordinary PC display with the included sensor or the Eye-One, that is a red flag for further investigation.
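
For a quick eyeball check before the meters come out, something like this works (my own sketch, not one of the tools above; it just writes a grayscale test image with near-black and near-white bars):

Code:
# If the display chain secretly treats this 0-255 input as 16-235 video,
# the bars at and below 16 merge into one black, and those at and above
# 235 merge into one white; on a true 0-255 path every bar is distinct.
levels = [0, 4, 8, 12, 16, 20, 230, 235, 240, 245, 250, 255]
bar_w, height = 64, 128
row = bytes(v for v in levels for _ in range(bar_w))              # one scanline
with open("levels_check.pgm", "wb") as f:
    f.write(b"P5\n%d %d\n255\n" % (bar_w * len(levels), height))  # binary PGM header
    f.write(row * height)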

01-04-2013, 06:57 AM
Foxbat121 - AVS Addicted Member
I'm well aware of the limitations of PCs. But your previous posts sounded like HDMI is to blame, and cheaper TV makers as well. That is not the case. The best-PQ TVs on the market are the top Panasonic plasma models, and Panasonic never really implemented full-range support for PCs.