VGA preferred over HDMI for PC input - why? - AVS Forum
post #1 of 12 Old 05-14-2013, 04:21 PM - Thread Starter
ddl25 (Member)

I've got a cheap Philips TV connected to a Windows PC with an ATI card. When I connect it over straight HDMI or a DVI-to-HDMI adapter, I get lots of ugly chroma aliasing on text and a blurry picture at any resolution, even after I fix the overscan issues. The TV claims to be 1366x768 @ 60 Hz, but the highest resolution it supports is 1360x768, which I can't seem to use over HDMI. Catalyst Control Center wants to send a "720p" signal and won't give me much more detail than that.

I bought a DVI-to-VGA (DVI-A) cable and this solved the problem entirely. The native resolution is supported and the picture looks fine. I would have shrugged it off, but then I set up an HTPC for a friend and it was the same with his set: the VGA cable was an improvement.

This seems backwards to me. Isn't HDMI supposed to be the preferred modern interface? Is this just a limitation of my video card?
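
One way to check what a set actually advertises on each input is to dump its EDID and read the detailed timing descriptors. Below is a minimal sketch in Python, assuming you have a raw binary dump; the "edid.bin" filename is a placeholder (a tool like Monitor Asset Manager on Windows can save one, and Linux exposes it under /sys/class/drm/).

Code:
# Minimal EDID 1.3 detailed-timing reader (a sketch, not a full decoder).
# Assumes a raw 128+ byte dump saved as "edid.bin" (placeholder path).

def detailed_modes(edid):
    """Yield (width, height, pixel clock in MHz) from the four 18-byte
    detailed timing descriptors at offsets 54, 72, 90 and 108."""
    for off in (54, 72, 90, 108):
        d = edid[off:off + 18]
        clock = (d[0] | (d[1] << 8)) / 100.0  # stored in units of 10 kHz
        if clock == 0:
            continue  # display descriptor (name, range limits), not a timing
        h_active = d[2] | ((d[4] & 0xF0) << 4)  # low 8 bits + high nibble
        v_active = d[5] | ((d[7] & 0xF0) << 4)
        yield h_active, v_active, clock

with open("edid.bin", "rb") as f:
    edid = f.read()

for w, h, mhz in detailed_modes(edid):
    print(f"{w}x{h} @ {mhz:.2f} MHz pixel clock")

If neither 1360x768 nor 1366x768 shows up in the HDMI input's EDID, the driver falling back to 720p is expected behavior rather than a card limitation.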
post #2 of 12 Old 05-14-2013, 09:30 PM
PobjoySpecial (Member)

There are a few practical downsides to VGA, but they don't include picture quality.

1) No HDMI CEC
2) Separate audio/video (more wires)
3) No bit-streaming of audio to offload DTS down-mixing to a supported TV
4) Getting 3D to work will probably be a chore

A nice feature of the VGA input is bypassing a lot of the image processing circuitry of the TV. What you input is what you get.
post #3 of 12 Old 05-14-2013, 10:01 PM
Mrkazador (AVS Special Member)

I don't think 1360x768 is in the HDMI spec but some TVs do allow that resolution through HDMI. I guess it depends on the manufacturer.
post #4 of 12 Old 05-14-2013, 10:46 PM - Thread Starter
ddl25 (Member)

Quote:
Originally Posted by PobjoySpecial

There are a few practical downsides to VGA, but they don't include picture quality.

1) No HDMI CEC
2) Separate audio/video (more wires)
3) No bit-streaming of audio to offload DTS down-mixing to a supported TV
4) Getting 3D to work will probably be a chore

A nice feature of the VGA input is bypassing a lot of the image processing circuitry of the TV. What you input is what you get.

So basically what you're telling me is that VGA is fine, and struggling to get HDMI working properly is probably unnecessary? I have DTS bitstreaming working over TOSLINK optical, so that's fine. However, I DID think that VGA was subject to distortion due to noise, being an analog signal. I don't notice any issue though.

Quote:
Originally Posted by Mrkazador

I don't think 1360x768 is in the HDMI spec but some TVs do allow that resolution through HDMI. I guess it depends on the manufacturer.

I think it must be either that this is a low-end TV with fewer features, or that my video card won't output anything but a predefined list of common TV signals over HDMI.

Thanks for the replies.
post #5 of 12 Old 05-15-2013, 12:40 AM
PobjoySpecial (Member)

...
post #6 of 12 Old 05-15-2013, 12:58 AM
Nethawk (AVS Special Member)

VGA tops out at 2048x1536; at 1366x768 there should be no difference.

post #7 of 12 Old 05-15-2013, 04:43 AM
stanger89 (AVS Addicted Member)

Quote:
Originally Posted by ddl25

However, I DID think that VGA was subject to distortion due to noise, being an analog signal. I don't notice any issue though.

It is, but in practice that only matters over long runs and/or with low-quality cables.

post #8 of 12 Old 05-15-2013, 06:56 AM
arnyk (AVS Addicted Member)

Quote:
Originally Posted by ddl25

I've got a cheap Philips TV connected to a Windows PC with an ATI card. When I connect it over straight HDMI or a DVI-to-HDMI adapter, I get lots of ugly chroma aliasing on text and a blurry picture at any resolution, even after I fix the overscan issues. The TV claims to be 1366x768 @ 60 Hz, but the highest resolution it supports is 1360x768, which I can't seem to use over HDMI. Catalyst Control Center wants to send a "720p" signal and won't give me much more detail than that.

I bought a DVI-to-VGA (DVI-A) cable and this solved the problem entirely. The native resolution is supported and the picture looks fine. I would have shrugged it off, but then I set up an HTPC for a friend and it was the same with his set: the VGA cable was an improvement.

This seems backwards to me. Isn't HDMI supposed to be the preferred modern interface? Is this just a limitation of my video card?

It is supposed to be possible to drive any typical modern monitor at its native resolution via HDMI if you have a reasonably new display card or on-board video.

I run all of my displays via HDMI at their native resolutions and am very pleased with the results. Frankly, when DVI is an option, they work the same either way.

I don't know if you have experimented with the screen rendering options in Windows, but setting up ClearType can help. How you get to the ClearType settings varies with your release of Windows.
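
On Windows 7 and later the tuner itself is cttune.exe. As a rough way to check the current state, the font-smoothing flags live in the registry; here is a small read-only sketch:

Code:
# Read the font-smoothing state that ClearType depends on.
# Values under HKCU\Control Panel\Desktop:
#   FontSmoothing     - REG_SZ, "2" when smoothing is enabled
#   FontSmoothingType - REG_DWORD, 2 = ClearType, 1 = standard smoothing
import winreg

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop") as key:
    smoothing, _ = winreg.QueryValueEx(key, "FontSmoothing")
    try:
        smoothing_type, _ = winreg.QueryValueEx(key, "FontSmoothingType")
    except FileNotFoundError:
        smoothing_type = None  # value can be absent on some installs

print("Font smoothing enabled:", smoothing == "2")
print("ClearType selected:", smoothing_type == 2)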
post #9 of 12 Old 05-15-2013, 09:39 AM
blueiedgod (AVS Special Member)

I use VGA and get better picture quality on my LG 55LM4600, mainly because TruMotion (LG's motion interpolation) is not enabled on the VGA input.

I send audio via optical to the receiver. If I am playing music, I can turn off the TV without affecting the music.

post #10 of 12 Old 05-15-2013, 09:42 AM
Chronoptimist (AVS Special Member)

The TV is the problem here, not the graphics card.

1366x768 is an awkward resolution that some older HDTVs ended up with, and it dates from before PC compatibility was much of a concern.

Older displays and graphics cards often only accepted widths that were multiples of 8, so you had to send either 1360x768 or 1368x768 rather than 1366x768. 1366x768 is not an officially supported resolution over HDMI: you can do 720p or 1080p, but nothing in between. Most televisions with this resolution would only accept 1360x768, and only over VGA, if that. A few would accept their native resolution over HDMI, but that is rare.
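
For what it's worth, the arithmetic behind those two numbers:

Code:
# 1366 is not divisible by 8, which many scalers and older drivers
# required for the active width of a mode.
width = 1366
print(width % 8)               # 6, so 1366 is not a multiple of 8
print(width - width % 8)       # 1360: round down (panels center the spare 6 px)
print(width + 8 - width % 8)   # 1368: round up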
post #11 of 12 Old 05-15-2013, 11:15 AM
Roussi (Advanced Member)

I run my P50C490 (1366 native) through an HDMI input labeled "DVI/PC" from an HD5450, at the 1360 "Basic" desktop resolution with 1:1 pixel mapping (with 3 dark pixels on each side). The TV reports a 1080p60 signal.

I remember that initially, at installation (Win8-64), the signal was automatically set to 720p; however, after setting a 1080p "HDTV" resolution from CCC, restarting, and re-setting to 1360 "Basic", the 1080p60 was retained.

My previous TV, a Vizio, could actually accept a 1366 resolution, matching the native panel with 1:1 pixel mapping.

Both resolutions were available under "Basic" desktop resolutions; however, setting 1366 with the current TV reverts to 1360.
post #12 of 12 Old 05-15-2013, 01:54 PM
Chronoptimist (AVS Special Member)

Quote:
Originally Posted by Roussi

I run my P50C490 (1366 native) through an HDMI input labeled "DVI/PC" from an HD5450, at the 1360 "Basic" desktop resolution with 1:1 pixel mapping (with 3 dark pixels on each side). The TV reports a 1080p60 signal.
Sounds like the video card is rendering 1360x768 and scaling it up to 1080p, and the display is then scaling that back down to 1366x768, so it's not 1:1 mapping.
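
Putting rough numbers on that chain (a quick sketch of the resampling factors involved):

Code:
# GPU renders a 1360-wide desktop, upscales it into a 1920-wide (1080p)
# signal, and the TV scales that back down to its 1366-wide panel.
render, signal, panel = 1360, 1920, 1366

up = signal / render    # ~1.4118, the GPU's upscale
down = panel / signal   # ~0.7115, the TV's downscale
print(f"GPU upscale:  x{up:.4f}")
print(f"TV downscale: x{down:.4f}")
print(f"Net:          x{up * down:.4f}  (true 1:1 would be x1.0000)")

Since the net factor is not exactly 1.0, nearly every pixel gets resampled twice, which is exactly the kind of softness that VGA at the native timing avoids.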