
Registered · 16 Posts · Discussion Starter · #1
I've got a cheap Philips TV connected to a Windows PC with an ATI card. When I connect it over straight HDMI or DVI>HDMI, I get lots of ugly chroma aliasing on text and a blurry picture at any resolution, even if I fix the overscan issues. The TV claims to be 720p@60Hz, but the highest supported resolution is 1360x768, a resolution I can't seem to use over HDMI. Catalyst Control Center wants to send a "720p" signal and won't give me much more detail than that.


I bought a DVI-A cable (DVI to VGA) and this solved the problem entirely. The native resolution is supported and the picture looks fine. I would have shrugged it off, but I tried setting up an HTPC for my friend and it was the same story with his set: the VGA cable was an improvement.


This seems backwards to me. Isn't HDMI intended to be the preferred modern format? Is this just a limitation of my video card?
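
In case it helps anyone diagnose the same thing, here's a rough Python 3 sketch (Windows only) that reads the EDID data Windows caches in the registry and decodes each display's preferred detailed timing, so you can see what resolution the TV is actually advertising over HDMI. It's just a quick sanity check, not a polished tool, and it also lists monitors that are no longer connected.

Code:

import winreg

BASE = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def preferred_timings():
    """Decode the preferred (first) detailed timing from each EDID that
    Windows has cached in the registry. Note: this includes monitors
    connected in the past, not just the one plugged in right now."""
    found = []
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, BASE) as display:
        for i in range(winreg.QueryInfoKey(display)[0]):
            model = winreg.EnumKey(display, i)
            with winreg.OpenKey(display, model) as model_key:
                for j in range(winreg.QueryInfoKey(model_key)[0]):
                    instance = winreg.EnumKey(model_key, j)
                    try:
                        with winreg.OpenKey(
                                model_key,
                                instance + r"\Device Parameters") as params:
                            edid, _ = winreg.QueryValueEx(params, "EDID")
                    except OSError:
                        continue
                    # The first 18-byte detailed timing descriptor starts at
                    # byte 54 of the base EDID block.
                    d = bytes(edid)[54:72]
                    if len(d) < 18 or (d[0] == 0 and d[1] == 0):
                        continue  # not a timing descriptor
                    h = ((d[4] >> 4) << 8) | d[2]   # horizontal active pixels
                    v = ((d[7] >> 4) << 8) | d[5]   # vertical active lines
                    found.append((model, h, v))
    return found

if __name__ == "__main__":
    for model, h, v in preferred_timings():
        print(f"{model}: preferred detailed timing {h}x{v}")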
 

Registered · 16 Posts · Discussion Starter · #4

Quote:
Originally Posted by PobjoySpecial  /t/1472754/vga-preferred-over-hdmi-for-pc-input-why#post_23317537


There are a few practical downsides to VGA, but they don't include picture quality.


1) No HDMI CEC

2) Separate audio/video (more wires)

3) No bit-streaming of audio to offload DTS down-mixing to a supported TV

4) Getting 3D to work will probably be a chore


A nice feature of the VGA input is bypassing a lot of the image processing circuitry of the TV. What you input is what you get.

So basically what you're telling me is that VGA is fine, and struggling to get HDMI working properly is probably unnecessary? I have DTS bitstreaming working over TOSLINK optical, so that's fine. However, I DID think that VGA was subject to distortion due to noise, being an analog signal. I don't notice any issue though.

Quote:
Originally Posted by Mrkazador  /t/1472754/vga-preferred-over-hdmi-for-pc-input-why#post_23317607


I don't think 1360x768 is in the HDMI spec but some TVs do allow that resolution through HDMI. I guess it depends on the manufacturer.

I think it must be either that this is a low-end TV with fewer features, or my video card is limited in that it won't output anything but a predefined list of common TV signals over HDMI.
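
For what it's worth, here's a toy snippet with a few of the common CEA-861 video codes (the "TV signal" list HDMI EDIDs usually advertise; partial, and from memory of the spec tables, so treat it as illustrative only). There's nothing between 1280x720 and 1920x1080, which would explain the driver's short list:

Code:

# A few of the common CEA-861 video format codes (VICs) that HDMI TVs
# advertise in their EDID. Partial list, for illustration only.
COMMON_CEA_MODES = {
    1:  (640, 480, 60),
    2:  (720, 480, 60),
    4:  (1280, 720, 60),
    16: (1920, 1080, 60),
    19: (1280, 720, 50),
    31: (1920, 1080, 50),
}

target = (1366, 768)
hit = any((w, h) == target for w, h, _ in COMMON_CEA_MODES.values())
print(f"{target[0]}x{target[1]} in the common CEA list? {hit}")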


Thanks for the replies.
 

Registered · 23,131 Posts

Quote:
Originally Posted by ddl25  /t/1472754/vga-preferred-over-hdmi-for-pc-input-why#post_23317686


However, I DID think that VGA was subject to distortion due to noise, being an analog signal. I don't notice any issue though.

It is, but in practice it only shows up over long runs and/or with low-quality cables.
 

Banned · 14,420 Posts

Quote:
Originally Posted by ddl25  /t/1472754/vga-preferred-over-hdmi-for-pc-input-why#post_23316545


I've got a cheap Philips TV connected to a Windows PC with an ATI card. When I connect it over straight HDMI or DVI>HDMI, I get lots of ugly chroma aliasing on text and a blurry picture at any resolution, even if I fix the overscan issues. The TV claims to be 720p@60Hz, but the highest supported resolution is 1360x768, a resolution I can't seem to use over HDMI. Catalyst Control Center wants to send a "720p" signal and won't give me much more detail than that.


I bought a DVI-A cable (DVI to VGA) and this solved the problem entirely. The native resolution is supported and the picture looks fine. I would have shrugged it off, but I tried setting up an HTPC for my friend and it was the same story with his set: the VGA cable was an improvement.


This seems backwards to me. Isn't HDMI intended to be the preferred modern format? Is this just a limitation of my video card?

It is supposed to be possible to drive any typical modern monitor at its native resolution via HDMI if you have a reasonably new display card or on-board video.


I run all of my displays via HDMI at their native resolutions and am very pleased with the results. Frankly, when DVI is an option, they work the same either way.


I don't know if you have experimented with the screen rendering options in Windows, but setting up ClearType can help. How you get to the ClearType setting varies with your release of Windows.
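
On Windows 7 and later the usual route is just running cttune (the ClearType Text Tuner). If you'd rather script it, here is a minimal ctypes sketch using the documented SystemParametersInfo font-smoothing flags; the values are the winuser.h constants, and this only toggles smoothing on with the ClearType type, it doesn't run the per-display tuning wizard.

Code:

import ctypes

# Documented SystemParametersInfo actions/values for font smoothing (winuser.h).
SPI_SETFONTSMOOTHING      = 0x004B
SPI_SETFONTSMOOTHINGTYPE  = 0x200B
FE_FONTSMOOTHINGCLEARTYPE = 0x0002
SPIF_UPDATEINIFILE = 0x01
SPIF_SENDCHANGE    = 0x02

user32 = ctypes.windll.user32

# Turn font smoothing on...
user32.SystemParametersInfoW(SPI_SETFONTSMOOTHING, 1, None,
                             SPIF_UPDATEINIFILE | SPIF_SENDCHANGE)
# ...and select ClearType as the smoothing type.
user32.SystemParametersInfoW(SPI_SETFONTSMOOTHINGTYPE, 0,
                             ctypes.c_void_p(FE_FONTSMOOTHINGCLEARTYPE),
                             SPIF_UPDATEINIFILE | SPIF_SENDCHANGE)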
 

Registered · 1,879 Posts
I use VGA and get better picture quality on my LG 55LM4600, mainly because TruMotion is not enabled on the VGA input.


I send audio via optical to the receiver. If I am playing music, I can turn off the TV without affecting the music.
 

Registered · 3,127 Posts
The TV is the problem here, not the graphics card.


1366x768 is an awkward resolution that some older HDTVs ended up with, and it dates from before PC compatibility was much of a concern.


Older displays and graphics cards only liked horizontal resolutions that were multiples of 8, so you had to send either 1360x768 or 1368x768 rather than 1366x768.

1366x768 is not an officially supported resolution over HDMI. You can do 720p or 1080p, but nothing in between. Most of the televisions with this resolution would only accept 1360x768 over VGA, if that. A few would accept their native resolution over HDMI, but this is rare.
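
A toy snippet to show the arithmetic (nothing platform-specific, just illustrating the multiple-of-8 point and where the thin side borders come from):

Code:

# Why 1366 is awkward: older scalers and drivers wanted the horizontal
# resolution to be a multiple of 8.
for width in (1360, 1366, 1368):
    print(width, "is" if width % 8 == 0 else "is NOT", "a multiple of 8")

# With a 1366-wide panel fed a 1360-wide signal at 1:1 pixel mapping,
# the unused columns end up as a thin border split across both sides.
panel, signal = 1366, 1360
print("unused columns:", panel - signal, "->", (panel - signal) // 2, "per side")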
 

Registered · 819 Posts
I run my P50C490 (1366x768 native) through an HDMI input labeled "DVI/PC" from an HD5450, at the 1360x768 "Basic" desktop resolution with 1:1 pixel mapping (and 3 dark pixels on each side). The TV reports a 1080p60 signal.


I remember that initially, at installation (Win8 64-bit), the signal was automatically set to 720p; however, after setting a 1080p "HDTV" resolution from CCC, restarting, and re-setting the 1360x768 "Basic" resolution, the 1080p60 signal was retained.


My previous TV, a Vizio, could actually accept a 1366x768 signal, matching the native resolution with 1:1 pixel mapping.


Both resolutions were available under the "Basic" desktop resolutions; however, setting 1366x768 with the current TV reverts to 1360x768.
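
In case anyone wants to check which modes the driver actually exposes to Windows, here's a rough ctypes sketch (Windows, Python 3) that dumps the mode list for the primary display, so you can see whether 1360x768 and 1366x768 both show up. The DEVMODE layout below follows the documented display fields; treat it as a sketch rather than a drop-in tool.

Code:

import ctypes
from ctypes import wintypes

# DEVMODEW with the display-relevant members (the printer-only union
# members are replaced by the display members at the same offsets).
class DEVMODEW(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

user32 = ctypes.windll.user32
dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)

# Enumerate every mode the driver reports for the primary display.
modes = set()
i = 0
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
    modes.add((dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency))
    i += 1

for width, height, hz in sorted(modes):
    print(f"{width}x{height} @ {hz} Hz")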
 

Registered · 3,127 Posts

Quote:
Originally Posted by Roussi  /t/1472754/vga-preferred-over-hdmi-for-pc-input-why/0_100#post_23319308


I run my P50C490 (1366 native) through an HDMI input labeled "DVI/PC" from an HD5450 at 1360 "Basic desktop" resolution at 1:1 pixel mapping (with 3 dark pixels on each side). The tv reports 1080p60 signal.

Sounds like the video card is rendering 1360x768 and scaling it up to 1080p, with the display then scaling that back down to 1366x768, so that is not 1:1 pixel mapping.
 