Originally Posted by Red-3
Picked up a Samsung 2333HD TV/Monitor and trying out the various connections to my PC.
The Monitor is native 1080p and has a TV tuner built in.
It has 2xHDMI, 1xDVI and 1xVGA (plus a coaxial and component input).
DVI and even VGA produce a beautiful 1:1 pixel-mapped 1920x1080 image.
HDMI is a different story - the text is 'messy', not crisp like the other two. The picture also looks 'brighter'/'garish' and more 'TV-like', if that means anything.
Tried it with two different HDMI sources - a desktop PC with an ATI HD 4850 card and a laptop PC with ATI HD 3200 onboard video.
Also tried two separate cables.
Tried adjusting the image; the best result was reducing the sharpness on the TV, which lessened the problem but did not solve it. DVI still looked way better.
Tried different modes - most were nicely scaled and looked smooth and characteristically soft, as expected, but 1280x720 and 1920x1080 both looked similarly bad - like badly scaled text with no smooth anti-aliasing. (Similar to scanning a page of text in black-and-white mode rather than greyscale.)
Any idea where the problem lies? ATI card or driver problem? HDMI problem in the TV/Monitor? I'm doubting it's the HDMI cable itself...
Is this a common problem?
Considering taking the TV back and just buying a monitor without HDMI, but it's a shame to revert to VGA and not use HDMI when my laptop has it.
(Also tried using the HDMI out to my Sony 1080p projector. Noticed some slight flaws in the image around some of the text, but the 1:1 pixel mapping looked good to me... though it has me a little suspicious of the ATI cards.)
I have exactly the same problem, coincidentally with a Samsung LED monitor.
The monitor has an HDMI input, a DVI input, and a VGA input; I've limited my tests to the first two. I have a 2011 Mac mini, which has an HDMI output and a Thunderbolt/Mini DisplayPort output.
I have two high-quality HDMI cables, an HDMI-to-DVI adaptor plus an el cheapo DVI cable (complete with a Chinglish warning tag on it), and a Mini DisplayPort-to-HDMI adaptor.
The DVI cable on the HDMI-to-DVI adaptor gives a nice, clean, crisp, highly legible display. Every combination using HDMI into the monitor is vastly inferior, and no amount of hardware calibration on the monitor, or software calibration on the Mac mini, changes anything. Both HDMI cables are equivalent (and have been tested in other contexts where they behave without issue).
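For what it's worth, the 'brighter'/'garish' look described in the quoted post is what a video-levels mismatch tends to produce: many TVs treat an HDMI input as video and expect limited-range levels (16-235), while PCs send full-range (0-255), or the reverse, so blacks and whites get clipped or stretched. A toy sketch of the two mappings (plain Python, no dependencies, purely illustrative):

[code]
# Toy illustration of full-range (PC) vs. limited-range (video) levels.
# A mismatch in either direction clips or stretches the tone curve,
# which reads as a crushed or washed-out picture that no amount of
# calibration on top can fully undo.

def full_to_limited(v):
    # Compress PC full range [0, 255] into video range [16, 235].
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    # Expand video range [16, 235] back to [0, 255], clipping outside it.
    return min(255, max(0, round((v - 16) * 255 / 219)))

for v in (0, 16, 128, 235, 255):
    print(v, "->", full_to_limited(v), limited_to_full(v))
[/code]

The endpoint rows show the problem: a source sending 0-255 into a sink expecting 16-235 crushes everything below 16 and above 235, while the opposite mismatch turns black into dark grey. Over DVI both ends assume full-range RGB, which may be one reason the DVI picture looks right with zero fiddling.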
I'm unsure at this point whether it is the monitor or the Mac mini, but your post makes me suspect the Samsung monitor.
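One way to narrow that down: put up a single-pixel checkerboard test pattern at native resolution on each input. True 1:1 pixel mapping shows an even fine grid (a uniform grey shimmer from a distance); any overscan-and-rescale pass in the HDMI path (e.g. 5% overscan forces the panel to resample 1920 pixels across roughly 2016 columns) collapses it into blotches and moiré. A minimal sketch to generate the pattern, assuming the Pillow imaging library is installed (pip install Pillow):

[code]
# Generate a 1920x1080 single-pixel checkerboard for 1:1 mapping tests.
from PIL import Image

W, H = 1920, 1080
img = Image.new("L", (W, H))  # greyscale is enough for this test
px = img.load()

for y in range(H):
    for x in range(W):
        # Alternate black/white on every single pixel.
        px[x, y] = 255 if (x + y) % 2 == 0 else 0

img.save("checkerboard_1080p.png")
[/code]

If the same source and cable give a clean grid over DVI but moiré over HDMI, the scaler/processing in the monitor's HDMI path is the prime suspect rather than the Mac or the ATI card.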
It seems idiotic to convert from HDMI out to DVI rather than just use HDMI, but after wasting more than an hour trying to adjust the image quality, I decided to get on with my life (assuming posting counts as life). I just wanted to corroborate this observation: it is the first hit if you google these symptoms, so it doesn't seem to be unique.
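One last diagnostic for anyone else who lands here from that Google search: DVI always carries RGB, but over HDMI the source is free to switch to YCbCr output (and the display to TV-style processing) if the display's EDID advertises it, which by itself can account for soft or fringed text. A hedged sketch that decodes the relevant feature byte from an EDID base block (offsets per the EDID 1.4 spec); on the Mac mini the hex blob can be pulled with ioreg -lw0 | grep IODisplayEDID:

[code]
# Report which colour encodings a display's EDID base block advertises.
# Byte 20 bit 7 = digital-input flag; byte 24 bits 4-3 = colour formats
# (offsets per the EDID 1.4 spec).

def color_formats(edid_hex: str) -> str:
    edid = bytes.fromhex(edid_hex)
    if edid[:8] != bytes.fromhex("00ffffffffffff00"):
        return "not an EDID base block"
    if not edid[20] & 0x80:
        return "analog input"
    formats = (edid[24] >> 3) & 0b11
    return ("RGB 4:4:4 only",
            "RGB 4:4:4 + YCrCb 4:4:4",
            "RGB 4:4:4 + YCrCb 4:2:2",
            "RGB 4:4:4 + YCrCb 4:4:4 + 4:2:2")[formats]

# Usage: paste the full 128-byte hex string from ioreg, e.g.
#   print(color_formats("00ffffffffffff00" + "..."))
[/code]

If the set advertises YCrCb, the usual workaround is forcing the source to plain RGB; that is effectively what the HDMI-to-DVI adaptor does, which may be exactly why the 'idiotic' conversion looks so much better.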