

Registered · 347 Posts · Discussion Starter #1
Picked up a Samsung 2333HD TV/Monitor and trying out the various connections to my PC.

The Monitor is native 1080p and has a TV tuner built in.

It has 2xHDMI, 1xDVI and 1xVGA (plus a coaxial and component input).


DVI and even VGA produce a beautiful 1:1 pixel mapped 1920x1080p image.

HDMI is a different story - the text is 'messy' not crisp like the other two. The picture also looked 'brighter'/'garish' and more 'TV-like' - if that means anything.


Tried it with two different HDMI sources - desktop PC with ATI 4850HD card and laptop PC with ATI 3200HD onboard video.

Also tried two separate cables.


Tried adjusting the image; the best result was reducing the sharpness on the TV, which reduced the problem but did not solve it. DVI still looked way better.


Tried different modes - most modes were nicely scaled and looked smooth and characteristically soft as expected, but 1280x720 and 1920x1080 both looked similar - like badly scaled text with no smooth anti-aliasing. (Similar to when you scan a page of text in black and white mode rather than greyscale.)


Any idea where the problem lies? ATI card or driver problem? HDMI problem in the TV/Monitor? I'm doubting it's the HDMI cable itself...


Is this a common problem?

Considering taking the TV back and just buying a monitor without HDMI, but it's a shame to revert to VGA and not use HDMI when my laptop has it.


(Also tried using the HDMI out to my Sony 1080p projector. I noticed some slight flaws in the image around some of the text, but the 1:1 pixel mapping looked good to me... still, it has me a little suspicious of the ATI cards.)
 

Registered · 7,056 Posts
DVI and HDMI are pin compatible, same quality


VGA is analog, so it's subject to analog interference, and when cabled to a digital display (LCD, plasma, DLP, SXRD - everything except CRT) it has to go back through an analog-to-digital conversion, which is a lossy process.


How lossy is VGA?

Depends on the cables. Depends on the distance. Depends on the ADC (if needed).

But assuming a good quality ADC, VGA is pretty much indistinguishable from DVI/HDMI.
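
To see roughly why that round trip can cost anything at all, here's a toy Python sketch of the digital -> analog -> digital path with a little noise thrown in (the 0.7 V VGA full-scale level is real; the noise figure is just an assumption for illustration):

import random

# Toy model of the VGA path for one 8-bit pixel value:
# DAC on the video card -> analog cable (with a bit of noise) -> ADC in the display.
# The 2 mV noise figure is an assumed, illustrative number, not a measurement.
def vga_round_trip(level, noise_mv=2.0, full_scale_mv=700.0):
    volts_mv = level / 255 * full_scale_mv               # video card DAC
    volts_mv += random.uniform(-noise_mv, noise_mv)      # cable/interference
    redigitized = round(volts_mv / full_scale_mv * 255)  # display ADC
    return max(0, min(255, redigitized))

worst = max(abs(vga_round_trip(v) - v) for v in range(256))
print("worst-case error:", worst, "out of 255 steps")
# With a short, decent cable and a good ADC the error stays within a step or so,
# which is why VGA can look essentially indistinguishable from DVI/HDMI.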


Your HDMI problem is likely that the TV is overscanning the signal: it scales the picture to roughly 105% and then crops off the extra 5%, so you end up with a stretched, scaled mess. TVs have overscanned video images for years, so even fixed-pixel displays replicate the overscan (even if it means scaling and cropping). The Samsung should have a pixel-perfect / 1:1 / Just Scan / native mode, something like that - it'll be a zoom or picture-size option. Also, the ATI drivers may try to underscan over HDMI, so it may not be apparent that the image is being cropped; once you get the TV into native mode you may need to turn off the ATI scaling.
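
To put rough numbers on that (the 5% figure is just illustrative - the exact amount varies by set), here's a quick Python sketch of what overscan does to a 1920x1080 desktop:

# Illustrative only: assume ~5% overscan on a 1920x1080 desktop signal.
src_w, src_h = 1920, 1080
overscan = 1.05

# The TV scales the whole frame up...
scaled_w, scaled_h = round(src_w * overscan), round(src_h * overscan)
# ...then crops it back down to the panel's native 1920x1080.
crop_w = (scaled_w - src_w) // 2
crop_h = (scaled_h - src_h) // 2

print(f"scaled to {scaled_w}x{scaled_h}, then ~{crop_w} px trimmed from each side "
      f"and ~{crop_h} px from top and bottom")
# Result: no source pixel lands on exactly one panel pixel any more,
# which is exactly why single-pixel text looks smeared over HDMI.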


Alternatively why not just use DVI? You're not losing any quality.
 

Registered · 16 Posts
Red-3 is spot on...


DVI and HDMI are the EXACT same technology, just a different connector.


However... there is something that everyone forgets.


Notebook manufacturers that are now putting HDMI into their notebooks don't all treat it the same way... and ATI and nVidia actually do handle it differently...


If you are running a lower-end notebook with a dedicated video card, chances are the HDMI port runs off the video core built into the notebook's chipset/processor (Centrino platforms are notorious for this), whereas the DVI output does not run off that GPU - it likely runs off its own.


If you are running an AMD processor (which likely has an nVidia video adapter), both outputs might be running off the same chip (one with a GPU built in) - at which point it won't make a difference.


It all depends on your notebook...


I had the same problem with my last notebook (before work gave me a shiny new HP EliteBook - which I swear by). I caved, and just went out and bought a DVI to HDMI cable... that fixed my problem.


Not sure what type of notebook you've got... but if you test other HDMI devices, and you don't have the same issue, you know for sure it's the notebook not the TV.


Also... make sure to play with the settings... you can tweak the signal output (etc) in expert mode on the ATI control panel...


BUT BE CAREFUL. Depending on the TV you're testing your settings on, you might actually blow your TV.


I suggest calling the manufacturer and talking to them before returning the unit... a simple phone call might save you a trip (and gas) to the local electronics store.


Hope this helps!
 

Registered · 347 Posts · Discussion Starter #5

Quote:
Originally Posted by sotti /forum/post/17144859


Alternatively why not just use DVI? You're not losing any quality.

Laptop only has VGA and HDMI - no DVI


(Laptop is HP Pavilion DV3 with AMD 64 x2 w/ ATI HD3200 onboard video)
 

Registered · 7,056 Posts

Quote:
Originally Posted by Red-3 /forum/post/17145333


Laptop only has VGA and HDMI - no DVI


(Laptop is HP Pavilion DV3 with AMD 64 x2 w/ ATI HD3200 onboard video)

gotcha.


Like I said, check the TV's zoom/picture-size settings for a native/Just Scan mode. That should get your pixels sharp.


Also, don't worry about blowing your TV - that is olden-days stuff from running CRTs at funky refresh rates the tube couldn't handle. An LCD/plasma will just report a non-compatible signal. But you don't even need to worry about that. Just get the TV into pixel-perfect mode, then see what else needs to be fixed.


Make sure the laptop isn't in clone mode with its display.
 

Registered · 1,375 Posts
When you are connected by VGA, the TV acts like a monitor and displays the resolution that you send to it (1920x1080) without overscan. The result is a crystal-clear 1:1 pixel mapping. When you use HDMI, the TV thinks it is a TV and overscans the picture, resulting in a blurred desktop. In your TV's settings menu there should be a setting such as "Just Scan" or "dot for dot" that tells the TV not to overscan the HDMI signal. This setting may be available on only one of the HDMI ports. If your TV does not have a "Just Scan" setting (I'm not familiar with this model), you might try an HDMI-to-DVI cable hooked to the DVI input of your TV. The TV probably will not overscan the DVI input because it will think it is a monitor, just like when it is hooked up via VGA.


But if you are getting a good picture by VGA, you could just use it.


BB
 

Registered · 347 Posts · Discussion Starter #8

Quote:
Originally Posted by Bigbird999 /forum/post/17146094


When you are connected by VGA, the TV acts like a monitor and displays the resolution that you send to it (1920x1080) without overscan. The result is a crystal-clear 1:1 pixel mapping. When you use HDMI, the TV thinks it is a TV and overscans the picture, resulting in a blurred desktop. In your TV's settings menu there should be a setting such as "Just Scan" or "dot for dot" that tells the TV not to overscan the HDMI signal. This setting may be available on only one of the HDMI ports. If your TV does not have a "Just Scan" setting (I'm not familiar with this model), you might try an HDMI-to-DVI cable hooked to the DVI input of your TV. The TV probably will not overscan the DVI input because it will think it is a monitor, just like when it is hooked up via VGA.


But if you are getting a good picture by VGA, you could just use it.


BB

Just feels a bit cheap that this state-of-the-art digital connector on my laptop actually produces an inferior picture to the analog VGA connector.


Will try playing with the ATI card settings and the TV settings again, but so far no joy...
 

Registered · 2,618 Posts

Quote:
Originally Posted by Red-3 /forum/post/17147357


Just feels a bit cheap that this state-of-the-art digital connector on my laptop actually produces an inferior picture to the analog VGA connector.


Will try playing with the ATI card settings and the TV settings again, but so far no joy...

As previously stated, analog VGA really isn't inferior to DVI/HDMI. Yes, it can pick up interference, but it's rare and any decent-quality cable helps prevent it. If the picture quality looks good, your text is sharp, and you don't need to carry audio over HDMI, then don't worry so much about having to use VGA instead of HDMI.
 

Registered · 7,056 Posts
The mode on your TV is called "Just Scan".


Use the P.Size button to get there.


If the image is underscanned, you can fix that in your ATI drivers.
 

Registered · 347 Posts · Discussion Starter #11

Quote:
Originally Posted by sotti /forum/post/17147664


The mode on your TV is called "Just Scan".


Use the P.Size button to get there.


If the image is underscanned, you can fix that in your ATI drivers.

I already had 'Just Scan' selected. Overscan set to 0. Yet still the issue persisted...


...UNTIL...


I came across a setting in the Samsung TV under the INPUT menu.

The function is EDIT NAME. The TV gives a list of names ranging from TV, Camcorder, and Game to PC.

To my complete surprise, not only does this set the 'name' for the HDMI input, it also adjusts settings to 'match' the input signal to the 'input peripheral'. Setting it to PC causes a 1:1 mapping to occur, with the TV doing a true pass-through of the signal from the PC to the panel. (The screen even blanks out for a second while it adjusts to the PC 'mode'.)


Who would have figured, eh?

Of all the dumb TV setting names... this one has to take the prize!
 

Registered · 1,375 Posts
So, now it is time to be totally honest. Which input gives the better image quality? VGA or HDMI?



BB
 

Registered · 2,618 Posts
My Samsung DLP LED does that as well - it changes various settings based on the name you select for the HDMI port. Just Scan would work under any name, though. Odd that you had to set it to PC in order to get it to work.
 

Registered · 347 Posts · Discussion Starter #14

Quote:
Originally Posted by Bigbird999 /forum/post/17149125


So, now it is time to be totally honest. Which input gives the better image quality? VGA or HDMI?



BB

Actually I'm a little surprised, because my 1080p Sony projector won't accept a full 1080p HD signal via the VGA input. As with the component input, I was led to believe this was a limitation of the format, but it displays fine on the Samsung (which is also 1920x1080p native res). So I guess that was a misnomer.

(I thought I'd beaten it once, but discovered it was 1080i, not 1080p.)


In my experience the differences between VGA and DVI/HDMI are subtle rather than obvious. Sometimes you might experience shimmer around text or interference around greeked patterns (such as pixel on/pixel off grid graphics).

But it's rare you encounter anything that you can definitively say "See that there on my screen? That's because I'm using a VGA cable."
 

Registered · 347 Posts · Discussion Starter #15

Quote:
Originally Posted by jrwalte /forum/post/17149212


My Samsung DLP LED does that as well - it changes various settings based on the name you select for the HDMI port. Just Scan would work under any name, though. Odd that you had to set it to PC in order to get it to work.

Odd indeed...
 

Registered · 7,056 Posts

Quote:
Originally Posted by Red-3 /forum/post/17149795


Actually I'm a little surprised, because my 1080p Sony projector won't accept a full 1080p HD signal via the VGA input. As with the component input, I was led to believe this was a limitation of the format,

Nah VGA can go at least to [email protected] if not more, but I've never seen anything bigger.


It all depends on how many MHz the DAC runs at - one pixel per Hz. Most DACs are 400 MHz these days (400M pixels per second).
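
For reference, the arithmetic for 1080p60 (using the standard 2200x1125 total raster, i.e. the active 1920x1080 plus blanking):

# Pixel clock needed for 1080p60: one pixel per Hz of DAC clock.
total_w, total_h, refresh = 2200, 1125, 60   # total raster including blanking
pixel_clock_mhz = total_w * total_h * refresh / 1e6
print(pixel_clock_mhz, "MHz")   # 148.5 MHz

# So a 400 MHz RAMDAC has plenty of headroom for 1080p60 over VGA;
# whether a given display's VGA input will accept it is a separate question.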
 

Registered · 2,618 Posts
The VGA limitation to 1080i is down to your projector, not the PC or the cable technology. Many TVs do not support their full (as in progressive) native resolution on anything other than HDMI.
 

Registered · 16 Posts
Guys... what I was referring to is an adapter like this:


www.amazon.com/DVI-HDMI-Cable-6ft-Male-Male/dp/B0002CZHN6


It's a simple DVI to HDMI cable... you can get it in any config (M-M, M-F, F-M, F-F)


I have one for my laptop... and NO WAY VGA is better than DVI... not a chance.


 

Registered · 110 Posts
DVI and HDMI video are for all intents and purposes identical. Most likely your monitor is set to treat those inputs differently, both with scaling and video quality settings.
 

Registered · 16,749 Posts

Quote:
Originally Posted by Red-3 /forum/post/17149801


Odd indeed...

Not odd at all - PCs do not send audio over an HDMI connection in the same manner as STBs do.

It appears that PCs like to send 23.976 video over HDMI with the audio sent over the interface separately, whereas an STB sends an integrated video and audio signal at 24 fps. Therefore it is important for many of today's HDTV systems to know whether an HDMI input is from a PC, a cable/satellite STB, or a separate disc player unit.
 