Inferior Picture Quality with HDMI vs DVI/VGA - AVS Forum
post #1 of 27 Old 09-08-2009, 08:39 PM - Thread Starter
Red-3
Senior Member
Join Date: Jan 2003
Location: Calgary, Canada
Posts: 340
Picked up a Samsung 2333HD TV/Monitor and trying out the various connections to my PC.
The Monitor is native 1080p and has a TV tuner built in.
It has 2xHDMI, 1xDVI and 1xVGA (plus a coaxial and component input).

DVI and even VGA produce a beautiful 1:1 pixel mapped 1920x1080p image.
HDMI is a different story: the text is 'messy', not crisp like the other two. The picture also looked 'brighter'/'garish' and more 'TV-like', if that means anything.

Tried it with two different HDMI sources - desktop PC with ATI 4850HD card and laptop PC with ATI 3200HD onboard video.
Also tried two separate cables.

Tried adjusting the image; the best result came from reducing the sharpness on the TV, which reduced the problem but did not eliminate it. DVI still looked way better.

Tried different modes - most modes were nicely scaled and looked smooth and characteristically soft as expected, but 1280x720 and 1920x1080 both looked similar - like badly scaled text with no smooth anti-aliasing. (Similar to when you scan a page of text in black and white mode rather than greyscale.)

Any idea where the problem lies? ATI card or driver problem? HDMI problem in the TV/Monitor? I'm doubting it's the HDMI cable itself...

Is this a common problem?
Considering taking the TV back and just buying a monitor without HDMI, but it's a shame to revert to VGA and not use HDMI when my laptop has it.

(Also tried using the HDMI out to my Sony 1080p projector. Noticed some slight flaws in the image around some of the text, but the 1:1 pixel mapping looked good to me... it has me a little suspicious of the ATI cards, though.)

Red-3 Standing By...

My Blu Heaven! Home Theatre Pics
Sony VW40 1080p FP Screenshots
post #2 of 27 Old 09-08-2009, 08:51 PM
sotti
AVS Special Member
Join Date: Aug 2004
Location: Seattle, WA
Posts: 6,581
DVI and HDMI are pin-compatible; same quality.

VGA is analog, so it's subject to analog interference, and when cabled to a digital display (LCD, plasma, DLP, SXRD, everything except CRT) it has to go back through an analog-to-digital conversion, which is a lossy process.

How lossy is VGA?
Depends on the cables. Depends on the distance. Depends on the ADC (if needed).
But assuming a good quality ADC, VGA is pretty much indistinguishable from DVI/HDMI.

Your HDMI problem is likely that the TV is overscanning the signal: it scales the picture to 105% and then crops off the extra 5%, so you end up with a stretched, scaled mess. TVs have overscanned video images for years, so even fixed-pixel displays replicate the overscan (even if it means scaling and cropping). The Samsung should have a pixel-perfect mode, called something like 1:1, Just Scan, or Native; it'll be a zoom option. Also, the ATI card may try to underscan over HDMI, so it may not be apparent that the image is being cropped; once you get the TV into native mode you may need to turn off the ATI scaling.
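
To see why that scale-then-crop step ruins 1:1 text, here's a quick numeric sketch (the 105%/5% figures are from the description above; real TV resampling filters will differ, this just counts which pixel columns can survive unfiltered):

```python
# Overscan sketch: the TV zooms a 1920-wide PC image to 105% and then
# crops the excess, so most displayed columns no longer line up with a
# single source pixel and must be interpolated (hence the fuzzy text).
from fractions import Fraction

SRC_W = 1920
SCALE = Fraction(21, 20)              # 105% overscan zoom, kept exact
CROP = (SRC_W * SCALE - SRC_W) / 2    # columns trimmed from each side

def source_x(display_x):
    """Source column that lands on a given display column after zoom + crop."""
    return (display_x + CROP) / SCALE

# Count display columns whose source position is not a whole pixel.
misaligned = sum(1 for x in range(SRC_W)
                 if source_x(x).denominator != 1)
print(f"{misaligned} of {SRC_W} columns need interpolation")
```

With "Just Scan" (no zoom, no crop) the mapping is the identity and every column stays pixel-exact, which is why that mode matters for PC text.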

Alternatively, why not just use DVI? You're not losing any quality.

Joel Barsotti
SpectraCal
CalMAN Lead Developer
post #3 of 27 Old 09-08-2009, 10:16 PM
ffj
Member
Join Date: Sep 2009
Location: Greater Toronto Area, Ontario, Canada
Posts: 16
Red-3 is spot on...

DVI and HDMI are the EXACT same technology, just a different connector.

However... there is something that everyone forgets.

Notebook manufacturers that are now putting HDMI into their notebooks do not treat the ports the same... and ATI and nVidia actually do treat them differently...

If you are running a lower-end notebook with a dedicated video card, chances are the HDMI port runs via the video adapter within the notebook's chipset (Centrino platforms are notorious for this), whereas the DVI output does not run off that GPU; it likely runs off its own.

If you are running an AMD processor (which likely has an nVidia video adapter) you might be running both off the same chip (with a GPU built in), at which point it won't make a difference.

It all depends on your notebook...

I had the same problem with my last notebook (before work gave me a shiny new HP EliteBook - which I swear by). I caved, and just went out and bought a DVI to HDMI cable... that fixed my problem.

Not sure what type of notebook you've got... but if you test other HDMI devices and you don't have the same issue, you know for sure it's the notebook, not the TV.

Also... make sure to play with the settings... you can tweak the signal output (etc) in expert mode on the ATI control panel...

BUT BE CAREFUL. Depending on the TV you're testing your settings on, you might actually blow your TV.

I suggest calling the manufacturer and talking to them before returning the unit... a simple phone call might save you a trip (and gas) to the local electronics store.

Hope this helps!
post #4 of 27 Old 09-08-2009, 10:18 PM
ffj
Member
Another thread in this forum asks something similar.

Make sure to check this out too... similar to what I was just saying.

http://www.avsforum.com/avs-vb/showthread.php?t=1173340
post #5 of 27 Old 09-08-2009, 11:16 PM - Thread Starter
Red-3
Senior Member
Quote:
Originally Posted by sotti View Post

Alternatively why not just use DVI? You're not loosing any quality.

Laptop only has VGA and HDMI - no DVI

(Laptop is HP Pavilion DV3 with AMD 64 x2 w/ ATI HD3200 onboard video)

post #6 of 27 Old 09-08-2009, 11:27 PM
sotti
AVS Special Member
Quote:
Originally Posted by Red-3 View Post

Laptop only has VGA and HDMI - no DVI

(Laptop is HP Pavilion DV3 with AMD 64 x2 w/ ATI HD3200 onboard video)

gotcha.

Like I said, check the TV settings zoom mode for a native/just scan mode. That should get your pixels sharp.

Also, don't worry about blowing your TV; that is olden-days stuff from running CRTs at funky refresh rates the tube couldn't handle. An LCD/plasma will just report a non-compatible signal. But you don't even need to worry about that. Just get the TV into pixel-perfect mode, then see what else needs to be fixed.

Make sure the laptop isn't in clone mode with its display.

post #7 of 27 Old 09-09-2009, 06:36 AM
Bigbird999
AVS Special Member
Join Date: Aug 2007
Posts: 1,375
When you are connected by VGA, the TV acts like a monitor and displays the resolution that you send to it (1920x1080) without overscan. The result is crystal-clear 1:1 pixel mapping. When you use HDMI, the TV thinks it is a TV and overscans the picture, resulting in a blurred desktop. In your TV settings menu there should be a setting for "Just Scan" or "dot for dot" or similar that tells the TV not to overscan the HDMI signal. This setting may be available on only one of the HDMI ports. If your TV does not have a "just scan" setting (I'm not familiar with this model), you might try an HDMI-to-DVI cable hooked to the DVI input of your TV. The TV probably will not overscan the DVI input, because it will think it is a monitor, like when it is hooked up via VGA.

But if you are getting a good picture by VGA, you could just use it.

BB

New wealth is created in 2 ways:
You dig it from the ground (mining and oil)
Or you grow it (fishing, farming and forestry).
Everything else is just processing what you dug up or grew.
post #8 of 27 Old 09-09-2009, 09:47 AM - Thread Starter
Red-3
Senior Member
Quote:
Originally Posted by Bigbird999 View Post

When you are connected by VGA, the TV acts like a monitor and displays the resolution that you send to it (1920x1080) without overscan... In your TV settings menu there should be a setting for "Just Scan" or "dot for dot" or similar that tells the TV not to overscan the HDMI signal...

It just feels a bit cheap that this state-of-the-art digital connector on my laptop actually produces an inferior picture to the analog VGA connector.

Will try playing with the ATI card settings and the TV settings again, but so far no joy...

post #9 of 27 Old 09-09-2009, 10:14 AM
jrwalte
AVS Special Member
Join Date: Jun 2007
Posts: 2,537
Quote:
Originally Posted by Red-3 View Post

It just feels a bit cheap that this state-of-the-art digital connector on my laptop actually produces an inferior picture to the analog VGA connector.

Will try playing with the ATI card settings and the TV settings again, but so far no joy...

As previously stated, analog VGA really isn't inferior to DVI/HDMI. Yes, it can pick up interference, but that's rare, and any decent-quality cable helps prevent it. If the picture quality looks good, your text is sharp, and you don't need to carry audio over HDMI, then don't worry so much about having to use VGA instead of HDMI.
post #10 of 27 Old 09-09-2009, 10:30 AM
sotti
AVS Special Member
The mode on your TV is called "Just Scan".

Use the P.Size button to get there.

If the image is underscanned, you can fix that in your ATI drivers.

post #11 of 27 Old 09-09-2009, 01:40 PM - Thread Starter
Red-3
Senior Member
Quote:
Originally Posted by sotti View Post

The mode on your TV is called "Just Scan"

Use the P.Size button to get there.

If the image is underscanned, you can fix that in your ATI drivers.

I already had 'Just Scan' selected. Overscan set to 0. Yet still the issue persisted...

...UNTIL...

I came across a setting in the Samsung TV under the INPUT menu.
The function is EDIT NAME. The TV gives a list of names ranging from TV, Camcorder, and Game to PC.
To my complete surprise, this not only sets the 'name' for the HDMI input but also adjusts settings to 'match' the input signal to the 'input peripheral'. Setting it to PC causes 1:1 mapping to occur and the TV to do a true pass-through of the signal from the PC to the display. (The screen even blanks out for a second while it adjusts to the PC 'mode'.)

Who would have figured, eh?
Of all the dumbest-named TV settings... this one has to take the prize!

post #12 of 27 Old 09-09-2009, 01:49 PM
Bigbird999
AVS Special Member
So, now it is time to be totally honest. Which input gives the better image quality? VGA or HDMI?

BB

post #13 of 27 Old 09-09-2009, 02:00 PM
jrwalte
AVS Special Member
My Samsung DLP LED does that as well - it changes different settings based on the name you select for the HDMI port. Just Scan should work under any name, though. Odd that you had to set it to PC in order to get it to work.
post #14 of 27 Old 09-09-2009, 03:36 PM - Thread Starter
Red-3
Senior Member
Quote:
Originally Posted by Bigbird999 View Post

So, now it is time to be totally honest. Which input gives the better image quality? VGA or HDMI?

BB

Actually I'm a little surprised, because my 1080p Sony projector won't accept a full 1080p HD signal via the VGA input. As with the component input, I was led to believe this was a limitation of the format, but it displays fine on the Samsung (which is also 1920x1080p native res), so I guess that was a misnomer.
(I thought I beat it once, but discovered it was 1080i, not 1080p.)

In my experience the differences between VGA and DVI/HDMI are subtle rather than obvious. Sometimes you might see shimmer around text or interference around 'greeked' patterns (such as pixel-on/pixel-off grid graphics).
But it's rare to encounter anything that lets you definitively say, "See that there on my screen? That's because I'm using a VGA cable."

post #15 of 27 Old 09-09-2009, 03:37 PM - Thread Starter
Red-3
Senior Member
Quote:
Originally Posted by jrwalte View Post

My Samsung DLP LED does that as well - changes different settings based off the name you select for the HDMI port. Just Scan would work under any name though. Odd that you had to set it to PC in order to get it to work.

Odd indeed...

post #16 of 27 Old 09-09-2009, 05:40 PM
sotti
AVS Special Member
Quote:
Originally Posted by Red-3 View Post

Actually I'm a little surprised, because my 1080p Sony projector won't accept a full 1080p HD signal via the VGA input. As with the component input, I was led to believe this was a limitation of the format,

Nah, VGA can go at least to 2560x1600@60Hz, if not more, though I've never seen anything bigger.

It all depends on how many MHz the DAC runs at: 1 pixel per Hz. Most DACs are 400 MHz these days (400M pixels per second).
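
As a rough back-of-the-envelope check on that 400 MHz figure: required pixel clock is active pixels per second plus blanking overhead. The 25% overhead below is my own illustrative assumption; real VESA CVT/GTF timings vary by mode:

```python
# Rough pixel-clock estimate for an analog link: active pixels per second,
# padded by a nominal blanking overhead (25% is an assumption, not a spec).

def pixel_clock_mhz(width: int, height: int, refresh_hz: float,
                    blanking_overhead: float = 0.25) -> float:
    """Approximate pixel clock in MHz for a given mode."""
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

for w, h in [(1920, 1080), (2560, 1600)]:
    print(f"{w}x{h}@60Hz: ~{pixel_clock_mhz(w, h, 60):.0f} MHz")
```

Both modes come in under a 400 MHz DAC at 1 pixel per Hz, consistent with 2560x1600@60Hz being about the practical ceiling for VGA.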

post #17 of 27 Old 09-10-2009, 02:02 PM
jrwalte
AVS Special Member
The VGA limitation to 1080i is your projector, not the PC or the cable technology. Many TVs do not support their full (as in progressive) native resolution on anything other than HDMI.
post #18 of 27 Old 09-12-2009, 12:39 PM
ffj
Member
Guys... what I was referring to is an adapter like this:

www.amazon.com/DVI-HDMI-Cable-6ft-Male-Male/dp/B0002CZHN6

It's a simple DVI-to-HDMI cable... you can get it in any config (M-M, M-F, F-M, F-F).

I have one for my laptop... and NO WAY is VGA better than DVI... not a chance.


post #19 of 27 Old 09-12-2009, 04:20 PM
roots4x
Member
Join Date: Feb 2009
Location: San Jose, CA
Posts: 110
DVI and HDMI video are for all intents and purposes identical. Most likely your monitor is set to treat those inputs differently, both with scaling and video quality settings.
post #20 of 27 Old 09-12-2009, 05:14 PM
walford
AVS Addicted Member
Join Date: May 2003
Location: Orange County, CA
Posts: 16,789
Quote:
Originally Posted by Red-3 View Post

Odd indeed...

Not odd at all. PCs do not send audio over an HDMI connection in the same manner as STBs do.
It appears that PCs like to send 23.976 video over HDMI, with the audio sent over the interface separately, as compared to an STB, which sends an integrated video and audio signal at 24 FPS. Therefore it is important for many of today's HDTV systems to know whether an HDMI input is from a PC, a cable/satellite STB, or a separate disc player unit.
post #21 of 27 Old 10-03-2009, 07:27 AM
ChunkyDark
Member
Join Date: Mar 2008
Posts: 59
Thank you all for the great info in this post. It really helped clear up my screen.

As a side note, after reading a bit more in my hp-t4254 manual, there is a line in there about only using HDMI 1 for an HTPC. Sure enough, I tried that input and it worked beautifully!
Who'd a thunk it, the answer was in the manual.
post #22 of 27 Old 07-27-2012, 11:35 PM
Laine Mikael
Newbie
Join Date: Jul 2012
Posts: 1
In my Samsung monitor's menu, there is "Setup" -> "AV Mode" (on the 2nd page, just scroll down) -> Off.

This helped, for me at least.
post #23 of 27 Old 07-28-2012, 12:54 PM
olyteddy
AVS Special Member
Join Date: Oct 2005
Posts: 3,155
Another thing a PC input does that a standard HDMI input doesn't is use the proper color space. HDMI usually uses 4:2:2 chroma sampling, while a PC looks best in RGB or 4:4:4.
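
A toy model of what 4:2:2 does to chroma (a simplification for illustration only, not any codec's actual filtering):

```python
# 4:2:2 stores one chroma sample per pair of horizontal pixels, so a
# one-pixel color transition gets averaged away. Luma keeps full width,
# which is why colored edges (not brightness edges) are what go soft.

def subsample_422(chroma_row):
    """Average each horizontal pair of chroma samples, then repeat the value."""
    out = []
    for i in range(0, len(chroma_row), 2):
        pair = chroma_row[i:i + 2]      # last element may be a singleton
        avg = sum(pair) / len(pair)
        out.extend([avg] * len(pair))
    return out

row = [0, 255, 0, 255]   # alternating chroma, e.g. 1px-wide colored detail
print(subsample_422(row))
```

In RGB or 4:4:4 mode the row would pass through untouched, which is one reason labeling the input as a PC matters on these sets.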
post #24 of 27 Old 11-30-2012, 05:40 PM
wgscott
Member
Join Date: Aug 2010
Posts: 91
Quote:
Originally Posted by Red-3 View Post

Picked up a Samsung 2333HD TV/Monitor and trying out the various connections to my PC... DVI and even VGA produce a beautiful 1:1 pixel mapped 1920x1080p image. HDMI is a different story - the text is 'messy' not crisp like the other two...

I have exactly the same problem, coincidently with a Samsung LED monitor.

The monitor has an HDMI input, a DVI input, and a VGA input. I've limited my tests to the first two. I have a 2011 Mac mini; it has an HDMI output and a "Thunderbolt/Mini DisplayPort" out.

I have two high-quality HDMI cables, an HDMI-to-DVI adaptor plus an el cheapo DVI cable (complete with a Chinglish warning tag on it), and a Mini DisplayPort-to-HDMI adaptor.

The DVI cable attached to the HDMI-out adaptor gives a nice, clean, crisp, highly legible display. All combinations using HDMI into my monitor are vastly inferior, and no amount of monitor hardware calibration, or software calibration on the Mac mini, changes anything. Both HDMI cables are equivalent (and have been tested in other contexts, where they behave without issue).

I'm unsure if it is the monitor at this point, or the Mac mini, but your post makes me suspect the Samsung monitor.

It seems idiotic to convert from HDMI out to DVI rather than just use HDMI, but after wasting more than an hour trying to adjust the image quality, I decided to get on with my life (assuming posting counts as life). I just wanted to corroborate this observation; it is the first hit if one googles these symptoms, suggesting it isn't unique.
post #25 of 27 Old 12-08-2012, 09:42 AM
nyty-nyt
Newbie
Join Date: Dec 2012
Posts: 1
I just went through the hassle of applying for an account to submit my reply to your post.

Here it is.

Bless you. I have been googling everywhere for a solution, playing with settings on my Mac mini, and pulling my hair out.
You nailed it.
post #26 of 27 Old 12-08-2012, 07:55 PM
Karyk
AVS Special Member
Join Date: Dec 2001
Location: Seattle
Posts: 6,207
Quote:
Originally Posted by Red-3 View Post


I already had 'Just Scan' selected. Overscan set to 0. Yet still the issue persisted... UNTIL... I came across a setting in the Samsung TV under the INPUT menu. The function is EDIT NAME... Setting it to PC causes a 1:1 mapping to occur and the TV to do a true pass through of the signal from the PC to the output display.

Thank you! I have a fairly new Samsung TV and it does the same thing. I had noticed things changed when setting the names of inputs, but I hadn't noticed that particular change. I'd rejected the PC setting because I didn't like the preset picture settings (one called Entertainment and one called Economy). The DVR option had three settings, and I liked adjusting "Dynamic" the best. I don't know why they limit the choices depending on the type of input.

What I just noticed is that with the PC setting there is no control over color settings at all; they are all grayed out, so you have to do it all through your PC. The standard settings are really rather accurate, though. My old TV was a Toshiba with horrible greens, so color accuracy was one of the main things I looked at when picking this set. So this probably isn't a major issue for most, but I adjusted mine anyway.

I had to re-calibrate when I changed it back to PC just now. All the shows I had looked great, except Conan. It was both dark and washed out at the same time, if that makes sense. In the past I've usually found Conan to have rather good PQ. Anyone have any idea what that's about?
post #27 of 27 Old 12-09-2012, 11:06 AM
Karyk
AVS Special Member
I just noticed that the commercials on Conan also look bad, so maybe it's something with the way Comcast is now sending the channel. Hopefully it's just temporary.