VGA vs HDMI? - AVS Forum
Old 02-21-2008, 10:58 AM - Thread Starter
petronin
Member
Join Date: Dec 2007
Posts: 51
Hello again,

Newbie question here. If you have a TV that will accept 1920x1080 from the VGA input (Sony KDL52XBR4) and assuming you're using a direct connection between PC to receiver for audio: is there any difference in video between VGA and HDMI??

I just bought a new HD3650, and not only have I not been able to get the audio working over HDMI, but the resolution is all messed up over HDMI. I set it to 1920x1080 and it leaves room on the top/bottom and sides of the screen. And the fonts are blurry as hell. Playing a 1080 h264 went very smooth and looked great, but again didn't fill up the entire screen.

Over the VGA, I get crystal clear desktop. All fonts are perfect, and the same 1080 h264 movie looked just as good, only it filled the entire screen. Am I missing something here? Why wouldn't I just stick with VGA and keep life simple?

Thanks for all your help.
petronin is offline  
Old 02-21-2008, 11:03 AM
Java Jack
AVS Special Member
Join Date: Mar 2006
Location: Austin TX
Posts: 1,784
VGA is not a protected video path; therefore, you will not be able to display protected content. You need an HDCP-compliant solution (either DVI with HDCP, or HDMI).

The other issue you are experiencing is related to scaling. HDTVs scale the resolution and content differently than a PC platform does, so your resolutions will be slightly different. Sometimes you just have to play with it a little to get close, then use the software drivers to scale (stretch/shrink) the screen to fit appropriately.
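To put rough numbers on that, here is a back-of-the-envelope sketch in Python (the 5% figure is a typical overscan amount I'm assuming, not a spec for any particular set):

Code:

def visible_area(width=1920, height=1080, overscan_pct=5.0):
    # The TV enlarges the image by the overscan factor, so only the
    # central portion of the source survives on screen.
    scale = 1.0 + overscan_pct / 100.0
    return round(width / scale), round(height / scale)

w, h = visible_area()
print(f"With 5% overscan you only see about {w} x {h} of the 1920 x 1080 source")
# -> roughly 1829 x 1029, i.e. around 45 source pixels cropped off each side horizontally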

Regards.
Java

There are 10 types of people in the world, those that understand binary, and those that don't.

AMD@Home Blog: http://links.amd.com/Home
Twitter: http://twitter.com/Java_Jack
Java Jack is offline  
Old 02-21-2008, 11:26 AM
Somewhatlost
AVS Special Member
Join Date: Jul 2006
Location: some small blue-green planet thingy...
Posts: 1,793
Quote:
Originally Posted by Java Jack

VGA is not a protected video path; therefore, you will not be able to display protected content. You need an HDCP-compliant solution (either DVI with HDCP, or HDMI).

that just isn't true...
well it is true that VGA isn't "protected" ... but who benefits from this so called protection? us consumers? not really...
so it is just better to strip out/crack that protection when needed... and there are already ways to do this...

Quote:
Originally Posted by Java Jack

The other issue you are experiencing is related to scaling. HDTVs scale the resolution and content differently than a PC platform does, so your resolutions will be slightly different. Sometimes you just have to play with it a little to get close, then use the software drivers to scale (stretch/shrink) the screen to fit appropriately.

what is to scale? 1080P VGA should be = 1080P hdmi... 1920x1080 is 1920 x 1080 is it not? what does vga or hdmi have to do with that? assuming it really is a 1080P capable set?

NOTE: As one wise professional something once stated, I am ignorant & childish, with a mindset comparable to 9/11 troofers and wackjob conspiracy theorists. so don't take anything I say as advice...
Somewhatlost is offline  
Old 02-21-2008, 11:32 AM - Thread Starter
petronin
Member
Join Date: Dec 2007
Posts: 51
Quote:
Originally Posted by Somewhatlost

what is to scale? 1080P VGA should be = 1080P hdmi... 1920x1080 is 1920 x 1080 is it not? what does vga or hdmi have to do with that? assuming it really is a 1080P capable set?

That's kind of what I thought. If it's really outputting 1920x1080, why does it look different when I use HDMI and VGA?
petronin is offline  
Old 02-21-2008, 11:51 AM
 
reggea_boy
Join Date: Feb 2008
Posts: 4
does one provide a better signal quality than the other?
reggea_boy is offline  
Old 02-21-2008, 12:19 PM
mslide
Advanced Member
Join Date: Nov 2006
Posts: 882
Not considering the protected path thing, it's really just a personal preference. Some people think VGA looks better, with their setup. Others have better luck with DVI/HDMI. That's really all it comes down to. Try each one and see which one you like.

I currently use VGA, but only because my ancient HTPC doesn't have DVI out. I'm happy with the picture. However, when I had my macbook hooked up to my TV, I couldn't see any difference between VGA and HDMI.
mslide is offline  
Old 02-21-2008, 12:34 PM
jcuesico
Member
Join Date: Jun 2006
Posts: 32
I'm also using VGA on my HTPC. My TV is a 1080p LCD. However I can't turn off the edge detection on the HDMI port. The VGA port doesn't have edge detection. So I like the image I get over the VGA port much better. As for playing HD content, my HTPC plays both Blu-ray and HD-DVD discs and QAM Cable channels just fine over the VGA connection. I don't have to worry about HDCP. Both my TV and Video card are HDCP compliant by the way...

For now I find using VGA gives me a better picture.
jcuesico is offline  
Old 02-21-2008, 12:42 PM - Thread Starter
petronin
Member
Join Date: Dec 2007
Posts: 51
Thanks for the replies. Since I decided that I'm going to keep the dedicated PC-receiver audio hookup so I can play tunes when the TV is not on, I think I'm going to stick with VGA. Based on what I've heard here, there is no loss of video resolution.

I'll re-post if I can't get BluRay going once I get a drive.
petronin is offline  
Old 02-21-2008, 01:36 PM
Java Jack
AVS Special Member
Join Date: Mar 2006
Location: Austin TX
Posts: 1,784
Quote:
Originally Posted by Somewhatlost

that just isn't true...
well it is true that VGA isn't "protected" ... but who benefits from this so called protection? us consumers? not really...
so it is just better to strip out/crack that protection when needed... and there are already ways to do this...



what is to scale? 1080P VGA should be = 1080P hdmi... 1920x1080 is 1920 x 1080 is it not? what does vga or hdmi have to do with that? assuming it really is a 1080P capable set?

This was not about who benefits...the simple fact is that if you want to enjoy protected content (legally) on an HTPC, you must have a protected video path. Without it, content will be scaled down or not run at all. Hence, HDMI or DVI with HDCP.

As for the scaling, it has nothing to do with HDMI vs. VGA. My point was that the resolution of a TV does not always align well with the resolution of a PC output (regardless of HDMI, DVI or VGA), because PC displays do not over/underscan. That is why there are scaling issues and the image does not always stretch to the edge of the screen very well.

That is why video driver guys are releasing tools to help adjust the scaling so things line up better.

Regards.
Java

There are 10 types of people in the world, those that understand binary, and those that don't.

AMD@Home Blog: http://links.amd.com/Home
Twitter: http://twitter.com/Java_Jack
Java Jack is offline  
Old 02-21-2008, 01:44 PM
gobigotchi
Member
Join Date: Jan 2007
Posts: 37
Quote:
Originally Posted by petronin

Hello again,

Newbie question here. If you have a TV that will accept 1920x1080 from the VGA input (Sony KDL52XBR4) and assuming you're using a direct connection between PC to receiver for audio: is there any difference in video between VGA and HDMI??

I just bought a new HD3650, and not only have I not been able to get the audio working over HDMI, but the resolution is all messed up over HDMI. I set it to 1920x1080 and it leaves room on the top/bottom and sides of the screen. And the fonts are blurry as hell. Playing a 1080 h264 went very smooth and looked great, but again didn't fill up the entire screen.

Over the VGA, I get crystal clear desktop. All fonts are perfect, and the same 1080 h264 movie looked just as good, only it filled the entire screen. Am I missing something here? Why wouldn't I just stick with VGA and keep life simple?

Thanks for all your help.

Just FYI, you should be able to go into Catalyst Control Center, advanced mode, go into DTV, find the scaling options section, and drag the slider all the way to the right from Underscan 15% to Overscan 0%; that should get it to fill your screen at 1080p over HDMI. I had to do the same for my 3450.
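For a rough idea of what that slider means in pixel terms, here is a small sketch (assuming the driver applies a simple linear shrink, which is my assumption rather than anything documented here):

Code:

def underscanned_desktop(width=1920, height=1080, underscan_pct=15.0):
    # The driver renders the whole desktop into a smaller box and centres it,
    # which is where the black borders and the soft, rescaled fonts come from.
    factor = 1.0 - underscan_pct / 100.0
    return int(width * factor), int(height * factor)

print(underscanned_desktop())                   # (1632, 918) at the default Underscan 15%
print(underscanned_desktop(underscan_pct=0.0))  # (1920, 1080) with the slider at Overscan 0%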
gobigotchi is offline  
Old 02-21-2008, 01:54 PM
uplift1
Member
Join Date: Jul 2007
Posts: 25
Long answer made short: use HDMI/DVI unless a direct comparison with VGA reveals that your TV is applying unwanted image correction on the HDMI port.

Read up on DVI here:
http://en.wikipedia.org/wiki/DVI

http://forums.nvidia.com/lofiversion...hp?t50789.html
Quote:


DVI - information goes directly from your video card to your monitor. The color of each pixel on your monitor is calculated by your video card and then sent as digital information to your monitor so that no conversion is necessary. An LCD monitor simply reads this information and displays it directly.

VGA - Information is converted from digital to analog [red,green,blue] format. Some accuracy and time is lost in this conversion. How much is lost depends on the monitor's conversion hardware.

Image Quality:

On a CRT monitor, there is no real image quality difference between DVI and VGA. This is because a CRT is natively based on the [red,green,blue] format for displaying each pixel.

On an LCD, you will notice a difference between the 2 formats if you look hard enough. Different LCDs will handle the conversion differently. You may start to see dithering, banding, "dancing pixels" and blander/incorrect colors when using VGA on an LCD. The larger the LCD/resolution, the more you will notice these differences.

DVI also has a faster data transfer rate, which means that the higher the resolution, the worse the input lag will be if you use VGA. This is very important if you play fast(twitch) shooter games.

Finally, VGA only contains the color information for your monitor's image. DVI includes more than that. That's why when you connect using DVI, you don't have to adjust your monitor's image position, phase, and clock corrections to sync. It contains exactly how/what your video card wants to display.

If you hook up your LCD with VGA, you will notice that several monitor adjustments become available where they were not under DVI. That is because DVI carries all the information your monitor needs to configure itself, whereas VGA does not.
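As a toy illustration of that extra conversion step, here is a sketch in Python (purely illustrative: the noise level, the 700 mV full-scale swing and the ideal 8-bit converters are my assumptions; real DACs, ADCs and cables behave far less simply):

Code:

import random

def vga_round_trip(value_8bit, noise_mv=2.0, full_scale_mv=700.0):
    # Push one 8-bit colour value through an idealised analogue link.
    volts = value_8bit / 255.0 * full_scale_mv        # DAC on the video card
    volts += random.gauss(0.0, noise_mv)              # cable / termination noise (made-up level)
    recovered = round(volts / full_scale_mv * 255.0)  # ADC inside the TV
    return max(0, min(255, recovered))

errors = [abs(vga_round_trip(v) - v) for v in range(256) for _ in range(10)]
print("worst-case error in 8-bit steps:", max(errors))
# Over DVI/HDMI the same 8-bit number arrives untouched, so this error is zero.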

uplift1 is offline  
Old 02-21-2008, 02:18 PM
Emissary
Member
Join Date: Oct 2003
Posts: 138
Quote:
Originally Posted by Java Jack

This was not about who benefits...the simple fact is that if you want to enjoy protected content (legally) on an HTPC, you must have a protected video path. Without it, content will be scaled down or not run at all. Hence, HDMI or DVI with HDCP.

That is only the case if ICT is flagged. At the moment (as far as I know) no Blu-Ray discs have ICT flagged and therefore all will play with full resolution over an analogue connection. The possibility exists that one day the studios will decide to flag it and constrain analogue connections, but so far none has indicated they will do so and it would be a few years away anyway.

For what it's worth, I use a VGA connection to my Sony 32" as it's a 768p TV but only accepts standard HDTV resolutions over DVI (720p or 1080i) and overscan cannot be turned off. So VGA is a lot easier for me.
Emissary is offline  
Old 02-21-2008, 03:22 PM
Somewhatlost
AVS Special Member
Join Date: Jul 2006
Location: some small blue-green planet thingy...
Posts: 1,793
Quote:
Originally Posted by uplift1

Long answer made short: use HDMI/DVI unless a direct comparison with VGA reveals that your TV is applying unwanted image correction on the HDMI port.

actually the correct answer would be use whatever works better... in some cases VGA will work better, in others HDMI will... and in the case where they look the same, use VGA, since it is a cheaper cable

as for the so-called legality of stripping out the ICT and any other DRM that our Benevolent Media Overlords would choose to foist upon us, I believe the following quote sums up our duty pretty well...
Quote:


But when a long train of abuses and usurpations, pursuing invariably the same Object evinces a design to reduce them under absolute Despotism, it is their right, it is their duty, to throw off such Government, and to provide new Guards for their future security..

it would seem a bunch of old dead people knew what our Benevolent Media Overlords would be up to in the future...

NOTE: As one wise professional something once stated, I am ignorant & childish, with a mindset comparable to 9/11 troofers and wackjob conspiracy theorists. so don't take anything I say as advice...
Somewhatlost is offline  
Old 02-21-2008, 06:21 PM
uplift1
Member
Join Date: Jul 2007
Posts: 25
Quote:
Originally Posted by Somewhatlost

actually the correct answer would be use whatever works better... in some cases VGA will work better, in others HDMI will... and in the case where they look the same, use VGA, since it is a cheaper cable

Which is not really what the OP wanted to hear. You're telling him that if he has a CD player (DVI/HDMI digital cable) and a tape player (VGA analog cable), he should use the CD player over the tape player unless it's a really crummy CD player and a really good tape player. DVI/HDMI is designed to send a digital signal to the LCD: it carries exact color values for every frame, rather than analog color intensities that have to be digitized by the LCD. You're going to get a less accurate signal over analog, you can't argue that; the question is only whether YOUR eyes are good enough to see it, and whether your TV is dumb enough to interfere with rendering the signal (or does not accept its own native resolution via DVI/HDMI, as Emissary pointed out above). Many displays do generate over/underscan problems which can sometimes be relieved by video driver adjustments. I would recommend that the OP, if he cares enough about color accuracy, check his driver settings to see if this can be corrected.

And you can get HDMI cables for nothing if you go to www.monoprice.com or a similar site (I've gotten one for $5 at Fry's).
uplift1 is offline  
Old 02-21-2008, 07:43 PM
pastishe
Member
Join Date: Sep 2007
Posts: 121
My 2 cents: I messed around with DVI-HDMI for ages getting a perfectly fitting image and eventually got it right...

..but when I update my drivers it either stays the same or changes and goes all messed up. And sometimes I'll have to reboot my system for it to "notice" that I've got a monitor plugged in.

The difference between DVI/HDMI and VGA is hardly noticeable IMO on my set, i.e. if you get to about a foot away you can see that text is a little crisper on DVI/HDMI, but then who sits that close to a 46" screen anyway?

There is a worry about playing protected content, but I don't have a Blu-ray drive yet; when I do I'll just buy AnyDVD HD and carry on using VGA.
pastishe is offline  
Old 02-21-2008, 08:02 PM - Thread Starter
petronin
Member
Join Date: Dec 2007
Posts: 51
Quote:
Originally Posted by pastishe

There is a worry about playing protected content, but I don't have a Blu-ray drive yet; when I do I'll just buy AnyDVD HD and carry on using VGA.

Again, thanks for all the replies.

The above quote sums up exactly the conclusion I came to. At present, my eyes prefer VGA (HDMI text seems fuzzy to me, while VGA is crystal clear). I did play around with the HDMI settings and got it to fill the screen, but I still preferred VGA.

Now if/when I eventually get a BluRay player and encounter HDCP problems, I'll switch over to HDMI. Until then, I'm happy with VGA.

This forum rocks. I'm learning a ton!
Thanks
petronin is offline  
Old 02-22-2008, 07:02 AM
Emissary
Member
Join Date: Oct 2003
Posts: 138
Quote:
Originally Posted by petronin

Now if/when I eventually get a BluRay player and encounter HDCP problems, I'll switch over to HDMI. Until then, I'm happy with VGA.

And for at least the next couple of years, you shouldn't have any HDCP problems with VGA. Most studios have announced that they have no plans to enable the image constraint token that limits analog playback any time soon. So you can Blu-Ray away over your VGA connection as much as you'd like right now.
Emissary is offline  
Old 02-22-2008, 07:54 AM
IAM4UK
AVS Special Member
Join Date: Aug 2003
Location: United States
Posts: 6,051
I prefer DVI/HDMI over VGA for computer output to monitor or television. HDCP is a practical reality, but beyond that, I think the image is better with the digital connection.

But no one can tell you what looks best to your eyes. That's up to you.

A.L.a.E.o.t.U.S., as proven 3/21 - never forget.
Defend liberty.
Knowledge isn't Truth; it's just mindless agreement.
IAM4UK is offline  
Old 02-22-2008, 08:24 AM
vapore0n
Senior Member
Join Date: Aug 2005
Posts: 498
For my set:

I decided on HDMI because the TV set doesn't allow color correction on the PC input.
The PC input just looked bad, and I couldn't adjust the color on the PC well enough to be accurate on anything (it had bad red overexposure).

I still get the native 1366x768 resolution with no over/underscan.

Samsung LN46A650, Core 2 Duo E6400, HD4850, MSI 945P NEO3-F, 4GB PC2-6400, Harmony 659
vapore0n is offline  
Old 06-26-2008, 04:34 PM
markabuckley
AVS Special Member
Join Date: Jan 2007
Posts: 1,081
What's the colour bit-depth of VGA vs HDMI 1.1 (basically non-deep colour)?

On my Panasonic plasma I seem to get much more neutral colour on VGA, if slightly softer, vs HDMI. Whereas HDMI seems to be a bit garish - more cartoony (not that bad, but you know what I mean) - and shows a bit more banding?
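For reference, the raw arithmetic (HDMI before 1.3 carries 8 bits per channel; VGA is analogue, so its effective depth is set by the card's RAMDAC and the TV's ADC rather than by the cable - the 10-bit RAMDAC figure below is a commonly quoted spec for cards of that era, assumed here rather than measured):

Code:

def colours(bits_per_channel):
    # Total displayable colours for an RGB signal at this per-channel depth.
    return (2 ** bits_per_channel) ** 3

print(f"HDMI 1.1 (8 bits per channel): {colours(8):,} colours")   # 16,777,216
print(f"Deep colour (10 bits/channel): {colours(10):,} colours")  # 1,073,741,824
print(f"VGA via a 10-bit RAMDAC:       {colours(10):,} colours, give or take the analogue losses")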
markabuckley is offline  
Old 06-27-2008, 12:03 AM
syner
Member
Join Date: Jun 2007
Posts: 197
Those years away may be closer than you think:

http://www.engadget.com/2008/06/11/f...-drm-proposal/

While not from the TV endpoint, it is a start in trying to get rid of analog.

Robert Conlin
syner is offline  
Old 06-27-2008, 12:23 AM
Meparch
Senior Member
Join Date: Mar 2008
Posts: 222
You need to use the DVI/HDMI adapter that came with your card to get audio out. Also, you have to set your audio output in your Audio Properties to the ATi/HD Rear Audio option.
The scaling is annoying and resets itself on me all the time. It really is a piss-poor design. In that sense VGA may work better. HDMI should look good, though.
Meparch is offline  
Old 06-27-2008, 07:46 AM
stanger89
AVS Addicted Member
Join Date: Nov 2002
Location: Marion, IA
Posts: 17,491
Quote:
Originally Posted by petronin

Again, thanks for all the replies.

The above quote sums up exactly the conclusion I came to. At present, my eyes prefer VGA (HDMI text seems fuzzy to me, while VGA is crystal clear). I did play around with the HDMI settings and got it to fill the screen, but I still preferred VGA.

Sounds like your TV treats HDMI as a "video" signal and overscans it, while it treats VGA as a "PC" signal and doesn't.

You should try HDMI and see if there's a way to disable overscan in the TV. You should get a perfect, razor-sharp picture with 1080p-to-1080p HDMI.
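A quick way to check for true 1:1 pixel mapping is a border test pattern. Here is a minimal sketch (assumes the Pillow imaging library is installed; the filename is arbitrary):

Code:

from PIL import Image, ImageDraw

W, H = 1920, 1080
img = Image.new("RGB", (W, H), "black")
draw = ImageDraw.Draw(img)
draw.rectangle([0, 0, W - 1, H - 1], outline="white")      # 1-pixel frame at the very edge
draw.rectangle([10, 10, W - 11, H - 11], outline="gray")   # reference frame 10 px further in
img.save("overscan_test_1080p.png")

Display the PNG full-screen over each input: if any part of the white frame is missing, that input is still overscanning.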

See what an anamorphoscopic lens can do, see movies the way they were meant to be seen
stanger89 is offline  
Old 07-18-2009, 12:16 PM
BarbaraThompson
Newbie
Join Date: Aug 2008
Posts: 13
I spent hours straining my eyes trying to find a difference between my VGA and HDMI signals. Now I know there is no noticeable difference.
It's all part of a movement to try to digitize every output and control our content. So STICK with VGA as long as you can and let the manufacturers know that you are not willing to invest hours of YOUR time tweaking settings to protect OTHER people's money. We gave up our analog TV and now we have the same commercial interruptions while we are paying for the time. Don't give up VGA.
Syner: that link is sooo educational (and the comments are hysterical)
BarbaraThompson is offline  
Old 07-18-2009, 04:32 PM
rdunnill
AVS Special Member
Join Date: May 2001
Location: South Surrey, BC, Canada
Posts: 1,138
Quote:
Originally Posted by BarbaraThompson

I spent hours straining my eyes trying to find a difference between my VGA and HDMI signals. Now I know there is no noticeable difference.

If the display devices are digital, quality is noticeably better with digital connections like HDMI over analogue ones. At least it's been with all the displays I've had.

Blu-ray: 50+
HD-DVD: 23
DVD: 600+ and lost count
rdunnill is offline  
Old 07-18-2009, 05:22 PM
stanger89
AVS Addicted Member
Join Date: Nov 2002
Location: Marion, IA
Posts: 17,491
It just comes down to the analog inputs and how they're handled. Some displays handle Component/VGA just as well as DVI/HDMI and the difference is negligible. Some (maybe like computer monitors) assume only the digital input is going to be used and handle the analog ones rather poorly.

See what an anamorphoscopic lens can do, see movies the way they were meant to be seen
stanger89 is offline  
Old 07-18-2009, 09:56 PM
Mark_A_W
AVS Special Member
Join Date: Dec 2002
Location: Melbourne, Australia
Posts: 8,110
Quote:
Originally Posted by Java Jack

This was not about who benefits...the simple fact is that if you want to enjoy protected content (legally) on an HTPC, you must have a protected video path. Without it, content will be scaled down or not run at all. Hence, HDMI or DVI with HDCP.

Clueless... completely clueless.

VGA is ALLOWED. If someone sets the ICT then you can spout this rubbish.


But even if they do, AnyDVD HD fixes all.


There is nothing protected at all about Blu-ray, thanks very much to AnyDVD. We can do what we want, and I plan on running VGA to my CRT projector for many years to come.

Cracking the Blu-ray and converting it to MKV is the only way to get GAMMA adjustment, and it's the easiest way to get full-resolution audio.

They shot themselves in the foot, making it so hard to play back, that the path of least resistance is to crack the encryption.




As for the original question, whether HDMI provides a better picture than VGA depends on many things. It should on paper, but a test is the best way to make up your mind. HDMI does have the advantage of bundling the audio, if you need that feature.

Loving my Electric Bike!!
Mark_A_W is offline  
Old 07-18-2009, 11:02 PM
sneals2000
AVS Special Member
Join Date: May 2003
Location: UK
Posts: 7,065
VGA has a major downside - it doesn't always support 50Hz video. If you're in Europe, large parts of Asia, bits of South America, or Australasia, then you want to watch SDTV and HDTV at 50Hz - yet many PCs will only output VGA at 60Hz and above, and many TVs will only accept VGA inputs at these frequencies - so you can't always use VGA at 50Hz.

The same is also true of 24Hz output for 1080p Blu-ray replay.

The same PCs and displays WILL often accept 1080p at 24Hz and 50Hz via HDMI though...

There is also the issue that going VGA introduces a D/A and an A/D step that are not needed in an HDMI signal path - and consumer-level D/A and A/D conversion, plus imperfect analogue cabling, may not be lossless at the bit depths involved and may introduce video noise, HF filtering, ringing, etc. None of these processes and failings are inherent in HDMI - though HDMI implementations on specific displays could involve analogue processing. (At least one Sony HDTV had an HDMI-to-analogue-component converter in it, AIUI.)
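For what it's worth, the arithmetic a custom-resolution tool has to juggle is just pixel clock divided by total pixels per frame. A rough sketch (the 2200 x 1125 totals are the standard CEA 1080p/60 timing; the 50Hz line is illustration only, since real 50Hz modes normally widen the blanking rather than lower the clock):

Code:

# 1920x1080 active plus blanking, as used for CEA 1080p at 60 Hz.
H_TOTAL, V_TOTAL = 2200, 1125

def refresh_hz(pixel_clock_hz):
    return pixel_clock_hz / (H_TOTAL * V_TOTAL)

def clock_for(target_hz):
    return target_hz * H_TOTAL * V_TOTAL

print(f"{refresh_hz(148_500_000):.3f} Hz")          # 60.000 with the standard 148.5 MHz clock
print(f"{refresh_hz(148_500_000 / 1.001):.3f} Hz")  # 59.940, the NTSC-friendly rate
print(f"{clock_for(50.0) / 1e6:.2f} MHz would be needed for 50 Hz with these same totals")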
sneals2000 is offline  
Old 07-18-2009, 11:39 PM
Mark_A_W
AVS Special Member
Join Date: Dec 2002
Location: Melbourne, Australia
Posts: 8,110
PowerStrip.

Loving my Electric Bike!!
Mark_A_W is offline  
Old 07-19-2009, 03:14 AM
Mr.D
AVS Special Member
Join Date: Dec 2001
Location: UK
Posts: 3,307
Quote:
Originally Posted by Mark_A_W

PowerStrip.


Yup, never had a problem hitting 50Hz over VGA with PowerStrip.
I also find the refresh rates stated by the graphics driver apps are never accurate enough.
With the camera button in PowerStrip I can tickle the refresh rate until I get my desired refresh rates accurate to 3 decimal places (I don't think PowerStrip goes any more precise than that).

ReClock is also handy as an alternative measure; I find it tallies with PowerStrip's camera button 100%.

Graphics drivers tend to be way out and inconsistent... stutter city.

I've never found any picture quality advantage from VGA vs DVI/HDMI.
I don't believe the lag issue mentioned in a previous post either, from a gaming perspective.

digital film janitor
Mr.D is offline  
 