
#1 · Discussion Starter
After much forum reading on the Radeon, I have upgraded from the GeForce2 GTS to the Radeon VE, mostly for the flexibility of the multiple VGA and S-Video outputs, the picture quality, and the low price. However, I have not seen a post on the following:


I've found a color (temperature) discrepancy between the two VGA ports. I first noticed it upon installation (the card defaults to clone mode): the basic Windows desktop on one port was much more blue, and overall a bit darker, than on the other. Loading the drivers (various versions), changing resolutions, etc., made no difference.


Subsequent viewing of a color-bar BMP confirms the difference. The DVI/VGA port looks closer to normal than the dedicated VGA port does. Switching to TV-out seems to match the DVI port (probably the same DACs) and also looks good.


I set up two identical monitors with the same cables and the exact same settings. When I swap connectors, the problem follows the port! So I exchanged the card and got the EXACT same results. Again, this is visible even before drivers are loaded (operating in standard VGA mode), so it appears to be a hardware issue.


The obvious concern is that, had I not noticed this now, I would eventually have plugged my PJ into the screwy port and been inadvertently compensating for it in the PJ's gamma settings, etc.


Has anyone noticed this? Any thoughts?


Thanks.
 

#2
Hi TMEngineering,


Very interesting observation. Your post piqued my interest, so I decided to test one of my VE-equipped HTPCs. I have a Linksys KVM switch and connected both of its VGA cables to the two heads of the VE, so unlike your setup I am using a single LCD monitor, but I can switch heads instantaneously. I activated both heads of the VE, used a DVI-VGA adapter on the 2nd head, and tested in Win2K.

At first I did not notice any difference, but upon closer examination there is a difference in output between the two heads: a slight change in saturation and a slight but noticeable difference in the clarity of text. On my setup the primary VGA head is the one with the slightly richer saturation and slightly clearer text, whereas in your case the DVI connector was subjectively better. Fortunately my G15 is connected to the VGA connector, so I am getting the maximum quality from that head.


I will have to test my other VE-equipped HTPC and make sure the projector is connected to the best head.


In my situation the displays from the two heads were very close, but one was noticeably better; the difference was subtle rather than dramatic. I do not know whether the DVI-VGA adapter on the 2nd head degraded my signal, resulting in the slightly inferior display. TMEngineering, did you use an adapter for the 2nd head?


It looks as though Radeon VE owners should test both heads of their card, see which provides the more accurate display, and use that one for the projector. ;)


Rick
 

#3 · Discussion Starter
Rick,

Yes, my 2nd monitor was connected via the DVI-I port with the little adapter. Although, if I understand the spec correctly (which I may not), there's nothing going on in that adapter other than bringing the analog pins out.
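For reference, the DVI-I connector does carry the analog signals on their own dedicated pins, so a passive adapter really is just a re-route. The assignments below are from the DVI 1.0 spec, written out as a Python mapping purely for readability:

Code:
# Analog pins on a DVI-I connector that a passive DVI-to-VGA adapter
# simply re-routes to the VGA connector (per the DVI 1.0 spec).
DVI_I_ANALOG_PINS = {
    "C1": "analog red",
    "C2": "analog green",
    "C3": "analog blue",
    "C4": "analog horizontal sync",
    "C5": "analog ground (RGB return)",
    "8":  "analog vertical sync",
}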


After further testing, I found that the standard VGA port provides a substantially higher analog signal level than the DVI/VGA port does. I posted my findings in the HT Computer "10bit vs 8bit..." thread (I do more reading than writing on this forum, and I don't know how to link to my exact post). Because of that, the VGA port looks "brighter/sharper" to me, yet its color balance is incorrect when compared to the DVI/VGA port (on mine, anyway).
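For context, here's a quick back-of-the-envelope sketch (in Python, just for the arithmetic) of what "normal" should be: standard VGA analog video is 0.7 Vp-p full-scale into a 75-ohm line terminated at both ends, and the implied full-scale DAC current follows from Ohm's law. The current figure is derived from those standard numbers, not taken from any ATI datasheet.

Code:
# Nominal VGA analog level: 0.7 Vp-p full-scale video into a 75-ohm
# line terminated at both the card and the monitor (double-terminated).
R_SOURCE = 75.0    # ohms, back-termination at the card
R_MONITOR = 75.0   # ohms, termination inside the monitor
R_EFF = (R_SOURCE * R_MONITOR) / (R_SOURCE + R_MONITOR)  # 37.5 ohms
V_FULL = 0.700     # volts, nominal full-scale video
I_FULL_MA = V_FULL / R_EFF * 1000.0   # ~18.7 mA implied full-scale current
print(f"effective load {R_EFF:.1f} ohms, implied DAC current {I_FULL_MA:.1f} mA")
# A head that measures well above or below 0.7 V full-scale is either
# running its DAC at the wrong current or seeing a different load.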


I purchased two VE's this week. Maybe it's a newer run. How old are yours?


Also, my goal with the VE was to keep a desktop VGA monitor active while having the card switch between the PJ and TV, depending on what I was watching. I found this combination works only when the projector is plugged into the DVI/VGA port. Can you make similar selections with your PJ on the main VGA port?


-Tom
 

#4
Hi Tom,

Quote:
Because of that, the VGA port looks "brighter/sharper" to me, yet the color balance is incorrect when compared to the DVI/VGA port -- on mine anyway.
Can you tell me specifically how you tested the color balance? All I did was switch among various desktop themes that have colorful displays and then switch heads to see if there was a difference. I don't know how to get to the color-bar BMP you mentioned earlier.


Quote:
I purchased two VE's this week. Maybe it's a newer run. How old are yours?
The one I tested last night was an older OEM VE; I purchased it when the VE first hit the marketplace. My other VE, which I have not tested yet, was purchased about 2 months ago. When I get some time, I will evaluate that one as well.

Quote:
Also, my goal with the VE was to keep a desktop VGA monitor active while having the card switch between the PJ and TV, depending on what I was watching. I found this combination can only occur when the projector is plugged into the DVI/VGA port. Can you make similar selections with your PJ in the main VGA port?
I have not used my HTPC in that configuration, so I cannot give you any information regarding that setup. I have my primary head connected to the projector and the secondary head connected to my analog LCD monitor. Before I got my projector, I did use the S-Video connector, but the picture quality was less than optimal on my 70" RPTV, so I have not used it since.


Rick
 

#5 · Discussion Starter
Rick,


I tested the color balance simply by putting up a BMP of NTSC color bars and comparing it to DV tape playback on a calibrated monitor. That was close enough to show there was a real problem.
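In case anyone wants to reproduce the test without a bars BMP on hand, here's a minimal sketch that generates one. It assumes Python with the Pillow imaging library (any tool that writes BMPs would do); the 75% levels and bar order follow standard EIA full-field bars.

Code:
# Generates a full-field 75% color-bar BMP for display testing.
# Assumes Python with the Pillow imaging library (pip install Pillow).
from PIL import Image, ImageDraw
WIDTH, HEIGHT = 640, 480   # plain VGA resolution
LEVEL = 191                # 75% of 255, the standard bar amplitude
# Bar order for EIA full-field bars, in descending luminance:
BARS = [
    (LEVEL, LEVEL, LEVEL),  # white
    (LEVEL, LEVEL, 0),      # yellow
    (0, LEVEL, LEVEL),      # cyan
    (0, LEVEL, 0),          # green
    (LEVEL, 0, LEVEL),      # magenta
    (LEVEL, 0, 0),          # red
    (0, 0, LEVEL),          # blue
]
img = Image.new("RGB", (WIDTH, HEIGHT))
draw = ImageDraw.Draw(img)
bar_w = WIDTH // len(BARS)
for i, color in enumerate(BARS):
    # Stretch the last bar to the right edge so no black sliver remains.
    x1 = WIDTH - 1 if i == len(BARS) - 1 else (i + 1) * bar_w - 1
    draw.rectangle([i * bar_w, 0, x1, HEIGHT - 1], fill=color)
img.save("colorbars.bmp")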


I also measured the output of each color on a scope and found Blue to be way down.


However, to make a long story short: after getting my 3rd VE today (two returns later), I found a board that doesn't exhibit either problem, the reduced output level or the bad color balance. Both the VGA and DVI/VGA ports are spot on at 720mVp-p full level, the RGB outputs track each other correctly, and the output doesn't crap out as the brightness is raised.


For what it's worth, I pulled a new card off the shelf that the vendor received in May '01. Both defective cards (identical problem!) were received in Sept '01. I can only assume there was a bad batch.


It's still probably something everyone should check for, so they don't compensate for a bad port by over-tweaking their PJ.


-Tom
 

#6 · Discussion Starter
At minimum: don't use the DVI/VGA output for your projector or main display if you're also using the TV output.


As stated in earlier posts, I've twice returned my new Radeon VE because the color balance was off on the DVI/VGA (analog) output. Scope measurements showed the Blue output running about 120mV below normal relative to the R & G outputs, resulting in an obvious color problem. I figured the cards were bad.


Tonight my 3rd Radeon VE, which had appeared to work normally upon installation, started doing the same thing, as soon as I plugged something into the S-Video output. Now, after numerous plug/unplug tests and measurements, I'm confident that this is the story:


With nothing plugged into the S-video port, both the VGA and DVI/VGA behave fine.


When the supplied composite-video adapter is plugged in and attached to a monitor (i.e., loaded), the Blue output of the DVI/VGA port drops by about 120mV. (Judging by the extra pins, I don't think this is a Y/C-to-composite summing adapter, but rather a board-specific connector that pulls the composite signal off the board.) Since the Red & Green outputs don't drop (at least not nearly as far), the color balance is corrupted. Leaving the adapter in the card but disconnecting the attached RCA cable returns all outputs to normal. So it's some sort of loading (or load-sensing) issue, and as far as I'm concerned a design flaw, given that ATI advertises all combinations of monitors as usable.


When the supplied adapter is not used and a regular S-Video cable is plugged into a Y/C monitor, the Red and Green levels drop while the Blue output doesn't, again resulting in goofy color and reduced brightness on the DVI/VGA monitor. When the S-Video cable is left in the board but unplugged from its load (the monitor), the Red & Green outputs jump to 800mV (~80mV above normal). I'm not sure what that's about: either the board is compensating for something (NTSC color space??) or it's a termination/reflection effect. Either way, it shows very poor isolation, electrical or functional, between the Y/C output and the DVI/VGA output.
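Purely as an illustration of how loading can move a level in both directions described above, here's a rough Ohm's-law sketch. The 18.7mA full-scale current and the parallel-load topology are assumptions for the sake of the arithmetic, not anything taken from the VE's actual schematic:

Code:
# A current-output DAC develops its level across whatever loads sit in
# parallel on its output: extra load pulls the level down, and a
# missing termination pushes it up. All values are illustrative.
def head_mv(i_dac_ma, *loads_ohms):
    """Level in mV at a current-output DAC driving parallel loads."""
    return i_dac_ma / sum(1.0 / r for r in loads_ohms)

I_FS = 18.7  # mA full-scale, implied by 0.7 V into 37.5 ohms
print(head_mv(I_FS, 75, 75))      # ~701 mV: normal source + monitor terminations
print(head_mv(I_FS, 75, 75, 75))  # ~468 mV: an extra 75-ohm load drags the level down
print(head_mv(I_FS, 75))          # ~1403 mV: a missing far-end termination lifts it
# Even partial coupling of the TV-out load into a head's RGB channels
# would push some channels low (Blue down ~120 mV) or high (R & G at
# ~800 mV), depending on which path is loaded or left unterminated.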


Note: All this occurred with the video output DISABLED via the ATI driver s/w.


The bottom line, for those using VEs: you can't use the video output for anything and still expect the DVI/VGA output to be usable for serious viewing.


I have not tested the digital DVI output for the same inter-relationship.


I hope this post helps at least someone. It's been 3 nights of cursing for me.


-Tom
 

#7
Tom,


Great work! Thanks for refining your findings. I guess the reason I have not noticed any display problems is that I have not used my S-Video output for anything recently, and I have my projector on the primary output and my monitor on the secondary. As stated earlier, I saw only a very slight color shift when switching between the VGA and DVI heads.


Anyone considering using the S-Video output in conjunction with the primary or secondary outputs should realize that it will affect the color balance on the other ports, and that chances are the best display will come from the primary head.


Glad there are people with engineering backgrounds in this forum! ;)


Rick
 