Recently I noticed that newer graphics cards have only one DVI-I port (as opposed to two) plus one DVI-D, e.g. the GeForce 600 series and newer.
The thing is, I run two Trinitron CRTs. I have been wanting to upgrade my graphics card from my GTX 570 to a GTX 670 or 760, but I'm stuck because I need to find a DisplayPort to VGA adapter that works well; most only go up to 1920x1080 or 1920x1200.
I need to support at least 1920x1440 at 85 Hz, maybe even 2048x1536 at 85 Hz (I plan to get a GDM-F520 if I can find one).
I have also seen adapters advertise 24-bit color, but my computer says I am running at 32 bpp (32-bit color).
Is there any difference in color quality between a DisplayPort to VGA adapter and a DVI-I to VGA adapter? Will my picture look the same, better, or worse?
Also, can CRTs display 32-bit color, or 30-bit color (10 bits per channel), and is there any way to get an adapter that can convert to that?
The only downside is that if your monitor supports very high resolutions and refresh rates (e.g. 2048x1536 at 85 Hz), you may not be able to use combinations of high resolution and high refresh rate, as the CRTC in the adapter is not as good as the ones in most graphics cards.
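To get a feel for how demanding those modes are, here is a rough pixel-clock estimate you can compare against the maximum pixel clock an adapter lists in its spec sheet. The 1.35 blanking factor is an assumption standing in for real GTF/CVT timing math, so actual figures will differ somewhat:

```python
# Rough pixel-clock estimate for the CRT modes mentioned above.
# BLANKING_FACTOR approximates GTF-style blanking overhead (total pixels
# per frame divided by active pixels); real timings from a GTF/CVT
# calculator will differ somewhat.

BLANKING_FACTOR = 1.35  # assumed rough estimate, not an exact timing

def pixel_clock_mhz(width: int, height: int, refresh_hz: float) -> float:
    """Approximate pixel clock in MHz for a given mode."""
    return width * height * refresh_hz * BLANKING_FACTOR / 1e6

modes = [(1920, 1200, 60), (1920, 1440, 85), (2048, 1536, 85)]
for w, h, r in modes:
    print(f"{w}x{h} @ {r} Hz ~ {pixel_clock_mhz(w, h, r):.0f} MHz")
```

This puts 1920x1440 at 85 Hz and 2048x1536 at 85 Hz at very roughly 320-360 MHz, about twice the pixel clock of 1920x1200 at 60 Hz, which is why most DisplayPort to VGA adapters that top out at 1920x1080/1920x1200 cannot drive those CRT modes.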
As for color depth, CRTs are fully analog and don't have an inherent color depth. The color depth is a feature of the DAC, which is part of your graphics card (or of a digital-to-VGA adapter). I don't know exactly what color depths are supported by the Apple adapters.
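On the 24-bit vs. 32 bpp point: a "32-bit" desktop is normally just 8 bits each of red, green, and blue plus 8 bits of padding/alpha that never reach the display, so a 24-bit adapter is not discarding any color information. A minimal sketch of that layout (the 0xAARRGGBB packing used below is one common convention, assumed here purely for illustration):

```python
# A "32-bit" framebuffer pixel is typically 8 bits each of R, G, B plus
# 8 bits of alpha/padding.  The 0xAARRGGBB packing is one common
# convention, assumed here for illustration.

pixel = 0x80FF8040  # example 32 bpp value

alpha = (pixel >> 24) & 0xFF  # padding/alpha: never sent to the monitor
red   = (pixel >> 16) & 0xFF
green = (pixel >> 8) & 0xFF
blue  = pixel & 0xFF

# Only these 24 bits of color ever reach the DAC, which is why a
# "24-bit" adapter delivers the same image as a 32 bpp desktop.
print(f"R={red} G={green} B={blue} (8 bits per channel = 24 bits of color)")
```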