I don't think this is accurate. For instance, my DVD player is connected via HDMI and it offers three settings, each dependent on the display, not the connection. The player has three video settings for HDMI: Standard TV, Monitor, and TV-RGB. I get the best picture when using Standard TV. Here is the explanation I was given in the DVD forum:
If your HDTV works with the HD960 in "TV" mode (i.e. using YCbCr values), then that is most likely the best mode to use for a couple of potential reasons. The most important reason is that "TV" mode is the only mode in which you can get 10-bit color if your TV supports it, meaning 10 bits per color channel per pixel. RGB modes only support 8-bit color (that is, 8 bits per color channel per pixel, for a total of 24 bits per pixel). Consequently, using "Monitor" or "TV-RGB" will limit you to 8-bit color. Both "Monitor" and "TV-RGB" modes use RGB values, but they normalize them differently. "Monitor" displays expect black=0 and white=255. "TV-RGB" displays (in the U.S.) expect black=16 and white=235.
I have extracted only the relevant part of the response rather than quoting it in full, but I think you can get the drift from this.
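To make the normalization difference in that quote concrete, here is a minimal sketch of the mapping between the two 8-bit RGB conventions it describes (full range, black=0/white=255, versus limited video range, black=16/white=235). The function names are mine for illustration; they aren't anything the player or the quoted post defines.

def full_to_limited(value: int) -> int:
    """Map a full-range 8-bit value ("Monitor": black=0, white=255)
    to limited video range ("TV-RGB": black=16, white=235)."""
    return round(16 + value * (235 - 16) / 255)

def limited_to_full(value: int) -> int:
    """Map a limited-range 8-bit value back to full range,
    clamping anything outside the nominal 16-235 window."""
    value = max(16, min(235, value))
    return round((value - 16) * 255 / (235 - 16))

print(full_to_limited(0))    # 16  -> black on a TV-RGB display
print(full_to_limited(255))  # 235 -> white on a TV-RGB display
print(limited_to_full(16))   # 0
print(limited_to_full(235))  # 255

This is why picking the wrong mode for your display gives washed-out blacks or crushed shadows: the levels get interpreted against the wrong reference points.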