Mark, when I read this thread, I found myself hoping there would be a warning near the very top of the original post with words to the effect of:
If your TV shows 1080p @ 120Hz,
it does NOT mean that it is actually
displaying all 120 frames per second.
You MUST run one of the following
utilities and visually inspect the results.
Use this link to do this from your browser.
You cannot trust the monitor's
configuration readout alone.
I'm just worried that if you don't put something like this near the top, people will gloss over the post and we'll be back to endless "HEY IT WORKS!" false posts. Those kinds of posts are problematic because they can influence someone's buying decision.
Mark, I remember this discussion in the thread. He was mistaken: his final output showed frame discarding at 1080p/120. You mention this in a note at the bottom of the quote, but the line right below the TV models should read "720p/120", because that's the only thing confirmed. People won't read through to the note at the end; they'll just quote the top over and over.
And coincidentally, someone in the R550A thread just pointed out a similar confusion with that entry on your website. Here's their post.
Mark, I would feel better if you fixed that line to state 720p/120.
Do you have a link to one of those warnings? This might have been true in the CRT days (wasn't there a timing circuit that could overheat?), but now?
There's nothing that can burn out: interpolation alone is already driving the panel at that rate, and the input stream is a digital feed. It can either be read at that rate or it cannot (with the side weirdness of frame discarding, of course).
Just because your TV announces in the corner that it's receiving 1080p at 120Hz doesn't mean that it's displaying 1080p at 120Hz. It could well be doing what several displays so far have been found doing: keeping the resolution (1080p) but "frame dropping" (or "frame discarding"). You need to run a utility and visually inspect the results to see whether all 120 frames coming in every second are actually being displayed. The NVIDIA control panel has no understanding of this at all; it wouldn't know either way.
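Since "accepting 120Hz" and "displaying 120Hz" keep getting conflated in this thread, here's a minimal sketch of the idea behind those frame-skipping test utilities (the function name and numbers are mine, not from any actual tool): the test numbers every incoming frame, and a panel that silently discards frames will never display some of the numbers, which a single camera photo of the pattern reveals.

```python
def displayed_frames(input_fps: int, panel_unique_fps: int, duration_s: int = 1):
    """Simulate which numbered input frames a panel actually shows.

    The panel accepts the full input rate over HDMI but only refreshes
    with a NEW frame `panel_unique_fps` times per second, quietly
    discarding the rest. Assumes the rates divide evenly, which is the
    common frame-discarding case (120 in, 60 unique out).
    """
    step = input_fps // panel_unique_fps  # 120/60 -> keep every 2nd frame
    total = input_fps * duration_s
    return [n for n in range(total) if n % step == 0]

# A true 120Hz panel shows every numbered frame:
assert displayed_frames(120, 120) == list(range(120))

# A frame-discarding panel that still reports "1080p @ 120Hz" only
# ever shows the even-numbered frames:
shown = displayed_frames(120, 60)
print(len(shown), shown[:5])  # → 60 [0, 2, 4, 6, 8]
```

The point of the visual inspection is exactly this gap: the on-screen info banner and the NVIDIA control panel both report the input rate (120), while only the pattern photo reveals the unique-frame rate (60).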
The TV already knows how to drive itself at that rate (for interpolation). Who said it, and what was supposed to overheat? That doesn't preclude the TV from deciding to frame drop when driven at 120Hz over HDMI, however.
I'd like to know why. It doesn't make any sense to me at all.