Originally Posted by rdqlus
There's been a trend lately to overclock the input stage of display devices to increase refresh rates. Blurbusters, 120hz.net, and other sites document several successes with various consumer monitors and HDTVs.
I find this a bit silly as a concept.
Overclocking typically means telling a computer's BIOS to run the processor faster than its rated speed, and it usually requires extra cooling and so on to keep it from overheating and shutting down.
I wouldn't call feeding a 120Hz source to a display "overclocking."
The display's input either accepts 120Hz at certain resolutions or it does not. Since the input processor is almost never made by the company that makes the display, it wouldn't surprise me in the least if the display manufacturer only receives a partial list of the resolutions it accepts. Full PAL formats, and even some odd resolutions, are often accepted; they're just not on the "official list" of supported resolutions.
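To make that concrete, here's a minimal Python sketch of what trying one of those unlisted modes boils down to: pick a timing the display never advertised and work out the pixel clock its input would have to accept. The 160-pixel and 60-line blanking figures are placeholders I'm assuming for illustration, not exact CVT-RB values, and the tools named in the comment are just examples of where this is normally done.

```python
# Rough sketch of trying an unlisted mode: build a custom timing the
# display never advertised and see whether the input accepts it
# (in the spirit of ToastyX CRU or `cvt -r` on Linux).
# The blanking figures are illustrative reduced-blanking-style
# placeholders, NOT exact CVT-RB values.

def custom_timing(h_active, v_active, refresh_hz, h_blank=160, v_blank=60):
    """Return total geometry and the pixel clock the display's input
    would have to accept for this mode."""
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
    return h_total, v_total, pixel_clock_mhz

if __name__ == "__main__":
    # Ask a nominally 60Hz 1080p panel for 120Hz and see what its
    # scaler would be asked to swallow.
    h_tot, v_tot, clk = custom_timing(1920, 1080, 120)
    print(f"1920x1080@120: {h_tot}x{v_tot} total, ~{clk:.1f} MHz pixel clock")
```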
To me, overclocking a display's input would mean actually modifying that input to force it to accept a resolution it truly was never designed to accept, and getting it to work reliably at that resolution.
But whether feeding a display a 120Hz source works will all depend on the chipsets used in the display, and if a display supports 120Hz and actually looks good with a 120Hz source, that's a good thing. Still, it's probably an LCD flat panel, which generally doesn't look as good as a plasma does at 60Hz, but that's a different forum.
For projectors, you can try almost any resolution you want, but I wouldn't expect 1080p/120 to be accepted by any projector over HDMI yet. Soon, yes, but just not yet. The HDMI 2.0 specification leads me to believe that 1080p 3D and 120Hz will be accepted in the next 1-2 years, along with UHD resolutions. At which point, people will want UHD 60Hz/120Hz capability.
All in a little package that should never cost more than $1,000.
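To put some rough numbers behind the HDMI point above, here's a quick Python sanity check of raw pixel-clock headroom. The 340 MHz (HDMI 1.4) and 600 MHz (HDMI 2.0) ceilings are the published maxima; the mode clocks are the ~284.5 MHz figure from the earlier sketch plus the standard 594 MHz timing for 3840x2160@60. Whether any given projector's input actually advertises and processes the mode is, as said, another matter entirely.

```python
# Back-of-the-envelope check of which HDMI generation has the raw TMDS
# headroom for a given mode. 340 MHz (HDMI 1.3/1.4) and 600 MHz
# (HDMI 2.0) are the published pixel-clock ceilings; whether a display
# or projector actually lists and processes the mode is a separate
# question, which is the real sticking point today.

HDMI_PIXEL_CLOCK_LIMITS_MHZ = {"HDMI 1.4": 340.0, "HDMI 2.0": 600.0}

def fits_over(pixel_clock_mhz):
    """Map each HDMI generation to True/False for raw pixel-clock headroom."""
    return {name: pixel_clock_mhz <= limit
            for name, limit in HDMI_PIXEL_CLOCK_LIMITS_MHZ.items()}

if __name__ == "__main__":
    # ~284.5 MHz for 1080p/120 with the reduced-blanking sketch above,
    # 594 MHz for the standard 3840x2160@60 CEA timing (4400x2250 total).
    for label, clk in [("1080p/120 (RB)", 284.5), ("2160p/60 (UHD)", 594.0)]:
        print(label, fits_over(clk))
```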