Does running a high refresh rate on a CRT monitor cause more wear or generate more heat than running a lower refresh rate?
I have a Sony CPD-E210 that I run at 1024x768 at 85 Hz. Looking at the specs in the manual, I see that this monitor supports up to 85 kHz horizontal frequency. If I wanted to run 100 Hz refresh at 1024x768, the monitor would be running at 81.3 kHz, versus only 68.6 kHz at 85 Hz.
The preset resolutions in Windows go up to only 85 Hz when the "hide modes that this monitor cannot display" checkbox is checked. If I uncheck that box, I can run 100 Hz. The question then is: why impose an artificial limit of 85 Hz if the monitor can run 100 Hz while staying within spec on the max horizontal frequency (85 kHz)?
Does running near the limit of max horizontal frequency reduce the lifespan of the monitor?
Or is it totally safe to run at 100 Hz since it's under 85 kHz horizontal frequency?
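The horizontal-frequency figures above can be reproduced with simple arithmetic: horizontal scan rate is roughly (visible lines + vertical blanking lines) × refresh rate. Here's a minimal sketch, assuming a ~5% vertical blanking overhead (typical of GTF-style timings; the exact value depends on the modeline your driver generates):

```python
# Rough horizontal scan frequency estimate for a CRT video mode.
# Assumption: vertical blanking adds ~5% to the visible line count,
# which is in the ballpark of GTF timings. Actual modelines vary.

def h_freq_khz(visible_lines: int, refresh_hz: float, blanking: float = 0.05) -> float:
    """Estimate horizontal scan frequency in kHz."""
    total_lines = visible_lines * (1 + blanking)  # visible + blanking lines
    return total_lines * refresh_hz / 1000.0

print(round(h_freq_khz(768, 85), 1))   # ~68.5 kHz, close to the 68.6 kHz quoted above
print(round(h_freq_khz(768, 100), 1))  # ~80.6 kHz, still under the 85 kHz ceiling
print(round(h_freq_khz(1536, 75), 1))  # ~121 kHz for 2048x1536 @ 75 Hz
```

The small mismatch with the quoted 68.6/81.3 kHz numbers comes from the fact that real timing formulas vary the blanking interval with refresh rate rather than using a fixed percentage.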
The Windows display controls are very limited and not really suitable for getting the most out of a CRT. I only remember Matrox having good driver controls for CRTs, since they built their reputation on 2D output quality. Try EnTech's PowerStrip utility.
I might be wrong, but I don't think a modern CRT will let you run beyond its preset limits, at least in progressive mode. I remember CRTs looking blurry when stepping up the refresh rate long before they went out of sync.
I've always wondered this too. Any insight would be appreciated.
During games I'm running my LaCie at 75 Hz at 2048x1536. That comes out to about 120 kHz horizontal scan frequency, if I remember right. If this is going to take years off my monitor I'd like to know, so I could dial it back.
Any techs out there, or someone really knowledgeable about CRTs, who can answer this? Do higher refresh rates cause quicker wear of the components, or generate more heat and thereby reduce lifespan? I've searched the net and can't find much of a consensus on it.