Originally Posted by xPC
I'm having a bit of a struggle with my Dell U2417H. I bought the monitor primarily for gaming, as I had read that its factory calibration and color accuracy are very good, and color accuracy was my primary concern.
The factory calibration report shows an almost spot-on sRGB calibration, but when I use that mode, something just seems off. The color temperature looks cool, and I feel like I'm seeing banding rather than smooth gradients.
I used my SpectraCal C3 and CalMAN to do a basic calibration of the custom mode, which allows me to adjust RGB levels; this produced a visually better image to my untrained eye, and the readings were more accurate. The issue is, my C3 colorimeter is ~3 years old and I've never had it calibrated, so I'm not sure how much I can trust it. This brings me to my question(s): should I even be using sRGB for gaming? If so, should I trust Dell's calibration over my C3? Also, I noticed that the monitor seems unable to produce WTW detail, stopping strictly at 235; shouldn't a computer monitor be able to display all the way to 255?
Sorry for the rambling, but I'm debating whether I should return it. Any feedback or advice is greatly appreciated. TIA!
Hi. The SpectraCal C3 you have is a custom OEM version of the ColorMunki Smile (X-Rite's help desk as a reference: http://xritephoto.com/ph_product_ove...5722&catid=149 ). Because it is by design X-Rite's entry-level meter, its filters are exposed and drift sooner, so you can't really trust it after 3 years.
To see how much it has drifted, you need to compare it against a reference spectrophotometer (an i1Pro 2, for example), or correct its color-accuracy issue by creating a four-color matrix meter correction table using a spectro.
If you don't have access to a reference and want to decide which calibration to trust, display a full-field 100% white pattern (or a grayscale ramp) on your Dell with the factory settings, compare it against the settings you calibrated with your C3, and see which looks more neutral.
About the WTW: Blu-ray/satellite/terrestrial/cable signals use TV/Video Legal Levels, 16-235. (An 8-bit signal has 256 values; Reference Black is 16 and Reference White is 235 out of the 1-254 transmitted range, while 0 and 255 are reserved for timing.)
PCs work in PC/Data Levels, 0-255 (0 is Reference Black, 255 is Reference White).
When you play a movie on a PC, the software player expands the Video Levels to PC Levels (16-235 -> 0-255).
If it does not expand, you will see black as roughly 6.3% gray, since black at Data (PC) Levels is 0 while at Video (TV Legal) Levels it is 16 (16/255 ≈ 6.3%).
But when a software player expands 16-235 to 0-255, everything at or below 16 becomes black (0) and everything from 236-254 becomes 255. That is why any test pattern with flashing below-black (1-15) or above-reference-white (236-254) bars will clip (stop flashing) when you are using a PC monitor.

If you use a TV instead, you can set your software player not to expand (so it outputs the movie at Video Levels, 16-235), set your PC output to Data Levels (0-255), and set the TV input to Video Levels (16-235). Doing this, you will be able to see the above-reference-white bars flashing on the TV with correct black level; the same signal will show black as gray on your PC monitor, but the playback on the TV will be correct.
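The expansion and clipping above can be sketched in a few lines of Python (a minimal illustration of the math, not any particular player's actual code):

```python
def expand_video_to_pc(v):
    """Map an 8-bit Video Levels value (16-235) to PC Levels (0-255).

    Below-black values (1-15) clip to 0 and above-reference-white
    values (236-254) clip to 255, which is why WTW/BTB flashing bars
    stop flashing once a player has expanded the signal.
    """
    scaled = round((v - 16) * 255 / (235 - 16))
    return max(0, min(255, scaled))

# Reference black and white land exactly on the PC extremes:
print(expand_video_to_pc(16))    # 0
print(expand_video_to_pc(235))   # 255
# Above-reference-white detail clips away:
print(expand_video_to_pc(240))   # 255
# Unexpanded video black shown on a PC display looks ~6.3% gray:
print(round(16 / 255 * 100, 1))  # 6.3
```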