Originally Posted by bejoro
I have an older i1Pro1 Rev. D (calibrated in 2010), a 4-year-old i1Pro2 Rev. E, and a brand-new i1Display Pro (EODIS3, Retail, Rev. B-02).
I have tested the three instruments on a Dell U2713H GB-LED wide-gamut monitor, using HCFR.
I have used the correct RG_Phosphor spectral data for the EODIS3. The Dell U2713H was hardware-calibrated using i1Profiler (Dell DUCCS 1.6.5 software) and the new EODIS3.
All three instruments show very different readings.
What could be the reasons for these differences?
My intention was to use my i1pro2 as a reference for creating spectral data or sensor correction matrix for the EODIS3.
RG_Phosphor stores several WRGB spectral samples. The last of those samples is a GB-LED (actually a Dell U2413). A well-behaved i1d3 should not vary much whether you use a "cleaned" version containing just the GB-LED sample or X-Rite's full RG_Phosphor with the other "RG" spectral samples (given all the averaging that ArgyllCMS-related software, and X-Rite's tools, apply when using a multi-sample CCSS).
What you did in your three screenshots is compare that generic correction with 10nm measurements taken with an i1Pro/i1Pro2.
This is quite different from what you seem to want to do.
If you want to compare a CCSS-corrected i1d3 to your i1Pro2, then make a U2713H CCSS yourself. Build that CCSS from a native-gamut configuration: the Standard or Custom Color OSD modes, or a DUCCS 1.6.5 native-gamut calibration in CAL1/CAL2.
Then you'll see whether there really is such a difference between measurement devices, assuming you trust the i1d3 spectral sensitivity data stored in its firmware.
-Maybe your U2713H has a blue spike shifted a bit further towards violet than the last sample (the U2413 GB-LED) stored in RG_Phosphor, so you need a custom CCSS for some GB-LED variant, like the one in PA242W models (and you can contribute by uploading it to the community).
-Maybe your i1d3 measures closer to the i1Pro2 if you use a "clean" RG_Phosphor containing just the last sample (the last 4 spectral rows), which is an actual GB-LED, instead of the "averaged" correction built from all the spectral samples in the RG_Phosphor file.
-Maybe the 10nm readings from an i1Pro, with all the noise and averaging from its internal ~3nm resolution, cause some kind of measurement error in the Z coordinate (because of the fast rising/falling z-bar of the standard observer). Comparing your i1Pro2 readings to a JETI, for example, would shed light on that issue. Some users (Maciej Koper) did this for a PA242W, and the i1Pro2 misreadings seem close to what you measure: his i1Pro2 reads the white point a little "yellow" relative to the actual white point. I cannot post links right now; if you cannot find Maciej Koper's comparison of measurement devices on a PA242W GB-LED, I'll try to post it later.
So IMHO you did the wrong test. Try to do it as I suggest:
-use a clean RG_Phosphor with just the last 4 spectral data rows (you need to change the number of sets and the row index at the beginning of each line). That means using an actual GB-LED CCSS.
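That row trimming can be scripted instead of done by hand. Below is a minimal Python sketch; it assumes the CCSS is a plain-text CGATS file with a `NUMBER_OF_SETS` keyword, a `BEGIN_DATA`/`END_DATA` block, and a sample index as the first field of each data row (check your actual file, layouts can differ slightly):

```python
def trim_ccss(text, keep=4):
    """Keep only the last `keep` spectral rows of a CGATS/CCSS file,
    renumbering the leading sample index and fixing NUMBER_OF_SETS.
    Assumes BEGIN_DATA / END_DATA appear alone on their own lines."""
    lines = text.splitlines()
    begin = lines.index("BEGIN_DATA")
    end = lines.index("END_DATA")
    data = lines[begin + 1:end][-keep:]  # last `keep` spectral rows
    # renumber the sample index at the start of each kept row
    data = [" ".join([str(i + 1)] + row.split()[1:])
            for i, row in enumerate(data)]
    head = [f"NUMBER_OF_SETS {keep}" if l.startswith("NUMBER_OF_SETS") else l
            for l in lines[:begin]]
    return "\n".join(head + ["BEGIN_DATA"] + data + ["END_DATA"]
                     + lines[end + 1:]) + "\n"
```

Run it over a copy of RG_Phosphor and point your software at the trimmed file; keep the original untouched in case the assumptions above don't match your copy.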
-use a CCSS made by you with your i1Pro2 and your U2713H. Compare it to the i1Pro2 readings. Plot it (with the specplot tool in ArgyllCMS, or a spreadsheet).
Compare that plot to a plot of the "clean GB-LED" CCSS built from the last 4 samples of RG_Phosphor. Look at the blue spike position.
Does that blue spike position vary if you build a 10nm CCSS compared to a 3nm CCSS using the high-res mode of the ArgyllCMS driver?
I think this task is easy and may answer all, or at least most, of your questions.