2018 LG OLED Calibration and User Settings (No price talk) - Page 113 - AVS Forum | Home Theater Discussions And Reviews
post #3361 of 3441 Old 09-10-2019, 11:09 AM
Rolls-Royce (AVS Forum Special Member)
Quote:
Originally Posted by mrtickleuk View Post
I see, thanks. It's looking more and more that I need to try to send real DV patterns.

I have thought of a possible option. I already use the Raspberry Pi and we know that it is bit-accurate in RGB mode. So, could I use this as a "baseline" to test my laptop's video output (it is DisplayPort, running through a DP-> HDMI adaptor) ? I could, in SDR mode, take readings with the Raspberry Pi, and then switch to using the laptop's video output on the TV, with the Calman internal pattern window showing on the TV, and repeat those readings. I would be able to compare the results and if they were very similar, then I could trust that the laptop's video output is bit-accurate maybe?

I know this does not take account of "panel drift" but it is the best I could do, and I would prefer this to faffing about with Changing contrast and tweaking OLED light.

Has anyone else done this? In case I get lucky, it's a Lenovo T520i with Intel HD Graphics 3000 chipset (https://laptopmedia.com/video-card/i...elena-pamet-2/).
It wouldn't be just panel drift but meter repeatability as well, Mike. If you are trying to determine whether your laptop and Pi are accurate to within one or two digits, I just don't think your suggested method would be near good enough to tell.

...Royce...

"I never drink...wine."
Bela Lugosi, DRACULA, 1931
Rolls-Royce is online now  
post #3362 of 3441 Old 09-10-2019, 11:39 AM
janos666 (Advanced Member)
Quote:
Originally Posted by mrtickleuk View Post
I see, thanks. It's looking more and more that I need to try to send real DV patterns.

I have thought of a possible option. I already use the Raspberry Pi and we know that it is bit-accurate in RGB mode. So, could I use this as a "baseline" to test my laptop's video output (it is DisplayPort, running through a DP-> HDMI adaptor) ? I could, in SDR mode, take readings with the Raspberry Pi, and then switch to using the laptop's video output on the TV, with the Calman internal pattern window showing on the TV, and repeat those readings. I would be able to compare the results and if they were very similar, then I could trust that the laptop's video output is bit-accurate maybe?

I know this does not take account of "panel drift" but it is the best I could do, and I would prefer this to faffing about with Changing contrast and tweaking OLED light.

Has anyone else done this? In case I get lucky, it's a Lenovo T520i with Intel HD Graphics 3000 chipset (https://laptopmedia.com/video-card/i...elena-pamet-2/).

I've always assumed that all nVidia cards (from this decade at least) provide totally accurate Full Range RGB output. Other formats are always converted (internally) and usually dithered nowadays (for example, when you output YCC 12-bit or similar on the last few generations of their cards with relatively new drivers, it's dithered to 12-bit from some unknown internal conversion precision --- it's always dithered to whatever bit depth you have as your output format, but dithering shuts off automatically when it's 1:1 Full RGB and activates when it's anything else).
With AMD, dithering seems to be "always on" (at least on consumer cards), so even Full RGB could have some noise. But I don't think that causes any significant errors (keep in mind that the TV's internal processor most probably adds even more dither noise to the image), and apart from this dither noise I assume it's accurate. (And I think it's possible to shut dithering off with unsupported third-party tweak utils or something.)
I have no idea about Intel (I wouldn't trust it for serious stuff, to be honest, but that's just me). But I see no reason to assume they can't pass Full RGB (their internal conversion to other formats is probably far behind AMD and nVidia; it could be inaccurate and handled worse --- e.g. not dithered adequately, or not at all but rounded/truncated after a conversion at relatively low precision --- but I could be wrong).

I don't know why you guys play with those Pis when you have PCs. I doubt they can convert to YCC better than AMD/nVidia, and the test patterns usually go through RGB anyway (at some point somewhere, Pi or PC).
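
To illustrate the dithering point above, here is a minimal sketch (plain Python with a simple random dither; nothing vendor-specific is assumed): rounding alone collapses in-between levels onto one output code, while dither turns the rounding error into noise whose average preserves the level.

Code:
import numpy as np

def quantize(values, bits_in=10, bits_out=8, dither=True):
    # Scale to the ideal (fractional) output codes, then round. With dither,
    # the rounding error becomes noise; without it, levels snap to steps.
    scale = (2 ** bits_out - 1) / (2 ** bits_in - 1)
    ideal = values * scale
    if dither:
        ideal = ideal + np.random.uniform(-0.5, 0.5, size=ideal.shape)
    return np.clip(np.round(ideal), 0, 2 ** bits_out - 1).astype(int)

# One 10-bit level repeated many times: rounding alone always lands on the
# same 8-bit code, dithering keeps the average near the ideal 513 * 255/1023.
level = np.full(100_000, 513)
print(quantize(level, dither=False).mean())
print(quantize(level, dither=True).mean())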

"DIY certified hobby-calibrator" (based on ChadB's "warning signs" list

Last edited by janos666; 09-10-2019 at 11:42 AM.
janos666 is online now  
post #3363 of 3441 Old 09-10-2019, 11:47 AM
mrtickleuk (AVS Forum Special Member)
Quote:
Originally Posted by janos666 View Post
I've always assumed that all nVidia cards (from this decade at least) provide totally accurate Full Range RGB output. Other formats are always converted (internally) and usually dithered nowadays (for example, when you output YCC 12-bit or similar on the last few generations of their cards with relatively new drivers, it's dithered to 12-bit from some unknown internal conversion precision --- it's always dithered to whatever bit depth you have as your output format, but dithering shuts off automatically when it's 1:1 Full RGB and activates when it's anything else).
Only 8-bit output is possible with my card, and I would never attempt YCC anyway; since we know the Raspberry Pi is not accurate in that mode, it would be a complete waste of my time to compare laptop YCC vs Raspberry Pi YCC.

Quote:
I don't know why you guys play with those Pis when you have PCs. I doubt they can convert to YCC better than AMD/nVidia, and the test patterns usually go through RGB anyway (at some point somewhere, Pi or PC).
That's simple! Because the Raspberry Pi is proven to be bit-accurate in RGB mode, tested kindly by some lovely people using very expensive equipment, and it's a very wonderful thing to have access to such a cheap, known bit-accurate generator in any mode at all.

It's known that there are small visible differences in using bit-accurate RGB for calibration vs using bit-accurate YCC for calibration on this TV (Tedd has posted about this many many times), but of the options available it's still better to calibrate with bit-accurate RGB (silver standard) than throw our hands in the air and give up just because we don't have bit-accurate YCC (gold standard).

Rolls, I take your point about meter repeatability. I have time. I can do multiple passes and look for patterns which may show up each time. I'm not attempting the accuracy you crave, just a "sanity check" to see if it looks as if my laptop's RGB output might be close to accurate or totally wacky. Of my two options, I prefer this option to trying to use other values of Contrast/OLED light.

_______________
Denon AVR-X4200W, Arcam Alpha 8P; 5.1.4 setup: Mission 702e, M7C1i, 77DS, 731. Rel T5 Sub. Monitor Audio CT165 4 Tops | LG OLED55C8PLA TV | Samsung UBD-K8500 UHD Blu-Ray

HDMI 2.0 4K modes | Dolby & DTS core+outer audio tracks on (UHD) Blu-Rays | Hello to Jason Isaacs
mrtickleuk is online now  
post #3364 of 3441 Old 09-10-2019, 12:18 PM
janos666 (Advanced Member)
Quote:
Originally Posted by mrtickleuk View Post
That's simple! Because the Raspberry Pi is proven to be bit-accurate in RGB mode, tested kindly by some lovely people using very expensive equipment, and it's a very wonderful thing to have access to such a cheap, known bit-accurate generator in any mode at all.
But why do you think AMD/nVidia/Intel GPUs from this decade are not fully accurate in their Full RGB output mode (disregard the rare edge cases of broken OS/driver versions)?

I'm not sure if we can ever tell whether it's RGB or YCC that's more accurate on these LG TVs. All we know is that they behave a little differently relative to each other. I wouldn't take either as "golden".
I think the safe bet is to assume that they both deviate from the theoretical truth by the same amount (the internal processing inaccuracy affects them differently, but roughly by the same amount).
mrtickleuk likes this.

"DIY certified hobby-calibrator" (based on ChadB's "warning signs" list
janos666 is online now  
post #3365 of 3441 Old 09-10-2019, 12:49 PM
iSeries (Senior Member)
Quote:
Originally Posted by janos666 View Post
I don't know why you guys play with those Pis when you have PCs. I doubt they can convert to YCC better than AMD/nVidia, and the test patterns usually go through RGB anyway (at some point somewhere, Pi or PC).
PC with AMD/nVidia GPU configured for full range RGB output, and madTPG configured for limited range RGB output (with TV black level set to low) for the win.
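
For reference, the usual full-to-limited mapping that such a setup relies on is sketched below: the standard 8-bit formula with black at 16 and white at 235 (this is an illustration of the conversion, not madTPG's actual code).

Code:
def full_to_limited(code, bits=8):
    # Map a full-range code (0..2^bits-1) onto video/limited range,
    # where black = 16 and white = 235 (scaled up for higher bit depths).
    max_full = (1 << bits) - 1
    black = 16 << (bits - 8)
    white = 235 << (bits - 8)
    return round(black + code * (white - black) / max_full)

print(full_to_limited(0), full_to_limited(128), full_to_limited(255))  # 16 126 235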
iSeries is online now  
post #3366 of 3441 Old 09-10-2019, 01:09 PM
ConnecTEDDD (AVS Forum Special Member)
Quote:
Originally Posted by mrtickleuk View Post
I see, thanks. It's looking more and more that I need to try to send real DV patterns.
What is your plan? ...to try using your notebook output, with the HDFury to add the vendor infoframe and the CalMAN software generator for the patterns?

If you manage to make this work, it will require a bit-perfect 1080p60 full-range output from your VGA; if it works, it will mean it can output accurate levels for RGB.

Ted's LightSpace CMS Calibration Disk Free Version for Free Calibration Software: LightSpace DPS / CalMAN ColorChecker / HCFR
S/W: LightSpace CMS, SpaceMan ICC, SpaceMatch DCM, CalMAN 5, CalMAN RGB, ChromaPure, ControlCAL
V/P: eeColor 3D LUT Box - P/G: DVDO AVLab TPG
Meters: JETI Specbos 1211, Klein K-10A, i1PRO2, i1PRO, SpectraCAL C6, i1D3, C5
ConnecTEDDD is online now  
post #3367 of 3441 Old 09-10-2019, 01:15 PM
mrtickleuk (AVS Forum Special Member)
Quote:
Originally Posted by janos666 View Post
But why do you think AMD/nVidia/Intel GPUs from this decade are not fully accurate in their Full RGB output mode (disregard the rare edge cases of broken OS/driver versions)?
I don't know either way, but we're taught to doubt everything which does not have proof.

Quote:
I'm not sure if we can ever tell whether it's RGB or YCC that's more accurate on these LG TVs. All we know is that they behave a little differently relative to each other. I wouldn't take either as "golden".
That's a very good point, yes

Quote:
Originally Posted by ConnecTEDDD View Post
What is your plan? ...to try using your notebook output, with the HDFury to add the vendor infoframe and the CalMAN software generator for the patterns?

If you manage to make this work, it will require a bit-perfect 1080p60 full-range output from your VGA; if it works, it will mean it can output accurate levels for RGB.
Yes, that's my plan as I said, and I know it would require a bit-perfect output (it's not VGA, it's DP->HDMI as I explained). As I said, the whole point is that I do not know that it's bit-perfect. All I said was that this might be a way to test it, including the problems and doubts that I described. These problems and doubts are less evil to me than changing Contrast/OLED Light values. Neither will be perfect. I cannot afford perfection (a hardware DV pattern generator), so I am doing what I can.

_______________
Denon AVR-X4200W, Arcam Alpha 8P; 5.1.4 setup: Mission 702e, M7C1i, 77DS, 731. Rel T5 Sub. Monitor Audio CT165 4 Tops | LG OLED55C8PLA TV | Samsung UBD-K8500 UHD Blu-Ray

HDMI 2.0 4K modes | Dolby & DTS core+outer audio tracks on (UHD) Blu-Rays | Hello to Jason Isaacs
mrtickleuk is online now  
post #3368 of 3441 Old 09-10-2019, 02:24 PM
janos666 (Advanced Member)
Quote:
Originally Posted by mrtickleuk View Post
I don't know either way, but we're taught to doubt everything which does not have proof.
I agree that doubting and questioning everything is good, but assuming that everything is inherently broken might be considered paranoid. The latter will often leave you ignoring the easiest "perfect" solution and opting for complicated alternatives.
Endless content is created with nVidia (and sometimes AMD) cards (and the professional line often shares the same base silicon with the consumer line, as well as a huge portion of the driver code), multi-million dollar projects included.
I mentioned the dithering issue with AMD because content creators have looked into it, and it's known that some of them preferred nVidia for that reason (dithering turns off automatically when it's not needed, based on the input<>output format match). So I guess some of them would have noticed by now if nVidia (or even AMD) GPUs couldn't output 1:1 RGB from front buffer to HDMI. Yes, it is possible. But really likely? I guess not. For me, it's almost like asking if the Earth is flat. A lot of people have looked into that issue over the years already. We don't have to re-calculate Pi for ourselves every time we want the circumference of a circle (yeah, I guess that's a "pun intended", though I am not sure what that means ).
mrtickleuk likes this.

"DIY certified hobby-calibrator" (based on ChadB's "warning signs" list

Last edited by janos666; 09-10-2019 at 02:34 PM.
janos666 is online now  
post #3369 of 3441 Old 09-10-2019, 03:22 PM
mrtickleuk (AVS Forum Special Member)
Sure, we don't calculate Pi because there's proof. I'm being less paranoid than Teddd on this! And if I had an Nvidia card in a full PC, I'd be far more relaxed about this than a cheap Intel motherboard-based card from 2011.

It was just an idea of how I might test the laptop's bit-correctness. Maybe it's not worth it, I dunno.

_______________
Denon AVR-X4200W, Arcam Alpha 8P; 5.1.4 setup: Mission 702e, M7C1i, 77DS, 731. Rel T5 Sub. Monitor Audio CT165 4 Tops | LG OLED55C8PLA TV | Samsung UBD-K8500 UHD Blu-Ray

HDMI 2.0 4K modes | Dolby & DTS core+outer audio tracks on (UHD) Blu-Rays | Hello to Jason Isaacs
mrtickleuk is online now  
post #3370 of 3441 Old 09-11-2019, 02:10 PM
Guz911 (Member)
Quote:
Originally Posted by mrtickleuk View Post
Sure, we don't calculate Pi because there's proof. I'm being less paranoid than Teddd on this! And if I had an Nvidia card in a full PC, I'd be far more relaxed about this than a cheap Intel motherboard-based card from 2011.

It was just an idea of how I might test the laptop's bit-correctness. Maybe it's not worth it, I dunno.
Just made some tests with the DVDO AVLab TPG and all my devices, and these are my findings: the Chinoppo (Oppo UDP-203 clone), HiMedia Q10 Pro and Zidoo Z9S are all bit perfect with Ted's patterns using YCbCr 8-bit. The Apple TV 4th generation and my Surface Book 2 laptop (tested with both the Intel integrated GPU and the nVidia GPU), using both madVR and Calman Client 3, were not bit perfect. The Apple TV 4th generation was tested with YCbCr 4:4:4 and the Surface Book 2 was tested using RGB 8-bit. I did not bother testing the Apple TV 4K as it was already tested in the cheap pattern generator thread.

So I wouldn't consider any laptop bit perfect just because it has one of the latest Intel or nVidia GPUs.

Edit: Forgot to mention I used the Calman Client 3 bypass on the LG TV and the X-Rite calibration tester to reset the gamma curves. When moving the ColorChecker cursor on the DVDO, the RGB triplets changed values, indicating some processing of the signal by the non-bit-perfect devices (Apple TV and laptop).
ConnecTEDDD and mrtickleuk like this.

S/W: LightSpace HTL, CalMAN Home for LG
P/G: DVDO AVLab TPG, VideoForge Pro
Meters: i1PRO2, SpectraCAL C6 HDR2000
TV: LG Oled 65 c8, chinoppo udp-203, himedia q10 pro
AVR: Denon x-4500 5.1.4, SVS Prime Satellite 5.1, RSL C34E

Last edited by Guz911; 09-11-2019 at 02:53 PM.
Guz911 is online now  
post #3371 of 3441 Old 09-11-2019, 03:11 PM
mrtickleuk (AVS Forum Special Member)
Many thanks for that, @Guz911 ! I think all I can realistically do with the equipment I have is to take some readings with the Raspberry Pi, then switch to using the laptop's RGB output and re-measure, and see how similar the dEs look. Do enough runs to try to average out panel drift / meter drift. That's pretty much it. It's by no means a firm test though.
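
A rough sketch of that comparison, assuming the readings are exported as XYZ per patch for each run; the per-patch difference here is a crude Euclidean distance, not a proper dE formula.

Code:
import numpy as np

def average_runs(runs):
    # runs: (n_runs, n_patches, 3) XYZ readings; averaging several passes
    # helps wash out panel drift and meter drift between the two sources.
    return np.asarray(runs, dtype=float).mean(axis=0)

def compare_sources(pi_runs, laptop_runs):
    diff = np.linalg.norm(average_runs(pi_runs) - average_runs(laptop_runs), axis=1)
    return diff.mean(), diff.max()

# Fabricated numbers, only to show the shape of the data (3 runs x 4 patches).
rng = np.random.default_rng(1)
pi_runs = 50 + rng.normal(0, 0.2, size=(3, 4, 3))
laptop_runs = 50 + rng.normal(0, 0.2, size=(3, 4, 3))
print(compare_sources(pi_runs, laptop_runs))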

_______________
Denon AVR-X4200W, Arcam Alpha 8P; 5.1.4 setup: Mission 702e, M7C1i, 77DS, 731. Rel T5 Sub. Monitor Audio CT165 4 Tops | LG OLED55C8PLA TV | Samsung UBD-K8500 UHD Blu-Ray

HDMI 2.0 4K modes | Dolby & DTS core+outer audio tracks on (UHD) Blu-Rays | Hello to Jason Isaacs
mrtickleuk is online now  
post #3372 of 3441 Old 09-11-2019, 03:40 PM
Guz911 (Member)
Quote:
Originally Posted by mrtickleuk View Post
Many thanks for that, @Guz911 ! I think all I can realistically do with the equipment I have is to take some readings with the Raspberry Pi, then switch to using the laptop's RGB output and re-measure, and see how similar the dEs look. Do enough runs to try to average out panel drift / meter drift. That's pretty much it. It's by no means a firm test though.
I just did something similar LOL. Did the pre-cal measurements of calman home for LG (colorchecker classic + 11-point greyscale) and to my surprise dE avg and dE max looked pretty similar between DVDO, VFP and madVR. So you might not get perfection with your laptop's hdmi but might get close enough to the results of using a bit-perfect PG. It's probably more important to consider getting a spectro instead of a real expensive PG IMHO.

However, the results have to be looked at more closely because, as Ted states, dE errors don't indicate the direction of the error. So, as an example, if you are 1 dE away in the blue direction but the reference result is 1 dE away in the red direction, it is still a 1 dE difference in magnitude, but in reality you could be something like 1.8 dE away from the reference result.
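
A tiny worked example of that point, using dE76 (plain Euclidean distance in CIELAB) and made-up values: two readings can each be 1 dE from the reference and still be anywhere up to 2 dE apart from each other, depending on direction.

Code:
import numpy as np

ref    = np.array([50.0, 0.0,  0.0])   # reference L*, a*, b*
meas_a = np.array([50.0, 0.0, -1.0])   # 1 dE towards blue (-b*)
meas_b = np.array([50.0, 1.0,  0.0])   # 1 dE towards red  (+a*)

de76 = lambda p, q: float(np.linalg.norm(p - q))   # Euclidean distance in Lab

print(de76(ref, meas_a), de76(ref, meas_b))   # 1.0 and 1.0 vs. the reference
print(de76(meas_a, meas_b))                   # ~1.41 between the two readings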

S/W: LightSpace HTL, CalMAN Home for LG
P/G: DVDO AVLab TPG, VideoForge Pro
Meters: i1PRO2, SpectraCAL C6 HDR2000
TV: LG Oled 65 c8, chinoppo udp-203, himedia q10 pro
AVR: Denon x-4500 5.1.4, SVS Prime Satellite 5.1, RSL C34E

Last edited by Guz911; 09-11-2019 at 03:46 PM.
Guz911 is online now  
post #3373 of 3441 Old 09-11-2019, 03:55 PM
mrtickleuk (AVS Forum Special Member)
Quote:
Originally Posted by Guz911 View Post
I just did something similar LOL. Did the pre-cal measurements of calman home for LG (colorchecker classic + 11-point greyscale) and to my surprise dE avg and dE max looked pretty similar between DVDO, VFP and madVR. So you might not get perfection with your laptop's hdmi but might get close enough to the results of using a bit-perfect PG. It's probably more important to consider getting a spectro instead of a real expensive PG IMHO.
Oh, fully agreed. I've enjoyed this hobby, but there are limits to what I can/should sensibly spend. I have a bit-perfect Raspberry Pi and I can perfectly happily calibrate SDR and HDR10/HLG using my HDFury (another purchase just to do a tiny simple thing!).

So the last few days have been spent considering Dolby Vision calibration as a "would be nice". Even if I had the money, there's no way in hell I could see myself buying a hardware pattern generator as a priority, validating such a ludicrous racket (if I may say so). If I buy anything, it would be a spectro next, but Tyler pointed out in a post a while back that the EDRs I can use with my C6HDR2000 are created with a much more expensive, higher-resolution spectro than anything I could use to make a profile myself, say an X-Rite i1 Pro 2. So, more doubts; it might just be a waste of money.

Then someone mentioned that the X-Rite i1 Pro 3 was due out soon, and I thought I'd wait to see what the resolution of that Spectro turns out to be, compared with the X-Rite i1 Pro 2.

Quote:
The results have to be looked at more closely because, as Ted states, dE errors don't indicate the direction of the error. So, as an example, if you are 1 dE away in the blue direction but the reference result is 1 dE away in the red direction, it is still a 1 dE difference in magnitude, but in reality you could be something like 1.8 dE away from the reference result.
Yes, understood. I need a workflow page which lists all the numbers in a table for set A and set B, I think! I'll have to see which workflows and pages may be the most suitable. I have Ted's disc so I have his excellent Colour Comparison workflow, and the new Calman Home workflows also have a similar A/B page in them.

_______________
Denon AVR-X4200W, Arcam Alpha 8P; 5.1.4 setup: Mission 702e, M7C1i, 77DS, 731. Rel T5 Sub. Monitor Audio CT165 4 Tops | LG OLED55C8PLA TV | Samsung UBD-K8500 UHD Blu-Ray

HDMI 2.0 4K modes | Dolby & DTS core+outer audio tracks on (UHD) Blu-Rays | Hello to Jason Isaacs
mrtickleuk is online now  
post #3374 of 3441 Old 09-11-2019, 04:43 PM
Rolls-Royce (AVS Forum Special Member)
Quote:
Originally Posted by mrtickleuk View Post
Oh, fully agreed. I've enjoyed this hobby, but there are limits to what I can/should sensibly spend. I have a bit-perfect Raspberry Pi and I can perfectly happily calibrate SDR and HDR10/HLG using my HDFury (another purchase just to do a tiny simple thing!).

So the last few days have been spent considering Dolby Vision calibration as a "would be nice". Even if I had the money, there's no way in hell I could see myself buying a hardware pattern generator as a priority, validating such a ludicrous racket (if I may say so). If I buy anything, it would be a spectro next, but Tyler pointed out in a post a while back that the EDRs I can use with my C6HDR2000 are created with a much more expensive, higher-resolution spectro than anything I could use to make a profile myself, say an X-Rite i1 Pro 2. So, more doubts; it might just be a waste of money.

Then someone mentioned that the X-Rite i1 Pro 3 was due out soon, and I thought I'd wait to see what the resolution of that Spectro turns out to be, compared with the X-Rite i1 Pro 2.



Yes, understood. I need a workflow page which lists all the numbers in a table for set A and set B, I think! I'll have to see which workflows and pages may be the most suitable. I have Ted's disc so I have his excellent Colour Comparison workflow, and the new Calman Home workflows also have a similar A/B page in them.
X-Rite currently doesn't make a spectro with better than 10nm resolution, even in their very expensive desktop models. I strongly doubt the Pro 3 will be any improvement in that regard, my friend. If it is, you can expect the price to be commensurately higher.

...Royce...

"I never drink...wine."
Bela Lugosi, DRACULA, 1931
Rolls-Royce is online now  
post #3375 of 3441 Old 09-11-2019, 11:45 PM
ConnecTEDDD (AVS Forum Special Member)
Quote:
Originally Posted by mrtickleuk View Post
Then someone mentioned that the X-Rite i1 Pro 3 was due out soon, and I thought I'd wait to see what the resolution of that Spectro turns out to be, compared with the X-Rite i1 Pro 2
The X-Rite i1PRO3 Plus OEM has been released and has been available for sale from LightIllusion for over a month.

LightSpace already supports it as of the version released on 10 September 2019.

But there is no improvement over the i1PRO2 for Emissive mode measurements (display calibration measurements), because the extended luminance range up to 5,000 nits won't really be used when you create a meter correction table: you take those measurements at 100 nits even if you are profiling for SDR or HDR10 mode. You create the meter profile at 100 nits peak white so that the display is more stable.

The other updates improve performance in Reflectance mode (measuring papers/tiles), features we are not interested in for display calibration.

In Reflectance mode, the i1PRO1/2 use their internal tungsten lamp, while the i1PRO3 Plus uses a full-spectrum LED, to illuminate the object (the color on printed paper; since you use the meter in contact with the paper, if the lamp did not light up it would be like taking a dark reading). After that illumination (for which the i1PROs are certified, using about 25 colored ceramic tiles to test/pass the certification), the reflected light is analyzed by the i1PRO1/2/3.

When you are measuring a display, since the display has its own illumination, the meter works in Emissive mode and the internal lamp (tungsten or full-spectrum LED) is not used.

Ted's LightSpace CMS Calibration Disk Free Version for Free Calibration Software: LightSpace DPS / CalMAN ColorChecker / HCFR
S/W: LightSpace CMS, SpaceMan ICC, SpaceMatch DCM, CalMAN 5, CalMAN RGB, ChromaPure, ControlCAL
V/P: eeColor 3D LUT Box - P/G: DVDO AVLab TPG
Meters: JETI Specbos 1211, Klein K-10A, i1PRO2, i1PRO, SpectraCAL C6, i1D3, C5
ConnecTEDDD is online now  
post #3376 of 3441 Old 09-11-2019, 11:51 PM
ConnecTEDDD (AVS Forum Special Member)
Quote:
Originally Posted by mrtickleuk View Post
Then someone mentioned that the X-Rite i1 Pro 3 was due out soon, and I thought I'd wait to see what the resolution of that Spectro turns out to be, compared with the X-Rite i1 Pro 2.
The meter FOV is larger and the measurement frequency has increased, but the performance remains the same as the i1PRO2.

see the specs PDF:

https://www.avsforum.com/forum/139-d...l#post58337064
chunon and mrtickleuk like this.

Ted's LightSpace CMS Calibration Disk Free Version for Free Calibration Software: LightSpace DPS / CalMAN ColorChecker / HCFR
S/W: LightSpace CMS, SpaceMan ICC, SpaceMatch DCM, CalMAN 5, CalMAN RGB, ChromaPure, ControlCAL
V/P: eeColor 3D LUT Box - P/G: DVDO AVLab TPG
Meters: JETI Specbos 1211, Klein K-10A, i1PRO2, i1PRO, SpectraCAL C6, i1D3, C5
ConnecTEDDD is online now  
post #3377 of 3441 Old 09-12-2019, 09:15 AM
LeRoyK (Senior Member)
I still have a question. After doing my autocal of HDR with Brightness 50, Color 50, OLED Light 100, do I lose my calibrated 1D and 3D LUTs if I later change the Brightness value to get true black in HDR mode? I need Brightness 49 to be truly black.

LG OLED65C8PUA, Onkyo TX-RZ920, Sony UBP-X700, Apple TV 4K, Amazon Fire TV Stick 4K, AT&T Gigabit Internet & U-verse
LeRoyK is online now  
post #3378 of 3441 Old 09-12-2019, 10:35 AM
mrtickleuk (AVS Forum Special Member)
Quote:
Originally Posted by LeRoyK View Post
I still have a question. After doing my autocal of HDR with Brightness 50, Color 50, OLED Light 100, do I lose my calibrated 1D and 3D LUTs if I later change the Brightness value to get true black in HDR mode? I need Brightness 49 to be truly black.
Yes, but no.
  • If you change your Brightness afterwards, you wouldn't lose the LUTs, but they would be all wrong; they are created such that you need those values for them to be correct.
  • The "I need Brightness 49 on my panel" part should disappear during the calibration. You have a 1D LUT with 1,024 values in it between black and white. The calibration process fixes the black crush(1), sometimes with manual tweaking; that's part of the point of doing it. This is covered in Tyler's videos on YouTube (2).

So you should have no worries in using Brightness=50 afterwards, if it's been done correctly!

(1) Also far higher accuracy than you can achieve with the step changes of brightness between 49,50,51 etc.

(2) Comments at 9:46
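
A toy illustration of why the LUT can do this more finely than the Brightness control (hypothetical numbers and a made-up blend, not LG's actual LUT format): each of the 1,024 entries can be nudged individually, instead of shifting the whole signal by a whole "Brightness click".

Code:
# Start from an identity 10-bit 1D LUT (1,024 entries, codes 0..1023).
lut = list(range(1024))

def fix_near_black(lut, offset, knee=64):
    # Hypothetical near-black correction: blend a small offset into the
    # first `knee` entries only, fading to zero, leaving the rest untouched.
    fixed = lut[:]
    for i in range(knee):
        weight = 1.0 - i / knee
        fixed[i] = max(0, round(lut[i] + offset * weight))
    return fixed

corrected = fix_near_black(lut, offset=-2)
print(corrected[:8], corrected[60:68])   # only the darkest codes move, slightly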

_______________
Denon AVR-X4200W, Arcam Alpha 8P; 5.1.4 setup: Mission 702e, M7C1i, 77DS, 731. Rel T5 Sub. Monitor Audio CT165 4 Tops | LG OLED55C8PLA TV | Samsung UBD-K8500 UHD Blu-Ray

HDMI 2.0 4K modes | Dolby & DTS core+outer audio tracks on (UHD) Blu-Rays | Hello to Jason Isaacs

Last edited by mrtickleuk; 09-12-2019 at 10:52 AM.
mrtickleuk is online now  
post #3379 of 3441 Old 09-12-2019, 10:41 AM
mrtickleuk (AVS Forum Special Member)
Quote:
Originally Posted by ConnecTEDDD View Post
The X-Rite i1PRO3 Plus OEM has been released and has been available for sale from LightIllusion for over a month.

LightSpace already supports it as of the version released on 10 September 2019.

But there is no improvement over the i1PRO2 for Emissive mode measurements (display calibration measurements), because the extended luminance range up to 5,000 nits won't really be used when you create a meter correction table: you take those measurements at 100 nits even if you are profiling for SDR or HDR10 mode. You create the meter profile at 100 nits peak white so that the display is more stable.
Thanks @ConnecTEDDD for your very comprehensive answer. It seems clear that there is no benefit in getting the X-Rite i1 Pro 3 over the X-Rite i1 Pro 2. Also, there is no benefit to me buying an X-Rite i1 Pro 2 to profile my LG C8, because I would only be able to create a low-resolution profile and there's no way it could ever be more accurate than Spectracal's very high resolution one - even though theirs is a different LG C8 and not my personal LG C8.

_______________
Denon AVR-X4200W, Arcam Alpha 8P; 5.1.4 setup: Mission 702e, M7C1i, 77DS, 731. Rel T5 Sub. Monitor Audio CT165 4 Tops | LG OLED55C8PLA TV | Samsung UBD-K8500 UHD Blu-Ray

HDMI 2.0 4K modes | Dolby & DTS core+outer audio tracks on (UHD) Blu-Rays | Hello to Jason Isaacs
mrtickleuk is online now  
post #3380 of 3441 Old 09-12-2019, 02:11 PM
Rolls-Royce (AVS Forum Special Member)
Quote:
Originally Posted by mrtickleuk View Post
Thanks @ConnecTEDDD for your very comprehensive answer. It seems clear that there is no benefit in getting the X-Rite i1 Pro 3 over the X-Rite i1 Pro 2. Also, there is no benefit to me buying an X-Rite i1 Pro 2 to profile my LG C8, because I would only be able to create a low-resolution profile and there's no way it could ever be more accurate than Spectracal's very high resolution one - even though theirs is a different LG C8 and not my personal LG C8.
TWO variables there: not your TV and not your meter...

...Royce...

"I never drink...wine."
Bela Lugosi, DRACULA, 1931
Rolls-Royce is online now  
post #3381 of 3441 Old 09-12-2019, 02:47 PM
mrtickleuk (AVS Forum Special Member)
Quote:
Originally Posted by Rolls-Royce View Post
TWO variables there: not your TV and not your meter...
Still far more accurate than anything I could do with an X-Rite i1 Pro 2.

_______________
Denon AVR-X4200W, Arcam Alpha 8P; 5.1.4 setup: Mission 702e, M7C1i, 77DS, 731. Rel T5 Sub. Monitor Audio CT165 4 Tops | LG OLED55C8PLA TV | Samsung UBD-K8500 UHD Blu-Ray

HDMI 2.0 4K modes | Dolby & DTS core+outer audio tracks on (UHD) Blu-Rays | Hello to Jason Isaacs
mrtickleuk is online now  
post #3382 of 3441 Old 09-12-2019, 03:08 PM
D-Nice (AVS Forum Addicted Member)
Quote:
Originally Posted by mrtickleuk View Post
Still far more accurate than anything I could do with an X-Rite i1 Pro 2.
I would say it’s about the same.
Rolls-Royce likes this.
D-Nice is offline  
post #3383 of 3441 Old 09-12-2019, 03:54 PM
mrtickleuk (AVS Forum Special Member)
Quote:
Originally Posted by D-Nice View Post
I would say it’s about the same.
Ok, thanks. That's very interesting.

_______________
Denon AVR-X4200W, Arcam Alpha 8P; 5.1.4 setup: Mission 702e, M7C1i, 77DS, 731. Rel T5 Sub. Monitor Audio CT165 4 Tops | LG OLED55C8PLA TV | Samsung UBD-K8500 UHD Blu-Ray

HDMI 2.0 4K modes | Dolby & DTS core+outer audio tracks on (UHD) Blu-Rays | Hello to Jason Isaacs
mrtickleuk is online now  
post #3384 of 3441 Old 09-13-2019, 12:03 AM
Dirk Pajonk (Member)
Hello,

I want to connect my PC for gaming to the OLED55B8.
1. Which icon do I have to select for the HDMI port? Game or PC?
2. Which picture profile do I have to set?
- For Game Mode the color gamut is locked!
- Can I also use ISF Dark Room with all picture enhancements disabled, and will I get the same input lag as Game Mode?

Thanks
Dirk Pajonk is offline  
post #3385 of 3441 Old 09-13-2019, 04:35 AM
janos666 (Advanced Member)
Quote:
Originally Posted by mrtickleuk View Post
Still far more accurate than anything I could do with an X-Rite i1 Pro 2.
This is an old article about a test with an admittedly limited sample size (both in terms of the number of individual probes and of display types), but ever since I read it I tend to assume that an i1d3 colorimeter is roughly as accurate (on an absolute scale, with bright patches) as an i1Pro spectro (neither should be treated as inherently superior) when it comes to reasonably common display types: https://www.drycreekphoto.com/Learn/Calibration/MonitorCalibrationHardware.html
After all, the i1d3 is a relatively great colorimeter (the units have spectral sensitivity profiles and stable filters, etc.) and the i1Pro is a fairly limited spectrophotometer (10nm resolution is not that great, especially not when it comes to the high-end display panels of this era with all kinds of LEDs and QDs). Thus, one should probably trust an i1d3 over an i1Pro (including all revisions) when it comes to a display for which you have a good spectral profile (like an EDR made with a <=3nm high-quality spectro). And I would consider it a 50/50 bet when you have no profile (and choose the so-called factory "raw" table or the closest assumption, like RGB OLED instead of WRGB OLED, etc.).
And I doubt it makes any significant difference whether you profile your own display panel or any other of the same kind (e.g. two LG OLEDs from the same year). The WRGB patches should show the same spectral shapes, and this correction is not huge anyway (just compare the results of any two similar factory tables, like the RGB vs. WRGB OLED tables, and see how much it matters --- I mean, it varies from unit to unit, but my old i1d3 barely cares about any EDR to begin with; the "raw" table and any X-Rite-supplied EDR result in a dE<=1 difference).
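
For context, a correction of that kind essentially maps the colorimeter's readings onto a reference meter's readings of the same patches. A minimal sketch with made-up XYZ numbers, in the spirit of a 3x3 matrix correction rather than the actual EDR/FCMM file formats:

Code:
import numpy as np

# Hypothetical readings of the same R, G, B patches. Columns are the patches,
# rows are X, Y, Z; one set from a reference spectro, one from the colorimeter.
xyz_ref = np.array([[41.2, 35.8, 18.0],
                    [21.3, 71.5,  7.2],
                    [ 1.9,  9.5, 95.0]])
xyz_meter = np.array([[40.1, 36.5, 17.4],
                      [20.8, 70.2,  7.6],
                      [ 2.1,  9.1, 96.3]])

# 3x3 correction matrix so that M @ meter_reading ~= reference_reading.
M = xyz_ref @ np.linalg.inv(xyz_meter)

raw = np.array([25.0, 30.0, 12.0])   # some patch measured by the colorimeter (XYZ)
print(np.round(M @ raw, 2))          # the same patch after correction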

"DIY certified hobby-calibrator" (based on ChadB's "warning signs" list

Last edited by janos666; 09-13-2019 at 05:05 AM.
janos666 is online now  
post #3386 of 3441 Old 09-13-2019, 07:43 AM - Thread Starter
jrref (AVS Forum Special Member)
Quote:
Originally Posted by janos666 View Post
This is an old article about a test with an admittedly limited sample size (both in terms of the number of individual probes and of display types), but ever since I read it I tend to assume that an i1d3 colorimeter is roughly as accurate (on an absolute scale, with bright patches) as an i1Pro spectro (neither should be treated as inherently superior) when it comes to reasonably common display types: https://www.drycreekphoto.com/Learn/Calibration/MonitorCalibrationHardware.html
After all, the i1d3 is a relatively great colorimeter (the units have spectral sensitivity profiles and stable filters, etc.) and the i1Pro is a fairly limited spectrophotometer (10nm resolution is not that great, especially not when it comes to the high-end display panels of this era with all kinds of LEDs and QDs). Thus, one should probably trust an i1d3 over an i1Pro (including all revisions) when it comes to a display for which you have a good spectral profile (like an EDR made with a <=3nm high-quality spectro). And I would consider it a 50/50 bet when you have no profile (and choose the so-called factory "raw" table or the closest assumption, like RGB OLED instead of WRGB OLED, etc.).
And I doubt it makes any significant difference whether you profile your own display panel or any other of the same kind (e.g. two LG OLEDs from the same year). The WRGB patches should show the same spectral shapes, and this correction is not huge anyway (just compare the results of any two similar factory tables, like the RGB vs. WRGB OLED tables, and see how much it matters --- I mean, it varies from unit to unit, but my old i1d3 barely cares about any EDR to begin with; the "raw" table and any X-Rite-supplied EDR result in a dE<=1 difference).
This is an interesting comment, but you need to be careful. As Ted has stated many times, and as common-sense reasoning suggests, there will be variances between i1d3s, and you will never know how accurate or inaccurate yours is unless you have some reference device to compare it to. The i1d3 is a very inexpensive device, but it's good for the money if you are not trying to get perfection.

John
Sony 55A1E, A9F / LG 55OLEDC8
Marantz 7012, Ohm Walsh Speakers
Klein K10-A, Jeti 1501, Murideo Six-G Gen2
Calman Ultimate, ISF Level III Certified
jrref is offline  
post #3387 of 3441 Old 09-13-2019, 08:00 AM
janos666 (Advanced Member)
Quote:
Originally Posted by jrref View Post
This is an interesting comment, but you need to be careful. As Ted has stated many times, and as common-sense reasoning suggests, there will be variances between i1d3s, and you will never know how accurate or inaccurate yours is unless you have some reference device to compare it to. The i1d3 is a very inexpensive device, but it's good for the money if you are not trying to get perfection.
The same applies to the i1Pro (all revisions up to the Pro 3): relatively cheap when it comes to spectrophotometers (and equally limited in accuracy).
But sure, I kept the i1Pro 2 I had so I can periodically compare their results (and assume that they wouldn't drift the same way, given their fundamental differences), and assume that they are both within the expected margin of error as long as they mostly agree. But I don't blindly trust the i1Pro over the i1d3 within this close range of results (actually, I prefer to trust the i1d3 when the EDR is supposedly close enough).

"DIY certified hobby-calibrator" (based on ChadB's "warning signs" list
janos666 is online now  
post #3388 of 3441 Old 09-13-2019, 09:58 AM
mrtickleuk (AVS Forum Special Member)
Quote:
Originally Posted by janos666 View Post
This is an old article about a test with an admittedly limited sample size (both in terms of the number of individual probes and of display types), but ever since I read it I tend to assume that an i1d3 colorimeter is roughly as accurate (on an absolute scale, with bright patches) as an i1Pro spectro (neither should be treated as inherently superior) when it comes to reasonably common display types: https://www.drycreekphoto.com/Learn/Calibration/MonitorCalibrationHardware.html
After all, the i1d3 is a relatively great colorimeter (the units have spectral sensitivity profiles and stable filters, etc.) and the i1Pro is a fairly limited spectrophotometer (10nm resolution is not that great, especially not when it comes to the high-end display panels of this era with all kinds of LEDs and QDs). Thus, one should probably trust an i1d3 over an i1Pro (including all revisions) when it comes to a display for which you have a good spectral profile (like an EDR made with a <=3nm high-quality spectro). And I would consider it a 50/50 bet when you have no profile (and choose the so-called factory "raw" table or the closest assumption, like RGB OLED instead of WRGB OLED, etc.).
Very interesting, and thank you also.

Quote:
And I doubt it makes any significant difference whether you profile your own display panel or any other of the same kind (e.g. two LG OLEDs from the same year). The WRGB patches should show the same spectral shapes, and this correction is not huge anyway (just compare the results of any two similar factory tables, like the RGB vs. WRGB OLED tables, and see how much it matters --- I mean, it varies from unit to unit, but my old i1d3 barely cares about any EDR to begin with; the "raw" table and any X-Rite-supplied EDR result in a dE<=1 difference).
All good to know! I think it really helps me to get it into perspective.

Quote:
Originally Posted by janos666 View Post
The same applies to the i1Pro (all revisions up to the Pro 3): relatively cheap when it comes to spectrophotometers (and equally limited in accuracy).
But sure, I kept the i1Pro 2 I had so I can periodically compare their results (and assume that they wouldn't drift the same way, given their fundamental differences), and assume that they are both within the expected margin of error as long as they mostly agree. But I don't blindly trust the i1Pro over the i1d3 within this close range of results (actually, I prefer to trust the i1d3 when the EDR is supposedly close enough).
Everyone keeps saying "i1d3", which I assume is the FSI-badged one here:
https://www.shopfsi.com/i1D3OEM-p/i1d3oem.htm

which looks exactly like the i1displayPro here: https://www.xrite.com/categories/cal.../i1display-pro (Model #: eodis3)

Separately I am well aware from Teddd's website here that the Rev. B models can read up to 2,000 nits; my SpectraCAL branded C6-HDR2000, I am brashly reading from the name, is one of those Rev. B versions.

My point of confusion (and it's really X-Rite's fault for their model names) is the use of the term "i1d3" which I think only FSI uses? Where does the "3" even come from?!

_______________
Denon AVR-X4200W, Arcam Alpha 8P; 5.1.4 setup: Mission 702e, M7C1i, 77DS, 731. Rel T5 Sub. Monitor Audio CT165 4 Tops | LG OLED55C8PLA TV | Samsung UBD-K8500 UHD Blu-Ray

HDMI 2.0 4K modes | Dolby & DTS core+outer audio tracks on (UHD) Blu-Rays | Hello to Jason Isaacs
mrtickleuk is online now  
post #3389 of 3441 Old 09-13-2019, 04:11 PM
janos666 (Advanced Member)
Quote:
Originally Posted by mrtickleuk View Post
My point of confusion (and it's really X-Rite's fault for their model names) is the use of the term "i1d3" which I think only FSI uses? Where does the "3" even come from?!
It's one of the several codenames; I prefer to use it when I want to refer to all editions and revisions of this probe. The "Display Pro" (i1d3) is the successor of the "i1 Display 2" (which was developed by GretagMacbeth, if I recall, before X-Rite bought the Swiss manufacturer).
Quote:
Originally Posted by mrtickleuk View Post
All good to know! I think it really helps me to get it into perspective.
Do check this with your probe though. Yours could be different. (Just try two roughly similar factory EDRs and compare the results...)

"DIY certified hobby-calibrator" (based on ChadB's "warning signs" list
janos666 is online now  
post #3390 of 3441 Old 09-14-2019, 01:50 AM
mrtickleuk (AVS Forum Special Member)
Quote:
Originally Posted by ConnecTEDDD View Post
What HDR10 metadata did you use? It looks so low for HDR10 at 400 nits.
A late answer to this question, and to the problem of my TV apparently only going up to 400 nits. I used metadata created with the AVTopController in Calman.

BT2020 (AVI) Format: RGB
Mastering Primaries: P3
Mastering WP: D65
Max MDL: 1000
Min MDL: 0.005
MaxCLL: 1000
MaxFALL: 400

HDR infoframe: 87:01:1a:74:02:00:c2:33:c4:86:4c:1d:b8:0b:d0:84:80:3e:13:3d:42:40:e8:03:32:00:e8:03:90:01
AVI infoframe: 00:E8:64:5D:00

But, I don't like that AVI infoframe. ["5D" = 2160p24 which is a lie for my video card. So I used "10": 1080p60]

AVI infoframe: 00:E8:64:10:00

I then pasted these manually into the Integral GUI app and saw the TV switch into HDR10 mode.
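
As a sanity check on those bytes, a small decoder sketch is shown below. It assumes the usual CTA-861.3 layout (type/version/length/checksum, then EOTF, descriptor, three primaries, white point, max/min mastering luminance, MaxCLL, MaxFALL, with 16-bit fields stored little-endian); run on the string above, it reproduces the P3 primaries, D65 white point, 1000 / 0.005 nit mastering levels, MaxCLL 1000 and MaxFALL 400.

Code:
def parse_drm_infoframe(hex_str):
    # CTA-861.3 Dynamic Range and Mastering InfoFrame (type 0x87).
    b = bytes(int(x, 16) for x in hex_str.replace(" ", "").split(":"))
    assert b[0] == 0x87 and sum(b) % 256 == 0, "bad type or checksum"
    u16 = lambda i: b[i] | (b[i + 1] << 8)           # little-endian 16-bit field
    return {
        "eotf": b[4],                                # 2 = SMPTE ST 2084 (PQ)
        "primaries_xy": [(u16(6 + 4 * k) * 0.00002,  # chromaticities stored in
                          u16(8 + 4 * k) * 0.00002)  # 0.00002 units
                         for k in range(3)],
        "white_point_xy": (u16(18) * 0.00002, u16(20) * 0.00002),
        "max_mdl_nits": u16(22),
        "min_mdl_nits": u16(24) * 0.0001,
        "max_cll": u16(26),
        "max_fall": u16(28),
    }

frame = ("87:01:1a:74:02:00:c2:33:c4:86:4c:1d:b8:0b:d0:84:"
         "80:3e:13:3d:42:40:e8:03:32:00:e8:03:90:01")
print(parse_drm_infoframe(frame))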

I was doing experiments last night with the Raspberry Pi's output vs my laptop's output.

Quote:
Originally Posted by mrtickleuk View Post
Sorry, but I can't remember. I will have to try again (hopefully this weekend) and pay more attention to all the settings in the Calman metadata creation boxes, and to what I set in the HDFury PC Utility for the AVI Infoframe too. It does sound like I got one of the settings wrong. I wasn't too worried at the time because I remember I measured 742 nits last year. But there are so many different things to remember to do, all in the perfect order, and one tiny mistake and it's all wasted.
Quote:
Originally Posted by ConnecTEDDD View Post
What is your plan? ...to try using your notebook output, with the HDFury to add the vendor infoframe and the CalMAN software generator for the patterns?

If you manage to make this work, it will require a bit-perfect 1080p60 full-range output from your VGA; if it works, it will mean it can output accurate levels for RGB.
Right. If I understand you correctly, what you are saying is that the Dolby Vision pattern source window in Calman can be used as a way to test if the laptop is bit accurate? Instead of all my pain of comparing readings - it's a proper full test of bit accuracy?

In that case I have a failure because on the "test" page I got a red/yellow pattern on the TV and it didn't switch into Dolby Vision mode.

Tyler mentions this only very quickly here:


So, I think the answer I have now is that my laptop isn't bit accurate, and/or that method simply doesn't work.

Back to my problem of only getting 400 nits: I turned on TPC in the service menu, and all the processing was turned off (logo luminance adjustment, dynamic tone mapping, dynamic contrast etc.). Is there anything else that might be causing it?

_______________
Denon AVR-X4200W, Arcam Alpha 8P; 5.1.4 setup: Mission 702e, M7C1i, 77DS, 731. Rel T5 Sub. Monitor Audio CT165 4 Tops | LG OLED55C8PLA TV | Samsung UBD-K8500 UHD Blu-Ray

HDMI 2.0 4K modes | Dolby & DTS core+outer audio tracks on (UHD) Blu-Rays | Hello to Jason Isaacs
mrtickleuk is online now  