Okay, I'll do it. I'm still curious, and I know the rest of the old-timers are going to have a stroke if we keep talking about this on the other thread.
Here's why I started this thread.
One, there's this video: http://www.youtube.com/watch?v=ijq0mV9GPbY. It's infamous now, given that the person who shot it, video313, is unnecessarily abrasive with everyone who tries to weigh in on the subject. Consequently, he is encouraged to leave this thread alone so as to avoid chasing away people who might help us answer these questions.
Two, people other than video313 have seen the same flicker issue in both the Pure Cinema "Standard" and "Advanced" settings, so it does not appear to be a complete fluke:
Originally Posted by sheedoe
Please don't kill me for bringing this up, but I have a question regarding the PC modes and 60Hz vs 72Hz (yes I know... here we go again!).
Last night, while I was setting up my G9 digital camera to take some screen shots, I noticed something on my camera's LCD screen similar to video313's finding here. The camera was set at 1/10 shutter speed and f/8.0 aperture.
The PS3 was set to output 1080/24P, and my Onkyo 885 pre/pro video out was set to "Through" to ensure a proper 1080/24P feed.
With a 1080/24P feed from the PS3 (playing the Wall-E Blu-ray), switching between the different PC modes, I noticed a faster flicker (of the same type) in the "Standard" and "Advanced" modes, and a slower flicker (of the same type) in the "Off" and "Smooth" modes, same as in video313's YouTube video.
With a non-1080/24P feed from my digital cable box, I noticed a slow flicker in all modes, as if they were all being refreshed at 60Hz.
Now I know a camera is not a proper tool for measuring refresh rate. But:
1) Why am I seeing these distinct flicker patterns between the two pairs of PC modes?
2) Is it the refresh rate that is being picked up by the camera? If not, what is it?
If it is indeed the refresh rate, then it would seem that video313's observation is correct: both the "Standard" and "Advanced" modes are refreshing at something like 72Hz, and the other two modes are refreshing at 60Hz. I'd really appreciate an answer, and thanks in advance.
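An aside on sheedoe's camera question, since it will surely come up again: the camera can't read the refresh rate directly, but the beat (aliasing) between the panel's refresh and the camera's sampling could plausibly produce exactly the "faster flicker here, slower flicker there" pattern he describes. Here's a back-of-the-envelope sketch in Python; the live-view rates in it are pure assumptions on my part, not G9 specs:

```python
# Back-of-the-envelope sketch of why a camera can show a "fast" flicker
# at one panel refresh rate and a "slow" one at another. The live-view
# rates below are assumptions for illustration only -- I don't know
# what the G9's preview actually samples at.

def cycles_per_exposure(refresh_hz, shutter_s):
    """How many panel refresh cycles fit inside one camera exposure."""
    return refresh_hz * shutter_s

def beat_hz(refresh_hz, sample_hz):
    """Apparent flicker (aliasing) rate when a panel refreshing at
    refresh_hz is sampled at sample_hz: the distance to the nearest
    integer multiple of the sampling rate."""
    return abs(refresh_hz - round(refresh_hz / sample_hz) * sample_hz)

SHUTTER = 1 / 10  # sheedoe's 1/10 s shutter speed

for live_view in (30.0, 29.97):  # assumed live-view sample rates
    for refresh in (60.0, 72.0):
        print(f"panel {refresh:4.0f} Hz, live view {live_view:5.2f} Hz: "
              f"{cycles_per_exposure(refresh, SHUTTER):3.1f} cycles/exposure, "
              f"~{beat_hz(refresh, live_view):5.2f} Hz apparent flicker")
```

With these made-up numbers, a 72Hz panel aliases at roughly 12Hz (a fast flicker) while a 60Hz panel aliases at or near 0Hz (a very slow drift), which at least matches the "faster vs. slower" pattern sheedoe describes. But the real answer depends entirely on how the camera samples, so take this as a possibility, not a measurement.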
Three, other people have apparently noticed that when the signal is 1080p/24, the TV will not use the ControlCal-calibrated grayscale, so there is at least some corroborating evidence that SOMETHING different is going on in Standard mode versus Off mode when the TV receives a 1080p/24 signal:
Originally Posted by Michael St. Clair
This is true except with 1080p/24 native input. I did ControlCal greyscale adjustment this weekend. In 1080p/60 plus 1080i/720p/480p/480i, it is true that Off, Standard, and Smooth use the modified greyscale. However, with 1080p/24, Standard goes back to using the original greyscale and only Off/Smooth use the adjusted one. I verified this both with my eye, and photographically (taking DSLR pictures of test patterns of the different modes using manual settings). I even set up a greenish grayscale, temporarily, to make it easier to spot the differences at a glance (no, nothing so green that it would wear the green phosphors unevenly, and even then only for a few minutes).
This has been observed by others:
http://www.avsforum.com/avs-vb/showt...7#post15891567
http://www.controlcal.com/forum/show...&postcount=226
So here is the question I hope we might punch around. It has nothing to do with the perceived "smoothness" of the video, so stay away, video313 - by your own admission this was not your original question, nor what you are interested in discussing. The question is:
What could explain the apparent flicker when a video camera views 1080p/24 content on a PDP-5020FD set to "Standard" or "Advanced" mode, and why doesn't it show up in "Off" mode, given that "Off" is apparently also refreshing at 72Hz (the TV automatically switches to 72Hz when it detects a 1080p/24 signal, even when left in "Off" mode)?
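For anyone joining fresh, the refresh-rate arithmetic behind that question is just standard pulldown math, nothing Pioneer-specific: a 24fps film frame can be shown 3 times in a row to land on 72Hz, or alternately 3 and 2 times to land on 60Hz. A tiny sketch:

```python
# Standard pulldown math (not Pioneer-specific): expanding 24 fps film
# frames into panel refreshes. Four film frames span 1/6 s at 24 fps.

def pulldown(frames, pattern):
    """Repeat each film frame per the cadence pattern, cycling it."""
    out = []
    for i, frame in enumerate(frames):
        out.extend([frame] * pattern[i % len(pattern)])
    return out

film = ["A", "B", "C", "D"]  # 1/6 s of 24 fps film

print(pulldown(film, [3]))     # 12 refreshes in 1/6 s -> 72 Hz (3:3, even)
print(pulldown(film, [3, 2]))  # 10 refreshes in 1/6 s -> 60 Hz (3:2, uneven)
```

The relevant point: at 72Hz every film frame gets equal screen time, so if both "Off" and "Standard" really are running the same 3:3 cadence on a 1080p/24 feed, they should in principle look identical, which is exactly why the different flicker signatures are so puzzling.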
One thought as a jumping-off point: is it possible that the processing algorithms for performing inverse telecine are still running in the Standard and Advanced modes (albeit unnecessarily with 24fps content), and that, as a result of the power demands of those processing circuits, the set is changing the frequency of the current being supplied to the panel? Is that crazy? Any thoughts would definitely intrigue some of us (at least me and sheedoe). To be clear about what I mean by inverse telecine, there's a toy sketch below.
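Inverse telecine is essentially cadence detection: the processor looks for the repeating 3:2 pattern of duplicated frames/fields in a 60Hz stream and discards the duplicates to recover the original 24 film frames. This version is purely illustrative (real detectors compare actual picture content and track cadence history; I have no idea what Pioneer's silicon actually does):

```python
# Toy inverse telecine: collapse a 3:2 pulldown sequence back to the
# original film frames by dropping consecutive duplicates. Purely
# illustrative -- this says nothing about Pioneer's real implementation.

def inverse_telecine(refreshes):
    """Collapse runs of identical refreshes back into film frames."""
    film = []
    for frame in refreshes:
        if not film or frame != film[-1]:
            film.append(frame)
    return film

sixty_hz = ["A", "A", "A", "B", "B", "C", "C", "C", "D", "D"]  # 3:2 cadence
print(inverse_telecine(sixty_hz))  # ['A', 'B', 'C', 'D'] -> back to 24 fps
```

The point for the hypothesis: with a native 1080p/24 input there are no duplicates to find, so if that circuitry were still active in Standard/Advanced, it would be churning away for nothing, and whatever side effects it has (power draw, timing) would come along for the ride.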
Thanks, and - you can't see this - but I have my hands up as if to say "don't kill me! - I'm just asking a question!" Hopefully we can have a fun, insult-free discussion of this without getting anyone too riled up.