Originally Posted by Nektarios
Agreed, but you missed where I said that displaying the same frame multiple times = effective retinal persistence; do you disagree with that?
I don't understand why a scanning gun that uniformly scans the entire screen line by line at the exact same speed would produce different amounts of flicker on different sections of it. It doesn't make sense.
Generally, displaying the same frame multiple times does not create retinal persistence (I am excluding brainwashing here), and I guess eyes don't have IR... Persistence is proportional to luminance and, I think, contrast. Hence you get retinal persistence when shown, e.g., headlights. Against a dark background with a single light, your retina will also hold an afterimage of the light.
So assuming a plasma is showing a STILL picture at 60fps, you will not get retinal persistence; per xrox's chart, the luminance tapers off fast enough for that effect not to happen. But an LCD showing a still picture at 60fps is almost the same as displaying it at 1fps: very sharp and clear. A plasma at 1fps, on the other hand, would visibly go black between frames. Hence LCD is perfect for still images and PC use. But once you change from still to moving content at 60fps, you will notice the blur on the LCD. Though to be fair, part of the reason is also the lag in the crystal transition.
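To put a rough number on the hold-type blur described above: while the eye smoothly tracks a moving object, a sample-and-hold display keeps each frame static, so the frame gets smeared across the retina for as long as it is held. A toy calculation (my own illustration, not from any poster; the pan speed is an assumed figure):

```python
def hold_blur_px(pan_speed_px_per_s: float, hold_time_s: float) -> float:
    """Approximate retinal smear width on a sample-and-hold display.

    While the eye tracks a moving object, each static frame is dragged
    across the retina for as long as it is held on screen.
    """
    return pan_speed_px_per_s * hold_time_s

# An assumed 960 px/s pan, with each 60fps frame held for the full period:
full_hold = hold_blur_px(960, 1 / 60)
print(round(full_hold))  # ~16 px of smear
```

At the same pan speed, shortening the hold time (as a plasma's fast luminance decay effectively does) shrinks the smear proportionally.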
The distance from the gun to the top and bottom of a CRT screen differs from the distance to the middle, hence the middle is brighter. But this is fine, as the human eye will perceive it as normal under a 2.2 gamma. There's a lot of literature on this, and I must admit I don't understand all of it; hence my earlier question of whether modern plasmas actually adjust for this human perception. I am assuming they don't, because D-Nice has categorically said 2.2 is not the right figure for FPTV calibration.
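For context, the 2.2 gamma mentioned here is a simple power law relating normalized signal level to displayed luminance. A minimal sketch (illustrative only, not a full calibration model):

```python
def luminance(signal: float, gamma: float = 2.2) -> float:
    """Relative luminance (0.0-1.0) for a normalized signal level
    under a plain power-law gamma; real displays add more terms."""
    return signal ** gamma

# At gamma 2.2 a 50% signal level produces only about 22% luminance,
# which roughly matches the eye's non-linear brightness perception.
print(round(luminance(0.5), 3))
```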
Originally Posted by sonic_blue
This confuses me. If they can already achieve perfect motion resolution as long as the framerate is high enough (albeit via interpolation), why can't they just put in, say, a blanking frame instead of those interpolated frames, and achieve perfect motion resolution at normal lower framerates. Would it introduce flicker? Some other artefact perhaps?
They did. It is called Black Frame Insertion (BFI). BFI on an LCD tries to emulate the luminance drop of a plasma to reduce the hold effect, as explained above, but I think it is not very successful due to the slow crystal transition. Hence newer LCDs use MCFI (motion-compensated frame interpolation) rather than BFI.
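The arithmetic behind why BFI helps, as a hedged sketch (the 50% duty cycle is an assumption for illustration; real sets vary):

```python
def effective_hold_s(frame_rate_hz: float, duty_cycle: float) -> float:
    """Time each frame is actually lit, given the fraction of the
    frame period spent showing the image rather than black."""
    return (1.0 / frame_rate_hz) * duty_cycle

full = effective_hold_s(60, 1.0)   # plain sample-and-hold LCD
bfi = effective_hold_s(60, 0.5)    # 50% black frame insertion
print(full / bfi)  # -> 2.0: BFI at 50% duty halves the hold time
```

Since the retinal smear scales with hold time, halving the hold time halves the smear, provided the panel can actually switch to black fast enough, which is the crystal-transition caveat above.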
Originally Posted by Jogi
I really don't understand this whole motion blur issue... I had a 50/60Hz Sony LCD RPTV and it handled normal TV/DVD/Xbox 360 gaming better than my current Samsung 63" 3D plasma because of motion blur!!
60fps gaming and 50i TV show fine on both, but when showing "25p" TV material or 30fps gaming it really tends to be less enjoyable on the plasma, as my old LCD blurs fast movements a little bit and just seems smoother. On the plasma, 25/30fps material just seems jerky/sudden and creates more "double vision" by showing the same frame 2 times in a row without any built-in blurring (with a Blu-ray it is.... "4 vision"; during fast panning I sometimes see 4 copies of the same frame, 24*4 = 96Hz, just an eye/brain thing I know..)
Well, at least with this plasma 60fps gaming is free of this frame doubling/tripling issue. I had a Samsung 120Hz LCD TV for a few days and it doubled 60Hz to 120Hz even during gaming with game mode on!!!! Just horrible double vision.
So I cannot understand why to have any of this 120/240Hz LCD stuff with backlight flickering and black frame insertion, because plain old 50/60Hz LCD tech was just perfectly fine: absolutely no flickering and smooth movement with 50/60Hz & 25/30Hz material.
Frankly, Jogi, I am not sure I understand your post. Firstly, LCD RPTV and LCD FPTV are very different, unless I am missing something here? Secondly, you are the only one I know of who can see motion frame by frame... if that is true, I have to withdraw my skepticism about <33ms input lag making a perceivable difference.
If your claim is right, can you actually see a black frame during BFI??
Motion stutter on plasma from low-fps sources is a different topic, as explained previously. To put it more precisely and avoid confusion: motion stutter is evident on ALL displays when the content runs at 24Hz. I'm trying to say this is not a display-dependent issue; it is a source issue.
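The source-side nature of low-frame-rate stutter can be shown with a toy pan: whatever the panel's refresh rate, a 24fps source only ever draws the object at 1/24s intervals, and that coarse step size is what reads as stutter (a sketch with a made-up pan speed):

```python
def pan_positions(fps: int, pan_px_per_s: float, duration_s: float):
    """Object positions actually drawn for a pan at a given source
    frame rate; the step size is what the eye perceives as stutter."""
    n = int(fps * duration_s)
    return [round(pan_px_per_s * i / fps, 1) for i in range(n)]

# The same assumed 480 px/s pan at 24fps vs 60fps source:
print(pan_positions(24, 480, 0.25))  # 20 px jumps per frame
print(pan_positions(60, 480, 0.25))  # 8 px jumps per frame
```

No display processing changes the fact that the 24fps source only contains the coarser set of positions; repeating frames (plasma) or holding them (LCD) just presents the same jumps differently.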
P.S. Jogi, you may need to reduce the sharpness control on your plasma to soften the picture. Or it could be that your plasma fails the telecine test, which is not uncommon based on some threads I've read here and Chad B's reviews.
Originally Posted by specuvestor
This is different from the main topic of the thread. It is due to the low frame rate, which affects panning, e.g. in a 24fps movie. I've read in other threads that this is why directors have to be careful when shooting panning scenes. So it is normal that you see stutter during panning. A low frame rate gives an artistic, unreal feel to movie content, which some enjoy.
With reference to the thread: you would see flicker at 30fps@30Hz, but at 60Hz with 2:2 pulldown most of us don't see it.
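The pulldown mentioned above can be sketched as a repeat cadence: each source frame is mapped to a fixed number of refresh slots, so a 30fps source fills a 60Hz panel via 2:2 and 24fps film fills it via 3:2. A minimal illustration:

```python
def pulldown(frames, cadence):
    """Expand source frames into refresh slots by repeating each frame
    per the cadence (e.g. [2, 2] for 2:2, [3, 2] for 3:2)."""
    out = []
    for i, frame in enumerate(frames):
        out.extend([frame] * cadence[i % len(cadence)])
    return out

# 30fps source on a 60Hz panel: 2:2 pulldown, every frame shown twice.
print(pulldown(["A", "B", "C"], [2, 2]))  # ['A','A','B','B','C','C']
# 24fps film on a 60Hz panel: classic 3:2 pulldown.
print(pulldown(["A", "B", "C", "D"], [3, 2]))
```

Because the panel still refreshes 60 times a second, any flicker component stays at 60Hz, which is why most people don't see it, whereas a true 30fps@30Hz refresh would flicker visibly.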