Originally Posted by JSpectre88
For some things, I agree. Games shouldn't be 30 fps to begin with; that's the problem. I would think sports at 60 fps should be fast enough, but if the plasma is causing judder and blur that a CRT would not, then I suppose that's a different story. However, in the context of film to which I was referring, the judder is correct, while the smoother motion is an artificial effect. I wonder if he saw The Hobbit at 48 fps; he probably would have loved it.
I can attest to 60 fps content looking extremely smooth on a plasma display; I have a GT50 that I use almost exclusively for PC gaming. The technology introduces some extra dithering during motion but is otherwise smooth -- maybe not as good as a CRT, but a very solid performer (Mark could probably detail other ways plasma motion falls short of CRT standards, but I'm coming at this as a user without the capacity for extensive testing). For someone whose content is primarily 60 frames/sec, a plasma is massively superior to any LCD technology out there (let's set LightBoost aside, since this is already barely about the GT50 and that's a PC-only technology).
The problem is that essentially nothing is 60 fps outside PC gaming, so judder is always going to be present as the screen refreshes at 60 Hz (well, plasma is more complicated than that, but essentially 60 Hz). I can understand why, coming from a motion-interpolated LCD, a person might wonder why everyone talks about plasma motion performance, but they must remember that the majority of people here consider interpolated frames (invented by algorithms and generally not particularly accurate) to be completely unacceptable. If we throw that technology out completely, and assume we're talking about 30 fps TV, then we're left with a choice:
Motion blur (LCD) or judder (plasma)? This becomes a matter of personal preference. I prefer judder over blur, because even if I'm seeing two frames at once, both frames are sharp and clearly defined. And, as I said, a large portion of my media consumption is at 60 fps, where the plasma wins with no close competition. If the user is going to be watching non-interactive 30 (or 24) fps media and is okay with interpolation, then I think it really comes down to comparing how good the interpolation processing is on the various TVs. I can't speak to that, because it's not something I use.
The real solution to all of this is higher frame rates in the content itself. 60 fps would get everything running smoothly on plasmas, and 120 fps would let LCD TVs use the full backlight strobing of the "LightBoost" monitors. Film traditionalists hate the idea, and I understand that, but I personally have no problem with higher frame rates so long as we're talking about actual source frames from cameras, not frames produced by interpolation algorithms. Of course, I grew up with mostly 60 fps media (PC games), so anything less than that just feels like it's dropping frames to me.
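To put some rough numbers on that (just a quick Python sketch of the arithmetic, nothing rigorous): a frame rate only maps cleanly onto a refresh rate when it divides it evenly, and 120 Hz is the first common rate that handles 24, 30, and 60 fps content all without an uneven cadence.

Code:
# Which content frame rates fit evenly into which refresh rates?
# An uneven fit forces the display to repeat frames in an irregular
# cadence, which is what we perceive as judder.

def cadence(content_fps, refresh_hz):
    if refresh_hz % content_fps == 0:
        return "even: every frame shown %dx" % (refresh_hz // content_fps)
    return "uneven cadence -> judder"

for fps in (24, 30, 60):
    for hz in (60, 120):
        print("%3d fps @ %3d Hz: %s" % (fps, hz, cadence(fps, hz)))

Run that and the only uneven combination is 24 fps at 60 Hz; everything comes out even at 120 Hz, which is exactly why 120 would solve the whole mess.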
One last bit... is judder in film actually "correct", though? People are resistant to the idea of higher frame rates (see: the Hobbit backlash), so for a long time displays have been repeating frames in an uneven 3:2 cadence to fit 24 fps into 60 Hz, and that repetition is what creates the judder. I'm fairly certain (but can't test it, so someone let me know if I'm wrong) that if you actually played a film at a true 24 Hz you could lose the judder in exchange for that old-timey flicker. The frame rate would still be quite low, but you wouldn't get all that confusing double-image effect. So judder may be accurate to the current experience of film, but it's not inherent to 24 fps media. From my perspective, that's just another argument for higher real frame rates in our media (again, I know the film people will want to murder me, and I do apologize). The Hobbit may have lost too much of the look of film that we expect, but it was really the first serious try at making this work. I'm sure that, over time, filmmakers would figure out how to get the effects they want at higher frame rates.