Originally Posted by tgm1024
Trust me, you know I've been pounding my keyboard about native FPS loudly (and forEVER).
But it doesn't change the fact that some panels are better than others (today) at motion. I'm very
sensitive to it, and some I can watch a football game on, and others I cannot.
I didn't mean to imply that there weren't any differences between displays, just that most "motion blur" complaints are actually about the source content rather than the display. And I don't mean the native frame rate, though that is important, but simply the shutter speeds used on the cameras. For example, I've seen lots of complaints about objects blurring as they move across the screen, or when the camera pans, but if you pause the image you can see that the blur is in the source material itself: what you're seeing is motion blur captured by the camera.
Random shot from the Blu-ray I happened to have in right now:
Here you see that you can't read the text on the van, and the background is also blurred, but that wouldn't change no matter how good the display gets.
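To put a rough number on the camera-side blur, here's a back-of-envelope sketch (the pan speed and shutter values are illustrative assumptions, not pulled from any particular camera or disc):

```python
# Rough sketch: how much blur a film camera bakes into a pan, in pixels.
# All numbers below are illustrative assumptions.

def blur_px(pan_speed_px_per_s: float, exposure_s: float) -> float:
    """Pixels an object smears across during a single exposure."""
    return pan_speed_px_per_s * exposure_s

# 24 fps with a standard 180-degree shutter exposes each frame for 1/48 s.
exposure = 1.0 / 48.0

# Assume a pan that crosses a 1920-px-wide frame in 4 seconds.
pan_speed = 1920 / 4.0

print(f"{blur_px(pan_speed, exposure):.0f} px of blur per frame")  # ~10 px
```

Roughly ten pixels of smear are baked into every frame before the display ever sees it, which is why pausing the disc shows the same blur.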
While there are still differences between panels, most of them are now a lot better than they were a couple of years ago, particularly LCDs, since manufacturers have been forced to improve motion handling to support good 3D (at least with active 3D).
With games, though, there are still lots of titles that don't use any kind of motion blur (and many people prefer to turn it off when it's optional), so the display is expected to deliver perfectly sharp motion at all times. That is what really exposes motion-handling differences between displays, far more than any filmed content.
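The display-side equivalent is persistence: on a sample-and-hold panel each frame stays lit for the whole refresh, so an eye tracking a moving object smears it by roughly (tracking speed) x (hold time). A toy calculation, with speeds made up purely for illustration:

```python
# Sketch of display-side (sample-and-hold) blur. Illustrative numbers only.

def hold_blur_px(track_speed_px_per_s: float, hold_time_s: float) -> float:
    """Blur smeared across the retina while the eye tracks a moving object."""
    return track_speed_px_per_s * hold_time_s

speed = 960.0  # object moving at 960 px/s, e.g. a fast pan in a game

print(hold_blur_px(speed, 1 / 60))   # full-persistence 60 Hz: ~16 px
print(hold_blur_px(speed, 1 / 240))  # short-persistence/strobed: ~4 px
```

With perfectly sharp game content there's no camera blur to hide behind, so all the blur you see comes from the panel itself.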
Originally Posted by vinnie97
I would wager that motion blur to you is the equivalent of dithering to Chrono.
Posterization is my biggest complaint about most displays these days, though dithering can be a side effect of it (the Kuros used a lot of dither to try to mask it, for example).
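Since the dither-to-mask-posterization trick came up, here's a toy sketch of the idea (my own illustration, nothing to do with the Kuro's actual processing): adding small random noise before quantizing trades hard banding steps for fine grain that the eye averages out.

```python
# Toy demo: random dither before quantization masks posterization bands.
# Illustrative only; not any display's actual processing.
import numpy as np

def quantize(img: np.ndarray, levels: int, dither: bool = False) -> np.ndarray:
    """Quantize 0-255 values down to `levels` steps, optionally with dither."""
    step = 255.0 / (levels - 1)
    if dither:
        img = img + np.random.uniform(-step / 2, step / 2, img.shape)
    return np.clip(np.round(img / step), 0, levels - 1) * step

gradient = np.linspace(0, 255, 1920)            # smooth ramp: classic banding test
banded = quantize(gradient, 16)                 # 16 levels -> obvious steps
dithered = quantize(gradient, 16, dither=True)  # steps become fine noise

def block_mean(x: np.ndarray) -> np.ndarray:
    """Emulate the eye averaging neighbouring pixels (16-px blocks)."""
    return x.reshape(-1, 16).mean(axis=1)

print(np.abs(block_mean(banded) - block_mean(gradient)).max())    # ~8.5
print(np.abs(block_mean(dithered) - block_mean(gradient)).max())  # much smaller
```

Both versions use only 16 output levels, but once the eye averages over a few pixels the dithered one tracks the original ramp, while the hard-quantized one still shows the steps.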