Been thinking about this a bit and google didn't turn up any answers for me.
My question is why don't we see motion blur in real life, at the least to the point we do on modern LCD/LED tvs?
CRTs/plasmas with decaying phosphors, and LCD/LED sets with strobing backlights, reduce the effect, but real life has no decay or strobing, so why don't we see motion blur in real life? Is it that the eye/brain has a much higher effective refresh rate? I thought motion blur on modern LCD/LED sets was no longer about response time and refresh rate.
My gut says what we see on LCD/LED should be more like real life since there is no decay, but blur on these sets is very noticeable to me so wondering where my thought process is wrong.
The 24-60fps we see on displays is nothing like reality, which is why you need to employ techniques like interpolation or strobing the image to reduce image persistence, which reduces motion blur.
And when filming at low framerates, particularly 24fps, the image is often intentionally shot with a shutter speed that leaves some blur in the image, to help reduce judder. The sharper each frame is, the more it will judder at low framerates.
Moving to high framerates drastically reduces motion blur, without relying on interpolation or strobing the image to reduce persistence.
In real life it would not work because the real world is not just sound and vision.
Other people above mention source-based motion blur (e.g. blur baked into movies), but don't specifically explain sample-and-hold motion blur, which was endemic to most LCD/LED televisions until recently. So, excluding source-based motion blur:
There are no static frames in real life. Real life does not operate on a frame rate. The human brain does not operate on a frame rate.
This is called the "sample-and-hold" effect, and it is documented in many scientific/academic references. In simple terms:
- An LCD/LED shows each static frame continuously for the full refresh cycle.
- Your eyes are moving while tracking moving images.
- Your eyes are in a different position at the beginning of a refresh than at the end of a refresh.
- Therefore, the static refreshes are blurred across your retinas.
- That's motion blur.
Repeat this process every refresh (e.g. 60 times a second), on a continual basis. As a result, your eyes are always seeing 1/60sec worth of motion blur. This is the motion blur caused by sample-and-hold. It has nothing to do with the speed of pixel transitions (GtG); it occurs even on a 0ms instant-pixel-response display, as long as the frames are still displayed continuously and statically.
See this Web Animation on TestUFO: http://www.testufo.com/eyetracking
View this in a recent web browser on an LCD display at 60fps. This demonstrates eye-tracking based motion blur -- the motion blur enforced by static frames on a sample-and-hold display.
How Do You Fix This Motion Blur?
Answer: Shorten the length of individual static frames.
- Via strobing (e.g. CRT, plasma, strobe backlight)
- Via more frames/refreshes (e.g. extra unique frames in extra Hz, either original frames or interpolated frames)
- Via infinite framerate (e.g. real life, where no frames are static)
Otherwise, the static frames get smeared across your eyes.
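All of these fixes reduce the same quantity: the time a single static frame stays lit. A quick back-of-envelope sketch of that arithmetic (my own illustration, not from the thread; the panning speed and strobe fraction are made-up numbers):

```python
# Sketch of the sample-and-hold blur math: the perceived blur trail is
# roughly (panning speed) x (time each static frame stays lit).

def perceived_blur_px(speed_px_per_sec, refresh_hz, lit_fraction=1.0):
    """Blur trail length in pixels for an eye tracking a moving object.

    lit_fraction: portion of each refresh the frame is actually lit
    (1.0 = full sample-and-hold; ~0.1 = strobed backlight / CRT-like).
    """
    hold_time_sec = lit_fraction / refresh_hz
    return speed_px_per_sec * hold_time_sec

# Object panning at 960 px/sec:
print(perceived_blur_px(960, 60))        # 60Hz sample-and-hold: ~16 px trail
print(perceived_blur_px(960, 60, 0.1))   # 60Hz, frame lit 10% of refresh: ~1.6 px
print(perceived_blur_px(960, 1000))      # hypothetical "1000Hz" display: ~0.96 px
```

Strobing and adding refreshes are just two different ways of shrinking the same hold time; real life is the hold-time-zero limit.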
The Simple Photographer's Analogy
Eye-tracking of moving objects on a 60Hz LCD. A frame continuously displayed for 1/60sec will create the same amount of perceived motion blur as waving around a digital camera (1/60sec shutter) while trying to take a picture at the same panning speed. The sensor (eye, camera) is moving relative to the material (scenery, static frame on display). e.g. If the scene moves 1 inch during 1/60sec, you get 1 inch of motion blurring trail.
Now use a fast shutter, 1/1000sec for example, and try taking a picture while moving the camera around. You get no motion blur (or only very little). Likewise, on a display with 1/1000sec static frames (e.g. brief strobes, or a "1000Hz" display), you get far less motion blur.
However, a 60Hz LCD without a strobe backlight and without interpolation will have the same amount of motion blur as a waving digital camera set to a 1/60sec shutter. Moving eyes and a moving camera create the same amount of perceived/photographed motion blur on the static material (e.g. static scenery, static frame) at the same panning speed.
Go to Talladega and watch a REAL RACE. If you don't see blur there, look again; real life does blur when objects move too fast for your eyes to track. The other main place to see blur is viewing an LCD from an angle, though that's a separate artifact.
It's just semantics and concepts at this stage.
I am using Merriam-Webster Dictionary, Definition 2b, "b : shape, construct" -- e.g. a static state.
If we interpret "frame" as a "static state", then you understand what I mean.
Displays are designed to display a finite number of static states per second, while real life displays an infinite number of static states per second.
1 -- Einstein often used the phrase "frame of reference", so the word "frame" is not necessarily a term used only for on-screen material. Now let's put a spin on that. The eyeball (seeing the screen) can be the frame of reference relative to the on-screen moving object, or the imagery itself (the static image) can be the frame of reference relative to the eyeball. They interact with each other to create perceived motion blur (the sample-and-hold effect, as frequently explained in other posts, and in the science). There are a finite number of positions for a moving object on a display, while there are an infinite number of positions for a moving object in real life. So in real life, there's an infinite rate of changing positions relative to the moving eyeball (the eye tracking the moving object).
2 -- For the purposes of calculating perceived motion blur enforced by the sample-and-hold effect, where "hold" is the length of a single static state (a frozen position of the moving object), the hold time becomes zero for real life, which only occurs if "frame rate" is assigned infinity. High-speed jerky movement (vibration) in real life can create motion blur too, so it's possible to create blur via a real-life sample-and-hold effect (e.g. stepper motors) without using a display. If you stepper-motored a small ultra-lightweight object sideways at only 60Hz or 120Hz, you would see blurry-looking motion (blur from what looks like 60Hz to 120Hz vibration): a real-world object moving in 60 or 120 static positions per second. That won't be zero blur. This is a real-life analogue of the perceived motion blur witnessed on a 60Hz or 120Hz sample-and-hold display. Forcing a finite frame rate into a world of infinite frame rate creates apparent motion blur. So it's mathematically valid from the standpoint of motion blur mathematics. If a display had infinite frame rate, it would have no perceived motion blur caused by sample-and-hold; hold time would be zero, just like real life.
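The limiting argument in point 2 can be made concrete with a throwaway numeric sketch (mine, with a made-up panning speed, not from the thread): the blur per hold is the distance the object moves during one static position, which shrinks toward zero as the number of positions per second grows.

```python
# As the number of static positions per second increases, the displacement
# per hold (the blur trail per static state) approaches zero.
# Real life is the infinite-rate limit: zero hold time, zero hold blur.

speed = 960.0  # units/sec; arbitrary made-up panning speed
for positions_per_sec in (60, 120, 1000, 10_000, 1_000_000):
    blur_per_hold = speed / positions_per_sec
    print(f"{positions_per_sec:>9} positions/sec -> "
          f"{blur_per_hold:.4f} units of blur per hold")
```

A 60Hz stepper-motored object and a 60Hz display row give the same per-hold displacement, which is the real-life analogue described above.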
So I'm correct, too, when interpreted this way. I just use the word "frame" a bit unconventionally.
^Mark. Aye yi yi. We use the term "frame rate" and "frame" consistently in AVS. In that context, we have eyes that don't have a frame rate.
This isn't worth a discussion.