The Soap Opera Effect is a misnomer, I think.
60 fps is closer to reality than 24 fps, so if anything deserves a nickname, it's 24 fps.
Something like the Flickery Choppy Motion Blur Effect.
Poorly interpolated 60 fps or 120 fps, especially from older processing methods, often has artifacts, but I dispute that those artifacts are the main cause of the hate for HFR (high frame rate).
Many people hate high framerates regardless of whether the HFR is real or faked. I think those who hate MEMC (motion estimation and motion compensation, i.e., TV interpolation) also hate real HFR, even though real HFR has no interpolation artifacts to speak of.
However, objectively speaking, real 60 fps or 120 fps content applies less of an "effect" to the incoming shots than 24 fps does. It's frustrating to see 24 fps called an artistic choice, which is to say an effect, while high framerate is considered unnatural, when high framerate is literally the reduction of an extreme effect: chopping the light entering a camera into blurry exposed sensor buckets we call frames. The more frames captured, the closer the result is to reality.
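The "blurry exposed sensor buckets" point can be made concrete with a toy simulation (my own illustration, not from any camera spec): a sensor integrates all light that arrives while the shutter is open, so a moving dot smears across one long 24 fps exposure, while each shorter 120 fps exposure captures it more sharply.

```python
import numpy as np

def exposure(dot_positions, width=24):
    """Integrate instantaneous positions of a moving dot into one frame,
    the way a sensor accumulates light while the shutter is open."""
    frame = np.zeros(width)
    for x in dot_positions:
        frame[x % width] += 1.0
    return frame / len(dot_positions)  # normalize total light

# A dot sweeping across 10 pixels during one 1/24 s interval:
path = list(range(10))
slow_shutter = exposure(path)      # one 24 fps frame: smeared over 10 cells
fast_shutter = exposure(path[:2])  # first of five 120 fps frames: 2-cell smear

print(np.count_nonzero(slow_shutter))  # 10 -> heavy motion blur
print(np.count_nonzero(fast_shutter))  # 2  -> much closer to the real dot
```

Same scene, same light; the only difference is how coarsely time is chopped into buckets.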
So I think people who are opposed to HFR are often arguing in bad faith. They praise the "artistic effect" of low framerate while claiming they want purity, and then call the absence of unnatural blurring and choppiness an effect as well.
One thing people ignore in all this is that TV manufacturers leave MEMC on by default because it helps sell TVs in the showroom. That should tell you all you need to know about how popular it really is compared to the hype the low-framerate purists are pushing.
If Samsung, etc, saw MEMC hurting sales, because more people hated it than liked its effect on motion, they not only wouldn't turn it on by default, they would likely remove it entirely.
To anyone who knows anything about business and market research, do you think it's a sensible hypothesis that these TV companies, which spend millions in market research and billions in R&D, wouldn't know exactly the effect MEMC had on TV sales?
These companies are turning on MEMC on the showroom floor for the same reason that brighter TV modes are the default: because brightness sells TVs, and smoother motion sells TVs, too.
It simply wouldn't add up for them to enable it by default if it hurt sales. Do any of you actually think the default settings on all these electronic gadgets aren't specifically chosen because they appeal to the most people?
I think the anti-SOE crowd is deluding themselves into believing that their preferences are more popular. And there are studies out there which confirm this. HFR is more popular than LFR. It is also a more accurate, less unnatural depiction of reality.
And I think that's the case even with MEMC-based HFR, which in recent years has gotten so good that its main problems are basically negligible compared to real HFR. That can be debated or analyzed, and I'd like to see some scientific comparisons between the various MEMC techniques and chips out there. This technology is constantly improving. Heck, even Valve has just added frame extrapolation (lag-free) to SteamVR, and artifacts in VR are especially noticeable and disconcerting, yet a smooth, constant framerate is still better than some rare issues caused by the processing. I don't know why TVs or projectors would be any different. Motion smoothing is a terrific new addition to VR (instead of mere re-projection when a late frame is detected), and it seems to be popular enough to boost TV sales in a crowded showroom. When TVs are seen side by side, the ones with the smoothness setting off are at a competitive disadvantage.
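To show what motion compensation buys you over naive blending, here's a minimal sketch of motion-compensated interpolation. It assumes a single, already-known global horizontal motion vector, which is a big simplification: real MEMC chips and SteamVR's extrapolation estimate per-block vectors from the frames themselves.

```python
import numpy as np

def interpolate_midpoint(frame_a, frame_b, motion_px):
    """Toy motion-compensated interpolation: shift each source frame
    halfway along the known motion vector, then average. A plain
    (frame_a + frame_b) / 2 blend would instead ghost the object at
    both its old and new positions."""
    half = motion_px // 2
    a_fwd = np.roll(frame_a, half, axis=1)               # push A forward
    b_back = np.roll(frame_b, -(motion_px - half), axis=1)  # pull B back
    return (a_fwd + b_back) / 2.0

# A bright bar at column 0 in frame A moves to column 8 in frame B.
a = np.zeros((4, 16)); a[:, 0] = 1.0
b = np.zeros((4, 16)); b[:, 8] = 1.0
mid = interpolate_midpoint(a, b, motion_px=8)
print(int(np.argmax(mid[0])))  # 4: the bar lands halfway, at full brightness
```

When the motion estimate is wrong, the shifted copies land in different places and you get exactly the halo/tearing artifacts people associate with bad interpolation, which is why vector quality is where the MEMC chips differ.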
MEMC being enabled by default is a direct result of hyper-capitalism, where tons of market research is done, showing that MEMC's effects are not only not perceived badly by the general population, but are actually preferred.
Apple's moving to 120 Hz now, and with VR and AR, I think it's only a matter of time before 24 fps joins SDR in becoming obsolete. I don't want people who hate HFR to be forced to watch it, though, so I think studios should film everything at 120 fps and then produce 120, 60, 30, and 24 fps versions for different markets and distribution methods. It's easier and better to blend high framerate down into low framerate than to go the other way, but the technology is getting so good that converting in either direction will let everyone watch content exactly the way they want, without being forced to sit through something that bothers them.
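The down-conversion path is simple because 120 divides evenly by 24, 30, and 60. A minimal sketch (hypothetical helper, not any studio's actual pipeline): average each group of source frames, which synthesizes the motion blur a slower shutter would have captured, rather than just dropping frames, which looks choppier.

```python
import numpy as np

def downconvert(frames_120, factor=5, blend=True):
    """Convert 120 fps to 120/factor fps (factor=5 -> 24 fps).
    blend=True averages each group of `factor` source frames to
    synthesize natural-looking motion blur; blend=False simply
    drops frames, keeping one per group."""
    groups = frames_120.reshape(-1, factor, *frames_120.shape[1:])
    return groups.mean(axis=1) if blend else groups[:, 0]

# 120 hypothetical frames, each a tiny 2x2 image -> 24 output frames.
src = np.random.rand(120, 2, 2)
out = downconvert(src)
print(out.shape)  # (24, 2, 2)
```

Going the other direction (24 up to 120) is the hard problem, since the five missing frames per group have to be invented by interpolation.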
I only get irritated when I see 24 fps super fans tell others to STFU and take it, as if their preferences are good and right and proper and ours aren't. That attitude is not only annoying, it's also untrue: HFR is more popular than LFR, and that isn't going to change, because it's hard-wired into the human psyche. The argument that people prefer 24 fps movies because they're used to them is not a good reason to keep using 24 fps. Nobody uses film anymore, and higher framerates compress very well.